Summary: Scientists hope to accelerate the development of human-level AI using a network of powerful supercomputers — with the first of these machines fully operational by 2025.

New supercomputing network could lead to AGI, scientists hope, with 1st node coming online within weeks

Source: Lisa D. Sparks


(Image credit: Jose A. Bernat Bacete via Getty Images)

Researchers plan to accelerate the development of artificial general intelligence (AGI) with a worldwide network of extremely powerful computers — starting with a new supercomputer that will come online in September.

Artificial intelligence (AI) spans technologies from machine learning to generative systems like GPT-4. The latter offer predictive reasoning based on training on large data sets, and they can often surpass human capabilities in a single narrow area covered by that training data. They remain weak at general cognitive and reasoning tasks, however, and cannot transfer what they learn across disciplines.

AGI, by contrast, is a hypothetical future system that surpasses human intelligence across multiple disciplines, and that can learn on its own and improve its decision-making as it gains access to more data.

The supercomputers, built by SingularityNET, will form a "multi-level cognitive computing network" that will be used to host and train the architectures required for AGI, company representatives said in a statement.

These include elements of advanced AI systems such as deep neural networks, which loosely mimic the structure of the human brain; large language models (LLMs), which are trained on vast sets of text data; and multimodal systems, which connect human inputs such as speech and movement with multimedia outputs, the kind of pairing seen in AI-generated video.

Building a new AI supercomputer network

The first of the supercomputers will start to come online in September, and, depending on supplier delivery timelines, work will be completed by the end of 2024 or early 2025, company representatives told Live Science.



The modular supercomputer will feature advanced hardware including Nvidia L40S graphics processing units (GPUs), AMD Instinct accelerators and Genoa processors, Tenstorrent Wormhole server racks, Nvidia H200 GPUs, and Nvidia GB200 Blackwell systems. Together, these are some of the most powerful pieces of AI hardware available.

"This supercomputer in itself will be a breakthrough in the transition to AGI. While the novel neural-symbolic AI approaches developed by the SingularityNET AI team decrease the need for data, processing and energy somewhat relative to standard deep neural nets, we still need significant supercomputing facilities," SingularityNET CEO Ben Goertzel told LiveScience in a written statement.

"The mission of the computing machine we are creating is to ensure a phase transition from learning on big data and subsequent reproduction of contexts from the semantic memory of the neural network to non-imitative machine thinking based on multi-step reasoning algorithms and dynamic world modeling based on cross-domain pattern matching and iterative knowledge distillation. Before our eyes, a paradigmatic shift is taking place towards continuous learning, seamless generalisation and reflexive AI self-modification.”

The road to an AI "superintelligence"

SingularityNET’s goal is to provide access to data for the growth of AI, AGI and a future artificial superintelligence, a hypothetical system far more cognitively advanced than any human. To do this, Goertzel and his team also needed unique software to manage the federated (distributed) compute cluster.

Federated compute clusters keep raw user data on local nodes and expose only the summary data needed for large-scale computation, which protects data sets containing sensitive elements such as personally identifiable information (PII).
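SingularityNET has not published the internals of its cluster-management software, but the general principle of federated computation can be sketched in a few lines. In this illustrative example (all names are hypothetical, not part of any real system), each node reduces its private records to non-identifying aggregates, and a coordinator combines only those aggregates:

```python
from dataclasses import dataclass

@dataclass
class NodeSummary:
    """Aggregate statistics a node is willing to share; raw records stay local."""
    count: int
    total: float

def local_summary(records: list[float]) -> NodeSummary:
    # Runs on the node that owns the data; only the summary leaves the node.
    return NodeSummary(count=len(records), total=sum(records))

def federated_mean(summaries: list[NodeSummary]) -> float:
    # The coordinator sees summaries, never the underlying records.
    n = sum(s.count for s in summaries)
    return sum(s.total for s in summaries) / n

# Three nodes, each holding private measurements (e.g., tied to PII).
nodes = [[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]]
summaries = [local_summary(records) for records in nodes]
print(federated_mean(summaries))  # 3.5
```

Real federated systems layer encryption, differential privacy or secure aggregation on top of this pattern, but the core idea is the same: computation moves to the data, and only summaries move back.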

“OpenCog Hyperon is an open-source software framework designed specifically for the architecture of AI systems,” Goertzel added. “This new hardware infrastructure is purpose-built to implement OpenCog Hyperon and its AGI ecosystem environment.”

To grant users access to the supercomputer, Goertzel and his team are using a tokenized system of the kind common in decentralized computing. Users spend tokens to access the supercomputer, and in return can use, and add to, the existing data sets that other users rely on to test and deploy AGI concepts.

In their simplest form, these access tokens work like tokens in a video game arcade: players buy tokens and insert them into a machine to get a certain number of plays. In this case, the data generated by playing is accessible to every other player, not just at one arcade but wherever that game is installed around the world.
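The token-gated access model described above can be sketched as a toy ledger. This is purely illustrative (it is not SingularityNET's actual system, and every name here is invented): spending a token grants one job run and adds the user's contribution to a pool shared with all other token holders:

```python
class TokenLedger:
    """Toy model of token-gated access to shared compute and data."""

    def __init__(self) -> None:
        self.balances: dict[str, int] = {}
        self.shared_data: list[str] = []  # contributions visible to all users

    def buy_tokens(self, user: str, amount: int) -> None:
        self.balances[user] = self.balances.get(user, 0) + amount

    def submit_job(self, user: str, contribution: str) -> bool:
        # One token buys one run; the user's data joins the shared pool.
        if self.balances.get(user, 0) < 1:
            return False
        self.balances[user] -= 1
        self.shared_data.append(contribution)
        return True

ledger = TokenLedger()
ledger.buy_tokens("alice", 2)
print(ledger.submit_job("alice", "dataset-v1"))  # True
print(ledger.submit_job("bob", "dataset-v2"))    # False (no tokens)
print(ledger.shared_data)                        # ['dataset-v1']
```

In a real decentralized network the ledger would live on a blockchain and the shared pool would be replicated across nodes, but the access logic follows this shape.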

“GPT-3 was trained on 300 billion tokens (typically words, parts of words, or punctuation marks), and GPT-4 on 13 trillion,” wrote Mercatus scholar and software engineer Nabeel S. Qureshi. “Self-driving cars are trained on thousands of hours of video footage; GitHub Copilot, for programming, is trained on millions of lines of human code from the website GitHub.”

Leaders in AI, notably DeepMind co-founder Shane Legg, have said such systems could meet or surpass human intelligence by 2028. Goertzel has previously estimated 2027, while Mark Zuckerberg is actively pursuing AGI, having invested $10 billion in January in building the infrastructure to train advanced AI models.

SingularityNET, which is part of the Artificial Super Intelligence Alliance (ASI) — a collective of companies dedicated to open-source AI research and development — plans to expand the network and the computing power it offers in the future. Other ASI members include Fetch.ai, which recently invested $100 million in a decentralized computing platform for developers.

Lisa D. Sparks is a freelance journalist for Live Science and an experienced editor and marketing professional with a background in journalism, content marketing, strategic development, project management, and process automation. She specializes in artificial intelligence (AI), robotics, electric vehicles (EVs) and battery technology, and also covers trends in semiconductors and data centers.