
Scientists create AI capable of speaking to each other

17th April 2024
Harry Fowle

A UNIGE team has developed an AI capable of learning a task solely from verbal instructions, and of then describing that task to a 'sister' AI so that it can perform it in turn.

Performing a new task based solely on verbal or written instructions, and then explaining the process so that others can replicate it, is a fundamental aspect of human communication that AI has traditionally found challenging. Researchers at the University of Geneva (UNIGE) have now made significant progress in this area by developing an artificial neural network capable of learning and performing basic tasks and then providing a linguistic description of them to a second AI, which replicates them. The achievement, documented in Nature Neuroscience, has promising implications, particularly for robotics.

Humans uniquely possess the ability to perform tasks from instructions alone, a capability not shared by other species, which typically rely on extensive trial-and-error learning. This ability to quickly assimilate and transfer knowledge is now being emulated in AI through advances in natural language processing, the sub-field of AI that enables machines to understand and interact using human language. Despite these advances, the neural mechanisms that would enable AI to mimic this human-like cognitive function remain largely unexplored.

This breakthrough at UNIGE has roots in the development of artificial neural networks, which are modelled after the human brain. These networks mimic the way neurons communicate through electrical signals, enabling complex computational tasks. While the specific neural computations needed to replicate human cognitive feats like learning from instructions are not yet fully understood, the recent developments in Geneva mark a significant step towards unravelling these mechanisms.

The ability of AI to perform tasks from instructions, learn from the process, and then verbally communicate the methodology opens new pathways for automation and enhances how robots can learn and function in dynamic environments. This not only broadens the potential applications of AI in daily life but also advances our understanding of both human and artificial cognition.

Alexandre Pouget, Full Professor in the Department of Basic Neurosciences at the UNIGE Faculty of Medicine, explains: "Currently, conversational agents using AI are capable of integrating linguistic information to produce text or an image. But, as far as we know, they are not yet capable of translating a verbal or written instruction into a sensorimotor action, and even less of explaining it to another artificial intelligence so that it can reproduce it."

A model brain

Building on a model pre-trained to process language, the researcher and his team succeeded in developing an artificial neural network with this dual capability. "We started with an existing model of artificial neurons, S-Bert, which has 300 million neurons and is pre-trained to understand language. We 'connected' it to another, simpler network of a few thousand neurons," explains Reidar Riveland, a PhD student in the Department of Basic Neurosciences at the UNIGE Faculty of Medicine.
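In architectural terms, that description corresponds to a familiar pattern: a frozen, pre-trained sentence encoder whose embedding drives a much smaller recurrent network. The sketch below illustrates the idea in PyTorch. It is a minimal sketch under stated assumptions, not the UNIGE implementation: the encoder 'all-MiniLM-L6-v2' stands in for S-Bert, and the layer sizes and wiring are illustrative guesses.

```python
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer


class InstructedNetwork(nn.Module):
    """A small recurrent 'sensorimotor' network driven by a frozen sentence encoder."""

    def __init__(self, embed_dim=384, hidden_size=2048, n_inputs=2, n_outputs=2):
        super().__init__()
        # Pre-trained encoder standing in for S-Bert; its weights are not trained here.
        self.encoder = SentenceTransformer("all-MiniLM-L6-v2")
        # The simpler network of a few thousand units that learns the tasks.
        self.rnn = nn.GRU(n_inputs + embed_dim, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_outputs)

    def forward(self, instruction: str, stimulus: torch.Tensor) -> torch.Tensor:
        # Embed the written instruction once, then feed it to the recurrent
        # network at every time step alongside the sensory input.
        with torch.no_grad():
            embedding = torch.as_tensor(self.encoder.encode(instruction))
        batch, steps, _ = stimulus.shape
        context = embedding.expand(batch, steps, -1)
        hidden, _ = self.rnn(torch.cat([stimulus, context], dim=-1))
        return self.readout(hidden)  # simulated motor output over time


model = InstructedNetwork()
stimulus = torch.randn(1, 50, 2)  # one trial, 50 time steps, 2 input channels
action = model("respond in the opposite direction of the stimulus", stimulus)
print(action.shape)  # torch.Size([1, 50, 2])
```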

In the initial phase of the experiment, the neuroscientists trained the network to emulate Wernicke’s area, the brain region that enables the perception and interpretation of language. Following this, they trained the network to simulate Broca’s area, which, guided by Wernicke’s area, is responsible for the production and articulation of words. This entire training process was conducted using conventional laptop computers. The AI received written instructions in English, guiding it through various tasks.
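In outline, that two-stage regime could look like the sketch below, assuming ordinary supervised objectives. This is a hypothetical simplification: the paper's actual losses and curricula are more involved, and the `hidden_activity` accessor used to read out the network's state is invented for illustration.

```python
import torch
import torch.nn.functional as F


def train_comprehension(model, trials, epochs=10, lr=1e-3):
    """Stage 1 ('Wernicke'): learn to produce the right actions from written instructions."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for instruction, stimulus, target_action in trials:
            loss = F.mse_loss(model(instruction, stimulus), target_action)
            opt.zero_grad()
            loss.backward()
            opt.step()


def train_production(decoder, model, trials, epochs=10, lr=1e-3):
    """Stage 2 ('Broca'): learn to recover which instruction the network is
    following from its own activity, so it can describe the task in words."""
    opt = torch.optim.Adam(decoder.parameters(), lr=lr)
    for _ in range(epochs):
        for instruction_id, stimulus in trials:
            activity = model.hidden_activity(stimulus)  # hypothetical accessor
            loss = F.cross_entropy(decoder(activity), instruction_id)
            opt.zero_grad()
            loss.backward()
            opt.step()
```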

For instance, the instructions included actions like identifying the direction (left or right) of a perceived stimulus, responding in the direction opposite to a stimulus, or choosing the brighter of two visual stimuli that differed slightly in contrast. The scientists then assessed the model's simulated intention to move or, in this context, to point. "Once these tasks had been learned, the network was able to describe them to a second network, a copy of the first, so that it could reproduce them. To our knowledge, this is the first time that two AIs have been able to talk to each other in a purely linguistic way," stated Alexandre Pouget.
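For a concrete picture of those trial types, the toy rules below encode the three examples as target functions, with +1 for 'right' and -1 for 'left'. This is a deliberate simplification of the study's richer, time-varying psychophysics trials, and the function names are invented.

```python
import numpy as np


def go(stimulus_direction):
    return np.sign(stimulus_direction)    # point toward the stimulus

def anti(stimulus_direction):
    return -np.sign(stimulus_direction)   # point away from the stimulus

def pick_brighter(left_contrast, right_contrast):
    return 1.0 if right_contrast > left_contrast else -1.0

# In the transfer phase, a trained network emits a sentence such as "respond in
# the opposite direction of the stimulus"; an untrained copy is then scored on
# how often its responses match the corresponding rule.
assert anti(-0.4) == 1.0  # stimulus on the left, correct response: point right
```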

For future humanoids

This model opens exciting new avenues for exploring the interaction between language and behaviour. It holds particular promise for the robotics industry, where developing technologies that allow machines to communicate with each other is a critical challenge. The current version is small as neural networks go, but the team are excited about scaling the technology and its future applications.
