Researchers from the University of Sheffield have studied how flies' vision interacts with movement and environment, and discovered that a fly's vision is not a passive receptacle but an active cog in a fast and efficient machine. That machine could unlock future potential in AI systems – such as robotics and autonomous vehicles – a significant boon for anything requiring real-time reaction.
According to the team, an insect’s lightning-quick reaction time could chart a pathway for more efficient robots and self-driving cars, owing to a newly identified mechanism called synaptic high-frequency jumping.
By studying flies’ eyes and brains, observing their behaviour, and building digital simulations, the team discovered that house flies and fruit flies do not process information passively. Instead, they twitch their bodies in sync with what they see. These tiny, jerky movements help their brains receive clearer information about the world around them faster. During high-speed movement, such as escaping danger, rapid eye movements called saccades act as a “turbo boost” for the insect, tripling the speed at which information is sent to the brain and effectively eliminating delays. This means insects can react to the world around them in milliseconds, sometimes before visual signals have been fully processed.
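As a toy sketch only – not the researchers' biophysical model – the "turbo boost" idea can be illustrated with a sensor stream whose sampling interval shrinks threefold during a hypothetical saccade window, so the brain receives roughly three times as many updates while the eye is in rapid motion:

```python
# Toy illustration (assumed numbers, not from the study): during a "saccade"
# window the sampling rate of a sensor stream triples, so downstream circuits
# receive updates three times as often and the lag between world and brain drops.

def sample_times(duration_ms, base_interval_ms, saccade=None, boost=3):
    """Return the times (ms) at which samples reach the 'brain'.

    saccade: optional (start_ms, end_ms) window during which the
    sampling interval shrinks by `boost` (the hypothetical 3x turbo).
    """
    times, t = [], 0.0
    while t < duration_ms:
        times.append(t)
        in_saccade = saccade and saccade[0] <= t < saccade[1]
        t += base_interval_ms / boost if in_saccade else base_interval_ms
    return times

steady = sample_times(100, 9)                       # fixed-rate vision
boosted = sample_times(100, 9, saccade=(30, 60))    # saccade from 30-60 ms

# Compare how many samples land inside the 30-60 ms saccade window.
steady_in_window = [t for t in steady if 30 <= t < 60]
boosted_in_window = [t for t in boosted if 30 <= t < 60]
print(len(steady_in_window), len(boosted_in_window))  # prints: 3 8
```

The point of the sketch is only that a movement-gated sampling rate packs far more information into exactly the moments – sharp turns, escapes – when stale data is most costly.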
Before this, it had been assumed that information flowed through fixed neural pathways at a steady pace, but the study shows that vision works in active collaboration with movement.
How can a fly help future AI systems?
AI systems typically depend on large-scale computation and data processing, but these are often slow and power-hungry. Insects bypass these limitations by closely linking sensing and action so that the two become almost inseparable, delivering impressive performance on a shoestring of biological resources.
The University of Sheffield’s Dr Jouni Takalo, who led development of the biophysically realistic statistical model underlying the work, said: “Our model shows how thousands of tiny sensors work together to reshape visual signals. By acting as a team, these sensors can instantly shift their focus to where it’s needed most. This allows the insect to produce fast, reliable reactions even when moving at high speeds in the wild.”
This is particularly attractive for AI systems, especially those where split-second decisions are non-negotiable. Adopting movement-driven, adaptive information processing in place of sprawling, energy-intensive computing infrastructure could allow machines to do more with far less.
Professor Mikko Juusola, senior author of the study from the University of Sheffield’s School of Biosciences and Neuroscience Institute, said: “Our findings reveal a fundamentally new way of thinking about how brains compute information – one where speed and efficiency emerge from active interaction with the environment. We’ve demonstrated how even the smallest brains can solve complex problems at extraordinary speeds.
“It shows that vision is not limited by the speed at which insect brains process information. Instead, the brain automatically speeds up to keep pace with the body, cutting out lag and making sure information flows as quickly as possible.”
The moment an insect snaps into a sharp turn, its brain shifts up a gear, expanding its data capacity so it can focus on the fastest-moving, most time-critical information. This ability lets it overcome physical and neural limitations, making it a fascinating model to build upon.
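Dr Takalo's description of sensors that "instantly shift their focus to where it's needed most" can be caricatured in a few lines. This is a hypothetical sketch, not the team's model: given a fixed processing budget per tick, spend it on whichever input channels changed fastest.

```python
# Hypothetical sketch of "shifting focus where it's needed most": with a fixed
# processing budget, grant slots to the input channels that changed the most.

def allocate_budget(changes, budget):
    """Rank channels by magnitude of recent change and return the indices
    of the top `budget` channels granted a processing slot this tick."""
    ranked = sorted(range(len(changes)), key=lambda i: abs(changes[i]), reverse=True)
    return sorted(ranked[:budget])

# Four sensor channels; channel 2 just saw a fast-moving edge.
recent_changes = [0.1, 0.0, 5.0, 0.3]
print(allocate_budget(recent_changes, budget=2))  # prints: [2, 3]
```

The design choice mirrors the article's thesis: rather than processing everything at a fixed rate, limited resources chase the most time-critical signals.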
Nature has always been, and will continue to be, a wonderful and powerful source of inspiration. When researchers and engineers take heed of what it is capable of, even the most unlikely creature can offer the key to unlocking some of the most complex problems we face.