Researchers from Scottish universities have successfully developed a way to breathe new life into outdated robot pets and toys using augmented reality (AR) technology.
They have tested a new software system which can overlay a wide range of new virtual behaviours on commercially available robot pets and toys designed to look like animals and mimic their actions.
The system, ‘Augmenting Zoomorphic Robotics with Affect (AZRA)’, aims to address the shortcomings of the current generation of these zoomorphic robots, which often have very limited options for interaction.
In the future, AZRA-based systems could enable older pets and even previously non-interactive toys like plush dolls to provide experiences which are much closer to those provided by real animal companions.
The richer experiences AZRA enables could help provide more pet-like experiences for people who are unable to keep real animals for reasons of health, cost or restrictions on rental properties.
When users of the AZRA system wear AR devices like Meta’s Quest headset around their robot pets and toys, it projects a sophisticated overlay of virtual facial expressions, light, sound and thought bubbles onto the toy’s surfaces and surroundings.
AZRA is underpinned by a sophisticated simulation of emotions based on studies of real animal behaviour. This allows it to make robots seem more convincingly ‘alive’ by imbuing them with moods which fluctuate unpredictably and can be affected by the touch or voice of their owner.
Eye contact detection and spatial awareness features mean the system knows when it is being looked at, and touch detection enables it to respond to strokes – even protesting when it is stroked against its preferred direction. It can request attention when ignored, or relax peacefully when sensing its owner is busy with other activities.
Additionally, the system can adjust the enhanced pet’s behaviour to better suit its owner’s personality and preferences. If users are high-energy and playful, the robot slowly adapts to become more excitable. In quieter households, it becomes more relaxed and contemplative.
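One simple way to model the gradual adaptation described above is an exponential moving average that nudges the pet’s excitability toward the owner’s observed interaction energy. This is a hypothetical sketch, not the authors’ implementation; the function name, the `rate` constant and the 0-to-1 energy scale are all illustrative assumptions.

```python
# Hypothetical sketch of owner adaptation: the robot slowly becomes more
# excitable with high-energy owners and more relaxed in quieter households.
# All names and constants here are illustrative, not from the AZRA paper.

def adapt_excitability(current, observed_energy, rate=0.02):
    """Nudge excitability a small step toward the owner's observed
    interaction energy (both values in [0, 1])."""
    return current + rate * (observed_energy - current)

excitability = 0.5
for _ in range(100):  # many sessions with a playful, high-energy owner
    excitability = adapt_excitability(excitability, observed_energy=0.9)
# excitability has drifted most of the way toward 0.9
```

A small `rate` keeps the change slow, matching the article’s description of a personality that develops over time rather than flipping instantly.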
The researchers say their work could also help cut down on electronic waste by reducing the likelihood of robot pets and toys being disposed of after their owners become tired of them.
The development of AZRA will be presented as a paper at the 34th IEEE International Conference on Robot and Human Interactive Communication in the Netherlands on 26th August.
Dr Shaun Macdonald, of the University of Glasgow’s School of Computing Science, is the paper’s lead author and led the development of AZRA. He was initially inspired to develop the system after receiving a less-than-inspiring gift.
He said: “I was given a little robot pet that had a very basic set of movements and potential interactions. It was fun for a few days, but I quickly ended up losing interest because I had seen everything it had to offer.
“I was a bit disappointed to realise that, despite all the major developments in technology over the last 25 years, zoomorphic robots haven’t developed much at all since I was a child. It’s all but impossible to build a relationship with a robot pet in the way you might with a real animal, because they have so few behaviours and they become over-familiar very quickly.
“As a researcher in human-computer interaction, I started to wonder whether I could build a system which could overlay much more complex behaviours and interactions on the toy using augmented reality. Being able to imbue older robots and pets with new life could also help reduce the carbon footprint of unwanted devices by keeping them from landfill for longer.”
Dr Macdonald used a simple, off-the-shelf zoomorphic pet, the Petit Qoobo, as the real-world platform on which to overlay the AR elements during the development of the system.
Guided by previous research into the emotional needs of dogs, Dr Macdonald developed Zoomorphic Robot Affect and Agency Mind Architecture, or ZAMA. ZAMA provides the AZRA system with a kind of artificial emotional intelligence, giving it a series of simulated emotional states which can change in response to its environment.
Rather than simple stimulus-response patterns, the system provides the AR pet with an ongoing temperament based around combinations of nine personality traits including ‘gloomy’, ‘relaxed’ or ‘irritable’. It has daily moods that fluctuate naturally, and a long-term personality which develops over time through interactions with its owner.
It simulates desires for touch, rest, food, and socialisation which are subtly randomised each day. When its needs aren’t met, the AR robot will actively seek interaction, displaying emojis and thought bubbles to communicate what it wants.
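The behaviours described above can be sketched as a minimal affect model: a mood value that drifts, desires for touch, rest, food and socialisation that are subtly randomised each day, and attention-seeking when a need goes unmet. This is a speculative illustration based only on the article; the class, thresholds and update rules are assumptions, not the ZAMA architecture itself.

```python
import random

# Hypothetical sketch of a ZAMA-style affect model. The article names three
# of the nine personality traits ('gloomy', 'relaxed', 'irritable') and four
# simulated desires; everything else here is an illustrative assumption.

TRAITS = ["gloomy", "relaxed", "irritable"]  # three of the nine traits

def clamp(x):
    return min(1.0, max(0.0, x))

class AffectModel:
    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        self.mood = 0.5  # 0 = negative, 1 = positive
        self.needs = {"touch": 0.5, "rest": 0.5,
                      "food": 0.5, "socialisation": 0.5}

    def new_day(self):
        """Subtly randomise desires and let mood drift each day."""
        for need in self.needs:
            self.needs[need] = clamp(self.needs[need] + self.rng.uniform(-0.2, 0.2))
        self.mood = clamp(self.mood + self.rng.uniform(-0.1, 0.1))

    def on_stroke(self):
        """Touch satisfies the touch desire and lifts mood."""
        self.needs["touch"] = clamp(self.needs["touch"] - 0.3)
        self.mood = clamp(self.mood + 0.1)

    def unmet_needs(self, threshold=0.8):
        """Needs above the threshold would drive attention-seeking
        (emojis and thought bubbles in the AR overlay)."""
        return [n for n, v in self.needs.items() if v > threshold]
```

Because the daily randomisation and long-running mood state sit between stimulus and response, the same stroke can produce different reactions on different days, which is what makes the pet feel less predictable than a fixed stimulus-response toy.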
The researchers are already working to explore the technology’s future potential, including participatory studies in which volunteers interact with the robot and then adjust its emotional parameters in real time to explore what feels natural versus artificial in robot behaviour.
“AZRA turns a robot from being a device that I almost entirely choose to interact with into a device which can engage me in interaction itself. It feels more like me and another entity attempting to interact and communicate, rather than me make-believing almost all of that interaction myself,” added Dr Macdonald. “One of the main advantages of this system is that we don’t have a fixed ‘this is how this should work’ approach. What we have is a really great development test bed where we can try different ideas quickly and see what works. As AR glasses become more mainstream, this could become a way to breathe new life into existing robots without having to replace them entirely.”