‘Smart’ walking stick could unlock a previously restricted world for the visually impaired
A team of engineers at the University of Colorado Boulder have been working with the latest AI technology to develop a ‘smart’ walking stick for people who are visually impaired.
The researchers behind the project hope that, once perfected, their smart walking stick could help visually impaired people navigate otherwise impossible tasks in a world designed for sighted people.
Tasks such as shopping for a specific box of cereal or picking a private place to sit in a crowded room are usually taken for granted by sighted people yet remain out of reach for the visually impaired. Shivendra Agrawal, a doctoral student in the Department of Computer Science at CU Boulder, says: “I really enjoy grocery shopping and spend a significant amount of time in the store. A lot of people can’t do that, however, and it can be really restrictive. We think this is a solvable problem.”
The team in the Collaborative Artificial Intelligence and Robotics Lab at CU Boulder have recently taken great strides towards solving this problem. The walking stick resembles any typical cane but adds a camera and computer vision technology. With these, the smart walking stick maps the world around it and guides users through vibrations in the handle combined with spoken directions.
Agrawal stressed that his team’s device should not be seen as a substitute for designing places to be more accessible for the visually impaired but hoped instead that it will demonstrate new ways AI can help millions of people become more independent.
During tests, the team focused specifically on two challenges faced every day by the visually impaired.
The challenge of shopping
The team first tasked the cane with finding and grasping a specific product, a box of cereal, amidst dozens of similar-looking and similar-feeling options stacked in aisles, a task that even a shopper with perfect 20/20 vision can find daunting.
To make the test as true to reality as possible, the team set up a model shopping aisle in the lab: a grocery shelf stocked with various cereals and flavours. The walking stick was then used to scan the shelves for the desired product.
The smart walking stick differentiates between the cereals by comparing what its camera sees on the shelf against a database of product photos. “It assigns a score to the objects present, selecting what is the most likely product. The system then issues commands like ‘move a little to your left’,” says Agrawal.
The system correctly selected the desired product, ‘Kashi GO Coconut Almond Crunch’, with a match score of 0.91; the second-highest score was no more than 0.70.
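The selection logic Agrawal describes can be sketched in a few lines. This is a hypothetical illustration, not the team’s actual code: the product names, scores, and the `guidance_command` helper are assumptions, and the real system derives its scores from camera images rather than hard-coded values.

```python
def best_match(scores):
    """Pick the product with the highest match score."""
    return max(scores.items(), key=lambda item: item[1])

def guidance_command(target_x, frame_width, tolerance=0.1):
    """Translate the target's horizontal position in the camera frame
    into a spoken-style instruction, as the cane does for its user."""
    offset = target_x / frame_width - 0.5  # -0.5 (far left) .. 0.5 (far right)
    if offset < -tolerance:
        return "move a little to your left"
    if offset > tolerance:
        return "move a little to your right"
    return "reach straight ahead"

# Hypothetical scores for the products detected on the shelf.
shelf_scores = {
    "Kashi GO Coconut Almond Crunch": 0.91,
    "look-alike cereal A": 0.70,
    "look-alike cereal B": 0.55,
}

product, score = best_match(shelf_scores)
command = guidance_command(target_x=120, frame_width=640)
```

The two steps mirror the article’s description: score every candidate, keep the highest, then convert the winner’s position in the frame into a spoken direction.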
The challenge of finding a place to sit
The team then worked towards solving the age-old problem of ‘where do I sit?’
“Imagine you’re in a café, you don’t just want to sit anywhere. You usually take a seat close to the walls to preserve your privacy, and you usually don’t like to sit face-to-face with a stranger,” explained Agrawal. This is where the smart walking stick can help, allowing visually impaired people to find a better, more private spot to sit without having to ask for assistance, something many say they would prefer.
Similar to the shopping tests, a realistic café environment was set up in the lab, complete with tables, chairs, patrons, and other variables to make the task more challenging.
Due to the complexity of the task, the smart stick had to be used alongside a laptop housed in a backpack worn by the user. A camera on the cane’s handle fed data to the laptop, where a set of algorithms detected obstacles and calculated an effective route to a seat, which was then transmitted to the user. It is hoped that as the technology develops, the device will work with a smartphone app instead of a laptop, making it far more practical.
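The seat-choice heuristic Agrawal describes (prefer seats near walls, avoid facing strangers) can be sketched as a simple scoring function. Everything here is an assumption for illustration: the field names and weights are invented, and the real system works from camera data rather than a pre-built list of seats.

```python
def seat_score(seat, wall_weight=1.0, stranger_penalty=2.0):
    """Higher is better: seats near walls score well, and seats that
    face an occupied chair are penalised."""
    score = -wall_weight * seat["dist_to_wall_m"]
    if seat["faces_occupied"]:
        score -= stranger_penalty
    return score

def choose_seat(seats):
    """Pick the candidate seat with the best score."""
    return max(seats, key=seat_score)

# Hypothetical candidate seats detected in the café scene.
seats = [
    {"name": "corner seat",    "dist_to_wall_m": 0.5, "faces_occupied": False},
    {"name": "centre table",   "dist_to_wall_m": 3.0, "faces_occupied": False},
    {"name": "opposite diner", "dist_to_wall_m": 0.5, "faces_occupied": True},
]

best = choose_seat(seats)
```

In this toy example the wall-adjacent, unoccupied-facing seat wins, matching the preferences Agrawal lists; the real system would then plan an obstacle-free route to that seat.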
Subjects were able to find the right chair in an appropriate location in 10 out of 12 trials with varying levels of difficulty. Whilst thus far all subjects have been blindfolded sighted people, the team hopes to perform trials with visually impaired individuals once the technology progresses appropriately.
Looking to the future
Whilst it may be a while before you see people walking around with a ‘smart’ walking stick, the team behind it hopes their early results will not only lead to further progression of their own but also inspire other engineers to think about how robotics and AI can assist those with disabilities. “Our aim is to make this technology mature but also attract other researchers into this field of assistive robotics. We think assistive robotics has the potential to change the world,” said Agrawal.
Bradley Hayes, Assistant Professor of Computer Science at CU Boulder, commended the work, saying: “Shivendra’s work is the perfect combination of technical innovation and impactful application, going beyond navigation to bring advancements in underexplored areas, such as assisting people with visual impairment with social convention adherence or finding and grasping objects.”
The team reported its findings at the International Conference on Intelligent Robots and Systems in Kyoto, Japan.
On a final note, Agrawal said: “AI and computer vision are improving, and people are using them to build self-driving cars and similar inventions, but these technologies also have the potential to improve the quality of life for many people.”