From something you can feel to something that feels right
Helped by EU funding, Ultrahaptics is the first and only company in the world to develop a device which allows users to feel sensations with their bare hands while interacting in mid-air with a touchless sensing system. This frontier heralds a fundamentally new relationship with technology. The EU-funded UTOUCH project has enabled the development of ultrasonic speakers which project invisible forces into the air, capable of being felt on human skin at a distance of up to one metre.
The technology adds value to gesture-control applications, such as hand swipe recognition for control functions, providing reassurance of the user’s selection. Given that the technology can generate invisible buttons, dials and sliders that can be felt when needed, along with visible hand-tracking interfaces, the range of possible applications is almost limitless.
At this year's Consumer Electronics Show (CES, http://www.ces.tech/) in Las Vegas, the team was able to demonstrate some of these possibilities, including automotive gesture control, household device controls, a holographic ATM, and transport interfaces for increased accessibility (e.g. for the visually impaired).
The technology grew out of a PhD project undertaken by the current Ultrahaptics Chief Technology Officer, Tom Carter. UTOUCH project coordinator and Ultrahaptics CEO Steve Cliffe explains, ‘We use 40 kHz ultrasonic transducers, much like those used in vehicle parking sensors and many other places. Based on the X and Y coordinates for our target reception area, we arrange the transducers accordingly so that we can project the sound at high pressure to the right place.’
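The steering Mr Cliffe describes is the classic phased-array principle: each transducer's signal is delayed so that all the waves arrive in phase at the chosen focal point, where their pressure adds up. The sketch below illustrates that idea only; the array geometry, function names and 4x4 grid are illustrative assumptions, not Ultrahaptics' actual implementation.

```python
import math

SPEED_OF_SOUND = 343.0              # m/s in air at roughly 20 °C
FREQ = 40_000.0                     # 40 kHz carrier, as in the article
WAVELENGTH = SPEED_OF_SOUND / FREQ  # ~8.6 mm

def phase_delays(transducer_xy, focal_point):
    """Phase lag (radians) for each transducer in the z=0 plane so that
    every wave arrives in phase at the 3D focal point (illustrative)."""
    fx, fy, fz = focal_point
    distances = [math.dist((x, y, 0.0), (fx, fy, fz))
                 for x, y in transducer_xy]
    d_max = max(distances)
    # The farthest transducer fires with zero delay; nearer ones wait
    # by the path-length difference, expressed as a phase in [0, 2*pi).
    return [(2 * math.pi * (d_max - d) / WAVELENGTH) % (2 * math.pi)
            for d in distances]

# A hypothetical 4x4 grid at 1 cm pitch, focusing 20 cm above its centre.
grid = [(i * 0.01, j * 0.01) for i in range(4) for j in range(4)]
delays = phase_delays(grid, (0.015, 0.015, 0.20))
```

By symmetry, transducers equidistant from the focus (e.g. the four corners of the grid) receive identical delays; moving the focal point simply recomputes the delay pattern, which is how the focus can track a hand in real time.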
Motion sensor cameras are used in conjunction with the device to continually track the position and activity of the hand. Knowing where the hand is and what it is attempting to do allows algorithms to tell the device what kind of sensation to create, and at which 3D coordinates. Sound waves and vibrations are then manipulated to sculpt a force onto the hand, creating the desired feeling and texture.
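A 40 kHz carrier is far too fast for skin to perceive directly, so mid-air haptic systems of this kind typically modulate the focal point's intensity at a low frequency that mechanoreceptors can detect. The sketch below assumes simple amplitude modulation at around 200 Hz; the exact frequencies, depth values and sample rate are assumptions for illustration, not figures from the article.

```python
import math

CARRIER_HZ = 40_000.0   # ultrasonic carrier (inaudible; felt, not heard)
SAMPLE_RATE = 192_000   # illustrative DSP sample rate

def modulated_pressure(t, mod_hz=200.0, depth=1.0):
    """Amplitude-modulate the ultrasonic carrier at a low frequency skin
    can sense (assumed here to be in the low hundreds of hertz).
    Varying mod_hz and depth is one way to vary the perceived texture."""
    envelope = 0.5 * (1.0 + depth * math.sin(2 * math.pi * mod_hz * t))
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)

# Sample one modulation period of the signal sent to the tracked focal point.
samples = [modulated_pressure(n / SAMPLE_RATE)
           for n in range(int(SAMPLE_RATE / 200.0))]
```

In such a scheme, the tracking algorithm picks the focal coordinates while the modulation parameters shape what the hand feels there, which matches the article's description of sculpting different feelings and textures.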
To get the technology to market, the team could not simply make the device available to customers without integration support, so they created the Dev Kit.
These support tools comprise development boards and software with embedded Ultrahaptics systems, enabling a range of SMEs to code their own applications without having to undergo the Ultrahaptics evaluation programme. As Mr Cliffe recalls, ‘The EU support allowed us to demonstrate commercial traction, which was an essential precursor to attracting further investment.’
The team has been exploring a range of applications. In the car industry, for example, mid-air haptics combined with gesture recognition for in-car controls will increase safety: drivers can keep their attention on the road while the responsive technology reinforces and acknowledges their control selections.
Invisible controls that provide feedback for user interfaces without the need to touch an actual display have wide-ranging safety benefits. In medical environments, for example, they can reduce pathogen transmission from surfaces and equipment.
As the system even functions through surgical gloves, the time currently spent cleaning equipment between surgical procedures could be reduced. The same hygiene and reduced-maintenance principle also applies to public spaces, for example to elevator controls.
The technology can also be extended to the smart home and future workspaces, where heating, entertainment and lighting controls, along with workstation elements that rely on touch for activation (such as keyboards and mice), could be replaced by gestures with haptic feedback.
Speaking of some of the technology’s more unexpected successes, Mr Cliffe recalls, ‘Brand engagement is a fast-moving sector. Take film posters: people typically spend 3.8 seconds looking at one. By making them more interactive with touchless sensing we get that dwell time up to 10-15 seconds, and up to 30 seconds when we integrate gaming.’
A demonstration of this at CES included a poster with a sorcerer’s orb that emitted lightning visitors could feel. Indeed, the development team sees the next frontier in mixed reality, blending Artificial Intelligence, Augmented Reality and Virtual Reality, of particular value to the gaming and entertainment industries.
The team is currently developing textures further, investigating how altering vibrations can vary sensations of softness and roughness. As Mr Cliffe summarises, ‘The technology will be ubiquitous, replacing many devices and procedures, where it will enhance safety and security, increase accessibility – for example for the blind – as well as delivering more unique, immersive and enriching experiences in daily life.’