Dentsu Lab’s Project Humanity is building interfaces that translate minute muscle signals and brainwaves into full digital expression – and in December 2025, they put that on a live stage in Amsterdam.
There is a quiet assumption embedded in most of the products we build: that the person using them can move their hands. They can swipe, tap, type, click. The entire paradigm of human-computer interaction has been built around that premise. But for an estimated 200 million people worldwide living with serious physical disabilities, that’s not the case.
The signal in the silence
ALS – amyotrophic lateral sclerosis – is a progressive neurodegenerative disease in which the motor neurons responsible for voluntary movement gradually degrade. The body becomes immobile, often entirely, while the mind remains fully intact. For most patients, the disease does not take cognition. It takes expression.
The engineers at Dentsu Lab saw this as an interface problem. If the output channel – the body – is compromised, then the solution is to find a new one. Their approach begins with electromyography (EMG) sensors attached directly to the patient’s body. These sensors detect the electrical activity generated by even the smallest residual muscle contractions.
That raw biological data is then processed and mapped onto a digital avatar – a full-body representation in virtual space that mirrors the user’s intended movements. The interface is not symbolic or abstract; it is embodied. The avatar moves because the person’s muscles are still trying to move. The system simply makes that intention legible.
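The pipeline described above – raw EMG in, avatar movement out – can be sketched in miniature. The code below is an illustrative simplification, not Dentsu Lab's actual system: it rectifies and smooths a simulated EMG trace into an amplitude envelope, then linearly maps that envelope onto a single avatar joint angle. The calibration values (`rest_level`, `max_level`) are hypothetical.

```python
import numpy as np

def emg_envelope(signal, window=50):
    """Rectify the raw EMG signal and smooth it with a moving average."""
    rectified = np.abs(signal)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def map_to_joint_angle(envelope, rest_level, max_level, max_angle=90.0):
    """Linearly map envelope amplitude to an avatar joint angle in degrees.
    Amplitudes at or below the resting level produce no movement."""
    norm = np.clip((envelope - rest_level) / (max_level - rest_level), 0.0, 1.0)
    return norm * max_angle

# Simulated EMG: a quiet baseline followed by a weak residual contraction.
rng = np.random.default_rng(0)
baseline = rng.normal(0, 0.05, 500)
contraction = rng.normal(0, 0.4, 500)
raw = np.concatenate([baseline, contraction])

env = emg_envelope(raw)
angles = map_to_joint_angle(env, rest_level=0.05, max_level=0.35)
```

Even in this toy version, the key property holds: the joint stays still during the quiet baseline and moves during the contraction, because the mapping responds to intention-driven muscle activity rather than to visible motion.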
More than accessibility
What makes this technically interesting is that the avatar is not just an avatar – it is a control surface. Once mapped, it can be used to interact with any software: productivity tools, communication platforms, creative applications. The user is not confined to a bespoke assistive tool. They have a general-purpose body in digital space.
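The "control surface" idea amounts to a routing layer: recognized avatar gestures are bound to actions in whatever software the user is running. A minimal sketch of such a layer follows; the gesture names and bound actions are hypothetical examples, not part of Project Humanity's published design.

```python
from typing import Callable, Dict

class AvatarControlSurface:
    """Route recognized avatar gestures to actions in arbitrary software."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, gesture: str, action: Callable[[], str]) -> None:
        """Associate a named gesture with an application action."""
        self._bindings[gesture] = action

    def dispatch(self, gesture: str) -> str:
        """Run the action bound to a gesture, or report it as unbound."""
        action = self._bindings.get(gesture)
        return action() if action else "unbound"

# Hypothetical bindings: the same gesture vocabulary could drive a chat
# client, a game, or an office suite by swapping the bound actions.
surface = AvatarControlSurface()
surface.bind("raise_right_hand", lambda: "send_message")
surface.bind("nod", lambda: "confirm_dialog")
```

The design choice worth noting is the indirection itself: because applications see only dispatched actions, not raw biosignals, the avatar generalizes across software without each tool needing assistive-specific code.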
Dentsu Lab has already demonstrated this in an e-sports context, running trial events where participants with physical disabilities played games online alongside people without disabilities. The result was instructive: differences in physical ability became irrelevant at the interface layer.
This has direct implications beyond gaming. If the interface can be generalised, it becomes a pathway into the Metaverse for people who currently have no viable route in – opening up educational platforms, remote work environments, and social spaces that are otherwise inaccessible.
Amsterdam, December 2025: a live proof of concept
If the e-sports trial was a proof of function, Waves of Will was a proof of expression. On the evening of 10th December 2025, in collaboration with NTT Inc., Dentsu Lab staged a live dance performance in Amsterdam featuring Breanna Olson – a professional dancer who has been living with ALS.
The technical mechanism here shifted from EMG to brainwave detection. Breanna’s neural signals – the electrical activity corresponding to her intention to move, to dance – were captured and translated in real time into choreography performed by her digital avatar on stage.
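One common way to detect movement intention from brainwaves is motor imagery via mu-band desynchronization: power in the 8–12 Hz band over motor cortex drops when a person imagines moving. The sketch below illustrates that general technique on simulated data; it is an assumption-laden stand-in, since the article does not specify which detection method the performance used, and the sampling rate, threshold, and cue names are all hypothetical.

```python
import numpy as np

FS = 256  # EEG sampling rate in Hz (an assumed value)

def band_power(eeg, low, high, fs=FS):
    """Mean spectral power in [low, high) Hz for a short EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].mean()

def detect_move_intent(eeg, rest_mu_power, threshold=0.5):
    """Flag intent when mu-band (8-12 Hz) power drops well below rest level."""
    return band_power(eeg, 8, 12) < threshold * rest_mu_power

# Simulated one-second windows: a 10 Hz mu rhythm that is suppressed
# when the dancer imagines moving.
rng = np.random.default_rng(1)
t = np.arange(FS) / FS
rest = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.05, FS)
imagining = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.05, FS)

rest_mu = band_power(rest, 8, 12)  # calibrated from a resting window
cue = "perform_step" if detect_move_intent(imagining, rest_mu) else "hold_pose"
```

In a live setting this classification would run continuously on streaming windows, with each detected intent driving the avatar's next choreographed movement.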
In an interview with BBC News, Breanna described the experience as “exhilarating” and “magical” – watching herself, in virtual form, take to the stage once more.