
Making the metaverse: hardware hurdles

25th November 2021
Beatrice O'Flaherty

The metaverse promises a step-change in how society communicates, but without the underlying hardware it will ultimately remain a pipe dream.

We will still be able to interact with the metaverse through our phones and laptops, but these will become legacy devices. To immerse ourselves, however, requires a virtual reality (VR) headset; for true integration between the created and the physical, augmented reality (AR) devices are required.

The metaverse will require capabilities from AR, VR, and legacy devices. Source: IDTechEx

The endgame is for virtual worlds to exist seamlessly alongside the real, with immersive interactions between the two changing our perception of physical presence. The software for this is nearly there, but the hardware still has many hurdles to overcome.

A touchstone in the augmented and virtual reality device industries is social acceptability, and without advances in sensing, display, and optics technology, AR and VR headsets will not be sleek enough to achieve it. Despite billions of dollars being poured into development, true AR glasses are still nowhere near sitting alongside Ray-Bans in appearance or delivering IMAX-quality images.

Meta (formerly Facebook) announced its AR glasses project at the same time as its name change, admitting they were years from viability. VR headsets sometimes invite comparisons to RoboCop; only the highest-end devices start to blur the line between what is real and what is not.

Long term, the goal is a light and comfortable device that you can wear all day, switching between AR and VR whilst enabling natural interactions between yourself and other metaverse users. The technological journey towards the hardware needed to create this hypothetical device presents even more compelling developments and complex challenges than the software.

Hardware requirements for the metaverse. Source: IDTechEx

Seeing into created realities

Putting a screen right in front of your eyes reveals things we do not notice whilst looking at a phone or TV. We see the gaps between pixels, a phenomenon called the screen door effect. The rule of thumb is that 60 pixels per degree (ppd) of field of view are required for VR or AR to start looking like reality, which places big demands on resolution.
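As a rough illustration of where those demands come from, the ppd rule can be turned into per-axis pixel counts. The field-of-view figures below are assumptions chosen for the arithmetic, not any product's specification:

```python
# Rough estimate of the display resolution needed to hit the ~60 ppd
# "looks like reality" rule of thumb. FOV values are illustrative.
def pixels_needed(fov_deg: float, ppd: float = 60.0) -> int:
    """Pixels required along one axis for a given field of view."""
    return round(fov_deg * ppd)

# e.g. a hypothetical VR headset with a 100° x 90° FOV per eye
h = pixels_needed(100)  # 6000 pixels wide
v = pixels_needed(90)   # 5400 pixels tall
print(f"{h} x {v} per eye = {h * v / 1e6:.0f} MP")  # ~32 MP per eye
```

For comparison, that is roughly eight times the pixel count of a 4K television, rendered twice (once per eye) at high refresh rates.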

On top of this, optics are required to focus and size these images correctly for our vision. In the case of AR, these optics are very inefficient, leading to brightness demands in the millions of nits – for reference, the iPhone 13 Pro Max screen maxes out at 1200 nits.
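The arithmetic behind those brightness demands is simple: the display must overcome the losses in the optical path. The efficiency figure used below is an illustrative assumption, not a measured value for any real combiner:

```python
# Why AR optics push display brightness into the millions of nits:
# only a tiny fraction of emitted light reaches the eye.
def required_display_nits(target_eye_nits: float,
                          optical_efficiency: float) -> float:
    """Display luminance needed so the image at the eye hits the target."""
    return target_eye_nits / optical_efficiency

# Assume an outdoor-readable image needs ~3000 nits at the eye,
# and the combiner delivers only 0.1% of the display's light:
print(required_display_nits(3000, 0.001))  # 3,000,000 nits at the display
```

At that kind of efficiency, even a smartphone-class 1200-nit panel falls short by more than three orders of magnitude, which is why microLED's extreme luminance matters.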

MicroLED displays are a promising solution for AR and VR. They do not suffer from burn-in like OLED displays, can reach extremely high brightness levels (JadeBird makes one for AR applications with a maximum brightness of 3 million nits), and enable tiny pixel pitches: Mojo Vision has produced what it calls a nanoLED display, small enough to fit into a contact lens, with a subpixel pitch of 900 nm.

However, there is one major issue: microLED microdisplays are not good at producing full-colour images, since blue microLEDs are significantly more efficient than other colours. Quantum dot colour conversion is the favoured solution here, converting blue light to red and green; these quantum dots can be inkjet printed or lithographically patterned.

There are still concerns with longevity, especially in very bright microdisplays, as well as reliance on heavy metals in many formulations. IDTechEx outlined the development timeline for microLED displays in detail in the report 'Micro-LED Displays 2021-2031: Technology, Commercialisation, Opportunity, Market and Players'.

The biggest battleground for AR, in particular, is combiner optics. These devices overlay projected images on a transparent lens. Here, companies fight for the best colour rendition, the widest field of view, and the largest eye box to enable a convincing display experience that works for every set of eyes.

Looking at the big news this year, surface relief waveguides seem to be the solution the industry is betting on. In May, WaveOptics was acquired by Snap (another social media giant looking towards the metaverse), and in November, investment by giants such as Samsung Electronics in Digilens led to a valuation of $500m. Both are fabless waveguide firms. An exciting development from Digilens is its TREX waveguide, which can double effective display resolution, providing one weapon in the arsenal for reaching the 60 ppd we can see.

The IDTechEx report 'Optics and Displays in AR, VR, and MR 2020-2030: Technologies, Players and Markets' delves into optical combiner and display tech in these applications in detail.

Knowing where you are looking

If you tried to build a headset that covered all 135° of each eye's horizontal field of view at 60 ppd, things would quickly become unmanageable. Fortunately, only the centre of our vision is of this high quality; the outer edges are far less demanding. By tracking the eyes, resolution can be maximised at the centre of the user's gaze whilst demands are lowered elsewhere.
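A back-of-the-envelope comparison shows how much eye tracking can save. The peripheral resolution and foveal window size below are assumptions for the sake of the arithmetic, not product specifications:

```python
# Illustrative per-axis pixel budget with and without foveated rendering.
FOV = 135       # horizontal field of view per eye, degrees
FULL_PPD = 60   # resolution needed at the centre of gaze
LOW_PPD = 15    # assumed acceptable resolution in the periphery
FOVEA = 20      # assumed eye-tracked high-resolution window, degrees

uniform = FOV * FULL_PPD                               # 8100 px across
foveated = FOVEA * FULL_PPD + (FOV - FOVEA) * LOW_PPD  # 2925 px across
print(f"uniform: {uniform} px, foveated: {foveated} px, "
      f"saving: {1 - foveated / uniform:.0%}")
```

Since the same saving applies on both axes, the reduction in total pixels rendered compounds, which is why foveated rendering is so attractive for headset makers.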

In the future, this eye-tracking tech may even be used to project AR/VR images directly onto the retina via laser beam scanning, getting around the need for combiner optics and correction for glasses wearers: that is, if consumers can get comfortable with the idea.

Companies in this space are using emerging image sensor technologies, as covered in the IDTechEx report 'Emerging Image Sensor Technologies 2021-2031: Applications and Markets', to track the eye more efficiently. Event-based vision can help keep processing demands down by natively recording movement instead of a stream of conventional image frames. Using printed image sensors, eye-tracking technology can be squeezed into a more svelte package.

Meta Materials (no relation to Meta) is already embedding microcameras directly into glasses lenses and this approach will be integrated into combiner or magnifier optics for AR and VR respectively as the technology matures.

Connecting to your avatar

All of what has been discussed so far amounts to little more than a high-end TV strapped to your face if you cannot interact with it in ways resembling the real world. Not only do AR and VR devices need to sense our movements but, for full immersion, haptic (touch feedback) devices are required as well. In more Meta news, in November 2021 its Reality Labs (RL) division showed off a prototype haptic glove, including videos of Mark Zuckerberg trialling various demos.

This glove uses microfluidic systems to deliver local haptic feedback to different areas of the hand, appearing to deliver distinct touch feedback to each finger. Although there was some controversy over this prototype's resemblance to a product from HaptX, Meta's IP position here is strong, and it represents the efforts metaverse-focused companies are making to deliver sensory experiences beyond the audio-visual.

A key winner on the sensing side in recent headsets has been time-of-flight cameras for hand tracking, eliminating the need for game-controller-style interactions with VR and AR devices. Apple has been known to be investing in the VR/AR space for years and, in October 2021, industry sources reported that LG Innotek had begun supplying time-of-flight cameras to the firm for a VR headset slated for release in 2022.

When Apple adopts a technology, it is usually a strong statement that it is about to become ubiquitous, representing another data point in the strong future IDTechEx sees for AR and VR.

IDTechEx reports cover the core technology, key players, and evolution of market sectors in haptics, wearable sensors, flexible sensors, and more related to this area.

An exciting future

In a limited way, the metaverse is already here. The hardware development mountain that prevents its full realisation is slowly being climbed and light, good-looking AR glasses replacing our phones and laptops in the future feels like a near certainty. As this journey progresses, IDTechEx will be there to chart its progress and plot out its future course.

IDTechEx offers an extensive portfolio of technical market research reports covering many technologies relevant to AR/VR and the metaverse. These include optics and displays for AR/VR, microLED displays, emerging image sensors, haptics, wearable sensors, printed/flexible sensors, and more.

All of these reports cover the current state and expected future developments, both in terms of technical capabilities and commercial adoption. Granular forecasts segmented by technology and application assist with planning future projects, while multiple company profiles based on primary interviews provide detailed insight into the major players. Also included in the reports are multiple application examples, SWOT analysis, and technological/commercial readiness assessments. Further details and downloadable sample pages for each report can be found here.
