XR’s wide range of interface technologies: Achieving total immersiveness

Visuals are important for creating convincing representations of the real world in extended reality, but on their own they are neither inclusive nor capable of conveying a sufficiently authentic sensation of virtual worlds. Genuine immersiveness will require interfaces that address the breadth of human senses.

Although consumers have become very familiar with speech recognition, current discussions around the metaverse and extended reality (XR) largely omit the role that speech and sound can play in virtual or enhanced environments. After all, most humans interact with their surroundings almost constantly by making and identifying sounds.

This is a big omission. Indeed, as the drive for a compelling metaverse experience continues, the goal of genuine immersiveness will be realised only when virtual landscapes and elements can deliver the full portfolio of sensations that allows users to become at one with these environments.

Siri has become one of the best-known speech assistants since Apple introduced the feature in 2011. The technology goes back to an SRI International project from 2003, which the research institute spun off as an independent company in 2007. Apple acquired that company in 2010, and since then Amazon.com has introduced Alexa and Google has created its Google Assistant.

Microsoft has its own speech recognition technology, and in March 2022 it acquired Nuance Communications, a provider of speech-recognition and artificial intelligence (AI) technologies. Coincidentally, Nuance also traces back in part to another SRI International spin-off, which was acquired in 2005, long before the combined company became part of Microsoft.

Sound can become the story

But sound can be more than speech, and film and game developers have shown that sound can become the story – be it the ticking of a watch, an almost subliminal bass that echoes a beating heart, or explosions and the like.

Flexound has developed a technology that can be embedded in theatre or gaming chairs, as well as in cushions or other furniture. The system creates personal sound spheres and allows users to feel the sound with their body. Meanwhile, Sennheiser has developed immersive sound technology that Netflix is now using in some of its movies and selected scenes. Sennheiser’s AMBEO range also offers spatial sound for a wide range of applications, including cinematic virtual reality.
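
For developers, the underlying principle of such spatial audio – placing sound sources at positions around the listener – is already accessible in mainstream tools. The sketch below uses the standard Web Audio API, not any of the vendor technologies mentioned above, and the file URL and coordinates are placeholders:

```typescript
// Minimal spatial-audio sketch using the standard Web Audio API.
// It only illustrates positional sound; it does not represent Flexound's or
// Sennheiser's technology.
const ctx = new AudioContext();

async function playPositionedSound(url: string, x: number, y: number, z: number) {
  // Fetch and decode the audio clip.
  const response = await fetch(url);
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  const source = ctx.createBufferSource();
  source.buffer = buffer;

  // HRTF panning approximates how each ear would hear a source at (x, y, z),
  // giving the listener a sense of direction and distance.
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: x,
    positionY: y,
    positionZ: z,
  });

  source.connect(panner).connect(ctx.destination);
  source.start();
}

// Example: a heartbeat placed slightly behind and to the left of the listener.
playPositionedSound("/audio/heartbeat.wav", -1, 0, -2);
```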

Many companies also already offer services to design meaningful sound environments for specialised settings – even if their current focus is not on XR or metaverse-related applications. These efforts range from fairly comprehensive services to narrowly targeted offerings.

Spatial Inc, for example, is developing soundscapes for retail outlets, hospitality environments, office spaces and museums, among other locations. Sen Sound, meanwhile, is focusing on designing more pleasant, less stressful sounds for hospitals and healthcare locations. It is easy to see how such design expertise will find its way into creating more meaningful and immersive augmented reality (AR) and virtual reality (VR) applications.

The telling touch of haptics

Another set of interface technologies that already plays a role in high-end gaming will help XR applications to become more immersive and, in many cases, more inclusive and relevant. Haptics can enable use cases that are currently difficult to translate from real into virtual environments.

Haptics divide into tactile and kinaesthetic sensations. Tactile experiences relate mainly to the skin, such as the perception of texture, touch, pressure or vibration. Kinaesthetic experiences relate to muscles, tendons and joints – the perception of weight and stretch, as well as the movement of body parts. Different interfaces already exist, and more will be developed, to address these types of sensation.

Perhaps the most beneficial application will be haptic feedback for the hands. Holding a tool, squeezing an object or feeling a surface in a virtual environment becomes far more authentic when haptic feedback is added. At the most basic level, such feedback enables virtual versions of standard interfaces, such as buttons that can be pushed or dials that turn with a clicking sensation.
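
As a rough illustration of that idea – not any vendor’s implementation – the sketch below fires a short vibration pulse when a finger presses a virtual button past a threshold, with a little hysteresis so the “click” triggers only once per press. The actuator interface mirrors the pulse() call defined by the Gamepad haptic-actuator extensions, but all names and thresholds here are illustrative:

```typescript
// Hypothetical actuator interface, modelled on the Gamepad extensions' pulse() call.
interface VibrationActuator {
  pulse(intensity: number, durationMs: number): Promise<boolean>; // intensity 0..1
}

interface VirtualButton {
  pressDepth: number; // 0 = untouched, 1 = fully pressed
  clicked: boolean;   // latched once the press threshold is crossed
}

const CLICK_THRESHOLD = 0.8;

// Call once per frame with the finger's penetration depth into the button.
async function updateButton(
  button: VirtualButton,
  depth: number,
  actuator: VibrationActuator
): Promise<void> {
  button.pressDepth = depth;

  if (!button.clicked && depth >= CLICK_THRESHOLD) {
    // A short, sharp pulse reads as a mechanical "click" under the fingertip.
    button.clicked = true;
    await actuator.pulse(1.0, 20);
  } else if (button.clicked && depth < CLICK_THRESHOLD * 0.5) {
    // Hysteresis: release the latch only once the finger has clearly withdrawn.
    button.clicked = false;
    await actuator.pulse(0.4, 10);
  }
}
```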

A number of companies offer gloves as interfaces. HaptX features both types of feedback and includes motion tracking for applications in VR and telerobotics. SenseGlove offers an advanced solution that lets users get a sense of the size, density and resistance of objects in virtual environments. Sensations in the fingers and hands to manipulate and explore objects in VR offer obvious benefits.

It is therefore no surprise that Meta’s Reality Labs is experimenting with haptic gloves to gain an understanding of their possibilities. Other companies are trying to achieve similar sensations without the need to don a glove. Ultraleap is one of these, using ultrasound to project haptic sensations onto the hands.

These different approaches will find use in different environments and for different use cases. Ultrasound can be used more easily in public spaces and for AR, where the need to put on gloves might cause friction in the experience. Meanwhile, physical gloves can create more diverse sensations and represent particular objects more accurately.

To create a complete sense of immersiveness, entire suits can offer enveloping sensations. Systems that provide haptic feedback to the upper body or the whole body are welcome additions for entertainment, training, diagnostic and therapeutic applications in which users need to become fully at one with virtual environments. One example is bHaptics’ range of TactSuit products – essentially vests that incorporate haptic feedback distributed around the body. The company also offers a haptic glove.

Teslasuit, meanwhile, offers a set of haptic garments: the Teslaglove and the Teslasuit itself, which consists of a jacket and trousers. Here the haptic sensations are based on electrostimulation.

More immersive virtual environments

Researchers are also experimenting with different approaches to creating haptic sensations. A group of scientists at the University of Chicago say they have “identified five chemicals that can render lasting haptic sensations: tingling (sanshool), numbing (lidocaine), stinging (cinnamaldehyde), warming (capsaicin) and cooling (menthol)”. Their devices include a sleeve that surrounds a section of the forearm and a strip that can be applied to the user’s cheeks, underneath the visual headset.

Perhaps attachable accessories for mass-market headsets – similar to the way the University of Chicago researchers use their strip in conjunction with headsets – could become a market niche, providing users with particular sensations. Feelreal, which began as a Kickstarter campaign, offers a shield-like device that attaches to the lower part of a visual headset and can provide the sensation of a cool breeze, warmth, water sprinkles and a wide range of smells.

Meanwhile, OVR Technology offers a device that attaches to VR headsets to provide a wide range of scents. The company mentions use cases such as meditation and response training. Leveraging VR users’ sense of smell should come naturally to developers seeking to create more immersive virtual environments. OVR CEO Aaron Wisniewski says: “The metaverse without scent would be like living life in black and white.”

Such accessories can provide a wide range of interaction possibilities. A research team at Salzburg University of Applied Sciences has developed AirRes, an add-on for the Meta Quest 2 headset. The device, which looks like a gas mask, is touted by the researchers as a breathing interface: a resistance valve turns the wearer’s breathing into input, enabling interactions such as playing virtual wind instruments, blowing out candles on a birthday cake or using a blowpipe to propel projectiles.

The resistance valve can also restrict a user’s breathing, thereby simulating entering a smoke-filled room, for instance.

Applications in gaming or emergency training are obvious. Some of the applications the researchers present point to the device’s potential to give virtual environments new levels of authenticity. Lightly exhaling at a mirror to fog it up can reveal hidden numbers left behind, similar to the way children leave messages for each other in mystery games.

Such fogging of glass or metal surfaces creates a sense of realistic interaction with objects in an environment and could, for example, let players leave messages for one another on virtual windows or metal surfaces.
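
To make the interaction model concrete, a minimal sketch of how breath input might drive one of these use cases could look as follows. The sensor reading, units and thresholds are hypothetical and are not taken from the AirRes research:

```typescript
// Hypothetical breath-input sample; a real device would supply calibrated values.
interface BreathSample {
  flow: number; // exhalation flow rate, arbitrary units, 0 = not exhaling
}

interface Candle {
  lit: boolean;
  distance: number; // metres from the user's mouth
}

const BLOW_OUT_FLOW = 0.6;  // minimum flow needed to extinguish a nearby candle
const MAX_BLOW_RANGE = 0.5; // candles further away than this are unaffected

// Call once per frame with the latest breath sample.
function updateCandles(sample: BreathSample, candles: Candle[]): void {
  for (const candle of candles) {
    if (
      candle.lit &&
      candle.distance <= MAX_BLOW_RANGE &&
      sample.flow >= BLOW_OUT_FLOW
    ) {
      candle.lit = false; // a strong enough exhalation at close range puts the flame out
    }
  }
}
```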

Meanwhile, H2L Technologies has developed a wristband that can inflict pain via small electric shocks. The company’s CEO, Emi Tamaki, says: “Feeling pain enables us to turn the metaverse world into a real world, with increased feelings of presence and immersion.” Although the device can provide a sense of pain, its main purpose is to create the sensation of resistance and weight when interacting with objects in VR.

Finally, brain interfaces could be the ultimate connection to virtual worlds and augmented landscapes. Such connections to XR environments are currently speculative, but as neuroscience advances, fairly simple applications will become conceivable – and companies are already exploring related technologies.

Meta Platforms’ Reality Labs is looking at the use of a brain-computer interface (BCI) for AR glasses, particularly for communication applications. Others also see benefits in combining AR glasses with brain interfaces. In March 2022, Snap acquired NextMind, a developer of BCIs, presumably to explore the use of such an interface with its AR smartglasses, Snap Spectacles. Potential uses include gaming or operating devices.

Meanwhile, Elon Musk’s Neuralink is working more generally on developing interfaces that enable “a direct link between the brain and everyday technology”. In 2021, the company released information showing a macaque monkey moving a cursor on a screen and playing the video game Pong via brain activity alone.
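
Conceptually, the final stage of such a system is simple: once brain signals have been decoded into a movement intent, that intent drives the cursor. The sketch below shows only this last step, with the decoder itself reduced to a hypothetical interface – the signal acquisition and classification it stands in for are the genuinely hard parts:

```typescript
// Hypothetical output of a neural decoder; producing it is the hard problem.
interface DecodedIntent {
  dx: number; // intended horizontal velocity, -1..1
  dy: number; // intended vertical velocity, -1..1
}

interface Cursor {
  x: number;
  y: number;
}

const CURSOR_SPEED = 300; // pixels per second at full intent

// Integrate the decoded intent over one frame to move the cursor.
function applyIntent(cursor: Cursor, intent: DecodedIntent, dtSeconds: number): void {
  cursor.x += intent.dx * CURSOR_SPEED * dtSeconds;
  cursor.y += intent.dy * CURSOR_SPEED * dtSeconds;
}
```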

Sophisticated XR

Currently, many developments in XR and the metaverse are experimental, and interface technologies are part of that experimentation. Truly immersive environments will require advanced interfaces that can deliver authentic representations of the real world – but unwieldy, difficult-to-use interfaces can prevent users from immersing themselves in VR at all.

Costs are another consideration, and safety will play an increasingly important role. But VR applications will drive the search for interfaces that enable more comprehensive interactions with virtual objects and worlds – and advanced interfaces will enable ever-more sophisticated XR applications.

Martin Schwirn is the author of Small Data, Big Disruptions: How to Spot Signals of Change and Manage Uncertainty (ISBN 9781632651921). He is also senior adviser, strategic foresight at Business Finland, helping startups and incumbents to find their position in tomorrow’s marketplace.
