Since the 1960s, we’ve been teased with the potential of virtual reality. It is a place of infinite creativity. A place where we can build our own alternate realities and interact with each other… The metaverse.
A tech success in the making
“The promise of VR is to make the world you wanted.” This compelling promise of John Carmack, former CTO of Oculus, is why we are still chasing our VR dreams.
We’ve come a long way from the initial headsets that were cumbersome and nausea-inducing. Innovative VR games such as Half-Life: Alyx have generated new waves of excitement around this technology. And we’ve even seen many in-person events and work meetings switching to VR in 2020. But there are still hurdles to overcome before VR becomes a household must-have.
Some are obvious, such as visual quality, processing power, and simulator sickness. All of these have improved significantly in the past few years. But one major hurdle has yet to see a satisfying solution: interactions.
The obvious starting point for VR interactions is tracking body movement and position, which is necessary for feeling a sense of ownership over your virtual avatar. Partial tracking is already possible, with some VR headsets offering six degrees of freedom.
To take the experience further, you can invest in haptic gloves or suits, scent generators, full-body speakers, and many other accessories. And while many are excited for the Omni One VR treadmill coming out later this year, this type of equipment isn’t accessible to the casual VR enthusiast.
Rather than piling on ever more high-tech devices, we need to address a more fundamental question: how do we make interacting in virtual environments intuitive and easy?
Rethinking how we interact in VR
Two technologies that have often been heralded as revolutionizing the VR experience are eye and hand tracking.
Oculus rolled out hand tracking in 2020 using cameras already built into its headset. Games such as Hand Physics Lab are starting to explore these capabilities.
Eye tracking has been around even longer and is becoming more common in VR headsets. Besides being used as a simple mechanic for selecting items, it’s also provided a workaround for technical problems. For example, the newest PlayStation VR uses eye tracking to achieve higher perceived resolution by rendering in full detail only the area you’re looking at, a technique known as foveated rendering.
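The core idea of foveated rendering can be sketched in a few lines: tiles near the gaze point render at full resolution, and resolution falls off toward the periphery. The sketch below is a simplified illustration; the radii and scale values are made up for the example, not taken from any real headset.

```python
# Minimal sketch of foveated rendering: assign a render-resolution scale
# to each screen tile based on its distance from the gaze point.
# All radii and scale values here are illustrative.

def resolution_scale(tile_center, gaze, inner_radius=0.15, outer_radius=0.4):
    """Return a render scale in [0.25, 1.0] for one screen tile.

    tile_center and gaze are (x, y) in normalized screen coordinates [0, 1].
    Tiles near the gaze render at full resolution; the periphery is downscaled.
    """
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= inner_radius:
        return 1.0        # foveal region: full resolution
    if dist >= outer_radius:
        return 0.25       # far periphery: quarter resolution
    # Linear falloff between the two radii
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t

# Example: gaze at the screen center
print(resolution_scale((0.5, 0.5), (0.5, 0.5)))    # 1.0 (full detail)
print(resolution_scale((0.95, 0.95), (0.5, 0.5)))  # 0.25 (periphery)
```

Because the eye only perceives fine detail in a small foveal region, this trade costs the viewer almost nothing while freeing up a large share of GPU time.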
These technologies have opened up possibilities in VR, but they rely on monitoring physical actions in the real world. What if we could create a more direct connection between human and machine?
The NextMind Brain-Computer Interface already makes it possible to create a telepathic link between you and your virtual environment. By incorporating our NeuroTag designs into the visual environment in VR, we can transform active focus into digital action in real time. Having these interactions directly in the headset helps maintain full immersion.
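The actual NextMind SDK ships as a Unity/C# package; the Python sketch below only illustrates the general pattern of turning a decoded focus signal into a discrete in-game action. The class, callback, and threshold names are hypothetical, not part of the real SDK: an action fires once the decoder's confidence stays above a threshold for a few consecutive frames, which filters out noisy single-frame spikes.

```python
# Illustrative sketch (NOT the NextMind SDK, which is a Unity/C# package):
# a "tag" fires its action once decoded focus confidence stays above a
# threshold for several consecutive frames. All names are hypothetical.

class FocusTag:
    def __init__(self, name, on_trigger, threshold=0.8, hold_frames=3):
        self.name = name
        self.on_trigger = on_trigger    # callback run when focus is confirmed
        self.threshold = threshold      # minimum decoder confidence per frame
        self.hold_frames = hold_frames  # consecutive frames required
        self._streak = 0

    def update(self, confidence):
        """Feed one frame of decoder confidence; trigger on a sustained streak."""
        if confidence >= self.threshold:
            self._streak += 1
            if self._streak == self.hold_frames:
                self.on_trigger(self.name)  # fire exactly once per streak
        else:
            self._streak = 0  # noisy or absent focus resets the streak

triggered = []
tag = FocusTag("alien_brain", on_trigger=triggered.append)
for c in [0.5, 0.9, 0.85, 0.95]:  # one weak frame, then sustained focus
    tag.update(c)
print(triggered)  # ['alien_brain']
```

Requiring a sustained streak rather than a single high-confidence frame is what makes the interaction feel deliberate: glancing at a tag does nothing, while actively focusing on it triggers the action.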
Similarly, Facebook is working to bring an innovative EMG armband to the public. In use, it looks similar to hand tracking, but it actually works by detecting signals sent between the brain and the muscles. When it is released, it will be interesting to see how it is integrated into VR experiences.
Integrating BCIs in virtual worlds
Experiencing mind interactions in VR will soon be possible in our upcoming demo game, MindVaders. In this universe, all interactions are mind-enabled. You can focus on the aliens’ brains to make them explode. Or focus on a teleportation device to switch locations.
A beta version of the demo has already been tested by some influencers. A public version will be released soon.
Replacing traditional inputs with mind interactions can also solve problems such as motion sickness. In our example, you move by teleporting to the point you are focusing on, which avoids the smooth artificial locomotion that commonly causes motion sickness in VR.
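The locomotion logic behind this is simple, and that simplicity is the point. The hypothetical sketch below (not code from MindVaders) shows the pattern: once focus on a destination is confirmed, the player position jumps there instantly, so there is no continuous visual motion to conflict with the inner ear.

```python
# Hypothetical sketch of focus-based teleportation. Instead of smoothly
# translating the player (a common motion-sickness trigger), the position
# jumps directly to the focused point once focus is confirmed.
# The threshold value is illustrative.

class Player:
    def __init__(self, position=(0.0, 0.0, 0.0)):
        self.position = position  # (x, y, z) in world coordinates

def teleport_if_focused(player, target, confidence, threshold=0.9):
    """Snap the player to the target when focus confidence is high enough.

    Returns True if a teleport happened, False otherwise.
    """
    if confidence >= threshold:
        player.position = target  # instantaneous jump: no vection, less nausea
        return True
    return False

p = Player()
teleport_if_focused(p, (2.0, 0.0, 5.0), confidence=0.95)
print(p.position)  # (2.0, 0.0, 5.0)
```

The same confirm-then-act pattern extends naturally to the other examples in this post, such as hands-free drop-down menus alongside traditional controllers.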
But this technology can also supplement other inputs. While using traditional controllers, you could add in simultaneous hands-free interactions. For example, use our BCI to activate drop-down menus.
This could be useful in industrial settings where VR and augmented reality are used. Assembly lines are often noisy and employees’ hands are full. Integrating visual-based tech such as NextMind and eye tracking could make these actions easier.
In gaming, you can create focus-based challenges where you operate mind-activated commands in addition to other activities.
This kind of plug-and-play approach is necessary to bring VR to a wider audience. As technologies such as BCIs start being adopted, the next few years in VR and mixed reality will look even more exciting.
How would you use mind activations in VR?