Community round-up, July 2021

We love seeing what the NextMind community of developers is up to. This month we’ve got a muscle stimulator, a puzzle game, and an easy-to-make smart object.

Looking for inspiration on how to use your Dev Kit? Check out some of these examples from our community.

Electrical muscle stimulation through mind selections

Portrait of German Vargas from the College of Coastal Georgia

ASSISTIVE TECHNOLOGY

German Vargas from the College of Coastal Georgia transforms his passion into amazing science projects on his YouTube channel.  

Using the NextMind Dev Kit, German demonstrates a neuroprosthetics concept where a user with limited mobility could regain muscle control using mind-activated buttons. When the user focuses on one of the arrows, a signal is transmitted through an Arduino board to a TENS (transcutaneous electrical nerve stimulation) unit that contracts the arm muscles.
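If you want to tinker with a similar setup, the microcontroller side can stay very simple. The sketch below is our own illustration rather than German's actual code: it assumes the Unity app writes a single character over USB serial whenever an arrow NeuroTag is triggered, and that each Arduino pin drives a relay or optocoupler wired to the TENS unit's trigger. The pin numbers, baud rate, and protocol are placeholders.

```cpp
// Illustrative Arduino sketch (not German's code): listen for a character
// sent by the Unity app when an arrow NeuroTag is triggered, then pulse
// the corresponding output pin. Each pin is assumed to drive a relay or
// optocoupler wired to the TENS unit's trigger.

const int LEFT_ARM_PIN  = 7;          // placeholder pin for the left-arm channel
const int RIGHT_ARM_PIN = 8;          // placeholder pin for the right-arm channel
const unsigned long PULSE_MS = 500;   // how long to hold the channel active

void pulse(int pin) {
  digitalWrite(pin, HIGH);            // close the relay
  delay(PULSE_MS);
  digitalWrite(pin, LOW);             // release it
}

void setup() {
  Serial.begin(9600);                 // must match the baud rate used by the Unity app
  pinMode(LEFT_ARM_PIN, OUTPUT);
  pinMode(RIGHT_ARM_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();     // 'L' or 'R', one per arrow
    if (command == 'L') {
      pulse(LEFT_ARM_PIN);
    } else if (command == 'R') {
      pulse(RIGHT_ARM_PIN);
    }
  }
}
```

On the Unity side, all the app has to do is write that character to the serial port from the NeuroTag's trigger event.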

We find this example inspiring because it shows a potential way to use visual BCIs for medical purposes.

But he didn’t stop there: German also created a mind-controlled Lego car you can check out here.

Lucid: A combination of voice and brain commands

Portrait of Alex Lepine, a game development and design student

GAME DEV

Alex Lepine is a game development and design student exploring alternative gameplay.

Alex is designing a puzzle game called Lucid. The player is a spy who must avoid getting caught by the night guards to complete the level. The twist is that you advance your character (the red block) via voice recognition. To see around corners and avoid the guards, activate the NeuroTags on the security cameras (pink spheres).

Beyond the novel gameplay experience, Alex was also aiming to design more accessible gameplay. No mobility or fine motor movements are required thanks to voice recognition and mind interactions.

Lucid is still a work in progress, but a prototype will be made available online soon. This is a great example of how mind interactions can be combined with other technologies for a never-before-seen gaming experience.  

Lucid game screenshot 1
The red characters are night guards. Stay out of their beams to avoid detection!
Lucid game screenshot 2
Each pink sphere is a security camera. Switch between views to see around corners and navigate through the maze.

Why use mind interactions in your game development? Here are 3 reasons.

Mind-enabled pointer

Portrait of Collin Cunningham, creative engineer at Adafruit

INTERNET OF THINGS

Collin Cunningham is a creative engineer working at Adafruit, a company that makes fun DIY electronics projects.

Collin got his hands on one of our Dev Kits to explore how mind interactions can be applied to smart objects.

He designed an interface with three NeuroTags; depending on which one you focus on, the Sensor's output is interpreted as a command to a motor that rotates the pointer. See the project in his detailed blog post, which shows you how to build your own.
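Collin's post walks through the full build, but as a rough idea of the moving parts, here is a minimal sketch of the microcontroller side. It is our own illustration, not Collin's code: it assumes the host app sends the index of the selected NeuroTag as an ASCII digit ('0', '1' or '2') over USB serial, and that a hobby servo drives the pointer. The servo pin and angles are placeholders.

```cpp
// Illustrative sketch (not Collin's code): map each of the three NeuroTags
// to a pointer angle and move a hobby servo whenever a new selection
// arrives over USB serial.
#include <Servo.h>

Servo pointer;
const int SERVO_PIN = 9;               // placeholder servo signal pin
const int ANGLES[3] = {0, 90, 180};    // one pointer position per NeuroTag

void setup() {
  Serial.begin(9600);                  // must match the host application's baud rate
  pointer.attach(SERVO_PIN);
  pointer.write(ANGLES[1]);            // start with the pointer centered
}

void loop() {
  if (Serial.available() > 0) {
    int index = Serial.read() - '0';   // convert the ASCII digit to 0..2
    if (index >= 0 && index < 3) {
      pointer.write(ANGLES[index]);    // rotate toward the selected NeuroTag
    }
  }
}
```

Swapping the servo for a stepper, or the pointer for any other actuator, only changes the last few lines; the mind-selection part stays the same.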

This example highlights the possibilities of IoT interactions where brain commands are translated into the real world. 

Pointer with NextMind digital interface
Select a direction with your mind, and the pointer will follow! Check out Collin’s blog post to learn how to build your own smart object.

What do you dev? Tell us what you’re working on in the comments for a chance to be featured on the blog. 
