Looking for inspiration on how to use your Dev Kit? Check out some of these examples from our community.
Electrical muscle stimulation through mind selections
German Vargas from the College of Coastal Georgia transforms his passion into amazing science projects on his YouTube channel.
Using the NextMind Dev Kit, German demonstrates a neuroprosthetics concept where a user with limited mobility could regain muscle control using mind-activated buttons. When the user focuses on one of the arrows, the selection is transmitted through an Arduino board to a TENS (transcutaneous electrical nerve stimulation) unit, which contracts the arm muscles.
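The signal path German describes could be sketched like this on the Arduino side: the BCI reports which arrow is being focused, and the board maps that selection to the pin driving the corresponding TENS channel. The function name, pin numbers, and tag indices below are illustrative assumptions, not part of NextMind's or German's actual code.

```cpp
// Illustrative sketch only: map a focused NeuroTag to a TENS output pin.
// Pin numbers and tag indices are assumptions for the example.
const int kTensPins[2] = {7, 8};  // example relay pins, one per TENS channel

// Map a focused NeuroTag index (0 = left arrow, 1 = right arrow) to the
// digital pin driving that TENS channel; -1 means "no stimulation".
int tensPinForTag(int tagIndex) {
    if (tagIndex < 0 || tagIndex >= 2) return -1;
    return kTensPins[tagIndex];
}
```

On a real board, the loop would read the triggered tag (for example over serial from the NextMind SDK) and briefly drive the returned pin high to fire the stimulation pulse.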
We find this example inspiring because it shows a potential way to use visual BCIs for medical purposes.
But he didn’t stop there. German also created a mind-controlled Lego car you can check out here.
Lucid: A combination of voice and brain commands
Alex Lepine is a game development and design student exploring alternative gameplay.
Alex is designing a puzzle game called Lucid. The player is a spy who must avoid getting caught by the night guards to complete the level. The twist is that you advance your character (the red block) via voice recognition. To see around corners and avoid the guards, activate the NeuroTags on the security cameras (pink spheres).
Beyond the novel gameplay experience, Alex was also aiming to design more accessible gameplay. No mobility or fine motor movements are required thanks to voice recognition and mind interactions.
Lucid is still a work in progress, but a prototype will be made available online soon. This is a great example of how mind interactions can be combined with other technologies for a never-before-seen gaming experience.
Internet of Things
Collin Cunningham is a creative engineer working at Adafruit, a company that makes fun DIY electronics projects.
Collin got his hands on one of our Dev Kits to explore how mind interactions can be applied to smart objects.
He designed an interface with three NeuroTags; depending on which one you focus on, the Sensor interprets your selection as a command, and a motor rotates a pointer to the chosen option. See the project in his detailed blog post, which shows you how to build your own.
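The selection-to-motor step could be sketched as a simple mapping: each of the three NeuroTags corresponds to a pointer position, so the focused tag's index becomes a servo angle. The angles and the function name are assumptions for illustration, not Adafruit's or NextMind's actual code.

```cpp
// Illustrative sketch only: convert a focused NeuroTag (0, 1, or 2) into a
// servo angle in degrees, spreading the three positions across a 180° arc.
int pointerAngleForTag(int tagIndex) {
    if (tagIndex < 0 || tagIndex >= 3) return -1;  // unknown tag: don't move
    return tagIndex * 90;                          // 0, 90, or 180 degrees
}
```

A servo library call (such as Arduino's `Servo::write`) would then move the pointer to the returned angle whenever the SDK reports a new triggered tag.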
This example highlights the possibilities of IoT interactions where brain commands are translated into the real world.
What do you dev? Tell us what you’re working on in the comments for a chance to be featured on the blog.