NextMind technology decodes the act of focusing.

Brain activity is translated into real-world actions, allowing you to control a wide range of digital objects with your mind in real time.

How it works


1. Objects are tagged with faint graphic overlays specially optimized for the visual cortex.

2. The visual cortex generates electrical brain waves that are picked up by the NextMind Sensor.

3. The NextMind Engine uses machine learning to decode brain activity and pinpoint the object of focus.

4. From the moment you start focusing, you can see your brain acting on the object. As you focus more, the neural feedback on the digital object increases until it obeys your command.
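The focus-and-feedback loop described above can be sketched as evidence accumulation: a decoder scores each tagged object frame by frame, and a command fires once one object's score crosses a threshold. Everything in this sketch (class names, scores, thresholds) is invented for illustration; it is not the NextMind Engine or its API.

```python
class FocusDecoder:
    """Toy stand-in for a neural decoder: it accumulates per-object
    evidence over time and fires a command once one object's score
    crosses a threshold. Purely illustrative; the real NextMind
    Engine decodes visual-cortex signals with machine learning."""

    def __init__(self, objects, trigger_threshold=1.0, decay=0.9):
        self.scores = {obj: 0.0 for obj in objects}  # running focus scores
        self.trigger_threshold = trigger_threshold   # command fires here
        self.decay = decay                           # older evidence fades

    def update(self, frame_scores):
        """Blend one frame of per-object evidence into the running
        scores; return the object that triggered, or None."""
        for obj, evidence in frame_scores.items():
            self.scores[obj] = self.decay * self.scores[obj] + evidence
            if self.scores[obj] >= self.trigger_threshold:
                return obj
        return None

# Simulate a user holding focus on the "lamp" object: steady evidence
# for the lamp, only noise-level evidence for the others.
decoder = FocusDecoder(["lamp", "door", "tv"])
triggered = None
while triggered is None:
    triggered = decoder.update({"lamp": 0.15, "door": 0.02, "tv": 0.01})
print(triggered)  # lamp
```

The decay term mirrors the feedback behavior in step 4: sustained focus builds the score toward the trigger, while brief glances fade away without issuing a command.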

In Action

Continuous control over digital objects progressively builds a new sense of direct brain command.


“This technology breakthrough represents the next frontier of human-computer interaction.”

Sid Kouider, Founder and CEO of NextMind, Slush 2019.


What are Direct Brain Commands?

A Direct Brain Command is the act of controlling digital technology using your mental focus. It lets you interact with digital games and devices without manual controllers or gamepads, giving you a greater sense of freedom and control.

Is it safe?

Yes. Our product is certified by regulatory authorities in all countries we currently ship to (CE, FCC, etc.). NextMind’s technology uses EEG (electroencephalography), a technique that has been used for almost a century to measure neural signals at the surface of the head.

NextMind is designed as a neurotech platform for developers:

  • Accessible. We’ve done the hard work of processing neural signals, so you can focus on creating new applications. 
  • User-friendly. Easy to put on and play.
  • Adaptable. Compatible with many environments and controllers.
  • Direct. Command digital objects in real time.
  • Unlimited. Create your own brain-enabled use cases.

With the NextMind Dev Kit, you can build the first generation of mind-controlled experiences.
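For developers, a mind-controlled experience reduces to wiring tagged objects to in-app actions. The shipped Dev Kit SDK targets Unity; the Python sketch below only illustrates the event-driven shape such an application might take, and every name in it (NeuroTag, on_triggered, trigger) is hypothetical, not the actual NextMind SDK API.

```python
# Hypothetical event-driven shape for a mind-controlled app.
# All names here are invented for illustration; they are not
# the actual NextMind SDK API.

class NeuroTag:
    """A digital object carrying a focus-decodable visual overlay."""

    def __init__(self, name):
        self.name = name
        self._callbacks = []

    def on_triggered(self, callback):
        """Register a handler to run when sustained focus is decoded."""
        self._callbacks.append(callback)

    def trigger(self):
        """Called by the decoding engine once focus crosses threshold."""
        for callback in self._callbacks:
            callback(self.name)

# Wire two tagged objects to in-app actions.
events = []
lamp = NeuroTag("lamp")
lamp.on_triggered(lambda name: events.append(f"{name}: toggled"))
door = NeuroTag("door")
door.on_triggered(lambda name: events.append(f"{name}: opened"))

lamp.trigger()  # engine decoded sustained focus on the lamp
print(events)   # ['lamp: toggled']
```

Keeping the decoding engine behind a callback interface like this means application code never touches raw neural signals, which is the division of labor the "Accessible" bullet above describes.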
