Welcome to the NextMind Unity SDK documentation. This section describes the SDK components and will help you design your own NextMind-enabled apps.
The Unity SDK has been designed to facilitate the development of games and applications that exploit NextMind technology. It provides a high-level API that lets you build your project without wrestling with low-level details. The NextMind SDK is designed to reduce complexity to the point where the only question you’ll need to worry about is: “Which objects do I want to interact with using just my mind?”
The NextMind SDK is built around two main components:
- the NeuroTag, a component that makes any object in your application “mind-interactable”.
- the NeuroManager, which manages the communication between the NeuroTags in the scene and the core of the NextMind Engine.
Beyond these two main components, the NextMind SDK offers a broad range of features for customizing applications: obtaining information from the NextMind Sensor (such as battery level or contact quality), managing the Bluetooth scanning behavior, simulating inputs, and more.
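As a sketch of how sensor information might be queried at runtime, the snippet below assumes a singleton `NeuroManager` accessible via `NeuroManager.Instance` that exposes the currently connected devices; the namespaces and member names (`ConnectedDevices`, `GetBatteryLevel`) are illustrative assumptions and may differ from your SDK version:

```csharp
using UnityEngine;
using NextMind;          // assumed SDK namespace
using NextMind.Devices;  // assumed namespace for the Device class

public class SensorStatus : MonoBehaviour
{
    void Update()
    {
        // The NeuroManager singleton is assumed to list connected devices.
        foreach (Device device in NeuroManager.Instance.ConnectedDevices)
        {
            // Illustrative call: query the sensor's battery level.
            Debug.Log($"Sensor battery: {device.GetBatteryLevel()}%");
        }
    }
}
```

Attaching such a component to any GameObject in the scene would log the state of each connected sensor every frame.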
After defining the NeuroTags, it is up to you to implement the actions they are supposed to trigger. The possibilities are endless, as you can connect NeuroTags to any digital or external output.
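As a minimal sketch of wiring a NeuroTag to an action from code, the example below assumes the `NeuroTag` component exposes a UnityEvent fired when the user triggers the tag; the event name `onTriggered` and the namespace are assumptions and may not match the SDK exactly (in practice you can also hook up such events in the Unity Inspector):

```csharp
using UnityEngine;
using NextMind.NeuroTags; // assumed namespace of the NeuroTag component

[RequireComponent(typeof(NeuroTag))]
public class OpenDoorOnFocus : MonoBehaviour
{
    void Start()
    {
        // Subscribe to the tag's trigger event (event name assumed).
        GetComponent<NeuroTag>().onTriggered.AddListener(OpenDoor);
    }

    void OpenDoor()
    {
        // Any digital or external output can be driven from here.
        Debug.Log("Door opened by mind interaction!");
    }
}
```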
The NextMind SDK is provided as a set of Asset Packages:
- [Core package]: All the essential files needed to build a NextMind-enabled app. It contains the core libraries exposing the main classes, along with convenient assets (prefabs, shaders, components, tools, etc.).
- [Examples package] (optional): Several examples of how to use the SDK, showing best practices (for instance, how to build your custom calibration app, or how to tag an object).
Further information about these packages can be found in the tutorials section.