Community focus: XR with Dilmer Valecillos

NextMind and Dilmer Valecillos discuss the future of brain-computer interfaces (BCI) in XR and his tutorial series.

We recently sat down with NextMind user Dilmer Valecillos to talk about his experience using NextMind in his XR development. Dilmer is passionate about teaching XR development and sharing his knowledge on his YouTube channel. He has started a series on using BCI in your XR projects.

“I think it [BCI] is the future and I wanted to teach that to all the people.”

Dilmer Valecillos, XR developer

The interview has been shortened for clarity and conciseness.

The NextMind experience

NextMind: Since you’ve been in the VR community for a while, you must have seen many tech trends come and go. What makes BCI so special for XR?

Dilmer: In the future, if BCI follows how we went from talking on the phone to texting, we will be able to communicate more than we normally do. It is amazing that you can read the mind and translate that to the screen. Being able to turn on the TV, talk to somebody, everything around you, with the community caring… I’m really excited about that!

A lot of people see that as a negative thing, but we are in a world that is moving so fast. So let’s see BCI [as a tool] to communicate more than we normally do!

NextMind: What was it like using a BCI for the first time? 

Dilmer: When I imagined a BCI with sensors all around your head, I was like, ‘I don’t know if that’s going to be easy to jump into!’ But when I started playing with the one you guys offered, I was like, ‘Oh, this is really a thing that I can offer to the community, that I can experiment with very fast!’

I went to your website and downloaded the SDK. You guys have everything set up in a way that I was able to just follow step-by-step instructions. I got it up and running in Unity in less than 5 minutes.

I think for beginners there is not really much of a challenge other than getting to know Unity. Once you know Unity, it’s just following the instructions in the developer area. If you have C# and Unity knowledge, I think jumping into NextMind is totally doable for beginners.

NextMind: Eye tracking is becoming more common in VR headsets. A common question we get is: what is the difference between eye tracking and BCI? As a user of both technologies, can you describe the different experiences?

Dilmer: When you are using eye tracking, you are looking at something and it basically focuses on that specific object or area. With NextMind, I felt like I had to concentrate, and the more you concentrate, the more organic it feels, versus just having a pointer at a specific place. To me, using a BCI was more immersive than using eye tracking.

I felt like it was more accurate a lot of the time. Getting into the SDK, I also really liked that I was able to get a confidence value, versus eye tracking, where you just get a 1 [a binary hit or no hit]. The OnReleased and OnTriggered events give me way more granularity about what I can do in games than I could get with eye tracking.

For more information about the available events, see the NextMind Dev Kit API Reference.
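
To make the event-driven interaction Dilmer describes a little more concrete, here is a minimal Unity sketch that subscribes to a focus-tracked object's events and reads its confidence value. The namespace, component, and event names (NextMind.NeuroTags, NeuroTag, onTriggered, onReleased, onConfidenceChanged) are assumptions based on the events mentioned above; check the NextMind Dev Kit API Reference for the exact API.

```csharp
// Illustrative sketch only: the NextMind namespace, the NeuroTag component,
// and the event names below are assumptions -- confirm them against the
// NextMind Dev Kit API Reference before using.
using NextMind.NeuroTags;
using UnityEngine;

[RequireComponent(typeof(NeuroTag))]
public class FocusResponder : MonoBehaviour
{
    private NeuroTag neuroTag;

    private void Awake()
    {
        neuroTag = GetComponent<NeuroTag>();

        // Fired when the user's focus on this object crosses the trigger threshold.
        neuroTag.onTriggered.AddListener(HandleTriggered);

        // Fired when the user's focus drops off the object again.
        neuroTag.onReleased.AddListener(HandleReleased);

        // Continuous value describing how strongly the user is focusing (roughly 0 to 1).
        neuroTag.onConfidenceChanged.AddListener(HandleConfidenceChanged);
    }

    private void HandleTriggered()
    {
        Debug.Log("NeuroTag triggered");
    }

    private void HandleReleased()
    {
        Debug.Log("NeuroTag released");
    }

    private void HandleConfidenceChanged(float confidence)
    {
        Debug.Log($"Focus confidence: {confidence:0.00}");
    }
}
```

The confidence callback is what gives the extra granularity Dilmer mentions: instead of a single hit, you get a continuously varying focus value that you can map to game mechanics.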

Helping others explore BCI in XR

NextMind: What was your goal in making your tutorial series?

Dilmer: I think it’s one of those things that you don’t see until you try it. But if I’m trying it and showing it, then it clicks, right? So my goal was letting the community know that it’s not as complicated as it sounds, because NextMind made it easier for developers.

And I’m going to keep experimenting with prototypes so that this tutorial series becomes something that people go to, and it can open up more possibilities for use cases in the [XR] industry.

NextMind: In what ways do you think BCI tech will be used in XR going forward?

Dilmer: BCI with locomotion was one thing that I thought was interesting. One interesting use case would be to look at different areas and control the movement in a headset based on how much you focus.

Dilmer created a locomotion tutorial using BCI in VR. See his YouTube series for the tutorial and access the GitHub repository here.
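
As a rough illustration of the focus-driven locomotion idea (not Dilmer's actual tutorial code), the sketch below moves an XR rig in the direction the headset is facing, scaled by a 0-to-1 focus confidence value. The CharacterController setup and the way the confidence callback is wired up (for example, to a NeuroTag confidence event) are assumptions; see Dilmer's YouTube series and GitHub repository for his real implementation.

```csharp
// Hypothetical sketch of focus-driven locomotion: the movement scaling, the
// confidence wiring, and the rig setup are assumptions, not Dilmer's code.
using UnityEngine;

public class FocusLocomotion : MonoBehaviour
{
    [SerializeField] private CharacterController controller; // the XR rig's controller
    [SerializeField] private Transform headset;              // main camera / HMD transform
    [SerializeField] private float maxSpeed = 2f;            // metres per second at full focus

    private float currentConfidence;                          // last focus value received

    // Hook this up to the BCI confidence event (e.g. a NeuroTag's onConfidenceChanged).
    public void OnConfidenceChanged(float confidence)
    {
        currentConfidence = Mathf.Clamp01(confidence);
    }

    private void Update()
    {
        // Move in the direction the headset is looking, flattened onto the ground plane,
        // at a speed proportional to how hard the user is focusing.
        Vector3 forward = Vector3.ProjectOnPlane(headset.forward, Vector3.up).normalized;
        controller.SimpleMove(forward * (currentConfidence * maxSpeed));
    }
}
```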

I think BCI can also play a big role in helping with UI. It can also help people who can’t talk or don’t have hands. You are basically giving them hands; you are giving them all the tools that they don’t have right now. To me, this is a big win.

I see it as… something that becomes part of the device at some point. I want to see your technology incorporated into Oculus devices and the many different Vive devices. Absolutely, I see that merging at some point.

One cool use case would be to use NextMind’s BCI to interact in augmented reality. I have not seen anything like that yet, so, depending on the first video I build, I’m going to have a series of videos with augmented reality and BCI. I see a lot of future for BCI on my channel because it’s no secret that it’s going to be used a lot in the future. Just like my watch is useful today, I won’t be living without augmented reality glasses in the future, and I think BCI is up there too because of the ease with which you can activate and interact with things.

What would you like to create with the NextMind Dev Kit? Tell us in the comments.
