Brain Computer Interface Experiments

I recently got my hands on the Emotiv Epoc+ headset, a Brain Computer Interface (BCI): a device that reads neurological brain states and mental commands so you can control things in the physical world. Imagine thinking about something and it happens, like turning on a light or controlling a computer game. Or imagine a device that can understand and react to how stressed or relaxed you are… This is a device that opens up a whole host of opportunities.

First impressions?
This headset is fairly straightforward to plug in and play right out of the box. The best way to create demonstrators with it is IBM’s Node-RED programming platform: Emotiv have created a custom library for their devices that lets you easily connect the headset to websites or physical hardware.
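If you want to skip Node-RED and get at the data directly, Emotiv also expose it through a local WebSocket service (their Cortex API). The sketch below shows roughly what that plumbing looks like in Python; the port, method names, stream name and the authorisation step I’ve skipped over are assumptions based on Emotiv’s documentation and may differ between SDK versions.

```python
import json
import ssl

from websocket import create_connection  # pip install websocket-client

# Local Cortex service; the port is an assumed default.
CORTEX_URL = "wss://localhost:6868"

ws = create_connection(CORTEX_URL, sslopt={"cert_reqs": ssl.CERT_NONE})


def call(method, params, request_id=1):
    """Send one JSON-RPC request and return the decoded reply."""
    ws.send(json.dumps({"jsonrpc": "2.0", "id": request_id,
                        "method": method, "params": params}))
    return json.loads(ws.recv())


# Authorisation and session setup are omitted for brevity. Once you have a
# token and a session, subscribing to the performance-metrics stream looks
# roughly like this:
call("subscribe", {"cortexToken": "<token>", "session": "<session-id>",
                   "streams": ["met"]})

# Every subsequent message carries the latest metric values.
while True:
    print(json.loads(ws.recv()))
```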

Experiment #1
Imagine working away at your desk. Your concentration drops as you drift off into a daydream, then suddenly a chocolate Brazil nut appears in front of you. This Brazil nut does two things: first, it acts as a trigger to snap you out of your daydream; second, once eaten, the rush of the cocoa’s neurologically beneficial polyphenols and the nut’s grey-matter-boosting omega-3 and zinc brings you back to your A game.

This is exactly what we have built at Uniform as part of our experiments with Brain Computer Interfaces: a device that knows when your concentration dips and dispenses you a chocolate-covered Brazil nut.

[Image: BCI4.gif]
[Image: BCI1.jpeg]
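To give a flavour of how simple the logic is, here is a minimal sketch of the idea in Python. `get_focus_level()` is a hypothetical placeholder for however you pull the focus metric out of the Emotiv tooling, and the serial port and "DISPENSE" message for an Arduino-driven dispenser are illustrative assumptions.

```python
import time

import serial  # pip install pyserial

# Serial port and message format are illustrative assumptions for an
# Arduino-driven dispenser mechanism.
dispenser = serial.Serial("/dev/ttyUSB0", 9600)

FOCUS_THRESHOLD = 0.3    # below this, you're officially daydreaming
COOLDOWN_SECONDS = 300   # at most one nut every five minutes


def get_focus_level() -> float:
    """Hypothetical placeholder: return the headset's current focus metric,
    scaled 0.0 (completely distracted) to 1.0 (fully focused)."""
    raise NotImplementedError


last_dispense = 0.0
while True:
    focus = get_focus_level()
    now = time.time()
    if focus < FOCUS_THRESHOLD and now - last_dispense > COOLDOWN_SECONDS:
        dispenser.write(b"DISPENSE\n")  # firmware nudges the dispenser one notch
        last_dispense = now
    time.sleep(1)
```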

Experiment #2
The Emotiv Epoc can be trained to understand mental commands: simply press train and think of something. You can do this for up to six separate commands. To test this I set up a quick experiment: when I thought about blue, a blue LED would turn on, and when I thought about red, a red LED would turn on. Simple.

I found this quite a tricky function to train (I’m sure this is very brain-dependent, since people think about things in very different ways).

[Image: BCI3.jpeg]
[Image: BCI2.jpeg]
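The control logic behind the LEDs is tiny. Below is a minimal sketch of the idea in Python: `get_mental_command()` is a hypothetical placeholder for the trained command the headset currently detects, and the serial port and messages for the LED board are illustrative assumptions.

```python
import serial  # pip install pyserial

# Port and message format are assumptions for an Arduino (or similar)
# board driving the two LEDs.
leds = serial.Serial("/dev/ttyUSB0", 9600)

COMMAND_TO_MESSAGE = {
    "blue": b"BLUE\n",  # trained command #1 -> blue LED on
    "red": b"RED\n",    # trained command #2 -> red LED on
}


def get_mental_command():
    """Hypothetical placeholder: return the currently detected trained
    command ("blue", "red") or None when nothing is detected."""
    raise NotImplementedError


while True:
    command = get_mental_command()
    if command in COMMAND_TO_MESSAGE:
        leds.write(COMMAND_TO_MESSAGE[command])
```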

Experiment #3
A while ago one of our artworkers came up to me and said, ‘Wouldn’t it be great if you could blink your eyes to take a screenshot?’ I thought this was a really fun idea. How could our control of on-screen processes change as we add inputs beyond our hands on a keyboard and mouse? Can we add more inputs that are seamless and useful?

The Emotiv Epoc opens up opportunities here with its facial expression recognition function, which can recognise smiling, frowning, winking, blinking and teeth clenching.

I decided to bring my colleague’s dream to life: every time I blinked, my computer captured a screenshot. I quickly discovered that I blink a lot, so I changed the trigger to teeth clenching, a behaviour we don’t normally perform.
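The trigger logic is again only a few lines. Below is a minimal sketch in Python: `get_facial_expression()` is a hypothetical placeholder for the expression the headset reports, and the screenshots are taken with the pyautogui library.

```python
import time

import pyautogui  # pip install pyautogui


def get_facial_expression():
    """Hypothetical placeholder: return the expression currently detected by
    the headset, e.g. "blink", "wink", "smile" or "clench"."""
    raise NotImplementedError


while True:
    if get_facial_expression() == "clench":  # blinking fired far too often
        pyautogui.screenshot("screenshot_%d.png" % int(time.time()))
        time.sleep(1)   # debounce so one clench doesn't produce a burst of files
    time.sleep(0.1)
```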

What are the possibilities?
This new technology builds on Solo, our emotional radio. Instead of reading facial emotions (which can be faked), we can now read brain states of stress, engagement, interest, focus, excitement and relaxation. Imagine this technology built into a car that reacts to your neurological state and helps you drive safely. We could see whether our friends are actually listening to us in the age of the eight-second attention span. Brands could read people’s minds to help understand which product is best for them.

The aspect that excites me most about this technology is that it has the potential to understand our brains better than we do: it can tell us exactly what we’re thinking and figure out exactly what we need to perform better.
