
Brain-computer interfaces: Emotiv EPOC

Started by September 26, 2009 10:58 AM
2 comments, last by ddn3 15 years, 1 month ago
I'm thinking about picking up an Emotiv EPOC headset to play with. I figure that even if it's not useful as an input device, I could get some interesting EEG-type data about myself.

1) Has anyone here ordered one, or does anyone know somebody who's received one from the company? The site claims 10 weeks to ship, which is totally ludicrous and means I may not be interested anymore by the time I get one.

2) Am I right in understanding that something like 4 inputs can be detected at once, out of a set of 13? That would mean I have to hang on to my keyboard.

3) Some forum threads I've read imply that the fastest possible response time is 0.25 seconds; does anyone know about that? That would make it nearly impossible to use as a primary input device, since a 0.25 s lag on, say, mouse movement would make it very hard to see cause and effect.

4) Since the device measures biopotentials, muscle movements are also detected. If I train with the EPOC until I have a decent level of proficiency, will I start making faces every time I think "jump" or "pull"?

I'm imagining that since we can get access to raw data from the device, there's some possibility that I'll apply a machine learning algorithm and be able to do more than the stock SDK lets me do. Or that I have amazingly clear brain pulses that will interact better with the device.
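To make that last bit concrete, here's a rough sketch of the first thing I'd try, assuming I can pull raw per-channel samples out of the SDK. The channel layout, sample rate, band edges, and the whole nearest-centroid approach are just my placeholder guesses, not anything from Emotiv's actual API:

```python
# Rough sketch: band-power features plus a nearest-centroid classifier on raw EEG.
# Assumes raw samples arrive as a (channels x samples) array; the sample rate
# and band edges below are placeholders, not Emotiv specifics.
import numpy as np

SAMPLE_RATE = 128  # Hz, placeholder
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz, placeholder

def band_power_features(trial):
    """trial: (n_channels, n_samples) raw EEG -> flat vector of per-channel band powers."""
    n_samples = trial.shape[1]
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / SAMPLE_RATE)
    power = np.abs(np.fft.rfft(trial, axis=1)) ** 2
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(power[:, mask].mean(axis=1))  # mean power in this band, per channel
    return np.concatenate(feats)

def train_centroids(trials, labels):
    """Average the feature vectors for each trained command ('jump', 'pull', 'rest', ...)."""
    centroids = {}
    for label in set(labels):
        feats = [band_power_features(t) for t, l in zip(trials, labels) if l == label]
        centroids[label] = np.mean(feats, axis=0)
    return centroids

def classify_trial(trial, centroids):
    """Return the trained command whose centroid is nearest to this trial's features."""
    f = band_power_features(trial)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))
```

The plan would be to record a handful of labelled "jump"/"rest" trials, see whether classification beats chance, and only then bother with anything fancier than nearest-centroid.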
That's how they do their cognitive detection: machine learning algorithms trained on your signals. Though I believe that to get access to the raw data you'll need a researcher license, which is about $2k; if you're serious about this, that isn't much (about the price of a high-end computer).

I think the most exciting thing about this isn't the cognitive detection (brain orders) but the emotive detection (excitement, fear, engagement). This will really let people tailor experiences to the user like never before.
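Just to make that concrete, here's a toy sketch of the kind of loop I'm picturing; the engagement/excitement readings are placeholders for whatever normalized scores the SDK actually exposes:

```python
# Toy sketch of affect-driven difficulty scaling. The engagement/excitement
# values are placeholders for whatever 0..1 scores the headset's SDK reports.
def adjust_difficulty(difficulty, engagement, excitement,
                      boredom_threshold=0.3, panic_threshold=0.85, step=0.05):
    """Nudge difficulty up when the player seems bored, down when they seem overwhelmed."""
    if engagement < boredom_threshold:
        difficulty = min(1.0, difficulty + step)   # player drifting off: push harder
    elif excitement > panic_threshold:
        difficulty = max(0.0, difficulty - step)   # player maxed out: ease up
    return difficulty
```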

Also, the facial and emotive recording opens up a lot of new ground on avatar control, which is also exciting.

-ddn
I somehow think that eye-tracking would add quite a lot to this system, at least for looking around.
I like the Walrus best.
In theory they have that capacity with their affective sensor suite: it detects muscle movement and tension, and it might even be sensitive enough to pick up eye movements and correlate them with a particular point on the screen.
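If the signals really were that clean, the calibration step could be as simple as a least-squares fit from a couple of eye-related channels to screen coordinates. That's pure speculation on my part; the inputs here are made up rather than anything the headset is documented to provide:

```python
# Speculative calibration sketch: least-squares map from eye-related signal
# channels to screen coordinates. The signal channels are assumed, not real API output.
import numpy as np

def fit_gaze_map(signals, screen_points):
    """signals: (n_samples, n_channels) readings recorded while the user looks at
    known targets; screen_points: (n_samples, 2) pixel positions of those targets."""
    X = np.hstack([signals, np.ones((signals.shape[0], 1))])  # add a bias column
    weights, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return weights

def predict_gaze(sample, weights):
    """Map one new (n_channels,) sample to an estimated (x, y) screen position."""
    return np.append(sample, 1.0) @ weights
```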

Scientists have recently demonstrated a way to detect which number a person is thinking of using brain-sensing technology like this. Maybe it won't be long before you can use the Emotiv to completely replace the keyboard?

Emotiv + Natal = the future people have been dreaming about? Free-floating air-touch technology combined with mental command-based input?

-ddn

This topic is closed to new replies.
