I’ve been following the progress of Emotiv for a couple of years now, for two reasons. The first is a parochial one – the founders started out in Australia and have created some real waves with their work. The second is more straightforward: brain-computer interfaces are a natural evolution for virtual environments, and Emotiv’s product is one of the first available to consumers at a reasonable price.
Emotiv CEO Nam Do is in Sydney for XMediaLab Global Media Cultures, so I took the opportunity to nab him for a few quick questions about the soon-to-be-released EPOC headset:
Lowell: For the newly initiated, how would you describe Emotiv’s mission as a company?
Nam: Since the beginning, man-machine interaction has always been in conscious form: we have always consciously directed machines to perform different tasks for us. However, interaction between humans is much more complex. We take in not only conscious communication but also read each other’s expressions and feel each other’s emotions. Our mission is to create a new form of interface that takes man-machine interaction a lot closer to that of human to human. We have created a neuro-technology that allows the computer not only to read our conscious thoughts but also to understand our non-conscious expressions and emotions. It’s a headset called the Emotiv EPOC that reads your brainwaves and allows you to control applications with your mind.
Lowell: How have pre-orders of the EPOC been going?
Nam: Pre-orders are great. We have a very strong and growing fan base that has followed and supported us for a long time. The orders are coming in very strongly, and within only two weeks we have almost filled our limited release for Christmas. And I’m talking only about the US market and direct orders from our website. It does feel good!
Lowell: Can you confirm which applications / games / worlds are actively working to support Emotiv products?
Nam: We bundle a few applications and games with the headset, and we are also testing third-party applications to include in the box at launch. Users will also have access to a website where developers around the world upload their applications, much like the iPhone store.
Lowell: Are you able to outline the Emotiv product map over the coming year?
Nam: Making the product better and adding more and more applications ALL THE TIME.
Lowell: Are you meeting much skepticism on the efficacy of the EPOC?
Nam: Surprisingly, no. We expected a lot more skepticism for our first release, as this is a totally new technology. You have to remember that although it has a lot of potential, it’s like the computer in the 70s: you can’t expect broadband internet and photo-realistic graphics yet. We released our SDK headset a few months ago and the feedback has actually been really good; people are amazed at what the headset can do.
Lowell: Do you have data on the ‘accuracy’ of the EPOC, i.e. are there any objective measurements of the proportion of the time that emotions are translated effectively versus not?
Nam: We have three different detection suites, called Expressiv, Affectiv and Cognitiv. Latency ranges from 10ms for Expressiv to about 150ms for Cognitiv. Accuracy also depends on how many Cognitiv actions you are trying to perform at a time: with two, accuracy is super high; it gets less accurate when you add more actions too quickly, before you have mastered the first two. This has a lot to do with your ability to make your different thoughts distinct enough, so you see different people with different mental abilities, which makes it really interesting.
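For readers wondering what those three suites might look like from a developer’s side, here is a very rough sketch of the kind of polling loop a game could run against a headset. To be clear, this is illustrative only and is not the Emotiv SDK: the MockHeadset class, its method names and the simulated values are all invented for the example, with the polling interval loosely based on the ~150ms Cognitiv latency Nam mentions above.

```python
import random
import time

# Hypothetical stand-in for a headset connection; the real Emotiv SDK
# exposes its own API, which this mock does not attempt to reproduce.
class MockHeadset:
    COGNITIV_ACTIONS = ["neutral", "push", "pull"]  # e.g. two trained actions plus neutral

    def poll_expressiv(self):
        # Facial-expression events (the low-latency suite, ~10ms per the interview)
        return {"blink": random.random() < 0.1, "smile": random.random() < 0.05}

    def poll_affectiv(self):
        # Emotional-state estimates, normalised to 0..1
        return {"excitement": random.random(), "frustration": random.random()}

    def poll_cognitiv(self):
        # Trained mental command plus a confidence score (~150ms latency)
        return {"action": random.choice(self.COGNITIV_ACTIONS),
                "power": random.random()}

def game_loop(headset, ticks=5):
    for _ in range(ticks):
        expressions = headset.poll_expressiv()
        emotions = headset.poll_affectiv()
        command = headset.poll_cognitiv()

        # A game or virtual world would map these onto avatar behaviour:
        # mirror a smile, adapt pacing to excitement, move an object on "push".
        if expressions["smile"]:
            print("avatar smiles")
        if command["action"] != "neutral" and command["power"] > 0.7:
            print(f"apply mental command: {command['action']}")
        print(f"excitement={emotions['excitement']:.2f}")

        time.sleep(0.15)  # poll roughly at the slowest suite's latency

if __name__ == "__main__":
    game_loop(MockHeadset())
```

The point of the sketch is simply that the three suites feed a game at different rates and with different kinds of signal, which is why accuracy scales with how many Cognitiv actions a user has trained.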
Lowell: Has there been interest from the developer community in the SDK, and if so, can you give some examples of new ways the Emotiv IP is being explored?
Nam: There is huge interest from the development community: we now have over 6,000 developers worldwide, from giant multinational corporations to independent developers. We expect to see a lot of interesting applications over the next few months. Lots of developers are submitting their applications to be bundled with the consumer headset.
Lowell: The Emotiv forums seem to have some significant dissent around the pricing of the developer SDK and its highly proprietary structure. Is Emotiv reviewing its pricing at all, or do you have a response to claims that the cost is inhibiting widespread developer take-up?
Nam: Yes, we had a few people disappointed with the pricing of full raw EEG data at the beginning, when it was $25,000. We have since restructured the cost to allow developers interested in raw EEG data to get access to what they want (not necessarily the full 16 channels) at a much lower cost, starting at $2,500. You have to remember that anything out there with half the resolution of our headset is priced at a minimum of around $50K. Since then, we have seen a huge pick-up in the number of people buying the raw EEG version of the SDK. So the answer is yes, we have reviewed and changed the pricing, and it is now very attractive to developers.
—–
We’ll be reviewing the EPOC in the near future. In the meantime, if you want to see the EPOC in action, check out the YouTube channel or this snippet from The New Inventors earlier this year:
Over to you: what are your thoughts on the EPOC? Does it appeal to you as an interface?
Daynuv says
I presume someone is integrating this amazing device with a Second Life / Opensim client? Do you have any insider knowledge?
Lowell Cremorne says
Hi Daynuv,
No, unfortunately no further insight on that, but it’d be a great thing to see for sure.