Since their introduction in 2008, Emotiv's brainwave-reading headsets have inspired a level of starry-eyed geek enthusiasm rare even in the hype-saturated world of tech. It doesn't take much brainpower to figure out why. Whether you're using them to control video game characters, pilot a drone or track your daily stress levels, these head-mounted electroencephalography (EEG) readers are truly the stuff of science fiction.
Which is why we jumped at the chance to tinker with the latest version of the device, the EPOC+, at this month's AEC Hackathon in Dallas, Texas. In truth, we'd been dreaming up design and architecture use cases for neuro-headsets at LINE for a while, so the timing was perfect: the goal of this hackathon was to figure out how new technologies and innovations might be used to solve various industry challenges.
One such application occurred to us while working on stadium shading systems: Why not use the headsets as a means of measuring and reacting to collective discomfort — like overheating? Because they're capable of gauging things like facial expression (say, squinting) and emotional states (like frustration), in theory, you could build a dynamic shading system that measures and reacts to those inputs. If a particular section of headset-equipped attendees gets too hot or experiences too much glare from the sun, the stadium itself could some day react appropriately, deploying an area-specific shading solution. Obviously, a scenario like that is still a ways off. But as wearable tech like the EPOC gets smaller, more discreet and more powerful, it's not crazy to envision the spaces around us adjusting to our needs instead of the other way around.
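To make the shading idea concrete, here's a minimal sketch of the aggregation step it implies: collect per-attendee discomfort readings, average them by seating section, and flag sections that cross a threshold. The score scale, the threshold, and the readings themselves are illustrative assumptions, not anything the headset or a real stadium system provides out of the box.

```python
# Hypothetical sketch: flag stadium sections whose average discomfort
# (e.g. derived from squint detection or frustration metrics) exceeds
# a threshold. Scores in [0, 1] and the 0.6 cutoff are assumptions.
from collections import defaultdict

def sections_needing_shade(readings, threshold=0.6):
    """readings: list of (section_id, discomfort_score) tuples."""
    by_section = defaultdict(list)
    for section, score in readings:
        by_section[section].append(score)
    # Return sections whose mean discomfort crosses the threshold.
    return sorted(s for s, scores in by_section.items()
                  if sum(scores) / len(scores) > threshold)

readings = [("A", 0.8), ("A", 0.7), ("B", 0.2), ("B", 0.3), ("C", 0.9)]
print(sections_needing_shade(readings))  # ['A', 'C']
```

A real deployment would feed this from live headset telemetry and drive the shading hardware with the result, but the core logic is just this kind of sectional aggregation.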
Another big appeal of EEG headsets lies in their ability to provide an entirely new interface for controlling software. Visualization tools abound in our field, and while keyboards and mice remain the intermediaries between us and the digital world for now, it's exciting to think of the possibilities that open up when you can start manipulating this world with your mind instead of a mouse click.
As luck would have it, that's the area I decided to explore during the Hackathon. My idea was simple: I wanted to see if I could use the EPOC+ to perform some basic tasks within two popular visual scripting environments: Grasshopper and Dynamo. After training with Emotiv's own software, which reads the EEG signals coming from the headset's 14 sensors and translates them into actions, I was eventually able to come up with a proof of concept. Specifically, I managed to use facial expressions, like smiling and blinking, to manipulate an onscreen 3-D cube in Grasshopper, making it twist, stretch or clone itself. You can read about my exploits in detail here.
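The plumbing behind that proof of concept boils down to an event router: each detected expression triggers one geometry operation. The sketch below simulates that mapping in plain Python; the event names and the fake event list stand in for live data from Emotiv's software, and the specific expression-to-transform pairings are illustrative assumptions rather than my exact Grasshopper wiring.

```python
# Hedged sketch: route detected facial expressions to cube transforms,
# loosely modeling the Grasshopper proof of concept. The event feed here
# is simulated; a real setup would consume events from Emotiv's SDK.
from dataclasses import dataclass

@dataclass
class Cube:
    size: float = 1.0       # uniform scale factor
    twist_deg: float = 0.0  # accumulated twist angle
    copies: int = 1         # number of cloned instances

def apply_expression(cube: Cube, expression: str) -> Cube:
    """Map one detected expression to a transform on the cube."""
    if expression == "smile":    # smiling twists the cube
        cube.twist_deg += 15.0
    elif expression == "blink":  # blinking stretches it
        cube.size *= 1.1
    elif expression == "wink":   # winking clones it
        cube.copies += 1
    return cube

# Simulated event stream standing in for live headset output.
events = ["smile", "smile", "blink", "wink"]
cube = Cube()
for e in events:
    apply_expression(cube, e)

print(cube)  # Cube(size=1.1..., twist_deg=30.0, copies=2)
```

Inside Grasshopper the "transforms" would of course be actual geometry components, but the control flow — expression in, parameter change out — is the same.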
What's truly exciting is that ideas like these only begin to hint at what's possible. Neuro-headsets could one day help measure and map people's emotional responses to individual spaces and buildings, letting us study how layout affects mood — and yes, they may eventually even help us control and explore virtual 3-D design environments. Wishful thinking? Perhaps for now. But I can't think of a piece of technology more suited for the task.
Want more? https://vimeo.com/130588561