"Calming could be used to provide health rather than picking up power-ups or waiting on timers, potentially," Breen continues. "You could play out the fantasy by having to get out of the line of fire and relax somehow." On the other hand, feelings of excitement "could be used on the other end of the spectrum to enable the character, so if the fantasy of the character was to become empowered by becoming excited, then, again, you're feeling the character's role rather than simply performing mechanical actions in the game. And we also think that the emotional detections have the potential to be used to enable dynamic difficulty in a game. So, for instance, being able to make adjustments to suit the player's ability or their feelings, rather than predefined specifications for what difficulty should be."
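The mechanic Breen describes, calm restoring health and excitement empowering the character, can be sketched as a simple per-frame update rule. The affective scores, thresholds, and tuning constants below are hypothetical placeholders for illustration, not Emotiv's actual API or values:

```python
def update_player(player, calm, excitement, dt):
    """Map hypothetical affective scores (0.0-1.0) to game mechanics.

    calm       -- regenerates health when above a threshold
    excitement -- scales a damage bonus, "empowering" the character
    dt         -- frame time in seconds
    """
    CALM_THRESHOLD = 0.6     # illustrative tuning values
    REGEN_PER_SEC = 5.0      # health per second at full calm
    MAX_DAMAGE_BONUS = 0.5   # +50% damage at full excitement

    if calm > CALM_THRESHOLD:
        # Health returns only while the player actually relaxes,
        # instead of via pickups or timers.
        player["health"] = min(100.0,
                               player["health"] + REGEN_PER_SEC * calm * dt)

    # Excitement empowers the character rather than driving a button press.
    player["damage_multiplier"] = 1.0 + MAX_DAMAGE_BONUS * excitement
    return player

player = {"health": 50.0, "damage_multiplier": 1.0}
update_player(player, calm=0.8, excitement=0.4, dt=1.0)
```

The point of the sketch is that the affective signal replaces a conventional input: the player regains health by actually getting out of the line of fire and relaxing.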
Excitement may be where they've started, but by the time they bring their offerings to market, they hope to offer the ability for games to "get a sense of what the player needs. You need to adjust the game to suit the players. And we think that that's really important, because the range of player ability right now is so broad that the games have difficulty really tailoring [themselves] to suit that range of ability." Emotiv's suites offer game developers the ability to get "a sense of ... how the players are responding to the content. We think that's a really powerful tool to provide the best possible experience." He makes a comparison with film editing, where "a director has a very fine control of tempo by carefully editing the scene. They'll basically get the audience to an apex and before they deliver new material, they want to make sure that the scene provides time for the observer to relax and settle before they provide the next big bang." Games don't currently have this luxury, he says, and Emotiv "provides a method for game developers to sense whether the player is at the emotional level they want them to be [at] and then [they are] able to adjust the content accordingly."
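Breen's film-editing analogy amounts to a feedback loop: measure where the player is emotionally, compare it with where the designer wants them, and adjust pacing accordingly. A minimal sketch, assuming a hypothetical excitement signal in the range 0.0 to 1.0 (the signal source, the proportional-control scheme, and all tuning values are this sketch's own, not Emotiv's):

```python
class DynamicDifficulty:
    """Adjust pacing from a hypothetical excitement signal (0.0-1.0).

    After an emotional apex, ease off and give the player time to
    settle before delivering the next spike.
    """

    def __init__(self, target=0.5, gain=0.4):
        self.target = target    # excitement level the designer wants
        self.gain = gain        # how aggressively to correct
        self.spawn_rate = 1.0   # enemies per second: the knob we turn

    def step(self, excitement):
        # Proportional control: too excited -> back off,
        # too flat -> ramp back up.
        error = self.target - excitement
        self.spawn_rate = max(0.1, self.spawn_rate + self.gain * error)
        return self.spawn_rate

dd = DynamicDifficulty()
dd.step(0.9)   # player past the apex: spawn rate drops
dd.step(0.2)   # player has settled: spawn rate climbs again
```

Unlike a film, the loop runs live: instead of a director pre-cutting the scene, the game re-cuts it every frame against the player's measured response.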
Their third offering is the Cognitiv Suite, which allows the game to detect conscious decisions from the player's brainwaves and translate them into in-game actions. "You can visualize an action and see that action translated into movement on screen," Breen says. They've already got some basic movements in, such as lifting, pushing, pulling, dropping and rotating an object, "and those actions can be translated to anything onscreen, really, that you might use them for."
The science may be complex, but the interaction is simple: "You think about pushing the object, and the device allows detection for that event, and then you're able to perform that action [in game]." Currently, they're up to being able to detect three distinct actions simultaneously, which he says is "just the beginning. It's not clear where the limits are, but we intend to continue and expand both the number of things that you can do simultaneously, as well as the list of things you can do discretely."
Developers hoping to work with the system and utilize Emotiv's development kits "will find it extremely easy to work with," he says. "The detections themselves have been done, so the information that's passed through the API is fairly standard and easily integrated into menuing and functions that they would expect. It's not going to look very different than other types of interfaces, other than the fact that it's detecting these discrete things that they didn't otherwise have access to. From a functional standpoint, it'd be very simple." No mind reading required.