Technology Review: Connecting Your Brain to the Game
Emotiv Systems, an electronic-game company based in San Francisco, wants people to play with the power of the mind. Starting tomorrow, video-game makers will be able to buy Emotiv's electroencephalograph (EEG) caps and software development kits so that they can build games that use the electrical signals from a player's brain to control the on-screen action.
Emotiv's system has three different applications. One is designed to sense facial expressions such as winks, grimaces, and smiles and transfer them, in real time, to an avatar. This could be useful in virtual-world games such as Second Life, where it takes a fair amount of training to learn how to express emotions and actions through a keyboard. Another application detects emotional states such as excitement and calm; Emotiv's chief product officer, Randy Breen, says that these unconscious cues could be used to modify a game's soundtrack or to change the way that virtual characters interact with a player. The third detects a handful of conscious intentions that can be used to push, pull, rotate, and lift objects in a virtual world.
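To make the division of labor concrete, here is a minimal Python sketch of how a game loop might react to those three kinds of detections. Every name in it (the HeadsetFrame structure, the enums, the read_headset_frame call) is a hypothetical placeholder for illustration, not part of Emotiv's undisclosed software.

```python
# Hypothetical sketch: how a game might consume the three detection types
# the article describes. All names are placeholders, not Emotiv's SDK.

import random
from dataclasses import dataclass
from enum import Enum, auto


class Expression(Enum):
    NEUTRAL = auto()
    WINK = auto()
    GRIMACE = auto()
    SMILE = auto()


class MentalCommand(Enum):
    NONE = auto()
    PUSH = auto()
    PULL = auto()
    ROTATE = auto()
    LIFT = auto()


@dataclass
class HeadsetFrame:
    """One frame of decoded output from a (hypothetical) headset driver."""
    expression: Expression
    excitement: float      # 0.0 (calm) .. 1.0 (excited)
    command: MentalCommand


def read_headset_frame() -> HeadsetFrame:
    """Stand-in for a real driver call; returns random values here."""
    return HeadsetFrame(
        expression=random.choice(list(Expression)),
        excitement=random.random(),
        command=random.choice(list(MentalCommand)),
    )


def game_tick(frame: HeadsetFrame) -> None:
    # 1. Mirror facial expressions onto the player's avatar.
    if frame.expression is not Expression.NEUTRAL:
        print(f"avatar plays animation: {frame.expression.name.lower()}")

    # 2. Use the emotional reading to adjust the soundtrack.
    if frame.excitement > 0.7:
        print("soundtrack: switch to high-tempo track")
    elif frame.excitement < 0.3:
        print("soundtrack: switch to ambient track")

    # 3. Apply the conscious intention to an in-game object.
    if frame.command is not MentalCommand.NONE:
        print(f"apply {frame.command.name.lower()} force to selected object")


if __name__ == "__main__":
    for _ in range(5):  # a few simulated frames
        game_tick(read_headset_frame())
```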
The notion of using brain activity to interact with computers isn't new. A number of schools--such as the University of Minnesota; University of California, San Diego; and Purdue--have research labs devoted to decoding thoughts from the brain and manipulating cursors on a screen, which is especially useful for disabled people. In addition, companies have cropped up in the past couple of years claiming to offer an effective brain-computer interface for video games or for biofeedback purposes. For instance, S.M.A.R.T. BrainGames, a company based in San Marcos, CA, sells games and EEG caps designed to treat people with attention deficit/hyperactivity disorder.
To use Emotiv's system, a person puts on the EEG cap and adjusts it to her head, making sure that most of the sensors touch the scalp. The system automatically picks up blinks and emotional states. However, to move virtual objects, such as a box on a computer screen, a person must go through a series of training sessions in which she concentrates for about 10 seconds on mentally moving the box. Tan Le, one of Emotiv's cofounders, says that the software relies heavily on machine learning: the more a person practices a specific task, the more precisely the system follows the mental instructions.
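That description amounts to a supervised calibration loop: the player concentrates on one command at a time, the software collects labeled brain-signal samples, and a classifier improves as more labeled data accumulates. The Python sketch below illustrates that general idea with simulated features and a simple nearest-centroid classifier; the feature model and classifier are assumptions made for illustration, not Emotiv's actual, undisclosed method.

```python
# Illustrative calibration loop: collect labeled samples while the player
# concentrates on one command, then classify new samples by nearest centroid.
# The "EEG features" here are simulated; real systems would extract features
# from the raw signal.

import math
import random
from collections import defaultdict
from typing import Dict, List, Sequence

Vector = Sequence[float]


def simulated_eeg_features(command: str) -> Vector:
    """Stand-in for headset features: noise around a command-specific pattern."""
    base = {"push": (1.0, 0.2), "pull": (0.2, 1.0), "lift": (0.8, 0.8)}[command]
    return [b + random.gauss(0.0, 0.3) for b in base]


class NearestCentroidClassifier:
    """Averages the training vectors per command and picks the closest mean."""

    def __init__(self) -> None:
        self._samples: Dict[str, List[Vector]] = defaultdict(list)

    def add_training_sample(self, command: str, features: Vector) -> None:
        self._samples[command].append(features)

    def predict(self, features: Vector) -> str:
        centroids = {
            cmd: [sum(col) / len(col) for col in zip(*vectors)]
            for cmd, vectors in self._samples.items()
        }
        return min(centroids, key=lambda cmd: math.dist(features, centroids[cmd]))


if __name__ == "__main__":
    clf = NearestCentroidClassifier()
    # "Training session": roughly ten seconds of focused samples per command.
    for cmd in ("push", "pull", "lift"):
        for _ in range(10):
            clf.add_training_sample(cmd, simulated_eeg_features(cmd))

    # More samples per command tighten the centroids and reduce misclassification,
    # mirroring the claim that accuracy improves with continued practice.
    print(clf.predict(simulated_eeg_features("push")))
```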
Because Emotiv's technology is patent pending, the company will not disclose the details of its system. However, Le claims that the company's heads of research, optics expert Allan Snyder and former Bell Labs chip engineer Neil Weste, have made a number of scientific discoveries worthy of academic publication. So far, though, none has been published, and no game maker has publicly committed to using the technology.