Microsoft Taps Into Your Brain


People who think of Microsoft as a tech-age Big Brother probably won't be comforted by the software giant's effort to read your mind. Actually, the company's intentions are benign: it wants to create thought-driven inputs that bypass joysticks and keyboards. Desney Tan, a Microsoft researcher, thinks that ultimately the technology could make workplaces more productive, games more fun, and computers easier to use. Tan even envisions units able to match music to your mood, or block email notifications while you are concentrating.

Tan’s EEG cap has 32 electrodes that are affixed to the scalp with a conductive gel or paste. When neurons fire, they produce tiny electrical signals, which appear at the scalp as voltages of only a few microvolts. Electronics within the device record the voltage at each electrode, relative to the others, and send that data to a computer.
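Recording "the voltage at each electrode, relative to the others" is a referencing step. The article does not say which scheme Tan's cap uses, but one common choice in EEG work is the common average reference, sketched below with NumPy on synthetic data (the sampling rate and noise level are illustrative assumptions):

```python
import numpy as np

# Hypothetical sketch of EEG referencing, NOT necessarily Tan's scheme:
# subtract the mean across all electrodes from each channel at every
# time sample ("common average reference").
rng = np.random.default_rng(0)
n_electrodes, n_samples = 32, 256               # 32-electrode cap, 1 s at an assumed 256 Hz
raw_uv = rng.normal(0.0, 20.0, size=(n_electrodes, n_samples))  # microvolt-scale noise

# Each channel is now expressed relative to the instantaneous average
# of all channels, so the channels sum to zero at every time point.
referenced = raw_uv - raw_uv.mean(axis=0, keepdims=True)
```

Re-referencing removes voltage components common to every electrode (such as drift at the shared reference), which is why scalp EEG is always reported relative to some reference rather than as an absolute voltage.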

A subject using Tan’s system spends 10 to 20 minutes performing a series of tasks that require either high or low concentration, such as remembering letters or images for various amounts of time. EEG readings taken during the activity are fed to a computer, which manipulates them mathematically to generate thousands of derivations called “features.” A machine-learning algorithm then sifts through the features, identifying patterns that reliably indicate the subject’s concentration level when the data was collected. Tan and his collaborators at the University of Washington, Seattle, and Carnegie Mellon University have shown that a winnowed set of about 30 features can predict a subject’s concentration level with 99 percent accuracy. [Text and photo from MIT Technology Review.]
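The article does not describe the actual features or learning algorithm, but the pipeline it outlines (generate many candidate features, winnow them, then classify) can be sketched on synthetic data. Everything below is an assumption for illustration: the band-power features, the z-score winnowing, and the nearest-centroid classifier stand in for whatever Tan's team actually used.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_ch, n_samp, n_trials = 128, 4, 256, 40    # toy settings, not Tan's

def band_power(x, lo, hi):
    """Sum of FFT power over one frequency band (last axis = time)."""
    freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[..., (freqs >= lo) & (freqs < hi)].sum(axis=-1)

# Synthetic trials: "high concentration" trials get extra 20 Hz (beta) power.
t = np.arange(n_samp) / fs
X, y = [], []
for i in range(n_trials):
    label = i % 2
    trial = rng.normal(0.0, 1.0, size=(n_ch, n_samp))
    if label:
        trial = trial + 0.8 * np.sin(2 * np.pi * 20 * t)
    X.append(trial)
    y.append(label)
X, y = np.stack(X), np.array(y)

# Feature generation: band power per channel in standard EEG bands.
bands = [(4, 8), (8, 13), (13, 30)]             # theta, alpha, beta (Hz)
feats = np.stack([band_power(X, lo, hi) for lo, hi in bands], axis=-1)
feats = feats.reshape(n_trials, -1)             # (trials, channels x bands)

# Winnowing: keep the few features that best separate the two classes.
score = np.abs(feats[y == 1].mean(0) - feats[y == 0].mean(0)) / (feats.std(0) + 1e-9)
keep = np.argsort(score)[-5:]

# Nearest-centroid classifier on the selected features.
c0 = feats[y == 0][:, keep].mean(0)
c1 = feats[y == 1][:, keep].mean(0)
pred = (np.linalg.norm(feats[:, keep] - c1, axis=1)
        < np.linalg.norm(feats[:, keep] - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

On this easy synthetic task the winnowed features separate the two concentration conditions almost perfectly; real scalp EEG is far noisier, which is why the reported 99 percent figure required thousands of candidate features and careful selection.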

While not intended as a neuromarketing tool, effective EEG-based technology for computer and game control, if Microsoft develops it, might well lead to better hardware and new tools with a broad range of applications.

1 Comment
  1. Erwin van Lun says

    Hi Roger,
    Tan offers a prediction about the future. But I believe we will always have to wear sensors on our heads. Don’t you think it would be more interesting to read emotions from the face (facial coding) or body movements instead?
