CMU Computers Read Thoughts
Most scientists have dismissed the idea of reading minds using technology as pure science fiction, but Carnegie Mellon University researchers have moved a step closer to doing so. Not only have they been able to identify which of several images a subject is looking at from fMRI scans of the subject’s brain, but, most startlingly, the CMU researchers were able to take the data from the initial batch of subjects and repeat the identification feat with new subjects.
A dozen volunteers were shown line drawings of five different types of buildings and five different kinds of tools while their brain activity was monitored by functional magnetic resonance imaging — or fMRI — which measures changes in blood flow.
Computers then analyzed the fMRI images — which are taken 60 times a minute — checking 20,000 locations on each image for changes in activity. Patterns emerged, and the computers were able to “learn” which patterns of brain activity were associated with specific images and determine not only whether the person was looking at a picture of a building or a tool, but which tool.
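The article doesn’t name the algorithm the researchers used, but the process it describes, learning a characteristic activity pattern for each object and then matching a new scan against those patterns, can be illustrated with a toy nearest-centroid classifier. Everything below is a sketch under assumptions: the data is simulated, the category names and voxel count are hypothetical, and the real study used more sophisticated machine learning methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each fMRI scan is flattened to a vector of
# activity levels at many voxel locations (the study checked about
# 20,000; we use 200 here to keep the sketch fast).
N_VOXELS = 200
CATEGORIES = ["hammer", "pliers", "castle", "igloo"]  # illustrative labels

# Simulate scans: each category has an underlying activity pattern,
# observed through measurement noise.
prototypes = {c: rng.normal(size=N_VOXELS) for c in CATEGORIES}

def simulate_scan(category):
    """Return one noisy simulated scan for the given category."""
    return prototypes[category] + rng.normal(scale=0.5, size=N_VOXELS)

# "Learning" phase: average the training scans for each category to
# estimate its activity pattern (a nearest-centroid classifier, one
# simple stand-in for whatever method the study actually used).
learned = {c: np.mean([simulate_scan(c) for _ in range(20)], axis=0)
           for c in CATEGORIES}

def classify(scan):
    """Pick the category whose learned pattern best matches the scan."""
    return min(CATEGORIES, key=lambda c: np.linalg.norm(scan - learned[c]))

# Identification phase: classify a fresh, unseen scan.
print(classify(simulate_scan("pliers")))
```

The point of the toy is only the two-phase structure: patterns are learned from labeled scans, then a new scan is identified by similarity to those learned patterns.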
Even more significantly, the patterns established with the fMRI images were used to identify which of the objects was being viewed by a different set of people. This means that people generally think the same way, and a computer program could conceivably be written to read the minds of most people, [CMU neuroscientist Marcel] Just said. [From CMU computers seek where thoughts originate by Allison M. Heinrichs, Pittsburgh Tribune-Review.]
The fact that a general mapping process can be used to assess what previously untested subjects are thinking about could have important implications for neuromarketing (not to mention many other fields of research).
“This part of the study establishes, as never before, that there is a commonality in how different people’s brains represent the same object,” said [Tom M.] Mitchell, head of the Machine Learning Department in Carnegie Mellon’s School of Computer Science and a pioneer in applying machine learning methods to the study of brain activity. “There has always been a philosophical conundrum as to whether one person’s perception of the color blue is the same as another person’s. Now we see that there is a great deal of commonality across different people’s brain activity corresponding to familiar tools and dwellings.”
“This first step using computer algorithms to identify thoughts of individual objects from brain activity can open new scientific paths, and eventually roads and highways,” added Svetlana Shinkareva, an assistant professor of psychology at the University of South Carolina who is the study’s lead author. “We hope to progress to identifying the thoughts associated not just with pictures, but also with words, and eventually sentences.” [From Study Identifies Where Thoughts of Familiar Objects Occur Inside the Human Brain.]
We are clearly a long way off from any kind of generalized thought reading, but it seems possible that very limited types of discrimination could be much closer. Is the subject thinking about Coke or Pepsi? Beer or the bikini-clad model?
Carnegie Mellon and its Center for Cognitive Brain Imaging are clearly doing cutting-edge work. We’ve often reported on CMU’s neuroeconomics guru, George Loewenstein, who was among the first researchers to demonstrate prediction of buying intentions purely from brain imaging.