One of the challenges facing marketers is the difficulty of predicting real-world behavior from data captured in less-than-real circumstances. A horizontal, immobile subject surrounded by a claustrophobic, noisy fMRI tube might reasonably be expected to behave differently than one walking around a retail store, for example. While EEG caps and wireless transmitters have made it possible to capture data from mobile subjects, those subjects still have to be in the actual environment. In many cases that is easy to arrange, but what if the store hasn’t been built yet? Or what if one wants to test a large number of display configurations?
Given enough time and money, anyone can mock up a store and multiple displays so that test subjects can walk through the environment being evaluated, but in some cases creating that environment virtually might be faster and less costly. Last month, NeuroFocus debuted its N-Matrix 3D virtual reality system:
“We can give manufacturers, marketers, and retailers what they have always sought but could not have until now: an extremely lifelike testing environment that also allows endless opportunities for quickly altering every element from the macro to the micro,” said Dr. A. K. Pradeep, Chief Executive Officer of NeuroFocus. “N-Matrix 3D signals the end of trade-offs in market research, where until now compromises had to be made and accepted. We are bringing the real world right into the laboratory by adding the critical third dimension to virtual reality, and at the same time we’re also providing the flexibility to alter that reality with speed and scalability.”
Never bashful, NeuroFocus compares its simulations to “Avatar 3D”:
Based on neuroscientific research, the N-Matrix 3D virtual reality shopper insights system applies advanced, proprietary digital technology that is on a par with what “Avatar 3D,” the world record-breaking motion picture, incorporates. The patented software applies neuroscientific knowledge of how the brain perceives and analyzes products, package designs, and store settings to create three-dimensional virtual renderings that stimulate the subconscious to process them as it does reality.
This is interesting stuff, and I hope to see some applications tested that go beyond the peanut butter aisle. Now that virtual reality is past the chunky, jagged colored-block phase of early systems, far more sophisticated and realistic environments should be possible. Why not, for example, see how people react to very different car showroom configurations? Or any kind of built space? Maybe the age of the neuroarchitect isn’t that far off…
Eventually, I suppose, it might be possible to place totally credible “humans” in such an environment to interact with the test subject. Today’s gaming systems already do a somewhat credible job of creating virtual people (though complex facial expressions remain a challenge), so building people into a virtual retail simulation should be doable at some point.