Apparently what they found opens up possibilities not only for useful medical or health-oriented applications, but also for incorporating similar sensors into virtual reality (VR) or augmented reality (AR) headsets.
In their paper Mental State Recognition via Wearable EEG, the researchers said they tested 16 subjects, ten men and six women, whose reactions to both instructional videos and cute cat videos were measured by a commercial EEG headset, the InteraXon Muse.
You can pick up a Muse device for $300; it has notably fewer sensors than medical-grade EEG monitors and is marketed as a meditation aid.
UoM researchers Pouya Bashivan, Irina Rish and Steve Heisig set various machine-learning algorithms to work filtering out artefacts in the feedback from the 16 subjects as they watched the educational or amusing videos.
The Muse’s limited array of four prefrontal and occipital/temporal sensors yielded enough data on oscillatory responses to satisfactorily deduce whether a subject’s mental state reflected a ‘rational’ or an ‘emotional’ response to the videos.
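To make the idea concrete, here is a minimal sketch (not the authors’ code) of the kind of pipeline such studies use: band-power features are extracted from a handful of EEG channels and fed to an off-the-shelf classifier. The data below is entirely synthetic, and the sampling rate, band definitions and classifier choice are all assumptions for illustration; only the four-channel layout is taken from the Muse hardware.

```python
# Hypothetical EEG classification sketch with synthetic data.
# Assumptions: 256 Hz sampling, theta/alpha/beta bands, logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
fs = 256            # assumed sampling rate in Hz
n_channels = 4      # the Muse has four sensors (TP9, AF7, AF8, TP10)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Mean log power per (channel, band) for one epoch shaped (channels, samples)."""
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=1) + 1e-12))
    return np.concatenate(feats)  # 4 channels x 3 bands = 12 features

def make_epoch(emotional):
    """Synthetic 2-second epoch; 'emotional' trials get extra alpha-band power."""
    t = np.arange(fs * 2) / fs
    noise = rng.normal(0.0, 1.0, (n_channels, t.size))
    alpha = 2.0 * np.sin(2 * np.pi * 10 * t) if emotional else 0.0
    return noise + alpha

# Build a small labelled dataset and evaluate on a held-out portion.
X = np.array([band_powers(make_epoch(i % 2 == 1)) for i in range(80)])
y = np.array([i % 2 for i in range(80)])
clf = LogisticRegression(max_iter=1000).fit(X[:60], y[:60])
accuracy = clf.score(X[60:], y[60:])
print(f"held-out accuracy: {accuracy:.2f}")
```

On this toy data the two classes are easily separable, so accuracy is high; real EEG is far noisier, which is why the researchers needed heavier artefact filtering before any classifier could work.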
This opens the way for VR machines that are controlled by your mind. However, one downside that could hinder development is that the contacts can be affected by the user having too much hair; results were better when the contacts were made directly onto the surface of the scalp with gel.
Still, the technology could be there even at a consumer level, which bodes well for future development. If reading EEG signals becomes reliable enough, it could mean you won't need things like gloves or full suits, and you could control everything with your mind.