When we think, we often “speak” the words too – not just in our head, but also with tiny, almost imperceptible movements of the jaw. Now, a new headset named AlterEgo, developed at MIT Media Lab, can decipher those movements and “hear” those words. As things stand, it’s the closest thing to mind-reading we have – though, of course, it’s not mind-reading at all.
The headset includes electrodes that detect muscular signals along the wearer’s jawline and face. This information is then sent to a system that analyses the data, matching the signals with words from the English language. Think a question, “say” the words silently, and AlterEgo feeds you the answer aurally, without a word being spoken aloud.
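At its core, the matching step described above is a pattern-recognition problem: a feature vector derived from the muscle signals is compared against per-word signatures learned during calibration. The following is a minimal sketch of that idea using nearest-centroid matching; the vocabulary, feature vectors, and function names are invented for illustration, and the real AlterEgo system uses trained neural networks over multi-channel electrode data rather than anything this simple.

```python
import math

# Hypothetical per-word "signature" feature vectors (e.g. averaged
# electrode activations), assumed to be learned during a calibration
# phase. All values here are made up for illustration.
WORD_SIGNATURES = {
    "yes":  [0.9, 0.1, 0.3],
    "no":   [0.2, 0.8, 0.4],
    "time": [0.5, 0.5, 0.9],
}

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def decode(signal):
    """Return the vocabulary word whose signature lies closest to the
    incoming muscle-signal feature vector."""
    return min(WORD_SIGNATURES, key=lambda w: euclidean(signal, WORD_SIGNATURES[w]))

# A silently "spoken" word produces a signal near one signature.
print(decode([0.85, 0.15, 0.25]))  # closest to "yes"
```

In practice the decoded word (or sentence) would be handed to an assistant backend, and the response played back to the wearer rather than spoken aloud.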
The lead researcher on the project is Arnav Kapur, an American scientist of Indian origin. As he puts it, “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”