Okay, great: we can control our phones with speech recognition and our television sets with gesture recognition. But those technologies don’t work in all situations for all people. So I say, forget about those crude beginnings; what we really want is thought recognition.
As I found out during research for a recent NOVA episode, brain-computer interface (BCI) technology mostly hasn’t advanced very far just yet. For example, I tried to make a toy helicopter fly by thinking “up” as I wore a $300 commercial EEG headset. It barely worked.
Such “mind-reading” caps are quick to put on and noninvasive. They listen, through your scalp, for the incredibly weak remnants of electrical signals from your brain activity. But they’re lousy at figuring out where in your brain they originated. Furthermore, the headset software didn’t even know that I was thinking “up.” I could just as easily have thought “goofy” or “shoelace” or “pickle”—whatever I had thought about during the 15-second training session.
There are other noninvasive brain scanners—magnetoencephalography, positron-emission tomography, near-infrared spectroscopy and so on—but each has its own trade-offs.
Of course, you can implant sensors inside someone’s skull for the best readings of all; immobilized patients have successfully manipulated computer cursors and robotic arms using this approach. Still, when it comes to controlling everyday electronics, brain surgery might be a tough sell.
My most astonishing discovery came at Carnegie Mellon University, where Marcel Just and Tom Mitchell have been using real-time functional MRI scanners to do some actual mind reading—or thought recognition, as they more responsibly call it.
As I lay in the fMRI, I saw 20 images on the screen (of a strawberry, skyscraper, cave, and so on). I was instructed to imagine the qualities of each object. For each pair of objects, the computer would then try to figure out the sequence in which I had just seen them (whether strawberry had come before skyscraper, for example). It got them 100 percent right.
It turns out that, regardless of our native language or personal history, the same parts of our brain “light up” when we think of certain nouns. For “strawberry,” we might think “red,” “eat” or “hold in one hand.” The computer knows which brain areas are active for which qualities. The system can also guess what number you’re thinking of or which of 15 emotions you’re feeling.
Now, much needs to happen before we can change TV channels just by thinking “CBS.” In these early days, most BCI research is focused on how to help the disabled move or how to detect lies. And that work is raising plenty of questions about ethics, privacy and credibility. There will be other questions when thought recognition does come to gadgets. What happens if you get distracted while mind-dictating an e-mail? Who wins if you and your spouse think about two different channels? And who’s going to submit to an MRI just to adjust music volume?
Just, who runs the Center for Cognitive Brain Imaging at Carnegie Mellon, isn’t worried about that part. “Our machine is a monster,” he told me. But “someday some biophysicist is going to develop some far smaller device, probably operating on a different principle.” At this point, it is too early to see where BCI will land or even when it will take off. And that’s fine. After all, when somebody invented the wheel, he or she probably didn’t imagine Acela trains, roller coasters or skateboards right away.
Still, I’ve had my mind read, and I’m a believer. There’s something brewing, and millions of dollars are being poured into the effort to refine it. The next great interface breakthrough may tap into the electrical device you were born with.
This article was originally published with the title “The Remote Control in Your Mind.”