Could mind control be the future of gaming?
Imagine a future where you can play video games with your mind. Well, it’s happening right now.
Head to Perri Karyal’s YouTube channel to see her fighting bosses in the dark fantasy game Elden Ring, conjuring spells with her thoughts. What is this technology? And why aren’t we all using it?
It turns out that brain–computer interfaces, or BCIs, have been around for a long time. The term was coined by Jacques Vidal at the University of California, Los Angeles back in the early 1970s, and various experiments since have seen animals or humans controlling screens or moving robots with their thoughts. Scientists have even used brain implants to translate people’s thoughts into speech.
Marius Klug, who leads the ‘Young Investigator Group – Intuitive XR’ at Brandenburg University of Technology Cottbus–Senftenberg in Germany, says the field of BCI was wide-ranging when it began. “Over the decades, it went narrower and narrower, so it became basically just medical,” with a focus on things such as allowing people to move wheelchairs with their brain, he explains.
Even so, several groups have looked at applications of BCIs in video gaming over the past few decades. Emotiv, founded in Australia in 2003, released its EPOC neuroheadset in 2009, which president Tan Le said would allow “players to control gameplay with their thoughts, expressions and emotions.”
Now headquartered in San Francisco, the company remains a major manufacturer of brain-monitoring devices. Gabe Newell, president of gaming giant Valve, has talked up the possibilities of BCIs in video games for years, and co-founded a company called Starfish to create “the next generation of neural interfaces.” Marius’s group at the Brandenburg University of Technology is particularly interested in researching how BCIs can be combined with virtual reality (VR) and augmented reality (AR).
Playing Elden Ring with your mind
Getting BCIs to work with video games is tricky. Brain activity is usually measured with an electroencephalogram (EEG), a recording of electrical activity made using a set of electrodes attached to the scalp. But it’s difficult to tell precisely what’s going on inside the brain.
Marius likens an EEG to trying to follow a football match by placing 10 microphones outside the stadium. “You kind of know what's going on in there,” he says: If there’s a goal, cheers from fans will be louder at one end than at the other; if there’s a foul, the noise from the crowd might be different.
But you’re not getting the full picture. “Basically, that's the level of accuracy that you can expect if you use electroencephalography,” he says.
The other method of measuring brain activity is by drilling a hole through the skull and placing an electrode directly inside the brain, which Marius likens to putting a microphone on an individual football player. “You know exactly what's going on for that particular player, but you don't really know what's going on for the entire crowd,” he says. “So it's very narrow, very detailed.” (It should go without saying that most people won’t want to have their head drilled into to play a game.)
So how do you play video games with your mind, presuming you haven’t had an electrode implanted directly into your cerebral cortex? “It's not intuitive in the way that people assume,” says Perri Karyal, who became interested in BCIs during her master’s degree in psychology. “They think that it's reading my thoughts, and I think, ‘jump’, and then the game goes, ‘Oh, jump’.” Sadly, it’s not that simple.
Perri uses an Emotiv EPOC X headset, combined with voice commands and an eye-tracking device made by Tobii. Connecting all of this with games like Elden Ring required a bit of coding, which she had to teach herself from scratch. “The BCI is all very raw,” she says. “I took the API from it, and I made the software that compiles the mind control and the Tobii eye tracker all into one thing.” Then it was a case of calibrating the BCI so that specific brain patterns would map to actions in a game—for example, Perri imagining pushing a heavy boulder to launch an attack in Elden Ring.
Calibrating the system was arduous. “The BCI will literally just record my entire pattern of brain activity, so it will look at all of the data from all of the electrodes, and how much there is, and what kinds of brainwaves they are,” says Perri. “Then you do that hundreds of times, and eventually it’ll figure it out.”
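Conceptually, the calibration loop Perri describes resembles supervised classification: repeated recordings of each imagined action are reduced to feature vectors, and a model learns to tell them apart. The following is a minimal illustrative sketch, not Emotiv’s actual pipeline: the band-power feature, the nearest-centroid classifier, the 14-channel layout and the synthetic data are all assumptions made for demonstration.

```python
import numpy as np

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Reduce one (channels x samples) EEG window to per-channel signal power."""
    return np.mean(window ** 2, axis=1)

class NearestCentroidBCI:
    """Toy classifier: store the mean feature vector per imagined action."""
    def fit(self, windows, labels):
        feats = np.array([band_power_features(w) for w in windows])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[np.array(labels) == lab].mean(axis=0) for lab in self.labels_]
        )
        return self

    def predict(self, window):
        # Classify a new window by its closest stored centroid.
        f = band_power_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.labels_[int(np.argmin(dists))]

# Synthetic "calibration": hundreds of examples of two mental states
# with different power signatures, standing in for real EEG recordings.
rng = np.random.default_rng(0)
push = [rng.normal(0.0, 1.0, (14, 128)) for _ in range(50)]   # e.g. imagining pushing a boulder
rest = [rng.normal(0.0, 0.3, (14, 128)) for _ in range(50)]   # resting baseline
clf = NearestCentroidBCI().fit(push + rest, ["attack"] * 50 + ["idle"] * 50)
print(clf.predict(rng.normal(0.0, 1.0, (14, 128))))  # likely "attack"
```

Real systems use far richer features (frequency bands per electrode, as Perri describes) and need many more repetitions precisely because real brain signals are noisier and less separable than this synthetic data.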
Even after the system has been taught to recognise a specific brain pattern, it’s another matter to persuade your brain to repeat that exact pattern every time you want to, say, attack an enemy. The first time Perri tried, it took perhaps half an hour to get the game to do anything, and it was initially hit-or-miss whether her commands would register.
After countless hours of practice, Perri still has to remain still and concentrate hard to perform an in-game action. Mapping more controls to different thoughts and switching between them is harder still. “Having four controls, that's taken me two years,” says Perri. “The thing that takes the longest is figuring out what your visualisations are going to be and making sure that they're different and distinct and the same every single time.”
Perri was nonetheless able to perform some impressive feats using her mind-control system, like beating the notoriously difficult boss Malenia in Elden Ring. But her setup, ingenious as it is, remains a long way from playing an entire game with thoughts alone. “Currently I can't press more than one button at once with mind control,” she says. “If there's a sprint jump or something, I literally can't do that, so I have to use a voice command.”
The system is also binary, either on or off, so there’s no way to replicate smooth analogue controls. Plus there’s a hefty delay of about a second between Perri’s thought prompt and an onscreen action, since the system must scan the last 500 milliseconds of recorded brain activity, then decide whether it fits a pattern it has been trained to recognise.
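The latency Perri describes follows directly from how windowed classification works: an intention can only be recognised after a full window of brain activity has been recorded and analysed. A minimal sketch of that buffering logic (the sampling rate and the threshold classifier are illustrative assumptions; only the 500 ms window comes from the article):

```python
from collections import deque

SAMPLE_RATE_HZ = 256          # illustrative EEG sampling rate
WINDOW_MS = 500               # the 500 ms analysis window described above
WINDOW_SAMPLES = SAMPLE_RATE_HZ * WINDOW_MS // 1000

class SlidingWindowDecoder:
    """Buffer the most recent 500 ms of samples; classify only once full.

    The latency is inherent: a command registers only after a full
    window of post-intention samples has arrived and been analysed.
    """
    def __init__(self, classify):
        self.buffer = deque(maxlen=WINDOW_SAMPLES)
        self.classify = classify  # any function: list of samples -> action or None

    def push(self, sample):
        self.buffer.append(sample)
        if len(self.buffer) < WINDOW_SAMPLES:
            return None           # not enough history yet
        return self.classify(list(self.buffer))

# Toy classifier: fire "attack" when mean amplitude crosses a threshold.
decoder = SlidingWindowDecoder(lambda w: "attack" if sum(w) / len(w) > 0.5 else None)
for _ in range(WINDOW_SAMPLES):
    action = decoder.push(1.0)    # a strong, sustained signal
print(action)                     # "attack", but only after a full window
```

Add the time needed to actually run the classifier and relay the command to the game, and the roughly one-second round trip Perri experiences is unsurprising.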
In short, using this rig for any game that requires fast reflexes or rapid switching between multiple controls is enormously difficult—even impossible.
What games work best with BCI?
Perhaps we’re coming at BCIs and gaming from the wrong angle. Perhaps we shouldn’t be trying to control games with our minds at all.
Marius certainly thinks so. “Our bodies are far superior to anything we can do with brain–computer interfaces at this point,” he says. “If you can control something using your muscles, you should.” Sluggish and difficult-to-master mind-control inputs remain inferior to fingers and thumbs, and Marius thinks that within his lifetime, mind-control systems won’t become good enough to rival traditional controllers for avatar-based games like Elden Ring.
“But we can do some things that you cannot do with your body,” adds Marius, “like measuring mental states.” This is something Marius and his Intuitive XR group are looking into: how a player’s overall mental state (surprised, scared, focused, and so on) can feed into a game to change it in some way.
In other words, the game reacts to how you’re feeling, an easier variable to measure via EEG than specific thoughts. This kind of application has come to be known as a ‘neuroadaptive system,’ a term that emerged from the relatively recent field of passive BCIs (as opposed to active BCIs, like the system Perri uses).
Marius developed a compelling demonstration of passive BCIs’ potential in the form of the Real Virtual Magic mod for the VR version of the popular role-playing game Skyrim. To use the mod, players wear a Muse meditation device (a relatively simple EEG headband) under their VR headset, which tracks concentration levels. The more focused the player is, the more powerful their magic spells are in-game: they can even do double-damage at peak concentration.
Conversely, if the player panics, spell power falls away. “As soon as you enter a dungeon or get attacked by some bandits, then all of a sudden you realise that you now have to take a deep breath and focus, because otherwise you won't be able to cast magic spells very well,” says Marius, adding that having to concentrate to use magic powers adds to the immersion enormously, making the player feel like a “real” mage. “If you are able to maintain your focus to a higher level, you are really powerful.”
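The core of such a neuroadaptive mechanic is a simple mapping from a continuously measured mental state to a game variable. A hedged sketch of how the Real Virtual Magic behaviour could be modelled, assuming a normalised focus score in [0, 1] from the headband (the actual mod’s formula isn’t public here; the linear interpolation is an illustrative assumption that matches the described double damage at peak concentration):

```python
def spell_damage(base_damage: float, focus: float) -> float:
    """Scale spell damage by a focus score in [0, 1].

    Linear mapping from 1x damage at zero focus up to 2x at full
    focus, matching the described double-damage-at-peak behaviour.
    """
    focus = max(0.0, min(1.0, focus))   # clamp noisy EEG-derived input
    return base_damage * (1.0 + focus)

print(spell_damage(40.0, 0.0))   # 40.0 -- panicked player, baseline damage
print(spell_damage(40.0, 1.0))   # 80.0 -- peak concentration, double damage
```

Because the input is a slow-moving aggregate state rather than a discrete command, this kind of passive BCI sidesteps the latency and reliability problems that make active control so difficult.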
The potential for neuroadaptive technologies is huge. “I like the idea of generative gaming, where you're in a horror game, for example, and you get super stressed out, so the game gets harder and more enemies spawn,” says Perri.
Conor Russomanno, co-founder and CEO of the company OpenBCI, adds that this kind of system could work brilliantly with games like Resident Evil. “That type of experience could be highly tailored,” he says, with the game learning what scares individual players most by monitoring EEG output, then adapting game content accordingly. He imagines all sorts of applications for the technology, like a narrative-led game where, rather than the player making conscious, binary choices, “the game is responsive to your emotions, to your intentions, to your actions, to the characters that you are interested in.”
Conor first began looking into neuroadaptive gaming whilst at university. He did his bachelor’s degree in engineering before switching to design and technology for his master’s, when he came across EEGs and BCIs. “I saw my brain waves dancing around on a screen for the first time, [and] it felt like I had the power of my own mind in my own hands,” he recalls. “I was like, ‘Wow, I didn't realise that you could extract zeroes and ones and the squiggly lines of your consciousness so easily into a computer’. I've never really looked back. I saw a bar graph going up and down, and I felt some connection to it.”
For his thesis, Conor developed a neuroadaptive game. “It's the story of a humanoid robot that's experiencing symptoms of consciousness,” he says, adding that the plot can take multiple paths according to the player’s brain activity. “Either you're very connected to the story and you're paying a lot of attention, and the robot becomes more sentient and discovers a sense of agency and free will, or you're kind of checked out, not paying attention. You're not really connected to the story.”
The possibilities of such neuroadaptive games seem endless. But the barrier to their widespread adoption remains hardware: something that Conor is trying to change.
OpenBCI and the Galea headset
After graduating, Conor co-founded OpenBCI, which started with a subcontract from DARPA to build EEG devices for non-traditional users. That was followed by a Kickstarter campaign in 2013 to fund an open-source BCI. “At that time, there were only a handful of EEG devices that you could procure from the internet,” says Conor. “Now there are hundreds, if not thousands of various brain-sensing sleep bands or meditation bands or lucid dream inducers.”
After a brief stint at an AR start-up in 2017, Conor returned to OpenBCI with the idea of creating a VR/AR headset with integrated BCI technology. “I was like, ‘This is our chance, guys, this is how we get EEG and other biosensors to become mainstream, as we ride the wake of the AR market’.”
Initially, OpenBCI partnered with Valve, using its Index VR headset. “Then we actually decided to move away from Valve,” Conor says, adding that the two companies are still on good terms. “We needed image-based eye tracking integrated into the headset, and their headsets didn't have it at the time.” Instead, OpenBCI partnered with Varjo, a Finnish firm that manufactures high-end VR, AR and mixed-reality headsets.
The result is the Galea biosensing headset, currently in beta. The base model is around $25,000, while the higher-end one is $40,000. “The objective is to bring the price down,” says Conor. “But we have spent six years in R&D, so we've got some ground to make up in terms of resources invested in the project. But I think the price will come down eventually for devices like Galea, and we'll also get much better at building them.” Currently, each Galea headset is hand-built in Brooklyn at a rate of about one per day.
“It's really exciting,” says Perri of Galea. “It's hard to say without seeing it and without trying it, but it kind of does everything that I would want an EEG to do. Something I really want to do is combine mind control with VR.”
Marius has mixed opinions after trying an early Galea model at a conference in 2023. “I really like the idea, and I used to be a strong proponent of building EEG electrodes into a VR headset, because it feels like such a natural addition,” he says. “You already have something on your head, it's so obvious. But I have changed my mind. I think these days I am more a proponent of a modular system.”
His argument is that it would make more sense for the BCI component to be an add-on that could be swapped between VR headsets as you upgrade systems, rather than integrated. He also points out that electrodes can be uncomfortable “because they have to poke through the hair, and this just hurts after a while,” so it would make sense to be able to remove them.
But weight is the biggest issue. “The Galea version that I tried was rather heavy, much heavier than what I as a gamer would be happy with,” he says, noting that this weight gave the headset momentum when he turned his head, causing it to wobble and consequently break the connection between the EEG electrodes and the scalp.
The challenges of BCI
There are many such problems to be overcome before BCI gaming devices take off. Like many EEG devices, the Emotiv EPOC X uses wet sensors, which need to be coated in gel or another conductive substance to carry electrical signals from the scalp. Perri has to soak her hair in salt water before using the device, which she says is “probably another thing that stops it from becoming mainstream.” The alternative is dry sensors, like those used in the Galea … but as Marius found, these can be hard and uncomfortable.
Conor says one of the biggest challenges is noise. “Environmental noise, movement noise, noise from other devices that are nearby: you have to do a lot of isolation with the electronics,” he says. The devices must be carefully shielded from things like wi-fi signals to avoid interference. “And so the trickiest thing with EEG is not the cost of the equipment, per se, it's isolating it so you aren't picking up garbage.”
Then there’s ergonomics. “Everybody's head is shaped so differently, and it's already a problem for AR/VR headset companies that just need to strap the thing on.” Adding around eight electrodes that need constant contact with the scalp compounds this problem enormously.
But the differences between brains are potentially a much bigger problem than differences in head shape. “Everyone's brain is so different, and brain activity is so different,” says Perri, who thinks there will never be a “plug-and-play” BCI; some level of calibration will always be required to tune the system to individual thinking.
“We can only do so well by creating an average human representation of any mental metric, like stress or arousal or fatigue or cognitive workload or flow,” says Conor. “It's only going to work so well because everybody is neurodiverse from each other.”
Machine learning and artificial intelligence could help with improving BCI calibrations and classifying brain signals. But the most worrying issue is that BCIs don’t seem to work at all for some people. “Around 20 percent of the population are so-called ‘BCI illiterate’,” says Marius. “I don’t like the term, but it’s a term that’s used in the research. It’s basically people who cannot produce the signal: it just doesn’t work.”
Marius continues, "It could be that their brain is folded in a way that the [brain signal] classifier doesn't pick up. It could be that they can't imagine movements very well, because visual imagery is also very different between people. So some people can imagine colours and details very well, and some people cannot, and that could be the same with your motor imagery. These idiosyncrasies of brains normally don't make any difference in your day-to-day life, but for BCI, they could be huge.”
Equally, there could be people who are naturals, able to do things with BCIs that are “unthinkable for other people,” suggests Marius: perhaps the neuroscience equivalent of innately talented mages.
Will BCI games ever go mainstream?
Given the challenges, will we ever see BCIs gain widespread adoption in gaming?
Perri can’t see this happening in the near future, but she hopes it will happen eventually. “Or maybe there'll be more and more enthusiasts that get involved, more competitions between people, which is already starting to happen,” she says.
“Outside of gaming, EEG is starting to find some really functional use cases. I know Emotiv and lots of other companies have these earbuds where they track your focus, and people are already using those in offices.”
Marius sees earbud-based EEG electrodes as potentially a better way of combining BCIs with VR than integrating a BCI into a headset. “You could just clip it on or off,” he says. But getting the cost down is a more difficult problem. As with the early days of VR—and as with most technological advances—the initial user base is small, so devices have to be expensive as a result of the small addressable market. Yet the complexity of a BCI device is actually much lower than a smartphone or even noise-cancelling headphones, Marius says, so potentially they could be manufactured cheaply if there was a big enough consumer base.
Perhaps a low-cost pair of BCI earbuds could be linked to a smartphone game with wide appeal, like a meditation game where players focus to keep a guru hovering in the air. If BCIs are going to break into the mainstream, it’s likely to be through a simple device like this, linked to something everyone already owns.
Meanwhile, at the other end of the scale, Conor imagines more immersive applications for BCIs in systems that track as many variables as possible, like eye movement, skin moisture, heart rate, body movement and respiration. “I have this thesis that in order to have a true brain–computer interface, it needs to be bi-directional, meaning you need the ability to both read from the human mind and write to the human mind,” he says. “You need as many sensors as possible to measure the internal environment of the user and the external environment of the world that the user is in.”
Perhaps one day we’ll be able to disappear into those immersive worlds entirely.
15 Nov 2024
Lewis Packwood
Illustration by Thomas Travert.
L’Atelier is a data intelligence company based in Paris.