Valve, the company behind the PC game distribution platform Steam, is developing a brain-computer interface (BCI). The idea is to use this technology for future Virtual Reality (VR) and Augmented Reality (AR) games and apps. I have always felt that controls are the weakest link in futuristic VR games. Considering how immersive these games are, even high-tech motion controllers fail to deliver a quick and intuitive control scheme. A gamer knows what needs to be done in a given situation, but to react with a conventional controller, they first have to recall a specific motion or button combination. This leaves many gamers fumbling, which eats up precious in-game time. Valve is working to change that by turning your neural signals into computer input. Done right, it could let you interact with games faster and more naturally than ever.
I can’t think of a brand better suited than Valve for such a development. It has a rich gaming heritage, and its distribution platform has consistently maintained around 90 million monthly users on PC. Founded by ex-Microsoft employees over two decades ago, Valve released its first game, Half-Life, which to this day is considered one of the best PC games ever made. Even the game’s mod, dubbed Counter-Strike (CS), became a cultural phenomenon in gaming cafes across the world, and CS remains one of the most played online games today. The same company is now not only working to reinvent the way we play video games but also opening up a world of endless possibilities. Recently, Microsoft was lauded for developing the Xbox Adaptive Controller for gamers with limited mobility. Valve could go a step further and level the playing field with a BCI system that turns your brain into a controller. Another possible use is enabling paralysis patients to control exosuits directly with their brains.
At this point, the engineers at Valve are experimenting with Ultracortex, an open-source headset. Sure, it looks like a torture device, but don’t worry: brands such as Neurable have already figured out how to integrate similar hardware into sleek headsets without compromising on aesthetics. Ultracortex tracks electroencephalography (EEG) data in real time; in other words, you get brain activity readings. Using signal-processing algorithms and machine learning, Valve can turn those brain signals into user input. Game developers could then use this data to alter gameplay according to the gamer’s brain response. Instead of making the player choose a difficulty level, the game could adjust it on the fly based on readings from the EEG-fitted BCI headgear. If a player seems bored or frustrated, the game can throw in more challenges or dial back the difficulty. Moreover, multiplayer matchmaking could become more accurate with this data; for example, noobs would be less likely to be slaughtered by pros in games such as PUBG and Apex Legends.
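To make the dynamic-difficulty idea concrete, here is a minimal sketch of how a game loop might react to an inferred player state. Everything here is hypothetical: the function names, the two-value (arousal/valence) input, and the thresholds are illustrative assumptions, not Valve’s actual pipeline, and a real BCI system would run a trained classifier over multi-channel EEG features rather than simple threshold checks.

```python
def classify_state(arousal: float, valence: float) -> str:
    """Toy stand-in for an EEG classifier (hypothetical thresholds).

    `arousal` and `valence` are assumed to be normalized summaries of
    EEG activity: low arousal -> bored, high arousal with negative
    valence -> frustrated, anything else -> comfortably engaged.
    """
    if arousal < 0.3:
        return "bored"
    if arousal > 0.7 and valence < 0.0:
        return "frustrated"
    return "engaged"


def adjust_difficulty(difficulty: float, state: str, step: float = 0.1) -> float:
    """Nudge a difficulty value in [0, 1] based on the inferred state.

    Bored players get more challenge; frustrated players get a break;
    engaged players are left alone.
    """
    if state == "bored":
        difficulty += step      # throw in more challenge
    elif state == "frustrated":
        difficulty -= step      # dial back the difficulty
    return max(0.0, min(1.0, difficulty))


# Example: a bored player (low arousal) gets a harder game next frame.
new_difficulty = adjust_difficulty(0.5, classify_state(0.2, 0.5))
```

The key design point is that the game never asks the player anything; it keeps re-sampling the classifier each frame and makes only small, bounded adjustments, so noisy or unreliable readings (the problem Ambinder flags below) cannot swing the difficulty wildly.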
If you think this idea is far-fetched, Valve’s experimental psychologist Mike Ambinder states that existing tech is already capable of inferring a player’s state of mind based on the regions where neurons are firing and causing voltage fluctuations. The only problem at this point is reliability. I’m sure Valve will manage to iron out these issues soon enough. I can’t wait to unleash my superior brain in gaming, which is otherwise limited by my lousy finger dexterity. Unfortunately, Valve has shed no light on a timeline for a consumer version of its mind-control gaming headset. But one thing is for sure: the system will be ready to be showcased before Half-Life 3 is confirmed.