Facebook Has Developed a Wrist-Based AR Controller With Neuron-Based Gestures For Ambient Computing

The wrist-based system developed by Facebook also lets users touch and feel objects in virtual space, a feature the company calls "the force".

Facebook Reality Labs (FRL), once known as Oculus Research, has been working on ways to build user interfaces for the virtual reality (VR) and augmented reality (AR) platforms of the future. Six years of research has brought Facebook to a wrist-based input system that can read the signals your brain sends to your fingers through motor neurons. The company said the system is still a prototype and is a few years away from reaching the mainstream, but that it is a “nearer term” development than other systems Facebook has been working on.

“What we’re trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system, specifically the nerves outside the brain that animate your hand and finger muscles,” said Thomas Reardon, FRL Director of Neuromotor Interfaces.

How does Facebook’s wrist-based input system work?

The new system uses a wristwatch-like device that is meant to be worn all the time and can recognize a finger point, a swipe and many other gestures. For instance, Facebook demoed a setup in which a touch typist simply starts making typing motions on a tabletop and the system recognizes what they are typing. The company said keyboards like this would use artificial intelligence (AI) to learn how the user types, typos and all, and adapt to the user instead of making the user adapt to it.


Image: EMG typing demo
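
Facebook has not published how that adaptive typing model works, but the basic idea of a decoder that learns a user's own typing patterns can be sketched in a few lines. Everything below, from the channel count and window length to the nearest-centroid classifier, is an illustrative assumption rather than a description of FRL's actual system.

```python
# Hypothetical sketch of per-user keystroke decoding from wrist EMG windows.
# Channel count, window length and the classifier are illustrative assumptions.
import numpy as np

N_CHANNELS = 16          # assumed number of EMG electrodes on the wristband
WINDOW_SAMPLES = 200     # assumed samples per decoding window


def features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel for one EMG window."""
    return np.sqrt(np.mean(window ** 2, axis=1))


class AdaptiveKeyDecoder:
    """Nearest-centroid decoder whose per-key templates slowly adapt to the user."""

    def __init__(self, keys, learning_rate=0.05):
        self.keys = list(keys)
        self.centroids = {k: None for k in self.keys}
        self.lr = learning_rate

    def calibrate(self, key, window):
        """Update the template for `key` with a new labeled window."""
        f = features(window)
        c = self.centroids[key]
        self.centroids[key] = f if c is None else (1 - self.lr) * c + self.lr * f

    def predict(self, window):
        """Return the key whose template is closest (assumes all keys calibrated)."""
        f = features(window)
        return min(self.keys, key=lambda k: np.linalg.norm(f - self.centroids[k]))


# Toy usage with random data standing in for real EMG recordings.
rng = np.random.default_rng(0)
decoder = AdaptiveKeyDecoder(keys="abc")
for key in "abc":
    for _ in range(5):
        decoder.calibrate(key, rng.normal(size=(N_CHANNELS, WINDOW_SAMPLES)))
print(decoder.predict(rng.normal(size=(N_CHANNELS, WINDOW_SAMPLES))))
```

A production decoder would be far more sophisticated, but the adaptation loop, updating per-user templates as new labeled samples arrive, is the part that lets the keyboard learn the typist rather than the other way around.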

In fact, that is the central goal of Facebook’s work with these systems: to make human-computer interaction (HCI) more natural, with the user leading the way instead of having to adapt to a new device or platform. The company also has AI tech that is contextually aware and can make recommendations based on what a user is doing, how they behave and where they are. For instance, a kitchen timer could be set automatically when a step in a recipe says a task has to be done for a particular amount of time.
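
As a rough illustration of that kitchen-timer example, the sketch below pulls a duration out of a recipe step and converts it into a timer length. The parsing rule, function name and recipe text are assumptions made for demonstration; Facebook has not described how its assistant actually does this.

```python
# Illustrative only: inferring a timer from a recipe step.
import re


def timer_seconds(step: str):
    """Return a timer duration in seconds if the step mentions one, else None."""
    match = re.search(r"(\d+)\s*(second|minute|hour)s?", step, re.IGNORECASE)
    if not match:
        return None
    value, unit = int(match.group(1)), match.group(2).lower()
    return value * {"second": 1, "minute": 60, "hour": 3600}[unit]


step = "Simmer the sauce for 20 minutes, stirring occasionally."
print(timer_seconds(step))  # 1200 -> the assistant could start a 20-minute timer
```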

And though this all sounds really new, and in a way it is, it actually builds on existing tech. Facebook is using electromyography (EMG), which uses sensors to read the electrical signals the brain sends through motor neurons to the muscles.
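
Facebook has not detailed its signal-processing pipeline, but a conventional surface-EMG treatment, band-pass filtering the raw signal and taking a smoothed envelope as a proxy for muscle activation, gives a sense of what “reading” those signals involves. The sampling rate, filter band and threshold below are assumed values, not FRL’s.

```python
# Minimal sketch of the electromyography idea: band-pass filter a raw wrist
# signal and use its smoothed envelope as a measure of muscle activation.
# Sampling rate, band edges, threshold and the synthetic signal are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000.0  # assumed sampling rate in Hz


def emg_envelope(raw: np.ndarray) -> np.ndarray:
    # Keep the typical surface-EMG band (roughly 20-450 Hz).
    b, a = butter(4, [20 / (FS / 2), 450 / (FS / 2)], btype="bandpass")
    filtered = filtfilt(b, a, raw)
    # Rectify, then low-pass to get a slowly varying activation envelope.
    b_lp, a_lp = butter(4, 5 / (FS / 2), btype="lowpass")
    return filtfilt(b_lp, a_lp, np.abs(filtered))


# Synthetic one-second recording: a burst of "muscle activity" plus noise.
t = np.arange(0, 1.0, 1 / FS)
burst = (t > 0.4) & (t < 0.6)
raw = 0.05 * np.random.randn(t.size) + burst * np.sin(2 * np.pi * 120 * t)
active = emg_envelope(raw) > 0.1  # crude "finger moved" detection
print(f"Activity detected in {active.mean():.0%} of samples")
```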

Image: FRL AR research

“We believe our wristband wearables may offer a path to ultra-low-friction, always-available input for AR glasses, but they’re not a complete solution on their own — just as the mouse is one piece of the graphical user interface,” said Hrvoje Benko, Director of Research Science at FRL. “They need to be assisted with intent prediction and user modeling that adapts to you and your particular context in real time.”