Apart from working on a Clubhouse clone and an Instagram for kids, Facebook is also developing AR technologies at its Facebook Reality Labs facility. We have already seen reports of Facebook partnering with Ray-Ban on a pair of AR glasses. Now, researchers at Facebook Reality Labs (FRL) have come up with a new wrist-based AI technology that lets users interact with AR computer interfaces in the real world.
The company has been working on Human-Computer Interaction (HCI) technologies for quite some time now. Facebook even recently shared its 10-year vision of a contextually-aware, AI-powered interface for its upcoming AR glasses. However, Facebook says that the technology still needs a great deal of development before it can become a market standard.
As a result, the company recently announced a scaled-down version of the technology: a wrist-based device for interacting with AR interfaces in the real world. The wearable wristband, developed by FRL, relies on electromyography (EMG) to translate electrical motor nerve signals into digital commands.
The device uses various onboard sensors to detect the motor nerve signals traveling from the brain to the user's hand. Since it sits at the wrist, it can decode a motor nerve signal even before it reaches the fingers, allowing it to register finger movements as small as a millimeter. Moreover, Facebook says that in the future, it may even be possible for the device to anticipate a movement and take the corresponding action.
The company says that the technology will initially support simple pinch and tap gestures of the fingers. However, with further development, the EMG technology could eventually support richer controls in AR interfaces. As per Facebook, users will eventually be able to actually touch and move virtual objects and UIs, much like we see Iron Man doing in the Avengers movies.
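Facebook has not published the details of FRL's decoding pipeline, but the idea of turning wrist-level EMG activity into pinch and tap commands can be sketched roughly. In this purely illustrative example, the channel names, thresholds, and the RMS-amplitude heuristic are all assumptions, not the actual FRL method:

```python
# Hypothetical sketch of EMG-to-gesture decoding; FRL's real pipeline
# (likely a learned model over many electrode channels) is not public.
import math


def rms(samples):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))


def classify_gesture(channels, pinch_threshold=0.6, tap_threshold=0.3):
    """Map per-channel EMG activity to a simple gesture label.

    channels: dict mapping a (hypothetical) channel name, e.g. "thumb"
    or "index", to a list of signal samples for one time window.
    A strong sustained reading on the thumb/index channels is read as
    a 'pinch'; a weaker burst as a 'tap'; otherwise no gesture.
    """
    activity = {name: rms(samples) for name, samples in channels.items()}
    thumb_index = (activity.get("thumb", 0.0) + activity.get("index", 0.0)) / 2
    if thumb_index >= pinch_threshold:
        return "pinch"
    if thumb_index >= tap_threshold:
        return "tap"
    return None


# Example: strong activity on both channels reads as a pinch.
print(classify_gesture({"thumb": [0.7, 0.8], "index": [0.6, 0.7]}))  # pinch
```

The key point the sketch illustrates is that the signal is classified at the wrist, from nerve activity, rather than by tracking the fingers themselves.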
You can check out a demo video showcasing the control movements right below.
Now, apart from this ultra-low-friction finger-based input, the company is also working on a feedback system for the new HCI technology. This will include advanced haptic responses that let users "feel" virtual UIs and objects. Facebook showcased an example in which a user picks up a virtual bow and arrow and nocks the arrow onto the string. In doing so, the user could feel the sensation and tension of drawing the arrow back through the system's advanced vibration-based feedback.
So, the possibilities for this technology are vast, and Facebook intends to develop it further. The company also stated that it will "address some groundbreaking work in soft robotics to build comfortable, all-day wearable devices and give an update on [its] haptic glove research" later this year.