Aria Gen 2 Adds Built-In Head, Hand, Eye Tracking & Audio To Meta's Research Glasses


Aria Gen 2 adds head tracking, eye tracking, hand tracking, and audio to Meta’s research glasses, and it’s already being tested to help vision impaired people navigate indoors.

Meta’s Aria glasses are not products, nor are they prototypes of future products. They’re displayless glasses designed to support AR, AI, and robotics research by providing rich first-person visual data from onboard cameras.

The original Project Aria glasses from 2020 featured a somewhat similar camera suite: a color camera, two wide-angle outward-facing greyscale cameras, and two inward-facing eye tracking cameras, as well as a microphone array. But none of this sensor data was processed on-device. Instead, the glasses simply recorded it all to onboard flash memory for later processing on PCs and servers by researchers.

As well as upgrading these cameras, Aria Gen 2 adds a highly efficient custom chipset, developed by Meta, that processes the camera feeds on-device while drawing very little power, enabling onboard positional tracking, eye tracking, and hand tracking, as well as speech recognition.

Additionally, Aria Gen 2 adds audio output via open-ear speakers, a heart rate sensor, and an additional contact microphone that lets the system distinguish between the wearer’s voice and external sound.






The addition of this chipset, and audio output, enables Aria Gen 2 to run simple voice-driven applications on-device.

Meta says the glasses are capable of 6 to 8 hours of “continuous use” and weigh 75 grams, only 25 grams more than Ray-Ban Meta glasses, which lack any kind of tracking.

One such application comes from Meta’s first outside partner for Aria Gen 2, Envision, a company that sells modified Google Glass Enterprise Edition 2 glasses with custom software to help vision impaired people understand the world around them by reading out text and describing what the camera sees on demand. It also offers a free phone app that does the same.

Leveraging Aria Gen 2’s built-in world-scale positional tracking and precise spatial audio, Envision is experimenting with helping vision impaired people navigate indoor environments by guiding them with a spatial “beacon” sound.
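Envision hasn’t published how its beacon is implemented, but the basic idea is straightforward: given the wearer’s position and heading from the glasses’ positional tracking, render the beacon sound so it appears to come from the target’s direction. As a loose illustration only (all names and the constant-power panning scheme here are assumptions, not Envision’s or Meta’s actual code), a minimal stereo version might look like this:

```python
import math

def beacon_stereo_gains(head_pos, head_yaw, beacon_pos):
    """Return (left, right) speaker gains for a navigation beacon.

    head_pos and beacon_pos are (x, y) floor-plan coordinates in meters;
    head_yaw is the wearer's heading in radians (0 = +x axis, counter-
    clockwise positive). Uses a simple constant-power pan: a beacon to
    the wearer's left plays louder in the left ear, and vice versa.
    """
    dx = beacon_pos[0] - head_pos[0]
    dy = beacon_pos[1] - head_pos[1]
    # Bearing of the beacon relative to where the wearer is facing,
    # wrapped to [-pi, pi]; positive = beacon is to the left.
    bearing = math.atan2(dy, dx) - head_yaw
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    # Map bearing to a pan value in [-1, 1]: -1 = hard left, +1 = hard right.
    pan = max(-1.0, min(1.0, -bearing / (math.pi / 2)))
    # Constant-power panning keeps perceived loudness steady as the
    # wearer turns: left^2 + right^2 == 1.
    theta = (pan + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)

# A beacon straight ahead plays equally in both ears:
left, right = beacon_stereo_gains((0.0, 0.0), 0.0, (5.0, 0.0))
# A beacon to the wearer's left favors the left channel:
l2, r2 = beacon_stereo_gains((0.0, 0.0), 0.0, (0.0, 5.0))
```

A real system would use head-related transfer functions and the full 3D head pose rather than a stereo pan, but the control loop is the same: pose in, directional audio out, updated continuously as the wearer moves.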






The companies stress that this application is still in the “exploratory and research” phase, but it points to a future where smart glasses with tracking and AI can make the lives of people with vision impairment easier.

Researchers interested in leveraging Meta’s Aria Gen 2 glasses can sign up here, and the company says it will share more about external availability in the coming months.
