Meta has revealed more information about Aria Gen 2, its experimental smart glasses designed to serve as a test platform for research into augmented reality, AI, and robotics. The glasses pack several improvements into their lightweight frame that could one day translate into consumer products, including an improved eye-tracking system that can track gaze per eye, detect blinks, and estimate pupil centers.

“These advanced signals enable a deeper understanding of the wearer’s visual attention and intentions, unlocking new possibilities for human-computer interaction,” Meta writes. Meta initially announced Aria Gen 2 in February, saying the glasses will “pave the way for future innovations that will shape the next computing platform.” They build on the first-generation Aria glasses Meta introduced in 2020, which were likewise available only to researchers.

Along with an improved eye-tracking system, Aria Gen 2 comes with four computer vision cameras that Meta says enable 3D hand and object tracking. Meta says researchers can use this information to enable highly precise tasks like “dexterous robot hand manipulation.”

The glasses also have a photoplethysmography sensor built into the nose pad, which allows the device to estimate a wearer’s heart rate, along with a contact microphone that Meta says provides better audio in loud environments. There’s a new ambient light sensor as well, allowing the glasses to differentiate between indoor and outdoor lighting.

The Aria Gen 2 glasses include folding arms for the first time, weigh around 75 grams, and come in eight different sizes. Meta plans to open applications for researchers to work with Aria Gen 2 later this year. The initiative builds on the successful development of Meta’s Ray-Ban smart glasses, a form factor it aims to expand with its Orion augmented-reality glasses, a rumored partnership with Oakley, and a high-end pair of “Hypernova” glasses with a built-in screen.