At Google I/O 2025, the company gave another on-stage demo of “prototype” smart glasses with a HUD.
We say another because the same duo, Shahram Izadi (Google’s Android XR lead) and Nishtha Bhatia (a Google PM), gave a very similar demo on stage at TED2025 last month.

The TED demo focused almost entirely on Gemini AI capabilities and navigation, and the new I/O demo centers heavily on Gemini too. But it also showcased broader functionality, including receiving and replying to message notifications and taking photos.
Izadi and Bhatia finished the demo by attempting real-time speech translation, a feature Meta rolled out to the Ray-Ban Meta glasses last month. It partially worked, but the view cast from Bhatia’s glasses froze halfway through.
“We said it’s a risky demo,” Izadi remarked.

What’s really new at I/O compared to TED, though, is the announcement of actual products. Google says it’s working with Gentle Monster and Warby Parker, rising competitors to Ray-Ban and Oakley owner EssilorLuxottica, mirroring Meta’s strategy of delivering smart glasses in stylish designs.
However, Google described what was shown on stage at I/O as “prototype” hardware, said that including a display is “optional” for actual products, and did not say whether the first Gentle Monster and Warby Parker products will include a HUD. It’s possible they’ll be more basic devices like Ray-Ban Meta glasses.
If any of these glasses do have a HUD, they won’t be alone on the market. In addition to the existing Even Realities G1 and Frame glasses, Mark Zuckerberg’s Meta reportedly plans to launch its own HUD-equipped smart glasses later this year. Unlike Google’s glasses, though, which appeared to be controlled primarily by voice, Meta’s HUD glasses will reportedly also be controllable via finger gestures, sensed by an included sEMG neural wristband.
Apple too is reportedly working on smart glasses, with apparent plans to release a product in 2027.