Meta announced a swathe of new software features coming soon to its smart glasses.
Nutrition Tracking
Meta says it’s “soon” adding a food log to the Meta AI smartphone app, and that you’ll be able to add entries to it from any of its smart glasses with “a simple voice prompt or quick photo”.
Meta AI will “extract key nutrition details” from your prompt or photo, the company says, updating the food log.
“Over time, your food log powers increasingly personalized insights that get more useful, helping you make healthier, more informed choices”, Meta proclaims.
The company says the feature will soon roll out to users aged 18+ in the US, and will come to Meta Ray-Ban Display “later this summer”.
WhatsApp Summaries
Meta’s displayless glasses can read out WhatsApp messages, if you want, and you can send a WhatsApp message by asking Meta AI. This can be a dictated text message, a voice message, or a captured image. On Meta Ray-Ban Display there’s also an app to visually scroll and read WhatsApp messages, and you’ll see message notifications pop up in your view, if you want.
This is all fine for the occasional message, but becomes unwieldy for fast-moving group chats as you go about your day.
Meta says it’s soon rolling out a Meta AI feature called WhatsApp Summaries to the Early Access program. By asking “Hey Meta, catch me up on my messages” you can get a summary of your chats, or you can ask specific questions like “What did Jamie suggest for dinner?”.
The company claims this is all “processed on device” and that chats “remain private with end-to-end encryption”.
New Live Translation Languages – Without Downloads?
For around a year now, Meta’s smart glasses have been able to translate speech between English, French, Italian, and Spanish. To activate the feature, you say “Hey Meta, start live translation.”
On displayless glasses you hear the spoken words of people nearby, translated to your own language, through the speakers via text-to-speech. On Meta Ray-Ban Display, you see the translation as text on the display instead. Simultaneously, the person you’re conversing with can see your words translated into their language as text in the Meta AI smartphone app.

Now, Meta has added Hindi, Arabic, Russian, Swedish and Finnish for users enrolled in the Early Access program, and says that it will support 20 languages by summer, including Mandarin, Korean, and Japanese.
The company also suggests that you will no longer need to download language packs in advance. Currently, you must download the specific language pair you want to your glasses before you can use it. Removing that step would significantly reduce the friction of the translation experience today.
Display Recording
A major issue I faced when reviewing Meta Ray-Ban Display was the inability to show you, our readers, what I was seeing. Sure, you can stick a camera up to the lens, but the waveguide is designed for a human eye, not a sensor, and I’ve never seen any attempt at this accurately depict what I saw. These camera capture techniques also preclude actually wearing the glasses.
The only current screen recording footage of Meta Ray-Ban Display comes from Meta itself, which seems to have an internal screen recording capability.
Meta says that in a coming update this spring, Meta Ray-Ban Display owners will finally be able to record the display, with the output showing it superimposed on the camera view, and including any playing audio.
This should make it a lot easier for journalists and influencers to demonstrate Meta Ray-Ban Display to their audiences, and for regular buyers to tell friends what it’s like too.
On-Foot Navigation In Any US City
In our review of Meta Ray-Ban Display, we harshly criticized the fact that the on-foot navigation feature only worked within the urban area of 28 specific cities.

Meta says that in May, the feature will expand to “every city across the US”. The US government doesn’t officially define what makes a “city”, so it’s unclear what Meta means by this. It’s also unclear whether this refers only to the urban area or to the wider metropolitan area. For example, I was unable to use on-foot navigation near JFK airport, despite New York being a listed city; the feature only became available once I reached Manhattan. We’ll find out in May.
Limiting the navigation expansion to the US makes sense, given that in January the company indefinitely delayed its plans to launch Meta Ray-Ban Display internationally, citing unexpectedly high demand and supply limitations.

Neural Handwriting Wide Rollout
In January, Meta started rolling out Neural Handwriting to the Early Access program for Meta Ray-Ban Display owners in the US.
The feature lets you enter text letter-by-letter by using your index finger to trace letters on a surface, such as a desk or your thigh, sensed by the surface electromyography (sEMG) sensors of the Meta Neural Band.
At the Early Access launch, it was limited to WhatsApp and Messenger. Now, Meta says it’s rolling out to “everyone in the coming weeks”, and that it will support Instagram and native iOS and Android messaging too.
Further International Expansion
Meta and EssilorLuxottica say they’ll start selling their Ray-Ban Meta and Oakley Meta smart glasses in new countries “in the coming months”, including Japan, South Korea, Singapore, Chile, Colombia, and Peru.

This will bring the total number of countries where the Ray-Ban Meta and Oakley Meta glasses are sold to 24:
- Australia
- Austria
- Belgium
- Canada
- Chile
- Colombia
- Denmark
- Finland
- France
- Germany
- India
- Ireland
- Italy
- Japan
- Mexico
- Norway
- Peru
- Singapore
- South Korea
- Spain
- Sweden
- United Arab Emirates
- United Kingdom
- United States of America



