Meta Adds Real-Time AI Video Features to Smart Glasses

Jean Gilles
Image Credit: Meta

Meta’s Ray-Ban Meta smart glasses are getting major AI upgrades with the v11 firmware update, now rolling out to early-access users in the U.S. and Canada. The update introduces “live AI,” which lets users talk continuously with Meta AI without saying “Hey Meta” each time, asking follow-up questions or changing topics mid-conversation.

The update also brings real-time AI video, letting users ask questions about their surroundings based on what the glasses’ front-facing camera sees. First announced at Meta’s Connect conference, the feature puts Meta ahead of competitors like OpenAI and Google, which are working on similar technology.

Another key addition is live translation, which works for English, Spanish, French, and Italian. When users talk to someone speaking one of these languages, they’ll hear a translation through the glasses’ speakers and see a transcript on their phone. The glasses also now support Shazam, allowing users to identify songs by saying, “Hey Meta, Shazam this song.”

Meta acknowledges that live AI and live translation may not always be accurate and says it is working to improve both features over time.

These upgrades follow Meta’s earlier rollout of AI features in Europe and coincide with strong sales. According to Ray-Ban’s parent company, EssilorLuxottica, the glasses are the top seller in 60% of Ray-Ban stores across Europe, the Middle East, and Africa. With these updates, Meta continues to push its smart glasses forward with practical and innovative AI features.
