Mark Zuckerberg, CEO of Meta, announced updates to the company’s Ray-Ban Meta smart glasses at Meta Connect 2024 on Wednesday. Meta continued to make the case that smart glasses can be the next big consumer device, announcing new AI capabilities and familiar smartphone features coming to Ray-Ban Meta later this year.
Some of Meta’s new features include real-time AI video processing and live language translation. Other announcements, like QR code scanning, reminders, and integrations with iHeartRadio and Audible, appear to give Ray-Ban Meta users the smartphone features they already know and love.
Meta says its smart glasses will soon have real-time AI video capabilities, meaning you’ll be able to ask the Ray-Ban Meta glasses questions about what you see in front of you, and Meta AI will respond verbally in real time. Currently, the Ray-Ban Meta glasses can only take a photo and describe it to you or answer questions about it, but the video upgrade should make the experience more natural, at least in theory. These multimodal features are expected to arrive later this year.
In a demo, users could ask Ray-Ban Meta questions about a meal they were preparing or city scenes unfolding before them. Real-time video capabilities mean Meta’s AI should be able to process live action and respond audibly.
That’s easier said than done, however, and we’ll have to see how fast and seamless the feature is in practice. We’ve seen demonstrations of these real-time AI video capabilities from Google and OpenAI, but Meta would be the first to launch such features in a consumer product.
Zuckerberg also announced live translation for Ray-Ban Meta. English-speaking users can talk with someone who speaks French, Italian, or Spanish, and their Ray-Ban Meta glasses should be able to translate what the other person is saying into their language of choice. Meta says the feature will arrive later this year, with more languages to follow.
Ray-Ban Meta glasses are also getting reminders, which will let people ask Meta AI to remember things they’re looking at through the smart glasses. In a demo, a user asked his Ray-Ban Meta glasses to remember a jacket he was looking at so he could share the image with a friend later.
Meta announced that integrations with Amazon Music, Audible, and iHeartRadio would soon be available on its smart glasses. That should make it easier for people to listen to music on their streaming service of choice through the glasses’ built-in speakers.
The Ray-Ban Meta glasses will also be able to scan QR codes and phone numbers. Users can ask the glasses to scan a code, and it will open on their phone immediately, with no further action required.
The smart glasses will also be available with a range of new Transitions lenses, which respond to ultraviolet light, adapting to the brightness of your surroundings.