Yesterday at the Meta Connect 2024 developer conference, CEO Mark Zuckerberg and other executives introduced a slate of new products and features to power the tech giant’s vision for AI, mixed reality and online content.
Here’s a look at what the company presented during its annual keynote.
Quest 3S mixed reality headset
Zuckerberg unveiled the new Meta Quest 3S mixed reality headset, which follows the Quest 3 released a year ago. Priced starting at $299, the 3S offers the same core mixed reality features as the original model for $200 less. (The lower price prompted Zuckerberg to say “Hell yeah” in response to audience applause.)
The Quest 3S also offers various hardware and software upgrades. Besides Dolby Atmos spatial audio, it can serve as a remote desktop for Windows 11 PCs thanks to Meta’s partnership with Microsoft. Beyond blending digital interactions into real-world environments, Meta also previewed a new feature called Hyperscape that lets users recreate physical spaces as virtual environments. To demonstrate the feature, Zuckerberg showed virtual recreations of artist Daniel Arsham’s art studio and of the recording studio where Green Day recorded their album “Dookie.”
Other content partnerships around the Quest 3S include new apps for Prime Video, Amazon Music and Twitch. Zuckerberg also introduced new mixed reality social features for Meta Horizon, including gaming and co-watching YouTube content.
“A lot of the magic of mixed reality is that you can feel a sense of presence,” Zuckerberg said. “It’s unlike anything any other platform has ever built and different than anything any other platform can provide.”
Although the lower price of Meta’s headsets could make VR more accessible, cost is not the only barrier to adoption, noted Forrester analyst Mike Proulx. Headsets are still too bulky and tiring to wear over long periods, he said, which is why the upgrades to smart glasses could be particularly promising. There are no clear marketing applications for AR glasses yet, but Proulx said it’s more about preparing for what’s possible in the future, adding that Meta “has quite convincingly demonstrated a future 3D computing platform that addresses many of the headwinds that VR and AR headsets won’t be able to overcome.”
“Since Meta’s glasses see and interact with the context around them, there are natural connection points to interact with brands’ advertising, whether physical or online,” Proulx continued. “I can imagine print ads or billboards easily coming to life, but that’s just the beginning. I believe the real opportunity for brands lies in improving the customer experience. This type of computer interface makes transactions with brands much easier and more instantaneous.”
Meta AI updates
Meta also announced a series of updates to its AI products, including the launch of its new AI model, Llama 3.2, which can understand both images and text. Other features include talking to Meta AI by voice – similar to what OpenAI recently added to ChatGPT. Meta AI’s voice feature will include system voices as well as several celebrity voices, including John Cena, Dame Judi Dench, Kristen Bell, Keegan-Michael Key and Awkwafina.
The celebrity voices come just months after OpenAI launched an AI voice that sounded so much like Scarlett Johansson that the actress threatened to sue the startup for appropriating her voice without permission.
“I think voice will be a much more natural way to interact with AI than text,” Zuckerberg said. “And I think this has the potential to become one of, if not the most common way to interact with AI.”
Meta AI’s voice capabilities also include real-time language translation via Meta’s Ray-Ban glasses (see below) and a companion app. To demo the feature, Meta invited UFC fighter Brandon Moreno on stage for a brief conversation with Zuckerberg, with Meta AI translating between Spanish and English. Another AI feature previewed was automatic video dubbing, which translates video content into users’ native languages on Instagram and Facebook.
Updated features such as language translation could be particularly beneficial for content creators and marketers looking to reach new audiences, said Nicole Greene, an analyst at Gartner. Multimodal features like video AI could also power more shoppable media, and the AI assistant in Meta’s Ray-Ban glasses could offer users new ways not only to create content but also to discover it. However, the new AI capabilities also come with ongoing risks around identity theft, misinformation and data privacy.
“For businesses, how can they present their unique brand differentiation in a way that is relevant to customers in an increasingly crowded content environment?” Greene said. “How does their brand stand out when consumers might be turning to convenience, and how are they using new technology to integrate seamlessly into experiences rather than chasing customers with hyper-personalized messages across the platform?”
The new version of Llama 3.2 will be available everywhere starting this week except the European Union, following Meta’s decision in July not to release upcoming AI models in the EU due to regulatory issues. However, Zuckerberg said, “I remain eternally optimistic that we’re going to solve this problem,” prompting a few people in the audience to applaud awkwardly.
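For developers, the image-and-text capability is reachable through the model’s openly released weights. Below is a minimal sketch – not Meta’s official example – of querying a Llama 3.2 vision model via the open-source Hugging Face transformers library; the model ID, image path and prompt are illustrative placeholders, and downloading the weights requires accepting Meta’s license on the Hugging Face Hub.

```python
# A minimal sketch of multimodal inference with Llama 3.2 Vision using the
# Hugging Face transformers library. Model ID, image path and prompt are
# illustrative placeholders, not Meta's official example.
import torch
from PIL import Image
from transformers import AutoProcessor, MllamaForConditionalGeneration

model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"  # illustrative model ID
model = MllamaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# Combine an image and a text question in a single multimodal prompt.
image = Image.open("product_photo.jpg")  # placeholder image path
messages = [
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "Describe this image in one sentence."},
    ]}
]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(image, prompt, return_tensors="pt").to(model.device)

# Generate and print the model's answer about the image.
output = model.generate(**inputs, max_new_tokens=60)
print(processor.decode(output[0], skip_special_tokens=True))
```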
Even though advertising was not the focus of Meta Connect, the company said more than a million advertisers use Meta’s generative AI tools to produce ad creative – with more than 15 million ads created with these tools last month. On stage, Zuckerberg said Meta AI now has nearly 500 million monthly active users.
Smart glasses
Meta also launched an improved version of its Ray-Ban smart glasses, with updates including easier ways to talk with Meta AI and have it identify real-world objects. Other new features include video AI, real-time translation, better memory capabilities and ways to send audio messages via WhatsApp and Messenger. The glasses also feature more advanced content integrations with Calm, Spotify and Amazon Music, as well as new partnerships with Audible and iHeart.
Perhaps the biggest and most surprising news came at the end, when Zuckerberg unveiled a prototype of Orion, Meta’s first pair of “true AR” glasses. Previously codenamed Project Nazare, the glasses have been in the works for years. Rather than using traditional displays, Orion uses light diffraction to project holographic images into the wearer’s view of the environment.
“The display is different from any other screen you’ve ever used, and that’s because it’s not actually a screen,” Zuckerberg said. “This is a completely different type of display architecture, with tiny projectors in the arms of the glasses that project light into waveguides – nanoscale 3D structures etched into the lenses – so they can diffract light and place holograms of different depths and sizes into the world in front of you.”
Although a release date has not yet been set, the growing line of smart glasses signals to some that Meta is looking to create a more cohesive connection between smartphones and other devices.
Meta’s AR glasses could open up new possibilities for product integration, data-driven insights and location-based applications, offering brands new ways to personalize content, said Sasha Wallinger, founder of Blockchain StyleLab. Wallinger – a longtime marketer focused on the intersection of creativity and innovation – also envisioned a potential “Pokémon Go meets Google Ads, with optimization for select categories the wearer actually wants to see.”
“Aside from the marketing and future-technology elements, I am also very excited about how the Meta frames collection has the potential to make technology more accessible to the fashion, beauty and eyewear industries,” Wallinger said.