Meta Connect 2025 and Takeaways for the XR Industry
Meta's Announcements and XR
From the outset of the keynote – CEO and founder Mark Zuckerberg putting on a pair of Ray-Bans as he walked to the stage, and his opening line, "AI glasses and virtual reality" – Meta's annual developer conference confirmed the company's current focus, one that has been in the making for over a decade.
It also demonstrated the company's recognition that Extended Reality (XR) hardware and software are converging with artificial intelligence.
Announcements
Meta Ray-Ban Display
Meta unveiled the new Meta Ray-Ban Display as its first pair of “AI smart glasses” with an integrated full colour display in the right lens.
This is the latest addition to their existing "smart glasses", which include voice-activated controls to capture photos, control music or translate conversations in real time.
Meta Neural Band
Along with the Meta Ray-Ban Display, Meta introduced the Meta Neural Band, a wearable wristband with a gesture interface driven by hand and finger movements (it reads the electrical signals of the muscles at the wrist). This interface gives users a more subtle and private mode of input, since voice activation in public settings is not always possible or desirable.
Oakley Meta Vanguard
Targeting high performance athletes and sporting enthusiasts, the Oakley Meta Vanguard not only taps into the coveted Oakley market but also includes sports-focused features, such as integrations with fitness platforms like Garmin and Strava.
Learning from (others’) past mistakes
One can see that Meta has clearly learned lessons from the infamous Google Glass. Launched in 2013, it was a technical marvel, but the public and mass culture were simply not ready.
We have evolved into online social creatures
A lot of the criticism was around user experience, which I wrote about back in 2017. In that article I discussed how a lot of the fallout actually stemmed from a disconnect between the user experience of primary and secondary users.
But the differences between Google Glass and Meta's glasses are stark – the result not just of 12 years of technical progress in form and function, but of shifts in society and culture.
In 2025 we are all chronically online. We consume large amounts of digital content designed specifically for our screens. Moreover, we are much more aware AND accepting that everyone else may be capturing, recording and livestreaming in public spaces. But back in 2013 this was unheard of.
If you want mass adoption, you need mass appeal
One of the first points in Zuckerberg's presentation was: "They need to be great glasses first."
Meta has strategically partnered with EssilorLuxottica, a company responsible for (and dominating) the design and production of ophthalmic lenses, equipment and instruments, prescription glasses and sunglasses. They own brands like Ray-Ban, Oakley and Oliver Peoples, and retailers like Sunglass Hut.
While Meta’s glasses are designed with fashion conscious consumers in mind, Google Glass appealed (unintentionally?) to tech enthusiasts with an almost cyberpunk aesthetic.
Impact on XR industry
Terminology
Meta has positioned these products as AI glasses rather than AR glasses deliberately shying away (for now) from using words like augmented reality or mixed reality. There are a number of strategic reasons for this approach. Currently the XR landscape suffers from a lack of consensus when it comes to terminology (XR, VR, AR, MR, spatial computing) making things incredibly confusing for consumers.
While technologists debate semantics and technical differences, the public are left scratching their heads as to what the technology can actually do and what its potential applications are. Throw into the mix buzzy terms like "metaverse" and "web3" and it's no wonder there is confusion around XR tech. Given the current interest in AI, calling these products "AI glasses" is at least (for the time being) something that the public can get their heads around.
AI is context
Meta isn't jumping on the AI bandwagon simply because of the current stranglehold AI has on the public consciousness. The move reflects a legitimate understanding of technology convergence.
XR, to date, has largely been about presenting content as a truly immersive experience: virtual environments and objects, spatial mapping of the user's physical environment, spatial audio, and a variety of inputs such as controllers, hand tracking, gaze and speech.
What AI now brings to the table is an understanding of what the user is doing in real time. For example, I could be:
At the botanical gardens and ask “What is the name of this flower?”
Looking at a menu in a foreign language and ask "What does this say?"
Listening to music at my local café and ask “What’s this song?”
Yes, a lot of this technology already exists (Thanks Google Lens, Google Translate and Shazam) but wearing a pair of glasses with this capability removes so much of the current friction:
Taking out your phone
Unlocking it
Locating the right app
Opening that app
Navigating to the right screen
Inputting the request.
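To make that difference concrete, here's a minimal sketch in TypeScript of what the hands-free loop might look like. Every name in it (GlassesCamera, Microphone, Assistant, askAboutWhatISee) is hypothetical – a stand-in for whatever Meta's stack actually exposes, not a real API.

```typescript
// Hypothetical sketch: every interface here is illustrative, not a real Meta API.

interface GlassesCamera {
  captureFrame(): Promise<Uint8Array>; // current point-of-view camera frame
}

interface Microphone {
  listen(): Promise<string>; // speech-to-text of the wearer's question
}

interface Assistant {
  ask(query: { image: Uint8Array; text: string }): Promise<string>;
}

// One interaction replaces the six phone steps above: the glasses grab the
// current frame while the wearer speaks, and a multimodal model answers in
// the context of what they are actually looking at.
async function askAboutWhatISee(
  camera: GlassesCamera,
  mic: Microphone,
  assistant: Assistant,
): Promise<string> {
  const [frame, question] = await Promise.all([
    camera.captureFrame(), // e.g. the flower, the menu, the café
    mic.listen(),          // e.g. "What is the name of this flower?"
  ]);
  return assistant.ask({ image: frame, text: question });
}
```

The point isn't the code itself; it's that the capture-ask-answer loop collapses into a single utterance while you keep looking at the thing you're asking about.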
Look ma, no phone!
Well, not just yet. These glasses are not standalone products; they still require a connection to a mobile phone and the Meta AI app.
But ultimately, I believe that Meta want to replace mobile phones entirely. In this way, they see Horizon OS as a direct competitor to Apple’s iOS and Google’s Android operating systems.
What does this mean for the future of narrative design?
Meta is determined to be at the forefront of this new spatial computing revolution with the Metaverse as the ultimate end goal. That is, a persistent and immersive online world that is experienced as a 3D spatial environment. But it’s still very early days.
The new Meta Ray-Ban Display glasses feature content as 2D objects – essentially flat dialog boxes.
The next evolution of content may include AR features such as face and surface detection, which you may remember from Instagram and Facebook filters (which, interestingly, Meta discontinued at the beginning of 2025).
Eventually, 3D objects will be positioned within "world space" so that they are anchored and persistent within the user's physical environment. Such 3D content would blend seamlessly with the real world. These types of glasses could then be called "mixed reality glasses", sharing similar capabilities with Meta's Quest 2 and 3 headsets. And that's when things will get really interesting…
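As a thought experiment, here's a minimal sketch, in the same hypothetical TypeScript vein as earlier, of what a persistent world anchor might need to store. None of these fields correspond to a real Meta or WebXR API.

```typescript
// Hypothetical sketch of a persistent "world anchor": enough data to
// re-attach a virtual object to the same physical spot across sessions.
// All names here are illustrative, not any real API.

interface WorldAnchor {
  id: string;                    // stable identity across sessions
  pose: {                        // position + orientation in a scanned map
    position: [number, number, number];
    rotationQuat: [number, number, number, number];
  };
  spatialMapId: string;          // which scanned environment this belongs to
  contentUrl: string;            // the 3D asset to render at this anchor
}

// Persistence is what separates mixed reality from a one-off camera filter:
// the anchor is stored, then resolved against a fresh scan of the room on
// the next visit.
const demoAnchor: WorldAnchor = {
  id: "kitchen-recipe-panel",
  pose: { position: [1.2, 0.9, -0.4], rotationQuat: [0, 0, 0, 1] },
  spatialMapId: "home-kitchen-scan-v3",
  contentUrl: "https://example.com/assets/recipe-panel.glb",
};
```

Resolve an anchor like this against a shared spatial map and multiple users could see the same object in the same spot – the seed of the persistent Metaverse described above.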
Location Based Experiences
With GPS-enabled technology, you would then be able to anchor experiences to real-world locations. Buildings, street corners and window shopfronts could appear entirely different.
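Here's a hedged sketch of how such experiences might be triggered, again with entirely hypothetical names: a catalogue of geo-anchored experiences is filtered by distance from the wearer. A production system would likely use a visual positioning service rather than raw GPS, but haversine distance is the simplest stand-in.

```typescript
// Hypothetical sketch: find location-anchored experiences near the wearer.

interface GeoExperience {
  name: string;
  lat: number;            // degrees
  lon: number;            // degrees
  triggerRadiusM: number; // how close the wearer must be, in metres
}

// Great-circle distance in metres between two lat/lon points.
function haversineM(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371e3; // Earth radius, metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Which experiences should the glasses start rendering right now?
function nearbyExperiences(
  userLat: number,
  userLon: number,
  catalogue: GeoExperience[],
): GeoExperience[] {
  return catalogue.filter(
    (e) => haversineM(userLat, userLon, e.lat, e.lon) <= e.triggerRadiusM,
  );
}
```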
Identity and Appearance
The same logic could apply to people: how you present yourself to other glasses-wearers could be customised, with avatars, outfits or effects layered over your real-world appearance.
Gaming
One of my favourite potential applications is alternate reality games – a bit like the film Free Guy, where putting on a pair of glasses reveals a game layer on top of the everyday world, or the novel Stealing Worlds by Karl Schroeder:
https://www.amazon.com.au/Stealing-Worlds-Karl-Schroeder/dp/0765399989
Pivotal step change
Meta Connect's announcements demonstrate the company's commitment to, and investment in, extended reality technologies. The absence of any such announcements at Apple's WWDC 2025 event and Google's relative quiet around Project Astra leave many pundits wondering when the sleeping giants will awaken…
You can watch Meta’s full keynote presentation below (event starts at 31:25)
https://www.youtube.com/watch?v=80s0chTOsK0