

Meta is reportedly restructuring its Reality Labs and Metaverse organizations as it doubles down on the integration of artificial intelligence across its products, according to a Business Insider report.
The News
In a memo obtained by Business Insider, Meta CTO Andrew Bosworth announced that longtime Metaverse head Vishal Shah will transition to lead AI Products within Meta Superintelligence Labs (MSL), a new division focused on developing and integrating “personal superintelligence” into Meta’s platforms.
Shah, who led Meta’s metaverse efforts over the past four years, will now oversee AI integrations across both Family of Apps (FoA) and Reality Labs, reporting to MSL head Nat Friedman, Bosworth’s memo reveals (seen in full below).
“The priority of the metaverse work remains unchanged, and it continues to be a companywide priority,” Bosworth’s memo reads. “We’ve proved our thesis to the industry, and we continue to see competitors enter this space to try and catch up to us, so we need to continue to press our hard-earned advantage.”

In the memo, Bosworth characterizes Shah’s move as a key moment in Meta’s broader plan to merge metaverse innovation with “personal superintelligence,” or the company’s vision for artificial general intelligence (AGI) laid out in Meta CEO Mark Zuckerberg’s July 2025 statement.
Shah describes his transition as “difficult yet exciting,” noting that the “metaverse hype has thankfully died down.” He emphasizes that AI will be the “transformative shift of our generation,” enabling personalized, context-aware experiences that bridge virtual and physical worlds.
Stepping into Shah’s previous role, Gabriel Aul will lead the Metaverse Product Group, with Jason Rubin, Samantha Ryan, and Thamara Sekhar now reporting to him. Aul will also oversee the Horizon Experiences team, headed by new leader Saxs Persson. Meanwhile, Ryan Cairns continues to lead Horizon OS, now elevated to an org-level product group reporting directly to Bosworth.
We’ve copied Bosworth’s reported memo below. Additionally, you can read both Shah’s and Friedman’s memos in the original report.
Bosworth’s Memo
An update to Metaverse’s structure
I hope you’ve had time to read Vishal’s update about him taking on a new role in MSL leading product and cross-company integrations on the Products and Applied Research Team. I’m pleased that he will oversee the integration of personal superintelligence with FoA and RL’s portfolio. I’m confident that his deep expertise and experience with RL will accelerate our work.
It is thanks to Vishal’s leadership for the last four years that we find ourselves well-equipped to deliver on our vision and strategy. We already have the right leadership and team in place.
Gabriel Aul will step in to lead the Metaverse PG. Metaverse will continue to focus on creating high-quality experiences for both VR and mobile. Jason Rubin, Samantha Ryan, and Thamara Sekhar will move to report to Gabe. We will also welcome a new leader, Saxs Persson, to lead the Horizon Experiences team under Gabe.
Ryan Cairns will continue to lead Horizon OS which will become an org-level PG, and he’ll report directly to me. The Horizon OS charter and reporting structure remain unchanged. They will continue to focus on building quality hardware and software for the metaverse, especially ahead of our big launches and exciting VR roadmap. Metaverse and Horizon OS will continue to work closely together to ensure an integrated product experience across our devices and platforms. Gabe’s and Ryan’s posts will have more on this, and you can ask me more questions during Tuesdays with Boz tomorrow.
The priority of the metaverse work remains unchanged and it continues to be a companywide priority. We’ve proved our thesis to the industry and we continue to see competitors enter this space to try and catch up to us, so we need to continue to press our hard-earned advantage.
VR is evolving beyond its roots in gaming to become a broader platform for entertainment, productivity, and connection as we deepen our AI and general compute capabilities. Mobile is starting to attract young social gamers at a greater scale and our AI creation tools are accelerating world-building to create the flywheel. We have the right team and strategy in place and now we need to focus on execution.
My Take
Despite the name, Meta’s center of gravity appears to be moving from metaverse-first to AI-first. Shah’s move marks that shift: after years as one of the public faces of the company’s metaverse efforts, he’s essentially being promoted deeper into its new core.
Meta still seems to believe the metaverse is the future of human communication and interaction, making the announcement more about putting AI at the foundation of that vision instead of replacing it. Still, the company has decreased its focus on VR as the metaverse’s driving force.
As it is, the company doesn’t fund high-quality VR content like it used to, and has put more emphasis on driving concurrent user numbers on its Horizon Worlds metaverse app, which the company released on mobile and the web in late 2023. Disheartening for a VR nerd like me hoping for more high-production-value single-player VR games, but an understandable move for a publicly traded company.
And I’d argue Meta’s early signs of traction with smart glasses (aka ‘AI glasses’) probably tipped the scales. While VR development has been costly, and not capable of quickly driving a return on the billions invested every quarter—and true AR glasses with all of their Pokémon-hunting implications still aren’t here—smart glasses offer a real opportunity today for Meta to flex its XR muscles among a much larger subset of everyday consumers.

And love it or hate it, AI will be at the center of that. I’d go so far as to say there isn’t going to be a successful AR platform in the future that doesn’t put AI at its center. But first, I’d ask you to shelve what you might think of AI right now for a moment: the helpful voice in your ears that speaks with a distinctly corporate cadence and a default disposition for hollow niceties. AI is more than that.
Even with the frankly magical Neural Band, which ships alongside the company’s recently launched $800 Meta Ray-Ban Display glasses, to mediate user input, there’s a massive gulf between user input and glasses output that requires AI intuition to bridge. As much as I love the concept, doing Minority Report-style input sequences on a digital keyboard and pinching and zooming windows is a nonstarter. Typing out words on your leg, as Meta Ray-Ban Display has been shown to do, isn’t much better in the grand scheme of things. Getting answers to questions without needing to talk or type to an AI agent is.
Questions like: What’s that guy’s name again? How long should I wait to flip this steak for a good sear? What did that guy just say in Gujarati? Where did I park my car? All of those questions could be quietly answered more quickly by a proactive AI than a reactive search through the library of disparate apps on your smartphone.
Okay, maybe letting an AI into your life and giving it access to all of your waking data is a bit dystopian, and maybe a little too close to farming your long-term memory out to a Meta-controlled data center. But it certainly sounds useful. And profitable too.
What’s more astounding is that Meta hasn’t done any of the usual platform-building work to launch multiple generations of its first line of smart glasses. It didn’t need to throw out a wide net of developer kits to inculcate third-party developers, fund a library of first- and third-party apps, buy up a bunch of studios, etc. AI and a set of clearly defined core use cases seem to be enough for now.
Granted, I’m not saying there will never be an app store for Meta’s smart (or future AR) glasses, but it is interesting that they’ve done so much up to this point without needing to do the seemingly requisite slow and dirty work. And all with no persistent call for a “killer app” either.
The post Meta Reportedly Reshapes Metaverse Division Amid Leadership Shifts, Putting AI at Its Core appeared first on Road to VR.