AI can bring real-world objects into VR as 3D assets in seconds, with Meta’s new SAM 3D Objects model setting a new standard for quality.
It has been possible for years now to generate a 3D model of a real-world object by capturing dozens of images of it from surrounding angles, leveraging traditional photogrammetry techniques. Epic’s RealityScan, for example, takes around 15–45 minutes of cloud processing, while Apple offers an on-device Object Capture API for iPhone Pro models that takes around 5 minutes.
But over the past year or so, advanced AI models have emerged that can produce 3D assets from a single image in a matter of seconds. And while they don't yet match the quality of photogrammetry, their output has steadily improved with each new model release, mirroring the rapid advancement of AI overall.
EchoTheReality on SideQuest, which uses an old AI model from 2024.
For an example of how this applies to VR, Takahiro "Poly" Horikawa published a Quest app on SideQuest that uses hand tracking and Meta's passthrough camera API to let you frame a specific real-world object and take a photo of it. The image is then sent to Stability AI's Stable Fast 3D API, which is based on the TripoSR model, and the result is spawned as a virtual object beside the capture spot.
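For developers curious what that image-to-asset step looks like in code, here's a minimal sketch, assuming Stability AI's documented Stable Fast 3D REST endpoint and a placeholder API key; the exact path and field names should be verified against Stability's current documentation. It sends a single photo and saves the returned GLB mesh, which an app could then load at the capture spot.

```python
import requests

STABILITY_API_KEY = "sk-..."  # placeholder: your Stability AI key


def image_to_glb(image_path: str, out_path: str = "object.glb") -> None:
    """Send a single photo to Stable Fast 3D and save the returned GLB mesh."""
    with open(image_path, "rb") as image_file:
        response = requests.post(
            "https://api.stability.ai/v2beta/3d/stable-fast-3d",
            headers={"authorization": f"Bearer {STABILITY_API_KEY}"},
            files={"image": image_file},
            timeout=60,
        )
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)  # binary glTF (GLB), ready to load into an engine


if __name__ == "__main__":
    image_to_glb("captured_object.jpg")
```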
TripoSR is now almost two years old, though. And a few days ago, Meta launched SAM 3D Objects, the new state-of-the-art model for generating 3D assets from a single image.
Meta SAM 3D Objects
You can test out SAM 3D Objects for free in your web browser on the Meta AI Demos page. Just provide it with an image and you’ll be able to select which object you want to convert to a 3D model. Seconds later, you’ll see a 3D view where you can pan around the object with your mouse or finger.
Meta’s site isn’t designed for mobile screens, so you’ll probably want to use a PC, laptop, tablet, or VR headset. Also note that the model is only designed for inanimate objects, not people or animals.
This free public demo does not let you download the 3D model. But SAM 3D Objects is open source, available on GitHub and Hugging Face. That means developers should be able to host it on a cloud computing platform that offers GPUs and use it to recreate the experience of that EchoTheReality demo with higher quality output – essentially pulling an object from reality into VR.
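As a rough illustration of what that self-hosting could look like, here's a hypothetical sketch of a small HTTP wrapper a developer might run on a GPU cloud instance. The `generate_glb_from_image` function is a stand-in for whatever inference entry point the open-source SAM 3D Objects release actually exposes, and the `/image-to-3d` route is an invented name, not part of Meta's release.

```python
# Hypothetical server sketch: a Quest app uploads a single passthrough photo,
# the server runs SAM 3D Objects on the GPU, and a binary GLB mesh comes back.
from fastapi import FastAPI, File, UploadFile
from fastapi.responses import Response

app = FastAPI()


def generate_glb_from_image(image_bytes: bytes) -> bytes:
    """Placeholder: run SAM 3D Objects inference and return GLB bytes."""
    raise NotImplementedError("Wire this up to the model from GitHub / Hugging Face")


@app.post("/image-to-3d")
async def image_to_3d(image: UploadFile = File(...)) -> Response:
    # Receive the photo, generate the mesh, and return it for the headset to spawn.
    glb_bytes = generate_glb_from_image(await image.read())
    return Response(content=glb_bytes, media_type="model/gltf-binary")
```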
Social VR platforms, for example, could let you conduct show-and-tell with objects from your real room in a matter of seconds. Or decorate your home space with items you crafted in the real world. Meta hasn't announced plans to add this to Horizon Worlds, but it would seem like a natural next step, complementing the Hyperscape worlds it just launched.

