visionOS 26.4 will bring foveated streaming to Apple Vision Pro, enabling higher-quality wireless VR remote rendering from a local or cloud PC.
Before you continue reading, note that foveated streaming is not the same as foveated rendering, though the two techniques can be used alongside each other. As the names suggest, foveated rendering has the host device render the area of each frame you're currently looking at in higher resolution, while foveated streaming means encoding and sending that area to the headset in higher resolution.
It's a term you may have heard in the context of Valve's Steam Frame, where it's a fundamental, always-on feature of the headset's PC VR streaming, delivered via its bundled USB PC wireless adapter.
Given that the video decoders in headsets have a limited maximum resolution and bitrate, foveated streaming lets the sender concentrate that fixed budget on the region you're actually looking at, delivering higher perceived quality without exceeding the decoder's limits.
Valve’s depiction of foveated streaming.
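The decoder-budget argument above can be made concrete with some back-of-the-envelope arithmetic. The resolutions below are illustrative assumptions, not Apple's or Valve's actual parameters: a two-layer foveated stream sends a small full-resolution inset around the gaze point plus the whole frame at a quarter of the linear resolution.

```python
# Illustrative pixel-budget comparison between a uniform full-resolution
# stream and a two-layer foveated stream. All numbers are hypothetical.

def pixels(width, height):
    return width * height

# Hypothetical per-eye resolution the headset should *perceive*.
full_w, full_h = 3600, 3600

# Foveated stream: a full-resolution foveal inset plus a low-resolution
# periphery covering the whole frame at quarter linear resolution.
inset_w, inset_h = 1200, 1200
periphery_w, periphery_h = full_w // 4, full_h // 4

uniform = pixels(full_w, full_h)
foveated = pixels(inset_w, inset_h) + pixels(periphery_w, periphery_h)

print(f"uniform stream:  {uniform:,} px per eye per frame")
print(f"foveated stream: {foveated:,} px per eye per frame")
print(f"decoder load:    {foveated / uniform:.0%} of uniform")
```

Under these assumed numbers, the decoder handles under a fifth of the pixels of a uniform stream, which is exactly why the technique matters on hardware with hard decode limits.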
Unlike the macOS Spatial Rendering introduced in the main visionOS 26 release last year, which is a relatively high-level system that only supports a local Mac as a host, Apple’s developer documentation describes the new Foveated Streaming as a low-level host-agnostic framework.
The documentation highlights Nvidia’s CloudXR SDK as an example host, while noting that it should also work with local PCs. Apple even has a Windows OpenXR sample available on GitHub, which to our knowledge is the first and only time the company has even mentioned the industry-standard XR API, never mind actually using it.

The lead developer of the visionOS port of the PC VR streaming app ALVR, Max Thomas, tells UploadVR that he’s currently looking into adding support for foveated streaming, but that it will likely be “a lot of work”.
Because of how the feature works, Apple’s foveated streaming might even enable foveated rendering for tools like ALVR.
Normally, visionOS does not provide developers with any information about where the user is looking, which Apple says is to preserve privacy. Instead, developers only receive events, such as which element the user was looking at when they performed the pinch gesture. Crucially for foveated streaming, though, the new API tells the developer the "rough" region of the frame the user is looking at.
This should allow the host to render that region at higher resolution too, not just stream it at higher resolution. As always, this will require the specific VR game to support foveated rendering, or to support tools that inject foveated rendering.
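What a streaming host might do with that "rough" gaze region can be sketched as follows. This is a hypothetical illustration, not Apple's actual Foveated Streaming API: it takes a normalized gaze point, centers a foveal inset on it, clamps it to the frame, and snaps it to encoder-block boundaries.

```python
# Hypothetical sketch of turning a reported gaze region into an
# encoder-friendly foveal rectangle. Names, sizes, and alignment are
# illustrative assumptions; the real framework's types are Apple's.

def foveal_rect(gaze_x, gaze_y, frame_w, frame_h,
                inset_w=1200, inset_h=1200, align=64):
    """Center a foveal inset on a normalized gaze point (0..1),
    clamped to the frame and aligned to `align`-pixel boundaries
    so it maps cleanly onto encoder macroblocks."""
    # Center the inset on the gaze point, in pixels.
    x = int(gaze_x * frame_w) - inset_w // 2
    y = int(gaze_y * frame_h) - inset_h // 2
    # Clamp so the inset stays fully inside the frame.
    x = max(0, min(x, frame_w - inset_w))
    y = max(0, min(y, frame_h - inset_h))
    # Snap the origin down to the nearest block boundary.
    x -= x % align
    y -= y % align
    return x, y, inset_w, inset_h

print(foveal_rect(0.5, 0.5, 3600, 3600))    # gaze at frame center
print(foveal_rect(0.98, 0.02, 3600, 3600))  # gaze near top-right corner
```

A real host would feed a rectangle like this to its encoder to raise quality inside it, and, as noted above, could also pass it to the renderer to apply foveated rendering in the same region.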
Clip from Apple’s visionOS foveated streaming sample app.
Interestingly, Apple’s documentation also states that visionOS supports displaying both rendered-on-device and remote content simultaneously. The company gives the example of rendering the interior of a car or aircraft on the headset while streaming the highly detailed external world on a powerful cloud PC, which would be preferable from a perceived latency and stability perspective to rendering everything in the cloud.
We’ll keep an eye on the visionOS developer community in the coming months, especially the enterprise space, for any interesting uses of Apple’s foveated streaming framework in practice.
