At WWDC today, Apple announced the headlining features of visionOS 26, its next big OS release for Vision Pro. Among them is a revamped spatial photos feature that ought to make those photos even more immersive.
Vision Pro launched with the ability to view spatial photos, captured either with the headset itself or with iPhone 16 or iPhone 15 Pro and Pro Max. These spatial photos create a sense of depth and dimensionality by combining stereo capture with depth mapping applied to the image.
Now, Apple says it has applied a new generative AI algorithm to create “spatial scenes with multiple perspectives, letting users feel like they can lean in and look around” — essentially ‘guessing’ at details not actually captured on camera.
With visionOS 26, Vision Pro users will be able to view spatial scenes in the Photos app, the Spatial Gallery app, and Safari. The company says developers will also be able to use the Spatial Scene API to add the feature to their own apps.
To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it’s adopting the Spatial Scene API in the Zillow Immersive app for Vision Pro, which lets users see spatial images of homes and apartments.
Apple’s visionOS 26 is slated to arrive sometime later this year, although the company says testing is already underway.