Apple Vision Pro's New OS Subtly Expands the Headset's Potential
I had surgery this summer, and throughout the recovery, I routinely used two gadgets: Meta’s Ray-Bans for easy calls and music listening, and Apple’s Vision Pro to escape into movies. For the latter, I used a developer version of VisionOS 2 with features now available for anyone to use on the mixed-reality headset. Those who have one will appreciate a few subtle changes that make using it a bit more interesting. I say “a bit” because the Vision Pro, while advanced, is still a headset limited in how much it can do.
VisionOS 2’s use of new hand gestures is fantastic
Apple assumes you’ll do everything on Vision Pro with your eyes and hands. And its onboard eye and hand tracking are so good, that’s mostly true. The latest OS also adds interfaces that appear directly over your hands, making them feel like pop-up dashboards. Meta has had this for years on its Quest headset, but Apple’s riff on the feature feels remarkably fluid and fast.
Turning my palm up brings up a floating circle icon, and tapping my fingers together opens the grid of apps. It’s faster and easier than pressing the Digital Crown button on the top of the Vision Pro, which I used to do.
Turning my hand over again brings up a control center widget that I like even more. This replaces the odd floating dot found at the top of the headset’s display (which still appears from time to time). Getting rid of it means no more weird interruptions to my field of view. The widget also shows the time, like a clock, instantly solving the “what time is it?” problem in the Vision Pro.
The widget also shows battery life, volume level and Wi-Fi connection. Tapping my fingers brings up a submenu for other control settings, like connecting to my Mac as a virtual display or checking notifications. Tapping, holding and dragging adjusts volume, saving me another reach up to the Digital Crown.
It’s so good and so simple that I want more. I want to control all of Vision Pro through simple subgestures in this mode, avoiding hunting through bigger grids of apps and menus. It could happen — and should happen.
Keyboards work when you’re in immersive environments
Meta was the first to recognize keyboards and bring them into VR, and I loved it back then. Apple’s headset now automatically recognizes a Mac or Bluetooth keyboard while in immersive environments; I can see my hands and the keyboard clearly while I’m surrounded by nicer beaches than my cluttered office. I can write more easily in a peaceful surrounding.
3D photo conversion: FOMO delight
I sometimes don’t shoot 3D video clips on the iPhone 15 Pro, and then find my memories aren’t 3D-enabled for viewing with the Vision Pro later on. My entire back library of photos and videos has the same problem. Apple has a brilliant AI-enabled feature in VisionOS 2 that nearly negates the need for a 3D photo-capable new iPhone 16: It just converts 2D photos to 3D automatically.
The conversion is fast, and photos still show normally in 2D on your Mac, iPhone and elsewhere. But in the Vision Pro, old photos look remarkably 3D. Old pics of my kids, trips I took a decade ago… they made me emotional. The depth conversion isn’t always perfect, but it’s often amazing. It’s made me enjoy flipping through my photo library more.
Safari is better for videos and WebXR
Apple now launches WebXR immersive experiences without a trip into settings (finally), so you can launch a VR experience on the web without needing a dedicated Vision Pro app. Videos play back in immersive environments, too, which makes Netflix, YouTube and other app-free video services feel more like apps on Vision Pro. It’s stuff I’ve wanted from the beginning. Is it a game changer? No, but it’s welcome.
Meditation now tracks your breath
A small, quirky bonus I noticed during a mindfulness meditation in-headset (something I like doing every week or so) is that it recognizes my inhaling and exhaling and matches the animations to my breaths. Apple hasn’t pursued any other health and fitness avenues with its mixed-reality spatial computer (yet), but this little tweak makes me wonder how much more the Vision Pro could be tuned to be aware of movements, posture and other health-related things.
Can’t wait to try the giant curved Mac monitor
A feature coming later this year promises to expand the virtual Mac display you can enable in Vision Pro to a giant curved monitor that surrounds your whole field of view. Meta has similar tech for its headsets already, but it’ll be great to try with Apple’s lovely micro-OLED display. It’s still not the same as having multiple virtual Mac monitors, but it looks like a much larger canvas for any apps you want to keep open simultaneously.
Still missing: What about iPhone, Watch and Apple Intelligence?
In the past six months, I’ve worked a fair amount on my Mac in Vision Pro, but I’m shocked that the iPhone — the device I always have on me — doesn’t directly interface with the headset yet. I’d use the iPhone as a physical controller for the Vision Pro if I could, or for typing things faster than the awkward air-tapping I have to do on the headset now. Or for sharing information back and forth quickly, expanding a shared item into Vision Pro’s expansive view. VisionOS 2 does have a way to AirPlay your phone’s screen right into a pop-up window on Vision Pro, but that’s not the same thing.
Apple already has hook-ins for Messages, AirDrop and other iCloud-type things (as well as copy-and-paste across devices), but I’m waiting for more. And speaking of Apple devices, the Watch would also be a perfect accessory. The Watch already recognizes wrist gestures and has haptic feedback as well as a touchscreen. It could be used as a fitness tracker with mixed-reality apps, something the competing Quest headset already supports.
Also, what about Apple Intelligence? The generative AI features Apple has touted since June haven’t arrived for iPhone, iPad or Mac yet, but the Vision Pro should be on this list as a clear next-in-line recipient of an AI upgrade. I use Siri a lot in Vision Pro to open apps and do quick actions, and it makes a ton of sense for an upgraded assistant to be available to explore what else could be instantly conjured in mixed reality. It sounds like that won’t happen until next year, though. Maybe, along with it, Apple will add connected support for the iPhone — and Apple Watches, too.
Source: CNET