Every experience I have with Apple's Vision Pro mixed reality headset is the same as the last, yet very different. I liken it to peeling an onion: I think I understand the feel and texture of it, but each time I notice new gradations and even flavors that remind me that I still don't fully understand Apple's cutting-edge wearable technology.
For my third session with the Vision Pro, I had the unusual experience of watching my own content through the powerful and expensive ($3,499 when it ships next year) headset.
A few weeks ago, Apple released a beta of iOS 17.2 that added spatial video recording to the iPhone 15 Pro and iPhone 15 Pro Max (the full version was released this week). It's a landscape-only video format that uses the 48 MP main and 12 MP ultrawide cameras to create a stereo video image. I started capturing videos in that format almost immediately, but with the caveat that not every video suits this more immersive experience (you can't be too far from your subject, and it helps to keep the phone level and stable). Still, I had a whopping nine clips that I brought with me for my second and much more personal Vision Pro spatial video experience.
During this third Vision Pro trial, I tried to pay more attention to some of the headset's setup and initialization details. As I've mentioned before, the Vision Pro is one of Apple's more customized hardware experiences. If you wear glasses, you'll have to pay extra for a pair of custom-made Zeiss lens inserts; I had provided my prescription information prior to this test. It's not clear how long consumers will have to wait for their own inserts (could Apple have an express optician service at the back of every Apple Store? Doubtful).
Not everyone will need those lenses, or have to put up with the extra cost and wait. If you don't wear glasses, you're ahead of people like me, and the same goes if you wear contact lenses.
Providing the right tailor-made experience
Yet there are other adjustments that I hadn't paid attention to until now. The face cushion, which rests against your face and attaches magnetically to the main body of the Vision Pro, is available in a number of different curve styles to accommodate the contours of a range of typical human faces. I don't know how many options Apple will offer.
One thing that's crucial for a comfortable AR and VR experience is matching your interpupillary distance: the distance between the centers of your pupils. This was the first time I paid attention to one of the first steps of Vision Pro setup. After a long press of the headset's Digital Crown, a few large green shapes appeared before my eyes. The headset measured the spacing of my eyes, and then the dual micro-OLED screens, with their combined 23 million pixels, shifted to match it. If you listen carefully, you may be able to hear the mechanisms doing their work.
I also noticed how the Vision Pro took me through three different sets of eye-tracking tests, where I looked at a ring of dots and pinched my index finger and thumb together to select each one. It may feel tedious to do this three times (okay, it is), but it's a crucial step that ensures the Vision Pro's primary interaction paradigm works reliably every time.
Now that I'm wearing it for the third time, I've become quite the expert at looking and squeezing. A gold star for me.
Spatial computing is becoming familiar
We AirDropped my spatial videos and panoramic photos from a nearby phone. It was nice to see how smoothly AirDrop works on the Vision Pro: when someone sent the content over, I simply looked at 'Accept' and pinched my thumb and finger together. Within seconds the files were in my Photos library (spatial video gets its own icon).
When Apple's panoramic photography was new in iOS 6, I took a lot of panoramic photos. I was tickled by the torn-looking people who moved too fast through the shot, and by the ability to have someone appear twice in one trick panorama. Apple has largely resolved the first problem: I noticed that fewer of my recent panoramas show people with two heads. These days, though, I take very few panoramas and only had four decent ones to try with the Vision Pro.
But even with just a few examples, I was shocked by the quality and immersive nature of the images. My favorite by far was the photo I took earlier this year from my CES 2023 hotel room with an iPhone 14 Pro. Taking these photos is a kind of ritual. I like to see what the views and weather are like in Las Vegas, and usually share something on social media to remind people that I'm back at CES.
It wouldn't be an exaggeration to say that this one shot, taken from quite high up in the Planet Hollywood Hotel, was a revelation. Not only because the view practically wrapping around my head was beautiful, but because, for the first time, when I looked at the far right side of the image, I noticed a complete reflection of me taking the photo. It's a detail I've never noticed when looking at the panorama on my phone, and there's something incredibly strange about unexpectedly seeing yourself in such an immersive environment.
A view from Antigua was also fascinating. The clarity and detail in general, which are a credit to iPhone 14 Pro and iPhone 15 Pro Max photography, are impressive. I viewed most of my panoramas in immersive mode, but by using a pinch-and-push gesture with both hands I was able to put the panoramic image back into a windowed view.
Spatial vision
To prepare for my spatial video experience, I shot videos of Thanksgiving dinner, Dickensian carols, walks in the park, model trains, and interactions with a friend's four-year-old.
Each of these videos hit me a little differently, and they all shared a few key characteristics in immersive mode. You can view spatial video on the Vision Pro in a window, but I preferred the immersive style, which blurs the boundaries and delivers each video almost in a cloud. Instead of hard edges, the 3D video feathers out at its borders, so there's no clear demarcation between the real world and the world floating in front of your face. This reduces the field of view somewhat, especially the vertical height and depth. When I viewed the spatial videos on my iPhone (where they look like regular, flat videos), I could see everything I captured from edge to edge, while in immersive mode on the Vision Pro, some details were lost toward the top and bottom.
In my model train videos, the spatial 3D video effect reminded me of the possibly apocryphal story of early movie audiences who, upon seeing a film of an approaching train, ran screaming from the theater. I wouldn't say my video was that intense, but my model train looked like it was about to drive right into my lap.
I enjoyed each video, and while I didn't feel like I was in any of them, each one felt more realistic, and the emotions I had watching them were amplified. I suspect that when consumers experience the Vision Pro and spatial videos for themselves, they may be surprised at the level of emotion they feel with family videos. It can be quite intense.
It was once again a short and sedentary experience, and I'm sure I didn't strain the endurance of the Vision Pro's external two-hour battery pack. I did learn that if, for example, I were working all day, watching multiple two-hour movies, or going through a huge library of spatial videos, I could plug a cable connected to the AC adapter directly into the battery pack's available USB-C port.
I'm still not sure if the Apple Vision Pro is for everyone, but the more I use it and the more I learn about it, the more convinced I am that Apple is about to revolutionize our computing experience. Not everyone will end up buying Vision Pro, but most of us will feel its impact.