This is how Apple is telling developers to build apps for Vision Pro and spatial computing

If you want to understand what your spatial computing experience with Apple Vision Pro will be like, you should look at the upcoming apps. Oh yeah, they aren't here yet. That means we have to look to developers and, no surprise, even they have questions about what it means to develop for Apple's latest computing and digital experience platform.

A few weeks ago I asked a dozen developers if they were excited about building apps for the Vision Pro. Most said yes, and some were even willing to imagine what those app experiences could look like.

We're still months away from the consumer release of Apple's mixed-reality headset, and while I've tried it a few times, a real understanding of what it will be like to play, communicate, explore, and even work in it remains out of reach. My conversation with those developers reminded me of that: no one had a concrete idea of how exactly you compute in a Vision Pro, or how to build applications designed to work on the new visionOS platform.

It seems like there are several common questions developers have when it comes to building apps for Vision Pro, and they're all asked and answered in a recent Apple developer blog post titled Q&A: Spatial design for visionOS, which is based on a Q&A session Apple's design team held with developers in June at WWDC23 (where Apple first unveiled Vision Pro and visionOS).

Even though I'm not a developer, I could relate to all of the developers' burning visionOS questions and found Apple's answers to be insightful and, in some cases, surprising.

Apple Vision Pro interface (Image credit: Apple)

Start slowly

Vision Pro is a mixed reality headset capable of near-complete passthrough viewing, which places AR elements into your real world, or of full immersion. The latter, which I experienced, is impressive, especially for the way it can put a virtual version of your hands (but not your knees) into the experience.

However, it can be difficult for developers to know where on the spectrum of immersion to place users. Apple doesn't recommend that these apps completely immerse users, at least at first.

“In general, we don't recommend immediately putting people into a fully immersive experience. It's better to get them oriented in your app before transporting them elsewhere,” Apple said in the post.
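That advice maps onto visionOS's immersion styles. Here's a minimal, hypothetical sketch (the app, scene ID, and view names are placeholders, not anything from Apple's post) of a SwiftUI app that opens in the mixed style and treats full immersion as something the person moves into later:

```swift
import SwiftUI

@main
struct ForestApp: App {
    // Start in the mixed style so the wearer's real surroundings stay visible;
    // the app can switch this binding to .full once the person is oriented.
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        // An ordinary window where the experience begins.
        WindowGroup {
            ContentView()
        }

        // The immersive scene the user steps into from that window.
        ImmersiveSpace(id: "forest") {
            ForestView()
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .progressive, .full)
    }
}
```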

Apple also recommends that visionOS developers create a ground plane to connect their apps to the real world. Most of the time, you'll likely be using Vision Pro in passthrough or mixed reality mode, which means having a connection to the real floor beneath your feet will help ground the app and keep it from feeling disorienting.
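One straightforward way to express that connection, assuming a RealityKit scene, is to anchor content to the floor the headset detects. This is only a sketch; the view name and placeholder geometry are mine, not Apple's:

```swift
import SwiftUI
import RealityKit

struct GroundedSceneView: View {
    var body: some View {
        RealityView { content in
            // Anchor the scene to a detected horizontal floor plane so the
            // virtual content shares a ground plane with the real room.
            let floorAnchor = AnchorEntity(
                .plane(.horizontal, classification: .floor, minimumBounds: [0.5, 0.5])
            )

            // Placeholder geometry standing in for the app's real content.
            let base = ModelEntity(
                mesh: .generateBox(width: 0.6, height: 0.02, depth: 0.6),
                materials: [SimpleMaterial(color: .gray, isMetallic: false)]
            )
            floorAnchor.addChild(base)
            content.add(floorAnchor)
        }
    }
}
```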

The moment

Apple Vision Pro interface (Image credit: Apple)

There was a moment during my first Vision Pro experience when a virtual butterfly (that looked real) fluttered onto my finger from a forest of windows. It was breathtaking and unforgettable.

It seems like Apple wants developers to think about moments like this in their apps. No, Apple doesn't recommend everyone add a butterfly (although that would be great). Instead, Apple is telling developers to think about how their apps can shine in spatial computing. It is, as Apple puts it, “an experience not tied to a screen.”

In one demo I saw, the key moment was clearly when a dinosaur emerged from the wall in front of me, but Apple notes in the post that a key moment can be something as simple as adding a focus mode with spatial audio to a writing app.

3D challenges

(Image credit: Apple)

In Vision Pro, you no longer compute on a flat surface. Everything is in 3D, and that requires a new way of working with interface elements. Apple addresses this in visionOS with gaze and gesture control, and in the post it reminds developers to think about that 3D space when designing apps.

“Things can become more complex when you design elements in 3D, such as close-up controls for a faraway element,” Apple's designers explain.

In general, developing apps for visionOS will be more complex. Instead of just thinking about how moving a mouse pointer to an element on the screen might work, developers have to think about what happens when someone wearing Vision Pro simply looks at that element. I remember being impressed that every experience in Vision Pro always knew exactly what I was looking at, and that buttons and windows changed based on my gaze.
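Much of that gaze work is handled by the system rather than the app. As a rough illustration (the view and its copy are hypothetical), a SwiftUI element on visionOS just opts into a hover effect; the OS draws the gaze highlight while the app only receives the confirming pinch, which arrives as a tap:

```swift
import SwiftUI

struct GazeCard: View {
    @State private var isSelected = false

    var body: some View {
        Text(isSelected ? "Selected" : "Look here, then pinch")
            .padding(24)
            .glassBackgroundEffect()
            // visionOS highlights this view when the wearer looks at it;
            // the app never sees raw eye-tracking data.
            .hoverEffect()
            // The pinch-while-looking gesture is delivered as an ordinary tap.
            .onTapGesture { isSelected.toggle() }
    }
}
```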

Comfort is the thing

(Image credit: Apple)

Spatial computing involves more of your body than traditional computing. First, you wear a headset. Second, you look around to take in wraparound environments and apps. Finally, you use your hands and fingers to control and interact with the interface.

However, Apple recommends that developers not spread their main content across a 360-degree area.

“Comfort should define the experience. We recommend keeping your main content in the field of view so that people don't have to move their neck and body too much. The more centered the content is in the field of view, the more comfortable it is for the eyes,” Apple's designers wrote in the post.
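In a RealityKit scene, that guidance roughly translates to placing the primary content centered and a comfortable distance ahead of the wearer instead of wrapping it around them. The view name, geometry, and distances below are illustrative assumptions, not values from Apple's post:

```swift
import SwiftUI
import RealityKit

struct MainPanelView: View {
    var body: some View {
        RealityView { content in
            // A stand-in for the app's primary content: a single panel kept
            // centered, roughly at eye height, and about 1.5 meters in front
            // of the wearer rather than spread across 360 degrees.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 1.0, height: 0.6),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            panel.position = [0, 1.4, -1.5]   // x: centered, y: eye height, z: forward
            content.add(panel)
        }
    }
}
```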

I remember that even when my Vision Pro experiences felt immersive, I wasn't swinging my head back and forth to find controls or follow the action.

A sound decision

The key to my Vision Pro experience was the sound, which uses spatial audio to create a 360-degree soundstage. But it's not just about the immersive experience. Apparently, sound can be used to connect Vision Pro wearers to their spatial computing experiences.

In the post, Apple reminds developers who may not have given sound much thought to use audio cues in their apps, noting that “an audio cue helps (users) recognize and confirm their actions.”
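Here's a sketch of what such a cue might look like using RealityKit's spatial audio, assuming a short confirmation sound bundled with the app (the helper function, asset name, and gain value are hypothetical):

```swift
import RealityKit

// Hypothetical helper: play a short confirmation sound from the entity the
// user just interacted with, so the cue seems to come from that object.
func playConfirmationCue(on entity: Entity) {
    do {
        // "confirm.wav" is a placeholder asset assumed to ship with the app.
        let cue = try AudioFileResource.load(named: "confirm.wav")
        // Spatial audio localizes the sound at the entity's position in the room.
        entity.components.set(SpatialAudioComponent(gain: -6))
        _ = entity.playAudio(cue)
    } catch {
        print("Could not load audio cue: \(error)")
    }
}
```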

I'm not sure what it will be like to live and work in the world of spatial computing, but Apple's guidance is quickly bringing things into focus.
