
What is Spatial Computing?

During the WWDC 2023 Keynote, Apple introduced the Vision Pro, a headset that creates a spatial computing experience. It provides stereoscopic displays and spatial audio. Together, the dual displays and spatial audio surround the user, creating a compelling, realistic sensory experience that combines Augmented Reality (AR) and Virtual Reality (VR) techniques. Before you jump into creating apps for the Vision Pro, it’s important to understand the history of AR and VR devices.

Every day, you use your eyes and life experience to judge the distance to an object. You also use binaural hearing to judge where an object is, since what you hear depends on where you are in the space.

The stereoscope, invented in the early 1800s, presented a separate image to each eye. Gazing at the images, the viewer’s brain fused them into a single stereo picture, and the photographed objects appeared as if they were actually present.

The stereoscope was followed by the Kinematoscope, which showed the user moving pictures in stereo. In the late 1920s, the first flight simulators came along. As a child, you may have played with a View-Master, a device introduced in the late 1930s.

You’re probably surprised that AR and VR aren’t new and pre-date digital computers. Computer-based VR came along in the 1960s and developed over the following decades into arcade games, vehicle simulators, power gloves, and projected environments surrounding the observer.

Today, most people think of AR and VR as a headset worn by the user or as experiences viewed through the iPhone’s cameras and LiDAR Scanner. In the latter case, 3D objects are projected into the scene based on the iPhone’s depth measurements and placed relative to walls, surfaces, and furnishings. This creates the sensory illusion that the object is really in the space.

Popular headsets like Meta’s Quest (formerly Oculus), Microsoft’s HoloLens, Google Cardboard, PlayStation VR, and the Magic Leap shape the public’s idea of what wearable AR/VR technology looks like. These devices are used in various applications, such as gaming, practical business tasks, and virtual visits to far-off locations and museums.

Enter Apple into the AR/VR space. For many years, the company has been assembling tools like ARKit, RealityKit, Stage Manager, and RoomPlan and co-developing file formats like USDZ. As WWDC 2023 approached, people were on the edge of their seats waiting to hear about Apple’s mystery device. Combining this AR/VR history with the latest technological advances, Apple presented the Vision Pro and introduced us all to spatial computing.

The Vision Pro and visionOS are poised to create a paradigm shift in computing. Spatial computing means interacting with elements rendered in front of and around you. The primary input tools are your eyes and hands, along with conventional devices like Bluetooth trackpads, keyboards, and mice.

At WWDC, Apple invited journalists to try the device and demo applications highlighting the technical advances. Over the summer and fall, Apple hosted developer labs where developers could try out their existing apps and create new visionOS and AR/VR experiences on an actual device. While this new technology is exciting, you might wonder why you should learn how to create visionOS apps.

Why Should You Learn How to Create visionOS Apps?

visionOS is the operating system that runs on the Vision Pro. Apps for it are written primarily in SwiftUI, but it can also run apps built with UIKit or apps compatible with iPadOS. In this course, you’ll focus on building SwiftUI apps specifically for visionOS. By doing so, you’ll take advantage of UI explicitly designed for the Vision Pro and subsequent devices running visionOS.

Throughout this module, you’ll get a brief introduction to creating experiences with Window, Volume, and Immersive apps. You’ll explore these app metaphors here before trying them out in the next few lessons.

In this first lesson, you’ll build a Window app for visionOS. Apple has provided extensive design guidance and UI support for this style, which represents how most apps will run on the Vision Pro.

The Three Faces of visionOS

visionOS has three styles of apps. The basic Window app is the style most apps will use. Imagine a decent-sized 4K Mac monitor floating in front of you at about arm’s length. The Shared Space, where all apps launch, is a limitless space where you can open multiple app windows and arrange or resize them as you like.

Volume apps are similar, except they occupy a cubic region in front of you. Objects in a Volume float virtually within that space.

Finally, Immersive apps can engulf you in a complete experience, changing your surroundings in the Full Space. You’re virtually transported to a different location where everything you see and hear is all around you. The Vision Pro’s Digital Crown lets you dial back the amount of immersion.
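If you’re curious how these three styles translate into code, here’s a minimal sketch of a SwiftUI app declaring one scene of each kind. The scene identifiers and placeholder views are hypothetical; you’ll build real versions in later lessons.

```swift
import SwiftUI

// A minimal sketch of the three visionOS app styles expressed as SwiftUI scenes.
// The ids and placeholder content are hypothetical.
@main
struct ExampleVisionApp: App {
    var body: some Scene {
        // Window: a flat, resizable window in the Shared Space.
        WindowGroup(id: "MainWindow") {
            Text("A standard window")
        }

        // Volume: a window that occupies a cubic, 3D region.
        WindowGroup(id: "ObjectVolume") {
            Text("3D content goes here")
        }
        .windowStyle(.volumetric)

        // Immersive Space: content that can surround you in the Full Space.
        ImmersiveSpace(id: "ImmersiveArea") {
            Text("Immersive content goes here")
        }
    }
}
```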

In this lesson, you’ll build the app from scratch using the visionOS template in Xcode. App windows on visionOS get new visual treatments: the window’s surface appears to be made of a frosted, glass-like material. When you’re working in the app, the window appears solid. When you move to another app or control, the window’s surface fades, moves back, and becomes more translucent without quite fading out altogether.
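Standard windows pick up this glass material automatically, but if you build a custom container view, you can opt into the same look with SwiftUI’s glass background modifier. This is a minimal sketch, assuming the standard .glassBackgroundEffect() modifier; the content is just placeholder text.

```swift
import SwiftUI

// A hypothetical panel that adopts the frosted-glass window material.
struct GlassPanel: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, visionOS")
                .font(.title)
            Text("This panel uses the system glass material.")
        }
        .padding(40)
        // Gives the view the same translucent, frosted-glass look
        // that standard visionOS windows use.
        .glassBackgroundEffect()
    }
}
```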

Note: On the Vision Pro, users interact with apps using their eyes and hands. You’ll work in the Vision Pro simulator, which is a 2D representation, but it gives you the impression of moving items and windows around a virtual room. On the actual hardware, an invisible “pointer” follows your gaze. As your eyes pass over a pane, menu item, or other control, it shimmers with a hover effect. The Vision Pro simulator does a good job of simulating these effects, but the experience is very different on the actual device.
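Standard controls get this hover highlight for free. If you build a custom control, you can opt it in with SwiftUI’s hover effect modifier. The snippet below is a minimal sketch, assuming the standard .hoverEffect() modifier; the view and its tap action are hypothetical placeholders.

```swift
import SwiftUI

// A hypothetical custom control that shimmers when you look at it on
// device (or move the pointer over it in the simulator).
struct GlowCard: View {
    var body: some View {
        Text("Look at me")
            .padding()
            .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 16))
            // Opts this view into the system hover highlight.
            .hoverEffect()
            // Hover effects appear only on interactive views, so a simple
            // tap gesture makes this card eligible.
            .onTapGesture { print("Card tapped") }
    }
}
```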

The Vision Pro also provides passthrough video from its external cameras, so you can see other objects in the room without noticeable distortion. If you raise your hands, you see your hands, and if the app supports visionOS gestures, you can interact with objects and controls using them. For example, when the virtual keyboard appears, you tap its keys with your fingers as if an actual keyboard were in front of you.

A Window app is a controlled rectangular space that contains your app. In the next exercise, you’ll create a window with a sidebar and a main detail view, similar to an iPadOS app, using a NavigationSplitView. You’ll add some items and change the content in the detail view. On visionOS, you can also use ornaments for items like a tab bar, search bar, or toolbar. Ornaments appear parallel to the window, floating near its edge and slightly in front of it.
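As a preview of that exercise, here’s a minimal sketch of a NavigationSplitView with a sidebar list, a detail view that follows the selection, and an ornament attached to the bottom of the window. The item names and the ornament’s button are hypothetical placeholders; you’ll build the real version step by step.

```swift
import SwiftUI

// A minimal sketch of the window layout for the upcoming exercise:
// a sidebar list, a detail pane driven by the selection, and a
// toolbar-style ornament floating near the window's bottom edge.
struct SplitContentView: View {
    let items = ["First Item", "Second Item", "Third Item"]  // Placeholder data.
    @State private var selection: String?

    var body: some View {
        NavigationSplitView {
            // Sidebar: selecting an item updates the detail view.
            List(items, id: \.self, selection: $selection) { item in
                Text(item)
            }
            .navigationTitle("Items")
        } detail: {
            // Detail: shows whichever item is selected.
            Text(selection ?? "Select an item")
                .font(.largeTitle)
        }
        // The ornament floats parallel to the window, near its bottom
        // edge and slightly in front of it.
        .ornament(attachmentAnchor: .scene(.bottom)) {
            Button("Clear Selection", systemImage: "xmark.circle") {
                selection = nil  // Placeholder action.
            }
            .padding()
            .glassBackgroundEffect()
        }
    }
}
```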
