Instruction

RealityKit is Apple’s framework for working with 3D content. It handles animating objects, simulating physical effects, and rendering 3D content for AR and VR experiences. Apple introduced RealityKit in 2019 and continues to add new features, most recently for visionOS.

RealityKit uses the Entity Component System (ECS) architecture, a pattern popular in 3D game development. The primary element, an Entity, can be a 3D model, a light, a sound, or a particle emitter, and is placed into a scene. Attached to entities are Components, which add data and behavior to them: a component might add physics or collisions for games and interaction, or even accessibility properties. The System part of ECS updates the entities on each frame. The result of this modular design is that objects in a scene look and act like they belong in the world viewed through a Vision Pro. For example, seaweed can sway in an underwater scene, blurring naturally as it recedes from the viewer. Balls on a pool table can scatter as they’re struck. Birds can flock and emit spatial audio, just like a real-life flock would.
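As a rough sketch of how ECS fits together in code, the snippet below defines a SwayComponent and a SwaySystem that nudges any entity carrying the component once per frame. The component and system names, and the swaying behavior, are illustrative assumptions rather than built-in RealityKit types.

```swift
import Foundation
import RealityKit

// Hypothetical component that marks an entity as swaying — an illustration,
// not a built-in RealityKit type.
struct SwayComponent: Component {
    var amplitude: Float = 0.02   // how far the entity drifts, in meters
    var phase: Float = 0
}

// A System that updates every entity carrying SwayComponent each frame.
struct SwaySystem: System {
    static let query = EntityQuery(where: .has(SwayComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query,
                                       updatingSystemWhen: .rendering) {
            guard var sway = entity.components[SwayComponent.self] else { continue }
            sway.phase += Float(context.deltaTime)
            entity.position.x = sway.amplitude * sin(sway.phase)
            entity.components.set(sway)
        }
    }
}

// Register once (for example at app launch), then attach the component to any entity:
// SwayComponent.registerComponent()
// SwaySystem.registerSystem()
// seaweed.components.set(SwayComponent(amplitude: 0.03))
```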

You’ll create and interact with entities and components in this course.

In RealityKit, entities can have a material applied with Physically Based Rendering (PBR). Materials are made up of assets that can contain color data, roughness data, and normal maps that modify the surface to create a 3D texture. A ball can be made to look like real wood, with specular highlights and wood-grain depressions. RealityKit’s shaders render the material, blending the scene’s lighting with any geometry modifiers, so the object appears as it would in the real world.
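Here’s a minimal sketch of building a wood-like PBR material in code and applying it to a sphere. The texture names "WoodColor" and "WoodNormal" are hypothetical assets assumed to be in the app bundle.

```swift
import RealityKit
import UIKit

// Builds a wood-like PBR material and applies it to a generated sphere.
func makeWoodBall() throws -> ModelEntity {
    var material = PhysicallyBasedMaterial()

    // Color data for the surface. "WoodColor" is a hypothetical texture asset.
    let colorTexture = try TextureResource.load(named: "WoodColor")
    material.baseColor = .init(tint: .white, texture: .init(colorTexture))

    // Roughness controls how sharp or diffuse specular highlights appear.
    material.roughness = 0.7

    // A normal map adds the wood-grain depressions without extra geometry.
    let normalTexture = try TextureResource.load(named: "WoodNormal")
    material.normal = .init(texture: .init(normalTexture))

    return ModelEntity(mesh: .generateSphere(radius: 0.1),
                       materials: [material])
}
```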

Models can be brought in as USDZ files, which can contain material properties, a 3D mesh with materials already applied, and ready-made animations. Like the biplane in the Introduction to visionOS course, a USDZ can be dropped into a scene as-is, but it can also be modified with RealityKit or in Reality Composer Pro. A model can also be described and built from scratch in code.
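As a sketch of that workflow, the code below loads a USDZ named "Biplane" (a hypothetical asset name) into a visionOS RealityView and plays any animation bundled with the file.

```swift
import SwiftUI
import RealityKit

// Loads a bundled USDZ into a RealityView. "Biplane" is a hypothetical asset name.
struct BiplaneView: View {
    var body: some View {
        RealityView { content in
            if let biplane = try? await Entity(named: "Biplane") {
                biplane.position = [0, 0, -0.5]   // half a meter in front of the origin
                content.add(biplane)

                // Ready-made animations shipped inside the USDZ can be played directly.
                if let animation = biplane.availableAnimations.first {
                    biplane.playAnimation(animation.repeat())
                }
            }
        }
    }
}
```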

Models can also have Materials applied to make them appear more natural and less plastic. Materials have a diffuse color and roughness; they can be metallic, opaque, or transparent; and they can have normal maps applied. Glass or water would be mostly transparent. A light bulb could emit a colored glow. Apple is also making use of MaterialX in Reality Composer Pro, which lets you define materials using shader graphs. As well as these physical properties, you can adjust how much influence the scene’s lighting has on your models.
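As a rough sketch of those properties in code, the two helpers below build a glass-like material with low opacity and a light-bulb-like material with an emissive glow. This assumes code-built materials rather than a Reality Composer Pro shader graph; the function names are illustrative.

```swift
import RealityKit
import UIKit

// A glass-like material: mostly transparent, smooth, and non-metallic.
func makeGlassMaterial() -> PhysicallyBasedMaterial {
    var glass = PhysicallyBasedMaterial()
    glass.baseColor = .init(tint: .white)
    glass.roughness = 0.05
    glass.metallic = 0.0
    glass.blending = .transparent(opacity: 0.2)   // low opacity = see-through
    return glass
}

// A light-bulb-like material that emits a colored glow.
func makeBulbMaterial() -> PhysicallyBasedMaterial {
    var bulb = PhysicallyBasedMaterial()
    bulb.emissiveColor = .init(color: .yellow)
    bulb.emissiveIntensity = 2.0
    return bulb
}
```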

In the following lesson, you’ll learn how to apply materials and components to Entities. You’ll also learn how to create Entities from scratch. In later lessons, you’ll learn how to create a similar experience with Reality Composer Pro. We’ll take a look at running your app on an Apple Vision Pro in the second lesson on RealityKit. Continue to the next lesson to learn how to work directly with RealityKit for visionOS.
