Building a Portal App in ARKit: Adding Objects
In this second part of our tutorial series on building a portal app in ARKit, you’ll build up your app and add 3D virtual content to the camera scene via SceneKit. By Namrata Bandekar.
This is an excerpt taken from Chapter 8, “Adding Objects to Your World”, of our book ARKit by Tutorials. This book shows you how to build five immersive, great-looking AR apps using ARKit, Apple’s augmented reality framework. Enjoy!
In the previous tutorial of this series, you learned how to set up your iOS app to use ARKit sessions and detect horizontal planes. In this part, you’re going to build up your app and add 3D virtual content to the camera scene via SceneKit. By the end of this tutorial, you’ll know how to:
- Handle session interruptions
- Place objects on a detected horizontal plane
Before jumping in, download the project materials using the “Download Materials” button and load the starter project from the starter folder.
Getting Started
Now that you are able to detect and render horizontal planes, you need to reset the state of the session if there are any interruptions. An ARSession is interrupted when the app moves into the background or when multiple applications are in the foreground. Once interrupted, the video capture will fail and the ARSession will be unable to do any tracking, as it will no longer receive the required sensor data. When the app returns to the foreground, the rendered plane will still be present in the view. However, if your device has changed its position or rotation, the ARSession tracking will not work anymore. This is when you need to restart the session.
The ARSCNViewDelegate implements the ARSessionObserver protocol. This protocol contains the methods that are called when the ARSession detects interruptions or session errors.
Open PortalViewController.swift and add the following implementation for the delegate methods to the existing extension.
// 1
func session(_ session: ARSession, didFailWithError error: Error) {
  // 2
  guard let label = self.sessionStateLabel else { return }
  showMessage(error.localizedDescription, label: label, seconds: 3)
}

// 3
func sessionWasInterrupted(_ session: ARSession) {
  guard let label = self.sessionStateLabel else { return }
  showMessage("Session interrupted", label: label, seconds: 3)
}

// 4
func sessionInterruptionEnded(_ session: ARSession) {
  // 5
  guard let label = self.sessionStateLabel else { return }
  showMessage("Session resumed", label: label, seconds: 3)
  // 6
  DispatchQueue.main.async {
    self.removeAllNodes()
    self.resetLabels()
  }
  // 7
  runSession()
}
Let’s go over this step-by-step.
1. session(_:didFailWithError:) is called when the session fails. On failure, the session is paused and does not receive sensor data.
2. Here you set the sessionStateLabel text to the error message that was reported as a result of the session failure. showMessage(_:label:seconds:) shows the message in the specified label for the given number of seconds.
3. sessionWasInterrupted(_:) is called when the video capture is interrupted as a result of the app moving to the background. No additional frame updates are delivered until the interruption ends. Here you display a “Session interrupted” message in the label for 3 seconds.
4. sessionInterruptionEnded(_:) is called after the session interruption has ended. A session continues running from the last known state once the interruption has ended, so if the device has moved, any anchors will be misaligned. To avoid this, you restart the session.
5. Show a “Session resumed” message on the screen for 3 seconds.
6. Remove previously rendered objects and reset all labels. You will implement these methods soon. These methods update the UI, so they need to be called on the main thread.
7. Restart the session. runSession() simply resets the session configuration and restarts the tracking with the new configuration.
You will notice there are some compiler errors. You’ll resolve these errors by implementing the missing methods.
Place the following variable in PortalViewController below the other variables:

var debugPlanes: [SCNNode] = []

You’ll use debugPlanes, an array of SCNNode objects, to keep track of all the rendered horizontal planes in debug mode.
Then, place the following methods below resetLabels():
// 1
func showMessage(_ message: String, label: UILabel, seconds: Double) {
  label.text = message
  label.alpha = 1
  DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
    if label.text == message {
      label.text = ""
      label.alpha = 0
    }
  }
}

// 2
func removeAllNodes() {
  removeDebugPlanes()
}

// 3
func removeDebugPlanes() {
  for debugPlaneNode in self.debugPlanes {
    debugPlaneNode.removeFromParentNode()
  }
  self.debugPlanes = []
}
Take a look at what’s happening:
1. You define a helper method to show a message string in a given UILabel for the specified duration in seconds. Once the specified number of seconds pass, you reset the visibility and text for the label.
2. removeAllNodes() removes all existing SCNNode objects added to the scene. Currently, you only remove the rendered horizontal planes here.
3. This method removes all the rendered horizontal planes from the scene and resets the debugPlanes array.
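The check inside showMessage(_:label:seconds:) matters: it only clears the label if the text is still the message that scheduled the timer, so a stale timer can’t wipe out a newer message. Here’s a sketch of that pattern in plain Swift, using a hypothetical MessageSlot type as a stand-in for the UILabel so the logic can be followed without UIKit:

```swift
import Foundation

// Hypothetical stand-in for the UILabel, so the "only clear if still
// current" logic can be exercised without UIKit.
final class MessageSlot {
    var text = ""
}

// Models the body of the asyncAfter closure: the "timer firing" is an
// explicit call here, so the ordering is easy to follow.
func clearIfStillShowing(_ message: String, in slot: MessageSlot) {
    // Only clear when no newer message has replaced the one we scheduled.
    if slot.text == message {
        slot.text = ""
    }
}

let slot = MessageSlot()
slot.text = "Session interrupted"
// A newer message arrives before the first message's timer fires.
slot.text = "Session resumed"

// The stale timer fires: it must NOT wipe the newer message.
clearIfStillShowing("Session interrupted", in: slot)
print(slot.text) // still "Session resumed"

// The current message's timer fires: now the label clears.
clearIfStillShowing("Session resumed", in: slot)
print(slot.text.isEmpty) // true
```

Without that guard, a “Session interrupted” timer firing late could erase a “Session resumed” message that replaced it.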
Now, place the following line in renderer(_:didAdd:for:) just before the #endif of the #if DEBUG preprocessor directive:

self.debugPlanes.append(debugPlaneNode)

This adds the horizontal plane that was just added to the scene to the debugPlanes array.
Note that in runSession(), the session executes with a given configuration:

sceneView?.session.run(configuration)
Replace the line above with the code below:

sceneView?.session.run(configuration,
  options: [.resetTracking, .removeExistingAnchors])

Here you run the ARSession associated with your sceneView by passing the configuration object and an array of ARSession.RunOptions, with the following run options:
- resetTracking: The session does not continue device position and motion tracking from the previous configuration.
- removeExistingAnchors: Any anchor objects associated with the session in its previous configuration are removed.
Run the app and try to detect a horizontal plane.
Now send the app to the background and then re-open the app. Notice that the previously rendered horizontal plane is removed from the scene and the app resets the label to display the correct instructions to the user.
Hit Testing
You are now ready to start placing objects on the detected horizontal planes. You will be using ARSCNView’s hit testing to detect touches from the user’s finger on the screen and see where they land in the virtual scene. A 2D point in the view’s coordinate space can refer to any point along a line segment in the 3D coordinate space. Hit testing is the process of finding objects in the world located along this line segment.
Open PortalViewController.swift and add the following variable:

var viewCenter: CGPoint {
  let viewBounds = view.bounds
  return CGPoint(x: viewBounds.width / 2.0, y: viewBounds.height / 2.0)
}

In the above block of code, you define the computed property viewCenter as the center of the PortalViewController’s view.
Now add the following method:
// 1
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
  // 2
  if let hit = sceneView?.hitTest(viewCenter, types: [.existingPlaneUsingExtent]).first {
    // 3
    sceneView?.session.add(anchor: ARAnchor(transform: hit.worldTransform))
  }
}
Here’s what’s happening:
1. ARSCNView has touches enabled. When the user taps on the view, touchesBegan(_:with:) is called with a set of UITouch objects and a UIEvent which defines the touch event. You override this touch handling method to add an ARAnchor to the sceneView.
2. You call hitTest(_:types:) on the sceneView object. The hitTest method takes two parameters: a CGPoint in the view’s coordinate system, in this case the screen’s center, and the type of ARHitTestResult to search for. Here you use the existingPlaneUsingExtent result type, which searches for points where the ray from the viewCenter intersects any detected horizontal planes in the scene, while considering the limited extent of those planes. The result of hitTest(_:types:) is an array of all hit test results, sorted from nearest to farthest. You pick the first plane that the ray intersects. You will get results from hitTest(_:types:) any time the screen’s center falls within a rendered horizontal plane.
3. You add an ARAnchor to the ARSession at the point where your object will be placed. The ARAnchor object is initialized with a transformation matrix that defines the anchor’s rotation, translation and scale in world coordinates.
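The transform you pass to ARAnchor(transform:) is a 4×4, column-major matrix; in ARKit it has the type simd_float4x4, and the anchor’s world position lives in the last column. The layout can be illustrated in plain Swift with a nested float array standing in for simd_float4x4:

```swift
import Foundation

// A 4x4 identity matrix stored column-major, the same layout as
// simd_float4x4 (the type of hit.worldTransform). Here a nested Float
// array stands in for the simd type so the example needs no ARKit.
var columns: [[Float]] = [
    [1, 0, 0, 0],  // column 0: x-axis basis vector
    [0, 1, 0, 0],  // column 1: y-axis basis vector
    [0, 0, 1, 0],  // column 2: z-axis basis vector
    [0, 0, 0, 1],  // column 3: translation (x, y, z, 1)
]

// Place the "anchor" half a meter in front of the world origin
// (ARKit's camera looks down -z when the session starts).
columns[3] = [0, 0, -0.5, 1]

// Extract the world position, analogous to reading
// hit.worldTransform.columns.3 in real ARKit code.
let position = (x: columns[3][0], y: columns[3][1], z: columns[3][2])
print(position) // (x: 0.0, y: 0.0, z: -0.5)
```

Since the hit test already gives you a complete worldTransform on the plane, you pass it through unchanged; there’s no need to build a matrix by hand in the app.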
The ARSCNView receives a callback in the delegate method renderer(_:didAdd:for:) after the anchor is added. This is where you handle rendering your portal.
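As a preview of that callback, a minimal handler might place a placeholder node at each new anchor. This is only a sketch: the book renders the portal here instead of a box, and the guard against ARPlaneAnchor is an assumption so the debug-plane rendering from the previous part is left alone:

```swift
import ARKit
import SceneKit

// Sketch of the delegate callback that fires after session.add(anchor:).
func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode, for anchor: ARAnchor) {
  // Skip plane anchors; those are handled by the debug-plane code.
  guard !(anchor is ARPlaneAnchor) else { return }

  // A 10 cm placeholder box standing in for the portal content.
  let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
  let boxNode = SCNNode(geometry: box)

  // SceneKit positions `node` at the anchor's transform for you,
  // so the content can simply be attached as a child.
  node.addChildNode(boxNode)
}
```

Because ARKit keeps the node aligned with its anchor as tracking improves, anything you attach here stays fixed to the tapped point on the plane.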