AR Face Tracking Tutorial for iOS: Getting Started
In this tutorial, you’ll learn how to use AR Face Tracking to track your face using a TrueDepth camera, overlay emoji on your tracked face, and manipulate the emoji based on facial expressions you make. By Yono Mittlefehldt.
Picture this. You have just eaten the most amazing Korean BBQ you’ve ever had and it’s time to take a selfie to commemorate the occasion. You whip out your iPhone, make your best duck-face and snap what you hope will be a selfie worthy of this meal. The pic comes out good — but it’s missing something. If only you could put an emoji over your eyes to really show how much you loved the BBQ. Too bad there isn’t an app that does something similar to this. An app that utilizes AR Face Tracking would be awesome.
Good news! You get to write an app that does that!
In this tutorial, you’ll learn how to:
- Use AR Face Tracking to track your face using a TrueDepth camera.
- Overlay emoji on your tracked face.
- Manipulate the emoji based on facial expressions you make.
Are you ready? Then pucker up those lips and fire up Xcode, because here you go!
Getting Started
For this tutorial, you’ll need an iPhone with a front-facing TrueDepth camera. At the time of writing, this means an iPhone X, but who knows what the future may bring?
You may have already downloaded the materials for this tutorial using the Download Materials link at the top or bottom of this tutorial and noticed there is no starter project. That’s not a mistake. You’re going to be writing this app — Emoji Bling — from scratch!
Launch Xcode and create a new project based on the Single View App template and name it Emoji Bling.
The first thing you should do is to give the default ViewController a better name. Select ViewController.swift in the Project navigator on the left.

In the code that appears in the Standard editor, right-click on the name of the class, ViewController, and select Refactor ▸ Rename from the context menu that pops up.

Change the name of the class to EmojiBlingViewController and press Return or click the blue Rename button.
Since you’re already poking around EmojiBlingViewController.swift, go ahead and add the following import to the top:
import ARKit
You are, after all, making an augmented reality app, right?
Next, in Main.storyboard, with the top level View in the Emoji Bling View Controller selected, change the class to ARSCNView.
ARSCNView is a special view for displaying augmented reality experiences using SceneKit content. It can show the camera feed and display SCNNodes.

After changing the top level view to be an ARSCNView, you want to create an IBOutlet for the view in your EmojiBlingViewController class.
To do this, bring up the Assistant editor by clicking on the button with the interlocking rings.
This should automatically bring up the contents of EmojiBlingViewController.swift in the Assistant editor. If not, you can Option-click on it in the Project navigator to display it there.
Now, Control-drag from the ARSCNView in the storyboard to just below the EmojiBlingViewController class definition in EmojiBlingViewController.swift and name the outlet sceneView.
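Just to sanity check, after the Control-drag the top of your class should look something like the sketch below; depending on your Xcode version, the generated outlet may also carry the weak keyword.

import ARKit
import UIKit

class EmojiBlingViewController: UIViewController {
  // Connected to the ARSCNView in Main.storyboard by the Control-drag above.
  @IBOutlet var sceneView: ARSCNView!

  // ... the rest of the class (viewDidLoad and friends) continues below ...
}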
Before you can build and run, a little bit of code is needed to display the camera feed and start tracking your face.
In EmojiBlingViewController.swift, add the following functions to the EmojiBlingViewController class:
override func viewWillAppear(_ animated: Bool) {
  super.viewWillAppear(animated)

  // 1
  let configuration = ARFaceTrackingConfiguration()

  // 2
  sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
  super.viewWillDisappear(animated)

  // 1
  sceneView.session.pause()
}
Right before the view appears, you:

- Create a configuration to track a face.
- Run the face tracking configuration using the built-in ARSession property of your ARSCNView.

Before the view disappears, you make sure to:

- Pause the AR session.
There is a teensy, tiny problem with this code so far. ARFaceTrackingConfiguration is only available for phones with a front-facing TrueDepth camera. You need to make sure you check for this before doing anything.
In the same file, add the following to the end of the viewDidLoad() function, which should already be present:
guard ARFaceTrackingConfiguration.isSupported else {
  fatalError("Face tracking is not supported on this device")
}
With this in place, you check to make sure that the device supports face tracking (i.e., has a front-facing TrueDepth camera); otherwise, the app stops. This isn’t a graceful way to handle the problem, but since this app only does face tracking, anything else would be pointless!
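If you ever want something gentler than fatalError(), one possible sketch, not part of Emoji Bling and purely illustrative, is a hypothetical helper that shows an alert instead; you’d call it from viewDidAppear(_:) so the view is already in the window hierarchy when the alert is presented:

import ARKit
import UIKit

extension EmojiBlingViewController {
  // Hypothetical helper: warn the user instead of crashing the app.
  func warnIfFaceTrackingUnsupported() {
    guard !ARFaceTrackingConfiguration.isSupported else { return }

    let alert = UIAlertController(
      title: "Unsupported Device",
      message: "Face tracking requires a front-facing TrueDepth camera.",
      preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(title: "OK", style: .default))
    present(alert, animated: true)
  }
}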
Before you run your app, you also need to specify a reason for needing permission to use the camera in the Info.plist.
Select Info.plist in the Project navigator and add an entry with a key of Privacy - Camera Usage Description. It should default to type String. For the value, type EmojiBling needs access to your camera in order to track your face.
FINALLY. It’s time to build and run this puppy… er… app… appuppy?
When you do so, you should see your beautiful, smiling face staring right back at you.
OK, enough duck-facing around. You’ve got more work to do!
Face Anchors and Geometries
You’ve already seen ARFaceTrackingConfiguration, which is used to configure the device to track your face using the TrueDepth camera. Cool.
But what else do you need to know about face tracking?
Three very important classes you’ll soon make use of are ARFaceAnchor, ARFaceGeometry and ARSCNFaceGeometry.
ARFaceAnchor inherits from ARAnchor. If you’ve done anything with ARKit before, you know that ARAnchors are what make it so powerful and simple. They are positions in the real world tracked by ARKit, which do not move when you move your phone. ARFaceAnchors additionally include information about a face, such as topology and expression.
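To make the expression part concrete, here’s a minimal sketch, not the tutorial’s code, of reading a face anchor’s blend shapes inside an ARSCNViewDelegate callback; it assumes you’ve already set your view controller as the sceneView’s delegate:

import ARKit
import SceneKit

extension EmojiBlingViewController: ARSCNViewDelegate {
  // Called whenever ARKit updates an anchor, including face anchors.
  func renderer(
    _ renderer: SCNSceneRenderer,
    didUpdate node: SCNNode,
    for anchor: ARAnchor
  ) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // Blend shapes describe the current expression as 0.0...1.0 coefficients.
    if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue, jawOpen > 0.5 {
      print("Mouth is open")
    }
  }
}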
ARFaceGeometry is pretty much what it sounds like. It’s a 3D description of a face including vertices and textureCoordinates.
ARSCNFaceGeometry uses the data from an ARFaceGeometry to create a SCNGeometry, which can be used to create SceneKit nodes — basically, what you see on the screen.
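To see roughly how those last two pieces fit together, here’s a sketch; again, this is not the tutorial’s final code, and it assumes sceneView is rendering with Metal so its device property is non-nil:

import ARKit
import SceneKit

extension EmojiBlingViewController {
  // Hypothetical helper: build a SceneKit node whose geometry mirrors the tracked face.
  func makeFaceNode() -> SCNNode? {
    // ARSCNFaceGeometry needs the Metal device the view renders with.
    guard let device = sceneView.device,
          let faceGeometry = ARSCNFaceGeometry(device: device) else {
      return nil
    }
    return SCNNode(geometry: faceGeometry)
  }

  // Hypothetical helper: keep that geometry in sync with the latest face data.
  func refresh(_ faceNode: SCNNode, from faceAnchor: ARFaceAnchor) {
    guard let faceGeometry = faceNode.geometry as? ARSCNFaceGeometry else { return }
    faceGeometry.update(from: faceAnchor.geometry)
  }
}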
OK, enough of that. Time to use some of these classes. Back to coding!