Face Detection Tutorial Using the Vision Framework for iOS
In this tutorial, you’ll learn how to use Vision for face detection of facial features and overlay the results on the camera feed in real time. By Yono Mittlefehldt.
Using Detected Faces
Face detection is something you’ve probably been seeing more of recently. It can be especially useful for image processing, when you want to really make the people in the images shine.
But you’re going to do something way cooler than that. You’re going to shoot lasers out of your eyes!
Time to get started.
While still in FaceDetectionViewController.swift, right below updateFaceView(for:), add the following method:
// 1
func updateLaserView(for result: VNFaceObservation) {
  // 2
  laserView.clear()

  // 3
  let yaw = result.yaw ?? 0.0

  // 4
  if yaw == 0.0 {
    return
  }

  // 5
  var origins: [CGPoint] = []

  // 6
  if let point = result.landmarks?.leftPupil?.normalizedPoints.first {
    let origin = landmark(point: point, to: result.boundingBox)
    origins.append(origin)
  }

  // 7
  if let point = result.landmarks?.rightPupil?.normalizedPoints.first {
    let origin = landmark(point: point, to: result.boundingBox)
    origins.append(origin)
  }
}
Whew! That was quite a bit of code. Here’s what you did with it:
1. Define a new method that will update the LaserView. It's a bit like updateFaceView(for:).
2. Clear the LaserView.
3. Get the yaw from the result. The yaw is a number that tells you how much your face is turned. If it's negative, you're looking to the left. If it's positive, you're looking to the right.
4. Return if the yaw is 0.0. If you're looking straight forward, no face lasers. 😞
5. Create an array to store the origin points of the lasers.
6. Add a laser origin based on the left pupil.
7. Add a laser origin based on the right pupil.
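The landmark(point:to:) helper comes from earlier in the tutorial: it maps a pupil point, which Vision reports normalized to the face's bounding box, into view coordinates. As a rough, self-contained sketch of that idea — landmarkSketch and viewSize are placeholder names, and the project's real helper also goes through the capture preview's coordinate conversion:

```swift
import CoreGraphics

// Sketch: convert a landmark point (normalized to the face bounding box)
// into view coordinates. `viewSize` stands in for the preview view's size.
func landmarkSketch(point: CGPoint,
                    boundingBox: CGRect,
                    viewSize: CGSize) -> CGPoint {
  // Scale the normalized point into the bounding box, then into the view.
  let absoluteX = (boundingBox.origin.x + point.x * boundingBox.width) * viewSize.width
  let absoluteY = (boundingBox.origin.y + point.y * boundingBox.height) * viewSize.height
  return CGPoint(x: absoluteX, y: absoluteY)
}
```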
OK, you’re not quite done with that method, yet. You’ve determined the origin of the lasers. However, you still need to add logic to figure out where the lasers will be focused.
At the end of your newly created updateLaserView(for:), add the following code:
// 1
let avgY = origins.map { $0.y }.reduce(0.0, +) / CGFloat(origins.count)

// 2
let focusY = (avgY < midY) ? 0.75 * maxY : 0.25 * maxY

// 3
let focusX = (yaw.doubleValue < 0.0) ? -100.0 : maxX + 100.0

// 4
let focus = CGPoint(x: focusX, y: focusY)

// 5
for origin in origins {
  let laser = Laser(origin: origin, focus: focus)
  laserView.add(laser: laser)
}

// 6
DispatchQueue.main.async {
  self.laserView.setNeedsDisplay()
}
Here you:
1. Calculate the average y coordinate of the laser origins.
2. Determine what the y coordinate of the laser focus point will be based on the average y of the origins. If your pupils are above the middle of the screen, you'll shoot down. Otherwise, you'll shoot up. You calculated midY in viewDidLoad().
3. Calculate the x coordinate of the laser focus based on the yaw. If you're looking left, you should shoot lasers to the left.
4. Create a CGPoint from your two focus coordinates.
5. Generate some Lasers and add them to the LaserView.
6. Tell the iPhone that the LaserView should be redrawn.
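The Laser model and LaserView's add(laser:) and clear() come from the starter project. If you're curious what they boil down to, here's a minimal sketch — the starter project's actual definitions may differ, and LaserStore is a placeholder name for the view's bookkeeping:

```swift
import CoreGraphics

// Sketch: a Laser is just a line from an eye (origin) toward an
// off-screen focus point.
struct Laser {
  let origin: CGPoint
  let focus: CGPoint
}

// Sketch of the bookkeeping LaserView does: store lasers for draw(_:)
// to render, and let callers reset the list each frame.
final class LaserStore {
  private(set) var lasers: [Laser] = []

  func add(laser: Laser) {
    lasers.append(laser)
  }

  func clear() {
    lasers.removeAll()
  }
}
```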
Now you need to call this method from somewhere. detectedFace(request:error:) is the perfect place! In that method, replace the call to updateFaceView(for:) with the following:
if faceViewHidden {
  updateLaserView(for: result)
} else {
  updateFaceView(for: result)
}
This logic chooses which update method to call based on whether or not the FaceView is hidden.
Currently, if you were to build and run, you would only shoot invisible lasers out of your eyes. While that sounds pretty cool, wouldn't it be better to see the lasers?
To fix this, you need to tell the iPhone how to draw the lasers.
Open LaserView.swift and find the draw(_:) method. It should be completely empty. Now add the following code to it:
// 1
guard let context = UIGraphicsGetCurrentContext() else {
  return
}

// 2
context.saveGState()

// 3
for laser in lasers {
  // 4
  context.addLines(between: [laser.origin, laser.focus])
  context.setStrokeColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 0.5)
  context.setLineWidth(4.5)
  context.strokePath()

  // 5
  context.addLines(between: [laser.origin, laser.focus])
  context.setStrokeColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 0.8)
  context.setLineWidth(3.0)
  context.strokePath()
}

// 6
context.restoreGState()
With this drawing code, you:
1. Get the current graphics context.
2. Push the current graphics state onto the stack.
3. Loop through the lasers in the array.
4. Draw a thicker white line in the direction of the laser.
5. Then draw a slightly thinner red line over the white line to give it a cool laser effect.
6. Pop the graphics state off the stack to restore the context to its original state.
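The two-pass stroke in steps 4 and 5 is a general glow trick: a wider, translucent halo first, then the core color on top. Sketched as a reusable helper — drawGlowLine is a hypothetical name, not part of the project:

```swift
import CoreGraphics

// Hypothetical helper generalizing the two-pass stroke above: stroke a
// wider, half-transparent white halo first, then the core color over it.
func drawGlowLine(in context: CGContext,
                  from start: CGPoint,
                  to end: CGPoint,
                  coreRed: CGFloat, coreGreen: CGFloat, coreBlue: CGFloat,
                  coreWidth: CGFloat) {
  // Halo pass: 1.5x wider, translucent white.
  context.addLines(between: [start, end])
  context.setStrokeColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 0.5)
  context.setLineWidth(coreWidth * 1.5)
  context.strokePath()

  // Core pass: the laser color itself, slightly narrower.
  context.addLines(between: [start, end])
  context.setStrokeColor(red: coreRed, green: coreGreen, blue: coreBlue, alpha: 0.8)
  context.setLineWidth(coreWidth)
  context.strokePath()
}
```

With a helper like this, the loop in draw(_:) would shrink to a single call per laser.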
That's it. Build and run time!
Tap anywhere on the screen to switch to Lasers mode.
Great job!
Where to Go From Here?
You can do all sorts of things with your newfound knowledge. Imagine combining face detection with depth data from the camera to create cool effects focused around the people in your photos. To learn more about using depth data, check out this tutorial on working with image depth maps and this tutorial on working with video depth maps.
Or how about trying out a Vision and CoreML tag team? That sounds really cool, right? If that piques your interest, we have a tutorial for that!
You could learn how to do face tracking using ARKit with this awesome tutorial.
There are, of course, plenty of other Vision APIs you can play with. Now that you have a foundational knowledge of how to use them, you can explore them all!
We hope you enjoyed this tutorial and, if you have any questions or comments, please join the forum discussion below!