Apple Pencil Tutorial: Getting Started
In this Apple Pencil tutorial, you’ll learn about force, touch coalescing, altitude, and azimuth, to add realistic lines and shading to a drawing app. By Caroline Begbie.
Contents
- Prerequisites
- Getting Started
- Your First Drawing with Pencil
- Smoother Drawing
- Tilting the Pencil
- Altitude, Azimuth and Unit Vectors
- Draw With Shading
- Working With Texture
- Using Azimuth to Adjust Width
- Playing with Opacity
- Finger vs. Pencil
- Faking Force For a Finger
- Reducing Latency
- Housekeeping: Deleting Drawing Predictions
- Where To Go From Here?
Housekeeping: Deleting Drawing Predictions
There's a little housekeeping to do to ensure those predicted touches are properly disposed of at the end of a stroke, or when the user cancels the drawing.
Add a new UIImage property at the top of CanvasView to hold the proper drawn image, the one without predictions:
private var drawingImage: UIImage?
Next, find the following statement in touchesMoved(_:withEvent:):
image?.drawInRect(bounds)
And replace it with the following:
drawingImage?.drawInRect(bounds)
Here you're drawing drawingImage into the graphics context, rather than the image being displayed at that time by the canvas view. This will overwrite any predicted touches drawn by the previous move event.
Now, at the bottom of touchesMoved(_:withEvent:), just above these lines:
image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Add this new code:
// 1
drawingImage = UIGraphicsGetImageFromCurrentImageContext()
// 2
if let predictedTouches = event?.predictedTouchesForTouch(touch) {
  for touch in predictedTouches {
    drawStroke(context, touch: touch)
  }
}
Here's what's happening in there:
- You save the current contents of the graphics context, which include the new stroke you've just drawn but not yet any predicted strokes, into drawingImage.
- Just as you did with the coalesced touches, you get the array of predicted touches and draw a stroke for each one, as shown in the sketch below.
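To keep your bearings, here's a rough sketch of how touchesMoved(_:withEvent:) might look once all of these changes are in place. The context setup, the coalesced-touch handling and drawStroke(_:touch:) come from earlier in the tutorial, so your exact code may differ slightly:

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
  guard let touch = touches.first else { return }

  UIGraphicsBeginImageContextWithOptions(bounds.size, false, 0.0)
  let context = UIGraphicsGetCurrentContext()

  // Start from the true drawing, without last frame's predicted strokes
  drawingImage?.drawInRect(bounds)

  // Draw a stroke for every coalesced touch, falling back to the single touch
  let touchesToDraw = event?.coalescedTouchesForTouch(touch) ?? [touch]
  for touch in touchesToDraw {
    drawStroke(context, touch: touch)
  }

  // 1: Save the real drawing before any predicted strokes are added
  drawingImage = UIGraphicsGetImageFromCurrentImageContext()

  // 2: Draw the predicted touches on top, for display only
  if let predictedTouches = event?.predictedTouchesForTouch(touch) {
    for touch in predictedTouches {
      drawStroke(context, touch: touch)
    }
  }

  // The displayed image includes the predictions; drawingImage does not
  image = UIGraphicsGetImageFromCurrentImageContext()
  UIGraphicsEndImageContext()
}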
Now add these two methods:
override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
  image = drawingImage
}

override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
  image = drawingImage
}
These are called at the end of a stroke. By replacing the image with drawingImage when a touch ends or is cancelled, you're discarding all the predicted touches that were drawn onto the canvas.
One last thing: You'll need to clear both the canvas and the true drawing when you shake to clear.
In CanvasView.swift, in clearCanvas(animated:), locate this code inside the animation closure:
self.image = nil
Add this statement straight after that line:
self.drawingImage = nil
Now a little bit further in that same method, locate:
image = nil
and add this code after it:
drawingImage = nil
Here you clear both images of any drawing that you've done.
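For reference, the overall shape of clearCanvas(animated:) ends up something like the sketch below. The fade animation belongs to the starter project, so the exact closure layout and duration may differ; what matters is that image and drawingImage are both cleared on both code paths:

// Rough sketch only; the starter project's animation details may differ
func clearCanvas(animated animated: Bool) {
  if animated {
    UIView.animateWithDuration(0.5, animations: {
      self.alpha = 0
      }, completion: { _ in
        self.alpha = 1
        self.image = nil
        self.drawingImage = nil  // discard the true drawing as well
    })
  } else {
    image = nil
    drawingImage = nil
  }
}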
Build and run. Draw some squiggles and curves. You may notice that you're drawing all the touches that Apple predicted you might make, and consequently perceived latency is reduced. You may need to watch closely because it's very subtle. :]
Note: When you run the second code sample at the end of this tutorial, you'll be able to visualize what predicted touches actually do. You'll see an option to replace the texture with a blue color just for the predicted touches.
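If you'd like a rough feel for this before downloading that sample, one option is a debug-only helper that draws each predicted touch as a plain blue line segment instead of calling drawStroke(_:touch:). The helper below is purely hypothetical and isn't how the sample project does it:

// Hypothetical debug helper: render a predicted touch as a plain blue segment
// so the predictions stand out from the real strokes
private func drawDebugPredictedStroke(context: CGContext?, touch: UITouch) {
  let previousLocation = touch.previousLocationInView(self)
  let location = touch.locationInView(self)

  CGContextSetStrokeColorWithColor(context, UIColor.blueColor().CGColor)
  CGContextSetLineWidth(context, 2.0)
  CGContextMoveToPoint(context, previousLocation.x, previousLocation.y)
  CGContextAddLineToPoint(context, location.x, location.y)
  CGContextStrokePath(context)
}

Temporarily calling this instead of drawStroke(_:touch:) inside the predicted-touch loop shows you exactly where the predictions land; just remember to switch back afterwards.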
Apple's algorithm for predicted touches is astonishingly good. It's the little subtleties like this one that make it a pleasure to develop for Apple platforms.
Where To Go From Here?
Congratulations! You have now completed a simple drawing app where you can scribble and enjoy getting artsy with your Pencil. :] You can download the finished project to see the final result.
You did some pretty awesome things and learned about the following:
- Smoothing lines and shapes so they look natural
- Working with altitude and azimuth
- Implementing drawing and shading
- Adding and working with texture
- Adding an eraser
- Working with predicted touch data, and discarding it when it's no longer needed
I'm also providing a second project that has buttons to turn coalesced and predicted touches on or off, so that you can visualize their effects.
Apple's WWDC video on touches has a great section on how coalesced and predicted touches work with the 60 Hz frame rate. Watch it to see how latency has improved from 4 frames in iOS 8 to 1.5 frames in iOS 9.1. Pretty spectacular!
FlexMonkey (aka Simon Gladman) has done some really creative things with the Pencil that go well beyond just drawing with it. Take a look at his blog, especially the Pencil Synthesizer and FurrySketch.
I hope you enjoyed this Apple Pencil tutorial - I'd love to see as many apps as possible integrating this very cool device. If you have any questions or comments please join the forum discussion below!