Video Depth Maps Tutorial for iOS: Getting Started
In this iOS video depth maps tutorial, you’ll harness iOS 13’s video depth maps to apply real-time video filters and create a special-effects masterpiece! By Owen L Brown.
No Green Screen? No Problem!
That’s good and all, but maybe you don’t want to work on superhero movies. Perhaps you prefer science fiction instead.
No worries. This next filter will have you jumping for joy on the Moon! For that, you’ll need to create a makeshift green-screen effect.
Open DepthImageFilters.swift and add the following method to the class:
func greenScreen(
  image: CIImage,
  background: CIImage,
  mask: CIImage
) -> CIImage {
  // 1
  let crop = CIVector(
    x: 0,
    y: 0,
    z: image.extent.size.width,
    w: image.extent.size.height)
  // 2
  let croppedBG = background.applyingFilter("CICrop", parameters: [
    "inputRectangle": crop
  ])
  // 3
  let filtered = image.applyingFilter("CIBlendWithMask", parameters: [
    "inputBackgroundImage": croppedBG,
    "inputMaskImage": mask
  ])
  // 4
  return filtered
}
In this filter, you:

1. Create a 4D CIVector to define a cropping boundary equal to your input image.
2. Crop the background image to be the same size as the input image. This is important for the next step.
3. Combine the input and background images by blending them based on the mask parameter.
4. Finally, return the filtered image.
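If the mask semantics feel abstract, the per-pixel arithmetic behind CIBlendWithMask can be sketched in plain Swift. The helper below is purely illustrative (it’s not part of the project, and it models grayscale values in 0...1): white mask pixels keep the input image, black pixels reveal the background.

```swift
// Per-pixel sketch of what CIBlendWithMask computes for grayscale
// values in 0...1. This helper is illustrative, not part of the project.
func blendWithMask(foreground: Float, background: Float, mask: Float) -> Float {
  // A white mask pixel (1) keeps the foreground;
  // a black mask pixel (0) reveals the background.
  mask * foreground + (1 - mask) * background
}

let person: Float = 0.9  // a bright foreground pixel
let moon: Float = 0.2    // a darker background pixel
let shown = blendWithMask(foreground: person, background: moon, mask: 1)
// A white mask pixel keeps the person; a black one would show the Moon.
```

This is why cropping the background to match the input extent matters: the blend pairs pixels at the same coordinates in both images.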
Now you need to hook the mask and filter logic for this back up in DepthVideoViewController.swift, and you’ll be ready to go.
Find captureOutput(_:didOutput:from:) in DepthVideoViewController.swift and add the following case to the switch filter statement:
case (.filtered, .greenScreen, let mask?):
  previewImage = depthFilters.greenScreen(
    image: image,
    background: background,
    mask: mask)
Here you filter the input image with the background and the mask using the new function you just wrote.
Next, find depthDataOutput(_:didOutput:timestamp:connection:) and add the following case to the switch statement:
case .greenScreen:
  mask = depthFilters.createHighPassMask(
    for: depthMap,
    withFocus: sliderValue,
    andScale: scale,
    isSharp: true)
This code creates a high pass mask but makes the cutoff sharper, resulting in harder edges.
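The project’s createHighPassMask helper does its work on the GPU with Core Image, but the per-pixel idea can be sketched in plain Swift. The exact slope math below is an assumption for illustration; the point is that a steeper slope makes the black-to-white cutoff sharper.

```swift
// Illustrative high-pass depth mask: pixels at or beyond the focus depth
// ramp toward white (1). This is a sketch of the idea, not the project's
// actual Core Image implementation.
func highPassMaskValue(depth: Float, focus: Float, slope: Float) -> Float {
  // Clamp the ramp into the valid grayscale range 0...1.
  min(max((depth - focus) * slope, 0), 1)
}

let nearPixel = highPassMaskValue(depth: 0.2, focus: 0.5, slope: 12) // stays black
let farPixel = highPassMaskValue(depth: 0.8, focus: 0.5, slope: 12)  // clamps to white
```

With a steep slope, almost every pixel lands at pure black or pure white, which is exactly the hard-edged cutoff the isSharp flag asks for.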
Build and run the project. Move the slider around and see what objects you can put on the Moon.
Out of this world!
Dream-like Blur Effect
OK, OK! Maybe you don’t like the superhero or science fiction genres. I get it. You’re more of an art film type of person. If so, this next filter is right up your alley.
With this filter, you’re going to blur out anything besides objects at a narrowly defined distance from the camera. This can give a dream-like feeling to your films.
Go back to DepthImageFilters.swift and add a new method to the class:
func blur(image: CIImage, mask: CIImage) -> CIImage {
  // 1
  let blurRadius: CGFloat = 10
  // 2
  let crop = CIVector(
    x: 0,
    y: 0,
    z: image.extent.size.width,
    w: image.extent.size.height)
  // 3
  let invertedMask = mask.applyingFilter("CIColorInvert")
  // 4
  let blurred = image.applyingFilter("CIMaskedVariableBlur", parameters: [
    "inputMask": invertedMask,
    "inputRadius": blurRadius
  ])
  // 5
  let filtered = blurred.applyingFilter("CICrop", parameters: [
    "inputRectangle": crop
  ])
  // 6
  return filtered
}
This one is a bit more complicated, but here’s what you did:

1. First, define a blur radius to use. The bigger the radius, the bigger and slower the blur!
2. Once again, create a 4D CIVector to define a cropping region. This is because blurring effectively grows the image at the edges, and you just want the original size.
3. Then, invert the mask, because the blur filter you’re using blurs where the mask is white.
4. Next, apply the CIMaskedVariableBlur filter to the image, using the inverted mask and the blur radius as parameters.
5. Crop the blurred image to maintain the desired size.
6. Finally, return the filtered image.
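One way to picture CIMaskedVariableBlur: the brightness of each mask pixel controls how much of the maximum radius that pixel of the image receives. The linear mapping below is an assumption for illustration; what the tutorial relies on is simply that white mask areas get the full blur and black areas none.

```swift
// Sketch of the mask-to-radius relationship in a masked variable blur.
// White mask areas (1) get the full radius, black areas (0) none;
// the linear mapping in between is an assumption for illustration.
func effectiveBlurRadius(maskValue: Float, maxRadius: Float) -> Float {
  min(max(maskValue, 0), 1) * maxRadius
}

// After inverting the mask, in-focus subjects are black (radius 0)
// and everything else is white (full radius).
let subject = effectiveBlurRadius(maskValue: 0, maxRadius: 10)      // stays sharp
let surroundings = effectiveBlurRadius(maskValue: 1, maxRadius: 10) // fully blurred
```

This also explains step 3: without the CIColorInvert pass, it would be your in-focus subject that gets blurred instead of the surroundings.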
By now, you should know the drill. Open DepthVideoViewController.swift and add a new case to the switch statement inside captureOutput(_:didOutput:from:):
case (.filtered, .blur, let mask?):
  previewImage = depthFilters.blur(image: image, mask: mask)
This will create the blur filter when selected in the UI.
Now for the mask.
Replace the default case with the following case inside the switch statement in depthDataOutput(_:didOutput:timestamp:connection:):
case .blur:
  mask = depthFilters.createBandPassMask(
    for: depthMap,
    withFocus: sliderValue,
    andScale: scale)
Here you create a band pass mask for the blur filter to use.
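Conceptually, a band-pass mask is white only near the focus depth and falls off to black on both sides, which is what keeps just a narrow slice of the scene sharp. The falloff shape below is an assumption for illustration; the project’s helper builds the real mask with Core Image.

```swift
// Illustrative band-pass depth mask: 1 (white) at the focus depth,
// fading to black as depth moves away in either direction.
// The linear falloff is an assumption for illustration only.
func bandPassMaskValue(depth: Float, focus: Float, scale: Float) -> Float {
  min(max(1 - abs(depth - focus) * scale, 0), 1)
}

let inFocus = bandPassMaskValue(depth: 0.5, focus: 0.5, scale: 8) // white
let tooNear = bandPassMaskValue(depth: 0.1, focus: 0.5, scale: 8) // black
let tooFar = bandPassMaskValue(depth: 0.9, focus: 0.5, scale: 8)  // black
```

Moving the slider shifts the focus value, sliding this white band nearer to or farther from the camera.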
It’s time! Build and run this project. Try adjusting the sliders in the Mask and Filtered segments as well as changing the filters to see what effects you can create.
It’s so dreamy!
Where to Go From Here?
You’ve accomplished so much in this video depth maps tutorial. Give yourself a well-deserved pat on the back.
You can download the final project using the Download Materials button at the top or bottom of this tutorial.
With your new knowledge, you can take this project even further. For instance, the app displays the filtered video stream but doesn’t record it. Try adding a button and some logic to save your masterpieces.
You can also add more filters or even create your own filters! Check here for a complete list of CIFilter configurations that ship with iOS. Also, check out the Core Image video course, which will teach you all about Core Image filters.
I hope you enjoyed this video depth maps tutorial. If you have any questions or comments, please join the forum discussion below!