Core Image Tutorial: Getting Started

Learn the basics of cool image filtering effects with Core Image and Swift. By Ron Kliffer.

Getting Photos From the Photo Album

Now that you can change the values of the filter right away, things are getting interesting! But what if you don’t care for this image of flowers? Next, you’ll set up a UIImagePickerController to get pictures out of the photo album and into your app so you can play with them.

There’s already an IBAction connected to the camera button’s Touch Up Inside action. It’s called loadPhoto(). Add the following code to loadPhoto():

let picker = UIImagePickerController()
picker.delegate = self
present(picker, animated: true)

The first line of code instantiates a new UIImagePickerController. The next two lines set the image picker’s delegate to self (the ViewController) and then present the picker.

There’s a compiler error here. You need to declare that ViewController conforms to the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols.

Add the following extension to the bottom of ViewController.swift:

extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
  func imagePickerController(
    _ picker: UIImagePickerController,
    didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
  }
}

This delegate method receives the selected image along with some related information in the info dictionary. For more details on the various data in this dictionary, check out the documentation.
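
If you’re curious about what else arrives in that dictionary, here’s a small, purely illustrative sketch you could drop into the extension while experimenting. The key names are UIKit’s; the method itself isn’t part of the tutorial:

func logPickerInfo(_ info: [UIImagePickerController.InfoKey: Any]) {
  // The full-resolution image the user picked.
  if let original = info[.originalImage] as? UIImage {
    print("Original image size: \(original.size)")
  }
  // The media type, e.g. "public.image".
  if let mediaType = info[.mediaType] as? String {
    print("Media type: \(mediaType)")
  }
  // A file URL for photo-library picks (iOS 11 and later).
  if let imageURL = info[.imageURL] as? URL {
    print("Image URL: \(imageURL)")
  }
}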

Build and run the app, then tap the camera button. This brings up an image picker showing the photos in your photo album.

Showing an image picker

Selecting an image does nothing. You’re about to change that. Add the following code to imagePickerController(_:didFinishPickingMediaWithInfo:):

//1
guard let selectedImage = info[.originalImage] as? UIImage else { return }

//2
let ciImage = CIImage(image: selectedImage)
filter.setValue(ciImage, forKey: kCIInputImageKey)

//3
applySepiaFilter(intensity: slider.value)

//4
dismiss(animated: true)

Here’s a code breakdown. It:

  1. Retrieves the selected image using the UIImagePickerController.InfoKey.originalImage key.
  2. Applies the selected image to the filter.
  3. Calls applySepiaFilter(intensity:) using the current slider value for the intensity. This will update the image view.
  4. Dismisses the image picker once you’re done filtering the selected image.

Build and run. Now you can filter any image from your photo album.

Selecting an image

What About Image Metadata?

It’s time to talk about image metadata for a moment. Image files taken on mobile phones have various metadata associated with them, such as GPS coordinates, image format and orientation.

Orientation in particular is something you’ll need to preserve. Loading a UIImage into a CIImage, rendering to a CGImage and converting back to a UIImage strips the metadata from the image. To preserve orientation, you’ll need to record it and then pass it back into the UIImage.

Start by adding a new property to ViewController.swift:

var orientation = UIImage.Orientation.up

Next, add the following line to imagePickerController(_:didFinishPickingMediaWithInfo:) before calling applySepiaFilter(intensity:):

orientation = selectedImage.imageOrientation

This saves the selected image orientation to the property.

Finally, alter the line in applySepiaFilter(intensity:) that sets the image on the imageView object:

imageView.image = UIImage(cgImage: cgImage, scale: 1, orientation: orientation)

Now, if you select a picture taken in an orientation other than the default, the app preserves that orientation.
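
If you’d rather keep the render-plus-orientation logic in one place, here’s a minimal sketch of a helper you could extract. The helper name is an assumption; it relies on the same context and orientation properties you’ve already set up:

func renderedImage(from outputImage: CIImage) -> UIImage? {
  // Rendering through CIContext produces a CGImage with no orientation metadata,
  // so reapply the saved orientation when wrapping the result in a UIImage.
  guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else {
    return nil
  }
  return UIImage(cgImage: cgImage, scale: 1, orientation: orientation)
}

With that in place, each filter method could end with a single call to renderedImage(from:).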

What Other Filters Are Available?

One of CIFilter’s biggest strengths is the ability to chain filters. To see this in action, you’ll create a dedicated method that processes the CIImage and filters it to look like an old photo.

The CIFilter API has more than 160 filters on macOS, with most of them available on iOS as well. It’s also possible to create your own custom filters.

To see any available filters or attributes, check out the documentation.
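
You can also query Core Image at runtime. This short sketch uses CIFilter.filterNames(inCategory:) and the attributes dictionary, both part of the Core Image API, to see what’s available on your device:

// List every built-in filter available on the current platform.
let names = CIFilter.filterNames(inCategory: kCICategoryBuiltIn)
print("Built-in filters: \(names.count)")

// Print the attributes of one filter, including its input keys and value ranges.
if let sepia = CIFilter(name: "CISepiaTone") {
  for (key, value) in sepia.attributes {
    print("\(key): \(value)")
  }
}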

Creating Old Photo Filter

Add the following method to ViewController:

func applyOldPhotoFilter(intensity: Float) {
  // 1
  filter.setValue(intensity, forKey: kCIInputIntensityKey)

  // 2
  let random = CIFilter(name: "CIRandomGenerator")

  // 3
  let lighten = CIFilter(name: "CIColorControls")
  lighten?.setValue(random?.outputImage, forKey: kCIInputImageKey)
  lighten?.setValue(1 - intensity, forKey: kCIInputBrightnessKey)
  lighten?.setValue(0, forKey: kCIInputSaturationKey)

  // 4
  guard let ciImage = filter.value(forKey: kCIInputImageKey) as? CIImage else { return }
  let croppedImage = lighten?.outputImage?.cropped(to: ciImage.extent)

  // 5
  let composite = CIFilter(name: "CIHardLightBlendMode")
  composite?.setValue(filter.outputImage, forKey: kCIInputImageKey)
  composite?.setValue(croppedImage, forKey: kCIInputBackgroundImageKey)

  // 6
  let vignette = CIFilter(name: "CIVignette")
  vignette?.setValue(composite?.outputImage, forKey: kCIInputImageKey)
  vignette?.setValue(intensity * 2, forKey: kCIInputIntensityKey)
  vignette?.setValue(intensity * 30, forKey: kCIInputRadiusKey)

  // 7
  guard let outputImage = vignette?.outputImage else { return }
  guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else { return }
  imageView.image = UIImage(cgImage: cgImage, scale: 1, orientation: orientation)
}

Here’s what’s going on, section by section. The code:

  1. Sets the intensity in the sepia-tone filter you used earlier.
  2. Establishes a filter that creates a random noise pattern, shown below. It doesn’t take any parameters. You’ll use this noise pattern to add texture to your final “old photo” look.
    A random noise pattern
  3. Alters the output of the random noise generator. You want to change it to grayscale and brighten it a bit, so the effect is less dramatic. The input image key is set to the outputImage property of the random filter. This is a convenient way to pass the output of one filter as the input of the next.
  4. Uses cropped(to:), which takes an output CIImage and crops it to the provided rect. In this case, you crop the output of the CIRandomGenerator filter because it goes on forever; if you don’t crop it at some point, you get an error saying the filters have “an infinite extent”. Remember, CIImages don’t actually contain image data; they describe a “recipe” for creating it. It’s not until you call a method on the CIContext that the data is processed.
  5. Combines the output of the sepia filter with the cropped noise using CIHardLightBlendMode, which performs the same operation as the “Hard Light” blend mode in an Adobe Photoshop layer. Most, if not all, of the filter options in Photoshop are achievable using Core Image.
  6. Runs a vignette filter on the composited output to darken the photo’s edges. You use the intensity value to set the radius and intensity of this effect.
  7. Gets the output image and sets it to the image view.

Applying Old Photo Filter

That’s all for this filter chain. You now have an idea of how complex filter chains can become. By combining Core Image filters into chains like this, you can achieve an endless variety of effects.
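
As a point of comparison only, here’s a sketch of roughly the same chain written against the type-safe CIFilterBuiltins API available since iOS 13. It isn’t the approach this tutorial uses, but it shows how the string-based filter names map onto dedicated properties:

import CoreImage.CIFilterBuiltins

func oldPhotoImage(from inputImage: CIImage, intensity: Float) -> CIImage? {
  // Sepia tone, as in applySepiaFilter(intensity:).
  let sepia = CIFilter.sepiaTone()
  sepia.inputImage = inputImage
  sepia.intensity = intensity

  // Random noise, lightened and desaturated.
  let noise = CIFilter.randomGenerator()
  let lighten = CIFilter.colorControls()
  lighten.inputImage = noise.outputImage
  lighten.brightness = 1 - intensity
  lighten.saturation = 0

  // Blend the sepia image with the cropped noise.
  let composite = CIFilter.hardLightBlendMode()
  composite.inputImage = sepia.outputImage
  composite.backgroundImage = lighten.outputImage?.cropped(to: inputImage.extent)

  // Darken the edges.
  let vignette = CIFilter.vignette()
  vignette.inputImage = composite.outputImage
  vignette.intensity = intensity * 2
  vignette.radius = intensity * 30

  return vignette.outputImage
}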

If you want to see all this in action, replace all the calls to applySepiaFilter(intensity:) with calls to applyOldPhotoFilter(intensity:).
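
For example, if the slider’s action from earlier in the tutorial looks something like the following (the action name here is only illustrative), the change is a single line:

@IBAction func sliderValueChanged(_ sender: UISlider) {
  // Previously: applySepiaFilter(intensity: sender.value)
  applyOldPhotoFilter(intensity: sender.value)
}

Remember to make the same swap inside imagePickerController(_:didFinishPickingMediaWithInfo:), which also calls applySepiaFilter(intensity:).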

Build and run. You should get a more refined old-photo effect, complete with sepia, a little noise and some vignetting.

Old photo filter

This noise could be more subtle, but refining that is up to you, dear reader. Now, you have the full power of Core Image at your disposal. Go crazy!