Introduction


The previous lesson provided a comprehensive understanding of building custom image classification models using Create ML, covering the entire workflow from data preparation to training and evaluation. With this foundation, you’re now ready to take the next step in deploying your trained model in an iOS app. This transition allows your model to move from a theoretical construct to a functional component in a real-world application.

In this lesson, you’ll learn how to export your model from Create ML and integrate it into an iOS app using SwiftUI. This includes configuring your SwiftUI project to incorporate the model, optimizing the model’s performance for smooth and responsive user experiences, and evaluating its effectiveness in real-time scenarios. By embedding the model into an app, you create a tangible tool that users can interact with directly.
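As a preview of the integration step, the sketch below shows one common way to wrap a Create ML image classifier with the Vision framework so a SwiftUI view can request predictions. The model class name `EmotionClassifier` is an assumption for illustration; Xcode generates a class matching whatever name your exported `.mlmodel` file has once you add it to the project.

```swift
import CoreML
import Vision
import UIKit

// A minimal sketch, assuming Create ML exported a model named
// EmotionClassifier.mlmodel and Xcode generated the matching
// EmotionClassifier class (the name is hypothetical, not from the lesson).
final class EmotionClassifierService {
    private let vnModel: VNCoreMLModel

    init() throws {
        let config = MLModelConfiguration()
        let coreMLModel = try EmotionClassifier(configuration: config).model
        vnModel = try VNCoreMLModel(for: coreMLModel)
    }

    /// Classifies an image and calls back with the top label, e.g. "happy".
    func classify(_ image: UIImage, completion: @escaping (String?) -> Void) {
        guard let cgImage = image.cgImage else {
            completion(nil)
            return
        }
        let request = VNCoreMLRequest(model: vnModel) { request, _ in
            // Vision returns observations sorted by confidence; take the top one.
            let top = (request.results as? [VNClassificationObservation])?.first
            completion(top?.identifier)
        }
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        // Run inference off the main thread to keep the UI responsive.
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }
}
```

In a SwiftUI view, you would typically hold this service in an observable model object and dispatch the completion result back to the main actor before updating published state.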

The learning objectives for this lesson are to:

  • Integrate a trained custom model into a simple iOS app.
  • Evaluate the app’s performance and user experience with the custom model.

By the end of this lesson, you’ll have a fully functional app with a custom model integrated and ready to perform emotion-detection tasks.
