MoodTracker: An Emotion Detection App Using Machine Learning
In this demo, you’ll start working with MoodTracker, a SwiftUI app that uses machine learning to detect and classify emotions in photos the user uploads or takes with the camera. Powered by a custom image classification model created with Create ML, MoodTracker provides real-time feedback on detected emotions, such as happiness and sadness, along with confidence levels. In this first lesson, you’ll set the stage for the machine learning work by building the UI and learning how to get an image from the user.
Start by opening the starter project for this lesson. Build and run the project, and you’ll see a welcome screen. Press the Start Emotion Detection button, and it’ll take you to an empty screen. Your goal in this demo is to let the user provide the image you’ll evaluate, either by uploading an existing photo or by taking a new one.
Setting Up the UI
Open the ContentView. Notice how the NavigationLink navigates to an empty screen. You’ll create the EmotionDetectionView to be the destination for this NavigationLink. Create a new view inside the Views folder and name it EmotionDetectionView. Replace the current body with an empty VStack and give it a title.
VStack(spacing: 20) {
}
.navigationTitle("Emotion Detection")
Next, add a sheet modifier to present ImagePicker. It’s a helper struct built on UIKit that you can use to get a photo from the user, either by picking one from the photo library or by taking one with the camera. Now, add an actionSheet modifier to give the user the option to choose the source of the image.
.actionSheet(isPresented: $showSourceTypeActionSheet) {
  ActionSheet(title: Text("Select Image Source"), message: nil, buttons: [
    .default(Text("Camera")) {
      self.sourceType = .camera
      self.isShowingImagePicker = true
    },
    .default(Text("Photo Library")) {
      self.sourceType = .photoLibrary
      self.isShowingImagePicker = true
    },
    .cancel()
  ])
}
.sheet(isPresented: $isShowingImagePicker) {
  ImagePicker(image: self.$image, sourceType: self.$sourceType)
}
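The starter project already includes ImagePicker, so you don’t need to write it yourself. For context, a typical UIViewControllerRepresentable wrapper around UIImagePickerController looks roughly like this — a sketch of the general pattern, not the starter’s exact code:

```swift
import SwiftUI
import UIKit

// Sketch of a UIKit-backed image picker for SwiftUI.
// The starter ships its own version; details here are assumptions.
struct ImagePicker: UIViewControllerRepresentable {
  @Binding var image: UIImage?
  @Binding var sourceType: UIImagePickerController.SourceType

  func makeUIViewController(context: Context) -> UIImagePickerController {
    let picker = UIImagePickerController()
    picker.sourceType = sourceType
    picker.delegate = context.coordinator
    return picker
  }

  func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

  func makeCoordinator() -> Coordinator { Coordinator(self) }

  final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    let parent: ImagePicker
    init(_ parent: ImagePicker) { self.parent = parent }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
      // Hand the picked image back to SwiftUI through the binding.
      parent.image = info[.originalImage] as? UIImage
      picker.dismiss(animated: true)
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
      picker.dismiss(animated: true)
    }
  }
}
```

The coordinator acts as the UIKit delegate and writes the result back through the image binding, which is why the sheet call site only needs the two bindings.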
Make sure to add the properties that control presenting the ImagePicker and the action sheet, and set the default values for both to false. Also add the properties that hold the image and the sourceType used when picking an image.
@State private var isShowingImagePicker = false
@State private var showSourceTypeActionSheet = false
@State private var image: UIImage?
@State private var sourceType: UIImagePickerController.SourceType = .photoLibrary
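Note that ActionSheet and the actionSheet modifier are deprecated as of iOS 15. If your project targets newer systems, the same source selection can be written with confirmationDialog instead — a sketch using the same state properties from above:

```swift
.confirmationDialog("Select Image Source",
                    isPresented: $showSourceTypeActionSheet,
                    titleVisibility: .visible) {
  Button("Camera") {
    sourceType = .camera
    isShowingImagePicker = true
  }
  Button("Photo Library") {
    sourceType = .photoLibrary
    isShowingImagePicker = true
  }
  Button("Cancel", role: .cancel) {}
}
```

Either modifier works for this lesson; the rest of the demo sticks with actionSheet to match the starter project.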
Open the ContentView, then add the newly created EmotionDetectionView inside the NavigationLink.
struct ContentView: View {
  var body: some View {
    NavigationView {
      VStack {
        Text("Welcome to MoodTracker!")
          .font(.title)
          .padding()
        NavigationLink {
          EmotionDetectionView()
        } label: {
          Text("Start Emotion Detection")
            .font(.headline)
            .padding()
        }
      }
    }
  }
}
Build and run the app. Press the Start Emotion Detection button. You’ll notice that the only change is the new title. You’ll need to build out the views inside the VStack to get the image from the user and show them the chosen image.
Creating the Image Display View
Create a new view inside the Views folder and name it ImageDisplayView. This view will show a placeholder at first, then the chosen photo once the user picks one. Replace the body with a Group view that shows either the image the user chose (if there is one) or a placeholder image.
Group {
  if let image = image {
    // 1
    Image(uiImage: image)
      .resizable()
      .scaledToFit()
      .frame(maxWidth: .infinity, maxHeight: 300)
      .cornerRadius(10)
      .shadow(radius: 10)
      .onTapGesture {
        self.showSourceTypeActionSheet = true
      }
      .padding()
  } else {
    // 2
    VStack(spacing: 10) {
      Image(systemName: "photo.on.rectangle")
        .resizable()
        .scaledToFit()
        .frame(width: 100, height: 100)
        .foregroundColor(.gray)
      Text("Tap to Select an Image")
        .foregroundColor(.gray)
    }
    .frame(maxWidth: .infinity, maxHeight: 300)
    .background(Color.black.opacity(0.1))
    .cornerRadius(10)
    .shadow(radius: 10)
    .onTapGesture {
      self.showSourceTypeActionSheet = true
    }
    .padding()
  }
}
Next, add the needed properties for the chosen image and the action sheet flag. Before leaving this view, make sure you update the Preview with the correct initializer.
@Binding var image: UIImage?
@Binding var showSourceTypeActionSheet: Bool
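Since both properties are bindings, the preview needs constant values. Assuming the project uses the #Preview macro, the updated preview might look like this (a sketch; adapt it to the starter’s preview style):

```swift
#Preview {
  ImageDisplayView(image: .constant(nil),
                   showSourceTypeActionSheet: .constant(false))
}
```

Passing .constant(nil) for the image lets you preview the placeholder state without a real binding.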
Finally, open EmotionDetectionView and add this new view inside the VStack.
ImageDisplayView(image: $image, showSourceTypeActionSheet: $showSourceTypeActionSheet)
Build and run the app. Press the Start Emotion Detection button. Now you have a placeholder image. Tap the image, and the action sheet appears so you can choose the image source. Pick one of the images in the photo library and notice how the placeholder is replaced with the chosen image. You can choose another image by tapping the chosen image and repeating the same cycle. But what if the user doesn’t realize that? You’ll add a button under the image to give your user a more obvious way to select a new image. It’ll also help in the next lessons, as you’ll need more buttons for additional functionality.
Adding Action Buttons
Create another view inside the Views folder and name it ActionButtonsView. Add two properties for the chosen image and for the reset action that will happen when the user chooses to select a new image.
@Binding var image: UIImage?
var reset: () -> Void
Replace the body with a VStack that shows a reset button when there’s a chosen image. Make sure to handle the error in this view’s preview by passing constant values.
VStack(spacing: 10) {
  if image != nil {
    Button(action: reset) {
      Text("Select Another Image")
        .font(.headline)
        .padding()
        .frame(maxWidth: .infinity)
        .background(Color.red)
        .foregroundColor(.white)
        .cornerRadius(10)
    }
    .padding(.horizontal)
  }
}
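As with ImageDisplayView, the preview for this view needs placeholder values for its binding and closure. For example, assuming the #Preview macro:

```swift
#Preview {
  ActionButtonsView(image: .constant(nil)) {
    // The reset action is a no-op in the preview.
  }
}
```

The trailing closure stands in for the reset action, and .constant(nil) satisfies the image binding.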
Now open EmotionDetectionView and add the ActionButtonsView below the ImageDisplayView. This’ll show the reset button below the chosen image after the user makes a selection.
ActionButtonsView(image: $image, reset: reset)
Finally, add the reset method below the body, which will set the value of image back to nil.
func reset() {
  self.image = nil
}
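Putting the pieces together, your EmotionDetectionView should now look roughly like this — assembled from the steps above, so exact formatting may differ from your file:

```swift
struct EmotionDetectionView: View {
  // Controls presenting the picker sheet and the source action sheet.
  @State private var isShowingImagePicker = false
  @State private var showSourceTypeActionSheet = false
  // The chosen image and the source used to pick it.
  @State private var image: UIImage?
  @State private var sourceType: UIImagePickerController.SourceType = .photoLibrary

  var body: some View {
    VStack(spacing: 20) {
      ImageDisplayView(image: $image, showSourceTypeActionSheet: $showSourceTypeActionSheet)
      ActionButtonsView(image: $image, reset: reset)
    }
    .navigationTitle("Emotion Detection")
    .actionSheet(isPresented: $showSourceTypeActionSheet) {
      ActionSheet(title: Text("Select Image Source"), message: nil, buttons: [
        .default(Text("Camera")) {
          self.sourceType = .camera
          self.isShowingImagePicker = true
        },
        .default(Text("Photo Library")) {
          self.sourceType = .photoLibrary
          self.isShowingImagePicker = true
        },
        .cancel()
      ])
    }
    .sheet(isPresented: $isShowingImagePicker) {
      ImagePicker(image: self.$image, sourceType: self.$sourceType)
    }
  }

  func reset() {
    self.image = nil
  }
}
```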
Build and run the app. Follow the same steps as before and notice how the reset button appears below the chosen image. Press it, and the view resets to the placeholder image. You’ve done a great job: you can now get a photo from the user either from the camera or from their library. You’re ready to add the machine learning functionality to the MoodTracker app.