Sharing the Text
This last step requires no action from you. Aren’t those the best? The app is already integrated with UIActivityViewController. Look at shareDidTouch():
@IBAction func shareDidTouch(_ sender: UIBarButtonItem) {
  // Safely unwrap the scanned image instead of force-unwrapping,
  // which would crash if no image has been loaded yet
  guard let image = imageView.image else { return }
  let vc = UIActivityViewController(
    activityItems: [textView.text ?? "", image],
    applicationActivities: nil)
  present(vc, animated: true, completion: nil)
}
It’s a simple two-step process: create a UIActivityViewController that contains the scanned text and image, then call present() and let the user do the rest.
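One caveat worth knowing: on iPad, UIKit presents UIActivityViewController as a popover, and a popover needs an anchor or the app will crash at runtime. If you want the Extractor app to run on iPad as well, a small addition inside shareDidTouch() handles this (the `sender` name here is the bar button item parameter from the action above):

```swift
// On iPad, UIActivityViewController is shown as a popover and must be
// anchored to a view or bar button item; on iPhone this line is ignored.
vc.popoverPresentationController?.barButtonItem = sender
present(vc, animated: true, completion: nil)
```

On iPhone the popover configuration has no effect, so this is safe to include unconditionally.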
Where to Go From Here?
Congratulations! You are now an ML developer! You can get the completed version of Extractor using the Download Materials button at the top or bottom of this tutorial. But do note that, as mentioned at the beginning, you still have to add your own GoogleService-Info.plist after downloading the final project. You’ll also need to update the bundle ID to match what you configured in the Firebase console.
In this tutorial, you learned:
- The basics of ML Kit by building a text detection photo app.
- How to use the ML Kit text recognition API, and how to handle image scale and orientation.
And you did all this without having an ML Ph.D. :]
To learn more about Firebase and ML Kit, please check out the official documentation.
If you have any comments or questions about this tutorial, Firebase, ML Kit or the sample app, please join the forum discussion below!