iOS Accessibility in SwiftUI Tutorial Part 1: Getting Started
In this article, you’ll fix the accessibility of a SwiftUI master-detail app with various types of images that need more informative labels. By Audrey Tam.
Contents
- Getting Started
- Accessibility in SwiftUI
- Using the Accessibility Inspector
- SwiftUI: Accessibility by Default
- Emoji Need Context
- Performing Actions
- Unhelpful Image Names
- MapView Accessibility Information?
- Context Menu Accessibility?
- Accessibility Issues Identified
- Fixing Accessibility Labels
- Labeling Artwork Images
- Labeling System Images
- Labeling the MapView Annotation
- Labeling Emoji
- Using VoiceOver on a Device: Basic Navigation
- Running on Your Device
- Setting Up VoiceOver Shortcut
Labeling System Images
You’re still in the detail view. Highlight the map pin icon next to the artwork’s location:
Its Trait is Button, so tapping it tells the app to perform an action. Its label should tell the user what that action is.
In DetailView.swift, in the HStack, replace Image(systemName: "mappin.and.ellipse") with this code:
Image(systemName: "mappin.and.ellipse")
.accessibility(label: Text("Open Map"))
The name of an SF Symbols image describes what it looks like, which doesn’t convey any of your app’s context to your users. The Image(systemName:) initializer doesn’t have a label parameter, so you have to use the Accessibility API to specify one.
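To see where the modifier fits, here’s a minimal, self-contained sketch of a location row like the one in DetailView. The property names, the state variable and the tap action are assumptions for illustration, not the project’s exact code:
import SwiftUI

// Minimal sketch of a tappable location row with a labeled map-pin icon.
// The locationName property, showMap state and tap action are assumed for illustration.
struct LocationRow: View {
  let locationName: String
  @State private var showMap = false

  var body: some View {
    HStack {
      Image(systemName: "mappin.and.ellipse")
        // VoiceOver now reads "Open Map" instead of the symbol's name.
        .accessibility(label: Text("Open Map"))
      Text(locationName)
    }
    .onTapGesture { showMap = true }  // hypothetical action that presents the map
  }
}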
Build and run, then navigate to a detail view. Use the accessibility inspector and VoiceOver to confirm the system image has a new label:
So far, you’ve only had to add a small amount of code. For the next two fixes, you’ll need to do some refactoring.
Labeling the MapView Annotation
MapView is an MKMapView wrapped in a view that conforms to the UIViewRepresentable protocol. It’s accessible by default: you can pan and zoom the map and flick through the points of interest. But the map pin marking the annotation isn’t labeled. I adapted this view from Apple’s Landmarks sample project, which has only a coordinate property.
To fix this, you’ll need to modify MapView. You’ll pass it the Artwork, so it can set more properties of its annotation.
In MapView.swift, add this property:
let artwork: Artwork
And delete the coordinate property:
let coordinate: CLLocationCoordinate2D // Delete this line
Xcode now complains a lot, but that just helps you find everything you need to fix.
First, down in updateUIView(_:context:), replace the second line with this:
let region = MKCoordinateRegion(center: artwork.coordinate, span: span)
You’re replacing the coordinate property with the same value, artwork.coordinate.
There’s another use of coordinate, just after you create annotation. Replace annotation.coordinate = coordinate with:
annotation.coordinate = artwork.coordinate
annotation.title = artwork.title
annotation.subtitle = artwork.locationName
Here, in addition to replacing the deleted coordinate property, you’re setting the annotation’s title and subtitle. Now these will appear on the map, and VoiceOver will read them to the user.
Your last task is to change the two initializations of MapView to take an Artwork argument instead of a CLLocationCoordinate2D.
Down in MapView_Previews, replace the MapView initializer with this:
MapView(artwork: artData[5])
And finally, in LocationMap.swift, replace the MapView initializer with this:
MapView(artwork: artwork)
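For reference, after these changes MapView.swift looks roughly like this sketch. Only the artwork-related lines come from the steps above; the span value and the rest of the wrapper follow Apple’s Landmarks pattern and may differ slightly from the project’s source:
import SwiftUI
import MapKit

// Sketch of the refactored wrapper. Artwork is the app's model type.
struct MapView: UIViewRepresentable {
  let artwork: Artwork  // replaces the old coordinate property

  func makeUIView(context: Context) -> MKMapView {
    MKMapView(frame: .zero)
  }

  func updateUIView(_ uiView: MKMapView, context: Context) {
    let span = MKCoordinateSpan(latitudeDelta: 0.01, longitudeDelta: 0.01)  // assumed span
    let region = MKCoordinateRegion(center: artwork.coordinate, span: span)
    uiView.setRegion(region, animated: true)

    let annotation = MKPointAnnotation()
    annotation.coordinate = artwork.coordinate
    annotation.title = artwork.title            // VoiceOver reads the title...
    annotation.subtitle = artwork.locationName  // ...and the subtitle of the pin
    uiView.addAnnotation(annotation)
  }
}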
Build and run, then navigate to the first detail view and tap the map pin to show the map. In Accessibility Inspector, click the Play button to auto-navigate the VoiceOver simulator over the map. You’ll hear something similar to this:
Ala Wai Golf Course, Waikiki Beach, Prince Jonah Kuhio Kalanianaole, Honolulu Zoo, Kapiolani Regional Park, Prince Jonah Kuhio Kalanianaole, Kuhio Beach, Prince Jonah Kuhio Kalanianaole, Kuhio Beach, Legal, Link, Kuhio Beach, Done, Button.
Now try a small exercise: add more information to the Text element in the HStack by modifying it with an accessibility label. Tell your VoiceOver user this is a map centered at the artwork’s location. Click Reveal to see the solution.
[spoiler title="Solution"]
Add this modifier to the Text element:
.accessibility(label: Text("Map centered at " + artwork.locationName))
VoiceOver will read this accessibility label instead of just the location name.
[/spoiler]
Build and run, then navigate to a map view, and use the accessibility inspector and VoiceOver to see the new label:
The accessibility inspector’s VoiceOver simulator reads the labeled MKAnnotation, which comes after all the Points of Interest. At the time of writing this article, with Xcode 11.3 and iOS 13.3, VoiceOver also reads out Points of Interest on devices.
Labeling Emoji
Next, you’ll provide context for the reaction emoji. These appear in both the master list view and the detail view as artwork.reaction.rawValue. What you need is a way to translate each emoji into a word or phrase that expresses its meaning in this app.
Artwork.swift has an enumeration for the reaction emoji:
enum reactionEmoji: String {
case love = "💕"
case thoughtful = "🙏"
case wow = "🌟"
case none = ""
}
var reaction: reactionEmoji
To begin, in Artwork.swift, add this method to the enum:
func reactionWord() -> String {
switch self {
case .love: return "love it reaction: "
case .thoughtful: return "thoughtful reaction: "
case .wow: return "wow reaction: "
case .none: return ""
}
}
You’re specifying the meaning of the emoji in the context of this app. The colon makes VoiceOver pause between the reaction word and the artwork’s title.
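For example, combining a reaction word with a hypothetical artwork title produces the string VoiceOver will read:
let reaction: reactionEmoji = .love  // use Artwork.reactionEmoji if the enum is nested in Artwork
let spokenLabel = reaction.reactionWord() + "Makai"  // "Makai" is a hypothetical title
print(spokenLabel)  // prints "love it reaction: Makai"; the colon adds a brief pause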
Now, in ContentView.swift, add this modifier to the Text element in the List:
.accessibility(label: Text(artwork.reaction.reactionWord()
+ artwork.title))
And, in DetailView.swift, add the same modifier to the first Text element in the VStack:
.accessibility(label: Text(artwork.reaction.reactionWord()
+ artwork.title))
You’re providing an explicit accessibility label for the Text elements. VoiceOver will read this instead of the Text element’s content.
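If you want to see the override in isolation, here’s a minimal, self-contained example. The displayed emoji-plus-title string is an assumption about how the row composes its content, but the principle is the same: VoiceOver announces the label, not the displayed text:
import SwiftUI

// Minimal illustration: the accessibility label replaces the element's content for VoiceOver.
struct ReactionRow: View {
  var body: some View {
    Text("💕 Makai")  // what sighted users see (hypothetical title)
      .accessibility(label: Text("love it reaction: Makai"))  // what VoiceOver reads
  }
}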
Build and run, then use the accessibility inspector and VoiceOver to check your new labels:
Using VoiceOver on a Device: Basic Navigation
The accessibility inspector’s VoiceOver simulator is really convenient, but you should run your app on a device, to find out how it really behaves and sounds for a VoiceOver user.
Running on Your Device
In the PublicArt Project, adjust the iOS Deployment Target if your device isn’t on the latest version of iOS 13. Note that it must be some version of iOS 13.
In the PublicArt Target, change the Bundle Identifier organization, then click Signing & Capabilities. Check the checkbox to Automatically manage signing, and select a Team.
Setting Up VoiceOver Shortcut
On your device, open Settings▸Accessibility▸Accessibility Shortcut, and select VoiceOver. This enables you to switch VoiceOver on and off by triple-clicking the device’s side button.
Build and run the app on your device. Triple-click the side button to start VoiceOver. Tap the first item, and listen to VoiceOver say Wow reaction then mangle Prince Jonah’s name.