Top 10 WWDC 2018 Videos in Review
We assemble a list of the top 10 WWDC 2018 videos that cover everything you need to know, including Core ML, Siri Shortcuts, ARKit 2 and more! By Tim Mitra.
Contents
- 1) Platforms State of the Union — Session 102
- 2) What’s New in Cocoa Touch — Session 202
- 3) Introduction to Siri Shortcuts — Session 211
- 4) Introducing Create ML — Session 703
- 5) Swift Generics — Session 406
- 6) Advanced Debugging With Xcode and LLDB — Session 412
- 7) Getting the Most Out of Playgrounds in Xcode — Session 402
- 8) Building Faster in Xcode — Session 408
- 9) High-Performance Auto Layout — Session 220
- 10) Embracing Algorithms — Session 223
- Where to Go From Here?
3) Introduction to Siri Shortcuts — Session 211
[Video Link]
“The potential of Siri Shortcuts is virtually unlimited. Implemented correctly, it’s a paradigm shift in how iOS devices will be used and how we’ll think about making our apps.” — Ish ShaBazz, independent iOS Developer
Ari Weinstein, the creator of the award-winning Workflow app, presented Siri Shortcuts, which bears the fruit of Apple’s acquisition of Workflow. SiriKit now lets you expose the capabilities of your app to Siri, and the approach is pretty straightforward: You design the intent or shortcut, donate it to the OS and handle the intent when Siri successfully calls back into your app. Shortcuts can be informational or a call to your app’s workflow. You can also make use of an NSUserActivity type by simply setting its isEligibleForPrediction property to true before donating the activity.
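A minimal sketch of the NSUserActivity route might look like this, assuming a hypothetical soup-ordering screen; the activity type string and user info keys are placeholders you’d replace with your own:

```swift
import UIKit

class OrderSoupViewController: UIViewController {
  override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    // Placeholder activity type; also list it under NSUserActivityTypes in Info.plist.
    let activity = NSUserActivity(activityType: "com.example.soup-chef.order")
    activity.title = "Order clam chowder"
    activity.userInfo = ["soup": "clam-chowder"]

    // Opting in to prediction is what lets Siri suggest this action later.
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Order my usual soup"

    // Attaching the activity to the responder chain makes it current,
    // which donates it to the system.
    userActivity = activity
  }
}
```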
In the sample app, Soup Chef, Apple demonstrates how you would categorize the shortcut, and then add in some parameters such as string, number, person or location. Once donated to Siri, you can trigger the shortcut by speaking the phrase you provide. Siri can also run your shortcut independently of your app, making a suggested action at a certain time or place based on repeated user actions. If your app supports media types, Siri can access and start playing your content directly.
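Donating an intent-based shortcut looks roughly like this. Soup Chef uses a custom intent generated from an intent definition file; the sketch below swaps in the system INSendMessageIntent purely so it compiles on its own, but the donation call is the same:

```swift
import Intents

func donateShortcut() {
  // Soup Chef would use its generated OrderSoupIntent here; INSendMessageIntent
  // stands in so the example is self-contained.
  let intent = INSendMessageIntent(recipients: nil,
                                   content: "Soup's on!",
                                   speakableGroupName: nil,
                                   conversationIdentifier: "soup-chat",
                                   serviceName: nil,
                                   sender: nil)
  intent.suggestedInvocationPhrase = "Tell everyone soup is ready"

  // Donating the interaction is what teaches Siri when to suggest the shortcut.
  let interaction = INInteraction(intent: intent, response: nil)
  interaction.donate { error in
    if let error = error {
      print("Donation failed: \(error.localizedDescription)")
    }
  }
}
```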
4) Introducing Create ML — Session 703
“Create ML is amazing. I can’t wait to see iOS devs doing fantastic things using machine learning.” — Sanket Firodiya, Lead iOS Engineer at Superhuman Labs, Inc.
Machine learning continues to be a hot topic these days, and Apple has made it easy to add this technology to your apps. With Core ML 2, you can treat machine learning as simply calling a library from code: You only need to drop a Core ML model into your project and let Xcode sort everything else out.
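As a rough sketch of what “just calling it from code” looks like, here’s a classification request run through Vision; FlowerClassifier is a hypothetical stand-in for whatever class Xcode generates from the .mlmodel you drop in:

```swift
import CoreGraphics
import CoreML
import Vision

// FlowerClassifier is a placeholder for the model class Xcode generates from your .mlmodel.
func classify(_ image: CGImage) throws {
  let visionModel = try VNCoreMLModel(for: FlowerClassifier().model)

  let request = VNCoreMLRequest(model: visionModel) { request, _ in
    guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
    print("Prediction: \(best.identifier), confidence: \(best.confidence)")
  }

  // Vision scales and converts the image to match the model's expected input.
  try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```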
Building on Core ML 2’s demystification of neural networks, Apple gives you Create ML. It only takes a few lines of code to use. You create and train your model in Swift, right on your Mac. Create ML can work with image identification, text analysis and even with tabular data wherein multiple features can make solid predictions. You can even augment the training with Apple’s ready-made models via transfer learning, reducing the training time from hours to minutes. This also shrinks the models from hundreds of megabytes down to a mere handful. In another session, “Introduction to Core ML 2 Part One,” Apple expounds on weight quantization to further reduce the size without losing quality.
In the workflow for Create ML, you define your problem, collect some categorized sample data and train your model right inside a Playground file, using a LiveView trainer. Drag and drop your training data into the view. Once trained, you save your new model. You can also drop in some data to test the accuracy of the predictions. When you’re happy with the model you’ve made, export it. Finally, drag your new model into your project. You can train models on macOS Mojave in Swift and in the command line REPL.
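Here’s a minimal sketch of both routes in a macOS playground; the folder paths and model name are placeholders:

```swift
import CreateML
import CreateMLUI
import Foundation

// Option 1: the drag-and-drop trainer. Open the assistant editor's live view
// and drop folders of labeled images into it.
let builder = MLImageClassifierBuilder()
builder.showInLiveView()

// Option 2: train programmatically. The paths below are placeholders for folders
// whose subdirectory names are the labels.
let trainingDir = URL(fileURLWithPath: "/path/to/TrainingData")
let testingDir = URL(fileURLWithPath: "/path/to/TestingData")

let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Evaluate against images the model hasn't seen.
let evaluation = classifier.evaluation(on: .labeledDirectories(at: testingDir))
print("Classification error: \(evaluation.classificationError)")

// Export the model, then drag the .mlmodel file into your Xcode project.
try classifier.write(to: URL(fileURLWithPath: "/path/to/SoupClassifier.mlmodel"))
```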
5) Swift Generics — Session 406
This session takes a focused look at Swift generics. Previous sessions have covered generics in part, but this one is a deeper dive into the specifics. Swift and its generics have evolved over the years and are now poised for ABI stability in Swift 5.0, which is coming soon. Generics have been refined over time, and Swift 4.2 marks a significant point: The language has recently gained conditional conformance and recursive protocol constraints.
The session covers why generics are needed, and it builds up the Swift generic system from scratch. Untyped storage is challenging and error-prone because of constant casting. With generics, the compiler knows exactly what type a container holds, which also opens up optimization opportunities. Using a generic type gives Swift parametric polymorphism, another name for generics.
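A small sketch of the difference; the Buffer type here is just an illustration of a generic container:

```swift
// With untyped storage, every read needs a cast that can fail at runtime.
let untyped: [Any] = [1, 2, 3]
let maybeInt = untyped[0] as? Int   // Optional(1); the cast could fail

// A generic container declares exactly what it holds, so there's no casting,
// and the compiler can specialize the code for Int.
struct Buffer<Element> {
  private var storage: [Element] = []

  mutating func append(_ element: Element) {
    storage.append(element)
  }

  subscript(index: Int) -> Element {
    return storage[index]
  }
}

var numbers = Buffer<Int>()
numbers.append(1)
numbers.append(2)
let first = numbers[0]   // Int, guaranteed by the type system
```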
Designing a protocol is a good way to examine generics in Swift, and the talk covers how to unify concrete types with a generic type. An associated type is a placeholder for the concrete type that each conforming type supplies. From there, the talk explores some of the powerful opportunities generics open up.
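For example, a minimal protocol with an associated type lets one generic Stack type replace a whole family of concrete, per-element stacks:

```swift
// Element is a placeholder the conforming type fills in with a concrete type.
protocol Stackable {
  associatedtype Element
  mutating func push(_ element: Element)
  mutating func pop() -> Element?
}

// A concrete conformance: Element is inferred to be Int.
struct IntStack: Stackable {
  private var items: [Int] = []
  mutating func push(_ element: Int) { items.append(element) }
  mutating func pop() -> Int? { return items.popLast() }
}

// A single generic type can stand in for IntStack, StringStack and so on.
struct Stack<Element>: Stackable {
  private var items: [Element] = []
  mutating func push(_ element: Element) { items.append(element) }
  mutating func pop() -> Element? { return items.popLast() }
}
```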
The second part of the talk covers conditional conformance and protocol inheritance, as well as using classes with generics. The presenters extend a collection protocol to add capabilities. Conditional conformance adds composability: A generic type can conform to a protocol whenever the types it wraps conform as well.
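A tiny example of conditional conformance; Pair is a made-up type for illustration:

```swift
struct Pair<Value> {
  let first: Value
  let second: Value
}

// Pair is Equatable only when the values it wraps are Equatable themselves.
extension Pair: Equatable where Value: Equatable {
  static func == (lhs: Pair, rhs: Pair) -> Bool {
    return lhs.first == rhs.first && lhs.second == rhs.second
  }
}

let match = Pair(first: 1, second: 2) == Pair(first: 1, second: 2)   // true
// Pair<() -> Void> still compiles; it just isn't Equatable.
```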
Swift also supports object-oriented programming. Any instance of a subclass should be able to substitute for its parent and execution should carry on correctly, a rule known as the Liskov Substitution Principle. A protocol conformance on a class should likewise be available to its subclasses, capturing the capabilities shared across those types.
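A short sketch of how that plays out with classes, protocols and a generic constraint:

```swift
protocol Drawable {
  func draw()
}

class Shape: Drawable {
  func draw() { print("drawing a shape") }
}

class Circle: Shape {
  override func draw() { print("drawing a circle") }
}

// The constraint names Shape, yet any subclass substitutes cleanly (Liskov),
// and Circle inherits Shape's Drawable conformance for free.
func render<T: Shape>(_ shape: T) {
  shape.draw()
}

render(Circle())                  // "drawing a circle"
let drawable: Drawable = Circle() // the subclass also satisfies the protocol
drawable.draw()
```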
6) Advanced Debugging With Xcode and LLDB — Session 412
“Debugging is what we developers do when we’re not writing bugs.” — Tim Mitra, Software Developer, TD Bank
Chris Miles describes how the Xcode team has smoothed out many bugs that made Swift debugging difficult; radars filed by fellow developers have exposed the edge cases for the team to fix. In a live debugging session, Miles shows advanced uses of breakpoints. Using the expression command in an edited breakpoint, you can change a value to test your code without having to recompile and rerun it.
You can also add a forgotten line of code at a breakpoint by double-clicking the breakpoint and opening its editor. For example, if you forgot to set a delegate, you can enter the code that sets it and test the fix right away. You can even test a function call inside a framework when you don’t know the values passed in, because you’re working at the assembly level. The debugger provides pseudo registers so you can examine the arguments: In an Objective-C message send, the first argument is the receiver, the second is the selector and the rest are the arguments passed in. Generally, you can use the po command in the console to print a debug description and see the current values; a little bit of typecasting can help. Miles further demonstrates how to cut through repeated calls by judiciously setting properties during the run.
Another advanced trick involves manipulating the thread of execution, to be used with caution, since you can change the state of your app. p is another LLDB command that prints a debug representation of the current object. While paused, the variables view lets you view and filter properties to find the items to inspect, and you can set a watchpoint contextually on a property. Watchpoints are like breakpoints, but they pause the debugger when a value changes.
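Pulled together, a console session using those commands might look like this; the variable and property names are placeholders:

```
# Patch in a forgotten line of code while paused at a breakpoint:
(lldb) expression self.delegate = self

# Print an object's debug description, or a debug representation of a value:
(lldb) po self.view.subviews
(lldb) p soupCount

# Stopped inside a framework call, inspect the pseudo registers; for an
# Objective-C message send, $arg1 is the receiver and $arg2 the selector:
(lldb) po $arg1
(lldb) po (SEL)$arg2

# Pause whenever a property's backing storage changes:
(lldb) watchpoint set variable _soupCount
```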
“We use our debugger to debug our debuggers.” — Chris Miles, Software Engineering Manager, Apple, Inc.
During the session, a macOS app’s views are also debugged, this time by inspecting elements in the View Debugger and using the same tricks to print out the values of views and constraints. Using the View Debugger’s inspector, you can find elements, see their current values and determine whether they are set up by their parents or superviews. You can also sort out whether an element in the view supports a dark variant for Dark Mode or even for Accessibility. The session also covers Auto Layout debugging, debug descriptions and the super-handy Command/Control-click-through for accessing items layered behind others.