Siri Is Cooking for WWDC 2024

Apple has given Siri a huge shot of intelligence with the introduction of two key components: the App Intents framework and Apple Intelligence. This powerful combination transforms Siri into a deeply integrated, context-aware assistant capable of tapping into the data models and functionality of your favorite apps. By Gina De La Rosa.


For years, Siri felt more like a halfhearted attempt at a virtual assistant than a truly helpful AI companion. Plagued by struggles to understand context and integrate with third-party apps, Apple's iconic assistant seemed likely to be left behind as rivals like Alexa and Google Assistant advanced at a rapid pace.

That all changes with iOS 18, iPadOS 18, and macOS Sequoia. Apple has given Siri a huge shot of intelligence with the introduction of two key components: the App Intents framework and Apple Intelligence. This powerful combination transforms Siri from a parlor trick into a deeply integrated, context-aware assistant capable of tapping into the data models and functionality of your favorite apps.

At the heart of this reinvention is the App Intents framework, an API that allows developers to define “assistant schemas” — models that describe specific app actions and data types. By building with these schemas, apps can express their capabilities in a language that Apple’s latest AI models can deeply comprehend.

Note: If you'd like to learn about App Intents with Shortcuts, see the article titled Creating Shortcuts with App Intents.

App Intents are just the entry point. The true magic comes from Apple Intelligence, a brand-new system announced at this year's WWDC that infuses advanced generative AI directly into Apple's core operating systems. Combining App Intents with this new AI engine gives Siri the ability to intelligently operate on apps' structured data models, understand natural language in context, make intelligent suggestions, and even generate content — all while protecting users' privacy.

To illustrate the potential, this article explores how this could play out in the kitchen by imagining a hypothetical cooking app called Chef Cooks. This app adopts several of Apple’s new assistant schemas.

Data Modeling With App Entities

Before Siri can understand the cooking domain, the cooking app must define its data entities so Apple Intelligence can comprehend them. This is done by annotating custom structs with the @AssistantEntity schema macro:

@AssistantEntity(schema: .cookbook.recipe)
struct RecipeEntity: IndexedEntity {
  let id: String
  let recipe: Recipe

  @Property(title: "Name") 
  var name: String 
    
  @Property(title: "Description") 
  var description: String? 

  @Property(title: "Cuisine") 
  var cuisine: CuisineType? 
  var ingredients: [IngredientEntity] 
  var instructions: [InstructionEntity] 

  var displayRepresentation: DisplayRepresentation { 
    DisplayRepresentation(title: name, 
      subtitle: cuisine?.displayRepresentation) 
  } 
} 

@AssistantEntity(schema: .cookbook.ingredient)
struct IngredientEntity: ObjectEntity {
  let id = UUID()
  let ingredient: Ingredient

  @Property(title: "Ingredient")
  var name: String

  @Property(title: "Amount")
  var amount: String?

  var displayRepresentation: DisplayRepresentation {
    DisplayRepresentation(title: name, subtitle: amount)
  }
}

Adopting the .cookbook.recipe and .cookbook.ingredient schemas ensures the app's recipe and ingredient data models adhere to the specifications that Apple Intelligence expects for the cooking domain. Note the use of the @Property property wrapper to define titles for key attributes. With the data groundwork laid, the app can start defining specific app intents that operate on this data using the @AssistantIntent macro.
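
Because RecipeEntity conforms to IndexedEntity, the app can also donate its recipes to the Core Spotlight index so Siri and Spotlight can surface them in search. Here's a minimal sketch; note that allRecipes() and the RecipeEntity(recipe:) convenience initializer are hypothetical helpers, not part of the schemas above:

import AppIntents
import CoreSpotlight

// Donate the app's recipes to the system index so Siri and Spotlight
// can reason over them. allRecipes() and RecipeEntity(recipe:) are
// hypothetical helpers on the app's store and entity types.
func donateRecipes(from store: RecipeStore) async throws {
  let entities = try await store.allRecipes().map(RecipeEntity.init(recipe:))
  try await CSSearchableIndex.default().indexAppEntities(entities)
}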

Finding Recipes

One of the core experiences in a cooking app is searching for recipes. The cooking app can enable this for Siri using the .cookbook.findRecipes schema.

@AssistantIntent(schema: .cookbook.findRecipes)
struct FindRecipesIntent: FindIntent {
  @Property(title: "Search Query")
  var searchQuery: String?
 
  @Dependency
  var recipeStore: RecipeStore

  @MainActor
  func perform() async throws -> some IntentResult & ReturnsValue<[RecipeEntity]> {
    let results = try await recipeStore.findRecipes(matching: searchQuery)
    return .result(value: results)
  }
}

This intent accepts a searchQuery parameter and uses the app’s RecipeStore to find matching recipes from the database. Siri could then integrate this app functionality in a variety of intelligent ways. For example:

“Hey Siri, find vegetarian recipes in the Chef Cooks app.”

Siri displays a list of matching vegetarian recipes.

Crucially, Siri can understand the domain context and even make suggestions without the user explicitly naming the app.
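
For the @Dependency properties in these intents to resolve, the app must register its dependencies with the App Intents runtime before any intent runs, typically at launch. A minimal sketch, assuming RecipeStore is a plain class the app already owns:

import AppIntents
import SwiftUI

@main
struct ChefCooksApp: App {
  init() {
    // Register the shared store so @Dependency can resolve it
    // when Siri invokes FindRecipesIntent.
    AppDependencyManager.shared.add(dependency: RecipeStore())
  }

  var body: some Scene {
    WindowGroup { ContentView() }
  }
}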

Viewing Recipe Details

With the ability to find recipes, users will likely want to view the full details of a particular dish. The cooking app can support this by adopting the .cookbook.openRecipe schema:

@AssistantIntent(schema: .cookbook.openRecipe)
struct OpenRecipeIntent: OpenIntent {
  var target: RecipeEntity

  @Dependency
  var navigation: NavigationManager

  @MainActor
  func perform() async throws -> some IntentResult {
    navigation.openRecipe(target.recipe)
    return .result()
  }
}

This intent simply accepts a RecipeEntity and instructs the app's NavigationManager to open the corresponding full recipe detail view. It enables experiences like:

“Hey Siri, show me the recipe for chicken Parmesan.”

  • App opens to the chicken Parmesan recipe.

Or, when the user sees an appetizing photo of a Margherita pizza in Siri suggestions and says:

"Open that recipe in Chef Cooks."

  • App launches directly to the pizza recipe.
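
What might that NavigationManager look like? One option is an observable coordinator that the app's root view watches, so an intent running in the app's process can drive navigation directly. This is just a sketch under that assumption; the Recipe path element and openRecipe(_:) naming simply mirror the intent above:

import Observation
import SwiftUI

// A hypothetical navigation coordinator observed by the app's root view.
// OpenRecipeIntent calls openRecipe(_:) on the main actor, and SwiftUI
// responds by pushing the recipe detail view onto the stack.
@Observable
final class NavigationManager {
  var path: [Recipe] = []

  @MainActor
  func openRecipe(_ recipe: Recipe) {
    path = [recipe]
  }
}

As with RecipeStore, the app would register this manager through AppDependencyManager at launch so the intent's @Dependency property can find it.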

But where Apple Intelligence and App Intents really shine is in more advanced intelligent experiences …

Intelligent Meal Planning

By modeling its data using assistant schemas, Chef Cooks can tap into Apple Intelligence’s powerful language model to enable seamless, multi-part queries:

“Hey Siri, I want to make chicken enchiladas for dinner this week.”

Rather than just searching for and opening a chicken enchilada recipe, Siri understands the full context of this request. It first searches Chef Cooks’s data for a suitable enchilada recipe, then:

  1. Checks whether all ingredients are in stock, based on a semantic understanding of the user's kitchen inventory.
  2. Adds any missing ingredients to a grocery list.
  3. Adds the recipe to a new meal plan for the upcoming week.
  4. Provides a time estimate for prepping and cooking the meal.

All of this happens without leaving the conversational Siri interface, thanks to the app adopting additional schemas like .shoppingList.addItems and .mealPlanner.createPlan. App Intents open the door to incredibly intelligent, multifaceted app experiences in which Siri acts as a true collaborative assistant, understanding your intent and orchestrating multiple actions across various data models.
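
To make that flow concrete, here's what adopting one of those schemas might look like. Keep in mind this is a sketch: the .mealPlanner.createPlan schema comes from the scenario above, and MealPlanStore is a hypothetical dependency mirroring RecipeStore:

import AppIntents
import Foundation

@AssistantIntent(schema: .mealPlanner.createPlan)
struct CreateMealPlanIntent: AppIntent {
  @Property(title: "Recipes")
  var recipes: [RecipeEntity]

  @Property(title: "Start Date")
  var startDate: Date

  @Dependency
  var mealPlanStore: MealPlanStore

  @MainActor
  func perform() async throws -> some IntentResult {
    // Persist a plan for the week beginning at startDate.
    try await mealPlanStore.createPlan(recipes: recipes.map(\.recipe),
                                       startingOn: startDate)
    return .result()
  }
}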

Interactive Widgets With WidgetKit

Of course, not every interaction must happen by voice. Chef Cooks can also use its App Intents implementation to power intelligent, interactive widgets through WidgetKit.

Note: If you’re unfamiliar with WidgetKit, take a look at the Interactive Widgets with SwiftUI article.

One example of using interactive widgets is integrating Chef Cooks' .cookbook.findRecipes intent with the Safari Web Widget to provide a focused recipe search experience without leaving the browser:

struct RecipeSearchEntry: TimelineEntry {
  let date = Date()
  var searchQuery = ""

  // Wraps the app's FindRecipesIntent so the widget can launch the
  // app directly into the matching search results.
  @OpenInAppIntent(schema: .cookbook.findRecipes)
  var findRecipesIntent: FindRecipesIntent? {
    FindRecipesIntent(searchQuery: searchQuery)
  }
}

This widget entry combines the @OpenInAppIntent property wrapper with Chef Cooks' FindRecipesIntent implementation to allow users to enter a search query and instantly view filtered recipe results, all within the Web Widget UI. Chef Cooks could even construct more advanced WidgetKit experiences by combining multiple intents into rich, interactive widgets that drive custom flows: planning a meal by first finding recipes and then adding ingredients to a grocery list, or showing complementary recipes and instruction videos based on past cooking sessions.
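
A widget view could then hand that intent to the system when the user taps it. Here's a rough sketch, using SwiftUI's Button(intent:) initializer for interactive widgets and keeping the layout minimal:

import SwiftUI
import WidgetKit

// A minimal view for the RecipeSearchEntry above. Tapping the button
// launches Chef Cooks straight into the matching search results.
struct RecipeSearchWidgetView: View {
  var entry: RecipeSearchEntry

  var body: some View {
    if let intent = entry.findRecipesIntent {
      Button(intent: intent) {
        Label("Find Recipes", systemImage: "magnifyingglass")
      }
    } else {
      Text("Search recipes in Chef Cooks")
    }
  }
}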

With App Intents providing the structured data modeling, WidgetKit can transform these intelligent interactions into immersive, ambient experiences across Apple’s platforms.