Instruction

Looking at Chatterbots and Chatbots

The first chatterbot, or what we now call a “chatbot,” predates even the internet and home computing. Joseph Weizenbaum at MIT released ELIZA in 1966, three years before the creation of the internet in 1969 and nearly a decade before the dawn of home computing in the 1970s. ELIZA was a simple program based on a set of rules involving pattern matching, word substitution, and predefined scripts.

While True, Do Chat

You may have created a simple version of a chatbot early in your software career. A common beginner command-line program asks the user to pick from a set of choices, with the prompt nested inside a while loop that waits for input. A session with such a program might look like this:

What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Something else
4
I'm sorry, I don't know that input.
  
What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Marketing
3
No one is here. All is dark. You are likely to be eaten by a grue.
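
As a rough sketch, a Swift command-line version of this kind of menu loop might look like the following. The menu text and canned responses here are illustrative rather than taken from any particular sample project.

// A tiny menu-driven "chatbot": print a prompt, read a line, respond, repeat.
let menu = """
What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Something else
"""

while true {
    print(menu)
    guard let choice = readLine() else { break } // end of input ends the loop

    switch choice {
    case "1":
        print("Connecting you to Sales...")
    case "2":
        print("Connecting you to Technical Support...")
    case "3":
        print("Please describe your issue.")
    default:
        print("I'm sorry, I don't know that input.")
    }
}

Every reply this program can give has to be written out by hand ahead of time, and anything outside the expected inputs falls through to the same canned apology. That limitation is exactly what the next generation of conversational systems removes.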

Then Came ChatGPT

While ELIZA and simple command-line programs laid the groundwork for conversational systems, ChatGPT represents a significant leap in how machines can understand and generate human language. ChatGPT does this in three different ways: generating responses from context, learning from new datasets, and holding open-ended conversations.

Generating Responses from Context

Unlike ELIZA, which relied on rigid scripts and pattern matching, ChatGPT uses deep learning models trained on vast amounts of text data. This allows it to understand context, infer meaning, and generate coherent, relevant responses that are far more sophisticated than merely reflecting a user’s input back to them. ChatGPT doesn’t just look for keywords; it processes the entire conversation to maintain context, making its responses feel more natural and human-like.

Learning from New Datasets

While ELIZA required manual updates to its scripts to change its behavior, and simple command-line programs are static unless reprogrammed, ChatGPT continuously improves by learning from extensive datasets during its training phase. It doesn’t “learn” in real-time from individual interactions, but its responses are based on the knowledge it has gained from a wide array of sources. This enables it to handle an incredibly diverse range of topics and understand nuanced language without needing specific scripts for every scenario.

Open-Ended Conversations

The command-line example shows a program with a limited set of choices and responses. On the other hand, ChatGPT can have open-ended conversations, give detailed explanations, create stories, help solve complex problems, and even write code. Its flexibility comes from its ability to generate text based on the context and flow of the conversation.

Using ChatGPT in an iOS App

You can also easily integrate ChatGPT into your own iOS apps. As you learned earlier, ChatGPT operates via RESTful APIs, making it easy to add a ChatGPT client to your app, as the sketch below shows. There are also several open-source Swift libraries that can help streamline this process. Two of the most popular are:
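
Whether you use a library or call the API yourself, the underlying request is an authenticated HTTPS POST. The sketch below calls OpenAI's chat completions endpoint directly with URLSession; the model name and the API key placeholder are illustrative, so check OpenAI's current documentation for the exact values your account supports.

import Foundation

// Rough sketch of a direct call to the chat completions REST endpoint.
// The API key here is a placeholder; in a real app, keep the key on a
// server you control rather than embedding it in the binary.
func askChatGPT(_ prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // The body carries the model name and the conversation so far.
    let body: [String: Any] = [
        "model": "gpt-4o-mini",  // illustrative model name
        "messages": [
            ["role": "user", "content": prompt]
        ]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)
    return String(data: data, encoding: .utf8) ?? ""
}

In practice, you'd decode the JSON response into a Codable model and pull out the assistant's message, but the round trip itself is just one authenticated POST, which is why wrapping it in a small client, or leaning on one of the open-source libraries, is so straightforward.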
