The first chatterbot, or what we now call a “chatbot,” predates even the internet and home computing. Joseph Weizenbaum at MIT released ELIZA in 1966, three years before ARPANET, the network that grew into the internet, came online in 1969, and nearly a decade before the dawn of home computing in the 1970s. ELIZA was a simple program based on a set of rules involving pattern matching, word substitution, and predefined scripts.
The most famous script was DOCTOR, in which ELIZA emulated a Rogerian psychotherapist by reflecting a user’s statements back to them.
In this case, ELIZA would first identify a keyword in the user’s input and then use a pre-programmed script to prompt the user for more details. The script would apply transformations, such as changing “I” to “you,” to form a response. Other keywords, such as “don’t know,” would trigger different scripts, leading to other prompts and transformations.
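You can get a feel for this style of rule with just a few lines of Swift. This is a toy illustration, not Weizenbaum’s original script: the keyword check and the substitution table here are invented for the example.

```swift
import Foundation

// A toy, ELIZA-style responder. One keyword rule triggers a canned
// follow-up prompt; otherwise, a transformation rule reflects the
// statement back by swapping first-person words for second-person ones.
let substitutions = ["i": "you", "my": "your", "am": "are", "me": "you"]

func reply(to input: String) -> String {
    let words = input.lowercased().split(separator: " ").map(String.init)
    // Keyword rule: certain words select a pre-programmed prompt.
    if words.contains("mother") || words.contains("father") {
        return "Tell me more about your family."
    }
    // Transformation rule: change "I" to "you", "my" to "your", and so on.
    let reflected = words.map { substitutions[$0] ?? $0 }.joined(separator: " ")
    return "Why do you say \(reflected)?"
}
```

Given “I am sad,” this reflects back “Why do you say you are sad?” — a surprisingly convincing trick for such a simple mechanism.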
You may have created a simple version of a chatbot early in your software career. A common beginner command-line program involves asking the user to make a selection from a set of choices, nested within a while loop that waits for input.
For example, you may have written a program like this:
Program output:
What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Something else
User input:
4
Program output:
I'm sorry, I don't know that input.
What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Something else
User input:
3
Program output:
No one is here. All is dark. You are likely to be eaten by a grue.
Setting aside the last line (it’s from Zork, a text-based adventure game that’s similar to a chatbot!), this type of command-line program represents a basic form of a chatbot.
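The transcript above could come from a loop like the following sketch. The menu text matches the example; the individual response strings are invented here for illustration.

```swift
import Foundation

// A beginner-style menu chatbot: a while loop that reads input and
// matches it against a fixed set of choices.
let menu = """
What do you need help with today?
Enter one of these numbers:
1. Sales
2. Technical Support
3. Something else
"""

func respond(to choice: String) -> String {
    switch choice {
    case "1": return "Connecting you to Sales."
    case "2": return "Connecting you to Technical Support."
    case "3": return "No one is here. All is dark. You are likely to be eaten by a grue."
    default: return "I'm sorry, I don't know that input."
    }
}

var done = false
while !done {
    print(menu)
    guard let input = readLine() else { break }
    print(respond(to: input))
    done = (input == "3")
}
```

Every possible exchange is hard-coded: the program can only ever say one of four things, which is exactly the limitation ChatGPT removes.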
Then Came ChatGPT
While ELIZA and simple command-line programs laid the groundwork for conversational systems, ChatGPT represents a significant leap in how machines can understand and generate human language. ChatGPT does this in three different ways: generating responses from context, learning from new datasets, and holding open-ended conversations.
Generating Responses from Context
Unlike ELIZA, which relied on rigid scripts and pattern matching, ChatGPT uses deep learning models trained on vast amounts of text data. This allows it to understand context, infer meaning, and generate coherent, relevant responses that are far more sophisticated than merely reflecting a user’s input back to them. ChatGPT doesn’t just look for keywords; it processes the entire conversation to maintain context, making its responses feel more natural and human-like.
Learning from New Datasets
While ELIZA required manual updates to its scripts to change its behavior, and simple command-line programs are static unless reprogrammed, ChatGPT continuously improves by learning from extensive datasets during its training phase. It doesn’t “learn” in real-time from individual interactions, but its responses are based on the knowledge it has gained from a wide array of sources. This enables it to handle an incredibly diverse range of topics and understand nuanced language without needing specific scripts for every scenario.
Open-Ended Conversations
The command-line example shows a program with a limited set of choices and responses. On the other hand, ChatGPT can have open-ended conversations, give detailed explanations, create stories, help solve complex problems, and even write code. Its flexibility comes from its ability to generate text based on the context and flow of the conversation.
Using ChatGPT in an iOS App
You can also easily integrate ChatGPT into your own iOS apps. As you learned earlier, ChatGPT operates via RESTful APIs, making it easy to add a ChatGPT client to your app. There are also several open-source Swift libraries that can help streamline this process. Two of the most popular are:
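Even without a third-party library, the underlying call is a plain HTTPS request you can make with URLSession. The sketch below assumes OpenAI’s chat completions endpoint and model name; the `ChatMessage`, `ChatRequest`, and `ChatResponse` types are illustrative models written for this example, not part of any SDK.

```swift
import Foundation

// Request and response models mirroring the JSON shapes of the
// chat completions endpoint (illustrative, minimal fields only).
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable { let message: ChatMessage }
    let choices: [Choice]
}

// Sends one user prompt and returns the assistant's reply.
func sendChat(_ prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body = ChatRequest(model: "gpt-4o-mini",
                           messages: [ChatMessage(role: "user", content: prompt)])
    request.httpBody = try JSONEncoder().encode(body)

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

In a production app, you’d also check the HTTP status code, handle rate limits, and keep the API key on a server you control rather than embedding it in the app bundle.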