OpenAI API Chatbot for ChatGPT
Go on an adventure into AI! In this tutorial, you’ll use the OpenAI ChatGPT API to build a chatbot in Android and create your own experience with this user-friendly machine learning technology. By Arjuna Sky Kok.
Contents
- Getting Started
- Opening the Starter Project
- Creating a Chat Messages List
- Displaying the Chat Response
- Talk to Me: Mimic the Back-and-Forth
- Artificial, for Real
- Creating Typing Animation
- Animate Text Output
- Creating OpenAI API Calls
- Messages Overview
- Engage the Bot! (You Won’t Be Assimilated)
- Puttin’ on the Persona
- Chat Completion Mode
- Limits to Your Adventure
- Adding Another Parameter
- Where to Go From Here?
Messages Overview
The messages value is a list of objects that have the role and content keys. If you send the message as a user or human, you set the role key to user and put the content of the message into the content key.
In the messages value, you also set the tone of the conversation by adding specific content with the system role. For example, if you want to tune ChatGPT to become a travel guide in Tokyo, this is the payload you need to send:
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "system", "content": "I want you to act as a travel guide in Tokyo."},
    {"role": "user", "content": "Hello AI!"}
  ]
}
ChatGPT will be optimized for a travel guide conversation. You don’t need to do anything about this because, in the init method of the ChatRetriever class, the messages variable, which keeps the value of the messages property in the payload, is initialized with an object with the system role if you’ve previously set the persona in the Persona screen.
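In other words, the init block does something along these lines (a sketch of the idea, not necessarily the exact code in the starter project):
// Sketch of the idea behind ChatRetriever's init block.
init {
  if (persona.isNotEmpty()) {
    // Seed the conversation with the persona as a system message.
    messages.add(Message(MessageType.SYSTEM.type, persona))
  }
}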
Besides the two keys, model and messages, there are other optional parameters you can add to the payload. For example, you can set the value of the temperature key, which accepts a value from 0 to 2. A higher value makes the output more random, whereas a lower value makes the output more deterministic.
Another useful parameter is max_tokens, which limits the maximum number of tokens a reply message generates. Tokens are pieces of words, roughly four characters in English. So if you want a short reply message from ChatGPT, set the max_tokens parameter to 50.
The n parameter is how many reply messages you want, but you only want one in your case.
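For example, a payload that uses these optional parameters could look like this (the values here are purely illustrative):
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "user", "content": "Hello AI!"}
  ],
  "temperature": 0.7,
  "max_tokens": 50,
  "n": 1
}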
You can learn more about other parameters such as suffix, top_p, stream, logprobs, echo, stop, presence_penalty, frequency_penalty, best_of, logit_bias and user in the OpenAI API documentation.
Enough theory. It’s time to do some coding!
Engage the Bot! (You Won’t Be Assimilated)
Open the ChatRetriever.kt file and find the line // TODO: Call the OpenAI chat completion API. Then, add the following code below it:
this.messages.clear()
if (this.persona.isNotEmpty()) {
  this.messages.add(Message(MessageType.SYSTEM.type, persona))
}
this.messages.add(Message("user", message))
val requestBody = ChatRequest("gpt-3.5-turbo", messages)
val call = service.getChatCompletions(requestBody, "Bearer $openAIAPIKey")
call.enqueue(callback)
You reinitialized the messages variable and added the user’s message to it. You then constructed an API call using the getChatCompletions method with the Retrofit library. Finally, you passed the callback object to the enqueue method; Chatty Bot executes this callback when the API call succeeds or fails.
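For context, the callback you pass in is a standard Retrofit Callback. The starter project already defines it in ChattingScreen.kt, but as a rough sketch (ChatResponse is an assumed name for the response data class in ChatModel.kt), it has this shape:
// Sketch only: the starter project already supplies this callback in ChattingScreen.kt.
// Uses retrofit2.Callback, retrofit2.Call and retrofit2.Response.
val callback = object : Callback<ChatResponse> {
  override fun onResponse(call: Call<ChatResponse>, response: Response<ChatResponse>) {
    // TODO: Add reply message from AI to the dialog boxes
  }

  override fun onFailure(call: Call<ChatResponse>, t: Throwable) {
    // Handle a failed request, for example by showing an error message.
  }
}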
Now, review the payload response: you’ll extract the reply message from the content property of the payload. According to the OpenAI API documentation, the JSON response looks like this:
{
  ...
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "\n\nHi there, Where do you want to spend your time in Tokyo?"
    },
    "finish_reason": "stop"
  }],
  ...
}
The data classes have been set up to handle the payload you send and receive, and they’re located in the ChatModel.kt file. These data classes mimic the JSON data structure.
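You won’t need to write them, but conceptually they map to the JSON like this (a sketch; the actual names and fields in ChatModel.kt may differ slightly):
// Conceptual sketch of the request/response models -- see ChatModel.kt for the real definitions.
data class Message(val role: String, val content: String)

data class ChatRequest(val model: String, val messages: List<Message>)

data class Choice(val index: Int, val message: Message, val finish_reason: String)

data class ChatResponse(val choices: List<Choice>)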
Open ChattingScreen.kt and navigate to the line // TODO: Add reply message from AI to the dialog boxes. Add the following code to handle the reply message from the API server:
val reply = response.body()?.choices!![0].message.content
historicalMessages.add(Message("assistant", reply))
addedString = reply
In the code, you retrieve the value from the content property of the message property within the choices array of reply messages. You select the first item because you didn’t previously set the n parameter, which defaults to 1.
Then, you store the reply message in the historicalMessages variable, allowing Chatty Bot to display the API server’s response in the chat window.
If you recall, you set the addedString variable to generate the typing animation.
Puttin’ on the Persona
Now, you must modify the button’s onClick method. Overwrite the code below the line // TODO: Add dialog boxes to the chatting window with the following code:
historicalMessages.add(Message("user", message))
retriever.retrieveChat(callback, message)
message = ""
This time, instead of the reply message, you send the user’s message to the server using the retrieveChat method of the ChatRetriever object. In the callback, you’ll add the reply message to both the historicalMessages and addedString variables.
Build and run the project once more. Select the Travel Guide in Tokyo persona:
Next, on the chat screen, type something like “Can you give me a list of recommended restaurants?” and then press the button to get the following message:
Look at that! You’re on your way to being a go-to travel guide. In no time, you’ll be saying, “I’m famous in Japan!” :-]
But before you get there, you might notice that a problem arises as you continue conversing with the bot: The chatbot sometimes forgets the topics you and it have previously discussed. For instance, if you say “More, please.” to receive additional restaurant recommendations, ChatGPT appears to be unable to comprehend:
ChatGPT forgot the previous conversation regarding recommended restaurants, indicating a need for more context retention. Naturally, you could always add the context, but that can be a hassle.
Chat Completion Mode
To address this issue, you must switch the API call to chat completion mode. Currently, you use the chat API service in text completion mode. To use chat completion mode, you must return the conversation history to the API server. You guessed it; the payload grows larger with each chat! Transmit the reply message within the messages property, but with the assistant role. The following is an example JSON payload:
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "system", "content": "I want you to act as a travel guide in Tokyo."},
    {"role": "user", "content": "Hello AI!"},
    {"role": "assistant", "content": "Hello! Welcome to Tokyo. I'd be happy to help you navigate this amazing city. What would you like to know?"},
    {"role": "user", "content": "I want to eat delicious food!"},
    {"role": "assistant", "content": "Great! Tokyo is a food paradise with a wide range of culinary delights. Here are a few..."},
    {"role": "user", "content": "What about sake? Any recommendation?"}
  ]
}
Therefore, you must send the reply messages received from the server back to the server along with the user messages. With this context, the chatbot will no longer forget previous conversations.
In the ChatRetriever class, the addReplyMessage method adds the reply message to the messages variable with the assistant role. When making the API call, you must execute this method within the success callback.
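The method is already written for you; conceptually, it’s as simple as this (a sketch rather than the exact starter-project code):
// Sketch: append the assistant's reply to the conversation history.
fun addReplyMessage(reply: String) {
  messages.add(Message("assistant", reply))
}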
Open the ChattingScreen.kt file and replace the code below the line // TODO: Add reply message from AI to the dialog boxes with the following code:
val reply = response.body()?.choices!![0].message.content
historicalMessages.add(Message("assistant", reply))
retriever.addReplyMessage(reply)
addedString = reply
Next, inside the retrieveChat method in ChatRetriever.kt, delete these lines:
this.messages.clear()
if (this.persona.isNotEmpty()) {
  this.messages.add(Message(MessageType.SYSTEM.type, persona))
}
That way, you keep the user’s previous messages in the payload.
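If you’re unsure what that deletion leaves behind, retrieveChat now roughly boils down to this (a sketch that assumes the rest of the method is unchanged, with ChatResponse as an assumed type name):
// Rough shape of retrieveChat after removing the clearing logic.
fun retrieveChat(callback: Callback<ChatResponse>, message: String) {
  // Append only the new user message; the existing history stays in place.
  this.messages.add(Message("user", message))
  val requestBody = ChatRequest("gpt-3.5-turbo", messages)
  val call = service.getChatCompletions(requestBody, "Bearer $openAIAPIKey")
  call.enqueue(callback)
}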
Finally, rebuild and run the project once more. This time, the chatbot retains the conversation’s context. Now, ask the same question:
As anticipated, ChatGPT provides a list of recommended restaurants in Tokyo. Then, you type “More, please.” and receive additional recommendations without issue: