Roles in ChatGPT

Here is the deal. This time, we will cover the topic of roles and fine-tuning our API calls, and next time we will have even more fun with some hidden gems.

What roles can we use when implementing ChatGPT APIs?

As you probably saw in our last publication, we sent the most common role: "user." Basically, these are all our requests and instructions; the user acts as the initiator of the interaction with the AI. Keep in mind that every interaction with the AI must include at least one user message. Usually, a message sent with the role "user" is text; sometimes it is an image, but that happens quite rarely.

An important remark we need to make is that the API itself does not remember anything between calls: the model only sees the messages you send in the current request. When a previous message is included alongside the current one, that is what gives the model the ability to connect both requests and interpret them together. Try this question:

messages=[{"role": "user", "content": "What is the southernmost point of Europe?"}]        

Let’s try something more fun with multiple roles and multiple messages, as stated above.

import openai

# Note: this example uses the legacy (pre-1.0) OpenAI Python SDK interface.
openai.api_key = 'ADD_YOUR_KEY_HERE'

try:
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0125",
        messages=[
            {"role": "system", "content": "You are a helpful travel assistant."},
            {"role": "user", "content": "I am visiting Italy. Can you recommend a city?"},
            {"role": "assistant", "content": "I recommend Rome for its rich history and amazing food."},
            {"role": "user", "content": "What are some must-visit attractions there?"}  # Follow-up question
        ]
    )
    print(response["choices"][0]["message"]["content"])
except Exception as e:
    print(f"An error occurred: {e}")


Here is the expected result: Some must-visit attractions in Rome include the Colosseum, Roman Forum, Vatican City (including St. Peter's Basilica and the Sistine Chapel), Trevi Fountain, Pantheon, and the Spanish Steps.

The idea behind this request is that the user is interacting with ChatGPT as if it were a travel agent that can provide information about a trip to Italy. We include the model's earlier response that Rome should be on the bucket list, and we get information about sightseeing there. Now, try to guess what the result will be if we remove all the lines except:

{"role": "user", "content": "What are some must-visit attractions there?"}        

Give it a try and see if you can explain this behavior using the information above.

The System Role

The next role on the list is "system." This role sets the tone and behavior of the model: it is where we define how the model will answer (concise, detailed, formal, casual, etc.). Probably the most interesting part is that we can define knowledge boundaries and limit the responses to a particular subject.
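Here is a minimal sketch of such a constraint. The wording of the system message is just an illustration; any instruction in the same spirit works:

messages=[
    {"role": "system", "content": "You are a travel assistant. Answer only questions about travel in Italy, in at most two sentences. Politely refuse anything else."},
    {"role": "user", "content": "Can you explain quantum computing to me?"}
]

With a system message like this, the model should decline the off-topic question instead of answering it.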

The Assistant Role

The next role to explore is "assistant." It is interesting because it carries the model's previous replies, which lets the conversation keep its context. Understanding this role gives you the ability to maintain and control a conversation. Its main strength is context retention: it is what gives us a continuous conversation with ChatGPT without it constantly asking what the context of our questions is.

Let’s try it again (we will provide only the JSON; otherwise, the publication will be way too long).

messages=[
    {"role": "system", "content": "You are a travel assistant."},
    {"role": "user", "content": "I am visiting Italy. Can you recommend a city?"},
    {"role": "assistant", "content": "I recommend Rome for its rich history and amazing food."},
    {"role": "user", "content": "What are some must-visit attractions there?"}
]

Before running the request, try to guess how this will be different if you do this:

messages=[
    {"role": "system", "content": "You are a travel assistant."},
    {"role": "user", "content": "I am visiting Italy. Can you recommend a city?"},
    {"role": "user", "content": "What are some must-visit attractions there?"}
]

The expected result after the first request is that the agent remembers the city you discussed. However, when you remove the "assistant" message in the second request, the model has no clue which city was recommended and will fall back to generic information about multiple cities worth visiting in Italy.

Another advantage of this role is that you can define it in advance, so your personalized model will always behave in a particular way and keep track of what has already been said. The conversation carries on exactly where it left off.

messages = [
    {"role": "system", "content": "You are a career advisor."},
    {"role": "user", "content": "I want to switch to a tech job. What should I do?"},
    {"role": "assistant", "content": "You should consider learning Python and data structures."},
    {"role": "user", "content": "Can you suggest some resources?"}
]

In this case, the model remembers that Python was recommended, so when you ask for resources it will suggest learning materials about Python and not, for example, Java.

To summarize, this role gives you the ability to maintain long conversations and keep information about them without feeling that the model has lost track of it. Nowadays, this sounds like common sense, but anyone working with LLMs knows that in the beginning, the models were not able to “remember,” and the conversations were sometimes awkward.

The Combination of Assistant and Function Roles

And at the end, let's discuss the combination of the assistant role and function calling, whose result messages now use the "tool" role (older versions of the API used a dedicated "function" role for the same purpose).

A function combined with the assistant role gives ChatGPT capabilities beyond text generation: the model can ask your application to call external functionality, such as your own functions, third-party APIs, or data sources.

How does it work?

In a normal situation, the assistant simply generates text based on the requests you give it. By adding a "function" (declared as in the sketch right after this list), the LLM can:

  • Detect when an additional function call is needed.
  • Request function execution and add the required parameters.
  • Receive and process the function output.
  • Retrieve data and form the final response.
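
Before the model can request a function call, you have to describe the function in the request. Here is a minimal sketch of declaring a get_weather function. It uses the current OpenAI Python SDK (1.x) client rather than the legacy calls shown earlier, and the parameter schema is only an illustration:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city name, e.g. Berlin"}
                },
                "required": ["city"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
    tools=tools
)

# If the model decides to call the function, the reply contains tool_calls
# instead of plain text content.
print(response.choices[0].message.tool_calls)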

Here is a simple example of how it works:

A user requests the weather in Berlin: "What is the weather in Berlin?"

  1. The assistant understands that it has no actual data and needs to request it from an external source.
  2. It generates a tool call in its "assistant" message (the tool_calls field shown below).
  3. Your application executes the function "get_weather" and returns the data in a "tool" message.
  4. Finally, the "assistant" role uses the data to generate the final response.

{
    "role": "assistant",
    "content": null,
    "tool_calls": [
        {
            "id": "call_12345",
            "type": "function",
            "function": {
                "name": "get_weather",
                "arguments": "{\"city\": \"Berlin\"}"
            }
        }
    ]
}

The "tool" executes the request. The system must execute it and then return the result:

{
    "role": "tool",
    "tool_call_id": "call_12345",
    "name": "get_weather",
    "content": "{\"temperature\": \"8°C\", \"condition\": \"Cloudy\"}"
}

The “tool” role acts as a bridge between ChatGPT and the external world. It returns structured data so the “assistant” can generate text for the final response.

At the end, the “assistant” sends the response to the user:

{
    "role": "assistant",
    "content": "The weather in Berlin is currently 8°C and cloudy."
}
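
To make the whole loop concrete, here is a rough sketch of the round trip under the same assumptions as the earlier sketch (1.x SDK, a hypothetical local get_weather implementation); error handling is omitted:

import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical local implementation; a real version would call a weather API.
def get_weather(city):
    return {"temperature": "8°C", "condition": "Cloudy"}

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"]
        }
    }
}]

messages = [{"role": "user", "content": "What is the weather in Berlin?"}]

# First request: the model decides it needs the function and returns tool_calls.
first = client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=messages,
    tools=tools
)
assistant_msg = first.choices[0].message
messages.append(assistant_msg)  # keep the assistant's tool call in the history

# Execute each requested call locally and send the result back as a "tool" message.
for call in assistant_msg.tool_calls:
    args = json.loads(call.function.arguments)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": json.dumps(get_weather(**args))
    })

# Second request: the model turns the tool output into the final answer.
second = client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=messages,
    tools=tools
)
print(second.choices[0].message.content)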

Now, let’s expand the abilities of the “assistant” role. It can actually handle multiple functions within the same call. For example, it can:

  • Fetch data about flight prices.
  • Check the weather.
  • Provide hotel recommendations.

Multiple functions can be requested in a single response and processed in parallel, as sketched below.
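
If the model requests several functions at once, the loop from the previous sketch only needs a lookup table. The function names and implementations below are placeholders, not real APIs:

import json

# Hypothetical local implementations; real versions would call external services.
def get_flight_prices(city):
    return {"lowest_price": "120 EUR"}

def get_weather(city):
    return {"temperature": "8°C", "condition": "Cloudy"}

def get_hotels(city):
    return {"top_pick": "Hotel Example"}

# Maps each declared tool name to the local function that implements it.
available_functions = {
    "get_flight_prices": get_flight_prices,
    "get_weather": get_weather,
    "get_hotels": get_hotels,
}

# assistant_msg and messages come from the previous sketch; every parallel
# tool call is dispatched to the matching local function.
for call in assistant_msg.tool_calls:
    func = available_functions[call.function.name]
    args = json.loads(call.function.arguments)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": json.dumps(func(**args))
    })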

Disclaimer: “Summary and final thoughts” and “What’s next” were generated by ChatGPT without using a function or “tool” role. 😉
