Princess Isabella and a training program

Once upon a time, there was a princess named Isabella. This is her story... She was created by OpenAI, which is hardly news anymore. Since the introduction of GPT-3 on May 28, 2020, nearly the whole world has become aware of the possibilities of Generative Pre-trained Transformer (GPT) language models. The ChatGPT interface is easy for daily use: you simply type whatever question you want. But how do you use the OpenAI API?

The API connection is the backdoor to OpenAI: you connect with the language model via code. By using this backdoor, you can make your prompts and outputs more detailed. How do we achieve this detail? First, you need to establish a connection to OpenAI. This is done by creating a client (opening the backdoor): install the OpenAI package and insert your personal API key (your backdoor key). Now we can send prompts through the backdoor by setting up a call.

## loading the packages
import os
from openai import OpenAI

## creating a client
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

## making a call (the model name is just an example)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is a pull up?"}],
)

You need to specify your call in several ways: you can select different roles, adjust the temperature, use functions, and define the output structure. To set the roles, you use the messages parameter, combined with specific content. A role can be seen as the type of character the model has to deal with, and it is always paired with the message that role wants to provide.

  • System Role: This role determines the AI's viewpoint or behavior. For instance, if you're looking to develop a personalized training program, you would set the AI’s role as a personal trainer. This adjustment allows the AI to tailor its responses according to the specific context you need.

  • User Role: This role is used for your input or requests. When you provide a query or a comment, you're acting in the user role. Your instructions or questions guide the AI in generating relevant responses.

  • Assistant Role: This role represents the AI’s responses or reactions. It encompasses follow-ups and answers based on previous interactions. The assistant role ensures that the AI’s replies are consistent and contextually relevant to your earlier queries.

messages = [
    {"role": "system", "content": "You are a personal trainer."},
    {"role": "user", "content": "What is a pull up?"},
    {"role": "assistant",
     "content": "A pull-up is a strength exercise where you lift your body "
                "up by pulling with your arms while hanging from a horizontal bar."},
    {"role": "user", "content": "How to perform a basic pull up?"},
]

Now that we know the different roles, we can set the temperature of the model to tweak the output we want to receive. The temperature determines how creative the AI will be. You can set a temperature between 0 and 2. The higher the temperature, the more varied the responses will be. With a higher temperature, it's more likely you will get a different response if you ask the same question. If you shout through the door, "Where is the party?" with a temperature of zero, you always get the answer: "The party is here!" With a higher temperature, you might get, "Hell yeah, here’s the party!" or "Let’s get that party started in here!"
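
In code, the temperature is just another parameter of the call. Here is a minimal sketch, again reusing the client from above (model name and prompt are only illustrations):

## the same question at two temperatures
for temperature in [0.0, 1.5]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Where is the party?"}],
        temperature=temperature,
    )
    print(temperature, response.choices[0].message.content)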

Another input we can pass through the API is functions. ChatGPT can interpret the user's message and fit it within the boundaries of a function so that your application can call additional resources. An example of an additional resource could be a call to another API or an external database. After the call, the output data could be used to finalize the ChatGPT output or display the data in the interface. Defining your function and its parameters will help the AI be more specific. In the Jupyter Notebook, I created a function that could potentially be used to search a database. The output of the model shows what parameters can be distinguished based on the user's input. In this case, the exercise type: running, the program length: 3 weeks, and the exercise level: 5 kilometers. These arguments could be used to search for specific programs in a database. Another example could be that the program calls a weather API to understand the current weather forecast.
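
As a sketch of what such a function definition could look like, here is an example using the tools parameter of the chat completions API. The function name and parameter names are hypothetical, not the exact ones from my notebook:

## a hypothetical function the model can fill in with arguments
tools = [{
    "type": "function",
    "function": {
        "name": "search_training_programs",
        "description": "Search the database for a matching training program.",
        "parameters": {
            "type": "object",
            "properties": {
                "exercise_type": {"type": "string", "description": "e.g. running"},
                "program_length_weeks": {"type": "integer", "description": "length of the program in weeks"},
                "exercise_level": {"type": "string", "description": "current level, e.g. 5 kilometers"},
            },
            "required": ["exercise_type", "program_length_weeks", "exercise_level"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "I want to be able to run 5 kilometers in 3 weeks."}],
    tools=tools,
)

## the user's message interpreted as function arguments (a JSON string)
print(response.choices[0].message.tool_calls[0].function.arguments)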

Lastly, if you want a specific output structure, you can pass the requested structure in the call. At the beginning of this month (August 2024), OpenAI introduced an additional feature for this: structured outputs. Previously, you would use functions to get a structured output. In the Jupyter notebook, I used this method to get the arguments of a function, and I could use these arguments again as input for a user prompt. Using functions this way ensures that the outcome stays roughly the same, independent of the user's input.

The new feature gives you the opportunity to define your output precisely. You pass your structured output as input in the call, and OpenAI will send the answer back in the structure you provided. Adding descriptions to the fields turned out to be useful for making the prompt more specific! You can pass a JSON schema or a predefined class as a structured output. The exact JSON output structure is the basis of the webpage: independent of what the user asks, the AI will always return the same JSON structure. Because I know the structure, it can easily be configured within the React component. In the component, the objects are mapped, and after some styling, you have created a dynamic interface.
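
Below is a minimal sketch of the new feature, using a predefined Pydantic class as the output structure together with the parse helper from the openai package. The class and field names are purely hypothetical, not the ones behind my webpage:

from pydantic import BaseModel, Field

## a hypothetical output structure for a training program
class TrainingWeek(BaseModel):
    week: int
    focus: str = Field(description="Main goal of this week")
    sessions: list[str] = Field(description="Short description of each training session")

class TrainingProgram(BaseModel):
    title: str
    weeks: list[TrainingWeek]

## the model answers in exactly this structure
completion = client.beta.chat.completions.parse(
    model="gpt-4o-2024-08-06",
    messages=[
        {"role": "system", "content": "You are a personal trainer."},
        {"role": "user", "content": "I want to be able to run 5 kilometers in 3 weeks."},
    ],
    response_format=TrainingProgram,
)

program = completion.choices[0].message.parsed  ## a TrainingProgram instance
print(program.title)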


Prompt: "Complete the following: Once upon a time there was a"

Once upon a time, in a land far away, there was a beautiful princess named
Isabella. She had long, flowing golden hair and eyes as blue as the ocean. Her
name was Princess Isabella. Despite her mesmerizing beauty, Princess Isabella
was known for her kindness and compassion towards everyone in her kingdom. She
would spend her days helping the needy, visiting the sick, and making sure the
villagers were taken care of. One fateful day, a wicked sorceress named Morgana
heard about Princess Isabella's remarkable traits. Morgana was envious of the

In this article, I mainly described the OpenAI API, which you can access through different providers (for example via Azure, or directly at OpenAI). There are also other model providers, such as AWS, Meta, or Google. A smaller, lesser-known AI platform that connects multiple models is Perplexity. A small tip: give it a try!

If you want to learn the basics like me, this Azure course is a good way to start: Microsoft Azure and OpenAI. In the beginning, the Azure environment helped me to easily access the OpenAI API, and soon I got my first result: Isabella was born. The AI had created a beautiful story, just at the click of a button.