The magic of basic AI prompting.
In my previous article, I promised to explore the importance of effective "prompting", which has given rise to a specialised field known as Prompt Engineering. As with my earlier piece, I aim to keep this article simple. My target audience is readers who may not have extensive technical knowledge of artificial intelligence or machine learning. I plan to write a more in-depth article on advanced prompting techniques in the future, but for now we will focus on the fundamentals of prompting, which can still empower you to become a proficient user of Generative AI applications like ChatGPT.
Why does prompting hold such significance? At their core, generative AI applications involve providing an input and receiving an output in return. We are all familiar with the saying "garbage in, garbage out". Your objective is to construct prompts that effectively steer the AI model towards the desired response or action, while minimising the likelihood of mistakes, misunderstandings, or something I referred to in the last article as ‘hallucination’ – Generative AI’s tendency to ‘wing’ its answers when it doesn’t understand the question or know the answer.
What exactly is a prompt?
A prompt can be defined as a query or directive presented to an AI system to evoke a particular response or action. It acts as the starting point for an AI-driven task or conversation.
As an example, consider using a virtual assistant such as Siri. You may ask, "When is the next public holiday in the UK?". In this situation, your question acts as a prompt, directing the AI – in this case Siri – to provide you with the requested information. Most generative AI users, especially ChatGPT users, will typically interact with a web interface that appears as follows:
Within the text box labelled "Send a message…" users will input their chosen prompt. By clicking the arrow, ChatGPT, in this instance, will generate a corresponding response.
Let's now examine prompts. As mentioned earlier, this article will solely focus on fundamental prompts and concepts.
Prompts can be as basic and as ambiguous as:
"Tell me something interesting about..."
"What's your opinion on...?"
"The sky is...?"
"What's the best way to solve this problem?"
However, such prompts lack specific context or clear instructions, which may result in the AI system generating responses that don't align with the user's intent or expectation.
To create effective prompts, it is recommended to incorporate one or more of the following components:
Instruction – a specific task or action for the AI model to perform.
Context – relevant external information that helps guide the AI system towards a better response.
Input Data – the text or question you want the AI system to respond to.
Output Indicator – the desired format or type of the output.
It is important to note that not all of the components above are essential. Creating effective prompts is an iterative process that requires experimentation to achieve optimal results. Generally, starting with simpler prompts and gradually adding more elements and context yields better outcomes.
To create effective prompts for simple tasks, it is helpful to use commands that direct the model on what action to take, such as "Write", "Classify", "Summarise", "Translate", or "Order". In most cases, place these instructions at the beginning of the prompt and be specific and descriptive about the intended task. The more precise and detailed the prompt, the better the results are likely to be.
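To make the four components above concrete, here is a minimal illustrative sketch in Python that assembles them into a single prompt string. This is not part of any library or official tool – the function name and labels are my own invention, purely to show how the pieces fit together:

```python
def build_prompt(instruction=None, context=None, input_data=None, output_indicator=None):
    """Assemble a prompt from the four optional components described above."""
    parts = []
    if instruction:
        # The task or action, e.g. "Summarise", "Classify", "Translate"
        parts.append(instruction)
    if context:
        # Background information that guides the model
        parts.append(f"Context: {context}")
    if input_data:
        # The text or question the model should work on
        parts.append(input_data)
    if output_indicator:
        # The desired shape of the answer
        parts.append(f"Respond in this format: {output_indicator}")
    return "\n\n".join(parts)

prompt = build_prompt(
    instruction="Summarise the text below in one sentence.",
    input_data="At a particular instant roughly 15 billion years ago...",
)
print(prompt)
```

Notice that every component is optional, mirroring the point that not all components are essential – you can start with just an instruction and add context or an output indicator as you iterate.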
Now, let's examine an example that incorporates the components mentioned earlier:
Here we want a basic prompt for designing a diet plan for weight loss.
You can create similar prompts to summarise text, or even articles on websites (by simply providing the full web address).
For example:
Prompt:
At a particular instant roughly 15 billion years ago, all the matter and energy we can observe, concentrated in a region smaller than a dime, began to expand and cool at an incredibly rapid rate. By the time the temperature had dropped to 100 million times that of the sun’s core, the forces of nature assumed their present properties, and the elementary particles known as quarks roamed freely in a sea of energy. When the universe had expanded an additional 1,000 times, all the matter we can measure filled a region the size of the solar system. At that time, the free quarks became confined in neutrons and protons. After the universe had grown by another factor of 1,000, protons and neutrons combined to form atomic nuclei, including most of the helium and deuterium present today. All of this occurred within the first minute of the expansion. Conditions were still too hot, however, for atomic nuclei to capture electrons. Neutral atoms appeared in abundance only after the expansion had continued for 300,000 years and the universe was 1,000 times smaller than it is now. The neutral atoms then began to coalesce into gas clouds, which later evolved into stars. By the time the universe had expanded to one fifth its present size, the stars had formed groups recognisable as young galaxies.
Summarise this in one sentence:
Output:
Around 15 billion years ago, all matter and energy was concentrated in an incredibly small region before rapidly expanding and cooling, with protons and neutrons later combining to form atomic nuclei, and only much later did the universe become cool enough for neutral atoms to form, leading to the evolution of stars and the formation of galaxies.
You can also ask it to extract information from a given article.
Here is an example:
Prompt:
Obama was born in Honolulu, Hawaii. After graduating from Columbia University in 1983, he worked as a community organizer in Chicago. In 1988, he enrolled in Harvard Law School, where he was the first black president of the Harvard Law Review. After graduating, he became a civil rights attorney and an academic, teaching constitutional law at the University of Chicago Law School from 1992 to 2004......
When did Obama enrol in law school?
Output:
Obama enrolled in Harvard Law School in 1988 after graduating from Columbia University in 1983 and working as a community organizer in Chicago.
Or you can even ask it to generate code.
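For example, a prompt such as "Write a Python function that checks whether a word is a palindrome" might return something along these lines. The exact output varies from run to run, so treat this as an illustrative sketch of the kind of code ChatGPT can produce, not a guaranteed response:

```python
def is_palindrome(word: str) -> bool:
    """Return True if the word reads the same forwards and backwards."""
    cleaned = word.lower()          # ignore letter case
    return cleaned == cleaned[::-1]  # compare with the reversed string

print(is_palindrome("Level"))  # True
print(is_palindrome("hello"))  # False
```

Because the instruction was specific and descriptive – it names the language, the construct (a function), and the task – the model has everything it needs to produce working code.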
I have deliberately avoided delving into advanced prompting techniques such as zero-shot/few-shot prompting, chain-of-thought prompting, knowledge generation prompting, and other related methods. These more complex techniques warrant a separate article, as they are best utilised when working with Generative AI application programming interfaces (APIs).
To sum up, the capabilities of Generative AI are vast, and we have merely touched the surface. The purpose of this article was to provide a basic understanding of the concept of "prompting". In forthcoming articles, we will delve into more complex and advanced concepts, as well as the practical applications and implications of this technology for both individuals and businesses.
Thanks for reading.