Episode 3, season 1 – Mastering prompt engineering

I. A galaxy of context

In a galaxy not so far away, the key to mastering AI lies not only in fine-tuning but also in understanding the power of context. As Master Yoda might advise, prompt engineering is essential in training your own AI model.

Master Yoda: Luke.
Luke: Yes, Master Yoda?
Yoda: In your training of creating your own model, mastering prompt engineering, you must.
Luke: But I thought the key was fine-tuning the model?
Yoda: Much to learn, you still have. Context is the force. Not the only technique, fine-tuning is. Be one with the context, you must; or mastered by the droids (AI), you will be.

With that wisdom in mind, let’s delve into the essentials of prompt engineering.

II. What is Prompt Engineering?

Prompt engineering, also known as prompt tuning or prompting, is a technique that gained significant traction over the past year. For users of ChatGPT and similar models, it’s about providing precise and comprehensive context to improve the quality of responses. The more detailed the prompt, the more accurate and relevant the answers become.

For data scientists and engineers, prompt engineering involves embedding additional context into prompts to guide AI models more effectively. This can include hidden context or prompt templates that enhance the model’s understanding of the query.
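To make this concrete, here is a minimal sketch of a prompt template that embeds hidden context before the query reaches the model. The helper names and the template wording are hypothetical, not from any particular library:

```python
# Minimal sketch: wrap the user's question in hidden context via a template.
# The actual model call is omitted; we only assemble the final prompt string.

HIDDEN_TEMPLATE = (
    "You are a helpful assistant for a lightsaber shop.\n"
    "Known facts about the user: {context}\n"
    "Answer the question below using those facts.\n\n"
    "Question: {question}"
)

def build_prompt(question: str, context: dict) -> str:
    """Embed extra context into the prompt without the user ever seeing it."""
    context_text = "; ".join(f"{key}={value}" for key, value in context.items())
    return HIDDEN_TEMPLATE.format(context=context_text, question=question)

prompt = build_prompt(
    "Which lightsabers can I afford?",
    {"planet": "Tatooine", "currency": "wupiupi"},
)
print(prompt)
```

The user only types the question; the surrounding template is the "hidden context" that steers the model's answer.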

III. Common examples of Prompt Engineering

1. Example 1: Contextual pricing in mobile applications

Imagine a user in a mobile app asks about available lightsabers and their prices. Pricing could vary with the user’s planetary location, from which the appropriate currency can be detected (1). For instance, prices on Tatooine might be displayed in wupiupi, while on Earth they could be in bitcoin. Additional context, such as the user’s age, could influence the recommended lightsaber length. This kind of contextual information, referred to here as “streaming data,” improves the accuracy of responses and enhances the user experience, showcasing the potential for online businesses.
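A rough sketch of how such streaming context could be turned into prompt instructions at request time. The planet-to-currency mapping, the age cutoff, and the function name are all invented for illustration:

```python
# Hypothetical sketch: derive prompt instructions from streaming context
# (planet, age) detected at request time. Mappings and rules are invented.

CURRENCY_BY_PLANET = {"Tatooine": "wupiupi", "Earth": "bitcoin"}

def pricing_context(planet: str, age: int) -> str:
    """Build the context snippet to prepend to the user's pricing question."""
    currency = CURRENCY_BY_PLANET.get(planet, "galactic credits")
    # Age steers the recommendation, e.g. shorter sabers for younger users.
    length_hint = "short" if age < 16 else "standard"
    return (
        f"Display all prices in {currency}. "
        f"Recommend {length_hint}-length lightsabers."
    )

print(pricing_context("Tatooine", 12))
```

This snippet would be concatenated with the user’s question before the model call, so the same question yields location- and age-appropriate answers.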

2. Example 2: Diplomatic invitations for Princess Leia

Consider an AI model that needs to handle confidential data, like sending an invitation to Princess Leia Organa (2). If the model is not trained on the specific layout required for such invitations, a hidden prompt can provide the necessary template details. This ensures the creation of an appropriate and diplomatic invitation, avoiding any potential incidents.
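As a sketch of the idea, a hidden prompt can carry the required layout so the model follows a protocol it was never fine-tuned on. The layout, field names, and wording below are purely illustrative:

```python
# Hypothetical sketch: a hidden prompt supplies the invitation layout
# the model must follow. The template content is invented for illustration.

INVITATION_LAYOUT = """\
[Royal Seal of Alderaan]
To: {recipient}
From: {sender}
Occasion: {occasion}
Body: <the model writes a formal, diplomatic message here>
Closing: With the highest regards of the Senate"""

def hidden_invitation_prompt(recipient: str, sender: str, occasion: str) -> str:
    """Assemble the hidden prompt that enforces the required layout."""
    layout = INVITATION_LAYOUT.format(
        recipient=recipient, sender=sender, occasion=occasion
    )
    return "Draft a diplomatic invitation. Strictly follow this layout:\n" + layout

print(hidden_invitation_prompt(
    "Princess Leia Organa", "Luke Skywalker", "Senate reception"
))
```

The end user never sees this template; it is injected behind the scenes so the generated invitation respects protocol.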

IV. The path forward

This was Step 3 of 13 in the journey to creating your proprietary AI model. Next time, we’ll explore static and dynamic prompt engineering, data augmentation, and techniques to reduce hallucinations in AI responses.

May the context be with you.

References:

(1) In a 3D environment, the AI Lisa we built (an HR soft-skills detector AI that works through various experiences) receives the user’s location and orientation as input along with the oral prompt. This allows the NPC to orient and guide the user in 3D space.
(2) The AI Stephane we created can detect whom it is speaking to and adapt its business knowledge accordingly. I will explain later why we use a prompt-engineering strategy here rather than RAG or fine-tuning. Stephane is familiar with most jobs, tools, and processes in banking, insurance, and marketing.
