
Marketing

Michael King

June 09, 2023

How to Write Effective ChatGPT Prompts – Episode 1 – FridAI

TL;DR

In this episode of FridAI with Mike King, learn the basics of prompt engineering for generative AI and how to write clear, effective prompts in ChatGPT.

Table of Contents


What is Prompt Engineering?

Prompt Engineering Markdown Examples

Context Window – LLM Memory

LLM’s Literal Understanding of Language

The Anatomy of a Prompt

Greetings and salutations, folks. 

Welcome to the inaugural edition of FridAI. I’m your host, Mike King, CMO here at AIPRM. And what we’re going to be doing with this series is giving you tips, tactics, and the latest news that you can capitalize on in your work with your generative AI tools. I want to start from the basics today.

We’ll ramp up to more complex and advanced things in future episodes, but for this one, I want to talk about how to write effective ChatGPT prompts. 

So for anyone watching this who doesn’t know what a prompt is, a prompt is the specific input or instruction that you give to a generative AI tool to get it to do something.

The reason we need to discuss this is that most people, when they first use these tools, just put in a simple one-liner, or half a sentence, and the tool will respond to anything you say.

But the less specific you are, the more it’s going to take creative license to say like, “Okay, here’s what I think you wanted.”

It’s a garbage-in, garbage-out environment, which means if you put in something that’s not well thought through, you’re going to get something back that you didn’t expect or didn’t want.

What is Prompt Engineering?

That’s where prompt engineering comes into play. That’s really the idea of spending the time to carefully craft what it is that you want from a language model so you can get what you expect.

Again, it’s a garbage-in, garbage-out situation. That’s why people use tools like AIPRM, where our prompt community prepares very well-thought-out prompts that help you quickly get to what you’re looking for.

Prompt Engineering Markdown Examples

Now here’s a quick, simple example of prompt engineering, and we denote prompts using the curly braces just like code.

So in this example, I’m saying, “Acting as Logan Roy from the TV show Succession, write a succinct response to finding out that all of your children have gone bankrupt in 10 years,” and then the output is, “Bankrupt. Unacceptable. I built an empire. Not for you to waste. Buck up. Get back in the game. Show you’re Roys!”

It sounds a lot like Logan Roy, unlike me.
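The same prompt works outside the ChatGPT interface too. Here’s a minimal sketch, assuming the OpenAI Python library (v1+) and an API key in your environment; the client setup and model name are for illustration only and aren’t something from the episode.

```python
# A minimal sketch, assuming the openai Python package (v1+) and an
# OPENAI_API_KEY in the environment. The model name is an assumption for
# illustration; the prompt text is the part that matters.
from openai import OpenAI

client = OpenAI()

prompt = (
    'Acting as "Logan Roy" from the TV show "Succession", write a succinct '
    "response to finding out that all of your children have gone bankrupt "
    "in 10 years."
)

response = client.chat.completions.create(
    model="gpt-4",  # assumed; any chat-capable model works here
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```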

Context Window – LLM Memory

Let’s talk about context windows and why they’re important.

A context window in generative AI is basically the short-term memory within the conversation or conversations that you’re having with the tool.

And so in GPT-4, these can be quite large: up to 32,000 tokens, which is roughly 24,000 words, or about 48 pages. They’ve also got a version that’s much bigger than that, which isn’t widely available yet, but nevertheless it’s a really large context window for these situations. The reason this is important is that if you stay within the same chat session and switch across different topics, everything that you asked and was answered before influences the answers that you get downstream.

When you’re using these tools, you’re going to want to start new chats as you start new topics and new conversations.
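To make that concrete, here’s a rough sketch of how earlier turns keep eating into the window on every new request. It assumes the chat-style message format and the tiktoken library for counting; the example messages are invented, and per-message overhead varies by model, so treat the total as an estimate.

```python
# A rough sketch of why long chat sessions matter: every earlier turn gets sent
# back to the model, so the history keeps consuming the context window. Token
# counting here assumes the tiktoken library; per-message overhead varies by
# model, so treat the total as an estimate.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4")

history = [
    {"role": "user", "content": "Summarize this long article about context windows..."},
    {"role": "assistant", "content": "Here is a summary of the article..."},
    {"role": "user", "content": "Now write a tweet about a completely different topic."},
]

# Everything above still "counts" against the window on the next request.
tokens_used = sum(len(encoding.encode(message["content"])) for message in history)
print(f"Roughly {tokens_used} tokens of the context window are already spoken for.")
```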

LLM’s Literal Understanding of Language

So you want to choose your words very wisely.

I like to use Amelia Bedelia here: if you’re familiar with this children’s book character, you know she takes everything very literally, just like ChatGPT does.

Make sure you’re using specific verbs, words like “condense” and “extrapolate” rather than “rewrite.” If you say “rewrite,” the model has to decide for itself: does it keep the same length, does it say the same things? But if you say “condense” or “summarize,” it’s going to summarize, and if you say “extrapolate,” it’s going to add more based on what’s there. It’s very important to be very specific with those verbs.

Same thing with adjectives, especially when you’re thinking about voice and tone for your content.

Be as prescriptive with those descriptors as possible.

Then entities, any people, places, and things, I like to put those in quotes just to clarify to the language model.

Like, “Hey, I’m talking about this specific thing.” Because there can be situations where, if I just wrote Logan Roy without the quotes, it might think I meant Roy Rogers, which can be a person and could also be a restaurant.

So putting “Logan Roy” in quotes makes it clear that I’m talking about a specific entity.

Then specificity. I can’t stress this enough: be as specific as possible. Not ambiguous, not vague, unless you want it to be creative.
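To tie the word-choice advice together, here’s a small illustration of a vague prompt next to a specific one. Both prompts are invented for this post; they just reuse the Logan Roy example from earlier, and the wording is the point, not the code.

```python
# A small illustration of the word-choice advice: a vague prompt versus one with
# a specific verb, a quoted entity, and explicit constraints. The prompt wording
# is invented for this post.
vague_prompt = "Rewrite this bio of Logan Roy."

specific_prompt = (
    'Condense this bio of "Logan Roy" (the Succession character, not Roy Rogers) '
    "into exactly two sentences, in a blunt, impatient tone, with no jargon."
)

for label, prompt in [("Vague", vague_prompt), ("Specific", specific_prompt)]:
    print(f"{label}: {prompt}\n")
```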

The Anatomy of a Prompt

Now, the anatomy of a prompt, or at least of the sort of prompts that we do here at AIPRM and that a lot of the people in our community do as well, is effectively like this. The order of these items doesn’t really matter; you just want to make sure that it makes sense when you’re writing them to ChatGPT.

  • So the role: who is the GPT acting as?
  • Is it a person, is it a specific type of profession?

Just be very specific about that. That’s going to limit the context of its response.

Then there’s just general context.

  • Where is this person?
  • What are they writing for?
  • Are they writing a book?
  • Is it a plumber that’s at someone’s home talking about toilets being clogged?

Whatever it is, give the context for that person that’s meant to be responding.

 

The instructions:

  • What do you want them to do?
  • Do you want them to write a specific thing in a specific order?
  • Do you want them to have specific headings in the content that they’re giving you?
  • Do you want them to write large paragraphs, short paragraphs?
  • Do you want them to account for things like perplexity so that you can trick an AI detector?

All those instructions need to be baked in there.

The format: that’s also going to inform what it responds with.

  • Is it meant to be a tweet?
  • Is it meant to be a chapter in a book?
  • Is it meant to be a blog post?

Whatever format you’re thinking about, be very specific there, and that’s going to limit or expand the number of words it responds with.

Ideally, if you have examples, provide those because it can mimic very specifically what it is that you are wanting if you show it to them. Not them, but to it. It’s not a requirement, but again, it’s the quickest way to get to what you want. 

Finally, any constraints: If you don’t want it to use technical jargon, if you want it to use a specific voice or tone, if you want it to be a specific type of message, make sure that you put that in the constraints.
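Putting those pieces together, here’s a minimal sketch of that anatomy as a small Python template. The field names mirror the parts above (role, context, instructions, format, examples, constraints); the helper function and the sample values are invented for illustration, not an actual AIPRM prompt.

```python
# A minimal sketch of the prompt anatomy as a reusable template. The field
# names mirror the parts described above; the function and sample values are
# invented for illustration.
def build_prompt(role, context, instructions, output_format,
                 examples=None, constraints=None):
    """Assemble the pieces into one prompt string. Order is flexible;
    it just has to read clearly."""
    parts = [
        f"Acting as {role}.",
        f"Context: {context}",
        f"Instructions: {instructions}",
        f"Format: {output_format}",
    ]
    if examples:
        parts.append("Here is an example of what I am looking for:\n" + examples)
    if constraints:
        parts.append(f"Constraints: {constraints}")
    return "\n\n".join(parts)


prompt = build_prompt(
    role='a "New York Times" bestselling author',
    context="You are writing for homeowners who keep ignoring a clogged toilet.",
    instructions="Write an introduction explaining why the problem gets worse the longer it is ignored.",
    output_format="A blog post intro of two short paragraphs, with a working title.",
    constraints="No technical jargon; warm, practical tone.",
)
print(prompt)
```

The point isn’t the code; it’s that every prompt you write touches each of those fields, whether or not you ever automate it.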

As you write your prompts using this format, it’s going to become second nature.

Every time I write a prompt, now I’m always starting with my role, which will be something like “New York Times bestselling author” because typically that’s going to be pretty good writing, right?

A common concern is that the writing might be empty or too basic. So I’ll say, “Write something with a lot of technical detail,” or add whatever other constraints I require, so I get closer to the output I wanted.


That concludes our first episode here.

Hope to see you back again next FridAI. If you aren’t already using AIPRM, what are you doing? It’s free. And if you have any problems with the prompts that you’re writing, or you just want some input or feedback from folks, check out our community.

It’s forum.aiprm.com, and I’ll see you next week.

Thank you for reading.

Written by

Michael King is AIPRM's Chief Marketing Officer. An artist and a technologist all rolled into one, Michael King is also the CEO and founder of the digital marketing agency iPullRank, focused on technical SEO, content strategy, and machine learning. King consults with enterprise and mid-market companies all over the world, including brands like SAP, American Express, Nordstrom, SanDisk, General Mills, and FTD. His background in Computer Science and as an independent hip-hop musician sets him up for deep technical and creative solutions to modern marketing problems. Check out his book "The Science of SEO" from Wiley Publishing.
