Module 1: What Is a Prompt (And Why Most People Get It Wrong)
Prompt Engineering Fundamentals
That gap? It comes down to understanding one thing: a prompt isn't a search query. It's an instruction set.
Stop Googling at AI
Here's my biggest frustration watching people use ChatGPT for the first time. They type things like "marketing ideas" or "help with email" and then complain the output is generic. Of course it is. You gave the AI nothing to work with.
Google trained us to use keywords. AI needs context, specificity, and direction. It's the difference between shouting "food!" at a restaurant versus saying "I'd like the salmon, medium-rare, with the asparagus instead of fries."
You need AI to help with marketing. Which prompt will produce a more useful result?
The Four Elements That Actually Matter
After writing thousands of prompts across client work, content creation, and data analysis, I've found that every effective prompt contains some combination of four things:
1. Clarity: State the exact output you want. Not "write about marketing" but "write a 200-word LinkedIn post about why email marketing outperforms paid social for B2B SaaS companies."
2. Context: Background the AI doesn't have. Your industry, audience, constraints, what you've already tried. The more relevant context, the better the output.
3. Constraints: Boundaries that focus the output. Word count, format, tone, what to include, what to avoid. Constraints aren't limitations; they're creative guardrails.
4. Examples: The most underused element. Show the AI what good looks like and it will mirror that quality. Skip this and you're gambling on its defaults.
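The four elements above can be assembled mechanically. Here's a minimal sketch of a helper that does exactly that; `build_prompt` and its parameter names are illustrative inventions for this lesson, not part of any AI library.

```python
def build_prompt(clarity, context=None, constraints=None, example=None):
    """Combine the four elements into a single prompt string.

    Only `clarity` (the exact task) is required; the other three
    elements are appended when provided.
    """
    parts = [clarity]
    if context:
        parts.append(f"Context: {context}")
    if constraints:
        parts.append("Constraints:\n" + "\n".join(f"- {c}" for c in constraints))
    if example:
        parts.append(f"Here's what good looks like:\n{example}")
    return "\n\n".join(parts)


# Hypothetical example values, echoing the marketing scenario above.
prompt = build_prompt(
    clarity=("Write a 200-word LinkedIn post about why email marketing "
             "outperforms paid social for B2B SaaS companies."),
    context="I run marketing at a 20-person B2B SaaS startup.",
    constraints=["Tone: direct, no buzzwords", "End with a question"],
    example="Short punchy sentences, one concrete statistic, a clear takeaway.",
)
print(prompt)
```

Notice that a prompt with only `clarity` still works; the helper just degrades toward the lazy one-liner, which is the point: each element you add is information the AI would otherwise have to guess.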
Which element of prompting do most people under-use, even though it has the biggest impact on output quality?
A Real Before-and-After
The lazy prompt:
Write something about our Q4 results
Output: Generic corporate fluff that could apply to any company on earth.
The engineered prompt:
Write a 150-word internal Slack message from our VP of Sales announcing Q4 results. Revenue was £2.3M (up 34% YoY). We closed 3 enterprise deals (Tesco, Barclays, NHS Digital). Tone: celebratory but grounded; acknowledge the team's hard work during a tough market. End with what's ahead for Q1.
Output: Something you'd actually send. Specific, human, useful.
The difference isn't talent. It's information. The second prompt gave the AI everything it needed to do its job.
A prompt like 'Write something about our Q4 results' produces poor output not because the AI lacks intelligence, but because it has no context: it doesn't know your numbers, your audience, or your tone.
Why This Matters More Than You Think
Ethan Mollick, a Wharton professor who's published extensively on AI in the workplace, found that workers using AI with good prompting practices outperformed those without AI by 40% on quality metrics. But, and this is crucial, workers using AI with bad prompts actually performed worse than those using no AI at all. Bad prompts don't just waste time. They produce confident-sounding garbage that you might actually use.
According to Ethan Mollick's research, what happens when workers use AI with BAD prompts?
The Universal Task Prompt
I need you to [SPECIFIC TASK].
Context: I'm a [YOUR ROLE] at [COMPANY TYPE]. This is for [AUDIENCE/PURPOSE].
Requirements:
- Format: [SPECIFY]
- Length: [WORD COUNT OR SIZE]
- Tone: [DESCRIBE]
Here's what good looks like: [EXAMPLE OR DESCRIPTION]
I need you to write a project status update email.
Context: I'm a product manager at a fintech startup. This is going to our CEO and board members as a weekly update.
Requirements:
- Format: Bullet points with headers for each workstream
- Length: 200-300 words
- Tone: Professional, confident, data-driven
Here's what good looks like: Lead with the most important metric, then break down by workstream, end with blockers and asks.
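Once you use the universal template a few times, it's worth capturing in code so you fill in blanks instead of retyping. A minimal sketch using Python's standard `string.Template`; the `UNIVERSAL_TASK` name and placeholder fields are assumptions for illustration.

```python
from string import Template

# The Universal Task Prompt as a reusable template.
# Placeholder names ($task, $role, ...) are illustrative choices.
UNIVERSAL_TASK = Template(
    "I need you to $task.\n"
    "Context: I'm a $role at $company. This is for $audience.\n"
    "Requirements:\n"
    "- Format: $fmt\n"
    "- Length: $length\n"
    "- Tone: $tone\n"
    "Here's what good looks like: $example"
)

# Fill it in with the status-update scenario from above.
filled = UNIVERSAL_TASK.substitute(
    task="write a project status update email",
    role="product manager",
    company="a fintech startup",
    audience="our CEO and board members as a weekly update",
    fmt="bullet points with headers for each workstream",
    length="200-300 words",
    tone="professional, confident, data-driven",
    example="lead with the key metric, then break down by workstream, "
            "end with blockers and asks",
)
print(filled)
```

A nice side effect of `substitute` is that it raises a `KeyError` if you forget a field, so the template itself nags you to supply every element.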
The "Fix My Vague Prompt" Prompt
I want to ask an AI to help me with: [YOUR VAGUE IDEA]

Rewrite this as a detailed, specific prompt that would get a great result. Include context, constraints, format, and tone. Make assumptions where needed and flag them.
I want to ask an AI to help me with: making my CV better

Rewrite this as a detailed, specific prompt that would get a great result. Include context, constraints, format, and tone. Make assumptions where needed and flag them.
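If you meta-prompt often, the wrapper is trivial to automate. A hedged sketch: `fix_vague_prompt` is a hypothetical helper of my own naming, and the string it returns is what you'd paste into ChatGPT or Claude.

```python
# The "Fix My Vague Prompt" wrapper as a constant, with one slot for your idea.
META_PROMPT = (
    "I want to ask an AI to help me with: {idea}\n\n"
    "Rewrite this as a detailed, specific prompt that would get a great "
    "result. Include context, constraints, format, and tone. "
    "Make assumptions where needed and flag them."
)

def fix_vague_prompt(idea: str) -> str:
    """Wrap a vague idea in the meta-prompt, ready to paste into any chat AI."""
    return META_PROMPT.format(idea=idea.strip())

print(fix_vague_prompt("making my CV better"))
```

This is the cheapest possible prompt library: one constant, one function, and every vague idea you have gets the same disciplined framing.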
The Comparison Test
I'm going to give you the same task twice with different levels of detail. Complete both so I can see the difference.

Version A (vague): "Write about leadership"

Version B (specific): "Write a 200-word LinkedIn post arguing that the best leaders ask more questions than they answer. Target audience: mid-level managers in tech. Include one example from Satya Nadella's approach at Microsoft. End with a provocative question."

Now complete both.
1. Open ChatGPT or Claude right now.
2. Type a lazy, one-line prompt for something you actually need: "Write a cover letter" or "Help me with a presentation."
3. Look at the mediocre output.
4. Now rewrite your prompt using all four elements: clarity, context, constraints, examples.
5. Compare the two outputs side by side.
The difference should be obvious. Save your upgraded prompt; it's the start of your prompt library.
---
1. A prompt is an instruction set, not a search query; treat it like briefing a capable colleague
2. The four pillars: clarity, context, constraints, examples
3. Bad prompts don't just waste time; they produce convincing garbage
4. The gap between good and bad prompters is wider than the gap between AI and no AI
5. You can use AI to improve your prompts (meta-prompting); start there if you're stuck
You need a LinkedIn post about remote work. Which prompt will get a better result?