A practical guide to prompts, frameworks, and real-world examples from the GenAI Innovator Group
The Open University · 2025
Part One
Introduction
Volunteers from Marketing, Communications and Development have been using a licensed version of ChatGPT since mid-2024; under this licence, OpenAI does not use Open University workspace data to train its models. This guide shares what they've learned — both the principles of effective prompting and the specific prompts that have proven useful.
GenAI is a tool that can save time, boost your creativity and improve your effectiveness on daily tasks. These tools have proven useful for summarising large amounts of information and for copywriting, and they can even support decision-making through a back-and-forth dialogue with the user, a technique known as chain prompting.
🔒 Maintain confidentiality. Never upload confidential, personal, or sensitive unpublished OU data that could harm the OU, its customers, its staff, or its reputation if made public. Use anonymised summaries instead. The OU's ChatGPT environment is secure and does not train external models, but all data handling should still comply with OU privacy and information security policies.
⚖️ Use AI ethically. AI should support, not replace, professional judgement. Always credit human authorship in published work and disclose AI assistance where relevant. Avoid using AI to imitate individuals or create misleading material.
Part Two
Creating the Perfect Prompt
When working with GenAI, the quality of the prompt you write makes a big difference to the output. Think of prompts as instructions: the clearer you are, the more accurate and useful the response will be.
Think about how you'd ask a teammate for help using an Instant Message on Microsoft Teams. That same tone of voice works well with GenAI — clear, straightforward, and conversational.
1
Be Clear and Specific
State exactly what you want. Include details such as tone, format, or word length. For example: "Write a 200-word summary in a friendly but professional tone."
2
Provide Context
Give the AI enough background to understand the task — the audience, purpose, or examples of similar work.
3
Add Constraints
Constraints such as length, structure, or style help the AI stay focused. For example: "List five bullet points, each under 20 words."
4
Use Iteration
Don't expect the perfect answer first time. A quick back-and-forth often gets better results than one long, complicated prompt. Try: "That's a good start — could you shorten it and make it more student-friendly?"
5
Set a Role
Ask the AI to take on a role — senior consultant, policy advisor, strategist — to shape the output for your context. This encourages higher-level insights and recommendations.
6
Structure Your Prompt
Use lists, sections, or numbered steps. The AI will often play these steps back as part of its response, keeping the output organised.
7
Check and Edit
Always review outputs before using them. Be cautious of "high confidence / low accuracy" — AI can generate reports that look polished but contain fundamental errors. Current best practice uses AI for early drafts; final drafts should be human-edited.
8
Verify AI Outputs
Cross-reference key claims online or in OU data. Double-check numbers, quotes, and sources. Re-run the same prompt phrased differently to test consistency. For factual tasks, ask: "Please show your reasoning or list your sources."
9
Troubleshoot
If you're not getting the right answer: simplify, add examples, check assumptions, or ask "What information would help you improve your answer?" Consider asking it to play the role of an expert and review its own answer.
10
Ask for Confidence Checks
Ask the AI to rate its own performance on a scale of 1–10 with reasoning. This gives you a quick confidence signal and encourages it to highlight assumptions or gaps. A score of 8+ often still needs slight changes, but you may prefer to do these yourself.
⚠️ SEO warning: Always review and adapt AI-generated content before using it for public-facing materials. Content that appears AI-generated (generic, unoriginal, low-value) can be penalised by search engines. Refine the language to ensure it sounds natural, human-written, and demonstrates E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness).
Part Three
Prompt Frameworks
There are a number of frameworks for structuring prompts. Which one to use depends on the purpose of the project and your familiarity with AI tools.
BRTF
Beginner-friendly — great for getting started
B — Background: context (who, what, where, why)
R — Role: the persona the AI should adopt
T — Task: the specific action you want
F — Format: how the response should be delivered
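Put together, a BRTF prompt might look like the illustrative example below (the scenario and details are hypothetical, not a prompt from the volunteer group):

Background: The OU is promoting an upcoming postgraduate open day to prospective students.
Role: You are an experienced higher-education marketing copywriter.
Task: Draft three short social media posts announcing the open day.
Format: Each post under 280 characters, with a clear call to action.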
RACE
Practical — defines who, for whom, and what
R — Role: the AI's persona
A — Audience: who will receive the output
C — Context: the background or situation
E — Expectation: the desired outcome
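Applying RACE, a prompt could read as follows (an illustrative sketch only; the scenario is hypothetical):

Role: You are an internal communications specialist.
Audience: OU staff with no technical background.
Context: A new expenses system launches next month and staff are anxious about the change.
Expectation: A 150-word announcement that is reassuring, clear, and jargon-free.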
TREE
Critical thinking — requires reasoning and evidence
T — Task: what you want the AI to do
R — Reasoning: why the task matters
E — Evidence: data or support to include
E — Explanation: how to justify the response
ITERATE
Iterative refinement — test, refine, improve
I — Initial prompt: write your first version
T — Test: try it in the tool
E — Evaluate: review what worked
R — Revise: adjust wording or structure
A — Adjust: reframe to address gaps
T — Try again: re-run and compare
E — Execute: use the final result
💡 Recommendation: The ITERATE approach — conversational back-and-forth — is often the most productive. One-shot prompts rarely work perfectly. The dialogue makes it clearer why the AI took a certain approach, which helps you defend your choices when taking ownership of the outputs.
Part Four
Researching
Prompts for market research, competitor analysis, text analysis, and finding information efficiently.
Researching · TREE
Gap Analysis — Reviewing a Debrief Presentation
Purpose
Identify where existing market insight debriefs provide strong evidence and where further data or research is needed. Helps managers avoid over-reliance on incomplete sources and focus effort on areas of uncertainty.
What GenAI does
Reads a report or dataset and produces a structured summary of key findings
Highlights areas with strong evidence (clear data, repeated across sources)
Flags gaps or weak spots (missing evidence, outdated references, lack of detail)
Suggests questions or areas for further research
Prompt
You are an insight analyst. Review the uploaded report. Please:
Summarise the main findings in 300 words.
Highlight 3–5 areas where evidence is strong and well-supported.
Highlight 3–5 gaps, limitations, or areas requiring further research.
Suggest follow-up questions or sources that could address those gaps.
Quality Checklist
Summary captures the essence of the report without duplication
Gaps are specific (not vague like "more data needed")
Strong evidence points are clearly grounded in the report
Suggested follow-up is realistic and not already in progress
Self-rating is provided with rationale
💡 File types: This works with Word, PowerPoint, or PDF reports. For complex PDFs with tables and charts, copy tables into Excel or upload the source file — AI may misread embedded elements.
Researching · TREE
Market & Competitor Research — Desk Research
Purpose
Gather a quick, structured overview of a market or competitor, saving time on initial desk research and highlighting areas that need deeper investigation.
Prompt
You are a market analyst.
Research {market/competitor} and return:
A 500-word summary of current activity and positioning.
3–5 strengths and opportunities.
3–5 risks or weaknesses.
Any regulatory, demographic, or market trends affecting them.
Caveats (e.g. data age, missing sources, paywall restrictions).
Quality Checklist
Information is from reputable, recent sources
Strengths/weaknesses are balanced (not just positives)
Market context is relevant to OU strategy
Caveats about source limitations are included
Researching · TREE
Text Analysis — Summarising Verbatims into Key Themes
Purpose
Quickly distil large sets of survey comments or open-text responses into structured, thematic insights that can be shared with stakeholders.
Prompt
You are a market research analyst. Please analyse the following open-text survey responses. Group the comments into themes, and for each theme, provide a short description, representative quotes, and an indication of how common the theme is. If something is not easily categorised, include it in an 'other' category.
Highlight both positive and negative feedback. Finally, provide 3–5 actionable recommendations based on the analysis.
[Insert survey responses here]
Also provide your rationale for why you categorised each response in one category rather than another.
Quality Checklist
Themes are relevant and not overly generic
Quotes are representative and anonymised if needed
AI hasn't double-counted or over-emphasised single responses
Frequencies align with the raw data
Categorisation rationale makes sense and shows transparency
✅ Pro tip: Read through verbatim comments first to get a sense of the themes and numbers expected. The AI's outputs can then be tweaked according to what you've observed. Asking for categorisation rationale creates a column you can sense-check and then delete before dissemination.
Researching · TREE
Search with Complex Criteria
Purpose
Handle searches with complex, multi-layered criteria and summarise results — going beyond what traditional search engines offer in a single query.
Prompt
You are a UK-based, English-speaking marketing professional at The Open University with intermediate/advanced technical abilities. You regularly use Excel, PowerPoint, Power BI, and Looker Studio.
Please suggest some free webinars on how to present complex data in a visual and/or easily understood way. Only suggest webinars that are still accessible: either they have not yet taken place, or a free recording is available. Format can be from 30 minutes to 2 hours, and lecture style is preferred.
Please suggest 5–10 options and summarise each with key info such as title, date, synopsis and key takeaways. Please also provide a link to the webinar sign-up or recording.
At the end, provide a confidence rating (1–10) with reasoning.
Quality Checklist
Check that training/recordings are still available — links may be outdated
Researching · TREE
Personalised News Updates
Purpose
Create a personalised newsfeed to ensure you see the most relevant industry and role-specific news without manually searching numerous publications.
Prompt
Prepare a daily brief of news from the education industry regarding universities. Use paragraphs to summarise each news article. Begin the paragraph(s) with a strong topic sentence phrased as an assertion. Bold the topic sentence. After each summary, include "Source" followed by the source name that links to the source material.
Particularly focus on UK universities providing virtual/online course content.
Quality Checklist
Summaries accurately describe the articles
Links work and have not been hallucinated
Part Five
Creating New Content
Prompts for accessibility compliance, digital asset production, and research instrument design.
Creating · ITERATE
Accessibility Statement Writing
Purpose
Produce accessibility statements in the exact format required by the OU AUT team template, saving time on formatting and WCAG rule matching.
Prompt (refined version)
Write accessibility statement for "Form elements must have labels" following example "Some select elements do not have accessible names. People using screen readers or voice recognition software will find it harder to identify and understand the purpose of these dropdowns. This fails WCAG 2.1 Success Criterion 4.1.2 Name, Role, Value (Level A)."
Quality Checklist
Issues GenAI identified are real issues
Issues checked against W3C WCAG rules and descriptions
Statement strictly adheres to the prescribed template
💡 ITERATE in action: Start with a simple prompt first ("Write accessibility statement for 'Form elements must have labels'"). The first output won't match your template. Then refine by providing an example of the exact format you need — the second attempt will match.
Creating · TREE
Support Accessible Digital Asset Production
Purpose
Help produce accessible versions of digital assets — including alt text for images, captions for video, transcripts, and rights checking.
Prompt
I've uploaded a video.
Please extract and list all key images or visual scenes used in the video.
For each image or scene:
1. Provide concise, descriptive alt text suitable for accessibility purposes.
2. Suggest possible sources or origins of the image (e.g., stock library, public domain, original footage, logo, etc.).
3. Check and explain the likely rights or copyright status of each image — for example, whether it might be copyrighted, Creative Commons licensed, or likely public domain.
If the source or rights information cannot be confirmed, please provide best-effort guidance on how to verify it.
Quality Checklist
Treat outputs as a preliminary draft to be reviewed and refined
Review outputs for bias built into the GenAI models
Verify image rights information with original sources
Creating · ITERATE
Designing a Discussion Guide & Questionnaire
Purpose
Design research instruments (a tutor discussion guide and a student questionnaire) that explore specific topics. Generates structured, ready-to-use materials that can be refined for fieldwork.
Prompt (initial)
You are an educational researcher at The Open University.
Please draft:
1. A tutor discussion guide (8–10 open-ended questions, 45–60 minutes).
2. A student questionnaire (10–12 questions, a mix of multiple-choice and open-text).
Both should focus on why students do not resit or resubmit when given the opportunity. Include practical, motivational, and emotional barriers, and potential supports.
Follow-up prompt (to refine)
After reviewing the initial output, provide additional context:
Project briefing details — target population, research aims, specific concerns from past surveys
Constraints — "keep questionnaire under 12 questions," "design tutor guide for 45 minutes"
Tone guidance — "questions should be neutral, respectful, student-friendly"
Planned use — "these will be piloted with a small group first"
Quality Checklist
Questions are non-leading and respectful in tone
Coverage of all barrier types (practical, emotional, motivational)
No duplication between guide and questionnaire
Number of questions and timings are realistic
Aligns with OU ethical standards and MRS Code of Conduct
Reviewed for bias in question tone
Part Six
Repurposing Content
Prompts for transcription, converting handwritten notes, summarising for stakeholders, and reframing sensitive insights.
Repurposing · TREE
Transcribing VoxPop Videos (MP4 files)
Purpose
Quickly and accurately transcribe VoxPop videos used in student or staff research. This makes qualitative content easier to analyse, theme, and share, and also improves accessibility.
Prompt
You are a research assistant.
I will provide you with an MP4 VoxPop video file. Please:
Produce a verbatim transcript of the spoken content.
Create a clean version (if requested) that reads smoothly but remains faithful to meaning.
Flag any unclear words or audio gaps with timestamps.
(Optional) Provide a 150-word summary of the key points.
Quality Checklist
Transcript is complete and accurate (watch the video first)
Unclear audio is flagged transparently
Cleaned version doesn't distort meaning
Repurposing · TREE
Converting Handwritten Text from PDFs
Purpose
Transform handwritten notes (from workshops, focus groups, or scanned feedback forms) into editable, searchable digital text for thematic analysis or reporting.
Prompt
You are a research assistant.
I will provide you with a PDF containing handwritten notes. Please:
Convert all legible handwriting into typed text.
Flag any uncertain or unreadable words with markers.
Produce a clean, editable version of the notes.
(Optional) Provide a short summary of key themes if the content allows.
Quality Checklist
Converted text is as accurate as possible given handwriting legibility
All flagged words are clearly marked for manual review
Layout/structure of notes is preserved where it adds meaning
Repurposing · TREE
Summarising Dense Text & Creating Stakeholder Briefings
Purpose
Transform the contents of a presentation (survey results, research findings, project updates) into a clear, concise introductory email suitable for senior stakeholders.
Prompt
You are a manager preparing to brief a senior leader. Please read the attached PowerPoint presentation. Summarise the purpose, methodology, and key insights into an introductory email draft. Provide two versions: (1) a fuller version with headings and bullet points, and (2) a one-paragraph executive summary version. Ensure tone is professional, concise, and suitable for senior leadership.
Quality Checklist
Sensitive or caveated insights are included where appropriate
Email tone matches the intended audience
Attachments or links are correctly referenced
Phrasing aligns with organisational style and tone
Repurposing · TREE
Reframing Sensitive Insights for Staff Audiences
Purpose
Adapt sensitive or potentially challenging insights into clear, constructive narratives. Maintain the strategic lens and highlight student voice, while avoiding language that could be misinterpreted as critical of staff during organisational change.
Prompt
You are acting as a strategic communications partner. Here are some insights: [insert key findings]. Please reframe them for a staff audience who may be sensitive due to organisational change. Your response should: (1) provide two to three alternative slide wordings, (2) include presenter notes that reduce defensiveness and emphasise student voice alongside staff expertise, (3) keep the strategic lens clear (e.g. Recruit, Retain, Return), and (4) suggest a before-and-after example to show the effect of reframing.
Quality Checklist
Language is inclusive and non-judgemental
Reframe values staff contributions explicitly
Student voice is visible and embedded
Message still lands with a strategic focus
No phrases that could create unintended anxiety about age, capability, or redundancy