The Open University

Staff Guidance on Generative AI in Teaching & Assessment

Updated 2026  |  Sections 1–7  |  Reviewed annually — always consult the online version at the AI in LTA Exploratorium
🚫 Category 1
GenAI Prohibited
✅ Category 2
GenAI Permitted
⭐ Category 3
GenAI Required
1. Scope

This guidance applies to all staff who design, deliver or assess Open University teaching and assessment — including modules and qualifications that use, or may be affected by, Generative AI (GenAI).

Out of scope: Supervision of postgraduate research (PGR) students and other research/scholarship activity with or about GenAI are covered by separate arrangements.
Core Reference Documents
Document | Purpose
OU Guidance on GenAI for Students | What students may and may not do
GenAI Enabling Principles (2023) | University-wide strategic approach to GenAI
GenAI Acceptable Use Policy | Primary source for permitted/prohibited use — staff and students
OU Responsible AI Policy | Expectations for safe, transparent and accountable AI deployment
Academic Conduct Policy | Handling academic integrity concerns, including GenAI misuse
AI in LTA Exploratorium | One-stop shop: events, resources, guidance, Community of Practice
⚑ Important: Where subject- or qualification-specific guidance exists from Boards of Studies, Schools or Faculties, it must be consistent with the principles in this document. Always check the online version rather than relying on downloaded copies — this guidance is reviewed annually each January.
2. Tools & Safe Use
✅ Approved tool for OU teaching & assessment work: Microsoft Copilot Chat (web) in protected mode is the ONLY external GenAI tool approved for drafting, revising or reviewing OU teaching and assessment materials. In protected mode: data stays within the OU environment, is not used to train public models, and Microsoft does not claim copyright in outputs.
Accessing Copilot Safely
  1. Go to https://m365.cloud.microsoft/chat or use the Copilot button in Microsoft Edge
  2. Sign in with your OU account
  3. Check for the shield icon — hover over it to confirm 'enterprise data protection' applies
  • If the shield icon is NOT visible — do not use that session for OU work
Safe Use Principles — All GenAI Tools
✔ DO
Follow OU data protection guidelines; obtain explicit written approval for any exceptions
Use GenAI to draft or improve your own teaching materials — not to mark or grade student work
Seek advice from the CLIP team when GenAI generates module content that will become OU copyright material
Be transparent about GenAI use in module websites, study guides and slide decks — model good practice
Critically check all outputs for inaccuracy, hallucinations, bias, stereotyping and irrelevance before use
✘ DO NOT
Enter personal, confidential or commercially sensitive information about any individual or the OU into any GenAI tool
Provide student-generated materials (TMA/EMA scripts, forum posts, correspondence) to any GenAI tool, including Copilot
Upload patented or copyrighted OU, student, colleague or third-party material to any tool other than protected-mode Copilot
Copy GenAI outputs verbatim into OU materials without acknowledgement (even from Copilot)
Treat GenAI as an authoritative source — especially for policy, legal, health-related or sensitive topics
3. Assessment

The rapid development of GenAI means higher education must reconsider both what it assesses and how. This is an ongoing process — not a one-off adjustment. GenAI-related assessment design should be embedded in regular programme reviews, production and presentation meetings.

Where to start: Re-examine your Learning Outcomes (LOs). For each LO, consider: (1) what evidence students should produce; (2) which skills should be demonstrated without, alongside, or through GenAI; and (3) how GenAI use might support or undermine the intended outcomes.
3.1 Assessment Categories

Every assessment component must be explicitly labelled as one of three categories. By default (if not labelled), assessments are treated as Category 2.

🚫
CAT 1
GenAI Prohibited: Students cannot use GenAI (except as a reasonable adjustment).
When to use: Only in clearly justified, exceptional cases where GenAI would prevent demonstration of essential foundational skills.
⚠ Category 1 is difficult to enforce. Use sparingly and supplement with additional authorship evidence measures.

CAT 2
GenAI Permitted (within defined limits): Most OU assessment falls here. Provide explicit, clear, tool-agnostic guidance on permitted/prohibited uses and acknowledgement requirements.

Typical permitted use patterns:
a) Generating ideas and structuring only
b) Editing the student's own work — not producing new content
c) Completing part of the task, but student must critically evaluate, adapt and integrate the output

CAT 3
GenAI Required: Working with GenAI is itself part of the learning outcome (e.g. critical AI literacy, AI-supported professional practice, or evaluation of GenAI outputs).
Important: Students who object to or cannot access GenAI must be provided with an alternative assessment route.
3.2 Acknowledging GenAI Use

Unless you specify otherwise, students must follow these default acknowledgement requirements:

📚 Reference all AI-generated outputs using the Generative AI (Harvard) style in Cite Them Right. Uncited GenAI outputs = plagiarism.
📄 Summarise GenAI use in an appendix — describe the conversation, include key prompts in quotation marks, cite as personal communication.
💾 Keep a record of the GenAI conversation — students may be asked to provide it if requested by a tutor or Academic Conduct Officer.
💡 Recommendation: Add the GenAI academic integrity declaration template from the Exploratorium to the Assessment tab of your module, for students to submit with each assessment. Full conversation transcripts should NOT be routinely required — a summary appendix is sufficient.
3.3 Designing Robust Assessment Questions

GenAI capabilities evolve rapidly. Use Microsoft Copilot (protected mode) to test drafts — e.g. ask it to 'answer this question as a Level 1 student'. Treat the results as indicative, not definitive.

Questions more challenging for GenAI typically require:

# | Feature | Example approach
1 | Personal, organisational or geographic context not available on the open web | Ask students to draw on their own workplace, community or placement experience
2 | Higher-level cognitive skills: analysis, evaluation, synthesis, creativity | Require students to critique or compare, not just describe
3 | Specific engagement with module materials (beyond simple reproduction) | Ask students to apply module frameworks to a case or data set provided in the question
4 | Justified opinion or specific conclusion, referenced to evidence or theory | 'Argue for a position and anticipate two counterarguments'
5 | Description of process or method, including intermediate reasoning steps | 'Show your working and explain each modelling decision'
6 | Specific reflection on module activities, own work, or project decisions | 'Critique your own earlier TMA draft in light of the tutor feedback received'
7 | Authentic tasks mirroring professional environments, including original data extraction | Interview a client/colleague; analyse data you have collected yourself
3.4 Marking Schemes & Criteria

Marking schemes must explicitly reward academic integrity — accurate sourcing, careful reasoning, appropriate GenAI acknowledgement, and evidence of the student's own thinking.

Review schemes to check that answers which are technically correct but show no engagement with taught material, or which are superficial and generic, cannot gain high marks. Where GenAI use is permitted, give markers clear instructions on:

  • Which module concepts, sources and skills should be evidenced in the main submission
  • How to evaluate GenAI use documented in the appendix (clarity of prompts, quality of AI output, quality of student's critical editing and integration)
Key principle: Stringent, focused marking criteria applied consistently allow tutors to award lower marks for shallow, generic or poorly evidenced answers — regardless of whether the source is the student, GenAI, an essay mill, or any other third party.
3.5 Forms of Assessment

Consider diverse, authentic assessment forms that emphasise process, application and judgement:

Form | Why it helps in a GenAI context
Oral discussions / viva | Tests understanding in real time; hard to outsource to GenAI
Portfolios / reflective commentaries | Emphasise personal learning journey; require student-specific evidence
Group work / presentations | Process observable; contribution more traceable; employability skills
Incremental / staged submissions | Mirrors real-world practice; tutors see work evolving; students must act on feedback at each stage
Project-based / authentic tasks | Situated in real contexts; original data; hard to replicate generically
4. Teaching

Changes to assessment must be accompanied by corresponding changes to teaching. Students need structured opportunities to learn and practise relevant skills before they are assessed.

Preparing Students for GenAI-Aware Assessment

If you introduce oral discussions or viva-style elements, provide preparatory support:

  • Videos of mock discussions or worked examples of good and weak responses
  • Checklists or prompt sheets for participation
  • Low-stakes rehearsal activities in tutorials

For GenAI-aware modules, also include guided practice in:

  • Formulating prompts effectively
  • Critiquing GenAI outputs
  • Using feedback to improve subsequent work
Engaging Students About GenAI
What to explain | How to model it
What GenAI tools can and cannot do — e.g. tendency to hallucinate references, oversimplify, or reflect dominant cultural perspectives | Use subject-specific examples in tutorials
Safe, ethical, policy-aligned use — not entering personal data; not uploading OU materials to non-approved tools; using Copilot in protected mode | Demonstrate in tutorials or short screencasts
How to check accuracy, bias and relevance in GenAI outputs | Analyse outputs with students live in sessions; use the OU Critical AI Literacy framework
Why assessments develop transferable skills — communication, research, critical thinking, ethical judgement | Emphasise what GenAI cannot replace: weighing evidence, value-laden decisions, empathy, lived experience
5. Academic Integrity

The Academic Conduct Guide provides detailed advice on GenAI for tutors, module teams and other staff, including how to mark and when to refer assignments for investigation.

Identifying Possible GenAI Use
⚠ Critical reminder: Stylistic features commonly associated with GenAI (limited personal voice, 'flowery' language, overuse of academic terminology, shifts between first and third person) can also occur in genuine student writing — including from students with autism or specific learning difficulties. Never rely on style alone, or on AI detection tools, as evidence of unauthorised use.
Indicator | What to do | Weight of evidence
Hallucinated / fictitious references — citations that don't exist or consistently fail to match sources | Verify a sample via Library, DOI or catalogue searches. Treat as evidence that the work does not meet expectations for accurate source use. | Strong indicator — but consider alongside other factors
No engagement with module materials; generic or inconsistent reasoning | Consider whether to discuss with the student. Refer under the Academic Conduct Guide if appropriate. | Moderate — consider a combination of indicators
Undeclared AI appendix; inconsistent writing level across the submission | Follow the Academic Conduct Guide and Policy. Consult Faculty-specific guidance for referral criteria. | Moderate — consider a combination of indicators
Stylistic features only (tone, vocabulary, sentence structure) | Do NOT refer based on style alone. Discuss with the student if concerned. | Insufficient alone — do not refer on this basis
Better Than Detecting: Designing Integrity In

As GenAI tools improve, attempting to 'spot' AI use will become increasingly impractical. Staff time is better invested in assessment design that:

  • Makes appropriate, transparent use of GenAI where this supports learning
  • Reduces incentives and opportunities for misconduct
  • Aligns with the Academic Integrity Principles for Assessment Design
  • Follows the Principles of Teaching Academic Integrity in Level 1 modules
6. Contacts & Support

The following teams can provide support. Keeping them informed of changes to your module helps share good practice and ensures GenAI-related activity remains aligned with institutional policies.

Team / Contact | When to contact | How to contact
CLIP (Content, Licensing & Intellectual Property) | Copyright, licensing and legal issues; GenAI-generated content that will become OU copyright material | Via SharePoint intranet page
Data Protection | What data can be shared with which GenAI tools; privacy queries | data-protection@open.ac.uk
Information Security | System configuration, third-party tool security, prompt-injection or phishing risks | information-security@open.ac.uk
Cloud Platform Team | Access to protected Azure OpenAI / ChatGPT environment for teaching, research or scholarship projects | cloud-platform@open.ac.uk
Learning Design Service | Redesigning teaching/assessment activities; embedding Critical AI Literacy; applying design frameworks | LDS-LearningDesign@open.ac.uk (FAO GenAI team)
Academic Liaison Librarians | Developing students' critical GenAI skills; information literacy aspects of GenAI activities | Via module production mailbox or Lib-presentation-{Faculty}@open.ac.uk
AI Steering Group | Institution-wide AI strategy or risk; questions about the OU AI Framework, Responsible AI Policy or GenAI Acceptable Use Policy | AI@open.ac.uk
GenAI in LTA Academic Leads | Questions about this guidance; sharing examples of practice that could inform future updates | GAI-LTA@open.ac.uk
7. Acknowledgements

In 2025, this guidance was written by Michel Wermelinger (Academic co-Lead for GenAI in Learning, Teaching and Assessment) with input from Mirjam Hauck (Academic co-Lead), Jessica Evans and Chelle Oldham (Academic Leads for Academic Integrity), Mychelle Pride (Academic Director, PVC-S), Ian Pickup and Rachel Penny (Pro-Vice-Chancellors, Students), and colleagues from the AI Steering Group, Learner and Discovery Services, Library, Information Rights, and across all Faculties.

Updated by Haider Ali in 2026, with further input from colleagues involved in AI in Learning, Teaching and Assessment, Academic Integrity, Learning Design, Library, and the development of the OU Critical AI Literacy framework and AI glossary resources.

Quick Reference Card
The Three Assessment Categories at a Glance
🚫 CATEGORY 1 — GenAI PROHIBITED
Use only when GenAI would prevent demonstration of essential foundational skills.

Use sparingly. Hard to enforce.
✅ CATEGORY 2 — GenAI PERMITTED
Default category if not labelled. GenAI allowed within clearly defined, tool-agnostic boundaries.

Most OU assessments fall here.
⭐ CATEGORY 3 — GenAI REQUIRED
GenAI use is the learning outcome. Must provide alternative for students who cannot use GenAI.

Integrate into marking criteria.
Safe Use Checklist
  • Only use Copilot Chat (web) in protected mode for OU teaching and assessment work — check the shield icon is visible
  • Never paste student work (TMAs, EMAs, forum posts) into any GenAI tool
  • Never enter personal or confidential data into any GenAI tool without explicit approval
  • Always critically check GenAI outputs for hallucinations, bias and inaccuracy
  • Be transparent with students and colleagues about where GenAI was used in materials
  • Label every assessment component as Category 1, 2 or 3
  • Ensure marking criteria explicitly reward academic integrity and the student's own judgement
  • Provide an alternative route for Category 3 tasks for students who cannot use GenAI