Ethical AI for Co-Production in Charities
- Helen Vaterlaws

- Oct 2, 2025
- 3 min read

If you are leading co-production in research or service design, you are likely already navigating the "AI dilemma": the promise of efficiency versus the priority of trust. Live captions, auto-transcription, and smart summaries promise time back and wider access, but they also raise vital questions about consent, safety, ethics and data privacy.
For co-production, AI should be an amplifier, not an autopilot. It can widen access and reduce the administrative burden, but it can't replace human lived experience. This guide focuses on one core principle: how to use ethical AI for co-production in a way that protects relationships and keeps people firmly in control.
Defining Ethical AI for Charity Co-production
When I talk about ethical AI in this context, I mean:

- AI as a supporting tool, not the decision-maker.
- Transparency: Participants know when it’s being used and can opt out without penalty.
- Human oversight: Any AI output is checked and shaped by humans, specifically the participants themselves.
- Data privacy & sovereignty: Data is minimised, protected, and (where possible) kept out of any model-training pipelines, verified through vendor terms and settings.
If these pieces aren’t in place, the implementation is unlikely to be robust enough for ethical co-production.
Accessibility Benefits: Widening the Circle
Used carefully, AI has the potential to help more people take part in co-produced research and design. Ethical uses include:

- Live captions: Improve online workshops for deaf and hard-of-hearing participants, people in noisy environments, or those joining via mobile devices.
- Easy-read rewrites: Generate draft versions of consent forms and briefings in plain language, which are then human-verified with participants.
- Multilingual summaries: Provide key points in a participant's preferred language so they can engage fully before or after a session.

Watch out for bias: AI transcription can struggle with conversational nuance and emotional context. Always ensure a human who was in the room performs the final verification of any transcripts.
Operational Efficiencies: Freeing Up Human Space
Ethical AI can reduce the administrative burden, allowing your team to focus on the human elements of co-production.
- Auto-transcription with human-checked themes: Use tools to generate a fast first-pass draft, then perform the thematic analysis with participants. Treat AI as a starting point, not the final version.
- Action capture: Use tools to handle scheduling and turn workshop discussions into shared action lists that everyone can see and update in real time.
The goal is to reduce busywork so that more of your time is spent listening, reflecting, and deciding together.
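If your team builds its own notes pipeline, the "fast first-pass draft, human-verified" idea can be made concrete in code. This is an illustrative sketch only, not a real tool: the `Segment` type, the helper functions, and the participant quotes are all hypothetical. The point it demonstrates is that nothing leaves the project team until someone who was in the room has verified it.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Segment:
    """One chunk of an AI-generated transcript draft."""
    speaker: str
    text: str
    verified: bool = False  # stays False until a human confirms it

def verify_segment(seg: Segment, corrected_text: Optional[str] = None) -> Segment:
    """Record a human check; optionally apply a correction first."""
    if corrected_text is not None:
        seg.text = corrected_text
    seg.verified = True
    return seg

def shareable(transcript: List[Segment]) -> List[Segment]:
    """Only human-verified segments may leave the project team."""
    return [s for s in transcript if s.verified]

# Hypothetical first-pass draft from an auto-transcription tool
draft = [
    Segment("P1", "We need more flexible session times."),
    Segment("P2", "Transport costs are the main barrier."),
]
verify_segment(draft[0])          # checked by someone who was in the room
print(len(shareable(draft)))      # → 1: only the verified segment is releasable
```

The design choice here is that verification is opt-in per segment, so an unreviewed AI draft can never be shared by accident.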
6 Essential Guardrails for AI in Lived Experience Projects
To keep co-production safe and trustworthy, build these guardrails into your project design from the start:
- Informed Consent: Be explicit in briefings about which AI tools are being used (e.g., for transcription or summarising) and exactly why they are being used.
- Model Training Policy: Use tools and settings that explicitly opt out of model training (and verify this in the vendor’s terms and configuration). Keep use aligned with your data protection and safeguarding policies.
- Right to Opt Out: Always offer a non-AI alternative (such as manual note-taking or human interpreters) for those who prefer it.
- Data Minimisation: Redact names and identifiable details by default. Only keep what is strictly necessary for the project.
- Data Deletion: Be clear what withdrawal means in practice, and ensure your workflow can remove or anonymise a participant’s identifiable data where feasible (in line with your retention and safeguarding requirements).
- Human Validation: Ensure participants or an advocate group validate AI-generated summaries before they are shared.
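Data minimisation in particular lends itself to a default-on step in any notes workflow. Below is a minimal, hypothetical sketch assuming you already hold a mapping from participant names to codes; a real pipeline would also need to catch emails, phone numbers, and indirect identifiers.

```python
import re

def redact(text: str, participants: dict) -> str:
    """Replace known participant names with pseudonym codes before storage.

    `participants` maps real names to codes, e.g. {"Amira Khan": "P1"}.
    This is an illustrative minimal pass, not a complete anonymisation tool.
    """
    for name, code in participants.items():
        # word boundaries (\b) avoid clobbering partial matches inside other words
        text = re.sub(rf"\b{re.escape(name)}\b", code, text, flags=re.IGNORECASE)
    return text

notes = "Amira Khan said the venue was hard to reach."
print(redact(notes, {"Amira Khan": "P1"}))
# → "P1 said the venue was hard to reach."
```

Running redaction by default, before notes are stored or shared, means the guardrail does not depend on anyone remembering to apply it later.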
Change doesn’t start with a workshop; it starts with one honest conversation.
Note: These insights are based on practitioner experience and do not constitute legal or regulatory advice. Always review your specific funder contracts, data protection policies (GDPR) and safeguarding policies before making significant changes to operations. Examples are for illustrative purposes only; no official affiliation with the organisations or tools mentioned is claimed.

