
Map the Trust, Then Buy the Tool: The Relational Core Strategy for Charity AI (Part 2/4)

  • Writer: Helen Vaterlaws
  • Jan 28
  • 4 min read


[Image: Two volunteers smiling at the camera, illustrating 'relational scaffolding': human trust as the primary currency in charity AI adoption, in contrast to tech-led failures.]

If you feel hesitant about AI, it may be because it isn’t like the software we’ve bought for the last 20 years. Traditional software follows rigid if-this-then-that rules; AI is non-deterministic and probabilistic, meaning the same input can produce different outputs. That shift, from rules to probabilities, is where a governance challenge appears: a senior leader’s assumption of a tidy automated process meets the messy, unpredictable reality of the frontline.


While the technology is evolving rapidly and better prompting can reduce errors, we need to manage the risks today to safely unlock the potential benefits. Right now, AI can still hallucinate, mirror bias, or miss the emotional nuance a human volunteer catches instantly. That is why mapping your relational core is not a cultural nicety; it is the technical safety requirement that allows you to innovate with confidence.


In a charity, people are the human-verification layer: they catch statistical errors before they reach a vulnerable person. Mapping trust networks isn’t about slowing innovation; it increases the probability that your AI investment will succeed because the tool is built to support the people who keep your mission safe.


The Hidden Infrastructure: Bringing 'Grey Tech' into the Light


[Image: Woman on a train working on a charity AI strategy.]

To map the relational core, we must be honest about where it currently lives. Often that includes some element of “grey tech”. Grey tech is the informal infrastructure charities use to get things done: messaging groups for quick coordination, spreadsheets kept on personal desktops to track work, or the occasional use of free AI tools where official systems feel too slow.


Staff rarely adopt grey tech to be defiant. They do it because they are trying to bridge a rigid formal system and the urgent, messy needs of beneficiaries. That intent matters. However, resourcefulness is not the same as security: if mapping shows sensitive data stored outside sanctioned systems, that creates real risks and must be addressed. The risk isn't just where data is stored, but where it is processed. Pasting a confidential case note into a free, public AI tool for summarisation can expose that data to the public model. Where unintentional breaches are identified, follow your data-protection protocols and consult legal or data-protection advisers.


The goal is to uncover the human intent behind workarounds and move that work into safer, sanctioned environments that preserve both speed and care. For guidance on data protection and ethics, refer to the Information Commissioner's Office (or your regional equivalent).


Rapid Mapping: A 4-Step Guide to Surfacing Trust Networks


Mapping the relational core isn’t a luxury, and the data already exists; it’s just hiding in people’s heads. All you need is a partnership between frontline staff and operations to look at the messy parts of your work together.


The “Sticky Note” Test (Observation)


  • Concept: Identify where current tools limit your team’s capacity for direct impact.

  • Action: Run a 60–90 minute workflow discovery session. Ask staff to map a single process and place a red sticky note on every step where the technology creates a bottleneck. If remote, use a digital whiteboard.


Goal: Find specific friction points where AI could provide relief, rather than adding another layer of software.




The “Who Do You Call?” Audit (Mapping)


  • Concept: Identify the human bridges whose intuition compensates for technological gaps.

  • Action: In your next team meeting, ask: “When the system fails, who do you call for a temporary solution?” Map these informal connections. These individuals are your most important stakeholders for any pilot.


Goal: Ensure a new AI tool doesn’t sever the informal networks that keep services resilient.




The “Drudgery vs. Delight” List (Co-Design)


  • Concept: Separate administrative burden from work that requires human empathy.

  • Action: Have staff sort tasks into two buckets:

    Bucket A (Drudgery): repetitive data entry, shift scheduling, etc.

    Bucket B (Direct Impact): complex advocacy, assessment calls, etc.


Goal: Protect the relational core (Bucket B) while targeting AI at the cognitive load (Bucket A) that drives burnout.




The “3am Rule” (Governance)


  • Concept: Establish clear human accountability before any tool goes live.

  • Action: Ask: If this tool fails at 3am, does the on-call manager know how to suspend the service without vendor support? Is there a named person responsible for escalation?


Goal: Prevent an operational hiccup from becoming a reputational crisis by ensuring human oversight remains the final arbiter.




Reality Checks (read before you start)


  • Not one-and-done: Revisit these maps regularly. As AI evolves, the line between drudgery and delight will move.


  • Scale the method: For a small charity, a 20-minute chat with a lead volunteer may be enough; large organisations should pilot one department first.


  • Align obligations: Always align pilots with safeguarding, funder contracts and data-protection duties. Consult legal or data-protection advisers on high-stakes decisions.



Read the full AI adoption for charities series



I’m heading to UNESCO House in Paris, February 2026 for the International Association for Safe & Ethical AI's second annual conference. I’ll be sharing free notes from the event on LinkedIn for those interested.



Note: These insights are based on practitioner experience and do not constitute legal or regulatory advice. Always review your specific funder contracts and data protection policies (GDPR) before making significant changes to your data collection or retention schedules. Examples are for illustrative purposes only; no official affiliation with the organisations or tools mentioned is claimed.

© 2026 Insights2Outputs Ltd. | All rights reserved | Privacy Policy

Disclaimer: This content is provided for informational and illustrative purposes only. It does not constitute professional advice and reading it does not create a client relationship. Always obtain professional advice before making significant business decisions.
