The Three Types of Authenticity Your Mission-Driven Organization Needs to Consider When Using AI

Our inboxes are full of form letters with our names tacked on at the beginning. I get messages daily from politicians increasingly trying to sound like they reached out to me individually. Most of the phone calls I get these days are from my close, personal friend Scam Likely. And on LinkedIn, I’m learning what the details of someone’s messy divorce taught them about B2B sales. These aren’t grave sins by any means, but they don’t exactly feel authentic.

A while ago, I wrote a blog post about authenticity, a topic I described at the time as “a squishy one.” I struggled to get an LLM to help me with it because I didn’t really have a handle on the subject. While drafting Amplify Good Work, I had the opportunity to do a lot more reading and thinking about authenticity, so I decided to revisit the idea armed with more structured thoughts.

Mission-driven organizations have an opportunity to stand out in that crowd with something as simple as a cup of coffee, a phone call, or even a human-written, personalized email. What people are missing in a world full of mass communication and spam texts is authenticity. And you, a human leader working alongside human staff, can offer that.

But this blog is about AI! Famously not an authentic human connection. How can AI help us be authentic at work? 

Authenticity Isn't Just One Thing

When we talk about authenticity in organizational contexts, we're talking about three distinct types. Understanding the difference helps you figure out where AI can help and where it hurts.

1. Operational Authenticity

Operational authenticity is about alignment between your stated values and actual practices. It's the difference between what you say you believe and what you actually do.

Consider the (hypothetical) environmental nonprofit with dozens of plastic water bottles in the trash. The labor advocacy organization with terrible working conditions. The mental health organization that expects 60-hour weeks. These are operational authenticity failures.

AI can threaten operational authenticity when your technology choices contradict your values. If you advocate for workers' rights but use AI to replace jobs without consideration, that's a problem. If you champion transparency but hide your AI use, that's another red flag.

But here's the interesting part: AI can actually help you identify operational authenticity gaps. You can upload your mission statement, employee handbook, and program descriptions to an LLM and ask: "We want to make sure our practices are aligned with our values and that our community can see that. Where could our practices appear to conflict with our stated values?" The AI won't pull punches the way your stakeholders might.
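If you want to run that kind of audit outside a chat window, here is a minimal sketch of what it could look like in code. It assumes your documents are saved as plain text files and that you are using Anthropic's Python SDK with an API key already configured; the file names and model name are placeholders, and the same prompt works just as well pasted into any chat interface along with the documents.

# A minimal sketch of an operational-authenticity audit via an LLM API.
# Assumes the Anthropic Python SDK (pip install anthropic) and an API key
# in the ANTHROPIC_API_KEY environment variable; file names are placeholders.
from pathlib import Path

import anthropic

# Gather the documents you want the model to compare against one another.
documents = {
    name: Path(name).read_text()
    for name in [
        "mission_statement.txt",
        "employee_handbook.txt",
        "program_descriptions.txt",
    ]
}

prompt = (
    "We want to make sure our practices are aligned with our values and that "
    "our community can see that. Where could our practices appear to conflict "
    "with our stated values?\n\n"
    + "\n\n".join(f"--- {name} ---\n{text}" for name, text in documents.items())
)

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
response = client.messages.create(
    model="claude-opus-4-1",  # placeholder; use whichever model you have access to
    max_tokens=2000,
    messages=[{"role": "user", "content": prompt}],
)

print(response.content[0].text)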

2. Representational Authenticity

Representational authenticity is whether your communications and materials accurately represent reality: your organization, its work, and its impact.

Picture a food bank website with stock photos of perfectly diverse models in designer clothes pretending to consider a can of creamed corn. Or an annual report with AI-generated images of "community events" that never happened. These violate representational authenticity.

The tricky part is that AI makes it incredibly easy to create polished, professional-looking content. You can generate testimonials, create synthetic (and therefore anonymous) photos of “volunteers,” or produce case studies that accidentally include hallucinated program features, statistics, or events.

Representational authenticity matters more for mission-driven organizations than for businesses. When a TV ad uses a stock photo, or an Amazon listing has photoshopped images in which the same person in the same pose is wearing two different shirts, we don’t blink an eye. When your homeless shelter does it, people wonder if you're actually connected to the community you claim to serve.

3. Relational Authenticity

This is the big one for many organizations: the genuineness of connections between you and your stakeholders. Does this interaction feel personal, human, and sincere? Or does it feel automated and performative?

That "Dear /FirstName" email? Relational authenticity fail. An AI chatbot pretending to be a crisis counselor? Much bigger problem.

Context matters enormously here. An AI avatar for a weather reporter standing in a hurricane? Probably fine—we don't need someone risking their safety to prove to us that yes, the hurricane is still happening. But if an AI avatar made a phone call to your biggest donor? That sends a very different message about how much you value those relationships.

Using AI to Support (Not Sabotage) Authenticity

The good news is that thoughtful AI use can actually strengthen authenticity:

For Operational Authenticity:

  • Use AI to audit your practices against your values

  • Free up staff time from repetitive tasks so they can focus on mission-critical human work

  • Be transparent about your AI use in your policies and communications

For Representational Authenticity:

  • Use AI to fact-check your reports and communications

  • Generate first drafts, but always add real examples and specific details

  • Clearly label AI-generated images or content where representational authenticity could be at risk

For Relational Authenticity:

  • Use AI for backend processes, not front-line relationships

  • When you do use AI for communication, be transparent about it

  • Redirect some of the time AI saves you so human staff can reach out to donors, including the smaller-dollar donors you might not normally be able to prioritize

The Bottom Line

Authenticity isn't just nice to have for mission-driven organizations: it's fundamental to trust. And trust is what stands between you and donors who just ask AI where to give, volunteers who find other places offering genuine connection, and community members who question whether you really understand their needs.

The organizations that will thrive with AI aren't the ones that use it everywhere or nowhere. They're the ones that thoughtfully consider which type of authenticity matters most in each situation and make their technology choices accordingly.

What authenticity challenges has your organization faced with AI? I'd love to hear your experiences in the comments.

LLM disclosure: I used Claude Opus 4.1 to help draft this post. I gave it an up-to-date draft of my book manuscript and asked it for three ideas for blog posts I could write. In the same prompt, I tried asking it to read my blog to make sure the ideas didn’t overlap with past posts; it struggled. I have gotten better results asking it to pull the past post topics first, then submitting a separate prompt asking it to rely on that information. I liked the authenticity idea because the three types of authenticity had stood out to me while I was writing that chapter. The intro attracted the most editorial attention: it was terribly cheesy and not very compelling, so I deleted it and started over.
