Ethics & LLMs: Effort and Craft
Imagine this: You attend a town hall about a proposed development in your neighborhood. You ask a thoughtful question about traffic safety, concerned about your kids walking to school. The city official gives what sounds like a comprehensive response about impact studies and mitigation measures. Later, you discover that response was generated by AI.
How would you feel?
If you're like most people, you'd feel a bit betrayed. Not because the information was wrong—it might have been perfectly accurate. But because the message you received was: "Your concerns weren't important enough for a human to think through."
The Economics of Effort
We've always valued things that cost someone time, energy, and skill; this isn't an AI-specific phenomenon. Have you ever gotten a card in the mail that looks handwritten, only to look closer and notice that every “e” is exactly the same and you can see the printer dots on the letters? Have you ever admired a drawing on a wall and then realized it was a photograph with a stock charcoal filter applied? How about a book cover that looks nice, until you realize it’s a template from Canva? These things feel cheap because we're comparing them with something that took skill, time, and energy to make.
In mission-driven work, this dynamic is even more critical. When nonprofits ask for donations, when governments ask for trust, when social enterprises ask for support, they're essentially saying: "This cause is so important that we're dedicating our lives to it." The effort invested in research, in crafting communications, and in program design signals that both the cause and the stakeholders matter.
That handwritten thank-you note from the executive director? The hours spent researching the perfect program design? The careful attention to each grant proposal? These aren't just inputs to production—they're signals of care, commitment, and respect.
The AI Dilemma
Here's where things get complicated. AI can now produce outputs that rival or exceed human quality with a fraction of the effort. A perfectly crafted grant proposal in 20 minutes. A comprehensive policy brief in an hour. Personalized donor communications at scale.
This breaks our traditional effort-value equation. When high-quality work becomes effortless, how do we signal that something—or someone—matters to us?
The "cutting corners" perception is real and growing. As people become more aware of AI capabilities, they're increasingly looking for "tells": signs that something was automated rather than crafted. Mission-driven organizations find themselves in an awkward position: potentially being "caught" using tools that might actually help them serve their mission better.
The False Choice
But here's the thing: not all effort creates equal value. There's a difference between meaningful effort that builds relationships and develops insights, and performative busywork that just eats up time.
The real question isn't whether to use AI or not. It's how to use AI strategically to preserve the effort that matters while eliminating the effort that doesn't.
Think about it this way: Would you rather have your favorite nonprofit spend 20 hours formatting reports, or 20 hours building relationships with the community they serve? Would you rather have city planners spending their time fighting with spreadsheets, or engaging with residents about their concerns?
AI's greatest value might be freeing humans to invest effort where it creates the most meaning and impact.
Where Effort Still Matters Most
So where should organizations protect human effort?
Relationship-building work: Personal communications with major stakeholders, client interactions, community engagement. When someone feels heard and valued, that often requires genuine human investment.
Complex judgment calls: Strategic decisions, ethical considerations, understanding nuanced community needs. These require the kind of deep thinking and contextual understanding that comes from effortful engagement.
Visible, symbolic work: The things stakeholders see and judge. A founder's letter to donors, public-facing communications about organizational values, responses to community concerns.
Meanwhile, AI can handle the behind-the-scenes work: research synthesis, first drafts, administrative tasks, data analysis. This isn't cutting corners; it's strategic allocation of scarce human attention.
The Transparency Question
Should organizations acknowledge when they use AI? There's no universal answer, but here's a framework: Be transparent when it builds trust, and preserve authenticity where it matters most.
Using AI to research background information for a report? Probably fine to keep that internal. Using AI to write the executive director's personal letter to major donors? That might warrant a different approach.
The key is never misleading stakeholders about what they're getting, while also not undermining the value you're trying to create.
The Path Forward
The organizations that get this right will understand that effort isn't just an input; it's a signal. They'll use AI to eliminate wasteful effort while amplifying meaningful effort. They'll spend less time on busywork and more time on the relationships and insights that drive real impact.
This isn't about being more or less efficient. It's about being more intentional with the scarcest resource any mission-driven organization has: the time and attention of people who care enough to dedicate their careers to making the world better.
The goal isn't to preserve inefficiency for its own sake. It's to protect the meaningful effort that builds trust, develops capabilities, and signals organizational values while freeing up resources to do more of what actually matters.
In the end, your stakeholders don't just want to know that you got the job done. They want to know that you cared enough to try. AI can help you try better, not try less.