Hey funders, don’t freak out about AI-supported grant proposals

[Image description: Two meerkats standing close to one another, looking to our right. One meerkat has their hand on the shoulder of the other one. Between them there is a warm light, turning their fur a golden hue. Image by quhl on Pixabay]

A while ago, a colleague mentioned a funder who required a whole-ass grant proposal from their current grantees to renew their grant every year. Requiring a full proposal for renewal is very annoying, but common enough, like the philanthropic equivalent of pinkeye.

This one funder, however, specified that grantees could not copy and paste information from previous years' proposals. This forced grantees every year to spend time rewriting their mission statements, community needs, program descriptions, evaluation methodologies, budget narratives, etc. using different words and phrases, even though most of that information remains the same.

If I ever wake up in the middle of the night recalling the name of this foundation (or if you know them), I will be sure to publicly bestow upon them the glorious title of “Crappy Funder of the Milky Way Galaxy.” They would get a plaque they could display for being a condescending, time-wasting fopdoodle. That’s right, I said fopdoodle, because only Old English can convey how archaic and vexing this funder is.

With AI-supported grantwriting platforms like Grantable increasingly being used in our sector, nonprofits can better deal with clueless funder malarkey and shenanigans like the above. Someone can paste answers from last year’s grant proposal into ChatGPT, for example, and ask it to paraphrase, saving them time and energy that can be used on much more important work, such as running programs or turning some milk crates into a makeshift filing system.

With the rise of AI-assisted proposals, I bet some funders are experiencing a combination of fear, anxiety, and possibly resentment. With the barriers being lowered, they may be flooded with more grant proposals. They may no longer be able to judge and discard proposals based on ridiculous and inequitable criteria like spelling and grammar. They may use AI themselves to read, summarize, and possibly even select proposals for funding, which brings with it some ethical considerations of its own. This article from our colleagues in Australia provides a good summary of how many funders will respond to increasingly AI-generated grant applications.

However, I am less interested in how funders WILL respond, and more in how they SHOULD. While there are tons of crappy, fustilarian funders in our sector, there are plenty of amazing ones. For those who are trying to be thoughtful about the increase in AI-generated content and what that means for their work, here’s some advice:

Let go and have a Zen-like acceptance of a new normal: Generative AI is a tool, and if it’s a useful tool, people will use it. You can resist and fight it, or you can accept this as what people will be doing from now on. And it could be a good thing. Funders get all the information they need in the formats they want, while nonprofits conserve their time and energy to focus on helping people, saving democracy, or anything else that’s more important than counting words and characters and other menial, pointless tasks that most grants require. That’s a good thing.

Refocus on your value of Equity, Diversity, and Inclusion: Many funders tout Equity as one of their values. AI, when it’s used with intention, can help advance this. As has been mentioned many times over the years, grants are inherently inequitable, tending to reward the organizations that have more fundraising resources and can play the game most effectively. This leaves out many organizations led by marginalized communities, who often can’t compete. AI-supported grantwriting can balance things out a bit. Funders who value DEI should find ways to work with this.

Recalibrate your criteria and parameters: If you’re freaked out that you may receive significantly more grant proposals now that it’s easier to produce them, then spend more time refining your parameters and criteria, especially around eligibility. Use your equity lens to ensure funding goes to the communities that are most affected by inequity and injustice. If you care about Black communities, for example, ensure your grants are only accessible to Black-led organizations, based on criteria like their ED(s) being Black, along with 75% of their board members and 75% of their senior staff, or whatever makes most sense. When criteria are grounded in equity, it won’t matter that proposals are AI-generated, because only so many organizations will be eligible in the first place.

Reimagine your grantmaking process and accept the One Proposal: AI just further illustrates how silly and burdensome most grant processes are. Our sector has wasted enough time on grants. Even having nonprofits input prompts into AI grantwriting engines is a ridiculous use of people’s time, considering it’s all the same regurgitated answers, but tailored to funders’ various silly snowflake preferences (“Describe your goals, metrics, and evaluation plan, in 250 characters”). Every nonprofit should just have one grant proposal that they use for every funder, with no tailoring whatsoever. If you need something clarified, ask the organizations directly.

Don’t punish nonprofits for using AI: There’s talk of funders using technology to identify when nonprofits are using AI to write grant applications, probably to weed out those who do. What’s the point of doing that? Grants are not college essays or news articles, where it matters who actually does the writing. Grants are a tedious mechanism for delivering answers about an organization and its work. AI just makes it less tedious. Punishing nonprofits for using AI is petty and paternalistic.

If you’re using AI yourself, be intentional about equity: As Beth Kanter and Allison Fine mention in their book, The Smart Nonprofit, racial and other biases and prejudices are built into algorithms. Funders who use AI to write grant RFPs, read proposals, eliminate applications, come up with a list of grant finalists, or whatever, should be aware that AI engines, which are mostly designed by white dudes, will likely favor white-coded proposals. It will be interesting to see the dynamics between AI-generated grant proposals and AI-supported grant review and selection. To keep AI from reinforcing inequity, both funders and nonprofits need to be aware of the biases that are built into these tools.

Whether funders like it or not, AI is here, and it will be used to write grant proposals. It will save many organizations a lot of frustration and grief that often come with traditional grantmaking practices that have been inflicted upon the sector over decades. Funders can use their energy to resist this, or use this as an opportunity to reassess inefficient and inequitable granting processes and work with nonprofits on a more meaningful level to tackle systemic issues plaguing our communities.

Related post: The Ethics and Opportunities of Artificial Intelligence in the Nonprofit Sector

Please donate to support the people and communities affected by the wildfires in Maui. Here’s a great list of organizations to donate to, as well as other ways to help.
