
We’re 39 percent similar; how can we be exponentially better?

Image from TAG stating a random grant form is 39% similar to any other grant form.

Self-examination was the zeitgeist of philanthropy in 2021. We reflected. We adapted. We listened to the “rising voices of the people we serve,” as a foundation staffer framed it recently. At times those voices were passionately personal, at others they were mandates delivered through data. One such data story is this: Though long suspected, we finally have the data to confirm that 39 percent of our grant applications are duplicative across funders. Recent evidence from the Technology Association of Grantmakers (TAG) indicates that grantmakers are asking the same questions with slight differences across funders, wasting a significant amount of a nonprofit’s time during the application process. This waste adds to a nonprofit’s overhead, distracting from the impact sought by its mission and inflating its unfunded operational costs.

Holding questions about power and privilege in mind, is it still justifiable that such waste is required of nonprofits? What might funders do in 2022 to begin addressing this mismatch between our stated values and the reality of our operations?

In this blog post, we’ll review the research results, discuss implications of being 39 percent similar, and identify the soul-searching needed to move forward in new alignment with our values.

Want to see more? Skip to the video to watch a short conversation between Chantal Forster, Candid’s C. Davis Parchment, and data scientist Kwame Porter Robinson about the research results, implications, and advice for next steps.

Screenshot of Kwame Porter Robinson YouTube video

Analysis of grant applications from more than 130 funders

Here’s the backstory: In the summer of 2021, over 130 funders shared example grant applications with TAG as part of our involvement in the #FixtheForm campaign with GrantAdvisor.org. These grant forms enabled TAG to not only determine the level of overlap between questions (39 percent) but also to identify groups of shared questions. Thanks to the generosity of these funders, we were able to identify 13 groups where duplicative questions are present using several human-guided machine learning techniques.
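To make the approach concrete, here is a minimal sketch (in Python, using made-up example questions, not TAG’s actual pipeline) of how free-text form questions can be grouped by similarity before human review:

```python
# A minimal sketch, with made-up questions, of how free-text grant-form
# fields can be grouped by similarity. TAG's actual analysis layered
# human review on top of techniques like this.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

questions = [
    "What is your organization's mission?",
    "Briefly describe the mission of your organization.",
    "What is the total amount requested?",
    "How much grant funding are you requesting?",
    "How has COVID-19 affected your programs?",
]

# Represent each question as a TF-IDF vector of its words.
vectors = TfidfVectorizer(stop_words="english").fit_transform(questions)

# Cluster the vectors; near-duplicate questions land in the same group.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for label, question in sorted(zip(labels, questions)):
    print(label, question)
```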

Below are the 13 question groups and the percent of grant form fields that fall within each:

  1. Organizational Biographical and General Information (18 percent)
  2. Miscellaneous (3 percent)
  3. Corporate Delegation and Oversight, Organizational Structure (5 percent)
  4. Data Handling, Overview, Measurement, Evaluation and Reporting (4 percent)
  5. Project Demographics/Orientation/Status (2 percent)
  6. Alternative Supports (<1 percent)
  7. How Did You Hear of Us (<1 percent)
  8. How has COVID-19 Impacted Your Work (<1 percent)
  9. Organizational Budgeting, Revenue Practices, and Forecasts (20 percent)
  10. Collaborative Partnerships and Community Support (5 percent)
  11. Requested Grant Funding Related (20 percent)
  12. Time Spent Filling Out the Form (<1 percent)
  13. What the Organization Does (22 percent)

Within each group are form fields (i.e., questions) that are similar and shared by many funders. For example, here are the fields within the “Organizational Biographical and General Information” group that are common across funders:

Organizational Biographical and General Information
Provide the following information about the organization and staff. Where contact information is requested, provide (if applicable) Name, Title, Address, Email Address, Phone Number.

  • Professional References/Testimonials (others who can speak on the organization’s behalf)
  • Qualitative Staff Characteristics (project-related experience, shared backgrounds, executive biographies, etc.)
  • Applicant Contact Information
  • Contact Person Contact Information
  • Intern Roles (if any)
  • Key Contacts Contact Information
  • Key Program Staff Resumes/CVs
  • Organization’s Contact Information
  • Organization’s Donation Website
  • Organization’s DUNS Number and/or GuideStar profile
  • Organization’s Fax Number
  • Organization’s Founding Date/History
  • Organization’s Legal Status
  • Organization Social Media (Facebook page, Instagram page, Twitter page, etc.)
  • Primary Applicant Contact Information
  • Staff Contact Information
  • Staff Demographics (Age, Ethnicity, Gender, Immigration Status)
  • Staff Qualifications
  • Total Number of Part-Time Workers
  • Total Number of Full-Time Workers
  • Total Number of Paid Staff
  • Total Number of Staff Hours

Even a quick scan of the form fields above will reveal that much of this information is available from any public repository of 990 forms filed annually by US nonprofits. As such, it’s a fair question to ask: Why do foundations continue to ask for this information when they might instead use technology to pre-populate applications from public sources?
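As a sketch of what that pre-population could look like, the snippet below pulls basic organizational details from ProPublica’s public Nonprofit Explorer API and maps them onto two of the common fields above; the EIN and the response field names are illustrative and should be checked against the API’s documentation before relying on them:

```python
# A hedged sketch of pre-filling common fields from a public 990 data
# source. ProPublica's Nonprofit Explorer API is used as one example
# endpoint; the EIN and the response field names are illustrative and
# should be verified against the API's documentation.
import requests

EIN = "000000000"  # placeholder; replace with the applicant's real EIN

resp = requests.get(
    f"https://projects.propublica.org/nonprofits/api/v2/organizations/{EIN}.json",
    timeout=10,
)
resp.raise_for_status()
org = resp.json().get("organization", {})

# Answer common application fields from public data instead of re-asking.
prefilled = {
    "Organization's Contact Information": {
        "name": org.get("name"),
        "address": org.get("address"),
        "city": org.get("city"),
        "state": org.get("state"),
    },
    "Organization's Founding Date/History": org.get("ruling_date"),
}
print(prefilled)
```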

Examining each of the 13 groups identified in TAG’s research reveals a similar pattern: Each group contains shared fields that are prime candidates for reduction, standardization, and automation. View the full set of results here.

Activating our values in 2022

Over the past two years, philanthropy has asked important questions about equitable access—access to adequate funding, access to digital tools, access to power. The research conducted by TAG highlights an important corollary: What does it look like to begin answering these questions at the operational level?

There is a long history of questioning how to address inefficient processes in the charitable sector, including grant application and reporting processes that siphon precious time away from nonprofits doing important work. Auspiciously, this research suggests there are common-sense remedies. Grantmakers could develop a “common core” of application/reporting questions that could be auto-populated from shared data repositories, requiring less effort from nonprofits. “Now is the time for us to challenge the barriers we have created as a sector,” says John Mohr, CIO of the MacArthur Foundation. “It’s time for us to develop creative solutions such as a philanthropic data commons and tools that will ease the burden on nonprofits, increase access to capital, and eliminate barriers to opportunity.”

Implementing those solutions, however, requires the charitable sector to change in fundamental ways. Specifically, we will need more:

  1. Humility to recognize that systems and processes like duplicative and onerous grant applications and reporting forms are structural impediments for nonprofits.
  2. Willingness to collaborate with other funders, infomediaries, and product providers in finding shared technology solutions that ease nonprofits’ burden.
  3. Appetite to invest in shared solutions and infrastructure.

How to get started now

If you are an infomediary… consider mapping the common fields identified here to any data fields available via Application Programming Interface (API) in your public database. Publish the mapping publicly to incentivize adoption by product providers and grantmakers who may have custom solutions.
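For illustration, such a published mapping could be as simple as a short JSON document. In the sketch below, the field names on the left come from the question groups above, while the API paths on the right are hypothetical placeholders rather than any real schema:

```python
# Illustrative only: a small, publishable mapping from common grant-form
# fields (drawn from the question groups above) to fields an infomediary
# might expose via its API. The API paths are hypothetical placeholders,
# not a real schema.
import json

COMMON_FIELD_MAP = {
    "Organization's Legal Status": "organization.exempt_status",
    "Organization's Founding Date/History": "organization.ruling_date",
    "Organization's Contact Information": "organization.address",
    "Total Number of Paid Staff": "filings.latest.employee_count",
    "Organizational Budgeting, Revenue Practices, and Forecasts": "filings.latest.total_revenue",
}

# Publishing the mapping as JSON makes it easy for product providers and
# grantmakers with custom systems to adopt.
print(json.dumps(COMMON_FIELD_MAP, indent=2))
```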

If you are a product provider… work with your teams to brainstorm ways that your software might pre-populate and automate grant applications from 990 databases or a funder’s own data stores. Consider the creation of a baseline grant template for your customers that leverages shared questions for at least 40 percent of the application and pre-populates data using APIs from public data sources.
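One possible shape for that baseline template, sketched below with illustrative question and field names, is a common-core question list filled from whatever data the provider already holds, leaving only funder-specific questions for the applicant:

```python
# A minimal sketch of a baseline grant template: common-core questions are
# pre-populated from data the provider already holds, leaving only the
# funder-specific questions for the applicant. All names are illustrative.
from typing import Dict, List, Optional

BASELINE_QUESTIONS: List[str] = [
    "Organization's Contact Information",
    "Organization's Legal Status",
    "Total Number of Paid Staff",
    "What does your organization do?",
    "How will the requested funds be used?",  # funder-specific, left blank
]

def prefill(template: List[str], known: Dict[str, str]) -> Dict[str, Optional[str]]:
    """Fill any question we already have an answer for; leave the rest empty."""
    return {question: known.get(question) for question in template}

# Data pulled earlier from a 990 repository or the funder's own records.
known_data = {
    "Organization's Contact Information": "123 Main St, Springfield",
    "Organization's Legal Status": "501(c)(3) public charity",
    "Total Number of Paid Staff": "42",
}

application = prefill(BASELINE_QUESTIONS, known_data)
answered = sum(value is not None for value in application.values())
print(f"{answered} of {len(application)} questions pre-populated")
```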

If you are a nonprofit… share the research findings with your funder network, being candid if possible about your experience with the application process.

If you are a grantmaker… consider launching a cross-functional taskforce in your organization (including members from programs, administration, and IT) to review this research and determine willingness to engage in aligned cross-sector action. For example, grantmakers could commit to working with their grants management system provider to leverage existing data repositories that contain nonprofit data.

Join a working session on the implications of these findings and the possibility for a sector-led response on January 26.

About the TAG study

I’m deeply grateful to the 133 grantmakers in the US, Canada, EU, and the UK who shared example grant applications with TAG. Forms were collected as part of the “100 Forms in 100 Days” campaign conducted by TAG and GrantAdvisor.org. View the full list of participating organizations here and an overview of organization types below.

Figure 1: Participation by type of grantmaking organization (bar graph)
Figure 2: Participation by annual grantmaking budget (bar graph)

Data analysis on the forms was conducted by data scientist Kwame Porter Robinson, a PhD student specializing in human interaction with artificial intelligence (HAII) and natural language understanding (NLU) at the University of Michigan’s School of Information. In his four-part analysis, Robinson leveraged a combination of human-guided machine learning, clustering techniques, and corpus-based similarity analysis drawn from long-standing semantic similarity and information retrieval research. The resulting analysis illustrates that “typically, a random grant form is 39 percent similar to any other grant form,” says Robinson, “although wide variation is possible, from 0 percent to up to 93 percent similarity depending upon the forms under comparison.” You can find Robinson’s scripts used in this analysis here.
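For readers who want a feel for the method, here is a rough, simplified sketch of corpus-based form-to-form similarity; it is not Robinson’s actual code, and the example forms are invented:

```python
# A rough sketch of corpus-based form-to-form similarity (not Robinson's
# actual scripts): each field in one form is matched to its most similar
# field in the other, and the best-match scores are averaged.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

form_a = [
    "Describe your organization's mission.",
    "What is your total annual operating budget?",
    "List key program staff and their qualifications.",
]
form_b = [
    "What is the mission of your organization?",
    "Provide your operating budget for the current fiscal year.",
    "How did you hear about this funding opportunity?",
]

# Fit one vocabulary over both forms so the vectors are comparable.
vectorizer = TfidfVectorizer(stop_words="english").fit(form_a + form_b)
sims = cosine_similarity(vectorizer.transform(form_a), vectorizer.transform(form_b))

# Average the best match for each field in form A.
form_similarity = sims.max(axis=1).mean()
print(f"Form A is roughly {form_similarity:.0%} similar to Form B")
```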

Research funding

Funding for this research was provided by The John D. and Catherine T. MacArthur Foundation and an award from the Robert Wood Johnson Foundation.
