The Nonprofit Technology Standards Gap: What Healthcare and Finance Got Right That We Haven't
Healthcare built HL7 and FHIR. Finance built SWIFT and open banking APIs. The nonprofit sector built fragmented CRMs and disconnected spreadsheets. Understanding this gap, and what it will take to close it, is essential for any nonprofit leader planning to use AI effectively.

Imagine a world where every hospital used a completely different format to store patient blood types, where transferring a patient between facilities meant manually re-entering hundreds of data fields, and where doctors could never access a patient's history from another provider. That world existed in healthcare until about a decade ago. Then regulatory mandates, industry coalitions, and shared investment created standards like HL7 and FHIR that made interoperability possible across thousands of systems. Today, a patient's records can follow them relatively seamlessly across providers, states, and care settings.
The nonprofit sector is still living in that pre-standard world. A donor's giving history exists in one system, their event attendance in another, their volunteer activity in a third, and their correspondence in a fourth. When that donor moves, changes emails, or gives to a partner organization, the connections are lost entirely. Staff spend hours every week manually reconciling data that should flow automatically. And when nonprofit leaders try to deploy AI to improve fundraising, program delivery, or operations, they quickly discover that their fragmented data infrastructure makes meaningful AI nearly impossible.
This is the nonprofit technology standards gap, and it has significant consequences. According to research from the CCS Philanthropy Pulse Report 2025, over 90% of nonprofits operate three or more core technology platforms, and 80% run four or more. More than half plan to change or add platforms in the coming year, further fragmenting their data. Meanwhile, only 7% of nonprofits report major strategic impact from AI, despite 92% using it in some capacity. The correlation between fragmented data and weak AI outcomes is not a coincidence.
To understand how the nonprofit sector arrived at this point, and what it would take to create the conditions for genuine interoperability, it helps to look carefully at what healthcare and finance actually built, why they were able to build it, and what obstacles stand in the way of similar progress in the social sector.
How Healthcare Built the Standard That Transformed Patient Data
The story of healthcare data interoperability is not a story of spontaneous industry collaboration. It is a story of regulatory mandate backed by significant funding. The Health Level Seven (HL7) organization, itself a nonprofit founded in 1987, spent decades creating messaging standards for how healthcare data should be structured and transmitted. But adoption remained patchy and inconsistent until the federal government intervened.
The HITECH Act of 2009 provided over $25 billion in incentive payments to hospitals and physicians who adopted certified electronic health records systems. The Affordable Care Act pushed further, and the 21st Century Cures Act of 2016 went further still, explicitly mandating standardized APIs and prohibiting information-blocking practices. Hospitals that did not comply risked losing Medicare and Medicaid funding. That combination of carrots and sticks drove a transformation in less than fifteen years.
FHIR, which stands for Fast Healthcare Interoperability Resources and is pronounced "fire," represents the modern evolution of these standards. Built on REST APIs using JSON and XML, FHIR is designed for the mobile and cloud era. It achieves roughly 80% adoption across healthcare organizations, enabling a patient's records to move between providers, insurers, and public health systems in ways that would have seemed impossible twenty years ago. The standard succeeded because it combined clear technical specifications with regulatory enforcement, industry investment, and a unifying nonprofit organization dedicated to its development.
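What makes FHIR's REST-and-JSON design powerful is that every conformant system structures the same resource the same way, so client code written once works against any vendor's server. The sketch below shows the shape of a minimal FHIR R4 Patient resource; the field names follow the published FHIR specification, but the example values and identifier system URL are invented.

```python
import json

# A minimal FHIR R4 Patient resource, as a client might receive it from
# a GET /Patient/{id} call. Structure follows the FHIR spec; the values
# and the identifier system URL are invented examples.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Rivera", "given": ["Ana"]}],
  "birthDate": "1984-07-02",
  "identifier": [
    {"system": "http://hospital.example.org/mrn", "value": "MRN-0042"}
  ]
}
"""

patient = json.loads(patient_json)

# Because every conformant server structures Patient identically, this
# parsing code is vendor-agnostic.
full_name = f"{patient['name'][0]['given'][0]} {patient['name'][0]['family']}"
mrn = patient["identifier"][0]["value"]
print(full_name, mrn)  # Ana Rivera MRN-0042
```

The nonprofit sector has no equivalent: there is no agreed-upon "Donor" or "Grant" resource that every CRM and fundraising platform would serialize the same way.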
What Made Healthcare Standards Work
The ingredients that created genuine interoperability in a fragmented sector
- Regulatory mandate with teeth: The 21st Century Cures Act prohibited information-blocking and required standardized APIs, with real financial consequences for noncompliance.
- Federal funding at scale: Over $25 billion in HITECH incentives funded adoption, reducing the financial burden on individual organizations.
- A dedicated standards organization: HL7 provided a neutral home for technical development, with participation from vendors, providers, and government.
- Clear, demonstrable ROI: Better data sharing improved patient safety and reduced duplicative testing, making the business case for investment obvious.
- A unified vertical: Despite its complexity, healthcare is a defined sector with a shared vocabulary, shared regulatory environment, and shared compliance pressures that made coordination feasible.
How Finance Achieved Interoperability Through Standards and Regulation
The financial industry had its own path to interoperability, and it followed a similarly coercive logic. SWIFT, the Society for Worldwide Interbank Financial Telecommunication, began in the 1970s as a cooperative messaging system that allowed banks to communicate securely across borders. The FIX protocol followed in the 1990s for securities trading. These standards did not emerge from voluntary goodwill. They emerged because the alternative, billions of dollars in transaction failures and manual reconciliation, was unacceptably expensive.
The more transformative development came with open banking regulations. The European Union's Payment Services Directive 2 (PSD2) required banks to open their APIs to third-party developers, creating a mandated ecosystem of data sharing that drove competition and innovation. Financial Data Exchange (FDX), formed in 2018 as a nonprofit, now counts over 200 member organizations and sets standards for how financial data can be shared with consumer permission. Similar open banking frameworks are developing across the United States, United Kingdom, and Australia.
What finance got right was recognizing that data standards create market value. When a customer can easily share their financial history with a new lender, or connect their bank account to a budgeting app, the entire financial ecosystem becomes more efficient and more competitive. The regulatory mandate to share data was not just a compliance burden; it became a driver of product innovation. Nonprofits have not yet experienced that kind of systemic shift, and it shows.
The Cost of Data Fragmentation in Nonprofits
- 90%+ of nonprofits operate three or more core technology platforms, creating inevitable data silos
- 52% cite data quality and availability as top barriers to AI adoption
- 36% report difficulty leveraging data for decision-making, up from 14% two years prior
- 29% report direct operational inefficiencies and delays caused by disparate systems
Sources: CCS Philanthropy Pulse Report 2025, Nonprofit AI Adoption Report 2026
What the Nonprofit Sector Has Tried (And Why It Hasn't Been Enough)
The nonprofit sector has not been completely passive about data standards. Several significant efforts have attempted to create common frameworks, and understanding their limitations explains why they have not produced the kind of systemic change that HL7 and FHIR created in healthcare.
The International Aid Transparency Initiative (IATI) created a widely adopted standard for international development and humanitarian organizations to publish spending and activity data. It represents genuine progress within a specific subsector, but it is focused on financial transparency rather than operational interoperability, and it applies primarily to organizations working in international development rather than the broader social sector.
Microsoft's Nonprofit Accelerator introduced a Nonprofit Common Data Model, an open-source framework available on GitHub that provides standard definitions for concepts like donors, grants, programs, and beneficiaries. Salesforce has its own Nonprofit Success Pack (NPSP) and Nonprofit Cloud data models. Candid, the nonprofit data organization formed by the merger of GuideStar and Foundation Center, released its Philanthropy Classification System (PCS) under Creative Commons in 2025, removing the non-commercial restriction that had previously limited its use.
Each of these efforts represents real work by credible organizations. The problem is that they are platform-specific or sector-specific, they lack regulatory backing, and they depend on voluntary adoption by vendors who often have competitive incentives to keep data proprietary. There is also no neutral convening body with the authority or funding to drive sector-wide adoption. The result is a landscape of competing quasi-standards that improve things for organizations within particular ecosystems while doing nothing to enable data to flow between those ecosystems.
Existing Nonprofit Standards
What has been built and its limitations
- IATI Standard: Excellent for international aid transparency, but limited to that subsector and focused on financial reporting rather than operational data
- Microsoft Nonprofit Common Data Model: Open-source and technically sound, but requires organizations to be within the Microsoft ecosystem to benefit fully
- Salesforce NPSP: Widely used but proprietary, creating lock-in and making it difficult to share data with organizations using different platforms
- Candid PCS: Now fully open, but primarily a classification system rather than a data exchange standard
What's Missing Compared to Healthcare
The structural gaps that prevent sector-wide standards
- No federal regulatory mandate requiring data interoperability or API openness
- No dedicated neutral standards body equivalent to HL7 with sector-wide authority and funding
- No significant government funding comparable to HITECH's $25 billion investment to drive adoption
- Vendor incentives run counter to open standards, as data lock-in protects market position
Why the Standards Gap Is Directly Blocking Nonprofit AI Adoption
The connection between data standards and AI capability is not theoretical. AI systems require clean, consistent, accessible data to function effectively. When a nonprofit's donor data is split across five platforms with different field definitions, different data entry conventions, and no shared unique identifiers, AI tools have nothing reliable to work with. You cannot train a predictive model on data where "John Smith" appears as three different records in three different systems with no way to know they represent the same person.
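Without shared identifiers, organizations fall back on heuristic matching, and the fragility of that approach is easy to demonstrate. The sketch below uses invented field names ("name", "email") and invented records; real exports rarely line up even this cleanly.

```python
# Heuristic duplicate detection across systems that lack a shared
# unique ID: normalize name and email into a crude match key.
# Field names and records are invented for illustration.
def match_key(record):
    """Collapse whitespace and case so trivially different entries match."""
    name = " ".join(record["name"].lower().split())
    email = record["email"].strip().lower()
    return (name, email)

crm = {"name": "John  Smith", "email": "JSmith@example.org"}
events = {"name": "john smith", "email": "jsmith@example.org "}
volunteers = {"name": "Jon Smith", "email": "jsmith@example.org"}

print(match_key(crm) == match_key(events))      # True: same person, caught
print(match_key(crm) == match_key(volunteers))  # False: one typo defeats the heuristic
```

A shared unique identifier, the kind a sector standard would define, makes this guesswork unnecessary; without one, every match is probabilistic and every typo is a potential lost relationship.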
This is why 95% of IT leaders cite integration issues as their primary AI adoption barrier, and why 52% of nonprofits specifically identify data quality and availability as blocking their AI efforts. The organizations seeing meaningful AI results are, almost without exception, the ones that have invested heavily in data infrastructure, unified their systems, or chosen platforms that share data standards. For most nonprofits, those options require either significant technical resources or vendor lock-in that creates different problems down the road.
The issue extends beyond AI. Even basic operational analytics become difficult when data cannot flow between systems. Answering a question like "what is the five-year giving history of our most active volunteers?" should take seconds. For most nonprofits, it takes hours of manual data gathering across multiple exports, manual matching, and error-prone reconciliation. This is not a software problem. It is a standards problem. No amount of AI layered on top of fragmented, incompatible data will produce reliable results.
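The manual reconciliation described above usually amounts to joining exports on whatever field the systems happen to share, often just an email address. A minimal sketch, with hypothetical column names and data:

```python
import csv
import io
from collections import defaultdict

# Joining a donation export with a volunteer export on email, the only
# field the two systems share. Columns and rows are hypothetical.
donations_csv = """email,year,amount
jsmith@example.org,2021,100
jsmith@example.org,2023,250
arivera@example.org,2022,50
"""
volunteers_csv = """email,hours
jsmith@example.org,40
"""

donations = list(csv.DictReader(io.StringIO(donations_csv)))
volunteer_emails = {r["email"] for r in csv.DictReader(io.StringIO(volunteers_csv))}

# Five-year giving history, restricted to active volunteers.
history = defaultdict(float)
for d in donations:
    if d["email"] in volunteer_emails and int(d["year"]) >= 2021:
        history[d["email"]] += float(d["amount"])

print(dict(history))  # {'jsmith@example.org': 350.0}
```

Ten lines of logic, but only after the exports exist, the join field is trusted, and someone has cleaned both files; that surrounding labor is what the missing standards layer imposes.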
The article on nonprofit CRM and grant systems integration explores specific approaches for connecting systems within the constraints that currently exist. But those approaches are fundamentally workarounds for a missing infrastructure layer that should exist at the sector level.
The AI Readiness Gap
How data fragmentation creates the gap between AI adoption and AI impact
The 2026 Nonprofit AI Adoption Report found that 92% of nonprofits now use AI in some form, but only 7% report major strategic impact. This gap is not primarily about lack of tools or skills. It is about data quality and accessibility. Organizations that cannot connect their systems cannot give AI tools the clean, unified data they need to produce reliable insights or automation.
The pattern mirrors what healthcare experienced before FHIR: high adoption of individual tools, low impact from those tools in isolation, because the underlying data infrastructure could not support real interoperability. Healthcare solved this with standards and mandates. Nonprofits have not yet found an equivalent path.
Why the Nonprofit Sector Has Structural Disadvantages Healthcare and Finance Didn't
Understanding why the nonprofit sector hasn't replicated healthcare's or finance's success requires acknowledging some genuine structural differences that make the problem harder, not just the same problem with less funding.
Healthcare and finance are both highly regulated industries with clear legal frameworks, defined regulatory bodies, and significant legal consequences for noncompliance. The threat of losing Medicare funding or banking licenses created the pressure that made standards adoption a survival issue rather than a nice-to-have. Nonprofits face a patchwork of state-level regulations, with many states still explicitly exempting nonprofits from data privacy requirements. As of early 2026, only a handful of states require nonprofits to meet specific data standards, and there is no equivalent of the 21st Century Cures Act for the social sector.
The nonprofit sector is also far more diverse than either healthcare or finance. Healthcare has a relatively defined set of data concepts: patients, diagnoses, medications, procedures, providers. Finance has accounts, transactions, instruments, and counterparties. The nonprofit sector spans everything from food banks to arts organizations to international development agencies to legal aid clinics. Creating a single data model that works for all of these requires either extreme specificity (creating separate standards for each subsector) or extreme generality (creating a standard so abstract it doesn't actually solve real interoperability problems).
Resource constraints compound everything. The organizations most harmed by the lack of standards are the smallest nonprofits, which also have the least capacity to invest in integration or advocacy for better standards. Larger nonprofits can afford the custom integrations, dedicated IT staff, and platform consolidation that partially address the problem. The organizations running on tight budgets and using free or low-cost tools have no pathway to interoperability under the current regime.
No Unified Regulator
Healthcare has CMS and ONC. Finance has the Fed, SEC, and banking regulators. Nonprofits are regulated by state attorneys general in a fragmented, inconsistent patchwork with no federal data standards authority.
Sector Diversity
The nonprofit sector includes over 1.8 million organizations with wildly different missions, data needs, and technical capacities. Creating a single standard that serves all of them is far harder than standardizing patient records or bank transactions.
Resource Constraints
Healthcare and finance have massive financial incentives to invest in interoperability. Nonprofits typically operate with tight budgets, limited IT staff, and boards that see technology investment as overhead rather than infrastructure.
What Would Actually Need to Change to Close the Gap
A realistic path toward nonprofit data interoperability would need to address the structural factors that have prevented progress, not just the technical ones. Several developments could drive meaningful change, though none of them are imminent.
The most powerful lever would be funder requirements. If major foundations began requiring grantees to use standardized data formats for impact reporting, or if government grant programs required specific data schemas for federally funded nonprofits, adoption would accelerate quickly. This is analogous to the way federal healthcare funding requirements drove EHR adoption. Funders have not yet embraced this role systematically, though some impact investing frameworks and collective impact initiatives are beginning to require standardized outcome reporting.
A sector-wide coalition with genuine vendor participation would be another essential ingredient. The nonprofit technology landscape includes CRM vendors, fundraising platforms, grant management systems, volunteer management tools, and financial software, all of which currently maintain proprietary data models. If those vendors committed to a common API standard, the way banks have begun committing to FDX standards in open banking, data could flow between systems without custom integration work. This would require either regulatory pressure or a sufficiently compelling market incentive, neither of which currently exists.
There is also a role for the large technology companies that have made significant investments in the nonprofit sector. Microsoft, Salesforce, and Google all maintain nonprofit programs and data models. If those organizations coordinated on a shared standard rather than competing proprietary models, the combined weight of their market positions could drive adoption more effectively than any regulatory mandate. That coordination has not happened, but it remains possible.
The work on open data for nonprofits explores some of these collaborative infrastructure models in more detail. And the article on connecting nonprofit CRM, grant, and AI tools addresses practical approaches that work within today's constraints.
Paths Forward: What Nonprofit Leaders Can Advocate For
Systemic change requires both sector-level advocacy and organizational action
- Push funders to require data standards: Ask your major funders to adopt standardized impact reporting schemas and to require grant recipients to use open data formats
- Support nonprofit technology coalitions: Engage with NTEN, Candid, and other nonprofit technology organizations that are working on shared infrastructure questions
- Negotiate for open APIs in vendor contracts: When selecting or renewing technology contracts, prioritize vendors that provide open APIs and standard data export formats
- Participate in collective impact data initiatives: Coalitions that agree on shared outcome metrics create the embryonic form of nonprofit data standards and demonstrate what sector-wide coordination can accomplish
- Document your integration costs: Tracking and reporting the actual cost of data fragmentation creates evidence for funders, policymakers, and sector leaders about the magnitude of the problem
What Individual Organizations Can Do While Waiting for Sector Change
Sector-level change will take time. In the meantime, individual nonprofits need practical approaches to reduce the cost and impact of data fragmentation within their own operations.
The most impactful organizational action is platform consolidation. The fewer systems holding your data, the less integration work you need to do and the more coherent your data becomes. This does not mean eliminating all specialization, but it does mean being deliberate about where you accept fragmentation. A unified CRM that handles donor management, email, and event registration creates a much stronger foundation than three separate tools that nominally "integrate" but maintain separate databases.
When consolidation is not feasible, iPaaS (Integration Platform as a Service) tools like Zapier, Make (formerly Integromat), and Workato can create automated data flows between systems. These tools are not a substitute for real interoperability, but they can reduce the manual work of reconciling data across platforms. Building and maintaining these integrations still requires ongoing technical investment, and they break when vendors change their APIs, but they are significantly better than no integration at all.
Organizations should also invest in master data management practices even in the absence of sector-wide standards. Establishing a canonical record for donors, constituents, and programs, with clear rules about which system is authoritative for which data, reduces inconsistency even when data itself is stored in multiple places. This kind of organizational discipline about data creates the foundation for AI adoption regardless of what happens at the sector level.
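One way to make "which system is authoritative for which data" concrete is to encode the rules directly and merge records against them. This is a minimal sketch; the system names, field names, and conflict-resolution policy are all assumptions, not a prescription.

```python
# Master-data rule sketch: for each field, name the system whose value
# wins when records disagree. System and field names are invented.
AUTHORITATIVE = {
    "email": "crm",
    "mailing_address": "crm",
    "volunteer_hours": "volunteer_app",
    "last_gift_date": "fundraising",
}

def canonical_record(records_by_system):
    """Build one canonical record by pulling each field from its system
    of record, falling back to any other system that has a value."""
    merged = {}
    for field, source in AUTHORITATIVE.items():
        candidates = [source] + [s for s in records_by_system if s != source]
        for system in candidates:
            value = records_by_system.get(system, {}).get(field)
            if value:
                merged[field] = value
                break
    return merged

records = {
    "crm": {"email": "jsmith@example.org", "mailing_address": "12 Oak St"},
    "volunteer_app": {"email": "old@example.org", "volunteer_hours": 40},
}
print(canonical_record(records))
# {'email': 'jsmith@example.org', 'mailing_address': '12 Oak St', 'volunteer_hours': 40}
```

Note that the CRM's email wins over the volunteer app's stale one because the rule says so, not because any algorithm guessed; that explicitness is the point of master data management.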
The knowledge management practices for nonprofits article covers some of these organizational data hygiene approaches in more detail. And for organizations specifically planning for AI adoption, the AI strategic planning framework addresses how to assess and improve data readiness as part of broader AI strategy.
Platform Strategy
- Consolidate to fewer platforms where mission impact allows
- Prioritize vendors with open APIs and standard data export formats
- Negotiate data portability into vendor contracts before signing
- Evaluate integration costs as part of total cost of ownership for any new tool
Data Governance
- Establish master data management rules designating authoritative systems for each data type
- Document your data dictionary so field definitions are consistent across systems
- Create automated data quality checks that flag inconsistencies before they compound
- Assign data stewardship responsibilities to specific staff members
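An automated data quality check, as suggested above, can be as simple as a list of rules run against each record before it syncs onward. The rules and field names below are illustrative assumptions, not a standard:

```python
# Sketch of an automated quality check that flags inconsistencies
# before they compound. Rules and field names are illustrative.
def quality_flags(record):
    """Return a list of human-readable problems found in one record."""
    flags = []
    if "@" not in record.get("email", ""):
        flags.append("missing or malformed email")
    if record.get("total_gifts", 0) > 0 and not record.get("last_gift_date"):
        flags.append("gift total without a gift date")
    return flags

clean = {"email": "a@b.org", "total_gifts": 0}
dirty = {"email": "none", "total_gifts": 3}
print(quality_flags(clean))  # []
print(quality_flags(dirty))
# ['missing or malformed email', 'gift total without a gift date']
```

Even a handful of such rules, run nightly against exports, surfaces the inconsistencies that would otherwise silently degrade every downstream report and AI model.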
The Gap Is Real, But So Is the Opportunity
The nonprofit sector's technology standards gap is not a minor inconvenience. It is a structural barrier that limits AI adoption, increases operating costs, and prevents the sector from realizing the full value of the data it already collects. Healthcare and finance closed analogous gaps through a combination of regulatory mandates, significant investment, and neutral standards organizations that could convene the entire industry. Nonprofits have not yet assembled those ingredients.
What makes this moment particularly important is that AI is making the cost of the standards gap much more visible. When a hospital couldn't share patient records between systems, it cost money and created clinical risk. When a nonprofit couldn't share donor data between systems, the consequence was administrative inefficiency that was easy to ignore. Now that AI tools promise to unlock significant value from organizational data, the failure to have coherent, accessible data is no longer just an IT problem. It is a strategic barrier to mission impact.
The good news is that the sector is more aware of this problem than it has ever been. The growth of nonprofit technology organizations, the increasing interest from major technology companies in the social sector, and the pressure from funders who want to see measurable AI impact are creating conditions where standards conversations are more likely to happen. The work will be slow, incremental, and politically complicated. But the alternative, continuing to build AI capabilities on fragmented, incompatible data infrastructure, is increasingly untenable as the expectations for AI impact continue to rise.
For nonprofit leaders, the practical implication is clear: invest in your data infrastructure now, even in the absence of sector-wide standards, because the organizations that build coherent data foundations today will be the ones that can extract value from AI over the next five years. The sector will eventually get better standards. The organizations that wait for them will be far behind those that built data discipline in the meantime.
Ready to Build Your Data Foundation?
One Hundred Nights helps nonprofits assess their data infrastructure, identify integration gaps, and develop practical strategies for AI readiness that don't wait for sector-level standards to catch up.
