No One Has the Lock on Open Infrastructure
Why Defining, Coordinating, and Scaling Open Infrastructure Is Harder Than We Pretend
I've been sitting with something my friend Kaitlin Thaney said to me years ago: "Why does everyone think they have the lock on open infrastructure?" At the time, the question hit me like a brick. It made me pause. Was I parroting assumptions? Was I missing the bigger picture?
The answer, as it turns out, was yes. That answer has shaped the way I think and talk about open infrastructure today. I was excited to discover that Kaitlin had written an essay exploring this idea in more depth.
The Trouble with Definitions
In her excellent essay in Katina Magazine [1], Kaitlin and her team at Invest in Open Infrastructure (IOI) unpack the limits of trying to define "open infrastructure." They review definitions across disciplines, ranging from cyberinfrastructure to social infrastructure, as well as UNESCO's definition of open science [2]. Kaitlin grapples with the term in a way that makes you stop and think. She and her team have found that definitions often exclude just as much as they include. And that exclusion has consequences. It shapes what gets funded, who gets to lead, and which infrastructures are seen as legitimate.
Humans love a clean definition. For those of us in higher education, a clean definition makes grant applications smoother, helps us align with institutional strategy, and simplifies decisions for policymakers. But with open infrastructure, the more we define, the more we risk erasing what matters. An open-source discovery layer for a university in the U.S. differs from a community-run repository in Kenya or a digital heritage project overseen by the Indigenous People of Aotearoa (New Zealand). And yet, when institutions or funders apply a single metric for openness or sustainability, large parts of the ecosystem get left out.
These exclusions aren’t just theoretical. They influence what gets funded, who gets invited to the table, and how we assess impact. Definitions create boundaries. And when those boundaries are drawn from the center, either geographically, economically, or institutionally, they reflect the assumptions of the powerful.
The irony is that most of us working in technology know better. We know that what counts as infrastructure in one place may not even register as such in another. We know that digital and non-digital systems coexist. We know that openness isn’t just a licensing status; it’s a cultural and relational practice.
Ostrom Was Right: Governance Is Contextual
This is where Dr. Elinor Ostrom's work becomes essential. In her Nobel Prize–winning research on common-pool resources [3], Ostrom dismantled the idea that centralized control or market-based models were the only ways to manage shared systems. Instead, she offered something revolutionary: the notion that communities could and often did govern shared resources on their own terms. Her eight design principles for durable institutions emphasize local autonomy, nested governance, and mechanisms for conflict resolution: tools that feel especially relevant for the messy, distributed world of open infrastructure.
Ostrom’s work challenges us to think beyond efficiency. In fact, one of her core insights is that local communities often resist imposed solutions not because they’re stubborn or backward-looking, but because the imposed solution doesn’t fit. It's the wrong scale, the wrong structure, the wrong assumptions. Yet so many conversations about "coordination" in open infrastructure repeat this same mistake. They're led by people who assume they already know the answer. The result? Frameworks designed in isolation, priorities shaped by institutional inertia, and a fixation on mapping over understanding.
This is why phrases like “coordination is needed, but not strict governance” give me pause. Too often, they ask communities to relinquish autonomy while masking a desire for control over funding flows, legitimacy, or the future of the infrastructure. What we need isn’t coordination with rigid assumptions, but genuinely polycentric, context-aware governance. That’s the kind of collaboration that honors complexity, rather than trying to flatten it.
From Efficiency to Resilience
These assumptions are particularly dangerous now, in an era of political instability and widespread institutional mistrust, when we can no longer assume that “public access” or “trusted repositories” are stable categories. The concept of open is shifting under our feet and taking our infrastructure with it. Governments once funded, maintained, and preserved critical infrastructure. Increasingly, they are stepping away from that work, especially here in the U.S. Universities (mainly in the U.S., but I suspect increasingly abroad as well) were once seen as stewards of the public good. Unfortunately, many no longer inspire that kind of confidence.
These aren't abstract concerns. The Data Rescue Project [4] is one of the clearest illustrations of how communities have stepped in to fill the gaps left behind by massive institutions. When climate data, environmental reports, and government research outputs began disappearing from US federal websites, volunteers and archivists didn’t wait; they mobilized and saved what they could. They built systems to preserve public data in a moment when the public's right to know was under direct threat.
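To make the shape of that work a little more concrete, here is a minimal, illustrative sketch of one small piece of it: mirroring a single public resource and recording a fixity checksum so that independent copies can later be compared. The URL, file paths, and function name are hypothetical, and this is not the Data Rescue Project's actual tooling; real rescue efforts typically coordinate curated lists of at-risk resources and keep redundant copies across many hands.

```python
# Illustrative sketch only: mirror one public resource and log a fixity record
# so a community archive can verify its copy has not drifted from what was captured.

import hashlib
import json
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

DATASET_URL = "https://example.gov/data/climate_report_2024.csv"  # hypothetical URL
ARCHIVE_DIR = Path("rescued_data")


def rescue(url: str, archive_dir: Path = ARCHIVE_DIR) -> Path:
    """Download one resource, store it locally, and append a SHA-256 fixity record."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    local_path = archive_dir / url.rsplit("/", 1)[-1]

    # Fetch the resource and write the raw bytes to the local archive.
    with urllib.request.urlopen(url) as response:
        payload = response.read()
    local_path.write_bytes(payload)

    # Fixity metadata lets independent mirrors confirm they hold the same bytes.
    record = {
        "source_url": url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(payload).hexdigest(),
        "size_bytes": len(payload),
    }
    manifest = archive_dir / "manifest.jsonl"
    with manifest.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return local_path


if __name__ == "__main__":
    rescue(DATASET_URL)
```

The point of the sketch is less the code than the design choice it encodes: redundancy and verifiability are built in from the first download, so that no single institution has to be trusted as the only keeper of the record.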
This kind of community-led, resilience-focused work reminds us what open infrastructure actually requires: trust, responsiveness, redundancy, and care. These are not things you get from efficient governance or one-size-fits-all coordination. They’re things you build to fit the needs of those you serve.
Open infrastructure isn't a monolith. It's a spectrum of practices and people; some are visible, while others are not. It encompasses repositories, APIs, governance structures, cultural norms, and labor practices. And it lives in specific contexts. What constitutes infrastructure in a small college differs from what is considered infrastructure in a global research university. What matters in Ghana may not look the same as what matters in Toronto.
And yet, we continue to treat “open infrastructure” as if it were a solvable equation. Something that can be modeled, packaged, and scaled if only we get the framework right. However, that approach often overlooks the contextual knowledge and relationships that make open infrastructure sustainable in the first place. As Kaitlin reminds us, none of us has the lock on this. And when someone insists they do, whether through a framework, a consortium, or a governance scheme, they risk excluding the very communities they claim to serve.
Embracing Complexity
So no, none of us has the lock on open infrastructure. That’s not a bug, it’s a feature.
Open infrastructure thrives when we stop trying to control it and start learning from it. From the people who preserve datasets when governments fail. From the communities that govern their own digital repositories. From the developers and engineers who build for the long haul, not just the next funding cycle.
Elinor Ostrom demonstrated that the most effective solutions are often local, overlapping, and self-determined. Kaitlin Thaney reminds us that what counts as infrastructure is always in flux, and that every attempt to define it risks excluding someone. What both point to is this: we need more frameworks for listening than for coordination.
That means resisting the urge to simplify or declare once and for all what open infrastructure is. Because when we do that, we miss the parts that matter most.
Complexity isn’t a liability. It’s how we know the system is alive. And our job isn’t to fix it into place, it’s to keep it breathing.
What we need are conversations rooted in curiosity, humility, and a sense of community. We need to stop pretending that there is a single, correct model or a definitive definition. And we need to recognize that, when it comes to sustaining open infrastructure, embracing that complexity is the only way forward.
[1] Kaitlin Thaney, "What We Talk About When We Talk About Open Infrastructure," Katina Magazine.
Thank you for this. I agree that open infrastructure isn’t a monolith, and Elinor Ostrom’s work on commons and polycentric governance resonates strongly here. Her emphasis on context-specific, overlapping governance models seems particularly relevant for open scholarly infrastructure, which too often gravitates toward centralised solutions. Like Ostrom’s commons, our infrastructures are not just technical but social and cultural, shaped by local needs and practices.
I also appreciate your framing of "curiosity, humility, and a sense of community" as foundational values. The Wikipedia movement and the global, volunteer-driven work behind the Linux kernel are powerful examples of this ethos in action. Anthropologist Gabriella Coleman’s work exploring hacker and open source communities underscores how much of open infrastructure relies on trust, care, and long-term collective maintenance. These are elements we often overlook when, as you mention, we focus only on scalability or efficiency, and also when it comes to funding opportunities.
Alongside Ostrom, I also want to point to Mariana Mazzucato's work on "public value," which provides a framework for treating our infrastructures not just as commons but as collective public goods that deserve sustained investment and policy support.