By Carl Cervone
Compiled by: Elsa
Translator’s Preface
In this article, the author uses the concept of "circles" as a starting point to show that in daily life we tend to focus only on the circle we are in, and often use distance as an excuse to ignore the funding of public goods outside that circle. The article then explores how to extend public goods funding mechanisms well beyond the circles we directly touch and create a truly effective public goods funding system. Through such an expansion, we can build a "diverse, civilization-scale public goods funding infrastructure."
Content
This post was inspired by the work and thought leadership of the organizations explicitly mentioned in the article (such as Gitcoin, Optimism, Drips, Superfluid, Hypercerts, etc.), as well as multiple conversations with Juan Benet and Raymond Cheng about the characteristics of network capital vs. private capital.
Every funding ecosystem has core areas, as well as important but peripheral areas.
Gitcoin did a great job visualizing the concept of Nested Scopes in a 2021 blog post. The original post describes a series of impact funding mechanisms that initially focus on the inner circle (“crypto”), then expand to the next circle (“open source software”), and eventually impact the entire world.
Owocki’s illustration shows the evolution of crypto-native impact funding mechanisms, from “crypto funding crypto” to impacting the world
That’s a great way to put it: Start by solving problems close to home and then scale up.
Optimism uses a similar lens to explain its vision for retroactive public goods funding.
Optimism's vision is to expand its support for public goods through retroactive funding
Optimism is within Ethereum, which is contained within “all Internet public goods”. “All Internet public goods” are contained within “global public goods”. Each outer domain is a superset of its inner domain.
Here is my condensed version of these memes, as four concentric circles.
I care about "everything" but I don't want to worry about how it's funded
While I personally might not spend time thinking about deep-sea biodiversity or noise pollution in Kolkata, there are a lot of people who care about these issues. Simply being aware of something often moves it from “everything” to “something I want other people to care about.”
Most of us are not equipped to assess what is important outside our immediate circle.
We are generally able to reasonably assess the things that are most relevant to us in our daily lives. This is our inner circle, or the things we do care about.
In an organization, your inner circle may include your teammates, the projects you work closely with, the tools you use frequently, and so on.
We can also evaluate some (but probably not all) of the things that are one degree upstream or one degree downstream of our daily sphere. These are the things we sometimes care about.
In the case of a software package, the upstream might be your dependencies, and the downstream would be the projects that depend on your package. In an educational course, the upstream might include valuable lessons or resources that influence the course, and the downstream might include students who recommend the course to their friends.
Whether you are a software developer or an educator, you can keep looking further upstream: to the underlying research, the institutions responsible for that research, and so on. Now we are entering the realm of "everything".
However, this is where most rational people stop caring very much. Once we get beyond one degree, things get fuzzy. These are the things we want other people to care about.
The risk is that we might use distance as an excuse not to fund these things, thus exacerbating the free-rider problem.
While it is true that everything in our inner circle depends on the outer circles being well funded, it is difficult to contribute more than our "fair share" (however one might try to calculate it) to things that are more than one circle away from us. There are good reasons for this.
First, it’s hard to categorize large areas. A category like “all internet public goods” is so broad that if you look at it the other way around, you could argue that almost anything could fall into that category and deserve funding.
Second, it’s also hard to motivate stakeholders to care about funding outside of their immediate circle because the impact is so diffuse. I’d rather fund a whole person on a team I know than a faceless subset of a team I don’t know.
Finally, there are no direct consequences for not funding these projects—assuming, of course, that others continue to fund them and don’t pull out.
Thus, we have the classic free-rider problem.
Aside from the fact that governments can print money, impose taxes, and issue bonds to pay for long-term public goods projects, as a society we don’t have a very good mechanism for funding things outside of our immediate circle. Most capital is invested in things that have short-term returns and more immediate impacts.
One way to address this is to have people focus on funding things that are close to them (i.e. things they can personally evaluate) and to have mechanisms in place to continually push some of that funding out to the periphery.
This is exactly how private capital flows, by the way. We should try to emulate some of the characteristics of private capital.
The venture capital model for things with no short/medium term returns works because private capital is composable and easily divisible
There is a model for funding hard tech with a payback period of 5 to 10+ years: it’s called venture capital. Of course, the amount of money that flows to long-term projects in any given year is more influenced by interest rates than by final value. But venture capital is a proven model that has attracted and mobilized trillions of dollars over the past few decades.
The model works in large part because venture capital (and other sources of investment capital) are composable and easily divisible.
By composable, I mean you can take venture capital money and also do an IPO, take a bank loan, issue bonds, raise capital through more exotic mechanisms, etc. In fact, that is what people expect. All of these funding mechanisms are interoperable.
These mechanisms compose well because there are clear commitments about who owns what and how the cash is allocated. In fact, most companies use a range of financing tools over their life cycle.
Investment capital is also easily divisible. Many people pay into the same pension fund. Many pension funds (and other investors) invest as limited partners (LPs) in the same venture capital fund. Many venture capital funds invest in the same companies. All of this division happens upstream of companies and their day-to-day operations.
These properties make the flow of private capital through complex network graphs very efficient. If a venture-backed company experiences a liquidity event (e.g., an IPO or acquisition), the proceeds are efficiently distributed between the company and its venture capital firms, the venture capital firms and their limited partners, pension funds and their retirees, and even from retirees to their children.
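To make "composable and divisible" concrete, here is a minimal toy sketch (in Python, with made-up names and percentages rather than any real fund structure) of how proceeds from a liquidity event could flow pro rata back up the chain of investors:

```python
# Toy model: proceeds from a liquidity event flow pro rata back up the
# ownership graph (company -> VC funds -> LPs -> individual savers).
# All names and numbers below are hypothetical.

def distribute(total: float, stakes: dict[str, float]) -> dict[str, float]:
    """Split `total` among stakeholders in proportion to their stakes."""
    whole = sum(stakes.values())
    return {name: total * share / whole for name, share in stakes.items()}

# A company exits for $100M; a simplified cap table.
company_payouts = distribute(100_000_000,
                             {"founders": 0.60, "fund_a": 0.25, "fund_b": 0.15})

# Fund A passes its share on to its LPs (after a hypothetical 20% carry).
fund_a_to_lps = company_payouts["fund_a"] * 0.80
lp_payouts = distribute(fund_a_to_lps,
                        {"pension_fund": 0.70, "endowment": 0.20, "family_office": 0.10})

# The pension fund's share is divided again across its many beneficiaries.
per_retiree = lp_payouts["pension_fund"] / 1_000_000  # e.g. one million members

print(company_payouts)
print(lp_payouts)
print(f"per retiree: ${per_retiree:,.2f}")
```

Every split in this chain is governed by a pre-existing commitment (a cap table, an LP agreement, a pension plan), which is what lets the layers compose without renegotiation at payout time.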
This is not how public goods money flows through the network. Instead of a dense network of irrigation channels, we have a relatively small number of large water towers (governments, large foundations, high-net-worth individuals, etc.).
Private versus public capital flows
To be clear, I am not advocating that public goods should receive venture capital funding. I am simply pointing out two important characteristics of private capital that have no counterpart in public capital.
How can we get more public goods funding flowing beyond our immediate circle?
Optimism recently announced new initiatives for retroactive funding within its ecosystem.
In Optimism's last retroactive funding round, the range of projects eligible for funding was very broad. For the foreseeable future, the scope will be much narrower, focusing on the nearer upstream and downstream links in its value chain.
How does Optimism currently think about upstream and downstream impact?
Not surprisingly, feedback on these changes has been mixed, with many projects that were once in scope for funding now excluded from upcoming rounds.
Whereas 10 million tokens have been earmarked for "on-chain builders" in the newly announced round, on-chain builders received a disproportionately small share of funding in Round 3: only about 1.5 million of the 30 million tokens available to compete for. What will these projects do with the funds if they receive 2-5x as much retroactive funding as that 1.5 million?
One thing they could do is put some of their tokens into their own retroactive funding or grant rounds.
Concretely, if Optimism funds DeFi applications that drive network transaction volume, then those applications can in turn fund the front-ends, portfolio trackers, and other applications that contribute to the impact they care about.
If Optimism funds core dependencies of the OP Stack, then those teams can fund their own dependencies, research contributions, and so on.
What if projects kept the retroactive funding they felt they were owed and put the rest back into circulation?
This is already happening in various forms. The Ethereum Attestation Service now has a grants program for teams building on its protocol. Pokt just announced its own retroactive funding round, pooling all the tokens it received from Optimism (and Arbitrum) into that round. Even Kiwi News, which received less than the median amount of funding in Round 3, has implemented its own version of retroactive funding for community contributions.
Meanwhile, Degen Chain has pioneered a more radical concept: allocating tokens to community members who are required to give those tokens away to other community members in the form of "tips".
All of these experiments are directing public goods funding from central pools (such as OP or Degen treasuries) to the periphery, expanding their reach.
The next step is to make these commitments explicit and verifiable.
One way to do this might be to have each project set a Floor Value and a Percentage Above The Floor that it is willing to put into its pool. For example, maybe my Floor Value is 50 tokens and my Percentage Above The Floor is 20%. If I receive a total of 100 tokens, then I will allocate 10 tokens (20% of the 50 tokens above the floor) toward funding the edges of my network. If I only receive 40 tokens, then I keep all 40.
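As a minimal sketch (in Python, with hypothetical names), the rule above amounts to applying the Percentage Above The Floor to whatever a project receives beyond its Floor Value:

```python
def surplus_allocation(received: float, floor: float, pct_above_floor: float) -> float:
    """Tokens a project commits to push outward, given its pre-committed
    Floor Value and Percentage Above The Floor (expressed as a fraction)."""
    return max(received - floor, 0) * pct_above_floor

# The example from the text: Floor Value = 50 tokens, 20% of anything above it.
print(surplus_allocation(received=100, floor=50, pct_above_floor=0.20))  # 10.0
print(surplus_allocation(received=40, floor=50, pct_above_floor=0.20))   # 0.0, so keep all 40
```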
(Incidentally, my project did something similar during the last Optimism grant.)
In addition to pushing more funding to the periphery, this also serves the critical function of helping public goods projects establish a cost basis. In the long run, the message to projects that consistently receive less funding than they expected is that they are either mispricing their work or being undervalued by the ecosystem they are seeking funding from.
Projects with a surplus will be evaluated in subsequent rounds not only on their own impact, but also on the broader impact they create through good capital allocation. Projects that don't want to take on the burden of running their own grant programs can park their surplus somewhere else productive, like the Gitcoin matching pool or the Protocol Guild, or even burn it!
In my opinion, a project should set these two values before it receives any funding, and they should be kept private. If a project receives 100 tokens and donates 10, no one else should be able to tell whether its values were (50, 20%) or (90, 100%).
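Continuing the sketch above, both of those hypothetical parameter pairs produce the same observable 10-token outflow on a 100-token grant, so the donation alone reveals neither:

```python
def surplus_allocation(received: float, floor: float, pct_above_floor: float) -> float:
    return max(received - floor, 0) * pct_above_floor

# Two different private commitments, one indistinguishable public outcome.
print(surplus_allocation(100, floor=50, pct_above_floor=0.20))  # 10.0
print(surplus_allocation(100, floor=90, pct_above_floor=1.00))  # 10.0
```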
The final step is to connect these systems.
The examples of EAS, Pokt, and Kiwi News are inspiring, but they all required standing up new programs and then applying for, swapping, or transferring funding tokens into new wallets before the funds could ultimately reach new beneficiaries.
Protocols like Drips, Allo, Superfluid, and Hypercerts provide the underlying infrastructure for more composable funding flows — now we need to connect these pipes, like this pilot project with Geo Web.
The mission for this cycle is to create a public goods funding system that actually works. Then we can start to scale it up.
In crypto, we are still in the stage of experimenting with various mechanisms to decide which projects to fund and allocate funds. Compared with decentralized finance (DeFi), the infrastructure of public goods funding is still immature, poorly composable, and lacks field testing.
To get beyond the experimental stage and reach scale, we need to solve two problems:
1. Measurement: measure and prove that these mechanisms not only work, but are more effective than traditional public goods funding models (see this post [1] for why this is an important problem to work on, and this one [2] for an analysis of Gitcoin's long-term impact);
2. Clear commitments: make explicit commitments about how "profits" or surplus funds will flow to the outer circles.
In venture capital, there is always an investor behind the investor — ultimately, this could be your grandmother (or, more accurately, all of our grandmothers). Each such investor is incentivized to allocate capital efficiently so that they can be trusted with more capital allocations in the future.
For public goods, there is always a group of closely connected actors that you depend on, both upstream and downstream of your work. But there is currently no commitment to share this surplus with these entities. Until such commitments become the norm, it will be difficult to scale the funding of public goods beyond our immediate circle.
We have not yet reached a stage where we are better than the traditional model (Image from Gitcoin whitepaper)
I think it’s not enough to just commit to, “When we reach a certain scale, we’ll fund these projects.” It’s too easy to move the goalposts. Instead, these commitments need to be established early on, built into the foundational elements of how funding mechanisms and grant programs are constructed.
I don’t think it’s reasonable to expect the coffers of a few whales to fund everything. This is the water tower model we have in traditional governments and large foundations.
But however small we are, the more of us who make clear commitments to fund our dependencies, the more we demonstrate that there is a market for public goods, thereby expanding the total addressable market (TAM) and changing incentives.
Only then will we have something truly scalable that can gather its own momentum and create the “diverse, civilization-scale public goods funding infrastructure” of our dreams.
References
[1] Building a network of Impact Data Scientists. Retrieved from https://docs.opensource.observer/blog/impact-data-scientists/.
[2] A longitudinal assessment of Gitcoin Grants impact on open source developer activity. Retrieved from https://docs.opensource.observer/blog/gitcoin-grants-impact/.