Consensys Research
The Maximally Decentralized Blockchain Settlement Layer
The dangers of blockchain maximalism and the importance of building an interoperable blockchain ecosystem.
By Everett Muzzy and Mally Anderson
Finding the Middle Ground
This is the second article in a series exploring the state and future of interoperability and sidechain functionality in the blockchain ecosystem. In the first piece, Avoiding Blockchain Balkanization, we examined the history and current state of the Web2 ecosystem to identify warning signs that the blockchain industry is at risk of heading toward a similar state of siloed protocols and exploited data. In this piece, we discuss the importance of cultivating a middle ground between balkanization and maximalism, and propose the need for a maximally decentralized base settlement layer to anchor all global blockchain-based transactions.
Maximalist Argument
A common trope in the blockchain ecosystem is that of the “maximalist.” No matter the protocol or blockchain to which the term refers, maximalism is accompanied by an unwavering confidence that there is a “war” between blockchains from which one blockchain will emerge dominant, and that all future systems and applications will be built on top of that one protocol. Maximalism is not a new concept to the Web-connected world. Tim Berners-Lee, creator of the World Wide Web, worried about the role of the Internet in promoting maximalist thought. Comparing it to the polar opposite—intensely granular, balkanized thought—Berners-Lee cautioned against both:
“In fact, there are two equally frightening prospects. On the one hand is the descent to the lowest common denominator, often represented by US fast food and cartoons, with the loss of all that is rich and diverse. On the other, is an extreme of diversity. When anyone can filter mail so that they can read only messages from people who think the same weird things as themselves, and when what they read on the Web they only find by following links from sites of the same strange cult, will they be able to dig themselves into a cultural pothole so deep and so steep that when eventually they physically meet a real person on the street, the lack of common understanding will be total, and the only form of communication left will be to shoot them?” [Berners-Lee, 1996]
It is not too much of a stretch to state that the present-day blockchain ecosystem is guilty of promoting both maximalist and balkanized rhetoric, and therefore is at risk of eventually trapping itself in one or the other. Maximalism in particular is antithetical to the very promise of blockchain technology—i.e. the promise that exploitative, centralized parties can be held accountable and that users can vote to change the rules or choose other methods if they wish. As recently as February of this year, Andreas Antonopoulos cautioned against blockchain maximalism (specifically through the lens of Bitcoin), suggesting that the ecosystem is still far from accepting that maximalism is unhealthy and, as will be explored, perhaps impossible: “The moment Bitcoin becomes the only choice,” Antonopoulos argued, “the level of corruption and abuse of power that we’re going to see in the Bitcoin community is going to require us to build something to disrupt it…If you just replace the power structure of traditional central banking with a power structure of Bitcoin-maximalist-billionaires…that [won’t] change anything” [source].
Settlement Argument
The argument this paper advances could be better stated as the settlement argument. The settlement argument proposes a future in which a multitude of blockchains operate alongside and in cooperation with each other to suit the needs of all types of use cases. The key to the settlement argument is that one blockchain serves as the global settlement layer for all those data transactions, no matter on which blockchain they occur. The settlement layer provides an ‘anchor’ for the ecosystem, establishing undeniable security and objective finality should anything happen on a different blockchain that requires arbitration.
It is important to note that the settlement argument is not maximalist, even though it positions one blockchain as the root chain for the world. Maximalism is defined by exclusion; i.e. the ecosystem is only legitimate if one blockchain wins. The settlement argument is defined by interoperability and inclusion; i.e. the ecosystem only works if many kinds of coexisting blockchains operate on top of the maximally decentralized root chain. A fully interoperable network is greater than the sum of its constituent parts, allowing participants to square and cube the solution space.
Whichever chain or protocol serves as the anchor for the ecosystem provides the security, immutability, and confidence to support the whole system. The foundational settlement layer could be compared to the US Supreme Court (in its ideal state): incorruptible, always available, resilient, and called upon only to serve as final arbitrator. This metaphor is apt for a number of reasons.
Various other blockchains and scaling solutions with their own respective priorities (for example, privacy for enterprise, or throughput speed for games and exchanges) can execute their own daily functions while relying on the decentralized, secure mainnet layer—the true World Computer—only when they need it. Most computation can occur in the other layers just as most cases are resolved in civil suits and state courts, and escalate to arbitration in the supreme court when necessary. The finality and settlement this “supreme court” layer provides is not necessarily rapid, but it is genuine and absolute, guaranteeing all the participants’ safety.
Global Data
Pursuing an ecosystem supported by a settlement layer rather than one monolithic blockchain may be a computational necessity rather than a philosophical preference. In other words, maximalism may, in fact, be impossible to achieve in the near future. Currently, a Bitcoin block contains (on average) 1 MB of data. With the average Bitcoin blocktime of one block every 10 minutes, that is 144 MB of data per day stored or transacted on the Bitcoin blockchain. Meanwhile, nearly 2.5 quintillion bytes of data are created globally every day. By 2020, an estimated 1.7 MB of data will be created every second for every person on earth. And our data creation is not slowing down. The evolution of IoT and machine learning will not only create more data, but richer data needing robust and proper analysis, organization, and storage. In the coming years, as the estimated 4 billion of the world’s 7.8 billion people currently living without a reliable internet connection (as of 2016) become increasingly connected, data creation will increase exponentially. According to the US Department of the Treasury, SWIFT directs the movement of an estimated $5 trillion USD per day (roughly $1.25 quadrillion USD per year, assuming ~250 business days per calendar year). Even in this early phase of adoption, Bitcoin alone transacts an average of $200 million per day (with notable fluctuations). As a 24/7, global, borderless transaction layer with nearly the entire global population eventually capable of adopting it as a means of payment or store of value, it is not difficult to imagine a future where crypto payments rapidly overtake daily SWIFT (and related CHIPS, Fedwire, etc.) payment volumes.
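The gap between these figures can be made concrete with a back-of-envelope calculation. The sketch below uses only the estimates quoted above (1 MB blocks, 10-minute blocktime, ~2.5 quintillion bytes of global daily data creation); it is an illustration of scale, not a measurement:

```python
# Back-of-envelope comparison (using the estimates quoted above) of
# Bitcoin's on-chain data capacity versus global daily data creation.

BLOCK_SIZE_MB = 1                 # average Bitcoin block size in MB
BLOCKS_PER_DAY = 24 * 60 // 10    # one block roughly every 10 minutes

btc_mb_per_day = BLOCK_SIZE_MB * BLOCKS_PER_DAY   # 144 MB/day

GLOBAL_BYTES_PER_DAY = 2.5e18     # ~2.5 quintillion bytes created daily
global_mb_per_day = GLOBAL_BYTES_PER_DAY / 1e6

# Fraction of daily global data that could even fit on the Bitcoin chain
fraction = btc_mb_per_day / global_mb_per_day
print(f"Bitcoin chain capacity: {btc_mb_per_day} MB/day")
print(f"Global data creation:  ~{global_mb_per_day:.1e} MB/day")
print(f"Fraction representable on-chain: {fraction:.1e}")
```

Even under generous assumptions, a single monolithic chain could record only a vanishingly small fraction of the world's daily data.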
“No single ledger, no matter how fast and scalable, is capable of, or appropriate for, recording all transactions or running all “on chain” business logic between parties.” –John Wolpert, Consensys
It is not necessarily expected that every ounce of data or every unit of currency will eventually be represented on a blockchain. Even with a fraction of the world’s future data and money transacted or stored on a blockchain, however, the data and processing requirements will quickly outpace the current speeds and block limits of most decentralized protocols—even with future scaling mechanisms. The sheer amount of data our world will have to handle in the not-too-distant future necessitates that we explore more robust and sustainable methods of distributed ledger technology. Promoting a diverse, interoperable future rather than a monolithic one ensures we can continue supporting the exponential increase in global information without banking on one blockchain to scale proportionally to worldwide data creation and transaction.
Choosing a Settlement Layer for an Interoperable Blockchain Ecosystem
Choosing the right base settlement layer for an interoperable ecosystem largely boils down to one characteristic: decentralization. The danger of even a moderately centralized base settlement blockchain is that we repeat the same mistakes of Web2, but with orders of magnitude greater consequences. As we tokenize the world’s assets, for example, well-resourced financial houses and traders will spare no effort or expense to manipulate markets for gain or political advantage. We cannot allow the deep, liquid tokenized markets of the next-generation economy to be as vulnerable as those of the legacy economy, and we cannot choose anything other than a maximally decentralized base trust layer as the foundational settlement layer of the global economy. Another way to think about the importance of a settlement layer is as a gearbox for a diverse ecosystem of layers and blockchains that prioritize different features. Just as gears let an engine operate at different speeds, various layers in the ecosystem can run more slowly when they require maximal decentralization (think of this as first gear, which could even include systems based on older-style database technologies) and maximize throughput in higher gears, such as exchanges that need to process many thousands of transactions per second.
Scalability and the Settlement Layer
On the subject of scalability: the layer 2 mechanisms and sidechains for optimizing throughput help to address the scalability trilemma, a major challenge for all blockchains. The scalability trilemma dictates that decentralized systems can prioritize at most two of the following three properties: scalability (performance in terms of speed and volume), decentralization, and security. How do we increase transaction throughput to many thousands of transactions per second without forcing every node either to become a supercomputer or to accommodate an unsustainable volume of state data? Near-term solutions for Ethereum’s layer 2—including Plasma chains and state channels—can ease the scalability problem by moving some computation off of mainnet. Detailed transactions would occur on these subchains and state channels, and only their hashes would be exported to the mainchain. We can think of this like a grading system. A professor grades a test according to how many answers each student got right or wrong, but only enters the final test score in the grade book. At the end of the semester, the professor averages those test grades into a final grade for the course and files it with the academic dean, which we might think of as the settlement layer processing a final transaction on the blockchain. The specifics of the computation are not needed to see or trust the final hashed figure. More comprehensive solutions will be required in the long term to spread more of the workload of state storage, processing, and transaction pinning across all nodes in a network. Improving scalability with layered mechanisms like those in progress on Ethereum can alleviate the limitations of the scalability trilemma and make the mainnet the best viable settlement layer for a diverse, interoperable blockchain ecosystem.
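The hash-export idea above can be sketched in a few lines. This is a minimal illustration of committing a fixed-size digest rather than the full transaction detail, not an actual Plasma or state channel implementation; `commit_batch` and the toy transactions are hypothetical names for illustration only:

```python
import hashlib
import json

def commit_batch(transactions):
    """Hash a batch of detailed off-chain transactions into one digest,
    the only piece of data that would be posted to the settlement layer."""
    serialized = json.dumps(transactions, sort_keys=True).encode()
    return hashlib.sha256(serialized).hexdigest()

# Detailed activity stays on the sidechain or state channel...
batch = [
    {"from": "alice", "to": "bob", "amount": 5},
    {"from": "bob", "to": "carol", "amount": 2},
]

# ...and only a fixed-size commitment reaches the root chain.
digest = commit_batch(batch)
print(digest)
```

Like the professor's grade book, the digest is constant-size no matter how many transactions it summarizes, yet any change to the underlying batch produces a different commitment.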
Among existing networks, only Ethereum is sufficiently computationally expressive (ruling out Bitcoin) and sufficiently decentralized to serve as the root chain, favoring liveness and availability over safety and consistency in the event of a temporary network split. It can anchor a wide variety of network architectures, including Plasma-linked Ethereum sidechains for games or exchanges capable of processing 65,000 transactions per second or higher.
Quantifying Decentralization: Decentralized Transactions per Second
Decentralization is a fundamental blockchain concept, but how to actually determine or quantify decentralization—and consequently, how to value one blockchain’s potential over another—is more complicated. Currently, transactions-per-second throughput is the most popular competitive metric for comparing blockchains, but this emphasis on speed ignores the essential feature of decentralization. In his 2017 piece Quantifying Decentralization, Balaji Srinivasan proposed the use of the Gini and Nakamoto coefficients to attach an objective measure of decentralization to a blockchain. By applying Srinivasan’s logic of measuring comparable blockchain characteristics (i.e. node decentralization) and representing them numerically, we propose a measurement we call DTPS, or decentralized transactions per second. The purpose of DTPS is to factor a blockchain’s decentralization into the ecosystem debate of judging one blockchain’s transaction throughput against another’s. The statement that “EOS can process 4,000 transactions per second but Ethereum can only process 14” is often countered with, “but EOS’s protocol centralization jeopardizes security and governance.” There does not exist, however, a single comparable statistic that combines near-objective decentralization with objective TPS.
DTPS is the product of transactions per second (TPS) multiplied by a “Decentralization Quotient” (DQ):
DTPS = DQ * TPS
The DQ is a measurement reminiscent of Srinivasan’s Nakamoto Coefficient in its attempt to quantify the characteristics of a blockchain (or a system like Visa) that signify decentralization. DQ is measured between 0 and 1, where 1 represents fully decentralized and 0 represents fully centralized. DTPS aims to take into account all the transactions that occur on a public mainnet, as well as transactions occurring in parallel through sidechains, state channels, and other scaling or transaction throughput mechanisms.
The current issue with DTPS is the subjectivity of decentralization and transactions per second, especially with respect to scaling solutions that do not exist on the mainnet. This paper, therefore, introduces a preliminary conceptual framework for DTPS and positions it as a “measurement in progress,” with notable assumptions made in the following calculations. We invite the ecosystem to collaborate on ways to gather, verify, and establish more quantifiable decentralization factors to arrive at an agreed-upon approach to and definition of DTPS. If we look at DTPS on the layer 1 or public mainnet of a number of blockchains, we begin to see the opportunity and the challenges of defining the metric. TPS on the mainnet is relatively easy to determine. DQ, however, is more complex and encompasses far more variables. By looking at just the number of nodes and wallet owners, we can begin to determine which blockchains are more decentralized than others. Where to place those blockchains on a scale from 0 (totally centralized) to 1 (totally decentralized, a theoretical limit rather than a realistic benchmark), is (for now) more arbitrary. For the sake of this “measurement in progress,” let’s peg Bitcoin—currently understood as the most decentralized network—as 0.8. From there, we can approximate the DQs of other blockchains: ETH = 0.7, LTC = 0.5, TRON = 0.3, XRP = 0.2, EOS = 0.1. Visa, for example, would have a DQ (and thus a DTPS) of 0. With those arbitrary DQs, we get a snapshot of DTPS when considering only layer 1:
DTPS = DQ * TPS
BTC = 0.8 * 7 = 5.6 DTPS
ETH = 0.7 * 15 = 10.5 DTPS
LTC = 0.5 * 56 = 28 DTPS
TRON = 0.3 * 1200 = 360 DTPS
XRP = 0.2 * 1000 = 200 DTPS
EOS = 0.1 * 4000 = 400 DTPS
VISA = 0.0 * 65,000 = 0 DTPS
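The layer 1 snapshot above reduces to a one-line calculation per chain. A minimal sketch reproducing the table, using the same arbitrary DQ pegs and approximate TPS figures from the text:

```python
# DTPS = DQ * TPS, using the illustrative layer 1 figures from the text.
chains = {
    "BTC":  (0.8, 7),
    "ETH":  (0.7, 15),
    "LTC":  (0.5, 56),
    "TRON": (0.3, 1200),
    "XRP":  (0.2, 1000),
    "EOS":  (0.1, 4000),
    "VISA": (0.0, 65_000),   # fully centralized, so DTPS is always 0
}

for name, (dq, tps) in chains.items():
    print(f"{name}: {dq * tps:g} DTPS")
```

Note how the multiplication penalizes both extremes: a high-throughput but centralized system (Visa) scores zero, while a highly decentralized but slow chain (Bitcoin) scores low on raw throughput.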
When we begin to factor in the layer 2 scaling solutions being developed on top of these mainnets, we arrive at a more complete but (currently) more subjective view of DTPS. The subjectivity comes from the constantly developing TPS of layer 2 scaling solutions that are still in progress. By factoring in the understood or projected TPS numbers of existing layer 2 scaling solutions, we see a different snapshot of DTPS:
DTPS = DQ * TPS
BTC = [0.8 * 7] + [0.8 * 300] = 245.6 DTPS
= [Mainnet] + [Lightning]
ETH = [0.7 * 15] + [0.7 * 65,000] + [0.7 * 400] + [0.3 * 10] = 45,793.5 DTPS
= [Mainnet] + [Plasma] + [State Channels] + [Consortium]
LTC = 0.5 * 56 = 28 DTPS
TRON = 0.3 * 1200 = 360 DTPS
XRP = 0.2 * 1000 = 200 DTPS
EOS = 0.1 * 4000 = 400 DTPS
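With layer 2 included, each layer contributes its own DQ times TPS product, and an ecosystem's DTPS is the sum over its layers. A sketch of that summation using the projected figures above:

```python
# Layered DTPS: sum of DQ * TPS over each layer of an ecosystem,
# using the illustrative/projected figures from the text.
layers = {
    "BTC": [(0.8, 7), (0.8, 300)],              # mainnet + Lightning
    "ETH": [(0.7, 15), (0.7, 65_000),           # mainnet + Plasma
            (0.7, 400), (0.3, 10)],             # state channels + consortium
}

def dtps(chain_layers):
    """Total DTPS across all layers of one ecosystem."""
    return sum(dq * tps for dq, tps in chain_layers)

for name, chain_layers in layers.items():
    print(f"{name}: {dtps(chain_layers):,.1f} DTPS")
```

Note that each layer carries its own DQ: the consortium layer in the ETH example is weighted at 0.3, reflecting its lower decentralization, so high-throughput but centralized layers contribute proportionally less.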
The nuances of layer 2 scaling transactions per second are only half the input needed for a more complete view of DTPS. The Decentralization Quotient (DQ) also needs ecosystem mind-share to arrive at an established set of metrics that 1) can be gathered reliably and consistently, 2) signify a degree of decentralization, and 3) can be (relatively) equally compared across blockchains. Srinivasan posed a few of these metrics in Quantifying Decentralization, and we believe there are others to consider as well.
If, as a community, the blockchain ecosystem is able to agree upon objective measures of the metrics above, we can arrive at an accepted DQ definition that works across a variety of blockchain protocols. The purpose of DTPS is not to establish one blockchain as entirely ‘better’ than another in every way, but rather to offer the ecosystem a better understanding of which chain might be best suited to serve as the base settlement layer of an interoperable ecosystem. Beyond that, DTPS provides users with a more holistic understanding of different systems’ value propositions when considering on which chain to run a business, personal, or government function. By establishing a base settlement layer to which all blockchains ‘anchor’ their transactions, the DTPS of the ecosystem skyrockets, and grows with every sidechain or linked blockchain attached to that root chain. The result is a diverse ecosystem of blockchains, each perhaps uniquely suited for specific use cases, but all equally secure in their DTPS.
Why Ethereum
We should always imagine and strive for a future just beyond the limits of possibility, but we must also remain realistic about the future of blockchain technology. Continuing to focus on maximalism will not get the emerging blockchain industry very far, and if protocol teams continue to develop against one another rather than in collaborative parallel with one another, we will arrive at an insecure, unsustainable, balkanized blockchain ecosystem that will not fulfill its tremendous promise. The best answer lies in the middle ground: a radically decentralized, programmable base settlement layer on top of which interoperable blockchains can accommodate individual use cases without compromising security or privacy needs. Only through decentralization and interoperability is a blockchain-powered future truly accessible. The base settlement layer can and should be the blockchain protocol that emerges as the most decentralized, programmable, and secure. In the current state of the ecosystem, Ethereum has emerged as the most suitable option for that role.
Footnotes
The metrics and numbers filled in on this sheet are preliminary and incomplete. We invite the community to discuss the importance of the listed metrics, propose additional ones, and begin collecting data to complete this chart.
Number of companies (if any) on which the project depends. Additionally, the companies’ structure, location, and ownership / funding sources.
Does the network slow down or freeze if it loses n% of nodes?
ABOUT THE AUTHORS
Everett Muzzy
Everett is a writer and researcher at Consensys. His writing has appeared in Hacker Noon, CryptoBriefing, Moguldom, and Coinmonks.
Mally Anderson
Mally is a writer and researcher at Consensys. Her writing has appeared in MIT’s Journal of Design and Science, MIT’s Innovations, Quartz, and Esquire.