The Long History of the Fight over Scaling Bitcoin
The fight surrounding how to scale Bitcoin did not start in 2015. Nor did it start in 2013, nor in 2011. It is as old as Bitcoin itself; it was there even before Satoshi released the first version of the cryptocurrency. And the arguments have not changed much since then.
Sometimes debates progress; sometimes they do not. In the case of scaling Bitcoin, we find a debate that has been rotating on the same axes for years.
2008 James Donald
While most people know that the heated fight about Bitcoin’s block size limit did not come from nowhere, few people are aware that the debate goes all the way back to November 2008, when Satoshi announced his project for the first time on the cryptography mailing list.
In fact, the first reply Satoshi received, the first idea expressed about Bitcoin, the first spark his paper ignited, was literally “it does not seem to scale.” James A. Donald, a Canadian libertarian and cypherpunk, was the first person to answer Satoshi. He said:
We very, very much need such a system, but the way I understand your proposal, it does not seem to scale to the required size… To detect and reject a double spending event in a timely manner, one must have most past transactions of the coins in the transaction, which, naively implemented, requires each peer to have most past transactions, or most past transactions that occurred recently. If hundreds of millions of people are doing transactions, that is a lot of bandwidth – each must know all, or a substantial part thereof.
The answer Satoshi gave to James A. Donald is still frequently quoted by advocates of on-chain scaling:
Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification to check for double spending, which only requires having the chain of block headers, or about 12kB per day.
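Satoshi’s “12kB per day” figure is easy to reproduce: a Bitcoin block header is 80 bytes, and a block arrives roughly every ten minutes. A minimal sketch of the arithmetic:

```python
# Verify Satoshi's "about 12kB per day" estimate for SPV clients,
# which only need block headers rather than full blocks.
HEADER_BYTES = 80          # serialized size of a Bitcoin block header
BLOCKS_PER_DAY = 24 * 6    # one block roughly every 10 minutes

daily_header_bytes = HEADER_BYTES * BLOCKS_PER_DAY
print(daily_header_bytes)          # 11520 bytes
print(daily_header_bytes / 1000)   # ~11.5 kB, i.e. "about 12kB per day"
```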
Only miners would need to run full nodes, and Satoshi seemed to have no problem with the emergence of big server farms:
At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.
Satoshi also explained that the bandwidth requirements might not be as problematic as James A. Donald thought:
The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so let’s say one kB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. That many transactions would take 100GB of bandwidth, or the size of 12 DVD or two HD quality movies, or about $18 worth of bandwidth at current prices.
If the network were to get that big, it would take several years, and by then, sending two HD movies over the Internet would probably not seem like a big deal.
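The arithmetic in the quote is straightforward to reproduce: at roughly 1 kB of broadcast data per transaction, Visa-scale volume of 100 million transactions per day works out to 100 GB of daily bandwidth:

```python
# Reproduce Satoshi's back-of-the-envelope bandwidth estimate.
TX_BROADCAST_BYTES = 1_000       # ~400-byte transaction, broadcast twice, rounded up
VISA_TX_PER_DAY = 100_000_000    # Visa's FY2008 daily average, per the quote

daily_bytes = TX_BROADCAST_BYTES * VISA_TX_PER_DAY
print(daily_bytes / 1e9)  # 100.0 GB of bandwidth per day
```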
Usually the quote stops here. It is often cited to demonstrate that Satoshi planned to scale Bitcoin on-chain and was optimistic that this could be done to a Visa-like level. But it is interesting to read James A. Donald’s answer. He makes his points, and, more importantly, he anticipates many of the arguments the opponents of on-chain scaling raise today.
First he answered with a somewhat confusing statement about governments and economic entities, expressing his discontent with leaving it to big server farms to mint new bitcoin:
If a small number of entities are issuing new coins, this is more resistant to state attack that with a single issuer but the government regularly attacks financial networks, with the financial collapse ensuing from the most recent attack still under way as I write this.
So for James A. Donald the only sustainable solution was to keep the requirements for mining nodes as small as possible:
I think we need to concern ourselves with minimizing the data and bandwidth required by money issuers – for small coins, the protocol seems wasteful. The smaller the data storage and bandwidth required for money issuers the more resistant the system is the kind of government attacks on financial networks that we have recently seen.
Comparing Bitcoin with bank cards was not enough for James A. Donald:
You have to go where no bank card goes.
At present, file sharing works by barter for bits… File sharing requires extremely cheap transactions, several transactions per second per client, day in and day out, with monthly transaction costs being very small per client…
And he saw only one method to support file sharing with bitcoin:
… we will need a layer of account money on top of the bitcoins, supporting transactions of a hundred thousandth the size of the smallest coin, and to support anonymity, chaumian money on top of the account money.
Today this idea is known as payment channels. Much of the hope now rests on the Lightning Network becoming a layer on top of Bitcoin that builds a network of payment channels, enabling users to instantly send a nearly unlimited number of transactions.
Like the advocates of Lightning, James A. Donald saw Bitcoin more as a settlement system than as a payment system:
We can build a privacy layer on top of this – account money and chaumian money based on bitgold coins, much as the pre 1915 US banking system layered account money and bank notes on top of gold coins, and indeed we have to build a layer on top to bring the transaction cost down to the level that supports agents performing microtransactions, as needed for bandwidth control, file sharing, and charging non-whitelisted people to send us communications.
With this quote we have more or less the whole debate in a nutshell. The roots of the block size debate, therefore, started to grow months before Satoshi made the first Bitcoin transaction to Hal Finney.
2010 Hal Finney
In December 2010 Hal Finney, another long-term cypherpunk and cryptographer, made a claim for Bitcoin banks:
Actually there is a very good reason for Bitcoin-backed banks to exist, issuing their own digital cash currency, redeemable for bitcoins. Bitcoin itself cannot scale to have every single financial transaction in the world be broadcast to everyone and included in the block chain. There needs to be a secondary level of payment systems which is lighter weight and more efficient. Likewise, the time needed for Bitcoin transactions to finalize will be impractical for medium to large value purchases.
Hal Finney considered Bitcoin not a payment system but a currency. And as with every other currency, the payments themselves could be handled by intermediaries, which he called banks.
Bitcoin-backed banks will solve these problems. They can work like banks did before nationalization of currency. Different banks can have different policies, some more aggressive, some more conservative. Some would be fractional reserve while others may be 100 percent Bitcoin backed. Interest rates may vary. Cash from some banks may trade at a discount to that from others.
Bitcoin, the network of miners confirming transactions on the blockchain, can be an anchor in a network of Bitcoin banks:
I believe this will be the ultimate fate of Bitcoin, to be the “high-powered money” that serves as a reserve currency for banks that issue their own digital cash. Most Bitcoin transactions will occur between banks, to settle net transfers. Bitcoin transactions by private individuals will be as rare as… well, as Bitcoin-based purchases are today.
The answers Hal Finney received were a gentle little foretaste of the toxicity this topic contains. A bitcointalk user replied:
Sounds like a fractional reserve system, one of the fiat money pitfall bitcoin claims it can avoid.
But at this time no one was able to foresee how big the debate would become.
2013 Gavin Andresen, Peter Todd, Mike Hearn, Gregory Maxwell
In late January 2013, the scalability issue became a hot topic on bitcointalk. In this era some people started to see the hardcoded 1MB block size limit as a problem and proposed hard forks to lift the limit.
One user demanded that the limit be changed as fast as possible:
Changing this limit needs to be discussed now, before we start hitting it. Already a quick glance at the blockchain shows plenty of blocks exceeding 300kB. Granted most of that’s probably S.Dice, but nobody can really dispute that bitcoin is rapidly growing, and will hit the 1MB ceiling fairly soon.
The 1MB block size limit was not so much a practical problem as a problem of expectations and promises: with 1MB blocks, Bitcoin can process at most about seven transactions per second (in practice, more like three or four). That is far too few for a currency sold under the slogan of becoming a world currency.
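The often-cited “seven transactions per second” follows directly from the 1MB limit and the ten-minute block interval; the realistic figure depends on the average transaction size. A rough calculation, with the transaction sizes chosen purely for illustration:

```python
# Throughput implied by a 1 MB block size limit and ~10-minute blocks.
MAX_BLOCK_BYTES = 1_000_000
BLOCK_INTERVAL_SECONDS = 600

def max_tps(avg_tx_bytes):
    """Maximum transactions per second for a given average transaction size."""
    return MAX_BLOCK_BYTES / avg_tx_bytes / BLOCK_INTERVAL_SECONDS

print(round(max_tps(250), 1))  # ~6.7 tps with small 250-byte transactions
print(round(max_tps(500), 1))  # ~3.3 tps with 500-byte transactions
```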
But already in this thread, the proponents of raising the block size faced radical opposition. User da2ce7 explained why a block-size-increasing hard fork was a fundamental problem:
This has been discussed again and again. This is a hard-limit in the protocol, changing it is as hard as changing the total number of coins… i.e. virtually impossible.
Many people have invested into Bitcoin under the pretence that the hard-limits of the protocol do not change.
Even if a super-majority wanted the change. A significant amount of people (myself included) will reject the chain.
And Gregory Maxwell, today known as a thought leader among the proponents of keeping blocks small, explained his view on the topic. First he made the claim that a limit is needed to fund miners in the long term:
Without a sharp constraint on the maximum block size there is currently no rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.
Bitcoin is valuable because of scarcity. One of the important scarcities is the limited supply of coins, another is the limited supply of block space; limited block space creates a market for transaction fees, the fees fund the mining needed to make the chain robust against hostile reorganization. I have not yet seen any suggestion as to how Bitcoin is long-term viable without this except ones that argue for cartel or regulatory behavior (both of which I don’t consider viable: they moot the decentralized purpose of Bitcoin.)
But this is not his only concern:
With gigabyte blocks bitcoin would not be functionally decentralized in any meaningful way: only a small, self-selecting group of some thousands of major banks would have the means and the motive to participate in validation.
The belief that bigger blocks will destroy Bitcoin’s most important property, decentralization, has thus been around since 2013.
In this year, Bitcoin’s capacity was ruled by soft limits, which miners could lift without a hard fork. As transaction volume rose in 2013, the devs discussed lifting the 250-kilobyte soft limit.
At this time, Gavin Andresen started to explore ideas for how to deal with block size limits in the long term. He proposed a floating block size to make sure the limit would grow with demand. Peter Todd stepped in and warned against this idea:
There has been a lot of talk lately about raising the block size limit, and I fear very few people understand the perverse incentives miners have with regard to blocks large enough that not all of the network can process them, in particular the way these incentives inevitably lead towards centralization.
The scenario in which miners raise the block size again and again to push weaker nodes out of the network is one of the main criticisms raised today against Bitcoin Unlimited’s approach to the block size issue.
Either way miners, especially the largest miners, make the most profit when the blocks they produce are large enough that less than 100 percent, but more than 50 percent, of the network can process them.
The end of the story, as Todd put it, would be “centralization of mining capacity.”
At this point Gavin Andresen explained his view in the debate. Perhaps his posts show better than anything else the contrast between the two camps that had already emerged at this point in time:
I strongly feel that we shouldn’t aim for Bitcoin topping out as a “high power money” system that can process only seven transactions per second.
I agree with Stephen Pair– THAT would be a highly centralized system.
The one side fears the centralization of nodes and miners, while the other fears the centralization of transactions. Andresen made clear where his priorities lay:
I think we should put users first. What do users want? They want low transaction fees and fast confirmations. Let’s design for that case, because THE USERS are who ultimately give Bitcoin value.
And then the discussion we are all well aware of today got started. On the one side Gavin Andresen, Mike Hearn, and some others; on the other side Peter Todd, Gregory Maxwell, and others. Pieter Wuille, like some other users, took more or less a middle position in this debate. Most users seemed to agree that the block size limit had to be raised at some point, but few agreed on the new limit or the timing.
Maybe one user nailed down what the debate really was about:
This comes down to Bitcoin as a payment network versus Bitcoin as a store of value.
The old, long, never-ending story: What are the priorities? Is Bitcoin’s property of being a decentralized store of value more important, or its ability to serve as a payment system for its users? Should nodes remain decentralized, or transactions?
The discussion in this thread went on for 26 pages. Outside the thread, the debate has never reached a conclusion; the gap through the community that emerged at this time has not ceased to exist. It has only grown wider and wider.