Mythbusting: Bitcoin (on its own) cannot scale.

Written by TomZ, 4 years ago


This myth is a stubborn one to squash. For years people have published numbers, shown results and made design improvements that clearly indicate Bitcoin can scale on-chain, mostly with bigger blocks.

Yet the myth persists that this is impossible. The most often heard counter-argument against Bitcoin scaling on-chain is of the gut-feeling type: take some large group of people (say, a country), give them x transactions a day, and blocks will be n gigabytes, every 10 minutes. When we see that rather large number, extrapolated per day or per month, it feels impossible.
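
To make that gut-feeling argument concrete, here is a minimal back-of-envelope sketch. The population, per-person transaction count and average transaction size below are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope: what block sizes would a whole country need?
# All inputs are illustrative assumptions.
population = 10_000_000        # a mid-sized country
tx_per_person_per_day = 2      # assumed payment habits
bytes_per_tx = 250             # rough average size of a simple transaction
blocks_per_day = 24 * 6        # one block every 10 minutes

tx_per_day = population * tx_per_person_per_day
tx_per_block = tx_per_day / blocks_per_day
block_bytes = tx_per_block * bytes_per_tx

print(f"{tx_per_day:,.0f} tx/day "
      f"-> {tx_per_block:,.0f} tx per block "
      f"-> {block_bytes / 1e6:,.1f} MB per block")
# 20,000,000 tx/day -> 138,889 tx per block -> ~34.7 MB per block
```

A number like 35 MB every 10 minutes sounds enormous next to 1 MB blocks, and that intuition is exactly what the rest of this article tries to put into historical perspective.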

The basic myth to bust here isn't really about Bitcoin and on-chain scaling. The basic problem is that humans have no natural intuition for exponential growth, and most of us still don't grasp how far computers have come in the last couple of decades thanks to that growth. Sure, you can stream a 4K video over the Internet and watch it using a 3-by-1-inch device you plug into your TV; people accept that today. But processing some huge amount of data every day in a server room, that just doesn't make sense...

This scaling myth is so pervasive for the same reason that in 1970, 1980, and again in 1990, 2000 and 2010, and now in (almost) 2020, people simply could not grasp the advances the next 10 years would bring.

So this myth is easy to bust on a purely historical basis. We literally have 50 years of data supporting the claim that we can scale on-chain. The earliest such claim comes from Satoshi himself:

"It never really hits a scale ceiling []
"By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even If Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions."

The really hard part is not to forget that history when someone claims we can't possibly expect a computer to do [something] in 20 years. Never mind that we are simultaneously talking about artificial intelligence taking our jobs, robots going out on patrol, and cars driving themselves soon.

Scaling, how?

Here is a little slice of history with regard to scaling.

  • In 2016, tests showed a good desktop could process 368 million transactions a day.

  • In June, tests showed a good desktop could process 1,944 million transactions a day.

  • In November, tests showed a simple server could process 3,480 million transactions a day.
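
To put those daily figures in more familiar units, here is a small sketch that converts transactions per day into transactions per second and into the implied size of a 10-minute block. The 250-byte average transaction size is an assumption, not a measured figure from these tests.

```python
# Convert reported throughput (tx/day) into tx/s and implied 10-minute block size.
# The 250-byte average transaction size is an assumed round number.
SECONDS_PER_DAY = 24 * 60 * 60
BLOCKS_PER_DAY = 24 * 6
BYTES_PER_TX = 250

results = {
    "2016 desktop": 368_000_000,
    "June desktop": 1_944_000_000,
    "November server": 3_480_000_000,
}

for name, tx_per_day in results.items():
    tx_per_second = tx_per_day / SECONDS_PER_DAY
    block_gb = tx_per_day / BLOCKS_PER_DAY * BYTES_PER_TX / 1e9
    print(f"{name}: {tx_per_second:,.0f} tx/s, ~{block_gb:.1f} GB per block")

# 2016 desktop: 4,259 tx/s, ~0.6 GB per block
# June desktop: 22,500 tx/s, ~3.4 GB per block
# November server: 40,278 tx/s, ~6.0 GB per block
```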

The bottom line is that in the small slice of history I have personally observed in crypto, throughput has already grown roughly 10-fold in 3 years, through software development and simply getting newer hardware.

The software improvements are, in large part, a reaction to people believing this myth: it is why I felt compelled to act and work on scaling in the software. Actual usage of the network today is around 1000 times less than what tests show we can process. Now imagine what the entire industry could accomplish if there were a real need for more scale and we were getting close to the limits. One man doing the work versus an entire industry. Can you imagine? When there is an incentive, people suddenly become a very inventive bunch.

The modern computer is still getting faster every year; Moore's Law is still on schedule and going well. Moore co-founded Intel, yet Intel has failed to hit that rate of increase for the last several years, which has become more visible as AMD has been overtaking Intel and beating it at the scaling numbers. This is the open market at work: the long-term effect continues uninterrupted, a different company simply carries the scaling race forward.

The effect on our scaling myth is simple: buying the latest hardware every 2 years gets you twice the speed, so you can serve twice the number of people every 2 years.
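
As a quick sanity check on what such doubling compounds to, here is a small sketch. The doubling periods are illustrative: 24 months matches the "twice every 2 years" claim above, while roughly 18 months corresponds to the "100 times faster in 10 [years]" rate quoted from Satoshi.

```python
# How a fixed doubling period compounds over a decade.
# Doubling periods are illustrative assumptions.
def capacity_multiplier(years: float, doubling_period_years: float) -> float:
    """Relative capacity after `years` if capacity doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

for period in (2.0, 1.5):
    print(f"doubling every {period} years -> "
          f"{capacity_multiplier(10, period):,.0f}x after 10 years")

# doubling every 2.0 years -> 32x after 10 years
# doubling every 1.5 years -> 102x after 10 years (close to Satoshi's "100 times faster in 10")
```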

So, in short:

  • Moore's Law is working as normal, even though Intel is no longer at the forefront. Every 10 years we can serve 100 times the number of people we can support today.

  • On top of this, software innovations will allow further speed-ups. The 10x historical gain mentioned above, for instance, is far beyond what hardware alone could have reached, and that was in a software project with a very limited number of people. Imagine what a large team working together can do.

  • Early this year a test successfully demonstrated 250 MB blocks, while economic activity today averages around 250 KB blocks: a 1000-times difference. We have a huge amount of growth available before we even need much scaling innovation (see the sketch after this list).
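
To express that headroom in years rather than multiples, here is a rough sketch. The assumption that usage doubles every 2 years mirrors the hardware-doubling claim above and is purely illustrative.

```python
# How long does a 1000x headroom last if usage keeps doubling?
# The 2-year usage-doubling period is an illustrative assumption.
import math

demonstrated_block = 250e6   # 250 MB blocks demonstrated in tests
typical_block = 250e3        # ~250 KB blocks of economic activity today
usage_doubling_years = 2

headroom = demonstrated_block / typical_block   # 1000x
doublings = math.log2(headroom)                 # ~10 doublings
years = doublings * usage_doubling_years        # ~20 years

print(f"{headroom:,.0f}x headroom = {doublings:.1f} doublings "
      f"= about {years:.0f} years at one doubling every {usage_doubling_years} years")
# 1,000x headroom = 10.0 doublings = about 20 years at one doubling every 2 years
```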

Our industry, hardware and software combined, has demonstrated that it can grow (more than) exponentially.

Historical facts show Satoshi was right: using bigger blocks, we never really hit a scaling ceiling.


Comments

TomZ, did you read my "book"?


Thanks Tom!

+100 MYTHBUSTER

txid: 11e122dcc4dbbf995cd6ced283e892273c23eb109b47aec8d44af96857d92a87

I think this is a worthy attempt to bust this myth, even though some more perspective could be added to give a fuller picture; I leave it to others if they want to submit competing articles on this myth.

I think the jury is still out on the Moore's Law question - it might not be something I agree with 100% - but regardless of where one falls on that, the article clearly demonstrates that there is more than enough room with current technology to grow way beyond what Bitcoin is currently handling.

Some links to the Gigablock Testnet project materials and to Vermorel's analysis of terabyte blocks would be nice, to dispel the economic questions surrounding the hardware needed to process MUCH bigger blocks.

— btcfork, 4 years ago