What Scalability Means in the Cryptocurrency Space

On the one hand, blockchain is currently one of the most promising areas in technology. On the other hand, it also shows how quickly lofty plans can be brought back down to reality, even when the whole world and its best minds have created them.

At the end of the day, it is always about scalability, i.e. how software behaves when it is put under heavy load, typically from many simultaneously active users. That is always the moment of truth. This applies especially to peer-to-peer software, which depends on many peers collaborating with one another.

Scalability, as experience keeps teaching us, is the crux of the matter.

In the case of Bitcoin, the problem is world-famous: energy consumption rises rapidly because a certain algorithm ("proof of work") consumes, and must consume, a lot of computing time. There are other technical reasons why the number of transactions per second is severely limited. As a result, even after years, the transaction rate is still nowhere near high enough for Bitcoin to be used as an everyday means of payment. That will probably no longer change.
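The computational cost of proof of work comes from brute-force hashing: a miner tries nonce after nonce until a hash meets a difficulty target. A minimal sketch, with a simplified difficulty measured in leading zero hex digits (real Bitcoin uses double SHA-256 over a binary block header and a 256-bit numeric target, so this is only an illustration of why the work is unavoidable):

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    Illustrative sketch only: expected work grows roughly 16x with each
    additional required digit, which is why mining burns computing time.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # each attempt costs one full hash computation

# Finding a valid nonce is expensive; verifying it afterwards costs one hash.
nonce = proof_of_work("example block", 4)
print(nonce)
```

Note the asymmetry this creates: any peer can verify the result with a single hash, while producing it required tens of thousands of attempts, and the network periodically raises the difficulty to keep block times constant regardless of how much hardware joins.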

In practice, however, this in no way means that Bitcoin no longer plays a role. Decentralized currency seems too important to too many people to be abandoned. That is why there is already a myriad of new projects, all promising to scale better than Bitcoin. In principle there are two possibilities:

  1. Completely new projects that start from scratch to make everything better.
  2. "Second layer" projects that use another project as the base technology and build on top of it to improve scalability.

The second approach is the more natural one, because it reflects what is already commonplace in the IT industry. Software is created in layers, with the higher layers becoming more and more specialized while the lower layers solve basic issues.

Growth across layers has long been taking place in the blockchain area. However, it is sometimes wild growth, and it will be cut back over time because many projects simply fail, perhaps because they are useless (or outright scams). In any case, it is an evolutionary process, not the product of superior human intelligence.

This brings us back to the beginning of our deliberations. An important criterion for the survivability of software is its scalability. Scalability is by no means achieved only by involving the brightest and most talented minds to create something the world has never seen before. Rather, if software eventually becomes fast and stable, it will be due to many different factors, not just software quality as it is usually measured.

Programmer, Thinker, Entrepreneur, “Intuitioner”

Alexander Weinmann
