On the one hand, the blockchain space is currently one of the most promising areas in technology. On the other hand, it also shows how quickly lofty plans can be brought back down to reality, even when the whole world and its best minds created them.
At the end of the day, it always comes down to scalability: how software behaves when it is put under heavy load, typically by many simultaneously active users. That is always the moment of truth. It applies especially to peer-to-peer software, which depends on many peers collaborating with one another.
Scalability, as experience keeps teaching, is the crux of the matter.
In the case of Bitcoin, the problem is world-famous: energy consumption rises rapidly because the “proof of work” algorithm consumes (and must consume) a great deal of computing time. There are other technical reasons why the number of transactions per second is severely limited. As a result, even after years, the transaction rate is still nowhere near high enough for Bitcoin to serve as an everyday means of payment. That will probably never change.
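The core of the energy problem can be seen in a minimal sketch of the proof-of-work idea. This is an illustration only, not Bitcoin's actual mining code: real mining hashes a block header with double SHA-256 against a 256-bit target, whereas here we simply search for a nonce whose SHA-256 digest starts with a given number of zero hex digits.

```python
import hashlib

def proof_of_work(data: bytes, difficulty: int) -> int:
    """Find a nonce so that sha256(data + nonce) starts with
    `difficulty` zero hex digits. Illustrative sketch, not the
    real Bitcoin mining algorithm."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each additional zero digit multiplies the expected number of
# hash attempts by 16 -- the work is tunable, but only by making
# every miner burn more computing time.
nonce = proof_of_work(b"block payload", 4)
print(nonce)
```

The point of the exercise: the difficulty parameter makes the expected work grow exponentially, and there is no shortcut, since that brute-force cost is precisely what secures the chain.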
In practice, however, this by no means implies that Bitcoin no longer plays a role. Decentralized currency seems too important to too many people to be abandoned. That is why there is already a myriad of new projects that all promise to scale better than Bitcoin. In principle, there are two approaches:
- Completely new projects start from scratch to do everything better.
- “Second layer” projects use an existing project as their base technology and build on top of it to improve it.
The second approach is the more natural one, because it reflects what is already commonplace in the IT industry: software is created in layers, with the higher layers becoming more and more specialized while the lower layers solve basic issues.
Growth across layers has long been taking place in the blockchain area. Sometimes, however, it is wild growth, and it will be cut back over time as many projects simply fail, perhaps because they are useless (or outright scams). In any case, it is an evolutionary process, not the product of superior human intelligence.
This brings us back to the beginning of our deliberations. An important criterion for the survivability of software is its scalability. Scalability is by no means achieved simply by involving the brightest and most talented minds to create something the world has never seen before. Rather, if software eventually becomes fast and stable, it is due to many different factors, not just software quality as it is usually measured.