BLOCKCHAIN BLOCK SIZE AND SCALABILITY EXPLAINED

13 Jul 2023

1. Why is block size important?

The size of individual blocks on a blockchain can have a significant impact on the network's speed and capacity, but there are always trade-offs.

Blockchains get their name from the fact that they are composed of an ever-growing chain of blocks. Blocks themselves are batches of transaction data. The amount of data contained in each block, combined with the chain's block generation speed, determines the number of transactions per second, or TPS, that the network can handle.
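
To make that relationship concrete, here is a minimal Python sketch of the arithmetic. The 400-byte average transaction size is an illustrative assumption; real transactions vary widely:

```python
# A minimal sketch of how block size and block interval bound throughput.
# The average transaction size is an illustrative assumption.

def estimate_tps(block_size_bytes: int, avg_tx_size_bytes: int,
                 block_interval_seconds: float) -> float:
    """Rough upper bound on transactions per second."""
    txs_per_block = block_size_bytes // avg_tx_size_bytes
    return txs_per_block / block_interval_seconds

# Bitcoin-like parameters: 1 MB blocks every ~10 minutes,
# assuming ~400-byte transactions on average.
print(estimate_tps(1_000_000, 400, 600))  # ~4.2 TPS
```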

A higher TPS rate makes a network more attractive, so developers are always looking to improve it. Actual rates vary based on network conditions, but Bitcoin currently maxes out at around seven TPS, and Ethereum isn't much better at roughly 15 TPS.

By comparison, Visa can process around 1,700 TPS, so improvements must be made if these networks want to compete as global payment solutions. Because the TPS rate of a blockchain is deeply tied to each block's size, block size becomes a significant factor in finding a path to mainstream adoption.

As we shall see, merely increasing block size is only one way to approach the issue, and there are many different viewpoints on how to move forward.

2. What are some ways blockchains can scale?

Scaling solutions come in two forms: on-chain and off-chain. Both have pros and cons, yet there is no clear agreement on which is more promising for future growth.

On-chain scaling

On-chain scaling refers to the philosophy of changing something about the blockchain itself to make it faster. For instance, one approach involves shrinking the amount of data used in each transaction so that more transactions fit into a single block.

Altering how transaction data is handled in this way can bring a notable improvement to overall network capacity. Another way to boost the TPS of a network is to increase the rate of block generation. While this can be helpful, the method is limited by the time it takes to propagate a new block through the network. You don't want new blocks being created before the previous block has been communicated to all of the nodes on the network; otherwise, consensus issues will arise.
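
A rough sketch of that trade-off, assuming a worst-case propagation delay of 15 seconds (real delays depend on network topology and block size):

```python
# Sketch: shrinking the block interval raises TPS, but only while blocks
# still have time to reach the whole network. All figures are assumptions.

TXS_PER_BLOCK = 2_500        # illustrative capacity of one block
PROPAGATION_DELAY_S = 15     # assumed worst-case time to reach all nodes

for interval_s in (600, 60, 30, 15):
    tps = TXS_PER_BLOCK / interval_s
    risky = interval_s <= PROPAGATION_DELAY_S
    note = "  (consensus at risk: blocks outpace propagation)" if risky else ""
    print(f"{interval_s:>4}s interval -> {tps:>6.1f} TPS{note}")
```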

Creating seamless communication between discrete blockchains is another potential way these systems could scale. If different chains can all transact with one another, then no single network has to handle as much data, and each should improve.

Of course, a system would be needed to ensure the data being sent between networks is 100% accurate. Combining multiple native chains and smart contracts makes it possible for the entire decentralized ecosystem to scale together, once fully implemented.

Then there's a technique called sharding, in which transactions are broken up into "shards," with different nodes only confirming specific shards, effectively performing parallel processing to speed up the system.
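
Conceptually, that might look like the sketch below, which routes transactions to shards by hashing the sender's address. This is an illustrative scheme, not a description of any specific protocol:

```python
# Conceptual sketch of sharding: each shard's validators only process
# their own slice of the transaction load, in parallel with the others.
import hashlib

NUM_SHARDS = 4

def shard_for(sender_address: str) -> int:
    """Deterministically assign a sender to a shard."""
    digest = hashlib.sha256(sender_address.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

txs = ["0xAlice->0xBob", "0xCarol->0xDave", "0xErin->0xFrank"]
shards = {i: [] for i in range(NUM_SHARDS)}
for tx in txs:
    sender = tx.split("->")[0]
    shards[shard_for(sender)].append(tx)

for shard_id, batch in shards.items():
    print(f"shard {shard_id} processes {len(batch)} tx(s) in parallel")
```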

Sharding can be applied to proof-of-work or proof-of-stake systems and forms a major component of Ethereum 2.0. The technique offers the potential to improve the network's capacity and speed, and developers hope to see upward of 100,000 TPS become a reality.

It should be made clear that it will still take a few years before sharding is fully implemented in Ethereum. Critics have also pointed out that it adds complexity and weakens the security of the system, because sharding increases the chances of a "double-spend" occurring through an attack.

The issue here is that it takes notably fewer resources to take over an individual shard than to perform a traditional 51% attack, which can lead to transactions being confirmed that would otherwise be rejected as invalid, such as the same Ether (ETH) being sent to two different addresses.
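
Back-of-the-envelope arithmetic shows why, assuming validators are split evenly across shards (all counts here are illustrative):

```python
# Why taking over one shard is cheaper than attacking the whole network.
# Validator and shard counts are illustrative assumptions.

total_validators = 10_000
num_shards = 64
validators_per_shard = total_validators // num_shards  # ~156

whole_network_attack = total_validators // 2 + 1       # 5,001 validators
single_shard_attack = validators_per_shard // 2 + 1    # ~79 validators

print(f"majority of the network: {whole_network_attack} validators")
print(f"majority of one shard:   {single_shard_attack} validators")
# This is why sharded designs typically shuffle validators between
# shards at random, so an attacker cannot camp on a single shard.
```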

Some projects have attempted to improve network speeds by limiting the number of validating nodes, a very different philosophy from Ethereum's. An example of this is EOS, which has limited its validators to just 21. Token holders vote on these 21 validators in an effort to keep governance fair and distributed, with sometimes mixed results.

The network has reported 4,000 TPS, and its developers are confident they can continue to scale, which has positioned the project as one of Ethereum's main competitors in this space.

However, a limited validator set is often looked down upon as a form of centralization, so not all users are sold on the model, especially if decentralization is one of the key benefits you are looking for in a blockchain system.

Of course, one of the most frequently discussed ways to scale a blockchain is to increase the size of individual blocks. This has been tried before: it's the approach Bitcoin Cash took when it forked away from Bitcoin in 2017. Not wanting a limit of 1 MB, the Bitcoin Cash community changed the rules so that the project could have 8 MB, and later 32 MB, blocks.

This does mean there is more room in each block for added transaction data, but some point out that it is unrealistic to continue growing block sizes indefinitely. 

Some consider this solution to merely postpone the problem, and at worst to harm the blockchain's decentralized nature. And given that the average block on the Bitcoin Cash network is still well under 1 MB, whether this is a viable solution remains open to debate.

Off-chain scaling

There are also ways to improve network throughput that don't directly change anything about the blockchain. These are often called second-layer solutions, as they sit on top of the blockchain. One of the most well known of these projects is the Lightning Network for Bitcoin.

Lightning Network nodes can open channels between each other and transact back and forth directly; only when the channel is closed does the Lightning Network transmit the final tally to be recorded on-chain.
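
The accounting idea can be sketched in a few lines of Python. Real channels rely on multisignature contracts and signed commitment transactions; this toy model only shows how many off-chain updates collapse into a single on-chain settlement:

```python
# Toy payment channel in the spirit of the Lightning Network: balances
# are updated off-chain, and only the final state touches the chain.

class PaymentChannel:
    def __init__(self, balance_a: int, balance_b: int):
        self.balances = {"A": balance_a, "B": balance_b}  # funded on-chain
        self.updates = 0

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount   # off-chain: no block needed
        self.balances[receiver] += amount
        self.updates += 1

    def close(self) -> dict:
        """Only this final tally is broadcast to the main chain."""
        return self.balances

channel = PaymentChannel(balance_a=50_000, balance_b=50_000)
for _ in range(1_000):                    # 1,000 payments, zero blocks
    channel.pay("A", "B", 10)
print(channel.updates, "off-chain updates ->", channel.close())
```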

These channels can also be linked together, creating a much faster, cheaper payment system that only interacts with the main network a fraction of the time.

Ethereum also has solutions along these lines. There is the Raiden Network, designed to be Ethereum's version of the Lightning Network. The project performs not only off-chain transactions but also off-chain state changes, which allows for the processing of smart contracts. The biggest drawback with this system and those like it is that they are works in progress, with bugs and other technical issues that can still occur if channels aren't created or closed correctly.

A similar approach is the sidechain: a blockchain "branched off" of the main chain, with the ability to move the native asset between the two.

This means sidechains can be created for specific purposes, keeping that transaction activity off the primary network and freeing up bandwidth for things that need to be settled on the main chain.
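
At its core, moving an asset to a sidechain is bookkeeping: coins are locked on one side and an equivalent amount is released on the other. The sketch below illustrates the concept only; it is not how any production peg is secured:

```python
# Conceptual two-way peg between a main chain and a sidechain.

class TwoWayPeg:
    def __init__(self):
        self.locked_on_main = 0    # coins held by the peg on the main chain
        self.sidechain_supply = 0  # matching coins on the sidechain

    def to_sidechain(self, amount: int) -> None:
        """Lock on the main chain, release on the sidechain."""
        self.locked_on_main += amount
        self.sidechain_supply += amount

    def to_main(self, amount: int) -> None:
        """Burn on the sidechain, unlock on the main chain."""
        if amount > self.sidechain_supply:
            raise ValueError("cannot release more than is pegged")
        self.sidechain_supply -= amount
        self.locked_on_main -= amount

peg = TwoWayPeg()
peg.to_sidechain(100)  # activity now happens off the main chain
peg.to_main(40)        # settle part of it back
print(peg.locked_on_main, peg.sidechain_supply)  # 60 60
```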

For Bitcoin, there is the Liquid sidechain, and Ethereum's version is known as Plasma. The downside here is that each sidechain needs to be secured by its own nodes, leading to issues with trust and security if a user is unaware of who is running them behind the scenes.

3. What are the arguments for and against increasing block size?

Those who want to see block size increase argue that larger blocks not only improve capacity and speed but also push down fees. Those against it are concerned that larger blocks will lead to greater centralization.

Those who believe in increasing block size see it as key to bringing Bitcoin (BTC) and other decentralized assets into mainstream adoption. It is certainly fair to point out that as block size increases, more transactions are confirmed in each block, and the average transaction fee drops.
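
A toy fee auction illustrates the effect: blocks take the highest-fee transactions first, so the going rate is roughly the fee of the last transaction that fits. All numbers here are assumptions for the sake of the example:

```python
# Illustrative fee market: more room per block means less competition
# for space, so the marginal (market-clearing) fee falls.
import random

random.seed(1)
offered = sorted((random.randint(1, 100) for _ in range(10_000)), reverse=True)

for txs_per_block in (2_500, 10_000):  # smaller block vs. larger block
    marginal_fee = offered[txs_per_block - 1]
    print(f"{txs_per_block:>6} txs fit -> marginal fee ~{marginal_fee}")
```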

That sounds like the best of both worlds, as the network would be both faster and cheaper. The argument is strengthened when advocates point out that other scaling solutions, such as the sidechains mentioned earlier and sharding, are still being tested and aren't ready to be mass-implemented yet.

These are all fair points, but increasing the size of blocks does have consequences. The reason larger blocks are a problem is that node operators need to download each new block as it is propagated, which with current hardware is no major issue if blocks are 1 MB, 4 MB or even 32 MB in size.

However, if a blockchain is to be adopted globally, then even this is not enough. Blocks would need to be on the scale of gigabytes, which could be a roadblock for many. If most average users cannot afford hardware or internet connections capable of handling this, then fewer of them will run nodes, leading to increased centralization.
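
The storage math alone shows the scale of the problem, assuming gigabyte blocks at Bitcoin-style 10-minute intervals:

```python
# Rough yearly storage growth for gigabyte-scale blocks.
# Block size and interval are assumptions for illustration.

block_size_gb = 1
block_interval_minutes = 10

blocks_per_year = 365 * 24 * 60 / block_interval_minutes  # 52,560 blocks
new_data_tb = blocks_per_year * block_size_gb / 1_000
print(f"~{new_data_tb:,.0f} TB of new chain data per year")  # ~53 TB
```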

Ultimately, the ones who decide on these changes to a network are the miners, who can signal that they support an upgrade to the network's protocol. 

Because many miners are grouped into large pools, which ultimately all signal together, this can conceivably be another form of centralization, as those working together have far more say than lone miners ever could. 

Fortunately, there is more than one way to approach this issue, and not all projects want to see open-ended block sizes. Other developers approach the problem in different, creative ways in the hope of putting the scaling debate to rest.

4. How have different projects approached the issue?

No single solution has emerged as the best one, and projects are still actively investigating creative versions of all these philosophies to make scalable networks.

Lightning Network and sidechain research is still going strong, and many expect some form of these technologies to be what makes everyday purchasing with Bitcoin normal.

As mentioned before, projects such as Bitcoin Cash have embraced the creation of larger blocks, and BitcoinSV has taken this further with a vast upper limit of 2 GB per block. This has led to an increase in the cost of maintaining a node and more frequent issues with orphaned blocks.

Not all projects are taking the larger-block approach, of course. Ethereum itself is looking to migrate to a new proof-of-stake system that is being labeled Casper. Another project, Cardano, has developed a unique approach called Hydra, which sees each user generating 10 "heads," with each head acting as a new channel for throughput on the network. Hopefully, this will allow for seamless scalability, as increased use of the system should also generate increased capacity.

5. The debate is still not settled.

Despite much of the extraordinary work being done, there is no one clear solution for blockchain scaling, and there may never be.

All of the possibilities explored in this article could help bring digital assets onto a global stage, but none has yet risen above the others. There are pros and cons to each, and it isn't implausible that there simply won't be a single defining winner. Different projects with unique goals may need to scale in different ways.

It is even plausible that more than one of these ideas could be used in tandem to multiply their benefits. In time, how to scale blockchains will be less of an issue, but we aren't there just yet.

By testing and reimagining solutions, developers should be getting closer each day to a global data processing system that rivals or surpasses current offerings. 

It is essential to keep an open mind and be willing to try new things; the answer we are looking for may already be undergoing testing in the field right now, and the future looks bright for blockchain.