ENDGAME: Practically Eliminating Block Gas Limits for Developers

Mega GM // Gmega

Testnet is officially open for users (go try it out!) and "real-time" is no longer just a tag line—people can feel it.

But what is speed without capacity? If MegaETH had to sacrifice developers' ability to put complex logic onchain in the name of real-time responsiveness, we'd be undermining the very customers we seek to empower: application builders.

With that said, we're happy to report that we're not only taking chain responsiveness to its limits; we're also leveraging extreme node specialization to push our node network to the edges in order to practically eliminate block gas limits for those same developers.

Block gas limits, transaction limits, and deployable contract sizes are all magnitudes above "normal" in the EVM while maintaining real-time responsiveness, unlocking developer creativity and the next era of applications.

Read on if you want to find out how.

TYPICAL BLOCKCHAIN NETWORK

First, an overview of what a traditional network is composed of, so we can highlight the differences.

We'll start with an image to make it easy (if this connects for you feel free to skip this section):

Typical Blockchain Network Visualization

Common roles within a blockchain network: block producer, node network and users.

Now let's break down what each of these roles represents.

Common Network Roles

Block producer

This is the entity responsible for creating the blocks that get appended to the chain. For L1s, this is a diverse, distributed set of validators randomly selected for the role. For L2s, the common construction hands this role to a single machine: the sequencer. The key differences between the two parties that fill the block producer role are that sequencers typically have larger hardware requirements and either never give up the role or do so infrequently, whereas validators are constantly rotated (e.g., Solana leaders rotate out after ~1.2s).

Full Nodes

These machines receive the blocks produced by the block producer (whether validators or a sequencer), execute the transactions within those blocks to verify their accuracy against the existing chain history, and then update their local "truth" to stay in sync with the chain. Once in sync, they can serve this information to application users, developers who want chain data, and so on. This is the "network" of a blockchain.

Note: your network is only as fast as its slowest entity. If the nodes serving your chain information cannot both keep up with the blocks produced by the validator(s)/sequencer and verify their correctness, then your whole network operates at that slowed pace.
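To make that sync loop concrete, here is a toy Python sketch of it. The single-integer "state" and the `Block` shape are illustrative stand-ins, not any real client's data structures:

```python
from dataclasses import dataclass

@dataclass
class Block:
    txs: list          # toy transactions: each is just a balance delta
    state_root: int    # the post-state the block producer claims

def execute(state: int, block: Block) -> int:
    """Toy execution: re-run every transaction against local state."""
    for delta in block.txs:
        state += delta
    return state

def sync(state: int, blocks: list) -> int:
    """The full-node loop: execute, verify against the claimed root, then accept."""
    for block in blocks:
        new_state = execute(state, block)
        if new_state != block.state_root:  # a real node compares Merkle roots
            raise ValueError("block failed verification; rejecting")
        state = new_state  # update local "truth" only after verifying
    return state

chain = [Block(txs=[5, -2], state_root=3), Block(txs=[10], state_root=13)]
print(sync(0, chain))  # 13
```

The key point is the order of operations: a full node never serves a block's state until it has re-executed and verified it, which is exactly why its hardware must keep pace with the block producer.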

Users

This is you. When you are reading information from apps, or submitting tx to the chain, it is all routed through full nodes which are kept in-sync with block producers. Pretty self-explanatory here.

THE HARDWARE AGREEMENT

So, those are the parties - great. But what does this have to do with gas limits? To understand that, we have to discuss what gas (along with two other dimensions of scale) represents within a distributed network.

In short, the gas limit represents onchain compute, or a block's complexity. It is a network's promise to its nodes that X hardware is enough to execute the blocks it produces without falling behind. Gas limits are essentially a compute-throttling method.
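As a rough sketch of gas-as-throttle, using Ethereum mainnet's ~30M block gas limit and the 21,000-gas base transfer cost as reference points (the heavy-call figure is an arbitrary example):

```python
BLOCK_GAS_LIMIT = 30_000_000   # Ethereum mainnet's approximate block gas limit
SIMPLE_TRANSFER_GAS = 21_000   # base cost of a plain ETH transfer

def max_txs_per_block(per_tx_gas: int, block_gas_limit: int = BLOCK_GAS_LIMIT) -> int:
    """Upper bound on how many transactions of a given gas cost fit in one block."""
    return block_gas_limit // per_tx_gas

print(max_txs_per_block(SIMPLE_TRANSFER_GAS))  # 1428 simple transfers
print(max_txs_per_block(500_000))              # only 60 heavy contract calls
```

The heavier the per-transaction compute, the fewer transactions fit in a block: that is the throttle in action.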

It's not the only dimension that dictates a chain's throughput though.

Two other elements of impact are:

- Bandwidth: the data a node must download per block to stay in sync.
- Storage: the ever-growing chain state a node must persist and access quickly.

Together with compute, these make up an implicit "hardware agreement" of the network:

Hardware Agreement Visualization

The three dimensions of scaling that impact a network's throughput

The traditional setup across crypto is to have singular machines (full nodes) operate in isolation and be able to handle the maximum possible requirement of all 3 dimensions.

One full node must have:

- Enough compute to execute every block at the pace it is produced.
- Enough bandwidth to download every full block as it is broadcast.
- Enough storage to hold (and quickly read/write) the chain's full state.

Of the above, compute is often the most limiting for the average EVM network, which is why block gas limits are similar across well-distributed networks:

Block Gas Limits Comparison

Table: Comparison of gas parameters across EVM chains in 2024 (source: Paradigm [https://www.paradigm.xyz/2024/04/reth-perf])

So, the problem is identified as the compute required of a single machine to keep up with the block producers of a chain.
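The "slowest entity" point can be expressed in one line: effective throughput is the minimum across the three dimensions. The numbers below are illustrative placeholders, not benchmarks of any real chain:

```python
def effective_throughput(compute_tps: float, bandwidth_tps: float, storage_tps: float) -> float:
    """A node (and thus the network) moves only as fast as its tightest limit."""
    return min(compute_tps, bandwidth_tps, storage_tps)

# A node whose link and disk could sustain far more than its CPU can execute:
print(effective_throughput(compute_tps=1_000, bandwidth_tps=5_000, storage_tps=3_000))
# prints 1000: compute is the bottleneck
```

Raising the gas limit without addressing single-machine compute just widens the gap between what blocks demand and what the tightest dimension can deliver.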

How do we solve for that? Node specialization.

NODE SPECIALIZATION: MEGAETH'S ANSWER

"Okay, but WTF does node specialization mean?" - You

It just means that we've taken the approach of splitting this traditionally single entity (the full node) into a combination of machines, each specialized to handle a single dimension of scale, instead of having one machine perform all 3.

Node Specialization Visualization

A side-by-side of traditional network nodes next to MegaETH's specialized node infrastructure

Then: The full node must handle the maximum bandwidth, compute, and storage output of the block producers.

Now: The full node is replaced by a replica node, which receives only state diffs instead of full blocks. The full blocks are distributed across an entire network of prover nodes that independently execute them and report proofs of their validity to the replica nodes. Visualized:

Node Specialization Visualization

A visualization of the prover network and replica node relationship
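A toy contrast of the two node types, assuming made-up data shapes (real state diffs and validity proofs are far richer than a dict and a boolean):

```python
def full_node_apply(state: dict, txs: list) -> dict:
    """Full node: re-execute every transaction (compute scales with the block)."""
    for account, delta in txs:
        state[account] = state.get(account, 0) + delta
    return state

def replica_apply(state: dict, state_diff: dict, proof_valid: bool) -> dict:
    """Replica node: apply precomputed diffs, trusting the prover network's proof."""
    if not proof_valid:
        raise ValueError("prover network rejected this block")
    state.update(state_diff)  # no re-execution needed
    return state

txs = [("alice", -5), ("bob", 5)]
full = full_node_apply({"alice": 10, "bob": 0}, txs)

diff = {"alice": 5, "bob": 5}  # the post-state entries this block touched
replica = replica_apply({"alice": 10, "bob": 0}, diff, proof_valid=True)

assert full == replica  # same resulting state; vastly different work to get there
```

Both paths land on identical state; the replica simply skips the expensive execution step, which is what decouples the gas limit from any single machine's compute.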

The implications of the above are:

- Replica nodes need only minimal hardware, since they apply state diffs rather than re-execute full blocks.
- The compute burden of execution is spread across the prover network instead of resting on every single node.
- Block gas limits are no longer bound by what one machine can execute, so they can be raised dramatically.
- Individual node operators no longer have to re-execute every transaction themselves to trust the chain.

On that last bit, it is important to note that MegaETH will also offer the Full Node option for those who wish to verify 100% of the chain state themselves.

Node Specialization Visualization

Latest node specs offered by MegaETH

Cool, compute/gas limits are gone—what does that mean for me?

IMPACTS OF NO GAS LIMITS

At the highest level it just means "people can do more complex stuff onchain." In practice, gas limits often reveal themselves as strict size limits on both contracts and transactions, and those limits are now dramatically relaxed.

A comparison:

No Gas Limits Comparison

A comparison of MegaETH's compute upper limits to the average EVM chain

Two things to point out:

For developers: The deployable contract size (~512 KB) is 21.3x larger than your average EVM chain's (24 KB).

For users: The single tx limit of MegaETH (presently 1G) is 24.5x larger than the full block limits of your average EVM chain.
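Checking the arithmetic behind those multipliers (the ~40.8M average block gas limit below is back-solved from the quoted 24.5x figure, not a number from the table above):

```python
contract_multiplier = 512 / 24                # 512 KB MegaETH cap vs the EVM's 24 KB cap
tx_multiplier = 1_000_000_000 / 40_800_000    # 1G-gas tx vs an implied ~40.8M avg block limit

print(round(contract_multiplier, 1))  # 21.3
print(round(tx_multiplier, 1))        # 24.5
```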

Utilizing these new limits is going to take some time because they come with configuration nuances, but we expect teams to adapt quickly, and we're in our developer chat to assist with any questions.

LIMITS ARE REALLY HIGH. WHAT DO?

A few example categories:

- Monolithic contracts: deploy large, feature-complete contracts without splitting them to fit the usual 24 KB cap.
- Compute-heavy transactions: run logic in a single tx that would previously have exceeded an entire block.
- Onchain-first design: move work onchain that used to live offchain purely for gas reasons.

"What everyone is missing is that MegaETH has virtually eliminated gas limits for the EVM" - @0x_ultra

In the end, it's just onchain creativity: a mindset shift from scarcity, gas golfing, and contract optimizations to abundance, no longer limited by the current EVM paradigm.

Now it's on applications to maximize it.

Published: 03/25/2025