ENDGAME: Practically eliminating block gas limits for developers
Mega GM // Gmega
Testnet is officially open for users (go try it out!) and "real-time" is no longer just a tagline: people can feel it.
But what is speed without capacity? If MegaETH had to sacrifice developers' ability to put complex logic onchain in the name of real-time responsiveness, we'd be undermining the very customers we seek to empower: application builders.
With that said, we're happy to report that we're not only taking chain responsiveness to its limits; we're also leveraging extreme node specialization to push our node network to the edges in order to practically eliminate block gas limits for those same developers.
Block gas limit, transaction limit, and deployable contract sizes are all magnitudes above "normal" in the EVM while maintaining real-time responsiveness, unlocking developer creativity and the next era of applications.
Read on if you want to find out how.
TYPICAL BLOCKCHAIN NETWORK
First, an overview of what a traditional network comprises so we can highlight the differences.
We'll start with an image to make it easy (if this connects for you feel free to skip this section):

Common roles within a blockchain network: block producer, node network and users.
Now let's break down what each of these roles represents.
Common Network Roles
Block producer
This is the entity responsible for creating the blocks that are appended to the chain. For L1s, this is a diverse, distributed set of validators randomly selected for the role. For L2s, the common construction hands this role to a single machine: the sequencer. The key difference between the two parties that fill the block producer role is that sequencers typically have larger hardware requirements and either never give up the role or do so infrequently, whereas validators are constantly rotated (e.g. Solana leaders rotate out after ~1.2s).
Full Nodes
These machines receive the blocks produced by the block producer (whether validators or a sequencer), re-execute the transactions within those blocks to verify them against the existing chain history, and then update their local "truth" to stay in sync with the chain. Once in sync, they can serve this information to application users, developers wanting chain data, and so on. This is the "network" of a blockchain. Note: a network is only as fast as its slowest entity. If the nodes serving your chain information cannot both keep up with the blocks produced by the validator(s)/sequencer and verify their correctness, then your network is operating at that slowed pace.
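The verification loop a full node runs can be sketched as a toy model. A dict-based state and SHA-256 fingerprints stand in for a real client's Merkle-trie state roots; every name and number below is illustrative:

```python
# Toy model of a full node verifying a block by re-execution.
# (Real clients commit state into a Merkle trie, not a JSON hash.)
import hashlib
import json

def state_hash(state: dict) -> str:
    """Deterministic fingerprint of the node's local state."""
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

def apply_tx(state: dict, tx: dict) -> None:
    """Minimal 'execution': move a balance from sender to recipient."""
    state[tx["from"]] = state.get(tx["from"], 0) - tx["value"]
    state[tx["to"]] = state.get(tx["to"], 0) + tx["value"]

def verify_block(local_state: dict, block: dict) -> bool:
    """Re-execute every tx, then check that our resulting state
    matches the state the block producer claimed."""
    for tx in block["txs"]:
        apply_tx(local_state, tx)
    return state_hash(local_state) == block["claimed_state_hash"]

# A producer builds a block...
producer_state = {"alice": 100, "bob": 0}
txs = [{"from": "alice", "to": "bob", "value": 30}]
for tx in txs:
    apply_tx(producer_state, tx)
block = {"txs": txs, "claimed_state_hash": state_hash(producer_state)}

# ...and the full node independently re-executes it to verify.
assert verify_block({"alice": 100, "bob": 0}, block)
```

The point of the sketch: the node pays the full compute cost of every transaction in every block just to stay in sync, which is exactly the cost the gas limit exists to bound.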
Users
This is you. When you read information from apps or submit transactions to the chain, everything is routed through full nodes kept in sync with block producers. Pretty self-explanatory.
THE HARDWARE AGREEMENT
So, those are the parties - great. But what does this have to do with gas limits? To understand that we have to discuss what gas, along with two other dimensions of scale, represent within a distributed network.
In short, the gas limit represents onchain compute, or a block's complexity, and is a network's promise to its nodes: to keep up with the blocks it produces, you only need X hardware to execute them without falling behind. Gas limits are essentially a compute-throttling method.
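To make that promise concrete, the sustained compute a node signs up for is just the gas limit divided by the block time. The figures below are illustrative, Ethereum-like numbers, not any specific chain's parameters:

```python
# Back-of-the-envelope: the gas limit caps the compute rate every
# node must sustain to keep up. Numbers are illustrative assumptions.
def gas_per_second(block_gas_limit: int, block_time_s: float) -> float:
    """Sustained execution rate the 'hardware agreement' demands."""
    return block_gas_limit / block_time_s

# Hypothetical chain: 30M gas per block, 12s block time.
rate = gas_per_second(30_000_000, 12.0)
print(rate)  # 2500000.0 gas/s that every full node must execute, forever
```

Raise the gas limit or shrink the block time and this number grows, which is why chains that want broad node participation keep both conservative.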
It's not the only dimension that dictates a chain's throughput though.
Two other elements of impact are:
- Bandwidth - The upload/download speed of nodes, which enables their ability to communicate with the rest of the network
- Storage - The hardware required of nodes to store the chain's information. The more a chain can process, the more information needs to be stored.
Together with compute, these make up an implicit "hardware agreement" of the network:

The three dimensions of scaling that impact a network's throughput
The traditional setup across crypto is to have singular machines (full nodes) operate in isolation and be able to handle the maximum possible requirement of all 3 dimensions.
One full node must have:
- The bandwidth to download/upload all of the blocks
- The compute power to re-execute all transactions for all blocks
- The storage capacity to hold the entire chain's state
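This implicit agreement can be pictured as a simple predicate: a machine can follow the chain only if it meets the network's maximum requirement on every dimension simultaneously. The threshold values below are made-up placeholders:

```python
# Toy "hardware agreement": one full node must clear the bar on all
# three dimensions at once. All thresholds are made-up placeholders.
chain_requirements = {
    "bandwidth_mbps": 100,   # download/upload every block
    "compute_mgas_s": 50,    # re-execute every transaction
    "storage_tb": 4,         # hold the entire chain state
}

def can_follow_chain(machine: dict) -> bool:
    """A machine keeps up only if no single dimension falls short."""
    return all(machine.get(dim, 0) >= need
               for dim, need in chain_requirements.items())

# Strong bandwidth and storage don't help if compute falls behind:
print(can_follow_chain({"bandwidth_mbps": 500, "compute_mgas_s": 10, "storage_tb": 8}))  # False
```

The `all(...)` is the whole story: the weakest dimension sets the ceiling, which is why the next paragraph singles out compute.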
Of the above, compute is often the most limiting for the average EVM network, which is why block gas limits are similar across well-distributed networks:

Table: Comparison of gas parameters across EVM chains in 2024 (source: Paradigm [https://www.paradigm.xyz/2024/04/reth-perf])
So, the problem is identified as the compute required of a single machine to keep up with the block producers of a chain.
How do we solve for that? Node specialization.
NODE SPECIALIZATION: MEGAETH'S ANSWER
"Okay, but WTF does node specialization mean?" - You
It just means we've split this traditionally single entity (the full node) into a combination of machines, each specialized to handle a single dimension of scale, instead of having one machine perform all three.

A side-by-side of traditional network nodes next to MegaETH's specialized node infrastructure
Then: The full node must handle max bandwidth, compute and storage outcomes of the block producers.
Now: The full node is replaced with a replica node, which receives only state diffs instead of full blocks. The full blocks are distributed amongst an entire network of prover nodes that independently execute them, then report proofs of validity back to the replica nodes. Visualized:

A visualization of the prover network and replica node relationship
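The replica-node side of that diagram can be sketched in a few lines: the replica applies the changed key-value pairs directly instead of re-executing the block, trusting the prover network for validity. The diff format below is a made-up illustration, not MegaETH's actual wire format:

```python
# Sketch of the replica-node idea: apply the state diff (the changed
# key-value pairs) instead of re-executing transactions. Validity is
# delegated to prover nodes. Diff format here is purely illustrative.
def apply_state_diff(state: dict, diff: dict) -> None:
    """Overwrite only the keys the block actually touched."""
    state.update(diff)

replica_state = {"alice_balance": 100, "bob_balance": 0}

# The producer already executed the transactions (however complex);
# the replica just receives the result:
state_diff = {"alice_balance": 70, "bob_balance": 30}
apply_state_diff(replica_state, state_diff)

# The replica's cost scales with keys changed, not with how much
# compute it took to produce the diff.
print(replica_state)  # {'alice_balance': 70, 'bob_balance': 30}
```

Compare this with the full-node sketch earlier: the per-block work drops from "re-execute everything" to "write a handful of keys," which is what removes compute as the binding constraint.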
The implications of the above are:
- Because compute (i.e. transaction complexity) is no longer handled by a single entity for every block, and is instead spread across a cluster of machines in the prover network, it is no longer the most limiting dimension of scale and is virtually eliminated as a constraint
- This shifts the problem to bandwidth and storage, with storage size due to state growth being the one we're currently focused on. To mitigate this, we're iterating on pricing models derived from the number of kvs updated rather than transaction complexity (gas)
- Splitting this single machine into a network of machines does introduce some trust assumptions into this specific setup
On that last bit, it is important to note that MegaETH will also offer the Full Node option for those who wish to verify 100% of the chain state themselves.
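The kv-based pricing idea mentioned above can be sketched as a toy fee function. The fee constant and diff format are hypothetical, not MegaETH's actual model:

```python
# Toy sketch of pricing by state growth rather than compute: charge
# per key-value slot updated, not per unit of gas burned. The fee
# constant is hypothetical.
PRICE_PER_KV_UPDATE = 10  # hypothetical fee units per updated slot

def storage_fee(state_diff: dict) -> int:
    """Fee scales with how many kv slots a transaction touches,
    independent of how much compute produced that diff."""
    return len(state_diff) * PRICE_PER_KV_UPDATE

# A compute-heavy tx that writes one slot pays less than a cheap tx
# that writes five:
heavy_but_small = {"result": 42}
cheap_but_wide = {f"slot{i}": i for i in range(5)}
print(storage_fee(heavy_but_small), storage_fee(cheap_but_wide))  # 10 50
```

The design intuition: once provers absorb the compute, the scarce resource the network must meter is state growth, so the fee follows the diff size rather than the gas used.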

Latest node specs offered by MegaETH
Cool, compute/gas limits are gone—what does that mean for me?
IMPACTS OF NO GAS LIMITS
At the highest level it just means "people can do more complex stuff onchain." Today, that compute constraint most often reveals itself as strict size limits on both contracts and transactions.
A comparison:

A comparison of MegaETH's compute upper limits to the average EVM chain
Two things to point out:
For developers: The deployable contract size (~512kb) is 21.3x larger than your average EVM chain's (24kb).
For users: The single tx limit of MegaETH (presently 1G) is 24.5x larger than the full block limits of your average EVM chain.
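As a quick sanity check on those multipliers (the ~40.8M average block gas figure below is back-derived from the 24.5x claim, so treat it as approximate):

```python
# Verifying the quoted multipliers. The average block gas figure is
# back-derived from the article's 24.5x claim, not a measured value.
avg_contract_kb, mega_contract_kb = 24, 512
print(round(mega_contract_kb / avg_contract_kb, 1))  # 21.3

avg_block_gas, mega_tx_gas = 40_800_000, 1_000_000_000
print(round(mega_tx_gas / avg_block_gas, 1))  # 24.5
```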
Utilizing these new limits will take some time because they come with configuration nuances, but we expect teams to adapt quickly, and we're in our developer chat to assist with any questions.
LIMITS ARE REALLY HIGH. WHAT DO?
A few example categories:
- Complex onchain computation
  - Running machine learning models directly in smart contracts
  - Real-time price calculations
  - Full sorting of large arrays without loop limits
  - Graph algorithms that can traverse entire networks/relationships
- Storage and State Management
  - Maintaining larger in-contract data structures
  - Keeping more historical data accessible in contract storage
  - Handling bulk operations in a single transaction
- Protocol Design
  - Running full zero-knowledge proof verification
  - Complex cryptographic operations without offchain components
  - Real-time automated market makers with sophisticated formulas
"What everyone is missing is that MegaETH has virtually eliminated gas limits for the EVM" - @0x_ultra
In the end, it's just onchain creativity: a mindset shift from scarcity, gas golfing, and contract optimizations to abundance, no longer limited by the current EVM paradigm.
Now it's on applications to maximize it.
Published: 03/25/2025