In fact, scaling Ethereum's L1 is a major benefit for L2s
Original Title: Why Scaling Ethereum is Bullish for L2s
Original Author: Etherealize
Original Translation: Ken, ChainCatcher
On February 3, Vitalik Buterin posted on X, garnering over 6 million views. "The initial vision of L2 and its role in ETH no longer makes sense," he wrote. "We need a new path."
Stakeholders in competing blockchains quickly framed this as an admission of failure. Crypto news outlets called it a "major reversal." The narrative that took hold was that ETH had finally admitted defeat: the rollup-centric roadmap was unworkable, and the monolithic scaling approach of chains like Solana had been proven correct.
This assertion is incorrect. If you make investment decisions based on this claim, you risk being on the wrong side and missing out on the most significant infrastructure changes currently happening in the cryptocurrency space.
What Vitalik Actually Said
If you read the full text rather than just the headline, the message is clear. ETH has not abandoned Layer 2 (L2) networks. It is shifting from a "rollup-centric" scaling approach, in which L2s were expected to be copies of the base layer, to a model of aggressively scaling L1 (the base layer itself). L2s remain important, but for a different reason: customization.
The initial vision viewed L2 rollups as copies—simple replicas of the ETH virtual machine without the burden of base layer consensus. The idea was that these rollups would eventually decentralize to "stage two," inheriting ETH's full security guarantees while providing cheaper transactions. In exchange, they would contribute to ETH's liquidity network effects and security budget.
But this did not happen. As Vitalik acknowledged, "The pace of L2's advancement to stage two (and then interoperability) has been far slower and much more difficult than initially expected." Many chains that call themselves L2 are, in practice, centralized blockchains with ETH bridging. They can unilaterally change rules, censor transactions, and migrate entirely, contributing little to ETH's network effects.
Then two things happened that rendered the initial vision outdated. Both were positive developments.
The Base Layer is Rapidly Scaling
After the London hard fork in August 2021, ETH's gas limit was set to 30 million gas per block, a level maintained for over three years. The ETH community has been cautious about raising throughput because of a real trade-off at the core of blockchain design: pushing too much computation onto the chain raises hardware requirements for validators, concentrating the network in a few hands and undermining the decentralization that gives the system its value.
To a large extent, this is precisely the trade-off that ETH's competitors choose to ignore. For example, today a Solana validator node requires enterprise-grade hardware: over 24 physical CPU cores, 256 GB of RAM, multiple enterprise-grade NVMe SSDs, and a 10Gbps network connection. The monthly hosting cost for a competitive validator node can exceed $1,000. In contrast, an ETH validator can run on a mini-computer worth $1,100 placed under your desk. This is not a trivial difference. Because of this, ETH can maintain around 1 million active validator nodes while achieving a level of decentralization that other smart contract platforms find hard to match. As of early 2026, the Solana network has only about 800 active validator nodes.
But blockchains do need to scale. High-performance competitors have proven that there is a huge market demand for cheap, fast L1 transactions. ETH's response is a broader cultural shift—from "long-term research" to "short-term execution," and the results are already showing.
In 2025, validators coordinated to double the gas limit from 30 million to 60 million, while the Pectra and Fusaka upgrades expanded blob capacity and introduced other protocol improvements. The ETH Foundation has also committed to an aggressive roadmap that aims to roughly triple L1 throughput each year for the foreseeable future.
By the end of 2026, the goal is to push the gas limit above 100 million. In 2027, block time is expected to be halved from 12 seconds to 6 seconds (and possibly shortened to 4 seconds), effectively doubling throughput again without changing block size. That same year, block-level access lists will let nodes process transactions in parallel, removing a major computational bottleneck. In 2028, migrating to a binary tree state structure will allow still higher gas limits by removing the need for validators to store the entire state on disk. By 2029, the network will begin transitioning to a native zero-knowledge architecture: a fundamental change that radically alters the mathematics of scaling.
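The back-of-the-envelope throughput math behind these numbers can be sketched as follows. The 21,000-gas figure (the base cost of a simple ETH transfer) is an assumption added for illustration; real transactions use more gas, so real-world TPS is lower.

```python
# Rough throughput from gas limit and block time.
# Assumes every transaction is a simple transfer (21,000 gas),
# which overstates real-world TPS but shows the two scaling levers.

SIMPLE_TRANSFER_GAS = 21_000

def tps(gas_limit: int, block_time_s: float) -> float:
    """Transactions per second = gas per second / gas per transaction."""
    return gas_limit / block_time_s / SIMPLE_TRANSFER_GAS

today = tps(60_000_000, 12)         # doubled gas limit, 12 s blocks
faster_blocks = tps(60_000_000, 6)  # halving block time doubles throughput
bigger_blocks = tps(100_000_000, 6) # the 2026 gas-limit goal on 6 s blocks

print(f"{today:.0f} TPS -> {faster_blocks:.0f} TPS -> {bigger_blocks:.0f} TPS")
```

Note that gas limit and block time are independent levers: raising either multiplies gas-per-second, which is why the roadmap pursues both.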
The key breakthrough behind this long-term vision is the zkEVM. Today, every node in an L1 blockchain must re-execute every transaction to verify the state. A zkEVM compresses that work into a constant-sized cryptographic proof that takes minimal computational resources to check. Combined with ETH's data availability sampling, which will let validators verify that data exists without downloading all of it, this creates a path to throughput comparable to high-performance chains while retaining the decentralization that gives ETH block space its unique value.
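The cost asymmetry can be illustrated with a toy sketch. Everything below is a stand-in: the "proof" is a plain hash, not a real SNARK, so it demonstrates only the cost structure (one party executes, everyone else does constant work), not the cryptographic soundness a real zkEVM provides.

```python
import hashlib

# Today's model: every node re-executes every transaction (O(N) work each).
def re_execute(transactions, apply_tx, state):
    for tx in transactions:
        state = apply_tx(state, tx)
    return state

# zkEVM model: one prover executes, everyone else checks a small artifact.
# NOTE: the hash below is a placeholder for a succinct zero-knowledge proof;
# a real proof cannot be forged by whoever claims the final state.
def prove(transactions, apply_tx, pre_state):
    final_state = re_execute(transactions, apply_tx, pre_state)
    proof = hashlib.sha256(repr((pre_state, final_state)).encode()).hexdigest()
    return final_state, proof

def verify(pre_state, claimed_state, proof) -> bool:
    expected = hashlib.sha256(repr((pre_state, claimed_state)).encode()).hexdigest()
    return proof == expected  # constant work, no re-execution

# Toy state machine: the state is a running total, each tx adds an amount.
txs = [3, 5, 7]
add = lambda s, tx: s + tx
final, pi = prove(txs, add, 0)
print(final, verify(0, final, pi))
```

The point is the shape of the work: `prove` scales with the number of transactions, while `verify` does a fixed amount of work regardless of how many transactions the proof covers.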
This is about five years ahead of most observers' expectations. The progress is striking enough that Ben Edgington, a leading figure in ETH's transition to proof of stake, announced he would come out of retirement to rejoin the project.
ETH Foundation researcher Justin Drake articulated the North Star goals of this technical effort: a "fast L1" with finality within seconds; a "gigagas L1" reaching 10,000 transactions per second through real-time zkEVM proofs; and a "teragas" L2 ecosystem reaching 10 million transactions per second through data availability sampling. The roadmap also prioritizes post-quantum cryptography and native privacy features at the base layer.
The New Value Proposition of Layer 2
So, if L1 is scaling, what is the significance of L2?
L2 has found its product-market fit: meeting the needs of institutions that want both the security of ETH and the liquidity of the ETH ecosystem, while also wishing to customize chains to better serve their clients and comply with regulatory requirements.
This leads back to the question of "stage two." For an L2 to reach stage two decentralization, it must give up unilateral control over upgrades to its bridge and proof-system contracts, including the ability to respond quickly to regulatory demands or patch urgent vulnerabilities. For institutions bringing millions of users into the ETH ecosystem, that is a real operational constraint.
This is the core tension in today's L2 ecosystem. Users can still withdraw their assets back to ETH's L1, which is the most important security guarantee provided by rollups. But without achieving stage two, operators can still upgrade bridging contracts, censor transactions, or change rules. Moreover, due to a lack of interoperability, each L2 is fragmenting liquidity and competing with other L2s in a manner not fundamentally different from alternative L1s.
Vitalik's article resolves this tension by acknowledging reality: L2 exists on a continuous spectrum, and that's okay. Some L2s will pursue complete stage two decentralization and serve as true extensions of ETH block space. Others will maintain more centralized control in exchange for customization capabilities, which is also a reasonable use case as long as this trade-off is honestly communicated in market promotions.
The demand from institutions for the second type of L2 is enormous and growing, with Robinhood's decision to build ETH L2 being the clearest example of this.
In June 2025, Robinhood announced at EthCC (the Ethereum Community Conference) that it would build its own ETH Layer 2 using the Arbitrum tech stack instead of launching a new L1 blockchain. This surprised many in the crypto industry. Robinhood is one of the largest retail brokers in the world. It has the resources and user base to launch its own chain, and it had actively discussed doing so. But it ultimately chose not to.
The reasoning articulated by Robinhood's cryptocurrency head Johann Kerbrat hits at the core of why L2 is important: "Ensuring the security of a truly and highly decentralized chain is extremely difficult, and we can essentially get that for free from ETH. When you see newly created L1s, they are actually neither decentralized nor secure, so ultimately, what you have is just a fancy database that might even be slower than a real database."
The second factor is liquidity. Robinhood's goal is to tokenize all assets— starting with publicly traded stocks and expanding to private equity, real estate, and other real-world assets. This requires tapping into ETH's existing liquidity network. As Kerbrat said, "We need that liquidity... If you're alone on your private island, no one can come and go freely. I believe we can attract customers because Robinhood is a large platform, but we want to rebuild the entire financial system on-chain, and we need everyone to be able to come to our island."
Robinhood CEO Vlad Tenev compared the customizability of L2 with solutions built on alternative L1s like Solana, viewing it as a trade-off between short-term value and long-term value: "In the long run, control is more important, as it allows us to build better products. Plus, the technology behind these rollups has become so excellent that you really haven't missed out on much." As an L2, Robinhood retains complete control over sequencer revenue, gas fees, regulatory customization, and product roadmaps— while inheriting the security and settlement guarantees of ETH. It can name it 'Robinhood Chain' while letting ETH handle the most challenging parts.
Robinhood is not alone. Coinbase (Base), Kraken (Ink), and OKX (X Layer) have all launched their own ETH L2s. But a more telling signal is who chooses to build with them. Just this month, Nasdaq partnered with Kraken to build a tokenized stock gateway, and the parent company of the New York Stock Exchange, Intercontinental Exchange, invested $200 million in OKX, planning to put NYSE-listed stocks on-chain.
These institutions need the security of ETH and the liquidity of its ecosystem. But they also need regulatory compliance, privacy controls, custom fee structures, and operational control. A permissionless, fully transparent base layer cannot meet all these needs. But a Layer 2 built on top of it can.
As Vitalik wrote in a follow-up clarification a few days later, L2s should "do things that really bring something new" (such as privacy, efficiency for specific applications, ultra-low latency, and institutional compliance). Most importantly: "The vibes should match the substance." How closely an L2 is tied to ETH in public perception should match how closely it is tied in reality. A sidechain with a bridge is not the same as a stage two rollup whose security is inseparable from ETH's. Calling oneself an "ETH L2" should mean something specific about security guarantees.
This is about protecting the integrity of the ETH brand, and in turn, protecting the trust that institutions are beginning to build in ETH.
Layer 2 remains the best business model in the crypto space. You do not need to spend millions of dollars annually on validator infrastructure, nor do you need to pay for security costs through token issuance. You inherit the security of ETH and pay for it when using block space.
The Flywheel Effect: Why Scaling L1 Makes L2 More Useful
This is the part that those advocating "ETH is abandoning L2" completely miss: scaling the base layer does not compete with L2. It greatly enhances the utility of L2.
To understand why, you need to grasp what ETH is at the protocol level. It operates as a globally replicated ledger. Every full node independently verifies each transaction to ensure the ledger is correct. Protocol parameters like gas limits and block times must remain conservative enough for ordinary machines to keep up; otherwise, you will ultimately need data center-level hardware to participate, thereby reconstructing the centralized infrastructure you were trying to escape.
This means that raw L1 throughput is inherently scarce, which is why ETH block space is valuable. It is also why transactions settled on ETH carry stronger guarantees than those settled on chains with only a few hundred validator nodes running in three data centers.
Rollups cleverly solve this limitation. They move most user transactions off-chain to L2, where it is both fast and cheap, while primarily using ETH for two things: data availability (publishing compressed transaction data that anyone can use to reconstruct L2's state) and final settlement (anchoring L2's state transitions to L1 consensus). By bundling many off-chain transactions together, rollups allow many users to share the gas costs of a single L1 transaction.
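The amortization can be sketched in a few lines. The dollar figure below is a hypothetical assumption for illustration, not a live gas price.

```python
# Amortizing one L1 batch-settlement cost across rollup users.
# The $5.00 batch cost is an illustrative assumption, not real data.

def per_user_cost(l1_batch_cost_usd: float, batch_size: int) -> float:
    """Each user's share of the single L1 transaction settling the batch."""
    return l1_batch_cost_usd / batch_size

BATCH_COST = 5.00  # hypothetical cost of posting one batch to L1, in USD
for users in (10, 100, 1_000):
    print(f"{users:>5} users -> ${per_user_cost(BATCH_COST, users):.4f} each")
```

The same arithmetic explains the flywheel described next: when L1 scaling lowers the batch cost itself, every user's share falls proportionally.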
When ETH scales its L1, it directly lowers the costs of these two functions. Each block containing more gas means cheaper settlement costs. More blob capacity means more L2s can publish data simultaneously without competing for scarce data availability. Faster block times mean L2 withdrawals and cross-chain operations become quicker. Faster final confirmations mean L2 can confirm transactions with higher certainty in a shorter time.
The result is a system in which each part plays to its strengths: L1 handles what it does best (low-risk DeFi, high-value settlement, and serving as the authoritative data source), while L2s compete on specialized use cases. This dynamic is much healthier than the status quo, in which the primary reason for L2s to exist is simply that L1 is too slow and expensive for everyday transactions.
Unresolved Issue: Liquidity Fragmentation
Layer 2 does not solve all problems. With current technology, every new L2 is an independent asset and user island. Without seamless interoperability, the ETH ecosystem does not operate as a complete network but rather resembles a dozen competing networks. This is the most reasonable criticism of the ETH L2 ecosystem.
The initially rollup-centric roadmap assumed that L2s would converge on interoperability standards, allowing liquidity to flow freely throughout the ecosystem. But this has not happened. Instead, liquidity has become fragmented, and for most users, the experience of bridging assets between different L2s remains slow, expensive, and fraught with risks.
The ETH Foundation has listed this as a top priority for 2026. The core of the plan is an "open intent framework," where users simply declare what they want to do— exchange, bridge, pay— and the system will automatically route the best path across different L2s. Behind the scenes, a new ETH interoperability layer aims to make cross-L2 transactions feel indistinguishable from transactions on a single chain. Vitalik has also pushed for the development of native rollup precompiles that will directly verify zkEVM proofs on L1, improving trustless composability between the base layer and rollups.
This is the next problem that needs to be solved. If ETH can get this right, making asset movement between different L2s feel like using a single chain, then every new L2 will enhance the entire network rather than fragment it.
What This Means
As of this writing, ETH's market capitalization is approximately $240 billion, making it the second most valuable blockchain in the world after Bitcoin, by a wide margin over the rest of the field. The narrative that "ETH is dying" is fundamentally at odds with what the market is actually telling you.
Robinhood is tokenizing thousands of stocks on ETH's L2. The gas limit has already doubled, with a reliable roadmap to increase it tenfold from current levels within four years. Institutional adoption of ETH-based L2s is accelerating, not slowing down. Moreover, the enthusiasm in the engineering community has reached its highest point in years— reflected not only in the roadmap itself but also in the quality of talent it attracts back to actively contribute.
What is happening is a maturation of strategy. The initially rollup-centric roadmap was a pragmatic response to an emergency: in 2020, ETH could not scale its L1 quickly without sacrificing decentralization while competitors were capturing market share. That emergency has ended. But the engineering talent and infrastructure that ETH invested in during that period— blobs, data availability sampling, zkEVM research, rollup frameworks— were not in vain. They laid the groundwork for the next phase: an aggressively scaling L1 surrounded by a customizable L2 ecosystem serving institutional and specific needs that a general-purpose blockchain can never meet.
The correct reading of Vitalik's article is not that L2 has failed. Rather, the original framework, which expected every L2 to act as a branded shard bearing the full burden of scaling ETH, was wrong. The new framework is simpler and more honest: L2s exist on a continuous spectrum of decentralization, each serving different customer needs. The L2s closest to ETH inherit its security and contribute to its network effects. Those further away serve legitimate purposes but should not pretend to be something they are not. And the ETH L1 that anchors all this value is about to become much stronger.
ETH has not abandoned L2. It has simply given L2 a more enduring reason for existence than "L1 is too slow." And this should make you more optimistic about ETH, not lose confidence in it.