Web3’s real ‘TCP/IP moment’ hasn’t happened yet | Opinion
Disclosure: The views and opinions expressed here belong solely to the author and do not represent the views and opinions of crypto.news’ editorial.
The internet scaled because IP created a universal fabric for data. Web3 never got that luxury. Instead, it inherited 1980s-era networking and a patchwork of ad-hoc protocols that bog down and congest the moment you try to run real transactions at scale, let alone billions of AI agents, global settlement layers, or a planetary-scale decentralized physical infrastructure network (DePIN) sensor mesh. We’re long past the point where faster chains or bigger blocks can help.
- Web3 can’t scale with its fragmented, outdated networking. It needs a universal, decentralized data protocol — its own TCP/IP — to achieve trustless, global throughput.
- Mathematical breakthroughs like Random Linear Network Coding (RLNC) show decentralized networks can match centralized performance if data movement is redesigned from first principles.
- A universal coded data layer would unlock real scale: fixing chain fragmentation, enabling trillion-dollar DeFi, supporting global DePIN deployments, and powering decentralized AI.
Web3 needs its own TCP/IP moment: a decentralized Internet Protocol built on the principles that made the original internet unstoppable, engineered to preserve what makes blockchain matter (trustlessness, censorship resistance, and permissionless participation) while finally performing at scale.
What the industry keeps missing
Before IP, computers couldn’t talk across networks. IP created a universal standard for routing data between any two points on earth, turning isolated systems into the internet. It became one of three pillars of internet infrastructure (alongside compute and storage). Every web2 application runs on TCP/IP. It’s the protocol that made planetary-scale communication possible.
Web3 is repeating the same early mistakes. Every blockchain invented its own networking machinery: gossip protocols, Turbine, Snow, Narwhal, mempools, data availability (DA) sampling. None of them is universal, and all are needlessly restrictive. Everyone’s chasing speed with bigger blocks, more rollups, more parallelization. But they’re all built on fundamentally broken networking models.
If we’re serious about scaling web3, we need an internet protocol that is reliably fast, trustless, fault-tolerant, and, most importantly, modular.
Two decades at MIT, solving decentralization’s hardest problem
For over two decades, my research at MIT has focused on one question: Can decentralized systems move information as fast and reliably as centralized ones — and can we make it mathematically provable?
To answer that, we combined two fields that had rarely intersected: network coding theory, which mathematically optimizes data movement, and distributed algorithms, a field shaped by Nancy Lynch’s seminal work on consensus and Byzantine fault tolerance.
What we found was clear: decentralized systems can reach centralized-level performance — but only if we redesign data movement from first principles. After years of proofs and experiments, Random Linear Network Coding (RLNC) emerged as the mathematically optimal method for doing this across decentralized networks.
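To make the idea concrete, here is a toy sketch of RLNC in Python over GF(2), where a coded packet is a random XOR-combination of the source packets and a receiver decodes by incremental Gaussian elimination. This is an illustration of the principle only, not any production protocol: real deployments use larger fields such as GF(2^8) and much larger generations, and all names here (GEN_SIZE, PKT_BYTES, Decoder) are invented for the example.

```python
import random

GEN_SIZE = 4    # source packets per generation (illustrative value)
PKT_BYTES = 8   # payload bytes per packet (illustrative value)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(packets):
    """One coded packet: a random GF(2) combination of the source
    packets, tagged with its coefficient vector so receivers can decode."""
    coeffs = []
    while not any(coeffs):   # reject the useless all-zero combination
        coeffs = [random.randint(0, 1) for _ in packets]
    payload = bytes(PKT_BYTES)
    for c, p in zip(coeffs, packets):
        if c:
            payload = xor(payload, p)
    return coeffs, payload

class Decoder:
    """Incremental Gaussian elimination over GF(2): any GEN_SIZE
    linearly independent coded packets recover the generation,
    regardless of which peer or path delivered them."""
    def __init__(self, k):
        self.k = k
        self.rows = {}   # pivot column -> (coefficient vector, payload)

    def add(self, coeffs, payload):
        coeffs = list(coeffs)
        for col in range(self.k):
            if not coeffs[col]:
                continue
            if col in self.rows:            # eliminate against stored pivot
                rc, rp = self.rows[col]
                coeffs = [a ^ b for a, b in zip(coeffs, rc)]
                payload = xor(payload, rp)
            else:                           # innovative packet: rank grows
                self.rows[col] = (coeffs, payload)
                return True
        return False                        # linearly dependent: discarded

    def recover(self):
        assert len(self.rows) == self.k, "need full rank to decode"
        for col in range(self.k - 1, 0, -1):        # back-substitution
            pc, pp = self.rows[col]
            for lower in range(col):
                rc, rp = self.rows[lower]
                if rc[col]:
                    self.rows[lower] = ([a ^ b for a, b in zip(rc, pc)],
                                        xor(rp, pp))
        return [self.rows[i][1] for i in range(self.k)]

# Any stream of coded packets works; keep consuming until full rank.
sources = [bytes([i]) * PKT_BYTES for i in range(GEN_SIZE)]
dec = Decoder(GEN_SIZE)
while len(dec.rows) < GEN_SIZE:
    dec.add(*encode(sources))
assert dec.recover() == sources
```

The property that matters is visible in the last three lines: the decoder never asks for a specific packet. It consumes whatever coded packets arrive, from anywhere, until it holds enough independent ones.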
Once blockchains arrived, the application became obvious. The internet we have was built for trusted intermediaries. The decentralized web needs its own protocol: one designed to withstand failure and attack while scaling globally. The architectural shift is such that:
- performance comes from mathematics, not hardware;
- coordination comes from code, not servers;
- and the network becomes stronger as it decentralizes.
Like the original Internet Protocol, it isn’t meant to replace what exists, but to enable what comes next.
The use cases that break today’s infrastructure
Decentralized systems are hitting their limits at the exact moment the world needs them to scale. Four macro trends are emerging — and each exposes the same bottleneck: web3 still runs on networking assumptions inherited from centralized systems.
1. The fragmentation of L1s and L2s means blockchains scale locally, but fail globally
We now have more than a hundred blockchains, and while each can optimize its own local execution, the moment these networks need to coordinate globally, they all hit the same wall: data movement that is restricted, inefficient, and fundamentally suboptimal.
What blockchains lack is the equivalent of an electric grid, a shared layer that routes bandwidth wherever it’s needed. A decentralized Internet Protocol would give every chain access to the same coded data fabric, accelerating block propagation, DA retrieval, and state access without touching consensus. And like any good grid, when it works, congestion is minimized.
2. Tokenization and DeFi at trillion-dollar markets
DeFi cannot settle trillions on networks where propagation is slow, where the system collapses under load, or where RPC bottlenecks centralize access. If multiple chains were connected by a shared coded network, propagation spikes would likely not overwhelm any single chain — they would be absorbed and redistributed across the entire network.
In traditional systems, you build larger data centers to absorb peak load. These are expensive and lead to single points of failure. In decentralized systems, we cannot rely on megacenters; we must rely on coded distribution.
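A minimal simulation of that difference, reusing the encode/Decoder sketch above (peer and fragment counts are made-up values): because any full-rank set of coded fragments decodes, the receiver simply listens to whichever peers answer first, and no single node has to absorb the peak.

```python
import random

PEERS = 10   # hypothetical peer count, for illustration only

# Each peer independently holds a couple of coded fragments; none is special.
peer_fragments = [[encode(sources) for _ in range(2)] for _ in range(PEERS)]

dec = Decoder(GEN_SIZE)
arrival = list(range(PEERS))
random.shuffle(arrival)                 # whichever peers respond first win
heard = 0
for peer in arrival:
    heard += 1
    for frag in peer_fragments[peer]:
        dec.add(*frag)
    if len(dec.rows) == GEN_SIZE:       # full rank: done, ignore the rest
        break
assert dec.recover() == sources
print(f"decoded after hearing from {heard} of {PEERS} peers")
```

There is no megacenter in this picture: the fastest responders carry the load on each request, and a slow or failed peer costs nothing but one more poll.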
3. DePIN at global scale
A global network with millions of devices and autonomous machines cannot function if each node waits on slow, single-path communication. These devices must behave like a single, coherent organism.
In energy systems, flexible grids absorb everything from commercial mining operations to a single hair dryer. In networking, a decentralized protocol must do the same for data: absorb every source optimally, and deliver it where it is needed most. That requires coded storage, coded retrieval, and the ability to use every available path rather than relying on a few predetermined ones.
4. Decentralized AI
Distributed AI, whether training on encrypted fragments or coordinating fleets of AI agents, depends on high-throughput, fault-tolerant data movement. Today, decentralized storage and compute are separated; access is slow; retrieval depends on centralized gateways. What AI needs is data logistics, not simple storage: data encoded while in motion, stored in coded fragments, retrieved from wherever is fastest at the time, and recombined instantly without depending on any single location.
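The "recombined in motion" part is what distinguishes network coding from plain erasure coding, and it can be sketched with the same toy pieces above: an intermediate node can emit fresh coded packets from coded packets it already holds, without ever decoding the data (recode below is a hypothetical helper, again over GF(2) for simplicity).

```python
import random

def recode(held):
    """Emit a fresh coded packet by XOR-combining a random nonempty
    subset of coded packets already held -- no decoding, no access
    to the original data."""
    picks = []
    while not picks:
        picks = [pkt for pkt in held if random.random() < 0.5]
    coeffs, payload = [0] * GEN_SIZE, bytes(PKT_BYTES)
    for c, p in picks:
        coeffs = [a ^ b for a, b in zip(coeffs, c)]
        payload = xor(payload, p)
    return coeffs, payload

# Fill a relay's buffer with independent coded packets (full rank)...
relay_buffer, probe = [], Decoder(GEN_SIZE)
while len(probe.rows) < GEN_SIZE:
    pkt = encode(sources)
    if probe.add(*pkt):
        relay_buffer.append(pkt)

# ...then a downstream consumer decodes purely from recoded traffic,
# never touching the sources or any single authoritative copy.
consumer = Decoder(GEN_SIZE)
while len(consumer.rows) < GEN_SIZE:
    consumer.add(*recode(relay_buffer))
assert consumer.recover() == sources
```

The relay here is pure plumbing: it holds coded fragments, forwards coded fragments, and strengthens the stream anyway. That is what "data logistics" means in practice.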
Web3’s next leap
Every major leap in the internet’s evolution began with a breakthrough in how data moves. IP delivered global connectivity. Broadband enabled Netflix and cloud computing. 4G and 5G made Uber, TikTok, and real-time social possible. GPUs sparked the deep learning revolution. Smart contracts unlocked programmable finance.
A universal, coded data layer would do for blockchains what IP did for the early internet: create the conditions for applications we can’t yet imagine. It’s the foundation that transforms web3 from experimental to inevitable.