
The Coming Exaflood


Today there is much praise for YouTube, MySpace, blogs and all the other democratic digital technologies that are allowing you and me to transform media and commerce. But these infant Internet applications are at risk, thanks to the regulatory implications of “network neutrality.” Proponents of this concept — including Democratic Reps. John Dingell and John Conyers, and Sen. Daniel Inouye, who have ascended to key committee chairs — are obsessed with divvying up the existing network, but oblivious to the need to build more capacity.

To understand, let’s take a step back. In 1999, Yahoo acquired Broadcast.com for $5 billion. Broadcast.com had little revenue, and although its intent was to stream sports and entertainment video to consumers over the Internet, two-thirds of its sales at the time came from hosting corporate video conferences. Yahoo absorbed the start-up — and little more was heard of Broadcast.com or Yahoo’s video ambitions.

Seven years later, Google acquired YouTube for $1.65 billion. Like Broadcast.com, YouTube so far has not enjoyed large revenues. But it is streaming massive amounts of video to all corners of the globe. The difference: Broadcast.com failed because there were almost no broadband connections to homes and businesses. Today, we have hundreds of millions of links world-wide capable of transmitting passable video clips.

Why did that come about? At the Telecosm conference last October, Stanford professor Larry Lessig asserted that the previous federal Internet policy of open access neutrality was the chief enabler of success on the net. “[B]ecause of that neutrality,” Mr. Lessig insisted, “the explosion of innovation and the applications and content layer happened. Now . . . the legal basis supporting net neutrality has been erased by the FCC.”

In fact, Mr. Lessig has it backward. Broadcast.com failed precisely because the FCC’s “neutral” telecom price controls and sharing mandates effectively prohibited investments in broadband networks and crashed thousands of Silicon Valley business plans and dot-com dreams. Hoping to create “competition” out of thin air, the Clinton-Gore FCC forced telecom providers to lease their wires and switches at below-market rates. By guaranteeing a negative rate of return on infrastructure investments, the FCC destroyed incentives to build new broadband networks — the kind that might have allowed Broadcast.com to flourish.

By 2000, the U.S. had fewer than five million consumer “broadband” links, averaging 500 kilobits per second. Over the past two years, the reverse has been true: as the FCC has relaxed or eliminated those regulations, broadband investment and download speeds have surged, and we now enjoy almost 50 million broadband links, averaging some three megabits per second. Internet video succeeded in the form of YouTube. But that “explosion of innovation” at the “applications and content layer” was not feasible without tens of billions of dollars of optics, chips and disks deployed around the world. YouTube at the edge cannot happen without bandwidth in the core.
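The arithmetic behind that comparison is simple enough to sketch in a few lines of Python. The figures are just the round numbers cited above, not independent measurements, so treat the result as a back-of-envelope estimate:

```python
# Rough comparison of aggregate U.S. consumer broadband capacity,
# using the round figures quoted in the text (not independent data).
links_2000, speed_2000 = 5e6, 500e3   # ~5 million links at ~500 kbit/s
links_now, speed_now = 50e6, 3e6      # ~50 million links at ~3 Mbit/s

total_2000 = links_2000 * speed_2000  # ~2.5 terabits per second
total_now = links_now * speed_now     # ~150 terabits per second

print(f"2000:  ~{total_2000 / 1e12:.1f} Tbit/s of aggregate capacity")
print(f"Today: ~{total_now / 1e12:.0f} Tbit/s of aggregate capacity")
print(f"Roughly a {total_now / total_2000:.0f}-fold increase")
```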

Messrs. Lessig, Dingell and Conyers, and Google, now want to repeat all the investment-killing mistakes of the late 1990s, in the form of new legislation and FCC regulation to ensure “net neutrality.” This ignores the experience of the recent past — and worse, the needs of the future.

Think of this. Each year the original content on the world’s radio, cable and broadcast television channels adds up to about 75 petabytes of data (a petabyte is 10 to the 15th power bytes). If current estimates are correct, the two-year-old YouTube streams that much data in about three months. But a shift to high-definition video clips by YouTube users would flood the Internet with enough data to more than double the traffic of the entire cybersphere. And YouTube is just one company with one application that is itself only in its infancy. Given the growth of video cameras around the world, we could soon produce five exabytes of amateur video annually. Upgrades to high-definition will in time increase that number by another order of magnitude to some 50 exabytes or more, or 10 times the Internet’s current yearly traffic.
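Those estimates compound quickly. Here is a short Python sketch of the same arithmetic, using nothing but the round numbers in the paragraph above; the roughly five-exabyte figure for today’s yearly Internet traffic is simply what the “10 times” comparison implies, not a separate measurement:

```python
# The exaflood arithmetic, using the article's own round estimates.
PB = 10**15  # bytes in a petabyte
EB = 10**18  # bytes in an exabyte

broadcast_per_year = 75 * PB               # original radio/TV content per year
youtube_per_year = broadcast_per_year * 4  # YouTube streams ~75 PB every ~3 months

amateur_video = 5 * EB                     # projected amateur video per year
amateur_video_hd = 10 * amateur_video      # another order of magnitude in high definition

net_traffic_today = amateur_video_hd / 10  # implied current yearly Internet traffic (~5 EB)

print(f"YouTube today:    ~{youtube_per_year / PB:.0f} PB per year")
print(f"Amateur HD video: ~{amateur_video_hd / EB:.0f} EB per year, "
      f"vs. ~{net_traffic_today / EB:.0f} EB of total Internet traffic per year today")
```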

We will increasingly share these videos with the world. And even if we do not share them, we will back them up at remote data storage facilities. I just began using a service called Mozy that each night at 3 a.m. automatically scans and backs up the gigabytes’ worth of documents and photos on my PCs. My home computers are now mirrored at a data center in Utah. One way or another, these videos will thus traverse the net at least once, and possibly, in the case of a YouTube hit, hundreds of thousands of times.

There’s more. Advances in digital medical imaging will soon slice your brain 1,024 ways with resolution of less than half a millimeter and produce multigigabyte files. A technician puts your anatomy on a DVD and you send your body onto the Internet for analysis by a radiologist in Mumbai. You skip doctor visits, stay home and have him come to you with a remote video diagnosis. Add another 10 exabytes or more of Internet data traffic.

Then there’s what George Gilder calls the “global sensorium,” the coming network of digital surveillance cameras, RFID tags and other sensors, sprawling across every home, highway, hybrid, high-rise, high school, etc. All this data will be collected, analyzed and transmitted. Oh, and how about video conferencing? Each year we generate some 20 exabytes of data via telephone. As these audio conversations gradually shift to video, putting further severe strains on the network, we could multiply the 20 exabytes by a factor of 100 or more.
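Again, the figures are rough, but a quick tally in the same spirit shows how fast they add up. The 100-fold multiplier for video over voice is the assumption stated above, not a forecast of my own:

```python
# Tallying the additional traffic sources sketched above, in exabytes per year.
EB = 10**18  # bytes in an exabyte

medical_imaging = 10 * EB                # remote diagnosis of multigigabyte scans
voice_telephony = 20 * EB                # audio telephone traffic per year
video_telephony = 100 * voice_telephony  # if those calls shift to video, ~100x heavier

additional = medical_imaging + video_telephony
print(f"Additional traffic: ~{additional / EB:,.0f} EB per year "
      f"(~{additional / 1e21:.0f} zettabytes)")
```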

Today’s networks are not remotely prepared to handle this exaflood.

Wall Street will finance new telco and cable fiber optic projects, but only with some reasonable hope of a profit. And that is what net neutrality could squelch. Google, for example, has guaranteed $900 million in advertising revenue to MySpace and paid Dell $1 billion to install Google search boxes on its computers; YouTube partnered with Verizon Wireless; MySpace signed its own content deal with Cingular. But these kinds of preferential partnerships, where content and conduit are integrated to varying degrees — and which are ubiquitous in almost every industry — could be outlawed under net neutrality.

Ironically, the condition that net neutrality seeks to ban — discrimination or favoritism of content on the Internet — is only necessary in narrowband networks. When resources are scarce, the highest bidder can exclude the others. But with real broadband networks, capacity is abundant and discrimination unnecessary. Net neutrality’s rules, price controls and litigation would prevent broadband networks from being built, limit the amount of available bandwidth and thus encourage the zero-sum discrimination supposedly deplored.

Without many tens of billions of dollars worth of new fiber optic networks, thousands of new business plans in communications, medicine, education, security, remote sensing, computing, the military and every mundane task that could soon move to the Internet will be frustrated. All the innovations on the edge will die. Only an explosion of risky network investment and new network technology can accommodate these millions of ideas.

Mr. Swanson is a senior fellow at the Discovery Institute and a contributing editor at the Gilder Technology Report.

Bret Swanson

Bret Swanson is a Senior Fellow at Seattle's Discovery Institute, where he researches technology and economics and contributes to the Disco-Tech blog. He is currently writing a book on the abundance of the world economy, focusing on the Chinese boom and developing a new concept linking economics and information theory. Swanson writes frequently for the editorial page of The Wall Street Journal on topics ranging from broadband communications to monetary policy.