
Good enough is great

Wired has a good article by Robert Capps, “The Good Enough Revolution: When Cheap and Simple Is Just Fine.”

Cheap, fast, simple tools are suddenly everywhere. We get our breaking news from blogs, we make spotty long-distance calls on Skype, we watch video on small computer screens rather than TVs, and more and more of us are carrying around dinky, low-power netbook computers that are just good enough to meet our surfing and emailing needs. The low end has never been riding higher.
So what happened? Well, in short, technology happened. The world has sped up, become more connected and a whole lot busier. As a result, what consumers want from the products and services they buy is fundamentally changing. We now favor flexibility over high fidelity, convenience over features, quick and dirty over slow and polished. Having it here and now is more important than having it perfect. These changes run so deep and wide, they’re actually altering what we mean when we describe a product as “high-quality.”

Capps acknowledges this sounds a lot like what Professor Clayton Christensen described in Innovator’s Dilemma (1997).
These insights have profound implications for antitrust enforcement, e.g. nimble competitors can overthrow established titans without help from antitrust enforcers.
The secret lies in discovering when the qualities consumers value in a particular product are changing, and it isn’t rocket science.
Basis of competition
Christensen says the marketing literature provides numerous descriptions of different phases in the life cycle of any product.
He notes that a product evolution model produced by Windermere Associates posits that the product characteristic consumers initially value most is functionality. When two or more vendors fully supply that, consumers demand reliability; and the basis of competition shifts from functionality to reliability. Convenience is the next characteristic, followed by price.
Another model, from Geoffrey Moore’s book Crossing the Chasm (1991), says that products are initially used by innovators and early adopters, who base their buying decision solely on product functionality; then by the early majority, after the demand for functionality has been met and vendors begin to address the need for reliability; then by the late majority, when reliability issues have been resolved and the basis of competition shifts to convenience; and finally by the rest, when the focus of competition shifts to price.
Markets are sometimes attacked for being “broken” when vendors aren’t in a race to cut prices. This is one of the arguments advocates for a government-led National Broadband Strategy are making now.
But a market may simply be in the formative stages of supplying other characteristics consumers demand. It’s no fun if you’re a price-conscious consumer and have to wait for the price to fall, but you benefit from this process because the development costs of high-value products, which will ultimately become vastly more affordable, are being subsidized by others.
Integration and modularity
Christensen’s sequel (coauthored by Michael E. Raynor), Innovator’s Solution (2003) notes that when the basis of competition is convenience and price, vendors solve the problem by evolving the architecture of their products from being proprietary and interdependent toward being modular. Modularity allows firms to introduce new products faster and more cheaply because they can upgrade individual components or subsystems without having to redesign the whole product.
It also permits an industry structure consisting of independent firms who can specialize in individual components and subsystems.

Modularity enables the dis-integration of the industry. A population of nonintegrated firms can now outcompete the integrated firms that had dominated the industry. Whereas integration at one point was a competitive necessity, it later becomes a competitive disadvantage.

Network neutrality regulation is aimed in part at preserving and promoting modularity by limiting the opportunities for integrating content, devices and applications with broadband services.
But Christensen and Raynor point out that market participants must have the flexibility to pursue integrated or modular strategies, depending on whether existing products fall short of consumer expectations for functionality and reliability or exceed them.

We emphasize that the circumstances of performance gaps and performance surpluses drive the viability of these strategies [integration, modularity, reintegration, modularity, etc.]. This means, of course, that if the circumstances change again, the strategic approach must also change. Indeed, after 1990 there has been some reintegration in the computer industry.

If technology permits a better product than the market offers, they say competitors “must” have the flexibility to develop an integrated offering to shake up the market.

When firms must compete by making the best available products, they cannot simply assemble standardized components, because from an engineering point of view, standardization of interfaces (meaning fewer degrees of design freedom) would force them to back away from the frontier of what is technologically possible.

Think of consumer expectations in the era before the iPhone versus afterwards. The preexisting offerings by firms such as Motorola and Nokia exceeded consumer expectations for a wireless phone, but fell short of consumer expectations for a smart phone, i.e., a mobile computer or “teleputer.”
The iPhone is an example of reintegration, where modularity in cellphones eventually led to smart phone integration. The iPhone also demonstrates how reintegration can lead again to modularity (think of the App Store).
The broadband service industry never had a period of integration. It has always been modular. Keeping it modular wouldn’t be a problem if it were in a position to supply anticipated demand without further investment. But that’s not the case. Massive investment is needed, and broadband providers are struggling to convince investors they can make a profit when regulators have successfully driven most profit out of the telephone business.
Integration could be jeopardized if network neutrality regulation becomes the law.
Shifting from modularity to integration won’t be easy if a subsequent Act of Congress or FCC order is a necessary prerequisite.
Net neutrality regulation does not prohibit integration, but it does ensure broadband service providers will reap none of the rewards and will face a regulatory minefield if they attempt to work individually with innovative content, device and application providers to maximize the user experience.
The result will be that they won’t, meaning that we will be back where we were in the 1960s (remember Lily Tomlin’s character, Ernestine?) with a telephone company which — due to regulation — cannot derive any benefit whatsoever from innovating and is rewarded only to the extent the status quo prevails.
This is a recipe not for innovation but for self-inflicted failure.

Hance Haney

Director and Senior Fellow of the Technology & Democracy Project
Hance Haney served as Director and Senior Fellow of the Technology & Democracy Project at the Discovery Institute, in Washington, D.C. Haney spent ten years as an aide to former Senator Bob Packwood (OR), and advised him in his capacity as chairman of the Senate Communications Subcommittee during the deliberations leading to the Telecommunications Act of 1996. He subsequently held various positions with the United States Telecom Association and Qwest Communications. He earned a B.A. in history from Willamette University and a J.D. from Lewis and Clark Law School in Portland, Oregon.