“How ironic,” says Brookings Institution economist Robert Litan. Almost at the moment of the market collapse, the economics profession, perhaps alone among us in never having believed in the productive power of the computer, started to come around. Recent converts include professors emeritus at MIT and Harvard, and the Council of Economic Advisers, which two weeks ago issued a 400-page report celebrating the IT-driven New Economy. Their message challenges our old faith: four decades after the first sale of a clunky mainframe computer to a private business (Nielsen, the TV-ratings people), America’s economy has only recently proven that it can put the computer to productive use on a broad scale.
As an economic force, the computer revolution is just beginning. The search for proof of its practical impact was on even before Nobel economist Robert Solow first articulated the “computer paradox” in 1987. Silicon Valley had been booming for more than a decade, yet productivity growth was in a mysterious rut. It was “embarrassing” to report, wrote Solow, that “what everyone feels to have been a technological revolution, a drastic change in our productive lives,” had been accompanied by the opposite result. A nation can raise output by deploying more workers or more machines, and raise productivity by finding better ways to work with those machines; on all three counts, the computer industry had so far proved stunningly insignificant. The paradox, as Solow put it: “You can see the computer age everywhere but in the productivity statistics.”
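Solow’s “three counts” correspond to the standard growth-accounting identity he helped establish. A minimal sketch, using conventional textbook notation rather than anything quoted in this article:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Growth accounting in the Solow tradition: output growth splits into
% contributions from capital, labor, and total factor productivity.
% Y = output, K = capital, L = labor, A = total factor productivity,
% alpha = capital's share of income. Notation is conventional, not the article's.
\[
\frac{\Delta Y}{Y}
  \;=\;
  \underbrace{\alpha\,\frac{\Delta K}{K}}_{\text{more machines}}
  \;+\;
  \underbrace{(1-\alpha)\,\frac{\Delta L}{L}}_{\text{more workers}}
  \;+\;
  \underbrace{\frac{\Delta A}{A}}_{\text{better ways of working}}
\]
% The paradox in these terms: heavy computer investment was piling into K,
% yet measured productivity growth (output per hour, and the A term) barely moved.
\end{document}
```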
The revolution was missing something. In 1990 economist Paul David looked at the history of breakthrough technologies for answers, and found one in the electric motor. This machine did little to raise productivity until 40 years after it was invented in the 1880s. It took that long for industrialists to standardize the motor, rebuild factories around it, and hook it up to electric grids. Surely the computer would deliver, too, but only after businesses figured out how to rebuild around it.
In other words, the naive determinism of the computer faith was wrong: just buying computers was not enough to make businesses smarter. And the sensation of an all-pervasive “technological revolution” simply did not show up in the statistics. As late as 1996, skeptics at Harvard and the Fed pointed out that less than five percent of all investment in machinery went to computers, too small a share to move the broader economy.
For those outside Silicon Valley, says economist Jack Triplett, there was another problem: “the Dilbert factor.” Since the early 1990s American businesses had been pouring more money into information technology than into any other kind of equipment, but often to no effect. Dilbert, the cartoon cubicle dweller, once commented that time lost waiting for Web pages to load had “canceled out all the productivity gains of the Information Age.” As late as 1998, surveys showed that close to half of all IT projects were abandoned before completion. Computer rage spread. A telecom commercial now running on American TV features comedians spoofing our plight: “My online service is down so often, I had to treat it for clinical depression,” says one.
We are still learning how to use the computer. Yet by 1995 the underlying picture was changing. U.S. productivity growth suddenly doubled to 3 percent, making possible the rare run of high growth, low inflation and high employment that defines our “Goldilocks economy.” Mainstream economists found the productivity boom as mysterious as the doldrums from which it sprang. One exception: “Alan Greenspan was perhaps the only one who believed it was IT-driven, and that’s why he was able to keep the good times rolling,” says Litan. “But he did it without evidence, on a hunch.”
Now the flock is following. Late last year skeptics at Harvard and the Fed jumped on the New Economy bandwagon. Their story also begins in 1995, when the rate of decline in computer-chip prices abruptly doubled to 30 percent a year. Cheap chips inspired heavy buying. Soon, semiconductors were transforming memo pads, gas pumps, appliances, phones and toys into minicomputers. The weight of IT investment hit critical mass, driving the economy forward. Harvard’s Dale Jorgenson called on practitioners of the dismal science to “raise the speed limit” on growth and shrug off “the pessimism of the Age of Diminished Expectations.”
But there was a catch. As of late last year it appeared that only Silicon Valley geeks had figured out how to use the computer intelligently; the rest of us had not. Productivity gains were still concentrated heavily in IT itself, above all among semiconductor manufacturers, who were using ever faster computers to design ever faster chips in a “virtuous circle” of productivity gains the White House calls “one of the most remarkable phenomena of the late 20th century.” That one industry could generate so much of U.S. growth struck Fed economists as “amazing.” Jorgenson goes so far as to say the U.S. boom is “driven by Intel,” the world’s dominant chipmaker.
The Intel story goes back to the mid-1980s, when an ascendant Japan was threatening to bury its industrial rivals. In a book called “Only the Paranoid Survive,” Intel chairman Andrew Grove set the tone that still drives the company. In 1995 Intel cut production cycles from three years to two, forcing competitors to match stride. The boom was born. “It’s probably true that the hard core of the New Economy is Intel,” says Bear Stearns chief economist Wayne Angell. “So long as they keep behaving like a little upstart, instead of the dominant player, the boom is likely to continue.”
The implications are striking. Cofounder Gordon Moore says Intel can keep producing faster, cheaper chips until at least 2008, when the layers on a chip will approach the size of an atom, and “you cannot split the atom.” What then? Does growth slow to a walk? What of the faith that computers would somehow inspire “the fire of genius” in all businesses, raising productivity throughout the economy, not just in information technologies? Jorgenson dismisses that idea as “phlogiston,” a nonexistent substance that early chemists believed was what made things burn.
That, anyway, is what the weight of the evidence showed until recent weeks. Incredibly, the businesses that spent most heavily on IT, like banks, law offices and insurance companies, showed declining productivity. This was widely cited as another “puzzle” of the computer age. Consider the ATM, which has cut the cost of processing a withdrawal in half but doubled the number of withdrawals. The net result is no measured gain to the bank, but a huge, unmeasurable gain in convenience to customers. In an era when the real currency is creative innovations like the ATM, “productivity is irrelevant,” says Internet author Kevin Kelly. “It works for counting widgets, not ideas. Does it matter if a Picasso turns out 12 paintings a year, rather than two?”
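The bank’s half of the ATM story is simple arithmetic. A rough sketch, using hypothetical dollar figures that the article does not supply:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hypothetical figures, chosen only to make the ATM arithmetic visible;
% the article reports no dollar amounts.
% Suppose a withdrawal once cost the bank \$1.00 to process, the ATM cuts that
% unit cost in half, and customers respond by withdrawing twice as often.
\[
\text{before: } 100 \times \$1.00 = \$100
\qquad
\text{after: } 200 \times \$0.50 = \$100
\]
% Total processing cost is unchanged, so the bank's measured productivity shows
% no gain; the benefit of getting cash anywhere, anytime registers only as
% customer convenience, which the official statistics do not capture.
\end{document}
```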
The big picture has changed only recently. On Jan. 12 the Council of Economic Advisers issued the first major study to show that more than half of America’s productivity gains are being generated outside the computer industry. The council cites “a growing body” of new evidence that IT is boosting productivity in everything from smart steel factories to nimble insurance-company back offices. Its conclusion is that America is not only using more computers but, finally, using them to work smarter. “You expect a computer revolution to take some time, and that is broadly what has happened,” says outgoing council chairman Martin Baily. “But it is not only computers. It is the convergence of computers, software and the Internet, and the ability of businesses to innovate around them.”
The Internet is still a tiny piece of the puzzle. The computer revolution is 40 years old, but the commercial Web is barely five years old. The e-commerce market it spawned is growing fast, but still too small to speed up a multitrillion-dollar economy. And in historical perspective, the dot-com shakeout may be a good thing. The American car industry took off around 1906; by 1908 more than 240 firms had jumped in, and Ford and General Motors emerged from the downturn of 1910 as the dominant survivors. “I think that’s exactly what we’re seeing in infotech: Darwinian selection in action,” says Hal Varian, dean of the School of Information Management and Systems at Berkeley. “We’re really just at the beginning.”
So just how big a deal is all this? As big as the printing press 500 years ago, or the electric motor 80 years ago? It makes sense to give the final word to Solow, who has shaped the way we think about the impact of technology. He believes the “computer paradox” is now solved. “I can see computers in the productivity numbers,” says Solow. But he adds a note of caution. Recent official White House forecasts have raised “the speed limit” on U.S. growth through 2010 to more than 3 percent, hardly an endorsement of the hype about “no limit” to growth. “These are the same limits we used to talk about when I was in the Kennedy administration,” says Solow. “At best, it takes us back to the 1960s.” That should be good enough, even for Wall Street. It was, after all, America’s last golden age.