What’s the biggest, scariest threat facing the United States right now? I know for sure that it’s not immigration, free trade, or “radical Islamic terrorism.” And I’m going to resist the easy answer that it’s Donald Trump.
But I have a hard time deciding which of these two very real challenges is more urgent:
a) Slow economic growth—and just as bad, the unequal distribution of what gains there are—leading to a traumatic reset of the average American’s expectations for the future, plus spiraling bitterness and resentment over a “rigged system” that’s leaving the working class behind.
b) Climate change—the likelihood that greenhouse gas emissions will heat the atmosphere well beyond the 2°C limit envisioned in the Paris Agreement. Each of the last 16 months has set a new global temperature record for that month, taking us most of the way toward a 2°C increase within the first year after the accord's completion.
The difficulty is that both problems threaten our way of life, which makes it impossible to prioritize one and simply set the other aside.
It would be fruitless to fixate on growing the economic pie, and/or slicing it up more equitably, if we knew that whole pie was about to be charbroiled by a shifting climate. But we also wouldn’t invest in battling climate change unless we were motivated by a belief that everyone—in current and future generations—deserves an equal shot at a safe, healthy, prosperous life on this planet.
But what if we don’t have to choose?
Maybe the two problems have a common solution. Maybe the massive investments needed to blunt the effects of climate change—in areas like zero-carbon energy and transportation technology and climate-change adaptation—are exactly the same kinds of investments we would make if we wanted to restore our aging infrastructure, strengthen manufacturing, provide millions of people with new skills, put them to work in rewarding jobs, and boost overall productivity.
It’s time to merge two disconnected conversations into one. The first is the quest among economists, historians, and technologists to identify the forces that power economic growth and divvy up the benefits. In particular, economists would like to explain the U.S. economy’s lackluster performance since 1970, after a century of explosive growth from 1870 to 1970. Thomas Piketty’s Capital in the Twenty-First Century put these questions in the spotlight three years ago, but they’ve been getting renewed attention thanks to Northwestern University economist Robert J. Gordon, author of a blockbuster text called The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (Princeton University Press, 2016).
The second conversation is going on among climate activists, who fear that we’ll never be able to avert the worst effects of global warming through vague, Paris-style promises that nibble around the edges of the problem. What’s really needed, this group believes, is a society-wide mobilization of labor, technology, and innovation—the only precedent in American history being World War II. The main proponent of this argument is environmentalist and author Bill McKibben, whose August 2016 cover story in The New Republic, “A World at War,” makes electrifying reading.
If you haven’t read the Gordon book or the McKibben article, well, you’re in luck: I’m about to summarize them here. (But really, after you’re done here, go read them.)
The mission of The Rise and Fall of American Growth—an unlikely New York Times bestseller, at a dense 779 pages—is to reveal the full depth of the standard-of-living improvements that characterized what Gordon calls the “special century” beginning in 1870. (The changes were especially swift in the second half of that century, from 1920 to 1970.) Secondarily, Gordon asks why the productivity gains that drove these advances tailed off after 1970.
The technological leaps of the special century, which began with telegraphy and railroads and ended with mainframes and moonwalks, are obvious. But Gordon shows in encyclopedic detail how the standard economic statistics on prices, output, and hours worked actually understate the scale of the improvements. Standard measures of GDP couldn’t capture the intangible benefits of things like better lighting inside homes, the time savings from water and sewer connections (which freed women from spending much of their day carrying water in and out of the house), the arrival of “personal travel” as a pastime once automobiles were ubiquitous, and the extra years to enjoy all of the above that came from rising life expectancies.
Gordon’s favorite measure of the impact of innovation is called “total factor productivity.” It gauges efficiency improvements relative to the number of hours people work and the number of machines they use. By his calculations, total factor productivity increased in the U.S. at an unprecedented 1.89 percent per year between 1920 and 1970. It declined to one-third that level, 0.57 percent per year, from 1970 to 1994. It bounced back up slightly, to 1.03 percent per year, during a brief window from 1994 to 2004, probably as the result of the dot-com wave of digitization and networking. And since 2004, it’s hovered at a comparatively anemic 0.40 percent per year.
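Because these are annual rates, their differences compound dramatically over decades. A quick sketch (the rates are Gordon's; the compounding arithmetic is my own illustration, and the era lengths are read off his period boundaries):

```python
# Compound Gordon's annual total-factor-productivity (TFP) growth rates
# over each era to show the cumulative efficiency gain. Era labels and
# rates come from the figures quoted above; the arithmetic is illustrative.
eras = [
    ("1920-1970", 0.0189, 50),  # the heart of the "special century"
    ("1970-1994", 0.0057, 24),
    ("1994-2004", 0.0103, 10),  # the dot-com window
    ("2004-2014", 0.0040, 10),  # the recent anemic stretch
]

for label, rate, years in eras:
    # Cumulative gain from compounding: (1 + r)^n - 1
    cumulative = (1 + rate) ** years - 1
    print(f"{label}: {rate:.2%}/yr over {years} yrs -> +{cumulative:.0%} cumulative")
```

Run this and the gap is stark: a 1.89 percent rate sustained for half a century multiplies efficiency roughly two and a half times over, while the post-2004 rate delivers only a few percent per decade.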
But Gordon does not think that the U.S. economy broke in some fundamental way after 1970. After all, efficiency improvements of 0.40 percent per year look pretty good when you realize, as Gordon notes, that there was virtually no growth in the West between Roman times and the beginning of the First Industrial Revolution around 1750. Rather, his argument is that the big mid-20th century gains came from transitions that, by definition, could happen only once: rural populations became urban, cars and tractors replaced horses, homes and factories were electrified, antibiotics conquered most infections, infant mortality was vastly reduced, telephones and radio ended isolation, the nation was blanketed with superhighways, and transcontinental and intercontinental jet travel became a reality.
To Gordon’s eye, these fruits of the Second Industrial Revolution (the one that began with electricity and the internal combustion engine) simply outweigh the benefits of what he calls the Third Industrial Revolution (the one focused on information and communications technology, from television to the iPhone). And the reason productivity gains are slower today is that the signature innovations of our era play out in a more limited sphere.
As MIT economist Robert Solow famously quipped in 1987, “We can see the computer age everywhere but in the productivity statistics.” The explanation for Solow’s paradox, Gordon argues, is that “computers are not everywhere. We don’t eat computers or wear them or drive to work in them or let them cut our hair. We live in dwelling units that have appliances much like those of the 1950s, and we drive in motor vehicles that perform the same functions as in the 1950s, albeit with more convenience and safety.” (Emphasis added.)
Gordon musters enough detail to make his book devastatingly convincing. I say “devastatingly” because one implication of his work is that we may be doomed to permanently slow growth. If Gordon is right that the gains from the Second Industrial Revolution were a one-time-only event, then future generations probably won’t have a standard of living notably higher than ours.
That would be discouraging. But there’s one more important point to note about the book. The big gains in total factor productivity weren’t evenly distributed across Gordon’s “special century.” In fact, they show a huge spike, to nearly 3.5 percent per year, in the decade 1940-1950.
Why that decade in particular? All the evidence points to two interrelated causes. The first was America’s involvement in World War II, from 1941 to 1945, when “the entire economy converted to a maximum production regime in which every machine and structure was used twenty-four hours per day if enough workers could be found” and when “all the indexes of output, hours of work, and productivity soared.”
The second was the post-war consumer boom starting in 1946, when “the floodgates of demand were let loose, and after swift reconversion, manufacturers strained to meet the demand for refrigerators, stoves, washing machines, dryers, and dishwashers, not to mention automobiles and television sets.” Manufacturers were able to meet this demand because they had purchased staggering amounts of modern factory equipment during the war, mostly on the federal government’s dime, and because “the lessons learned from the war translated into permanent efficiency gains after the war.”
Thus the war not only brought forth technologies such as rocketry, computers, radar, jet engines, and atomic energy that would otherwise have taken decades to arrive; it fostered a culture of improvisation and continuous improvement and reset the whole economy at a higher level. In fact, without the “special decade” of 1940-1950, Gordon’s “special century” would look a lot less special. “The most obvious reason why productivity remained high after World War II, despite the end of the military emergency, is that technological change does not regress,” Gordon writes. “People do not forget.”
Which brings me back to Bill McKibben. His New Republic piece can be summarized much more briefly. It argues that the effort to slow climate change is a world war—literally, not metaphorically—and that we are losing it by failing to mobilize on the required scale.
“Winning” this war would not mean averting significant warming. It’s already too late for that. But if we could bring carbon dioxide levels in the atmosphere below 350 parts per million by 2100, the planet would probably stop heating up, at least according to predictions from McKibben’s favorite engineer, Mark Z. Jacobson, director of Stanford’s Atmosphere and Energy Program.
To do that, the U.S. would need to get 80 percent of its energy from non-carbon-emitting sources by 2030 and 100 percent by 2050. Reaching that goal, according to Jacobson’s research, would mean building enough solar panels and wind turbines to supply about 6,500 gigawatts of clean generating capacity. And to do that, we’d need to build about 300 solar panel factories as big as the $750 million “gigafactory” that Elon Musk’s SolarCity is building right now outside Buffalo, NY—plus a few hundred more factories to build wind turbines.
That means building 45 new gigafactories every year, starting immediately. Other industrialized countries would need to undertake similar efforts, in proportion to their own energy needs. Sounds daunting, but there is a precedent for this scale of effort in the U.S.—you guessed it, World War II.
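The back-of-the-envelope arithmetic behind that factory count is easy to check. The per-factory output and build-out horizon below are my own illustrative assumptions, not figures from McKibben or Jacobson:

```python
# Rough consistency check on the "about 300 factories" figure.
# Assumptions (hypothetical, for illustration only): each gigafactory-scale
# plant ships roughly 1 GW of solar-panel capacity per year, and the
# build-out runs for roughly 20 years to hit the mid-century target.
target_gw = 6_500          # Jacobson's estimate of new capacity needed
gw_per_factory_year = 1.0  # assumed annual output of one gigafactory
build_years = 20           # assumed length of the build-out

factories_needed = target_gw / (gw_per_factory_year * build_years)
print(f"~{factories_needed:.0f} factories running for {build_years} years")
```

Under those assumptions the answer lands at roughly 325 factories, which is in the same ballpark as the figure in Jacobson's research; push the horizon shorter or the per-factory output lower and the count climbs quickly.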
Building dozens of new factories per year is the sort of emergency program that only businesses could accomplish and only the federal government could afford. Such a partnership worked well enough during World War II, and McKibben uses a long section of his article to recount how Franklin D. Roosevelt’s War Department strong-armed private industry into building the tanks, guns, aircraft, rubber, and other materiel needed by Allied armies in Europe, Africa, and the Pacific.
But Roosevelt had public opinion behind him, especially after Pearl Harbor. He had enormous clout as the instigator of the New Deal, which was seen as reining in the business-class recklessness that had given the country the Great Depression. He had guile, adeptness, and foresight—one of his most important phone calls in May 1940, as British, French, and Belgian troops fled Dunkirk, was to William Knudsen, president of General Motors, asking him to take over the management of war production. And because Roosevelt’s military was paying the bills for the defense buildup, the military could call the shots.
Today, in the absence of an actual military emergency or a presiding political genius like Roosevelt, it’s not clear how an even larger mobilization could be accomplished, let alone sustained for decades. And in the days since McKibben’s piece appeared, writers such as David Roberts have questioned whether wartime mobilization is even the right way to think about the climate struggle.
Regardless, McKibben is on to something important when he says this: “A truly global mobilization to defeat climate change wouldn’t wreck our economy or throw coal miners out of work. Quite the contrary: Gearing up to stop global warming would provide a host of social and economic benefits, just as World War II did.”
Building and operating the new solar and wind factories would create two million new jobs in the U.S., McKibben says. Employment in the new “climate engineering” sector could easily go much higher if we also made new investments in nuclear power and carbon capture and sequestration technology—areas that McKibben and Jacobson discount. And then there’s the weighty business of adapting to global warming’s inevitable side effects: sea level rise, torrential flooding, melting permafrost, searing droughts, forest fires, and all the rest. By 2100, nations are projected to spend $12 billion to $71 billion per year on dikes and seawalls alone, according to a group of European climate scientists. Somebody has to build all those walls, put out those fires, and relocate all those coastal villages.
So here, finally, is my own argument. The twin lessons of Gordon’s The Rise and Fall of American Growth are that a) rapid growth has historically occurred only when innovation has been broad-based, involving technologies that touch all aspects of our lives, not just the screens in our offices and living rooms, and b) exogenous forces such as massive federal spending in wartime can boost productivity growth to even greater heights.
Grappling with the climate emergency—and it is an emergency, though it may take a Pearl Harbor-like shock to wake people up to that fact—will require a broad-based reinvention of our technological infrastructure. The build-out of the large systems that were the hallmark of the Second Industrial Revolution may have been a one-time event, but now each of these systems must be redesigned to be fossil-fuel-free. Household appliances, cars and buses, railroad engines, freight trucks, and even aircraft must be completely electrified to make this work. Electrical grids will have to be rebuilt to accommodate distributed generation and storage. And we really do have to finance and build hundreds of solar and wind factories. Put it all together, and it adds up to a kind of Fourth Industrial Revolution, defined this time by zero-carbon energy production.
How do you deliberately jumpstart such a revolution? Who knows. Like the planners and engineers who managed the World War II effort, we’ll just have to figure it out as we go. (Step one: extinguish the climate-change denialism still rampant in the party that controls Congress.) But in every significant respect, this feels like the same sort of broad technological shift that, as Gordon shows, spurred rapid productivity growth from 1920 to 1970.
Gordon himself might say that fighting climate change is about protecting our existing standard of living in the face of looming catastrophe, rather than reaching for a new level of prosperity. He writes: “Regulations that require the replacement of machinery or consumer appliances with new versions that are more energy-efficient but operationally equivalent impose a capital cost burden. Such investments do not contribute to economic growth in the same sense as such early twentieth-century innovations as the replacement of the icebox by the electric refrigerator or the replacement of the horse by the automobile.”
But innovation is not dead. If Gordon’s book has a blind spot, it’s that it doesn’t account for the possibility of a Fourth Industrial Revolution with much wider impacts than the Third. In the process of rebuilding our energy and transportation infrastructure, we might come up with machines that are not just more efficient but qualitatively better than their predecessors, in the same way the French TGV and Japanese shinkansen are better than Amtrak. And on top of that, we might figure out more efficient ways to build these more efficient machines, and end up equipping a new generation of workers with the skills, experience, and spirit of improvisation needed to keep the whole cycle going.
Proclamations like Gordon’s that living standards have peaked, or that humanity can progress no further, pop up every so often in intellectual circles. So far, they’ve always been wrong, and I see no reason to abandon optimism, especially now. (“In wartime, defeatism is a great sin,” as McKibben notes.)
Gordon himself acknowledges that technological change does not regress—it only goes forward. The question is whether we’re smart and level-headed enough to tackle both of our scariest threats at once, since we can’t seem to solve either one on its own.
Flickr Creative Commons photo by Juwi Renewable Energies Ltd. Thanks to Victor McElheny for reviewing a draft of this essay.