ENVIRONMENT AND ENERGY MARCH 16, 2011
After Japan, everyone’s asking the question—and the answer is more complicated than you think.
Just how necessary is nuclear power? Lately, politicians around the globe have been asking themselves that question as they watch a small handful of Japanese technicians race to prevent three reactors from spewing out radiation at the quake-ravaged Fukushima Daiichi plant. In recent years, a consensus had taken hold that the world needed many, many more nuclear plants to meet its low-carbon energy needs and avoid drastic global warming. (All told, 220 reactors are currently being built or planned worldwide, with another 324 on the drawing board.) Suddenly, though, those plants don't seem like such no-brainers.
Germany, for one, has just suspended a decision to extend the life of its 17 nuclear power plants while it conducts safety checks. “During the moratorium, we will examine how we can accelerate the road to the age of renewable energy,” said Chancellor Angela Merkel. Switzerland, too, is putting a hold on licensing new replacement plants. Here in the United States, there hasn’t been the same sort of about-face, at least not yet. Both parties still broadly agree that more nuclear plants are part of the answer to the country’s energy woes. (And, just in case, nuclear lobbyists have stormed Capitol Hill to make sure no one goes wobbly.) Jason Grumet of the Bipartisan Policy Center recently summed up the conventional wisdom for The New York Times: “It’s not possible to achieve a climate solution based on existing technology without a significant reliance on nuclear power.”
A number of liberals would agree with this statement. Yes, the argument goes, nuclear power carries some risks, but those aren’t nearly as great as the risks of burning coal, cooking the planet, and sending all sorts of deadly pollutants into the air. Air pollution kills two million people per year, and much of that is due to fossil-fuel combustion. The right answer is to learn from Japan's mistakes and improve nuclear safety. (It’s also worth noting that the next generation of reactors is supposed to be even safer.) No energy source is risk-free, and nuclear is one of our best bets. Right?
It's still too early to tell how this debate will get resolved. But suppose society does decide, in the wake of Fukushima, that the risks of meltdowns and the hassles of storing radioactive waste are too much to bear. Do we have any other options? Is there really no way to cut our carbon emissions without nuclear power? It's not an easy question to answer. The United States certainly couldn’t just turn off its 104 nuclear reactors tomorrow, as they provide 20 percent of the country’s electricity (indeed, Germany could see carbon emissions spike due to its moratorium). But, in recent years, experts have been puzzling out just how we could curb pollution and keep the lights on without building new nuclear facilities. Is that actually doable?
For a long time, the argument that the world could wean itself off both fossil fuels and atomic energy was confined to earnest green groups. Last month, the World Wildlife Fund (WWF) released a 250-page report on how to get there by 2050. The roadmap starts with wringing out the enormous amount of waste energy from our industrial processes, buildings, and transportation systems. (That means everything from better insulation for homes to boosting recycling in, say, the paper industry.) After that, our power would come from a variety of renewable sources, from sustainably harvested biomass to concentrated solar plants (which, in theory, can store heat and keep producing power even when the sun isn’t shining) to acres and acres of wind turbines. It would be costly and difficult, sure, but technically feasible if everything went right.
Unfortunately, that’s a huge “if.” To take one example, the WWF report assumes that the world can get 6,000 exajoules worth of energy from algae-based biofuels by 2050 (translation: a whole heap of energy). Now, it would be wonderful if engineers figured out how to extract oil from algae on a mass scale so that we could keep driving our cars without churning up carbon pollution. Maybe then we’d have no need for nuclear-powered electric cars or whatever else the future might otherwise bring. But algae fuels have a lot of kinks to work out, and analyses still differ on whether you can get more energy out of the process than you put in.
Plus, a report by environmentalists isn’t going to convince everyone we don’t need nuclear energy. So, late last year, Mark Jacobson, an engineering professor at Stanford, and Mark Delucchi, an energy and environmental systems analyst at University of California Davis, published two papers in Energy Policy offering their own detailed analysis of how the world could get 100 percent of its electricity from existing renewables—mostly solar and wind—by 2050. The task would be staggering. We would need nearly four million five-megawatt wind turbines—i.e., turbines twice as big as those currently on the market. (China just built its first five-megawatter last year.) Plus 90,000 large-scale solar farms—for reference, there are only about three dozen in existence now. Plus 1.7 billion three-kilowatt rooftop solar systems—that is, one for every four people on the planet. But it’s doable. The main challenge, the authors found, would be mining enough rare-earth metals—like neodymium—for all those electric motors. So, again, mind-blowingly hard, but it’s at least possible to go carbon-free without nuclear (or algae). What’s more, the world wouldn’t have to pay that much more for energy than it does today.
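To get a feel for the scale of that hardware, here is a rough back-of-envelope sketch. The capacity factors and the per-farm size below are illustrative assumptions of mine, not figures from the Jacobson-Delucchi papers, so treat the total as an order-of-magnitude check rather than a reproduction of their analysis.

```python
# Back-of-envelope: average power output from the hardware counts in the text.
# Capacity factors and the 300-MW solar-farm size are illustrative assumptions.

TW = 1e12  # watts per terawatt

# Nearly four million 5-MW wind turbines, assuming a ~35% capacity factor
wind_avg = 4e6 * 5e6 * 0.35 / TW

# 1.7 billion 3-kW rooftop solar systems, assuming a ~15% capacity factor
rooftop_avg = 1.7e9 * 3e3 * 0.15 / TW

# 90,000 large solar farms, assuming ~300 MW each and a ~20% capacity factor
farms_avg = 9e4 * 300e6 * 0.20 / TW

total_avg_tw = wind_avg + rooftop_avg + farms_avg
print(f"Average output: {total_avg_tw:.1f} TW")
```

Under these assumptions the fleet delivers roughly 13 terawatts of average output, which is in the neighborhood of projected mid-century world power demand; that is why the hardware counts are so staggering.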
But that still leaves headaches. The sun doesn’t always shine, and wind doesn’t always blow. What then? The great thing about nuclear is that it can crank out power whenever we want it. Jacobson and Delucchi argue that this problem is solvable. Utilities can get smarter about managing demand (so, for instance, a power company could work out a deal with the local wastewater plant to run only at night, when the wind turbines are spinning). Nations would also need to build smarter electric grids that juggled supply and demand and stored excess energy for when it was needed. (For instance, when it’s really windy out, grid operators could divert some of that power to pump water uphill into reservoirs, and then release the water during calm periods to generate needed electricity.) Other experts, like Caltech’s Nate Lewis, have argued that we need to invent brand-new types of cost-effective power storage before the grid could handle that many intermittent sources, but Jacobson says it could all be done with existing technology.
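The pumped-hydro idea can be quantified with the basic potential-energy formula, energy = density × volume × g × head × efficiency. The reservoir size, head, and round-trip efficiency below are hypothetical round numbers, used only to show the scale a single large facility can reach.

```python
# Energy recoverable from a hypothetical pumped-storage reservoir.
# All figures are illustrative round numbers, not any real facility's specs.

RHO = 1000.0       # density of water, kg/m^3
G = 9.81           # gravitational acceleration, m/s^2

volume_m3 = 1e7    # 10 million cubic meters of water pumped uphill
head_m = 300.0     # height difference between the two reservoirs
efficiency = 0.75  # assumed round-trip efficiency (pump up, generate down)

energy_joules = RHO * volume_m3 * G * head_m * efficiency
energy_gwh = energy_joules / 3.6e12  # 1 GWh = 3.6e12 joules

print(f"Recoverable energy: {energy_gwh:.1f} GWh")
```

That works out to roughly six gigawatt-hours, enough to cover a few hours of output from a gigawatt-scale wind fleet during a calm spell, which is the role the grid operators in the example above would ask it to play.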
Granted, these discussions are largely theoretical. Nuclear power isn’t likely to vanish in the United States—the industry has too many backers on both sides of the aisle (back in his Senate days, Barack Obama was a loyal ally of Illinois’s big nuclear utility). But even if nuclear is here to stay, there’s plenty of reason to think it will play, at best, only a supporting role in America’s energy future—and that’s true regardless of what happens in Japan.
As it happens, the main obstacle to expanding nuclear power has never been the fear of fiasco. (Even the partial meltdown at Three Mile Island in 1979 barely nudged public opinion on nukes.) The biggest hurdle has always been cost. New nuclear plants go for at least $10 billion a pop—and that’s before cost overruns inevitably set in (as has happened with a much-hyped new reactor in Finland). Building intricate concrete and steel structures that can withstand all manner of disaster is a costly enterprise. According to a 2009 MIT study, “The Future of Nuclear Power,” getting electricity from nuclear costs about 8.4 cents per kilowatt hour, compared with 6.2 cents for coal and 6.5 cents for natural gas. Other industry analysts have suggested that the MIT study is too optimistic, and that power from new nuclear plants could cost twice as much.
The MIT study concluded that nuclear plants would have trouble getting off the ground unless Congress put some sort of tax on carbon pollution—and even then, nukes could lose out to solar farms or other clean alternatives. Granted, Congress could choose to offer hefty loan guarantees for new reactors, though it’s unclear whether even that will be enough to attract leery investors. One possible outcome from the Fukushima disaster is that private companies may get a reminder of just how expensive things can get when something goes wrong. (Three Mile Island may not have soured the public on nuclear, but it did cost nearly $1 billion to clean up.)
That’s why some outside experts have long thought the nuclear renaissance was overblown, even before Fukushima. In a 2007 report for the Council on Foreign Relations, Charles Ferguson noted that all of the 104 reactors currently operating in the United States will likely need to be decommissioned by mid-century. Replacing those reactors (that is, simply preserving the status quo) would mean building a new reactor every four or five months between now and mid-century—already a “daunting” pace.
Another way to think about the nuclear question in the context of global warming is to do what Princeton’s Stephen Pacala and Robert Socolow did and identify seven or eight “stabilization wedges,” divvying up the roles that conservation, nuclear, renewables, carbon capture, and so forth will play in order to avoid dangerous temperature rises. For nuclear to provide just one “wedge,” according to an in-depth Keystone Center report, the world would need to build as many as 20 to 40 reactors per year for 50 years—plus the equivalent of ten Yucca Mountains to store all the waste. The experts commissioned by Keystone differed on whether this was remotely feasible—suddenly, those millions of wind turbines don't look so bad. In the end, only one thing’s really clear: Even if we do stick with nukes, we’re going to need a whole lot else besides.
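The wedge arithmetic is worth spelling out. Pacala and Socolow sized a nuclear wedge at roughly 700 gigawatts of new capacity displacing coal by mid-century; the sketch below pairs that figure with replacement of the existing world fleet, whose size and full retirement over the same period are assumptions of mine, not from the Keystone report.

```python
# Rough stabilization-wedge arithmetic, assuming 1-GW reactors throughout.
# 700 GW per wedge is Pacala and Socolow's figure; the ~370 GW existing world
# fleet and its complete retirement by mid-century are assumed for illustration.

years = 50
wedge_gw = 700        # new capacity needed for one nuclear wedge
existing_gw = 370     # approximate world nuclear capacity today

new_builds_per_year = wedge_gw / years
replacements_per_year = existing_gw / years
total_per_year = new_builds_per_year + replacements_per_year

print(f"Reactors per year: {total_per_year:.0f}")
```

That comes to about 21 one-gigawatt reactors a year, at the low end of the Keystone report's 20-to-40 range; bigger reactors bring the count down, while shorter timelines or multiple wedges push it sharply up.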
Bradford Plumer is associate editor of The New Republic.