21st Century Science & Technology


Who Killed U.S. Nuclear Power?

by Marsha Freeman

From the Spring 2001 issue of 21st Century Science & Technology (full text).

Wall Street’s high financing rates killed 5,000 megawatts of nuclear power capacity (four plants) in 1981, midway through construction of the Washington state WPPSS project, shown here. If the four nuclear plants planned by WPPSS had been completed, the Pacific Northwest would not have an energy crisis today.
Washington Public Power Supply System

The U.S. Atomic Energy Commission (AEC) projected in 1962 that, by the year 1980, 40 gigawatts of nuclear-generated electric capacity would be on line in this country (the equivalent of about 40 plants of 1,000 megawatts each). Two years later, amid the optimism generated by President John F. Kennedy’s Apollo program to land a man on the Moon, the AEC revised its projection upward, to 75 GW of nuclear capacity by 1980.

By 1967, through the momentum of the lunar landing program and its high-technology economic expansion, the AEC again upped its projections, this time to 145 GW of nuclear capacity by the year 1980. Engineers in the industry, looking farther ahead, expected 2,000 GW nuclear by the year 2000.

Now, in 2001, there are only 103 nuclear plants in operation in the United States. More than that number have been cancelled. The collapse in orders, and the wave of cancellations, have left the U.S. nuclear industry in such a state of contraction that today it could not even build a new nuclear reactor, were one to be ordered. The pressure vessel would have to be imported, because no U.S. firm is capable of fabricating one.

There are many myths about who killed nuclear power in this country. Blame is put on the accident at Three Mile Island in 1979, which certainly added to the attacks on the industry, but was not a decisive factor. Blame is put on the American public, which supposedly became anti-nuclear (although, except for a small vocal minority, this has never been the case). The claim is made that nuclear is inherently just too expensive to use, but, in fact, it was a coordinated assault by Wall Street and its foot soldiers in the environmentalist movement that drove the costs up.

If we do not understand how we got to where we are, we will never be able to change the situation.

Soon after President John F. Kennedy was assassinated in 1963, the international financial and oligarchical interests who despised his pro-nuclear, pro-space, and economic growth policies, moved in to bury them.

The Paradigm Shift

The founding of the Club of Rome in 1969, by co-thinkers of European royal families and their toadies in the United States, helped launch a propaganda campaign to convince policymakers and citizens that the world has too many people. Volumes of reports from the Club of Rome and affiliated think tanks opined that science and technology could not alleviate the alleged “overpopulation,” and that, in any case, technology has many “negative” consequences, such as damaging the environment.

The passage of the National Environmental Policy Act the same year, 1969, made the criterion of how economic projects would affect plants, insects, and animals more important than the impact those projects would have on the economic health of human beings.

The 1973 Middle East War, and the ensuing manipulated “oil crisis,” threw energy policy and planning into turmoil. Overnight, oil prices quadrupled, and coal—until then the mainstay of electricity generation—also rose in price. Under his Project Independence program, to increase the exploitation of domestic energy supplies, President Richard Nixon called for the building of 1,000 nuclear reactors by the year 2000. But soon, Nixon was out of office, and the anti-nuclear moles inside his Administration had already been planning the demise of nuclear energy.

Already in 1971, within days of becoming the head of the Atomic Energy Commission, James Rodney Schlesinger, who had come to Washington from the RAND Corporation, overturned a critical AEC decision. He allowed the Natural Resources Defense Council, which had been formed in 1970 by representatives of top Wall Street law firms, to “intervene” via lawsuits to stop construction of the Calvert Cliffs nuclear plant in southern Maryland. The reason given was that the plant would damage the “environment.” This action laid the basis for two decades of legal maneuvering by environmentalists-in-three-piece-suits to keep utilities tied up in court for years, with bogus environmental and safety concerns, making it impossible for many plants to ever be completed.

With the election of Jimmy Carter as President in 1976, anti-nuclear, pro-environmental policy was brought right into the White House. In preparation for the new Democratic Administration, the New York Council on Foreign Relations, a spinoff of London’s Royal Institute of International Affairs, produced its Project 1980s report, which called specifically for the “controlled disintegration” of the U.S. economy.

The Rockefeller-funded Trilateral Commission, whose membership dominated the Carter Administration, adopted this Project 1980s perspective as its own. Central to their theme of controlled disintegration was the halt of new energy technologies on the horizon, such as advanced nuclear fission and fusion energy. In their place, they promoted the institutionalization of “conservation,” and small-is-beautiful “alternative” energy, based on inefficient and expensive wind, solar, and biomass—technologies which had virtually disappeared after the Industrial Revolution. Billions of dollars in federal subsidies were poured into these 19th century throwbacks, to try to make them economically palatable to an otherwise highly skeptical public.

The new Department of Energy, which replaced the Atomic Energy Commission—an act that in itself demonstrated the shift in policy—again came under the control of James Schlesinger. While Schlesinger was making speeches about how nuclear energy was not “cost effective,” the Department of Energy showed its anti-technology stripes by actively promoting and participating in “Sun Day” festivities.

The first step toward deregulating the electric utilities took place under the Carter Administration, through the Public Utility Regulatory Policies Act of 1978, which gave small, “renewable” energy producers access to the electric grid, and forced utility companies to buy their outrageously priced power.

A march on Washington of 65,000 anti-nuclear demonstrators on May 6, 1979, used the March 1979 incident at Three Mile Island to call for the shutdown of the nation’s 68 then-operating nuclear reactors. This Jacobin mob was the street-level creature spawned by the Council on Foreign Relations and Wall Street’s largest non-profit foundations, in the name of “protecting the environment.” The demonstration further fueled efforts in the White House and Congress to enact rules and regulations to sabotage the completion of nuclear plants.

The machinations of the anti-nukes also increased the pressure on the Nuclear Regulatory Commission to institute irrational new rules and regulations, which, on one occasion, resulted in 13 power plants being shut down at the same time, for “safety” inspections. Billions of dollars were spent by nuclear utilities to retrofit plants for increased safety, much of which retrofitting was known by many in the industry to be unnecessary. At the same time, the nuclear utilities were bending over backwards to “listen” to and answer the “concerns” of the anti-nukes, in the hope that this process would instill some rationality into the situation.

It was during the Carter administration, that the predecessor to 21st Century, the Fusion Energy Foundation’s Fusion magazine, and the associated political movement of Lyndon LaRouche, took the lead in exposing the Trilateral Commission/Council on Foreign Relations/Wall St. role in fostering and funding the environmentalist movement and its terrorist spinoffs.

In the closing days of the Carter Administration, Lyndon LaRouche, preparing to run for President in the 1980 election, released a report titled “America Must Go Nuclear.” In the introduction, LaRouche stated: “On my first day in office, I shall deliver to the Congress a comprehensive energy policy. This legislation will repeal the worst features of the Environmental Protection Act, permitting work to be completed on the approximately 120 nuclear energy plants presently stalled in various phases of construction. It will also provide for the addition of 1,000 gigawatts of nuclear energy by 2000 A.D.”

President Reagan, who was touted as the first pro-nuclear President in 20 years when elected in 1980, did not even understand the systemic policy changes that would be required to resurrect the nuclear industry.


Figure 1

The 103 nuclear power plants operating today in the United States produce the most reliable and most efficient 20 percent of the nation’s electric power. Total capacity is 96.245 GW.

Source: Nuclear Energy Institute

Although the 1973 “oil crisis” led to a renewed interest in nuclear, as evidenced by the number of plants ordered immediately afterwards, a well-calculated act by Federal Reserve Chairman Paul Volcker, one of the many Trilateral Commission agents on the Carter team, dashed the attempts to go nuclear.

Over the Columbus Day weekend in 1979, Volcker raised interest rates in the United States into the double digits. This move had an immediate impact on two consumer goods sectors that rely heavily on credit—automobile purchases, and home mortgages—but the effect on the electric utility industry was more dramatic, and more far reaching.

The idea that the nation did not have to build more power plants, especially nuclear plants, because the economy and energy consumption had fallen, was a self-fulfilling prophecy. When energy prices skyrocketed in the mid-1970s, industries and consumers cut back, buying only what energy they could afford. Traditional 1960s growth rates for electricity demand of 7 percent per year shrank to 2 to 3 percent per year, and projections for the coming decade, based on the forecast of an extended economic recession, were in the 1 to 2 percent range. Once energy is made expensive, consumption can be expected to decline; historically, it is inexpensive energy that has fueled increased demand, not the reverse.

Suddenly, after the oil shock, with demand falling, the nuclear plants that were in the pipeline were seen as “over capacity,” an unnecessary “surplus” of power that no one should have to pay for. It was not the Three Mile Island incident in 1979 that started the rush to cancel nuclear plants. Between 1973 and 1979, more than 40 had already been cancelled. And by 1979, the projections for nuclear power by the Department of Energy were slashed to 150 GW by the year 2000. Orders for new nuclear plants disappeared, as seen in Figure 2.


Figure 2

It is a myth that the accident at Three Mile Island in 1979 caused the demise of the nuclear industry. As can be seen here, the number of new nuclear plants ordered reached a high of 35 in 1972, and then collapsed to zero after the “oil crisis” of 1973.

Source: Atomic Industrial Forum

By 1981, electric utilities, which operate the most capital-intensive industry in the nation, were paying 17 percent interest on loans for the construction of power plants. This might have been a bearable escalation in cost, were it not for the fact that the construction time for nuclear power plants was being stretched out from eight years to as many as twenty, thanks to anti-nuclear “intervenors” who made a profession out of tying up utilities in court. No company, no matter how solvent, could pay such interest rates for two decades while waiting to recoup its costs from the generation of power.
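The scale of that burden can be sketched with a simple compound-interest estimate. This is an illustrative simplification, not a reconstruction of actual utility financing: the 17 percent rate and the eight- versus twenty-year schedules come from the text, and the $400 million base cost echoes the initial Pilgrim-2 estimate cited below.

```python
def financed_cost(principal: float, annual_rate: float, years: int) -> float:
    """Principal compounded annually at the given rate.

    A deliberate simplification: real utility financing mixes bonds,
    staged drawdowns, and rate-base recovery, but annual compounding
    shows the order of magnitude of carrying debt through construction.
    """
    return principal * (1.0 + annual_rate) ** years

# A hypothetical $400 million plant financed at 17 percent interest:
normal_build = financed_cost(400e6, 0.17, 8)      # an eight-year schedule
stretched_build = financed_cost(400e6, 0.17, 20)  # a litigation-stretched schedule

print(f"8-year build:  ${normal_build / 1e9:.1f} billion")   # about $1.4 billion
print(f"20-year build: ${stretched_build / 1e9:.1f} billion")  # about $9.2 billion
```

Stretching the schedule from eight years to twenty does not merely double or triple the financed cost; at 17 percent it multiplies it more than sixfold, which is why no utility could carry such a project to completion.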

In March 1981, Wall Street’s Merrill Lynch issued a report recommending the cancellation of 18 nuclear plants, because of the financing costs. Utility bond sales were cancelled by financial houses. Six months later, Boston Edison’s Pilgrim-2 plant was cancelled, as the cost had escalated from $400 million to $4 billion, simply because of the schedule stretch-out and high interest rates.


Figure 3

Nuclear power is not intrinsically expensive. What drove nuclear plant costs up were environmentalist delays (caused by anti-nuclear “intervenors”) and high interest financing rates, both perpetrated by those who wanted to kill nuclear power, and who now complain that nuclear costs too much. Shown here, in dollars per kilowatt, are the rising costs of financing, environmentalist delays, and construction materials for nuclear plants (N), and the rising costs for comparable coal-fired plants (C) with sulfur removal.

Source: Electric Power Research Institute

In August 1981, the Washington Public Power Supply System (WPPSS) in Washington state had its credit rating cut by Moody’s Investors Service, and bond underwriters demanded that both the interest and the principal on the loans be repaid before the nuclear plants generated any electricity or revenue. Of the five planned reactors, two were mothballed and two were cancelled. As a result of the environmental legal sabotage, the construction time was projected at 12 to 14 years. WPPSS estimated that it would cost $12.1 billion to finish the two remaining units, and that $8 billion of that cost, two-thirds of the total, would be interest charges on long-term bonds.

If the four nuclear plants planned by WPPSS had been completed, providing an additional 5,000 MW of electric generating capacity, even this year’s drought in the Pacific Northwest would not have led to the crisis in energy supply that the region is now suffering.

By the time the last nuclear power plant came on line, in the 1990s, it was no wonder that its cost of producing electricity was not “competitive” with other sources. The actual cost of building the plants themselves had been declining for years, as seen in Figure 3. But the costs of dragging out construction for decades and paying a king’s ransom to borrow money, as well as the fear any utility would have of starting a project that could put it into bankruptcy court, had driven nuclear power out of the energy picture.

(in billions of dollars)

Unit                                            Megawatts    Initial cost estimate    Actual cost
Millstone III (Massachusetts and Connecticut)
Limerick 1 (Pennsylvania)
Wolf Creek (Kansas)
Susquehanna 1 (Pennsylvania)
Susquehanna II (Pennsylvania)

Nuclear power plants that should have cost between $500 million and $1 billion had their final costs escalate to up to 10 times that amount over the course of construction, thanks to unreasonable regulations by the Nuclear Regulatory Commission and the stretch-out of schedules over bogus “environmental and safety” concerns. Note that GE and other U.S. firms currently build 1,000-MW and larger nuclear units in Japan, Korea, and Taiwan in 4 to 5 years.
Source: Public Utility Commissions in the respective states

The energy crisis over the past year in California, and the public recognition in New York and other states that the failure to build power plants over the last decade means there will be shortages, have resurrected interest in nuclear energy in the United States. The Nuclear Energy Institute reports that a group of utility executives approached the organization last year to set up a task force to examine what would be necessary to deploy a new nuclear plant. The idea is to form a consortium of companies that would order perhaps 10 or 20 plants, which would be standardized and would benefit from economies of scale in production.

No matter what plans the nuclear industry may put together, however, only a complete reversal of the financial and political policies that have wrecked the development and deployment of nuclear technology over the past 40 years will make a difference.

