Atomic Weapons

The Politics of Nuclear Technology: From Weaponry to Power

The atomic era dawned dramatically, fundamentally reshaping global society and initiating a complex interplay of scientific endeavour, political ambition, economic interest, and public perception. Its opening events culminated in the recognition that humanity now possessed a terrifying power capable of literally reshaping the world.

In the immediate aftermath of World War II, scientists, particularly the physicists who had developed the atomic bomb, were hailed as heroes. Life magazine proclaimed them "men who wear the tunic of Superman and stand in the spotlight of a thousand suns".

The public imagination attributed to them the harnessing of a power so immense that "the world would not be the same". This sentiment was encapsulated by a line from the Hindu scripture, the Bhagavad Gita: "Now I am become death, the destroyer of worlds".

The Moral Imperative and the Vision of Limitless Energy

Many scientists involved in the atomic bomb's creation experienced a deep sense of guilt over their achievement. This guilt bred a profound moral duty to redirect the immense forces they had unleashed towards peaceful purposes and the betterment of mankind. They recognised their capability as scientists and technologists to "change in a massive way the framework in which society functioned". Nuclear power was widely perceived as a major energy future for the world, a truly revolutionary energy source unlike any before it.

Before nuclear power, mankind had only harnessed energy sources rooted in routine natural phenomena, such as fire from lightning. Nuclear power was entirely different: it was human-made. The ability to offer the world a seemingly limitless energy source for the future was, to scientists and engineers, the most exciting philosophical concept imaginable. This excitement permeated the political sphere, transforming atomic power from merely a source of cheap electricity into a pathway to a better world, capable of solving the fundamental scarcity problems that economics traditionally encountered. There was a pervasive belief that if a product could be designated "atomic," it inherently possessed superior quality. Optimistic predictions emerged, envisioning nuclear cars, planes, rockets, and even entire new cities powered by vast atomic engines. Even household items, such as toothpaste, were imagined as potential products of the atomic age. Science was seen as having permeated existence, becoming inseparable from society.

The United States embarked on this future, commencing the construction of its first atomic power plant of commercial size in Shippingport, Pennsylvania. This groundbreaking event was electronically initiated by the President of the United States, signifying the high hopes and national pride associated with the technology.

The Nuclear Race and Early Setbacks

The vision of a nuclear-powered Eden, however, masked a reality fraught with political and economic challenges. The enthusiasm for nuclear energy quickly spurred an international competition. Simultaneously with the US construction at Shippingport, the Soviet Union announced it had already built the world's first nuclear power station. What the Soviets did not disclose was that this plant required more electricity to operate than it produced.

In 1956, Britain joined the nuclear race. Her Majesty the Queen opened Calder Hall in Cumberland, marking it as the "first nuclear power station in the world to operate on an industrial scale". The British government aimed for nuclear power to supply half the country's electricity by 1965, hoping to recapture the prosperity of the Victorian era, which had been built on steam power. Lord Cherwell, the government's scientific adviser, articulated this ambition, stating that future prosperity would depend on exploiting the latent energy in uranium. The potential of uranium was illustrated vividly: two pounds of uranium could release energy equivalent to that produced by 2,600 tons of coal.
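That illustration survives a rough order-of-magnitude check. The sketch below assumes complete fission of pure U-235 and a typical heat value for coal of about 24 MJ/kg; these are standard physical constants and assumed figures, not values taken from the article.

```python
# Rough check: can two pounds of uranium match ~2,600 tons of coal?
MEV_TO_J = 1.602e-13           # joules per MeV
ENERGY_PER_FISSION_MEV = 200   # ~200 MeV released per U-235 fission
U235_MOLAR_MASS_G = 235.0
AVOGADRO = 6.022e23

uranium_kg = 2 * 0.4536        # two pounds in kilograms
atoms = uranium_kg * 1000 / U235_MOLAR_MASS_G * AVOGADRO
fission_energy_j = atoms * ENERGY_PER_FISSION_MEV * MEV_TO_J

COAL_J_PER_KG = 24e6           # assumed typical heat value of coal, ~24 MJ/kg
coal_kg = 2600 * 1016          # 2,600 long tons in kilograms
coal_energy_j = coal_kg * COAL_J_PER_KG

print(f"uranium: {fission_energy_j:.2e} J, coal: {coal_energy_j:.2e} J")
print(f"ratio: {fission_energy_j / coal_energy_j:.1f}")
```

Under these assumptions the two figures come out within a factor of about two of each other, so the 1950s comparison was no exaggeration.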

Despite the initial optimism, the path to widespread nuclear power proved far more difficult than first imagined, primarily due to the exorbitant cost of building reactors. They were proving too expensive to compete with conventional fuels, forcing difficult political and economic decisions.

Safety Compromises and Cover-Ups

In the Soviet Union, the economic pressures led to rapid construction, often neglecting proper protection from nuclear radiation. A stark illustration of this came in February 1957, when the planner overseeing the entire nuclear power programme died from an accidental burst of radioactivity. In October 1957, Britain experienced a major accident at the Windscale atom plant, where the reactor's core caught fire, spewing high levels of radioactivity across Northwest England and leading to the condemnation of milk from 200 square miles of farmland as radioactive. The true extent of the radioactivity released was far worse than the public was informed.

These incidents caused some scientists, including Christopher Hinton, who had built Windscale and was in charge of implementing the government's cheap electricity plans, to question the speed at which nuclear technology was being pushed. Hinton, described as a thoroughly honest man, was shocked to discover that estimates of nuclear energy costs compared to coal energy were "cooked". When asked why he did not act, he explained that the undertaking had gone "too far," with billions already committed to a nuclear future. This commitment created a situation where reversing course was deemed impossible.

Politicians, already committed to nuclear power, continued to champion it, driven by a belief in science's capacity to achieve "undreamed of living standards and the possibility of leisure ultimately on an unbelievable scale". This political enthusiasm, however, advanced despite the fact that few scientists in Britain, America, or the Soviet Union knew how to truly fulfil the promises made.

The Industrialisation of Nuclear Power

Control over nuclear technology progressively shifted from scientists to industrialists. The American corporations Westinghouse and General Electric had already invested millions of dollars in nuclear technology, making retreat impossible. In 1961, the chief executive of General Electric explicitly stated the intention to "ram this nuclear thing through," aiming for new and greater achievements in the atomic era. This period saw messaging surrounding atomic power that was deemed "strange" and "quasi-religious," akin to "hyper-Calvinism," indicating an almost worshipful attitude towards the technology.

General Electric and Westinghouse undertook an enormous gamble to make nuclear power not only practical but also profitable. They took the simplest form of nuclear reactor, originally used in submarines, and redesigned it on a gigantic scale, offering the plants to power companies at "knockdown prices". The manufacturers absorbed any extra costs, betting on creating a "bandwagon" effect. Sales efforts were aggressive, with companies celebrating each sale as a major victory, and by the late 1960s, plants were being sold "by the tens," transforming nuclear power into a "real business". Plants were frequently sold before their designs were even finalised, with power companies accepting on faith that larger reactors would achieve economies of scale. These early sales were then cited to subsequent buyers as proof of the manufacturers' claims. This approach was likened to the "McDonaldisation" of nuclear energy, prioritising mass production and affordability.

However, as reactors became larger, senior nuclear scientists grew increasingly worried about safety. The uranium core, which powered the generators, became so large that if the flow of cooling water were lost, it would melt. Scientists feared such a molten core could burn through the containment shell and emerge on the other side of the world, a phenomenon they termed "The China Syndrome". Alvin Weinberg, who designed the original submarine reactor, stated that while smaller, 60-megawatt submarine reactors had "absolute" containment shells, this guarantee was lost with 600-megawatt and 1,000-megawatt reactors, where a molten mass could, in a "very remote situation," breach containment. This change, it was asserted, occurred due to "enormous economic pressure to make the reactors as large as possible," indicating that economic concerns had unfortunately tainted the science.
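The physical reason a large core melts on loss of cooling is decay heat: even after the chain reaction stops, fission products keep releasing a few per cent of full thermal power. As a hedged sketch, the widely used Way-Wigner approximation below estimates that residual heat for an assumed 3,000 MW thermal plant (a typical rating for a 1,000 MW(e) reactor; the figure is an assumption, not from the text).

```python
def decay_heat_fraction(t_seconds, t_operating=3.15e7):
    """Way-Wigner approximation: fraction of full thermal power still
    emitted as decay heat t_seconds after shutdown, following
    t_operating seconds of operation (default ~1 year)."""
    return 0.0622 * (t_seconds ** -0.2 - (t_seconds + t_operating) ** -0.2)

P_THERMAL_MW = 3000  # assumed thermal power of a ~1,000 MW(e) reactor

for t in (10, 60, 3600, 86400):  # 10 s, 1 min, 1 h, 1 day
    mw = P_THERMAL_MW * decay_heat_fraction(t)
    print(f"{t:>6} s after shutdown: ~{mw:.0f} MW of decay heat")
```

Under these assumptions the core still generates on the order of a hundred megawatts of heat in the first minutes and tens of megawatts hours later, which is why emergency cooling in a large reactor has to work, and keep working, flawlessly.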

In 1964, a study by the Atomic Energy Commission (AEC) on the possible consequences of a nuclear accident concluded that nothing inherent in reactors or safeguard systems "guarantees either that major reactor accidents will not occur or that protective safeguard systems will not fail". It warned that "very large damages could result," marking the point where "the nuclear dream began to fall apart".

Institutional Cover-Ups and Unforeseen Disasters

In 1965, scientists advising the AEC attempted to compel manufacturers to make their reactors safer. When Westinghouse and General Electric proposed massive plants near large cities like New York and Chicago, the Advisory Committee on Reactor Safeguards became concerned that a core melt so close to major urban centres could cause a disaster. They drafted a letter to AEC chairman Glenn Seaborg, which, by law, would have to be published. The letter stipulated that the plants would only be approved if manufacturers redesigned all future reactors to prevent a molten core from escaping in the event of an accident.

Seaborg, an ardent proponent of large reactors, intervened, asking for the letter not to be published, citing a potential "serious" impact on the industry and fear that "the public might misunderstand it". The AEC commissioners decided to address the problem in private, believing the public should be kept "in the dark" about such complex technical issues. While meetings were held with manufacturers, no "tremendous push" was made to force them to change their entire manufacturing system. General Electric, in effect, threatened to withdraw from selling power reactors if forced to deal with the core melt problem. Neither company was "anxious to deal with the problem". Despite the power to refuse licences, the AEC did not, and Indian Point and other reactors were built without the redesigns the committee had requested. Instead, the AEC ordered only a massive upgrading of emergency cooling systems to prevent a core from ever melting, effectively allowing the manufacturers to have their way. With this decision, however, the engineers had "set a terrible trap for themselves".

In contrast, in the Soviet Union, grandiose nuclear plans from the 1950s remained on the drawing board, as planners were unconvinced they could be constructed cheaply. Physicists and engineers spent their days designing reactors that were never built. This changed in the mid-1960s with Leonid Brezhnev's rise to power, as he believed that giant technological projects were the path to Communism. The nuclear power programme restarted, dominated by physicist Anatoli Alexandrov and his RBMK reactor design, with plans for giant versions near cities. This idyllic picture of a nuclear Eden, however, "masked a reality in which safety was barely even considered". Reactors were built at great speed to cut costs and fulfil the Soviet plan, some entirely lacking protective containment. Despite claims of complete safety for personnel and radiation monitors at exits, an article in the newspaper Communist by K. Arin and a fellow engineer openly criticised the lack of safety, citing issues with design, plant siting, and growing concerns about nuclear waste.

Meanwhile, Britain struggled significantly to make its nuclear plants work. The first of the advanced gas-cooled reactors at Dungeness, ordered at a cost of £80 million and due to be commissioned in 1972, might not start producing electricity until 1977. This meant Britain had "thrown away" its lead in nuclear technology.

By the mid-1970s, as nuclear plants ordered in the 1960s neared completion in America, engineers began to discover the trap they had set. If a molten core could not be contained, then emergency systems to prevent a meltdown had to work flawlessly, requiring engineers to anticipate every conceivable malfunction in the plants' enormous complexity, which proved impossible. Theoretical calculations did not correlate with reality. While regulations mandated emergency core cooling systems, pumps, and valves, there was no real basis for knowing if they would actually prevent a meltdown. The complexity of predicting events inside a huge reactor during a pipe break made accurate judgements impossible, as there were no facts to base them upon.

Winter 1971 tests of emergency core cooling systems at the AEC's Idaho site simulated accidents in a small reactor model. Though emergency systems technically "worked," the water repeatedly failed to fill the core, often being forced out under pressure. Despite this, the industry and senior AEC members maintained that full-size safety systems were safe enough. The federal government and nuclear industry decided that "the absence of proof of danger was almost as good as proof of safety". They continued to work on perfecting emergency systems, but refrained from publicly admitting doubts, fearing an "uproar" and the end of the technology.

Catastrophic Accidents and Public Reaction

The gravest fears materialised on 28 March 1979, at the Three Mile Island plant, where a series of human and mechanical errors exposed the reactor core. The core reacted with steam, producing hydrogen, which subsequently exploded. Emergency teams were unable to comprehend what was occurring inside the reactor. A helicopter detected a large radioactive cloud drifting towards the nearest town. Recordings of the commissioners in charge of the disaster revealed their confusion and their complete lack of confidence in the information available. Operating "totally in the blind," they deliberated on whether to order an evacuation, unsure if moving people would expose them to a greater dose of radiation. For four days, engineers watched helplessly as a hydrogen bubble grew inside the damaged reactor, fearing a massive explosion, yet knowing that attempts to force it out might fully uncover the core, leading to a melt-through. The engineers were trapped by the consequences of an accident that "no one could have anticipated," an "unbelievable" failure mode that had "never been studied". There was considerable confusion among the company, the government, and the other participants.

The Three Mile Island accident had a significant impact on public perception, as observers saw men in white lab coats, presumed to be experts, "scratching their heads" on television, casting doubt on the level of control over the situation. The President's commission estimated the accident's cost could reach $1.8 billion for a plant that might never work again. Three Mile Island was not an isolated incident; in 1979 alone, there were 20 nuclear incidents that could have led to catastrophic meltdown in American nuclear power plants.

Worldwide protests against nuclear power proliferated, transforming the technology in the public imagination "from something good to something bad". Much of this anger was directed at nuclear scientists when it emerged that they had "deliberately concealed many of the risks and uncertainties they had discovered" while publicly promoting nuclear wonders. While some technologists had previously believed they could unilaterally decide what level of risk was acceptable, they later acknowledged that such decisions belonged to the public. However, a counter-argument maintained that the public should remain "in the dark" regarding scientific risks, accepting that "a few accidents" are an inherent risk of scientific endeavour. The historical reality of the nuclear enterprise, originating as a secret endeavour, meant the idea of the public being "intimately involved in very complicated technical issues" seemed inappropriate. A fundamental question arose: can modern intrusive technology and liberal democracy coexist?

The Chernobyl Disaster and its Legacy

On 26 April 1986, the Chernobyl disaster unfolded. An improbable sequence of errors led to an explosion and a molten core that began to burn through the reactor's foundations. Hundreds of volunteers frantically dug a tunnel directly under the plant, pouring in liquid nitrogen to freeze the ground, which also fortuitously began to stifle the graphite fire. After five days, for reasons still unknown, the core began to cool.

Valeri Legasov, a main architect of the Soviet nuclear programme, led the response at Chernobyl, repeatedly flying through the radiation above the blazing reactor. Initially, Legasov remained a staunch defender of nuclear power. However, in the months that followed, his perspective changed. In a taped interview, he delivered a damning criticism of the entire nuclear power programme, identifying the core problem as "the demand that was made of the technology". He concluded, paradoxically, that "the enemy isn't technology" but that "technology must be protected from man". Legasov lamented that while earlier generations of technologists were educated with humanitarian ideals and a clear moral sense, subsequent engineers focused solely on the technical side, losing sight of purpose and responsibility. The operators at Chernobyl on the night of the accident, for example, believed they were acting correctly, even breaking rules for the sake of perceived improvement, but had lost sight of the ultimate purpose. Two years to the day after the accident, Valeri Legasov committed suicide for unknown reasons.

The Enduring Political and Moral Choice

The history of nuclear technology reveals a crucial insight: its development is not merely a story of technology gone "amok". Rather, it is a history fundamentally shaped by "political and economic and social decisions". The notion that nuclear technology's form is inevitable, or that society cannot shape it, represents a "naive view". In reality, science and technology offer a "range of possibilities," capable of leading in numerous directions and acting as a "liberating force". However, to harness this potential, society must "stop sleepwalking" and recognise that the direction of technology is not a scientific or engineering choice, but ultimately a "moral choice".

The debate surrounding nuclear power persists, with some arguing that despite the accidents, it remains the most viable option among available technologies, noting its comparatively low death toll compared to other energy sources such as coal or gas, or even the automobile. The "weird public hysteria" surrounding nuclear power, though understandable due to the unique nature of radioactive contamination, is considered by some to be disproportionate.

Economic viability remains a contentious point; if built with minimal risk, nuclear power might not be profitable for private enterprise, potentially requiring it to be viewed as a cost-centre or necessary infrastructure funded by the government. Its successful operation in countries like France and Japan demonstrates its feasibility. It has been suggested that nuclear energy is excluded from the modern green agenda precisely "because it actually works" and could deliver a "utopia," thus eliminating opportunities for sustained "grift" through continuous government contracts. This perspective aligns with theories that capitalist firms sometimes "manufactured scarcity" to prevent abundance for their own gain.

Furthermore, the stark contrast between the Soviet Union, where technicians could publish concerns in a newspaper, and the American system, where similar concerns were suppressed, highlights differing political approaches to transparency and public engagement in critical technological matters. The continued prominence of events like Chernobyl in public discourse is also questioned, with some suggesting it is intentionally kept "front and centre" to foster distrust in nuclear power. This illustrates the ongoing influence of "money power" in shaping public perception and policy, a force that some believe can only be countered by strong leadership or significant societal shifts.
