
Fukushima Two Years Later: Many Questions, One Clear Answer

7:30 am in Uncategorized by Gregg Levine

Fukushima's threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–a scenario the NRC deems more likely than the tsunami that struck Japan in 2011–the flood at Oconee would trigger failures at all three reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link


Fukushima Plus Two: Still the Beginning?

4:25 am in Uncategorized by Gregg Levine

An IAEA inspector examines the remains of reactor 3 at Fukushima Daiichi (5/27/11)

I was up working in what were in my part of the world the early morning hours of March 11, 2011, when I heard over the radio that a massive earthquake had struck northeastern Japan. I turned on the TV just in time to see the earliest pictures of the tsunami that followed what became known as the Tohoku quake. The devastation was instantly apparent, and reports of high numbers of casualties seemed inevitable, but it wasn’t until a few hours later, when news of the destruction and loss of power at the Fukushima Daiichi nuclear plant hit the English-language airwaves, that I was gripped by a real sense of despair.

I was far from a nuclear expert at the time, but I knew enough to know that without intact cooling systems, or the power to keep them running, and with the added threat of a containment breach, some amount of environmental contamination was certain, and the potential for something truly terrifying was high.

What started as a weekend of watching newswires and live streams, virtually around the clock, and posting basic tech and health questions on email lists, expanded as the Fukushima crisis itself grew. Two years later, I have written tens of thousands of words, and read hundreds of thousands more. I have learned much, but I think I have only scratched the surface.

We all might be a little closer to understanding what happened in those first days and weeks after the earthquake, but what has happened since is still, sadly, a story where much must be written. What the Daiichi plant workers really went through in those early days is just now coming to light, and the tales of intrigue and cover-up, of corruption and captured government, grow more complex and more sinister with each revelation. But what has happened to the environment, not just in the government-cordoned evacuation zone, but also throughout Japan, across the Pacific, and around the world, will likely prove the most chilling narrative.

Radiation levels in the quarantined parts of Japan are still far too high to permit any kind of human re-habitation, but exposure rates in areas far outside that radius are also well above what would have been considered acceptable before this disaster. And water, used to cool the molten cores and damaged spent fuel pools at Fukushima Daiichi, now dangerously radioactive itself, continues to leak into the ground and into the ocean at unprecedented rates.


Seventy Years of Nuclear Fission: Short on Confidence; Long on Waste

5:55 am in Uncategorized by Gregg Levine

From here to eternity: a small plaque on the campus of the University of Chicago commemorates the site of Fermi's first atomic pile--and the start of the world's nuclear waste problem. (Photo: Nathan Guy via Flickr)

On December 2, 1942, a small group of physicists under the direction of Enrico Fermi gathered on an old squash court beneath the stands of Stagg Field on the campus of the University of Chicago to make and witness history. Uranium pellets and graphite blocks had been stacked around cadmium-coated rods as part of an experiment crucial to the Manhattan Project–the program tasked with building an atom bomb for the Allied forces in WWII. The experiment was successful, and for 28 minutes, the scientists and dignitaries present observed the world’s first manmade, self-sustaining nuclear fission reaction. They called it an atomic pile–Chicago Pile 1 (CP-1), to be exact–but what Fermi and his team had actually done was build the world’s first nuclear reactor.

The Manhattan Project’s goal was a bomb, but soon after the end of the war, scientists, politicians, the military and private industry looked for ways to harness the power of the atom for civilian use, or, perhaps more to the point, for commercial profit. Fifteen years to the day after CP-1 achieved criticality, President Dwight Eisenhower threw a ceremonial switch to start the reactor at Shippingport, PA, which was billed as the first full-scale nuclear power plant built expressly for civilian electrical generation.

Shippingport was, in reality, little more than a submarine engine on blocks, but the nuclear industry and its acolytes will say that it was the beginning of billions of kilowatts of power, promoted (without a hint of irony) as “clean, safe, and too cheap to meter.” It was also, however, the beginning of what is now a, shall we say, weightier legacy: 72,000 tons of nuclear waste.

Atoms for peace, problems forever

News of Fermi’s initial success was communicated by physicist Arthur Compton to the head of the National Defense Research Committee, James Conant, with artistically coded flair:

Compton: The Italian navigator has landed in the New World.
Conant: How were the natives?
Compton: Very friendly.

But soon after that initial success, CP-1 was disassembled and reassembled a short drive away, in Red Gate Woods. The optimism of the physicists notwithstanding, it was thought best to continue the experiments with better radiation shielding–and slightly removed from the center of a heavily populated campus. The move was perhaps the first necessitated by the uneasy relationship between fissile material and the health and safety of those around it, but if it was understood as a broader cautionary tale, no one let that get in the way of “progress.”

A stamp of approval: the US Postal Service commemorated Eisenhower's initiative in 1955.

By the time the Shippingport reactor went critical, North America already had a nuclear waste problem. The detritus from manufacturing atomic weapons was poisoning surrounding communities at several sites around the continent (not that most civilians knew it at the time). Meltdowns at Chalk River in Canada and the Experimental Breeder Reactor in Idaho had required fevered cleanups, the former of which included the help of a young Navy officer named Jimmy Carter. And the dangers of errant radioisotopes were increasing with the acceleration of above-ground atomic weapons testing. But as President Eisenhower extolled “Atoms for Peace,” and the US Atomic Energy Commission promoted civilian nuclear power at home and abroad, a plan to deal with the “spent fuel” (as used nuclear fuel rods are termed) and other highly radioactive leftovers was not part of the program (beyond, of course, extracting some of the plutonium produced by the fission reaction for bomb production, and the promise that the waste generated by US-built reactors overseas could at some point be marked “return to sender” and repatriated to the United States for disposal).

Attempts at what was called “reprocessing”–the re-refining of used uranium into new reactor fuel–quickly proved expensive, inefficient and dangerous, and created as much radioactive waste as they hoped to reuse. Reprocessing also provided an obvious avenue for nuclear weapons proliferation because of the resulting production of plutonium. The threat of proliferation (made flesh by India’s test of an atomic bomb in 1974) led President Jimmy Carter to cancel the US reprocessing program in 1977. Attempts by the Department of Energy to push mixed-oxide (MOX) fuel fabrication (combining uranium and plutonium) over the last dozen years have not produced any results, either, despite over $5 billion in government investments.

In fact, there was no official federal policy for the management of used but still highly radioactive nuclear fuel until passage of The Nuclear Waste Policy Act of 1982. And while that law acknowledged the problem of thousands of tons of spent fuel accumulating at US nuclear plants, it didn’t exactly solve it. Instead, the NWPA started a generation of political horse trading, with goals and standards defined more by market exigencies than by science, that leaves America today with what amounts to over five-dozen nominally temporary repositories for high-level radioactive waste–and no defined plan to change that situation anytime soon.

When you assume…


New Fukushima Video Shows Disorganized Response, Organized Deception

6:30 am in Uncategorized by Gregg Levine

A frame from early in the newly released Fukushima video.

Tokyo Electric Power Company (TEPCO), the operator of the Fukushima Daiichi nuclear power plant when the Tohoku earthquake and tsunami struck last year, bowed to public and government pressure this week, releasing 150 hours of video recorded during the first days of the Fukushima crisis. Even with some faces obscured and two-thirds of the audio missing, the tapes clearly show a nuclear infrastructure wholly unprepared for the disaster, and an industry and government wholly determined to downplay that disaster’s severity:

Though incomplete, the footage from a concrete bunker at the plant confirms what many had long suspected: that the Tokyo Electric Power Company, the plant’s operator, knew from the early hours of the crisis that multiple meltdowns were likely despite its repeated attempts in the weeks that followed to deny such a probability.

It also suggests that the government, during one of the bleakest moments, ordered the company not to share information with the public, or even local officials trying to decide if more people should evacuate.

Above all, the videos depict mayhem at the plant, a lack of preparedness so profound that too few buses were on hand to carry workers away in the event of an evacuation. They also paint a close-up portrait of the man at the center of the crisis, Mr. Yoshida, who galvanizes his team of engineers as they defy explosions and fires — and sometimes battle their own superiors.

That summary is from New York Times Tokyo-based reporter Hiroko Tabuchi. The story she tells is compelling and terrifying, and focuses on the apparent heroism of Masao Yoshida, Fukushima’s chief manager when the crisis began, along with the far less estimable behavior of TEPCO and Japanese government officials. It is worth a couple of your monthly quota of clicks to read all the way through.

The story is but one take on the video, and I point this out not because I question Tabuchi’s reporting on its content, much of which is consistent with what is already known about the unholy alliance between the nuclear industry and the Japanese government, and about what those parties did to serve their own interests at the expense of the Japanese people (and many others across the northern hemisphere). Instead, I bring this up because I do not myself speak Japanese, and I am only allowed to view a 90-minute “highlight reel” and not the entire 150 hours of video, and so I am dependent on other reporters’ interpretations. And because neither TEPCO nor the Japanese government (which now essentially owns TEPCO) has yet proven to be completely open or honest on matters nuclear, the subtle differences in those interpretations matter.

Tabuchi took to Twitter to say how much she wanted to tell the story as “a tribute to Fukushima Daiichi chief Yoshida and the brave men on the ground who tried to save us.” But in a separate tweet, Tabuchi said she was “heartbroken” to discover her article was cut in half.


Made in Japan? Fukushima Crisis Is Nuclear, Not Cultural

7:29 am in Uncategorized by Gregg Levine

(photo: Steve Snodgrass/flickr)

Since the release of the Fukushima Nuclear Accident Independent Committee’s official report last week, much has been made of how it implicates Japanese culture as one of the root causes of the crisis. The committee’s chairman, Dr. Kiyoshi Kurokawa, makes the accusation quite plainly in the opening paragraphs of the executive summary [PDF]:

What must be admitted – very painfully – is that this was a disaster “Made in Japan.” Its fundamental causes are to be found in the ingrained conventions of Japanese culture: our reflexive obedience; our reluctance to question authority; our devotion to ‘sticking with the program’; our groupism; and our insularity.

That this apparently critical self-examination was seized upon by much of the western media’s coverage of the report probably does not come as a surprise–especially when you consider that this revelation falls within the first 300 words of an 88-page document. Cultural stereotypes and incomplete reads are hardly new to establishment reportage. What might come as a shock, however, is that this painful admission is only made in the English-language version of the document, and only in the chairman’s introduction is the “made in Japan” conclusion drawn so specifically.

What replaces the cultural critique in the Japanese edition and in the body of the English summary is a ringing indictment of the cozy relationship between the Japanese nuclear industry and the government agencies that were supposed to regulate it. This “regulatory capture,” as the report details, is certainly central to the committee’s findings and crucial to understanding how the Fukushima disaster is a manmade catastrophe, but it is not unique to the culture of Japan.

Indeed, observers of the United States will recognize this lax regulatory construct as part-and-parcel of problems that threaten the safety and health of its citizenry, be it in the nuclear sector, the energy sector as a whole, or across a wide variety of officially regulated industries.

No protection


Fukushima Nuclear Disaster “Man-Made” Reports Japanese Panel; Quake Damaged Plant Before Tsunami

5:45 am in Uncategorized by Gregg Levine

Aerial view of the Oi Nuclear Power Plant, Fukui Prefecture, Japan. (photo: Japan Ministry of Land, Infrastructure and Transport via Wikipedia)

The massive disaster at the Fukushima Daiichi nuclear facility that began with the March 11, 2011 Tohoku earthquake and tsunami could have been prevented and was likely made worse by the response of government officials and plant owners, so says a lengthy report released today by the Japanese Diet (their parliament).

The official report of The Fukushima Nuclear Accident Independent Investigation Committee [PDF] harshly criticizes the Japanese nuclear industry for avoiding safety upgrades and disaster plans that could have mitigated much of what went wrong after a massive quake struck the northeast of Japan last year. The account also includes direct evidence that Japanese regulatory agencies conspired with TEPCO (Fukushima’s owner-operator) to help them forestall improvements and evade scrutiny:

The TEPCO Fukushima Nuclear Power Plant accident was the result of collusion between the government, the regulators and TEPCO, and the lack of governance by said parties. They effectively betrayed the nation’s right to be safe from nuclear accidents.

. . . .

We found evidence that the regulatory agencies would explicitly ask about the operators’ intentions whenever a new regulation was to be implemented. For example, NISA informed the operators that they did not need to consider a possible station blackout (SBO) because the probability was small and other measures were in place. It then asked the operators to write a report that would give the appropriate rationale for why this consideration was unnecessary.

The report also pointed to Japanese cultural conventions, namely the reluctance to question authority–a common refrain in many post-Fukushima analyses.

But perhaps most damning, and most important to the future of Japan and to the future of nuclear power worldwide, is the Investigation’s finding that parts of the containment and cooling systems at Fukushima Daiichi were almost certainly damaged by the earthquake before the mammoth tsunami caused additional destruction.

Something Fishy: CRS Report Downplays Fukushima’s Effect on US Marine Environment

6:55 am in Uncategorized by Gregg Levine

(photo: JanneM via flickr)

Late Thursday, the United States Coast Guard reported that they had successfully scuttled the Ryou-Un Maru, the Japanese “Ghost Ship” that had drifted into US waters after being torn from its moorings by the tsunami that followed the Tohoku earthquake over a year ago. The 200-foot fishing trawler, which was reportedly headed for scrap before it was swept away, was seen as potentially dangerous as it drifted near busy shipping lanes.

Coincidentally, the “disappearing” of the Ghost Ship came during the same week the Congressional Research Service (CRS) released its report on the effects of the Fukushima Daiichi nuclear disaster on the US marine environment, and, frankly, the metaphor couldn’t be more perfect. The Ryou-Un Maru is now resting at the bottom of the ocean–literally nothing more to see there, thanks to a few rounds from a 25mm Coast Guard gun–and the CRS hopes to dispatch fears of the radioactive contamination of US waters and seafood with the same alacrity.

But while the Ghost Ship was not considered a major ecological threat (though it did go down with around 2,000 gallons of diesel fuel in its tanks), the US government acknowledges that this “good luck ship” (a rough translation of its name) is an early taste of the estimated 1.5 million tons of tsunami debris expected to hit North American shores over the next two or three years. Similarly, the CRS report (titled Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment [PDF]) adopts an overall tone of “no worries here–it’s all under control,” but a closer reading reveals hints of “more to come.”

Indeed, the report feels as if it were put through a political rinse cycle, limited both in the strength of its language and the scope of its investigation. This tension is evident right from the start–take, for example, these three paragraphs from the report’s executive summary:

Both ocean currents and atmospheric winds have the potential to transport radiation over and into marine waters under U.S. jurisdiction. It is unknown whether marine organisms that migrate through or near Japanese waters to locations where they might subsequently be harvested by U.S. fishermen (possibly some albacore tuna or salmon in the North Pacific) might have been exposed to radiation in or near Japanese waters, or might have consumed prey with accumulated radioactive contaminants.

High levels of radioactive iodine-131 (with a half-life of about 8 days), cesium-137 (with a half-life of about 30 years), and cesium-134 (with a half-life of about 2 years) were measured in seawater adjacent to the Fukushima Dai-ichi site after the March 2011 events. EPA rainfall monitors in California, Idaho, and Minnesota detected trace amounts of radioactive iodine, cesium, and tellurium consistent with the Japanese nuclear incident, at concentrations below any level of concern. It is uncertain how precipitation of radioactive elements from the atmosphere may have affected radiation levels in the marine environment.

Scientists have stated that radiation in the ocean very quickly becomes diluted and would not be a problem beyond the coast of Japan. The same is true of radiation carried by winds. Barring another unanticipated release, radioactive contaminants from Fukushima Dai-ichi should be sufficiently dispersed over time that they will not prove to be a serious health threat elsewhere, unless they bioaccumulate in migratory fish or find their way directly to another part of the world through food or other commercial products.

Winds and currents have “the potential” to transport radiation into US waters? Winds–quite measurably–already have, and computer models show that currents, over the next couple of years, most certainly will.

Are there concentrations of radioisotopes that are “below concern?” No reputable scientist would make such a statement. And if monitors in the continental United States detected radioactive iodine, cesium and tellurium in March 2011, then why did they stop the monitoring (or at least stop reporting it) by June?

The third paragraph, however, wins the double-take prize. Radiation would not be a problem beyond the coast? Fish caught hundreds of miles away would beg to differ. “Barring another unanticipated release. . . ?” Over the now almost 13 months since the Fukushima crisis began, there have been a series of releases into the air and into the ocean–some planned, some perhaps unanticipated at the time–but overall, the pattern is clear: radioactivity continues to enter the environment at unprecedented levels.

And radioactive contaminants “should be sufficiently dispersed over time, unless they bioaccumulate?” Unless? Bioaccumulation is not some crazy, unobserved hypothesis; it is a documented biological process. Bioaccumulation will happen–it will happen in migratory fish and it will happen as under-policed food and commercial products (not to mention that pesky debris) make their way around the globe.

Maybe that is supposed to be read by inquiring minds as the report’s “please ignore the man behind the curtain” moment–an intellectual out clause disguised as an authoritative analgesic–but there is no escaping the intent. Though filled with caveats and counterfactuals, the report is clearly meant to serve as a sop to those alarmed by the spreading ecological catastrophe posed by the ongoing Fukushima disaster.
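The half-lives quoted in the report’s own summary make plain why early iodine-131 readings are a poor proxy for the ongoing cesium problem. A minimal back-of-the-envelope sketch (the standard decay formula N(t)/N₀ = 2^(−t/T½); the ~395-day elapsed time is my approximation of the thirteen months between March 2011 and April 2012, not a figure from the report):

```python
# Decay arithmetic using the half-lives the CRS summary itself quotes:
# iodine-131 ~8 days, cesium-134 ~2 years, cesium-137 ~30 years.

def remaining_fraction(half_life_days: float, elapsed_days: float) -> float:
    """Fraction of a radioisotope remaining after elapsed_days."""
    return 2.0 ** (-elapsed_days / half_life_days)

ELAPSED = 395  # ~13 months, March 2011 to April 2012 (approximation)

isotopes = {
    "iodine-131 (8-day half-life)": 8,
    "cesium-134 (~2-year half-life)": 2 * 365,
    "cesium-137 (~30-year half-life)": 30 * 365,
}

for name, half_life in isotopes.items():
    print(f"{name}: {remaining_fraction(half_life, ELAPSED):.2%} remaining")
```

By this arithmetic, the iodine-131 from the initial releases is essentially gone, while roughly two-thirds of the cesium-134 and some 97 percent of the cesium-137 remain–which is exactly why a report leaning on March 2011 measurements understates the isotopes that matter over the long haul.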

The devil is in the details–the dangers are in the data

Beyond the wiggle words, perhaps the most damning indictment of the CRS marine radiation report can be found in the footnotes–or, more pointedly, in the dates of the footnotes. Though this report was released over a year after the Tohoku earthquake and tsunami triggered the Fukushima nightmare, the CRS bases the preponderance of its findings on information generated during the disaster’s first month. In fact, of the document’s 29 footnotes, only a handful date from after May 2011–one of those points to a CNN report (authoritative!), one to a status update on the Fukushima reactor structures, one confirms the value of Japanese seafood imports, three are items tracking the tsunami debris, and one directs readers to a government page on FDA radiation screening, the pertinent part of which was last updated on March 28 of last year.

Most crucially, the parts of the CRS paper that downplay the amounts of radiation measured by domestic US sensors all cite data collected within the first few weeks of the crisis. The point about radioisotopes being “below any level of concern” comes from an EPA news release dated March 22, 2011–eleven days after the earthquake, only six days after the last reported reactor explosion, and well before so many radioactive releases into the air and ocean. It is like taking reports of only minor flooding from two hours after Hurricane Katrina passed over New Orleans, and using them as the standard for levee repair and gulf disaster planning (perhaps not the best example, as many have critiqued levee repairs for their failure to incorporate all the lessons learned from Katrina).

It now being April of 2012, much more information is available, and clearly any report that expects to be called serious should have included at least some of it.

By October of last year, scientists were already doubling their estimates of the radiation pushed into the atmosphere by the Daiichi reactors, and in early November, as reported here, France’s Institute for Radiological Protection and Nuclear Safety issued a report showing the amount of cesium 137 released into the ocean was 30 times greater than what was stated by TEPCO in May. Shockingly, the Congressional Research Service does not reference this report.

Or take the early March 2012 revelation that seaweed samples collected off the coast of southern California show radioactive iodine-131 at levels 500 percent higher than those from anywhere else in the US or Canada. It should be noted that this is the result of airborne fallout–the samples were taken in mid-to-late-March 2011, much too soon for water-borne contamination to have reached that area–and so serves to confirm models that showed a plume of radioactive fallout with the greatest contact in central and southern California. (Again, this specific report was released a month before the CRS report, but the data it uses were collected over a year ago.)

Then there are the food samples taken around Japan over the course of the last year showing freshwater and sea fish–some caught over 200 kilometers from Fukushima–with radiation levels exceeding 100 becquerels per kilogram (one as high as 600 Bq/kg).

And the beat goes on

This information, and much similar to it, was all available before the CRS released its document, but the report also operates in a risibly artificial universe that assumes the situation at Fukushima Daiichi has basically stabilized. As a sampling of pretty much any week’s news will tell you, it has not. Take, for example, this week:

About 12 tons of water contaminated with radioactive strontium are feared to have leaked from the Fukushima No. 1 plant into the Pacific Ocean, Tepco said Thursday.

The leak occurred when a pipe broke off from a joint while the water was being filtered for cesium, Tokyo Electric Power Co. said.

The system doesn’t remove strontium, and most of the water apparently entered the sea via a drainage route, Tepco added.

The water contained 16.7 becquerels of cesium per cu. centimeter and tests are under way to determine how much strontium was in it, Tepco said.

This is the second such leak in less than two weeks, and as Kazuhiko Kudo, a professor of nuclear engineering at Kyushu University who visited Fukushima Daiichi twice last year, noted:

There will be similar leaks until Tepco improves equipment. The site had plastic pipes to transfer radioactive water, which Tepco officials said are durable and for industrial use, but it’s not something normally used at nuclear plants. Tepco must replace it with metal equipment, such as steel.

(The plastic tubes–complete with the vinyl and duct tape patch–can be viewed here.)
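For a sense of scale, TEPCO’s own numbers for this single leak can be turned into a rough total. A sketch, assuming the standard conversion of one metric ton of water to roughly one cubic meter (a million cubic centimeters); the tonnage and concentration come from the news item quoted above:

```python
# Back-of-the-envelope total activity for the leak TEPCO reported:
# ~12 tons of contaminated water at 16.7 Bq of cesium per cubic centimeter.
# Assumes 1 metric ton of water ~ 1 m^3 = 1,000,000 cm^3 (fresh water).

TONS_LEAKED = 12
CM3_PER_TON = 1_000_000
CESIUM_BQ_PER_CM3 = 16.7

total_bq = TONS_LEAKED * CM3_PER_TON * CESIUM_BQ_PER_CM3
print(f"~{total_bq:.1e} Bq of cesium in the leaked water")  # ~2.0e8 Bq
```

That is on the order of 200 million becquerels–and that is the cesium alone. The strontium content, which the filtration system does not remove, was still being tested at the time of TEPCO’s statement.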

And would that the good people at the Congressional Research Service could have waited to read a report that came out the same day as theirs:

Radioactive material from the Fukushima nuclear disaster has been found in tiny sea creatures and ocean water some 186 miles (300 kilometers) off the coast of Japan, revealing the extent of the release and the direction pollutants might take in a future environmental disaster.

In some places, the researchers from Woods Hole Oceanographic Institution (WHOI) discovered cesium radiation hundreds to thousands of times higher than would be expected naturally, with ocean eddies and larger currents both guiding the “radioactive debris” and concentrating it.

Or would that the folks at CRS had looked to their fellow government agencies before they went off half-cocked. (The study above was done by researchers at Woods Hole and written up in the Proceedings of the National Academy of Sciences.) In fact, it appears the CRS could have done that. In its report, CRS mentions that “Experts cite [Fukushima] as the largest recorded release of radiation to the ocean,” and the source for that point is a paper by Ken Buesseler–the same Ken Buesseler who led the WHOI study. Imagine what could have been if the Congressional Research Service had actually contacted the original researcher.

Can openers all around

Or perhaps it wouldn’t have mattered. For if there is one obvious takeaway from the CRS paper–one that, beyond its limits of scope and authority, seems meant to absolve it of all other oversights–it is its unfailing confidence in government oversight.

Take a gander at the section under the bolded question “Are there implications for US seafood safety?”:

It does not appear that nuclear contamination of seafood will be a food safety problem for consumers in the United States. Among the main reasons are that:

  • damage from the disaster limited seafood production in the affected areas,
  • radioactive material would be diluted before reaching U.S. fishing grounds, and
  • seafood imports from Japan are being examined before entry into the United States.

According to the U.S. Food and Drug Administration (FDA), because of damage from the earthquake and tsunami to infrastructure, few if any food products are being exported from the affected region. For example, according to the National Federation of Fisheries Cooperative Associations, the region’s fishing industry has stopped landing and selling fish. Furthermore, a fishing ban has been enforced within a 2-kilometer radius around the damaged nuclear facility.

So, the Food and Drug Administration is relying on the word of an industry group and a Japanese government-enforced ban encompassing a two-kilometer radius–which link of that chain is supposed to be reassuring?

Last things first: two kilometers? Well, perhaps the CRS should hire a few proofreaders. A search of the source materials finds that the ban is supposed to be 20 kilometers. Indeed, the Japanese government quarantined the land within a 20-kilometer radius. The US suggested evacuation from a 50-mile (80-kilometer) radius. The CRS’s own report notes contaminated fish were collected 30 kilometers from Fukushima. So why is even 20 kilometers suddenly a radius to brag about?
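The dropped zero matters more than it might seem, because an exclusion zone is an area, not a line–the error compounds with the square of the radius. A quick sketch, using the radii discussed above:

```python
import math

# Areas of the exclusion radii mentioned in the text. A 10x error in
# the radius is a 100x error in the area covered by the ban.
radii_km = {
    "CRS's stated 2 km": 2,
    "actual Japanese ban (20 km)": 20,
    "US evacuation advisory (80 km)": 80,
}
areas_km2 = {label: math.pi * r ** 2 for label, r in radii_km.items()}
for label, area in areas_km2.items():
    print(f"{label}: ~{area:,.0f} km^2")
```

In other words, the typo understates the banned area by a factor of 100–roughly 13 square kilometers instead of more than 1,250.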

As for a damaged industry not exporting, numerous reports show the Japanese government stepping in to remedy that “problem.” From domestic PR campaigns encouraging the consumption of foodstuffs from Fukushima prefecture, to the Japanese companies selling food from the region to other countries at deep discounts, to the Japanese government setting up internet clearing houses to help move tainted products, all signs point to a power structure that sees exporting possibly radioactive goods as essential to its survival.

The point on dilution, of course, not only ignores the way many large-scale fishing operations work; it also ignores airborne contamination and runs counter to the report’s own acknowledgment of bioaccumulation.

But maybe the shakiest assertion of all is that the US Food and Drug Administration will stop all contaminated imports at the water’s edge. Imports hardly represent the total picture when evaluating US seafood safety, but even for the small slice of the problem it covers, the claim raises eyebrows.

First there is the oft-referenced point from nuclear engineer Arnie Gundersen, who said last summer that State Department officials told him of a secret agreement between Japan and Secretary of State Hillary Clinton guaranteeing the continued importation of Japanese food. While independent confirmation of this pact is hard to come by, there is the plain fact that, beyond bans on milk, dairy products, fruits and vegetables from the Fukushima region issued in late March 2011, the US has imposed no other restrictions on Japanese food imports (and even those few restrictions were lifted for US military commissaries in September).

And perhaps most damning, there was the statement from an FDA representative last April declaring that North Pacific seafood was so unlikely to be contaminated that “no sampling or monitoring of our fish is necessary.” The FDA said at the time that it would rely on the National Oceanic and Atmospheric Administration (NOAA) to tell it when to consider testing seafood, but a NOAA spokesperson said it was the FDA’s call.

Good. Glad that’s been sorted out.

The Congressional Research Service report falls victim to a problem noted often here–it assumes a can opener. As in the joke, the writers stipulate a functioning mechanism before explaining their solution. Just as many nuclear industry-watchers assume a functioning regulatory process (as opposed to a captured Nuclear Regulatory Commission, an industry-friendly Department of Energy, and industry-purchased members of Congress) when speaking of the hypothetical safety of nuclear power, so the CRS here assumes an FDA interested first and foremost in protecting the general public, instead of an agency trying to strike some awkward “balance” between health, profit and politics. The can opener story is a joke; the effects of this real-life example are not.

Garbage in, garbage out

The Congressional Research Service, a part of the Library of Congress, is intended to function as the research and analysis wing of the US Congress. It is supposed to be objective, it is supposed to be accurate, and it is supposed to be authoritative. America needs the CRS to be all of those things because the agency’s words are expected to inform federal legislation. When the CRS shirks its responsibility, shapes its words to fit comfortably into the conventional wisdom, or shaves off the sharp corners to curry political favor, the impact is more than academic.

When the CRS limits its scope to avoid inconvenient truths, it bears false witness to the most important events of our time. When the CRS pretends other government agencies are doing their jobs–despite documentable evidence to the contrary–it is not performing its own. And when the CRS issues a report that ignores the data and the science so that a few industries might profit, it is America that loses.

The authors of this particular report might not be around when the bulk of the cancers and defects tied to the radiation from Fukushima Daiichi present in the general population, but this paper’s integrity today could influence those numbers tomorrow. Bad, biased, or bowdlerized advice could scuttle meaningful efforts to make consequential policy.

If the policy analysts who sign their names to reports like this don’t want their work used for scrap paper, then maybe they should take a lesson from the Ryou-Un Maru. Going where the winds and currents take you makes you at best a curiosity, and more likely a nuisance–just so much flotsam and jetsam getting in the way of actual business. Works of note come with moral rudders, anchored to the best available data; without that, the report might as well just say “good luck.”

Looking Back at Our Nuclear Future

12:30 pm in Uncategorized by Gregg Levine

The Los Angeles Times heralds the nuclear age in January 1957. (photo via wikipedia)

On March 11, communities around the world commemorated the first year of the still-evolving Fukushima Daiichi nuclear disaster with rallies, marches, moments of silence, and numerous retrospective reports and essays (including one here). But 17 days later, another anniversary passed with much less fanfare.

It was in the early morning hours of March 28, 1979, that a chain of events at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania caused what is known as a “loss of coolant accident,” resulting in a partial core meltdown, a likely hydrogen explosion, the venting of some amount of radioisotopes into the air and the dumping of 40,000 gallons of radioactive waste water into the Susquehanna River. TMI (as it is sometimes abbreviated) is often called America’s worst commercial nuclear accident, and though the nuclear industry and its acolytes have worked long and hard to downplay any adverse health effects stemming from the mishap, the fact is that what happened in Pennsylvania 33 years ago changed the face and future of nuclear power.

The construction of new nuclear power facilities in the US was already in decline by the mid-1970s, but the Three Mile Island disaster essentially brought all new projects to a halt. There were no construction licenses granted to new nuclear plants from the time of TMI until February of this year, when the NRC gave a hasty go-ahead to two reactors slated for the Vogtle facility in Georgia. And though health and safety concerns certainly played a part in this informal moratorium, cost had at least an equal role. The construction of new plants proved more and more expensive, never coming in on time or on budget, and the cleanup of the damaged unit at Three Mile Island took 14 years and cost over $1 billion. Even with the Price-Anderson Act limiting the industry’s liability, nuclear power plants are considered such bad risks that no financing can be secured without federal loan guarantees.

In spite of that–or because of that–the nuclear industry has pushed steadily over the last three decades to wring every penny out of America’s aging reactors, pumping goodly amounts of their hefty profits into lobbying efforts and campaign contributions designed to capture regulators and elected officials and propagate the age-old myth of an energy source that is clean, safe, and, if not exactly “too cheap to meter,” at least impressively competitive with other options. The result is a fleet of over 100 reactors nearing the end of their design lives–many with documented dangers and potential pitfalls that could rival TMI–now seeking and regularly getting license extensions from the Nuclear Regulatory Commission while that same agency softens and delays requirements for safety upgrades.

And all of that cozy cooperation between government and big business goes on with the nuclear industry pushing the idea of a “nuclear renaissance.” In the wake of Fukushima, the industry has in fact increased its efforts, lobbying the US and British governments to downplay the disaster, and working with its mouthpieces in Congress and on the NRC to try to kill recommended new regulations and force out the slightly more safety-conscious NRC chair. And, just this month, the Nuclear Energy Institute, the chief nuclear trade group, moved to take their message to what might be considered a less friendly cohort, launching a splashy PR campaign by underwriting public radio broadcasts and buying time for a fun and funky 60-second animated ad on The Daily Show.

All of this is done with the kind of confidence that only comes from knowing you have the money to move political practice and, perhaps, public opinion. Three Mile Island is, to the industry, the exception that proves the rule–if not an out-and-out success. “No one died,” you will hear–environmental contamination and the latest surveys showing increased rates of leukemia some 30 years later be damned–and that TMI is the only major accident in over half a century of domestic nuclear power generation.

Of course, this is not even remotely true–names like Browns Ferry, Cooper, Millstone, Indian Point and Vermont Yankee come to mind–but even if you discount plant fires and tritium leaks, Three Mile Island is not even America’s only meltdown.

There is, of course, the 1966 accident at Michigan’s Enrico Fermi Nuclear Generating Station, chronicled in the John Grant Fuller book We Almost Lost Detroit, but atom-lovers will dismiss this because Fermi 1 was an experimental breeder reactor, so it is not technically a “commercial” nuclear accident.

But go back in time another seven years–a full 20 before TMI–and the annals of nuclear power contain the troubling tale of another criticality accident, one that coincidentally is again in the news this week, almost 53 years later.

The Sodium Reactor Experiment

On July 12, 1957, the Sodium Reactor Experiment (SRE) at the Santa Susana Field Laboratory near Simi Valley, California, became the first US nuclear reactor to produce electricity for a commercial power grid. SRE was a sodium-cooled reactor designed by Atomics International, a division of North American Aviation, a company more often known by the name of its other subsidiary, Rocketdyne. Southern California Edison used the electricity generated by SRE to light the nearby town of Moorpark.

Sometime during July 1959–the exact date is still not entirely clear–a lubricant used to cool the seals on the pump system seeped into the primary coolant, broke down in the heat and formed a compound that clogged cooling channels. Because of either curiosity or ignorance, operators continued to run the SRE despite wide fluctuations in core temperature and generating capacity.

Following a pattern that is now all too familiar, increased temperatures caused increased pressure, necessitating what was even then called a “controlled venting” of radioactive vapor. How much radioactivity was released into the environment is cause for some debate, for, in 1959, there was less monitoring and even less transparency. Current reconstructions, however, suggest the release may have been as much as 450 times greater than what was vented at Three Mile Island.

When the reactor was finally shut down and the fuel rods were removed (which was a trick in itself, as some were stuck and others broke), it was found that over a quarter showed signs of melting.

The SRE was eventually repaired and restarted in 1960, running on and off for another four years. Decommissioning began in 1976, and was finished in 1981, but the story doesn’t end there. Not even close.

Fifty-three years after a partial nuclear meltdown at the Santa Susana Field Laboratory site in the Chatsworth Hills, the U.S. Environmental Protection Agency has just released data finding extensive radioactive contamination still remains at the accident site.

“This confirms what we were worried about,” said Assemblywoman Julia Brownley, D-Oak Park, a long-time leader in the fight for a complete and thorough cleanup of this former Rocketdyne rocket engine testing laboratory. “This begins to answer critical questions about what’s still up there, where, how much, and how bad?”

Well, it sort of begins to answer it.

New soil samples weigh in at up to 1,000 times the radiation trigger levels (RTLs) agreed to when the Department of Energy struck a cleanup deal with the California Department of Toxic Substances in 2010. What’s more, these measurements follow two previous cleanup efforts by the DOE and Boeing, the company that now owns Santa Susana.

In light of the new findings, Assemblywoman Brownley has called on the DOE to comply with the agreement and do a real and thorough cleanup of the site. That means taking radiation levels down to what are the established natural background readings for the area. But that, as is noted by local reporter Michael Collins, “may be easier said than done”:

This latest U.S. EPA information appears to redefine what cleaning up to background actually is. Publicly available documents show that the levels of radiation in this part of Area IV where the SRE once stood are actually many thousands of times more contaminated than previously thought.

Just as troubling, the EPA’s RTLs, which are supposed to mirror the extensively tested and reported-on backgrounds of the numerous radionuclides at the site, were many times over the background threshold values (BTVs). So instead of cleaning up to background, much more radiation would be left in the ground, saving the government and lab owner Boeing millions in cleanup.

It is a disturbing tale of what Collins calls a kind of environmental “bait and switch” (of which he provides even more detail in an earlier report), but after a year of documenting the mis- and malfeasance of the nuclear industry and its supposed regulators, it is, to us here, anyway, not a surprising one.

To the atom-enamored, it is as if facts have a half-life all their own. The pattern is always the same: swear that an event is no big deal, then come back with revision after revision, each admitting a little bit more, in a seemingly never-ending regression toward what might approximately describe a terrible reality. It would be reminiscent of the “mom’s on the roof” joke if anyone actually believed that nuclear operators and their chummy government minders ever intended to eventually relay the truth.

Fukushima’s latest surprise

Indeed, that unsettling pattern is again visible in the latest news from Japan. This week saw revelations that radiation inside Fukushima Daiichi’s reactor 2 containment vessel clocked in at levels seriously higher than previously thought, while water levels are seriously lower.

An endoscopic camera, thermometer, water gauge and dosimeter were inserted into the number 2 reactor containment, and it documented radiation levels of up to 70 sieverts per hour–not only seven times the previous highest measurement, but 10 times what is generally called a fatal dose (roughly 7 Sv; at 70 Sv/hr, a human would absorb that in a matter of minutes).
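Those numbers are worth checking for yourself (a minimal sketch, assuming the commonly cited figure of roughly 7 sieverts as a whole-body fatal dose; estimates vary):

```python
# Sanity check on the reactor 2 containment readings reported above.
dose_rate_sv_hr = 70.0      # measured inside the containment vessel
previous_high_sv_hr = 10.0  # prior highest reading at the plant
fatal_dose_sv = 7.0         # ~fatal whole-body dose (an assumption)

# Ratio to the previous record reading
print(dose_rate_sv_hr / previous_high_sv_hr)

# Minutes of exposure needed to accumulate a fatal dose at that rate
minutes_to_fatal_dose = fatal_dose_sv / dose_rate_sv_hr * 60
print(minutes_to_fatal_dose)
```

At 70 Sv/hr, a worker would accumulate a fatal dose in about six minutes–which is why humans cannot approach the reactor at all.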

The water level inside the containment vessel, estimated to be at 10 meters when the Japanese government declared a “cold shutdown” in December, turns out to be no more than 60 centimeters (about two feet).

This is disquieting news for many reasons. First, the high radiation not only makes it impossible for humans to get near the reactor, it makes current robotic technology impractical, as well. The camera, for instance, would only last 14 hours in those conditions. If the molten core is to be removed, a new class of radiation-resistant robots will have to be developed.

The extremely low water levels signal more troubling scenarios. Though some experts believe that the fuel rods have melted down or melted through to such an extent that two feet of water can keep them covered, it likely indicates a breach or breaches of the containment vessel. Plant workers, after all, have been pumping water into the reactor constantly for months now (why no one noticed that they kept having to add water to the system, or why no one cared, is plenty disturbing, as is the question of where all that extra water has gone).

Arnie Gundersen of nuclear engineering consultancy Fairewinds Associates believes that the level of water roughly corresponds with the lower lip of the vessel’s suppression pool–further evidence that reactor 2 suffered a hydrogen explosion, as did two other units at Fukushima. Gundersen also believes that the combination of heat, radioactivity and seawater likely degraded the seals on points where tubes and wires penetrated the structure–so even if there were no additional cracks from an explosion or the earthquake, the system is now almost certainly riddled with holes.

The holes pose a couple of problems: not only do they mean more contaminated water leaking into the environment, they also preclude filling the building with water to shield people and equipment from radiation. Combined with the elevated radiation readings, this will almost certainly mean a considerably longer and more expensive cleanup.

And reactor 2 was considered the Fukushima unit in the best shape.

(Reactor 2 is also the unit that experienced a rapid rise in temperature and possible re-criticality in early February. TEPCO officials later attributed this finding to a faulty thermometer, but if one were skeptical of that explanation before, the new information about high radiation and low water levels should warrant a re-examination of February’s events.)

What does this all mean? Well, for Japan, it means injecting another $22 billion into Fukushima’s nominal owner, TEPCO–$12 billion just to stay solvent, and $10.2 billion to cover compensation for those injured or displaced by the nuclear crisis. That cash dump comes on top of the $18 billion already coughed up by the Japanese government, and is just a small down payment on what is estimated to be a $137 billion bailout of the power company.

It also means a further erosion of trust in an industry and a government already short on respect.

The same holds true in the US, where poor communication and misinformation left the residents of central Pennsylvania panicked and perturbed some 33 years ago, and the story is duplicated on varying scales almost weekly somewhere near one of America’s 104 aging and increasingly accident-prone nuclear reactors.

And, increasingly, residents and the state and local governments that represent them are saying “enough.” Whether it is the citizens and state officials from California’s Simi Valley demanding the real cleanup of a 53-year-old meltdown, or the people and legislature of Vermont facing off with the federal government on who has ultimate authority to assure that the next nuclear accident doesn’t happen in their backyard, Americans are looking at their future in the context of nuclear’s troubled past.

One year after Fukushima, 33 years after Three Mile Island, and 53 years after the Sodium Reactor Experiment, isn’t it time the US federal government did so, too?

As World Honors Fukushima Victims, NRC Gives Them a One-Fingered Salute

8:45 am in Uncategorized by Gregg Levine

Sign from Fukushima commemoration and anti-nuclear power rally, Union Square Park, NYC, 3/11/12. (photo: G. Levine)

Nearly a week after the first anniversary of the Japanese earthquake and tsunami that started the crisis at the Fukushima Daiichi nuclear power facility, I am still sorting through the dozens of reports, retrospectives and essays commemorating the event. The sheer volume of material has been a little exhausting, but that is, of course, compounded by the weight of the subject. From reviewing the horrors of a year ago–now even more horrific, thanks to many new revelations about the disaster–to contemplating what lies ahead for residents of Japan and, indeed, the world, it is hard just to read about it; living it–then, now, and in the future–is almost impossible for me to fathom.

But while living with the aftermath might be hard to imagine, that such a catastrophe could and likely would happen was not. In fact, if there is a theme (beyond the suffering of the Japanese people) that runs through all the Fukushima look-backs, it is the predictability–the mountains of evidence that said Japan’s nuclear plants were vulnerable, and if nothing were done, a disaster (like the one we have today) should be expected.

I touched on this last week in my own anniversary examination, and now I see that Dawn Stover, contributing editor at The Bulletin of the Atomic Scientists, draws a similar comparison:

Although many politicians have characterized 3/11 and 9/11 as bizarre, near-impossible events that could not have been foreseen, in both cases there were clear but unheeded warnings. . . . In the case of 3/11, the nuclear plant’s operators ignored scientific studies showing that the risks of a tsunami had been dramatically underestimated. Japan’s “safety culture,” which asserted that accidents were impossible, prevented regulators from taking a hard look at whether emergency safety systems would function properly in a tsunami-caused station blackout.

Stover goes on to explain many points where the two nightmare narratives run parallel. She notes how while governments often restrict information, stating that they need to guard against mass panic, it is actually the officials who are revealed to be in disarray. By contrast, in both cases, first responders behaved rationally and professionally, putting themselves at great risk in attempts to save others.

In both cases, communication–or, rather, the terrible lack of it–between sectors of government and between officials and responders exacerbated the crisis and put more lives at risk.

And with both 9/11 and 3/11, the public’s trust in government was shaken. And that crisis of trust was made worse by officials obscuring the facts and covering their tracks to save their own reputations.

But with that last point, perhaps I am reading my own observations into hers rather than giving a straight retelling of Stover. Indeed, it is sad to note that Stover concludes her Fukushima think piece with a similar brand of CYA hogwash:

By focusing needed attention on threats to our existence, 3/11 and 9/11 have brought about some positive changes. The nuclear disaster in Japan has alerted nuclear regulators and operators around the world to the vulnerabilities of nuclear power plant cooling systems and will inevitably lead to better standards for safety and siting — and perhaps even lend a new urgency to the problem of spent fuel. Likewise, 9/11 resulted in new security measures and intelligence reforms that have thus far prevented another major terrorist attack in the United States and have created additional safeguards for nuclear materials.

When it comes to post-9/11 “security” and “intelligence reforms,” Stover is clearly out of her depth, and using the Bush-Cheney “no new attacks” fallacy frankly undermines the credibility of the entire essay. But I reference it here because it sets up a more important point.

If only Stover had taken a lesson from her own story. The Fukushima disaster has not alerted nuclear regulators and operators to vulnerabilities–as has been made clear here and in several of the post-Fukushima reports, those vulnerabilities were all well known, and known well in advance of 3/11/11.

But even if this were some great and grand revelation, some signal moment, some clarion call, what in the annals of nuclear power makes Stover or any other commentator think that call will be heard? “Inevitably lead to better standards”–inevitably? We’d all exit laughing if we weren’t running for our lives.

Look no further than the “coincidental” late-Friday, pre-anniversary news dump from the US Nuclear Regulatory Commission.

Late on March 9, 2012, two days before the earthquake and tsunami would be a year in the rear-view mirror, the NRC put on a big splashy show. . . uh, strike that. . . released a weirdly underplayed written announcement that the commission had approved a set of new rules drawing on lessons learned from the Fukushima crisis:

The Nuclear Regulatory Commission ordered major safety changes for U.S. nuclear power plants Friday. . . .

The orders require U.S. nuclear plants to install or improve venting systems to limit core damage in a serious accident and to install sophisticated equipment to monitor water levels in pools of spent nuclear fuel.

The plants also must improve protection of safety equipment installed after the 2001 terrorist attacks and make sure it can handle damage to multiple reactors at the same time.

Awwwrighty then, that sounds good, right? New rules, more safety, responsive to the Japanese disaster at last–but the timing instantly raised questions.

It didn’t take long to discover these were not the rules you were looking for.

First off, these are only some of the recommendations put before the commission by their Near-Term Task Force some ten months ago, and while better monitoring of water levels in spent fuel pools and plans to handle multiple disasters are good ideas, it has been noted that the focus on hardening the vents in Mark I and Mark II boiling water reactors actually misdiagnoses what really went wrong in two of the Fukushima Daiichi reactors.

It should also be noted that this represents less than half the recommendations in last summer’s report. Nor does it mandate a migration of spent fuel from pools to dry casks, an additional precaution not explicitly in the report but stressed by NRC chief Gregory Jaczko, as well as by many industry watchdogs.

But most important–and glaring–of all, the language under which these rules passed could mean that almost none of them will ever be enforced.

This is a little technical, so let me turn to one of the few members of Congress who actually spends time worrying about this, Rep. Ed Markey (D MA-7):

While I am encouraged that the Commission supports moving forward with three of the most straightforward and quickly-issued nuclear safety Orders recommended by their own expert staff, I am disappointed that several Commissioners once again have rejected the regulatory justification that they are necessary for the adequate protection of nuclear reactors in this country. . . .

After the terrorist attacks of September 11, 2001, the NRC determined that some nuclear security upgrades were required to be implemented for the “adequate protection” of all U.S. nuclear reactors. This meant that nuclear reactors would not be considered to be sufficiently secure without these new measures, and that an additional cost-benefit “backfit” analysis would not be required to justify their implementation. The “adequate protection” concept is derived from the Atomic Energy Act of 1954, and is reflected in NRC’s “Backfit Rule” which specifies that new regulations for existing nuclear reactors are not required to include this extra cost-benefit “backfit” analysis when the new regulations are “necessary to ensure that the facility provides adequate protection to the health and safety of the public.”

Both the NRC Fukushima Task Force and the NRC staff who reviewed the Task Force report concluded that the new post-Fukushima safety recommendations, including the Orders issued today, were also necessary for the “adequate protection” of existing U.S. nuclear power plants, and that additional cost-benefit analysis should not be required to justify their implementation.

While Chairman Jaczko’s vote re-affirmed his support of all the Near-Term Task Force’s recommendations, including the need to mandate them all on the basis that they are necessary for the adequate protection of all U.S. nuclear power plants, Commissioner Svinicki did not do so for any of the Orders, Commissioner Magwood did not do so for two of the three Orders, and Commissioners Apostolakis and Ostendorff rejected that basis for one of the three. As a result, the Order requiring technologies to monitor conditions in spent nuclear fuel pools during emergencies will proceed using a different regulatory basis. More importantly, the inability of the Commission to unanimously accept its own staff’s recommendations on these most straightforward safety measures presents an ominous signal of the manner in which the more complicated next sets of safety measures will be considered.

In other words, last Friday’s move was regulatory kabuki. By failing to use the strictest language for fuel pools, plant operators will be allowed to delay compliance for years, if not completely excuse themselves from it, based on the argument that the safety upgrade is too costly.

The other two rules are also on shaky ground, as it were. And even if by some miracle, the industry chose not to fight them, and the four uber-pro-nuclear commissioners didn’t throw up additional roadblocks, nothing is required of the nuclear facilities until December 31, 2016.

So, rather than it being a salutary moment, a tribute of sorts to the victims in Japan on the anniversary of their disaster, the announcement by the NRC stands more as an insult. It’s as if the US government is saying, “Sure, there are lessons to be learned here, but the profits of private energy conglomerates are more important than any citizen’s quaint notions of health and safety.”

As if any more examples were needed, these RINOs (rules in name only) demonstrate again that in America, as in Japan, the government is too close to the nuclear industry it is supposed to police.

And, for the bigger picture, as if any more examples were needed, be it before or after March 11, it really hasn’t been that hard to imagine the unimaginable. When an industry argues it has to forgo a margin of safety because of cost, there’s a good chance it was too dangerous and too expensive to begin with.

* * *

By way of contrast, take a look at some of the heartfelt expressions of commemoration and protest from New York’s Fukushima memorial and anti-nuclear rally, held last Sunday in Union Square Park.

Fukushima One Year On: Many Revelations, Few Surprises

11:30 am in Uncategorized by Gregg Levine

Satellite image of Fukushima Daiichi showing damage on 3/14/11. (photo: digitalglobe)

One year on, perhaps the most surprising thing about the Fukushima crisis is that nothing is really that surprising. Almost every problem encountered was at some point foreseen, almost everything that went wrong was previously discussed, and almost every system that failed was predicted to fail, sometimes decades earlier. Not all by one person, obviously, not all at one time or in one place, but if there is anything to be gleaned from sorting through the multiple reports now being released to commemorate the first anniversary of the Tohoku earthquake and tsunami–and the start of the crisis at Fukushima Daiichi–it is that, while there is much still to be learned, we already know what is to be done… because we knew it all before the disaster began.

This is not to say that any one person–any plant manager, nuclear worker, TEPCO executive, or government official–had all that knowledge on hand or had all the guaranteed right answers when each moment of decision arose. We know that because the various timelines and reconstructions now make it clear that several individual mistakes were made in the minutes, hours and days following the dual natural disasters. Instead, the analysis a year out teaches us that any honest examination of the history of nuclear power, and any responsible engagement with the numerous red flags and warnings, would have taken the Fukushima disasters (yes, plural) out of the realm of “if” and placed them squarely in the category of “when.”

Following closely on the release of findings by the Rebuild Japan Foundation and a report from the Union of Concerned Scientists (both discussed here in recent weeks), a new paper, “Fukushima in review: A complex disaster, a disastrous response,” written by two members of the Rebuild Japan Foundation for the Bulletin of the Atomic Scientists, provides a detailed and disturbing window on a long list of failures that exacerbated the problems at Japan’s crippled Fukushima Daiichi facility. Among the failures they cite are the misinterpretation of on-site observations, the lack of applicable protocols, inadequate industry guidelines, and the absence of both a definitive chain of command and the physical presence of the supposed commanders. But first and foremost, at the core of the crisis that has seen three reactor meltdowns, numerous explosions, radioactive contamination of land, air and sea, and the mass and perhaps permanent evacuation of tens of thousands of residents from a 20 kilometer exclusion zone, is what the Bulletin paper calls “The trap of the absolute safety myth”: Read the rest of this entry →