
Fukushima Two Years Later: Many Questions, One Clear Answer

7:30 am in Uncategorized by Gregg Levine

Fukushima's threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the odds that were given for the 2011 tsunami–the flood at Oconee would trigger failures at all four reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.


Fukushima Plus Two: Still the Beginning?

4:25 am in Uncategorized by Gregg Levine

An IAEA inspector examines the remains of reactor 3 at Fukushima Daiichi (5/27/11)

I was up working in what were in my part of the world the early morning hours of March 11, 2011, when I heard over the radio that a massive earthquake had struck northeastern Japan. I turned on the TV just in time to see the earliest pictures of the tsunami that followed what became known as the Tohoku quake. The devastation was instantly apparent, and reports of high numbers of casualties seemed inevitable, but it wasn’t until a few hours later, when news of the destruction and loss of power at the Fukushima Daiichi nuclear plant hit the English-language airwaves, that I was gripped by a real sense of despair.

I was far from a nuclear expert at the time, but I knew enough to know that without intact cooling systems, or the power to keep them running, and with the added threat of a containment breach, some amount of environmental contamination was certain, and the potential for something truly terrifying was high.

What started as a weekend of watching newswires and live streams, virtually around the clock, and posting basic tech and health questions on email lists, expanded as the Fukushima crisis itself grew. Two years later, I have written tens of thousands of words, and read hundreds of thousands more. I have learned much, but I think I have only scratched the surface.

We all might be a little closer to understanding what happened in those first days and weeks after the earthquake, but what has happened since is still, sadly, a story much of which remains to be written. What the Daiichi plant workers really went through in those early days is just now coming to light, and the tales of intrigue and cover-up, of corruption and captured government, grow more complex and more sinister with each revelation. But what has happened to the environment, not just in the government-cordoned evacuation zone, but also throughout Japan, across the Pacific, and around the world, will likely prove the most chilling narrative.

Radiation levels in the quarantined parts of Japan are still far too high to permit any kind of human habitation, but exposure rates in areas far outside that radius are also well above what would have been considered acceptable before this disaster. And water, used to cool the molten cores and damaged spent fuel pools at Fukushima Daiichi, now dangerously radioactive itself, continues to leak into the ground and into the ocean at unprecedented rates.


Something Fishy: CRS Report Downplays Fukushima’s Effect on US Marine Environment

6:55 am in Uncategorized by Gregg Levine

(photo: JanneM via flickr)

Late Thursday, the United States Coast Guard reported that they had successfully scuttled the Ryou-Un Maru, the Japanese “Ghost Ship” that had drifted into US waters after being torn from its moorings by the tsunami that followed the Tohoku earthquake over a year ago. The 200-foot fishing trawler, which was reportedly headed for scrap before it was swept away, was seen as potentially dangerous as it drifted near busy shipping lanes.

Coincidentally, the “disappearing” of the Ghost Ship came during the same week the Congressional Research Service (CRS) released its report on the effects of the Fukushima Daiichi nuclear disaster on the US marine environment, and, frankly, the metaphor couldn’t be more perfect. The Ryou-Un Maru is now resting at the bottom of the ocean–literally nothing more to see there, thanks to a few rounds from a 25mm Coast Guard gun–and the CRS hopes to dispatch fears of the radioactive contamination of US waters and seafood with the same alacrity.

But while the Ghost Ship was not considered a major ecological threat (though it did go down with around 2,000 gallons of diesel fuel in its tanks), the US government acknowledges that this “good luck ship” (a rough translation of its name) is an early taste of the estimated 1.5 million tons of tsunami debris expected to hit North American shores over the next two or three years. Similarly, the CRS report (titled Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment [PDF]) adopts an overall tone of “no worries here–it’s all under control,” but a closer reading reveals hints of “more to come.”

Indeed, the report feels as if it were put through a political rinse cycle, limited both in the strength of its language and the scope of its investigation. This tension is evident right from the start–take, for example, these three paragraphs from the report’s executive summary:

Both ocean currents and atmospheric winds have the potential to transport radiation over and into marine waters under U.S. jurisdiction. It is unknown whether marine organisms that migrate through or near Japanese waters to locations where they might subsequently be harvested by U.S. fishermen (possibly some albacore tuna or salmon in the North Pacific) might have been exposed to radiation in or near Japanese waters, or might have consumed prey with accumulated radioactive contaminants.

High levels of radioactive iodine-131 (with a half-life of about 8 days), cesium-137 (with a half-life of about 30 years), and cesium-134 (with a half-life of about 2 years) were measured in seawater adjacent to the Fukushima Dai-ichi site after the March 2011 events. EPA rainfall monitors in California, Idaho, and Minnesota detected trace amounts of radioactive iodine, cesium, and tellurium consistent with the Japanese nuclear incident, at concentrations below any level of concern. It is uncertain how precipitation of radioactive elements from the atmosphere may have affected radiation levels in the marine environment.

Scientists have stated that radiation in the ocean very quickly becomes diluted and would not be a problem beyond the coast of Japan. The same is true of radiation carried by winds. Barring another unanticipated release, radioactive contaminants from Fukushima Dai-ichi should be sufficiently dispersed over time that they will not prove to be a serious health threat elsewhere, unless they bioaccumulate in migratory fish or find their way directly to another part of the world through food or other commercial products.

Winds and currents have “the potential” to transport radiation into US waters? Winds–quite measurably–already have, and computer models show that currents, over the next couple of years, most certainly will.

Are there concentrations of radioisotopes that are “below any level of concern”? No reputable scientist would make such a statement. And if monitors in the continental United States detected radioactive iodine, cesium and tellurium in March 2011, then why did they stop the monitoring (or at least stop reporting it) by June?

The third paragraph, however, wins the double-take prize. Radiation would not be a problem beyond the coast? Fish caught hundreds of miles away would beg to differ. “Barring another unanticipated release. . . ?” Over the now almost 13 months since the Fukushima crisis began, there have been a series of releases into the air and into the ocean–some planned, some perhaps unanticipated at the time–but overall, the pattern is clear: radioactivity continues to enter the environment at unprecedented levels.

And radioactive contaminants “should be sufficiently dispersed over time, unless they bioaccumulate?” Unless? Bioaccumulation is not some crazy, unobserved hypothesis; it is a documented biological process. Bioaccumulation will happen–it will happen in migratory fish and it will happen as under-policed food and commercial products (not to mention that pesky debris) make their way around the globe.
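And “dispersed over time” does nothing to hasten the decay of what has already been released. A minimal sketch of the arithmetic, using the half-lives quoted in the report’s own executive summary (the 13-month elapsed time is my approximation): the iodine-131 is indeed long gone, but nearly all of the cesium-137 is still with us.

```python
from math import exp, log

# Minimal decay sketch using the half-lives quoted in the CRS summary.
half_lives_days = {
    "iodine-131": 8,         # ~8 days
    "cesium-134": 2 * 365,   # ~2 years
    "cesium-137": 30 * 365,  # ~30 years
}

def fraction_remaining(t_half_days, elapsed_days):
    # Exponential decay: N(t)/N0 = exp(-ln(2) * t / t_half)
    return exp(-log(2) * elapsed_days / t_half_days)

elapsed = 13 * 30  # roughly the 13 months since March 2011, in days
for nuclide, t_half in half_lives_days.items():
    print(f"{nuclide}: {fraction_remaining(t_half, elapsed):.1%} remaining")
# iodine-131: ~0.0%; cesium-134: ~69%; cesium-137: ~98% remaining
```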

Maybe that is supposed to be read by inquiring minds as the report’s “please ignore the man behind the curtain” moment–an intellectual out clause disguised as an authoritative analgesic–but there is no escaping the intent. Though filled with caveats and counterfactuals, the report is clearly meant to serve as a sop to those alarmed by the spreading ecological catastrophe posed by the ongoing Fukushima disaster.

The devil is in the details–the dangers are in the data

Beyond the wiggle words, perhaps the most damning indictment of the CRS marine radiation report can be found in the footnotes–or, more pointedly, in the dates of the footnotes. Though this report was released over a year after the Tohoku earthquake and tsunami triggered the Fukushima nightmare, the CRS bases the preponderance of its findings on information generated during the disaster’s first month. In fact, of the document’s 29 footnotes, only a handful date from after May 2011–one of those points to a CNN report (authoritative!), one to a status update on the Fukushima reactor structures, one confirms the value of Japanese seafood imports, three are items tracking the tsunami debris, and one directs readers to a government page on FDA radiation screening, the pertinent part of which was last updated on March 28 of last year.

Most crucially, the parts of the CRS paper that downplay the amounts of radiation measured by domestic US sensors all cite data collected within the first few weeks of the crisis. The point about radioisotopes being “below any level of concern” comes from an EPA news release dated March 22, 2011–eleven days after the earthquake, only six days after the last reported reactor explosion, and well before so many radioactive releases into the air and ocean. It is like taking reports of only minor flooding from two hours after Hurricane Katrina passed over New Orleans, and using them as the standard for levee repair and gulf disaster planning (perhaps not the best example, as many have critiqued levee repairs for their failure to incorporate all the lessons learned from Katrina).

It now being April of 2012, much more information is available, and clearly any report that expects to be called serious should have included at least some of it.

By October of last year, scientists were already doubling their estimates of the radiation pushed into the atmosphere by the Daiichi reactors, and in early November, as reported here, France’s Institute for Radiological Protection and Nuclear Safety issued a report showing the amount of cesium-137 released into the ocean was 30 times greater than what was stated by TEPCO in May. Shockingly, the Congressional Research Service does not reference this report.

Or take the early March 2012 revelation that seaweed samples collected from off the coast of southern California show levels of radioactive iodine-131 500 percent higher than those from anywhere else in the US or Canada. It should be noted that this is the result of airborne fallout–the samples were taken in mid-to-late-March 2011, much too soon for water-borne contamination to have reached that area–and so serves to confirm models that showed a plume of radioactive fallout with the greatest contact in central and southern California. (Again, this specific report was released a month before the CRS report, but the data it uses were collected over a year ago.)

Then there are the food samples taken around Japan over the course of the last year showing freshwater and sea fish–some caught over 200 kilometers from Fukushima–with radiation levels topping 100 becquerels per kilogram (one topping 600 Bq/kg).

And the beat goes on

This information, and much similar to it, was all available before the CRS released its document, but the report also operates in a risibly artificial universe that assumes the situation at Fukushima Daiichi has basically stabilized. As a sampling of pretty much any week’s news will tell you, it has not. Take, for example, this week:

About 12 tons of water contaminated with radioactive strontium are feared to have leaked from the Fukushima No. 1 plant into the Pacific Ocean, Tepco said Thursday.

The leak occurred when a pipe broke off from a joint while the water was being filtered for cesium, Tokyo Electric Power Co. said.

The system doesn’t remove strontium, and most of the water apparently entered the sea via a drainage route, Tepco added.

The water contained 16.7 becquerels of cesium per cu. centimeter and tests are under way to determine how much strontium was in it, Tepco said.
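For a sense of scale, consider the cesium alone in that 12-ton leak. A back-of-envelope sketch (assuming a metric ton of water occupies roughly a million cubic centimeters; the strontium, remember, has not even been counted yet):

```python
# Back-of-envelope: total cesium activity in the reported 12-ton leak.
# Assumes 1 metric ton of water ~ 1 cubic meter ~ 1,000,000 cubic centimeters.
leaked_tons = 12
cesium_bq_per_cm3 = 16.7            # Tepco's reported concentration

total_bq = leaked_tons * 1_000_000 * cesium_bq_per_cm3
print(f"~{total_bq:.2e} Bq of cesium")  # ~2.00e+08 Bq, before counting strontium
```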

This is the second such leak in less than two weeks, and as Kazuhiko Kudo, a professor of nuclear engineering at Kyushu University who visited Fukushima Daiichi twice last year, noted:

There will be similar leaks until Tepco improves equipment. The site had plastic pipes to transfer radioactive water, which Tepco officials said are durable and for industrial use, but it’s not something normally used at nuclear plants. Tepco must replace it with metal equipment, such as steel.

(The plastic tubes–complete with the vinyl and duct tape patch–can be viewed here.)

And would that the good people at the Congressional Research Service had waited to read a report that came out the same day as theirs:

Radioactive material from the Fukushima nuclear disaster has been found in tiny sea creatures and ocean water some 186 miles (300 kilometers) off the coast of Japan, revealing the extent of the release and the direction pollutants might take in a future environmental disaster.

In some places, the researchers from Woods Hole Oceanographic Institution (WHOI) discovered cesium radiation hundreds to thousands of times higher than would be expected naturally, with ocean eddies and larger currents both guiding the “radioactive debris” and concentrating it.

Or would that the folks at CRS had looked to their fellow government agencies before they went off half-cocked. (The study above was done by researchers at Woods Hole and written up in the journal of the National Academy of Sciences.) In fact, it appears the CRS could have done that. In its report, CRS mentions that “Experts cite [Fukushima] as the largest recorded release of radiation to the ocean,” and the source for that point is a paper by Ken Buesseler–the same Ken Buesseler who was the oceanographer in charge of the WHOI study. Imagine what could have been if the Congressional Research Service had actually contacted the original researcher.

Can openers all around

Or perhaps it wouldn’t have mattered. For if there is one obvious takeaway from the CRS paper–one that, beyond its limits of scope and authority, seems meant to absolve all its other oversights–it is the paper’s unfailing confidence in government oversight.

Take a gander at the section under the bolded question “Are there implications for US seafood safety?”:

It does not appear that nuclear contamination of seafood will be a food safety problem for consumers in the United States. Among the main reasons are that:

  • damage from the disaster limited seafood production in the affected areas,
  • radioactive material would be diluted before reaching U.S. fishing grounds, and
  • seafood imports from Japan are being examined before entry into the United States.

According to the U.S. Food and Drug Administration (FDA), because of damage from the earthquake and tsunami to infrastructure, few if any food products are being exported from the affected region. For example, according to the National Federation of Fisheries Cooperative Associations, the region’s fishing industry has stopped landing and selling fish. Furthermore, a fishing ban has been enforced within a 2-kilometer radius around the damaged nuclear facility.

So, the Food and Drug Administration is relying on the word of an industry group and a Japanese government-enforced ban that encompasses a two-kilometer radius–what link of that chain is supposed to be reassuring?

Last things first: two kilometers? Well, perhaps the CRS should hire a few proofreaders. A search of the source materials finds that the ban is supposed to be 20 kilometers. Indeed, the Japanese government quarantined the land for a 20-kilometer radius. The US suggested evacuation from a 50-mile (80-kilometer) radius. The CRS’s own report notes contaminated fish were collected 30 kilometers from Fukushima. So why is even 20 kilometers suddenly a radius to brag about?

As for a damaged industry not exporting, numerous reports show the Japanese government stepping in to remedy that “problem.” From domestic PR campaigns encouraging the consumption of foodstuffs from Fukushima prefecture, to the Japanese companies selling food from the region to other countries at deep discounts, to the Japanese government setting up internet clearing houses to help move tainted products, all signs point to a power structure that sees exporting possibly radioactive goods as essential to its survival.

The point on dilution, of course, not only ignores the way many large-scale fishing operations work, it ignores airborne contamination and runs counter to the report’s own acknowledgment of bioaccumulation.

But maybe the shakiest assertion of all is that the US Food and Drug Administration will stop all contaminated imports at the water’s edge. Imports hardly represent the total picture when evaluating US seafood safety, but even taken as the small slice of the problem it covers, the claim raises eyebrows.

First there is the oft-referenced point from nuclear engineer Arnie Gundersen, who said last summer that State Department officials told him of a secret agreement between Japan and Secretary Hillary Clinton guaranteeing the continued importation of Japanese food. While independent confirmation of this pact is hard to come by, there is the plain fact that, beyond bans on milk, dairy products, fruits and vegetables from the Fukushima region issued in late March 2011, the US has proffered no other restrictions on Japanese food imports (and those few restrictions for Japanese food were lifted for US military commissaries in September).

And perhaps most damning, there was the statement from an FDA representative last April declaring that North Pacific seafood was so unlikely to be contaminated that “no sampling or monitoring of our fish is necessary.” The FDA said at the time that it would rely on the National Oceanic and Atmospheric Administration (NOAA) to tell it when it should consider testing seafood, but a NOAA spokesperson said it was the FDA’s call.

Good. Glad that’s been sorted out.

The Congressional Research Service report seems to fall victim to a problem noted often here–its authors assume a can opener. As per the joke, the writers stipulate a functioning mechanism before explaining their solution. Just as many nuclear industry-watchers assume a functioning regulatory process (as opposed to a captured Nuclear Regulatory Commission, an industry-friendly Department of Energy, and industry-purchased members of Congress) when speaking of the hypothetical safety of nuclear power, the CRS here assumes an FDA interested first and foremost in protecting the general public, instead of an agency trying to strike some awkward “balance” between health, profit and politics. The can opener story is a joke; the effects of this real-life example are not.

Garbage in, garbage out

The Congressional Research Service, a part of the Library of Congress, is intended to function as the research and analysis wing of the US Congress. It is supposed to be objective, it is supposed to be accurate, and it is supposed to be authoritative. America needs the CRS to be all of those things because the agency’s words are expected to inform federal legislation. When the CRS shirks its responsibility, shapes its words to fit comfortably into the conventional wisdom, or shaves off the sharp corners to curry political favor, the impact is more than academic.

When the CRS limits its scope to avoid inconvenient truths, it bears false witness to the most important events of our time. When the CRS pretends other government agencies are doing their jobs–despite documentable evidence to the contrary–it is not performing its own. And when the CRS issues a report that ignores the data and the science so that a few industries might profit, it is America that loses.

The authors of this particular report might not be around when the bulk of the cancers and defects tied to the radiation from Fukushima Daiichi present in the general population, but this paper’s integrity today could influence those numbers tomorrow. Bad, biased, or bowdlerized advice could scuttle meaningful efforts to make consequential policy.

If the policy analysts that sign their names to reports like this don’t want their work used for scrap paper, then maybe they should take a lesson from the Ryou-Un Maru. Going where the winds and currents take you makes you at best a curiosity, and more likely a nuisance–just so much flotsam and jetsam getting in the way of actual business. Works of note come with moral rudders, anchored to the best data available; without that, the report might as well just say “good luck.”

Looking Back at Our Nuclear Future

12:30 pm in Uncategorized by Gregg Levine

The Los Angeles Times heralds the nuclear age in January 1957. (photo via wikipedia)

On March 11, communities around the world commemorated the first year of the still-evolving Fukushima Daiichi nuclear disaster with rallies, marches, moments of silence, and numerous retrospective reports and essays (including one here). But 17 days later, another anniversary passed with much less fanfare.

It was in the early morning hours of March 28, 1979, that a chain of events at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania caused what is known as a “loss of coolant accident,” resulting in a partial core meltdown, a likely hydrogen explosion, the venting of some amount of radioisotopes into the air and the dumping of 40,000 gallons of radioactive waste water into the Susquehanna River. TMI (as it is sometimes abbreviated) is often called America’s worst commercial nuclear accident, and though the nuclear industry and its acolytes have worked long and hard to downplay any adverse health effects stemming from the mishap, the fact is that what happened in Pennsylvania 33 years ago changed the face and future of nuclear power.

The construction of new nuclear power facilities in the US was already in decline by the mid 1970s, but the Three Mile Island disaster essentially brought all new projects to a halt. There were no construction licenses granted to new nuclear plants from the time of TMI until February of this year, when the NRC gave a hasty go-ahead to two reactors slated for the Vogtle facility in Georgia. And though health and safety concerns certainly played a part in this informal moratorium, cost had at least an equal role. The construction of new plants proved more and more expensive, never coming in on time or on budget, and the cleanup of the damaged unit at Three Mile Island took 14 years and cost over $1 billion. Even with the Price-Anderson Act limiting the industry’s liability, nuclear power plants are considered such bad risks that no financing can be secured without federal loan guarantees.

In spite of that–or because of that–the nuclear industry has pushed steadily over the last three decades to wring every penny out of America’s aging reactors, pumping goodly amounts of their hefty profits into lobbying efforts and campaign contributions designed to capture regulators and elected officials and propagate the age-old myth of an energy source that is clean, safe, and, if not exactly “too cheap to meter,” at least impressively competitive with other options. The result is a fleet of over 100 reactors nearing the end of their design lives–many with documented dangers and potential pitfalls that could rival TMI–now seeking and regularly getting license extensions from the Nuclear Regulatory Commission while that same agency softens and delays requirements for safety upgrades.

And all of that cozy cooperation between government and big business goes on with the nuclear industry pushing the idea of a “nuclear renaissance.” In the wake of Fukushima, the industry has in fact increased its efforts, lobbying the US and British governments to downplay the disaster, and working with its mouthpieces in Congress and on the NRC to try to kill recommended new regulations and force out the slightly more safety-conscious NRC chair. And, just this month, the Nuclear Energy Institute, the chief nuclear trade group, moved to take their message to what might be considered a less friendly cohort, launching a splashy PR campaign by underwriting public radio broadcasts and buying time for a fun and funky 60-second animated ad on The Daily Show.

All of this is done with the kind of confidence that only comes from knowing you have the money to move political practice and, perhaps, public opinion. Three Mile Island is, to the industry, the exception that proves the rule–if not an out-and-out success. “No one died,” you will hear–environmental contamination and the latest surveys now showing increased rates of leukemia some 30 years later be damned–and, the line continues, TMI is the only major accident in over half a century of domestic nuclear power generation.

Of course, this is not even remotely true–names like Browns Ferry, Cooper, Millstone, Indian Point and Vermont Yankee come to mind–but even if you discount plant fires and tritium leaks, Three Mile Island is not even America’s only meltdown.

There is, of course, the 1966 accident at Michigan’s Enrico Fermi Nuclear Generating Station, chronicled in the John Grant Fuller book We Almost Lost Detroit, but atom-lovers will dismiss this because Fermi 1 was an experimental breeder reactor, so it is not technically a “commercial” nuclear accident.

But go back in time another seven years–a full 20 before TMI–and the annals of nuclear power contain the troubling tale of another partial meltdown, one that coincidentally is again in the news this week, almost 53 years later.

The Sodium Reactor Experiment

On July 12, 1957, the Sodium Reactor Experiment (SRE) at the Santa Susana Field Laboratory near Simi Valley, California, became the first US nuclear reactor to produce electricity for a commercial power grid. SRE was a sodium-cooled reactor designed by Atomics International, a division of North American Aviation, a company more often known by the name of its other subsidiary, Rocketdyne. Southern California Edison used the electricity generated by SRE to light the nearby town of Moorpark.

Sometime during July 1959–the exact date is still not entirely clear–a lubricant used to cool the seals on the pump system seeped into the primary coolant, broke down in the heat and formed a compound that clogged cooling channels. Because of either curiosity or ignorance, operators continued to run the SRE despite wide fluctuations in core temperature and generating capacity.

Following a pattern that is now all too familiar, increased temperatures caused increased pressure, necessitating what was even then called a “controlled venting” of radioactive vapor. How much radioactivity was released into the environment is cause for some debate, for, in 1959, there was less monitoring and even less transparency. Current reconstructions, however, suggest the release may have been as much as 450 times greater than what was vented at Three Mile Island.

When the reactor was finally shut down and the fuel rods were removed (which was a trick in itself, as some were stuck and others broke), it was found that over a quarter of them showed signs of melting.

The SRE was eventually repaired and restarted in 1960, running on and off for another four years. Decommissioning began in 1976, and was finished in 1981, but the story doesn’t end there. Not even close.

Fifty-three years after a partial nuclear meltdown at the Santa Susana Field Laboratory site in the Chatsworth Hills, the U.S. Environmental Protection Agency has just released data finding extensive radioactive contamination still remains at the accident site.

“This confirms what we were worried about,” said Assemblywoman Julia Brownley, D-Oak Park, a long-time leader in the fight for a complete and thorough cleanup of this former Rocketdyne rocket engine testing laboratory. “This begins to answer critical questions about what’s still up there, where, how much, and how bad?”

Well, it sort of begins to answer it.

New soil samples weigh in at up to 1,000 times the radiation trigger levels (RTLs) agreed to when the Department of Energy struck a cleanup deal with the California Department of Toxic Substances Control in 2010. What’s more, these measurements follow two previous cleanup efforts by the DOE and Boeing, the company that now owns Santa Susana.

In light of the new findings, Assemblywoman Brownley has called on the DOE to comply with the agreement and do a real and thorough cleanup of the site. That means taking radiation levels down to what are the established natural background readings for the area. But that, as is noted by local reporter Michael Collins, “may be easier said than done”:

This latest U.S. EPA information appears to redefine what cleaning up to background actually is. Publicly available documents show that the levels of radiation in this part of Area IV where the SRE once stood are actually many thousands of times more contaminated than previously thought.

Just as troubling, the EPA’s RTLs, which are supposed to mirror the extensively tested and reported-on backgrounds of the numerous radionuclides at the site, were many times over the background threshold values (BTVs). So instead of cleaning up to background, much more radiation would be left in the ground, saving the government and lab owner Boeing millions in cleanup.

It is a disturbing tale of what Collins calls a kind of environmental “bait and switch” (of which he provides even more detail in an earlier report), but after a year of documenting the mis- and malfeasance of the nuclear industry and its supposed regulators, it is, to us here, anyway, not a surprising one.

To the atom-enamored, it is as if facts have a half-life all their own. The pattern is always the same: swear that an event is no big deal, then come back with revision after revision, each admitting a little bit more, in a seemingly never-ending regression toward what might approximately describe a terrible reality. It would be reminiscent of the “mom’s on the roof” joke if anyone actually believed that nuclear operators and their chummy government minders ever intended to eventually relay the truth.

Fukushima’s latest surprise

Indeed, that unsettling pattern is again visible in the latest news from Japan. This week saw revelations that radiation inside Fukushima Daiichi’s reactor 2 containment vessel clocked in at levels seriously higher than previously thought, while water levels are seriously lower.

An endoscopic camera, thermometer, water gauge and dosimeter were inserted into the number 2 reactor containment, and they documented radiation levels of up to 70 sieverts per hour, which is not only seven times the previous highest measurement, but 10 times higher than what is called a fatal dose (roughly 7 Sv; at 70 Sv/hr, a lethal dose would accumulate in about six minutes).

The water level inside the containment vessel, estimated to be at 10 meters when the Japanese government declared a “cold shutdown” in December, turns out to be no more than 60 centimeters (about two feet).

This is disquieting news for many reasons. First, the high radiation not only makes it impossible for humans to get near the reactor, it makes current robotic technology impractical, as well. The camera, for instance, would only last 14 hours in those conditions. If the molten core is to be removed, a new class of radiation-resistant robots will have to be developed.
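The dose arithmetic behind both of those numbers is straightforward; here is a rough sketch (the camera’s cumulative tolerance of about 1,000 sieverts is an assumption on my part, though it is consistent with the reported 14-hour working life):

```python
field_sv_per_hr = 70.0              # reading inside the reactor 2 containment

# A short-term whole-body dose of roughly 7 Sv is generally considered fatal.
fatal_dose_sv = 7.0
print(f"fatal human dose in ~{fatal_dose_sv / field_sv_per_hr * 60:.0f} minutes")  # ~6

# Camera working life, assuming ~1,000 Sv cumulative tolerance -- an assumption,
# but one consistent with the reported 14-hour figure.
camera_tolerance_sv = 1000.0
print(f"camera lasts ~{camera_tolerance_sv / field_sv_per_hr:.0f} hours")  # ~14
```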

The extremely low water levels signal more troubling scenarios. Though some experts believe that the fuel rods have melted down or melted through to such an extent that two feet of water can keep them covered, it likely indicates a breach or breaches of the containment vessel. Plant workers, after all, have been pumping water into the reactor constantly for months now (why no one noticed that they kept having to add water to the system, or why no one cared, is plenty disturbing, as is the question of where all that extra water has gone).

Arnie Gundersen of nuclear engineering consultancy Fairewinds Associates believes that the level of water roughly corresponds with the lower lip of the vessel’s suppression pool–further evidence that reactor 2 suffered a hydrogen explosion, as did two other units at Fukushima. Gundersen also believes that the combination of heat, radioactivity and seawater likely degraded the seals on points where tubes and wires penetrated the structure–so even if there were no additional cracks from an explosion or the earthquake, the system is now almost certainly riddled with holes.

The holes pose a couple of problems: not only do they mean more contaminated water leaking into the environment, they also preclude filling the building with water to shield people and equipment from radiation. Combined with the elevated radiation readings, this will most certainly mean a considerably longer and more expensive cleanup.

And reactor 2 was considered the Fukushima unit in the best shape.

(Reactor 2 is also the unit that experienced a rapid rise in temperature and possible re-criticality in early February. TEPCO officials later attributed this finding to a faulty thermometer, but if one were skeptical of that explanation before, the new information about high radiation and low water levels should warrant a re-examination of February’s events.)

What does this all mean? Well, for Japan, it means injecting another $22 billion into Fukushima’s nominal owner, TEPCO–$12 billion just to stay solvent, and $10.2 billion to cover compensation for those injured or displaced by the nuclear crisis. That cash dump comes on top of the $18 billion already coughed up by the Japanese government, and is just a small down payment on what is estimated to be a $137 billion bailout of the power company.

It also means a further erosion of trust in an industry and a government already short on respect.

The same holds true in the US, where poor communication and misinformation left the residents of central Pennsylvania panicked and perturbed some 33 years ago, and the story is duplicated on varying scales almost weekly somewhere near one of America’s 104 aging and increasingly accident-prone nuclear reactors.

And, increasingly, residents and the state and local governments that represent them are saying “enough.” Whether it is the citizens and state officials from California’s Simi Valley demanding the real cleanup of a 53-year-old meltdown, or the people and legislature of Vermont facing off with the federal government on who has ultimate authority to assure that the next nuclear accident doesn’t happen in their backyard, Americans are looking at their future in the context of nuclear’s troubled past.

One year after Fukushima, 33 years after Three Mile Island, and 53 years after the Sodium Reactor Experiment, isn’t it time the US federal government did so, too?

Fukushima One Year On: Many Revelations, Few Surprises

11:30 am in Uncategorized by Gregg Levine

Satellite image of Fukushima Daiichi showing damage on 3/14/11. (photo: digitalglobe)

One year on, perhaps the most surprising thing about the Fukushima crisis is that nothing is really that surprising. Almost every problem encountered was at some point foreseen, almost everything that went wrong was previously discussed, and almost every system that failed was predicted to fail, sometimes decades earlier. Not all by one person, obviously, not all at one time or in one place, but if there is anything to be gleaned from sorting through the multiple reports now being released to commemorate the first anniversary of the Tohoku earthquake and tsunami–and the start of the crisis at Fukushima Daiichi–it is that, while there is much still to be learned, we already know what is to be done. . . because we knew it all before the disaster began.

This is not to say that any one person–any plant manager, nuclear worker, TEPCO executive, or government official–had all that knowledge on hand or had all the guaranteed right answers when each moment of decision arose. We know that because the various timelines and reconstructions now make it clear that several individual mistakes were made in the minutes, hours and days following the dual natural disasters. Instead, the analysis a year out teaches us that any honest examination of the history of nuclear power, and any responsible engagement of the numerous red flags and warnings, would have taken the Fukushima disasters (yes, plural) out of the realm of “if” and placed them squarely into the category of “when.”

Following closely the release of findings by the Rebuild Japan Foundation and a report from the Union of Concerned Scientists (both discussed here in recent weeks), a new paper, “Fukushima in review: A complex disaster, a disastrous response,” written by two members of the Rebuild Japan Foundation for the Bulletin of the Atomic Scientists, provides a detailed and disturbing window on a long list of failures that exacerbated the problems at Japan’s crippled Fukushima Daiichi facility. Among them, they include misinterpreting on-site observations, the lack of applicable protocols, inadequate industry guidelines, and the absence of both a definitive chain of command and the physical presence of the supposed commanders. But first and foremost, existing at the core of the crisis that has seen three reactor meltdowns, numerous explosions, radioactive contamination of land, air and sea, and the mass and perhaps permanent evacuation of tens of thousands of residents from a 20-kilometer exclusion zone, is what the Bulletin paper calls “the trap of the absolute safety myth.”

Aftershocking: Frontline’s Fukushima Doc a Lazy Apologia for the Nuclear Industry

8:30 am in Uncategorized by Gregg Levine

There is much to say about this week’s Frontline documentary, “Nuclear Aftershocks,” and some of it would even be good. For the casual follower of nuclear news in the ten months since an earthquake and tsunami triggered the massive and ongoing disaster at Japan’s Fukushima Daiichi nuclear power station, it is illuminating to see the wreckage that once was a trio of active nuclear reactors, and the devastation and desolation that has replaced town after town inside the 20-kilometer evacuation zone. And it is eye-opening to experience at ground level the inadequacy of the Indian Point nuclear plant evacuation plan. It is also helpful to learn that citizens in Japan and Germany have seen enough and are demanding their countries phase out nuclear energy.

But if you are only a casual observer of this particular segment of the news, then the Frontline broadcast also left you with a mountain of misinformation and a big bowlful of unquestioned bias.

Take, for example, Frontline correspondent Miles O’Brien’s cavalier treatment of the potential increase in Japanese cancer deaths, courtesy of the former property of the Tokyo Electric Power Company (TEPCO):

MILES O’BRIEN: When Japanese authorities set radiation levels for evacuation, they were conservative, 20 millisieverts per year. That’s the equivalent of two or three abdominal CAT scans in the same period. I asked Dr. Gen Suzuki about this.

[on camera] So at 20 millisieverts over the course of a long period of time, what is the increased cancer risk?

GEN SUZUKI, Radiation specialist, Nuclear Safety Comm.: Yeah, it’s 0.2— 0.2 percent increase in lifetime.

MILES O’BRIEN: [on camera] 0.2 percent over the course of a lifetime?

GEN SUZUKI: Yeah.

MILES O’BRIEN: So your normal risk of cancer in Japan is?

GEN SUZUKI: Is 30 percent.

MILES O’BRIEN: So what is the increased cancer rate?

GEN SUZUKI: 30.2 percent, so the increment is quite small.

MILES O’BRIEN: And yet the fear is quite high.

GEN SUZUKI: Yes, that’s true.
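For what it’s worth, the arithmetic behind Dr. Suzuki’s reassuring numbers is easy enough to reproduce, assuming the standard linear no-threshold (LNT) model’s risk coefficient of roughly five percent per sievert. The two-year exposure window below is my assumption, chosen because it matches his 0.2 percent figure; the broadcast never explains it, which is rather the point.

```python
# Reproducing the arithmetic in the exchange above, assuming the linear
# no-threshold (LNT) model's ~5% excess lifetime cancer risk per sievert.
# The two-year exposure window is an assumption chosen to match the quoted
# 0.2% figure; the broadcast never states the duration.
risk_per_sv = 0.05          # LNT coefficient, roughly 5% per sievert
annual_dose_sv = 0.020      # the 20 millisievert-per-year evacuation threshold
years = 2                   # assumed exposure duration

excess = risk_per_sv * annual_dose_sv * years
baseline = 0.30             # Suzuki's quoted lifetime cancer risk in Japan
print(f"excess: {excess:.1%}, lifetime total: {baseline + excess:.1%}")
# excess: 0.2%, lifetime total: 30.2%
```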

Gregory Jaczko Has a Cold

8:30 am in Uncategorized by Gregg Levine

NRC Chairman Gregory Jaczko (photo: pennstatelive)

In April 1966, Esquire Magazine published a story by Gay Talese that is still considered one of the greatest magazine articles of all time; the article, the cover story, was titled “Frank Sinatra Has a Cold.”

The piece, still very much worth the read, says much about celebrity, journalism, and, of course, celebrity journalism, but germane here is a point Talese makes early on: for most people, having a cold is a trivial matter–after all, it’s called the “common” cold–but when a man, a cultural icon, a giant of stage and screen like Sinatra (remember, this is 1966) has a cold, well. . . .

Frank Sinatra with a cold is a big deal. It affects him, his mood, his ability to perform, and so it affects his friends, his entourage, his personal staff of 75, his audience, and perhaps a part of the greater popular culture. In other words, as Talese wants you to understand, in this case, a cold is anything but trivial.

Gregory Jaczko, the chairman of the United States Nuclear Regulatory Commission, made some comments to the press earlier this week. Jaczko, it seems, is worried. He believes, as noted in an Associated Press story, that “U.S. nuclear plant operators have become complacent, just nine months after the nuclear disaster in Japan.” The NRC head thinks that a slew of events at over a dozen domestic nuclear facilities reveals the safety of America’s reactors to be something less than optimal.

To be clear, safety concerns at any kind of plant, be it a soda bottler or a microchip manufacturer, are probably not trivial, but when the safe and secure operation of a nuclear facility comes into question–as the aftermath of Chernobyl or the ongoing crisis in Japan will tell you–it ratchets up concern to a whole different level. So, when the man who more or less serves as the chief safety officer for the entirety of the nation’s nuclear infrastructure says he’s worried, many, many other people should be worried, too.

To put it another way, Greg Jaczko has a cold.

But that’s not the scariest part.

The Party Line – August 19, 2011: Japan Nuclear Crisis Continues, Highlighting More Potential Dangers in US

7:15 am in Uncategorized by Gregg Levine

Imagine, if you will, living somewhat close to a nuclear reactor—not right next door, but close enough—and then imagine that an accident at that reactor causes a large release of radioactive isotopes into the atmosphere. Certainly scary, but maybe less scary because you know your government has computer models that show where the nuclear fallout will blow and fall, and they explain that the amounts that will blow and fall on you are negligible.

Sure, you might think twice about that reassurance, but it is not like they are saying everything is OK. The government, after all, did evacuate some people based on their fallout models. . . so they are on top of it.

Then imagine that five months later, after you’ve breathed the air, drunk the water, and tramped dirt and snow in and around your home, the government reveals that even though they had the models, and even though they knew the amounts of radioactivity pouring into the atmosphere from the damaged nuclear plant, they never input the known amounts into the fallout model. When the government was reassuring people, it was doing so based on a minimum measurable number used to build the model, not the actual amounts then being released. So, now, you find that not only have you been living in a place well within a zone littered with hazardous fallout, but that many who were evacuated were moved directly into the path of that radioactive plume.
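The failure mode is easy to illustrate: in a linear dispersion model, the predicted dose scales directly with the source term, so feeding in a placeholder minimum instead of the measured release understates every downwind dose by the same enormous factor. A toy sketch (all numbers illustrative; no resemblance to the government’s actual modeling code is claimed):

```python
# Toy illustration of garbage in, garbage out for a fallout model: in a
# linear dispersion model, predicted dose scales directly with the source
# term, so a placeholder source yields a meaningless reassurance.
def predicted_dose_sv(source_bq: float, dilution_factor: float) -> float:
    # dose at a receptor ~ release size x site-specific dilution factor
    return source_bq * dilution_factor

dilution = 1e-19         # illustrative receptor-specific dilution factor
placeholder_bq = 1e10    # a model's built-in minimum source term (illustrative)
measured_bq = 1e17       # an actual large release (illustrative)

print(f"{predicted_dose_sv(placeholder_bq, dilution):.0e} Sv")  # 1e-09: "negligible"
print(f"{predicted_dose_sv(measured_bq, dilution):.0e} Sv")     # 1e-02: ten million times more
```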