Confirmed: EPA Rigged RADNET Japan Nuclear Radiation Monitoring Equipment

The EPA re-calibrated (rigged) its Japan nuclear radiation monitoring equipment, causing it to report lower levels of radioactive fallout after the Fukushima nuclear meltdown than it had detected before the disaster.

I recently programmed an application to pull all of the EPA radiation monitoring graphs for all major US cities and compiled them into an easy-to-use web interface. Of course, we took the data being reported with a grain of salt, under the suspicion that the Feds were fiddling with the results.
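For anyone who wants to roll their own, here is a minimal sketch of that kind of scraper. The graph URL template and city list below are placeholders made up for illustration, not the actual EPA endpoints, so adjust them to whatever the live RadNet pages use:

# Minimal sketch: download per-city RadNet graph images and build a one-page index.
# NOTE: the URL template and city list are placeholders, not the real EPA endpoints.
import os
import requests

CITIES = ["Sacramento", "San Francisco", "Fresno", "Los Angeles"]  # example subset
GRAPH_URL = "http://example.epa.gov/radnet/graphs/{city}_beta.png"  # hypothetical

os.makedirs("graphs", exist_ok=True)
entries = []
for city in CITIES:
    url = GRAPH_URL.format(city=city.replace(" ", ""))
    resp = requests.get(url, timeout=30)
    if resp.status_code != 200:
        print("skipping %s: HTTP %d" % (city, resp.status_code))
        continue
    path = os.path.join("graphs", city.replace(" ", "_") + ".png")
    with open(path, "wb") as f:
        f.write(resp.content)
    entries.append('<h2>%s</h2><img src="%s">' % (city, path))

with open("index.html", "w") as f:
    f.write("<html><body><h1>EPA RadNet graphs</h1>%s</body></html>" % "".join(entries))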

Now, an investigative report looking into why much of the EPA radiation monitoring equipment was offline when the Fukushima nuclear meltdown occurred reveals that the EPA has in fact rigged radiation monitoring equipment to report lower values of radiation.

RadNet or SadNet? The EPA’s Failed Radiation Detection System

RadNet, the EPA’s front-line radiological detection network, is severely flawed and suffers from maintenance and reliability issues.

The lack of consistent data and the number of units offline (a techie term for broken) at the time they were most needed show that the EPA was not prepared for this emergency.

Besides the fact that the broken system left us all unprotected, the confusion, apprehension and fear witnessed as people try to wade through the incomplete and inaccurate data online is evidenced by an exchange on the UC Berkeley website over this RadNet graph:

EPA Historical Data Is Available Via The RadNet Query Interface

You can generate beta or gamma graphs similar to those shown on the Japanese Emergency web page via the RadNet Query Interface.

RadNet Query Interface

Select "Beta Gross Count Rate (CPM)" or any/all of the gamma ranges, set the "Measurement End Date/Time", choose the date range and the city you want to see, and click "Submit". You can generate graphs from the subsequent results page. The query is limited to 400 results at a time.
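If you would rather script the query than click through the form, something along these lines works against a form-style page. The endpoint and field names here are hypothetical stand-ins; inspect the actual RadNet Query Interface form to get the real ones:

# Sketch of a scripted RadNet-style query. The endpoint and parameter names
# are hypothetical stand-ins; inspect the real query form for the actual ones.
import requests

params = {
    "measurement": "Beta Gross Count Rate (CPM)",  # or any of the gamma ranges
    "city": "SACRAMENTO",
    "state": "CA",
    "end_datetime_from": "03/01/2011",  # "Measurement End Date/Time" range start
    "end_datetime_to": "03/17/2011",    # "Measurement End Date/Time" range end
}

resp = requests.get("http://example.epa.gov/radnet/query", params=params, timeout=60)
resp.raise_for_status()

# The query interface caps each request at 400 results, so break long date
# ranges into smaller chunks and combine the result pages yourself.
print(resp.text[:500])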

"Rigged", if being used in its most negative connotation, is a bit of an exageration. I see no proof that anything was rigged in that sense. If you think about it, why would anyone "rig" the results to show a very obvious drop? If I were to rig something, I'd keep the levels constant with previous reports. However, at the very least, the equipment was not maintained as well as it should have been. Which apparently required quick service to provide more accurate measurements.

I'm certainly no expert on how the EPA uses their monitor data, but I would hazard a guess (and that's all it is, a guess) that they use the monitors simply to detect upward trends in background levels. Any significant and sustained increase is investigated, which would include pulling air filters to analyze. *If* the monitors are used that way, the actual levels reported are less important than identifying the significant increase. I can't even begin to guess how well they normally monitor the data that they are getting.
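Just to make that guess concrete, the kind of trend check I have in mind could be as simple as comparing recent counts against a rolling baseline. The window sizes and the 3x threshold below are numbers made up for illustration, not anything the EPA has published:

# Sketch: flag a significant, sustained rise in gross beta counts (CPM)
# relative to a rolling baseline. The windows and 3x factor are illustrative
# guesses, not EPA procedure.
from statistics import mean

def sustained_increase(cpm_readings, baseline_window=144, recent_window=12, factor=3.0):
    """Return True if the recent average exceeds `factor` times the baseline average."""
    if len(cpm_readings) < baseline_window + recent_window:
        return False  # not enough history to compare against
    baseline = mean(cpm_readings[-(baseline_window + recent_window):-recent_window])
    recent = mean(cpm_readings[-recent_window:])
    return recent > factor * baseline

# Example: a flat background around 20 CPM, then a jump to around 70 CPM
history = [20] * 144 + [70] * 12
print(sustained_increase(history))  # True -> time to pull the air filter for analysis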

Lastly, while the Sacramento graph looks like something was done to the monitor post-Fukushima, the San Francisco graph shows no adjustments at that time.

I don't know if this link will work or not, but here's SF from 3/1-3/17/2011. If it doesn't work, you can generate it from the query page.

consistency

Why, after two years of very reasonable and serious evidence that Fukushima is and has been releasing radiation for 2.25 years, has the EPA not endeavored to not only fix all the problems (the "offline" issues and whatnot) but also upgrade the system? I mean, just to be sure... isn't that the point of the EPA's existence? Science in action? Or do we not question or answer if either might... offend?

Long discussion about this

Long discussion about this here. That thread actually inspired the investigative journalism by Webworker referenced in the article.

It's hard to say whether they did these recalibrations to hide the data or just due to poor management of resources. Either way, it's not good.

That's why I threw in the SF graph

It shows no such adjustments. And SF is close enough to Sacramento for me to believe that it was indeed a monitor calibration issue rather than them hiding anything.

I think you'd be wrong with

I think you'd be wrong with that logic.

I certainly can't explain the details, but Sacramento shows a much larger variation in radiation than SF, and it's probably similar to the reason why all the Bay Area pollution typically blows into the valley. The SF EPA beta graphs seem to always be low and aren't registering any of the spikes seen in all the other CA cities.

In fact, the Radiation Network SF monitor used to always be in the teens, occasionally in the low 20s. Only in the last couple of weeks has it been more elevated. This elevation is still not seen in the EPA beta graphs; I'm thinking it might be the particular place where the SF monitor is located.

There is a good reason why the Fresno monitor hasn't been fixed or put back online in these last several months. There was something going on there before this hit, and with the way everything settles in that part of the valley, it's surely best not to let that be public.

Now, with a new nuclear power plant being discussed for Fresno, it sure would be nice to have historical data available for that area.

If that's true why bother

If that's true, why bother with a Sacramento monitor at all? Sorry, that doesn't really make me feel any better.

Because there's nuclear fuel being stored there

While the plant has been shut down since 1989, there still is low-level radioactive waste and a dry-cask spent fuel storage facility at the Rancho Seco nuclear power plant site. I think it's a good idea to have a monitor in the area.

"rigged"

I won't go as far as to say "rigged", but I do know for a fact that they pull monitors offline when they think an odd or high reading is an anomaly, and this did occur post-Fukushima for a large portion of monitors. This has been covered here:

http://www.nuc.berkeley.edu/node/3586

http://blog.alexanderhiggins.
