anyone know how to do that?
We have heard a range of opinions as to whether, with a modest $2,500 budget, you could buy and use equipment to decide whether or not to eat some type or source of food.
Scientists point to cryogenically cooled germanium detectors as the way to achieve sufficient sensitivity. Blocking out environmental gamma rays is no doubt important. There is no hope of making a high-precision, high-accuracy single-isotope analysis of your food. But how about a measurement real enough to know that the milk isn't too good for you this week?
As an engineer I have on many occasions made exquisitely informative, useful and ballpark-accurate measurements (leaving aside many many high-accuracy measurements too), with ridiculously crude equipment, laughable stuff sometimes. HOW you use cheapo equipment is, in my opinion, 99% of the value of the measurement approach. I've also seen people use a $500,000 piece of bench equipment, and without the correct approach, all they've done is prove that 2 = 2, that is to say, no meaningful measurement was made.
Obviously, you'd need sensitive equipment, i.e. 0.01 microSieverts per hour minimum, even if it has to operate at room temperature, and you'd have to construct a lead cask, use a type of reentrant beaker, have very long measurement times, and have some type of even crude reference sources or samples, to prove that you're accomplishing anything at all.
That said, precision, i.e. whether the answer is a 1 or a 2, is pretty irrelevant to choosing your food. You only really care whether it's 10 or 1 or 0.1.
I'm not a nuclear specialist. If you want to make some microwave measurements with clear knowledge of what your noise floor is and what it is you're actually measuring, or alter some microwave circuits using an X-Acto knife to eliminate VSWR, I'm your man. However, I can't claim practical experience in the nuclear area.
I wonder if there's anyone out there with super-practical experience with less than ideal lab equipment - i.e. someone who made useful measurements 30 years ago when the available equipment was crude, for example. Or just a super practical person with a strong nuclear background. That would nail the question, as to whether with inexpensive equipment, and low levels of contamination, you can make very simple but useful measurements. Anyone?
I have also made crude measurements as well, without expensive equipment... A university scientist is forced to do this more often than not. This is simply a signal-to-noise problem with some Poisson (counting) statistics mixed in. If you make a long enough measurement and you understand your background count rate to well below your source count rate, then you will see something. You may not know what the something is, but you will see differences. The problem is that there are "normal" differences in radiation emissions of water, milk, etc., from natural sources that are larger than the intensity we are measuring (e.g. K-40 differences). These are dubbed systematics, and when you have systematic variations larger than your source intensity, you need greater precision to determine which isotope is actually emitting. You may still be able to do this with a relatively cheap detector using a sodium-iodide based spectrometer and some spectral deconvolution schemes. However, you will fight calibration drift for long-time measurements, which could drown out your signal. You can get small silicon PIN detectors that may work - I have not seen any quotes for these, but the energy resolution is better.
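The "long enough measurement" point can be put in rough numbers. Here's a back-of-envelope sketch (the count rates are hypothetical placeholders, not measured values), assuming simple Poisson statistics and an equal-length background-only run:

```python
def counting_time_for_detection(source_cps, background_cps, sigma=3.0):
    """Rough counting time (seconds) needed for a weak source to stand out
    from the background at the given significance level.

    Net counts N = s*T with variance (s + 2b)*T (the sample run plus an
    equal-length background run), so we require
        s*T / sqrt((s + 2b)*T) >= sigma.
    """
    s, b = source_cps, background_cps
    return sigma**2 * (s + 2 * b) / s**2

# Hypothetical: 0.05 cps from the sample on top of 0.6 cps background.
t = counting_time_for_detection(0.05, 0.6)
print(f"about {t / 3600:.2f} hours of counting needed")
```

The inverse-square dependence on the source rate is the painful part: halving the source rate roughly quadruples the required counting time, which is why the long measurement times keep coming up.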
thank you!! for your excellent post. Your post is bringing things into focus - thanks to everyone else too. I'm the one who started this post. Your note about "crude measurements as well, without expensive equipment" is EXACTLY along the lines I was thinking of.
Your points about the systematic differences in sample radiation, background values, and typical values for the substance (e.g. milk, broccoli, water, fish) are indeed to the point - if we cannot quantify those differences, then the low levels of contamination may not yield us an indicative number (leaving aside precise numbers (many digits) or accurate numbers (digits centred on the true value)).
Use of a "sodium iodide based spectrometer and some spectral deconvolution schemes" is quite workable. I don't know where, whether and for how much such a detector might be available, especially when everyone is excited. Given spectral data collection (i.e. counts in each bucket of xx-amount electron-volts energy per photon vs. a reference), feeding that to any of many cheap available programs would do the necessary stats and deconvolution.
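To make the "spectral deconvolution" step concrete: if you have template spectra for the candidate isotopes (measured once with known sources), unmixing a sample spectrum is a least-squares fit. A toy sketch with made-up 5-channel templates (real spectra have hundreds of channels and need background subtraction first):

```python
import numpy as np

# Hypothetical 5-channel responses (counts per Bq) for two isotopes,
# as columns of the template matrix.
templates = np.array([
    [0.9, 0.0],
    [0.1, 0.0],
    [0.0, 0.2],
    [0.0, 0.8],
    [0.0, 0.1],
])  # rows = energy channels, columns = [I-131-like, Cs-137-like]

# Background-subtracted sample spectrum (counts per channel).
sample = np.array([4.5, 0.5, 0.6, 2.4, 0.3])

# Solve templates @ activities ~= sample in the least-squares sense.
activities, *_ = np.linalg.lstsq(templates, sample, rcond=None)
print("estimated activities (Bq):", [round(float(a), 2) for a in activities])
```

For this exact toy data the fit recovers 5 Bq of the first isotope and 3 Bq of the second, since the sample was built as that mixture.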
That said, where does this leave us? It means that a very handy person with some bits and bobs hanging around could experiment a bit, and might ... come up with meaningful indicative data. Anyone?
I am dealing with illness in the family, so I am a bit distracted at present. Any experimenters out there? Any BRAWM members who could point us in the right direction - or give us some engineering science (esp. with numbers attached) telling us the threshold (equipment, smarts and contamination) at which it will/won't work?
thanks to all who posted!
My brain always knows what it think it is typing. My fingers, on the other hand, seem to have a mind of their own.
Thank you for your comments and suggestions. As another has mentioned, it sounds like testing your own food will offer little peace of mind. If I am reading this correctly, handheld equipment wouldn't register any radiation, or if it did, the accuracy would be in the 10% - 15% area.
However, I have young children and am convinced that I can reduce their exposure. I believe (or want to believe) that even a reduction in exposure of 10% is better than nothing.
As such, I really do want to just run this down to its end. Let's say I had a $2,500 budget and would like to test milk, meat and produce (in the grocery store if possible). Is that impossible? Any suggestions?
I can't thank you enough for your time and insight. My children would thank you too if they were old enough to understand.
I decided to fill my cupboards and freezer with food produced before the incident/s. Frozen veggies, powdered milk, prepackaged rice, beans, frozen fruit from South America ... whatever the need. Right now I have enough food/water supply for one year. Two freezers were already full of organic wild game from last fall's hunts. I decided to be prepared. It's all we can really do.
thanks to those who posted, including the BRAWM posting!
Indeed true - that there are the questions of:
- how much, if any, radiation (a geiger counter, ideally gamma/beta/alpha, can give an indication)
- which isotopes - spectral devices as suggested could do this. I think street prices are higher now because of the tremendous demand
- measurement setup - to block out background and self-blocking, an attempt to mock up something like the BRAWM set would be needed
That said, as an engineer I'm often just looking for sub-slide-rule kind of accuracy - i.e. is it a 1, a 3, a 10, a 30 ... etc. Whether the iodine 131 is 1.1 vs. 1.2 Bq/litre is unnecessary for choosing your food.
So I figured - if I'm getting my food from the same places that Berkeley gets theirs from, it may be good enough to assume that the ratio of isotopes is about the same. From there all I need is a relative strength reading to just ratio off the BRAWM results. For relative overall strength of radiation, a geiger counter probably does the trick.
I figured I would still need a lead shielded can and a sort of reentrant beaker setup. If I could calibrate my setup vs. BRAWM and/or what the local university uses, would I be able to achieve sub-slide-rule accuracy - i.e. 0.3 vs. 1.0 vs. 3.0 Bq/litre? I have decades of experience estimating things and generally for a low level of absolute accuracy, this kind of approach works great. Any further suggestions?
Thanks again to those who posted!
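The "ratio off the BRAWM results" idea amounts to a one-line scaling. It assumes the isotope mix and the measurement geometry are the same for both samples, which is the whole gamble; the numbers below are hypothetical:

```python
def ratio_estimate(brawm_bq_per_litre, my_net_rate_ref, my_net_rate_sample):
    """Scale a published activity by the ratio of my own net count rates,
    assuming identical isotope mix and geometry for both measurements."""
    return brawm_bq_per_litre * my_net_rate_sample / my_net_rate_ref

# Hypothetical: BRAWM reports 1.14 Bq/litre for a milk supply; my rig sees
# 40 net counts/hour on that supply and 120 on a new sample.
print(ratio_estimate(1.14, 40, 120))  # roughly 3.4 Bq/litre
```

The weak link is the reference reading: if 40 counts/hour is barely above your background, the ratio inherits all of that uncertainty.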
Yes, we use Marinelli beakers in our lab (as described in the post below), but we do not use a Geiger counter; instead, we use a High Purity Germanium detector (HPGe) as described in our setup page.
The difference between the detectors is huge: not only is a HPGe able to find specific isotopes (like I-131 or Cs-137) that are distinct from the background radiation, but it is much more efficient than a Geiger counter, in that it can see much tinier amounts of radiation. You can certainly feel free to try using a Geiger counter to test your food and drink -- in fact, you may even find out that everything you consume is naturally radioactive -- but it will be very difficult for you to identify whether or not something contains fallout from Japan.
Tim [BRAWM Team Member]
As Mark of BRAWM has stated many times on this forum, the levels from Fukushima found in the USA are too low to be detected with Geiger counters.
Survey instruments such as Geiger counters are useful when the contamination is the dominant source of radiation. But if the contamination is buried in the noise of the natural radiation, the only way to discriminate between the two is to identify the isotopes specifically, i.e. do spectroscopy.
You can't do spectroscopy with a Geiger counter because it isn't sensitive to the energy of the radiation. You get the same signal whether you detect low energy radiation or high energy radiation. That's because the detector gas in the Geiger counter breaks down on even a low energy radiation detection. The Geiger counter is akin to a digital device that just counts a "1" when it sees any radiation, whereas the Germanium detectors that BRAWM uses give an analog signal, the magnitude of which is sensitive to the energy of the radiation being counted.
Because the Geiger counter's breakdown registers all the different isotopes equally, it destroys the information that you would otherwise get from using statistics. As Mark of BRAWM has stated previously:
Geiger counters are rather blunt instruments; they can detect radioactivity but they cannot tell you which isotope is responsible for it. One might detect radioactivity using one of these instruments, but there is plenty of benign natural radiation out there (e.g., where does the 38 CPM of the background test come from?). A Geiger counter would really only be useful for finding contamination in northeast Japan and nowhere else in the world.
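For what it's worth, the "counts a 1" vs. "analog energy signal" distinction can be illustrated with a toy simulation. The event energies and counts below are entirely made up; the point is only that a histogram by energy reveals a peak that a single total count cannot:

```python
import random

random.seed(0)

# Simulated photon energies (keV): a flat natural background plus a
# hypothetical Cs-137 line near 662 keV.
events = [random.uniform(50, 2000) for _ in range(2000)]
events += [random.gauss(662, 5) for _ in range(150)]

# A Geiger counter effectively sees only the total number of events:
print("GM-style total:", len(events))

# A spectrometer histograms by energy, so the 662 keV line stands out
# against a neighbouring window of the same width:
peak = sum(1 for e in events if 650 < e < 675)
nearby = sum(1 for e in events if 600 < e < 625)
print("662 keV window:", peak, "vs nearby window:", nearby)
```

The total count barely changes whether the line is there or not; the windowed comparison is what spectroscopy buys you.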
Thanks for the explanation - which confirms what I 'sort of thought' - concerning using a geiger counter vs. what you have in the lab to check food and other items.
I guess in using the crude hand-held geiger counter you would have to know what is acceptable and 'normal' radioactivity for the food item you are checking, but even then it doesn't eliminate the possibility that a radionuclide with a half life of more than 1 day is present.
-Off my menu: All seafood. I'm culinarily frustrated and will miss all kinds such as Anchovies on Pizza, Red Snapper, Crab, Flounder, Salmon, Fake crab (made with Pollock, an ocean fish), Abalone, Squid (used to make Calamari), Sea Bass, Shrimp, Seaweeds, Tuna etc...
Not an expert in the area - but I see the BRAWM team using "Marinelli beakers". Wonder if a person could check their own food ...
Marinelli beakers are "reentrant beakers" - e.g. a sample of liquid makes a donut-shape around the meter. Then the whole thing is put in a lead container (which can be enhanced with copper and polyethylene) to block out background radiation.
Here is a document on "how to":
The idea is to have the gamma radiation from e.g. a litre of milk be homogeneous around the meter, without the sample shielding itself from the meter too much.
The key is to calibrate the setup with a known source. Of course, you have to be able to estimate what isotopes you're measuring too. If there is no way to know that exactly, then using the data from very similar samples could give an order-of-magnitude measurement.
High precision down to 0.04 Bq/litre can be obtained, but nobody on these forums likely cares about levels that low. The latest BRAWM milk sample 4/21/2011 is 1.14Bq/litre, by comparison.
It does appear that with a known source and a bit of shop work, a person could make rough measurements, or at least comparisons, of food purchased for consumption.
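The known-source calibration amounts to a single-point efficiency measurement. A sketch with hypothetical numbers, assuming the check source and the sample sit in the same beaker-and-cave geometry and have a similar gamma energy mix:

```python
def activity_bq_per_litre(net_cps_sample, net_cps_cal, cal_activity_bq, litres):
    """Single-point efficiency calibration: counts-per-decay seen by the
    detector is measured with the known source, then applied to the sample."""
    efficiency = net_cps_cal / cal_activity_bq  # detected counts per decay
    return net_cps_sample / (efficiency * litres)

# Hypothetical: a 100 Bq check source gives 2.0 net cps in the beaker;
# one litre of milk in the same geometry gives 0.05 net cps.
print(activity_bq_per_litre(0.05, 2.0, 100.0, 1.0))  # roughly 2.5 Bq/litre
```

Any geometry or energy mismatch between the check source and the sample feeds straight into the result, which is why this only delivers order-of-magnitude numbers.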
Here are some images of Marinelli beakers, to get the idea. BRAWM uses them for water and milk.
If you didn't have these beakers, would a geiger counter show you anything? I noticed that in Japan shop keepers are using them directly on Veggies.
Geiger counters only tell you the number of disintegrations per second. When a radioactive isotope decays it gives off charged particles. These particles cause an electrical current to flow when they enter the geiger counter tube. These pulses of electricity are heard as clicks or are "counted" by a digital circuit.
A geiger counter can NOT tell you the energy (which is important) or the isotope that the particles came from. You would need a separate device for this.
Also keep in mind that SPECIAL (read: more expensive) geiger counters are required to test for alpha and low energy beta radiation.
If you want to know the ISOTOPE and energy level then you need a different, more expensive device. Here is list of isotope identifiers that are portable from one company: http://www.laurussystems.com/IsotopeID.htm
Keep in mind that these things are EXPENSIVE. The MicroRaider - Personal Detector looks to be the best for pocket carry, but it is $700. This device measures ONLY gamma energy.
If you want to be able to detect alpha and beta radiation and take measurements of that (Which is what I-131, Cs-137, etc. mainly emit), then you need another device. Examples of this type of device would be listed here:
The cost of a device like Digilert 100 is around $550.
So you're looking at around $1250 to be able to search for, detect and identify most radiation. Keep in mind that most of these meters will be less sensitive than what a University will have.
To test food you could use the Digilert 100 to detect if radiation is present. This could be done at the grocery store or at home. To test milk you would almost need to buy the milk, pour it in a pitcher, put your meter in a ziploc bag, insert the bag in the milk for a few minutes and then remove it and look at the log on the meter.
If radiation is found, then you would need to use an isotope identifier to determine which isotopes are contaminating the milk. The isotope would need to be present in sufficient quantity for it to emit enough gamma radiation so that the isotope identifier could determine which isotopes are present based on the energy level of their gamma emissions.
I probably just totally lost you, but....
I think if I was wanting to have ONE device I would buy the Digilert 100 to start with. I would at least know IF radiation was present. Whether it is alpha, beta or gamma. If I continued to find contamination, then I would probably want a simple isotope identifier to try and figure out what was doing the emitting.
Read this wiki to learn how a geiger counter works: http://en.wikipedia.org/wiki/Geiger_counter
Using low-cost spectrometers will be insufficient to detect the amount of radiation we are observing. We are currently using 10% and 50% relative efficiency high-purity germanium detectors cooled to cryogenic temperatures (~$50k-$100k apiece) and we count for at least 12 hours within a fully encapsulated 2" lead cave using Marinelli beakers. This is about as sensitive as you can get and our measurements are very close to detection limits. The minimum detectable activity for the detectors mentioned above, even with a good cave, will be orders of magnitude (factors of 10) greater than what we are achieving in our lab.
Our 50% detector is an HPGe Interchangeable Detector Module (IDM) on loan from Ortec
Thanks very much for your post - if you are representing BRAWM as I assume, BRAWM is using germanium detectors cooled to cryogenic temperatures, with detector setups costing $50k-$100k apiece.
It's slowly looking like none of us could get any useful indication on food with equipment costing less than $5,000.
For choosing one's own food to eat and to give to one's family, there is no need for more than 1/2 a significant digit of accuracy. Whether it's 8 vs. 9 Bq/kg makes no real difference in choosing your broccoli. So it's not about any accuracy beyond what gives an extremely basic indication.
However, if any equipment we could afford to buy wouldn't have enough sensitivity to register anything at all above the noise floor, then it's a lost cause for us. Does that sound about right?
I've developed a detector which can be used for checking water and food.
A measurement of water samples and a short description of the detector are on the net at:
Could you forward this email to someone who might be working on this subject in your region.
It's not about the significant digits here, it is about minimum detectable activity. One has to detect the emissions in the presence of natural gamma-ray backgrounds which are significant. There are two ways of doing this. First, encapsulate your sample in a lead cave to block natural gamma-rays from the outside. By the way, this already tells you something about your own natural gamma-ray exposure relative to these samples. The second, you need a detector that has extremely good energy resolution, hence the cryogenic HPGe detectors. Even if you have a decent cave, natural radioactivity from the lead itself and from components in your system will limit your detectability. Very simply it comes down to this: when you detect a gamma-ray you have to make a statistical determination that this event came from your sample and not from something else in the environment.
This problem I often characterize as trying to detect a firefly while looking directly into the sun. Clouds may be used to block the sunlight making it easier to detect the firefly, but in the end the amount of sunlight determines your ability to detect.
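The "statistical determination" is usually quantified with Currie's detection-limit formula, which turns a measured background into a minimum detectable activity (MDA). A sketch with hypothetical background and efficiency numbers:

```python
import math

def currie_mda_bq(background_counts, abs_efficiency, live_time_s):
    """Currie's detection limit: the smallest activity (Bq) that would
    reliably (95%/95%) stand out above the measured background."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detectable net counts
    return l_d / (abs_efficiency * live_time_s)

# Hypothetical: 5000 background counts over a 12-hour run in the cave,
# with 1% absolute counting efficiency.
print(currie_mda_bq(5000, 0.01, 12 * 3600))  # a bit under 1 Bq
```

Note how the MDA scales with the square root of the background counts: a better cave lowers the background, and hence the smallest activity you can claim to have seen.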
I noticed that in Japan they are using hand held units on the spot to check food. Is this because their levels of contamination are so much higher? Could you give a rough idea at about what level of radioactive Cs or I contamination a simple hand held meter would be useful.
I kind of chuckle every time I see video of some poor guy scanning fish with a GM counter. If they see a high reading from the counter, well, it could be from the fish, but it could be from so many other things as well. If they are trained to take any piece of food away from the current environment, re-scan, and verify that indeed the higher reading is due to that sample, then I would say that is better, but you still have a problem. The sensitivity of a GM counter is geared more to higher levels of contamination, so just because food passes this GM test by no means implies it is clean. I am hoping some of this food is being sent to labs such as ours to test for actual levels of contamination. False fences are normally more dangerous than no fences at all.
thanks for your useful details - and also for your sense of humour!
i.e. "I probably just totally lost you, but...."
I loved that. I got the same thing buying a new laptop recently. I have 30 years experience doing ISO and mil-spec projects and have been responsible for 10+ million lines of mission-critical code on pretty well every kind of commercial, custom and embedded hardware. Have a degree in Electrical Engineering and have designed spacecraft hardware. But in the computer store the young salesman looked at my grey hair (I'm in my 50's) and said "oh, maybe you should actually get an iPad instead. You know, laptops confuse older people."
i.e. he said "I probably just totally lost you, but...."
I looked at his eyes with a broad, silent grin, and he started backpedalling, "oh, I'm not saying you're old, I mean not THAT old, I mean ..."
Made me laugh. Thanks!