You know what I am truly curious about? How dose rates (in µSv/hr, for instance) can actually be used, if at all, to gauge the dangers of deposited fallout, whose accurate description would be in activity (in becquerels or curies) per something, such as Bq/kg of soil, or Bq/m², or equivalents.
If you know the normal background dose rate and have the new, higher one, how do you calculate a reliable estimate of the activity? Insights welcome: in comments here, or on Facebook. Tx
Maybe this clarifies my question more. Ponder these three maps: a map of background radiation dose rate levels (well above the ground) today, June 6, 2011; a map of cesium-137 contamination in the immediate region surrounding the Fukushima Daiichi NPP at the end of April (well over a month ago, but Cs-137's half-life is about 30 years, so those measurements are basically still up-to-date, though possibly slightly lower than the most current data wherever there have been additional deposits); and a map of iodine-131 from the very end of March (over two months ago, so much of this has already diminished, since I-131's half-life is about 8 days). Look closely:
- Background dose rate:
Regarding the nuances added in blue in my blog post yesterday (HERE): notice that the reported dose rates above (which are averages!) are almost all BELOW 0.05 µSv/hr in the areas that received the least fallout, values that would be very low almost anywhere in the world. Just saying… the added nuances and correction were called for, but some of those making the suggestions weren't exactly nuanced or very scientific in their own claims either (that 0.2 µSv/hr was 'normal' in Tokyo before the accident happened). At least, I still find that a bit hard to believe, given the data from professional institutions that contradicts it.
- Cesium-137 deposits:
SOURCE: page 7 (of 8) of this PDF: http://www.mext.go.jp/component/english/__icsFiles/afieldfile/2011/05/10/1304797_0506.pdf
- Iodine-131 deposits (outdated, but still gives an idea that can be laid next to background readings from the same time, around March 23, 2011):
SOURCE: http://www.mext.go.jp/a_menu/saigaijohou/syousai/1305747.htm –> Look for the March 25, 2011 PDF. It's this one: 福島第１原子力発電所（特定条件 WSPEEDI）［平成23年3月25日（金曜日）］（PDF:594KB） ["Fukushima Daiichi Nuclear Power Station (specific conditions, WSPEEDI) [Friday, March 25, 2011] (PDF: 594 KB)"]
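Since the I-131 map above is over two months old, a quick back-of-the-envelope decay correction shows how little of the mapped iodine is left. A minimal sketch, assuming simple exponential decay and the commonly cited I-131 half-life of 8.02 days (the elapsed-days figure is my rough estimate for late March to early June):

```python
# Simple radioactive-decay correction: what fraction of the mapped
# I-131 activity remains after a given number of days.

T_HALF_I131_DAYS = 8.02  # I-131 half-life

def remaining_fraction(days_elapsed, t_half=T_HALF_I131_DAYS):
    """Fraction of the initial activity still present after days_elapsed."""
    return 0.5 ** (days_elapsed / t_half)

# Deposits mapped around March 23, read about 75 days later (early June):
print(remaining_fraction(75))  # well under 1% of the mapped I-131 remains
```

The same formula applied to Cs-137 (half-life ~30 years) over those 75 days gives a fraction above 0.99, which is why the April Cs-137 map is still essentially current while the March I-131 map is not.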
So my question is: if you know the normal background dose rate, how do you calculate a reliable estimate of the activity from an elevated background measurement? For example: background radiation well off the ground in Iitate Village, Fukushima Prefecture, is currently 2.91 µSv/hr, which is at least 2.5 µSv/hr above what was normal before the accident. Yet Cs-137 deposits in the same area were reported in early April to be about 3,700,000 Bq/m² (3.7 megabecquerels per square meter). (See report here:
http://www.japan.org/tags/cs-137 —> Dec 5, 2012 note: this is apparently one of many links that have been taken down since ;-/) Looking at these maps, I'm wondering: is it possible that very slight elevations in background dose rates could actually mean quite significant soil contamination? How could you know? Input welcome!!!
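One rough way to attack the inversion: subtract the pre-accident background from the measured dose rate, then divide by a dose-rate coefficient for the deposited nuclide. Published coefficients for Cs-137 (with its Ba-137m daughter) spread on an open plane, measured about 1 m above ground, are on the order of 1–2 (µSv/h) per (MBq/m²), and they drop substantially once the deposit weathers down into the topsoil. The two constants and the 0.41 µSv/hr background below are my own illustrative assumptions, not authoritative values:

```python
# Rough inversion: estimated Cs-137 surface activity from excess dose rate.
# ASSUMPTION: K_FRESH and K_WEATHERED are illustrative dose-rate
# coefficients, (uSv/h) per (MBq/m^2), for a plane source measured ~1 m
# above ground. Real values depend on detector height, terrain, and how
# deep the cesium has migrated into the soil.

K_FRESH = 2.0      # fresh deposit sitting on the surface (assumed)
K_WEATHERED = 0.7  # deposit mixed into the topsoil (assumed)

def excess_dose_rate(measured_usv_h, background_usv_h):
    """Dose rate attributable to fallout, uSv/h (never negative)."""
    return max(measured_usv_h - background_usv_h, 0.0)

def activity_mbq_m2(excess_usv_h, k=K_FRESH):
    """Very rough surface-activity estimate, MBq/m^2."""
    return excess_usv_h / k

# Iitate example from the post: 2.91 uSv/h measured, with an assumed
# pre-accident background of 0.41 uSv/h -> ~2.5 uSv/h excess.
excess = excess_dose_rate(2.91, 0.41)
print(activity_mbq_m2(excess, K_FRESH))      # roughly 1.3 MBq/m^2
print(activity_mbq_m2(excess, K_WEATHERED))  # roughly 3.6 MBq/m^2
```

Interestingly, with the weathered-deposit coefficient the estimate lands near the reported 3.7 MBq/m², which at least suggests the reported dose rate and the deposition map are not wildly inconsistent. It also answers the worry above in one direction: dividing by a coefficient of order 1 means even a few tenths of a µSv/hr of excess dose rate can correspond to hundreds of kBq/m² on the ground.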
Possible answers: [Dec 5, 2012 – No comments received so far.]