What you need to know about radiation (The non-tech version)

I recently posted a copy of my old radiation safety notes. Those were written for young physicists, so they aren’t exactly models of clarity. Since people asked, here’s a completely nontechnical summary of the key things to know about radiation.

Radiation consists of high-energy particles of all sorts. They can injure you by crashing into cells. If they hit DNA they can damage it, and if enough DNA gets damaged, some of that damage will slip past your body’s bad-DNA-removal systems and turn into cancer. If you get a very large dose of radiation, it can kill cells directly, especially in internal organs; this causes radiation sickness and is quickly (and unpleasantly) fatal.

Elements are the different types of atoms which exist in the world. Each atom has a nucleus, which is made of a mixture of protons and neutrons. The number of protons, a.k.a. the atomic number or simply “Z,” determines all of the chemical properties of the element, and so which element it is. (For example, Carbon has Z=6 and Uranium has Z=92.) The total number of protons and neutrons is called the mass number, or sometimes “A.” Isotopes are variants of the same element with different mass numbers. (“Uranium-238” is the isotope of Uranium with a total of 238 protons and neutrons.) Most combinations of protons and neutrons aren’t perfectly stable; over time, they tend to fall apart (“decay”) and emit small parts of themselves, i.e. radiation. Substances like these which naturally emit radiation are radioactive materials. The more unstable a nucleus is, the faster it decays; the time it takes for half of the nuclei in a sample to decay is called the half-life of the nucleus. Half-lives range from milliseconds to billions of years. Radioactive materials with short half-lives are intensely radioactive for a short while, but then stop quickly; radioactive materials with long half-lives stay moderately radioactive for a very long time.
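
To make the half-life idea concrete, here’s a minimal sketch (in Python; the isotopes and numbers are purely illustrative) of how the surviving fraction of a sample falls off:

```python
def surviving_fraction(t, half_life):
    """Fraction of the original nuclei still undecayed after time t.

    t and half_life just need to be in the same units.
    """
    return 0.5 ** (t / half_life)

# Illustrative comparison: a short-lived vs. a long-lived isotope (times in years).
for years in (1, 10, 100):
    print(years, surviving_fraction(years, 0.1), surviving_fraction(years, 1000))
```

A half-life of 0.1 years means the sample is essentially gone within a year; a half-life of 1000 years means it’s still about 93% intact after a century, quietly radiating the whole time.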

Because chemistry depends only on Z, different isotopes of the same element mix thoroughly and can’t be separated by chemical means. This means that if you need a lot of one particular isotope, you have to do something complicated and mechanical. It also ties in to one of the main dangers of radioactives, below. “Low-Z” materials are those with a Z up to about 20; this includes most organic materials, plastics, and so on. These materials are relatively transparent to radiation, which is why airport backscatter X-rays have trouble seeing explosives. “High-Z” materials include most metals (except Aluminum!), and absorb radiation well.

There are four common types of radiation. Alpha particles are Helium nuclei (two protons and two neutrons). They can cause a lot of damage, because they’re relatively heavy and, being made of nuclear matter, interact strongly with other nuclear matter. However, they’re relatively easy to block; even paper or clothing offers considerable protection. They can be very dangerous if ingested, though. Beta particles are electrons and positrons; they penetrate more deeply than alphas, though they do less damage. Gamma rays are simply high-energy light (part of a wider category including UV light and X-rays) and are the most penetrating. Betas and gammas are most effectively blocked with a thick layer of high-Z material such as lead, or several feet of concrete. Neutrons are emitted by the decay of certain very large radioactive nuclei, including those used in nuclear power. They can cause as much damage as alphas, but because they have no electric charge they are much more penetrating. Also, high-Z nuclei tend to absorb neutrons into themselves. This makes them good at shielding neutrons, but in the process they tend to turn into other, often radioactive, isotopes; metals used in neutron-rich environments like reactor cores therefore become radioactive over time.
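
For gammas in particular, shielding follows a simple exponential law: each centimeter of material blocks the same fraction of whatever reaches it. Here’s a rough sketch; the attenuation coefficients are ballpark values for roughly 1 MeV gammas, from memory rather than a reference table:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Exponential attenuation: fraction of gamma rays that pass through a shield."""
    return math.exp(-mu_per_cm * thickness_cm)

# Ballpark linear attenuation coefficients for ~1 MeV gammas (illustrative only):
print("5 cm of lead:     ", transmitted_fraction(0.8, 5))    # ~2% gets through
print("30 cm of concrete:", transmitted_fraction(0.15, 30))  # ~1% gets through
```

Doubling the shield squares the transmitted fraction, which is why shielding gets quoted in “halving thicknesses.”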

The common units for measuring radiation doses are the Sievert and the rem.* This chart can give you a notion of some effects of radiation; section 5.6.3 of The Nuclear Weapons FAQ goes into more detail and talks about acute doses.

Danger levels: For a normal adult, up to 5 rems per year total (up to 1.5 for a pregnant woman) is considered safe and should cause no adverse effects. Doses up to 50 rems cause increasing cancer risk; as a rule of thumb, each 0.01 rems you get is about one “micromort” (a one-in-a-million increase in your chance of dying, roughly equivalent to driving 230 miles). Doses above 50 rems can cause acute radiation sickness. Children are more vulnerable because they’re smaller, so a given exposure means more radiation per unit of body mass.
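
A quick sketch of that rule-of-thumb bookkeeping (the conversion factors are just the ones quoted above; this is back-of-the-envelope risk accounting, not medical advice):

```python
def micromorts(dose_rem):
    """Rule of thumb from above: each 0.01 rem is about one micromort."""
    return dose_rem / 0.01

def driving_equivalent_miles(dose_rem):
    """Express a dose as the quoted ~230 miles of driving per micromort."""
    return micromorts(dose_rem) * 230

dose = 0.3  # roughly a year of natural background radiation, in rems
print(micromorts(dose), "micromorts ~", driving_equivalent_miles(dose), "miles of driving")
```

So a year of ordinary background radiation carries about the same statistical risk as driving seven thousand miles.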

Normal levels: By simply walking around and existing, you pick up about 0.3 rems per year from natural radiation in rocks, from space, and so on. A chest X-ray is good for about 0.01 rems; an airplane flight, about 0.04 rems; a mammogram, about 0.3 rems. The most radioactive fruit is the banana, which is rich in Potassium. (The radioactive isotope Potassium-40 is about 0.012% of all Potassium in nature.) Eating a banana is good for about 0.01 mrem, or one Banana Equivalent Dose. You can learn how to protect yourself from them here.

Nuclear fission is a reaction in which a large, unstable nucleus breaks into two smaller nuclei, generally throwing off some spare neutrons (and a lot of excess energy) in the process. Only a few isotopes do this. If one of these isotopes is hit hard by a neutron (which is heavy but electrically neutral, so it can get in close to the nucleus) it can be enough to break that nucleus, causing it to fission and emit more neutrons. This can cause a chain reaction: if enough of those neutrons hit other nuclei, rather than being absorbed or escaping, the total number of fissions keeps increasing. An assembly of fissile material which can do this is a critical mass.** Nuclear reactors work by having a large pile of fissile material, generally Uranium enriched to 3–5% Uranium-235, which is just at the level of criticality but no more. Mixed in with the fuel are control rods, rods of some neutron-absorbing substance like Cadmium or Boron, which can be moved in and out to control the speed of the reaction. Together these form the reactor core, which gets very hot from all this energy release; that heat is used to heat water or some other fluid, and ultimately to make steam which turns a turbine, like in any other power plant.
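
A toy sketch of why criticality is such a knife’s edge: call k the average number of new fissions caused by the neutrons from one fission. The neutron population then changes geometrically from generation to generation. (The numbers below are made up for illustration; real reactor kinetics is far subtler.)

```python
def neutron_population(k, generations, n0=1000):
    """Toy model: average neutron count after each fission generation,
    given multiplication factor k."""
    counts = [n0]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

print(neutron_population(0.98, 10))  # subcritical: the reaction dies away
print(neutron_population(1.00, 10))  # critical: steady state, what a reactor wants
print(neutron_population(1.05, 10))  # supercritical: runaway growth
```

Control rods work by soaking up neutrons, nudging k to exactly 1 (or just below it to shut things down).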

Nuclear fusion is a reaction in which two small nuclei are rammed together to form a slightly larger nucleus. For certain combinations of nuclei, this creates a net release of energy. Fusion releases more energy per unit of fuel mass than fission, and its reaction products are nonradioactive.*** This is great for clean energy, but it’s much harder from an engineering perspective.****

At the Fukushima-Daiichi power plant, seismometers detected the earthquake and immediately shut down the reactors by driving the control rods fully in. However, even with the control rods fully in and the chain reaction stopped, the residual radioactivity of the core generates enough heat that it needs constant cooling. The earthquake knocked out the power to the cooling system, and the tsunami destroyed the backup systems. Plant operators moved on to the next level of backup cooling methods, which involve pumping seawater directly into the hot bits. While the cores of several reactors have melted, the core containment units have held, and so none of the core materials have been released. Another problem at the plant was that pressure was building up in the cooling system, because what water was there initially was boiling (and also being decomposed into Hydrogen and Oxygen – water does that at high enough temperatures) and the automatic venting system had lost power as well. This was dealt with by manually venting the steam. Some of this steam was radioactive, and this is where the radiation releases happened. A third problem is the spent fuel rods, which were in temporary (cooled) storage in the reactor buildings as well; their cooling water was not being replenished, and humans have had to go in and assist. All of this has been complicated by problems such as “the building is on fire” and “if hydrogen builds up in an area and isn’t vented quickly enough, that part of the building will explode.” Heroic work by the reactor crew has so far minimized the effects.
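
To see why cooling matters long after shutdown, here’s a sketch using the classic Way–Wigner rule of thumb for decay heat. The 0.066 coefficient is the commonly quoted approximation; treat the outputs as order-of-magnitude estimates only.

```python
def decay_heat_fraction(t_after_shutdown_s, t_operating_s):
    """Way-Wigner approximation: decay heat as a fraction of full reactor power,
    t_after_shutdown_s seconds after shutdown, following t_operating_s seconds
    of steady operation."""
    return 0.066 * (t_after_shutdown_s ** -0.2
                    - (t_after_shutdown_s + t_operating_s) ** -0.2)

year = 365 * 24 * 3600
for label, t in [("10 seconds", 10), ("1 hour", 3600),
                 ("1 day", 86400), ("1 week", 7 * 86400)]:
    print(label, decay_heat_fraction(t, year))
```

Even a day after shutdown this gives roughly half a percent of full power; for a core that ran at a few gigawatts (thermal), that’s still tens of megawatts of heat which has to go somewhere.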

At Chernobyl, on the other hand, thanks to a combination of poor design, poor maintenance, and poor operations, the core temperature went out of control; the reactor exploded and its graphite moderator caught fire. This destroyed the core’s containment, and spewed bits of the core itself into the air and the environment. The worst radiation doses in the Fukushima area have been in the ballpark of 0.3 rems per day, for about two days; the worst radiation doses in the Chernobyl area were several thousand rems.

The difference between the two has to do with the amount of stuff emitted, and with its chemistry. The radioactive steam from F-D got its radioactivity by being ordinary water bombarded by neutrons from the reactor itself. Low-Z materials tend to transform into relatively safe isotopes with short half-lives. For example, Oxygen in nature is 99.7% O-16, which a neutron turns into O-17 (stable); 0.04% O-17, which turns into O-18 (also stable); and 0.2% O-18, which turns into O-19, a beta source with a half-life of 27 seconds that decays into Fluorine-19 (which isn’t healthy, but isn’t radioactive). [You can look this stuff up here] So if you irradiate the oxygen in water, you get a very small amount of a source which decays quickly. The contents of the core, on the other hand, include such exciting isotopes as U-238, Cs-137, and Sr-90. Anything which spews those into the air is likely to be Very Bad.
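
As a sketch of why irradiated water is a comparatively mild source: only the rare O-18 fraction becomes anything radioactive at all, and what it becomes dies away within minutes. (The model below is a caricature which assumes every atom captures exactly one neutron, wildly overstating real capture rates; it’s only meant to show the scales involved.)

```python
O18_ABUNDANCE = 0.002    # fraction of natural oxygen that is O-18 (becomes O-19)
O19_HALF_LIFE_S = 27.0   # half-life of O-19, in seconds

def radioactive_fraction(t_seconds):
    """Fraction of the irradiated oxygen that is still-radioactive O-19 after t."""
    return O18_ABUNDANCE * 0.5 ** (t_seconds / O19_HALF_LIFE_S)

for t in (0, 60, 300):
    print(t, "seconds:", radioactive_fraction(t))
```

Five minutes out, less than one atom in a million is still radioactive. Compare that with Cs-137 and Sr-90, whose roughly 30-year half-lives keep them dangerous for human lifetimes.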

Strontium-90 is a good example of a dangerous isotope. It’s a beta source with a half-life of 28.9 years. More importantly, Strontium is chemically very similar to Calcium; if you get it inside your body, your body will try to build bones out of it. Radioactive bones, which come with little radiation sources shining straight into your bone marrow. Cesium-137 behaves chemically like Sodium and Potassium, though it doesn’t mimic them as closely as Strontium mimics Calcium; Cs-137 is commonly used as a source for cancer radiotherapy. Plutonium-239, used in nuclear weapons, is one of the worst: it’s a powerful alpha emitter; the body mistakes it for Iron, binding it to iron-transport proteins which then carry it into your bone marrow; and when exposed to air, it forms a pyrophoric oxide which can ignite spontaneously and is very hard to extinguish (think Magnesium fires).

So to summarize: Radiation is made of energetic particles which harm you by ramming into cells and damaging them; radioactives are materials which naturally emit radiation. Your body can heal radiation damage, and many kinds of radiation can be blocked, but ingested radioactives are particularly dangerous. Some radioactives look like elements your body uses, and so your body will take them up particularly thoroughly. Others (like those in the steam) are fairly inert, and become safe once their radioactivity dies away. If you hear about a radiation dose, remember that anything up to a rem per year (10 mSv per year) is not cause for panic.


* The roentgen is a unit of radiation exposure. The rad measures how much radiation you absorb; one rad is 0.01 Joules of energy deposited per kilogram of your body mass. (More body = more ability to absorb things.) The rem (short for Roentgen Equivalent Man) multiplies rads by a factor which takes into account the different damage done by different kinds of radiation: betas and gammas count as 1, alphas and neutrons (depending on their energy) anything from 1 to 20. The Sievert is 100 rems, though the modern Sievert-based system uses a slightly different weighting table. You’ll commonly see milliSieverts (1 mSv = 0.1 rem) and microSieverts (1 μSv = 0.0001 rem).
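
A sketch of the unit bookkeeping (the quality factors are the rough ranges quoted above, not values from a dosimetry reference):

```python
QUALITY_FACTOR = {"beta": 1, "gamma": 1, "alpha": 20, "neutron": 10}  # neutrons: 1-20 by energy

def rads_to_rems(dose_rad, radiation_type):
    """Absorbed dose (rad) -> dose equivalent (rem), weighted by radiation type."""
    return dose_rad * QUALITY_FACTOR[radiation_type]

def rems_to_sieverts(dose_rem):
    """1 Sv = 100 rem."""
    return dose_rem / 100.0

print(rads_to_rems(0.5, "alpha"))           # 0.5 rad of alphas ~ 10 rems of damage
print(rems_to_sieverts(0.3) * 1000, "mSv")  # a year of background, ~3 mSv
```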

** There are actually two different kinds of critical reaction, prompt critical and delayed critical. The former is used in nuclear bombs, the latter in nuclear power. Achieving prompt criticality isn’t something you can do by accident; nuclear reactors can never undergo the sort of reaction that triggers a nuclear bomb.

*** The stability of a nucleus depends on the difference between the energy of the nucleus as a whole and the energy it would have if broken up into pieces. If this difference is negative, the nucleus is stable; if it’s positive, it’s unstable, and the bigger the difference, the more unstable the nucleus. (Which means a shorter half-life.) Energy-producing reactions work by making something more stable and harvesting the energy difference. Fission works by taking a big, unstable nucleus and kicking it so that it breaks into two smaller pieces; the energy difference is released as kinetic energy. But the smaller pieces aren’t guaranteed to be stable, just “more stable than what you started with.” Fusion works by taking two nuclei and combining them into something more stable than the originals; once again there’s an energy difference, but this time the products you typically end up with (like ordinary Helium) are stable. There’s no “chain reaction” for fusion; you need some way to keep the particles fusing, even though they electrically repel one another, and keeping that pressure up is the hard part. Some techniques are crushing the matter with a nuclear bomb (useful for H-bombs), compressing it with lasers, compressing it with giant magnetic fields, or crushing it with gravity. (Which is how the Sun works.)
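
For concreteness, here’s the mass-difference bookkeeping for the classic Deuterium–Tritium reaction, D + T → He-4 + n. (The atomic masses below are the standard tabulated values; any transcription error is mine.)

```python
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, in MeV

MASS_U = {          # atomic masses, in u
    "D":   2.014102,
    "T":   3.016049,
    "He4": 4.002602,
    "n":   1.008665,
}

# Energy released = (mass going in) - (mass coming out), converted to MeV.
q_value = (MASS_U["D"] + MASS_U["T"] - MASS_U["He4"] - MASS_U["n"]) * U_TO_MEV
print(f"D-T fusion releases about {q_value:.1f} MeV per reaction")  # ~17.6 MeV
```

About 0.4% of the fuel’s mass comes out as energy, which is why fusion fuel is so absurdly compact.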

**** Practical fusion energy is about 20 years away, and it has been for the past 50 years.

Published March 24, 2011

The Hazards (and not) of Radiation

Since everyone has radiation on their minds right now thanks to the situation in Japan, and since XKCD was nice enough to post this handy chart, I thought it might be useful to put up a link to my old “basics of radiation” guide from teaching radiation labs to undergrads many years ago.

(PDF Link: Handy Facts About Radiation)

A big thing this doc tries to do is explain the difference between things which will and won’t kill you. XKCD’s diagram is good at showing you the relative dangers of different levels of “primary radiation” — things like gamma rays and neutrons hitting you directly. What it doesn’t cover is the relative danger of radioactive substances, which mostly harm you by getting inside your body and continuing to emit radiation at close range.

For reference, so far the Fukushima-Daiichi plant has mostly emitted steam carrying radioactive noble gases, which is great from a safety perspective — those decay quickly and bond chemically with nothing, so they’re probably the safest radioactive leak you could ever get. XKCD’s diagram is appropriate for those. If the core were breached and sprayed out some of the heavier materials inside, like Cesium or Uranium, the results would be much nastier.

Also, a footnote: This doc works in units of rems, while a lot of the recent news reports have used Sieverts. 1 Sv = 100 rems.

Published March 22, 2011

Multinomial derangements

A few days ago, Amy asked me an interesting question. Say you have a deck of N cards, n1 of one type, n2 of the second type, and so on, and you shuffle them all together. Then you predict the order in which you will see all of the cards, and flip them over one-by-one. How many will you guess right, on average?

This problem turned out to be trickier than I thought it would be, and since it doesn’t show up in random internet searches, I figured it would be good to work through it here. It’s related to the problem of counting derangements (permutations of N elements with no fixed point), which was solved by Bernoulli in the early 1700s; but the version with groups of identical objects turns out to be slightly messier. In the same way that multinomials generalize binomials, these “multinomial derangements” generalize ordinary derangements.
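
Without spoiling the worked solution: if you fix your predicted sequence in advance (no updating as cards are revealed), linearity of expectation gives E = (n1² + n2² + …)/N. Here’s a quick Monte Carlo sketch checking that formula; note it covers only the fixed-prediction version of the question, which may not be exactly the variant solved in the full post.

```python
import random

def expected_matches_mc(group_sizes, trials=100_000):
    """Monte Carlo: average number of positions where a fixed prediction
    matches a uniformly shuffled deck with the given group sizes."""
    deck = [kind for kind, n in enumerate(group_sizes) for _ in range(n)]
    prediction = list(deck)  # any fixed arrangement gives the same expectation
    total = 0
    for _ in range(trials):
        shuffled = deck[:]
        random.shuffle(shuffled)
        total += sum(p == s for p, s in zip(prediction, shuffled))
    return total / trials

group_sizes = [4, 4, 4, 4]  # e.g. 16 cards in four groups of four
print(expected_matches_mc(group_sizes))                    # ~4.0
print(sum(n * n for n in group_sizes) / sum(group_sizes))  # closed form: 4.0
```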

(more…)

Published January 17, 2011

Books of the Year, 2010

This year I followed Amy’s lead from last year and kept a log of the books I read. Apart from being a fun exercise, and making it easier to remember what I read and when, it gives me a natural opportunity to write a “best of the year” post.

So first, some statistics: this year’s haul included a total of 101 books fully read, plus about 20 more partially read but abandoned. (Not all because they were bad; some because I wasn’t in the mood, or whatever. And the worst of the lot were actually finished.) Of the fully read, there were 38 SF, 26 fantasy, 18 lit fic, 13 nonfiction; only 15 of them were re-reads. 6 were YA, 5 middle-grade. I would say that 30 of them were good enough to recommend, with the ones below especially noteworthy. There were plenty of mediocre ones, but only one terrible enough to make me want to claw my eyes out of my head. And I added two writers, Kazuo Ishiguro and Theodora Goss, to my list of “writers whose work I will read the moment it comes out,” an unusually good crop for the year.

And so: Twenty books worth reading and one worth setting on fire. (more…)

Published January 3, 2011

Monte Carlo methods in History

I just finished re-reading Jared Diamond’s magnum opus, Guns, Germs, and Steel. It’s amazing how well the text holds up on re-evaluation; the analysis is deep, and the range of cases he covers is wide enough to convince me that there is real meat to his argument. It did, however, get me thinking about some interesting ways to extend his work. He proposes several in the epilogue, including all of the obvious further data searches and analyses which would need to be run to confirm or refute the hypothesis, and those are surely in the hands of people far more qualified to think about them than I am. But one point he raised obliquely got me thinking about the thing I have the most trouble with in his argument, and it gave me a thought for how to answer the question.

(more…)

Published October 28, 2010

That’s great, it starts with an earthquake, every 26.2 Myr.

It’s not often that reading a research paper makes me literally jump up in my seat and yell “Holy shit!,” but that’s what I did when I saw figure 1.

A fascinating new paper about the Nemesis Hypothesis is up on the arXiv today. This hypothesis originated in the 1980s, when people noticed that mass extinctions tended to happen every 27 million years or so, and it was suggested that this might be because some hard-to-see star — perhaps a red dwarf — is orbiting with our Sun in a very eccentric orbit, passing close to us only every so often and in the process pulling hordes of comets in from the Oort cloud, bringing about death and destruction. (A second periodicity, of roughly 62 Myr, also exists; those extinctions also seem to be tied to oscillations in sea level, so comets are considered a less likely cause, and there’s no analogous Nemesis-2 hypothesis that I know of.)

What the new paper did was accumulate a very detailed record of extinction rates over the past 500 Myr, using all of the best datasets available, and produce a plot of the fraction of species going extinct per unit time. The Fourier transform of this curve is simply shocking: it shows two extremely clear peaks, at periods of 62 and 26.2 Myr respectively. Amazingly, these peaks are clear enough to rule out the Nemesis hypothesis: if the periodicity were due to a star’s motion, it would actually have to be less perfectly regular than the experimental data shows, because the star’s motion would be perturbed by other stars, the galactic disk, and so on. (Good summary paper of that)
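
For intuition about what that Fourier analysis looks like, here’s a toy reconstruction using synthetic data with the two quoted periods plus noise; it resembles the paper’s actual dataset and methods in spirit only.

```python
import numpy as np

dt = 0.5                                  # sample spacing, Myr
t = np.arange(0, 500, dt)                 # 500 Myr of synthetic "extinction record"
rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * t / 62.0)    # 62 Myr cycle
          + np.sin(2 * np.pi * t / 26.2)  # 26.2 Myr cycle
          + 0.5 * rng.standard_normal(t.size))

power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)     # cycles per Myr

# The two strongest non-DC peaks should land near 1/62 and 1/26.2 Myr^-1.
top_two = freqs[np.argsort(power[1:])[-2:] + 1]
print(sorted(1.0 / top_two))              # recovered periods, roughly [26, 62]
```

In the toy version the peaks stand out cleanly even with substantial noise; the striking thing about the real result is how sharp the peaks are given how messy extinction data is.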

I was up much later than I should have been reading through these papers and thinking through the results. Some things seem clear:

  1. These guys seem to have done fairly serious numerical analysis. I’m not qualified to evaluate their data sources and the prospective issues on that side, so I’ll have to wait for the specialist community to weigh in, but I didn’t see any red flags in the paper. From looking at the Nemesis papers, it seems pretty clear that if their statistical analysis is good, then Nemesis is genuinely ruled out; I couldn’t think of any variation on the hypothesis which would survive this data.
  2. The peaks on the graph in this paper are holy-shit sharp and distinct. Something is happening with clockwork regularity that wipes out most of life on Earth.

I’m now spending some time thinking about what these spikes may mean. One interesting question is whether the 62 Myr and 26.2 Myr peaks are related, or whether they’re caused by completely disjoint phenomena. Interestingly, there’s another bump on the graph at about 17 Myr, although it sits right at the edge of statistical significance, not quite big enough to tell for sure whether it’s a real signal. If there is a bump there, then these bumps have the odd property of being evenly spaced in frequency space, at intervals of about 0.02 Myr⁻¹. The pattern of frequencies ω₀ + nω₁ is familiar from many differential equations — e.g., it’s the pattern of energy levels of a quantum-mechanical harmonic oscillator — so it’s something which could naturally emerge from a fairly wide range of physical phenomena.

On the other hand, the two could be wholly separate, or they could be the only two real spikes. It’s going to be hard to tell without staring at the raw data for a while, and even then we may not have enough precision to really know. One thing I do suspect we’ll be able to determine from this dataset is whether either of the two cycles could be coming from purely biological or other complex systems, such as clathrate guns; such systems seem unlikely to have extremely precise and stable periods. Honestly, the first thing that pops into my mind when I see this level of stability is pulsars or astrophysical jets; we don’t know a lot about super-long-period pulsars, but objects with periods that long would, by their nature, be awfully hard to study. Call this the “Cosmic Death Ray Hypothesis.” Other interesting possibilities could be long-period oscillations of the Sun, or something resonating in the structure of the Earth… although the latter seems a bit less precise to me.

Definitely time to look at the raw data. A lot of this hypothesizing depends on just how tight the error bounds on this data really are. If they’re as tight as they seem from the graphs, this is an amazing source of data.

The good news: We’re still about 16 Myr from the next predicted peak in this cycle, so we’ve got a little while to figure it out. Before the cosmic death rays come to get us.

ETA: There’s a good post about this on the arXiv blog which gives some more context.

Published July 12, 2010

On the value of hobbies

This bit of news just makes me happy. Not because the scientific advance is so critical (it’s an important advance, and will have significant practical applications, but that isn’t what I like about it) but because, despite having a really stressful and high-level day job, Steve Chu is still relaxing by going into the lab and doing science. And publishing papers in Nature. It’s nice to see that you can still make time for the things you love doing the most, no matter where you end up in your career.

Published July 10, 2010

Fascinating medical research

Physical gender is apparently more fluid than we anticipated. A new paper from NIMR shows that knocking out the Foxl2 gene in the ovaries of adult mice not only made the ovaries stop working as such; it made them switch over and start to become testicle-like. They don’t actually manufacture sperm, but they do start producing male rather than female hormones.

NB: that was in adult mice; apparently some of the key sex-differentiation pathways don’t shut off after the basic organs are built. That isn’t entirely surprising given the existence of sex-changing fish, but it’s still pretty damned surprising given that these are mammals.

This has some very interesting potential applications, especially for people with gender dysphoria. (Imagine being able to switch some parts over in situ rather than needing regular hormone injections.) It also really increases my curiosity about gender dysphoria: now that we’re starting to get a clearer picture of the various stages involved in determining physical gender, it would be really interesting to see whether anomalies at any of these stages correlate with dysphoria later in life.

Published June 11, 2010

Observation about physics

Based on the current best measurements of the large-scale structure of the universe, 73% of the total mass-energy is cosmological constant, a.k.a. dark energy. Of the remainder, 90% is dark matter of various sorts; of what’s left after that, 90% is intergalactic gas, and the rest is luminous matter.

For short: 97% of the universe is poorly understood; 3% is hot air; and the rest is on fire.

Any resemblance to other projects is purely coincidental, I promise.

Published April 29, 2008

Eka-Thorium

Now, this is neat: a superheavy element (Z=122, A=292, tentatively named “eka-Thorium” or “unbibium”) which is relatively stable (t½ ≥ 10⁸ yr) has been observed in nature, in natural Thorium samples. (It’s called eka-Thorium because it would sit directly below Thorium on the periodic table; as a result, it’s chemically very similar to Thorium, which means it can mix into Thorium ores and stick there, because it doesn’t separate easily.)

This is a whole 30 atomic numbers above the next-heaviest naturally occurring element, Uranium (Z=92, A=238). It’s the first empirical evidence that superheavy nuclei can actually exist and be stable.

Edit: Some history – the last time an element was discovered in nature was Francium (Z=87), found in 1939 by Marguerite Perey. It was thought highly likely that that would be the last time anyone ever did.

Published April 28, 2008