And here it is, ladies and gentlemen, your moment of Norse.

Today I encountered a line of beauty products with the curious name of Voluspa.

Question: Was the name chosen at random, simply because it contains the three letters “spa”? Or is there a plan for a whole advertising campaign: “Pamper yourself like Freyja on the eve of Ragnarök”?

Published on September 3, 2010 at 22:14

Nuclear Magnetic Resonance

Last time, we looked at a particle with spin 1/2 in a constant magnetic field. We saw that, just as with a classical magnetic dipole, this field causes the magnetic moment to precess around the field axis. Whenever we encounter a rotating body, a natural thing to try is applying a rotating force to it and seeing whether we can make it resonate. Today we’ll do that and see what happens.
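To make the setup concrete, here is a minimal numerical sketch, assuming units with ħ = 1 and arbitrary illustrative values for the Larmor frequency, the drive strength, and the time step (a rough illustration, not a careful simulation): a spin-1/2 in a static field along z plus a weak field rotating in the x-y plane. Sweeping the drive frequency shows the response peaking sharply when the drive matches the Larmor frequency.

```python
# A rough sketch of the setup described above, in units with hbar = 1.
# A spin-1/2 sits in a static field along z (Larmor frequency omega0) plus a
# weak field of strength omega1 rotating in the x-y plane at frequency omega_drive.
import numpy as np
from scipy.linalg import expm

sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

omega0 = 1.0    # Larmor frequency set by the static field (arbitrary units)
omega1 = 0.02   # strength of the weak rotating drive
dt = 0.05       # integration time step

def max_flip_probability(omega_drive, t_max=400.0):
    """Step the Schrodinger equation with a piecewise-constant Hamiltonian
    (evaluated at each step's midpoint) and return the largest probability
    of finding the spin flipped from 'up' to 'down'."""
    psi = np.array([1.0, 0.0], dtype=complex)   # start spin-up along z
    p_max = 0.0
    for n in range(int(t_max / dt)):
        t = (n + 0.5) * dt
        H = omega0 * sz + omega1 * (np.cos(omega_drive * t) * sx
                                    + np.sin(omega_drive * t) * sy)
        psi = expm(-1j * H * dt) @ psi
        p_max = max(p_max, abs(psi[1]) ** 2)
    return p_max

# Sweep the drive frequency: the response peaks sharply at resonance, i.e.
# when the drive frequency matches the Larmor frequency omega0.
for w in (0.90, 0.96, 0.98, 1.00, 1.02, 1.04, 1.10):
    print(f"drive = {w:.2f}   max flip probability = {max_flip_probability(w):.3f}")
```

Near resonance the spin can be flipped completely even though the drive is weak; far from resonance it barely budges.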


Published on August 24, 2010 at 10:00

The Two-State System

Alright, enough with the mathematical abstraction! This post, and quite a few posts after it, will look at very concrete physical examples, applying the methods of quantum mechanics and seeing what we can see. Today, we’re going to start with the simplest possible system that has nontrivial dynamics: a system with only two states in it. In a sign of great creativity, this is normally referred to as “the two-state system.”
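As a small preview of what “nontrivial dynamics” means here, a minimal sketch, assuming units with ħ = 1 and an arbitrarily chosen 2×2 Hermitian Hamiltonian: probability sloshes back and forth between the two states at a rate set by the matrix elements.

```python
# A small sketch: time evolution of a generic two-state system, with hbar = 1.
# The Hamiltonian entries are arbitrary numbers chosen for illustration.
import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, -1.0]], dtype=complex)    # any 2x2 Hermitian matrix will do

# Diagonalize once; exp(-i H t) is then easy to build from the eigenpairs.
energies, vectors = np.linalg.eigh(H)

def evolve(psi0, t):
    """Return psi(t) = exp(-i H t) psi0, built from the eigendecomposition."""
    coeffs = vectors.conj().T @ psi0          # expand psi0 in the energy basis
    return vectors @ (np.exp(-1j * energies * t) * coeffs)

psi0 = np.array([1.0, 0.0], dtype=complex)    # start entirely in state |1>
for t in np.linspace(0.0, 6.0, 7):
    p2 = abs(evolve(psi0, t)[1]) ** 2         # probability of finding state |2>
    print(f"t = {t:4.1f}   P(state 2) = {p2:.3f}")
```

The probability oscillates at the Bohr frequency, i.e. the splitting between the two energy eigenvalues divided by ħ.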


Published on August 17, 2010 at 10:00

From the Schrödinger Equation to the Uncertainty Principle

Last time, we walked through some of the history of quantum mechanics and came out with the Schrödinger equation, the master equation of nonrelativistic quantum mechanics. Much of what we’ll do in this course will involve solving this equation in a variety of interesting cases; but before we begin, it’s worth plunging a bit more deeply into the equation itself and seeing what we can learn just from its structure. Among other things, we’ll see how the abstract vectors of the linear-algebra approach relate to the functions of the differential-equation approach; see the (rather simple) way that real systems evolve over time; and encounter the fundamental limitations on measurement in quantum mechanics.
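As a small taste of that last point, here is a rough numerical check, assuming units with ħ = 1, an arbitrary grid, and two arbitrarily chosen wavefunction shapes: discretize a wavefunction, compute its position and momentum spreads, and the product Δx·Δp never falls below 1/2. A Gaussian packet sits right at the bound; other shapes sit above it.

```python
# A rough numerical check of Delta x * Delta p >= hbar/2, with hbar = 1.
import numpy as np

N, L = 2048, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)      # momentum grid matching the FFT

def spreads(psi):
    """Return (Delta x, Delta p) for a wavefunction sampled on the grid."""
    psi = psi / np.sqrt(np.sum(abs(psi) ** 2) * dx)     # normalize
    prob_x = abs(psi) ** 2 * dx
    mean_x = np.sum(x * prob_x)
    spread_x = np.sqrt(np.sum((x - mean_x) ** 2 * prob_x))

    phi = np.fft.fft(psi)                               # momentum-space amplitudes
    prob_p = abs(phi) ** 2
    prob_p /= prob_p.sum()
    mean_p = np.sum(p * prob_p)
    spread_p = np.sqrt(np.sum((p - mean_p) ** 2 * prob_p))
    return spread_x, spread_p

gaussian = np.exp(-x ** 2 / 2)        # saturates the bound: product = 0.5
excited = x * np.exp(-x ** 2 / 2)     # first excited oscillator state: product = 1.5

for name, psi in [("gaussian", gaussian), ("excited", excited)]:
    sx, sp = spreads(psi.astype(complex))
    print(f"{name:8s}  dx * dp = {sx * sp:.3f}   (the bound is 0.5)")
```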


Published on August 9, 2010 at 10:00

Wheat and politics

Side note: today Russia banned grain exports for the rest of the year, following droughts and wildfires. Many other countries (with the notable exception of the US) have been having major grain production shortfalls as well.

One country that may be particularly affected by this is Egypt: the government there subsidizes bread to the point that it is effectively free, which is a key factor in maintaining some semblance of social stability, especially in Cairo and Alexandria. The bread subsidy is one of the biggest line items in Egypt’s budget, and Russia is its primary grain supplier. This year’s shortfall (and next year’s, if the situation continues) could push the country’s budget over the edge and have a significant impact on the government’s stability.

Published on August 5, 2010 at 13:39

The Schrödinger Equation

Today I’d like to start doing some physics. I wish I could say that we were going to derive the Schrödinger equation — which is basically the master equation of quantum mechanics — but it doesn’t follow from a simple examination of mathematical principles. Its justification comes from the fact that it seems to accurately describe the physical world. So I’m going to walk through some of its history, and the experiments and physical facts leading up to it, and will end up with an equation and, more importantly, an explanation of what the quantities being solved for actually mean.

A bit of history

Our story begins in 1900. At this time, our understanding of physics wasn’t quite complete — there was still some argument over whether “atoms” had any physical reality or were simply a useful calculational tool for chemistry, and we were still trying to sort out just how we moved relative to the æther — but we were confident enough in our understanding that Lord Kelvin could comfortably say that “There is nothing new to be discovered in physics now; all that remains is more and more precise measurement.”


Published on August 4, 2010 at 10:10

A Mathematical Prelude: Some constants and a bit of linear algebra

And now for the first contentful post about quantum mechanics. It’s wild, wonderful, exciting… okay, not really. It’s the mathematical prelude: an overview of some key physical constants and some key bits of linear algebra which we’ll use rather extensively going forward. This isn’t meant to be deep, or to be an introduction to linear algebra; it’s a way to establish the notation we’ll be using, and to point out the facts that will turn out to be important to us later on.
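As a taste of the kind of machinery involved, a small sketch, assuming nothing beyond NumPy and using the Pauli y matrix as a stand-in observable: the value of ħ, and the facts that a Hermitian matrix has real eigenvalues and orthonormal eigenvectors.

```python
# A small sketch: one key constant, and the basic linear-algebra facts that
# Hermitian matrices have real eigenvalues and orthonormal eigenvectors.
import numpy as np

hbar = 1.054571817e-34           # reduced Planck constant, in joule-seconds

sigma_y = np.array([[0, -1j],    # a Hermitian matrix (the Pauli y matrix)
                    [1j,  0]])

eigenvalues, eigenvectors = np.linalg.eigh(sigma_y)
print("eigenvalues:", eigenvalues)            # real: [-1.  1.]

# The eigenvectors form an orthonormal basis: V^dagger V = identity.
print(np.allclose(eigenvectors.conj().T @ eigenvectors, np.eye(2)))   # True

# Inner products <u|v> conjugate the first argument (np.vdot does this).
u, v = eigenvectors[:, 0], eigenvectors[:, 1]
print("<u|v> =", np.vdot(u, v))               # ~0: the eigenvectors are orthogonal
```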

Published on August 2, 2010 at 10:02

The Quantum Mechanics Experiment

I’ve decided to try an experiment: to use this blog as a platform to teach undergrad quantum mechanics. Each post will be the equivalent of a “lecture”; the comments section is a natural place for the back-and-forth of questions.

Why? Mostly because it’s fun. I love teaching, I love quantum mechanics, and I haven’t had students in far too long.

It’s also an interesting experiment: I want to see whether a blog platform is really a good place to teach a course. It sounds good in theory. The “lectures” are persistent, so people can come to them whenever they find them; there’s a natural place to ask and answer questions, and even to see what other people have asked; and there is one key technical innovation now in place, namely that WordPress allows you to enter equations.

(For reference, and for use when commenting: you can type in equations as “$latex …. $”. It’s a bit finicky, so preview your comments before you post to make sure it worked.)

So I’m going to do this, posting as time permits, and I intend to work my way through the entire standard undergrad QM syllabus. Come join!


Published on August 2, 2010 at 10:00

Design Principles: When to Reimplement

This is another post about design. It’s about a principle which can apply fairly broadly; it could equally be about how to structure an API in a software system, or about how to handle a requirement in a business. Here it is in two flavors:

The software version

If your system’s dependency on another system cannot be expressed through a narrow, stable API, don’t depend on that external system — instead, reimplement it yourself.

The business version

If your business depends on some core function, and you care about the details of how the job is done, rather than just whether it’s done to some simple standard, don’t outsource that function. (For example, FedEx needs to fly its own planes.)

At first glance this may sound extreme; “reimplement it yourself” / “do it in-house” is a tall order for many things you may rely on. But in practice, this sort of decision can be life or death for your system. The reason is that, if you care about how a job is done in detail, you’re going to want to probe into it in depth; you’re going to want detailed control over the individual steps of the task; you’re going to want to be involved in the day-to-day operations to make sure it’s done to your particular needs. In terms of software, this means that you won’t be communicating with this system just via a narrow API like “write this data to a file”; you’ll be using complex APIs, bypassing APIs altogether to get at internal state, and so on.
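To make the software version concrete, here is a hypothetical sketch (the names BlobStore, upload, download, and internal_replica_map are all invented for illustration): as long as callers depend only on a narrow interface like “store these bytes, hand back a key,” the backing system can be swapped for an in-house one without the callers noticing; the moment code starts reaching past that interface into the vendor’s internals, it can’t.

```python
# A hypothetical sketch of the contrast described above; all names are invented.
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """The narrow, stable API: 'write this data, get back a key'."""

    @abstractmethod
    def put(self, data: bytes) -> str: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class ExternalBlobStore(BlobStore):
    """Thin wrapper around some outsourced storage service."""

    def __init__(self, client):
        self._client = client               # e.g. the vendor's SDK object

    def put(self, data: bytes) -> str:
        return self._client.upload(data)

    def get(self, key: str) -> bytes:
        return self._client.download(key)

class InHouseBlobStore(BlobStore):
    """A reimplementation you control; callers never notice the swap."""

    def __init__(self):
        self._blobs: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        key = str(len(self._blobs))
        self._blobs[key] = data
        return key

    def get(self, key: str) -> bytes:
        return self._blobs[key]

def archive(report: bytes, store: BlobStore) -> str:
    # Depends only on the narrow API, so the backing store is swappable.
    return store.put(report)

def audit_replication(store: ExternalBlobStore) -> None:
    # Reaches past the API into (hypothetical) vendor internals; once code
    # like this exists, you are effectively married to that particular system.
    for region in store._client.internal_replica_map():
        print("replica in", region)

# The narrow-API caller survives the swap unchanged:
store = InHouseBlobStore()
key = archive(b"quarterly report", store)
assert store.get(key) == b"quarterly report"
```

The point of the sketch is the asymmetry: archive survives the swap untouched, while audit_replication is the kind of code that quietly commits you to the external system.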

As this progresses, you gradually move from using the system to being intimately involved in it, debugging it, and ultimately needing to modify it to your particular needs. But crucially, if you don’t control that system, you can’t do that.

Now, this doesn’t mean that you shouldn’t consider outsourcing the job at first and moving it in-house when your need to mess with the details grows. But if you’re going to do that, you need to recognize that the design constraints of working with the external system are going to shape your own design from the get-go, and even once you go in-house, the legacy of those decisions will be with you forever. If you are confident enough in the API to believe that these design choices will still be correct afterwards, and that the changes you make as you go in-house will simply be extensions of that initial relationship, great; but if you suspect that your needs are going to end up fundamentally different from what the external system provides, you may want to bite the bullet and do it yourself from the get-go.

There’s an obvious risk in doing this, of course: it takes more time and money up front, and it doesn’t give you an immediate advantage over a competitor who outsources. But the risk can pay off if you know that you’re going to hit that transition point reasonably soon. When you do, a competitor who built around the wrong outsourced dependency is suddenly going to find themselves in need of a massive redesign, while you’re revealing wonderful new features to the world.

Published on July 20, 2010 at 12:30

That’s great, it starts with an earthquake, every 26.2 Myr.

It’s not often that reading a research paper makes me literally jump up in my seat and yell “Holy shit!”, but that’s what I did when I saw figure 1.

A fascinating new paper is up on the arXiv today about the Nemesis hypothesis. The hypothesis originated in the 1980s, when people noticed that mass extinctions tended to happen every 27 million years or so, and it was suggested that this might be because some hard-to-see star (perhaps a red dwarf) orbits with our Sun in a very eccentric orbit, passing close to us only every so often and in the process pulling in hordes of comets from the Oort cloud and bringing about death and destruction. (A second periodicity, of roughly 62 million years, also exists; those extinctions also seem to be tied to oscillations in sea level, so comets are considered a less likely cause, and there’s no analogous “Nemesis 2” hypothesis that I know of.)

What the new paper did was accumulate a very detailed record of extinction rates over the past 500 Myr, using all of the best datasets available, and produce a plot of the fraction of species going extinct per unit time. The Fourier transform of this curve is simply shocking: it shows two extremely clear peaks, at periods of 62 and 26.2 Myr. Amazingly, these peaks are clear enough to rule out the Nemesis hypothesis: if the periodicity were due to a star’s motion, it would actually have to be less perfectly regular than the experimental data, because the star’s orbit would be perturbed by other stars, the galactic disk, and so on. (There’s a good summary paper making that argument.)
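For intuition about what that Fourier analysis looks like, here is a toy sketch on synthetic data (not the paper’s dataset or its actual statistical method): bury 62 Myr and 26.2 Myr cycles in noise, take the power spectrum, and the two periods pop out as sharp peaks.

```python
# A toy sketch with synthetic data: periodic signals buried in noise show up
# as sharp peaks in the power spectrum.  All the numbers here are made up.
import numpy as np

rng = np.random.default_rng(0)

dt = 0.5                                   # sampling interval, Myr
t = np.arange(0.0, 500.0, dt)              # 500 Myr of fake "fossil record"

# Fake extinction-rate series: two cycles (62 and 26.2 Myr) plus noise.
rate = (1.0
        + 0.3 * np.cos(2 * np.pi * t / 62.0)
        + 0.3 * np.cos(2 * np.pi * t / 26.2)
        + 0.5 * rng.standard_normal(t.size))

# Power spectrum of the mean-subtracted series.
power = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
freq = np.fft.rfftfreq(t.size, d=dt)       # in cycles per Myr

# Report the two strongest peaks and the periods they correspond to.
for i in sorted(np.argsort(power)[-2:], key=lambda i: freq[i]):
    print(f"peak at {freq[i]:.4f} cycles/Myr  ->  period ~ {1 / freq[i]:.1f} Myr")
```

With two clean sinusoids and well-behaved noise the peaks are unmistakable; the hard part in the real analysis is the quality and dating of the fossil record itself, which is exactly what a toy like this sweeps under the rug.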

I was up much later than I should have been reading through these papers and thinking through the results. Some things seem clear:

  1. These guys seem to have done fairly serious numerical analysis. I’m not qualified to evaluate their data sources and the potential issues on that side, so I’ll have to wait for the specialist community to weigh in, but I didn’t see any red flags in the paper. From looking at the Nemesis papers, it seems pretty clear that if the statistical analysis here is good, then Nemesis is genuinely ruled out; I couldn’t think of any variation on that hypothesis which would survive this data.
  2. The peaks on the graph in this paper are holy-shit sharp and distinct. Something is happening with clockwork regularity that wipes out a large fraction of the species on Earth.

I’m now spending some time thinking about this, and about what these spikes may mean. One interesting question is whether the 62 Myr and 26.2 Myr peaks are related, or whether they’re caused by completely disjoint phenomena. Interestingly, there’s another bump on the graph at about 17 Myr, although it looks like it’s only just above the level of statistical significance, not quite big enough to tell for sure whether it’s a real signal. If there is a bump there, then the bumps have the odd property of being evenly spaced in frequency, at intervals of about 0.02 Myr⁻¹ (1/62 ≈ 0.016, 1/26.2 ≈ 0.038, and 1/17 ≈ 0.059 Myr⁻¹). The pattern of frequencies ω₀ + nω₁ is familiar from many differential equations (it’s the pattern of energy levels of a quantum-mechanical harmonic oscillator, for example), so it’s something which could naturally emerge from a fairly wide range of physical phenomena.

On the other hand, the two could be wholly separate, or they could be the only two real spikes. It’s going to be hard to tell without staring at the raw data for a while, and even then we may not have enough precision to really know. One thing which I do suspect we’ll be able to determine from this dataset is whether either of the two cycles could be coming from purely biological or other complex systems, such as clathrate guns; such systems seem less likely to have extremely precise and stable periods. Honestly, the first thing that pops into my mind when I see this level of stability is pulsars or astrophysical jets; we don’t know a lot about super-long-period pulsars, and by their nature their long periods would make them awfully hard to study. Call this the “Cosmic Death Ray Hypothesis.” Other interesting possibilities could be long-period oscillations of the Sun, or something resonating in the structure of the Earth… although the latter seems less likely to be so precise.

Definitely time to look at the raw data. A lot of this hypothesizing depends on just how tight the error bounds on this data really are. If they’re as tight as they seem from the graphs, this is an amazing source of data.

The good news: We’re still about 16 Myr from the next predicted peak in this cycle, so we’ve got a little while to figure it out. Before the cosmic death rays come to get us.

ETA: There’s a good post about this on the arXiv blog which gives some more context.

Published on July 12, 2010 at 11:16