The key piece of data is the amount of radioactivity deposited in the solar system, which has been deduced from isotope abundances in meteorites, explains astrophysicist Brian Fields of the University of Illinois at Urbana-Champaign. When a supernova explodes, it releases a shell of radioactive material that diminishes in concentration just as the intensity of a lightbulb fades with distance. The size of the proto-solar system and its distance from the explosion determine how much of that radioactive ejecta mingles in. Based on the meteorite record and the properties of a typical supernova, Fields and his colleagues could therefore calculate the ratio of the explosion's distance to the budding solar system's size: a modest 66:1, they report in a paper accepted for publication in the Astrophysical Journal.
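The dilution step described above is pure geometry: the fraction of ejecta the proto-solar system sweeps up depends only on the ratio of the explosion's distance to the system's size. A minimal sketch of that inverse-square relation, using the 66:1 ratio from the article (everything else here is just the geometry, not a figure from the paper):

```python
# Inverse-square dilution: a shell of ejecta spreads over a sphere of
# area 4*pi*d^2 by the time it reaches distance d, while the proto-solar
# system presents a cross-section of roughly pi*R^2.
# The 66:1 distance-to-size ratio is from Fields and colleagues.

ratio = 66  # d / R: supernova distance over proto-solar system radius

# Intercepted fraction = (pi * R**2) / (4 * pi * d**2) = 1 / (4 * ratio**2)
intercepted_fraction = 1.0 / (4.0 * ratio**2)

print(f"fraction of ejecta intercepted: {intercepted_fraction:.1e}")
```

Running backward, measuring that tiny intercepted fraction from meteorite isotopes is what pins down the ratio itself.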
From that ratio they worked out the distance itself by considering the sizes of protostars elsewhere in the galaxy. "Once we did that, our eyes got big, and we said, 'son of a gun, it had to be really close,'" Fields remarks. They pegged the supernova's distance at less than five light-years, which is well within the roughly 10-light-year breadth of a cluster. For comparison, Proxima Centauri, the nearest star to our own, lies about four light-years away. "To have such an in-your-face explosion, you have to be all in the same stellar nursery," Fields says. Many such clusters, containing hundreds or thousands of stars each, have been observed. But because the stars are held together only weakly, they can easily drift apart. Unfortunately, that drift makes it impossible to say which stars would once have shared a nursery with the sun, Fields points out.
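Turning the 66:1 ratio into an actual distance requires plugging in a size for the proto-solar system. A back-of-envelope sketch, assuming an illustrative protostellar cloud-core radius (the article reports only the final sub-five-light-year answer, not the size the team adopted):

```python
# Distance = ratio * (proto-solar system radius). The core radius below
# is an illustrative assumption, not a value from Fields's paper.

LY_PER_PC = 3.26        # light-years per parsec

ratio = 66              # supernova distance / proto-solar system radius
core_radius_pc = 0.02   # assumed protostellar core radius, in parsecs

distance_ly = ratio * core_radius_pc * LY_PER_PC
print(f"implied supernova distance: {distance_ly:.1f} light-years")
```

With this assumed radius, the distance comes out near four light-years, comparable to the gap to Proxima Centauri and comfortably inside the roughly 10-light-year breadth of a cluster, consistent with the result quoted in the article.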
The group's estimate is the most careful one yet, says astrophysicist Steven Desch of Arizona State University. "It really does make people think about the proximity in a way they hadn't before," he says. "This puts you definitely inside the cluster."