Our group’s calculations suggest that the predicted masses of the first star-forming clumps are not very sensitive to the assumed cosmological conditions (for example, the exact nature of the initial density fluctuations). In fact, the predicted masses depend primarily on the physics of the hydrogen molecule and only secondarily on the cosmological model or simulation technique. One reason is that molecular hydrogen cannot cool the gas below 200 kelvins, making this a lower limit to the temperature of the first star-forming clumps. Another is that the cooling from molecular hydrogen becomes inefficient at the higher densities encountered when the clumps begin to collapse. At these densities the hydrogen molecules collide with other particles before they have time to emit an infrared photon; the collisions de-excite the molecules, so energy that would have been radiated away remains in the gas as heat. This raises the gas temperature and slows down the contraction until the clumps have built up to at least a few hundred solar masses.
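That mass scale can be checked with a back-of-the-envelope Jeans-mass estimate. The sketch below (Python) evaluates one standard form of the Jeans mass at the 200-kelvin temperature floor and at a particle density of about 10,000 per cubic centimeter — a value often quoted for the stalling stage of these clumps and adopted here purely for illustration. The numerical prefactor differs between textbook conventions, so only the order of magnitude is meaningful.

```python
import math

# Physical constants (SI units)
k_B   = 1.380649e-23   # Boltzmann constant, J/K
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
m_H   = 1.6735e-27     # hydrogen atom mass, kg
M_sun = 1.989e30       # solar mass, kg

def jeans_mass(T, n_cm3, mu=1.22):
    """Jeans mass, M_J = (pi^(5/2)/6) * c_s^3 / (G^(3/2) * rho^(1/2)).

    T      -- gas temperature in kelvins
    n_cm3  -- particle number density in cm^-3
    mu     -- mean molecular weight (~1.22 for neutral primordial gas)
    Returns the Jeans mass in solar masses.
    """
    rho = mu * m_H * n_cm3 * 1e6           # mass density, kg/m^3
    c_s = math.sqrt(k_B * T / (mu * m_H))  # isothermal sound speed, m/s
    m_j = (math.pi**2.5 / 6.0) * c_s**3 / (G**1.5 * math.sqrt(rho))
    return m_j / M_sun

# 200 K cooling floor, assumed density of ~10^4 cm^-3
print(f"M_J ~ {jeans_mass(200.0, 1e4):.0f} solar masses")
```

With these inputs the estimate comes out at several hundred to roughly a thousand solar masses, in line with the clump masses quoted above. Because the Jeans mass scales as T^(3/2), the inability of molecular hydrogen to cool the gas below 200 kelvins directly sets this large mass scale.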
What was the fate of the first collapsing clumps? Did they form stars with similarly large masses, or did they fragment into many smaller parts and form many smaller stars? The research groups have pushed their calculations to the point at which the clumps are well on their way to forming stars, and none of the simulations has yet revealed any tendency for the clumps to fragment. This agrees with our understanding of present-day star formation; observations and simulations show that the fragmentation of star-forming clumps is typically limited to the formation of binary systems (two stars orbiting around each other). Fragmentation seems even less likely to occur in the primordial clumps, because the inefficiency of molecular hydrogen cooling would keep the Jeans mass high. The simulations, however, have not yet determined the final outcome of collapse with certainty, and the formation of binary systems cannot be ruled out.
Different groups have arrived at somewhat different estimates of just how massive the first stars might have been. Abel, Bryan and Norman have argued that the stars probably had masses no greater than 300 solar masses. Our own work suggests that masses as high as 1,000 solar masses might have been possible. Both predictions might be valid in different circumstances: the very first stars to form might have had masses no larger than 300 solar masses, whereas stars that formed a little later from the collapse of larger protogalaxies might have reached the higher estimate. Quantitative predictions are difficult because of feedback effects; as a massive star forms, it produces intense radiation and matter outflows that may blow away some of the gas in the collapsing clump. But these effects depend strongly on the presence of heavy elements in the gas, and therefore they should be less important for the earliest stars. Thus, it seems safe to conclude that the first stars in the universe were typically many times more massive and luminous than the sun.
The Cosmic Renaissance

What effects did these first stars have on the rest of the universe? An important property of stars with no metals is that they have higher surface temperatures than stars with compositions like that of the sun. The production of nuclear energy at the center of a star is less efficient without metals, and the star would have to be hotter and more compact to produce enough energy to counteract gravity. Because of the more compact structure, the surface layers of the star would also be hotter. In collaboration with Rolf-Peter Kudritzki of the University of Hawaii and Abraham Loeb of Harvard, one of us (Bromm) devised theoretical models of such stars with masses between 100 and 1,000 solar masses. The models showed that the stars had surface temperatures of about 100,000 kelvins—about 17 times higher than the sun’s surface temperature. Therefore, the first starlight in the universe would have been mainly ultraviolet radiation from very hot stars, and it would have begun to heat and ionize the neutral hydrogen and helium gas around these stars soon after they formed.
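The quoted numbers are easy to verify. The short sketch below (Python) checks the temperature ratio against the sun’s surface temperature (about 5,800 kelvins) and uses Wien’s displacement law to locate the peak of each star’s blackbody spectrum; only photons shortward of 91.2 nanometers can ionize hydrogen.

```python
# Wien's displacement law: lambda_peak = b / T
b = 2.898e-3     # Wien displacement constant, m*K
T_star = 1.0e5   # surface temperature of a metal-free massive star, K
T_sun  = 5778.0  # surface temperature of the sun, K

ratio = T_star / T_sun   # ~17, as stated in the text
lam_star = b / T_star    # ~29 nm: extreme ultraviolet
lam_sun  = b / T_sun     # ~500 nm: visible light

print(f"temperature ratio: {ratio:.1f}")
print(f"spectral peaks: star {lam_star*1e9:.0f} nm, sun {lam_sun*1e9:.0f} nm")
print(f"star peaks below the 91.2 nm hydrogen-ionizing limit: {lam_star < 91.2e-9}")
```

The first stars’ spectra peaked deep in the ultraviolet, well shortward of the hydrogen-ionizing limit, whereas the sun’s spectrum peaks in visible light — which is why the first starlight could ionize the surrounding gas so effectively.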
We call this event the cosmic renaissance. Although astronomers cannot yet estimate how much of the gas in the universe condensed into the first stars, even a fraction as small as one part in 100,000 could have been enough for these stars to ionize much of the remaining gas. Once the first stars started shining, a growing bubble of ionized gas would have formed around each one. As more and more stars formed over hundreds of millions of years, the bubbles of ionized gas would have eventually merged, and the intergalactic gas would have become completely ionized.
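The "one part in 100,000" figure follows from a photon-counting argument, sketched below in Python. The two inputs — a time-averaged output of about 1.6 × 10^48 ionizing photons per second per solar mass, and a lifetime of about 3 million years — are representative values from models of metal-free very massive stars, adopted here as assumptions. The question is then what fraction of the universe's baryons must form such stars to supply at least one ionizing photon per baryon.

```python
# Assumed properties of metal-free very massive stars (illustrative values)
Q_per_Msun = 1.6e48         # ionizing photons per second per solar mass (assumed)
lifetime_s = 3e6 * 3.156e7  # ~3 Myr stellar lifetime, in seconds (assumed)

M_sun = 1.989e30            # solar mass, kg
m_H   = 1.6735e-27          # hydrogen atom mass, kg

# Ionizing photons emitted per baryon locked into such stars
baryons_per_Msun   = M_sun / m_H
photons_per_baryon = Q_per_Msun * lifetime_s / baryons_per_Msun

# Star-forming mass fraction needed for one ionizing photon per baryon
fraction = 1.0 / photons_per_baryon
print(f"photons per stellar baryon: {photons_per_baryon:.2e}")
print(f"required star-forming fraction: {fraction:.1e}")
```

Each baryon incorporated into such a star yields on the order of 100,000 ionizing photons, so a star-forming fraction of roughly one part in 100,000 suffices in principle. In practice recombinations in the gas demand several photons per baryon, but the order of magnitude stands.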
Scientists from the California Institute of Technology and the Sloan Digital Sky Survey have recently found evidence for the final stages of this ionization process. The researchers observed strong absorption of ultraviolet light in the spectra of quasars that date from about 900 million years after the big bang. The results suggest that the last patches of neutral hydrogen gas were being ionized at that time. Helium requires more energy to ionize than hydrogen does, but if the first stars were as massive as predicted, they would have ionized helium at the same time. On the other hand, if the first stars were not quite so massive, the helium must have been ionized later by energetic radiation from sources such as quasars. Future observations of distant objects may help determine when the universe’s helium was ionized.
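The energy hierarchy behind this argument can be made concrete. The ionization energies are 13.6 electron volts for hydrogen, 24.6 for neutral helium, and 54.4 for singly ionized helium; the sketch below (Python) converts each into the longest photon wavelength able to do the job.

```python
# Photon energy-wavelength conversion: lambda = hc / E, with hc ~ 1239.84 eV*nm
HC_EV_NM = 1239.84

thresholds_eV = {
    "H I   -> H II  ": 13.6,  # hydrogen ionization
    "He I  -> He II ": 24.6,  # first helium ionization
    "He II -> He III": 54.4,  # second helium ionization
}

for label, energy in thresholds_eV.items():
    print(f"{label}: needs photons shortward of {HC_EV_NM / energy:.1f} nm")
```

A 100,000-kelvin blackbody peaks near 29 nanometers — well shortward of the 91.2- and 50.4-nanometer limits for hydrogen and neutral helium, with a substantial tail beyond the 22.8-nanometer limit for singly ionized helium. That is why stars as massive and hot as those predicted could have ionized helium along with hydrogen, whereas cooler stellar sources could not.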
If the first stars were indeed very massive, they would also have had relatively short lifetimes—only a few million years. Some of the stars would have exploded as supernovae at the end of their lives, expelling the metals they produced by fusion reactions. Stars that are between 100 and 250 times as massive as the sun are predicted to blow up completely in energetic explosions, and some of the first stars most likely had masses in this range. Because metals are much more effective than hydrogen in cooling star-forming clouds and allowing them to collapse into stars, the production and dispersal of even a small amount could have had a major effect on star formation.