The First Stars and Galaxies
Astrophysicist Avi Loeb on structure formation in the early Universe, mapping hydrogen using a 21 centimeter l...
According to the current cosmological model, dark matter is a major constituent of the universe, accounting for a large proportion of the total material. It is also a new kind of matter, one that does not correspond to any of the elementary particles we know about, such as protons, neutrons and electrons. While the existence of dark matter has been inferred largely through astronomical observations and confirmed by simulations, it remains a riddle: no identified particle of physics can account for it.
Astronomical observations and the ‘missing mass’
The first indications came from studies of galaxies and galaxy clusters almost 90 years ago, when in 1933 Fritz Zwicky measured the speeds with which galaxies orbit within clusters – the way that the Earth orbits the Sun, or stars in a galaxy orbit its center. From the size of the clusters and the speeds of the galaxies, he worked out their orbital times and how much material was needed in a cluster to keep the galaxies orbiting around its center. What Zwicky discovered was that the mass required to keep the galaxies in orbit appeared to be much larger than could be observed directly in the form of stars – he got numbers hundreds of times larger. Estimates of astronomical distances have changed over time but, even today, the amount of material required to keep galaxies orbiting inside clusters, so as to stop the clusters from flying apart, is about 10 times what you can observe. These observations constituted the first evidence for dark matter. Additional evidence came later from seeing how gas and small galaxies orbit around individual large galaxies like the Milky Way. Explaining the orbital speeds seen far from the center of such a galaxy, where there are no stars, requires large amounts of material at these radii.
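Zwicky's argument can be sketched as a back-of-the-envelope virial estimate: for a cluster in equilibrium, the mass needed to hold galaxies moving at typical speed σ within radius R is roughly M ~ σ²R/G. The numbers below are illustrative (roughly Coma-like), not Zwicky's own data:

```python
# Rough virial mass estimate in the spirit of Zwicky's argument:
# for a cluster in equilibrium, M ~ sigma^2 * R / G.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec, m

def virial_mass(sigma_km_s, radius_mpc):
    """Order-of-magnitude cluster mass from the galaxy velocity dispersion."""
    sigma = sigma_km_s * 1e3      # km/s -> m/s
    radius = radius_mpc * MPC     # Mpc -> m
    return sigma**2 * radius / G  # kg

# Illustrative numbers: sigma ~ 1000 km/s inside R ~ 2 Mpc
m = virial_mass(1000.0, 2.0)
print(f"{m / M_SUN:.1e} solar masses")  # -> 4.6e+14 solar masses
```

A velocity dispersion of about 1000 km/s inside about 2 Mpc implies several times 10^14 solar masses – far more than the visible stars supply, which is the discrepancy described above.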
Light bending
More recently, dark matter has been needed to explain gravitational lensing, as, for example, when light coming to us from a distant galaxy passes through a galaxy cluster. The gravitational field of the cluster bends the light rays, and this distorts the image. By measuring this distortion, you can infer how much the light rays have been bent, and thus how much mass there is in the galaxy cluster. These measurements have shown, again, that there is much more material there than can be accounted for by everything you can see through optical, X-ray or other direct observations.
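The size of the effect can be illustrated with Einstein's point-mass deflection formula, α = 4GM/(c²b), for a ray passing a mass M at impact parameter b. A real cluster is an extended mass distribution, so this is only an order-of-magnitude sketch with assumed, illustrative numbers:

```python
import math

# Einstein's point-mass deflection angle: alpha = 4 G M / (c^2 b).
# Illustrative only -- real cluster lenses are extended mass distributions.
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # kg
MPC = 3.086e22     # m

def deflection_angle_arcsec(mass_msun, impact_mpc):
    """Deflection of a light ray passing a point mass at impact parameter b."""
    alpha_rad = 4 * G * (mass_msun * M_SUN) / (C**2 * impact_mpc * MPC)
    return math.degrees(alpha_rad) * 3600  # radians -> arcseconds

# A ~1e15 solar-mass cluster, with the ray passing 0.5 Mpc from the center:
print(f"{deflection_angle_arcsec(1e15, 0.5):.0f} arcseconds")  # -> 79 arcseconds
```

Deflections of tens of arcseconds are what cluster lensing observations measure; working backwards from the observed distortion to the required mass is how the lensing mass estimates quoted above are obtained.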
Microwave background radiation
More recently still, evidence for dark matter was found in the very early universe, even before galaxies formed, by looking at small fluctuations in the microwave background radiation, the heat radiation that comes to us directly from the Big Bang itself. When you take an image of the sky in this radiation, you see what the universe looked like when it was only about 380,000 years old, whereas today it is almost 14 billion years old. At that time there were no non-linear objects, no stars and no galaxies. There were just weak fluctuations, like sound waves propagating through the hot gas. By looking at structure in these very distant clouds when the universe was very young, you can figure out the properties of these sound waves, and it turns out that they also require material to be present in large quantities – about five or six times as much as ordinary matter – which does not interact with the radiation or the ordinary matter in any way except through gravity. This shows that dark matter was also present in the early universe, as well as around today's galaxies and galaxy clusters, and the amounts needed seem to be the same.
Renewed interest in dark matter
In the 1970s more evidence like Zwicky’s was found on a variety of scales and people started thinking about how galaxies and galaxy clusters might have formed. To get this to work, it seemed necessary to include the dark matter component in addition to the ordinary baryonic material, which started mostly as hydrogen and helium coming directly out of the Big Bang. It seemed that without dark matter the galaxies would not have time to form. What did work was for objects to form when some region of space where the density of material was a bit higher than average stopped expanding with the rest of the universe and fell back on itself. The mixture of dark matter and gas would come to a kind of equilibrium after which the dark matter would sit there while the gas gradually lost its energy through radiation and sank to the bottom of the gravitational potential well, eventually turning into a galaxy. This kind of process could lead to large equilibrium lumps of dark matter with galaxies sitting at the centers, thus explaining why you only get to see the evidence for dark matter when you get out to the edges of galaxies. With this scheme it turned out to be possible to get the galaxies to form in the available time and to end up spinning as fast as is observed. From this time on, dark matter has always been an important part of our model for galaxy formation.
Early attempts at explanation
However, when I was working through these ideas in the 1970s, it seemed most likely that the dark matter was, in fact, ordinary matter which had turned into very faint, effectively invisible stars at some early time (we did not learn the details of the structure in the microwave background until 25 years later). This idea had a number of problems, but it seemed simpler than inventing a completely new kind of matter. Later in the 1970s, however, the idea of dark matter as a new kind of material received a stronger impetus from the work of Yakov Zeldovich and his colleagues in the Soviet Union, who suggested that if the neutrino (a type of elementary particle already known to physicists, which interacts extremely weakly with light and ordinary matter) had a small mass of just the right size, then neutrinos produced in the Big Bang could constitute the dark matter. Zeldovich's school in Moscow and a number of other people then started thinking about the consequences of neutrinos as dark matter. In about 1980, an experiment in Russia on the end-point of tritium decay seemed to show a signal indicating that the electron neutrino has a mass of about 30 electron volts, which was exactly the value needed to explain dark matter.
Everything seemed to fit, but it turned out there was a problem with the experiment and the result eventually proved to be incorrect. Nevertheless, it still gave a big push to the idea that the dark matter might be some new kind of elementary particle. Computer experiments which looked in more detail at how cosmic structure would evolve if the dark matter really was a 30 electron volt neutrino soon showed that although this gives the right total amount of mass it doesn’t make the right type of structure, in particular it makes objects which are much too big. At about this time astronomers were starting to map the structure of the galaxy distribution by surveying large areas of the sky. And it was clear that the large-scale structure of the universe, the pattern of the galaxy distribution, is not consistent with the pattern one would expect if the universe was made of neutrinos.
Cold Dark Matter and the Planck Observations
After a relatively short time, the idea that the dark matter was neutrinos thus fell into disfavor, and various new particle candidates were suggested. It became clear that, given certain kinds of particle properties, you could get large-scale structure that looks very similar to that observed, avoiding the problems of the neutrinos because the early evolution of structure would be different. The disadvantage of these particles is that none of them has ever been shown to exist in any physics experiment. Despite this, in the 1980s this gradually became the standard picture, known as the “cold dark matter” model. Cold dark matter is any new kind of particle which interacts weakly with light and ordinary matter and is present in the right abundance but which, unlike the neutrinos, had very weak thermal motions in the early universe. As a result, it doesn’t start by making very large structures, but rather by making small structures which grow galaxies at their centers and then gradually fall together to make bigger systems like clusters. The 1980s were also the time when computers were getting powerful enough to do realistic simulations of cosmic structure formation. So people simulated this cold dark matter model and found patterns for the galaxy distribution that look quite similar to the one observed. At the time, the model was not very popular among observational astronomers or other physicists because, in order to solve an astronomical puzzle – the “missing mass” problem – it invented an entirely new form of matter. This seemed too radical to many people.
But then, in the 1990s and 2000s, astronomers built instruments sensitive enough to measure the fluctuations in the microwave background radiation, and these measurements rapidly became extremely precise. The most recent results from the European Planck satellite have provided a very detailed picture of structure in the early Universe, and this unambiguously requires dark matter – some kind of particle that interacts very weakly with light, electrons and protons. This is a completely different context from earlier studies of dark matter, and yet the amount that is needed is just the same. Indeed, the most precise measurement of the amount of dark matter in the Universe now comes from the microwave background measurements.
So dark matter is needed in two completely different astronomical situations: around galaxies and galaxy clusters today, and in the very early universe, when all structure was linear, with weak, gravitationally driven sound waves propagating through near-uniform gas. Analysis of this early period in the universe’s history requires the same kind of matter, in the same quantity, as is needed today to understand the structure, the dynamics and the gravitational lensing of galaxy clusters and galaxies. All this gives a lot more credibility to the idea that dark matter is real and may be a new kind of elementary particle.
One of the things that came out of the Planck observations is a very precise estimate of how much ordinary matter and dark matter were present at early times. There is five times as much dark matter as ordinary matter, so the dark matter dominates, and the initial collapse of objects is driven by the gravity of their dark matter. But the dark matter cannot get rid of its energy, because it has no interactions with any of the other particles, so it just sits there while the gas loses energy by radiation and sinks to the center, getting denser and denser until it makes stars. The picture that Martin Rees and I set out in 1978 still works with this new identification of what dark matter might be.
At the beginning of the 1980s, right after the theoretical work with Martin Rees, I was involved in the first computer simulations of neutrino-dominated universes, which produced structure that looks nothing like what we saw in the very first surveys of the distribution of galaxies, completed a year or two earlier. My collaborators and I therefore switched to simulations of structure formation in a cold dark matter universe. Predictions of what the initial conditions should look like in such a universe had just been made, so the obvious thing for us to do was to see whether these led to better results for present-day structure, and indeed this turned out to be the case.
The Millennium Simulation was a much more recent version of those early simulations. The first simulations we did in the 1980s represented the entire distribution of matter in the universe by thirty-two thousand gravitating points. We solved this just as in the Newtonian two-body problem, except that we were doing the thirty-two-thousand-body problem – that was the maximum number of particles you could follow in a computer back then. By 2005, when we got to the Millennium Simulation, computers and computational methods had improved dramatically and we could do the 10^10-particle problem: a calculation representing the mass distribution of the universe – technically, the dark matter distribution – with 10^10 particles, all evolving under the influence of their mutual gravitational forces.
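In miniature, those early calculations were direct N-body integrations: every particle feels the (softened) gravity of every other, and the system is stepped forward in time. A toy sketch of this idea, in Python with illustrative code units rather than anything taken from the actual simulations:

```python
import numpy as np

# Minimal direct-summation N-body sketch (a toy version of the kind of
# calculation described above; units and particle count are illustrative).
G = 1.0      # gravitational constant in code units
EPS = 0.05   # softening length, avoids force singularities at close passes

def accelerations(pos, mass):
    """Softened pairwise gravitational accelerations, O(N^2)."""
    diff = pos[None, :, :] - pos[:, None, :]   # r_j - r_i for every pair
    dist2 = (diff**2).sum(-1) + EPS**2
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)              # a particle exerts no self-force
    return G * (diff * inv_d3[:, :, None] * mass[None, :, None]).sum(axis=1)

def evolve(pos, vel, mass, dt=0.01, steps=100):
    """Leapfrog (kick-drift-kick) integration of the N-body problem."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

# Toy example: a small cloud of particles collapsing under its own gravity
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, (32, 3))
vel = np.zeros((32, 3))
mass = np.ones(32) / 32
pos, vel = evolve(pos, vel, mass)
```

Production codes replace the O(N²) force sum with tree or particle-mesh methods; the Millennium Simulation's 10^10 particles would be hopeless with direct summation.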
10^10 is a large enough number that you can represent the matter distribution in quite a lot of detail, and you can represent the initial conditions quite faithfully. This allows you to get much more precise predictions for what the distribution of matter should look like at the present day. The way to think about this is that the microwave background tells us precisely what the structure of the dark matter distribution was when the universe was 380,000 years old, and we just evolve this gravitationally in the computer to the present day, so that we can compare the result to the observed mass distribution and see if the statistics agree. In fact, it turned out that, for example, gravitational lensing observations of galaxies and galaxy clusters agree in detail with the structure predicted for such objects by simulations of this kind, and the observed pattern of the distribution of galaxies in space is very similar to the pattern found if you try to follow galaxy formation inside the Millennium Simulation. The goal here is to make ever more precise and quantitative comparisons between the predictions of evolution from the initial conditions seen in the microwave background and the structure of the universe we see today. If this works, it gives us increased confidence in our theory. If not, then it shows the theory needs to be modified.
In some ways the cold dark matter model is very nice because you have two very different epochs where you have detailed information and you seem to have a model which has a relatively small number of parameters which gets you all the way from one to the other. On the other hand, the model has two ingredients, which are completely outside of the rest of physics: the dark matter and also the dark energy, which is currently driving the accelerated expansion of the universe. This makes many people very uncomfortable – you have to invent two brand new things to explain what we see, and there is no obvious relation between them, or between them and the ordinary matter.
One possible alternative has come from theorists who propose that the need for dark matter might be avoided by a suitable extension of gravity theory. Perhaps the laws of gravity differ from those set out by Newton and Einstein on very large scales, and perhaps they can be changed in such a way as to reproduce the phenomenology that requires dark matter if we assume standard gravity. This program has had some success at the level of understanding the structure of galaxies and galaxy clusters, but the difficulty arises when you want an explanation which works simultaneously for the microwave background radiation. Clearly, you don’t want two separate explanations to get rid of the dark matter at two different times, so you need to change the theory of gravity in a way that solves both problems at once, and this has so far resisted solution.
Essentially, people who have tried to get rid of the dark matter by adopting a more complicated gravity theory than general relativity have usually had to introduce additional fields which, in the end, behave like dark matter. So you haven’t really solved the problem, you’ve just given it a different name. At the moment there is no alternative theory, free of dark matter smuggled in through the back door, which can account both for the detailed structure of the microwave background fluctuations and for the gravitational lensing and dynamical effects usually ascribed to dark matter in the present-day universe. Nevertheless, until we identify what the dark matter is directly, the situation remains uncomfortable, because we know nothing about how it relates to the rest of physics.
So far, the evidence for dark matter comes entirely from astronomical observations. Many suggestions for what it might be can be tested experimentally, and people all over the world are trying to find dark matter particles in the lab. The three known types of neutrino are all now known to have mass, and many of their properties have been measured; it turns out that they are all too light to make up most of the dark matter. Many new particles have been proposed as dark matter candidates. It could, for example, be a new kind of neutrino with the opposite helicity to those already known – a right-handed neutrino corresponding to one or more of the known left-handed ones. Another possibility is the lightest supersymmetric partner of the known particles, if it turns out that supersymmetry is realized in nature. Yet another is a particle called the axion, postulated in 1977 by the physicists Peccei and Quinn to solve a difficult problem in the theory of the strong interactions. All of these particles could have properties which would make them behave like cold dark matter.
So there are at least three particle candidates for cold dark matter. Physicists have designed experiments to try to detect axions or the lightest supersymmetric particle, in case either of these really is the dark matter. If so, the Earth is continually moving through a sea of these particles, so there are always some of them in the lab which might be detected with sensitive enough equipment. So far nothing has been found, and for some of the candidates, like the supersymmetric particles, the constraints are now getting quite strong, so they are beginning to look unlikely as the right solution. The real difficulty is that, since we don’t know what the dark matter particles are, particle physicists can always invent new kinds of particles whose properties imply that they would have evaded detection in all experiments so far. Indeed, some of the candidates already suggested, for example the right-handed neutrinos, have such weak interactions that it seems unlikely they will ever be detected in a lab. Until you can find some property of dark matter other than its gravity, you don’t really have much of a clue about what it is.
It has always struck me that we have two things we don’t understand – the dark matter and the dark energy. Since both imply that there is additional physics we have not yet understood, maybe the two are related, and there is an interaction of some sort between them. If so, it might cause additional phenomena which we could observe astronomically, and so gain new insight into their properties.
But more generally, you just have to keep testing the theory you have more and more precisely until eventually it breaks. This is like what happens with quantum mechanics – if you keep testing things to smaller and smaller scales, you suddenly find that classical mechanics doesn’t work well any longer. In fact, we now know that if you go to small enough scales it doesn’t work at all.
Many people, for instance, have been enthusiastic about candidates for dark matter which are intermediate between cold dark matter and normal neutrinos, so-called warm dark matter. Some kinds of right-handed neutrinos could have this property. For such warm dark matter big galaxies would form in the same way as with cold dark matter, but the smallest galaxies would form differently because small-scale structure was wiped out in the initial conditions. This makes a difference to their observable structures, so in principle we may be able to distinguish between warm and cold dark matter by looking at small galaxies.
That’s an example where you could rule out one currently viable theory for the dark matter in favour of another by astronomical observations. Cases like these are always helpful, although this still would not allow us to say what the dark matter is, only what it is not. We have to push our understanding of things that are astronomically explorable to precise limits and when, at some point we find something that does not work, maybe this will be the clue.
Edited by Roman Varum