Dark Matter, Dark Energy, and the Open Frontier
Ordinary matter is 5% of the universe. The other 95% — dark matter and dark energy — is inferred from gravitational and cosmological evidence but not directly detected. Physics' most successful century ended with most of the universe unaccounted for.
The Problem with the Universe
Twentieth-century physics ended with an uncomfortable accounting problem. The Standard Model of particle physics, tested to extraordinary precision, describes the particles and forces that make up all ordinary matter. General relativity, confirmed across every available test, describes gravity and the large-scale structure of spacetime. Between them, these theories should be able to describe everything.
They can’t. Ordinary matter — protons, neutrons, electrons, the stuff everything we can see and touch is made of — constitutes approximately 5% of the total energy content of the universe. About 27% is dark matter: something that has gravitational effects but does not interact electromagnetically. About 68% is dark energy: something causing the expansion of the universe to accelerate. Neither appears in the Standard Model. Neither is explained by general relativity except as a parameter to be inserted from observation.
The situation is peculiar. Physics’ most successful century, which produced theories of unprecedented predictive power, ended with most of the universe unaccounted for. Not approximately accounted for — literally absent from the theoretical framework.
Dark Matter: The Evidence
The dark matter case rests on multiple independent lines of evidence, each pointing to the same conclusion.
Galaxy rotation curves. In the 1970s, Vera Rubin and Kent Ford measured the rotation velocities of stars in spiral galaxies at different distances from the galactic center. Newtonian gravity predicts that stars far from the galactic center should orbit more slowly, just as outer planets in the solar system move more slowly than inner ones — the gravitational pull weakens with distance from the central mass concentration. Rubin and Ford instead found flat rotation curves: stars at large distances from the center move at approximately the same velocity as stars closer in, and in some cases faster. The only explanation consistent with Newtonian gravity is that the galaxies contain far more mass than the visible stars and gas — distributed in an extended halo that encompasses and dwarfs the visible disk.
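The tension can be shown in a few lines. The sketch below (illustrative round numbers, not a fit to any real galaxy) compares the Keplerian prediction v(r) = √(GM/r) for a centrally concentrated visible mass against a typical observed flat-curve speed, and asks how much mass the flat curve implies:

```python
import math

# Illustrative sketch of the rotation-curve argument. All values are
# assumed round numbers, not measurements of a particular galaxy.
G = 4.30e-6          # gravitational constant in kpc * (km/s)^2 / M_sun
M_visible = 1.0e11   # assumed visible mass, in solar masses

def v_keplerian(r_kpc):
    """Orbital speed (km/s) if essentially all mass sits inside radius r."""
    return math.sqrt(G * M_visible / r_kpc)

v_flat = 220.0  # km/s, a typical observed flat-curve speed for a spiral

for r in (5, 10, 20, 40):
    # Mass that must lie inside radius r to sustain the observed flat speed:
    M_needed = v_flat**2 * r / G
    print(f"r = {r:2d} kpc: Keplerian v = {v_keplerian(r):5.1f} km/s, "
          f"flat curve implies {M_needed / M_visible:.1f} x visible mass")
```

The Keplerian speed falls as 1/√r, while a flat curve requires enclosed mass to keep growing linearly with radius — the extended halo the text describes.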
Gravitational lensing. Massive objects bend light, as general relativity predicts. The amount of bending depends on the amount of mass. Maps of gravitational lensing around galaxy clusters consistently show mass distributions that are larger and more extended than the distributions of visible matter. The Bullet Cluster — two galaxy clusters that have collided and passed through each other — provides particularly clean evidence: the hot gas (visible via X-ray emission) was slowed by electromagnetic interactions during the collision and is concentrated in the collision region, while the gravitational lensing map shows mass concentrated elsewhere, in the regions where the dark matter of the two clusters passed through unimpeded.
Cosmic microwave background. The CMB is the afterglow radiation from the early universe, emitted when the universe cooled enough for protons and electrons to combine into neutral hydrogen atoms and became transparent to radiation. The temperature fluctuations in the CMB encode the acoustic oscillations of the early universe’s plasma — the pattern of compressions and rarefactions driven by the competition between gravity (which pulls matter together) and radiation pressure (which pushes it apart). The detailed pattern of these oscillations depends sensitively on the ratio of ordinary matter to dark matter in the early universe. The observed CMB matches, to high precision, predictions that include dark matter at roughly five times the density of ordinary matter — consistent with the other lines of evidence.
Dark Matter: The Candidates
What dark matter actually is remains unknown. The leading candidate for decades has been WIMPs — Weakly Interacting Massive Particles, hypothetical particles with masses in the range of 10–1000 GeV that interact only through gravity and the weak nuclear force. They arise naturally in supersymmetric extensions of the Standard Model. They would have been produced in the right abundance in the early universe (the “WIMP miracle”) to match the observed dark matter density.
WIMP searches have come up empty. Direct detection experiments — detectors placed deep underground to shield them from cosmic ray background, waiting for a WIMP to scatter off a nucleus — have improved in sensitivity by many orders of magnitude over thirty years without finding a signal. The absence of supersymmetric particles at the LHC has eliminated large regions of WIMP parameter space. WIMPs are not ruled out, but the parameter space where they could exist is shrinking.
Alternative candidates include axions (very light particles arising from the Peccei-Quinn mechanism, a proposed solution to the strong CP problem in QCD), sterile neutrinos (heavier cousins of the known neutrinos that do not feel the weak force directly), and primordial black holes (formed from density fluctuations in the early universe rather than from collapsed stars, a possibility that observations, including LIGO’s, have constrained significantly). Some physicists have proposed modifications to gravity (MOND — Modified Newtonian Dynamics) that eliminate the need for dark matter entirely, but these proposals have difficulty explaining the full range of evidence, particularly the CMB acoustic oscillations, without reintroducing something dark-matter-like.
Dark Energy: The Accelerating Universe
In 1998, two independent teams studying Type Ia supernovae to measure the universe’s deceleration — they expected the expansion to be slowing under gravity — found instead that the expansion is accelerating. Distant supernovae were dimmer than expected, meaning they were farther away than a decelerating universe would place them. The universe is not just expanding; the expansion is speeding up.
For this to happen, there must be something with negative pressure — something that pushes space apart rather than pulling it together. Einstein’s general relativity permits this through the cosmological constant Λ (lambda) — a term Einstein himself introduced in 1917 to allow a static universe and later removed when Hubble discovered the expansion, reportedly calling it his “biggest blunder.” The cosmological constant represents the energy density of empty space — vacuum energy. A positive cosmological constant causes accelerating expansion.
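Why negative pressure is required can be read off the Friedmann acceleration equation of standard cosmology (a textbook result, sketched here rather than derived):

```latex
% Friedmann acceleration equation for a homogeneous, isotropic universe:
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
% A cosmological constant behaves as a fluid with p = -\rho c^{2}, so
% \rho + 3p/c^{2} = -2\rho < 0, giving \ddot{a} > 0: expansion accelerates.
```

Any component with p < −ρc²/3 drives acceleration; the cosmological constant, with equation of state w = p/(ρc²) = −1, is the simplest such component.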
The observations fit a model (ΛCDM — Lambda Cold Dark Matter) in which the universe’s energy content is 5% ordinary matter, 27% cold dark matter, and 68% cosmological constant / dark energy. The model is extraordinarily successful at explaining the large-scale structure of the universe, the CMB, and the expansion history.
The theoretical problem: naive estimates in quantum field theory, which sum the contributions of vacuum modes up to the Planck scale, give a vacuum energy density roughly 10¹²⁰ times the observed value. The ratio between the predicted vacuum energy and the observed cosmological constant is the largest known discrepancy between theoretical prediction and observation in the history of science. Either our theoretical understanding of vacuum energy is catastrophically wrong, or the cosmological constant is being set by some fine-tuning mechanism that cancels the vast predicted vacuum energy almost but not exactly to zero — leaving the observed tiny remainder.
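The size of the discrepancy is a back-of-envelope calculation. Cutting the vacuum modes off at the Planck scale gives roughly the Planck density, c⁵/(ħG²); the observed dark energy density is about 68% of the critical density, 3H₀²/(8πG). The exact exponent depends on conventions (quoted figures range from 10¹²⁰ to 10¹²³), but the sketch below shows the order of magnitude:

```python
import math

# Back-of-envelope version of the cosmological constant problem.
# Constants are rounded CODATA-level values; the point is the exponent.
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
H0   = 2.27e-18    # Hubble constant (~70 km/s/Mpc) in 1/s

# Planck-scale estimate of the vacuum mass density: c^5 / (hbar * G^2)
rho_planck = c**5 / (hbar * G**2)             # kg/m^3

# Observed dark energy density: ~68% of the critical density
rho_crit   = 3 * H0**2 / (8 * math.pi * G)    # kg/m^3
rho_lambda = 0.68 * rho_crit

ratio = rho_planck / rho_lambda
print(f"Planck-cutoff estimate: {rho_planck:.2e} kg/m^3")
print(f"Observed dark energy:   {rho_lambda:.2e} kg/m^3")
print(f"Discrepancy: ~10^{math.log10(ratio):.0f}")
```

With these values the mismatch comes out near 10¹²³ — the same catastrophic gap the text describes, whichever convention is used.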
The anthropic argument — that the cosmological constant has the value it has because a much larger value would have prevented galaxy formation, and hence observers, from existing — is among the few currently available explanations, and most physicists find it unsatisfying as physics.
What the Open Problems Are Actually Saying
The situation in fundamental physics circa the early twenty-first century is unusual. The mature theories — general relativity, the Standard Model — work with extraordinary precision within their domains. The open problems — dark matter, dark energy, quantum gravity, the matter-antimatter asymmetry, the origin of the cosmological constant — are not small refinements. They concern the majority of the universe’s content, the fundamental incompatibility of the two best theories, and the origin conditions of the universe.
This is not physics in crisis. The experimental tools are better than they have ever been: LIGO has opened a gravitational wave window on the universe, the LHC continues collecting data, CMB experiments are pushing to higher precision, dark matter searches are probing new regions of parameter space. The James Webb Space Telescope is observing galaxies in the early universe with unprecedented clarity, probing the epoch of reionization and the first star formation.
What the situation requires is theoretical creativity to match the experimental capability — proposals for physics beyond the Standard Model that make testable predictions at accessible energies, and approaches to quantum gravity that yield observable consequences rather than only Planck-scale predictions. The twentieth century’s great theories provided the tools to ask questions the twenty-first century has to answer.
The 95% of the universe that physics can’t yet describe is not a problem with the data. The data has been telling us about it for fifty years. It is a problem that requires new ideas.