In every bit of nothing, there is something. If you zoom in on empty space and take out all the planets and stars and galaxies, you might expect a pure vacuum, but you’d be wrong. Instead you would find a dynamic scene, with particles sparking to life and disappearing almost immediately.
Quantum mechanics, the theory governing the infinitesimal world, doesn’t allow for nothingness. At any given moment in time and space, energy can never be perfectly zero—there is always some wiggle room. Out of that wiggle room, “virtual” particles can arise—specifically, a pair made of a particle and its antiparticle, which annihilate each other and are gone as quickly as they came. As bizarre as this may seem, experiments have observed the real-world effects of virtual particles. When particle accelerators first measured the mass of the Z boson, the measured value was slightly shifted from the boson’s “bare” mass because it sometimes briefly fluctuated into virtual top quarks—one of many observations confirming that virtual particles are real.
The effect of all these particles wiggling into and out of being is a thrumming “vacuum energy” that fills the cosmos and pushes outward on space itself. This activity is the most likely explanation for dark energy—the reason the universe, rather than staying static or even expanding at a steady rate, is accelerating outward faster and faster every moment.
The problem with vacuum energy is that there’s not enough of it. When scientists first started thinking about the concept, they calculated that this energy should be huge—it should have expanded the universe so forcefully and quickly that no stars and galaxies ever formed. Because that is clearly not the case, the vacuum energy in the universe must be very small—about 120 orders of magnitude smaller than what quantum theory predicts. That’s like saying that something weighing five pounds should really weigh five-with-120-extra-zeros-after-it pounds. The discrepancy has prompted some scientists to call vacuum energy “the worst theoretical prediction in the history of physics.”
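To get a feel for the size of that mismatch, here is a rough back-of-the-envelope comparison. The energy densities below are illustrative order-of-magnitude figures only (the exact predicted value depends on where the high-energy cutoff in the quantum calculation is placed), chosen to reproduce the roughly 120-order gap described above:

```python
import math

# Illustrative order-of-magnitude figures, not precise measurements:
predicted_density = 1e111  # J/m^3, naive quantum field theory estimate
observed_density = 1e-9    # J/m^3, rough dark-energy density from cosmology

ratio = predicted_density / observed_density
print(f"prediction exceeds observation by ~10^{math.log10(ratio):.0f}")  # ~10^120
```

Different choices of cutoff shift the predicted figure by a few orders of magnitude, which is why quoted values of the discrepancy range from about 120 to 123 orders of magnitude.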
Vacuum energy is thought to be the main ingredient in the “cosmological constant,” a mathematical term in the equations of general relativity. The enormous discrepancy between the predicted amount of vacuum energy and the measured amount is often called the cosmological constant problem. “It’s generally regarded as one of the most awkward, embarrassing, difficult problems in theoretical physics today,” says Antonio Padilla, a physicist at the University of Nottingham in England, who has spent 15 years trying to figure it out. “It suggests there’s something missing in our story. I find it exciting—why would you not want to work on that?”
The riddle has enticed some of the greatest minds in physics and elicited a plethora of ideas to solve it. Last year New York University physicist Gregory Gabadadze spent an hour summarizing all the concepts theorists have come up with so far in a talk at the Brown University physics department. At the end, one of the audience members asked him which of the ideas he favored. “None of them,” Gabadadze replied. They are all too “radical,” he said, and all require “giving up sacred principles.”
But some physicists say new theoretical work is injecting excitement into the quandary. And recent advances in precision laboratory experiments that probe gravity, as well as the advent of gravitational-wave astronomy, offer hope that some of the proposed solutions to the problem could finally be put to the experimental test—or, at the very least, ruled out.
The Birth of a Problem
The cosmological constant has a checkered history. “It was what you could call a nonsolution to a nonproblem,” says physicist Rafael Sorkin of the Perimeter Institute for Theoretical Physics in Ontario. Albert Einstein first invented it in 1917 as a mathematical kludge to force his general relativity field equations to predict a static universe, as he and most scientists then believed the cosmos to be. But in 1929 astronomer Edwin Hubble measured the speeds of many galaxies and found, to his surprise, that they were all moving away from us—in fact, the farther away a galaxy was, the faster it was receding. His measurements showed that space is expanding everywhere, and no matter where you look, it will seem as if all galaxies are receding because the distance between everything is constantly growing. Faced with this news, Einstein decided a couple of years later to remove the cosmological constant from his equations, calling it “my biggest blunder,” according to physicist George Gamow.
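Hubble’s distance–speed relation is now written as the simple proportionality v = H0 × d. A minimal numerical sketch, using a modern round-number value of H0 (an assumption for illustration, not a figure from Hubble’s era):

```python
H0 = 70.0  # km/s per megaparsec; round modern value, for illustration only

def recession_velocity_km_s(distance_mpc):
    """Hubble-Lemaitre law: recession speed grows in proportion to distance."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at ~7,000 km/s;
# a galaxy twice as distant recedes twice as fast.
print(recession_velocity_km_s(100))  # 7000.0
print(recession_velocity_km_s(200))  # 14000.0
```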
For a while the cosmological constant was a footnote of history, but it was quietly preparing for a comeback. In the late 1990s two teams of astronomers were competing to measure how much the expansion of the universe was slowing down as a result of gravity pulling matter inward. In 1998 and 1999 they published their results, based on measurements of special supernovae whose distances could be determined very accurately. The most distant of these supernovae turned out to be much dimmer, and therefore farther away, than expected. The expansion wasn’t slowing down at all—it was speeding up. This alarming discovery won three of the teams’ leaders a Nobel Prize and prompted cosmologist Michael Turner to coin the term “dark energy” for the mysterious force causing the acceleration. Immediately physicists suggested that the source of dark energy might be the cosmological constant—in other words, vacuum energy. “Perhaps there was more insight in Einstein’s blunder than in the best efforts of ordinary mortals,” Saul Perlmutter, one of the discoverers of the acceleration, later wrote.
Although the cosmological constant allowed scientists to balance the Einstein field equations again, making them predict an accelerating universe like the one astronomers had observed, the value of the constant didn’t make sense. It actually worsened a problem that had been bothering scientists for a while. In the years that the constant lay on the cutting-room floor, physicists had linked this term from general relativity with the concept of vacuum energy from quantum mechanics. But the vacuum energy was supposed to be huge.
One of the first people to notice something was amiss was physicist Wolfgang Pauli, who found in the 1920s that this energy should be so strong that the cosmos should have expanded long past the point where light could traverse the distance between any of the objects in it. The whole of the observable universe, Pauli calculated, “would not even reach to the moon.” He was reportedly amused by his estimate, and no one took it seriously at the time. The first to formally calculate the value of the cosmological constant based on quantum theory’s predictions for the vacuum energy was physicist Yakov Zel’dovich, who found in 1967 that the energy should make the cosmological constant gigantic. But at the time, scientists thought the universe was expanding at a steady or slowing rate, and most believed the cosmological constant to be zero. The cosmological constant problem was born.
Thirty years later, when astronomers realized that the expansion of the cosmos was accelerating, the problem didn’t go away. The amount of acceleration, though shocking at the time, was still minuscule compared with what quantum theory said it should be. In a way, reviving the cosmological constant made the predicament worse. It was one thing to try to imagine why the constant might come out to precisely zero. It became more difficult to understand why it might be just slightly more than nothing. “Its value is very weird,” says theoretical physicist Katherine Freese of the University of Texas at Austin. “Even weirder than zero.”
Not everyone agrees that this is a problem in need of fixing. The cosmological constant is technically just a constant of nature, a number in an equation that can take on any value, says Sabine Hossenfelder, a theoretical physicist at the Frankfurt Institute for Advanced Studies in Germany. In her view, the fact that it has the value it has is just a numerical coincidence. “You could just take the constant and be done with it,” Hossenfelder says. “All these debates about why does it have the value it has are not scientifically good questions,” she says. Nothing about quantum field theory was falsified when its prediction didn’t match astronomical measurements, and the theory is still as useful as it ever was. “I think most people in the cosmology and astrophysics community believe it’s a problem because they’ve been told that for a long time.”
Yet many physicists cannot let it go. The unexpected smallness of the cosmological constant is a thread that needs pulling. “It bothers me a lot,” Gabadadze says, “and I want some answers.”
Despite the zeal many physicists have for attacking the question, the pace of progress has been frustratingly slow. “It’s been more than 50 years since Zel’dovich really pointed out what the problem was, and there’s certainly no established, accepted explanation,” Padilla says. “Ideas come and go, but generally very little sticks.”
Most proposed solutions to the cosmological constant problem fall into three categories: change the general relativity equations that describe the expansion of the universe, modify the quantum field theory equations that predict the amount of vacuum energy, or throw something entirely new into the mix.
Tweaking general relativity could change the mathematical role the cosmological constant plays—or cut it out altogether. Freese and her colleagues, for instance, sought to eliminate the need for the constant to explain the acceleration of the universe by altering the way general relativity calculations should be applied to the expanding cosmos. “Matter and photons might be enough, without adding any new component to the universe, if their role in the equations is different,” she says. Her model is based on the idea that extra dimensions, beyond the three of space and one of time that we witness, might be hidden out of sight.
Another angle on updating general relativity is called sequestration, proposed by Padilla and his colleagues. They modify Einstein’s theory in a way that seals gravity off so it cannot feel the effects of vacuum energy. “I’m not going to pretend this is the established model,” Padilla adds, “but no one’s been able to rule it out.”
If general relativity isn’t the problem, though, maybe quantum mechanics is. Some theorists have suggested that the quantum field theory method of calculating vacuum energy is off. Stefan Hollands of the University of Leipzig in Germany and his colleagues take issue with applying the regular quantum equations to curved spacetime, saying they were designed with flat space in mind. If physicists could correctly modify them for curved space, they argue, the cosmological constant problem would go away.
But the resolution might require more than just mathematically finagling the traditional equations. One recent unorthodox idea is a proposal by Steve Carlip of the University of California, Davis, that spacetime is fundamentally made of “foam.” In this picture, the curvature of space would constantly fluctuate on extremely small scales, well beyond anything we could hope to measure. All this complicated topology would cancel out much of the impact of the cosmological constant, making it very small at the local level. “It’s kind of a wild idea,” Carlip says. “It is a desperate measure, but so is every other attempt to deal with the cosmological constant, and these are desperate times.”
Sorkin, who says Carlip’s spacetime foam is “going in the right direction,” also has his own entry in the field. He works on an approach to unifying quantum mechanics and gravity called causal set theory. According to this model, spacetime is fundamentally discrete—meaning that instead of being a smooth, continuous expanse, it is broken up into tiny chunks, individual units of space and time that represent the building blocks of the universe just as atoms are the building blocks of matter. If this is the case, calculating the cosmological constant involves dividing by the number of spacetime units in the universe, leading to a value much closer to what astronomers observe.
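The “dividing by the number of spacetime units” step can be sketched numerically. In Sorkin’s causal-set reasoning, the cosmological constant fluctuates around zero with a magnitude of order 1/√N in Planck units, where N is the number of discrete elements in the observable universe’s spacetime volume. The value of N used below is a standard order-of-magnitude estimate, not a measured quantity:

```python
import math

# Order-of-magnitude estimate: the observable universe's spacetime
# volume contains about 10^244 Planck-scale elements.
N = 1e244

# Causal-set heuristic: Lambda fluctuates with magnitude ~ 1/sqrt(N)
# in Planck units.
lambda_estimate = 1.0 / math.sqrt(N)
print(f"Lambda ~ 10^{math.log10(lambda_estimate):.0f} in Planck units")  # ~10^-122

# For comparison, the naive quantum field theory expectation is of
# order 1 in Planck units; the observed value is roughly 10^-122,
# close to this heuristic estimate.
```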
One of the most prominent—and, to some, most hated—solutions to the cosmological constant problem is called the anthropic principle. This line of thinking agrees that the cosmological constant in our universe has an unlikely value but explains it by saying we live in a multiverse. If ours is just one bubble in a cosmic sea, with different physical laws and constants in each, then there was bound to be one with this value. Most of the others would not lead to a universe with galaxies, stars, planets or life, so the fact that we find ourselves in one of the outliers is only to be expected. Because string theory requires a multiverse, string theorists tend to regard the cosmological constant problem as essentially solved by this reasoning. Other physicists, though, consider this philosophy a cop-out. “It’s giving up on the problem,” Sorkin says.
All these strategies tend to involve rather dramatic revisions of established physics. “Every single one of them calls for a major revamping of basic principles, either of spacetime, say, or the number of dimensions of the universe,” Gabadadze says. “They are all distasteful in some way.” No single theory has clearly risen above the rest. “At this point it becomes a matter of taste,” Carlip says. “Probably the answer is something that nobody’s thought of.”
Constancy or Quintessence?
The cosmological constant remains the best explanation for dark energy—the mysterious force causing the expansion of space to accelerate. But what if dark energy isn’t actually related to the cosmological constant or vacuum energy at all? What if the universe’s vacuum energy is somehow perfectly canceled out and the cosmological constant is zero? In that case, dark energy might be the work of something called quintessence.
The notion of quintessence was introduced in 1998 by physicists Robert Caldwell, Paul Steinhardt and Rahul Dave as an alternative explanation for the accelerating expansion of the universe. Quintessence would be some form of energy throughout space with a negative pressure. In contrast to the cosmological constant, quintessence could change over time. One version of quintessence, called phantom energy, postulates an energy whose density increases with the age of the universe, leading to an ultimate “big rip” when space is torn apart by runaway expansion until the distance between particles becomes infinite.
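Cosmologists usually capture this distinction with the “equation of state” parameter w: for a constant w, the standard cosmological relation says the dark-energy density scales with the cosmic scale factor a as a^(−3(1+w)). A cosmological constant has w = −1 exactly, quintessence allows other values, and phantom energy corresponds to w < −1. A minimal sketch (the specific non-constant w values are illustrative):

```python
def density_ratio(a, w):
    """Dark-energy density at scale factor a, relative to today (a = 1),
    for a constant equation-of-state parameter w."""
    return a ** (-3 * (1 + w))

# As the universe doubles in size (a = 2):
print(density_ratio(2.0, -1.0))  # cosmological constant: exactly 1.0
print(density_ratio(2.0, -0.8))  # quintessence-like: density dilutes
print(density_ratio(2.0, -1.2))  # phantom energy: density grows -> "big rip"
```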
To test whether dark energy is caused by quintessence or the cosmological constant, scientists must determine whether the strength of dark energy has changed over time. Various projects have been gathering data about the expansion rate of space at different cosmic epochs. One example is the Dark Energy Survey, a six-year effort to map galaxies at many distances across a large area of the sky using the Victor M. Blanco Telescope in Chile. The survey’s data are in, but scientists are still analyzing them—so far all signs point to dark energy being constant. Another way to find out whether quintessence is real is to look for evidence that this energy has caused the fundamental constants of nature to change over time. No indications of inconstant constants have yet emerged.
Over the next couple of decades experiments should give scientists a better idea of whether the cosmological constant (and the vacuum energy behind it) is the source of dark energy. The Vera C. Rubin Observatory Legacy Survey of Space and Time, planned to begin in 2022 on a telescope currently under construction in Chile, should dramatically improve the precision of current measurements of the history of cosmic expansion. Soon scientists should be able to say much more clearly whether there’s room in the data for quintessence or whether an unchanging force has been at work.
Spacetime Ripples and Neutron Stars
If, as the evidence seems to show so far, dark energy is truly a result of the cosmological constant, there is still some hope of sorting through the various proposed explanations for its unexpected smallness. Upcoming experiments and astronomical observations may offer a way to discriminate between the proliferation of theories, weeding out some and, just maybe, offering support for others.
Five years ago scientists gained a whole new lens to study the cosmos when they began to detect gravitational waves, the ripples in spacetime produced by the collision of huge masses such as black holes and neutron stars. Gravitational-wave observatories such as LIGO (the Laser Interferometer Gravitational-wave Observatory) in the U.S. and Virgo in Europe are now regularly spotting waves produced by cosmic cataclysms, and these waves may prove useful in probing the nature of vacuum energy. Some attempts to solve the cosmological constant problem rely on changes to general relativity that would cause gravity to travel slightly slower than the speed of light. The fact that gravitational waves seem to arrive simultaneously with light from the same events has quashed that idea, ruling out a few theories already. “We had a model 10 years ago called the Fab Four that was aimed at solving the cosmological constant problem,” Padilla says. “I’d already started to doubt it, but gravitational-wave data killed it.”
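The simultaneity constraint can be given rough numbers. In the 2017 neutron-star merger, light and gravitational waves traveled on the order of 130 million light-years and arrived within about two seconds of each other; the figures below are rounded and used only to show the scale of the resulting speed bound:

```python
SECONDS_PER_YEAR = 3.15e7  # approximate

# Rounded figures from the 2017 neutron-star merger observation:
travel_time_s = 130e6 * SECONDS_PER_YEAR  # ~4e15 s of travel time
arrival_gap_s = 2.0                       # observed arrival-time gap

# Any fractional difference between the two speeds is bounded by
# (arrival gap) / (travel time):
fractional_bound = arrival_gap_s / travel_time_s
print(f"|v_gw - c| / c < ~{fractional_bound:.0e}")  # ~5e-16
```

A bound this tight leaves essentially no room for theories in which gravity propagates measurably slower than light.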
Gravitational waves are also revealing strange activity inside neutron stars. These compact remnants of supernovae are so dense that atoms have collapsed, their protons and electrons smashing together to form a mass of almost pure neutrons. This bizarre state gives rise to strange phenomena—for instance, the core of a neutron star might contain a novel phase of matter that would cause a jump in the amount of vacuum energy inside it. Gravitational-wave observatories might be sensitive to the gravitational effects of the extra vacuum energy here, potentially revealing secrets about the nature of vacuum energy.
And while astrophysics experiments search for clues on a cosmic scale, experiments a bit closer to home might also help researchers sort through the cosmological constant hypotheses. Lab setups that probe the universe at the smallest possible distances could be sensitive to some of the alterations of general relativity that physicists are proposing.
An example is the work of the Eöt-Wash group at the University of Washington, where scientists are using an extremely sensitive balance experiment to conduct precision tests of gravity. Their instrument is called a torsion balance: a metal disk with holes cut out of it hangs down from a fine wire, with a similar disk right below it that rotates at a constant rate. The two are separated by distances akin to the width of a piece of paper, and as the bottom disk rotates, its gravitational force causes the upper disk to twist back and forth.
This extremely sensitive experiment allows researchers to track how gravity behaves on scales down to tens of millionths of a meter. If the gravitational force weakens at such close quarters, as some ideas suggest—or if extra, minute dimensions of space are discernible there—the Eöt-Wash measurements should reveal the signs. So far gravity has followed Newton and Einstein’s laws to the letter in their tests, and no hidden dimensions have been seen, but the scientists keep adjusting their balance to probe smaller and smaller separations. Even if the group never detects deviations that affect vacuum energy, that won’t necessarily be conclusive: it is possible that such changes occur only at distances beyond our reach.
“We’ll keep trying,” Gabadadze says of attempts to test cosmological constant hypotheses with experiments. “Every generation of physicists since 1960 or so has seen new solutions emerging. Maybe one day some of them will have observational predictions that can be tested, but at this point we’re not there.” Despite the difficulty of the puzzle, he and other physicists still hope for a solution soon. Perhaps these efforts to understand the cosmological constant problem will reveal deeper truths about quantum physics and general relativity. Or maybe scientists will discover a simpler fix. And even while they’re seeking a solution that may never materialize, many physicists revel in the quest.