Dark energy

In 1994, two scientists in Chile decided to use their expertise to tackle

one of the fundamental questions in cosmology.

What is the fate of the Universe?

Specifically, in a Universe full of matter that is gravitationally attracting all other matter.

Logic dictates that the expansion of space, which began at the Big Bang

and has continued ever since, would be slowing.

But by how much?

Just enough that the expansion would eventually come to an eternal standstill?

Or so much that the expansion would eventually reverse itself,

in a kind of about-face Big Bang?

Another team, in California, was also working on the same subject.

In 1998, the two teams independently reached the same conclusion

as to how much the expansion of the Universe is slowing down.

It is not slowing down.

It is speeding up.


2023 marks the 25th anniversary of the discovery of evidence for ‘dark energy’.

Dark energy is a moniker for whatever is driving the acceleration.

Even then it meant next to nothing, yet encompassed nearly everything.

The coinage was almost a joke.

If dark energy were real, it would constitute two-thirds of all the mass and energy

in the Universe.

That is two-thirds of the Universe in its entirety.

Yet what that two-thirds of the Universe was remained a mystery.

A quarter of a century later that summary still applies.

Over the decades scientists have gathered ever more convincing evidence

of dark energy’s existence.

This effort continues to drive observational cosmology, if not to detect dark energy, then at least to define it.

Right from the start in 1998, scientists recognised that dark energy

presents an existential problem of more immediate urgency than the fate of the Universe:

the future of physics.


The mystery of why a Universe full of matter, all of it gravitationally attracting all other matter,

hasn’t collapsed in on itself has haunted astronomy

since Newton’s introduction of a universal law of gravitation.

In 1693, six years after the publication of his Principia,

Newton acknowledged that positing a Universe in perpetual equilibrium

is like making ‘an infinite number of needles stand accurately poised upon their points’.

He added that it is possible, at least by a divine power.

Stephen Hawking wrote in 1999 that Newton could have predicted

the expansion of the Universe.

In 1917, when Einstein applied his equations of general relativity to cosmology,

he confronted the same problem as Newton.

Unlike Newton, Einstein added to the equations not a divine power

but the Greek symbol lambda.

Lambda was an arbitrary mathematical shorthand

for whatever was keeping the Universe in perfect balance.

The following decade the astronomer Edwin Hubble

seemingly rendered lambda superfluous through his twin discoveries.

He discovered that other galaxies exist beyond our Milky Way,

and that those galaxies appeared to be receding from us in a fairly straightforward way.

The further the galaxy, the faster it appeared to be receding.

Perhaps the Universe had emerged from a single explosive event.

The 1964 discovery of evidence supporting the Big Bang theory

immediately elevated cosmology from metaphysics to hard science.

Six years later the astronomer Allan Sandage defined the science of Big Bang cosmology

as the search for two numbers.

One number was the rate of expansion now.

The second was the deceleration in the expansion over time.
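
In the standard notation of the field (an aside; these symbols do not appear in the text above), those two numbers are the Hubble constant and the deceleration parameter, defined from the cosmic scale factor a(t):

$$ H_0 = \left.\frac{\dot{a}}{a}\right|_{t_0}, \qquad q_0 = -\left.\frac{\ddot{a}\,a}{\dot{a}^{2}}\right|_{t_0} $$

A positive q0 would mean the expansion is slowing; the 1998 result described below amounts to finding a negative q0.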


Decades would pass before the first real investigation into the second number got underway.

Two collaborations more or less simultaneously started work on it.

Astronomers can determine galaxies’ velocities

(the rate at which stretching space is carrying them away from us)

by measuring how much their light has shifted to the red end of the visible portion

of the electromagnetic spectrum.

This is called the redshift.
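
In conventional notation (an aside, not spelled out in the text), the redshift z compares the observed wavelength with the emitted one, and for nearby galaxies it converts to a recession velocity:

$$ z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}}, \qquad v \approx c\,z \quad \text{for } z \ll 1 $$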

Determining their distance from us, however, is trickier.

It requires a ‘standard candle’, a class of objects whose light output doesn’t change.

A 100 watt light bulb, for instance, is a standard candle.

If we know its absolute luminosity is 100 watts,

we can apply the inverse-square law to its apparent luminosity

(how bright it looks to you at your current distance from it)

to calculate how far away it actually is.

According to the inverse-square law, when the distance from a star emitting light is doubled,

the intensity of the light from the star decreases by a factor of four.

That is, the total energy is spread over a larger area.
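
As an illustrative sketch (my own, not from the article; the flux reading is an invented number), this is how the inverse-square law turns an absolute luminosity and an apparent brightness into a distance:

```python
import math

def distance_from_luminosity(absolute_luminosity_w, apparent_flux_w_per_m2):
    """Invert the inverse-square law: flux = L / (4 * pi * d**2), so d = sqrt(L / (4 * pi * flux))."""
    return math.sqrt(absolute_luminosity_w / (4 * math.pi * apparent_flux_w_per_m2))

# A 100-watt bulb whose light arrives with a flux of 2e-4 W/m^2 (hypothetical reading).
d = distance_from_luminosity(100.0, 2e-4)
print(f"Distance: {d:.1f} m")  # about 199.5 m; quadruple the flux and the distance halves.
```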

The standard candle that Hubble used in plotting his diagram was a Cepheid variable,

a star that brightens and dims at regular intervals.

But Cepheid variables are difficult to detect at distances greater than 100 million light-years.

Astronomers trying to measure the rate of expansion over the history of the Universe

would need a standard candle they could observe at far greater distances.

These are the kinds of distances that charge-coupled-device detectors,

with their superior photon-collecting power, could probe.


A candidate for a standard candle emerged in the 1980s.

It was a type of supernova, now known as Type Ia: the explosion of a white dwarf when it accretes too much matter from a companion star.

The logic seemed reasonable.

If the cause of an explosion is always the same, then so should be the effect:

the explosion’s absolute luminosity.

Further investigations determined that the effect was not uniform.

Both the apparent brightness and the length of time

over which the ‘new star’ faded from visibility differed from supernova to supernova.

In 1992, a scientist recognised a correlation between a supernova’s absolute luminosity,

and the trajectory of its apparent brightness, from initial flare through diminution.

The supernovae had to become standardisable for the High-z (the z denotes redshift) collaboration.

Only then would the scientists be comfortable measuring the deceleration parameter.

Hubble’s original diagram had indicated a straight-line correlation of velocity and distance.

Scientists chose to plot redshift (velocity) on the x-axis,

and apparent magnitude (distance) on the y-axis.

Assuming that the expansion was in fact decelerating, 

at some point the line would have to deviate from its 45-degree beeline rigidity.

It would need to bend downwards, indicating that distant objects were brighter,

and therefore nearer, than one might otherwise expect.


From 1994 to 1997, the two teams used the major telescopes on Earth, 

and, crucially, the Hubble Space Telescope to collect data on dozens of supernovae

that allowed them to extend the Hubble diagram further and further.

By 1998, both teams had found evidence that the line indeed diverged from 45 degrees.

But instead of curving down, the line was curving up.

It indicated that the supernovae were dimmer than expected.

The expansion therefore wasn’t decelerating but accelerating.
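
Here is a minimal numerical sketch of that comparison (my own illustration, with assumed round-number parameters such as H0 = 70 km/s/Mpc, not the teams’ actual analysis). It computes the distance modulus, a logarithmic measure of faintness, versus redshift for a flat matter-only universe and for a flat universe that is roughly two-thirds dark energy; the second case predicts fainter supernovae at a given redshift, which is the upward bend the teams found.

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s
H0 = 70.0            # assumed Hubble constant, km/s/Mpc (illustrative value)

def distance_modulus(z, omega_m, omega_de):
    """Distance modulus mu = 5*log10(d_L / 10 pc) for a flat universe."""
    zs = np.linspace(0.0, z, 2000)
    integrand = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + omega_de)  # 1 / (H(z)/H0)
    # Trapezoidal integration gives the comoving distance in Mpc.
    comoving = (C_KM_S / H0) * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs))
    d_l = (1 + z) * comoving               # luminosity distance, Mpc
    return 5 * np.log10(d_l * 1e6 / 10.0)  # convert Mpc to pc, then take the modulus

for z in (0.1, 0.5, 1.0):
    mu_decel = distance_modulus(z, 1.0, 0.0)  # matter only: decelerating
    mu_accel = distance_modulus(z, 0.3, 0.7)  # two-thirds dark energy: accelerating
    print(f"z={z}: matter-only mu={mu_decel:.2f}, dark-energy mu={mu_accel:.2f} (fainter)")
```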

The conclusion was counterintuitive and, in its own way, as revolutionary

as the discovery that Earth is not at the centre of the Universe.

The astrophysics community accepted dark energy with alacrity.

They recognised that the two teams had arrived at the same result independently.

The two teams had used different data, that is, separate sets of supernovae.

One factor that was equally persuasive in consolidating consensus was scientific:

the result answered some major questions in cosmology.

How could a universe be younger than its oldest stars?

How did a universe full of large-scale structures, such as superclusters of galaxies,

mature so early as to reach the cosmological equivalent of puberty,

while it was still a toddler?

Problem solved!

An expansion that is speeding up now implies an expansion

that was growing less quickly in the past.

Therefore more time has passed since the Big Bang

than cosmologists had previously assumed.

The Universe is older than scientists had thought.

The toddler was a teenager after all.


The most compelling reason scientists were willing to accept the existence of dark energy

was that it made the Universe add up.

For years cosmologists had been wondering why the density of the Universe seemed so low.

According to the prevailing cosmological model at that time, and today, 

the Universe underwent an ‘inflation’.

The inflation started 10 to the power of minus 36 seconds after the Big Bang.

It finished 10 to the power of minus 33 seconds after the Big Bang.

In the interim the Universe increased in size by a factor of 10 to the power of 26.

Inflation thereby would have ‘smoothed out’ space, so that the Universe would look roughly the same in all directions, as it does for us, no matter where you are.

In scientific terms, the Universe should be flat.

And a flat Universe dictates that the ratio between its actual mass-energy density

and the critical density (the dividing line between eternal expansion and eventual collapse) should be 1.
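
In standard notation (an aside with the usual definitions, which the text does not spell out), the critical density is set by the present expansion rate, and ‘flat’ means the density parameters sum to one:

$$ \rho_{\mathrm{crit}} = \frac{3H_0^{2}}{8\pi G}, \qquad \Omega_{\mathrm{baryons}} + \Omega_{\mathrm{dark\;matter}} + \Omega_{\mathrm{dark\;energy}} = \frac{\rho_{\mathrm{total}}}{\rho_{\mathrm{crit}}} = 1 $$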

Before 1998, observations had indicated that the composition of the Universe

was nowhere near this critical density.

It was maybe a third of the way there.

Some of it would be in the form of baryons, meaning protons and neutrons, 

which make up all ordinary matter.

But most of it would be in the form of dark matter.

This component of the Universe is not accessible to telescopes

in any part of the electromagnetic spectrum, but it is detectable indirectly,

such as through its gravitational effects on the rotation rates of galaxies.

Dark energy would complete that equation.

Its contribution to the mass-energy density would be in the two-thirds range, 

just enough to reach critical density.

But astronomers needed to know: where was the empirical evidence?

Everywhere, it turned out.


One way to calculate the constitution of the Universe

is by studying the Cosmic Microwave Background, or CMB.

This phenomenon was discovered in 1964, and transformed cosmology into a science.

The CMB is an all-sky relic radiation dating to when the Universe was only 379,000 years old.

This was when atoms and light were emerging from the primordial plasma

and going their separate ways.

The CMB’s bath of warm reds and cool blues represents the temperature variations

that are the matter-and-energy equivalent of the Universe’s DNA.

Take that picture, then compare it with simulations of millions of universes,

each with its own amount of baryonic matter, dark matter and dark energy.

Hypothetical universes with no regular matter or dark matter and 100% dark energy,

or with 100% regular matter and no dark matter or dark energy, 

or with any combination in between, will each produce a unique colour pattern.
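
A heavily simplified sketch of that comparison logic (my own illustration; toy_pattern is an invented stand-in for a real CMB simulation, and every number is made up): generate a pattern for many candidate mixes of baryonic matter, dark matter and dark energy, then keep the mix that best matches the ‘observed’ one.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)

def toy_pattern(omega_b, omega_dm, omega_de):
    """Invented stand-in for a simulated CMB pattern; a real analysis would use a Boltzmann code."""
    return omega_b * np.sin(10 * x) + omega_dm * np.cos(7 * x) + omega_de * x

# Pretend the 'observed' sky was produced by a 5% / 27% / 68% universe, plus noise.
observed = toy_pattern(0.05, 0.27, 0.68) + rng.normal(0, 0.01, x.size)

best = None
for omega_b in np.arange(0.0, 1.01, 0.01):
    for omega_dm in np.arange(0.0, 1.01 - omega_b, 0.01):
        omega_de = 1.0 - omega_b - omega_dm  # flatness: the three shares sum to 1
        chi2 = np.sum((toy_pattern(omega_b, omega_dm, omega_de) - observed) ** 2)
        if best is None or chi2 < best[0]:
            best = (chi2, omega_b, omega_dm, omega_de)

print("Best-fitting mix (baryons, dark matter, dark energy):",
      tuple(round(v, 2) for v in best[1:]))
```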


The Wilkinson Microwave Anisotropy Probe, or WMAP, was launched in 2001.

It delivered data from 2003 to 2012, and provided one such census.

Planck, an even more precise space observatory, began collecting its own CMB data in 2009, 

and released its final results in 2018.

It corroborated WMAP’s findings.

The Universe is 4.9% ordinary matter, 26.6% dark matter, and 68.5% dark energy.

An obvious question has bothered scientists from the beginning.

What is it?

Dark energy does help the Universe add up on the macro scale,

the one that falls under the jurisdiction of general relativity.

On the micro scale, though, it does not make sense.

According to quantum physics, space is not empty.

It is a phantasmagoria of particles popping into and out of existence.

Each of these particles contains energy.

Scientists’ best guess is that this energy accounts for dark energy.

It’s a seemingly neat explanation, except that quantum physics

predicts a density value far larger than the two-thirds share astronomers initially suggested:

10 to the power of 20 times larger.

As the joke goes, even for cosmology, that’s a big margin of error.


In 1998, theorists got to work on shrinking that gap.

They eventually produced so much work that the interplay between observers and theorists

threatened to consume the community.

In 2007, the theorist Simon White wrote a controversial essay entitled

‘Why dark energy is bad for astronomy’.

The observers weren’t shy about expressing their frustration.

Adam Riess was the scientist who determined mathematically

that without the addition of lambda (dark energy) the supernova data

indicated a Universe with a negative mass density.

He was the lead author of the High-z discovery paper.

He dutifully checked new physics papers every day, 

but says he found most of the theories to be ‘pretty kooky’.

The scientist Brian Schmidt, in his presentations,

included a slide that listed 47 theories he had culled from 2,500 available in recent literature.

Some of the names of the theories were:

‘5-dimensional Ricci-flat bouncing cosmology’,

‘Dilatonic ghost condensate dark energy’,

‘Nambu-Goldstone boson quintessence’.

In 2007, he told an audience, ‘You tell us (observers) what you need,

and we will go and get it for you.’

There were endless theories of what dark energy might be, but they lacked credibility.

To find out what dark energy is, theorists need to know how it behaves.

For instance, does it change over space and time?

More precise observations are required to make progress.


Type Ia supernova surveys have continued to fill the Hubble diagram with more and more data points.

These data points are squeezing in, with more and more compact error bars.

Such uniformity might be more gratifying if theory could explain the observations.

Instead cosmologists find themselves having to go back and really make sure.

The trustworthiness of the seeming uniformity depends on the reliability

of the underlying assumptions.

These are the assumptions that drove the observations in the first place, 

and that continue to guide how astronomers try to measure supernova distances.

One problem is that there is almost certainly more than one physical mechanism

that causes a white dwarf in a binary system to explode.

Differing mechanisms might mean data that are not standardisable.

Another problem is that analyses of the chemical components of supernovae

have shown that older exploding stars contain lighter elements than more recent specimens.

This observation is consistent with the theory that succeeding generations of supernovae,

generate heavier and heavier elements.

Therefore, older or less evolved material arriving on a white dwarf in the past,

may change the nature of the explosion.

Even so, astronomers are still keen to use supernovae.

Scientists use a technique called ‘twins embedding’.

Rather than treating all Type Ia supernovae as uniform, like a species,

they examine the light properties of individual specimens, 

whose brightness in different wavelengths follows almost the same pattern over time.

Once they find matching twins, they try to standardise from this data.
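
A toy illustration of that matching idea (my own sketch, not the actual twins-embedding algorithm; the light curves and the threshold are invented): normalise each supernova’s brightness-versus-time curve and pair up the ones whose curves differ least.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 50, 60)  # days since explosion (arbitrary)

def fake_light_curve(peak, width):
    """Invented stand-in for a supernova light curve: a smooth rise-and-fall in flux."""
    return peak * np.exp(-((t - 15.0) ** 2) / (2 * width ** 2)) + rng.normal(0, 0.01, t.size)

curves = {name: fake_light_curve(peak, width)
          for name, (peak, width) in {
              "SN-A": (1.00, 8.0), "SN-B": (0.55, 8.1),    # similar shape, different brightness
              "SN-C": (0.90, 12.0), "SN-D": (0.48, 11.8),  # another pair of 'twins'
          }.items()}

def shape_distance(a, b):
    """Compare normalised curves, so twins are matched on shape rather than brightness."""
    na, nb = a / a.max(), b / b.max()
    return np.sqrt(np.mean((na - nb) ** 2))

names = list(curves)
for i, n1 in enumerate(names):
    for n2 in names[i + 1:]:
        d = shape_distance(curves[n1], curves[n2])
        if d < 0.05:  # invented threshold for calling two supernovae 'twins'
            print(f"{n1} and {n2} look like twins (shape distance {d:.3f})")
```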


Two facilities in Chile will undertake their own surveys, 

of thousands of southern-sky supernovae.

First the Vera C. Rubin observatory will locate the supernovae.

Then the 4-metre Multi-Object Spectroscopic Telescope

will identify their chemical components.

This will help to clarify how supernovae with more heavy elements, 

might explode differently.

As for space telescopes, scientists continue to mine supernovae in the Hubble archives.

The James Webb Space Telescope (JWST), 

will eventually turn its attention to high redshift supernovae, 

once the telescope has addressed more of its primary goals.

Scientists are eagerly anticipating the Nancy Grace Roman Space Telescope,

a successor to JWST that is due for launch in 2027.


Surveying supernovae is not the only way to measure dark energy.

One alternative is to study Baryon Acoustic Oscillations (BAO), sound-like waves

that formed as baryon particles bounced off one another

in the hot and chaotic early universe.

When the Universe cooled enough for atoms to coalesce, 

these waves froze, and they are still visible in the cosmic microwave background.

Similar to the way supernovae serve as standard candles, 

providing a distance scale stretching from our eyeballs across the Universe,

BAOs provide a standard ruler, a length scale for lateral separations across the sky.

Scientists can measure the distances between densities of oscillations in the CMB,

then trace the growth of those distances over space and time, 

as these densities gather into clusters of galaxies.
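
In standard notation (a hedged aside, not from the text above), the frozen wave has a known physical size, the sound-horizon scale r_d, so the angle it subtends on the sky at a given redshift encodes a distance, much as a standard candle’s apparent brightness does:

$$ \theta_{\mathrm{BAO}}(z) \approx \frac{r_d}{D_M(z)} $$

where D_M(z) is the comoving distance out to that redshift.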

Some scientists believe that this is the best way, 

to trace the expansion history of the Universe.

Astronomers are awaiting the results from two major BAO surveys

that would allow them to reconstruct cosmic evolution

at ever earlier eras across the Universe.

The Dark Energy Spectroscopic Instrument (DESI), on a telescope in Arizona,

is collecting optical spectra (light broken up into its constituent wavelengths),

for about 35 million galaxies, quasars and stars.

From this data astronomers will be able to construct a 3D map, 

extending from the nearby objects back to a time, 

when the Universe was about a quarter of its present age. 

The first batch of data contained nearly two million objects, which scientists are studying.

The Prime Focus Spectrograph, in Hawaii, will begin following up on DESI’s results

at even greater distances, and complete its own 3D map. 

The European Space Agency’s Euclid spacecraft

will contribute its own survey of galaxy evolution to the BAO catalogue.

It will also employ the second non-supernova method for measuring

the nature of dark energy: weak gravitational lensing.


This relatively new approach exploits a general relativistic effect.

Sufficiently massive objects, such as galaxies

or galaxy clusters, can serve as magnifying glasses for far more distant objects,

because of the way mass bends the path of light.
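
For reference, the standard general-relativistic result behind this effect (textbook physics, not stated in the text above): a light ray passing a mass M at a closest-approach distance b is deflected by a small angle

$$ \alpha \approx \frac{4GM}{c^{2}b} $$

so more mass along the line of sight means more bending and more magnification.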

Astronomers can then chart the growth of galactic clustering strength

over time to track the competition between the gravitational attraction of matter

and the repulsive effect of dark energy.

Euclid’s data should be available in the next few years.

Since the discovery of acceleration, cosmologists have been hoping for an experiment

that would provide 20 times more precision.

We have the possibility, in the next few years,

of seeing what happens when we get to that level.


1998 is considered the breakthrough year for dark energy.

In 2011, the Nobel Prize was awarded to the leading scientists of dark energy:

Perlmutter, Riess and Schmidt.

Dark energy became an essential component of the standard cosmological model, 

along with baryonic matter, dark matter and inflation.

Yet the possibility exists that some fundamental assumption is wrong.

Some scientists posit that we might have an incorrect understanding of gravity.

Such an error would skew the data.

In that case the BAO measurements

and Euclid’s weak gravitational lensing results will diverge.

From the scientific perspective this would not be bad.

Scientists always look forward to discovering something new.

Maybe the upcoming deluge of data will help scientists discern

how dark energy behaves over changing space and time.

This will go a long way towards determining the fate of the Universe.

Until then we have to be content with the modest conclusion: to be continued.