All the light ever created in the Universe is still out there, darting across vast expanses. But we only get to see the light created by stars as they existed at a certain point in the past: the farther the star, the earlier in the Universe's history we see it.
Astronomers have devised a way of measuring the amount of light created in the Universe since the beginning of time.
It's not a very complicated method, but it does require thinking about light in ways we're not accustomed to. Astronomers used NASA's Fermi Gamma-ray Space Telescope to make the measurement.
Fermi is a gamma-ray telescope: it can detect the most energetic sources of light in the Universe. Only a few kinds of objects in the night sky emit gamma rays, and most of them are blazars.
Blazars are active galactic nuclei so energetic that their emission blasts past the X-ray band, which alone would classify them as quasars, and right into the gamma-ray spectrum.
Astronomers looked at gamma rays coming from 150 blazars, each emitting light above 3 GeV (3 billion electron volts), roughly a billion times more energetic than visible light.
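That "billion times" figure is easy to check; a quick sketch, assuming a typical visible-light photon carries about 2 eV (a value not stated in the article):

```python
# Rough sanity check on "a billion times more energetic":
# compare a 3 GeV gamma ray to a typical visible-light photon.
gamma_ray_ev = 3e9   # 3 GeV expressed in electron volts
visible_ev = 2.0     # typical visible photon (~2 eV), an assumed value

ratio = gamma_ray_ev / visible_ev
print(f"{ratio:.1e}")  # on the order of a billion
```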
Over 1,000 blazars have been discovered so far, most of them by Fermi.
Gamma rays travel across the Universe until they finally hit Earth, or more specifically, Fermi's Large Area Telescope (LAT) instrument.
The Universe seems empty, but between a distant galaxy emitting gamma rays and us there are countless particles, the most abundant of which are photons, particles of light.
However unintuitive it may seem, for gamma rays, light in the visible or ultraviolet spectrum actually acts as a fog. The energetic gamma-ray photons interact with the less energetic visible and ultraviolet photons.
This is a rare occurrence even across the vast distances of space, but when it happens, the two photons transform into an electron-positron pair and the gamma-ray photon is lost.
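There is a minimum energy for this pair production, which can be sketched with the standard two-photon threshold condition for a head-on collision; this is textbook physics, not a formula from the article, and the electron rest energy (0.511 MeV) is the only input:

```python
# Threshold for gamma + photon -> electron + positron in a
# head-on collision: E_gamma * E_target >= (m_e c^2)^2
M_E_C2_EV = 0.511e6  # electron rest energy in eV

def target_threshold_ev(e_gamma_ev):
    """Minimum target-photon energy that lets a gamma ray of the
    given energy pair-produce in a head-on collision."""
    return M_E_C2_EV**2 / e_gamma_ev

# A 25 GeV gamma ray (the attenuation onset the article cites)
# needs target photons of at least ~10 eV, i.e. ultraviolet light.
print(target_threshold_ev(25e9))
```

This is why visible and ultraviolet light, rather than radio waves or microwaves, forms the "fog" for gamma rays in this energy range.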
This effect varies with distance: the farther a gamma-ray photon travels, the more likely it is to encounter and interact with a background photon. It also varies with light density: more light along the path means more chances to interact. And more energetic gamma-ray photons are more likely to interact in the first place.
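These dependencies are conventionally bundled into an optical depth: the fraction of gamma rays that survive the trip falls off exponentially with it. A minimal sketch; the functional form of the optical depth here is illustrative, not the study's actual model:

```python
import math

def surviving_fraction(tau):
    """Fraction of gamma rays that cross the path without
    pair-producing, given an optical depth tau."""
    return math.exp(-tau)

def optical_depth(distance, photon_density, k):
    """Illustrative optical depth: proportional to path length and
    to the density of background light; k stands in for the
    interaction cross-section, which grows with gamma-ray energy."""
    return k * distance * photon_density

# Doubling the distance doubles tau and squares the attenuation:
tau = optical_depth(distance=1.0, photon_density=1.0, k=0.5)
print(surviving_fraction(tau), surviving_fraction(2 * tau))
```

Measuring the attenuation at several energies and distances lets you invert this relation and infer how much background light the gamma rays flew through.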
This is how astronomers were able to measure the amount of light in the Universe, not only now, but at certain points in the past.
They looked at blazars at three different distances from Earth, and thus at three different points in time: the farthest from when the Universe was only 4.1 billion years old, the middle group from when it was 8.6 billion years old, and the closest from when it was 11.2 billion years old.
They measured the attenuation of the gamma-ray radiation, that is, how much of it didn't reach Earth and at which energies. They found that the farthest blazars lost most of their highest-energy photons, those above about 25 GeV. Blazars closer by were less affected.
Based on these measurements, they determined that the average star density is 1.4 stars per 100 billion cubic light years.
Put another way, the average distance between stars is about 4,150 light years. That figure would only be exact if all stars were identical and spaced at equal distances, which is obviously not the case.
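The spacing figure follows directly from the density by taking an inverse cube root, under the same uniform-grid simplification the article mentions:

```python
# Mean spacing for a uniform grid of stars at the quoted density:
# 1.4 stars per 100 billion cubic light years.
density = 1.4 / 100e9  # stars per cubic light year

# Volume per star is 1/density; side of that cube is the spacing.
mean_spacing_ly = density ** (-1 / 3)
print(mean_spacing_ly)  # roughly 4,150 light years
```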
You may be wondering how any of this is helpful. Knowing the amount of light in the Universe at a given time lets astronomers pinpoint with better accuracy when the first stars formed and how many existed at any point in the Universe's history.