The idea is relatively new to astronomy

Dec 22, 2009 11:50 GMT
Models of the Cosmic Web are very difficult to compile, even with the wealth of data currently available

It took centuries of astronomical observations for early scientists to figure out that the stars in the night sky belonged to different galaxies, and to determine that some of these "stars" were in fact distant galaxies themselves. The model holding that the Universe was made of galaxies separated by vast distances appeared around that time. However, about ten years ago, that picture was reshaped, as experts discovered that galaxies can group together into more massive structures, known as clusters and superclusters.

Efforts to model this so-called Cosmic Web have been ongoing ever since, but, as scientists fed more and more newly available data into their models, they discovered that simulating the positions and interactions of galaxies in the Universe was a job far trickier and more complex than originally thought. At this point, we know of about 100 billion galaxies, all arranged in a wispy, web-like structure. This web has knots, where massive numbers of galaxies congregate, as well as sparsely populated filaments, and voids with no discernible structure.

One of the things that makes modeling the Cosmic Web extremely difficult is that it comprises objects spanning several orders of magnitude in size. Any model therefore needs to handle the relationships between these scales, as well as between the larger structures and the smaller ones that make them up. Usually, such models favor the large-scale structures over the smaller ones: if you zoom in on a supercluster, you will eventually reach a level where objects are too small to resolve.

In these instances, computer models apply statistical methods to smooth the calculations, and these techniques preserve the integrity of the large-scale structure. The problem is that this process discards the original small-scale data, so, if you wanted to zoom in again, you would have to compile a new large-scale image from scratch. This is an acceptable option for creating 3D models of the Universe, but it is of little use when trying to determine how large- and small-scale structures interact to form a whole. Experts are currently working on ways to make the latter possible, Technology Review reports.
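The trade-off described above can be illustrated with a toy example. The article does not specify which statistical methods the researchers use; the sketch below simply applies a Gaussian smoothing kernel (a common coarse-graining choice, assumed here for illustration) to a one-dimensional mock "density field" made of a large-scale wave plus small-scale noise. The smoothing preserves the large-scale shape while the sub-resolution detail is irrecoverably averaged away:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "density field": a large-scale wave plus small-scale noise.
# (Purely illustrative; not data from any real survey or simulation.)
x = np.linspace(0, 2 * np.pi, 512)
large_scale = np.sin(x)                         # structure a model wants to keep
small_scale = 0.3 * rng.standard_normal(512)    # sub-resolution detail
density = large_scale + small_scale

def smooth(field, width):
    """Coarse-grain a field by convolving with a normalized Gaussian kernel."""
    k = np.arange(-3 * width, 3 * width + 1)
    kernel = np.exp(-0.5 * (k / width) ** 2)
    kernel /= kernel.sum()
    return np.convolve(field, kernel, mode="same")

smoothed = smooth(density, width=8)

# The large-scale wave survives smoothing almost unchanged, while the
# small-scale noise is strongly suppressed -- and cannot be recovered
# from `smoothed` alone.
resid_before = np.std(density - large_scale)   # noise level before smoothing
resid_after = np.std(smoothed - large_scale)   # deviation after smoothing
```

Once `density` has been replaced by `smoothed`, the only way to see small-scale detail again is to regenerate it, which is exactly why zooming back in requires compiling a new image.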