Drowning in data

That recent successful experiment to image a black hole produced an enormous amount of data.

It all had to be shipped to the processing facility; there was far too much to send over the Internet. It had to be shipped in boxes of hard drives. Then it all had to be analyzed.

It was a huge task. Astronomy has not always been like this.

Not that long ago, an observing session at a large telescope would involve whole nights making sure the telescope operated properly and stayed properly pointed at its target.

The result would be one or two images, a computer data file, or in some cases, just columns of numbers in a notebook. You would take the whole lot home to analyze.

If your attention slipped at all during the observations, the result could be an entire observing session wasted.

Today things are easier. There is no need to sit there adjusting telescope tracking all night, freezing in an unheated telescope dome.

Modern telescopes look after themselves and the instruments we attach to them have improved immeasurably.

Astronomical images may require hours of exposure to collect the required amount of light. A radio image made using our Synthesis Radio Telescope requires around 10 days of data collection.

Today, we can record the raw data as it is collected, and make images or otherwise process it afterwards. If we make mistakes we can go back to the original information, and start over.

That raw data can be archived and be made available to other researchers.

The latest generation of optical and radio telescopes is raising a new problem.

We want to keep recording the raw data, but it is coming out of the telescopes at such a rate that no current data-handling system can keep up.

Even if it were somehow successfully recorded, analyzing it would be another serious challenge.

Two examples are the CHIME (Canadian Hydrogen Intensity Mapping Experiment) now operating at our observatory, and even more so, the Square Kilometre Array, the largest radio telescope in the world, which will generate a tsunami of data.

The only way we can deal with this is to have a computer program filter and partially process the data before it is recorded.

This raises a really nasty issue. How does such a program decide what is important and should be kept, as opposed to what should be let go?

Although new instruments almost inevitably yield new discoveries, they are primarily developed to address known problems of high scientific interest.

If we have to make a pre-processor program to weed out the uninteresting stuff from the raw data, how can we make sure it stays open minded? It is so easy to build in our prejudices, making it search most assiduously for what we expect to find.

What we really want is something to look at the data and to call our attention to anything at all that looks odd or stands out in some way.

Not long ago such a program would have been a science fiction dream; now it is not.  Artificial intelligence (AI) has arrived.

We don't tell our robot observing assistant what we are looking for. Using new concepts like "machine learning" and "neural networks," we just let the program look at huge amounts of past data.

Without being told anything, it can identify the patterns that usually turn up in the data, learn about things that can corrupt it, and then report what stands out or what most closely fits what we ask it for.
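As a toy illustration of "report what stands out": the sketch below is not machine learning at all, just robust statistics on invented numbers, but it shows the basic idea of flagging an unexpected spike buried in a long stream of ordinary data.

```python
import numpy as np

# Toy sketch: flag "what stands out" in a simulated data stream.
# All numbers here are invented; real pipelines use trained models.
rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=1.0, size=10_000)  # ordinary background
samples[1234] = 25.0                                   # one anomalous burst

med = np.median(samples)
mad = np.median(np.abs(samples - med))          # median absolute deviation
score = np.abs(samples - med) / (1.4826 * mad)  # robust z-score
flagged = np.flatnonzero(score > 6.0)           # keep only what stands out
```

Using the median rather than the mean means the anomaly itself barely shifts the baseline, so even a single burst in ten thousand samples is picked out cleanly.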

This will revolutionize our ability to squeeze information out of huge amounts of data. When observational discoveries are being made using AI, how long will it be before the AI becomes a co-author on the published articles?

  • Mars lies very low, getting lost in sunset glow as it approaches the other side of the sun.
  • Jupiter, shining like a searchlight, rises around midnight.
  • Saturn is up at 1 a.m.
  • The moon will reach last quarter on the 26th.


Canada got there first

In the 1960s, Canada was the first country to successfully use a radio telescope thousands of kilometres in diameter.

A technique was developed to make multiple radio telescopes, separated by thousands of kilometres, act as though they were joined together.

This technique, known as very long baseline interferometry, or VLBI, was developed for one main reason: to find out what quasars actually are.

However, it also became very useful for something completely different: measuring the changing shape of the Earth.

The level of detail in an image is dictated by how big the mirror or lens used to make it is compared with the length of the waves being imaged.

Light waves are very short, so our eye lenses, with a diameter of a few millimetres, are good enough to see details as small as one per cent of the area of the lunar disc.

To record the same detail for waves that are centimetres or metres long, the lenses or mirrors have to be huge.

For example, our Synthesis Radio Telescope, which can yield radio images with levels of detail achievable with our eyes, consists of a row of antennas 600 metres long.
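The aperture-versus-wavelength rule can be put in rough numbers with the Rayleigh criterion; the pupil size and wavelengths below are illustrative assumptions, not measurements from the text.

```python
# Rayleigh criterion: smallest resolvable angle (radians) ~ 1.22 * wavelength / aperture
def resolution_rad(wavelength_m: float, aperture_m: float) -> float:
    return 1.22 * wavelength_m / aperture_m

theta_eye = resolution_rad(550e-9, 5e-3)  # green light, ~5 mm pupil (assumed values)
# Aperture needed for the same detail at the 21 cm hydrogen line:
d_needed = 1.22 * 0.21 / theta_eye        # on the order of a kilometre or two
```

This is why radio "apertures" must stretch for hundreds of metres or more to approach the detail our eyes see in visible light.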

We can increase our eyes’ ability to discern fine detail by using a telescope or binoculars. These have the effect of making the pupils of our eyes larger. However, this is hard to do with radio telescopes.

Modern engineering science and materials can provide us with radio dishes up to about 100 metres in diameter, but no larger.

In the 1960s, a strange class of cosmic radio sources called quasars was detected. They were very small, looking like stars through our biggest optical telescopes, and they lie millions or billions of light years away.

Most of their strangeness was in the radio emissions they produced, so we wanted to make radio images of them. We now believe they are driven by black holes.

Cosmic radio sources appear so small in the sky that most of them lie beyond the imaging ability of dish radio telescopes, even the largest. This led to the development of techniques whereby we can combine groups of small radio telescopes into arrays bigger than any possible single-antenna radio telescope.

As time passed, most radio sources were imaged, but the quasars remained unimageable. We needed even bigger arrays. However, the telescopes in an array have to be connected together, which limits these arrays to maybe a few kilometres across.

Even the biggest arrays were inadequate for dealing with quasars.

Something completely new was needed. Could we make arrays without having to connect the antennas together? There was an international race to achieve this. Canada got there first.

The idea was to record on video tape the signals received by different antennas. The tapes from the various antennas would then be brought together later for processing.

To make this possible, extremely precise timing signals were added to the tapes. Now the antennas could be anywhere on Earth.
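Real VLBI correlators are vastly more elaborate (fringe rotation, atomic clocks, geometric models), but the core step of aligning two recordings of the same signal can be sketched with a simple cross-correlation on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay = 1000, 37                  # sample count and a made-up geometric delay
source = rng.normal(size=n + delay)  # noise-like cosmic signal

ant_a = source[delay:]               # recording at antenna A
ant_b = source[:n]                   # antenna B sees the same wavefront 37 samples later

# Cross-correlate the two recordings; the peak position reveals the delay.
corr = np.correlate(ant_b, ant_a, mode="full")
recovered = int(np.argmax(corr)) - (n - 1)   # recovered delay in samples
```

Because the shared signal is noise-like, its correlation peak is sharp, which is exactly what makes the delay (and hence the antenna positions) measurable so precisely.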

An array the size of the Earth gave us some usable images. This is the technique that recently gave us our first ever images of a black hole.

One really useful by-product of these VLBI experiments is a measurement of the positions of the antennas, including the distances between them, accurate to a few millimetres.

By putting antennas on different continents we can measure the speed the tectonic plates are moving and how landmasses are stretching, twisting or otherwise changing shape.

This works well because the small size of quasars, which makes them extremely hard to image, makes them also ideal reference sources for measuring the positions of the antennas.

Our obsession with quasars has given us the ultimate ruler for measuring our world.

  • Mars lies very low and inconspicuous in the west, sinking slowly into the twilight.
  • Jupiter, shining like a searchlight, rises around midnight.
  • Saturn rises at 2 a.m.
  • The moon will be full on the 18th.

Dark energy matters

Until very recently, one of the wry jokes about fast radio bursts (FRBs) was that the number of theories attempting to explain them was bigger than the number of FRBs that had been detected.

The CHIME (Canadian Hydrogen Intensity Mapping Experiment) radio telescope is changing that. This instrument has a huge field of view, covering much of the sky, and is catching lots of them.

Let's hope this will give us a better idea of what they are. All we know at the moment is that very short (millisecond) bursts of radio emission are coming to us from millions or billions of light years away.

The transmitted energy must be huge, so big that the only engines we know of that could drive them are neutron stars and black holes. We hope the number of theories will soon start to drop.

Theories are the currency of science. Coming up with a new theory is a lot more than having a casual idea to explain something we see. The first step is observing something and coming up with a possible physical process or set of physical processes to explain it. That could be a casual idea.

The science starts at the next step. This involves searching the literature for other work on the subject and then using this to flesh out the idea to generate a coherent series of physical and mathematical arguments to account for what was observed. This is not the end of the story.

That new theory must make predictions. If it is correct, there will be other consequences we can look for in new observations. The theory must predict things that can be tested. Unless it does this, it is not a proper theory and is useless.

If a theory survives a long period of detailed examination and testing, it may finally be recognized as a "law" — one of the fundamental rules used by Mother Nature to run the universe.

Isaac Newton provided us with good examples. He came up with a simple theory.

If you push an object with a steady force, it will accelerate at a constant rate.

If you double the mass of the object, it will accelerate at half that rate.

On the other hand, if you push with twice the force, the object will accelerate twice as rapidly.

The consequences of this simple idea are now recognized as Newton's Laws of Motion.
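The proportionalities just described reduce to the familiar relation a = F/m; the numbers below are arbitrary illustrations.

```python
# Newton's second law: acceleration = force / mass
def acceleration(force_n: float, mass_kg: float) -> float:
    return force_n / mass_kg

a0 = acceleration(10.0, 2.0)      # 5.0 m/s^2
half = acceleration(10.0, 4.0)    # double the mass: half the acceleration
double = acceleration(20.0, 2.0)  # double the force: twice the acceleration
```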

He also theorized that objects attract each other with a force related to their masses and the distance between them. He came up with the idea of gravity.

Expressed mathematically Newton's Laws of Motion and Gravity have been tested and used over and over again, successfully, and are now recognized as fundamental Laws of Nature.

That is why scientists are so unhappy with the concepts of dark matter and dark energy.

When we saw that the stars orbiting in most galaxies are moving too quickly, we concluded the masses of the galaxies are much bigger than we thought. However, we cannot see that missing mass.

Someone suggested this invisible mass is made up of Dark Matter, an otherwise totally invisible "something" invented to make our measurements make sense.

The expansion of the universe is speeding up. This does not make sense. If all the objects in the universe are pulling at all the other objects, gravity should be slowing the expansion down.

The only way the expansion can accelerate is if some unknown outward force is at work.

Enter the idea of Dark Energy. As in the case of Dark Matter, we have just given something a name; we have no explanations or lines of analysis to follow.

This could change. The CHIME radio telescope is intended to map the hydrogen in the very young universe. This is when dark energy would have been very active as the first galaxies were starting to form.

This won't necessarily lead us straight to a theory, but at least it might tell us which way to go.

  • Mars lies low in the west after sunset.
  • Jupiter rises around midnight.
  • Saturn rises at 2 a.m.
  • The moon will reach first quarter on the 11th.


Cosmic dust in your eyes

Dust might not sound like a very exciting subject compared with, say, black holes, but actually it is.

Not only is it the raw material for making new planets, asteroids and other bodies, it also provides an environment for all sorts of other important and fascinating things to happen.

Most of us have suffered from sunburn. This happens because overexposure to solar ultraviolet rays damages the complex organic molecules making up our skin. Moreover, this is after our atmosphere has filtered out most of this dangerous radiation.

However, over the last few decades, we have found space to be loaded with fairly complex organic molecules.

How can this happen?

In addition to being very cold and a near vacuum, space is flooded with ultraviolet light. The answer is surprising: dust.

In most places, space is fairly clear, making it possible for our telescopes to reach far out into space, and back in time to almost the beginning of the universe.

However, in some directions we see great dust clouds, thick enough to block the light of the stars lying behind.

This summer, if you are somewhere with a dark, clear sky, look into the southern sky. You will see the Milky Way apparently splitting into two streams. Of course, there is really only one stream, but a thick belt of dust down the middle blocks out the stars.

Inside these dust clouds, the ultraviolet is blocked, and it is very dark and cold. Thanks to previous generations of stars, the clouds are loaded with all the elements produced as waste products by those stars.

This makes the clouds excellent places for chemistry to happen. At such low temperatures, chemicals react very slowly, but there is lots of time — billions of years. The result is an incredible witch’s brew of organic chemicals.

Ultraviolet light penetrating the outer part of a cloud can break a molecule into fragments. When this happens in the Earth’s warm, dense atmosphere, those fragments won't last long.

Inside those dark, cold clouds things move slowly and the chance of colliding with something is low. So the fragments can slowly diffuse more deeply into the cloud.

If by some small chance, two fragments do run into one another, they will almost certainly bounce apart again, like billiard balls.

In this scenario, with collisions infrequent and unlikely to result in larger molecules, reaction rates would be very low. This is where the dust comes in.

It provides somewhere for chemical reactions to take place. The rough surfaces of the dust grains provide lots of opportunities for the molecule fragments to bounce around until they come to rest on the surface.

There they sit while more molecular fragments slowly accumulate. Chemical reactions take place and the grain becomes coated with complex, organic molecules.

Then, when a cloud becomes unstable, collapsing under its own gravity and forming a new planetary system, those planets receive a ration of the organic molecules that can, under the right conditions, form the basis of life.

This is why there are more optical and radio telescope hours dedicated to studying dust than are spent looking at or for black holes.

We look for the radio and infra-red signatures of the organic molecules in the clouds, and try to deduce the reactions slowly taking place, and we look at these dust clouds as they form protoplanetary discs, the beginnings of new planetary systems.

The current situation is that we are finding many times more molecular signatures in those clouds than we have managed to identify. Whatever recipe for life we consider, the ingredients are probably there, providing the conditions on the new planet are suitable.

We know it happened at least once.

  • Mars lies low and inconspicuous in the west, sinking slowly into the twilight as it moves toward the other side of the sun.
  • Jupiter, shining like a searchlight, rises around 1 a.m.
  • Saturn rises at 3 a.m.
  • The moon will be new on the 4th.


About the Author

Ken Tapping is an astronomer born in the U.K. He has been with the National Research Council since 1975 and moved to the Okanagan in 1990.  

He plays guitar with a couple of local jazz bands and has written weekly astronomy articles since 1992. 

Tapping has a doctorate from the University of Utrecht in The Netherlands.

[email protected]

The views expressed are strictly those of the author and not necessarily those of Castanet. Castanet does not warrant the contents.
