How the Webb sends its hundred-megapixel images millions of miles to Earth

NASA just revealed the first series of images from the James Webb Space Telescope, from an impressive deep field of galaxies to a careful characterization of the atmosphere of a distant exoplanet. But how does a spacecraft millions of miles away send tens of gigabytes of data back to Earth?

First, let’s talk about the images themselves. These aren’t just regular JPEGs – and the Webb isn’t just an ordinary camera.

Like any scientific spacecraft, the Webb captures and sends home tons of raw data from its instruments: two high-sensitivity near- and mid-infrared sensors, plus a host of accessories that can specialize them for spectroscopy, coronagraphy and other tasks as required.

As an example, let’s take one of the newly released first images, one that has a direct point of comparison.

Megapixel race

The Hubble took this image of the Carina Nebula in 2008:

Image Credits: NASA/ESA/Hubble Legacy Team

It’s an incredible image, of course. But the Hubble is more comparable to a traditional visible-light telescope, and more importantly, it launched in 1990. Technology has moved on a bit since then!

Here is the Webb version of this same region:

Image Credits: NASA, ESA, CSA, STScI

Even at these smaller sizes, it’s clear to any viewer that the Webb version has far more detail. The wispy texture of the nebula resolves into complex cloud formations and streaks, and far more stars and presumably galaxies are clearly visible. (Though note that the Hubble image has its own charms.)

To emphasize the level of detail captured, let’s zoom in on one region, up and to the left of center:

Image Credits: NASA, ESA, CSA, STScI

Extraordinary, right? But all that detail comes at a cost: data!

The Hubble image is around 23.5 megapixels, weighing in at 32 megabytes uncompressed. The Webb image (as made available after processing) is 123 megapixels and around 137 megabytes. That’s more than five times the data, and even that doesn’t tell the whole story: the Webb’s specifications allow it to send data back at around 25 times the Hubble’s rate – not just bigger images, but more of them… from 3,000 times further away.
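
For a sense of scale, here’s the back-of-the-envelope arithmetic behind those figures – a purely illustrative Python sketch using only the numbers quoted above:

```python
# Back-of-the-envelope ratios from the published figures.
hubble_megapixels, hubble_mb = 23.5, 32   # Hubble Carina image (2008)
webb_megapixels, webb_mb = 123, 137       # Webb Carina image (2022)

print(f"{webb_megapixels / hubble_megapixels:.1f}x the pixels")  # ~5.2x
print(f"{webb_mb / hubble_mb:.1f}x the uncompressed bytes")      # ~4.3x

# And the "3,000 times further" figure: Webb at ~1,000,000 miles
# versus Hubble at ~340 miles above the surface.
print(f"{1_000_000 / 340:,.0f}x the distance")                   # ~2,941x
```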

Long distance call

Hubble is in low Earth orbit, about 340 miles above the surface. That means communicating with it is relatively simple – your phone reliably receives signals from much more distant GPS satellites, and it’s child’s play for NASA to pass information back and forth to a satellite in such a close orbit.

JWST, on the other hand, sits at the second Lagrange point, or L2, about a million miles from Earth in the direction directly away from the sun. That’s around four times farther away than the moon, and a much tougher proposition in some ways. Here’s a NASA animation showing what that orbit looks like:

Fortunately, this kind of communication is far from unprecedented; we have sent and received vast amounts of data from much further afield. And we know exactly where the Webb and the Earth will be at any given moment, so while it’s not trivial, it’s really a matter of choosing the right tools for the job and being very careful about timing.

From the start, the Webb was designed to transmit on Ka-band radio frequencies, around 25.9 gigahertz, well within the ranges used for other satellite communications. (Starlink, for example, also uses the Ka band, among others in that neighborhood.)

That main radio antenna is capable of sending around 28 megabits per second, which is comparable to home broadband speeds – if it took your router’s signal around five seconds to cross a million miles of empty space on the way to your laptop.
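
That five-second figure is simply the light-speed travel time over the distance; a quick sanity check (illustrative only):

```python
SPEED_OF_LIGHT = 186_282   # miles per second, in a vacuum
EARTH_TO_L2 = 1_000_000    # miles, roughly

one_way = EARTH_TO_L2 / SPEED_OF_LIGHT
print(f"One-way signal travel time: {one_way:.1f} seconds")  # ~5.4
```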

Purely illustrative, to give an idea of the distances – objects are obviously not to scale. Image Credits: NASA/ESA/Hubble Legacy Team

That gives it about 57 gigabytes of downlink capacity per day. There’s also a second antenna operating on the lower S-band – surprisingly, the same band used by Bluetooth, Wi-Fi and garage door openers – reserved for low-bandwidth tasks like software updates, telemetry and health checks. If you’re interested in the details, IEEE Spectrum has a great article that goes deeper into all of this.
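
Those two numbers – 28 megabits per second and 57 gigabytes per day – are consistent if you assume the Ka-band link is only active during scheduled contact windows rather than around the clock. A rough sketch (the contact-hours figure here is inferred for illustration, not an official number):

```python
KA_RATE_MBPS = 28   # Ka-band downlink rate, megabits per second
DAILY_GB = 57       # quoted daily downlink capacity, gigabytes

# Link time needed to move 57 GB at 28 Mbps (1 GB = 8,000 megabits).
link_seconds = DAILY_GB * 8_000 / KA_RATE_MBPS
print(f"Implied contact time: {link_seconds / 3600:.1f} hours/day")  # ~4.5

# For contrast, a continuous 24-hour link would carry far more.
continuous_gb = KA_RATE_MBPS * 86_400 / 8_000
print(f"Theoretical 24-hour maximum: {continuous_gb:.0f} GB")        # ~302
```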

It isn’t just a constant flow, of course, since the Earth’s rotation and other events come into play. But because the team is working with mostly known variables, contact windows are scheduled four or five months in advance, with data relayed through the Deep Space Network. The Webb may capture data and send it home the same day, but both the capture and the transmission were planned long before.

Interestingly, the Webb only has about 68 gigabytes of internal storage, which you’d think would make people nervous when it can downlink 57 in a day – but there’s more than enough opportunity to offload that data, so it never hits the dreaded “drive full” warning.

But what you see at the end – even that big 137-megabyte uncompressed TIFF – is not what the telescope sees. In fact, it doesn’t even perceive color as we understand it at all.

“Letting the data shine through in color”

The data coming off the sensors is in the infrared, beyond the narrow band of wavelengths humans can see. We have plenty of methods for seeing outside that band, of course: X-rays, for instance, are captured by letting them strike film or a calibrated digital sensor, then rendered in a form we can see. It’s the same with the Webb.

“The telescope isn’t really a point-and-shoot camera. It’s not like we can just take a picture and there it is, right? It’s a scientific instrument. It was designed first and foremost to produce scientific results,” explained Joe DePasquale of the Space Telescope Science Institute in a NASA podcast.

Image Credits: NASA/ESA/Hubble Legacy Team

What it detects isn’t really data that humans can analyze, let alone perceive directly. For one thing, the dynamic range – the difference in magnitude between the darkest and brightest points – is off the charts. There’s basically nothing darker than the endless void of space, and not much brighter than an exploding star. If you have an image that includes both, taken over a period of several hours, you end up with enormous deltas between dark and light in the data.

Our eyes and brains have pretty good dynamic range too, but this data blows them out of the water – and more to the point, there’s no real way to display it.

“It basically looks like a black image with a few white spots, because there’s such a huge dynamic range,” DePasquale said. “We have to do something called stretching the data, which is taking the pixel values and repositioning them, basically, so that you can see all the detail that’s there.”
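
Here’s a minimal sketch of the idea using an arcsinh stretch, one common choice in astronomical image processing. The function and its parameters are illustrative, not the actual STScI pipeline:

```python
import numpy as np

def stretch(data: np.ndarray, softening: float = 1e-4) -> np.ndarray:
    """Compress a huge dynamic range into [0, 1] for display.

    arcsinh is roughly linear near zero, so faint detail survives,
    and logarithmic at the top, so bright stars don't swallow
    everything else.
    """
    scaled = (data - data.min()) / (data.max() - data.min())
    return np.arcsinh(scaled / softening) / np.arcsinh(1 / softening)

# Fake "raw" frame: a dim background with a few very bright points.
rng = np.random.default_rng(42)
raw = rng.exponential(scale=1.0, size=(512, 512))
raw[100, 100] = raw[400, 300] = 1e6   # "stars" a million times brighter

display = stretch(raw)
print(display.min(), display.max())   # spans 0.0 to 1.0, detail intact
```

Without the stretch, simply dividing by the brightest pixel would leave virtually the entire frame indistinguishable from black – exactly the “black image with a few white spots” DePasquale describes.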

Before you object, know that this is essentially how all images are created – a selection of the spectrum is clipped and adapted for viewing by our very capable but also limited visual system. Because we can’t see in infrared and there is no equivalent of red, green and blue at those frequencies, image analysts have to do the complicated work of combining objective use of the data with a subjective understanding of perception and even beauty. The colors may correspond to the wavelengths in a similar order, or perhaps be split up to more logically highlight regions that “look” similar but emit wildly different radiation.
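
As a sketch of that kind of assignment, here three infrared bands are mapped to display channels in spectral order: shortest wavelength to blue, longest to red. The filter names are real NIRCam filters, but this particular mapping and the random stand-in data are purely illustrative:

```python
import numpy as np

# Stand-ins for calibrated, stretched frames from three IR filters,
# listed from shortest to longest wavelength.
frames = {
    "F090W": np.random.rand(256, 256),   # ~0.9 µm -> blue channel
    "F200W": np.random.rand(256, 256),   # ~2.0 µm -> green channel
    "F444W": np.random.rand(256, 256),   # ~4.4 µm -> red channel
}

# Preserve the spectral order: the longer the wavelength,
# the redder the assigned display color.
rgb = np.dstack([frames["F444W"], frames["F200W"], frames["F090W"]])
print(rgb.shape)   # (256, 256, 3): ready to save or show as RGB
```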

“We in the astrophotography community like to refer to this process as ‘representative color,’ instead of what it used to be called – and many people still call it – ‘false color’ imagery. I don’t like the term ‘false color,’ because it has this connotation that we’re faking it, or that, you know, it’s not really what it looks like. The data is the data. We don’t go in there and apply, like, paint to the image. We respect the data from start to finish. And we let the data shine through in color.”

If you look at the image above – the two views of the nebula – consider that they were taken from the same angle at more or less the same time, but using different instruments that capture different segments of the IR spectrum. Though both must ultimately be displayed in RGB, the objects and features found by inspecting those different wavelengths can be made visible through this kind of creative yet scientifically rigorous color assignment.

And of course, when data is more useful as data than as a visual representation, there are even more abstract ways to think about it.

Image Credits: NASA, ESA, CSA, STScI

An image of a distant exoplanet may show only a point, but a spectrogram reveals telltale details of its atmosphere, as you can see in the example just above.

Collecting, transmitting, receiving, analyzing and presenting data from something like the Webb is a complex task, but it’s one that hundreds of researchers are joyfully and enthusiastically dedicated to now that it’s operational. Expect even more creative and compelling ways of displaying this information – with the JWST just beginning its mission a million miles away, we have a lot to look forward to.
