Speaking of satellites…
So, I recently wrote about a lovely image of the nearby galaxy M77. It’s a beautiful picture, but it had a weird thing in it: a short, multi-colored streak. I conjectured that it was a geosynchronous satellite, a human-made satellite orbiting about 36,000 km up, directly above the Earth’s equator. In such an orbit, it takes the satellite 24 hours to circle the Earth once, so from our point of view it stays stationary in the sky even as the stars rise and set.
If the orbit is slightly elliptical, or inclined a bit, the satellite will appear to move a little, and given that the galaxy is located in the sky directly above the Earth’s equator (the projection of the Earth’s equator on the sky is called the celestial equator), I convinced myself that’s what I was seeing (as opposed to an asteroid, which could also move slowly across an image). But readers started chiming in, and I realized quickly I was wrong, so I updated the post to let people know.
One person who contacted me is my old friend Adam Block from the University of Arizona's Astronomy Department, who is a gifted astrophotographer. He takes amazing images that I’ve featured on the blog many times. Adam reminded me that, in May 2017, he took (with Eric Pearce) an amazing photo showing dozens of geosynch satellites in a long strip across the celestial equator.
[Geosynchronous satellites form a belt around the Earth over its equator. Credit: Adam Block and Eric Pearce/Steward Observatory/University of Arizona]
It looks small, but I had to shrink it to fit the blog; the original image is 19240 x 2201 pixels! It’s a mosaic of 10 images, each with an exposure of about 25 seconds. During that time, the rotation of the Earth caused the stars to move, leaving streaks in the exposure. But the satellites move around the Earth at the same rate as the Earth rotates, so they appear motionless, as little dots (the very bright streak is the planet Jupiter).
Adam also created a video of a trio of geosynch sats showing how they hang nearly motionless as the stars zoom past:
In the galaxy post, I mentioned that the positions of the satellites in the sky depend on your latitude; if you’re farther north you’re looking “down” on them a bit and they appear south of the celestial equator. At the time Adam took the geosynch satellite image above, Jupiter was a few degrees south of the celestial equator, which is why the satellites appear near it.
As it happens, and as Adam reminded me, a lot of interesting astronomical objects appear just south of the celestial equator in the sky … including the famous Orion Nebula. That makes taking images of it a bit of an issue, since lots of satellites can cross it during long exposures. For proof of that, Adam sent me this photo he took:
Oh, yikes! Look at all those satellite trails! This image is actually rather funny. Usually, when you create a color image, you use a set of filters, like blue, green, and red. When added together they produce a “natural” color image that mimics what the eye sees. But there’s a lot more to it than that. In many of the exposures, there will be irritating intruders: cosmic rays (subatomic particles zipping around space that leave a bright streak in the image), bad pixels in the camera, and yes, even satellites. So you take a bunch of images in each filter, and combine them in a way that throws out things that aren’t in every image. That way, the nebula or whatever your target is shows up because it doesn’t change from exposure to exposure, but the other things disappear (there are lots of ways to do this, like median combining).
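If you’re curious how that rejection trick works, here’s a minimal sketch of median combining with NumPy, using toy arrays rather than real exposures. Because a satellite streak or cosmic ray lands in only one frame, the per-pixel median throws it out while keeping anything that’s present in every frame:

```python
import numpy as np

def median_stack(exposures):
    """Combine a list of same-sized exposures with a per-pixel median.

    Anything that appears in only one exposure (a satellite trail, a
    cosmic-ray hit, a hot pixel) is rejected; the steady target survives.
    """
    return np.median(np.stack(exposures), axis=0)

# Toy example: three flat "sky" frames, one contaminated by a bright streak.
sky = np.full((5, 5), 100.0)
streaked = sky.copy()
streaked[2, :] = 5000.0            # a satellite trail crossing one frame

combined = median_stack([sky, sky, streaked])
print(combined[2, 2])              # 100.0 -- the trail is rejected
```

With real data you’d also align the frames first and often use a sigma-clipped mean instead of a plain median, but the principle is the same.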
In the original image, Adam did just this, and produced a spectacular image of the nebula:
Whoa. But someone asked him to actually show the satellites (that person was interested in seeing how the satellites behave). So, what Adam did is take each image that had a satellite streak in it and subtract the cleaned image, leaving only the streak behind. Then he cleaned up the streaks a bit and added them back in to the final image. When he was done, he got the chaos you see in the first image. It doesn’t appeal to me in the same way as the cleaned version, but it’s still pretty cool. And it shows you one small thing we observational astronomers have to deal with when creating images.
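The subtract-and-recombine step is easy to see with toy arrays too. This is just a sketch of the idea (the array values are made up, not from Adam’s data): subtracting the cleaned image from a single exposure leaves behind only what was unique to that exposure, namely the streak.

```python
import numpy as np

cleaned = np.full((4, 4), 50.0)           # stand-in for the cleaned nebula image
exposure = cleaned.copy()
exposure[1, :] = 300.0                    # this frame caught a satellite trail

# Subtracting the cleaned image removes everything common to all frames,
# leaving only the trail (clipped at zero so noise can't go negative).
streak_only = np.clip(exposure - cleaned, 0, None)

# Adding the isolated trails back onto the clean image reproduces the
# deliberately "chaotic" version with every streak visible at once.
recombined = cleaned + streak_only

print(streak_only[1, 0])   # 250.0 -- only the streak survives
print(recombined[1, 0])    # 300.0
```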
I did a lot of this when I worked on Hubble images, and wound up writing a lot of my own software to deal with it. This task is a lot easier now; there are entire suites of astronomical software available to help. Some of it is quite clever; you can feed it a video stream, for example, and it’ll pick out the best parts of each frame to add together to make a single very high-resolution image. This is now a very common practice for planetary imagers.
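The frame-picking trick from video works something like this. Here’s a toy sketch of the selection step, assuming frames where atmospheric blur washes out local contrast; I’m using raw pixel variance as a crude sharpness score, where real planetary-imaging software uses more robust metrics, but the pick-the-best-frames idea is the same:

```python
import numpy as np

rng = np.random.default_rng(42)

def sharpness(frame):
    # Crude proxy: blurrier frames have less pixel-to-pixel contrast,
    # so their variance is lower.
    return frame.var()

sharp = rng.normal(100, 20, (8, 8))    # a "good seeing" frame
blurry = sharp * 0.2 + 80              # same scene with contrast washed out

frames = [blurry, sharp, blurry]
best = max(frames, key=sharpness)      # keep only the sharpest frame(s)
print(best is sharp)                   # True
```

In practice you’d score every frame in the video, keep the top few percent, then align and stack the keepers to build the final high-resolution image.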
I love all this. I remember when I used to roll my own Tri-X film and develop it myself, and shot a gazillion photos of the Moon only to have none turn out. Now it’s far easier … but still enough of an art to make it fun, and to be more rewarding the more effort you put into it. I’m very glad that folks like Adam are out there, taking the time needed to create such beauty.