My, What Big Telescopes You Must Have

“Gosh, you must have a really powerful telescope!”

This is what I hear from friends and family when they see my astrophotography. The discussion that follows is usually about “How far can you see?” or “What magnification?” I then try to explain that when it comes to telescopes, it’s really about how much light they can collect. With an eyepiece or Barlow lens, you can push any optic to just about any magnification you want, but that does no good if you don’t gather enough light from the object. How far you can see depends not just on the optic, but on how bright the object is. You can see the Andromeda galaxy with your naked eye, and it’s two and a half MILLION light-years away; meanwhile most nearby nebulae, in fact the VAST majority of stars in our own galaxy, are invisible without some sort of optical aid.

Nearly full moon compared in size to the Heart Nebula.

It’s also surprising to a lot of people just how big some of these objects are in the sky. Some of my favorite images were not taken with a dedicated telescope at all, but with a regular telephoto camera lens. For the image at the right, I’ve superimposed two different photographs, one of the full moon and one of the Heart Nebula. They are both to scale, shot with the same lens and the same camera. The first thing to notice is how big the nebula is compared to the full moon. Next time you see the moon looking all huge coming up over a neighbor’s house, imagine this cloud of gas and dust coming up alongside it.

Of course, the moon is much brighter than this nebula too, and that’s where light gathering power comes in. Essentially, the bigger around the telescope or camera lens is, the more light it can gather. On top of this, we take really, really long exposures to show dim objects. For example, the moon image was exposed for 0.03 seconds. That’s about 1/30th of a second. The Heart Nebula exposure was 20 minutes (or 1,200 seconds). <GeekSpeak>To be fair to my insider friends, both of these were taken with a 3nm Ha filter, but the relative brightness differences should still hold.</GeekSpeak>
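<GeekSpeak>For the number lovers, here’s a minimal Python sketch of that aperture rule: the light an optic gathers scales with the area of its objective, so with the square of its diameter. The two diameters below are hypothetical examples, not my actual gear.

    def light_gathering_ratio(d1_mm, d2_mm):
        # How many times more light a d2_mm aperture collects than a d1_mm one.
        return (d2_mm / d1_mm) ** 2

    # A hypothetical 70mm camera lens vs. a 280mm telescope: 4x the diameter...
    print(light_gathering_ratio(70, 280))  # ...collects 16.0x the light
</GeekSpeak>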

So, the exposure of the nebula was 40,000 times longer than the exposure of the moon. But wait, there’s more! The resulting image of the moon showed the moon quite plainly, but the 20 minute exposure of the nebula was black except for a few stars. In order to see the nebula, I had to stretch (brighten) the image in Photoshop. How much stretching? Well, comparing the original data, I found the moon was about 20 times brighter than the brightest parts of the nebula, so while both images show a bright object, the dimmer one had been brightened by a factor of 20. Okay class… 20 x 40,000 = 800,000. So, <GeekSpeak>at least in Ha light</GeekSpeak>, the moon is approximately 800,000 times brighter than that faint heart-shaped nebula.
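<GeekSpeak>Here’s that arithmetic again as a tiny Python sketch, using the same numbers from the paragraphs above:

    moon_exposure_s = 0.03        # roughly 1/30th of a second
    nebula_exposure_s = 20 * 60   # 20 minutes = 1,200 seconds
    stretch_factor = 20           # how much the nebula frame was brightened

    exposure_ratio = nebula_exposure_s / moon_exposure_s
    print(exposure_ratio)                   # 40000.0
    print(exposure_ratio * stretch_factor)  # 800000.0 times brighter
</GeekSpeak>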

This image was also taken with a very specialized camera (a QSI 683ws8) that has a cooler on it to chill the imaging chip to below zero. For the image of the nebula, the camera chip was at -20C (-4 degrees Fahrenheit). The cooler the chip the better, as it reduces “noise” in the image. Noise is that graininess you see in pictures taken without enough light. I might get away with saying the cooler the chip, the “clearer” the resulting image (in layman’s terms). If the chip is too warm and the image too faint, many details simply won’t show up at all, lost in the noise (graininess).
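<GeekSpeak>A common rule of thumb for these sensors is that the thermal “dark current” behind that noise roughly doubles for every 6C or so of warming. The exact figure varies chip to chip, so treat the 6C below as an assumption, but a quick Python sketch shows what cooling buys you:

    def relative_dark_current(temp_c, ref_temp_c=25.0, doubling_c=6.0):
        # Dark current at temp_c relative to a warm reference temperature,
        # assuming it doubles every doubling_c degrees.
        return 2 ** ((temp_c - ref_temp_c) / doubling_c)

    # Chilling the chip from a warm 25C down to -20C:
    print(relative_dark_current(-20))  # ~0.0055, roughly 180x less thermal signal
</GeekSpeak>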

Of course, another thing that surprises people is that the picture above is black and white. We call this “monochrome”, and the most sensitive cameras are always monochrome. We get a color picture by exposing through filters that let only red, green, or blue light through, then combining these exposures in software to make a color image. We could do the same thing with daylight photographs, and color cameras actually have tiny little filters like this over individual pixels on the imaging chip (Google “Bayer Matrix” if you feel adventurous). Sometimes, too, we use filters that let through light of only a certain wavelength. These wavelengths are emitted by specific atoms, so the light tells us what the object is made of. We can then create false color images and actually see where and how the material is distributed.
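<GeekSpeak>The combining step itself is almost embarrassingly simple. A minimal Python/numpy sketch, with random arrays standing in for three monochrome exposures taken through red, green, and blue filters:

    import numpy as np

    h, w = 100, 100
    red_frame = np.random.rand(h, w)    # exposure through the red filter
    green_frame = np.random.rand(h, w)  # exposure through the green filter
    blue_frame = np.random.rand(h, w)   # exposure through the blue filter

    # Stack the three mono frames into one (h, w, 3) color image.
    color_image = np.dstack([red_frame, green_frame, blue_frame])
    print(color_image.shape)  # (100, 100, 3)
</GeekSpeak>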

The Hubble Palette is a popular narrow band color mapping.

This final image of the Heart Nebula is such an image. It’s over 20 hours’ worth of 20-minute exposures (combined to reduce that “noise” I was talking about). The blue represents ionized oxygen, the red is from sulfur, and the green is from hydrogen. Red, green, and blue mix, and the blended colors show how these three elements are distributed through the cloud.
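<GeekSpeak>Why so many exposures? Averaging N frames of the same target knocks the random noise down by roughly the square root of N. A Python sketch with made-up data (a perfectly flat fake “nebula” plus simulated noise):

    import numpy as np

    n_frames = 60                 # e.g. 20 hours of 20-minute subs
    signal = np.ones((100, 100))  # a fake, perfectly flat "nebula"

    # Each simulated frame is the signal plus random noise.
    frames = [signal + np.random.normal(0, 0.5, signal.shape)
              for _ in range(n_frames)]

    stacked = np.mean(frames, axis=0)

    print(np.std(frames[0] - signal))  # ~0.5 noise in a single frame
    print(np.std(stacked - signal))    # ~0.065, about sqrt(60) = 7.7x smoother
</GeekSpeak>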

Pretty cool, huh?

Richard
