As I am learning the intricacies of this hobby and having an absolute ball doing it, I wanted to explain how all of this works. Of course, my images are nowhere near as good as the ones you see in books or news articles, because those images typically come from the Hubble Space Telescope. You can see why NASA spent billions trying to escape the atmosphere. My equipment is also not ideal for this task: go on any amateur astronomy forum, and you will see dozens of posts from beginners like me asking how to take the images they see in books with their fresh, new Dobsonian, followed by replies from experts that “it just won’t work.” While you won’t get perfect, low-noise, wide-angle shots of dim nebulae, you can still have a lot of fun.
Showing my pictures to people typically gets a response of awe and wonder, but the satisfaction I get every time I complete an image is not easily shared. The planning, persistence, patience, fidgeting, and eventual exhilaration that come from producing a single image are addictive, and none of that readily comes through when looking at a single grainy image, especially on a dimmed cell phone screen.
The first problem with taking pictures of dim objects is that they are very dim. They are so dim that I generally cannot even see them in the eyepiece of my quite large Newtonian telescope. Living near a large urban area, light pollution is the enemy. Most of the deep space objects, save for the Orion Nebula and the Andromeda galaxy, are barely brighter than the ambient light pollution on an average clear night in the Akron area. This is why the images I produce are still so noisy: the object is barely above the noise floor, so trying to remove the noise removes some of the object. This balancing game is difficult and frustrating, and I typically choose to just leave the noise in to retain as much data from the object as I can.
The second problem is that the night sky moves. Or, should I say, the Earth rotates. Trying to take a single image of something so dim is not easy. Even with a large reflector telescope and a decent DSLR, you are receiving so few photons from the target object that you need to take really long exposures to get any dynamic range on the target. It’s not like taking a snap of your friend at the beach at 1/200 of a second. With my homemade tracking mount, I am lucky to get some decent, well-tracked exposures at 15 seconds so far. The sky moves quite a lot over those 15 seconds, and so tracking has to be absolutely perfect in order to keep the stars pinpoint and the object not blurred.
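To see just how much the sky moves, here is a small Python sketch estimating how far a star would trail across the sensor in a completely untracked exposure. The 1200 mm focal length and 4.3 µm pixel size are illustrative example values, not my exact gear:

```python
import math

# The sky appears to rotate a full 360 degrees in one sidereal day (~86164 s).
SIDEREAL_ARCSEC_PER_SEC = 360 * 3600 / 86164.1  # ~15.04 arcsec per second

def trail_pixels(exposure_s, focal_length_mm, pixel_um, declination_deg=0.0):
    """Apparent star trail, in pixels, for an untracked exposure.

    Drift slows toward the celestial pole by cos(declination).
    """
    drift_arcsec = (SIDEREAL_ARCSEC_PER_SEC * exposure_s
                    * math.cos(math.radians(declination_deg)))
    # Image scale: 206265 arcsec per radian, pixel size converted to mm.
    arcsec_per_pixel = 206265 * (pixel_um / 1000) / focal_length_mm
    return drift_arcsec / arcsec_per_pixel

print(trail_pixels(15, 1200, 4.3))  # ≈ 305 pixels of trailing over 15 s
```

Hundreds of pixels of smear from just 15 seconds of Earth's rotation is why the tracking platform has to be so precise.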
Amateurs with good equipment can take exposures from 2 to 30 minutes. However, this “good” equipment can be $2,000-$5,000 for a motorized German equatorial mount, and $1,000 to $10,000 for a fast imaging telescope. This also doesn’t include the dedicated CCD imagers at $1,000 or more, filters, proprietary software, guiding scopes, and all the other trinkets that add up. So, being able to create images showing the structure, essence, and beauty of deep space objects with budget gear that was not meant for this, and a $200 homemade tracking platform, just adds to the satisfaction and motivation I have for this hobby.
As you can tell, you can’t just put your Nikon Coolpix on a tripod and capture the spiral arms and dust lanes of a galaxy that is 25 million light years away.
What I want to do here is outline what it really takes for me to produce one of these small, noisy images that captures objects we never get to experience naturally.
On a clear night, the first thing I do is set up. This includes hauling over 200 pounds of telescope and tracking platform out into the yard. The tracking platform needs to be aligned to the celestial pole, which, luckily for us, is about where the bright star Polaris is. I usually have to wait until Polaris is visible, which is pretty much after dark. I set the platform pointing in the general direction and set an angled block on it that holds a laser pointer at my latitude, about 41 degrees. I rotate the platform and level it by adjusting the leveling feet until it points at Polaris. While not perfectly accurate, this proves good enough for my relatively short exposures.
After the platform is set up and aligned, I place the telescope on top. I then need to collimate the telescope. Collimation means lining up the optical axes of the large primary mirror and the eyepiece holder, where the camera is mounted. This must be done to ensure that the entire field of view is even and in focus at once. The first step is to get the eyepiece holder to point at the center of the primary mirror. I do this by shining a red flashlight on the primary, which has a red dot marked at its center, and adjusting the secondary mirror until the red dot is centered in a target eyepiece. Once this is done, I use a “Barlowed laser” to reflect an image of that red dot back into the eyepiece holder, where I center it by adjusting the primary mirror. This takes maybe five minutes.
In order to take great, clear shots, you need to focus your “lens.” By far the best way to do this is with a Bahtinov mask. This is a plastic cover that you put over the end of the telescope. It has slits cut in it at certain angles, and these slits produce diffraction spikes on a bright star. By adjusting the focuser, you can center the middle diffraction spike between the other two to get perfect focus. This probably takes 10 or so minutes, as I need to set up the camera, place it in the telescope, get the live preview from the camera going on the laptop, and then find and focus on a star through the live preview.
The most difficult part is getting the tracking platform to track at just the correct speed. I can get it close by putting a bright star in the crosshairs of the live preview on the laptop, and adjusting the tracking motor speed until the star appears still relative to the crosshairs. This is usually not completely accurate, and I tend to take a dozen or more test shots of a bright star at my target exposure length while I minutely adjust the imperfect potentiometer of the tracking motor. Once I get a good average of frames that have pinpoint stars, I call it good. The accuracy of my tracking platform is nowhere near perfect, so if I can get 4 out of 7 images to not have noticeable trailing, I move on.
The next hardest part is finding the objects. As I stated, these objects are so dim that I cannot usually see them in a large eyepiece, and so I definitely do not see them on the live preview, which has maybe 50ms exposure times. I tend to turn up the ISO gain of the live preview pretty high, so I can at least see some dimmer stars I couldn’t before. I then consult my planetarium software, Stellarium, to see what kinds of stars are around the object. I try to find some kind of distinct landmark shape in the stars, and move the scope around and take images until I finally get my object to appear. Here is what a raw image looks like right after taking it:
This is M63, the Sunflower galaxy, barely visible even in a 15-second exposure. This also shows how much light pollution there is, as the galaxy is nearly drowned out by it. You cannot see any structure or detail in the galaxy, only a faint smudge that is something other than a pinpoint star. The picture is redder than it would naturally appear because I removed the stock infrared-blocking filter from inside the camera. The light pollution is really more of a brownish green tint, like this:
Once I’ve found the object, I begin to take images of it in bursts of seven, the maximum number that my camera can handle at a time. I review each image of each of these bursts, looking for any trailing in the stars. If necessary, I readjust the tracking motor and take more bursts. I try to take about two hours of exposures of the object, usually at least 150 pictures. I like to shoot for 200 images if I can. As I said, my tracking platform isn’t perfect, and a really good shot can be followed by a really terrible one. The slightest bit of vibration or wind can completely ruin a shot, so I try to sit patiently as far away from the telescope as I can. With 200 exposures in total and some decent tracking, I am lucky to get maybe 30-60 images that come out clean.
Now that I’ve collected a sufficient number of “light frames,” or actual images of the target, I move on to taking other images that will help combat some noise. The first set is “dark frames.” These are taken at the same settings as the light frames, but with a black cap over the end of the telescope. They capture the noise that builds up in the camera from heat due to the exposures as well as operating and ambient conditions. I generally take 20 or so of these. Then, I take “bias frames,” which are at the same ISO gain but at the camera’s shortest exposure time, 1/4000 of a second. These frames capture the inherent read noise of the camera sensor. Finally, I replace the dark cap with a white one and shine a flood light on it to take “flat frames,” which expose flaws in the optical assembly, such as dust, vignetting, and unevenness in the camera sensor.
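The arithmetic behind these calibration frames is simple, even though the software hides it. Here is a minimal numpy sketch of the standard recipe (Siril's real pipeline is more sophisticated; the median combine and function shape here are my own assumptions for illustration):

```python
import numpy as np

def calibrate(light, darks, biases, flats):
    """Apply dark, bias, and flat frames to a single raw light frame.

    light:  one raw exposure of the target (2-D array)
    darks:  frames at identical settings with the telescope capped
    biases: shortest-exposure frames capturing sensor read noise
    flats:  evenly illuminated frames capturing dust and vignetting
    """
    master_dark = np.median(darks, axis=0)    # thermal + read signal
    master_bias = np.median(biases, axis=0)   # read signal alone
    master_flat = np.median(flats, axis=0) - master_bias
    master_flat /= master_flat.mean()         # normalize to ~1.0
    # Subtract the camera's fixed-pattern signal, then divide out the
    # optical unevenness so vignetted corners are brightened back up.
    return (light - master_dark) / master_flat
```

Dark and bias frames are *subtracted* because they are additive sensor artifacts; flats are *divided* because vignetting scales the light multiplicatively.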
Once all of this is done, it’s time to carry everything back into the house. It takes quite a few trips, but the anticipation is building and the physical labor goes unnoticed. I hook my old laptop up to the wired network and begin transferring the images to my desktop for processing. A typical session could lead to 6GB of data, which takes maybe 20 minutes to transfer.
Now is the fun part! I open up my astro image processing program, Siril, and have it begin by applying all of the dark, bias, and flat frames to each one of the light frames. This is a very resource-intensive process, taking maybe 25 minutes and 25GB of RAM to complete. Once that is done, I register all of the processed light frames. This transforms and aligns each image based on the stars in it, so that the target object is perfectly aligned in every frame. Finally, the images are combined together in what is called “stacking.” There are many ways to stack images, and I play around with this a bit until I get the best results. As I said, I usually limit the stacking to the best 30-50 pictures. This stacking process is what removes most of the noise and makes the target come to life.
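The core idea of stacking can be sketched in a few lines of numpy. This is a simplified sigma-clipping average, one family of rejection methods that stacking software like Siril offers; the default thresholds here are illustrative, not Siril's:

```python
import numpy as np

def sigma_clip_stack(frames, sigma=3.0, iterations=3):
    """Average registered frames, rejecting outlier pixels.

    Transient artifacts (satellite trails, cosmic-ray hits, hot pixels)
    land on different pixels in different frames, so values far from the
    per-pixel mean get masked out, while averaging the survivors drives
    random noise down by roughly the square root of the frame count.
    """
    data = np.asarray(frames, dtype=float)
    mask = np.zeros(data.shape, dtype=bool)
    for _ in range(iterations):
        stacked = np.ma.masked_array(data, mask)
        mean = stacked.mean(axis=0)
        std = stacked.std(axis=0)
        # Re-mask anything beyond sigma standard deviations of the mean.
        mask = (np.abs(data - mean) > sigma * std).filled(False)
    return np.ma.masked_array(data, mask).mean(axis=0).filled(0.0)
```

This is why 30-50 good frames beat one long exposure on my setup: each frame can be short enough for my imperfect tracking, and the rejection step quietly discards the pixels that single bad moments ruined.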
Now that I have a single complete image, I begin processing it. The first step is to neutralize the light pollution background color, and then white balance the object so that its natural color shows. I apply a background extraction algorithm in Siril to try to remove any background gradient. Then, I move the picture into Gimp, where I do some final manual manipulation. I typically subtract a uniform color from the image to remove as much of the light pollution noise as I can without losing too much of the object. Finally, I play with the brightness of the image to try to get the background to be a bit blacker.
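That "subtract a uniform color" step amounts to removing a constant pedestal from each channel. A small numpy sketch of the idea (the percentile-based pedestal estimate is my own assumption for illustration, not exactly what Gimp or Siril does):

```python
import numpy as np

def subtract_pedestal(channel, percentile=25.0):
    """Subtract a constant light-pollution pedestal from one channel.

    Most pixels in a deep-sky frame are empty sky, so a low percentile
    of the channel is a decent estimate of the pollution level. Clipping
    at zero darkens the background without driving pixels negative.
    """
    pedestal = np.percentile(channel, percentile)
    return np.clip(channel - pedestal, 0.0, None)
```

The balancing act from earlier shows up right here: push the pedestal (or the percentile) too high and the faint outer regions of the object get clipped away along with the sky glow.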
The entire process of imaging a deep sky object probably takes me five to six hours in total. It takes a lot of patience, physical work, luck, and computing power to photograph some of the most difficult targets there are. The result is always worth it, though.
So when you look at one of my images, don’t only think about the magnitude, beauty, and aura of something incomprehensibly large, but please consider the care and attention I took to produce that for you. 🙂