Seeing in the Dark: How To Catch a Nebula
by Michael

Have you ever wondered how it's possible to photograph distant and faint deep space objects whose beauty and scale refuse to be comprehended by the human mind? Probably not, but for the 5 of you who might be interested, I'm doing a deep dive into the art of seeing in the dark.

Our Objective: Nebula

For this mission, I'll be focusing on how I capture nebulae (plural of nebula), which are vast regions of gas and dust in space, often stretching for thousands of light-years. Many nebulae are referred to as "stellar nurseries" because the gas and dust they contain will eventually collapse under the pull of gravity into brand new baby stars. Some of the material left over from star-birth can also become a protoplanetary disk, thus creating new planets. Seeing a nebula is like looking back in time to before our own solar system was born - an otherworldly baby photo.

There are several types of nebulae we care about as would-be astrophotographers:

  • Reflection Nebula
    • These nebulae are like mirrors - their dust is illuminated by nearby stars. They can be difficult to photograph in light-polluted areas close to cities (more on this later).
  • Emission Nebula
    • These nebulae are their own light source - they are made up of ionized gases which "emit" their own light. To me, these are the most fun to shoot because they can be easier to photograph (again, more on this later).
  • Dark Nebula
    • These nebulae are often seen together with either reflection or emission nebulae; however, they are basically the opposite - instead of producing light, they absorb and block most visible light. This may sound like an uninteresting object to photograph, but often the beauty of other nebulae is highlighted when a dark nebula is also in the composition.

Before we actually discuss the details of photographing nebulae, we first must talk about the magic of how computers store images.

Color and Light in Digital Images

Electromagnetic Spectrum (from primalucelab.com)

The goal of any form of photography is to record the intensity of light and color at a given point in time. Remember, light is just a wave, and the wavelength of that wave dictates what color the light is. Not all light can be seen with the human eye. The diagram above shows the various types of light and their corresponding wavelengths. We can only see from about 400 nanometers (violet) to about 700 nanometers (red). But just outside of the visible light spectrum, there is also ultraviolet (UV), infrared (IR), and other forms.

When you whip out your phone to take a quick selfie of your outfit-of-the-day, you are really asking your phone to take a measurement of the amount of light that is bouncing off you from the bathroom lighting fixture. Your phone collects the intensity of the light at millions of individual points (called pixels) over a set amount of time (called the exposure length - typically measured in fractions of a second) and turns those light measurements into a series of numbers. When you go to review that selfie to make sure you didn't blink, you are actually asking your phone to make its screen light up in such a way as to recreate the light that was captured during your selfie (kind of baller it can do that, if you ask me).
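If the "series of numbers" idea sounds abstract, here's a tiny sketch (Python with NumPy, and completely made-up values) of what a camera actually hands back to your phone:

```python
import numpy as np

# A hypothetical 4x4 sensor: each number is how much light one pixel
# collected during the exposure (0 = black, 255 = brightest).
exposure = np.array([
    [ 12,  14,  13,  11],
    [ 15, 180, 190,  14],
    [ 13, 185, 200,  12],
    [ 11,  13,  12,  10],
], dtype=np.uint8)

# A longer exposure simply collects more light at every pixel, so doubling
# the exposure time roughly doubles each measurement (capped at 255 here).
longer_exposure = np.clip(exposure.astype(np.uint16) * 2, 0, 255).astype(np.uint8)

print(exposure)         # the "photo" is just this grid of numbers
print(longer_exposure)  # same scene, twice the light
```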

What's interesting about your phone's camera (and every other camera you've ever used) is that it includes a special filter that blocks any light outside of the visible spectrum. It does this because when you're photographing your OOTD or your lunch, you don't want ultraviolet or infrared light interfering with the photo. So instead of showing you a photo with a ton of light you don't actually care about because you can't see it, the modern camera simply blocks it from being recorded.


A photo shown in full color (RGB), then the individual color channels 

What's even more interesting is that every digital photo is secretly made up of only three primary colors - red, green, and blue. Your phone's screen has millions of pixels that can only use one of those three colors to display an image. But by blending those three colors together additively, any color can be created - including the horrendous mauve that you decided to wear yesterday. Because of this, digital photos actually store three versions of the image in different "color channels" - one channel (or image) for each color (red, green, and blue). The example photo above shows a picture I took in Iceland. The first photo is all three color channels combined, as you would normally see it; the next three are the red, green, and blue channels separated, which essentially capture the amount of red, green, and blue light independently. Note how the hill is brighter in the red and green channels, yet appears dark in the blue channel. This is because there is less blue light coming from the hill - it's mostly green. I still find it amazing that combining three black and white images produces a color one.
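If you want to try this yourself, here's a minimal sketch (Python with Pillow and NumPy - the filename is just a placeholder) that splits a photo into its three channels and stacks them back into color:

```python
import numpy as np
from PIL import Image

# Load any RGB photo - "iceland.jpg" is a placeholder filename.
photo = Image.open("iceland.jpg").convert("RGB")

# Each channel is just a grayscale image: a 2D grid of light intensities.
red, green, blue = photo.split()
red.save("red_channel.png")      # bright where the scene is red
green.save("green_channel.png")  # bright where the scene is green
blue.save("blue_channel.png")    # dark over the mostly-green hill

print(np.asarray(green))  # under the hood, a channel really is just numbers

# Stacking the three black and white channels back together
# reproduces the original color photo.
Image.merge("RGB", (red, green, blue)).save("recombined.png")
```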

Narrowband vs. Full Spectrum Imaging

Rosette Nebula in the Hubble Color Palette



When it comes to photographing nebulae, the different types of nebulae require that we capture different parts of the spectrum described above. Some of that light might be UV, or it could be IR. Therefore, we use specialized cameras which don't have any filters blocking light. What's even crazier is that many of these cameras are black and white only - we capture the color channels (described above) independently instead of all-at-once like your phone does. The way we do this is to put a filter (a piece of glass) in front of the camera that only lets specific wavelengths of light (colors) through. For example, we can use red, green, and blue color filters to capture each channel on its own. Imagine if you had to take three photos every time you wanted to take one. That's essentially what we do in the world of astrophotography.

These light-selecting filters are not limited to just red, green, and blue. We can have filters that select very specific wavelengths of light outside of those primary colors and capture only those wavelengths. This technique is called narrowband imaging and is critical for astrophotography, as it allows us to collect the light that nebulae emit and completely ignore light that would otherwise interfere with our photos.
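Conceptually, a narrowband filter is just a very picky gatekeeper for wavelengths. Here's a toy sketch of the idea (the 7-nanometer bandpass is a typical hobbyist value I'm assuming, not the spec of any particular filter):

```python
# A narrowband filter, conceptually: it passes only a narrow window of
# wavelengths and rejects everything else.
def passes_filter(wavelength_nm: float, center_nm: float, bandpass_nm: float = 7.0) -> bool:
    return abs(wavelength_nm - center_nm) <= bandpass_nm / 2

H_ALPHA = 656.3  # hydrogen-alpha emission line, a deep red

print(passes_filter(656.3, H_ALPHA))  # True  - the nebula's own light gets through
print(passes_filter(589.0, H_ALPHA))  # False - sodium streetlight glow is rejected
print(passes_filter(700.0, H_ALPHA))  # False - unrelated red light is rejected
```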

For example:

Reflection Nebulae can reflect any wavelength of light. Therefore, when we photograph them, we typically capture the full spectrum of light (everything from ultraviolet to infrared). For digital photos, this is full RGB.

Emission Nebulae typically emit very specific wavelengths of light. Emission nebulae are predominantly made up of:

  • Hydrogen-alpha (HA) - emits at roughly 656 nanometers (a deep red)
  • Oxygen-iii (Oiii) - emits at roughly 501 nanometers (on the border of cyan and blue)
  • Sulfur-ii (Sii) - emits at roughly 672 nanometers (an even deeper red)

Notice how these emission lines don't correspond exactly to red, green, or blue. Hydrogen-alpha is close to red. Oxygen-iii is on the border of cyan and blue. Sulfur-ii is an even deeper red than hydrogen-alpha. Because digital images must be made up of red, green, and blue color channels, as we explored before, astrophotographers did their best and assigned each element to a specific channel. In the famous Hubble Space Telescope palette, NASA maps Sii to red, HA to green, and Oiii to blue. But this is not what the eye would see, which is why you often hear people claim that space photos are fake. To be clear, the light collected from these objects is 100% real light. The reason the colors get remapped is so that scientists can easily see which elements are in a nebula and display the amazing cosmic filaments in all their grandeur.
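To make the remapping concrete, here's a minimal sketch (Python with Pillow; the filenames are placeholders for your own stacked frames) of building a Hubble-palette image from three black and white narrowband exposures:

```python
from PIL import Image

# Three separate black and white exposures, one per narrowband filter.
# The filenames are placeholders, and the frames are assumed to be
# already aligned and the same size.
sii  = Image.open("sii.png").convert("L")
ha   = Image.open("ha.png").convert("L")
oiii = Image.open("oiii.png").convert("L")

# The Hubble (SHO) palette: Sii -> red, HA -> green, Oiii -> blue.
Image.merge("RGB", (sii, ha, oiii)).save("hubble_palette.png")
```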

Speaking of being able to see these objects with your eyes, I think it's time we talk about...

Exposure Length and Tracking

A long exposure of a waterfall blurs the motion of water


Cameras are often compared to eyes, but they have an interesting feature your eyes don't have: cameras can collect light over fractions of a second, seconds, minutes, or even hours (if your eyes can do this, please contact me immediately). Your everyday photos usually only collect light for somewhere between 1/30th and 1/4000th of a second. That's because cameras are used under your bright household lights or even under the nuclear furnace that is the Sun, so they don't need to spend a lot of time collecting light to "see." 

A long exposure of my sister running with Christmas lights wrapped around her


When there is less light in a scene, photographers can set their cameras to take longer "exposures" to collect light for more time. This is called long-exposure photography and is a common artistic technique for showing motion in everyday life, such as water falling in a waterfall or headlights passing in the night. It can also be used in particularly dark environments, but eventually there will be motion blur if objects are moving through the scene. In the photo above, I asked my sister to run around while wrapped in Christmas lights to create this interesting effect. The lights, which are usually single points, became lines because she moved while the camera was still exposing.

Deep space objects are extremely dim, so we also must use long exposures to collect enough light to see them. The issue is that nebulae, like everything else in the universe, are constantly moving, so if we do a long exposure, they will blur as they move across the night sky with the rotation of the Earth (see Sidereal Time).

In order to achieve long exposures without suffering from motion blur in our photos, we need our camera to rotate at the same speed as the Earth. This is done with an Equatorial Mount that is basically a tripod on steroids. Its job is to both hold the camera and telescope, and also rotate them to match the movement of the stars. It can be paired with a guide scope which watches a specific star to ensure the mount doesn't lose track of the object you're photographing. This is also baller.
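To see why tracking is non-negotiable, here's a rough back-of-the-envelope sketch (the focal length and pixel size are example values loosely based on my own rig, not a universal rule):

```python
# Back-of-the-envelope: how quickly does a star trail without tracking?
focal_length_mm = 765   # e.g. a 900mm scope with a 0.85x reducer
pixel_size_um = 3.8     # a typical small-pixel astronomy sensor

# Standard pixel scale formula: arcseconds of sky per pixel.
pixel_scale = 206.265 * pixel_size_um / focal_length_mm   # ~1.0 arcsec/px

# Stars near the celestial equator drift about 15 arcseconds per second
# of time as the Earth rotates (the sidereal rate).
sidereal_rate = 15.04  # arcsec per second

drift_px_per_sec = sidereal_rate / pixel_scale
print(f"{drift_px_per_sec:.1f} px/s")  # roughly 15 pixels of trailing every second
```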

How long are exposures in astrophotography? While it's possible to continuously expose throughout the whole night and take just one image, that's like putting all your eggs in a single basket, assuming you still carry eggs in baskets. What if an airplane or satellite crosses through your image while you're exposing? That would ruin the entire photo. To get around this, astrophotographers don't just take one long exposure - we take dozens, if not hundreds, in a single night, or even across many nights. It's then possible to "add" (or stack) all these photos together. How? Well, as we discussed earlier (and as you clearly remember), photos are really just numerical records of the intensity of light. Through software, we can add and average the recorded light, which reduces the amount of noise, or grain, in our photos.
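Here's a minimal sketch of that stacking idea (Python with NumPy, using fake frames with made-up noise levels). Averaging many noisy frames leaves the signal alone while the random noise shrinks roughly with the square root of the number of frames:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake "sub-exposures": the same faint signal buried under random noise.
true_signal = 50.0
frames = [true_signal + rng.normal(0, 25, size=(100, 100)) for _ in range(64)]

single = frames[0]
stacked = np.mean(frames, axis=0)  # "adding and averaging" the recorded light

print(f"noise in one frame:   {single.std():.1f}")   # ~25
print(f"noise after stacking: {stacked.std():.1f}")  # ~25 / sqrt(64) = ~3
```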

I think it's finally time to talk about...

Gear and Hardware

My astro rig as of 2026

The gear required to do basic astrophotography can be as simple as a tripod with a star tracker mount, a telephoto lens, and a camera. But that's like saying the only thing you need to cook dinner is a pan and a fire. Sure, it works, but wouldn't it be nice to also have a gas stove with an oven and a microwave? 

I use dedicated astro equipment as shown in the photo above:

  • An equatorial mount to hold my telescope and track the stars
    • SkyWatcher EQ-6 R Pro
  • A telescope big enough to frame most large objects
    • 900mm SkyWatcher EvoStar 100ed
    • SkyWatcher 0.85 reducer 
  • An auto-guider to help lock onto and track targets in case the mount gets misaligned
    • Astromania 50mm Guide Scope
    • ZWO ASI290MM Mini Guide Camera (yes, a camera just for guiding)
  • An astronomy camera for photographing objects
    • ZWO ASI1600-MM Pro
  • Dedicated narrowband filters to select the light to collect
    • ZWO 1.25" Sii, HA, & Oiii Filters
  • A filter wheel which lets me change the filters automatically without touching the camera
    • ZWO EFW 8-Position Filter Wheel 1.25"
  • An auto-focuser to keep the stars constantly in focus
    • Moonlite V3 Focuser 
  • Dew Heaters to prevent frost and dew from accumulating on the gear overnight
  • A computer to serve as the brains of the operation
    • Raspberry Pi 5 running Stellarmate OS and Kstars
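For the curious, here's a rough sketch of how you can estimate the image scale and field of view for a rig like this (the sensor resolution and pixel size are my assumptions for the ASI1600MM Pro - check your own camera's spec sheet):

```python
# Rough field-of-view estimate for a rig like the one above.
focal_length_mm = 900 * 0.85          # EvoStar 100ED with the 0.85x reducer
pixel_size_um = 3.8                   # ASI1600MM Pro pixel size (approx.)
width_px, height_px = 4656, 3520      # ASI1600MM Pro resolution (approx.)

pixel_scale = 206.265 * pixel_size_um / focal_length_mm   # arcsec per pixel

fov_width_arcmin = pixel_scale * width_px / 60
fov_height_arcmin = pixel_scale * height_px / 60

print(f"image scale: {pixel_scale:.2f} arcsec/px")
print(f"field of view: {fov_width_arcmin:.0f}' x {fov_height_arcmin:.0f}'")  # roughly 80' x 60'
```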

Data Collection

The heart of the Heart Nebula (Bicolor image - HA and Oiii only)


This is by far the most difficult part of the entire process - actually collecting data. It's exceedingly rare to have nights that are completely free of clouds, where no full moon is lighting up the sky, where there is no wind, and, finally, where you actually have time in life to spend many hours awake messing with your telescope. But assuming the stars metaphorically and literally align, this is the process:

  1. Pick a target to shoot! Stellarium is great software for finding out what's in the sky each night
  2. Set up your mount and connect all your gear to the computer
  3. Align your mount with the North Star. This allows it to correctly keep objects in frame
  4. Run software to do plate solving, which identifies where your telescope is pointing in the night sky
  5. Run your auto-focuser to ensure your main camera is completely in focus
  6. Run software to slew (move) your telescope so it's pointing at your object
  7. Pick your desired amount of data to collect and which filters to use (see the sketch below)
  8. Calibrate your auto-guider to track a star in the frame of your composition
  9. ...begin taking photos!
  10. Remember to turn off your telescope before sunrise

As easy as 1, 2, 3, ..., 10!
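For step 7, a simple way to think about "how much data" is to split your usable night across your filters. Here's a rough sketch with purely illustrative numbers, not a recipe:

```python
# Split a night's imaging time across the filters you plan to use.
usable_minutes = 5 * 60            # say, five clear hours
sub_exposure_s = 300               # 5-minute sub-exposures
filters = ["HA", "Oiii", "Sii"]

minutes_per_filter = usable_minutes / len(filters)
subs_per_filter = int(minutes_per_filter * 60 // sub_exposure_s)

for f in filters:
    print(f"{f}: {subs_per_filter} x {sub_exposure_s}s "
          f"({subs_per_filter * sub_exposure_s / 3600:.1f} hours)")
```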

Editing

Editing in PixInsight

Editing is an important step in astrophotography. It involves taking all the data collected during the night and combining it into a final color image. This topic is such a rabbit hole - involving deconvolution algorithms, working with non-linear data, and a whole host of other big-sounding words - that I decided to leave editing out of this post entirely. If you're interested in how I edit my photos, perhaps I can share that in a future post.

Conclusion

Astrophotography is a black hole that seems like quite the undertaking to learn. But it only seems that way because it is. I'm just joking, of course! As with any other field, there is a learning curve to overcome. But while you work to summon enormous galactic structures from the blank void of night, you will inevitably find yourself accidentally learning some fundamental realities of how our universe works. And I think that's extremely baller.