Imaging the Jellyfish
Choosing the Target
It seems that the Moon and the weather rarely cooperate, and when it’s clear, the Moon is often too bright to capture RGB targets. On one such evening last month I went in search of a good narrowband target.
As a marine biologist by training, I am of course always drawn to aquatic-themed targets, and the Jellyfish is an interesting one. Its light is concentrated in a few narrow, faint emission lines, which means that if you look at it through a telescope it's extremely difficult to see, if you can see it at all.
However, it makes for a great narrowband imaging target! It has lots of light coming from it in the Hydrogen-alpha and Sulphur-II lines. Perfect for a night with lots of light pollution from the Moon.
The Jellyfish Nebula, or IC 443, is the leftover cloud of gas and dust from a supernova. Its star likely collapsed between 3,000 and 33,000 years ago. When a star collapses, it ejects its outer layers of gas and dust into space, and the core collapses into either a neutron star or a black hole, depending on the mass of the original star.
I used AstroImageJ to process the images initially (it’s free and open source), and Adobe Photoshop to combine the black-and-white images into a colour picture.
Setting up the Scope
With the target chosen, I set about programming CCD AutoPilot to do my bidding. With the amount of time I had, I took 3 exposures each of Hydrogen-alpha, Sulphur-II and Oxygen-III. Each exposure was 1800 seconds, or 30 minutes; longer exposures capture more light from the object. We also have special software and hardware on the telescope that makes micro-adjustments to the incoming light as often as ten times a second, which corrects a lot of the fluctuations caused by the atmosphere.
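The session adds up quickly, as a quick back-of-the-envelope sketch in Python shows (the filter names and counts just restate the plan above):

```python
# 3 filters x 3 subs x 1800 s each, per the plan above.
filters = ["Ha", "SII", "OIII"]
subs_per_filter = 3
exposure_s = 1800

total_s = len(filters) * subs_per_filter * exposure_s
print(f"{total_s} s total = {total_s / 3600:.1f} hours of integration")
# -> 16200 s total = 4.5 hours of integration
```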
Once morning rolled around, I had the scope take flats for Hydrogen-alpha. Unfortunately this is all I had time for, as the scope was also taking flats for other targets at dawn. I may do another run on another night in order to collect more light frames and flats for other filters.
Checking back in the morning, all was well and the images were ready for processing!
Preparing the Light Frames
So out of that process there are now 9 images in 3 wavelengths showing the target itself, and 5 flat frames for just one of the filters. I wasn't able to create a fully processed final image, but I at least got started.
The only images I could truly fix errors on were the 3 Hydrogen-alpha frames. The other 6 frames (S-II and O-III) don't have flat frames yet, so I subtracted the darks from them and will apply the bias and flat corrections together once I have the flats I need.
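For anyone curious what applying these frames actually means, here is a minimal sketch of the standard calibration arithmetic in Python with NumPy. The function and the master-frame inputs are my own illustration, not AstroImageJ's exact pipeline:

```python
import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    """Standard light-frame calibration arithmetic (illustrative)."""
    # Dark subtraction removes thermal signal and hot pixels.
    dark_subtracted = light - master_dark
    # The flat is bias-corrected and normalised to a mean of 1, so
    # dividing by it undoes vignetting and dust shadows without
    # changing the overall brightness of the frame.
    flat = master_flat - master_bias
    return dark_subtracted / (flat / np.mean(flat))
```

With only Hydrogen-alpha flats in hand, my S-II and O-III frames stop after the dark subtraction step; the division has to wait for the missing flats.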
Take a look at the difference between a calibrated light frame and a raw light frame. Calibration takes out dead pixels, hot pixels, vignetting and electronic noise.
For my purposes, I aligned (stabilized) all the images in AstroImageJ before moving over to Photoshop, because I find it easier.
Getting a Colour Image
Using Photoshop, I combined each filter's three images into a single image. Doing this allows individual errors in each image to be eliminated. You can actually do something similar with daytime photos of people: if you combine them using certain settings, the "odd pixels out" are removed (like a person who appears in one photo but not in the other two). For astrophotography, this makes for less noisy images. The more frames you combine, the better the result will be! Stay tuned for a more detailed tutorial on this process (though hopefully made by someone with more experience than I have).
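Under the hood, a median stack is doing something like the following sketch (NumPy standing in for Photoshop here; the synthetic frames are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three synthetic "frames": the same flat scene plus random noise, with
# a bright streak (think satellite trail) in only one of them.
scene = np.full((100, 100), 100.0)
frames = [scene + rng.normal(0, 5, scene.shape) for _ in range(3)]
frames[1][50, :] += 500.0  # the "odd pixels out", present in one frame only

# The per-pixel median ignores a value that appears in only one frame,
# so the streak vanishes, and random noise shrinks as frames are added.
combined = np.median(np.stack(frames), axis=0)
print(combined[50, :5])  # values sit near 100: the streak is rejected
```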
It is at this point that artistic skill comes into play a bit. You can play with levels to make each black and white image pop the way you want it to. But it takes a creative hand and eye!
Now, with one combined frame for each colour filter, they can be applied to the colour channels: red, green and blue. Since the images are in black and white, applying one to a colour channel just tells Photoshop how strongly to illuminate that colour at each pixel. For example, a black and white image put into the red channel turns into an image running from black to pure red, and the same goes for the other two channels. When all three are combined, you get a colourful image based on which colour channels are lit up more or less than the others. What was black in all three images will be black in the RGB image. What was black in the red and blue images but white in the green image will appear green.
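In code terms, the channel assignment is nothing more exotic than stacking three grayscale arrays as the R, G and B planes of one image. A NumPy sketch, with stand-in arrays that reproduce the green example above:

```python
import numpy as np

# Three stand-in grayscale images, scaled so 0.0 is black and 1.0 is white.
h, w = 100, 100
red_source = np.zeros((h, w))
green_source = np.ones((h, w))   # white everywhere
blue_source = np.zeros((h, w))

# A pixel that is black in the red and blue sources but white in the
# green source comes out pure green, exactly as described above.
rgb = np.dstack([red_source, green_source, blue_source])
print(rgb[0, 0])  # -> [0. 1. 0.], i.e. green
```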
First Shot: Not Quite Accurate
Voila! A colour image! Sort of. I made a mistake on my first try, actually. The Jellyfish Nebula is not supposed to be pink, as pretty as that is.
I mean, really though, this is a false colour image as it is. If you could see this through a plain old telescope, it would look red. What we're making is our own thing: we're taking light that's crammed into a few narrow slices of the spectrum, most of it deep red, and spreading it across colours you can easily tell apart.
Sulphur-II emits light at 672nm. That's a deep, deep red. Hydrogen-alpha emits at 656nm: also red, but not quite as deep. Oxygen-III emits in the green-blue-turquoise part of the spectrum, at about 498nm. This poses a problem. We have red (peaking around 570nm), green (around 540nm) and blue (around 430nm) to work with, and the wavelengths the telescope imaged don't line up with those colours.
Astrophotographers often work in something called the Hubble Palette. You may recognize this palette when you see it out in the wild. It just means we assign the colours based on the closest wavelengths to them. Starting with Sulphur-II, the deepest red of all, it gets assigned to the red channel. Hydrogen-alpha is the next one along, so it gets assigned to the next-reddest channel: green. Finally, Oxygen gets assigned to blue (it’s actually pretty close to this wavelength too).
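The rule is simple enough to write down as a toy Python sketch, using the wavelengths above:

```python
# The emission lines, longest wavelength first, get handed to the
# display channels in the same order: red, then green, then blue.
lines_nm = {"SII": 672, "Ha": 656, "OIII": 498}

ordered = sorted(lines_nm, key=lines_nm.get, reverse=True)
palette = dict(zip(ordered, ["red", "green", "blue"]))
print(palette)  # -> {'SII': 'red', 'Ha': 'green', 'OIII': 'blue'}
```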
I didn’t do that at first. I accidentally assigned Hydrogen to red (instead of green), Sulphur to blue (instead of red) and Oxygen to green (instead of blue). I got as much wrong as I could. It did make for a beautiful pink photo though, as red and blue combine to make that brilliant colour.
And really, does it matter? It's not like we can actually see this nebula anyway, and the colours aren't shown as they really are; if they were, the whole thing would be red. But there is an accepted way of doing things, and alas, I caved to the astronomical peer pressure.
Second Time's the Charm
This was only my second shot at trying to process astrophotos. It takes a lot of work and a lot of problem solving, but is it ever satisfying at the end to have wrestled with the various programs and files and won!