Light Vortex Astronomy
Tutorial (PixInsight):

Colour-Calibrating Images


In the absence of sufficiently sensitive colour-detecting cells in our retinas, the only measure of real colour in deep space comes from our images of objects such as nebulae and galaxies. This of course discounts narrowband images, which introduce false colours. When we say a colour image taken with a one shot colour CCD or DSLR camera, or an LRGB image made with a monochrome CCD camera, is colour calibrated, we mean that we have neutralised the background to a very light grey and defined the white point. If done correctly, we can say that our image is a true representation of the deep space object's real colour. 

Colour calibration is essential if colour images that are not narrowband images are to appear realistic. PixInsight has a number of processes that allow us to perform this colour calibration, and this tutorial covers precisely these processes. Also covered is how the procedure relates to narrowband images for those with monochrome CCD cameras.

Assumed for this tutorial:
  • Knowledge of operating PixInsight, related to dealing with images and processes (read this, sections 3 and 4).
  • Your images have already been pre-processed fully (read this). 
  • If your images are monochrome, they have been prepared fully by registering them, cropping black edges, subtracting background gradients and matching their brightness levels (read this). You have colour-combined them together into a colour image (read this). 

​Please feel free to ask questions or leave comments via the comments section on the bottom of this page. 

Contents
1. Neutralising the Background with BackgroundNeutralization
2. Performing the Colour Calibration with ColorCalibration
3. Performing the Colour Calibration with PhotometricColorCalibration
4. Removing the Green Colour Cast with SCNR
5. Notes on Images with Narrowband Data
​Comments

 

1. Neutralising the Background with BackgroundNeutralization

We start by presenting the image that we will use for this tutorial. This image comes from three separate Red, Green and Blue monochrome images. As per the above assumed list, the monochrome images were pre-processed fully, registered with StarAlignment, cropped with DynamicCrop, background-extracted with DynamicBackgroundExtraction and matched with LinearFit. The monochrome images were then colour-combined with ChannelCombination. Below is this image, in its linear state, auto-stretched in PixInsight.
Picture
The above may not present a very clear scenario for needing to colour calibrate the image, since it already looks pretty natural (with a slight overall tint to the background). This is the case because it has been background-extracted with DynamicBackgroundExtraction and then matched with LinearFit before colour-combination. The LinearFit function matches the average brightness level of the background and signal over the images and as a result, they already match well.
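For those curious, the brightness matching that LinearFit performs can be sketched in a few lines. This is a least-squares illustration of the principle only (PixInsight's actual implementation minimises average absolute deviation), assuming NumPy and hypothetical array inputs:

```python
import numpy as np

def linear_fit(reference, target):
    """Illustrative sketch of LinearFit: find a and b such that
    a + b * target best matches reference (least squares here), then
    apply them so the target channel's brightness matches the reference."""
    b, a = np.polyfit(target.ravel(), reference.ravel(), 1)
    return a + b * target
```

In practice you would, for example, fit the Green and Blue monochrome images to the Red one before ChannelCombination, which is exactly what makes the combined image look reasonably balanced even before calibration.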

Nevertheless, there are some benefits to be had from performing a full colour calibration routine, especially since it does not take very long at all to perform. It is important to note that this procedure should be carried out on a linear image, before any stretching happens. We start by neutralising the background. For this, we use the BackgroundNeutralization process.
Picture
This process benefits greatly from creating a small preview box in your image that covers only background. By background, I mean no nebulosity, no galaxies and no stars - simply background space. Pick an area that is devoid of anything interesting at all and that has a more or less average background tint to it. It helps to zoom in and pan around the image. Once you find a suitable area, create a preview box around it - do not include any stars if you can!
Picture
The preview box need not be very large at all, so do not worry. Your image may be more difficult to work with if you have a lot of stars in the field of view. Once you have created your preview box, go back to Readout Mode in PixInsight, as we may need it.

With the preview box created, click the small button next to the text box for Reference image in the BackgroundNeutralization process, select the preview box from the list and click OK.
Picture
Picture
Picture
If you have captured some stars in your preview box because they are totally unavoidable, or some nebulosity that is likewise unavoidable, you can change the Upper limit parameter in BackgroundNeutralization to exclude these problematic pixels. If your preview box is clear of any stars or nebulosity (or pretty much clear, as shown in mine above), do not change the default settings, as they work very well with linear images. To inspect problematic pixels inside your preview box, click on them and hold the mouse button down, and the Readout Mode output will appear.
Picture
On the left of the Readout Mode output, you will see values listed for R, G and B. Your Upper limit setting in BackgroundNeutralization should ideally be below this if what you are pointing at truly stands out above background (it does not in my preview box!). Since there are three values given (R,G and B), pick the smallest one of the three to set your Upper limit slightly below it. If there are multiple problematic pixels inside your preview box, check them all out with Readout Mode, find the smallest value and use this to set your Upper limit setting. You can leave Lower limit at default.

Please remember that if you are careful in creating your preview box, this may not be necessary at all, in most cases (unless your image has absolutely no area not covered in nebulosity). If indeed there is zero background in your image (this is extremely rare unless it is a very narrow-field image of a large nebula with overall bright nebulosity), you need not apply BackgroundNeutralization at all and can skip to section 2.
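Conceptually, what BackgroundNeutralization does is shift each channel so that the background preview box averages out to the same neutral grey, ignoring pixels outside the Lower/Upper limits. Here is a simplified NumPy sketch of that idea (an illustration of the principle, not PixInsight's implementation; the function name and inputs are hypothetical):

```python
import numpy as np

def neutralize_background(rgb, bg_region, lower=0.0, upper=0.1):
    """Sketch of BackgroundNeutralization: shift each channel so the mean
    of the background preview region becomes a common neutral grey.
    `rgb` is an HxWx3 float array in [0, 1]; `bg_region` is a pair of
    slices selecting the background preview box. Pixels outside
    [lower, upper] (e.g. stray stars) are excluded, mimicking the
    Lower/Upper limit parameters."""
    out = rgb.astype(np.float64).copy()
    patch = out[bg_region]                      # the background preview box
    means = []
    for c in range(3):
        vals = patch[..., c]
        vals = vals[(vals >= lower) & (vals <= upper)]
        means.append(vals.mean())
    target = float(np.mean(means))              # common neutral grey level
    for c in range(3):
        out[..., c] += target - means[c]        # per-channel additive offset
    return np.clip(out, 0.0, 1.0)
```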

Once you are happy with your BackgroundNeutralization settings, click the Apply button and your image will take a slightly different tone. At this point, you will need to re-apply the auto-stretch to truly see the difference made to your image (as the auto-stretch you had was applicable when the background was not neutralised).
Picture
Clearly LinearFit and DynamicBackgroundExtraction did an absolutely excellent job of matching the images prior to colour-combination but BackgroundNeutralization has in itself benefited the image somewhat as well. At this point, you can close the BackgroundNeutralization process. Leave the preview box in place because we will need it for the next section.
 

2. Performing the Colour Calibration with ColorCalibration

Colour calibration can be performed with one of two processes - ColorCalibration or PhotometricColorCalibration (new as of version 1.8.5). This section covers the former and the next section covers the latter. You may choose to use one or the other for your images, but please note that using both is not the correct procedure for colour calibration. 

The ColorCalibration process requires two preview boxes - one over background and another over what is essentially a white reference. White references in astrophotography are essentially G2V stars, but PixInsight differs in how it performs the colour calibration - it instead opts to sample all colours across an image. In the end, it is actually very effective at doing its job. We now open the ColorCalibration process.
Picture
Since we already have a preview box created over background, this will act as our background reference. It must therefore be selected as such. We simply click the button next to the text box for Reference image under Background Reference and select the preview box over background.
Picture
Picture
Picture
We now need to create a preview box in our image that represents a white reference. If your image contains a fairly prominent galaxy, simply create a preview box over the entire galaxy and disable Structure Detection in ColorCalibration. This forces the process to use the entire galaxy as a white reference, which is indeed excellent as galaxies contain every type of star and average out to white very nicely. If however your image, like mine, does not have the luxury of containing a fairly prominent galaxy, we must use stars for our white reference.

In this scenario, we keep Structure Detection enabled and default settings here work well. This feature basically forces the process to detect the stars in the image (or the preview box) and use them for white reference, as opposed to nebulosity, background, etc. Keeping Structure layers to 5 samples most stars in the image, particularly small ones. You can raise this to 6 or so to include bigger stars, but at the risk of including structures such as nebulosity. Keep Noise layers to 1 as this ignores small-scale noise in the image when detecting stars for white reference.
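Conceptually, ColorCalibration applies a background offset plus per-channel gains derived from the white reference, so that the white reference averages out to neutral. A simplified NumPy illustration of that principle follows (hypothetical function and inputs; PixInsight's actual implementation adds structure detection and more):

```python
import numpy as np

def color_calibrate(rgb, white_region, bg_region):
    """Sketch of the idea behind ColorCalibration: the background box
    sets per-channel offsets, the white-reference box sets per-channel
    gains, so the white reference averages out to neutral. `rgb` is
    HxWx3 in [0, 1]; the regions are pairs of slices."""
    out = rgb.astype(np.float64).copy()
    bg = out[bg_region].reshape(-1, 3).mean(axis=0)
    out -= bg                                    # neutralise background to zero
    white = out[white_region].reshape(-1, 3).mean(axis=0)
    gains = white.mean() / white                 # scale channels to a common white
    out *= gains
    out += bg.mean()                             # restore an overall pedestal
    return np.clip(out, 0.0, 1.0)
```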

Since my image contains a good amount of stars, it is easy to create a fairly large preview box over most of the image and be done with it. 
Picture
Please note you can indeed overlap the white reference preview box over the background preview box - Structure Detection will ensure only stars are used as white reference anyway. Once you have created your white reference preview box, click the button next to the text box for Reference image under White Reference and select this new preview box.
Picture
Picture
Picture
At this point, ColorCalibration is properly configured and you can click Apply on it with your image selected. Once done, re-apply the auto-stretch to truly see the effects of colour calibration (as the previous auto-stretch applied to the image as it was before colour calibration). You may also close ColorCalibration and delete all the preview boxes.
Picture
There has been a pronounced effect on the colours in my image and it now looks well and truly natural. Best of all, the image is still in its linear state and further post-processing can be carried out (such as noise reduction, stretching, contrast enhancement, etc). There may be a green colour cast to your image at this point, which is what we talk about next. Before that, however, we introduce another colour calibration process in PixInsight (new as of version 1.8.5), which can be used instead of the ColorCalibration process used in this section. 
 

3. Performing the Colour Calibration with PhotometricColorCalibration

PhotometricColorCalibration was introduced in PixInsight version 1.8.5 as a new method of colour calibration. This process plate solves your image to figure out a true white reference based on a selection by the user, such as a spiral galaxy, elliptical galaxy or G2V type star. Using PhotometricColorCalibration is supposedly more scientific, but it may or may not lead to a result you like more than that produced by the aforementioned ColorCalibration process. Use one process or the other, but it is recommended you try both before deciding. 

With no preview boxes present in our image but with the background neutralised with BackgroundNeutralization, we open the PhotometricColorCalibration process. 
Picture
The first step is to tell PhotometricColorCalibration the Declination and Right Ascension coordinates of your image. The centre of the image's field of view is ideal but any object contained within it close to the centre will suffice. If there is a named object near the centre of the image, simply click the Search Coordinates button, enter the designation or name of the object and click Search​. 
Picture
Picture
Picture
As shown above, either designations or common names work well. However, entering horsehead alone was not enough​ and the tool was unable to retrieve the object coordinates - please be aware of this. Once your object has been picked up by the tool, click the Get button and the Declination and Right Ascension coordinates will be updated in PhotometricColorCalibration​. 
Picture
Please note that PhotometricColorCalibration will use the Declination and Right Ascension coordinates entered alongside an estimate of your field of view to plate solve your image. This does however assume that the field of view is around your target Declination and Right Ascension coordinates, i.e. that these coordinates are the centre of your field of view. This is why it is best to choose an object that is in the centre or near the centre of your image (there is some leeway where it will still plate solve correctly). 

If there is no object at the centre or near the centre of your image, you should determine the actual Declination and Right Ascension coordinates of the centre of your image. For this, you can upload your image to Astrometry.net for plate solving. It is advisable that you upload a single raw frame. If you are working with a mosaic that has a panel in the centre, upload a single raw frame from this central panel. If your mosaic does not have a central panel, you can upload a pre-processed channel of your completed mosaic, but upload one without any post-processing (still in its linear​ state). ​Once the image is uploaded, it will be plate solved automatically. Once done, the page will report Success and you can go into the results page. On the right column of the results page you will find coordinates for the centre of your uploaded image. 
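If you prefer to convert the sexagesimal coordinates Astrometry.net reports into decimal degrees yourself, the arithmetic is simple. A minimal sketch follows; the Horsehead Nebula values shown are approximate and only for illustration:

```python
def ra_to_degrees(hours, minutes, seconds):
    """Right Ascension: 24h spans 360 degrees, so 1h of RA = 15 degrees."""
    return (hours + minutes / 60.0 + seconds / 3600.0) * 15.0

def dec_to_degrees(degrees, arcmin, arcsec):
    """Declination: degrees/arcminutes/arcseconds, preserving the sign."""
    sign = -1.0 if degrees < 0 else 1.0
    return sign * (abs(degrees) + arcmin / 60.0 + arcsec / 3600.0)

# Approximate Horsehead Nebula coordinates (illustrative values only):
ra = ra_to_degrees(5, 41, 0)       # roughly 85.25 degrees
dec = dec_to_degrees(-2, 27, 30)   # roughly -2.458 degrees
```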
Picture
​In my case, as the Horsehead Nebula is so close to the centre of the image, the coordinates are very close. Still, having bothered plate solving the image online, I may as well update my coordinates to the true centre.
Picture
Next, we need to provide PhotometricColorCalibration with the pixel scale of the image. If you did not use DrizzleIntegration during pre-processing (which artificially increases the resolution of your images), then entering the focal length of your telescope and the pixel size of your camera will do the job just fine. In my case, however, I did use DrizzleIntegration during pre-processing. Astrometry.net calculated that my image's pixel scale is 1.04 arcseconds/pixel, whereas the native pixel scale of my telescope and camera is double this, 2.08 arcseconds/pixel. What this means is that if I enter the correct telescope focal length of 450mm and the correct pixel size of 4.54μm, PhotometricColorCalibration will fail to plate solve the image, as the computed pixel scale will differ considerably from the image's actual pixel scale. 

The best way around this problem is to enter your camera's correct pixel size, in my case 4.54μm, and then, for the focal length of the telescope, enter an up-scaled value depending on what DrizzleIntegration setting you used. As I always use DrizzleIntegration at x2, the resulting image pixel scale is half of the native value, as measured by Astrometry.net. In order to get half the pixel scale, instead of entering my true focal length of 450mm, I enter 900mm. Alternatively, I could enter 450mm for the focal length but half the pixel size, 2.27μm. Either way, this leads to the correct image pixel scale of 1.04 arcseconds/pixel and allows PhotometricColorCalibration to function properly. 
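The arithmetic behind this work-around is just the standard plate-scale formula with the DrizzleIntegration factor included. A small sketch, using the values from my setup above:

```python
def pixel_scale(pixel_size_um, focal_length_mm, drizzle=1):
    """Image scale in arcseconds/pixel: 206.265 * pixel size (um) /
    focal length (mm), divided by any DrizzleIntegration up-scaling."""
    return 206.265 * pixel_size_um / focal_length_mm / drizzle

native = pixel_scale(4.54, 450)               # ~2.08 "/pixel, undrizzled
drizzled = pixel_scale(4.54, 450, drizzle=2)  # ~1.04 "/pixel, the image's scale
workaround = pixel_scale(4.54, 900)           # ~1.04 "/pixel, doubled focal length
```

Doubling the focal length and halving the pixel size are equivalent here, since only their ratio enters the formula.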
Picture
Finally, all we need to tweak, if we want, is the White reference used by PhotometricColorCalibration. The default of Average Spiral Galaxy is actually the one recommended by PixInsight, as it seems to provide the best overall end results across the board. You may however want to try Elliptical Galaxy or the one we expect would give a good white balance, G2V Star. ​​For now, I will keep the default Average Spiral Galaxy​ selected. The default Database server should do the job (this is where the process will download plate solving astrometric index files from to plate solve your image with). If when you run the process, plate solving fails because it cannot download astrometric index files, just change the Database server to something else. We now run the process and check the end result. 
Picture
​The end result looks excellent and very natural. You will notice a set of graphs coming up - these show you how well the colour calibration data was fit to the references used. You may close the graphs window if you wish. 

If plate solving failed, you can tweak some parameters in PhotometricColorCalibration to help it work. If your stars are slightly distorted, you can enable Distortion correction under Plate Solving Parameters. If you would like the process to detect more stars, you can expand its Advanced Plate Solving Parameters section and drag the Log(sensitivity) slider all the way to the left so the value is -3.00 (this makes star detection more sensitive, as it is a logarithmic scale). If your image has quite a lot of noise as well, you can increase Noise reduction to 1 so that the first pixel layer (where the vast majority of small-scale noise resides) is ignored during star detection. 
Picture
The following shows the end results produced by PhotometricColorCalibration based on the three main White reference options - Average Spiral Galaxy, Elliptical Galaxy and G2V Star​, respectively. Following each, I performed an auto-stretch to show the actual result. 
Picture
Picture
Picture
It is certainly worth trying these three White reference options, as well as the ColorCalibration process covered previously. Once you are content with the colour calibration performed, we move on to removing the green colour cast that is likely present in your image. ​
 

4. Removing the Green Colour Cast with SCNR

Having a green colour cast to images is fairly common but as we know, generally speaking, deep space objects are not green-coloured (there are some planetary nebulae that would disagree with this statement!). Though some images may not show a very strong green colour cast, it is sometimes clearly visible on zooming in.
Picture
Removing what is effectively green noise is easy with the SCNR process.
Picture
SCNR is one of those processes you can apply and forget, because it is so simple to operate. Default settings work tremendously well in the vast majority of cases, with Average Neutral selected under Protection method and Green selected under Color to remove. Amount can be left at 1.00 if you want the colour cast fully removed, or lowered if you want a blend between the corrected image and the original (e.g. 0.50 gives an end result that is 50% original image and 50% corrected image). Keep Preserve lightness enabled in order not to alter the brightness anywhere in the image. A simple click of the Apply button with default settings gives a clear difference (with an accompanying re-application of the auto-stretch).
Picture
Though in most cases that will be that, you may wish to actually keep some of the green in order to preserve the nice teal colours provided by some planetary nebulae. If you wish to do this, instead of Average Neutral selected under Protection method, select Maximum Neutral. That is all there is to it. Below shows the result with Maximum Neutral.
Picture
Sure, the difference is very small for this image, but choose as best suits your taste and your particular image.
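For the curious, the maths behind SCNR's two protection methods is simple to sketch. The following is an illustration of the formulas, not PixInsight's implementation, and the Preserve lightness step is omitted for brevity:

```python
import numpy as np

def scnr_green(rgb, method="average_neutral", amount=1.0):
    """Sketch of SCNR's green removal: cap the green channel with a
    protection value built from red and blue, then blend by `amount`.
    average_neutral: m = (R + B) / 2;  maximum_neutral: m = max(R, B)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    if method == "average_neutral":
        m = 0.5 * (r + b)
    elif method == "maximum_neutral":
        m = np.maximum(r, b)
    else:
        raise ValueError(method)
    g_new = np.minimum(g, m)                 # never raise green, only cap it
    out = rgb.astype(np.float64).copy()
    out[..., 1] = (1.0 - amount) * g + amount * g_new
    return out
```

You can see from the formulas why Maximum Neutral is gentler: it protects any green that does not exceed the stronger of red and blue, which is what preserves those teal planetary nebula hues.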

This concludes the entire colour calibration procedure with a very effective end result that looks very natural. Best of all, the image is still in its linear state, ready for further post-processing (e.g. noise reduction, stretching, contrast enhancement, etc). We now close off with a quick discussion of how colour calibration applies to images with narrowband data.
 

5. Notes on Images with Narrowband Data

 Narrowband data can generally be used in two ways:
  1. To produce fully-narrowband images with false colours (e.g. Hubble Palette). 
  2. To enhance a specific colour channel of a regular RGB colour image (e.g. Hydrogen-Alpha in the Red channel). 
Let us explore each one listed above separately.

First, fully-narrowband images with false colours. These images are said to have false colours quite simply because the monochrome images used to produce the colour image do not sample much of the spectrum at all. Narrowband filter bandwidths used for their capture generally range between 8.5nm down to 3nm, which is a minuscule part of the spectrum. Also, some of the filters used sample spectral lines that are very close together in the spectrum. For example, the famous Hubble Palette uses a Sulphur-II image in the Red channel, a Hydrogen-Alpha image in the Green channel and an Oxygen-III image in the Blue channel. Though Sulphur-II can indeed be said to be within the Red part of the spectrum, so can Hydrogen-Alpha! Oxygen-III is more of a teal colour as well, as it lies in-between Green and Blue. The end result of colour-combining such monochrome narrowband images is not going to yield a set of realistic colours for the deep space object.
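The channel mapping itself is trivial. Assuming NumPy and three registered monochrome frames, the Hubble Palette combination described above is just:

```python
import numpy as np

def hubble_palette(sii, ha, oiii):
    """Hubble Palette false-colour mapping: Sulphur-II -> Red,
    Hydrogen-Alpha -> Green, Oxygen-III -> Blue. Inputs are 2-D
    arrays of identical shape; output is an HxWx3 colour image."""
    return np.stack([sii, ha, oiii], axis=-1)
```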

That is not a problem, really, and in fact people have come up with all sorts of interesting palettes for colour-combining various monochrome narrowband images into a colour image. The balance is one between scientific value (e.g. specific gas distributions) and artistic value (does it look good and is it eye-catching?). Some even produce colour images out of two monochrome narrowband images, usually Hydrogen-Alpha and Oxygen-III (this is called the Bicolour Palette). Needless to say, colour calibration is not even a concept involved in this type of fully-narrowband image. For this kind of image, BackgroundNeutralization is the only process required, performed as soon as the monochrome images are colour-combined. In fact, BackgroundNeutralization is needed if you are to see just how the colour-combined image really looks.

Second, monochrome narrowband images combined with regular RGB colour image channels. Regular RGB colour images require the colour calibration procedure detailed in this tutorial. However, once you combine a monochrome narrowband image with a specific colour channel, this colour calibration is lost. Trying to colour calibrate again is in fact counterproductive, as a significant part of the enhancement provided by the narrowband data is lost when you colour calibrate again. However, much like with fully-narrowband images, using the BackgroundNeutralization process on its own neutralises the background while keeping the enhancements provided by the narrowband data.
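As an illustration of the kind of combination being described, here is one common, simple way to fold Hydrogen-Alpha into the Red channel. This is a hypothetical weighted blend; in PixInsight this is usually done with PixelMath, and the exact expression varies from person to person:

```python
import numpy as np

def enhance_red_with_ha(red, ha, weight=0.3):
    """Blend a Hydrogen-Alpha frame into the Red channel of an already
    colour-calibrated RGB image. `weight` controls how much narrowband
    signal is mixed in; this is only one possible recipe."""
    return np.clip((1.0 - weight) * red + weight * ha, 0.0, 1.0)
```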

As a quick summary, here is what to do with images with narrowband data with respect to colour calibration:
  • For fully-narrowband images, once colour combined, use BackgroundNeutralization alone as per section 1 of this tutorial. 
  • For images with narrowband data used to enhance colour channels in a regular RGB colour image, as soon as you produce the RGB colour image, colour calibrate it as per this entire tutorial and then combine with narrowband data. Once the narrowband data has been combined with the RGB colour image, use BackgroundNeutralization alone as per section 1 of this tutorial. 
This should summarise the relationship between colour calibration and images with narrowband data. 

 