The histogram is probably the most important tool available in any astro-imaging application, both for capturing images and for processing them once captured. For those just starting out in astrophotography, the histogram is also one of the simplest but most frequently misunderstood tools.
In this short guide I will explain the histogram: both what it means and how it can help you to make better images.
Despite the fact that it is such an important tool, many applications treat the histogram as a bit of an afterthought. It is often displayed without any labels or explanation, plotted at too small a scale to be useful and with no ability for the user to zoom in or otherwise manipulate the display. Perhaps it is not surprising that it causes so much confusion.
Computer Image Basics
Before I explain the mysteries of the histogram, we just need to review a few basics of computer images. As you may know, they are made up of a grid of Pixels, each of which represents a small part of the overall picture; see below for an example.
In a monochrome (“black and white”) image, each pixel is recorded as a number which corresponds to how “bright” it is. A value of zero means that pixel is black and the maximum value means it is white. The numbers in between represent the different shades of grey.
In a typical monochrome image, the maximum value in the image file depends on the number of Bits recorded for each pixel. An image with 8 bits per pixel will contain values between 0 and 255, making for a maximum of 256 different shades of grey (remembering to count black and white as shades!) At 16 bits per pixel, values will range from 0 to 65,535 making for 65,536 different shades.
Colour images work in the same way, except that three numbers are recorded representing the amount of Red, Green and Blue in each pixel. By mixing these three Primary Colours in different proportions, we can create any other colour. The more bits per primary colour, the more possible combinations of the three primary colours there are:
| Bits Per Primary Colour | Shades Per Primary Colour | Total Colour Combinations |
|---|---|---|
| 8 | 256 | 16.7 million |
| 12 | 4,096 | 68.7 billion |
| 14 | 16,384 | 4.4 trillion |
(Note, one billion = 1 thousand million and 1 trillion = 1 million million).
In reality, the human eye cannot distinguish more than about 16.7 million different colours. Images with more colour combinations than this will usually be converted by your computer’s graphics card to compress the colours into a much smaller range that the monitor can display (called a Gamut). Nonetheless, the extra colour depth is very useful when processing images, since astro-images generally require us to identify and enhance tiny differences in colour and brightness.
We use the term ADU (Analog to Digital Unit) to refer to the pixel values that the camera stores in the image file in order to distinguish them from the (usually) smaller number of brightness levels that are actually displayed on the screen.
As a final complication to bear in mind, computers like to work in multiples of 8 bits (a Byte). So a 12 or 14 bit ADU will occupy 16 bits in the image file. Depending on the camera and software used, the extra bits may be left unused, or each ADU from the camera may be multiplied so that it fills the 16 bit number space in the file. If so, each 12 bit ADU value would be multiplied by 16, and each 14 bit ADU multiplied by 4.
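To make that concrete, here is a minimal sketch in Python (assuming NumPy is available; the variable names are purely illustrative) of how a 12 bit ADU can be scaled up to fill a 16 bit file:

```python
import numpy as np

camera_bits = 12                        # bit depth the sensor actually delivers
file_bits = 16                          # bit depth of the saved image file
scale = 2 ** (file_bits - camera_bits)  # 16 for 12-bit data, 4 for 14-bit data

raw_adu = np.array([0, 1000, 4095], dtype=np.uint16)  # example 12-bit values
file_adu = raw_adu * scale              # values as they would appear in the file
print(file_adu)                         # [    0 16000 65520]
```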
If that is all a bit too much to take in, just bear in mind that each pixel in our image is represented by numbers, and the bigger the number, the brighter that pixel is.
So What Is a Histogram?
So the image you have worked so hard to create is just a bunch of numbers in your computer, and as we’ll see the histogram is an important tool for understanding those numbers.
A histogram is just a bar chart which shows how many pixels have each ADU value. Below is a simplified mock-up of a histogram to illustrate the point:
- The full range of potential ADU values is plotted along the horizontal axis of the chart. In the above example you can see that we have values ranging from 0 (black) on the left to 65,535 (white) on the right. If you were paying attention earlier, you’ll realise this histogram is from a 16 bit monochrome image.
- The number of pixels in the image which have a given ADU value is plotted on the vertical axis. You can see that the scale ranges from zero to 250,000 pixels. There are actually 786,432 pixels in my sample image, but the vertical axis will usually show a smaller scale based on the highest ‘peak’ in the histogram since we don’t need to see all the blank space at the top of the chart.
- The red bars show the number of pixels that have that particular ADU value. The tallest bar in the chart shows us that nearly 225,000 of the pixels in our hypothetical image have an ADU value of 13,107. You can also see that there are zero pixels that have an ADU value of 65,535.
Note: In this simple example I’ve mocked up the histogram using 21 ADU values at intervals between 10 and 65,535. In a real histogram there would be a bar for each ADU value. I have used a small number of points so I can show you what is going on more easily.
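If you would like to experiment, here is a minimal Python sketch (assuming NumPy and Matplotlib are installed) that builds a histogram like the mock-up above. The image here is just simulated data standing in for a real 16 bit monochrome frame; in practice you would load your own file with whatever library your camera's format requires:

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in for a real image: a 1024 x 768 array of 16-bit ADU values
# (786,432 pixels, matching the sample image described above).
rng = np.random.default_rng(0)
image = rng.normal(13107, 2000, size=(768, 1024)).clip(0, 65535).astype(np.uint16)

# Count how many pixels fall into each ADU bin (256 bins here for readability;
# a "full" histogram would use 65,536 bins, one per possible ADU value).
counts, bin_edges = np.histogram(image, bins=256, range=(0, 65535))

plt.bar(bin_edges[:-1], counts, width=np.diff(bin_edges), align="edge", color="red")
plt.xlabel("ADU value (0 = black, 65,535 = white)")
plt.ylabel("Number of pixels")
plt.show()
```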
In the case of a colour image the histogram may be displayed in different ways:
- A single chart showing just the ‘brightness’ (Luminance) of the image, which is a sort of weighted average of the red, green and blue values for each pixel (a sketch of these options follows this list).
- Three separate charts showing the histogram for the red, green and blue values of each pixel.
- A single chart with the red, green and blue histograms overlaid on top of each other (these are often annoyingly hard to read though!)
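As a rough illustration of those options, here is a Python sketch (assuming NumPy, with simulated data in place of a real colour image). The 0.299/0.587/0.114 weights are just one common luminance formula, used purely for illustration; applications may weight the channels differently:

```python
import numpy as np

# Stand-in for a real 16-bit colour image: height x width x 3 (R, G, B).
rgb = np.random.default_rng(1).integers(0, 65536, size=(768, 1024, 3), dtype=np.uint16)

# Option 1: a single luminance histogram (weighted average of R, G and B).
luminance = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
lum_counts, _ = np.histogram(luminance, bins=256, range=(0, 65535))

# Options 2 and 3: one histogram per channel, shown separately or overlaid.
channel_counts = {
    name: np.histogram(rgb[..., i], bins=256, range=(0, 65535))[0]
    for i, name in enumerate(["red", "green", "blue"])
}
```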
We’ll stick with our monochrome example for now, but we will take a look at a real example of a colour histogram later on.
Interpreting the Histogram
Our sample histogram is fairly typical of a deep-sky image (a daylight image or a picture of the moon would produce a rather different histogram). Below I have annotated our example to help you interpret it.
- On the left hand side there are no data points with values of zero. Even a completely ‘black’ pixel should have a value which is slightly more than zero due to the way the camera electronics work. The camera creates small random fluctuations to the ADU values of each pixel called Readout Noise which makes it uncertain exactly where the ‘black’ is in your image.
- The next thing you see in the histogram is a big peak. Since the majority of pixels in a typical image of the night sky contain the sky background itself, they appear as a peak somewhere at the left of the histogram (a simple way to locate this peak programmatically is sketched after this list).
- Note: If you are imaging in light-polluted skies, the sky background won’t be particularly dark, especially on a long exposure. The main histogram peak will be further to the right the brighter your sky and the longer the exposure, since the large number of background pixels will have higher ADU values than for a dark sky or a shorter exposure.
- As we move towards the right-hand side of the peak, more and more of the pixels will be from the nebulosity, galaxy arms and the other faint, fuzzy objects that you are trying so hard to capture. That said, there is no hard ‘line’ where the background sky stops and the ‘interesting’ stuff starts. The most useful information will usually appear in the very right hand side of the peak and the first part of the ‘tail’. This makes it hard to tease out the fine detail in a deep sky image.
- Finally we have a long, flat tail running off towards the right side of the histogram. These are mostly the pixels from the stars in your image, perhaps mixed in with the brighter parts of your target (e.g. the core of a galaxy or the brightest parts of a nebula). Again there is no hard ‘line’ where the target stops and the stars begin.
- Whilst they appear bright and obvious, the stars tend to cover a very small number of the pixels in an image, and thus the bars representing them in the right-hand side of the histogram are correspondingly small.
- At the left hand end of the histogram, there may be some ‘Cold Pixels’ with ADU values of or very near to zero, and at the right hand end there will be a fair few “Hot Pixels” and other anomalies with a very high or even the maximum ADU value. The hot and cold pixels are usually the result of physical defects in the camera sensor.
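As promised above, here is a minimal Python sketch (assuming NumPy) of one crude way to locate the sky-background peak: simply find the most heavily populated ADU bin. Real tools use more robust statistics, so treat this as an illustration only:

```python
import numpy as np

def estimate_background(image, bins=1024):
    # Histogram the image and return the centre of the most populated bin.
    counts, edges = np.histogram(image, bins=bins, range=(0, 65535))
    peak_bin = np.argmax(counts)                           # the tallest bar
    return 0.5 * (edges[peak_bin] + edges[peak_bin + 1])   # centre of that bin

# background_adu = estimate_background(image)  # 'image' from the earlier sketch
```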
In summary, the histogram tells us about the range of brightness in our image. It is important to note that the histogram contains no spatial information whatsoever. In other words you can’t tell which pixels belong to which features in the image by looking at the histogram. For example it doesn’t distinguish between pixels that are a particular shade of grey because of light pollution and other pixels of the same shade because they are part of a galaxy arm or nebula.
Histograms and Exposure
The histogram is very useful for figuring out if you have properly exposed your image. There are two main problems to watch out for:
- Over-exposed images: If your image is over-exposed, some of the pixels will become Saturated. They have reached the maximum ADU value that the camera can record (in our example this is 65,535) and any further light that hits the saturated pixels has no effect. As you continue to expose, more and more of the pixels become saturated and more of the details in the image disappear as they turn the same shade of bright white.
- Saturating pixels is generally a bad idea, but there are a very few exceptions to this rule. Most typically you would take an over-exposed image to bring out faint details in one part of the image, and then merge it with a second (less exposed) image to restore the lost detail to the highlights.
- A few targets like M31 (the Andromeda Galaxy) or M42 (the Orion Nebula) have a very wide range of brightness and might require this type of technique. Sometimes a very bright star might also become saturated as you chase the faint details of a nearby nebula. On the whole though, saturation is a bad thing when astro-imaging, so you should avoid it in most cases.
- Look at the sample histogram above. The small peak at the right-hand end of the chart shows that there are a fairly large number of pixels which have reached the maximum value and become saturated. This indicates that you have over-exposed the image, and you should either expose for a shorter period of time or reduce the gain/ISO on the camera. (Deciding which is the correct option is a topic for another day!)
- Bear in mind that it can be very hard to see whether you have over-exposed the image on the small histograms found in most image capture applications and on the back of DSLRs. The saturated peak may be very small, requiring you to zoom in on the histogram to check for over-exposure (a simple numerical check for both problems is sketched after this list). Depending on the camera, the preview image and histogram might also have been converted to an 8-bit JPEG image which can make it hard or impossible to check exposure properly.
- It is usually easier to review your histogram in an external image processing application, so remember to install one on your imaging laptop. Don’t wait until the next day; check your first few exposures straight away and make adjustments to exposure times and camera settings if necessary.
- Under-exposed images: If your image is under-exposed, many of the background pixels will be dominated by readout noise from the camera. This will result in a speckled or grainy background in your images. The grainy background is difficult to remove in post-processing and makes it hard to tease out the faint details from your image.
- It is easier to tell whether you have under-exposed using the on-camera or capture software histogram. As you can see from the example above, the main histogram peak is butting up against the left hand end of the chart. This indicates that many of the background pixels are at or very close to zero where the readout noise has the biggest effect.
- The solution is to expose for longer or to increase the gain/ISO on your camera. Again this is a complex issue for another day, but a general rule of thumb is that it is better to expose for longer rather than pushing the gain/ISO setting higher.
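As mentioned in the list above, here is a minimal Python sketch (assuming NumPy) of a quick numerical check for both problems. The 1% and 5% thresholds are arbitrary values chosen purely for illustration, not established rules:

```python
import numpy as np

def exposure_report(image, max_adu=65535):
    pixels = image.size
    # Fraction of pixels stuck at the maximum ADU value (over-exposure).
    saturated = np.count_nonzero(image >= max_adu) / pixels
    # Fraction of pixels very close to zero (under-exposure / readout noise).
    near_zero = np.count_nonzero(image <= 0.02 * max_adu) / pixels
    if saturated > 0.01:
        print(f"{saturated:.1%} of pixels saturated: try a shorter exposure or lower gain/ISO")
    if near_zero > 0.05:
        print(f"{near_zero:.1%} of pixels near zero: try a longer exposure or higher gain/ISO")

# exposure_report(image)  # 'image' from the earlier sketch
```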
Real World Examples
Below is an example of a histogram taken from a monochrome image of the Andromeda Galaxy. As you can see, the practice matches up to the theory quite nicely. We have a big peak near the left of the histogram which represents the sky background. This is followed by a ‘shoulder’ just to the right which represents the faint outer regions of the galaxy, and a long ‘tail’ running away to the right, which includes the brighter inner parts of the galaxy and the stars.
In the three images below, I have highlighted the background, mid-tones and highlights in green and the corresponding portion of the histogram in pink (click on the images for full-sized versions).
In my example you can see that the dividing line between the background and the galaxy appears to be well defined with a clear line where the sky ‘stops’ and the galaxy ‘starts’. Don’t be fooled though, that is simply because I have already processed this image to bring the faint outer arms of the galaxy out from the sky background. In an unprocessed image, the boundary is much less well-defined. Indeed, much of the art in processing astronomical images is finding that boundary and carefully brightening the target without bringing along all the unwanted noise that lurks in the image.
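To give a flavour of what that processing involves, here is the simplest possible example: a linear stretch in Python (assuming NumPy) that maps a chosen black point and white point onto the full output range. Real processing tools use far more sophisticated, non-linear stretches, so this is only a sketch:

```python
import numpy as np

def linear_stretch(image, black_point, white_point, max_adu=65535):
    # Map black_point to 0 and white_point to max_adu, clipping anything outside.
    scaled = (image.astype(np.float64) - black_point) / (white_point - black_point)
    return (scaled.clip(0.0, 1.0) * max_adu).astype(np.uint16)

# e.g. stretched = linear_stretch(image, black_point=12000, white_point=30000)
```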
The final example below is a histogram of the colour version of the same image. You can see that there are three histogram curves overlaid on top of each other, corresponding to the red, green and blue values of each pixel in the image.
Hopefully this quick run-through has cleared up some of the mysteries of the histogram, as well as demonstrating how useful it can be in analysing your images.