Transforming 60Da color to normal

Original May 2018

Update September 2020

For the transformation to work on a second 60Da picture pointing at something else, the two 60Da shots have to have comparable histograms. For example, a post-sunset deep-blue shot within the Belt of Venus did not transform well.

An ImageMagick solution? You bet!

So I bought a Canon 60Da ("a" for astronomical) camera that has "no IR-filter", in order to take images of nebulosities... and since then I have found myself using it *a lot* for twilights and, to a lesser extent, daylight picture-taking. In daylight with a daylight WB, my eyes are not really good enough to see much difference. It does, however, make twilights a fair bit redder than a "normal" camera, enough that I want to transform the images to appear closer to what I see with my eyes. I fully recognize that my brain is doing its own processing (red exhaustion, for example), but for this picture-taking I am less interested in scientific accuracy than in getting close to how it looked. Here are two images: the red one from my 60Da, and the "normal" one from Bruce McCurdy's (formerly Mike Noble's) 5D Mark II.

Yes, they are out of focus (I was alternating between shooting a nearby Xrite ColorChecker and this distant sky), and the fields of view are not identical, but the exposure values are otherwise the same, both on daylight white balance.

I never visually saw red or pink like the image at left; to my eye the colours were always shades of orange similar to the one at right.

From snibgo: My process module cols2mat (see Colours to matrix and polynomials) calculates the 6x6 colour matrix that best transforms the first image to the second. For example, taking the first two images on your page:

CODE:

set SRC1=20180308_AL_60Da_nofilter.jpg
set SRC2=20180308_MP_IRfilter.jpg

%IMDEV%convert ^
  %SRC1% ^
  %SRC2% ^
  -process 'cols2mat method Cross' ^
  noIRtoIR.jpg

Here are all three: 60Da, 5D Mark II, transformed 60Da.

Hot damn!

I will be looking into a more complete methodology, matching solar depression angles to pairs of twilight scenes and pairs of Xrite ColorChecker passport images, but the above certainly counts as darn close for a first effort. Of course it is impossible to have an exact match (see snibgo's discussion below).

In the absence of a side-by-side pair of cameras, you can now transform any 60Da image by using ImageMagick's -color-matrix command and the matrix as calculated above:

CODE:

set CMAT=0.53761,0.237494,0.0655367,0,0,-0.0618967,0.0230337,0.522579,0.442921,0,0,-0.0514341,-0.197834,0.169792,1.01265,0,0,-0.0241132,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1

magick convert IMG_0289.JPG -color-matrix %CMAT% +depth IMGc_0289.JPG
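To see what -color-matrix then does with CMAT, here is a hedged per-pixel sketch in Python. It assumes channel values normalized to 0..1 and the sixth matrix column acting as a constant offset, which is consistent with the identity rows 4-6 of the matrix above; the function name is mine, not ImageMagick's.

```python
# The arithmetic behind -color-matrix for an opaque RGB pixel, with all
# channel values normalized to 0..1. Rows 4-6 of CMAT are identity, so
# only the first three rows matter here; the sixth column is treated as
# a constant offset (an assumption about the 6x6 layout).

CMAT = [
    [ 0.53761,   0.237494, 0.0655367, 0, 0, -0.0618967],
    [ 0.0230337, 0.522579, 0.442921,  0, 0, -0.0514341],
    [-0.197834,  0.169792, 1.01265,   0, 0, -0.0241132],
]

def apply_color_matrix(rgb, matrix=CMAT):
    """Map one normalized (r, g, b) pixel through the colour matrix."""
    r, g, b = rgb
    out = []
    for row in matrix:
        v = row[0] * r + row[1] * g + row[2] * b + row[5]  # row[5] = offset
        out.append(min(1.0, max(0.0, v)))                  # clamp to 0..1
    return tuple(out)

# A strong 60Da-style red comes out darker and less pink:
print(apply_color_matrix((1.0, 0.3, 0.2)))
```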

Intermediate solution: After downloading, installing, compiling, adjusting, and re-compiling, I got the snibgo modules working! (Details here)

An intermediate solution, before I implemented a more reliable one based on colchkcha, follows Alan (snibgo)'s mask-and-transform approach. Because I am going from 60Da to normal, not normal to 60Da, I reversed the mask; then, rather than play with hue, saturation, and lightness values (modulate), I played with the matrix method, which is closer to the transform solution: 60Da, transformed, normal (target)

I know the oranges are not quite as intense as the target on the right, but it actually looks closer to what I remember. At any rate, something better will have to wait until I get colchkcha running.

CODE:

rem First the mask:
magick 60Da.jpg ( +clone -fill Red -colorize 100 ) -compose Difference -composite -grayscale RMS -auto-level mask.png

rem Negate it:
magick mask.png -negate mask_neg.png

rem Transform with the matrix:
magick 60Da.jpg ( +clone -color-matrix " 0.9 0.15 0.1 0.10 0.9 0.2 0 0 0.8 " ) mask_neg.png -compose Over -composite 60Da2MP_90_15_10_10_90_20_b80_matrix2.png
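As a hedged illustration of what this pipeline computes per pixel (Python, channels normalized to 0..1; the whole-image -auto-level contrast stretch is omitted, and the function names are mine):

```python
import math

def red_distance(rgb):
    """Per-pixel value of the mask before -auto-level: the RMS difference
    between the pixel and pure red (1, 0, 0), i.e. what
    -fill Red -colorize 100 / -compose Difference / -grayscale RMS computes.
    (The final -auto-level is a whole-image stretch, omitted here.)"""
    r, g, b = rgb
    return math.sqrt(((1.0 - r) ** 2 + g ** 2 + b ** 2) / 3.0)

def blend(original, transformed, mask_neg):
    """The three-image -compose Over -composite: where the negated mask is
    white (1.0) the transformed pixel shows through, where it is black (0.0)
    the original remains, with a linear mix in between."""
    return tuple(mask_neg * t + (1.0 - mask_neg) * o
                 for o, t in zip(original, transformed))

# Pure red scores 0 (mask black), so after negation it is fully replaced
# by the matrix-transformed version; colours far from red are left alone.
print(red_distance((1.0, 0.0, 0.0)))
print(blend((0.8, 0.2, 0.2), (0.7, 0.3, 0.2), 0.5))
```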

I want to transform all my images into "normal". I am aware that Photoshop/Lightroom readily manipulate white balance, but if ImageMagick can do it, why pay retail prices? Importantly, I need to run the transformation on some 2,000 images per session (I am shooting 4 images for HDR every 5 seconds), so automation is a must. And I will want to run it on past events.

I think the solution is for me to photograph a gray card under a range of twilight conditions with TWO cameras, one 60Da ("no IR-filter") and another, say a 5D Mark I, to create a library of "equivalents", then use snibgo's "Gray Balance" approach (http://im.snibgo.com/colchkcha.htm#graybal): "When we have a photo of the card, we can use the process module to correct the gray balance of the photo, and any others taken under the same conditions. The method is: crop to just the last row, create a clone and grayscale it, calculate with NoCross the matrix that most closely does that grayscale transformation, and apply that matrix to the entire image."
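The core idea of that gray-balance method can be sketched in Python. This is a hedged simplification: snibgo's NoCross solves for a full matrix by least squares, while the sketch below computes only per-channel gains (a diagonal colour matrix); the function names and sample values are hypothetical.

```python
def gray_balance_gains(gray_patches):
    """Simplified, diagonal stand-in for snibgo's NoCross gray balance:
    per-channel gains that map the averaged gray-card patches to a neutral
    gray (the full method solves for a complete matrix by least squares).
    gray_patches: list of (r, g, b) samples from the photographed gray row."""
    n = len(gray_patches)
    means = [sum(p[c] for p in gray_patches) / n for c in range(3)]
    target = sum(means) / 3.0           # neutral level to aim for
    return [target / m for m in means]  # gain per channel

def apply_gains(rgb, gains):
    """Apply the diagonal matrix (one gain per channel), clamped to 0..1."""
    return tuple(min(1.0, v * k) for v, k in zip(rgb, gains))

# A twilight-tinted gray card: too much red, too little blue.
gains = gray_balance_gains([(0.6, 0.5, 0.4), (0.36, 0.3, 0.24)])
print(gains)                                # cuts red, boosts blue
print(apply_gains((0.6, 0.5, 0.4), gains))  # close to neutral gray
```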

Once I have this library, I can then transform any other twilight image, from years ago or in the future, by applying the appropriate matrix to the new image. I suppose I can catalog each transform matrix by solar depression angle and say "close enough", not worrying that every twilight with the Sun at -8 degrees will be slightly different.
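The cataloging idea can be sketched as a nearest-angle lookup; the angles, names, and matrix placeholders below are hypothetical.

```python
# Sketch of the proposed catalog: transform matrices keyed by solar
# depression angle, with a nearest-angle lookup for "close enough".
# Angles and matrix placeholders are hypothetical stand-ins.

catalog = {
    -4.0:  "matrix_for_minus4",    # placeholder for a real colour matrix
    -8.0:  "matrix_for_minus8",
    -12.0: "matrix_for_minus12",
}

def nearest_matrix(depression_angle, catalog=catalog):
    """Return the catalogued matrix whose angle is closest."""
    key = min(catalog, key=lambda a: abs(a - depression_angle))
    return catalog[key]

print(nearest_matrix(-7.2))  # falls in the -8 bucket
```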

Does this make sense? I expect the transformation may differ with exposure and conditions. I would be (pleasantly!) surprised if a single transform for a twilight sky at a solar depression of, say, -8 degrees at ISO 800, f/6.3, 1/15 s applied equally well to the exposures at 1 s, 1/4, 1/15, and 1/60 taken at the same time, as well as 10 minutes earlier and 20 minutes later.

Your comments are welcome.

A link to Fred Weinhaus' histogram match: http://www.fmwconcepts.com/imagemagick/histmatch/index.php

Snibgo kindly posted a thoughtful reply on the ImageMagick forum (http://www.imagemagick.org/discourse-server/viewtopic.php?f=22&t=33684):

Interesting problem. A general assumption in image processing is that we are dealing with real colours, those inside the CIE horseshoe. In principle, we know the transformation each camera makes from "scene referred" to "camera referred", so we know the inverse of each transformation, so we can transform the output from either camera to look as if it was made by the other. Or we can take short cuts and consider just gray balance (my page you linked) and contrast and saturation and so on.

But the general assumption isn't true for you. One of your cameras captures non-real colours, colours you can't see, beyond the red end of the spectrum. It transforms those colours into visible ones, colours you can see on the computer screen or whatever. I guess that it also "shuffles up" ordinary visible red colours, to make room for the colours that were infrared, but I could be wrong.

I have no experience of IR, so I can only speculate.

Two important factors are: the amount of IR in the illumination, and the IR reflectivity of the object. About half the radiation from the sun is IR. Does this proportion vary by time of day, or moisture in the air? I don't know. What proportion of light from blue sky or cloud is IR? I don't know, but I suspect not much. How much from a sunset? Different artificial lights will have different proportions of IR (eg inefficient incandescent versus modern fluorescent or LED). Green vegetation is a good IR reflector. In addition, all objects above absolute zero emit IR radiation, so cameras sensitive to IR can see people even when there is no light source.

You might get hold of a photographic IR filter, the type that screws to the front of a lens. Or even just a piece of IR gel. (Or improvise with a rectangular container of water!) Then you can take photos of the same scene with the same camera and lens, with and without the IR filter. If you also have an ordinary camera with a sensor IR filter, then take the same photo, and you have three images. You can create a series of tests with different illuminations (midday, before sunset, after sunset, artificial lights), and of different objects (gray card, people, foliage, landscape, skyscape). Test with auto-exposure, but also the same manual exposure for all three photos.

(We have a naming problem: the IR filter at the sensor blocks IR light ("cut-IR"), but the IR filter at the front of the lens blocks everything except IR ("pass-IR").)

These tests will show you what the Canon 60Da does to images in the presence or absence of IR. The photos with the "pass-IR" filter measure how much IR there is. I would expect an obvious tonal shift, and probably a red-shift in the areas where there is most IR present. For example, as shown on your page, the 60Da shows the red sky as lighter and redder, but doesn't change the blue sky.

For a constant subject (eg sunsets with black silhouette foreground) I expect a transformation from one camera to the other might be quite simple. Three stages:

1. Create a mask that is white where you have most IR, black where there is none, and gray in between. This is made from the image, according to redness and brightness. For example:

CODE:

magick input.png ( +clone -fill Red -colorize 100 ) -compose Difference -composite -grayscale RMS -auto-level mask.png

2. Create a transformation from "ordinary camera" to "IR camera". This might be a red shift and lighter tones, eg "-modulate 120,100,80".

3. Composite with the mask, eg:

CODE:

magick toes.png ( +clone -modulate 120,100,80 ) mask.png -compose Over -composite out.png
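For what step 2's -modulate 120,100,80 does per pixel, here is a hedged Python sketch. It assumes ImageMagick's documented -modulate convention: the arguments are brightness and saturation percentages plus a hue rotation where 100 means no change and the 0..200 range spans -180..+180 degrees.

```python
import colorsys

def modulate(rgb, brightness=120.0, saturation=100.0, hue=80.0):
    """Approximate -modulate 120,100,80 on one normalized (r, g, b) pixel.
    Arguments are percentages; 100 means unchanged. The hue argument is
    treated as a rotation where 0..200 maps to -180..+180 degrees
    (assumption based on ImageMagick's -modulate documentation)."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    l = min(1.0, l * brightness / 100.0)   # brightness scales lightness
    s = min(1.0, s * saturation / 100.0)   # saturation scales chroma
    h = (h + (hue - 100.0) / 200.0) % 1.0  # 200 units = one full hue turn
    return colorsys.hls_to_rgb(h, l, s)

# A neutral gray only gets lighter; a saturated red also gets lighter
# and its hue rotates toward pink.
print(modulate((0.5, 0.5, 0.5)))
print(modulate((1.0, 0.3, 0.2)))
```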

Sorry, I've rambled a bit.

Trio: