Photography in the marine environment has unique characteristics: blue and green dominate, crowding out the color range we know on the surface. The camera cannot control these color shifts, and the result is near-monochrome images.
The problem can be solved by abandoning the standard parameters imposed by the manufacturer and using custom ones.
A few pictures will make clear what we are talking about. The images below were taken with an RX100M2 in "RAW + JPG" configuration, which saves each shot in both formats and makes a direct comparison possible.
JPG processed by the camera with "Standard" parameters
RAW processed with customized parameters
Sardinia, Riserva naturale marina Capo Carbonara (Campolongu)
Sardinia, Riserva naturale marina Tavolara (Porto San Paolo)
The JPG images above are characterized by "selective absorption" and "diffusion".
SELECTIVE ABSORPTION (chromatic variation)
The hue varies in proportion to the thickness of water crossed by the light, both direct (sun to subject) and reflected (subject to camera). The chromatic components of light passing through water are attenuated unevenly, a phenomenon known as "selective absorption"; the red component suffers the greatest attenuation.
DIFFUSION (reduction of contrast and definition)
The water is clouded by myriads of suspended particles that reflect the light that strikes them in every direction. The resulting effect is a diffusion of that light, with a consequent reduction of overall contrast and, in case of higher turbidity, a proportional decrease in image sharpness.
Both unwanted effects are easily controlled by using the RAW file and, in post-processing, acting respectively on the "white balance" and on the "contrast curves" and "sharpness".
The JPG standard uses 8-bit quantization, corresponding to 256 gray levels per color channel (RGB), while the image sensor of my RX100M2 produces a 12-bit RAW file, corresponding to 4096 gray levels per color channel (RGB). The ratio between the two formats is therefore RAW:JPG = 16:1.
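The arithmetic can be verified in a couple of lines of Python:

```python
# Gray levels per channel are 2 raised to the bit depth.
jpg_levels = 2 ** 8    # 256 levels (8-bit JPG)
raw_levels = 2 ** 12   # 4096 levels (12-bit RAW)
print(raw_levels // jpg_levels)  # 16 -> the RAW:JPG = 16:1 ratio
```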
What does this difference mean?
JPG format (processed by the camera)
The JPG file is processed directly in the camera by the "ISP" image processor, which manages parameters such as white balance, brightness, contrast, color saturation and sharpness according to settings chosen a priori by the manufacturer to obtain a good-looking image in normal shooting conditions. This processing imposed by the manufacturer cannot be undone, and as we have seen in the previous images, the marine environment, with its wide color excursions, is not a normal shooting condition; consequently the camera's internal processing reaches its limits.
After the adjustments described, the same ISP applies compression according to the JPG standard. This method is very efficient but destructive (lossy): part of the information in the original image is lost and cannot be restored. The result is a file with a smaller range of tonal values (8 bits) than can be obtained using the RAW format (12 bits). For underwater images plagued by heavy color problems, this limitation is critical.
The appearance of a JPG image can be changed by image-editing applications, but since it is a lossy compressed format, as mentioned above, much of the tonal and color data has been permanently deleted during compression. Details lost in overexposed highlights cannot be recovered in a JPG, and the same goes for darker, underexposed areas. Furthermore, each subsequent manipulation results in a recompression of the JPG image, with further deterioration in quality.
The RAW format
The description of the RAW file is simpler: it is intercepted upstream of the "ISP" image processor and is not subjected to any image processing or compression, so the RAW file contains all the tonal and chromatic data recorded by the image sensor.
Because RAW preserves the full range of tonal/chromatic levels provided by the image sensor, no longer constrained by the manufacturer's rigid choices, it allows complete "customized" control over white balance, brightness, contrast, color saturation and sharpness. The wide dynamic range available allows the recovery of detail in overexposed highlights and of shadow detail in underexposed areas. Flexibility is the main advantage of shooting in RAW.
Adobe offers a nice gastronomic analogy: a RAW file contains all the ingredients to prepare your favorite dish the way you want it, while a JPG is a pre-cooked, prepackaged meal to be heated in the microwave.
Adobe | Raw vs Jpg
Sardinia, Riserva naturale marina Capo Carbonara (Campolongu)
Sardinia, Riserva naturale marina Tavolara (Porto San Paolo)
The main enemy of a clear underwater image is the suspension of sand and particles that cloud the water; the only solution is to reduce the thickness of water between camera and subject to the minimum possible. This means that a long focal length may be used to bring out the details, not to photograph from afar.
A clarification is necessary: too often we read that underwater photography demands the wide angle, while the long focal length is demonized. This is not true; whoever writes it wrongly conflates "camera-to-subject distance" with "focal length". The first must be as short as possible; the second can be whatever I need when the occasion calls for it. If I want to take a portrait of a fish, I do not have to be 20 cm from its eyes with a 28 mm; if I do not want to intimidate it, I can move back to 60 cm and use a 100 mm without any loss of image quality! Personally I consider a 28-100 mm zoom the optimum. As a demonstration, I include 3 photographs, (1) at 28 mm and (2) at 100 mm, taken at Campolongu on the same day, 22/09/2022, in the same weather conditions.
In terms of depth of field, I favor the smaller formats: at the same aperture, an APS-C sensor has a greater DoF than a full frame, and a 1" sensor greater still. We go no smaller, because even smaller formats do not support the fundamental RAW format.
The following diagram shows the DoF (depth of field) with the subject at 100 cm and an aperture of f/5.6, using a 28 mm (35 mm eq.) lens on the three formats in question.
N.B. The calculation takes into account the focal-length shift due to refraction in water (x 1.33), so the 28 mm mounted on the camera becomes 37.2 mm. The same applies to the other formats.
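For anyone who wants to reproduce the comparison, here is a minimal sketch using the classic hyperfocal-distance formulas. The circle-of-confusion values and real focal lengths below are my own assumptions, so the figures are indicative and will not match the diagram exactly.

```python
# Near/far DoF limits from the classic hyperfocal formulas (all values in mm).
def dof_limits(f, N, s, coc):
    H = f ** 2 / (N * coc) + f                 # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

# Assumed circles of confusion and real focal lengths for a 28 mm (35 mm eq.)
# lens; the x 1.33 factor models the focal-length shift caused by refraction.
formats = {
    "Full Frame": (28.0, 0.030),
    "APS-C":      (18.7, 0.020),
    '1"':         (10.4, 0.011),
}
for name, (f, coc) in formats.items():
    near, far = dof_limits(f * 1.33, N=5.6, s=1000, coc=coc)
    print(f'{name:10s} DoF: {near / 10:.1f} cm - {far / 10:.1f} cm')
```

Run as is, the sketch confirms the ordering claimed above: the 1" sensor yields the deepest zone of sharpness, the full frame the shallowest.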
The "refraction" phenomenon is described further on.
Exposure mode: [A] aperture priority. Poor stability underwater and fast fish movements require relatively fast exposure times, > 1/250 s, and an aperture small enough to offer sufficient depth of field, f/5.6. With this time/aperture pair a high sensitivity must be chosen, ISO 400, which on a 1" sensor still allows adequate image quality.
If the day is not very sunny, or at depths greater than 5 meters, it may be necessary to use the flash.
In underwater photography, the environment presents shooting conditions different from those for which ordinary lenses were designed. Water has a density about 800 times that of air, which defeats part of the optical work done by the designers: the mass of water between lens and subject behaves as an additional optical element, not accounted for in the design phase, causing unwanted optical aberrations. These aberrations, as we shall see, can also be corrected in post-processing using the RAW format.
[1] Refraction (focal length variation)
[2] Chromatic dispersion (transverse chromatic aberration)
[3] Selective absorption (variable color temperature with depth)
[4] Diffusion (abatement of contrast and definition)
[5] Reflection (decrease in brightness caused by surface reflection)
[6] Attenuation (decrease in brightness caused by absorption)
Photographing in RAW format allows the recovery in post-processing of the degraded parameters [2] dispersion, [3] selective absorption and [4] diffusion.
The most evident change observed in the water is the variation in proportions: it is a classic for scuba divers to harpoon a large fish and pull a much smaller one out of the water. This phenomenon is called "refraction". In photography we get the same effect: our lens increases in focal length, just like the fish of a moment ago.
What happens to a ray of light when it passes from air to water?
It no longer follows a straight path but a broken (refracted) one, a behavior explained by the mathematician Pierre de Fermat (1607-1665): "The path taken by a ray of light between two points is the one that takes the least time" (Fermat's principle).
Refraction is a phenomenon known since the second century A.D. (Ptolemy), calculated in the eleventh century A.D. (Ibn al-Haytham / Alhazen) and definitively formulated in 1621 (Willebrord Snellius / Snell).
Refractive index "n"
When a light beam passes from one medium to another of greater density, it slows down and changes direction. The beam is deviated at the boundary between the two media and tends to approach the perpendicular to the separating surface.
Each transparent substance deflects the light in a characteristic way according to a refractive index "n" determined by the different speed with which the light passes through it:
n = c / v
We define the refractive index "n" of a medium as the ratio between the velocity "c" of light in vacuum and its velocity "v" in the medium. In air the speed of light is close to 300,000 km/s, while in water "v" is about 225,000 km/s, so the refractive index "n" of water is about 1.33.
Snell's law states that the ratio between the sine of the angle of incidence "θi" and the sine of the refractive angle "θr" is equal to the ratio of the refractive index "nr" of the second medium and the refractive index "ni" of the first:
sin(θi) / sin(θr) = nr / ni
If a light ray passes from a medium with refractive index ni into a second medium with nr > ni, we will have θr < θi.
Moving from a less dense medium, where the light is faster, to a denser one where the light is slower, the light beam approaches the perpendicular to the separation surface. If it passes from the denser to the less dense medium, it moves away from the perpendicular.
The product of the refractive index and the sine of the angle that the light beam forms with the perpendicular remains the same:
ni sin(θi) = nr sin(θr)
Refraction obeys the principle of reversibility: reversing the direction of the ray (exchanging the incident ray with the refracted ray), the deviation remains the same.
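As a small worked example, Snell's law can be applied directly to the air/water passage; the 45° angle below is an arbitrary choice of mine.

```python
# Snell's law, ni sin(θi) = nr sin(θr), for light entering a denser medium.
import math

def refracted_angle(theta_i_deg, n_i=1.00, n_r=1.33):
    """Angle (degrees) of the refracted ray from the perpendicular."""
    theta_r = math.asin(n_i * math.sin(math.radians(theta_i_deg)) / n_r)
    return math.degrees(theta_r)

# A ray hitting the water surface at 45 degrees from the perpendicular:
print(refracted_angle(45))  # ~32.1 degrees: it bends toward the perpendicular
```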
The air/water pair, with their different refractive indices 1.00/1.33, can be seen as a two-element optical group.
In practice, water reduces the angle of view by a factor of 1.33, with the same effect as an add-on lens that extends the focal length of our lens: for example, a "terrestrial" 28 mm becomes 28 x 1.33 = 37 mm "underwater".
N.B. The "SteadyShot" stabilization mode must be excluded, as the camera is unaware of the lens' change in focal length x 1.33, and would apply incorrect feedback. Fortunately, the high density of the water cushions our movements.
In the table: water-equivalent focal length conversion, expressed as 35 mm equivalent focal lengths, for the lens Zeiss Vario-Sonnar T* zoom, focal range 10.4 mm - 37.1 mm (eq. 28 mm - 100 mm).
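The conversion can be sketched in a few lines; the intermediate zoom positions below are arbitrary samples of mine, while the 10.4 mm and 37.1 mm end points come from the lens specification.

```python
# 35 mm equivalent focal lengths of the Zeiss Vario-Sonnar T* zoom,
# in air and underwater (x 1.33).
CROP = 28 / 10.4                       # eq. 28 mm at a real 10.4 mm
for real in (10.4, 17.0, 25.0, 37.1):  # sample positions along the zoom range
    eq_air = real * CROP
    print(f"{real:5.1f} mm real -> eq. {eq_air:5.1f} mm in air, "
          f"eq. {eq_air * 1.33:5.1f} mm underwater")
```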
There is also a positive side to the refraction phenomenon: in macro work, the focal-length extension obtained in water allows, at the same magnification ratio, a greater lens-to-subject distance. Considering the distances involved in this application, that extra 1/3 can prove a lifesaver, avoiding the risk of scraping the camera's front port against the rocks!
The Zeiss Vario-Sonnar T* 28-100 mm eq. provides a macro position only at the "28 mm eq." focal length; the minimum focusing distance is 45 mm from the front of the lens, which becomes 60 mm in water.
In summary, the "Refraction" phenomenon, we can define it as gross, we verify it as soon as we enter the water, our new "optical add-on x 1.33" involves a change in focal length, this is a fact, not a problem, we have seen that it could also be good news.
But the devil hides in the details: refraction brings with it a subtler complication that shows itself only once the photo is taken, when it is too late.
At the end of the 1600s, Isaac Newton complicated the simplistic view of "refraction" held until then by proving that white light is composed of the sum of all the colors of the visible spectrum. He demonstrated it by sending a beam of white light through an optical prism, obtaining a double refraction, passage [1] air/glass and [2] glass/air, and at the output the fan-shaped dispersion of the chromatic spectrum.
Besides demonstrating the composition of light, he proved that as the wavelength (color) changes, the refraction in the medium changes; applying Snell's law, lights of different colors are therefore deflected at different angles (the refractive index decreases as the wavelength increases).
The phenomenon Newton formulated, the varying degree of refraction called "chromatic dispersion", applies to any optical design: chromatic dispersion and the related "chromatic aberration" arise in every single lens through the passages [1] air/glass and [2] glass/air, just as in Newton's prism.
Two types of chromatic aberration are defined: transverse and axial
Transverse aberration occurs when different wavelengths (color components) are focused at different positions on the focal plane. It is typical of retrofocus wide-angle lenses and, in general, of markedly asymmetrical optical schemes, but it can be corrected in post-processing.
Axial aberration occurs when different wavelengths (color components) are focused at different distances from the focal plane. It is relatively difficult to correct in post-processing and is typical of long focal lengths, but it can be effectively corrected a priori by reducing the aperture.
In a complex optical system such as a camera lens, chromatic aberration is corrected at the design stage. The problem reappears when the light paths are altered by anomalous refraction, in our case by placing a new "water/air" passage in front of the lens, with a refraction not foreseen by the design.
This new condition is beyond the reach of the optical corrections of the original "air/glass" design, and it could not be otherwise, since the "water/air" variable is not considered in the design phase.
Transverse chromatic aberration does not occur at the center of the image; it shows in the peripheral area, where the angle of deviation of the light rays is greater, presenting [1] color fringes on the edges of contrasty subjects and [2] pincushion distortion. It is stronger at wide angles and is not affected by stopping down, unlike axial chromatic aberration, which occurs throughout the image but can be reduced by closing the diaphragm.
Refractive index "n" vs RGB color
A practical case
In post-processing it is easy to correct the effects of transverse chromatic aberration in our RAW file. Certainly the correction delivered by a lens optically well corrected both on land and in water would be superior, but since such a lens does not exist, we must accept the following limits of the software fix:
Realigning/resizing the individual color channels causes a loss of resolution compared to the original image.
The camera sensor captures only discrete RGB color channels, while chromatic aberration is not discrete and occurs across the entire spectrum of light.
To realign the RGB levels, so that all the color channels overlap spatially in the final image, in Adobe's Camera Raw enlarge the image to 200% and, in the presence of colored fringes, act on "Lens corrections > Colors". Always start with the "Remove chromatic aberration" option: in 90% of cases this automatic correction removes the aberration correctly.
If not, operate manually (the example refers to the tail fin of the sarago in the photograph, which shows a yellow fringe; a sketch of the underlying principle follows these steps):
Green factor: set the value to "13"
Green hue: move the slider to fine-tune the border color (2/18)
Purple factor: stays at "0"
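The general principle behind this correction, realigning the color channels by rescaling them about the image center, can be sketched as follows. This is an illustration of the idea only, not Adobe Camera Raw's actual algorithm, and the scale factors are invented for the example.

```python
# Transverse CA correction sketch: red and blue are rescaled radially so they
# overlap the green reference channel again (factors here are illustrative).
import numpy as np
from scipy.ndimage import zoom

def rescale_channel(channel, factor):
    """Zoom one channel about the image center, then crop or pad to size."""
    h, w = channel.shape
    scaled = zoom(channel, factor, order=1)
    sh, sw = scaled.shape
    if factor >= 1.0:                      # crop the enlarged channel
        top, left = (sh - h) // 2, (sw - w) // 2
        return scaled[top:top + h, left:left + w]
    ph, pw = h - sh, w - sw                # pad the shrunken channel
    return np.pad(scaled, ((ph // 2, ph - ph // 2),
                           (pw // 2, pw - pw // 2)), mode="edge")

def correct_transverse_ca(img, r_factor=1.0005, b_factor=0.9995):
    """img: float RGB array (H, W, 3); green is left untouched."""
    out = img.copy()
    out[..., 0] = rescale_channel(img[..., 0], r_factor)
    out[..., 2] = rescale_channel(img[..., 2], b_factor)
    return out
```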
When light passes through water, the water absorbs it unevenly across the different colors; this phenomenon is known as "selective absorption".
The different chromatic components of the light are absorbed by the water at different depths.
The red component disappears after the first 5 m, orange reaches up to about 15 m, yellow up to 20 m and green up to 30 m.
Selective absorption is determined by the thickness of water crossed by the light; whether vertical (depth) or horizontal (distance), the rule is the same:
Example [1]: if, in the water, we look at an object 5 m below us or 5 m in front of us, the red color will be missing in both cases.
Example [2]: if we use a flash to recover the missing red, beyond 2.5 m of subject distance the red will still be missing, because the flash light travels 2.5 m to illuminate the object and its reflection travels a further 2.5 m back to us. The result does not change.
The phenomenon of selective absorption, linked as it is to the thickness of water crossed by the light, also depends on the position of the sun: rays striking the surface perpendicularly pass through less water than oblique rays.
Selective absorption brings strong color casts, but operating underwater on the camera's "white balance" parameter to correct them as the depth varies is, I find, unrealistic.
To correct the problem of color casts, the only solution is to photograph in RAW and adjust the white balance in post-processing.
Of course, this technique cannot restore colors that no longer exist: at 30 meters it is impossible to artificially boost the red component, since at that depth it is no longer present; increasing a "0" still gives "0".
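As an illustration of white-balance correction in post, here is a minimal gray-world sketch in Python. It is one simple, generic technique, not the exact workflow used for the images in this article.

```python
# Gray-world white balance: scale each channel so their averages match,
# removing the blue/green cast (a pixel value of 0 stays 0 after scaling).
import numpy as np

def gray_world_wb(img):
    """img: float RGB array (H, W, 3) decoded from a RAW file."""
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel average
    gains = means.mean() / means             # push each channel toward gray
    return np.clip(img * gains, 0.0, 1.0)
```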
(reduction of contrast and definition) Normally the water is clouded by myriads of suspended particles that reflect the light that strikes them in every direction. The resulting effect is a diffusion of that light, with a consequent reduction of overall contrast and, in case of higher turbidity, a proportional decrease in image sharpness.
By reducing to a minimum the thickness of water between us and the subject, image sharpness increases proportionally; the rest is recovered in post-processing by acting on the "contrast curves" and sharpness.
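A minimal sketch of these two recoveries, a contrast stretch and an unsharp mask, follows; they are generic techniques with illustrative parameters, not the exact settings used for the article's images.

```python
# Two generic recoveries for diffusion: re-expand the tonal range, then
# restore apparent sharpness with an unsharp mask.
import numpy as np
from scipy.ndimage import gaussian_filter

def stretch_contrast(img, low_pct=1, high_pct=99):
    """Stretch the flat tonal range caused by diffusion back to 0..1."""
    lo, hi = np.percentile(img, (low_pct, high_pct))
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

def unsharp_mask(img, radius=2.0, amount=0.8):
    """Sharpen by adding back the difference from a blurred copy."""
    blurred = gaussian_filter(img, sigma=(radius, radius, 0))
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)
```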
Note: the decay of contrast and sharpness is attributable solely to the suspended particles; the water itself is in no way responsible for this effect. If the water is crystal clear there is no appreciable loss of sharpness, as demonstrated by the images below, taken in a tank to be certain of the clarity of the water:
(decrease in brightness caused by surface reflection) A ray of light that strikes the surface of the sea is partially reflected upward and does not penetrate. The angle at which it is reflected and the amount of reflected light depend on the angle of incidence: the lower the sun, the greater the reflected fraction, so at sunrise and sunset there is more reflected light and less light in the water.
(decrease in brightness caused by absorption) Water has a density about 800 times that of air, and its light absorption is correspondingly high: in the most transparent waters, 99% of solar radiation is absorbed in the first 100-150 m; beyond that only blue remains, down to the limit of total darkness at 300-500 meters.
Attenuation, like selective absorption, is linked to the thickness of water crossed by the solar radiation: perpendicular rays reduce reflection, increase the fraction of light that penetrates, and pass through less water than oblique rays. For underwater photography, the best hours are therefore around noon.
These last three problems, diffusion, reflection and attenuation, force us to forget the concept of infinity under water. Moreover, these phenomena are not constant: they depend on the quality of the water, clear or cloudy, in which we photograph.
Mounting filters and lenses to the filter holder thread in DSC-RX100 and DSC-RX100M2 models
When the DSC-RX100 or DSC-RX100M2 is inserted into the MPK-URX100A housing, the center of the filter holder thread and that of the camera lens are not aligned. Therefore, do not use correction ports or optical add-ons attached to the filter thread, as image distortion may occur.
Camera underwater weight *
including camera, battery, memory card (SD), diffuser (45 g.), strap.
Underwater Weight**
The product floats if its underwater weight is negative (-) and sinks if it is positive (+).
The hydrostatic trim can be corrected with appropriate ballast screwed into the bottom of the MPK-URX100A housing, following the values in the table.
N.B. These notes were developed using a Sony RX100M2 camera with the Sony MPK-URX100A underwater housing, but they are generically applicable to any underwater camera.
The photographs used were taken in September 2022 in the marine nature reserves of Tavolara and Capo Carbonara in Sardinia. Google Photos Gallery
Exposure mode, for clear water, a sunny day and dives within 4 meters:
[A] Aperture priority with preset value f / 5.6 and ISO 400 sensitivity
DSC-RX100M2 pre-settings when inserted in the MPK-URX100A:
[Menu Photo > 2]
Drive Mode = Multi Shot
Focus mode = Continuous AF
[Menu Photo > 4]
AF Illuminator = Off
SteadyShot = Off
[Menu Video > 1]
SteadyShot = Off
[Menu Options > 2]
Control ring = Standard
[Menu Config. > 1]
LCD brightness = Sunny weather