2.1. Theory of digital image
The digital image is made up of a rectangular array of pixels of varying intensity, ρ(m, n). The pixel is the smallest addressable unit of a digital image. Every pixel in a digital image is addressed by two positive sample numbers, m and n, where m is the row sample number and n is the column sample number. Each pixel value in a digital image is a data sample of the image. The brightness of an image depends on the intensity value of each pixel and on the frequency of occurrence of each pixel intensity, that is, the number of times a given pixel intensity repeats in the image. The relationship between a noiseless image and its pixel intensities is an identity relationship [32]. Every digital image consists of a fixed number of rows and columns of pixels.
A digital image is represented as a 2-dimensional lattice of r-dimensional pixels, where r is one colour per pixel in a grayscale image, three colours per pixel in a colour (RGB) image, and more than three colours per pixel in a multispectral image, which is represented using the RGB colours together with ultraviolet and near-infrared light. The space of the lattice is the spatial domain, while the gray level, colour or spectral information is the range domain [33].
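As a minimal sketch of this lattice view (the array values below are hypothetical), a grayscale image can be held as a 2-dimensional NumPy array indexed by (m, n), and an RGB image as the same lattice with a third axis of length three:

```python
import numpy as np

# Hypothetical 4x5 grayscale image: one 8-bit intensity per pixel,
# indexed by row sample number m and column sample number n.
gray = np.array([
    [  0,  32,  64,  96, 128],
    [ 16,  48,  80, 112, 144],
    [ 32,  64,  96, 128, 160],
    [ 48,  80, 112, 144, 176],
], dtype=np.uint8)

print(gray.shape)   # (4, 5): 4 rows (m) by 5 columns (n)
print(gray[2, 3])   # pixel intensity rho(m=2, n=3) -> 128

# A colour (RGB) image adds a third axis of length r = 3:
# one red, one green and one blue sample per pixel.
rgb = np.zeros((4, 5, 3), dtype=np.uint8)
rgb[2, 3] = (255, 0, 0)   # a pure-red pixel at row 2, column 3
```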
2.1.1. Types of Digital Images
Digital images are divided into three important types: colour images, grayscale images and multispectral images. For conventional uses, such as biomedical imaging systems and digital cameras, digital images are limited to grayscale and colour images. Multispectral images are reserved for unconventional uses such as military radar and satellite systems.
2.1.1.1. Grayscale Images
The grayscale (black and white) image is made up of pixels, each of which holds a single value corresponding to the gray level of the image at that location. In an 8-bit grayscale image each pixel is encoded with 8 bits, and the gray levels span smoothly from 0 (black) to 255 (white), giving 256 possible values per pixel and requiring 1 byte per pixel of storage. Other forms of the grayscale image are the binary image and the normalized image. An example of a grayscale image is a biomedical image captured using CT.
2.1.1.1.1. Normalized grayscale image
Most often, grayscale images are normalized to reduce the range of pixel intensities required. For a normalized grayscale image, the pixel intensities are real numbers varying from 0.0 (black) to 1.0 (white). A grayscale image is normalized by dividing each pixel value by 255 (the white value). A threshold T of a grayscale image is a pixel value such that, when every pixel intensity ρ(m, n) > T is converted to 1.0 and every pixel intensity ρ(m, n) ≤ T is converted to 0.0, the original image is still faithfully represented. At any intensity between the threshold value and black, the proportion of black in the image exceeds the proportion of white; at any intensity between the threshold value and white, the proportion of white exceeds that of black.
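A minimal sketch of the normalization step (the patch values are hypothetical):

```python
import numpy as np

def normalize_grayscale(image_u8):
    """Map 8-bit intensities 0..255 onto the normalized range 0.0 (black) to 1.0 (white)."""
    return image_u8.astype(np.float64) / 255.0

# Hypothetical 8-bit grayscale patch
patch = np.array([[0, 64, 128],
                  [191, 223, 255]], dtype=np.uint8)
print(normalize_grayscale(patch))
# values now lie in [0.0, 1.0], e.g. 128 -> ~0.502 and 255 -> 1.0
```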
2.1.1.1.2. Binary image
A binary image is a modified version of a grayscale image in which every pixel is represented by 0 or 1 only. It is generated by converting every pixel intensity greater than the threshold to 1.0 (HIGH) and every pixel intensity equal to or less than the threshold to 0.0 (LOW). The threshold of a grayscale image depends on the proportion of each colour in the image. The process of generating a binary image from a grayscale image is called segmentation, and a minimal sketch of it is shown below.
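The sketch assumes a normalized grayscale image and an illustrative threshold of 0.5:

```python
import numpy as np

def binarize(norm_image, threshold=0.5):
    """Segment a normalized grayscale image into a binary image:
    pixels above the threshold become 1.0 (HIGH), the rest 0.0 (LOW)."""
    return np.where(norm_image > threshold, 1.0, 0.0)

norm = np.array([[0.10, 0.45, 0.55],
                 [0.70, 0.30, 0.90]])
print(binarize(norm, threshold=0.5))
# [[0. 0. 1.]
#  [1. 0. 1.]]
```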
2.1.1.2. Colour Image
The colour image, on the other hand, is represented using three colours, red (R), green (G) and blue (B), and is also known as an RGB image. In an RGB image each pixel is encoded using 24 bits, corresponding to 8 bits of red (R), 8 bits of green (G) and 8 bits of blue (B) per pixel. The RGB colours are mixed additively. The range spans from 0 (black) to 16,777,215 (white), corresponding to 2²⁴ (16,777,216) possible colours per pixel, so a colour image requires more storage (3 bytes per pixel) than a grayscale image (1 byte per pixel), since a byte represents 8 bits. An example of a colour image is an image captured using a digital camera.
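The 24-bit encoding and the storage comparison can be illustrated with a short sketch (the image size is hypothetical):

```python
def pack_rgb(r, g, b):
    """Pack three 8-bit channel values into one 24-bit colour code."""
    return (r << 16) | (g << 8) | b

print(pack_rgb(0, 0, 0))         # 0          -> black
print(pack_rgb(255, 255, 255))   # 16777215   -> white (2**24 - 1)

# Storage comparison for a hypothetical 512 x 512 image:
pixels = 512 * 512
print(pixels * 1)   # 262144 bytes for an 8-bit grayscale image (1 byte/pixel)
print(pixels * 3)   # 786432 bytes for a 24-bit RGB image (3 bytes/pixel)
```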
2.1.1.3. Multispectral image
For non-conventional applications such as military and communication systems, multispectral images are used. The multispectral image is represented using the RGB colours together with ultraviolet and near-infrared waves. The image can encode 2⁴⁰ (1,099,511,627,776) colours and spectra per pixel and requires 5 bytes per pixel for storage. An example of a multispectral image is a remotely sensed image captured using radars and satellites [35].
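A minimal sketch of a 5-band pixel layout, assuming 8 bits per band (the image dimensions are hypothetical):

```python
import numpy as np

# Hypothetical 5-band multispectral image: R, G, B, ultraviolet and
# near-infrared samples per pixel, each quantized to 8 bits.
rows, cols, bands = 4, 5, 5
multispectral = np.zeros((rows, cols, bands), dtype=np.uint8)

print(multispectral.nbytes)   # 100 bytes: 5 bytes per pixel x 20 pixels
print(2 ** (8 * bands))       # 1099511627776 = 2**40 possible values per pixel
```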
2.1.2. CT lung image
A CT lung image is a digital image of the lung captured using a computerized tomography machine. In the CT lung image, a region with a spiculated mass (cancerous growth) appears as white pixels in the lungs, while a region with no spiculated mass appears as black pixels.
The area of the lung occupied by white pixels indicates the stage of the cancer, and this area increases as the cancer advances in stage.
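One simple way to quantify this, sketched below with hypothetical function and argument names, is the fraction of lung pixels that are white in a segmented slice:

```python
import numpy as np

def white_pixel_fraction(binary_lung, lung_mask):
    """Fraction of the lung region occupied by white (suspected spiculated-mass)
    pixels in a segmented CT lung image.  Both inputs are boolean arrays of the
    same shape; lung_mask marks the pixels that belong to the lung."""
    lung_pixels = np.count_nonzero(lung_mask)
    white_pixels = np.count_nonzero(binary_lung & lung_mask)
    return white_pixels / lung_pixels if lung_pixels else 0.0
```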
2.1.3. Generation of digital image
A digital image is generated from a continuous-time image by 2-dimensional sampling at a frequency at least equal to the Nyquist rate. The Nyquist rate is the minimum frequency at which a continuous-time image must be sampled so that it can be reconstructed from the digital image without loss of information. Sampling the image below the Nyquist rate causes an aliasing effect, which is the distortion of the image due to under-sampling. The Nyquist rate is defined as 2B, where B is the bandwidth (the highest spatial frequency) of the continuous image. In addition to adequate sampling, each sample must be quantized with sufficient bit depth for effective recovery of the image, commonly 16 bits/pixel for a grayscale image and 48 bits/pixel for a colour image.
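The aliasing effect can be illustrated with a one-dimensional sketch (the bandwidth value is hypothetical): a pattern of bandwidth B sampled at 2B retains its variation, while sampling it at only B collapses every sample onto the same phase:

```python
import numpy as np

B = 10.0  # bandwidth of the pattern: 10 cycles per unit length (hypothetical)

def pattern(x):
    """Continuous intensity profile with highest spatial frequency B."""
    return 0.5 + 0.5 * np.cos(2 * np.pi * B * x)

def sample(rate, n=6):
    """Sample the pattern at `rate` samples per unit length."""
    x = np.arange(n) / rate
    return pattern(x)

print(sample(rate=2 * B))  # Nyquist rate 2B: [1. 0. 1. 0. 1. 0.] - variation kept
print(sample(rate=B))      # under-sampled at B: [1. 1. 1. 1. 1. 1.] - aliasing,
                           # the variation is lost
```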
2.1.4. Corruption of Digital Image
Digital images are corrupted by noise when some of the pixel intensities are modified by an unwanted random signal. Noise is an unwanted signal that combines with the desired image and reduces its quality. There are many ways that noise can be introduced into a digital image, depending on how the image is acquired. Some of these are:
• Noise due to film grain, or the result of damage to the film, in a scanned picture. Film grain is the random visual texture of processed photographic film caused by small particles of metallic silver, dye clouds or dust present in the film before processing.
• Noise due to errors in the detector, such as the charge-coupled device (CCD) in image-capturing devices. A CCD is a silicon-based multichannel array detector of ultraviolet, visible and near-infrared light.
• Noise due to the electronic transmission of the image.
When a noiseless digital image Ῡ(m, n) interacts with noise, it combines additively or multiplicatively with the noise signal to form a corrupted digital image χ(m, n). The presence of noise in a digital image distorts the information carried by the image and erodes its quality. This reduction in image quality has a great impact on modern technology such as radar systems, satellite systems and biomedical imaging systems, where accurate recognition of image features plays a major role. Image features are the properties of regions in the image that make it possible for the image, or regions within it, to be accurately recognized. A typical example of a digital image from a biomedical imaging system is the CT lung image.
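As a minimal sketch of the two corruption models (zero-mean Gaussian noise is used only as an illustrative choice of the noise signal):

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(clean, mode="additive", sigma=0.05):
    """Corrupt a normalized noiseless image Y(m, n) to produce chi(m, n).
    additive:       chi = Y + eta
    multiplicative: chi = Y * (1 + eta)
    where eta is zero-mean Gaussian noise (an illustrative noise model)."""
    eta = rng.normal(0.0, sigma, size=clean.shape)
    if mode == "additive":
        corrupted = clean + eta
    else:
        corrupted = clean * (1.0 + eta)
    return np.clip(corrupted, 0.0, 1.0)
```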
2.1.4.1. Noise in CT Lung Image
The most prominent type of noise in biomedical images is multiplicative noise. Multiplicative noise is an unwanted signal that is multiplied with the noiseless biomedical image during capture, transmission and processing, reducing the image quality [36]. It arises from undulations on the surface of the captured object, shadows cast by other objects, and dust. The presence of noise severely degrades the quality of a biomedical image and leads to the loss of vital features in the image [37]. An example of a biomedical image is the CT lung image, which finds application in lung cancer detection.
In lung cancer detection, the feature of interest is the spiculated mass (a lump of tissue with spikes or points on its surface) in the lung image. Corruption of the CT lung image leads to the presence of false pixels in the image. The features of the noisy digital lung image are normally restored using preprocessing filters, which enhance the quality of the image and improve recognition accuracy.
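As an illustrative preprocessing sketch (a median filter is shown only as one common choice; the window size is an assumed parameter):

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess_ct_slice(noisy_slice, window=3):
    """Suppress speckle-like (multiplicative) noise in a normalized CT lung
    slice with a median filter; `window` is the side length of the square
    neighbourhood used for each output pixel."""
    return median_filter(noisy_slice, size=window)
```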