color images, color spaces and color image processing

Ole-Johan Skrede
08.03.2017
INF2310 - Digital Image Processing

Department of Informatics
The Faculty of Mathematics and Natural Sciences
University of Oslo

After original slides by Fritz Albregtsen


today’s lecture

∙ Color, color vision and color detection


∙ Color spaces and color models
∙ Transitions between color spaces
∙ Color image display
∙ Look up tables for colors
∙ Color image printing
∙ Pseudocolors and fake colors
∙ Color image processing
∙ Sections in Gonzales & Woods:
∙ 6.1 Color Fundamentals
∙ 6.2 Color Models
∙ 6.3 Pseudocolor Image Processing
∙ 6.4 Basics of Full-Color Image Processing
∙ 6.5.5 Histogram Processing
∙ 6.6 Smoothing and Sharpening
∙ 6.7 Image Segmentation Based on Color
1
motivation

∙ We can differentiate between thousands of colors


∙ Colors make it easy to distinguish objects
∙ Visually
∙ And digitally
∙ We need to:
∙ Know what color space to use for different tasks
∙ Transition between color spaces
∙ Store color images rationally and compactly
∙ Know techniques for color image printing

2
the color of the light from the sun
spectral exitance

The light from the sun can be modeled with the spectral exitance of a black surface (the
radiant exitance of a surface per unit wavelength)

M(λ) = (2πhc² / λ⁵) · 1 / (exp(hc / (λkT)) − 1).

where
∙ h ≈ 6.626 070 04 × 10−34 m2 kg s−1 is the Planck
constant.
∙ c = 299 792 458 m s−1 is the speed of light.
∙ λ [m] is the radiation wavelength.
∙ k ≈ 1.380 648 52 × 10−23 m2 kg s−2 K−1 is the
Boltzmann constant.
∙ T [K] is the surface temperature of the radiating body.

Figure 1: Spectral exitance of a black body surface for different temperatures.

4
spectral irradiance

The distance from the earth to the sun is about d ≈ 1.496 × 1011 m, and the radius of the
sun is about r ≈ 6.957 × 108 m. The radiation we measure at the top of the earth’s
atmosphere from a black body sphere is then given by the spectral irradiance
E0(λ) = M(λ) (r/d)².

The wavelength at the peak is given by the Wien displacement law

λ_max ≈ (2.897 772 9 × 10⁻³ m K) / T ≈ 501.51 nm for the sun's surface temperature (T ≈ 5778 K),

which is green (RGB ≈ (0, 255, 135)).
Figure 2: Spectral irradiance of a black body surface for different
temperatures.
5
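As a small numerical illustration of the two formulas above, the sketch below evaluates M(λ), E0(λ) and the Wien peak with numpy. The constant and function names are my own choices, and the sun's surface temperature T ≈ 5778 K is assumed.

import numpy as np

# Physical constants (SI units), as listed on the spectral exitance slide.
H = 6.62607004e-34    # Planck constant
C = 299792458.0       # speed of light
K = 1.38064852e-23    # Boltzmann constant

def spectral_exitance(lam, T):
    # Black-body spectral exitance M(lambda) at temperature T [K], lambda in metres.
    return (2 * np.pi * H * C**2 / lam**5) / (np.exp(H * C / (lam * K * T)) - 1)

def spectral_irradiance(lam, T, r=6.957e8, d=1.496e11):
    # Irradiance at distance d from a black-body sphere of radius r (defaults: sun and earth).
    return spectral_exitance(lam, T) * (r / d) ** 2

# Wien displacement law: peak wavelength for the sun.
T_sun = 5778.0
lam_max = 2.8977729e-3 / T_sun
print(lam_max * 1e9)                        # about 501.5 nm, i.e. green
print(spectral_irradiance(lam_max, T_sun))  # peak spectral irradiance at the top of the atmosphere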
spectrum of solar radiation

∙ The spectrum of the Sun’s solar radiation is


close to that of a black body.
∙ The Sun emits EM radiation across most of the
EM spectrum.
∙ The strongest output is in the range for visible light (from about 380 nm to about 780 nm).

Figure 3: Solar irradiance spectrum above atmosphere and at surface. Extreme UV and X-rays are produced (at left of wavelength range shown) but comprise very small amounts of the Sun's total output power. (Source: By Nick84 - http://commons.wikimedia.org/wiki/File:Solar_spectrum_ita.svg, CC BY-SA 3.0)

Figure 4: The electromagnetic spectrum (Source: By Victor Blacus - SVG version of File:Electromagnetic-Spectrum.png, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=22428451)

6
light propagation behaviour

∙ Absorption: Light stops at the object, and does not reflect or refract, making it appear dark.
∙ Reflection (on a smooth surface): Light bounces off the
surface of a material at an angle equal to the incident
angle (as in a mirror).
∙ Scatter (reflection on rough surface): Light bounces off in
many directions.
∙ Transmission: Light travels through the object. Example:
Glass.
∙ Refraction: Changes in wave speed when light transmits
through a different material causes distortions from the
original path.
∙ Diffraction: Bending of light around corners of an
obstacle or an aperture. Figure 5: Different light path options
7
sunlight radiation on earth

∙ Not all radiation from the sun that


reaches our planet reaches the earth’s (a) Solar radiation partitions

surface.
∙ A lot of radiation (especially in the UV and IR wavelength ranges) is absorbed by molecules such as H2O and CO2.

(b) Radiation transmittance

Figure 6: Sunlight and the earth.

8
dispersion

∙ Dispersion is the phenomenon where the phase speed of the light depends on the frequency.
∙ This is a property of the propagation medium; media with this property are called dispersive media.

Figure 8: Illustration of dispersion. The different colors in white light propagating through a dispersive prism are revealed since different wavelengths are refracted at different angles.

Figure 7: Colors and wavelengths

9
light and color

∙ Achromatic light is light without color (grayscale), characterized by its intensity.
∙ Chromatic light is colored light, spanning a narrow band of the electromagnetic spectrum.
∙ Three basic quantities determine chromatic light:
∙ Radiance: Total amount of energy from the light source. Measured in Watts [W]
∙ Luminance: The amount of energy an observer perceives. Measured in lumens [lm]
∙ Brightness: Analogue to the achromatic intensity. Subjective and difficult to measure.

10
the color of an object

∙ The color we perceive in an object is determined by the nature of the light reflected by the object.
∙ Therefore, the color of an object is determined by
∙ The light hitting the object.
∙ How much light is reflected and absorbed.
∙ Or, in other words
∙ The spectral distribution of the incident light.
∙ The spectral distribution of the reflected light.
∙ The reflection properties are determined by
∙ Chemical pigments
∙ Physical surface structures
∙ Together, this determines what wavelengths are reflected, absorbed or transmitted.
∙ How we actually perceive color is complicated. Two complementary theories: Trichromacy and Color opponency.
11
∙ Vegetation
∙ Green plants are green because of a
pigment called chlorophyll which
absorbs red and blue wavelengths.
∙ As chlorophylls degrade in the autumn,
hidden pigments of yellow xanthophylls
and orange beta-carotene are revealed.
∙ The sky
∙ When the sun is high in the sky: the blue part (shorter wavelengths) is scattered from air particles more than red light. (Technically, the sky is violet, but we perceive it as blue. More on that in a second.)
∙ At sunset: More atmosphere to propagate through, thus most of the blue light is scattered, revealing the red light.

(a) Absorption spectrum of chlorophyll
(b) Sky colors

12
trichromatic theory (young, helmholtz)

∙ The retina is responsible for light detection in the eye


∙ Two types of detectors: rods and cones
∙ Rods detect achromatic light
∙ Cones detect chromatic light, and can be divided into three principal sensing receptors (cone opsins) that have peak sensitivity at different parts of the spectrum
∙ S (short): Around blue (∼445 nm; 2 %, but most sensitive)
∙ M (middle): Around green (∼535 nm; 33 %)
∙ L (long): Around red (∼575 nm; 65 %)

Figure 10: Absorption of light by S (”blue”), M (”green”) and L (”red”) cones.


13
color opponency theory (hering)

∙ Three classes of opponent channels:


∙ Red-Green: Either red or green (no such thing as greenish-red). Senses the difference (S + L) - M
∙ Yellow-Blue: Either yellow or blue (no such thing as bluish-yellow). Senses the difference (M + L) - S
∙ White-Black: Luminance level, a spectrum of graylevels.

∙ The response from S-, M-, and L-cones are


shown in the figure to the right
∙ This can explain some aspects of color blindness, and color afterimages

Figure 11: Opponent chromatic response functions in spectral hues for the CIE 1964 standard observer. (Source: https://www.handprint.com/HP/WCL/color6.html)

14
stage (or zone) theory

The brain combines information from each type of receptor to give rise to different color perception. The modern theory of visual perception incorporates trichromatic theory and opponent theory, in two stages. In summary, some perceived colors can be explained as
∙ Blue light stimulates S more than green or red light, but M and L more weakly
∙ Blue-green light stimulates M more than L, and S more strongly
∙ Green light stimulates M more than S and L
∙ Green-Yellow light stimulates both L and M equally strongly, but S weakly
∙ Red light stimulates L much more than M, and S hardly at all

17
color description
some words and their meanings

All colors can be fully described by their hue, saturation and some form of intensity, or brightness value.

∙ Hue: The dominant wavelength of the color.
∙ Saturation: The purity of the color, i.e. how little it is diluted with white light; a pure color (no white) is fully saturated.
∙ Intensity: How bright the color is, some notion of darkness.
∙ Hue and saturation together determine the color, and we call these two chromaticity.
∙ The chromaticity determines both the dominating wavelength and the saturation of the color.
∙ Chromaticity and brightness together fully describe a color.
∙ For instance, different graylevels have the same chromaticity, but different intensity.

Figure 13: Chromaticity modeled as a unit polar coordinate chart (ρ, θ) where fully saturated colors are at the boundary (ρ = 1), and the angle θ represents the wavelength, and thereby the hue.

19
additive color mixing

∙ Mixing light of two or more colors


∙ Additive primaries: Red, Green, Blue
∙ Additive secondaries: Yellow, Cyan,
Magenta
∙ Utilized in TV and Computer monitors
∙ A mix of red and green light (perceived as yellow) is physically different from monochromatic yellow light (wavelength at about 580 nm), but we detect no difference.

Figure 14: Additive mixing

20
subtractive color mixing

∙ Mixing filters or pigments that absorb and reflect different colors.
∙ If you begin with white light, the color you see is the result of a subtractive color process where different wavelengths have been absorbed.
∙ Subtractive primaries: Yellow, Cyan, Magenta
∙ Subtractive secondaries: Red, Green, Blue
∙ Mixing of paint is subtractive (although you probably used an RYB color model in kindergarten).

Figure 15: Subtractive mixing

21
human color vision

∙ We can differentiate between about 100


different hues
∙ We can differentiate between about
6000 different hue and intensity
combinations.
∙ For each of those, we can differentiate
about 60 different saturation levels.
∙ In total we differentiate about 360 000
colors. Figure 16: RGB cube

22
color spaces
color models and color spaces

∙ A color model is an abstract mathematical model describing how colors can be represented by tuples of numbers (e.g. RGB or CMYK)
∙ A color space is a specific organization of colors. There can be different color spaces for a specific color model
∙ E.g. sRGB and Adobe RGB are color spaces of the RGB color model.
∙ Color spaces and models are sometimes (incorrectly) used indiscriminately.

Systems covered today:
∙ CIE standard color model: CIEXYZ
∙ RGB(A) color model: sRGB, Adobe RGB
∙ HSV, HSL and HSI
∙ CMY(K) color model
∙ YUV color model: YUV, YIQ, YCbCr

What is not covered is e.g. the LAB color model (see e.g. Hunter Lab and CIEL*a*b*).

24
cie standard observer

∙ CIE (Commission Internationale de l’Eclairage) defined a model of standard tristimulus values {X, Y, Z}.
∙ {X, Y, Z} are hypothetical, but X ∼ R, Y ∼ G and Z ∼ B
∙ A quantitative measurement of colors, where any wavelength can be matched perceptually by positive combinations of X, Y, and Z.
∙ Defined such that for a hypothetical light source with temperature 5400 K, X = Y = Z = 1
∙ In the same way, white light from indirect daylight (D65, 6500 K) has X = 0.950456, Y = 1, Z = 1.088754.

X = ∫_380^780 L_{e,Ω,λ}(λ) x̄(λ) dλ,
Y = ∫_380^780 L_{e,Ω,λ}(λ) ȳ(λ) dλ,
Z = ∫_380^780 L_{e,Ω,λ}(λ) z̄(λ) dλ,

where L_{e,Ω,λ} is the spectral radiance.

Figure 17: The CIE 1931 standard observer color matching functions. (By User:Acdx - Own work, GFDL, https://commons.wikimedia.org/w/index.php?curid=6233111)
25
cie chromaticity coordinates

∙ From the tristimulus values, we can construct a set of chromaticity coordinates {x, y, z}:

x = X/(X + Y + Z),  y = Y/(X + Y + Z),  z = Z/(X + Y + Z)

∙ Note that z = 1 − x − y
∙ This is often used to derive different color spaces.
∙ The chromaticity diagram on the left represents the human gamut (colors visible to us).
∙ A gamut is a certain complete subset of colors.

Figure 18: The CIE 1931 xy color space chromaticity diagram. (By BenRG - File:CIExy1931.svg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=7889658)


26
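A minimal sketch of this projection; the function name is arbitrary, and the D65 tristimulus values are taken from the previous slide.

def xyz_to_xy(X, Y, Z):
    # Normalize by X + Y + Z; z = 1 - x - y is redundant and therefore dropped.
    s = X + Y + Z
    return X / s, Y / s

# D65 white point -> roughly (0.3127, 0.3290), matching the sRGB white point later.
print(xyz_to_xy(0.950456, 1.0, 1.088754))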
cie human gamut

∙ The curved edge of the gamut is called the spectral locus, and consists of fully saturated monochromatic colors (each point representing a wavelength).
∙ The straight line on the lower part is called the line of purples, and has no monochromatic color equivalent. Purple and magenta do not exist in the rainbow (but violet does).
∙ The line segment between two points contains all colors that can be created with different mixtures of the two colors at those points.
∙ Therefore a color space gamut is convex, e.g. a triangle if it is built of three primary colors.

Figure 19: Example gamut of the CIE 1931 RGB primaries, where Red: 700 nm, Green: 546.1 nm, Blue: 435.8 nm. (By BenRG - Own work, inspired by File:CIExy1931.png, Public Domain, https://commons.wikimedia.org/w/index.php?curid=7889718)
27
rgb color model

∙ An additive color model where the primaries red, green and blue are added together to form a wide variety of colors.
∙ Device dependent, meaning that the color depends on the device that detects or outputs the color.
∙ Typical input devices are video cameras, digital cameras and image scanners.
∙ Typical output devices are TV, computer and mobile displays.
∙ Organized in a unit cube with coordinates (r, g, b), where (0, 0, 0) is black and (1, 1, 1) is white.
∙ A typical 24 bit display has 256 values for each channel.

(a) CRT monitor (By Gona.eu - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=6223756)
(b) RGB color cube (By SharkD - Own work, GFDL, https://commons.wikimedia.org/w/index.php?curid=3375025)
28
rgba color model

∙ As the RGB color model, with an added α-channel.
∙ Usually used as an opacity channel. 0 % is fully transparent, and 100 % is fully opaque.
∙ Allows image composition, as background is visible through foreground if the foreground is transparent.

Figure 21: Red, green and blue colors with different transparency (alpha). Note that when overlapping, blue is overlapping green which is overlapping red.

29
srgb color space

∙ Created in 1996 by HP and Microsoft to


approximate the color gamut of most common
computer monitors
∙ Most commonly used RGB color space, standard
color space for images on the internet.
∙ Relatively small gamut, covers about 35% of all
visible colors.

Chromaticity   Red     Green   Blue    White
x              0.6400  0.3000  0.1500  0.3127
y              0.3300  0.6000  0.0600  0.3290
Y              0.2126  0.7152  0.0722  1.0000

Figure 22

30
cie xyz to srgb transformation

The CIE XYZ tristimulus values must be normalized to the D65 white point (X = 0.9505,
Y = 1.0000, Z = 1.0890). Then compute a linear transform
    
R_lin = 3.2406 X − 1.5372 Y − 0.4986 Z
G_lin = −0.9689 X + 1.8758 Y + 0.0415 Z
B_lin = 0.0557 X − 0.2040 Y + 1.0570 Z

Now, these values must be gamma-corrected before we get the final sRGB values

C_srgb = 12.92 C_linear,                  if C_linear ≤ 0.0031308
C_srgb = (1 + a) C_linear^(1/2.4) − a,    if C_linear > 0.0031308
where a = 0.055, for C ∈ {R, G, B}.

31
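A sketch of this conversion with numpy, assuming the XYZ input is already normalized to the D65 white point; the clipping of out-of-gamut values to [0, 1] is an extra step not stated on the slide.

import numpy as np

# Linear transform from D65-normalized XYZ to linear RGB (matrix from the slide).
M_XYZ_TO_RGB = np.array([[ 3.2406, -1.5372, -0.4986],
                         [-0.9689,  1.8758,  0.0415],
                         [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb(xyz):
    # xyz is an (..., 3) array; returns sRGB values in [0, 1].
    rgb_lin = xyz @ M_XYZ_TO_RGB.T
    rgb_lin = np.clip(rgb_lin, 0.0, 1.0)
    a = 0.055
    # Piecewise gamma correction, as on the slide.
    return np.where(rgb_lin <= 0.0031308,
                    12.92 * rgb_lin,
                    (1 + a) * rgb_lin ** (1 / 2.4) - a)

print(xyz_to_srgb(np.array([0.9505, 1.0, 1.089])))  # D65 white -> approximately [1, 1, 1]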
adobe rgb color space

∙ Created in 1998 by Adobe to view most colors


available to CMYK printers.
∙ Larger gamut than the sRGB, covers about 50% of
all visible colors.
∙ Richer in the green-cyan region.

Chromaticity   Red     Green   Blue    White
x              0.6400  0.2100  0.1500  0.3127
y              0.3300  0.7100  0.0600  0.3290

Figure 23

1 By Mbearnstein37 - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=29994804


32
cmyk color model

∙ Cyan, Magenta, Yellow, Key (black)
∙ The CMYK model is subtractive
∙ RGB is common in display monitors, CMYK is common in printers.
∙ CMYK (in [0, 1]) to RGB (in [0, 1]):

R = (1 − C)(1 − K)
G = (1 − M)(1 − K)
B = (1 − Y)(1 − K)

∙ RGB (in [0, 1]) to CMYK (in [0, 1]):

K = 1 − max{R, G, B}
C = (1 − R − K)/(1 − K)
M = (1 − G − K)/(1 − K)
Y = (1 − B − K)/(1 − K)

Figure 24: Collection of gamuts

1 By BenRG and cmglee - http://commons.wikimedia.org/wiki/File:CIE1931xy_blank.svg, CC BY-SA 3.0,

https://commons.wikimedia.org/w/index.php?curid=32158329 33
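The conversions above written out as a small sketch; the guard for pure black (K = 1), where C, M and Y are undefined, is my own addition.

def rgb_to_cmyk(R, G, B):
    # RGB in [0, 1] to CMYK in [0, 1]; K = 1 means pure black.
    K = 1 - max(R, G, B)
    if K == 1:
        return 0.0, 0.0, 0.0, 1.0
    C = (1 - R - K) / (1 - K)
    M = (1 - G - K) / (1 - K)
    Y = (1 - B - K) / (1 - K)
    return C, M, Y, K

def cmyk_to_rgb(C, M, Y, K):
    # CMYK in [0, 1] back to RGB in [0, 1].
    return (1 - C) * (1 - K), (1 - M) * (1 - K), (1 - Y) * (1 - K)

print(rgb_to_cmyk(1.0, 0.5, 0.0))       # orange -> (0.0, 0.5, 1.0, 0.0)
print(cmyk_to_rgb(0.0, 0.5, 1.0, 0.0))  # back to (1.0, 0.5, 0.0)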
device dependence

∙ RGB colors on a screen are dependent on the properties of the screen. That is, the same image can look different on two different screens.
∙ This is also the case for CMYK: the same image printed on two different printers can
look completely different. The color is dependent on the printer, the printer ink, the
paper etc.
∙ There is not always overlap between CMYK and RGB colors. The monitor can display some colors that the printer cannot print, and vice versa.
∙ We say that RGB and CMYK are device dependent color models.
∙ CIEXYZ is an example of a model that is device independent.
∙ The number of stable, ”recognizable” colors on a monitor is actually quite small.

34
hue, saturation, intensity (hsi)

∙ Describe colors by their hue,


saturation and intensity.
∙ This can be viewed in a double cone (fig on the right).
∙ Hue: An angle from red (H =
0).
∙ Saturation: Distance from
center axis.
∙ Intensity: Vertical axis.

∙ RGB is useful for color


generation. HSI more useful
for color description and color
image processing. Figure 25: HSI double cone

35
rgb to hsi

For R, G, B ∈ [0, 1]

H = θ        if B ≤ G
H = 360 − θ  if B > G

where¹

θ = arccos{ ½[(R − G) + (R − B)] / sqrt((R − G)² + (R − B)(G − B)) }.

Saturation is given by

S = 1 − 3 min{R, G, B} / (R + G + B).

Intensity is given by

I = (R + G + B) / 3.

Notice that H is not defined for R = G = B, and S is not defined when I = 0.
¹ Remember to convert from radians to degrees. This also applies in the next slides.
36
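A sketch of the RGB-to-HSI formulas above for a single pixel; the small eps guard against division by zero is my own addition, and H is still meaningless for gray pixels (R = G = B).

import math

def rgb_to_hsi(R, G, B):
    # R, G, B in [0, 1] -> (H in degrees, S, I).
    eps = 1e-12
    num = 0.5 * ((R - G) + (R - B))
    den = math.sqrt((R - G) ** 2 + (R - B) * (G - B)) + eps
    theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    H = theta if B <= G else 360.0 - theta
    I = (R + G + B) / 3.0
    S = 0.0 if I == 0 else 1.0 - min(R, G, B) / I   # equals 1 - 3*min/(R+G+B)
    return H, S, I

print(rgb_to_hsi(1.0, 1.0, 0.0))   # yellow -> (60, 1, 2/3), matching the comparison table later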
hsi to rgb

For 0 < H ≤ 120:
  B = I(1 − S)
  R = I[1 + S cos H / cos(60 − H)]
  G = 3I − (R + B)

For 120 < H ≤ 240: set H = H − 120
  R = I(1 − S)
  G = I[1 + S cos H / cos(60 − H)]
  B = 3I − (R + G)

For 240 < H ≤ 360: set H = H − 240
  G = I(1 − S)
  B = I[1 + S cos H / cos(60 − H)]
  R = 3I − (G + B)

37
hsv (hue, saturation, value) and hsl (hue, saturation, lightness)

∙ HSV and HSL are alternatives to HSI.
∙ The hue is the same in all three representations.
∙ Intensity, Value and Lightness are different:

V = max{R, G, B}
L = (max{R, G, B} + min{R, G, B}) / 2

∙ Saturation is also different:

S_HSV = (max{R, G, B} − min{R, G, B}) / V
S_HSL = (max{R, G, B} − min{R, G, B}) / (1 − |2L − 1|)

(a) HSV cylinder
(b) HSL cylinder

38


comparison of some colors

Note that all values are in range [0, 1] except hue, which is in range [0, 360]

RGB CMY HSI


Red (1, 0, 0) (0, 1, 1) (0, 1, 1/3)
Yellow (1, 1, 0) (0, 0, 1) (60, 1, 2/3)
Green (0, 1, 0) (1, 0, 1) (120, 1, 1/3)
Blue (0, 0, 1) (1, 1, 0) (240, 1, 1/3)
White (1, 1, 1) (0, 0, 0) (0, 0, 1)
Gray (1/2, 1/2, 1/2) (1/2, 1/2, 1/2) (0, 0, 1/2)
Black (0, 0, 0) (1, 1, 1) (0, 0, 0)

39
luminance and chrominance color model

∙ Several kinds: YUV, YIQ, YCbCr, YPbPr, etc.


∙ For YUV:
∙ Y is luminance (luma).
∙ U and V are chrominance.

∙ These kinds of models are common in TV and video encoding, partly for historical reasons (compatibility with black and white TV).
Figure 27: YUV at Y = 0.5.

40
yiq

∙ NTSC is the standard for TV and video in North America and Japan; it uses the system YIQ.
∙ Y describes luminance, I and Q describe chrominance information.
∙ R, G, B, Y ∈ [0, 1], I ∈ [−0.5957, 0.5957] and Q ∈ [−0.5226, 0.5226].
∙ RGB to YIQ:

Y = 0.299 R + 0.587 G + 0.114 B
I = 0.596 R − 0.274 G − 0.322 B
Q = 0.211 R − 0.522 G + 0.311 B

∙ YIQ to RGB:

R = Y + 0.956 I + 0.623 Q
G = Y − 0.272 I − 0.648 Q
B = Y − 1.105 I + 0.705 Q

Figure 28: YIQ at Y = 0.5. Note that I and Q are scaled to [−1, 1].

41
ycbcr

∙ This is used in digital TV and video.


∙ Y is luminance (luma)
∙ Cb is blue minus luma (B - Y)
∙ Cr is red minus luma (R - Y)
∙ YCbCr is digital, RGB can both be analog or digital.
∙ MPEG-compression (in DVD, digital TV and video) is coded
in YCbCr.
∙ Digital video cameras (MiniDV, DV, Digital Betacam etc)
provides a YCbCr signal over a digital link like FireWire or
SDI. Figure 29: YCbCr at Y = 0.5.

∙ The analog dual of YCbCr is YPbPr.

42
color spaces, summary

∙ A color model is a standardized way to specify colors.
∙ Specifies a coordinate system where each point is a color.
∙ Different color models serve different purposes
RGB: color monitor display
CMYK: color printing
HSI, HSV, HSL: human color perception
YIQ, YCbCr, YUV: color compression in video and TV coding
CIEXYZ, CIELAB: device independent standard

43
digital color images
organization of color images

True color uses all colors in the color space.


∙ Used in applications that contain many colors with subtle differences.
E.g. digital photography or photorealistic rendering.
∙ Two main ways to organize true color: Component ordering and packed
ordering.
Indexed color uses only a subset of colors.
∙ Which subset to use depends on the application.
∙ Reduces memory and computation cost.
∙ Used when subtle color differences are not vital.

45
true color: component ordering

∙ Each image index consists of multiple channels, one for each color component.
∙ For RGB, one red channel, one green channel and one blue channel.
∙ If each channel is discretized into 8 bits (256 intensity values for red, green and blue), we get (2^8)^3 = 2^24 = 16 777 216 different colors

Figure 30: True color, component ordering1 .

1 Wilhelm Burger, Mark J. Burge, Principles of Digital Image Processing: Fundamental Techniques, 2010
46
example

47
red channel

48
green channel

49
blue channel

50
full example summary

(a) All (b) Red

(c) Green (d) Blue


51
example, detail

52
red channel

53
green channel

54
blue channel

55
detail example summary

(a) All (b) Red

(c) Green (d) Blue

56
true color: packed ordering

∙ Color components are packed together at the same image element.

Figure 33: True color, packed ordering1 .

1 Wilhelm Burger, Mark J. Burge, Principles of Digital Image Processing: Fundamental Techniques, 2010
57
packed ordering and memory layout

∙ Endianness: Sequential order used to numerically interpret a range of bytes in


computer memory as a larger, composed word value.
∙ Little endian: Bytes are stored by increasing significance, with the least significant byte
(LSB) 1 stored first. Common in microprocessors (e.g. Intel x86 processors).
∙ Big endian: Bytes are stored by decreasing significance, with the most significant byte
(MSB)2 stored first. Common in data networking (e.g. in the Internet Protocol suite).

∙ A 32 bit word ARGB with decreasing significance A → R → G → B would be stored


as ARGB in a big-endian system, but as BGRA in a little-endian system.

1 The byte containing the least significant (”the rightmost”) bit.


2 The byte containing the most significant (”the leftmost”) bit.
58
example: endianness confusion

A 32 bit ARGB value 0x80FF00FF would be interpreted differently by big-endian and little-endian systems if not handled properly.

Big-endian interpretation: 0x80FF00FF
Alpha: 80 → 128 (50.2 %)
Red:   FF → 255
Green: 00 → 0
Blue:  FF → 255

Little-endian interpretation: 0xFF00FF80
Alpha: FF → 255 (100 %)
Red:   00 → 0
Green: FF → 255
Blue:  80 → 128

59
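The confusion can be reproduced directly with Python's struct module: the same four bytes in memory are unpacked below with both byte orders.

import struct

raw = bytes([0x80, 0xFF, 0x00, 0xFF])  # the four bytes of the example, in memory order

big = struct.unpack(">I", raw)[0]      # big-endian word:    0x80FF00FF
little = struct.unpack("<I", raw)[0]   # little-endian word: 0xFF00FF80

for name, word in [("big", big), ("little", little)]:
    a = (word >> 24) & 0xFF            # ARGB with decreasing significance
    r = (word >> 16) & 0xFF
    g = (word >> 8) & 0xFF
    b = word & 0xFF
    print(name, "endian:", "A =", a, "R =", r, "G =", g, "B =", b)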
indexed color

∙ An image contains indices into a look up table (LUT) of colors (palette) instead of color intensity values.
∙ Permits only a limited number of colors.
∙ These images are often stored in indexed GIF or PNG formats.
∙ Use color quantization to transform optimally from a true color image to an indexed color image.

Figure 34: Indexed color ordering1 .

1 Wilhelm Burger, Mark J. Burge, Principles of Digital Image Processing: Fundamental Techniques, 2010 60
color images and look up tables

∙ As we have seen, in true color, an RGB pixel can be stored in 24 bits (8 for each color).
∙ To reduce this size, we could e.g. assign 3 bits to each color.
∙ This would result in only 512 different possible colors, and a total of 9 bits per pixel.
∙ A region with many nuances of a color would not look good.
∙ It is not certain that all 512 colors exist in the image.
∙ Alternatively, one could use 8 bits and a LUT.
∙ Each row in the table represents a 24 bit RGB color.
∙ The table consists of the 256 colors that best represent the image.

61
color quantization

∙ Purpose: reduce size of color images.


∙ E.g. 24 bit TIFF to 8 bit TIFF.
∙ Replace the true color with a best match from a smaller subset.
∙ Some different quantization algorithms:
∙ Uniform quantization
∙ Median-cut algorithm

62
uniform quantization algorithm

Convert each component c of the original RGB value independently and uniformly to the new value ĉ:

ĉ = floor( c · N̂ / N )

Here c ∈ {r, g, b}, N is the number of intensity values in the original representation, and N̂ is the number in the new representation.

Example: 3 × 12 bit (N = 2^12 = 4096) true color image to 3 × 8 bit (N̂ = 2^8 = 256) values.
Example: 3 × 8 bit (N = 256) to a 3:3:2 packed 8 bit value: 3 bits for red (N̂ = 8), 3 bits for green (N̂ = 8), and 2 bits for blue (N̂ = 4).
63
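A sketch of uniform quantization and the 3:3:2 packing example with numpy; the function names are made up for illustration.

import numpy as np

def quantize_uniform(c, N, N_hat):
    # c_hat = floor(c * N_hat / N), applied elementwise to an integer channel.
    return (c.astype(np.uint32) * N_hat) // N

def pack_332(rgb):
    # Pack an (..., 3) uint8 RGB array into one 3:3:2 byte per pixel.
    r = quantize_uniform(rgb[..., 0], 256, 8)   # 3 bits
    g = quantize_uniform(rgb[..., 1], 256, 8)   # 3 bits
    b = quantize_uniform(rgb[..., 2], 256, 4)   # 2 bits
    return ((r << 5) | (g << 2) | b).astype(np.uint8)

print(pack_332(np.array([[255, 255, 255]], dtype=np.uint8)))  # white -> 255 (all bits set)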
median-cut algorithm

This transforms a 24 bit true color image to 8 bit indexed color image.
1. Find the box in the RGB space that cover all colors present in the image.
2. Sort the colors in the box along the longest RGB dimension of the box. This is done
by computing the color histogram.
3. Split the box in two at the median in the sorted list.
4. Repeat step 2 and 3 for all boxes (including the new ones that you create). Repeat
until you have 256 boxes.
5. For each box, compute the mean RGB value in that box, and let this value represent
the value of the box.
6. Map each 3 × 8 bit RGB value in the original image to the index of the box whose value is closest in the RGB space.

64
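A compact sketch of steps 1-6 with numpy, assuming the image has at least as many distinct colors as the palette size (degenerate one-pixel boxes are not handled here).

import numpy as np

def median_cut_palette(pixels, n_colors=256):
    # pixels: (N, 3) array of RGB values. Returns an (n_colors, 3) palette.
    boxes = [pixels.astype(np.float64)]
    while len(boxes) < n_colors:
        # Pick the box with the largest spread and split it along its longest RGB dimension.
        idx = max(range(len(boxes)), key=lambda i: np.ptp(boxes[i], axis=0).max())
        box = boxes.pop(idx)
        channel = np.ptp(box, axis=0).argmax()
        box = box[box[:, channel].argsort()]   # sort along the longest dimension
        median = len(box) // 2                 # split at the median
        boxes += [box[:median], box[median:]]
    # Each box is represented by its mean RGB value.
    return np.array([b.mean(axis=0) for b in boxes])

def to_indexed(image, palette):
    # Step 6: map each pixel to the index of the closest palette entry.
    flat = image.reshape(-1, 3).astype(np.float64)
    dists = ((flat[:, None, :] - palette[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1).reshape(image.shape[:2]).astype(np.uint8)

# Usage sketch, for an (h, w, 3) uint8 image:
#   palette = median_cut_palette(image.reshape(-1, 3))
#   indexed = to_indexed(image, palette)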
endianness and lut

A LUT is also prone to confusion in endianness. For instance, a LUT with 16 bit values and a 5:6:5 bit packed RGB ordering, (from MSB to LSB) 1000010000010000, would be interpreted differently. Let ”|” represent a byte delimiter; then we would have:

Big endian interpretation: 10000100 | 00010000
Red:   10000  → 16 (∼ 50 %)
Green: 100000 → 32 (∼ 50 %)
Blue:  10000  → 16 (∼ 50 %)

Little endian interpretation: 00010000 | 10000100
Red:   00010  → 2 (∼ 6.25 %)
Green: 000100 → 4 (∼ 6.25 %)
Blue:  00100  → 4 (∼ 12.5 %)
65
pseudo colors

∙ Pseudo-color images can be graylevel images where each graylevel is assigned an RGB value according to some LUT.
∙ Often used to emphasize small graylevel differences (e.g. in medical imaging).
∙ Also often used in graphical display of data.
∙ If the color-LUT is mapped back to graylevels, the intensity should(!) be correct.
∙ Colormaps in plotting libraries are often pseudo-color LUTs.

66
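A minimal sketch of applying a pseudo-color LUT to a graylevel image; the blue-to-red ramp below is a made-up LUT, not one used in the lecture.

import numpy as np

# A hypothetical 256-entry LUT: graylevel 0 maps to blue, graylevel 255 to red.
lut = np.zeros((256, 3), dtype=np.uint8)
lut[:, 0] = np.arange(256)          # red increases with graylevel
lut[:, 2] = 255 - np.arange(256)    # blue decreases with graylevel

def apply_lut(gray_image, lut):
    # Fancy indexing: each graylevel picks its LUT row, giving an RGB image.
    return lut[gray_image]

gray = np.arange(256, dtype=np.uint8).reshape(16, 16)
print(apply_lut(gray, lut).shape)   # (16, 16, 3)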
how to choose the correct colormap

∙ Often, we want the colormap to be perceptually uniform: Equal steps in the data are perceived as equal steps in the color space.
∙ The human brain perceives changes in lightness as changes in data better than changes in hue.
∙ Up until recently, Matlab (until 2014) and Python’s Matplotlib (until the 2.0 release) used jet as the default colormap for data display¹.
∙ jet is a rainbow colormap that is not perceptually uniform.

Figure 35: A gradient of jet, and the lightness component of the CAM02-UCS colorspace.

1 See a more detailed wrap-up here: https://bids.github.io/colormap/


67
comparison of some colormaps

(a) gray (sequential) (b) magma (sequential perceptual uniform)

(c) coolwarm (diverging) (d) viridis (sequential perceptual uniform) (new matplotlib default)

68
color graphics i

∙ We can produce rasterized data based


on observations, simulations,
computations etc.
∙ For instance population density or precipitation data projected on a world map.
∙ The use of a LUT gives a graphical
display that is not created by imaging.

Figure 37: Example of global mean annual precipitation (mm/year) (notice the colormap). (Source: https://data.giss.nasa.gov/impacts/agmipcf/)

69
fake colors

Using imaging from outside the visible spectrum and mapping it to RGB.
Example: NOAA (National Oceanic and Atmospheric Administration) satellites equipped with an AVHRR (Advanced Very High Resolution Radiometer), an instrument sensing in the visual and infrared part of the EM spectrum¹.

Band Band width Applications


1 (visible) 0.58 µm - 0.68 µm Clouds and land surfaces cartography (day)
2 (near IR) 0.725 µm - 1.00 µm Clouds and land surfaces cartography (day)
3A (near IR) 1.580 µm - 1.64 µm Snow and ice detection
3B (IR) 3.550 µm - 3.93 µm Clouds and sea surface temperature mapping (night)
4 (IR) 10.30 µm - 11.30 µm Clouds and sea surface temperature mapping (night)
5 (IR) 11.50 µm - 12.50 µm Sea surface temperature

1 http://eoedu.belspo.be/en/satellites/noaa.htm

70
noaa avhrr example

Figure 38: Composite mapped mosaics, band 1, 2, 4 1

1 http://www.ssd.noaa.gov/POES/COMP/

71
quantization and printing
grayscale printing

∙ Printers print graylevels in binary (black or nothing).
∙ We can remediate this problem by introducing a finer mesh.
∙ That is, the printer uses halftones.
∙ This works since we perceive a mean of close intensity values.
∙ The challenge is to create patterns of binary pixels that represent a gray level.
∙ Several methods exist:
∙ Global threshold (be smarter!)
∙ Patterning
∙ Ordered dithering
∙ Error diffusion

Figure 39: One pixel partitioned into 4 × 4 uniform sub-pixels.

73
global threshold

(a) Original (b) Result

74
dithering

∙ Dithering is a method used to create an illusion of color.


∙ Often used to remedy color quantization.
∙ There exist several dithering algorithms, we will cover
∙ Thresholding (technically a dithering method)
∙ Ordered dithering
∙ Error diffusion dithering

Figure 42

(a) Original (b) 16 colors, no dithering (c) 16 colors, with dithering

75
ordered dithering

∙ We threshold the values in an image using a dither matrix.
∙ A dither matrix Dn
∙ Has shape 2^n × 2^n
∙ Partitions the graylevel scale [0, 255] into (2^n)^2 equidistant values.
∙ Example:

D2 = [   0  128   32  160
       192   64  224   96
        48  176   16  144
       240  112  208   80 ]

∙ Simulate the subpixel partition by scaling the image up with a factor 2^n.
∙ Use the dither matrix as a mask and ”place it on the original pixels”.
∙ Threshold the ”subpixel value” against the corresponding element in the dither matrix.
∙ That is, let I ∈ [0, 255]^(h×w) be the original image, and In ∈ {0, 255}^(2^n h × 2^n w) be the new, dithered image; then

In[i, j] = 255  if I[⌊i/2^n⌋, ⌊j/2^n⌋] > Dn[i mod 2^n, j mod 2^n]
In[i, j] = 0    if I[⌊i/2^n⌋, ⌊j/2^n⌋] ≤ Dn[i mod 2^n, j mod 2^n]

for i ∈ [0, 2^n h − 1] and j ∈ [0, 2^n w − 1].

76
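A sketch of ordered dithering with numpy, using the D2 matrix above; np.kron does the 2^n upscaling and np.tile realizes the (i mod 2^n, j mod 2^n) indexing.

import numpy as np

D2 = np.array([[  0, 128,  32, 160],
               [192,  64, 224,  96],
               [ 48, 176,  16, 144],
               [240, 112, 208,  80]])

def ordered_dither(image, D):
    # image: (h, w) graylevel array in [0, 255]. Returns a (2^n h, 2^n w) binary image.
    s = D.shape[0]                                        # 2^n
    big = np.kron(image, np.ones((s, s)))                 # repeat each pixel s x s times
    tiled = np.tile(D, (image.shape[0], image.shape[1]))  # D[i mod s, j mod s] everywhere
    return np.where(big > tiled, 255, 0).astype(np.uint8)

gray = np.full((4, 4), 100, dtype=np.uint8)
print(ordered_dither(gray, D2).mean())   # about 7/16 of the sub-pixels turn white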
example of ordered dithering using d2

(a) Original (b) Result

77
error diffusion dithering

∙ Used to reduce quantization levels in an image.
∙ Distributes the quantization error made in one pixel to nearby pixels.
∙ Example: Floyd-Steinberg dithering
∙ Each traversed pixel is assigned the value that is closest in the palette.
∙ The residual is then weighted and added to the nearby pixels as

[  p    c    7/16 ]
[ 3/16 5/16  1/16 ]

where p signifies the previously visited pixel, and c the current one.

# Floyd-Steinberg dithering of a uint8 image im
import numpy as np

def floyd_steinberg(im, palette=(0, 255)):
    m, n = im.shape
    # Work in float so the diffused error does not overflow uint8.
    new_im = im.astype(np.float64)
    # Boundary treatment kept simple for readability: the last row and outer columns are skipped.
    for i in range(m - 1):
        for j in range(1, n - 1):
            old_val = new_im[i, j]
            # Assign the palette value that is closest to the current pixel value.
            new_val = min(palette, key=lambda p: abs(p - old_val))
            new_im[i, j] = new_val
            quant_error = old_val - new_val
            # Distribute the quantization error to not-yet-visited neighbours.
            new_im[i, j + 1]     += quant_error * 7 / 16
            new_im[i + 1, j - 1] += quant_error * 3 / 16
            new_im[i + 1, j]     += quant_error * 5 / 16
            new_im[i + 1, j + 1] += quant_error * 1 / 16
    return np.clip(new_im, 0, 255).astype(np.uint8)
78
error diffusion example

(a) Original (b) Result

79
color printing

∙ A CMYK color model is used.


∙ Halftone patterns are used at certain angles (different for each color) to create color patterns.
∙ We perceive the result in such a way that no sharp color transitions are seen.

Figure 45: Three examples of modern color halftoning.1


80
color image histogram

∙ An image with 3 color channels has a 3D cube as histogram
∙ With one bin for each color value, a histogram of an 8-bit RGB image gets (2^8)^3 = 16 777 216 bins.
∙ A 1024 × 1024 image can maximally fill 1/16 of this cube (2^(2·10) / 2^24 = 2^−4 = 1/16).
∙ In other words: the cube is mostly empty.
∙ For this reason, it is most common to project the histogram to 1D or 2D.
∙ 1D: Projection to the R-, G-, or B-axis.
∙ 2D: Projection to the RG-, RB-, or GB-plane.

Figure 46: 3D histogram example

81
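A sketch of the 2D projection onto the RG-plane with numpy; the other projections only differ in which two channels are passed to histogram2d.

import numpy as np

def rg_histogram(rgb_image, bins=256):
    # Project the 3D color histogram of a uint8 RGB image onto the RG-plane.
    r = rgb_image[..., 0].ravel()
    g = rgb_image[..., 1].ravel()
    hist, _, _ = np.histogram2d(r, g, bins=bins, range=[[0, 256], [0, 256]])
    return hist

rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(rg_histogram(rgb).shape)   # (256, 256); as noted above, most bins stay empty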
component color histogram

Figure 47: Lena. 1D component histograms: (a) Red, (b) Green, (c) Blue; 2D projections: (a) Red and green, (b) Red and blue, (c) Green and blue.

82
histogram equalization in rgb

∙ Histogram equalization on each RGB component, independently.


∙ This often produces a bad result.
∙ It is better to do it in HSI:
1. Transform the image from RGB to HSI.
2. Do histogram equalization on the I-component.
3. Transform the new HSI image back to RGB.

83
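A sketch of the three steps using scikit-image, with the HSV value channel standing in for the HSI intensity channel (the HSI formulas from earlier could be substituted for rgb2hsv/hsv2rgb).

import numpy as np
from skimage import color, exposure

def equalize_intensity(rgb):
    # rgb: float image in [0, 1]. Equalize only the brightness channel.
    hsv = color.rgb2hsv(rgb)                            # step 1 (HSV as stand-in for HSI)
    hsv[..., 2] = exposure.equalize_hist(hsv[..., 2])   # step 2: equalize the V channel
    return color.hsv2rgb(hsv)                           # step 3: back to RGB

rgb = np.random.rand(32, 32, 3) ** 2   # a dark test image
print(equalize_intensity(rgb).shape)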
histogram equalization in hsi

∙ The H and S channels are not changed.
∙ But since the I channel is, color perception can be altered.
∙ To remedy this, you can adjust the saturation S before you transform back to RGB.

Figure 50: Histogram equalization followed by saturation adjustment in HSI.

84
histogram equalization example

(a) Original (b) RGB (c) HSI

85
low-pass filtering

∙ RGB-filtering blurs colors.


∙ Filtering of the I component in HSI
produces a smooth image, without color
adjustment.

(a) HSI components

(b) Left: RGB lowpass. Middle: Filtering I component and converting back to HSI. Right:
Difference between results.
Figure 52: Lena and RGB components.

86
laplace filtering

∙ We can make a graylevel image appear sharper by adding a scaled Laplacian of the same image (previous lecture).
∙ RGB:
∙ Compute the Laplacian of each RGB component, and add it to the respective component.
∙ The color of each pixel is then influenced by the color of neighbouring pixels.
∙ HSI:
∙ Compute the Laplacian of the intensity channel, and add it to the intensity channel.
∙ The color is preserved, but the intensity near edges is changed.

87
color image thresholding

∙ Suppose that we have observed the same scene in different wavelengths.


∙ We can then threshold based on:
∙ 2D histogram
∙ 3D histogram
∙ Higher order histograms
∙ Simple method:
1. Decide thresholds for each channel independently.
2. Combine the segmented channels into one image.

∙ For RGB, this corresponds to partitioning the RGB space into boxes.

88
color image thresholding

A bit more sophisticated method.


1. Choose an arbitrary point in the multidimensional color-space as reference, e.g.
(R0 , G0 , B0 ) in RGB space.
2. Let fR , fG , fB be the different color components of an RGB image.
3. Compute the distance to the reference point

d(x, y) = sqrt( (fR[x, y] − R0)² + (fG[x, y] − G0)² + (fB[x, y] − B0)² )

4. Then, compute the final segmentation g as

g[x, y] = 1  if d(x, y) ≤ dmax
g[x, y] = 0  if d(x, y) > dmax

for some threshold dmax.

5. This is then a sphere with radius dmax around the reference point.
89
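A vectorized sketch of the sphere method with numpy; the reference color and d_max in the example are arbitrary.

import numpy as np

def threshold_color_sphere(rgb, ref, d_max):
    # Mark pixels within Euclidean distance d_max of the reference color (R0, G0, B0).
    diff = rgb.astype(np.float64) - np.asarray(ref, dtype=np.float64)
    d = np.sqrt((diff ** 2).sum(axis=-1))
    return (d <= d_max).astype(np.uint8)

rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
mask = threshold_color_sphere(rgb, ref=(200, 30, 30), d_max=60.0)   # "reddish" pixels
print(mask.sum(), "pixels inside the sphere")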
color image thresholding

We can do the same, but with an ellipsoid in RGB space instead of a sphere.
1. Choose an arbitrary point in the multidimensional color-space as reference, e.g. (R0, G0, B0) in RGB space.
2. Choose distance thresholds dR, dG, dB.
3. Compute the distance to the reference point

d(x, y) = (fR[x, y] − R0)² / dR² + (fG[x, y] − G0)² / dG² + (fB[x, y] − B0)² / dB²

4. Then, compute the final segmentation g as

g[x, y] = 1  if d(x, y) ≤ 1
g[x, y] = 0  if d(x, y) > 1

5. This is then an ellipsoid with semi-axes dR, dG and dB around the reference point.
90
thresholding hsi

∙ Transform to HSI.
∙ Suppose that we want to segment parts in an image
∙ with a certain hue
∙ and above some saturation threshold.
∙ Create a mask by segmenting the saturation image.
∙ Multiply the hue component image with this mask.
∙ Choose a hue interval corresponding to the desired color (remember that hue is circular).

Figure 55: Image segmentation in HSI space (row, column): (1, 1): Original. (1, 2): Hue. (2, 1): Saturation. (2, 2): Intensity. (3, 1): Binary saturation mask. (3, 2): Product of (1, 2) and (3, 1). (4, 1): Histogram. (4, 2): Segmentation of red components in (1, 1).
91
edge detection


The color edge strength in RGB vector space is

F = sqrt{ ½ [ (gxx + gyy) + (gxx − gyy) cos(2θ) + 2 gxy sin(2θ) ] }

where

gxx = (∂fR/∂x)² + (∂fG/∂x)² + (∂fB/∂x)²,
gyy = (∂fR/∂y)² + (∂fG/∂y)² + (∂fB/∂y)²,
gxy = (∂fR/∂x)(∂fR/∂y) + (∂fG/∂x)(∂fG/∂y) + (∂fB/∂x)(∂fB/∂y),

and

θ = ½ arctan( 2 gxy / (gxx − gyy) ).

Figure 56: RGB edge detection. (1, 1): Original. (1, 2): Gradient in RGB color vector space (F). (2, 1): Gradients computed per RGB component, and then added. (2, 2): Difference between (1, 2) and (2, 1).

92
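A sketch of these formulas with numpy and scipy, using Sobel filters as one possible choice of derivative estimator; arctan2 is used so that gxx = gyy does not cause a division by zero.

import numpy as np
from scipy import ndimage

def color_gradient(rgb):
    # Per-pixel color edge strength F in RGB vector space.
    rgb = rgb.astype(np.float64)
    dx = np.stack([ndimage.sobel(rgb[..., c], axis=1) for c in range(3)], axis=-1)
    dy = np.stack([ndimage.sobel(rgb[..., c], axis=0) for c in range(3)], axis=-1)
    gxx = (dx ** 2).sum(axis=-1)
    gyy = (dy ** 2).sum(axis=-1)
    gxy = (dx * dy).sum(axis=-1)
    theta = 0.5 * np.arctan2(2 * gxy, gxx - gyy)
    F = 0.5 * ((gxx + gyy) + (gxx - gyy) * np.cos(2 * theta) + 2 * gxy * np.sin(2 * theta))
    return np.sqrt(np.maximum(F, 0))

rgb = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(color_gradient(rgb).shape)   # (64, 64) edge-strength map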
noise in color images

∙ Add Gaussian noise to each RGB component (µ = 0, σ² = 800).
∙ The noise is not that visible in the RGB image.
∙ Convert the noisy image to HSI. (a) Noisy RGB components

∙ The hue and saturation


channels are very noisy.
∙ The intensity channel is less
noisy than the RGB channels.

(b) Noisy HSI components


93
Questions?

94
