GIS Neighborhood and Multiband Operations
1/19/2012
GNR630 Introduction to Geospatial Technologies
Instructors: Prof. (Mrs.) P. Venkatachalam, Prof. B. Krishna Mohan, Prof. S.S. Gedam
CSRE, IIT Bombay
pvenk/bkmohan/[email protected]
Slot 6
Lecture 5-6 Neighborhood Operations and Multiband Operations
January 20/25, 2012 11.05AM – 12.30PM
Contents of the Lecture
• Concept of Neighborhood
• Image Smoothing
• Edge Enhancement
• Color Transforms
• Band Arithmetic
IIT Bombay Slide 1
GNR630 Lecture 5-6 B. Krishna Mohan
Neighborhood Operations
Pixel and Neighborhood
A B C
D X E
F G H
• Pixel under consideration X
• Neighbors of X are A, B, C, D, E, F, G, H
• Size of neighborhood = 3x3
• Neighborhoods of size m x n, with m and n odd, so that a unique pixel lies at the centre of the neighborhood
4-neighborhoods
A B C
D X E
F G H
• B, D, E and G form the 4-neighborhood of X
• 4-neighbors are physically closest to X, at one-unit distance
8-neighborhood
A B C
D X E
F G H
• A, C, F and H are ALSO included with B, D, E, G as neighbors; the 8-pixel set is the 8-neighborhood of X
• A, C, F and H are the diagonal neighbors, sqrt(2) times farther from X
Larger Neighborhoods
o o o o o
o o o o o
o o X o o    5 x 5 neighborhood
o o o o o
o o o o o
• Larger neighborhoods are used based on need; the computational load grows quadratically with the side of the neighborhood
• 3x3 → 9 neighbors; 5x5 → 25 neighbors …
Point Operations v/s Neighborhood Operations
• Point operations do not alter the sharpness or resolution of the image
• Gray level associated with a pixel is manipulated independent of the gray levels associated with neighbors
• Pixel operations cannot deal with noise in the image, nor highlight local features like object boundaries
Neighborhood Effect
• 15 17 16 16 17 19
• 18 17 15 18 70 15
• 17 14 16 16 20 17
• Natural Noise?
• Abnormalities can be located by comparing a pixel with neighboring pixels
Neighborhood Effect
• 15 17 16 16 17 50
• 18 17 15 18 50 49
• 17 14 16 49 50 48
Normal region Boundary
• Sharp transitions from one region to another are marked by large differences in pixel values at neighboring positions
Neighborhood Operations
• Results of operations performed on the neighborhood are posted at the location of the central pixel
• The values in the input image are not overwritten, instead the results are stored in an output array or file
• Cannot be precomputed (e.g., as a lookup table, the way point operations can) since the number of possible gray-level configurations in a neighborhood is very large
Neighborhood Operations
• Simple averaging
A B C
D X E
F G H
• g(X) = (1/9)[f(A) + f(B) + f(C) + f(D) + f(X) + f(E) + f(F) + f(G) + f(H)]
• The output gray level is the average of the gray levels of all the pixels in the 3x3 neighborhood
Example
15 17 16      15 17 16
18 17 15      18 57 15
17 14 16      17 14 16
 Case 1        Case 2
• In case 1, after averaging, the central element 17 is replaced by the local average 16 – negligible change
• In case 2, after averaging, the central element 57 is replaced by 21 – significant change
• Averaging is a powerful tool to deal with random noise
Neighborhood Operations - Procedure
• The procedure involves applying the computational step at every pixel, considering its value and the values at the neighboring pixels
• Then the neighborhood is shifted by one pixel to the right and the centre pixel of the new neighborhood is in focus
• This process continues from left to right, top to bottom
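The scanning procedure above can be sketched as follows, assuming NumPy; the function name and the choice to leave border pixels unchanged are illustrative, not from the lecture:

```python
import numpy as np

def average_filter(image, size=3):
    """Neighborhood averaging with an odd-size window.

    The window is centred on each pixel and shifted left to right,
    top to bottom. Results are posted to a separate output array,
    never back into the input. Border pixels, where the window does
    not fully fit, are left unchanged here (one common convention).
    """
    w = size // 2
    img = np.asarray(image, dtype=float)
    out = img.copy()
    rows, cols = img.shape
    for i in range(w, rows - w):
        for j in range(w, cols - w):
            out[i, j] = img[i - w:i + w + 1, j - w:j + w + 1].mean()
    return out
```

On the slide's Case 2 neighborhood, the central 57 is replaced by the local average, about 21.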
[Figure: processing step applied to a window scanning across the image]
Mathematical form for averaging
• In general, we can write
  g(X) = (1 / |N(X)|) Σ_{i=1}^{K} f(A_i)
  where K is the number of neighbors A_i and |N(X)| is the neighborhood size; A_5 refers to X, the central pixel, for a 3x3 neighborhood.
• It is obvious that all neighbors are given equal weightage during the averaging process
General form for averaging
• In case different weights are preferred for different neighbors, then we can write
  g(X) = [Σ_{i=1}^{K} w_i f(A_i)] / [Σ_{i=1}^{K} w_i]
• For simple averaging over a 3x3 neighborhood, w_i = 1/9, i = 1, 2, …, 9
• We can alter, for example, the weights for 4-neighbors and 8-neighbors. In such a case, w_i is not a constant for all values of i.
Averaging as Space Invariant Linear Filtering
• In signal processing terminology, the weighted averaging can be represented by convolution:
  g_{i,j} = Σ_{k=-w}^{w} Σ_{l=-w}^{w} h_{k,l} f_{i-k, j-l},   k, l = -w, …, 0, …, w
  with h_{k,l} = 1 / [(2w+1)(2w+1)]
  For a 3x3 window, w = 1; for a 5x5 window, w = 2, …
• 2-d discrete convolution of h with f: g = f * h
Concept of Convolution
• Convolution is a weighted summation of inputs to produce an output; weights do not change anytime during the processing of the entire data
• If the input shifts in position, the output also shifts in position; character of the processing operation will not change
• The set of weights with which the pixels in the image are multiplied is referred to as the filter
Filter Mask
• The filter can be compactly represented using the weights or multiplying coefficients:
• e.g., 3x3 averaging filter
• 0.111 0.111 0.111 1 1 1
• 0.111 0.111 0.111 or (1/9) 1 1 1
• 0.111 0.111 0.111 1 1 1
• This implies that the pixels in the image are multiplied with corresponding filter coefficients and the products are added
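This multiply-and-add step can be sketched with NumPy as below; the function name is illustrative, and for the symmetric masks used here, correlation and convolution give the same result:

```python
import numpy as np

def apply_mask(image, mask):
    """Multiply the pixels under the window by the corresponding
    filter coefficients and add the products (interior pixels only;
    borders are left unchanged as a simple convention)."""
    img = np.asarray(image, dtype=float)
    mask = np.asarray(mask, dtype=float)
    w = mask.shape[0] // 2
    out = img.copy()
    for i in range(w, img.shape[0] - w):
        for j in range(w, img.shape[1] - w):
            out[i, j] = np.sum(mask * img[i - w:i + w + 1, j - w:j + w + 1])
    return out

# the 3x3 averaging filter expressed as a mask
avg_mask = np.full((3, 3), 1.0 / 9.0)
```

Since the coefficients sum to unity, a constant image passes through unchanged.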
Reduced neighborhood influence
0.05 0.15 0.05
0.15 0.20 0.15
0.05 0.15 0.05
• Central pixel is given 20% weight, 4-neighbors 15% weight. Diagonal neighbors given 5% weight.
• Note that the weights are all positive, and sum to unity
[Figure: filter mask placed over the image]
Border Effect
• The computation of the filtering operation
is applicable at those positions of the
image where the filter completely fits
inside.
• At the boundary positions, only part of the
filter fits inside the image. At such
positions, the computation is arbitrarily
defined
[Figure: original image]
[Figure: result of 3x3 averaging]
Gaussian smoothing
• Gaussian filter: linear smoothing
• weight matrix
  w(r,c) = k e^{-(r² + c²)/(2σ²)}   for all (r,c) ∈ W, where
  k = 1 / Σ_{(r,c)∈W} e^{-(r² + c²)/(2σ²)}
• W: one or two σ from center
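A minimal sketch of building this weight matrix with NumPy (the function name is illustrative); the normalizing constant k is obtained by dividing by the sum of the unnormalized weights:

```python
import numpy as np

def gaussian_mask(w, sigma):
    """Build a (2w+1) x (2w+1) Gaussian weight matrix
    w(r,c) = k * exp(-(r^2 + c^2) / (2 sigma^2)),
    with k chosen so that the weights sum to 1."""
    r, c = np.mgrid[-w:w + 1, -w:w + 1]
    g = np.exp(-(r ** 2 + c ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()
```

The resulting mask is symmetric, peaks at the center, and its weights sum to unity, so it can be applied exactly like the averaging masks above.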
Median Filter
• Median filter is the most commonly used
non-linear filter for image smoothing
• When the image is corrupted by random
salt-and-pepper noise, median operation is
very effective in removing the noise,
without degrading the input image
• g_{i,j} = median { f_{i-k,j-l} | k, l = -w, …, 0, …, w }
Mean v/s Median filter
• Consider an example:
  15 17 16      15  17 17
  18 17 15     157  18 15
  17 14 16      17  14 16
   Case 1         Case 2
  Mean = 16      Mean = 32
  Median = 16    Median = 17
• In arithmetic averaging, noise is distributed over the neighbours
• In median filtering, the extreme values are pushed to one end of the sequence after sorting, hence ignored when filtered
Algorithm
• Consider the size of the window around the pixel
• Collect all the pixels in the window and sort them in ascending / descending order
• Select the gray level after sorting, according to the rank criterion
• It can easily be verified that median and mode filters are nonlinear, according to the definition of linearity
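The algorithm above can be sketched as follows (a rank filter that picks the middle rank; function name and border convention are illustrative):

```python
import numpy as np

def median_filter(image, size=3):
    """Collect the pixels in each window, sort them, and select the
    gray level at the middle rank. Nonlinear: in general
    median(f + g) != median(f) + median(g)."""
    img = np.asarray(image, dtype=float)
    w = size // 2
    out = img.copy()
    for i in range(w, img.shape[0] - w):
        for j in range(w, img.shape[1] - w):
            window = img[i - w:i + w + 1, j - w:j + w + 1].ravel()
            out[i, j] = np.sort(window)[window.size // 2]
    return out
```

On the Case 2 neighborhood above, the sorted sequence places 157 at the extreme end, so the filter output is 17.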
Example
[Figure: median filtering over a 7x7 neighborhood]
Some Comments
• Neighborhood operations can suppress
unwanted noise as well as minor detail in an image, also called smoothing
• Simple averaging with equal weightage to all neighbors is a well known smoothing method
• Gaussian smoothing is another popular smoothing operation
• Median filter is a popular nonlinear smoothing operation
Edge Enhancement Methods
Edge
• Edge: boundary where brightness values significantly differ among neighbors
• At an edge, the brightness value appears to abruptly jump up (or down)
What Is An Edge?
• An edge is a set of connected pixels that lie
on the boundary between two regions
• The pixels on an edge are called edge points
• Gray level / color / texture discontinuity
across an edge causes edge perception
• Position & orientation of edge are key
properties
Different Edges
[Figure: edges arising from different colors, different brightness, and different intensities]
Locating an Edge
• Locating an edge is important, since the
shape of an object, its area, perimeter and
other such measurements are possible
only when the boundary is accurately
determined
• Edge is a local feature, marked by sharp
discontinuity in the image property on
either side of it
Principle of Gradient Operator
The interpretation of this operator is that the
intensity gradient is computed in two
perpendicular directions, followed by the
resultant whose magnitude and orientation
are computed by treating the values from
the two masks as two projections of the
edge vector
Gradient Edge Detection
• Given an image f(x,y), compute
  ∇f = (∂f/∂x, ∂f/∂y)
• Squared gradient magnitude
  |∇f|² = (∂f/∂x)² + (∂f/∂y)²
• Gradient direction = arctan[(∂f/∂y) / (∂f/∂x)]
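A minimal sketch of these formulas using central differences as the two perpendicular derivative estimates (function name and border handling are illustrative, not the lecture's code):

```python
import numpy as np

def gradient_edges(image):
    """Estimate df/dx and df/dy by central differences, then combine
    them as two projections of the edge vector: squared gradient
    magnitude and gradient direction (in radians)."""
    f = np.asarray(image, dtype=float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, 1:-1] = f[:, 2:] - f[:, :-2]   # horizontal difference
    gy[1:-1, :] = f[2:, :] - f[:-2, :]   # vertical difference
    mag2 = gx ** 2 + gy ** 2
    direction = np.arctan2(gy, gx)
    return mag2, direction
```

For a vertical step edge the vertical difference vanishes and the direction comes out along the x-axis.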
Gradient Directions
[Figure: vertical, horizontal, and diagonal gradient examples]
Gradient Edge Detectors
• As seen, two mutually perpendicular gradient detectors are required to detect edges in an image, since edges may occur in any orientation.
• Using two mutually perpendicular orientations, an edge in any direction can be resolved in terms of these two orthogonal components
Roberts Operator
• Roberts operator: two 2x2 masks to calculate the gradient; operates on a 2x2 neighborhood

   1  0       0  1        A B
   0 -1      -1  0        C D

  r1 = f(A) – f(D); r2 = f(B) – f(C)
  r1, r2 are the gradient outputs from the masks
  gradient magnitude: sqrt(r1² + r2²)
  direction = arctan(r2/r1)
Gradient Edge Detectors
• Prewitt Operator
• gradient magnitude: g = sqrt(p1² + p2²)
• gradient direction: θ = arctan(p1/p2), clockwise w.r.t. the column axis
• p1, p2 are the gradient outputs from the masks

   1  1  1      -1  0  1
   0  0  0      -1  0  1
  -1 -1 -1      -1  0  1
  Prewitt 1     Prewitt 2
Gradient Edge Detectors
• Prewitt Edge Detector (one part of it)
  f′(x) = f(x+1) − f(x)
  f′(x−1) = f(x) − f(x−1)
  Adding the two gives f(x+1) − f(x−1), the central difference over x−1, x, x+1:

  -1 0 1
  -1 0 1
  -1 0 1    each row computes f(x+1) − f(x−1)

More stable than Roberts, robust to noise in the image, and produces better edges. More time consuming.
[Figure: input image]
[Figure: Prewitt operator output]
Gradient Edge Detectors
• Sobel edge detector
  gradient magnitude: g = sqrt(s1² + s2²)
  gradient direction: θ = arctan(s1/s2)

   1  2  1      -1  0  1
   0  0  0      -2  0  2
  -1 -2 -1      -1  0  1
  Sobel 1       Sobel 2

Compare with Prewitt!
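A minimal sketch of applying the two Sobel masks and combining their outputs into the gradient magnitude (function name and interior-only border convention are illustrative):

```python
import numpy as np

SOBEL_1 = np.array([[ 1,  2,  1],
                    [ 0,  0,  0],
                    [-1, -2, -1]], dtype=float)   # responds to horizontal edges
SOBEL_2 = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)     # responds to vertical edges

def sobel_magnitude(image):
    """g = sqrt(s1^2 + s2^2) from the two Sobel masks; interior only."""
    f = np.asarray(image, dtype=float)
    out = np.zeros_like(f)
    for i in range(1, f.shape[0] - 1):
        for j in range(1, f.shape[1] - 1):
            win = f[i - 1:i + 2, j - 1:j + 2]
            s1 = np.sum(SOBEL_1 * win)
            s2 = np.sum(SOBEL_2 * win)
            out[i, j] = np.hypot(s1, s2)
    return out
```

Swapping in the Prewitt masks only changes the center-row/column weights from 2 to 1.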
Laplacian Operator
• The Laplacian operator is based on the Laplace equation:
  ∂²f/∂x² + ∂²f/∂y² = 0
• The Laplacian operator is a discretized version of the above and is based on second derivatives along the x and y directions
Laplacian Operator
• Filter coefficients
• The discrete version of the second derivative operator: [1 -2 1] and [1 -2 1]ᵀ in the horizontal and vertical directions
• Superimposing the two (with the sign reversed), we get the discrete Laplace operator:
   0 -1  0
  -1  4 -1
   0 -1  0
Properties of Laplace Operator
• Isotropic operator – cannot give orientation information
• Any noise in image gets amplified
• Faster since only one filter mask involved
• Smoothing the image first prior to Laplace operator is often needed for reliable edges
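A minimal sketch of applying the single Laplacian mask (interior pixels only, as in the earlier sketches); in practice one would smooth first, as noted above:

```python
import numpy as np

LAPLACE = np.array([[ 0, -1,  0],
                    [-1,  4, -1],
                    [ 0, -1,  0]], dtype=float)

def laplacian(image):
    """Single-pass second-derivative operator. Isotropic, so it gives
    no orientation information; responds to isolated points and edges,
    and is zero on constant or linearly varying (ramp) regions."""
    f = np.asarray(image, dtype=float)
    out = np.zeros_like(f)
    for i in range(1, f.shape[0] - 1):
        for j in range(1, f.shape[1] - 1):
            out[i, j] = np.sum(LAPLACE * f[i - 1:i + 2, j - 1:j + 2])
    return out
```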
Image Sharpening
• Achieved in one of two ways:
• Sharp(x,y) = Image(x,y) + |Image(x,y) – Smooth(x,y)|
• Alternately,
• Sharp(x,y) = Image(x,y) + GradMag(x,y)
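A one-line sketch of the first variant, assuming a smoothed version of the image is already available (e.g., from the averaging or Gaussian filters earlier); the function name is illustrative:

```python
import numpy as np

def sharpen(image, smooth):
    """Sharp = Image + |Image - Smooth|: the detail removed by
    smoothing is added back to the image. The second variant in the
    slide would add a gradient magnitude image instead."""
    img = np.asarray(image, dtype=float)
    return img + np.abs(img - np.asarray(smooth, dtype=float))
```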
COLOR TRANSFORMS
Motivation for Color Transforms
• In the multiband (>3) datasets that are provided by remote sensors, we can choose any three bands to generate color composites
• By applying suitable transformations, we can enhance these images based on principles of color perception
Visible Range of the EM Spectrum
Color
• Color is determined by the wavelength bands of the electromagnetic spectrum
• Color is described (perceived) in terms of
  – HUE: dominant wavelength in the color
  – SATURATION: purity of the color (depends on the amount of white light mixed with the color)
  – INTENSITY: actual amount or strength of light
• All of them contribute to our perception of color
Our Visual System
• Our eyes have 2 types of sensors:
  – CONES
    • Sensitive to colored light, but not very effective in perceiving color in dim light conditions
  – RODS
    • Strongly sensitive to white (panchromatic) light. Can sense differences in light even in dim conditions. (Our eyes can adjust to lighting conditions and see outlines of objects even in darkness)
Cones
• The cones in our eyes consist of three types of elements sensitive to:
• 440 nm (BLUE)
• 545 nm (GREEN)
• 580 nm (RED)
Color Models
They provide a standard way of specifying a particular color using a 3D coordinate system.
• Hardware oriented:
  – RGB (display monitors)
  – CMYK (printers)
• Image processing oriented:
  – HSI
R-G-B Model
• It is an additive color model.
• An image consists of 3 components, one for each primary color: Red, Green and Blue.
• Appropriate for image displays.
Relation between RGB-HSI Models
HSI Model
Original artwork from the book Digital Image Processing by R.C. Gonzalez and R.E. Woods, © R.C. Gonzalez and R.E. Woods; reproduced with permission granted to instructors by the authors on the website www.imageprocessingplace.com
RGB-HSI Conversion (see Gonzalez and Woods)

  I = (R + G + B) / 3

  S = 1 − min(R, G, B) / I

  H = cos⁻¹{ ½[(R − G) + (R − B)] / [(R − G)² + (R − B)(G − B)]^½ }          if B ≤ G
  H = 360° − cos⁻¹{ ½[(R − G) + (R − B)] / [(R − G)² + (R − B)(G − B)]^½ }   if B > G

Look for the full discussion in the downloadable article on the website www.imageprocessingplace.com
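The conversion above can be sketched for a single pixel as follows, assuming R, G, B are scalars in [0, 1]; the function name and the crude handling of grey/black pixels are illustrative:

```python
import numpy as np

def rgb_to_hsi(r, g, b):
    """RGB -> HSI following the formulas above.
    Returns H in degrees, S and I in [0, 1]."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b))
    theta = np.degrees(np.arccos(num / den)) if den > 0 else 0.0
    h = theta if b <= g else 360.0 - theta
    return h, s, i
```

The pure primaries land where one expects on the hue circle: red at 0°, green at 120°, blue at 240°.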
HSI to RGB Conversion
• This conversion depends on whether the color is in the Red-Green zone or Green-Blue zone or Blue-Red zone.
• In each case the hue varies from 0 to 120, 121 to 240, 241 to 360 degrees respectively.
RG Sector
Hue in the range 0° – 120°
B = I(1 − S)
R = I[1 + S·cosH / cos(60° − H)]
G = 1 − (R + B)
GB Sector
Hue in the range 120° – 240°
H = H − 120°
R = I(1 − S)
G = I[1 + S·cosH / cos(60° − H)]
B = 1 − (R + G)
CMY Model
• Cyan-Magenta-Yellow is a subtractive model which is good for modeling absorption of colors.
• Appropriate for paper printing.
[C, M, Y]ᵀ = [1, 1, 1]ᵀ − [R, G, B]ᵀ   (i.e., C = 1 − R, M = 1 − G, Y = 1 − B)
[Figure: RGB additive vs. CMY subtractive color mixing]
BR Sector
Hue in the range 240° – 360°
H = H − 240°
G = I(1 − S)
B = I[1 + S·cosH / cos(60° − H)]
R = 1 − (G + B)
Application of HSI System
• Direct access to the color of an object
• Manipulation of color is easier
• While processing documents, color based separation into different files can simplify processing / recognition
• Data of different sensors can be fused using the RGB-to-HSI and HSI-to-RGB transformations
Example
[Figure: original image and the same image after increased saturation]
Image Arithmetic

Multiband Arithmetic
• Operations performed on combinations of multispectral bands
• Ratio, difference, combinations of ratio and difference etc. are widely employed to emphasize objects with sharply different response in a pair of bands
Motivation
[Figure]
Multiband Arithmetic
• In a given pair of bands the response of two objects is generally different.
• Pixel by pixel comparison between images can highlight pixels that have very high difference in reflectance in those bands
• Operations like band difference and band ratio or combinations of them are popularly used for this purpose
Band Ratio
• Very common operation
  Ratio_{i,j}(m,n) = Band_i(m,n) / Band_j(m,n)
• If Band_j(m,n) = 0, a suitable adjustment has to be made (e.g., add +1 to the denominator)
• For 8-bit data, the minimum ratio will be 0 and the maximum ratio will be 255
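A minimal sketch of the ratio with the zero-denominator adjustment suggested above (function name is illustrative):

```python
import numpy as np

def band_ratio(band_i, band_j):
    """Pixel-by-pixel ratio Band_i / Band_j. Where the denominator is
    zero, 1 is added to it, as one common adjustment."""
    num = np.asarray(band_i, dtype=float)
    den = np.asarray(band_j, dtype=float)
    den = np.where(den == 0, den + 1.0, den)
    return num / den
```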
[Figure: input image]
[Figure: input image FCC]
[Figure: IR/R ratio image]
Band Ratio
• For fast computing, approximations can be made such as:
  0 ≤ Ratio_{i,j}(m,n) ≤ 1:    Ratio_{i,j}(m,n)_scaled = Round[Ratio_{i,j}(m,n) × 127]
  1 < Ratio_{i,j}(m,n) ≤ 255:  Ratio_{i,j}(m,n)_scaled = Round[127 + Ratio_{i,j}(m,n)/2]
• Advantage – in one pass the image is generated in the range 0-255
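The one-pass scaling above can be sketched as (function name is illustrative; note that np.round rounds halves to even):

```python
import numpy as np

def scale_ratio(ratio):
    """Scale a ratio image to 0-255 in one pass: values in [0, 1]
    map to Round[ratio * 127]; values above 1 map to
    Round[127 + ratio / 2]."""
    r = np.asarray(ratio, dtype=float)
    scaled = np.where(r <= 1.0,
                      np.round(r * 127.0),
                      np.round(127.0 + r / 2.0))
    return scaled.astype(int)
```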
Band Difference
• Similar to band ratio, band difference can also be used to account for differences in reflectance by objects in two wavelengths
• Band ratio is more popular in practical applications such as geological mapping
• Topographic effects on the images are reduced by ratioing.
Band Multiplication
• Pixel by pixel multiplication of two images
• Not used to multiply gray levels in one band with corresponding gray levels in another band
• Used in practice to mask some part of the image and retain the rest of it, by preparing a mask image and performing image-to-image multiplication of pixels
Band Addition
• Similar to band multiplication, band addition has no direct practical application in adding gray levels of two bands of an image
• This method too can be used to mask a portion of the image and retain the remaining part.
Specialized Indices
• Combinations of band differences, ratios and additions can result in useful outputs that can highlight features like green vegetation
• One such feature is the Normalized Difference Vegetation Index (NDVI)
• NDVI(m,n) = [Band_IR(m,n) − Band_R(m,n)] / [Band_IR(m,n) + Band_R(m,n)]
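A minimal sketch of computing NDVI, together with the 127(1 + NDVI) scaling discussed later in the lecture (function name and the guard against a zero sum are illustrative):

```python
import numpy as np

def ndvi(band_ir, band_r):
    """NDVI = (IR - R) / (IR + R), sum guarded against zero.
    Also returns the scaled version Round[127 * (1 + NDVI)],
    which lies in [0, 254]."""
    ir = np.asarray(band_ir, dtype=float)
    r = np.asarray(band_r, dtype=float)
    denom = ir + r
    denom = np.where(denom == 0, 1.0, denom)
    nd = (ir - r) / denom
    return nd, np.round(127.0 * (1.0 + nd)).astype(int)
```

Where IR strongly exceeds red (vegetation), NDVI approaches +1 and the scaled value approaches 254.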
NDVI
• NDVI results in high values where IR dominates the red wavelength. This happens where vegetation is present
• Range of NDVI is [−1, +1]
• NDVI has been widely used in a wide range of agricultural, forestry and biomass estimation applications
• It is also used to measure the length of crop growth and dry-down periods by comparing NDVI computed from multidate images
[Figure: input image]
[Figure: NIR band]
[Figure: red band]
[Figure: NDVI image]
Other Vegetation Indices
• Simple Ratio = NIR/Red
• NDVI6 = (Band 6 − Band 5)/(Band 6 + Band 5)
• NDVI7 = (Band 7 − Band 5)/(Band 7 + Band 5)
• Standard NDVI_TM = (TM4 − TM3)/(TM4 + TM3)
These are applicable when seven-band data like Landsat Thematic Mapper data are available.
For IRS LISS3 imagery,
NDVI_IRS = [Band4(m,n) − Band3(m,n)] / [Band4(m,n) + Band3(m,n)]
[Figure: IRS L4-NDVI]
Fast Computation of NDVI
• Range of NDVI: [−1, +1]
• Scale suitably to generate an NDVI image
• For example, NDVI_scaled = 127(1 + NDVI)
• This ensures that the resultant NDVI has a range of [0, 254]
Selected Reflectance Curves
[Figure: from J.R. Jensen's lecture notes at Univ. South Carolina; used with permission]
Time Series of 1984 and 1988 NDVI Measurements Derived from AVHRR Global Area Coverage (GAC) Data, Region around El Obeid, Sudan, in Sub-Saharan Africa
[Figure: from J.R. Jensen's lecture notes at Univ. South Carolina; used with permission]
Simple Ratio v/s NDVI
[Figure: from J.R. Jensen's lecture notes at Univ. South Carolina; used with permission]
Data Fusion
• Combine datasets to prepare a superior dataset
• Stack up all the datasets to create a large higher-dimensional dataset – e.g., multitemporal data from the same sensor
• Fuse the datasets to create a higher resolution dataset
• Fuse the datasets to create a new dataset that has attributes of the individual ones
Data Fusion
• Most commonly employed by end users of remotely sensed data
• Supported by most software packages
Introduction
• Merging multi-sensor data can help exploit strengths of various data sets
  – Radiometric resolution advantage
  – Spatial resolution advantage
  – Spectral resolution advantage
  – Temporal resolution advantage
Spatial Resolution Enhancement
• This is the most common application of
data fusion
– Low resolution images have fewer pixels per unit area due to larger pixel size
– Improve spatial resolution
– High resolution images provide more pixels per unit area by smaller sampling interval (pixel size)
Zooming is NOT resolution enhancement
• How is spatial resolution enhanced?
• Low resolution → absence of high spatial frequency content
• High frequency information is to be transferred from another data source (of higher resolution)
Resolution Sharpening
• Most often, data from the lower spatial
resolution multispectral sensors and the
higher spatial resolution panchromatic
sensors are merged
• Results in multispectral data at higher
spatial resolution
Multi-sensor Data Merging
Most common operation
• PAN images to sharpen multispectral data
e.g., IRS pan + IRS ms
• Sharpening low resolution multispectral
images with high resolution multispectral
images
For instance, SPOT ms + TM ms
(20 metres) (30 metres)
Input Image Preparation
• Contrast Adjustment
  – Zoom the low resolution image to the same physical size as the high resolution image
  – Match the histogram of the MS image with that of the PAN image using histogram based techniques
• Image Registration
  – Register the zoomed low resolution image to the high resolution image. This should be accurate to a fraction of a pixel
Image Sharpening
• MShr = f(MSlr, PANhr) , where
• MS = multispectral Image
• PAN = Panchromatic Image
• lr = low resolution
• hr = high resolution
Sharpening Techniques
• Principal Component Analysis method
• Intensity-Hue-Saturation method
• Ratio-based (Brovey Transform)
• Arithmetic algorithm
• Multiplicative
• Wavelet Transform method
RGB-HSI Transform Method
• In color images, the spectral information is contained in the hue and the saturation.
• Hue denotes the basic dominant wavelength of the radiation
• Saturation denotes the purity of the color, or is a function of the amount of dilution of the color with white light
• Intensity is an indicator of the strength of the color, or the magnitude of the energy that reaches our eye
RGB-HSI Transform Method
• The philosophy in HSI based fusion is to replace the intensity with the new data set first, and then compute the inverse transform of the HSI data set to the RGB coordinate system
• The spatial resolution of the added component and the spectral information in the hue and saturation together provide an enhanced data set compared to the original low resolution multispectral and high resolution panchromatic data sets.
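A compact sketch of this intensity substitution, assuming the MS image is already upsampled and registered to the PAN band. Scaling all three bands by pan/I keeps the band ratios (and hence hue and saturation, which depend only on ratios) while setting the new intensity to the PAN value, so it stands in for the full RGB→HSI→replace-I→RGB round trip; the function name is illustrative:

```python
import numpy as np

def intensity_substitution(ms_rgb, pan):
    """HSI-style fusion sketch: replace the intensity I = (R+G+B)/3
    of the multispectral image with the panchromatic band by scaling
    each band with the ratio pan / I (zero intensities guarded)."""
    ms = np.asarray(ms_rgb, dtype=float)      # shape (rows, cols, 3)
    pan = np.asarray(pan, dtype=float)        # shape (rows, cols)
    intensity = ms.mean(axis=2)               # I = (R + G + B) / 3
    ratio = pan / np.where(intensity == 0, 1.0, intensity)
    return ms * ratio[:, :, None]
```

After fusion, the mean of the three bands at each pixel equals the PAN value, while the band proportions of the original MS image are preserved.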
[Figure: input high resolution image]
[Figure: input multispectral image]
[Figure: IHS resolution merge]
[Figure: IHS resolution merge – FCC]
TO BE CONTINUED!