Digital image processing
Introduction to Remote Sensing
Instructor: Dr. Cheng-Chien Liu
Department of Earth Sciences
National Cheng Kung University
Last updated: 18 November 2004
Chapter 7
Outline
Introduction
Image enhancement
Image rectification and restoration
Image classification
Data merging
Hyperspectral image analysis
Introduction
Definition of digital image: f(x, y)
• DN: digital number
• Processing
Origin of DIP
Applications of DIP
• Classified by EM spectrum: X-ray, UV, VNIR, IR, microwave, radio wave; our focus: VNIR
• Sound: ultrasound, …
Components of DIP
Contents of DIP
Enhancement
Histogram
Thresholding: Fig 7.11
Level slicing: Fig 7.12
Contrast stretching: Fig 7.13
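The contrast stretch of Fig 7.13 can be sketched as a simple linear percentile stretch; the function name and percentile defaults here are illustrative assumptions, not from the text:

```python
import numpy as np

def linear_stretch(img, low_pct=2, high_pct=98):
    """Linearly map the chosen DN percentiles onto the full 0-255 display
    range, clipping values that fall outside the stretch limits."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (img.astype(float) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# min-max stretch of a low-contrast 60-90 DN range onto 0-255
dn = np.array([[60, 70], [80, 90]], dtype=np.uint8)
stretched = linear_stretch(dn, 0, 100)
```

Percentile limits (rather than the raw min/max) keep a few outlier DNs from compressing the useful part of the histogram.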
Image rectification and restorationImage rectification and restoration
Rectification Rectification 糾正 糾正 distortion distortion 畸變畸變 Restoration Restoration 復原復原 degradation degradation SourceSource
• Digital image acquisition type
• Platform
• TFOV
Image rectification and restoration (cont.)
Two-step procedure of geometric correction
• Systematic (predictable)
e.g. eastward rotation of the earth → skew distortion → deskewing: offset each successive scan line slightly to the west → parallelogram image
• Random (unpredictable)
e.g. random distortions and residual unknown systematic distortions → ground control points (GCPs)
Highway intersections, distinct shoreline features, …
Two coordinate transformation equations
Distorted-image coordinates ↔ geometrically correct coordinates
Image rectification and restoration (cont.)
Affine coordinate transform
• Six factors
• Transformation equations
x = a0 + a1X + a2Y
y = b0 + b1X + b2Y
(x, y): image coordinates; (X, Y): ground coordinates
• Six parameters → six conditions → 3 GCPs
• If GCPs > 3 → redundancy → least-squares (LS) solution
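The six-parameter fit above, with a least-squares solution when more than three GCPs are available, might look like this minimal sketch (`numpy.linalg.lstsq` handles the redundant case):

```python
import numpy as np

def fit_affine(ground_xy, image_xy):
    """Fit x = a0 + a1*X + a2*Y and y = b0 + b1*X + b2*Y from GCP pairs.
    ground_xy, image_xy: (N, 2) arrays with N >= 3; N > 3 is solved in
    the least-squares sense."""
    A = np.column_stack([np.ones(len(ground_xy)), ground_xy])  # [1, X, Y]
    coeffs, *_ = np.linalg.lstsq(A, image_xy, rcond=None)
    return coeffs  # column 0 -> (a0, a1, a2), column 1 -> (b0, b1, b2)

# four GCPs generated from a known transform, so the fit recovers it exactly
ground = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
image = np.array([[1.0, -3.0], [3.0, -3.0], [1.5, -2.0], [3.5, -2.0]], float)
coeffs = fit_affine(ground, image)
```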
Image rectification and restoration (cont.)
Resampling
• Fig 7.1: resampling process
Transform coordinates → adjust DN values → perform after classification
• Methods: nearest neighbor, bilinear interpolation, bicubic convolution
Image rectification and restoration (cont.)
Nearest neighbor
• Fig 7.1: a → a΄ (shaded pixel)
• Fig C.1: implemented by rounding the computed coordinates to the nearest whole row and column number
• Advantage: computational simplicity
• Disadvantage: disjointed appearance; features may be offset spatially by up to ½ pixel (Fig 7.2b)
Image rectification and restoration (cont.)
Bilinear interpolation
• Fig 7.1: a, b, b, b → a΄ (shaded pixel)
Takes a distance-weighted average of the DNs of the four nearest pixels
• Fig C.2a: implemented by Eq. C.2 and Eq. C.3
• Advantage: smoother appearance (Fig 7.2c)
• Disadvantage: alters DN values; should be performed after image classification procedures
Image rectification and restoration (cont.)
Bicubic (cubic) interpolation
• Fig 7.1: a, b, b, b, c, … → a΄ (shaded pixel)
Takes a distance-weighted average of the DNs of the sixteen nearest pixels
• Fig C.2b: implemented by Eq. C.5, Eq. C.6 and Eq. C.7
• Advantages (Fig 7.2d): smoother appearance; provides a slightly sharper image than bilinear interpolation
• Disadvantage: alters DN values; should be performed after image classification procedures
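The nearest-neighbor and bilinear rules above can be written out directly; a sketch assuming (x, y) are the back-projected fractional column/row of output pixel a΄ (bicubic, which weights the sixteen nearest pixels, is omitted for brevity):

```python
import numpy as np

def nearest_sample(img, x, y):
    """Nearest neighbor: round the computed coordinates to the nearest
    whole row and column number."""
    return img[int(round(y)), int(round(x))]

def bilinear_sample(img, x, y):
    """Bilinear: distance-weighted average of the DNs of the four pixels
    surrounding the computed (x, y) location."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0  # fractional offsets drive the weights
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x0 + 1] +
            (1 - dx) * dy * img[y0 + 1, x0] +
            dx * dy * img[y0 + 1, x0 + 1])

img = np.array([[0.0, 10.0], [20.0, 30.0]])
```

Note how the bilinear result is a new DN not present in the input, which is why classification should precede this resampling method.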
Image rectification and restoration Image rectification and restoration (cont.)(cont.)
Radiometric correction Radiometric correction 輻射校正輻射校正• Varies with sensors• Mosaics of images taken at different times
require radiometric correction
Influence factorsInfluence factors• Scene illumination• Atmospheric correction• Viewing geometry• Instrument response characterstics
Image rectification and restoration (cont.)
Sun elevation correction
• Fig 7.3: seasonal variation
• Normalize by calculating pixel brightness values as if the sun were at the zenith on each date of sensing
• Multiply by cos θ0 (θ0: solar zenith angle)
Earth–Sun distance correction
• Irradiance decreases as the square of the Earth–Sun distance
• Divide by d²
Combined influence on solar irradiance:
E = E0 cos θ0 / d²
where E0 is the solar irradiance at mean Earth–Sun distance.
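Inverting the combined influence E = E0·cosθ0/d² gives a normalization of pixel brightness to zenith sun at mean Earth–Sun distance; a sketch, where the function name and the zenith-angle/AU inputs are assumptions:

```python
import math

def normalize_brightness(dn, solar_zenith_deg, d_au):
    """Normalize a pixel brightness value to zenith sun and mean Earth-Sun
    distance: divide out cos(theta0) and multiply back d**2, inverting
    E = E0 * cos(theta0) / d**2."""
    return dn * d_au ** 2 / math.cos(math.radians(solar_zenith_deg))
```

For example, a pixel sensed with the sun 60° from zenith receives half the zenith-sun irradiance, so its brightness is doubled by the correction.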
Image rectification and restoration (cont.)
Atmospheric correction
• Atmospheric effects
Attenuate (reduce) the illuminating energy
Scatter and add path radiance Lp
• Combination:
Ltot = ρET/π + Lp
Ltot: total spectral radiance measured by the sensor; ρ: target reflectance; E: irradiance on the target; T: atmospheric transmittance; Lp: path radiance
• Haze compensation → minimize Lp
Use a band of zero Lp, e.g. NIR for clear water
• Path length compensation: off-nadir pixel values are normalized to their nadir equivalents
Image rectification and restoration (cont.)
Conversion of DNs to radiance values
• Measurements over time using different sensors
• Different ranges of reflectance, e.g. land vs. water
• Fig 7.4: radiometric response function
Linear; wavelength-dependent; characteristics are monitored using an onboard calibration lamp
• DN = GL + B
G: channel gain (slope); B: channel offset (intercept)
• Fig 7.5: inverse of the radiometric response function
L = ((LMAX − LMIN) / 255) DN + LMIN
LMAX: saturation radiance; LMAX − LMIN: dynamic range of the channel
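The inverse response function of Fig 7.5 for an 8-bit channel can be sketched as follows (LMIN/LMAX values vary per sensor and band; those in the example are arbitrary):

```python
def dn_to_radiance(dn, lmin, lmax):
    """Invert the linear response DN = G*L + B for an 8-bit channel:
    L = (LMAX - LMIN) / 255 * DN + LMIN, where LMAX - LMIN is the
    channel's dynamic range."""
    return (lmax - lmin) / 255.0 * dn + lmin
```

DN = 0 maps back to LMIN and DN = 255 to the saturation radiance LMAX.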
Image rectification and restoration (cont.)
Noise
• Definition
• Sources: periodic drift, malfunction of a detector, electronic interference, intermittent hiccups in the data transmission and recording sequence
• Influence: degrades or masks the information content
Image rectification and restoration (cont.)
Systematic noise
• Striping or banding
e.g. drift of the six Landsat MSS detectors
Destriping (Fig 7.6): compile a set of histograms → compare their mean and median values → identify the problematic detectors → apply gray-scale adjustment factors
• Line drop
Line drop correction (Fig 7.7): replace with values averaged from the lines above and below
Image rectification and restoration (cont.)
Random noise
• Bit errors → spikey "salt and pepper" or snowy appearance
• Moving windows
Fig 7.8: moving window
Fig 7.9: an example of a noise suppression algorithm
Fig 7.10: application to a real image
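A moving-window noise suppressor in the spirit of Fig 7.9 might use a 3 × 3 median; the exact algorithm in the figure may differ, so this is only a common variant:

```python
import numpy as np

def median_filter3(img):
    """Replace each interior pixel with the median of its 3 x 3 window,
    suppressing isolated 'salt and pepper' bit-error spikes; border
    pixels are left unchanged for simplicity."""
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = np.median(img[r - 1:r + 2, c - 1:c + 2])
    return out

noisy = np.full((3, 3), 10, dtype=np.uint8)
noisy[1, 1] = 255  # an isolated bit-error spike
clean = median_filter3(noisy)
```

Unlike a mean filter, the median discards the spike entirely instead of smearing it into the neighboring DNs.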
Image classification
Overall objective of classification
• Automatically categorize all pixels in an image into land cover classes or themes
Three types of pattern recognition
• Spectral pattern recognition (emphasized in this chapter)
• Spatial pattern recognition
• Temporal pattern recognition
Selection of a classification approach
• No single "right" approach
• Depends on:
The nature of the data being analyzed
The computational resources available
The intended application of the classified data
Supervised classification
Fig 7.37
• A hypothetical example
Five bands: B, G, R, NIR, TIR
Six land cover types: water, sand, forest, urban, corn, hay
Three basic steps (Fig 7.38)
• Training stage
• Classification stage
• Output stage
Supervised classification (cont.)
Classification stage
• Fig 7.39
Pixel observations from selected training sites plotted on a scatter diagram
Two bands are used for demonstration; the approach applies to any number of bands
Clouds of points → multidimensional descriptions of the spectral response patterns of each category of cover type to be interpreted
• Minimum-distance-to-means classifier
Fig 7.40: mean vector for each category → Pt 1 corn, Pt 2 sand?!
Advantage: mathematically simple and computationally efficient
Disadvantage: insensitive to different degrees of variance in the spectral response data
Not widely used if the spectral classes are close to one another in the measurement space and have high variance
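The minimum-distance-to-means rule can be sketched as below; the two-band mean vectors are hypothetical, not taken from Fig 7.40:

```python
import numpy as np

def min_distance_classify(pixel, class_means):
    """Assign the pixel to the category whose mean vector is closest in
    spectral measurement space (Euclidean distance). Note the rule ignores
    each category's variance, the weakness named in the text."""
    names = list(class_means)
    dists = [np.linalg.norm(np.asarray(pixel, float) -
                            np.asarray(class_means[n], float))
             for n in names]
    return names[int(np.argmin(dists))]

# hypothetical two-band category mean vectors
means = {"water": [20, 10], "corn": [60, 80], "sand": [90, 70]}
```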
Supervised classification (cont.)
Classification stage (cont.)
• Parallelepiped classifier
Fig 7.41: range for each category → Pt 1 hay?! Pt 2 urban
Advantage: mathematically simple and computationally efficient
Disadvantage: confusion when correlation or high covariance is poorly described by the rectangular decision regions
Positive covariance: corn, hay, forest; negative covariance: water
Alleviated by use of stepped decision region boundaries (Fig 7.42)
Supervised classification (cont.)
Classification stage (cont.)
• Gaussian maximum likelihood classifier
Assumption: the cloud of points for each category follows a Gaussian distribution
Probability density functions ← mean vector and covariance matrix (Fig 7.43)
Fig 7.44: ellipsoidal equiprobability contours
Bayesian classifier
A priori probabilities (anticipated likelihood of occurrence) → two weighting factors
If suitable data exist for these factors, the Bayesian implementation of the classifier is preferable
Disadvantage: computational inefficiency
Look-up table approach
Reduce the dimensionality (principal or canonical components transform)
Simplify classification computation by separating certain classes a priori
e.g. water is easier to separate by use of a NIR/red ratio
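A Gaussian maximum likelihood decision rule built from each category's mean vector and covariance matrix might look like this (equal priors assumed; the training statistics below are hypothetical):

```python
import numpy as np

def gaussian_ml_classify(pixel, stats):
    """Pick the category maximizing the Gaussian log-likelihood derived
    from its mean vector and covariance matrix; adding log a-priori
    probabilities here would give the Bayesian variant."""
    x = np.asarray(pixel, float)
    best, best_ll = None, -np.inf
    for name, (mean, cov) in stats.items():
        diff = x - np.asarray(mean, float)
        cov = np.asarray(cov, float)
        # log-likelihood up to a constant: -(log|C| + Mahalanobis^2)/2
        ll = -0.5 * (np.log(np.linalg.det(cov)) +
                     diff @ np.linalg.inv(cov) @ diff)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

stats = {  # hypothetical (mean vector, covariance matrix) per category
    "water": ([20, 10], [[4, 0], [0, 4]]),
    "corn":  ([60, 80], [[25, 10], [10, 25]]),
}
```

The covariance term is what makes this classifier sensitive to category variance, unlike the minimum-distance rule.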
Supervised classification (cont.)Supervised classification (cont.)
Training stageTraining stage• Classification automatic work• Assembling the training data manual work
Both an art and a scienceSubstantial reference dataThorough knowledge of the geographic areaYou are what you eat!
Results of classification are what you train!
• Training dataBoth representative and complete
All spectral classes constituting each information class must be adequately represented in the training set statistics used to classify an image
e.g. water (turbid or clear) e.g. crop (date, type, soil moisture, …)
It is common to acquire data from 100+ training areas to represent the spectral variability
Supervised classification (cont.)
Training stage (cont.)
• Training areas
Delineate boundaries (Fig 7.45): carefully located boundaries → no edge pixels
Seed pixel: choose a seed pixel → statistically based criteria → contiguous pixels → cluster
• Training pixels
Number: at least n + 1 pixels for n spectral bands; in practice, 10n to 100n pixels are used
Dispersion → representative
• Training set refinement
Make sure the sample size is sufficient
Assess the overall quality: check whether all data sets are normally distributed and spectrally pure
Avoid redundancy: delete or merge
Supervised classification (cont.)Supervised classification (cont.)
Training stage (cont.)Training stage (cont.)• Training set refinement process
Graphical representation of the spectral response patterns Fig 7.46: Histograms for data points included in the training areas of “hay”
Visual check on the normality of the spectral response distribution Two subclasses: normal and bimodal
Fig 7.47: Coincident spectral plot Corn/hay overlap for all bands Band 3 and 5 for hay/corn separation (use scatter plot)
Fig 7.48: SPOT HRV multi-spectral images Fig 7.49 scatter plot of band 1 versus band 2 Fig 7.50 scatter plot of band 2 versus band 3 less correlated adequate
Quantitative expressions of category separation Transform divergence: a covariance-weighted distance between category means
Table 7.1: Portion of a divergence matrix (<1500 spectrally similar classes)
Supervised classification (cont.)
Training stage (cont.)
• Training set refinement process (cont.)
Self-classification of training set data
Error matrix for the training areas, not for the test area or the overall scene
Tells us how well the classifier can classify the training areas and nothing more
Overall accuracy is assessed after the classification and output stages
Interactive preliminary classification
Plate 29: sample interactive preliminary classification procedure
Representative subscene classification
Complete the classification for the test area → verify and improve
Summary
Revise with mergers, deletions and additions to form the final set of statistics used in classification
Accept misclassification of a class that occurs rarely in the scene to preserve the accuracy over extensive areas
Alternative methods for separating two spectrally similar classes: GIS data, visual interpretation, field checks, multi-temporal or spatial pattern recognition procedures, …
Unsupervised classification
Unsupervised vs. supervised
• Supervised: define useful information categories → examine their spectral separability
• Unsupervised: determine spectral classes → define their informational utility
Illustration: Fig 7.51
• Advantage: the spectral classes are found automatically (e.g. a stressed class)
Unsupervised classification (cont.)
Clustering algorithms
• K-means
Locate centers of seed clusters → assign all pixels to the cluster with the closest mean vector → revise the mean vector of each cluster → reclassify the image → iterate until there is no significant change
• Iterative self-organizing data analysis (ISODATA)
Permits the number of clusters to change from one iteration to the next by:
Merging: distance between cluster centers < some predefined minimum distance
Splitting: standard deviation > some predefined maximum value
Deleting: number of pixels in a cluster < some specified minimum number
• Texture/roughness
Texture: the multidimensional variance observed in a moving window passed through the image
Moving window → variance → threshold → smooth/rough
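The K-means loop described above, in a minimal sketch (random seeding and a fixed iteration cap are implementation choices, not part of the algorithm's definition):

```python
import numpy as np

def kmeans(pixels, k, iters=20, seed=0):
    """Seed k cluster centers, assign each pixel to the cluster with the
    closest mean vector, revise the means, and iterate until the
    assignments stop changing."""
    rng = np.random.default_rng(seed)
    pixels = np.asarray(pixels, float)
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(iters):
        # distance of every pixel to every cluster mean vector
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(d, axis=1)
        new = np.array([pixels[labels == i].mean(axis=0)
                        if np.any(labels == i) else centers[i]
                        for i in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# two well-separated spectral clusters in a two-band space
pts = [[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [10, 11]]
labels, centers = kmeans(pts, 2)
```

ISODATA would add merge/split/delete tests on `centers` between iterations.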
Unsupervised classification (cont.)
Poor representation
• Roads and other linear features → not smooth
• Solution → hybrid classification
Table 7.2
• Outcome 1: ideal result
• Outcome 2: subclasses → classes
• Outcome 3: a more troublesome result
The information categories are spectrally similar and cannot be differentiated in the given data set
Hybrid classification
Unsupervised training areas
• Image sub-areas chosen intentionally to be quite different from supervised training areas
Supervised → regions of homogeneous cover type
Unsupervised → numerous cover types at various locations throughout the scene, to identify the spectral classes
Guided clustering
• Delineate training areas for class X
• Cluster all class X pixels into spectral subclasses X1, X2, …
• Merge or delete class X signatures
• Repeat for all classes
• Examine all class signatures and merge/delete signatures
• Perform maximum likelihood classification
• Aggregate spectral subclasses
Classification of mixed pixels
Mixed pixels
• IFOV includes more than one type/feature
Low-resolution sensors → more serious
Subpixel classification
• Spectral mixture analysis
A deterministic method (not a statistical method)
Pure reference spectral signatures (endmembers)
Measured in the lab, in the field, or from the image itself
Basic assumption: the spectral variation in an image is caused by mixtures of a limited number of surface materials
Linear mixture → satisfies two basic conditions simultaneously:
The sum of the fractional proportions of all potential endmembers: ΣFi = 1
The observed DN for each of B bands → B equations; with the sum constraint, B + 1 equations → solve for up to B + 1 endmember fractions
Fig 7.52: example of a linear spectral mixture analysis
Drawback: multiple scattering → nonlinear mixture model
For each band b: DNb = F1 DNb,1 + F2 DNb,2 + … + FN DNb,N + Eb
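With one equation per band plus the sum-to-one constraint, the endmember fractions can be solved by least squares; a sketch using hypothetical two-band endmember spectra:

```python
import numpy as np

def unmix(pixel_dns, endmembers):
    """Solve the linear mixture model DN_b = sum_i F_i * DN_{b,i} (+ error)
    together with sum_i F_i = 1, i.e. B band equations plus the constraint,
    by least squares. endmembers: (B, N) array, one column per pure
    reference spectrum."""
    B, N = endmembers.shape
    A = np.vstack([endmembers, np.ones(N)])  # B band equations + sum-to-one
    b = np.append(np.asarray(pixel_dns, float), 1.0)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions

# two bands, two hypothetical endmembers (columns); pixel is a 0.4/0.6 mix
em = np.array([[30.0, 80.0], [10.0, 60.0]])
f = unmix([60.0, 40.0], em)
```

The residual of this fit is the error term E in the mixture equation; a large residual can flag the nonlinear (multiple-scattering) case the text warns about.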
Classification of mixed pixels (cont.)
Subpixel classification (cont.)
• Fuzzy classification
A given pixel may have partial membership in more than one category
Fuzzy clustering: conceptually similar to the K-means unsupervised classification approach; hard boundaries → fuzzy regions
Membership grade
Fuzzy supervised classification: a classified pixel is assigned a membership grade with respect to its membership in each information class
The output stage
Image classification → output products → end users
• Graphic products
Plate 30
• Tabular data
• Digital information files
Postclassification smoothing
Majority filter
• Fig 7.53
(a) original classification: salt-and-pepper appearance
(b) 3 × 3 pixel majority filter
(c) 5 × 5 pixel majority filter
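The pixel-majority filter of Fig 7.53 might be sketched as follows (3 × 3 window; leaving border pixels unchanged is a simplifying assumption):

```python
import numpy as np
from collections import Counter

def majority_filter(classmap, size=3):
    """Replace each interior pixel's class label with the majority label
    inside the size x size window centered on it, smoothing the
    salt-and-pepper appearance of a raw per-pixel classification."""
    h = size // 2
    out = classmap.copy()
    for r in range(h, classmap.shape[0] - h):
        for c in range(h, classmap.shape[1] - h):
            window = classmap[r - h:r + h + 1, c - h:c + h + 1].ravel()
            out[r, c] = Counter(window.tolist()).most_common(1)[0][0]
    return out

cmap = np.full((3, 3), 1, dtype=int)
cmap[1, 1] = 2  # isolated salt-and-pepper pixel
smoothed = majority_filter(cmap)
```

A median filter would not work here because class labels are nominal, not ordered; the majority vote is the categorical analogue.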
Spatial pattern recognition
Classification accuracy assessment
Significance
• A classification is not complete until its accuracy is assessed
Classification error matrix
• Error matrix (confusion matrix, contingency table)
Table 7.3
Omission (exclusion) errors
Non-diagonal column elements (e.g. 16 sand pixels were omitted)
Commission (inclusion) errors
Non-diagonal row elements (e.g. 38 urban pixels + 79 hay pixels were included in corn)
Overall accuracy
Producer's accuracy
Indicates how well training-set pixels of the given cover type are classified
User's accuracy
Indicates the probability that a pixel classified into a given category actually represents that category on the ground
Training area accuracies are sometimes used in the literature as an indication of overall accuracy. They should not be!
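Producer's, user's, and overall accuracy fall straight out of the error matrix; a sketch assuming reference (ground truth) classes in columns and classified results in rows:

```python
import numpy as np

def accuracy_measures(error_matrix):
    """Overall accuracy, producer's accuracy (correct / column total, the
    complement of omission error) and user's accuracy (correct / row
    total, the complement of commission error)."""
    m = np.asarray(error_matrix, float)
    correct = np.diag(m)
    overall = correct.sum() / m.sum()
    producers = correct / m.sum(axis=0)  # omission errors live in columns
    users = correct / m.sum(axis=1)      # commission errors live in rows
    return overall, producers, users

# hypothetical 2-class error matrix (rows: classified, columns: reference)
overall, producers, users = accuracy_measures([[45, 5], [15, 35]])
```

Note that a class can have high producer's accuracy but low user's accuracy (or vice versa), which is why both are reported alongside the overall figure.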
Classification accuracy assessment (cont.)
Sampling considerations
• Test areas
Different from and more extensive than training areas
Withhold some training areas for postclassification accuracy assessment
• Wall-to-wall comparison
Expensive; defeats the whole purpose of remote sensing
• Random sampling
Collecting a large sample of randomly distributed points is too expensive and difficult, e.g. 3/4 of Taiwan's area is covered by the Central Mountain Range
Sample only those pixels without influence of potential registration error: several pixels away from field boundaries
Stratified random sampling: each land cover category → a stratum
Classification accuracy assessment (cont.)
Sampling considerations (cont.)
• Sample unit
Individual pixels, clusters of pixels, or polygons
• Sample number
General guideline: 50 samples per category
Large area or more than 12 categories: 75–100 samples per category
Depends on the variability of each category, e.g. wetlands need more samples than open water
Data merging and GIS integration
RS applications → data merging → unlimited variety of data
• Multi-resolution data fusion
• Plate 1: GIS (soil erodibility + slope information)
Trend
• Boundary between DIP and GIS → blurred
• Fully integrated spatial analysis systems → the norm
Multi-temporal data merging
Same area but different dates → composites → visual interpretation
• e.g. agricultural crops
• Plate 31(a): mapping invasive plant species
NDVI from Landsat-7 ETM+: March 7 → blue; April 24 → green; October 15 → red
GIS-derived wetland boundary → eliminates the interpretation of false-positive areas
• Plate 31(b): mapping of an algae bloom
• Enhance automated land cover classification
Register all spectral bands from all dates into one master data set → more data for classification
Principal components analysis → reduce the dimensionality → manipulate, store, classify, …
• Multi-temporal profile
Fig 7.54: greenness (tp, , Gm, G0)
Change detection procedures
Change detection
• Types of interest
Short-term phenomena: e.g. snow cover, floodwater
Long-term phenomena: e.g. urban fringe development, desertification
• Ideal conditions
Same sensor, spatial resolution, viewing geometry, and time of day
An ideal orbit: ROCSAT-2
Anniversary dates
Accurate spatial registration
• Environmental factors
Lake level, tidal stage, wind, soil moisture condition, …
Change detection procedures (cont.)
Approaches
• Post-classification comparison
Independent classification and registration
Change between dates
• Classification of multi-temporal data sets
A single classification is performed on a combined data set
Great dimensionality and complexity → redundancy
• Principal components analysis
Two or more images → one multiband image
Uncorrelated principal components → areas of change
Difficult to interpret or identify the change
Plate 32: (a) before, (b) after, (c) principal component image
Change detection procedures (cont.)
Approaches (cont.)
• Temporal image differencing
One image is subtracted from the other
Change/no-change threshold
Add a constant to each difference image for display purposes
• Temporal image ratioing
One image is divided by the other
Change/no-change threshold
No-change areas → ratio ≈ 1
• Change vector analysis
Fig 7.55
• Change-versus-no-change binary mask
Traditional classification of the time-1 image → two-band (one from time 1 and the other from time 2) algebraic operation → threshold → binary mask → apply to the time-2 image
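Temporal image differencing and ratioing from the list above, sketched with arbitrary change/no-change thresholds (the specific threshold and offset values are assumptions):

```python
import numpy as np

def difference_change(img1, img2, threshold=20, offset=127):
    """Subtract date-1 from date-2 DNs, add a display offset so the result
    stays positive, and flag pixels whose difference exceeds the
    change/no-change threshold."""
    diff = img2.astype(int) - img1.astype(int) + offset
    return np.abs(diff - offset) > threshold

def ratio_change(img1, img2, low=0.8, high=1.25):
    """Divide date-2 by date-1 DNs; no-change areas have a ratio near 1,
    so flag pixels outside the [low, high] band."""
    ratio = img2.astype(float) / np.maximum(img1.astype(float), 1.0)
    return (ratio < low) | (ratio > high)

t1 = np.array([[100, 100]], dtype=np.uint8)
t2 = np.array([[100, 160]], dtype=np.uint8)
```

Ratioing is less sensitive than differencing to multiplicative illumination differences between the two dates, which is one reason both are kept in the toolbox.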
Change detection procedures (cont.)
Approaches (cont.)
• Delta transformation
Fig 7.56: (a) no spectral change between two dates; (b) natural variability in the landscape between dates; (c) effect of uniform atmospheric haze differences between dates; (d) effect of sensor drift between dates; (e) brighter or darker pixels indicate land cover change; (f) delta transformation
Fig 7.57: application of the delta transformation to Landsat TM images of forest
Multisensor image merging
Multi-sensor image merging
• Plate 33: IHS multisensor image merger of SPOT HRV, Landsat TM and digital orthophoto data
Multi-spectral scanner + radar image data
Merging of image data with ancillary data
Image + DEM → synthetic stereoscopic images
Fig 7.58: synthetic stereopair generated from a single Landsat MSS image and a DEM
Standard Landsat images → fixed, weak stereoscopic effect in the relatively small areas of overlap between orbit passes
• Produce perspective-view images
Fig 7.59: perspective-view image of Mount Fuji
Incorporating GIS data in automated land cover classification
Useful GIS data (ancillary data)
• Soil types, census statistics, ownership boundaries, zoning districts, …
Geographic stratification
• Ancillary data → geographic stratification → classification
• Basis of stratification
Single variable: upland vs. wetland, urban vs. rural
Factors: landscape units or ecoregions that combine several interrelated variables (e.g. local climate, soil type, vegetation, landform)
Incorporating GIS data in automated land cover classification (cont.)
Multi-source image classification → decision rules (user-defined)
• Plate 34: a composite land cover classification
A supervised classification of a TM image in early May
A supervised classification of a TM image in late June
A supervised classification of both dates combined using PCA
A wetlands GIS layer
A road DLG (digital line graph)
• Table 7.5: basis for sample decision rules