
Long-Wave Infrared and Visible Image Fusion for Situational Awareness

Nathaniel Walker

Agenda

• What is image fusion?

• Applications

• System-level considerations

• Image fusion algorithms

• Image quality metrics

• Further research

What is Image Fusion?

• Combine data from multiple sensors into a single image

• Visible

• Image Intensified (I2)

• Near Infrared (NIR)

• Short-wave Infrared (SWIR)

• Mid-wave Infrared (MWIR)

• Long-wave Infrared (LWIR)

• X-Ray

• Enhance the capabilities of the human visual system

• ‘See’ outside the visible spectrum

• All-weather visibility

Applications

• Surveillance and Targeting

• Navigation

• Satellites

• Guidance/Detection Systems

System-Level Considerations

• Parallax

• Optical alignment

• Image registration (see the registration sketch after this list)

• Sensor pixel resolution

• Color vs. grayscale

• Spectral resolution can be lost in fusion

• Human factors

• Presentation of IR data

• Realism of displayed data (superposition, contrast reversal)

• Preserving relative intensity across the scene
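
Registration is typically handled before fusion. A minimal sketch of translational registration by phase correlation, assuming NumPy and two grayscale frames already resampled to a common pixel grid; the function and variable names are illustrative, not part of any cited system:

    import numpy as np

    def phase_correlation_shift(reference, moving):
        """Estimate the integer (dy, dx) offset of `moving` relative to `reference`.

        Both inputs are 2-D float arrays of the same shape, e.g. a visible frame
        and an LWIR frame resampled to the same pixel grid.
        """
        # Normalized cross-power spectrum of the two frames.
        f_ref = np.fft.fft2(reference)
        f_mov = np.fft.fft2(moving)
        cross_power = f_mov * np.conj(f_ref)
        cross_power /= np.abs(cross_power) + 1e-12  # avoid divide-by-zero

        # The inverse transform peaks at the relative translation.
        correlation = np.fft.ifft2(cross_power).real
        peak = np.unravel_index(np.argmax(correlation), correlation.shape)

        # Offsets past half the frame wrap around to negative shifts.
        return tuple(p - n if p > n // 2 else p for p, n in zip(peak, correlation.shape))

    # Illustrative use: undo the estimated shift on the moving (LWIR) frame.
    # dy, dx = phase_correlation_shift(visible_gray, lwir_resampled)
    # lwir_aligned = np.roll(lwir_resampled, shift=(-dy, -dx), axis=(0, 1))

This only recovers pure translation; parallax, rotation, scale, and optical misalignment require a richer warp model.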

Image Fusion Algorithms (Zhang and Blum, 1999)

• Weight-based combinations of the two sources (see the fusion sketch after this list)

• linear combination

• general loss of contrast

• Feature extraction

• High-pass filtering or edge detection

• Maximizing image quality metrics
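
A minimal sketch of the two families above, assuming NumPy and co-registered grayscale visible and LWIR arrays of the same size scaled to [0, 1]; the weight `alpha` and the box-filter detail extractor are illustrative choices, not taken from the cited work:

    import numpy as np

    def fuse_weighted(visible, lwir, alpha=0.5):
        # Pixel-wise linear combination; simple, but tends to reduce overall contrast.
        return np.clip(alpha * visible + (1.0 - alpha) * lwir, 0.0, 1.0)

    def fuse_detail_injection(visible, lwir, kernel=5):
        # Feature-extraction style: add only the high-pass (detail) content of the
        # LWIR frame, so warm targets stand out without replacing the visible scene.
        pad = kernel // 2
        padded = np.pad(lwir, pad, mode="edge")
        low = np.zeros_like(lwir)
        for dy in range(kernel):
            for dx in range(kernel):
                low += padded[dy:dy + lwir.shape[0], dx:dx + lwir.shape[1]]
        low /= kernel * kernel                 # box-filter low-pass estimate
        detail = lwir - low                    # high-pass component of the LWIR frame
        return np.clip(visible + detail, 0.0, 1.0)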

Image Quality Metrics

• Mostly done by subjective evaluation

• ‘Optimal’ methods are task and application dependent

• Two classes of quantitative metrics (Chen et al., 2005); a few are sketched after this list

• Analysis of the fused image

• standard deviation – measure of contrast

• entropy – measure of information content

• SNR

• Comparison of the fused image to the source images

• cross-entropy

• objective edge-based measure

• universal index-based measure
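
A minimal sketch of a few single-image and comparison metrics, assuming NumPy and 8-bit grayscale arrays; the function names and histogram binning are illustrative. The edge-based and universal-index comparisons described in the cited work need structural (window-based) analysis and are not sketched here:

    import numpy as np

    def contrast_std(image):
        # Standard deviation of gray levels: a simple proxy for contrast.
        return float(np.std(image))

    def gray_histogram(image, bins=256):
        hist, _ = np.histogram(image, bins=bins, range=(0, 255))
        return hist.astype(float) / hist.sum()

    def entropy(image, bins=256):
        # Shannon entropy of the gray-level histogram: information content in bits.
        p = gray_histogram(image, bins)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def cross_entropy(source, fused, bins=256):
        # Cross-entropy between a source histogram and the fused histogram; smaller
        # values indicate the fused image stays closer to that source's
        # gray-level distribution.
        p = gray_histogram(source, bins)
        q = gray_histogram(fused, bins)
        return float(-np.sum(p * np.log2(q + 1e-12)))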

Further Research

• Concentration on grayscale fusion algorithms for effective communication of spectral information to the viewer

• Sensor Assumptions

• perfect optical alignment and image registration

• same pixel resolution and field of view (FOV)

• Compare quantitative metrics of image quality to subjective image evaluation for situational awareness

• Focus on human factors for injecting infrared content into a visible spectrum image

• What approach adds value without causing distraction or removing detail?

References