Introduction to Image Analysis
What is an Image
Optical Systems project light to a focal plane
Imaging systems measure light intensity at the focal
plane of the optical system
Intensity measurements are stored in a numerical
matrix rendered on the screen as an image
Image Generation
[Diagram: from light to a numerical matrix, www.mediacy.com]
What is Resolution?
The ability to differentiate objects
Defined as the smallest gap that can be measured
In optical systems it is limited by:
Diffraction radii
Sampling rate
Resolution
Rayleigh Criterion Limit
Sampling Rate
Objects are easily resolved optically but the sampling rate is much higher than required
Objects are easily resolved optically and the sampling rate is optimal
Objects are easily resolved optically but the sampling rate is too low
Optimising Resolution
Lens resolution is limited by the Rayleigh criterion:
r(nm) = 1.22λ / (NA_obj + NA_con) in brightfield
r(nm) = 1.22λ / 2NA_obj in fluorescence
Camera resolution is limited by pixel separation:
r(nm) = 2 × ΔPix(nm) / (Mag_obj × Mag_con)
If the camera resolution is finer than the lens resolution, the system is optimised
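The check above can be sketched in a few lines of Python. The formulas are the ones on this slide; the 520 nm wavelength and the 60x/NA 1.40 example values are illustrative assumptions, not figures from the deck.

```python
# Resolution check: is the system limited by the lens or by the camera?

def rayleigh_brightfield(wavelength_nm, na_obj, na_con):
    """Lens resolution limit in brightfield: r = 1.22*lambda / (NA_obj + NA_con)."""
    return 1.22 * wavelength_nm / (na_obj + na_con)

def rayleigh_fluorescence(wavelength_nm, na_obj):
    """Lens resolution limit in fluorescence: r = 1.22*lambda / (2*NA_obj)."""
    return 1.22 * wavelength_nm / (2 * na_obj)

def camera_resolution(pixel_pitch_nm, mag_obj, mag_con=1.0):
    """Camera resolution: two pixels per smallest resolvable gap (Nyquist)."""
    return 2 * pixel_pitch_nm / (mag_obj * mag_con)

# Example: 60x oil objective (NA 1.40), green fluorescence, 6.45 um pixels.
lens_r = rayleigh_fluorescence(520, 1.40)   # ~226.6 nm
cam_r = camera_resolution(6450, 60)         # 215.0 nm
optimised = cam_r <= lens_r                 # camera is finer, so lens-limited
```

With these example numbers the camera samples slightly finer than the lens can resolve, so the system is optimised in the sense used on the slide.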
Optimising Resolution
[Chart: optical resolution vs camera resolution (6.45 µm and 4.5 µm pixels) for 4x NA 0.10, 10x NA 0.3, 20x NA 0.45, 40x NA 0.70 and 60x Oil NA 1.40 objectives]
Bit Depth
Computers store data in binary
All data consists of combinations of a specific number of 1s and 0s
1 bit = 2 possible values (1/0)
2 bit = 4 possible values (00/01/10/11)
Bit Depth
8 bit: 2^8 = 256 values
12 bit: 2^12 = 4,096 values
16 bit: 2^16 = 65,536 values
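The values above follow directly from the number of bits; a one-line sketch:

```python
# Number of representable intensity values per bit depth, and the maximum
# storable value (one less, since counting starts at 0).
levels = {bits: 2 ** bits for bits in (8, 12, 16)}
max_value = {bits: n - 1 for bits, n in levels.items()}
# levels    -> {8: 256, 12: 4096, 16: 65536}
# max_value -> {8: 255, 12: 4095, 16: 65535}
```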
Bit Depth
Digital sensors count photo-electrons
A pixel has a finite capacity (typically 10,000–100,000 e- full well)
No electrons collected = 0
A pixel filled to capacity = the bit-depth maximum value (255/4,095/65,535)
Bit Depth
In theory, higher bit depth means higher precision when measuring intensity
But photo-electron counts are not very precise (typically ±10 e- read noise)
Intensity precision = Full Well / Read Noise
For a typical camera this could mean 10,000 / 10 = 1,000 levels of variation
Other noise factors mean that intensity resolution is usually worse
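The precision estimate above can be worked through numerically, using the slide's example camera (10,000 e- full well, 10 e- read noise):

```python
# Usable intensity levels are limited by read noise, not by the ADC bit depth.
import math

full_well_e = 10_000    # full-well capacity, electrons
read_noise_e = 10       # read noise, electrons

usable_levels = full_well_e / read_noise_e   # 1000 distinguishable levels
usable_bits = math.log2(usable_levels)       # ~10 bits of real precision
# A 12-bit (4096-level) or 16-bit ADC therefore records more codes than
# this sensor can actually distinguish.
```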
Bit Depth
[Charts: intensity histograms (n pixels vs intensity) over different intensity ranges, illustrating the effect of bit depth]
The Problem with Colour
Pixels are not inherently wavelength-sensitive
An e- is an e- regardless of whether it was excited by a blue photon or a red photon
Colour measurements require us to discriminate between wavelengths of light
The Problem with Colour
How to generate a colour image?
Need to measure intensities at each point in red, green and blue
Use filters to restrict to only the wavelength of light you are interested in measuring
The Problem with Colour
3 sensors with a prism
1 sensor with 3 filters
1 sensor with a Bayer mask
The Problem with Colour
3CCD:
+ Fast, good colour resolution
- Expensive and technically challenging
Filter Wheel:
+ Great colour resolution, removable for higher sensitivity in monochrome
- Slow; changing filters requires moving mechanical parts
Bayer Mask:
+ Fast and inexpensive
- Resolution and sensitivity are sacrificed
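A minimal sketch of why the Bayer mask sacrifices resolution and sensitivity: each pixel records only one of the three channels, so two thirds of the colour information at every point is discarded and must be interpolated later. The 4x4 flat-colour "scene" is invented for illustration.

```python
# Simulate an RGGB Bayer mask over a full-colour scene.

def bayer_channel(row, col):
    """Which channel an RGGB-masked pixel at (row, col) records."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def apply_bayer(scene):
    """scene[r][c] = (R, G, B); return the single value each pixel measures."""
    idx = {"R": 0, "G": 1, "B": 2}
    return [[scene[r][c][idx[bayer_channel(r, c)]]
             for c in range(len(scene[0]))] for r in range(len(scene))]

scene = [[(200, 100, 50)] * 4 for _ in range(4)]   # a flat orange patch
raw = apply_bayer(scene)
# raw[0] -> [200, 100, 200, 100]  (R, G, R, G row)
# raw[1] -> [100, 50, 100, 50]    (G, B, G, B row)
```

Reconstructing full colour from `raw` (demosaicing) has to interpolate the missing channels from neighbours, which is where the resolution loss comes from.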
Everything in Imaging is a Compromise
Optical resolution - higher resolution means shallower depth of field
Pixel sampling rates - more pixels mean slower frame rates
Bit depth - higher bit depth requires more computing power
Colour - makes analysis more complicated
Image Investigation Tools
Bitmap Analysis
Image Histogram
Line Profile
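The image histogram listed above is simple to compute: count how many pixels take each intensity value. A sketch for an 8-bit image, with a tiny invented sample:

```python
# Image histogram: pixel count per intensity level.

def histogram(image, levels=256):
    counts = [0] * levels
    for row in image:
        for value in row:
            counts[value] += 1
    return counts

image = [[0, 0, 128],
         [128, 255, 128]]
h = histogram(image)
# h[0] == 2, h[128] == 3, h[255] == 1
```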
Line Profile Analysis
Plot pixel intensities along a line
Detect profile features:
Valleys
Peaks
Falling edges
Rising edges
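Peak and valley detection on a profile can be sketched as local-extremum tests on the sampled intensities; rising and falling edges would come from the sign of the differences in the same way. The profile values are invented for illustration.

```python
# Find peaks (local maxima) and valleys (local minima) in a line profile.

def profile_features(profile):
    peaks, valleys = [], []
    for i in range(1, len(profile) - 1):
        if profile[i] > profile[i - 1] and profile[i] > profile[i + 1]:
            peaks.append(i)
        elif profile[i] < profile[i - 1] and profile[i] < profile[i + 1]:
            valleys.append(i)
    return peaks, valleys

profile = [10, 40, 90, 40, 10, 35, 80, 30]
peaks, valleys = profile_features(profile)
# peaks -> [2, 6], valleys -> [4]
```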
Measuring Objects
Groups of pixels together make objects
From outlines we can get measurements:
Perimeter
Area
Bounding box
Centre coordinates
Radii
Much, much more...
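The measurement step above can be sketched end-to-end on a binary mask: group touching foreground pixels into objects with a flood fill, then compute area, bounding box and centre for each. This is a minimal stdlib-only sketch, not the method of any particular analysis package; the tiny test mask is invented.

```python
# Label connected foreground pixels and measure each resulting object.
from collections import deque

def measure_objects(mask):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood fill (4-connectivity) to collect this object's pixels.
                pixels, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                objects.append({
                    "area": len(pixels),
                    "bbox": (min(ys), min(xs), max(ys), max(xs)),
                    "centre": (sum(ys) / len(ys), sum(xs) / len(xs)),
                })
    return objects

mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 1]]
objs = measure_objects(mask)
# two objects: a 2x2 square (area 4) and a 2x1 bar (area 2)
```

Perimeter and radii follow from the same outlines; in practice a library routine (e.g. a connected-components function) would replace the hand-rolled flood fill.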