All digital imaging faces a compromise between resolution, speed, and sensitivity. For example, larger pixels provide more sensitivity because they capture more light, but the larger area reduces the ability to discriminate detail, reducing resolution.
ETERNAL TRIANGLE
[Diagram: parameter interplay among resolution, sensitivity, and acquisition speed.]
For more information, go online to: posters.sciencemag.org/camerasensors
The development of digital cameras for scientific imaging began, unwittingly, in 1905, when Albert Einstein postulated that light with sufficient energy hitting a metal could free an electron: the photoelectric effect. When a photon with a wavelength between 350 and 1,000 nanometers strikes a silicon crystal (the semiconductor material used to make digital sensors), it is absorbed and releases an electron. This electrical signal is the first step in the generation of a digital image.
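The energy condition behind the photoelectric effect can be checked with a short calculation. This sketch (my own illustration, using standard physical constants and an assumed silicon band gap of about 1.12 eV, which is not stated in the poster) converts wavelength to photon energy:

```python
# Sketch: photon energy across the 350-1,000 nm range, compared with the
# ~1.12 eV band gap of silicon (assumed room-temperature value).
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

SILICON_BANDGAP_EV = 1.12  # approximate band gap of silicon

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

for nm in (350, 700, 1000, 1200):
    e = photon_energy_ev(nm)
    print(f"{nm:4d} nm -> {e:.2f} eV, can free an electron: {e > SILICON_BANDGAP_EV}")
```

Photons at 1,200 nm fall below the band gap, which is why silicon sensors stop responding a little beyond 1,000 nm.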
The light-detection capability of a silicon-based sensor, or its quantum efficiency (QE), is a measure of the efficiency with which photons are converted into electrical charge. QE depends on the wavelength of the incident light and the design of the sensor, such as monochrome or color, as shown in these graphs.
The size of a single image and the imaging frame rate define the amount of data generated each second by a camera. The technology that connects the camera to a computer must meet or exceed that bandwidth. Many digital interface technologies can be used, and each one offers specific benefits. Cable length, cost, effective data rate, an option to transfer power and data through one cable, and ease of use determine the best solution for a specific camera application. USB 3.0 provides a raw-data bandwidth of about 5 gigabits per second, making it the fastest standard data interface for digital cameras, and it is easy to implement. Alternatives, such as 10-Gigabit Ethernet, offer higher bandwidths and longer cables, but are more challenging to apply in camera–computer communication and therefore less common than USB 3.0.
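A quick way to see whether a camera stream fits an interface is to multiply image size by frame rate. In this sketch, the sensor geometry, bit depth, and frame rate are assumed example values, not specifications of any particular camera:

```python
# Sketch: raw data rate of an image stream vs. the USB 3.0 raw bandwidth
# cited in the text. All camera parameters below are assumed examples.
USB3_RAW_GBPS = 5.0  # raw USB 3.0 bandwidth, gigabits per second

def data_rate_gbps(width, height, bits_per_pixel, fps):
    """Raw image-data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

rate = data_rate_gbps(width=2048, height=2048, bits_per_pixel=16, fps=50)
print(f"{rate:.2f} Gb/s, fits USB 3.0 raw bandwidth: {rate < USB3_RAW_GBPS}")
```

Note that protocol overhead reduces the usable bandwidth below the raw 5 Gb/s, so real systems need more headroom than this comparison suggests.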
Color vs. Monochrome
In a color camera, a Bayer filter in front of the sensor allows every pixel to detect only photons of a given color—red, green, or blue. This turns a monochrome sensor into a color sensor, which has fewer and bigger effective pixels, and lower sensitivity than the underlying monochrome sensor, but allows instantaneous acquisition of color images and videos.
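The filter layout can be made concrete with a tiny sketch. The RGGB arrangement below is one common Bayer variant, assumed here for illustration; the poster does not specify which variant a given camera uses:

```python
# Sketch: color-filter assignment in an RGGB Bayer mosaic. Each pixel
# records only one of the three color channels.
def bayer_color(row, col):
    """Filter color at a given pixel of an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# The repeating 2x2 tile: two green pixels for every red and blue pixel.
tile = [[bayer_color(r, c) for c in (0, 1)] for r in (0, 1)]
print(tile)  # -> [['R', 'G'], ['G', 'B']]
```

Because each color is sampled at only a quarter (red, blue) or half (green) of the pixel sites, the full-color image must be interpolated, which is why effective resolution and sensitivity drop.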
Single pixels are arranged in a grid, which forms the sensor. How the pixels are built, how their charge is converted into a voltage, and how the pixels are addressed and controlled by the electronics are collectively called the sensor architecture. Small pixels are required to capture high-resolution images, but larger pixels capture more photons, enabling low-light applications such as live-cell imaging. The speed of a sensor depends on several factors, most importantly the readout time: how long it takes to collect and digitize information from all the pixels.
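The readout time directly caps the achievable frame rate. A minimal sketch, with an assumed 10-millisecond readout that is not taken from any camera specification:

```python
# Sketch: the full-frame readout time limits the frame rate.
# The 10 ms readout time below is an assumed example value.
def max_frame_rate(readout_time_s):
    """Highest frame rate (frames per second) a full-frame readout allows."""
    return 1.0 / readout_time_s

print(max_frame_rate(0.010))  # a 10 ms readout allows at most 100 fps
```

Reading out only a subregion of the sensor shortens the readout time, which is why many cameras offer higher frame rates at reduced resolution.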
BRIEF GLOSSARY:
back thinning: etching a sensor to reduce the path of the light through the silicon before it reaches the detection area.
blooming: an image artifact in an overexposed image resulting from the charge in one overexposed pixel spreading to adjacent pixels. This is especially noticeable when imaging isolated bright spots.
dark current: spurious signal in the absence of photon absorption.
dynamic range: range of brightness that can be captured by a sensor.
pixel: one light-sensitive element of a sensor grid.
postprocessing: computational modifications of image data after capture.
resolution: in microscopy, this is the size of the smallest structures that can be resolved by an instrument. In the context of digital imaging, resolution can also refer to the number of pixels in a sensor or in an image.
sensitivity: the number of photons required to create a detectable signal.
signal-to-noise ratio: the amplitude of the signal compared to the overall noise.
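Several of the glossary terms interact. This sketch of a standard camera noise model (a common textbook formulation, not taken from the poster; all numbers are illustrative) ties signal, dark current, read noise, and dynamic range together:

```python
# Sketch: standard camera noise model. Shot noise is the square root of the
# collected charge; all electron counts below are illustrative values.
import math

def snr(signal_e, dark_e, read_noise_e):
    """Signal-to-noise ratio for a signal given in electrons."""
    return signal_e / math.sqrt(signal_e + dark_e + read_noise_e ** 2)

def dynamic_range_db(full_well_e, read_noise_e):
    """Dynamic range in decibels: brightest vs. dimmest detectable signal."""
    return 20 * math.log10(full_well_e / read_noise_e)

print(f"SNR at 100 collected electrons: {snr(100, 5, 2):.1f}")
print(f"Dynamic range: {dynamic_range_db(30000, 2):.1f} dB")
```

Even a noiseless camera is limited by shot noise, so the SNR of 100 collected electrons can never exceed 10; lowering dark current and read noise only approaches that ceiling.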
In 1979, Germany released a stamp in honor of Einstein’s 1921 Nobel Prize in Physics, received for discovering the photoelectric effect.
Galyamin Sergej/Shutterstock.com
From Photons to Pixels
From Camera to Computer
From Pixels to Sensor
Sensor of ZEISS Axiocam
Dissecting the Sensor
A CCD sensor collects the electric charge from all the pixels through a small number of charge amplifiers, making for a homogeneous sensor that requires little postacquisition image correction. A CMOS sensor includes millions of miniature amplifiers, one at every pixel. This parallelization drastically improves the readout speed, but causes small variations between the pixels that are removed by digital postprocessing.
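A minimal sketch of the kind of pixelwise correction meant here: dark-frame subtraction followed by flat-field division. The tiny 2x2 frames and calibration values are invented for illustration:

```python
# Sketch: remove per-pixel offset and gain variations from a raw frame.
# The 2x2 frames and calibration values below are invented examples.
def correct(raw, dark, flat):
    """Subtract the dark frame, then divide by the per-pixel relative gain."""
    return [[(r - d) / f for r, d, f in zip(raw_row, dark_row, flat_row)]
            for raw_row, dark_row, flat_row in zip(raw, dark, flat)]

raw  = [[110.0, 55.0], [102.0, 33.0]]  # raw pixel values
dark = [[10.0, 5.0], [2.0, 8.0]]       # per-pixel offset (dark frame)
flat = [[1.0, 0.5], [1.0, 0.25]]       # per-pixel relative gain (flat field)
print(correct(raw, dark, flat))  # -> [[100.0, 100.0], [100.0, 100.0]]
```

After correction, a uniformly illuminated scene yields a uniform image, which is the homogeneity a CCD delivers without this step.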
CCD sensors deliver high sensitivity, good linearity, low dark current, and good image homogeneity, while CMOS sensors provide fast readout speeds, high sensitivity, and a huge dynamic range, but are sometimes limited by residual pixel inhomogeneity and distortion of fast objects due to the rolling-shutter operation.
Modifications of CCD and CMOS sensors are also available. Back-illuminated or back-thinned sensors further increase quantum efficiency to >90%, and electron multiplying CCD (EMCCD) sensors make even the dimmest signals visible by reducing the effective readout noise.
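The effect of electron multiplication on readout noise can be sketched in one line; the read-noise and gain values below are illustrative, not measured specifications:

```python
# Sketch: electron multiplication before readout amplifies the signal, so the
# readout noise, referred back to the input, shrinks by the EM gain.
def effective_read_noise(read_noise_e, em_gain):
    """Read noise in electrons, referred to the input after EM amplification."""
    return read_noise_e / em_gain

print(effective_read_noise(50.0, 1000))  # 50 e- of read noise becomes 0.05 e-
```

In practice the multiplication process itself adds an excess noise factor, so the benefit is greatest for very dim signals where read noise would otherwise dominate.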
CCD: charge-coupled device. Mainstream image-sensor technology for the last 40 years, with very homogeneous signal quality; likely to be replaced by CMOS for many applications. Still best for long exposure times. Best raw data quality.
EMCCD: electron multiplying CCD. The most sensitive sensor type, thanks to large pixels and electron multiplication; low resolution; suited to extremely low-light applications.
CMOS: complementary metal-oxide semiconductor. Active pixel sensor with pixelwise charge-to-voltage signal conversion, on-chip analog-to-digital converter (ADC), undergoing major improvements over the last few years. Fast readout.
sCMOS: scientific CMOS. Specially optimized with low readout noise. Fast readout.
SENSOR:       CCD                     | EMCCD          | CMOS       | sCMOS
Price:        widest range available  | very expensive | affordable | expensive
Color option: yes                     | no             | yes        | yes
SENSOR SELECTION CRITERIA FOR MICROSCOPY
[Chart: CCD, EMCCD, CMOS, and sCMOS sensors rated (best, very good, acceptable, poor) on sensitivity, dynamic range, speed, resolution, color option, and price.]
Light sheet image of young Octopus bimaculoides. © Image courtesy of Eric Edsinger & Daniel S. Rokhsar, Okinawa Institute of Science and Technology.
Expanded glossary online
STEPS IN IMAGE FORMATION
1. Capturing the light
2. Acquiring the signal
3. Converting the signal from analog to digital
4. Computing the final image
[Graph: quantum efficiency (%) of sensor vs. wavelength (nm) of incident light, 300 to 1,100 nm, for a monochrome sensor and a color (RGB) sensor.]
A color camera delivers lower effective sensitivity than a monochrome camera with the same sensor, but also records the color of the collected light.
In microscopy, scientists in search of a perfect instrument often focus on the optics or special camera features. But getting the best images of a specific specimen also depends on the camera’s sensor and related electronics.
Here, we follow photons on their journey from sample to digital image.
THE SCIENCE OF MICROSCOPY CAMERAS
Sponsored by
Produced by the Science/AAAS Custom Publishing Office
Immunostaining of planktonic cnidaria; © Helena Parra-Acero, Institut de Biologia Evolutiva (CSIC-Universitat Pompeu Fabra), Barcelona, Spain.
© ZEISS Microscopy
[Diagram: camera signal chain: sensor, sensor control, analog signal, ADC, signal postprocessing unit, system management, data interface.]