

Automated High-speed Nanopositioning inside Scanning Electron Microscopes

Daniel Jasper and Sergej Fatikow, Member, IEEE

Abstract— This paper describes a new, automated positioning system that uses the scanning electron microscope as a fast, high-resolution sensor system. With two line scans and low computational overhead, the exact position of a reference pattern is determined. Using a customized external scan generator and scanning algorithm, the bottleneck of image acquisition is bypassed and the position tracking system can reach update rates of 1 kHz. Using image-based object recognition algorithms for automated tracking initialization and a tight integration into the control infrastructure of mobile nanohandling robots, fully automated nanopositioning is possible. A positioning accuracy below 10 nm is achieved in a tungsten cathode-based microscope and positioning operations along well-defined trajectories are completed in a few tens of milliseconds.

I. INTRODUCTION

Nanoobjects such as CNTs and other nanowires have become of significant interest for both the research community and industrial applications. Integrated into micro- and macrodevices, nanoobjects can capitalize on their extraordinary material properties to create systems with superior properties. In order to individually characterize nanoobjects and to implement nanocomponent-based devices, nanohandling robots have been successfully applied [1], [2]. Such robots can manipulate objects with an accuracy of single nanometers [3]. However, sensory feedback is a major problem, as the exact relative position between tools and nanoobjects has to be measured. Currently, microscopic feedback is the only possibility to determine this relative position, as the objects are too small to be localized by other means such as interferometric measurements. The microscopic systems used are optical microscopes, scanning electron microscopes (SEMs), transmission electron microscopes (TEMs) or atomic force microscopes (AFMs). SEMs are commonly favored for robotic manipulations because of their comparatively large working range, high resolution and reasonable update rate. Optical microscopes can only be used for microobjects and very large nanoobjects because of their limited resolution. AFMs and TEMs have a high resolution but also a very small working range, making complex robotic operations difficult.

During most nanohandling operations, several micro- and nanoobjects need to be brought into a predefined relative position. Due to uncertainties on the nanoscale such as drift-induced position errors, mounting inaccuracies and even non-linearities in the SEM image, the relation between positions

Parts of this work were supported by the European Community: Project NanoHand (IP 034274).

Both authors are with the Division Microrobotics and Control Engineering, University of Oldenburg, 26111 Oldenburg, Germany (e-mail: [email protected], phone: +49-441-7984296).

measured in SEM images and values of the nanohandling robots' internal position sensors is non-linear and time-variant. Therefore, the internal sensors are insufficient for reliable nanopositioning. Instead, the microscopic position feedback can be used for closed-loop positioning [4]. This process is called visual servoing. Successful visual servoing using the SEM as imaging device has been demonstrated in [5]. In this way, most handling operations can be successfully completed, but the positioning speed is limited for several reasons. First, SEM image acquisition is slow. Even with well-selected regions of interest and accepting high noise levels, 50 Hz can hardly be exceeded [6]. Second, most image processing algorithms are computationally intensive and thus introduce a delay into the control loop. Third, if the algorithms are executed using standard computers and operating systems without real-time capabilities, there is an unpredictable jitter on the update rate. All these aspects make a visual servoing control loop slow. However, if the objects of interest are well-known, their position can be determined relying only on a number of dedicated line scans [7], achieving a much higher update rate and/or resolution.

This paper shows the implementation of an architecture for fully automated nanopositioning. Section II describes the new position tracking approach based on SEM line scans including pattern generation, a hardware-based scan generator as well as brightness and contrast tracking. Section III describes the integration of the tracking approach into the control infrastructure of mobile nanohandling robots, creating a fully automated positioning system. Section IV demonstrates the potential of the proposed system based on measurement results.

II. POSITION TRACKING BY SEM LINE SCANS

Position tracking with line scans is most effective if used on special target patterns that are created for easy tracking (see Section II-B). A simple example is a circular pattern (see Fig. 1). Two line scans are conducted exactly over the last known position of the circle's center (see Fig. 1a). If the pattern remains stationary, the detector shows an almost cosine-shaped signal. If the pattern moves, the line scans are no longer conducted over the circle's center and a deviation of the detector signal during the line scan is visible (Fig. 1b). Each line scan is used to calculate the pattern's position in one degree of freedom. The position of the next line scan can then be adapted to the new position of the pattern's center and continuous tracking over long distances is possible. Also, multiple patterns can be tracked alternately. Thus, instead of

6th annual IEEE Conference on Automation Science and Engineering, Marriott Eaton Centre Hotel, Toronto, Ontario, Canada, August 21-24, 2010

MoC4.3

978-1-4244-5448-8/10/$26.00 ©2010 IEEE


Fig. 1. Determining the position of a pattern with line scans. a) For a stationary pattern, the line scans are conducted over its center. b) If the pattern moves, each line scan reveals an offset in one degree of freedom.

tracking one pattern with a 1 kHz update rate, two patterns can be tracked with 500 Hz.

A precise calculation of the pattern's position on each line scan is the key to robust and precise tracking with this approach. In [7], an approach similar to quadrature interpolation in linear encoders was applied to the SEM line scans. In this paper, a new method based on center of gravity calculation is presented.

A. Center of gravity calculation

The calculation of the pattern's position has to fulfill several performance requirements. First, it needs to determine the pattern's position on the line scan correctly, even with a high level of noise. Noise on pixel values of the line scan should not translate into significant noise on the position readout. Second, the calculation should be as robust as possible against changing imaging conditions such as brightness, contrast and focus. Third, as it is hard to create patterns that lead to an ideal cosine detector signal, the calculation of the position should be independent of the exact signal shape. For the following considerations, each line scan returns N pixels with the pixel values p_i with 0 ≤ i < N.

Fig. 2. Determining the pattern's position with CoG calculation. The dotted line represents the center of gravity, the X is the correct position of the pattern and the dashed line the chosen threshold T.

For the center of gravity (CoG) calculation approach, each pixel is weighted with its gray value. The CoG of these values is likely to represent the center of the pattern (see Fig. 2a). On a line of length N, the CoG c is calculated by:

c = \frac{\sum_{i=0}^{N-1} i \cdot p_i}{\sum_{i=0}^{N-1} p_i} = \frac{s_w}{s_v}.   (1)

Thus, s_w is the weighted sum and s_v the sum of the values, which will later be calculated during the line scan. In this form, there are two disadvantages to the CoG calculation. First, if the pattern moves between two scans, some pixels are no longer on a scanned line, i.e. the scanned line only detects part of the pattern (see Fig. 2b). The other part is cut off and the center of gravity no longer correctly represents the pattern's position. Second, the calculation is highly sensitive to an offset in the pixel value, i.e. to a change in brightness of the SEM image. Such an offset leads to a changing influence of the background pixels, i.e. pixels not belonging to the pattern, and again the CoG does not correctly represent the pattern's center (see Fig. 2c). Both issues can be improved by applying a threshold function t before calculating the CoG:

t(p_i) = \begin{cases} 0, & \text{when } p_i < T, \\ p_i - T, & \text{else.} \end{cases}   (2)

Thus, for a given threshold T , the equation is changed to:

c = \frac{\sum_{i=0}^{N-1} i \cdot t(p_i)}{\sum_{i=0}^{N-1} t(p_i)}   (3)

With this equation, the pixels that are cut off because of the movement do not contribute to the derived position (see Fig. 2d). Even if relevant pixels are missed, there is no big error in the calculated position (see Fig. 2e). In order to make the approach robust against changes in brightness, the threshold T has to be changed according to the values of each scan (see Fig. 2f). For an efficient hardware implementation (see Section II-C), the new threshold is not yet available during the scan and thus cannot be used to calculate the two sums s_w and s_v. However, as there are no rapid (<1 ms) changes in brightness, T can be determined based on the previous pattern scan.
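As a sanity check of the calculation, the thresholded CoG of Eqs. (1)-(3) can be sketched in a few lines. The synthetic cosine-shaped scan below and all numeric values are illustrative, not from the paper:

```python
import numpy as np

def cog_position(pixels, T):
    """Sub-pixel pattern center of one line scan: apply the threshold
    t(p) = p - T for p >= T else 0, then take the center of gravity."""
    p = np.asarray(pixels, dtype=float)
    t = np.clip(p - T, 0.0, None)       # background pixels drop to zero
    sv = t.sum()                        # sum of thresholded values (s_v)
    if sv == 0.0:
        return None                     # pattern not on this line
    sw = (np.arange(len(t)) * t).sum()  # index-weighted sum (s_w)
    return sw / sv

# illustrative test: cosine-shaped pattern centered at pixel 40 on a
# 100-pixel line, with a constant brightness offset of 10 added
i = np.arange(100)
scan = np.where(np.abs(i - 40) < 15,
                100 * np.cos((i - 40) / 15 * np.pi / 2), 0) + 10
center = cog_position(scan, T=20)
```

With the threshold active, the constant brightness offset is suppressed and the returned center stays at pixel 40 despite it.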

The particle shown in Fig. 3a was used to verify the CoG calculation. The particle is about 2 µm in diameter and almost circular. Orthogonal line scan pairs are not only conducted over the particle's real center (see Fig. 3b) but also over other positions of a 2 µm by 2 µm test area. The CoG

Fig. 3. Tracked particle. a) The irregular shape of the particle leads to an increased measurement error. b) Complete characterization of the line scan tracking on the particle.


Fig. 4. Measurement errors when conducting the line scans with different offsets to the particle's center (axes: x [µm], y [µm], error [µm]; the 0.05 µm error contour is marked).

calculation was then used to determine the pattern's center from each line scan pair.

Fig. 4 shows the distances between the measured position and the real center of the particle, i.e. the measurement error. The x- and y-coordinates show the center of the line scans with respect to the particle's real center. For offsets in the range of ±0.6 µm, the particle's center is measured with an error of less than 50 nm. The shape deviation in Fig. 3 is directly translated into an increased measurement error for y < −0.1 µm, virtually independent of the x-offset. The ±0.6 µm range should be used as a limit for tracking and corresponds to approx. 30% of the pattern size.

B. Pattern generation

In order to effectively use tools with the described position tracking approach, special target patterns need to be fabricated on the tools themselves. Milling with a focused ion beam (FIB) is a promising method of creating such patterns. Fig. 5 shows the decoration of an electrostatic microgripper with four patterns. The patterns 1, 3 and 4 in Fig. 5 are rotationally invariant, whereas pattern 2 can be used to detect

Fig. 5. Patterns created by FIB milling on the tip of an electrostatic microgripper [8].

Fig. 6. Electron beam paths for FIB-milled patterns. a) Rotationally invariant pattern, b) pattern for detecting rotations.

rotations similar to [3]. Patterns 3 and 4 are created on the gripper jaws. They can either be used to track the position of the gripper or the gripper opening. Patterns 1 and 2 are created on a central plunger, which can be used to overcome the adhesion forces on the micro- and nanoscale [8].

Fig. 6 shows how the two kinds of patterns are scanned. The rotation-invariant patterns require two orthogonal line scans as described above. To detect the rotation in Fig. 6b, four line scans on a continuous square are used, as described in [7]. The two horizontal scans indicate the x-position, whereas the two vertical scans indicate the y-position. The difference between the two line scans of each pair gives feedback about the rotation. If rotation is detected, the entire scanned square needs to be rotated in order to continuously scan rotations with unlimited angles. A huge benefit of this scanning sequence is that no rapid beam movements are required and no beam settling times need to be adhered to.
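The paper does not spell out the arithmetic behind the square scan; one plausible reading, under a hypothetical offset convention (each argument is the lateral pattern offset measured on the corresponding scan line of a square with edge length `side`), is:

```python
import math

def square_scan_update(x_top, x_bottom, y_left, y_right, side):
    """Translation and rotation estimate from the four line scans of the
    square pattern: the mean of each pair gives the translation, while a
    difference within a pair indicates a rotation about the center."""
    dx = (x_top + x_bottom) / 2.0
    dy = (y_left + y_right) / 2.0
    theta = math.atan2(x_top - x_bottom, side)   # small-angle rotation
    return dx, dy, theta

# pure translation: both scans of each pair report the same offset
dx, dy, theta = square_scan_update(0.2, 0.2, -0.1, -0.1, side=2.0)
```

A nonzero `theta` would then rotate the next scanned square, matching the unlimited-angle tracking described above.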

There are two important considerations about the pattern: the pattern shape and the achievable signal-to-noise ratio. The pattern shape is important because deviations (compare Fig. 3) lead to position tracking errors. The signal-to-noise ratio of the FIB-milled patterns shown in Fig. 5 is high. The secondary electrons can easily escape the central column, leading to a bright inner circle, but they get trapped in the trench around the column, leading to a dark surrounding.

C. Dedicated scan generator

The two line scans as described above can be completed in less than 1 ms. However, to allow for a high update rate, the latency between the completion of the two line scans, the availability of the position information and the start of the next line scan has to be minimized. For this reason, scan generators included in current SEMs cannot be used. Even if a flexible software interface is provided allowing for arbitrary line scans, scanning the lines, transferring the data, calculating the position and ordering the next line scan introduces a significant latency (>10 ms) and makes high-speed tracking and control infeasible.

Instead, a custom scan generator was built that uses an analog scan coil interface to control the electron beam's position. Most SEMs feature such an interface, as it is also used for external energy dispersive X-ray analysis (EDX) detectors. The scan generator is based on a development


Fig. 7. Simplified schematic of the custom scan generator based on a Xilinx Spartan 3 FPGA (25 MSps ADC and DACs, two 28-bit accumulators, scanning state machine, microcontroller with FPU, CoG calculator, CAN and USB controllers).

board of a Xilinx Spartan 3 field programmable gate array (FPGA). An important peripheral on the development board is a high-speed USB interface to transfer scanned images and position information to a computer. A custom analog-digital extension board can create two analog voltages to control the electron beam and digitize the detector signal. The used converters operate at a frequency of 25 MHz. Additionally, a controller area network (CAN) interface is used for a real-time delivery of the tracked position to the robot controllers.

Fig. 7 shows the simplified schematics of the scan generator. The FPGA contains a microcontroller for communications and scan planning. It transmits requests for linear scans from an arbitrary point a to another point b to a state machine. The state machine then executes these scans with the help of two 28-bit accumulators for the x- and y-position of the electron beam. The resolution of these accumulators is much higher than the resolution of the 14-bit digital to analog converters (DACs) that generate the analog voltages. The additional resolution is required so that, with a constant sampling rate of 25 MSps, a line scan can deliver up to 2^14 pixels with up to 2^14-fold pixel averaging. Additional DACs can multiply the signal and add an offset. Thus, the signals generated by the first two DACs can be scaled, shifted and rotated. With the possibility of arbitrary linear scans, the scan generator can either be used for image acquisition or for directly implementing the described tracking. If used for tracking, the pixel data is directly transferred to a hardware implementation of the CoG tracking.
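The accumulator scheme can be illustrated with a short fixed-point sketch. Only the widths (28-bit accumulators, 14-bit DACs) and the sampling scheme come from the text; the 14.14 split, the function signature and the numeric values are assumptions for illustration:

```python
def plan_line_scan(a, b, n_pixels, avg):
    """Drive a 14-bit DAC from a 28-bit accumulator along a linear scan
    from DAC code a to DAC code b, taking n_pixels * avg samples."""
    FRAC = 1 << 14                       # lower 14 bits = fractional part
    samples = n_pixels * avg
    acc = a * FRAC                       # 28-bit accumulator start value
    inc = (b - a) * FRAC // samples      # integer per-sample increment
    codes = []
    for _ in range(samples):
        codes.append(acc >> 14)          # top 14 bits go to the DAC
        acc += inc
    return codes

codes = plan_line_scan(a=1000, b=1256, n_pixels=256, avg=2)
# with 2-fold averaging, the beam advances one DAC code every two samples
```

The sub-DAC resolution of the accumulator is exactly what makes slow, heavily averaged scans possible at a fixed 25 MSps sample rate.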

The hardware implementation of the CoG approach is simple and efficient. The two sums s_w and s_v in (1) are calculated during the line scan. After applying the threshold function, each acquired measurement point is added to the sums. Once a line scan is completed, a division leads to the desired position information. A microcontroller extension called floating-point unit (FPU) can efficiently perform this division. With a few additional FPU computations, the position of the line scan together with the calculated center of gravity can be translated into a position in the SEM coordinate system. These calculations consume a total of approx. 100 µs, after which the position is available for the next line scan and can be transferred to the robot controller.

D. Brightness and contrast tracking

As described in Section II-A, the position tracking with line scans and CoG calculation is virtually immune to changes in brightness and contrast. For other image processing algorithms however, the brightness and contrast in images obtained by the SEM is of high significance. Thus, it is desirable to track these values, i.e. keep brightness and contrast constant during a handling operation. This can be achieved with the data collected during the two line scans.

For the employed SEM (Zeiss LEO 1450), the two settings brightness b and contrast c can be controlled in a range from 0 to 100%. The brightness is a linear offset to the obtained values, whereas the contrast describes an exponential gain. Thus, a given detector signal s leads to a pixel value p according to the equation:

p = s \cdot e^{c \cdot k_1} + (b - k_2) \cdot k_3   (4)

The constants k_1 = 28, k_2 = 0.496 and k_3 = 26000 were approximated based on several experiments. k_2 and k_3 are determined by setting c to 0 and changing b. Afterwards, k_1 was measured by altering c and fitting an exponential curve.

On each line scan, the brightest pixel value p_b and the darkest pixel value p_d can be measured. In order to track the brightness and contrast conditions, the values of p_b and p_d should remain constant. At the start, the desired values d_b and d_d for the brightest and darkest pixel are stored. If the contrast and brightness conditions change, p_b and p_d deviate from d_b and d_d. According to (4), this deviation can be compensated for by changing the brightness and contrast settings. To change the contrast, the missing factor f between current and desired contrast is calculated and translated into the new exponential contrast setting c′:

f = \frac{d_b - d_d}{p_b - p_d}   (5)

c' = c + \frac{\ln f}{k_1}   (6)

To compensate for brightness changes, a new setting b′ is calculated using the mean of the brightest and darkest pixels:

b' = b + \left( \frac{d_b + d_d}{2} - \frac{p_b + p_d}{2} \right) \cdot \frac{1}{k_3}   (7)

The changes to brightness and contrast are low-pass-filtered because the SEM exhibits significant noise on the pixel values at the employed scan speed. This noise directly leads to rapidly and continuously changing values.
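Under the model of Eq. (4), one filtered update step might look as follows. The low-pass coefficient and the assumption that Eq. (6) divides by k_1 are mine, not the paper's:

```python
import math

K1, K2, K3 = 28, 0.496, 26000   # constants of Eq. (4) from the paper
ALPHA = 0.1                      # hypothetical low-pass filter coefficient

def update_brightness_contrast(b, c, pb, pd, db, dd):
    """One tracking step: pb/pd are the brightest/darkest pixels of the
    current scan, db/dd the desired values stored at the start."""
    f = (db - dd) / (pb - pd)            # missing contrast factor, Eq. (5)
    c_target = c + math.log(f) / K1      # Eq. (6), gain constant assumed k1
    b_target = b + ((db + dd) / 2 - (pb + pd) / 2) / K3   # Eq. (7)
    # low-pass filter the settings to suppress per-scan pixel noise
    return b + ALPHA * (b_target - b), c + ALPHA * (c_target - c)

# if the scan already matches the desired values, nothing changes
b1, c1 = update_brightness_contrast(0.5, 0.5, pb=200, pd=50, db=200, dd=50)
```

A scan with reduced dynamic range (p_b − p_d < d_b − d_d) yields f > 1 and pushes the contrast setting up, as intended.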


III. AUTOMATED POSITIONING

For this paper, mobile robots are used to demonstrate automated positioning based on the line scan-based tracking. Using mobile robots is challenging, as they do not provide internal sensors and require the simultaneous control of multiple degrees of freedom [7]. Other actuators, e.g. linear nanopositioners, can be positioned in a similar fashion.

A. Open-loop control

The employed mobile robot features 27 different piezoelectric segments combined into six actuation channels. For most nanohandling operations, such a robot does not perform rotational movements due to its size of several cm³. If the tool or specimen needs to be rotated, even by just a few degrees, this requires a movement of several mm or even cm of the entire robot. This is usually performed by first moving the robot to a safe distance, then performing the rotation with coarse positioning sensors and finally fine-approaching the destination from a different angle. Thus, for the actual fine nanopositioning, only translational movements are performed. For these movements, three different actuation amplitudes u_1 to u_3 are used, enabling the robot to move into an arbitrary direction (compare Fig. 8b):

u_1 = K \cdot p_x
u_2 = -K \cdot \cos 60° \cdot p_x + K \cdot \sin 60° \cdot p_y
u_3 = -K \cdot \cos 60° \cdot p_x - K \cdot \sin 60° \cdot p_y.   (8)

The accuracy and reliability of the open-loop control has been extensively described in [3]. In order to support the high update rates delivered by the tracking system, the signal generation has been implemented on a special hardware controller. This controller generates six synchronous, sawtooth-shaped actuation signals with the required amplitudes and frequency. The hardware was specially designed in order to apply new actuation signals as fast as possible. Most parameters, including polarity and amplitude of each channel as well as the step frequency, can be changed within 80 µs. Thus, the controller itself introduces very little latency into the control loop. Via a USB or a CAN interface, control commands or position updates can be received.
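The amplitude mapping of Eq. (8) is straightforward to sketch; note that the first line of (8) is garbled in the source, so u_1 = K · p_x is an assumption here:

```python
import math

def channel_amplitudes(px, py, K=1.0):
    """Three actuation amplitudes for a translational step in the
    direction (px, py); K scales the overall step length."""
    c60, s60 = math.cos(math.radians(60)), math.sin(math.radians(60))
    u1 = K * px
    u2 = -K * c60 * px + K * s60 * py
    u3 = -K * c60 * px - K * s60 * py
    return u1, u2, u3

u = channel_amplitudes(1.0, 0.0)
```

Since cos 60° = 0.5, the three amplitudes always sum to zero, which matches the 120° symmetry of a three-channel actuator arrangement.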

B. Trajectory control

For some applications, closed-loop position control is insufficient as there is no control of the robot's path between start and destination. Machining operations in the macro world, e.g. milling, require a movement along a well-defined path with a well-defined speed. Such a control is called trajectory control. For micro- and nanohandling applications, this can also be required, as even slight deviations from a planned path can lead to collisions.

Fig. 8. a) The 27 piezo segments can be grouped into six channels: b) three signal groups for purely translational movement and c) two signal groups for rotational movement.

Trajectory control for the mobile robot can be implemented using the SEM position tracking approach. For simplicity, this paper only describes the control of linear trajectories, but this approach can be easily extended to perform other sorts of trajectories. A linear trajectory is defined by its start s, destination d and duration T. The trajectory starts at the time t_0 = 0 and finishes at time t_f = T. Thus, at an arbitrary time t_i in this interval, the robot should be located at the position p(t_i):

p(t_i) = s + (d - s) \cdot \frac{t_i}{T}.   (9)

Each tracking update leads to an iteration of the control loop. The control time t_i is determined and the control deviation e_{t_i} is calculated comparing the current tracking result and p(t_i). In order to eliminate this control deviation despite the changing set-point values, the controller requires an integral part. Furthermore, a derivative part can improve the response time. Thus, a PID controller is used to calculate the control value vector d_t:

d_t = K_p \cdot e_{t_i} + K_i \cdot \sum_{t=0}^{t_i} e_t + K_d \cdot (e_{t_i} - e_{t_{i-1}}).   (10)

K_p, K_i and K_d are the coefficients of the three PID controller components. The values d_x and d_y can then be used to calculate the values p_x and p_y:

\alpha = \arctan \frac{d_y}{d_x}   (11)

p_x = \sin \alpha   (12)
p_y = \cos \alpha   (13)

Using p_x and p_y with (8), the control amplitudes for each actuation channel can be calculated. Thus, the robot moves with the maximum step length into the direction given by p_x and p_y. The movement speed is influenced by the actuation frequency, which is chosen proportionally to the absolute control deviation. Because of the excellent frequency response of the employed robot (see [3]), a constant factor K_f is used to translate from a speed in µm/s to a frequency in Hz:

f = K_f \cdot \sqrt{d_x^2 + d_y^2}.   (14)
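Putting Eqs. (9), (10) and (14) together, one control iteration can be sketched as below. The gains are placeholders, and the direction is computed as the normalized control vector, which is equivalent to Eqs. (11)-(13) up to the paper's axis convention:

```python
import math

class TrajectoryController:
    """PID trajectory control for a linear path from s to d in time T
    (2D points as (x, y) tuples); one update() per tracking result."""
    def __init__(self, s, d, T, Kp=1.0, Ki=0.1, Kd=0.05, Kf=10.0):
        self.s, self.d, self.T = s, d, T
        self.Kp, self.Ki, self.Kd, self.Kf = Kp, Ki, Kd, Kf
        self.acc = [0.0, 0.0]    # accumulated deviation (integral part)
        self.prev = [0.0, 0.0]   # previous deviation (derivative part)

    def update(self, t, pos):
        out = []
        for k in range(2):
            ref = self.s[k] + (self.d[k] - self.s[k]) * t / self.T  # Eq. (9)
            e = ref - pos[k]
            self.acc[k] += e
            out.append(self.Kp * e + self.Ki * self.acc[k]
                       + self.Kd * (e - self.prev[k]))              # Eq. (10)
            self.prev[k] = e
        mag = math.hypot(out[0], out[1])
        px, py = (out[0] / mag, out[1] / mag) if mag else (0.0, 0.0)
        return px, py, self.Kf * mag                                # Eq. (14)
```

A robot sitting exactly on the moving set point yields zero deviation and therefore a zero step frequency; any lag produces a direction toward the set point and a frequency proportional to the deviation.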

C. Initialization

While the technique described in Section II can be used to track positions and even contrast and brightness, it cannot be used for object detection. Hence, from an unknown state, the initial position of the tracked objects needs to be found by other means in order to fully automate the positioning system. As performance is a secondary concern during initialization, this can be done by any established technique used for object recognition in SEM images. The current implementation uses cross correlation-based template


matching [4]. Template matching can be used directly or indirectly to find the position of the patterns or objects.

The direct initialization is especially useful for the tracking of micro- or nanoobjects such as the particle in Fig. 3. A template image of such an object is used on images with high magnification. Once the targeted object is found, it can be tracked with the line scan-based tracking. This however is impracticable for small patterns (e.g. 2 µm) that are possibly located anywhere within a large working range (e.g. 1x1 mm²) because imaging such a large area with sufficient resolution for template matching is time consuming.

For such cases, the indirect approach should be employed, capitalizing on the additional information of the relative position between the patterns and larger objects. This approach is especially useful for tools with tracking patterns (compare Fig. 5). Template matching is used to find the tool's position in an image with low magnification. Once the position of the tool is determined, the line scan-based tracking is initialized using the known relative position between the tool and the pattern on it.
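The normalized cross-correlation underlying such template matching can be sketched as follows. This is a minimal, unoptimized illustration (a brute-force scan over all window positions); the function name is ours, and a production system would use an FFT-based or library implementation instead.

```python
import numpy as np

def match_template(image, template):
    """Return ((row, col), score) of the best normalized
    cross-correlation match of template in image."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (w * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

Mean subtraction and normalization make the score invariant to the brightness and contrast of the SEM image, which is why this measure is a common choice for initialization despite its computational cost.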

IV. RESULTS

In order to demonstrate the capabilities of the trajectory control, the nanohandling robot was moved along the edges of a 16 µm square (see Fig. 9). Due to the trajectory control, the robot moves with a constant velocity along each edge. The duration of each linear trajectory was set to 50 ms and is precisely realized. Thus, the robot moves at 320 µm/s, safely remaining within the tracking's velocity limit. The magnification in Fig. 9 shows that the deviation from the square's edge remains below 120 nm. The deviation is caused by external disturbances such as the force exerted by the cabling and particles on the working surface. Considering the high movement speed, however, 120 nm is an acceptable deviation. If more precision is required, the robot has to be moved more slowly to give the control algorithm more time to correct movement deviations. Despite small pauses in each corner, the entire square movement was completed in a little over 200 ms, which is an unprecedented speed for closed-loop SEM-based nanopositioning.
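The constant-velocity setpoint generation for one edge of the square can be sketched as follows, using the figures from the experiment (16 µm edge, 50 ms duration, 1 kHz tracking updates). The function name and the uniform-rate interpolation are our illustration, not the authors' implementation.

```python
def linear_setpoints(start, end, duration_s, rate_hz=1000.0):
    """Interpolate (x, y) setpoints from start to end at the
    tracker's update rate, yielding constant velocity."""
    n = int(duration_s * rate_hz)  # number of control updates
    pts = []
    for i in range(1, n + 1):
        s = i / n  # fraction of the trajectory completed
        pts.append((start[0] + s * (end[0] - start[0]),
                    start[1] + s * (end[1] - start[1])))
    return pts

# One 16 um edge in 50 ms -> 50 setpoints, i.e. 320 um/s.
edge = linear_setpoints((0.0, 0.0), (16.0, 0.0), 0.05)
```

Feeding the controller a time-parametrized setpoint rather than only the final corner position is what enforces the constant edge velocity reported above.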

[Fig. 9 plot data: position [µm] over time [s] for the x and y traces; left panel full square movement (0.2 s to 0.45 s), right panel deviation from the straight line (±0.2 µm).]

Fig. 9. Trajectory control moving the robot along a 16 µm square. left) Each movement takes approx. 50 ms. right) The deviation from the straight line remains below 120 nm.

V. CONCLUSIONS AND FUTURE WORK

This paper describes a complete system for the automated nanopositioning of mobile robots. A line scan-based tracking approach in conjunction with a custom SEM scan generator facilitates high-speed and high-resolution position tracking. A closed-loop trajectory controller enables the reliable and precise positioning of a nanohandling robot. Nanohandling based on the described positioning system with its outstanding performance is a valid candidate for high-throughput industrial manufacturing lines. Although an inherently serial process, nanohandling can extend the capabilities of standard batch processing, e.g. by integrating functional nanocomponents into microelectromechanical systems.

The next step is to extend the tracking and positioning to the third dimension. Currently, two approaches are being investigated. First, the SEM's focus information is used by evaluating the sharpness of the tracked object on each dedicated line scan. Despite the high depth of field of SEM imaging, a resolution of a single micrometer is achievable [9]. Second, the electron beam can be tilted by an additional electromagnetic coil in the vacuum chamber. Thus, the tracking is done from two slightly different angles, making sub-micrometer resolution in depth possible [10].
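The focus-based approach relies on a sharpness measure computed on a single line scan. As a sketch, one common choice is the gradient energy of the intensity profile, normalized by its variance so that the score is robust against contrast changes; this particular metric is our assumption, not necessarily the one used in [9].

```python
import numpy as np

def line_sharpness(intensities):
    """Sharpness score of a 1-D line scan: a well-focused edge
    produces large intensity gradients, hence a higher score."""
    x = np.asarray(intensities, dtype=float)
    g = np.diff(x)                      # first-order intensity gradient
    return (g * g).sum() / (x.var() + 1e-12)
```

Comparing this score across working distances (or across the two tilted-beam views) yields the depth estimate described above.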

REFERENCES

[1] S. Fatikow, V. Eichhorn, T. Wich, H. Hulsen, O. Hanßler, and T. Sievers, "Development of an automatic nanorobot cell for handling of carbon nanotubes," in Proc. IARP - IEEE/RAS - EURON Joint Workshop on Micron and Nano Robotics, Paris, France, October 2006, http://iarp06.robot.jussieu.fr/Papers/Huelsen/.

[2] V. Eichhorn, S. Fatikow, T. Wortmann, C. Stolle, C. Edeler, D. Jasper, O. Sardan, P. Bøggild, G. Boetsch, C. Canales, and R. Clavel, "NanoLab: A Nanorobotic System for Automated Pick-and-Place Handling and Characterization of CNTs," in Proc. of IEEE Int. Conference on Robotics and Automation (ICRA), 2009.

[3] D. Jasper and C. Edeler, "Characterization, optimization and control of a mobile platform," in Proc. of 6th Int. Workshop on Microfactories (IWMF), 2008.

[4] T. Sievers, M. Jahnisch, C. Schrader, and S. Fatikow, "Vision feedback in an automatic nanohandling station inside an SEM," in Proc. of 6th Int. Optomechatronics Conference on Visual/Optical Based Assembly and Packaging, SPIE's Optics East, Oct. 2006.

[5] T. Sievers and S. Fatikow, "Real-time object tracking for the robot-based nanohandling in a scanning electron microscope," Journal of Micromechatronics - Special Issue on Micro/Nanohandling, vol. 3, no. 3-4, pp. 267–284, 2006.

[6] B. E. Kratochvil, L. Dong, and B. J. Nelson, "Real-time Rigid-body Visual Tracking in a Scanning Electron Microscope," The International Journal of Robotics Research, vol. 28, pp. 498–511, 2009.

[7] D. Jasper, "High-speed Position Tracking for Nanohandling inside Scanning Electron Microscopes," in Proc. of IEEE Int. Conference on Robotics and Automation (ICRA), 2009.

[8] B. Chen, Y. Zhang, and Y. Sun, "Active Release of Microobjects Using a MEMS Microgripper to Overcome Adhesion Forces," Journal of Microelectromechanical Systems, vol. 18, no. 3, 2009.

[9] C. Dahmen, "Focus-based depth estimation in the SEM," in Proc. of Int. Symposium on Optomechatronic Technologies, vol. 7266, San Diego, CA, U.S.A., November 2008, p. 72661O.

[10] M. Jahnisch and S. Fatikow, "3D Vision Feedback for Nanohandling Monitoring in a Scanning Electron Microscope," International Journal of Optomechatronics, vol. 1, no. 1, pp. 4–26, 2007.
