
Active Gimbal Control for Surveillance using Small Unmanned Aircraft Systems

William Semke, Jaganathan Ranganathan, and Matthew Buisker

Department of Mechanical Engineering School of Engineering and Mines

University of North Dakota Grand Forks, ND 58202-8359

ABSTRACT

An active gimbal system is developed to allow small fixed wing Unmanned Aircraft Systems (UAS) platforms to both station-keep and estimate accurate position information for targets of interest. Specific targets vary from a stationary point on the ground to aircraft in the national airspace. The system developed to accomplish this at the University of North Dakota is the Precision Orientation Instrumentation for Navigation and Tracking (POINTing) system. To utilize the POINTing system on small UAS platforms, there are inherent challenges in vehicle stability and attitude control due to wind and turbulence to overcome. To combat these effects, passive and active vibration control components are incorporated. Passive vibration control minimizes deflection amplitudes without adding significant weight or volume, and by utilizing actuators both active attitude and vibration control is accomplished. The actuators are small and lightweight, with the ability to operate on fast time-scales and generate significant force and motion. The autonomous control system uses Global Positioning System (GPS), Inertial Measurement Unit (IMU), and other sensor data to determine position and attitude during flight, resulting in control algorithms where the low frequency attitude changes are handled by the gimbal system, while the relatively high frequency vibrations are mitigated using active control.

I. Introduction

The objective of developing the Precision Orientation Instrumentation for Navigation and Tracking (POINTing) system is to allow small fixed wing UAS platforms to both station-keep on targets of interest and estimate accurate position information for those targets. Specific targets of interest vary from a stationary point on the ground (applications such as fire-fighting, surveillance, and atmospheric research) to cooperative and non-cooperative aircraft in the national airspace system for airborne sense and avoid. This concept is illustrated in Figure 1.

Flight Path Defined by GPS Waypoints

Precision Pointing at a Single Location or Accurate Target Tracking

Figure 1. Small UAS POINTing system concept.


While the POINTing system is applicable for use in both fixed wing and rotor aircraft, the initial target platform for development is fixed wing. These unmanned aircraft have a much greater carrying capacity and are better suited to long endurance missions than comparably-sized rotor aircraft. However, fixed wing aircraft have the distinct disadvantage that they are continuously moving and cannot hover in a specific location. By using the POINTing system, a fixed wing aircraft can keep looking at a target as if the sensor were in a fixed location. This will greatly enhance the capability of small UAS platforms in many dual-use applications, including automatic target recognition and tracking, continuous monitoring of identified targets, reconnaissance and surveillance, environmental monitoring for fire-fighting and flood control, and border security.

A small UAS imaging payload has been constructed by the Unmanned Aircraft Systems Engineering team at the University of North Dakota that allows for the manual pan and tilt control of electro-optical (EO) and uncooled thermal infrared (IR) cameras [1-3]. While this allows for limited station-keeping and general surveillance applications, it does not have the ability to watch a specific target over time or provide precise target position information. The first flight of this payload took place on October 25, 2006, in military restricted airspace over Camp Ripley, an Army National Guard facility near Little Falls, Minnesota. The payload was flown on the Lockheed Martin ER (extended range) experimental UAS designed and fabricated in Eagan, Minnesota. Lockheed Martin-Eagan personnel helped with electromagnetic interference (EMI) testing and a fit-check at their facility prior to approval for flight. Figure 2 provides example electro-optical and uncooled thermal infrared images; they were captured during separate flights, so the data is not registered.

Figure 2. Example MPEG-2 video stills captured during the maiden payload flight on October 25, 2006, over Camp Ripley near Little Falls, Minnesota. The left images were captured by a Sony electro-optical (EO) video camera, while the right images were captured using a BAE Systems uncooled thermal infrared (IR) imager.

Since these tests, significant progress has been made on the POINTing system, which is a three-axis gimbal developed for EO/IR digital imaging. Figure 3 shows a CAD drawing of the three-axis gimbal design and implementation, while Figure 4 is a CAD drawing and physical implementation of the sensor payload known as the SUNDOG – Surveillance by University of North Dakota Observational Gimbal. The payload cassette is being designed for flight on the Lockheed Martin Sky Spirit II, a small UAS capable of flying up to 75 pounds of payload. Most of the hardware and software utilized in the payload design is commercial-off-the-shelf (COTS) or share-ware in nature. Figure 5 provides a diagram of the COTS hardware utilized in the payload and ground control station design. The first flight of this payload occurred on August 16, 2007.

Figure 3. POINTing system CAD representation and physical implementation. This three-axis gimbal has the capability of manually pointing a Sony EO block camera and a FLIR/Indigo Systems uncooled thermal IR imager.


Figure 4. A CAD representation and physical implementation of the POINTing system payload cassette for integration into the Lockheed Martin Sky Spirit II. This payload is known as the SUNDOG – Surveillance by University of North Dakota Observational Gimbal.

[Figure 5 diagram: a PC/104+ single-board computer with a dual-channel frame grabber, Sony EO and FLIR uncooled IR cameras (NTSC), 3DM-GX1 and Crista IMUs, a Garmin GPS, three micromotors with controllers (RS-232 serial ports), and an EIDE 2.5" HDD in the payload, linked over WLAN 802.11a/b/g to a ground station with a laptop running FlightGear and a joystick controller; together these form the camera pointing and navigation systems.]

Figure 5. COTS hardware and software diagram for the SUNDOG payload.

This system has the capability for a ground-based payload operator to manually control the pointing of the cameras with a joystick based upon real-time video data received from the EO camera. The next step in the development of the POINTing system is to autonomously control both attitude variations and mechanical vibrations. This phase incorporates Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data to determine position and attitude during flight. This information is used by an autonomous system embedded into the POINTing system to station-keep on a target. The low frequency attitude changes will be handled by the gimbal system, while the relatively high frequency vibrations will be mitigated using active control.

II. Kinematics Analysis for Accurate Gimbal Positioning

A kinematic analysis is done on the system of interest to get the appropriate model of how a gimbal needs to rotate in order to point at a certain location on the ground. In this situation, the system of interest includes an inertial system that has coordinates fixed to the Earth, a coordinate system that is body-fixed to the airplane, and a third coordinate system that is fixed to the gimbal. The end result of the analysis is the set of rotation angles, about each gimbal axis, that will result in the gimbal pointing at the correct spot with a desired orientation. Correct orientation will allow an image to have the same orientation to the ground coordinates independent of the direction of flight. The scenario investigated is when the inertial coordinates of the target and camera are known and the orientation angles are required for accurate pointing.


As stated previously, this system includes three separate coordinate systems. These coordinate systems and their orientations are shown in Figure 6. The orientation for the coordinate systems is arbitrary, but they do need to be defined. In this system the inertial coordinates were defined so that the x-axis is in the North direction, the y-axis is in the East direction, and the z-axis is downward. The airplane coordinates were defined with the x-axis being the same as the heading vector. These two systems are shown in Figure 6. The gimbal system is defined with the x-axis being the tilt axis of the gimbal and initially aligned with the x-axis of the airplane system. The y-axis of the gimbal is the pan axis for the gimbal, as shown in Figure 6. The z-axis of the gimbal is also defined as the line-of-sight axis, which is required to help solve for the pointing and orientation parameters.

Figure 6. Left: orientation of the inertial (X North, Y East, Z down) and airplane-fixed (Xa, Ya, Za) coordinate systems. Right: orientation of the gimbal coordinate system.

The topic of coordinate transforms and the kinematics involved with flight has been well covered in the literature [4-11]. Often it is advantageous to define a system using multiple coordinate systems. The problem with multiple coordinate systems is that vector operations cannot be performed on vectors defined in different coordinate systems; instead, the components of a vector in one coordinate system must be transformed into the other coordinate system. The first transform needed for this analysis is between the inertial system and the airplane-fixed system. For this transform, a 3-2-1 (NASA Standard Airplane) rotation is used. In this situation, ψ is the heading angle, or yaw, and corresponds to a rotation about the z-axis of the airplane; θ is the pitch angle and is a rotation about the y-axis of the airplane; and φ is the roll angle and is a rotation about the x-axis of the airplane. These rotations are shown in Figure 7, and the order of rotation for this transform is ψ, θ, and then φ. The rotation matrix for this type of transform is well known and is shown in Equation 1 as Ra, with c and s denoting cosine and sine.

Figure 7. Plane rotations of ψ, θ, and φ about the z, y, and x axes, respectively.

[R_a] = \begin{bmatrix} c\theta c\psi & c\theta s\psi & -s\theta \\ s\phi s\theta c\psi - c\phi s\psi & s\phi s\theta s\psi + c\phi c\psi & s\phi c\theta \\ c\phi s\theta c\psi + s\phi s\psi & c\phi s\theta s\psi - s\phi c\psi & c\phi c\theta \end{bmatrix}   (1)
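As a quick check, Equation 1 can be implemented directly. This is an illustrative sketch in Python (the project's own tooling was Matlab), verifying that the result is a proper rotation matrix:

```python
import numpy as np

def rot_321(psi, theta, phi):
    """3-2-1 (yaw-pitch-roll) airplane rotation matrix Ra of Equation 1.

    Angles in radians: psi about z, then theta about y, then phi about x."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    return np.array([
        [cy*cz,            cy*sz,            -sy],
        [sx*sy*cz - cx*sz, sx*sy*sz + cx*cz, sx*cy],
        [cx*sy*cz + sx*sz, cx*sy*sz - sx*cz, cx*cy]])

Ra = rot_321(np.radians(30), np.radians(10), np.radians(-5))
# A proper rotation matrix is orthonormal with determinant +1, so its
# inverse is simply its transpose (a fact used later in Equation 12).
assert np.allclose(Ra.T @ Ra, np.eye(3))
assert np.isclose(np.linalg.det(Ra), 1.0)
```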



The next transform needed is from the airplane-fixed coordinates to the gimbal-fixed coordinates. This is done by first rotating about the z-axis by γ, then rotating about the y-axis by β, and then rotating about the x-axis by α. The rotation matrices for these individual rotations are shown in Equations 2-4 as R2g, R3g, and R1g. To find the total rotation matrix, the individual rotation matrices are multiplied together as shown in Equation 5, with the total rotation matrix shown in Equation 6 as Rg.

[R_{2g}] = \begin{bmatrix} 1 & 0 & 0 \\ 0 & c\alpha & s\alpha \\ 0 & -s\alpha & c\alpha \end{bmatrix}, \quad
[R_{3g}] = \begin{bmatrix} c\beta & 0 & -s\beta \\ 0 & 1 & 0 \\ s\beta & 0 & c\beta \end{bmatrix}, \quad
[R_{1g}] = \begin{bmatrix} c\gamma & s\gamma & 0 \\ -s\gamma & c\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}   (2-4)

[R_g] = [R_{1g}][R_{3g}][R_{2g}]   (5)

[R_g] = \begin{bmatrix} c\beta c\gamma & s\alpha s\beta c\gamma + c\alpha s\gamma & s\alpha s\gamma - c\alpha s\beta c\gamma \\ -c\beta s\gamma & c\alpha c\gamma - s\alpha s\beta s\gamma & s\alpha c\gamma + c\alpha s\beta s\gamma \\ s\beta & -s\alpha c\beta & c\alpha c\beta \end{bmatrix}   (6)

The order of rotation used for the Ra matrix is arbitrary, but using a different order will result in a different rotation matrix. The order of rotation for the Rg matrix will depend on the system setup; other systems may have different locations for the drive motors of each axis, or they may move in a different order than what was used here. Whatever the order, these rotation matrices define the crucial relationship for transforming between coordinate systems.
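As a consistency check on Equations 2-6, the composition of the three elementary rotations can be sketched in Python (an illustration; the project's own tooling was Matlab and Mathematica):

```python
import numpy as np

def r1g(gamma):
    """Rotation about the gimbal z-axis (applied first)."""
    c, s = np.cos(gamma), np.sin(gamma)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def r3g(beta):
    """Rotation about the gimbal y-axis (applied second)."""
    c, s = np.cos(beta), np.sin(beta)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def r2g(alpha):
    """Rotation about the gimbal x-axis (applied last)."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

def rg(alpha, beta, gamma):
    """Total gimbal rotation matrix, Equation 5."""
    return r1g(gamma) @ r3g(beta) @ r2g(alpha)

# Spot-check against Equation 6: the third row of Rg is
# [sin(beta), -sin(alpha)cos(beta), cos(alpha)cos(beta)] and carries no
# gamma dependence, since gamma only spins the camera about its own z-axis.
R = rg(0.2, 0.3, 0.4)
expected = [np.sin(0.3), -np.sin(0.2)*np.cos(0.3), np.cos(0.2)*np.cos(0.3)]
assert np.allclose(R[2], expected)
```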

III. Gimbal Pointing Angles for Known Target Locations

In many remote sensing applications a gimbal system needs to point a sensor at a given target location. To do this, the proper rotation angles (α, β, γ) must be given to the gimbal system so it can accurately locate the target. In this section an algorithm is developed to find these rotation angles. The inputs into the system are the plane location in inertial coordinates (GPS information), the plane attitude (roll, pitch, and yaw), the target location in inertial coordinates, and the offset distance of the gimbal from the GPS receiver on the plane. If this offset is not included or is not known, there will be errors in the angles of rotation given to the gimbal. The analysis for finding the rotation angles is done by constraining the z-axis of the gimbal system to lie along the vector from the gimbal location to the target location. The first step in finding the rotation angles is to define the known locations in the form of vectors. In Equations 7-9, the P vector is the position of the plane in inertial coordinates, the T vector is the position of the target in inertial coordinates, and the G0 vector is the offset of the gimbal in airplane-fixed coordinates.

\vec{P} = \{x_p, y_p, z_p\}^T, \quad \vec{T} = \{x_t, y_t, z_t\}^T, \quad \vec{G}_0 = \{x_{go}, y_{go}, z_{go}\}^T   (7-9)

The location of the gimbal needs to be expressed in inertial coordinates to define the line of sight vector. To do this, the inertial coordinates of the plane must first be transformed into the body-fixed coordinate system of the airplane. This is done by multiplying the inertial coordinates by the rotation matrix, as shown in Equation 10.

\{P_a\} = [R_a]\{P\}   (10)

Once this is done, the gimbal offset, which is already in body-fixed coordinates, can be subtracted from the plane coordinates to give the actual location of the gimbal in body-fixed coordinates (Equation 11).

\{G_a\} = \{P_a\} - \{G_0\}   (11)

Now the gimbal location needs to be transformed back into inertial coordinates so the vector between the gimbal and target can be defined. This is done by multiplying the gimbal location by the inverse of the airplane rotation matrix, as shown in Equation 12.

\{G\} = [R_a]^{-1}\{G_a\}   (12)

The line of sight vector is then defined as the difference between the gimbal location and the target location, both in inertial coordinates.


\vec{D} = \{T\} - \{G\} = \{\Delta x, \Delta y, \Delta z\}^T   (13)
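Equations 10-13 can be sketched as follows (an illustrative Python version with made-up attitude values; the flight code used Matlab):

```python
import numpy as np

def rot_321(psi, theta, phi):
    """Airplane rotation matrix Ra (Equation 1); angles in radians."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    return np.array([
        [cy*cz,            cy*sz,            -sy],
        [sx*sy*cz - cx*sz, sx*sy*sz + cx*cz, sx*cy],
        [cx*sy*cz + sx*sz, cx*sy*sz - sx*cz, cx*cy]])

def line_of_sight(P, T, G0, psi, theta, phi):
    """Line-of-sight vector D in inertial coordinates (Equations 10-13)."""
    Ra = rot_321(psi, theta, phi)
    Pa = Ra @ P          # Eq. 10: plane position in body-fixed coordinates
    Ga = Pa - G0         # Eq. 11: subtract the body-fixed gimbal offset
    G = Ra.T @ Ga        # Eq. 12: back to inertial (Ra^-1 = Ra^T)
    return T - G         # Eq. 13

# With zero gimbal offset the gimbal coincides with the plane position,
# so D reduces to T - P regardless of attitude.
P, T = np.array([0.0, 0.0, 2.0]), np.array([0.0, 3.0, 0.0])
D = line_of_sight(P, T, np.zeros(3), 0.1, 0.05, -0.02)
assert np.allclose(D, T - P)
```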

The line of sight vector then needs to be transformed into the body-fixed coordinates of the plane. This is done by using the airplane rotation matrix, as shown in Equation 14.

\vec{D}_a = [R_a]\{D\} = \{\Delta x_a, \Delta y_a, \Delta z_a\}^T   (14)

The final rotation is to transform the pointing vector into the gimbal-fixed coordinate system. The angles needed for this rotation are the pointing angles for the gimbal. They are therefore unknown, but, by using specific constraints, the equations from the rotation will provide the pointing angles. The transformation is shown in Equation 15.

\vec{D}_g = [R_g]\{D_a\} = \{\Delta x_g, \Delta y_g, \Delta z_g\}^T, with components

\Delta x_g = c\beta c\gamma\,\Delta x_a + (s\alpha s\beta c\gamma + c\alpha s\gamma)\Delta y_a + (s\alpha s\gamma - c\alpha s\beta c\gamma)\Delta z_a
\Delta y_g = -c\beta s\gamma\,\Delta x_a + (c\alpha c\gamma - s\alpha s\beta s\gamma)\Delta y_a + (s\alpha c\gamma + c\alpha s\beta s\gamma)\Delta z_a
\Delta z_g = s\beta\,\Delta x_a - s\alpha c\beta\,\Delta y_a + c\alpha c\beta\,\Delta z_a   (15)

The constraint that the z-axis of the gimbal must fall along the line of sight vector is now used to help solve for the unknown variables α, β, and γ. If the line of sight vector and the z-axis of the gimbal system lie along the same vector, then, when the line of sight vector is expressed in terms of the gimbal coordinate system, it will not have any components in the x or y directions of the gimbal system. This provides the following two equations (Equations 16-17).

\Delta x_g = 0, \quad \Delta y_g = 0   (16-17)

The relationships provided in Equations 16 and 17 hold true, but they are only two equations in three unknowns. The third equation comes from the orientation relations shown in Equations 18-22. This additional constraint must be defined for the third rotation to provide the proper orientation of the camera. The constraint comes from the fact that the unit vector in the x-direction of the gimbal system needs to be perpendicular to the inertial direction used to orient the camera. For example, if the top of the camera image needs to always point North, then the x unit vector of the gimbal system needs to always be perpendicular to the North direction, or in this case the inertial x-axis. This leads to the constraint that the dot product between the unit vector in the gimbal system and the orientation vector in the inertial system must be zero. This relationship is shown symbolically to give the third constraint; subscripts g and i represent which coordinate system the vectors are in, gimbal or inertial, respectively.

\hat{x}_g \cdot \hat{i} = 0, \quad \hat{x}_g = a\hat{I} + b\hat{J} + c\hat{K}, \quad \hat{i} = \hat{I}, \quad \hat{x}_g \cdot \hat{i} = a = 0   (18-22)

The x unit vector in the gimbal system needs to be transformed into the inertial coordinate system so the dot product of the two vectors can be calculated to find a. This is done by multiplying the unit vector by the inverse of the total rotation matrix, as demonstrated in Equations 23-25.

[R_T] = [R_g][R_a]   (23)

\vec{x}_g = \{1, 0, 0\}^T   (24)

[R_T]^{-1}\{x_g\} = \{a, b, c\}^T   (25)


The line of sight lies along the z-axis of the gimbal, which makes the x and y axis components zero when pointing at the desired target. Applying this constraint (Equations 16-17) to Equation 15 gives two equations in the three unknown variables. The third equation used for solving the three unknowns (α, β, γ) comes from the first row of Equation 25, since the value of a is zero.

The nonlinear set of equations (Equations 26-28) determines the correct pointing angles for this gimbal system. Once the equations are solved, the three rotation angles can be used in a controller to provide commands to the gimbal system.

c\beta c\gamma\,\Delta x_a + (s\alpha s\beta c\gamma + c\alpha s\gamma)\Delta y_a + (s\alpha s\gamma - c\alpha s\beta c\gamma)\Delta z_a = 0   (26)

-c\beta s\gamma\,\Delta x_a + (c\alpha c\gamma - s\alpha s\beta s\gamma)\Delta y_a + (s\alpha c\gamma + c\alpha s\beta s\gamma)\Delta z_a = 0   (27)

a(\alpha, \beta, \gamma, \psi, \theta, \phi) = 0   (28)

where Equation 28 is the first row of Equation 25 expanded in terms of the attitude and gimbal angles.

When solving the set of three equations (Equations 26-28) in Mathematica, a set of 64 closed form expressions for α, β, and γ (the gimbal rotations) is found, as shown in Equations 29-31. When the 64 closed form solutions are substituted into Equations 26-28 and evaluated in Matlab, the equations should give zero; only 8 solutions satisfy this condition. The angular limitations of the gimbal system are ±25 degrees about the x-axis and ±75 degrees about the y-axis, while there is no limitation about the z-axis. The physical structure of the camera used in the gimbal system, along with the wiring, is the reason for the angular restrictions; the gimbal is free to rotate about the z-axis through the use of a slip ring. By applying these angular restrictions in a Matlab program, the 8 sets of expressions are reduced to 1 or 2 solutions, of which one is chosen to point/track the target.

\beta = \pm ArcCos\left[\pm\sqrt{(\Delta y_a^2 + \Delta z_a^2)/(\Delta x_a^2 + \Delta y_a^2 + \Delta z_a^2)}\right]   (29)

\alpha = \pm ArcCos\left[\pm\Delta z_a/\sqrt{\Delta y_a^2 + \Delta z_a^2}\right]   (30)

\gamma = \pm ArcCos\left[\pm a/b\right]   (31)

where a and b are lengthy closed-form functions of the line-of-sight components (Δxa, Δya, Δza) and the attitude angles (ψ, θ, φ) produced by the Mathematica solution.
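The closed-form angles can be sketched end-to-end in Python. This is an illustrative reformulation, not the flown Matlab code: arctan2 is used in place of the paper's ±ArcCos branches (Equations 29-31), which selects the correct branch automatically, and γ is obtained by zeroing the North component a of the gimbal x-axis (Equations 18-25) directly.

```python
import numpy as np

def rot_321(psi, theta, phi):
    """Airplane rotation matrix Ra (Equation 1); angles in radians."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    return np.array([
        [cy*cz,            cy*sz,            -sy],
        [sx*sy*cz - cx*sz, sx*sy*sz + cx*cz, sx*cy],
        [cx*sy*cz + sx*sz, cx*sy*sz - sx*cz, cx*cy]])

def rg(alpha, beta, gamma):
    """Gimbal rotation matrix (Equation 6)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    return np.array([
        [cb*cg,  sa*sb*cg + ca*sg, sa*sg - ca*sb*cg],
        [-cb*sg, ca*cg - sa*sb*sg, sa*cg + ca*sb*sg],
        [sb,     -sa*cb,           ca*cb]])

def pointing_angles(D, psi, theta, phi):
    """Solve Equations 26-28 for (alpha, beta, gamma).

    D is the inertial line-of-sight vector (Equation 13)."""
    Ra = rot_321(psi, theta, phi)
    dxa, dya, dza = Ra @ D                       # Equation 14
    beta = np.arctan2(dxa, np.hypot(dya, dza))   # cf. Equation 29
    alpha = np.arctan2(-dya, dza)                # cf. Equation 30
    # Orientation constraint: a = (Rg Ra)[0, 0] must vanish. That entry
    # is linear in cos(gamma) and sin(gamma), so gamma follows directly.
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    u = np.array([cb, sa*sb, -ca*sb])   # cos(gamma) part of Rg row 1
    v = np.array([0.0, ca, sa])         # sin(gamma) part of Rg row 1
    gamma = np.arctan2(-(u @ Ra[:, 0]), v @ Ra[:, 0])
    return alpha, beta, gamma

# Bench geometry from Section V: target (0,3,0), plane (0,0,2), zero offset.
psi, theta, phi = np.radians([5.3, -0.2, -178.0])
D = np.array([0.0, 3.0, -2.0])                   # T - G (Equation 13)
al, be, ga = pointing_angles(D, psi, theta, phi)
Dg = rg(al, be, ga) @ (rot_321(psi, theta, phi) @ D)
# Equations 16-17: no x or y component in gimbal coordinates.
assert np.allclose(Dg[:2], 0.0, atol=1e-9)
assert np.isclose(Dg[2], np.linalg.norm(D))
# Equation 28: North component of the gimbal x-axis is zero.
assert np.isclose((rg(al, be, ga) @ rot_321(psi, theta, phi))[0, 0], 0.0, atol=1e-9)
```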

IV. Gimbal Motion Control Process

The software used to control the gimbal motion/rotation is the 3DM-GX1 Data Acquisition & Display software, Matlab, and the Faulhaber Motion Manager. Three MCDC 3006S Faulhaber MicroMo Electronics motor controllers and three 1524A0129 Faulhaber MicroMo Electronics DC micromotors are used in the gimbal system (Figure 3). The 3DM-GX1 Data Acquisition & Display software is used to acquire and store the attitude information (psi, phi, and theta) in a .CSV file every second. Using the attitude information from the .CSV file, the closed form expressions (Equations 29-31) derived in Mathematica are used in Matlab to determine the gimbal rotations (alpha, beta, and gamma). The gimbal rotation values are then stored in a text file. A VBScript program was developed in house to rotate the motors along the three axes using the gimbal rotation values from the text file. It uses the angle information evaluated by Matlab with the closed form expressions and converts it to motor controller parameters.
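The pipeline above can be sketched as a single Python script. The CSV column names and the output format are assumptions for illustration; the actual 3DM-GX1 log layout and the units expected by the VBScript motor driver may differ.

```python
import csv
import io
import numpy as np

def rot_321(psi, theta, phi):
    """Airplane rotation matrix Ra (Equation 1); angles in radians."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    return np.array([
        [cy*cz,            cy*sz,            -sy],
        [sx*sy*cz - cx*sz, sx*sy*sz + cx*cz, sx*cy],
        [cx*sy*cz + sx*sz, cx*sy*sz - sx*cz, cx*cy]])

def pointing_angles(D, psi, theta, phi):
    """Gimbal rotations (alpha, beta, gamma); arctan2 form of Eqs. 29-31."""
    Ra = rot_321(psi, theta, phi)
    dxa, dya, dza = Ra @ D
    beta = np.arctan2(dxa, np.hypot(dya, dza))
    alpha = np.arctan2(-dya, dza)
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    u = np.array([cb, sa*sb, -ca*sb])   # cos(gamma) part of Rg row 1
    v = np.array([0.0, ca, sa])         # sin(gamma) part of Rg row 1
    gamma = np.arctan2(-(u @ Ra[:, 0]), v @ Ra[:, 0])
    return alpha, beta, gamma

def process_attitude_log(infile, outfile, target, plane):
    """Read per-second attitude samples from a CSV stream, convert each to
    gimbal rotations, and write a text line per sample for the motor driver."""
    D = np.asarray(target, float) - np.asarray(plane, float)  # zero gimbal offset
    for rec in csv.DictReader(infile):
        psi, theta, phi = (np.radians(float(rec[k])) for k in ("psi", "theta", "phi"))
        angles = pointing_angles(D, psi, theta, phi)
        outfile.write(" ".join(f"{np.degrees(a):.3f}" for a in angles) + "\n")

# Two attitude samples matching the Section V experiment.
log = io.StringIO("psi,phi,theta\n5.3,-178,-0.2\n5.6,175,-0.2\n")
out = io.StringIO()
process_attitude_log(log, out, target=(0, 3, 0), plane=(0, 0, 2))
assert len(out.getvalue().splitlines()) == 2   # one alpha/beta/gamma line each
```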


V. Experimental Setup

An experiment was conducted to verify the tracking capability of the gimbal. A laser pointer was attached to the camera to track the pointing location of the system. Table 1 shows a comparative study of accuracy measurements for different sets of phi, theta, and psi values. The accuracy obtained from the experiment shows that the set of three equations (Equations 26-28) has the potential to track the target as the attitude information changes. The experimental setup for case 2 is shown in Figure 8. The experiment was conducted by fixing the target and plane locations at (0,3,0) and (0,0,2), respectively, and varying the attitude information (psi, phi, theta). The initial target location is (0,3,0), the plane location is (0,0,2), and the initial IMU information (psi, phi, theta) is 5.3, -178, -0.2 degrees, respectively. When the SUNDOG payload is rotated, the IMU information changes, in this case to psi = 5.6, phi = 175, theta = -0.2 degrees, and the camera points to a new location. Using the target location, plane location, and IMU information in Matlab, the closed form expressions (Equations 29-31) give the gimbal rotations (alpha, beta, gamma), which are then sent to the controllers to rotate the motors along the three axes and align the camera to the desired target. The final image shows the corrections made by the program using the closed form expressions.

Figure 8. Experimental setup for case 2 (target location (0,3,0), plane location (0,0,2), gimbal offset (0,0,0)).

Table 1. Experimental results for four test cases showing pointing errors and a complete end-to-end correction algorithm (target location (0,3,0), plane location (0,0,2), gimbal offset (0,0,0)). All angles in degrees.

CASE  PSI    PHI    THETA  | ALPHA    BETA     GAMMA   | DIST. MOVED H/V (cm) | ERROR H/V (cm) | ACCURACY %

Change in phi values:
1     1.4    -178   -0.6   | -1.524   -58.299  88.785  |                      |                |
      0.2    174.3  -1.7   | -1.0975  -50.619  88.657  | 57.785 / 15          | 5.7 / 6.35     | 85.22
2     5.3    -178   -0.2   | -4.6     -57.89   87.148  |                      |                |
      5.6    175    -0.2   | -4.5336  -52.071  86.106  | 104.775 / 4.445      | 4.445 / 3.81   | 94.42

Change in psi values:
3     6      -178   0.4    | -5.3354  -58.358  86.708  |                      |                |
      13.2   -178   -0.5   | -11.467  -57.485  82.9    | 44.45 / 7.62         | 0.635 / 8.89   | 80.24
4     6.4    -178   0      | -5.4367  -58.049  86.8    |                      |                |
      21     -178   -0.2   | -17.883  -56.604  79.109  | 89.535 / 22.86       | 7.62 / 10.795  | 85.7

AVERAGE = 86.4%
NOTE: Accuracy is defined as the ability to track/point the initial target location from the new target location which occurred due to the change in attitude information.


VI. Numerical Experiments

Apart from the physical experiments, a number of numerical trials were conducted to verify the tracking capability and the continuity of the gimbal rotations (alpha, beta, gamma). Figures 9-12 show the comparison between the experimental and simulated changes in attitude information (psi only in this case) and their corresponding gimbal rotations, with the plane and target locations fixed. The gimbal rotations for both the experimental and simulated attitude information follow similar paths, which also indicates that the closed form expressions derived (Equations 29-31) hold true and have the potential to track the target continuously, at least over a limited rotation range. There are discontinuities for rotations beyond this limited range.

Figure 9-10. Experimental attitude information and its corresponding gimbal rotations.

Figure 11-12. Simulated attitude information and its corresponding gimbal rotations.


Figures 13-16 show the experimental and simulated attitude information and the corresponding gimbal rotation values. Here the psi and theta values are kept approximately the same for each count while the phi value is changed continuously with respect to the count. From Figures 14 and 16, the gimbal rotation values are approximately the same and follow similar paths for both the experimental and simulated attitude information (psi, phi, theta). Hence, Figures 13-16 show that when the phi value changes continuously with respect to count, the experimental and simulated values behave in a similar manner and are continuous over the entire range tested.

Figure 13-14. Experimental attitude information and its corresponding gimbal rotations.

Figure 15-16. Simulated attitude information and its corresponding gimbal rotations.

When the attitude information (psi, phi, theta) is changed, the corresponding gimbal rotations in the experimental and simulated setups behave similarly. The gimbal rotations are found to be continuous when theta is changed from -30 to +35 degrees (shown in Figures 17-20), while keeping phi and psi at constant values. Outside this range discontinuities occur.


Figure 17-18. Experimental attitude information and its corresponding gimbal rotations.

Figure 19-20. Simulated attitude information and its corresponding gimbal rotations.

VII. Sources of Error

The experimental and simulated attitude information and the corresponding gimbal rotations (Figures 9-20) show a discontinuity in the gimbal rotations (alpha, beta, gamma) when the theta and psi values are greater than ±30 degrees. The discontinuity may be minimized or eliminated by utilizing the IMU information in quaternion form. Therefore, the gimbal rotation expressions (Equations 29-31) derived from the three sets of equations (Equations 26-28) hold true and give continuous gimbal rotations if the theta and psi values are limited to less than ±30 degrees. It is also to be noted that even when the theta and psi values are greater than ±30 degrees, the gimbal system will point/track the correct target location, but there will be a discontinuity in the gimbal rotations at one point, after which they act in a piecewise continuous fashion.

In the experiments conducted there is about 5-15% error in the final target tracking offset. Practically, the inability to rotate the payload about its center of gravity and not fixing the plane location exactly at (0,0,2) at all times during the

Page 12: Active Gimbal Control for Surveillance Using Small ...€¦ · Active Gimbal Control for Surveillance using Small Unmanned Aircraft Systems William Semke, Jaganathan Ranganathan,

experiment, errors and noise in the IMU information, and not taking the gimbal offset into account are the potential sources of error. If the experiment were conducted with the above-mentioned error sources eliminated, the target offset would approach zero. No filtering or processing of the IMU data was performed in the experiments conducted; the use of a Kalman filter and/or additional preprocessing of the input data may prove beneficial as well.

VIII. Conclusions

An end-to-end system that uses closed-form analytical expressions to determine gimbal pointing angles from GPS and IMU information has been demonstrated to be effective. This system allows accurate pointing of a camera from a UAS with minimal computational time. While many other algorithms require complex solution strategies to solve for the pointing angles, the method presented here uses simple algebraic calculations to establish the proper pointing and orientation of the camera. The algorithm described is to be used on the SUNDOG payload to aid in the POINTing system at the University of North Dakota.

IX. Acknowledgments

This research was supported in part by Department of Defense contract number FA4861-06-C-C006, "Unmanned Aerial System Remote Sense and Avoid System and Advanced Payload Analysis and Investigation," the North Dakota Department of Commerce, "UND Center of Excellence for UAV and Simulation Applications," and NASA Grant NNG05WC01A, "National Suborbital Education and Research Center." The authors would also like to acknowledge the contributions of the Unmanned Aircraft Systems Engineering (UASE) team at UND.
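The Sources of Error discussion notes that carrying the IMU attitude in quaternion form can remove the Euler-angle discontinuity. A minimal sketch of that idea (hypothetical Z-Y-X angle convention and function names, not the authors' implementation): the quaternion and the boresight vector it rotates both vary smoothly even when theta sweeps well past ±30 degrees.

```python
import math

def quat_from_euler(psi, phi, theta):
    """Unit quaternion (w, x, y, z) from yaw (psi), roll (phi),
    pitch (theta) in radians, assuming a Z-Y-X rotation order.
    The paper's convention may differ."""
    cy, sy = math.cos(psi / 2), math.sin(psi / 2)
    cp, sp = math.cos(theta / 2), math.sin(theta / 2)
    cr, sr = math.cos(phi / 2), math.sin(phi / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)

def rotate(q, v):
    """Rotate vector v by quaternion q (expanded q * v * q_conj)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (u x v), then v' = v + w*t + u x t
    tx, ty, tz = 2 * (y * vz - z * vy), 2 * (z * vx - x * vz), 2 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)

# Sweep theta from -90 to +90 degrees: the rotated boresight vector
# changes by a small, bounded amount each step -- no branch cut.
prev, max_step = None, 0.0
for deg in range(-90, 91, 5):
    los = rotate(quat_from_euler(0.0, 0.0, math.radians(deg)), (1.0, 0.0, 0.0))
    if prev is not None:
        max_step = max(max_step, math.dist(prev, los))
    prev = los
print(max_step)  # ~0.087 for 5-degree steps: smooth everywhere
```

Gimbal servo commands can then be generated incrementally from the continuously evolving quaternion rather than from branch-cut-prone closed-form angle extraction.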

X. References

[1] Lendway, M., Berseth, B., Trandem, S., Schultz, R., and Semke, W., "Integration and Flight of a University-Designed UAV Payload in an Industry-Designed Airframe," Proceedings of the Association for Unmanned Vehicle Systems International, 2007.

[2] Lendway, M., Berseth, B., Martel, F., Trandem, S., and Anderson, K., "A University-Designed Thermal-Optical Imaging Payload for Demonstration in a Small Experimental UAS," AIAA Infotech@Aerospace, 2007.

[3] Semke, W., Schultz, R., Dvorak, D., Trandem, S., Berseth, B., and Lendway, M., "Utilizing UAV Payload Design by Undergraduate Researchers for Educational and Research Development," to appear in Proceedings of ASME IMECE, IMECE2007-43620, November 2007.

[4] Buisker, M., "Statistically Significant Factors that Affect the Pointing Accuracy of Airborne Remote Sensing Payloads," M.S. Thesis, Mechanical Engineering, University of North Dakota, May 2007.

[5] Baruh, H., Analytical Dynamics, WCB/McGraw-Hill, 1999.

[6] Philips, W.F., Hailey, C.E., and Gebert, G.A., "Review of Attitude Representations Used for Aircraft Kinematics," Journal of Aircraft, Vol. 38, Issue 4, July 2001, pp. 718-737.

[7] Hughes, P.C., Spacecraft Attitude Dynamics, Dover Publications, Inc., Mineola, New York, 2004.

[8] Hilkert, J.M., "Kinematic Algorithms for Line-of-Sight Pointing and Scanning using INS/GPS Position and Velocity Information," Proceedings of SPIE, Vol. 5810, pp. 11-22.

[9] Weiss, H., "Quaternion-Based Rate/Attitude Tracking System with Application to Gimbal Attitude Control," Journal of Guidance, Control, and Dynamics, Vol. 16, No. 4, 1993.

[10] Quigley, M., Goodrich, M.A., Griffiths, S., Eldredge, A., and Beard, R.W., "Target Acquisition, Localization, and Surveillance Using a Fixed-Wing Mini-UAV and Gimbaled Camera," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2005), 18-22 April 2005, pp. 2600-2605.

[11] Yoon, S., and Lundberg, J.B., "Equations of Motion for a Two-Axes Gimbal System," IEEE Transactions on Aerospace and Electronic Systems, Vol. 37, Issue 3, July 2001, pp. 1083-1091.