
  • 7/29/2019 Final Report KC1 MK1_test

    1/24

    UNIVERSITY OF CALIFORNIA, DAVIS

    UAV Camera Stabilization

for Multispectral Crop Imaging

MAE 276 Final Project

    Kevin Brouwers, Kellen Crawford, Matthew Klein

    3/18/2013

    Spectral imaging of plants can allow one to establish a relationship between the leaf temperature

    (gathered via proximal infrared sensing) and the Stem Water Potential (SWP). Using this information

    one may be able to analyze the water demand of a field of crops. Dr. Shrinivasa Upadhyaya's research

    group in the Biological Systems Engineering Department at UC Davis has developed a UAV outfitted with

    a spectral camera to allow for remote collection of spectral images for near real-time analysis of crop

    irrigation demand. Here a team of three students from Dr. Michael Hill's Data Acquisition and Analysis

    course has experimentally investigated the current issues with the camera stabilization control

    algorithms in order to maximize the amount of useable images being captured.


    INTRODUCTION

    VISION:

    Technological advancements continue to increase productivity and efficiency across the

    commercial market. Humans now can do more and produce more with fewer resources

    than ever before. Agriculture is one such field which, while generally slower to respond to

    technology, has both incredible potential and necessity for advancements in this area.

    Though many farmers are portrayed as stubborn and resistant to change, advancements

    such as GPS-operated combines, a myriad of fruit harvesters, and a host of other

    implements replacing human hands have been wholeheartedly embraced. As a result, the

    agricultural sector continues to feed an exponentially increasing population with fewer and

fewer hands dedicated to that field. Current research and development is paving the way for the next groundbreaking advancement in agriculture: the many applications of unmanned

aerial vehicles (UAVs). UAVs are notorious for their surveillance capabilities, and have

    drawn plenty of criticism for their application in that field domestically. Not surprisingly,

    the Federal Aviation Administration (FAA) has placed vast restrictions on the operation of

    UAVs in U.S. airspace. A higher awareness about the potential applications of UAVs,

    however, has begun to open some of that airspace, and the agricultural sector will be one of

the biggest beneficiaries.

Much of the current information in the agricultural database, including domestic crop acreage, productivity, irrigation resources, and the effect of weather patterns on our crops

    comes from remote sensing data derived from either satellites or high-altitude aircraft. Such

    assets depend largely on multispectrometers to measure reflectance data in the visible and

    infrared spectra. Simple indices derived from ratios of different reflectance bands contain

    much more information about the status of that crop than any visual inspection ever could.

    In its current utilization, however, this information is at too large of a scale, too infrequent,

    and too dependent on weather conditions for most farmers to act on. By employing this

    multispectral capability on a small, easily operable UAV, this wealth of information can be

    utilized by a vast array of farmers at a very precise scale, with real-time information,

    opening up the possibility of enormous applications in precision agriculture.

    BACKGROUND:


    Much of the work in this field is being done at the University of California at Davis, under

    the direction of Dr. Shrini Upadhyaya. The current model under development is an

    octocopter, about two feet in diameter, with a multispectral camera mounted on a platform

    below the guts of the aircraft, as pictured in Figures 1 and 2.

    Figure 1 - UC Davis UAV

    Figure 2 - Camera platform

    The platform has two axes of rotation, pitch and roll, and is designed to maintain a constant

    attitude, pointed straight down to the ground, independent of the attitude of the UAV frame.

    It does so utilizing an inertial measurement unit (IMU), in this case an ArduIMU, version 3.

    The IMU has its own gyroscopes and accelerometers, in three axes, which feed into the

    software-driven control loop.

    The UAV is designed to simply fly over a field of interest, using pre-programmed GPS

coordinates, and take images every couple of seconds for the duration of the flight. After


    the flight, all of the images (typically a set of several hundred) are fed into a software image-

    stitching program to create a mosaic image of the entire field. Before these images are used

    in the mosaic, however, a lab technician must go through and examine each image and toss

    out any that are distorted. The lab tech must do this, because many of the images, around

    20% in most sets, are distorted in a way that warps what the field actually looks like. A

    typical distorted image is depicted in Figure 3.

    Figure 3 - Typical distorted image

    If the above image were to be fed into the mosaic, it would end up distorting the entire

    mosaic. The analysis of these images relies heavily on the spatial information of the

vegetation, so distortions such as this one can't be tolerated. The result is a lab technician

    spending hours sifting through hundreds of images obtained during a ten-minute flight, a

    very inefficient use of time.

    PROPOSAL:

    The purpose of this inquiry is to determine the cause of these distorted images and to fix it if

    possible. This will save hours of post-processing time and allow for a much more

    streamlined analysis of the images, resulting in more immediate feedback to the end user.

    The desired final product will be a system as close to the original as possible, with the

distorted image issue resolved. The end product needs to be as similar to the original as

possible for a smooth transition back to the end user, who will be using this system again

very shortly to take more images during this year's growing season.

    APPROACH:


Going into this project, the only thing the team knew about the problem was that

    some of the pictures taken during these flights are distorted. Upon inspection of the setup,

there are three possible sources of the distortion: either the camera is faulty and does not

    properly scan some images; the flight conditions are simply too dynamic, with the entire

    airframe being moved during the brief time of exposure; or the control loop of the camera

    platform is inadequate. The control loop contains both hardware components and a

software proportional-integral (PI) control code.

Roughly 20% of the previous year's images were characterized as distorted. A more

    concrete rate of distortion is unknown, because the user simply tossed out the distorted

    images without making a detailed record of each set of data. As a result, the team will have

to first quantify the problem being addressed by establishing a distortion rate for the

    original setup. Even before that can happen, however, the team must identify the conditions

    under which the distorted images appear. If the distortion can be reproduced in a lab

    environment, it will allow for much closer control of variables affecting the images.

    After the problem is quantitatively defined, each potential source of the distortion will be

    isolated following the basic logic illustrated in Figure 4.

    Figure 4 - Approach to determining source of distortion

In addition to the experimental investigation to be conducted in the lab under static

conditions, a dynamic system model will be developed to study how the physical

design parameters affect the camera stabilization quality. It was observed that the roll axis is

    of primary interest here due to some resonance between the Camera/Landing Gear body

    and the upper UAV body through the bushings that mount those two systems together.

Model development is critical to creating a controller for dynamic systems. A model allows

for controller design through simulations that can be performed very quickly, making

efficient use of resources by minimizing design time and the cost of building

prototypes. Here a multi-body dynamic model was developed to model the roll and pitch

control of the camera gimbal. In the interest of saving time, two planar motion models were

developed to study the dynamics of the roll and pitch control independently,

as opposed to creating a three-dimensional model. The servo that controls the pitch angle

    of the camera is a direct drive connection and is applied about the actual pitch axis of the

camera's inertia. The roll control servo, on the other hand, controls the gimbal at a fixed

distance from the roll axis of the camera's inertia, and this causes a reaction moment that is

    felt back at the upper UAV body. This must go through the bushings that support the

    Landing Gear Body-to-Main UAV body attachment. Due to the low stiffness/damping of

    these bushings it has been observed that a particular control input can cause a resonance

    between the UAV body and the Landing Gear Body. Attention will be paid to study the effect

    of changing the bushing stiffness/damping in order to improve the controller performance.

    Additionally, a design revision could be made in order to have the roll servo acting on the

camera's roll axis inertia directly, as opposed to being offset as it is here, which could create

    a performance improvement while allowing for the bushings to be kept in order to provide

    the intended cushioning. Figure 5 portrays the roll axis model that was developed and how

    the physical system was reduced to a simple multi-body system of three main components.


    Figure 5 - Roll axis model development. Overlaid on the actual UAV studied here.

Figure 5 is a sketch of the model developed for the roll axis. System parameters such as

    distances, masses and moments of inertia are applied. They were measured by creating a

CAD model of the pertinent components. 3-D modeling software commonly allows a

material density to be applied to each component, which allowed us to procure the masses

and moments of inertia from the CAD model. In creating the CAD model, all necessary model

dimensions were obtained as well.

    There are three main bodies in this model and they include: 1) the UAV body, which has a

    prescribed input angular velocity; 2) the Landing Gear body, which hangs from the UAV

    body and is attached through an angular spring that was modeled to have a cubic stiffness

profile, giving the spring low stiffness for about 10 degrees of displacement before

becoming relatively stiff at full bushing compression; and finally 3) the Camera body, which

    consists of the hanging arm and the camera mounted at the bottom. In Figure 6 the blue

    dots show the C.o.G. of each body and the white dots portray the joints of rotation. The

    upper joint is the bushing joint and the lower joint is the pivot between the Landing Gear

    body and the Camera body.
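As a rough illustration of such a cubic stiffness profile, the bushing's restoring torque can be sketched as below. The coefficient values are assumptions chosen only to show the soft-then-stiff shape; they are not parameters measured from the actual bushings.

```python
import math

def bushing_torque(theta_rad, k1=0.02, k3=40.0):
    """Cubic angular-spring torque (N-m) for the bushing joint:
    the linear term dominates for small angles (soft region) and
    the cubic term takes over near full bushing compression.
    k1 and k3 are illustrative values, not measured parameters."""
    return k1 * theta_rad + k3 * theta_rad ** 3

# Within ~10 degrees the joint is soft; well beyond that it stiffens sharply.
soft = bushing_torque(math.radians(5))
stiff = bushing_torque(math.radians(25))
```

With these placeholder coefficients, the torque at 25 degrees is two orders of magnitude above the torque at 5 degrees, reproducing the low-stiffness window followed by a hard stop.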


    Figure 6 - Sketch of the model with proper coordinates and parameters applied.


    RESULTS

    DYNAMIC SYSTEM MODELING:

    Initial observations of the camera platform, with its control loop powered on to keep it

    pointed straight down, immediately identified a couple of potential sources of distortion.

    Even as the UAV was sitting stationary, the control loop was making slight adjustments to

    the attitude of the platform. Common sense suggests that, since the UAV is not moving, the

    IMU should not be adjusting for anything and should also remain motionless once it finds its

    proper attitude. These adjustments were being made with relative consistency at a

    frequency on the order of 1 Hz or less. Seeing as the camera was programmed to take

pictures about every two seconds, it was quite possible that the servo made one of these

adjustments in the brief moment the image was being scanned. Furthermore, the servos

seemed to have a tendency to shudder every few seconds, which

    led to the entire platform being shaken very slightly. This was also quite possibly a source of

    distortion. These were the initial observations with the UAV in a static configuration.

    In the dynamic world, a couple of other potential problems were discovered. To begin with,

    the roll servo had a much longer moment arm transferring its motion to the platform than

did the pitch servo. As a result, the roll servo's movements, including the shuddering issue

    mentioned above, were amplified through that moment arm. Additionally, there are four

rubber bushings connecting the frame of the UAV to the lower camera platform which, when

subjected to a moment from the roll servo's long moment arm, induced a substantial

oscillation in the camera platform that took about 1.5 seconds to damp out.

    The equations of motion for this dynamical system were generated by creating a Bond

Graph model of this system. A similar model was found in the text Advances in

Computational Multibody Systems by Jorge A. Ambrosio. On page 138 (Figure 9,

which may be viewed using Google Books), a double-jointed robot arm is modeled using the

Bond Graph technique. In Figure 7 below, a Bond Graph was drawn for the roll-axis model

representing the UAV camera gimbal.


    Figure 7 - Bond graph representing the roll axis model of the UAV and camera gimbal. The actual MTF parameters will be

    shown in equations later. A program for LaTeX was used to generate this figure.

    The 1-junctions with Inertial Elements with a J appended model the rotational dynamics of

    the three bodies. Connected to those junctions via Modulated Transformers are the 1-

    junctions that model the translational velocities of each body in both the x- and y-directions.

    Attached to those are the masses, md, mh, and mb. In order to maintain proper integral

causality for all energy-storing elements, additional states were added in the form of

    capacitances that link the translational inertias to the respective rotational ones. These can

be thought of as modeling the translational joint stiffness, and their stiffnesses were

calculated by selecting a high natural frequency for the joint. Additionally, the joint

displacements were monitored and the frequency was modified iteratively until these were

    at sufficiently low levels relative to the system dimensions. Specifically, here the UAV body

    dimension is approximately 5cm long and thus the joint displacements were considered to

    be acceptable if they were below 0.5mm or about 1/100 of the smallest system dimension.

    Resistive elements were paired with the joint stiffness to reduce computational chatter.

    Their values were computed by selecting a value that ensured a critically damped system


given the previously computed stiffness. These are all shown as the 1/kKMM and RKMM in

Figure 7. The notation KMM was used as this method of introducing state variables to ensure

    proper causality is commonly referred to as the Karnopp-Margolis Method which was

    developed by Dr. Dean Karnopp and Dr. Donald Margolis at the University of California,

    Davis.
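The stiffness-from-frequency and critical-damping selection described above can be sketched in a few lines. The 0.5 kg mass in the usage line is a placeholder, not an inertia taken from the actual UAV.

```python
import math

def kmm_pair(mass_kg, f_hz=150.0):
    """Parasitic stiffness/damping pair for the Karnopp-Margolis
    method: choose a joint natural frequency well above the system
    dynamics, derive k from omega_n = sqrt(k/m), then damp the
    added mode critically (zeta = 1) to suppress chatter."""
    omega_n = 2.0 * math.pi * f_hz
    k = mass_kg * omega_n ** 2        # stiffness, N/m
    r = 2.0 * math.sqrt(k * mass_kg)  # critical damping, N-s/m
    return k, r

k, r = kmm_pair(0.5)  # 0.5 kg is a placeholder mass
```

If the monitored joint displacements come out too large relative to the system dimensions, f_hz is raised and the pair recomputed, which is the iterative procedure described above.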

The input flight disturbance is modeled as the flow input Sf:1(t), and the KMM was applied

here as well, since there would be a causality conflict if a flow input were placed on a 1-junction

that also had an inertial element attached. Again, this angular displacement was designed

to be small in order for the input velocity to be very close to the actual velocity of the UAV

body at which it was being applied.

    Landing Gear body is modeled by the 0-junction that has the C: 1/kslop and R:Rslop elements

attached. This stiffness/damper pair models the bushings between the UAV body and the

Landing Gear. The 0-junction that models the relative velocity between the Landing Gear

    and Camera bodies has an Effort Source attached which models the angular control servo

motor. A resistive element was also placed there to model bearing friction. If this is not a

    realistic component then the friction coefficient can be set to a small enough relative value

    such that it does not significantly contribute to the system.

The coefficients for the Modulated Transformers are not shown in Figure 7, but are listed

    below in Table 1. The equations relate the translational velocities to the angular velocities

of the bodies and the forces to the torques.

    Table 1 - Equations that describe the translational velocities as a function of the angular velocities. These are used to derive

    the Modulated Transformer coefficients for the Bond Graph.
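Table 1 itself is not reproduced in this extract, but for a planar link whose C.o.G. hangs a distance d below its pivot, modulated-transformer moduli of this kind take the generic form below. This is a sketch of the standard planar kinematics, not the report's exact expressions.

```python
import math

def mtf_moduli(d, theta):
    """For a hanging link at angle theta from vertical, the C.o.G.
    position is x = d*sin(theta), y = -d*cos(theta), so the moduli
    relating angular velocity to translational velocity are the
    partial derivatives dx/dtheta and dy/dtheta."""
    return d * math.cos(theta), d * math.sin(theta)

mx, my = mtf_moduli(0.1, math.radians(10.0))
# v_x = mx * omega,  v_y = my * omega;  the same moduli map
# translational forces back to torques on the 1-junction.
```

Because the moduli depend on theta, the transformers are modulated rather than fixed, which is why the bond graph carries MTF elements instead of plain TFs.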


    A key component of the model development process is performing a proper diagnosis of the

model's performance in order to understand whether or not the results being produced are

    physically understandable. Determining how to go about testing the model to understand

its conceptual validity was a new process learned here. A good first step found

    here was to compare the model, in some constrained operation if necessary, to a previously

    developed analytical model. In this case the system is really just a complicated pendulum

    with multiple bodies and strange joints. Thus, here the first and second bodies were fixed in

    position while the Camera body was started from an initially displaced angle to test

    whether or not the results matched that of a simple pendulum of the same system

specifications. Figure 8 shows that the model does in fact act as expected. The two lines

    overlap each other almost exactly. There is a small difference due to a small amount of

bearing friction that was left in the system. Figure 9 shows the free response of the Camera

    body as a function of varying the bearing friction on the Camera joint. A value of 0.015 N-m-

    s/rad was chosen as this provided the closest response to the actual system. This

    parameterization was performed qualitatively and therefore was not compared to actual

    system response data, but done through observing the system upon an input and seeing

how many oscillations it had before coming to rest. Figure 10 is a plot of the joint forces

    displacements that are the result of the Karnopp-Margolis method. The displacements

should be small, and they are below 10^-5 m, which, as discussed earlier, was deemed

adequate. This was based on selecting a frequency of 150 Hz for the joint stiffness

calculation. Figure 11 is a plot of the free response of the entire system starting at an initial

    displacement of 10 degrees. The UAV body stays fixed at 10 degrees due to the KMM spring

    applied relative to the input angular velocity which in this case is zero.
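The simple-pendulum benchmark used above can be reproduced with a few lines of numerical integration. The length, friction, and step size below are illustrative assumptions, not the gimbal's measured parameters.

```python
import math

def pendulum_free_response(theta0, length=0.2, friction=0.0,
                           dt=1e-4, t_end=2.0):
    """Free response of a point-mass pendulum via semi-implicit
    Euler integration -- the analytical benchmark against which a
    constrained version of the gimbal model can be checked."""
    g = 9.81
    theta, omega = theta0, 0.0
    trace = []
    for _ in range(int(t_end / dt)):
        alpha = -(g / length) * math.sin(theta) - friction * omega
        omega += alpha * dt
        theta += omega * dt
        trace.append(theta)
    return trace

# For small angles the period should approach 2*pi*sqrt(L/g).
trace = pendulum_free_response(math.radians(5.0))
```

Overlaying this trace on the constrained multi-body model's output is the comparison shown in Figure 8; a nonzero friction argument reproduces the small discrepancy noted there.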


    Figure 8 - Comparison between the analytical model for a simple pendulum and a constrained version of the

    camera gimbal roll axis model.

    Figure 9 - Testing the base oscillations as a function of the bearing friction coefficient.


    Figure 10 - Testing initial 10 degree displacement on all three components to see response. (a) Restrain forces in

    joint, (b) Joint Displacements (c) Relative angular displacement of the bushing joint.

    Figure 11 - Plot of the UAV body, Landing Gear body and Camera body angles when starting from a 10 degree

    displacement.


After completing the initial conceptual validation of the model, the PID control gain tuning

was started. This was tested for step, ramp, and sinusoidal inputs. Figure 12 plots the

    response of the bodies after starting from an initial angular displacement of 10 degrees.

The right plot of Figure 12 shows the torque required by the servo in order to achieve the

    response shown on the left. The servo torque is modeled to saturate at a maximum of 5.2

kg-cm of torque in each direction per the manufacturer's specifications. The left plot in

Figure 12 is the same as Figure 11; however, here the servo torque is applied to the system. It is

    seen that the Camera body (Base Angle in the plot legend) drops to zero degrees in about

    0.25 seconds as opposed to the free response of 2.5 seconds, which is a ten-fold

    improvement.
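A minimal sketch of such a saturated controller is given below, assuming a single-inertia plant with the 0.015 N-m-s/rad bearing friction quoted earlier. The gains and inertia are illustrative placeholders, not the tuned values from this project, and the servo limit is the 5.2 kg-cm rating converted to N-m.

```python
import math

TAU_MAX = 5.2 * 9.81e-2   # 5.2 kg-cm converted to ~0.51 N-m

def pi_torque(error, integral, dt, kp=2.0, ki=5.0):
    """One PI update with symmetric saturation at the servo's
    rated torque. kp and ki are illustrative gains."""
    integral += error * dt
    tau = kp * error + ki * integral
    return max(-TAU_MAX, min(TAU_MAX, tau)), integral

# Drive a single inertia (J and b are assumed values) back to level
# from a 10 degree initial displacement.
J, b, dt = 0.002, 0.015, 1e-3
theta, omega, integral = math.radians(10.0), 0.0, 0.0
for _ in range(3000):                      # simulate 3 seconds
    tau, integral = pi_torque(-theta, integral, dt)
    omega += ((tau - b * omega) / J) * dt  # J*omega_dot = tau - b*omega
    theta += omega * dt
```

The hard clamp on tau is what produces the flat-topped torque traces in the ramp-input plot: once the command exceeds the rating, the servo simply holds its limit.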

    Figure 12 - Response of system with servo motor controlling at the Camera body joint from an initial angular

displacement similar to Figure 11. (a) Angular response of the three bodies. (b) The control torque required to

    achieve response.

Figure 13 shows the response to a constantly ramping input angular velocity on the UAV body

    when starting from zero initial angular displacement for all of the bodies. It is seen that at

    about 0.4 seconds the control torque reaches its peak limit, which also corresponds to a

jump in displacement for the Camera body, as seen in the angle plot of Figure 13. Figure 14

shows the response to a sinusoidal input angular velocity and Figure 15 to a constant angular

velocity step input.


    Figure 13 - Response for a ramping angular velocity input. Notice that the control torque saturates briefly at the

maximum of 5.2 kg-cm. This is a limit that was placed on the control based on the manufacturer's specifications

    for the servos being used here.

    Figure 14- Response to a varying sinusoidal input angular velocity.


    Figure 15 - Response to a step input angular velocity.

    A frequency analysis was performed in order to understand the sensitivity of the controller

    performance to the flight disturbance input frequency. Input frequencies of 1, 10, 100, and

1000 radians per second were applied. It is observed in Figure 17 that the controller performs

    well for the first two plots and then begins having trouble above 100 radians per second.

The controller can properly stabilize the camera up to about a 6 Hz input frequency, beyond

which the platform accelerations and velocities reach values too high to allow for a

non-distorted image.


    Figure 16 - Camera Body Response as a function of the bushing stiffness.

    Figure 17 - A frequency analysis was performed to test the control capability as a function of frequency. An arrow

    highlights the angular response of the Camera body for 4 different frequencies. The angular displacement stays

    low for all, however, the velocities/accelerations are quite high for the higher frequencies, which is likely to still

    produce low quality images. As long as the flight disturbances stay below ~35 rad/s (6 Hz) the controller is able

    to respond fast enough to isolate those signals.


    REPRODUCING DISTORTED IMAGES IN LAB ENVIRONMENT:

    The first step in the diagnosis was to see if the distorted images could be reproduced in a

static lab environment. It's important to remember that these images were taken during

    flight, and as such were subjected to a much more dynamic environment than was easily

    created in a lab. Again, for the sake of simplicity and control of experimental variables, the

    system was set up in a static lab environment, with the main body of the UAV stationary.

With the UAV propped up at a height of about 1 foot and a piece of engineering paper as a

visual target for the camera to help identify image distortion, the team powered on the UAV

in its original configuration. Taking pictures about every two seconds, 50 images were collected and

    analyzed, looking for the same distortion that was prevalent in the field data. Of these 50

    images, nine were clearly distorted, for a distortion rate of 18%. An example of one of these

distorted images, compared to a normal image, is provided in Figure 18.

    Figure 18 - Distorted image on left compared to a normal image

    The waviness seen in the bottom of the image on the left is almost certainly a result of a

movement of the camera during the brief moment the image was being scanned. As a

digital camera, it scans the image from left to right, top to bottom. As a result, the camera

    is much more sensitive to sideways movements, because there is a comparatively larger

    time difference between the top pixels and the bottom pixels. In its current configuration,

this sideways movement translates to the pitch axis. Tying in to the IMU via its serial port, the acceleration and gyroscopic data, which drive the PI control loop of the platform, as well

as the commands being sent to the servos, were made available and are presented in Figure 19.
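The row-by-row sensitivity described above can be sketched as a simple rolling-shutter model. The image height, scan time, and lateral speed below are illustrative numbers, not the actual camera's specifications.

```python
def row_shifts(height_px, scan_time_s, lateral_speed_px_s):
    """Horizontal shift of each scan line for a sensor read top to
    bottom: rows scanned later accumulate more of the camera's
    sideways motion, producing the wavy distortion described above."""
    t_row = scan_time_s / height_px           # time to scan one row
    return [lateral_speed_px_s * r * t_row for r in range(height_px)]

shifts = row_shifts(480, 0.05, 200.0)
# The top row is unshifted; the bottom row is offset the most,
# which is why the waviness appears at the bottom of the frame.
```

Under this model a vertical (roll-axis) motion of the same magnitude only changes row spacing slightly, which matches the observation that sideways (pitch-axis) movement dominates the distortion.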


    Figure 19 - Original system's IMU output

This data provided a baseline for the movement of the platform that could then be

    used for comparison to future modifications to the system. A couple of key observations to

make note of regarding this data are the relative differences in magnitude between the roll

    and pitch acceleration and angular velocity, and the large amount of seemingly erroneous

    signals being sent to the servos. There is clearly much more noise in the roll acceleration

    than the pitch, which is probably a manifestation of the longer moment-arm of the roll servo

    mentioned previously. The servo signals being generated by the software range from 1000

to 2000 microseconds, 1000 being fully clockwise and 2000 being fully counterclockwise.

    Since the UAV is stationary, the ideal output for both servos should be a straight line with a

    slope of zero. Depending on the attitude at which the UAV happens to be sitting, both signals

    should also be right around 1500. The general slopes of each signal are a manifestation of

    the drift correction of the original code, the integral term. This aspect of the code warrants

    attention, since there is a marked drift in the roll direction, but will not be part of this

    analysis, because the team is chiefly concerned with image distortion. What is worth

mentioning is the apparent fluctuation in the servo signals, as if the code can't quite decide

    between two different servo positions and ends up jumping back and forth around the


    general value it needs to be. At about 60 and 70 seconds, for example, the signals take a

    fairly large dip before recovering to closer to where they should be. This is a telltale sign of

    over-control.
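The 1000-2000 microsecond command range described above maps to servo position roughly as follows. The total travel value is an assumption for illustration; the actual throw of these servos is not stated here.

```python
def pulse_to_angle(pulse_us, travel_deg=90.0):
    """Map a servo command pulse to an angle: 1500 us is center,
    1000 us fully clockwise (negative here), 2000 us fully
    counterclockwise. travel_deg is the assumed end-to-end throw."""
    return (pulse_us - 1500.0) / 500.0 * (travel_deg / 2.0)

assert pulse_to_angle(1500) == 0.0   # centered pulse, level platform
```

With this mapping, the fluctuations visible in the servo-signal traces translate directly into small back-and-forth platform rotations, which is the over-control behavior discussed above.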

    CAMERA ISOLATION:

    The team determined there was enough of an issue with the distorted images in a static lab

    environment to take the first step in diagnosing the problem: isolating the camera. This

    simply consisted of cutting the power to the servos, effectively removing all control aspects

of the system. In the same environment in which the distorted images were recorded, the team

    again took 50 pictures for a visual inspection and found no distorted images in the set. This

    was conclusive enough to rule out the camera as being a major source of distortion.

    SERVO ISOLATION:

    As mentioned previously, one of the first observations made was the tendency for the

    servos to shudder every couple seconds, with enough movement to noticeably move the

    platform. To determine if this shuddering was causing the distorted images, the control

    code was hardwired to keep the servos stationary. This removed the PI control aspect of the

    system, but kept power to the servos to allow them to shudder. Again, 50 images were taken

    in the same environment, and, though the shuddering was quite apparent, there were no

    distorted images in the set. This ruled out the probability that the servo shuddering was a

    major contributor to the distorted images.

    CONTROL LOOP:

    Eliminating the camera and the servos as major sources of distortion left the control loop of

    the platform as the major culprit. As previously mentioned, the control loop consists of

    several hardware components and a PI control software loop. In the static test environment,

    the several hardware components identified as potential problems were immediately

eliminated from the equation. To begin with, the soft bushings connecting the UAV frame to

the platform were not receiving any force inputs from a moving UAV frame, so they were

    not part of the transfer function of the control loop. Also, the distortion in the images is a

function of pitch movement, as presented in Figure 18. Therefore, the amplifying effect of the

    long moment-arm of the roll servo can be ignored for this analysis. That really left the

    software of the control loop, and the signals it was sending to the servos, as the most likely


    source of distortion. Since the UAV was stationary, there was no issue of lag in the system,

    so it really came down to a matter of over-control.

    The first concern with the code was the highly irregular and erroneous signals that are

visible in Figure 19. These are clearly outliers from the rest of the signals, perhaps generated

    by a glitch in the software, and should be disregarded. To address this, a simple filter was

written into the code to ignore signals that fell outside the range of the servos. Analysis

    of the IMU data did not reveal any stark differences in the acceleration and gyro data,

    however, and the camera trial did not yield any lower distortion than the original code.
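A sketch of such a filter is shown below. The hold-last-value behavior is an assumption; the text above only says that out-of-range signals were ignored.

```python
SERVO_MIN_US, SERVO_MAX_US = 1000, 2000

def filter_servo_command(pulse_us, last_valid_us):
    """Ignore command pulses outside the servos' physical range,
    repeating the last valid command instead of passing the
    erroneous spike through to the servo."""
    if SERVO_MIN_US <= pulse_us <= SERVO_MAX_US:
        return pulse_us
    return last_valid_us

# A glitched spike (say, a garbage 8000 us value) is discarded and
# the servo simply holds its previous position for one cycle.
```

Because the outliers were rare, holding the previous command for a single 10 ms cycle has negligible effect on tracking, which is consistent with the unchanged IMU traces noted above.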

    The next aspect of the control code inspected was its operating frequency. The code is

    a continuous loop, calling several functions, that is set to repeat every 5 ms (200 Hz).

    In practice, each iteration took about 10 milliseconds (ms) to complete, so

    the loop was actually running at roughly 100 Hz. As a result, the code printed out acceleration,

    gyroscopic, and servo signal data for analysis about every 10 ms. Analysis of the servos,

    however, led to the understanding that the servos had a constant refresh rate of 20 ms. This

    meant the control code was sending commands to the servos at twice the rate they were

    being executed, causing unnecessary digital noise. As a result of this finding, the frequency

    of the code was lowered to something closer to the servo refresh frequency. After trying

    several different frequencies, including 50 Hz, 40 Hz, 30 Hz, and 25 Hz, 40 Hz was found to

    be an optimum frequency, and resulted in an image distortion rate of just over 5%.

    Compared to the original code's distortion rate of 18%, this simple change resulted in an

    almost 4-fold improvement to the system. Figure 7 lays out the IMU data from that

    frequency trial, with the erroneous signal filter still in place.
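A fixed-rate loop of the kind tuned here might be sketched as follows. The function `run_control_loop` and its deadline-based scheduling are illustrative assumptions, not the team's actual implementation; the point is that the command rate is pinned near the servos' 20 ms refresh rather than left to free-run.

```python
import time

LOOP_PERIOD_S = 0.025  # 40 Hz, chosen to sit near the servos' 20 ms refresh

def run_control_loop(step, duration_s):
    """Run step() at a fixed rate by sleeping out the remainder of each
    period, so the command rate stays close to the servo refresh rate."""
    next_deadline = time.monotonic()
    end = next_deadline + duration_s
    iterations = 0
    while next_deadline < end:
        step()                                   # one control iteration
        iterations += 1
        next_deadline += LOOP_PERIOD_S           # schedule next deadline
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
    return iterations

# Over 1 s this yields about 40 iterations when step() runs well under 25 ms.
n = run_control_loop(lambda: None, duration_s=1.0)
```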


    Figure 20 - Frequency changed to 40 Hz

    Though a greater difference in the IMU data between the 100 Hz code and the 40 Hz code

    was expected, the significance of the change lies in the lowered distortion rate.

    To fully diagnose the distortion, however, the team needed a way to tie

    the IMU data to the time at which each image was being scanned. That way, the acceleration and

    gyro data during a distorted image scan could be analyzed directly instead of having to look at

    general system behavior. In its original configuration, the camera capture and control loop

    were independent of each other, however, so there was no way to synchronize the two

    sources of data. All that was known was a window of 2-3 seconds in the IMU data where

    each image was being captured. When dealing with a code that updated every 25 ms, and a

    camera that took about 5 ms to scan, a 2-3 second window is huge. To solve this dilemma

    and allow for better analysis of the distortion, the team retrofitted the camera to be

    triggered by the control loop. Because of the limitations of the camera, an image couldn't be

    taken every iteration of the loop, but the code was altered to trigger the camera at a more

    conservative 4 second interval. Because the purpose of this alteration was to examine

    distorted images, the original parameters of the code were run, at 100 Hz, with the only

    addition being the few lines of code having to do with triggering the camera. The resulting

    [Figure: six-panel plot of IMU data and servo signals vs. time (0-160 s): roll acceleration (ay), gyro y angular rate (degrees/sec), roll servo signal (microseconds); pitch acceleration (ax), gyro x angular rate, pitch servo signal.]


    image set yielded zero distorted images. Though this was counterproductive to the analysis

    of the original distorted images, the team had stumbled upon a clean and simple fix to the

    problem.
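The retrofit described above, with the control loop itself firing the camera on a 4-second interval, might look like this sketch. All function names are hypothetical; the 100 Hz loop rate and 4 s interval come from the report.

```python
LOOP_RATE_HZ = 100          # original loop rate used for the trigger trial
TRIGGER_INTERVAL_S = 4.0    # conservative capture interval from the report
ITERATIONS_PER_TRIGGER = int(LOOP_RATE_HZ * TRIGGER_INTERVAL_S)  # 400

def control_loop(read_imu, send_servo_commands, trigger_camera, n_iterations):
    """Hypothetical sketch: the control loop triggers the camera every 400
    iterations, so each image is tied to a known IMU sample index."""
    capture_log = []  # loop-iteration indices at which an image was scanned
    for i in range(n_iterations):
        imu_sample = read_imu()
        send_servo_commands(imu_sample)
        if i % ITERATIONS_PER_TRIGGER == 0:
            trigger_camera()
            capture_log.append(i)
    return capture_log

# With stub hardware functions, 12 s of loop time captures three images,
# each pinned to a specific 10 ms loop iteration rather than a 2-3 s window.
log = control_loop(lambda: None, lambda s: None, lambda: None, n_iterations=1200)
```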

    CONCLUSIONS

    The primary interest in creating the roll-axis model was to determine whether an actual

    camera stabilization improvement could be made by tightening the bushings on the

    UAV. Figure shows the camera body response as a function of the bushing stiffness. A

    bushing stiffness corresponding to a natural frequency above 30 Hz provides adequate

    resistance against camera disturbance for the ramp-type input applied to the UAV body at

    about t = 4 s; for the earlier sinusoidal input it helps, but not dramatically.
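The stiffness sweep behind this conclusion can be mimicked with a toy mass-on-bushing simulation. The mass, damping ratio, ramp rate, and function name below are illustrative assumptions, not the report's actual roll-axis model; the sketch only shows the qualitative trend that a stiffer bushing tracks a base ramp with less error.

```python
import math

def simulate_bushing(f_natural_hz, t_end=2.0, dt=1e-4):
    """Semi-implicit Euler sim of the camera platform as a mass on a bushing
    whose base is given a ramp displacement. Parameters are illustrative."""
    m = 0.5                                    # platform mass, kg (assumed)
    k = m * (2 * math.pi * f_natural_hz) ** 2  # stiffness from natural freq.
    c = 2 * 0.1 * math.sqrt(k * m)             # light damping, zeta = 0.1
    x = v = 0.0                                # platform position, velocity
    peak_error = 0.0
    t = 0.0
    while t < t_end:
        y = 0.01 * t                           # base ramp input, m
        ydot = 0.01                            # base ramp rate, m/s
        a = (-k * (x - y) - c * (v - ydot)) / m
        v += a * dt
        x += v * dt
        peak_error = max(peak_error, abs(x - y))
        t += dt
    return peak_error

# A stiffer bushing (higher natural frequency) follows the ramp more closely.
soft = simulate_bushing(5.0)
stiff = simulate_bushing(30.0)
```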

    Another important capability of this model is that it allows one to verify that the selected

    servos can provide the necessary torques for proper camera stabilization. It was shown in

    Figure , Figure , Figure , and Figure that implementing a control torque maximum has

    little effect on the requested torque, as most of the time the necessary torque is much

    less than the maximum.
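The torque-limit check could be sketched as a simple saturation function plus a metric for how often the limit actually binds. The function names, limit, and sample torques below are hypothetical, not values from the report's model.

```python
def saturate(torque_request, torque_max):
    """Clamp a requested control torque to the servo's capability."""
    return max(-torque_max, min(torque_max, torque_request))

def fraction_saturated(requests, torque_max):
    """Fraction of loop iterations in which the torque limit binds.
    A small value means the chosen servo is adequate, as concluded above."""
    clipped = sum(1 for r in requests if abs(r) > torque_max)
    return clipped / len(requests)

# Illustrative requested torques (N*m): the limit binds on only one sample,
# so applying the maximum barely changes the commanded torque history.
requests = [0.02, -0.05, 0.31, 0.04, -0.01]
frac = fraction_saturated(requests, torque_max=0.30)
clamped = [saturate(r, 0.30) for r in requests]
```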

    FUTURE