
Diggle Smalls

Nick Cox
University of Florida, ECE

IMDL EEL4665C

Instructors: Dr. Arroyo, Dr. Schwartz
TAs: Josh Weaver, Andy Gray, Devin Hughes


Table of Contents

0. Abstract
1. Introduction
2. Executive Summary
3. Integrated System
4. Software
5. Mobile Platform
6. Actuation
7. Sensors and Sensor Integration
8. Behaviors
9. Experimental Layout and Results
10. Conclusion
References


0. Abstract

The goal of the project was to create an autonomous robot that performs some useful task. Diggie Smalls mainly functions as a front loader, like those often seen at construction sites digging up piles of dirt. However, it can be used to accomplish any task that requires moving bulk quantities of any relatively small item (toxic waste cleanup, waste management, etc.). The end result of this project was a robot that can autonomously navigate to a pile of aquarium gravel, scoop the gravel into its bin, locate a storage receptacle, and finally dump the gravel into the storage receptacle. It uses high-level concepts in computer vision to complete its tasks, all performed on board by a single-board Linux computer.

1. Introduction

As time goes on, more and more everyday tasks are becoming automated. As with Google's self-driving car, people are beginning to trust robots to perform tasks that they would normally do themselves. Something as simple as moving dirt from one place to another is a perfect task for a robot, freeing up humans to do more productive things. In the case of toxic waste cleanup, no humans have to endanger themselves to get the job done. I wanted to make something that could be used in everyday life, though it would have to be on a much larger scale. I took IMDL so I could learn about robotics and demystify the process, and I definitely accomplished that goal.

2. Executive Summary

Diggie Smalls came out exactly as intended: it is a functioning skid-steer loader powered by an embedded Linux computer that handles all decision making. The computer communicates with lower-level boards that control motors and servos and also interface with sensors. Infrared sensors are used for obstacle avoidance and general rangefinding, while a force-sensitive resistor is used to detect weight in the bucket. The robot is powered by a three-cell lithium-polymer battery, with voltage regulators providing 5 V for the electronic boards.

For demonstration purposes, a pile of aquarium gravel represents dirt or some other substance that needs to be moved. Diggie Smalls pivots while the onboard computer receives information from a video camera mounted on the front of the chassis. It uses color detection to find the pile of pink aquarium gravel, then navigates toward it. It drives into the pile until weight is detected on the force-sensitive resistor, then turns around to look for a receptacle (in this case, a Mountain Dew box). It finds the box using object recognition, navigates toward it, and then dumps the gravel into the box. Afterward, it repeats the process until the pile is gone. Admittedly, it isn't perfect; every once in a while it will miss the box or overshoot it by a little bit. However, the majority of the time it works perfectly.


3. Integrated System

3.1. Pololu Mini Maestro Servo Controller

This servo controller was chosen for its convenience. Its most obvious use is to control the two servos in my actuation system, but all 12 channels can also be used as analog inputs. These analog inputs already operate at 5 V, so there is no need to convert a 3.3 V output to 5 V. I have my infrared sensors and force-sensitive resistor hooked up to pins on the Mini Maestro.

3.2. Atmel XMega128A4U development board

The Xmega was chosen for its familiarity. I developed it into a fully functional brushed-motor controller. It allows the user to set duty cycle, direction, and other variables to control the motors. Any time it receives a set-motor command through serial communication, it sets the duty cycle to a set point. Using timer interrupts, the motor controller accelerates (or decelerates) the motor speed to the set point at a user-defined rate. Unfortunately, given the nature of Diggie Smalls' design, it was often more practical to set duty-cycle values instantaneously, so this feature was not used much.


3.3. ODROID-U2/Robot Operating System

The ODROID-U2 is a powerful single-board computer with a quad-core 1.7 GHz ARM processor. It runs Ubuntu Linaro 12.11, a specialized Ubuntu distribution designed for ARM processors. This is the 'brains' of the system, and it allows all decision making to be done on board without the need for a laptop and XBee. Robot Operating System (ROS) is installed on the ODROID to provide hardware abstraction that allows for more elegant and robust robotic designs. The main advantage of ROS is that it allows programs to be split up into nodes. A node is a single program that runs connected to a network of other nodes. While a node is running, it can publish to and receive information from other running nodes through TCP/IP or UDP. It generally works in this fashion:

1. A node creates a topic, for example "IR_sensors", and creates an object that publishes to this topic.

2. A separate node creates a subscriber object and subscribes to the IR_sensors topic.

3. The ROS Master detects when there is a publisher and a subscriber on the same topic. Once detected, the ROS Master creates a socket connection between the two nodes, allowing communication at runtime.

4. The publisher node sends a message, which is just a set of variable values of pre-defined type.

5. The subscriber calls the function spin() or spinOnce(), and if there is a message available a callback is processed. Callback functions often update global variables, but they can be made to do anything a normal function can do.

An important thing to note about this publisher/subscriber communication process is that it is completely anonymous. A node doesn't know (or care) about the origins of the messages it receives. This makes each node behave as a truly independent program, which increases fault tolerance, simplifies code, and allows for easier and more thorough testing. Although my project only uses the very basics of ROS, there are many other benefits, like established code libraries and multi-computer robotic systems.
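The pattern above can be sketched in plain C++. To be clear, this is a toy in-process illustration of anonymous publish/subscribe, not the actual roscpp API (real ROS nodes are separate processes routed over TCP/IP by the ROS Master):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Toy publish/subscribe router. A single in-process "master" stands in
// for the ROS Master so the callback flow is easy to follow.
struct Master {
    std::map<std::string, std::vector<std::function<void(double)>>> topics;

    // A subscriber registers a callback on a topic (step 2 above).
    void subscribe(const std::string& topic, std::function<void(double)> cb) {
        topics[topic].push_back(cb);
    }

    // A publisher sends a message; every subscriber's callback fires
    // (steps 4-5 above). Neither side knows who is on the other end,
    // which is the anonymity property described in the text.
    void publish(const std::string& topic, double msg) {
        for (auto& cb : topics[topic]) cb(msg);
    }
};
```

A node holding a `Master` reference could then call `publish("IR_sensors", reading)` while another registers a callback that stores the latest value, mirroring how my Maestro node feeds sensor data to the Main node.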

4. Software

Software on the ODROID is written entirely in C++, and software on the Xmega board is written in C. The Xmega runs the motor controller program described previously. The ODROID has four nodes running at all times:

1. Main node: This node performs all the decision making. It checks sensor values against thresholds to decide what action to perform, then sends commands in the form of messages to the other nodes.

2. Maestro node: This node receives command messages from the Main node and relays the information serially to the Maestro servo controller. At the same time, it continuously reads data from the infrared sensors and the force-sensitive resistor and publishes their data on the topic sensor_data.

3. Xmega node: Similar to the Maestro node, the Xmega node receives command messages from the Main node and sends the information down to the ATxmega128A4U microcontroller, which is used as a motor controller.

4. Camera node: The camera node listens to the topic state_data to see if the robot is in the pile phase or bin phase of execution. If it is in the pile phase, meaning it is searching for or navigating to the pile, it performs color detection. If it is in the bin phase, it performs object recognition. More information on this can be found in the 'Sensors and Sensor Integration' section of this report.

All source code is available at http://github.com/nc61/

5. Mobile Platform

At first the plan was to create the mobile platform out of polycarbonate, but that proved too difficult. It can be cut easily, but with little mechanical experience I was limited to straight cuts with minimal fine detail. Luckily, balsa wood in conjunction with the T-Tech provided a much better alternative. All parts on the platform were designed using SolidWorks and cut out using the T-Tech.

The most eye-catching aspect of the platform is the large tracks used in lieu of wheels. The tracks and all accessories were purchased from LynxMotion.com. The design includes two elongated pill-shaped brackets separated by 1.5" standoffs. Around each standoff is a nylon bushing that acts as a roller for the track. At each end there is a sprocket that fits the inner contours of the track. The sprocket in the back is attached directly to the motor using a clamping hub, which is bolted to the sprocket and clamped to the tapered end of the shaft.


The sprocket on the other end rotates freely inside bearings that rest within the woodenbrackets.

Each track bracket is attached to the body of the robot using one motor mount and one L-bracket. The design of the body itself is mostly arbitrary, except for holes cut near the back that fit the motors and slits cut out of the front to allow room for the arms. Rounded edges in the front and back are purely for aesthetics. The body is made of two identical platforms separated by 2" standoffs, with extra space cut out of the top layer to fit the ODROID and allow easy battery removal.

6. Actuation

Each track is turned by a 12 V DC motor with a 30:1 gear ratio. The GHM-12 motors, purchased from Lynxmotion, have a stall torque of 10.0 kg-cm, which is important for driving into piles of dirt and driving over obstacles. A non-geared motor within my budget would not have provided enough torque to overcome the frictional force holding the tracks in place, especially when pivoting. The motors are powered by a dual H-bridge motor driver (Pololu MC33926).

For the arms that raise the bucket, a standard Hitec servo (HS-645MG) with 107 oz-in of torque at 4.8 V was used. One bucket arm is attached to the servo horn, while the other arm rotates freely inside a bearing and serves only to constrain the system to stay level. It would have been better to go with a more powerful servo, but at the time of purchase my design required the use of a standard-sized servo. That said, there has never been a time when the servo failed to lift the gravel. At the end of the arms is the aluminum bucket. A standard Hitec servo (HS-485HB) is attached to the bucket to provide scooping and dumping capabilities.

7. Sensors and Sensor Integration

7.1. Infrared sensors

Sharp GP2Y0A21YK infrared sensors are used for obstacle avoidance. When driving, if the reading on the left IR sensor reaches a threshold value, the robot pivots right. The robot continues reading the infrared data until another threshold (the FAR threshold) is reached; then it continues moving forward and returns to its normal function. The same applies when the right IR reaches its threshold. When the middle IR is high, the robot stops and backs up until the middle IR reaches the FAR threshold. Then it decides which direction to turn based on which IR reading is higher (left or right), and pivots until the middle IR reaches the VERY FAR threshold, essentially meaning the object is nowhere in sight. The robot then moves forward and returns to its original mission.
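The NEAR/FAR pair forms a hysteresis band, which keeps the robot from oscillating at a single cutoff. A minimal sketch of that logic, with made-up threshold values (the real ones were found experimentally; note that Sharp IR readings grow as objects get closer):

```cpp
// Two-threshold hysteresis for one IR sensor. The robot starts avoiding
// at NEAR and only resumes its mission once the reading falls back
// below FAR, so a reading hovering near one cutoff cannot cause
// rapid mode flapping.
enum class Mode { DRIVE, AVOID };

const int NEAR_THRESHOLD = 500;  // hypothetical ADC value: obstacle close
const int FAR_THRESHOLD  = 300;  // hypothetical ADC value: obstacle cleared

Mode update(Mode mode, int ir_reading) {
    if (mode == Mode::DRIVE && ir_reading >= NEAR_THRESHOLD)
        return Mode::AVOID;   // obstacle close: start pivoting away
    if (mode == Mode::AVOID && ir_reading <= FAR_THRESHOLD)
        return Mode::DRIVE;   // obstacle cleared: resume mission
    return mode;              // readings in between keep the current mode
}
```

The middle sensor's VERY FAR threshold works the same way, just with a wider band before the robot commits to moving forward again.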

7.2. Force-Sensitive Resistor (FSR)

As the name suggests, this sensor responds to force. 5 V is applied across the resistor, and the other terminal connects to a pulldown resistor and an analog input pin. This setup acts as a voltage divider, with smaller FSR resistance allowing a higher voltage on the input pin. When there is nothing resting on top of the sensor, it has a very high (~1 MOhm) resistance. As force is applied, the resistance drops; the voltage drop between 5 V and the input pin becomes smaller, so a higher voltage is read on the input pin. The FSR is used to determine when there is gravel in Diggie Smalls' bucket.
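The divider arithmetic is easy to check numerically. Assuming a hypothetical 10 kOhm pulldown (the report does not state the actual resistor value):

```cpp
// Voltage at the analog pin of the FSR divider. The FSR sits between
// 5 V and the pin; the pulldown ties the pin to ground, so
// V_pin = Vcc * R_pull / (R_fsr + R_pull).
double fsr_pin_voltage(double r_fsr_ohms, double r_pull_ohms = 10000.0) {
    const double vcc = 5.0;
    return vcc * r_pull_ohms / (r_fsr_ohms + r_pull_ohms);
}
```

At rest (R_fsr around 1 MOhm) the pin sits near 0 V; as force drives the FSR resistance down toward the pulldown value, the pin voltage climbs toward 5 V, which is exactly the behavior the threshold check relies on.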

7.3. Camera

All computer vision is done with a library called OpenCV, which is short for Open Source Computer Vision. All vision processing is done on the ODROID-U2. The camera is used for two different purposes: color tracking and object recognition. The camera used was a 'Creative Live! Chat' webcam.

7.3.1. Color Tracking

Color tracking is a very simple process using OpenCV. The first step is to capture an image from the camera and convert it from the BGR to the HSV color space. After that, it is helpful to apply a Gaussian blur (or any blur type) to reduce noise. Then, using library functions, filter out the appropriate color by accepting only colors in a certain HSV range. All color within the range is converted to white, and everything else becomes black. Once again, library functions calculate the moments of the white space in the frame (moments have the exact same definition as they would in an introductory physics class, with white analogous to 'mass'). Dividing the x-moment by the area gives the x position of the detected color blob. In the case of Diggie Smalls, once the x-coordinate of the pink blob is centered, the robot moves toward it. It also uses the area of the blob to roughly determine its distance. On the ODROID, this can all easily be done at above 15 frames per second.

7.3.2. Object recognition

Diggie Smalls uses a keypoint/descriptor-based approach to object recognition. Specifically, the SURF extractor and matcher are used. Before any image processing, the image must be converted to grayscale. Once a matrix of grayscale values is obtained, areas in the matrix with high curvature are chosen as key points. In this context, high curvature is very similar to saying high contrast. Key points are assigned vector quantities based on the nature of the image (using integral-image techniques), and a matcher compares the key points of a frame to those of the reference image (the object to be detected). If it finds a match, a 'distance' value is given that represents how strong the match is. High-distance matches are discarded and low-distance matches are kept. Finally, a homography is used to transform the reference object into the plane of the detected object.

SURF is very powerful because it is rotation- and scale-invariant, and it is also much more repeatable and accurate than other descriptors and matchers. Unfortunately, it is also much slower than some other object recognition algorithms. Even so, I managed to run object detection at around 6 to 8 fps on the ODROID, which is satisfactory. However, the SURF algorithm is very susceptible to motion blur, which means the robot has to move in jerky motions to accurately find the bin.

Example of SURF detecting a Pop-Tart box

8. Behaviors

Diggie Smalls is a state machine. In the FIND_PILE state, it rotates, looking for the color pink. When pink is detected and its area is above a certain threshold (to deter false positives from noise), it drives toward the pile while keeping the x-coordinate near the center. During this time, it avoids obstacles as described in the 'Sensors and Sensor Integration' section. Once the area of the pile reaches a larger threshold, the robot digs its bucket into the ground and drives into the pile until weight is detected on the FSR. At this moment the robot lifts the gravel, backs up for 1.5 seconds, and begins to pivot in a jerking motion to the left. It pivots until it sees the Mountain Dew box used as a storage receptacle. Once the box is located and centered, the robot drives toward it until the infrared sensor reaches a threshold, which means the bin is close enough to dump the load. Then the robot tilts the bucket downward to drop the gravel, backs up, and repeats the process from the first state.
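The cycle above can be sketched as an explicit transition table. The state names are my own labels for the phases in the text (only FIND_PILE appears verbatim in the source code description), and the comments name the event that triggers each transition:

```cpp
// One state per phase of the dig/dump cycle.
enum class State { FIND_PILE, APPROACH_PILE, DIG, FIND_BIN, APPROACH_BIN, DUMP };

// Advance to the next phase once the current phase's exit event fires.
State next(State s) {
    switch (s) {
        case State::FIND_PILE:     return State::APPROACH_PILE;  // pink blob area over threshold
        case State::APPROACH_PILE: return State::DIG;            // blob area over larger threshold
        case State::DIG:           return State::FIND_BIN;       // FSR detects weight in bucket
        case State::FIND_BIN:      return State::APPROACH_BIN;   // Mountain Dew box recognized
        case State::APPROACH_BIN:  return State::DUMP;           // IR says bin is close enough
        case State::DUMP:          return State::FIND_PILE;      // gravel dropped; repeat
    }
    return s;  // unreachable, but keeps compilers happy
}
```

In the real Main node, each transition is gated by a sensor comparison rather than called unconditionally, but the ring of states is the same.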

9. Experimental Layout and Results

The infrared sensors are loosely calibrated. Since Diggie Smalls doesn't need to know any exact distances, all threshold values were found experimentally while the sensors were mounted on the platform. I just held out my hand at a suitable triggering distance, noted the value, and recorded it in a threshold definition. The same applies to the FSR; since it is not very sensitive to the low weight of the gravel, I just found a reliable cutoff value and stuck to it. I noticed the FSR can be very noisy, so I implemented a running average of its readings in the main program, and did the same for the infrared sensors. I created a program called CalibrateCam that lets me adjust the HSV thresholding values while the video continuously updates, so it is very easy to calibrate my color detection code on the fly.

Most of my sensor issues came from running object recognition. The first major issue was the blur, which I already mentioned. Secondly, there are a lot of false positives. To get rid of them, I only publish data that adheres to very strict rules. Since the SURF pipeline yields the corners of the detected object, all filtering is done by examining those corners. For the data to be deemed 'good', the matched object must be rectangular, not too elongated, and it must have reasonable coordinates for all of its corners. Furthermore, the area cannot be too large or too small.

However, just filtering out certain images is not enough. For example, one condition is that 9/5 > (y2 - y3)/(y0 - y1) > 5/9, where y2 and y3 represent the right corners' y-coordinates and y0 and y1 represent the left corners'. This had to be included due to a peculiarity in the SURF algorithm that causes some matches to be rendered highly off-axis. In this situation, one side of the box drawn around the detected object becomes very long and misrepresents the x-coordinate of the bin. This leads to false positives and false negatives, which are both frustrating to deal with. On demo day, the 9/5 and 5/9 threshold values were too strict: sometimes data that could have been used to detect the box was thrown out, and Diggie Smalls would reach its target but overshoot the bin slightly when aligning. I decided to loosen the restrictions on the ratio but apply a running average to the data to filter out some of the stray elongated matches.
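That corner condition can be written as a small predicate. The corner labeling (y0, y1 on the left edge; y2, y3 on the right edge) follows the text; the function name and the degenerate-quad handling are my own framing:

```cpp
// Reject off-axis SURF matches: the right edge (y2 - y3) and left edge
// (y0 - y1) of the detected quadrilateral must be within a factor of
// 9/5 of each other, otherwise the match is likely one of the
// elongated false positives described above.
bool sides_balanced(double y0, double y1, double y2, double y3) {
    double left  = y0 - y1;
    double right = y2 - y3;
    if (left == 0) return false;      // degenerate quad: reject outright
    double ratio = right / left;
    return ratio > 5.0 / 9.0 && ratio < 9.0 / 5.0;
}
```

Loosening the demo-day restriction amounts to widening these two bounds while the running average absorbs the occasional elongated match that slips through.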

10. Conclusion

IMDL is not easy, and it is definitely not cheap. However, I learned more in this class than in any other class I have taken so far. I learned so much about embedded computers, Linux, computer vision, C++, mechanical design, and more. I learned so much that, if I were to take this class again, I could build an entire robot in the span of a couple of weeks. It's funny how, after spending so much time programming a robot, you can think what it is thinking and see what it is seeing. When there is a glitch in the code and my robot isn't moving, I know instantly what I left out, because I can follow the code in my head as Diggie Smalls runs.

One of the major things I learned is that no design is going to be perfect. Sometimes, rather than coming up with the perfect design in theory, it is better to create your design and then tailor it to your needs. For example, I spent a long time designing my actuation system in my head, but in the end I scrapped it and just winged it. That seemed to work out for the best, because I don't think I could have made this robot any better given the time frame.

I would still like to make some changes to Diggie Smalls. I had hoped to add machine learning concepts that would speed up the process of locating objectives after the first cycle, but I didn't have time. I would also still like to set my robot up with a remote control option.


References

Information about ROS: http://www.ros.org/
Information about ODROID: http://www.hardkernel.com/main/main.php
Information about SURF [pdf]: www.vision.ee.ethz.ch/~surf/eccv06.pdf
Pololu electronics and information: http://www.pololu.com/
Lynxmotion products and information: http://www.lynxmotion.com/