eBug - An Open Robotics Platform for Teaching and Research


Nicholas D'Ademo, Wen Lik Dennis Lui, Wai Ho Li, Y. Ahmet Sekercioglu, Tom Drummond
{nick.dademo|dennis.lui|wai.ho.li|ahmet.sekercioglu|tom.drummond}@monash.edu

    Monash University, Australia

    Abstract

The eBug is a low-cost and open robotics platform designed for undergraduate teaching and academic research in areas such as multimedia smart sensor networks, distributed control, mobile wireless communication algorithms and swarm robotics. The platform is easy to use, modular and extensible. Thus far, it has been used in a variety of applications, including an eight-hour demonstration managed by undergraduates at the Monash University Open Day. This paper describes both the eBug hardware and software, which is in the process of being open-sourced online. The paper also highlights several research applications of the eBug, including its use in a wireless control testbed and as the eyeBug: an eBug that senses the world in real time using the Microsoft Kinect.

    1 Introduction

The eBug was first conceived in 2008 as a platform to involve undergraduates in a research project on wireless control of multiple robots. Since then, the eBug has undergone several iterations, resulting in an extensible platform that has been used in a variety of research applications, as detailed in Section 3. The latest version is shown in Figure 1, which highlights the different layers of the eBug. The design is modular, modifiable and extensible in that it allows layers to be either replaced or added depending on the required application. The eBug is intended to be used in several research projects in the near future, which are detailed alongside relevant references in Section 4. The hardware design and software sources are available online at the Monash Wireless Sensor and Robot Networks Laboratory website¹.

¹ http://wsrnlab.ecse.monash.edu.au/

Figure 1: eBug exploded view


Table 1: Comparison of similar-sized robotic platforms

    Platform             Cost        Open Platform
    eBug                 US $500     Yes
    EPFL e-puck          US $1190¹   No
    K-Team K-Junior      US $1106²   No
    K-Team Khepera III   US $4270²   No
    Robomote             US $150³    Yes

    1.1 Similar Platforms

When compared with several other robotic platforms of similar size and capabilities (Table 1), the eBug differentiates itself by firstly being a completely open platform. Here, open refers to the complete design (both hardware and software) being available online¹ to the public for both improvement and customization. Furthermore, the relatively simple chassis design, constructed of commonly available parts and materials, allows the eBug to be replicated much more easily than other robotic platforms. Thus, the eBug is well suited to being constructed in-house by students as a learning experience at educational institutions wishing to utilize the robot. For this reason, the cost shown for the eBug in Table 1 excludes labour. By way of example, the first author (a final-year electrical engineering undergraduate at the time of writing) was able to construct an eBug in approximately 12 hours. This included the soldering of components on both the logic and power layers as well as the construction of the mechanical layer.

Out of the platforms listed in Table 1, the Robomote³, first produced by the University of Southern California in 2002, is the most similar to the eBug in terms of cost and availability of design files. One main difference is that the eBug uses more modern hardware. It also has the following advantages:

• More advanced communication capabilities (2.4 GHz, 250 kbps XBee vs. 916.5 MHz, 19.2 kbps OOK).

• Designed to traverse a variety of surfaces, including carpet.

• Features an on-board lithium-polymer battery charger and supervisor/management circuitry.

• Able to carry additional sensors/peripherals (i.e. expandable).

    1.2 Previous eBug Designs

Since 2008, the eBug has undergone several major design iterations (Figure 2):

• Design 1 (2008): Used DC motors and Ni-MH batteries.

• Design 2 (2008): Produced in parallel to Design 1. Featured IR sensors for communication and improved wireless capabilities.

• Design 3 (2009): A merger of the first two designs with various improvements.

• Design 4 (2010-): A near-complete redesign with an increased number of features and an improved, more efficient, LiPo-based power supply (Figure 1).

Figure 2: Early eBug designs produced between 2008 and 2009 (left to right: designs 1-3)

² http://www.roadnarrows.com/ [23rd Aug 2011]
³ http://robotics.usc.edu/~robomote/ [4th Sep 2011]

    2 Design

As shown in Figure 1, the eBug features a modular design which separates the robot into distinct layers (from bottom to top): Mechanical, Power, Logic, and Expansion Board. A modular design such as this has the following advantages:

• If any modifications or improvements need to be made to the Power Layer, for example, the Logic Layer does not also need to be changed or redesigned; that is, the Logic Layer does not care how the Power Layer is implemented, only that power is provided appropriately.

• Debugging is simplified by the separation of the power and logic sections of the eBug. For example, known-good layers can easily be swapped between robots, greatly speeding up the debugging process.

• In the case of hardware damage or faults, repair time is reduced because only the faulty layer (as opposed to the entire unit) needs to be replaced.

• If the eBug Logic board is not suitable for a particular application, a custom board with a similar shape can instead be used by appropriately interfacing with the Power Layer.

• Additional layers can be created and added to the eBug to provide extra functionality (see Section 3.2).


    Figure 3: 25 eBug robots in various stages of assembly

    2.1 Overview

The following are the major design goals set out prior to the design of the eBug, and the corresponding results:

• Long battery life for long-duration experiments: The eBug has an approximate running time of 3 hours during heavy use in a typical application such as that described in Section 3.1. It also has a 14-hour idle running time (using a 2500 mAh 3-cell LiPo battery).

• Ability to travel on various surfaces: Due to the use of hard rubber wheels with a small ground-contact area, the eBug is able to travel on a variety of surfaces.

• Ability to carry the extra weight of additional expansion layers: The eBug is able to carry additional weight while still retaining mobility. Section 3.2 details an example of a Microsoft Kinect (approximately 500 g) mounted on an eBug.

    2.2 Hardware

    Microcontroller

The eBug utilizes the Atmel AVR XMEGA 8/16-bit microcontroller, which is capable of up to 32 MIPS at 3.3 V. Note that the XMEGA 100-pin variants, as used in the eBug, are all pin-to-pin compatible. Several reasons for choosing this particular microcontroller include:

• The XMEGA offers high performance (typically 1 MIPS/MHz).

• It features a range of useful peripherals (e.g. DMA, Event System, 12-bit ADC/DAC, on-board USB controller) not usually found on typical 8/16-bit microcontrollers such as the AVR MEGA series (as used in the popular Arduino platform).

• Low power consumption: the XMEGA has a maximum current draw of only 20 mA.

• As will be discussed later in this paper, the eBug has not been designed to carry out complex, CPU-intensive tasks. Thus, a higher-performance microcontroller (e.g. a 32-bit ARM Cortex-M series part) is not required and would unnecessarily increase power consumption.

• Open-source software development tools for the Atmel AVR series of microcontrollers are freely available (e.g. WinAVR). Furthermore, Atmel also provides a free Integrated Development Environment software package (AVR Studio).

    Power

As shown in Figure 4, the eBug is powered by a 3-cell (11.1 V) lithium-polymer battery and also features an on-board battery charging circuit which allows the robot to be charged with an external DC power adaptor. Furthermore, protection circuits constantly monitor the lithium-polymer pack to:

• Measure and maintain an accurate record of available charge, which can also be displayed visually by a set of five on-board LEDs.

• Provide protection against the following events: cell over- and under-voltage, charge and discharge over-current, charge and discharge over-temperature, and short-circuit.

The majority of the on-board peripherals operate at supply voltages of 5 V and 3.3 V, which are provided by a buck switching regulator and an LDO regulator respectively. The eBug can also be completely powered by an external DC power adaptor, allowing long periods of use during development and debugging. Another feature of this design is that the unregulated battery supply voltage can also be directly accessed at the Logic Layer. This is useful for add-on boards which may require a higher supply voltage or an unregulated supply (e.g. DC motors, actuators).

    Communication

Wireless communication capabilities are provided by a low-cost, low-power Digi XBee RF module operating within the 2.4 GHz ISM frequency band. The module has a maximum indoor range of approximately 40 m (line-of-sight) at a transmit power of 2 mW, with current consumption ranging from 15 mA to 40 mA depending on whether the device is in the idle or transmit/receive state. The module features two modes of operation:

• Transparent Operation: The module acts as a serial line replacement.

• API Operation: Allows the full functionality of the module to be used (e.g. ZigBee addressing support, remote configuration, remote I/O pin sampling, mesh networking).

Figure 4: eBug Power Supply Design

Figure 5: Simplified diagram showing the connections of the major eBug peripherals to the AVR XMEGA

The eBug utilizes the XBee module in the API mode of operation in order to allow the use of the above-mentioned advanced features. API mode also offers the higher performance of the two modes: a maximum data throughput of 35 kbps. Furthermore, the XBee module can also be put into sleep mode to lower power consumption when not needed.
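
The paper's own framing code is not reproduced here, but as a minimal C++ sketch (assuming AP=1, i.e. the non-escaped API mode), a Digi API-mode frame is assembled around arbitrary frame data following Digi's documented format: a 0x7E start delimiter, a 16-bit big-endian length, the frame data itself, and a checksum byte equal to 0xFF minus the low byte of the sum of the frame-data bytes.

    #include <cstdint>
    #include <vector>

    // Wrap raw XBee frame data (API identifier + fields) in an API-mode
    // frame: start delimiter, 16-bit length, data, checksum (AP=1, no escaping).
    std::vector<uint8_t> buildApiFrame(const std::vector<uint8_t>& frameData) {
        std::vector<uint8_t> frame;
        frame.push_back(0x7E);                                        // start delimiter
        frame.push_back(static_cast<uint8_t>(frameData.size() >> 8)); // length MSB
        frame.push_back(static_cast<uint8_t>(frameData.size()));      // length LSB
        uint8_t sum = 0;
        for (uint8_t b : frameData) {
            frame.push_back(b);
            sum = static_cast<uint8_t>(sum + b);
        }
        frame.push_back(static_cast<uint8_t>(0xFF - sum));            // checksum
        return frame;
    }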

    Sensors and Peripherals

The eBug features a range of sensors and peripherals, as shown in Figure 5:

• Analog temperature sensor: Enables measurement of temperatures from 2 °C to 150 °C with up to 0.5 °C accuracy.

• Analog light sensor: Enables measurement of illuminances up to 100,000 lux via an adjustable software gain setting.

• Speaker: Driven by a 1 W mono BTL audio power amplifier. The amplifier can be shut down when not needed to reduce power consumption.

• Liquid crystal display: A 2x16 character LCD (with software-adjustable backlight intensity) outputs various run-time information such as CPU usage, test status, battery/supply voltage(s), remaining battery capacity, etc.

• RGB LEDs: 16 RGB LEDs driven by two cascaded 24-channel 12-bit PWM LED drivers allow the generation of almost any colour for visual feedback purposes. These LEDs are located at equispaced positions on the edge of the eBug.

• Infrared (IR) receivers/proximity sensors: IR receivers or, alternatively, proximity sensors (or both) can be installed at 8 equispaced positions on the edge of the eBug, depending on the required application.

• Infrared emitters: 16 IR emitters, also equispaced on the edge of the eBug (but at intervals between the aforementioned IR receivers/proximity sensors), can be used for communication (via modulation) or for basic obstacle detection when used in combination with the infrared receivers or proximity sensors, respectively.

• Stepper motors: Two 10 V, 1.8°/step, 35 mm hybrid bipolar stepper motors provide the eBug with mobility. The on-board stepper motor drivers are capable of full-step as well as 1/2, 1/4, 1/8, and 1/16 (micro-)step modes, allowing the eBug to move smoothly even at low speeds (a worked example of the drive arithmetic follows this list).
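
As a worked example of the drive arithmetic (a sketch under stated assumptions, not eBug firmware): a 1.8°/step motor gives 200 full steps per revolution, so 1/16 microstepping yields 3200 microsteps per revolution. The wheel diameter below is an assumed placeholder, as the paper does not state it.

    #include <cmath>
    #include <cstdio>

    constexpr double kFullStepsPerRev   = 360.0 / 1.8; // 200 full steps per revolution
    constexpr int    kMicrostepsPerStep = 16;          // finest (micro-)step mode
    constexpr double kWheelDiameterM    = 0.040;       // assumed placeholder, not from the paper

    // Convert a desired wheel rim speed (m/s) into a microstep rate (Hz).
    double microstepRateHz(double wheelSpeedMps) {
        const double circumferenceM = M_PI * kWheelDiameterM;
        const double revsPerSec     = wheelSpeedMps / circumferenceM;
        return revsPerSec * kFullStepsPerRev * kMicrostepsPerStep;
    }

    int main() {
        // e.g. 0.10 m/s corresponds to roughly 2546 microsteps per second.
        std::printf("%.0f microsteps/s\n", microstepRateHz(0.10));
        return 0;
    }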

Figure 6: eBug expansion header

Expansion Header

Additional functionality can easily be added to the eBug via the Expansion Header, which provides power (5 V and 3.3 V) to add-on boards.


    Figure 10: eBug API example usage

Users can control the robot by simply using this high-level API. Figure 10 shows an example usage of the eBug API to wirelessly read the value of the temperature sensor connected to the ADC.
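
Figure 10 itself is not reproduced in this extraction. As a self-contained illustration only, the sketch below wraps a hypothetical eBug "read ADC" command in a ZigBee Transmit Request, reusing buildApiFrame from the Communication section; the 0xA1 command byte, channel argument and destination address are invented placeholders, while the surrounding fields follow Digi's documented frame layout (type 0x10, assuming the ZigBee firmware the module's feature list suggests).

    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> buildApiFrame(const std::vector<uint8_t>& frameData); // earlier sketch

    std::vector<uint8_t> makeTemperatureRequest() {
        std::vector<uint8_t> frameData = {
            0x10,                                            // ZigBee Transmit Request
            0x01,                                            // frame ID (non-zero requests a TX status reply)
            0x00, 0x13, 0xA2, 0x00, 0x40, 0x00, 0x00, 0x01,  // 64-bit destination address (placeholder)
            0xFF, 0xFE,                                      // 16-bit destination address (unknown)
            0x00,                                            // broadcast radius (0 = maximum hops)
            0x00,                                            // transmit options
            0xA1, 0x04                                       // hypothetical eBug payload: "read ADC", channel 4
        };
        return buildApiFrame(frameData); // bytes to write to the PC-side XBee's serial port
    }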

    3 Applications

3.1 Wireless Control Testbed

The first application of the eBug was to create a testbed which not only demonstrated control of multiple robots over a wireless link (using a central computer, i.e. centralized control), but also served as a platform to test various formation control algorithms using real hardware. While tracking robots using augmented-reality (AR) markers has been demonstrated before [Fiala, 2004], the main focus of our application is to create a reliable testbed. As shown graphically in Figure 11, the testbed functions as a closed-loop control system as follows:

1. A frame is grabbed from an overhead USB camera.

2. Software running on a nearby PC uses ARToolKitPlus⁴ to first search the frame for any BCH-coded markers (which are located on top of each eBug). The position and orientation of each eBug in the frame is then calculated.

3. The appropriate control command (in the form of an eBug API packet) is then calculated for each eBug depending on the desired control behaviour (a sketch of one such control law follows the list).

4. This command is finally sent wirelessly (using an XBee module connected to the PC) to each eBug. The process repeats from Step 1.

Figure 12: Overhead view in the control software
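
Step 3 is where the control behaviour lives. As a concrete illustration, below is a minimal C++ sketch of a go-to-goal law of the kind the testbed could run; the gains, type names and differential-drive mixing are illustrative assumptions, not the testbed's actual code.

    #include <cmath>

    struct Pose     { double x, y, theta; };  // pose estimated from the AR marker
    struct WheelCmd { double left, right; };  // wheel speeds to send in an eBug API packet

    // Proportional go-to-goal law: drive toward (tx, ty) while turning to face it.
    WheelCmd goToGoal(const Pose& p, double tx, double ty) {
        const double kV = 0.5;   // forward gain (illustrative)
        const double kW = 1.5;   // turning gain (illustrative)
        const double dist    = std::hypot(tx - p.x, ty - p.y);
        const double heading = std::atan2(ty - p.y, tx - p.x);
        const double err     = std::remainder(heading - p.theta, 2.0 * M_PI); // wrap to [-pi, pi]
        const double v = kV * dist;   // forward speed
        const double w = kW * err;    // turn rate
        return { v - w, v + w };      // simple differential-drive mixing
    }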

A simple control algorithm was successfully implemented on the testbed which directs each eBug toward a random target position without collisions with other robots. The testbed captures frames at 640x480 resolution at 60 FPS (frames per second), with control commands sent to each eBug at 100 ms intervals. Furthermore, as the control software is modular by design, more complex control algorithms can easily be run on the testbed. A video demonstrating the testbed is available as a multimedia attachment to the paper. It can also be viewed online¹.

    3.2 eyeBug: eBug with RGB-D Sensing

Figure 13 shows the eyeBug, a real-world example of how more powerful capabilities and features can easily be added to the eBug by utilizing its expandable design. The eyeBug is an eBug extended with a Microsoft Kinect⁵ sensor and a BeagleBoard-xM⁶:

• The Microsoft Kinect is a low-cost RGB-D sensor that provides colour and depth images at 30 Hz through a USB interface. It has rapidly gained a growing user base in the robotics community, as it is both cheap (AUS $150) and has a large community of open-source software developers. Example images taken by the eyeBug's Kinect are shown in Figure 14.

• The BeagleBoard-xM is a low-cost ARM Cortex-A8 development board (AUS $150) which includes USB 2.0 as well as many other commonly-used interfaces (e.g. UART, SPI, I2C, Ethernet).

The Kinect communicates with the BeagleBoard-xM via USB, and a custom PCB designed to interface the BeagleBoard's UART to the eBug allows direct control of the robot. This Kinect Layer is placed above the eBug Logic Layer and includes boost converter circuitry to supply the 10 V needed to power the Kinect.

⁴ http://handheldar.icg.tugraz.at/


    Figure 11: Wireless control testbed configuration

Furthermore, the board also features a dedicated area for prototyping which provides access to 3.3 V-tolerant BeagleBoard-xM GPIO lines.

Preliminary results show that the eyeBug is able to perform a pseudo-random walk around an arena while avoiding obstacles. The Kinect is used to sense obstacles so that the robot always turns away from the nearest-sensed obstacle. The algorithm running on the BeagleBoard-xM (with an embedded version of Ubuntu 11.04⁷) is implemented in C++ and makes use of OpenCV⁸ and the libfreenect⁹ drivers. The unoptimised implementation of our algorithm runs in real time at 12 FPS on the eyeBug. Additional technical details can be accessed online¹.

    A video of the eyeBug is available as a multimediaattachment to the paper. It can also be viewed online1.

3.3 Performance of Follow-the-Leader Formation Control Algorithms

The eBugs were also used in a project which tested the performance of decentralized algorithms for formation control under noisy localization information and constrained communication conditions. Decentralized control involves the robots performing tasks without an external agent coordinating or directing the individual robots. This project's focus was to control formations of eBugs with two competing objectives:

1. Move all eBugs from their initial positions to a final destination through a path of travel (an arbitrarily defined trajectory).

2. Maintain the relative distances between the eBugs such that a set formation is retained during their travel [Lawton et al., 2003].

The technique used to reconcile these two competing objectives is what defines the different formation control algorithms.

⁵ http://www.xbox.com/en-US/kinect
⁶ http://beagleboard.org/hardware-xM
⁷ http://rcn-ee.net/deb/rootfs/natty/ubuntu-11.04-r3-minimal-armel.tar.xz
⁸ http://opencv.willowgarage.com/
⁹ http://openkinect.org/
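
In schematic form, one standard way to blend the two objectives (a textbook consensus-style law, not necessarily one of the control laws evaluated in the project) is for each eBug $i$ to apply

    u_i = K_g \left( p_i^{d}(t) - p_i \right) + K_f \sum_{j \in N_i} \left[ (p_j - p_i) - \left( p_j^{d} - p_i^{d} \right) \right]

where $p_i$ is robot $i$'s measured position, $p_i^{d}(t)$ its desired position along the trajectory, $N_i$ the set of robots it can communicate with, and the gains $K_g$ and $K_f$ trade goal-tracking against formation-keeping.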

In 2003, Lawton and his team published a seminal paper [Lawton et al., 2003] proposing three control laws for decentralized formation control under progressively more rigorous constraints. Using three prototype eBugs in our research project, we began implementing these control strategies to quantify their robustness against localization estimation inaccuracies and the effects of robot-to-robot communication channel delays and bandwidth limits.

Figure 13: eyeBug

Figure 14: Images captured from the eyeBug - Left: RGB; Right: Depth (the black area closest to the camera corresponds to the approximate 50 cm Kinect sensor deadzone)

Early results obtained for one-leader, two-follower formations were published in a technical report [Siripala, 2010]. We are currently bringing more eBugs online for more comprehensive experiments with larger formations involving more complex communication and sensing scenarios.

    4 Conclusions and Future Work

This paper has presented the design details of a low-cost open robotics platform, the eBug, which is modular, modifiable and extensible. Moreover, applications of the platform have clearly demonstrated its ease of use, even for non-experts. We are currently expanding our fleet to 30 eBugs. This will allow us to tackle a variety of research projects by validating theoretical models with real-world implementations. Planned research projects include networked control systems [Abdallah and Tanner, 2007; Antsaklis and Baillieul, 2007; Sinopoli et al., 2003; Stubbs et al., 2006] over wireless networks [Kumar, 2001] and networked robotics [Kim et al., 2009; Pohjola et al., 2009; Jung et al., 2010], communication algorithms in mobile wireless sensor networks [Ekici et al., 2006], data harvesting from sensor fields by using mobile robots [Tekdas et al., 2009; Gu et al., 2006], and formation control for robotic swarms [Anderson et al., 2008; Lawton et al., 2003; Tanner et al., 2004; Ren et al., 2007].

    Acknowledgments

The authors would like to acknowledge the following people, who have contributed to making the eBug possible:

• David McKechnie, who provided the initial eBug concept and design in 2008.

• Aidan Galt and Rory Paltridge, who designed and produced early eBug iterations in 2008 and 2009 respectively.

• Alexandre Proust, for all his work on the Kinect-related software development.

• Tony Brosinsky, for his assistance in the production of the eBug mechanical layer.

• Ray Cooper, Ian Reynolds, Geoff Binns, and Raymond Chapman, for their constant technical and logistical support.

• The Department of Electrical and Computer Systems Engineering, for their financial support.

    References

[Abdallah and Tanner, 2007] C. T. Abdallah and H. G. Tanner. Complex Networked Control Systems. IEEE Control Systems Magazine, 27(4):30–32, August 2007.

[Anderson et al., 2008] B. D. O. Anderson, C. Yu, B. Fidan, and J. M. Hendrickx. Rigid Graph Control Architectures for Autonomous Formations. IEEE Control Systems Magazine, pages 48–63, December 2008.

[Antsaklis and Baillieul, 2007] P. Antsaklis and J. Baillieul. Special Issue on Technology of Networked Control Systems. Proceedings of the IEEE, 95(1):5–8, January 2007.

[Ekici et al., 2006] E. Ekici, Y. Gu, and D. Bozdag. Mobility-Based Communication in Wireless Sensor Networks. IEEE Communications Magazine, pages 56–62, July 2006.

[Fiala, 2004] M. Fiala. Vision Guided Control of Multiple Robots. 1st Canadian Conference on Computer and Robot Vision (CRV'04), pages 241–246, May 2004.

[Gu et al., 2006] Y. Gu, D. Bozdag, R. W. Brewer, and E. Ekici. Data Harvesting with Mobile Elements in Wireless Sensor Networks. Computer Networks, 50:3449–3465, 2006.

[Jung et al., 2010] J. H. Jung, S. Park, and S-L. Kim. Multi-Robot Path Finding with Wireless Multihop Communications. IEEE Communications Magazine, pages 126–132, July 2010.

[Kim et al., 2009] S-L. Kim, W. Burgard, and D. Kim. Wireless Communications in Networked Robotics. IEEE Wireless Communications Magazine, pages 4–5, February 2009.

[Kumar, 2001] P. R. Kumar. New Technological Vistas for Systems and Control: The Example of Wireless Networks. IEEE Control Systems Magazine, 2001.

[Labrosse, 2011] J. J. Labrosse. MicroC/OS-II Real-Time Operating System Kernel. http://www.micrium.com/page/products/rtos/os-ii, 2011.

[Lawton et al., 2003] J. R. T. Lawton, R. W. Beard, and B. J. Young. A Decentralized Approach to Formation Maneuvers. IEEE Transactions on Robotics and Automation, 19(6):933–941, December 2003.

[Pohjola et al., 2009] M. Pohjola, S. Nethi, and R. Jantti. Wireless Control of a Multihop Robot Squad. IEEE Wireless Communications, pages 14–20, February 2009.

[Ren et al., 2007] W. Ren, R. W. Beard, and E. M. Atkins. Information Consensus in Multivehicle Cooperative Control. IEEE Control Systems Magazine, pages 71–82, April 2007.

[Sinopoli et al., 2003] B. Sinopoli, L. Schenato, S. Schaffert, and S. S. Sastry. Distributed Control Applications Within Sensor Networks. Proceedings of the IEEE, 91(8):1235–1246, August 2003.

[Siripala, 2010] P. J. Siripala. Decentralized Formation Control for Dancing eBugs. Technical report, Department of Electrical and Computer Systems Engineering, Monash University, 2010. http://titania.ctie.monash.edu.au/ugrad-projects/pj-formation-control.pdf.

[Stubbs et al., 2006] A. Stubbs, V. Vladimerou, A. T. Fulford, D. King, J. Strick, and G. E. Dullerud. A Hovercraft Testbed for Networked and Decentralized Control: Multivehicle Systems Control over Networks. IEEE Control Systems Magazine, pages 56–69, June 2006.

[Tanner et al., 2004] H. G. Tanner, G. J. Pappas, and V. Kumar. Leader-to-Formation Stability. IEEE Transactions on Robotics and Automation, 20(3):443–455, June 2004.

[Tekdas et al., 2009] O. Tekdas, V. Isler, J. H. Lim, and A. Terzis. Using Mobile Robots to Harvest Data from Sensor Fields. IEEE Wireless Communications, pages 22–28, February 2009.