PRESENTED BY TEAM TRIMAXION
JON PRICE, ERIC ENGER, JASON FARRELL
TOBIAS KNUTSSON, JOHN-OSKAR AHLSTROM, OSCAR ALMGREN, ROBIN MALMROS
Computer Science Capstone Team Trimaxion
Introduction
System Design
The design of the system was drafted very early
We wanted to keep it simple, but allow for complexity where needed
Components should be standalone and require knowledge only of what they are receiving
Limit redundant code
Division of components: Video Server, GUI, Main Server (Image Processing, Path Finding), Brain
Introduction (cont)
Goals
The robot should allow for manual control using values sent from the GUI
A live video stream should be displayed in the GUI and be clickable
When the video is clicked, the destination should be determined and the path automatically calculated and followed
Upon completion, the robot should be able to acquire a target and fire a projectile at it
Introduction (cont)
Methodology
Components should be standalone and able to work alone
Test each completed component with a test application
Perform weekly tests to ensure frozen code works with new code
KISS (Keep It Simple, Stupid): nothing unnecessary; add extras when the product is complete
Maintain constant communication and provide continual progress updates
Introduction (cont)
Areas of Technical Expertise
Robot is completely autonomous when finding a path
Light-based movement sensor with pinwheel
Projectile launching system (not implemented)
KISS – KEEP IT SIMPLE STUPID
Development of Trimaxion
PRESENTED BY JASON FARRELL
Video Server
Research
Real-time Transport Protocol (RTP): UDP-based to allow efficient data transfer with permissible data loss
Java Media Framework (JMF):
Allows access to media capture devices (microphone, webcam)
Uses a data source to display video or produce audio
Uses the RTP Session Manager to broadcast data to a client
Uses Player and Processor objects to manage and display stream data
Development
First goal: display the output of the data source in a GUI
Next: transfer the data source over the network
To test this we used JMStudio, which came with the framework and was developed by Sun Microsystems
To assist with development of the transmitter we used the VideoTransmit class from the Sun Developer Network
We specified the destination via RTP using an address of the format: rtp://ip.address:port/track
Testing
The primary means of testing was JMStudio; since we knew it to be a working program, it made it easier to determine which component was failing
First segment: tested transmission by opening an RTP session from JMStudio
Second segment: tested reception by transmitting from JMStudio
Final segment: tested using the GUI and VideoServer for reception and transmission, respectively
PRESENTED BY ERIC ENGER AND JASON FARRELL
Graphical Front End
Research
What would the GUI look like?
Manual movement commands for rotation and straight-line travel
Streaming webcam image, clickable to allow point selection (JMF)
Communications protocol with Main Server: TCP with Java Sockets
What to send between them? A Command object containing an Image, a Point, and a command
Development
Design
Needed to support JButton and JTextField for manual robot control parameter entry
Communicate with Main Server using Object I/O Streams
Leverage Java for all components
Limit the number of mouse clicks needed to send info to the server
The Command class houses instances of ImageIcon, Point, and a command; the command is a char field that dictates what type of command is being sent
Use of JMF to read the RTP data stream coming from the Video Server; JMF is also used to "capture" the current image in the data stream
Development (cont)
Command.java Implementation
Supports a char field that is accessed by Main Server to determine the command type: M (move), R (rotate), P (path find)
Image is the captured image from the webcam stream; stored as an ImageIcon but accessed as an Image using encapsulation (ImageIcon is inherently serializable)
Point: an instance of Point generated by a mouseClick event associated with the VisualComponent of the JMF Player
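Putting the pieces above together, a minimal sketch of what Command.java might have looked like is shown below. The field names, getter names, and constant names are assumptions for illustration; only the char command type (M/R/P), the ImageIcon/Image encapsulation, and serializability come from the slides.

```java
import java.awt.Image;
import java.awt.Point;
import java.io.Serializable;
import javax.swing.ImageIcon;

// Hypothetical reconstruction of Command.java; names are assumptions.
public class Command implements Serializable {
    public static final char MOVE = 'M';
    public static final char ROTATE = 'R';
    public static final char PATH_FIND = 'P';

    private final char command;    // M, R, or P
    private final ImageIcon image; // ImageIcon is serializable; Image itself is not
    private final Point point;     // click location from the JMF VisualComponent

    public Command(char command, ImageIcon image, Point point) {
        this.command = command;
        this.image = image;
        this.point = point;
    }

    public char getCommand() { return command; }

    public Point getPoint() { return point; }

    // Encapsulation: stored as an ImageIcon, exposed as an Image.
    public Image getImage() { return image == null ? null : image.getImage(); }
}
```

Because every field is serializable, an instance can be written directly to the Object I/O streams the GUI shares with Main Server.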
Testing
We used the finished GUI with a test server involving stub code from Main Server
Goal: send the image and point across the Grand Valley network
Result: the transmitting computer must be plugged into the wall
Then we had the finished GUI attempt to connect and send an image to a test server located in Sweden
Goal: display the image from the webcam in Sweden
Result: all computers involved must be connected via a wall connection with no firewall
PRESENTED BY TOBIAS KNUTSSON AND JOHN-OSKAR AHLSTROM
Main Server – Image Processing
Research
http://google.com/search?q=java+image+processing
Java Advanced Imaging (JAI): platform independent (Java interface), fast (C backbone)
Development
Needed abilities: locate obstacles; locate robot position and relative angle; remove the robot from the obstacle map; determine map scale (using the robot as a reference)
Development approach: find solutions that work in different environments (lighting conditions, surfaces, etc.)
The web is an invaluable resource for examples; we do not want to reinvent the wheel
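The actual pipeline used JAI, but the core idea of turning a camera frame into an obstacle map can be sketched in plain Java: threshold a grayscale frame and reduce it to a coarse boolean grid. The threshold value and block size below are illustration values, not the team's numbers.

```java
// Illustrative sketch only (the real system used JAI): turn a grayscale
// frame into a scaled-down boolean obstacle map for the path finder.
public class ObstacleMap {

    /**
     * true = obstacle. The frame is divided into block x block cells;
     * a cell is an obstacle if any pixel in it is darker than threshold.
     */
    public static boolean[][] fromGray(int[][] gray, int threshold, int block) {
        int rows = gray.length / block;
        int cols = gray[0].length / block;
        boolean[][] map = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Scan the cell's pixels; stop early once one is dark enough.
                for (int y = r * block; y < (r + 1) * block && !map[r][c]; y++) {
                    for (int x = c * block; x < (c + 1) * block; x++) {
                        if (gray[y][x] < threshold) { map[r][c] = true; break; }
                    }
                }
            }
        }
        return map;
    }
}
```

Scaling down this way is also where the performance/accuracy trade-off mentioned under Testing shows up: larger blocks mean fewer cells for A* to search, but coarser obstacle boundaries.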
Testing
Challenge: designing the algorithm using static images while keeping it flexible
Solution: create realistic scenario photos, but vary lighting, obstacles, and surfaces
Scaling images gives better performance, but also gives you a lot more factors to keep track of
PRESENTED BY ROBIN MALMROS, OSCAR ALMGREN
Main Server - Pathfinding
Research
Discussion led to "Guiding Star" (A*)
What is Guiding Star? Google revealed excellent sources
Development
Coding in Java using Eclipse
Optimized the path by removing intermediate points
Integrated the path finding into Astrolabe
Testing
Testing with an arbitrary boolean map
Testing with a map received from the image-processing component
Testing with full system integration
PRESENTED BY JON PRICE
ASSISTING: ERIC ENGER, TOBIAS KNUTSSON, AND JOHN-OSKAR AHLSTROM
Robot
Deciding on the Build
Robot must be able to turn without deviating from its current location.
Decided to use a tank design.
Initial navigation would use the TimingNavigator class developed in Lejos.
Communication would be achieved using the RCXPort provided by Lejos.
Robot Design Phase
Our initial robot design was a very basic tank design that was found using Google. The design was modified slightly to place the IR sensor in the back of the robot.
The final design was a modification made by the team in Sweden. This design placed the IR sensor towards the ceiling and added a well-designed bumper sensor to the front.
The TimingNavigator Class
The TimingNavigator class was written by the developers of Lejos.
It provides (somewhat) accurate movement based on how long it takes for two functions to be completed:
How long does it take to travel 1 meter?
How long does it take to rotate 360 degrees?
TimingNavigator, cont.
The measurements obtained depend on the surface used and the battery voltage, so the measurements would actually change over time.
It was determined a better solution was needed.
Tobias saves the day when he discovers the "poor man's" rotation sensor.
Rotation Sensor
The encoder wheel is attached to a large gear.
The light sensor is used to read white and black values on the wheel.
A count is kept during movement.
Rotation Sensor, cont.
Provides accurate movement regardless of battery voltage or motor speed.
Still requires measurements to be taken, but they should hold true for that situation.
Measurements required:
Count to travel 1 centimeter
Count to rotate 1 degree
Rotation Sensor, cont.
The rotation sensor did introduce some new headaches.
Find the largest number of sectors on the wheel that is still accurate.
Find the average value the light sensor “sees” for black and white areas.
Use trial and error to calculate 1 centimeter and 1 degree.
What is the accuracy obtained?
Smallest movement allowed
Smallest turn allowed
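The counting idea behind the "poor man's" rotation sensor can be sketched in plain Java. On the real robot this logic ran on the RCX under Lejos, reading the light sensor directly; here raw readings are fed in one at a time. The two thresholds and the ticks-per-centimeter constant are made-up illustration values, not the team's calibrated numbers.

```java
// Sketch of the encoder counting behind the "poor man's" rotation sensor.
public class EncoderCounter {

    private final int blackBelow;  // readings below this count as "black"
    private final int whiteAbove;  // readings above this count as "white"
    private boolean onWhite;
    private int count;

    public EncoderCounter(int blackBelow, int whiteAbove, boolean startOnWhite) {
        this.blackBelow = blackBelow;
        this.whiteAbove = whiteAbove;
        this.onWhite = startOnWhite;
    }

    /** Feed one raw light reading; count a tick on each black<->white transition. */
    public void read(int lightValue) {
        if (onWhite && lightValue < blackBelow) { onWhite = false; count++; }
        else if (!onWhite && lightValue > whiteAbove) { onWhite = true; count++; }
        // Readings between the two thresholds are ignored (hysteresis), so
        // sensor noise at a sector edge is not double-counted -- one answer
        // to the "average value the light sensor sees" headache above.
    }

    public int getCount() { return count; }

    /** Convert ticks to centimeters given a calibrated ticks-per-cm constant. */
    public static double toCentimeters(int ticks, double ticksPerCm) {
        return ticks / ticksPerCm;
    }
}
```

The count is what makes movement independent of battery voltage: distance is measured in wheel sectors passed, not in elapsed time.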
The LightNavigator class
Class developed to accurately move using the new rotation sensor.
Once the class is instantiated, all you need to do is tell it to travel ‘x’ centimeters or rotate ‘x’ degrees.
Drawback: Does not maintain current x and y position of the robot.
Communication with RCXPort
Decided to use the RCXPort already developed for Lejos.
Slow speeds, but provides reliable connection and ensures all data is transferred.
Easy to work with because it uses Java’s Data IO methods.
Develop RCXProtocol class to bridge the server <----> RCX communication.
RCXProtocol Class
Simple class that creates the RCXPort and retrieves the input and output streams.
Provides methods for communicating with the robot: travel 'x' centimeters; rotate 'x' degrees.
Sends the commands and blocks until a response from the RCX is received.
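A sketch of this send-then-block pattern is shown below, using plain Data I/O streams so it can be exercised with in-memory streams. The opcodes ('T' for travel, 'R' for rotate) and the one-byte-opcode-plus-int framing are assumptions for illustration; the real class wrapped the streams obtained from RCXPort.

```java
import java.io.*;

// Sketch of an RCXProtocol-style bridge: write an opcode and an argument,
// then block on readByte() until an acknowledgement byte arrives.
public class RcxProtocolSketch {
    private final DataOutputStream out;
    private final DataInputStream in;

    public RcxProtocolSketch(OutputStream rawOut, InputStream rawIn) {
        this.out = new DataOutputStream(rawOut);
        this.in = new DataInputStream(rawIn);
    }

    /** Send "travel n centimeters" and block until the RCX acknowledges. */
    public byte travel(int centimeters) {
        return send((byte) 'T', centimeters);
    }

    /** Send "rotate n degrees" and block until the RCX acknowledges. */
    public byte rotate(int degrees) {
        return send((byte) 'R', degrees);
    }

    private byte send(byte opcode, int argument) {
        try {
            out.writeByte(opcode);
            out.writeInt(argument);
            out.flush();
            return in.readByte();   // blocks until a response byte is received
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Blocking on readByte() is exactly why the lost-acknowledgement inconsistencies described later hurt: if the RCX executes a command but the acknowledgement never arrives, the server hangs on the read.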
Continuous Testing
During the design phase there was continuous testing of the robot’s “brain”.
Movement always seemed to be hit or miss when it came to accuracy.
What was the cause of the inconsistency?
The X and Y coordinates, along with the robot's rotation, were inaccurate.
Once those were removed/not used, we were able to provide more accurate movement.
Inconsistent Communication with RCX
There are still inconsistencies with the way the RCX communicates with the server:
Commands received and executed, but no acknowledgement
Commands never received
One solution was to add some minor delays after commands are sent and executed.
Final Words
Bumper sensor: stops all movement and plays a series of tones. No recovery at this point. Possibilities?
Lost communication: Java IO blocks, so it was hard to "timeout" the IO streams on the RCX. RCXPort provides reliable data transfer, so no packets will be lost. Possibilities?
References
"Poor Man's" Rotation Sensor: http://www.restena.lu/convict/Jeunes/Divers/Poor_rot.htm
Lejos Download and API: http://lejos.sourceforge.net/
Initial Tankbot Design: http://www-education.rec.ri.cmu.edu/robo_preview/content/mult/slide/tankbot.htm
PRESENTED BY TEAM TRIMAXION
'CAUSE YOU DON'T WANT TO END UP LIVING IN A VAN DOWN BY THE RIVER
Testing & Integration
Introduction
Testing of VideoServer and GUI across the Atlantic to determine connectivity requirements
Localized testing to determine how components interact locally
Full testing involving connections between America and Sweden
VideoServer to GUI Testing
After completion of localized testing we set out to “meet the Swedes” and get them on cam
We acquired a firewall free connection from Carl Strebal and proceeded to assist the Swedes with getting the code working on their end
We discovered that for optimum results both sides needed to be connected to the wall
Also during this testing process we decided on a consistent port access protocol:
Any connection from America to Sweden uses port 22224
Any connection from Sweden to America uses port 22222
Localized Testing
Problems:
Finding a computer was the hardest part; we tried a number of machines before settling on Eric's machine to host Astrolabe (Main Server)
Over-calculation of scaled values: the robot moved too far
Required network testing to ensure proper communication. Jason: Video Server / GUI. Eric: GUI / Main Server (Astrolabe)
Integration: fusing atypical code; environment variable problem (.dll files and pathing)
DERKA DERKA
Conclusions
What would we do over?
Communication redesign (Bluetooth)
Multicast video server
Redesign of the movement calculation for the robot
Projectile launching device
Final Thoughts
Very worthwhile – learned about communication and managing a distributed application project
Gained new insight into other areas where Java can be used for development: JMF for accessing webcams and audio devices; JAI for advanced image processing
Learned to work with members of another culture and made new friends
The problem seemed easy at first but ended up being very challenging
The main problem came from the lack of time to properly design the application
Difficulty in communicating at times also led to delays that forced us to push back various milestones
ARE YOU THINKING WHAT I'M THINKING, PINKY?
I THINK SO, BRAIN, BUT BURLAP CHAFES ME SO
AND NOW…. A DEMONSTRATION
And to close