Proprioceptive Perception for Object Weight Classification
Jivko Sinapov, Kaijen Hsiao and Radu Bogdan Rusu
What is Proprioception?
"It is the sense that indicates whether the body is moving with required effort, as well as where the various parts of the body are located in relation to each other." (Wikipedia)
Why Proprioception?
Full vs. Empty
Hard vs. Soft
Lifting: gravity, effort, etc.
Pushing: friction, mass, etc.
Squeezing: compliance, flexibility
Power, Play and Exploration in Children and Animals, 2000
Related Work: Proprioception
Learning Haptic Representations of Objects [Natale et al., 2004]
Related Work: Proprioception
Proprioceptive Object Recognition [Bergquist et al., 2009]
Perception Problem for PR2:
Is the bottle full or empty?
General Approach
Let the robot experience what full and empty bottles feel like
Use prior experience to classify new bottles as either full or empty
Behaviors
1) Unsupported Holding
2) Lifting
Data Representation
Behavior Execution i: [J_i, E_i, C_i]
Recorded Data:
J_i: joint positions
E_i: joint efforts
C_i: class label in {full, empty}
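A minimal sketch of one such record, assuming Python and illustrative field names (the slides only give the tuple [J_i, E_i, C_i]):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BehaviorExecution:
    """One recorded behavior execution, denoted [J_i, E_i, C_i] in the
    slides. Field names are illustrative, not from the original work."""
    joint_positions: List[float]  # J_i, sampled over the behavior
    joint_efforts: List[float]    # E_i, sampled over the behavior
    label: str                    # C_i, either 'full' or 'empty'
```

In practice each execution would hold one position and effort trajectory per arm joint; flat lists are used here to keep the sketch minimal.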
Example: Recorded Joint Efforts of Left Arm
Classification Procedure
[J_i, E_i, ?] → Feature Extraction → Recognition Model
Recognition Model
Given a query X = [J_i, E_i, ?]:
1) Find the N closest neighbors to X in joint-feature space
2) Train a classifier C on the N neighbors that maps effort features to class labels
3) Use the trained classifier C to label X
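The three steps above can be sketched as a lazy, locally trained classifier. This is a sketch in plain NumPy, with a 1-NN in effort space standing in for the local classifier C (an assumption; the slides do not fix which classifier is trained on the neighborhood):

```python
import numpy as np

def classify(query_joints, query_efforts, J, E, labels, n_neighbors=5):
    """Lazy local-learning classifier sketch.

    J: (M, d_j) joint-position features of past behavior executions
    E: (M, d_e) joint-effort features of the same executions
    labels: (M,) class labels, e.g. 'full' / 'empty'
    """
    # Step 1: find the N closest past executions in joint-feature space
    d_joint = np.linalg.norm(J - query_joints, axis=1)
    idx = np.argsort(d_joint)[:n_neighbors]
    # Step 2: "train" a classifier on the neighbors' effort features;
    # here a 1-NN in effort space stands in for classifier C
    d_effort = np.linalg.norm(E[idx] - query_efforts, axis=1)
    # Step 3: label the query with the local classifier's prediction
    return labels[idx[np.argmin(d_effort)]]
```

Selecting neighbors by joint position first restricts the comparison to executions with a similar arm configuration, so that effort differences reflect the object rather than the pose.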
Training Procedure
Objects: five bottles, each tested full and empty
Procedure:
1) Place object on table
2) Robot grasps it and performs the current behavior (either hold or lift) at a random position in space
3) Robot puts the object back down on the table at a random position; repeat
Each behavior performed 100 times on each bottle in both full and empty states
A total of 2 behaviors x 5 bottles x 100 trials x 2 states = 2000 behavior executions
Evaluation
5-fold cross-validation: at each iteration, data from four of the five bottles is used for training, and data from the remaining bottle for testing
Three classification algorithms evaluated:
K-Nearest Neighbors
Support Vector Machine (quadratic kernel)
C4.5 Decision Tree
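The bottle-wise split described above can be sketched as a generator of train/test folds, one per held-out bottle (the record layout is illustrative, not from the slides):

```python
def bottle_folds(records):
    """Yield 5-fold cross-validation splits where each fold holds out
    all trials of one bottle, as in the slides' evaluation.

    `records` is a list of (bottle_id, features, label) tuples;
    the tuple layout is an assumption made for this sketch."""
    bottles = sorted({r[0] for r in records})
    for held_out in bottles:
        train = [r for r in records if r[0] != held_out]
        test = [r for r in records if r[0] == held_out]
        yield train, test
```

Holding out whole bottles (rather than random trials) tests generalization to unseen objects, which is the harder and more realistic condition.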
Chance Accuracy: 50%
Can the robot boost recognition rate by applying a behavior multiple times?
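One natural way to combine the labels from repeated executions is a majority vote; this is an assumed combination rule, since the slide only poses the question:

```python
from collections import Counter

def combine_trials(predictions):
    """Combine class predictions from repeated behavior executions
    by majority vote (an assumed rule; the slides do not specify how
    repeated trials are aggregated)."""
    return Counter(predictions).most_common(1)[0][0]
```

If individual trials err somewhat independently, the vote over several executions can be more accurate than any single trial.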
How much training data is necessary?
Application to Regression
Given a query X = [J_i, E_i, ?]:
1) Find the N closest neighbors to X in joint-feature space
2) Train a regression model C on the N neighbors that maps effort features to object weight
3) Use the trained regression model C to estimate the weight of X
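The same local-learning scheme can be sketched for regression; here a distance-weighted average in effort space stands in for the local regression model (an assumed choice, since the slides do not name one):

```python
import numpy as np

def estimate_weight(query_joints, query_efforts, J, E, weights,
                    n_neighbors=5, k=3):
    """Local regression sketch: predict object weight from effort
    features of the closest joint-space neighbors."""
    # Step 1: select the N closest past executions in joint-feature space
    idx = np.argsort(np.linalg.norm(J - query_joints, axis=1))[:n_neighbors]
    # Step 2: a simple local regression model over the neighborhood --
    # distance-weighted averaging of the k nearest efforts
    d_effort = np.linalg.norm(E[idx] - query_efforts, axis=1)
    near = idx[np.argsort(d_effort)[:k]]
    w = 1.0 / (np.linalg.norm(E[near] - query_efforts, axis=1) + 1e-6)
    # Step 3: the weighted average is the weight estimate for the query
    return float(np.sum(w * weights[near]) / np.sum(w))
```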
Regression Results
Mean Abs. Error = 0.08827 lbs
Chance Error = 0.2674 lbs
Application to Sorting Task
Sorting task:
1) Place empty bottles in the trash
2) Move full bottles to the other side of the table
Sorting Task: video
Application to a New Recognition Task
Is the box full or empty?
Behavior: slide object across table
40 trials with a full box and 40 trials with an empty box
Recognition Accuracy: 98.75% (all three algorithms)
Sliding task: video
Conclusion
Behavior-grounded approach to proprioceptive perception
Implemented as a ROS package:http://www.ros.org/wiki/proprioception
This work has been submitted to ICRA 2011.
Future Work
More advanced proprioceptive feature extraction
Multi-modal object perception: auditory, 3D, tactile
Note: experiments with the lift behavior were performed after those with the hold behavior.