Recording hand-surface usage in grasp demonstrations
Ravin de Souza1, José Santos-Victor2, and Aude Billard1
1Learning Algorithms and Systems Laboratory, EPFL, Switzerland
2Institute for Systems and Robotics, IST, Portugal
Humans are expert graspers, having mastered the use of their hands to grasp objects in different ways and for different purposes. Data on how humans use their hands therefore provides an excellent resource for deriving insights from human grasp behaviour that can be transferred to robotic hands. The important question is: what to record? We may record the demonstrated hand shape or configuration, as in [1], [2]. However, the observation that grasping is a contact-based activity, and that it occurs within a task context, indicates that we must also record the tactile interaction between hand surfaces and the grasped object. This is commonly achieved by covering the grasping surface of the hand with tactile sensors [3], [4], [5]. In this regard, it is important to recognize that both the frontal surfaces and the finger sides have to be instrumented, as opposition of the latter against the thumb plays an important task-relevant role in several commonly encountered tasks such as screw-driving, opening a bottle-cap, cutting, and hammering. In the grasp of Figure 1, thumb surfaces act against the finger surface, the finger side, and the palm to grasp the tennis ball. But these actions take on a different significance if the same grasp were used, for example, to manipulate the stick-shift of a car or to turn a large knob. Thus, merely recording the configuration and force distribution on the hand is not sufficient; we also need knowledge of surface geometry to analyze the manner in which hand surfaces interact with each other and against task forces.
To achieve this, we propose a data glove covered with tactile sensors, which obtains tactile and configuration information simultaneously. The 'sensorized' glove is similar to [3], except that all grasping surfaces, including the finger sides, are covered with tactile sensors. Further, we use a kinematic model of the hand, accounting for pronation/supination of the thumb and scaled according to the hand measurements of the demonstrator, to reconstruct the geometry of the grasping surfaces.
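A minimal sketch of how such a geometric reconstruction could proceed for a single finger, assuming a planar serial chain with one flexion joint per phalanx; the segment lengths and uniform scale factor below are illustrative assumptions, not our actual hand model:

# Minimal per-finger forward-kinematics sketch, assuming a planar serial
# chain with one flexion joint per phalanx. Segment lengths and the
# uniform hand-scale factor are illustrative, not the model in the paper.
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation about z by theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0, 0.0],
                     [s,  c, 0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def trans_x(d):
    """Homogeneous transform: translation along x by d."""
    T = np.eye(4)
    T[0, 3] = d
    return T

def finger_patch_frames(joint_angles, segment_lengths, hand_scale=1.0):
    """Return one 4x4 frame per phalanx surface, proximal to distal.

    joint_angles    : flexion angle at each joint, radians
    segment_lengths : nominal phalanx lengths, scaled to the demonstrator
    """
    frames, T = [], np.eye(4)
    for theta, length in zip(joint_angles, segment_lengths):
        T = T @ rot_z(theta)                  # rotate at the joint
        T = T @ trans_x(hand_scale * length)  # advance along the phalanx
        frames.append(T.copy())               # frame of this surface patch
    return frames

# Example: index finger flexed ~45 degrees at each joint, small hand.
frames = finger_patch_frames([np.pi / 4] * 3, [0.045, 0.025, 0.020], 0.95)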
Fig. 1: A grasp data representation which includes surface geometry to capture how hand surfaces are used.

We consider the demonstrated hand-surface usage to be task-relevant, and this is captured in raw form by the pose and force vectors (D = [p, f]_{i=1}^{n}) associated with the elements of a grasping-patch decomposition imposed on the surface of the hand (Figure 1). As shown in the figure, grasping patches correspond to the tactile sensor arrays, and all sensory elements of a single patch are assumed to act cohesively.
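For concreteness, the raw record D could be held in a structure like the following; the field names and vector conventions are assumptions made for illustration only:

# Hypothetical container for the raw grasp record D = [p, f]_{i=1}^{n}:
# one pose vector and one net force vector per grasping patch.
from dataclasses import dataclass
import numpy as np

@dataclass
class PatchSample:
    patch_id: int      # index into the grasping-patch decomposition
    pose: np.ndarray   # p_i: patch position/orientation in the hand frame
    force: np.ndarray  # f_i: net contact force sensed by the patch

def make_record(poses, forces):
    """Assemble D from per-patch pose and force arrays."""
    return [PatchSample(i, p, f) for i, (p, f) in enumerate(zip(poses, forces))]

# Example: 34 patches with 6-D poses and 3-D forces (all zeros here).
D = make_record(np.zeros((34, 6)), np.zeros((34, 3)))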
Grasp data captured in this way allows us to create an intermediate representation of the grasp based on an analysis of pairwise interactions between all grasping patches. From this we can separate the grasp into a set of cooperating high-level oppositional intentions [8], which can then be used to recreate the task-relevant information, i.e., the D, that was demonstrated.
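A minimal sketch of such a pairwise test, reusing the PatchSample records above; flagging near-antiparallel contact forces as opposition is a deliberate simplification, and the cosine threshold is our assumption (the full method is detailed in [8]):

# Sketch of a pairwise opposition test between grasping patches: two
# patches are flagged as an oppositional pair when their contact forces
# point roughly toward each other. The threshold is an assumption.
import numpy as np
from itertools import combinations

def oppositional_pairs(record, cos_threshold=-0.8):
    pairs = []
    for a, b in combinations(record, 2):
        fa = a.force / (np.linalg.norm(a.force) + 1e-9)
        fb = b.force / (np.linalg.norm(b.force) + 1e-9)
        if np.dot(fa, fb) < cos_threshold:  # nearly antiparallel forces
            pairs.append((a.patch_id, b.patch_id))
    return pairs

# Example, on the record D built in the previous sketch:
pairs = oppositional_pairs(D)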
We have so far constructed a grasp data set comprising task-oriented grasps of real-world objects as well as canonical grasps taken from a grasp taxonomy (Figures 2-6). For each grasp scenario we provide the following (a loading sketch follows the list):
• A picture of the object and the grasp made
• Joint angles using a 22-DOF hand model
• Raw tactile data for 34 grasping patches
• Pose and force vectors corresponding to how each grasping patch was used (D = [p, f]_{i=1}^{34})
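A sketch of how one such scenario record might be loaded; the JSON layout and field names here are hypothetical, chosen only to mirror the four items listed above, and do not describe the published file format:

# Hypothetical loader for one grasp scenario; the JSON layout mirrors
# the four items above but is an assumption, not the released format.
import json
import numpy as np

def load_scenario(path):
    with open(path) as fh:
        raw = json.load(fh)
    return {
        "photo": raw["photo"],                            # image of object + grasp
        "joint_angles": np.asarray(raw["joint_angles"]),  # 22-DOF hand configuration
        "tactile": np.asarray(raw["tactile"]),            # raw data, 34 patches
        "D": [(np.asarray(p), np.asarray(f))              # (pose, force) per patch
              for p, f in raw["patch_pose_force"]],
    }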
We are working to extend our existing grasp data set to
cover all objects in the YCB object set [9]. We expect
that this can provide a valuable resource for researchers
to extract insights about human grasping that can be
applied to robots.
Fig. 2: Examples of grasp scenarios included in the data set. Grasps are taken from the Feix taxonomy [7].
Fig. 3: Similar grasp on different objects. Opening tight bottle-caps of different size/shape.
Fig. 4: Similar grasp on different objects. Cutting with different diameter/weight handles.
Fig. 5: Different grasps on the same object. Opening a bottle-cap when it is tight and when it is loose. Entirely different grasps are required for each case.
ACKNOWLEDGMENT
This research was funded by a doctoral grant (SFRH/BD/51071/2010) from the Portuguese Fundação para a Ciência e a Tecnologia and by the Swiss National Science Foundation through the National Centre of Competence in Research (NCCR) in Robotics.
Fig. 6: Different grasps on the same object. Cutting with different tools. The tools have the same handle but have the cutting blade positioned differently, requiring entirely different grasps for each case.
REFERENCES
[1] H. Kjellström, J. Romero, D. Kragic, Visual recognition of
grasps for human-to-robot mapping, in: IEEE/RSJ International
Conference on Intelligent Robots and Systems, 2008.
[2] H. Friedrich, V. Grossmann, M. Ehrenmann, O. Rogalla, R. Zöllner, R. Dillmann, Towards cognitive elementary operators: grasp classification using neural network classifiers, in: International Conference on Intelligent Systems and Control, 1999.
[3] D. R. Faria, R. Martins, J. Lobo, J. Dias, Extracting data from
human manipulation of objects towards improving autonomous
robotic grasping, Robotics and Autonomous Systems 60 (3)
(2012) 396–410.
[4] K. Murakami, K. Matsuo, T. Hasegawa, R. Kurazume, A decision method for placement of tactile elements on a sensor glove for the recognition of grasp types, IEEE/ASME Transactions on Mechatronics 15 (1).
[5] G. Büscher, R. Kõiva, C. Schürmann, R. Haschke, H. J. Ritter,
Tactile dataglove with fabric-based sensors, in: IEEE-RAS
International Conference on Humanoid Robots, 2012, pp. 204–
209.
[6] R. L. De Souza, S. El-Khoury, J. Santos-Victor, A. Billard, Towards comprehensive capture of human grasping and manipulation skills, in: 13th International Symposium on the 3-D Analysis of Human Movement, Lausanne, Switzerland, 2014.
[7] T. Feix, R. Pawlik, H. Schmiedmayer, J. Romero, D. Kragic,
A comprehensive grasp taxonomy, in: Robotics, Science and
Systems: Workshop on Understanding the Human Hand for
Advancing Robotic Manipulation, 2009.
URL http://grasp.xief.net
[8] R. L. De Souza, S. El-Khoury, J. Santos-Victor, A. Billard,
Recognizing the grasp intention from human demonstration,
Robotics and Autonomous Systems.
[9] B. Calli, A. Walsman, A. Singh, S. Srinivasa, P. Abbeel, A. M. Dollar, Benchmarking in manipulation research: Using the Yale-CMU-Berkeley object and model set, IEEE Robotics and Automation Magazine 22 (3) (2015) 36–52.