
Collaborative Robotics for Assembly and Kitting in Smart Manufacturing

D3.4 Evaluation of the HRI/C system

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 688807

Public

D3.4 Evaluation of the HRI/C system


Project Acronym: ColRobot

Project full title: Collaborative Robotics for Assembly and Kitting in Smart Manufacturing

Project No: 688807

Call: H2020-ICT-2015

Coordinator: ENSAM

Project start date: February 1, 2016

Project duration: 36 months

Abstract This deliverable evaluates the performance of the demonstrator for the human-robot interaction/collaboration (HRI/C) system detailed in deliverable D3.3. The hardware, software, demo use-case and integration issues are described. Gesture recognition is critical to the success of the demonstrator, so the classification accuracy under different conditions and for different users is reported. The task sequencer and the feedback provided to the user are also analysed in terms of efficiency and modularity.

Document control sheet

Title of Document Evaluation of the HRI/C system

Work Package WP3 – Human-robot interaction and collaboration

Last version date 28/09/2017

Status Final

Document Version: V.2.2

File Name Colrobot WP3 D3.4 Evaluation of the HRI.C system v.2.1 28092017

Dissemination Level Public

Partner Responsible UC

Versioning and contribution history

Version Date Revision Description Partner

V.1 21/09/2017 Creation of the first draft UC (Nuno Mendes)

V1.1 21/09/2017 Reviewed UC (Pedro Neto)

V1.2 26/09/2017 Reviewed TAS (Raphael Boissonnade)

V.2 27/09/2017 Reviewed UC (Nuno Mendes)

V.2.1 28/09/2017 Reviewed UC (Nuno Mendes)

V.2.2 29/09/2017 Coordinator review ENSAM (Olivier Gibaru)

Disclaimer This document is provided "as is" with no warranties whatsoever, including any warranty of merchantability, non-infringement, fitness for any particular purpose, or any warranty otherwise arising out of any proposal, specification or sample. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted herein. The members of the project ColRobot do not accept any liability for actions or omissions of ColRobot members or third parties and disclaim any obligation to enforce the use of this document.

This document reflects only the authors' view and the Commission is not responsible for any use that may be made of the information it contains. This document is subject to change without notice.


Index

1. Executive Summary

2. Introduction

2.1. Hardware

2.1.1. Inertial measurement unit system

2.1.2. Surface electromyography system

2.1.3. Ultra-wideband system

2.1.4. Robot

2.1.5. Gripper

2.1.6. Personal computer

2.2. Software

2.3. Use-case

2.3.1. Bring parts and tools

2.3.2. Third hand

3. Gesture recognition

3.1. False positives

3.2. False negatives

4. Task sequencer

4.1. Reaction times

4.2. Feedback to user

5. Conclusion


1. Executive Summary Following deliverable D3.3, which presented the demonstrator for the human-robot interaction/collaboration (HRI/C) system, this deliverable evaluates the performance of the demonstrator against different parameters, both qualitatively and quantitatively.

The hardware, software, demo use-case and integration issues are described. Since this deliverable is public, some confidential details have been omitted. Gesture recognition is critical to the success of the demonstrator, so the classification accuracy under different conditions and for different users is reported. The task sequencer and the feedback provided to the user are also analysed.

In summary, the evaluation indicates that:

(1) Hardware and software are well integrated;

(2) Gesture classification is critical and false positives/negatives can occur. Touch commands help to reduce this problem (although they only work when the robot is near the human co-worker);

(3) Speech and visual feedback informs the co-worker about the progress of the interactive process;

(4) The task sequencer is modular and composed of primitive tasks that can be organized in different sequences.


2. Introduction

2.1. Hardware The demonstrator integrates several hardware components with different purposes. Inertial measurement unit (IMU) sensors are used to obtain information about upper-body postures and gestures, concerning the torso and arms. This is an off-the-shelf product named Tech-MCS V3, provided by the ColRobot project partner Technaid; a picture of the system can be seen in Figure 1. A surface electromyography (sEMG) system named Myo Gesture Control Armband, provided by Thalmic Labs, is used to obtain information about the human hand and its fingers, Figure 2. In future versions of the demonstrator, an EMG sensor from ColRobot partner Technaid will be used. An ultra-wideband (UWB) system named KIO, provided by Eliko Tehnoloogia Arenduskeskus, is used for tracking workers in their workplaces, Figure 3. KIO is a precise real-time location system (RTLS) that works in non-line-of-sight conditions, both indoors and outdoors. The other main hardware used in the ColRobot project is the collaborative and sensitive robot KUKA LBR iiwa 7 R800, which integrates force/torque feedback and visual feedback through an LED ring, Figure 4. A custom gripper was built to grasp objects, Figure 5. A personal computer (PC) is used to run the developed software.

2.1.1. Inertial measurement unit system

The Tech-MCS V3 IMU system is used to capture the upper-limb movements of the human worker (torso and arms). The system comprises inertial sensors (accelerometers and gyroscopes), magnetometers, and a thermometer. The Tech-MCS V3 incorporates 3D inertial sensors called "Tech-IMUs" and a hub device called "Tech-HUB V3" that organizes the data obtained from the Tech-IMUs and sends it to a PC, Figure 1. Data is transmitted by USB cable or by Bluetooth. These IMUs can be attached directly to the human body or integrated in the worker's jacket. Specifications of the Tech-MCS V3 are displayed in Table 1.

Table 1: Tech-MCS V3 specifications

Parameter Value

Tech HUB Size 156x100x43 mm (LxWxH)

Tech HUB Weight 225 g

Tech IMU Size 36x26x11 mm (LxWxH)

Tech IMU Weight 10 g

Power and Battery • Power Adapter 110/220 VAC

• 4 Batteries AA Type (1.2 V @ 2450 mAh)

Tech HUB Nominal Voltage 5 V

Tech HUB Maximum Current 3 A (16 IMU)

Communication Features • 4 Tech IMU Ports for 1-16 Tech IMU V4/CV4

• USB Port

• Bluetooth® Communication

• 150 m. Wireless Range (Line‐Of‐Sight)

• 50 m. Wireless Range Indoor

• MicroSD Recording (PC Offline Mode)


Figure 1: Tech-MCS V3
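The Tech-IMUs listed above combine accelerometers and gyroscopes. As an illustration of how such signals are typically fused into an orientation estimate, the sketch below shows a basic one-axis complementary filter: the gyroscope is trusted short-term, and the accelerometer's gravity-referenced tilt angle corrects long-term drift. This is a generic textbook example, not Technaid's actual fusion algorithm, and the function names are ours.

```python
import math

def accel_tilt(ay, az):
    """Tilt angle (degrees) of one axis, estimated from gravity alone."""
    return math.degrees(math.atan2(ay, az))

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step: integrate the gyro rate (fast, but drifting) and
    blend in the accelerometer angle (noisy, but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

With a high alpha the estimate follows the gyroscope on short timescales while slowly converging to the accelerometer reference, which is why such filters tolerate gyro bias without the angle drifting away.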

2.1.2. Surface electromyography system

A non-intrusive surface electromyography (sEMG) sensor provides information about human hand movements, which in turn is used to detect hand postures and gestures. This system indicates, for example, whether the hand is open or closed; accurate identification of finger joint angles is not expected. A Myo device, shown in Figure 2, has been used so far to develop the human-machine interaction technology. However, Technaid has been developing a novel device that integrates IMU and sEMG technology, which will be presented in future reports. Although the EMG device used in the current developments is not the same as the one that will be used in the final demonstrator, the developments can be easily ported to the novel device. The specifications of the Myo are presented in Table 2.

Figure 2: Myo Gesture Control Armband


Table 2: Myo specifications

Parameter Value

Arm size Expandable between 7.5 - 13 inches (19 - 34 cm) forearm circumference

Weight 93 g

Thickness 0.45 inches

Operation system • WINDOWS: Windows 7, Windows 8, Windows 10

• MAC: OS X 10.8 (Mountain Lion) and above

• IOS: 7.0 and higher

• ANDROID: Android 4.3 (Jelly Bean) and up

Gestures Hand gestures detected by proprietary EMG muscle sensors

Motion Highly sensitive motion sensor

Sensors • Medical-grade stainless steel EMG sensors

• Highly sensitive nine-axis IMU containing a three-axis gyroscope, three-axis accelerometer, and three-axis magnetometer

LEDs Dual Indicator LEDs

Processor ARM Cortex M4 Processor

Haptic Feedback Short, Medium, Long Vibrations

Communication Bluetooth® Smart Wireless Technology

Power and Battery • Micro-USB charging

• Built-in rechargeable lithium ion battery

• One full day of use on a single charge

2.1.3. Ultra-wideband system

The KIO RTLS Regular Cell configuration ensures a positioning accuracy of 30 cm in 2D/3D and is suitable for a range of precise positioning applications. Figure 3 shows two positioning elements (anchors) of the KIO system and Table 3 displays its main specifications.


Figure 3: KIO

Table 3: KIO specifications

Parameter Value

Operational distance 40 m line-of-sight (LoS)

Distance measurement error ± 15 cm for all distances

Ranging rate • 10 Hz (max) – μUSB powered tag and anchor

• 1 Hz – battery powered tag

Operational frequency range 3.1 - 4.8 GHz, 900 MHz signal bandwidth

RF transmission • Signal level -41.3 dBm/MHz

• LDC max 5% of RF channel utilization time

• Onboard omnidirectional antenna

Ranging scheme • Time-of-Flight multilateration

• Automatic anchor selection

Communication standard IEEE 802.15.4-2011-UWB

Host interface μUSB 2.0 12 Mbit/s

Power source • 5 V DC, μUSB ≤ 500mA for the standard cabled connection

• An internal 750 mAh rechargeable LiPo battery is available on request, charging over μUSB

Power consumption 120 mW, 32.4 mA

Tag battery lifetime ~14 days at 1 Hz update rate

Device dimensions • 85x55x15 mm – μUSB powered tag and anchor

• 85x55x18 mm – battery-powered tag

Device weight • 36g – μUSB powered tag and anchor

• 54g – battery-powered tag

Maximum number of tags • 4 Hz update rate – 10 tags

• 1 Hz update rate – 40 tags

System scale-up • Each anchor covers 100-150 m2, depending on the shape of the room and obstacles.

• Expanding the tracking area requires a new 4-anchor set
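Table 3 lists Time-of-Flight multilateration as the KIO ranging scheme. As an illustration of the principle only (a generic textbook method, not Eliko's implementation), the sketch below estimates a 2D tag position from anchor positions and measured ranges: subtracting the first range equation from the others removes the quadratic terms, leaving a linear system solved by least squares.

```python
def multilaterate_2d(anchors, ranges):
    """2D tag position from >= 3 anchors via linearized least squares."""
    ax0, ay0 = anchors[0]
    r0 = ranges[0]
    rows, rhs = [], []
    for (ax, ay), r in zip(anchors[1:], ranges[1:]):
        # ||p - a_i||^2 = r_i^2 minus the first equation gives:
        # 2 (a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2
        rows.append((2 * (ax - ax0), 2 * (ay - ay0)))
        rhs.append(r0**2 - r**2 + ax**2 + ay**2 - ax0**2 - ay0**2)
    # Solve the 2x2 normal equations (A^T A) p = A^T b for p = (x, y).
    sxx = sum(a * a for a, _ in rows)
    syy = sum(b * b for _, b in rows)
    sxy = sum(a * b for a, b in rows)
    bx = sum(a * c for (a, _), c in zip(rows, rhs))
    by = sum(b * c for (_, b), c in zip(rows, rhs))
    det = sxx * syy - sxy * sxy
    return ((syy * bx - sxy * by) / det, (sxx * by - sxy * bx) / det)
```

With four anchors and the ±15 cm ranging error quoted above, the least-squares fit averages out part of the per-range noise, which is consistent with the 30 cm positioning accuracy reported for the Regular Cell configuration.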


2.1.4. Robot

The industrial robot used in the current developments is the collaborative robot KUKA LBR iiwa 7 R800, shown in Figure 4. Its main specifications are presented in Table 4. This robot was chosen for its collaborative and sensitive capabilities, which help address safety issues for the human co-worker.

Figure 4: Kuka LBR iiwa 7 R800

Table 4: Robot specifications

Parameter Value

Brand KUKA

Model LBR iiwa 7 R800

Controller KUKA Sunrise Cabinet

Number of axes 7

Number of controlled axes 7

Volume of working envelope 1.7 m³

Pose repeatability (ISO 9283) ± 0.1 mm

Weight approx. 23.9 kg

Rated payload 7 kg

Maximum reach 800 mm

Ambient temperature during operation 5 °C to 45 °C (278 K to 318 K)

Air humidity 20 % to 80 %

Sound level < 75 dB (A)


2.1.5. Gripper

The gripper shown in Figure 5 was built to grasp the intended parts and tools, which are presented later in this report. The gripper specifications are displayed in Table 5.

Figure 5: Gripper

Table 5: Gripper specifications

Parameter Value

Number of fingers 2

Finger length 100 mm

Minimal distance between fingers (closed) 14 mm

Maximal distance between fingers (opened) 20 mm

Closing force 110 N

Opening force 90 N

Weight 0.2 kg

Minimum operating pressure 2 bar

Maximum operating pressure 8 bar

Nominal operating pressure 6 bar

Closing time 0.05 s

Opening time 0.05 s

Repeat accuracy 0.02 mm

2.1.6. Personal computer

A PC was used to run the software described in section 2.2. The PC specifications are presented in Table 6.

Table 6: PC specifications

Parameter Value

Processor Intel Core i5-6400

RAM DDR4 8G


Graphics card NVIDIA GeForce GT710 1GB

Wireless 802.11ac (WPA2 support)

Operating system Windows 10 Home or Professional editions

2.2. Software Three main independent applications, Figure 6, were developed to carry out HRI/C and, in particular, the use-case described in section 2.3. The applications are:

Gesture recognition system (Technaid system, sEMG system, and UWB system); Task manager; Robot program.

The first application continuously recognises gestures performed by a user and captured by the Tech-MCS V3. Each gesture is associated with a particular message; when a gesture is recognised, its associated message is sent to the task manager application.

The task manager application manages the execution of tasks (robotic or collaborative). Basically, this application consists of a message receiver and sender. Depending on the selected operation, a sequence of tasks must be performed in a given order. The task manager requests the execution of a task from the robot; when the robot finishes the task, it sends a message back to the task manager reporting its conclusion and signalling that it is ready to execute another task. After receiving this message, the task manager waits for a gesture message or a double-touch command, which triggers the execution of the next task in the sequence.

A task consists of a robot skill, or a sequence of robot skills performed in a given order. A robot skill, in turn, is a basic instruction that the robot can execute, for example: a robot movement, opening the gripper, closing the gripper, turning on an LED light, etc. The robot runs a program in which all robot skills are available. When the robot receives a message requesting the execution of a task, it performs the task; when the task ends, the robot sends a message to the task manager reporting its conclusion and becomes available to execute another task. In this state, when the robot is not executing any task, it is ready to detect a double touch; upon detecting one, the robot sends a message to the task manager reporting the double touch. The task manager, on receiving this message, initiates the next task in the sequence, as described above.


Figure 6: HRI/C architecture
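The message exchange between the task manager and the robot program can be sketched as a simple loop. This is an illustrative model only: the message names ("EXECUTE", "TASK_DONE", "GESTURE", "DOUBLE_TOUCH") and the queue-based transport are our assumptions, not the actual ColRobot protocol.

```python
from queue import Queue

class TaskManager:
    """Minimal sketch of the task manager loop described above."""

    def __init__(self, sequence, robot_queue, manager_queue):
        self.sequence = list(sequence)   # ordered task names, e.g. ["B1", "T1", ...]
        self.robot_queue = robot_queue   # messages sent to the robot program
        self.inbox = manager_queue       # messages from robot / gesture recognizer
        self.index = 0

    def run(self):
        while self.index < len(self.sequence):
            # Request execution of the next task of the sequence.
            self.robot_queue.put(("EXECUTE", self.sequence[self.index]))
            # Wait until the robot reports that this task is concluded.
            while self.inbox.get() != ("TASK_DONE", self.sequence[self.index]):
                pass
            self.index += 1
            if self.index == len(self.sequence):
                break
            # Wait for a recognized gesture or a double touch before continuing.
            while True:
                kind, payload = self.inbox.get()
                if kind in ("GESTURE", "DOUBLE_TOUCH"):
                    break
```

In the real system the robot program and the gesture recognizer run as separate applications, so the queues stand in for whatever inter-process messaging the demonstrator uses; the point of the sketch is the gating of each task on an explicit user event.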

2.3. Use-case The use-case presented in this report consists of a robot and a user sharing the same workspace and working together to perform assembly tasks (collaborative assembly). The use-case can be divided into two steps: first, the robot delivers tools and parts to the user; second, the robot assists the user in some assembly tasks. Tools are stored in a tool support, as shown in Figure 7, and have to be brought to the workplace by the robot. In the same way, parts such as screws, nuts and washers are kept inside boxes, together with other parts, in a part support, as shown in Figure 8; these parts also have to be brought to the workplace by the robot. Some of the tasks performed by the robot are:

B1 – Bring box 1, which contains screws of type 1, and leave it in the workplace;
T1 – Bring tool 1 and leave it in the workplace;
B2 – Tidy up box 1 in the part support, then bring box 2, which contains screws of type 2, and leave it in the workplace;
T2 – Bring tool 2 and leave it in the workplace;
B3 – Tidy up box 2 in the part support, then bring box 3, which contains washers, and leave it in the workplace;
FP – Bring fastening plates.

In order to perform this assembly operation, the robot and the user must work together, sometimes in parallel (performing independent tasks) and sometimes simultaneously (performing the same task together). Table 7 presents the list of tasks that the human and the robot must execute, in the displayed order.


Figure 7: Tools

Figure 8: Parts

Table 7: Task sequence for robotic system and user

# | Human | Robot
1 | Touch or perform a gesture to begin | -
2 | - | Execution of B1 task
3 | Touch robot or perform a gesture | -
4 | Take screws from box 1; put box 1 in the workplace | Execution of T1 task
5 | Touch robot or perform a gesture | -
6 | - | Execution of B2 task
7 | Touch robot or perform a gesture | -
8 | Take screws from box 2; put box 2 in the workplace | Execution of T2 task
9 | Touch robot or perform a gesture | -
10 | - | Execution of B3 task
11 | Touch robot or perform a gesture | -
12 | Take washers from box 3; put box 3 in the workplace | Execution of FP task
13 | Touch robot or perform a gesture | -
14 | Tidy up box 3 in the part support | -
15 | Perform a gesture | -
16 | - | Move close to workplace
17 | Touch robot to change to hand-guiding mode | -
18 | Catch a big part with the robot | User moves robot by hand with fast movements
19 | Screw fastening plates to the big part | -
20 | Position the big part into the assembly position | User moves robot by hand with high precision
21 | Screw the big part to a structure | Robot can perform small movements
22 | Give the task as finished | Task sequence finished

2.3.1. Bring parts and tools

From Table 7, we can see that tasks 2, 6, 10 and 12 represent Bring-part tasks, while tasks 3 and 8 represent Bring-tool tasks. Figure 9 and Figure 10 display the execution of tasks 2 and 3, respectively.

The task sequence is selected by the user on the robot; the corresponding sequence is then loaded in the task manager and the robot initiates its execution. The first task of the sequence is only performed after the user has performed the "Next" gesture (Figure 11), which grants the robot permission to execute the next task of the sequence, or a double touch on the robot. This procedure is repeated as described in Table 7. Figure 12 depicts the execution of the task sequence from the beginning up to task 17.


Figure 9: Bring task in execution for a part

Figure 10: Bring task in execution for a tool

Figure 11: User performing the “Next” gesture


Figure 12: Screenshots of the task sequence in execution

2.3.2. Third hand

One of the ColRobot goals is to use the robot as a third hand, which in the ColRobot project means moving the robot in compliance with the human hand while the robot holds a part. When the human releases the robot, it keeps its position. The third-hand functionality can work in two different ways: performing fast hand-guiding movements or performing precise hand-guiding movements. The last tasks of the task sequence use the third-hand concept, as depicted in Figure 13.


Figure 13: Screenshots of the task sequence using third hand concept

Fast hand guiding movement

The third-hand functionality provided by the ColRobot system for fast hand guiding consists of moving the robot manipulator by hand while the robot holds a part, a tool, or nothing. The movement is performed in the joint space of the robot, which allows fast movements and high flexibility. While the robot is working in this mode, the LED ring lights up in red, as shown in Figure 14. In order to move the robot in this mode, the user has to press the white button on the robot's flange. To change from the fast hand-guiding mode to the precise hand-guiding mode, the user has to press the green button for 2 seconds.

Figure 14: Performing a fast hand guiding movement


Precise hand guiding movement

The other hand-guiding mode is the precise hand-guiding movement, which allows highly accurate positioning of the robot manipulator. In this mode, all robot movements are compliant with the human hand and are performed in the robot's Cartesian space. When the robot is working in this mode, the LED ring lights up in blue, as shown in Figure 15, and the user does not need to press any button. To switch from the precise hand-guiding mode back to the fast hand-guiding mode, the user has to press the green button for 2 seconds. If the user presses the green button on the robot flange once, without keeping it pressed, the gripper opens or closes depending on its state. Notice that if the user holds the green button for a while but releases it before the 2 seconds elapse, nothing happens. Thus, the system provides the safety necessary to perform a collaborative task, and in particular the tasks described.

Figure 15: Performing a precise hand guiding movement
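The green-button behaviour described above amounts to a small decision rule: a quick press toggles the gripper, a hold of 2 seconds switches hand-guiding modes, and an intermediate hold does nothing. The sketch below models this rule; it is illustrative only, not the actual robot controller code, and the 0.3 s short-press bound is our assumption, since the report only fixes the 2 s threshold.

```python
GRIPPER_TOGGLE = "gripper_toggle"
MODE_SWITCH = "mode_switch"
IGNORE = "ignore"
HOLD_THRESHOLD = 2.0  # seconds, per the behaviour described above

def classify_green_button(press_duration, short_press_max=0.3):
    """Map a press duration (seconds) to one of the three actions above.

    Rejecting in-between durations means a hold released too early has no
    effect, which is the safety behaviour the report describes.
    """
    if press_duration >= HOLD_THRESHOLD:
        return MODE_SWITCH
    if press_duration <= short_press_max:
        return GRIPPER_TOGGLE
    return IGNORE
```

The dead band between a short press and a full 2 s hold is what prevents an accidental partial hold from either switching modes or opening the gripper mid-task.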


3. Gesture recognition Seven static gestures and three dynamic gestures were trained in the gesture recognition application in order to interact with the robot. The recognition of these gestures is performed by two independent artificial neural network (ANN) architectures, one for static gestures and one for dynamic gestures. Tests were carried out to assess the feasibility of the two architectures. Each static gesture was repeated 1000 times and non-static-gestures were performed 2000 times; the results of these tests are presented in Table 8. Each dynamic gesture was repeated 120 times and non-dynamic-gestures were performed 120 times; the results are presented in Table 9. These tests were performed by different people, on different days and at different times.

The analysis of Table 8 shows that a global recognition rate (RR) of 0.953 was achieved. All seven static gestures presented a high RR, being correctly recognized 95.5% of the time. An important point worth highlighting is that no static gesture was misclassified as another gesture from the library, i.e. a given gesture is never recognized as a different library gesture. In the worst case, a static gesture belonging to the library is merely recognized as not being a gesture (non-gesture).

Table 8: Confusion matrix for static gesture recognition (rows: performed; columns: recognized)

Performed     Recognized as a library gesture                  As non-gesture   RR
1             1000                                             0                1.000
2             995                                              5                0.995
3             779                                              221              0.779
4             912                                              88               0.912
5             1000                                             0                1.000
6             1000                                             0                1.000
7             1000                                             0                1.000
Non-gesture   109 (45, 15 and 49, over three gesture classes)  1891             0.946

Global RR: 0.953
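As a sketch, the recognition rates reported above can be reproduced from the confusion matrix. The diagonal counts and the rejection/false-positive totals come from Table 8, but the placement of the 45/15/49 false positives among the gesture columns is assumed, since the table does not specify which classes they fell into.

```python
import numpy as np

# Confusion matrix: rows = performed class, columns = recognized class.
# Classes 0-6 are the seven static gestures, class 7 is "non-gesture".
cm = np.zeros((8, 8), dtype=int)
np.fill_diagonal(cm, [1000, 995, 779, 912, 1000, 1000, 1000, 1891])
cm[1, 7], cm[2, 7], cm[3, 7] = 5, 221, 88   # gestures rejected as non-gesture
cm[7, 0], cm[7, 1], cm[7, 2] = 45, 15, 49   # assumed columns for the false positives

per_class_rr = np.diag(cm) / cm.sum(axis=1)  # recognition rate per class
global_rr = np.trace(cm) / cm.sum()          # 8577 / 9000 = 0.953

print(np.round(per_class_rr, 3))
print(round(global_rr, 3))
```

Note that the global RR weighs the non-gesture row twice as heavily as each gesture row, since non-gestures were performed 2000 times rather than 1000.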

Recognition tests with the three dynamic gestures and with non-dynamic-gestures were performed 120 times each. Table 9 displays the results of these tests, which achieved a global RR of 0.996. The three gestures were correctly recognized 99.4% of the time. All of the non-gestures considered in the tests were classified as non-gestures, which means the recognition system is highly stable in rejecting disturbances.

Table 9: Confusion matrix for dynamic gesture recognition (rows: performed; columns: recognized)

Performed     Recognized correctly   As non-gesture   RR
1             120                    0                1.000
2             120                    0                1.000
3             118                    2                0.983
Non-gesture   0                      120              1.000

3.1. False positives

A false positive occurs when a performed gesture that does not belong to the gesture library is classified as a gesture from the library. As can be seen in Table 8, some false positives occurred for the static gesture test set: 109 of the 2000 non-gestures performed (about 5.5%) were accepted as valid gestures. The incorrectly classified non-gestures were similar to valid static gestures in the library; even a human sometimes has difficulty differentiating a gesture from a non-gesture when they are very similar. This situation was anticipated when the valid static gestures were defined, which is why they were chosen as poses that the user virtually never performs while executing his/her normal working tasks.

For the library of dynamic gestures, no false positives were observed. This is due to the complexity of the dynamic gestures, which effectively consist of a mix of basic dynamic movements combined with static gestures and non-gestures.

3.2. False negatives

A false negative is a valid gesture, belonging to the gesture library, which the recognition system classifies as a non-gesture; in other words, the gesture is not recognized. False negatives appear in the test results presented in Tables 8 and 9. In general, false negatives occur when the classification architecture is too restrictive with respect to the sensor data. However, if the architecture is not restrictive enough, there is a risk of false positives as well as misclassifications. The best classification architecture is the one that finds a balance between these two unwanted situations. Another cause of false negatives, specific to dynamic gestures, is the time a gesture takes to be performed: the system defines a minimum and a maximum duration for a dynamic gesture, and if the performed gesture falls outside this interval it is immediately considered a non-gesture. A false negative is not a serious problem; the user just needs to repeat the gesture.
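The duration check for dynamic gestures can be sketched as follows. This is a minimal illustration: the 0.5 s and 3.0 s bounds are assumptions, since the deliverable does not state the actual interval.

```python
# Assumed duration window for a valid dynamic gesture (seconds).
MIN_DURATION_S = 0.5
MAX_DURATION_S = 3.0

def accept_dynamic_gesture(duration_s: float) -> bool:
    """Reject a candidate dynamic gesture performed outside the
    allowed time interval, classifying it as a non-gesture."""
    return MIN_DURATION_S <= duration_s <= MAX_DURATION_S
```

Performing the duration test before running the ANN classifier cheaply filters out many disturbances, at the cost of the false negatives described above when a valid gesture is performed too quickly or too slowly.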


4. Task sequencer

To manage the execution of a robotic task sequence, an application called the task manager is used. This application receives messages from the robot and from the gesture recognition application and sends messages back. Additionally, the task manager provides speech and visual feedback. The robotic task sequence is defined in the task manager, while all of the individual tasks the robot can perform are defined on the robot. The task manager asks the robot to perform one of the tasks, depending on the sequence, and when that task has finished the robot sends a message back informing the task manager of the end of the individual task. This process repeats until the last task of the sequence is completed, leaving the ColRobot system ready to perform the same or another robotic task sequence.
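The request/acknowledge loop described above can be sketched roughly as follows. This is a hypothetical illustration: the message strings and the `RobotStub` stand-in are assumptions introduced for the sketch, not the project's actual protocol.

```python
class RobotStub:
    """Stand-in for the robot controller: executes a named task and
    replies with an end-of-task message (here, synchronously)."""
    def execute(self, task: str) -> str:
        return f"END:{task}"

def run_sequence(robot: RobotStub, sequence: list) -> list:
    """Ask the robot to perform each task in order, waiting for the
    end-of-task message before requesting the next one."""
    log = []
    for task in sequence:
        log.append(f"REQ:{task}")
        reply = robot.execute(task)  # blocks until the robot reports completion
        assert reply == f"END:{task}"
        log.append(reply)
    return log  # sequence finished: sequencer ready for a new sequence
```

Keeping the sequence definition in the task manager and the task implementations on the robot, as the text describes, means a new sequence can be composed without touching the robot-side code.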

4.1. Reaction times

The time the system takes to detect a gesture and execute the function associated with it depends on the gesture: some gestures take longer to perform than others.

To establish the reaction time of the system, an experiment was carried out that consisted of performing each gesture and recording the elapsed time until its effect was produced. Tables 10 and 11 display the mean reaction times for each static and dynamic gesture. The results in the tables consider only gestures performed without false negatives; a false negative would increase the reaction time, since the user would need to perform the gesture again.

Table 10: Mean reaction time for static gestures

Static gesture Reaction time (s)

1 1.2

2 1.1

3 1.3

4 1.2

5 1.3

6 1.2

7 1.5

Table 11: Mean reaction time for dynamic gestures

Dynamic gesture Reaction time (s)

1 3.7

2 3.8

3 4.1

Reaction time for physical contact is also important in two main situations: a double touch that makes the robot execute the next task of the sequence, and stopping the motion for safety reasons (an accidental collision between human and robot). The double touch takes on average about 1.4 seconds, which is considered a reasonable time for the purpose. A collision between the user and the robot leads to a stopping reaction time below 1 second. Depending on the task being executed, the robot either stops immediately upon receiving the stopping message or stops only after the skill currently in execution is completed. If the robot detects a collision, it stops immediately; if, on the other hand, the stop instruction is given by the user through a gesture, the robot stops only after finishing the skill currently in execution. For additional safety, the user has at any time a physical button that stops the whole system.
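This stop policy can be summarized in a small decision function. This is a sketch only: the event and behavior names are assumptions introduced for illustration.

```python
def stop_behavior(event: str) -> str:
    """Decide how the robot stops for a given safety event.

    A detected collision triggers an immediate stop; a stop gesture
    lets the skill currently in execution finish first; the physical
    emergency button halts the whole system at any time.
    """
    if event == "collision":
        return "stop_immediately"
    if event == "stop_gesture":
        return "stop_after_current_skill"
    if event == "emergency_button":
        return "halt_whole_system"
    return "continue"
```

Separating the two stop severities reflects the text: collisions are safety-critical and cannot wait, while a commanded stop can be deferred to a well-defined point in the task.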


4.2. Feedback to user

To inform the user about the robotic task sequence, three independent channels are used: two visual and one audio. Visual feedback is provided by a screen, managed by the task sequencer, and by a LED ring, managed by the robot. The screen shows the robot state, the task currently in execution, and the next task in the sequence, as shown in Figure 16.

Figure 16: Feedback on a screen

Further visual feedback is provided by the robot through a LED ring with three different colors, displayed while the robot is executing a hand guiding task. The red LED is continuously lit when the robot is being hand guided in the joint space; in this mode the robot moves at relatively high speed with low positional accuracy. Figure 17 shows the LED ring displaying red light.

Figure 17: Feedback on the robot LED ring

The task manager application also provides audio/speech information to the user about the beginning and end of a task sequence and about the beginning of each task.


5. Conclusion

This deliverable evaluates the performance of the demonstrator for the HRI/C system. Both cognitive and physical interaction modes were analyzed, and the main hardware and software were described in the context of the presented use-case. Classification accuracy and system reaction times were estimated. It was concluded that the classification achieved is robust to false positives/negatives when the number of gestures is relatively small. The physical interaction with the robot allows precise positioning, and the task manager is modular. Visual and audio feedback help guide the user through the collaborative task being performed.