
DRIVE C2X

Deliverable D11.3
Report on FOT Operations

Version number: 1.0
Dissemination level: PU
Lead contractor: ERT
Due date: 31/10/2013
Last update: 12/06/2014


    Authors

    Davide Brizzolara, ERTICO – ITS Europe

François Fischer, ERTICO – ITS Europe

    Harri Koskinen, VTT

    Sami Koskinen, VTT

    Ines Heinig, Chalmers University

    Peter Follin, Lindholmen Science Park AB (LSP)

    Cécile Barbier, Renault

    Luciano Ojeda, PSA Peugeot Citroën

    Filippo Visintainer, Centro Ricerche Fiat

    José Fernandez, CTAG

    Pirkko Rama, VTT

    Helge Rosé, Fraunhofer

    Matthias Schulze, Daimler AG


    Project funding

    7th Framework programme

    INFORMATION AND COMMUNICATION TECHNOLOGIES

    Objective ICT-2009.6.2: ICT for Mobility of the Future

    Large-scale integrating project

    Grant agreement no.: 270410

    Project co-ordinator

    Matthias Schulze

    Daimler AG

    HPC 050 – G003

    71059 Sindelfingen

    Germany

    Phone +49 7031 4389 603

    Mobile +49 160 86 33 30 8

    Fax +49 7031 4389 218

    E-mail [email protected]

    Legal disclaimer

    The information in this document is provided ‘as is’, and no guarantee or warranty is given that the information is fit for any particular purpose. The above referenced consortium members shall have no liability for damages of any kind including without limitation direct, special, indirect, or consequential damages that may result from the use of these materials subject to any liability which is mandatory due to applicable law.

    © 2014 by DRIVE C2X Consortium

Revision and history chart

Version  Date        Comment
0.1      01/07/2013  First draft of Table of Content
0.2      02/07/2013  ToC updated (VTT)
0.3      07/08/2013  ToC add-on and reorganisation
0.4      14/09/2013  Update of Chapter 4
0.5      24/10/2013  Added contribution TS Spain, TS Sweden
0.6      06/12/2013  Integration of contribution
0.7      13/12/2013  Update to Chapter 5
0.8      31/01/2014  Peer review resolution (ERT)
0.9      10/02/2014  Update of contributions from test sites
0.10     17/02/2014  Revision of the test
0.11     21/02/2014  Update of the contribution from TS France
0.12     05/03/2014  Correct table format – change conclusions
0.13     21/03/2014  Comments from VTT (Pirkko Rämö) included
0.14     25/03/2014  Comments from VTT (Tapani Mäkinen) included
0.15     03/04/2014  Small updates by VTT (Tapani Mäkinen/Harri Koskinen)
0.16     08/04/2014  Final draft (ERT)
0.17     30/05/2014  PTV inputs on WebSce included (ERT)
0.18     03/06/2014  Add clarifications about event table (ERT)
1.0      12/06/2014  Final editing, document ready for submission to EC (DAI)

Table of contents

Executive summary

1 Introduction
  1.1 DRIVE C2X objectives and scope
  1.2 Objectives of the deliverable
  1.3 Overall Organisation of the FOT operations
  1.4 Structure of the document

2 FOT preparation
  2.1 Methodology: principles and constraints
    2.1.1 Introduction
    2.1.2 Subjects
    2.1.3 Required sample sizes
    2.1.4 Functions and HMIs
    2.1.5 Study design
    2.1.6 Measures
    2.1.7 Test scenarios for the controlled tests
  2.2 Pre-deployment and validation activities
    2.2.1 Overview
    2.2.2 Implemented DRIVE C2X architecture
    2.2.3 Validation activities
    2.2.4 Main conclusion from the validation
  2.3 Test site adaptation
    2.3.1 Analysis of test sites
    2.3.2 Adaptation requirements
    2.3.3 Test site adaptation
    2.3.4 Conclusion from adaptation
    2.3.5 Adaptation of applications
  2.4 Data Management
    2.4.1 Overview
    2.4.2 Tools for data processing
  2.5 Piloting
    2.5.1 Piloting Overview
    2.5.2 Piloting activities
    2.5.3 General observations
    2.5.4 Lessons Learnt and conclusion

3 FOT testing process
  3.1 Introduction
  3.2 DRIVE C2X functions
    3.2.1 Overview of the tested functions
    3.2.2 HMI
    3.2.3 Functions implemented on the test sites
  3.3 Field Operational Test Organization
  3.4 Data management
    3.4.1 Data collection
    3.4.2 Data processing
    3.4.3 Plan for data provision
    3.4.4 Data Conversion for TS France and TS Germany
  3.5 Subject management
    3.5.1 Introduction
    3.5.2 User selection
    3.5.3 User agreement
    3.5.4 User instruction and feedback
    3.5.5 User coordination and communication
  3.6 Test vehicles management

4 FOT operations
  4.1 Introduction
  4.2 TS Finland
    4.2.1 Fleet Management / Test vehicles management
    4.2.2 User management
    4.2.3 FOT management
    4.2.4 Test process
    4.2.5 FOT Planning
    4.2.6 Data Management
  4.3 TS France
    4.3.1 Fleet Management / Test vehicles management
    4.3.2 User management
    4.3.3 FOT management
    4.3.4 Test process
    4.3.5 FOT planning
  4.4 TS Germany
    4.4.1 Fleet Management / Test vehicles management
    4.4.2 User management
    4.4.3 FOT management
    4.4.4 Test process
    4.4.5 FOT planning
    4.4.6 Data Management
  4.5 TS Italy
    4.5.1 Fleet Management / Test vehicles management
    4.5.2 User management
    4.5.3 FOT management
    4.5.4 Test process
    4.5.5 FOT Planning
    4.5.6 Data Management
  4.6 TS Spain
    4.6.1 Fleet Management / Test vehicles management
    4.6.2 User management
    4.6.3 FOT management
    4.6.4 Test process
    4.6.5 FOT Planning
    4.6.6 Data Management
  4.7 TS Sweden
    4.7.1 Fleet Management / Test vehicles management
    4.7.2 User management
    4.7.3 FOT management
    4.7.4 Test process
    4.7.5 FOT planning
    4.7.6 Data Management
  4.8 Overview of the FOT operations

5 FOT results
  5.1 Result criterion
  5.2 Log data
    5.2.1 Overall log data results
    5.2.2 Test site details
  5.3 User feedback

6 Conclusions

7 Glossary

8 References

Annex A Organization of provided data
Annex B Overview of the DRIVE C2X Architecture

Figures

Figure 1: Scheme of sub-projects in DRIVE C2X
Figure 2: Overview of FOT Operations in the framework of SP3 workflow
Figure 3: FESTA “V-procedure”
Figure 4: DRIVE C2X Reference Protocol Stack of an ITS Station
Figure 5: On Board Unit Deployment
Figure 6: Overview of the data flow
Figure 7: LogPro vs LogMover
Figure 8: LogMover functioning scheme
Figure 9: Schematic overview of DRIVE C2X FOT system and the position of piloting therein
Figure 10: Example of messages displayed to the driver on the Finnish controlled test route
Figure 11: Wide frame of the reference HMI
Figure 12: Test Management Centre
Figure 13: Test Management Centre – TS Spain
Figure 14: Data collection overview
Figure 15: Data Processing flow
Figure 16: Tuple mapping
Figure 17: Logmapper structure
Figure 18: User selection criteria
Figure 19: TS Finland equipment (CCU unit and HMI)
Figure 20: Test process – TS Finland
Figure 21: Test Procedure for one test session – Finland
Figure 22: FOT Operation Road map – TS Finland (batch 1)
Figure 23: FOT Operation Road map – TS Finland (batch 2)
Figure 24: FOT Operation Road map signage – Finland
Figure 25: IVS event shown to the driver in the winter conditions part of testing
Figure 26: RWW – TS Finland
Figure 27: Test day schedule – Finland (Batch 1)
Figure 28: Scenario edition tool “SCOREF_FOT_Scenario_Editor_V1.4.1” used in French CT
Figure 29: FOT Operation Road map – TS France
Figure 30: Scheduled messages occurrences for the controlled FOT
Figure 31: TS France – controlled test planning
Figure 32: Test Field Germany – Drive Centre Hessen
Figure 33: Test Field Germany – Urban Scenario
Figure 34: Test Field Germany – Rural Scenario
Figure 35: Test Field Germany – Urban Scenario
Figure 36: Functions tested on TS Germany
Figure 37: AEV scenario – route leading from the starting point to the end point at the test track in Friedberg
Figure 38: Test Route for the Car Breakdown Warning (CBW) test scenarios
Figure 39: Graphical representation of the four column formations
Figure 40: Test route for Emergency Electronic Brake light Warning (EEBL) test scenarios
Figure 41: Positions of IRS-equipped/non-IRS-equipped traffic lights in the urban road scenario
Figure 42: City routes of the simTD test field. Test route S1 for GLOSA is displayed
Figure 43: City routes of the simTD test field. Test route S2 for GLOSA is displayed
Figure 44: City routes of the simTD test field. Test route S3 for GLOSA is displayed
Figure 45: City routes of the simTD test field for In Vehicle Sign
Figure 46: City routes of the simTD test field for Road Work Warning
Figure 47: Route used for testing the TJAW from the starting point near the service area “Wetterau West” on the motorway A5 to the end point
Figure 48: Fleet and users: the people involved in one sample run, between BRE user cars (right) and CRF scenario cars (left)
Figure 49: Test roads – RIS location
Figure 50: Example of installed RIS on pole (left) and on gantry (right)
Figure 51: Sample route in the Italian Test Site, displayed on the Webscenario tool
Figure 52: DRIVE C2X test environment at TS Italy: “SAFETY SCENARIO”
Figure 53: CRF HMI at TS Italy
Figure 54: CRF YAMAHA HMI at TS Italy
Figure 55: Age distribution for NT tests – TS Spain
Figure 56: Age distribution for CT tests – TS Spain
Figure 57: FOT Management Centre in CTAG
Figure 58: Web tool for real-time monitoring
Figure 59: CT Test Scenario – TS Spain
Figure 60: Real view of test round for RWW
Figure 61: CT Test Scenario (IVS/SL) – TS Spain
Figure 62: CT Test Scenario (WW, GLOSA) – TS Spain
Figure 63: Test track (for GLOSA and WW): map view
Figure 64: Test track (for GLOSA and WW): real view
Figure 65: SISCOSA intelligent corridor – almost 100 km of high-capacity roads
Figure 66: FOT Operations planning – TS Spain
Figure 67: FOT Operation TS Spain Data Provision
Figure 68: FOT Operation TS Spain Data Provision (updated plan)
Figure 69: FOT Operation – TS Spain – NT tests data collection
Figure 70: Test Management tool interface – TS Sweden
Figure 71: Test roads – driver home addresses in blue, VCC area in red
Figure 72: OW/broken car test setup
Figure 73: RWW setup – using Cleanosol trucks
Figure 74: RWW setup – using Accident Research Car
Figure 75: WW setup – wind warning – TS Sweden
Figure 76: IVS setup via back office solution
Figure 77: GLOSA location in Gothenburg city
Figure 78: Test day schedule – Sweden
Figure 79: Data Flow – TS Sweden
Figure 80: DRIVE C2X Architecture

Tables

Table 1: Samples distribution on the test sites (baseline + treatment)
Table 2: Common test site features
Table 3: Reference system by test site
Table 4: CBW sign application on DRIVE C2X test sites
Table 5: IVS signs on the DRIVE C2X test sites
Table 6: Criteria for different information modes
Table 7: Types of HMI by test site
Table 8: Functions tested by test site
Table 9: Vehicles available on the Test Sites
Table 10: Subjects’ overview – TS Finland Batch 1
Table 11: Subjects’ overview – TS Finland Batch 2 (bad weather conditions tests)
Table 13: Overview of the FOT operations – TS France
Table 14: Overview of the simTD users. The number of drivers, gender deviation and mean age of each block are displayed
Table 15: simTD ITS Roadside Stations
Table 16: simTD ITS Roadside Stations on the different roads
Table 17: RIS selection – TS Italy
Table 18: TS Italy – Planned Events in the Controlled Scenario “SAFETY”
Table 19: FOT plan overview – TS Spain
Table 20: Expected number of events – TS Spain – Naturalistic Tests
Table 21: Expected number of events – TS Spain – Controlled Tests
Table 22: Forecast number of events – TS Sweden
Table 23: Overview of the FOT operations
Table 25: Summary of the number of events collected (treatment + baseline)
Table 26: Detailed number of events per function (baseline + treatment) – TS Finland
Table 27: Detailed number of events per function – TS France
Table 28: Detailed number of events per function – TS Germany
Table 29: Overall exposure – TS Sweden
Table 30: Detailed number of events per function – TS Sweden


    Executive summary

DRIVE C2X aims at delivering a comprehensive assessment of cooperative system functions, which provide, for instance, warnings about road hazards (road works, weather, traffic jams) or information to optimise driving through traffic lights, thereby increasing traffic efficiency. The assessment will use log data resulting from Field Operational Tests (FOTs) carried out on several test sites located in different EU countries. To achieve this goal, DRIVE C2X will address four major technical objectives:

1. Create and harmonise a Europe-wide testing environment for cooperative systems
2. Coordinate the tests carried out in parallel throughout the DRIVE C2X community
3. Evaluate cooperative systems
4. Promote cooperative driving

This deliverable reports on the activities carried out as part of DRIVE C2X sub-project 3, FOT operations. It documents the FOT operations and the activities for the collection of consistent and high-quality data for technical and impact assessment as well as user acceptance evaluation.

The project builds strongly on previous and on-going work on cooperative systems, and the FOT operations were carried out by a Europe-wide testing community including seven test sites in Finland, France, Germany, Italy, the Netherlands, Spain, and Sweden.

The coordination of the FOT operations was the responsibility of the test site leaders. However, the FOT organisation followed a common and harmonised methodology in order to preserve the conditions for combining data from all test sites in the common analyses. This was a challenging part of the FOT operation, because the methodology principles also needed to be adapted to local circumstances: each functional test site, having a particular road infrastructure, adapted the implementation of the cooperative system applications to its specific context.

Depending on the topology of the cooperative systems infrastructure, FOTs were executed in a specific way, for instance as naturalistic or as controlled, scenario-based tests. In the controlled approach, the drivers were called into the test and had to follow the driving instructions provided by the on-board unit, which made it possible to create specific test situations, such as a traffic jam involving several DRIVE C2X equipped vehicles. In the naturalistic approach, the test drivers' behaviour was monitored in their daily driving, and the routes and driving times were based on the drivers' needs. The tested scenarios comprise both V2I and V2V communication.

A wide set of functions was implemented in the DRIVE C2X reference system: Road works warning (RWW), Traffic jam ahead warning (TJAW), Car breakdown warning (CBW), Weather warning (WW), Approaching emergency vehicle warning (AEVW), In-vehicle signage (IVS), and Green-light optimal speed advisory (GLOSA).

The testing process on each test site was supported by a test management centre, which played a threefold role: active element of the cooperative applications in controlled tests, data collector for the test site, and monitoring system for the on-going trial.

    The FOT operations have been carried out in four phases:


1. FOT system integration and validation: testing and validating the general FOT framework defined in DRIVE C2X in practice, on a dedicated and fully DRIVE C2X compliant test site.

2. DRIVE C2X test site preparation and adaptation: preparation of the test sites from both organisational and technical points of view, in order to implement the common guidelines provided by the developers of the FOT Framework.

3. Piloting: each test site piloted the adapted DRIVE C2X FOT tools and procedures to ensure that it was ready for the collection of FOT data.

4. FOT management: coordination of the functional tests across all test sites in terms of Test Data Management, Fleet and User Management, and Service/Application Management.

Test data management was an essential part of the FOT operations. It not only supported the storing of the test results in specified files and databases, keeping track of the location of the different pieces of information, but also verified that the data collected with the logging tools matched the project requirements. Specific procedures were implemented for collecting log files manually or automatically, and for monitoring data collection and validating stored data, both manually and with the help of dedicated software.

A wide range of data was collected and stored: logs describing the internal states of the vehicle, generally information available on the CAN bus; basic GPS data (latitude, longitude, speed, heading, altitude and position fix mode, indicating the availability and, roughly, the quality of positioning); all the information sent to the vehicle's HMI devices, mainly to the display, complemented with logs of what was finally displayed on the screen and user settings, such as mute; the content of the DENM messages; and, finally, function-related log data, especially for technical evaluation purposes.
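As an illustration only, these data categories can be pictured as one per-timestamp record. The field names below are assumptions made for this sketch, not the actual DRIVE C2X log schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative, simplified per-timestamp log record combining the data
# categories listed above; field names are assumptions for this sketch.
@dataclass
class LogSample:
    timestamp: float               # acquisition time, in seconds
    # Vehicle-internal state, typically read from the CAN bus
    speed_kmh: float
    brake_active: bool
    # Basic GPS data
    latitude: float
    longitude: float
    heading_deg: float
    altitude_m: float
    fix_mode: int                  # rough indicator of positioning quality
    # HMI logging: what was sent to and finally shown on the display
    hmi_message: Optional[str]
    hmi_muted: bool
    # Payload of a received DENM message, if any
    denm_payload: Optional[bytes]
```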

For Fleet and User Management, specific documents were created with the aim of achieving optimal harmonisation across the test sites of the procedures for managing the users during the test runs.

Application Management supported the harmonisation of the DRIVE C2X functions on the different test sites. Differences in the implementation of the DRIVE C2X functions, needed to account for test site specificities, were not allowed to result in data inconsistent with the analysis requirements. Therefore, the application management also collected and documented the differences between test sites relating to the parameterisation of the functions. In this deliverable the execution of FOT operations is carefully detailed for the different test sites.

    The FOT results have shown that the sites have been able to collect and store compliant data, according to the DRIVE C2X guidelines and analysis requirements for the FOT evaluation.

    The challenge of collecting data from different test sites for a common technical and impact assessment has been successfully supported by the use of a common methodology applied by all test sites during the different phases of the FOT operations.


    1 Introduction

    1.1 DRIVE C2X objectives and scope

The objective of DRIVE C2X is to comprehensively assess cooperative systems by means of extensive Field Operational Tests (FOTs) in different European countries. This general objective is split into four major technical objectives: (1) create and harmonise a Europe-wide testing environment for cooperative systems; (2) coordinate the tests carried out in parallel throughout the DRIVE C2X community; (3) evaluate cooperative systems; (4) promote cooperative driving.

The DRIVE C2X reference system is based on sub-systems as defined by the COMeSafety communications architecture (see [3] and [4]) and is in line with the communications architecture standard published in ETSI EN 302 665 [5]. The DRIVE C2X system architecture makes use of the ITS Station concept and realises Vehicle ITS Stations (VIS), Roadside ITS Stations (RIS) and, as backend systems, Central ITS Stations (CIS) (see more details in Paragraph 2.2.2).
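As a minimal sketch (not project code), the three station roles and the V2V/V2I link types mentioned elsewhere in this document could be modelled as follows; the names and the link classification are illustrative assumptions.

```python
from enum import Enum

class StationType(Enum):
    # The three ITS station roles realised by the DRIVE C2X architecture
    VEHICLE = "VIS"    # Vehicle ITS Station
    ROADSIDE = "RIS"   # Roadside ITS Station
    CENTRAL = "CIS"    # Central ITS Station (backend system)

def link_kind(a: StationType, b: StationType) -> str:
    """Classify a communication link, e.g. for labelling logged messages."""
    if StationType.VEHICLE in (a, b):
        other = b if a is StationType.VEHICLE else a
        if other is StationType.VEHICLE:
            return "V2V"
        if other is StationType.ROADSIDE:
            return "V2I"
    return "backend"

assert link_kind(StationType.VEHICLE, StationType.VEHICLE) == "V2V"
assert link_kind(StationType.ROADSIDE, StationType.VEHICLE) == "V2I"
```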

A general understanding of the benefits of cooperative systems exists today, but until now those systems have been evaluated in small-scale experiments, mainly on closed test tracks. There is as yet no proof of these benefits with many communicating vehicles driven by ordinary people in variable conditions on roads.

The project builds strongly on previous and on-going work on cooperative systems. The Europe-wide testing community comprises seven test sites in Finland, France, Germany, Italy, the Netherlands, Spain, and Sweden. The essential activities in this project consist of defining the test methodology and evaluating the impacts of cooperative driving functions on users, the environment and society. In addition to impact assessment, other important areas of testing focus on the technical functionality and robustness of the systems, also in harsh winter conditions. The collected user feedback and the results from the technical tests will make it possible to create realistic business models for the market introduction.

The purpose of DRIVE C2X is to bind together and harmonise existing European test sites for common testing, and to coordinate this testing according to a mutually agreed methodology and operation procedures. DRIVE C2X relies on the following main pillars in developing and implementing its methodology:

    • Results of PRE-DRIVE C2X tools and methods necessary for successful test and evaluation of co-operative systems in a field operational trial;

    • FESTA handbook for FOT methodology, especially the V-shaped ‘FOT-chain’ to plan the testing procedure;

• Lessons learned from two large-scale European FOTs, EUROFOT and TeleFOT;

• Partners’ own experience in carrying out user tests in field conditions over the past 20 years.

The functions to be tested relate to different types of use cases: traffic flow, traffic management, local danger alert, driving assistance and local information services, as well as specific functions defined independently by each test site.


    The work is organised into five different sub-projects as detailed in Paragraph 1.3. This deliverable specifically illustrates how the actual FOTs were carried out and what procedures contributed to the successful completion of the FOT, focusing on the activities carried out in SP3.

    1.2 Objectives of the deliverable

The main goal of this deliverable is to report on the FOT operations, documenting the activities developed to achieve the following objectives:

• Execution of effective FOT operations, following the guidelines and lessons learned from other on-going FOT projects (especially TeleFOT) and applying a common and harmonised testing methodology.

• Collection of consistent and high-quality data for the activities carried out in FOT Evaluation (SP4): technical evaluation, user acceptance, and evaluation of the impacts of the implemented cooperative driving applications on driver behaviour, safety, the environment and traffic efficiency.

In this deliverable the FOT testing process and operations are illustrated, detailing the results in relation to the planned activities.

This deliverable presents the wide range of activities which were needed to establish and provide a consolidated and harmonised FOT test environment. The specific structure of the DRIVE C2X field trial is explained, as well as the interaction with a number of national projects and their partners. The test site adaptation and piloting are illustrated, followed by a presentation of the common testing process applied by all the test sites and, finally, a description of the FOT operation process and results for each test site.

The efforts made by the test sites to observe the FOT methodology as accurately as possible are highlighted, ensuring that all systems, test procedures and methods drove the FOT execution as they were designed to.

The DRIVE C2X applications or functions tested in the different tests are described. They were selected owing to their particular relevance from a European point of view; data was also collected at a number of national European test sites with national use cases (e.g. TS Germany).

The FOT operations are described separately for each test site: a short overview has been provided by the test site leaders describing the operations, the vehicles and users involved, the planned test scenarios and all data collection activities.

Specifically, it is described for each test site how the harmonised methodology was applied in order to provide data compliant with the analysis requirements.

    1.3 Overall Organisation of the FOT operations

The FOT Operations were carried out at six different European test sites, the so-called Functional Test Sites (FTS). The “FOT operations” sub-project was managed in close relationship with the other DRIVE C2X sub-projects, which required carrying out parallel activities and taking some interactions into consideration, as described in Figure 1.


    Figure 1: Scheme of sub-projects in DRIVE C2X

The “FOT operations” relied on strong collaboration with the selected national test sites. For logistical reasons, each test site was responsible for its own deployment of the cooperative ITS applications, as well as for the coordination of its FOT operations. Ensuring the use of harmonised FOT test processes and the provision of consistent FOT data for the impact and technical assessment required splitting the sub-project into four phases:

1. FOT system integration and validation. WP32 tested and validated in practice the general FOT framework defined in DRIVE C2X, on a dedicated and fully DRIVE C2X compliant test site. System testing, and in particular the interoperability of the systems delivered by the partners, had the highest priority. WP32 provided a full technical system validation, including guidelines and requirements for effective functional testing on the chosen test sites.

2. DRIVE C2X test site preparation and adaptation. WP33 prepared the test sites from both organisational and technical points of view in order to implement the common guidelines provided by SP2 (WP26). Specific descriptions of each test site were delivered, including full-scale test scenarios for both controlled and uncontrolled FOT approaches.

    3. Piloting. WP34 cooperated with each test site to pilot the adapted DRIVE C2X FOT tools and procedures to ensure that the test sites were ready for the collection of FOT data. WP34 validated each test site implementation before the start of functional testing.

4. FOT management. WP35, WP36, and WP37 coordinated and supported the FOT activities of the functional test sites, with the following respective focuses: Test Data Management, Fleet Management, and Service/Application Management. These work packages also supported the activities relating to the phases above.

The four phases started successively; however, a timely overlap of the corresponding activities enabled the work packages to support each other, as shown in Figure 2.


[Figure: diagram of the SP3 workflow over time – WP31: SP management; WP32: FOT system integration and validation; WP33: TS adaptation; WP34: TS piloting; WP35: test data management; WP36: fleet/user management; WP37: application and service management – with the management work packages providing support to the system TS, to TS adaptation and to TS piloting, and leading through milestones M3.1 (FOT system validated), M3.2 (TS adaptation), M3.3 (TS piloted and ready for FOT operations) and M3.4 (TS operations completed) into the FOT operations.]

Figure 2: Overview of FOT Operations in the framework of SP3 workflow

The organisation of FOT operations was the responsibility of the functional test site leaders. Each functional test site had a particular road infrastructure; therefore, the implementation of the cooperative system functions was adapted to the specific test site context. Depending on the test site environment, FOT operations were executed in a specific way, for instance running naturalistic or controlled, scenario-based tests.

    Following common testing methodology principles, the FOT operations ensured two main goals:

    • Collection of consistent data, as a result of the FOT tests, following the experimental design procedures provided by SP4.

    • Execution of harmonised FOT operations, following the guidelines and lessons learned from other FOT projects (TeleFOT for instance).

More details about the FOT testing methodology are provided in Chapter 3 of this deliverable.

    1.4 Structure of the document

The structure of the deliverable follows the logical and chronological sequence of the work performed, from the test site preparation to the execution of the FOTs on all test sites. A specific focus is put on the FOT testing methodology and the FOT results, highlighting the lessons learned.

    Chapter 2: FOT preparation

This chapter offers a general overview of the common methodology principles which drove the FOT preparation and were applied by the different test sites. Furthermore, the pre-deployment and validation activities are illustrated, covering the work carried out in WP32 on reference system validation and in WP33 on the adaptation and the particular implementation choices. Finally, a short report on the preparation of the data chain and data tools is provided, along with an illustration of the piloting activities.

    Chapter 3: FOT testing process

Chapter 3 reports how the FOT operations were organised, starting with a common part summarising the FOT testing process (baseline/treatment, naturalistic/controlled, etc.), based on the common methodology explained in Paragraph 2.1. It describes the overall data requirements, giving the process for handling data in relation to the FOT runs. A paragraph describes the specific DRIVE C2X applications and use cases. Finally, some specific topics are presented: data management, subject management and test vehicle management.

    Chapter 4: FOT operations

Chapter 4 details the FOT organisation for each test site, based on the FOT plans and the actual test executions. The chapter is divided into sub-chapters highlighting the situation of each test site and focusing on different aspects: fleet and user management, FOT management, FOT planning and data management.

    Chapter 5: Result of FOT execution

Chapter 5 illustrates the results of the FOT execution, referring to how well running the FOTs succeeded. It presents the outcome in terms of the number of runs, the recorded events and the lessons learned, giving an overview of the work on each test site.

    Chapter 6: Conclusion

Chapter 6 presents the conclusions on the work performed in the FOT operations; a reflection on the results forms the final section of this document.

    Chapter 7: Glossary

    Chapter 7 provides the meaning of the terms and abbreviations used in the deliverable.

    Chapter 8: References

Chapter 8 proposes a list of references to documents, publications and web links, providing more insight into the projects and topics mentioned in the deliverable.


    2 FOT preparation

    2.1 Methodology: principles and constraints

    2.1.1 Introduction

The development of the DRIVE C2X methodology was an iterative process carried out together with the test sites, the function developers and the research scientists responsible for impact assessment. The aim was to have a consistent evaluation methodology implemented throughout the DRIVE C2X tests. However, the methodology principles needed to be adapted to local circumstances, while keeping the methodology sufficiently harmonised in terms of critical aspects, thereby preserving the conditions for combining data from all test sites in the common analyses.

    Figure 3: FESTA “V-procedure”

The FESTA handbook for FOT methodology, especially the V-shaped ‘FOT-chain’ (Figure 3) used to plan the testing procedure, has been applied in DRIVE C2X.

The first version of the methodology guidelines was provided during the first six months of the DRIVE C2X project. The principles were explained, and the plans and circumstances of the tests were discussed separately for each test site. Meetings were organised at each test site in order to get insight into local circumstances and to agree on principles. By the end of the first project year, the methodology guidelines were amended and provided in an SP4 deliverable: D42.1 DRIVE C2X FOT research questions and hypotheses and experimental design.

The methodological principles were examined together with the test sites at several checkpoints before the piloting and the field tests:

• Vigo meeting (Feb 2012): focus on the plans of the test sites, organising the tests in accordance with the study design, use of the Web scenario editor.

• Helsinki meeting (Jun 2012): focus on study designs and baseline data collection.

• Helmond meeting (Sep 2012): focus on functions, function adaptations, HMIs, collection of metadata and user acceptance data.

• Helmond meeting (Dec 2012): focus on details of function adaptation and designs, specifically in the German TS.

• Conference calls, SP4 and each test site (except the German test site): focus on detailed plans of the test sites and guidance on study designs.

• Guidance by e-mail (spring 2013): documentation of piloting results and amendments to the test arrangements.

DRIVE C2X used two test approaches: naturalistic and controlled tests. In the controlled approach, the drivers were called into the test and asked to drive the test route with some arrangements. It was recommended that the tests be conducted in real traffic. In the naturalistic approach, the test drivers' behaviour was monitored in their daily driving, and the routes and driving times were based on the drivers' needs. As part of the naturalistic approach, in some cases the test drivers were asked or tempted to choose a specific route (this was called the semi-naturalistic approach). In both approaches driver behaviour was monitored and several data types were collected while driving.

    2.1.2 Subjects

The selection of test drivers defined the population to which the results could be generalised. Different kinds of groups had to be involved in the tests, always taking into account that for each group the amount of collected data had to reach a minimum size to enable conclusions.

Four main background variables were considered: 1) how experienced the drivers were in car driving, 2) their experience with new technology, 3) gender, and 4) age.

    For the DRIVE C2X FOT operations, the following criteria for the driver sample were suggested:

• No professional drivers, but quite experienced ones with an annual mileage of more than 10000 km;

• No ITS specialists were included in the tests;

• Both female and male drivers, at least 40% in each group;

• Three age groups should be represented: from 18 to 34 years (30%), from 35 to 60 years (40%), and from 60 years (30%).

The criteria were defined to reduce variation in the samples between the test sites and to focus on the potential customers in the coming years.
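A minimal sketch of how a recruited sample could be checked against these quotas is shown below; the Driver fields, the age grouping thresholds and the five-percentage-point tolerance are illustrative assumptions, not project specifications.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Driver:
    age: int
    gender: str          # "f" or "m"
    annual_km: int
    professional: bool
    its_specialist: bool

def check_sample(drivers: List[Driver], tol: float = 0.05) -> List[str]:
    """Return a list of quota violations for a non-empty candidate sample."""
    problems = []
    n = len(drivers)
    if any(d.professional or d.its_specialist for d in drivers):
        problems.append("professional drivers or ITS specialists present")
    if any(d.annual_km <= 10_000 for d in drivers):
        problems.append("drivers with 10000 km annual mileage or less")
    for g in ("f", "m"):
        if sum(d.gender == g for d in drivers) / n < 0.40:
            problems.append(f"gender group '{g}' below 40%")
    def age_group(age: int) -> str:
        return "18-34" if age <= 34 else ("35-60" if age <= 60 else "60+")
    targets = {"18-34": 0.30, "35-60": 0.40, "60+": 0.30}
    for name, share in targets.items():
        actual = sum(age_group(d.age) == name for d in drivers) / n
        if abs(actual - share) > tol:
            problems.append(f"age group {name}: {actual:.0%} vs. target {share:.0%}")
    return problems
```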


    2.1.3 Required sample sizes

The definition of the sample size, concerning the number of events needed for the analysis, was detailed in DRIVE C2X D42.1 on research questions and experimental design. On the one hand, from a theoretical point of view, a sample size of 1000 events would be sufficient for the evaluation of the main effects in tests conducted in real traffic. On the other hand, taking into account the feedback from the test sites on their indicative plans, Table 1 was completed, detailing how many events (i.e. specific driving occurrences relevant to a considered function) were targeted for each function by test site.

Table 1: Samples distribution on the test sites (baseline + treatment)

Function | Finland CT | France NT | France CT | Germany CT | Italy CT | Spain NT | Spain CT | Sweden NT
TJAW     | 0          | 30+10     | 30+30     | 40+40      | 30+30    | 70+10    | 30+30    | 0
RWW      | 50+50      | 50+125    | 50+50     | 80+80      | 30+30    | 156+40   | 50+50    | 50+50
CBW      | 50         | 50+10     | 50+50     | 200+200    | 30+30    | 54+5     | 30+30    | 30+30
AEVW     | 0+10       | 0         | 0         | 25+25      | 30+30    | 0        | 20+40    | 0
WW       | 25         | 10+125    | 10+10     | 200+200    | 0        | 868+5    | 30+60    | 30+30
EEBL     | 0          | 30        | 0         | 80+80      | 30       | 0        | 0        | 0
IVS      | 250+250    | 0         | 250+1250  | 1000+1000  | 300+300  | 300+300  | 320+320  | 1000+1000
GLOSA    | 0          | 0         | 0         | 500+500    | 0        | 0        | 20+20    | 300+300

Note: CT: Controlled Tests – NT: Naturalistic Tests
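For cross-checking purposes, the cells of Table 1 can be transcribed and totalled per function against the 1000-event guideline mentioned above. The snippet below is an illustrative sketch (the column grouping follows the reconstructed layout of Table 1), not a project tool.

```python
# Cells of Table 1 as "baseline+treatment" strings; column order:
# Finland CT, France NT, France CT, Germany CT, Italy CT,
# Spain NT, Spain CT, Sweden NT.
TABLE_1 = {
    "TJAW":  ["0", "30+10", "30+30", "40+40", "30+30", "70+10", "30+30", "0"],
    "RWW":   ["50+50", "50+125", "50+50", "80+80", "30+30", "156+40", "50+50", "50+50"],
    "CBW":   ["50", "50+10", "50+50", "200+200", "30+30", "54+5", "30+30", "30+30"],
    "AEVW":  ["0+10", "0", "0", "25+25", "30+30", "0", "20+40", "0"],
    "WW":    ["25", "10+125", "10+10", "200+200", "0", "868+5", "30+60", "30+30"],
    "EEBL":  ["0", "30", "0", "80+80", "30", "0", "0", "0"],
    "IVS":   ["250+250", "0", "250+1250", "1000+1000", "300+300", "300+300", "320+320", "1000+1000"],
    "GLOSA": ["0", "0", "0", "500+500", "0", "0", "20+20", "300+300"],
}

def total(cell: str) -> int:
    """Sum a 'baseline+treatment' cell such as '50+125'."""
    return sum(int(part) for part in cell.split("+"))

for function, row in TABLE_1.items():
    print(f"{function:>5}: {sum(total(c) for c in row):5d} targeted events")
```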

    2.1.4 Functions and HMIs

At the beginning of the DRIVE C2X project, the set of functions for full-scale impact assessment was decided. The goal was to have as many functions as possible applicable to several test sites. All functions applied would be included in the technical tests and user acceptance tests. The list of functions for the full-scale impact assessment was amended later, based on the selection of functions at the test sites.

The detailed function specifications and adaptations were defined by the function owners. Local circumstances and harmonisation aspects were considered for the function adaptation, together with the function owners. Some differences between test sites remained due to the local circumstances.

Originally, a common reference HMI was agreed upon. However, it turned out that some test sites needed to use their own HMIs, and thus obtained user acceptance and other results specific to those HMIs. Nevertheless, the basic messages to the drivers – e.g. a road works warning – were consistent across all test sites.

    2.1.5 Study design

To be able to assess the impacts occurring under the treatment condition, a reference or baseline was needed. It was highlighted that baseline data had to be collected for all tests designed for impact assessment.


Under the treatment condition, the considered application/system was active and assisted the driver in situations according to the use case. In the baseline condition, the test person drove the same route under the same conditions as during the treatment – but without the assistance of the application or the possibility to use the function. In controlled tests, the baseline run was arranged before the treatment run for each function, and similar events had to occur during the baseline condition. In the naturalistic approach, it was suggested that baseline data be collected in a before phase in which the HMI was not activated for the driver.

A within-subjects design was proposed as the first choice for most functions and approaches. This means that subjects serve as their own control: they are measured under both the baseline and the treatment condition, and the data for one person can be identified in both conditions.

The following paragraphs explain these concepts in more detail.

2.1.5.1 Naturalistic versus controlled approach

Two different approaches were used at the test sites for data collection: controlled and naturalistic.

    Controlled approach

In the controlled approach, test drivers were called in for the test and asked to drive the test route under certain arrangements. Tests were conducted in real traffic as much as possible (e.g. TS Italy). However, to guarantee driver safety, some use cases were run only on a closed test track (e.g. TS Spain). One test included several runs of the route, and several situational variables (i.e. circumstances) could be fixed and selected in advance. The experiments were designed so that some variables were systematically controlled during data collection.

Controlled tests can be more practical to organise than naturalistic tests, and in some cases they were essential for certain functions, especially when several equipped parties were needed in the same location to provide C2X services.

The controlled approach offered the possibility to create the test circumstances necessary for the use case. For controlled tests, a within-subjects design was suggested.

    Naturalistic approach

In the naturalistic approach, the test drivers' behaviour was monitored in their daily driving, and the routes and driving times depended on the drivers' needs.

The collected data were organised according to the independent and situational variables afterwards; in this approach, the design was imposed on the bulk of the data after collection. It was also possible to capture situations beyond experimental control, e.g. weather.

The clear advantage of the naturalistic approach was its better ecological validity compared with controlled tests: it was easier to conclude what would happen in real traffic once the systems were implemented and widely used. In addition, the naturalistic approach allowed impacts on mobility and overall long-term impacts to be assessed. Its problem turned out to be the small number of recorded events.

It should be noted that vehicles using the cooperative system received information from other entities, such as cars (V2V) or roadside units (V2I), equipped with the DRIVE C2X reference system. Vehicles were therefore neither standalone nor independent, and reception of messages from the other ITS stations had to be ensured. This constraint favoured the controlled approach, where the conditions for exchanging messages with other ITS stations could be ensured in the test scenario. Moreover, the frequency of encountering relevant situations could be estimated more reliably for controlled tests than for naturalistic tests.

In summary, in the naturalistic approach:

• Drivers drove regularly along a given corridor equipped with roadside ITS stations for the I2V functions.

• V2V functions required many vehicles driving in the ITS corridor daily at the same period of time (e.g. employees of the same company driving to or from their workplace).

In the controlled approach:

• Roads (or test tracks) were defined with a list of events.

• Specific conditions concerning other equipped vehicles were required to enable testing certain functions (Car Breakdown Warning, for instance).

    2.1.5.2 Baseline vs. treatment

To enable the impact assessment of the DRIVE C2X reference system, a reference or baseline was needed. In the baseline condition, the test person drove the same route, under the same conditions, as during the tests – but without the application or the possibility to use the function(s) provided by the DRIVE C2X reference system.

In the baseline phase, data collection was active but the function(s) operated in "silent mode": data were collected, but the system did not give any signals to the driver. From the driver's point of view, the function(s) were off. Baseline data for each test person were collected using the same car in the same or a similar area (speed limits, road type, and link/intersection) and under similar conditions (lighting, weather, traffic situation) as in the actual tests with the function in use. Baseline data were needed for events relevant to the function.

After the baseline data had been collected, the functions were activated (mostly one at a time) and the driving tasks were repeated.

    The treatment phase was the part of the data collection during which the function(s) were switched on by the experimental leader.
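To make the baseline/treatment mechanics concrete, a minimal sketch follows. All names (TestPhase, HmiChannel, DataLogger) are invented for illustration; this is not the DRIVE C2X reference implementation, only a picture of logging that stays active in both phases while HMI output is gated by the phase.

```java
// Hypothetical sketch: gating HMI output by test phase while logging always runs.
// Names are illustrative, not DRIVE C2X APIs.
public class FunctionRunner {

    enum TestPhase { BASELINE, TREATMENT }

    interface HmiChannel { void show(String warning); }
    interface DataLogger { void log(String event); }

    private final TestPhase phase;
    private final HmiChannel hmi;
    private final DataLogger logger;

    FunctionRunner(TestPhase phase, HmiChannel hmi, DataLogger logger) {
        this.phase = phase;
        this.hmi = hmi;
        this.logger = logger;
    }

    /** Called when the function detects a relevant event (e.g. road works ahead). */
    void onEvent(String warning) {
        logger.log(warning);               // data collection is active in both phases
        if (phase == TestPhase.TREATMENT) {
            hmi.show(warning);             // in baseline ("silent mode") the driver sees nothing
        }
    }
}
```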

    2.1.5.3 Between-subjects vs. within-subjects design

A relevant aspect of data collection is the choice between a between-subjects and a within-subjects design.

In the within-subjects design, subjects serve as their own control: baseline and treatment data are collected from the same users. In the between-subjects design, subjects are split into two comparable groups: only one group has the given functions in use, providing the treatment data, while the other group serves as a control group, providing the baseline data.

In all tests, a within-subjects design was mainly used: subjects served as their own control, they were measured before and after the treatment, and the two behaviours were compared by means of statistical methods. This approach was assessed as appropriate for relatively short-term tests in which no significant seasonal trends are expected.
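As an illustration of such a within-subjects comparison, the sketch below runs a paired t-test on fabricated per-driver speed data using Apache Commons Math; the deliverable does not prescribe any particular statistical tool, so this is only one plausible way to do the comparison.

```java
import org.apache.commons.math3.stat.inference.TTest;

// Illustrative only: a within-subjects (paired) comparison of baseline vs.
// treatment. The speed values are fabricated for this example.
public class WithinSubjectsExample {
    public static void main(String[] args) {
        // Mean approach speed (km/h) per driver at a road work zone,
        // measured for the same drivers in both conditions.
        double[] baseline  = {78.2, 81.5, 75.0, 83.1, 79.4, 80.8};
        double[] treatment = {72.9, 76.3, 71.8, 78.0, 74.5, 75.2};

        double p = new TTest().pairedTTest(baseline, treatment);
        System.out.printf("paired t-test p-value: %.4f%n", p);
    }
}
```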


    2.1.6 Measures

There were several types of measures, originating from the definition of the research questions. A data collection device, the reference system, was developed by SP2 for logging driver behaviour; in some cases a national logger was used instead. In addition to logging driver behaviour, data were collected to describe the circumstances, such as weather and traffic conditions. It was highlighted that the test sites should provide complete metadata for each data set to describe and explain the data content.

Each research hypothesis was connected to an indicator and measures. The lists of measures were compiled per function, and the importance of these measures was prioritised by the impact assessment tasks. In addition, there were some common variables requested for all analyses. Two of them were: 1) whether the test driver was driving in free flow or not, and 2) whether the driving took place on a link or in an intersection area.

Because the test vehicles typically did not have radar to measure headways and identify free-flow driving, it was suggested that each test vehicle be equipped with a simple video system. After piloting, this plan was complemented with an observation procedure specifically meant for test vehicles without video on board. In many cases, an on-board observer also supported navigation to keep the driver on the test route.
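A minimal sketch of how the two common variables could be attached to each logged event is given below; the 5-second free-flow headway threshold and all names are illustrative assumptions, not project definitions.

```java
// Hypothetical annotation of logged events with the two common variables
// mentioned above. The 5 s free-flow headway threshold and the class/field
// names are illustrative assumptions, not DRIVE C2X definitions.
public class EventAnnotator {

    enum RoadContext { LINK, INTERSECTION }

    record AnnotatedEvent(long timestampMs, boolean freeFlow, RoadContext context) {}

    private static final double FREE_FLOW_HEADWAY_S = 5.0; // assumed threshold

    static AnnotatedEvent annotate(long timestampMs, double headwaySeconds,
                                   boolean nearIntersection) {
        boolean freeFlow = headwaySeconds >= FREE_FLOW_HEADWAY_S; // free flow if headway is long
        RoadContext ctx = nearIntersection ? RoadContext.INTERSECTION : RoadContext.LINK;
        return new AnnotatedEvent(timestampMs, freeFlow, ctx);
    }
}
```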

Questionnaires and focus groups were applied to obtain user acceptance data. The procedure was agreed with the test sites. Online questionnaires were provided to the test sites, and each test site took care of the national translation. In addition, guidance on how to collect the data was provided, as well as a procedure for focus group data collection.

    2.1.7 Test scenarios for the controlled tests

Test scenarios were provided for the nine functions. They provided guidance for the detailed planning of the tests, defining for each function recommendations on route length, preferred road type and speed limit, lighting, weather and traffic conditions, and location. A template was provided for describing the test scenarios. Several detailed instructions were also given: the test driving should be as normal as possible, the test route should be long enough and the frequency of presenting functions not too high, repeated measures were preferred (not only one observation per person per situation), and the test persons should be properly prepared and instructed for the tests.

    2.2 Pre-deployment and validation activities

    2.2.1 Overview

This paragraph provides a quick overview of the pre-deployment and validation activities and results concerning the reference system, i.e. the software developed within the project on the basis of the previously developed Protocol Stack and installed in practice at a single, dedicated system test site (STS).

The role of the STS was taken by the Helmond test site in the Netherlands, operated by TNO.

The STS was equipped with many Roadside ITS Stations (RIS). These RIS had implementations of the complete networking stack, including a GeoNetworking implementation, up to the various DRIVE C2X messages (CAM, DENM, etc.). The RIS could run two completely independent systems in a so-called dual mode. This was used on several RIS to run 1) a DRIVE C2X-independent implementation used to validate the DRIVE C2X logging components, and 2) an implementation of the DRIVE C2X reference RIS platform with the required roadside components of several applications.

The STS was equipped with a camera system monitoring a large part of the test site. This system was able to track every single vehicle and measure its position over time with an accuracy of about 1 m at 100 ms intervals. Cameras inside several test vehicles were used to compare the HMI output with the log data.

The validation activities on the implementation of this Protocol Stack were carried out by WP32, which was responsible for the validation of the DRIVE C2X reference system on a single test site.

    2.2.2 Implemented DRIVE C2X architecture

The DRIVE C2X system makes use of the ITS Station concept, where each ITS Station is a functional entity that provides Intelligent Transport System (ITS) services [5]. The DRIVE C2X architecture mainly comprises the following basic components, which can be combined arbitrarily to form a cooperative intelligent transport system:

    • Vehicle ITS Station (VIS), equipped with communication hardware for information exchange with other vehicles or roadside infrastructure.

• Roadside ITS Station (RIS), which can send information to vehicles or act as a relay station for (multi-hop) communication between vehicles, and may be connected with central components.

    • Central ITS Station (CIS), an organised entity where centrally managed applications and services are operated.

Each ITS Station consists of a Communication & Control Unit (CCU) and an Application Unit (AU), which may be combined into a single physical unit or may form a mobile network in which AUs obtain connectivity via the egress interface of the CCU.

    Within the ITS Station, a protocol stack is present. In Figure 4 the DRIVE C2X Reference Protocol Stack for an ITS station is depicted. It follows the ISO/OSI reference model and defines four horizontal protocol layers and two vertical layers.

    ITS Access Technologies cover various communication media and related protocols for the physical and data link layers. The access technologies are not restricted to any particular type of media, though most of the access technologies are based on wireless communication. The access technologies are used for communication inside an ITS Station (among its internal components) and for external communication (for example with other ITS Stations).


[Figure 4 shows the four horizontal layers – ITS Applications, ITS Facilities, ITS Network & Transport and ITS Access Technologies – flanked by the two vertical layers ITS Management and ITS Security.]

Figure 4: DRIVE C2X Reference Protocol Stack of an ITS Station

    ITS Network and Transport comprises protocols for data delivery among ITS Stations, and from ITS Stations to other network nodes, such as in the Internet. ITS network protocols particularly include routing of data from source to destination through intermediate nodes, and efficient dissemination of data in geographical areas.

ITS Facilities are a collection of functions to support applications in various tasks. The facilities provide data structures to store, aggregate and maintain data of different types and sources (such as data from various vehicle sensors and data received via communication). As for communication, ITS facilities offer various addressing modes to applications, provide ITS-specific message handling, and support the establishment and maintenance of communication sessions. An important facility is the management of services, including the discovery and download of services as software modules from the ITS Application Service System and their management in the ITS Station.

ITS Applications refer to the different applications and functions. Besides the horizontal layers, the reference protocol stack in Figure 4 introduces two vertical layers that flank the horizontal stack:

    ITS Management is responsible for configuration of an ITS Station and for cross-layer information exchange among different layers.

    ITS Security provides security and privacy services, including secure message formats at different layers of the communication stack, management of identities and security credentials, and aspects for secure platforms (firewalls, security gateway and tamper-proof hardware).

As an example, an overview of an On-Board Unit is provided in Figure 5. The VIS implemented by the DRIVE C2X reference system was composed of two main modules: a Communication & Control Unit (CCU) and an Application Unit (AU). The AU ran the cooperative applications via an OSGi framework, collecting GNSS data for positioning and vehicle data from the CAN bus; an OEM software module translated proprietary CAN messages into DRIVE C2X CAN messages. The CCU was specifically dedicated to managing the communication interfaces of the VIS.
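The following sketch illustrates the kind of translation step such an OEM module performs. The frame ID, field layout and scaling are invented for the example, since the real proprietary CAN matrices are OEM-specific and not given in this document.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Illustrative sketch of an OEM translation module: a proprietary CAN frame
// is decoded and re-emitted as a harmonised "DRIVE C2X" vehicle data record.
// Frame ID 0x123, the scaling factor and the bit layout are invented.
public class OemCanTranslator {

    record C2xVehicleData(double speedMps, boolean brakeActive) {}

    static C2xVehicleData translate(int canId, byte[] payload) {
        if (canId != 0x123) {                       // assumed proprietary speed frame
            throw new IllegalArgumentException("unsupported frame: " + canId);
        }
        ByteBuffer buf = ByteBuffer.wrap(payload).order(ByteOrder.LITTLE_ENDIAN);
        double speedMps = buf.getShort() * 0.01;    // assumed raw unit: 0.01 m/s
        boolean brake = (buf.get() & 0x01) != 0;    // assumed brake flag in bit 0
        return new C2xVehicleData(speedMps, brake);
    }
}
```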


    Figure 5: On Board Unit Deployment

    2.2.3 Validation activities

The validation was carried out during dedicated test weeks, in which all available applications were tested on the OEM and test site vehicles equipped with DRIVE C2X ITS stations. These tests followed a controlled testing approach, which also allowed testing the interoperability of the different reference system implementations. A number of relevant scenarios were planned for each application, and vehicles received instructions on how and where to drive. During each test run most applications were tested, and a vast amount of test data was generated, forming the basis for the validation analysis.

    The analysis was carried out by taking into account different analysis tools and components:

• An independent system (DITCM) was used to measure the overall system performance with respect to timing, position, communication range/quality and HMI.

• Log data from all components were checked in detail. The log data give information about the vehicle and its driver during the FOTs following the validation phase.

• Test Management Centre tools, delivered within DRIVE C2X, were used in the validation tests: these tools helped the FOT teams define tests and execute and monitor controlled test runs.

• The communication part of the architecture was validated during the “Cooperative Mobility Services Plugtests”, organised in the context of WP32 together with ETSI in November 2011, where it was concluded that the reference system is interoperable with non-DRIVE C2X implementations.

The analysis started by validating how the reference system performed regarding time synchronisation, positioning, communication range/quality and HMI. These aspects were related to the overall system performance and were key enablers. When a vehicle (containing an installation of the reference system together with its proprietary systems) did not perform well on these aspects, the rest of the system was rendered useless. For example, when the positioning was wrong, the behaviour of the applications would certainly be wrong as well. Likewise, when a message was received with a time offset that was too high, it was considered outdated and the communication stack automatically discarded it before it even reached the applications.
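A minimal sketch of that freshness check follows, with an assumed 500 ms limit; the actual threshold used by the stack is not stated in this document.

```java
// Sketch of the freshness check described above: a received message whose
// timestamp is too far in the past is dropped before it reaches the
// applications. The 500 ms limit is an assumed value for illustration.
public class FreshnessFilter {

    private static final long MAX_AGE_MS = 500; // assumption, not a project constant

    record Message(long generationTimeMs, byte[] body) {}

    /** Returns true if the message should be forwarded to the applications. */
    static boolean accept(Message m, long nowMs) {
        return (nowMs - m.generationTimeMs) <= MAX_AGE_MS;
    }
}
```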

To achieve these objectives, the critical components in the facility layer that log information into the DRIVE C2X log files were validated. Analysing the log files was the only way to validate the internal working of the system. Facility components storing log information included ITU, POTI, CAM, DENM and VDP; the SPAT, TOPO and LDM components did not log any data. The log data from all components were analysed in detail; they provided information about the vehicle and its driver during the FOTs following the validation phase.

The application validation started with an analysis of the HMI, to check that it was in line with the expected behaviour. Each application was then analysed in a separate subsection, because each application was unique in the sense that it could be executed in different scenarios. As an example, take the Motorcycle Approaching Indication (MAI) application: one scenario could be a motorcycle driving behind vehicles and then overtaking, while another could be approaching the vehicles from the side at an intersection.

    2.2.4 Main conclusion from the validation

The validation activities offered a good opportunity to improve the implementation. A number of issues were highlighted, and the validation results were used to support the debugging of the DRIVE C2X reference system, which continued during the piloting phase before the start of the FOT operations.

Taking into account the conclusions of the validation, fine-tuning, debugging, testing and updating was a continuous process during the first part of the piloting, supported by a ticketing system. This system allowed project participants to report identified issues and request support from the developers. The ticketing system supported the tracking of all problems, and the identification of common issues resulted in an efficient debugging process.

    In this paragraph some observations from the validation phase are reported.

Validation revealed that the implementation on each specific vehicle had a significant impact on key aspects. Test sites were advised to test the behaviour of their vehicles before starting the testing activities, focusing on the following kinds of errors: variations in time synchronisation, communication range, directional communication ranges (front and rear), and position update rate (rates between 1 and 10 Hz were reported in the tests).

Concerning the analysed components in the facility layer, the following main issues were observed: the application unit seemed to suffer from garbage collection, resulting in random system freezes of up to 3 seconds, after which an additional 5 seconds were required to return to normal operation; the Vehicle Data Provider (VDP) logged a different set of parameters for each vehicle, some parameters were not logged at all, and some facility components had no logging whatsoever (SPAT, TOPO, LDM). Useful indications were provided to the developers and the test sites to improve the quality of the logging: in particular, test sites were advised to make sure that the versions of the log definitions and the log software corresponded, and that the definitions were stored for later offline analysis.

The two existing Test Management Centre tools, namely the WebScenarioEditor (WebScE) and the Codar viewer, were evaluated. The Codar viewer could be used to visualise in real time the location of the vehicles and the events shown on the HMI in the vehicles. The WebScE could be used to define and control test runs. Some indications were provided to include additional export options for the information stored by the WebScenarioEditor.

During the validation activities, some remarks about essential features of the WebScenarioEditor were reported in the corresponding deliverable. These remarks were taken into account prior to the FOT execution, as follows:

• Sending of log profiles to the vehicles:
o The WebScE automatically sent a log profile, embedded in the scenario, to all participating vehicles. It was possible to set different profiles for each group (e.g. treatment vs. baseline). The scenario stored on each vehicle contained the following pieces of information: route, assigned drivers, all meta-information entered into the scenario in the WebScE, groups and assigned vehicles, and start and end times of the test run (a hypothetical sketch of this payload follows the list).

• Access to a logbook of the executed scenarios:
o The information concerning the scenarios executed with the WebScenarioEditor was made available in the ITEF logging.

• Validation of the SLOG data:
o The scenario logging feature was updated and validated again.

• Coupling of log files with scenarios, and consistency between the scheduled scenarios and the logs:
o The WebScenarioEditor had the capability to automatically combine the scenario information with the log files.
o The WebScenarioEditor replay feature enabled comparison of the planned route with the actual route, based on the events received during the test.
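As a rough illustration, the scenario payload described in the first bullet above could be modelled as follows; the type and field names are hypothetical, not the WebScE wire format.

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;

// Hypothetical shape of the scenario stored on each vehicle, mirroring the
// pieces of information listed above. Names are illustrative only.
record ScenarioPayload(
        List<String> routeWaypoints,          // Route
        List<String> assignedDrivers,         // Assigned drivers
        Map<String, String> metaInformation,  // Meta-information entered in WebScE
        Map<String, List<String>> groups,     // Groups and assigned vehicles
        Instant testRunStart,                 // Start time of the test run
        Instant testRunEnd) {}                // End time of the test run
```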

In general, all applications included in the validation tests were shown to work in at least a couple of vehicles. Nevertheless, specific issues were identified in almost all applications, and these were listed for debugging purposes.

Considering the results of the validation process, the following observations were made:

• The piloting phases of the FOTs are necessary to ensure that the combination of the reference system and the proprietary systems works properly.


• Changes in the software components were considered starting from a list of the main issues provided by the persons responsible for the validation. The debugging was followed up using a ticketing system, to make the process efficient for the involved developers and test site leaders.

    2.3 Test site adaptation

The objective of the adaptation was to adapt the seven selected test sites in such a way that the FOTs could be carried out using the DRIVE C2X reference system. The adaptation targeted the ITS system and the test system, in the context of the activities of WP33, as well as the application adaptation, addressed in WP37.

The work performed to bring the test sites to a level of maturity at which the reference system developed in SP2 could run and FOTs could be conducted was highly test site-driven. The status and background of the DRIVE C2X test sites were highly heterogeneous, and the desired level of DRIVE C2X compliance also differed considerably. The performed adaptations were therefore complex to align and difficult to compare.

Technical adaptation targeted two technical units:

• The ITS stations, with all their functionalities up to and including the facilities layer;
• The DRIVE C2X test system.

    The project defined three different types of test sites:

• A system test site (Netherlands),
• Large-scale functional test sites, and
• Small-scale functional test sites.

The system test site served as the reference for the other two types of test sites, which differed only in size. It was the main test site, where the complete and original reference system was deployed and running.

The system was deployed first to the system test site, which was used as a role model for the other test sites.

The adaptation of the applications developed in SP2 and deployed to the test sites is described in Section 2.3.5.

The work package in charge of the adaptation had a stringent design, which is reflected in this description too:

• Analysis of test sites,
• Adaptation requirements,
• Test site adaptation.

The work in WP33 was performed logically and chronologically in these steps.


    2.3.1 Analysis of test sites

The test site adaptation activity started with an analysis of the selected test sites, with the objective of obtaining an overview of the test sites' capabilities in order to derive adaptation requirements in the subsequent step.

A questionnaire to the test site leaders was used to support the analysis: representatives of SP2 and SP4 were asked to state requirements and claims regarding the analysis of the test sites. The results, collected in an internal document, are briefly summarised in this paragraph.

The test site analysis (the main common features are summarised in Table 2) provided a summary of the test sites' facilities, giving an impression of their size and scope.

All test sites had specific features that were relevant from an adaptation perspective; a short overview per test site is provided in the following lines.

    Test site Italy

The interface to the local service centre and roadside infrastructure was one of the main highlights, relevant for data collection and the provision of additional information, including data from traffic sensors, weather sensors, etc. In relation to this, the deployed RIS used the existing optical fibre infrastructure connecting the roadside with the Trento Service Centre. This made it possible to place CCUs in small roadside shelters and to connect them remotely to AUs through the BRE Local Area Network. The installation of a GPS/RTK positioning system was also relevant to support a good experimental environment for the evaluation, fine-tuning and impact assessment of cooperative systems.

    Test site Germany

Even though the roadside infrastructure (RISs) developed in the national FOT simTD could not be adapted to the reference system, since that project used its own cooperative system, the German test site provided a huge amount of FOT data: an adaptation was implemented as a conversion at data level, in which the test site managed the simTD log data and was involved in the DRIVE C2X test data management as well. This process benefited from the fact that the simTD and DRIVE C2X technologies are similar. Furthermore, in calendar week 49/2012, Hessen Mobil implemented a DRIVE C2X test site in a non-public area in Friedberg, where two native DRIVE C2X RIS were deployed to accommodate all tests scheduled for evaluation. The German DRIVE C2X tests in Friedberg were successfully accomplished.

    Test site France

The ITS architecture was based on the French project SCORE@F system, with RIS, VIS and CIS. It was a great advantage to collect compliant data even though the technologies were not exactly the same, since the DRIVE C2X and SCORE@F applications are similar. TS France used several types of roads and environments for naturalistic and controlled driving experimentation: urban roads (Versailles downtown), a rural road (RD91), highway and tunnel (A86), and test tracks (Versailles Satory). One of the most relevant highlights was also the strong involvement of local authorities, which was very useful for the FOT organisation, in this case mainly for planning roadwork and legal messages on IVS.

    Test site Finland

The analysis of test site Finland mainly focused on the following aspects: direct connections to the local Traffic Management Centre (operated by the City of Tampere and the Finnish Road Administration) allowing the collection of data including precise road weather information, real-time traffic information (based on both FCD and RISs) and real-time VMS data. It was noted that the harsh winter weather, with road and street surfaces quite often covered by snow or ice for several months, was ideal for testing both functions and equipment, especially under slippery, black ice and limited visibility conditions.

    Test site Sweden

The existence of a large number of tunnels where dangerous goods are not allowed was reported; the road conditions and traffic status were used to provide recommended route guidance avoiding roads not allowed for dangerous goods.

    Test site Spain

For test site Spain, the following points were highlighted: all RISs had access to an Oracle database linked to the Traffic Management Centre operated by the DGT, where real-time information from the test site's sensors (inductive loops, weather stations and variable message signs) was available. Furthermore, the RIS could be remotely accessed via Ethernet, so that the software could be remotely updated and regular status checks could be performed. The CTAG DL (data logger) was used in the vehicles to implement the required CAN gateways providing the DRIVE C2X reference system with CAN data, and an own HMI was designed for the naturalistic tests.


    Table 2: Common test site features

    Test site Road network Available ITS Infrastructure

    Rural [km]