Transcript

INSPIRE Infrastructure for Spatial Information in Europe

MIG/MIWP-16 (subgroup on monitoring): Final activity report

Title Final activity report of the MIG subgroup on monitoring (MIG/MIWP-16)

Creator MIG/MIWP-16 members

Creation date 2015-11-29

Date of last revision 2015-12-15

Subject INSPIRE Implementing Rules for Metadata

Status V. 1.0

Publisher European Commission Joint Research Centre

Type Text

Description Report on the activities carried out by the MIG-T temporary sub-group MIWP-16, set up to work on improving the usefulness and the reliability of the monitoring information delivered annually by the Member States to the European Commission (via the EEA).

Contributor MIG/MIWP-16 members

Format doc

Source European Commission Joint Research Centre

Rights Public

Identifier MIG_MIWP-16_activity_report_v1.docx

Language EN

Relation Not applicable

Coverage

Comments

Not applicable


Table of Contents

Acknowledgements
1 Mandate
2 Review of the monitoring indicators with the objective of automating their calculation
  2.1 Introduction
  2.2 Proposed non-mandatory additional elements to be reported for spatial data sets and/or services
    2.2.1 Metadata UUIDs of spatial data sets
    2.2.2 Metadata UUIDs of the discovery, view and download services related to a spatial data set
    2.2.3 Metadata UUID of the discovery service servicing the metadata of a network service
    2.2.4 Direct accessibility of a service
3 New INSPIRE monitoring workflow available to Member States
4 Prototype tools to automate the INSPIRE monitoring
5 Prototype dashboard for monitoring data officially reported by Member States
6 Proposed updates to the current monitoring requirements
  6.1 Introduction
  6.2 Indicators proposed to be discarded
    6.2.1 DSi1: Geographical coverage of spatial data sets
    6.2.2 NSi3: Use of network services
  6.3 Indicators proposed to be discarded in case of fully automated monitoring
    6.3.1 MDi1: Existence of metadata
    6.3.2 NSi1: Accessibility of metadata through discovery services
  6.4 Monitoring variables proposed to be added (on a voluntary basis)
  6.5 Indicator proposed to be added
  6.6 Open issues
Annex A – Theoretical and Practical Mapping of Metadata Elements to Monitoring Variables
  A.1. Introduction
  A.2. Remarks
  A.3. MIWP-16 monitoring mapping - theoretical & practical
  A.4. Indicator MDi1 (Existence of metadata)
  A.5. Indicator MDi2 (Conformity of metadata)
  A.6. Indicator DSi1 (Geographical coverage of spatial data sets)
  A.7. Indicator DSi2 (Conformity of spatial data sets)
  A.8. Indicator NSi1 (Accessibility of metadata through discovery services)
  A.9. Indicator NSi2 (Accessibility of data sets through view and download services)
  A.10. Indicator NSi3 (Use of network services)
  A.11. Indicator NSi4 (Conformity of network services)
  A.12. Raw data section
Annex B – Questionnaire issued to Member States by the working group
  B.1. Questions related to the implementation of a dashboard
  B.2. Review of the monitoring indicators
  B.3. Questions related to the inclusion of reporting elements into the dashboard
  B.4. Questions related to the current and foreseen use of monitoring information at European and Member State level
Annex C – Analysis of the responses from Member States to the questionnaire and answers to the questionnaire (Q1/2014)
  C.1. Questions related to the implementation of the dashboard (questions D-B 1-13)
  C.2. Questions related to the inclusion of reporting elements (Art. 11 to 16) into the dashboard
  C.3. Questions related to the current and foreseen use of monitoring information at European and Member State level
  C.4. Review of the monitoring indicators
  C.5. Answers to the questionnaire
    Questions related to the INSPIRE dashboard
    INSPIRE Indicators
    INSPIRE Reporting
    INSPIRE General Information


Acknowledgements

The work reported in this document has been possible thanks to the active involvement of many experts of the MIWP-16, whose members are listed below:

BE Nathalie Delattre National Geographic Institute of Belgium
BG Martin Baychev Department EA “Electronic communication networks and information systems”
BG Lilyana Turnalieva Department EA “Electronic communication networks and information systems”
DE Sabine Geissler Coordination Office SDI Germany
DE Daniela Hogrebe Coordination Office SDI Germany
DK Lars Storgaard Danish Ministry of the Environment, Danish Geodata Agency
EE Age Sild Estonian Land Board Bureau of Spatial Data Services
ES Paloma Abad Power National Geographic Institute
FI Kai Koistinen National Land Survey of Finland
FI Ilkka Rinne Spatineo
FR Sylvain Grellet French Geological Survey
FR Marc Leobet Ministry of Ecology, Sustainable Development and Energy
FR Étienne Taffoureau French Geological Survey
GR Elena Grigoriou Ministry of Reconstruction of Production, Environment & Energy
GR Kalliope Pediaditi Environmental Sustainability Consultant & Researcher
IS Anna Guðrún Ahlbrecht National Land Survey of Iceland
IT Nico Bonora Istituto Superiore per la Protezione e la Ricerca Ambientale
NL Ine de Visser Geonovum
NL Michel Grothe Geonovum
PL Ewa Surma Office of Geodesy and Cartography
SE Anders Rydén Lantmäteriet
SK Tomas Kliment
SK Jan Tobik
SK Martin Tuchyna Slovak Environmental Agency
UK John Dixon Department for Environment, Food & Rural Affairs (DEFRA)
UK Alex Ramage Transport Scotland
EC Freddy Fierens European Commission – Joint Research Centre
EC Sven Schade European Commission – Joint Research Centre
EEA Paul Hasenohr European Environment Agency

We also wish to acknowledge the key contribution of François Prunayre at Titellus, who developed a prototype dashboard and prototype tools to automate the INSPIRE monitoring, as well as the contribution of Angelo Quaglia at JRC, who developed the metadata validator currently available on the INSPIRE Geoportal.

Contact information
Paul Hasenohr (MIWP-16 coordinator)
European Environment Agency
Kongens Nytorv 6
1050 Copenhagen K
DENMARK
E-mail: [email protected]


1 Mandate

At a meeting of Member States in Copenhagen on 15 October 2013, a strong need was expressed for improving the usefulness and the reliability of the monitoring information. The MIWP-16 group was formed as a result of that meeting, and its work programme was defined in two phases with the following objectives:

● Phase 1
○ Review the indicators defined in Articles 3 to 10 of the Commission Decision of 5 June 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards monitoring and reporting, with the objective of automating their calculation.
○ Analyse how to extract monitoring information from the metadata records available in the EU Geoportal or from the metadata records available in national geoportals.
○ Analyse how to filter the metadata records describing INSPIRE datasets out of a catalogue containing metadata on more datasets than only INSPIRE datasets.
○ Design a dashboard (including functional requirements) which would provide access to all monitoring information and related indicators for every Member State.
○ Test the approach with pilot countries.
○ Update the justification document for indicators as appropriate.
○ Update the Technical Guidelines on monitoring as appropriate.
● Phase 2
○ Possibly propose evolutions of discovery metadata or monitoring data requiring modifications of some legally binding pieces of INSPIRE legislation.

2 Review of the monitoring indicators with the objective of automating their calculation

2.1 Introduction

A questionnaire, available in Annex B, was issued to all Member States, asking about their monitoring and reporting set-up and the perceived usefulness of the information they provide. The group explored how the information required by the Implementing Rules for monitoring and reporting could be derived automatically from the metadata provided by the Member States for INSPIRE data. It was clear at this stage of the work that a small number of indicators could not be derived solely from the metadata without Member States providing some additional information. A review of all indicator variables, with their automatability status as well as a theoretical mapping to metadata elements, is available in Annex A. As a result of the questionnaire, a very small number of indicators were considered not to be useful in practice, and we therefore propose that these indicators be dropped. For those indicators that cannot be derived directly from the provided metadata records, the group has identified methods that would allow Member States to supplement the metadata with additional information, allowing the indicator to be calculated from a combination of the metadata and that additional information.

2.2 Proposed non-mandatory additional elements to be reported for spatial data sets and/or services

In order to improve the quality of the monitoring data, we propose to add to the current monitoring requirements a few elements which can be derived from the metadata for spatial data sets and services.


2.2.1 Metadata UUIDs of spatial data sets

A UUID (universally unique identifier) is a unique and persistent identifier. Applied to metadata for datasets, it allows for the unambiguous identification of a metadata record describing a dataset. Retrieving the metadata UUIDs of spatial data sets can be done automatically, as demonstrated during the set-up of the prototype tools to automate the INSPIRE monitoring.

2.2.2 Metadata UUIDs of the discovery, view and download services related to a spatial data set

Knowing the metadata UUIDs of the discovery, view and download services related to a spatial data set makes it possible to establish the link between data sets and services. From this link, it is then easy to know which themes are related to any given service by analysing the information reported for the underlying datasets, and there is no longer any need to provide the information required under column T of the XLS monitoring template (list of themes related to the spatial data service). Furthermore, this linkage significantly improves the quality of the reported information: instead of simply knowing that a data set is available through a view and/or a download service, it becomes possible to know through which services the data set is actually available. Retrieving the metadata UUIDs of the services related to a spatial data set can be done automatically in most cases, as demonstrated during the set-up of the prototype tools to automate the INSPIRE monitoring.

2.2.3 Metadata UUID of the discovery service servicing the metadata of a network service

This information provides more value than simply knowing that the metadata of a network service is available through an unknown discovery service. Retrieving the metadata UUID of the discovery service servicing the metadata of a network service can be done automatically in most cases, as demonstrated during the set-up of the prototype tools to automate the INSPIRE monitoring.

2.2.4 Direct accessibility of a service

In order to automatically validate the conformity of reported network services with the INSPIRE Implementing Rules, it is necessary to know whether they are directly and fully accessible from the internet (i.e. without login, key, etc.). Statistics on the number of data sets served by services directly accessible from the internet could also be computed, which would provide an estimate of the amount of data freely available. To a certain extent, this information can be extracted from the conditions applying to access and use, but it is expressed differently by each Member State, even where national metadata profiles provide guidelines on the topic. A common approach would be preferable. For example, the MIWP-8 (the MIG temporary subgroup working on an update of the metadata technical guidelines) is proposing a codelist for LimitationsOnPublicAccess in which noLimitations means that there is no limitation on public access to spatial data sets and services.
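If such a codelist were adopted and referenced from the access constraints of a service's metadata record, detecting direct accessibility would reduce to a simple constraint check. The following is a minimal sketch under that assumption; the encoding conventions (free text or anchor) and the helper name are assumptions, not an agreed rule.

    # Sketch: flag a service as directly accessible when its legal constraints
    # carry the (proposed) LimitationsOnPublicAccess value "noLimitations".
    # The codelist value and encoding conventions below are assumptions.
    from lxml import etree

    NS = {
        "gmd": "http://www.isotc211.org/2005/gmd",
        "gco": "http://www.isotc211.org/2005/gco",
        "gmx": "http://www.isotc211.org/2005/gmx",
        "xlink": "http://www.w3.org/1999/xlink",
    }
    NO_LIMITATIONS = "noLimitations"  # assumed value from the codelist proposed by MIWP-8

    def is_directly_accessible(metadata_file):
        tree = etree.parse(metadata_file)
        constraints = tree.xpath(
            "//gmd:resourceConstraints/gmd:MD_LegalConstraints/gmd:otherConstraints",
            namespaces=NS)
        for constraint in constraints:
            # free-text encoding of the limitation
            for text in constraint.xpath("gco:CharacterString/text()", namespaces=NS):
                if NO_LIMITATIONS.lower() in text.lower():
                    return True
            # anchor encoding pointing at a codelist entry
            for href in constraint.xpath("gmx:Anchor/@xlink:href", namespaces=NS):
                if href.rstrip("/").endswith(NO_LIMITATIONS):
                    return True
        return False

    print(is_directly_accessible("service_record.xml"))  # hypothetical file name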

3 New INSPIRE monitoring workflow available to Member States

Until 2014, Member States could report their monitoring data either through a Microsoft Excel file formatted according to a template provided by the European Commission or through an XML file conforming to a schema which had not been updated since 2009. As a consequence, more elements had to be provided via XLS than via XML, and until 2014 only two Member States were delivering their monitoring data as XML files. At the end of 2014, the 2009 UML class diagram modelling the monitoring data was updated in order to cover all the elements present in the XLS template as well as the additional elements proposed for inclusion in the INSPIRE monitoring by the MIWP-16. After approval of the diagram by the MIWP-16, the corresponding XSD schema was updated as well. For the 2015 monitoring exercise (on 2014 data), Member States were encouraged to provide their monitoring data as XML according to the new schema. In order to facilitate the transition from XLS to XML, two sets of tools have been developed: a webform (developed by the EEA) and the tools to generate the monitoring report out of the metadata contained in a discovery service (a MIWP-16 activity). The former allows the generation of a monitoring XML file out of a monitoring XLS file and allows for the easy editing and printing of monitoring XML files, while the latter allows for the creation of a monitoring XML file out of the metadata records contained in a discovery service (a detailed description is available in the next section). Thanks to the work of the MIWP-16 and to the set of tools which have been made available, the following workflow has been successfully used by some Member States in 2015:

1. Generate an XML monitoring file automatically out of the content of their national catalogue.
2. Review the automatically calculated indicators and, if needed, make updates in the catalogue (in which case, rerun step 1).
3. Load the XML file into the webform and edit it if needed (e.g. to add datasets or services which are not in the national catalogue). Save the modified XML file.
4. Upload the XML file to Reportnet (a minimal sketch of checking the generated file against the updated schema is shown below).
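Step 4 assumes that the XML file conforms to the updated monitoring schema. A minimal pre-upload check, using lxml and placeholder file names (both file names are assumptions, not part of the official workflow), could look like this:

    # Sketch: check a generated monitoring XML file against the updated
    # monitoring XSD before uploading it to Reportnet. File names are placeholders.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("monitoring.xsd"))    # local copy of the updated schema
    report = etree.parse("monitoring_report_2014.xml")         # file produced in step 1 or 3

    if schema.validate(report):
        print("Monitoring file is valid against the schema.")
    else:
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")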

4 Prototype tools to automate the INSPIRE monitoring

A set of tools has been developed in order to provide Member States with the possibility to:

1. harvest one or several discovery service endpoints (and configure some harvesting parameters); a minimal harvesting sketch is shown after this list;
2. compute the indicators based on the harvested data or a subset of them (e.g. to calculate the indicators corresponding to the metadata provided by one responsible authority only);
3. display the indicators and related variables in a dashboard similar to the dashboard for monitoring data officially reported by Member States, which provides an easy way to visualise (and analyse) the performance on the various indicators.
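For illustration, harvesting a discovery service endpoint essentially means paging through CSW GetRecords requests and storing the returned ISO 19139 records. The sketch below issues plain CSW 2.0.2 KVP requests; the endpoint URL is a placeholder and the actual prototype (daobs) uses its own harvester, so this is an indicative client, not the tool itself.

    # Sketch: page through CSW 2.0.2 GetRecords responses from a discovery
    # service endpoint and save the full ISO 19139 records locally.
    # The endpoint URL is a placeholder; the daobs harvester works differently.
    import urllib.parse
    import urllib.request
    from lxml import etree

    ENDPOINT = "https://example.org/csw"   # placeholder discovery service endpoint
    CSW_NS = {"csw": "http://www.opengis.net/cat/csw/2.0.2"}
    PAGE_SIZE = 50

    def get_records(start_position):
        params = {
            "service": "CSW", "version": "2.0.2", "request": "GetRecords",
            "resultType": "results", "typeNames": "gmd:MD_Metadata",
            "namespace": "xmlns(gmd=http://www.isotc211.org/2005/gmd)",
            "outputSchema": "http://www.isotc211.org/2005/gmd",
            "elementSetName": "full",
            "startPosition": str(start_position), "maxRecords": str(PAGE_SIZE),
        }
        url = ENDPOINT + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as response:
            return etree.parse(response)

    start, page = 1, 0
    while start > 0:
        doc = get_records(start)
        page += 1
        doc.write(f"harvested_page_{page}.xml")
        results = doc.find(".//csw:SearchResults", namespaces=CSW_NS)
        if results is None:        # e.g. an exception report came back
            break
        start = int(results.get("nextRecord", "0"))   # 0 means no further pages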

The specifications for the tools have been developed by the MIWP-16, and a prototype has been developed by Titellus through funding from the EEA. The tools are released under an open-source licence; some Member States have deployed them in their infrastructure and are further developing them. The MIWP-16 has always insisted that such developments should be contributed back to the open-source community so that all Member States can benefit from them. The rules applied to derive the monitoring indicators from the metadata are described in detail in Annex A. Due to the lack of a commonly agreed metadata validator (work is ongoing in MIWP-5), the group agreed to rely on the validator available on the INSPIRE Geoportal and to consider a metadata record as valid if the level of conformity returned by the validator is greater than or equal to 75%.


Figure 1: Harvesting configuration page (Austria and BE/Wallonie are displayed)

Figure 2: Form for creating and previewing the monitoring report out of harvested metadata (here shown for NL)

These tools are available at https://inspire-dashboard.eea.europa.eu/dashboard/. Access is protected in order to avoid confusion with the dashboard for monitoring data officially reported by Member States. Once the functionalities requested and funded by the Netherlands have been integrated into the instance hosted at the EEA (e.g. the possibility to validate the conformity of services against the ELF [European Location Framework] validator and compare it against the conformity declared in the metadata), access will be granted to the INSPIRE reporters and any additional person nominated by an NCP. In order to guarantee the sustainability of the tools developed to automate the INSPIRE monitoring, their maintenance process (project governance, management of change requests, funding, …) needs to be addressed by the MIG.


5 Prototype dashboard for monitoring data officially reported by Member States

In order to improve the usefulness of the monitoring indicators, the MIWP-16 has considered it appropriate to provide a means to visualise and query the indicators, the corresponding variables as well as the underlying reported data. Specifications for a dashboard have therefore been drafted by the MIWP-16, and the EEA funded the development of a prototype dashboard together with the prototype tools to automate the INSPIRE monitoring. The source code for the dashboard is released under an open-source licence and is available at https://github.com/titellus/daobs. Some Member States have installed it within their infrastructure. All available monitoring data have been loaded into the dashboard, which was presented at the 2015 INSPIRE conference in Lisbon. The prototype dashboard is publicly available at https://inspire-dashboard.eea.europa.eu/official

Figure 3: Landing page of the dashboard with a list of specialised dashboards on the right


Figure 4: Maps showing indicator trends over the past 3 or 5 years

Figure 5: Indicators and variables related to network services in 2014


Figure 6: Statistics computed out of the raw data reported with filtering capabilities (narrow your search section)

In order to guarantee the sustainability of the dashboard for monitoring data officially reported by Member States, its maintenance process (project governance, management of change requests, funding, …) needs to be addressed by the MIG.

6 Proposed updates to the current monitoring requirements

6.1 Introduction

The review of the monitoring indicators carried out at the beginning of and during the course of the project, together with the tests performed during the setup and operation of the dashboard prototypes, leads us to propose some modifications to the current monitoring requirements:

● Abandon some indicators and stop the collection of their underlying data;
● Introduce the collection of some new elements in order to improve the quality of the monitoring information.

Whilst it is clear for most of the indicators how the variables are calculated, for a small number of indicators the calculation method for those variables needs to be defined more rigorously.

6.2 Indicators proposed to be discarded

Having completed a survey of Member States on the usefulness of the indicators (see Annexes B and C), a meeting was convened in Arona (Italy) to discuss whether the indicators could be derived directly from the metadata (that is published by the Member States and harvested to the INSPIRE Geoportal) or whether additional information was required to derive their values. One indicator series, DSi1 (Geographical coverage of spatial data sets), although meaningful from a theoretical perspective, in practice does not provide the information that was intended. It proved very difficult, especially in a federated environment, to provide the actual area and the relevant area. For any individual data set in a federated environment it is not always clear (certainly at this stage) which other datasets within the metadata for a Member State should be combined to give a total picture.

6.2.1 DSi1: Geographical coverage of spatial data sets

DSi1 corresponds to the ratio between the sum of the actual areas of all the spatial data sets of all Annexes and the sum of the relevant areas of all the spatial data sets of all Annexes. DSi1.1, DSi1.2 and DSi1.3 are sub-indicators for Annexes I, II and III respectively.

The relevant and actual areas of data sets are not part of the metadata elements required by INSPIRE to describe a data set and therefore, with the exception of Sweden, which has included these pieces of information in its national metadata profile, it is not possible to derive DSi1 from the metadata served by INSPIRE Discovery services. Furthermore, most Member State representatives in the MIWP-16 explained that it is extremely difficult to obtain the relevant and actual area for each data set when they collect the material to prepare the annual INSPIRE monitoring. As a consequence, some Member States are either not providing any figures for the actual and relevant areas of their spatial data sets or providing the exact same figures for every data set they report. In 2014 (reported in 2015), two Member States did not report on DSi1 at all, and 11 had a DSi1 greater than 0.99, of which 5 had a DSi1 equal to 1, meaning that the actual and relevant areas were reported as identical.

It is proposed to stop the collection of the relevant and actual areas of spatial data sets and consequently to remove DSi1 from the list of INSPIRE monitoring indicators. This may require the “monitoring and reporting” Implementing Rules to be reviewed, removing the reporting of this indicator. This indicator is identified in Chapter III of the Implementing Rules, Article 5. We recommend that this entire Article is dropped from the Implementing Rules.
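For reference, written with the monitoring variables listed in Annex A (where DSv1.k_ActArea and DSv1.k_RelArea are the summed actual and relevant areas of the spatial data sets of Annex k), the definition above reads:

    \mathrm{DSi1} = \frac{\sum_{k=1}^{3} \mathrm{DSv1.k\_ActArea}}{\sum_{k=1}^{3} \mathrm{DSv1.k\_RelArea}},
    \qquad
    \mathrm{DSi1.k} = \frac{\mathrm{DSv1.k\_ActArea}}{\mathrm{DSv1.k\_RelArea}}, \quad k \in \{1, 2, 3\}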

6.2.2 NSi3: Use of network services

NSi3 corresponds to the ratio between the sum of the annual number of network service requests for all services (discovery, view, download, transformation, invoke) and the number of network services. NSi3.1, NSi3.2, NSi3.3, NSi3.4 and NSi3.5 are sub-indicators to monitor the use of Discovery, View, Download, Transformation and Invoke services respectively; each covers the service requests for all INSPIRE services of the corresponding type.

A particular data provider may choose in any year to merge or split data layers between one or more services. For example, a data provider for the historic environment may have 12 layers of information that correspond to INSPIRE. In year 1, the provider decides to offer one service with all 12 layers; the next year, they decide to offer one layer per service. With no change in data provision, the indicator would change significantly (the same total number of requests would be divided by 12 services instead of 1).

Network service usage statistics are not part of the metadata elements required by INSPIRE to describe a network service; therefore it is not possible to derive NSi3 from the metadata served by INSPIRE Discovery services. Furthermore, most Member State representatives in the MIWP-16 explained that, with the growing number of service providers, it is becoming impossible to collect network service usage statistics. Where a Member State has a federated discovery catalogue, even the simple number of hits generated as part of the “discovery” phase is not easily identified. It is thought that the use case for this set of indicators was to allow an understanding of how INSPIRE data was being used, and to provide some justification for the large investment made in INSPIRE by Member States. This indicator by itself does not provide the information needed to support this use case. Whilst some Member States that are not using federated systems may be able to provide this information, where a Member State has a thousand or more data providers the complexity of bringing this information together is immense. It is worth noting that some data providers are looking to monitor the availability of and the number of hits that their service(s) receive for their own internal understanding of the use of their data. The complexity comes from the sheer volume of data providers in a Member State, where the data provider is not actively monitoring the number of hits on their service(s). It is also worth noting that the definition of the term “service requests” is not specific enough: the service(s) are routinely and automatically polled to determine whether they are active. Is this a valid service request?

The relevance of the current use of network services indicators has been severely questioned for a number of reasons. Foremost, in each Member State there are significant gaps in the collection of usage statistics from the various data providers, as can be seen in the yearly monitoring reports: some services are assigned a number of requests equal to 0 while others, usually from a common responsible authority, are assigned a common value. Also, services can be split or merged from one year to another while still servicing the same pool of data sets, leading to a variation of NSi3 even if the total number of requests and the underlying data sets remain unchanged from one year to the next. Another source of concern with respect to the relevance of this indicator is that the concept of a service request is not well defined. How should service requests from external monitoring systems, from bots and from the provider's internal network be handled? Lastly, the importance and usefulness of a network service cannot be assessed based only on the number of requests. In order for the usefulness of INSPIRE to be understood, Member States and the individual data providers need to establish a typology of its users and associated usage. This would provide information that is at least as relevant for the data provider and for the INSPIRE coordination bodies at national and European levels.

It is proposed to stop the collection of the annual number of requests per network service and consequently to remove NSi3 from the list of INSPIRE monitoring indicators. To allow data providers and the INSPIRE coordination bodies to use information that is useful, we propose that Member States provide, as part of the three-yearly report, a comprehensive section on the qualitative usage of Network services and data. Such a report would be based on an analysis of the relevant server log files but also, when available, on user registration data, licence agreements signed and any other relevant source of information such as questionnaires or other types of user research carried out in the Member States to evaluate the user base, usage patterns and experienced importance of the INSPIRE Network services and the provided data sets. If this proposal is accepted, it will be important that a template is provided for this section of the three-yearly report. This section should focus on usage trends such as the relative yearly growth of the use of the spatial data sets via the Network services in Member States. This indicator is identified in Chapter III of the Implementing Rules, Article 9. We recommend that this entire article is dropped.

6.3 Indicators proposed to be discarded in case of fully automated monitoring

NOTE: Fully automated monitoring would exclude relevant spatial data sets and services that are not yet described by, and accessible via, metadata records available through INSPIRE Discovery services.

6.3.1 MDi1: Existence of metadata

MDi1 corresponds to the ratio between the number of spatial data sets and services that have metadata and the total number of spatial data sets and services reported. If the monitoring is based exclusively on the content of Discovery services, then MDi1 is equal to 1 by construction. In case of a move to fully automated monitoring, we recommend that the general indicator MDi1 and its specific indicators MDi1.1, MDi1.2, MDi1.3 and MDi1.4 (for spatial data sets in Annexes I, II and III and for spatial data services respectively) are removed from the list of INSPIRE monitoring indicators. Article 3 of the Commission Decision (2009/442/EC) would then need to be dropped.


6.3.2 NSi1: Accessibility of metadata through discovery services

NSi1 corresponds to the ratio between the number of spatial data sets and services with metadata for which a Discovery service exists and the total number of spatial data sets and services reported. If the monitoring is based exclusively on the content of Discovery services, then NSi1 is equal to 1 by construction. In case of a move to fully automated monitoring, we recommend that the general indicator NSi1 and its specific indicators NSi1.1 and NSi1.2 (for spatial data sets and spatial data services respectively) are removed from the list of INSPIRE monitoring indicators. Article 7 of the Commission Decision (2009/442/EC) would then need to be dropped.

6.4 Monitoring variables proposed to be added (on a voluntary basis)

In order to improve the quality and usefulness of the monitoring data, it is proposed to add the variables described in the section “Proposed non-mandatory additional elements to be reported for spatial data sets and/or services”: metadata UUIDs of spatial data sets; metadata UUIDs of the discovery, view and download services related to a spatial data set; metadata UUID of the discovery service servicing the metadata of a network service; direct accessibility of a service.

6.5 Indicator proposed to be added

The MIWP-16 proposes the addition of an indicator on spatial data services similar to the indicators existing for Discovery, View, Download, Transformation and Invoke services. Once the revised metadata technical guidelines have been approved (work ongoing in MIWP-8), the process to identify spatial data services from metadata will be clear. This will guarantee that this indicator can be calculated automatically.

6.6 Open issues

The proposed modifications of the INSPIRE monitoring indicators require an update of the Commission Decision implementing the INSPIRE Directive as regards monitoring and reporting (2009/442/EC) and the corresponding guidance documents. In addition to the above modifications, these updates should consider the following issues:

- usefulness of indicator DSi2 (conformity of spatial data sets) to monitor the implementation status in the Member States: not all INSPIRE-relevant data can be transformed into the INSPIRE data models;
- the new understanding of Spatial Data Services (SDS) as described in the Discussion Paper (currently under review by the MIG-T);
- maintenance of the “prototypes”/production systems;
- information about the direct accessibility of services through the conditions applying to access and use should ideally be harmonised.


Annex A – Theoretical and practical mapping of metadata elements to monitoring variables

A.1. Introduction

The theoretical mapping was done during the review of the indicators in the first half of 2014. The tools to generate the monitoring data out of the metadata were developed mainly during the first quarter of 2015, and the practical mapping was done at that time.

A.2. Remarks

1. In order to do the mappings between indicator variables and metadata elements listed in the following pages, it is assumed that all metadata records describe datasets, series or services made available by Member States under INSPIRE.

2. Ø means that this variable cannot be retrieved from any metadata element, either directly or indirectly.

3. Where it is written “a theme from Annex I”, it is to be understood as “any of the names listed in Annex I of the INSPIRE Directive, in any official language of the EU” (e.g. Administrative units, Cadastral parcels, Verkehrsnetze).

4. Where it is written “a theme from Annex II”, it is to be understood as “any of the names listed in Annex II of the INSPIRE Directive, in any official language of the EU” (e.g. Elevation, Land cover, Ortho-imagerie).

5. Where it is written “a theme from Annex III”, it is to be understood as “any of the names listed in Annex III of the INSPIRE Directive, in any official language of the EU” (e.g. Statistical units, Buildings, Impianti di monitoraggio ambientale).

Regarding points 3, 4 and 5, the INSPIRE dashboard indexing uses the INSPIRE themes in any official language of the EU (see https://github.com/titellus/daobs/blob/daobs-1.0.x/solr/src/main/solr-cores/data/conf/_schema_analysis_synonyms_inspiretheme.json).

6. Where it is written “COMMISSION REGULATION (EU) No 1089/2010 of 23 November 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards interoperability of spatial data sets and services”, the same text in any official language of the EU is equally valid (e.g. RÈGLEMENT (UE) No 1089/2010 DE LA COMMISSION du 23 novembre 2010 portant modalités d'application de la directive 2007/2/CE du Parlement européen et du Conseil en ce qui concerne l'interopérabilité des séries et des services de données géographiques).

Regarding point 6, the INSPIRE dashboard indexing uses the reference to the Commission Regulation in any official language of the EU (see https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-inspire-constant.xsl#L78).

7. Where it is written “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services”, the same text in any official language of the EU is equally valid (e.g. REGOLAMENTO (CE) N. 976/2009 DELLA COMMISSIONE del 19 ottobre 2009 recante attuazione della direttiva 2007/2/CE del Parlamento europeo e del Consiglio per quanto riguarda i servizi di rete).

Regarding point 7, the INSPIRE dashboard indexing uses the reference to the Commission Regulation in any official language of the EU (see https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-inspire-constant.xsl#L105).


A.3. MIWP-16 monitoring mapping - theoretical & practical

The practical implementation in the INSPIRE dashboard1 provides two scenarios:

1. Compute the indicators based on the first INSPIRE theme only2.
2. Compute the indicators based on all INSPIRE themes. In that case, a dataset or service containing more than one INSPIRE theme is taken into account once for each theme (a small counting sketch illustrating the difference is given at the end of this section).

Information is collected using the following processes:

1. Harvest the discovery service of the Member State.
2. Analyse services/datasets relations:
   a. For all harvested services, check all operatesOn elements:
      i. add to the operatesOn target the link to the service (and its type);
      ii. add to the service the list of INSPIRE themes found in the operatesOn target.
3. Validate the metadata records using the INSPIRE validator webservice (http://inspire-geoportal.ec.europa.eu/GeoportalProxyWebServices/resources/INSPIREResourceTester?probeNetworkServices=false&probeDataResourceLocators=false).

For all indicators, the values are based on the harvested records. They may be underestimated if:
● a number of spatial data sets do not have metadata;
● an error occurred during harvesting (incomplete harvesting).

1 https://github.com/titellus/daobs/ 2 Indexing of the first INSPIRE themes in a dedicated field is done here https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L249
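To make the two computation scenarios concrete, the sketch below counts datasets per INSPIRE theme for a few hypothetical harvested records, once per scenario; the record identifiers and theme lists are illustrative only.

    # Sketch: count datasets per INSPIRE theme under the two scenarios
    # described above. The harvested records are hypothetical.
    from collections import Counter

    harvested = [
        ("dataset-a", ["Hydrography"]),
        ("dataset-b", ["Elevation", "Land cover", "Buildings"]),
        ("dataset-c", ["Buildings"]),
    ]

    # Scenario 1: only the first INSPIRE theme of each record is considered.
    scenario_1 = Counter(themes[0] for _, themes in harvested if themes)

    # Scenario 2: a record with several themes is counted once per theme,
    # so dataset-b contributes to three different theme counts.
    scenario_2 = Counter(theme for _, themes in harvested for theme in themes)

    print(scenario_1)   # e.g. Counter({'Hydrography': 1, 'Elevation': 1, 'Buildings': 1})
    print(scenario_2)   # e.g. Counter({'Buildings': 2, 'Hydrography': 1, ...})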

A.4. Indicator MDi1 (existence of metadata)

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario3

DSv_Num1 number of spatial data sets for Annex I

Ø Yes

DSv_Num2 number of spatial data sets for Annex II

Ø Yes

DSv_Num3 number of spatial data sets for Annex III

Ø Yes

SDSv_Num number of spatial data services

Ø Count all services harvested. We may harvest records related to no annex.

No

NSv_NumDiscServ number of discovery services

Ø No

NSv_Num_View_Serv

number of view services Ø No

NSv_NumDownlServ number of download services

Ø No

NSv_NumTransfServ number of transformation services

Ø No

NSv_NumInvkServ number of invoke services Ø No

MDv1.1 number of spatial data sets for Annex I that have metadata

Nb of results returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = "a theme from Annex I" and identificationInfo/*/descriptiveKeywords/*/thesaurusName/*/title = GEMET – INSPIRE themes, version 1.0

HierarchyLevel is indexed using the following rule4: ● A dataset is identified by

count(gmd:hierarchyLevel[gmd:MD_ScopeCode/@codeListValue='dataset']) > 0 or count(gmd:hierarchyLevel) = 0

● A series or service by gmd:hierarchyLevel/gmd:MD_ScopeCode/@codeListValue

For the INSPIRE thesaurus detection, the INSPIRE dashboard only checks for a thesaurusName containing “INSPIRE themes”5 (a client-side sketch of this filter is given after this table).

Yes

3 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme). 4 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L47 and https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L118 5 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L236


MDv1.2 number of spatial data sets for Annex II that have metadata

Nb of results returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = "a theme from Annex II" and identificationInfo/*/descriptiveKeywords/*/thesaurusName/*/title = GEMET – INSPIRE themes, version 1.0

See MDv1.1 Yes

MDv1.3 number of spatial data sets for Annex III that have metadata

Nb of results returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = "a theme from Annex III" and identificationInfo/*/descriptiveKeywords/*/thesaurusName/*/title = GEMET – INSPIRE themes, version 1.0

See MDv1.1 Yes

MDv1.4 number of spatial data services that have metadata

Nb of results returned by a search filtering hierarchyLevel=service

In the INSPIRE dashboard, MDv1.4 = SDSv_Num because all indicators are based on harvested metadata records. The dashboard cannot take into account resources that do not have metadata.
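As an illustration of the MDv1.x rules above, the same counts can be reproduced client-side over a set of harvested ISO 19139 records. The sketch below is not the dashboard's Solr-based implementation; the file location and the (English-only) Annex I theme list are placeholders.

    # Sketch: count harvested ISO 19139 records matching the MDv1.1 rule
    # (dataset or series, keyword equal to an Annex I theme, INSPIRE themes
    # thesaurus). File paths and the theme list are placeholders.
    import glob
    from lxml import etree

    NS = {"gmd": "http://www.isotc211.org/2005/gmd",
          "gco": "http://www.isotc211.org/2005/gco"}
    ANNEX_I_THEMES = {"Administrative units", "Cadastral parcels", "Hydrography"}  # extend to all Annex I themes

    def matches_mdv11(record):
        scopes = record.xpath("gmd:hierarchyLevel/gmd:MD_ScopeCode/@codeListValue", namespaces=NS)
        # As in the dashboard indexing rule, a missing hierarchyLevel is treated as a dataset.
        if scopes and not ({"dataset", "series"} & set(scopes)):
            return False
        for block in record.xpath("gmd:identificationInfo/*/gmd:descriptiveKeywords/gmd:MD_Keywords", namespaces=NS):
            thesaurus = " ".join(block.xpath("gmd:thesaurusName/gmd:CI_Citation/gmd:title//text()", namespaces=NS))
            if "INSPIRE themes" not in thesaurus:
                continue
            keywords = block.xpath("gmd:keyword/gco:CharacterString/text()", namespaces=NS)
            if any(keyword.strip() in ANNEX_I_THEMES for keyword in keywords):
                return True
        return False

    mdv11 = sum(matches_mdv11(etree.parse(f).getroot()) for f in glob.glob("harvested/*.xml"))
    print("MDv1.1 =", mdv11)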


A.5. Indicator MDi2 (conformity of metadata)

Metadata validation is performed by the INSPIRE validator web service available on the EU Geoportal (http://inspire-geoportal.ec.europa.eu/GeoportalProxyWebServices/resources/INSPIREResourceTester?probeNetworkServices=false&probeDataResourceLocators=false). The validator returns an XML report containing a “CompletenessIndicator”. A metadata record is marked as valid when this completeness indicator is greater than or equal to 75%, as agreed at the Copenhagen meeting held in November 2014.
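A client-side check equivalent to this rule could look like the sketch below. The request encoding expected by the Resource Tester (here, a POST of the metadata record to the URL above) and the lookup by element name are assumptions; only the 75% threshold comes from the agreed rule.

    # Sketch: send a metadata record to the INSPIRE Geoportal Resource Tester
    # and mark it as valid when the returned CompletenessIndicator is >= 75.
    # The request encoding is an assumption; check the service documentation.
    import urllib.request
    from lxml import etree

    VALIDATOR_URL = ("http://inspire-geoportal.ec.europa.eu/GeoportalProxyWebServices/"
                     "resources/INSPIREResourceTester"
                     "?probeNetworkServices=false&probeDataResourceLocators=false")

    def completeness(metadata_file):
        with open(metadata_file, "rb") as f:
            request = urllib.request.Request(
                VALIDATOR_URL, data=f.read(),
                headers={"Content-Type": "application/xml", "Accept": "application/xml"})
        with urllib.request.urlopen(request) as response:
            report = etree.parse(response)
        # Look the indicator up by local name, ignoring the report's namespaces.
        values = report.xpath("//*[local-name() = 'CompletenessIndicator']/text()")
        return float(values[0]) if values else 0.0

    print("valid" if completeness("record.xml") >= 75.0 else "not valid")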

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario6

DSv_Num1 number of spatial data sets for Annex I

Ø

DSv_Num2 number of spatial data sets for Annex II

Ø

DSv_Num3 number of spatial data sets for Annex III

Ø

SDSv_Num number of spatial data services

Ø

MDv2.1 number of spatial data sets for Annex I that have conformant metadata

Can be retrieved from validation of the metadata record against an agreed validator

Yes

MDv2.2 number of spatial data sets for Annex II that have conformant metadata

Can be retrieved from validation of the metadata record against an agreed validator

Yes

MDv2.3 number of spatial data sets for Annex III that have conformant metadata

Can be retrieved from validation of the metadata record against an agreed validator

Yes

MDv2.4 number of spatial data services that have conformant metadata

Can be retrieved from validation of the metadata record against an agreed validator

Count all services harvested. We may harvest records related to no annex.

Yes

6 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme).


A.6. Indicator DSi1 (geographical coverage of spatial data sets)

This information is not available in the harvested metadata records.

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario7

DSv1.1_ActArea sum of the actual areas of all the spatial data sets in Annex I

Ø Not available. Default value to 0.

DSv1.2_ActArea sum of the actual areas of all the spatial data sets in Annex II

Ø Not available. Default value to 0.

DSv1.3_ActArea sum of the actual areas of all the spatial data sets in Annex III

Ø Not available. Default value to 0.

DSv1.1_RelArea sum of the relevant areas of all the spatial data sets in Annex I

Ø Not available. Default value to 0.

DSv1.2_RelArea sum of the relevant areas of all the spatial data sets in Annex II

Ø Not available. Default value to 0.

DSv1.3_RelArea sum of the relevant areas of all the spatial data sets in Annex III

Ø Not available. Default value to 0.

7 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme).


A.7. Indicator DSi2 (conformity of spatial data sets)

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment

Affected by the scenario8

DSv2.1

number of conformant spatial data sets with conformant metadata for Annex I

Nb of metadata records validating against an agreed validator and returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = "a theme from Annex I" and thesaurus = GEMET and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “COMMISSION REGULATION (EU) No 1089/2010 of 23 November 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards interoperability of spatial data sets and services” (and with the correct date of publication [2010-12-08])

HierarchyLevel is indexed using the following rule9: ● A dataset is identified by

count(gmd:hierarchyLevel[gmd:MD_ScopeCode/@codeListValue='dataset']) > 0 or count(gmd:hierarchyLevel) = 0

● A series or service by gmd:hierarchyLevel/gmd:MD_ScopeCode/@codeListValue

For the INSPIRE thesaurus detection, the INSPIRE dashboard only checks for a thesaurusName containing “INSPIRE themes”10. For the specification, the INSPIRE dashboard checks all translations of the Commission Regulation title in lower case. It does not check the date of publication.11

Yes

DSv2.2

number of conformant spatial data sets with conformant metadata for Annex II

Nb of metadata records validating against an agreed validator and returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = “a theme from Annex II” and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “COMMISSION REGULATION (EU) No 1089/2010 of 23 November 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards interoperability of spatial data sets and services” (and with the correct date of publication [2010-12-08])

See DSv2.1 Yes

8 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme). 9 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L47 and https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L118 10 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L236 11 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L513

DSv2.3

number of conformant spatial data sets with conformant metadata for Annex III

Nb of metadata records validating against an agreed validator and returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series) and identificationInfo/*/descriptiveKeywords/*/keyword = "a theme from Annex III" and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “COMMISSION REGULATION (EU) No 1089/2010 of 23 November 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards interoperability of spatial data sets and services” (and with the correct date of publication [2010-12-08])

See DSv2.1 Yes

DSv_Num1 number of spatial data sets for Annex I

Ø

DSv_Num2 number of spatial data sets for Annex II

Ø

DSv_Num3 number of spatial data sets for Annex III

Ø


A.8. Indicator NSi1 (accessibility of metadata through discovery services)

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario12

NSv1.1

number of spatial data sets with metadata, for which a discovery service exists

Nb of results returned by a search filtering (hierarchyLevel=dataset or hierarchyLevel=series)

In the INSPIRE dashboard, NSv1.1 = DSv_NUM because all indicators are based on harvested metadata records. The dashboard cannot take into account resources that do not have metadata.

Yes

NSv1.2

number of spatial data services with metadata, for which a discovery service exists

Nb of results returned by a search filtering hierarchyLevel=service

In the INSPIRE dashboard, NSv1.2 = SDSv_Num because all indicators are based on harvested metadata records. The dashboard cannot take into account resources that do not have metadata.

No

12 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme).


A.9. Indicator NSi2 (accessibility of data sets through view and download services)

The operatesOn relation is extracted using the following approach13:

1. Check if the xlink:href attribute contains an id parameter (assuming the link is a URL to a GetRecordById request with the UUID of the target metadata)

2. If not found, fall back to the uuidref attribute.
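A minimal client-side equivalent of this extraction, for a single service metadata record, could look as follows; the namespaces are the standard ISO 19139/19119 ones, while the helper and file names are ours:

    # Sketch: extract the UUIDs of the datasets a service operates on,
    # following the two-step rule above (id parameter of xlink:href first,
    # then the uuidref attribute as a fallback).
    from urllib.parse import urlparse, parse_qs
    from lxml import etree

    NS = {"srv": "http://www.isotc211.org/2005/srv"}
    XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

    def operates_on_uuids(service_metadata_file):
        tree = etree.parse(service_metadata_file)
        uuids = []
        for op in tree.xpath("//srv:SV_ServiceIdentification/srv:operatesOn", namespaces=NS):
            href = op.get(XLINK_HREF)
            if href:
                # Step 1: look for an "id" parameter in what is assumed to be
                # a GetRecordById URL pointing at the target metadata record.
                query = parse_qs(urlparse(href).query)
                ids = query.get("id") or query.get("ID")
                if ids:
                    uuids.append(ids[0])
                    continue
            # Step 2: fall back to the uuidref attribute.
            if op.get("uuidref"):
                uuids.append(op.get("uuidref"))
        return uuids

    print(operates_on_uuids("service_record.xml"))   # hypothetical file name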

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario14

NSv2.1 number of spatial data sets for which a view service exists

Nb of occurences of identificationInfo/*/operatesOn returned by a search filtering metadata records with hierarchyLevel=service and identificationInfo/*/serviceType = view This will not return a complete list of datasets (the coupled resource element is mandatory IF linkage to datasets on which the service operates are available).

The INSPIRE dashboard counts the number of datasets or series having a link to a view service and not the number of operatesOn. This means that the dataset linked in an operatesOn MUST be described in the catalogue to be taken into account.

No

NSv2.2 number of spatial data sets for which a download service exists

Nb of occurences of identificationInfo/*/operatesOn returned by a search filtering metadata records with hierarchyLevel=service and identificationInfo/*/serviceType = download This will not return a complete list of datasets (the coupled resource element is mandatory IF linkage to datasets on which the service operates are available).

See NSv2.1 No

NSv2.3 number of spatial data sets for which both a view and a download service exist

Nb of common occurences of identificationInfo/*/operatesOn returned first by a search filtering metadata records with hierarchyLevel=service and identificationInfo/*/serviceType = view and then by a search filtering metadata records with hierarchyLevel=service and identificationInfo/*/serviceType = download This will not return a complete list of datasets (the coupled resource element is mandatory IF linkage to datasets on which the service operates are available).

See NSv2.1 No

13 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L592 A different implementation might be needed for Germany where the md-fileIdentifier of the data set is enough to establish the link to the services (assuming that the data set and the service are referenced in the same catalogue):

1. identify resourceIdentifier in MD of data set (identificationInfo[1]/*/citation/*/identifier) 2. GetRecordRequest with the following AND-filter: PropertyIsEqualTo „apiso:type“ : „service“ PropertyIsLike „apiso:ServiceType“ : „view“ (or „download“) PropertyIsLike „apiso:OperatesOnIdentifier“ : resourceIdentifier

14 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme).


A.10. Indicator NSi3 (use of network services)

This information is not available in the harvested metadata records.

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario15

NSV3.1

sum of the annual number of network service requests for all the discovery services

Ø Not available. Default value to 0. No

NSV3.2 sum of the annual number of network service requests for all the view services

Ø Not available. Default value to 0. No

NSV3.3

sum of the annual number of network service requests for all the download services

Ø Not available. Default value to 0. No

NSV3.4

sum of the annual number of network service requests for all the transformation services

Ø Not available. Default value to 0. No

NSV3.5 sum of the annual number of network service requests for all the invoke services

Ø Not available. Default value to 0. No

15 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme).


A.11. Indicator NSi4 (conformity of network services)

Variables Definition Availability in/from metadata in the national INSPIRE discovery service (for any given MS)

INSPIRE dashboard implementation comment Affected by the scenario16

NSV4.1 number of conformant discovery network services

Nb of metadata records returned by a search filtering on hierarchyLevel=service and identificationInfo/*/serviceType = discovery and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services” (and with the correct date of publication [2009-10-20])

HierarchyLevel is indexed using the following rule17: ● A service by

gmd:hierarchyLevel/gmd:MD_ScopeCode/@codeListValue

For the specification, the INSPIRE dashboard checks all translations of the Commission Regulation title in lower case. It does not check the date of publication.18

No

NSV4.2 number of conformant view network services

Nb of metadata records returned by a search filtering on hierarchyLevel=service and identificationInfo/*/serviceType = view and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services” (and with the correct date of publication [2009-10-20])

See NSv4.1 No

NSV4.3 number of conformant download network services

Nb of metadata records returned by a search filtering on hierarchyLevel=service and identificationInfo/*/serviceType = download and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services” (and with the correct date of publication [2009-10-20])

See NSv4.1 No

NSV4.4 number of conformant transformation network services

Nb of metadata records returned by a search filtering on hierarchyLevel=service and identificationInfo/*/serviceType = transformation and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services” (and with the correct date of publication [2009-10-20])

See NSv4.1 No

16 Indicate if the indicator will have different values if the scenario 1 or scenario 2 is used to compute the indicators (in case datasets, services or series contain more than one INSPIRE theme). 17 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L47 and https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L118 18 See https://github.com/titellus/daobs/blob/daobs-1.0.x/harvesters/csw-harvester/src/main/resources/xslt/metadata-iso19139.xsl#L513

NSV4.5 number of conformant invoke network services

Nb of metadata records returned by a search filtering on hierarchyLevel=service and identificationInfo/*/serviceType = invoke and dataQualityInfo/*/report/*/result/*/pass = true and dataQualityInfo/*/report/*/result/*/specification/*/title = “Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services” (and with the correct date of publication [2009-10-20])

See NSv4.1 No
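The filters above are essentially XPath predicates evaluated against harvested ISO 19139/19119 records. Purely as an illustration of the counting logic, the following is a minimal sketch assuming lxml and a list of harvested metadata documents; it is not the daobs implementation referenced in the footnotes, the function names and the `records` variable are invented for the example, and translated regulation titles are ignored for brevity.

```python
# Hypothetical sketch of the NSV4.x counting logic described above.
# Assumes `records` is a list of harvested ISO 19139/19119 metadata documents (bytes).
from lxml import etree

NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "srv": "http://www.isotc211.org/2005/srv",
    "gco": "http://www.isotc211.org/2005/gco",
}

# English title of the Network Services Regulation, compared in lower case
# (the dashboard also checks the translations; omitted here).
NS_REGULATION = ("commission regulation (ec) no 976/2009 of 19 october 2009 "
                 "implementing directive 2007/2/ec of the european parliament "
                 "and of the council as regards the network services")

def is_conformant_service(xml_bytes, service_type):
    """True if the record is a service of the given type with a passing
    conformance result citing the Network Services Regulation."""
    md = etree.fromstring(xml_bytes)
    # hierarchyLevel = service (via the codeListValue, as in the indexing rule)
    if "service" not in md.xpath(
            "gmd:hierarchyLevel/gmd:MD_ScopeCode/@codeListValue", namespaces=NS):
        return False
    # identificationInfo/*/serviceType = discovery | view | download | ...
    stypes = md.xpath(
        "gmd:identificationInfo/*/srv:serviceType/gco:LocalName/text()", namespaces=NS)
    if service_type not in [s.strip() for s in stypes]:
        return False
    # dataQualityInfo/*/report/*/result/*: pass = true and the expected specification title
    for res in md.xpath(
            "gmd:dataQualityInfo/*/gmd:report/*/gmd:result/gmd:DQ_ConformanceResult",
            namespaces=NS):
        passed = res.xpath("gmd:pass/gco:Boolean/text()", namespaces=NS)
        titles = res.xpath(
            "gmd:specification/gmd:CI_Citation/gmd:title/gco:CharacterString/text()",
            namespaces=NS)
        if (passed and passed[0].strip().lower() == "true"
                and any(NS_REGULATION in t.strip().lower() for t in titles)):
            return True
    return False

def nsv4(records, service_type):
    """NSV4.x: number of conformant network services of the given type."""
    return sum(1 for r in records if is_conformant_service(r, service_type))
```

Under these assumptions, for example, `nsv4(records, "view")` would give NSV4.2.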

A.13. Raw data section

The INSPIRE dashboard creates the raw data section of the monitoring with the following rules (a sketch illustrating them is given after the list):

● For all records (discovery properties):
  ○ discoveryAccessibility is always set to true, because all records are harvested from a discovery service.
  ○ discoveryAccessibilityUuid is set based on the harvester configuration; this information is not available in the metadata records.
● For services:
  ○ All INSPIRE themes are reported (as was already the case when reporting through the Excel monitoring sheets).
  ○ By default, directlyAccessible is set to true.
  ○ By default, userRequest is set to -1.
● For datasets:
  ○ If the monitoring is based on the first INSPIRE theme only, a single dataset entry is created (if more than one INSPIRE theme is declared, the list of INSPIRE themes is added as a comment in the document). Otherwise, a dataset with, for example, three INSPIRE themes is triplicated, so that the indicators match the raw data.
  ○ By default, the coverage properties (i.e. relevantArea, actualArea) are set to 0.

Only one view service is reported per record. If a record is published in more than one view service, the first one is reported and a comment indicates the total number of services the record is published in, together with the list of their UUIDs. The same applies to download services.
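The defaulting and duplication rules above can be summarised in a few lines of code. The following is a hedged sketch only: field and function names such as `service_entry`, `dataset_entries` and `harvester_discovery_uuid` are invented for the example, and this is not the dashboard's actual implementation.

```python
# Hypothetical sketch of the raw-data defaulting rules described above.
# `record` is assumed to be a dict already extracted from a harvested metadata record.

def service_entry(record):
    """Build a raw-data entry for a service record."""
    return {
        "uuid": record["uuid"],
        "themes": record["themes"],           # all INSPIRE themes are reported
        "discoveryAccessibility": True,        # all records come from a discovery service
        "discoveryAccessibilityUuid": record["harvester_discovery_uuid"],
        "directlyAccessible": True,            # default
        "userRequest": -1,                     # default (usage figures unknown)
    }

def dataset_entries(record, first_theme_only=True):
    """Build raw-data entries for a dataset record.

    With first_theme_only, a single entry is created and the full theme list is
    kept as a comment; otherwise the dataset is duplicated once per declared
    theme, so that the indicators match the raw data.
    """
    base = {
        "uuid": record["uuid"],
        "discoveryAccessibility": True,
        "discoveryAccessibilityUuid": record["harvester_discovery_uuid"],
        "relevantArea": 0,                     # coverage defaults
        "actualArea": 0,
    }
    themes = record["themes"]
    if first_theme_only:
        return [dict(base, theme=themes[0], comment="all themes: " + ", ".join(themes))]
    return [dict(base, theme=t) for t in themes]
```

The `first_theme_only` flag mirrors the two scenarios mentioned in the footnotes: one entry per dataset (first theme only, remaining themes as a comment) versus one entry per declared theme.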


Annex B – Questionnaire issued to Member States by the working group.

B.1. Questions related to the implementation of a dashboard

D-B 1) Do you think there should be a common dashboard between MS?

D-B 2) Where should the dashboard be implemented?

D-B 3) The dashboard should contribute to monitoring the INSPIRE implementation process and the progress of EU countries [agree/disagree]

D-B 4 - a) One role of the dashboard should be to monitor the implementation status of INSPIRE in the Member States at a certain time, i.e. the dashboard reflects the content of the INSPIRE discovery services at that time [agree/disagree]

D-B 5) Which is the main audience of the dashboard?

D-B 6) Which information should the dashboard provide?

D-B 7 - a) The dashboard should function completely automatically by requesting the INSPIRE discovery services that are registered in the INSPIRE registry and deriving the monitoring information from the metadata the services provide [agree/disagree]

D-B 8) Describe the main benefits you perceive from having a dashboard

D-B 9) Describe the main difficulties you perceive in having a dashboard

D-B 10 - a) Do you think it is necessary to modify any of the existing indicators?

D-B 11 - a) Should the dashboard be linked to National Geoportals?

D-B 12) How do you deal with data sets and services that are not described with metadata yet?

D-B 13 - a) Are there only INSPIRE metadata accessible through your national discovery service?

B.2. Review of the monitoring indicators

MDi1 - a) How is the MDi1 indicator (Existence of Metadata) collected?

MDi1 - d) Should the indicator MDi1 be included in the dashboard?

MDi1,1/2/3 - a) How are the MDi1,1/2/3 indicators (Existence of Metadata for annexes I,II and III) collected?

MDi1,1/2/3 - d) Should the indicators MDi1,1/2/3 be included in the dashboard?

MDi1,4 - a) How is the MDi1,4 indicator (Existence of Metadata for spatial data services) collected?

MDi1,4 - d) Should the indicators MDi1,4 be included in the dashboard?

MDi2 - a) How is the MDi2 indicator (Conformity of metadata) collected?

MDi2 - d) Should the indicator MDi2 be included in the dashboard?

MDi2,1/2/3 - a) How are the MDi2,1/2/3 indicators (Conformity of metadata for annexes I,II and III) collected?

MDi2,1/2/3 - d) Should the indicators MDi2,1/2/3 be included in the dashboard?

MDi2,4 - a) How is the MDi2,4 indicator (Conformity of metadata for spatial data services) collected?

MDi2,4 - d) Should the indicators MDi2,4 be included in the dashboard?

DSi1 - a) How is the DSi1 indicator (Geographical coverage of spatial data sets) collected?

DSi1 - d) Should the indicator DSi1 be included in the dashboard?

DSi1,1/2/3 - a) How are the DSi1,1/2/3 indicators (Geographical coverage of spatial data sets) collected?

DSi1,1/2/3 - d) Should the indicators DSi1,1/2/3 be included in the dashboard?

DSi2 - a) How is the DSi2 indicator (Conformity of spatial data sets) collected?

DSi2 - d) Should the indicator DSi2 be included in the dashboard?

DSi2,1/2/3 - a) How are the DSi2,1/2/3 indicators (Conformity of spatial data sets) collected?


DSi2,1/2/3 - d) Should the indicators DSi2,1/2/3 be included in the dashboard?

NSi1 - a) How is the NSi1 indicator (Accessibility of metadata through discovery services) collected?

NSi1 - d) Should the indicator NSi1 be included in the dashboard?

NSi1,1 - a) How is the NSi1,1 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data set) collected?

NSi1,1 - d) Should the indicator NSi1,1 be included in the dashboard?

NSi1,2 - a) How is the NSi1,2 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data services) collected?

NSi1,2 - d) Should the indicator NSi1,2 be included in the dashboard?

NSi2 - a) How is the NSi2 indicator (Accessibility of spatial data set through view and download services) collected?

NSi2 - d) Should the indicator NSi2 be included in the dashboard?

NSi2,1 - a) How is the NSi2,1 indicator (Accessibility of spatial data set through view services) collected?

NSi2,1 - d) Should the indicator NSi2,1 be included in the dashboard?

NSi2,2 - a) How is the NSi2,2 indicator (Accessibility of spatial data set through download services) collected?

NSi2,2 - d) Should the indicator NSi2,2 be included in the dashboard?

NSi3 - a) How is the NSi3 indicator (Use of all network services) collected?

NSi3 - c) Should the indicator NSi3 be included in the dashboard?

NSi3,1 - a) How is the NSi3,1 indicator (Use of discovery services) collected?

NSi3,1 - c) Should the indicator NSi3,1 be included in the dashboard?

NSi3,2 - a) How is the NSi3,2 indicator (Use of view services) collected?

NSi3,2 - c) Should the indicator NSi3,2 be included in the dashboard?

NSi3,3 - a) How is the NSi3,3 indicator (Use of download services) collected?

NSi3,3 - c) Should the indicator NSi3,3 be included in the dashboard?

NSi3,4 - a) How is the NSi3,4 indicator (Use of transformation services) collected?

NSi3,4 - c) Should the indicator NSi3,4 be included in the dashboard?

NSi3,5 - a) How is the NSi3,5 indicator (Use of invoke services) collected?

NSi3,5 - c) Should the indicator NSi3,5 be included in the dashboard?

NSi4 - a) How is the NSi4 indicator (Conformity of all services) collected?

NSi4 - c) Should the indicator NSi4 be included in the dashboard?

NSi4,1 - a) How is the NSi4,1 indicator (Conformity of discovery services) collected?

NSi4,1 - c) Should the indicator NSi4,1 be included in the dashboard?

NSi4,2 - a) How is the NSi4,2 indicator (Conformity of view services) collected?

NSi4,2 - c) Should the indicator NSi4,2 be included in the dashboard?

NSi4,3 - a) How is the NSi4,3 indicator (Conformity of download services) collected?

NSi4,3 - c) Should the indicator NSi4,3 be included in the dashboard?

NSi4,4 - a) How is the NSi4,4 indicator (Conformity of transformation services) collected?

NSi4,4 - c) Should the indicator NSi4,4 be included in the dashboard?

NSi4,5 - a) How is the NSi4,5 indicator (Conformity of invoke services) collected?

NSi4,5 - c) Should the indicator NSi4,5 be included in the dashboard?

B.3. Questions related to the inclusion of reporting elements into the dashboard

Rep 1) Should information on 'coordination and quality assurance' be included in the dashboard?

Rep 2) Should information on 'contribution to the functioning and coordination of the infrastructure' be included in the dashboard?

Rep 3) Should information on 'use of the infrastructure for spatial information' be included in the dashboard?

Rep 4) Should information on 'data sharing arrangements' be included in the dashboard?

Rep 5) Should information on 'cost benefit aspects' be included in the dashboard?


Rep 6 - a) Do you collect any additional information during the monitoring process that is not mandated by the Decision on INSPIRE monitoring and reporting?

B.4. Questions related to the current and foreseen use of monitoring information at European and Member State level

Inf 1) Is the status of a) INSPIRE content and b) INSPIRE implementation presented to the eGovernment community?

Inf 2) Regarding INSPIRE monitoring information, is communication with domain-specific networks triggered?

Inf 3) Regarding INSPIRE monitoring information, are there ongoing investigation initiatives in synergy with Open Data?

Inf 4) Regarding INSPIRE monitoring information, are there activities in place targeting awareness raising and motivation of stakeholders?

Inf 5) Regarding INSPIRE monitoring information, is support for governance processes in place?

Inf 6) Is feedback collected on INSPIRE coordination processes?

Inf 7) Is feedback collected on INSPIRE implementation processes?


Annex C – Analysis of the responses from Member States to the questionnaire and answers to the questionnaire (Q1/2014).

C.1. Questions related to the implementation of the dashboard (questions D-B 1-13)

For each question or statement (code), the decision / conclusion of the working group:

D-B 1: Do you think there should be a common dashboard between MS?
  Decision: Proceed with the implementation of a dashboard.

D-B 2: Where should the dashboard be implemented?
  Decision: At central level and (extended) at Member State level.

D-B 3: The dashboard should contribute to monitoring the INSPIRE implementation process and the progress of EU countries.
  Decision: Agreement, provided that the dashboard does not replace the official monitoring (unless a MS requests it for itself).

D-B 4: One role of the dashboard should be to monitor the implementation status of INSPIRE in the Member States at a certain time, i.e. the dashboard reflects the content of the INSPIRE discovery services at that time.
  Decision: The dashboard should provide near real-time information and the possibility to create snapshots of the situation at a given time upon request by a MS.

D-B 5: Which is the main audience of the dashboard?
  Decision: European Commission, Member States, spatial data user community.

D-B 6: Which information should the dashboard provide?
  Decision: Focus on monitoring information and conformity issues of metadata, data and services (e.g. validation results).

D-B 7: The dashboard should function completely automatically by requesting the INSPIRE discovery services that are registered in the INSPIRE registry and deriving the monitoring information from the metadata the services provide.
  Decision: Agreement that this is the goal, but difficulties are expected in achieving it.

D-B 8: Describe the main benefits you perceive from having a dashboard.
  Decision: To be reviewed later.

D-B 9: Describe the main difficulties you perceive in having a dashboard.
  Decision: To be reviewed later.

D-B 10: Do you think it is necessary to modify any of the existing indicators?
  Decision: EC/EEA to use the answers as an input to the INSPIRE policy evaluation in an anonymous way.

D-B 11: Should the dashboard be linked to National Geoportals?
  Decision: Disregarded due to too many interpretations of the question.

D-B 12: How do you deal with data sets and services that are not described with metadata yet?
  Decision: To be reviewed later.

D-B 13: Are there only INSPIRE metadata accessible through your national discovery service?
  Decision: INSPIRE and non-INSPIRE datasets and services are mixed in the same catalogues.

C.2. Questions related to the inclusion of reporting elements (Art. 11 to 16) into the dashboard

Most of the respondents answered that they did not want to have reporting elements provided to the dashboard. From discussions amongst meeting participants, it appears that the main issue lies with the provision of information in a way which is relevant for a dashboard.

On a side note, the MS representatives expressed their wish for more precise guidelines about the content of the various sections of the report.

C.3. Questions related to the current and foreseen use of monitoring information at European and Member state level

It was decided to go back to the people who answered “currently active” or “foreseen” to get more details about their plans and practices in order to extract good practices to be shared across MS.

C.4. Review of the monitoring indicators

For each indicator (code), the decision / conclusion of the working group:

MDi1, MDi1,x: Should the indicator MDi1 (existence of metadata) be included in the dashboard?
  Decision: To be included in the dashboard.

MDi2, MDi2,x: Should the indicator MDi2 (conformity of metadata) be included in the dashboard?
  Decision: To be included in the dashboard. Comments associated with answers “Yes under certain conditions” to be analysed.

DSi1, DSi1,x: Should the indicator DSi1 (geographical coverage of spatial data sets) be included in the dashboard?
  Decision: NOT to be included in the dashboard.

DSi2, DSi2,x: Should the indicator DSi2 (conformity of data sets) be included in the dashboard?
  Decision: To be included in the dashboard. For each data set, provide the conformity status as stated in the metadata and the conformity status as reported from the commonly agreed validators to be provided under MIWP-5.

NSi1, NSi1,x: Should the indicator NSi1 (accessibility of metadata through discovery services) be included in the dashboard?
  Decision: To be included in the dashboard.

NSi2, NSi2,x: Should the indicator NSi2 (accessibility of spatial data sets through view and download services) be included in the dashboard?
  Decision: To be included in the dashboard. Information to be retrieved automatically from metadata or to be provided as AI (ref. dashboard concept) for some data sets if needed.

NSi3, NSi3,x: Should the indicator NSi3 (use of all network services) be included in the dashboard?
  Decision: To be included in the dashboard. Comments associated with answer c at question NSi3 - c to be reviewed.

NSi4, NSi4,x: Should the indicator NSi4 (conformity of all services) be included in the dashboard?
  Decision: To be included in the dashboard. For each service, provide the conformity status as stated in the metadata and the conformity status as reported from the commonly agreed validators to be provided under MIWP-5.


C.5. Answers to the questionnaire

Questions related to the INSPIRE dashboard

D-B 1) Do you think there should be a common dashboard between MS?

Yes 16 100%

No 0 0%

D-B 2) Where should the dashboard be implemented?

a) at central level only 7 44%

b) at central level and (extended) at Member State level 7 44%

Other 2 13%

Spain: yes
Belgium: at central level, and at MS level WHEN resources are available
UK: At Member State level and collated at central level (i.e. indicators harvested by the Commission)


D-B 3) The dashboard should contribute to monitoring the INSPIRE implementation process and the progress of EU countries

Agree 14 88%

Disagree 0 0%

Other 2 13%

Denmark: The dashboard should show which data sets and services each Member State provides to INSPIRE and which of them are INSPIRE compliant. This overview can inspire other member states in their efforts in identifying data sets in scope of annexes 1, 2 and 3.
UK: Agree, but on condition that Member States can control when indicators are calculated.

D-B 3.1) If "disagree" on the above assertion, could you explain why?
UK: As National Contact Point we would be concerned if dynamically calculated indicators in a dashboard were to be used as a basis for enforcing INSPIRE compliance. Therefore we would propose that the dashboard harvest indicators that are calculated in local dashboards, on a cycle determined by Member States with a minimum specified frequency (we would suggest annually, in line with the regulations). Indicators could be harvested in the XML format already specified for returning monitoring information, or an updated version of it.

D-B 4 - a) One role of the dashboard should be to monitor the implementation status of INSPIRE in the Member States at a certain time, i.e. the dashboard reflects the content of the INSPIRE discovery services at that time

Agree 14 88%

Disagree 1 6%

Other 1 6%

D-B 4 - b) If "disagree" on the above assertion, could you explain why?
Finland: Not sure if "certain time" means real-time monitoring here. It should contain a view of the monitoring status at the end of each year, but also real-time (for example daily or weekly) status monitoring: a yearly monitoring status view for seeing the long-term picture and a real-time status for seeing the current status.
UK: The time at which the dashboard is monitored should be controlled by each Member State National Contact Point, to avoid mis-reporting.

D-B 5) Which is the main audience of the dashboard?

a) European Commission 1 6%

b) Member States 1 6%

c) Spatial data user community 1 6%

d) all of them 11 69%

Other 2 13%

D-B 5 - b) If "b" on the above assertion, please elaborate on the type of information which could be interesting - useful to the spatial dataset and services users community Spain: with the abstract (the same as the abstract element of the metadata file) Finland: - list of national INSPIRE datasets and services - availability of dataset and service metadata and link to available metadata - metadata conformity and a validation report if not conformant - dataset availability in services and link to those services - service conformity and a validation report if not conformant The Netherlands: It is usefull to know if there is data on a specific theme (not yet available in the portal) for a region Sweden: The information in the Excel monitoring sheets are of great interest, as well as the custodians of the different datasets. It would help the NCP’s to follow and present the progress and availability of data within the country as well as identifying new cross-border datasets and to find contact organisations in other countries for these. It will also give a good overview on how different countries defines their datasets and what data is available from a cross-border perspective. UK: Progress in other Member States, particulary information about the types of data that have been published for each theme, achieving consistency accross territories is proving difficult. It will become more difficult as we move into transformation. So it would be useful to have this information.


D-B 6) Which information should the dashboard provide?

a) monitoring information only 5 31%

b) monitoring and reporting information 1 6%

c) conformity issues of metadata, data and services (e.g. validation results) 2 13%

d) all of them 4 25%

Other 4 25%

Spain: in two parts: - with dataset and service compliance metadata - with dataset and service non-compliance metadata Italy: a) & c) Belgium: monitoring information including validation results (Y/N or even more detailed?) IF all MS use the same validation tools which should be provided by the EC France: a) & c) The Netherlands: a + c


D-B 7 - a) The dashboard should function completely automatically by requesting the INSPIRE discovery services that are registered in the INSPIRE registry and deriving the monitoring information from the metadata the services provide

Agree 5 31%

Disagree 1 6%

Agree but this is not feasible presently 9 56%

Other 1 6%

Belgium: Agree IF metadata is adapted when it should appear that other information is useful/necessary for the dashboard/monitoring (which is at the moment not yet included in the metadata). Also: not all information can be included in the MD (e.g. use of services during a certain period). Therefore it is necessary to allow data input into the dashboard manually or from other sources than the metadata.

D-B 7 - b) If it is currently not feasible, could you explain why? Denmark: A result of an automatically process would rely of the quality of metadata. And from the Danish point of view the quality of metadata is in some instances varying quality. Italy: because of the heterogeneous status of implementation in public authorities and it is difficult to have an automatic derivation of monitoring information Greece: Because in Greece we do not yet have discovery services for all the data and spatial data services that fall under the scope of the INSPIRE directive. We therefore would like to be able to monitor the progress of our data producers manually and we want to know which datasets and services exist and should be compliant to INSPIRE. We think that not compliant data and services should be part of the monitoring list and we want to know that they do not have metadata, they are not INSPIRE compliant and there are not access services for them. The complete automatic monitoring of the implementation of the directive is correct and it is the only way to actually be sure that the monitoring information is real. Unfortunately, it is not feasible yet. Belgium: -metadata does not contain all the necessary information for automatic monitoring: (1) some fields required for monitoring are not present (yet) in metadata, (2) metadata is not always kept up-to-date by the data provider, (3) some metadata fields which will be necessary for the dashboard are not mandatory. Data providers will have to complete these metadata fields. (2) and (3) will be difficult since metadata is by most data providers considered as a burden and has low priority. -implementation cost to integrate the dashboard in own geoportal (if wanted), and if wanted to extend it with additional information. Finland: At least in Finland the discovery service includes also INSPIRE conformant metadata of non-INSPIRE datasets and services. There should be a commonly agreed mechanism how to separate the non-INSPIRE data from INSPIRE ones. On the other hand, all INSPIRE datasets and services don't (at least not yet) have metadata documents in the national discovery service. Anyway the existence of these datasets and services should also be indicated in the dashboard.


France: Some indicators (about area and use of the infrastructure) are not in the metadata.
Germany: Some of the required monitoring information could not be derived from metadata (use of services, relevant and actual area).
UK: We disagree and this is not currently feasible. We disagree because we think Member State National Contact Points should have an opportunity to validate their indicators before they are presented to a dashboard. This is particularly so if it is to be used to monitor legal compliance. It is not feasible because not all indicators can be calculated programmatically, particularly not in a federated SDI (such as that implemented in the UK). (See also the responses on indicators.)
Slovak Republic: Not all of the Slovak obliged organisations have registered their discovery services in the INSPIRE registry, and not all of them are completely in accordance with the requirements of INSPIRE.
Anonymous: Not all monitoring information is kept in the metadata (e.g. service requests, actual/relevant area). Datasets/services with no existing metadata are excluded.

D-B 8) Describe the main benefits you perceive from having a dashboard Analysis of the answers: This question was interpreted and answered as “what is the function/role you perceive/want the dashboard to have.” Perceived Roles functionality: a) Reduce through dashboard automated – real time processes, MS monitoring and reporting obligations and associated administrative burden. b) Improved, user friendly, accessible, comparative information regarding progress in implementing INSPIRE which would motivate wider, INSPIRE compliance by public authorities. c) Facilitated access and search services, to datasets and metadata according to different themes by different countries. Spain: The statistical knowledge of dataset and services in all country Cyprus: Monitoring the current status of SDI at any time. Estonia: Generates automatically reports Will reduce development cost Will help conformity testing Denmark: The dashboard will be a tool for the member states to inspire them in their affords in identifying data sets in scope of annex 1, 2 and 3. E.g. to answer the question "which data sets has my neighboring countries identified in scope of annex 3 US?" Italy: easier recovery of information related to INSPIRE Monitoring; full pan-european landscape on INSPIRE monitoring activities easily accessible; help EC to focus on MS weakness or lacks Greece: The main benefit would be the real time information on all the components of the member states' and European infrastructure. Belgium: - automatic and up-to-date (semi real time) monitoring - use of metadata (+ stimulus for data providers to update their metadata) - link between metadata, data and its services (should be implemented in the dashboard) - user friendly and easily publicly available - clear overview using graphics Finland: Automatizing the collection of monitoring information saves a great amount of yearly work. Showing an up-to-date status of INSPIRE implementation makes it easier for organisations to see what they still have to do to fulfil INSPIRE requirements.


Dashboard gives a more visible and concrete view to the monitoring information and INSPIRE in general. France: An easier way to show monitoring's results to managers and politics; a way to compare national results with the one of European countries. Germany: Transparency about the implementation status of INSPIRE in the member states. The Netherlands: efficiency, collecting the information by hand is a lot of work comparable information, it is collected the same way actual, it is the situation, not the wish Sweden: The possibility to get at quick overview of all available datasets and services within different countries would be a major benefit, as well who the data custodian in respective country is. Currently, each separate Excel monitoring sheet have to be downloaded and manipulated in order to get an idea of a specific aspect. It will also give a good overview on how different countries defines their datasets and what data is available from a cross-border perspective. UK: Greater transparency on the progress in delivering INSPIRE. Greater visability on extent of data published and performance and use of data services - enabling the benefits of these services to be demonstrated. Reduced burden on each MS to provide data if dashboard is implemented correctly, and provided it is combined with changes to monitoring indicators. Slovak Republic: eliminating manual copying information from metadata reducing the errors in monitoring and reporting graphical user more friendly interface exchange of standards for M&R between member states, clarification of requirements for M&R Anonymous: simplification of the national monitoring process

D-B 9) Describe the main difficulties you perceive in having a dashboard Analysis of the answers: The key difficulties described concern feasibility issues regarding the development of the dashboard, due to its reliance on metadata Current metadata problems can be summarized as: a) Some fields required for dashboard and monitoring missing from metadata or currently not mandatory b) Poor quality metadata (eg: not updated, incomplete, unverifiable, different interpretation of description fields) c) Absence of metadata for many datasets Additionally the issue of potential financial burden to MS for the implementation of the dashboard in National geoportals was mentioned as a potential difficulty. Spain: The lack of feedback the Inspire community It is necessary examples, guidelines, or rules more explain (the numeber of records is very differents in each country for the same theme). For instance: Is a inventary of plants a dataset? Cyprus: N/A Estonia: developing user-friendly interface Denmark: The dashboard shall show member states monitoring-information based on data sets and services metadata. The quality of metadata plays a huge role in this process. The dashboard should not be used as a control tool to check for conformity and progress in each member states INSPIRE implementation - there can be several reasons for why the number of available data sets can increase or decrease. E.g. organizational changes, new data-production-cooperation etc. Italy: Implementing and management (economically and technically) the dashboard at national scale; difficulties to apply standards for interoperability between National And central node; lack of a national endpoint node


Greece: If I have understood the structure of the dashboard well, one difficulty will be the fact that many of our datasets and services that fall under INSPIRE are not accessible by network or spatial data services, so it would be difficult to have real time information on them. Belgium: -metadata does not contain all the necessary information for automatic monitoring: (1) some fields required for monitoring are not present (yet) in metadata, (2) metadata is not always kept up-to-date by the data provider, (3) some metadata fields which will be necessary for the dashboard are not mandatory. Data providers will have to complete these metadata fields. (2) and (3) will be difficult since metadata is by most data providers considered as a burden and has low priority. -implementation cost to integrate the dashboard in own geoportal (if wanted), and if wanted to extend it with additional information. France: To reach consensus on functionnalities (quite easy) and on design (much more difficult). Germany: The dashboard could lead to a "competition" between the member states. The dashboard could be used as a control instrument by the commission. The Netherlands: not all indicators can be automatically generated out of the metadata Sweden: Metadata is not always correct, partly because of human errors entering faulty information, partly due to communication between the national geoportals and the INSPIRE geoportal. This may be a source for erroneous information in the dashboard system. This will, however, eventually be corrected as errors are discovered and data custodians can be enlightened on the errors. Another problem might be to differentiate between INSPIRE datasets and other national datasets (and products). UK: Getting appropriate data from the metadata records to provide the information, particularly as the quality of metadata is variable, particularly with some of the less mature data publishers. A lack of underpinning standards for the exchange of monitoring information. Slovak Republic: governance licensing framework manual input still needed Anonymous: the potential difficulties with the metadata quality (in the initial phase) will lead to better metadata quality later

D-B 10 - a) Do you think it is necessary to modify any of the existing indicators?

No 2 13%

Yes 13 87%

Other 0 0%


D-B 10 - b) If "yes" on the above question, please indicate which indicator(s) and why.

Poland: DSi1, NSi3, Denmark: Area indicators are irrelevant - so far their existence in the monitoring is unknown. The indicator for metadata existence is irrelevant if the monitoring is based on metadata. The indicator for metadata compliance to Metadata IR is irrelevant if the monitoring is based on metadata in discovery services. The indicator for data set / service is available in Discovery Service is irrelevant if the monitoring is based on metadata in discovery services. Italy: DSi1; DSi1.1; DSi1.2; DSi1.3 - not relevant NSi3; NSi3.1; NSi3.2; NSi3.3; NSi3.4; NSi3.5 – comparability issues Greece: I believe that the relevant and actual area do not give any valuable information on the implementation of the directive. This indicator has a meaning if it is reported for a dataset that has not been completed yet. Since the directive does not set requirements for the collection of data, counting the completion of the dataset might be out of the scope of the directive.

Finland: DSi1 indicator (Geographical coverage of spatial data sets) NSi3 indicator (Use off all network services). Information needed to calculate these indicators needs to be manually collected from data providing organizations. France: DSi1- NSi3 Germany: The indicators "Use of services" (NSi3) and "Geographical coverage" (DSi1) are quite difficult to calculate and don't say anything about the implementation status of INSPIRE. The Netherlands: calculation of the actual area and coverage more metadata is needed. At least the administrative unit it covers. the use of the services, it's not in the metadata also not in the new elements described in the SDSS. Use monitored at the EU INSPIRE portal can be an alternative. Sweden: The information for “relevant area” and “actual area” in particular. The indicator doesn’t provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all being captured to 100 % of its intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). Also the indicator for usage of services need to be considered. This indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously exists, for instance will metadata always exist if it is extracted from the INSPIRE geoportal UK: We think a comprehensive review of the indicators is required. It is clear that several indicators are redundant and that others offer little value to Member States, and indeed we struggle to see how they offer value to the Commission. We think this activity should be prioritised over a dashboard but done with the aim of delivering a dashboard in mind. Anonymous: I am not sure about the usefulnes of most indicators at all; some are useless at all (e.g. number of services), some are useless after the initial phase until 2015/2016; maybe we need a better definition of some indicators


D-B 11 - a) Should the dashboard be linked to National Geoportals?

Yes 9 56%

No 2 13%

Other 5 31%

Poland: yes, if it is possible Belgium: National and/or regional portals The Netherlands: not sure Anonymous: this should be possible, but not mandatory

D-B 11 - b) Could you elaborate the reasons of your choice in the above question? Spain: We think it is necesary to link with national catalogue. Cyprus: Immediate access of local users. Denmark: The dashboard should be linked to and harvest information from the EU INSPIRE Geoportal. Italy: yes, but at the moment, it is not easily feasibile because of the heterogeneous status of implementation in public authorities Greece: I am not sure I understand the actual linkage mentioned but if it means that the user of the national geoportal will be given the technical means to be informed for the content of the dashboard, I think that this might be the correct way to pass the information of the dashboard to the users. Finland: It seems obvious to do the linking because the dashboard provides information about the status of national SDI implementation. France: Save money and time by using national SDI Germany: Depends on what is meant by "linked". The Netherlands: what is the benefit of that? Sweden: The purpose of having a central dashboard is to limit the burden of monitoring by each member state (NCP) but still be able to access monitoring information, i.e. from the result of the monitoring. Theoretically, the INSPIRE geoportal should contain the same information as the national geoportals, which mean that an automatic routine for extraction of monitoring information from the INSPIRE geoportal should be enough UK: We believe that the majority of the indicators required to monitor INSPIRE can be derived from discovery metadata published through the national geoportal or by automatically querying the network services registered with these portals.


Slovak Republic: Ensure consistency with the other INSPIRE components (metadata, network services, interoperability, data sharing, reporting).

D-B 12) How do you deal with data sets and services that are not described with metadata yet?

Analysis of the answers: This was an important issue dealt with differently using three basic approaches: a) If a dataset or service doesn’t have metadata it is not recorded or considered for inclusion in the monitoring and reporting. b) Manual recording of metadata information into monitoring excel sheets c) Purposeful creation of metadata and subsequent inclusion within the monitoring. Approaches b) and c) where described as time consuming, and transitional as the aim is for all to finally adopt approach a). Spain: Don't include in monitoring and reporting Cyprus: Pending issue that needs to be handled ASAP. Estonia: Metadata is always described before we start dealing with the dataset Denmark: They are part of the monitoring - their monitoring-information are typed in manually. Their existence in the monitoring indicates progress in the implementation of INSPIRE. Italy: data providers should describe data sets and services with metadata before publishing Greece: We collect the relevant monitoring information from the data producers through an excel sheet that contains all the discovery metadata elements along with all the monitoring elements. We fill out the INSPIRE monitoring template manually. The most difficult part, since we do not have actual access to the datasets and services, is that we cannot be sure that the information we collect from the data producers is correct and we cannot be sure that the data actually fall under the scope of the directive. Belgium: Stimulate data providers to describe their data and services with metadata. Datasets and services which have no metadata will not appear in the dashboard. Consequently, in the beginning, the information shown in the dashboard will differ from the information in the current excel files. It will grow and become more complete as metadata will be completed. The quality and completeness of the information of the dashboard will, for the main part, depend on the quality and completeness of the metadata. Finland: If we know that a dataset or service exist we include it manually in yearly INSPIRE monitoring information collection sheet. This is quite easy for public authorities. For municipalities (in Finland >300) we only include in monitoring those datasets and services that have been described with metadata because we don't know for sure about the existence of non-described data. France: We do not even know if they exist under digital form, so we think it is impossible to deal with Germany: Currently they are manually added in Excel. But in the future, if the monitoring information is derived by metadata we have an agreement that only those data sets are included in the monitoring that are already described by metadata and accessible through a discovery service. The Netherlands: we report on them Sweden: This is a temporal problem only. As the SDI evolves all data will eventually be published. Data not published are thus assumed non-existing for the time being. When the data custodians are ready to provide the data, they will also publish metadata for the data. The role of the NCP’s are to inform and encourage the data custodians to publish metadata for their data as soon as possible, and also help the data custodians to identify which data are INSPIRE data (the dashboard will serve well in this context).


UK: We only report on datasets and services that are described with metadata, due to the federated nature of our data we would not be aware of INSPIRE data that was not registered. Slovak Republic: where possible and relevant metadata are created, where not possible information is monitored manually Anonymous: either we don’t know them or we ask the data owner to fulfil the INSPIRE obligations

D-B 13 - a) Are there only INSPIRE metadata accessible through your national discovery service?

Yes 3 31%

No 13 69%

D-B 13 - b) If "no" on the above question, how do you distinguish between metadata of INSPIRE data sets or services and other metadata? Spain: By XML schema Estonia: Name of the dataset in the metadata includes a word 'INSPIRE' Denmark: By using keywords. We have instructed the data- and services providers in using a keyword "INSPIRE" to indicated if the data set / service is INSPIRE or not. Our monitoring application filter the metadata using this keyword. Italy: through a "flag" in the Italian metadata format Belgium: The metadatasets of INSPIRE datasets and services have the keyword “Lijst M&R INSPIRE”. Finland: We have a "non-INSPIRE" keyword for non-INSPIRE datasets but this hasn't been a very well working solution. France: See technical methodology sent to MIWP-5 : in two words, use of GEMET thesaurus in Metadata + analysis of URL for the services Germany: We use an additional keyword in the metadata called "inspireidentifiziert". The Netherlands: on a database attribute category INSPIRE, it's not based on something in the metadata. It is set in the portal. Sweden: In the metadata catalogue, a separate metadata element has been introduced in order to differentiate between INSPIRE data and other data published (not following the INSPIRE specifications but of interest mainly from a national perspective).


UK: Using the appropriate GEMET keywords in the metadata.
Anonymous: we have some OpenData services, most of them additional to the INSPIRE services, but not conformant.
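Several answers above rely on a dedicated keyword to separate INSPIRE records from other records in a mixed national catalogue. Purely as an illustration (a sketch assuming OWSLib; the endpoint URL and the keyword value are placeholders, not taken from any Member State's answer), such a keyword-based selection against a CSW discovery service could look like this:

```python
# Hypothetical sketch: select records carrying an "INSPIRE" keyword from a
# national CSW discovery service. The URL and keyword are placeholders.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsEqualTo

csw = CatalogueServiceWeb("https://example-geoportal.eu/csw")  # placeholder endpoint
inspire_only = PropertyIsEqualTo("csw:AnyText", "INSPIRE")     # or a dedicated keyword queryable

csw.getrecords2(constraints=[inspire_only], maxrecords=100)
for uuid, rec in csw.records.items():
    print(uuid, rec.title)
```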


INSPIRE Indicators -------------------------

MDi1 - a) How is the MDi1 indicator (Existence of Metadata) collected?

a) automated collection through metadata information 4 25%

b) through other automated process 4 25%

c) manually 8 50%

MDi1 - b) If "b" in the previous question, could you describe the process? Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator. Anonymous: question is not clear: data are collected manually, indicator is computed in the EXCEL-file

MDi1 - c) Please provide comments regarding the indicator MDi1 Spain: We don´t check it Cyprus: Delay in collecting the information. Greece: There is no way to make sure that the information we collect manually is real. Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%). Finland: Information on non-existing metadata can't be collected automatically. France: easy : by construction, the result is 100% as we check that every known datasets were in the FR SDI Sweden: As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 % UK: We feel this indicator is not useful, in our case it is always 1 as we can only report a dataset or service if metadata exists due to the nature of our SDI.


Slovak Republic: All indicators are collected via a web form into a database, from which an xls report is exported based on the monitoring template. However, after the export there is still a need for manual checking and updating.
Anonymous: not useful in future; you ask for the conformance to legal requirements - I am not sure if this makes sense

MDi1 - d) Should the indicator MDi1 be included in the dashboard?

a) yes 10 67%

b) no 5 33%

c) Yes under certain conditions 0 0%

Other 0 0%

MDi1,1/2/3 - a) How are the MDi1,1/2/3 indicators (Existence of Metadata for annexes I,II and III) collected?

a) automated collection through metadata information 4 27%

b) through other automated process 3 20%

c) manually 8 53%

MDi1,1/2/3 - b) If "b" in the previous question, could you describe the process? Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. Anonymous: question is not clear: data are collected manually, indicator is computed in the EXCEL-file


MDi1,1/2/3 - c) Please provide comments regarding the indicators MDi1,1/2/3 Spain: We don´t check it Cyprus: Serious delay in collection. Useful information. Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%). France: easy to compute as we use the validator integrated in the national geocatalogue, but strongly difficult to understand as one can affect a dataset to many themes. Furthermore, many producers don't understand to which theme the dataset belongs Sweden: The same comments as for the question above [As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 %] Anonymous: not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense

MDi1,1/2/3 - d) Should the indicators MDi1,1/2/3 be included in the dashboard?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%


MDi1,4 - a) How is the MDi1,4 indicator (Existence of Metadata for spatial data services) collected?

a) automated collection through metadata information 3 20%

b) through other automated process 4 27%

c) manually 8 53%

MDi1,4 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EU monitoring spreadsheet. This provides the indicator. Anonymous: see above [not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense]

MDi1,4 - c) Please provide comments regarding the indicator MDi1,4

Spain: We don´t check it Cyprus: Useful. Rather hard to collect. Belgium: At this moment, the information Existence of Metadata is collected manually using the monitoring Excel file which is distributed to the data providers. When retrieving the main part of the monitoring information through the CSW, this indicator will become obsolete (will always be 100%). Finland: Information on non-existing metadata can't be collected automatically. Sweden: The same comments as for the question above [As only metadata published in the national geoportal is considered, this indicator is indeed redundant, especially from an automatic dashboard perspective. In Sweden, all data custodians are required by law to publish metadata for their data and services in the national geoportal. This means that this will always be 100 %] Anonymous: see above [not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense]


MDi1,4 - d) Should the indicator MDi1,4 be included in the dashboard?

a) yes 10 71%

b) no 3 21%

c) Yes under certain conditions 1 7%

Other 0 0%

MDi2 - a) How is the MDi2 indicator (Conformity of metadata) collected?

a) automated collection through metadata information 3 21%

b) through other automated process 3 21%

c) manually 8 57%

MDi2 - b) If "b" in the previous question, could you describe the process? Poland: The information is taken from our National Geoportal - validated files Finland: Metadata validity is tested one by one with EU commission validator. Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: We fill in the field in the excel spreadsheet manually.

MDi2 - c) Please provide comments regarding the indicator MDi2 Spain: We don´t check it Cyprus: Useful. Rather hard to collect.


France: easy to compute as we use the validator integrated in the national geocatalogue The Netherlands: we prefer one EU validation tool used by each MS the monitoring of this indicator should be based on this Sweden: Metadata is validated before published in the national geoportal which means that the metadata published are always according to the INSPIRE requirements UK: Metadata is only harvested to our SDI if it passes strict validation. Only records within our SDI are reported in the monitoring report, so the answer to this indicator is always 1.

MDi2 - d) Should the indicator MDi2 be included in the dashboard?

a) yes 10 67%

b) no 3 20%

c) Yes under certain conditions 2 13%

Other 0 0%

MDi2,1/2/3 - a) How are the MDi2,1/2/3 indicators (Conformity of metadata for annexes I,II and III) collected?

a) automated collection through metadata information 4 29%

b) through other automated process 2 14%

c) manually 8 57%

MDi2,1/2/3 - b) If "b" in the previous question, could you describe the process? Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: We fill in the field in the excel spreadsheet manually.


MDi2,1/2/3 - c) Please provide comments regarding the indicators MDi2,1/2/3 Spain: We don´t check it Cyprus: Useful. Time consuming collection. France: Same issue than for MDi1 1/2/3 [easy to compute as we use the validator integrated in the national geocatalogue, but strongly difficult to understand as one can affect a dataset to many themes. Furthermore, many producers don't understand to which theme the dataset belongs] The Netherlands: we prefer one EU validation tool used by each MS the monitoring of this indicator should be based on this Sweden: The same comments as for the question above [Metadata is validated before published in the national geoportal which means that the metadata published are always according to the INSPIRE requirements] UK: Metadata is only harvested to our SDI if it passes strict validation. Only records within our SDI are reported in the monitoring report so the answer to this indicator is always 1.

MDi2,1/2/3 - d) Should the indicators MDi2,1/2/3 be included in the dashboard?

a) yes 9 53%

b) no 4 27%

c) Yes under certain conditions 2 20%

Other 0 0%


MDi2,4 - a) How is the MDi2,4 indicator (Conformity of metadata for spatial data services) collected?

a) automated collection through metadata information 3 21%

b) through other automated process 3 21%

c) manually 8 57%

MDi2,4 - b) If "b" in the previous question, could you describe the process? Finland: Metadata validity is tested one by one with EU commission validator. Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually. UK: We fill in the field in the excel spreadsheet manually.

MDi2,4 - c) Please provide comments regarding the indicator MDi2,4 Spain: We don´t check it Cyprus: Time consuming. The Netherlands: we prefer one EU validation tool used by each MS the monitoring of this indicator should be based on this Sweden: The same comments as for the question above [Metadata is validated before published in the national geoportal which means that the metadata published are always according to the INSPIRE requirements] UK: Metadata is only harvested to our SDI if it passes strict validation, as only records within our SDI are reported in the monitoring report the answer to this indicator is always 1.


MDi2,4 - d) Should the indicator MDi2,4 be included in the dashboard?

a) yes 10 67%

b) no 3 20%

c) Yes under certain conditions 2 13%

Other 0 0%

DSi1 - a) How is the DSi1 indicator (Geographical coverage of spatial data sets) collected?

a) automated collection through metadata information 1 6%

b) through other automated process 2 13%

c) manually 13 81%

DSi1 - b) If "b" in the previous question, could you describe the process? Anonymous: see above (not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense)

DSi1 - c) Please provide comments regarding the indicator DSi1 Poland: difficult in collecting Spain: We think, there will be a list of official geographical coverage. For instance the list of Eurostat surface Cyprus: Useful. Time consuming. Italy: time consuming and not relevant


Finland: This information is hard to collect. Possibly not very useful information because INSPIRE itself doesn't require collecting new data. France: impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it. Germany: The data providers have difficulties in understanding the indicator, especially what is meant by actual and relevant area. The required information to calculate the indicator can't be derived from metadata. This indicator is not very feasible to report the implementation status. The Netherlands: it can be calculated if in the metadata the administrative unit it covers is added Sweden: An extra metadata element has been included in the national metadata catalogue. The data custodians are required to fill this information in. This makes it easy to extract this information from the catalogue. See also the comment under D-B 10 [The information for “relevant area” and “actual area” in particular. The indicator doesn’t provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all being captured to 100 % of its intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). Also the indicator for usage of services need to be considered. This indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously exists, for instance will metadata always exist if it is extracted from the INSPIRE geoportal] UK: This indicator is problematic for us as the only way we can collect it is to request our data providers provide it manually - and then we add it to the monitoring return manually. In previous years this hasn't been problematic as we have had low numbers of and experienced data publishers. However with the publication over a large quantity of aII data from a large number of less mature data publishers in 2013 this has become exceptionally burdensome. We are not able to provide this indicator in 2014. Anonymous: indicator is usually 100%

DSi1 - d) Should the indicator DSi1 be included in the dash board?

a) yes 5 33%

b) no 8 53%

c) Yes under certain conditions 1 7%

Other 1 7%


DSi1,1/2/3 - a) How are the DSi1,1/2/3 indicators (Geographical coverage of spatial data sets) collected?

a) automated collection through metadata information

1 7%

b) through other automated process 1 7%

c) manually 13 87%

DSi1,1/2/3 - b) If "b" in the previous question, could you describe the process?
France: Impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it.

DSi1,1/2/3 - c) Please provide comments regarding the indicators DSi1,1/2/3
Spain: We don't check it.
Cyprus: Useful. Time consuming.
Italy: Time consuming and not relevant.
Belgium: The value "Geographical coverage of spatial data sets" is useful and interesting for an individual dataset. You can see if a dataset is complete or under construction, and you can see the progress over the years. The surplus value of the indicator, however, is not clear. What do we learn from this indicator? What is the purpose?
Finland: This information is hard to collect. Possibly not very useful information, because INSPIRE itself doesn't require collecting new data.
France: See above [Impossible to get it at a reasonable cost. We never gave this indicator as we do not know how to get it].
Germany: The data providers have difficulties in understanding the indicator, especially what is meant by actual and relevant area. The required information to calculate the indicator can't be derived from metadata. This indicator is not very suitable for reporting the implementation status.
Sweden: The same comments as for the question above [An extra metadata element has been included in the national metadata catalogue. The data custodians are required to fill this information in, which makes it easy to extract from the catalogue. See also the comment under D-B 10 (The information for "relevant area" and "actual area" in particular. The indicator doesn't provide much information at all. Assume there are 250+ datasets on the list, which I think is a very modest assumption, all captured to 100 % of their intended coverage. The overall indicator will then also show 100 %. One additional dataset will, for the short time data capture is in progress, not change that overall 100 % significantly (20 % captured = 99.7 %, 50 % captured = 99.8 %, etc.). The indicator for usage of services also needs to be considered. This indicator cannot be captured automatically from the metadata. Some indicators are also redundant as they obviously exist; for instance, metadata will always exist if it is extracted from the INSPIRE geoportal)].
UK: This indicator is problematic for us, as the only way we can collect it is to request that our data providers provide it manually - and then we add it to the monitoring return manually. In previous years this hasn't been problematic, as we have had low numbers of, and experienced, data publishers. However, with the publication of a large quantity of data from a large number of less mature data publishers in 2013, this has become exceptionally burdensome. We are not able to provide this indicator in 2014.

DSi1,1/2/3 - d) Should the indicators DSi1,1/2/3 be included in the dash board?

a) yes 4 27%

b) no 8 53%

c) Yes under certain conditions 1 7%

Other 2 13%

DSi2 - a) How is the DSi2 indicator (Conformity of spatial data sets) collected?

a) automated collection through metadata information

3 19%

b) through other automated process 3 19%

c) manually 10 63%


DSi2 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
Anonymous: The question is not clear: data are collected manually; the indicator is computed in the Excel file.

DSi2 - c) Please provide comments regarding the indicator DSi2
Spain: We don't check it.
Cyprus: Useful. Time consuming.
Belgium: All MS should use the same validation tools, which should be provided by the EC. It is debatable whether only Y/N should appear in the dashboard per dataset/service, or maybe a percentage (e.g. 90% conform). Of course, this depends on the information you get back from the validation tools. Ideally, the dashboard should be connected with the validation tools and perform a validation test on request or automatically at predefined times. Besides that, the validation tools should evidently be available 'off line' (meaning disconnected from the dashboard) for testing.
Finland: Conformity information is included in the metadata and is easily available there, but this information can be provided without doing any validation on the actual data. The lack of validation perhaps makes this information a bit unreliable.
France: This indicator is under the producer's responsibility.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though.
UK: We have not provided this indicator yet, as we have not considered conformity of data. We would anticipate extracting this from discovery metadata.
Anonymous: The only really useful indicator until 2020.
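
Several of the comments above note that DSi2 could in principle be read from the conformance statement already present in the discovery metadata. The following is a minimal sketch (not one of the subgroup's prototype tools) of how the declared degree of conformity, including the "not evaluated" workaround mentioned by Sweden, might be extracted from an ISO 19139 record; the record URL passed by the caller is a placeholder.

# Minimal sketch: read the declared conformity degree from one ISO 19139
# metadata record. Real monitoring would iterate over all harvested records.
from lxml import etree
import urllib.request

NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
}

def declared_conformity(record_url):
    """Return 'conformant', 'not conformant' or 'not evaluated' for one record."""
    with urllib.request.urlopen(record_url) as resp:
        tree = etree.parse(resp)
    results = tree.findall(".//gmd:DQ_ConformanceResult", NS)
    if not results:
        return "not evaluated"  # no conformance statement present at all
    for result in results:
        boolean = result.find("gmd:pass/gco:Boolean", NS)
        if boolean is None or not (boolean.text or "").strip():
            # the workaround mentioned above: an empty gmd:pass (nilReason)
            return "not evaluated"
        if boolean.text.strip().lower() == "true":
            return "conformant"
    return "not conformant"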

DSi2 - d) Should the indicator DSi2 be included in the dash board?

a) yes 13 81%

b) no 0 0%

c) Yes under certain conditions 3 19%

Other 0 0%


DSi2,1/2/3 - a) How are the DSi2,1/2/3 indicators (Conformity of spatial data sets) collected?

a) automated collection through metadata information

4 27%

b) through other automated process 2 13%

c) manually 9 60%

DSi2,1/2/3 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.

DSi2,1/2/3 - c) Please provide comments regarding the indicators DSi2,1/2/3
Spain: We don't check it.
Cyprus: Useful. Time consuming.
Finland: Conformity information is included in the metadata and is easily available there, but this information can be provided without doing any validation on the actual data. The lack of validation perhaps makes this information a bit unreliable.
France: Idem [This indicator is under the producer's responsibility].
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though].
UK: We have not provided this indicator yet, as we have not considered conformity of data. We would anticipate extracting this from discovery metadata.


DSi2,1/2/3 - d) Should the indicators DSi2,1/2/3 be included in the dash board?

a) yes 13 81%

b) no 0 0%

c) Yes under certain conditions 3 19%

Other 0 0%

NSi1 - a) How is the NSi1 indicator (Accessibility of metadata through discovery services) collected?

a) automated collection through metadata information

4 25%

b) through other automated process 3 19%

c) manually 9 56%

NSi1 - b) If "b" in the previous question, could you describe the process?

Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: We fill in the field in the Excel spreadsheet manually.
Anonymous: See above [the only really useful indicator until 2020].

NSi1 - c) Please provide comments regarding the indicator NSi1

Spain: We don't check it.
Cyprus: Useful. Time consuming.
Belgium: This indicator will become obsolete if the dashboard retrieves information via the discovery services.
Finland: This indicator overlaps with MDi1, but both might still be useful.
France: Idem MDi1 [easy: by construction the result is 100 %, as we check that every known dataset is in the FR SDI].
The Netherlands: If the monitoring is based on the content of the discovery service, this indicator is not needed; if it exists in the discovery service it is in the monitoring, otherwise not.
Sweden: As the information is derived from the national metadata catalogue, it is automatically derived.
UK: Only records within our SDI central catalogue are reported in the monitoring report, so the answer to this indicator is always 1.
Anonymous: Not useful in future. You ask for the conformance to legal requirements - I am not sure if this makes sense.
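
Several answers above point out that NSi1 becomes trivially 1 once the monitoring is driven by the discovery service itself. The underlying check is a standard CSW 2.0.2 GetRecordById request; the sketch below illustrates it with a placeholder endpoint and identifier (it is not the workflow prescribed by the subgroup).

# Minimal sketch: check whether a metadata record is reachable through a
# discovery service with a standard CSW 2.0.2 GetRecordById request.
import urllib.parse
import urllib.request

def record_discoverable(csw_endpoint, metadata_uuid):
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecordById",
        "id": metadata_uuid,
        "outputSchema": "http://www.isotc211.org/2005/gmd",
        "elementSetName": "full",
    }
    url = csw_endpoint + "?" + urllib.parse.urlencode(params)
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            body = resp.read()
    except OSError:
        return False
    # Count the record as accessible if the response echoes its identifier.
    return metadata_uuid.encode() in body

# Example (placeholder endpoint and UUID):
# record_discoverable("https://example.org/csw", "uuid-of-the-record")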

NSi1 - d) Should the indicator NSi1 be included in the dash board?

a) yes 8 53%

b) no 4 27%

c) Yes under certain conditions 3 20%

Other 0 0%

NSi1,1 - a) How is the NSi1,1 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data set) collected?

a) automated collection through metadata information

4 27%

b) through other automated process 2 13%

c) manually 9 60%


NSi1,1 - b) If "b" in the previous question, could you describe the process?
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: We fill in the field in the Excel spreadsheet manually.

NSi1,1 - c) Please provide comments regarding the indicator NSi1,1
Spain: We don't check it.
Cyprus: Useful. Time consuming.
Finland: Information on non-existing metadata can't be collected automatically.
France: Idem MDi1 [easy: by construction the result is 100 %, as we check that every known dataset is in the FR SDI].
The Netherlands: This is always the case; the discovery service is harvested in the EU INSPIRE portal.
Sweden: The same comments as for the question above [As the information is derived from the national metadata catalogue, it is automatically derived].
UK: Because all metadata included in our monitoring return is drawn from our SDI's central catalogue, our answer for this is always 1.

NSi1,1 - d) Should the indicator NSi1,1 be included in the dash board?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%


NSi1,2 - a) How is the NSi1,2 indicator (Accessibility of metadata through discovery services - possibility to search for spatial data services) collected?

a) automated collection through metadata information

4 29%

b) through other automated process 2 14%

c) manually 8 57%

NSi1,2 - b) If "b" in the previous question, could you describe the process?
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: We fill in the field in the Excel spreadsheet manually.

NSi1,2 - c) Please provide comments regarding the indicator NSi1,2
Spain: We don't check it.
Cyprus: Useful.
Finland: Information on non-existing metadata can't be collected automatically.
France: No issue.
The Netherlands: This is always the case; the discovery service is harvested in the EU INSPIRE portal.
Sweden: The same comments as for the question above [As the information is derived from the national metadata catalogue, it is automatically derived].
UK: Because all metadata included in our monitoring return is drawn from our SDI's central catalogue, our answer for this is always 1.


NSi1,2 - d) Should the indicator NSi1,2 be included in the dash board?

a) yes 9 60%

b) no 4 27%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi2 - a) How is the NSi2 indicator (Accessibility of spatial data set through view and download services) collected?

a) automated collection through metadata information

1 7%

b) through other automated process 2 13%

c) manually 12 80%

NSi2 - b) If "b" in the previous question, could you describe the process?
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.

NSi2 - c) Please provide comments regarding the indicator NSi2
Cyprus: Useful.
Denmark: A link or some kind of connection information between a data set and its services (view and download) is missing and could be considered as part of the monitoring. There is currently no connection between a data set and its services - in other words, it is not possible to see where a data set is available and visible/viewable. In some cases the service providers name their services so that it is recognisable which data set the service provides, but as the monitoring is now, only the theme and annex are indicated.
Belgium: Guidelines for MD will be necessary to enable automatic retrieval of this information via the MD (how to complete the 'online resources' fields?). The catalogue in the Flemish Geoportal http://www.geopunt.be/catalogus retrieves this information automatically from the MD. When you select a dataset in the catalogue, the buttons 'bekijk op kaart' (= view) and 'download' are directly connected with the MD of that dataset. When the online resources are filled in 'correctly' in the MD record, the buttons in the Geopunt catalogue are activated automatically.
Finland: If service metadata doesn't exist or is not complete, this information has to be collected manually.
France: How to find simple download services (ATOM or HTTP/GET)? We check the URL syntax (in order to find most of the download services under Atom or other modes), but we obviously miss some.
Sweden: This is not done at the moment but can be deduced from the "Coupled resource" metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there.
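
France's remark above, that simple (ATOM) download services can only be found by inspecting URL syntax, can be illustrated with a small heuristic. The sketch below assumes the online-resource URLs have already been harvested from the metadata; as France notes, such a heuristic will inevitably miss or misclassify some services.

# Minimal sketch of the URL-syntax heuristic mentioned above: guess the type
# of a download service from an online-resource URL harvested from metadata.
from urllib.parse import urlparse, parse_qs

def guess_download_type(url):
    parsed = urlparse(url)
    query = {k.lower(): [v.lower() for v in vals]
             for k, vals in parse_qs(parsed.query).items()}
    path = parsed.path.lower()
    if path.endswith((".xml", ".atom")) or "atom" in path:
        return "ATOM feed (candidate)"
    if query.get("service") == ["wfs"] or "wfs" in path:
        return "WFS download service (candidate)"
    if query.get("service") == ["wcs"] or "wcs" in path:
        return "WCS download service (candidate)"
    return "unknown"

# Example: guess_download_type("https://example.org/atom/dataset-feed.xml")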

NSi2 - d) Should the indicator NSi2 be included in the dash board?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%


NSi2,1 - a) How is the NSi2,1 indicator (Accessibility of spatial data set through view services) collected?

a) automated collection through metadata information

3 21%

b) through other automated process 2 14%

c) manually 9 64%

NSi2,1 - b) If "b" in the previous question, could you describe the process?
Spain: With a validation tool; we examine the GetCapabilities file.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.
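
Spain's answer above refers to examining the GetCapabilities file with a validation tool. The simplest form of such a check is sketched below for a view (WMS) service: it only verifies that a parsable capabilities document is returned, which is far less than real INSPIRE validation; the endpoint is a placeholder.

# Minimal sketch of a GetCapabilities availability check for a view service.
# This only confirms the service answers with a parsable WMS capabilities
# document; it is not a substitute for the validation tools discussed here.
import urllib.parse
import urllib.request
from xml.etree import ElementTree

def view_service_responds(wms_endpoint):
    params = {"service": "WMS", "version": "1.3.0", "request": "GetCapabilities"}
    url = wms_endpoint + "?" + urllib.parse.urlencode(params)
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            root = ElementTree.fromstring(resp.read())
    except (OSError, ElementTree.ParseError):
        return False
    return root.tag.endswith("WMS_Capabilities")

# Example (placeholder endpoint): view_service_responds("https://example.org/wms")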

NSi2,1 - c) Please provide comments regarding the indicator NSi2,1
Cyprus: Useful.
Finland: If service metadata doesn't exist or is not complete, this information has to be collected manually.
France: No issue.
Sweden: The same comments as for the question above [This is not done at the moment but can be deduced from the "Coupled resource" metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there].


NSi2,1 - d) Should the indicator NSi2,1 be included in the dash board?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi2,2 - a) How is the NSi2,2 indicator (Accessibility of spatial data set through download services) collected?

a) automated collection through metadata information

2 14%

b) through other automated process 2 14%

c) manually 10 71%

NSi2,2 - b) If "b" in the previous question, could you describe the process?

Spain: With a validation tool; we examine the GetCapabilities file.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
UK: An automated management information report is run against our CKAN database (data.gov.uk). This produces a series of CSV reports. We take these reports and manually process them to populate the EC monitoring spreadsheet. This provides the indicator.


NSi2,2 - c) Please provide comments regarding the indicator NSi2,2
Cyprus: Useful.
Finland: If service metadata doesn't exist or is not complete, this information has to be collected manually.
France: How to find simple download services (ATOM or HTTP/GET)? We check the URL syntax (in order to find most of the download services under Atom or other modes), but we obviously miss some.
Sweden: The same comments as for the question above [This is not done at the moment but can be deduced from the "Coupled resource" metadata element for services in the metadata implementing rule (1.6). This requires the functionality to be implemented in the Swedish Geodataportal, which is under discussion. If the information is to be captured from the INSPIRE geoportal, this should be implemented there].

NSi2,2 - d) Should the indicator NSi2,2 be included in the dash board?

a) yes 13 87%

b) no 0 0%

c) Yes under certain conditions 2 13%

Other 0 0%

NSi3 - a) How is the NSi3 indicator (Use of all network services) collected?

a) automatically 1 7%

b) manually 13 93%


NSi3 - b) Please provide comments regarding the indicator NSi3
Poland: Very hard to collect; no such mechanism is available (especially for a one-year period).
Cyprus: Useful.
Italy: Comparability issues between data providers.
Greece: There might be a need to better define "use". Is it the number of unique requests, the number of unique visitors, or the number of other applications that use the service? I am not sure that the service providers are sure about the number they (manually) provide. Monitoring the actual use of a service by other applications and portals might be more useful for documenting the benefits and the added value of an INSPIRE service. I think that the benefits and the added value are the actual meaning of this indicator.
Finland: Useful information for following service use statistics, but hard to collect because it has to be collected manually from service providers.
France: No centralised information and too many public authorities; many servers have no statistics systems; the cost of getting this information would not be reasonable.
Germany: Most of the data providers can't provide this information, because it's not measured, so we assume "0" requests in such cases. Thus the value of the indicator is not reliable and doesn't say anything about the use of the services or the implementation status.
The Netherlands: Is an alternative possible; the use via the EU portal?
Sweden: A "receiving point" to which data providers can send log files is under development and is expected to be functional this spring.
UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain it. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.
Anonymous: The number of service requests should be better defined (number of layers per service, ...).
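
Several comments above ask what "use" actually means and how it could be gathered; Sweden mentions a receiving point for provider log files. The sketch below assumes providers can deliver ordinary web-server access logs and counts OGC operation requests, which is only one of the possible definitions of "use" raised above; the operation list is an illustrative assumption, not an agreed definition.

# Minimal sketch, assuming providers can deliver ordinary web-server access
# logs: count service requests per OGC operation so that "use" has a single
# definition. The operation list below is an assumption for illustration.
import re
from collections import Counter

OPERATIONS = ("GetRecords", "GetMap", "GetTile", "GetFeature", "GetCoverage")

def count_requests(log_lines):
    pattern = re.compile(r"request=(" + "|".join(OPERATIONS) + r")", re.IGNORECASE)
    counts = Counter()
    for line in log_lines:
        match = pattern.search(line)
        if match:
            # normalise case so "getmap" and "GetMap" are counted together
            counts[match.group(1).lower()] += 1
    return counts

# Example:
# with open("access.log") as log:
#     print(count_requests(log))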

NSi3 - c) Should the indicator NSi3 be included in the dash board?

a) yes 6 38%

b) no 4 25%

c) Yes under certain conditions 5 31%

Other 1 6%


NSi3,1 - a) How is the NSi3,1 indicator (Use of discovery services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,1 - b) Please provide comments regarding the indicator NSi3,1
Spain: It is necessary to define the term "use"; for instance, is it the number of visits or the number of GetRecords requests?
Cyprus: Useful.
Italy: Comparability issues between data providers.
Finland: Useful information for following service use statistics, but hard to collect because it has to be collected manually from service providers.
France: No centralised information and too many public authorities; many servers have no statistics systems; the cost of getting this information would not be reasonable.
Germany: Most of the data providers can't provide this information, because it's not measured, so we assume "0" requests in such cases. Thus the value of the indicator is not reliable and doesn't say anything about the use of the services or the implementation status.
The Netherlands: Is an alternative possible; the use via the EU portal?
Sweden: The same comments as for the question above [A "receiving point" to which data providers can send log files is under development and is expected to be functional this spring].

NSi3,1 - c) Should the indicator NSi3,1 be included in the dash board?

a) yes 5 36%

b) no 5 36%

c) Yes under certain conditions 4 29%

Other 0 0%

NSi3,2 - a) How is the NSi3,2 indicator (Use of view services) collected?


a) automatically 1 7%

b) manually 13 93%

NSi3,2 - b) Please provide comments regarding the indicator NSi3,2
Spain: It is necessary to define the term "use"; for instance, is it the number of visits or the number of GetMap requests?
Cyprus: Useful.
Italy: Comparability issues between data providers.
Finland: Useful information for following service use statistics, but hard to collect because it has to be collected manually from service providers.
France: No centralised information and too many public authorities; many servers have no statistics systems; the cost of getting this information would not be reasonable.
Germany: Most of the data providers can't provide this information, because it's not measured, so we assume "0" requests in such cases. Thus the value of the indicator is not reliable and doesn't say anything about the use of the services or the implementation status.
The Netherlands: Is an alternative possible; the use via the EU portal?
Sweden: The same comments as for the question above [A "receiving point" to which data providers can send log files is under development and is expected to be functional this spring].
UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain it. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.


NSi3,2 - c) Should the indicator NSi3,2 be included in the dash board?

a) yes 5 33%

b) no 5 33%

c) Yes under certain conditions 4 27%

Other 1 7%

NSi3,3 - a) How is the NSi3,3 indicator (Use of download services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,3 - b) Please provide comments regarding the indicator NSi3,3
Spain: It is necessary to define the term "use"; for instance, is it the number of visits or the number of GetFeature requests?
Cyprus: Useful.
Italy: Comparability issues between data providers.
Finland: Useful information for following service use statistics, but hard to collect because it has to be collected manually from service providers.
France: No centralised information and too many public authorities; many servers have no statistics systems; the cost of getting this information would not be reasonable.
Germany: Most of the data providers can't provide this information, because it's not measured, so we assume "0" requests in such cases. Thus the value of the indicator is not reliable and doesn't say anything about the use of the services or the implementation status.
The Netherlands: Is an alternative possible; the use via the EU portal?
Sweden: The same comments as for the question above [A "receiving point" to which data providers can send log files is under development and is expected to be functional this spring].
UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain it. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

NSi3,3 - c) Should the indicator NSi3,3 be included in the dash board?

a) yes 5 33%

b) no 5 33%

c) Yes under certain conditions 4 27%

Other 1 7%

NSi3,4 - a) How is the NSi3,4 indicator (Use of transformation services) collected?

a) automatically 1 7%

b) manually 13 93%

NSi3,4 - b) Please provide comments regarding the indicator NSi3,4
Cyprus: Useful.
Italy: Comparability issues between data providers.
Finland: Useful information for following service use statistics, but hard to collect because it has to be collected manually from service providers.
France: No centralised information and too many public authorities; many servers have no statistics systems; the cost of getting this information would not be reasonable.
Germany: Most of the data providers can't provide this information, because it's not measured, so we assume "0" requests in such cases. Thus the value of the indicator is not reliable and doesn't say anything about the use of the services or the implementation status.
The Netherlands: Is an alternative possible; the use via the EU portal?
Sweden: Same as above, although so far there are no transformation services listed. What would be the measure - the number of transformations done?
UK: Due to the federated nature of our SDI we are unable to gather this information automatically and have to write to all data publishers to obtain it. This is a significant burden. We do not get a good response to this write-round, and the data submitted in the report is often incomplete and of poor quality. We would also question the value of the information in this form to the Commission.

NSi3,4 - c) Should the indicator NSi3,4 be included in the dash board?

a) yes 4 27%

b) no 6 40%

c) Yes under certain conditions 3 20%

Other 2 13%

NSi3,5 - a) How is the NSi3,5 indicator (Use of invoke services) collected?

a) automatically 1 7%

b) manually 13 93%


NSi3,5 - c) Should the indicator NSi3,5 be included in the dash board?

a) yes 5 33%

b) no 6 40%

c) Yes under certain conditions 3 20%

Other 1 7%

NSi4 - a) How is the NSi4 indicator (Conformity of all services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4 - b) Please provide comments regarding the indicator NSi4
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Belgium: Please see the earlier comments on common validation tools [monitoring information including validation results (Y/N or even more detailed?) if all MS use the same validation tools, which should be provided by the EC].
Finland: The EU Commission validator can't validate services that require authentication.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though.
UK: We'd anticipate that this information could be derived from the discovery metadata, although quality issues remain.

NSi4 - c) Should the indicator NSi4 be included in the dash board?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,1 - a) How is the NSi4,1 indicator (Conformity of network services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,1 - b) Please provide comments regarding the indicator NSi4,1
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Finland: The EU Commission validator can't validate services that require authentication.
France: No issue as long as we use only one validator. The problems come if French public authorities use other validators.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though].


NSi4,1 - c) Should the indicator NSi4,1 be included in the dash board?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,2 - a) How is the NSi4,2 indicator (Conformity of view services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,2 - b) Please provide comments regarding the indicator NSi4,2
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Finland: The EU Commission validator can't validate services that require authentication.
France: Idem [No issue as long as we use only one validator. The problems come if French public authorities use other validators].
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though].


NSi4,2 - c) Should the indicator NSi4,2 be included in the dash board?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,3 - a) How is the NSi4,3 indicator (Conformity of download services) collected?

a) automatically 3 20%

b) manually 12 80%

NSi4,3 - b) Please provide comments regarding the indicator NSi4,3
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Finland: The EU Commission validator can't validate services that require authentication.
France: Idem [No issue as long as we use only one validator. The problems come if French public authorities use other validators].
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: The same comments as for the question above [The data custodians fill this in when publishing metadata for a dataset or service, as required by the IR for metadata. A current problem is the three options according to the IR, whereby the option "not evaluated" isn't catered for by the ISO standard used for metadata. There is a workaround for this, though].


NSi4,3 - c) Should the indicator NSi4,3 be included in the dash board?

a) yes 13 87%

b) no 1 7%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,4 - a) How is the NSi4,4 indicator (Conformity of transformation services) collected?

a) automatically 3 21%

b) manually 11 79%

NSi4,4 - b) Please provide comments regarding the indicator NSi4,4
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Finland: The EU Commission validator can't validate transformation services.
France: In fact, we have no transformation service and I do not know if we have a validator.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: There are no such services listed, but if there were, they would be collected automatically.


NSi4,4 - c) Should the indicator NSi4,4 be included in the dash board?

a) yes 11 73%

b) no 3 20%

c) Yes under certain conditions 1 7%

Other 0 0%

NSi4,5 - a) How is the NSi4,5 indicator (Conformity of invoke services) collected?

a) automatically 2 14%

b) manually 12 86%

NSi4,5 - b) Please provide comments regarding the indicator NSi4,5
Spain: Automatic validation tools are necessary.
Cyprus: Useful.
Finland: The EU Commission validator can't validate invoke services.
France: Idem NSi4,4, with the added difficulty that there is no IR for invoke services.
Germany: Some of the German states collect the monitoring information automatically through metadata, others collect the information manually.
The Netherlands: We prefer one EU validation tool used by each MS; the monitoring of this indicator should be based on this.
Sweden: Currently there are no such services listed, and with the new amendment for SDS/invoke services this becomes tricky, as there is no yes or no answer to the question.


NSi4,5 - c) Should the indicator NSi4,5 be included in the dash board?

a) yes 11 73%

b) no 2 13%

c) Yes under certain conditions 2 13%

Other 0 0%


INSPIRE Reporting -------------------------

Rep 1) Should information on 'coordination and quality assurance' be included in the dash board?

a) yes 4 25%

b) no 9 56%

Other 3 19%

Italy: Not sure if it will help.
Greece: Not sure how this information might be collected. If there are indicators that can measure such information, then yes.
Slovak Republic: With guidance on what kind of information is expected.

Rep 2) Should information on 'contribution to the functioning and coordination of the infrastructure' be included in the dash board?

a) yes 5 31%

b) no 9 56%

Other 2 13%

Italy: Not sure if it will help.
Greece: Not sure how this information might be collected. If there are indicators that can measure such information, then yes.


Rep 3) Should information on 'use of the infrastructure for spatial information' be included in the dash board?

a) yes 6 38%

b) no 9 56%

Other 1 6%

Italy: Not sure if it will help

Rep 4) Should information on 'data sharing arrangements' be included in the dash board?

a) yes 4 29%

b) no 9 64%

Other 1 7%

Italy: Not sure if it will help

Rep 5) Should information on 'cost benefit aspects' be included in the dash board?

a) yes 6 40%

b) no 9 60%

Other 0 0%


Rep 6 - a) Do you collect any additional information during the monitoring process that is not mandated by the Decision on INSPIRE monitoring and reporting?

a) yes 6 38%

b) no 10 63%

Rep 6 - b) If "yes" on the above question, which additional information is collected?
Greece: The transposition law (No. 3882 of 2010) requires the collection of information on software, hardware and licences that relate in some way to spatial information. There has been an attempt to collect this information, but the fact that the collection was manual does not allow us today to use and update the information collected.
Belgium: For Flanders: the link between a dataset and its services (via metadata).
Finland: Information on dataset and service licensing and user rights.
Germany: Information about the administrative level (federal government, states, municipalities, ...) of the submitting organisation; the fileIdentifier of the metadata set; the service end point; the linkage between a data set and its view and download services; additional comments.
Slovak Republic: URLs for metadata and network services.
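
Germany and the Slovak Republic already record the metadata fileIdentifier, the service endpoints and the linkage between data sets and services. As a hedged illustration (not the reporting tool itself), the sketch below reads the fileIdentifier and the dataset-service linkage (srv:operatesOn) from an ISO 19139/19119 service metadata record; the file name is a placeholder.

# Minimal sketch: pull the fileIdentifier and the dataset-service linkage
# (srv:operatesOn xlink:href) out of an ISO 19139/19119 service metadata
# record - two of the extra items listed above.
from lxml import etree

NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
    "srv": "http://www.isotc211.org/2005/srv",
}
XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

def linkage_info(path):
    tree = etree.parse(path)
    file_id = tree.findtext("gmd:fileIdentifier/gco:CharacterString", namespaces=NS)
    coupled = [el.get(XLINK_HREF)
               for el in tree.findall(".//srv:operatesOn", NS)
               if el.get(XLINK_HREF)]
    return {"fileIdentifier": file_id, "operatesOn": coupled}

# Example (placeholder file name): linkage_info("service_metadata.xml")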


INSPIRE General Information -------------------------

Inf 1) Is the status of a) INSPIRE content and b) implementation presented to the eGovernment community?

Not foreseen 3 20%

Currently presented 8 53%

Foreseen 4 27%

Inf 2) About INSPIRE monitoring information, has communication with domain-specific networks been triggered?

Not foreseen 6 40%

Currently active 6 40%

Foreseen 3 20%


Inf 3) About INSPIRE monitoring information, are there investigation initiatives in synergy with Open Data ongoing?

Not foreseen yet 5 33%

Currently ongoing 7 47%

Foreseen 3 20%

Inf 4) About INSPIRE monitoring information, are there activities targeting the awareness raising and motivation of stakeholders in place?

Not foreseen yet 5 33%

Currently in place 8 53%

Foreseen 2 13%

Inf 5) About INSPIRE Monitoring information, is the support for governance processes in place?

Not foreseen yet 7 47%

Currently in place 5 33%

Foreseen 3 20%


Inf 6) Is feedback collected on INSPIRE coordinating processes?

Not foreseen yet 5 33%

Currently collected 8 53%

Foreseen 2 13%

Inf 7) Is feedback collected on INSPIRE implementation processes?

Not foreseen yet 5 33%

Currently collected 7 47%

Foreseen 3 20%


Recommended