
D4.1 Publishable Summary

Saving Energy in Europe‘s Public Buildings Using ICT

www.smartspaces.eu smartspaces@empirica.com

Introduction

Results of SMARTSPACES prototype testing

The SMARTSPACES testing methodology explained

SMARTSPACES Key Figures

The project started on 1 January 2012 and will last for three years. It will set up 11 pilot sites in 11 cities in 8 countries and is operated by 25 partners with an overall budget of almost 7 million euro.

The project is co-funded by the European Commission within the CIP ICT Policy Support Programme

(Grant agreement no. 297273).

This publishable summary presents the content produced for D4.1, ‘Results of SMARTSPACES Prototype testing’.

The deliverable introduces the SMARTSPACES Testing Methodology used to analyse the quality of the service by performing various tests. More specifically, it reports the results of testing the developed use cases (D1.2), reviews the fulfilment of the identified requirements for the SMARTSPACES services and systems (D1.1), and summarises the results of testing at pilot level.

The deliverable represents the first iteration of testing and will serve as the backbone for the second iteration (D4.2), which will not be constrained to usability testing but will cover further testing areas (functionality, interfaces, range, etc.).

[Figure: The SMARTSPACES testing methodology. The prototype is evaluated through use case testing (test cases) and requirements fulfilment, with input from staff, professionals and visitors.]

Results of SMARTSPACES prototype testing

The development programme for the SMARTSPACES service follows a Prototyping Model. This model presumes that the consortium does not have a complete set of requirements at the start of development, and that a feedback process with the end users is needed to develop the initial idea into specific objectives and then into a system that addresses those objectives.

As part of the Prototyping Model programme, each site has developed a simplified version of the system (in some cases as a stage in the actual software coding programme, in others as a “mock-up”) so that feedback can be gathered from the stakeholders at each site.

This development programme is based on the concept of Evolutionary Prototyping, a derivative of the Rapid Application Development approach outlined by James Martin. The idea is that through repeated evaluation the prototype evolves and, in doing so, produces a solution that meets the user requirements, which themselves become clearer throughout the development programme.

>> Fact Sheet



Use case testing

To evaluate the use cases defined in D1.2, test cases were developed for each use case. To assist the assessment, each focus group received a questionnaire outlining each test case within a common evaluation framework. The standardised questionnaire template ensured consistency in content across all pilot sites.
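As an illustration only, the following minimal sketch (in Python) shows one way questionnaire scores could be collected per test case and averaged per user category, as in the satisfaction summary further down. The record format, site names and figures are hypothetical assumptions, not the actual SMARTSPACES tooling or data.

from collections import defaultdict

# Hypothetical questionnaire entries: (pilot site, test case, user category, satisfaction score).
# Purely illustrative; the real questionnaires were standardised templates, not this format.
responses = [
    ("Bristol", "TC-01", "Staff", 7),
    ("Bristol", "TC-01", "Visitors", 6),
    ("Hagen",   "TC-02", "Staff", 6),
]

def average_satisfaction(entries):
    """Average the satisfaction score per user category across all sites and test cases."""
    scores_by_category = defaultdict(list)
    for _site, _test_case, category, score in entries:
        scores_by_category[category].append(score)
    return {cat: round(sum(s) / len(s), 1) for cat, s in scores_by_category.items()}

print(average_satisfaction(responses))   # {'Staff': 6.5, 'Visitors': 6.0}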

Requirements fulfilment - analysis results

Requirements review

Each pilot site was requested to complete a questionnaire on the fulfilment of requirements. The monitoring of implementation across the consortium is a challenging task due to the eleven different implementations of the SMARTSPACES service, but this method allows an assessment as to whether the “proof of concept” has been achieved across the consortium.

The evaluation of each requirement followed the same structure as the original requirements in D1.1: a structured requirements model with an assessment at each site of whether the requirement was applicable and of its priority. To evaluate progress, each site provided a fulfilment assessment of “YES”, “NO” or “PARTIAL”. Additionally, sites commented on challenges associated with meeting the requirements.
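To illustrate how per-requirement figures such as the number of sites where a requirement applies, its average priority and the number of sites fulfilling it can be derived from the completed questionnaires, the sketch below assumes one record per site and requirement. The data model and example values are hypothetical assumptions for illustration, not the project's actual implementation.

from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

# Hypothetical record for one site's answer on one requirement (illustrative only).
@dataclass
class RequirementResponse:
    req_id: str        # e.g. "R1.1.1"
    site: str          # e.g. "Bristol"
    applicable: bool   # is the requirement relevant at this site?
    priority: float    # priority assigned by the site (D1.1 scale)
    fulfilment: str    # "YES", "NO" or "PARTIAL"

def summarise(responses):
    """Aggregate per requirement: sites applicable, average priority, sites fulfilled."""
    by_req = defaultdict(list)
    for r in responses:
        if r.applicable:
            by_req[r.req_id].append(r)
    summary = {}
    for req_id, rows in by_req.items():
        summary[req_id] = {
            "sites_applicable": len(rows),
            "average_priority": round(mean(r.priority for r in rows), 1),
            "sites_fulfilled": sum(1 for r in rows if r.fulfilment == "YES"),
        }
    return summary

# Example with made-up data for two sites:
responses = [
    RequirementResponse("R1.1.1", "Bristol", True, 9.0, "YES"),
    RequirementResponse("R1.1.1", "Hagen", True, 8.0, "PARTIAL"),
]
print(summarise(responses))
# {'R1.1.1': {'sites_applicable': 2, 'average_priority': 8.5, 'sites_fulfilled': 1}}

On real data, “PARTIAL” answers could be tallied separately to distinguish partially met requirements from unmet ones.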

Use case testing - summary of results

Use case testing was carried out with three user groups across the 11 pilot sites (Belgrade, Birmingham*, Bristol, Hagen, Istanbul, Leicester, Lleida, Milan, Moulins, Murcia* and Venlo):

User Group       Total number of pilot sites tested
Professionals    7
Staff members    8
Visitors**       7

Average satisfaction scores by user category:

User Category       Average Satisfaction Score
Visitors            6.2
Staff               6.7
Professionals       7.1
Service Provider    7.3
BEMS                6.4

* The tender process in Birmingham and Murcia finished recently. Since then, requirements have been narrowed and staff members will have the opportunity to influence the detailed choice in the coming weeks. Istanbul did not participate in the completion of D4.1 as a consequence of the delayed start to the project.

** Testing with visitors was optional at this point.

The requirements across the consortium were assessed on the basis of their original priority (see D1.1). The analysis of all groups of requirements is documented in D4.1. A summary for group ‘R1.1 Output Format Results’ is presented here:

REQ ID     Description              No. of sites applicable    Average priority    Sites fulfilled
R1.1.1     Web-portal               11                         8.5                 6
R1.1.2     Web-portal - Dashboard   9                          8.2                 2
R1.1.3     Website                  7                          8.1                 3
R1.1.8     Email                    9                          8.8                 5
R1.1.14    Table format             10                         8.7                 7
R1.1.15    Database                 6                          9.2                 4

Coming Next >> D4.2 The SMARTSPACES Prototype System