

  • [Cover figure: PISA architecture overview - FIPA Agent Platform with Personal, Task, Advisor, Service, Log and Monitor Agents (1 agent/platform, 1 agent/user, multiple agents); Web Browsers; Certification Authority and Registration Authority (PKI); LDAP public directory; Data Mining; Onion Routing; PII; Signed XML over Java RMI; HTTPS/SOAP; actors: Data Subject, Controller, Processor, Auditor.]

    Handbook of Privacy and Privacy-Enhancing Technologies
    The case of Intelligent Software Agents

    Editors:
    G.W. van Blarkom
    J.J. Borking
    J.G.E. Olk

    Privacy Incorporated Software Agent

  • Handbook of Privacy and Privacy-Enhancing Technologies
    The case of Intelligent Software Agents

    PISA Consortium

    Editors:
    G.W. van Blarkom RE
    drs. J.J. Borking
    dr.ir. J.G.E. Olk

    Project coordinator:
    ir. J. Huizenga

  • Handbook of Privacy and Privacy-Enhancing Technologies - The case of Intelligent Software Agents
    The Hague, 2003

    All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of the College bescherming persoonsgegevens.

    ISBN 90 74087 33 7

    Print: TNO-FEL, P.O. Box 96864, 2509 JG, The Hague, The Netherlands
    Publisher: College bescherming persoonsgegevens, copyright © 2003
    P.O. Box 93374, 2509 AJ, The Hague, The Netherlands
    email: [email protected]

  • Preface

    Houston, we have a problem! That's what we thought when we saw the 1986 video clip of Apple Corp. about The Knowledge Navigator, issued at the introduction of its personal digital assistant the Newton in 1987. In this video clip, visualizing the work of a professor in 2012, his intelligent software agent Phil acts autonomously in order to accomplish tasks for his master in a complex network environment. The lack of supervision of intelligent software agents (ISAs) by their masters could lead to undesirable actions, such as violations of privacy. In January 1999 the Dutch Data Protection Authority (NDPA, College bescherming persoonsgegevens) together with TNO published a technology assessment study about intelligent software agents and privacy. ISAs can appear in the role of a benevolent digital butler, a malicious digital criminal, or a means for constant surveillance. It became quite clear that fundamental research was needed into the risks of intrusion into privacy and the solutions needed to minimize privacy violations by agents.

    An international research consortium was set up to investigate whether an ISA can be built that respects and protects the privacy of its users and that of other users while acting on the Internet. The consortium, with the acronym PISA (Privacy Incorporated Software Agent), started its work on 10 January 2001, researching for and subsequently building a privacy guardian for the electronic age, and not without success! A working job-acquiring agent has been designed and built as a demonstrator system showing how privacy can be safeguarded.

    The PISA research has been funded by the EU under its IST 5th Framework program, and this Handbook that is lying before you is one of the milestones of this project. The Handbook reports what results have been achieved and what needs to be done to create privacy-safe agents. It shows what architecture has been developed by the consortium for information and collaboration systems for telecommunications, information technology and media, what EU privacy law rules apply to ISAs, and what Privacy-Enhancing Technologies (PETs) are needed to create a Privacy Incorporated Software Agent (PISA). In particular, the fundamental research on privacy ontologies in this project, translating privacy law into object code for automatic execution of the law while processing personal data, will be continued in the new EU research project PRIME (Privacy and Identity Management), which will start in December 2003. Without the research grant of the EU under the IST fifth framework program this timely research would never have happened. The researchers are most grateful for this.

    The consortium's acknowledgements go to Mr. Andrea Servida, project officer of the European Commission, and to the reviewers Prof. Pierangela Samarati (UniMilan), Dr. Alberto Escudero-Pascual (UniStockholm) and Helena Lindskog (Ericsson) for their encouragement and positive remarks during the review meetings of the PISA project, without whom this project would never have yielded these important results for the protection of the privacy of EU and world citizens.

    We would also like to thank all participants in this project for the extra miles that were necessary to achieve the results of this project, and our PISA friends, especially Unisys Netherlands, whose support was crucial for the start of our work.

    We hope that this Handbook will be of assistance to software agent developers, as well as to those who will order such intelligent systems, the users of these agents and the Privacy Commissioners worldwide.

    The Hague, 28 November 2003

    Jan Huizenga                     John Borking
    PISA coordinator, TNO            PISA dissemination, NDPA

  • Contents

    1 Introduction
    1.1 Background
    1.2 The case of agents
    1.2.1 Intelligent Software Agents
    1.2.2 Privacy and Software Agents
    1.2.3 Privacy-Enhancing Technologies (PET)
    1.3 Objectives of the handbook
    1.4 About the Handbook
    1.5 Outline of the Handbook
    1.6 Reading the handbook
    1.7 Acknowledgements

    2 Privacy
    2.1 Introduction to the Issue
    2.2 General Description of Privacy
    2.3 Definitions
    2.4 Data Subject and Personal Data
    2.4.1 Identified or Identifiable
    2.4.2 Special Categories of Data
    2.4.3 Automated Individual Decisions
    2.5 Controller and Processor
    2.6 Security
    2.6.1 Legal Framework
    2.6.2 The Security of Personal Data
    2.7 Privacy Preferences and Policies
    2.8 Legal Interception
    2.9 Nine Condensed Rules from the EU Viewpoint on Privacy
    2.10 The Privacy Threat Analysis
    2.10.1 General Approach Risk Analysis
    2.10.2 Five-pronged Approach to the Privacy Threat Analysis
    2.10.3 Definition of the Primary Threats
    2.10.4 Purpose of the System to be Devised
    2.10.5 Description of the Solution to be Adopted
    2.10.6 Conclusion
    2.11 Product Liability

    3 PET
    3.1 Privacy-Enhancing Technologies (PET)
    3.1.1 Definition
    3.1.2 History
    3.1.3 Legal grounds for PET
    3.1.4 Original PET Concept Pseudo-identity
    3.1.5 Further PET Developments
    3.2 Common Criteria for Information Technology Security Evaluation
    3.2.1 Anonymity (FPR ANO)
    3.2.2 Pseudonymity (FPR PSE)
    3.2.3 Unlinkability (FPR UNL)
    3.2.4 Unobservability (FPR UNO)
    3.3 Privacy by Design: Privacy act according to the directive built-in code
    3.4 PET Projects
    3.4.1 State-of-the-art PET
    3.4.2 Description of NBIS
    3.4.3 LADIS Project

    4 The Development of Agent Technology
    4.1 Agent Technology
    4.1.1 Background
    4.1.2 Reasons for Software Agents to Exist
    4.1.3 Definition of Agents
    4.1.4 Agent Ownership
    4.1.5 Interaction between Users and Agents
    4.1.6 Classification of Agents
    4.1.7 Intelligence and Agency
    4.1.8 Examples of Agents
    4.1.9 A General Agent Model
    4.1.10 Control Theory
    4.1.11 Cognitive Psychology
    4.1.12 Classical Artificial Intelligence Planning Systems
    4.1.13 The Agent Model
    4.1.14 The Communication Model
    4.1.15 The Future of Software Agents
    4.2 Software Agents and PET
    4.2.1 PET that Manage the Identified Threats
    4.2.2 PET Placed in the Generic Agent Model
    4.2.3 Alternative Use of Privacy-Enhancing Technologies
    4.2.4 Consequences of Using PET
    4.2.5 The Supply of PETs for the Consumer Market
    4.2.6 PET Design Criteria for Agents

    5 Providing privacy to agents in an untrustworthy environment
    5.1 Introduction
    5.1.1 Related work
    5.1.2 Chapter outline
    5.2 System model
    5.2.1 System model and assumptions
    5.2.2 Threats
    5.3 Proposed solutions
    5.3.1 Confidentiality
    5.3.2 Integrity
    5.4 One step further: theoretical boundaries
    5.4.1 Shannon's secrecy model
    5.4.2 Mobile agent secrecy and privacy model
    5.4.3 Evaluation
    5.5 Conclusions and future work

    6 Public Key Infrastructure
    6.1 Overview of architecture
    6.1.1 Architectural features required for PKI
    6.1.2 Overall Trust hierarchy
    6.1.3 Description of the PKI
    6.1.4 Design Principles
    6.2 Functional descriptions of PKI
    6.2.1 Certificate request
    6.2.2 Certificate issuance
    6.2.3 Certificate storage
    6.2.4 Certificate revocation
    6.2.5 Certificate renewal
    6.2.6 Verification
    6.2.7 Authentication
    6.2.8 Confidentiality
    6.2.9 Non-repudiation
    6.2.10 RootSign function

    7 Evaluation and Auditing
    7.1 Privacy Audit Framework
    7.1.1 Introduction
    7.1.2 Target Group and Application
    7.1.3 Privacy Audit Framework Introduction
    7.1.4 Legal Framework for Privacy Protection
    7.1.5 Legal Requirements for the Processing of Personal Data
    7.2 Common Criteria (PETTEP)

    8 Privacy architecture for agents
    8.1 Introduction
    8.2 Privacy Related Issues
    8.2.1 Consent
    8.2.2 Processing Personal Data
    8.2.3 Processing Personal Data on Behalf of a Controller
    8.2.4 ISA and Privacy Related Issues
    8.2.5 Privacy Threats for Intelligent Software Agents
    8.3 Privacy Incorporated Software Agent (PISA)
    8.4 Personal data
    8.5 Structure of PISA
    8.6 Privacy Supporting Measures
    8.6.1 Anonymity and pseudo-identities
    8.6.2 ISA certificates
    8.6.3 Agent Practises Statement (APS)

    9 Trust model and network aspects
    9.1 Trust model: Agents and Trust
    9.1.1 What is Trust?
    9.1.2 The Problem of Trusting Agents: Interactions Twice-Removed
    9.1.3 Building Successful Agents: A Summary Model
    9.1.4 Factors Contributing to Trust
    9.1.5 Factors Contributing to Perceived Risk
    9.2 Network aspects
    9.2.1 Scalability
    9.2.2 Traffic analysis
    9.2.3 Anonymous communication
    9.3 Publications
    9.3.1 Journal Papers
    9.3.2 Book/Chapters
    9.3.3 Conference proceedings
    9.3.4 Talks
    9.3.5 Reports
    9.3.6 Papers in Preparation

    10 Design method
    10.1 Prevention or Minimisation
    10.1.1 Anonymity
    10.1.2 Pseudonymity
    10.1.3 Unlinkability
    10.1.4 Unobservability
    10.2 Privacy Regulations
    10.2.1 Consent, Privacy Policies and Preferences
    10.2.2 Categories of Personal Data
    10.2.3 Preferences the User Could Provide to the (P)ISA
    10.2.4 Ontologies and System Design
    10.2.5 Privacy Knowledge Engineering and DEPRM
    10.2.6 Negotiation
    10.2.7 Processing Personal Data
    10.2.8 Auditing Requirements

    11 Data Mining
    11.1 Introduction
    11.2 Privacy
    11.2.1 What is privacy?
    11.2.2 Articulating privacy
    11.2.3 The Implications of Data Mining in the Context of Fair Information Practices
    11.2.4 Threats
    11.2.5 Summary
    11.3 Data mining
    11.3.1 What is data mining?
    11.3.2 Mining data or growing knowledge?
    11.3.3 Alternative metaphor: cultivating knowledge by feeding models with data
    11.3.4 The issues of location and data recognisability
    11.3.5 The characteristics of web mining
    11.3.6 Data mining: which organisations and people will use it
    11.3.7 The data mining process
    11.3.8 Purposes of data mining
    11.3.9 Observations on data mining practices
    11.3.10 Data mining and ISAT
    11.3.11 Conclusion
    11.4 Data Accumulation and Information Increase
    11.4.1 Data Accumulation and Information Increase
    11.4.2 The Increase of Registrations - the drawbacks
    11.4.3 Generating new information
    11.4.4 Privacy issues
    11.4.5 Data sharing
    11.4.6 Conclusion
    11.5 Data mining privacy threats
    11.5.1 The three main classes of privacy threats related to data mining
    11.5.2 Data mining, the Internet and privacy
    11.5.3 Web mining privacy threats
    11.5.4 Case of Privacy Threat: The FBI Wiretap discussion
    11.5.5 Case: The 2001 situation concerning identity theft in the US
    11.5.6 Conclusions
    11.6 Data Mining in defence of privacy
    11.6.1 Detection of privacy infringements
    11.6.2 Detection of potential privacy infringements through web mining
    11.6.3 Web Mining with ParaBots
    11.6.4 Collaborative monitoring of privacy policies and practices
    11.6.5 Conclusion
    11.7 Privacy Incorporated Data Mining
    11.7.1 Mining anonymous data
    11.7.2 Unlinking identity and behaviour data
    11.7.3 Virtual Panels
    11.7.4 Conclusion
    11.8 Conclusions

    12 Human Computer Interaction
    12.1 From Privacy Legislation to Interface Design
    12.2 Testing of User Interfaces for a Privacy Agent
    12.2.1 Introduction
    12.2.2 Preparing the Prototype for Testing
    12.2.3 Experimental Method
    12.2.4 Results
    12.2.5 Recommendations
    12.2.6 Conclusions

    13 Conclusions and future outlook

    Bibliography

    A Applying the handbook: The Job Market Case
    A.1 Case description: Ideal situation
    A.2 PISA Demonstrator case: the Job Market Case
    A.3 Scenario
    A.4 Privacy relations
    A.4.1 Assumptions
    A.4.2 Elaboration of the privacy and security solutions
    A.4.3 ApplicantAgent needs knowledge of privacy (/privacy level) and rules to decide which mechanisms to use about interaction with other entities in the environment
    A.4.4 Protection mechanisms for personal data (encryption)
    A.4.5 Authentication mechanism
    A.4.6 Integrity mechanism for ISA
    A.4.7 Exchange of privacy policy
    A.4.8 Privacy-Enhancing Technologies
    A.4.9 Unobservability needs to be provided
    A.5 Conclusions
    A.6 Final Decisions for development of PISA Demonstrator 24 July 2002

    B Aspects of the PISA demonstrator
    B.1 The demonstrator scenario
    B.2 The PisaAgent, PersonalAgent and TaskAgent as base classes
    B.3 The ProxyAgents and ServiceAgents in the demonstrator
    B.4 The system components of the demonstrator and the PISA PKI
    B.5 The privacy ontology

    C Overview of threats per actor

    D Overview of available PISA deliverables
    D.1 WP1.1 D1
    D.2 WP2.1 D7
    D.3 WP3.1 D10
    D.4 WP4.1 D16
    D.5 WP5.1 D21
    D.6 WP1.2 D2
    D.7 WP3.2 D11
    D.8 WP4.2 D17
    D.9 PISA System Architecture (PSA) and Datamining
    D.10 WP1.3 D3
    D.11 WP2.2 D8A
    D.12 WP3.3 D12
    D.13 WP4.3 D18
    D.14 WP5.2 D22
    D.15 WP1.4 D4
    D.16 WP3.4 D13a
    D.17 WP3.4 D13b
    D.18 WP4.4 D19
    D.19 WP5.3 D23
    D.20 WP1.5 D5
    D.21 WP3.5 D14
    D.22 WP5.4 D24-1
    D.23 WP5.4 D24-2

    E PISA project information
    E.1 Contribution
    E.2 Goal
    E.3 Results

    F PISA Project consortium

  • List of Figures

    2.1 Model Risk Assessment.
    2.2 Five-pronged Approach to Privacy Threat Analysis.
    2.3 Risk Analysis Branch 1.
    2.4 Risk Analysis Branch 2.
    3.1 The identity protector.
    3.2 TOE development model.
    3.3 Privacy classes and subclasses.
    3.4 Component levelling of anonymity.
    3.5 Component levelling of pseudonymity.
    3.6 Component levelling of unlinkability.
    3.7 Component levelling of unobservability.
    4.1 The position of intelligent agents in relation to intelligence and agency.
    4.2 Black box model of an agent.
    4.3 The layered design of an agent that is deliberative, reactive and cooperating (cf. [Mul96]).
    4.4 The agent model - a conceptual model of an agent that combines deliberative, reactive, and cooperating properties and attributes (cf. [Mul96]).
    4.5 The Identity Protector (IP) placed in an agent-based environment: (a) the IP placed between the user and the agent; (b) the IP placed between the agent and the external environment.
    4.6 PET wrapped around the agent.
    4.7 PET integrated in the agent.
    4.8 PET used to create trusted components.
    4.9 Aspects to take into account during the different phases of the design process of a privacy protecting agent.
    5.1 System model.
    5.2 Problem with confidentiality.
    5.3 Conventional mechanism to provide confidentiality for data storage and access.
    5.4 Proposal of new confidentiality mechanism for storage and data access.
    5.5 Two possibilities for the E-E-D mechanism.
    5.6 Digital blind signature.
    5.7 Shannon's secrecy model.
    5.8 Model for agent environment.
    6.1 Interaction PKI-Agent platform.
    6.2 PKI hierarchical architecture.
    6.3 Global agent Trust hierarchy with GlobalSign's RootSign program.
    6.4 PKI Architecture.
    6.5 Non-repudiation.
    7.1 Mutual relation between the three developed products and the Security of Personal Data study [vBB01].
    7.2 The Three Phases of the Management Cycle.
    8.1 The ISA and its environment.
    8.2 ISA and its environment: the privacy issues.
    8.3 PISA Model in its environment.
    8.4 PISA architecture.
    8.5 Overview of PISA components.
    9.1 Explanation of twice-removed transactions.
    9.2 Lee, Kim, & Moon's model of e-commerce loyalty.
    9.3 A model of agent success.
    10.1 Graphical representation of schemes for representing inconsistencies.
    10.2 High-level view of ideal ontology architecture.
    10.3 Privacy Ontology Concepts (Statements).
    10.4 Schematic representation of the AuditAgent and the audit trails within a Multiagent System.
    11.1 Three factors contributing to privacy threats.
    11.2 Data Fusion Concept.
    11.3 Parabot system architecture.
    12.1 Schematic representation of our approach.
    12.2 A door with poor affordances.
    12.3 An example of a Just-In-Time Click-Through Agreement (JITCTA).
    12.4 Configuration of the remote usability tests.
    12.5 The two-column layout of the testing interface.
    12.6 Frequency of ease of use ratings (1 = very difficult, 4 = neither easy nor difficult, 7 = very easy).
    12.7 Greeting screen of the PISA prototype.
    12.8 The registration interface screen.
    12.9 Interface screen to acknowledge registration.
    12.10 The main interface screen of the prototype, showing the major functions.
    12.11 The interface screen used to create a new task.
    12.12 The interface for setting job search parameters and privacy preferences.
    12.13 Example of rollover help.
    12.14 The interface screen for entering contact information.
    12.15 The interface for collecting resume information.
    12.16 A Just-In-Time Click-Through Agreement (JITCTA).
    12.17 An interface screen to confirm the information that was entered.
    12.18 The interface screen used to launch the search task.
    12.19 The final interface screen, confirming launch of the task (agent).
    A.1 The job market case.
    B.1 Users of the PISA demonstrator, their agents and the data flow between the agents.
    B.2 PisaAgent class.
    B.3 Subclasses of the ProxyAgent in the demonstrator.
    B.4 All ServiceAgents.
    B.5 System components of the demonstrator.
    B.6 The PISA PKI.
    B.7 The concepts of the privacy ontology.

  • List of Tables

    1.1 Overview of the chapters in the handbook of specific interest to the various types of readers.
    2.1 Risk Classification of Personal Data.
    2.2 Overview of contributing purposeful factors.
    2.3 Overview of solution-oriented factors.
    3.1 PET Principles Versus CC Class.
    5.1 Comparison between Shannon's model and the mobile agents model.
    7.1 Risk Classification of Personal Data.
    10.1 General consent model.
    10.2 Data subjects' rights.
    10.3 Transfer outside the EU.
    10.4 Summary of privacy supporting measures.
    11.1 Seller/Consumer Conflict of Interest at Stages of the Purchasing Process.
    11.2 Problems found while testing the search engines.
    12.1 Privacy Roles Defined by The Directive.
    12.2 High-Level Summary of Privacy Principles.
    12.3 Privacy Principles (1.x), HCI Requirements, and Design Solutions.
    12.4 Privacy Principles (2.x), HCI Requirements, and Design Solutions.
    12.5 Privacy Principles (3.x), HCI Requirements, and Design Solutions.
    12.6 Privacy Principles (4.x), HCI Requirements, and Design Solutions.
    12.7 Guidelines for Creating Click-Through Agreements.
    12.8 Notable HCI Design Features in the Interface Prototype.
    12.9 Percentage of Users Who Understand each Function.
    12.10 Summary of Notable Results.
    12.11 Recommendations for Improving the PISA Interface.
    A.1 Description of the scenario.
    C.1 Overview of threats per actor.

  • Chapter 1

    Introduction

    This introductory chapter deals with the concepts of privacy, Privacy-Enhancing Technologies (PET) and intelligent software agents, and the relations between these subjects. Furthermore, this chapter provides a roadmap of this handbook: what the book is about, how to use it, its objectives and an outline of its contents. Do not try to read this book in one sitting: there is no need to. Instead, use this book as a dictionary or an encyclopedia. The subjects discussed in this book are meant to be of assistance to all those who are involved in the construction or use of intelligent software agents and who are concerned about the privacy requirements.

    1.1 Background

    Privacy is defined by Westin [Wes67] as:

    the claim of individuals...to determine for themselves when, how and to what extent information about them is communicated to others.

    Privacy is a fundamental human right, defined in Article 8 of the 1950 European Convention of Human Rights. It is one of the most important human rights issues of our evolving information age.

    Technological developments have led to more and more automatic processing of information and storage of such information in computer databases. Part of this information is personal information. Individuals valuing their privacy are concerned about the fact that so much personal information is routinely stored in computer databases over which they have no control.

    In a number of countries in the world, notably the democratic ones, the informational privacy of people is acknowledged and laid down in treaties, EU directives, national laws and other (non-binding) regulations, for example the OECD guidelines, the Council of Europe, EU Directives 95/46, 99/93 and 2002/58, and Canadian law. These regulations give the background of informational privacy and the associated rights, restrictions and obligations of the parties during the exchange and storage of personal data1. Especially the increasing automatic processing of personal data via software systems, databases, electronic communication, etc. has made the protection of privacy a pressing issue. Unfortunately, despite all the regulations that are put on paper, there is no well-established and worldwide accepted view on the way privacy protection and the consolidation thereof can be built into software. So, because software products are almost by definition international, there is a need to develop such a view in the international arena, to assure that all privacy-respecting countries follow good practice.

    1 See Article 2 of the EU Directive for the definition.

    On the one hand this necessitates the introduction of a methodology for privacy consolidation, including privacy threat analysis; on the other hand this will necessitate a classification of software products dealing with personal data, to ensure that no lightweight privacy products are used for heavyweight personal data. Such a methodology, which follows software development through every single stage, is mandatory; like software quality and other similar aspects such as reliability, security or system safety, it is no paint that can be added after everything else has been dealt with. To put it straightforwardly: postponement of the handling of personal data implications until a later phase may easily lead to an information system that is adversarial to privacy adaptations: some measures may have been necessary very early on in system development, before much of it is cast in the concrete of its structure. See [HB98] for conceptual models for several types of information systems ensuring the prevention of privacy violations.

    1.2 The case of agents

    The tremendous growth of the Internet has made the Internet an information infrastructure for virtually every subject and application domain. For instance, e-commerce and e-government are becoming part of citizens' everyday life. Unfortunately, the enormous and fast-growing amount of data available has led to an information overload. For users it is becoming more and more difficult to find the information they need. The information might be out there, but finding it simply takes too much effort and time. Also, information might be available that could be of interest to a user without the user actually specifically searching for this information.

    One way of dealing with the large amounts of available information is to alter the way a user tries to cope with the information. Instead of having a user sit behind a computer and directly navigate through the information, the user could have the computer do this autonomously through so-called (intelligent) software agents. A software agent is basically a piece of software that can execute tasks with minimal interference from its user. A software agent could run on the computer of its user but could also move itself around on the Internet or some other network infrastructure.

    1.2.1 Intelligent Software Agents

    There is no general agreement on a definition of the word 'agent', just as there is no consensus within the artificial intelligence community on a definition of the term 'artificial intelligence'. In general, one can define an agent as a piece of software and/or hardware capable of acting in order to accomplish a task on behalf of its user.

    A definition close to present-day reality is that of Ted Selker from the IBM Almaden Research Center:

    An agent is a software thing that knows how to do things that you could probably do yourself if you had the time.

    Agents come in many different flavours. Depending on their intended use, agents are referred to by an enormous variety of names, e.g., knowbot, softbot, taskbot, userbot, robot, personal (digital) assistant, transport agent, mobile agent, cyber agent, search agent, report agent, presentation agent, navigation agent, role agent, management agent, search and retrieval agent, domain-specific agent, packaging agent. The word 'agent' is an umbrella term that covers a wide range of specific agent types. The most popular names used for different agents are highly non-descriptive. It is therefore preferable to describe and classify agents according to the specific properties they exhibit.

    Intelligent software agents are software agents that, while executing their assigned task, are capable of reasoning and actually learning during the execution of the task. They thus seem to behave and act in an intelligent manner. For example, an agent with the task of keeping track of the latest movies can learn what movies the user likes, based on, for instance, the actors featured in the movies or the type of movie (action, romantic, etc.). Over time the agent will learn what its user likes and dislikes. The agent might then actually buy tickets (electronically) for a movie with the user's favourite actor when it is released.
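    To make this notion concrete, the following minimal Java sketch (all interface and class names are invented for illustration and are not part of any agent platform) shows an agent that executes a delegated task autonomously and learns the user's preferences from feedback:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Hypothetical illustration of an intelligent software agent: it executes a
 *  delegated task autonomously and refines its behaviour from user feedback. */
interface IntelligentSoftwareAgent<T, R> {
    R execute(T task);                          // carry out the task with minimal user interference
    void learn(R result, double satisfaction);  // adjust preferences from the user's reaction
}

/** Toy movie-tracking agent: remembers how much the user liked past suggestions. */
class MovieAgent implements IntelligentSoftwareAgent<String, List<String>> {
    private final Map<String, Double> preferenceScores = new HashMap<>();

    @Override
    public List<String> execute(String favouriteActor) {
        // A real agent would search movie listings on the network on the user's behalf.
        return List.of("New release starring " + favouriteActor);
    }

    @Override
    public void learn(List<String> suggestions, double satisfaction) {
        // Reinforce (or weaken) the preference profile according to the user's rating.
        suggestions.forEach(s -> preferenceScores.merge(s, satisfaction, Double::sum));
    }
}
```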

    1.2.2 Privacy and Software Agents

    While executing its task, an agent can collect, process, store and distribute data. Some of these data could be personal data about individuals. For instance, to delegate tasks to an agent, a user needs to provide the agent with a user profile containing personal data about the user, such as mail addresses, habits and preferences. Part of these personal data might be privacy-sensitive or might become privacy-sensitive when the agent processes the personal data or combines it with other data. As long as the agent does not process the personal data, the privacy of the individuals involved will not be violated. A potential violation of privacy might occur when the agent communicates or exchanges (sensitive) personal data with its environment. Because agents are both collectors and processors of (personal) data, and therefore form a possible threat to the privacy of those involved, they need to meet the requirements specified in (national and international) regulations to ensure privacy. Various governments and international governmental organisations have drawn up privacy regulations and privacy guidelines.

    1.2.3 Privacy-Enhancing Technologies (PET)

    In order to help agents to protect internally stored personal data (which might be privacy-sensitive), Privacy-Enhancing Technologies (PET) have been developed over the years. PET stands for a coherent system of ICT measures that protects privacy by eliminating or reducing personal data or by preventing unnecessary and/or undesired processing of personal data, all without losing the functionality of the information system. PETs try to manage the privacy threats that software agents face. Ultimately, PETs might replace inspections for enforcing the privacy regulations: enforcement by means of inspections or audits, to verify whether all organisations that collect personal data are complying with the privacy regulations, is rather time-consuming and thus expensive. Goldberg [GWB97] gives an overview of existing and potential Privacy-Enhancing Technologies for the Internet.

    PETs help to tackle the two main types of privacy threats that are posed by the use of ISATs: threats caused by agents acting on behalf of a user (through the disclosure of the user's personal information) and threats caused by foreign agents that act on behalf of others (via traffic flow monitoring, data mining, and even covert attempts to obtain personal information directly from the user's agent).


    There are four ways of using PET to protect an individual in an agent-based environment:

    1. wrapping PET around the individual's agent;

    2. integration of PET in the individual's agent;

    3. combining wrapping and integration of PET;

    4. using PET to create an infrastructure of trusted components.

    Unfortunately, the application of PET to ISAs in order to meet privacy regulations is not straightforward. Yet, this handbook illuminates the use of PET in software agents.
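    As an illustration of options 1 and 2 above, and not as a description of the PISA architecture, the following Java sketch (all names hypothetical) shows PET 'wrapped around' an agent as a filter that applies data minimisation before any disclosure:

```java
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

/** Any agent that can be asked to disclose (part of) its user's personal data. */
interface DisclosingAgent {
    Map<String, String> disclose(String purpose);
}

/** Option 1: PET wrapped around an existing agent. The wrapper releases only
 *  the data items that are allowed for the stated purpose (data minimisation). */
class PetWrapper implements DisclosingAgent {
    private final DisclosingAgent wrappedAgent;
    private final Map<String, Set<String>> allowedItemsPerPurpose;

    PetWrapper(DisclosingAgent wrappedAgent, Map<String, Set<String>> allowedItemsPerPurpose) {
        this.wrappedAgent = wrappedAgent;
        this.allowedItemsPerPurpose = allowedItemsPerPurpose;
    }

    @Override
    public Map<String, String> disclose(String purpose) {
        Set<String> allowed = allowedItemsPerPurpose.getOrDefault(purpose, Set.of());
        return wrappedAgent.disclose(purpose).entrySet().stream()
                .filter(entry -> allowed.contains(entry.getKey()))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }
}
```

    Integrating PET in the agent (option 2) would move this filtering into the agent's own disclosure logic, while option 4 would place comparable checks in trusted infrastructure components instead.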

    1.3 Objectives of the handbook

    The objectives of the handbook on Privacy-Enhancing Technologies (PET) and Intelligent Software Agents (ISAs) are:

    to describe the privacy-related issues (privacy preferences, privacy threats and possible solutions) of the use of software agents in general and to illuminate these aspects by describing several cases in detail;

    to describe a solution (Privacy Ontology and privacy rules and policies) for a software agent such that it can act according to the EC Directive;

    to describe a method for Privacy by Design;

    to describe technical solutions for privacy protection for networks, human computer interfaces, public key infrastructures, cryptography and data mining and matching;

    to describe other initiatives and standards for privacy protection and to discuss opportunities and areas of new research;

    to act as a guideline for designers of software agents to meet privacy regulations;

    to provide customers with information that allows them to draw up privacy-aware system specifications involving software agents.

    1.4 About the Handbook

    The handbook was established within the PISA (Privacy Incorporated Software Agent) project supported by the European Union (project RTD IST-2000-26038); see Appendices E and F. This handbook presents the results of the joint study by the Dutch Data Protection Authority, TNO, Delft University of Technology, Sentient Machine Research, FINSA Consulting, National Research Council Canada and GlobalSign, as partners in the PISA project. With this study the partners have attempted to identify possible threats to the privacy of individuals resulting from the use of agent technology. Secondly, the study sought to identify and demonstrate ways of applying Privacy-Enhancing Technologies (PET) to agent technology in such a way as to eliminate the impact of these threats.


    1.5 Outline of the Handbook

    The handbook starts in Chapter 2 by generally describing and defining privacy and discussing the aspects involved with privacy, such as identity and personal data, legal interception, privacy preferences and policies, privacy threat analysis and privacy legislation. Chapter 3 discusses Privacy-Enhancing Technologies in detail (definition, history, legal grounds, concept, and developments) and compares the Common Criteria used for information technology security evaluation with the principles of PET. PET is further illuminated by a description of several PET projects. Chapter 4 describes the development of agent technology and the link between software agents and PET. Chapter 5 discusses the security and privacy protection aspects. An infrastructure for providing security, PKI, is described in Chapter 6. Chapter 7 discusses evaluation criteria to determine whether a software agent meets privacy regulations. In Chapter 8 a general architecture model is established for discussing privacy-related issues of software agents; it is shown how measures to protect privacy can be incorporated. Other important architecture issues, such as a trust model and network aspects, are discussed in Chapter 9. A design method that keeps privacy aspects in mind is presented in Chapter 10. The technique of data mining is illuminated in Chapter 11, both as a concern for personal privacy and as an opportunity for privacy protection. Chapter 12 focuses on Human Computer Interaction. Finally, Chapter 13 concludes the handbook and gives a view on the future developments with respect to privacy and software agents that can be expected.

    1.6 Reading the handbook

    This handbook is targeted at designers and developers of Intelligent Software Agents (ISAs) as well as customers and people responsible for the use of ISAs. This handbook gives designers starting points to integrate privacy in a structured way into information and communication systems, specifically ISAs. Additionally, the handbook provides customers with the information required for drawing up system specifications and negotiating with designers and developers. The handbook tries to answer the following questions:

    What is privacy?

    What are software agents?

    What threats to privacy can be attributed to software agents?

    What (technical) measures are there to eliminate or reduce the impact of these threats?

    How to design a privacy incorporated system?

    How to audit and evaluate a privacy incorporated system?

    To help the reader, guidelines are given for reading the handbook based on the reader's background and interests. The following types of readers are identified:

    Developer. A developer of agent technology applications.

    Manager. The person responsible for checking the design documents.

    User. The user of agent technology applications.

    Decision maker. The person who decides to build a Multiagent system (MAS) and who is aware of all aspects, technical and legal, of such an environment.

    Table 1.1 gives an overview of the chapters of the handbook of most interest to each different type of reader.

    Table 1.1: Overview of the chapters in the handbook of specific interest to the various types of readers.

    [Table 1.1 matrix: reader types (developer, manager, user, decision maker) against handbook chapters 2-13.]

    1.7 Acknowledgements

    This handbook is the result of the collaboration of all partners of the PISA consortium. The following people contributed to this handbook (in no particular order): J.J. Borking (CBP), P. Verhaar (TNO-FEL), B.M.A. van Eck (CBP), P. Siepel (CBP), G.W. van Blarkom (CBP), R. Coolen (TNO-FEL), M. den Uyl (Sentient Machine Research), J. Holleman (Sentient Machine Research), P. Bison (Sentient Machine Research), R. van der Veer (Sentient Machine Research), J. Giezen (TNO-TPD), A.S. Patrick (NRC), C. Holmes (Carleton University), J.C.A. van der Lubbe (TUD), R. Lachman (TNO-FEL), S. Kenny (CBP), R. Song (NRC), K. Cartrysse (TUD), J. Huizenga (TNO-FEL), A. Youssouf (GlobalSign), L. Korba (NRC), J-M. Kabasele (GlobalSign), M. van Breukelen (TNO-TPD), A.P. Meyer (TNO-TPD), A.K. Ahluwalia (TNO-TPD), J.A. Hamers (TNO-TPD), J.G.E. Olk (TNO-FEL), A. Ricchi (FINSA), G. Manzo (FINSA), and X. Zhou (NRC).

  • Chapter 2

    Privacy

    G.W. van Blarkom, J.J. Borking
    [email protected], [email protected]
    CBP, The Netherlands

    J. [email protected]
    TNO-TPD, The Netherlands

    R. Coolen, P. Verhaar
    [email protected], [email protected]
    TNO-FEL, The Netherlands

    This chapter explains the concepts of privacy and data protection, the European Directives that rule the protection of personal data and the relevant definitions. It focuses on the question of when personal data items become non-identifiable, the sensitivity of data, automated decisions, privacy preferences and policies, and the nine condensed privacy rules applicable to intelligent software agents. The reader shall also find in this chapter the legal security requirements and the method of assessing the privacy threats as a tool for privacy risk analysis. The development of the privacy threat analysis has been based on the Code of Practice for Risk Analysis and Management Method, Information Security Handbook of the Central Computer and Telecommunications Agency (CCTA).

    2.1 Introduction to the Issue

    In a number of countries in the world, notably the democratic ones, information privacy related to people is acknowledged and laid down in treaties, EU Directives, national laws and other (non-binding) regulations1. These regulations give the background to informational privacy and the associated rights, restrictions and obligations of the parties during the exchange and storage of personal data2. Personal data, however, shall not only be stored and read by the human eye. This use has privacy implications in itself. The advent of automatic processing via software systems, databases, electronic communication, etc., has also made the protection of privacy a pressing issue. Unfortunately, despite all the regulations that are put on paper, there is no well-established and worldwide accepted view on the way privacy protection and the consolidation thereof can be built into software. There is, therefore, a need to develop such a view in the international arena to ensure that all privacy-respecting countries follow good practices, because software products are almost by definition international.

    1 For example: OECD guidelines, the Council of Europe, EU Directives 95/46, 99/93, 2002/58 and/or Canadian law on privacy.
    2 See Article 2, EU Directive definition.

    On the one hand, this necessitates the introduction of a methodology for privacy consolidation, including privacy threat analysis, and, on the other hand, this also necessitates a classification of software products dealing with personal data, to ensure that no lightweight privacy products are used for heavyweight personal data. Such a methodology, which follows software development through every single stage, is mandatory, just as for software quality and other related aspects, of which reliability, security and system safety are just a few examples. You cannot just add it as an afterthought when everything else has been dealt with. In a nutshell, we could say that the postponement of dealing with personal data implications until a later phase may easily lead to an information system that is contrary to privacy adaptations. Certain measures may have been necessary very early on when developing the system, before much of this system has been cast in stone. See [HB98] for conceptual models for several types of information systems that ensure privacy violation prevention.

    Legislative drafting seems to mirror this observation. For instance, Directive 2002/58/EC (Directive on privacy and electronic communications) states in Article 4 (Security), Paragraph 1, that (preventive) measures shall ensure a level of security appropriate to the risk presented; this is reinforced by the following addition to Recital 13: the requirement to inform subscribers of particular security risks does not discharge a service provider from the obligation to take, at his or her own cost, appropriate and immediate measures to remedy any new, unforeseen security risks3.

    Methodology is, of course, the first concern, because this shall also produce insight into how to classify software systems, since, by necessity, it shall function as a benchmark for the quality of the privacy consolidation process. The more stringent the methodology that is applied, the better the software is expected to be, that is, from the privacy protection point of view; strict adherence to software quality standards and other non-functional restrictions still needs to be ascertained. This chapter presents a methodology for how this can be achieved, to evoke a discussion on the subject of privacy consolidation as a step on the road to international consensus concerning the construction of privacy-safe information systems.

    To facilitate the application of the privacy consolidation methodology, and of any methodology during the various stages of software development, qualified staff, budget and time are necessary resources. If, however, no thorough methodology is followed, the outcome can easily be that inadequate resources shall be allocated. The development of such a methodology shall, therefore, not just improve the awareness of the privacy issue for system developers; it shall also contribute to the allocation of the necessary facilities.

    2.2 General Description of Privacy

    Privacy is defined by Westin [Wes67] as:

    the claim of individuals...to determine for themselves when, how and to what extent information about them is communicated to others.

    3 Though it may be going too far to state the possibility that risk management shall be put on a similar regulatory footing to auditing, it is certainly apparent that powerful forces are gathering strength around an ex ante as opposed to an ex post comprehension and implementation of measures protecting against moral and other damages arising from privacy violations.


    Privacy is a fundamental human right as defined in Article 8 of the 1950 European Convention of Human Rights. It is one of the most important human rights issues of our evolving information age [Ban00]. Informational privacy has two distinct characteristics:

    1. The right to be left alone;

    2. The right to decide oneself what to reveal about oneself.

    The above means that, although it is a situation that is wanted by an individual, it primarily comprises a set of rules of conduct between the individual person and the person's environment with respect to personal data processing4 in the Internet environment. A major step towards privacy protection in Europe was the adoption of Convention 108 of the Council of Europe. Today, informational privacy protection for individuals is expressed through different European Union Directives such as:

    95/46/EC, the Data Protection Directive, hereafter DPD;

    2002/58/EC, the Directive on Telecommunications 2002;

    99/93/EC, the Digital Signature Directive;

    and non-EU legislation5.

    This type of legislation defines a set of rights concerning personal data accruing to individuals irrespective of the sector of application and creates obligations concerning the processing of data by third parties. The EU Privacy Directive 95/46/EC has two objectives:

    1. Creating a high level of protection of personal data;

    2. Enabling the free movement of data within the EU.

    The EU privacy legislation has, furthermore, two main functions:

    1. Empowering the individual to manage his or her own personal data and protecting these data within the limits as defined in the Directive;

    2. Creating a protective legal regime when personal data are processed6.

    In PISA (see Appendix A, B, and E) we use the following definition for privacy:

    The claim of individuals to be left alone, free from surveillance or interferencefrom other individuals, organisations or the state.

    It comprises a set of rules of conduct between the individual person and the person's environment with respect to the manipulation of personally identifiable information, even though it is a situation that is wanted by an individual. Personal data can be defined as the collection of all data that are or can be related to the individual. This includes factual data including their identification, physical data, behavioural data, social data, financial data and any other personal data that may be involved.

    4 This document uses the EU privacy legislation as the legal framework. This legislation uses the term personal data. The term Personally Identifiable Information is used instead of personal data in some environments. The terms are interchangeable, but this document shall use the standard term personal data.

    5 See the 2000 Canadian Personal Information Protection and Electronic Documents Act [Can00].

    6 Summary, Conclusions and Recommendations of the TNO Report, STB-02-13a, Privacy-Enhancing Technologies and Information Systems of the Central Administration, The Hague, 28 February 2002, commissioned by the Ministry of the Interior and Kingdom Relations (BZK), handed out 23 May 2002 in The Hague, after the symposium Privacy by Design organised by the Dutch Data Protection Authority (CBP).


    2.3 Definitions

    Directive 95/46/EC7 of the European Parliament and of the Council of 4 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

    The following definitions8 apply and shall be used in this handbook throughout:

    (a) Personal data9 shall mean any information relating to an identified or identifiable natural person (data subject); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors that are specific to his or her physical, physiological, mental, economic, cultural or social identity;

    (b) Processing of personal data (processing) shall mean any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organisation, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination, or otherwise making available, alignment or combination, blocking, deletion or destruction;

    (c) Personal data filing system (filing system) shall mean any structured set of personal data items that are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis;

    (d) Controller shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria used for his or her nomination may be designated by national or Community law;

    (e) Processor shall mean a natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller;

    (f) Third party shall mean any natural or legal person, public authority, agency or any body other than the data subject, the controller, the processor and the persons who, under the direct authority of the controller or the processor, are authorised to process the data;

    (g) Recipient shall mean a natural or legal person, public authority, agency or any other body to whom data is disclosed, whether a third party or not; however, authorities which may receive data in the framework of a particular inquiry shall not be regarded as recipients;

    (h) The data subject's consent shall mean any freely given specific and informed indication of his or her wishes by which the data subject makes it known that he or she does not object to personal data relating to him or her being processed.

    7 This Directive is hereafter referred to as the Data Protection Directive (DPD).

    8 See Article 2, Definitions, of EU Directive 95/46/EC [EC95].

    9 PII (Personally Identifiable Information) is used instead of PD (Personal Data) in certain regions (e.g. the United States). This handbook consistently uses PD.


    Note: the term data subject is implicitly defined. To simplify reading, the term data subject, once used in context, shall occasionally be replaced by the term person, without change of definition. Additionally, the term Data Protection Authority (or Privacy Commissioner) is used to indicate the legal authority in charge of providing privacy regulation support (through reporting, reprimands, enforcement or imposition).
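    Purely as an illustration (the record and method names below are invented and are not taken from the Directive or from the PISA design), the roles and the consent requirement from definitions (a), (d), (e) and (h) could be mirrored in code that refuses to process personal data for a purpose the data subject has not consented to:

```java
import java.util.Set;

record DataSubject(String id) { }

/** A personal data item (definition a) relating to an identifiable data subject. */
record PersonalData(DataSubject subject, String attribute, String value) { }

/** Freely given, specific, informed consent (definition h): the purposes agreed to. */
record Consent(DataSubject subject, Set<String> consentedPurposes) {
    boolean covers(String purpose) {
        return consentedPurposes.contains(purpose);
    }
}

/** A processor (definition e) acting on behalf of a controller (definition d),
 *  which determines the purposes and means of the processing. */
class Processor {
    void process(PersonalData data, Consent consent, String purpose) {
        // Refuse processing when no consent (or other legal ground) covers this purpose.
        if (!consent.subject().equals(data.subject()) || !consent.covers(purpose)) {
            throw new IllegalStateException("No legal ground to process " + data.attribute());
        }
        // ... collection, recording, storage or disclosure would happen here
    }
}
```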

    2.4 Data Subject and Personal Data

    This paragraph discusses several issues from the Directive applicable to the processing of personal data in general.

    2.4.1 Identified or Identifiable

    The legislator gives the definitions of the entities that relate to privacy legislation in Article 2 of the Data Protection Directive. In Article 2 (a) the definitions are given for data subject and personal data:

    (a) personal data shall mean any information relating to an identified or identifiable natural person (data subject); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors that are specific to his or her physical, physiological, mental, economic, cultural or social identity;

    The definition of data subject and of personal data distinguishes between two very important situations, namely whether a natural person is identified by means of personal data items such as name, date of birth and gender, or whether a natural person is identifiable by means of the available personal data items other than the types of items mentioned before. In legal terms, personal data means any piece of information regarding an identified or identifiable natural person10. Whether we can talk of personal data depends on a number of elements of which, within the scope of this document, identification is the only significant element. According to Article 2 of the EC Directive 95/46, a natural person can be identified directly or indirectly. Direct identification requires basic details (e.g., name, address, etc.), plus a personal number, a widely known pseudo-identity, a biometric characteristic such as a fingerprint, PD, etc. Indirect identification requires other unique characteristics or attributes, or a combination of both, to provide sufficiently identifying information (see [BR01]).

    Non-identification is assumed if the amount and the nature of the indirectly identifying data are such that identification of the individual is only possible with the application of disproportionate effort, or if assistance by a third party outside the power and authority of the person responsible is necessary11. Whether we can talk of disproportionate effort depends, on the one hand, on the nature of the data and the size of the population and, on the other hand, on the resources of time and money one is willing to spend in order to be able to identify the person. We can only note in passing here that the concepts of identification and identity are essentially contested in theory and are often arbitrary and ambiguous in practice (see [Raa99]).

    10 See Article 2 of EU Directive 95/46/EC; personal data is defined in Article 1 of the UK Data Protection Act 1998 and there is a lot of confusion about identification and verification.

    11 See Recital 26 of the EC Directive 95/46 [EC95].


    Internet identifiers such as an IP address, browsing activities of a user, session login details and the listing of websites visited by an Internet user are classified as personal data.

    A data subject is identifiable if sufficient facts about a data subject are known. Example: an information system for a loyalty scheme registers in its database all details related to items sold, for analysis purposes. In order to perform sales analyses it also registers the following demographic data items: gender, date of birth, loyalty card number and postal code. Under the DPD this database contains personal data, because the controller can easily identify the data subject by matching the loyalty card number with the customer database. See Recital 26: whereas, to determine whether a person is identifiable, account must be taken of all the means that may reasonably be expected to be used, either by the controller or by any other person, to identify the said person.

    The Common Criteria for Information Technology Security Evaluation (CC)12 define the anonymity family as: the family to ensure that a user may use a resource or service without disclosing the user's identity. The requirements for Anonymity provide protection for the user identity. Anonymity is not intended to protect the subject identity. In order to anonymise personal data according to the CC, it is enough to remove those data items that directly identify the data subject. From a data protection point of view, however, anonymity is more than that. Anonymity is intended to protect the data subject identity in relation to privacy legislation. The definition also categorises data as personal data as long as the data subject can be identified.

    Given an information system that anonymises personal data according to the CC definition, it is very likely that the system still processes personal data according to the DPD. Example: the date of birth must be transformed into an age category, the last positions of the postal code must be removed and the loyalty card number must be encrypted using a one-way hashing algorithm (this ensures that the items bought by one data subject can be linked without the ability to identify the data subject).
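    The anonymisation example above can be made concrete with a short sketch. This is an illustration only, not a requirement of the DPD or the CC: the field names, the ten-year age bands and the secret key below are hypothetical choices, and the keyed one-way hash (HMAC-SHA256) merely stands in for whatever one-way algorithm the controller selects.

        import hashlib
        import hmac
        from datetime import date

        # Illustrative secret held by the controller; in practice stored securely.
        HASH_KEY = b"replace-with-a-securely-stored-secret"

        def age_category(date_of_birth: date, today: date) -> str:
            """Reduce an exact date of birth to a coarse ten-year age band."""
            age = today.year - date_of_birth.year - (
                (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
            )
            lower = (age // 10) * 10
            return f"{lower}-{lower + 9}"

        def truncate_postal_code(postal_code: str) -> str:
            """Drop the last positions of a Dutch postal code (e.g. '2509 JG' -> '2509')."""
            return postal_code.strip()[:4]

        def pseudonymise_card_number(card_number: str) -> str:
            """Keyed one-way hash: purchases remain linkable, the card number is not recoverable."""
            return hmac.new(HASH_KEY, card_number.encode(), hashlib.sha256).hexdigest()

        record = {
            "gender": "F",
            "age_category": age_category(date(1970, 5, 17), date(2003, 1, 1)),
            "postal_code": truncate_postal_code("2509 JG"),
            "card_pseudonym": pseudonymise_card_number("1234567890"),
        }

    The resulting record can still be analysed per age band, region and (pseudonymous) card, but no longer directly identifies the data subject; whether it is sufficiently anonymised under the DPD still depends on the disproportionate-effort test discussed above.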

    2.4.2 Special Categories of Data

    The Privacy Directive defines several special categories of data in Article 8. Paragraph 1 defines the basic set of categories of special data and, subsequently, forbids the processing of such data.

    1. Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, and the processing of the data concerning health or sex life.

    Paragraph 5 of the same Article adds data relating to offences, criminal convictions or security measures to this list. Finally, data relating to administrative sanctions or judgments in civil cases concludes this list. In the last paragraph of this Article, Member States are obliged to determine the conditions under which national identification numbers or any other identifier of general application may be processed.

    Paragraph 2 of the same Article sets out a number of well-defined conditions under which processing of these special categories can be lawful.

    12 See Section 3.2 for a more detailed description of the CC.


    2.4.3 Automated Individual Decisions

    The legislator has drawn special attention to automated decisions. In a MAS (Multi-Agent System) environment, some of the decisions of an agent may well be classified as an automated decision. Article 15 of Directive 95/46/EC defines the automated decision as:

    1. Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.

    When it comes to the rights of the data subject, the Directive states, in Article 12 (a) third dash, that the data subject has the right to obtain knowledge of the logic involved in any automatic processing of data concerning said data subject, at least in the case of the automated decisions referred to in Article 15 (1). This part of the privacy legislation may become a real issue if an agent, somewhere down the line, decides on the basis of the facilities offered by the next agent whether to pass personal data on to it.
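    As a purely illustrative sketch of how an agent could satisfy this transparency requirement, the fragment below records, for each automated decision, the inputs used and the rule applied, so the logic involved can be reproduced on request. The creditworthiness rule, field names and threshold are hypothetical and are not taken from the Directive or from the PISA design.

        from dataclasses import dataclass, field
        from datetime import datetime, timezone

        @dataclass
        class DecisionRecord:
            """Kept for every automated decision so the logic can be disclosed to the data subject."""
            subject_id: str
            decision: str
            rule_applied: str                      # human-readable statement of the logic involved
            inputs_used: dict = field(default_factory=dict)
            timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

        def decide_creditworthiness(subject_id: str, income: float, debts: float) -> DecisionRecord:
            # Hypothetical rule: accept only if debts do not exceed 40% of income.
            accepted = debts <= 0.4 * income
            return DecisionRecord(
                subject_id=subject_id,
                decision="accepted" if accepted else "rejected",
                rule_applied="accepted if debts <= 40% of income",
                inputs_used={"income": income, "debts": debts},
            )

    Keeping such records also makes it possible to answer an Article 12 (a) request long after the deciding agent has completed its task.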

    2.5 Controller and Processor

    The definitions are given for controller and processor in Article 2 (d) and (e):

    (d) Controller shall mean a natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data. Where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria for his or her nomination may be designated by a national or Community law;

    (e) Processor shall mean the natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller.

    Within the context of a handbook for Internet-based agents, the most important part of these definitions is that both the controller and the processor must be a natural or legal person who can be held responsible by the data subject for the lawful processing of personal data.

    2.6 Security

    2.6.1 Legal Framework

    Article 17 Security of processing

    1. Member States shall provide that the controller must implement appropriate technical and organisational measures to protect personal data against accidental or unlawful destruction or accidental loss and against unauthorised alteration, disclosure or access, in particular where the processing involves the transmission of data over a network, and against all other unlawful forms of processing.
    Having regard to the state of the art and the costs of their implementation, such measures shall ensure a level of security appropriate to the risks represented by the processing and the nature of the data to be protected.


    2. The Member States shall provide that the controller must, where processing is carried out on his or her behalf, choose a processor who provides sufficient guarantees in respect of the technical security measures and organisational measures governing the processing to be carried out and must ensure compliance with those measures.

    3. The carrying out of processing by way of a processor must be governed by a contract or legal act binding the processor to the controller and stipulating in particular that:

    - the processor shall act only on instructions from the controller;

    - the obligations set out in Paragraph 1, as defined by the law of the Member State in which the processor is established, shall also be incumbent on the processor.

    4. For the purposes of keeping proof, the parts of the contract or legal act relating to data protection and the requirements relating to the measures referred to in Paragraph 1 shall be in writing or in another equivalent form.

    Recitals

    (15) Whereas the processing of such data is covered by this Directive only if it is automated or if the data processed are contained or are intended to be contained in a filing system structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question;

    (26) Whereas the principles of protection must apply to any information concerning an identified or identifiable person;
    whereas, to determine whether a person is identifiable, account must be taken of all the means that can reasonably be expected to be used either by the controller or by any other person to identify the said person;
    whereas the principles of protection shall not apply to data rendered anonymous in such a way that the data subject is no longer identifiable;
    whereas codes of conduct within the meaning of Article 27 may be a useful instrument in providing guidance as to the way in which data may be rendered anonymous and retained in a form in which identification of the data subject is no longer possible;

    (47) Whereas where a message containing personal data is transmitted by means of a telecommunications or electronic mail service, the sole purpose of which is the transmission of such messages, the controller in respect of the personal data contained in the message shall normally be considered to be the person from whom the message originates, rather than the person offering the transmission services; whereas, nevertheless, those offering such services shall normally be considered controllers in respect of the processing of the additional personal data necessary for the operation of the service;

    2.6.2 The Security of Personal Data

    This section contains a summary of the AV23 study: Security of Personal Data [vBB01].


    Introduction

    Our privacy legislation defines standards for the exercise of a fair and lawful processing of personal data. It is very important, for example, that appropriate technical and organisational measures are taken to protect personal data against loss and unlawful processing. This document considers how the obligation to provide protection must be fulfilled in practice, that is, the requirements that personal data protection measures must meet. The document sets out the Dutch Data Protection Authority's (DPA) practical guidance for controllers in the context of the statutory framework.

    A controller's duty to protect personal data is a corollary to the individual's right to privacy. This right is established in international treaties, in European legislation, in the Dutch constitution and in legislation. The Wet bescherming persoonsgegevens (WBP; Dutch Personal Data Protection Act) has provided the general basis for this field of law since it came into force on 1 September 2001 in the Netherlands. Responsibility for supervising compliance with the WBP lies with the Dutch DPA. The WBP regulates the processing of personal data, i.e. any procedure involving such data, from its collection to its destruction. Before collecting personal data, a controller, that is, the person responsible for the processing of the data, must look into the question of appropriate security measures. The Act covers both automated and manual data processing. Measures and procedures already in place for the protection and processing of data need to be tested against the requirements of the WBP and revised as necessary.

    The security that must be provided goes beyond information security; all matters relevant to the processing of personal data must be addressed, not just those that fall within the ICT domain of information security. This document describes the additional measures to be taken to supplement those needed for compliance with general security requirements. This is because the WBP stipulates that extra measures must be taken to ensure the security of personal data processing activities. The measures put in place must, therefore, be broader than those needed to satisfy general data security requirements.

    The protection of personal data is concerned with three quality aspects: exclusivity, integrity and continuity. The guidance set out in this document is particularly concerned with measures and procedures for protecting the exclusivity of personal data. The other two aspects are covered by the general system of security measures.

    The legal basis for personal data protection measures is formed by Article 13 of the WBP. The relevant passage of the Act states: The controller shall implement appropriate technical and organisational measures to secure personal data against loss or against any form of unlawful processing. These measures shall guarantee an appropriate level of security, taking into account the state of the art and the costs of implementation, and having regard to the risks associated with the processing and the nature of the data to be protected. These measures shall also aim at preventing unnecessary collection and further processing of personal data. The level of security that a controller must provide shall depend on the risk class.

    Article 13 of the WBP forms the basis for the use of Privacy-Enhancing Technologies (PETs). PETs are a coherent system of ICT measures protecting informational privacy (in accordance with European Directive 95/46/EC and the WBP) by eliminating or minimising personal data or by preventing the unnecessary or unwanted processing of such data, without compromising the functionality of the information system. The use of PETs is more than an appropriate technical measure; it is a means of systematically ensuring compliance with the WBP.


    Protection Levels for Personal Data

    It is important that the measures taken when protecting personal data address threats that are realistic, given the nature of the data concerned and the scale of the processing activities. The risk may be regarded as the product of the likelihood of an undesirable event and the seriousness of the implications of that event. The greater the risk, the stricter the protection requirements that must be met. As a guide to the measures that are appropriate, data processing procedures are divided into a number of predefined risk classes. Each class is linked to a particular level of protection. The main factors influencing the level of protection required include:

    - The significance attached by society to the personal data to be processed;
    - The level of awareness within the processing organisation regarding information security and the protection of personal data and subjects' privacy;
    - The nature of the ICT infrastructure within which the personal data is to be processed.

    The controller must perform a thorough analysis in each case. On the basis of the findings, the controller can decide which risk class the intended procedure falls into and what level of protection is, therefore, required. The analysis must be verifiable and it must be possible to give an account of the analysis if necessary. Four risk classes are recognised:

    Risk class 0: Public level risk. The personal data to be processed is already in the public domain. It is generally accepted that use of the data for the intended purpose represents no risk to the subjects. This document, therefore, proposes no special protection measures.

    Risk class I: Basic level risk. The consequences for the subjects of the loss or unauthorised or inappropriate use of their personal data are such that standard (information) protection measures are sufficient.

    Risk class II: Increased risk. The loss or unauthorised or inappropriate use of the personal data would have additional consequences for the subjects. Certain types of personal data referred to in Article 16 of the WBP enjoy special legal protection and, therefore, require at least the level of protection associated with this risk class. The types of personal data in question are data concerning a data subject's religion or philosophical beliefs, race, political opinions, health, sex life, trade union membership, criminal record or record of unlawful or antisocial behaviour following the imposition of an injunction.

    Risk class III: High risk. Where several collections of special categories of personal data are to be processed, the potential consequences of processing can be sufficiently serious for the data subjects involved that the procedure warrants inclusion in risk class III. The measures taken to protect data processed in a class III procedure must meet the highest standards.

    The interrelationships between the various risk classes are summarised in Table 2.1.
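    Read as a decision rule, Table 2.1 can be sketched roughly as follows. This is an illustration only: it assumes the controller has already judged whether the data are sensitive (Article 16 WBP) or of a financial/economic nature and whether the processing is large in volume and complex, and the function and parameter names are hypothetical.

        def risk_class(sensitive_or_financial: bool, large_and_complex: bool) -> str:
            """Approximate reading of Table 2.1: returns the risk class as a Roman numeral string."""
            if sensitive_or_financial:
                return "III" if large_and_complex else "II"
            return "I" if large_and_complex else "0"

    In practice the classification follows from the controller's documented risk analysis rather than from a two-flag rule, but the sketch shows the monotonic structure of the table: more sensitive data and larger, more complex processing always mean a higher risk class.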

    The Security of Personal Data in Practical Situations

    The requirements that apply to the protection of personal data are presented below, divided into fourteen categories. The measures that must be taken in a particular case depend on the risk class in which the processing procedure is placed on the basis of risk analysis.


    Table 2.1: Risk Classification of Personal Data.

    Nature of personal data:
      (a) Personal data
      (b) Sensitive personal data (in accordance with Article 16 WBP)
      (c) Personal data of a financial and/or economic nature

    Quantity of personal data (nature and volume) and nature of the processing:

      Small quantity of personal data, simple processing:   (a) Risk class 0   (b) Risk class II   (c) Risk class II
      Large quantity of personal data, complex processing:  (a) Risk class I   (b) Risk class III  (c) Risk class III

    Security policy, protection plan and implementation of the system of measures and procedures

    Management formulates a policy setting out general information security requirements and specific personal data protection requirements. On the basis of this policy, a plan is drawn up for the creation of a system of protection measures. The responsibilities of staff members with regard to the implementation of the protection policy must be defined. Compliance must be checked regularly and validity in the current circumstances must be reviewed regularly.

    Administrative organisation

    The term administrative organisation covers the entire system of measures relevant to the systematic processing of data in order to provide information to facilitate the management and operation of the organisation, as well as to facilitate the process of accounting for such activities. The measures and procedures must be defined in a structured manner. It is, furthermore, important that they are reviewed whenever changing circumstances make this appropriate. To this end, regular checks must be carried out to ascertain whether the procedures and measures are consistent with current practice. It is also important that the responsibilities with regard to information security and data processing are properly defined.

    Privacy awareness

    Practical application of protection measures is normally down to an organisation's staff, all of whom need to play their part. It is, therefore, necessary to build up or maintain an adequate level of privacy awareness within the organisation. Checks on application and compliance are important.

    Personnel requirements

    In order to minimise the risk of inappropriate processing, it is important that the need for care and attention in this area is taken into account during the recruitment and selection of personnel. Responsibilities with regard to the protection of personal data must be set out in job descriptions.

    Organisation of the workplace

    One of the most important aspects of information security is ensuring that data (in general) do not come into the possession of unauthorised individuals. Appropriate measures, which often do not need to be at all complicated, are required to minimise the risk of unauthorised access to personal data. Personal data are sometimes transported on portable media. PCs and data carriers must be properly secured and unauthorised access prevented.

    Management and classification of the ICT infrastructure

    An organisation needs to have an up-to-date overview of its ICT facilities for operational purposes. Proper management of the ICT infrastructure is also necessary for the protection of personal data.


    Access control

    An organisation is required to define measures and procedures to prevent unauthorised access to locations and information systems at or in which personal data is held. This implies being able to close off and control access to relevant areas and ensuring that only authorised personnel can access information systems. It is also important that records are kept indicating who is authorised to do what, and that regular authorisation checks are carried out.

    Networks and external interfaces

    The transmission of personal data via a network involves a significant security risk. It is, therefore, strongly recommended that personal data be encrypted for transmission. In this way, it is at least possible to ensure that messages containing personal data are not read by unauthorised persons without explicit, deliberate intent.
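    A minimal sketch of this recommendation, assuming the widely used third-party Python package cryptography is available; key distribution (how sender and receiver come to share the key, for example via a Trusted Third Party) is deliberately left out, and the message content is of course hypothetical.

        # Encrypt personal data before it is transmitted over the network.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()        # in practice: provisioned and stored securely
        cipher = Fernet(key)

        message = b"name=J. Jansen; postal_code=2509 JG"   # personal data to be sent
        token = cipher.encrypt(message)                    # ciphertext actually transmitted

        # Receiving side, holding the same key:
        assert cipher.decrypt(token) == message

    Symmetric encryption of the payload is only one option; transport-level protection such as HTTPS, as used in the PISA architecture, serves the same goal of keeping personal data unreadable to unauthorised parties in transit.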

    Use of third-party software

    In order to minimise security risks, illegal or unapproved third-party software must never be used for data processing. Regular checks are important to ensure that this principle is adhered to. All software modifications must be documented and managed using a problem and change management system. Procedures for the modification and replacement of software must also be in place.

    Bulk processing of personal data

    Processes involving personal data on numerous subjects are frequently automated or semi-automated. Integrated or non-integrated automated bulk processes are initiated and then allowed to complete without further interruption. The controller, nevertheless, remains responsible for the processing at all stages; responsibility cannot be transferred to the system manager.

    Storage of personal data

    Management of data carriers is important for the protection of personal data. System backups must be made at regular intervals. Clear procedures must be defined for the production of backup files containing personal data, and adherence to these procedures must be checked. Data carriers bearing personal data must be carefully stored and transported to ensure that unauthorised persons cannot remove them or view the data. Data carriers bearing personal data in risk classes II or III must be stored in lockable areas with appropriate break-in protection, even if the data is stored in encrypted form.

    Destruction of personal data

    Personal data and data carriers that are no longer required must be destroyed in an appropriate manner once any applicable statutory or other retention period has elapsed. Responsibilities with regard to the destruction of personal data and data carriers must be clearly defined and personnel must be aware of who is responsible for doing what.

    Contingency plan

    Every organisation must have a contingency plan indicating exactly what is to happen in the event of an emergency. Such a plan, however, is useful only if personnel are familiar with it and regular drills have been held to practise its implementation. An organisation's contingency plan must set out a procedure for the resumption of data processing following an emergency.


    Contracting out and contracts for the processing of personal data

    An organisation may sometimes choose to contract out some or all of its personal data processing activities, rather than perform them in-house. Under such circumstances, the external processor must guarantee the same level of protection as that provided by the controller. A contract made with an external processor must make provisions for the protection of personal data. The external processor, furthermore, must sign a confidentiality contract. If an external processor handles personal data, the controller must supervise the processor's protection arrangements by, for example, carrying out periodic checks.

    2.7 Privacy Preferences and Policies

    Many of the data protection issues are linked with the consent of the data subject. A data subject can give his or her consent to a form of processing of his or her personal data if he or she knows how the controller shall process that data; this is the transparency principle. The controller shall inform the data subject of his or her intentions. This can be published in a privacy policy statement. The data subject may have written down the conditions under which he or she is prepared to disclose his or her personal data to a controller. Such a statement may be called the privacy preferences of the data subject. In theory, there can now be a broker of personal data. Assuming that this broker can be trusted, the data subject may give it his or her privacy preferences and personal data. The broker, on receiving a request for disclosure, shall ask for the privacy policy of the controller in question. If privacy preferences and privacy policies can be defined in a binary system, the decision whether or not to disclose could be an automated decision. It is one of the objectives of a PISA that each agent in the environment is equipped with the privacy policy of its controller. Before personal data is transferred to an agent, directly by the data subject or by an agent already loaded with personal data, the agent discloses its privacy policy, which is then matched against the preferences of the current holder of the data, data subject or agent. The preferences are compared with the policy and, if they match, this can be viewed as the consent required by the Data Protection Directive.
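    A minimal sketch of such an automated match, assuming (purely for illustration) that both the controller's policy and the data subject's preferences can be reduced to a few comparable attributes; a real agent platform would use a richer policy representation, so the field names below are hypothetical.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class PrivacyPolicy:
            """Controller's declared processing intentions (illustrative fields only)."""
            purposes: frozenset
            retention_days: int
            discloses_to_third_parties: bool

        @dataclass(frozen=True)
        class PrivacyPreferences:
            """Conditions under which the data subject is prepared to disclose personal data."""
            allowed_purposes: frozenset
            max_retention_days: int
            allow_third_party_disclosure: bool

        def preferences_match_policy(prefs: PrivacyPreferences, policy: PrivacyPolicy) -> bool:
            # Disclosure only if every declared purpose is allowed, retention stays within
            # the subject's limit, and any third-party disclosure is acceptable to the subject.
            return (
                policy.purposes <= prefs.allowed_purposes
                and policy.retention_days <= prefs.max_retention_days
                and (prefs.allow_third_party_disclosure or not policy.discloses_to_third_parties)
            )

    If the match succeeds, the transfer can proceed and the match can be recorded as the consent required by the Data Protection Directive; if it fails, the personal data is simply not handed over.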

    2.8 Legal Interception

    Agents operate in the environment of the Internet. This means that the Privacy and Electronic Communications Directive of 2002 also applies to the MAS environment, in addition to data protection legislation. Recital 11 and Article 15 of the 2002 Privacy and Electronic Communications Directive may restrict the scope of the rights and obligations provided for in Article 5, Article 6, Article 8(1), (2), (3) and (4), and Article 9 of that Directive when such restrictions constitute a necessary, appropriate and proportionate measure within a democratic society to safeguard national security (i.e. state security), defence, public security, and the prevention, investigation, detection and prosecution of criminal offences or of unauthorised use of the electronic communication system, as referred to in Article 13(1) of the Data Protection Directive. To this end, Member States may, inter alia, adopt legislative measures providing for the retention of data for a limited period as long as the criteria of Article 8 of the European Convention on Human Rights are respected. This provision in the Privacy and Electronic Communications Directive means that the privacy of the user, the data subject, is not guaranteed in all circumstances. National implementations of this Directive may provide for judicial powers of conditional access to log files to be created by providers of public telecommunication services. This has led to the following provisions in the telecommunications law in the Netherlands. Article 1 of


    Chapter 13 states that public telecommunication networks and public telecommunication services may only be made available to the public if the data processed can be intercepted. (This provision does not arise from the Directive but from the Council recommendations of 1995 concerning the ability to intercept communications.) Because there is no legal obligation for all EU Member States to implement similar regulations, not all Member States have this obligation. This legislation may lead to a very confusing outcome in the following situation. A controller offers a telecommunication service where all the (personal) data processed are encrypted. This controller uses a telecommunication provider that has no commercial links with the controller. The encryption process is secured using a Trusted Third Party (TTP). The service provider shall be denied access to the TTP for the decryption. Even if the service provider discloses the log files, the contents cannot be made readable to the investigative authorities. The same situation may also arise even when the controller and the provider are one and the same organisation. The questions to be asked now are twofold:

    - whether the service provider renders an unlawful service;
    - whether the TTP can be forced to disclose the encryption key to the investigative authorities. There must be a provision in the contract between the controller and the TTP regarding this matter.

    2.9 Nine Condensed Rules from the EU Viewpoint on Privacy

    The interpretation of how privacy has to be approached is not anywhere near uniform, not even in the countries with a privacy protection regime. The rigour with which the legislation is applied also varies from country to country. The most stringent regime, that of the EU, recognises nine different basic guiding rules. Eight of these rules deal with privacy protection directly, while the remaining rule is a consequence of the global differences in privacy regimes. In order to protect its citizens against violations of privacy in non-EU countries where more permissive interpretations exist after personal data has been (electronically) received there, the EU regime stipulates that, if at all possible, this data traffic must be prevented. Each rule is indicated by a brief title, a short explanation of its meaning and a number of paragraphs providing further clarification.

    Reporting the processing (Data Protection Authority)

    The processing of personal data must be reported in advance to the Data Protection Authority (DPA) or a privacy officer, unless processing has been exempted. This rule stipulates that the collector of personal data has to inform the DPA of its collection and its intentions with this collection. The DPA is the main guardian of the rules of the privacy regime. Its authority and competence must be expressed in national legislation13. It must also, naturally, be equipped with an appropriate budget, staff and other required facilities and it must be backed by national parliament.

    Transparent processing

    The person involved must be able to see who is processing his or her personal data and for what purpose. The main issues of this rule are that (1) the data subject is informed about all the controllers of his or her personal data and (2) that these controllers inform the data subject about the purposes of their collections and further processing.

    13 The USA has not appointed a Federal Privacy Commissioner.


    As required processing

    Personal data may only be collected for specific, explicit and legitimate purposes and not further processed in a way incompatible with those purposes. This rule is meant to make perfectly clear that use and processing for purposes that are not known to the data subject are forbidden, not to say illegal.

    Lawful basis for data processing

    The processing of personal data must be based on a foundation referred to in legislation, such as permission, agreement, legal obligation, justified interest, and so on. For special categories of data, such as medical information, stricter limits prevail. The legality of the processing of the personal data collected must be ensured. Even with consent from the data subject, any processing that is in violation of the law remains illegal.

    Data quality

    Personal data must be as correct and as accurate as possible, sufficient, to-the-point and not excessive. This rule acknowledges the possibly volatile nature of personal data and charges the collector with the obligation to keep all items of personal data in good order, including their accuracy, adequacy and timeliness, and is an incentive to minimise data as much as possible.

    Rights of the parties involved

    The parties involved have the right to take cognisance of and to update their data, as well as the right to raise objections. This, in fact, addresses two different issues: (1) the right of the data subject to update personal data through a mechanism made known to him or her, with the obligation, on the controller's side, to implement the amendments in all duplications of the personal data in its own collection or in its processors' collections, and (2) the right of the data subject to raise objections in case the mechanism remains obscure or the amendments are not implemented in due time.

    Processing personal data by a processor

    If processing is outsourced to a processor, it must be ensured that the processor observes the instructions of the person responsible. This rule states that, with the transfer of personal data to a processor, the rights of the data subject remain unaffected and that all restrictions and obligations of the controller equally apply to the processor. There is no erosion of personal data constraints in the course of this process.

    Protection against loss and unlawful processing of personal data

    Suitable measures of a technical and organisational nature make up the necessary tailpiece of lawful processing [vBB01]. This rule puts the controller under the obligation to take all meaningful and possible measures to keep the collection of personal data, and any item related to the data, in good order and also to take precautions that use of the collection outside the legal and purposeful l