


The Journal of Information Technology Management

Cutter IT Journal

Vol. 26, No. 8, August 2013

Privacy and Security in the Internet of Things

Opening Statement
by Rebecca Herold . . . . . . . . . . 3

The Internet of Things: Data Collection and Its Impact on Your Privacy
by Hugh A. Boyes . . . . . . . . . . 6

The Internet of Things and Consumers’ Privacy Rights: The Case of RFID Technology
by Analía Aspis . . . . . . . . . . 12

Notice and Consent in the Internet of Things
by R. Jason Cronk . . . . . . . . . . 18

Big Data Privacy and Security Challenges
by Jason Stradley . . . . . . . . . . 24

The Internet of Things: Establishing Privacy Standards Through Privacy by Design
by Nicola Fabiano . . . . . . . . . . 30

“The more you learn about the possibilities of the IoT, the more you realize we have only dipped our toes into this fathomless ocean of data possibilities — and associated privacy and security concerns.”
— Rebecca Herold, Guest Editor

NOT FOR DISTRIBUTION
For authorized use, contact Cutter Consortium: +1 781 648 8700, [email protected]

Cutter IT Journal®

Cutter Business Technology Council: Rob Austin, Ron Blitstein, Tom DeMarco, Lynne Ellyn, Israel Gat, Vince Kellen, Tim Lister, Lou Mazzucchelli, Ken Orr, and Robert D. Scott

Editor Emeritus: Ed Yourdon
Publisher: Karen Fine Coburn
Group Publisher: Chris Generali
Managing Editor: Karen Pasley
Production Editor: Linda M. Dias
Client Services: [email protected]

Cutter IT Journal® is published 12 times a year by Cutter Information LLC, 37 Broadway, Suite 1, Arlington, MA 02474-5552, USA (Tel: +1 781 648 8700; Fax: +1 781 648 8707; Email: [email protected]; Website: www.cutter.com; Twitter: @cuttertweets; Facebook: Cutter Consortium). Print ISSN: 1522-7383; online/electronic ISSN: 1554-5946.

©2013 by Cutter Information LLC. All rights reserved. Cutter IT Journal® is a trademark of Cutter Information LLC. No material in this publication may be reproduced, eaten, or distributed without written permission from the publisher. Unauthorized reproduction in any form, including photocopying, downloading electronic copies, posting on the Internet, image scanning, and faxing is against the law. Reprints make an excellent training tool. For information about reprints and/or back issues of Cutter Consortium publications, call +1 781 648 8700 or email [email protected].

Subscription rates are US $485 a year in North America, US $585 elsewhere, payable to Cutter Information LLC. Reprints, bulk purchases, past issues, and multiple subscription and site license rates are available on request.

About Cutter IT Journal

Part of Cutter Consortium’s mission is to foster debate and dialogue on the business technology issues challenging enterprises today, helping organizations leverage IT for competitive advantage and business success. Cutter’s philosophy is that most of the issues that managers face are complex enough to merit examination that goes beyond simple pronouncements. Founded in 1987 as American Programmer by Ed Yourdon, Cutter IT Journal is one of Cutter’s key venues for debate.

The monthly Cutter IT Journal and its companion Cutter IT Advisor offer a variety of perspectives on the issues you’re dealing with today. Armed with opinion, data, and advice, you’ll be able to make the best decisions, employ the best practices, and choose the right strategies for your organization.

Unlike academic journals, Cutter IT Journal doesn’t water down or delay its coverage of timely issues with lengthy peer reviews. Each month, our expert Guest Editor delivers articles by internationally known IT practitioners that include case studies, research findings, and experience-based opinion on the IT topics enterprises face today — not issues you were dealing with six months ago, or those that are so esoteric you might not ever need to learn from others’ experiences. No other journal brings together so many cutting-edge thinkers or lets them speak so bluntly.

Cutter IT Journal subscribers consider the Journal a “consultancy in print” and liken each month’s issue to the impassioned debates they participate in at the end of a day at a conference.

Every facet of IT — application integration, security, portfolio management, and testing, to name a few — plays a role in the success or failure of your organization’s IT efforts. Only Cutter IT Journal and Cutter IT Advisor deliver a comprehensive treatment of these critical issues and help you make informed decisions about the strategies that can improve IT’s performance.

Cutter IT Journal is unique in that it is written by IT professionals — people like you who face the same challenges and are under the same pressures to get the job done. Cutter IT Journal brings you frank, honest accounts of what works, what doesn’t, and why.

Put your IT concerns in a business context. Discover the best ways to pitch new ideas to executive management. Ensure the success of your IT organization in an economy that encourages outsourcing and intense international competition. Avoid the common pitfalls and work smarter while under tighter constraints. You’ll learn how to do all this and more when you subscribe to Cutter IT Journal.

SUBSCRIBE TODAY

Start my print subscription to Cutter IT Journal ($485/year; US $585 outside North America)

Name / Title
Company / Address
City / State/Province / ZIP/Postal Code
Email (Be sure to include for weekly Cutter IT Advisor)

Fax to +1 781 648 8707, call +1 781 648 8700, or send email to [email protected]. Mail to Cutter Consortium, 37 Broadway, Suite 1, Arlington, MA 02474-5552, USA.

Request Online License Subscription Rates
For subscription rates for online licenses, contact us at [email protected] or +1 781 648 8700.

WE’VE ONLY JUST BEGUN THIS LONG, STRANGE IOT TRIP

While reading through the articles for this special issue of Cutter IT Journal on privacy and security in the Internet of Things (IoT), I kept thinking of that old Carpenters’ song, “We’ve Only Just Begun,” along with the Grateful Dead’s classic “Truckin’,” which contains the immortal line: “Lately it occurs to me what a long, strange trip it’s been.” If we mash up the two lines, the resulting “We’ve Only Just Begun This Long, Strange Trip” would seem to be an appropriate anthem for the Internet of Things. The more you learn about the possibilities of the IoT, the more you realize we have only dipped our toes into this fathomless ocean of data possibilities — and associated privacy and security concerns.

So what do we mean by the Internet of Things? Basically, the term — first coined by Kevin Ashton in 19991 — describes the concept of having communications occurring not just online, on the Internet, but also implicitly through basic daily activities via our growing array of digital devices, which are becoming more and more pervasive.

The potential uses of data gadgets in the IoT are unlimited, as are the benefits. However, with each benefit comes a risk. And what multiplies the risks is all the newly created data that has never existed before, and the new ways in which that data is used. Just a few years ago, no one was too concerned about being able to correlate meaningful information about individuals within vast repositories made up of zettabytes2 of information. One of the things that concerns me is how Big Data analytics, the capabilities of which are increasing just as fast as more zettabytes of data are being created, can be used to take those humongous amounts of data and reveal intimate information about individuals. With no restrictions on the use of Big Data analytics on data collected through the IoT, we will see privacy problems, and new types of privacy breaches, as a result.

HOW MANY PEOPLE ARE LIVE-STREAMING THEIR LIVES THROUGH THE IOT?

A growing number of companies are now creating services specifically designed to connect basically any type of product we use in our everyday life to other products. For example, the sole purpose of one business, Thingsquare, is “connecting things with low-power wireless Internet protocols to thermostats, light bulbs, street lights, and more.”3

Consider how the possibilities could, and may already have, become realities:

• Devices are being built that include sensors and wireless chips in strap-on boxes and cameras mounted within sports helmets. These are connected via WiFi chips to remote computers. Analysts are describing4 how such technologies can live-stream the players’ vitals online for the coach to see, so he or she can make decisions regarding players, strategy, and so on. Alternatively, such images and data can be streamed to an online site for the entire world to see — as the game is being played — to give viewers not only a player’s immediate perspective of the game, but also to show how the players’ bodies are handling the physical stress of the sport.

• Smartphones are being enabled to act as user-friendly interfaces for remotely controlling and configuring building thermostats. These can also serve as a geolocation app for heating, able to detect where you are located in the house and then adjust the temperature accordingly. Related to this is the existence of today’s

Opening Statement
by Rebecca Herold, Guest Editor


smart light bulbs5 that can be controlled by a smartphone. You can dim the lights, change the colors, get notifications when a bulb needs replacing, use a bulb as a night-light, sync lights to music, and control them remotely. (I want to play around with one of these!)

• A smart refrigerator communicating through the IoT will now be able to communicate with many other entities, such as (1) the grocery store, to order specific food products when it sees you are out of something, such as milk; (2) the appliance vendor, to notify the vendor how efficiently the refrigerator is working; and (3) the electric utility, so it can determine the energy efficiency of the appliance. Is this happening today? I don’t know. Perhaps. Will we see this happening if it is not yet? Certainly!

Are the life-enhancing possibilities of the IoT empowered by Big Data analytics exciting? Yes! But do those possibilities bring with them some significant privacy risks and security/safety issues? Oh, my, yes indeed!

IN THIS ISSUE

We begin this issue with an article from Hugh Boyes, who discusses at length some of the issues that accompany the new ways in which data is collected, shared, and used within the IoT. In particular, he focuses on the data that browsers of all kinds are collecting, typically unbeknownst to their users, about where those users are going online and what they are doing there. He then describes how the tracking and aggregated data is transmitted without users’ knowledge, let alone consent, to unknown others throughout the world. Boyes also highlights privacy and security issues that are inherent to storing, manipulating, and aggregating all that data using Big Data analytics, and he calls for a reconsideration of the legal protections in the IoT.

The use of RFID tags has been a growing concern not just to me, but to all those involved with privacy. The ways in which these tags are being used to track people continue to evolve, with few to no legal restrictions. In our second article, Analía Aspis takes aim at the specific concerns involving RFIDs used in the IoT. She first provides some interesting statistics and then describes ways in which RFID tags are being used, such as for profiling and for tracking locations. Aspis then describes the ways in which privacy could be legally protected when organizations decide they want to use RFID tags.

Speaking of legal protections and requirements, two of the longest-standing privacy principles are providing notice when data is collected and then asking for consent to use that data. But how can notice and consent be accomplished within the IoT when the data is often collected through the devices we are wearing or carrying, or even as a result of where we are walking and talking? R. Jason Cronk highlights these issues in our third article. He begins by discussing some of the challenges — and outright failures — of notice and consent concepts. He then builds upon this discussion to point out how providing notice and obtaining consent within the IoT may become not only enormously challenging, but in many cases impossible.

And why will it be impossible to give notice and obtain consent in many corners of the IoT? Well, one of the ways this becomes challenging or impossible is when Big Data analytics takes all that disparate information and uses massive computational power to quickly reveal new insights into the lives and activities of individuals. In our fourth article, Jason Stradley outlines some of the more significant ways in which Big Data analytics may be applied to data collected from the IoT to uncover more information than was ever previously possible, leading to new and unique privacy risks and potential exploits. One of his main points is that there is no security built into Big Data analytics. While I agree that this is overwhelmingly true, after seeing some Big Data analytics companies trying to establish security controls, I am hopeful that this tide will soon start turning. Read Stradley’s article to learn more.


UPCOMING TOPICS IN CUTTER IT JOURNAL

SEPTEMBER
Giancarlo Succi
Profiting in the API Economy

OCTOBER
Matt Ganis and Avinash Kohirkar
The Value of Social Media Data Analytics


So now that we’ve described just a few of the major privacy issues involved in the IoT, what can we do about it? Just accept that there is no privacy left? Instead, how about being proactive and building some privacy protections into the IoT? I am a Privacy by Design (PbD) Ambassador,6 and I can tell you that using the PbD philosophies, along with long-standing privacy principles and effective information security controls, we have a great arsenal of privacy protections we can implement within these new IoT and Big Data technologies to mitigate privacy risks. Someone else who recognizes this is Nicola Fabiano, who writes about such possibilities in our fifth and final article. Fabiano also details some of the efforts of the European Commission and international privacy authorities to ensure that personal privacy doesn’t become some quaint relic of the past.

SO LET’S GET THIS IOT PARTY STARTED!

I saw the following statement on the Internet7 and loved how it succinctly and clearly described the IoT:

The Internet is like the air, available everywhere. By connecting products and devices to the Internet, they can be accessed from anywhere and instantly.

Welcome to the IoT — you are already there whether you realize it or not! I’m looking forward to attending a Pink concert a few months from now, and I’m going both to enjoy her music — as you can probably tell from the subhead above — and to search out indications of where the IoT is being used at such an event. I’m sure there will be many!

The IoT is an ever-evolving topic. I hope you will take the insights our authors have provided and then examine how the IoT is being used in your own organization, in those you travel to and through, and in those with whom you do business. And please give us your feedback. We want to know what you think about the IoT and how you may already be using it!

ENDNOTES

1. Ashton, Kevin. “That ‘Internet of Things’ Thing.” RFID Journal, 22 June 2009.

2. How much data is in a zettabyte? A LOT! See a nice graph showing you how much in the following recent article: Lacey, Stephen. “The Zetabyte Era: An Illustrated Guide to the Energy-Hungry Digital Universe.” Greentech Media, 14 August 2013 (www.greentechmedia.com/articles/read/the-zetabyte-era-an-illustrated-guide-to-the-energy-hungry-digital-universe).

3. Thingsquare (http://thingsquare.com).

4. For a discussion of what can be done, listen to the following podcast: Higginbotham, Stacey. “The History of the Internet of Things Includes a Swedish Hockey Team and LEGOs.” Gigaom, 16 May 2013 (http://gigaom.com/2013/05/16/podcast-the-history-of-the-internet-of-things-includes-a-swedish-hockey-team-and-legos).

5. For example, see Lifx (http://lifx.co).

6. See more about Privacy by Design (PbD) Ambassadors in particular, and PbD in general, at www.privacybydesign.ca/index.php/ambassador/rebecca-herold.

7. “About Thingsquare.” Thingsquare (http://thingsquare.com/about).

Rebecca Herold, CISSP, CISA, CISM, FLMI, CIPP/IT, CIPP/US, is a Senior Consultant with Cutter Consortium’s Business Technology Strategies practice. She is also CEO, The Privacy Professor, and Partner, Compliance Helper. Ms. Herold has more than two decades of privacy and information security experience and has provided information security, privacy, and compliance services to organizations in a wide range of industries throughout the world. She has been named one of the “Best Privacy Advisers” multiple times in recent years by Computerworld, most recently ranking #3. In 2012, Ms. Herold was named one of the most influential people and groups in online privacy by Techopedia and recognized as a “Privacy by Design Ambassador” by the Data Privacy Commissioner of Ontario, Canada. Her blog has been listed in the “Top 50 HIPAA Blogs” by Medicine|e-Learning. Ms. Herold is a member of the editorial advisory board for Computers & Security. She is also an adjunct professor for the Norwich University Master of Science in Information Assurance program and the author of 15 books. She can be reached at [email protected].


The Internet of Things: Data Collection and Its Impact on Your Privacy
by Hugh A. Boyes

YES, YOU ARE BEING WATCHED

Judging by recent press coverage, privacy is very much a universal issue. The response to allegations of hacking and spying by public- and private-sector organizations demonstrates that people are concerned about their privacy. However, this coverage does not address the extensive monitoring of our Internet usage by service providers and commercial organizations. Few Internet users will appreciate the extent to which their online behavior is harvested for analysis, profiling, and sharing. This collection and analysis of usage can extend well beyond your browsing behavior, with software in devices reporting on usage, location, and performance, whether through smartphone applications, video games, metadata from digital cameras, or applications running on personal computing devices (i.e., the embryonic Internet of Things [IoT]).

Many organizations rely upon users agreeing to such collection and retention of information through acceptance of long, complex terms and conditions written in legalese. When users tick the box to accept the terms and conditions, how often do they really understand the implications? Fortunately, politicians and regulators are starting to wake up to this issue, as reflected by the European Union (EU) proposal for a new data privacy framework.1 This article considers the issues arising from data collected and processed in the IoT. It discusses a range of data collection and processing techniques that can lead to privacy or security threats.

WHAT IS THE INTERNET OF THINGS?

An embryonic Internet of Things already exists. It comprises the mobile devices and smartcards we carry around in our bags and pockets. With developments like Google Glass, you may soon be wearing part of it. The IoT also exists in our built environment, represented by IP-based devices such as CCTV cameras, sensors, and wireless devices (access control smartcards, RFID-tagged items, etc.). This connectivity generates a huge amount of data, including:

• Transaction data
• Web server logs
• User data (account details, preferences, and likes)
• User-generated content (blogs, tweets, instant messages, and uploaded multimedia)
• Device location data
• Status information
• Session-related metadata

Sometimes referred to as Big Data, this profusion of data contains the digital footprints created as we interact with the Internet and systems around us. Even if a transaction or content is encrypted, the communications metadata may provide a useful indication of the nature of the communications.

Over the next decade, Internet connectivity will touch virtually all aspects of our lives. For example, companies like Ford are working on connecting cars to the Internet, initially for infotainment, but in the longer term to optimize fuel use, predict your travel route on the fly, and enable vehicle-to-vehicle (V2V) communication.2 Developments such as intelligent buildings, smart grids, intelligent transport systems, and smart cities will further expand the IoT. Their business cases are based on using IT to deliver economic and environmental benefits and to improve people’s experience. Supporting these “smart” innovations, an array of sensors and data collection devices will be used to profile the operation of buildings or systems, to determine user preferences, and to establish patterns of use or behavior.

Some data collection will be overt, such as recording purchases, adding transaction data to a user’s online account, or setting a cookie to reflect user preferences. However, there will also be extensive hidden data collection for marketing and systems management purposes. This wealth of data and an increasing blurring of the physical and virtual worlds create a variety of threats to the privacy of users and the security of individuals and organizations.

WHAT IS YOUR BROWSER TELLING ME?

Before examining ways your privacy can be compromised by the IoT, let’s start by reviewing how much data can be gathered from your browser. As demonstrated by websites like Support Details,3 a simple Web page with some supporting JavaScript can collect and display information related to your computer, including its operating system, browser type, IP address, screen resolution, and color depth. Other more sophisticated websites provide an extensive range of tools to extract information from a browser. For example, the BrowserSpy website4 lists 75 different tools to display information about the capabilities of your browser and/or computer.

If your browser configuration is rare or unique, websites may be able to track you, even if you limit or disable use of HTML cookies. The Panopticlick Project website, run by the Electronic Frontier Foundation (EFF), tests the uniqueness of your browser.5 Its tests are based on the information your browser shares with sites you visit. Panopticlick’s site determines how many bits of identifying information are disclosed, allowing it to create a fingerprint for your browser. Using these fingerprinting techniques, a website can differentiate between browsers visiting from a shared IP address, whether they belong to different individuals on the same device or to different devices. This EFF project also explains how a small number of facts about a person can allow him or her to be uniquely identified.6
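The fingerprinting idea described above can be sketched in a few lines. This is a minimal illustration, not Panopticlick’s actual implementation: the attribute values are invented, and real fingerprinting gathers far more signals (fonts, canvas rendering, plugin versions). The key points are that hashing the attributes a page can freely read yields a stable identifier with no cookie involved, and that the rarity of an attribute combination, expressed as its surprisal in bits, measures how identifying it is.

```python
import hashlib
import math

def fingerprint(attributes: dict) -> str:
    """Hash the attributes a site can read into a single stable fingerprint."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()

def bits_of_identifying_info(population_share: float) -> float:
    """Surprisal of an attribute combination seen in that share of browsers."""
    return -math.log2(population_share)

# Attributes any page can read via JavaScript (navigator/screen objects);
# the values here are invented for illustration.
browser = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1; rv:23.0) Gecko/20100101 Firefox/23.0",
    "screen": "1920x1080x24",           # resolution and color depth
    "timezone": "UTC-5",
    "plugins": "Flash 11.8;Java 7u25",  # plugin lists are highly distinguishing
}

print(fingerprint(browser)[:16])  # stable identifier, no cookie required

# If only 1 in 300,000 browsers shares this combination, it carries
# log2(300000) = about 18.2 bits, enough to single out one browser
# among roughly 300,000.
print(round(bits_of_identifying_info(1 / 300_000), 1))
```

Note that the hash changes whenever any attribute changes (e.g., a browser upgrade), which is why real trackers also use fuzzy matching across fingerprint versions.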

Whilst browsing the Internet, you may be giving away further information, such as your location. For a laptop or PC user, location information may be based on your Internet point of presence or WiFi access point, or you may have given this information to your browser. For users of tablet devices or smartphones, it may be derived from the device’s GPS capability. Therefore, as you access your social media account from various computers (home, office, hotel, friend’s house) or connect to WiFi at these locations, the website’s operator can create a trail of IP addresses linked to your account identity. This location data may be stored by a website’s content management system, stored in website logs, or collected by third parties.

The EU tried to address concerns about Web privacy through the e-Privacy Directives.7 These directives cover the use of cookies and similar technologies for storing information, and accessing information stored, on a user’s equipment, such as his or her computer or mobile device. However, as already illustrated, browser fingerprinting can reduce the need for websites to employ cookies to track users.

FOLLOWING YOUR INTERNET FOOTPRINTS

The previous section demonstrated how easy it is for websites to collect data about a user, to uniquely identify a browser fingerprint, and to collect location information. In addition, many websites encourage users to create and log into accounts. This can be to view content, allow posting, set preferences, or manage transactions such as purchases or bookings. However, any browser-based Web activity should be assumed to leave an activity trail regardless of whether a user is logged in or not, particularly as such sites often encourage use of “remember me” functionality.

There is extensive collection of data by advertising and Web marketing companies using a variety of tracking techniques. With tools such as Ghostery,8 it is possible to see which third parties are monitoring your online behavior on a website. Recent visits to newspaper websites showed that between 15 and 30 third-party organizations were interacting with the pages, delivering advertising content and collecting usage or browser data.

In the IoT, the devices connected to the Internet will be communicating with applications in a similar fashion to your browser. Network connections will be established to exchange data or perform specific transactions. These sessions are likely to be logged in a number of ways, both by the parties exchanging data and potentially by any intermediaries, such as Internet or communications service providers.
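To see how a trail of IP addresses becomes a trail of places, consider what an operator can reconstruct from ordinary session logs alone. Everything below is invented for illustration (the accounts, IP addresses, and the toy geolocation table); a real access log would carry far more, such as user agents, URLs, and referrers.

```python
from collections import defaultdict

# Hypothetical session log entries: (timestamp, account, client IP).
log = [
    ("2013-08-01T08:02", "alice", "203.0.113.10"),   # home broadband
    ("2013-08-01T09:15", "alice", "198.51.100.22"),  # office network
    ("2013-08-01T19:40", "alice", "192.0.2.77"),     # hotel WiFi
    ("2013-08-01T09:20", "bob",   "198.51.100.22"),  # same office as alice
]

# Coarse IP-to-place lookup, standing in for a geolocation database.
ip_location = {
    "203.0.113.10": "Arlington (residential)",
    "198.51.100.22": "Boston (corporate)",
    "192.0.2.77": "New York (hotel)",
}

def movement_trail(entries):
    """Rebuild each account's daily movements purely from session logs."""
    trails = defaultdict(list)
    for ts, account, ip in sorted(entries):
        trails[account].append((ts, ip_location.get(ip, "unknown")))
    return dict(trails)

for account, trail in movement_trail(log).items():
    print(account, "->", " / ".join(place for _, place in trail))
```

Nothing here requires cookies or consent: the server sees the source IP of every request, and linking those IPs to an account identity is enough to chart where the account holder has been.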

An Internet-connected car that is communicating with a dealership or garage that maintains it will provide traceable information, both in metadata related to its communications and in the content transferred. With the rollout of IPv6, such a vehicle will probably have a dedicated IP address. It may communicate manufacturer-specific information (e.g., the Vehicle Identification Number [VIN]), as well as location, speed, and performance data. The vehicle will be linked to a specific owner and will probably store driver preferences. Using this data, you may be able to identify who is currently driving the vehicle. This information may be communicated in real time, or logged and periodically uploaded, to allow service providers to monitor vehicle condition and performance. In the home, we may see similar communications occurring from a variety of Internet-connected consumer appliances; for example, to allow reordering of groceries, access to online entertainment, and monitoring of energy consumption.
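The claim that vehicle telemetry can reveal who is currently driving can be made concrete with a small sketch. The telemetry frame, field names, and driver profiles below are all invented; no actual manufacturer protocol is implied. The point is that stored per-driver preferences, matched against a live frame, act as an identifier even when no name is transmitted.

```python
import json

# Hypothetical telemetry frame an Internet-connected car might upload.
frame = json.loads("""{
    "vin": "1HGBH41JXMN109186",
    "timestamp": "2013-08-05T17:42:10Z",
    "position": {"lat": 42.4153, "lon": -71.1565},
    "speed_kmh": 54,
    "seat_position_mm": 280,
    "radio_preset": "WBUR 90.9"
}""")

# Preferences the vehicle remembers per driver (also invented).
profiles = {
    "owner":  {"seat_position_mm": 280, "radio_preset": "WBUR 90.9"},
    "spouse": {"seat_position_mm": 310, "radio_preset": "WXRV 92.5"},
}

def likely_driver(frame, profiles):
    """Match the live frame against stored preferences to infer the driver."""
    for name, prefs in profiles.items():
        if all(frame.get(k) == v for k, v in prefs.items()):
            return name
    return "unknown"

print(likely_driver(frame, profiles))  # preferences alone reveal who is driving
```

Combine this inference with the VIN (which ties the vehicle to a registered owner) and the position field, and an intercepted or retained frame places a named individual at a specific location and time.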

DATA STORAGE AND MANIPULATION

Major retailers already collect large volumes of detailed customer and transaction data through their loyalty card schemes and point-of-sale systems. The IoT will enable even greater volumes to be acquired in a highly automated fashion, often on the pretext of improving customer choice or service. This huge volume of data generated, communicated, processed, and stored by the IoT will be of interest to both public and private organizations. However, there are significant privacy and security issues arising from this data acquisition and manipulation.

Issues are likely to arise for one or more of the following reasons:

• Aggregation of data from a single source or through fusion of data from multiple sources
• Correlation of data
• Physical/virtual bonding (i.e., the association of a logical address with a physical asset or location)
• Promiscuous and/or invasive applications
• Preferences and account settings
• Weak system or user security

Data Aggregation and Fusion

Individual pieces of information seen in isolation may

have few or no implications regarding an individual’s

privacy or security, but aggregation of a number of

items can present a different picture. For example, if individual transactions such as purchases of foreign currency, travel insurance, airline tickets, hotel reservations, and the like were linked to information about a purchaser's home address and details of any high-value consumer goods the individual owns, the combined profile could represent an attractive target for criminals. Given the increasing

diversity of products and services available from major

stores and supermarkets, this information could be

available in a single company’s data warehouse.

Collection and aggregation of this data may be justified

as a means of identifying suitable special offers based on

detailed profiling of a customer’s behavior and interests.

The situation becomes more complex when a retailer

provides information to or exchanges data with “trusted

third parties”; for example, to enhance their service

offerings or for credit-checking purposes. Permission to

exchange data is often requested in standard terms and

conditions for customer accounts and loyalty cards.

Through data-matching techniques, two disparate data

sets may be merged to provide a richer picture of the

customer. However, would you want, or knowingly

agree to, an Internet dating site sharing information

about your partner preferences with supermarkets or

department stores?
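The data-matching step described above can be sketched in a few lines. This is a hypothetical illustration, not any retailer's actual system; the records and the shared email key are invented for the example:

```python
# Sketch of data matching: two unrelated data sets merged on a shared
# identifier (an email address) to build a richer customer picture.
# All records are hypothetical.

retail = {
    "alice@example.com": {"purchases": ["travel insurance", "suitcase"]},
    "bob@example.com": {"purchases": ["groceries"]},
}

dating = {
    "alice@example.com": {"partner_preference": "non-smoker"},
}

def match(*sources):
    """Merge records from several sources keyed on the same identifier."""
    merged = {}
    for source in sources:
        for key, record in source.items():
            merged.setdefault(key, {}).update(record)
    return merged

profiles = match(retail, dating)
# Alice's merged profile now combines retail and dating-site data.
```

The merge itself is trivial; the privacy question is whether the customer ever meaningfully consented to the two sources being joined.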

In the IoT, the picture of a customer may be augmented

with data collected from Internet-connected personal

and domestic devices, combining data from the Internet

with data that originates outside the Internet but is now

incorporated within it. This provides a much more

detailed picture of the customer, but also creates signifi-

cant privacy and security issues. For example, if your

fridge scans contents and usage, would you want it

to collect and store the presence of medications? This

could reveal a personal medical condition. Even if this

is desirable for prescription reordering purposes, how

would you limit who has access to this data, and what

security controls would protect the information in tran-

sit and storage?

Data aggregation and fusion issues also arise from

the presence of unique identifiers in the data sets.

These identifiers could be information such as an

email address, credit card number, physical address,

or the use of information specific to a device, such as

its IP address, location, and so on. Even if the true

name or identity of a customer or account holder is


not verifiable, the availability of unique identifiers

can still allow profiles to be built.
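A minimal sketch of how a stable identifier substitutes for a verified name; the device identifiers and events below are invented for illustration:

```python
# Sketch: even without a verified name, any stable identifier (a device
# MAC address, an email hash, an IP address) is enough of a key to
# accumulate a behavioral profile. Identifiers and events are invented.

from collections import defaultdict

events = [
    {"device": "aa:bb:cc:dd:ee:ff", "event": "visited pharmacy site"},
    {"device": "aa:bb:cc:dd:ee:ff", "event": "bought travel insurance"},
    {"device": "11:22:33:44:55:66", "event": "streamed a film"},
]

profiles = defaultdict(list)
for e in events:
    # No name is needed: the identifier alone groups the behavior.
    profiles[e["device"]].append(e["event"])
```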

Correlation and Correlation Errors

Use of correlation techniques allows a data processor

to combine, with a reasonable degree of certainty,

two data sets that do not have a common key or set

of unique identifiers. For example, the use of ZIP

code or postcode data may allow data about assets,

purchases, and/or preferences to be related to individ-

uals or households. Users or customers may be unaware

that these connections are being made and might be

concerned if the data relates to sensitive personal infor-

mation, such as a health or employment status issue.

Examples could include:

• An innocuous tweet where geolocation data places a Twitter ID at a specialist clinic (indicating a particular health problem), where the user can potentially be linked via the Twitter ID to a named individual (e.g., through posts or links on another social media site)

• Automatic tagging of an individual in a photograph that involves socially unacceptable or illegal behavior, which is then posted on a social media site (jeopardizing an individual's continued employment in a sensitive role)
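The correlation technique described above — linking two data sets through a coarse shared attribute rather than a unique key — can be sketched as follows. The postcodes and residents are invented, and the deliberately ambiguous match shows how erroneous associations arise:

```python
# Sketch of correlation without a shared unique key: records are linked
# through a coarse attribute (a postcode). Matches are probabilistic,
# so erroneous associations are possible. All data is hypothetical.

purchases = [
    {"postcode": "SW1A 1AA", "item": "specialist medication"},
    {"postcode": "M1 2AB", "item": "garden furniture"},
]

households = [
    {"postcode": "SW1A 1AA", "resident": "A. Smith"},
    {"postcode": "SW1A 1AA", "resident": "B. Jones"},  # same postcode
]

def correlate(purchases, households):
    """Pair each purchase with every household sharing its postcode."""
    links = []
    for p in purchases:
        for h in households:
            if p["postcode"] == h["postcode"]:
                links.append((h["resident"], p["item"]))
    return links

links = correlate(purchases, households)
# Both residents get linked to the medication purchase; at least one
# of those links is wrong — a correlation error.
```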

There is a risk that erroneous connections may be made.

The severity of this risk depends on the techniques

used to correlate or associate the data sets. For example,

in legal cases where an individual is being accused

of illegal file sharing or theft of data, there is a need

to associate the alleged activity with an individual

and not simply an IP address. There are a number of

reasons why relying on the IP address may be unreli-

able. Erroneous associations could also occur through

the sharing of the access devices (e.g., laptop, tablet

device, or games console) or through the sale and

reuse of secondhand devices. Issues also arise if the IP

address belongs to an unsecured wireless broadband

router that is accessible to a number of individuals,

including visitors and passers-by. Relying on Web

account information is unreliable if the identity of the

user has not been verified, such as by reference to a

bank account, credit card, or some form of official ID.

Physical/Virtual Bonding

This issue relates to the linking of a physical device (e.g.,

a smartphone or other consumer/domestic device) to

its logical identity (MAC address or IP address) and to

a particular user (or users), location, or organization.

Unfortunately, such linkages cannot be regarded as

permanent; the logical identity of the device may be

changed or spoofed, the device may no longer be in the

location to which it was delivered or installed, or there

may have been a change of ownership. Unless the

physical/logical relationship can be periodically verified,

there is a potential for usage information to be associated

with the wrong person, location, or organization. If

the verification requires human intervention, such as

inspection or registration, there is a significant risk that

the process will not be properly fulfilled at some point

during the asset’s lifecycle. Relying upon this relationship

to record user behavior, buying patterns, and so forth

allows incorrect inferences or conclusions to be drawn.
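A sketch of why a physical/virtual bond needs periodic re-verification. The registry, MAC address, and 90-day threshold are hypothetical choices for illustration only:

```python
# Sketch: a registry that bonds a logical identity (a MAC address) to a
# user. Without periodic re-verification the bond can go stale and usage
# gets attributed to the wrong person. All details are hypothetical.

from datetime import datetime, timedelta

registry = {
    "aa:bb:cc:dd:ee:ff": {
        "user": "original owner",
        "verified": datetime(2013, 1, 1),
    },
}

MAX_AGE = timedelta(days=90)  # arbitrary re-verification interval

def attributed_user(mac, now):
    """Return the bonded user, or None if the bond is too old to trust."""
    entry = registry.get(mac)
    if entry is None or now - entry["verified"] > MAX_AGE:
        return None  # bond unverified: don't attribute behavior to anyone
    return entry["user"]
```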

Promiscuous or Invasive Applications

A significant threat to personal privacy and security is

the behavior of applications, whether these are based on

Web servers or downloadable onto a tablet device or

smartphone. For example, in social media applications,

there has been a tendency to share information by

default and to make the setting and maintenance of pri-

vacy controls overly complex. These promiscuous appli-

cations often invite you to connect with “friends” based

on your email contact information. Indeed the “friend”

may be automatically invited to connect with you. This

can be a source of embarrassment where relationships

have broken up or a source of danger in domestic vio-

lence or witness protection situations. There is also the

risk that applications will augment items you are shar-

ing; for example, by automatically applying geotags to

pictures, posts, and tweets.

There are some privacy-invasive applications and

website scripts that harvest user data and submit it to

their developers; for example, the recent discovery by

Symantec that the Facebook application for Android

leaked the device’s phone number.9 This occurred the

first time a user launched the application, even before

logging in, with the phone number being sent over the

Internet to Facebook servers. Whilst this particular prac-

tice has apparently stopped, it is symptomatic of an atti-

tude taken by some developers toward user privacy and

data protection rules. An obvious question raised by this

incident is why Facebook thought it needed to collect


mobile phone numbers in the first place. If there was a

valid reason to do so, then why not ask for the informa-

tion rather than collect it without permission?

In the IoT, one reason for connecting a device to the

Internet will be to improve customer support and

services by collecting the device’s performance data

or usage information. Depending on the volume, fre-

quency, and nature of the data collected, this can be

very invasive. For example, smart meters can provide

energy consumption reports every 15 minutes. This

has led to privacy and security concerns regarding

building occupancy and occupants’ behavior.10 Another

example is automated reporting of faults, bugs, or

usage. Although often presented as anonymous, this

can involve the transmission of a comprehensive range

of data about the device, its configuration, and the soft-

ware environment. This may be an issue if the level of

detail provided allows a device to be uniquely identi-

fied, or if the report includes any snapshot of the file

or data being processed at the time.
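The occupancy concern raised by 15-minute smart-meter reporting can be illustrated with a toy inference. The readings and the consumption threshold are invented, and real analyses are far more sophisticated:

```python
# Sketch: why fine-grained smart-meter readings worry privacy advocates.
# A trivial threshold on consumption (kWh per 15-minute interval) hints
# at whether anyone is home. Readings and threshold are hypothetical.

readings = [0.05, 0.06, 0.05, 0.90, 1.10, 0.95, 0.07, 0.06]

BASELINE = 0.2  # above this, appliances are probably in active use

def occupancy_profile(readings, baseline=BASELINE):
    """Mark each interval as probably occupied (True) or empty (False)."""
    return [r > baseline for r in readings]

profile = occupancy_profile(readings)
# The contiguous run of True values suggests when occupants were active.
```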

Preference or Account Settings

When users set up website or application accounts, they

are often asked to supply a wide range of personal data.

Some is strictly factual (name, birthday, address, etc.),

while other items may relate to usage or behavioral

preferences. In the EU, data protection regulations

require the collection of personal information to be

proportionate and relevant. However, many popular

websites are hosted and managed outside of the EU’s

legal jurisdiction, thus reducing the effectiveness of

these regulations. Additional personal data and meta-

data about an Internet-connected device can provide

further insights into the preferences, behavior, and

activities of a user or organization. This may compound

the issues regarding aggregation and correlation of data

discussed above.

Weak Security

In addition to the privacy and security issues arising

from the collection, storage, and processing of our data,

the frequency of security breaches on major websites

exacerbates the situation.11-14 These breaches are often

reported as involving the exposure of login credentials,

such as usernames, email addresses, and passwords.

However, it is likely that in many cases a hacker will

also have gained access to other personal data, such

as account data and transactions. With an increasing

volume of data being collected and stored as part of

the IoT, personal data may represent an increasingly

attractive commercial target.

Given the tendency of users to use the same or very

similar passwords for access to a number of website

accounts, the compromise of a poorly secured website

can have implications for user accounts on other sys-

tems. Security can be further compromised if websites

rely on commonly requested or easily determined

information for their security questions. For example,

commonly used facts such as home address, mother’s

maiden name, and place and/or date of birth are sus-

ceptible to social engineering attacks. With the increas-

ing number of accounts that we are likely to require in

the IoT, account security needs to be improved.
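A small illustration of why password reuse is dangerous: identical unsalted hashes link accounts across breached sites, whereas per-site salting breaks that linkage. This is a teaching sketch only; real systems should use per-user random salts and a slow key-derivation function such as bcrypt or PBKDF2:

```python
# Sketch: a reused password produces the same unsalted hash everywhere,
# so one breach exposes every account. Per-site salts break the linkage.
# Site names and the password are hypothetical.

import hashlib

def unsalted(password):
    return hashlib.sha256(password.encode()).hexdigest()

def salted(site, password):
    return hashlib.sha256((site + ":" + password).encode()).hexdigest()

reused = "correct horse battery staple"

# A breach at one site exposes a hash that matches the other site's.
same_everywhere = unsalted(reused) == unsalted(reused)

# With per-site salts, stored hashes differ even for a reused password.
differs = salted("site-a.example", reused) != salted("site-b.example", reused)
```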

CONCLUSION

As Internet users, our privacy and security are already

being undermined by the monitoring, tracking, and

analysis undertaken by commercial organizations,

whether as service providers or as owners of the net-

working infrastructure. The EU has tried to address

concerns about tracking using cookies and similar tech-

nologies, but these regulations are easy to circumvent

through some of the techniques discussed in this article.

The IoT is going to make it even easier to collect data as

the devices we own or interact with collect, store, and

process information about us.

We Internet users are often naive about the consent we

are giving when we sign up for an online account or

register a product online. We also make it easy for orga-

nizations to track us by failing to take the time to under-

stand and set user preferences and privacy settings in

applications. Thus we allow additional information to

be collected about our account usage, our location, and

relationships.

As the IoT develops, there will need to be a rethink of

privacy regulations and legislation to ensure that users

and organizations are provided with adequate safe-

guards. A failure to address these issues will make

the Orwellian surveillance state described in the book

1984 an increasing reality. In addition to taking steps to

improve privacy, we also need to encourage improve-

ments to the cyber security of the IoT. This is necessary

to ensure appropriate security is provided for the Big

Data produced by our interactions with the Internet.


ENDNOTES

1. Ashford, Warwick. "EC Publishes Proposed Data Protection Reforms." ComputerWeekly, 25 January 2012 (www.computerweekly.com/news/2240114326/EC-proposes-a-comprehensive-reform-of-data-protection-rules).

2. MacManus, Richard. "Checking Out Ford's 2013 Model Car Tech System." readwrite, 2 April 2012 (http://readwrite.com/2012/04/02/ford_2013_model_car_tech_system).

3. Support Details (http://supportdetails.com).

4. BrowserSpy (http://browserspy.dk).

5. Electronic Frontier Foundation (EFF) Panopticlick Project (https://panopticlick.eff.org).

6. Eckersley, Peter. "A Primer on Information Theory and Privacy." Electronic Frontier Foundation (EFF), 26 January 2012 (www.eff.org/deeplinks/2010/01/primer-information-theory-and-privacy).

7. Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users' rights relating to electronic communications networks and services (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2009:337:0011:01:en:HTML); Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:NOT).

8. Ghostery (www.ghostery.com).

9. "Norton Mobile Insight Discovers Facebook Privacy Leak." Symantec, 26 June 2013 (www.symantec.com/connect/blogs/norton-mobile-insight-discovers-facebook-privacy-leak).

10. "Warning Over Smart Meters Privacy Risk." BBC News, 12 June 2012 (www.bbc.co.uk/news/technology-18407340).

11. Rashid, Fahmida Y. "LivingSocial Password Breach Affects 50 Million Accounts." SecurityWatch (PC Magazine), 27 April 2013 (http://securitywatch.pcmag.com/news-events/310828-livingsocial-password-breach-affects-50-million-accounts).

12. Mustaca, Sorin. "Ubisoft Breached, Users Asked to Reset Passwords." Techblog (Avira), 3 July 2013 (http://techblog.avira.com/2013/07/03/ubisoft-breached-users-asked-to-reset-passwords/en).

13. Saita, Anne. "Drupal.org Resets Passwords after Data Breach." Threatpost (Kaspersky Lab Security News Service), 29 May 2013 (http://threatpost.com/drupal-org-resets-passwords-after-data-breach).

14. "LinkedIn Passwords Lifted." Rapid7 (www.rapid7.com/resources/infographics/linkedIn-passwords-lifted.html).

DISCLAIMER

Any views expressed in this article are those of the author.

They are not an official statement or policy position approved

or issued by the Institution of Engineering and Technology.

Hugh A. Boyes is the Cyber Security Lead at the Institution of

Engineering and Technology (IET), where he focuses on develop-

ing cyber security skills initiatives for engineering and technology

communities. Mr. Boyes has recently written a technical briefing

document for the IET on resilience and cyber security of technology

in the built environment (www.theiet.org/intelligent-buildings). His

career as a project and program manager encompassed a number

of public- and private-sector roles. He received a BSc degree from

Salford University and an MBA from Henley Management College.

Mr. Boyes is currently a Chartered Engineer and a Fellow of the IET,

and he holds the CISSP credential issued by (ISC)2. He can be reached

at [email protected].


In 2006, IBM received patent approval for an invention

called “identification and tracking of persons using

RFID-tagged items.” One stated purpose: to collect

information that could be used to monitor the move-

ments of a person. Radio frequency identification

(RFID) is a technology that uses radio waves to auto-

matically identify — wirelessly, contact-free, and with-

out visibility — objects or people that have an RFID tag

attached to them. It uses the radio frequency portion of the electromagnetic spectrum to uniquely identify objects. An RFID system consists of two parts:

1. A tag — known as an EPC (electronic product code)

tag — that contains an identification number.

2. A reader that works as a scanner, triggering the tag

to broadcast its identification number and transmit it

to a computer system. This RFID reader is simultane-

ously a data collection instrument, promiscuously

gathering information from each RFID that responds

to its broadcast, and a transmitter or broadcaster of

information, as it sends its data through a network

to a point where data is being further processed.
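The tag/reader interaction described above can be sketched as a small simulation. The classes and EPC values are hypothetical, and real readers implement an anti-collision air-interface protocol that is omitted here:

```python
# Sketch of the reader/tag interaction: the reader queries its radio
# range and promiscuously collects an identifier from every tag that
# responds, then holds the batch for upstream processing. EPC values
# are hypothetical.

class Tag:
    def __init__(self, epc):
        self.epc = epc

    def respond(self):
        # A passive tag answers any reader's query with its identifier.
        return self.epc

class Reader:
    def __init__(self):
        self.collected = []

    def scan(self, tags_in_range):
        """Query every tag in range and log each response."""
        for tag in tags_in_range:
            self.collected.append(tag.respond())
        return self.collected

reader = Reader()
reader.scan([Tag("EPC-0001"), Tag("EPC-0002")])  # no consent step involved
```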

Today RFID represents the beginnings of a new com-

mercial revolution. Corporations have a growing inter-

est in the technology, which currently serves a variety

of commercial purposes, including:

• Public transportation ticketing (e.g., VRR/VRS Card in North-Rhine-Westphalia in Germany, SL in Sweden, the SI Pass in Italy, OV-chipkaart in the Netherlands)

• Electronic toll collection (e.g., E-ZPass in the US, e-TAG in Australia, Liber-T in France)

• Parking permits (US state of Arizona)

• RFID credit card–enabled financial transactions (e.g., Mobile MasterCard PayPass, ExxonMobil)

• RFID-embedded casino chips (Spain, Las Vegas)

• Automatic payment ticketing (Madejski Stadium in the UK, FIFA World Cup in Germany)

• Applications in the healthcare sector (Jacobi Medical Center in New York City)

• Tracking of goods and objects (Germany, US, Japan, France)

RFID can be used in almost any situation where identification of an object or a person is needed, as long as the RFID chip is carried along. This means that new types

of computers will be increasingly omnipresent, invisibly

embedded in a person’s everyday environment. Rather

than explicitly being the “user” of a computer, an indi-

vidual will implicitly benefit from services running

between computers without even taking notice of them.

RFID: THE DARKER SIDE

Over the past few years, however, concerns about the

possible risks of using RFID have increasingly been

voiced, as this technology may permit retailers and

third parties with RFID readers to profile consumers’

choices and track their movements both inside and

outside of a store without their knowledge or consent.

RFID critics argue that corporations would be able to:

• Use hidden tags to conduct data processing without

a consumer’s consent. For purposes such as clandes-

tine inventorying or data mining, corporations may

collect information about a certain product and —

depending on the circumstances — about the person

carrying the product, without that individual’s

knowledge. RFID tags can be embedded into objects

and documents without the consent of the individual

who obtains those items. As radio waves travel easily

and silently through fabric, plastic, and other materi-

als, it is possible to read RFID tags sewn into clothing

or affixed to objects contained in purses, shopping

bags, suitcases, and so on (see Figure 1).1 This issue

was raised by Consumers Against Supermarket

Privacy Invasion and Numbering (CASPIAN), which

found data processing without the consumer’s con-

sent occurring with RFID-embedded products from

Gillette, Walmart, Benetton, and Tesco.2 Furthermore,


The Internet of Things and Consumers' Privacy Rights: The Case of RFID Technology

by Analía Aspis

IS THE FOX GUARDING THE HENHOUSE?


radio waves allow data to be processed over a given

distance without any need for a direct line-of-sight

link with the chip and without the consumer having

to take an active part in the process. Moreover, an

RFID application could collect large amounts of addi-

tional data. If a tagged item is paid for by credit card

or in conjunction with use of a loyalty card, for exam-

ple, then the unique ID of that item could be tied to

the identity of the purchaser. Finally, the product

could be linked with personal information such as

credit card details and even be used to collect such

information or link it with existing databases.

• Track, monitor, and profile consumers. If credit or

loyalty card information allows tagged items to be

linked with a particular consumer, the data transmit-

ted by the tags can reveal which products the con-

sumer purchases, how often those products are used,

and even where the products — and by extension the

consumer — are located. Such data could enable the

creation of an individual profile that could be used,

for instance, to evaluate a consumer’s worth to a

company (or, if stolen, could be sold to a third party).

Moreover, by placing RFID readers at select locations,

such as building entrances, readers are capable of

disclosing how consumers move through space. The

combined knowledge of consumers’ characteristics

and location allows businesses to target consumers

for differential treatment. By aggregating data to

form consumer profiles, corporations could make

inferential assumptions about a consumer’s income,

health, and lifestyle. This information could be sold

to other corporations for marketing purposes.

• Snoop on customers. The capabilities of RFID technology permit businesses to snoop into the lives of

customers in ways that were never before possible.

RFID-enabled loyalty cards permit businesses to

identify customers and change the prices of items

based on their purchasing profiles. For instance, a

clothing retailer could tag purchased garments with

customers’ credit card information and determine

how much money they are likely to spend as they

enter the store. In addition, sales representatives

could quickly target or avoid customers depending

on their historical purchasing habits.
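The tracking scenario in the list above can be sketched as a toy pipeline: reader sightings plus a loyalty-card linkage yield a named movement profile. All identifiers, locations, and times are invented:

```python
# Sketch: reader placement plus a loyalty-card linkage turns raw tag
# sightings into a named movement profile. All data is hypothetical.

sightings = [
    {"tag": "EPC-0001", "reader": "store entrance", "time": "09:01"},
    {"tag": "EPC-0001", "reader": "pharmacy aisle", "time": "09:05"},
    {"tag": "EPC-0001", "reader": "checkout", "time": "09:12"},
]

loyalty = {"EPC-0001": "customer 8472"}  # link made at a card-based sale

def movement_profile(sightings, loyalty):
    """Group reader locations, in sighting order, by the person behind each tag."""
    profile = {}
    for s in sightings:
        person = loyalty.get(s["tag"], "unknown")
        profile.setdefault(person, []).append(s["reader"])
    return profile

movements = movement_profile(sightings, loyalty)
```

Once the single card-based linkage exists, every subsequent anonymous tag read inherits the customer's identity.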

These concerns are even more pertinent as the technolo-

gies are combined, integrated, and connected — invisi-

bly and remotely — to networks, thereby forming part

of a wider movement toward a society characterized by

ubiquitous computing. The aforementioned fears have

already prompted a debate over RFID technology in

the US and the European Union (EU). While this article

focuses on the US framework, I will also briefly com-

pare the legal responses to RFID in the US and EU

settings.

THE RFID DEBATE: CONSUMERS VS. CORPORATIONS

Although companies tend to downplay the use of RFID

data, there is a conflict between the consumer interest

and the business interest. The Trans Atlantic Consumer

Dialogue3 has highlighted the opposing viewpoints of

industry and consumers on the RFID question: industry

essentially wants to maximize the use of RFID to boost

productivity and competitiveness and then deal with

any negative side effects on privacy; consumers, on the

other hand, place much higher importance on privacy

and protection and tend to regard the proliferation of

RFID as premature in the absence of effective regulatory

frameworks.

From the consumers’ perspective, there should be a

right to know and a right to choose. Consumers should

not be forced to accept RFID unless they benefit from

transparent, accountable RFID usage, with clear and

intelligible onsite information, labeling, and balanced

information campaigns. Built-in anonymity, consent

required for the collection of personal data, and the

ability to deactivate or remove tags are, among others,

the principal consumer privacy demands. Consumer

advocates argue that, absent these safeguards, the most

powerful actors will use tags to their own advantage,

increasing the capacity of multiple retailers to exercise

control over food suppliers, helping vehicle manufac-

turers maintain segmentation of national markets, and

enhancing vertical control through distribution chains,

to name just a few potential results. They claim that

RFID will intensify the asymmetry of information and

power between consumer and suppliers, giving those


Figure 1 — Illustration of the threat of clandestine RFID inventorying as it might emerge in the future. (Source: Juels.)


who wish to sell us things more information about us

and the offers to which we might be more susceptible.

For example, in the US, CASPIAN provides an

overview4 of the everyday threats that item-level RFID

poses for consumer privacy. Even a six-inch tag can be

hidden inside packaging, sandwiched within a layer of

cardboard, or embedded in shoes. Many smaller tags

can be hidden in fabric labels, checkout labels, or credit

cards. By placing a tag on a shopper loyalty card (which

already links into historical data on that individual’s

previous purchasing behavior), shops can design a per-

sonalized shopping experience, maximizing the selling

opportunity by offering specific point-of-sale promo-

tions, discounts, or coupons — all geared to maximizing

the store’s sales. RFID readers in the shop doorway can

identify customers as they enter, even reading through

their pockets or handbags; other readers can be placed

beneath the floor to read tags in shoes or fabric. The

IEEE5 is also very concerned about information being

used for secondary purposes unrelated to the original

reason for carrying or using the RFID-embedded card.

Because data in an RFID network involves little human

intervention and is acquired immediately during a

transaction — or even following a transaction — the

possibility of data aggregation and use for purposes

other than those intended is an issue that must be

addressed.

For its part, the business sector argues that respect for

consumers will always be assured because businesses

have a profit incentive to please them, and RFID can

contribute to such aims.6 For this sector, RFID is an

extremely promising technology that will be able to

produce countless possibilities throughout the value

chain from the producer to the consumer. The retail sector will be able to acquire information about sales,

the status of goods, and losses in real time, all without

having to manually count the goods. Concerning the

privacy debate, businesses in both the US and the

EU have developed the same line of reasoning. They

emphasize that widespread item-level tagging may

never materialize anyhow, while contending that any

international, national, or regional legislation (in the

US and EU, respectively) could unreasonably limit the

benefits of RFID. They further argue that it would be

ill-advised to attempt to regulate such a rapidly evolv-

ing technology.

LEGAL AND INDUSTRY POLICIES TO PROTECT US CONSUMERS’ PRIVACY IN AN RFID SCENARIO

In the US, existing privacy protection is a mishmash of

constitutional rights, state legislation, federal regulations, and common law torts. There is no single federal "data security" law, but rather a mosaic of laws that apply to particular privacy breach situations. More

precisely in the case of RFID, today’s US legal environ-

ment does not offer privacy protection against RFID

tagging of consumer goods, although some state-level

regulations for other uses of this technology exist.7

This means that as of now, there is no law that would

protect a person if he or she interacts with an RFID-

tagged good and, in this interaction, a personal data

or privacy breach occurs; nor is there a federal or state-

level law that would apply if a consumer finds that his

or her data was stolen or processed on the occasion of

an interaction with RFID technology in a store.

Even if individual state regulations were extended to product tagging in the future, such a patchwork would be inherently inefficient, because customers who purchase goods in states with RFID regulations and enforcement would not be protected from retailers located in states that lack such regulations. Moreover, since

RFID technology is used to track inventory nationally,

and manufacturers supply products to various states,

regulating at the state level interferes with efficient com-

merce. In the context of global information networks

and national and multinational information users, state

protection is of limited significance. Therefore, state

constitutional privacy rights have thus far been of little

help in the day-to-day protection of personal privacy.

As for the applicability of tort law to RFID systems,

individuals will have a difficult time bringing a valid

cause of action since they will not know when items

they carry are equipped with RFID tags and whether

information is being procured when they have a rea-

sonable expectation of privacy. Moreover, it is already

often difficult to meet the burden of proof required

for intrusion upon seclusion.8 Nevertheless, even if

this tort applies (though it generally does not when

the individual targeted is in a public place), it is not a

practical option for protecting privacy at the individual,

non-class-action level, where attorneys’ fees are not

recoverable. Accordingly, a meaningful solution to


RFID privacy invasions would seek to prevent harm

and not simply provide a means of redressing viola-

tions. In conclusion, without current laws actively

monitoring and regulating the actions of businesses’

RFID uses, the information gathering and aggregation

occurring as a result of RFID technology may expose

customers to harmful invasions of privacy. Although

state legislatures have begun efforts to legislate RFID

technology at the state level, and privacy advocates

have set forth numerous state legislative proposals,

federal regulation would ultimately be needed to

effectively protect US consumers.

In the absence of federal legislation, the Federal Trade

Commission (FTC) may exercise some control over

RFID. It currently permits companies to craft their

own guidelines concerning the use of customer data

collected through RFID technology. As part of self-

regulation, the FTC encourages businesses to inform

consumers of the existence of RFID tags in their prod-

ucts, the type of data that is being collected, and the

data's intended use. The FTC announced in March 2005 that it would allow companies using RFID to regulate themselves regarding matters of consumer privacy.9 Still,

section 5 of the FTC Act prohibits deceptive or unfair

acts or practices in or affecting commerce.10 Thus, if a

retailer promises that it will place labels on all goods

containing RFID tags and fails to do so, the FTC can

order the retailer to comply and, if necessary, can seek

judicial intervention.

Regrettably, FTC prosecution does not provide signif-

icant protection in RFID contexts because the FTC’s

authority is discretionary: the agency has limited

resources, and it may refrain from pursuing many of the

violations reported to it.11 Moreover, the FTC’s standard

enforcement procedure is to investigate charges carefully

and to negotiate with the transgressor a reprimand and

promise to reform. Such a prolonged process is not

suited to stopping the transmission of personal informa-

tion into unauthorized channels. Lastly, the FTC cannot

prosecute a company unless that company has voluntar-

ily undertaken the affirmative step of establishing a pri-

vacy policy that subjects it to the agency’s jurisdiction. If

a company simply avoids making any statements about

its privacy policies, it can implement RFID systems as it

deems appropriate.

For self-regulation to effectively address privacy

concerns, I believe two conditions are necessary:

1. Organizations must voluntarily adopt and implement

a set of privacy policies, compliance procedures, and

enforcement mechanisms.

2. Consumers must have confidence that an organiza-

tion is playing by the rules.

If these conditions were met, self-regulation would be

a proper solution to supplement regulatory measures,

particularly in areas of RFID’s implementation in the retail sector that are too specific to be addressed by legislation. Of course, this solution requires general agreement on the privacy rules that are going to govern RFID implementation, general compliance with them, an agreed dispute resolution mechanism, and authorities that would sanction any breach of such rules. For

instance, industry can take as an example the RFID Bill

of Rights prepared by Simson Garfinkel,12 which says

that consumers should have the following rights:

- To know whether products contain RFID tags

- To have RFID tags removed or deactivated when they purchase products

- To use RFID-enabled services without RFID tags

- To access an RFID tag’s stored data

- To know when, where, and why the tags are being read
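Garfinkel’s five rights amount to a compliance checklist that a retailer’s RFID deployment either satisfies in full or does not. As a rough sketch of that idea (the class, field names, and checking function below are my own invention, not part of the Bill of Rights), each right becomes a boolean the deployment must satisfy:

```python
# Hypothetical encoding of Garfinkel's RFID Bill of Rights as a checklist;
# all names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class RFIDDeployment:
    tags_labeled: bool        # consumers know which products contain tags
    removal_offered: bool     # tags can be removed/deactivated at purchase
    works_without_tag: bool   # services remain usable without RFID tags
    data_accessible: bool     # a tag's stored data is accessible to its owner
    readers_disclosed: bool   # when/where/why reads occur is disclosed

def respects_bill_of_rights(d: RFIDDeployment) -> bool:
    # Every right must be honored; a single failure breaks compliance.
    return all(vars(d).values())

store = RFIDDeployment(True, True, True, False, True)
assert not respects_bill_of_rights(store)   # fails on data access
```

The all-or-nothing check mirrors the article’s point that partial self-regulation (say, labeling tags but hiding reader locations) still leaves consumers exposed.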

Corporate compliance with privacy standards con-

stitutes an increasingly important differentiator in

competitive markets. More flexible, contextual, and

specific tools may offer better privacy protection

than an omnibus law and at potentially lower cost

to consumers, businesses, and society as a whole.

One specific self-regulatory system has been developed

by EPCglobal,13 a nonprofit joint venture between GS1

(formerly known as EAN International) and GS1 US

(formerly the Uniform Code Council, Inc.) to foster

worldwide adoption and standardization of EPC tech-

nology. The group’s nonbinding “Guidelines on EPC

for Consumer Products” encourage consumer notice,

choice, and education as well as certain record use,

retention, and security practices. The guidelines also define a set of identifiers that businesses can

affix to their products explaining that an RFID tag is

present and how to remove the tag after purchase. This

represents a first step in empowering customers to


make informed decisions about whether they are com-

fortable buying tagged products and gives the con-

sumer more control in the transaction, at least after

checkout. Of course, the choice offered by retailers

probably wouldn’t extend to customers who do not

want to be tracked via RFID while shopping in the

store prior to making a purchase.

Industry guidelines advocate education and recom-

mend that companies communicate how RFID tags

work; how they retain, use, and protect the information

they collect from customers; and how RFID technology

benefits the customer. This approach could be quite

helpful, since customers can make their best choices

about giving away their personal information if they

know exactly how that information will be used. Such

notice would be effective if it were communicated in an

up-front manner. It would be less effective if buried in a

licensing agreement or in the fine print on a receipt.

CONSUMER PRIVACY PROTECTION: HOW MUCH AND BY WHOM?

What to protect and how much to protect are at the

heart of the current debate over RFID technology and

consumers’ privacy. Concerning the current legal framework for privacy, even though the US and the EU share many values, their systems of privacy

protection with regard to RFID systems diverge most

sharply on how much they value privacy, especially in

competition with other goals, and on the appropriate

role of government in protecting it. European directives

are based on the stated belief that information privacy

is a basic human right, on par with the rights of self-

determination, freedom of thought, and freedom of

expression. As a result, EU legislation places extraordi-

nary value on privacy, requiring each entity that wishes

to collect, process, use, store, and/or disseminate per-

sonal information to register with its national data pro-

tection supervisory authority.

This scheme is anathema to the US constitutional sys-

tem, which so highly values freedom of expression and

of the press, freedom from government intrusions, and

protection of private property — and, frankly, places

less value on privacy. This suggests a core difference

between EU and US privacy protection: the extent to

which the government is responsible for protecting

information privacy.

Legal regulation of privacy in the US private sector is

largely limited to facilitating individual action; private

agreements and industry self-regulation are the primary

means of protecting consumers’ privacy in the face of

private actors. The FTC should work with industry

to encourage best practices and build on EPCglobal’s

set of standards to construct a more defined industry

response to privacy concerns based on notice and con-

sent. If consumers are to accept RFID technology, indus-

try will need to take privacy issues — particularly notice

and consent — seriously. Negative public reaction could

threaten RFID just as much as restrictive, premature leg-

islation. Meanwhile, retailers should label RFID-tagged

items with a recognizable EPC logo, indicate the location

of the tag on the product, and conduct a public informa-

tion campaign, such as handing out informative flyers or

posting signs in stores that have RFID tags. It will also

be important for industry to provide notice to consumers

when RFID readers are present.

In the information society, privacy rights must be granted. Consumers cannot participate in that society unless they can control their personal details and their release or retention. Consent has to be seen as a conscious, deliberate, and open act of trust, not something

presumed or extracted by subterfuge or a form of black-

mail. It is unacceptable for consumers to face a choice

between security and service; this is an illusory form of

autonomy, amounting to no more than a “take it or

leave it” approach from the provider.

Governments do play an important role in protecting

privacy, not only in the EU but in the US as well.

Despite the different approach to privacy in each legal

framework, government regulation should be consid-

ered an option for mitigating data protection threats.

The debate over more or less regulation risks missing

the point that good regulation can be a stimulus to

innovation.

To achieve user trust in RFID applications, three provi-

sions are required:

1. Effective tools that support users in protecting

their personal data and privacy

2. Information on how RFID is going to be implemented within each store

3. Consistent and enforceable rules to enhance

consumers’ privacy rights


After all, RFID will only be able to deliver its numerous

economic and societal benefits if effective guarantees

are in place on data protection, privacy, and the asso-

ciated ethical dimensions that lie at the heart of the

debate over the public acceptance of RFID.

ENDNOTES

1. Juels, Ari. “RFID Security and Privacy: A Research Survey.” IEEE Journal on Selected Areas in Communications, Vol. 24, No. 2, 6 February 2006.

2. Thiesse, Frédéric. “RFID, Privacy, and the Perception of the Risk: A Strategic Framework.” Journal of Strategic Information Systems, Vol. 16, No. 2, June 2007.

3. “RFID and Ubiquitous Computing: How to Ensure That RFID Also Serves Consumer Interests.” Meeting Report, Trans Atlantic Consumer Dialogue (TACD), 13 March 2007 (http://tacd.org/index.php?option=com_content&task=view&id=94&Itemid=1).

4. Albrecht, Katherine. “Supermarket Cards: The Tip of the Retail Surveillance Iceberg.” Denver University Law Review, Vol. 79, No. 4, Summer 2002.

5. “Developing National Policies on the Deployment of Radio Frequency Identification (RFID) Technology.” IEEE, 17 February 2006 (www.library.ca.gov/crb/rfidap/docs/IEEE-RFIDPositionStatement.pdf).

6. Hildner, Laura. “Defusing the Threat of RFID: Protecting Consumer Privacy Through Technology-Specific Legislation at the State Level.” Harvard Civil Rights–Civil Liberties Law Review, Vol. 41, No. 1, January 2006 (www.law.harvard.edu/students/orgs/crcl/vol41_1/hildner.pdf).

7. The states in question are: California, Georgia, Massachusetts, Missouri, New Hampshire, North Dakota, Rhode Island, Utah, and Wisconsin.

8. For example, in Nader v. General Motors, the court stated that only “overzealous public surveillance” is actionable. It is difficult to argue that a small RFID chip, sewn into clothing seams, is “overzealous” public surveillance. Therefore, it is inaccurate to conclude that using RFID technology to collect data is highly offensive when the information can be gathered by means that are not “overzealous.” Additionally, under the theory of intrusion upon seclusion, plaintiffs must prove damages, which could be difficult for this technology. See Nader v. General Motors Corp., 255 N.E.2d 765 (N.Y. 1970).

9. Collins, Jonathan. “FTC Asks RFID Users to Self-Regulate.” RFID Journal, 10 March 2005 (www.rfidjournal.com/articles/view?1437).

10. Commerce and Trade, U.S. Code, Title 15, Section 45 (2000).

11. The FTC has not taken any enforcement actions against any companies and has not compiled any statistics as to who is using RFID technology. Under the current self-regulatory scheme, it is unlikely that the FTC will ever take enforcement actions because it can only enforce regulations that a company sets for itself and subsequently violates.

12. Garfinkel, Simson L., Ari Juels, and Ravi Pappu. “RFID Privacy: An Overview of Problems and Proposed Solutions.” IEEE Security & Privacy, Vol. 3, No. 3, May-June 2005.

13. “Guidelines on EPC for Consumer Products.” EPCglobal, September 2005 (revised) (www.gs1.org/epcglobal/public_policy/guidelines).

Analía Aspis is a lawyer, consultant, and researcher on cyber law

and data protection at the University of Buenos Aires. She has

authored numerous articles on the legal aspects of IT, including

cyber crime, data security, e-justice, Internet governance, and the

Internet of Things, among others. Ms. Aspis is a frequent lecturer

on privacy and security and holds an MBA from the University of

Lausanne, Switzerland. She is the founder of Internet y Sociedad,

the first Latin American NGO devoted to pursuing initiatives and

research studies on technology and society. She can be reached at

[email protected].


Ubiquitous computing, ambient intelligence, and the

Internet of Things (IoT): the world is quickly transform-

ing into one in which intelligent and networked devices

will be everywhere, and everything we’ve previously

viewed as dumb will be infused with connectivity and

awareness. The ability of objects in our environment to

interact with us on this level presents a host of social

challenges, some of which we are just now facing. One

key challenge is how these objects will respect the pri-

vacy of the individuals they encounter. Addressing that

challenge requires an understanding of privacy and, in

particular, the role that notice and consent has played in

mitigating privacy risks. Once that foundation is laid,

this article provides some suggested mitigating controls.

Defining privacy carries certain risks. Clear consensus

on the meaning remains elusive. Information privacy

refers to the control of information about oneself.

Privacy law scholar Alan F. Westin provides perhaps

the most seminal definition:

Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.1

Westin’s definition clearly has a control bias, whereby

individuals or groups have the autonomy to make

decisions over information about them. However, it

does not quite cover the realities of living in a collective

society. We relinquish information frequently, thus

abdicating control to the recipient with an often tacit

understanding of how that information will be utilized.

This exchange occurs within a system of cultural norms

and laws that, while not absolutely controlling the

recipient’s behavior, punish those who breach such standards. When we share information with

our friends, doctors, and merchants, we have certain

expectations about how the information will be used.

Betrayal of those expectations injures the relationship.

Westin’s definition is also cracking under a modern

threat: the availability of information about us that was

never communicated by us. It may have been obtained

directly through observation or surveillance or gathered

indirectly by implication based on our activities. No

longer are we the primary source of information about

ourselves. If we are not the source, then how are we

to know what information is being collected, dissemi-

nated, or used? How do we exert any measure of con-

trol if we don’t even know which information we need

to control?

FAIR INFORMATION PRACTICES

In 1973, the then US Department of Health, Education,

and Welfare identified a set of principles, the Fair

Information Practices (FIPs), designed to govern

the nascent practice of collecting massive amounts

of data on individuals in government databases.2 The

principles were rooted in the notions of openness,

disclosure, secondary use, correction, and security. In

1980, the Organisation for Economic Co-operation and Development (OECD) adopted a variant of the FIPs that

included an individual participation principle. This prin-

ciple extended to the individual the right to view infor-

mation held about him or her and challenge the accuracy

or completeness of that information.3 More recently, the

US Federal Trade Commission (FTC) has encouraged, at

least for commercial entities, the adoption of its five Fair

Information Practice Principles (FIPPs):4

1. Notice/awareness

2. Choice/consent

3. Access/participation

4. Integrity/security

5. Enforcement/redress

The adoption of a notice and choice model in the commercial sector is distinguishable from government use of data in that it presumes informed consent

is a necessary component of a free market exchange. In

order for consumers to voluntarily engage in commerce,

they must know what they are giving up in terms of

information and consent to that exchange. Failure to

properly notify a consumer results in an asymmetric

exchange that can give the merchant an unfair advan-

tage over the consumer. Erroneous, misleading, or

incomplete disclosure by the merchant could also be


Notice and Consent in the Internet of Things
by R. Jason Cronk

DO ASK, DO TELL


deemed deceptive. In the US, FTC jurisdiction extends

to these unfair and deceptive trade practices; hence, the

FTC promotes FIPPs to give businesses guidelines to

help them avoid such an adverse determination.

GENERAL FAILURES OF NOTICE AND CONSENT

The concept of notice and consent pervades modern pri-

vacy law and regulations. The efforts are laudable, but

our cognitive limits and the rapid pace of advancing

technology are straining their usefulness.

In the context of commercial interaction, the most obvi-

ous difficulty exists in providing sufficiently detailed

notification as to exactly what information is collected,

how it will be used, and how it will be disseminated

to others. Privacy notifications are either overly broad,

perpetuating an asymmetric information exchange

between business and consumer, or so detailed as to

render them costly to review.5 Whether the notifications

are detailed or broad, consumers rarely have sufficient

context to fully understand the activities of the business,

let alone the future ramifications of their data sharing.

A statement such as “your data will be used for market-

ing purposes” is meaningless unless the consumer is

familiar with the standard practices of that industry or

that particular business. Similarly, for a consumer who

is not savvy about Big Data analysis and the advertising

industry, a statement such as:

Your data will be shared with behavioral marketing firms who will aggregate information about you from multiple sources to create a digital dossier allowing for predictive analysis of your future purchases, based on correlative behaviors in a larger sample population. That analysis will be used to target tailored, high-conversion-rate ads at you.

... might result in better understanding of the practices

of the business, but not necessarily in full appreciation

of the actual risks such practices entail. Even with this

disclosure, could a 16-year-old girl realistically be pre-

sumed to understand that such a statement would lead

to Target (the discount retailer) sending her ads for

baby products, thereby revealing a pregnancy to her

parents before she opted to do so?6 Could a sophisti-

cated consumer foresee that risk?

Even with simple collection of information, people are

not rational in their decision making about privacy.

Work by behavioral researcher Alessandro Acquisti has

demonstrated that the way choices are presented to

people alters their decision-making process.7 In one

experiment, mall patrons were given a $10 gift card and offered a $12 gift card instead if they

provided additional shopping information about

themselves, which is called willingness to accept reduced

privacy for remuneration. Other patrons were given the

$12 gift card and told they could give up the $12 card

for an anonymous $10 card, which is called willingness

to pay for increased privacy. Different scenarios were

included as controls to make sure the patrons were

paying attention. When patrons were given the ability

to trade privacy for an increase in benefits, only about

half of the patrons did so. However, in the economically

equivalent experiment, where they had to give up $2

to maintain privacy, less than 10% chose that option.8
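The striking part of this result is that the two framings are economically identical. A small sketch makes that equivalence explicit (this is my own construction for illustration, not code from the study; the reported percentages are taken from the article’s description):

```python
# The two framings in the gift-card experiment lead to the same menu of
# end states; only the starting point (the "endowment") differs.
def final_position(keep_privacy: bool) -> tuple[int, str]:
    """End state is the same whichever card a patron started with."""
    return (10, "anonymous") if keep_privacy else (12, "identified")

# Framing A (willingness to accept): start anonymous at $10, may upgrade
# to $12 by disclosing shopping data.
# Framing B (willingness to pay): start identified at $12, may give up $2
# to become anonymous.
# Either way, the available outcomes are ($10, anonymous) or ($12, identified):
assert final_position(True) == (10, "anonymous")
assert final_position(False) == (12, "identified")

# Yet behavior diverged: roughly half traded privacy away under framing A,
# while fewer than 10% paid $2 to keep it under framing B.
```

Since the payoffs are identical, the divergence in choices can only come from how the option was framed, which is exactly the bias the experiment was designed to expose.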

From Acquisti’s work, we can surmise that people’s val-

uation of privacy depends on whether they have it in

the first place. This directly affects the consent dynamics

because consent may be biased depending on the initial

condition. When consent to share information is cost-

less, other biases, such as the default bias, may appear.

The most frequent bias cited in discussing privacy is

hyperbolic discounting, also known as present bias.

People are likely to discount a future benefit, cost, or

risk against a present benefit. In the context of a deci-

sion involving sharing information and the future risks

such sharing entails, a person’s ability to rationally

weigh risks versus current benefit is questionable.

Those future risks and costs will be discounted in favor

of current gain. This throws a wrench into the notion of informed consent. Even if one is fully informed and

consent is freely given, will a person be biased against

his or her own best interest? How voluntary is our con-

sent if our brains are unable to make the best decisions

for us?
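Present bias can be made concrete with a small numeric sketch. This is my own illustration, not the article’s: it uses the common hyperbolic form V = A / (1 + kD) (Mazur’s model), and the dollar amounts, delay, and discount rate are all invented for the example:

```python
# Illustrative arithmetic only: hyperbolic (present-biased) discounting.
def hyperbolic(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Perceived present value of an amount felt delay_days from now."""
    return amount / (1 + k * delay_days)

benefit_now = 2.0     # e.g., a $2 incentive for sharing data today
future_harm = 20.0    # assumed eventual cost of a privacy breach
perceived_harm = hyperbolic(future_harm, delay_days=365.0)  # about 1.04

# The distant harm, though ten times the immediate gain, is discounted
# below it, so "consenting" feels rational in the moment:
assert perceived_harm < benefit_now
```

Under these assumed parameters, a harm ten times larger than the benefit shrinks to roughly half the benefit’s size once it sits a year away, which is precisely why consent given today can work against one’s future interest.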

In his paper on privacy self-management, law professor

Daniel Solove identifies the preceding problems — plus

a few more — with the notion of consent.9 The unin-

formed consumer problem (through an unwillingness to

read or lack of understanding) results in consent that is

perfunctory and lacking in substance. The problem of

skewed decision making, essentially the biases men-

tioned above, means consent is not based on a rational

review of the risks. Finally, privacy preferences are con-

text sensitive, thus making it difficult for individuals to

manage decisions in one context where their data might

be used in another context. This occurs when data shifts

from one use to another.


Solove also identifies “structural” problems, specifically

of scale, aggregation, defining harm, neglected social

values, power, and affirmative opt-in. These problems

exist even if we solve the cognitive challenges of full

notification, cognitive biases, and changing preferences.

Fully performing a cost-benefit analysis requires a deep

understanding of how these issues affect the privacy

decision.

SPECIFIC FAILURES IN THE IoT CONTEXT

The privacy notice, also referred to as a privacy state-

ment or privacy policy, is a modern creation. The ability

of a Web browser to provide a quick link to a lengthy

notification statement is purely a function of the

medium. Historically, and even today, most individual and organizational interactions don’t provide

for such disclosures. Retail outlets don’t typically

provide consumers with a brochure about privacy

rights. The exceptions, at least in the US, are specifically

legislated: a HIPAA (Health Insurance Portability and

Accountability Act) statement to medical patients and

a privacy notice for financial institution customers

under GLBA (the Gramm-Leach-Bliley Act). Providing

a statement online can be informative without being disruptive, for example by encouraging people to click a link to learn more (see Figure 1). Providing a statement

offline is generally much more disruptive to the con-

sumer experience. One could only imagine a retailer

providing a lengthy privacy policy brochure prior to

asking the consumer for his or her ZIP code at the regis-

ter. Mobile is proving a challenge because it elicits much

of the same experience as surfing the Web on a personal

computer, but a mobile privacy statement tends to be

much more intrusive.

To review the specific shortcomings of the notice and

consent paradigm for the IoT, it is helpful to first distin-

guish three specific interactions:

1. Purchased objects. A consumer purchases or acquires

a specific product capable of collecting, using, or

transmitting personal data about the consumer. For

instance, a consumer may purchase a toaster that

identifies the consumer and maintains his or her toast

preferences, storing that information in the home’s

digital dossier.

2. Situationally aware objects. A non-purchaser comes

in contact with an object capable of collecting or

immediately using personal information to change its behavior. The same toaster, when used by a houseguest, asks the houseguest’s preference because he

or she is not recognized. The toaster then stores the

new biometric identifier and toast preferences in the

house’s digital dossier.

3. Surveillance objects. A non-purchaser comes in

contact with an object that collects and transmits

personal information for later use by a third party.

Finally, our toaster centralizes the toasting prefer-

ences of individuals so that anytime they approach an

intelligent toaster anywhere in the world, it already

knows their preferences. That information is pro-

vided to bakers, who will know that a particular

person prefers whole wheat to white bread.
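These three interaction types can be summarized in a rough sketch. The enum names and the notice/consent mappings below are my own paraphrase of the scenarios above, not a standard taxonomy:

```python
# Sketch mapping the three IoT interaction types to where notice and
# consent can realistically occur; wording is my own summary.
from enum import Enum, auto

class Interaction(Enum):
    PURCHASED = auto()            # owner bought the device
    SITUATIONALLY_AWARE = auto()  # non-purchaser, immediate local use
    SURVEILLANCE = auto()         # non-purchaser, data kept for third parties

NOTICE_OPPORTUNITY = {
    Interaction.PURCHASED: "point of sale: packaging, setup flow",
    Interaction.SITUATIONALLY_AWARE: "on-device cue at the moment of contact",
    Interaction.SURVEILLANCE: "posted signage only; no individual channel",
}

CONSENT_OPPORTUNITY = {
    Interaction.PURCHASED: "explicit opt-in during purchase or setup",
    Interaction.SITUATIONALLY_AWARE: "interactive prompt, if hardware allows",
    Interaction.SURVEILLANCE: "little beyond avoiding the space entirely",
}
```

Reading down the two tables shows the gradient the next paragraph describes: both notice and consent opportunities thin out as we move from purchased objects to surveillance objects.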

All three scenarios fail in some way under a notice and

consent regime, but the failures are more pronounced

in the third case than in the second, which is more pro-

nounced than in the first. The opportunity for notice is

present for purchasers at the point of sale, but less so for

those interacting with objects they didn’t purchase. Some

salient notification can be given in the second and third

cases, similar to signs posted near CCTV cameras

that say “Video Surveillance in Progress.” However, the

opportunities for detailed information are limited.

Similarly, beyond the purchasing decision, there exists

limited opportunity for consent. While some IoT objects

might have interactive capabilities, such as the ability

to receive audio or visual cues and react, many objects

will be more passive, simply collecting information for

later analysis or use. Some objects could have creative

options for requesting consent. For example, a carton of

milk that reports its level to a central server might be

labeled in an instructive manner to show the user how


Figure 1 — A link offering more information about data being collected.


to pick up the carton with or without invoking the

reporting mechanism, which could be akin to opt-in or

opt-out.

The impending ubiquity of the IoT presents a challenge

for people who want to avoid communications-enabled

objects. While a person may decide not to visit a website

or avoid certain businesses that collect data about con-

sumers, the omnipresence of communications-enabled

objects will make it difficult for people to opt out. When

implementation becomes so cheap that even disposable

products (the milk carton, your tissue box, etc.) all have

sensing, computing, and communications technology

embedded (see Figure 2), the availability of objects

without these capabilities may be scant.

One major purpose of the IoT is to collect massive

amounts of very discrete data for analysis. Thus, the

relevant privacy problems of Big Data come into play,

specifically those of aggregation, scale, and difficulty in understanding how predictive analysis may ultimately affect the individual’s interaction with the object.

PRIVACY RISK ANALYSIS OF THE IoT

While many of the privacy risks of the Internet of

Things will be specific to the purpose and characteris-

tics of the particular device, we can analyze some risks

based on the common properties of the IoT. We can use

three different models for this analysis:

1. Solove’s taxonomy of privacy10

2. University of Washington law professor Ryan Calo’s

privacy harms11

3. New York University computer science and media

studies professor Helen Nissenbaum’s contextual

integrity12

Solove divides privacy into a taxonomy of violations.

Four broad categories — information collection, pro-

cessing, dissemination, and invasion — are subdivided

into more specific types of harms. Surveillance, a form

of information collection, is a specific risk of the IoT

because as things become aware of their surroundings,

they are necessarily aware of the people in their envi-

ronment. This isn’t so problematic when the informa-

tion is used to react, but it could be problematic if the

information persists beyond the immediacy of the inter-

action. Information processing concerns the use of infor-

mation in a way that might not be intended or desired

by the subject, or even in ways that person can’t

envision. For instance, a network of health monitors col-

lecting information for your doctors could send a report

to a restaurant you enter and alter your menu based on

the liability concerns of providing you certain foods.

While this could be welcome in some cases, such as

avoiding food allergens, it might be unwelcome and

considered intrusive in others. Finally, the collection of

all this information risks information dissemination in

the form of a breach of confidentiality (the health moni-

toring devices above), exposure (when devices are col-

lecting data when they shouldn’t be or aren’t expected

to be), or disclosure (revealing patterns of activity to

others).

Calo splits privacy harm into two distinct camps: sub-

jective and objective. Subjective harm is the fear or anxi-

ety derived within one’s self from the observation, real

or not, of one’s activities. Having myriad devices col-

lecting and sharing information about one may give

rise to a feeling of being “creeped out.” This can lead

to objective lifestyle changes, paranoia, and depression.

Objective harms are those that are more readily identifi-

able and would not exist absent the privacy violation.

We can readily see this type of harm exhibited in the

Target teen pregnancy case relative to an IoT scenario.

Suppose sensors in a Target store identified pregnant

women by their gait, mannerisms, body temperature,

and other characteristics in order to offer them in-store

services. I think it’s safe to say that many pregnant

women would indeed feel “creeped out” by such

surveillance.


Figure 2 — Is your milk carton watching you? Is your milk carton telling you it is watching you?


Nissenbaum suggests our notions of privacy violations

are more subtle than the rigid constructs of Calo and

Solove and are grounded in what she describes as

“contextual integrity.” Norms of information flow and

appropriateness help define the bounds of the context,

and, when those norms are breached, a privacy viola-

tion occurs. These norms are historical, may be based on

laws and regulations, or can be cultural. Unfortunately,

disruptive technologies such as the IoT provide society

with few past analogies upon which to base norms. In

general, we don’t expect our carton of milk to be record-

ing daily consumption rates and reporting them to the

manufacturer, the grocer, our doctor, our spouse, or our

neighbor.

MITIGATING STRATEGIES

As we’ve seen, there are significant deficiencies with the

notice and consent model as well as inherent risks to

privacy in devices classified as part of the Internet of

Things. What strategies can developers and designers

use to counter those concerns?

Conspicuous

Designers should seek ways to make their devices

conspicuous in their consumption and use of informa-

tion. Notice in this context should be some form of

announcement of the very thing’s existence. This could

be auditory, visual, or, really, any sensory announce-

ment. A clever example is a digital camera’s use of a

clicking sound reminiscent of film-based cameras to

give an audible clue (despite the absence of a shutter

in a digital camera) that a picture has been taken.

Single Purpose

Where practical, data should be collected for a single

identifiable purpose. Human cognitive capabilities are

clearly limited, and using data for multiple unrelated

purposes without clear segmentation increases the

likelihood of privacy violations.

Immediate Consumption, Use, and Feedback

Feedback to the data subject should be nearly instantaneous. The milk carton should report only to the refrigerator, which displays upcoming shopping items clearly and obviously. Delays in feedback or nonobvious uses make it harder for data subjects to understand the ramifications of interacting with

certain objects. Grocery items with radio frequency identi-

fication (RFID) tags should tally themselves as they enter

the shopping cart. Stores shouldn’t monitor what goes in

the grocery cart unless they provide clear feedback to the

consumer that this is what’s being done. I could envision

a visual display on the cart with a virtual customer

service representative “seeing” what you placed in the

basket. When you remove an item, the representative

could ask you, “We’ve noticed that a lot of customers

remove product XYZ after initially picking it up. Would

you share why?” This provides customers with imme-

diate notification that the store monitors not just their

immediate use but patterns across all customers.
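The cart interaction above could be sketched as a simple event loop: every scan or removal changes the tally, and the shopper is told on the spot. The class, tag IDs, and messages below are hypothetical, purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class SmartCart:
    """Tallies RFID-tagged items and gives the shopper immediate feedback."""
    items: dict = field(default_factory=dict)
    log: list = field(default_factory=list)  # stands in for the on-cart display

    def on_scan(self, tag_id: str, name: str) -> None:
        # Immediate consumption and feedback: the tally changes the moment
        # the item enters the cart, and the shopper is notified right away.
        self.items[tag_id] = name
        self.notify(f"Added: {name} ({len(self.items)} item(s) in cart)")

    def on_remove(self, tag_id: str) -> None:
        name = self.items.pop(tag_id, None)
        if name:
            # Conspicuous notice that removal patterns are also observed.
            self.notify(f"Removed: {name}. We also track removals across customers.")

    def notify(self, message: str) -> None:
        self.log.append(message)

cart = SmartCart()
cart.on_scan("rfid-001", "Milk")
cart.on_scan("rfid-002", "Bread")
cart.on_remove("rfid-002")
```

The point of the sketch is only that notification happens inside the same event that consumes the data, not minutes or days later.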

Decentralization

If nothing else, the Internet of Things favors a decentral-

ized approach. The entire design suggests a massive mesh

of networks with interacting components. Developers

should embrace decentralization and decentralized data

flows rather than perpetuate a client-server architecture

that centralizes information, bringing with it associated

privacy and security risks.

Map to Current Analogies

As mentioned, the Internet of Things defies existing

norms surrounding information flows and appropriate-

ness. Developers should find ways to map IoT imple-

mentations onto existing scenarios. Is the application

more evocative of a doctor-patient relationship, where

information shouldn’t be shared outside the context of

the medical community? Could it be related to a famil-

ial interaction where information should never escape

the boundaries of the home? These mappings should

be both presented to consumers in a way that they can

understand as well as be used as a technical constraint

on the system developer.
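Such a mapping could also be enforced in code. As a purely hypothetical sketch, suppose each data item is tagged with the context in which it was collected, and a flow is permitted only to recipients within that context's community (the contexts and roles below are invented for illustration):

```python
# Hypothetical contextual-integrity check: data carries its collection
# context, and flows are allowed only to recipients the mapped-onto
# relationship (doctor-patient, family, etc.) would permit.
CONTEXT_NORMS = {
    "medical": {"patient", "doctor", "pharmacy"},
    "household": {"resident", "home_hub"},
}

def flow_allowed(context: str, recipient: str) -> bool:
    """A flow is permitted only if the recipient belongs to the
    community that the data's original context maps onto."""
    return recipient in CONTEXT_NORMS.get(context, set())

assert flow_allowed("medical", "doctor")
assert not flow_allowed("medical", "advertiser")
assert not flow_allowed("household", "manufacturer")
```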



Transparency and Ethics

In their book on Big Data, Kenneth Cukier and Viktor

Mayer-Schonberger call for a new occupation in compa-

nies employing Big Data techniques: an algorithmist

who would make or guide ethical decisions about the

uses of Big Data.13 While laudable, their vision fails in two respects. First, privacy is about participation in the decision-making process and the construction of social norms, and an in-house arbiter cannot substitute for that participation. Second, algorithmists employed by corporations have split loyalties that don't bode well for ethical decision making.

Ultimately, the collection and use of personal data must

be made transparent to all stakeholders so as to give

consumers, privacy advocates, governments, and soci-

ety at large an opportunity to weigh in and help define

the appropriate social norms. This doesn’t mean that

new and innovative things can’t be tried, just that the

decisions can’t be unilateral.

CONCLUSION

The Internet of Things is rich with opportunity — and

risk. Designers must proactively consider the privacy

implications of the design choices they make. Designs

should help people make intelligent, rational decisions

about their interactions with sensors and devices, not

obfuscate those decisions. When the benefits appear

to outweigh the risks, society as a whole should be

informed, educated, and engaged to develop appro-

priate social norms to be followed.

ENDNOTES

1Westin, Alan F. Privacy and Freedom. Atheneum, 1967.

2“Records, Computers, and the Rights of Citizens.” US

Department of Health, Education, and Welfare (HEW)

Secretary’s Advisory Committee on Automated Personal Data

Systems, July 1973 (www.hsdl.org/?view&did=479784).

3“OECD Guidelines on the Protection of Privacy and

Transborder Flows of Personal Data.” Organisation for

Economic Co-operation and Development (OECD),

September 1980 (www.oecd.org/internet/ieconomy/

oecdguidelinesontheprotectionofprivacyandtransborder

flowsofpersonaldata.htm).

4“Privacy Online: Fair Information Practices in the Electronic

Marketplace.” US Federal Trade Commission, May 2000

(www.ftc.gov/reports/privacy2000/privacy2000.pdf).

5McDonald, Aleecia, and Lorrie Faith Cranor. “The Cost of

Reading Privacy Policies.” I/S: A Journal of Law and Policy for

the Information Society, 2008.

6Duhigg, Charles. “How Companies Learn Your Secrets.”

The New York Times, 16 February 2012.

7Sengupta, Somini. “Letting Down Our Guard With Web

Privacy.” The New York Times, 30 March 2013.

8Acquisti, Alessandro, Leslie John, and George Loewenstein.

“What Is Privacy Worth?” Paper presented to the Workshop

on Information Systems Economics, Phoenix, Arizona, USA,

December 2009 (www.heinz.cmu.edu/~acquisti/papers/

acquisti-ISR-worth.pdf).

9Solove, Daniel J. “Privacy Self-Management and the Consent Dilemma.” Harvard Law Review, Vol. 126, No. 8, June 2013

(www.harvardlawreview.org/issues/126/may13/

Symposium_9475.php).

10Solove, Daniel J. “A Taxonomy of Privacy.” University of

Pennsylvania Law Review, Vol. 154, No. 3, January 2006

(www.law.upenn.edu/journals/lawreview/articles/

volume154/issue3/Solove154U.Pa.L.Rev.477%282006%29.pdf).

11Calo, M. Ryan. “The Boundaries of Privacy Harm.” Indiana

Law Journal, Vol. 86, No. 3, Summer 2011 (http://ilj.law.

indiana.edu/articles/86/86_3_Calo.pdf).

12Nissenbaum, Helen. “Privacy as Contextual Integrity.”

Washington Law Review, Vol. 79, No. 1, February 2004

(http://courses.cs.vt.edu/cs6204/Privacy-Security/

Papers/Privacy/Privacy-AS-Contextual-Integrity.pdf).

13Mayer-Schonberger, Viktor, and Kenneth Cukier. Big Data: A

Revolution That Will Transform How We Live, Work, and Think.

Eamon Dolan, Houghton Mifflin Harcourt, 2013.

R. Jason Cronk is a privacy engineering consultant with Enterprivacy

Consulting Group. Prior to joining the firm, Mr. Cronk worked in the

information security department of Verizon. His information technol-

ogy career spans 20 years. In addition, Mr. Cronk is a licensed attor-

ney in Florida and a certified information privacy professional, and he

was recently named a Privacy by Design Ambassador by the Ontario

Information and Privacy Commissioner’s office. Mr. Cronk regularly

writes, speaks, and blogs on privacy and related issues. He can be

reached at [email protected]; Twitter: @privacymaverick.


The term “Big Data” refers to the huge amounts of

digitized data that organizations and governments

collect about human beings and the world that we

inhabit. The volume of data is growing at a frightening

rate. Google’s Executive Chairman Eric Schmidt brings

it to a point: “From the dawn of civilization until 2003,

humankind generated five exabytes of data. Now we

produce five exabytes every two days … and the pace

is accelerating.”1 While some have called this an exag-

geration, the rate at which we create data is neverthe-

less skyrocketing.

Given the rapidly approaching ubiquity of Big Data in

our day-to-day lives, it seems appropriate to examine

the effects Big Data may have on our privacy and the

security of that data. This examination should help

readers form their own opinion on the following ques-

tion: Is it possible to take advantage of all the benefits

Big Data may have to offer, while minimizing the

potential impacts to personal privacy?

In this article, I will focus on some of the issues Big Data

raises, the possible consequences Big Data will have for

the way we view privacy and security, and how those

views may need to be adapted in order to take full

advantage of this technology. I will discuss the enor-

mous potential Big Data holds to enhance people’s lives,

the enhanced risk that accompanies that potential, and

some things that may be done to minimize those risks.

WHAT IS BIG DATA?

Big Data in a nutshell involves making use of the ever-growing abundance of data about “things” to make

predictions of all kinds — whether it be next month’s

most popular movie or the location of the next flu out-

break — by applying advanced statistical modeling

techniques to that data. From smartphones to intelligent

refrigerators, intelligent devices are becoming more

numerous and more varied every day, and the data

becoming available for the various analytical techniques

grows by the second.

The Big Data ecosystem is complex and in a state of

almost continual change. At a high level, it is possible

to identify three essential elements upon which a Big

Data ecosystem depends (see Figure 1). Big Data cannot

function without:

1. Cloud infrastructures

2. Data sources

3. Data acquisition and integration techniques

The ability to collect and analyze more data from more

sources on a consistently decreasing cost basis is trans-

forming our capacity to understand our world and all

of the phenomena in it. The potential for advancement

resulting from the Big Data trend spans almost every

realm of the human condition, from city planning to

healthcare to business, even the prediction of human

behavior. The benefits of Big Data can be truly remark-

able and may just change the world. Some examples of

what Big Data may have in store for us include:

n Law enforcement agencies using data from social

media, CCTV cameras, phone calls, and texts to track

down criminals and predict the next terrorist attack

n Musicians using data about our listening preferences

and sequences to determine the most popular playlist

for live performances

©2013 Jason Stradley. All rights reserved.

Big Data Privacy and Security Challenges
by Jason Stradley

CAN WE HAVE OUR CAKE AND EAT IT TOO?


Figure 1 — Big Data ecosystem dependencies.


n Retailers combining their frequent buyer data with

social media information to detect and take advan-

tage of changing purchasing patterns

The recent concerns regarding the collection of data by

the US National Security Agency (NSA) for the PRISM2

program may have provided a small preview of what

may come from Big Data. This program facilitates the

collection of the private communications of users

directly from the servers of Microsoft, Yahoo!, Google,

Facebook, and other online companies. The diametri-

cally opposed viewpoints expressed by various groups

regarding the value of this activity versus its implica-

tions for civil liberties are a real-world demonstration

of the double-edged sword that is Big Data.

PRIVACY AND SECURITY CHALLENGES

The advent of cloud computing technologies in the last

few years has prompted a great deal of thought in the

information security industry about those technologies’

implications for privacy and security. Many of us ques-

tioned cloud computing’s true value proposition, as a

“killer app” did not seem all that obvious at the time.

The emergence of Big Data as perhaps the highest-

impact application of cloud computing technologies

amplifies concerns about the privacy and security of

data as it is potentially spread across the large and

ever-expanding Big Data ecosystem.

One industry concern is that many of the solutions

that are available and in use today are not sufficient for

a Big Data world. In order to mitigate the risks to pri-

vacy and security in such an environment, comprehen-

sive solutions need to be devised. Traditional security

mechanisms are tailored to more static data sets in what

are still semi-isolated enterprise environments and as

such will not accommodate the much larger scale of a

Big Data ecosystem. Nor are they flexible enough to

keep up with the various data types and methods of

data acquisition and integration. Some of these differ-

ences in scale and characteristics include:

n Multiple infrastructure tiers, both storage and com-

puting, required to process big data

n New elements such as NoSQL3 databases that speed

up performance but are almost entirely dependent

on middleware layers for security

n Existing encryption technologies and solutions that

don’t scale well to large data sets

n Real-time system monitoring techniques that work

well on smaller volumes of data but not very large

data sets

n The growing number of devices, from smartphones

to sensors, that produce data for analysis

n General confusion surrounding the varied legal and

policy restrictions, which lead to ad hoc approaches

for ensuring security and privacy
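The middleware dependency noted above can be made concrete: since the engine itself enforces nothing, every request must be authenticated before it reaches the store. A minimal sketch, assuming a shared HMAC secret (the class and method names are illustrative, not any real driver's API):

```python
import hashlib
import hmac

class AuthMiddleware:
    """Sketch of the middleware layer a security-less NoSQL store
    depends on: every request must carry a valid HMAC-derived token
    before it touches the underlying store. Illustrative only."""

    def __init__(self, secret: bytes):
        self._secret = secret
        self._store = {}  # stands in for the NoSQL engine itself

    def token_for(self, user: str) -> str:
        # In practice this would be issued after real authentication.
        return hmac.new(self._secret, user.encode(), hashlib.sha256).hexdigest()

    def _check(self, user: str, token: str) -> None:
        if not hmac.compare_digest(token, self.token_for(user)):
            raise PermissionError("invalid credentials")

    def put(self, user: str, token: str, key: str, value: str) -> None:
        self._check(user, token)   # enforcement lives here, not in the engine
        self._store[key] = value

    def get(self, user: str, token: str, key: str) -> str:
        self._check(user, token)
        return self._store[key]
```

The design point is simply that the enforcement boundary sits entirely outside the database engine, which is exactly the fragility the bullet list describes.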

Another possible (albeit less tangible) issue occurred to

me as I tried to draw some comparison between what

seems to be occurring in the Big Data era and what

happened during the dot-com bubble of the late 1990s.4

During that time, industrial-strength e-commerce was

in its infancy, and huge numbers of what were called

“intermediary agencies” came into being that, at least

for a short time, created enormous value using the

metadata of the data available. Will Big Data spawn a

new generation of these intermediaries, which created

value based on the axiom that “information about the

data is worth more than the data itself,” and will we

be able to avoid the missteps of the past in doing so?

(Those missteps included a huge overvaluation of these

intermediaries and the misapplication of investment

based on those overvaluations, which largely con-

tributed to the bursting of the dot-com economy in the

2001-2002 time frame). This potential for overvaluing

certain types of information may increase the risk of

theft or unauthorized modification of that information.

The Cloud Security Alliance (CSA) recently published

a study that broke down the security and privacy chal-

lenges associated with Big Data adoption into four

categories (see Figure 2):5

1. Data privacy

2. Infrastructure security

3. Data management

4. Data integrity and response

In this section, I will discuss various aspects of these

challenges and attempt to reconcile them with tradi-

tional approaches to privacy and security.

Data Privacy

The Big Data paradigm poses a number of challenges

from a data privacy point of view. In the US, privacy

standards currently take a sectoral approach, driven

by the needs and experiences of specific industries.6

The focus is on the confidentiality of data to facilitate

“being left alone,” and the result is a piecemeal, as-

needed approach to privacy protection. In the case of

the European Union (EU) and other evolving privacy

standards around the world, the focus is instead on pro-

tecting against the unauthorized disclosure and use of

personal information in order to protect the “honor


and dignity” of citizens. This results in a much more

stringent, comprehensive, and proactive approach to

the protection of personal data. So while both the US

and the EU (and other jurisdictions) must address the

indiscriminate, undefined use of data beyond its stated

primary purpose — an all-too-likely outcome of the

Big Data juggernaut — their contrasting approaches

to privacy will no doubt clash at some point.

In fact, in mid-2012 the EU’s Article 29 Working Group7

stated that cloud-based data storage infrastructures did

not provide sufficient security to comply with the EU’s

Safe Harbor program, which allows qualified US-based

companies to engage in data transfers with EU member

states. With cloud storage infrastructures being a basic

building block of the Big Data ecosystem, the ability of

those ecosystems to measure up to international privacy

and security standards is a definite challenge for the

industry.

Infrastructure Security

The nature of Big Data ecosystems dictates an inherent

lack of continuity across cloud infrastructures from a

visibility and control perspective. The way Big Data

processing tasks are allocated to computing assets

across multiple cloud environments raises the same

sorts of issues encountered in highly distributed com-

puting environments. If poorly configured devices

(e.g., mobile devices, consumer electronics, automobiles,

manufacturing control systems) encounter poor oper-

ational practices and/or system compromise, the result

could be data confidentiality and/or integrity issues.

These in turn could cause false results in an analytical

task; for example, predicting a terrorist attack in one

location when in reality it will occur in another location.

Such distributed environments are also potentially vul-

nerable to the most basic infrastructure-based attacks,

such as replay, man-in-the-middle, masquerading, or

even denial-of-service attacks. These infrastructure

attacks may produce data confidentiality, integrity,

or availability problems in one or more parcels of

Big Data analytical results.

In addition, the lack of visibility and control of data col-

lection devices inhabiting far-flung cloud infrastructures

may contribute to a variety of end-point compromise

scenarios, including tampering with the data collection

device or data collection application, ID cloning attacks

(especially on mobile devices), and manipulation of the

collection device’s surrounding environment informa-

tion (e.g., temperature, turbine speed, GPS location) so

as to alter collected data.
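Several of these transport-level attacks have standard countermeasures. As a sketch (assuming a pre-shared device key, with all names illustrative), each report can carry a fresh nonce and an HMAC over nonce plus payload; the collector rejects anything tampered with or already seen:

```python
import hashlib
import hmac
import os

SECRET = b"device-shared-key"  # assumed pre-provisioned on the device
seen_nonces = set()            # collector-side record of used nonces

def sign_report(payload: bytes) -> tuple[bytes, bytes, bytes]:
    """Device side: attach a fresh random nonce and an HMAC."""
    nonce = os.urandom(16)
    mac = hmac.new(SECRET, nonce + payload, hashlib.sha256).digest()
    return nonce, payload, mac

def accept_report(nonce: bytes, payload: bytes, mac: bytes) -> bool:
    """Collector side: reject tampered or replayed reports."""
    expected = hmac.new(SECRET, nonce + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        return False          # modified in transit (integrity failure)
    if nonce in seen_nonces:
        return False          # replayed message
    seen_nonces.add(nonce)
    return True
```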

Data Management

There are numerous security and privacy challenges

with regard to data management in Big Data ecosys-

tems. The economics of cloud storage is prompting ser-

vice providers to move to the use of auto-tiering storage

solutions that store data on storage facilities appropriate

to the use case of that particular data. The threat this

poses to data confidentiality and integrity is that it may

allow those providers to glean certain properties and

characteristics about the encrypted data (e.g., user activ-

ity, the identity of data sets) as it is synchronized across

cloud-based storage providers, even if the encrypted

data is not actually deciphered.
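That leakage claim is easy to demonstrate: even a cipher that perfectly hides content preserves ciphertext length, so a storage provider can distinguish data sets and infer activity from sizes and timing alone. A toy illustration with a stand-in cipher:

```python
import os

def encrypt(blob: bytes) -> bytes:
    # Stand-in for a real cipher: hides content but, like real stream
    # ciphers, preserves the length of what it encrypts.
    return bytes(b ^ 0x5A for b in blob)

# Three synchronized objects: the provider never deciphers them, yet
# their sizes alone identify which data set is which.
observable = [len(encrypt(os.urandom(n))) for n in (10, 1000, 10)]
assert observable == [10, 1000, 10]
```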

Other Big Data privacy and security challenges that can

arise from a data management standpoint include the

potential for service providers to collude with end users


Figure 2 — Taxonomy of Big Data security and privacy challenges. (Source: CSA, 2013.)


by exchanging keys and data to allow the provider

access to data that it is not authorized to view. In a

multiuser environment, it is possible for a service

provider to launch rollback attacks by delivering an

out-of-date copy of a data set to the user after having

made an update to that data set. In addition, the lack of

good audit trails will almost assuredly lead to disputes

between users and storage providers in the case of loss

or unauthorized modification of data. Audit logs are

essential to determining the origins of these events and

responsibility for the results.
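One way to make such disputes tractable is a tamper-evident, hash-chained audit trail: each entry binds to its predecessor, so altered history fails verification, and a client that remembers the latest version number can recognize an out-of-date (rolled-back) copy. A minimal sketch:

```python
import hashlib

class AuditLog:
    """Hash-chained audit trail: each entry's digest covers the previous
    entry's digest, so rewriting history breaks the chain. Sketch only."""

    def __init__(self):
        self.entries = []  # list of (version, event, chained_digest)

    def append(self, event: str) -> int:
        prev = self.entries[-1][2] if self.entries else "genesis"
        version = len(self.entries) + 1
        digest = hashlib.sha256(f"{prev}|{version}|{event}".encode()).hexdigest()
        self.entries.append((version, event, digest))
        return version

    def verify(self) -> bool:
        prev = "genesis"
        for version, event, digest in self.entries:
            expected = hashlib.sha256(f"{prev}|{version}|{event}".encode()).hexdigest()
            if expected != digest:
                return False
            prev = digest
        return True
```

A truncated prefix of the chain still verifies, so rollback detection additionally requires the user to remember the last version number (or digest) they saw and compare it against what the provider delivers.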

Data Integrity and Response

The provenance of the data becomes a critical integrity

issue in Big Data ecosystems. The abnormally large

sizes of data sets make it impractical to locally inspect

entire data sets to verify origins and authenticity, which

may impact the integrity of data and resulting output

of analytical processes. Auto-tiering also challenges

providers to guarantee availability levels. Weaker secu-

rity capabilities at lower tiers of storage may be open to

denial-of-service attacks, while backup windows and

data recovery efforts can also be affected by the differ-

ence in performance between tiers. Multi-writer multi-

reader (MWMR) and write-serialization issues can arise

to impact the integrity of data from multiple storage

tiers that are shared by multiple users.

Specific to the Big Data ecosystem is a new generation

of database engine that has been engineered for perfor-

mance and has little to no built-in security. This results

in very poor assurance of transaction integrity, weak

authentication, and weak password storage mecha-

nisms.8 Due to the lack of data validation mechanisms,

these data stores are subject to replay or man-in-middle

attacks, cross-site scripting, cross-site request forgery,

and injection attacks. Most importantly, at this point,

these high-performance database engines do not have

any integration capability with third-party modules that

provide authentication. In addition to poor authentica-

tion capability, these database engines also are deficient

when it comes to authorization mechanisms. Different

manifestations of this high-performance database

engine have not incorporated any type of role-based

access control into their implementations, and instead

provide authorization controls on a per-database basis,

offering no enforcement at lower layers. Lastly, the

architecture of these database engines does not have the

ability to guarantee synchronization across the distrib-

uted computing assets participating in a distributed

task. This potentially undermines the consistency and

ultimately the integrity of the source data and the

results of any analytical task performed.

Real-Time Security Monitoring

The last but certainly not the least of the privacy and

security challenges that adopters of Big Data solutions

face is that of real-time security monitoring. Real-time

security monitoring encompasses the observation of

elements like network traffic and transactions for:

n Things that we know are not supposed to happen,

such as unauthorized access to a data store by a user

or device

n Anomalous behaviors, such as increased volumes

of transactions between devices, differences in

transaction types, and so on

For such monitoring to be useful, it must encompass

the entire Big Data ecosystem, including the public

cloud infrastructure, the central data analysis and

processing clusters, computing assets that have been

assigned portions of the analysis processing, the inter-

connection between those computing assets, and the

data stored on those computing assets. Also requiring

consideration in a security monitoring scenario are the

monitoring application itself and the security of input

sources that collect and forward that security event

data. As with any security monitoring application,

adversaries will attempt to create evasion and/or data

poisoning techniques to circumvent it.
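Detecting the second category of events can be as simple as testing new observations against a recent baseline. A toy sketch using a z-score over transaction volumes (the threshold and numbers are arbitrary, purely illustrative):

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, threshold: float = 3.0) -> bool:
    """Flag a volume that deviates from the recent baseline by more than
    `threshold` standard deviations. A toy stand-in for the real-time
    monitors discussed above."""
    if len(history) < 2:
        return False                      # not enough data for a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

baseline = [100, 98, 103, 101, 99, 102]   # requests/minute, illustrative
assert not is_anomalous(baseline, 104)    # within normal variation
assert is_anomalous(baseline, 500)        # sudden surge flagged
```

Production monitors are, of course, far richer (seasonality, multivariate features, evasion resistance), but the baseline-plus-deviation structure is the same.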

HOW TO ADDRESS THE CHALLENGES

Now that some of the most prevalent challenges to

privacy and security have been illustrated and con-

trasted against the Big Data ecosystem at a high level,

it becomes possible to start to map out how we can

address these challenges. As it turns out, there do not

appear to be any new classifications of threats to Big

Data from a security perspective, only differences in the

opportunity for exploitation and methods of execution

of known types.

It is now more apparent why we need to create a more

comprehensive information security system to accom-

modate the pervasiveness of the Big Data phenomenon.

These issues are all solvable, and we do not necessarily

need to develop any new technologies to achieve a com-

prehensive security and privacy solution for Big Data

ecosystems. Table 1 outlines some of the challenges dis-

cussed, contrasts those challenges against traditional

security/privacy approaches, and categorizes each in

terms of one or more of what I consider to be the six

major tenets of information security:

1. Authentication

2. Confidentiality

3. Access control

4. Integrity

5. Nonrepudiation

6. Availability

Table 1 — Big Data Security/Privacy Challenges and Possible Solutions

Data privacy:
• Replay attacks → transmission encryption (Confidentiality)
• Man-in-the-middle attacks → transmission encryption (Confidentiality)
• Cross-site scripting → data input validation and filtering (Integrity)
• Cross-site request forgery → data input validation and filtering (Integrity)
• Injection attacks → data input validation and filtering (Integrity)

Infrastructure security:
• Configuration management → configuration checking and assurance (Integrity)
• Replay attacks → transmission encryption (Confidentiality)
• Man-in-the-middle attacks → transmission encryption (Confidentiality)
• Masquerading attacks → transmission encryption (Confidentiality)
• Denial-of-service attacks → traffic filtering (Availability)

Data management:
• Data leakage in auto-tiered storage → transmission encryption (Confidentiality, Authentication, Integrity)
• Denial-of-service attacks in lower storage tiers → traffic filtering (Availability)
• Synchronization issues between disparate storage tiers → periodic audit of data freshness (Integrity)
• Collusion attacks → policy-based encryption of data at rest (Confidentiality)
• Rollback attacks → policy-based data synchronization and synchronization validation (Integrity)
• Data provenance → provenance-aware storage1
• Lack of audit trail → exchange of signed message digests for events and transactions between data storage tiers (Nonrepudiation)

Data integrity and response:
• Data tampering on data collection devices → use of Big Data analytics to filter manipulated data (Integrity)
• External data manipulation → physical security of data collection environment/use of Big Data analytics to filter manipulated data (Integrity)
• Replay attacks at data collection device → transmission encryption (Confidentiality)
• Man-in-the-middle attacks at data collection device → transmission encryption (Confidentiality)
• ID cloning attacks on data collection devices → encryption with trusted certificates (Confidentiality, Integrity, Nonrepudiation)

1Muniswamy-Reddy, Kiran-Kumar, Peter Macko, and Margo Seltzer. “Provenance for the Cloud.” Proceedings of the 8th USENIX Conference on File and Storage Technologies (FAST ’10). USENIX Association, 2010.

While the challenges are not insurmountable, the com-

plexity of any comprehensive security solution for Big

Data ecosystems seems somewhat daunting. Big Data

security solutions must scale to the levels needed to

support the huge amount of processing done by dis-

parate sets of computing assets, they need to extend in

a flexible manner across multiple uncontrolled cloud

infrastructures with limited (if any) visibility into those

infrastructures, and they need to be able to do all these

things simultaneously on a veritable on-demand basis

to satisfy the analytical requirements that are emerging.

CONCLUSION

Big Data is becoming a major player and driving force

in the Internet of Things continuum. Its ability to add

value across so many disciplines is almost beyond the

imagination. As with anything on the cutting edge,

there are risks to be quantified, challenges to be met,

and balances to be struck between what can be done

and what should be done. Only time will tell if we meet

those challenges.

Big Data is going to be here for some time, and it will

change everything about how we think about comput-

ing. It will soon be impractical to contemplate an appli-

cation without the ability to consume mass quantities

of data, creating new data along the way. As computa-

tional power becomes even more ubiquitous and less

costly, the number of computing environments will

increase, while the interactions between those comput-

ing environments will proliferate even further.

Would-be users of Big Data solutions need to be aware

of the privacy and security challenges that are entailed

in that use. Asking hard questions of Big Data providers

and driving them toward a comprehensive privacy and

security solution is critical to realizing the true potential

of Big Data computing while minimizing its risks.

ENDNOTES

1Sigler, M.G. “Eric Schmidt: Every 2 Days We Create As Much

Information As We Did Up to 2003.” TechCrunch, 4 August

2010 (http://techcrunch.com/2010/08/04/schmidt-data).

2Lee, Timothy B. “Here’s Everything We Know About PRISM

to Date.” WONKBLOG (The Washington Post), 12 June 2013

(www.washingtonpost.com/blogs/wonkblog/wp/2013/06/

12/heres-everything-we-know-about-prism-to-date).

3NoSQL databases are a new class of data stores that are

designed to handle massive amounts of data. For more

information, see http://nosql-database.org.

4“Here’s Why the Dot Com Bubble Began and Why It Popped.”

Business Insider, 15 December 2010 (www.businessinsider.

com/heres-why-the-dot-com-bubble-began-and-why-it-

popped-2010-12).

5CSA Big Data Working Group. “Expanded Top Ten Big Data

Security and Privacy Challenges.” Cloud Security Alliance

(CSA), 2013 (https://cloudsecurityalliance.org/download/

expanded-top-ten-big-data-security-and-privacy-challenges).

6Dean, Robert. “The US’ Sectoral Approach to Data Protection.”

Enterprise Features, 2 July 2013 (http://enterprisefeatures.

com/2013/02/the-us-sectoral-approach-to-data-protection).

7Proffitt, Brian. “Safe Harbor Not Safe Enough for EU Cloud

Data.” IT World, 16 July 2012 (www.itworld.com/cloud-

computing/286162/safe-harbor-not-be-safe-enough-eu-

cloud-data).

8CSA Big Data Working Group (see 5).

Jason Stradley is a recognized thought leader, author, speaker, and

visionary in the information security sphere; he specializes in helping

organizations achieve security excellence through establishing the bal-

ance between business risk and security. With over 25 years’ experi-

ence in the IT and security industry, Mr. Stradley counts among his

client base many Fortune 500 companies to which he provides C-level

security advisory services. He has experience in a wide variety of security technologies, such as encryption, IDS/IPS, data protection, enterprise risk management, security information and event management,

enterprise threat management, vulnerability management, and iden-

tity and access management solutions, to name a few. Mr. Stradley

currently holds the CISSP, CGEIT, CISM, GSLC, CBCP, CRISC,

CCSK and C|CISO certifications, as well as several more technically

oriented certifications. He has also been nominated for a number of

industry awards, such as the 2007 Security Seven Award and the

2008 Information Security Executive of the Year Award. He can be

reached at [email protected].

NOT FOR DISTRIBUTION • For authorized use, contact

Cutter Consortium: +1 781 648 8700 • [email protected]

The Internet of Things: Establishing Privacy Standards Through Privacy by Design

by Nicola Fabiano

PEOPLE-CENTERED PRIVACY

THE IoT PHENOMENON

Defining the Internet of Things (IoT) can be a challenge because of its technical and conceptual complexity. Basically, the IoT is a phenomenon founded on a network of objects linked by a tag or microchip that send data to a system that receives it. The Cluster Book, published in 2012 by the IoT European Research Cluster, defines the Internet of Things as:

A dynamic global network infrastructure with self-configuring capabilities based on standard and interoperable communication protocols where physical and virtual “things” have identities, physical attributes, and virtual personalities, use intelligent interfaces, and are seamlessly integrated into the information network.1

In this era of fast-paced technological evolution, examples of the Internet of Things abound. We can control devices such as vending machines and stereo speakers with our smartphones, manage devices in our homes (domotics for energy saving, security, comfort, communication) by remote control, and use smartphone apps to book reservations or purchase services. Larger-scale IoT applications might include public security systems or warehouse inventory control systems. Imagine a fire alarm in a stadium: there are sensors connected with an operations center, but these are also linked to a smartphone app used by the operators. If the alarm is triggered, the system will automatically activate any necessary communications and the operators will be ready to intervene. In the warehouse scenario, the operator applies tags (microtransmitters) with radio frequency identification (RFID) technology, from which an RFID reader can receive information about the goods in storage. An automatic system can pass the RFID reader by the tagged goods each half-day to verify the accuracy of the inventory.
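The warehouse check described above amounts to comparing the set of tags a reader observes against the inventory records. A minimal sketch in Python, with all tag IDs and function names invented for illustration:

```python
# Hypothetical sketch of the warehouse scenario: an automated RFID scan
# is reconciled against the expected inventory twice a day. This is an
# illustration, not a real RFID reader API.

def reconcile_inventory(expected_tags, scanned_tags):
    """Compare the tag IDs a reader observed against the inventory records."""
    expected = set(expected_tags)
    scanned = set(scanned_tags)
    return {
        "missing": sorted(expected - scanned),      # recorded but not seen
        "unexpected": sorted(scanned - expected),   # seen but not recorded
        "verified": sorted(expected & scanned),
    }

# Twice-daily scan: two pallets are on record; the reader sees one of
# them plus a tag that is not in the books.
report = reconcile_inventory(
    expected_tags=["TAG-0001", "TAG-0002"],
    scanned_tags=["TAG-0002", "TAG-0099"],
)
```

The value of the automatic pass is exactly this reconciliation: discrepancies surface without anyone walking the aisles.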

The IoT includes every connection among objects, so we have machine-to-machine (M2M) systems, where each machine talks with other machine(s), communicating real-time data and information. Wikipedia describes this as a situation “where a device (such as a sensor or meter) [captures] an event (such as temperature, inventory level, etc.), which is relayed through a network (wireless, wired or hybrid) to an application (software program) that translates the captured event into meaningful information (for example, items need to be restocked).”
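The device-to-application chain in that description can be sketched in a few lines. This is an illustrative toy, not a real M2M protocol; the threshold and field names are assumptions:

```python
# Sketch of the M2M chain: a device captures an event, it is relayed,
# and an application translates it into meaningful information.

RESTOCK_THRESHOLD = 10  # assumed reorder point, for illustration only

def capture_event(sensor_id, inventory_level):
    """The 'device' side: package a raw reading as an event."""
    return {"sensor": sensor_id, "level": inventory_level}

def translate_event(event):
    """The 'application' side: turn the raw reading into an action."""
    if event["level"] < RESTOCK_THRESHOLD:
        return f"{event['sensor']}: items need to be restocked"
    return f"{event['sensor']}: stock OK"

# The 'network' here is just a function call; in practice it would be
# a wireless, wired, or hybrid link.
message = translate_event(capture_event("vending-07", 3))
```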

Note that the IoT phenomenon is not realized only when two or more objects are linked to each other in a network such as the Internet. Apart from this kind of connection, an object could also be indirectly linked to a person, thereby setting up a ring network among objects and people. It’s very simple, for example, to imagine a ring network that could link a person with one or more objects (a clock, a chair, a lamp, etc.) equipped with a technological system (RFID, near field communication [NFC], etc.).2

The IoT is a virtual reality that reproduces exactly what happens in the real world. Let’s imagine that our clock, chair, and lamp all contain RFID chips and are used by a disabled man. From a medical point of view, it may be important, for instance, to know how many times he uses the chair. At the same time, it is necessary to help him by automatically turning on the lamp when he sits in the chair. Using RFID, it is possible for the objects to communicate among themselves (e.g., the lamp turning on when the chair sends data that the man is sitting down) and at the same time send data over the Internet for, say, medical analysis. The information provided by each object can be aggregated, thereby creating a profile for him. The profile may contain sensitive information about the man, which raises the possibility of his being monitored. This is a very important point for privacy.
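The chair/lamp scenario can be sketched to show both sides of the coin: the local object-to-object reaction, and the quiet accumulation of the same events into a profile. All names and structures here are hypothetical:

```python
# Illustrative sketch of the chair/lamp example: object-to-object events
# trigger actions locally, while the same events, aggregated over time,
# form a profile of the person.

from collections import Counter

lamp_on = False
event_log = []  # the same data, kept for "medical analysis"

def chair_event(person, sitting):
    """The chair reports a state change; the lamp reacts; the event is logged."""
    global lamp_on
    lamp_on = sitting  # the lamp turns on when the chair says the man sat down
    event_log.append({"person": person, "object": "chair", "sitting": sitting})

chair_event("patient-42", True)
chair_event("patient-42", False)
chair_event("patient-42", True)

# Aggregation: innocuous per-object events become a usage profile.
sit_count = Counter(e["person"] for e in event_log if e["sitting"])
```

Each event is harmless on its own; the privacy question arises from the log, which records habits over time.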

PRIVACY: ISSUES AND RISKS IN THE INTERNET OF THINGS

Despite its many potential benefits, the Internet of Things poses significant privacy and security risks because of the technologies involved. These include cloud computing (the IoT often works in a cloud computing system) and data transmission technologies such as RFID, which often cannot absolutely guarantee any level of security.

Identification of Personal Information

The IoT system allows you to transfer data on the Internet, including personal data. Personal information may be transmitted only when the object in which the microchip is installed is linked to a person. This connection may be direct or indirect.

We could have a direct link when the user is aware of the possible transmission of his or her personal data and gives consent. Alternatively, let us suppose that a person buys something. If there is an RFID or other tag on the object, it could be a risk for privacy — especially if the person is automatically linked to the object during the purchasing process, such as through the use of a credit or loyalty card.

Alternatively, the connection may be indirect when the object is not linked directly to a person but only indirectly through the use of information that belongs to that person. For example, if we have x objects linked together by the Internet, I might know information about object 1, but I cannot know to whom this information belongs. I can know, however, that objects 2, 3, and so on are connected among themselves and to a person named “Jane.” In this way, it is possible to link every piece of information provided by the objects (2, 3, etc.) to Jane. Furthermore, if I know that it is possible to link object 1 to the others (2, 3, etc.), I can also indirectly know that the information provided by object 1 likewise belongs to Jane.
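The inference described above is essentially graph reachability: object 1 carries no identity of its own, but because it is connected to objects known to be Jane’s, its data can be attributed to her. A small sketch, with an invented link graph:

```python
# Indirect identification as reachability over an undirected link graph.
# The graph and names are illustrative.

def reachable(graph, start):
    """Breadth-first reachability: every node connected to `start`."""
    seen, queue = {start}, [start]
    while queue:
        node = queue.pop(0)
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

links = {
    "object1": ["object2"],
    "object2": ["object1", "object3"],
    "object3": ["object2", "Jane"],  # objects 2 and 3 are known to be Jane's
    "Jane": ["object3"],
}

# Object 1 is anonymous on its face, yet Jane is reachable from it.
inferred_owner = "Jane" in reachable(links, "object1")
```

This is why anonymizing a single object is not enough: identification leaks through the network of connections.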

Profiling

There are several risks and threats in the Internet of Things, but the main one is probably the risk of profiling.3, 4 If objects are linked to a person, it will be possible to obtain personal information about that person through the information transmitted over the Internet by each of those objects. Furthermore, this transmitted data may be stored in one or more servers. When a person can be identified through the use of credit or loyalty cards, it’s very simple to know the types of products purchased and thus to profile the person, learning about his or her habits and lifestyle. The person may have previously provided consent for the dissemination of data related to his or her purchases for advertising purposes. In terms of privacy, is it possible to protect a person? Who manages the personal data? Where will this data be stored?

Profiling can also be an issue with the movement toward “smart” grids and cities, a phenomenon that is close in nature to the Internet of Things. For some years now, there has been an interest in modernizing the existing electrical grid by introducing smart meters, which can communicate a consumer’s energy consumption data to the relevant utilities for monitoring and billing purposes. From a legal perspective, there is the need to consider the privacy issues arising from these initiatives, such as consumer profiling, data loss, data breach, and lack of consent (if consent is mandatory by law). In this field, the contributions of Ontario Information & Privacy Commissioner Ann Cavoukian are well known and have deepened our understanding of the privacy challenges related to the smart grid.5

Geolocation

Nowadays it is very simple to find digital photos with geolocation, which is also used by several smartphone apps. In fact, each smartphone OS by default alerts the user that an app could use the GPS system and access personal data on the device. However, probably few people know that inside each picture file there are some fields — among them “EXIF” and “GPS” — that contain the technical information about the photo and also the location where the picture was taken (see Figure 1).

If the user has not previously deactivated the geolocation service in the camera or smartphone, and the pictures have been published on a website or social network, anyone who views the photo can know exactly where the picture was taken and see who was there.

Figure 1 — Geolocation information for a digital photo.

In this way, privacy could definitely be compromised. When smartphones and other mobile devices are connected to the Internet, as they typically are, they contribute every day to the Internet of Things, sending data ready to be used by other people.
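As an illustration of how little stands between an EXIF GPS field and a point on a map: EXIF stores latitude and longitude as degree/minute/second triples plus a hemisphere reference, and libraries such as Pillow or exifread can extract the raw GPS fields from a photo file. The conversion to the decimal coordinates a map service consumes is a few lines (the sample coordinates below are invented):

```python
# EXIF GPS fields hold (degrees, minutes, seconds) plus a hemisphere
# reference such as 'N' or 'W'. Converting them to decimal degrees is
# all an observer needs to drop a pin on a map.

def dms_to_decimal(dms, ref):
    """Convert a (degrees, minutes, seconds) triple and hemisphere to decimal."""
    degrees, minutes, seconds = dms
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# A photo tagged 41 deg 53' 24.0" N, 12 deg 29' 32.4" E (illustrative values)
lat = dms_to_decimal((41, 53, 24.0), "N")
lon = dms_to_decimal((12, 29, 32.4), "E")
```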

Liability for Data Breaches

In Europe, there are numerous national and European Community (EC) laws relating to personal data breaches. Hence, the Internet of Things also has effects on liability in cases where the data being collected and transmitted lacks the appropriate security measures. For example, Directive 2002/58/EC states that:

In case of a particular risk of a breach of the security of the network, the provider of a publicly available electronic communications service must inform the subscribers concerning such risk and, where the risk lies outside the scope of the measures to be taken by the service provider, of any possible remedies, including an indication of the likely costs involved.6

Another risk is loss of data during processing. The consequences entail, obviously, liability for the data controller and data processor related to each specific situation. In fact, because the processing of personal data entails risks to the data in question (such as the loss of it), the newly proposed EU regulation on privacy contains an article requiring data controllers to conduct a privacy impact assessment (PIA) — an evaluation of data processing operations that pose particular risks to data subjects.

PROTECTING PRIVACY THROUGH THE PRIVACY BY DESIGN APPROACH

As we’ve seen, the Internet of Things really represents a global revolution: the objects that people use in the real world can “talk” to other objects and at the same time to the data subjects themselves. This awareness is the real engine that has pushed politicians and regulators to intervene in the IoT realm. In fact, there is a growing desire to create a general, comprehensive, and structured legal framework for the Internet of Things to protect users and consumers.

In October 2010, the 32nd International Conference of Data Protection and Privacy Commissioners adopted a resolution on Privacy by Design (PbD)7 that is a landmark and represents a turning point for the future of privacy. The resolution was proposed by Cavoukian, who is certainly the world leader on Privacy by Design.8 Instead of relying on compliance with laws and regulations as the solution to privacy threats, PbD takes the approach of embedding privacy into the design of systems from the very beginning. According to its website, “Privacy by Design advances the view that the future of privacy cannot be assured solely by compliance with legislation and regulatory frameworks; rather, privacy assurance must ideally become an organization’s default mode of operation.”9

The main goal is to bring together two concepts: data protection and the user. Regarding privacy, we have always thought in terms of compliance with laws, failing to evaluate the real role of the user (and his or her personal data). To develop an effective data protection and privacy approach, we must start any process with the user — the person who has to be protected — putting him or her at the center. This means that during the design process, the organization always has to be thinking of how it will protect the user’s privacy. By making the user the starting point in developing any project (or process), we realize a PbD approach. This methodological approach is based on the following seven foundational principles:10

• Proactive not reactive; preventative not remedial: The Privacy by Design (PbD) approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy-invasive events before they happen. PbD does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred — it aims to prevent them from occurring. In short, Privacy by Design comes before the fact, not after.

• Privacy as the default setting: We can all be certain of one thing — the default rules! Privacy by Design seeks to deliver the maximum degree of privacy by ensuring that personal data are automatically protected in any given IT system or business practice. If an individual does nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy — it is built into the system, by default.

• Privacy embedded into design: Privacy is embedded into the design and architecture of IT systems and business practices. It is not bolted on as an add-on, after the fact. The result is that it becomes an essential component of the core functionality being delivered. Privacy is integral to the system, without diminishing functionality.

• Full functionality — positive-sum, not zero-sum: Privacy by Design seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary tradeoffs are made. Privacy by Design avoids the pretense of false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both.

• End-to-end security — full lifecycle protection: Privacy by Design, having been embedded into the system prior to the first element of information being collected, extends throughout the entire lifecycle of the data involved, from start to finish. This ensures that at the end of the process, all data are securely destroyed, in a timely fashion. Thus, Privacy by Design ensures cradle-to-grave, lifecycle management of information, end-to-end.

• Visibility and transparency — keep it open: Privacy by Design seeks to assure all stakeholders that whatever the business practice or technology involved, it is in fact operating according to the stated promises and objectives, subject to independent verification. Its component parts and operations remain visible and transparent, to users and providers alike. Remember, trust but verify.

• Respect for user privacy — keep it user-centric: Above all, Privacy by Design requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options. Keep it user-centric.

We can see why the Privacy by Design approach is so important in the IoT environment. In fact, the Internet of Things should adopt the PbD principles and statements, always placing the user at the center.

The European Data Protection Supervisor (EDPS) has promoted PbD, touting the concept in its March 2010 Opinion of the European Data Protection Supervisor on Promoting Trust in the Information Society by Fostering Data Protection and Privacy11 as “a key tool for generating individual trust in ICT.” It was not long after this endorsement that the 32nd International Conference of Data Protection and Privacy Commissioners adopted the PbD concept as well.

PRIVACY PROTECTION DEVELOPMENTS IN EUROPE

In 2010, the European Commission set up an expert group to advise the commission on how to address the really hard IoT issues, such as:

• Governance mechanisms
• Data ownership
• Privacy
• A “right to the silence of the chips” (i.e., the ability of the user to turn off the microchip in any data transmission system)
• Standards
• International scope

Among these items, the reference to standards is important. Generally speaking, a worldwide standard could overcome any issues with regard to technical interoperability. We see, for instance, how the technical standard used for email allows people all over the world to employ a specific resource in the same way, using the same system. A standard for data processing would have the same impact on the processing of personal data.

Apart from the EC initiatives, I should mention the activities of the Article 29 Data Protection Working Party. The Art. 29 WP was set up under Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.12 It has advisory status, acts independently, and has published several documents. Among them is WP 180, which is titled Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications13 and was adopted 11 February 2011. While WP 180 focuses its attention on RFID instead of the IoT phenomenon per se, I believe it is very important for its express mention of the PIA, which makes it a key document in the general privacy context. Another group, the European Network and Information Security Agency (ENISA), also focuses on security and privacy and often publishes important papers in these fields, focusing on specific topics and proposing useful solutions.14, 15

Establishing a Legal Framework for Privacy Protection

On 25 January 2012, the European Commission proposed the new European legal framework for data protection, which states that it is the duty of the data controller to “implement mechanisms for ensuring that by default, only those personal data are processed which are necessary for each specific purpose of the processing and are especially not collected or retained beyond the minimum necessary for those purposes.”16 In addition, data controllers and processors must conduct a PIA before undertaking any risky data processing operations.
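The quoted duty can be read as a data-minimization filter applied at collection time: only the fields necessary for a declared purpose survive. A sketch using the smart-meter example from earlier, with invented purposes and field lists:

```python
# "Data protection by default" as a collection-time filter: process only
# the fields necessary for each specific, declared purpose. The purposes
# and field names below are illustrative assumptions.

ALLOWED_FIELDS = {
    "billing": {"meter_id", "kwh_total"},             # what billing needs
    "maintenance": {"meter_id", "firmware_version"},  # what servicing needs
}

def minimize(record, purpose):
    """Keep only the fields necessary for this specific purpose."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

reading = {
    "meter_id": "M-17",
    "kwh_total": 1043.5,
    "firmware_version": "2.1",
    "household_size": 4,  # never needed for either purpose; dropped by default
}
billing_view = minimize(reading, "billing")
```

The design choice is that the default path discards data; retaining a field requires an explicit entry in the purpose list, not the other way around.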

The EC proposal also introduces the expression “data protection by design and by default,” and certainly establishing this concept in the law is a positive development. Nevertheless, it is quite interesting to notice that the EC used a different expression (i.e., “data protection by design and by default”) from the one adopted in the international context (i.e., “Privacy by Design”). These two expressions represent two different methodological approaches. Privacy by Design is structured in a trilogy of applications (information technology, accountable business practices, physical design) and the seven principles quoted above. The EC formulation is more descriptive and not based on a method; also, the “by default” concept is autonomous, whereas the PbD approach embeds the same concept into “by design.” According to the text of Article 23 of the proposed regulation, it is clear that the EC considers “by design” and “by default” as different concepts, even though the words “by design” comprehend the concept “by default,” making the latter phrase redundant.

Furthermore, the EC proposal seems to pay a lot of attention to the technical and security aspects instead of the legal concerns, as seen in its highlighting of the term “security.” Hence, beyond introducing the expression “by design and by default,” it is difficult to imagine that its content will represent a real methodological approach to handling privacy protection according to international statements and/or become a worldwide privacy standard in the near future.

CONCLUSION

The Internet of Things involves all stakeholders, from companies to consumers. Focusing on the consumer is particularly important in order to guarantee a level of confidentiality that will earn the user’s trust. This is made possible by adopting the maximum level of security through the Privacy by Design approach and performing PIAs to evaluate the privacy risks of data collection and processing.

Industry may be wary of efforts to regulate the Internet of Things, as it regards the IoT phenomenon as a source of enormous business opportunities. For example, changes in lifestyle — such as the use of more technological services like domotics applications — can certainly increase the consumer’s quality of life (and industry’s profits). It will be up to consumers, regulators, and privacy professionals to convince the business sector that understanding the risks related to the IoT will produce equally valuable business opportunities: opportunities to protect privacy and increase the quality of life.

As I hope I have shown, it is very important to set up a privacy standard to facilitate a methodological approach to privacy and data protection. With the Internet of Things reaching ever more deeply into people’s lives, it would be beneficial to have an international privacy standard for processing personal data in the same way throughout the world, using the forward-looking PbD approach.

From a legal point of view, the main difficulty in setting up and using a privacy standard relates to existing laws, which are different in each nation (and even in different states and provinces within those nations). At the moment there is no international privacy standard model. However, the Organization for the Advancement of Structured Information Standards (OASIS) is a nonprofit consortium that drives the development, convergence, and adoption of open standards for the global information society. Among the several projects ongoing within OASIS there is a technical committee that is working on a Privacy Management Reference Model (PMRM) to provide “a guideline for developing operational solutions to privacy issues. It also serves as an analytical tool for assessing the completeness of proposed solutions and as the basis for establishing categories and groupings of privacy management controls.”17 The PMRM could potentially form the basis of the hoped-for international data processing standard.

Furthermore, I think it is possible to develop a standard privacy framework that organizations can use for their data protection activities. This standard framework may be adapted to national legislation, while keeping the main framework for all nation-states. Since the Privacy by Design approach is the foundational methodological approach to privacy protection, the privacy standard should be adopted according to PbD principles and statements.

In conclusion, it is necessary to continue research into the Internet of Things phenomenon, not only on the technological side, but also with respect to the legal issues, particularly privacy and security.


ENDNOTES

1. Smith, Ian G. (ed.). The Internet of Things 2012 – New Horizons (Cluster Book, 3rd edition). European Research Cluster on the Internet of Things (IERC), 2012 (www.internet-of-things-research.eu/pdf/IERC_Cluster_Book_2012_WEB.pdf).

2. While the Internet of Things is often identified with RFID, it can also work with Bluetooth, NFC, and other technologies. The technological resource is neutral and does not affect the specific phenomenon.

3. Hildebrandt, Mireille (ed.). “Deliverable 7.12: Behavioural Biometric Profiling and Transparency Enhancing Tools.” FIDIS (Future of Identity in the Information Society), No. 507512, March 2009.

4. Hildebrandt, Mireille. “Profiling: From Data to Knowledge.” Datenschutz und Datensicherheit (DuD), Vol. 30, No. 9, 2006.

5. Cavoukian, Ann, Jules Polonetsky, and Christopher Wolf. “Smart Privacy for the Smart Grid: Embedding Privacy into the Design of Electricity Conservation.” Identity in the Information Society, Vol. 3, No. 2, August 2010.

6. Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32009L0136:EN:NOT).

7. “Resolution on Privacy by Design.” 32nd International Conference of Data Protection and Privacy Commissioners, Jerusalem, Israel, 27-29 October 2010 (www.justice.gov.il/NR/rdonlyres/F8A79347-170C-4EEF-A0AD-155554558A5F/26502/ResolutiononPrivacybyDesign.pdf).

8. Cavoukian, Ann. Privacy by Design ... Take the Challenge. Ann Cavoukian, 2009.

9. “Applications.” Privacy by Design (www.privacybydesign.ca/index.php/about-pbd/applications).

10. “7 Foundational Principles.” Privacy by Design (www.privacybydesign.ca/index.php/about-pbd/7-foundational-principles).

11. Opinion of the European Data Protection Supervisor on Promoting Trust in the Information Society by Fostering Data Protection and Privacy. European Data Protection Supervisor (EDPS), 18 March 2010 (https://secure.edps.europa.eu/EDPSWEB/webdav/shared/Documents/Consultation/Opinions/2010/10-03-19_Trust_Information_Society_EN.pdf).

12. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML).

13. Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications (WP180) (http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp180_en.pdf).

14. Castelluccia, Claude, et al. “Privacy, Accountability, and Trust — Challenges and Opportunities.” European Network and Information Security Agency (ENISA), 18 February 2011 (www.enisa.europa.eu/activities/identity-and-trust/privacy-and-trust/library/deliverables/pat-study).

15. Tirtea, Rodica, et al. “Survey of Accountability, Trust, Consent, Tracking, Security and Privacy Mechanisms in Online Environments.” European Network and Information Security Agency (ENISA), 31 January 2011 (www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/survey-pat).

16. Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf).

17. “OASIS Privacy Management Reference Model (PMRM) TC.” OASIS (www.oasis-open.org/committees/tc_home.php?wg_abbrev=pmrm).

Attorney Nicola Fabiano is the founder and Partner at Studio Legale Fabiano, a Counsel of Italian High Courts, a Civil Law Specialist, a Privacy and Security Professional, a Privacy by Design Ambassador, and an ICT legal advisor. He follows the evolution of the Internet, paying particular attention to the legal issues of new Internet paradigms, such as cloud computing and the Internet of Things.

In 2010 Mr. Fabiano was recognized as a Privacy by Design (PbD) Ambassador by the Information & Privacy Commissioner of Ontario, Ann Cavoukian, and he is still working on PbD. He is a frequent speaker on a variety of topics, including privacy and e-privacy, e-signatures, registered electronic mail, the Internet of Things, and other ICT law issues. Mr. Fabiano is a consultant within the External Member Group of the EU project BUTLER on the Internet of Things. He has also been a member of the European Network and Information Security Agency (ENISA) Awareness Raising Community and enrolled in the official ENISA list of “Experts for identifying emerging and future risks posed by new ICTs” since 2009.

Mr. Fabiano is Sector Director of the Italian Institute for Privacy (IIP-Istituto Italiano Privacy), a member of the Council Internet of Things (http://www.theinternetofthings.eu), and a member of AIPSI (Associazione Italiana Professionisti Sicurezza Informatica), the Italian chapter of ISSA (Information Systems Security Association). He has authored numerous articles on the subject of privacy and data protection. He can be reached at [email protected].


Cutter IT Journal

About Cutter Consortium

Cutter Consortium is a truly unique IT advisory firm, comprising a group of more than 100 internationally recognized experts who have come together to offer content, consulting, and training to our clients. These experts are committed to delivering top-level, critical, and objective advice. They have done, and are doing, groundbreaking work in organizations worldwide, helping companies deal with issues in the core areas of software development and Agile project management, enterprise architecture, business technology trends and strategies, enterprise risk management, metrics, and sourcing.

Cutter offers a different value proposition than other IT research firms: We give you Access to the Experts. You get practitioners’ points of view, derived from hands-on experience with the same critical issues you are facing, not the perspective of a desk-bound analyst who can only make predictions and observations on what’s happening in the marketplace. With Cutter Consortium, you get the best practices and lessons learned from the world’s leading experts, experts who are implementing these techniques at companies like yours right now.

Cutter’s clients are able to tap into its expertise in a variety of formats, including content via online advisory services and journals, mentoring, workshops, training, and consulting. And by customizing our information products and training/consulting services, you get the solutions you need, while staying within your budget.

Cutter Consortium’s philosophy is that there is no single right solution for all enterprises, or all departments within one enterprise, or even all projects within a department. Cutter believes that the complexity of the business technology issues confronting corporations today demands multiple detailed perspectives from which a company can view its opportunities and risks in order to make the right strategic and tactical decisions. The simplistic pronouncements other analyst firms make do not take into account the unique situation of each organization. This is another reason to present the several sides to each issue: to enable clients to determine the course of action that best fits their unique situation.

For more information, contact Cutter Consortium at +1 781 648 8700 or [email protected].

The Cutter Business Technology Council

The Cutter Business Technology Council was established by Cutter Consortium to help spot emerging trends in IT, digital technology, and the marketplace. Its members are IT specialists whose ideas have become important building blocks of today’s wide-band, digitally connected, global economy. This brain trust includes:

• Rob Austin
• Ron Blitstein
• Tom DeMarco
• Lynne Ellyn
• Israel Gat
• Vince Kellen
• Tim Lister
• Lou Mazzucchelli
• Ken Orr
• Robert D. Scott