
Procedia Computer Science 37 ( 2014 ) 517 – 524

Available online at www.sciencedirect.com (ScienceDirect)

1877-0509 © 2014 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/). Peer-review under responsibility of the Program Chairs of EUSPN-2014 and ICTH 2014. doi: 10.1016/j.procs.2014.08.077

International Workshop on Privacy and Security in HealthCare (PSCare14)

PHR User Privacy Concerns and Behaviours

Reza Samavi a,∗, Mariano P. Consens a, Mark Chignell a

a Information Engineering, MIE, University of Toronto

Abstract

Results of an empirical study on the privacy concerns, attitudes, and behaviour of personal health record (PHR) users are presented.

The study addressed the following questions: (1) What are the factors influencing privacy concerns of PHR users? (2) To what

extent do PHR users read privacy agreements and which factors are in play when they fail to read them? (3) Are the behaviour and

attitudes of PHR users consistent with respect to reading privacy agreements and using privacy settings? We infer from the study

results that the factors influencing privacy concerns of general online users (as reported in literature) also apply to PHR users. In

spite of privacy concerns, 60% of the respondents in our study reported not reading privacy agreements. PHR users who were

highly concerned about privacy did not report changing their default privacy settings (as offered in a typical social networking

website) in a manner consistent with their stated attitudes towards privacy. Based on the issues identified in the study we conclude

with a number of design recommendations for privacy-friendly PHR systems.

Keywords: Information privacy, PHR, privacy management, privacy concern, risk perceptions in data disclosures

1. Introduction

While people gain increasing benefits from sophisticated online services, there are growing concerns about the

treatment of personal information online (broadly referred to as privacy concerns). This has led to the implementation of

legislative and regulatory privacy policies. However, the resulting rules assume that people are reasonably diligent in

deciding when and to whom to provide access to their personal data. Massive amounts of personal information on the

web are being collected and distributed based mainly on contractual agreements and consensual sharing on a personal

basis. However, in many cases, online users are not sufficiently familiar with what they are consenting or agreeing to.

When users take responsibility for personalizing their own privacy, ensuring comprehensibility of the consequences

is an increasingly complex task, and one that is outside the scope of current privacy regulations.

In this paper, we focus our investigation on issues surrounding the comprehensibility of privacy agreements and

settings of online applications when they are used by personal health record (PHR) users. Investigating PHR users' privacy

perspective is crucial as in PHRs the most critical personal information of an individual is at stake1. A PHR system

works as a platform with interfaces for a growing number of third party health applications2. Using these applications,

users are caught in new social contexts that go beyond the familiar patient-clinician health care context. In these new

contexts, it is mainly a contractual agreement that guides the collection, use and disclosure of an individual's health

information3.

∗ Corresponding author. Fax: +1 416 978 7753. E-mail address: [email protected] (Reza Samavi).

E-mail addresses: [email protected] (Mariano P. Consens), [email protected] (Mark Chignell).


We report the results of an empirical study of privacy concerns, attitudes, and behaviour of PHR users. The study

was conducted in collaboration with CAPCH (the Canadian Association for People-Centred Health, a community of

healthcare advocates). An online survey was completed by 38 members (around 20% of the active members in this

community), who were using PHR systems. This study makes contributions to the field of online privacy and PHR

system design by addressing the following research questions: (1) What are the factors influencing PHR users' privacy

concerns? (2) Do PHR users read privacy agreements and if they do not always read them, what are the factors causing

users not to read the agreements? (3) Are the behaviour and attitudes of PHR users consistent with respect to reading

privacy agreements and using privacy settings? We infer from the study results that the factors influencing privacy

concerns of general online users (as reported in literature) also apply to PHR users. In spite of privacy concerns, 60%

of the respondents in our study reported not reading privacy agreements. PHR users who were highly concerned about

privacy did not report changing their default privacy settings (as offered in a typical social networking website) in a

manner consistent with their stated attitudes towards privacy. Based on the issues identified in the study we conclude

with a number of design recommendations for privacy-friendly PHR systems.

The remainder of this paper is organized as follows. Section 2 reviews the related research. Section 3 describes the

model on which this research is based, and the rationale for hypothesis development. Section 4 describes the research

methodology and the results of the study. Section 5 reports on the evaluation of the hypotheses and discusses the

implications of this study. We conclude in Section 6.

2. Related Work

A number of studies evaluated privacy risks and vulnerabilities associated with PHR systems. Halamka et al.

investigated three PHR case studies and concluded that a successful PHR deployment requires careful attention to the

policies around privacy and management of patients' consent1. Gellman reported on the privacy implications of PHRs

and warned that in the PHR context consumers may think they have more control over usage of their health data than

they actually have4. Martino and Ahuja compared two PHR platforms using 11 privacy criteria such as ease of access,

readability, and transparency5. They concluded that enabling consumers to audit the compliance of PHR systems with

regulatory privacy requirements might significantly affect adoption of PHR systems. Recent surveys have shown that

PHR users are very concerned about their privacy. An empirical study conducted by Archer and Cocosila showed

that consumers are concerned about privacy issues that would arise from PHRs6. The Markle Foundation’s survey

showed that two-thirds of the public (U.S.) are concerned about the privacy of their health information when using

PHRs7. A survey conducted by California Healthcare Foundation revealed that 75% of those who were not using

PHRs expressed their concerns about the privacy of their information, citing it as a potential barrier to using PHRs8.

Previous studies have formulated a number of scales and factors for measuring privacy concerns. The concern

for information privacy (CFIP) scale defines four data-related dimensions of privacy concerns: collection, errors,

secondary use, and unauthorized access to information9. The IUIPC (Internet Users' Information Privacy Concerns)

was an adaptation of the CFIP to address the privacy concerns of Internet users. It introduced the factors of control and

awareness of privacy practice10. Xu et al. 11 and Smith et al. 12 introduced six dimensions influencing an individual’s

privacy concerns: privacy experiences, privacy awareness, personality differences, sensitivity of service or information

involved, trust in the service, and privacy regulations governing data sharing practices. Our goal in this research is

not to design yet another scale for measuring individual privacy concerns, but rather to study which aspects of the

underlying factors described above are consistent with privacy concerns of PHR users.

While research on online privacy is relatively recent, there are interesting parallels between how people read and

use privacy notices and agreements, and how consumers understand and respond to written warning messages and

nutritional labels. Researchers in online privacy have examined the relationship between trust, privacy concerns and

reading privacy agreements and notices, with particular emphasis on the credibility of websites13,14. Milne et al.

found that the perceived comprehensibility of notices has a strong effect on whether or not those notices are read15.

Comprehensible notices were more likely to be read, and trusted (see also16 and17). Westin et al. found that 65% of

consumers have experienced situations where they did not register at a web site because they believed that the privacy

policies were unclear or too complicated to comprehend16.

Several studies showed that inadequate information about privacy risks is not the only reason for ignoring privacy

agreements18,19. There is anecdotal evidence that individuals act and behave in ways contradictory to their stated

privacy concerns20. Adjerid et al. experimentally showed that users' behaviour is driven by relative judgments rather than by the actual


privacy risks20. This may be partly because individuals tend to discount

future costs or benefits, causing inconsistencies in personal preferences over time19.

3. Research Model and Hypotheses Development

In the model depicted in Fig. 1, we conceptualize the relationships between different antecedents of PHR users'

privacy concerns (UPC) and mechanisms for privacy decision making such as reading privacy agreements and using

privacy settings. UPC is at the centre of this model. We treat UPC as a dependent variable with respect to the left

side of the figure and then as an independent variable with respect to the right side of the figure (as suggested in12).

To address our first research question, we aimed to understand the underlying factors affecting privacy concerns of

PHR users. These factors are shown in the model with arrows pointing to UPC. The right side of the figure shows the

effects of privacy concerns on reading privacy agreements, as a mechanism for privacy decision making. To address

the second and third research questions, we extended our investigation to study the consistency of users' self-reported

behaviours and perceptions concerning whether or not they read privacy agreements and whether or not they

used privacy settings in making major privacy decisions. We describe below some antecedent variables for

privacy concerns and how we developed our hypotheses based on these antecedents.

Fig. 1. Research model

Factors Moderating Privacy Concerns. We reviewed the research literature on factors moderating privacy concerns

of general Internet users (e.g. CFIP9, IUIPC10, Extended IUIPC11) as well as previous surveys addressing privacy

issues of PHR platforms7,8. Based on that review, the following factors were selected for further study as potential

drivers of PHR users' privacy concerns.

Privacy awareness (PAW) deals with the extent to which an individual is informed about privacy practices and

associated issues relating to organizations and society in general. Research suggests that privacy concerns of individ-

uals may come into play when individuals become aware of how their data is collected and used, and of what their

individual rights are concerning that data usage10. H1: PHR users who are more aware of privacy regulations and practices are more concerned about their PHR privacy.

Information type sensitivity (ISE) deals with individual differences in privacy concerns when the type of

information is different. Privacy concerns vary across different types of websites (e-commerce, social networking,

financial, and healthcare)11. H2: PHR users with a more granular view of the types of their health data (in terms of privacy) are more concerned about their PHR privacy.

Service type sensitivity (SSE) deals with individual differences in privacy concerns when the types of PHR

service differ (e.g. a service to track physical exercise versus a service to track weight). H3: PHR users with a more detailed view of the types of PHR services they are exploiting (in terms of privacy) are more concerned about their PHR privacy. Although H2 and H3 look similar, the distinction has been made in a number of privacy studies to

show that characteristics of websites and services, even if they are using the same type of user data, may affect user

privacy concerns differently21,22.

Protection by legislation and regulations (PLP) deals with individual differences with respect to the regulations

that govern data usage practices23. H4: PHR users who live in jurisdictions with more robust privacy regulations are more concerned about their PHR privacy.

Privacy experience (PRE) accounts for the extent to which an individual's past experience with privacy breaches or

intrusions affects the current privacy concerns of the individual24. H5: PHR users who have experienced more privacy breaches in the past are more concerned about their PHR privacy.

Personality Differences (PDIFF) such as being extraverted or introverted and level of social awareness may affect

privacy concerns of individuals25,26. H6: PHR users with different online social personalities will tend to have different attitudes towards their PHR privacy.

Page 4: PHR User Privacy Concerns and Behaviours

520 Reza Samavi et al. / Procedia Computer Science 37 ( 2014 ) 517 – 524

Strategies for Privacy Self-management. Multiple factors affect which strategy (e.g. reading privacy agreements

and notices, or using privacy settings) will be taken when online users make privacy decisions15,10. Therefore we

hypothesized that H7: PHR users who are more concerned about their privacy will also report that they read more privacy agreements when using online services. A research study conducted by Vail et al.27 showed that people react

differently when the format of privacy notices changes. So we hypothesized that H8: PHR users perceive privacy settings to be a more effective mechanism in privacy decision making than reading privacy agreements.

We also used this empirical study to test PHR users’ self-reported actual behaviour in using existing privacy set-

tings, as compared to their self-reported intentions to use privacy settings. We stated the following two hypotheses,

H9: PHR users' self-reported behaviour of changing privacy settings is different from their self-reported behaviour of reading privacy agreements. H10: PHR users' self-reported perception of using privacy settings is different from their self-reported actual behaviour of changing privacy settings.

4. Research Methodology and Results

We measured PHR users’ privacy concerns, attitudes and behaviours using self-reported scales. The survey in-

cluded 32 questions and was designed as part of a larger study to measure multiple aspects of PHR users' privacy concerns

and behaviour. The results reported in this paper are based on 16 questions that measure a) respon-

dents’ privacy concerns and perceived privacy risks, b) respondents’ self-reported behaviour on deciding to read or to

ignore privacy agreements, c) respondents' perception of paying attention to privacy settings versus reading

privacy agreements, d) respondents’ self-reported privacy behaviour when using social networking websites and their

usage pattern of such social networking websites, and e) respondents’ demographic background. To increase construct

validity we used items from existing scales (available in the literature) where possible. The measurement items were

randomly ordered to minimize the unsystematic variation from boredom effects.

The survey was reviewed and approved by the University of Toronto Ethics Review Board. One of the main

challenges of the study was participant selection. Since the PHR is an emerging technology, as with other new

technologies there are certain groups of people who know more about it and are

willing to adopt it earlier than others. We intended to target these early adopters and understand their perceptions

on privacy issues of PHR platforms. We made the survey available to a community of people-centred healthcare

advocates in Ontario, Canada (the Canadian Association for People-Centred Health, http://capch.ca/). Our preliminary

research and interviews with the community organizers showed that a number of participants in this community had

already adopted some form of PHR or were willing to adopt it for themselves or for a close relative. Upon completion

of the survey a $5 donation was made to the charity of the participant’s choice (participants could pick one charity

among four well-known healthcare related charities in Canada) as a gesture of appreciation for their time.

Sample Characteristics. There were 38 people who participated in the survey, of which responses from 33 respon-

dents were usable. Respondents were well distributed in terms of gender with 16 out of 30 (53%) being male while 14

out of 30 (47%) were female. Respondents had mainly lived in Canada in the past 10 years (28/30), although 7 out of

30 of them also lived for more than a year in the United States or some European Union countries. 27 out of 33 respon-

dents were aged 40 or greater. Respondents in this survey were highly educated. Twenty out of 30 (67%) completed

graduate school and another 9 out of 30 ( 30%) graduated from some college programs. The average income for 24

respondents who reported their income was in the range $150,000-$200,000. Twenty-six respondents were employed,

2 respondents were retired and only 1 respondent was a student. The internal consistency of the questionnaire for

self-reporting PHR user privacy concerns, attitudes and behaviours was tested using Cronbach’s Alpha.
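As a rough illustration of the internal consistency check mentioned above, the sketch below computes Cronbach's Alpha for a multi-item scale. This is not the authors' analysis code; the function and the sample responses are hypothetical, assuming rows are respondents and columns are Likert-scale items measuring one construct.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # Cronbach's Alpha for an (n_respondents x n_items) matrix of scale items.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from four participants to three items:
responses = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]])
print(round(cronbach_alpha(responses), 2))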

Factors Moderating PHR Users' Privacy Concerns. We studied the relationship between PHR users' self-reported

privacy concerns and other privacy factors as illustrated in the model in Fig. 1. 28 out of 33 (85%) respondents

indicated that they were concerned about the usage of their personal information and wanted to know how their data

are used. This question was used as an indicator of PHR user privacy concerns (UPC). We tested our hypotheses

H1 through H6 with a multiple regression model where UPC was the dependent variable and PAW, ISE, SSE, PLP, PRE,

and PDIFF were predictors. We aimed to understand how each factor may contribute to predicting the variability of

privacy concerns and whether the contribution was significant. Before we built our regression model, we created a

correlation matrix for all dependent and independent variables and also ran a number of diagnostic tests. ISE and SSE

were strongly correlated and thus we created one variable as the average of those two sensitivities. A new combined


hypothesis then replaced the original second and third hypotheses: H2&3: PHR user privacy concerns change as the type of data they are sharing or the type of services they are using changes.

We tested H1, H2&3, H4, H5, and H6 using a multi-variable regression model in which all

variables were entered simultaneously (forced entry). The results are shown in Table ??. The regression model

achieved statistically significant prediction (F(5, 22)=7.73, p<.001, Adjusted R2=.56), with ISSE and PLP being the

significant predictors. Privacy concerns (UPC) tended to be positively related to ISSE (β=.799) and PLP (β=.578),

supporting H2&3 and H4. However, the t-tests for the β values of PAW, PRE, and PDIFF were not statistically significant.

Thus, contrary to hypotheses H1, H5, and H6, no significant relationship between those factors and privacy concerns

was discovered.
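For concreteness, the following is a minimal sketch (not the authors' analysis code) of the forced-entry regression described above, using the statsmodels library; the data frame and column names (UPC, PAW, ISE, SSE, PLP, PRE, PDIFF) are assumptions based on the construct labels used in this section.

import pandas as pd
import statsmodels.formula.api as smf

def fit_upc_model(df: pd.DataFrame):
    # ISE and SSE were strongly correlated, so they are averaged into a single
    # combined sensitivity predictor (ISSE) before fitting, as described above.
    df = df.assign(ISSE=df[["ISE", "SSE"]].mean(axis=1))
    # Forced entry: all five predictors enter the model simultaneously.
    return smf.ols("UPC ~ PAW + ISSE + PLP + PRE + PDIFF", data=df).fit()

# model = fit_upc_model(survey_scores)
# model.summary() reports the F statistic, adjusted R^2, and per-predictor t-tests.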

Privacy Concerns and Reading Privacy Agreements. In this step, we studied how PHR users differ in terms of their

self-reported behaviour of reading privacy agreements (RPA), their self-reported perception of using privacy settings

(PSUP), and their self-reported behaviour of changing default privacy settings of some social networking websites

they were members of (PSUB). We tested hypotheses H7-H10 and we describe below how the results were inferred.

When participants were asked how often in the past 12 months they had read (either completely or partially) Privacy

Agreements before choosing whether to proceed with an application or service, 9 out of 30 respondents answered

never while only 3 out of 30 respondents indicated that they always read the agreements. 9 out of 30 respondents read

the privacy agreements much less than half of the time, while 5 out of 30 respondents read about half of the time and

only 4 out of the responding participants read privacy agreements much more than half of the time. In Fig. 2-a, we

grouped those who always, or much more than half of the time, read the privacy agreements as the readers, those who

never or much less than half of the time read as non-readers, and those who read about half of the time as undecided.

To study whether there is a relationship between the pattern of respondents' reading privacy agreements (RPA) and the

pattern of respondents' privacy concerns (UPC), we used Spearman's ρ to find the correlation between participants'

level of privacy concerns and their willingness to read privacy agreements. There was no significant correlation

between RPA and UPC in our sample (r=.124, NS), suggesting that there is no significant relationship between the

level of self-reported privacy concerns and the level of willingness to read online privacy agreements. Therefore

hypothesis H7 was not supported: given the size of the sample, we did not have enough evidence to reject the corresponding

null hypothesis. Table 1 summarizes the reasons that respondents gave for not reading privacy agreements.

Participants could select as many reasons as they thought important with an option to express other reasons. Leading

reasons were the time required for reading and digesting privacy agreements, with other reasons cited including the

repetitive and boring nature of the task and the agreements not being understandable.

Table 1. Results for self-reported reasons for not reading privacy agreements/notices

Reason  Respondents

It is too time consuming to read the entire Privacy Agreement. 19/29 (66%)

The data that the application is going to use is not critical to me. 12/29 (41%)

I found reading Privacy Agreements boring. 12/29 (41%)

I only use an application when I trust the provider. 9/29 (31%)

I would not be able to understand the content of a Privacy Agreement. 8/29 (28%)

The existing privacy legislation will protect me if my data is being misused by an online service. 4/29 (14%)

I do not use my real name. 1/29 (3%)

The providers would not act based on what they state in the Privacy Agreements anyway. 1/29 (3%)
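The correlation test described before Table 1 can be sketched as follows (again, not the authors' code); upc_scores and rpa_scores are assumed to be equal-length vectors of ordinal survey responses.

from scipy.stats import spearmanr

def test_h7(upc_scores, rpa_scores, alpha=0.05):
    # Spearman's rho is appropriate here because both measures are ordinal.
    rho, p_value = spearmanr(upc_scores, rpa_scores)
    return rho, p_value, p_value < alpha   # H7 is supported only if significant

# In the study, rho was about .12 and not significant, so H7 was not supported.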

Consistency Between Privacy Perceptions and Behaviours. In terms of handling privacy settings, 12 out of 30

respondents indicated that they would almost always pay attention to the privacy settings, followed by 8 out of 30

who said they would often pay attention, 3 out of 30 who would moderately pay attention, 5 out of 30 who would rarely pay attention, and 2 out of 30 who indicated that they would not pay attention to the options and that they

would mainly rely on the default privacy settings. In Fig. 2-b we grouped those who always or often pay attention to

privacy settings as the pay attention group, those who would not or rarely pay attention as do not pay attention group

and those who moderately pay attention as the undecided group. To understand how respondents differed in terms of

reading privacy agreements and notices (RPA) compared to paying attention to privacy settings (PSUP), we performed

a dependent t-test. Respondents reported that they paid more attention to privacy settings (M=3.80, SE=.24) than to

reading privacy agreements (M=2.43, SE=.24), t(29)=-4.48, p<.001. The calculated effect size r was .62,


which is above the .5 threshold for a large effect28. The results of the t-test and the effect size supported hypothesis

H8. This can also be inferred by comparing Fig. 2-a and Fig. 2-b as there is a significant difference between the size

of the reader group (23%) and the pay attention group (66%). Thus we can infer that PHR users perceived using privacy

settings to be more effective than reading privacy agreements and notices for privacy decision making.
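A minimal sketch of the dependent t-test and effect-size computation used above (not the authors' code); psup_scores and rpa_scores are assumed to be the paired per-respondent ratings.

import numpy as np
from scipy.stats import ttest_rel

def paired_test_with_effect_size(psup_scores, rpa_scores):
    t, p = ttest_rel(psup_scores, rpa_scores)   # dependent (paired) t-test
    df = len(psup_scores) - 1
    r = np.sqrt(t**2 / (t**2 + df))             # effect size r = sqrt(t^2 / (t^2 + df))
    return t, p, r

# With t(29) = 4.48, the formula gives r of roughly .6, above Cohen's .5
# threshold for a large effect, consistent with the value reported above.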

Fig. 2. Mismatch between perception and behaviour of using privacy settings: (a) behaviour of reading privacy agreements, (b) perception of paying attention to privacy settings, (c) behaviour of changing default privacy settings

To investigate the consistency of users' self-reported perceived behaviour with actual behaviour in using privacy

settings and to test hypotheses H9 and H10, we asked participants how often they changed the default privacy settings offered by the social networking websites they are members of. 11 out of 28 respondents answered almost never, 6

out of 28 rarely, 8 out of 28 occasionally, 1 out of 28 often, and 2 out of 28 almost always. The results are shown in

Fig. 2-c. We grouped those who would often or almost always change the default privacy settings as the change group

(11%), those who would never or rarely change the default as the do not change group (60%), and those who occasionally

change as the undecided group (29%). We performed another dependent t-test to compare mean values of participants'

willingness to read privacy agreements (RPA-Fig. 2-a) with their willingness to change the default privacy settings

of their social networking websites (PSUB-Fig. 2-c). Interestingly, on average, participants' behaviour in

changing the default privacy settings (PSUB) (M=2.18, SE=.23) was not significantly different from reading privacy

agreements (RPA) (M=2.43, SE=.24), t(27)=-.96, NS. Therefore, hypothesis H9 was not supported; i.e. given the size of the

sample, there was not enough evidence to conclude that PHR users' behaviour in changing

privacy settings is not consistent with the self-reported pattern of reading privacy agreements. This can be observed

by comparing the reader portion of Fig. 2-a and change portion of Fig. 2-c.

Comparing PSUP (the perception of using privacy settings) and PSUB (actual behaviour of using privacy settings)

also revealed that on average, respondents perceived themselves to pay significantly more attention to changing de-

fault privacy settings (M=3.80, SE=.26) than they reported actually changing those default privacy settings (M=2.18,

SE=.23), t(27)=6.06, p<.001, r=.76. As a result of this test, hypothesis H10 was supported, indicating that respon-

dents’ self-reported actual behaviour in changing the default privacy settings was in fact inconsistent with their self-

reported perception of using privacy settings. Respondents perceived themselves to be more vigilant towards changing

privacy settings than was warranted by their self-reported behaviour with respect to social networking default privacy

settings.
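As a check, the standard conversion from a paired t statistic to the effect size r reproduces the value reported for the PSUP versus PSUB comparison (t(27)=6.06):

\[ r = \sqrt{\frac{t^2}{t^2 + df}} = \sqrt{\frac{6.06^2}{6.06^2 + 27}} \approx 0.76 \]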

We also studied the effects of personality differences (usage of social networking website as the indicator) and

demographic differences (gender, age, and education) on RPA, PSUP, and PSUB. The results indicated no significant

differences across categories of personality differences, gender, age and education.

5. Discussion and Implications

The regression results indicate that PHR users’ sensitivity to the type of PHR data and services, as well as the

coverage of privacy protection Acts in the jurisdiction in which the users live, are factors moderating the users' privacy

concerns. These factors mirror the factors previously noted as influencing the privacy concerns of general online

users11,29,17,12. We were interested in understanding PHR users’ perceptions and behaviour when making privacy

decisions compared to the attitudes of general Internet users. This issue had been previously raised in the Privacy

Leadership Initiative (PLI) report conducted by Harris Interactive Inc. in 200130 and in the Milne et al. study in

200415. Our results, as well as those of other researchers, showed that a large portion of respondents (consistently

more than 50%) decided not to read privacy agreements as a privacy decision-making strategy. This pattern has not

changed over a decade. In our study, while 28 out of 33 respondents (85%) indicated they were concerned about the


usage of their data, they nevertheless reported not reading privacy agreements to make privacy decisions. Samavi

and Topaloglou31 illustrated that in order to understand user privacy concerns and develop systems that address these

concerns, the system requirement models should be able to capture intentionality, reasons and motivations behind

the behaviour of system users. The implication of the results reported in this paper, particularly for PHR application

architects and developers, is that relying on privacy agreements as the only medium to communicate data practices

and privacy risks may be insufficient to assure that PHR users make informed privacy decisions. Thus there is a need

to implement alternative privacy decision-making strategies.

Another interesting finding of our study was that the self-reported perception of using privacy settings as an alter-

native to privacy notices and agreements diverged from the actual behaviour of changing privacy settings. There is an

extensive body of literature in behavioural economics and psychology32 that suggests that there is a deviation from ra-

tionality when people decide about longer term future risks. Acquisti et al. 19 showed that this deviation is particularly

relevant to privacy decision-making. It seems that PHR users, similar to general Internet users, perceive themselves

to be privacy concerned, and report that they would pay attention to privacy settings, but do not in fact do so. The

distinction between perception and behaviour is particularly crucial in the PHR context since a study conducted by

Li et al.33 found that perceived privacy control and trust (as opposed to actual behaviour) are among the major factors

determining intention to adopt the PHR, overriding the effect of potential privacy risks. Thus, there is a need to not

only transform communication of privacy agreements but also to improve privacy settings so that the meaning of the

features and the consequences of sharing are better communicated to users.

The open-ended responses support our conclusions above. Users would like a more simplified model of privacy

settings as can be seen in the following quote: “Only useful if the choices [in privacy settings] are few and under-

standable. Not like Facebook's options!” Users would also appreciate informative intervention by privacy experts

or other trustworthy entities, such as the privacy commissioners, in the privacy decision-making process. Illustrative

comments from our respondents were, “I'd rather just have an IPC [information and privacy commissioner] certify

button, or something like that, to show that they [application providers and PHR platforms] adhere to the law [instead

of using privacy settings].” Insights about the type of solutions that application designers and architects can pursue

to accommodate users' needs were also gained from the survey results. 21 out of 29 (72%) respondents indicated that

they may consider reading privacy agreements and notices if they are in summarized format or if the length is less

than a page. 12 out of 29 (41%) respondents stated that the likelihood of reading privacy agreements might increase if

visual cues or graphics are included. Finally, 12 out of 29 (41%) respondents indicated that additional tips provided

by privacy experts would make them more likely to read privacy agreements.

Study Limitations. This study focused on a relatively small sample of people who were likely to have strong views

on privacy. Respondents of the survey came from a community of people-centred healthcare advocates in Ontario,

Canada. While this provided a strongly focused sample, it also meant that even with a response rate around 20%

of that community, our sample size was only around 30 for most questions in our survey. Due to the nature of the

community being sampled, respondents in our survey were highly educated, had relatively higher income and were

from older age groups, limiting the generalizability of our results to the general population, although a focus on older

people with higher education and higher income seems reasonable given that they are relatively more concerned

about their privacy34,35. Another limitation of this study is the use of a survey as the instrument to measure privacy

behaviour and perceptions. To maintain accuracy of the self-reported behaviour, we asked indirect questions and/or

questions about familiar environments from which participants could more accurately report their behaviour (i.e. social

networking websites). However, the accuracy could be improved if the study were re-designed as a lab experiment with

direct observation of participants’ privacy behaviour.

6. Conclusions

We draw two main conclusions from the results reported here. First, PHR application architects and developers

cannot rely on privacy agreements as the only medium to communicate data protection practices and privacy risks to

PHR users. Second, the growing number of privacy settings, as we observe in popular social networking platforms,

may not be as useful as they seem since they are not being used by most people, even by those people who are

concerned about privacy. Thus there is a crucial need for development of alternative privacy preserving options.

Based on the responses to open-ended questions in our survey, respondents consistently saw value in changing the

privacy decision making process to include involvement from privacy experts. Further research is needed on how to

encode privacy expert knowledge so that it can effectively facilitate decisions concerning personal privacy.


Acknowledgments

We extend our sincere thanks to Dr. Vaughan Glover, CEO of the Canadian Association for People-Centred Health

(CAPCH) and all members of CAPCH for their support of the study. Financial support from NSERC Canada and

Privacy Awards from IBM and the Information and Privacy Commissioner of Ontario are gratefully acknowledged.

References

1. Halamka, J., Mandl, K., Tang, P.. Early experiences with personal health records. J of the American Med Info Association 2008;15(1):1–7.

2. Mandl, K.D., Kohane, I.S.. No Small Change for the Health Information Economy. New England Journal of Medicine 2009;360(13):1278–

1281. URL: http://dx.doi.org/10.1056/NEJMp0900411. doi:10.1056/NEJMp0900411.

3. TELUS health space, . Privacy Impact Assessment. TELUS Health Solutions; version 1.0 ed.; 2011.

4. Gellman, R.. Personal Health Records: Why Many PHRs Threaten Privacy. Tech. Rep.; World Privacy Forum; 2008.

5. Martino, L., Ahuja, S.. Privacy policies of personal health records: an evaluation of their effectiveness in protecting patient information. In:

Proc. of the 1st ACM Int. Health Informatics Symp. 2010, p. 191–200.

6. Archer, N., Cocosila, M.. Canadian patient perceptions of electronic personal health records: An empirical investigation. Communications of the Association for Information Systems 2014;34(1):20.

7. Markle Foundation, . Knowledge Network: Survey on public opinions on the potential and privacy considerations of Individually Controlled

Electronic Personal Health Records. Knowledge Network, Connection for Health 2008;.

8. California HealthCare Foundation, . Consumers and health information technology: A national survey. Health IT Consumer Survey 2010;.

9. Smith, H.J., Milberg, S.J.. Information privacy: measuring individuals’ concerns about organizational practices. MIS Q 1996;20(2):167–196.

10. Malhotra, N.K., Kim, S.S., Agarwal, J.. Internet users' information privacy concerns (IUIPC): The construct, the scale, and a causal model.

Info Sys Research 2004;15(4):336–355.

11. Xu, H., Dinev, T., Smith, H., Hart, P.. Examining the formation of individual’s privacy concerns: Toward an integrative view. ICIS 2008;.

12. Smith, H.J., Dinev, T., Xu, H.. Information privacy research: An interdisciplinary review. MIS Q 2011;35(4):989–1015.

13. Princeton Survey Research Associates, . A Matter of Trust: What Users Want From Web Sites. Results of a National Survey of Internet Users

for Consumer WebWatch. Princeton Survey Research Associates 2002;.

14. Fogg, B.J., Soohoo, C., Danielson, D.R., Marable, L., Stanford, J., Tauber, E.R.. How do users evaluate the credibility of web sites?: a

study with over 2,500 participants. In: Proc. of DUX. ACM. ISBN 1-58113-728-1; 2003, p. 1–15.

15. Milne, G.R., Culnan, M.J.. Strategies for reducing online privacy risks: Why consumers read (or don't read) online privacy notices. Journal of Interactive Marketing 2004;18(3):15–29.

16. Westin, A.F.. How to craft effective online privacy policies. Privacy and American Business 2004;11(6):1–2.

17. Milne, G.R., Culnan, M.J., Greene, H.. A longitudinal assessment of online privacy notice readability. Public Policy & Marketing 2006;

25(2):238–249.

18. Simon, H.A.. Models of Bounded Rationality. MIT Press; 1982.

19. Acquisti, A.. Privacy in electronic commerce and the economics of immediate gratification. In: Proc. of EC. 2004, p. 21–29.

20. Adjerid, I., Acquisti, A., Brandimarte, L., Loewenstein, G.. Sleights of privacy: framing, disclosures, and the limits of transparency. In:

Proc. of the Ninth SOUPS. 2013, p. 9.

21. Milne, G.R., Boza, M.E.. Trust and concern in consumers' perceptions of marketing information management practices. Journal of Interactive Marketing 1999;13(1):5–24.

22. Tsai, J., Egelman, S., Cranor, L., Acquisti, A.. The effect of online privacy information on purchasing behavior: An experimental study.

Information Systems Research 2011;22(2):254–268.

23. Smith, H.J.. Information Privacy and Marketing: What the US Should (and Shouldn't) Learn from Europe. California Management Review 2001;43(2):8–33.

24. Dolnicar, S., Jordaan, Y.. Protecting consumer privacy in the company’s best interest. Australasian Marketing Journal 2006;14:39–61.

25. Lu, Y., Tan, B.C.Y., Hui, K.L.. Inducing customers to disclose personal information to internet businesses with social adjustment benefits.

In: Proc. of ICIS. 2004, p. 571–582.

26. Dinev, T., Hart, P.J.. An extended privacy calculus model for e-commerce transactions. Information Systems Research 2006;17(1):61–80.

27. Vail, M., Earp, J., Anton, A.. An empirical study of consumer perceptions and comprehension of web site privacy policies. IEEE Transactions on Engineering Management 2008;55(3):442–454.

28. Cohen, J.. Statistical power analysis for the behavioral sciences. Psychology Press; 1988.

29. Bansal, G., Zahedi, F.M., Gefen, D.. The moderating influence of privacy concern on the efficacy of privacy assurance mechanisms for

building trust: A multiple-context investigation. In: Proc. of ICIS. 2008, .

30. Harris Interactive, . Privacy notices research final results. Privacy Leadership Initiative (PLI) 2001;Study No. 15338.

31. Samavi, R., Topaloglou, T.. Designing privacy-aware personal health record systems. In: ER Workshops – Advances in Conceptual Modeling – Challenges and Opportunities. Springer, LNCS; 2008, p. 12–21.

32. Kahneman, D.. Choices, Values, and Frames. Cambridge University Press; 2000. ISBN 0521627494.

33. Li, H., Gupta, A., Zhang, J., Sarathy, R.. Examining the decision to use standalone personal health record systems as a trust-enabled fair

social contract. Decision Support Systems 2014;57:376–386.

34. Dommeyer, C.J., Gross, B.L.. What consumers know and what they do: An investigation of consumer knowledge, awareness, and use of

privacy protection strategies. Journal of Interactive Marketing 2003;17(2):34 – 51.

35. Schoenbachler, D., Gordon, G.. Trust and customer willingness to provide information in database-driven relationship marketing. Journal of Interactive Marketing 2002;16(3):2–16.