Page 1

An Appropriate Design for Trusted Computing and Digital Rights Management

Presentation to the State Services Commission of New Zealand

Prof. Clark Thomborson

8th December 2006

Page 2

Topical Outline

• Requirements analysis of e-government and corporate DRM at three levels: static, dynamic, governance.
• Assessment of IRM v1.0 with TC support.
• Compliance: New Zealand's Four Principles for TC/DRM.
• Suggested design improvements:
  • IRM: emphasise integrity and availability, not confidentiality.
  • TC: more support for audit.
  • Relationship management: support for hierarchical, bridging, and peering trust with other systems and individuals.
• Steps toward uniform "purchase requirements", with emphasis on interoperability and appropriate security. In progress at the Jericho Forum.
• Eventually: develop an appropriate audit standard for DRM, perhaps through ISO.

Page 3

Static Security for DRM

• CIA: confidentiality, integrity, and availability.
• Internally-authored documents fall into three categories:
  • Integrity first: internal correspondence. Agency-confidential (or corporate-division-confidential) by default, but keys are shared widely within the agency to ensure ready availability.
  • Integrity and availability first: operational data, e.g. citizen (or customer) records. Agency-confidential except where privacy laws or expectations require finer-grained protection. Provisions for "bridging trust" allow efficient data sharing between agencies, where appropriate.
  • Rarely: highly sensitive data, such as state (or corporate) secrets, requiring narrowly controlled access within the agency.
• Externally-authored documents also fall into three categories:
  • Integrity first: unsigned objects, e.g. downloads from the web.
  • Integrity and availability first: signed objects, e.g. contracts, tax returns.
  • Rarely: objects whose confidentiality is controlled by an external party, e.g. licensed software and media.
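To make the classification concrete, here is a minimal sketch (illustrative only; the category names, field names, and numeric priorities are my own, not part of IRM or any product) of how an agency's rights server might attach a default protection profile and key scope to each internally-authored document category:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    INTERNAL_CORRESPONDENCE = "integrity first"
    OPERATIONAL_DATA = "integrity and availability first"
    HIGHLY_SENSITIVE = "narrowly controlled access"

@dataclass
class Profile:
    integrity: int          # relative priority: 3 = highest, 1 = lowest
    availability: int
    confidentiality: int
    key_scope: str          # who holds the decryption keys by default

# Default profiles for internally-authored documents, following the list above.
DEFAULTS = {
    Category.INTERNAL_CORRESPONDENCE: Profile(3, 2, 1, key_scope="shared agency-wide"),
    Category.OPERATIONAL_DATA:        Profile(3, 3, 1, key_scope="agency-wide; finer-grained where privacy law requires"),
    Category.HIGHLY_SENSITIVE:        Profile(3, 1, 3, key_scope="named individuals only"),
}

def default_profile(category: Category) -> Profile:
    """Return the default CIA priorities and key scope for a document category."""
    return DEFAULTS[category]
```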

Page 4

Dynamic Security

• The gold standard: Authentication, Authorisation, Audit.
• If taken to an extreme, we'll have a "gold-plated" system design!
• Metaphorically, a security engineer should:
  • seal all security perimeters with an authenticating gold veneer,
  • sprinkle auditing gold-dust uniformly but very sparingly over the most important security areas, and
  • place an authorising golden seal on all of the most important accesses.
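A hedged illustration of where the gold goes: in the sketch below (all names hypothetical), only the accesses the engineer has singled out as most important are wrapped with authentication, authorisation, and an audit record; everything else runs unwrapped.

```python
import datetime

AUDIT_LOG = []   # in a real system this would be a tamper-evident store, not a Python list

def guarded(action, subject, resource, authenticate, authorise):
    """Run `action` only after authentication and authorisation, auditing the attempt either way."""
    if not authenticate(subject):
        decision = "denied: authentication failed"
    elif not authorise(subject, resource):
        decision = "denied: not authorised"
    else:
        decision = "allowed"
    AUDIT_LOG.append((datetime.datetime.utcnow(), subject, resource, decision))
    if decision != "allowed":
        raise PermissionError(decision)
    return action()

# Gold-plate only this high-value access; routine, low-value reads are left unwrapped.
contents = guarded(lambda: "contents of budget.docx",
                   subject="alice", resource="budget.docx",
                   authenticate=lambda s: s == "alice",
                   authorise=lambda s, r: (s, r) == ("alice", "budget.docx"))
```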

Page 5

Security Governance

• Governance should be pro-active, not reactive. Governors should constantly be asking questions, considering the answers, and revising plans:
  • Specification, or Policy (answering the question of what the system is supposed to do),
  • Implementation (answering the question of how to make the system do what it is supposed to do), and
  • Assurance (answering the question of whether the system is meeting its specifications).
• We're still in the early stages of DRM. The monumental failures of early systems were the result of poorly-conceived specifications, overly-ambitious implementations, and scant attention to assurance.

Page 6

Microsoft’s IRM v1.0

• Supported in Office 2003 for email and attachments.
• All protected documents are encrypted with individual symmetric keys.
• Rights-management information is held in the document metadata.
• Keys are held at a server, and are released to workstations.
• Workstations hold recently-used keys in a cache (sketched below). This improves performance at a small cost in confidentiality:
  • reduced latency when re-opening a document;
  • reduced load on the server; but
  • reduced ability to withdraw privileges when the status of the subject changes (e.g. a job re-assignment) or when the document is reclassified.
• This would be a good design for a secretive organisation: strong confidentiality, strong integrity, weak availability.
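The cache trade-off can be made explicit in a few lines. This is a hedged sketch of the behaviour described above, not Microsoft's implementation: `fetch_key` stands for the rights server, which re-checks the user's rights on every cache miss; a cache hit skips that check, so a revoked right is not noticed until the cached entry expires.

```python
import time

class KeyCache:
    """Workstation-side cache of document keys released by a rights server."""

    def __init__(self, fetch_key, ttl_seconds=3600):
        self.fetch_key = fetch_key   # callable (user, doc_id) -> key; rights are enforced at the server
        self.ttl = ttl_seconds
        self.cache = {}              # (user, doc_id) -> (key, expiry timestamp)

    def key_for(self, user, doc_id):
        entry = self.cache.get((user, doc_id))
        if entry and entry[1] > time.time():
            return entry[0]          # fast path: no server round trip, so revocation is delayed
        key = self.fetch_key(user, doc_id)            # server re-checks rights on a miss
        self.cache[(user, doc_id)] = (key, time.time() + self.ttl)
        return key
```

Shrinking the time-to-live narrows the revocation window at the cost of more server traffic and higher re-open latency, which is exactly the confidentiality-versus-availability dial described above.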

Page 7

Assessment of IRM v1.0 with TC, for Corporate DRM

• Microsoft's C = I > A design is a poor match to I = A > C.
• Availability could be improved with an independent key escrow, and a rights-management protocol which conforms to an open standard.
• An I = A > C design would use agency-level keys to encrypt documents. Signing keys might be distributed to individuals, but I think it would be better to use agency-level signatures, with each individual's "signing history" maintained in an audit record.
• Authentication and authorisation are acceptable in IRM v1.0, and could be improved with TC.
• An audit record of first-time document accesses can be maintained at the IRM v1.0 server. Improvement: all accesses could be auditable with platform TC.
• Current TC designs do not (as far as we know) support independent audits of all activities in the trusted partition. We believe all key-generation activity of a TPM must be auditable. We suggest requiring a birth-to-death TC-platform log which is both tamper-evident and tamper-resistant (a sketch of one such log follows).
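One standard way to make such a log tamper-evident is a hash chain, in which every entry commits to the whole history before it. The sketch below is illustrative only; it says nothing about how a real TPM or TC platform would implement the log, and tamper resistance would additionally require protected storage.

```python
import hashlib
import json

class HashChainLog:
    """Append-only log in which each entry's digest covers the entire preceding history."""

    def __init__(self):
        self.entries = []
        self.head = b"genesis"

    def append(self, event: dict):
        record = json.dumps(event, sort_keys=True).encode()
        self.head = hashlib.sha256(self.head + record).digest()
        self.entries.append((record, self.head))

    def verify(self) -> bool:
        h = b"genesis"
        for record, digest in self.entries:
            h = hashlib.sha256(h + record).digest()
            if h != digest:
                return False         # any edit, insertion, or deletion breaks the chain here
        return True
```

Logging every key-generation event (and, with platform TC, every document access) into such a chain would let an independent auditor detect any later rewriting of the platform's history.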

Page 8

NZ e-Government Principle #1

"For as long as it has any business or statutory requirements to do so, government must be able to: use the information it owns/holds; provide access to its information to others, when they are entitled to access it."
http://www.e.govt.nz/policy/tc-and-drm/principles-policies-06/tc-drm-0906.pdf

• A rallying flag! Other sovereign governments will surely require assurance that all of their protected documents will remain available, especially if the master keys are under the ultimate control of a single vendor.
• To lessen their dependence on a single vendor, governmental agencies might insist on an independent escrow of keys, an open standard for DRM, and a transition plan to a secondary vendor.

Page 9

NZ e-Government Principle #2

“Government use of trusted computing and digital rights management technologies must not compromise the privacy rights accorded to individuals who use government systems, or about whom the government holds information.”

• Possibly contentious in an international standard.
• Can we specify the uses of a TC/DRM system which would constitute a "compromise of privacy rights" in at least one jurisdiction?
• Can we specify the jurisdictional differences in a way that can be supported by a standardised TC/DRM technology?
• This confidentiality requirement would, I believe, be within the range of feasibility of IRM v1.0 with TC, in NZ and in the USA.
  • I am not competent to comment on privacy rights in other jurisdictions, and I'm by no means an expert on privacy in NZ or in the USA.
• Operational requirements: independent audit of the source code for the rights-management server, and an audit trail of its operations.

Page 10

NZ e-Government Principle #3

“The use of trusted computing and digital rights management technologies must not endanger the integrity of government-held information, or the privacy of personal information, by permitting information to enter or leave government systems, or be amended while within them, without prior government awareness and explicit consent.”

• Another rallying flag! All sovereign governments (and corporations) have strong requirements for integrity, and for operational controls on confidentiality and integrity.
• Technical analysis:
  • These requirements could be well supported by an IRM v2.0, although they would be problematic in a closed-source DRM system on an unauditable TC platform.
  • By default, documents entering a governmental (or corporate) security boundary must be "owned" by the receiving agency, so that they can be fully managed by a local rights server.
  • Strong controls (e.g. a manager's override authority) should be placed on any individual's importation of non-owned documents.

Page 11

NZ e-Government Principle #4

“The security of government systems and information must not be undermined by use of trusted computing and digital rights management technologies.”

• One of the supporting policies: "Agencies will reject the use of TC/DRM mechanisms, and information encumbered with externally imposed digital restrictions, unless they are able to satisfy themselves that the communications and information are free of harmful content, such as worms and viruses."
• A "killer app" for the NZ principles! This requirement is surprisingly difficult to achieve in current TC/DRM technology.
• The e-Government unit has rendered an important service to the international community by identifying this security issue.

Page 12

Malware Scans in TC/DRM

• An infected document may have been encrypted before its malware payload is recognisable by a scanner.
• An infected document may be opened at any time in the future.
• Adding a comprehensive, online malware scan would significantly increase the multi-second latency of a first-time access in IRM v1.0.
• Third-party malware scans are problematic in a security-hardened kernel. The scanner must be highly privileged and trustworthy.
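To see why the scan is awkward to place, consider the order of operations a rights client would have to follow. This sketch (all names hypothetical) shows the scan sitting between decryption and the application, inside the trusted partition, which is exactly where current TC/DRM designs give third parties the least room.

```python
def open_protected(doc_id, decrypt, scan, render):
    """Decrypt, scan with current signatures, then hand the plaintext to the viewer.

    `decrypt`, `scan`, and `render` stand for the rights client, a highly privileged
    scanner inside the trusted partition, and the viewing application respectively.
    """
    plaintext = decrypt(doc_id)       # the usual multi-second first-access latency
    verdict = scan(plaintext)         # adds scan latency on top, on every first access
    if not verdict.clean:
        raise RuntimeError(f"malware detected in {doc_id}: {verdict.reason}")
    return render(plaintext)
```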

Page 13

Shifting gears...

There is great potential for confusion when using the words “trust” and “privilege”.

We must develop operational definitions for these terms, if we wish to develop trustworthy computer systems.

Page 14

Technical and non-technical definitions of Trust

• In security engineering, placing trust in a system is a last resort. It's better to rely on an assurance (e.g. a proof, or a recourse mechanism) than on a trusting belief that "she'll be right".
• In non-technical circles, trust is a good thing: more trust is generally considered to be better.
• Trustworthiness (an assurance) implies that trust (a risk-aware basis for a decision) is well placed.
• A completely trustworthy system (in hindsight) is one that has never violated the trust placed in it by its users.
• Just because some users trust a system, we cannot conclude that the system is trustworthy.
• A rational and well-informed person can estimate the trustworthiness of a system. Irrational or poorly-informed users will make poor decisions about whether or not, and under what circumstances, to trust a system.

Page 15

Privilege in a Hierarchy

• Information flows upwards, toward the most powerful actor (at the root).
• Commands and trust flow downwards.
• The King is the most privileged. The peons are the most trusted.
• (Diagram: a hierarchy with "King, President, Chief Justice, Pope, or …" at the root and "Peons, illegal immigrants, felons, excommunicants, or …" at the bottom.)
• Information flowing up is "privileged"; information flowing down is "trusted".
• Orange book TCSEC, e.g. LOCKix.
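The up/down distinction can be phrased as a one-line rule over privilege levels, in the spirit of Bell-La Padula. This is a sketch for exposition, not LOCKix or any actual TCSEC implementation; larger numbers mean more privilege, with the King at the top.

```python
def flow_kind(sender_level: int, receiver_level: int) -> str:
    """Classify an information flow in a hierarchy by the privilege of its endpoints."""
    if receiver_level > sender_level:
        return "privileged"   # upward flow: toward more privilege, allowed by default
    if receiver_level < sender_level:
        return "trusted"      # downward flow: a confidentiality risk, needs explicit trust
    return "lateral"          # equal privilege
```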

Page 16

Trustworthiness in a Hierarchy

• Information flows upwards, toward the most powerful actor.
• Commands and trust flow downwards.
• Peons must be trusted with some information! If the peons are not trustworthy, then the system is not secure.
• (Diagram: the same hierarchy, from "King, President, Chief Justice, Pope, or …" down to "Peons, illegal immigrants, felons, excommunicants, or …".)
• If the King does not show good leadership (by issuing appropriate commands), then the system will not work well. "Noblesse oblige"!

Page 17

Email in a Hierarchy

• Information flows upwards, toward the leading actor.
• Actors can send email to their superiors.
• Non-upwards email traffic is trusted: not allowed by default; it should be filtered, audited, …
• (Diagram: the same hierarchy, from "King, President, Chief Justice, Pope, or …" down to "Peons, illegal immigrants, felons, excommunicants, or …".)
• Email up: "privileged" (allowed by default).
• Email down: "trusted" (disallowed by default; a risk to confidentiality).
• Email across: privileged & trusted routing.
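A sketch of the default routing policy these bullets imply, assuming the hierarchy is given as a simple parent map (the function names and the map are hypothetical): mail to a superior is allowed by default, mail to a subordinate is disallowed by default and must be filtered and audited, and lateral mail needs both treatments.

```python
def is_ancestor(parent_of: dict, senior: str, junior: str) -> bool:
    """True if `senior` sits above `junior` in the hierarchy."""
    node = parent_of.get(junior)
    while node is not None:
        if node == senior:
            return True
        node = parent_of.get(node)
    return False

def email_policy(parent_of: dict, sender: str, recipient: str) -> str:
    if is_ancestor(parent_of, recipient, sender):
        return "allow"                   # email up: privileged, allowed by default
    if is_ancestor(parent_of, sender, recipient):
        return "filter-and-audit"        # email down: trusted, disallowed by default
    return "route-via-common-superior"   # email across: privileged & trusted routing

# Example: a three-level hierarchy.
parents = {"peon": "manager", "manager": "king"}
assert email_policy(parents, "peon", "king") == "allow"
assert email_policy(parents, "king", "peon") == "filter-and-audit"
```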

Page 18

Email across Hierarchies

• Q: How should we handle email between hierarchies? (Diagram: Company X and Agency Y.)
• Answers:
  1. Merge
  2. Subsume
  3. Bridge
• Merge (into a merged X+Y):
  • Not often desirable or even feasible.
  • Cryptography doesn't protect X from Y, because the CEO/King of the merged company has the right to know all keys.
  • Can an appropriate King(X+Y) be found?

Page 19

Email across Hierarchies

• Q: How can we manage email between hierarchies? (Diagram: Agency X and Company Y.)
• Answers:
  1. Merge
  2. Subsume
  3. Bridge

Page 20

Email across Hierarchies

• Q: How can we manage email between hierarchies? (Diagram: Company X and Agency Y.)
• Answers:
  1. Merge
  2. Subsume
  3. Bridge!
• Bridging connection: trusted in both directions.

Page 21

Bridging Trust

• We use "bridges" every time we send personal email from our work computer.
• We build a bridge by constructing a "bridging persona".
• Even Kings can form bridges. However, Kings are most likely to use an actual person, e.g. their personal secretary, rather than a bridging persona.
• (Diagram: Agency X and Hotmail, with C acting as a governmental agent in Agency X and C acting as a hotmail client.)
• Bridging connection: trusted in both directions. Used for all communication among an actor's personae.
• C should encrypt all hotmail to avoid revelations.

Page 22

Personae, Actors, and Agents

• I use "actor" to refer to an agent (a human, or a computer program), pursuing a goal (risk vs. reward), subject to some constraints (social, technical, ethical, …).
• In Freudian terms: ego, id, superego.
• Actors can act on behalf of another actor: "agency".
• In this part of the talk, we are considering agency relationships in a hierarchy.
• When an agent takes on a secondary goal, or accepts a different set of constraints, they create an actor with a new "persona".
• (Diagram: Company X and Hotmail, with C acting as an employee and C acting as a hotmail client.)

Page 23

Bridging Trust: B2B e-commerce

• Use case: employee C of X purchasing supplies through employee V of Y.
• Employee C creates a hotmail account for a "purchasing" persona.
• Purchaser C doesn't know any irrelevant information.
• (Diagram: Company X and Company Y, with C acting as an employee, C acting as a purchaser, and employee V in Company Y.)
• Most workflow systems have rigid personae definitions (= role assignments).
• Current operating systems offer very little support for bridges. Important future work!
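A sketch of the data model this use case suggests, with hypothetical names and no particular OS or workflow product implied: an agent owns several personae, each situated in one organisation, and a bridge is simply the agent-level link between two of its own personae, trusted in both directions.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str          # e.g. "C, acting as an employee"
    hierarchy: str     # the organisation this persona belongs to

@dataclass
class Agent:
    name: str                                     # the human (or program) behind the personae
    personae: list = field(default_factory=list)

    def add_persona(self, name: str, hierarchy: str) -> Persona:
        p = Persona(name, hierarchy)
        self.personae.append(p)
        return p

    def bridges(self):
        """Bridging connections: every pair of this agent's personae, trusted in both directions."""
        return [(a, b) for i, a in enumerate(self.personae) for b in self.personae[i + 1:]]

# The B2B use case above:
c = Agent("C")
c.add_persona("C, acting as an employee", hierarchy="Company X")
c.add_persona("C, acting as a purchaser", hierarchy="purchasing hotmail account")
# c.bridges() now holds the single employee<->purchaser bridge that a workflow
# system (and, ideally, the operating system) would have to recognise and audit.
```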

Page 24

Why can’t we trust our leaders?

• Commands and trust flow upwards (by majority vote, or by consensus).
• Information flows downwards by default ("privileged").
• Upward information flows are "trusted" (filtered, audited, etc.).
• In a peerage, the leading actors are trusted, have minimal privilege, don't know very much, and can safely act on anything they know. "Our leaders are but trusted servants…"
• (Diagram: a peerage of Peers.)
• By contrast, the King of a hierarchy has an absolute right ("root" privilege) to know everything, is not trusted, and cannot act safely.

Page 25

Turn the picture upside down!

• Information flows upwards by default ("privileged").
• Commands and trust flow downwards.
• Downward information flows are "trusted" (filtered, audited, etc.).
• A peerage can be modeled by Bell-La Padula, because there is a partial order on the actors' privileges.
• Equality of privilege is the default in a peerage, whereas inequality of privilege is the default in a hierarchy.
• (Diagram labels: "Facilitator, Moderator, Democratic Leader, …" and "Peers, Group members, Citizens of an ideal democracy, …".)

Page 26

Peer trust vs. Hierarchical trust

• Trusting decisions in a peerage are made by peers, according to some fixed decision rule. There is no single root of peer trust.
  • There are many possible decision rules, but simple majority and consensus are the most common.
  • Weighted sums in a reputation scheme (e.g. eBay for goods, Poblano for documents) are a calculus of peer trust, but "we" must all agree to abide by the scheme.
  • "First come, first served" (e.g. a Wiki) can be an appropriate decision rule, if the cost per serving is sufficiently low.
• Trusting decisions in a hierarchy are made by its most powerful members. Ultimately, all hierarchical trust is rooted in the King.
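Each of the decision rules named above can be written as a small function over the peers' votes. This is purely illustrative: eBay's and Poblano's reputation calculi are of course more elaborate, and the function names are mine.

```python
def majority(votes: dict) -> bool:
    """Simple majority of yes/no votes, one per peer."""
    return sum(votes.values()) * 2 > len(votes)

def consensus(votes: dict) -> bool:
    """Consensus: every peer must agree."""
    return all(votes.values())

def weighted_reputation(ratings: dict, weights: dict, threshold: float) -> bool:
    """Weighted-sum reputation: trust the subject if the weighted average rating clears the threshold.

    `ratings` and `weights` are keyed by the same peers; the peers must all have
    agreed, in advance, to abide by this calculus.
    """
    total = sum(weights[p] for p in ratings)
    score = sum(ratings[p] * weights[p] for p in ratings) / total
    return score >= threshold

# Example: three peers deciding whether to trust a facilitator.
assert majority({"alice": True, "bob": True, "carol": False}) is True
assert consensus({"alice": True, "bob": True, "carol": False}) is False
```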

Page 27

Legitimation and enforcement

• Hierarchies have difficulty with legitimation. Why should I swear fealty (give ultimate privilege) to this would-be King?
• Peerages have difficulty with enforcement. How could the least privileged actor possibly be an effective facilitator?
• This isn't Political Science 101! I won't argue whether ideal democracies are better than ideal monarchies. I will argue that hierarchical trust is quite different from peer trust, that bridging trust is also distinct, and that all three forms are important in our world.
• My thesis: because our applications software will help us handle all three forms of trust, our trusted operating systems should support all three forms.

Page 28

Requirements for Relationship Management

• Orange-book security is hierarchical.
• This is a perfect match to a military or secret-service agency.
• This is a poor match to e-government and corporate applications.
• A general-purpose TC must support bridging and peering relationships.
• Rights-management languages must support bridges and peerages, as well as hierarchies.
• We cannot design an attractive, general-purpose DRM system until we have designed the infrastructure properly!

Page 29

Vapourware

• Closed-source methodology is appropriate for designing hierarchical systems.
  • These systems have trouble with legitimation.
  • Why should a user trust that the system designers (and administrators) won't abuse their privilege?
• Open-source methodology is appropriate for designing peerage systems.
  • These systems have trouble with enforcement.
  • Why should anyone trust a user not to abuse their privilege?
• Real-world peerages can legitimise hierarchies, and hierarchies can enforce peerages.
  • Can our next-generation OS use both design patterns?!?

Page 30

A Legitimised Hierarchy

(Diagram: Users, the OS Root Administrator, an Auditor, Inspectors-General IG1 and IG2, and the Chair of the User Assurance Group; the Inspector-General is an elected officer.)

• Each assurance group may want its own Audit (different scope, objectives, Trust, … ).

• The OS Administrator may refuse to accept an Auditor.

• The OS Administrator makes a Trusting appointment when granting auditor-level Privilege to a nominee.

• Assurance organizations may be hierarchical, e.g. if the Users are governmental agencies or corporate divisions.

Page 31

Summary of Static Trust

• Three types of trust: hierarchical, bridging, peering. Information flows are either trusted or privileged.
• Hierarchical trust has been explored thoroughly in the Bell-La Padula model. A subordinate actor is trusted to act appropriately, if a superior actor delegates some privileges.
  • Bell-La Padula, when the hierarchy is mostly concerned about confidentiality.
  • Biba, when the hierarchy is mostly concerned about integrity.
  • A general-purpose TC OS must support all concerns of a hierarchy.
• Actors have multiple personae. Bridging trust connects all of an actor's personae. A general-purpose TC OS must support personae.
• Peering trust is a shared decision to trust an actor who is inferior to the peers. Peerages have trouble with enforcement; hierarchies have trouble with legitimation. A trusted OS must be a legitimate enforcement agent!

Page 32

A Modest Proposal

• Let's convene a broadly-representative group of purchasers to act as "our" governance body!
  • Large corporations and governmental agencies have similar requirements for interoperability, auditability, static security, and multiple vendors.
  • First meeting at http://www.trl.ibm.com/projects/watc/program.htm?
• A first goal: develop buyers' requirements for DRM, TC, and relationship management.
  • International agreement and political "buy-in" are required if we are to have a system that is broadly acceptable.
  • Regulatory requirements, such as protection of individual privacy, must be addressed.
  • The Jericho Forum is already doing this (but it's not a standards organisation). Work through ISO?
• A second goal: develop a trustworthy auditing process.

Page 33

Acknowledgements & Sources

• Privilege and Trust, LOCKix: Richard O'Brien and Clyde Rogers, "Developing Applications on LOCK", 1991.
• Trust and Power: Niklas Luhmann, Wiley, 1979.
• Personae: Jihong Li, "A Fifth Generation Messaging System", 2002; and Shelly Mutu-Grigg, "Examining Fifth Generation Messaging Systems", 2003.
• Use case (WTC): Qiang Dong, "Workflow Simulation for International Trade", 2002.
• Use case (P2P): Benjamin Lai, "Trust in Online Trading Systems", 2004.
• Use case (ADLS): Matt Barrett, "Using NGSCB to Mitigate Existing Software Threats", 2005.
• Use case (SOEI): Jinho Lee, "A Survey-Based Analysis of HIPAA Security Requirements", 2006.
• Trusted OS: Matt Barrett, "Towards an Open Trusted Computing Framework", 2005.
• Governance of Trusted Computing: Thomborson and Barrett, to appear, ITG 06, Auckland.
• Corporate DRM: "Enterprise Information Protection & Control", a position paper under development in the Jericho Forum, www.jerichoforum.org.