Governance of Trusted Computing: Seminar to the TRUST Group at UC Berkeley, Prof. Clark Thomborson, 25th October 2006


Page 1

Governance of Trusted Computing

Seminar to the TRUST Group at UC Berkeley

Prof. Clark Thomborson

25th October 2006

Page 2

NZ, Auckland, …

New Zealand is a South Pacific island nation, populated by:
• 600,000 “Maori”
• 300,000 “Asian”
• 300,000 “Pacific” (Samoan, Fijian, Tongan, …)
• 3,100,000 “European” (mostly of UK heritage)

1,300,000 people live in the Auckland region. Population density is low almost everywhere else in NZ. New Zealand: 270,000 km² for 4,000,000 people. California: 400,000 km² for 36,000,000 people.

The University of Auckland has 25,000 undergraduate students, 5,000 postgraduate students, and 4,000 staff. 5,500 of our students are from other countries.

Page 3

Page 4

Computer Science Department

We are the largest and most diversified computer science department in New Zealand:
• 50+ staff
• 1,000 undergraduates
• 100 postgraduates

Page 5

Secure Systems Group

Inventions: software obfuscation, software watermarking, tamperproofing, and 3D object watermarking (subcontract: Cardiff U).

Secure systems development: applications of trusted computing, specification of security requirements, and security improvements.

http://www.cs.auckland.ac.nz/research/groups/ssg/

Page 6

Page 7

Topical Outline: TC Governance

• Requirements analysis of corporate DRM at three levels: static, dynamic, governance.
• Assessment of IRM v1.0 with TC support.
• New Zealand’s Four Principles for TC/DRM.
• Suggested design improvements:
  • IRM: emphasise integrity and availability, not confidentiality.
  • TC: more support for audit.
  • Relationship Management: support for hierarchical, bridging, and peering trust with anyone else in the world.
• Next step: develop “purchase requirements” for these technologies.
  • Large corporations and governmental agencies have similar requirements for interoperability, auditability, static security, and multiple vendors.
  • Possible tactic: work through the Jericho Forum.

Page 8

Static Security for Corporate DRM

CIA: confidentiality, integrity, and availability.

Internally-authored documents:
• Integrity first: internal correspondence. Company-confidential by default, but keys can be shared widely within the company to ensure ready availability.
• Integrity and availability first: operational data, e.g. customer records. Company-confidential except in cases where privacy laws or expectations require finer-grain protection.
• Rarely: highly sensitive corporate data, such as trade secrets, requiring narrowly controlled access within the corporation.

Externally-authored documents:
• Integrity first: unsigned objects, e.g. downloads from the web.
• Integrity and availability first: signed objects, e.g. contracts.
• Rarely: objects whose confidentiality is controlled by an external party, e.g. licensed software and media.

Page 9

Dynamic Security

The gold standard: Authentication, Authorisation, Audit.

If taken to an extreme, we’ll have a “gold-plated” system design!

Metaphorically, a security engineer should:
• Seal all security perimeters with an authenticating gold veneer,
• Sprinkle auditing gold-dust uniformly but very sparingly over the most important security areas, and
• Place an authorising golden seal on all of the most important accesses.

Page 10

Security Governance

Governance should be pro-active, not reactive. Governors should constantly be asking questions, considering the answers, and revising plans:
• Specification, or Policy (answering the question of what the system is supposed to do),
• Implementation (answering the question of how to make the system do what it is supposed to do), and
• Assurance (answering the question of whether the system is meeting its specifications).

We’re still in the early stages of corporate DRM. The monumental failures of early systems were the result of poorly-conceived specifications, overly-ambitious implementations, and scant attention to assurance.

Page 11

Microsoft’s IRM v1.0

• Supported in Office 2003 for email and attachments.
• All protected documents are encrypted with individual, symmetric, keys.
• Rights-management information is held in the document meta-data.
• Keys are held at a server, and are released to workstations.
• Workstations hold recently-used keys in a cache. This improves performance at a small cost in confidentiality:
  • Reduced latency, when re-opening a document;
  • Reduced load on the server; but
  • Reduced ability to withdraw privileges when the status of the subject changes (e.g. a job re-assignment) or when the document is reclassified.
• This would be a good design for a secretive organisation: strong confidentiality, strong integrity, but weak availability.
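To make the caching trade-off concrete, here is a minimal Python sketch of a workstation-side key cache that falls back to a rights server. It is an illustration only, not Microsoft’s implementation; the class names, the `fetch_key` call, and the TTL-based expiry are all assumptions.

```python
import time

class RightsServerStub:
    """Hypothetical stand-in for the corporate rights-management server."""
    def __init__(self, keys):
        self._keys = keys            # doc_id -> symmetric key
        self.requests = 0

    def fetch_key(self, doc_id, user):
        self.requests += 1
        # A real server would authenticate the user and check the document's
        # rights-management metadata before releasing the key.
        return self._keys.get(doc_id)


class WorkstationKeyCache:
    """Caches recently-used document keys to cut re-open latency and server load."""
    def __init__(self, server, ttl_seconds=3600):
        self.server = server
        self.ttl = ttl_seconds
        self._cache = {}             # doc_id -> (key, expiry_time)

    def get_key(self, doc_id, user):
        key, expiry = self._cache.get(doc_id, (None, 0.0))
        if key is not None and time.time() < expiry:
            return key               # fast path: no round trip to the server
        key = self.server.fetch_key(doc_id, user)
        if key is not None:
            # Trade-off: until this entry expires, a revocation or a
            # reclassification at the server is not noticed on this workstation.
            self._cache[doc_id] = (key, time.time() + self.ttl)
        return key


server = RightsServerStub({"doc-42": b"\x01" * 16})
cache = WorkstationKeyCache(server, ttl_seconds=600)
cache.get_key("doc-42", "alice")     # first open: fetched from the server
cache.get_key("doc-42", "alice")     # re-open: served from the cache
assert server.requests == 1
```

The longer the cache lifetime, the longer a revoked or re-assigned user can keep opening documents whose keys are already cached.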

Page 12

Assessment of IRM v1.0 with TC, for Corporate DRM

• The C = I > A design is a poor match to our I = A > C requirements.
• Availability could be improved with an independent key escrow, and a rights-management protocol which conforms to an open standard.
• An I = A > C design would use individual keys to sign individual documents, with widely-distributed keys to provide confidentiality at the corporate or division level.
• Authentication and authorisation are acceptable, and could be improved with TC.
• An audit record of first-time document accesses can be maintained at the server, and all accesses could be audited with platform TC.
• Current TC designs do not (as far as we know) support independent audits of all activities in the trusted partition.
  • We believe all key-generation activity of a TPM must be auditable.
  • We suggest requiring a birth-to-death platform log which is both tamper-evident and tamper-resistant (see the sketch below).
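A birth-to-death log can be made tamper-evident in software by hash-chaining its entries, so that any later edit or deletion is detectable; tamper-resistance would still need hardware support. The sketch below is a generic illustration under those assumptions, not a TPM or TCG interface.

```python
import hashlib, json

class HashChainedLog:
    """Append-only log where each entry commits to its predecessor,
    so any later edit or deletion breaks the chain (tamper-evident)."""
    def __init__(self):
        self.entries = []
        self._last_digest = b"\x00" * 32   # genesis value

    def append(self, event: dict) -> str:
        record = json.dumps(event, sort_keys=True).encode()
        digest = hashlib.sha256(self._last_digest + record).hexdigest()
        self.entries.append((event, digest))
        self._last_digest = bytes.fromhex(digest)
        return digest

    def verify(self) -> bool:
        last = b"\x00" * 32
        for event, digest in self.entries:
            record = json.dumps(event, sort_keys=True).encode()
            expected = hashlib.sha256(last + record).hexdigest()
            if expected != digest:
                return False
            last = bytes.fromhex(expected)
        return True

log = HashChainedLog()
log.append({"op": "generate_key", "key_id": "K1"})
log.append({"op": "release_key", "key_id": "K1", "to": "workstation-7"})
assert log.verify()
```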

Page 13

NZ e-Government Principle #1

“For as long as it has any business or statutory requirements to do so, government must be able to: use the information it owns/holds; provide access to its information to others, when they are entitled to access it.”
http://www.e.govt.nz/policy/tc-and-drm/principles-policies-06/tc-drm-0906.pdf

A sovereign government might require strong assurance that all protected documents will remain available, even though the master keys are under the ultimate control of a single vendor. To lessen their dependence on a single vendor, governmental agencies might insist on an independent escrow of keys, an open standard for DRM, and a transition plan to a secondary vendor.
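One simple way to picture an independent key escrow is secret splitting: the vendor and the escrow agent each hold a share, and neither share alone reveals the key. The two-share XOR scheme below is a minimal sketch of that idea; it is not a proposal for the actual arrangement, and a production design would use threshold schemes and authenticated storage.

```python
import os

def split_key(key: bytes):
    """Split a symmetric key into two shares; both are needed to rebuild it."""
    share_vendor = os.urandom(len(key))
    share_escrow = bytes(a ^ b for a, b in zip(key, share_vendor))
    return share_vendor, share_escrow

def recover_key(share_vendor: bytes, share_escrow: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_vendor, share_escrow))

key = os.urandom(32)
v, e = split_key(key)
assert recover_key(v, e) == key   # either share alone reveals nothing about the key
```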

Page 14

NZ e-Government Principle #2

“Government use of trusted computing and digital rights management technologies must not compromise the privacy rights accorded to individuals who use government systems, or about whom the government holds information.”

This confidentiality requirement could be well-supported by IRM v1.0 with TC. An independent audit of the source code for the rights-management server would provide assurance, if this were required, about Microsoft’s inability to read documents whose rights are managed by its software.

Page 15

NZ e-Government Principle #3

“The use of trusted computing and digital rights management technologies must not endanger the integrity of government-held information, or the privacy of personal information, by permitting information to enter or leave government systems, or be amended while within them, without prior government awareness and explicit consent.”

This requirement could be well-supported by IRM v2.0, although it would be problematic in a closed-source DRM system on an unauditable TC platform.
• By default, documents entering a governmental (or corporate) security boundary must be “owned” by the recipient, so that they can be fully managed by a local rights server.
• Strong controls (requiring a manager’s authority to over-ride) should be placed on importing non-owned documents, as sketched below.
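A hypothetical sketch of that default-ownership rule at the import boundary; the field names (`owned_by_recipient`, `manager_approved`) are illustrative assumptions rather than anything from IRM or the NZ policy.

```python
from dataclasses import dataclass

@dataclass
class InboundDocument:
    doc_id: str
    owned_by_recipient: bool       # rights can be fully managed by the local rights server
    manager_approved: bool = False

def may_import(doc: InboundDocument) -> bool:
    """Default policy: admit documents we will own; a non-owned document
    needs an explicit managerial over-ride."""
    if doc.owned_by_recipient:
        return True
    return doc.manager_approved

assert may_import(InboundDocument("D1", owned_by_recipient=True))
assert not may_import(InboundDocument("D2", owned_by_recipient=False))
assert may_import(InboundDocument("D3", owned_by_recipient=False, manager_approved=True))
```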

Page 16

NZ e-Government Principle #4

“The security of government systems and information must not be undermined by use of trusted computing and digital rights management technologies.”

One of the supporting policies: “Agencies will reject the use of TC/DRM mechanisms, and information encumbered with externally imposed digital restrictions, unless they are able to satisfy themselves that the communications and information are free of harmful content, such as worms and viruses.”

Page 17

Malware Scans in TC/DRM

• An infected document may have been encrypted before its malware payload is recognisable by a scanner.
• An infected document may be opened at any time in the future.
• Adding a comprehensive, online, malware scan would significantly increase the multi-second latency of a first-time access in IRM v1.0.
• Third-party malware scans are problematic in a security-hardened kernel. The scanner must be highly privileged and trustworthy.
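One way to read these constraints is that scanning has to happen at open time, on the decrypted plaintext, with the signatures current at that moment. The toy sketch below illustrates only that ordering; the XOR “cipher” and the signature list are placeholders, and in a real TC design the scan would have to run inside the trusted partition.

```python
SIGNATURES = [b"EVIL-PAYLOAD"]       # stand-in signature list for the sketch

def xor_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Toy cipher, only so the sketch runs end-to-end; not a real DRM cipher.
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(ciphertext))

def looks_clean(plaintext: bytes) -> bool:
    """True if the document is clean under *today's* signatures."""
    return not any(sig in plaintext for sig in SIGNATURES)

def open_protected_document(ciphertext: bytes, key: bytes) -> bytes:
    plaintext = xor_decrypt(ciphertext, key)       # adds to the first-access latency
    if not looks_clean(plaintext):                 # must run inside the trusted partition
        raise PermissionError("rejected at open time: harmful content detected")
    return plaintext

key = b"k"
doc = xor_decrypt(b"hello world", key)             # "encrypt" with the same toy cipher
assert open_protected_document(doc, key) == b"hello world"
```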

Page 18

Technical and non-technical definitions of Trust

• In security engineering, placing trust in a system is a last resort. It’s better to rely on an assurance (e.g. a proof, or a recourse mechanism) than on a trusting belief that “she’ll be right”.
• In non-technical circles, trust is a good thing: more trust is generally considered to be better.
• Trustworthiness (an assurance) implies that trust (a risk-aware basis for a decision) is well-placed.
• A completely trustworthy system (in hindsight) is one that has never violated the trust placed in it by its users.
• Just because some users trust a system, we cannot conclude that the system is trustworthy.
• A rational and well-informed person can estimate the trustworthiness of a system.
• Irrational or poorly-informed users will make poor decisions about whether or not, and under what circumstances, to trust a system.

Page 19

Privilege in a Hierarchy

• Information flows upwards, toward the leading actor (at the root) of a secret society.
• Commands and trust flow downwards.
• The King is the most privileged. The peons are the most trusted.
• Information flowing up is “privileged”. Information flowing down is “trusted”.
• Orange book TCSEC, e.g. LOCKix.

[Hierarchy diagram, with “King, President, Chief Justice, Pope, or …” at the top and “Peons, illegal immigrants, felons, excommunicants, or …” at the bottom.]

Page 20

Trustworthiness in a Hierarchy

• In a secret society, information flows upwards, toward the most powerful actor.
• Commands and trust flow downwards.
• Peons must be trusted with some information! If the peons are not trustworthy, then the system is not secure.
• If the King does not show good leadership (by issuing appropriate commands), then the system will not work well. “Noblesse oblige”!

[Hierarchy diagram, with “King, President, Chief Justice, Pope, or …” at the top and “Peons, illegal immigrants, felons, excommunicants, or …” at the bottom.]

Page 21

Email in a Hierarchy

• Information flows upwards, toward the leading actor.
• Actors can send email to their superiors.
• Non-upwards email traffic is trusted: not allowed by default; should be filtered, audited, …

[Hierarchy diagram, with “King, President, Chief Justice, Pope, or …” at the top and “Peons, illegal immigrants, felons, excommunicants, or …” at the bottom.]

• Email up: “privileged” (allowed by default).
• Email down: “trusted” (disallowed by default, risk to confidentiality).
• Email across: privileged & trusted routing (see the sketch below).
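A minimal sketch of that default rule, assuming the hierarchy is encoded as child-to-parent links; the actor names and the decision labels are illustrative, not from the talk.

```python
# Hierarchy encoded as child -> parent; the root ("king") has no parent.
PARENT = {"peon_a": "manager", "peon_b": "manager", "manager": "king"}

def ancestors(actor):
    while actor in PARENT:
        actor = PARENT[actor]
        yield actor

def email_policy(sender: str, recipient: str) -> str:
    """Upward mail is privileged (allowed by default); anything else is a
    trusted flow that must be filtered and audited before delivery."""
    if recipient in ancestors(sender):
        return "allow"                      # email up: privileged
    return "filter-and-audit"               # email down or across: trusted

assert email_policy("peon_a", "king") == "allow"
assert email_policy("king", "peon_a") == "filter-and-audit"
assert email_policy("peon_a", "peon_b") == "filter-and-audit"
```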

Page 22

Email across Hierarchies

Q: How should we handle email between hierarchies?

Answers:
1. Merge
2. Subsume
3. Bridge

[Diagram: Company X and Company Y merged into “Merged X+Y”.]

Merge:
• Not often desirable or even feasible.
• Cryptography doesn’t protect X from Y, because the CEO of the merged company has the right to know all keys.
• Can a noble CEO(X+Y) be found?

Page 23

Email across Hierarchies

Q: How can we manage email between hierarchies?

[Diagram: Company X and Company Y.]

Answers:
1. Merge
2. Subsume
3. Bridge

Page 24

Email across Hierarchies

Q: How can we manage email between hierarchies?

[Diagram: Company X and Company Y joined by a bridging connection.]

Answers:
1. Merge
2. Subsume
3. Bridge!
• Bridging connection: trusted in both directions.

Page 25

Bridging Trust

• We make bridges every time we send personal email from our work computer.
• We make bridges every time we send work-related email from our home computer.
• Even Kings can form bridges. However, Kings are most likely to use an actual person, e.g. their personal secretary, rather than a bridging persona.

[Diagram: “C, acting as an employee” in Company X, bridged to “C, acting as a hotmail client” in Hotmail.]

• Bridging connection: trusted in both directions.
• Used for all communication among an actor’s personae.
• C should encrypt all hotmail to avoid revelations.

Page 26

Personae, Actors, and Agents

• I use “actor” to refer to an agent (a human, or a computer program), pursuing a goal (risk vs. reward), subject to some constraints (social, technical, ethical, …). In Freudian terms: ego, id, superego.
• Actors can act on behalf of another actor: “agency”.
• In this part of the talk, we are considering agency relationships in a hierarchy.

[Diagram: “C, acting as an employee” in Company X, bridged to “C, acting as a hotmail client” in Hotmail.]

• When an agent takes on a secondary goal, or accepts a different set of constraints, they create an actor with a new “persona” (see the sketch below).
• Bridging connection: trusted in both directions; it models communication among an agent’s personae.
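A small data-structure sketch of this vocabulary (agent, goal, constraints, persona, and the bridges among one agent’s personae). It is just one possible encoding of the model, with hypothetical names.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Persona:
    """An actor: one agent pursuing one goal under one set of constraints."""
    name: str
    goal: str
    constraints: Set[str]

@dataclass
class Agent:
    """A human or program; it may operate under several personae."""
    name: str
    personae: List[Persona] = field(default_factory=list)

    def adopt(self, name: str, goal: str, constraints: Set[str]) -> Persona:
        p = Persona(name, goal, constraints)
        self.personae.append(p)
        return p

    def bridges(self):
        """Bridging trust connects every pair of this agent's personae."""
        return [(a.name, b.name) for i, a in enumerate(self.personae)
                                 for b in self.personae[i + 1:]]

c = Agent("C")
c.adopt("C as employee", "do Company X work", {"company policy"})
c.adopt("C as hotmail client", "personal email", {"hotmail terms of use"})
print(c.bridges())   # [('C as employee', 'C as hotmail client')]
```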

Page 27

Bridging Trust: B2B e-commerce

• Use case: employee C of X purchasing supplies through employee V of Y.
• Employee C creates a hotmail account for a “purchasing” persona.
• Purchaser C doesn’t know any irrelevant information.

[Diagram: “C, acting as an employee” in Company X, bridged to “C, acting as a purchaser”, who deals with Employee V in Company Y.]

• Most workflow systems have rigid personae definitions (= role assignments).
• Current operating systems offer very little support for bridges. Important future work!

Page 28

Why can’t we trust our leaders?

• Commands and trust flow upwards (by majority vote, or by consensus).
• Information flows downwards by default (“privileged”).
• Upward information flows are “trusted” (filtered, audited, etc.).
• In a peerage, the leading actors are trusted, have minimal privilege, don’t know very much, and can safely act on anything they know. “Our leaders are but trusted servants…”
• By contrast, the King of a hierarchy has an absolute right (“root” privilege) to know everything, is not trusted, and cannot act safely.

[Diagram: a peerage of Peers.]

Page 29

Turn the picture upside down!

• Information flows upwards by default (“privileged”).
• Commands and trust flow downwards.
• Downward information flows are “trusted” (filtered, audited, etc.).
• A peerage can be modeled by Bell-La Padula, because there is a partial order on the actors’ privileges.
• Equality of privilege is the default in a peerage, whereas inequality of privilege is the default in a hierarchy.

[Diagram: “Peers, Group members, Citizens of an ideal democracy, …” and their “Facilitator, Moderator, Democratic Leader, …”.]

Page 30

Peer trust vs. Hierarchical trust

• Trusting decisions in a peerage are made by peers, according to some fixed decision rule. There is no single root of peer trust.
  • There are many possible decision rules, but simple majority and consensus are the most common (two rules are sketched below).
  • Weighted sums in a reputation scheme (e.g. eBay for goods, Poblano for documents) are a calculus of peer trust, but “we” must all agree to abide by the scheme.
  • “First come, first served” (e.g. Wiki) can be an appropriate decision rule, if the cost per serving is sufficiently low.
• Trusting decisions in a hierarchy are made by its most powerful members. Ultimately, all hierarchical trust is rooted in the King.
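Two of these decision rules sketched minimally in Python; the weights and threshold are illustrative assumptions, and a real reputation scheme (like eBay’s or Poblano’s) is considerably richer.

```python
def majority_vote(votes):
    """Simple majority: approve only if strictly more than half say yes."""
    yes = sum(1 for v in votes if v)
    return yes * 2 > len(votes)

def weighted_reputation(ratings, weights):
    """Weighted-sum reputation: each peer's rating counts in proportion to
    the weight the scheme assigns to that peer."""
    total_weight = sum(weights.values())
    return sum(ratings[p] * weights[p] for p in ratings) / total_weight

assert majority_vote([True, True, False]) is True
score = weighted_reputation({"alice": 1.0, "bob": 0.5, "carol": 0.0},
                            {"alice": 3, "bob": 1, "carol": 1})
print(round(score, 2))   # 0.7 -- useful only if all peers agree to abide by the scheme
```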

Page 31

Legitimation and enforcement

• Hierarchies have difficulty with legitimation. Why should I swear fealty (give ultimate privilege) to this would-be King?
• Peerages have difficulty with enforcement. How could the least privileged actor possibly be an effective facilitator?
• This isn’t Political Science 101! I won’t argue whether ideal democracies are better than ideal monarchies.
• I will argue that hierarchical trust is quite different to peer trust, that bridging trust is also distinct, and that all three forms are important in our world.
• My thesis: because our applications software will help us handle all three forms of trust, our trusted operating systems should support all three forms.

Page 32

Requirements for Relationship Management

• Orange-book security is hierarchical.
  • This is a perfect match to a military or secret-service agency.
  • This is a poor match to e-government and corporate applications.
• A general-purpose TC must support bridging and peering relationships.
• Rights-management languages must support bridges and peerages, as well as hierarchies.
• We cannot design an attractive, general-purpose DRM system until we have designed the infrastructure properly!

Page 33

Open vs. Closed Source TC

Closed-source methodology is hierarchical.
• Only personae with sufficiently high privilege can read the source.
• An obfuscated OS kernel can hold, and manipulate, cryptographic keys for a remote, trusted, OS administrator.
• A trusted computing base (TCB) can offer additional security through its hardware (TPM) enforcement of privilege.

Open-source development is based on peer trust.
• The OS kernel can’t hold any secrets if it is open source.
• A trusted computing base (TCB) can hold keys in privileged (closed) hardware.

Page 34

More vapourware

Closed-source methodology is appropriate for designing hierarchical systems.
• These systems have trouble with legitimation.
• Why should a user trust that the system designers (and administrators) won’t abuse their privilege?

Open-source methodology is appropriate for designing peerage systems.
• These systems have trouble with enforcement.
• Why should anyone trust a user not to abuse their privilege?

Real-world peerages can legitimise hierarchies, and hierarchies can enforce peerages.
• Can our next-generation OS use both design patterns?!?

Page 35

A Legitimised Hierarchy

[Diagram: Users, a Chair of the User Assurance Group, Inspectors-General IG1 and IG2 (elected officers), an Auditor, and the OS Root Administrator.]

• Each assurance group may want its own Audit (different scope, objectives, Trust, …).
• The OS Administrator may refuse to accept an Auditor.
• The OS Administrator makes a Trusting appointment when granting auditor-level Privilege to a nominee.
• Assurance organizations may be hierarchical, e.g. if the Users are governmental agencies or corporate divisions.

Page 36

Summary of Static Trust

• Three types of trust: hierarchical, bridging, peering. Information flows are either trusted or privileged.
• Hierarchical trust has been explored thoroughly in the Bell-La Padula model. A subordinate actor is trusted to act appropriately, if a superior actor delegates some privileges.
  • Bell-La Padula, when the hierarchy is mostly concerned about confidentiality.
  • Biba, when the hierarchy is mostly concerned about integrity (both are sketched below).
  • A general-purpose TC OS must support all concerns of a hierarchy.
• Actors have multiple personae. Bridging trust connects all of an actor’s personae.
  • A general-purpose TC OS must support personae.
• Peering trust is a shared decision to trust an actor who is inferior to the peers.
  • Peerages have trouble with enforcement; hierarchies have trouble with legitimation.
  • A trusted OS must be a legitimate enforcement agent!
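A compact sketch of the two access rules named above, with sensitivity/integrity levels encoded as plain integers (an assumption for illustration): Bell-La Padula’s “no read up, no write down” for confidentiality, and Biba’s dual “no read down, no write up” for integrity.

```python
# Levels: a higher number means more privileged / more sensitive (illustrative encoding).

def blp_allows(subject_level: int, object_level: int, op: str) -> bool:
    """Bell-La Padula (confidentiality): no read up, no write down."""
    if op == "read":
        return subject_level >= object_level
    if op == "write":
        return subject_level <= object_level
    raise ValueError(op)

def biba_allows(subject_level: int, object_level: int, op: str) -> bool:
    """Biba (integrity): no read down, no write up."""
    if op == "read":
        return subject_level <= object_level
    if op == "write":
        return subject_level >= object_level
    raise ValueError(op)

assert blp_allows(3, 1, "read") and not blp_allows(1, 3, "read")
assert biba_allows(1, 3, "read") and not biba_allows(3, 1, "read")
```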

Page 37

Dynamic Trust and Governance

• We are starting to develop a dynamic theory of trust.
• When someone accepts a subordinate role in a hierarchy or in a peerage, they make a trusting decision.
  • “Subordination trust” is required when installing a trusted OS.
• Dynamic trust is also required in order to model changes in membership of a peerage.
• Generally: dynamic trust/distrust is required to create/destroy an arc or node in our diagrams (see the sketch below).
• A theory of trust will be incomplete until it has a governance layer.
  • A trustworthy system must have privileged, and trustworthy, governors who adjust the specifications and oversee the implementations, in response to the assurances they receive.
  • Who could govern a worldwide system of trust?
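A tiny sketch of the “arc/node” view of dynamic trust: a directed graph where joining, trusting, and distrusting each create or destroy a node or arc, and every change is recorded for the governance layer. The method names are illustrative assumptions.

```python
class TrustGraph:
    """Nodes are actors; a directed arc (a, b) records that a trusts b.
    Creating or destroying nodes/arcs models dynamic trust and distrust."""
    def __init__(self):
        self.nodes = set()
        self.arcs = set()
        self.history = []          # governance needs a record of every change

    def join(self, actor):                       # dynamic trust: a new member
        self.nodes.add(actor)
        self.history.append(("join", actor))

    def trust(self, a, b):                       # create an arc
        self.nodes |= {a, b}
        self.arcs.add((a, b))
        self.history.append(("trust", a, b))

    def distrust(self, a, b):                    # destroy an arc
        self.arcs.discard((a, b))
        self.history.append(("distrust", a, b))

g = TrustGraph()
g.join("peer1")
g.trust("peer1", "facilitator")
g.distrust("peer1", "facilitator")
print(g.history)
```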

Page 38

A Modest Proposal

• Let’s convene a broadly-representative group of purchasers to act as “our” governance body!
  • Large corporations and governmental agencies have similar requirements for interoperability, auditability, static security, and multiple vendors.
  • First meeting at http://www.trl.ibm.com/projects/watc/program.htm?
• A first goal: develop buyers’ requirements for DRM, TC, and relationship management.
  • International agreement and political “buy-in” are required if we are to have a system that is broadly acceptable.
  • Regulatory requirements, such as protection of individual privacy, must be addressed in the standard.
• A second goal: develop a trustworthy auditing process.
• Possible tactic: work through the Jericho Forum.

Page 39

Acknowledgements & Sources

• Privilege and Trust, LOCKix: Richard O’Brien and Clyde Rogers, “Developing Applications on LOCK”, 1991.
• Trust and Power: Niklas Luhmann, Wiley, 1979.
• Personae: Jihong Li, “A Fifth Generation Messaging System”, 2002; and Shelly Mutu-Grigg, “Examining Fifth Generation Messaging Systems”, 2003.
• Use case (WTC): Qiang Dong, “Workflow Simulation for International Trade”, 2002.
• Use case (P2P): Benjamin Lai, “Trust in Online Trading Systems”, 2004.
• Use case (ADLS): Matt Barrett, “Using NGSCB to Mitigate Existing Software Threats”, 2005.
• Use case (SOEI): Jinho Lee, “A Survey-Based Analysis of HIPAA Security Requirements”, 2006.
• Trusted OS: Matt Barrett, “Towards an Open Trusted Computing Framework”, 2005.
• Governance of Trusted Computing: Thomborson and Barrett, to appear, ITG 06, Auckland.
• Corporate DRM: “Enterprise Information Protection & Control”, a position paper under development in the Jericho Forum, www.jerichoforum.org.