
1

Trusted OS Design

CS461/ECE422

2

Reading Material

• Section 5.4 of Security in Computing

3

Overview

• Design Principles
• Security Features
• Kernelized Design
• Virtualization

4

Design Principles

• Simplicity
  – Less to go wrong
  – Fewer possible inconsistencies
  – Easy to understand

• Restriction
  – Minimize access
  – Inhibit communication

Saltzer and Schroeder 75

5

Economy of Mechanism

• Keep the design as simple and small as possible
• Simpler means less can go wrong
  – And when errors occur, they are easier to understand and fix
• Interfaces and interactions

6

Fail-Safe Defaults

• Base access decisions on permission rather than exclusion
• Burden of proof is on the principal seeking permission
• If the protection system fails, then legitimate access is denied, but illegitimate access is also denied
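The default-deny rule above can be sketched as a toy access check; the subjects, objects, and ACL layout here are purely illustrative, not from the slides:

```python
# Fail-safe default: deny unless an explicit permission entry exists.
# Subjects, objects, and the ACL structure are invented for illustration.
ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def is_allowed(subject: str, obj: str, right: str) -> bool:
    # A missing entry falls through to the empty set -> denied by default.
    return right in ACL.get((subject, obj), set())

assert is_allowed("bob", "payroll.db", "write")
assert not is_allowed("mallory", "payroll.db", "read")  # no entry: denied
```

Note that a failure mode that loses ACL entries only denies legitimate access; it never grants illegitimate access.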

7

Complete Mediation

• Every access to every object must be checked for authority
• Usually done once, on first action
  – UNIX: access checked on open, not checked thereafter
  – If permissions change after open, may get unauthorized access
• Proposals to gain performance by remembering the result of an authority check should be examined skeptically
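The open-time versus per-access distinction can be illustrated with two toy file handles; the permission table and names are invented for the example:

```python
# Contrast: checking authority once at open time (UNIX-style) vs.
# re-checking on every access (complete mediation). All names invented.
perms = {("alice", "notes.txt"): {"read"}}

def check(subject, obj, right):
    return right in perms.get((subject, obj), set())

class OpenTimeHandle:
    """Authority checked once, at open; never re-checked."""
    def __init__(self, subject, obj):
        if not check(subject, obj, "read"):
            raise PermissionError(obj)
        self.obj = obj
    def read(self):
        return f"contents of {self.obj}"  # no further check

class MediatedHandle:
    """Complete mediation: authority re-checked on every access."""
    def __init__(self, subject, obj):
        self.subject, self.obj = subject, obj
    def read(self):
        if not check(self.subject, self.obj, "read"):
            raise PermissionError(self.obj)
        return f"contents of {self.obj}"

h1 = OpenTimeHandle("alice", "notes.txt")
h2 = MediatedHandle("alice", "notes.txt")
perms[("alice", "notes.txt")] = set()  # revoke read after open
assert h1.read()  # stale check: access still succeeds
try:
    h2.read()
except PermissionError:
    pass  # revocation takes effect immediately under mediation
```

The stale-check behavior of `h1` is exactly why cached authority results deserve skepticism.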

8

Open Design

• The design should not be secret
• Do not depend on secrecy of design or implementation
  – Popularly misunderstood to mean that source code should be public
  – “Security through obscurity”
  – Does not apply to information such as passwords or cryptographic keys

9

Separation of Privilege

• Where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key
• Require multiple conditions to grant privilege
  – Separation of duty
  – Defense in depth
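A minimal two-person-rule check, with an invented approver list, sketches the "two keys" idea:

```python
# Two-person rule: an action proceeds only when at least two distinct
# authorized principals approve. The approver set is illustrative.
AUTHORIZED = {"alice", "bob", "carol"}

def authorize(action: str, approvals: set) -> bool:
    # Ignore approvals from unknown principals, then require two keys.
    valid = approvals & AUTHORIZED
    return len(valid) >= 2

assert not authorize("release-funds", {"alice"})             # one key
assert not authorize("release-funds", {"alice", "mallory"})  # one valid key
assert authorize("release-funds", {"alice", "bob"})          # two keys
```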

10

Least Privilege

• Every program and every user of the system should operate using the least set of privileges necessary to complete the job
• A subject should be given only those privileges necessary to complete its task
  – Function, not identity, controls
  – Rights added as needed, discarded after use
  – Minimal protection domain
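The "rights added as needed, discarded after use" idea maps naturally onto a scoped grant. This sketch uses an invented global rights set; real systems would manipulate process credentials instead:

```python
from contextlib import contextmanager

# Minimal protection domain: a task holds a right only while it needs it.
# `current_rights` is an illustrative stand-in for a real credential set.
current_rights = set()

@contextmanager
def with_privilege(right: str):
    current_rights.add(right)          # right added as needed
    try:
        yield
    finally:
        current_rights.discard(right)  # discarded after use, even on error

with with_privilege("write:config"):
    assert "write:config" in current_rights
assert "write:config" not in current_rights  # domain shrinks back
```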

11

Least Common Mechanism

• Minimize the amount of mechanism common to more than one user and depended on by all users
• Mechanisms should not be shared
  – Information can flow along shared channels
  – Covert channels
• Isolation
  – Virtual machines
  – Sandboxes

12

Psychological Acceptability

• It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly
• Security mechanisms should not add to the difficulty of accessing a resource
  – Hide complexity introduced by security mechanisms
  – Ease of installation, configuration, use
  – Human factors critical here

13

Security Features

• Identification and Authentication
• MAC vs DAC
• Object Reuse Protection
  – Prevent leaks via reallocation
  – Clean before re-use
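A toy allocator pool can show the "clean before re-use" rule; the buffer contents and pool structure are illustrative:

```python
# Object reuse protection: scrub a buffer before returning it to the
# free pool, so the next client cannot read the previous owner's data.
free_pool = []

def release(buf: bytearray) -> None:
    buf[:] = bytes(len(buf))  # zero in place before reallocation
    free_pool.append(buf)

buf = bytearray(b"secret key material")
release(buf)
reused = free_pool.pop()
assert reused == bytes(len(reused))   # all zeros: nothing leaks
assert b"secret" not in bytes(reused)
```

The same principle is behind zeroing memory pages and scrubbing disk blocks before reallocation.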

14

More Security Features

• Complete Mediation
  – Mediate all means of access
    • File access plus direct memory access if possible
    • Mediate on each access; not generally done for files

15

More Security Features

• Trusted Path
  – Give end user a means to determine they are really talking with the OS
  – Secure Attention Key (SAK): key sequence that cannot be intercepted by non-OS software
    • Ctrl-Alt-Del in Windows
    • Rootkit…
  – Or security-relevant changes only made during system boot
  – What about networked applications?

16

More Security Features

• Audit
  – Must be able to review and recreate security-relevant changes
  – Must protect the log
    • Log growth
  – Originally assumed a security officer would review directly
  – Can be used for backing evidence
• Really want to detect anomalies
  – Intrusion detection
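One common way to protect a log against after-the-fact tampering is hash chaining; this is a sketch of the idea, not a mechanism prescribed by the slides:

```python
import hashlib

# Tamper-evident audit log: each entry's hash covers the previous
# entry's hash, so altering any past record breaks the chain on review.
log = []  # list of (entry, chained_hash) pairs

def append(entry: str) -> None:
    prev = log[-1][1] if log else "0" * 64
    h = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append((entry, h))

def verify() -> bool:
    prev = "0" * 64
    for entry, h in log:
        if hashlib.sha256((prev + entry).encode()).hexdigest() != h:
            return False
        prev = h
    return True

append("login alice")
append("read payroll.db")
assert verify()
log[0] = ("login mallory", log[0][1])  # tamper with history
assert not verify()                    # tampering is detected
```

Chaining detects tampering but does not prevent it; in practice the log (or at least the chain head) must also be stored where the TCB controls writes.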

17

Kernelized design

• Contain security feature implementation in a security kernel
  – Coverage
  – Separation
  – Unity
  – Modifiability
  – Compactness
  – Verifiability

[Figure: layered design – Security Kernel, OS Kernel, User Space]

18

Reference Monitor

• Reference Monitor – abstract machine that mediates all access to objects by subjects
• Reference Validation Mechanism (RVM) – implementation of a Reference Monitor
  – Tamper-proof
  – Well defined
  – Never bypassed
  – Small enough for analysis and testing

19

Trusted Computing Base (TCB)

• Includes all protection mechanisms, including HW, firmware, and software, responsible for enforcing the security policy
• Strong boundary around the TCB is critical
  – Any code trusted by an element of the TCB must be part of the TCB too
  – If a portion of the TCB is corrupted, must consider that all of the TCB can be corrupted

20

TCB Components

• TCB can include
  – Hardware
  – Primitive files
    • Authentication info
    • Access control info
  – Protected memory
    • For Reference Monitor execution
  – Some inter-process communication

21

TCB/non-TCB Function Split

22

TCB Implementation

• Ideally TCB is a separate security kernel
  – e.g., SCOMP, 10K lines of code in security kernel
• Generally not feasible for a retrofitted kernel
  – Most all trusted Unix variants
  – Security-relevant functionality distributed through the OS kernel

23

Virtualization

• Can design virtualization layer to separate multiple users
  – Memory virtualization
    • As exemplified by IBM MVS
  – Virtual machines
    • Book discusses IBM PR/SM
    • More recently exemplified in VMware and Xen
• Malicious program could not access other virtual memory space or machine
  – Unless it attacks the virtualization mechanism

24

Memory Virtualization

25

Machine Virtualization

26

Key Points

• Principles of secure design underlie all security-related mechanisms
• Require:
  – Good understanding of the goal of the mechanism and the environment in which it is to be used
  – Careful analysis and design
  – Careful implementation

27

Evaluating Systems

Information Assurance
CS461/ECE422

28

Reading Material

• Chapter 5.5 of Security in Computing
• The Orange Book and the whole Rainbow Series
  – http://www.radium.ncsc.mil/tpep/library/rainbow/
• The Common Criteria
  – Lists all evaluated protection profiles and products
  – http://www.commoncriteriaportal.org

29

Outline

• Motivation for system evaluation
• Specific evaluation systems
  – TCSEC/Orange Book
  – Interim systems
  – Common Criteria

30

Evaluation Goals

• Oriented to the purchaser/user of the system
• Assurance that the system operates as advertised

31

Evaluation Options

• Rely on vendor/developer evidence
  – Self-evaluate vendor design docs, test results, etc.
  – Base on reputation of vendor
• Rely on an expert
  – Read product evaluations from a trusted source
  – Penetration testing

32

Evaluation Options

• Formal Verification
• Validation
  – Requirements checking
  – Design and code review
  – System testing

33

Formal Evaluation

• Provide a systematic framework for system evaluation
  – More consistent evaluation
  – Better basis for comparing similar products
• Trusted third-party system for evaluation
• Originally driven by needs of government and military

34

TCSEC: 1983-1999

• Trusted Computer System Evaluation Criteria (TCSEC), also called the Orange Book
  – Specifies evaluation classes (D, C1, C2, B1, B2, B3, A1)
  – Specifies functionality and assurance requirements for each class
• Functional model builds on
  – BLP (Bell-LaPadula model, mandatory labelling)
  – Reference Monitors
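The BLP mandatory-labelling rules (simple security, "no read up"; *-property, "no write down") can be sketched with a linear label lattice; the level names are conventional but the code is illustrative:

```python
# Bell-LaPadula sketch over a linear ordering of sensitivity labels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top-secret": 3}

def can_read(subject_lvl: str, object_lvl: str) -> bool:
    # Simple security property: a subject may read only at or below
    # its own level ("no read up").
    return LEVELS[subject_lvl] >= LEVELS[object_lvl]

def can_write(subject_lvl: str, object_lvl: str) -> bool:
    # *-property: a subject may write only at or above its own level
    # ("no write down"), so secrets cannot leak to lower levels.
    return LEVELS[subject_lvl] <= LEVELS[object_lvl]

assert can_read("secret", "confidential")
assert not can_read("confidential", "secret")
assert can_write("confidential", "secret")
assert not can_write("secret", "confidential")
```

A TCSEC B1 system enforces checks like these as mandatory access control, on top of DAC.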

35

TCSEC Functional Requirements

• DAC
• Object Reuse
  – Sufficient clearing of objects between uses in a resource pool
  – E.g., zero pages in memory system
• MAC and Labels
• Identification and Authentication
• Audit
  – Requirements increase at higher classes
• Trusted Path
  – Non-spoofable means to interact with TCB
  – Ctrl-Alt-Del in Windows

36

TCSEC Assurance Requirements

• Configuration Management– For TCB

• Trusted Distribution– Integrity of mapping between master and installations

• System Architecture– Small and modular

• Design Specification – vary between classes• Verification – Vary between classes• Testing• Product Documentation

37

TCSEC Classes

• D – Minimal Protection
• C1 – Discretionary Protection
  – Identification and authentication and DAC
  – Users processing data at a common sensitivity level; separates users from data
  – Minimal assurance; may be based on features, not evaluation
• C2 – Controlled Access Protection
  – Adds object reuse and auditing
  – More testing requirements
  – Windows NT 3.5 evaluated C2

38

TCSEC Classes

• B1 – Labeled Security Protection
  – Adds MAC for some objects
    • Controlled objects “labeled”; access control based on these
  – Stronger testing requirements. Informal model of security policy. Bell-LaPadula model.
  – Trusted Unixes tended to be B1
• B2 – Structured Protection
  – Design and implementation must enable thorough testing and review
    • “Well-defined largely independent modules”
  – MAC for all objects, including devices. Additional logging. Trusted Path. Least privilege.
  – Covert channel analysis, configuration management, more documentation, formal model of security policy

39

TCSEC Classes

• B3 – Security Domains
  – Requirements on code modularity, layering, simplicity
  – Argument (short of proof) that implementation meets design specifications
  – Tamper-proof implementation, “highly resistant to penetration”
  – More stringent testing and documentation
• A1 – Verified Protection
  – Same functional requirements as B3
  – Five criteria
    • Formal model of protection and proofs of consistency/adequacy
    • Formal specification of protection system
    • Demonstration that specification corresponds to model of protection
    • “Proof” that implementation is consistent with specification
    • Formal analysis of covert channels
  – Existence proof: Honeywell’s SCOMP

40

TCSEC Evaluation process

• Originally controlled by government
  – No fee to vendor
  – Could reject an evaluation application if the product was not of interest to the government
• Later introduced fee-based evaluation labs
• Evaluation phases
  – Design analysis – no source code access
  – Test analysis
  – Final review

41

TCSEC Evaluation Issues

• Focused on operating systems
• Evaluating a specific configuration
  – E.g., Windows NT, no applications installed, no network
  – New patches, versions require re-certification
• Ratings Maintenance Program introduced to ease re-certifications
  – Incremental changes documented, re-evaluated
• Long time for evaluation
  – Sometimes a product was obsolete before evaluation finished
• Criteria creep
  – B1 means something more in 1999 than it did in 1989

42

Interim Efforts in the ’90s

• Canadian Trusted Computer Product Evaluation Criteria (CTCPEC)
• Information Technology Security Evaluation Criteria (ITSEC) – Western Europe
• Commercial International Security Requirements (CISR) – AmEx and EDS
• Federal Criteria – NSA and NIST

43

FIPS 140

• Federal Information Processing Standards
• Framework for evaluating cryptographic modules
• Still in use
• Addresses
  – Functionality
  – Assurance
  – Physical security
• Level 1 – algorithm must be FIPS-approved; can run on a COTS device
• Level 2 – physical security, role-based authentication, s/w crypto in multiprocessors
• Level 3 – enhanced physical security
• Level 4 – physical tamper detection/response
• Level 3 and 4 devices may be used with a suitably evaluated OS

44

Common Criteria – 1998 to today

• Pulls together international evaluation efforts
  – Evaluations mean something between countries
• Three top-level documents
  – Common Criteria Documents
    • Describe functional and assurance requirements. Define Evaluation Assurance Levels (EALs)
  – CC Evaluation Methodology (CEM)
    • More details on the evaluation. Complete through EAL5 (at least)
  – Evaluation Scheme
    • Nation-specific rules for how CC evaluations are performed in that country
    • Directed by NIST in the US

45

CC Terminology

• Target of Evaluation (TOE)
  – The product being evaluated
• TOE Security Policy (TSP)
  – Rules that regulate how assets are managed, protected, and distributed in a product
• TOE Security Functions (TSF)
  – Implementation of the TSP (all hardware, software, and firmware relied upon for the correct enforcement of the TSP)

CC evaluates “protection profiles”, and products/systems against a pre-defined (or user-defined) Evaluation Assurance Level (EAL)

46

Protection Profile (PP)

• Profile that describes the security requirements for a class of products
  – Implementation-independent; targets products or systems for specific consumer needs
  – Stated in terms of threats, environmental issues and assumptions, and security objectives
  – List of PPs: http://www.commoncriteriaportal.org/pp.html
• Replaces the fixed set of classes from TCSEC
• ISSO created some initial profiles to match TCSEC classes
  – Controlled Access Protection Profile (CAPP) corresponds to C2
  – Labeled Security Protection Profile (LSPP) corresponds to B1

PP Format

A PP has six sections
  – Introduction: PP identification, overview (narrative summary)
  – Product or System Family Description: type and general IT features. Context of use.
  – Product or System Family Security Environment: assumptions about use and environment. Threats requiring protection. Organizational policies required.
  – Security Objectives: two types. For product/system: trace objectives to specified threats and policies. For environment: traced to threats not countered by the product or by assumptions about the product.
  – IT Security Requirements: functional (drawn from CC, or other). Security assurance: based on an EAL.
  – Rationale: two parts. Objectives: trace stated objectives to all assumptions, threats, organizational policies. Requirements: show they are traceable to objectives, and meet them.

48

Product Evaluation

• Define a Security Target (ST)
  – Structured very much like a PP, except with more implementation specificity
  – May leverage an evaluated protection profile
• Evaluated with respect to the ST

49

CC Functional Requirements

• Defined in a taxonomy
  – Top level: 11 classes
    • E.g., FAU – Security Audit and FDP – User Data Protection
  – Each class divided into families
    • E.g., FDP_ACC – Access control policy
  – Each family divided into components
    • E.g., FDP_ACC.2 – Complete access control
  – Each component contains requirements and dependencies on other requirements
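The class/family/component taxonomy maps naturally onto nested dictionaries; the entries below are just the examples named on this slide:

```python
# Sketch of the CC functional-requirement taxonomy as nested dicts.
# Only the slide's own examples are filled in; the rest is elided.
cc_functional = {
    "FAU": {"name": "Security Audit", "families": {}},
    "FDP": {
        "name": "User Data Protection",
        "families": {
            "FDP_ACC": {
                "name": "Access control policy",
                "components": {"FDP_ACC.2": "Complete access control"},
            },
        },
    },
}

# Component IDs encode their position: class FDP, family ACC, component 2.
fdp_acc = cc_functional["FDP"]["families"]["FDP_ACC"]
assert fdp_acc["components"]["FDP_ACC.2"] == "Complete access control"
```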

CC Classes

• FAU: Security Audit
• FCO: Communication
  – Addresses non-repudiation
• FCS: Crypto Support
  – Key management, other
• FDP: User Data Protection
  – Policies for access control, information flow
• FIA: Identification and Authentication
• FMT: Security Management
  – Attributes, management of functions in TSF, revocation
• FPR: Privacy
  – Anonymity, unobservability
• FPT: Protection of Security Functions
  – Physical; many logical self-tests, integrity checks
• FRU: Resource Utilization
  – Fault tolerance, priorities, allocation
• FTA: TOE Access
  – Concurrency, session locking, access banners
• FTP: Trusted Path

51

CC Assurance Requirements

• Similar class, family, component taxonomy
• Eight product-oriented assurance classes
  – ACM – Configuration Management
  – ADO – Delivery and Operation
  – ADV – Development
  – AGD – Guidance Documentation
  – ALC – Life Cycle
  – ATE – Tests
  – AVA – Vulnerability Analysis
  – AMA – Maintenance of Assurance

52

Evaluation Assurance Levels

• Seven fixed EALs
  – EAL1 – Functionally Tested
  – EAL2 – Structurally Tested
  – EAL3 – Methodically Tested and Checked
    • Analogous to C2
  – EAL4 – Methodically Designed, Tested, and Reviewed
  – EAL5 – Semiformally Designed and Tested
  – EAL6 – Semiformally Verified Design and Tested
  – EAL7 – Formally Verified Design and Tested

53

CC Evaluation Process in US

• NIST provides accreditation of third-party evaluation labs
  – Vendor pays lab
  – Lab works with an oversight board
• Evaluate both PPs and products
• List of evaluated products
  – http://www.commoncriteriaportal.org/products.html

54

Key Points

• Evaluation for the benefit of the customer
• Product Evaluations
  – Functional Requirements
  – Assurance Requirements
