Security Policy Models
CSC 482/582: Computer Security

  • Slide 1
  • Security Policy Models CSC 482/582: Computer Security
  • Slide 2
  • Topics
    1. Types of Policy Models
    2. Bell-LaPadula Model
       1. Rules
       2. Tranquility
       3. Controversy: ‡-property, System Z
    3. Integrity Policies
       1. Requirements
       2. Biba's models
          1. Low-Water-Mark policy
          2. Ring policy
          3. Strict Integrity policy
    4. Mixed Policies
       1. Chinese Wall Model
  • Slide 3
  • Confidentiality Policy: Goal is to prevent the unauthorized disclosure of information. Deals with information flow; integrity is incidental. Multi-level security models are the best-known examples, and the Bell-LaPadula Model is the basis for most of these.
  • Slide 4
  • Bell-LaPadula Model, Step 1: Subjects have a security clearance L(s); objects have a security classification L(o). Levels, from lowest to highest: Unclassified, Confidential, Secret, Top Secret.
  • Slide 5
  • Example:
      security level | subject | object
      Top Secret     | Tamara  | Personnel Files
      Secret         | Samuel  | E-Mail Files
      Confidential   | Claire  | Activity Logs
      Unclassified   | Ulaley  | Telephone Lists
    Tamara can read all files. Claire cannot read Personnel or E-Mail Files. Ulaley can only read Telephone Lists.
  • Slide 6
  • Reading Information: Information flows up, not down. Reads up are disallowed; reads down are allowed. Simple Security Condition (Step 1): Subject s can read object o iff L(o) ≤ L(s) and s has permission to read o. Note: combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no reads up" rule.
  • Slide 7
  • Writing Information: Information flows up, not down. Writes up are allowed; writes down are disallowed. *-Property (Step 1): Subject s can write object o iff L(s) ≤ L(o) and s has permission to write o. Note: combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no writes down" rule.
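    A minimal sketch (not from the slides) of the Step 1 checks in Python, assuming a simple linear ordering of levels; the level names, RANK table, and function names are illustrative only:

        # Hypothetical sketch of the Step 1 Bell-LaPadula checks (linear levels only).
        LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]
        RANK = {name: i for i, name in enumerate(LEVELS)}

        def can_read(subject_level, object_level, has_read_permission):
            # Simple security condition (Step 1): L(o) <= L(s), plus discretionary permission.
            return RANK[object_level] <= RANK[subject_level] and has_read_permission

        def can_write(subject_level, object_level, has_write_permission):
            # *-property (Step 1): L(s) <= L(o), plus discretionary permission.
            return RANK[subject_level] <= RANK[object_level] and has_write_permission

        # Example: a Secret subject may read Confidential data but not write down to it.
        assert can_read("Secret", "Confidential", True)
        assert not can_write("Secret", "Confidential", True)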
  • Slide 8
  • Basic Security Theorem, Step 1: If (1) the system is initially in a secure state, (2) every transition of the system satisfies the simple security condition, and (3) every transition satisfies the *-property, then every state of the system is secure. Proof: induction on the number of transitions.
  • Slide 9
  • Bell-LaPadula Model, Step 2: Security level is (clearance, category set). Examples: (Top Secret, {Nuc, Eur, Asi}), (Confidential, {Eur, Asi}), (Secret, {Nuc, Asi}).
  • Slide 10
  • Relationship of Security Levels: (A, C) dom (A′, C′) iff A ≥ A′ and C ⊇ C′. Examples: (Top Secret, {Nuc, Asi}) dom (Secret, {Nuc}); (Secret, {Nuc, Eur}) dom (Confidential, {Nuc, Eur}); (Top Secret, {Nuc}) ¬dom (Confidential, {Eur}). Categories are based on the "need to know" principle.
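    A minimal Python sketch of the dom relation on (clearance, category set) pairs; the RANK table and category names are illustrative, and Python's set operator >= is used for the superset test:

        # Hypothetical sketch of dom on (clearance, category set) security levels.
        RANK = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

        def dom(level_a, level_b):
            # (A, C) dom (A', C') iff A >= A' and C is a superset of C'.
            clearance_a, categories_a = level_a
            clearance_b, categories_b = level_b
            return RANK[clearance_a] >= RANK[clearance_b] and categories_a >= categories_b

        assert dom(("Top Secret", {"Nuc", "Asi"}), ("Secret", {"Nuc"}))
        assert not dom(("Top Secret", {"Nuc"}), ("Confidential", {"Eur"}))  # not related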
  • Slide 11
  • Levels and Ordering: Security levels are partially ordered; any pair of security levels may (or may not) be related by dom. "dominates" serves the role of "greater than" in Step 1; "greater than" is a total ordering, though.
  • Slide 12
  • Reading Information: Information flows up, not down. Reads up are disallowed; reads down are allowed. Simple Security Condition (Step 2): Subject s can read object o iff L(s) dom L(o) and s has permission to read o. Sometimes called the "no reads up" rule.
  • Slide 13
  • Writing Information: Information flows up, not down. Writes up are allowed; writes down are disallowed. *-Property (Step 2): Subject s can write object o iff L(o) dom L(s) and s has permission to write o. Sometimes called the "no writes down" rule.
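    Continuing the sketch above (still hypothetical, reusing dom()), the Step 2 rules simply swap which side must dominate:

        # Hypothetical Step 2 checks, reusing the dom() sketch above.
        def can_read(subject_level, object_level, has_read_permission):
            # Simple security condition (Step 2): L(s) dom L(o).
            return dom(subject_level, object_level) and has_read_permission

        def can_write(subject_level, object_level, has_write_permission):
            # *-property (Step 2): L(o) dom L(s).
            return dom(object_level, subject_level) and has_write_permission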
  • Slide 14
  • Basic Security Theorem, Step 2: If (1) the system is initially in a secure state, (2) every transition of the system satisfies the simple security condition, and (3) every transition satisfies the *-property, then every state of the system is secure. Proof: induction on the number of transitions.
  • Slide 15
  • Problem: A colonel has (Secret, {Nuc, Eur}) clearance; a major has (Secret, {Eur}) clearance. The major can talk to the colonel (write up or read down), but the colonel cannot talk to the major (read up or write down). Clearly absurd!
  • Slide 16
  • Solution: Define maximum (L_max) and current (L_cur) levels for subjects, with L_max(s) dom L_cur(s). Example: treat the Major as an object (the Colonel is writing to him/her). The Colonel has L_max = (Secret, {Nuc, Eur}) and sets L_cur to (Secret, {Eur}). Now L(Major) dom L_cur(Colonel), so the Colonel can write to the Major without violating "no writes down".
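    A worked version of the Colonel/Major example, reusing the hypothetical dom() and Step 2 can_write() sketches above; the level tuples are the ones from the slide:

        # The Colonel tracks a maximum level and a (changeable) current level.
        colonel_max = ("Secret", {"Nuc", "Eur"})
        colonel_cur = ("Secret", {"Nuc", "Eur"})
        major = ("Secret", {"Eur"})

        # Writing to the Major fails against the Colonel's full current level ...
        assert not can_write(colonel_cur, major, True)

        # ... but after the Colonel drops L_cur (allowed, since L_max dom L_cur),
        # the *-property is satisfied: L(Major) dom L_cur(Colonel).
        colonel_cur = ("Secret", {"Eur"})
        assert dom(colonel_max, colonel_cur)
        assert can_write(colonel_cur, major, True)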
  • Slide 17
  • Question: In the simple security condition and the *-property, does L(s) mean the current level or the maximum level of s? The subject level is treated as the current level (L(s) = L_cur(s) in the properties). If subjects downgrade, they are trusted not to reveal information between L_max(s) and L_cur(s) to others who do not have clearance for that information; this includes writing it to objects at lower levels.
  • Slide 18
  • Principle of Tranquility: Security levels cannot change after instantiation. Raising an object's security level: information once available to some subjects is no longer available; usually we assume the information has already been accessed, so this does nothing. Lowering an object's security level: the declassification problem, essentially a write down violating the *-property. Solution: define a set of trusted subjects that sanitize or remove sensitive information before the security level is lowered.
  • Slide 19
  • Types of Tranquility: Strong tranquility: the clearances of subjects and the classifications of objects do not change during the lifetime of the system. Weak tranquility: the clearances of subjects and the classifications of objects do not change in a way that violates the simple security condition or the *-property during the lifetime of the system.
  • Slide 20
  • Controversy: "value of the BST is much overrated since there is a great deal more to security than it captures. Further, what is captured by the BST is so trivial that it is hard to imagine a realistic security model for which it does not hold." (McLean) Basis: given assumptions known to be non-secure, the BST can prove a non-secure system to be secure.
  • Slide 21
  • ‡-Property: Subject s can write object o iff L_cur(s) dom L(o) and s has permission to write o. Idea: change the model so that, for writing, the subject dominates the object; for reading, the subject also dominates the object. Differs from the *-property in that the mandatory condition for writing is reversed: for the *-property, the object must dominate the subject.
  • Slide 22
  • Analogue: The analogue to the Basic Security Theorem is: a system is a secure system if its initial state is a secure state and all actions satisfy the conditions for the simple security condition, the ‡-property, and the discretionary security property.
  • Slide 23
  • Problem: This system is clearly non-secure! Information flows from higher to lower because of the ‡-property.
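    A small demonstration of the problem, reusing the hypothetical dom() and Step 2 can_read() sketches above; can_write_mclean() is an illustrative name for the reversed write rule:

        # Hypothetical sketch of the reversed write rule.
        def can_write_mclean(subject_cur_level, object_level, has_write_permission):
            # ‡-property: the subject's current level dominates the object's level.
            return dom(subject_cur_level, object_level) and has_write_permission

        top_secret = ("Top Secret", {"Nuc"})
        unclassified = ("Unclassified", set())

        # A Top Secret subject may read Top Secret data (simple security condition)
        # and then write it into an Unclassified object: information flows down.
        assert can_read(top_secret, top_secret, True)
        assert can_write_mclean(top_secret, unclassified, True)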
  • Slide 24
  • Discussion: The role of the Basic Security Theorem is to demonstrate that rules preserve security. Key question: what is security? Bell-LaPadula defines it in terms of 3 properties (simple security condition, *-property, discretionary security property); the theorems are assertions about these properties. Rules describe changes to a particular system instantiating the model; showing the system is secure requires proving the rules preserve these 3 properties.
  • Slide 25
  • Rules and Model: The nature of the rules is irrelevant to the model; the model treats security as axiomatic. Policy defines security; this instantiates the model, and the policy reflects the requirements of the system. McLean's definition differs from Bell-LaPadula's and is not suitable for a confidentiality policy. Analysts cannot prove through the model that a security definition is appropriate.
  • Slide 26
  • Integrity Policy Requirements:
    1. Users will not write their own programs, but will use existing production programs and databases.
    2. Programmers will develop and test programs on a nonproduction system; if they need access to actual data, they will be given production data via a special process, but will use it on their development system.
    3. A special process must be followed to install a program from the development system onto the production system.
    4. The special process in requirement 3 must be controlled and audited.
    5. The managers and auditors must have access to both the system state and the system logs that are generated.
  • Slide 27
  • Biba Integrity Model: Basis for all 3 models: a set of subjects S, a set of objects O, and a set of integrity levels I. These can be linearly ordered or partially ordered (like Bell-LaPadula security levels). i(s), i(o) give the integrity level of a subject, object. Use ≤ rather than dom: if i(s) ≤ i(o), the object's integrity level dominates the subject's integrity level.
  • Slide 28
  • Intuition for Integrity Levels: The higher the level, the more confidence that a program will execute correctly and that data is accurate and/or reliable. Note the relationship between integrity and trustworthiness. Important point: integrity levels are not security levels.
  • Slide 29
  • Low-Water-Mark Policy: Idea: when s reads o, i′(s) = min(i(s), i(o)); s can only write objects at lower levels. Rules:
    1. Subjects cannot write to higher-trust objects: s ∈ S can write to o ∈ O if and only if i(o) ≤ i(s).
    2. A subject's integrity is lowered by using untrusted data: if s ∈ S reads o ∈ O, then i′(s) = min(i(s), i(o)), where i′(s) is the subject's integrity level after the read.
    3. A subject can only run programs of lower trust levels: s₁ ∈ S can execute s₂ ∈ S if and only if i(s₂) ≤ i(s₁).
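    A minimal Python sketch (not from the slides) of the three Low-Water-Mark rules, assuming integer integrity levels where a higher number means more trusted; all names are illustrative:

        # Hypothetical sketch of the Low-Water-Mark rules.
        def lwm_read(subject_level, object_level):
            # Rule 2: reading lowers the subject to min(i(s), i(o)).
            return min(subject_level, object_level)

        def lwm_can_write(subject_level, object_level):
            # Rule 1: writes allowed only to objects of equal or lower integrity.
            return object_level <= subject_level

        def lwm_can_execute(caller_level, callee_level):
            # Rule 3: a subject may only execute subjects of equal or lower integrity.
            return callee_level <= caller_level

        s = 3                     # a trusted subject
        s = lwm_read(s, 1)        # reading a low-integrity object drops it to 1
        assert not lwm_can_write(s, 2)   # it can no longer write level-2 objects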
  • Slide 30
  • Problems: Subjects' integrity levels decrease as the system runs; soon no subject will be able to access objects at high integrity levels. Alternative: change object levels rather than subject levels; soon all objects will be at the lowest integrity level. The crux is how the model prevents indirect modification: subject levels are lowered whenever a subject reads from a low-integrity object.
  • Slide 31
  • Ring Policy: Idea: subject integrity levels are static. Rules:
    1. Subjects can only write to lower-integrity objects: s ∈ S can write to o ∈ O if and only if i(o) ≤ i(s).
    2. Any subject can read any object.
    3. A subject can only run programs of lower trust levels: s₁ ∈ S can execute s₂ ∈ S if and only if i(s₂) ≤ i(s₁).
    Eliminates the indirect modification problem.
  • Slide 32
  • Strict Integrity Policy: Similar to the Bell-LaPadula model:
    1. s ∈ S can read o ∈ O iff i(s) ≤ i(o)
    2. s ∈ S can write to o ∈ O iff i(o) ≤ i(s)
    3. s₁ ∈ S can execute s₂ ∈ S iff i(s₂) ≤ i(s₁)
    Note: if both read and write are allowed, i(o) = i(s). Add compartments and discretionary controls to get the full dual of the Bell-LaPadula model. The term "Biba Model" refers to this policy.
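    A minimal Python sketch of the Strict Integrity rules on integer integrity levels (higher number = more trusted); names are illustrative:

        # Hypothetical sketch of the Strict Integrity (Biba) checks.
        def biba_can_read(subject_level, object_level):
            # No read down: i(s) <= i(o).
            return subject_level <= object_level

        def biba_can_write(subject_level, object_level):
            # No write up: i(o) <= i(s).
            return object_level <= subject_level

        # A low-integrity subject can read up but cannot contaminate trusted data.
        assert biba_can_read(1, 3)
        assert not biba_can_write(1, 3)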
  • Slide 33
  • Mixed Models: Chinese Wall Model: focuses on conflict of interest. CISS Policy: medical records require integrity and confidentiality. ORCON: combines mandatory and discretionary access controls. RBAC: bases controls on job function; users may change roles.
  • Slide 34
  • Chinese Wall Model: Problem: Tony advises American Bank about investments. He is asked to advise Toyland Bank about investments. It is a conflict of interest to accept, because his advice for either bank would affect his advice to the other bank.
  • Slide 35
  • Organization: Organize entities into conflict of interest classes. Control subject accesses to each class. Control writing to all classes to ensure information is not passed along in violation of the rules. Allow sanitized data to be viewed by everyone.
  • Slide 36
  • Definitions: Objects: items of information related to a company. Company dataset (CD): contains objects related to a single company; written CD(o). Conflict of interest class (COI): contains datasets of companies in competition; written COI(o). Assume: each object belongs to exactly one COI class.
  • Slide 37
  • Chinese Wall Example: Bank COI class: Bank of America, Citibank, Bank of the West. Gasoline company COI class: Shell Oil, Union 76, Standard Oil, ARCO.
  • Slide 38
  • Temporal Element: If Anthony reads any CD in a COI, he can never read another CD in that COI. It is possible that information learned earlier may allow him to make decisions later. Let PR(s) be the set of objects that s has already read.
  • Slide 39
  • CW-Simple Security Condition: s can read o iff either condition holds:
    1. There is an o′ such that s has accessed o′ and CD(o′) = CD(o), meaning s has read something in o's dataset.
    2. For all o′ ∈ O, o′ ∈ PR(s) ⇒ COI(o′) ≠ COI(o), meaning s has not read any objects in o's conflict of interest class.
    Ignores sanitized data (see below). Initially PR(s) = ∅, so the initial read request is granted.
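    A minimal Python sketch (not from the slides) of the CW-simple security condition, including the sanitized-object condition introduced on the next slide; the COI and CD tables and object names are illustrative:

        # Hypothetical sketch of the CW-simple security condition.
        COI = {"BankA-memo": "banks", "BankB-memo": "banks", "Shell-memo": "oil"}
        CD  = {"BankA-memo": "BankA", "BankB-memo": "BankB", "Shell-memo": "Shell"}

        def cw_can_read(pr, obj, sanitized=False):
            if sanitized:                               # condition 3: sanitized data
                return True
            if any(CD[o] == CD[obj] for o in pr):       # condition 1: same dataset
                return True
            return all(COI[o] != COI[obj] for o in pr)  # condition 2: no COI conflict

        pr = set()                                 # PR(s) is initially empty
        assert cw_can_read(pr, "BankA-memo")
        pr.add("BankA-memo")
        assert not cw_can_read(pr, "BankB-memo")   # same conflict-of-interest class
        assert cw_can_read(pr, "Shell-memo")       # different COI class is fine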
  • Slide 40
  • Sanitization: Public information may belong to a CD. As it is publicly available, no conflicts of interest arise, so it should not affect the ability of analysts to read. Typically, all sensitive data is removed from such information before it is released publicly (called sanitization). Add a third condition to CW-simple security: 3. o is a sanitized object.
  • Slide 41
  • Writing: Anthony and Susan work in the same trading house. Anthony can read Bank 1's CD and the Gas CD. Susan can read Bank 2's CD and the Gas CD. If Anthony could write to the Gas CD, Susan could read it. Hence, indirectly, she could read information from Bank 1's CD, a clear conflict of interest.
  • Slide 42
  • CW-*-Property: s can write to o iff both of the following hold:
    1. The CW-simple security condition permits s to read o; and
    2. For all unsanitized objects o′, if s can read o′, then CD(o′) = CD(o).
    Says that s can write to an object only if all the (unsanitized) objects it can read are in the same dataset.
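    A minimal Python sketch of the CW-*-property, reusing the hypothetical cw_can_read(), CD, and COI tables above; readable_unsanitized stands for the set of unsanitized objects s can read:

        # Hypothetical sketch of the CW-*-property.
        def cw_can_write(pr, obj, readable_unsanitized):
            # s may write o only if it can read o and every unsanitized object it
            # can read lies in o's own company dataset.
            if not cw_can_read(pr, obj):
                return False
            return all(CD[o] == CD[obj] for o in readable_unsanitized)

        # Anthony can read BankA's and Shell's datasets, so he may write to neither:
        # a write to the Shell CD could carry BankA information to another analyst.
        readable = {"BankA-memo", "Shell-memo"}
        assert not cw_can_write({"BankA-memo"}, "Shell-memo", readable)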
  • Slide 43
  • Information Flows: Key question: under what conditions can information flow from o ∈ O to o′ ∈ O? Answer: when either CD(o) = CD(o′), or o is a sanitized object.
  • Slide 44
  • Compare to Bell-LaPadula: Fundamentally different. CW has no security labels; B-LP does. CW has a notion of past accesses; B-LP does not. Bell-LaPadula can capture the state at any time: each (COI, CD) pair gets a security category; two clearances, S (sanitized) and U (unsanitized), with S dom U; subjects are assigned clearances for compartments without multiple categories corresponding to CDs in the same COI class.
  • Slide 45
  • Compare to Bell-LaPadula: Bell-LaPadula cannot track changes over time. Example: Susan becomes ill and Anna needs to take over; C-W history lets Anna know whether she can, and there is no way for Bell-LaPadula to capture this. Access constraints change over time: initially, subjects in C-W can read any object, whereas Bell-LaPadula constrains the set of objects a subject can access. We can't clear all subjects for all categories, because that violates the CW-simple security condition.
  • Slide 46
  • Key Points: Confidentiality models restrict the flow of information. Bell-LaPadula models multilevel security and is the cornerstone of much work in computer security. There is controversy over the meaning of security: different definitions produce different results. The Biba Model is the basis for integrity models. The Chinese Wall models time constraints on security.
  • Slide 47
  • References:
    1. Ross Anderson, Security Engineering, Wiley, 2001.
    2. David E. Bell and Leonard J. LaPadula, Secure Computer System: Unified Exposition and MULTICS Interpretation, MTR-2997 Rev. 1, The MITRE Corporation, Bedford, MA 01730 (Mar. 1976). http://csrc.nist.gov/publications/history/bell76.pdf
    3. Matt Bishop, Introduction to Computer Security, Addison-Wesley, 2005.
    4. Department of Defense, Trusted Computer System Evaluation Criteria, DoD 5200.28-STD (Orange Book), National Computer Security Center, Ft. Meade, MD 20755 (Dec. 1985). http://csrc.nist.gov/publications/history/dod85.pdf
    5. Peter Loscocco and Stephen Smalley, "Integrating Flexible Support for Security Policies into the Linux Operating System," Proceedings of the FREENIX Track of the 2001 USENIX Annual Technical Conference, 2001.