DAC vs. MAC / Bell-La Padula Model [BL]
DAC vs. MAC

• Most people are familiar with discretionary access control (DAC)
  - Unix permission bits are an example
  - Might set a file private so only group friends can read it
• Discretionary means anyone with access can propagate information:
  - Mail [email protected] < private
• Mandatory access control
  - Security administrator can restrict propagation
  - Abbreviated MAC (NOT to be confused with Message Authentication Code or Medium Access Control)

Bell-La Padula model [BL]

• View the system as subjects accessing objects
  - The system input is requests, the output is decisions
  - Objects can be organized in one or more hierarchies, H (a tree enforcing the type of descendants)
• Four modes of access are possible:
  - execute – no observation or alteration
  - read – observation
  - append – alteration
  - write – both observation and modification
• The current access set, b, is a set of (subject, object, attribute) triples
• An access matrix M encodes permissible access types (as before, subjects are rows, objects are columns)

Security levels

• A security level is a (c, s) pair:
  - c = classification – E.g., unclassified, secret, top secret
  - s = category-set – E.g., {Nuclear, Crypto}
• (c1, s1) dominates (c2, s2) iff c1 ≥ c2 and s1 ⊇ s2
  - L1 dominates L2 is sometimes written L1 ⊒ L2 or L2 ⊑ L1
  - Levels then form a lattice (partial order with lub & glb)
• Subjects and objects are assigned security levels
  - level(S), level(O) – security level of subject/object
  - current-level(S) – subject may operate at a lower level
  - level(S) bounds current-level(S) (current-level(S) ⊑ level(S))
  - Since level(S) is the maximum, it is sometimes called S's clearance

Security properties

• The simple security or ss-property:
  - For any (S, O, A) ∈ b, if A includes observation, then level(S) must dominate level(O)
  - E.g., an unclassified user cannot read a top-secret document
• The star security or ⋆-property:
  - If a subject can observe O1 and modify O2, then level(O2) dominates level(O1)
  - E.g., cannot copy a top-secret file into a secret file
  - More precisely, given (S, O, A) ∈ b (see the code sketch below):
    if A = r, then current-level(S) ⊒ level(O) ("no read up")
    if A = a, then current-level(S) ⊑ level(O) ("no write down")
    if A = w, then current-level(S) = level(O)

The lattice model [Denning]

• [Lattice diagram showing the levels ⟨unclassified, ∅⟩, ⟨secret, ∅⟩, ⟨secret, {Nuclear}⟩, ⟨secret, {Crypto}⟩, ⟨top-secret, ∅⟩, ⟨top-secret, {Nuclear}⟩, ⟨top-secret, {Crypto}⟩, and ⟨top-secret, {Nuclear, Crypto}⟩, ordered by dominance]
• An edge L1 → L2 means L1 ⊑ L2
• Information can only flow up the lattice
  - Think of ⊑ as the "can flow to" relation

Straw man MAC implementation

• Take an ordinary Unix system
• Put labels on all files and directories to track levels
• Each user U has a security clearance, level(U)
• Determine current security level dynamically
  - When U logs in, start with the lowest current-level
  - Increase current-level as higher-level files are observed (sometimes called a floating label system)
  - If U's level does not dominate current-level, kill the program
  - Kill a program that writes to a file that doesn't dominate it
• Is this secure?
  - System enforces "No read up, no write down"
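The dominance relation and the ss-/⋆-property checks above translate almost directly into code. Below is a minimal sketch in Python (not from the original slides; the Level class, the may_* helpers, and the hard-coded classification ranking are made up purely for illustration):

    # Minimal Bell-La Padula sketch: dominance plus ss- and star-property checks.
    from dataclasses import dataclass

    CLASSIFICATION_RANK = {"unclassified": 0, "secret": 1, "top-secret": 2}

    @dataclass(frozen=True)
    class Level:
        c: str              # classification, e.g. "secret"
        s: frozenset        # category set, e.g. frozenset({"Nuclear"})

        def dominates(self, other):
            # (c1, s1) dominates (c2, s2) iff c1 >= c2 and s1 is a superset of s2
            return (CLASSIFICATION_RANK[self.c] >= CLASSIFICATION_RANK[other.c]
                    and self.s >= other.s)

    def may_read(current_level, obj_level):
        # ss-property, "no read up": the reader must dominate the object
        return current_level.dominates(obj_level)

    def may_append(current_level, obj_level):
        # star-property, "no write down": the object must dominate the writer
        return obj_level.dominates(current_level)

    def may_write(current_level, obj_level):
        # write = observe + modify, so the two levels must be equal
        return current_level == obj_level

    secret_nuclear = Level("secret", frozenset({"Nuclear"}))
    ts_all = Level("top-secret", frozenset({"Nuclear", "Crypto"}))
    print(may_read(secret_nuclear, ts_all))    # False: no read up
    print(may_append(secret_nuclear, ts_all))  # True: appending upward is allowed
    print(may_append(ts_all, secret_nuclear))  # False: no write down

Run as a script this prints False, True, False: a ⟨secret, {Nuclear}⟩ subject may not observe ⟨top-secret, {Nuclear, Crypto}⟩ data, may append (write up) to it, but a top-secret subject may not append into the secret object.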
No: Covert channels

• System is rife with storage channels
  - Low current-level process executes another program
  - New program reads a sensitive file, gets a high current-level
  - High program exploits covert channels to pass data to low
• E.g., the high program inherits a file descriptor
  - Can pass 4 bytes of information to the low program in the file offset
• Other storage channels:
  - Exit value, signals, file locks, terminal escape codes, ...
• If we eliminate storage channels, is the system secure?

No: Timing channels

• Example: CPU utilization (a toy sketch of this channel appears below, after the Common Criteria slide)
  - To send a 0 bit, use 100% of the CPU in a busy-loop
  - To send a 1 bit, sleep and relinquish the CPU
  - Repeat to transfer more bits
• Example: Resource exhaustion
  - High program allocates all physical memory if the bit is 1
  - If the low program is slow from paging, it knows less memory is available
• More examples: disk head position, processor cache/TLB pollution, ...

Reducing covert channels

• Observation: Covert channels come from sharing
  - If you have no shared resources, there are no covert channels
  - Extreme example: Just use two computers (common in DoD)
• Problem: Sharing is needed
  - E.g., read unclassified data when preparing a classified document
• Approach: Strict partitioning of resources
  - Strictly partition and schedule resources between levels
  - Occasionally reapportion resources based on usage
  - Do so infrequently to bound the leaked information
  - In general, can only hope to bound the bandwidth of covert channels
  - Approach still not so good if many security levels are possible

Declassification

• Sometimes need to prepare an unclassified report from classified data
• Declassification happens outside of the system
  - Present the file to a security officer for downgrade
• Job of declassification is often not trivial
  - E.g., Microsoft Word saves a lot of undo information
  - This might be all the secret stuff you cut from the document
  - Another bad mistake: redacted PDFs using black censor bars over or under text (but the text is still selectable)

Biba integrity model [Biba]

• Problem: How to protect integrity
  - Suppose a text editor gets trojaned and subtly modifies files; it might mess up attack plans
• Observation: Integrity is the converse of secrecy
  - In secrecy, want to avoid writing less secret files
  - In integrity, want to avoid writing higher-integrity files
• Use an integrity hierarchy parallel to the secrecy one
  - Now a security level is a ⟨c, i, s⟩ triple, i = integrity
  - ⟨c1, i1, s1⟩ ⊑ ⟨c2, i2, s2⟩ iff c1 ≤ c2 and i1 ≥ i2 and s1 ⊆ s2
  - Only trusted users can operate at low integrity levels
  - If you read less authentic data, your current integrity level gets lowered (putting you higher up in the lattice), and you can no longer write higher-integrity files

DoD Orange book

• DoD requirements for certification of secure systems
• 4 divisions:
  - D – been through certification and not secure
  - C – discretionary access control
  - B – mandatory access control
  - A – like B, but better verified design
• Classes within divisions give increasing levels of security

Limitations of Orange book

• How to deal with floppy disks?
• How to deal with networking?
• Takes too long to certify a system
  - People don't want to run n-year-old software
• Doesn't fit non-military models very well
  - What if you want high assurance & DAC?

Today: Common Criteria

• Replaced the Orange book around 1998
• Three parts to CC:
  - CC Documents, including protection profiles with both functional and assurance requirements
  - CC Evaluation Methodology
  - National Schemes (local ways of doing evaluation)
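As a concrete illustration of the CPU-utilization channel from the timing-channels slide above, here is a deliberately crude toy in Python (not from the slides). Two threads in one process stand in for the high and low processes, the bit pattern and slot length are arbitrary, and decoding depends on the scheduler, so it may mis-read bits on a loaded machine:

    # Toy CPU-utilization timing channel: sender modulates CPU load, receiver
    # measures how much work it manages to do in each time slot.
    import threading, time

    SLOT = 0.25                      # seconds per transmitted bit
    MESSAGE = [1, 0, 1, 1, 0]        # bits the "high" side wants to leak

    def high_sender(bits):
        for b in bits:
            deadline = time.time() + SLOT
            if b == 0:
                while time.time() < deadline:   # 0 bit: hog the CPU
                    pass
            else:
                time.sleep(SLOT)                # 1 bit: relinquish the CPU

    def low_receiver(nbits, samples):
        for _ in range(nbits):
            deadline = time.time() + SLOT
            work = 0
            while time.time() < deadline:       # measure how much work we get done
                work += 1
            samples.append(work)

    samples = []
    recv = threading.Thread(target=low_receiver, args=(len(MESSAGE), samples))
    send = threading.Thread(target=high_sender, args=(MESSAGE,))
    recv.start(); send.start()
    recv.join(); send.join()

    # When the sender sleeps (bit 1) the receiver gets more CPU, so a high
    # work count decodes as 1 and a low count as 0.
    threshold = sum(samples) / len(samples)
    print([1 if w > threshold else 0 for w in samples])

The point is only that the receiver never reads any data the sender wrote; it infers bits purely from how much CPU time it was able to get, which is why eliminating storage channels alone does not make the system secure.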
LOMAC [Fraser]

• MAC not widely accepted outside the military
• LOMAC's goal: make MAC more palatable
  - Stands for Low Water-Mark Access Control
• Concentrates on integrity
  - More important goal for many settings
  - E.g., don't want viruses tampering with all your files
  - Also don't have to worry as much about covert channels
• Provides reasonable defaults (minimally obtrusive)
• Has actually had some impact
  - Originally available for Linux (2.2)
  - Now ships with FreeBSD
  - Probably inspired Vista's Mandatory Integrity Control (MIC)

LOMAC overview

• Subjects are jobs (essentially processes)
  - Each subject has an integrity number (e.g., 1, 2)
  - Higher numbers mean more integrity (so unfortunately 2 ⊑ 1 by the earlier notation)
  - Subjects can be reclassified on observation of low-integrity data
• Objects are files, pipes, etc.
  - Objects have a fixed integrity level; it cannot change
• Security: low-integrity subjects cannot write to high-integrity objects
• New objects have the level of their creator

LOMAC defaults

• [Diagram: level 2 holds /bin, /etc, the remote management link (eth1), and terminal tty1; level 1 holds WWW downloads, email, untrusted ttyS0, and eth0 on the external net]
• Two levels: 1 and 2
• Level 2 (high-integrity) contains:
  - FreeBSD/Linux files intact from the distro, static web server config
  - The console, trusted terminals, trusted network
• Level 1 (low-integrity) contains:
  - NICs connected to the Internet, untrusted terminals, etc.
• Idea: Suppose a worm compromises your web server
  - Worm comes from the network → level 1
  - Won't be able to muck with system files or the web server config

The self-revocation problem

• Want to integrate with Unix unobtrusively
• Problem: Application expectations
  - Kernel access checks are usually done at file open time
  - Legacy applications don't pre-declare that they will observe low-integrity data
  - An application can "taint" itself unexpectedly, revoking its own permission to access an object it created

Self-revocation example

• User has a high-integrity (level 2) shell
• Runs: ps | grep user
  - Pipe created before ps reads low-integrity data
  - ps becomes tainted, can no longer write to grep
• [Animation: ps, pipe, and grep all start at level 2; ps reads /proc/327 (level 1), drops to level 1, and its write to the level-2 pipe is then denied] (a toy model of this scenario appears at the end of this excerpt)

Solution

• Don't consider pipes to be real objects
• Join multiple processes together in a "job"

The flask security architecture

• Problem: Military needs adequate secure systems
  - How to create
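To make LOMAC's low-water-mark rule and the ps | grep self-revocation example above concrete, here is the toy Python model referred to earlier (not LOMAC's real kernel interface; the Subject/Obj classes and the level numbers are made up for illustration):

    # Toy low-water-mark model: observing low-integrity data lowers a subject's
    # level, and a lowered subject can no longer write higher-integrity objects.
    class Obj:
        def __init__(self, name, level):
            self.name = name
            self.level = level            # fixed at creation time

    class Subject:
        def __init__(self, name, level):
            self.name = name
            self.level = level            # higher number = more integrity

        def observe(self, obj):
            # Low water mark: reading lower-integrity data drops our level.
            self.level = min(self.level, obj.level)

        def write(self, obj):
            # Low-integrity subjects cannot write to high-integrity objects.
            if self.level < obj.level:
                raise PermissionError(
                    f"{self.name} (level {self.level}) may not write "
                    f"{obj.name} (level {obj.level})")

    ps = Subject("ps", 2)                 # started from the level-2 shell
    pipe = Obj("pipe", 2)                 # created while ps was still level 2
    proc_entry = Obj("/proc/327", 1)      # low-integrity data about a process

    ps.observe(proc_entry)                # ps taints itself, drops to level 1
    try:
        ps.write(pipe)                    # the write ps expected now fails
    except PermissionError as err:
        print("self-revocation:", err)

Running it prints a self-revocation message: ps, having dropped to level 1 by observing /proc/327, can no longer write to the level-2 pipe that was created before it read the low-integrity data.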