DAC vs. MAC

• Most people familiar with discretionary access control (DAC)
  - Unix permission bits are an example
  - Might set a file private so only group friends can read it
• Discretionary means anyone with access can propagate information:
  - Mail [email protected] < private
• Mandatory access control
  - Security administrator can restrict propagation
  - Abbreviated MAC (NOT to be confused w. Message Authentication Code or Medium Access Control)

Bell-LaPadula model

• View the system as subjects accessing objects
  - The system input is requests, the output is decisions
  - Objects can be organized in one or more hierarchies, H (a tree enforcing the type of descendants)
• Four modes of access are possible:
  - execute – no observation or alteration
  - read – observation
  - append – alteration
  - write – both observation and modification
• The current access set, b, is a set of (subj, obj, attr) triples
• An access matrix M encodes permissible access types (as before, subjects are rows, objects columns)

Security levels

• A security level is a (c, s) pair:
  - c = classification – E.g., unclassified, secret, top secret
  - s = category-set – E.g., Nuclear, Crypto
• (c1, s1) dominates (c2, s2) iff c1 ≥ c2 and s2 ⊆ s1
  - L1 dominates L2 sometimes written L1 ⊒ L2 or L2 ⊑ L1
  - Levels then form a lattice (partial order w. lub & glb)
• Subjects and objects are assigned security levels
  - level(S), level(O) – security level of subject/object
  - current-level(S) – subject may operate at lower level
  - level(S) bounds current-level(S) (current-level(S) ⊑ level(S))
  - Since level(S) is max, sometimes called S's clearance

Security properties

• The simple security or ss-property:
  - For any (S, O, A) ∈ b, if A includes observation, then level(S) must dominate level(O)
  - E.g., an unclassified user cannot read a top-secret document
• The star security or *-property:
  - If a subject can observe O1 and modify O2, then level(O2) dominates level(O1)
  - E.g., cannot copy top secret file into secret file
  - More precisely, given (S, O, A) ∈ b:
      if A = r then current-level(S) ⊒ level(O) ("no read up")
      if A = a then current-level(S) ⊑ level(O) ("no write down")
      if A = w then current-level(S) = level(O)

The lattice model

[Figure: Hasse diagram of the lattice from ⟨unclassified, ∅⟩ at the bottom, through ⟨secret, ∅⟩, ⟨secret, {Nuclear}⟩, ⟨secret, {Crypto}⟩, ⟨top-secret, ∅⟩, ⟨top-secret, {Nuclear}⟩, ⟨top-secret, {Crypto}⟩, up to ⟨top-secret, {Nuclear, Crypto}⟩; an arrow L1 → L2 means L1 ⊑ L2]

• Information can only flow up the lattice
  - System enforces "No read up, no write down"
  - Think of ⊑ as "can flow to" relation

Straw man MAC implementation

• Take an ordinary Unix system
• Put labels on all files and directories to track levels
• Each user U has a security clearance, level(U)
• Determine current security level dynamically
  - When U logs in, start with lowest current-level
  - Increase current-level as higher-level files are observed (sometimes called a floating label system)
  - If U's level does not dominate current-level, kill program
  - Kill program that writes to file that doesn't dominate it
• Is this secure?

No: Covert channels

• System rife with storage channels
  - Low current-level process executes another program
  - New program reads sensitive file, gets high current-level
  - High program exploits covert channels to pass data to low
• E.g., high program inherits file descriptor
  - Can pass 4 bytes of information to low prog. in file offset
• Other storage channels:
  - Exit value, signals, file locks, terminal escape codes, ...
• If we eliminate storage channels, is system secure?

No: Timing channels

• Example: CPU utilization
  - To send a 0 bit, use 100% of CPU in busy-loop
  - To send a 1 bit, sleep and relinquish CPU
  - Repeat to transfer more bits
• Example: Resource exhaustion
  - High prog. allocates all physical memory if bit is 1
  - If low prog. slow from paging, knows less memory available
• More examples: Disk head position, processor cache/TLB pollution, ...

Reducing covert channels

• Observation: Covert channels come from sharing
  - If you have no shared resources, no covert channels
  - Extreme example: Just use two computers (common in DoD)
• Problem: Sharing needed
  - E.g., read unclassified data when preparing classified
• Approach: Strict partitioning of resources
  - Strictly partition and schedule resources between levels
  - Occasionally reapportion resources based on usage
  - Do so infrequently to bound leaked information
  - In general, only hope to bound bandwidth of covert channels
  - Approach still not so good if many security levels possible

Declassification

• Sometimes need to prepare unclassified report from classified data
• Declassification happens outside of system
  - Present file to security officer for downgrade
• Job of declassification often not trivial
  - E.g., Microsoft Word saves a lot of undo information
  - This might be all the secret stuff you cut from document
  - Another bad mistake: Redacted PDF using black censor bars over or under text (but text still selectable)
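The dominance relation and the two Bell-LaPadula properties above are mechanical enough to sketch in code. The following is a minimal, illustrative Python model (names like `Level`, `ss_property_ok`, and `star_property_ok` are my own, not from any real system); it treats a level as a (classification, category-set) pair exactly as the slides define it.

```python
# Toy sketch of Bell-LaPadula levels and checks (illustrative names only).
from dataclasses import dataclass

# Classifications ordered from low to high, as in the slides.
CLASSIFICATIONS = ["unclassified", "secret", "top-secret"]

@dataclass(frozen=True)
class Level:
    classification: str                   # c, e.g., "secret"
    categories: frozenset = frozenset()   # s, e.g., {"Nuclear", "Crypto"}

    def dominates(self, other: "Level") -> bool:
        # (c1, s1) dominates (c2, s2) iff c1 >= c2 and s2 subset-of s1
        return (CLASSIFICATIONS.index(self.classification)
                >= CLASSIFICATIONS.index(other.classification)
                and other.categories <= self.categories)

def ss_property_ok(subject_level: Level, object_level: Level) -> bool:
    # ss-property ("no read up"): observation requires level(S) ⊒ level(O)
    return subject_level.dominates(object_level)

def star_property_ok(current_level: Level, object_level: Level, access: str) -> bool:
    # *-property: r needs current-level(S) ⊒ level(O); a needs
    # current-level(S) ⊑ level(O) ("no write down"); w needs equality.
    if access == "r":
        return current_level.dominates(object_level)
    if access == "a":
        return object_level.dominates(current_level)
    if access == "w":
        return current_level == object_level
    return True  # execute: no observation or alteration

ts_nuc = Level("top-secret", frozenset({"Nuclear"}))
unclass = Level("unclassified")
print(ss_property_ok(unclass, ts_nuc))  # False: unclassified cannot read top secret
print(ss_property_ok(ts_nuc, unclass))  # True: information may flow up
```

Note how `dominates` gives only a partial order: ⟨secret, {Nuclear}⟩ and ⟨secret, {Crypto}⟩ are incomparable, which is why the levels form a lattice rather than a chain.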
Biba integrity model

• Problem: How to protect integrity
  - Suppose text editor gets trojaned, subtly modifies files, might mess up attack plans
• Observation: Integrity is the converse of secrecy
  - In secrecy, want to avoid writing less secret files
  - In integrity, want to avoid writing higher-integrity files
• Use integrity hierarchy parallel to secrecy one
  - Now security level is a ⟨c, i, s⟩ triple, i = integrity
  - ⟨c1, i1, s1⟩ ⊑ ⟨c2, i2, s2⟩ iff c1 ≤ c2 and i1 ≥ i2 and s1 ⊆ s2
  - Only trusted users can operate at low integrity levels
  - If you read less authentic data, your current integrity level gets lowered (putting you up higher in the lattice), and you can no longer write higher-integrity files

DoD Orange book

• DoD requirements for certification of secure systems
• 4 divisions:
  - D – been through certification and not secure
  - C – discretionary access control
  - B – mandatory access control
  - A – like B, but better verified design
• Classes within divisions give increasing levels of security

Limitations of Orange book

• How to deal with floppy disks?
• How to deal with networking?
• Takes too long to certify a system
  - People don't want to run n-year-old software
• Doesn't fit non-military models very well
• What if you want high assurance & DAC?

Today: Common Criteria

• Replaced Orange book around 1998
• Three parts to CC:
  - CC Documents, including protection profiles w. both functional and assurance requirements
  - CC Evaluation Methodology
  - National Schemes (local ways of doing evaluation)
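The Biba-extended ⟨c, i, s⟩ ordering above can be sketched directly from its definition. This is an illustrative Python fragment (the function name `can_flow_to` and the example levels are mine); it shows why observing low-integrity data revokes the ability to write high-integrity objects.

```python
# Sketch of the combined secrecy/integrity ordering from the slides:
# L1 ⊑ L2 ("L1 can flow to L2") iff c1 <= c2, i1 >= i2, and s1 ⊆ s2.
CLASSIFICATIONS = ["unclassified", "secret", "top-secret"]

def can_flow_to(l1, l2):
    (c1, i1, s1), (c2, i2, s2) = l1, l2
    return (CLASSIFICATIONS.index(c1) <= CLASSIFICATIONS.index(c2)
            and i1 >= i2          # integrity is ordered the opposite way
            and s1 <= s2)

editor  = ("secret", 2, frozenset())  # high-integrity subject
tainted = ("secret", 1, frozenset())  # same subject after reading low-integrity data
config  = ("secret", 2, frozenset())  # high-integrity object

print(can_flow_to(editor, config))   # True: untainted editor may write the config
print(can_flow_to(tainted, config))  # False: i1 >= i2 fails, write denied
```

Flipping the comparison on i is the whole trick: lowering a subject's integrity moves it *up* the flow lattice, so its output can no longer reach high-integrity files.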
LOMAC [Fraser]

• MAC not widely accepted outside military
• LOMAC's goal is to make MAC more palatable
  - Stands for Low Water-Mark Access Control
• Concentrates on integrity
  - More important goal for many settings
  - E.g., don't want viruses tampering with all your files
  - Also don't have to worry as much about covert channels
• Provides reasonable defaults (minimally obtrusive)
• Has actually had some impact
  - Originally available for Linux (2.2)
  - Integrated in FreeBSD-current source tree
  - Probably inspired Vista's Mandatory Integrity Control (MIC)

LOMAC overview

• Subjects are jobs (essentially processes)
  - Each subject has an integrity number (e.g., 1, 2)
  - Higher numbers mean more integrity (so unfortunately 2 ⊑ 1 by earlier notation)
  - Subjects can be reclassified on observation of low-integrity data
• Objects are files, pipes, etc.
  - Objects have fixed integrity level; cannot change
• Security: Low-integrity subjects cannot write to high-integrity objects
• New objects have level of the creator

LOMAC defaults

[Figure: level 2 holds /bin, /etc, the WWW config, tty1, and eth1 (remote management link); level 1 holds downloads, email, ttyS0 (untrusted terminal), and eth0 (external net)]

• Two levels: 1 and 2
• Level 2 (high-integrity) contains:
  - FreeBSD/Linux files intact from distro, static web server config
  - The console, trusted terminals, trusted network
• Level 1 (low-integrity) contains:
  - NICs connected to Internet, untrusted terminals, etc.
• Idea: Suppose worm compromises your web server
  - Worm comes from network → level 1
  - Won't be able to muck with system files or web server config

The self-revocation problem

• Want to integrate with Unix unobtrusively
• Problem: Application expectations
  - Kernel access checks usually done at file open time
  - Legacy applications don't pre-declare they will observe low-integrity data
  - An application can "taint" itself unexpectedly, revoking its own permission to access an object it created

Self-revocation example

• User has high-integrity (level 2) shell
• Runs: ps | grep user
  - Pipe created before ps reads low-integrity data
  - ps becomes tainted, can no longer write to grep

[Figure, four steps: ps, the pipe, and grep all start at level 2; ps reads /proc/327 at level 1; ps drops to level 1; ps's write to the level-2 pipe is then denied]

Solution

• Don't consider pipes to be real objects
• Join multiple processes together in a "job"
  - Pipe

The flask security architecture

• Problem: Military needs adequate secure systems
  - How to create civilian demand for systems military can use?
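The self-revocation example can be reproduced with a few lines of code. This is a minimal low-water-mark sketch, not LOMAC's actual implementation; the class and function names (`Subject`, `Obj`, `observe`, `can_write`) are hypothetical. It models the two rules the slides give: a subject's level floats down to the level of anything it observes, and a subject may write an object only when its current level is at least the object's fixed level.

```python
# Minimal low-water-mark sketch of LOMAC's rules (hypothetical names).
class Subject:
    def __init__(self, name, level):
        self.name, self.level = name, level   # 2 = high integrity, 1 = low

class Obj:
    def __init__(self, name, level):
        self.name, self.level = name, level   # object levels never change

def observe(subj, obj):
    # Low water mark: current level drops to the minimum of the two.
    subj.level = min(subj.level, obj.level)

def can_write(subj, obj):
    # Low-integrity subjects cannot write to high-integrity objects.
    return subj.level >= obj.level

ps = Subject("ps", 2)
pipe = Obj("pipe", 2)             # created at level 2, before any taint
proc_entry = Obj("/proc/327", 1)  # low-integrity data

print(can_write(ps, pipe))  # True: both at level 2
observe(ps, proc_entry)     # ps reads low-integrity data, drops to level 1
print(can_write(ps, pipe))  # False: ps has revoked its own pipe access
```

The second check failing is exactly the problem the "Solution" slide addresses: because the pipe keeps its creation-time level, the tainted `ps` loses access to an object it created, unless pipes are excluded from the object model.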