Cristina Nita-Rotaru
CS6740: Network Security
Anonymity

Sources
1. Crowds: http://avirubin.com/crowds.pdf
2. Chaum mix: http://www.ovmj.org/GNUnet/papers/p84-chaum.pdf
3. Tor: https://svn.torproject.org/svn/projects/design-paper/tor-design.pdf
4. Predecessor attack: http://prisms.cs.umass.edu/brian/pubs/wright-tissec.pdf
5. Also based on slides prepared by Chi-Cun Chan.

1: Motivation

Anonymity
- Anonymity ("without name") means that a person is not identifiable within a set of subjects
- Unlinkability of action and identity
  - For example, a sender and his email are no more related after the adversary's observations than they were before
  - Who talks to whom
- Unobservability
  - The adversary cannot tell whether someone is using a particular system and/or protocol

There is no anonymity on the Internet
- Your IP address can be linked directly to you
  - ISPs store communication records, usually for several years (data retention laws)
  - Law enforcement can subpoena these records
- Your browser is being tracked
  - Cookies, Flash cookies, ETags, HTML5 storage
  - Browser fingerprinting
- Your activities can be used to identify you
  - Unique websites and apps that you use
  - Types of links that you click

Wiretapping is ubiquitous
- Wireless traffic can be trivially intercepted
  - Airsnort, Firesheep, etc.
  - Both WiFi and cellular traffic!
  - Encryption helps, if it's strong (WEP and WPA are both vulnerable!)
- Tier 1 ASes and IXPs are compromised
  - NSA, GCHQ, "5 Eyes"
  - ~1% of all Internet traffic
  - Focus on encrypted traffic

Who uses anonymity systems?
- "If you're not doing anything wrong, you shouldn't have anything to hide."
  - Implies that anonymous communication is for criminals
- The truth: who uses Tor?
  - Journalists, human rights activists, normal people
  - Law enforcement, military/intelligence personnel
  - Business executives, abuse victims
- In fact, the predecessor of Tor was developed by the U.S. Naval Research Laboratory.

Why do we need anonymity?
- To protect privacy
  - Avoid tracking by advertising companies
  - Viewing sensitive content: information on medical conditions, advice on bankruptcy
- Protection from prosecution
  - Not every country guarantees free speech
  - Downloading copyrighted material
- To prevent chilling effects
  - It is easier to voice unpopular or controversial opinions if you are anonymous

Relevant applications
- Anonymous communication
- Anonymizing bulletin boards and email
- Electronic voting
- Incident reporting
- Anonymous e-commerce
- Private information retrieval

Anonymity layer
- Function: hide the source, destination, and content of Internet flows from eavesdroppers
- Key challenges:
  - Defining and quantifying anonymity
  - Building systems that are resilient to deanonymization
  - Maintaining performance
[Figure: protocol stack with an Anonymity layer inserted between the Application and Presentation layers, above Session, Transport, Network, Data Link, and Physical.]

2: Terminology

Quantifying anonymity: Anonymity set
- Hiding one's action among many others' actions
- Anonymity set: a group of users in which everyone is equally likely to be associated with a given action
  - ⇒ everyone has a certain degree of innocence, or deniability, with respect to that action
[Figure: four users u1-u4 forming an anonymity set around one action.]
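The anonymity-set idea can be made quantitative. A common metric from later work (added here for illustration; it is not part of these slides) is the entropy of the adversary's probability distribution over the set, normalized by the maximum possible entropy log2(N). The sketch below, including the function name and the example probabilities, is purely illustrative.

```python
import math

def anonymity_metrics(probabilities):
    """Given the adversary's probability estimate that each user in the
    anonymity set performed the action, return the set size, the entropy
    of the distribution (in bits), and the normalized degree of anonymity
    (entropy divided by the maximum, log2(N))."""
    n = len(probabilities)
    entropy = -sum(p * math.log2(p) for p in probabilities if p > 0)
    max_entropy = math.log2(n) if n > 1 else 0.0
    degree = entropy / max_entropy if max_entropy > 0 else 0.0
    return n, entropy, degree

# Four users u1..u4 that the adversary considers equally likely:
print(anonymity_metrics([0.25, 0.25, 0.25, 0.25]))  # (4, 2.0, 1.0): best case within this set

# Same set, but traffic analysis makes u1 much more suspicious:
print(anonymity_metrics([0.7, 0.1, 0.1, 0.1]))      # entropy ~1.36 bits, degree ~0.68
```

The point of the second call is that the anonymity set still has four members, yet the effective anonymity has shrunk because the adversary's beliefs are no longer uniform.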
More definitions
- Unlinkability
  - From the adversary's perspective, the inability to link two or more items of interest, e.g. packets, events, people, actions, etc.
  - Three parts:
    - Sender anonymity (who sent this?)
    - Receiver anonymity (who is the destination?)
    - Relationship anonymity (are sender A and receiver B linked?)
- Unobservability
  - From the adversary's perspective, items of interest are indistinguishable from all other items

Types of adversary
- Passive/Active
  - Passive: eavesdrops on traffic
  - Active: able to observe, delay, alter, and drop messages in the system
- Local/Global
  - Local: able to observe traffic to/from the user's network link, within the LAN
  - Global: able to observe a large fraction of (or all) network links, across LAN boundaries
- Internal/External
  - Internal: participants in the anonymity system, e.g. adversary-operated nodes
  - External: does not participate in the protocol, but may be able to observe, inject, or modify traffic in the system

TLS does not provide anonymity
- Content is unobservable, due to encryption
- Source and destination are trivially linkable
- No anonymity!
[Figure: encrypted data traffic flowing directly between client and server.]

Anonymizing proxies
- An observer on the client side sees that the source is known, but the destination is hidden (destination anonymity)
- An observer on the server side sees that the destination is known, but the source is hidden (source anonymity)
- The proxy itself sees both source and destination: no anonymity against the proxy
[Figure: client -> HTTPS proxy -> server.]

Anonymizing VPNs
- An observer on the client side sees that the source is known, but the destination is hidden (destination anonymity)
- An observer on the server side sees that the destination is known, but the source is hidden (source anonymity)
- The VPN gateway itself sees both source and destination: no anonymity against the gateway
[Figure: client -> VPN gateway -> server.]

Using content to deanonymize
- Even behind an anonymizing proxy, the content of your traffic can identify you:
  - Reading Gmail
  - Looking up directions to home
  - Updating your Facebook profile
  - Etc.
- No anonymity!
[Figure: client -> HTTPS proxy -> server.]

Statistical inference attacks
- Statistical analysis of traffic patterns (i.e. the timing and/or volume of packets) can compromise anonymity
[Figure: client -> VPN gateway -> server, with an observer correlating traffic on both sides of the gateway.]

Data to protect
- Personally Identifiable Information (PII): name, address, phone number, etc.
- OS and browser information: cookies, etc.
- Language information
- IP address
- Amount of data sent and received
- Traffic timing

Key systems/concepts
- Mixes and mixnets
- Crowds
- Onion routing

3: Mixnets

MIX-based systems
- Introduced by David Chaum (1981) for anonymous email; has been generalized to TCP traffic
- Uses relay servers (MIXes) for anonymous communication
- Goals:
  - Sender anonymity
  - Unlinkability against global eavesdroppers
- Idea: messages from the sender "look" different (in content and timing) from the messages delivered to the recipient
- Influenced other ideas such as onion routing, traffic mixing, and dummy traffic (a.k.a. cover traffic)

MIX – basic operations
- A mix is a store-and-forward relay
- Batching
  - Collect fixed-length messages from different sources
  - Accumulate a batch of n messages
- Mixing
  - Cryptographically transform the collected messages
  - Forward messages to their recipients in random order (see the code sketch below)

MIX – example
- Each mix has a public key
- Each sender encrypts its message (with randomness) using the public key of the mix
- The mix then:
  1. Collects messages
  2. Discards repeated messages
  3. Decrypts the messages and accumulates them in a batch
  4. Reorders the messages in the batch and delivers them
[Figure: users u1-u4 send encrypted messages to a single mix.]

MIX – variants
- Single mix (also a single point of trust, attack, and failure)
- Mix cascade
- Mix network
- Different ways of performing the batch and mix operations
[Figure: users u1-u4 sending traffic through several batch-and-mix nodes arranged as a cascade or network.]
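To make the batch-and-mix loop concrete, here is a minimal Python sketch. It is not from the original slides: the class name `ToyMix`, the batch size, and the simulated decryption step are illustrative assumptions, and Chaum's actual design additionally relies on public-key decryption, message padding, and return addresses.

```python
import random
import secrets

class ToyMix:
    """Toy store-and-forward mix: collects a fixed-size batch of equal-length
    messages, drops duplicates, 'decrypts' them, and flushes the batch in
    random order.  A real mix removes a public-key encryption layer where
    this sketch only applies a placeholder transform."""

    def __init__(self, batch_size=4):
        self.batch_size = batch_size
        self.pool = []     # accumulated (decrypted) messages
        self.seen = set()  # duplicate suppression (replay defense)

    def submit(self, ciphertext):
        """Accept one fixed-length message; return the flushed batch when full."""
        if ciphertext in self.seen:                   # 2. discard repeated messages
            return []
        self.seen.add(ciphertext)
        self.pool.append(self._decrypt(ciphertext))   # 3. decrypt and accumulate
        if len(self.pool) < self.batch_size:          # 1. keep collecting
            return []
        batch, self.pool = self.pool, []
        random.shuffle(batch)                         # 4. reorder the batch
        return batch                                  #    ...and deliver it

    @staticmethod
    def _decrypt(ciphertext):
        # Placeholder: a real mix strips one layer of public-key encryption and
        # recovers (next_hop, payload).  Here the message already has that form.
        return ciphertext

# Four senders submit same-length messages; the output order is unlinkable to the input order.
mix = ToyMix(batch_size=4)
for sender in ("u1", "u2", "u3", "u4"):
    out = mix.submit((f"recipient-of-{sender}", secrets.token_hex(8)))
print(out)  # flushed batch, in random order
```

Because the mix waits for a full batch and shuffles it before flushing, an observer who sees both sides of the mix cannot match inputs to outputs by order or timing alone, which is exactly what the flushing and counting attacks below try to undo.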
MIX (cont.)
- Traditional designs are message-based
- Advantage: hinders timing attacks
  - Messages may be artificially delayed
  - Temporal correlation is warped
- Disadvantage: high latency and asynchronous delivery due to the batch and mix operations
  - May be acceptable for applications like email
  - Frustrating user experience in low-latency or interactive applications: web browsing, instant messaging, SSH
- Alternative: circuit-based designs

Return Traffic
- In a mix network, how can the destination respond to the sender?
- During path establishment, the sender places keys at each mix along the path
- Data is re-encrypted as it travels the reverse path
[Figure: a three-mix return path, where each mix i holds a key pair <KPi, KSi> placed by the sender.]

Node flushing attack
- Intended to defeat MIX-based systems
- Also called a flooding attack or (n-1) attack
- Flood a node with identifiable fake messages, but leave room for a single message to be traced
- Link the user's input message with the messages leaving the node
[Figure: attacker messages A filling the mix's batch alongside the single message from user u1.]

Trickle attack
- Trickle and flushing attacks are together referred to as blending attacks
- Suppose a MIX accumulates and emits messages in rounds
- An active attacker holds a target message until the mix emits a batch of messages
- The attacker then submits the target message to the mix while blocking other incoming messages
- Only the target message is emitted in the next round
- Requires control over the traffic flow

Packet counting attack
- Count the number of messages entering a node and leaving an anonymous tunnel
- Constant link padding may help:
  - Two nodes exchange a constant number of same-sized packets per time unit
  - Generate dummy traffic on idle or lightly loaded links
  - Costly
[Figure: per-link packet counts entering and leaving a node, allowing flows to be matched by volume.]

Dummy / Cover Traffic
- Simple idea: send useless traffic to help obfuscate real traffic

Summary for Mixes
- Key idea is to gather a batch of messages, then mix them and output them in random order
- Can be used as a network
- Resilient to timing attacks, but possible attacks include packet counting, flushing, etc.
- Disadvantage is that it is slow

4: Crowds

Crowds
- Key idea
  - Users' traffic blends into a crowd of users
  - Eavesdroppers and end hosts don't know which user originated what traffic
- High-level implementation
  - Every user runs a proxy on their system
  - The proxy is called a jondo (from "John Doe," i.e. an unknown person)
  - When a message is received, select x ∈ [0, 1] uniformly at random
  - If x < pf: forward the message to a random jondo (see the code sketch below)
  - Else: deliver the message to the actual receiver

Crowds (cont.)
- Anonymous web browsing
- Dynamically collects users (jondos) into a group (the crowd)
- The membership list is maintained by a central server (the blender)
[Figure: jondos j1-j4 forming a crowd that forwards requests to web servers s1 and s2.]
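Below is a minimal sketch of the jondo forwarding rule described above, using the Crowds paper's convention that pf is the forwarding probability (3/4 is the paper's default). The function names and callable arguments are illustrative assumptions rather than the paper's API, and real jondos additionally encrypt traffic between crowd members and keep per-path state so that replies can retrace the path.

```python
import random

P_F = 0.75  # forwarding probability p_f; the Crowds paper requires p_f >= 1/2 and uses 3/4 by default

def handle_request(request, crowd, deliver_to_server, forward_to_jondo):
    """Forwarding rule run by a jondo that has just received `request`:
    with probability p_f pass it to a randomly chosen jondo (possibly itself),
    otherwise submit it to the destination web server."""
    if random.random() < P_F:
        next_jondo = random.choice(crowd)   # uniform choice, may be ourselves
        forward_to_jondo(next_jondo, request)
    else:
        deliver_to_server(request)

# Illustrative use: estimate how long forwarding paths get under this rule.
def sample_path_length():
    hops = 1                                # the initiator's own jondo
    while random.random() < P_F:
        hops += 1
    return hops

lengths = [sample_path_length() for _ in range(100_000)]
print(sum(lengths) / len(lengths))          # ~ 1 / (1 - p_f) = 4 expected jondo hops
```

The coin flip at every hop is what gives each crowd member plausible deniability: a jondo that receives a request cannot tell whether its predecessor originated it or was merely forwarding it.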