
Android Permissions Remystified: A Field Study on Contextual Integrity

Primal Wijesekera¹, Arjun Baokar², Ashkan Hosseini², Serge Egelman², David Wagner², and Konstantin Beznosov¹

¹University of British Columbia, Vancouver, Canada
²University of California, Berkeley, Berkeley, USA

Abstract

We instrumented the Android platform to collect data regarding how often and under what circumstances smartphone applications access protected resources regulated by permissions. We performed a 36-person field study to explore the notion of "contextual integrity," i.e., how often applications access protected resources when users are not expecting it. Based on our collection of 27M data points and exit interviews with participants, we examine the situations in which users would like the ability to deny applications access to protected resources. At least 80% of our participants would have preferred to prevent at least one permission request, and overall, they stated a desire to block over a third of all requests. Our findings pave the way for future systems to automatically determine the situations in which users would want to be confronted with security decisions.

1 Introduction

Mobile platform permission models regulate how applications access certain resources, such as users' personal information or sensor data (e.g., camera, GPS, etc.). For instance, previous versions of Android prompt the user during application installation with a list of all the permissions that the application may use in the future; if the user is uncomfortable granting any of these requests, her only option is to discontinue installation [3]. On iOS and Android M, the user is prompted at runtime the first time an application requests any of a handful of data types, such as location, address book contacts, or photos [34].

Research has shown that few people read the Android install-time permission requests and even fewer comprehend them [16]. Another problem is habituation: on average, Android applications present the user with four permission requests during the installation process [13]. While iOS users are likely to see fewer permission requests than Android users, because there are fewer possible permissions and they are only displayed the first time the data is actually requested, it is not clear whether or not users are being prompted about access to data that they actually find concerning, or whether they would approve of subsequent requests [15].

Nissenbaum posited that the reason why most privacy models fail to predict violations is that they fail to consider contextual integrity [32]. That is, privacy violations occur when personal information is used in ways that defy users' expectations. We believe that this notion of "privacy as contextual integrity" can be applied to smartphone permission systems to yield more effective permissions by only prompting users when an application's access to sensitive data is likely to defy expectations. As a first step down this path, we examined how applications are currently accessing this data and then examined whether or not it complied with users' expectations.

We modified Android to log whenever an application accessed a permission-protected resource and then gave these modified smartphones to 36 participants who used them as their primary phones for one week. The purpose of this was to perform dynamic analysis to determine how often various applications are actually accessing protected resources under realistic circumstances. Afterwards, subjects returned the phones to our laboratory and completed exit surveys. We showed them various instances over the past week where applications had accessed certain types of data and asked whether those instances were expected, and whether they would have wanted to deny access. Participants wanted to block a third of the requests. Their decisions were governed primarily by two factors: whether they had privacy concerns surrounding the specific data type and whether they understood why the application needed it.
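To make the instrumentation concrete, here is a minimal sketch of the kind of logging hook the study describes. It is not the authors' actual modification to Android: the class and field names (PermissionAccessLogger, AccessEvent, and so on) are invented for illustration, and a real implementation would live inside the platform's internal permission-check path rather than in standalone code.

```java
// Hypothetical sketch of the kind of hook the study describes: whenever the
// platform checks a permission on behalf of an app, record which app asked
// for which resource, when, and in what context. All names are invented.
import java.time.Instant;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PermissionAccessLogger {

    /** One record per access to a permission-protected resource. */
    public static final class AccessEvent {
        final String packageName;   // app requesting the resource
        final String permission;    // e.g., ACCESS_FINE_LOCATION
        final boolean appVisible;   // was the app on screen at the time?
        final Instant timestamp;

        AccessEvent(String packageName, String permission, boolean appVisible) {
            this.packageName = packageName;
            this.permission = permission;
            this.appVisible = appVisible;
            this.timestamp = Instant.now();
        }
    }

    // Buffer events off the hot path; a background thread would drain
    // this queue to local storage for later analysis.
    private final BlockingQueue<AccessEvent> buffer = new LinkedBlockingQueue<>();

    /** Called from the (hypothetical) platform permission-check path. */
    public void onPermissionChecked(String packageName, String permission,
                                    boolean appVisible) {
        buffer.offer(new AccessEvent(packageName, permission, appVisible));
    }
}
```

On the study phones, records like these would presumably be buffered and written to local storage for later analysis; the queue here simply keeps logging overhead out of the permission-check fast path.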
We contribute the following:

• To our knowledge, we performed the first field study to quantify permission usage by third-party applications under realistic circumstances.

• We show that our participants wanted to block access to protected resources a third of the time. This suggests that some requests should be granted by runtime consent dialogs, rather than Android's previous all-or-nothing install-time approval approach.

• We show that the visibility of the requesting application and the frequency at which requests occur are two important factors that need to be taken into account in designing a runtime consent platform.

2 Related Work

While users are required to approve Android application permission requests during installation, most do not pay attention and fewer comprehend these requests [16, 26]. In fact, even developers are not fully knowledgeable about permissions [40], and are given a lot of freedom when posting an application to the Google Play Store [7]. Applications often do not follow the principle of least privilege, intentionally or unintentionally [44]. Other work has suggested improving the Android permission model with better definitions and hierarchical breakdowns [8]. Some researchers have experimented with adding fine-grained access control to the Android model [11]. Providing users with more privacy information and personal examples has been shown to help users in choosing applications with fewer permissions [21, 27].

Previous work has examined the overuse of permissions by applications [13, 20], and attempted to identify malicious applications through their permission requests [36] or through natural language processing of application descriptions [35]. Researchers have also developed static analysis tools to analyze Android permission specifications [6, 9, 13]. Our work complements this static analysis by applying dynamic analysis to permission usage. Other researchers have applied dynamic analysis to native (non-Java) APIs among third-party mobile markets [39]; we apply it to the Java APIs available to developers in the Google Play Store.

Researchers have examined user privacy expectations surrounding application permissions, and found that users were often surprised by the abilities of background applications to collect data [25, 42]. Their level of concern varied from annoyance to seeking retribution when presented with possible risks associated with permissions [15]. Some studies employed crowdsourcing to create a privacy model based on user expectations [30].

Researchers have designed systems to track or reduce privacy violations by recommending applications based on users' security concerns [2, 12, 19, 24, 28, 46–48]. Other tools dynamically block runtime permission requests [37]. Enck et al. found that a considerable number of applications transmitted sensitive data to third parties without requiring user consent [12]. Hornyack et al.'s AppFence system gave users the ability to deny data to applications or substitute fake data [24]. However, this broke application functionality for one-third of the applications tested.

Reducing the number of security decisions a user must make is likely to decrease habituation, and therefore it is critical to identify which security decisions users should be asked to make. Based on this theory, Felt et al. created a decision tree to aid platform designers in determining the most appropriate permission-granting mechanism for a given resource (e.g., access to benign resources should be granted automatically, whereas access to dangerous resources should require approval) [14]. They concluded that the majority of Android permissions can be automatically granted, but 16% (corresponding to the 12 permissions in Table 1) should be granted via runtime dialogs.
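To illustrate the shape of such decision logic (not Felt et al.'s actual tree), the sketch below routes a resource to a granting mechanism based on a few of the properties that style of analysis considers. The predicates and rules here are simplified stand-ins, not the published decision tree.

```java
// Illustrative sketch of Felt et al.-style permission-granting logic [14]:
// route each resource to the least intrusive mechanism that still protects
// the user. The categories and rules are simplified assumptions.
public final class GrantingMechanismChooser {

    public enum Mechanism { AUTOMATIC_GRANT, RUNTIME_DIALOG }

    public static Mechanism choose(boolean benign,
                                   boolean userInitiated,
                                   boolean reversible) {
        if (benign) {
            // Harmless resources (e.g., vibration) need no user involvement.
            return Mechanism.AUTOMATIC_GRANT;
        }
        if (userInitiated || reversible) {
            // If the user's own action implies consent, or the effect can be
            // undone, a heavyweight prompt would mostly cause habituation.
            return Mechanism.AUTOMATIC_GRANT;
        }
        // Dangerous, irreversible, app-initiated access warrants a prompt.
        return Mechanism.RUNTIME_DIALOG;
    }
}
```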
Nissenbaum's theory of contextual integrity can help us to analyze "the appropriateness of a flow" in the context of permissions granted to Android applications [32]. There is ambiguity in defining when an application actually needs access to user data to run properly. It is quite easy to see why a location-sharing application would need access to GPS data, whereas that same request coming from a game like Angry Birds is less obvious. "Contextual integrity is preserved if information flows according to contextual norms" [32]; however, the lack of thorough documentation on the Android permission model makes it easier for programmers to neglect these norms, whether intentionally or accidentally [38]. Deciding whether an application is violating users' privacy can be quite complicated, since "the scope of privacy is wide-ranging" [32]. To that end, we performed dynamic analysis to measure how often (and under what circumstances) applications were accessing protected resources, whether this complied with users' expectations, as well as how often users might be prompted if we adopt Felt et al.'s proposal to require runtime user confirmation before accessing a subset of these resources [14]. Finally, we show how it is possible to develop a classifier to automatically determine whether or not to prompt the user based on varying contextual factors.

3 Methodology

Our long-term research goal is to minimize habituation by only confronting users with necessary security decisions and avoiding showing them permission requests that are either expected, reversible, or unconcerning. Selecting which permissions to ask about requires understanding how often users would be confronted with each type of request.
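As a concrete (and entirely hypothetical) illustration of such a classifier, the sketch below scores a request using contextual features like the two factors highlighted in this paper: the requesting application's visibility and the frequency of recent requests. The feature set, weights, and threshold are invented for illustration and are not the model described later in the paper.

```java
// Hypothetical sketch of a contextual classifier that decides whether a
// permission request should trigger a user prompt. The features mirror the
// contextual factors discussed in the paper (app visibility, request
// frequency), but the weights and threshold below are purely illustrative.
public final class PromptClassifier {

    /** Contextual features for a single permission request. */
    public static final class RequestContext {
        final boolean appVisible;        // requesting app is on screen
        final double requestsPerHour;    // recent request rate for this app/permission pair
        final boolean sensitiveDataType; // e.g., location, contacts

        public RequestContext(boolean appVisible, double requestsPerHour,
                              boolean sensitiveDataType) {
            this.appVisible = appVisible;
            this.requestsPerHour = requestsPerHour;
            this.sensitiveDataType = sensitiveDataType;
        }
    }

    // Illustrative linear model: invisible access is more likely to defy
    // expectations; very frequent requests would habituate users if each
    // one produced a prompt.
    public boolean shouldPrompt(RequestContext ctx) {
        double score = 0.0;
        if (!ctx.appVisible) score += 0.5;        // background access is surprising
        if (ctx.sensitiveDataType) score += 0.4;  // concerning data types
        score -= Math.min(ctx.requestsPerHour / 100.0, 0.4); // damp high-rate requests
        return score >= 0.5;                      // invented threshold
    }
}
```

In a deployed system, such weights would be learned from labeled participant decisions rather than set by hand.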