arXiv:1901.06095v1 [cs.CR] 18 Jan 2019

Taming Distrust in the Decentralized Internet with PIXIU

Yubin Xia¹, Qingyuan Liu¹, Cheng Tan², Jing Leng¹, Shangning Xu¹, Binyu Zang¹, and Haibo Chen¹

¹ Institute of Parallel and Distributed Systems, Shanghai Jiao Tong University
² New York University

Abstract

Decentralized Internet is booming. People are fascinated by its promise that users can truly own their data. However, in a decentralized Internet, completing a task usually involves multiple nodes with mutual distrust. Such distrust might eventually become a major obstacle for the growth of the decentralized Internet. In this paper, we analyze the distrust using a simple model and highlight the properties required to faithfully accomplish one task in a decentralized Internet. We also introduce our draft solution—PIXIU, a framework to mitigate the distrust among different nodes. In PIXIU, we design and utilize trust-λ and decentralized executor to achieve the above needed properties.

1 Introduction

Surveys and reports suggest that the current Internet is highly centralized [1, 2]. In a centralized Internet, as users, our (private) data are collected and stored in some centralized data silos, which are out of our control. This misplacement of ownership might lead to catastrophic consequences—data abuse [6], data manipulation [13, 14], data breach [3] and so on.

Users should be in control of and be empowered to monetize their own data. In order to achieve this goal, decentralized Internet has been proposed. Though immature, decentralized Internet is able to offer most of the functionalities that nowadays centralized Internet provides, including websites [15], online marketplace [10], social network [9, 11, 12], online collaboration [7], video sharing [5] and more [4].

Different from the current centralized architecture, decentralized Internet is, by design, a peer-to-peer network which allows individual users to truly own their data. In a decentralized Internet, user data are stored in PODs [11] (Personal Online Datastores) which are controlled by users themselves. And, users can choose to run applications in their own PODs for fun and profit.

However, there are trust issues in such a decentralized Internet. Instead of trusting a few giant tech companies, a user now has to trust many (if not all) nodes in the peer-to-peer network, which she may or may not know. Indeed, a user can choose which nodes to collaborate with. But, the question still remains—how can the user make sure that the nodes, which have been granted the read permission, will not stealthily leak the data, or alter the application, or—even—how to ascertain that they actually do anything instead of returning an arbitrary result?

We believe such distrust is deep-seated in the nature of decentralization and it might eventually become the major obstacle for the growth of the decentralized Internet. In this paper, we introduce a simple model to systematically analyse different types of distrust. And, in order to tackle these problems, we design PIXIU, a framework to mitigate the distrust in the decentralized Internet.

In our distrust model, multiple PODs and one data consumer together want to finish one task. An executor takes the data from PODs and the task from the data consumer as inputs, and produces the result. All the participants mutually distrust each other. There are concerns about data stealing, data counterfeit, task stealing, execution corruption, and so on.

PIXIU is designed to mitigate the above distrust. In its core are trust-λ and the decentralized executor. In PIXIU, we split one task (e.g., a query, a machine learning training) into steps, execute each step separately in a trust-λ, and chain the trust-λs chronologically as one decentralized executor. Each trust-λ consists of several trusted execution environments (TEE), so that no data nor the execution leak outside the decentralized executor.

PIXIU's ethos is pragmatic. It assumes TEE can provide both confidentiality and integrity, which—in reality—is not always true [8, 39]. However, we argue that, in practice, the violations of such assumption can be restricted by (i) extra security techniques to solidify the TEE and, more importantly, (ii) thanks to the fine granularity of each step, the attacks would be economically unprofitable.

2 Distrust model

In this section, we are going to introduce our distrust model in the decentralized Internet. We assume that the users' data have already been stored in their PODs. How to collect data into PODs is out of the scope of this paper.

[Figure 1: The workflow of executing a task in the decentralized Internet. Labels: PODs, data, task, executor, result, data consumer, execution proof.]

Figure 1 depicts an abstract workflow of a task in the decentralized Internet. A data consumer starts a task which includes a piece of code and a specification of what data should be its inputs. Some PODs have the data needed by this task and (suppose) they allow such a task to access these data.

An executor takes the task from the data consumer and the corresponding data from PODs as inputs, executes the task's code, and outputs the result to the data consumer. In addition, the executor generates an execution proof for both sides to verify this procedure. Note that the executor is an abstract entity that might be located in one or multiple nodes in the decentralized Internet.

In the above workflow, we have three participants with mutual distrust: the data consumer, PODs, and the executor. PODs are responsible for providing genuine data, but fear that either the executor or the data consumer steals their data. On the other hand, the data consumer worries that the PODs may provide fake data, and that the task code is stolen or not faithfully executed by the executor. In a decentralized network, the participants can be anyone in the world, hence they naturally distrust each other.

In the following, we summarize the properties required by the decentralized Internet to mitigate the distrust.

• Data privacy. PODs distrust the data consumers that they may steal the data, or the executor might leak the data. Data privacy requires that only a bounded amount of user data can be revealed (i.e., differential privacy) to the data consumer and the executor.

• Data authenticity. Data consumers worry that the PODs may sabotage the task by sending fake data as inputs. Data authenticity requires that the input data can be authenticated; otherwise, they should be explicitly tagged as alleged.

• Task privacy. The task may contain some confidential logic of the data consumer (e.g., the architecture of a neural network, a ranking algorithm) that the executor should not know. Task privacy requires that the executor learns nothing about the task logic.

• Execution integrity. Data consumers are concerned that the result might be bogus because the task's code is not faithfully executed. Execution integrity requires that the executor runs the code as written.

• Accountability. Inevitably, executing a task in a distributed setup will encounter failures, errors, bugs, misconfigurations, and so on. Accountability requires that, for one failed task, all the participants are able to pinpoint and agree on where it went wrong.

• Traceability. When users need some processed data (e.g., some statistics, a trained machine learning model), they want to understand how the data were generated. Traceability requires that one can find and verify the history of any piece of data generated within the decentralized Internet.

3 A draft solution: PIXIU

In order to fulfill the requirements in the previous section, we design PIXIU. PIXIU is a framework built on top of a decentralized Internet, where all the participating nodes are treated equally in a peer-to-peer manner. Each node can play one or multiple roles—PODs, data consumers, or the executor—within each task. And, PIXIU, as the framework, organizes and coordinates these roles to finish the tasks.

In this section, we are going to introduce the basic building block of PIXIU named trust-λ (§3.1) which provides execution integrity and task privacy; then, we describe the decentralized executor (§3.2), the key to achieving data privacy and data authenticity; finally, we will justify our assumption in §3.3.

Due to the space limit, we have to skip other pieces of PIXIU which are also indispensable. Specifically, we make the following assumptions, which should have been achieved by other components of PIXIU.

• We assume a fixed peer-to-peer network with all nodes equipped with TEE.
• We assume all PODs are willing to expose their data (i.e., grant read permission) under the guarantee of data privacy.
• We assume that, for one task, both PODs and the data consumer have agreed, beforehand, how to split this task into trust-λs.
• We assume that PIXIU can always find the relevant data from PODs and qualified nodes as executors.

3.1 Properties of trust-λ

A trust-λ is the basic execution unit in our system. A task can be divided into multiple trust-λs. Each trust-λ has three properties: first, it has only one entry point and one exit point for data; second, the privacy of code and data within one trust-λ is protected; third, the execution logic cannot be tampered with.

[Figure 2: The inner process of one trust-λ. PIXIU-Box ensures only one entry for input and one exit for output. Labels: input data & proof, data validator, sandbox, function, proof generator, output data & proof, execution proof, enclave.]

The properties of trust-λ are enforced by a software and hardware co-design. Figure 2 shows the process of one trust-λ. Each trust-λ is composed of three components: one data validator, one sandbox and one proof generator. The data validator is the one component that accepts outside data. It checks the input data using the proof (described below), and then passes the data to the sandbox. The sandbox ensures that the function running inside cannot send data out, by disabling all I/O and restricting access to memory outside the enclave.

The third component generates a proof of execution to show that the output data are indeed generated from the input data and algorithm. A proof contains sufficient information about the data flow, but no sensitive data that may reveal users' personal information. A typical proof only has hashes of the input data and the function. The proof generator will sign the proof with its own private key. The private key is initialized when the node joins the PIXIU network.

All three components run within hardware enclaves, so that they can attest each other, offer attestation to outsiders and protect both integrity and privacy of the code and data. They run on the same node so that their secure channels are based on shared memory buffers.

3.2 Decentralized executor

A decentralized executor is an instantiation of the executor in a decentralized Internet, which consists of a set of trust-λs. Figure 3 depicts the architecture of a decentralized executor.

[Figure 3: The architecture of PIXIU's decentralized executor. Labels: PODs, data, data prover, task execution, differential privacy, trust-λ, enclave, task, data consumer, result, public storage.]

In this example, the whole task has been split into multiple trust-λs with different purposes: data prover, task execution, and differential privacy. First, the input data flow from PODs into the data prover, in which the program authenticates the input data. If valid, the data flow into the task execution to finish the essential task the data consumer provides. The result, instead of directly returning to the data consumer, has to go through another differential privacy trust-λ, which guarantees that the sensitive personal data would not be leaked. Finally, the result is delivered to the data consumer.

All these procedures are executed within trust-λs, which, because of trust-λ's properties, are faithfully executed with confidentiality. And, each of the trust-λs commits an execution proof to a public storage, which PODs and the data consumer can examine.

One decentralized executor usually has a chain of trust-λs. The integrity of the decentralized executor is ensured by encryption and key management. Specifically, the data consumer acts as a dispatcher. It first recruits several instances that run the trust-λ it needs, then sends the list to the data owner. The data owner first attests each instance. After that, it will generate keys and send them to each instance in a way that the output of one trust-λ can only be decrypted by the next trust-λ.

In the following, we're going to describe several common components in a decentralized executor.

Data prover is responsible for providing data authenticity in the decentralized executor. Depending on the type of data, there are multiple ways to authenticate a piece of data. If the data have been signed by the hardware (e.g., camera [27]), the validation can be as simple as verifying a signature. A similar case applies to the data signed by their source organization (e.g., health data [28]).

Another type of data, which have machine-checkable sources (e.g., shopping record, search history), can be authenticated by logging into the corresponding website and verifying that the provided data entry does exist. TEE technique [35] is able to guarantee that the user's login is secure and that the validation is faithful.

Admittedly, there are cases in which it is hard or even impossible to authenticate the data. PIXIU requires that these data should be tagged as "alleged" and the data consumers should take their own risks to trust these data.

Differential privacy guarantees the data privacy in PIXIU. Differential privacy [25] is a statistical technique that aims to protect the privacy of individual users while still allowing statistical queries on these data. Previous systems demonstrate that differential privacy can be applied transparently to a lot of tasks (e.g., machine learning [16], statistic analysis [26], SQL query [33]). Hence, PIXIU can leverage these systems to build our differential privacy trust-λ. Investigating what tasks can (or cannot) use differential privacy and how to achieve them in practice is our future work.

Execution proofs empower PIXIU with accountability and traceability. Any participant in one task is able to verify the whole chain of trust-λs by examining (checking the signature and comparing the hashes) the series of execution proofs on the public storage. Since execution proofs are signed by trust-λs, they cannot be forged.

Specifically, PODs can ensure that their data flow through a differential privacy trust-λ, so that data privacy is enforced. And data consumers can ascertain that the input data have been authenticated by a data prover trust-λ. As for the data generated within PIXIU, anyone can track the history of the processing and find the original data sources.

3.3 Justify PIXIU's assumption

PIXIU assumes that TEE can provide both execution integrity and confidentiality. However, in reality, attacks like side channel [39] and L1TF [8] may successfully steal data from enclaves. In the following, we try to argue that the violation of such assumption can be restricted by the two schemes below.

One way to reduce the attack surface, besides applying Intel's latest patch, is to restrict the software stack underneath enclaves with trusted hardware like TPM (Trusted Platform Module) and Intel TXT (Trusted eXecution Technology). The execution nodes may have different security levels. For example, a node depending entirely on Intel SGX (Software Guard eXtension) is considered as mid-level security, which could be vulnerable to side channel attacks. A more secure node can run a formally verified kernel and use SGX as well, and nothing else, to minimize the attack surface.

On the other hand, our methodology of dividing a task into multiple trust-λs can increase the cost of an attack, as well as decrease its gain, making it economically unprofitable. Since trust-λs are isolated from each other, the data leaked in a single trust-λ can be very limited and an attacker may need to compromise many nodes before achieving one successful attack. Further, the functions that are considered more important can be deployed to the nodes with a higher security level.

4 Real-world use cases

In this section, we are going to present several real-world use cases of PIXIU.

4.1 Case 1: advertisements

In this application scenario, advertisers are data consumers. They want to run their advertisement recommendation algorithms on users' PODs and deliver the advertisements about their products to the potential buyers. Clearly, users do not want their data to be leaked to the advertisers, and (perhaps) the advertisers also want their recommendation algorithms to be secret.

With PIXIU, an advertiser can send its advertisement to the target users without learning who they are. The advertiser can filter out the non-target users by setting conditions in the task. For example, as a video game seller, it can specify that its advertisements will only be sent to those who have a purchase record of a Nintendo Switch. The advertiser can check the advertisement delivery by examining the execution proofs committed by each trust-λ. PODs can also make sure that their personal data (e.g., whether they have a Nintendo Switch) have not been leaked to the advertiser, in the same way.

4.2 Case 2: financial data query

Some organizations (e.g., banks, financial companies) and individuals have many financial records. Other companies, researchers, or individuals may try to understand some statistics about the overall financial status, or the financial condition of a particular group of people. However, nowadays, the data owners are unwilling to share their data, and the queriers are afraid that their queries will be leaked.

In this scenario, data providers need to ensure no data leakage, and the queriers need to keep their queries private. With PIXIU, a PIR (Private Information Retrieval) trust-λ can be inserted into the task to protect the query processing from the data providers. The query will be encrypted and executed by the PODs of the data provider without revealing which data have been touched. Since the data providers do not trust any query, they will also leverage differential privacy to ensure no personal information will be leaked through the result.

4.3 Case 3: machine learning training

Machine learning brings convenience to people's lives: machine translation, autocorrect, auto captioning, etc. Most of these machine learning algorithms require training on substantial data. And some personalization functionalities require peeking at the user's own data. People have been forced to choose between convenience and privacy. PIXIU can make this dilemma obsolete.

With PIXIU, people can opt in to volunteer their data for machine learning tasks, without the worry of losing privacy. These machine learning tasks can in turn be used for their own benefits, such as personal assistant, search personalization, etc.

Imagine a researcher wants to train a language model of modern day Americans, and she is interested in making use of the abundance of training data lying in people's PODs. In this case she would be a data consumer, who specifies the task and gets the word out. Some users that store their text messages in their PODs may now sign up. The data consumer starts the job in an enclave (consumer enclave), which updates the global model. She then sends the algorithm to users' PODs, where it will run as a trust-λ.

The learning process involves several rounds of communication between the consumer enclave and user enclaves. In each round, the consumer enclave distributes the global model. Each user enclave updates the global model with local data using stochastic gradient descent, encrypts the parameters, and sends them back to the consumer enclave. The consumer enclave then collects the encrypted parameters, which it cannot decrypt. But it can aggregate the parameters and then decrypt the result. After a few rounds of updates, the global model stabilizes, and training is done.

4.4 Case 4: secure polls and surveys

Polls and surveys can be done on PIXIU. Survey conductors need integrity of the result, and survey takers need discretion.

Say a company is trying to conduct a customer survey, to learn about how a product is received by the market. In this case, it would be the data consumer. Participating users can authenticate their legitimacy through an identity verification protocol (e.g., cryptographic proof of membership). The demographic data can be filled out automatically using the data in PODs. Then the users can communicate their encrypted answers securely to the executor. The executor then generates survey results, without revealing the raw data. Eventually, a secure polling system may lead to a secure voting system.

5 Related work

Decentralized system. There are multiple decentralized systems [21, 22, 34] designed to give back the control of data to users. Solid [34], a decentralized platform for social web applications, defines a series of rich protocols and suggests the creation of a POD, which is owned by individual users and stores their data. Amber [21] and Oort [22] provide global queries to efficiently collect and monitor relevant data created by all the users, which enables cross-user data applications. Instead of providing more functionalities in a decentralized setup, PIXIU focuses on taming the distrust within these functionalities.

TEE. Many previous works leverage TEE to protect execution in untrusted environments [18, 19, 23, 32, 37]. Ryoan [32] proposes a distributed sandbox mechanism that enables a user to process her private data (e.g., gene) by multiple untrusted algorithms on different platforms. Our design is inspired by Ryoan but targets a different problem: the data owner is not the consumer, which leads to a totally different threat model. For example, Ryoan does not need to consider the fake data problem or privacy leakage between data owner and consumer, while PIXIU considers both.

Cryptographic tools. There is another thread of systems [17, 29, 30, 38] that leverage modern cryptographic tools, such as FHE (Fully-Homomorphic Encryption), MPC (secure Multi-Party Computation), PIR (Private Information Retrieval), to completely get rid of the trust among participants. PIXIU is complementary to them. Meanwhile, these methods can be implemented within trust-λ as a replacement of TEE. However, as far as we know, adopting these techniques usually imposes significant overheads, hence we choose to use TEE as our current option.

Privacy-preserving machine learning. Federated learning [36] is a learning scheme that allows a model to be trained on remote and decentralized data. This protects privacy to some extent because data stay where they are. Cryptographic tools have been used to prevent adversarial inference of individual data [20]. It is worth mentioning that multiple attacks on federated learning have been proposed [31].

Another line of efforts has been on achieving differential privacy with random noise mechanisms [16, 24]. They target particular machine learning algorithms and prove differential privacy parameters. PIXIU can seamlessly accommodate these designs.

6 Conclusion

Decentralization is not just about data (e.g., PODs) and execution (e.g., distributed computing), it is also about trust. The decoupling of three roles: data owner, data consumer and executor, brings new challenges to the trust model as well as the system design. PIXIU proposes a decentralized execution framework, where each step of execution goes with proofs of data and code. Multiple steps can be connected to run various tasks with code and data from different roles. Currently, PIXIU leverages hardware TEE to achieve better practicability. It can also use cryptographic tools including MPC and FHE according to different scenarios.

References

[1] Facebook and twitter really are where people get their news. https://www.wired.com/2015/07/facebook-twitter-really-people-get-news/.

[2] Google now handles at least 2 trillion searches per year. https://searchengineland.com/google-now-handles-2-999-trillion-searches-per-year-250247.
[3] These hacks and data breaches broke records in 2016. https://bdtechtalks.com/2017/01/29/biggest-hacks-data-breach-2016/.
[4] The diaspora* project. https://diasporafoundation.org/.
[5] Dtube. https://d.tube.
[6] Facebook suspends cambridge analytica for misuse of user data, which cambridge denies. https://www.cnbc.com/2018/03/16/facebook-bans-cambridge-analytica.html.
[7] Graphite docs. https://www.graphitedocs.com/.
[8] Resources and response to side channel l1 terminal fault. https://www.intel.com/content/www/us/en/architecture-and-technology/l1tf.html.
[9] Giving social networking back to you - the mastodon project. https://joinmastodon.org/.
[10] Openbazaar. https://openbazaar.org/.
[11] Solid. https://solid.mit.edu/.
[12] Steemit. https://steemit.com/.
[13] Is weibo's raffle system truly fair? https://pandaily.com/is-weibos-raffle-system-truly-fair/.
[14] Yelp claims google broke promise to antitrust regulators. https://www.wired.com/story/yelp-claims-google-broke-promise-to-antitrust-regulators/.
[15] Zeronet. https://zeronet.io/.
[16] M. Abadi, A. Chu, I. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang. Deep learning with differential privacy. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, pages 308–318. ACM, 2016.
[17] S. Angel and S. T. Setty. Unobservable communication over fully untrusted infrastructure. In OSDI, pages 551–569, 2016.
[18] S. Arnautov, B. Trach, F. Gregor, T. Knauth, A. Martin, C. Priebe, J. Lind, D. Muthukumaran, D. O'Keeffe, M. L. Stillwell, et al. Scone: Secure containers with intel sgx. In 12th USENIX Symp. Operating Systems Design and Implementation, 2016.
[19] A. Baumann, M. Peinado, and G. Hunt. Shielding applications from an untrusted cloud with haven. ACM Transactions on Computer Systems (TOCS), 33(3):8, 2015.
[20] K. Bonawitz, V. Ivanov, B. Kreuter, A. Marcedone, H. B. McMahan, S. Patel, D. Ramage, A. Segal, and K. Seth. Practical secure aggregation for privacy-preserving machine learning. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, pages 1175–1191. ACM, 2017.
[21] T. Chajed, J. Gjengset, J. Van Den Hooff, M. F. Kaashoek, J. Mickens, R. Morris, and N. Zeldovich. Amber: Decoupling user data from web applications. In HotOS, volume 15, pages 1–6, 2015.
[22] T. Chajed, J. Gjengset, M. F. Kaashoek, J. Mickens, R. Morris, and N. Zeldovich. Oort: User-centric cloud storage with global queries. 2016.
[23] T. T. A. Dinh, P. Saxena, E.-C. Chang, B. C. Ooi, and C. Zhang. M2r: Enabling stronger privacy in mapreduce computation. In USENIX Security, volume 15, pages 447–462, 2015.
[24] C. Dwork and J. Lei. Differential privacy and robust statistics. In Proceedings of the forty-first annual ACM symposium on Theory of computing, pages 371–380. ACM, 2009.
[25] C. Dwork, A. Roth, et al. The algorithmic foundations of differential privacy. Foundations and Trends® in Theoretical Computer Science, 9(3–4):211–407, 2014.
[26] Ú. Erlingsson, V. Pihur, and A. Korolova. Rappor: Randomized aggregatable privacy-preserving ordinal response. In Proceedings of the 2014 ACM SIGSAC conference on computer and communications security, pages 1054–1067. ACM, 2014.
[27] G. L. Friedman. The trustworthy digital camera: Restoring credibility to the photographic image. IEEE Transactions on consumer electronics, 39(4):905–910, 1993.
[28] A. Giakoumaki, S. Pavlopoulos, and D. Koutsouris. Multiple image watermarking applied to health information management. IEEE Transactions on Information Technology in Biomedicine, 10(4):722–732, 2006.
[29] R. Gilad-Bachrach, N. Dowlin, K. Laine, K. Lauter, M. Naehrig, and J. Wernsing. Cryptonets: Applying neural networks to encrypted data with high throughput and accuracy. In International Conference on Machine Learning, pages 201–210, 2016.
[30] T. Gupta, H. Fingler, L. Alvisi, and M. Walfish. Pretzel: Email encryption and provider-supplied functions are compatible. In Proceedings of the Conference of the ACM Special Interest Group on Data Communication, pages 169–182. ACM, 2017.
[31] B. Hitaj, G. Ateniese, and F. Perez-Cruz. Deep models under the gan: information leakage from collaborative deep learning. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, pages 603–618. ACM, 2017.
[32] T. Hunt, Z. Zhu, Y. Xu, S. Peter, and E. Witchel. Ryoan: a distributed sandbox for untrusted computation on secret data. In 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), pages 533–549. USENIX Association, 2016.
[33] N. Johnson, J. P. Near, and D. Song. Towards practical differential privacy for sql queries. Proceedings of the VLDB Endowment, 11(5):526–539, 2018.
[34] E. Mansour, A. V. Sambra, S. Hawke, M. Zereba, S. Capadisli, A. Ghanem, A. Aboulnaga, and T. Berners-Lee. A demonstration of the solid platform for social web applications. In Proceedings of the 25th International Conference Companion on World Wide Web, pages 223–226. International World Wide Web Conferences Steering Committee, 2016.
[35] S. Matetic, M. Schneider, A. Miller, A. Juels, and S. Capkun. Delegatee: Brokered delegation using trusted execution environments. In 27th USENIX Security Symposium (USENIX Security 18), 2018.
[36] H. B. McMahan, E. Moore, D. Ramage, S. Hampson, et al. Communication-efficient learning of deep networks from decentralized data. arXiv preprint arXiv:1602.05629, 2016.
[37] F. Schuster, M. Costa, C. Fournet, C. Gkantsidis, M. Peinado, G. Mainar-Ruiz, and M. Russinovich. Vc3: Trustworthy data analytics in the cloud using sgx. In Security and Privacy (SP), 2015 IEEE Symposium on, pages 38–54. IEEE, 2015.
[38] S. Setty, S. Angel, T. Gupta, and J. Lee. Proving the correct execution of concurrent services in zero-knowledge. In 13th USENIX Symposium on Operating Systems Design and Implementation (OSDI 18), pages 339–356, 2018.
[39] Y. Xu, W. Cui, and M. Peinado. Controlled-channel attacks: Deterministic side channels for untrusted operating systems. In Security and Privacy (SP), 2015 IEEE Symposium on, pages 640–656. IEEE, 2015.
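The execution proofs of §3.1 and the chain check described under "Execution proofs" in §3.2, worked through in Go as an added example; it is not code from the paper or from PIXIU. The Proof and Step layouts, the choice of Ed25519 and SHA-256, the step names and the sample input are assumptions made for the illustration.

```go
package main

import (
	"bytes"
	"crypto/ed25519"
	"crypto/sha256"
	"fmt"
)

// Step is one position in the plan agreed beforehand by PODs and consumer.
type Step struct {
	FuncHash [32]byte          // which function must run at this position
	Key      ed25519.PublicKey // learned when attesting the instance
}

// Proof is what a trust-λ publishes: hashes and a signature, never data.
type Proof struct {
	InputHash, FuncHash, OutputHash [32]byte
	Sig                             []byte
}

// digest is the byte string the proof generator signs.
func (p Proof) digest() []byte {
	return bytes.Join([][]byte{p.InputHash[:], p.FuncHash[:], p.OutputHash[:]}, nil)
}

// TrustLambda is one running step: identity, function and signing key.
type TrustLambda struct {
	Step
	Fn   func([]byte) []byte
	priv ed25519.PrivateKey
}

func NewTrustLambda(code string, fn func([]byte) []byte) *TrustLambda {
	pub, priv, err := ed25519.GenerateKey(nil) // nil selects crypto/rand
	if err != nil {
		panic(err)
	}
	return &TrustLambda{Step{sha256.Sum256([]byte(code)), pub}, fn, priv}
}

// Run executes the step and returns the output with its signed proof.
func (t *TrustLambda) Run(in []byte) ([]byte, Proof) {
	out := t.Fn(in)
	p := Proof{sha256.Sum256(in), t.FuncHash, sha256.Sum256(out), nil}
	p.Sig = ed25519.Sign(t.priv, p.digest())
	return out, p
}

// RunChain pushes data through the steps in order and collects the proofs.
func RunChain(steps []*TrustLambda, data []byte) (plan []Step, proofs []Proof, out []byte) {
	for _, s := range steps {
		var p Proof
		data, p = s.Run(data)
		plan = append(plan, s.Step)
		proofs = append(proofs, p)
	}
	return plan, proofs, data
}

// VerifyChain returns (-1, nil) when every proof matches the plan and each
// step consumed what the previous one produced; else the first failing step.
func VerifyChain(plan []Step, proofs []Proof, input, result [32]byte) (int, error) {
	if len(proofs) != len(plan) {
		return -1, fmt.Errorf("chain has %d proofs, plan has %d steps", len(proofs), len(plan))
	}
	prev := input
	for i, p := range proofs {
		switch {
		case p.FuncHash != plan[i].FuncHash:
			return i, fmt.Errorf("step %d: function differs from plan", i)
		case !ed25519.Verify(plan[i].Key, p.digest(), p.Sig):
			return i, fmt.Errorf("step %d: signature does not verify", i)
		case p.InputHash != prev:
			return i, fmt.Errorf("step %d: input is not the previous output", i)
		}
		prev = p.OutputHash
	}
	if prev != result {
		return len(proofs) - 1, fmt.Errorf("result does not match the last proof")
	}
	return -1, nil
}

func main() {
	id := func(b []byte) []byte { return b } // stand-in step body
	steps := []*TrustLambda{NewTrustLambda("verify-source-v1", id),
		NewTrustLambda("uppercase-v1", bytes.ToUpper),
		NewTrustLambda("add-noise-v1", id)}
	in := []byte("alice,switch\nbob,none\n") // constructed sample input
	plan, proofs, out := RunChain(steps, in)
	fmt.Println(VerifyChain(plan, proofs, sha256.Sum256(in), sha256.Sum256(out)))
}
```

Because the plan fixes the function hash at every position, a chain that skips the differential privacy step or swaps in another function fails verification, and the returned index gives participants a common answer to where it went wrong.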

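The training round of §4.3, where the consumer enclave can aggregate user updates it cannot read, shown as an added Go example that is not from the paper or from PIXIU. The paper does not name the encryption it uses; this example substitutes pairwise additive masking, the idea behind the secure aggregation work cited as [20], and the seeds, the fixed-point updates and all function names are constructed for the illustration.

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
)

// mask expands the secret shared by one pair of users into the k-th
// 64-bit mask word (SHA-256 in counter mode, chosen for this example).
func mask(seed []byte, k int) uint64 {
	buf := make([]byte, len(seed)+8)
	copy(buf, seed)
	binary.BigEndian.PutUint64(buf[len(seed):], uint64(k))
	h := sha256.Sum256(buf)
	return binary.BigEndian.Uint64(h[:8])
}

// PairSeed returns the secret shared by users a and b. Constructed for the
// demo; a deployment would derive it from a key agreement between enclaves.
func PairSeed(a, b int) []byte {
	if a > b {
		a, b = b, a
	}
	return []byte(fmt.Sprintf("constructed-seed-%d-%d", a, b))
}

// MaskUpdate runs in user enclave i of n. It adds the pair mask for every
// peer with a larger index and subtracts it for every peer with a smaller
// one. Arithmetic is modulo 2^64, so uint64 wrap-around is intended.
func MaskUpdate(i, n int, update []uint64) []uint64 {
	out := append([]uint64(nil), update...)
	for j := 0; j < n; j++ {
		if j == i {
			continue
		}
		seed := PairSeed(i, j)
		for k := range out {
			if i < j {
				out[k] += mask(seed, k)
			} else {
				out[k] -= mask(seed, k)
			}
		}
	}
	return out
}

// Aggregate runs in the consumer enclave. It only adds what it received;
// the pair masks cancel in the sum, which equals the sum of the updates.
func Aggregate(masked [][]uint64) []uint64 {
	sum := make([]uint64, len(masked[0]))
	for _, m := range masked {
		for k, v := range m {
			sum[k] += v
		}
	}
	return sum
}

// Round masks every user's update and aggregates the masked vectors.
func Round(updates [][]uint64) (masked [][]uint64, sum []uint64) {
	for i, u := range updates {
		masked = append(masked, MaskUpdate(i, len(updates), u))
	}
	return masked, Aggregate(masked)
}

func main() {
	// Constructed fixed-point parameter updates from three user enclaves.
	updates := [][]uint64{{3, 10}, {5, 20}, {7, 30}}
	masked, sum := Round(updates)
	fmt.Println("user 0 sends:", masked[0])
	fmt.Println("aggregate:   ", sum) // [15 60]
}
```

If one user's masked update never arrives, the masks no longer cancel and the sum is unusable; the protocol in [20] adds recovery for that case, which this example leaves out.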