University of Surrey Computing Sciences Report CS-13-04

Proceedings of Vote-ID 2013
Department of Computing, University of Surrey
James Heather, Steve Schneider, Vanessa Teague (Eds)
July 17, 2013
Computing Sciences Report CS-13-04

We are grateful to the authors for their permission to include author copies of their accepted papers in this institutional archive.

Preface

This is the fourth edition of the International Conference on E-Voting and Identity (VoteID). Previous VoteID conferences were held in Tallinn, Estonia (2011), Luxembourg (2009) and Bochum, Germany (2007). This year’s VoteID takes place in Guildford, UK, on 17–19 July 2013, hosted by the University of Surrey, and is preceded by a special session on “Voting Systems Demonstrations” exhibiting recent practical developments in voting systems.

Countries around the world are increasing their deployment of electronic voting, though in many places the trustworthiness of their systems remains controversial. Vote-ID has always maintained a strong focus on designing trustworthy voting systems, but the breadth of interpretations of trustworthiness seems to widen every year. This year’s papers include a range of works on end-to-end verifiable election systems, verifiably correct complex tallying algorithms, human perceptions of verifiability, formal models of verifiability and, of course, attacks on systems formerly advertised as verifiable.

To be trustworthy is one thing, but to be trusted is quite another. The increasing practical application of electronic voting technologies implies a need for us to understand the wider implications of gaining and deserving public trust. This year’s Vote-ID boasts some prestigious invited speakers on this theme: David Birch, a founding Director of Consult Hyperion, will be speaking on “Suppose Electronic Voting Works? What Then?”; Robert Krimmer, Senior Adviser on New Voting Technologies in the Election Department of the OSCE’s Office for Democratic Institutions and Human Rights (OSCE/ODIHR), gives a talk entitled “The Election Observation of New Voting Technologies”; Philip Stark, Professor and Chair of Statistics, UC Berkeley, speaks on “E2E to Hand-to-Eye: Verifiability, Trust, Audits”; and Baroness Onora O’Neill of Bengarve CBE FBA Hon FRS FMedSci, Chair of the Equality and Human Rights Commission, delivers our keynote address on the subject of “Trustworthiness before Trust”.

The Programme Committee selected 12 papers for presentation at the conference out of a total of 26 submissions. Each submission was reviewed by at least four Programme Committee members. The EasyChair conference management system supported the reviewing process and the preparation of these proceedings.

We would like to thank everyone who helped in bringing this conference together: the authors for their submissions; the Programme Committee and the external reviewers for their conscientious and timely efforts in reviewing and discussing the submissions; Maggie Burton, who provided such excellent support for the local arrangements; and Consult Hyperion and IBM for their generous sponsorship, which allowed us to extend invitations to the guest speakers as well as to fund a number of student stipends. Finally, we thank our home institutions, The University of Melbourne and the University of Surrey, for their support.

July 2013
James Heather
Steve Schneider
Vanessa Teague

Table of Contents

Scaling Privacy Guarantees in Code-Verification Elections .............. 1
  Aggelos Kiayias and Anthi Orfanou
On the Specification and Verification of Voting Schemes ................ 25
  Bernhard Beckert, Rajeev Goré and Carsten Schuermann
Formal Model-based Validation for Tally Systems ........................ 41
  Joseph Kiniry and Dermot Cochran
Vote Casting In Any Preferred Constituency: A New Voting Channel ....... 62
  Jurlind Budurushi, Maria Henning and Melanie Volkamer
Attacking the Verification Code Mechanism in the Norwegian Internet Voting System ... 77
  Reto E. Koenig, Philipp Locher and Rolf Haenni
A Formal Model for the Requirement of Verifiability in Electronic Voting by means of a Bulletin Board ... 94
  Katharina Bräunlich and Rüdiger Grimm
Analysis of an Electronic Boardroom Voting System ...................... 111
  Mathilde Arnaud, Véronique Cortier and Cyrille Wiedling
Dispute Resolution in Accessible Voting Systems: The Design and Use of Audiotegrity ... 129
  Tyler Kaczmarek, John Wittrock, Richard Carback, Alex Florescu, Jan Rubio, Noel Runyan, Poorvi Vora and Filip Zagorski
Mental Models of Verifiability in Voting ............................... 144
  Maina M. Olembo, Steffen Bartsch and Melanie Volkamer
Prêt à Voter Providing Everlasting Privacy ............................. 158
  Denise Demirel, Maria Henning, Jeroen van de Graaf, Peter Y. A. Ryan and Johannes Buchmann
Towards a Practical Internet Voting Scheme Based on Malleable Proofs ... 178
  David Bernhard, Stephan Neumann and Melanie Volkamer
A Practical Coercion Resistant Voting Scheme Revisited ................. 195
  Roberto Araújo and Jacques Traoré

Program Committee

Josh Benaloh, Microsoft Research
Jeremy Clark, Carleton University
J Paul Gibson, Telecom Management SudParis
Joseph Hall, Center for Democracy & Technology
James Heather, University of Surrey
Hugo Jonker, University of Luxembourg
Aggelos Kiayias, University of Connecticut
Reto Koenig, Berne University of Applied Sciences
Helger Lipmaa, University of Tartu
Olivier Pereira, Université catholique de Louvain
Mark Ryan, University of Birmingham
Peter Ryan, University of Luxembourg
Steve Schneider, University of Surrey
Berry Schoenmakers, Eindhoven University of Technology
Vanessa Teague, The University of Melbourne
Melanie Volkamer, Technische Universität Darmstadt
Poorvi Vora, The George Washington University
David Wagner, University of California, Berkeley
Douglas Wikström, KTH Royal Institute of Technology
Zhe Xia, Wuhan University of Technology

Additional Reviewers

Gurchetan S. Grewal
Rui Joaquim
Morgan Llewellyn
Sjouke Mauw
Thea Peacock
Joshua Phillips

Scaling Privacy Guarantees in Code-Verification Elections

Aggelos Kiayias¹* and Anthi Orfanou²

¹ National and Kapodistrian University of Athens, Athens, Greece ([email protected])
² Columbia University, New York, NY ([email protected])

* Supported by project FINER of the Greek Secretariat of Research and Technology, ERC project CODAMODA and Marie Curie grant RECUP. Also partly supported by an EAC grant from the VoTeR center, University of Connecticut.

Abstract. Preventing the corruption of the voting platform is a major issue for any e-voting scheme. To address this, a number of recent protocols enable voters to validate the operation of their platform by utilizing platform-independent feedback: the voting system reaches out to the voter to convince her that the vote was cast as intended. This poses two major problems: first, the system should not learn the actual vote; second, the voter should be able to validate the system’s response without performing a mathematically complex protocol (we call this property “human verifiability”). Current solutions with convincing privacy guarantees suffer from trust scalability problems: either a small coalition of servers can entirely break privacy, or privacy rests on a secret key held by the platform itself. In this work we demonstrate how it is possible to provide better trust distribution without platform-side secrets by increasing the number of feedback messages sent back to the voter. The main challenge of our approach is to maintain human verifiability: to solve this we provide new techniques that are based on either simple mathematical calculations or a novel visual cryptography technique that we call visual sharing of shape descriptions, which may be of independent interest.

Keywords: Electronic voting, elections integrity, visual cryptography

1 Introduction

The integrity of the voting platform is a critical feature of electronic voting systems. If an attacker controls the voting platform then it can not only breach voter privacy but also manipulate the election results. For this reason, as e-voting systems increasingly find their way into real-world deployments, the security properties of the voting platform have become a major consideration. This problem is particularly exacerbated in the case of Internet voting, where the voter is supposed to use a general-purpose system (a PC) for ballot casting. In this context the problem has been generally identified as the untrusted platform problem. To solve it, a general methodology has arisen that enables the human operator of the ballot-casting PC to validate its operation (i.e., that it has cast the proper vote) by receiving suitable feedback from the system.

Even if we assume the existence of such a feedback channel for free, this approach has to overcome two additional major challenges. First, the system should be able to provide the feedback without breaching the privacy of the voter by learning her vote. Second, the validation protocol should not be mathematically complex, since that would again require the PC to complete it; in other words, the protocol should be “human-verifiable”, i.e., easily executed by a human on the verifier side. We first explain how these problems have been addressed in the literature so far, and then we present our results.

1.1 Previous work

An ingenious idea to resolve the untrusted platform problem was proposed by Chaum [4]: in code voting the
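The excerpt breaks off just as code voting is introduced, so the following is only a rough, generic sketch of the code-voting idea (pre-printed voting codes submitted through the untrusted PC, with feedback checked by eye against return codes on the voter's paper sheet). It is not the paper's construction; all function and variable names are our own illustration.

```python
import secrets

# Illustrative sketch of code voting, under our own assumptions.
# The election authority prints, per voter, a private code sheet giving
# each candidate a random voting code and a short return code.

def make_code_sheet(candidates):
    """Pre-printed private sheet: candidate -> (voting code, return code)."""
    return {c: (secrets.token_hex(4), secrets.token_hex(2)) for c in candidates}

def server_feedback(code_table, voting_code):
    """The server maps a submitted voting code to its return code.
    It is provisioned with codes only, never with candidate names,
    so the feedback reveals nothing about the vote to the platform."""
    return code_table[voting_code]

candidates = ["Alice", "Bob", "Carol"]
sheet = make_code_sheet(candidates)

# The server stores only voting_code -> return_code pairs.
code_table = {vc: rc for (vc, rc) in sheet.values()}

# The voter types the voting code for her choice into the (untrusted) PC...
choice = "Bob"
voting_code, expected_return = sheet[choice]

# ...and verifies by eye that the feedback matches the return code printed
# next to that choice on her paper sheet ("human verifiability").
assert server_feedback(code_table, voting_code) == expected_return
```

The point of the construction is that the untrusted PC only ever sees opaque codes, while the check the voter performs is a simple visual comparison rather than a cryptographic computation.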
