Technology in Society 66 (2021) 101667


Surveillance and privacy – Beyond the panopticon. An exploration of 720-degree observation in level 3 and 4 vehicle automation

Tim Jannusch a,b,*, Florian David-Spickermann a, Darren Shannon a, Juliane Ressel a,b, Michaele Völler b, Finbarr Murphy a, Irini Furxhi a, Martin Cunneen a, Martin Mullins a

a Emerging Risk Group, Kemmy Business School, University of Limerick, Ireland
b Institute for Insurance Studies, TH Köln, Germany

ARTICLE INFO

Keywords: Vehicle Automation; Surveillance; Privacy; Panopticon; Contextual Integrity; 720-Degree Observation

ABSTRACT

On the path to high-level vehicle automation, the degree of surveillance both inside and outside the car increases significantly. Consequently, ethical considerations are becoming central to questions around surveillance regimes and data privacy implicit in level 3 and 4 vehicle automation. In this paper, we focus on outputs from the EU Horizon 2020 project Vision Inspired Driver Assistance Systems (VI-DAS). In particular, we assess the VI-DAS 720-degree observation technology, critical to ensuring a safe Human Machine Interaction (HMI), from multiple theoretical perspectives to contribute to a better understanding of the phenomena of privacy. As a synonym for surveillance, we started our evaluation with Bentham's ideation of the panopticon. From there, it is a relatively short step to radical Foucauldian critiques that offered more dystopian technologies of power. However, both theorems demonstrate a limited understanding of the issue of data privacy in the context of safe transportation along the evolution of highly automated vehicles. Thus, to allow the debate to move beyond more binary discussions on privacy versus control/power and to a certain degree escape the shadow of the panopticon, we applied Nissenbaum's four-theses framework of Contextual Integrity (CI). Her decision heuristic allowed us to introduce structure and a degree of precision in our thinking on the matter of privacy that represents a step forward in understanding the phenomena of privacy in a specific context. Our approach concludes that the VI-DAS 720-degree observation technology can respect the user's privacy through an appropriate flow of personal information. However, the flows of personal information must be strongly regulated to ensure that data is seen as a value in terms of a commodity to protect human life and not as an asset that needs to be turned into value in terms of capital or the facilitation of asymmetric power relations.

1. Introduction

There is a growing body of literature on the evolution towards automated driving. It recognises the transition through the various levels of automation as significant steps along the path towards fully autonomous vehicles. Recent technological developments point to an imminent move from level 2 to levels 3 and 4 vehicles. The latter automation levels can be characterised by a shared responsibility for the driving task between the human and the machine. This hybrid human-machine decision making implies that we will need to see the development of shared liability regimes in order for risk transfer protocols around private transportation to remain operative. There is a great deal of complexity pertinent to this phase of technological roll-out, and a number of European Union funded projects in the transport realm are partly focused on the multiple challenges that we face in finding a solution. One of these projects is the EU Horizon 2020 Vision Inspired Driver Assistance Systems (VI-DAS) project (VI-DAS, n.d.); [1], which sought to examine and solve the various problems that arise from Human Machine Interactions (HMI).

More specifically, the VI-DAS project aims to establish an enhanced driving scenario modelling and risk evaluation for level 3 to 4 vehicle automation. The suggested solution to the problem of a mix of human agency and Artificial Intelligence (AI) in these levels is the idea of 720-degree observation. That is to say, a complete level of surveillance both 360-degree within and 360-degree outside the vehicle. On the outside, cameras and a LIDAR sensor provide the necessary data for the vehicle to navigate the road system, whereas on the inside, cameras are mounted to collect critical data about human behaviour and in particular the

readiness of the human driver to resume the control function. The data inputs are processed by on-board computers that use machine-learning techniques to model the driving environment and generate an in-depth understanding of the driving situation. A key driver of in-cab surveillance is to be found at phase 8 (see Fig. 1) where the vehicle cedes control to the driver.

In order for this transfer to occur, the vehicle has to know whether the driver is ready to take back control. The VI-DAS project posits the idea of a camera located over the dashboard, which is able to scan both the face and the body position of the driver to determine the driver state and readiness to take back the driving task. Therefore, the cameras are constantly gathering data on the user and, with the addition of some relatively straightforward upgrades, could generate a stream of biometric data. In the near future, a mass-production car would be able to measure or detect tiredness, attention span, alcohol or drug use, and underlying medical issues. In addition, the 720-degree capture of the inside and the outside allows for a triangulation of data sets that could capture not only biometric data but also behavioural traits. In such a situation, aggressiveness and passiveness whilst driving become measurable, as does the propensity to break the law.

Despite several safety benefits, the idea of 720-degree observation posed by the VI-DAS project illustrates that there is virtually no limit to the type and amount of data that can be captured, analysed and stored [3,4]. Nearly every behaviour can be monitored and carefully scrutinized [5]. Apart from that, once information about behavioural traits and the driving environment has been collected, it may be persistently available for a variety of stakeholders that may be afforded access to this lake of private information [6,7]. From there, fragments of information can be easily detached from their original context. One interpretation is that the targeted employment of big data analytics allows "strangers" to generate comprehensive insights about individual lives [5,8]. As a result, we may run into the danger that others use that information to exert influence in a covert and manipulative way that threatens our ability to reflect on and decide independently about our own behaviour [9]. They can claim this massive flow of personal information as free raw material generated from us but not for us, and exert control, power and limit freedom [4,7,10,11]. With that in mind, it becomes vital to not only consider the potential value of 720-degree observation but to understand ethical connotations related to data privacy and surveillance.

The notion of privacy is a concept that had already appeared in 1890 [12]. From a modern perspective, it can be denoted as the ability of people to determine for themselves when, how and to what extent their personal information is accessed by others [13]. In this context, it is interesting to juxtapose Paterson's [14] characterisation of the automobile as a private space, a symbol of freedom, and a refuge from the outside world with automated and semi-automated cars. The different perceptions are striking: the automobile as a private space and a symbol of freedom on the one hand, and a machine equipped to capture and assess in detail the behavioural traits of the driver on the other. It is easy to see why people often refer to the current employment of surveillance technologies within the car (i.e. motor telematics¹) as a spy, big brother, or a stalking system [15]. Apart from this perception of surveillance technologies, a frequently used metaphor among researchers related to the issue of privacy and surveillance is the "panopticon". This raises the question of whether the substantial body of literature on Benthamite and Foucauldian panoptical thinking helps us to understand this endangerment of our privacy by the widespread employment of 720-degree observation.

According to Bentham's idea, the panopticon denotes an architectural design that makes a comprehensive monitoring of people feasible. Complex supervisory procedures supplement the architecture. Bringing both together, Bentham created a continuous spectre of surveillance coupled with a realisation that people within the panopticon are effectively forced to adapt their own behaviour to the given standards [16]. All versions of the panopticon manifest modifications that are targeted at solving social and political problems of that time to maximise pleasure and minimise pain in society [16]. However, this approach of total information collection motivated Foucault to study Bentham's panoptical idea in his research on the relationship between power and knowledge and to expand it into the concept of panopticism [16,17]. According to the concept of panopticism, Bentham's panopticon is a laboratory of power that is applied in the form of continuous individual supervision with control, punishment, compensation and correction of individual misbehaviour. Large-scale control is realised by an infinite, invisible power seeing everything and everyone at all times [18]. In his view, the disciplining power construct of the prison-panopticon has already been realised in many aspects of Western societies, with technologies as instruments of surveillance that create respective norms of behaviour and limit freedom [19–21]. Foucault's use of the terms technologies or techne refers to a set of practices that emerge in a given society and not necessarily to any technological artefact.

The flows of personal information from the cab of the vehicle are driven by sets of professional practices which may not have an explicit political agenda but nevertheless create regimes where power relations change. The presence of certain institutions as potential end users of the newly available data does suggest the creation of a new, more intense element of control that does chime with Foucauldian ideas of disciplinary power. However, although Foucault's theorem is an interesting starting point from which to interrogate these practices, we should not lose sight of the positive side effects of a purposeful data collection employing 720-degree observation. In certain contexts, society may even determine that a specific information flow is reasonable and critical for societal welfare. This has prompted European policy-makers to discuss which institutions should be able to access the data in the first place [22]. Current Benthamite and Foucauldian discussions of data privacy and surveillance are too general to allow us to understand who should have access to the data recorded through 720-degree observation. What might be useful in this regard is to follow what Vallor [23] has termed an "empirical turn" in the field of technological ethics, which looks at specific cases to tease out the moral issues in play. We need a more robust framework that helps us to specify the question of access to data streams in order to optimize HMI, thereby improving the security of automated vehicles.

Consequently, in this paper we argue for a new way of thinking that is based on well-established privacy theorems but also one that allows the debate to move beyond more binary discussions on privacy and partly escapes the shadow of the panopticon. Nissenbaum's concept of contextual integrity (CI) offers a step forward in this regard [24]. She claims that the originally used narrow definition of privacy falls short and, going forward, the theory of privacy should not primarily be concerned with information flow but rather the appropriateness of these data flows. Accordingly, instead of taking the view that individuals must have full control over the information use, the key question to be answered is whether the information flows are appropriate or not [25]. However, the latter is not solely measured according to the benefit of the individual and factors in historically evolved values and informational norms [24]. In the context of level 3 and 4 vehicle automation, the acquisition and (correct) processing of the driver status may lead to relevant societal benefits such as improved safety, a reduction of emissions and fuel consumption, and advances in overall traffic flows. Such benefits need to be factored in to any ethical analysis of such practices to allow for socially responsible data usage [26].

Nissenbaum's approach is to some extent reflected in the structure of the VI-DAS project, in which capturing the views of multiple stakeholders is built into the architecture of the research consortium.

* Corresponding author. Emerging Risk Group, Kemmy Business School, University of Limerick, Ireland. E-mail address: [email protected] (T. Jannusch).
https://doi.org/10.1016/j.techsoc.2021.101667
Received 3 December 2020; Received in revised form 10 June 2021; Accepted 2 July 2021. Available online 13 July 2021.
0160-791X/© 2021 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

¹ A technological device employed in the car to gather comprehensive data about the driving behaviour of a person (e.g. distance driven, acceleration, braking). In the insurance industry motor telematics is used to calculate individual insurance premiums and to provide feedback to the driver on how to improve safe driving behaviour.


Fig. 1. Shared responsibility in level 3 and 4 vehicle automation [2].

The need for robust, multi-perspective governance regimes has also been accounted for in the VI-DAS proposal. The consortium is a hybrid of technically orientated partners that include a number of blue-chip technology² companies and academic institutions specialising in applied ethics and governance. Any solution to the issue of surveillance in regimes of shared responsibility and liability requires an acceptance of the two-way flow between ideas around best practice in terms of ethics and good governance and the trajectory of technological development. Therefore, this paper is structured in the following manner: The first section expounds on the state-of-the-art implementation of surveillance-related technologies. Thereby we also capture the Benthamite/Foucauldian understanding of surveillance regimes and their limitations as they relate to the roll-out of autonomous vehicle technologies. This is followed by an examination of the four theses of Nissenbaum's framework of CI, which seeks to weigh up the utility of her decision heuristic in the context of vehicle automation. With Nissenbaum's four theses as a starting point, we then work through the practical case of the hand-over and take-over scenario enabled by the VI-DAS 720-degree observation technology. Thereby we aim to more fully explain the framework and assess the issue of data privacy in a real-case scenario to broaden the ethical discussion around 720-degree observation and interrogate the concept of privacy.

2. Background

In recent years, much effort has been devoted to sensors that allow monitoring of the exterior of a vehicle for the development of advanced driver assistance systems (ADAS) such as forward collision warning. However, we are facing a transition from a car driven by a human to the autonomous vehicle, and this transition is characterised by a shared responsibility for the driving task evident in levels 3 and 4 of vehicle automation. In this phase, the human and the machine must complement each other's capabilities to operate the vehicle safely. Therefore, behavioural traits of the driver must be monitored and analysed to provide the right conditions for the take-over of control. For this reason, the VI-DAS project proposes a novel 720-degree observation approach that allows the gathering of data from inside and outside of the vehicle. Apart from external sensors, the VI-DAS project uses non-intrusive sensors³ within the cabin to gather real-time information about drivers' behaviour and state of attention. Coupled with the external sensors, every individual driving situation can be understood and the modes of transition can be adapted to the individual characteristics of a driver to ensure a prompt resumption of control.

In the following passages, we provide some background on the technical advancements, focussing on the area of driver monitoring to establish a necessary backdrop and to contextualise the following sections. Technology has made great strides in developing a functional autonomous vehicle. Standardised classifications have followed in an attempt to map the increasing level of responsibility being placed on the vehicle for driving duties [27]. These classifications are measured on a scale of 0–5 and are represented in Fig. 2. Levels 0 to 2 largely represent manual driving, with level 2 vehicles offering risk reductions through taking on tasks that assist the driver in avoiding or mitigating anomalous events or impending crashes [28,29]. Levels 3 to 5 place significant emphasis on the 'machine' element of semi-autonomous vehicles. Whereas level 5 represents full automation, level 3 and level 4 indicate conditional automation and high automation respectively.
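As a rough aide-mémoire, the classification just described can be captured in a small lookup table. This is our own illustrative sketch, not VI-DAS code: the level names follow the SAE scale as summarised above, while the `needs_driver_monitoring` helper simply encodes the paper's point that the shared responsibility at levels 3 and 4 is what makes in-cab driver monitoring necessary.

```python
# Illustrative sketch (not from the VI-DAS project): the SAE 0-5 scale as
# described in the text, mapping each level to the party responsible for
# the driving task.
SAE_LEVELS = {
    0: ("No Automation", "driver"),
    1: ("Driver Assistance", "driver"),
    2: ("Partial Automation", "driver"),      # system assists, driver supervises
    3: ("Conditional Automation", "shared"),  # system drives, driver must take over on request
    4: ("High Automation", "shared"),         # system drives, falls back to a minimised risk condition
    5: ("Full Automation", "system"),
}

def needs_driver_monitoring(level: int) -> bool:
    """Levels 3 and 4 share responsibility, so the driver state must be observed."""
    return SAE_LEVELS[level][1] == "shared"
```

Framed this way, 720-degree observation is not a property of automation per se but of the two "shared" rows, which is where the hand-over problem discussed below arises.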
Semi-autonomous vehicles at these levels can provide a significant reduction in the number of anomalous events that would otherwise be present during manual driving [30]. McCall et al. [31] note that level 4 liability remains quite vague, as details of a 'minimised risk condition' are not expanded upon in situations where the driver is unable to retake control. Existing legislation regarding automated vehicles tries to close this gap by positing a strict liability on the part of the owner to ensure victim protection [32].

Although high levels of autonomy are expected to introduce an increased level of safety, difficulties may arise in situations where

² Blue-chip technologies are industrial and embedded computing solutions for a diverse range of applications.
³ Non-intrusive human behaviour sensors are sensors that gather information about the human being but do not need to have a connection to the human body.


smooth handovers between driver and vehicle are not achieved. The driver must remain alert and receptive of commands to engage in a handover sequence to retake control of the vehicle in scenarios such as adverse weather conditions. As such, driver distraction, or "a diversion away from activities critical for safe driving toward a competing activity", is a significant hurdle that must be crossed by vehicles along the SAE International [27] scale [34]. Optimised HMI environments and handover procedures are imperative to the development of level 3 and level 4 vehicles, especially given that 9% of U.S. fatal crashes in 2017 were reported as distraction-affected crashes [35]. Furthermore, drivers can potentially become more inattentive as they become more reliant on automated functions [36]. This over-reliance on systems is critical because drivers tend to trust the automated functions to an unjustified extent, leading to serious accidents [37,38].

Fig. 2. Levels of Driving Automation and shift in responsibilities [27,33].

A number of analyses have been conducted to examine the reaction times and attentiveness of drivers upon the initiation of an impending handover task, with the goal of implementing an optimal strategy of alert signals and placement to ensure a smooth transition. Trask, Stewart, Kerwin, and Midlam-Mohler [39] found that distracted drivers that were pre-engaged in a secondary task with two hands responded more quickly to audio warnings, while visual and speech warnings did not elicit fast reactions and were seen as ineffective and frustrating. Reaction times also decreased the longer the participants were in the study, suggesting that drivers quickly adapt to the takeover signals and thereafter retain an adequate level of attentiveness. To overcome the issue of cognitive and affective loads, van der Heiden, Iqbal, and Janssen [40] suggest the use of pre-alerts prior to engaging in the handover event from vehicle to human, as it allows the driver time to disengage from their secondary task and regain situational awareness, while reducing mental workload and stress from the request event. They additionally demonstrated that an increasing pulse alert⁴ is more beneficial since it conveys an additional level of urgency.

In addition to providing a smooth handover in situations where the vehicle cannot guarantee occupant safety, level 3 and 4 vehicles may also need to be prepared to take over the responsibility of driving duties when the driver is continually inattentive or otherwise impaired. The National Centre for Statistics and Analysis [35] indicates that driver distraction is one subset of inattentiveness, which can also extend to fatigue, as well as physical and emotional conditions of the driver. Ryan et al. [30] show that both aggressive and drowsy drivers record significantly more adverse events than drivers in a 'normal' state. Purucker, Naujoks, Prill, and Neukum [41] address the distraction potential of increasingly complex in-vehicle information systems (IVIS). Menial tasks related to mobile phone, navigation and sound devices regularly attained 'Eyes-Off-Road' (EOR) times of greater than 10 s, with standard deviations greater than 10 s in tasks related to song selection and address entry. Zeeb, Buchner, and Schrauf [42] establish that drivers with erratic EOR patterns were high risk on the basis of increased reaction times and lower situational awareness when faced with a takeover event. Furthermore, Vicente et al. [43] attained a 95% success rate in a real-time EOR gaze tracking system. The introduction of such a system would enable level 3 and level 4 semi-autonomous vehicles to send alerts to the driver or initiate takeover control if full attention is not restored.
⁴ Beeps are given throughout the 20 s pre-alert time before engaging in the handover event.

Similar to Takahashi et al.'s [44] approach, the VI-DAS project uses a camera to monitor the user of a level 3 or 4 vehicle. However, it extended the monitoring of the driver by implementing an external view which makes a 720-degree observation feasible. This results in a triangulation of data sets which cover not only biometric data but behavioural traits to assess the overall situational ability of a driver to take over control. Associated with that is the flow of personal data constantly gathered due to the surveillance of the driver, which does pose questions around the individual's privacy.

Surveillance, or in modern parlance dataveillance, denotes the "systematic monitoring of people's actions or communications through the application of information technology" [45]. In the context of the VI-DAS project, the applied information technology are camera sensors that allow a 720-degree observation of the driving environment inside

and outside the vehicle. Privacy characterises the right to determine for ourselves when, how and to what extent personal information is collected and accessed by others [13]. The juxtaposition of both definitions shows that the issue of surveillance goes hand in hand with the issue of privacy [46]. In the first place, it is the act of monitoring our behaviour within the cabin that allows others to gather a multitude of personal information. From there it is only a short step to the intrusion of our privacy, realised when we lose the right to determine when, how and to what extent our data can be accessed. Due to their strong interrelation, multi-theoretical perspectives are critical to achieve a comprehensive understanding of the issues of surveillance and privacy as well as their positive aspects in the context of the "VI-DAS car".

Bentham's thinking on the panopticon has attracted considerable attention and can almost be classified as a synonym of surveillance [47]. For this reason, it constitutes a promising starting point to get an initial understanding about whether we can talk about surveillance in the context of the 720-degree monitoring scheme implemented in the "VI-DAS car". There are at least four versions of Bentham's panopticon targeted at solving a specific social or political issue of that time: the prison panopticon [48], the pauper panopticon [49], the chrestomathic panopticon [50], and the constitutional panopticon [16]. However, in a concise manner, Bentham's panoptical idea is a targeted architectural design supplemented by supervisory procedures to achieve a constant feeling of being watched for those inside the panopticon. The purpose of this constant feeling of surveillance is to get people within the panopticon to discipline themselves for societal well-being. The prison panopticon is the most prominent among Bentham's theorems and a good example to understand how the panopticon works. Inmates are securely kept under lock and key in their prison cells, whereby the main feature of the prison is a circular building at the heart of the complex that allows observation of all prison cells. There is an inspector in the central building who monitors the activities of the inmates at all times. However, as one inspector cannot observe all inmates at the same time, Bentham decided to locate the inspector in the dark. This special element of the prison panopticon creates a constant feeling of surveillance because the inspector appears invisible and all-seeing.

With some exceptions, the "VI-DAS car" can be seen as a modern form of Bentham's panopticon. The panoptical gaze is limited to the driving environment; thus surveillance is omniscient in the context of driving but not in the outside world. Within the cabin, cameras observe every behaviour of the driver. Hence, users of the "VI-DAS car" might perceive the camera functioning as an all-seeing and omnipresent inspector comparable to the inspector highlighted in the prison panopticon. Yet, by contrast to Bentham's surveillance machine, the eyes of the VI-DAS inspector (i.e. cameras) are visible to the driver. In addition, VI-DAS calls the user's attention to the constant use of the integrated 720-degree surveillance right before the user starts to operate the car. In the context of the "VI-DAS car", the 720-degree surveillance is designed to solve a critical problem and aid the evolution of highly automated vehicles. When the driver and the vehicle are both responsible for the operation of the car, a faulty HMI can significantly increase the probability of a crash event. That might be comparable to Bentham's motivation to employ surveillance elements to solve political and social problems of his time. Lastly, there are two particular differences between Bentham's and the VI-DAS panopticon. In contrast to Bentham's scheme, surveillance elements within the VI-DAS panopticon are not employed to create a constant feeling of being watched to get the users to discipline themselves. Rather, in order not to endanger the driver or other road users, it is crucial that the vehicle understands when the human user is distracted and cannot take over the driving task. Furthermore, Bentham's panopticon is limited to one locus of power that gathers, processes and stores all information, whereby no third party can necessarily access that data. Compared with this, the "VI-DAS car" is characterised by multiple panopticons crossing each other due to the large and complex network of stakeholders where data sharing becomes commonplace. That poses concerns about the use and potential abuse of data that can only be addressed by Foucauldian thinking on power relations.

Foucault was one of the most influential and cited thinkers of his time. He acknowledged Bentham's idea of the panopticon in his research on the relationship between power and knowledge and expanded it into the concept of "panopticism" [16,17]. According to Foucault, Bentham's panopticon can be seen as a laboratory of power that is applied in the form of continuous individual supervision with control, punishment, compensation and correction of individual misbehaviour. Large-scale control is realised by a central, infinite, invisible power seeing everything and everyone [18]. In Foucault's view, the disciplining power construct of the panopticon has already been realised in many aspects of societies, with technologies, such as cameras, as instruments of surveillance [19–21]. According to Foucault, that omniscient disciplining power creates respective norms of behaviour that turn people into "marionettes of power".

Like Bentham's architectural thinking, Foucault's research on power relations is critical to the VI-DAS project. It allows us to factor in the negative side effects of surveillance for individuals and the whole of society. Considering the disciplining power of the panopticon, one could argue that people who do not meet the ideal driver behaviour expected by the system get punished in the sense that the vehicle will not hand over control. However, in terms of lighter distractions, users might feel a deprivation of freedom by the emerging automation technology. Putting it into Foucault's line of thought, this characterises a supervision in the form of control, punishment, compensation and correction of individual misbehaviour. Subsequently, the disciplining power empowered by VI-DAS 720-degree observation can limit human freedom [51]. Let us posit the insurance industry as a potential stakeholder that can access the data captured by the "VI-DAS car". Insurers use the captured information about behavioural traits to penalize those who do not act in accordance with the insurer's standards of low risk. Following their aim of profit maximisation, insurers can form a society characterised by manipulation that adapts human behaviour to the insurer's standards [52]. Anomalous behaviour could be judged as deficient and socially unacceptable [20], and thus the aim of the VI-DAS project to create a space of personal fulfilment is threatened. That said, Foucault's insights strengthen the idea that it is vital to the VI-DAS project to ensure that behavioural data is governed and not misused for any other purpose than to optimize the communication between the human and the car. The emergence of harmful power relations must be prevented.

Overall, using the traditional view of Benthamite surveillance and Foucauldian power relations allows a deeper and more comprehensive insight into the discussion of surveillance originated by VI-DAS 720-degree observation. Nevertheless, there are both theoretical and practical limitations with the dystopian ideal of a central locus of power. It does not apply to the modern world of VI-DAS, which is characterised by multiple panopticons crossing each other due to the large and complex network of stakeholders involved in the development of the next generation of automated vehicles. Moreover, both theorems are limited by the fact that there exists no alternative to optimize HMI other than to capture in-depth information about the behaviour of the driver. Imagine a take-over situation between two humans in a car. Throughout the driving task it is vital for the one to constantly monitor the behaviour of the other and to know whether the other is able to take back control and operate the vehicle safely. Thus, considering the progressive development of automated vehicles, the question that should be answered is not if surveillance is acceptable in this context. Rather, we should aim to understand the trade-offs of data sharing. We should figure out what data should be shared, with whom and to what ends. All this is important to avoid an intrusion of individuals' privacy and the emergence of harmful power relations. Consequently, we argue for a new way of thinking that adopts a multiple stakeholder perspective and distinguishes how drivers think about privacy to add value to the surveillance and privacy debate around automated vehicles. A major advantage of

A major advantage of this systemic multi-stakeholder approach is that it addresses the profound ethical aspects that arise with the development of HMI technologies, for example in terms of (data) control, autonomy and the protection of privacy. Clearly, part of the problem here resides in the relative infancy of many of these practices, and while we should not be tolerant of abusive practice or wholesale changes of power relations, we would do well to bear in mind that we are in new territory [23]. In terms of the likely end-users of data derived from "in cab" surveillance, insurance companies are an important part of the picture. The use of artificial intelligence (AI) by insurers is germane here and this, for [53], "has many of the characteristics of other emerging technologies that make them refractory to comprehensive regulatory solutions". The argument here is that the use of in-vehicle AI may involve information flows across jurisdictions, from one set of professions to other groups of professionals, and incorporates sets of practices that have not traditionally resided within the purview of the regulators. The use of AI also threatens certain characteristics of what many ethicists and philosophers take to be intrinsic to human beings, including notions of dignity and autonomy. Such concepts do not fit comfortably into the current paradigm of financial regulation or indeed legislation governing transport. These issues have attracted some attention from the regulatory community, and in 2019 EIOPA established the Consultative Expert Group on Digital Ethics in Insurance to inform public policy in this area.5 They had previously published a thematic review on the use of big data analytics in the motor insurance and home insurance markets where the issue of privacy was very prominent [54].

2.1. Nissenbaum's framework of CI

There is a degree of uncertainty around the question of how to operationalise the term 'privacy' implicit in level 3 and level 4 vehicle automation. As outlined in the Benthamite and Foucault section, albeit with some limitations, previous philosophical theories have tended to offer a profound understanding of the notion of surveillance as it refers to the current discussion around 720-degree observation. However, the main problem with both theorems is that they postulate an omnipresent power that forcefully collects all available information about a person. In the VI-DAS scenario, it is the user that shares behavioural information to secure an optimised and safe HMI during a hand-over and hand-back between manual and automated driving modes. We propose that to stop this flow of personal information appears restrictive and destructive for the future of mobility and our safety in level 3 and 4 vehicle automation. That said, the practical use-case of the VI-DAS hand-over scenario might help us to understand that it is not the necessary flow of personal data that defines a harmful intrusion of our privacy. Rather, it is the misuse of this data that can discomfort the user and therefore needs to be regulated [51]. For this reason, this section assesses the issue of data privacy from a more user-oriented perspective to broaden the ethical discussion around 720-degree observation and lend more context to the concept of privacy.

To assess the issue of data privacy from a more user-oriented perspective we refer to Nissenbaum's framework of CI. In her timely article "Contextual Integrity Up and Down the Data Food Chain", Nissenbaum provides a comprehensive overview of the theory of CI [25]. Moreover, she applied her theorem as a mechanism for explaining a multitude of privacy challenges related to her figurative expression of a data food chain. In this section, we aim to reproduce Nissenbaum's four fundamental theses that carry the interrelated elements of CI. Then we work through the VI-DAS hand-over and take-over scenario to discuss the issue of privacy in a more context-specific manner. The first thesis of Nissenbaum's framework of CI is that (1) privacy is determined by an appropriate flow of personal information. In her eyes, it is of no importance whether an information flow is generated or not. Instead, the concern of the framework of CI is whether the flow of personal information is appropriate in the given context. To answer the question as to which data flow can be described as appropriate, we now need to move to thesis two of Nissenbaum's framework of CI. According to her second thesis, (2) appropriate flows conform with contextual-information norms ("privacy norms"). For a better understanding of the second thesis, let us think about our social lives where we individually or collectively interact with others. In those situations, our behaviour is normally governed by structured social settings that are defined by a multitude of factors such as internal values, power structures, norms or (social) rules. The often unspoken contextual values, aims, purposes, and norms of a social setting combine to govern individual behaviour. A central argument of Nissenbaum is that among contextual norms, we can find those that regulate (i.e. shape or limit) personal information flows. These contextual-informational norms (rules) provide an explanation of why an insurance agent expects us to share comprehensive information about our health but not with the stranger we just met on the street. Appropriate flows of personal information are defined by substantive normative dos and don'ts evident in distinct social contexts that influence our perception of privacy.

Moving on, (3) five parameters define privacy (contextual-informational) norms: subject, sender, recipient, information type, and transmission principle. The third thesis of Nissenbaum's theorem elucidates that five key parameters characterise context-relative informational norms. Senders and receivers of personal information flows can be single or multiple individuals or collectives like organisations. Let us stay with the given example of the insurance agent. In this scenario, the insured is the sender of the personal health information and the recipient is the insurance agent or, hereinafter, the insurance company. In many cases, the subject and sender of information are identical [24]. However, to specify informational norms it is always important to identify the roles of all actors in the given context. The nature of information (i.e. the types of information involved) is another important parameter that needs clarification. Let us return to our example of the insurance agent. In this scenario, the nature of the information discussed is medical information about the status of (our) health. Ultimately, the transmission principle needs to be addressed within the framework of CI. This parameter describes a contextual restriction of the flow of personal information that is shared between the sender and the recipient. More specifically, the transmission principle covers the terms and conditions that govern the flow of personal information in a given context. In her book "Privacy in Context", Nissenbaum gives an illustrative example for the context of friendship [24]. Among our friends, we normally anticipate that we share personal information voluntarily. We even tend to share very intimate information although we know that friends often draw their own conclusions that might not correspond to our own ideas or values. However, this openness and vulnerability is only possible due to certain relative or aggregative restraints. We share information reciprocally and suppose that our secrets will be handled confidentially.

Nissenbaum's last thesis posits that we need to ensure (4) the ethical legitimacy of privacy norms evaluated in terms of: A) interests of affected parties, B) ethical and political values, and C) contextual functions, purposes and values. Think about social life. Normally, our social life is characterised by a multitude of personal information flows between subjects, senders and recipients. These information flows are regulated by contextual-informational norms that have evolved over time. If we would now declare that only pre-existing contextual-informational norms can measure contextual integrity, then the framework of CI forces us to determine that many new technologies that violate established informational norms are problematic. Consequently, in the name of privacy we would refuse many novel but valuable technologies that are aimed to modify entrenched informational norms to achieve a greater good. To impede a harmful conservativism, the framework of CI provides a way to demonstrate that a new practice is morally legitimate insofar as it promotes societal welfare.

5 See https://www.eiopa.europa.eu/content/eiopa-establishes-consultative-expert-group-digital-ethics-insurance_en.
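Nissenbaum's five parameters lend themselves to a simple operational sketch. The following Python fragment is our own illustration, not part of the VI-DAS system or of Nissenbaum's work; the names `Flow`, `NORMS` and `is_appropriate`, and the example norms themselves, are hypothetical. It models an information flow as a five-parameter tuple and checks it against the norms of a context, mirroring the first thesis:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    """A personal information flow, described by Nissenbaum's five parameters."""
    subject: str
    sender: str
    recipient: str
    info_type: str
    transmission_principle: str


# Contextual-informational norms: the flows a given context treats as appropriate.
# These two entries sketch the insurance-agent example used in the text.
NORMS = [
    Flow("insured", "insured", "insurance agent", "health", "confidentiality"),
    Flow("insured", "insured", "insurance agent", "claims history", "confidentiality"),
]


def is_appropriate(flow: Flow, norms=NORMS) -> bool:
    """First thesis of CI: privacy is preserved iff the flow matches a contextual norm."""
    return flow in norms


# Sharing health data with one's insurance agent conforms to the context ...
assert is_appropriate(Flow("insured", "insured", "insurance agent", "health", "confidentiality"))
# ... sharing the same data with a stranger on the street does not.
assert not is_appropriate(Flow("insured", "insured", "stranger", "health", "confidentiality"))
```

The fourth thesis then asks whether such a norm list should itself be revised, which a lookup like this cannot decide; the sketch only makes explicit which parameters that deliberation has to fix.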

For this reason, within the first two layers of the fourth thesis, users of CI need to discuss the interests and ethics as well as the political values of a new information practice. Such a deliberation should answer whose interests are served and how autonomy and freedom may be affected by the emergence of new power relations. Finally, thesis four requires a distinctive contextual consideration of the new practice. Influenced by Regan [55], the last layer critically reflects whether the new practice infringes contextual purposes and values. At that point, it is important to understand that contextual purposes and values sometimes incur displeasure in the individual information subject or sender. A timely example is the ongoing discussion around the application of smartphone tracking in the current COVID-19 pandemic [56–58]. Not every individual might like to share information about current locations and contacts. However, a substantive advantage of this new practice is that it may allow us to prevent the further spread of the disease and thereby to protect human lives.

2.2. Nissenbaum's framework of CI applied to the VI-DAS scenario

In this section, we aim to confront people's moral and political point of view with the emerging information technologies as an integral element of automated vehicles. Therefore, we apply Nissenbaum's framework of CI to the VI-DAS 720-degree observation scheme. As recommended, we carefully map out the personal information flows considering the five parameters of thesis three. Subsequently, we assess interests, social values, and contextual ends and values to determine the moral legitimacy of the VI-DAS 720-degree observation and to ultimately refine our understanding of privacy in this practical scenario.

Privacy is determined by an appropriate flow of personal information in a specific context. Before we can determine whether the flow of information is appropriate, we must get an in-depth understanding of the prevailing context and the new practice in terms of information flows. Every year, about 50 million serious accidents and 1.3 million fatalities result from traffic accidents worldwide [59]. Semi-autonomous vehicles at level 3 and level 4 are expected to provide a significant reduction of anomalous events that increase the risk of a crash event [30]. However, McCall et al. [31] noted that the details of a 'minimised risk condition' are not expanded to situations where the driver is unable to retake control. Moreover, liability regimes for this hybrid human-machine interaction in level 3 and 4 remain vague. For manufacturers of level 3 and level 4 vehicles, driver distraction poses a significant challenge [34]. It impedes the fluid relationship between the human and the car and therefore is critical for the driver's safety during a hand-over [60].

A central element of the 720-degree observation of a "VI-DAS car" is the camera located over the dashboard. This camera scans the face and body of a driver and thereby gathers information about the driver's physical, mental, and behavioural state. Furthermore, as it is a continuous surveillance, data about mobile use [61] or other secondary tasks become visible [55]. Based on a combination of external data sets and all individual data gathered by the sensors of the vehicle, the system will be able to understand the overall driving situation including the behaviour of the driver. The information is needed for the system to reliably assess whether the driver is attentive and able to take over control of the vehicle in a societally acceptable and personalised manner (see Fig. 1).

Fig. 3 pictures the practice in terms of possible data flows from the "VI-DAS car" to various stakeholders of the automated vehicle consortium. A key element of this construct is the VI-DAS dataset. The dataset denotes an agglomeration of external datasets (i.e. naturalistic data, accident data) and progressively collected user and environmental data via the VI-DAS 720-degree observation scheme. The monitoring data flow is transferred to and stored in a cloud-based infrastructure wherein an informational norm has been developed regarding the acceptance of the uploading, storage and further uses of the data. Machine learning approaches are used to process the VI-DAS dataset in real time to determine whether the driver is able to take back control or not.

After we have depicted the prevailing context and described the new practice in terms of information flows, Nissenbaum's framework requires us to describe the five parameters that define privacy norms (i.e. subject, sender, recipient, information type, and transmission principle). In the VI-DAS scenario, the information subject and sender is the user of the "VI-DAS car" who is monitored throughout the drive. All information is transferred to and stored in a cloud-based infrastructure. From there, stakeholders of the VI-DAS consortium can access and process data of the monitoring scheme. Fig. 3 gives a comprehensive overview of the private and public sector stakeholders that are directly involved in the VI-DAS project. Among those stakeholders are the automotive industry, standards organisations, regulators, insurance companies, and the academic community.

The automotive industry accesses the VI-DAS data to allow continuous optimisation of the performance and robustness of the 720-degree sensor technology. Thereby, information about behavioural traits of the driver can be analysed to improve the installed surveillance technology. In addition, crash data will be made available to the technology supplier to reconstruct the crash situation, since they are responsible for damages when the automated vehicle is at fault. Standards organisations process the VI-DAS data to specify and examine technological standards for implementation, functionality and testing. Regulators are involved as stakeholders in the VI-DAS project to provide recommendations around jurisprudence with a special focus on data and user protection. They will gain access to the VI-DAS data to monitor compliance on a recurring basis. Moreover, they will examine the impact and limitations of legal instruments such as the General Data Protection Regulation (GDPR) [62], functioning as a fundamental basis (inside the EU) for the transmission of personal information in the digital age. The driver will also be able to access the personal data captured by the "VI-DAS car". It will be possible to reconstruct trips and individual behaviour. A special focus can be set on the issue of security. The user of the "VI-DAS car" can learn from such information about safety-critical situations and how to improve individual driving behaviour. Young drivers might benefit most from that technological improvement, as they start to participate in road traffic at a stage when their driving skills are not fully developed. By providing information about the actual safety of the behaviour (e.g. speeding, dangerous cornering), targeted feedback can offer young people the chance to develop their driving skills in a protected environment [63,64]. In addition, due to the 720-degree surveillance of the whole driving environment, the "VI-DAS car" can provide supportive information about how to handle hazardous situations. Thereby, imminent crash events are likely to be prevented [65]. The academic community gets access to the VI-DAS database to solve technological challenges of complex scene analysis or to research how to further improve trust relations between the human and the car to further optimise HMI. Furthermore, the academic community can access the information to identify challenges of data usage, including ethical dilemmas, and introduce those to the ongoing international academic discussion on these important topics. Lastly, the insurance industry is a key stakeholder in the VI-DAS consortium. Before addressing the issue of privacy within the context of 720-degree observation, we outline how insurers can use driver data with regard to an appropriate risk management process. Acting as a carrier and administrator of risks, insurers need to establish a certain degree of knowledge to assess risks [66,67]. Establishing such "risk knowledge" is pivotal for emerging risks, as the full risk potential is unknown and can manifest many years later due to the possible time delay between the causation and manifestation of a claim [68]. Automated vehicles, including level 3 and level 4, are regarded as emerging risks. Therefore, insurers are interested in collecting data from highly automated vehicles to establish a sufficiently large data base for risk assessment. This generated "risk knowledge" enables the insurer to operate a more effective and pro-active risk management [69]. The VI-DAS project proposes a 720-degree observation that gives the motor insurer additional information to trace certain behavioural traits revealing the true risk exposure of the driver.
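The paper does not specify the machine learning models used on the VI-DAS dataset, but the real-time gating logic it describes, i.e. process the monitoring data and decide whether a hand-back to the driver is safe, can be sketched as follows. This is a toy threshold rule standing in for the actual pipeline; the function name, feature names and the threshold value are invented for illustration:

```python
def can_take_back_control(attention: float,
                          eyes_on_road: bool,
                          hands_on_wheel: bool,
                          threshold: float = 0.7) -> bool:
    """Toy stand-in for the real-time decision: hand control back only when
    the estimated driver state indicates readiness.

    `attention` is assumed to be a model-derived score in [0, 1]."""
    if not 0.0 <= attention <= 1.0:
        raise ValueError("attention score must lie in [0, 1]")
    return attention >= threshold and eyes_on_road and hands_on_wheel


# An attentive driver may receive control ...
assert can_take_back_control(0.9, True, True)
# ... a distracted driver (e.g. using a mobile phone) must not.
assert not can_take_back_control(0.4, True, True)
assert not can_take_back_control(0.9, False, True)
```

The design point the text makes survives even in this caricature: the gate can only be computed if in-depth behavioural data about the driver is continuously available, which is exactly why the surveillance question cannot be avoided.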


[Fig. 3 graphic: the VI-DAS Dataset, combining external datasets (naturalistic datasets shared via RTMaps; accident databases such as EDA and police reports) with user and environmental data progressively collected via the VI-DAS 720-degree observation and other vehicle sensors, linked to car manufacturers, sensor suppliers, standards organisations, regulators, test facilities, motor insurers, ethical groups, the driver, and the academic community.]

Fig. 3. Practice in terms of possible information flows from the level 3 and level 4 vehicle to various stakeholders.

Nevertheless, the 720-degree observation poses new challenges in terms of HMI, such as a miscommunication between these parties [70]. An efficient HMI requires adequate monitoring of the present driving situation and a faultless handover of the driving situation [2]. A failure or misunderstanding between the driver and the system can result in accidents with severe bodily injuries of both the driver and other road users. As motor insurers have to compensate justified claims arising out of these losses, they are interested in identifying the reasons for the inappropriate communication in the HMI as well as in transforming this hazard into a calculable risk. By providing feedback to the driver, for example displaying safety-critical situations, the insurers may adjust their clients' behaviour. Additionally, by displaying certain limitations of the systems, the driver would become less reliant on the systems, leading to increased control characteristics [71].

The next step in Nissenbaum's CI is to define the transmission principles of information that cover the terms and conditions governing the illustrated flow of personal information in the hand-over and take-over scenario. A prominent transmission principle is confidentiality. In the VI-DAS context, confidentiality elucidates that the stakeholders who receive the VI-DAS data are not allowed to share this flow of personal information with other third parties. That said, the volume and sophistication of cyber incidents has increased substantially in the last years. These include any risk emerging from connected digital systems being compromised, paralysed or attacked, resulting in privacy and data breaches. Within the context of 720-degree observation, the potential for cyber-attacks expands to in-vehicle sensors and thus to driver status monitoring and understanding, including psychological and physiological state [72–74]. The sensitive data in the personalised driver model is a profound exposure and requires a holistic approach, including a dialogue between various stakeholders, to allow for a better understanding of the risk [73]. For example, insurers play a key role in raising awareness and advancing an environment that promotes an open and constructive engagement around cyber risk [75]. Other important transmission principles in the VI-DAS context depict that information is bidirectional and that an actor deserves to receive information, is entitled or needs to know certain information, and/or is obligated or mandated to disclose information [24]. Within the VI-DAS project, the flow of information is further governed by the corresponding General Data Protection Regulation (GDPR) [62]. In line with Article 5, the consortium has to inform the information subject about the purpose of data collection and needs the data subject's explicit, free, and informed consent [76].

Based on Nissenbaum's framework of CI, a prima facie violation of privacy takes place when the new technology violates the contextual-informational norms [24]. As discussed above, the automobile is characterised as a powerful (cultural) symbol of freedom, as a private space, a refuge from the outside world [14]. This perception of the car illustrates pre-existing contextual-informational norms that have evolved over time and can measure contextual integrity. Regarding the VI-DAS 720-degree observation, the camera gathers a myriad of data on the driver and the driving environment. Hence, at that point, we argue that the VI-DAS technology justifies a prima facie violation of privacy. However, at this juncture we must revert to Nissenbaum's fourth thesis of CI. This last step prevents us from refusing to accede to potentially valuable technologies that are aimed to modify entrenched informational norms to achieve a greater good in the name of privacy. As the emerging VI-DAS technology aims to facilitate a safe HMI in level 3 and 4 vehicles, we now further assess the legitimacy and moral acceptability of the proposed 720-degree observation technology to determine whether this kind of surveillance constitutes a threat to privacy.

With reference to the 720-degree observation of VI-DAS, the driver is aware of the use of monitoring technology to secure a hand-over from an automated to a manual driving mode. One could argue that the transparency resulting from constantly monitoring the driver behaviour leads to a limitation of freedom insofar as it limits the free development of personality due to the constant feeling of being watched [77]. By contrast, it appears reasonable that the feelings triggered by technological surveillance may not last a long time, implying that drivers get used to the VI-DAS technology and retain an adequate level of subjective individual freedom. That said, it is only a short step to develop highly granular profiles over time that can be easily supplemented with information offered by Big Tech.6 Those user profiles can then be further leveraged to nudge or manipulate a driver, for instance, to buy additional warranty options or other products offered by the manufacturer. Furthermore, it is conceivable that manufacturers may access the information to fend off warranty claims citing abnormal driving behaviour. Without proper safeguards, government employees can also misuse personalised data to track individuals or use the data for law enforcement purposes [78]. A contemporary example is provided by the Chinese authorities. They harness the power of personal data to create a Social Credit System that punishes or rewards citizens according to certain standards of behaviour determined solely by the ruling authorities [79–81].

6 Big Tech is a name given to the largest information companies that dominate the information technology industry. These companies include Amazon, Apple, Facebook, Google, and Microsoft.

Similarly, the driver of the "VI-DAS car" could access information from the cockpit to track passengers or to reconstruct conversations. Using young drivers as an example, parents can use the information to track their children's every move [82], monitor conversations with peers, and increasingly control their children's everyday behaviour. Lastly, the insurance industry, as one key stakeholder of the VI-DAS consortium, can use the data about behavioural traits to financially penalise perceived abnormalities [7,83–85]. Thereby, we run the danger that motor insurers limit individual freedom [86]. All these negative side-effects can have major implications for the principles of human dignity and personal autonomy and can also violate the principle of justice [76,86]. In addition, we risk harmful power imbalances occurring with the misuse of the VI-DAS surveillance machine, which becomes particularly concerning when no other options for mobility exist [76].

However, by contrast to the potential downsides, the VI-DAS 720-degree observation does offer a set of positive expectations, including collision avoidance and increased safety. In that regard, Haidt characterised care (and harm) as one of the foundational principles for moral reasoning [87,88]. The ability to navigate safely and smoothly through road traffic is the ultimate desired performance outcome of the "VI-DAS car". Without integrating monitoring technologies like the VI-DAS 720-degree observation, a safe take-over and hand-over scenario seems unfeasible.

2.3. Recommendation

Based on the information displayed above, in the specific context of the 720-degree observation to optimise HMI and enhance the driver's safety, we argue in favour of the emerging technology as it maximises pleasure on a micro- and macrosocial level. However, taking the myriad of personal data into account, we need to make sure that an appropriate flow of personal information will be secured by the VI-DAS consortium itself and by corresponding legal regimes beyond the GDPR. For the VI-DAS consortium, one could think about creating a Code of Conduct (CoC) for the protection of the user's privacy. That agreement on binding rules of data usage behaviour for all stakeholders of the consortium must ensure that the generated data flows are solely used to enhance a safe HMI between the driver and the vehicle. In that regard, we think that it is fundamental that compliance with the CoC is regularly assessed by ethical groups that are independent from the VI-DAS consortium and mainly focused on the protection of the user's data. In addition, we should keep in mind that the CoC is a document containing important information not only for the management of a company but also for every single employee that may have access to sensitive user information. Moreover, a team responsible for privacy protection must ensure, through regular training events, that the CoC is widely communicated. Both a CoC for the socially acceptable use of personal information by the VI-DAS consortium and corresponding legal regimes could additionally be used to create a protective ethical layer for the VI-DAS dataset, as illustrated in Fig. 4. That layer should prevent data being seen as simply an asset that needs to be turned into value in terms of capital. Other experts [89,90] have discussed the configuration of a firewall to prevent the misuse of personal data streams. A firewall is a network security application that monitors incoming and outgoing flows of information and blocks specific data flows based on a set of security rules. In the context of the VI-DAS 720-degree observation scheme, that set of security rules can be defined by an independent, multi-disciplinary body that ensures an appropriate flow of personal information following Nissenbaum's framework of CI. With this multidisciplinary perspective, we do not lose sight of the point that notions of privacy, autonomy and individuality differ from culture to culture [23]. With this in mind, we should now think about future qualitative research projects that aim to determine the necessary set of security rules for personal data protection and that include a cross-cultural element.

In terms of policy recommendations, whilst accepting that privacy has always been a relative concept and that for generations there has been a trade-off between the general welfare and the privacy of the individual, there are still reasons to adopt something of a cautious approach. In terms of governance, within both OEMs and insurance companies (for example) there needs to be a risk governance process that throws light on these new practices and allows for their interrogation. Early in the VI-DAS project we came upon a research ethics issue of some import – that of accidental findings. In the case that the technology might identify a health issue with a driver of a prototype car, the question was how such data should be shared and the driver informed. In a commercial environment where we witness the coming together of OEMs and insurance companies to leverage data from level 3/4 vehicles, such issues will re-emerge on a much larger scale.
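The firewall idea discussed above, blocking data flows that fall outside a set of security rules, can be made concrete in a few lines. This is a hypothetical sketch, not part of VI-DAS: the rule entries, recipient names and purposes are invented, and a real rule set would be defined by the proposed independent multi-disciplinary body. The one design choice worth noting is default deny: any flow not explicitly permitted is blocked, which mirrors CI's demand that flows conform to stated norms.

```python
# Whitelisted (recipient, information type, purpose) triples; everything
# else is blocked. All entries here are illustrative assumptions.
ALLOW_RULES = {
    ("motor insurer", "behavioural traits", "risk assessment"),
    ("academic community", "anonymised scene data", "HMI research"),
    ("regulator", "compliance logs", "GDPR monitoring"),
}


def firewall_permits(recipient: str, info_type: str, purpose: str) -> bool:
    """Block any outgoing flow that is not explicitly whitelisted (default deny)."""
    return (recipient, info_type, purpose) in ALLOW_RULES


assert firewall_permits("regulator", "compliance logs", "GDPR monitoring")
# Re-purposing driver data for marketing is not on the whitelist, so it is blocked,
# even though the same recipient class may hold data for another purpose.
assert not firewall_permits("manufacturer", "behavioural traits", "targeted marketing")
```

Binding the rule to the purpose, not just the recipient, is what distinguishes this from a conventional network firewall: the same data may flow to the same party for one end and be blocked for another.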

Fig. 4. Illustration of a protective ethical layer that ensures that the flow of personal information is solely used to ensure and improve a safe hand-over and take-over between the human and the vehicle.

Policy makers need to get ahead of this, and a failure to do so could result in a decline in public trust and ultimately political problems around the societal acceptance of the rollout of these technologies. Hence, although a primary aim of projects such as VI-DAS is to reduce road accidents, and is hence beneficial for society, public perception and acceptance depend partly on how personal data is protected and processed. Accordingly, a reliable framework comprising ethical principles is emerging as an important precondition for the introduction of 720-degree observation.

3. Discussion and conclusion

This paper is based on the idea of 720-degree observation posited by the VI-DAS project. As the driver is constantly monitored by a camera located over the dashboard, in some respects there are echoes of the Benthamite paradigm. From there, it is a relatively short step to radical Foucauldian critiques that combine concerns about power and technologies of surveillance. Both traditional surveillance theorems allow for a deeper and more comprehensive insight into the dangers inherent in surveillance in the context of the VI-DAS 720-degree observation. Nevertheless, Benthamite and Foucauldian thinking appears too general and high-level to answer the question of who should have access to the recorded data and how the data should be used by stakeholders of the VI-DAS consortium.

As our paper suggests, there are more stakeholders than governments or state actors relevant to the VI-DAS consortium. In some ways, the insurance industry is a notable example of a disciplinary power in society utilising surveillance techniques in order to exercise control of human agents. However, the disciplining power of the insurance industry should not be perceived as purely negative. A favourable example can be seen in the use of telematics technologies that allow feedback to facilitate better driving. That is particularly relevant for the group of young novice drivers, who are more likely to be involved in a crash or near-crash event [91–93]. There are further advantages that should be pictured at this point: one could also consider the potential benefits for older drivers and those with mobility issues. Hence, monitoring technologies like the 720-degree observation system to optimise HMI are not simply the stuff of dystopian nightmares.

However, there are risks to be addressed and data flows to be governed. Benthamite and Foucauldian thinking is useful in a heuristic sense and provides a good starting point for interrogating the value of this emerging technology. However, we do need a way forward in terms of developing protocols for managing privacy and sensitive data to overcome inherent limitations. Nissenbaum's theorem of contextual integrity provides a deeper insight into the current debate on privacy aspects specific to HMI. The introduction of her contemporary decision heuristics and the incorporation of the prevalent context into the debates to evaluate the appropriateness of personal data flows allowed us to broaden the ethical discussion around 720-degree observation and to operationalise our understanding of privacy in a more context-specific fashion. Overall, Nissenbaum's CI moved our debate beyond a more binary discussion on privacy and surveillance and allowed us, to a certain degree, to escape the shadow of the panopticon. Most importantly, Nissenbaum allowed us to introduce structure and a degree of precision into our thinking on the matter of privacy, which represents a step forward in that regard. For this reason, we propose that researchers and practitioners utilise frameworks like those of Nissenbaum to generate a contextual understanding of the intrusion of privacy through present and future information technology. This may help us to determine socially acceptable and appropriate flows of information using a more user-centric perspective than previous philosophical approaches allowed. Asking whether a new practice is morally or politically superior (or inferior) might help mitigate some of the associated ethical problems. However, finally, we should keep in mind that the framework of CI does not present a magic bullet, nor indeed does Nissenbaum claim it to be such. We conclude with a quote from Van den Hoven, Lokhorst, and Van de Poel [94]: "Ethics can be the source of technological development rather than just a constraint and technological progress can create moral progress rather than just moral problems."

Credit author statement

Tim Jannusch: Conceptualization, Methodology, Formal analysis, Writing - Original Draft, Writing - Review & Editing, Visualization, Validation. Florian David-Spickermann: Writing - Review & Editing. Darren Shannon: Writing - Review & Editing, Visualization. Juliane Ressel: Writing - Review & Editing. Michaele Völler: Methodology, Writing - Review & Editing, Supervision. Finbarr Murphy: Writing - Review & Editing, Resources. Irini Furxhi: Writing - Review & Editing. Martin Cunneen: Writing - Review & Editing. Martin Mullins: Writing - Review & Editing, Supervision, Funding acquisition.

Funding

This work was supported by VI-DAS (Vision Inspired Driver Assistance Systems), a European Commission Horizon 2020 research consortium [grant number 690772].

References

[1] CORDIS, Vision Inspired Driver Assistance Systems | Projects | H2020, CORDIS | European Commission, Publication Office/CORDIS, 2019. Retrieved from https://cordis.europa.eu/project/rcn/204771/factsheet/en. (Accessed 14 June 2019).
[2] T. Bellet, M. Cunneen, M. Mullins, F. Murphy, F. Pütz, F. Spickermann, C. Braendle, M.F. Baumann, From semi to fully autonomous vehicles: new emerging risks and ethico-legal challenges for human-machine interactions, Transport. Res. F Traffic Psychol. Behav. 63 (2019) 153–164.
[3] D.J. Solove, The Digital Person: Technology and Privacy in the Information Age, NYU Press, 2004.
[4] D.J. Solove, A taxonomy of privacy, University of Pennsylvania Law Review 154 (2005).
[5] V. Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age, Princeton University Press, 2011.
[6] R. Mansell, U. Wehn, Knowledge Societies: Information Technology for Sustainable Development, Oxford University Press, 1998.
[7] S. Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, PublicAffairs, New York, 2019, p. 691.
[8] L. Sweeney, k-anonymity: a model for protecting privacy, Int. J. Uncertain. Fuzziness Knowledge-Based Syst. (2002).
[9] K. Vold, J. Whittlestone, Privacy, Autonomy, and Personalised Targeting: Rethinking How Personal Data Is Used, Data, Privacy and the Individual, 2019.
[10] J.E. Cohen, Examined lives: informational privacy and the subject as object, Stan. L. Rev. 52 (1999).
[11] A. Acquisti, L. Brandimarte, G. Loewenstein, Privacy and human behavior in the age of information, Science 347 (6221) (2015).
[12] L. Brandeis, S. Warren, The right to privacy, Harvard Law Review 4 (5) (1890).
[13] H.J. Smith, S.J. Milberg, S.J. Burke, Information privacy: measuring individuals' concerns about organizational practices, MIS Q. (1996) 167–196.
[14] M. Paterson, Automobile Politics: Ecology and Cultural Political Economy, Cambridge University Press, Cambridge, 2007, p. 271.
[15] O. Musicant, T. Lotan, Can novice drivers be motivated to use a smartphone based app that monitors their behavior? Transportation Research Part F: Traffic Psychology and Behaviour 42 (2016).
[16] A. Brunon-Ernst, Beyond Foucault: New Perspectives on Bentham's Panopticon, Routledge, London, 2016, p. 1196.
[17] M. Foucault, Discipline and Punish: The Birth of the Prison, Vintage Books, New York, NY, 1995, p. 333.
[18] J. Reiman, Driving to the panopticon: a philosophical exploration of the risks to
privacy posed by the highway technology of the future, Santa Clara High Zuboff [7] and others have identified some of the more macro-risks Technology Law Journal 11 (1) (1995) 27. [19] M.C. Behrent, Foucault and technology, Hist. Technol. 29 (1) (2013) 54–104, at play in this new paradigm of date sharing. The wider context is one https://doi.org/10.1080/07341512.2013.780351. in which large global companies are seeking to gather – without re­ [20] L. Dalibert, Posthumanism and somatechnologies: exploring the intimate relations striction – information on citizenry in a manner that has no precedent between humans and technologies, Enschede: Universiteit Twente (2014), https:// ’ doi.org/10.3990/1.9789036536516. historically. The last thesis of Nissenbaum s decision heuristic adapted [21] M. Foucault, M. Senellart, Security, territory, population: Lectures at the College` de to the information age provides a way to measure whether a new France, 1977 - 78, Palgrave Macmillan UK, Basingstoke, 2007, p. 2019, https://

10 T. Jannusch et al. Technology in Society 66 (2021) 101667

doi.org/10.1057/9780230245075, p. 417. Foucault Studies, https://rauli.cbs.dk/index.php/foucault-studies/index. (Accessed 14 June 2019).
[22] C.P. Holland, M. Mullins, M. Cunneen, Creating Ethics Guidelines for Artificial Intelligence (AI) and Big Data Analytics: the Case of the European Consumer Insurance Market, 2021. Available at: SSRN 3808207.
[23] S. Vallor, Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting, Oxford University Press, 2016.
[24] H. Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life, Stanford Law Books, an imprint of Stanford University Press, Stanford, California, 2010, p. 288.
[25] H. Nissenbaum, Contextual integrity up and down the data food chain, Theor. Inq. Law 20 (1) (2019) 221–256.
[26] M. Cuquet, A. Fensel, The societal impact of big data: a research roadmap for Europe, Technol. Soc. 54 (2018) 74–86.
[27] SAE International, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, Society of Automotive Engineers, 2016.
[28] J.M. Scanlon, K.D. Kusano, R. Sherony, H.C. Gabler, Potential safety benefits of lane departure warning and prevention systems in the US vehicle fleet, in: Paper Presented at the 24th International Technical Conference on the Enhanced Safety of Vehicles (ESV), National Highway Traffic Safety Administration, Gothenburg, Sweden, 2015.
[29] J.M. Scanlon, R. Sherony, H.C. Gabler, Injury mitigation estimates for an intersection driver assistance system in straight crossing path crashes in the United States, Traffic Inj. Prev. 18 (sup1) (2017) S9–S17.
[30] C. Ryan, F. Murphy, M. Mullins, Semiautonomous vehicle risk analysis: a telematics-based anomaly detection approach, Risk Anal. 39 (5) (2019) 1125–1140.
[31] R. McCall, F. McGee, A. Mirnig, A. Meschtscherjakov, N. Louveton, T. Engel, M. Tscheligi, A taxonomy of autonomous vehicle handover situations, Transport. Res. Pol. Pract. 124 (2019) 507–522.
[32] Bundestag, Achtes Gesetz zur Änderung des Straßenverkehrsgesetzes, 2017. Retrieved from https://www.bgbl.de/xaver/bgbl/start.xav?start=%2F%2F*%5B%40attr_id%3D%27bgbl117s1648.pdf%27%5D#__bgbl__%2F%2F*%5B%40attr_id%3D%27bgbl117s1648.pdf%27%5D__1516351777807.
[33] B. Sheehan, F. Murphy, C. Ryan, M. Mullins, H.Y. Liu, Semi-autonomous vehicle motor insurance: a Bayesian Network risk transfer approach, Transport. Res. C Emerg. Technol. 82 (2017) 124–137, https://doi.org/10.1016/j.trc.2017.06.015.
[34] J.L. Campbell, J.L. Brown, J.S. Graving, C.M. Richard, M.G. Lichty, T. Sanquist, J.L. Morgan, Human Factors Design Guidance for Driver-Vehicle Interfaces (Report No. DOT HS 812 360), National Highway Traffic Safety Administration, Washington, DC, December 2016.
[35] National Center for Statistics and Analysis, Distracted Driving in Fatal Crashes, 2017, Washington, DC, 2019.
[36] National Transportation Safety Board, Collision between a Car Operating with Automated Vehicle Control Systems and a Tractor-Semitrailer Truck Near Williston, Florida, May 7, 2016, Washington, DC, 2017.
[37] National Transportation Safety Board (NTSB), Preliminary Report Highway: HWY18FH011, 2018. Retrieved from https://ntsb.gov/investigations/AccidentReports/Reports/HWY18FH011-preliminary.pdf.
[38] National Transportation Safety Board (NTSB), Preliminary Report Highway: HWY18MH010, 2018. Retrieved from https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY18MH010-prelim.aspx.
[39] S. Trask, M. Stewart, T. Kerwin, S. Midlam-Mohler, Effectiveness of Warning Signals in Semi-autonomous Vehicles, SAE Technical Paper 2019-01-1013, 2019, https://doi.org/10.4271/2019-01-1013.
[40] R. van der Heiden, S.T. Iqbal, C.P. Janssen, Priming drivers before handover in semi-autonomous cars, in: Paper Presented at the Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2017.
[41] C. Purucker, F. Naujoks, A. Prill, A. Neukum, Evaluating distraction of in-vehicle information systems while driving by predicting total eyes-off-road times with keystroke level modeling, Appl. Ergon. 58 (2017) 543–554.
[42] K. Zeeb, A. Buchner, M. Schrauf, What determines the take-over time? An integrated model approach of driver take-over after automated driving, Accid. Anal. Prev. 78 (2015) 212–221.
[43] F. Vicente, Z. Huang, X. Xiong, F. De la Torre, W. Zhang, D. Levi, Driver gaze tracking and eyes off the road detection system, IEEE Trans. Intell. Transport. Syst. 16 (4) (2015) 2014–2027.
[44] I. Takahashi, T.T. Nguyen, H. Kanamori, T. Tanaka, S. Kato, Y. Ninomiya, H. Aoki, Automated safety vehicle stop system for cardiac emergencies, in: Paper Presented at the 2016 IEEE International Conference on Emerging Technologies and Innovative Business Practices for the Transformation of Societies (EmergiTech), 2016.
[45] R. Clarke, Information technology and dataveillance, Communications of the ACM 31 (5) (1988).
[46] D. Lyon, Everyday surveillance: personal data and social classifications, Information, Communication & Society (2002).
[47] M. Galič, T. Timan, B.-J. Koops, Bentham, Deleuze and beyond: an overview of surveillance theories from the panopticon to participation, Philos. Technol. 30 (1) (2017) 9–37, https://doi.org/10.1007/s13347-016-0219-1.
[48] J. Bentham, Panopticon, or, the Inspection-House: Containing the Idea of a New Principle of Construction Applicable to Any Sort of Establishment, in Which Persons of Any Description Are to Be Kept under Inspection: and in Particular to Penitentiary-Houses, Prisons, Houses of Industry … and Schools: with a Plan of Management Adapted to the Principle: in a Series of Letters, Written in the Year 1787, Reprinted and sold by T. Payne, London, 1791.
[49] J. Bentham, Outline of a Work Entitled Pauper Management Improved, 2 volumes, Elibron Classics, 2004 [Place of publication not identified].
[50] M.J. Smith, W.H. Burston, J.R. Dinwiddy, J. Bentham (Eds.), Chrestomathia, Clarendon Press, Oxford, 1993, p. 451.
[51] H.S. Sætra, Freedom under the gaze of Big Brother: preparing the grounds for a liberal defence of privacy in the era of Big Data, Technol. Soc. 58 (2019) 101160.
[52] S.H. Seog, Information Revolution and Insurance Ethos: Weber, Foucault, Deleuze and Beyond, 2020 (February 02, 2020).
[53] G. Marchant, "Soft Law" Governance of Artificial Intelligence, 2019.
[54] https://www.eiopa.europa.eu/sites/default/files/publications/annual_reports/annual-report-2019.pdf, 2019. (Accessed 8 July 2021).
[55] K. Young, M. Regan, Driver distraction: a review of the literature, in: I.J. Faulks, M. Regan, M. Stevenson, J. Brown, A. Porter, J.D. Irwin (Eds.), Distracted Driving, Australasian College of Road Safety, Sydney, NSW, 2007, pp. 379–405.
[56] I.G. Cohen, L.O. Gostin, D.J. Weitzner, Digital smartphone tracking for COVID-19: public health and civil liberties in tension, JAMA 323 (23) (2020) 2371–2372, https://doi.org/10.1001/jama.2020.8570.
[57] K. Iyengar, G.K. Upadhyaya, R. Vaishya, V. Jain, COVID-19 and applications of smartphone technology in the current pandemic, Diabetes & Metabolic Syndrome: Clin. Res. Rev. 14 (5) (2020) 733–737, https://doi.org/10.1016/j.dsx.2020.05.033.
[58] R. Klar, D. Lanzerath, The ethics of COVID-19 tracking apps – challenges and voluntariness, Research Ethics 16 (3–4) (2020) 1–9, https://doi.org/10.1177/1747016120943622.
[59] World Health Organization (WHO), Global Status Report on Road Safety 2018, 2019.
[60] N.J. Goodall, Machine ethics and automated vehicles, in: Road Vehicle Automation, Springer, 2014, pp. 93–102.
[61] K. Lipovac, M. Đerić, M. Tešić, Z. Andrić, B. Marić, Mobile phone use while driving – literary review, Transport. Res. F Traffic Psychol. Behav. 47 (2017) 132–142, https://doi.org/10.1016/j.trf.2017.04.015.
[62] EU General Data Protection Regulation (GDPR), Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46, Official Journal of the European Union (OJ) 59 (1–88) (2016) 294.
[63] V. Coroama, The smart tachograph – individual accounting of traffic costs and its implications, in: Paper Presented at the International Conference on Pervasive Computing, 2006.
[64] D.I. Tselentis, G. Yannis, E.I. Vlahogianni, Innovative insurance schemes: pay as/how you drive, Transportation Research Procedia 14 (2016) 362–371.
[65] A. Shaout, D. Colella, S. Awad, Advanced driver assistance systems – past, present and future, in: Paper Presented at the 2011 Seventh International Computer Engineering Conference (ICENCO'2011), 2011.
[66] D. Farny, Versicherungsbetriebslehre (5. Auflage), Verlag Versicherungswirtschaft GmbH, Karlsruhe, 2011.
[67] M. Haller, Gesellschaft als Risiko? Zur Rolle der Versicherer in der gesellschaftlichen Risikodebatte, in: Gesamtverband der deutschen Versicherungswirtschaft: Wieviel Risiko braucht die Gesellschaft, 1998, pp. 221–266.
[68] R.L. Carter, Handbook of Insurance, Kluwer-Harrap, Isleworth, 1973.
[69] R.V. Ericson, A. Doyle, Uncertain Business: Risk, Insurance and the Limits of Knowledge, University of Toronto Press, 2004, https://doi.org/10.3138/9781442682849.
[70] S. Shaikh, P. Krishnan, A framework for analysing driver interactions with semi-autonomous vehicles, Electron. Proc. Theor. Comput. Sci. 105 (2013) 85–99 (Proc. FTSCS 2012).
[71] P. Koopman, M. Wagner, Challenges in autonomous vehicle testing and validation, SAE Int. J. Transport. Saf. 4 (1) (2016) 15–24.
[72] B. Sheehan, et al., Connected and autonomous vehicles: a cyber-risk classification framework, Transport. Res. Pol. Pract. 124 (2019) 523–536.
[73] N. Liu, A. Nikitas, S. Parkinson, Exploring expert perceptions about the cyber security and privacy of Connected and Autonomous Vehicles: a thematic analysis approach, Transport. Res. F Traffic Psychol. Behav. 75 (2020) 66–86.
[74] S. Parkinson, P. Ward, K. Wilson, J. Miller, Cyber threats facing autonomous and connected vehicles: future challenges, IEEE Trans. Intell. Transport. Syst. 18 (11) (2017) 2898–2915.
[75] M. Camillo, Cyber risk and the changing role of insurance, J. Cyber Pol. 2 (1) (2017) 53–63.
[76] J.-F. Bonnefon, D. Černý, J. Danaher, N. Devillier, V. Johansson, T. Kovacikova, N. Reed, Ethics of Connected and Automated Vehicles: Recommendations on Road Safety, Privacy, Fairness, Explainability and Responsibility, 2020.
[77] A.F. Westin, Privacy and Freedom, Bodley Head, London, 1967.
[78] D.J. Fagnant, K. Kockelman, Preparing a nation for autonomous vehicles: opportunities, barriers and policy recommendations, Transport. Res. Pol. Pract. 77 (2015) 167–181.
[79] M. Chorzempa, P. Triolo, S. Sacks, China's social credit system: a mark of progress or a threat to privacy? Policy Briefs PB18-14 (2018).
[80] S. Engelmann, M. Chen, F. Fischer, C. Kao, J. Grossklags, Clear Sanctions, Vague Rewards: How China's Social Credit System Currently Defines "Good" and "Bad" Behavior, Proceedings of the Conference on Fairness, Accountability, and Transparency (2019).
[81] G. Kostka, China's social credit systems and public opinion: explaining high levels of approval, New Media Soc. 21 (7) (2019).
[82] T. Fotel, T.U. Thomsen, The surveillance of children's mobility, Surveill. Soc. 1 (4) (2003).


[83] P. Händel, I. Skog, J. Wahlström, F. Bonawiede, R. Welch, J. Ohlsson, M. Ohlsson, Insurance telematics: opportunities and challenges with the smartphone solution, IEEE Intell. Transport. Syst. Mag. 6 (4) (2014) 57–70, https://doi.org/10.1109/MITS.2014.2343262.
[84] S. Husnjak, D. Peraković, I. Forenbacher, M. Mumdziev, Telematics system in usage based motor insurance, Procedia Eng. 100 (2015) 816–825, https://doi.org/10.1016/j.proeng.2015.01.436.
[85] J. Lemaire, Bonus-Malus Systems in Automobile Insurance, vol. 19, Springer Science & Business Media, 2012.
[86] F. Ewald, Insurance and risk, in: The Foucault Effect: Studies in Governmentality, 1991, pp. 197–210.
[87] J. Haidt, J. Graham, C. Joseph, Above and below left–right: ideological narratives and moral foundations, Psychol. Inq. 20 (2–3) (2009) 110–119.
[88] J. Haidt, C. Joseph, Intuitive ethics: how innately prepared intuitions generate culturally variable virtues, Daedalus 133 (4) (2004) 55–66.
[89] P. Van Tri, Firewall Configuration for Privacy, 2015.
[90] J. Vincent, C. Porquet, M. Borsali, H. Leboulanger, Privacy protection for smartphones: an ontology-based firewall, in: Paper Presented at the IFIP International Workshop on Information Security Theory and Practices, 2011.
[91] K.H. Beck, S. Watters, Characteristics of college students who text while driving: do their perceptions of a significant other influence their decisions? Transport. Res. F Traffic Psychol. Behav. 37 (2016) 119–128, https://doi.org/10.1016/j.trf.2015.12.017.
[92] K.R. Ginsburg, F.K. Winston, T.M. Senserrick, F. García-España, S. Kinsman, D.A. Quistberg, J.G. Ross, M.R. Elliott, National young-driver survey: teen perspective and experience with factors that affect driving safety, Pediatrics 121 (5) (2008), https://doi.org/10.1542/peds.2007-2595.
[93] L. Gong, W.D. Fan, Modeling single-vehicle run-off-road crash severity in rural areas: accounting for unobserved heterogeneity and age difference, Accid. Anal. Prev. 101 (2017) 124–134, https://doi.org/10.1016/j.aap.2017.02.014.
[94] J. Van den Hoven, G.-J. Lokhorst, I. Van de Poel, Engineering and the problem of moral overload, Sci. Eng. Ethics 18 (1) (2012) 143–155.
