Trusting in Machines: How Mode of Interaction Affects Willingness to Share Personal Information with Machines
Proceedings of the 51st Hawaii International Conference on System Sciences | 2018

Juliana Schroeder, Matthew Schroeder
University of California, Berkeley
[email protected], [email protected]

URI: http://hdl.handle.net/10125/49948 | Page 472 | ISBN: 978-0-9981331-1-9 | (CC BY-NC-ND 4.0)

Abstract

Every day, people make decisions about whether to trust machines with their personal information, such as letting a phone track one's location. How do people decide whether to trust a machine? In a field experiment, we tested how two modes of interaction—expression modality, whether the person is talking or typing to a machine, and response modality, whether the machine is talking or typing back—influence the willingness to trust a machine. Based on research that expressing oneself verbally reduces self-control compared to nonverbal expression, we predicted that talking to a machine might make people more willing to share their personal information. Based on research on the link between anthropomorphism and trust, we further predicted that machines who talked (versus texted) would seem more human-like and be trusted more. Using a popular chatterbot phone application, we randomly assigned over 300 community members to either talk or type to the phone, which either talked or typed in return. We then measured how much participants anthropomorphized the machine and their willingness to share their personal information (e.g., their location, credit card information) with it. Results revealed that talking made people more willing to share their personal information than texting, and this was robust to participants' self-reported comfort with technology, age, gender, and conversation characteristics. But listening to the application's voice did not affect anthropomorphism or trust compared to reading its text. We conclude by considering the theoretical and practical implications of this experiment for understanding how people trust machines.

1. Introduction

Every day, people make decisions about whether to trust machines with their personal information. From entering one's credit card number into a company's website to allowing a phone to track one's location, these decisions require trusting machines with personal, and potentially sensitive, information. How do people decide whether to trust a machine? We explore how the modality by which people interact with machines can affect how much they are willing to trust them with personal information. Specifically, we consider two criteria—whether the user is typing or talking to the machine (i.e., expression modality) and whether the machine is typing or talking back (i.e., response modality).

We draw from two primary findings across the diverse fields of cognition, neuroscience, and social psychology to form predictions about the effect of expression and response modality on machine trust. First, expression modality should primarily affect the user's cognitive state. Indeed, research on expression modality suggests that verbal (versus nonverbal or physical) modes of expression can reduce self-control behavior [1-3]. For instance, verbally expressing one's choice (i.e., speaking) increases heuristic decision-making and indulgence, thereby reducing self-control, compared to physically expressing one's choice (e.g., button pressing, pointing, typing) for identical self-control dilemmas [1]. As such, we expect that having a spoken conversation with a machine, as opposed to a typed conversation, may make users more likely to give up personal information, failing to exert control over their information.

Second, response modality should primarily affect the user's perception of the machine. A machine that can create speech should be judged as more human-like than a machine that creates text. One set of experiments illustrated this directly: when participants read a piece of text that had been created by either a human or a machine, they were less likely to believe the text had been written by a human than those who heard the same text spoken aloud [4]. Furthermore, anthropomorphizing a machine by assuming it is more humanlike (e.g., seems more rational, competent, thoughtful, and even emotional) may increase trust. For example, self-driving cars with human voices seem more human-like and are trusted more by users [5]. These data lead us to predict that users will trust talking machines more than texting machines.

However, there are at least two important caveats that may exist in the relationship between response modality and trust. First, anthropomorphism is unlikely to always lead to trust. For instance, users feel threatened by machines that seem too intelligent [6]. Therefore, the level of machine competence, and whether or not the machine seems threatening, may matter. Second, the quality of the voice is also likely to matter when evoking anthropomorphism. Prior research suggests that only humanlike speech, with voices that naturalistically vary in pitch, amplitude, and rate of speech, can increase perceptions of humanization [4, 7]. In contrast, more monotone and robotic voices may be judged no differently from text.

In a field experiment, we test the effect of expression modality and response modality on trust in machines. We predict two main effects: that talking to a machine, and being talked to, will increase trust. It is also possible that these two variables could interact. For example, the effect of response modality might be larger when the user is talking to the machine than typing to the machine, because it feels more like a real conversation when both agents talk to one another. We therefore tested for interactions in addition to main effects.

1.1. Trust in Machines

Trust is an essential ingredient in social interaction that influences decisions about how people will behave toward others in personal and organizational contexts. For example, having trust improves the stability of economic and political exchange [8], reduces transaction costs [9], facilitates cooperation [10], and helps firms and individuals manage risk [11]. Conversely, trust violations can harm cooperation and bargaining outcomes [12, 13], lower organizational commitment [14], provoke retaliation [15], and even trigger organizational-level failures [16]. Golembiewski and McConkie [17, p. 131] argued that, "There is no single variable which so thoroughly influences interpersonal and group behavior as does trust."

Extending from this literature, trust is not just a critical predictor of how humans behave toward other humans, but also of how humans behave toward machines. It has particular security implications, whereby humans may become vulnerable to machine attacks if they mistakenly put their trust in machines. Consistent with prior research, we define trust as "a psychological state comprising the intention to accept vulnerability based upon positive expectations of the intentions or behavior of another" [18, p. 395]. By this definition, the decision to trust another agent is contingent on two aspects: the person's own psychological state and the person's expectations about the agent's intentions. The

research suggests just the opposite. A prominent example is the Stroop task [19], a classic self-control task whereby participants are presented with color words (e.g., blue, green) printed in the opposite colored ink (e.g., the word "blue" printed in green ink). The participant's task is to verbalize the ink color, overriding the automatic tendency to verbalize the word itself. Interestingly, when participants enter their response manually, via a keystroke, the Stroop effect is smaller than when they speak the colors aloud [2]. This effect persists even with practice [20]. Consistent with these findings, there tends to be greater activation in an area of the brain associated with identifying self-control conflicts, the cognitive/dorsal area of the anterior cingulate cortex, during manual response to the Stroop task than during oral response [3, 21].

One recent set of eighteen experiments tested the effect of expression modality on decision-making among consumers making choices relevant to self-control (e.g., between an apple or candy) [1]. These experiments manipulated whether the participant spoke to indicate their choice, compared to non-verbal preference modalities such as clicking, button-pressing, pointing, tapping, or writing. Across the set of studies, speaking tended to result in the more indulgent choice. One possible reason for these findings is that speaking triggers a heuristic mindset, whereby people rely more on their intuitive preferences.

If spoken interaction elicits greater behavioral disinhibition than text-based interaction, as the aforementioned literature suggests, this may have implications for a user's willingness to trust a machine. Indeed, prior research among humans demonstrates that reduced inhibition increases social disclosure [e.g., due to alcohol consumption, 22, or visual anonymity, 23]. In other words, people who feel more disinhibited might also be more likely to disclose to their interaction partner. Moreover, this effect could even be cyclical: greater disclosure can lead to greater liking, which further increases disclosure [24]. In sum, the prior research on expression modality, self-control, and disclosure leads us to predict that talking to a machine might lead a user to be more likely to share personal information with it, compared to typing to the machine.