Micro-Expression Extraction for Lie Detection Using Eulerian Video (Motion and Color) Magnification

Total Pages: 16

File Type: PDF, Size: 1020 KB

Micro-Expression Extraction for Lie Detection Using Eulerian Video (Motion and Color) Magnification

Master Thesis, Electrical Engineering

Submitted by: Gautam Krishna Chavali, Sai Kumar N V Bhavaraju, Tushal Adusumilli, Venu Gopal Puripanda

This thesis is presented as part of the Degree of Master of Science in Electrical Engineering.

BLEKINGE INSTITUTE OF TECHNOLOGY, AUGUST 2014
Supervisor: Muhammad Shahid
Examiner: Dr. Benny Lövström
Department of Applied Signal Processing, Blekinge Institute of Technology, SE-371 79 Karlskrona, Sweden.

This thesis is submitted to the Department of Applied Signal Processing at Blekinge Institute of Technology in partial fulfilment of the requirements for the degree of Master of Science in Electrical Engineering with emphasis on Signal Processing.

Contact Information

Authors:
Gautam Krishna Chavali, E-mail: [email protected]
Sai Kumar N V Bhavaraju, E-mail: [email protected]
Tushal Adusumilli, E-mail: [email protected]
Venu Gopal Puripanda, E-mail: [email protected]

University Advisor: Mr. Muhammad Shahid, Department of Applied Signal Processing, Blekinge Institute of Technology. E-mail: [email protected], Phone: +46 (0)455-385746

University Examiner: Dr. Benny Lövström, Department of Applied Signal Processing, Blekinge Institute of Technology. E-mail: [email protected], Phone: +46 (0)455-38704

School of Electrical Engineering, Blekinge Institute of Technology, SE-371 79 Karlskrona, Sweden. Internet: www.bth.se/ing, Phone: +46 455 38 50 00

Abstract

Lie detection has been an evergreen and evolving subject. Polygraph techniques have been the most popular and successful to date. The main drawback of the polygraph is that good results cannot be attained without maintaining physical contact with the subject under test. In general, this physical contact induces extra consciousness in the subject, and any kind of arousal in the subject can trigger false positives in traditional polygraph-based tests. These drawbacks, together with rapid developments in computer vision and artificial intelligence and ever faster algorithms, have pushed the search for contemporary methods of lie detection.

Observing the facial expressions of emotion in a person without any physical contact, and implementing these techniques using artificial intelligence, is one such method. The concept of magnifying a micro-expression and trying to decipher it is still immature at this stage but is expected to evolve. Eulerian Video Magnification (EVM) has been proposed recently, and extracting micro-expressions from EVM-magnified video using Histogram of Oriented Gradients (HOG) features is a new idea. HOG is a feature-extraction algorithm that captures local gradient information in an image. To date, HOG features have mostly been used in conjunction with SVMs, generally for person/pedestrian detection. This thesis introduces a simple, modern method that applies EVM, HOG features and a back-propagation neural network jointly to extract and decipher micro-expressions on the face. Micro-expressions normally go unnoticed because of their involuntary nature, but EVM magnifies them and makes them noticeable. The emotions behind the micro-expressions are then extracted and recognized using HOG features and a back-propagation neural network.
One of the important aspects that has to be dealt with is the human tendency toward a biased mind. Since an investigator is also human and has to deal with his own assumptions and emotions, a neural network is used to give the investigator an unbiased start in identifying the true emotion behind every micro-expression. On the whole, the proposed system is not a lie detector; it helps in detecting the emotions of the subject under test, and a lie can then be uncovered by further investigation. (A minimal illustrative sketch of the HOG-and-neural-network recognition stage appears after the contents listing below.)

Keywords: Micro-Expressions, Emotions, Eulerian Video Magnification, Histogram of Oriented Gradients, Viola-Jones Algorithm, Artificial Neural Network.

Acknowledgments

We would like to express gratitude to our supervisor Mr. Muhammad Shahid for introducing us to the topic and for the heartfelt support and encouragement in this pursuit of our masters. Furthermore, we would like to thank Dr. Benny Lövström for his useful comments and remarks throughout the learning process of this master thesis. The Department of Applied Signal Processing has provided the support and equipment needed to produce and complete our thesis. In our daily work we have been blessed with a friendly and cheerful group of fellow students (PVK Chaitanya and V Revanth); we will forever be grateful for all your love and help. Finally, we thank our family members and relatives for supporting us throughout our studies at the university and helping us to move across the seas, and for providing a second home here in which we could complete our writing up.

Contents

Abstract
Acknowledgments
List of Abbreviations

1 Introduction
  1.1 Objectives and Scope of Work
    1.1.1 Pre-requisites for the Methodology
  1.2 Research Questions
  1.3 The Method
  1.4 Block Diagram
  1.5 Overview of Thesis

2 Lies, Expressions and Emotions
  2.1 Lie
  2.2 Facial Expressions
  2.3 Emotions
    2.3.1 Anger
    2.3.2 Disgust
    2.3.3 Fear
    2.3.4 Happy
    2.3.5 Sadness
    2.3.6 Surprise
    2.3.7 Contempt
  2.4 Micro Expressions

3 Eulerian Video Magnification
  3.1 Conceptual Learning about Pyramids
    3.1.1 Gaussian Pyramid
    3.1.2 Laplacian Pyramids
  3.2 Video Specifications and Prerequisites of EVM
    3.2.1 Standard One-to-One Interview Setup Format
  3.3 Eulerian Video Magnification
    3.3.1 Motion Magnification
    3.3.2 Color Magnification

4 Face Recognition and Feature Extraction
  4.1 Conceptual Learning about the Viola-Jones Algorithm
    4.1.1 Haar-Like Features
    4.1.2 Integral Image
    4.1.3 AdaBoost
    4.1.4 Cascade of Classifiers
  4.2 Recognition of Face Using the Viola-Jones Algorithm
  4.3 Conceptual Learning about HOG Features
    4.3.1 Gradient Computation
    4.3.2 Orientation Binning
    4.3.3 Descriptor Blocks
    4.3.4 Block Normalization
  4.4 Feature Extraction Using HOG Features

5 Database of Images
  5.1 Cohn-Kanade Image Database
  5.2 Extended Cohn-Kanade Image Database
  5.3 Discussion about Emotion Labels in the Extended Cohn-Kanade Database
    5.3.1 Advantages of Using ANN over Multi-Class SVM
  5.4 Working with Images of the Extended Cohn-Kanade Database

6 Neural Network and Pulse Extraction
  6.1 Artificial Neural Network Using the Back-Propagation Algorithm
    6.1.1 Training Phase
    6.1.2 Testing Phase
  6.2 Pulse Extraction
  6.3 Outputs Rendered as Moving Graphs

7 Graphical User Interface
  7.1 Working on the GUI
  7.2 Operational Flow of the GUI

8 Results
  8.1 Performance
    8.1.1 System Specifications
    8.1.2 Input Specification
    8.1.3 Time Duration
  8.2 Validating the Neural Network Results with the Results of the Database
  8.3 Reasons for Not Designing and Performing an Experiment
  8.4 Verifying the Motion Magnification of the Proposed Design
    8.4.1 Anger
    8.4.2 Disgust - DF1di
    8.4.3 Fear - DM2fe
    8.4.4 Happy - DF1ha
    8.4.5 Sad - DF1sa
    8.4.6 Surprise - DF1su
  8.5 Verifying Both Motion and Color Magnification of the Proposed Design
    8.5.1 Results for Angry Emotion Using Motion Magnification of Subject 1
    8.5.2 Results for Angry Emotion Using Color Magnification of Subject 1
    8.5.3 Results for Various Emotions Using Motion Magnification of Subject 2
    8.5.4 Results for Various Emotions without Motion Magnification of Subject 2
    8.5.5 Results for Various Emotions Using Color Magnification of Subject 2

9 Conclusion and Future Works
  9.1 Conclusion
  9.2 Future Works

Bibliography

List of Figures

1.1 Block diagram representing methodology
2.1 Macro facial expression of a subject
2.2 Angry - facial emotion
2.3 Disgust - facial emotion
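As a concrete illustration of the recognition stage described in the abstract, the following minimal Python sketch computes HOG descriptors on grayscale face crops and feeds them to a back-propagation neural network. It is not the thesis implementation: scikit-image's hog and scikit-learn's MLPClassifier stand in for the HOG extractor and back-propagation network, and the label set, image size and network size are assumptions.

```python
# Minimal sketch (not the thesis code): HOG descriptors on grayscale face crops
# feeding a back-propagation neural network, using scikit-image and scikit-learn.
# The label set, image size and network size are assumptions for illustration.
import numpy as np
from skimage.feature import hog
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["anger", "disgust", "fear", "happy", "sadness", "surprise", "contempt"]

def hog_descriptor(face_gray):
    """HOG descriptor of a grayscale face crop (assumed to be e.g. 128x128)."""
    return hog(face_gray,
               orientations=9,          # 9 orientation bins per cell
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               block_norm="L2-Hys")     # block normalization

def train_emotion_net(face_crops, labels):
    """Train a back-propagation network (MLP) on HOG descriptors.

    `labels` are strings drawn from EMOTIONS; `face_crops` are 2-D arrays,
    for example faces found by a face detector in magnified frames.
    """
    X = np.array([hog_descriptor(img) for img in face_crops])
    net = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500, random_state=0)
    net.fit(X, labels)
    return net

def predict_emotion(net, face_gray):
    """Predict the emotion behind a (magnified) micro-expression frame."""
    return net.predict([hog_descriptor(face_gray)])[0]
```

In the thesis pipeline, the face crops would come from Viola-Jones detection applied to EVM-magnified frames; this sketch covers only the HOG-and-neural-network recognition step.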
Recommended publications
  • Neural Networks for Emotion Classification
    Neural Networks for Emotion Classification. For obtaining the master degree in Computer Science at the Leiden Institute of Advanced Computer Science, under the supervision of Dr. Michael S. Lew and Dr. Nicu Sebe, by Yafei Sun, August 2003.

    ACKNOWLEDGMENTS. I would like to express my heartfelt gratitude to Dr. Michael S. Lew for his support, invaluable guidance, time, and encouragement. I'd like to express my sincere appreciation to Dr. Nicu Sebe for his countless ideas and advice, his encouragement and sharing his valuable knowledge with me. Thanks to Ernst Lindoorn and Jelle Westbroek for their help on the experiments setup. Thanks to all the friends who participated in the construction of the authentic emotion database. I would also like to thank Dr. Andre Deutz, Miss Riet Derogee and Miss Rachel Van Der Waal for helping me out with a problem on the scheduling of the oral presentation of my master thesis. Finally, a special word of thanks to Dr. Erwin Bakker for attending my graduation exam ceremony.

    Table of Contents: Abstract; 1 Introduction; 2 Background Research in Emotion Recognition; 2.1 An Ideal System for Facial Expression Recognition: Current Problems; 2.2 Face Detection and Feature Extraction ...
  • The Future of Neuroimaged Lie Detection and the Law
    The University of Akron, IdeaExchange@UAkron, Akron Law Review, Akron Law Journals, June 2015. The Future of Neuroimaged Lie Detection and the Law, Joelle Anne Moreno. Follow this and additional works at: http://ideaexchange.uakron.edu/akronlawreview. Part of the Science and Technology Law Commons. Recommended Citation: Moreno, Joelle Anne (2009) "The Future of Neuroimaged Lie Detection and the Law," Akron Law Review: Vol. 42, Iss. 3, Article 3. Available at: http://ideaexchange.uakron.edu/akronlawreview/vol42/iss3/3. This Article is brought to you for free and open access by Akron Law Journals at IdeaExchange@UAkron, the institutional repository of The University of Akron in Akron, Ohio, USA. It has been accepted for inclusion in Akron Law Review by an authorized administrator of IdeaExchange@UAkron. For more information, please contact [email protected], [email protected].

    THE FUTURE OF NEUROIMAGED LIE DETECTION AND THE LAW, Joëlle Anne Moreno. Outline: I. How Should Law Prepare to Respond to Cognitive Neuroscience? A. The Potential Influence of Cognitive Neuroscience on Law; B. Neuroimages in Court; C. Cognitive Neuroscience Evidence in Court; C. Extra-Legal Uses of Cognitive Neuroscience. II. How Should Law (and Law Professors) Respond to Cognitive Neuroscience? A. Understanding the Value and Limits of Cognitive Neuroscience ...
  • Truth Be Told
    Truth Be Told . . . (washingtonpost.com). Sunday, November 25, 2007; Page N06.

    In the arena of lie detecting, it's important to remember that no emotion tells you its source. Think of Othello suffocating Desdemona because he interpreted her tears (over Cassio's death) as the reaction of an adulterer, not a friend. Othello made an assumption and killed his wife. Oops. The moral of the story: Just because someone exhibits the behavior of a liar does not make them one. "There isn't a silver bullet," psychologist Paul Ekman says. "That's because we don't have Pinocchio's nose. There is nothing in demeanor that is specific to a lie." Nevertheless, there are indicators that should prompt you to make further inquiries that may lead to discovering a lie. Below is a template for taking the first steps, gleaned from interviews with Ekman and several other experts, as well as their works. They are psychology professors Maureen O'Sullivan (University of San Francisco), Robert Feldman (University of Massachusetts) and Bella DePaulo (University of California at Santa Barbara); communication professor Mark Frank (University at Buffalo); and body language trainer Janine Driver (a.k.a. the Lyin' Tamer). How to Detect a Lie ...

    http://www.washingtonpost.com/wp-dyn/content/article/2007/11/21/AR2007112102027...
  • Ekman, Emotional Expression, and the Art of Empirical Epiphany
    Journal of Research in Personality 38 (2004) 37–44, www.elsevier.com/locate/jrp. Ekman, emotional expression, and the art of empirical epiphany. Dacher Keltner, Department of Psychology, University of California, Berkeley, 3319 Tolman, 94720 Berkeley, CA, USA.

    Introduction. In the mid and late 1960s, Paul Ekman offered a variety of bold assertions, some seemingly more radical today than others (Ekman, 1984, 1992, 1993). Emotions are expressed in a limited number of particular facial expressions. These expressions are universal and evolved. Facial expressions of emotion are remarkably brief, typically lasting 1 to 5 s. And germane to the interests of the present article, these brief facial expressions of emotion reveal a great deal about people's lives. In the present article I will present evidence that supports this last notion advanced by Ekman, that brief expressions of emotion reveal important things about the individual's life course. To do so I first theorize about how individual differences in emotion shape the life context. With this reasoning as backdrop, I then review four kinds of evidence that indicate that facial expression is revealing of the life that the individual has led and is likely to continue leading.

    Individual differences in emotion and the shaping of the life context. People, as a function of their personality or psychological disorder, create the situations in which they act (e.g., Buss, 1987). Individuals selectively attend to certain features of complex situations, thus endowing contexts with idiosyncratic meaning. Individuals evoke responses in others, thus shaping the shared, social meaning of the situation.
  • E Motions in Process Barbara Rauch
    e_motions in process. Barbara Rauch. Abstract: This research project maps virtual emotions. Rauch uses 3D-surface capturing devices to scan facial expressions in (stuffed) animals and humans, which she then sculpts with the Phantom Arm / SensAble FreeForm device in 3D virtual space. The results are rapidform-printed objects and 3D animations of morphing faces and gestures. Building on her research into consciousness studies and emotions, she has developed a new artwork to reveal characteristic aspects of human emotions (i.e. laughing, crying, frowning, sneering, etc.), which utilises new technology, in particular digital scanning devices and special effects animation software. The proposal is to use a 3D high-resolution laser scanner to capture animal faces and, using the data of these faces, animate and then combine them with human emotional facial expressions. The morphing of the human and animal facial data is not merely a layering of the different scans; by applying an algorithmic programme to the data, crucial landmarks in the animal face are merged in order to match those of the human. The results are morphings of the physical characteristics of animals with the emotional characteristics of the human face in 3D. The focus of this interdisciplinary research project is a collaborative practice that brings together researchers from UCL in London and researchers at OCAD University's data and information visualization lab. Rauch uses Darwin's metatheory of the continuity of species and other theories on evolution and internal physiology (Ekman et al.) in order to re-examine previous and new theories with the use of new technologies, including the SensAble FreeForm Device, which, as an interface, allows for haptic feedback from digital data. Keywords: interdisciplinary research, 3D-surface capturing, animated facial expressions, evolution of emotions and feelings, technologically transformed realities.
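The landmark-merging step described above can be pictured with a small, purely illustrative sketch: corresponding 3D landmarks from an animal scan and a human scan are blended by a weighting factor. This is an assumption about how such a morph could be computed, not Rauch's actual algorithm.

```python
# Illustrative sketch only (assumption, not the project's actual pipeline):
# blending corresponding 3D facial landmarks from an animal scan and a human scan.
import numpy as np

def morph_landmarks(animal_pts, human_pts, alpha=0.5):
    """Linearly blend two (N, 3) arrays of corresponding 3D landmarks.

    alpha = 0 returns the animal geometry, alpha = 1 the human geometry;
    intermediate values give the in-between shapes used for a morph animation.
    """
    animal_pts = np.asarray(animal_pts, dtype=float)
    human_pts = np.asarray(human_pts, dtype=float)
    assert animal_pts.shape == human_pts.shape, "landmark sets must correspond"
    return (1.0 - alpha) * animal_pts + alpha * human_pts

# Example with five hypothetical landmarks, morphed halfway.
animal = np.random.rand(5, 3)
human = np.random.rand(5, 3)
halfway = morph_landmarks(animal, human, alpha=0.5)
```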
  • Machine Learning-Based Lie Detector Applied to a Novel Annotated Game Dataset Nuria Rodriguez Diaz, Decky Aspandi, Federico Sukno, and Xavier Binefa
    JOURNAL OF LATEX CLASS FILES, VOL. 14, NO. 8, AUGUST 2015. Machine Learning-based Lie Detector applied to a Novel Annotated Game Dataset. Nuria Rodriguez Diaz, Decky Aspandi, Federico Sukno, and Xavier Binefa.

    Abstract—Lie detection is considered a concern for everyone in their day to day life given its impact on human interactions. Thus, people normally pay attention to both what their interlocutors are saying and also to their visual appearances, including faces, to try to find any signs that indicate whether the person is telling the truth or not. While automatic lie detection may help us to understand these lying characteristics, current systems are still fairly limited, partly due to lack of adequate datasets to evaluate their performance in realistic scenarios. In this work, we have collected an annotated dataset of facial images, comprising both 2D and 3D information of several participants during a card game that encourages players to lie. Using our collected dataset, we evaluated several types of machine learning-based ...

    ... the periorbital- [6] or perinasal-region [7] in thermal imagery, which cannot be perceived by human vision. One of the crucial aspects to appropriately address lie-detection research is the availability of adequate datasets. The acquisition of training and, especially, evaluation material for lie detection is a challenging task, particularly regarding the necessity to gather ground truth, namely, to know whether a person is lying or not. The main difficulty arises because such knowledge is not useful if the scenario is naively simulated (e.g. it is not sufficient to instruct a person to simply tell a lie).
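As an illustration of the kind of machine learning-based evaluation the excerpt describes, the sketch below trains and scores a binary lie/truth classifier on pre-computed facial feature vectors. It is hypothetical: the placeholder data, the feature representation and the SVM choice are assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: binary lie/truth classification on facial feature vectors.
# The features (e.g. landmarks or texture descriptors per game round) are assumed
# to be precomputed; this is not the pipeline from the paper above.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))        # 200 game rounds, 64-dim facial features (placeholder data)
y = rng.integers(0, 2, size=200)      # ground-truth labels: 1 = lie, 0 = truth

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation accuracy
print(f"mean accuracy: {scores.mean():.3f}")
```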
  • Facial Expression and Emotion Paul Ekman
    1992 Award Addresses. Facial Expression and Emotion. Paul Ekman.

    Cross-cultural research on facial expression and the developments of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed: What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion?

    In 1965 when I began to study facial expression, few thought there was much to be learned. Goldstein (1981) pointed out that a number of famous psychologists—F. and G. Allport, Brunswik, Hull, Lindzey, Maslow, Osgood, Titchener—did only one facial study, which was not what earned them their reputations.

    We found evidence of universality in spontaneous expressions and in expressions that were deliberately posed. We postulated display rules—culture-specific prescriptions about who can show which emotions, to whom, and when—to explain how cultural differences may conceal universals in expression, and in an experiment we showed how that could occur. In the last five years, there have been a few challenges to the evidence of universals, particularly from anthropologists (see review by Lutz & White, 1986). There is, however, no quantitative data to support the claim that expressions are culture specific. The accounts are more anecdotal, without control for the possibility of observer bias and without evidence of interobserver reliability. There have been recent challenges also from psychologists (J. A. Russell, personal communication, June 1992) who study how words are used to judge photographs of facial ...
  • Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
    Symmetry (Article). Emotion Classification Based on Biophysical Signals and Machine Learning Techniques. Oana Bălan 1,*, Gabriela Moise 2, Livia Petrescu 3, Alin Moldoveanu 1, Marius Leordeanu 1 and Florica Moldoveanu 1. 1 Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Bucharest 060042, Romania; [email protected] (A.M.); [email protected] (M.L.); fl[email protected] (F.M.). 2 Department of Computer Science, Information Technology, Mathematics and Physics (ITIMF), Petroleum-Gas University of Ploiesti, Ploiesti 100680, Romania; [email protected]. 3 Faculty of Biology, University of Bucharest, Bucharest 030014, Romania; [email protected]. * Correspondence: [email protected]; Tel.: +40722276571. Received: 12 November 2019; Accepted: 18 December 2019; Published: 20 December 2019.

    Abstract: Emotions constitute an indispensable component of our everyday life. They consist of conscious mental reactions towards objects or situations and are associated with various physiological, behavioral, and cognitive changes. In this paper, we propose a comparative analysis between different machine learning and deep learning techniques, with and without feature selection, for binarily classifying the six basic emotions, namely anger, disgust, fear, joy, sadness, and surprise, into two symmetrical categorical classes (emotion and no emotion), using the physiological recordings and subjective ratings of valence, arousal, and dominance from the DEAP (Dataset for Emotion Analysis using EEG, Physiological and Video Signals) database. The results showed that the maximum classification accuracies for each emotion were: anger: 98.02%, joy: 100%, surprise: 96%, disgust: 95%, fear: 90.75%, and sadness: 90.08%. In the case of four emotions (anger, disgust, fear, and sadness), the classification accuracies were higher without feature selection.
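A compact sketch of the comparison the abstract describes, binary classification of one emotion with and without feature selection, might look like the following. It is illustrative only: the random arrays stand in for DEAP-style physiological features and labels, and the classifier and selector choices are assumptions rather than the authors' exact setup.

```python
# Illustrative sketch (not the paper's code): binary emotion classification
# with and without univariate feature selection, in the spirit of the study above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(1280, 230))      # placeholder for DEAP-style physiological features
y = rng.integers(0, 2, size=1280)     # 1 = emotion present (e.g. anger), 0 = not present

base = make_pipeline(StandardScaler(),
                     RandomForestClassifier(n_estimators=200, random_state=0))
selected = make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=50),   # keep the 50 highest-scoring features
                         RandomForestClassifier(n_estimators=200, random_state=0))

print("without selection:", cross_val_score(base, X, y, cv=5).mean())
print("with selection:   ", cross_val_score(selected, X, y, cv=5).mean())
```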
  • Emotions and Deception Detection
    Emotions and Deception Detection. Mircea Zloteanu, Division of Psychology and Language Sciences, University College London, UK. A dissertation submitted in part fulfilment of the requirement for the degree of Doctor of Philosophy in Psychology. Prepared under the supervision of Dr. Daniel C. Richardson, September 2016.

    I, Mircea Zloteanu, confirm that the work presented in this thesis is my own. Where information has been derived from other sources, I confirm that this has been indicated in the thesis.

    Abstract. Humans have developed a complex social structure which relies heavily on communication between members. However, not all communication is honest. Distinguishing honest from deceptive information is clearly a useful skill, but individuals do not possess a strong ability to discriminate veracity. As others will not willingly admit they are lying, one must rely on different information to discern veracity. In deception detection, individuals are told to rely on behavioural indices to discriminate lies and truths. A source of such indices is the emotions displayed by another. This thesis focuses on the role that emotions have on the ability to detect deception, exploring the reasons for low judgemental accuracy when individuals focus on emotion information. I aim to demonstrate that emotion recognition does not aid the detection of deception, and can result in decreased accuracy. This is attributed to the biasing relationship of emotion recognition on veracity judgements, stemming from the inability of decoders to separate the authenticity of emotional cues. To support my claims, I will demonstrate the lack of ability of decoders to make rational judgements regarding veracity, even if allowed to pool the knowledge of multiple decoders, and disprove the notion that decoders can utilise emotional cues, both innately and through training, to detect deception.
  • Brain Activity During Simulated Deception: an Event-Related Functional Magnetic Resonance Study
    NeuroImage 15, 727–732 (2002), doi:10.1006/nimg.2001.1003, available online at http://www.idealibrary.com. RAPID COMMUNICATION. Brain Activity during Simulated Deception: An Event-Related Functional Magnetic Resonance Study. D. D. Langleben, L. Schroeder, J. A. Maldjian, R. C. Gur, S. McDonald, J. D. Ragland, C. P. O'Brien, and A. R. Childress. Department of Psychiatry and Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104. Received May 30, 2001; published online January 4, 2002.

    The Guilty Knowledge Test (GKT) has been used extensively to model deception. An association between the brain evoked response potentials and lying on the GKT suggests that deception may be associated with changes in other measures of brain activity such as regional blood flow that could be anatomically localized with event-related functional magnetic resonance imaging (fMRI). Blood oxygenation level-dependent fMRI contrasts between deceptive and truthful responses were measured with a 4 Tesla scanner in 18 participants performing the GKT and analyzed using statistical parametric mapping. Increased activity in ...

    ... cations and there is a strong general interest in objective methods for its detection (Holden, 2001). Multichannel physiological recording (polygraph) is currently the most widely used method for the detection of deception (Office of Technology Assessment, 1990). The effectiveness of the polygraph in the study and detection of deception is limited by reliance on peripheral manifestations of anxiety (skin conductance, heart rate, and respiration) (Office of Technology Assessment, 1983). Scalp-recorded event-related potentials (ERPs) have also been used in the study of deception.
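To make the analysis step concrete, here is a hedged sketch of how a deceptive-versus-truthful BOLD contrast could be computed with a general linear model. The study above used statistical parametric mapping; this example uses nilearn instead, and the file names, repetition time and event table are hypothetical.

```python
# Hedged sketch: first-level GLM contrast between deceptive and truthful events.
# The original study used statistical parametric mapping; nilearn is used here
# only for illustration, and paths, TR and events are hypothetical.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical event table: one row per GKT response.
events = pd.DataFrame({
    "onset":      [10.0, 25.0, 40.0, 55.0],
    "duration":   [2.0, 2.0, 2.0, 2.0],
    "trial_type": ["deceptive", "truthful", "deceptive", "truthful"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="spm")           # TR and HRF choice are assumptions
model = model.fit("subject01_bold.nii.gz", events=events)   # hypothetical 4D BOLD file

# z-map of regions more active for deceptive than truthful responses
z_map = model.compute_contrast("deceptive - truthful", output_type="z_score")
z_map.to_filename("deceptive_gt_truthful_zmap.nii.gz")
```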
  • Efficient Lying Involves Self-Deception – Because If You Really Believe Your Lie, Your Limbic System Won't Give You Away
    The story “Looking for the Lie”, http://www.nytimes.com/2006/02/05/magazine/05lying.html, by Robin Marantz Henig, New York Times Magazine, February 5, 2006.

    The pitch. Dear Vera, A picture of a brain caught in mid-lie – that’s pretty hard to resist. Think of all the liars you’d love to expose: criminals, terrorists, politicians, philandering spouses, drug-abusing kids. That’s why scientists are trying to develop ways to unmask liars with techniques that go beyond traditional polygraphs – which are notoriously imprecise – or simple hunches – which are notoriously wrong. The next generation of lie detectors will include functional MRI scans, sophisticated electroencephalograms, and beams of light that can be blasted right inside the brain. But as we move closer to an era of high-tech mindreading, we should first take a look at what we know about lying – and whether it’s always such a good idea to uncover it. Is there an evolutionary purpose to deceit? A social purpose? Can anyone live in a community in which lies are always exposed and eliminated? If we can tell with confidence that someone is lying to us, is that something we even want to know? I’d like to propose a piece about lying for the Magazine, either as a 5000-word feature or as a longer cover story that gives me the space to explore the cultural, biological, and evolutionary implications of deception. I will also ask some of the questions raised with any new medical technology: what it can offer, how it can be abused, and whether it will set us off down a “slippery slope” that will lead to information we’d be better off without.
  • An Assessment of Lie Detection Capability (1964)
    IDA TECHNICAL REPORT 62-16. AN ASSESSMENT OF LIE DETECTION CAPABILITY (DECLASSIFIED VERSION). Jesse Orlansky, July 1964 (classified version issued July 1962). INSTITUTE FOR DEFENSE ANALYSES, RESEARCH AND ENGINEERING SUPPORT DIVISION, Contract SD-50, Task 8.

    FOREWORD. "An Assessment of Lie Detection Capability" was originally published as a SECRET NOFORN document by IDA/RESD on July 31, 1962, as TR 62-16. The Director of Defense Research and Engineering, Department of Defense, deleted portions of the report and declassified the remainder on May 13, 1964. The declassified version was printed as Exhibit 25 (pp. 425 to 463) of "Hearings Before a Subcommittee of the Committee of Government Operations, House of Representatives, Eighty-Eighth Congress, Second Session, April 29 and 30, 1964," by the U.S. Government Printing Office. Except for the front matter, this reprint of the unclassified version is a facsimile reproduction of Exhibit 25. Three asterisks (* * *) indicate that less than a paragraph has been deleted. A line of seven asterisks indicates deletion of one or more paragraphs. The complete report retains its original classification.

    ACKNOWLEDGMENT. The author wishes to thank many individuals who assisted this study by providing information and by reviewing a preliminary draft. This assistance is acknowledged gratefully, but the author alone is accountable for the views expressed in this report. Dr. Joseph E. Barmack of the Institute for Defense Analyses is due special thanks for his help in preparing the final report.

    CONTENTS: Foreword; Acknowledgment; Contents; Summary; Conclusions; Recommendations; 1. Purpose of This Report; 2. ...