Forensic Statistics and the Assessment of Probative Value

Recommended Citation: Stern, Hal, "Forensic Statistics and the Assessment of Probative Value" (2018). CSAFE Presentations and Proceedings. 20. https://lib.dr.iastate.edu/csafe_conf/20. Posted with permission of CSAFE.

Forensic Statistics and the Assessment of Probative Value
OSAC Meeting, Phoenix, AZ, December 11, 2018
Hal Stern, Department of Statistics, University of California, Irvine
[email protected]

Interesting times in forensic science

Evaluation of forensic evidence
• Forensic examinations cover a range of questions
  – timing of events
  – cause/effect
  – source conclusions
• Focus here on source conclusions
  – topics addressed (e.g., need to assess uncertainty, logic of the likelihood ratio) are relevant beyond source conclusions
• The task of interest for purposes of this presentation: assess two items of evidence, one from a known source and one from an unknown source, to determine if the two samples come from the same source
  – Bullet casing from a test fire of the suspect's gun
  – Bullet casing from the crime scene

The Daubert standard
• Daubert standard (Daubert v. Merrell Dow Pharmaceuticals, 1993) governs admission of scientific expert testimony in federal courts
  – judge as gatekeeper
  – conclusions should be the product of applying a scientific methodology
  – relevant factors for the judge to consider:
    • Has the technique been tested in actual field conditions (and not just in a laboratory)?
    • Has the technique been subject to peer review and publication?
    • What is the known or potential rate of error?
    • Do standards exist for the control of the technique's operation?
    • Has the technique been generally accepted within the relevant scientific community?
  – applies to all expert evidence (Kumho Tire Co. v. Carmichael, 1999)
• Frye standard (Frye v. United States, 1923)
  – standard: general acceptance in the relevant scientific community
  – applicable to novel scientific evidence

FRE Rule 702 (post-Daubert)
A witness who is qualified as an expert by knowledge, skill, experience, training, or education may testify in the form of an opinion or otherwise if:
a) the expert's scientific, technical, or other specialized knowledge will help the trier of fact to understand the evidence or to determine a fact in issue;
b) the testimony is based on sufficient facts or data;
c) the testimony is the product of reliable principles and methods; and
d) the expert has reliably applied the principles and methods to the facts of the case.
Logic of forensic examinations
• Examine two samples to identify similarities and differences
• Assess the similarities and differences to see if they are expected (or likely) under the same-source hypothesis
• Assess the similarities and differences to see if they are expected (or likely) under the different-source hypothesis

Evaluation and interpretation of forensic evidence
• Approaches
  – Expert assessment based on experience, training, and use of accepted methods. Typically summarized by a categorical conclusion (e.g., identification / exclusion / inconclusive)
  – Two-stage procedure (see, e.g., Parker and Holford in the 1960s)
    • similarity (binary decision based on a distance/score)
    • identification (likelihood of a coincidental match)
  – Likelihood ratio (sometimes known as the Bayes factor); a compact formulation is sketched after these slides

Satisfying Daubert / FRE 702
• Application of any of these approaches should be supported by evidence regarding how well (how reliably) they perform
• Examples:
  – Studies of the reliability and validity of measurements (e.g., chemical composition of glass)
  – Peer-reviewed studies of techniques/models
  – Studies of the reliability and validity of examiner conclusions
• Important to also recall that the approach needs to be "reliably applied … to the facts of the case" (e.g., N.C. v. McPhaul, 2017)

Forensic Evidence as Expert Opinion
• Status quo in the pattern disciplines (fingerprints, shoe prints, firearms, toolmarks, questioned documents, etc.)
• Examiner analyzes evidence based on
  – Experience
  – Training
  – Use of accepted methods in the field
• Assessment of the evidence reflects the examiner's expert opinion
• Conclusions typically reported as categorical conclusions
  – Identification, Exclusion, Inconclusive
  – Multi-category scales (some support, strong support, very strong support, …)
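The likelihood-ratio approach listed above can be written compactly. The following is a standard formulation from the forensic statistics literature rather than a formula taken from the slides; E denotes the observed evidence (the similarities and differences between the two items), H_s the same-source hypothesis, and H_d the different-source hypothesis:

```latex
\mathrm{LR} = \frac{\Pr(E \mid H_s)}{\Pr(E \mid H_d)}
```

Bayes' theorem connects this statement about the evidence to a statement about the hypotheses, which is the distinction the next slides turn on:

```latex
\frac{\Pr(H_s \mid E)}{\Pr(H_d \mid E)}
= \frac{\Pr(E \mid H_s)}{\Pr(E \mid H_d)}
\times \frac{\Pr(H_s)}{\Pr(H_d)}
```

That is, posterior odds = likelihood ratio × prior odds: a reported conclusion about the hypotheses depends on the prior odds as well as on the evidence, whereas the likelihood ratio itself does not.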
Forensic Evidence as Expert Opinion
• Occasionally conclusions are expressed as statements about the hypotheses rather than about the evidence, e.g., "based on the evidence, the author of the known samples …"
  – Wrote the questioned sample
  – Highly probable wrote the questioned sample
  – Probably wrote the questioned sample
  – Indications may have written the questioned sample
  – with similar statements on the negative side
• This is logically problematic
  – It is a statement about the likelihood of a hypothesis ("same source") after viewing the evidence
  – But, as we will see later, this conclusion must also reflect in part the examiner's a priori (pre-evidence) opinion about the hypothesis

Forensic Evidence as Expert Opinion
• What does it take to establish that testimony is
  – "based on sufficient facts or data"
  – "the product of reliable principles and methods"
• Note that the use of the word "reliable" in the legal sense (trustworthy) differs from its technical use in statistics
• In measurement/assessment, statisticians focus on a number of related concepts when thinking about "reliability":
  – Would the same analyst draw the same conclusion in a new examination of the evidence? (repeatability)
  – Would different analysts draw the same conclusion given the same evidence? (reproducibility)
  – Repeatability and reproducibility are both components of reliability
  – Do analysts get the right answer in studies where the ground truth is available? (accuracy/validity)

Reliability of Measurements: An Example from Handwriting
• 5 forensic document examiners (FDEs) rated 123 signatures on difficulty to simulate, using a 5-point scale (easy – fairly easy – medium – difficult – very difficult)
• Assessing reproducibility (similarity of assessments by two different examiners)
  – Correlation of ratings for each pair of FDEs: .62 – .75
  – Statistical model (intraclass correlation coefficient): .65
• Assessing repeatability (similarity of assessments by the same examiner at two different times) … a very small study with only 7 signatures
  – Correlation of ratings: range from .40 – .88
  – Statistical model estimate: .68
• (A small computational sketch of these reliability measures appears at the end of this section)

Forensic Evidence as Expert Opinion
• The PCAST report called for assessment of
  – Foundational validity of a forensic science discipline
  – Validity as applied in a particular case
• Foundational validity
  – A method can in principle be reliable (in the legal sense)
  – PCAST advocated for multiple "black box" studies
• Validity as applied
  – Proficiency testing (this person can do the task)
  – Case report establishing that the method has been applied appropriately in this case
• The PCAST report has been controversial

Forensic Evidence as Expert Opinion
• Example of a (PCAST-style) "black box" study
  – Having examples with known "ground truth" allows estimation of error rates
  – Ulery et al. (2011) "black box" study of fingerprint decisions
    • false positive rate was 0.1%
    • false negative rate was 7.5%
  – There are limitations in this and any study (similarity to casework, case environment?)
  – The same group carried out a series of "white box" studies in fingerprints to assess
    • Reliability of different steps in the examination process (e.g., marking of minutiae)

Forensic Evidence as Expert Opinion
• Reliability and validity are likely to depend on characteristics of the evidence, e.g.,
  – quality of the latent print
  – complexity of a signature
• Studies should address this and would allow statements like "for evidence of this type …"

Forensic Evidence as Expert Opinion: Example

Forensic Evidence as Expert Opinion
• A few final remarks on forensic evidence as expert opinion
  – Information on reliability and accuracy for forensic analyses is extremely helpful and will likely be increasingly requested
  – As per FRE 702, there is also a need to address application of the method or technique in the current case (e.g., N.C. v. McPhaul, 2017)
  – There will always be unique situations without relevant empirical studies (e.g., did this typewriter produce this note)
    • Not necessarily a problem as long as the lack of relevant empirical evidence is acknowledged

The Two-Stage Approach
• Stage 1 – Similarity
  – Statistical test or procedure to determine whether the two samples "are indistinguishable", "can't be distinguished", "match", etc.
• Stage 2 – Identification
  – Assessment of the probability that two samples from different sources would be found indistinguishable
• Used in assessment of trace evidence (like glass)
• Conceptually …
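The two-stage logic can be made concrete with a small computational sketch. Everything in the sketch is an illustrative assumption rather than material from the presentation: the measurements are invented, the Welch t-test stands in for whatever similarity criterion a discipline actually uses, and the "± 2 SD" window is just one simple way of defining a coincidental match against a reference population.

```python
# A minimal sketch of the two-stage approach for a continuous measurement
# (e.g., a glass property). All numbers below are made up for illustration.
import numpy as np
from scipy import stats

# Replicate measurements on the known-source item and the questioned item
known = np.array([1.51910, 1.51912, 1.51909, 1.51911])
questioned = np.array([1.51913, 1.51910, 1.51912])

# Stage 1 (similarity): can the two sets of measurements be distinguished?
t_stat, p_value = stats.ttest_ind(known, questioned, equal_var=False)
indistinguishable = p_value > 0.05

# Stage 2 (identification): among items from a reference population of other
# sources, how often would the Stage 1 criterion also declare a "match"?
rng = np.random.default_rng(0)
population = rng.normal(loc=1.5190, scale=0.0004, size=10_000)  # hypothetical reference data
window = 2 * known.std(ddof=1)                                   # simple +/- 2 SD match window
coincidence_prob = np.mean(np.abs(population - known.mean()) < window)

print(f"Stage 1: p = {p_value:.3f}, indistinguishable = {indistinguishable}")
print(f"Stage 2: estimated probability of a coincidental match = {coincidence_prob:.4f}")
```

The sketch only illustrates the structure of the approach: a yes/no similarity decision, followed by an assessment of how often that decision would also be reached for an item from a different source.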
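Returning to the reliability and black-box material earlier in the section: the quantities quoted there (pairwise correlations between examiners, an intraclass correlation coefficient, and error rates from ground-truth studies) can be computed as sketched below. The ratings matrix and the error counts are simulated stand-ins, not the data from the handwriting or fingerprint studies, and the one-way ICC shown is the simplest textbook version, which may differ from the model used in the actual study.

```python
# Sketch of the reliability quantities discussed earlier, on simulated data.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1)

# --- Reproducibility: n signatures rated by k examiners on a 1-5 difficulty scale
n_signatures, n_examiners = 123, 5
true_difficulty = rng.integers(1, 6, size=n_signatures)
noise = rng.integers(-1, 2, size=(n_signatures, n_examiners))
ratings = np.clip(true_difficulty[:, None] + noise, 1, 5).astype(float)

# Pairwise correlations between examiners (the study reported .62 - .75)
pairwise = np.corrcoef(ratings.T)
print("pairwise examiner correlations:\n", np.round(pairwise, 2))

# One-way random-effects intraclass correlation, ICC(1,1)
subject_means = ratings.mean(axis=1)
grand_mean = ratings.mean()
bms = n_examiners * np.sum((subject_means - grand_mean) ** 2) / (n_signatures - 1)
wms = np.sum((ratings - subject_means[:, None]) ** 2) / (n_signatures * (n_examiners - 1))
icc = (bms - wms) / (bms + (n_examiners - 1) * wms)
print(f"ICC(1,1) = {icc:.2f}")

# --- Validity: error rates from a ground-truth ("black box") study.
# Hypothetical counts chosen only to match the reported rates (0.1% FP, 7.5% FN).
def clopper_pearson(x, n, alpha=0.05):
    lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lo, hi

fp, n_nonmated = 5, 5000   # hypothetical counts
fn, n_mated = 75, 1000     # hypothetical counts
print("false positive rate:", fp / n_nonmated, clopper_pearson(fp, n_nonmated))
print("false negative rate:", fn / n_mated, clopper_pearson(fn, n_mated))
```

The same pattern extends to repeatability: replace the examiner columns with repeat ratings by a single examiner at two different times.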