Case 1:17-cr-00130-JTN ECF No. 53 filed 02/23/18 PageID.2039 Page 1 of 10

UNITED STATES DISTRICT COURT
WESTERN DISTRICT OF MICHIGAN
SOUTHERN DIVISION
_________________

UNITED STATES OF AMERICA,
            Plaintiff,                          Case No. 1:17-cr-130

v.                                              Hon. Janet T. Neff
                                                United States District Judge

DANIEL GISSANTANER,                             Hon. Ray Kent
            Defendant.                          United States Magistrate Judge
_________________/

DEFENDANT’S REPLY TO THE GOVERNMENT’S RESPONSE TO
MOTION TO EXCLUDE DNA EVIDENCE

NOW COMES the defendant, Daniel Gissantaner, by and through his attorney, Joanna C. Kloet, Assistant Federal Public Defender, and hereby requests to file this Reply to the Government’s Response to the Defendant’s Motion to Exclude DNA Evidence, as authorized by Federal Rule of Criminal Procedure 12 and Local Criminal Rules 47.1 and 47.2.

This case was filed in Federal Court after the defendant was acquitted of the charge of felon in possession of a firearm following a July 8, 2016, hearing before the Michigan Department of Corrections, where the only apparent difference between the evidence available to the respective authorities is the DNA likelihood ratio (“LR”) now proffered by the Federal Government. Accordingly, the defense submitted the underlying Motion to Exclude DNA Evidence as a dispositive pre-trial motion. Local Criminal Rule 47.1 allows the Court to permit or require further briefing on dispositive motions following a response. Alternatively, if the Court determines the underlying Motion is non-dispositive, the defense requests leave to file this Reply under Local Criminal Rule 47.2. Good cause exists because, with this Reply, the defense seeks to narrow the varied and complex issues that have been raised before the Court, in preparation for the hearing on March 22, 2018.

I. The Likelihood Ratio (“LR”) is unreliable and should be excluded under FRE 702 and Daubert because the validation studies are insufficient.
The validation studies to which the Government refers in its Response do not demonstrate that the STRmix program has been adequately validated or that it is appropriate for use in situations such as the instant matter.1 To establish foundational validity, “the procedures that comprise [a methodology] must be shown, based on empirical studies, to be repeatable, reproducible, and accurate, at levels that have been measured and are appropriate to the intended application.”2 Validation studies must involve a sufficiently large number of examiners and be based on sufficiently large collections of known and representative samples from relevant populations to reflect the range of features or combinations of features that will occur in the application.3 Furthermore, the studies should be conducted or overseen by individuals or organizations that have no stake in the outcome of the studies.4

In its Response, the Government attaches two studies in support of its contention that STRmix was properly validated.5 However, these studies were authored by the creator of the STRmix software itself, John S. Buckleton (who also happens to be the witness the Government indicated it would seek to call in its favor at the March 22, 2018, hearing before this Court).6 The Government also cites Buckleton’s website for the assertion that STRmix was tested and shown reliable by 19 publications from 2013 to 2017, but again, Buckleton authored 18 of those 19 studies.7 Moreover, the Government has not shown that these studies were in fact validation studies that closely followed the FBI’s SWGDAM Guidelines for Validation of Probabilistic Genotyping Systems; in fact, at least 14 of these studies apparently preceded the publication of those Guidelines.8 Likewise, the link the Government provides to Buckleton’s personal blog contains links to studies performed by seven local law enforcement agencies, at least three of which predate the FBI SWGDAM Guidelines, and all of which lack evidence of rigorous peer review.9 Furthermore, information available on Buckleton’s blog indicates that the STRmix software version that was ostensibly “validated” by these local agencies was later subject to numerous revisions and updates to correct errors in the software.10 Notably, the FBI SWGDAM Guidelines require re-validation following changes to the software that “may impact interpretation or the analytical process.”11

Even if the existence of validation studies on outdated versions of STRmix provides validation in some factual circumstances, this does not signify that the software is fit for the purpose for which it was employed here.12 Because “crime laboratories are being asked to evaluate many more poor-quality, low-template, and complex DNA mixtures,” DNA mixture software “is being used on the most dangerous, least information-rich samples you encounter.”13 “Common characteristics of forensic casework samples that can increase their complexity include multiple contributors, low quantity (provoking possible drop-out) and low quality (e.g., degradation, inhibition, contamination).”14 “As the number of potential contributors increases, so does uncertainty in accurately determining the true number of contributors.”15 Accuracy may degrade as a function of the absolute and relative amounts of DNA from various contributors.16 More robust validation is important because it “may determine that, past a certain number of contributors, the information content of the profile is simply too limited to reliably distinguish a true contributor from a non-contributor who shares some of the detected alleles by chance.”17

The addendum shows that the authors of the PCAST Report responded to their critics but did not change position.18 Nor has the report been withdrawn. In preparing the addendum, PCAST reviewed hundreds of papers cited by the respondents and invited the submission of additional studies not considered by PCAST that purport to establish validity and reliability.19 After undertaking additional study, including convening a meeting with STRmix’s creator John Buckleton, PCAST observed that “empirical testing of [probabilistic genotyping systems] had largely been limited to a narrow range of parameters (number and ratios of contributors),” and recommended further testing of “a diverse collection of samples within well-defined ranges.”20 The addendum also stated:

        PCAST has great respect for the value of examiners’ experience and judgment: they are critical factors in ensuring that a scientifically valid and reliable method is practiced correctly. However, experience and judgment alone – no matter how great – can never establish the validity or degree of reliability of any particular method. Only empirical testing can do so.21

In this case, determining the validity of the method requires rigorous software testing and scrutiny of the assumptions underlying the algorithm. Thus, and especially in light of the ongoing changes and updates to the software, validity testing should involve not just forensic scientists and mathematicians, but also software engineers with experience in verification and validation of software.

_________________
1 See ECF No. 52, pp 12-15.
2 Exhibit 1, additional excerpts from PCAST Forensic Science Report (“PCAST Report Excerpts”), p 47, citing National Physical Laboratory, “A Beginner’s Guide to Measurement,” and Pavese, F., “An Introduction to Data Modeling Principles in Metrology and Testing,” in Data Modeling for Metrology and Testing in Measurement Science, Pavese, F. and A.B. Forbes (Eds.), Birkhauser (2009).
3 Exhibit 1, PCAST Report Excerpts, p 52.
4 Exhibit 1, PCAST Report Excerpts, p 52.
5 ECF No. 52, pp 12-13.
6 See ECF No. 52-3 and ECF No. 52-4.
7 ECF No. 52, p 12.
8 ECF No. 52, p 12; Exhibit 2, FBI SWGDAM Guidelines for Validation of Probabilistic Genotyping Systems, June 15, 2015, p 11.
9 ECF No. 52, p 12; Exhibit 2, FBI SWGDAM Guidelines for Validation of Probabilistic Genotyping Systems, June 15, 2015, p 11.
10 Exhibit 3, “A summary of the seven identified miscodes in STRmix,” located at https://johnbuckleton.files.wordpress.com/2017/12/a-summary-of-the-seven-identified-miscodes-in-strmix.pdf (last accessed Feb. 23, 2018).
11 Exhibit 2, FBI SWGDAM Guidelines for Validation of Probabilistic Genotyping Systems, June 15, 2015, p 11.
12 Exhibit 1, PCAST Report Excerpts, p 56.
13 Exhibit 4, Frederick R. Bieber et al., “Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion,” BMC Genetics (2016) 17:125; Joe Palazzolo, “Defense Attorneys Demand Closer Look at Software Used to Detect Crime-Scene DNA,” WALL ST. J., Nov. 18, 2015.
14 Exhibit 5, Hinda Haned et al., “Validation of probabilistic genotyping software for use in forensic DNA casework: definitions and illustrations,” 56 Science and Justice 104, 106 (2016). On a related point, unlike here, the 2015 Michigan state court decision by Muskegon County Circuit Court Judge William C. Marietti in People v Muhammad (Case No. 14-65263-FC) involved only an apparent two-person mixture.
15 Exhibit 4, Frederick R. Bieber et al., “Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion,” BMC Genetics (2016) 17:125.
16 Exhibit 1, PCAST Report Excerpts, p 79.
17 Exhibit 5, Hinda Haned et al., “Validation of probabilistic genotyping software for use in forensic DNA casework: definitions and illustrations,” 56 Science and Justice 104, 106 (2016).
18 Exhibit 6, An Addendum to the PCAST Report on Forensic Science in Criminal Courts (January 6, 2017) (“PCAST Report Addendum”), pp 1, 8.
19 Exhibit 6, PCAST Report Addendum, pp 2-3, 5, 8.