Special Report: Addressing Racism in America's Financial System. 2021 Edelman Trust Barometer U.S.

16 pages · PDF · 1,020 KB

2021 Edelman Trust Barometer U.S. Online Survey Special Report: Addressing Racism in America's Financial System

METHODOLOGY
• 1,500 general population respondents; all data nationally representative based on age, region, gender, and ethnicity.
• Fieldwork: June 11 – June 28, 2021.
• Racial and ethnic segments: White n=941, Black n=505, Latinx n=502, Asian n=515. All racial and ethnic segments are nationally representative based on age, income, region, and gender.

Demographics across communities (White / Black / Latinx / Asian):
• Male: 49% / 47% / 49% / 47%
• Female: 53% / 50% / 52% / 49%
• 18-34: 25% / 34% / 39% / 32%
• 35-54: 31% / 34% / 38% / 37%
• 55+: 44% / 32% / 23% / 31%

Margin of error:
• U.S. total margin of error: +/- 2.5% (n=1,500)
• Ethnicity-specific data margin of error: Non-Hispanic White +/- 3.2% (n=941), Black +/- 4.4% (n=505), Latinx +/- 4.4% (n=502), and Asian +/- 4.3% (n=515)

HISTORICAL CONTEXT
1. Generations of economic deprivation: $16T GDP loss in the U.S. economy due to discrimination since 2000 (Citigroup).
2. Barriers to wealth creation: at 26%, Latinx families are the least likely among all Americans to own retirement accounts (MagnifyMoney).
3. Impact on wealth transfer: the wealth of a typical White family is 8x that of Black families (Survey of Consumer Finances).
4. Resilient multicultural business: at $863B, Asian businesses account for the most receipts among multicultural groups (Census Bureau).

FINANCIAL SERVICES RANKS LAST IN ADDRESSING RACISM
Percent who say each sector is doing well in addressing racism (change is Aug 2020 to Apr 2021; the original chart also flags the lowest-performing sector within each community):

Sector              Change  Gen pop  White  Black  Latinx  Asian
Sports              n/a     44       44     52     41      51
Healthcare          n/a     42       44     41     42      41
Food and beverage   +6      39       39     45     40      36
Technology          n/a     37       35     43     43      43
Personal care       +3      36       36     41     38      39
Automotive          +7      35       36     36     40      34
Pharmaceutical      +8      35       34     35     42      36
Financial services  +7      33       31     36     40      33

2021 Edelman Trust Barometer Special Report: Business and Racial Justice in America. Q15. How well are each of the following industry sectors currently doing when it comes to addressing the problem of systemic racism and racial inequality in their industry? 6-point scale; top 2 box, doing well. Industries shown to half of the sample. General population, U.S., and among Non-Hispanic White, Black, Latinx, and Asian populations.

BUSINESS FALLS SHORT ON CONCRETE ACTION AND SYSTEMIC CHANGE
Percent who agree (change is Aug 2020 to Apr 2021; community-level values reconstructed from the flattened chart):

"With few exceptions, the business community has done very little in the way of concrete action to address systemic racism in our country": U.S. 46% (+2); White 41 (0), Black 63 (+4), Latinx 57 (+8), Asian 55 (+6).

"Companies that issue a statement in support of racial equality need to follow it up with concrete action to avoid being seen by me as exploitative or as opportunists": U.S. 53%; White 51, Black 57, Latinx 57, Asian 53.

2021 Edelman Trust Barometer Special Report: Business and Racial Justice in America. Q23. Please indicate how much you agree or disagree with the following statements. 9-point scale; top 4 box, agree. General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.
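The margins of error quoted in the methodology above are consistent with the standard worst-case formula for a simple random sample, MoE = z * sqrt(p(1-p)/n), with p = 0.5 and z = 1.96 at 95% confidence. A minimal sketch reproducing the reported figures (the function name and the 95%-confidence assumption are ours; the report does not state its exact calculation):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case sampling margin of error (p = 0.5) at ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes from the methodology section
for name, n in [("U.S. total", 1500), ("White", 941), ("Black", 505),
                ("Latinx", 502), ("Asian", 515)]:
    print(f"{name}: +/- {margin_of_error(n):.1%}")
```

These work out to +/- 2.5%, 3.2%, 4.4%, 4.4%, and 4.3%, matching the values the report states.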
FINANCIAL SERVICES CUSTOMERS REPORT SYSTEMIC INEQUITIES

PERCEIVED SYSTEMIC BIAS AND DISCRIMINATION ACROSS SUBSECTORS, PARTICULARLY AMONG BLACK PEOPLE
Percent who agree: "I think there is systemic bias and discrimination towards certain groups of people within companies in this industry." Asked about banks, auto lenders, mortgage lenders, credit card companies, and insurance companies, among White, Black, Latinx, and Asian respondents. Agreement across the twenty subsector-community pairs ranges from 29 to 71, with the highest figures among Black respondents (up to 71); the flattened source chart does not preserve which value belongs to which pair. (Original bar values: 71, 62, 60, 60, 54, 49, 48, 45, 41, 42, 43, 44, 43, 34, 38, 35, 32, 33, 29, 30.)

2021 Edelman Money in Color Special Report. Q32/34/35/36/37. Now thinking about specific sectors of financial services, how much do you agree or disagree with the following statements about [industry]? 4-point scale; top 2 box, agree. Q32 asked of respondents who have a financial product (Q8/1-23), Q34 asked of respondents who have a mortgage or real estate (Q8/11, 18), Q35 asked of respondents who have a credit card (Q8/1), Q36 asked of respondents who have a life insurance policy (Q8/9), Q37 asked of respondents who have an auto loan (Q8/22). General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.

DIFFERENT TRUST LEVELS ACROSS SUB-SECTORS
Percent trust in each subsector (responses banded as distrust/neutral/trust; figures shown are percent trust), general population, with each community's score and its difference vs. the general population:

Subsector                     Gen pop   White      Black      Latinx     Asian
Banks                         73        76 (+3)    69 (-4)    64 (-9)    78 (+5)
Credit cards                  60        63 (+3)    56 (-4)    55 (-5)    73 (+13)
Property/casualty insurance   57        62 (+5)    50 (-7)    47 (-10)   61 (+4)
Financial advisory            52        55 (+3)    49 (-3)    49 (-3)    60 (+8)
Mortgage lending              47        49 (+2)    49 (+2)    40 (-7)    52 (+5)
Auto lending                  44        46 (+2)    43 (-1)    37 (-7)    45 (+1)

2021 Edelman Money in Color Special Report. Q29. Below is a list of financial services subsectors. Please indicate how much you trust each subsector to do what is right. 9-point scale; top 4 box, agree.
General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.

BLACK AND LATINX COMMUNITIES PERCEIVE PREJUDICE IN CREDIT CRITERIA
Percent who agree: "I feel that the credit criteria used by companies in this industry to determine eligibility or pricing are biased or discriminatory." Asked about banks, auto lenders, insurance companies, mortgage lenders, and credit card companies, among White, Black, Latinx, and Asian respondents. Agreement ranges from 34 to 73, highest among Black respondents (up to 73); the flattened source chart does not preserve which value belongs to which subsector-community pair. (Original bar values: 73, 68, 64, 62, 63, 54, 54, 44, 47, 47, 44, 44, 43, 39, 41, 43, 42, 41, 35, 34.)

2021 Edelman Money in Color Special Report. Q32/34/35/36/37. Now thinking about specific sectors of financial services, how much do you agree or disagree with the following statements about [industry]? 4-point scale; top 2 box, agree. Q32 asked of respondents who have a financial product (Q8/1-23), Q34 asked of respondents who have a mortgage or real estate (Q8/11, 18), Q35 asked of respondents who have a credit card (Q8/1), Q36 asked of respondents who have a life insurance policy (Q8/9), Q37 asked of respondents who have an auto loan (Q8/22). General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.

HIGH LEVELS OF DISTRUST FOR FINANCIAL SERVICES PROFESSIONALS
Percent who trust each of the following people with regard to their financial well-being (responses banded as distrust/neutral/trust; figures shown are percent trust), with each community's score and its difference vs. the general population:

Person                                  Gen pop   White     Black      Latinx    Asian
Bank employees                          59        62 (+3)   55 (-4)    52 (-7)   63 (+4)
Tax accountants                         58        63 (+5)   52 (-6)    52 (-6)   66 (+8)
Financial advisors                      57        61 (+4)   47 (-10)   52 (-5)   59 (+2)
Credit card company employees           46        50 (+4)   45 (-1)    38 (-8)   55 (+9)
Personal insurance employees            44        45 (+1)   43 (-1)    43 (-1)   46 (+2)
Mortgage lender employees               43        45 (+2)   41 (-2)    41 (-2)   45 (+2)
Auto lender employees                   39        40 (+1)   37 (-2)    33 (-6)   41 (+2)
CEOs of a financial services company    32        33 (+1)   33 (+1)    36 (+4)   36 (+4)

2021 Edelman Money in Color Special Report. Q30. Below is a list of people.
Please indicate how much you trust each to do what is right when it comes to your financial well-being. 9-point scale; top 4 box, agree. General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.

DUAL REALITIES IN THE FINANCIAL SERVICES SECTOR

FINANCIAL SERVICES ASSIGNS TWO VERY DIFFERENT CUSTOMER EXPERIENCES
Percent who have had one or more negative experiences with financial services companies, general population vs. high-income ($100k+) respondents, among White, Black, Latinx, and Asian respondents. The list of negative experiences includes:
• An employee talked down to me
• I felt that no one there cared to get to know me as a person
• I felt I needed to change my appearance to be taken seriously
• I had to supply more proof of my employment than was necessary
• I felt ignored or not seen at a physical location
• No one had a similar cultural background to me
• I was given higher rates or fees because of the color of my skin
(Chart values range from 34 to 68, but their mapping to community and income group was lost when the chart was flattened.)

2021 Edelman Money in Color Special Report. Q39. Which of the following, if any, have you experienced dealing with financial services companies before? Please select all that apply. General population, U.S., and among Non-Hispanic White, Black, Latinx and Asian populations.

FEELING EXPLOITED AND UNWELCOME
Four categories of negative experiences that high-income ($100k+) respondents perceive: exploitation, mistrust, condescension, and feeling unseen. Percent of each community reporting each experience (the original chart flags the community most likely to perceive each type):

Type of negative experience                                            White  Black  Latinx  Asian
Being exploited: "I was given higher rates or fees
because of the color of my skin"                                       5      26     5       3
Being mistrusted: "I had to supply more proof of my
employment than I thought was necessary"                               9      25     17      8
Condescension: "An employee talked down to me"                         12     17     22      8
Feeling unseen: "I felt that no one there cared to
get to know me as a person"                                            11     14     16      18

2021 Edelman Money in Color Special Report. Q39.
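The survey footnotes above repeatedly score results as "top 2 box" or "top 4 box": the share of respondents who chose one of the top k points of the response scale. A minimal sketch of that scoring (the function name and sample ratings are ours, for illustration only):

```python
def top_box_pct(responses, scale_max, k):
    """Percent of responses falling in the top k points of a 1..scale_max scale."""
    in_top = sum(1 for r in responses if r > scale_max - k)
    return 100.0 * in_top / len(responses)

# Hypothetical ratings on the report's 9-point scale;
# "top 4 box, agree" counts scores of 6 through 9 as agreement.
ratings = [9, 7, 5, 6, 2, 8, 4, 3, 6, 1]
print(top_box_pct(ratings, scale_max=9, k=4))  # 50.0
```

The same function covers the "4-point scale; top 2 box" items by calling it with scale_max=4, k=2.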
Recommended publications
  • Playing Prejudice: The Impact of Game-Play on Attributions of Gender and Racial Bias
    Playing Prejudice: The Impact of Game-Play on Attributions of Gender and Racial Bias Jessica Hammer Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy under the Executive Committee of the Graduate School of Arts and Sciences COLUMBIA UNIVERSITY 2014 © 2014 Jessica Hammer All rights reserved ABSTRACT Playing Prejudice: The Impact of Game-Play on Attributions of Gender and Racial Bias Jessica Hammer This dissertation explores new possibilities for changing Americans' theories about racism and sexism. Popular American rhetorics of discrimination, and learners' naïve models, are focused on individual agents' role in creating bias. These theories do not encompass the systemic and structural aspects of discrimination in American society. When learners can think systemically as well as agentically about bias, they become more likely to support systemic as well as individual remedies. However, shifting from an agentic to a systemic model of discrimination is both cognitively and emotionally challenging. To tackle this difficult task, this dissertation brings together the literature on prejudice reduction and conceptual change to propose using games as an entertainment-based intervention to change players' attribution styles around sexism and racism, as well as their attitudes about the same issues. “Playable model – anomalous data” theory proposes that games can model complex systems of bias, while instantiating learning mechanics that help players confront the limits of their existing models. The web-based
  • ALGORITHMIC BIAS EXPLAINED How Automated Decision-Making Becomes Automated Discrimination
    ALGORITHMIC BIAS EXPLAINED How Automated Decision-Making Becomes Automated Discrimination Table of Contents Introduction 3 • What Are Algorithms and How Do They Work? • What Is Algorithmic Bias and Why Does it Matter? • Is Algorithmic Bias Illegal? • Where Does Algorithmic Bias Come From? Algorithmic Bias in Healthcare 10 Algorithmic Bias in Employment 12 Algorithmic Bias in Government Programs 14 Algorithmic Bias in Education 16 Algorithmic Bias in Credit and Finance 18 Algorithmic Bias in Housing and Development 21 Algorithmic Bias in Everything Else: Price Optimization Algorithms 23 Recommendations for Fixing Algorithmic Bias 26 • Algorithmic Transparency and Accountability • Race-Conscious Algorithms • Algorithmic Greenlining Conclusion 32 Introduction Over the last decade, algorithms have replaced decision-makers at all levels of society. Judges, doctors and hiring managers are shifting their responsibilities onto powerful algorithms that promise more data-driven, efficient, accurate and fairer decision-making. However, poorly designed algorithms threaten to amplify systemic racism by reproducing patterns of discrimination and bias that are found in the data algorithms use to learn and make decisions. “We find it important to state that the benefits of any technology should be felt by all of us. Too often, the challenges presented by new technology spell out yet another tale of racism, sexism, gender inequality, ableism and lack of consent within digital culture.”1 —Mimi Onuoha and Mother Cyborg, authors, “A People’s Guide to A.I.” The goal of this report is to help advocates and policymakers develop a baseline understanding of algorithmic bias and its impact as it relates to socioeconomic opportunity across multiple sectors.
  • Chevron's Abusive Litigation in Ecuador
    Rainforest Chernobyl Revisited† The Clash of Human Rights and BIT Investor Claims: Chevron’s Abusive Litigation in Ecuador’s Amazon by Steven Donziger,* Laura Garr & Aaron Marr Page** A Marathon Environmental Litigation: Seventeen Years and Counting The last time the environmental lawsuit Aguinda v. ChevronTexaco was discussed in these pages, the defendant Chevron Corporation1 had just won a forum non conveniens dismissal of the case from a U.S. federal court to Ecuador after nine years of litigation. Filed in 1993, the lawsuit alleged that Chevron’s predecessor company, Texaco, while it exclusively operated several oil fields in Ecuador’s Amazon from 1964 to 1990, deliberately dumped billions of gallons of toxic waste into the rainforest to cut costs and abandoned more than 900 large unlined waste pits that leach toxins into soils and groundwater. The suit contended that the contamination poisoned an area the size of Rhode Island, created a cancer epidemic, and decimated indigenous groups. During the U.S. stage of the litigation, Chevron submitted fourteen sworn affidavits attesting to the fairness and adequacy of Ecuador’s courts. The company also drafted a letter that was signed by Ecuador’s then ambassador to the United States, a former Chevron lawyer, asking the U.S. court to send the case to Ecuador.2 [Photo: Lou Dematteis/Redux. Steven Donziger, attorney for the affected communities, speaks with Huaorani women outside the Superior Court at the start of the Chevron trial on October 21, 2003 in Lago Agrio in the Ecuadoran Amazon.] Representative of Chevron’s position was the sworn statement from Dr.
  • Accountability As a Debiasing Strategy: Does Race Matter?
    Accountability as a Debiasing Strategy: Does Race Matter? Jamillah Bowman Williams, J.D., Ph.D. Georgetown University Law Center Paper Presented at CULP Colloquium Duke University May 19th, 2016 1 Introduction Congress passed Title VII of the Civil Rights Act of 1964 with the primary goal of integrating the workforce and eliminating arbitrary bias against minorities and other groups who had been historically excluded. Yet substantial research reveals that racial bias persists and continues to limit opportunities and outcomes for racial minorities in the workplace. It has been argued that having a sense of accountability, or “the implicit or explicit expectation that one may be called on to justify one’s beliefs, feelings, and actions to others,” can decrease the influence of bias.1 This empirical study seeks to clarify the conditions under which accountability to a committee of peers influences bias and behavior. This project builds on research by Sommers et al. (2006; 2008) that found whites assigned to racially diverse groups generated a wider range of perspectives, processed facts more thoroughly, and were more likely to discuss polarizing social issues than all-white groups.2 This project extends this line of inquiry to the employment discrimination context to empirically examine how a committee’s racial composition influences the decision making process. For example, when making a hiring or promotion decision, does accountability to a racially diverse committee lead to more positive outcomes than accountability to a homogeneous committee? More specifically, how does race of the committee members influence complex thinking, diversity beliefs, acknowledgement of structural discrimination, and inclusive promotion decisions? Implications for antidiscrimination law and EEO policy will be discussed.
  • How to Prevent Discriminatory Outcomes in Machine Learning
    White Paper How to Prevent Discriminatory Outcomes in Machine Learning Global Future Council on Human Rights 2016-2018 March 2018 Contents 3 Foreword 4 Executive Summary 6 Introduction 7 Section 1: The Challenges 8 Issues Around Data 8 What data are used to train machine learning applications? 8 What are the sources of risk around training data for machine learning applications? 9 What-if use case: Unequal access to loans for rural farmers in Kenya 9 What-if use case: Unequal access to education in Indonesia 9 Concerns Around Algorithm Design 9 Where is the risk for discrimination in algorithm design and deployment? 10 What-if use case: Exclusionary health insurance systems in Mexico 10 What-if scenario: China and social credit scores 11 Section 2: The Responsibilities of Businesses 11 Principles for Combating Discrimination in Machine Learning 13 Bringing principles of non-discrimination to life: Human rights due diligence for machine learning 14 Making human rights due diligence in machine learning effective 15 Conclusion 16 Appendix 1: Glossary/Definitions 17 Appendix 2: The Challenges – What Can Companies Do? 23 Appendix 3: Principles on the Ethical Design and Use of AI and Autonomous Systems 22 Appendix 4: Areas of Action Matrix for Human Rights in Machine Learning 29 Acknowledgements World Economic Forum® This paper has been written by the World Economic Forum Global Future Council on Human Rights 2016-18. The findings, interpretations and conclusions expressed herein are a result of a collaborative process facilitated and endorsed by the World Economic Forum, but whose results do not necessarily represent the views of the World Economic Forum, nor the entirety of its Members, Partners or other stakeholders, nor the individual Global Future Council members listed. © 2018 – All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, including photocopying and recording, or by any information storage and retrieval system.
  • I Appreciate the Spirit of the Rule Change, and I Agree That We All
    From: Javed Abbas To: stevens, cheryl Subject: Comment On Proposed Rule Change - 250.2 Date: Saturday, March 27, 2021 8:42:43 PM Attachments: image002.png I appreciate the spirit of the rule change, and I agree that we all likely benefit by understanding the fundamental ways in which we are all connected, and that our society will either succeed or fail primarily based on our ability to cooperate with each other and treat each other fairly. However, I do not support the rule change because I do not think it will have the effect of making us more inclusive because of the ways humans tend to react when forced to do things. There will inevitably be push back from a group of people who do not support the spirit of the rule, that push back will probably be persuasive to people who don’t have strong feelings about the ideas one way or the other but will feel compelled to oppose the ideas because it will be portrayed as part of a political movement that they are inclined to oppose, and ultimately the Rule that is supposed to help us become more inclusive will actually help drive us further apart. I do think it’s a great idea to offer these CLEs and let them compete on equal footing in the marketplace of ideas. Rule Change: "Rule 250.2. CLE Requirements (1) CLE Credit Requirement. Every registered lawyer and every judge must complete 45 credit hours of continuing legal education during each applicable CLE compliance period as provided in these rules.
  • Bibliography: Bias in Artificial Intelligence
    NIST A.I. Reference Library Bibliography: Bias in Artificial Intelligence Abdollahpouri, H., Mansoury, M., Burke, R., & Mobasher, B. (2019). The unfairness of popularity bias in recommendation. arXiv preprint arXiv:1907.13286. Abebe, R., Barocas, S., Kleinberg, J., Levy, K., Raghavan, M., & Robinson, D. G. (2020, January). Roles for computing in social change. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 252-260. Aggarwal, A., Shaikh, S., Hans, S., Haldar, S., Ananthanarayanan, R., & Saha, D. (2021). Testing framework for black-box AI models. arXiv preprint. Retrieved from https://arxiv.org/pdf/2102.06166.pdf Ahmed, N., & Wahed, M. (2020). The De-democratization of AI: Deep learning and the compute divide in artificial intelligence research. arXiv preprint arXiv:2010.15581. Retrieved from https://arxiv.org/ftp/arxiv/papers/2010/2010.15581.pdf. AI Now Institute. Algorithmic Accountability Policy Toolkit. (2018). Retrieved from: https://ainowinstitute.org/aap-toolkit.html. Aitken, M., Toreini, E., Charmichael, P., Coopamootoo, K., Elliott, K., & van Moorsel, A. (2020, January). Establishing a social licence for Financial Technology: Reflections on the role of the private sector in pursuing ethical data practices. Big Data & Society. doi:10.1177/2053951720908892 Ajunwa, I. (2016). Hiring by Algorithm. SSRN Electronic Journal. doi:10.2139/ssrn.2746078 Ajunwa, I. (2020, Forthcoming). The Paradox of Automation as Anti-Bias Intervention, 41 Cardozo, L. Rev. Amini, A., Soleimany, A. P., Schwarting, W., Bhatia, S. N., & Rus, D. (2019, January). Uncovering and mitigating algorithmic bias through learned latent structure. Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, 289-295. Amodei, D., Olah, C., Steinhardt, J., Christiano, P., Schulman, J., & Mané, D.
  • Implicit Bias: How to Recognize It and What Campus Gcs Should Do
    2019 Annual Conference Hyatt Regency Denver at Colorado Convention Center · Denver, CO June 23 – 26, 2019 Implicit Bias: 02B How to Recognize It and What Campus GCs Should Do About It NACUA members may reproduce and distribute copies of materials to other NACUA members and to persons employed by NACUA member institutions if they provide appropriate attribution, including any credits, acknowledgments, copyright notice, or other such information contained in the materials. All materials available as part of this program express the viewpoints of the authors or presenters and not of NACUA. The content is not approved or endorsed by NACUA. The content should not be considered to be or used as legal advice. Legal questions should be directed to institutional legal counsel. IMPLICIT BIAS: WHAT IS IT AND WHAT CAN GENERAL COUNSELS DO ABOUT IT? June 23 – 26, 2019 Kathlyn G. Perez Shareholder, Baker Donelson New Orleans, Louisiana and Kathy Carlson J.D. Executive Director, Department of Internal Investigations Office of Legal & Regulatory Affairs, University of Texas Medical Branch Galveston, Texas I. What is Implicit Bias? ............................................................................................................. 1 II. Implicit Bias in Corporations and Elementary and Secondary Education ........................... 2 III. Higher Education Institutions as Employers........................................................................ 3 A. Implicit Bias in Hiring, Grant Making, and Assignments .....................................
  • Filtering Practices of Social Media Platforms Hearing
    FILTERING PRACTICES OF SOCIAL MEDIA PLATFORMS HEARING BEFORE THE COMMITTEE ON THE JUDICIARY HOUSE OF REPRESENTATIVES ONE HUNDRED FIFTEENTH CONGRESS SECOND SESSION APRIL 26, 2018 Serial No. 115–56 Printed for the use of the Committee on the Judiciary. Available via the World Wide Web: http://judiciary.house.gov U.S. GOVERNMENT PUBLISHING OFFICE 32–930 WASHINGTON : 2018 COMMITTEE ON THE JUDICIARY BOB GOODLATTE, Virginia, Chairman. Majority: F. JAMES SENSENBRENNER, JR., Wisconsin; LAMAR SMITH, Texas; STEVE CHABOT, Ohio; DARRELL E. ISSA, California; STEVE KING, Iowa; LOUIE GOHMERT, Texas; JIM JORDAN, Ohio; TED POE, Texas; TOM MARINO, Pennsylvania; TREY GOWDY, South Carolina; RAÚL LABRADOR, Idaho; BLAKE FARENTHOLD, Texas; DOUG COLLINS, Georgia; KEN BUCK, Colorado; JOHN RATCLIFFE, Texas; MARTHA ROBY, Alabama; MATT GAETZ, Florida; MIKE JOHNSON, Louisiana; ANDY BIGGS, Arizona; JOHN RUTHERFORD, Florida; KAREN HANDEL, Georgia; KEITH ROTHFUS, Pennsylvania. Minority: JERROLD NADLER, New York; ZOE LOFGREN, California; SHEILA JACKSON LEE, Texas; STEVE COHEN, Tennessee; HENRY C. “HANK” JOHNSON, JR., Georgia; THEODORE E. DEUTCH, Florida; LUIS V. GUTIÉRREZ, Illinois; KAREN BASS, California; CEDRIC L. RICHMOND, Louisiana; HAKEEM S. JEFFRIES, New York; DAVID CICILLINE, Rhode Island; ERIC SWALWELL, California; TED LIEU, California; JAMIE RASKIN, Maryland; PRAMILA JAYAPAL, Washington; BRAD SCHNEIDER, Illinois; VALDEZ VENITA “VAL” DEMINGS, Florida. SHELLEY HUSBAND, Chief of Staff and General Counsel; PERRY APELBAUM, Minority Staff Director and Chief Counsel. (II) CONTENTS APRIL 26, 2018 OPENING STATEMENTS Page The Honorable Bob Goodlatte, Virginia, Chairman, Committee on the Judiciary ........................................................................................................................
  • Debugging Software's Schemas
    Debugging Software’s Schemas Kristen Osenga* ABSTRACT The analytical framework being used to assess the patent eligibility of software and computer-related inventions is fraught with errors, or bugs, in the system. A bug in a schema, or framework, in computer science may cause the system or software to produce unexpected results or shut down altogether. Similarly, errors in the patent eligibility framework are causing unexpected results, as well as calls to shut down patent eligibility for software and computer-related inventions. There are two general schemas that are shaping current discussions about software and computer-related invention patents—that software patents are generally bad (the bad patent schema) and that software patent holders are problematic (the troll schema). Because these frameworks were created and are maintained through a series of cognitive biases, they suffer from a variety of bugs. A larger flaw in the system, however, is that using these two schemas to frame the issue of patent eligibility for software and computer-related inventions misses the underlying question that is at the heart of the analysis—what is an unpatentable “abstract idea.” To improve the present debate about the patent eligibility for these inventions, it is therefore critical that the software patent system be debugged. TABLE OF CONTENTS INTRODUCTION ................................................. 1833 I. THE STATE OF SOFTWARE PATENTS .................... 1836 A. What Is Software .................................... 1836 B. The Software Patent Mess ........................... 1838 II. THE BIASES IN SOFTWARE’S SCHEMAS .................. 1843 A.
  • Censorship, Free Speech & Facebook
    Washington Journal of Law, Technology & Arts Volume 15 Issue 1 Article 3 12-13-2019 Censorship, Free Speech & Facebook: Applying the First Amendment to Social Media Platforms via the Public Function Exception Matthew P. Hooker Follow this and additional works at: https://digitalcommons.law.uw.edu/wjlta Part of the First Amendment Commons, Internet Law Commons, and the Privacy Law Commons Recommended Citation Matthew P. Hooker, Censorship, Free Speech & Facebook: Applying the First Amendment to Social Media Platforms via the Public Function Exception, 15 WASH. J. L. TECH. & ARTS 36 (2019). Available at: https://digitalcommons.law.uw.edu/wjlta/vol15/iss1/3 This Article is brought to you for free and open access by the Law Reviews and Journals at UW Law Digital Commons. It has been accepted for inclusion in Washington Journal of Law, Technology & Arts by an authorized editor of UW Law Digital Commons. For more information, please contact [email protected]. WASHINGTON JOURNAL OF LAW, TECHNOLOGY & ARTS VOLUME 15, ISSUE 1 FALL 2019 CENSORSHIP, FREE SPEECH & FACEBOOK: APPLYING THE FIRST AMENDMENT TO SOCIAL MEDIA PLATFORMS VIA THE PUBLIC FUNCTION EXCEPTION Matthew P. Hooker* CITE AS: M HOOKER, 15 WASH. J.L. TECH. & ARTS 36 (2019) https://digitalcommons.law.uw.edu/cgi/viewcontent.cgi?article=1300&context= wjlta ABSTRACT Society has a love-hate relationship with social media. Thanks to social media platforms, the world is more connected than ever before. But with the ever-growing dominance of social media there have come a mass of challenges. What is okay to post? What isn’t? And who or what should be regulating those standards? Platforms are now constantly criticized for their content regulation policies, sometimes because they are viewed as too harsh and other times because they are characterized as too lax.
  • Alliance the Color of Justice-2019
    THE COLOR OF JUSTICE The Landscape of Traumatic Justice: Youth of Color in Conflict with the Law The Alliance of National Psychological Associations for Racial and Ethnic Equity The Association of Black Psychologists The Asian American Psychological Association The National Latina/o Psychological Association The American Psychological Association The Society of Indian Psychologists 2019 Authors: Roberto Cancio, Ph.D. Cheryl Grills, Ph.D. Jennifer García, Ph.D. With Contributions from: The Youth Justice Coalition Publication, 2019. Recommended Citation: Cancio, R., Grills, C. T., and J. García. (2019). The Color of Justice: The Landscape of Traumatic Justice: Youth of Color in Conflict with the Law. The Alliance of National Psychological Associations for Racial and Ethnic Equity. Acknowledgments: This document was developed by The Alliance of National Psychological Associations for Racial and Ethnic Equity. We would like to thank the following people for their invaluable contributions and revisions to this document: Claudette Antuña, Leah Rouse Arndt, Eddie Becton, Kim McGill, Gayle Morse, Kevin Nadal, Amorie Robinson, and Sandra Villanueva. We would also like to thank the Youth Justice Coalition youth who generously shared their stories. The Annie E. Casey Foundation provided support for this report. The Casey Foundation is a private philanthropy that creates a brighter future for the nation’s children by developing solutions to strengthen families, build paths to economic opportunity and transform struggling communities into safer and healthier places to live, work and grow.