Electronic Commerce & Law Report™
Reproduced with permission from Electronic Commerce & Law Report, 21 ECLR, 6/8/16. Copyright © 2016 by The Bureau of National Affairs, Inc. (800-372-1033) http://www.bna.com

BIG DATA

Big data analytics can provide numerous benefits to businesses and consumers. But inaccuracies and biases may put businesses at risk of "digital redlining"—potentially discriminating against certain groups of people. Attorneys from Wiley Rein LLP discuss how the private sector can continue to use big data while reducing the risk of federal investigations and liability over disparate impact discrimination.

Consumer Protection

'Digital Redlining': Increasing Risk of 'Soft Regulation' and Litigation

By Stephen J. Obermeier, Megan L. Brown, Matthew J. Gardner and Stephen J. Kenny

Stephen J. Obermeier, Megan L. Brown, Matthew J. Gardner and Stephen J. Kenny are attorneys at Wiley Rein LLP in Washington. Mr. Obermeier is a partner in the Appellate, Litigation, and Telecom, Media & Technology practices. Ms. Brown is a partner in the Cybersecurity, Appellate, and TMT practices. Mr. Gardner is of counsel in the White Collar Defense & Government Investigations and Cybersecurity practices. Mr. Kenny is an associate in the Election Law & Government Ethics and Litigation practices.

Several recent developments suggest that private sector users and purveyors of "big data" and related services may be at risk of increased scrutiny over its downstream effects. Commentators and researchers have begun to scrutinize the risks and societal effects of big data, and the federal government is raising concerns about potential harms. The private sector should expect increased attention to its practices, as regulators try to nudge business practices in preferred directions and potential litigants look for cases to bring.

As the use of big data has become ubiquitous, commentators are increasingly discussing potential risks and downsides, including the possibility of "digital redlining"—any use of big data that can have a disparate impact on or result in unfair treatment of certain groups. These commentators can point to reports by the Federal Trade Commission and the White House to partially justify their concerns.

In January 2016, the FTC continued its trend of monitoring and analyzing innovations in big data that impact consumers. In a report, Big Data: A Tool of Inclusion or Exclusion?, the FTC identified benefits from big data, but also highlighted the fear that "potential inaccuracies and biases might lead to detrimental effects for low income and underserved populations" (21 ECLR 57, 1/13/16).

The concept has been on the White House's radar for at least two years. A 2014 White House report summarizing a "90-day study to examine how big data will transform the way we live and work and alter the relationships between government, citizens, businesses, and consumers" explained that big data can be beneficial in generating customized search results and advertisements, but also noted that "perfect personalization . . . leaves room for subtle and not-so-subtle forms of discrimination in pricing, services, and opportunities."

To the extent data personalization results in a claimed disparate impact on protected classes, a variety of players—from banks to e-commerce vendors to real estate companies—could, without any intent to discriminate, be exposed to discrimination lawsuits by both private plaintiffs and the government. Fortunately, there are steps businesses can take—and defenses they can raise—to help fend off these types of suits.

The White House Signaled Concern About Digital Redlining in 2014

The 2014 White House Big Data report provides a high-level analysis of the potential benefits and threats of big data use in both the government and the private sector. In its analysis of the private sector, the report notes that "the civil rights community is concerned that . . . algorithmic decisions [resulting in consumer personalization] raise the specter of 'redlining' in the digital economy—the potential to discriminate against the most vulnerable classes of our society under the guise of neutral algorithms." The report continues:

Recently, some offline retailers were found to be using an algorithm that generated different discounts for the same product to people based on where they believed the customer was located. While it may be that the price differences were driven by the lack of competition in certain neighborhoods, in practice, people in higher-income areas received higher discounts than people in lower-income areas.

As a result, the report found it important "to examine how algorithmically-driven decisions might exacerbate existing socio-economic disparities," including in workplace and educational contexts, "especially when it comes to the practice of differential pricing and other potentially discriminatory practices." It called for the federal government to expand its technical expertise "to be able to identify practices and outcomes facilitated by Big Data analytics that have a discriminatory impact on protected classes."
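The retailer example above, and the report's call for technical capacity to spot discriminatory outcomes, describe what is at bottom a data-audit problem. The sketch below is a minimal illustration of such an audit in Python, assuming a company already logs each offer with a discount percentage and an inferred neighborhood income tier; the field names, the sample data, and the five-point review threshold are assumptions made for illustration, not anything specified by the White House or FTC reports.

```python
# Illustrative sketch only: audits logged discount offers for geography-linked
# disparities of the kind the White House report describes. Field names
# ("income_tier", "discount_pct") and the 5-point gap threshold are assumptions.
from collections import defaultdict
from statistics import mean

def discount_gap_by_tier(offers):
    """offers: iterable of dicts like {"income_tier": "low", "discount_pct": 12.0}."""
    by_tier = defaultdict(list)
    for offer in offers:
        by_tier[offer["income_tier"]].append(offer["discount_pct"])
    avg = {tier: mean(vals) for tier, vals in by_tier.items()}
    gap = max(avg.values()) - min(avg.values())
    return avg, gap

if __name__ == "__main__":
    sample = [
        {"income_tier": "high", "discount_pct": 20.0},
        {"income_tier": "high", "discount_pct": 18.0},
        {"income_tier": "low", "discount_pct": 9.0},
        {"income_tier": "low", "discount_pct": 11.0},
    ]
    averages, gap = discount_gap_by_tier(sample)
    print(averages)  # {'high': 19.0, 'low': 10.0}
    if gap > 5.0:    # assumed internal review threshold, not a legal standard
        print(f"Flag for review: {gap:.1f}-point average discount gap across tiers")
```

A gap flagged this way is only a prompt for further review; as the report itself acknowledges, price differences may have nondiscriminatory explanations such as the level of local competition.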
The FTC Recently Expanded Upon the White House's Analysis

In its most recent publication concerning big data, the FTC expanded upon the White House's warnings about digital redlining. The FTC's report describes in more detail the potential for digital redlining and identifies potentially applicable legal regimes, including the Equal Credit Opportunity Act (ECOA), the Fair Credit Reporting Act (FCRA), Title VII of the Civil Rights Act of 1964, the Fair Housing Act (FHA), the Americans with Disabilities Act, the Age Discrimination in Employment Act, and the Genetic Information Nondiscrimination Act. It also identifies the FTC's general authority over unfair and deceptive practices under Section 5 of the Federal Trade Commission Act, explaining that "Section 5 may . . . apply . . . if products are sold to customers that use the products for discriminatory purposes."

Although nothing in the report is per se binding on private entities, the FTC offers "considerations" for companies using big data and poses "questions" for "compliance." It targets downstream users of data, the aggregators of data, and the creators of the analytics. Some of the FTC's suggestions seem tailor-made for future investigations, inquiries or creative litigants.

Because the FTC has broad jurisdiction and has signaled interest, it will expect companies to be cognizant of its expectations. The FTC explained to a court of appeals, in the context of data security expectations, that "any careful general counsel would be looking at what the FTC is doing, [as the FTC] has broad ranging jurisdiction [over the private sector] and undertakes frequent actions against all manner of practices and all manner of businesses." Such admonitions may have consequences for innovation, as companies manage concerns about liability or the burdens of an investigation into their technical and business decisions.

One potential chilling effect of these government warnings animated the separate statement of FTC Commissioner Maureen Ohlhausen, in which she reiterated the discipline imposed by the market to guard against inaccuracies or misuse, and cautioned against giving "undue credence to hypothetical harms" that might discourage the development of innovative tools using big data.

Lawsuits and Government Investigators May Try to Hold Private Entities Liable for Unintended Discriminatory Impacts Resulting From Big Data Analytics

Perhaps the most worrisome aspect of the government's expanding interest in digital redlining is that, at least in some circumstances, a company may be subject to liability regardless of whether there was any intent to discriminate, under the so-called "disparate impact" theory of discrimination. That is, facially neutral practices that result in a disparate impact on minorities can be deemed unlawful regardless of discriminatory intent.

Discrimination claims based on disparate impact have been available in the employment context, even in the absence of express statutory language, since the U.S. Supreme Court's seminal decision in Griggs v. Duke Power Co., 401 U.S. 424 (1971). And just this last term, in Texas Dep't of Hous. & Cmty. Affs. v. Inclusive Cmtys. Project, Inc., 135 S. Ct. 2507, 2015 BL 203075 (2015), the Supreme Court held that discrimination claims based on disparate impact are available under the Fair Housing Act notwithstanding the lack of any express statutory liability.
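Because disparate impact turns on outcomes rather than intent, it is in practice assessed statistically. One widely used screening heuristic, drawn from the EEOC's Uniform Guidelines on employee selection rather than from the authorities discussed in this article, is the "four-fifths rule": a group's selection rate below 80 percent of the most favored group's rate is treated as preliminary evidence of adverse impact. The Python sketch below shows how a company might run that screen on its own approval data; the groups, field names, and sample figures are hypothetical, and the 0.8 threshold is a screening heuristic, not a legal test.

```python
# Illustrative four-fifths (80%) adverse-impact screen. The groups, outcomes,
# and field names are hypothetical; the 0.8 threshold comes from the EEOC's
# Uniform Guidelines and is a screening heuristic, not a definitive legal test.
from collections import Counter

def adverse_impact_ratios(decisions):
    """decisions: iterable of (group, approved) pairs, e.g. ("group_a", True)."""
    totals, approvals = Counter(), Counter()
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    rates = {g: approvals[g] / totals[g] for g in totals}
    best = max(rates.values())
    # Ratio of each group's selection rate to the most favored group's rate.
    return {g: rate / best for g, rate in rates.items()}, rates

if __name__ == "__main__":
    sample = [("group_a", True)] * 60 + [("group_a", False)] * 40 \
           + [("group_b", True)] * 40 + [("group_b", False)] * 60
    ratios, rates = adverse_impact_ratios(sample)
    for group, ratio in ratios.items():
        status = "below 0.8, review" if ratio < 0.8 else "ok"
        print(f"{group}: rate={rates[group]:.2f}, impact ratio={ratio:.2f} ({status})")
```

Running such a screen would be one of the "steps businesses can take" that the authors allude to, but passing it does not by itself defeat a claim, and failing it does not establish one.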
It isn't only the government that is tracking the impacts of big data on minorities, and even seemingly benign uses of data personalization can result in negative scrutiny.

There is no reason to believe that plaintiffs' and governmental application of disparate impact liability will stop with the FHA. For example, although . . .

The CFPB's recent enforcement actions against auto-lending finance companies, although not involving big data digital redlining, may provide a glimpse into how the federal government might pursue a private entity whose data analytics arguably caused a disparate impact.