Page 54 - FIGI - Big data, machine learning, consumer protection and privacy
107 http://www.worldbank.org/en/topic/financialsector/publication/general-principles-for-credit-reporting.
108 §1681(a)(1).
109 §§1681a(d)(1)(A)–(C); §1681b.
110 §1681a(d)(1).
111 §1681e(b); §1681e(d); §1681b(b)(1); §1681j(a); and §1681n(a).
112 Federal Trade Commission, Big data: A tool for inclusion or exclusion? Understanding the issues. Washington, DC (2016). https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf.
113 136 S. Ct. 1540 (2016).
114 Fair Credit Reporting Act of 1970, 84 Stat. 1127, as amended, 15 U.S.C. §1681 et seq. The US Fair Credit Reporting Act (FCRA) seeks to ensure “fair and accurate credit reporting.” §1681(a)(1). It regulates the creation and the use of “consumer report[s]” by “consumer reporting agenc[ies]” for credit transactions, insurance, licensing, consumer-initiated business transactions, and employment. §§1681a(d)(1)(A)–(C); §1681b. The FCRA was enacted long before the Internet, and applies to companies that regularly disseminate information bearing on an individual’s “credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living.” §1681a(d)(1). The FCRA requires consumer reporting agencies to “follow reasonable procedures to assure maximum possible accuracy of” consumer reports; to notify providers and users of consumer information of their responsibilities under the Act; to limit the circumstances in which such agencies provide consumer reports “for employment purposes”; and to post toll-free numbers for consumers to request reports. It also creates liability for failure to comply with these requirements. §1681e(b); §1681e(d); §1681b(b)(1); §1681j(a); and §1681n(a).
115 See for example GPFI, Use of Alternative Data at footnote 14.
116 For an excellent discussion of these issues, see Jason Blechman, Mobile Credit in Kenya and Tanzania: Emerging Regulatory Challenges in Consumer Protection, Credit Reporting and Use of Customer Transactional Data, African Journal of Information and Communication (AJIC), Issue 17, November 2016, http://www.macmillankeck.pro/publications.html.
117 Article 29 Data Protection Working Party, ‘Opinion 03/2013 on Purpose Limitation’, 00569/13/EN WP 203, adopted on 2 April 2013, at 47, http://ec.europa.eu/justice/article-29/documentation/opinionrecommendation/files/2013/wp203_en.pdf; Omer Tene and Jules Polonetsky, ‘Big Data for All: Privacy and User Control in the Age of Analytics’ (2012) 11 Nw. J. Tech. & Intell. Prop., https://scholarlycommons.law.northwestern.edu/cgi/viewcontent.cgi?article=1191&context=njtip; European Data Protection Supervisor (EDPS), ‘Opinion 3/2018 on Online Manipulation and Personal Data’, at 8–16, https://edps.europa.eu/sites/edp/files/publication/18-03-19_online_manipulation_en.pdf.
118 G-20, High Level Principles of Digital Financial Inclusion, p. 16, https://www.gpfi.org/sites/default/files/G20%20High%20Level%20Principles%20for%20Digital%20Financial%20Inclusion.pdf.
119 Astra Taylor and Jathan Sadowski, How Companies Turn Your Facebook Activity Into a Credit Score, The Nation, 15 June 2015, https://www.thenation.com/article/how-companies-turn-your-facebook-activity-credit-score/.
120 GDPR, Article 4.
121 GDPR, Article 9(4).
122 GDPR, Article 22(4) and 9(2)(a) and (g).
123 Article 29 Data Protection Working Party, ‘Advice Paper on Special Categories of Data (“sensitive data”)’, Ares(2011)444105, 20/04/2011, at p. 4, available at http://ec.europa.eu/justice/data-protection/article-29/documentation/otherdocument/files/2011/2011_04_20_letter_artwp_mme_le_bail_directive_9546ec_annex1_en.pdf.
124 Antoinette Rouvroy, “Of Data and Men”: Fundamental Rights and Freedoms in a World of Big Data, COUNCIL OF EUR., DIRECTORATE GEN. OF HUM. RTS. AND RULE OF L., at 10 (Jan. 11, 2016), https://rm.coe.int/CoERMPublicCommonSearchServices/DisplayDCTMContent?documentId=09000016806a6020.
125 When considering automated decision-making, the Article 29 Working Party found that profiling can create sensitive
data “by inference from other data which is not special category data in its own right but becomes so when combined
with other data.” Article 29 Data Protection Working Party, ‘Guidelines on Automated Individual Decision-Making and
Profiling for the Purposes of Regulation 2016/679’, footnote 56, at 15.
126 See Zarsky at footnote 16.
127 See Joshua Kroll, Joanna Huey, Solon Barocas, Edward Felten, Joel Reidenberg, David Robinson, and Harlan Yu, Accountable Algorithms, University of Pennsylvania Law Review, 2017, available at https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=9570&context=penn_law_review; Paul Ohm and David Lehr, Playing with the Data: What Legal Scholars