
Should Learn About Machine Learning, Univ. of CA, Davis Law Review, 2017, available at https://lawreview.law.ucdavis.edu/issues/51/2/Symposium/51-2_Lehr_Ohm.pdf.
128   http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=9100400.PN.&OS=PN/9100400&RS=PN/9100400
129   Jonathan Zim, The Use of Social Data Raises Issues for Consumer Lending, Miami Business Law Review, https://business-law-review.law.miami.edu/social-data-raises-issues-consumer-lending/.

130   Virginia Eubanks, Automating Inequality, St. Martin's Press (2018).
131   Hardt, Moritz, Price, Eric, and Srebro, Nathan, Equality of opportunity in supervised learning, NIPS, 2016; Chouldechova, Alexandra, Fair prediction with disparate impact: A study of bias in recidivism prediction instruments, CoRR, 2017.
132   Disparate impact has been defined using the "80% rule" such that, where a dataset has protected attribute X (e.g., race, sex, religion, etc.) and a binary outcome to be predicted C (e.g., "will hire"), the dataset has disparate impact if:

      Pr(C = YES | X = 0) / Pr(C = YES | X = 1) ≤ τ = 0.8

      for positive outcome class YES and majority protected attribute 1, where Pr(C = c | X = x) denotes the conditional probability (evaluated over D) that the class outcome is c ∈ C given protected attribute x ∈ X. Feldman, Michael, Friedler, Sorelle A., Moeller, John, Scheidegger, Carlos, and Venkatasubramanian, Suresh, Certifying and removing disparate impact, In KDD, 2015. http://sorelle.friedler.net/papers/kdd_disparate_impact.pdf.
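      For illustration, a minimal Python sketch of the 80% rule check as defined above; the function and variable names and the example data are hypothetical, not taken from Feldman et al.:

          # Sketch of the "80% rule" disparate impact check; data is hypothetical.
          def disparate_impact_ratio(outcomes, protected):
              """Return Pr(C = YES | X = 0) / Pr(C = YES | X = 1), where outcomes
              holds booleans (True = positive class YES) and protected is a
              parallel list (1 = majority protected attribute, 0 = minority)."""
              yes_min = sum(1 for c, x in zip(outcomes, protected) if x == 0 and c)
              yes_maj = sum(1 for c, x in zip(outcomes, protected) if x == 1 and c)
              return (yes_min / protected.count(0)) / (yes_maj / protected.count(1))

          # Hypothetical hiring outcomes: 2 of 5 minority vs. 4 of 5 majority hired.
          hired = [True, False, True, False, False, True, True, True, True, False]
          group = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

          ratio = disparate_impact_ratio(hired, group)
          print(f"ratio = {ratio:.2f}")  # 0.40 / 0.80 = 0.50
          print("disparate impact" if ratio <= 0.8 else "no disparate impact")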
            133   Supreme Court of the United States. Griggs v. Duke Power Co. 401 U.S. 424, March 8, 1971.
The US Supreme Court held that Duke Power's hiring practices were illegal if they resulted in "disparate impact" by race, even though they were not explicitly based on race. This prevented Duke Power from using intelligence test scores and high school diplomas, qualifications largely correlated with race, to make hiring decisions, because Duke Power was unable to prove that the intelligence tests or diploma requirements were relevant to the jobs for which it was hiring. The legal doctrine of disparate impact developed from this ruling is the main legal theory used to determine unintended discrimination in the USA.
Texas Dep't of Housing and Community Affairs v. Inclusive Communities Project, 135 S. Ct. 2507 (2015).
136   Solon Barocas and Andrew D. Selbst, Big Data's Disparate Impact, 104 Calif. L. Rev. 671 (2016). http://www.californialawreview.org/wp-content/uploads/2016/06/2Barocas-Selbst.pdf.
137   Steve Lohr, Big Data Underwriting for Payday Loans, NY Times, January 19, 2015, https://bits.blogs.nytimes.com/2015/01/19/big-data-underwriting-for-payday-loans/.
138   See Accountable Algorithms, at footnote 129; Paul Ohm and David Lehr, Playing with the Data: What Legal Scholars Should Learn About Machine Learning, Univ. of CA, Davis Law Review, 2017, available at https://lawreview.law.ucdavis.edu/issues/51/2/Symposium/51-2_Lehr_Ohm.pdf.
139   See Accountable Algorithms, at footnote 129; Paul Ohm and David Lehr, Playing with the Data: What Legal Scholars Should Learn About Machine Learning, Univ. of CA, Davis Law Review, 2017, available at https://lawreview.law.ucdavis.edu/issues/51/2/Symposium/51-2_Lehr_Ohm.pdf.
140   Danielle Keats Citron, Technological Due Process, Washington University Law Review, Vol. 85, pp. 1249-1313, 2007, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1012360.
            141   Julia Angwin and Jeff Larson, The Tiger Mom Tax: Asians Are Nearly Twice as Likely to Get a Higher Price from
                Princeton Review, ProPublica, Sept. 1, 2015.
142   See Karen Harris, Austin Kimson, and Andrew Schwedel, Labor 2030: The Collision of Demographics, Automation and Inequality, Bain & Company Report, February 7, 2018, available at http://www.bain.com/publications/articles/labor-2030-the-collision-of-demographics-automation-and-inequality.aspx.

            143   See, e.g., Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L.
                REV. 1701, 1716-27 (2010).
144   See FPF's Visual Guide to Practical Data De-Identification.
145   See Cavoukian, Ann and El-Emam, Khaled, De-Identification Protocols: Essential for Protecting Privacy, Information and Privacy Commissioner of Ontario, 2014; and Information and Privacy Commissioner of Ontario, "De-Identification Centre" (https://www.ipc.on.ca/privacy/de-identification-centre/).

            146   GDPR, Article 4(5).
147   Narayanan, Arvind and Felten, Edward W. (Princeton), No Silver Bullet: De-Identification Still Doesn't Work, 2014, available at http://randomwalker.info/publications/no-silver-bullet-de-identification.pdf.



