Page 56 - FIGI - Big data, machine learning, consumer protection and privacy

148   Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA LAW REVIEW
                1701 (2010), https://www.uclalawreview.org/pdf/57-6-3.pdf.
            149   Yves-Alexandre de Montjoye et al., Unique in the Crowd: The privacy bounds of human mobility, Scientific Reports 3
                (2013).
            150   U.N. Global Pulse, Mapping the Risk-Utility Landscape of Mobile Phone Data for Sustainable Development & Humanitarian
                Action (2015); Yi Song, Daniel Dahlmeier & Stephane Bressan, Not So Unique in the Crowd: A Simple and Effective
                Algorithm for Anonymizing Location Data, ACM PIR (2014); Yves-Alexandre de Montjoye, L. Radaelli & V. K. Singh, Unique in
                the Shopping Mall: On the Reidentifiability of Credit Card Metadata, Science 347(6221), 536–539 (2015).
            151   Apple, Differential Privacy Overview, https://images.apple.com/privacy/docs/Differential_Privacy_Overview.pdf.
            152   It is “a strong privacy guarantee for an individual’s input to a (randomized) function or sequence of functions, which we
                call a privacy mechanism. Informally, the guarantee says that the behaviour of the mechanism is essentially unchanged
                independent of whether any individual opts into or opts out of the data set. Designed for statistical analysis, for
                example, of health or census data, the definition protects the privacy of individuals, and small groups of individuals,
                while permitting very different outcomes in the case of very different data sets.” Cynthia Dwork, The Differential
                Privacy Frontier, in Theory of Cryptography Conference (TCC), LNCS 5444 (Springer, 2009), pp. 496–502.
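                The opt-in/opt-out guarantee Dwork describes can be made concrete with a minimal sketch of the Laplace
                mechanism, the canonical way to achieve differential privacy for a counting query. The data, the epsilon
                values, and all function names below are illustrative assumptions, not drawn from this report:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws with mean `scale`
    # follows a Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1, so Laplace noise with scale
    # 1/epsilon masks any single individual's presence or absence.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Two neighbouring data sets, differing in one individual's record.
ages_with = [34, 29, 41, 57, 62]
ages_without = [34, 29, 41, 57]

# The noisy answers are statistically close, so an observer learns
# almost nothing about whether the fifth person opted in or out.
print(dp_count(ages_with, lambda a: a >= 40, epsilon=0.5))
print(dp_count(ages_without, lambda a: a >= 40, epsilon=0.5))
```

                Smaller epsilon means more noise and a stronger guarantee; larger epsilon gives more accurate statistics
                at the cost of privacy, which is the risk-utility trade-off the surrounding footnotes discuss.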
            153   Cynthia Dwork, Differential Privacy, 2006 PROC. 33RD INT’L COLLOQUIUM ON AUTOMATA, LANGUAGES &
                PROGRAMMING 1.
            154   See www.kiprotect.com.
            155   Apple, Differential Privacy Overview, at footnote 151.
            156   Pompeu Casanovas, Louis De Koker, Danuta Mendelson & David Watts, Regulation of Big Data:
                Perspectives on Strategy, Policy, Law and Privacy, Health and Technology (2017), available at:
                https://ssrn.com/abstract=2989689.
            157   Dwork, Differential Privacy, at footnote 153.
            158   https://www.zwillgen.com/.
            159   US Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for
                Businesses and Policymakers 27 (Mar. 2012), http://www.ftc.gov/os/2012/03/120326privacyreport.pdf.
            160   An act relating to data brokers and consumer protection, House Bill 764 (“H-764”), available here.
            161   See https://iapp.org/resources/article/california-consumer-privacy-act-of-2018/ and
                https://iapp.org/resources/topics/california-consumer-privacy-act/.
            162   According to the OECD Privacy Handbook, “[t]he right of individuals to access and challenge personal data is generally
                regarded as perhaps the most important privacy protection safeguard”. OECD Privacy Handbook, 2013, Chapter 3
                (Explanatory Memorandum for Original 1980 Guidelines). In Europe, see Case C-131/12, Google Spain SL v. Agencia
                Española de Protección de Datos (AEPD), 2014 EUR-Lex (May 13, 2014). See Kelly & Satola, “The Right to Be Forgotten”, University
                of Illinois Law Review, Vol. 1, 2017.
            163   California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.110(a) & (b), 1798.130(a)(2).
            164    2016 EU General Data Protection Regulation, Article 15.
            165    Ibid, Article 16.
            166   The General Comment 16 on Article 17 of the International Covenant on Civil and Political Rights provides that “every
                individual should have the right to request rectification or elimination” of files containing incorrect personal data.
                Human Rights Committee, General Comment 16 (on Article 17 on the right to privacy), 1988, Supp. No. 40, UN Doc
                A/43/40, para 10.  A General Comment to an international convention is a non-binding guide to its interpretation.
                The Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal
                Data (Convention 108) provides for “rectification or erasure” of any data processed contrary to the principles on data
                quality, which require that personal data undergoing processing must be adequate and up-to-date. Council of Europe
                Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (1981), Art 8(c), with
                reference to Art 5. Similarly, under the APEC Privacy Framework, individuals should have the right to “challenge the
                accuracy of information relating to them and, if possible and as appropriate, have the information rectified, completed,
                amended or deleted”. 2004 APEC Privacy Framework, Art. 23(c).
            167   Article 29 Data Protection Working Party, ‘Guidelines on the Right to Data Portability’ (2017) 16/EN WP 242 rev.01 10,
                available at https://ec.europa.eu/newsroom/document.cfm?doc_id=44099.
            168   Article 29 Data Protection Working Party, ‘Opinion 4/2007 on the Concept of Personal Data’ 01248/07/EN WP 136 (n
                68) 6; Article 29 Data Protection Working Party, ‘Guidelines on Automated Individual Decision-Making and Profiling for
                the Purposes of Regulation 2016/679’ (n 19) 18.


