Page 53 - FIGI - Big data, machine learning, consumer protection and privacy
specific services that access their financial data. Similarly, only a small proportion of participants correctly answered a
question about a detail in the policy even after having an opportunity to re-read the policy in a research setting.”
90 Daniel J. Solove, Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1880, 1889-93 (2013).
91 IEEE Global Initiative (see footnote 224) at p. 159.
92 Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life, Stanford University Press
(2010); and Helen Nissenbaum, A Contextual Approach to Privacy Online, Daedalus (2011). https://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf.
93 https://obamawhitehouse.archives.gov/sites/default/files/privacy-final.pdf.
94 Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in
the Global Digital Economy (2012). http://btlj.org/2012/03/president-obamas-privacy-bill-of-rights-encouraging-a-collaborative-process-for-digital-privacy-reform/.
95 See, e.g., Bart Custers, Click Here to Consent Forever: Expiry Dates for Informed Consent, Big Data & Society, January–June 2016: 1–6. http://journals.sagepub.com/doi/10.1177/2053951715624935.
96 Alex Pentland (MIT), Big Data’s Biggest Obstacles, 2012. Available at: https://hbr.org/2012/10/big-datas-biggest-obstacles.
97 Work is underway on a standard for an “AI agent” under IEEE project P7006 - Standard for Personal Data Artificial
Intelligence (AI) Agent, https://standards.ieee.org/project/7006.html.
98 Jarno M. Koponen, We need algorithmic angels, TechCrunch, 18 April 2015, https://techcrunch.com/2015/04/18/we-need-algorithmic-angels/.
99 See Ethically Aligned Design (see footnote 224) at p. 103. See also Mike Orcutt, Personal AI Privacy Watchdog Could Help You Regain Control of Your Data, MIT Technology Review, 11 May 2017, https://www.technologyreview.com/s/607830/personal-ai-privacy-watchdog-could-help-you-regain-control-of-your-data/, and the related Privacy Assistant mobile app, https://play.google.com/store/apps/details?id=edu.cmu.mcom.ppa&hl=en.
100 https://mysudo.com/.
101 Sebastian Herrera and Patience Haggin, New Apple Sign-In Option Could Keep More Personal Data Away From
Facebook, Google, Wall Street Journal, 6 June 2019, https://www.wsj.com/articles/new-apple-sign-in-option-could-keep-more-personal-data-away-from-facebook-google-11559839438.
102 Kok-Seng Wong & Myung Ho Kim, Towards a respondent-preferred kᵢ-anonymity model, Frontiers Inf Technol Electronic Eng (2015) 16: 720. https://doi.org/10.1631/FITEE.1400395. “The level of anonymity (i.e., k-anonymity) guaranteed by an agency cannot be verified by respondents since they generally do not have access to all of the data that is released. Therefore, we introduce the notion of kᵢ-anonymity, where kᵢ is the level of anonymity preferred by each respondent i. Instead of placing full trust in an agency, our solution increases respondent confidence by allowing each to decide the preferred level of protection. As such, our protocol ensures that respondents achieve their preferred kᵢ-anonymity during data collection and guarantees that the collected records are genuine and useful for data analysis.”
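The respondent-preferred anonymity idea in the quoted abstract can be illustrated with a minimal sketch. The function name, record layout, and quasi-identifiers below are hypothetical, and the cited protocol’s secure-collection machinery is omitted; this only shows the core check that each record’s quasi-identifier group is at least as large as that respondent’s own preferred level kᵢ:

```python
from collections import Counter

def satisfies_ki_anonymity(records):
    """Respondent-preferred k-anonymity check (illustrative sketch).

    `records` is a list of (quasi_identifier_tuple, k_i) pairs, where k_i
    is the anonymity level preferred by that respondent. The record is
    protected only if its quasi-identifier group has at least k_i members.
    """
    group_sizes = Counter(qid for qid, _ in records)
    return all(group_sizes[qid] >= k_i for qid, k_i in records)

# Example: quasi-identifiers are (age band, postcode prefix) pairs.
records = [
    (("30-39", "10115"), 2),
    (("30-39", "10115"), 2),   # group of 2 meets both respondents' k_i = 2
    (("40-49", "10117"), 3),   # group of 1 cannot meet this respondent's k_i = 3
]
print(satisfies_ki_anonymity(records))      # False
print(satisfies_ki_anonymity(records[:2]))  # True
```

Classic k-anonymity would apply one global k chosen by the agency; here each respondent’s kᵢ is checked individually, which is the distinction the footnote highlights.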
103 Structured data has a high degree of organization, such that inclusion in a relational database is seamless and
readily searchable by simple, straightforward search-engine algorithms or other search operations (e.g., payment
and transaction reports). Unstructured data either does not have a pre-defined data model or is not organized in a
predefined manner (e.g., social media entries, emails and images).
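The structured/unstructured distinction in this footnote can be made concrete with a short sketch. The field names, sample text, and regex below are illustrative assumptions, not drawn from the report:

```python
import re

# Structured: a predefined schema (payment/transaction records) makes
# simple, straightforward queries trivial.
transactions = [
    {"id": 1, "amount": 120.00, "merchant": "GroceryCo"},
    {"id": 2, "amount": 40.50, "merchant": "CafeX"},
]
large = [t["id"] for t in transactions if t["amount"] > 100]

# Unstructured: free text (e.g., an email) has no pre-defined data model;
# recovering the same fact requires parsing heuristics, here a naive
# regex for decimal amounts, purely for illustration.
email = "Hi, I was charged 120.00 by GroceryCo yesterday. Please check."
amounts = [float(m) for m in re.findall(r"\d+\.\d{2}", email)]

print(large)    # [1]
print(amounts)  # [120.0]
```

The structured query is a one-line filter over named fields; the unstructured case depends on brittle pattern matching, which is why big-data pipelines invest heavily in extracting structure from such sources.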
104 Similarly, GDPR Article 5(1)(d) provides, “Personal data shall be accurate and, where necessary, kept up to date; every
reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for
which they are processed, are erased or rectified without delay.”
105 The OECD Data Quality Principles (Principle 8), the APEC Privacy Framework (Principle 21), the Madrid Resolution Data
Quality Principle, and Convention for the Protection of Individuals with regard to Automatic Processing of Personal
Data (referred to as Convention 108) (Article 5) all include principles requiring that information be accurate and up-to-
date. The G20 High-Level Principles for Digital Financial Inclusion (HLP-DFI) call for the development of “guidance to
ensure the accuracy and security of all data related to: accounts and transactions; digital financial services marketing;
and the development of credit scores for financially excluded and underserved consumers. This guidance should cover
both traditional and innovative forms of data (such as data on utility payments, mobile airtime purchases, use of digital
wallet or e-money accounts, social media and e-commerce transactions).” https://www.gpfi.org/sites/default/files/G20%20High%20Level%20Principles%20for%20Digital%20Financial%20Inclusion.pdf
106 International Finance Corporation (IFC), Credit reporting knowledge guide. Washington, DC. http://www.ifc.org/wps/wcm/connect/industry_ext_content/ifc_external_corporate_site/industries/financial+markets/publications/toolkits/credit+reporting+knowledge+guide.