Page 22 - FIGI - Big data, machine learning, consumer protection and privacy
Monetary Authority of Singapore’s FEAT Principles
2. Use of personal attributes as input factors for AIDA-driven decisions is justified.
12. To increase public confidence, use of AIDA is proactively disclosed to data subjects as part of
general communication.
13. Data subjects are provided, upon request, clear explanations on what data is used to make
AIDA-driven decisions about the data subject and how the data affects the decision.
14. Data subjects are provided, upon request, clear explanations on the consequences that AIDA-driven decisions may have on them.
Smart Campaign’s draft Digital Credit Standards
Indicator 6.1.1.1
The provider has assessed and documented the personal information it needs from clients in order
to deliver the service (e.g. identity, transactions, etc.). The personal data collected, the personal data
shared, and the period of time during which personal data is stored are minimized and directly justified
by operations needed to provide the service or by law. The assessment identified data privacy risks to
consumers during collection, processing, storage, and transfer of personal data.
Indicator 6.1.1.6
Personal data should be relevant to the purposes for which it is to be used, and, to the extent necessary for those purposes, should be accurate, complete, and kept up-to-date.
Indicator 6.2.1.0
Clients are asked to consent to specific uses of their data. Consent requests explain clearly, in simple,
local language, how data will be used. Separate consent is required for: a) sharing data with specific
third parties (to be clearly identified) as part of service provision; b) reporting data to credit reporting
bureaus; c) use of data for marketing; d) sales to third parties; and e) use of geo-location data. For
services delivered through USSD or SMS, internet links to disclosure statements are not sufficient.
Indicator 6.2.2.0
The client's right to opt out of a service and withdraw the permission granted to an organization to use
data (of whatever type) is clearly displayed and accessible to clients, together with the consequences
of opting out.
Indicator 6.2.3.0
Clients have the right to obtain from the provider confirmation of whether or not the provider has data
relating to them, and if that request is rejected clients have the right to an explanation of the denial.
Indicator 6.2.3.1
Clients have the right to have data about them communicated to them within a reasonable timeframe
without excessive fees and using terminology that they can understand.
Indicator 6.2.3.2
Clients have the right to challenge data relating to them and, if the challenge is successful, to have the
data erased, rectified, completed, or amended.
purposes other than the original purpose for which it was collected.80

There are sometimes exceptions to notice and consent rules that allow for uses of data beyond its initial purpose of collection, such as for statistical purposes or when it will be used for scientific research.81 These often depend on large datasets for the same reason that machine learning does generally. There are potential grey areas between what comprises statistical purposes or scientific research and what constitutes product development in the provision of financial services. However, these exceptions to the purpose specification and data minimisation rules are typically not wide in scope.

Many countries' laws, and international and regional standards, also require the individual to "opt in" by providing consent to collection, use and sharing of personal data.82 Where this is not required or obtained, some jurisdictions allow the individual to "opt out" by providing notice that they do not wish their personal data to be collected, used or shared with third parties.83 When the consumer is not provided with a choice, data protection laws may impose obligations of transparency, requiring data control-