Page 44 - FIGI - Big data, machine learning, consumer protection and privacy
Measures such as these alone do not secure fairness, accountability and transparency, but they do provide a vocabulary and value system that enables far more rapid communication about these topics, and make it far easier to develop the necessary risk management, engineering and other measures that lead to greater protection for consumer privacy.
7 AREAS FOR FURTHER EXPLORATION
This paper has explored various challenges that consumer protection and data privacy law and regulation face with regard to big data and machine learning techniques, particularly where these are used for making decisions about services provided to consumers. Conventional requirements to provide notice of the intended purpose of using a consumer’s personal data when the purpose may as yet be unclear, or obtaining consent for something the consumer largely cannot understand, are under strain. Risks from inaccuracy of data inputs, or bias and discriminatory treatment in machine learning decisions, also raise difficult questions about how to ensure that consumers are not unfairly treated. The difficulty of ensuring transparency over decisions generated by algorithms, or of showing what harm has been caused by artificial intelligence techniques that would not otherwise have been caused, also poses challenges for consumer protection and data privacy law and regulation.

There are various areas where further work can usefully be advanced to develop standards that can apply across big data and machine learning, to work towards a balance between freedom to innovate and protection of consumers and their data privacy. These might include:

1. Improving the meaningfulness of consent to the use and sharing of personal data. This would include improving the transparency and simplicity of disclosures to consumers about the uses to which their data may be put, including providing readily understandable explanations. More stringent regulation of consent may also complement the consent technologies emerging in the market. Where use of personal data extends beyond the immediate service to be offered to include transfers of personal data to third parties, it may be important to provide information that puts the consumer in a position to make a meaningful, informed judgment about such use of his or her data.

2. Where it is simply unrealistic to expect consumers to understand the implications for them of widespread circulation of personal data about them, it may be necessary to develop tighter regulation of the use and sharing of personal data. This may include not merely relying on the consumer’s consent to matters that are beyond comprehension, but ensuring that consumers are provided with better information and controls on transfers of data about them, and protecting consumers from uses of their data that they would not reasonably expect to be made.

3. Developing standards for integrating privacy principles into the design of artificial intelligence and machine learning models. Following the principles developed by Ann Cavoukian (see section 8.2), these might include standards for (1) a proactive design approach, (2) use of privacy default settings, (3) adoption of privacy by design, (4) consumer-trust orientation, (5) end-to-end security, (6) consumer access to information and the opportunity to contest and correct, complete and update data about them, as well as (7) standards for generating, recording and reporting logs and audit trails of the design process to enable review, and ensuring that such logs and audit trails are coded into the system.

4. Developing ethical standards for artificial intelligence computer programming to which the community of developers may refer in order to address the sorts of issues discussed in this paper, and which may be the basis of ongoing discussion for identifying new issues and how to approach them.

5. Developing standards for acceptable inferential analytics. These could address assessment of output data and decisions of machine learning models against privacy and antidiscrimination principles. They could also address when inferences of personal attributes (e.g., political opinions, sexual orientation or health) from different sources of data (e.g., internet browsing) are acceptable or privacy-invasive depending on the context. This might also include developing standards for
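The kind of output assessment contemplated in item 5 (checking machine learning decisions against antidiscrimination principles) can be illustrated with a minimal sketch. The group labels, the sample decisions and the 0.8 "four-fifths" screening threshold used here are assumptions chosen for illustration, not standards proposed in this paper.

```python
# Illustrative sketch only: one way an assessment standard might measure
# whether a model's decisions disadvantage one group relative to another.
# Group names, data and the 0.8 threshold are hypothetical assumptions.

def approval_rates(decisions):
    """Compute the per-group approval rate from (group, approved) pairs."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions, protected, reference):
    """Ratio of the protected group's approval rate to the reference group's."""
    rates = approval_rates(decisions)
    return rates[protected] / rates[reference]

# Hypothetical decisions tagged with a group attribute.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
ratio = disparate_impact_ratio(decisions, protected="group_b", reference="group_a")
# An assumed screening rule: flag ratios below 0.8 for human review.
flagged = ratio < 0.8
```

A standard of this kind would still need to specify which groups warrant protection, which decision outcomes count as favourable, and what threshold or statistical test should trigger review.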