Page 41 - FIGI - Big data, machine learning, consumer protection and privacy
impose economic loss on a person, for example through denying, or raising the price of, goods or services due to a person's classification as a member of a particular group (e.g., a person's neighbourhood, sometimes called "redlining"). A person may suffer a loss of opportunity, for example as a result of filtering candidates for a loan, credit limit increase or insurance contract according to race, genetic or health information.

Some harms are unlawful in some countries where they involve discrimination on the basis of race, religion, criminal history or health. In these cases, existing laws will specifically protect certain classes of people and may prohibit discriminatory outcomes. However, where membership of a protected class is not involved, there may be little way to show harm.

Another difficulty facing consumers harmed by big data and machine learning systems is identifying who should be held liable for the damage – for example, the firm employing the system, the firm that coded the algorithms, or the firm that supplied the data? Demonstrating the precise cause and tracing the responsible party may be impossible for the consumer.

Section 6.2 discussed various things that operators of machine learning systems can do to reduce risk of bias. In addition to these, some have suggested requiring some firms relying on artificial intelligence and machine learning to obtain insurance, or other guarantees of financial responsibility, to provide a means of redress for those harmed.209 While this may be more immediately obvious for personal injury cases involving equipment such as autonomous vehicles than claims for lost opportunity, it might be considered for cases of harm caused by data breaches by processors of large data sets.

It has also been suggested that when courts and legislators address claims for some form of injury resulting from artificial intelligence and machine learning, they should draw from the rich body of product liability law. This might in some cases mean applying strict liability, i.e., without showing causation, negligence or fault (let alone intention), for certain harms. Again, redress mechanisms should incentivise providers to address the problems both before and after they arise. For example, product liability law often seeks to avoid undermining the incentive of manufacturers to fix faults after their products cause harm out of fear that this will be treated as an admission of responsibility for the harm. In such cases, the law will provide that such steps are not admissible as evidence of fault.210

Overall, much remains to be done in most jurisdictions to give consumers effective remedies for breaches of their privacy and the risks of big data and machine learning.
6 RISK MANAGEMENT, DESIGN AND ETHICS
The previous sections have discussed consumer protection and data privacy, focusing on legal and regulatory treatment and remedies. The resulting uncertainty presents a risk to business of being held responsible for violating antidiscrimination laws or incurring substantial liability for damages for privacy violations and data security breaches. This section looks at various steps that companies can take to mitigate these risks.

6.1 Risk management

A common approach in situations of uncertainty is to apply risk management frameworks and processes, and thus good big data model design includes building risk management into the model.211 For example, some financial service providers like Mastercard will apply the cross-industry standard process for data mining (CRISP-DM), which provides a structured approach to planning data mining projects.212

Such frameworks and processes may be employed to assess risks associated with consumer privacy and discrimination, just as any other risk. The US National Institute of Standards and Technology (NIST) recently launched work on a Privacy Framework,213 focusing on risk management approaches modelled on its Cyber Security Framework. This framework emphasizes the importance of prioritising risk management over "tick-the-box" compliance approaches.

Risk management processes for machine learning systems might include documenting objectives and assumptions, and employing "three lines of defence" that ensure separation (by process, roles, parties involved and incentives) of:

• development and testing of a machine learning model;
• its validation and legal review; and
• periodic auditing of the model throughout its lifecycle.214
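The separation-of-duties idea behind the three lines of defence can be sketched in code. The following Python sketch is purely illustrative (the class, field and line-of-defence names are assumptions for this example, not any standard schema): it records a model's documented objectives and assumptions and enforces that no single reviewer signs off more than one line of defence.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModelRiskRecord:
    """Hypothetical record of the 'three lines of defence' around one model."""
    model_name: str
    objectives: list[str]            # documented business objectives
    assumptions: list[str]           # documented modelling assumptions
    sign_offs: dict[str, str] = field(default_factory=dict)
    audit_dates: list[date] = field(default_factory=list)

    LINES_OF_DEFENCE = (
        "development_and_testing",
        "validation_and_legal_review",
        "periodic_audit",
    )

    def sign_off(self, line: str, reviewer: str) -> None:
        if line not in self.LINES_OF_DEFENCE:
            raise ValueError(f"unknown line of defence: {line}")
        # Separation of duties: one reviewer may not cover two lines.
        if reviewer in self.sign_offs.values():
            raise ValueError(f"{reviewer} already signed off another line")
        self.sign_offs[line] = reviewer

    def ready_for_deployment(self) -> bool:
        # Deployable only once development/testing and validation/legal
        # review each have an independent sign-off; auditing is periodic
        # and continues over the model's lifecycle.
        return all(line in self.sign_offs for line in self.LINES_OF_DEFENCE[:2])
```

In use, a record would be created when the model project starts, signed off by independent parties as each line completes, and updated with audit dates over the model's life.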