bureaus may nevertheless find themselves subject to legal obligations that apply to traditional credit reference bureaus. In some cases, such companies could find themselves subject to claims for failure to supply accurate information that has a bearing on a person's credit worthiness.

Many countries recognise a public interest in ensuring "fair and accurate credit reporting," as formulated in the US, for example.108 This both benefits the functioning of financial services markets and protects consumers. For this reason, consumer reporting agencies whose data are used for credit transactions, insurance, licensing, consumer-initiated business transactions, and employment are often regulated.109

However, many countries' consumer reporting laws were enacted before the advent of the internet, let alone big data and machine learning. Some countries have a broader concept of consumer reporting agencies. In the US, for example, the Fair Credit Reporting Act (FCRA) applies to companies that regularly disseminate information bearing on an individual's "credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living."110 The FCRA requires consumer reporting agencies to "follow reasonable procedures to assure maximum possible accuracy" of consumer reports; to notify providers and users of consumer information of their responsibilities under the Act; to limit the circumstances in which such agencies provide consumer reports "for employment purposes"; and to post toll-free numbers for consumers to request reports. It also creates liability for failure to comply with these requirements.111

In a 2016 report, the US consumer protection agency, the Federal Trade Commission (FTC), considered how big data is used in credit reporting decisions.112 The FTC clarified that data brokers that compile "non-traditional information, including social media information" may be considered credit reporting agencies subject to these obligations.

This is not a mere theoretical possibility. For instance, in the recent US Supreme Court case Spokeo v Robins,113 Spokeo operated a website which searched and collected data from a wide range of databases. It provided individuals' addresses, phone numbers, marital status, approximate ages, occupations, hobbies, finances, shopping habits and musical preferences, and allowed users to search for information about other individuals. The plaintiff, Robins, alleged that Spokeo incorrectly described him as a wealthy, married professional, resulting in him being adversely perceived as overqualified for jobs. Robins claimed that Spokeo was a "consumer reporting agency" under the FCRA, and was liable to him for having supplied incorrect information.114 The case was resolved on other grounds, but the potential breadth of such legacy legislation poses challenges for firms operating in the data business. It may give rise to unanticipated responsibilities to consumers for the accuracy of data used to make credit and other decisions, weaken legal certainty, and undermine business innovation and investment.

Credit reporting requirements and the wider information ecosystem

The discussion above concerned the responsibilities to consumers that firms may have when dealing with data in non-traditional ways, in particular regarding the accuracy of data they use for decisions in financial services. A related question arises concerning firms' responsibility to contribute to the wider information ecosystem that is traditionally regulated by disclosure and reporting obligations.

Disclosure obligations arise in numerous contexts, whether due to securities law requirements applicable to public companies, health and safety disclosures for medicines, or consumer products that pose particular risks. In the financial services context, for example, a person's credit history is useful data for a financial service provider, reducing the asymmetry of information between lender and borrower. To improve competition among service providers that hold such data, and the functioning of financial markets more generally, financial service providers are often required to report credit data about consumers to consumer reporting organizations, which organize the data and make it available to the market as a whole.

In many countries, only banks (i.e., entities that are regulated, typically with banking licences, for deposit taking, lending and other related activities) are required to report to credit reference bureaus for inclusion in the credit reference bureau's records and analytics. Today, the question arises whether non-banking financial service providers that rely on automated decisions using alternative data to profile risk should be obligated to report the results of such lending to credit reference bureaus as well.

Some consider that alternative lenders should be required to supply credit data to credit reference bureaus both where a consumer's loan is successfully repaid (positive reporting data) and where the consumer defaults on the loan (negative reporting data).115 Doing so may provide a more "level playing field" of regulatory obligations for similar activities (lending) rather than applying different regulatory obligations depending on the type of entity (a bank