
interact with law enforcement if there is an accident or another unintended outcome.
   Part of the correct functioning of algorithms, including the prevention of future harm, involves ensuring their continued maintenance. Some have called for an ongoing legal requirement to monitor outcomes from algorithms, provide mechanisms for receiving feedback (e.g., complaints), conduct inspections, and correct models. 219 Such sophisticated matters are beyond the capability of consumers, who lack the expertise and resources to carry them out. Sometimes human monitoring will be important, not merely as part of an appeal brought by a consumer, but as part of the decision-making process itself. Such human involvement needs to be thoughtfully explored.
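   To make this concrete, the sketch below is purely illustrative: the class names, threshold and metric are hypothetical and are not drawn from this report or from any cited standard. It shows how ongoing monitoring might combine outcome logging, a channel for receiving consumer complaints, and the routing of borderline decisions to a human reviewer as part of the decision-making process:

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Decision:
        applicant_id: str
        score: float              # model output, e.g. estimated repayment probability
        approved: bool
        needs_human_review: bool


    @dataclass
    class MonitoredModel:
        threshold: float = 0.6         # hypothetical approval cut-off
        review_band: float = 0.05      # scores this close to the cut-off go to a human
        complaints: List[str] = field(default_factory=list)
        decisions: List[Decision] = field(default_factory=list)

        def decide(self, applicant_id: str, score: float) -> Decision:
            # Borderline cases are routed to a human reviewer as part of the
            # decision-making process itself, not merely on appeal.
            borderline = abs(score - self.threshold) < self.review_band
            decision = Decision(applicant_id, score, score >= self.threshold, borderline)
            self.decisions.append(decision)      # retain outcomes for later inspection
            return decision

        def receive_complaint(self, applicant_id: str, text: str) -> None:
            # Feedback mechanism: complaints are logged for review and model correction.
            self.complaints.append(f"{applicant_id}: {text}")

        def approval_rate(self) -> float:
            # A simple outcome metric an internal or external inspector could track.
            if not self.decisions:
                return 0.0
            return sum(d.approved for d in self.decisions) / len(self.decisions)


    if __name__ == "__main__":
        model = MonitoredModel()
        print(model.decide("A-001", 0.62))
        model.receive_complaint("A-001", "My income information is out of date.")
        print(f"Approval rate so far: {model.approval_rate():.0%}")

   In practice the outcome metrics, review criteria and complaint handling would need to be considerably richer, and would sit alongside the inspection and model-correction duties described above.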

6.2 Integrating data privacy by design
Effectively addressing consumer protection and data privacy in big data and machine learning will require going beyond laws and regulations, and beyond tick-the-box compliance with them. It will need to include designing products and services to minimise invasion of privacy. The seven principles of privacy by design developed under the leadership of Ann Cavoukian 220 are set out below (a short, illustrative sketch of how some of them might appear in code follows the list):
1. Be proactive not reactive, preventative not remedial, anticipating and preventing privacy-invasive events before they happen;
2. Make privacy the default setting, so that consumers do not have to change settings to protect their privacy, i.e., use opt-in rather than opt-out consents;
3. Embed privacy into design, integral to the system without diminishing functionality, as opposed to bolted on after design (e.g., including the feature of data portability);
4. Adopt a win-win approach, benefitting from stronger consumer trust and lower risk from data breaches;
5. Employ end-to-end security, ensuring secure intake, storage and destruction of data over the life cycle (including encryption of data in storage and in transfer);
6. Show visibility and transparency, using policies and keeping records to enable internal monitoring and independent verification; and
7. Demonstrate respect for user privacy, providing individuals access to information and the opportunity to contest and correct, complete and update data about them.
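   As a purely illustrative sketch, the code below shows how principles 2, 5 and 7 might surface in a product: opt-in defaults, encryption of stored data, and the ability for consumers to view, correct and delete data held about them. The class and field names are hypothetical, and the third-party cryptography package is assumed to be available.

    from dataclasses import dataclass
    from typing import Dict, Optional
    from cryptography.fernet import Fernet   # third-party package, assumed available


    @dataclass
    class ConsumerRecord:
        consented_to_marketing: bool = False   # principle 2: opt-in, off by default
        encrypted_profile: Optional[bytes] = None


    class PrivacyByDesignStore:
        # Holds consumer data encrypted at rest and lets consumers view, correct
        # and delete it (principles 5 and 7). Hypothetical class, for illustration.

        def __init__(self) -> None:
            self._fernet = Fernet(Fernet.generate_key())
            self._records: Dict[str, ConsumerRecord] = {}

        def save_profile(self, consumer_id: str, profile: str) -> None:
            record = self._records.setdefault(consumer_id, ConsumerRecord())
            record.encrypted_profile = self._fernet.encrypt(profile.encode())

        def view_profile(self, consumer_id: str) -> str:
            # Principle 7: individuals can access the data held about them.
            record = self._records[consumer_id]
            if record.encrypted_profile is None:
                return ""
            return self._fernet.decrypt(record.encrypted_profile).decode()

        def correct_profile(self, consumer_id: str, corrected: str) -> None:
            # Principle 7: individuals can contest, correct and update their data.
            self.save_profile(consumer_id, corrected)

        def delete_profile(self, consumer_id: str) -> None:
            # Principle 5: secure destruction at the end of the data life cycle.
            self._records.pop(consumer_id, None)


    if __name__ == "__main__":
        store = PrivacyByDesignStore()
        store.save_profile("C-001", "income=monthly; district=Nairobi")
        print(store.view_profile("C-001"))
        store.correct_profile("C-001", "income=weekly; district=Nairobi")
        store.delete_profile("C-001")

   A production system would of course also need key management, audit logging and secure transfer, in line with principles 5 and 6.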
   It will require privacy engineering in product development, including integration into the training of computer scientists. For instance, Carnegie Mellon University offers a Master of Science in Information Technology – Privacy Engineering program that addresses a range of such subjects. 221

6.3 Ethics and self-regulation
Beyond management and engineering, there are broader efforts underway to change the underlying attitudes and awareness of those in the tech industry. Self-regulatory efforts build on principles proposed by sector participants and others. They emphasize accuracy, fairness, accountability and transparency, sustainable growth, and privacy. 222 These include steps in the engineering community to develop ethics for artificial intelligence and autonomous decision-making.
   Bodies such as the Association for Computing Machinery (ACM) 223 and the Institute of Electrical and Electronics Engineers (IEEE) are examples, 224 as well as the Partnership on AI, 225 the Software & Information Industry Association (SIIA), 226 and companies such as Google 227 and Microsoft. 228 These are accompanied by the work of organizations such as Fairness, Accountability, and Transparency in Machine Learning (FAT/ML), 229 Privacy International, 230 the Future of Life Institute, 231 the Center for Democracy & Technology (CDT), 232 and the Leadership Conference. 233
   Specifically in the field of financial services, as mentioned in the Introduction (section 3), the Smart Campaign has produced draft Indicators on Algorithms & Data-Driven, Automated Decisions as part of its Digital Credit Standards (see Annex B (Smart Campaign Digital Credit Standards)), many of which have been cited throughout this report. The Smart Campaign 234 is housed at the Center for Financial Inclusion at Accion. 235 It develops and promotes self-regulatory standards for consumer (and other client) protection in financial inclusion, including managing a certification program for financial service providers. Smart and MFR, 236 an independent rating agency that conducts a large proportion of Smart's client protection certifications, prepared the client protection standards for digital credit providers. They pilot tested the standards with two financial service providers operating in Kenya that use automated interactions with consumers (4G Capital 237 and Tala 238), and have published revised Standards in light of the pilot. The World Bank's Use of Alternative Data to Enhance Credit Reporting to Enable Access to Digital Financial Services by Individuals and SMEs operating in the Informal Economy (see section 3 and footnote 14) is another significant example of guidance for financial service providers.


