Page 46 - AI for Good - Impact Report




Caitlin Kraft-Buchman of Women at the Table shares recommendations on ensuring gender equality in AI development and deployment (2024):

•    Construct large new unbiased datasets for the public good, with a focus not only on quantity but also on quality. It is important to actively produce open, gender-disaggregated datasets, which will better enable an understanding of the sources of bias in AI and ultimately improve the performance of machine learning systems. We need to invest in controls that oversee data collection processes and in human-in-the-loop verification, so that data is not collected at the expense of women and other traditionally excluded groups. And of course, it is also vital to engage in more inclusive data collection processes that, again, prioritize the quality of datasets as well as their quantity.
•    Pilot AI that allocates 21st-century social protection, subsidies, and scholarships where women and girls have traditionally been left behind, and encourage public institutions to innovate and lead in this domain. We need to be creative with small, targeted, impactful pilots, grounded in social science research, that allocate social incentives, subsidies, or scholarships where women have traditionally been excluded by prior systems. This is a positive agenda that advances the values of equality we have long embraced, correcting the visibility, quality, and influence of women in proportion to the population. STEM education alone will not get us where we want to go.
•    Enact gender-responsive public procurement guidelines for organizations and all levels of government, with hard targets and a clear outline of the roles and responsibilities of the organizations required to apply these principles. This could jumpstart new industries and value creation invented and owned by women and girls, expanding definitions of ‘expertise’ so that people with lived experience and communities affected by technologies can influence the design, deployment, and control of new technologies.
•    Mandate algorithmic impact assessments that take an integrated approach, holistically covering gender, human rights, and environmental impact. These assessments must be conducted beforehand and continuously throughout the lifecycle of the system. We need rigorous testing across that lifecycle, accounting for the origins and use of training data, test data, models, application programming interfaces (APIs), and other components over the product lifecycle. AI should improve the quality of the human experience, not control it.
•    Enshrine the public’s right to know about the systems that affect their lives whenever algorithmic decisions are made about an individual; this right should include continuous consent and extend to the contestability of those systems.


































