Item: Detail
Metadata (type of data): Images and Labels; 4.1-Healthcare
Model Training and Fine-Tuning: Generative Adversarial Networks (GANs), Convolutional Neural Networks (CNNs)
Testbeds or Pilot Deployments: Initial prototyping efforts are focused on optimising the model for deployment on resource-constrained mobile hardware, such as entry-level smartphones powered by Acorn RISC Machine (ARM) Cortex-A series chipsets with limited Random Access Memory (RAM) (<2 GB) and no reliance on cloud connectivity. While full validation is ongoing, the quantized model has been designed for low latency and a compact memory footprint suitable for on-device inference (see the illustrative quantization sketch after this table). Future benchmarking will evaluate inference speed, energy efficiency, and diagnostic accuracy across a range of low-power devices to ensure equitable real-world applicability.
Code repositories: Not publicly disclosed
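Because the deployment code is not publicly disclosed, the following is only a minimal sketch of how the on-device quantization described in the table could be approached in PyTorch; the MobileNetV3-Small backbone, the class count, and the file name are hypothetical stand-ins, not details from the use case.

```python
# Illustrative sketch only: post-training dynamic quantization of a small CNN
# classifier for low-RAM, offline (on-device) inference. The backbone, number
# of classes, and file name are assumptions; EquiDermAI's code is not public.
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical lesion classifier; a compact backbone is chosen purely because
# it suits entry-level ARM Cortex-A devices with < 2 GB RAM.
model = models.mobilenet_v3_small(num_classes=7)  # class count assumed
model.eval()  # in practice, trained weights would be loaded first

# Dynamic quantization stores Linear-layer weights in int8, shrinking their
# memory footprint without requiring any cloud connectivity at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Trace and export with TorchScript so the model can run on-device without Python.
example = torch.randn(1, 3, 224, 224)
traced = torch.jit.trace(quantized, example)
traced.save("equidermai_int8.pt")  # hypothetical artefact name
```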
2 Use case description
2.1 Description
This use case aims to democratise smart dermatological diagnosis by tackling both the racial and the digital divide. Many diseases are increasingly being diagnosed with the help of AI and computer vision. However, diagnosis on darker skin remains difficult and inaccurate because of bias in the training datasets: the majority of training instances depict lesions on white skin. Darker skin tones are therefore underrepresented in AI-based dermatological diagnosis systems, which raises misdiagnosis rates for those populations. Another major problem is the digital divide; even where accurate diagnostic tools exist, many regions of the world lack the infrastructure needed to run computationally demanding tools. Together, these two problems represent a major disparity in dermatology. A common question is why colour is not simply removed from all images to ensure equality; the answer is that colour plays a pivotal role in skin disease and provides valuable information and patterns to the AI model.
This use case presents the solution EquiDermAI, a deep generative framework that addresses both the racial bias and the regional inaccessibility of these diagnostic tools (the digital divide). For countless diseases, datasets contain a much larger proportion of lighter skin tones, and AI models trained on them may classify lighter skin tones accurately while misdiagnosing the darker, more underrepresented ones. For any given disease, EquiDermAI takes a dataset as input, separates the lighter skin tones (the majority of the dataset) from the darker ones and, using the few darker-skin-tone training instances as reference, carries out a generative style transfer process based on generative adversarial networks (GANs) to synthesise novel training instances of the lesion across multiple skin tones from the large majority of lighter-skin-tone images. The style transfer process also continually learns to refine the pigmentation of the lesions, and can therefore generate new images of white-skin lesions transferred to darker, underrepresented skin tones that accurately depict how a lesion would look across different skin tones. The datasets used are expert-validated, and thus the reference images of the darker skin tones provide a measure to ensure that the
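As a rough illustration of the generative style transfer step described above, the sketch below shows a simplified, CycleGAN-style unpaired translation from the over-represented lighter-skin-tone domain (A) to an under-represented darker-skin-tone domain (B). The toy architectures, loss weights, image size, and the assumption that images are already grouped by skin tone are all illustrative choices; EquiDermAI's actual implementation is not publicly disclosed.

```python
# Illustrative sketch only: unpaired GAN style transfer from lighter-skin-tone
# lesion images (domain A) to a darker-skin-tone domain (B). All architectures
# and hyperparameters are assumptions made for this example.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Toy encoder-decoder generator (stand-in for a ResNet/U-Net generator)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Toy PatchGAN-style discriminator for the darker-skin-tone domain."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, padding=1),
        )
    def forward(self, x):
        return self.net(x)

G_ab, G_ba = Generator(), Generator()   # A->B and B->A (for cycle consistency)
D_b = Discriminator()                   # judges real vs. generated dark-tone images
adv_loss, cyc_loss = nn.MSELoss(), nn.L1Loss()
opt_g = torch.optim.Adam(list(G_ab.parameters()) + list(G_ba.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(D_b.parameters(), lr=2e-4)

def train_step(real_a, real_b):
    """One unpaired step: real_a = lighter-tone batch, real_b = darker-tone batch."""
    # Generator update: fool D_b while preserving lesion structure via cycle loss.
    opt_g.zero_grad()
    fake_b = G_ab(real_a)                      # light-tone lesion re-rendered in the dark-tone domain
    pred = D_b(fake_b)
    loss_adv = adv_loss(pred, torch.ones_like(pred))
    loss_cyc = cyc_loss(G_ba(fake_b), real_a)  # translating back should recover the original lesion
    (loss_adv + 10.0 * loss_cyc).backward()
    opt_g.step()
    # Discriminator update: distinguish real dark-tone images from generated ones.
    opt_d.zero_grad()
    pred_real, pred_fake = D_b(real_b), D_b(fake_b.detach())
    loss_d = 0.5 * (adv_loss(pred_real, torch.ones_like(pred_real))
                    + adv_loss(pred_fake, torch.zeros_like(pred_fake)))
    loss_d.backward()
    opt_d.step()
    return fake_b  # synthetic dark-tone instance for augmenting the training set

# Example usage with random tensors standing in for 64x64 RGB lesion crops:
synthetic = train_step(torch.randn(4, 3, 64, 64), torch.randn(4, 3, 64, 64))
```

In a pipeline of this kind, the synthetic darker-skin-tone images would augment the training set of the downstream CNN classifier, which is how the approach described above rebalances the dataset.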