Page 764 - AI for Good Innovate for Impact
Avatar Radar uses distance values to detect assets within the detection range and lists the detected assets to alert the user to their surroundings. The amount of information retrieved varies by detection stage, and the list is ordered by distance for UI convenience. When the alarm goes off, the Asset Listboard UI pops up so the user can check their surroundings, and they can select each detected asset to see what information is available. All of this information, including images, can also be viewed in Braille thanks to a Bluetooth connection to a tactile display device. This gives the visually impaired the same access to information as sighted people and the freedom to choose what they want to see.
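The detection-and-listing behaviour described above can be sketched as a simple distance filter followed by a nearest-first sort. The app itself runs in Unity, so this Python snippet, with hypothetical asset names and coordinates, only illustrates the ordering logic:

```python
import math

def radar_scan(avatar_pos, assets, detection_distance):
    """Return the names of assets within detection_distance of the
    avatar, ordered nearest-first for the Asset Listboard UI."""
    detected = []
    for name, (x, y) in assets.items():
        dist = math.hypot(x - avatar_pos[0], y - avatar_pos[1])
        if dist <= detection_distance:
            detected.append((dist, name))
    detected.sort()  # nearest assets first
    return [name for _, name in detected]

# Hypothetical exhibit positions, for illustration only.
assets = {"Exhibit A": (1.0, 2.0), "Exhibit B": (5.0, 5.0), "Wall": (0.5, 0.5)}
print(radar_scan((0.0, 0.0), assets, 3.0))  # -> ['Wall', 'Exhibit A']
```

In the real app the amount of detail attached to each entry would also vary with the detection stage; here only the ordering is shown.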
The Unity Collider component on which this app is based defines the shape of an object for handling physical collisions. Since colliders are invisible, they do not need to match the object's mesh exactly, so you can add a BoxCollider2D component to an object and set the collider's X and Y size to create a box-like physical wall over the obstacle. By using colliders to create physical walls on exhibits and structures, the app can warn the avatar when it approaches an obstacle ("There's a wall.", "There's an exhibit.") and prevent entry. This lets the user know whether an obstacle is present, regardless of their vision.
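In the app this is handled by Unity's BoxCollider2D and its collision callbacks; as a language-neutral sketch, the wall check reduces to an axis-aligned box test. The class and function names below are illustrative, not the app's actual code:

```python
from dataclasses import dataclass

@dataclass
class BoxCollider2D:
    """Axis-aligned box, mirroring Unity's center/size convention."""
    center_x: float
    center_y: float
    size_x: float
    size_y: float

    def contains(self, x, y):
        """True if the point (x, y) lies inside the box."""
        return (abs(x - self.center_x) <= self.size_x / 2 and
                abs(y - self.center_y) <= self.size_y / 2)

def try_move(avatar_pos, new_pos, walls):
    """Block the move and warn if the target point hits a wall collider."""
    for label, box in walls:
        if box.contains(*new_pos):
            print(f"There's a {label}.")  # spoken warning in the real app
            return avatar_pos             # entry prevented
    return new_pos

wall = BoxCollider2D(2.0, 0.0, 1.0, 4.0)  # illustrative exhibit wall
pos = try_move((0.0, 0.0), (2.0, 0.0), [("wall", wall)])
```

The same check works identically whether or not the user can see the wall, which is the point of the design.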
The platform is not an assistive tool for the blind, but rather a metaverse where blind and sighted people exist and share the same virtual space together. The display on the platform functions as a "manipulation tool" for the visually impaired: using the tablet's tilt sensor to move the avatar turns the display into an interface that can be manipulated like a joystick. This allows the user to take the initiative in exploring and experiencing the space. It also allows sighted and non-sighted people not only to experience the content in the same virtual space, but also to share their experience of and empathy with an exhibit through a simple means of feedback: liking it. Shaking the tablet while the exhibit's description pop-up is displayed triggers a thumbs up, and an audible cue reports how many thumbs up the exhibit has received. This allows people to share their feelings about the same content, even if they encounter it in different ways.
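The tilt-to-joystick mapping can be sketched roughly as follows. The actual app reads the tablet's tilt sensor through Unity; the normalized input range, dead zone, and scaling used here are illustrative assumptions:

```python
def tilt_to_velocity(tilt_x, tilt_y, speed=1.0, dead_zone=0.1):
    """Map tablet tilt (assumed normalized to -1..1 per axis) to an
    avatar velocity vector, so the display acts like a joystick."""
    def axis(t):
        # Ignore tiny tilts so the avatar stands still when the
        # tablet is held roughly level.
        return 0.0 if abs(t) < dead_zone else t * speed
    return (axis(tilt_x), axis(tilt_y))

print(tilt_to_velocity(0.05, 0.5))  # small x-tilt ignored: (0.0, 0.5)
```

A dead zone of this kind is a common choice for tilt controls, since raw sensor readings rarely sit exactly at zero.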
Use case status: This use case reflects feedback received after a blind planner participated in the accessibility planning and a blind person used the platform directly at a disability festival exhibition.
Partner: N/A
2.2 Benefits of the use case
Metaverse accessibility technologies empower users to engage in learning and exploration
without relying on sight. These tools help eliminate marginalization and discrimination faced
by individuals with visual impairments when accessing information or navigating environments.
They enable equal access to laboratories, museums, historical sites, and other educational
settings, allowing for independent and hands-on learning. The virtual nature of the metaverse
removes physical barriers, opening access to diverse learning content. This creates opportunities
for inclusive, high-quality education and lifelong learning—not only for people with visual
impairments, but also for older adults and individuals with limited mobility.
2.3 Future work
First, we plan to analyze what devices visually impaired people use in which environments and
how they utilize them. This will help us understand how they use their devices and the practical