
AI for Good Innovate for Impact



                   Use Case 7: Blind Metaverse








               Organization: Humanities Artificial Intelligence Research Center of Nara Knowledge Information Co.

               Category tag: 4.9 Accessibility

               Country: South Korea

               Contact:
                    Lee Jae Hyuk, Principal Researcher, jakelee@narainformation.com
                    Shin Dae Geon, Research Engineer, eorjs5518@narainformation.com


               1      Use case summary table


                Item                        Details
                Category                    Accessibility
                The problem to be           Lack of metaverse accessibility for the visually impaired:
                addressed                   difficulty acquiring information, difficulty maneuvering, and
                                            complexity of the User Interface (UI)/User Experience (UX).
                Key aspects of the          Textualize information by touch and provide it aloud, detect
                solution                    surrounding information centered on the user's avatar, convert
                                            surrounding assets and event information into tactile (Braille)
                                            information, and reposition the avatar using tilt.
                Technology keywords         One Touch UI, Gyro Sensor, Avatar Radar
                Data availability           Private
                Metadata (type of data)     Image
                Model training and          Blind-friendly UI/UX, using distance values to provide
                fine-tuning                 information, and tilt detection for avatar movement
                Testbeds or pilot           Avatar Radar-based Metaverse Accessibility Platform [1]
                deployments


               2      Use Case Description


               2.1     Description

               The Metaverse opens up communication and learning opportunities, beyond physical
               constraints, for people with limited mobility. However, smartphones and tablets that are
               controlled by touching the display are difficult for the visually impaired to access, because
               both the environment and the UI are display-based. The key features of this Metaverse
               accessibility technology are One Touch UI, a touch interface for the visually impaired, and
               Avatar Radar, which detects content surrounding the user's avatar and, via a gyro sensor,
               moves the avatar according to the tilt of the display. Together they provide an optimized
               Metaverse experience for the visually impaired by enabling them to recognize surrounding
               content without relying on sight: users tilt the display to move the avatar in the desired
               direction.
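               The two mechanisms described above can be sketched in a few lines. The following is a
               minimal illustration, not the platform's actual implementation: `radar_scan` stands in for
               Avatar Radar's distance-based detection of nearby assets, and `tilt_to_step` stands in for
               the gyro-sensor tilt-to-movement mapping; all names, thresholds, and coordinates are
               hypothetical.

```python
import math

# Illustrative sketch only: names and parameters are assumptions,
# not the real Avatar Radar API.

def radar_scan(avatar_pos, assets, radius=10.0):
    """Return assets within `radius` of the avatar, nearest first,
    as (label, distance) pairs suitable for speech or Braille output."""
    hits = []
    for label, pos in assets.items():
        d = math.dist(avatar_pos, pos)
        if d <= radius:
            hits.append((label, round(d, 1)))
    return sorted(hits, key=lambda h: h[1])

def tilt_to_step(pitch, roll, dead_zone=5.0, speed=0.1):
    """Map device tilt (in degrees) to an (dx, dy) avatar step,
    ignoring small tilts inside the dead zone."""
    dx = roll * speed if abs(roll) > dead_zone else 0.0
    dy = -pitch * speed if abs(pitch) > dead_zone else 0.0
    return (dx, dy)

assets = {"door": (3.0, 4.0), "npc": (20.0, 0.0), "kiosk": (0.0, 6.0)}
print(radar_scan((0.0, 0.0), assets))  # door and kiosk are in range; the npc is not
print(tilt_to_step(pitch=-20.0, roll=10.0))  # tilt forward-right -> step up-right
```

               In a real client, the radar results would be fed to a screen reader or Braille display, and
               the tilt step would be applied each frame; the dead zone keeps hand tremor from moving
               the avatar unintentionally.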


