
HAND GESTURE DRIVEN SMART HOME AUTOMATION LEVERAGING INTERNET OF THINGS




Dhananjay, Kumar¹; Sowbarnigaa, Kogilavani Shanmugavadivel¹; Mehal Sakthi, Muthusamy Sivaraja¹

¹ Department of Information Technology, Anna University, MIT Campus, Chennai, India





ABSTRACT

Smart home automation systems require a convenient and efficient user interface to control home appliances. Gesture recognition-based solutions offer flexibility to users and play a crucial role in advancing human-computer interaction and immersive computing environments. This work proposes a novel solution leveraging deep learning techniques with attention mechanisms, including self-attention tailored for processing 3D tensors derived from gesture images. A set of hand gestures is defined, and the system is trained and optimized to meet the real-time requirements of controlling devices. To improve accuracy, the model is trained in parallel with dynamic learning to adaptively fuse with the classification module. The proposed modular architecture is implemented using a Raspberry Pi with IoT devices for a typical home environment. The test results show a gesture classification accuracy of 98.24% and a latency of about 0.2 seconds in real-time control. The working model highlights a practical solution under ITU-T Recommendation J.1611, which deals with the functional requirements of a smart home and gateway.

Keywords – Gesture recognition, Smart Home Automation, Internet of Things, Attention mechanism

1. INTRODUCTION

Smart homes blend IoT devices and automation for seamless living. Gestural control represents an intuitive interface, enabling hands-free operation of devices and services within the smart home environment. By interpreting hand movements, gesture recognition systems trigger the operation of appliances. Moreover, IoT technology facilitates seamless communication among devices, creating a cohesive ecosystem where gestures drive home automation.

Gesture recognition offers a compelling solution for controlling devices in smart homes due to its natural and intuitive interface. Unlike traditional methods such as voice or remotes, gesture control allows users to interact with their environment using natural hand movements, eliminating the need for physical touch or voice commands. Additionally, gesture recognition provides a discreet, non-intrusive, and convenient way to interact with devices, even in noisy environments. The preference for gesture-based automation over voice-based techniques results from both technical considerations and user convenience in controlling home appliances. However, the solution needs to cater to the different functional requirements of all users, irrespective of their abilities and disabilities. Overall, the system design goal is to enhance the user experience in smart homes by offering a user-friendly, customizable, and versatile method of device control.

Existing state-of-the-art systems such as Smartify [1] provide home automation by accessing devices via mobile phones and voice-capturing technologies. However, voice command-based systems may struggle to distinguish commands accurately amidst background noise or music, leading to errors or misinterpretations. The proposed gesture recognition system focuses solely on hand movements, eliminating the influence of ambient sounds. This ensures precise and reliable control of appliances, even in noisy environments. Other state-of-the-art systems, such as Fibaro [2], primarily utilize a single gesture, a swipe, to control devices. However, this necessitates the placement of multiple hardware units across various locations within the same room for comprehensive device control. In contrast, our proposed solution extends beyond single gestures, offering a diverse range of gestures for intuitive device management. Crucially, it eliminates the need for several handheld hardware units by enabling the control of multiple devices from a single location. This enhances user experience and convenience, streamlining smart home interactions without compromising functionality or accessibility.

Conventional approaches to gesture classification often rely on 2D images, limiting their ability to capture the depth and spatial dynamics inherent in human gestures. This limitation underscores the need for a more sophisticated approach, prompting the exploration of 3D tensor representations derived from images. Existing research works [3-5] on gesture recognition utilize deep learning models with convolutional neural networks. The proposed model uses a deep learning model with an attention mechanism and a transfer learning model with a dynamic learning rate
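As a rough illustration of applying self-attention to a 3D feature tensor derived from a gesture image, in the spirit of the mechanism summarized above, consider the following minimal sketch. The convolutional stem, layer sizes, gesture-class count, and pooling choice are illustrative assumptions, not the authors' implementation, which is detailed in later sections.

```python
# Minimal sketch (not the authors' code): self-attention over the spatial
# positions of a 3D feature tensor produced from a gesture image.
# All shapes, layer sizes, and the gesture-class count are assumptions.
import torch
import torch.nn as nn

class GestureSelfAttention(nn.Module):
    def __init__(self, channels=64, num_heads=4, num_classes=6):
        super().__init__()
        # Convolutional stem turns the RGB gesture frame into a 3D feature tensor
        self.stem = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        # Self-attention across the flattened spatial positions of the tensor
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=num_heads,
                                          batch_first=True)
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, x):                       # x: (B, 3, H, W) gesture image
        feat = self.stem(x)                     # (B, C, H', W') 3D feature tensor
        b, c, h, w = feat.shape
        seq = feat.flatten(2).transpose(1, 2)   # (B, H'*W', C) sequence of positions
        attended, _ = self.attn(seq, seq, seq)  # self-attention over positions
        pooled = attended.mean(dim=1)           # global average over positions
        return self.classifier(pooled)          # gesture class logits

# Example: one 128x128 RGB frame -> logits over 6 illustrative gesture classes
logits = GestureSelfAttention()(torch.randn(1, 3, 128, 128))
```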



