Page 43 - ITU Journal, ICT Discoveries, Volume 3, No. 1, June 2020 Special issue: The future of video and immersive media



3.1.2  Temporal property

OMAFv2 provides tools for altering the temporal properties of an overlay based on user interaction and the background timeline. A user may choose to switch from the currently viewed overlay to a new overlay, potentially altering the temporal property of the overlay relative to the background video (see also section 3.1.3). The standard also allows a content provider to control the playback of an overlay based on the user's viewing direction; this enables control of the temporal property of the overlay based on the spatial direction. For example, the content provider may choose to pause the playback of an overlay when the overlay is not visible in the user's viewport, and once the user returns to the viewport where the overlay is visible, playback resumes from the paused state. Such a scenario can be realized for both the 2D and spherical overlays of Fig. 2 and Fig. 3, respectively.

3.1.3  Interactivity property

The OMAFv2 standard provides the flexibility to enable user interaction with an overlay. Possible interactions that can be performed on an overlay include the following:

1)   rotate an overlay;

2)   resize an overlay;

3)   switch an overlay on or off;

4)   change the spatial position of an overlay within the omnidirectional system;

5)   switch to a new overlay.

3.1.4  Inherent property

OMAFv2 provides for the following inherent properties to be either explicitly signaled or implied by the usage of an overlay:

1.   Source: an overlay can be carried in a bit stream separate from the background bit stream, or be part of the same stream as the background. The latter indicates that the overlay video spatially coexists with the background video.

2.   Representation: an overlay can be a 2D video/image or a spherically projected video/image.

3.   Priority: the standard allows a priority to be signaled and bound to an overlay. The signaled priority helps the omnidirectional system take smart decisions in the presence of a large number of overlays, based on its available network bandwidth and resources, such as memory and system buffers.

4.   Opacity and alpha channel: the standard allows the opacity and alpha values of an overlay to be signaled, which helps in controlling the transparency of the overlay.

4.   MULTI-VIEWPOINT IN THE MPEG OMAF STANDARD

Omnidirectional cameras typically capture subjects with sufficient detail only when the subjects are close to the camera; subjects further away appear with lower detail. In addition, OMAF (v1) content supports a single viewpoint with three degrees of freedom (3DoF), which allows viewing content in all directions around a single location. However, a single viewpoint does not allow watching a person, event or object of interest from a different location. Hence, OMAFv2 has incorporated support for multiple viewpoints, both to enable high-quality content capture and to provide the possibility of experiencing any subject or event of interest from a different perspective. Furthermore, multiple viewpoints facilitate leveraging the well-established cinematic rules for multi-camera directors, which make use of different shot types such as wide angles, mid-shots and close-ups [21].

The standard supports enablers which provide more freedom for content creators to design content for diverse scenarios and effective storytelling.

Fig. 5 – Example of a basketball game with multiple viewpoints.

Fig. 5 illustrates an example of OMAFv2 content with multiple viewpoints (VPk, VPl and VPm). This allows the user to experience the action close to where it happens, as well as from different perspectives. In the following, the key concepts and
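To make the overlay behaviour above concrete, the following sketch models the inherent properties of section 3.1.4 and the viewport-dependent pause/resume of section 3.1.2. All names (Overlay, OverlayPlayer, etc.) are hypothetical illustrations chosen for this example, not identifiers from the OMAF specification:

```python
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    SEPARATE_STREAM = 0   # overlay carried in its own bit stream
    IN_BACKGROUND = 1     # overlay spatially coexists with the background video

class Representation(Enum):
    VIDEO_2D = 0          # 2D video/image overlay (cf. Fig. 2)
    SPHERICAL = 1         # spherically projected video/image (cf. Fig. 3)

@dataclass
class Overlay:
    """Hypothetical container for the inherent overlay properties."""
    overlay_id: int
    source: Source
    representation: Representation
    priority: int        # signaled priority, used when resources are scarce
    opacity: float       # 0.0 fully transparent .. 1.0 fully opaque

class OverlayPlayer:
    """Pauses an overlay outside the viewport, resumes it on return."""
    def __init__(self, overlay: Overlay):
        self.overlay = overlay
        self.position = 0.0   # overlay media time, in seconds
        self.paused = False

    def tick(self, dt: float, visible_in_viewport: bool) -> None:
        # Pause when the overlay leaves the user's viewport; resume
        # playback from the paused position when it is visible again.
        self.paused = not visible_in_viewport
        if not self.paused:
            self.position += dt

player = OverlayPlayer(Overlay(1, Source.SEPARATE_STREAM,
                               Representation.VIDEO_2D, priority=0, opacity=0.8))
player.tick(1.0, visible_in_viewport=True)    # plays: position advances to 1.0
player.tick(1.0, visible_in_viewport=False)   # paused: position stays at 1.0
player.tick(1.0, visible_in_viewport=True)    # resumes: position advances to 2.0
print(player.position)
```

A real player would drive `tick` from its render loop and derive visibility from the user's viewing direction; the sketch only shows how the signaled properties and the pause state interact.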




                                                © International Telecommunication Union, 2020                 21