Report ITU-R BS.2388-6 (09/2025) Usage guidelines for the Audio Definition Model and Multichannel Audio Files
Foreword
Policy on Intellectual Property Right (IPR)
TABLE OF CONTENTS
1 Introduction
2 Use cases
     2.1 Generating BWF audio files from scratch
          2.1.1 UC1.1: Common single-group channel-based files
          2.1.2 UC1.2: Common multiple-group channel-based files
          2.1.3 UC1.3: Non-common channel-based files
          2.1.4 UC1.4: Transformation/scene-based files
          2.1.5 UC1.5: Object-based files
          2.1.6 UC1.6: Mixed files
     2.2 Reading BWF audio files
          2.2.1 UC2.1: Common single-group channel-based files
          2.2.2 UC2.2: Common multiple-group channel-based files
          2.2.3 UC2.3: Non-common channel-based files
          2.2.4 UC2.4: Transformation/scene-based files
          2.2.5 UC2.5: Object-based files
          2.2.6 UC2.6: Mixed files
     2.3 Reading non-ADM WAV files
          2.3.1 UC3.1: One-, two-, five- and six-channel files
          2.3.2 UC3.2: Other numbers of channels
          2.3.3 UC3.3: Multiple mono files
     2.4 Generating BWF Files without information
          2.4.1 UC4.1: Generating one-, two-, five- and six-channel files
          2.4.2 UC4.2: Generating other numbers of channels
3 Best practices for ADM usage
     3.1 Using Common Definitions
          3.1.1 Using the Common Definitions when reading an audio file with an <axml> chunk
          3.1.2 Using the Common Definitions when writing an audio file with an <axml> chunk
     3.2 Element IDs
          3.2.1 ID prefixes
          3.2.2 Hexadecimal codes
               3.2.2.1 audioProgramme
               3.2.2.2 audioContent
               3.2.2.3 audioObject
               3.2.2.4 alternativeValueSet
               3.2.2.5 audioPackFormat
               3.2.2.6 audioChannelFormat
               3.2.2.7 audioBlockFormat
               3.2.2.8 audioStreamFormat
               3.2.2.9 audioTrackFormat
               3.2.2.10 audioTrackUID
          3.2.3 Recommended ID numbering for related elements
     3.3 Audio types
          3.3.1 Format types
     3.4 <chna> chunk and IDs
          3.4.1 Simple PCM channel-based files
          3.4.2 Simple matrix files
          3.4.3 PCM object-based files
          3.4.4 Coded audio files
     3.5 Defaults for unknown audio inputs
          3.5.1 Common Definitions approach
          3.5.2 Wave Format Extensible approach
          3.5.3 Generating other metadata for unknown audio inputs
          3.5.4 Naming audio channels and loudspeaker labels for Cartesian coordinates
     3.6 Times and durations
          3.6.1 Timing attributes
          3.6.2 Timing for nested audioObjects
          3.6.3 Block sizes for dynamic objects
          3.6.4 Dealing with preambles
     3.7 File management
     3.8  Chunk handling
     3.9 Ensuring streaming compatibility
     3.10 Interactivity and ensembles of audioObjects
          3.10.1 Example of interaction with an ensemble of audioObjects
          3.10.2 Behaviour of interaction with ensembles of audioObjects
     3.11 Multiple audioProgrammes
          3.11.1 Using multiple audioProgrammes versus audioContents and audioObjects
          3.11.2 Default audioProgramme
     3.12 Using the ‘importance’ parameters
          3.12.1 The audioBlockFormat importance parameter
          3.12.2 The audioPackFormat importance parameter
          3.12.3 The audioObject importance parameter
          3.12.4 Using an importance threshold
     3.13 Using the tagList sub-element
          3.13.1 Using the tagList sub-element for audio format type
               3.13.1.1 Sample code
          3.13.2 Using the tagList sub-element for presets (ARIB TR-B48)
               3.13.2.1 Sample code
4 Location of ADM metadata
     4.1 BW64 file specified in Recommendation ITU-R BS.2088
     4.2 A serial representation of the ADM (S-ADM) specified in Recommendation ITU-R BS.2125
5 Examples of ADM usage
     5.1 5.1 and Stereo combination
     5.2 Object-based with a channel-based bed
     5.3 Sharing tracks from audioObjects to achieve different mixes
          5.3.1 Basic example
          5.3.2 Using time parameters in audioObjects for different mixes
6 Other audio-related metadata that may accompany ADM
     6.1 Introduction
     6.2 RIFF/WAV-related Metadata
          6.2.1 Broadcast Metadata
               6.2.1.1 Elements and Attributes
               6.2.1.2 Sample code
     6.3 Audio Format Custom Metadata
          6.3.1 Sub-Elements and Attributes
               6.3.1.1 Sample code
          6.3.2 Types of audioFormatCustomSet
               6.3.2.1 Custom_Set_Type_MPEGH3DA
               6.3.2.2 Custom_Set_Type_Dolbye_DBMD_Chunk
References