Advanced Systems Format (formerly Advanced Streaming Format, Active Streaming Format) is Microsoft's proprietary digital container format.
An ASF file can contain multiple independent or dependent streams, including multiple audio streams for multichannel audio, or multiple bit rate video streams suitable for transmission over different bandwidths. The streams can be in any compressed or uncompressed format. The basic syntactic elements an ASF file is composed of are called Objects. There are three top-level objects: the Header Object, the Data Object, and an optional Index Object. Lower-level objects are contained within the top-level objects.
Each top-level or lower-level object begins with a globally unique identifier (GUID) and a size value.
These values allow the file reader to parse the information into identifiable objects at the appropriate places. Because of these GUIDs, lower-level objects can be sent in any order and still be recognized. The ASF format is designed to tolerate inaccurate data reception: a partially downloaded ASF file can still be read, as long as it contains the Header Object and at least one Data Object.
A GUID is a 128-bit unsigned data type. It uniquely identifies each media object type. New media types, codec types, error correction approaches, and other innovations can be created, identified by their own GUIDs, and inserted into ASF data streams. If new ASF object types are defined, each new type needs its own unique GUID.
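The GUID-plus-size layout described above can be sketched in Python. This is an illustrative walker over a buffer of ASF objects, assuming (per the specification) that each object's 64-bit little-endian size field counts the 24-byte GUID/size preamble itself:

```python
import struct
import uuid

def parse_objects(data: bytes):
    """Walk a buffer of ASF objects: each object is a 16-byte GUID
    followed by a 64-bit little-endian size that includes the
    24-byte GUID+size preamble."""
    objects = []
    pos = 0
    while pos + 24 <= len(data):
        # ASF stores GUIDs in Windows little-endian byte order
        guid = uuid.UUID(bytes_le=data[pos:pos + 16])
        (size,) = struct.unpack_from("<Q", data, pos + 16)
        if size < 24 or pos + size > len(data):
            break  # malformed or truncated object
        objects.append((guid, size))
        pos += size
    return objects
```

Because each object carries its own size, a reader can skip any object whose GUID it does not recognize, which is what makes the format extensible.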
Top level objects
Header Object
Its role is to provide a well-known byte sequence at the beginning of an ASF file. It also contains general information about the file, such as file size, number of streams, error correction methods, and codecs used. Metadata is also stored here. The Header Object is the only top-level object that can contain other objects, and an ASF file contains exactly one Header Object. The Header Object must contain the following objects: File Properties, Stream Properties, and Header Extension.
File properties object
Defines the global characteristics of the combined digital media streams found within the Data object.
Stream properties object
Defines how a digital media stream within the Data Object is interpreted, as well as the specific format of the Data Packets themselves. May be embedded in the Extended Stream Properties Object.
Header Extension Object
Contains 0 or more extended header objects. This allows additional functionality to be added to an ASF file while maintaining backward compatibility.
Optional objects in the Header Object
Codec List Object
Provides information about the codecs and formats.
Script Command Object
Provides a list of Unicode strings that are synchronized to the ASF file's timeline.
Bitrate Mutual Exclusion Object
Identifies video streams that have a mutual exclusion relationship to each other. For each set of streams that share a mutual exclusion relationship, one such object must be present.
Error Correction Object
Defines the error correction method.
Content Description Object
Stores standard bibliographic information such as title, author, copyright, description, and rating information.
Extended Content Description Object
Stores content that is beyond the standard bibliographic information.
Stream Bitrate Properties Object
Defines the average bit rate of each digital media stream.
Content Branding Object
Stores branding data, including information about a banner image and copyright associated with the file.
Content Encryption Object
Stores DRM (digital rights management) data needed to play back protected content.
Extended Content Encryption Object
Stores additional DRM data used by newer versions of Windows Media DRM.
Digital Signature Object
Contains a digital signature over the header content.
Padding Object
Pads the size of the Header Object, enabling the size of any object stored in the Header Object to grow or shrink without rewriting the entire Data Object and Index Object sections.
Objects that may be present in the Header Extension Object:
Extended Stream Properties Object
Defines additional optional properties of the stream that are not described in the Stream Properties Object.
Advanced Mutual Exclusion Object
Must be used if any of the streams in the mutual exclusion relationship are hidden.
Group Mutual Exclusion Object
Describes mutual exclusion relationships between groups of streams.
Stream Prioritization Object
Defines the relative priorities of the streams, indicating which streams should be dropped first when bandwidth is constrained.
Bandwidth Sharing Object
Indicates streams that share bandwidth, such that the maximum bandwidth of the set of streams is less than the sum of the maximum bandwidths of the individual streams.
Language List Object
Contains an array of Unicode-based language IDs.
Metadata Object
Permits authors to store stream-based metadata in a file.
Metadata Library Object
Stores stream-based, language-attributed, multiply defined and large attributes in a file.
Index Parameters Object
Supplies information about streams actually indexed by the Index Object.
Media Object Index Parameters Object
Supplies information about the streams actually indexed by media objects (in the Media Object Index Object).
Timecode Index Parameters Object
Supplies information about the streams actually indexed by timecodes (in the Timecode Index Object).
Compatibility Object
Reserved for future use.
Advanced Content Encryption Object
Stores DRM data for additional content-protection systems.
Data Object
Contains all Data Packets, organized in terms of increasing send times. A Data Packet can contain interleaved data from several streams. The data can consist of entire objects from one or more streams, or of partial objects.
The Data Object's header structure is mandatory and must be followed by one or more Data Packets.
The structure of a Data Packet can be one of:
Error Correction Data -> Payload Parsing Information -> Payload Data -> Padding Data
Error Correction Data -> Opaque Data -> Padding Data
Error correction data: if the high-order bit of the first byte of the Data Packet is set, the Data Packet starts with error correction data; if it is not set, the Data Packet starts with payload parsing information.
Payload data: this section can contain one or several payloads; it contains several payloads if the Multiple Payloads Present flag is set.
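The two flag tests described above can be sketched as follows. The error-correction check follows the high-order-bit rule stated here; the second helper assumes the Multiple Payloads Present flag is the low-order bit of the Length Type Flags byte in the payload parsing information, as in the ASF specification:

```python
def packet_has_error_correction(first_byte: int) -> bool:
    """If the high-order bit of the Data Packet's first byte is set,
    the packet starts with error correction data."""
    return bool(first_byte & 0x80)

def has_multiple_payloads(length_type_flags: int) -> bool:
    """Multiple Payloads Present is assumed to be the low-order bit
    of the Length Type Flags byte."""
    return bool(length_type_flags & 0x01)
```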
Index objects
An ASF file may contain various types of index objects, placed at the end of the file.
Simple Index Object
Contains a list of associated index/key-frame pairs that enables applications to seek through a file efficiently. The index associated with each key frame can be a presentation time, a video frame number, or a reference time stamp. There should be one Simple Index Object for each video stream.
Index Object
Contains stream-specific indexing information. The index consists of blocks; each block has a 64-bit offset in the block header that is added to the 32-bit offsets found in each index entry. When a file is larger than 2^32 bytes, multiple index blocks can be used.
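The block/entry offset arithmetic can be sketched as follows, assuming each block is given as its 64-bit base offset plus a list of 32-bit entry offsets:

```python
def absolute_offsets(blocks):
    """Flatten index blocks into absolute byte offsets.
    `blocks` is a list of (block_base_offset, [entry_offset, ...]) pairs,
    mirroring the 64-bit-base + 32-bit-entry scheme: each entry's final
    position is its block's base offset plus its own 32-bit offset."""
    result = []
    for base, entries in blocks:
        for off in entries:
            assert 0 <= off < 2 ** 32  # entry offsets are 32-bit
            result.append(base + off)
    return result
```

Splitting the index into blocks this way is what lets a 32-bit entry format address positions beyond 4 GiB: a new block with a larger base offset is simply started.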
Media Object Index Object
Can be used to index all the video frames or key frames in a video stream. Indexing works in the same manner as for the Index Object. The first frame of a given stream corresponds to entry 0 in the Media Object Index Object. Any file containing a Media Object Index Object shall also contain a Media Object Index Parameters Object in its header.
Timecode Index Object
Supplies timecode indexing information for the streams.
An image stream is a special type of stream that contains still images assigned to presentation times. An image stream can be displayed like a slide show.
For script streams, file transfer streams, Web streams, and arbitrary data streams, no data validation is performed by the reading object. Text streams (which contain text strings), file streams (which contain one or more data files), script streams, and Web streams (which contain data files equivalent to cached versions of Web pages) are supported natively.
The script commands are simple name and value string pairs. Script commands can be delivered in one of two ways: in a script stream or in the file header.
Script Streams: Script commands can be delivered in their own stream in an ASF file. Each sample in a script stream contains the two strings of the name/value pair. The advantage of using a script stream is that the commands will be delivered at the correct presentation time.
Script Commands in the File Header: Script commands can be included in the file header for retrieval at the time of playback. The playing application is responsible for executing the script commands at the proper time. The advantage of using script commands in the file header is that all of the script commands are available before beginning to receive samples.
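A playback-side scheduler for header-delivered commands might look like the sketch below. The class name, command names, and times are illustrative, not part of the format; the point is that the player, not the stream, is responsible for firing each name/value pair at its presentation time:

```python
class ScriptCommandQueue:
    """Schedules script commands taken from the file header.
    Commands are (time_seconds, name, value) triples; the player
    polls pop_due() as playback advances."""

    def __init__(self, commands):
        self._pending = sorted(commands)  # earliest command first

    def pop_due(self, position):
        """Return all commands whose time <= current playback position,
        removing them so each fires only once."""
        due = []
        while self._pending and self._pending[0][0] <= position:
            due.append(self._pending.pop(0))
        return due
```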
Data Unit Extensions
A data unit extension is a name/value pair that is attached to the sample in the data section of the file. The extended data can be accessed using methods of the buffer object when the sample is retrieved by the reader.
SMPTE Time Code Support
SMPTE time code data can be included with samples as data unit extensions. The data portion of the extension is a WMT_TIMECODE_EXTENSION_DATA structure containing the information from the original SMPTE time stamp.
Mutual Exclusion
There are four types of mutual exclusion.
Mutual Exclusion by Bit Rate is a special type of mutual exclusion, more commonly referred to as multiple bit rate (MBR) mutual exclusion. An MBR mutual exclusion contains a number of streams that all originate from the same input, but are encoded at different bit rates. When playing a file with MBR, the reader determines the best stream to use based on the available bandwidth.
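A minimal sketch of the reader's stream choice under MBR, assuming the goal is the highest bit rate that fits the available bandwidth (the exact selection policy is up to the reader):

```python
def select_mbr_stream(stream_bitrates, available_bps):
    """Pick the stream with the highest bit rate that fits the
    available bandwidth; if none fits, fall back to the lowest
    bit rate so playback can still proceed.
    stream_bitrates maps stream number -> bits per second."""
    fitting = {s: r for s, r in stream_bitrates.items() if r <= available_bps}
    if fitting:
        return max(fitting, key=fitting.get)
    return min(stream_bitrates, key=stream_bitrates.get)
```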
Mutual Exclusion by Language is designed for use with content (usually audio) recorded in several languages. A language-based mutual exclusion includes several streams that originate from unique inputs. Each input is the same content, but in a different language.
Mutual Exclusion by Presentation is provided to support video streams that contain the same content encoded with different aspect ratios. Typically, this is used when providing video in a letterbox version (aspect ratio 16:9) as well as formatted for television screens (aspect ratio 4:3).
Unknown Mutual Exclusion: all custom mutual exclusion types should be created using the unknown type.
Stream Prioritization
When creating an ASF file, one can specify a priority order for its constituent streams.
Bandwidth Sharing
One can specify streams in a file that, taken together, use less bandwidth than the sum of their stated bit rates.
Indexes
Indexes help when seeking to a point in the content. An index is an object in an ASF file that equates video samples with their presentation times. No index is required for audio streams, because audio data is more closely connected with presentation time than video data is. There are three different types of indexes: temporal indexes, frame-based indexes, and SMPTE time code indexes.
Temporal indexes are the most common type. They simply equate video samples with the corresponding presentation times.
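A temporal-index lookup can be sketched as follows, assuming fixed-interval entries that store key-frame packet numbers, in the style of the Simple Index Object; the sample data is illustrative:

```python
def seek_entry(presentation_time_ms, interval_ms, packet_numbers):
    """Resolve a presentation time to a packet number via a temporal
    index whose entries are spaced at a fixed time interval: the entry
    is found by integer division, and each entry holds the packet
    number of the nearest preceding key frame."""
    i = min(presentation_time_ms // interval_ms, len(packet_numbers) - 1)
    return packet_numbers[i]
```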
A frame-based index equates video samples with video frame numbers and presentation times. Frame numbers are particularly useful in applications that edit video.
An SMPTE time code index is the rarest type of index. It uses SMPTE time code as the basis of the index and can be used only on streams that have SMPTE time stamps included with their samples. For more information about SMPTE time code, see SMPTE Time Code Support.
Markers
Markers are named places on the timeline of an ASF file. Each marker has a name and a presentation time. They are useful for breaking large ASF files into logical pieces.
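Resolving a playback position to the most recent marker can be sketched as follows, with markers represented as hypothetical (time, name) pairs:

```python
def nearest_marker(markers, position):
    """Return the most recent marker at or before `position`.
    `markers` is an iterable of (presentation_time, name) pairs;
    returns None if playback has not yet reached the first marker."""
    best = None
    for t, name in sorted(markers):
        if t <= position:
            best = (t, name)
        else:
            break  # markers are sorted, so no later one can qualify
    return best
```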