Glossary of terms
Adaptive bitrate streaming is a technique used in streaming multimedia over computer networks, in which the player switches among multiple encodings of the same content to match current network conditions.
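At its core, adaptive bitrate selection picks the highest rendition whose bitrate fits within the measured throughput. A minimal sketch in Python (the ladder values and safety margin are illustrative, not taken from any standard):

```python
def select_rendition(ladder_kbps, measured_kbps, safety=0.8):
    """Pick the highest bitrate that fits within a safety margin
    of the measured network throughput (both in kbit/s)."""
    affordable = [b for b in ladder_kbps if b <= measured_kbps * safety]
    # Fall back to the lowest rendition if nothing fits.
    return max(affordable) if affordable else min(ladder_kbps)

# Example bitrate ladder (kbps), similar in shape to common HLS/DASH encodings.
ladder = [400, 800, 1600, 3200, 6000]
print(select_rendition(ladder, measured_kbps=2500))  # 1600: fits within 2500 * 0.8
```

Real players add hysteresis and buffer-based heuristics on top of this throughput rule to avoid oscillating between renditions.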
Asynchronous serial interface: A streaming data format which often carries an MPEG transport stream.
Advanced Television Systems Committee: The standards body that develops the technical standards for digital television broadcasting in the U.S.; its DTV standard was adopted by the FCC.
ATSC 3.0: A newer standard for terrestrial broadcasting that improves the viewer experience with higher audio and video quality, better compression efficiency, and robust transmission to fixed and mobile devices. By combining terrestrial TV with IP delivery, it provides greater accessibility, personalization, and interactivity.
ATSC A/78: A Recommended Practice that provides guidance to broadcasters and equipment manufacturers on a common methodology for determining transport stream conformance with the elements and parameters of ATSC Standards A/53, “ATSC Digital Television Standard,” and A/65, “Program and System Information Protocol for Terrestrial Broadcast and Cable.”
ATSC A/85: The published Recommended Practice “Techniques for Establishing and Maintaining Audio Loudness for Digital Television.”
Bitrate: The number of bits per second used to represent audio or video after source coding (data compression).
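Bitrate relates directly to stream size: size is roughly bitrate multiplied by duration. A quick back-of-the-envelope calculation (the helper name and figures are illustrative):

```python
def stream_size_mb(bitrate_kbps, duration_s):
    """Approximate payload size in megabytes:
    bitrate (kbit/s) x duration (s) / 8 bits per byte / 1000 kB per MB."""
    return bitrate_kbps * duration_s / 8 / 1000

# A 5 Mbps video stream running for one hour:
print(stream_size_mb(5000, 3600))  # 2250.0 MB
```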
Black Screen (sometimes also called “Black”, “No Video”, “No Decodable Video” or “No Signal”) is the most serious issue affecting the end-viewer experience, mainly caused by total service disruption.
Buffering: Downloading or preloading data into reserved memory, or accumulating a certain amount of data, before starting to play audio or video.
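The mechanism can be sketched as a buffer that accepts downloaded chunks and only signals readiness once a start threshold is reached. A minimal illustration (class name and threshold are hypothetical, not from any player API):

```python
from collections import deque

class PlayoutBuffer:
    """Minimal sketch: hold downloaded chunks and allow playback
    to start only once a threshold of data has been buffered."""
    def __init__(self, start_threshold):
        self.chunks = deque()
        self.buffered = 0
        self.start_threshold = start_threshold

    def receive(self, chunk_bytes):
        self.chunks.append(chunk_bytes)
        self.buffered += chunk_bytes

    def ready_to_play(self):
        return self.buffered >= self.start_threshold

buf = PlayoutBuffer(start_threshold=3_000_000)  # e.g. ~3 MB before playback
buf.receive(1_000_000)
print(buf.ready_to_play())  # False
buf.receive(2_000_000)
print(buf.ready_to_play())  # True
```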
Under the Commercial Advertisement Loudness Mitigation (CALM) Act, the Federal Communications Commission (FCC) requires that commercials have the same average volume as the programs they accompany.
Closed captioning: The Telecommunications Act of 1996 mandated that broadcasters provide access to material for people with disabilities. Closed captioning displays, in text form, the audio portion of a broadcast for deaf or hard-of-hearing viewers, along with descriptions of background noise and sound effects. Closed captions are hidden as encoded data transmitted within the television signal.
The Common Media Application Format (CMAF) is a standard for segmented media delivery, formalized as ISO/IEC 23000-19. It defines the container that holds the audio and video content. CMAF is an enhanced codification and standardization of existing fragmented MP4 best practices.
Codec: Short for coder/decoder, an application or device that compresses and decompresses a media file.
Compression: The process of encoding a video file so that it takes up less space than the original, making it easier to transmit over a network or the internet.
Content replacement errors, in which incorrect regional content or advertising is inserted (sometimes also called “mis-splicing”), may result in significant loss of revenue and reputation and may incur penalties.
Digital Video Broadcasting is an industry-led consortium of media and technology companies that work together to design open technical specifications for digital media delivery.
Emergency Alert System is a national public warning system used by state and local authorities to deliver important emergency information, such as weather and AMBER alerts, to affected communities. EAS providers include broadcasters, cable operators, satellite radio and television providers, and wireline video providers.
European Telecommunications Standards Institute (ETSI) oversees the standards used by DVB specifications.
Frame rate: The number of individual frames shown to a viewer in a given period of time. Measured in frames per second (fps), it is also known as frame frequency. Traditional film is described in frames per second, while digital displays often use “refresh rate.”
Freeze (sometimes also called “Frozen Screen”), which usually precedes total service degradation, is among the top three issues reported by users in digital signal delivery.
High Efficiency Video Coding (HEVC) is a video compression standard that offers roughly double the data compression ratio of the AVC technique at the same level of video quality, or substantially improved quality at the same bit rate. HEVC supports resolutions up to 8192×4320, which includes 8K ultra-high definition.
HTTP Live Streaming, also known as HLS, is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. and released in 2009. It is among the most widely used streaming formats in media players, web browsers, mobile devices, and streaming media servers.
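An HLS stream is described by `.m3u8` playlists; a multivariant (master) playlist lists the available renditions, which is how HLS enables adaptive bitrate switching. A minimal illustrative example (the URIs, bandwidths, and resolutions are hypothetical):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3200000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/index.m3u8
```

Each referenced media playlist in turn lists the short segments the player downloads sequentially.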
Interlaced video: A legacy of the CRT era, interlaced video suffers from combing and twittering artifacts on the edges of moving objects and on striped patterns.
Latency: The amount of time between the instant a frame is captured and the instant it is displayed. Low latency is especially desirable for real-time content such as sports, gaming, or live events.
Macroblock: A unit of video processing or compression, typically a block of 16×16 pixels.
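The number of 16×16 macroblocks covering a frame follows from dividing each dimension by the block size and rounding up, since partial blocks at the edges are still coded as full blocks. A quick check (the helper name is illustrative):

```python
import math

def macroblock_count(width, height, block=16):
    """Number of block x block macroblocks covering a frame,
    rounding partial blocks up at the edges."""
    return math.ceil(width / block) * math.ceil(height / block)

# A 1080p frame: 1920/16 = 120 columns, 1080/16 = 67.5 -> 68 rows.
print(macroblock_count(1920, 1080))  # 8160
```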
MPEG: A family of standard file formats for audio and video developed by the Moving Picture Experts Group.
Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, is an adaptive bitrate streaming technique that enables high quality streaming of media content over the Internet delivered from conventional HTTP web servers.
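Like HLS, DASH describes the available renditions in a manifest, here an XML Media Presentation Description (MPD). A trimmed, illustrative sketch (the URLs, durations, and bandwidth values are hypothetical):

```xml
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     mediaPresentationDuration="PT60S" minBufferTime="PT2S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="360p" bandwidth="800000" width="640" height="360">
        <BaseURL>video_360p.mp4</BaseURL>
      </Representation>
      <Representation id="1080p" bandwidth="6000000" width="1920" height="1080">
        <BaseURL>video_1080p.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```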
National Television System Committee: An American committee formed to set the line standard and later color standard for broadcasting. Gave its name to the method of color reproduction used in the Americas (except Brazil) and in Japan.
Over-the-air (OTA) refers to radio and television transmissions sent and received over the air. OTA identifies television signals broadcast via RF signal, as opposed to those sent by cable service providers.
Over-the-top (OTT): Describes the delivery of content, services, or applications over the internet, on top of existing network infrastructure.
Pixelation, the most common digital compression artifact, is also known as macroblocking or blocking. Together with black screen and freeze, it accounts for up to 90% of overall impairments affecting subscribers' experience.
Playout: The transmission of radio or TV channels from the broadcaster into broadcast networks that deliver the content to the audience.
SCTE-35 (ANSI/SCTE 35 2013) is a standard from the Society of Cable Telecommunications Engineers, jointly published with ANSI, for the insertion of cue messages to signal ad or program events in MPEG transport streams.
SCTE-104 messages can exist either in the VANC space of baseband (SDI) video or be sent between systems over TCP/IP, and are typically used as a precursor to the eventual creation of SCTE-35 messages.
A service-level agreement (SLA) defines the level of service between a vendor and a client, with agreed-upon metrics for providing service and resolving issues, and penalties if those metrics are not met.
Streaming: The process of continuously delivering media over the internet to a remote endpoint, allowing it to be viewed online without being downloaded first.
Subtitles: Text derived from a transcript of the dialogue or commentary in films, television programs, video games, and other media, usually displayed at the bottom of the screen, or at the top when other text already occupies the bottom.
ETSI TR 101 290: Measurement and analysis guidelines for MPEG transport streams in DVB systems.
Transport stream: A standard digital container format used in broadcast systems for the transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data.