Telestream’s Christopher Deas talks about evolving QC processes and workflows, as broadcasters and service providers respond to changes in video production and consumption.
Today, when consumers can access and enjoy media on a vast array of platforms, file-based quality control (QC) is critical to maintaining the best possible image and audio output. From a viewer's standpoint, poor quality is just another reason to watch something else. From a business standpoint, poor quality can result in content being rejected for failing to meet published standards.
Christopher Deas, QC Product Manager at Telestream, talks here about the on-going evolution of QC processes and tools, and how to use them to design workflows that operate at every point of the media creation and distribution pipeline, whether on the ground, in the cloud, or both.
He said, “QC workflows have certainly come a long way in terms of complexity since the days when expert reviewers would sit all day watching content on a colour-calibrated broadcast reference monitor, obsessing over quality. Aiming to ensure that a rich, crisp piece of media is generated is one thing, but now it must also comply with the many technical and legal requirements as specified by diverse broadcasters and service providers.”
Distribution formats also continue to evolve, along with the quality standards to support them, and varying legal requirements are applied regionally around the world. Is the file formatted correctly? Will it play out flawlessly? Are the captions, audio levels and SCTE 35 ad markers all present, correct and within limits?
QC on Input, QC on Output
To begin the process, post-production companies need to check their content prior to sending it to broadcasters, providers, VOD platforms and other destinations. “Not only will failing to comply with each client’s technical and legal requirements result in rejection, but fines and the cost of internally reprocessing the media add further expense. Repeated failures or rejections may jeopardize future contracts,” said Christopher.
“Service providers and broadcasters will also check incoming content from all sources to ensure they are starting with high quality files. They are checking for the same technical and legal compliance to ensure files will play on their systems. For incoming content, they may also check for the presence of advertising markers, which are key to successfully monetising their OTT streams.”
While some aspects of quality control are subjective, employing an objective, automated QC process that produces a no-reference MOS (Mean Opinion Score) rating is a useful way of measuring quality in line with ITU scoring systems. This serves as an overall perceptual quality check and allows identification of problems as the human visual system would perceive them.
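For context, ITU subjective testing (ITU-R BT.500 and ITU-T P.800) defines MOS on a five-point scale from Bad to Excellent. A minimal sketch of mapping a fractional, predicted no-reference score onto those categories; the function name and rounding behaviour are illustrative, not any particular product's API:

```python
# Five-point quality scale used in ITU subjective testing
# (ITU-R BT.500 / ITU-T P.800): 5 Excellent, 4 Good, 3 Fair, 2 Poor, 1 Bad.
MOS_LABELS = {5: "Excellent", 4: "Good", 3: "Fair", 2: "Poor", 1: "Bad"}

def mos_label(score: float) -> str:
    """Map a (possibly fractional) predicted MOS onto the nearest ITU category."""
    clamped = min(5.0, max(1.0, score))
    return MOS_LABELS[round(clamped)]

if __name__ == "__main__":
    for s in (4.6, 3.2, 1.4):
        print(f"MOS {s} -> {mos_label(s)}")
```

An automated QC pass can attach such a label to each title, so that only content falling below an agreed threshold is escalated for human review.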
Tools, Formats and Automation
In many cases, expert operators must still manually perform a final check for image and audio quality, particularly on high-value content. For that task they will require a frame-accurate media player that can integrate with a colour-calibrated broadcast monitor. QC software should be able to integrate with a player as well, so that faults are automatically identified on the player’s timeline.
Christopher said, “A high quality commercial player will play any media format and analyse metadata, timecodes, audio levels and captions. Since humans are not well suited to checking compliance with technical specs or coping with high volumes of content, most companies will need more than one QC tool.
“A robust, predictable QC workflow requires an automatic file-based QC application or service that can check large files much faster than manual operators and work 24 hours a day, checking the detailed syntax of a file and running various objective checks such as video and audio levels, dead pixels, dropouts and media-offline issues. All these checks must be able to run on a wide range of formats such as ProRes, J2K, DNxHD, DPX, OpenEXR and IMF.”
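The kind of rule-driven checking described above can be sketched as a small check-runner. Everything below is hypothetical: it assumes an upstream analyser has already extracted measurements such as audio peak level and luma range into a metadata dict, and the field names and limits are illustrative only, not a real product's API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QcResult:
    check: str
    passed: bool
    detail: str = ""

# Each check takes a metadata dict (as an upstream analyser might report it)
# and returns a pass/fail result. Field names and limits are illustrative.
def check_audio_peak(meta: dict) -> QcResult:
    peak = meta.get("audio_peak_dbfs", 0.0)
    ok = peak <= -1.0  # e.g. require at least 1 dB of headroom
    return QcResult("audio_peak", ok, f"peak {peak} dBFS")

def check_video_levels(meta: dict) -> QcResult:
    lo, hi = meta.get("luma_min", 0), meta.get("luma_max", 255)
    ok = 16 <= lo and hi <= 235  # broadcast-legal 8-bit luma range
    return QcResult("video_levels", ok, f"luma {lo}-{hi}")

CHECKS: list[Callable[[dict], QcResult]] = [check_audio_peak, check_video_levels]

def run_qc(meta: dict) -> list[QcResult]:
    """Apply every registered check to one file's extracted metadata."""
    return [check(meta) for check in CHECKS]

if __name__ == "__main__":
    meta = {"audio_peak_dbfs": -0.2, "luma_min": 16, "luma_max": 235}
    for r in run_qc(meta):
        print(r.check, "PASS" if r.passed else "FAIL", r.detail)
```

Structuring checks as a registry like this is what lets a service run the same battery of tests across many formats and scale out across files in parallel.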
IMF is a good example of how modern delivery formats have increased the complexity and challenge of file-based QC. It is now necessary to confirm that a package contains all of the content specified in the manifest, while ensuring that all the essence components – video, audio, metadata – are compliant and of acceptable quality. “Auto QC is the only viable way to efficiently check complex IMF packages,” he said.
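As a simplified illustration of manifest completeness checking: real IMF packing lists (SMPTE ST 2067 family) are namespaced XML carrying per-asset hashes and sizes, but the principle of verifying that every listed asset is actually present in the package can be sketched with a stripped-down stand-in:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

# Simplified stand-in for an IMF packing list. A real PKL (SMPTE ST 2067
# family) is namespaced XML with UUIDs, hashes and sizes per asset.
PKL_XML = """
<PackingList>
  <Asset><Path>video.mxf</Path></Asset>
  <Asset><Path>audio.mxf</Path></Asset>
  <Asset><Path>subtitles.xml</Path></Asset>
</PackingList>
"""

def missing_assets(pkl_xml: str, package_dir: Path) -> list[str]:
    """Return manifest entries that are absent from the package directory."""
    root = ET.fromstring(pkl_xml)
    listed = [a.findtext("Path") for a in root.iter("Asset")]
    return [p for p in listed if not (package_dir / p).exists()]

if __name__ == "__main__":
    pkg = Path("imf_package")  # hypothetical delivery folder
    print("missing:", missing_assets(PKL_XML, pkg))
```

A full auto-QC pass would then go further, checking each essence file's hash against the manifest and running the format-level checks described earlier on every component.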
Manual QC
Manual QC can be approached in two ways – locally and remotely. In a local scenario, the local player has direct access to the source file. It can play on a computer monitor or through an output card to a professional reference monitor for expert viewing assessment. “While free open source players are available, these may have limitations such as dropping frames and may utilise non-approved codec implementations. In other words, they tend to focus on playing the media at all costs, and that may mean playing it incorrectly,” Christopher noted.
“A remote player, even for very large, professional-grade files, becomes necessary when a file is not available locally, particularly if you need to review content immediately. Downloading large files or transcoding the content into proxy versions for review may take hours and incur considerable cost. The media player must also be extensible, through plugins or APIs, so that it can be integrated into existing workflows.”
Choosing the Cloud
Organisations should also have a choice of running their QC processing in a public cloud, private cloud or on their own servers. Ideally, for best performance and lowest operational cost, a QC service should run wherever the media is located. As mentioned, downloading a multi-gigabyte master file for QC could take hours and attract high egress charges.
“When considering running a QC service in the cloud, remember that many first-generation QC applications were developed to run on the Windows OS on local servers. Although it may be possible to port some of these over to cloud computing, the result may be disappointing,” Christopher said.
“Modern cloud-native QC services are based on a container architecture, running on the Linux OS and supporting a scalable workflow. This approach enables faster launch times and reduces costs through a ‘pay-as-you-go’ business model. A cloud-based QC service is also practical as a way to handle sudden bursts of demand, and for smaller operations with unpredictable capacity requirements.”
Humans and Machines
Any conservative auto QC system can be expected to produce some false positives. In such cases, it is best to defer to an expert human viewer who can decide whether a flagged event is a genuine fault requiring correction or an intended editorial decision. For example, a long period of black or silence could be a legitimate fault, or it may be a creative storytelling choice.
A well-designed QC system should be able to provide a list of timecodes for the operator to review, allowing them to view only the sections in question, rather than the whole file. Only by letting humans and machines each do what they are best at doing can we achieve maximum quality and efficiency.
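A sketch of how per-frame fault flags from an automated pass might be collapsed into a short list of timecode ranges for an operator to review. The 25 fps rate, non-drop-frame timecode, and minimum run length are assumptions for the example; the flag input would come from an upstream black or silence detector:

```python
def frames_to_tc(frame: int, fps: int = 25) -> str:
    """Convert a frame count to HH:MM:SS:FF (non-drop-frame)."""
    ff = frame % fps
    s = frame // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"

def flagged_ranges(flags: list[bool], fps: int = 25, min_len: int = 2):
    """Collapse per-frame fault flags into (in, out) timecode pairs for review."""
    ranges, start = [], None
    for i, flagged in enumerate(flags + [False]):  # sentinel closes the last run
        if flagged and start is None:
            start = i
        elif not flagged and start is not None:
            if i - start >= min_len:  # ignore runs too short to matter
                ranges.append((frames_to_tc(start, fps), frames_to_tc(i - 1, fps)))
            start = None
    return ranges

if __name__ == "__main__":
    # e.g. frames 50-99 detected as black by an upstream analyser
    flags = [50 <= f < 100 for f in range(200)]
    print(flagged_ranges(flags))  # one range: 00:00:02:00 to 00:00:03:24
```

Handing the operator these in/out points, rather than the whole file, is exactly the human-machine division of labour the article describes.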
Because so many different types of QC applications and services are available, depending on where an organisation sits in the media creation and distribution pipeline, it’s important to know which QC checks may be required on source media as well as on processed media.
Automated QC software or services have become virtually a necessity to cope with volume and complexity, but offering different control methods as well, such as APIs and SDKs for developers alongside a flexible GUI for quick manual operations, is an advantage for many customers.
Finally, a comprehensive file-based QC system should be accessible through different business models to suit the specific requirements of an organisation, which may include standalone applications, cloud-based SaaS, and APIs for developers creating their own purpose-built systems and workflows. www.telestream.net