Code masters

User Report
March 11th 2015 at 9:53AM

Transcoding vendors assess key investment criteria for operators including premises vs cloud; hardware vs software; and integration with MAM

Operators face a daunting task: there is a bewildering array of devices to which they are expected to deliver media in addition to their traditional broadcast outputs – ranging from smartphones to set-top boxes, to smart TVs, UHD TVs and beyond. This is a significant increase in operational complexity – but for several of these outlets, the monetisation model is not yet fully formed, so the additional complexity is not necessarily matched by a commensurate increase in revenue. In short, operators – the clients of transcoding kit vendors – have to do more work for not a lot more money.

“We see operators tugging in opposite directions when it comes to hardware or software encoding and transcoding, especially when it comes to multiscreen,” says Carl Furgusson, Ericsson’s head of business development, TV compression. “Considerations such as performance, reliability, costs and resolution requirements will heavily impact decisions that operators make in the coming years.”

TV Technology Europe asks key transcoding vendors for their views on key investment criteria including premises versus cloud; hardware versus software; and integration with MAM.

Invisibility, flexibility, scalability

It all boils down to invisibility for Bruce Devlin, chief media scientist at Dalet. Essentially, transcoding is becoming an enterprise-invisible business process, he says. It’s industrialised, and a modern transcoder not only has to deliver great output content, but also management interfaces and controls that allow the entire farm to be run in an almost invisible way. Devlin says operators need transcoding systems that can create all of the output formats their new business models require, but with a level of automation that allows them to do this without a huge increase in staff.

“Automation not only allows operators to do more with their existing staff, but also allows the system to be self-monitoring, self-adjusting and in some cases self-correcting,” explains Paul Turner, VP enterprise product management, Telestream.

“This fundamentally enables them to offer services which are of importance to their business, while significantly reducing the costs of doing so. The revenue models for these services are starting to solidify, so customers want to be sure that their transcoding systems are flexible enough to handle the ad insertion and recognition process that will become standard practice as these models mature.”

Harmonic emphasises flexibility in transcoding systems, particularly with respect to adaptive bit rate (ABR) packaging and delivery (e.g. HLS, HDS and DASH). “Many operators are combining ABR and broadcast encoding/transcoding systems (as these are the most stable with respect to standards and configuration) and leveraging a separate ABR packaging and origin stage to manage the volatility of standards and devices on the consumption side,” explains Tom Lattie, Harmonic’s VP market management and development, video products. “With ABR becoming a more common consumption method, particularly on big screens, greater emphasis is being placed on the video quality provided by encoding/transcoding solutions.”
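The split Lattie describes – a stable encoding ladder feeding a separate, fast-moving packaging stage – can be sketched in a few lines. The ladder, bitrates and playlist URIs below are invented for illustration, not any vendor’s actual configuration:

```python
# Sketch: a stable encoding ladder feeding a separate packaging stage.
# All renditions and URIs here are hypothetical examples.

# The ladder is kept stable; only the packaging function below would
# change when a new device or manifest format appears.
LADDER = [
    # (name, bandwidth in bits/s, resolution)
    ("low",  800_000,   "640x360"),
    ("mid",  2_400_000, "1280x720"),
    ("high", 5_000_000, "1920x1080"),
]

def hls_master_playlist(ladder):
    """Emit an HLS master playlist referencing one variant per rung."""
    lines = ["#EXTM3U"]
    for name, bandwidth, resolution in ladder:
        lines.append(
            f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}"
        )
        lines.append(f"{name}/index.m3u8")
    return "\n".join(lines) + "\n"

print(hls_master_playlist(LADDER))
```

The point of the separation is that when a new device demands a different manifest format, only this packaging function changes; the encodes behind it are untouched.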

ATEME is similarly focused on making transcoding plants as flexible and scalable as possible. “If a new device appears, operators expect their transcoding solution to easily evolve, with a software release, to support it,” says Remi Beaudouin, product marketing director. “If content production ramps up, operators want to quickly add processing resources with a minimum impact on operational layers.”

On-premise, cloud or hybrid?

One vital business decision is whether to adopt an on-premise, cloud-based, or a hybrid approach to the operation. Each has cost variables. “An operator with UHD master files transcoding for OTT will have a very different set of cost models to an operator who only transcodes SD content in-house for proxies,” outlines Devlin.

“On-premise transcoding gives you ultimate control and, potentially, the minimum operating costs if the system is stable and fault/rejection rates are low. A fully cloud-based system gives maximum versatility with an elastic cost model that can scale easily as business requirements change. On-premise solutions are hard to scale elastically because servers and storage have to be procured in order to scale up, and they won’t be ‘sold’ when scaling down. Internal transfers of content, however, are free. Off-premise storage can also lead to complex security and key management challenges when high-value material needs to be converted.”

Agreeing that on-premise transcoding requires upfront capital investment, Turner says it can be cheaper in the long run, and can be significantly faster if the source and destination are within your facility. “The downside is that operators have to size their transcode farm to match their peak load.”

Processing in the cloud requires that the media itself be located in the cloud. This means that media has to be transferred up to the cloud before processing can take place, which has cost implications in both time and money (if operators already store their programmes on cloud storage – as many do – this cost is somewhat mitigated). “The same is true for the delivery point,” Turner continues. “If the next step in the overall workflow occurs at some other premises, then the time of transfer of transcoded material must also be considered. While the cost per transcode hour can seem very attractive, operators should also consider just how many hours per year they will use.”

Turner likens this to somebody who needs a truck to move some furniture: if you’re only moving furniture once, then you’ll hire a truck to do so. But if you move furniture all day every day, the rental costs will far outstrip the cost of buying a truck in the first place. “The major positive points for cloud operations are that you don’t have to make significant capital investment in the transcoding infrastructure, and that you can operate a pay-as-you-go method of funding,” says Turner.
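Turner’s truck analogy is ultimately a break-even calculation. A rough sketch, with entirely hypothetical prices and an assumed four-year hardware amortisation:

```python
# Back-of-envelope version of the rent-vs-buy trade-off: compare a
# pay-as-you-go cloud rate against amortised on-premise cost.
# All figures are illustrative assumptions, not vendor pricing.

CLOUD_RATE = 4.00          # $ per transcode-hour, hypothetical
ONPREM_CAPEX = 60_000.0    # farm purchase price, hypothetical
ONPREM_OPEX = 10_000.0     # yearly power/cooling/support, hypothetical
LIFETIME_YEARS = 4         # assumed amortisation period

def yearly_cost_cloud(hours_per_year):
    return CLOUD_RATE * hours_per_year

def yearly_cost_onprem():
    return ONPREM_CAPEX / LIFETIME_YEARS + ONPREM_OPEX

for hours in (1_000, 5_000, 10_000):
    cloud, onprem = yearly_cost_cloud(hours), yearly_cost_onprem()
    cheaper = "cloud" if cloud < onprem else "on-premise"
    print(f"{hours:>6} h/yr: cloud ${cloud:,.0f} vs on-prem ${onprem:,.0f} -> {cheaper}")
```

On these invented figures the crossover sits at 6,250 transcode-hours a year; well below it the truck is rented, well above it the truck is bought.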

Vendors tend toward a hybrid solution where a certain volume of transcode capacity is ‘owned’ and the elastic load capacity is scaled into a private or public cloud. “Good resource management with business rules implemented in a MAM like Dalet Galaxy should allow the benefits of cloud to be achieved as well as the benefits of on-premise,” says Devlin.

“With a hybrid model, operators can size their on-premise transcode farm to match their typical run-rate load, and handle peak work by offloading some of that additional processing to a cloud extension as and when needed,” is Turner’s take.
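The hybrid sizing rule Turner outlines amounts to a simple overflow policy. In this sketch the slot count and job names are invented, and a real dispatcher would also weigh transfer costs and deadlines:

```python
# Sketch of a hybrid dispatch rule: own enough capacity for the
# typical run-rate, spill peak work to a cloud pool. The slot count
# and asset names are invented for illustration.

ONPREM_SLOTS = 8  # concurrent transcodes the local farm can run

def dispatch(jobs, onprem_slots=ONPREM_SLOTS):
    """Fill local slots first; overflow the remainder to the cloud."""
    local = jobs[:onprem_slots]
    cloud = jobs[onprem_slots:]
    return local, cloud

jobs = [f"asset-{i:03d}" for i in range(12)]  # a peak-hour burst
local, cloud = dispatch(jobs)
print(f"{len(local)} jobs on-premise, {len(cloud)} burst to cloud")
```

In quiet periods the cloud list is simply empty and nothing is paid for elastic capacity; only the burst above the owned run-rate is metered.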

Hardware-based versus software-based encoding

The prevailing technology narrative across the industry is a shift from dedicated hardware to software, and nowhere is this more apparent than in the encoding/transcoding field. Yet the answer is not as simple as you’d think. Opinions, of course, vary depending on whether the vendor’s product is software- or hardware-based. It can’t be doubted, though, that Moore’s Law and the development of GPU assist have meant that standard computer platforms can now match, and in many cases exceed, the speed of their hardware counterparts.

The main difference lies in flexibility, suggests Telestream’s Turner. “Software solutions are generically easier to update than their hardware equivalents, and through the update process can have new features (which weren’t available at the time of purchase) added to them at any time. Hardware transcoders are generally more difficult to update with new codecs and features.”

Dalet’s Devlin argues that many organisations are now looking at the energy consumed by data centres and are weighing up the benefits of some hardware acceleration from an energy perspective. “In transcoding, the format stability is such that hardware is often only appropriate for long-term stable functions such as low-level codecs and some image processing,” he outlines. “The rate and ease with which software can be written and deployed means that it is used for nearly all encode and decode operations. The exceptions to this rule tend to be in live operations and for the last encode prior to emission. Even those elements today are becoming software functions due to the versatility of today’s software.”

Here is ATEME’s take: “A hardware-based system is tailored for its target, no more, no less, which leads to the best performance and usage,” says Beaudouin. “The downside is that it creates as many processing silos as services: one silo for over-the-top, one silo for the linear channel, one silo for VoD. Software solutions overcome this issue by providing flexibility and ease of operation: the same appliance can be used for several purposes, as software, along with virtualisation, allows easy portability to various locations.”

Ericsson says its bespoke hardware, which is designed using the company’s own chipset, has benefits over software when it comes to performance. “If the operator needs to maximise bandwidth efficiency then hardware is the primary choice, and will provide the best performance and network efficiency,” says Furgusson. “For traditional broadcast, hardware will often still be the predominant choice due to the high value of network bandwidth.”

For ABR and file-based environments, however, the story is slightly different. Operators need to quickly adapt their ABR transmissions to accommodate the needs of new devices, which can launch at any time without advance warning. “This inability to know what new device or codec is round the corner has led to the emergence of software as a favourable, flexible alternative which can quicken adoption for new devices,” says Furgusson.

“Hybrid models combining hardware and software can also help to address this challenge, allowing operators to use hardware for the stable core service (HEVC, MPEG4, and so on) but use software when flexibility is required, such as dealing with changes to packaging formats.”

Integration with asset management

Multi-platform consumption is causing an explosion in the one-to-many ratio of file-based workflows, resulting in a constantly increasing number of distribution derivatives from each master asset. Tight interaction with the asset management system is critical for a successful file-based transcoding workflow.

“Today, this is mostly used for linking the asset to all of the device-based derivatives,” says Lattie. He points to efforts around BXF (Broadcast eXchange Format) and IMF (Interoperable Master Format) to make this interaction more intelligent and powerful. “The next opportunity is to automatically scale the creation of derivatives, not just based on consumption device, but also on content-specific considerations such as regional or ratings deltas and ad insertion.”

Integration is generally done via API, although as Turner points out, less sophisticated systems may only offer ‘hot folder’ integration, “which as you can imagine offers much less interaction, and places all of the management burden on the asset management system itself.”

A lot of content is assembled from new and old footage that may have been shot at different rates. The results do not look good when broadcast on a low bandwidth transmission channel and displayed on a big, bright, flat screen. That’s where patented frame rate conversion technology from Dalet comes in. “It can fix up many of these problems without resorting to a re-edit of the original content,” says Devlin.
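The ‘hot folder’ pattern Turner mentions is easy to picture: the asset management system drops files into a watched directory and the transcoder polls it, with no richer conversation between the two. A minimal sketch, with placeholder paths and a dummy transcode step:

```python
# Minimal sketch of 'hot folder' integration: the MAM writes files
# into a watched directory; the transcoder polls it. The folder path,
# extension and transcode step are placeholders.
import time
from pathlib import Path

def poll_hot_folder(folder, seen, handle):
    """One polling pass: hand any file not yet seen to the callback."""
    for path in Path(folder).glob("*.mxf"):
        if path.name not in seen:
            seen.add(path.name)
            handle(path)

def fake_transcode(path):
    print(f"submitting {path.name} to transcode farm")

# In a real deployment this loop would run continuously:
# seen = set()
# while True:
#     poll_hot_folder("/mnt/watch", seen, fake_transcode)
#     time.sleep(5)
```

The management burden Turner describes is visible even in this toy version: nothing flows back to the MAM – no job status, errors or priorities – beyond what it can infer by watching the output side itself.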

QC is another vital element in the media chain, to the extent that the UK’s DPP mandates it for broadcaster delivery. “Modern media files are so complex that no single playback device can check everything, so there is now a broad range of products that are able to perform QC on files and streams,” says Devlin. “These are able to find common design and configuration errors in media files and streams.”

Focus on NAB

Telestream is focusing on the creation of VoD assets with support for DAI (Dynamic Ad Insertion) which, claims Turner, “is a direct monetisation play for our customers, allowing them to create VoD assets in such a way that they can be customised via DAI and therefore monetised.” This hasn’t been possible to date, as all of the necessary pieces weren’t in place. Telestream is also showing improved ingest support for camera file formats to increase its reach into production workflows.

Harmonic’s software-based offerings include the VOS virtualised video delivery platform, the power and flexibility of which “allows operators to transition services from SD MPEG-2 to HEVC UHD without having to completely replace their infrastructure,” says Lattie. Just prior to press, Harmonic announced the launch of its new Electra X encoder.

Ericsson is highlighting Virtualized Encoding, described as the industry’s first software solution for intelligent utilisation of multiple encoding resources (regardless of technology), and speaking more about this product in the context of TV Anywhere. Another key topic for the company at NAB is likely to be bandwidth efficiency and the effects of mezzanine links on compression performance. “We will show the considerable improvements this can have on broadcast distribution and delivery to the home, and set the scene for wider discussions on compression performance, and how it underlines everything that Ericsson does,” says Furgusson.