Get started with cloud storage: A buyer's guide for media businesses

Servers | Buyers Guide
April 26th 2016 at 11:21AM | By Richard Welsh, Sundog Media Toolkit

You're going to migrate to the cloud sooner or later. Richard Welsh, former SMPTE Governor and founder of Sundog Media Toolkit, offers tips on getting started

At Sundog we offer cloud Software-as-a-Service for various post production processes. Very early on in building the Sundog platform we made the decision not to offer storage services. There are many options for storing media in the cloud, and we didn’t want to force any particular solution or provider on our customers. This has meant that over the last few years we have built ways to work with a number of methods of cloud storage and have seen a wide variety of professional solutions. In this article I will outline the main approaches we have seen for cloud storage solutions and some of the practical implications of their use.

Beyond Public-Private-Hybrid

Public, private and hybrid cloud are now well-understood concepts in the media industry. What is interesting is the recent rise of community cloud, which is filling a gap in the marketplace for those who want content-focussed cloud storage but don’t want the complexity and cost of operating a private cloud. A number of providers of media-centric storage have sprung up in the last 24 months, offering a laser focus on content applications and a deep understanding of production and post workflows. These providers offer a cost-effective, scalable solution for those not wishing to move wholesale into public cloud. Many also run hybrid models where you can use hosted storage for regular work and still leverage public cloud at peak times.

The other area seeing strong growth is the ‘glue’ joining different cloud solutions together. It’s not uncommon to be using multiple services in Google, Microsoft and Amazon public clouds on a project, but to store the production data in managed service clouds such as Sohonet or Base Media. In this multi-cloud environment there are many ways to join the dots. In the case of software services, some are based in a particular cloud and require a connection to the storage systems in that cloud. Others can work with data coming from multiple sources, and we are seeing increasing sophistication in the ability of SaaS applications to speak to multiple clouds via multiple mechanisms. Many software systems and managed storage service providers already offer integration with familiar media-centric file transport systems such as Aspera and Signiant.


£ $ €

It’s important you understand the basic financial mechanics of the solution you’re choosing. Is data movement (ingress and egress) a chargeable part of the service? Different providers charge for all, some or none of your data movement. Another question to ask is: how does better connectivity impact the total cost of the service? Public cloud typically offers some sort of dedicated connection option, but you may find this attracts additional charges elsewhere, from your telecoms provision or from hosted cloud services connecting to the public cloud.
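
A back-of-envelope check makes the egress question concrete. The sketch below uses entirely hypothetical per-GB rates (no real provider's pricing); substitute the published rate card of the services you are comparing.

```python
# Rough egress cost comparison for moving a finished project out of a cloud.
# The per-GB rates below are HYPOTHETICAL placeholders, not any provider's
# real pricing -- check your provider's rate card before deciding.

def egress_cost(total_gb: float, per_gb_rate: float, free_tier_gb: float = 0.0) -> float:
    """Cost of moving total_gb out of the cloud, after any free allowance."""
    billable = max(0.0, total_gb - free_tier_gb)
    return billable * per_gb_rate

# A feature's worth of material, say 40 TB of camera files and renders:
project_gb = 40_000

cost_a = egress_cost(project_gb, per_gb_rate=0.05)  # provider charging per GB out
cost_b = egress_cost(project_gb, per_gb_rate=0.0)   # provider with egress included

print(f"Provider A egress: ${cost_a:,.2f}")  # $2,000.00 at these example rates
print(f"Provider B egress: ${cost_b:,.2f}")  # $0.00
```

Even a small per-GB egress rate becomes significant at media volumes, which is why providers that bundle data movement can win on total cost despite a higher headline storage price.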

One of the most important elements to understand is tiered storage – block versus object versus archive – and the combined volume and access costs of each. For instance, archive storage may appear more cost-effective than block or object for the total volume, but can quickly become costly if you require frequent access.
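
The trade-off is easy to model. The figures below are illustrative assumptions only, but they show how a cheap archive tier can end up costing more than warm object storage once retrieval fees are counted.

```python
# Monthly cost of a 100 TB library: "warm" object storage (higher per-GB,
# free access) versus archive storage (lower per-GB, retrieval fee).
# All prices are ILLUSTRATIVE assumptions, not real rate cards.

def monthly_cost(volume_gb: float, storage_per_gb: float,
                 accessed_gb: float, retrieval_per_gb: float) -> float:
    return volume_gb * storage_per_gb + accessed_gb * retrieval_per_gb

volume = 100_000  # 100 TB

for accessed in (0, 20_000, 80_000):  # GB retrieved per month
    warm = monthly_cost(volume, 0.023, accessed, 0.0)    # object tier
    cold = monthly_cost(volume, 0.004, accessed, 0.03)   # archive tier
    print(f"access {accessed:>6} GB/month: object ${warm:,.0f}, archive ${cold:,.0f}")
```

At zero access the archive tier is far cheaper; at heavy access the retrieval fees overtake the storage saving, so the right tier depends on how often the content moves.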

So when does it make sense to use block over object storage? Typically, block storage is required when running with a single or small number of processing endpoints. For instance, if you’re running a real-time application such as colour grading, where you require high bandwidth to a single server, then block storage makes sense. If (as in the case of the Sundog system) you’re running highly scalable processing, where there are potentially many servers running in parallel, then object storage is faster because of its high aggregate speed to multiple endpoints. However, if using object storage, it’s important to have the right tools for asset management in that environment. Block storage presents an array of drives striped with a file system, so it looks and behaves like a local SAN/NAS volume. This is useful if you want to spin up servers in the cloud to run the same software that you use on local machines. This setup is often restricted to block storage, as the software will only recognise files in an attached volume, and object paths (see below) will be meaningless to it.
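
The scaled-out pattern can be sketched as follows. The `fetch_object` function is a stand-in for a real object-store client call (an HTTP GET to the store); the point is that each worker pulls its own object independently, so aggregate throughput scales with worker count rather than saturating one mount point.

```python
# Sketch of why object storage suits parallel processing: many workers
# fetch independent objects concurrently. fetch_object is a STUB standing
# in for a real object-store GET; keys and paths are illustrative.

from concurrent.futures import ThreadPoolExecutor

def fetch_object(key: str) -> bytes:
    # Placeholder: a real implementation would issue an HTTP GET to the
    # object store and return the response body.
    return f"frame-data:{key}".encode()

def process(payload: bytes) -> int:
    # Stand-in for real per-frame processing (grade, transcode, etc.)
    return len(payload)

keys = [f"show/reel1/frame_{i:06d}.exr" for i in range(8)]

# Each worker pulls its own object: no shared file system to contend on,
# no single volume mount to saturate.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lambda k: process(fetch_object(k)), keys))

print(f"processed {len(results)} frames in parallel")
```

A block volume, by contrast, is attached to one server (or a small cluster), which is exactly what a single real-time grading workstation wants and exactly what a thousand-node render fleet does not.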

Object storage is entirely virtualised, so files become “objects” with paths that look like network addresses. Object storage has a high level of distribution and redundancy across the infrastructure of the data centre(s) in use. Those objects are reconstructed by the storage controller to deliver files to the server requesting them. The software must be able to understand those object paths in order to get the files, process them and put them back into object storage. It’s essential that your choice of SaaS and asset management system can interpret these objects and present them to you in a familiar way. The more advanced tools will be content aware, interpreting content types, formats and metadata, and allowing proxy views of the media.
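
To see what “paths that look like network addresses” means in practice, here is a minimal parser splitting an object URL into its store (bucket) and key. The `s3://` scheme is used as a familiar example; the bucket and file names are invented for illustration.

```python
# Object paths are URLs, not file-system paths. An asset manager can treat
# key prefixes ("show/reel1/") as if they were folders, even though the
# store itself is flat. Bucket/key names below are illustrative.

from urllib.parse import urlparse

def split_object_url(url: str) -> tuple[str, str]:
    parsed = urlparse(url)
    bucket = parsed.netloc          # the host-like part names the bucket
    key = parsed.path.lstrip("/")   # the rest is the object's key
    return bucket, key

bucket, key = split_object_url("s3://dailies-bucket/show/reel1/frame_000001.exr")
print(bucket)  # dailies-bucket
print(key)     # show/reel1/frame_000001.exr
```

Software that only understands mounted volumes will choke on such paths, which is why the asset management layer has to do the translation.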

Secure storage

The number one question about using cloud services for media in the last few years has definitely been security. The strong focus on security for cloud has resulted in a plethora of options, and understanding these is another important element in choosing storage. There are two main areas of security to look at here: encryption and access control.

Encryption at rest can be handled broadly in two ways: server side and client side. Server side means that the cloud control layer handles the encryption and decryption of content as it comes into and leaves the storage cloud. In public cloud, server-side encryption and key handling are performed transparently to the client, and typically add no significant processing overhead and little or no additional cost. Client-side encryption means the data is pre-encrypted before transport (this doesn’t affect the transport security layer, which is handled separately) and key management is the responsibility of the client. The advantage is that complete control of keys and decryption points stays with the client, but the process is no longer transparent and adds processing and operational overhead. The right implementation very much depends on other factors, such as the use of encryption at rest on local (non-cloud) storage and the key management and access control systems already in place. Typically, server-side encryption is acceptable for most cases and is already widely used and trusted by large corporations.
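
With server-side encryption, the client's job is often just to ask for it. The sketch below builds the request headers for an upload; the `x-amz-server-side-encryption` header shown is S3's documented way of requesting provider-managed encryption at rest, and other clouds expose an analogous request option.

```python
# Requesting server-side encryption at rest: one header on the upload and
# the storage layer handles keys transparently. The x-amz-* header is the
# S3-style option; other providers have equivalents.

def upload_headers(server_side_encryption: bool = True) -> dict:
    headers = {"Content-Type": "application/octet-stream"}
    if server_side_encryption:
        # Instructs the storage service to encrypt the object at rest
        # with provider-managed keys.
        headers["x-amz-server-side-encryption"] = "AES256"
    return headers

print(upload_headers())
```

Client-side encryption, by contrast, means encrypting the payload yourself before it ever leaves your machine and managing those keys for the life of the content, which is exactly the operational overhead the article describes.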

Transport layer security is usually fully automated and controlled by a negotiation between the sending and receiving controllers – in this case, the requesting server and the cloud storage control layer. Both the server and the storage controller have a suite of supported encryption cyphers, and the minimum preferred level of transport encryption is defined at the storage end. If the storage and server don’t have a matching cypher suite for the preferred protocol, they will move up through their respective lists until they find a matching suite (the cypher suites get progressively stronger – a weaker suite will never be used). In most cases the transport layer will use the first requested suite, as the protocols are widely available. In the case of public cloud, the customer doesn’t have control over this. If you wish to increase the encryption strength, you would need to come from public cloud directly to a commercial transport server, which will decrypt and then re-encrypt using its own cypher suite. However, most large public and hosted cloud providers use transport encryption just as strong as that of commercial file delivery systems. If you have your own on-prem storage, then you need to ensure you have configured your systems to request an encryption suite you consider sufficiently strong. Most security best practices currently recommend a preferred strength of 256 bits. Just remember that, as a rule, the more bits of encryption used, the higher the processing overhead required, which may become a workflow bottleneck.
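
For on-prem storage where you do control the negotiation, the floor is set in your endpoint's TLS configuration. A minimal sketch using Python's `ssl` module, assuming a self-hosted endpoint:

```python
# Pinning a minimum transport-security level on your own endpoint.
# The context below refuses anything older than TLS 1.2 and restricts the
# offered (pre-1.3) suites to ECDHE key exchange with AES-GCM.

import ssl

context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)

# Anything below this version is rejected outright, so negotiation can
# never fall back to a protocol you consider too weak.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# OpenSSL cipher string limiting the TLS 1.2 suites offered; TLS 1.3
# suites are governed separately and are strong by default.
context.set_ciphers("ECDHE+AESGCM")

print(context.minimum_version)
```

This mirrors the negotiation described above: the storage end states its minimum, and clients that cannot meet it simply fail to connect rather than silently downgrading.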


Access control is a subject as deep as it is wide, but in general you should consider separating your access policies from credentials for your various cloud touchpoints. In this case, policies are the access rights granted according to use cases, and the touchpoints translate into ‘users’ who have their own credentials for access, with policies applied to them as appropriate. The advantage of this segregation is that users can be switched on and off by allowing or denying those credentials, whereas the policies can be written once and applied to different users. This is particularly important if you work on many different projects with various cloud services and want to be able to control their access to the content over time. Following this segregation makes it easy to switch individual services on and off from a security perspective.
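
A toy model shows the shape of this segregation. The policy and user names below are invented, loosely modelled on IAM-style systems: write a policy once, attach it to many ‘users’ (services), and revoke a service by disabling its credentials without touching the policy.

```python
# Policies (written once) vs users/credentials (switched on and off).
# All names and the policy structure are illustrative, not a real IAM API.

policies = {
    "dailies-read": {"actions": {"GetObject", "ListObjects"}, "prefix": "dailies/"},
    "render-write": {"actions": {"PutObject"}, "prefix": "renders/"},
}

users = {
    "grading-saas": {"policies": ["dailies-read"], "enabled": True},
    "render-farm":  {"policies": ["dailies-read", "render-write"], "enabled": True},
}

def allowed(user: str, action: str, resource: str) -> bool:
    u = users.get(user)
    if not u or not u["enabled"]:
        return False  # switching a service off == disabling its credentials
    return any(
        action in policies[p]["actions"] and resource.startswith(policies[p]["prefix"])
        for p in u["policies"]
    )

print(allowed("grading-saas", "GetObject", "dailies/reel1.mov"))  # True

# Project wrapped: revoke the grading service without rewriting any policy.
users["grading-saas"]["enabled"] = False
print(allowed("grading-saas", "GetObject", "dailies/reel1.mov"))  # False
```

The same `dailies-read` policy keeps serving the render farm untouched, which is the whole point of keeping policies and credentials apart.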

Getting up and running

Cloud storage offers scale and flexibility whilst moving away from a cap-ex model and towards an op-ex model. It’s essential to understand how your workflow will accommodate this virtual approach to storage. As the media industry migrates elements of its operation from local to cloud infrastructure, we will see a high level of fragmented services and hybrid workflows. So it’s essential to analyse the applications you want to run in the cloud, how to interface those various services and, especially, how cost savings can be achieved.

Look to those companies that offer flexibility in storage options and in the transport mechanisms for getting files in and out. This is just as important as any financial benefits, because if your workflows become constricted those savings can quickly evaporate. Whilst the initial learning curve for cloud storage is steep for those new to it, we have had an overwhelmingly positive experience once customers are up and running. If you’re thinking about it, you should start to dip your toe in the water now. Many public cloud providers offer free trials, which are perfect for experimenting and learning – take advantage of that and stay ahead of the curve!