Is Software Eating Broadcast Tech – Part II

A little over a year ago I posed the question on this blog – Is Software Eating Broadcast Tech? – a play on the phrase coined by Marc Andreessen, who highlighted how software was disrupting, or eating, many industries. So how does that question look a year later? Based on the number of traditionally hardware-centric vendors announcing or showcasing software-based product roadmaps at NAB earlier this month, the answer looks like a resounding yes. Particularly in media processing and playout, both historically dominated by hardware-delivered products, everyone seemed keen to stress their future credentials as providers of software solutions that would run, variously, on commodity IT hardware, virtualized infrastructure and “the cloud”.

This is all good news, at least IMHO, but it is not something that will happen overnight, with little effort, or without teething problems. Here are my thoughts on the upsides, the downsides and the likely difficulties ahead (with a tip of the hat to Sergio Leone):

The Good

Software-based products will ultimately deliver much greater flexibility. We can deploy them on infrastructure that suits our purpose (cost, speed, location, vendor preference etc.) and we can do so dynamically – for example, using common compute resources to perform an Auto QC at one moment and re-allocating them to perform a transcode the next. This allows us either to get much greater efficiency from our private infrastructure – reducing its physical density, optimizing power consumption and so on – or to pay a third party such as a public cloud provider only for the resources we need at any particular time.
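That reallocation idea can be sketched in a few lines of Python. To be clear, `auto_qc` and `transcode` here are invented stand-ins, not any real vendor API; the point is simply that one shared pool of workers serves whichever job type is queued at the time, rather than dedicating boxes to a single function.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for real media operations.
def auto_qc(asset):
    return f"qc-report:{asset}"

def transcode(asset, profile):
    return f"{asset}@{profile}"

def run_jobs(jobs, workers=4):
    """Run a mixed bag of QC and transcode jobs on one shared pool.

    The same compute resources serve whichever job type is queued,
    instead of being permanently assigned to one function.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(fn, *args) for fn, *args in jobs]
        return [f.result() for f in futures]

jobs = [
    (auto_qc, "promo.mxf"),
    (transcode, "promo.mxf", "h264-web"),
    (auto_qc, "ep01.mxf"),
]
results = run_jobs(jobs)
# results == ["qc-report:promo.mxf", "promo.mxf@h264-web", "qc-report:ep01.mxf"]
```

The same pool could just as easily be drained of QC jobs and filled with transcodes an hour later – that is the efficiency argument in miniature.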

We can perform the deployment much more rapidly than in the physical product world too. Need a new channel? No problem: choose from a menu of service options and it will be ready to receive and publish content in minutes rather than weeks or months. I am hugely oversimplifying, of course, but you get the idea – in a software-centric world the effort takes place before a channel deployment, through software integration of channel templates, and the act of deployment to available virtualized hardware is a matter of launching machine images and/or configuration profiles, not a large installation project.
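To make the template idea concrete, here is a minimal sketch. The field names (`playout_image`, `outputs` and so on) are invented for illustration; the point is that once the template exists, "deploying a channel" is just stamping out a configuration profile, not an installation project.

```python
import copy

# Illustrative channel template; all field names are invented.
CHANNEL_TEMPLATE = {
    "playout_image": "playout-v2.1",
    "graphics_image": "gfx-v1.4",
    "outputs": ["hls", "dash"],
}

def deploy_channel(name, template=CHANNEL_TEMPLATE, **overrides):
    """'Launch' a channel by resolving a template into a profile.

    In a real system this step would boot machine images on
    virtualized infrastructure; here it just returns the resolved
    profile, to show that deployment is configuration, not installation.
    """
    profile = copy.deepcopy(template)   # never mutate the shared template
    profile.update(overrides)
    profile["channel"] = name
    return profile

ch = deploy_channel("news-24", outputs=["hls"])
# ch["channel"] == "news-24", ch["outputs"] == ["hls"]
```

The integration work all lives in building and testing `CHANNEL_TEMPLATE` up front; launching the tenth channel costs no more effort than the second.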

This should result in reduced costs, greater business agility and high fives all round.

The Bad

Unfortunately, this transition to software nirvana is non-trivial. Writing good quality, performant software to process audio and video, particularly in real-time, is hard. We know this already of course because such software already lives inside the hardware products we have used historically.

Hardware-delivered products, however, benefit from what is sometimes called mechanical sympathy (a term coined by Jackie Stewart to describe a driver being at one with a racing car through an intimate understanding of its hardware). The product designers and developers know exactly what to expect from their chosen hardware stack and optimize accordingly. In the software-running-on-hardware-of-your-choice world, things are less certain.

Different CPU implementations will provide different functional and performance characteristics through clock speed variations, cache sizes/levels, multi-core architectures, internal bandwidth and specialized accelerators – and that’s just the CPU. Mix in storage IOPS/throughput variations, memory size/bus bandwidth differences, GPUs, LAN throughput, OS and driver versions, and the list goes on. In this new world the software will need to be capable of optimising for generalised rather than specialised hardware, and vendors will need both to provide guidance on minimum system requirements and to accept that supporting a much more diverse underlying infrastructure will be more complicated and costly.
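One practical consequence is that a vendor's installer can no longer assume a known appliance – a pre-flight capability check has to stand in for hardware certification. A minimal sketch, with entirely invented minimums:

```python
import os
import shutil

# Illustrative minimum requirements; the numbers are invented.
MINIMUMS = {"cpus": 8, "disk_gb": 100}

def check_host(minimums=MINIMUMS):
    """Probe the host and report which minimums it misses.

    Returns a dict of {requirement: value found} for every
    requirement the host fails to meet; empty means all clear.
    """
    found = {
        "cpus": os.cpu_count() or 1,
        "disk_gb": shutil.disk_usage("/").free // 2**30,
    }
    return {k: found[k] for k in minimums if found[k] < minimums[k]}
```

A real check would go much further (accelerator presence, driver versions, NUMA layout), which is exactly the support burden the paragraph above describes.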

Then there is the question of how best to combine software components to perform multiple operations on the media. For real-time delivery, the serialised model of video processing, necessitated by using discrete hardware components interconnected via SDI, isn’t likely to be optimal in the future. Instead of emulating this architecture by interconnecting virtual machines with IP interfaces (though that is the most viable approach in the short term), it would be attractive to cluster multiple operations on the media while it is in memory on a single machine. This is an obvious benefit of integrated systems such as channel-in-a-box solutions.
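The contrast can be sketched as function composition: each stage below is a toy placeholder (real stages would operate on pixel buffers, and the stage names are invented), but the shape shows how a frame can pass through every operation while it stays in local memory, with no serialisation over SDI or IP between stages.

```python
# Toy "frame" operations; real ones would work on pixel buffers.
def deinterlace(frame):
    return frame + "|deint"

def insert_graphics(frame):
    return frame + "|gfx"

def encode(frame):
    return frame + "|h264"

def process(frames, stages):
    """Apply every stage to each frame while it stays in memory,
    instead of shipping it between machines after each stage."""
    for frame in frames:
        for stage in stages:
            frame = stage(frame)
        yield frame

out = list(process(["f0", "f1"], [deinterlace, insert_graphics, encode]))
# out[0] == "f0|deint|gfx|h264"
```

Swapping `stages` is a configuration change, not a re-cabling exercise – but it also means every stage must come from vendors whose components interoperate in-process, which is precisely where standardisation is lacking today.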

Unfortunately this is an area lacking in standardisation, or even interoperability, so today that often means taking the whole integrated system from a single vendor – even though they are likely to be good at some functions (probably those matching their product heritage) and not so good at others. And because increased software complexity often pushes the boundaries of reliability, the result of more functional integration can be system fragility – not a property that sits well with the high availability familiar to, and expected by, the broadcast television industry.

The Ugly

If we know where we want to go (The Good) and we know it’s not going to be easy (The Bad), then the transition period over the next few years is likely to be The Ugly. Customers who have bought into the vision will be impatient to receive it and will put pressure on vendors to deliver soon. The vendors, meanwhile, will inevitably find it is harder, takes longer and costs more to get there than they estimated (or at least publicly communicated). There is an old software development maxim that applies here – you can have it faster, cheaper or better, but you can only pick two.

The nature of this type of industry disruption tends to lead to a change of fortunes, and some companies will fare better than others in the transition. This is all happening against a background of industry consolidation (indeed, it is one of the forces driving that consolidation) and it will be tough for certain categories of company. Large companies may be better able to absorb the costs required to evolve successfully, and small companies may be nimble enough to adapt quickly, but those in the middle might find it a tough road ahead.

Pretty much every vendor at NAB said they would have their new software products ready for business “by the end of the year” if not before (or here already). Let’s make a note in the diary to see how things look a year from now.



Broadcast Data Convergence (Guest Post)

Like all well-intentioned ideas, I set up this blog intending to post regular ramblings on topics of relevance to our industry and the future of TV in general. Having failed miserably so far, I want to open the floor to anyone else who would like to make a guest post here, and I am delighted to publish the first one below. If you have something interesting to say on the future of TV and would like to write a guest post then please DM me @splunkett. The only rules are that you post as yourself rather than for an employer, and no product endorsements etc.

Rethinking Convergence

Have you ever stopped to wonder what is meant by convergence? In the world of broadcast we have been on the verge of convergence for many a year, and the past five of those have brought into sharp focus the products that work and the products that do not. A previous blog entry on this site (Extraordinary Broadcast Delusions and the Madness of Clouds) discussed the evolution of some of those once-distant prophecies.

Sure, convergence of technologies provides us with opportunity aplenty to re-imagine our own personal relationship with broadcast content.  Our box in the corner continues to play merrily for our entertainment, a dumb slave to our mastery of the remote control, providing new ways to enjoy the same old experience. That is convergence, isn’t it?

Well, I prefer to look at convergence from another angle; the convergence of ideas and practice across industries.

It was in 1993 that Sir Terry Leahy first envisioned an idea that would transform his business and his industry. In 1995 the Tesco Clubcard was launched. Over the next 18 years Tesco was transformed into one of the most significant companies of our era, or indeed of any era. The premise behind the scheme was simple: learn as much as you can about your customers, store as much information as you can, keep every ounce of data and sell it back to your customers in the form of promotions, offers and targeted deals. Tesco’s customers get a great shopping experience; Tesco cuts its marketing bill, gains customer loyalty and drives up the spend per customer in its stores.

Broadcast media of course has always been different: the relationship with the chap sat in his chair is as remote as the remote, a one-way interaction. There is little, if any, data collected. Yet the revolution of technological convergence changes all that.

In February this year Tesco announced the trial of Clubcard TV – imagine that, a retailer moving into TV! Yet isn’t this a logical progression for convergence across industries? All that data about what people buy must have relevance to what people watch. After all if it doesn’t, then what is TV advertising for?

Of course the launch came with all the usual comparisons with Netflix, Lovefilm and the like. The caveat for all such launches was displayed loud and clear: “success depends upon the quality of the content”. I wonder, though, whether that is strictly true; in my mind success will depend upon the quality of the data. Content will follow. Netflix proved this when it released House of Cards earlier this year.

So our dumb slave sat in the corner, now resplendent with Apps, takes on a different meaning. The Clubcard of the broadcast world gives us a greater experience, personalises our viewing and makes us want to watch more. It gives the broadcaster better targeting for its ad slots, better loyalty from its viewers, and higher spend from viewers and advertisers alike. Sounds familiar, doesn’t it?

So where are we headed – will all industries end up looking like one another? Perhaps! Amazon is a far cry from a book retailer, and its purchase of Lovefilm shows its intent in this direction. Yet I still don’t see many broadcasters heading in the other direction; they, of course, don’t have years of data to fall back on…yet!

@Lee_Cowie is Head of Technology Delivery at a leading broadcast service provider.


Extraordinary Broadcast Delusions and the Madness of Clouds

While sorting through my overloaded bookshelves at the weekend, trying to separate the I-know-I-won’t-ever-read from the yes-I-will-when-I-just-have-more-time titles, I came across a book I enjoyed back in the 1990s – Charles Mackay’s Extraordinary Popular Delusions and the Madness of Crowds. First published in 1841, the book chronicles a series of popular follies that swept along various groups and societies, in a herd-like manner, to over-invest in and ultimately abandon what at first seemed like a great idea, until it clearly wasn’t. At which point the same herd looks back and derides the whole business as obviously flawed and doomed from the start, before heading off in search of The Next Big Thing.

One example was Tulipomania, a boom-to-bust mania that briefly engrossed Dutch society in 1636 as people from all walks of life suspended reason to trade tulip bulbs at ever-escalating prices. When reality finally intervened, some people were left with red faces, big losses and (a few) black tulips.

So what has any of this got to do with TV? Well, as many of us will be heading off to the land of the tulip for IBC in Amsterdam in a few weeks, it’s interesting to look back at our own industry’s past delusions and also to assess any potential false dawns at this year’s show.

Unfortunately, we don’t have to look back very far to find our first contender – 3D TV. Anyone visiting IBC, NAB or CES from 2008 through to early 2012, let alone their local electronics retailer, would have been forgiven for believing that stereoscopic 3D broadcasting really was the future of the medium. On the back of a resurrection of 3D cinema we lost the run of ourselves. Ignoring the practicalities of Roy Orbison-style glasses for everyone, arguments over seating positions and a void of available programming, the whole industry went on a splurge of manufacturing, marketing and broadcasting until we realised that nobody was actually watching. At NAB this year you could be forgiven for wondering if it was all merely a strange dream, as no trace of 3D could be found anywhere. The only place you are likely to see it at this year’s IBC is if you peel back a few layers of paint on the vendor stands.

Prior to 3D TV, the last Next Big Thing was probably Interactive TV. At the dawn of the new millennium, as digital TV was being deployed in many countries, we were promised (or in fact were promising) a world of TV commerce, TV banking and TV pizza ordering. Surprisingly for all involved, Joe/Jane Consumer didn’t fall in love with button based remote control navigation, 1980s era 8-bit computer graphics and slow/barely connected services.

In both cases it is hard to say with certainty whether these were good ideas with flawed technology implementations or just flawed ideas. If it’s the former then the story is not yet over for either, as technology improves; but if it’s the latter then they won’t be returning anytime soon. Either way, we strain consumer confidence and interest when we over-promise and under-deliver like this. Comebacks are possible, however. Remember version 1.0 of the mobile internet? Painfully slow GPRS connections to WAP-based pseudo-websites on small screens with button-based controls. The concept seemed dead until Apple gave us the iPhone, 3G came along, and the rest is history.

So looking forward rather than back, what does this year’s IBC hold in terms of potential follies? Two of the contenders must be 4K and the Second Screen, and both have connections to the previous two failures.

3D TV launched on the back of an expensive but successful HD rollout. Consumers were swayed by HD, going out to buy new sets and signing up for HD broadcast subscriptions. 3D TV seemed poised to maintain that momentum, but having failed it might make the introduction of 4K harder. There is the issue of programme availability (in common with 3D) and some practical considerations. Even with good content, for many consumers with sub-60″ sets the uplift in image quality is not going to be as apparent as that between SD and HD on the same TVs. Unless average screen sizes stray into huge territory, the benefits of 4K (and certainly of 8K, which is already being prematurely touted as its near-term successor) will be restricted. Then there is the tricky business of distribution. Even with HEVC/H.265 we are talking about a lot of bandwidth, and over-zealous compression ratios will damage the whole quality proposition of 4K. We are still waiting for 1080p broadcasts, so let’s not oversell ourselves on 4K too soon.

Then there is the Second Screen. In many ways the plausible heir to Interactive TV, it could be what the iPhone was to the WAP-based mobile internet – the real deal. Certainly in terms of interactivity it solves the usability problem by replacing a crappy TV remote with a PC/tablet/smartphone, and it also solves the connectivity/speed problems of old. Finally, and very importantly, consumers invented second screening themselves; we have merely coined the phrase to describe it. But there are also a number of issues. First there is a problem of definition – second-screen activity covers a large swathe of behaviours unrelated to the TV programme in front of the viewer. Then there is the lack of standard mechanisms by which a second-screen application can interact with either a TV set or a TV programme. Finally, there are as yet few breakthrough examples of programme-relevant second-screen apps. We are back to the fundamental question of whether current technology is holding back a great idea or it’s just not a great idea, and only time can answer that question.

Finally, to round off the gratuitous repurposing of Mackay’s title – The Madness of Clouds. There is too much to say on this subject in a single post, so I will write a second one shortly dedicated to the topic. Needless to say, the words Broadcast and Cloud will be common currency at this year’s IBC, and having invested a lot of time in this area over the past couple of years I think it is an important discussion point. Cloud means a lot of different things to different people, and there are good, bad and downright ugly examples on offer at the moment – but more on that later.