Peter Jackson’s brave use of High Frame Rate (HFR) in “The Hobbit” has sparked intense debate over the past two months. The most prevalent questions have been:

“Does HFR look ‘better’ than traditional 24 fps, or does it just look ‘different’?”

“Will audiences eventually get used to HFR and then prefer it over traditional 24 fps, as with many other advances in motion imagery?”

Clearly HFR looks different. Anyone who saw “The Hobbit” in HFR can agree with that. And I think it’s fair to say that Jackson and his team succeeded in their goal to produce a more lifelike visual experience. But many viewers and critics found this new hyper-real experience off-putting, reducing the so-called “suspension of disbelief” that is considered essential to storytelling. Critics widely panned Jackson’s use of HFR, comparing the look of HFR to television and video games, deriding it as “non-cinematic,” “plasticine,” and “fake.”

If HFR didn’t work well for a cinematic adventure film like “The Hobbit,” there must be a scientific explanation. Or, as it turns out, maybe there isn’t. Yet…

At today’s Hollywood Post Alliance (HPA) Tech Retreat, Charles Poynton presented a fairly comprehensive overview of the challenges with motion imagery. His conclusion is that we don’t know enough about how motion is perceived by humans to explain with scientific certainty why HFR looks different. Poynton provided a thoughtful and detailed overview of how image capture and display technologies interact with the human perceptual system. Without going into detail about pulse width modulation or bit splitting (!), here are some of Poynton’s main points from today’s session:

  • We are still trying to fully understand how motion is decoded in the brain. Some scientists believe the brain processes chunks of input (20 to 50 milliseconds each) while others believe perception is more like a continuous stream.
  • Despite advances in our understanding of human perception, and in algorithms that apply that understanding to automatically assess visual quality, the only way to conclusively assess visual quality remains subjective testing by humans.
  • The way that humans track motion, by moving their eyes several times per second, presents a challenge for motion imaging systems.
  • Different display technologies (LCD, Plasma, DLP, CRT) emit light in different ways over time, creating an additional factor that must be carefully optimized.
  • Along with wide color gamut and other advances in motion image technology, HFR needs further study and refinement before it can be successfully adopted.

As a fitting postscript to Poynton’s presentation, Mark Schubin suggested that not only do we not yet know why HFR looks so different from traditional 24 fps, we also don’t know where this will all land. Many other advances in cinema technology were initially rejected by viewers, but eventually became standard fare.

In my opinion, HFR will eventually be incorporated as an additional, valuable tool for storytellers. Higher frame rates are not inherently superior to 24 fps for all content genres, but they may meaningfully enhance the audience experience when used for scenes that require fast motion or hyper-reality. It may turn out that television genres like news, live entertainment and sports are better suited to high frame rates, because the goal of these productions is to approximate live attendance of the event. Dramatic programs with high production values, such as “The Hobbit,” may ultimately benefit from the suspension of disbelief induced by the cinematic look of 24 fps, incorporating HFR more sparingly as a visual effect.

So maybe HFR is not the problem – maybe we are. Notwithstanding the creativity and skill of Peter Jackson and his team, perhaps we don’t know how to do HFR well yet. Maybe we don’t yet know when to use HFR and when not to. But at least Peter Jackson has forced the discussion by boldly pushing into this new frontier, for which we can all be grateful.

On Monday, Avid announced that Gary Greenfield has resigned from his position as CEO and Chairman of the Board. Louis Hernandez, who was appointed to Avid’s Board of Directors by Greenfield in 2008, has been named as Avid’s new President and CEO. George Billings, who has been a member of Avid’s board since 2004, is now Chairman. Greenfield will continue to serve as a member of Avid’s board.

These changes in Avid’s leadership follow several years of declining results. Here are some of the numbers:

  • When Greenfield took the helm on December 19, 2007, Avid’s stock price was $25.40. At close on Monday, it was $7.87, a decline of 69%.
  • Avid’s revenue for 2007 was $989.6 million. Although Avid has not yet announced official results for the fourth quarter of 2012, some analysts are predicting roughly $614 million in total revenue for 2012. If this estimate is accurate, Avid’s top line will have declined by nearly 38% during the last five years. (Note that these numbers are not fully adjusted for acquisitions and divestments.)
  • Avid’s cash position has declined from $224.5 million at the end of 2007, to roughly $71 million, according to Avid’s most recent results announcement (Q3 2012), a decline of more than 68%.

It’s important to note that other large vendors such as Grass Valley and Harris Broadcast have also struggled during this interval. And despite poor financial performance, Avid achieved several important accomplishments over the last five years:

  • Significantly reduced price points for many products, achieving better alignment to the market.
  • Decoupled hardware and software for Pro Tools and Media Composer, enabling customer choice.
  • Implemented “voice of the customer” programs, to increase customer influence.
  • Expanded product categories through strategic acquisitions, including Euphonix (audio consoles and control surfaces) and Blue Order (media asset management).
  • Refreshed several key product lines, while improving overall product quality and customer support.
  • Unified multiple brands under a single Avid brand.
  • Streamlined costs by transitioning to shared services and increasing offshore contract engineering.
  • Narrowed market focus on professional customers by divesting its consumer product lines, while expanding Avid’s professional services capabilities.

Although these accomplishments are significant, they did not result in sustained growth or profitability. To rebound, Avid will need to take bold steps. The good news is that the media industry’s value chain continues to evolve, creating new opportunities for technology vendors. Here are ten of the top macro trends, as I see them, in no particular order:

  1. Transition to HD and Digital. Although largely complete in North America, the transition to digital distribution and HD television is still in full swing globally, creating upgrade opportunities for technology vendors.
  2. Beyond HD. Film studios and broadcasters continue to advance beyond HD for high-end content production. As content creators adopt higher spatial resolution (e.g. 4K), higher frame rates (e.g. 48 fps and 60 fps), stereoscopic, and high dynamic range, as well as other advances in imaging, they will need to upgrade their equipment and expand their storage infrastructure.
  3. Growth in the MI Market. As the global economy has rebounded, so has demand for musical instruments (MI), as well as music creation tools and audio recording products.
  4. More Broadcast Channels. Broadcasters continue to add new channels, especially outside of North America, driving increased demand for content creation and distribution technologies.
  5. New Distribution Outlets and Multi-Screen. Broadcasters and media companies are monetizing content across more distribution outlets, including on demand, web, mobile, and social media, fueling the market for content transformation and distribution technologies. Additionally, new viewing paradigms are emerging including interactive, multi-screen consumption, creating new opportunities for targeted advertising.
  6. Corporate Video. The use of video in corporate environments continues to explode, creating opportunities for vendors of simple content creation tools, collaboration infrastructure, centralized storage and content management systems, as well as transformation and distribution services.
  7. Expanded Audio Deliverables. More content production means more audio production. Broader international distribution is driving more language versioning than ever before. New audio accessibility and loudness regulations, as well as new surround formats like Dolby Atmos, require more audio processing. All of these trends are fueling increased demand for audio production and processing tools.
  8. Transition to File-Based Workflows. As the entire media value chain continues its transition to file-based workflows, demand will increase for media asset management, media storage, and workflow orchestration solutions.
  9. Operational Integration and Automation. Media companies are looking for ways to increase their productivity without increasing operating expenses. To this end, they are adapting concepts and technologies from the IT industry to achieve new levels of process automation and operational integration, expanding the market for professional services and integration technologies.
  10. Cloud. As media companies streamline their operations, cloud-based infrastructure will become increasingly appealing – providing elastic capacity and cost, while enabling new workflows for distributed teams. This trend will unfold over the next several years, creating opportunities for media-savvy cloud-based service providers.

These opportunities are not new, and they are not accessible only to Avid. But none of these opportunities is out of reach for Avid either. The question is whether Avid will participate meaningfully in these opportunities, or continue to retrench in its existing core business.

Earlier this week, one of my former colleagues at Avid told me that many Avid employees do not see this week’s announcement as a regime change, but rather a modest change in roles for the existing management team. No board members have come or gone, an existing board member chosen by Greenfield is now CEO, and Greenfield himself is still in the picture as a board member. It’s hard to argue that this week’s announcement represents a radical departure from the status quo.

Still, one could argue that Hernandez is well positioned to effect change. Although Hernandez has been on Avid’s board for several years, most Avid employees and customers are not familiar with him. This affords Hernandez the opportunity to approach his appointment as a newcomer would. Yet his tenure on Avid’s board has exposed Hernandez to Avid’s business, customers and markets. The question is whether Hernandez will take this opportunity to effect change, or continue the course set by his predecessor.

Predictably, Monday’s announcement has sparked increased interest in Hernandez’s track record. According to publicly available documents, Hernandez was previously Chairman of the Board and CEO of Open Solutions, Inc., a technology company that was acquired in January 2013 by Fiserv, Inc. for $55 million. As part of the acquisition, Fiserv also assumed approximately $960 million of debt. Open Solutions had been taken private roughly six years earlier by the Carlyle Group and Providence Equity Partners for $1.3 billion. In the six years since Open Solutions was taken private, the company’s debt more than doubled from roughly $448 million to $960 million. Because Open Solutions had a history of losses, Fiserv will realize tax breaks to the tune of $165 million, lowering the total cost of the acquisition to around $865 million.

This means that the market value of Open Solutions has declined by roughly 33% over the last six years, from $1.3 billion to $865 million. According to articles and posts on public websites like Glassdoor, Open Solutions has also implemented layoffs and reduced employee benefits in order to manage costs, affecting employee morale. While the decline of Open Solutions has not been as precipitous as Avid’s, these similarities are worth noting.

In Avid’s Press Release Monday, Hernandez described his appointment as “…an exciting opportunity to lead Avid at this very important juncture…” adding that “[Avid] is well positioned for growth and global expansion in this fast-moving marketplace.” It certainly is an exciting time of change for the media industry, and a potential inflection point for Avid. Along with many other fans of this great company who continue to root for an Avid turnaround, here’s hoping that Hernandez is ready and able to chart a new course forward for Avid.

It’s difficult to summarize a show as diverse and expansive as NAMM. Plenty of other sites provide detailed, exhaustive coverage of new product announcements. So instead, I offer here the top ten themes of NAMM 2013.

1. The MI Industry Is Rebounding, Slowly

According to NAMM, the market for music and audio equipment has rebounded nicely from the economic downturn a few years back. In 2011, MI was a $16.3 billion industry, up 3% over 2010. Although the MI market has not yet fully regained its pre-crisis peak ($18.2 billion), it is clearly recovering. So has the NAMM show, which is now as vibrant as ever. For 2013, there were 93,908 total registrants, a 2% decrease from 2012. According to NAMM, this decrease was the byproduct of an intentional effort to make the show more business-focused. The result was an increase in buyer attendance (up 4% over 2012) and a decrease in non-industry guests (down 16%).

2. MIDI is 30 Years Young

MIDI is born at NAMM 1983
(courtesy of midi.org)

The 30th anniversary of MIDI was celebrated at NAMM 2013. You know what this means? The 30th anniversary of my first synthesizer purchase (a Roland JX-8P) can’t be far off!

3. Everything Old is New Again

Retro continues to be a major theme – from the way instruments look to the way they sound. Guitar makers are capitalizing on the market for vintage electric guitars by releasing additional “re-issue” guitars designed to exactly emulate their forebears. And every year we see more devices and plug-ins designed to replicate, or at least emulate, vintage gear.

  • Universal Audio expanded their library of software models of vintage audio gear, adding new plug-ins that emulate the Teletronix LA-2A and the API 500 EQ, as well as dozens of vintage amplifiers via their Softube packages.

    Virtual LA-2A’s from UA
  • Native Instruments released two new plug-ins that model the venerable Lexicon 224 (RC24) and 480L (RC48) reverbs. Available now for $149 each or $199 for both.
  • The Joe Perry edition Gibson Les Paul 1959 reissue costs over $10,000 and comes pre-worn on the back where a big 1970’s belt buckle would have scratched it.
  • Fender released four new “pawn shop” guitars – remixes of vintage Fender designs – as well as a reissue of their 1957 tweed Bandmaster amp for $1,899.
  • Dunlop released an updated MXR Talk Box for those of us who want to unleash our inner Peter Frampton. Available now for $170.
  • The Magnatone brand of vintage tube guitar amplifiers, last manufactured in 1969, is now being relaunched.

4. Real Analog Synths are Back!

Real analog synths made a big comeback at NAMM 2013. These are not modeling synths that emulate analog sounds, they are actual analog synths with real oscillators, filters, etc.

  • Moog unveiled its Sub Phatty monophonic analog synthesizer, a new twist on the classic Minimoog. Available in March for $1,099.
  • Korg announced the MS-20 Mini analog monophonic synthesizer, a replica of the vintage MS-20 mono analog synth, first introduced in 1978. The MS-20 Mini incorporates the exact same circuitry as the original MS-20, but with modern (!) advances like MIDI. Available in April for roughly $599.

    Korg MS-20 Mini
  • If one voice isn’t enough for you, upgrade to 12 with Dave Smith’s upcoming Prophet 12. A modern take on vintage polyphonic synths, the Prophet 12 offers four oscillators per voice with all the filters and VCAs you could want. Available in the second quarter for $2,999.
  • Although it was not visible on the show floor, Tom Oberheim announced that his re-imagined Two Voice Pro will be shipping in the spring of 2013 for $3,495.
  • Buchla debuted their upcoming Music Easel, a remake of their 1973 vintage analog synthesizer. Available in the second quarter of 2013 for around $4,000.
  • Elektron’s Analog Four synthesizer, introduced last November, was also demonstrated at NAMM. Priced at $1149, available now.
  • Dubreq demonstrated their new Stylophone S2, an analog synth based on the classic Stylophone, originally launched in 1968. The S2 is now available for $479.

5. Hybrid Everything

NAMM 2013 showcased dozens of hybrid products that blend technologies in every combination imaginable.

  • Hybrid MIDI keyboards, synths, and pad controllers, like Novation’s Launchkey, which includes a music keyboard, pads, faders, knobs and a companion iPad app with onboard sounds. M-Audio’s new Axiom AIR controllers offer deep integration with AIR instruments and Pro Tools.

    Behringer iX16
  • Hybrid guitars, amps, and processors, like Line 6’s JTV-89F with 29 onboard Variax instruments, as well as the ability to change the guitar’s tuning virtually, without adjusting the strings. Peavey’s new Vypyr VIP amplifiers offer built-in bass, acoustic and electric guitar models.
  • Hybrid software/hardware solutions like Behringer’s iX16 iPad mixer dock, a 16-input digital mixer and USB audio interface that can be controlled by an iPad.
  • Hybrid digital/analog products like Korg’s KingKorg synthesizer, a digital emulation of analog synths that features an on-board vacuum tube.
  • Hybrid MIDI and audio capabilities, like Ableton Live 9’s ability to convert audio to MIDI, including drum beats, melodies and harmonies.

6. Easy Music Creation

NAMM 2013 continued the trend toward hardware and software products aimed at lowering the barrier to music creation. On the hardware side, a number of new controllers took aim at non-keyboard musicians:

  • Ableton’s upcoming Push is a tactile grid controller for creation of beats, harmonies and melodies, as well as sound manipulation. Availability is planned for later in Q1 2013, when Push will be bundled with Live 9.

    Ableton Push
  • Livid Instruments debuted their new Base pad and touch controller, which features 32 pressure/velocity sensitive pads as well as touch faders and buttons. It can be used to drive apps like Ableton Live to quickly create new beats and melodies. Available in March for $399.
  • Nektar introduced their Panorama P1 USB DAW controller, which is deeply integrated with Cubase and Reason software. Planned for availability in April 2013 at $299.
  • Numark launched Orbit, a wireless handheld DJ controller that looks like a video game controller and incorporates a 2-axis accelerometer.
  • The Arturia SparkLE is a smaller, more affordable version of the original Spark software/hardware drum machine. Available soon for $299.
  • Tascam announced two new Pocketstudio ‘sketch pads’ that enable quick and easy capture of musical ideas. The DP-006 and DP-008EX will be available in March, starting at roughly $200.

7. iPad and iPhone

iPad/iPhone apps, accessories and integration continued to be a major theme at NAMM this year.

  • IK Multimedia demonstrated their iRig BlueBoard Bluetooth MIDI pedalboard (available in Q2 for $99.99) and iLoud speakers (available in Q2 for $299.99) as well as dozens of other iPad accessories.

    Apogee One
  • Apogee introduced iPad compatibility for their audio interfaces, including a new iPhone/iPod audio interface called “One,” as well as updates to Duet and Quartet that enable connectivity and control from iPod, iPhone and iPad.
  • Korg introduced the second app in their iPad musical instrument (iMS) lineup. The iMS-20 app includes a virtual MS-20 monosynth as well as a sequencer, drum machine and mixer. Available now for $29.99.
  • Rode showcased their iXY stereo microphone for iPhone/iPad, available in March 2013 for $199.
  • Numark showcased their iDJ Live II iPad DJ controller, which now includes USB connectivity for computers. Available “later this spring” for $99.
  • Keith McMillen Instruments demonstrated QuNexus, a USB MIDI controller that can be used in conjunction with MIDI devices, computers or iOS devices. Available in April of 2013 for $149.
  • Artiphon demonstrated a prototype of their upcoming Instrument 1, an iPhone powered “multi-instrument.” Available in late 2013 for around $800.

8. Mini Is “In”

Arturia MiniLab

Some products, like Roland’s RD-64 compact stage piano and the Arturia MiniLab, offer reduced size to enable increased portability. But other products seemed to be focused on just trying to be cute. Either way, there were many miniature products at NAMM 2013.

  • Vox released a tiny battery-powered modeling amp, called the MINI5. Available now for $99.
  • Dunlop launched Fuzz Face Mini, a line of diminutive effects pedals that emulate late-1960s fuzz-tone stomp boxes.
  • Numark launched a tiny DJ interface called dj2go. Available now for $49.99.
  • Samson previewed a range of tiny MIDI controllers, which will start at $59.99.

9. The High End Gets Pricier

$120k Telecaster

While many vendors are lowering prices to address the mass market, specialty products abounded at NAMM, including a $120,000 Fender Custom Shop Telecaster decorated with diamonds that sold in 30 minutes. No, I’m not kidding! Other vendors simply extended their product lines in the upward direction.

10. Folk Is Coming Back

With the success of bands like Mumford & Sons, some acoustic instrument categories are experiencing a bit of a renaissance.

  • Fender showcased the new Mando-Strat, an electric mandolin that looks like the love child of a mandolin and a Fender P-bass.

    Fender MandoStrat
  • Fender announced a new Connecticut-based Acoustic Custom Shop for high-end acoustic guitars.
  • Not to be outdone by Fender, Guild also announced a Custom Shop that is building high-end models, including a special 60th anniversary guitar that is selling for more than $5,000.

The HEVC/H.265 standard is now officially finalized, paving the way for broader commercialization in products in the near future. Back in September of 2012, I published a post introducing HEVC/H.265, a new video codec that offers approximately twice the efficiency of H.264. A draft specification for HEVC was released in July 2012 by the JCT-VC, a joint collaborative group co-sponsored by MPEG and the ITU. And last Friday, the ITU announced that HEVC had received first stage “consent” approval. Simultaneously, MPEG announced that HEVC had achieved Final Draft International Standard (FDIS) status. What does this mean?

HEVC is now fully baked and we will soon be seeing wider industry adoption!

It is important to note that the current HEVC specification only describes profiles for single-view 8-bit and 10-bit video with 4:2:0 subsampling. Extensions are in progress for 4:2:2 and 4:4:4 color sampling as well as multi-view (3D). Until these extensions are ratified next year, H.265 will not achieve full functional parity with H.264.
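
To see why those chroma extensions matter, here is a rough back-of-the-envelope sketch (in Python, with purely illustrative numbers) of how much raw data each subsampling mode carries per frame:

```python
# Rough, illustrative math only: raw bytes per frame under the chroma
# subsampling modes mentioned above (no compression involved).

def raw_frame_bytes(width, height, bit_depth, subsampling):
    # Total chroma samples relative to luma for each mode:
    # 4:2:0 -> two chroma planes at 1/4 resolution each (0.5x luma)
    # 4:2:2 -> two chroma planes at 1/2 resolution each (1.0x luma)
    # 4:4:4 -> two chroma planes at full resolution (2.0x luma)
    chroma_factor = {"4:2:0": 0.5, "4:2:2": 1.0, "4:4:4": 2.0}[subsampling]
    samples = width * height * (1 + chroma_factor)
    return samples * bit_depth / 8

for mode in ("4:2:0", "4:2:2", "4:4:4"):
    mb = raw_frame_bytes(1920, 1080, 10, mode) / 1e6
    print(f"1080p 10-bit {mode}: {mb:.1f} MB per raw frame")
```

The jump from 4:2:0 to 4:4:4 doubles the raw data per frame, which is part of why the higher-fidelity profiles are being standardized separately.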

Because HEVC is more complex computationally, adoption will depend on continued increases in processing power and battery power. Although some current mobile devices may be able to decode HEVC in the future simply by updating their software, HEVC will put a huge strain on processors and battery life. For many applications, hardware implementations will be optimal, and many hardware vendors have announced chips that will support H.265. For example, at CES 2013 Broadcom showed the BCM7445, an HEVC decoder chip that will be incorporated into next generation home gateway devices in mid-2014.

Broadcom BCM7445 at CES 2013

But as with any codec transition, consumer adoption will depend on support across the value chain:

  • Consumer viewing devices will require updated software or hardware to decode HEVC
  • Content owners and distributors will need to create and offer HEVC-encoded content for distribution
  • Content distribution networks, such as cable operators, will need to update their delivery infrastructure to support HEVC

Unfortunately, these barriers to adoption are significant. That’s why most of us are still watching MPEG-2 on TV, even though H.264 was ratified several years ago. Still, finalizing the specification is an important first step towards adoption. And HEVC offers many benefits to broadcasters and content owners. As with 4K and Ultra High-Definition Television (UHDTV), HEVC is an important building-block in the next wave of high resolution imaging.

TVs will be available this year, but adoption will be slow.

Not surprisingly, Ultra High Definition Television (UHDTV for short) was the big buzz at CES last week. All of the major TV manufacturers were demonstrating UHDTV sets running at 4K resolution, with Sharp showing an updated prototype 85” TV running at 8K resolution. But even as the TV sets themselves are approaching availability, the picture is becoming increasingly clear (pun intended) that uptake will be slow.

  • Although UHDTV sets will be available as soon as March of this year, initial pricing will be prohibitively high (>$10,000). According to CEA analysts, UHDTV units will account for only 5% of TVs sold in 2016.
  • No 4K or 8K content is available currently to view on UHDTV sets and likely won’t be available in quantity for years to come.
  • There is currently no distribution method for UHDTV content. Data rates for UHDTV can be at least four to sixteen times higher than for 1080p HD.
  • Broadcasters are generally unconvinced that a viable business model will emerge for UHDTV.
  • Most industry analysts and TV vendors now acknowledge that UHDTV has little (if any) value for displays smaller than 50 inches. For my thoughts on the relative value of higher spatial resolution, please read my earlier post on this topic.

In short, UHDTV is a really impressive technology that is desperately in search of a market opportunity. It’s hard to dispute that UHDTV represents a future we can all get excited about, but (like HD before it) the market for UHDTV will likely take several years to develop. Read on if you want more of the gory details. If you’re a UHDTV novice, feel free to check out the primer on UHDTV I posted last September.

Samsung S9000 85-inch 4K UHDTV at CES 2013
(photo courtesy of pocket-lint.com)

UHDTVs Will Be Available Soon

Virtually all of the major TV vendors demonstrated UHDTV sets at CES last week. Note that none of the sets is currently available, but some will begin shipping as soon as March 2013. Early units will include 84-inch displays from Sony and LG that will be priced at $20,000-$25,000. Smaller models will cost less – the Consumer Electronics Association (CEA) estimates that the average wholesale cost of 4K televisions will drop to $7,000 by late 2013, then to $2,800 in 2014. Even with this steep decline in price, only 1.4 million unit sales are projected for the U.S. in 2016, equivalent to roughly 5% of the market.

“It’s a very, very limited opportunity,” said Steve Koenig, director of industry analysis at the CEA. “The price points here are in the five digits (in U.S. dollars) and very few manufacturers, at least at this stage, have products ready.”

Content is (Lac)King

No UHDTV content is available to consumers currently. But the technology is available to create 4K content. Increasingly, feature films are shot with 4K cameras and are mastered at 4K for theatrical distribution. 4K consumer camcorders are already on the market. Classic films archived on 35mm film could be retransferred at 4K for distribution. And gaming engines could fairly easily render images in 4K or 8K. So the technology is available, but until a compelling market opportunity exists, content owners may not jump on board.

Upscaling and Other Sources

Early UHDTV sets from vendors like Samsung, Sony and Toshiba will support upscaling of HD content to 4K. For example, Sony recently announced plans for a “Mastered in 4K” Blu-ray library that will offer content mastered in 4K that is downscaled to 1080p HD for Blu-ray release.

“When upscaled via the Sony 4K Ultra HD TVs, these discs serve as an ideal way for consumers to experience near-4K picture quality,” according to Sony.

While this may seem like a sketchy marketing ploy (because the discs are no different from today’s Blu-ray titles), I have to admit that standard-def DVDs look much better when upscaled to HD, so there is a possibility that Blu-ray discs will look even better when upscaled to UHDTV. Still, upscaling of 1080p content is at best a stopgap until UHDTV-native content becomes available.

One source of UHDTV-native content could be your computer. Graphics cards have long since surpassed the 1920×1080 resolution of HD. Home movies and photo montages will look great on a huge UHDTV. I can also imagine UHDTVs becoming commercially viable as high-end displays for retail and corporate applications. But for television viewing, the lack of native UHDTV content will definitely be a barrier to adoption.

Bandwidth and Distribution

At 4K, UHDTV content has more than four times as many pixels as HD. At 8K, that multiplier jumps to >16x. Carrying that much data will require significant advances in compression technology as well as increases in network bandwidth. Ironically, at a time when sales of physical media are plummeting, these challenges may make physical media the most practical method for distributing UHDTV for the next few years. Recently, the Blu-ray Disc Association formed a task force to study the viability of extending the Blu-ray format to support 4K.
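
To put those multipliers in perspective, here is a quick sketch (assumptions, not measurements: 8-bit 4:2:0 video, i.e. 12 bits per pixel, at 60 frames per second, uncompressed):

```python
# Back-of-the-envelope only: pixel counts and uncompressed data rates,
# assuming 8-bit 4:2:0 video (12 bits per pixel) at 60 frames per second.
# Real services compress heavily; this just shows the scale of the problem.

formats = {
    "1080p HD": (1920, 1080),
    "4K UHD":   (3840, 2160),
    "8K UHD":   (7680, 4320),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    raw_gbps = pixels * 12 * 60 / 1e9   # bits per pixel * fps -> Gbit/s
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels // hd_pixels}x HD), "
          f"~{raw_gbps:.1f} Gbit/s uncompressed")
```

Even before compression enters the picture, 8K at 60 fps is on the order of 24 Gbit/s of raw video, which makes the appeal of a codec like HEVC obvious.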

As for television, some broadcasters seem interested in exploring UHDTV. Recently, satellite operator Eutelsat Communications launched a demonstration 4K channel in Europe. And an experimental Ultra HD channel is also being planned in Korea. NHK in Japan hopes to begin experimental 8K broadcasts in 2020. But these technology experiments are primarily focused on exploring the technical viability of UHDTV broadcast.

Business Model

Market viability is perhaps the biggest question surrounding UHDTV broadcast. As with the 3D hype two years ago, some industry analysts expect sports programming to drive consumer interest in UHDTV.

The Hollywood Reporter recently stated that “BSkyB in the UK, Sky Deutschland in Germany, Japan’s Sky Perfect Jsat, and Brazil’s TV Globo have all started to explore the potential of 4K, which would include coverage of events such as sports…with an eye toward offering the 2014 FIFA World Cup and 2016 Olympics (both of which will be held in Brazil) in [UHDTV].”

But other broadcasters have gone on record as much more jaundiced about the market opportunity. During the Broadband Unlimited conference at CES last Monday, Sheau Ng, VP of research and development for NBC Universal, said there is no business model for UHDTV yet.

“Therein lies the rub,” Ng said. “It’s not the technology, it’s the business model. Where is the money? Unlike the previous revolution of HD, we have the device manufacturers selling the device when people are still scratching their head and saying ‘What do I do?’ That’s something we’re wrestling with every day. For us to say ‘We’re going to do this,’ we need somebody to say ‘here’s the business model, here’s the number of devices in the market, here’s how we’re going to make money.’”

And Bryan Burns, VP of strategic business planning and development at ESPN, hinted that some broadcasters will wait for 8K.

“By the time we get [to 4K] we will be on to 8K or whatever. I don’t want to make the capital investment [in 4K]. There might be a gradual evolution…but I don’t see us heading to 4K production or an ESPN 4K channel.”

In Summary…

With the proliferation of high resolution cameras and displays, the question surrounding UHDTV is no longer “how,” so much as “why” and “when.” From my perspective, until there is critical mass of UHDTV sets installed in homes, content owners won’t spend the extra dollars to make UHDTV content available. Until the content is available, broadcasters won’t create UHDTV channels. And until a lot of compelling content is both available and affordable, consumers won’t pay extra for UHDTV sets or service.

If this circular “chicken and egg” dynamic sounds to you a lot like the uphill battle faced by HD technology fifteen years ago, or 3D TV two years ago, I think you’ve gotten my point: UHDTV is definitely coming, but not quickly.

At SMPTE’s annual technical conference (ATC) in Los Angeles last week, the keynote address was presented by Anthony Wood, founder and president of Roku and an early pioneer of DVR technology. His talk was called “The future of TV will be streamed.” That’s right! To kick off the SMPTE ATC this year, the founder of Roku stood up in front of an audience full of television engineers to announce that traditional broadcast will be killed by over-the-top (OTT) streaming.

His argument was simple. The internet has completely disrupted many other entrenched industries including newspapers, magazines, recorded music, as well as retail commerce in general. Traditional broadcast television is next on the list, on the fast track to obsolescence, displaced by streaming services like Netflix and Hulu on streaming devices like Roku and Apple TV.

As with any good keynote address, Mr. Wood’s presentation sparked lively debate. But was it over the top? (Bad pun, I know!)

Roku 2 Streaming Player
(source: www.roku.com)

Fundamentally I agree with Mr. Wood’s premise. I believe the future of TV will be streamed.

  • Media consumption is all about convenience – viewing what I want, where I want, when I want, on whatever platform I want.
  • Bandwidth and content will eventually be decoupled. For decades, cable and satellite customers have been forced to purchase bundles of content and bandwidth. But over time, consumers will have the choice to buy bandwidth (home and mobile) from one set of vendors, and rent content separately from other vendors.
  • Studies show that consumers anticipate that they will “cut the cord” when a superior streaming solution becomes available. In a recent consumer study by Deloitte, 64% of consumers agreed with the following statement: “If I could select my own television content and have it streamed to a digital device…I would give up my cable or satellite subscription.”
  • Sales of physical media (DVD, Blu-ray) are dropping precipitously.

That said, the transition will take a long time.

  • Although content is king, media is heavy. So content owners (and consumption device vendors like Roku) still need cable, telco and satellite service providers for media distribution. Bandwidth vendors will continue to extract a big chunk of revenue for media delivery services.
  • Not all content is available via streaming services. For example, live local news and live sports remain the exclusive domain of traditional broadcast. And premium content vendors like HBO typically require customers to have a current cable or satellite subscription in order to access their content via streaming. No wonder the vast majority of Roku owners also have a cable or satellite subscription, according to the Next TV Summit.
  • Consumers are surprisingly happy with cable. According to a recent Deloitte study, consumers are overwhelmingly satisfied with paid TV services when it comes to programming availability, user experience and customer service. Only a minority (10%-32%, depending on age demographic) believe they can get all the TV content they want through internet and streaming sources.
  • Cord cutting is proceeding slowly. In 2011, there were more than 100 million cable and satellite subscribers in the U.S., but only 1.5 million homes terminated their TV service, according to Nielsen.
  • While internet streaming services like Netflix and Hulu are growing quickly, sales of physical discs are still nearly three times larger. According to DEG, sales of physical discs amounted to $1.67 billion in Q3 2012, while subscription streaming services were $579.2 million for the same period.

In this way, Mr. Wood’s keynote highlighted the broadcast industry’s current state of cognitive dissonance. Television is alive and well as we know it. Consumers are watching more TV than ever before. But given explosive growth in OTT streaming, the disruption is clearly underway, if moving slowly.

With all the recent emphasis on web and mobile TV, advertisers and service providers may not be taking full advantage of reaching viewers through gaming console platforms – especially male viewers.

Earlier today, Nielsen announced the results of a recent study that indicates men and women spend roughly the same amount of time watching TV when you account for the use of gaming consoles. Previous studies have shown that women watch more TV than men, but did not include gaming console usage in their definition of TV time.

According to Nielsen, in March 2012 women aged 18-34 watched an average of 4 hours 11 minutes of TV per day, while men watched only 3 hours 34 minutes – about 15% less. But when Nielsen looked at the use of 7th generation game consoles (Wii, Xbox 360, PS3), this gap was all but eliminated. According to Nielsen, men aged 18-34 spent an average of 48 minutes using game consoles, as compared to only 22 minutes for women.

This study is interesting for a number of reasons. For one thing, it shows that men and women spend roughly the same amount of time in front of the TV (about 4 1/2 hours) but are engaged in a different mix of activities. Previous studies did not explore this nuance. For example, in 2011 Nielsen reported that female viewers (2 and up) watch more than 166 hours of TV per month on average, while males watch less than 140 hours – a gap of roughly 16%.
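
For the curious, the arithmetic behind “all but eliminated” works out as follows (a trivial sketch using only the Nielsen figures quoted above):

```python
# Quick arithmetic on the Nielsen figures quoted above (March 2012,
# viewers aged 18-34): console time nearly closes the TV-viewing gap.

tv_minutes = {"women": 4 * 60 + 11, "men": 3 * 60 + 34}   # 251 vs. 214
console_minutes = {"women": 22, "men": 48}

totals = {g: tv_minutes[g] + console_minutes[g] for g in ("women", "men")}
print(totals)  # {'women': 273, 'men': 262}

gap_tv = tv_minutes["women"] - tv_minutes["men"]          # 37 minutes
gap_total = totals["women"] - totals["men"]               # 11 minutes
print(f"gap: {gap_tv} min on TV alone, {gap_total} min including consoles")
```

A 37-minute daily gap shrinks to 11 minutes once console time is counted, which is what Nielsen means by “all but eliminated.”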

Nielsen did not indicate what activities the gaming consoles were being used for. But regardless of whether they are watching on-demand TV, surfing the web or playing a game, men are spending almost 20% of their TV time on a game console. According to Wikipedia, more than 230 million 7th generation game consoles have been sold worldwide. Perhaps it is time for advertisers and service providers to pay more attention to gaming console platforms, especially if they want to reach male viewers in the coveted 18-34 age range.

What the world needs now is… a new video codec? Back in 2004, when most of us video geeks were trying to figure out how to make H.264 work in the real world, plans were already afoot to develop its replacement. In case you didn’t notice, a draft specification for H.265 — also known as HEVC, which stands for High Efficiency Video Coding — was released in July by the JCT-VC, a joint collaborative group co-sponsored by MPEG and the ITU. The main benefits of HEVC are as follows:

  1. Greater Efficiency. H.265 should require only half the data rate of H.264 to deliver equivalent perceived quality.
  2. Increased Spatial Resolution. Supported image sizes span from as small as QVGA (320×240) all the way up to 8K (7680×4320) for UHDTV.
  3. Improved noise level, color gamut, and dynamic range as compared to H.264.
  4. Improved methods for parallel processing.

In subjective testing, HEVC has delivered equivalent perceived image quality at half the data rate, when compared to H.264 (High Profile). This means that HEVC could enable web and mobile devices to consume one half their current network bandwidth when streaming video. Or this increase in efficiency could be used to significantly improve image quality at the same data rate. Either way, H.265 will be a boon to web and mobile delivery.
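
As a back-of-the-envelope illustration of what that halving means for viewers (the 5 Mbps H.264 figure below is an assumed streaming rate chosen for illustration, not a measured one):

```python
# Illustrative only: what "half the data rate" means for a streaming
# viewer. The 5 Mbps H.264 rate is an assumption, not a specification.

h264_mbps = 5.0
hevc_mbps = h264_mbps / 2   # equivalent perceived quality at ~half the rate

hours = 2.0                 # an evening of streaming
for codec, mbps in (("H.264", h264_mbps), ("HEVC", hevc_mbps)):
    gb = mbps * hours * 3600 / 8 / 1000   # Mbit/s -> gigabytes
    print(f"{codec} at {mbps:.1f} Mbps: {gb:.2f} GB over {hours:.0f} hours")
```

For a viewer on a capped mobile data plan, cutting an evening of streaming from 4.5 GB to 2.25 GB is a very tangible benefit.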

Comparison: H.265 and H.264
(Source: Qualcomm at 2012 Mobile World Congress)

How does HEVC accomplish this drastic increase in efficiency? Although based on H.264, HEVC offers many small tweaks that add up, including the following:

  • HEVC replaces macroblocks with a more efficient (but also more complex) hierarchical system for partitioning frames (see the toy sketch after this list).
  • HEVC provides larger block sizes for higher coding efficiency.
  • HEVC supports tiling, allowing multiple encoder instances to work on the same frame simultaneously.
  • HEVC supports wavefront parallel processing, so multiple threads can process different slices of frames more efficiently.
  • HEVC is progressive-scan only, simplifying decoder implementations.
  • HEVC includes entropy coding algorithmic enhancements that enable hardware decoders to run more efficiently.
  • HEVC includes higher precision filtering for improved motion compensation.
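
The first bullet is easier to picture with a toy example. The sketch below is illustrative only: the is_busy() predicate is a hypothetical stand-in for the rate-distortion decision a real encoder makes, but the recursive quadtree split is the basic idea behind HEVC’s coding tree units:

```python
# Toy illustration of HEVC-style quadtree partitioning: a 64x64 coding
# tree unit (CTU) is recursively split into smaller coding units wherever
# the region is "busy". is_busy() stands in for the rate-distortion
# decision a real encoder would make.

def partition(x, y, size, is_busy, min_size=8):
    """Return a list of (x, y, size) coding units covering the region."""
    if size <= min_size or not is_busy(x, y, size):
        return [(x, y, size)]     # flat region: keep one large block
    half = size // 2
    units = []
    for dx in (0, half):          # recurse into the four quadrants
        for dy in (0, half):
            units += partition(x + dx, y + dy, half, is_busy, min_size)
    return units

# Example: pretend only the top-left corner of the CTU has fine detail.
busy_corner = lambda x, y, size: x < 32 and y < 32
units = partition(0, 0, 64, busy_corner)
print(len(units), "coding units, e.g.:", units[:4])
```

The payoff comes on flat regions: a fixed 16×16 macroblock grid always spends 16 blocks on a 64×64 area, whereas the quadtree can cover a flat CTU with a single 64×64 unit and save its small blocks for detailed areas.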

HEVC is expected to be fully ratified and published by early 2013. And, although chip vendors may not finalize hardware implementations until the standard is fully baked, software implementations will likely see commercialization in 2013. At IBC earlier this month, several vendors publicly demonstrated HEVC prototypes and announced HEVC support in future products.

What’s the catch? Licensing is still an unknown at this point. MPEG-LA has recently issued a call for patents that read on HEVC, so there is a possibility that H.265 will require a licensing fee. While MPEG-LA has not charged licensing fees for H.264 when it is used to distribute free video on the web, other uses of H.264 have incurred licensing costs. In any case, it seems likely that most H.265 implementations will require licensing fees.

There’s never a good time to introduce a new video codec, but it’s easy to argue that HEVC comes at just the right time. Video distribution now consumes the vast majority of all network traffic. Video-capable smart phones and tablets are ubiquitous. Unlimited mobile data plans are a thing of the past. Here’s hoping that licensing hurdles do not prevent the adoption of a new technology whose time has come!