The 4K Revolution: Where did we come from, and where are we going?
This year, HDR (High Dynamic Range) has been the talk of the town when it comes to image format and image quality in cinemas. Given all the attention and buzz of recent months, it’s easy to forget that these kinds of new-technology discussions go through an evolution of their own. Most of us probably still remember the attention that (higher) frame rates received with the 2016 release of Billy Lynn's Long Halftime Walk, or, going back to 2011, when James Cameron demonstrated 3D HFR footage. But who still remembers the curve that 3D technology (the days of “full-resolution triple flash”) went through? In this article, we will zoom in on the evolution and state of play of another image quality metric: resolution. In what could be called a historic battle between 2K and 4K, resolution has gone through a lifecycle of its own in cinema.
The early days
On June 18, 1999, DLP Cinema projector technology was used for the first digital cinema screening, that of Star Wars: Episode I—The Phantom Menace. The projectors had a native resolution of 1280 x 1024, and the 2.35:1 aspect ratio was achieved through an anamorphic projection lens. That means that when this wonderful industry of digital cinema was born and the first steps towards a new future for a whole industry were taken, we were not even using 2K resolution. More than that, we were stretching the pixels horizontally to fit the scope format. In the context of those pioneering days, it’s perfectly understandable that this was accepted; other, more important issues had to be solved when the whole concept of digital was new. And the 1280 x 1024 resolution was not at all bad by the display standards of the day: in 1999, more than 90% of TV sets sold in the U.S. still used CRT technology.
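As a back-of-the-envelope sketch of what that anamorphic lens had to do, assuming the full 1280 x 1024 panel was used for the scope image, the required horizontal stretch follows directly from the two aspect ratios:

```python
# Horizontal stretch needed to project a 2.35:1 "scope" image from the
# 1280 x 1024 panel of the first DLP Cinema projectors (figures from the text).
native_w, native_h = 1280, 1024
native_aspect = native_w / native_h      # a 1.25:1 panel
target_aspect = 2.35                     # scope presentation

# The anamorphic lens widens each pixel by this factor.
stretch = target_aspect / native_aspect
print(f"native aspect: {native_aspect:.2f}:1")
print(f"anamorphic stretch factor: {stretch:.2f}x")
```

In other words, every pixel on screen was roughly 1.9 times wider than it was tall.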
When DCI (Digital Cinema Initiatives, LLC) was formed in 2002, this organization put forward the standards that would define digital cinema as we know it today. Two resolutions were included in the standard: 2K (2048 x 1080) and 4K (4096 x 2160). This is mentioned as part of the paragraph on Projection Fundamental Requirements: “The projector is required to display either a native resolution of 4096 x 2160 or 2048 x 1080.”
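A quick calculation, using the two container resolutions quoted from the DCI specification, shows what the jump from 2K to 4K means in pixel terms:

```python
# Pixel-count comparison of the two DCI container resolutions quoted above.
resolutions = {
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# Doubling both dimensions quadruples the pixel count.
ratio = pixels["DCI 4K"] / pixels["DCI 2K"]
print(f"4K/2K pixel ratio: {ratio:.0f}x")
```

So “4K” is not twice 2K but four times: roughly 8.8 million pixels versus 2.2 million, which is why it demands correspondingly more camera, storage, bandwidth and processing capacity.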
Since that standardization, the cinema market has achieved close to 100% digitization. That so-called “first wave” has left the majority of screens and projectors at 2K. No official numbers exist, but an 80/20 split between 2K and 4K is likely. Several drivers have contributed to that. Cost was the first: 4K technology requires more capacity and bandwidth, and hence more investment, than 2K. This applies on the one hand to content creation: cameras, rendering computers, storage and transport were, in the early days, significantly more expensive in 4K than in 2K. But it also applies to exhibition: 4K projectors and media servers carried a price premium when introduced to the market.
Another effect was the availability of 4K content. Titles released in 4K were minimal in the early days of digital cinema—e.g., in 2012 only 11 movies were distributed to cinemas in 4K. This meant that the incentive for the industry (and exhibitors specifically) to go to 4K technology was also small.
Today, almost 20 years after the formation of DCI, cinemas are looking into renewal of their projection equipment. Zooming in on resolution, what is the outlook for the second wave that lies ahead of us?
The cinema industry has consistently managed to innovate the viewing experience ahead of home entertainment. Today, however, we are on the verge of being leapfrogged by consumer technology. Some graphs will make this clear.
[Graph: 4K TV prices]
[Graph: 4K TVs in the home]
Prices for 4K TVs are coming down fast, to a level where they are lower than (for larger screens) or competitive with FullHD (1920 x 1080, the consumer equivalent of 2K) sets. This is driving fast adoption of 4K TV technology in the home: already in 2016, more than 50% of all TVs sold in North America were 4K. For TVs over 60 inches, UHD accounted for 96 percent of shipments. This year it will be 99 percent.
Note that 4K in the home is actually 3840 x 2160, just below the cinema specification; it is therefore also called “4K UHD” to mark the difference. This year marks the sixth year since 4K UHD was commercialized in 2012. In 2000 it was HD (High Definition, roughly a million pixels), and in 2006 it was FullHD (two million pixels): every six years there has been a generational shift. The message is clear: after a 10-year run, FullHD in the home will soon be obsolete at the high end, just as HD went extinct in 2009.
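The generational steps above can be put in numbers. This sketch assumes 1280 x 720 for the HD generation (the text only says “a million pixels”), and adds cinema’s DCI 4K for comparison:

```python
# Pixel counts for the consumer display generations mentioned above,
# with cinema's DCI 4K alongside. HD is assumed to be 1280 x 720.
generations = {
    "HD": (1280, 720),
    "FullHD": (1920, 1080),
    "4K UHD": (3840, 2160),
    "DCI 4K": (4096, 2160),
}

for name, (w, h) in generations.items():
    print(f"{name}: {w} x {h} = {w * h / 1e6:.2f} million pixels")

# Consumer 4K UHD sits slightly below the DCI 4K container.
uhd = 3840 * 2160
dci = 4096 * 2160
print(f"UHD has {100 * (1 - uhd / dci):.2f}% fewer pixels than DCI 4K")
```

Each six-year generation roughly quadruples the pixel count, and consumer 4K UHD comes within about 6% of the cinema container, which is why living-room sets can credibly be compared to cinema projection at all.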
4K movie releases for the home are picking up fast. The graph above only considers 4K UHD Blu-ray releases (not streamed content) and shows that the number of titles has grown almost tenfold since 2012. The growth in 4K cinema releases is also clear, though not yet to the same extent as releases for the home. Still, in just four years the number of titles more than tripled.
As the graph above shows, Netflix alone has more than 120 titles available in 4K, of which 18% are movies.
The second wave
Going into the second wave of digital projection in cinema, the stars are aligned differently than they were 10 to 15 years ago. Moviegoers now regard 4K as normal: can it also become the new normal in cinema? Will patrons accept, or even understand, a situation where their preferred movie medium lags behind their living-room TV on a key specification?
Let’s look at the first driver: availability of content for the cinema. As more titles are distributed and promoted in 4K, exhibitors will be incentivized to embrace and adopt 4K projection technology. Looking at the availability of titles for the home mentioned above, you can see that content creators are organizing their workflows around 4K. That does not mean there will be a one-to-one spillover to cinema, nor that 4K will reach 100% market share overnight, but the trend is undeniable. It combines with another one: the dropping cost of computing power and network bandwidth. Stimulated by overarching trends like cloud computing, the cost of rendering and processing 4K has fallen to levels close to those of 2K. Likewise, the complexity and cost of transferring 4K content (which is larger than lower-resolution content) are no longer limited by the cost of internal and worldwide network capacity. Local networks inside a cinema, postproduction facility or studio lot are stepping up from Mbps to Gbps (gigabit-per-second) capacity.
The second driver for exhibitors will be the price difference between 2K projectors and servers and their 4K alternatives. Initially this premium was higher than 10%, but it is coming down as well, driven partly by standard price erosion and partly by the introduction of new technology. Miniaturization and the availability of more powerful processing chips will soon make it possible to offer next-generation 4K products at the price point of first-generation 2K products.
Historically, 2K has always had the edge over 4K in cinema. The first digitization wave left the market at an 80/20 split. Since then, however, forces in the consumer space have made 4K the new normal. With the cost and complexity hurdles in cinema disappearing, it could become the new normal for the second wave as well. What is your prognosis? Do you think exhibitors can afford not to choose 4K when making their technology choices for the next 10 to 15 years?