The Difference Between UHD and 4K

I hadn’t paid much attention to news surrounding ‘4K’ or Ultra HD (UHD) video resolutions until very recently. I’d largely ignored the topic until the Mac Pros were released in December and it became a topic of discussion on ATP, much to Casey’s dismay.

The topic of 4K video has crossed my radar from time to time over the past 4 or 5 years, such as when YouTube announced support for 4K video back in July of 2010. At the time I didn’t think much of it, as there was hardly any 4K content available and even fewer ways to watch it. I chalked it up to Google being Google, letting its engineers drive product development despite the fact that few, if any, of its customers would use the feature.

As part of my job, I am responsible for all of the digital content at a major trade show once per year, in whatever convention center we happen to be in. The show draws 25,000–30,000 attendees each year, which means I’m usually dealing with 30–50 screens spread around the buildings, all with different content on them. If you aren’t a green building professional, the content itself won’t excite you, but our marketing department nevertheless finds new and fun ways to make my life difficult each year in terms of formatting and the like.

Over the last few years, however, things have progressively gotten easier: the displays we rent or receive through donations (we’re a non-profit) and the computers we rent from our AV vendors have all unified around 1080p resolution and HDMI connectors. I’m able to tell our marketing folks to “just make everything 1920 x 1080,” and this works for 95% of our content. [1]

You’d be surprised at how many college-educated marketing and design people starting new jobs are fully versed in Adobe Photoshop, Illustrator, and InDesign, yet have no idea what 4:3 or 16:9 means without further explanation. I’ll risk sounding like an old man and say I wish college professors were teaching these people practical skills. [2]

Back to the main subject: 4K vs. UHD. I’ve recently seen these terms used interchangeably, and I had assumed they meant the same thing. They do not. Dylan Seeger at VS Forum writes:

Like 2K, 4K is a professional format used on the commercial side of video production most often seen by everyday consumers at commercial movie theaters equipped with the latest digital projectors. Unlike UHD, 4K has a different native aspect ratio. A true 4K image (4096x2160) has an aspect ratio of 1.9:1, while a true UHD image (3840x2160) is 1.78:1. We can see here that a 4K panel is actually wider by 256 pixels. This is a trivial number and doesn’t do much in terms of overall resolution or clarity of the image. I’m fairly certain that this minimal difference in resolution is what’s fueling many of us to call UHD “4K.”

This 256 horizontal pixel difference causes at least one major issue when dealing with consumer content. The problem is that almost all television content is presented in a 1.78:1 aspect ratio. If we were to view this content on a true 4K display, we would see black bars on the left and right side of the display to keep that original aspect ratio intact. While enthusiasts understand the reasoning behind this, most everyday viewers would find their TV content annoyingly masked with black bars, very similar to how they find black bars on their 1080p televisions annoying while viewing ’scope films. This is one of the main reasons for choosing 3840x2160 as the next-gen consumer resolution. It makes sense to keep that 1.78:1 aspect ratio as most content made for broadcast TV is presented this way.

True 4K is the resolution specified by the DCI (Digital Cinema Initiative) commercial standard. This is another area where UHD and 4K differ. Much like Blu-ray is the 1080p standard for encoding and presentation, 4K has its own set of standards that the DCI dictates. These standards are high end, resulting in exemplary image quality. While it isn’t totally clear yet what kind of video encoding standards the new UHD video format will use, all rumors point to sub-par encoding. DCI 4K uses JPEG2000 video compression, up to 250Mbps video bitrate, 12-bit 4:4:4 video, and a much wider color gamut. HDMI 2.0 will most likely dictate the standards for UHD Blu-ray (or whatever they decide to call it). Unfortunately, HDMI has very little left to give as an interconnect standard. As a result, there is no way to transport the amount of information needed to exceed or even match the 4K DCI standard. Those in the know are under NDA (non-disclosure agreements), which means we won’t know the specifics for at least another month or two. Rumors point to 10-bit 4:2:2 video for UHD video content at 24 frames per second and a doubling of the throughput to support higher bitrates.
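
The aspect-ratio arithmetic in the first paragraph of that quote is easy to check. Here is a quick Python sketch (my own illustration, not something from Dylan’s article) that works out the two aspect ratios and the width of the black bars you’d see if 1.78:1 content were shown pixel-for-pixel on a true 4K panel:

```python
# Aspect-ratio arithmetic for the two resolutions discussed above.
# Illustrative only; the resolutions are the ones Dylan cites.

DCI_4K = (4096, 2160)  # "true" 4K, per the DCI commercial standard
UHD = (3840, 2160)     # consumer Ultra HD

for name, (width, height) in (("DCI 4K", DCI_4K), ("UHD", UHD)):
    print(f"{name}: {width}x{height}, aspect ratio {width / height:.2f}:1")

# The extra width of a DCI 4K panel over a UHD panel
extra_width = DCI_4K[0] - UHD[0]
print(f"Width difference: {extra_width} pixels")

# Shown 1:1 on a 4096-pixel-wide panel, 16:9 content leaves the leftover
# width as black pillarbox bars split between the two sides.
print(f"Black bars: {extra_width // 2} pixels per side")
```

This prints 1.90:1 for 4096x2160, 1.78:1 for 3840x2160, and 128 pixels of black on each side of the panel, which backs up Dylan’s point that the difference is small in absolute terms but still shows up as bars on mismatched content.
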
Dylan’s article is worth reading in its entirety to understand the differences between the two formats and the implications of those differences. This was the first time I had seen those differences outlined in detail, so I thought it best to spread the word.
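
To get a feel for why the encoding parameters in the last paragraph of Dylan’s quote matter so much to the interconnect, here is a rough data-rate comparison of my own. The frame rate, bit depths, and chroma subsampling come from his article; treating them as raw, uncompressed payloads is my simplification, and it is a separate thing from the DCI’s 250Mbps figure, which is a compressed delivery bitrate.

```python
# Back-of-the-envelope raw (pre-compression) video data rates.
# Parameters are the ones mentioned above (24 fps, 12-bit 4:4:4 vs.
# 10-bit 4:2:2); the samples-per-pixel figures follow from the subsampling.

def raw_rate_gbps(width, height, bits_per_sample, samples_per_pixel, fps):
    """Raw video payload in gigabits per second."""
    bits_per_frame = width * height * samples_per_pixel * bits_per_sample
    return bits_per_frame * fps / 1e9

# DCI 4K master: 12-bit 4:4:4 -> three full-resolution samples per pixel
dci = raw_rate_gbps(4096, 2160, 12, 3.0, 24)

# Rumored consumer UHD: 10-bit 4:2:2 -> one luma sample plus two
# half-resolution chroma samples, i.e. two samples per pixel on average
uhd = raw_rate_gbps(3840, 2160, 10, 2.0, 24)

print(f"DCI 4K, 12-bit 4:4:4 @ 24 fps: ~{dci:.1f} Gbit/s raw")
print(f"UHD,    10-bit 4:2:2 @ 24 fps: ~{uhd:.1f} Gbit/s raw")
```

The 10-bit 4:2:2 signal works out to roughly half the raw payload of the 12-bit 4:4:4 one (about 4 Gbit/s versus about 7.6 Gbit/s at 24 frames per second), which goes a long way toward explaining why those are the rumored parameters for a consumer format that has to fit through HDMI.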


  1. I still get curveballs thrown at me, such as sponsor-supplied videos in a 4:3 aspect ratio in WMV format, or those awful 29:8 aspect ratio screens going into the exhibit halls in Moscone North and South.

  2. It would also help if they knew HTML and knew never to copy & paste content out of Microsoft Word into a WYSIWYG editor in a CMS.