Just What Is High Definition?

I recently read a newsgroup thread that started with someone asking just what high definition was. The thread wound in some surprising ways, with individuals asserting, among other things, that anything 16:9 was high definition and that 480i digital content was high definition (while presumably 480i analog was not). I suspect most of you wouldn’t agree. After all, we all know 720p and 1080i are HD while anything less is SD, right? Well, things aren’t quite so clear-cut.

Digital Television and Widescreen Television

A lot of people conflate high-definition television with digital television and widescreen television. It’s not all that surprising, as for many they all came along at roughly the same time.

And there are certainly aspects of the production and broadcast flow that muddy the waters. Even people who should know better are often misled. For example, about a decade ago during a discussion of SD vs HD recording capacity, an argument ensued over just what to count as HD. An exasperated product manager pointed to an episode of Friends that happened to be playing on a nearby TV and exclaimed, “There! That’s HD!” As it turns out, we were watching an SD subchannel. The video had been digitally remastered, and the left and right of the picture, previously cropped to yield a 4:3 aspect ratio for analog broadcast, had been restored. The broadcast was encoded as 704×486 with a 16:9 display aspect ratio. Two things led the product manager astray. First, because the episode had been digitally remastered, it was very clean and rather sharp. It was hands down better than what you’d see in an analog broadcast. Second, it was in widescreen format. But no… this was not HD. It was exceptionally good SD.

To clear the air just a little before we go further…

Digital television (DTV) is just TV that’s broadcast digitally. In the US, over-the-air digital television is governed by the ATSC standard, while over-the-air analog television was NTSC. Digital television will often look much better than analog simply because it is by nature free of analog artifacts (ghosting, snow, etc.) and generally not subject to the high-frequency losses that can occur in analog systems (or with bad cables).

Widescreen television refers to video with a wider aspect ratio than 4:3. One of the characteristics of high-definition broadcasts is that the picture is in a 16:9 widescreen format rather than the 4:3 format found in legacy analog broadcasts. The vast majority of HDTVs have a 16:9 screen (though you can find some sets from the early days that had 4:3 screens, and the even rarer non-16:9 widescreen panel). However, widescreen television existed before digital television, and both PAL and NTSC had widescreen signaling to indicate the aspect ratio of the broadcast video.

High Definition Television

Ok, so just what is high definition television? Let’s start our journey by looking at the ATSC specification. After all, you’d think that a broadcast standard would clearly spell out what this HD business is all about. The ATSC A/53 specification defines standard and high definition as follows:

high definition television (HDTV) – High definition television has a resolution of approximately twice that of conventional television in both the horizontal (H) and vertical (V) dimensions and a picture aspect ratio (H × V) of 16:9. ITU-R Recommendation 1125 further defines “HDTV quality” as the delivery of a television picture which is subjectively identical with the interlaced HDTV studio standard.

standard definition television (SDTV) – This term is used to signify a digital television system in which the quality is approximately equivalent to that of NTSC. This equivalent quality may be achieved from pictures sourced at the 4:2:2 level of ITU-R Recommendation 601 and subjected to processing as part of the bit rate compression. The results should be such that when judged across a representative sample of program material, subjective equivalence with NTSC is achieved. Also called standard digital television. See also conventional definition television and ITU-R Recommendation 1125.

Notice that these definitions avoid any specific resolution, but instead refer to the quality that standard and high definition video are expected to have. So let’s see if we can try to make things a bit clearer.

Conventional Television

First we have to figure out what “conventional television” is. A/53 conveniently omits a definition. I imagine the authors thought it would be obvious to the reader. In any case, it wouldn’t be unreasonable to take “conventional television” to mean the (then-conventional) analog NTSC over-the-air broadcast.

From a broadcast perspective, “conventional television” would have a best-case resolution of around 440×486 for the active video region (the part we’re interested in, and what can be compared to HDTV’s 1920×1080 and 1280×720). This has to do with the limitations of over-the-air broadcast and trying to cram the signal into a 6 MHz broadcast channel, of which the video portion gets about 4.2 MHz.

How we got the 440×486 resolution deserves some explanation. While there are definitely 486 active lines, a bit of hand-waving is required on the horizontal resolution, as the analog video signal doesn’t really have pixels. It’s a continuous waveform. However, if you took a camera, pointed it at a test pattern of alternating black and white vertical lines, and slowly zoomed out, you’d find that you could get about 440 lines (220 black lines and 220 white lines) across the picture before they started to blur together. The signal is bandwidth-limited and simply can’t swing fast enough to give you more horizontal lines of resolution.
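
If you want to see where a number like 440 comes from, here’s a rough back-of-the-envelope calculation (Python, just as a sketch). The 4.2 MHz figure is the video bandwidth mentioned above; the roughly 52.7 µs of active time per scan line is a typical figure, and exact values vary a bit from source to source.

    # Rough estimate of NTSC horizontal resolution from channel bandwidth.
    # Assumed figures: ~4.2 MHz video bandwidth and ~52.7 microseconds of
    # active time per scan line.
    video_bandwidth_hz = 4.2e6
    active_line_time_s = 52.7e-6

    # One full cycle of the highest frequency the channel can carry gives one
    # dark and one light "line", i.e. two resolvable elements across the picture.
    resolvable_lines = 2 * video_bandwidth_hz * active_line_time_s
    print(round(resolvable_lines))  # ~443, in the same ballpark as the 440 above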

You might also find some references to a horizontal resolution of about 330 lines per picture height. The key here is per picture height, which means that you aren’t counting the full width of the picture. You’re only counting an amount equal to the height of the picture (i.e. truncate the width until you get a square). Since NTSC broadcast video has a 4:3 aspect ratio, you need to multiply 330 by 4/3, which gets you 440. So one could say that over-the-air NTSC has a best-case resolution of about 440×486.
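
Here’s that same conversion written out as a tiny helper, just as a sanity check (the function name is mine, purely for illustration):

    # Convert "lines per picture height" into lines across the full picture width.
    def lines_full_width(lines_per_picture_height, aspect_w=4, aspect_h=3):
        return lines_per_picture_height * aspect_w / aspect_h

    print(lines_full_width(330))  # 440.0 for a 4:3 picture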

So, if we go back to A/53, high definition should have a resolution of 880×972 or better. 1080i at 1920×1080 definitely meets this criterion. However, 720p at 1280×720 is a bit questionable. It meets the horizontal requirement, but not the vertical. And this is the source of some of the debate around whether 720p qualifies as high definition or is merely standard definition.
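
To make the comparison concrete, here’s a toy check of that reading of A/53. The 440×486 baseline is the best-case NTSC figure derived above, and the “twice in each dimension” threshold is my interpretation of the wording, not a number A/53 actually specifies.

    # Check the common digital formats against a "roughly 2x conventional TV in
    # each dimension" reading of A/53, using the best-case NTSC figure from above.
    baseline_w, baseline_h = 440, 486
    threshold_w, threshold_h = 2 * baseline_w, 2 * baseline_h  # 880 x 972

    for name, (w, h) in {"1080i": (1920, 1080), "720p": (1280, 720)}.items():
        horizontal = "ok" if w >= threshold_w else "short"
        vertical = "ok" if h >= threshold_h else "short"
        print(f"{name}: horizontal {horizontal}, vertical {vertical}")
    # 1080i passes both; 720p passes horizontally but falls short vertically (720 < 972)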

But wait, you say, 720p has 720 progressive lines while those 486 lines are interlaced. So a fair comparison would be to use a resolution of 440×243 for conventional video, yielding a high definition resolution of 880×486. Maybe, maybe not. Interlaced video definitely plays games with the scan lines, but each frame still has 486 lines, and a good deinterlacer can recover a lot of those “missing” lines. For more thoughts on interlaced vs. progressive video, take a look at this post.

So if 720p might or might not be high definition, is it standard definition? A/53 says that standard definition should be comparable to video sourced at ITU-R 601, so ITU-R 601 gives an idea of the best that standard definition can be. It specifies an active video resolution of 704×486. Even ignoring that ITU-R 601 video is interlaced, 720p’s 1280×720 is a far better resolution.
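
If you want the raw numbers, the total pixel counts make the gap plain (simple arithmetic using the 704×486 active picture mentioned above):

    # Total active pixels: best-case SD (ITU-R BT.601-style) vs. 720p.
    sd_pixels = 704 * 486      # 342,144
    hd720_pixels = 1280 * 720  # 921,600
    print(f"{hd720_pixels / sd_pixels:.1f}x")  # ~2.7x the pixels of best-case SD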

So no, 720p isn’t standard definition.

That Recommendation ITU-R BT.1125 Business…

Since we aren’t getting all that far with the “conventional television” side of things, let’s turn to the second half of the A/53 definition of high definition. Just what is that ITU-R 1125 spec that A/53 refers to? Well, as it turns out, ITU-R 1125 is a specification for “Basic Objectives for the Planning and Implementation of Digital Terrestrial Television Broadcast Systems”. Note that it deals with broadcast systems; however, it does go on to define quality levels for those systems as follows:

  • HDTV quality: The system has the potential to deliver a picture that is subjectively identical with the interlaced HDTV studio standard.
  • EDTV quality: The system has the potential to deliver a picture that is subjectively indistinguishable from the 4:2:2 level of Recommendation ITU-R BT.601.
  • SDTV quality: The system has the potential to deliver a picture that is subjectively equivalent to PAL, NTSC, and SECAM.
  • LDTV quality: The system has the potential to deliver a picture of a quality obtainable with MPEG-1 systems operating at approximately 1/4 the resolution of the 4:2:2 level of Recommendation ITU-R BT.601. Some consider this comparable to VHS quality.

This takes us down another rabbit hole of just what the “interlaced HDTV studio standard” is. As it turns out, I’ve lost sight of the rabbit and haven’t been able to get a definitive answer. If you know, please drop me a note. From my days at Silicon Graphics, when I worked with vendors developing studio-grade systems, I suspect it refers to any number of formats: 1920×1080 @ 29.97 interlaced, 1280×720 @ 59.94 progressive, 1920×1080 @ 24 progressive, 1920×1080 @ 29.97 progressive… I think you get the idea. The key point is that both 1280×720 and 1920×1080 are generally used for high-definition production. The choice of which to use has more to do with the type of scenes that dominate the content – 1920×1080 has higher spatial resolution but lower temporal resolution, so it’s good for movies (particularly the progressive variants). 1280×720 progressive, on the other hand, trades roughly half the spatial resolution for twice the temporal resolution. Consequently, it’s better for sports or other fast-motion content.
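
To put some rough numbers on that trade-off, here’s a quick sketch comparing pixel throughput for the two formats. It assumes the usual 29.97/59.94 frame rates and ignores chroma subsampling, blanking intervals, and compression.

    # Spatial vs. temporal trade-off between the two common HD production formats.
    formats = {
        "1080i @ 29.97 frames/s": (1920, 1080, 29.97),
        "720p  @ 59.94 frames/s": (1280, 720, 59.94),
    }
    for name, (w, h, fps) in formats.items():
        per_frame = w * h
        per_second = per_frame * fps
        print(f"{name}: {per_frame:,} px/frame, {per_second / 1e6:.0f} Mpx/s")
    # 1080i: ~2.07 Mpx/frame at ~62 Mpx/s; 720p: ~0.92 Mpx/frame at ~55 Mpx/s.
    # Roughly half the pixels per frame, delivered about twice as often.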

So once again, 1080i clearly falls in the HDTV quality zone. 720p is probably HDTV quality; otherwise, it floats in that no-man’s land between EDTV quality and HDTV quality.

On a Practical Note

So far we’ve determined that we can say with certainty that 1080i video is high definition.

720p has been left hanging a bit. But perhaps it’s time to turn off that overly analytical part of our brain and use our eyes. After all, video is meant to be watched, not analyzed pixel by pixel. From a practical (and subjective) perspective, 720p is a noticeable improvement over 480i (and 480p), and just about everyone considers it a high-definition format. Plus, there is some merit to the argument that progressive video is better than interlaced for some content, in particular rapid-motion content such as sports.

And with that, I’ll leave you to ponder just where 480p falls, what to make of 1080i video that’s actually coded as 1152×1080, whether SD video upscaled to 1920×1080 is SD or HD, and the effect that compression has on quality. Something to keep in mind as you’re torturing your brain: many specs make liberal use of subjective quality comparisons and speak of the potential to deliver video of a certain quality, so even the spec authors keep things a bit circumspect. Oh, and you can now get back to considering whether 3840×2160 is really 4k, or whether 4k requires 4096×2304.