More Resolution Confusion…

In an earlier post I alluded to the fact that the human eye may be the limiting factor in the combination of screen size, resolution and viewing distance. And while that is true to a large extent, it needs some clarification when you're choosing a display and wrestling with whether you need to spend the extra cash for full 1080p resolution, or whether 720p at a cost savings will be adequate, or even an imperceptible difference. Well…things are never simple.

Fortunately, the question of 1080p vs. 720p may never even come up for a growing number of home theater enthusiasts. It's simply a question of future-proofing your purchase. If you buy a 1080p display today, it's going to be the highest resolution display you can buy for the near future. There are already higher resolution projectors in the works, but as far as displaying any available HD source pixel-for-pixel, 1080p will do it. So why even concern ourselves with something less? Oh yeah, money. That again.

As I wrote earlier, average eyes with 20/20 vision can only resolve details as small as 1/60 of a degree of arc (one arc minute) wide or tall. That means that if a single pixel of your display subtends a smaller angle than that, you can't see it as an individual pixel. And that would seem to indicate that if a 720p display is small enough or far away enough, you wouldn't be able to see the pixels of either it or a similar-sized 1080p display, so what's the difference? Okay, sit down.

The first problem with the 20/20 vision limit is that while 20/20 is considered "normal" vision, it is now apparent that many people either have better than 20/20 vision, or have had their vision corrected (notably via contact lenses) to something closer to 20/10. In other words, they can resolve 1080p pixels from much farther away than a person with 20/20 vision. So perhaps 720p's pixels will be visible to more people than we thought! A 1080p display would offer those eagle-eyed viewers a noticeably better picture.
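If you like to see the arithmetic, here's a rough back-of-the-envelope sketch in Python of where those acuity limits land you. The 50-inch screen size is just an illustrative example, and a 16:9 panel shape is assumed.

```python
import math

# Back-of-the-envelope sketch: how close do you have to sit before a single
# pixel is wider than your eye's acuity limit?  Assumes a 16:9 panel; the
# 50-inch size below is just an illustrative example.

def pixel_visibility_distance(diagonal_in, horiz_pixels, arc_minutes=1.0):
    """Distance (inches) at which one pixel subtends the given visual angle.
    Sit closer than this and pixel structure is resolvable; farther and it
    isn't.  1.0 arc minute ~ 20/20 vision, 0.5 arc minute ~ 20/10 vision."""
    aspect = 16 / 9
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # panel width
    pixel_pitch = width_in / horiz_pixels                     # inches per pixel
    angle_rad = math.radians(arc_minutes / 60)                # arc minutes -> radians
    return pixel_pitch / math.tan(angle_rad)

for label, h_pixels in (("720p", 1280), ("1080p", 1920)):
    for acuity, arcmin in (("20/20", 1.0), ("20/10", 0.5)):
        feet = pixel_visibility_distance(50, h_pixels, arcmin) / 12
        print(f'50" {label}, {acuity} vision: pixels resolvable inside ~{feet:.1f} ft')
```

Run it and you'll see that the sharper-eyed 20/10 viewer can pick out pixel structure from roughly twice as far away as the 20/20 viewer, which is the whole point.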

Second, you have to realize that since a lot of broadcast, cable, satellite and HD disc content is 1080i or 1080p material, something has to happen to scale that image down to 720p. And, to add insult to injury, many "720p" panels aren't actually 1280 x 720 pixels at all, but rather 1366 x 768, meaning that even native 720p material must be scaled a bit. Scaling downward is a tricky business, and not for the squeamish. Obviously, if you didn't have to scale at all, you'd be better off, and that's exactly what happens with 1080i or 1080p material on a 1080 display. Any other resolution means scaling. And scaling means artifacts that can be much bigger than the pixels themselves, and more easily seen even by our mediocre 20/20 eyes. So, again, advantage 1080p.
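For a sense of why that scaling is messy, here's a quick sketch showing that none of the common sub-1080 panel resolutions divide evenly into a 1920 x 1080 source (the panel figures are just typical examples):

```python
# Quick sketch: scale factors from a 1920 x 1080 source down to a few typical
# panel resolutions.  The non-integer ratios are why every output pixel ends
# up as a blend of several source pixels, which is where artifacts creep in.
source_w, source_h = 1920, 1080
panels = {
    "1080p panel":    (1920, 1080),
    "768-line panel": (1366, 768),
    "720p panel":     (1280, 720),
}

for name, (w, h) in panels.items():
    print(f"{name}: {source_w / w:.4f}:1 horizontal, {source_h / h:.4f}:1 vertical")
```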

So, three reasons to opt for 1080p:
1. The 20/20 vision argument isn't really about 20/20 vision; for many viewers it's more appropriately a 20/10 argument, which pushes the acuity-based minimum viewing distance out farther.

2. Scaling artifacts may be more visible than the pixels of a 720p display, so why scale if you don’t have to?

3. It’s about as “future-proof” as you can get. 1080p is likely to be the flat-out highest resolution that source material will be available in for quite a while.

Got it? 1080p is better, so if you can, get it.

Whew! Next, I want to take aim at a misconception so prevalent that one of our distributors (who should really know better) casually told me in conversation that 1080i is half the resolution of 1080p. I'm taking aim squarely down my telescopic sight at that one. So listen up.

The resolution of anything called 1080 (i or p) is 1920 x 1080 pixels. There are two other important aspects, though: the frame rate (60, 30 or 24), unfortunately also noted as a number with a letter behind it (e.g. 24p), and the method of scanning, either progressive or interlaced (the "p" or "i" in 1080p and 1080i). So the full specification of a video format should include the resolution, the interlaced or progressive indicator, and the frame rate, as in "1080p 24p". If you don't have both parts, you don't have the whole story. There are actually many different possibilities. The Advanced Television Systems Committee (ATSC) has published these recommended formats:


Vertical Lines    Horizontal Pixels    Picture Rates
1080              1920                 60i, 30p, 24p
720               1280                 60p, 30p, 24p


That's how the ATSC defined it, and I was surprised how hard it was to find, buried in their technical specification documents. Somehow the consumer electronics industry has managed to rewrite these figures in a much more confusing way (what a shock). Here are the same formats again:


1080p 30p
1080p 24p
1080i
720p 60p
720p 30p
720p 24p


And to make it much worse, the second figure is typically left off. The result is:
1080i (which is really 1080 lines by 1920 pixels at 60 interlaced fields per second)
1080p (which is really 1080 lines by 1920 pixels at either 30 or 24 progressive frames per second)
720p (which is really 720 lines by 1280 pixels at 60, 30 or 24 progressive frames per second)
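If it helps to see the shorthand spelled out, here's a small sketch that expands each consumer label into the full resolution/scan/rate story. The table below is my own summary of the list above, not any official ATSC data structure.

```python
# Expand the consumer shorthand into the full "resolution + scan + rate" story.
FORMATS = {
    "1080i":     dict(width=1920, height=1080, scan="interlaced",  rate="60 fields/s"),
    "1080p 30p": dict(width=1920, height=1080, scan="progressive", rate="30 frames/s"),
    "1080p 24p": dict(width=1920, height=1080, scan="progressive", rate="24 frames/s"),
    "720p 60p":  dict(width=1280, height=720,  scan="progressive", rate="60 frames/s"),
    "720p 30p":  dict(width=1280, height=720,  scan="progressive", rate="30 frames/s"),
    "720p 24p":  dict(width=1280, height=720,  scan="progressive", rate="24 frames/s"),
}

for label, f in FORMATS.items():
    print(f'{label:>9}: {f["width"]} x {f["height"]}, {f["scan"]}, {f["rate"]}')
```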

So when we ask "Is 1080i better than 1080p?" we are really asking "Is 1080i 60i better than 1080p 30p?" And we can reduce that to the difference between scanning an image with an interlaced pattern twice as often as with a single progressive scan. And there it is: once every 1/30th of a second, or twice, interlaced, every 1/30th of a second. It amounts to what happens between scans. In the case of a progressive scan, the camera shutter snaps the image, and the entire image is scanned in one pass. In the case of an interlaced image, the camera snaps an image and the odd lines are scanned, then the camera snaps another image 1/60th of a second later and the even lines are scanned. There is a 1/60th of a second time offset between interlaced scans. That does two things. First, motion may appear smoother (more scans per second can do this), and second, objects in motion may appear to have jagged edges (because each field carries only half the vertical resolution). But all is not lost. A motion-adaptive deinterlacer within the video processor inside the display device can (and should) reassemble all this interlaced mess using interpolation.
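To make "motion-adaptive" a little more concrete, here's a toy sketch of the idea: keep ("weave") the line from the other field where the picture is static, and interpolate ("bob") from neighboring lines of the same field where motion is detected. Real video processors are far more sophisticated; the threshold and field layout here are just my own assumptions.

```python
# Toy motion-adaptive deinterlacer: weave static detail from the other field,
# interpolate spatially where the two disagree (i.e., where there is motion).
import numpy as np

def deinterlace(prev_field_odd, curr_field_even, motion_threshold=10):
    """Build a full frame from an even field, filling in the missing odd lines.

    prev_field_odd / curr_field_even: 2-D arrays holding the odd / even lines
    of the picture (each has half the vertical resolution of the full frame).
    """
    h = prev_field_odd.shape[0] + curr_field_even.shape[0]
    w = curr_field_even.shape[1]
    frame = np.zeros((h, w), dtype=float)
    frame[0::2] = curr_field_even                        # even lines pass straight through

    for i in range(prev_field_odd.shape[0]):             # fill each missing odd line
        above = curr_field_even[i]
        below = curr_field_even[min(i + 1, curr_field_even.shape[0] - 1)]
        bob = (above + below) / 2                        # spatial interpolation
        weave = prev_field_odd[i]                        # line taken from the other field
        motion = np.abs(weave - bob) > motion_threshold  # crude per-pixel motion test
        frame[2 * i + 1] = np.where(motion, bob, weave)
    return frame
```

Where the two fields agree, you get full vertical detail back; where they don't, the sketch falls back to interpolation, which is exactly the trade-off the paragraph above describes.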

How adept the display's deinterlacing circuits are determines how many motion-jaggy artifacts you see; fewer artifacts are better, but usually more expensive to obtain. But we're only talking about 1080i video here, which is based on a 60Hz field rate. What about film, which runs at 24 frames per second?

Just to get a 24fps film to work at all on a 60Hz-based display system takes a process called 'telecine'. Going back to the early days of TV, engineers found that simply projecting a 24fps film onto a camera image tube scanned at 30fps results in a rolling flicker bar due to the mismatched frame rates. Normal film projectors use two-blade shutters that expose each frame to light twice, for a flicker rate of 48Hz. That simply doesn't play well with the 60Hz field rate of NTSC TV. Early telecine machines were projectors that focused their image on a TV camera image tube, with five-blade shutters synchronized to the TV station's house sync signals. The five-blade shutter means each frame is "exposed" to the image tube five times, resulting in a flicker rate of 120Hz, which is a multiple of 60Hz. That got rid of the rolling bar, and left us with an odd way of converting film's 24fps frame rate to 30fps TV. Ultimately, what took hold was a process called 3-2 pull-down, "pull-down" referring to a projector pulling the film down one frame at a time into its gate. 3-2 pull-down means that as a 24fps film gets transferred to video, each film frame is held for a cadence, or pattern, of three video fields, then two fields, then three, and so on. The resulting video is 30fps with the original 24fps material embedded in it. Some displays are able to detect this 3-2 cadence and undo it, turning the video back into 24fps. Not very many displays can deal with the resulting 24p video, though, so it then has to be converted, sometimes again with interpolation, to a frame/field rate the display can handle. The short story in all of this is that for material that originated on 24fps film, the differences between 1080i and 1080p drop into irrelevance.
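Here's the 3-2 cadence spelled out as a quick sketch; the film frames A through D are just placeholders for four consecutive 24fps frames.

```python
# 3-2 pulldown sketch: four film frames (24 fps) become ten video fields
# (60 fields/s, i.e. 30 interlaced frames/s).
def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # alternate three fields, then two
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

for n, (frame, parity) in enumerate(three_two_pulldown(list("ABCD"))):
    print(f"field {n:2d}: film frame {frame} ({parity})")
```

Four frames in, ten fields out, which is exactly the 24-to-60 ratio the telecine has to hit.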

What is relevant is how well a display deals with all of these interlace/progressive and various frame rate issues. It's all about interpolation, image processing, and display refresh rates. So the simple answer is (get ready, this is what you've been waiting for…) the differences between 1080i and 1080p are irrelevant for some material, and depend on how elegant the video processing is for the rest.

At this point you may be asking for a simple answer. There isn't one, except this: for critical applications, like home theaters, media rooms, or your large primary screen, stick to 1080p displays. For the 42" TV in the bedroom or rec room, you could save a little money on a 720p set. It's highly likely that in a few years 720p sets will be largely off the market, because the cost of 1080p displays will have dipped to match them, and most of the market wants the bigger numbers. After all, once you open your checkbook, doesn't 1080p sound better than 720p? I thought so.

There’s much more to picking out the right display for your home theater or media room. Call the experts at Platinum Home Theaters for professional assistance and competitive pricing!
