Question:

1080p vs. 1080i ... Is there a difference? If "YES" what is the difference?

3 ANSWERS


  1. Yes. But what the difference is, and under what circumstances it matters, takes a longer explanation. Sorry, but as with most things to do with home theatre, it is necessary to look beyond the simple yes-no answer and understand that what advertising tells you is often misleading. The buzz is that "true HD" requires 1080p. The reality is that paying for a 1080p display to be used under the viewing conditions most people have in their living rooms is a waste of money.

    1080i and 1080p are designators for two different ways of reconstructing a 1920 x 1080 pixel image on a screen (my terminology assumes a flat-panel HDTV display rather than a CRT, but the principles are the same).

    The "p" version is simple ... the 1080 vertical lines of pixels are drawn sequentially (i.e. progressively = p) from bottom to top ... all within a short length of time (the actual time varies with the scan frequency), then a second frame (image) is drawn to replace it. Frame rates are typically 24, 30 or 60 frames/sec (fps) and the human eye interprets the sequence of images as smooth motion.

    The "i" version is more complex. The 1080 lines of pixels for each frame (image) are split into two halves with 540 lines each. One half has the odd numbered lines (1,3,5, ...) and the other has the even number lines (2,4,6,...). The odd numbered lines are drawn first, then the even numbered lines are drawn "interlaced" to fill out the full image. This is done in half the time required by the scan frequency, so that a complete interlaced frames is drawn at the same 24, 30 or 60 fps. Again the result is perceived as smooth motion.

    So, what's the difference?

    Other than the obvious technical difference of drawing the frame progressively (p) or interlaced (i), if there is no (or not much) motion during the time the frames are being drawn there is no appreciable difference. The progressive image and the interlaced image will be the same (or so minutely different they can be considered identical).

    But consider what happens if the program being displayed is showing fast motion (e.g. a hockey game where a puck is being shot toward the net). Even though the interlaced image is drawn quickly, there can be a difference in the position of some elements of the picture in the two halves of the frame (e.g. the puck), and the two halves won't mesh exactly. The result is a slight jaggedness to the image.
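
    As a rough, made-up illustration of that mismatch (the field rate and puck speed below are assumptions I picked for the example, not figures from any real broadcast):

    ```python
    # Toy model of the mismatch described above: the two fields of one interlaced
    # frame are captured a half-frame apart, so a fast-moving object sits at
    # different horizontal positions in the odd and even rows.

    FIELD_INTERVAL = 1 / 60   # assumed: 60 fields per second
    PUCK_SPEED_PX = 600       # assumed horizontal speed in pixels per second

    def puck_x(t):
        """Horizontal position of the puck (in pixels) at time t."""
        return PUCK_SPEED_PX * t

    x_odd_rows = puck_x(0.0)              # odd rows sampled at the first field
    x_even_rows = puck_x(FIELD_INTERVAL)  # even rows sampled one field later

    print(f"odd rows : puck at x = {x_odd_rows:.0f}px")
    print(f"even rows: puck at x = {x_even_rows:.0f}px")
    print(f"offset between adjacent rows: {x_even_rows - x_odd_rows:.0f}px")
    ```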

    In practice, a 1080p HDTV will "deinterlace" the incoming 1080i signal and draw each frame progressively, so -- assuming a good deinterlacer (and there are better and worse ways to deinterlace) -- the result is that the 1080p HDTV will display virtually indistinguishable pictures from 1080i and 1080p sources.
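
    For a sense of what "better and worse ways to deinterlace" means, here is a sketch of two of the simplest textbook approaches (commonly called "weave" and "bob"); real video processors use much more sophisticated, motion-adaptive methods:

    ```python
    # Minimal sketch of two simple deinterlacing strategies. A "field" here is
    # just a list of rows; in a real processor each row would be a line of pixels.

    def weave(odd_field, even_field):
        """Interleave the two fields back into one full frame.
        Perfect for static scenes, but produces combing when there is motion."""
        frame = []
        for odd_row, even_row in zip(odd_field, even_field):
            frame.append(odd_row)
            frame.append(even_row)
        return frame

    def bob(field):
        """Build a full frame from a single field by repeating each row.
        No combing, but vertical detail is halved."""
        frame = []
        for row in field:
            frame.append(row)
            frame.append(row)  # simple line doubling; better versions interpolate
        return frame

    odd = [f"odd row {i}" for i in (1, 3, 5)]
    even = [f"even row {i}" for i in (2, 4, 6)]
    print(weave(odd, even))  # rows 1, 2, 3, 4, 5, 6
    print(bob(odd))          # rows 1, 1, 3, 3, 5, 5
    ```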

    Now the complex part .... what happens on a non-1080p display?

    "1080i" is actually a potentially confusing designation. It can -- as above -- refer to the resolution and scan process for an image, but it does not necessarily relate to the native resolution of a display ... and it is the native resolution that determines what you actually see.

    I probably lost you, so let's use an example. Let's say you have a "720p" HDTV with a native resolution of 1366 x 768 pixels (vs 1920 x 1080 pixels on a true 1080p HDTV). This HDTV will accept a 480i/p, 720p and 1080i (and maybe even 1080p) input signal ... but the video processor has to convert these signals to match the available resolution of the screen. So all signals are "processed" (i.e. deinterlaced and/or scaled) to fit the 1366 x 768 pixels available. A 1080i image, for example, is deinterlaced and then scaled down to 768 vertical pixels. This processing introduces various artifacts (degradation) into the picture.
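
    A quick back-of-the-envelope look at those conversions (the 1366 x 768 panel is the same hypothetical example as above):

    ```python
    # Rough scale factors involved in fitting common input formats onto a
    # hypothetical 1366 x 768 panel; every source is resampled to this fixed grid.

    PANEL_W, PANEL_H = 1366, 768   # native resolution of the example "720p" HDTV

    SOURCES = {                    # common input formats as (width, height)
        "480p": (720, 480),
        "720p": (1280, 720),
        "1080i/1080p": (1920, 1080),
    }

    for name, (w, h) in SOURCES.items():
        print(f"{name:>12}: width x{PANEL_W / w:.3f}, height x{PANEL_H / h:.3f}")

    # A 1080-line image ends up squeezed into 768 lines (x0.711), i.e. roughly 29%
    # of the vertical samples are blended away -- one source of scaling artifacts.
    ```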

    The point is that a 1080i signal is probably indistinguishable from 1080p when displayed on a 1080p display, but may be significantly different on a "720p" display (But, to be fair, so is a 1080p signal on a 720p display).

    Also note that broadcast HDTV is either 720p or 1080i (both exist and you don't get to choose). There is NO 1080p broadcast ... the only current consumer source of true 1080p is HD DVD or Blu-ray discs. On a 720p HDTV both broadcast formats have to be processed to fit the screen, and which looks better will depend on the video processing in the TV. On a 1080p HDTV the 720p signal must be scaled up to fill the screen, while the 1080i signal only has to be deinterlaced, so the 1080i signal may look slightly better than the 720p signal.

    But keep this all in perspective. The differences can be quite subtle, and if you are sitting 8-10 feet away from an HDTV smaller than 50" (and even then it is questionable), you won't see any difference not only between 1080i and 1080p, but even between 720p and 1080p.

    Finally, I've included a couple of references in case you want to do more reading.


  2. There is a difference: 1080i is interlaced and 1080p is progressive scan. Having said that, it honestly takes a microscope to really see the difference. The human eye and brain see anything faster than about 23.7 frames per second as fluid motion. The resolution of the screen helps clarify the picture, but when you are dealing with such a high-density picture you have maxed out what your eyes and brain can discern.

  3. Without going into huge detail, I will put it simply.

    1) 1080i has jagged edges versus smooth 1080p edges.

    2) 1080p will have faster and smoother video.

    Both points come down to frame rate and the difference between progressive and interlaced scanning.
