Question:

Which of the HD resolutions is better... x*x p or x*x i?



I understand that higher numbers such as 1600x1200 or 1280x768 mean a higher pixel count and better resolution. I'm not clear on what the p vs. i means, i.e., which gives the better picture on HD: 1080i or 1080p?


5 ANSWERS


  1. P is progressive scan, meaning every line of the frame is drawn in a single top-to-bottom pass, as opposed to interlaced (i), where the lines are shown half at a time: the odd-numbered ones first, then the even-numbered ones (see the sketch at the end of this answer).

    Basically the big difference will be jagged edges, but if you have a 1080p TV with a good de-interlacer, it will show a 1080i signal as 1080p with little to no difference.

    1080p is better, but only compared with a TV that can only go as high as 1080i. If you look at a 1080i and a 1080p signal of the same source going into a 1080p TV, there shouldn't be much, if any, difference.
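
    A rough way to picture this is a little sketch in Python (hypothetical code; the "frame" here is just a list of numbered scan lines, not real video):

        # Sketch: how interlacing splits one frame into two fields, and how
        # a simple "weave" deinterlacer puts them back together.
        frame = [f"line {n}" for n in range(1, 11)]   # a tiny 10-line "frame"

        # Interlaced transmission: odd-numbered lines first, then even-numbered.
        odd_field  = frame[0::2]    # lines 1, 3, 5, 7, 9
        even_field = frame[1::2]    # lines 2, 4, 6, 8, 10

        # Weave deinterlacing: slot the two fields back into their original rows.
        rebuilt = [None] * len(frame)
        rebuilt[0::2] = odd_field
        rebuilt[1::2] = even_field

        assert rebuilt == frame   # perfect, but only if nothing moved between fields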


  2. p stands for progressive and means that all the lines are drawn in one pass, while i stands for interlaced and means that the horizontal lines of the picture change at different times: lines 1, 3, 5, 7 change at one time and lines 2, 4, 6, 8, etc. change at the next, but it's so fast the human eye can barely tell. So technically the answer is p is better.

  3. P = progressive scan, which is a better refresh of the screen. I = interlaced, which means every other line gets refreshed, then the remaining lines get refreshed on its second pass. If the resolution is the same (1080i vs. 1080p), then p is the better choice without a doubt. Now if you're comparing 1080i to 720p, people will argue all day about which is better. 1080i will give you a better "still" image, whereas 720p (because the refresh is quicker) will give you a better picture when there is action involved. Both will be VERY close to the naked eye. Hope this helps!!

  4. i/p designates the scan method. Interlaced refresh draws half the lines in one scan and the other half in the next. Progressive refreshes all lines in order in a single scan. 1080i and 1080p are visually about the same, but progressive scan handles fast action with more clarity and smoother transitions between frames.

    The answer of the guy comparing 720p to 1080i or 1080p is completely wrong. 720p is still 720 lines - much lower resolution than 1080 - and 720p does not come close to 1080i in clarity, even with fast action. ESPN uses 720p because they're cheap, not because it's better than 1080i. Check out football games on NFL HD (1080i), for example, or other stations that carry 1080 - they're way better. I'm sick of hearing people claim that 720p looks better than 1080i. They can't tell the difference because they don't know they need glasses, or else their screens are small enough that there's no visible difference between 1080 and 720 - that could be the case under 35", but I can still tell with a 32" LCD.

    Anyway, 1080p is better than 1080i, and 1080i is a lot better than 720p.
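
    For what it's worth, the raw numbers behind those comparisons are easy to check (a quick Python calculation; 60 Hz is the typical US broadcast rate here, and 1080i's 60 fields per second amount to 30 complete frames per second):

        # Pixels per frame and per second for the common HD broadcast formats.
        formats = {
            "720p60":  (1280,  720, 60),   # width, height, full frames per second
            "1080i60": (1920, 1080, 30),   # 60 fields/s = 30 complete frames/s
            "1080p60": (1920, 1080, 60),
        }

        for name, (w, h, fps) in formats.items():
            print(f"{name}: {w * h:,} px/frame, {w * h * fps:,} px/s")

        # 720p60:  921,600 px/frame,  55,296,000 px/s
        # 1080i60: 2,073,600 px/frame, 62,208,000 px/s
        # 1080p60: 2,073,600 px/frame, 124,416,000 px/s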

  5. Before answering let me specify that I am assuming we are talking about 1080i and 1080p video fed to a digital 1080p display.

    In theory "p" is better than "i", but whether a difference will be visible in real life isn't easy to predict. Let's examine some of the issues.

    First, "p" means progressive scan ... the picture is drawn one horizontal line (or line of pixels) after another until the full frame is filled, at a speed defined by the frame rate (typically 24 frames/second for movies).

    On the other hand "i" means interlaced. Each frame is drawn as two separate fields each containing half the lines of the full frame. The first field consists of the odd numbered lines, again drawn sequentially, and then the second field consists of the even lines. The two fields are combined in memory and displayed ... all within the time specified by the frame rate.

    Why is this done? Interlaced scan video uses half the bandwidth of progressive scan video.
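
    In numbers (a back-of-the-envelope check in Python, assuming 60 fields per second for "i" against 60 full frames per second for "p"):

        # One 1080i field carries half the rows of a full 1080p frame,
        # so at the same 60 Hz rate the pixel throughput is exactly halved.
        field_px = 1920 * 540       # one interlaced field (every other row)
        frame_px = 1920 * 1080      # one full progressive frame

        print(field_px * 60)        # 62,208,000 pixels/s for 1080i (60 fields/s)
        print(frame_px * 60)        # 124,416,000 pixels/s for 1080p (60 frames/s)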

    What is the disadvantage? Because there is a slight time lag between when the two fields of a frame are recorded, any motion during the lag results in a degree of mismatch (jaggies) between the two fields. Depending on how well the image is deinterlaced (digital displays can only show progressive scan video, so interlaced video must be deinterlaced before display), the jaggies can be more or less successfully avoided.

    Done poorly, deinterlacing gives a picture inferior to progressive scan, particularly for programs with fast motion, such as sports. Done well, the results from 1080i and 1080p video are virtually indistinguishable.
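
    To make that trade-off concrete, here is a toy sketch (hypothetical Python; each "row" is a single number standing in for a whole line of pixels) of two classic deinterlacing approaches: "weave", which is perfect on still images but combs on motion, and "bob", which avoids combing at the cost of vertical detail:

        # An object brightens between the two fields, so the even rows come
        # from a slightly later moment than the odd rows.
        odd_field  = [10, 10, 10]   # rows 1, 3, 5 captured at time t
        even_field = [12, 12, 12]   # rows 2, 4, 6 captured 1/60 s later (moved!)

        # Weave: interleave the fields as-is. Still areas are perfect, but the
        # time offset shows up as a comb/jaggy pattern on anything that moved.
        weave = [None] * 6
        weave[0::2] = odd_field
        weave[1::2] = even_field
        print("weave:", weave)      # [10, 12, 10, 12, 10, 12]  <- combing

        # Bob: build the frame from one field only, filling the missing rows
        # by averaging their neighbours. No combing, half the vertical detail.
        bob = []
        for i, row in enumerate(odd_field):
            bob.append(row)
            nxt = odd_field[i + 1] if i + 1 < len(odd_field) else row
            bob.append((row + nxt) / 2)
        print("bob:  ", bob)        # [10, 10.0, 10, 10.0, 10, 10.0]  <- smooth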

    Unfortunately, good deinterlacing is by no means a given. In a survey done in 2006, fewer than half of the 1080p HDTVs tested deinterlaced 1080i video "correctly".

    It's probably safe to say that 1080p is unlikely to give worse results than 1080i, although whether the results will be distinguishable depends on the video processing involved (either internal or external to the display).

    For information on what happens with 720p and 1080i HDTV (the two choices for broadcast HDTV) and the differences between displaying on a 1080p and 720p display see the answer to a similar question at the link.

    Hope this helps clear things up for you.

