Question:

1080i or 720p?



I have an LG LCD HDTV and an HDMI-compatible DVR. What is the best resolution setting on my DVR, 1080i or 720p? Is there a major difference?


5 ANSWERS


  1. 1080i is better than 720p as long as your television is a 1080i set. Now 1080p is the best you can get.


  2. Wow, it amazes me how much misinformation there is in the responses. First, progressive (p) is MUCH better than interlaced (i) for fast-paced video like sports and gaming, so 720p should look better for this than 1080i.

     Secondly, since nothing is even broadcast in 1080p other than some DirecTV HD movies, the highest native progressive resolution you're going to get is 720p anyway.

    weeder

  3. 1080i should usually be better for scenery

     720p should usually be better for action

    1080i draws half of a 1080 frame every 1/60th of a second so an entire picture is shown 30 times per second.

    720p draws the entire 720p frame every 1/60th of a second so an entire picture is shown 60 times per second.

     That's why FOX and ESPN, among a few other channels, broadcast in 720p: sports with a lot of action look better when the picture is updated twice as fast, even though each frame has less than half the resolution.

     1080 resolution means 1920x1080 pixels (2,073,600 pixels)

     720 resolution means 1280x720 pixels (921,600 pixels)

     720 has about 44% as many pixels as 1080, so broadcasters can afford to send the whole picture twice as often through the same size information pipe (bandwidth).
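     To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python (my own illustration, not anything from a broadcast spec):

     # Pixels in one full frame of each format
     full_1080 = 1920 * 1080          # 2,073,600 pixels
     full_720  = 1280 * 720           #   921,600 pixels

     print(full_720 / full_1080)      # ~0.44 -> 720 has about 44% of the pixels

     # Pixels sent per second, ignoring compression:
     # 1080i delivers half a frame 60 times a second = 30 full frames/second
     # 720p delivers a full frame 60 times a second
     print(full_1080 * 30)            # 62,208,000 pixels/second
     print(full_720 * 60)             # 55,296,000 pixels/second -> roughly the same pipe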

     Usually, the two halves of a 1080i picture come from a good camera or from film, so joining them shouldn't create any problems: your flat screen "de-interlaces," or weaves them together, to show 1080p at 30 frames per second. That's not always the case, though. With some cameras each half-image is captured at a different instant in time, and joining them creates combing (sometimes called "jaggies") or other motion artifacts. If the source video was shot progressively and only broken into interlaced fields for broadcast, then 1080i at 30 frames per second translates into 1080p at 30 frames per second with no problem IF your display is progressive. In other words, under certain conditions there is no significant difference between 1080p at 30 frames per second and 1080i at 30 frames per second (60 half-images per second).

     Now, 1080p at 60 frames per second would be a whole other ballgame. Unfortunately, no such programming is being broadcast.

     Also, unless you are using a CRT or rear-projection CRT TV, you most likely have a progressive set. LCDs, plasmas, and DLPs don't need to worry about persistence, which comes from phosphors fading after being stimulated by an electron beam. The pixels in an LCD, for example, are set and stay set until they are told to change, so sets like that are, by virtue of their design, essentially progressive. The whole reason early CRT sets were interlaced in the first place is that the phosphor glow would fade by the time the electron beam had scanned the whole screen and repositioned itself to scan again, so the designers drew every other line, built a full picture out of two halves, and left it to human physiology to blend them.
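     Here is a minimal sketch of that "weave" de-interlacing idea, using a toy list of scan lines (my own illustration; real de-interlacers are far more sophisticated). When both fields come from the same instant, weaving them back together recovers the original frame exactly:

     # A stand-in for one progressive 1080-line frame
     frame = [f"line {n}" for n in range(1080)]

     # 1080i transmission: the frame is split into two fields
     top_field    = frame[0::2]       # even-numbered lines
     bottom_field = frame[1::2]       # odd-numbered lines

     # "Weave" de-interlacing: interleave the fields back into one frame
     woven = [None] * len(frame)
     woven[0::2] = top_field
     woven[1::2] = bottom_field

     assert woven == frame            # nothing lost if both fields share one instant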

     Furthermore, the choice also depends on your screen size and your distance from the TV. You start to lose the benefit of the higher resolution as you move away from the screen, and the differences between the formats blur. For example, a 50" screen at about 10 feet has just maxed out the benefit of 720; the benefits of 1080 only become apparent as you move closer to the screen. If you have a 50" screen and sit more than 10 feet away, it hardly matters whether you have 720 or 1080. You won't notice the difference except for the frame rate (frames per second).
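     That 50-inch-at-10-feet figure can be sanity-checked with the common rule of thumb that the eye resolves roughly one arc-minute of detail (the one-arc-minute figure is my assumption, not something stated in this thread):

     import math

     ONE_ARC_MINUTE = math.radians(1 / 60)

     def max_useful_distance(diagonal_inches, lines):
         """Farthest distance (inches) at which one scan line of a 16:9
         screen still spans about one arc-minute of the viewer's vision."""
         height = diagonal_inches * 9 / math.hypot(16, 9)   # screen height
         line_size = height / lines                         # one scan line
         return line_size / math.tan(ONE_ARC_MINUTE)

     for lines in (720, 1080):
         print(lines, round(max_useful_distance(50, lines) / 12, 1), "feet")
     # 720  -> ~9.7 feet  (close to the 10-foot figure above)
     # 1080 -> ~6.5 feet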

     Most channels broadcast at 1080i, 30 frames per second (a.k.a. 60 fields per second). Some others (like FOX, ABC, and ESPN) broadcast at 720p, 60 frames per second.

     To further compound the question, motion pictures are filmed at 24 frames per second, while video is broadcast over the air and cable at 30 frames per second, and those two rates don't mesh. Since 24 times 5 divided by 4 is 30, every group of 4 film frames has to be stretched into 5 video frames; in practice this is done with "3:2 pulldown," which repeats fields from the surrounding frames to fill in the extra one. So for every 5 video frames you watch, one is padding. Most TVs know about the conversion and try to reverse it, at the expense of not showing each frame on screen for the same amount of time. Newer sets with variable refresh rates can show 24 frames natively; for example, 120 Hz LCD sets can show both video and film at their original rates, because 30 and 24 both divide evenly into 120. Blu-ray movies are stored on their discs at 1080p, 24 frames per second, and the player may output 24 or 30 frames per second depending on how you set it.
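     A small sketch of that 24-to-30 conversion and the 120 Hz escape hatch (a deliberate simplification that repeats whole frames, whereas real 3:2 pulldown repeats fields):

     # One second of film: 24 frames that must fill 30 video frames
     film = [f"F{n}" for n in range(24)]

     video = []
     for i in range(0, len(film), 4):          # every group of 4 film frames...
         group = film[i:i + 4]
         video.extend(group)
         video.append(group[-1] + " (repeat)") # ...gains one padded frame, making 5

     print(len(video))                         # 30

     # Why 120 Hz panels sidestep the problem: both rates divide 120 evenly,
     # so each source frame is simply shown a whole number of times.
     print(120 % 24, 120 % 30)                 # 0 0  (film 5x each, video 4x each)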

  4. The difference between the two is the number of lines of vertical resolution. 1080 means 1080 lines, and the i means interlaced: the odd and even lines are drawn in two alternating passes and woven together like a weave. 720 means there are 720 lines, and the p stands for progressive, which means every line of the frame is drawn in a single pass. More lines isn't automatically better, though: 1080i carries more detail for still shots, but 720p holds up better for fast-paced action like sports.

  5. 1080i would be the better choice if you have an HDTV that is 16:9 and can accept a 1080i signal. There isn't that much of a difference, but 1080i is better than 720p.

