sdc100 wrote:That really depends on the size of the screen. The larger the screen, the more noticeable the difference. For a 32" monitor, most people wouldn't be able to tell the difference between the two at about 5' away (the minimum recommended viewing distance). Keep in mind that 720p TVs can display 1080i, so the resolution is equivalent. Progressive scanning (the 'p') is only important in moving images because it allows a smoother picture. So if you mostly look at static images, i.e. text or webpages, or interlaced (the 'i') sources, then 1080p is unnecessary. Also note that currently, no TV stations are broadcasting in 1080p. They're either 720p or 1080i. 1080p is used mostly in Blu-Ray discs and action video games. Regardless, I wouldn't recommend anything beyond 27" for a computer monitor where you'll read a lot of text.
Regarding 1080i and 720p being equivalent resolutions, I respectfully disagree. 720 has a resolution of either 1280x720 or 1366x768 (921,600 or 1,049,088 total pixels) depending on the screen, while 1080 has a resolution of 1920x1080 (2,073,600 total pixels).
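To make that pixel math concrete, here's a quick sketch (Python, the labels are just mine) that works out the totals above:

```python
# Pixel counts for the common HD rasters mentioned above.
resolutions = {
    "720 HD (1280x720)": (1280, 720),
    "720 HD (1366x768)": (1366, 768),
    "Full HD (1920x1080)": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} total pixels")

# How much bigger full HD is than the smaller 720 variant:
ratio = (1920 * 1080) / (1280 * 720)
print(f"Full HD has {ratio:.2f}x the pixels of 1280x720")  # 2.25x
```

So "more than double" is exactly 2.25x against 1280x720, and just under 2x against 1366x768.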
I'm going to assume that you know the differences, but I feel compelled by my Adderall to explain this in detail for anyone else who might be interested.
Full HD has more than double the total number of pixels of 720 HD, which means the computer can display more than double the amount of information. In practice, that means more icons, windows, or text can fit onto the screen at once.
The "p" and "i" have to do with the video signal. "P" means progressive-scan, meaning that the ENTIRE image is sent at once to the display. If a signal is 1080 HD 30p, a complete 1080 HD image is sent to the TV every 1/30th of a second. If the signal were instead 720 HD 24p, then a complete 720 HD image would be sent to the screen every 1/24th of a second.
The "i" in 1080i stands for "interlaced". Interlaced videos send their images in halves. A full HD image is 1920 pixels wide and 1080 pixels tall. What 1080i interlacing does is send images that are 1920 pixels wide but only 540 pixels tall. Interlaced HD signals are usually 1080 @ 60i, which means they send half of an HD image every 1/60th of a second. The image is split into alternating rows of pixels: the odd-numbered rows are sent first and the even-numbered rows second. Once they reach the display, they're put back together into one image again.
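That odd/even split and reassembly can be sketched in a few lines (Python; a "frame" here is just a list of row numbers, 1 through 1080):

```python
# A full HD frame has 1080 rows, numbered 1..1080.
frame = list(range(1, 1081))

# An interlaced signal sends the odd-numbered rows as one field
# and the even-numbered rows as the next field, 1/60 s later.
odd_field = frame[0::2]   # rows 1, 3, 5, ... (540 rows)
even_field = frame[1::2]  # rows 2, 4, 6, ... (540 rows)

# The display weaves the two fields back into one 1080-row image.
woven = [None] * len(frame)
woven[0::2] = odd_field
woven[1::2] = even_field

assert woven == frame        # the fields reassemble the full frame
assert len(odd_field) == 540  # each field is only 1920x540
```

Each field is only 540 rows tall, which is why a 1080i half-image fits in roughly the same bandwidth as a smaller progressive frame.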
Since progressive videos usually run at 30 frames-per-second and interlaced videos usually run at 60 half-frames-per-second, the assembled image on the display refreshes at the exact same rate for 720p and 1080i. The difference is that, for static images, 1080i has more than double 720p's resolution.
Interlaced images are considered inferior to progressive signals when movement is involved, though, because of the way they work. Since the two halves of an interlaced image arrive at the display 1/60th of a second apart from each other, they sometimes don't line up perfectly. If something on-screen is moving quickly, it will appear artificially stretched or oddly blurred on the TV, while the same video sent with a progressive signal won't suffer from any artificial blurring like that.
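Here's a minimal simulation of that misalignment (the artifact is often called "combing"; the grid size and block speed below are made up for illustration). It captures two fields of a moving block 1/60th of a second apart, weaves them, and prints the result:

```python
WIDTH, HEIGHT = 16, 8

def render(obj_x):
    """Render a frame: a 2-pixel-wide block at column obj_x on every row."""
    return [
        ["#" if obj_x <= x < obj_x + 2 else "." for x in range(WIDTH)]
        for _ in range(HEIGHT)
    ]

# Progressive: the whole frame is captured at a single instant.
progressive = render(obj_x=4)

# Interlaced: the odd rows are captured first, the even rows 1/60 s
# later -- by which time the block has moved 3 columns to the right.
field_a = render(obj_x=4)  # block at column 4
field_b = render(obj_x=7)  # block at column 7, one field later
woven = [
    field_a[row] if row % 2 == 0 else field_b[row]
    for row in range(HEIGHT)
]

# Alternating rows now disagree about where the block is -- the
# comb-tooth edges you see on fast motion in interlaced video.
for row in woven:
    print("".join(row))
```

In the printed output, the block's edge zig-zags row by row, while every row of the progressive frame agrees; that zig-zag is exactly the stretching/blurring described above.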