rc529 wrote: You guys. Seriously. Get with the HD trend, and know the facts. "720" and "1080" are the widths of the resolution in pixels. So, YES, there IS a difference between the two. Mathematically, 1080p is [more than] TWICE the resolution of 720p.
Nobody said otherwise. (For the record, though, 720 and 1080 are the vertical pixel counts -- the number of horizontal lines -- rather than the widths; the full resolutions are 1280x720 and 1920x1080.) What the OP did post was the frequently-asserted opinion that the human eye (and/or brain) can't discern the difference between 720p and higher resolutions on a smaller set, with 32" being given as the rough dividing line for a "smaller" set. You appear to have read this as a claim that there is no physical, objective difference between resolutions that are physically, objectively different, which is not what was written.
rc529 wrote: I've upgraded laptops this year, and I went from a 1280x768 (better than 720p) to my current 1920x1080 (1080p) resolution screen, and the difference is INCREDIBLE. (well, technically, two and a quarter times better.)
Now you are expressing an opinion that differs from the OP's, which is great -- and in fact I am particularly glad to read it, because I have never believed the 32"/720p threshold claim, but I don't have equipment that is similar enough (and, at the same time, different enough in the right way) to make my own comparison.
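For anyone who would rather check the pixel arithmetic than take either of our words for it, here is a quick back-of-the-envelope sketch in Python (nothing more than multiplying out the advertised dimensions):

```python
# Plain pixel-count arithmetic for the resolutions mentioned above.
res_720p   = 1280 * 720     # 921,600 pixels
res_laptop = 1280 * 768     # 983,040 pixels (the old laptop panel)
res_1080p  = 1920 * 1080    # 2,073,600 pixels

print("1080p vs 720p     :", res_1080p / res_720p)               # 2.25
print("1080p vs 1280x768 :", round(res_1080p / res_laptop, 2))   # ~2.11
```

(Which means, incidentally, that the "two and a quarter times" figure is exact for true 720p; the jump from a 1280x768 panel is closer to 2.1x.)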
rc529 wrote: P.S. the "p" stands for progressive and the "i" stands for interlaced. Progressive is better. Trust me, it just is.
Ah, trust -- there's so little of it about any more. And on the internet, that's a good thing. To add a bit more detail to your differentiation of Interlaced vs. Progressive:
Interlaced means that only every other horizontal scan line of the picture (one "field") is refreshed during a single refresh cycle, so it takes two refresh cycles -- two fields -- to repaint the entire screen (one full frame).
Non-interlaced, or "Progressive" as it was rebranded by the TV industry when this then-years-old computer-monitor technology was incorporated into the sets of the day (so that it sounded like something new when it was not), means that every scan line is refreshed during every refresh cycle, so the full picture is repainted twice as often.
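If it helps to see that spelled out, here is a toy sketch in Python (the 8-line display is made up purely for illustration -- real panels have 720 or 1080 lines -- and 60Hz is just the old broadcast refresh rate mentioned below):

```python
# Toy illustration: which scan lines get repainted on each refresh cycle,
# for a hypothetical 8-line display refreshed at 60Hz.
LINES = 8
REFRESH_HZ = 60

def lines_repainted(cycle, interlaced):
    """Return the scan lines repainted during one refresh cycle."""
    if not interlaced:
        return list(range(LINES))               # progressive: every line, every cycle
    return list(range(cycle % 2, LINES, 2))     # interlaced: odd/even lines on alternating cycles

for cycle in (0, 1):
    print(f"cycle {cycle}: interlaced {lines_repainted(cycle, True)}, "
          f"progressive {lines_repainted(cycle, False)}")

# Interlaced needs two cycles (two fields) to cover every line, so its
# full-frame rate is half the refresh rate; progressive covers it in one.
print("interlaced :", REFRESH_HZ // 2, "full frames per second")   # 30
print("progressive:", REFRESH_HZ, "full frames per second")        # 60
```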
At the old TV standard of a 60Hz refresh rate, that means a standard interlaced display repaints the full picture at an effective 30Hz, while a non-interlaced display manages 60Hz. (Or do you prefer units of fps?) That jump from 30 to 60 frames per second is perceived by the human eye/brain as a smoother, less flickery image that is also less susceptible to jitter. I also find that interlaced TVs are easier to fall asleep in front of, which is why I would not buy one of these units (even if they did not appear to be the utter chunks of junk that all reviews paint them to be) for the bedroom.