As far as the topic of this thread goes, I own LCD TVs (two of them, to be precise), but I don't really prefer either technology. For the moment, merely as a stop-gap, I find LCD the easier compromise to live with: it's more energy efficient, runs cooler, and is less finicky than its plasma rival. The picture quality is also reasonable, assuming you can afford a higher-quality model.
Plasma is more of an evolution of CRT, and it does have better picture quality potential, dollar for dollar. You can get LCDs with exceptional image quality, but they'll cost a good deal more than an average plasma. A full-array LED LCD (not merely edge-lit LED; this distinction is incredibly important, but it's rarely explained properly at big-box stores and even more rarely understood by consumers) gives very good image quality, but it still costs an arm and a leg. With most households shopping for sub-$1000 flat panels, few people are going to pay much attention to $2000-$3500 models whose screens are no bigger than the edge-lit equivalents.
I'm looking toward OLED TVs in the future, which should put both existing display technologies to shame. We've already seen some small-scale use of this technology in high-end cellphones and other mobile devices, but until very recently it wasn't financially feasible to produce larger displays due to failure rates during manufacturing and other cost issues. OLED combines various advantages of both LCD and plasma, and doesn't suffer from many of their weaknesses.
OLEDs require no backlighting (perhaps the biggest bane of LCD tech) since, like plasma, each individual pixel is its own light source. They don't run hot, and they shouldn't be prone to burn-in the way plasmas can be. Color reproduction and black levels are also magnificent, another strength carried over from the plasma camp. They're also supposed to be extremely energy efficient, and they can (literally) be paper-thin.
Initially, of course, the technology will just be used to create even thinner (and much lighter, with no backlight apparatus to worry about) "normal" TVs; in the future, though, it could be used to create "rollable" displays. Imagine a tablet you could roll up like a sheet of laminated paper, or a large transparent display so thin you could attach it to the wall as if it were wallpaper and never know it was there until it was turned on. Or affix it to a window, wirelessly connect it to a computer, and recreate the sort of windows Tony Stark had in his bedroom in the 'Iron Man' flicks. OLED technology has incredible potential.
LG showcased their first OLED television at this year's CES: a 55" model that'll go on sale sometime in Q3 or Q4 of this year. The pricing will be prohibitive (nobody's sure yet, but probably in the $8,000-$10,000 range), but it'll eventually come down. The important thing is that once these become more affordable, perhaps in late 2013 or 2014/15, they'll finally start the process of putting plasma and LCD both to bed.
dvdjunkie wrote: Wait a minute!!! 24fps???? What are thinking. 24 Frames Per Second is the rate of speed that motion picture film travels through the aperture of the projector.
You want to make sure than you have your Blu-ray set for 1080p, and also 16x9. I own an LG 55" HDTV Monitor with an Insignia Blu-ray player, and have never heard of what you are trying to describe.
Most modern theatrical content on a Blu-ray is stored at 24fps (or 23.976fps, really, though the difference is imperceptible). Although no consumer LCD televisions actually operate at that refresh rate, they all have a method of handling it when a Blu-ray player sends it to them. An ordinary, cheapo 60Hz set uses 3:2 pulldown, which shows every odd frame for three refreshes and every even frame for two (hence the 3:2 ratio) to bridge the gap between 24 frames per second and a 60Hz refresh rate. The result of that uneven frame duplication can be a little distracting and jerky, usually referred to as "judder". Since the OP didn't opt for a 120Hz TV when trying out an LCD (correctly shunning motion interpolation, but not realizing that 120Hz has other advantages), maybe this was part of the problem he had with its handling of motion.
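To make the arithmetic concrete, here's a minimal Python sketch (purely illustrative; the function name and the odd/even convention are my own, not anything a real TV actually runs) of how 24 film frames get stretched across 60 refreshes:

```python
def three_two_pulldown(frames):
    """Expand one second of 24fps film to 60 refreshes.

    Alternating frames are shown for 3 refreshes and then 2, so the
    24 source frames fill 24 * 2.5 = 60 slots -- but unevenly.
    """
    output = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        output.extend([frame] * repeats)
    return output

refreshes = three_two_pulldown(list(range(24)))
print(len(refreshes))   # 60 -> fills a 60Hz panel, but with a 3-2-3-2 stutter
```

Those uneven 3-2-3-2 hold times are exactly what registers as judder on slow pans.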
A 120Hz TV, with the motion interpolation feature (cause of the notorious "soap opera effect") turned off, allows for an even frame cadence using 5:5 pulldown: every frame is simply shown for five refreshes (24 x 5 = 120), preserving the original 24fps timing. Not all 120Hz TVs are programmed for 5:5 pulldown, however; it's something to check before buying a particular model.
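A similar toy sketch (again just illustrative Python, not anything a TV actually runs) shows why the 120Hz case comes out even:

```python
def five_five_pulldown(frames):
    """Expand one second of 24fps film to 120 refreshes.

    Every source frame is shown for exactly 5 refreshes (24 * 5 = 120),
    so the cadence stays perfectly even and the 24fps timing is preserved.
    """
    output = []
    for frame in frames:
        output.extend([frame] * 5)
    return output

refreshes = five_five_pulldown(list(range(24)))
print(len(refreshes))   # 120 -> exactly fills a 120Hz panel, no judder
```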
Most Blu-ray players have a setting to output native 1080/24p when that's what's on the disc, which works with your television's pulldown method to create a more "theatre-esque" viewing experience, or so the industry claims.
dvdjunkie wrote: ajmrowland wrote:
also be wary of artificial speedup settings
Just what do you mean by that? I have never heard of a Blu-ray player that has these settings.
Sounds like he's referring to motion interpolation, which I mentioned above. It's a television setting, not a Blu-ray player setting. No TV will actually call it "motion interpolation", however; every manufacturer has its own branding for the technique (Sony = MotionFlow, LG = TruMotion, Samsung = Auto Motion Plus, and so on). But Samsung is definitely not the only brand that has it; they all do.
Essentially, on 120Hz or 240Hz LCD TVs the feature uses a predictive algorithm that examines the frames of a video and creates new frames to go in between them. These TVs refresh either 120 or 240 times each second, and LCDs are fixed-refresh-rate displays; unlike CRT monitors, they can't vary their refresh rate, so a 120Hz panel must refresh 120 times each second. But a film only gives it 24 stills per second to work with, so to fill the gaps the television either repeats certain frames (the aforementioned 3:2 pulldown, 5:5 pulldown, etc.) or uses these algorithms to create new intermediate frames based on the existing ones.
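Just to illustrate the concept (and only the concept: the function names below are mine, and the blend is a naive cross-fade, whereas real sets use motion-compensated estimation that tracks where objects move between frames), here's a rough Python/NumPy sketch of inflating 24 real frames into 120 output frames:

```python
import numpy as np

def interpolate_to_120hz(real_frames):
    """For each consecutive pair of real frames, emit the first real frame
    followed by four synthetic in-betweens -- 5 outputs per source frame,
    i.e. 24 * 5 = 120 per second in a continuous stream.

    The weighted blend here is a crude stand-in for the motion-compensated
    interpolation a real TV's DSP performs.
    """
    output = []
    for a, b in zip(real_frames, real_frames[1:]):
        a32, b32 = a.astype(np.float32), b.astype(np.float32)
        for step in range(5):                 # step 0 is the real frame itself
            weight = step / 5.0
            output.append((a32 * (1 - weight) + b32 * weight).astype(np.uint8))
    return output

# 25 tiny toy frames: one second of "film" plus the first frame of the next
frames = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(25)]
print(len(interpolate_to_120hz(frames)))      # 120, of which only 24 are real
```

Note that 96 of those 120 output frames never existed in the source, which is the point I'm getting at below.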
It's supposed to give motion a "smoother" look, but IMO the algorithms also give the image a very artificial look; it's usually very distracting and makes the film look as if it were shot on cheap video, hence the name "soap opera effect". It's almost as if it triggers some subconscious recognition of artifice in the motion of what's happening on screen, and maybe that's exactly what it is... the DSP is only consumer-tier, after all, and how good is anything consumer-level really going to be? You end up with more of those predicted frames than real ones. Even if it's almost right, almost isn't enough when it comes to natural motion. Something is either natural or artificial, and the human brain is pretty good at telling the difference.
About the only content I can see benefiting from motion interpolation is fast-paced sporting events, which have no particular cinematic atmosphere to preserve.