Plasma, LCD, OLED, LED: best TV for next gen

I've mentioned this in this thread before, but for people who are looking at pros and cons:

I bought an ST50 on the recommendation of threads like these. The dithering was the first thing I noticed when I turned it on. I honestly thought it was defective. After hearing from people that this was intended behavior, I resolved to get over it. Some people claim to not notice it--I can't imagine how not.

I had severe image retention, even after a week of non-stop break-in. Any high-contrast static image would remain on screen for over an hour.

Bright scenes would buzz audibly.

I'm not saying nobody should buy plasmas, but as with all things GAF and the internet proper, a dominant point of view tends to get adopted and perpetuated absent any dissenting opinion. Just voicing that dissenting opinion.
If my experience with plasmas had been like that, I wouldn't think much of them either.

You were unlucky; that sounds like a lazy panel, and you seemingly didn't know what to expect, which isn't a crime by any means, but... you're also drawing a lot of conclusions from having owned one plasma set. I've owned five.

Dithering is not really noticeable from afar; moreover, it can be reduced by engaging cinema modes and the like. It's also how the TV renders gradation, so you can be sure that a VT60 with 30,720 steps of gradation instead of 12,288 will get better results. Game mode is one of the modes where it's more evident, and the good news is that the TV is snappy enough that you don't have to use it if it really bothers you. That said, noticing the difference at an appropriate distance, without having both TVs side by side, is not all that easy.

A panel with persistent IR happens, but it's a defect (never happened to me). As for buzzing, my VT60 is the least buzzy plasma I've ever had. Were you using stock settings? Because another thing a lot of people do, and shouldn't, is get home with a plasma and plaster a configuration with mad contrast and brightness all over it. Panels need to break in on natural settings before that. You probably did have a lazy panel, I'm not implying otherwise, but in a lot of cases it's not so much the panel as it is people not knowing the tech at all.

Perhaps you were sitting too close, though, which is what seeing the dithering and hearing the buzz a little too much could mean (or you could be very sensitive to it, which happens). Not trying to be a douche here, but the dithering is a big telltale sign in itself; it's really not a problem when sitting at an appropriate distance.
My 2010 plasma has a better picture than my parents' high-end Philips LED LCD from last year. The colors look more natural; the LCD is very flashy. The negatives of LCD are worse than the negatives of plasma to me. Plasma's downsides (lifespan, higher power use) don't impact my viewing experience, unlike LCD's downsides, which are tied to its image quality: bad viewing angles, washed-out colors, motion blur, bad contrast. I'd buy a plasma again if mine died.
Projected lifespan of current plasmas is actually higher than that of LCDs... 100,000 hours to half brightness is no walk in the park.

Of course, the real world might be a very different story, because power supplies give up more easily on plasmas. LCDs are simply simpler in how they work, so things like panel faults (blobs and the like) are unlikely.

Oh, if that's a Panasonic you might be interested in resetting its hour counter; it's worth it, and it'll bring the black level back to factory levels, which would still be class-leading against most LCDs.
 
I thought I was having a very unlucky week of TV searching when two Best Buys would always have their W900A display units sold hours before I could inquire about them. I then decided to randomly drop by one of the Best Buys to check out other TVs when, lo and behold, the guy who bought their W900A had cancelled his order! I bought the thing right there on the spot and now have this beauty sitting in my room.


It came with everything and looked to be in perfect condition. First TV I ever bought; hopefully it's as awesome as everyone is claiming.
 
Game mode is one of the modes where it's more evident, and the good news is that the TV is snappy enough that you don't have to use it if it really bothers you.

Tell me more about this. I'm getting a VT60 and I'd love to just leave game mode on all the time, it seems like a no-brainer to improve the screen's response time, but I'm not entirely aware of what issues I'll actually be introducing by turning it on.
 
Tell me more about this. I'm getting a VT60 and I'd love to just leave game mode on all the time, it seems like a no-brainer to improve the screen's response time, but I'm not entirely aware of what issues I'll actually be introducing by turning it on.

There's no reason to leave it in Game Mode for non-interactive content. It's really amazing that the Game Mode on my VT60 is almost indistinguishable from the calibrated Cinema mode, but the differences are there. In Cinema mode there's a Motion Smoother which improves motion resolution on low-framerate content like 24fps movies by introducing a dejudder effect, which is very nice. I'm not talking about the ugly-ass motion interpolation commonly seen in LCD HDTVs; it's just dejudder, which is possible because the VT60 supports a 96Hz refresh that frame-multiplies the 24fps movie cadence for better pans and also removes annoying flicker.

For watching movies and TV, I leave Game Mode off and Motion Smoother on Low. For games and PC use, I switch Game Mode on.
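The cadence arithmetic behind that 96Hz mode can be sketched in a few lines; this is just an illustration of pulldown math, not anything the TV actually computes:

```python
import math

# Rough sketch of why a 96Hz refresh suits 24fps film better than 60Hz:
# at 60Hz, 24fps needs 3:2 pulldown (uneven frame holds = judder), while
# at 96Hz every film frame is simply shown 4 times (even cadence).

def cadence(refresh_hz, film_fps=24):
    """Return the repeat pattern: how many refreshes each film frame is held."""
    ratio = refresh_hz / film_fps
    if ratio.is_integer():
        return [int(ratio)]                        # even: every frame held equally
    return [math.ceil(ratio), math.floor(ratio)]   # uneven: pulldown judder

print(cadence(60))  # [3, 2] -> 3:2 pulldown, uneven holds
print(cadence(96))  # [4]    -> each frame shown 4x, even
print(cadence(48))  # [2]    -> the 48Hz mode mentioned below
```

The even hold pattern is why pans look smoother at 48Hz or 96Hz than with 3:2 pulldown at 60Hz.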
 
The current Sharp 90" is two-year-old tech, unfortunately. Check out the UQ line. It's essentially a 1080p-plus set: it has more and smaller pixels than 1080p and upconverts content, or it can accept 4K and downconvert. It's not the sexiest TV out there, but if you want size and great PQ without killing the wallet, this is your best bet.

"without killing the wallet" "10 thousand dollars"

 
There's no reason to leave it in Game Mode for non-interactive content. It's really amazing that the Game Mode on my VT60 is almost indistinguishable from the calibrated Cinema mode, but the differences are there. In Cinema mode there's a Motion Smoother which improves motion resolution on low-framerate content like 24fps movies by introducing a dejudder effect, which is very nice. I'm not talking about the ugly-ass motion interpolation commonly seen in LCD HDTVs; it's just dejudder, which is possible because the VT60 supports a 96Hz refresh that frame-multiplies the 24fps movie cadence for better pans and also removes annoying flicker.

For watching movies and TV, I leave Game Mode off and Motion Smoother on Low. For games and PC use, I switch Game Mode on.

I use the 48Hz mode for Blu-rays on my current Panasonic plasma all the time, and definitely plan to use the 96Hz one on the VT. So this mode's not available with Game Mode on, eh? Certainly reason enough for me not to have it on for movies, at least. My understanding is that the "Motion Smoother" is an additional toggle on top of the 96Hz mode, though. I'm indeed wary of anything that reminds me of LCD interpolation modes; what does this do, if not interpolate? A cursory googling seems to imply it's exactly that, a frame interpolator.

I'm fine with losing the smoother in game mode, I plan to have all that stuff disabled anyway, but do I lose the 96hz option as well? Anything else disabled in game mode?
 
I use the 48Hz mode for Blu-rays on my current Panasonic plasma all the time, and definitely plan to use the 96Hz one on the VT. So this mode's not available with Game Mode on, eh? Certainly reason enough for me not to have it on for movies, at least. My understanding is that the "Motion Smoother" is an additional toggle on top of the 96Hz mode, though. I'm indeed wary of anything that reminds me of LCD interpolation modes; what does this do, if not interpolate? A cursory googling seems to imply it's exactly that, a frame interpolator.

I'm fine with losing the smoother in game mode, I plan to have all that stuff disabled anyway, but do I lose the 96hz option as well? Anything else disabled in game mode?

96Hz is disabled, Motion Smoother is disabled, and junk like Color Remaster and Vivid Color is disabled, which you'll never use anyway. As Game Modes go, the VT60 has a pretty amazing one; even 1080p Pure Direct can be enabled with it on.

The Motion Smoother has 3 levels (Weak, Moderate, Strong), and when left on Weak (Low) it doesn't do frame interpolation. If you set it to Moderate or Strong it will, so leave it on Weak or Off.
 
What's the advantage of using a higher multiple of 23/24Hz? Surely these are just duplicate frames?

Yeah, I don't get it either. 24 fps, if supported and configured correctly on the Blu-ray player and the TV, should be smooth, limited only by the quality of the Blu-ray itself.

My understanding of motion smoothing is that it's mostly best left off on many sets (I do this on my LG LED and Panny plasma)...

Unless the TV has a very fast refresh rate (240Hz and above; sometimes they mention 600Hz processing, etc.), where a black frame can be inserted, which gives smooth motion a bit like a CRT. Obviously, the higher the setting, the more black frames and the dimmer the picture.

With my Sony W905 I use the Clear setting for satellite (mid setting on motion/black frames), which is great for football etc., buttery... and Game mode for games (dat 8 ms),

and Blu-ray should look after itself if the 24Hz mode is implemented correctly.

Also, reading people's impressions of 4K sets: nice for movies, but I don't know how people can game on them (extract on Samsung's £3K set from HDTVTest):

If you’re a competitive twitch gamer who demands the most responsive display for playing first-person shooters (FPS) or arcade games, the Samsung UE55HU8500 is not for you – it appears that the intensive onboard video processing has taken its toll on input lag. The lowest input lag we managed to squeeze from the UHDTV was a less-than-impressive 73ms with our Leo Bodnar device connected to the HDMI2 port which was then renamed to [DVI PC]. Input lag in [Game] mode was 76ms; and in [Movie] mode was a painful 152ms.

To me that is unplayable for gaming.
 
"without killing the wallet" "10 thousand dollars"


You can find the 80UQ for sub-$5K. Anything else in that range is crazy expensive, yes. Speaking of which, as I spend some time calibrating it, I'm becoming more and more happy with my purchase.

Edit:
Holy hell, the Samsung has up to 152ms of input lag?!? That is ridiculous. What are they thinking?
 
96khz is disabled, Motion smoother is disabled, and junk like Color Remaster and Vivid Color are disabled which you'll never use anyways. As Game Modes go, the VT60 has a pretty amazing one, even 1080p Pure Direct can be enabled with it on.

The Motion smoother has 3 levels (Weak, Moderate, Strong) and when left on Weak (Low) it doesn't do the frame interpolation. If you set it to Moderate or Strong, it will do that so leave it on Weak or Off.

Cool beans. Do you have any sources regarding the smoother not doing interpolation on the lowest level? Not that I don't believe you; I'd just be interested to read more about it, and what exactly it IS doing. Most reviews are useless for shit like this, and just describe all three levels as doing the same thing.
 
Also, reading people's impressions of 4K sets: nice for movies, but I don't know how people can game on them (extract on Samsung's £3K set from HDTVTest):



To me that is unplayable for gaming.


The default modes are not good for gaming but with the input renamed to PC it feels fine to me, comparable to my Dell 30" monitor. I don't do any competitive gaming though.

That review is for the curved 4K set also, which might have more input lag as it has some additional video processing options over the flat HU8550. I assume these are disabled in Game/PC mode but not sure. Haven't seen any input lag measurements for the HU8550 yet.
 
Tell me more about this. I'm getting a VT60 and I'd love to just leave game mode on all the time, it seems like a no-brainer to improve the screen's response time, but I'm not entirely aware of what issues I'll actually be introducing by turning it on.
Game mode has a lot less processing going on, so dithering will be heavier and color quantization will be a little worse.

It's no biggie; motion and color accuracy are mostly all there, and the pixel response time is 6 ms, which is a big plus over existing LCDs. But it's basically the same deal as on most other TVs, really: "cinema" modes usually take 3 frames of processing (that's 50 ms extra), while game mode skips that altogether, meaning no intermediate results smoothing the image and all that jazz.

Thing is, video is motion: pause it and fast objects are blurred (each frame is an image in movement). In games that's not the case (they're discrete still frames instead), hence for a lot of games interpolated intermediate frames can actually help them feel smoother (though they introduce lots of other considerations too), and they're a no-go with game modes across the board. If a TV is snappy enough, not gaming in game mode, for those circumstances or simply for better image quality, can be an option.

But it's not a big problem, no. VT60 is a pretty darned good gaming TV in my book.

EDIT: what Unknown Soldier said.
 
Game mode has a lot less processing going on, so dithering will be heavier and color quantization will be a little worse.

It's no biggie; motion and color accuracy are mostly all there, and the pixel response time is 6 ms, which is a big plus over existing LCDs. But it's basically the same deal as on most other TVs, really: "cinema" modes usually take 3 frames of processing (that's 50 ms extra), while game mode skips that altogether, meaning no intermediate results smoothing the image and all that jazz.


No.

VT60:

Input lag (high-speed camera): 23 ms compared to a lag-free CRT
Leo Bodnar lag tester: 41.5 ms

http://www.hdtvtest.co.uk/news/panasonic-txp65vt65b-201306273079.htm

The VT65 is the VT60 in the UK/EU.

It's a good set, but don't mislead people. It's average in lag, as I see it.

Many Sony LEDs are 8 ms camera / 20 ms Leo Bodnar...
 
Quick question:

How much input lag difference between connecting:
Console>AV Receiver>TV
or
Console>TV
?

Depends on the receiver used and the settings on the receiver. In Onkyo's case you also have to use the sub out (HDMI out 2) to bypass the picture processor.
Generally, though, even with direct passthrough, my Denon X4000 adds about 8 ms of lag. Good enough for me to get high-quality audio.
 
Depends on the receiver used and the settings on the receiver. In Onkyo's case you also have to use the sub out (HDMI out 2) to bypass the picture processor.
Generally, though, even with direct passthrough, my Denon X4000 adds about 8 ms of lag. Good enough for me to get high-quality audio.

I have a Denon X500.
OK, so I'll try connecting the console's HDMI directly to my TV and the optical out to the X500. Thanks.
 
My 2010 plasma has a better picture than my parents' high-end Philips LED LCD from last year. The colors look more natural; the LCD is very flashy. The negatives of LCD are worse than the negatives of plasma to me. Plasma's downsides (lifespan, higher power use) don't impact my viewing experience, unlike LCD's downsides, which are tied to its image quality: bad viewing angles, washed-out colors, motion blur, bad contrast. I'd buy a plasma again if mine died.

Not if you want anything over 65". 4K is the future, and even at 65" you need to sit pretty close to enjoy 1080p to its fullest. That's why I ended up with LCD. Plasma is great when you can sit close, but when I moved into a house with a large living room it was no longer an option. If anything, plasma has more washed-out colors compared to the overblown, oversaturated style of LCD. Your negatives of LCD are mostly found in cheap LCD TVs. The real bad stuff with LCD is clouding, flashlighting, and halos; summed up in one word: inconsistency (lack of uniformity).
 
No.

VT60:

Input lag (high-speed camera): 23 ms compared to a lag-free CRT
Leo Bodnar lag tester: 41.5 ms

http://www.hdtvtest.co.uk/news/panasonic-txp65vt65b-201306273079.htm

The VT65 is the VT60 in the UK/EU.

It's a good set, but don't mislead people. It's average in lag, as I see it.

Many Sony LEDs are 8 ms camera / 20 ms Leo Bodnar...
I'm not misleading anyone. Before those Sony TVs, under 33 ms was a godsend (now, I agree, they're average, which makes 23 ms better than average), and plasmas, often at 16 ms, were class-leading.


And the VT60 is under those 33 ms by quite a margin; also, a TV with game mode IQ like the VT60's lagging only that much is awesome as fuck.

You guys are too anal about this input lag stuff and often sound like you don't really understand the inner workings of it. The lower it is the better, YES. But you're not Superman and you won't notice narrow differences at all! The issue is when the cumulative lag goes over a certain threshold, either 166 or 200 ms, reports vary. Either way, past a certain point you'll notice the game is lagged; before that happens, it's all good. The chances of 8 ms of lag being "oh, I'm in heaven" and 23 ms being "holy shit, this sucks" are pretty dim at best.

You don't notice 20 ms of difference in itself, only whether that difference tips the total into another category of lag, so you notice what you otherwise wouldn't; it's a case of determining which drop overflows the glass, and lower lag merely gives a little more leeway. This is playing make-believe, of course; the goal is that it stays believable, and something being too laggy is the moment you realize it's not.


The real issue is that consoles already have lag of their own, so you don't want to aggravate that any more than you have to.

Typical pipeline lag is 133 ms for 30 fps games and 66 ms for 60 fps games; do the math. 23 ms is nothing. For the US models it could be higher, according to a measurement a NeoGAF member did months ago, but I'm talking about the European version, and because this topic always escalates for reasons I find silly... I'm not trying to start a discussion. 23 ms is fine, and there's no reason to pay more for less (lag), especially because, as it stands, you won't get a better TV.
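The "do the math" point can be sketched as a simple lag budget. The numbers here (133 ms game pipeline, a 166 ms perceptibility threshold, ~8 ms controller) are the figures quoted in this thread, not measurements:

```python
# Hypothetical end-to-end lag budget for a 30fps console game, using the
# figures quoted in the thread. Display lag is only one term in the sum,
# so it mostly matters when it pushes the total over the threshold.

def total_lag_ms(game_pipeline_ms, display_ms, controller_ms=8):
    """Sum the lag chain: game rendering + display processing + controller."""
    return game_pipeline_ms + display_ms + controller_ms

THRESHOLD_MS = 166  # conservative end of the 166-200 ms range quoted above

for display in (8, 23, 41.5, 73):
    total = total_lag_ms(133, display)
    verdict = "over threshold" if total > THRESHOLD_MS else "ok"
    print(f"display {display:>5} ms -> total {total:>5.1f} ms ({verdict})")
```

Under these assumptions, 8 ms vs 23 ms of display lag both stay under the threshold on a 30fps game, while figures like 73 ms blow past it.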

My VT60 has a class-leading game mode, I stand by that; so much better than the ST60's, which really wasn't (a shame after the ST50 delivered so much on that front).
Quick question:

How much input lag difference between connecting:
Console>AV Receiver>TV
or
Console>TV
?
Depends.

If the AV receiver does anything to the image then it'll add lag; if it's just passthrough, the lag will be either zero or close to it.
 
And the VT60 is under those 33 ms by quite a margin; also, a TV with game mode IQ like the VT60's lagging only that much is awesome as fuck.

If the AV receiver does anything to the image, any kind of conversion or upscaling, then it'll add lag; if it's just passthrough, the lag will be either zero or close to it.

No, no it's not.

The camera method gives the difference between a CRT and the image, and the Leo Bodnar gives the lag relative to a known input signal.

So the Leo Bodnar gives the TOTAL lag, which is around 40 ms. That's OK, but average.
 
I bought an ST50 on the recommendation of threads like these. The dithering ... Bright scenes would buzz audibly.

I had the same experience. I didn't have any IR but I worried about it constantly and my wife was not happy with all the rules I placed on her (she will watch the same news channel for hours). Something else nobody talks about is line-bleed. It was pretty noticeable as well. At that point you start to wonder why nobody gripes about these things like they do about some of the other imperfections but the truth is that all TV technologies today have things that really suck about them and people are just choosing what they are willing to sacrifice and live with.

You are also drawing a lot of conclusion from having owned a plasma set. I owned five.

Perhaps you were sitting too close though, which is what seeing dithering and hearing the buzz a little too much could mean

I tried three different Panny ST50 plasmas. They were all the same. I will admit that the higher levels of gradation on the more expensive models would have helped with the dithering, but the buzz was what really killed it for me. I wasn't running it at full brightness; in fact, I set it really low, as everyone tells you to do for the first few weeks. It was very noticeable at normal seating distances (12-15 ft), but when I'm playing games I like to sit closer, maybe 6-8 ft away. I don't know if it's the shape of my room or if I've just got really sensitive hearing, but it was not tolerable for me. What sucks is that I recognize the benefits of plasma, but the buzz is a total deal-killer. I can only assume I'm either very unlucky or just have exceptional hearing, so when it comes time to pick a TV, plasma is simply not an option for me. All the praise and accolades mean absolutely nothing.

Quick question:

How much input lag difference between connecting:
Console>AV Receiver>TV
or
Console>TV
?

Well, it depends on the receiver. I've read that some receivers can add quite a bit of lag but I have an older Onkyo that doesn't add any as long as you set it to "pass-through" the video signals.
 
No, no it's not.

The camera method gives the difference between a CRT and the image, and the Leo Bodnar gives the lag relative to a known input signal.

So the Leo Bodnar gives the TOTAL lag, which is around 40 ms. That's OK, but average.
Funny you quoted HDTVTest review results and didn't quote their take on the Leo Bodnar and plasmas:

It’s Harsher On Plasma TVs

When we first got our hands on the Leo Bodnar device, we were surprised when we obtained (nearly) the same 48ms figure from a Panasonic ST50 PDP (plasma display panel) and a new Panasonic ET60 LED LCD (both running in their fastest Game mode). From our experience of playing a decent amount of first-person shooter games online, the Panasonic ST50 is a total joy to play on compared to the LCD. The former feels considerably smoother than the latter, but both are returning basically the same figure.

(...) An LCD-based display updates the screen from top to bottom, one line at a time, which means that a player's brain cannot make sense of a part of the image until it has been completely rendered. The LCD's top-to-bottom addressing can be seen with the Leo Bodnar lag tester: measuring the top patch tends to give a lower number than measuring the centre patch in our tests. However, on a PDP, the result is always the same on both patches.

Because plasma displays work by flashing the screen several times just to draw one video frame, on a PDP, an intermediate image doesn’t look half-drawn in the same way that it would on an LCD. Instead, it would have very low gradation (and brightness). In theory, this means that the player has a better chance of seeing the entire gameplay screen, albeit not at full quality, since the subfield drive throws out different steps of the dynamic range quickly just to draw one fully-gradated frame.

This is the key difference according to our theory. On the LCD, obviously our eyes can’t make sense of parts of the frame which haven’t been drawn yet (parts of the frame are either fully rendered or not), but on the plasma, we get extra temporal precision in the feedback loop, since we can see rough versions of the frames before they’re even fully drawn. And, in a fast-paced game, our brain doesn’t care if it’s seeing incomplete images – it should still be able to make out rough outlines and shapes.
The incomplete frames don’t necessarily even have to be coherent to our eyes. Even if we can detect the screen responding to our finger movements at all, it should be enough to make the game feel much more responsive.

In isolation, and for slow-paced games, this is all basically moot. But in a first-person shooter (even one which only runs at 30 frames per second) or racing game, etc, where the entire screen is moving and split-second decisions count, we think the PDP’s subfield drive helps tremendously in making the gameplay feel smooth. After all, in reality, playing fast-paced games is a continuous feedback loop between the player and the screen.

How does this explain why plasma televisions that feel much more responsive are shortchanged by the Leo Bodnar input lag tester which returns a higher figure? Well, we surmised that the flashing white bars need to hit a specific brightness threshold before they can be picked up by the device’s photosensor for lag time calculation: if you decrease or increase the on-screen luminance using the TV’s [Contrast] or [Backlight] control, the Leo Bodnar’s lag number should rise or drop correspondingly.
A plasma’s subframe, while not bright enough to trigger the photosensor, can readily be perceived by us in the sensorial feedback loop, thus accounting for the discrepancy between the displayed input lag figure and the actual responsiveness of a PDP. Ironically, the older stopwatch/camera method – though inconsistent – is capable of capturing subframes before they’re fully drawn (since it’s not limited by any luminance threshold, and the shutter speed is much higher than the panel refresh rate), and so more accurately reflects how responsive a PDP is. This is the reason why we continue to run both tests on most HDTVs we review despite the photo method being such a labour-intensive process.
Source: http://www.hdtvtest.co.uk/news/input-lag

When an ST50 lags 16 ms with the fast-camera method and is "class leading", and suddenly it's 54 ms on the Leo Bodnar, I'll lean towards them being right. One of the best gaming TVs ever didn't suddenly turn into a steaming pile of shit, no.

The Leo Bodnar is a sensor; it registers when the image reaches the luminance threshold it's looking for, that's it. Anything lower is simply ignored, which means the sensor isn't wrong but is possibly looking for the wrong things. Our eyes don't work like that, and hence, for plasmas, the fast-camera method is really more effective and representative.

Sorry, but I'm really not willing to have this discussion. Both figures are good to have, but a VT60 is not 40 ms, perceptually or in reality, I'd wager.
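The threshold theory in that quoted review can be toy-modeled in a few lines. The subfield timings and luminance values below are invented purely for illustration, not taken from any measurement:

```python
# Toy model of the HDTVTest theory: a plasma draws one frame as several dim
# subfields. A viewer can react to the first faint rough image, but a
# photosensor with a luminance threshold only fires near the end of the
# sequence, so the tester reports more lag than the viewer experiences.

# (time_into_frame_ms, cumulative_luminance 0..1) -- illustrative numbers only
plasma_subfields = [(2, 0.05), (6, 0.15), (10, 0.35), (14, 0.70), (16, 1.00)]

SENSOR_THRESHOLD = 0.60  # lag tester's photosensor needs a bright patch
HUMAN_THRESHOLD = 0.05   # a viewer can use a faint rough image

def first_detection_ms(fields, threshold):
    """Time at which cumulative luminance first reaches the threshold."""
    for t_ms, luminance in fields:
        if luminance >= threshold:
            return t_ms
    return None

print("sensor fires at:", first_detection_ms(plasma_subfields, SENSOR_THRESHOLD), "ms")
print("viewer reacts at:", first_detection_ms(plasma_subfields, HUMAN_THRESHOLD), "ms")
```

Same panel, same frame, but the two thresholds "see" it many milliseconds apart, which is the claimed source of the camera-vs-Bodnar discrepancy on plasmas.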
Wish I could go that big; the TV is for a small guest bedroom.
A shame, because at ten feet away I really could use it.
 
Typical pipeline lag is 133 ms for 30 fps games and 66 ms for 60 fps games; do the math. 23 ms is nothing. For the US models it could be higher, according to a measurement a NeoGAF member did months ago, but I'm talking about the European version, and because this topic always escalates for reasons I find silly... I'm not trying to start a discussion. 23 ms is fine, and there's no reason to pay more for less (lag), especially because, as it stands, you won't get a better TV.


You add up the lag, I agree: ping to and from the host, host response time, controller response, game FPS, and screen lag.
 
I had the same experience. I didn't have any IR but I worried about it constantly and my wife was not happy with all the rules I placed on her (she will watch the same news channel for hours). Something else nobody talks about is line-bleed. It was pretty noticeable as well. At that point you start to wonder why nobody gripes about these things like they do about some of the other imperfections but the truth is that all TV technologies today have things that really suck about them and people are just choosing what they are willing to sacrifice and live with.
Plasmas straight out of the box can be bitches with IR.

I have one whose break-in period could be summed up in one word: abuse. It had IR, of course, massive IR. I was playing games non-stop and managed to pick the worst games in the world for breaking in a TV. I mean, a 4:3 video game with proper aspect ratio and black bars is not clever, and it being a 2-CD RPG was poorer judgement still. Then I pulled out Tales of Vesperia: bright as fuck, with huge letters and HUD elements that stay in the same place 80% of the gameplay.

Sometime after the 400-to-500-hour mark it simply went away. I was still playing, but the IR would just fade; it even cleaned up the IR left over from weeks before that wouldn't budge no matter what. It turned into another panel altogether, really. Before that, if I tried cleaning it, it would alleviate, and that's how I knew it wasn't burn-in, but damn, I'd have spent hundreds of hours at it if I'd wanted it gone.

If that had been my first plasma I would have had cold sweats; it wasn't, so I didn't. But I don't recommend that kind of break-in, because it's hard to go through with it, and had I stopped and tried to clean it I'd only have been left aggravated by how long it was taking to go away. No two panels are the same, of course; I can't say I was ever unlucky, though some people have been (which can be a true statement for any electronic appliance).

A lot of people go that route with break-ins unknowingly, though, because they buy a plasma for quality, read too much, get home, and splash a pro configuration meant for a bright room onto their new TV, complete with pixel orbiter off. Stress test out of the box, confirmed.

The break-in period is not a walk in the park, even if the panel is resilient, but it's simply worth it once the panel settles in; if it doesn't, it's defective and you have to complain or return it.
I tried three different Panny ST50 plasmas. They were all the same. I will admit that the higher levels of gradation on the more expensive models would have helped with the dithering, but the buzz was what really killed it for me. I wasn't running it at full brightness; in fact, I set it really low, as everyone tells you to do for the first few weeks. It was very noticeable at normal seating distances (12-15 ft), but when I'm playing games I like to sit closer, maybe 6-8 ft away. I don't know if it's the shape of my room or if I've just got really sensitive hearing, but it was not tolerable for me. What sucks is that I recognize the benefits of plasma, but the buzz is a total deal-killer. I can only assume I'm either very unlucky or just have exceptional hearing, so when it comes time to pick a TV, plasma is simply not an option for me. All the praise and accolades mean absolutely nothing.
I remember talking to you a while back and realizing you're one of the very few who tried everything regarding buzz, like some sound insulation behind the TV, and still found it an issue.

You probably have really sensitive hearing that picks up that kind of frequency. The other huge deal-breaker with plasma is phosphor trails; people who are sensitive to both (or either) have to look elsewhere, I agree. But a lot of the buzz drama is unwarranted too; it's not a majority... I mean, I can hear buzz, I know what it is, but it's not a problem. A lot of people can live with the buzz fine; they just decide they don't want to, or never make an effort to improve it. Not your case, might I stress.
 
I bought the thing right there on the spot and now have this beauty sitting in my room.


It came with everything and looked to be in perfect condition. First TV I ever bought; hopefully it's as awesome as everyone is claiming.

Kudos on your purchase! I got lucky with a floor model recently, too. I'm loving my new baby. I hope you enjoy yours as much!
 
I thought I was having a very unlucky week of TV searching when two Best Buys would always have their W900A display units sold hours before I could inquire about them. I then decided to randomly drop by one of the Best Buys to check out other TVs when, lo and behold, the guy who bought their W900A had cancelled his order! I bought the thing right there on the spot and now have this beauty sitting in my room.


It came with everything and looked to be in perfect condition. First TV I ever bought; hopefully it's as awesome as everyone is claiming.
Congrats on the best gaming tv on the market!
 
Kudos on your purchase! I got lucky with a floor model recently, too. I'm loving my new baby. I hope you enjoy yours as much!

Thanks, got it for like 1200, so hopefully that's a good price (even if it was a bit over my budget of a thousand).

Anyone know if it's possible to use something like NVIDIA 3D Vision on this TV? I actually have a 3D Vision-compatible monitor and PC, but I just can't bring myself to spend more on the 3D Vision glasses set.
 
Should be:

NVIDIA 3DTV Play supports most HDMI 3D TVs, receivers, projectors, and head mounted displays (HMDs) using Release 313 drivers or later.
Source: http://www.nvidia.com/object/3dtv-play-system-requirements.html

I mean, that's not a specific confirmation, but hey, it should work fine.

Also chiming in for congrats on the purchase. ;)
 

Yeah, you'll need 3DTV Play. It was like $10 or so when I got it about 4 years ago. You can only do 720p if you are playing at 60 fps.
 

Just to clarify, that is a limitation of the HDMI 1.3/1.4 standard. It's why 3D game support on PS3 and PS4 is also limited to 720p60. HDMI 2.0 can support 1080p60 in 3D, but I think we're still a ways away from video cards that support it. Some 4K TVs already support HDMI 2.0.
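As a rough sanity check of why the limit bites (my back-of-the-envelope numbers, not figures from this thread): frame-packed 3D sends both eye views, roughly doubling the pixels per frame, which is why 720p60 3D fits comfortably on an HDMI 1.4-era link while 1080p60 3D pushes right up against it (and wasn't among the spec's mandated 3D formats, unlike 720p60 and 1080p24). A minimal sketch, ignoring blanking overhead:

```python
# Rough raw-bandwidth estimate for frame-packed 3D over HDMI.
# Assumptions: 24-bit color, 60 Hz, two eye views per displayed frame,
# and blanking intervals ignored (real links carry extra overhead).

def raw_rate_gbps(width, height, fps, views=2, bpp=24):
    """Raw video payload in Gbps for a stereo (two-view) signal."""
    return width * height * fps * views * bpp / 1e9

rate_720p = raw_rate_gbps(1280, 720, 60)    # ~2.65 Gbps
rate_1080p = raw_rate_gbps(1920, 1080, 60)  # ~5.97 Gbps

# HDMI 1.4 carries roughly 8.16 Gbps of video data, but frame-packed
# 1080p60 needs about a 297 MHz pixel clock once blanking is included,
# double plain 1080p60's 148.5 MHz, which 1.4-era hardware didn't do.
print(f"720p60 3D:  {rate_720p:.2f} Gbps")
print(f"1080p60 3D: {rate_1080p:.2f} Gbps")
```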
 
Plasmas straight out of the box can be bitches with IR.

I have one whose break-in period, if it could be summed up in one word, would be "abuse". It had IR of course, massive IR. I was playing games non-stop and managed to pick the worst games in the world for breaking in a TV. I mean, a 4:3 videogame in proper aspect ratio with black bars is not clever; it being a 2-CD RPG was poorer judgement. Then I pulled out Tales of Vesperia: bright as fuck, with huge letters and HUD elements that stay in the same place 80% of the gameplay.

Sometime between the 400 and 500 hour mark it simply went away. I was still playing, but the IR would just fade; it even cleaned up the IR left over from weeks before that wouldn't budge for anything. The panel turned into another one altogether, really. Before that, if I tried cleaning it, it would alleviate, and that's how I knew it wasn't burn-in, but damn, I'd have spent hundreds of hours doing it if I'd wanted it gone.

If that had been my first plasma I would have had cold sweats; as it was, I didn't. But I don't recommend that kind of break-in, because it's hard to go through with it, and had I stopped and tried to clean it I'd only have been aggravated by how long it was taking to go away. No two panels are the same, of course; I can't say I was ever unlucky, though some people have been (and that can be said of any electronic appliance).

A lot of people go that route with break-ins unknowingly though, because they buy a plasma for quality, read too much, get home and splash a pro configuration meant for a bright room on their new TV, complete with the pixel orbiter off. A stress test straight out of the box, confirmed.

The break-in period is not a walk in the park, even if the panel is resilient, but it's simply worth it once the panel matures; if the IR never goes away, it's defective and you have to complain or return it.

Well, this is comforting to hear. My VT60 has had the Xbox One logo as IR since day one, due to similar "abuses" under Dynamic mode before I knew what IR actually was. It's very light, to be fair, and only visible in certain lighting. The panel's only at 200 hours though, so hopefully it'll vanish soon enough!
 
Not if you want anything over 65". 4K is the future, and even at 65" you need to sit pretty close to enjoy 1080p to its fullest. It's why I ended up with LCD. Plasma is great when you can sit close, but when I moved into a house with a large living room it was no longer an option. If anything, plasma has more washed-out colors compared to the overblown, oversaturated style of LCD. Your negatives of LCD are mostly found in cheap LCD TVs. The real bad stuff with LCD is clouding, flashlighting, and halos: summed up in one word, inconsistency (lack of uniformity).

I found those negatives in my parents' Philips LED, which was over 2k euro. The image doesn't look good under normal conditions in a living room. I can imagine why people choose LCD over plasma in a brightly lit showroom; it looks more eye-popping and flashy with the demos they put on. Then you come home, to a living room, and it's not the same. It's very tiring for my eyes. I found the LED really only good for watching CGI stuff from Pixar etc. My plasma doesn't tire my eyes; it feels natural, like a CRT.

I hope OLED becomes affordable in 2-3 years time when it's time to upgrade for me.
 
When an ST50 lags 16 ms with the fast-camera method and is "class leading", and suddenly it's 54 ms on the Leo Bodnar tester, I'll lean towards them being right. One of the best gaming TVs ever didn't suddenly turn into a steaming pile of shit, no.


Yeah, I would add the same offset for real-world figures...

So W905 = 8 ms camera, 20 ms Leo Bodnar (total).

So VT60 = 23 ms camera, so 23 + (20 - 8) = 35 ms is the more likely total.

The difference between the camera and Leo Bodnar methods is similar for most sets, I believe.
 
The other thing about Leo Bodnar is that it takes ghosting time into account, since it's looking for a specific value to be met.

That value is not a requirement for you to perceive the image (and it uses black bars over white bars, which is what takes longest to change), hence my feeling that the fast-camera method, albeit impractical, is more of a real-world scenario: if it registers a certain average time sooner, it does so in a way your eyes would too, so it's not breaking the time continuum in any way. Shame you have to pull 20 tries and do the statistics. Of course it's more "real world" even for plasmas, because they seem to flunk Leo Bodnar big time, but even for LCDs it's a good figure to have.

I find it plays out very differently for plasma in this case, but if you see tests for an LCD TV whose Leo Bodnar values, compared to the fast-camera ones, are over the moon... it's most certainly a stinker with a huge ghosting time, unlike plasmas. Hence I like having both.


Plasma ghosting time being 6 ms means the reading should be 23 + 6 = 29, or somewhere around 30, for Leo Bodnar. The issue for Leo Bodnar is probably the half frames. Ghosting time for any VA panel being more than that, the same offset wouldn't be "fair".

But it doesn't really matter; we're talking 3-5 ms differences, it's no biggie.
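The offset arithmetic above can be sketched as a tiny helper. To be clear, the constant-offset assumption is the poster's rule of thumb, not a measured fact, and the W905's 8/20 ms reference figures are the ones quoted in the post:

```python
# Back-of-the-envelope input-lag estimate from the thread:
# assume the gap between a fast-camera reading and a Leo Bodnar
# reading is roughly constant across sets, and use a TV measured
# both ways (the W905: 8 ms camera, 20 ms Bodnar) to estimate a
# Bodnar-style total for a TV with only a camera figure.

def estimate_total_lag(camera_ms, ref_camera_ms=8, ref_bodnar_ms=20):
    """Estimated Bodnar-equivalent lag in ms for a camera-only figure."""
    offset = ref_bodnar_ms - ref_camera_ms  # ~12 ms for the W905
    return camera_ms + offset

print(estimate_total_lag(23))  # VT60: 23 + (20 - 8) = 35
```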
Some photos of TLOU on my P55VT60. I was amazed, even after my old 50ST50...
Congrats!

I have a job interview tomorrow; if everything goes right and I land the job, the first thing I'm doing is hunting down a GT60 for my gaming room.

Seriously, ever since I plastered a 65VT60 all across the living room, plugging in the game-room TV just makes me sad.

And it's an awesome series 50 TV... it's just nowhere near as good as this pornographic pinnacle of a gold master.
 
I managed to pull this out of a CRT a few days ago:

[Photo: kWQAGdH.jpg]


With a crappy Nokia 5-megapixel cellphone (a Lumia 520) that doesn't even have a flash.

Not easy, but I figured out it had exposure options, so that helped. Flat surfaces, since the image isn't oscillating much there, also help with keeping the shot focused (not the case here).

If the camera was better the image would be even better of course, but it does the job.

As for HDTVs, the issue a lot of the time is trying to get "the whole picture", I think, because some of these cameras are really lacking there; the farther back you get, the less you get.
 
Yea, while smartphone cameras are getting a lot better, they usually can't do TV PQ shots justice and still can't compete with a decent digital camera, *especially* in low-light conditions.

For example I took these shots with an older Canon G6 cam which I purchased in 2005:

[Photo: HP_bluray3.jpg]


(shot taken in darkness / no lights on in room):
[Photo: bluray.jpg]
 
The modern cell phone camera is pretty amazing
It all depends what you're comparing it to.

"Midget sensors" can't possibly compete with low light performance of a DSLR, and that's very important when trying to show off a TV properly.

But I do agree some are getting mighty impressive, and effectively good enough these days. Not the case with the camera on my phone, though.
For example I took these shots with an older Canon G6 cam which I purchased in 2005:
Powershot G's are awesome.

I've been hunting for a G12 myself lately (currently the best price/quality ratio on the second-hand market; later models are still too expensive).

I want to pay €200 maximum; I let a €180 one slip through my fingers trying to make it €160. That was greedy.
 