Plasma, LCD, OLED, LED: best TV for next gen

LCDs have contrast ratios as good as plasmas these days when you realize that even though they can't quite reach the black level of a plasma, their white levels are much higher. The resulting ratio is basically no different in real-world viewing.

In reality, the black level of my 65X900A isn't as good as my 65VT60's, but the color gamut and relative contrast are every bit as good. Both TVs are self-calibrated.

I've been so spoiled by the VT60...it's hard for me to watch LCDs with their washed-out blacks in a darker room (though LCD tech has gotten better, there is nothing as pleasing to my eye as a nice plasma). My next TV will definitely be an OLED if/when the price comes down enough...
 
LCDs have contrast ratios as good as plasmas these days when you realize that even though they can't quite reach the black level of a plasma, their white levels are much higher. The resulting ratio is basically no different in real-world viewing.

In reality, the black level of my 65X900A isn't as good as my 65VT60's, but the color gamut and relative contrast are every bit as good. Both TVs are self-calibrated.

Not quite, as there's obviously a practical limit on white peak. Most people are not going to want to go beyond 60 or 70fL because it's simply too bright for comfortable viewing. The VT60 has an MLL of 0.001fL or less, which, even at a 30fL white peak, is a native ANSI CR of 30,000:1 at minimum. I believe the X900 is around 0.014fL ANSI, which means you'd have to run it at a 420fL white peak to match 30k:1. I doubt it's even capable of exceeding 100-200fL maxed out.
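
To put numbers on that, here's a quick sketch of the arithmetic in Python. The MLL and white-peak figures are just the ones quoted above, used for illustration rather than as fresh measurements:

```python
# Native contrast ratio is white peak divided by minimum luminance level (MLL).
# Figures are the ones quoted above (VT60 MLL ~0.001 fL, X900 ~0.014 fL ANSI);
# treat them as illustrative, not lab measurements.

def contrast_ratio(white_peak_fl, mll_fl):
    """Native contrast ratio for a given white peak and black level (both in fL)."""
    return white_peak_fl / mll_fl

def white_peak_needed(target_cr, mll_fl):
    """White peak (in fL) required to hit a target contrast ratio at a given MLL."""
    return target_cr * mll_fl

print(contrast_ratio(30, 0.001))        # VT60 at 30 fL -> 30000.0 (30,000:1)
print(contrast_ratio(30, 0.014))        # X900 at 30 fL -> ~2143 (2,143:1)
print(white_peak_needed(30000, 0.014))  # X900 peak needed for 30k:1 -> 420.0 fL
```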
 
Not quite, as there's obviously a practical limit on white peak. Most people are not going to want to go beyond 60 or 70fL because it's simply too bright for comfortable viewing. The VT60 has an MLL of 0.001fL or less, which, even at a 30fL white peak, is a native ANSI CR of 30,000:1 at minimum. I believe the X900 is around 0.014fL ANSI, which means you'd have to run it at a 420fL white peak to match 30k:1. I doubt it's even capable of exceeding 100-200fL maxed out.

The real measured ANSI contrast of the VT60 is nowhere near 30,000:1, though...
 
http://www.vizio.com/p502uib1e.html

It's a Vizio. It's not the best TV, so maybe it's just cheap. Thing is, I normally wouldn't buy a cheap TV, but I already had credit at Walmart from something a long time ago that I'd forgotten about, so I scored it for $500 during the sales (with the credit applied on the sale). For $500, I can't complain. I figure even if the TV goes out, it's just $500. Granted, I'm not laughing at the price; I've just invested a lot more money in the past.

So maybe that explains why it has 30Hz 4K HDMI ports and only one 60Hz one. Cheap brand, cheap parts.

All I know is, it looks infinitely better than my last TV, which was a 1080p LED. I read up online about all the flaws of this TV, but in the end decided to pull the trigger, because the price was good and it does look amazing.

That said, maybe it's not a true 4K TV. So if I really want a good 4K set, I'll just have to get something in a year or two when prices drop. *shrugs*


Looked up some more info; it's a cost thing. The 30Hz inputs are HDMI 1.4, while the 60Hz one is HDMI 2.0 or higher. Interestingly, it also accepts 1080p @ 120Hz on that port.
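
If anyone wants to sanity-check why the ports split that way, here's a rough back-of-envelope sketch. The Gbps ceilings are the commonly cited HDMI 1.4/2.0 video data rates, and the per-mode numbers are raw active-pixel rates at 8-bit 4:4:4 (real links also carry blanking intervals, so actual requirements run a bit higher). Interestingly, 1080p @ 120Hz fits within HDMI 1.4 bandwidth on paper, so the TV accepting it only on the HDMI 2.0 input presumably comes down to what that port's controller/EDID allows:

```python
# Back-of-envelope check: why the HDMI 1.4 inputs top out at 4K30 while the
# HDMI 2.0 input can carry 4K60. Rates are raw active-pixel data at 8-bit 4:4:4;
# real signals also include blanking, so true requirements are somewhat higher.

BITS_PER_PIXEL = 24  # 8 bits x 3 color channels

def data_rate_gbps(width, height, fps, bpp=BITS_PER_PIXEL):
    return width * height * fps * bpp / 1e9

HDMI_1_4_GBPS = 8.16   # ~10.2 Gbps TMDS minus 8b/10b encoding overhead
HDMI_2_0_GBPS = 14.4   # ~18 Gbps TMDS minus encoding overhead

modes = [("4K @ 30Hz", (3840, 2160, 30)),
         ("4K @ 60Hz", (3840, 2160, 60)),
         ("1080p @ 120Hz", (1920, 1080, 120))]

for name, mode in modes:
    rate = data_rate_gbps(*mode)
    print(f"{name}: ~{rate:.1f} Gbps | "
          f"HDMI 1.4 ok: {rate <= HDMI_1_4_GBPS} | "
          f"HDMI 2.0 ok: {rate <= HDMI_2_0_GBPS}")
```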
 
Looked up some more info; it's a cost thing. The 30Hz inputs are HDMI 1.4, while the 60Hz one is HDMI 2.0 or higher. Interestingly, it also accepts 1080p @ 120Hz on that port.

Thanks for looking it up for me (especially since I'm dumb with this stuff).

So just to be clear, all the 4K ports will accept 1080p at 120Hz? Or only the ONE HDMI 2.0+ port, with the other ones at 60Hz?

Right now I have my PS4 hooked up to the best port (the HDMI 2.0 one or whatever). Then I have my Xbox One, my Wii U and my Apple TV on the other 4K @ 30Hz ports (HDMI 1.4).

The Wii U and Xbox One are the only ones I care about. I know none of them output at 4K, and all of them get upscaled from 1080p to 4K (which looks amazing). Was just curious about the refresh rate and how that works when it's 1080p output on them. Sorry for asking so many questions.

If the other ports don't do 120Hz at 1080p, then maybe I should get an HDMI switch to plug into that HDMI 2.0 port, if such a thing exists, and just have all the other ones run off that one. Either way, I can't stress enough how much I love this TV. I know Vizio isn't the best brand, and I know it has issues. But gah damn, 4K upscaling plus this TV's picture in general is so mind-blowingly beautiful. And I sit close to the TV in my room, so 50 inches is perfect.

Forza Horizon 2 on this is wow.
 
Looking at getting this Samsung 55" LED UN55H6300AFXZA from Costco for $770.

Any info on this one as far as input lag and such?

The UN75H6350 is 41ms; see http://www.displaylag.com/lg-samsung-sony-input-lag-results-4k/. Samsung has really reduced those numbers lately.

It's a little bizarre that Microsoft were constantly touting 4K support for the Xbox One but it never actually materialised. Not even the UI is 4K.

Most knew that Major Nelson's BS about 4k was just that.
 
Thanks for looking it up for me (especially since I'm dumb with this stuff).

So just to be clear, all the 4K ports will accept 1080p at 120Hz? Or only the ONE HDMI 2.0+ port, with the other ones at 60Hz?

Right now I have my PS4 hooked up to the best port (the HDMI 2.0 one or whatever). Then I have my Xbox One, my Wii U and my Apple TV on the other 4K @ 30Hz ports (HDMI 1.4).

The Wii U and Xbox One are the only ones I care about. I know none of them output at 4K, and all of them get upscaled from 1080p to 4K (which looks amazing). Was just curious about the refresh rate and how that works when it's 1080p output on them. Sorry for asking so many questions.

If the other ports don't do 120Hz at 1080p, then maybe I should get an HDMI switch to plug into that HDMI 2.0 port, if such a thing exists, and just have all the other ones run off that one. Either way, I can't stress enough how much I love this TV. I know Vizio isn't the best brand, and I know it has issues. But gah damn, 4K upscaling plus this TV's picture in general is so mind-blowingly beautiful. And I sit close to the TV in my room, so 50 inches is perfect.

Forza Horizon 2 on this is wow.


The best way to think about which ports do what is to ignore the TV's final post-processing Hz number.

So, for instance, look instead at what the source output is. For every device you have there, 1080p @ 60Hz is going to be their highest output. So realistically, at this point, it makes no difference which port you use.

The only device that could potentially make any difference is a PC. Obviously it depends on the power of your PC's GPU, but theoretically, if your PC were powerful enough to output at 120Hz (120fps), then:

You could play a game in 1080p @ 60Hz (i.e. 60fps) on any of the four HDMI 1.4 inputs, same as a console, Blu-ray player, etc.
Or any game in 1080p @ 120Hz (i.e. 120fps), but only on the HDMI 2.0 port. This is something usually only available on PC monitors.
Or if you want 4K at 30fps, the HDMI 1.4 ports will do.
Or if you want 4K at 60fps, you must use the HDMI 2.0 port.

Is it worth using a splitter to send all information via HDMI port 2? No.

The throughput of each device is limited by:

The output device (Xbox, PS4, Blu-ray player, etc.)
The HDMI cable
The HDMI splitter, if you add one

It will never be higher than the lowest common denominator. If the output device tries to send a signal beyond the capabilities of the HDMI cable or the splitter, it just won't send anything.

If the HDMI cable and splitter are capable of carrying more than what the output device can send, they just won't be using their full capabilities, as you will be limited to what the output device is sending. So there won't be any difference by the time it gets to the TV; it'll still be exactly what the output device sent, with no improvements.
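
In other words, the chain behaves like a simple min() over its links. A minimal sketch, with made-up device names and capability numbers purely for illustration:

```python
# The "lowest common denominator" rule: a signal chain can never carry more
# than its weakest link. Names and Gbps figures below are hypothetical.

def effective_bandwidth_gbps(chain):
    """The usable bandwidth of a source -> cable -> splitter -> TV chain."""
    return min(chain.values())

chain = {
    "console output": 8.16,   # e.g. a 1080p60-class source
    "HDMI cable":     14.4,
    "HDMI splitter":  8.16,   # a cheap splitter caps the chain here
}

print(effective_bandwidth_gbps(chain))  # -> 8.16; the better cable gains nothing
```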

Not trying to confuse you, but there is one instance that would make a difference: an HDMI splitter that also acted as an upscaler to 4K @ 60Hz (this could be an HDMI upscaler/splitter-like device, or maybe a home theatre receiver that does the same thing). Otherwise, if all it is doing is passing through the signal with no post-processing, it doesn't matter. Either way, we are talking about devices worth hundreds, not just a standard splitter that simply switches signal paths.

The TV's ability to output any source at 120Hz is entirely inconsequential from the point of view of the output source and which port you use. The 120Hz (it's actually 240Hz on this TV) has more to do with frame interpolation, strobing, etc. that is used to minimise judder in video output; it's not actually related to how many frames the source device is outputting. Therefore, for the purpose of worrying about which HDMI port you use, you can entirely ignore that spec. So the simple answer to your question about port use and 120Hz output, from the TV side of things: you can always get 120Hz from ANY port you use. It's being done in the TV's software/processor. It is only ever an issue from the output source.

It's confusing; I hope I cleared it up a little.

Just think: the HDMI ports and the signal speed (Hz/fps) are only relevant when considering the output devices and what they output, not what the TV is outputting.
 
Not quite, as there's obviously a practical limit on white peak. Most people are not going to want to go beyond 60 or 70fL because it's simply too bright for comfortable viewing. The VT60 has an MLL of 0.001fL or less, which, even at a 30fL white peak, is a native ANSI CR of 30,000:1 at minimum. I believe the X900 is around 0.014fL ANSI, which means you'd have to run it at a 420fL white peak to match 30k:1. I doubt it's even capable of exceeding 100-200fL maxed out.

This.

Most calibrators aim for the 35-40fL range for comfortable viewing, i.e. no eye strain. If you go above that, while contrast ratio nominally increases, you are introducing eyestrain and most likely color shifting in your grayscale. So yeah, there is a practical limit for contrast or white level.

In terms of practical contrast, there are very few LED LCDs on the market that come anywhere close to a good plasma. When you are talking about Panasonic plasmas, they basically knock it out of the park.

If you set an LCD to a ~35-40fL white peak, the resulting contrast will be much lower due to the heightened black level. Some of the better LEDs can still offer decent black levels, but plasma is unmatched, which is why it's still the best display tech on the market, despite marketing from TV manufacturers pushing LEDs having damaged the plasma market to the point that it will probably disappear in a few years.
 
If you set an LCD to a ~35-40fL white peak, the resulting contrast will be much lower due to the heightened black level. Some of the better LEDs can still offer decent black levels, but plasma is unmatched, which is why it's still the best display tech on the market, despite marketing from TV manufacturers pushing LEDs having damaged the plasma market to the point that it will probably disappear in a few years.


Plasma is already dead; it won't disappear in a few years, it is already gone. Panasonic are out. Samsung are out. LG are out.

I have also never seen TV manufacturers trying to convince anyone to drop plasma; it's just where the market went. People bought more LCDs, then LED backlighting took over so TVs could be thinner and less power-hungry. That's just the way the consumer wanted to go, so TV manufacturers put all their eggs in that basket.

There was no 'let's get rid of our plasma lineup by saying our LED LCD is better' conspiracy. Never has been.
 
Plasma got killed by its ergonomics. It's great that your screen has low black levels, but if the screen is reflective and has to be in a low-light environment to show off its potential, then it is not going to look good in an electronics store showroom (or most people's houses for that matter).

When people want to wall-mount their TVs, then yes, things like weight, thickness, and heat matter.

A lot of people who bought into plasma early got soured by issues like buzzing power transformers, rising black levels, burn-in, and decreasing brightness.

And you still have to baby a lot of plasma models. Run the anti-image retention if you are playing games with HUDs. Break in the screen for X hours so you don't have uneven phosphor aging. Don't lay it on its side if you're moving it.

It's a perfect technology for videophiles, because it will give you a better picture if you're willing to put up with the extra bullshit. I used to think I was one of those people, but after 5 years of owning a plasma TV, I realized that I would rather sacrifice a bit of image quality to get something that just works.
 
Burn-in is still a killer for games too. I learned that the hard way. LCD has limitations, but one huge advantage is that I can play games as much as I want, however I want, and I don't get the HUD permanently burned into my TV while doing it.
 
I grabbed a Panasonic P50GT60 last year before they all disappeared. It came with 2 pairs of 3D glasses and I was looking to grab 2 more. Anyone have any recommendations on what to buy? Just stick with Panasonic ones? They have rechargeable ones in John Lewis that look quite nice, but they're not cheap.

Also, I'm getting broadband installed in our new house next week. I was going to get them to put the modem near the TV unit for easy ethernet connection to my TV and PS3. Is there any reason I shouldn't do this? Can it interfere with the picture? The reason I ask is because the TV came with 2 ferrite cores that they suggest ethernet cables connected to the TV should be run through. If not, what is the purpose of these?
 
It's a perfect technology for videophiles, because it will give you a better picture if you're willing to put up with the extra bullshit. I used to think I was one of those people, but after 5 years of owning a plasma TV, I realized that I would rather sacrifice a bit of image quality to get something that just works.

Agreed. Looking forward to the future of LG OLED.
 
I thought I had tweaked out this problem a while ago, but I'm starting to see it again.

When edges move fast, they sometimes show purple/blue fringing. Before, I toyed with brightness/contrast and it went away. Now I am seeing it again. Help?
 
I thought I had tweaked out this problem a while ago, but I'm starting to see it again.

When edges move fast, they sometimes show purple/blue fringing. Before, I toyed with brightness/contrast and it went away. Now I am seeing it again. Help?

What TV? This fringing, would you describe it as something like a Predator-cloak-style silhouette around objects or people during fast movement?

I usually found this sort of artefacting when things like frame interpolation / dejudder / motion smoothing enhancements were turned on or just set too high.
 
Need some advice

I have these 2 TVs on order (both were the same price) but can only keep one. Primary purpose is watching movies and gaming.

Samsung-UN55HU7250-Curved-55

4K no 3D

vs

Samsung-UN60H7150-60-Inch-1080p


1080p + 3D

I am confused which one to keep. Opinions?

I have the UN55HU7250FXZA on order right now too, but I'm not sure about it. I've heard the black levels are way off on these. Can anyone help us out on this TV?
 
Burn-in is still a killer for games too. I learned that the hard way. LCD has limitations, but one huge advantage is that I can play games as much as I want, however I want, and I don't get the HUD permanently burned into my TV while doing it.
That depended on the individual plasma, but it was a huge issue for some, for sure.

My older 720p Sammy plasma (which is now our bedroom TV) shows IR really fast but also clears it very quickly, so I just got used to seeing the occasional residual IR.

My newer 1080p Samsung plasma can play games (or watch sports) with static HUDs for quite a while (hours even) and I rarely see even a hint of IR, and I look for it.
 
It's a little bizarre that Microsoft were constantly touting 4K support for the Xbox One but it never actually materialised. Not even the UI is 4K.

Practically speaking, neither MS nor Sony wants a 4K UI, because the UI stays in memory during gameplay (note that you can press the PS or Xbox button on your given system and return to the console UI with zero wait). It would be a big bump in asset costs with no gain except for the tiny 4K userbase, who might not even notice the difference.

Both systems could theoretically do 4K but given the fill rate and memory requirements it doesn't make much sense.
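
For a sense of scale on the memory point, a single 32-bit 4K framebuffer is about four times the size of a 1080p one. A rough sketch, ignoring double/triple buffering, UI textures, and compression:

```python
# Rough framebuffer memory cost at 32 bits (4 bytes) per pixel.
# Ignores double/triple buffering, UI asset textures, and any compression.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080), 1))  # ~7.9 MB for a 1080p UI buffer
print(round(framebuffer_mb(3840, 2160), 1))  # ~31.6 MB for a 4K UI buffer
```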
 
Whole hours? Holy cow.
Yes, the new set is shockingly resilient to IR. I honestly can't recall the last time I saw any at all, and my kids have a habit of leaving movies playing that often end up on the menu screen for a while when they are done.

My old set will show any static images very quickly (just a few minutes), but it fades away quickly as well. It can be annoying for sure if you look for it on that set.
 
LCDs have contrast ratios as good as plasmas these days when you realize that even though they can't quite reach the black level of a plasma, their white levels are much higher. The resulting ratio is basically no different in real-world viewing.

In reality, the black level of my 65X900A isn't as good as my 65VT60's, but the color gamut and relative contrast are every bit as good. Both TVs are self-calibrated.

This is horseshit.

I have multiple sets of both in the house (Panasonic/Samsung plasmas, Sony LCDs), and the picture quality on the plasmas is exponentially better, regardless of where you sit.

The fact that this inferior tech has somehow won infuriates me.

And you still have to baby a lot of plasma models. Run the anti-image retention if you are playing games with HUDs. Break in the screen for X hours so you don't have uneven phosphor aging. Don't lay it on its side if you're moving it.

It's a perfect technology for videophiles, because it will give you a better picture if you're willing to put up with the extra bullshit. I used to think I was one of those people, but after 5 years of owning a plasma TV, I realized that I would rather sacrifice a bit of image quality to get something that just works.

Which plasma? Most of the issues you describe were solved. The rising black levels issue was in 2009, I've never done a proper break-in period for any of them, and I've had no HUD burn-in. I've owned them from multiple years, 2007 through last year, so a lot of this is overblown.

Although, if you like things looking like the outdoors from dead center, with input lag, yeah, sure, LCD all the way.
 
Which plasma? Most of the issues you describe were solved. The rising black levels issue was in 2009, I've never done a proper break-in period for any of them, and I've had no HUD burn-in. I've owned them from multiple years, 2007 through last year, so a lot of this is overblown.

Although, if you like things looking like the outdoors from dead center, with input lag, yeah, sure, LCD all the way.

I know that rising blacks is an older problem. That's why I listed it as a problem that burned early adopters. Of course, after the rising blacks problem was fixed, we got the floating blacks problem.

My current plasma is a Panasonic G25. I think I got it somewhere around 2010. The image retention / burn-in is real. I got Panasonic to look at it a couple times during my warranty period, and every time I was told "working as intended." They said the same thing about the power transformer, which emits a buzzing sound if you display certain content, like an all-green screen. Back when I bought it, I thought people were suckers to pay more for an LED screen with worse picture quality. Turns out I was the sucker for believing the people who told me all of the problems with plasma had been fixed.

I'm going to buy a new TV next year, and I will probably just try to give this one away to friends or family, because I don't see how I could sell it in good conscience.
 
I had both a Panasonic GT30 and a VT30. The image retention was rather troubling. I had to run the screen wipe plenty of times after an afternoon of sports or an evening of video games.

I replaced them with a Samsung and I haven't had any issues with IR since. I have a 50" ST30 in the bedroom, and the IR is still there, but it vanishes a lot quicker than on the other Panasonic models I had.
 
This is horseshit.

I have multiple sets of both in the house (Panasonic/Samsung plasmas, Sony LCDs), and the picture quality on the plasmas is exponentially better, regardless of where you sit.

What models?

Despite Unkown Soldier having an anime avatar, he knows his shit. I remember chatting with him on FFXIV a few years ago, and we both liked plasma TVs over LCD at that time. Things have changed, though. You can go out and buy a really darn good LED TV for just a tad over $1,000, get it ISF calibrated, and be pretty close to a plasma picture.
 
What models?

Despite Unkown Soldier having an anime avatar, he knows his shit. I remember chatting with him on FFXIV a few years ago, and we both liked plasma TVs over LCD at that time. Things have changed, though. You can go out and buy a really darn good LED TV for just a tad over $1,000, get it ISF calibrated, and be pretty close to a plasma picture.
Change that to $2000 and I'll agree.
 
I've never heard of a plasma set that could beat modern Bravias on input lag.
My Samsung plasma has ~16 ms (1 frame) of input lag when in PC mode, so it's pretty close.

That being said, I agree that input lag is not inherently worse on either plasma or LCD from my experience, and they can both be pretty bad in many cases.
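
For reference, converting input lag to frames is just lag divided by frame time. A quick sketch using the 16ms and 41ms figures mentioned in this thread:

```python
# Convert input lag in milliseconds to frames at a given refresh rate.
# At 60 Hz one frame lasts 1000/60 ~= 16.7 ms.

def lag_in_frames(lag_ms, refresh_hz=60):
    frame_time_ms = 1000 / refresh_hz
    return lag_ms / frame_time_ms

print(round(lag_in_frames(16), 1))  # ~1.0 frame  (the plasma in PC mode)
print(round(lag_in_frames(41), 1))  # ~2.5 frames (the 41 ms set cited earlier)
```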
 
My Samsung plasma has ~16 ms (1 frame) of input lag when in PC mode, so it's pretty close.

That being said, I agree that input lag is not inherently worse on either plasma or LCD from my experience, and they can both be pretty bad in many cases.

Input lag is a function of the scaler and other chipsets in the TV, including the Smart TV stuff. A TV manufacturer can choose to optimize the performance of the scaler and other functions if they care enough with the goal of minimal delay from input to displayed image on panel.

Up until very, very recently, most manufacturers couldn't give less of a shit. I mean Samsung Smart TVs just a couple years ago had input lag of 60-80 ms in Game Mode. I don't know what happened recently that made them suddenly start to care. Sony, Samsung, Panasonic, even fucking Vizio are suddenly making TVs with lower input lag than ever before.

It's never been a better time to be a big-screen TV gamer than now, I guess. Now go 360noscope boomheadshot some motherfuckers with your low input lag, soldier. That's an order.
 
Had a question about 4K content.

Do I need a specific player to play 4K? Like, if I buy a Blu-ray that has 4K, will that play on my PS4...and will the TV switch over to 4K?

Because the way I have it now, I'm using my PS4 on the 4K @ 60Hz HDMI port. But obviously the PS4 is outputting at 1080p, just upscaled to 4K. I was just curious about actually playing 4K content itself on a 1080p device.

If that makes sense.
 
I recently got a Samsung 51-inch 1080p 600Hz plasma TV. I got this since I assume it has the least amount of lag for Killer Instinct and Halo. I was curious if I made a good choice (so far it seems great).
 
Had a question about 4K content.

Do I need a specific player to play 4K? Like, if I buy a Blu-ray that has 4K, will that play on my PS4...and will the TV switch over to 4K?

Because the way I have it now, I'm using my PS4 on the 4K @ 60Hz HDMI port. But obviously the PS4 is outputting at 1080p, just upscaled to 4K. I was just curious about actually playing 4K content itself on a 1080p device.

If that makes sense.

Good question. Just bought a 4k tv so I would like to know as well.
 
Had a question about 4K content.

Do I need a specific player to play 4K? Like, if I buy a Blu-ray that has 4K, will that play on my PS4...and will the TV switch over to 4K?

Because the way I have it now, I'm using my PS4 on the 4K @ 60Hz HDMI port. But obviously the PS4 is outputting at 1080p, just upscaled to 4K. I was just curious about actually playing 4K content itself on a 1080p device.

If that makes sense.

I don't really understand your question.

From what I understand, you are asking "Can my current devices play future 4K content?"

If that's what you're asking, the answer is "no idea," as nothing is really finalised. Can a PS4 output 4K? Apparently it can, but whether that means streaming from an online service or an external HDD is one thing; whether it can read 4K Blu-rays is another. The same goes for pretty much any other device right now.

3D Blu-rays only work on 3D-compatible Blu-ray players, so whether current Blu-ray players or devices will support 4K output after something like a firmware update, I have no idea. There doesn't appear to be anything concrete right now.

---------------------------------------------

Did a little more research, and the likely answer to playing 4K Blu-ray is 'no.' You need HDMI 2.0 and compatibility with HDCP 2.2. So assuming a PS4 could output 4K content natively, it would likely have to be done via a streaming network or external HDD, as it will, at least to my understanding of the PS4 hardware, not support HDMI 2.0 and HDCP 2.2.

This would also apply to pretty much any Blu-ray player, etc.

You need both the output device and the display to be able to handle HDMI 2.0 and HDCP 2.2. Now for the shitfest part: not all 4K TVs actually support those requirements. Some first-generation TVs do NOT support HDMI 2.0 or HDCP 2.2. If your TV does not, you're kind of screwed on 4K Blu-ray and stuck with streaming, whether through the internet or an HDD. You also won't be able to run 4K @ 60Hz (this will be a thing, as some directors are going in this direction over 24fps for their movies).

Also, if you run a home theatre, this applies to AV receivers / pre-amps as well. All links in the chain must support HDMI 2.0 and HDCP 2.2, so yeah, some peeps could be in for both an expensive and confusing upgrade.
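
That "every link in the chain" rule is easy to express as a check. A minimal sketch, with hypothetical device names and capability flags purely for illustration:

```python
# For 4K Blu-ray, every hop from player to panel must support both
# HDMI 2.0 and HDCP 2.2. Devices and flags below are hypothetical.

chain = [
    {"name": "UHD player",  "hdmi_2_0": True, "hdcp_2_2": True},
    {"name": "AV receiver", "hdmi_2_0": True, "hdcp_2_2": False},  # weak link
    {"name": "4K TV",       "hdmi_2_0": True, "hdcp_2_2": True},
]

def chain_ok(chain):
    """True only if every device supports both HDMI 2.0 and HDCP 2.2."""
    return all(dev["hdmi_2_0"] and dev["hdcp_2_2"] for dev in chain)

weak_links = [d["name"] for d in chain if not (d["hdmi_2_0"] and d["hdcp_2_2"])]
print(chain_ok(chain))   # False
print(weak_links)        # ['AV receiver']
```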
 
Plasmas have a bunch of issues: buzzing, heat and "burn-in", even though I play tons of games and the burned-in HUD (for example) fades pretty quickly, especially if you use the scrolling bar. So yes, you have to babysit them...but not an excessive amount of babysitting is required.

Initially I was very focused on these problems when I got my GT60 last year; after some time I realized that you just get used to a little more noise, and the burn-in actually fades away. I calibrated my plasma using my i1Pro, so I don't use it at max brightness; that may be a big factor regarding burn-in. I can imagine that putting that thing at 100% brightness could cause issues in this area. So yeah...you have to work a little for one huge benefit: in a somewhat controlled viewing environment, a good plasma shits on any LCD I have ever encountered when it comes to image quality. No contest. The next step up would be an OLED...if you can afford it.

A question for all GT60/VT60/ZT60 owners: do you activate the 1080p Pure Direct mode for 4:4:4 while gaming? I can't decide if I should do it or not, since it seems to introduce a little more lag.
 
Had a question about 4K content.

Do I need a specific player to play 4K? Like, if I buy a Blu-ray that has 4K, will that play on my PS4...and will the TV switch over to 4K?
There is no 4K Blu-ray standard yet; expect that in late 2015. All 4K TVs will upscale whatever signal they get to 4K. As for 4K content, most TVs can play it off a USB drive or similar. There is plenty of 4K content on youtube.com, so you can use 4kdownloader or a similar program to download it. Just search for "4k" on YouTube. Netflix also has some 4K content, and other services are coming.

Because the way I have it now, I'm using my PS4 on the 4K @ 60Hz HDMI port. But obviously the PS4 is outputting at 1080p, just upscaled to 4K. I was just curious about actually playing 4K content itself on a 1080p device.
If that makes sense.

The PS4 does not upscale to 4K, the TV does, so you can use any input port on the TV for it. Not sure about the second part of your question here, though.
 
A question for all GT60/VT60/ZT60 owners: do you activate the 1080p Pure Direct mode for 4:4:4 while gaming? I can't decide if I should do it or not, since it seems to introduce a little more lag.

For some reason it's greyed out on mine. Probably because I'm using Game Mode.
 
VT/ZT60 owners:

Does your panel lean forward by a couple of degrees? Obviously the TV weighs a ton, so this might be normal, but I noticed recently on its stand that there's a fair bit of give and it doesn't sit completely straight up.
 
1080p Pure Direct works in Game Mode. You're probably not sending it an actual 1080p signal; as the name implies, 1080p Pure Direct only works when the TV is receiving 1080p.

That's weird. It's definitely a 1080p signal. I wonder if it's because I'm using a custom picture setting or something?
 
I've never heard of a plasma set that could beat modern Bravias on input lag.

There was one of these in the living room; I'll take my VT 100/100 times.

What models?

42 inches
Samsung - 2007 model. Still ticking. Guest room.

50 inches
Panasonic GT25 - Bedroom
Panasonic - still boxed, this is the 2014 model, game room
Sony KDL50W800B - Loft

65 inches
Panasonic VT-60 - Man Cave
Sony KDL65W950B - Living Room

My current plasma is a Panasonic G25. I think I got it somewhere around 2010. The image retention / burn-in is real. I got Panasonic to look at it a couple times during my warranty period, and every time I was told "working as intended." They said the same thing about the power transformer, which emits a buzzing sound if you display certain content, like an all-green screen. Back when I bought it, I thought people were suckers to pay more for an LED screen with worse picture quality. Turns out I was the sucker for believing the people who told me all of the problems with plasma had been fixed.

The one time this was a problem, I literally left it on pause for a weekend, ran the white-out, and it was gone by the time I got home from work. So, no, not really.

Does your panel lean forward by a couple of degrees? Obviously the TV weighs a ton, so this might be normal, but I noticed recently on its stand that there's a fair bit of give and it doesn't sit completely straight up.

A bit, sure. I don't mount my TVs for the most part, so it's easier to notice. That said, I wouldn't worry.

Sounds like LG has improved on a lot of things beyond just increasing the resolution with this model. Also interesting is the 10-year "minimal degradation" guarantee. No mention of input lag, unfortunately.

Consider me a skeptic. Not of OLED, far from it, but of LG. To date, this is the only plasma that's actually broken that I've ever owned.
 
Whoooo! My 60" F8500 from a couple weeks ago was delivered a little while ago!
I am not in the country but my wife did a cursory inspection and nothing appears wrong. Free from visible physical defects, no lines on the screen, possibly no dead pixels on the set-up screen (the one with the solid blue background, I had to explain what she should be looking for and she said she didn't see any), and best of all, no forward directional buzzing from the center! I had her sit down on the couch and view the TV from a position directly in the center and tell me if she noticed anything weird or bad. She said she didn't so then I asked her to specifically listen for anything that would indicate a problem. She said "nothing," so then I described the buzzing and asked her about it; see, I didn't want to tip her off to it first and then have her be expecting it or mistaking a different sound for it. She said she didn't hear anything like what I described.

So I haven't personally looked at it and am not 100% in the clear yet but it seems like I might have gotten a good set. I'm pretty happy about that.

Now I need to decide if I want to run slides or not. I'm only in the country for three weeks and don't know if I want to take 4 days up with that and miss out on four days of movies with the family and new Xbone goodness. I debated and debated for days to run slides on the plasma I got in 2010 before finally deciding not to and jumped straight into movies and gaming (I think maybe I held off a day or two before playing any HUD-intensive games though). Never had any issues with IR that a couple minutes of the scrolling bar or televised programming couldn't fix. And I'm talking 'I would fall asleep while in the Netflix app for 5 or 6 hours, wake up to see the TV still on and displaying that screen (albeit a bit darker due to the 360 picture dimming after a while), and running the white bar for a couple minutes and being good.' That was a Panny though, not a Samsung. Never owned a Samsung, so I guess I should take it kind of easy at the beginning and feel it out for a bit.
 
I grabbed a Panasonic P50GT60 last year before they all disappeared. It came with 2 pairs of 3D glasses and I was looking to grab 2 more. Anyone have any recommendations on what to buy? Just stick with Panasonic ones? They have rechargeable ones in John Lewis that look quite nice, but they're not cheap.

Also, I'm getting broadband installed in our new house next week. I was going to get them to put the modem near the TV unit for easy ethernet connection to my TV and PS3. Is there any reason I shouldn't do this? Can it interfere with the picture? The reason I ask is because the TV came with 2 ferrite cores that they suggest ethernet cables connected to the TV should be run through. If not, what is the purpose of these?

Sorry to bump this one, but my family is hassling me to put an Amazon wishlist together for Christmas.

Any Panasonic plasma owners who can recommend some quality 3D glasses so I can get 2 extra pairs? My GT60 came with 2 pairs of the TY-ER3D5MA glasses. I'm assuming these were the latest ones released, since this was their last line of plasmas? Are they generally considered the best? I see the TY-ER3D4ME models everywhere, which are rechargeable, but they look a little chunkier. Are there any really good third-party glasses out there? Any advice would be appreciated. Thanks!
 