
PS5 Pro/PSSR Appears to Provide Better Image Reconstruction than DLSS Running on a 4090 GPU

Vick

Gold Member
Bravia at the high end is great; I own the Bravia 9 85-inch myself. Best mini-LED on the market. I wouldn't touch Sony if I were budget-limited though; the Bravia 3 is by all accounts terrible vs the competition. Pretty hard to compete with TCL and Hisense in the cheap TV market.
I'll take your word for it, wouldn't know first hand as I'm a Panasonic plasma guy.

I've reached a point where I wouldn't touch a cheap TV or monitor with a stick, nor an LCD or OLED in general for that matter. In fact, unless microLED panels with FULL native motion resolution are available the moment all my plasmas die and no more can be found, I would legit stop any form of video-based entertainment until that happens.
It may seem ridiculous, I know, but after almost 20 years of Pioneer and Panasonic plasma, the PQ, latency, and near-black performance of any LCD, and the poor motion resolution, near-black performance, and artificial "plasticity" of OLED compared to the analog phosphor image, legit deprive me of any joy when I sit in front of them.

Plus the idea of needing games running at 1,000fps to achieve the same look I get when playing at 60fps on my panel is simply too preposterous for me to accept.
 
Last edited:
I don't know about matching nVidia's solution.

I'll reiterate: I'd bet it's an area where they're likely to enjoy benefits/boosts over PC solutions, because consoles are fixed hardware, so it should be easier for them to implement a comparatively more efficient solution.

No one should pretend that consoles aren't advantageous over PCs in some areas... Whether it extends to image reconstruction solutions remains to be seen.
 
Last edited:

Neo_game

Member
They need more games to support it rather than just a handful of them, and definitely all newer games. This could be the best thing about the Pro, but they have to release proper screenshots to show how it really looks.
 


TheShocker

Member
No one cares what you spend your money on. If you want to buy a Pro and enjoy it, do so. But don't make delusional claims to justify it.
 

proandrad

Member
No one cares what you spend your money on. If you want to buy a Pro and enjoy it, do so. But don't make delusional claims to justify it.
DLSS image quality on a 4090 is the same as DLSS image quality on a 3050. So there is nothing delusional about a new upscaling solution potentially providing a better image.
 
Last edited:
DLSS image quality on a 4090 is the same as DLSS image quality on a 3050. So there is nothing delusional about a new upscaling solution potentially providing a better image.
Totally not delusional to assume that Sony, on their first try, will just leapfrog the market leader, one of the most valuable companies on the planet with an R&D budget Sony can't even dream of, in one of its core competencies...
Riiiiiight.
 
Last edited:

Vick

Gold Member
Wait, why not?

Let's wait and see before writing this off. Both are using AI upscaling. We know from the PS5 Pro leaks that the machine learning computation power (300 TOPS) is very impressive, on par with the 3080 (273 TOPS). It doesn't have to beat it; it just has to come close. FSR is already almost there, TSR too, and they don't use AI upscaling. Why is it so hard to believe that another AI solution, with proper hardware computation power allocated to AI tasks, could match it?
IGN finally released some proper high-quality footage, and it's a legit night-and-day difference compared to the OG stream:



It immediately invalidated the Spider-Man 2 screenshot posted by OP, which was obviously 30fps motion blur, but I'm not sure about Ratchet actually.

Pro IQ looks absolutely mindblowing already to me on this damn YT video.
 

M1chl

Currently Gif and Meme Champion
I don't know about matching nVidia's solution.

I'll reiterate: I'd bet it's an area where they're likely to enjoy benefits/boosts over PC solutions, because consoles are fixed hardware, so it should be easier for them to implement a comparatively more efficient solution.

No one should pretend that consoles aren't advantageous over PCs in some areas... Whether it extends to image reconstruction solutions remains to be seen.
The bigger story is training your ML model to be good across a wide variety of games, and that's probably not affected by whether you run it on fixed hardware or on your own brand of GPUs, so I don't think it's easier.
 

proandrad

Member
Totally not delusional to assume that Sony, on their first try, will just leapfrog the market leader, one of the most valuable companies on the planet with an R&D budget Sony can't even dream of, in one of its core competencies...
Riiiiiight.
Oh, what, we've got an expert over here? Enlighten us plebs on how much R&D Nvidia is still putting into gaming-related software.
 

BennyBlanco

aka IMurRIVAL69
Oh, what, we've got an expert over here? Enlighten us plebs on how much R&D Nvidia is still putting into gaming-related software.

Nvidia is valued 30x higher than Sony. Their bread and butter is AI. They started the ML upscaling tech and shit on both AMD's and Intel's efforts from a great height. It's absurd to think Sony has it all figured out on their first try, and to base all this off a YouTube marketing video.
 

proandrad

Member
Nvidia is valued 30x higher than Sony. Their bread and butter is AI. They started the ML upscaling tech and shit on both AMD's and Intel's efforts from a great height. It's absurd to think Sony has it all figured out on their first try, and to base all this off a YouTube marketing video.
Microsoft is also valued higher than Sony, so shouldn't Xbox hardware be miles better than PlayStation's?
 
We’ve know about PS5 Pro for years, we’ve known that PSSR was a core part of this system for years, we know that Sony has been investigating upscaling since CBR on PS4 Pro and anti aliasing algorithms since PS3 (MLAA). This isn’t like AMD farting out FSR to quickly have something to put next to Nvidia. They’ve been working on these problems for a long time, longer than Nvidia, and their work has influenced developments elsewhere.

There’s no reason not to think PSSR will be an excellent solution, outside of being a Nvidia fanboy.

Well said. Some people really talk like SIE just decided to get into the upscaling game only a few years ago, when really they've been at a lot of these things other companies get credit for today since the PS3... hell, even the PS2 days in some cases.

They just never fostered a cult of media influencers and shills to hype up every little project and technology they've been R&D'ing. That's the main difference.

Nvidia is valued 30x higher than Sony. Their bread and butter is AI. They started the ML upscaling tech and shit on both AMD's and Intel's efforts from a great height. It's absurd to think Sony has it all figured out on their first try, and to base all this off a YouTube marketing video.

A lot of investors are idiots and pump value into stocks based on trends that collapse in record time. We know this from history, so using the valuation argument means nothing here.

Besides, Microsoft is worth 30x+ more than Sony too, yet somehow they've been completely outplayed in the console space, both in sales and engineering ability, by a company significantly smaller than them. Too bad that market cap couldn't be of any benefit in areas where sheer money means so little 🤷‍♂️
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
Totally not delusional to assume that Sony, on their first try, will just leapfrog the market leader, one of the most valuable companies on the planet with an R&D budget Sony can't even dream of, in one of its core competencies...
Riiiiiight.
What if Clony "copies" Nvidia's homework? They have a history of doing stuff like this.
 
IGN finally released some proper high-quality footage, and it's a legit night-and-day difference compared to the OG stream:



It immediately invalidated the Spider-Man 2 screenshot posted by OP, which was obviously 30fps motion blur, but I'm not sure about Ratchet actually.

Pro IQ looks absolutely mindblowing already to me on this damn YT video.

Nice find. Looking at the video, the comparison between the performance mode (even if there's motion blur, which doesn't necessarily help) and the Pro using PSSR for upscaling actually looks promising.

On the other hand, they really should have used games like Alan Wake 2, FF XVI, or FF VII Rebirth, because the IQ in performance mode is very bad in those games and the difference would have had a bigger impact.

In a way it's a bit strange that Sony hasn't shown any games that use FSR for upscaling. I wonder if Sony didn't want to publicly show how bad FSR is, so as not to offend AMD, who remains their partner, even though we know the developer documentation compares PSSR directly to FSR to show how much better their upscaler is.
 
Last edited:
NVIDIA is competent. Microsoft isn't.

Well, when it comes to this specific area, likely yes. Though any serious upscaling tech/method Microsoft would make would be more hardware-agnostic and software-driven anyway. That's just in their nature as the type of company they are.

I mean, it's worth noting that DLSS wasn't developed by some massive team either. Last I remember there were like 5 authors on the paper.
And likely PSSR, as well as XeSS, AutoSR, and AppleSauce or whatever it's called, were all similarly staffed.
I expect similar for the non-ML alternatives in UE, Unity, FSR... too.

Good point, and it bodes well for PSSR. Also can't discount the R&D other divisions have been doing in similar areas for years now, or prior R&D in the space (or adjacent areas) by SIE going back many years or even decades.

They also have years of training data and experience from working on their tech and from developer input.

There are hundreds of games that support DLSS 2 and beyond. And this is before you get into stuff like Ray Reconstruction etc.

Yeah, sure, this is all true. I just hope people aren't saying this acting like Sony haven't been doing similar things for years if not decades. There are ray tracing experiments and demos going all the way back to the PS2. Systems like the PS2 were among the first breakthroughs for procedural generation in a mass-market gaming device. Checkerboard rendering was more or less the best upscaling option realistically possible on the AMD GPU hardware of that time.

PSSR will probably surprise a lot of people with how good it is, regardless of whether it beats Nvidia's best efforts right now (it probably won't). The way some others are going, though, you'd think PSSR is going to be a colossal disappointment, well behind even modest software-driven solutions, which is ridiculous.

16GB of PS5 Pro GDDR6 is analogous to 32GB of PC GDDR6, claims Sony.

Got a link? I'm guessing the GPU decompression has gotten much better vs. RDNA2, and maybe their own I/O decompression has been improved too (perhaps quite a bit beyond the 22 GB/s peak decompression, for starters).
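If that claim refers to keeping assets losslessly compressed in RAM and decompressing them on demand, the arithmetic only works out at roughly 2:1. A minimal sketch of that reading; the ratio and the helper function are illustrative assumptions, not Sony's figures:

```python
# Hypothetical reading of the "16 GB behaves like 32 GB" claim: assets stay
# compressed in memory and are decompressed on demand by the GPU/I-O block.
def effective_capacity_gb(physical_gb: float, compression_ratio: float) -> float:
    """Effective capacity if data is stored compressed at the given ratio."""
    return physical_gb * compression_ratio

print(effective_capacity_gb(16, 2.0))  # 32.0 -> the claim needs ~2:1 across the board
print(effective_capacity_gb(16, 1.5))  # 24.0 -> at a more conservative ratio
```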
 
Last edited:

BennyBlanco

aka IMurRIVAL69
Microsoft is also valued higher than Sony, so shouldn't Xbox hardware be miles better than PlayStation's?
Well said. Some people really talk like SIE just decided to get into the upscaling game only a few years ago, when really they've been at a lot of these things other companies get credit for today since the PS3... hell, even the PS2 days in some cases.

They just never fostered a cult of media influencers and shills to hype up every little project and technology they've been R&D'ing. That's the main difference.



A lot of investors are idiots and pump value into stocks based on trends that collapse in record time. We know this from history, so using the valuation argument means nothing here.

Besides, Microsoft is worth 30x+ more than Sony too, yet somehow they've been completely outplayed in the console space, both in sales and engineering ability, by a company significantly smaller than them. Too bad that market cap couldn't be of any benefit in areas where sheer money means so little 🤷‍♂️

If Sony was trying to make a PC operating system or an MS Office competitor, maybe this would be a viable comparison. Nvidia literally lives and dies by AI tech. MS is spread thin across 400 different things. This thread is gonna age like milk.
 

yogaflame

Member
I hope it can upgrade the majority of old games to 4K/60fps, either on the fly or with a few updates. But can it?
 
Last edited:

EDMIX

Writes a lot, says very little
I own a 4090, but I don't own those games on PC, so I'm not sure.

I think this makes sense if you consider that those games were made to run on PS5, so it's very likely they're well optimized for something like the PS5 Pro, whereas it's not like those games were made to max out a 4090. It could merely be that the games were not made with the 4090 in mind.

Kinda like telling us Demon's Souls runs better on PS3 than on a 4090: the reasoning might have more to do with one actually being made for that platform, and less to do with overall power.

This is btw why I tell people I own both consoles and a PC lol

Having a 4090 doesn't fucking mean every last game will even factor in its actual overall power; that's irrelevant if not many are actually making games AROUND that card in the first place.

So it isn't as absolute as many might think; it differs from game to game. I mostly got a 4090 so I can max out S.T.A.L.K.E.R. 2, as it's a game made by developers who primarily create on PC to max it out in the first place.
 

sachos

Member
It immediately invalidated the Spider-Man 2 screenshot posted by OP, which was obviously 30fps motion blur, but I'm not sure about Ratchet actually.
Yeah, in both Ratchet and SM2 it kinda looks like they disabled motion blur and depth of field. Maybe because they did not update the shutter speed of the motion blur to accommodate 60fps?
 
Yeah, in both Ratchet and SM2 it kinda looks like they disabled motion blur and depth of field. Maybe because they did not update the shutter speed of the motion blur to accommodate 60fps?
There is less need for strong motion blur at 60fps. It's still there, though.
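For context on the shutter-speed point: in the common shutter-angle camera model, the blur trail scales with the shutter-open time per frame, so a correctly retuned 60fps mode halves the trail automatically. A minimal sketch under that model; the numbers and function are illustrative, not Insomniac's actual implementation:

```python
# Per-frame camera motion blur under a shutter-angle model (illustrative only).
# Blur trail length ~ screen-space speed x shutter-open time per frame.
def blur_trail_px(speed_px_per_s: float, fps: float, shutter_angle_deg: float = 180.0) -> float:
    shutter_open_s = (shutter_angle_deg / 360.0) / fps  # exposure time per frame
    return speed_px_per_s * shutter_open_s

pan = 2000.0  # px/s camera pan
print(blur_trail_px(pan, 30))  # ~33 px trail at 30fps with a 180-degree shutter
print(blur_trail_px(pan, 60))  # ~17 px at 60fps: halves if the angle is kept
# If the shutter time isn't retuned when the framerate changes, the blur no
# longer matches what was authored for 30fps, which may be what's seen here.
```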
 

proandrad

Member
If Sony was trying to make a PC operating system or an MS Office competitor, maybe this would be a viable comparison. Nvidia literally lives and dies by AI tech. MS is spread thin across 400 different things. This thread is gonna age like milk.
No, you all are just so quick to jump on the hate train. I won't assume anything before seeing it in person. Sony's upscaling solution could look better in stills but terrible in motion, I don't know. I also don't assume the current version of DLSS is the best Nvidia has; Nvidia loves to hold on to their latest tech for a new hardware launch. DLSS 3.0 came out 2 years ago, and we have had just minor updates to it since. More than likely it will fall short of the current version of DLSS, but it's not completely unbelievable that it could match or beat it.
 
Last edited:

BennyBlanco

aka IMurRIVAL69
No, you all are just so quick to jump on the hate train. I won't assume anything before seeing it in person. Sony's upscaling solution could look better in stills but terrible in motion, I don't know. I also don't assume the current version of DLSS is the best Nvidia has; Nvidia loves to hold on to their latest tech for a new hardware launch. DLSS 3.0 came out 2 years ago, and we have had just minor updates to it since. More than likely it will fall short of the current version of DLSS, but it's not completely unbelievable that it could match or beat it.

Read the title of the thread and realize it's 100% based off a 9-minute YouTube marketing video. It's a remarkably dumb fanboy thread.

And Nvidia has also done marketing videos that oversold their stuff, like showing a 3090 running Wolfenstein in 8K. I don't believe any of this shit until these things are actually in people's hands.
 

Beechos

Member
I find it hard to believe that Nvidia, by far the leader in this, is beaten by Sony's solution. How does one even compare, since they are different platforms?
 
Very minor differences honestly, I don't even think you could distinguish them while playing, just my 2 cents.
But you can still notice the slightly sharper textures on the Pro (likely thanks to PSSR enhancing the textures a bit like DLSS), and here we can see that even in static moments, so it's not motion-blur induced.

But that's not the point. The point here (like in almost all games shown) is the PS5 Pro performing twice as fast as the PS5 for virtually the same image fidelity (with added RT shadows in SM2, even). This is objectively an impressive achievement by Cerny. Not cheap though, considering the price of the machine. Maybe he got a salary increase and it caused the price hike vs the PS4 Pro. :messenger_grinning_smiling:
 
Last edited:
Microsoft is also valued higher than Sony, so shouldn't Xbox hardware be miles better than PlayStation's?
So because MS has more money than Sony, AMD (which supplies them both) should sell them their hardware cheaper, or MS should eat bigger losses on each device? And since when has hardware been MS's core competence?
That comparison is utterly....

Stupidity Are You Stupid GIF
 
Last edited:

proandrad

Member
So because MS has more money than Sony, AMD (which supplies them both) should sell them their hardware cheaper, or MS should eat bigger losses on each device? And since when has hardware been MS's core competence?
That comparison is utterly....

Stupidity Are You Stupid GIF

You're coming across real salty in a conversation about speculation. I'm sure there will be Black Friday deals next year, so you can pick up a Pro at a discount and do your own testing to dunk on us.
 
You're coming across real salty in a conversation about speculation. I'm sure there will be Black Friday deals next year, so you can pick up a Pro at a discount and do your own testing to dunk on us.
Just because someone points out how little sense your utterings make doesn't make that person salty.
Any kind of emotionality in this short dialogue is entirely in your head....
 

Fafalada

Fafracer forever
Plus the idea of needing games running at 1,000fps to achieve the same look I get when playing at 60fps on my panel is simply too preposterous for me to accept.
MicroLED won't solve that for you either.
Panel makers could (low-persistence refresh has been solved in VR headsets for around a decade now), but as with most things in consumer electronics, the TV market is anything but customer-focused; they'd rather tell you what to buy than actually address real customer problems.
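For the curious, the oft-quoted "1,000fps" figure falls out of a simple approximation: perceived blur on an eye-tracked pan is roughly pan speed times persistence (how long each frame stays lit). A minimal sketch under that approximation, with illustrative numbers; it also shows why strobing/BFI helps at a fixed refresh rate:

```python
# Sample-and-hold motion blur approximation (illustrative numbers only).
# perceived blur (px) ~ eye-tracked pan speed x time each frame stays lit
def perceived_blur_px(pan_speed_px_per_s: float, frame_hz: float,
                      duty_cycle: float = 1.0) -> float:
    persistence_s = duty_cycle / frame_hz  # fraction of the frame the image is lit
    return pan_speed_px_per_s * persistence_s

pan = 1000.0  # px/s, a moderate camera pan
print(perceived_blur_px(pan, 60))                  # ~16.7 px: 60Hz sample-and-hold
print(perceived_blur_px(pan, 60, duty_cycle=0.2))  # ~3.3 px: strobing/BFI at 60Hz
print(perceived_blur_px(pan, 1000))                # ~1 px: the "1,000fps" target
```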
 

Kumomeme

Member
Microsoft is also valued higher than Sony, so shouldn't Xbox hardware be miles better than PlayStation's?
Haha, this reminds me of early this generation, when people were absolutely confident that the Xbox Series X was gonna stomp the PS5 in the ML department. You know, the debate claiming that console had all the latest features while the PS5 was just "RDNA 1.5" or whatever.

Fast forward... who's doing ML and who isn't right now?

Nick Offerman Smile GIF
 

Radical_3d

Member
Nvidia is valued 30x higher than Sony. Their bread and butter is AI. They started the ML upscaling tech and shit on both AMD's and Intel's efforts from a great height. It's absurd to think Sony has it all figured out on their first try, and to base all this off a YouTube marketing video.
Aren’t cameras doing image reconstruction since forever now? That’s why Apple ML solution surpass DLSS in some scenarios. Because they’ve been processing the image in the camera of their phones in real time before AI was popular. You’d think that Sony has some expertise there as well…
 

JimRyanGOAT

Member
Even if it matches a 3070, the entire PC race takes a massive L and should sell their PCs and buy a PS5 Pro.

I kid, I kid!
 
Last edited:

Sleepwalker

Member
I'll take your word for it, wouldn't know first hand as I'm a Panasonic plasma guy.

I've reached a point where I wouldn't touch a cheap TV or monitor with a stick, nor an LCD or OLED in general for that matter. In fact, unless microLED panels with FULL native motion resolution are available the moment all my plasmas die and no more can be found, I would legit stop any form of video-based entertainment until that happens.
It may seem ridiculous, I know, but after almost 20 years of Pioneer and Panasonic plasma, the PQ, latency, and near-black performance of any LCD, and the poor motion resolution, near-black performance, and artificial "plasticity" of OLED compared to the analog phosphor image, legit deprive me of any joy when I sit in front of them.

Plus the idea of needing games running at 1,000fps to achieve the same look I get when playing at 60fps on my panel is simply too preposterous for me to accept.
Wish they made a 4K plasma
 

SKYF@ll

Member
The details of PSSR will likely be clarified by DF, IGN, and others at a later date.
Sony's upscaling technology has always been excellent, so PSSR shouldn't be bad either.
The video is a comparison of upscaling from ultra-low-resolution, DVD-quality footage.
 

Gaiff

SBI’s Resident Gaslighter
I'll take your word for it, wouldn't know first hand as I'm a Panasonic plasma guy.

I've reached a point where I wouldn't touch a cheap TV or monitor with a stick, nor an LCD or OLED in general for that matter. In fact, unless microLED panels with FULL native motion resolution are available the moment all my plasmas die and no more can be found, I would legit stop any form of video-based entertainment until that happens.
It may seem ridiculous, I know, but after almost 20 years of Pioneer and Panasonic plasma, the PQ, latency, and near-black performance of any LCD, and the poor motion resolution, near-black performance, and artificial "plasticity" of OLED compared to the analog phosphor image, legit deprive me of any joy when I sit in front of them.

Plus the idea of needing games running at 1,000fps to achieve the same look I get when playing at 60fps on my panel is simply too preposterous for me to accept.
Plasma really does that to a man. I really wish we had been wiser and not let it get killed off by LCD. LCD was shit for years, and the older models weren't even close to plasma. My old Panasonic is still the best TV I've ever owned.
 

kevboard

Member
This is so weird to me. QD-OLED / MLA OLEDs are crazy good.

Not only crazy good, but also about 1000 times better than plasma, which often didn't even have true blacks. Plasmas also couldn't get as bright as QD-OLED or MLA OLED; hell, not even as bright as some standard WOLEDs.

Plasma was only good compared to the LCD tech it competed with. I think a lot of people have rose-tinted glasses on when reminiscing about plasma screens.
 

Vick

Gold Member
Wish they made a 4K plasma
Oh, they went well above that:



But yeah, not commercially, sadly.
Still, 1440p/4K downscaled looks absolutely out of this world on these plasmas. Especially because, once in motion, it's 1,080 full native lines vs the ~300 of a modern 4K panel, at whatever price point.

Plasma really does that to a man.
Will Smith Yes GIF by Bad Boys

I really wish we had been wiser and not let it get killed off by LCD. LCD was shit for years, and the older models weren't even close to plasma. My old Panasonic is still the best TV I've ever owned.
Heart Pointing GIF by Big Brother 2022
 
I'll take your word for it, wouldn't know first hand as I'm a Panasonic plasma guy.

I've reached a point where I wouldn't touch a cheap TV or monitor with a stick, nor an LCD or OLED in general for that matter. In fact, unless microLED panels with FULL native motion resolution are available the moment all my plasmas die and no more can be found, I would legit stop any form of video-based entertainment until that happens.
It may seem ridiculous, I know, but after almost 20 years of Pioneer and Panasonic plasma, the PQ, latency, and near-black performance of any LCD, and the poor motion resolution, near-black performance, and artificial "plasticity" of OLED compared to the analog phosphor image, legit deprive me of any joy when I sit in front of them.

Plus the idea of needing games running at 1,000fps to achieve the same look I get when playing at 60fps on my panel is simply too preposterous for me to accept.

The bolded part is absolutely not true anymore. I have a Panny VT60 and a Kuro 5090H. I recently got a TCL 845 mini-LED and it smokes both of my plasmas in terms of motion clarity and resolution in games, with BFI at 60Hz even! It has lower input lag than the plasmas, even with BFI and interpolation enabled. I can stand in a third-person game, pan the camera fast, and all the text in the background is readable. But again, my plasmas are better for 24Hz content like movies. And don't get me started on BFI at 120Hz on the TCL; it looks mind-blowingly good. There is also next to no crosstalk with BFI, even at 60Hz.

But the idea that you need 1,000fps to achieve the look of 60fps on a plasma is long gone. Any TCL mini-LED with a Pentonic 700 chipset (i.e. the TCL 805 or 845) will destroy ANY plasma for gaming, and I speak from first-hand experience.

I also had an X90J, an LG G2, and an LG C3; all were trash compared to the TCL mini-LED in terms of motion, and to the plasma. Returned all the OLEDs as they simply could not stack up at all.

TL;DR: plasmas have long since been beaten in motion resolution for gaming, but they still reign supreme for 24Hz movies, non-interpolated content, and colors. Strictly speaking gaming, though, the plasmas are outclassed by TCL. I'm actually considering giving away my two plasmas to make room for another TCL; it's just that good.
 
Last edited:

Vick

Gold Member
The bolded part is absolutely not true anymore. I have a Panny VT60 and a Kuro 5090H. I recently got a TCL 845 mini-LED and it smokes both of my plasmas in terms of motion clarity and resolution in games, with BFI at 60Hz even! It has lower input lag than the plasmas, even with BFI and interpolation enabled. I can stand in a third-person game, pan the camera fast, and all the text in the background is readable. But again, my plasmas are better for 24Hz content like movies. And don't get me started on BFI at 120Hz on the TCL; it looks mind-blowingly good.

But the idea that you need 1,000fps to achieve the look of 60fps on a plasma is long gone. Any TCL mini-LED with a Pentonic 700 chipset (i.e. the TCL 805 or 845) will destroy ANY plasma for gaming, and I speak from first-hand experience.

I also had an X90J, an LG G2, and an LG C3; all were trash compared to the TCL mini-LED in terms of motion, and to the plasma.

TL;DR: plasmas have long since been beaten in motion resolution for gaming, but they still reign supreme for 24Hz movies, non-interpolated content, and colors. Strictly speaking gaming, though, the plasmas are outclassed by TCL. I'm actually considering giving away my two plasmas to make room for another TCL; it's just that good.
Here:


Since the article posted by the OP states you need 1,000fps to display a blur-free image, and since the image on my Panasonic is natively blur-free, I find your post interesting. I am always referring to native motion resolution, remember, not interpolated, as that's unfortunately something I'll always find artificial. What's the motion resolution on the TCL, both native and interpolated?

Also, I want to add that I'm way past the point of no return on this. I too have a Kuro, my jewel KRP-500M, which is technically full motion res as well, and yet I still find a night-and-day difference with the Panasonic and somehow perceive a ton of motion blur on it, at both 24Hz and 60Hz. Not going to mention what happens with LCDs and OLEDs.

Lastly, when you say they still reign supreme "for 24Hz movies, non-interpolated content, and colors", I'll add near-black performance as well, and by a country mile. I simply need a TV that does it all. I need to watch BDs, play videogames, YT, compressed material, all on the same panel, all reference-looking all the time.
 
Here:


Since the article posted by the OP states you need 1,000fps to display a blur-free image, and since the image on my Panasonic is natively blur-free, I find your post interesting. I am always referring to native motion resolution, remember, not interpolated, as that's unfortunately something I'll always find artificial. What's the motion resolution on the TCL, both native and interpolated?

Also, I want to add that I'm way past the point of no return on this. I too have a Kuro, my jewel KRP-500M, which is technically full motion res as well, and yet I still find a night-and-day difference with the Panasonic and somehow perceive a ton of motion blur on it, at both 24Hz and 60Hz. Not going to mention what happens with LCDs and OLEDs.

Lastly, when you say they still reign supreme "for 24Hz movies, non-interpolated content, and colors", I'll add near-black performance as well, and by a country mile. I simply need a TV that does it all. I need to watch BDs, play videogames, YT, compressed material, all on the same panel, all reference-looking all the time.

My Panny also looks way better in motion; the only downside is the blacks look way better on my Kuro, I guess it has way fewer hours. By all those factors you stated, you're right to hang on to the plasma; it will be better than any modern set. I was just surprised and happy that I finally found a modern TV that could beat my plasmas for gaming, something I can abuse without worrying about burn-in or degradation. No idea what the motion res is on interpolated content, never measured it, just judged by eye in games side by side with the plasma. Native I would guess is sample-and-hold baseline; I always interpolate, so for me it's fine.
 
Last edited: