
Television Displays and Technology Thread: This is a fantasy based on OLED

RoboPlato

I'd be in the dick

Goldenhen

Member
HDMI 2.1 features have been announced, and they are significant for anyone with a gaming PC or planning to buy an Xbox Scorpio.

We already have existing solutions similar to VRR, but I can't think of a single TV manufacturer that has created a "gaming" TV with G-Sync or FreeSync in the last few years. All they care about is improving picture quality, because that's what most of their customers are buying their TVs for.

It's very likely TV manufacturers will use HDMI 2.1's eARC and Dynamic HDR as their selling points in 2018.

I wonder whether Game Mode VRR is mandatory in HDMI 2.1 or optional to support.
 

Kyoufu

Member
We already have existing solutions similar to VRR, but I can't think of a single TV manufacturer that has created a "gaming" TV with G-Sync or FreeSync in the last few years. All they care about is improving picture quality, because that's what most of their customers are buying their TVs for.

It's very likely TV manufacturers will use HDMI 2.1's eARC and Dynamic HDR as their selling points in 2018.

I wonder whether Game Mode VRR is mandatory in HDMI 2.1 or optional to support.

It's baked into the HDMI spec so if the TV has HDMI 2.1 then it has Game VRR.
 

Paragon

Member
The new TVs that use it are going to be expensive
There's no reason for it. Cheap Korean monitors were able to add support for FreeSync via firmware upgrades for zero cost whatsoever.
It turns out that lots of existing scalers can already support variable refresh rates - they just need the firmware to do it.
Some monitors just work without any modification if you trick the driver into thinking that the display supports FreeSync.
People have even managed to get FreeSync working on old CRT monitors.
Finally being an official part of the HDMI spec is the incentive TV manufacturers needed to implement this.

Then you have to have the content that supports the upgrade which is likely years away as well.
G-Sync and FreeSync basically just work with the majority of existing games on PC.
I see no reason why that should not also be the case for FreeSync/HDMI VRR support with Scorpio.
The point is that it doesn't need specialized content.

We already have existing solutions similar to VRR, but I can't think of a single TV manufacturer that has created a "gaming" TV with G-Sync or FreeSync in the last few years. All they care about is improving picture quality, because that's what most of their customers are buying their TVs for.
It's very likely TV manufacturers will use HDMI 2.1's eARC and Dynamic HDR as their selling points in 2018.
I wonder whether Game Mode VRR is mandatory in HDMI 2.1 or optional to support.
One is proprietary and requires replacing the TV's processing hardware with an FPGA - never going to happen.
The other either requires a DisplayPort connection or support for a vendor-specific HDMI extension.

Game Mode VRR is part of the HDMI 2.1 spec now - though perhaps an optional part of it. We don't know that yet.
That's still a significant difference from the other two.
 

Madness

Member
You are vastly overstating how quickly the technology will be implemented and how useful it will be to 90% of the populace right away. HDMI 2.1 is a major upgrade of course, but don't expect more than a few models, maybe the flagships, to have the ports in 2018, which means 2019 before you see them everywhere. Like I said, even in 2017 manufacturers like Sony are only offering 2 HDMI ports out of 4 that do HDR, even on the Z9D and A1E. Some lower-end LG and Samsung models only enhance one port for HDR, etc.

So anyone holding out thinking it'll come quick, be prepared to wait a while. Especially if you also wait till Black Friday for discounts.
 

Haint

Member
Hold out for HDMI 3.0 in 2020! Huge mistake buying a TV next year!

Or even better, hold out for HDMI 3.1ac-x in 2022!

But keep in mind, HDMI Super Hyper Turbo Ultra Street Fighter 4.0 is just around the corner too!

The thing is, 2.1 has all the makings of a genuine plateau point; there is nothing I can think of that it might be lacking for the foreseeable future. This is a first since HDMI fell behind the technology curve when 3D first came out. Literally every new spec since then has been missing a key feature that was promised to be added in the next revision. 2.1 is the first forward-looking version they've put out, and I fully expect it to remain current and relevant for several years.
 

Theonik

Member
It's baked into the HDMI spec so if the TV has HDMI 2.1 then it has Game VRR.
This is not true. That's like saying if a TV has HDMI it has 3D. Game VRR is a feature of the connection; the TV doesn't have to support it.
And how it is implemented will vary wildly between TVs.
 

J-Rzez

Member
Let's look at this 2160p/4K/UHD gen thus far.
- 4K/30
- 4K/60
- 4K/60 HDR
- 4K/60 HDR/DV/HLG
- 4K/60 HDRs + VRR.

Can't wait to see the must haves and "wait fors" in 2019/2020. Manufacturers are figuring out ways to create a smartphone environment rather than allowing people to keep their sets for longer than 2-3 years these days.

If you NEED or want a new TV, buy it. If you want one but can kinda wait, then wait until you say "eff it, I want something new" or your set dies and you need one.
 

Alexious

Member
Except there is no such thing as future-proofing, because things change; it is an inherent property of technology. I am sure there are a lot of people with AVRs from 5 years ago with Apple 30-pin connectors on them that were pushed as future-proof. Things change, and you can't say something will work forever.

The industry may start using HDMI 2.1 physically, but it will be years (years!) before content is out that takes advantage of it, and by then, it will be time to upgrade.

Exactly. From Digital Foundry's reveal that Scorpio would support VRR:

But it's the longer term outlook that is arguably more important. There's a reason why games target either 60fps or 30fps: both divide equally into the 60Hz output of a traditional screen, meaning a smooth, consistent update. With the display refresh put in the developer's hands, arbitrary performance targets like 40fps or 45fps could be targeted. We've tested both on PC using a G-Sync screen running games with Riva Tuner Statistics Server's frame-rate cap in place and both of these frame-rates look so much better than the console standard 30fps. With games that target 60fps, performance drops down to around 50fps are really difficult to pick up on owing to the lack of tearing and reduced v-sync judder.

Of course, reaching the point where VRR is specifically targeted by developers will probably take years, but the point is that take-up from manufacturers - especially in living room displays - is going to require a mainstream piece of gaming technology to drive adoption. We've already talked about how Scorpio pushes console hardware to the next level, and we wanted to isolate FreeSync and VRR support and showcase it as an example of just how much attention to detail is going into the next Xbox. There's no real need for Microsoft to support this, but the fact that it is - and that the most hardcore gamers will appreciate it - illustrates just how much the focus has changed at Xbox HQ. And of course, the chances are that if one console vendor pushes ahead with supporting the tech, the competition will follow.

VRR on consoles will begin to be huge when developers start targeting 40-45 FPS with no tearing problem. But it will take years. I expect PS5 will be out by the time it becomes relatively widespread.
 
Of course it'd take years before developers specifically target a non-standard refresh rate as any kind of priority - there are too many sets out there tied to 60Hz - but the point is people will see the benefits on existing software long before that.
 

Kyoufu

Member
This is not true. That's like saying if a TV has HDMI it has 3D. Game VRR is a feature of the connection; the TV doesn't have to support it.
And how it is implemented will vary wildly between TVs.

3D requires a special filter though? LG removed it to increase their panel brightness for the OLED range. Why would they disable Game VRR? I see no upside to not supporting a feature that will be part of the HDMI spec.
 

Caayn

Member
Let's look at this 2160p/4K/UHD gen thus far.
- 4K/30
- 4K/60
- 4K/60 HDR
- 4K/60 HDR/DV/HLG
- 4K/60 HDRs + VRR.

Can't wait to see the must haves and "wait fors" in 2019/2020. Manufacturers are figuring out ways to create a smartphone environment rather than allowing people to keep their sets for longer than 2-3 years these days.

If you NEED or want a new TV, buy it. If you want one but can kinda wait, then wait until you say "eff it, I want something new" or your set dies and you need one.
And 4K 120Hz, Dolby Atmos over ARC (eARC), and higher colour depth. People are seriously underestimating the upgrade that HDMI 2.1 is, even for 4K sets. Current panels can already do 4K 120Hz with 10/12-bit colour, but HDMI 2.0 keeps them from reaching the full potential of the panel and forces them to use chroma subsampling to stay within the bandwidth limits of 2.0.
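To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It ignores blanking intervals and link-encoding overhead, so the real requirements are a bit higher than these figures, but the comparison still holds.

Code (Python):
# Approximate uncompressed video bandwidth vs. what each HDMI version can carry.
def video_gbps(width, height, fps, bit_depth, chroma="4:4:4"):
    # bits per pixel = bit depth x (luma plane + chroma planes after subsampling)
    chroma_factor = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bit_depth * chroma_factor / 1e9

HDMI_2_0 = 14.4   # roughly usable Gbps (18 Gbps TMDS link, 8b/10b encoding)
HDMI_2_1 = 42.7   # roughly usable Gbps (48 Gbps FRL link, 16b/18b encoding)

for fps in (60, 120):
    for chroma in ("4:4:4", "4:2:0"):
        need = video_gbps(3840, 2160, fps, 10, chroma)
        print(f"4K{fps} 10-bit {chroma}: ~{need:.1f} Gbps | "
              f"fits 2.0: {need <= HDMI_2_0} | fits 2.1: {need <= HDMI_2_1}")

Even 4K60 with 10-bit colour roughly overshoots 2.0 once you want full 4:4:4 chroma, and 4K120 needs the new link regardless.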
Are 21:9 TV sets dead?
Philips tried it a few years ago, before 21:9 monitors were even in the picture. Needless to say it bombed, and I haven't seen any TV maker give it another shot since.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Yep. The Philips one is the one I knew. Sad to hear 21:9 is not a thing.
 

MrJames

Member
For anyone who owns an Oppo UHD player, they just released a beta FW update that adds support for Dolby Vision.


The first DV discs release on 6/6.
 

tokkun

Member
The thing is, 2.1 has all the makings of a genuine plateau point; there is nothing I can think of that it might be lacking for the foreseeable future. This is a first since HDMI fell behind the technology curve when 3D first came out. Literally every new spec since then has been missing a key feature that was promised to be added in the next revision. 2.1 is the first forward-looking version they've put out, and I fully expect it to remain current and relevant for several years.

I don't really understand the point of this argument. Whether it takes 1 year or 10 years to replace HDMI 2.1 really makes no difference for the purpose of your TV purchase.

It's clear that any TVs that support HDMI 2.1 in 2018 are only going to support a subset of the spec, because no one expects them to be putting out 10K TVs next year. So if TVs support a small part of HDMI 2.1 in 2018, then a little more in 2019, and a little more in 2020, how is that any different to you than if they call it HDMI 2.1, HDMI 2.2, and HDMI 2.3? It is irrelevant what the specification could theoretically support if those features are not supported on your TV (or your sources for that matter).
 

gsrjedi

Member
This guy Matt posts on AVS Forums and he works for Vizio (I'm not sure of his exact position). He mentioned both 2.1 and the challenges of VRR for televisions. I don't know how accurate this is, but it's something to consider with all of this 2.1 talk recently.


Matt McRae said:
Both Freesync and HDMI 2.1 require significant hardware changes...

Freesync is very cool and something we are looking at but there are many challenges in the TV system architecture (much easier to pull off in a monitor).

HDMI 2.1 is interesting... I know everyone is going to be demanding HDMI 2.1 to be on TVs ASAP. The standard is not even final (got delayed again) so unlikely to appear widely in 2018 unless TV systems make a gamble and commit to sketchy future upgrades. However, it really adds very little functionality to 4k usage models since HDMI 2.0a covers most resolution and color depths. It is a big deal for 8k but don't get me started on 8k...

Matt McRae said:
Freesync and similar technologies work by changing the normal regular cadence for screen draws... in a normal TV signal we paint the frame on a very regular heartbeat (30, 60, 120 frames per second) and the whole system is timed to follow this timing. Freesync changes this and the source device (usually a PC graphics card) controls the timing, which can be varied and dynamic. So the graphics card waits until it can render a frame... then it sends it to the TV for drawing. But these frames are not necessarily being rendered in a regular cadence. So the TV framerate is variable and "locked" to a variable render rate from the graphics card.

A monitor that has no real processing just locks onto this signal and draws... but a TV has a deep and complex video and PQ pipeline that relies on a normal, regular framerate. So you either have to create a separate video path that bypasses all of the processing and get TV panel technologies (TCON etc) to change to adapt to variable timing... or you have to change the panel AND completely rearchitect the TV SoC to accept a variable frame rate that is locked to an external source (VERY difficult).
 

Paragon

Member
Exactly. From Digital Foundry's reveal that Scorpio would support VRR:
VRR on consoles will begin to be huge when developers start targeting 40-45 FPS with no tearing problem. But it will take years. I expect PS5 will be out by the time it becomes relatively widespread.
They're talking about developers specifically targeting framerates that are not-30 FPS.
Frankly I would be surprised if that ever happens. That's not the point of VRR.
The point of VRR is that you can run at an unlocked framerate and it still plays smoothly - smoother, in fact.

As an example:
With a fixed 60Hz refresh rate, 35 FPS is less smooth than 30 FPS, since it's not synced with the refresh rate and you get uneven frame pacing, causing constant stutter.
With a variable refresh rate display, 35 FPS would be smoother than 30 FPS since the display refreshes at 35Hz instead of 60Hz.
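If anyone wants to see why that is, here's a tiny simulation sketch in Python. It's a simplified model: V-Sync shows each frame at the next 60Hz refresh tick, while a VRR panel refreshes the moment a frame is ready.

Code (Python):
import math

REFRESH = 1000 / 60   # 16.7 ms per refresh on a fixed 60Hz display
FRAME   = 1000 / 35   # 28.6 ms to render each frame at 35 FPS

ready = [n * FRAME for n in range(10)]                            # when frames finish rendering
vsync = [math.ceil(t / REFRESH - 1e-9) * REFRESH for t in ready]  # shown at the next refresh tick
vrr   = ready                                                     # panel follows the source

def pacing(times):
    # time each frame stays on screen before the next one replaces it
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("60Hz + V-Sync:", pacing(vsync))  # alternates ~33.3 and ~16.7 ms -> visible judder
print("VRR:          ", pacing(vrr))    # steady ~28.6 ms every frame -> smooth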

So the developer sets their game to lock to 30Hz V-Sync, instead of running at 35 FPS.
When you connect up a VRR display, V-Sync is disabled and it now runs unlocked at a higher framerate.

Even if the game has a hard 30 FPS cap, rather than just being V-Synced at 30Hz, due to something like tying game logic to framerate, just replacing V-Sync with VRR should still reduce input lag by at least 67ms, and typically 100ms or more for console games.
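A rough sketch of where numbers like that come from, assuming the V-Synced pipeline is holding two to three buffered frames at 30 FPS (that buffer count is my assumption for illustration, not a measured figure):

Code (Python):
# Rough model: each queued frame in a V-Synced 30 FPS pipeline adds on the
# order of one full frame time of latency before the image reaches the screen.
frame_time = 1000 / 30             # 33.3 ms per frame at 30 FPS
print(f"{2 * frame_time:.0f} ms")  # ~67 ms with two buffered frames
print(f"{3 * frame_time:.0f} ms")  # ~100 ms with three buffered frames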

The point of VRR is that you don't have to lock to a certain framerate any more, not that you can now build a game to run at something that's not 30 or 60.
Fixed refresh rate displays are still going to be relevant for a long time, so they will still be targeting those - but the games should run above that when you hook up a VRR display.

Can't wait to see the must haves and "wait fors" in 2019/2020. Manufacturers are figuring out ways to create a smartphone environment rather than allowing people to keep their sets for longer than 2-3 years these days.
If you NEED or want a new TV, buy it. If you want one but can kinda wait, then wait until you say "eff it, I want something new" or your set dies and you need one.
The point is that HDMI VRR and native 120Hz support are significant advances - specifically for gamers.
Frankly I can't think of anything so significant in a long time.
Don't try to downplay how big a deal it is.

Going forward there might be improved VRR support (expanded ranges or other improvements) and higher refresh rates than 120Hz, but the jump from the fixed 60Hz refresh rate that televisions have been using since their introduction, to a variable 120Hz, is a huge deal.

I ended up keeping my previous TV far longer than I originally intended - and as a result of that, I will not buy a new TV until they support VRR, so long as my current one is still working.
It's a feature worth waiting for if you don't replace your TV every 2-3 years - especially now that OLEDs are here, which means year-over-year improvements seem to be slowing down again and people will feel less need to upgrade.
While there are things you can improve, they're only minor differences for a lot of people, compared to moving from a limited-contrast display to one with perfect black levels and <1ms response times.

Philips tried it a few years ago, before 21:9 monitors were even in the picture. Needless to say it bombed, and I haven't seen any TV maker give it another shot since.
It's a shame, because - just like 3DTVs - they may have "poisoned the well" by releasing products before their time.
A 5120x2160 curved OLED would be incredible.
Frankly, as a PC gamer I would actually prefer 3840x1620 since that would make high framerate gaming easier, but it would be dead in the water if they tried to sell an ultrawide TV with less than 4K resolution.
 

Theonik

Member
3D requires a special filter though? LG removed it to increase their panel brightness for the OLED range. Why would they disable Game VRR? I see no upside to not supporting a feature that will be part of the HDMI spec.
Only passive 3D; active 3D just needs a faster, brighter screen.

And the reason is that it costs money to implement. There is no removing of features; they choose what to implement.
 
Just as a side note, the only reason I recommended DangerMouse wait a bit longer is that we both own the same TV model (Sony X850C: 4K/HDR-ready, 400 nits peak brightness, ~32ms input lag for 1080p/4K/HDR). If I didn't own it, I would definitely consider buying a TV ASAP, like the Sony X900E or an OLED.

Yeah, I appreciated it. It's at least making me be thorough before doing it, though I probably still will. I do still love this TV and we wouldn't be letting it go anywhere even if I do get this; it'd replace something ancient in my computer room, which means I've gotta be thorough now, since I'll want these two TVs to last a while, just like the two they will have replaced.
 

h3ro

Member
Just threw my name in the hat for one of the new TCL P series televisions. We're in the process of moving and will need an extra TV for the guest bedrooms. The built-in Roku software is gonna be a real benefit, as I won't have to pick up a new Apple TV.
 

Haint

Member
I don't really understand the point of this argument. Whether it takes 1 year or 10 years to replace HDMI 2.1 really makes no difference for the purpose of your TV purchase.

It's clear that any TVs that support HDMI 2.1 in 2018 are only going to support a subset of the spec, because no one expects them to be putting out 10K TVs next year. So if TVs support a small part of HDMI 2.1 in 2018, then a little more in 2019, and a little more in 2020, how is that any different to you than if they call it HDMI 2.1, HDMI 2.2, and HDMI 2.3? It is irrelevant what the specification could theoretically support if those features are not supported on your TV (or your sources for that matter).

It's all supposition at this point; we have no idea how/if manufacturers are going to pick and choose features. The only thing that's "clear" is that you're guaranteed to be shit out of luck buying a 2.0 set, save for perhaps Dynamic HDR10 being patched in via firmware. The features everyone cares about (>60Hz 4K, VRR, and Dynamic HDR) will all but certainly be in high-end sets by 2019; it's the only way they can differentiate them and justify the price premiums. Will they all make the 2018s? Maybe, maybe not, but at least there's a possibility. High-frame-rate 4K specifically has a very good chance of making the 2018s, VRR probably much less so.

This guy Matt posts on AVS Forums and he works for Vizio (I'm not sure of his exact position). He mentioned both 2.1 and the challenges of VRR for televisions. I don't know how accurate this is, but it's something to consider with all of this 2.1 talk recently.

Translation: Please buy our current models! Seriously guys 2.1 sucks, Everything_Was_8k_Dolan.jpg. We were a full year behind everyone else in implementing HDR10, we're probably going to be a year behind in implementing 2.1. By "unlikely to appear widely", I mean not Vizio and not low end models.
 

Theonik

Member
That none of the newest disc formats since DVD support anamorphic encodes and just waste resolution on black bars is still annoying as hell.
Wasting resolution on black bars isn't really true though; black bars compress extremely well, so the cost is minimal. Anamorphic encoding doesn't make sense because our TVs at this moment have fixed-size square pixels. The way it worked on CRTs is they simply used non-square pixels to stretch the image accordingly. With current digital encodings you would have to either have more pixels than most TVs could ever display, or 'waste space' on black bars.
 

tokkun

Member
It's all supposition at this point; we have no idea how/if manufacturers are going to pick and choose features. The only thing that's "clear" is that you're guaranteed to be shit out of luck buying a 2.0 set, save for perhaps Dynamic HDR10 being patched in via firmware. The features everyone cares about (>60Hz 4K, VRR, and Dynamic HDR) will all but certainly be in high-end sets by 2019; it's the only way they can differentiate them and justify the price premiums. Will they all make the 2018s? Maybe, maybe not, but at least there's a possibility. High-frame-rate 4K specifically has a very good chance of making the 2018s, VRR probably much less so.

Actually, I'm not super-confident about 4K/120 either. I think they'll only add it if they get it for free - i.e. they don't have to pay more to upgrade the SoC or other parts of the data processing pipeline to support it.

The market for 4K/120 has got to be a microscopic fraction of even the high-end TV market. Most people don't have a PC hooked up to their TVs, and most of those who do don't have a rig capable of doing 4K/120. Remember, high-end TVs still pander to people who use a sound bar instead of owning a good set of speakers.

I think you can look at the treatment of 1080/120 for reference. It only recently started getting official support in sets despite being technically possible for a long time. And it's not something reviewers seem to spend much time talking about, if they even mention it at all. So I will be surprised if 4K/120 is seen as some must-have marketing feature in 2018.
 

SOLDIER

Member
Hoping someone here might know a fix for this, since it's kind of a unique OLED+Receiver problem.

Long story short, whenever I've got my PC displayed on my TV and switch to Steam Big Picture Mode, my receiver will change the audio input to AV1 (the default audio source for that receiver). I manually have to change it back to the correct input so that I'm getting sound through my TV.

I can't disable HDMI link entirely since I need it to have the receiver work with my TV (since it uses ARC). Is there a way to disable HDMI link on my PC? I don't know if that will keep the receiver from switching audio inputs, but that's the only fix I can think of at the moment.
 

Smokey

Member
HDMI 2.1 features have been announced, and they are significant for anyone with a gaming PC or planning to buy an Xbox Scorpio.
It probably affects the PlayStation 4 Pro too, if they can add support via a firmware update. If not, the next hardware revision.

People worrying over 20ms vs 30ms input lag is laughable compared to the difference that VRR makes.
The TV side of input lag is only 20-30ms on top of the 100ms+ that V-Sync causes at 30 FPS on consoles.
VRR eliminates the V-Sync lag.

And that's aside from the whole "variable refresh" thing, which makes games play a lot smoother, and allows for unlocked framerates rather than having to cap at 30 or 60.

HDMI 2.1 also supports 4K at 120Hz, which is a huge deal for PC gamers.
The panels used in TVs today are already 120Hz native, they just haven't had an input which supports that yet.

It's the biggest upgrade in a long time - especially for gamers.
If you are the sort that is planning on keeping your next TV for many years, it is absolutely worth waiting for 2018's sets unless you actually need a new TV now.

For some reason people fail to grasp this. It's a huge change and if you don't need a TV now, there is no reason to not wait and see what the 2.1 adoption looks like in 2018.
 

Kyoufu

Member
Actually, I'm not super-confident about 4K/120 either. I think they'll only add it if they get it for free - i.e. they don't have to pay more to upgrade the SoC or other parts of the data processing pipeline to support it.

LG showed off a prototype OLED model with 4K/120 last year so that is definitely coming when HDMI 2.1 is available.
 
I know the newest LG OLED's now have input lag at around 22ms which is fantastic. That's only a little bit more than one frame (out of 60 per second).

Do you think they could ever get it to 1 frame or less? (16.6ms). My PC Monitor is an ASUS at 11ms so to have a 4k/HDR OLED that's only 0.005 seconds slower would be incredible.
 
This guy Matt posts on AVS Forums and he works for Vizio (I'm not sure of his exact position). He mentioned both 2.1 and the challenges of VRR for televisions. I don't know how accurate this is, but it's something to consider with all of this 2.1 talk recently.

Or they could just tie VRR to game mode, which in turn turns off all the bullshit processing that everyone turns off anyway when gaming, and voila?

Am I missing something here?
 
This is not true. That's like saying if a TV has HDMI it has 3D. Game VRR is a feature of the connection; the TV doesn't have to support it.
And how it is implemented will vary wildly between TVs.

That's not correct either. VRR works on lower HDMI specs (ie: Freesync). If the TV has a dynamic scaler and the firmware supports it, then it has VRR.
 

Theonik

Member
That's not correct either. VRR works on lower HDMI specs (ie: Freesync). If the TV has a dynamic scaler and the firmware supports it, then it has VRR.
VRR in the HDMI 2.1 sense is only available on HDMI 2.1. The specification is capable of it but both devices must support it.

Freesync over HDMI doesn't actually use standard HDMI but uses vendor extensions that both devices need to support.
 

Madness

Member
I know the newest LG OLED's now have input lag at around 22ms which is fantastic. That's only a little bit more than one frame (out of 60 per second).

Do you think they could ever get it to 1 frame or less? (16.6ms). My PC Monitor is an ASUS at 11ms so to have a 4k/HDR OLED that's only 0.005 seconds slower would be incredible.

They could; the question is whether they would, or what kind of image quality you end up with in game mode to try and get lower. Most manufacturers have been able to get to 22-30ms for all 4K and HDR models this year. I would imagine that they could probably get under 20ms next year for their OLEDs.
 

Weevilone

Member
Translation: Please buy our current models! Seriously guys 2.1 sucks, Everything_Was_8k_Dolan.jpg. We were a full year behind everyone else in implementing HDR10, we're probably going to be a year behind in implementing 2.1. By "unlikely to appear widely", I mean not Vizio and not low end models.

This is a dick post. The Vizio posts mentioned are from their CTO and he's very candid and helpful on AVS. His participation is amazing and a primary reason I'd consider buying their products. He regularly takes community feedback and acts upon it for updates and such.

I'm not sure what you are complaining about with a Vizio HDR10 either, as the unit I had incorporated both Dolby Vision and HDR10.
 

Paragon

Member
Wasting resolution on black bars isn't really true though; black bars compress extremely well, so the cost is minimal. Anamorphic encoding doesn't make sense because our TVs at this moment have fixed-size square pixels. The way it worked on CRTs is they simply used non-square pixels to stretch the image accordingly. With current digital encodings you would have to either have more pixels than most TVs could ever display, or 'waste space' on black bars.
Anamorphic encoding still makes sense if your source has more resolution.
For Blu-rays mastered in 4K, they could have been encoded in an anamorphic 1920x1080 instead of a letterboxed 1920x1080.
Assuming 2.37:1, when letterboxed the effective resolution on disc is only 1920x810.

On a 1920x1080 display, you would not see a difference between the two, because that 1920x1080 anamorphic encode would still have to be displayed at 1920x810.
But on a 2.37:1 Ultrawide display (2560x1080) or a 4K 16:9 display (3840x1620) you gain an extra 33% resolution from the anamorphic encoding.
And if you assume that 4K is not the end-game for resolution - which it does not appear to be - then anamorphic 4K encodes would still make sense too.
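The 33% figure is just the ratio between the two encodes - a quick check in Python, assuming a 2.37:1 film stored inside a 1920x1080 frame:

Code (Python):
ASPECT = 2.37
letterboxed = round(1920 / ASPECT)            # ~810 active lines stored on a letterboxed disc
anamorphic  = 1080                            # full 1080 lines used, image squeezed horizontally
print(letterboxed)                            # 810
print(f"{anamorphic / letterboxed - 1:.0%}")  # ~33% more vertical resolution to work with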

For some reason people fail to grasp this. It's a huge change and if you don't need a TV now, there is no reason to not wait and see what the 2.1 adoption looks like in 2018.
I suspect that they do, but are convincing themselves that they don't need it, by telling other people that it doesn't matter.
I saw it all the time back when I was reviewing TVs.
Most people commenting on the reviews or discussing TVs on forums don't actually want to know the technical details or faults with the display - they're looking to validate their purchases.

Actually, I'm not super-confident about 4K/120 either. I think they'll only add it if they get it for free - i.e. they don't have to pay more to upgrade the SoC or other parts of the data processing pipeline to support it.
That's true. I'm not entirely convinced that 120Hz will happen in 2018 - though LG have been showing off 120Hz demos for their OLEDs since 2016, so I really hope that it happens.
All the processing happens at 120Hz on today's sets already, since they support interpolation to 120Hz or higher.
120Hz is going to happen eventually though. The question is if the hardware will be ready for it in 2018.
My hope is that HDMI 2.1 connections are going to require a minimum bandwidth that can handle 4K120, unlike HDMI 2.0 which allowed for a subset of 2.0 features to be enabled on what was effectively HDMI 1.4 hardware.

Remember, high-end TVs still pander to people who use a sound bar instead of owning a good set of speakers.
That's more logistics than anything else.
High-end speakers are big, expensive, and the room setup matters a ton as well.
High-end sound also requires that you have your own place without shared walls, unless you want to become that neighbor.
Sound bars are easy.

All of those reasons and more are why I've invested a lot into a nice headphone setup rather than speakers.

I think you can look at the treatment of 1080/120 for reference. It only recently started getting official support in sets despite being technically possible for a long time. And it's not something reviewers seem to spend much time talking about, if they even mention it at all. So I will be surprised if 4K/120 is seen as some must-have marketing feature in 2018.
1080p120 is far less interesting than 4K120 though.
If support had been around back when panels were still 1080p native, it would have been a lot more interesting.
Or perhaps if any 4K TVs actually supported simple "pixel doubling" so that they looked like a 1080p native panel with a 1080p input, instead of giving you a blurry scaled image.

I know the newest LG OLED's now have input lag at around 22ms which is fantastic. That's only a little bit more than one frame (out of 60 per second).
Do you think they could ever get it to 1 frame or less? (16.6ms). My PC Monitor is an ASUS at 11ms so to have a 4k/HDR OLED that's only 0.005 seconds slower would be incredible.
I think it's unlikely that it would be below 1 frame - though it is possible.
I might argue that it's better to buffer a frame and use fast scan-out, than trying to get latency below 1 frame of lag.

Eliminating V-Sync lag by using VRR is going to make a much bigger difference for latency in games though.
That's likely to reduce latency by 50ms with 60 FPS games and 100ms with 30 FPS games.
On the TV side of things, you only have another 22ms to go.
 

Haint

Member
This is a dick post. The Vizio posts mentioned are from their CTO and he's very candid and helpful on AVS. His participation is amazing and a primary reason I'd consider buying their products. He regularly takes community feedback and acts upon it for updates and such.

I'm not sure what you are complaining about with a Vizio HDR10 either, as the unit I had incorporated both Dolby Vision and HDR10.

It was intended to be. Panasonic also had a high-ranking executive unofficially "community managing" AVS back around the '10, '11, '12 era. He didn't stay long and quickly moved to another forum, as IIRC he felt the AVS mods should be doing more to censor arguments/criticism people would direct at Panasonic/him. Ultimately this guy is there to push his product, either overtly or by currying favor with the videophile community. The subtext of his posts certainly suggests Vizio is taking a wait-and-see approach to 2.1 for various reasons, the importance of which he is naturally trying to marginalize. I have a sore spot for spin, hence the dick post.

With regards to HDR, they released a half-assed HDR10 update several months after the TVs had been available, in August or September of last year (2016). It was largely broken, and from what I gather they have to go through devices one by one to get them functioning. The Xbone S and PS4 did not function in HDR on their TVs well into November/December, last I saw. No idea when/if they fixed them; maybe they still don't work? Sony and Samsung released their first HDR10 updates in like September of 2015.
 
It seems my LG OLED55B6V has a weird bug where TruMotion turns on by itself. I have it turned off, but sometimes (quite often, actually) when I change the channel the 'soap opera effect' is there again, full stop. What's even weirder is that TruMotion is still off in the settings. Rebooting the TV fixes the issue.

Is my tv busted?
 

Paragon

Member
I hope people would start to realize that nearest neighbor (box filter) is THE worst kind of scaling.
https://upload.wikimedia.org/wikipedia/commons/e/e9/2xsai_example.png
left = nearest neighbor, right = 2x sai
See also: http://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf
Depends on context. There are times where nearest neighbor scaling is appropriate, and anything else is going to give you a worse result that just looks blurred.
TVs generally don't give you multiple scaler options, you just get what it gives you.
Having the option to use nearest neighbor scaling with 720p/1080p sources on a 4K native screen at least gives you an image which looks just like you're viewing it on a 720p/1080p native panel.
Even if that may not be the absolute best result, it may be a better option than the TV's video-focused scaling algorithms. Instead of a blurry and aliased image with ringing artifacts, you at least get a sharp and aliased image without those artifacts.
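A minimal sketch of what that integer-ratio "pixel doubling" amounts to (NumPy is used here purely for illustration; a TV's scaler obviously isn't written in Python). 1080p to 4K is an exact 2x ratio, so nearest neighbour just repeats each source pixel into a 2x2 block and adds no blur or ringing of its own.

Code (Python):
import numpy as np

def nearest_neighbour_2x(frame: np.ndarray) -> np.ndarray:
    # frame has shape (height, width, channels); repeat rows, then columns
    return frame.repeat(2, axis=0).repeat(2, axis=1)

src = np.zeros((1080, 1920, 3), dtype=np.uint8)  # a 1080p frame
out = nearest_neighbour_2x(src)
print(out.shape)                                 # (2160, 3840, 3) - a clean 4K frame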
 

tokkun

Member
That's more logistics than anything else.
High-end speakers are big, expensive, and the room setup matters a ton as well.
High-end sound also requires that you have your own place without shared walls, unless you want to become that neighbor.
Sound bars are easy.

All of those reasons and more are why I've invested a lot into a nice headphone setup rather than speakers.

I agree, logistics are a problem. That is what I was trying to get at - 4K/120 content is currently a logistical problem (among others). The only source with any meaningful amount of content is high-end PCs. If you want to play modern games in 4K with a solid 120fps, you aren't going to be able to do it with a small and discrete console-like system. You are going to need something bulky, noisy, and hot (not to mention expensive). Moreover, the high bandwidth needed for a 4K/120 signal can be a challenge; a lot of people already struggle to get 4K/60 to work on PC->TV connections requiring a cable run of > 25 feet. So the PC probably needs to be in the same room as the TV.

1080p120 is far less interesting than 4K120 though.
If support had been around back when panels were still 1080p native, it would have been a lot more interesting.
Or perhaps if any 4K TVs actually supported simple "pixel doubling" so that they looked like a 1080p native panel with a 1080p input, instead of giving you a blurry scaled image.

At the end of the day, a format is only appealing if there is content available for it.

I would like to remain optimistic that 4K/120 will eventually move out of the domain of the super-enthusiast gamer and be supported in mass market source devices - driven both by technical progress in GPUs and rising popularity of VR - but that seems more like ~5 years off to me. It would be nice if video came to the rescue and popularized the format, but HFR movies do not seem to be gaining traction. The one area where I could see there being mass market appeal for 4K/120 would be in sports. However, that seems unlikely to happen any time soon because sports are mostly live broadcasts, and broadcasters are not going to want to do 4K/120 due to bandwidth constraints.

So yeah, I think we'll only get support for 4K/120 if it costs almost nothing to implement and some nerdy engineer at the TV manufacturer feels like championing it. It would at least give them a nice demo reel to show off during their CES presentation.
 

tokkun

Member
Is the dynamic HDR with 2.1 gonna be a big deal?

Not from a technical perspective. Dolby Vision will probably still be higher quality and has better hardware compatibility - it works even on HDMI 1.4. Dolby has already demoed DV working on PS4 Pro, which obviously will not be compatible with the HDMI 2.1 format. Nvidia also added DV support to current GPUs via a recent driver.

The main purpose of the HDMI 2.1 dynamic HDR is not so much for consumers, but for manufacturers who do not want to pay the Dolby Vision royalty. In particular, Samsung has resisted adding DV support to their TVs.

If you already own a TV that supports Dolby Vision, then HDMI 2.1's dynamic HDR probably won't make a difference for you. It will only be a big deal if manufacturers of source devices decide they don't want to support DV anymore. However, I think the more likely outcome is that everything but the very low-end devices support both formats, like it is with the competing DTS and Dolby 5.1 audio formats. Even more budget-oriented brands like Vizio and TCL support DV now, so it doesn't seem like the royalty is that much of a barrier.
 

Paragon

Member
I agree, logistics are a problem. That is what I was trying to get at - 4K/120 content is currently a logistical problem (among others). The only source with any meaningful amount of content is high-end PCs. If you want to play modern games in 4K with a solid 120fps, you aren't going to be able to do it with a small and discrete console-like system. You are going to need something bulky, noisy, and hot (not to mention expensive). Moreover, the high bandwidth needed for a 4K/120 signal can be a challenge; a lot of people already struggle to get 4K/60 to work on PC->TV connections requiring a cable run of > 25 feet. So the PC probably needs to be in the same room as the TV.

At the end of the day, a format is only appealing if there is content available for it.

I would like to remain optimistic that 4K/120 will eventually move out of the domain of the super-enthusiast gamer and be supported in mass market source devices - driven both by technical progress in GPUs and rising popularity of VR - but that seems more like ~5 years off to me. It would be nice if video came to the rescue and popularized the format, but HFR movies do not seem to be gaining traction. The one area where I could see there being mass market appeal for 4K/120 would be in sports. However, that seems unlikely to happen any time soon because sports are mostly live broadcasts, and broadcasters are not going to want to do 4K/120 due to bandwidth constraints.

So yeah, I think we'll only get support for 4K/120 if it costs almost nothing to implement and some nerdy engineer at the TV manufacturer feels like championing it. It would at least give them a nice demo reel to show off during their CES presentation.

I think you're forgetting about the "variable" aspect of Variable Refresh Rate displays.
You're correct that a locked 4K 120 FPS at maxed-out settings in new games is unrealistic on today's hardware.
The whole point of VRR though, is that you no longer have to keep the framerate locked to 120 FPS at 120Hz for things to look good.

Old games might run at a locked 4K120.
Less demanding current-gen games might run at 90-120 FPS.
More demanding current-gen games might run at 55-70 FPS.
Maybe some games are a bit more demanding than your system can really handle, and run at 45-55 FPS.

With 4K120 support, it's now possible to run games at resolutions in-between 1080p and 4K too, while still taking advantage of high refresh rate support.
Perhaps native 4K is too demanding but 1440p, 1600p, 1800p etc. give you more acceptable performance, rather than having to drop the resolution to 1080p for >60Hz support.
A mid-range GPU should easily handle 1440p above 60 FPS in most games.

If we're looking at console hardware rather than PCs, maybe a game running at 30 FPS will actually run closer to 40 FPS now that the framerate can be unlocked, and dips below 30 are not going to be as noticeable.
Even if a game is only running at 30 FPS, you still benefit from the significant reduction in latency that VRR brings.
Current-gen consoles will benefit less from >60Hz support, though it may still be important for low framerate compensation on LCDs. Depends what the VRR ranges are.
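For anyone unfamiliar with low framerate compensation, here's a simplified sketch of the idea (the 48-120Hz range is just an assumed example, not any particular set's spec): when the framerate drops below the panel's minimum VRR rate, each frame is repeated 2x or 3x so the actual refresh stays inside the range - which is why a higher ceiling helps.

Code (Python):
VRR_MIN, VRR_MAX = 48, 120   # assumed example range

def lfc_refresh(fps):
    # repeat each frame enough times to lift the refresh above the panel's minimum
    multiplier = 1
    while fps * multiplier < VRR_MIN:
        multiplier += 1
    refresh = fps * multiplier
    return (refresh, multiplier) if refresh <= VRR_MAX else (None, None)

for fps in (24, 30, 40, 55, 90):
    print(fps, "fps ->", lfc_refresh(fps))
# 24 -> 48Hz (each frame shown twice), 30 -> 60Hz x2, 40 -> 80Hz x2; 55 and 90 need no repeats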


As gaming hardware gets more powerful, you will continue to see improvements with a 120Hz VRR display.
I'm not fond of the term, but it actually does offer future-proofing vs a 60Hz fixed refresh display, because VRR allows you to push the framerate to run as fast as your hardware can handle, instead of having to lock to 30 or 60 FPS.

If a demanding game is running at 45-55 FPS unlocked today, you would have to cap it to 30 FPS on a 60Hz display for smooth gameplay while the VRR display would simply run it at 45-55 FPS.
If you then upgrade to hardware which is 50% faster, a fixed 60Hz display should now be able to lock it to 60 FPS, while the VRR display will run it at 68-83 FPS - so you get a lot more out of the same upgrade by having the VRR display.
Another 50% hardware upgrade brings that to 101-120 FPS on the VRR display. On the fixed 60Hz display you won't see an improvement - it's still locked to 60 FPS.
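Spelled out with quick math, assuming framerate scales roughly linearly with the hardware upgrade and the display caps at 120Hz:

Code (Python):
CAP = 120
low, high = 45, 55                            # today's unlocked framerate range

for step, fixed in enumerate((30, 60, 60)):   # what a fixed 60Hz display locks you to at each step
    scale = 1.5 ** step                       # today, +50% hardware, +50% again
    vrr = [min(int(f * scale + 0.5), CAP) for f in (low, high)]
    print(f"x{scale:.2f} hardware: VRR {vrr[0]}-{vrr[1]} FPS vs locked {fixed} FPS")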


As for PC hardware being big, hot, and loud, there are many different examples of PCs which prove that doesn't have to be the case - but this is starting to stray off-topic.
Broadcast or other sources do plan to move towards HFR content. No-one can really say when that will be, but I have heard that some are targeting the 2020 Olympics.
Maybe movies will stay stuck in the past with 24 FPS, but I certainly hope not.
The point is, again, that by having 120Hz support the display actually is being "future-proofed", because it will be capable of supporting this content when it arrives - the direct opposite of other recently introduced technologies like HDR, which shipped in an unfinished state that I would not really consider "feature complete" until dynamic HDR arrives with HDMI 2.1.
 

tokkun

Member
I think you're forgetting about the "variable" aspect of Variable Refresh Rate displays.

The same caveats I stated apply to getting > 60 fps at 4K - it is still the realm of high-end PCs. I imagine the group of people who benefit from VRR at < 60fps in 4K is a lot bigger than those who benefit from > 60fps.
 