It doesn't. As far as I know there are no digital testers right now that measure input lag at 4K, so you have to measure it the old-fashioned way. It adds a bit, but I don't know how much.
It's 1080p at 20 ms, 1080p HDR at 23 ms, and 4K adding 30 ms to that.
So around >50 ms, and people are touting low lag on it.
> This isn't true. Testers don't measure 4K accurately, but I can tell you from personal experience.

Sorry, I confused you with spwolf, who threw out the 30 ms at 4K on top of the ~20 ms at 1080p (combined >50 ms). I was wondering where that number came from:
Because if true, this puts the KS8000 at almost the same input lag as the Vizio and LG OLED at 4k HDR.
Been looking at the 55 inch LG 4K OLED TV. It gets great reviews in most areas, but sounds like it has ~50ms of input lag. Is this gonna make stuff unplayable? I mostly play on PC but use my tv for Nintendo consoles and the occasional PS3 game.
> It's been stated on a few sites that Netflix and other 4K streaming is compressed to the point where it is basically the quality of Blu-ray. Download a demo to a thumb stick and try it to see what you think.

Thanks, I'll give a thumb drive a go!
I'd also mention that the sets in most stores aren't calibrated. They are set to whatever the factory defaults are, which are often way off from proper and aimed at displaying the brightest, most oversaturated colors without necessarily being maxed out. They also usually have motion enhancement turned on, causing the soap opera effect. This is no way to judge a set properly, and it will also need to be calibrated in your home, to the room it is in, to get proper viewing. Additionally, store demos have been running non-stop for long stretches, sometimes causing burn-in on some sets.
I probably won't get one until 2018.
What does my Emerson plasma TV get for delay? Because it's probably ass, haha, so that stuff doesn't bother me.
5000 nits? I'd like to be able to use my eyes in 30 years time please.
5000 nits? Are they fucking insane? 2000 is strong enough most people would have to look away from the damn screen unless I'm reading this wrong.
> It doesn't mean you have a display constantly shoving 10,000 nits at you.

Yes, but the idea is amusing, and the fact that I'm reading this on my 1600x1200 screen should indicate how little stake I have in this.
I looked at those charts and that is what causes my confusion. I don't see the necessity of 12 bits + metadata; 12 bits alone makes sense, as does 10 bits + metadata. It looks to me like 12 bits using the PQ curve alone is enough to stay below the Barten ramp from 0.001 nits to 10,000 nits.
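For what it's worth, you can sanity-check that intuition with the published SMPTE ST 2084 (PQ) constants. This is a rough sketch: it compares the luminance step between adjacent code values against a flat ~1% contrast criterion, which is a crude stand-in for the actual Barten ramp (the real threshold varies with luminance):

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Map a normalized PQ signal e in [0, 1] to luminance in nits (0-10000)."""
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

def step_contrast(code: int, bits: int) -> float:
    """Relative luminance jump from code-1 to code at the given bit depth."""
    top = (1 << bits) - 1
    lo, hi = pq_eotf((code - 1) / top), pq_eotf(code / top)
    return (hi - lo) / hi

# Near peak brightness, 12 bits keeps the step well under 1%,
# while 10 bits is several times coarser.
print(f"12-bit top step: {step_contrast(4095, 12):.3%}")
print(f"10-bit top step: {step_contrast(1023, 10):.3%}")
```

Whether the steps at the very bottom of the range also clear the Barten curve depends on the luminance-dependent threshold, which this flat criterion doesn't capture.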
This makes me feel better about getting the KS8000 or the KS8500 Curved at the end of the year.
> Brightness should be set using a calibration test disc. Backlight can be fine to set to zero or as low as possible.

Calibration discs are better than nothing, but if you want proper screen calibration, you'll need to use a probe.
> Most gaming monitors these days are all about G-Sync or FreeSync and having a high refresh rate and a nice IPS display.

Most gaming monitors are still TN panels, with some IPS, but neither is great for media. VA gives the best black levels. Once HDR takes off in PC games you might see it more, though.
> On top of this, 1000+ nits requires specific lighting, so as not to melt your eyeballs.

It's the other way around. OLED HDR requires special lighting and room setup to hit the required black level. 1000 nits is fine; it's only used on highlights. Your eyes can deal with it.
> How does the viewing environment affect the brightness you should be looking at to create the HDR effect? I mean, I don't think 500 nits in a pitch-black room causes the same reaction as 500 nits in a store environment. Doesn't this bring more variables into the mix?

The sets are designed to be used in darkness; as with any media display, calibration should be done to the room's light.
Might be waiting forever. I think the internet will eventually mean there won't be OTA broadcasts. I don't know that a 4k signal can even be transmitted OTA. Can it?
No, they are not. 12-bit is the max, and you can only do 4096 brightness steps in that.
For a game to well and truly support WCG, every texture and video will need to be replaced with a 10-bit WCG version.
Guys, don't buy a PS4, because the graphics will only get better. Wait for the PS5!
Four years later.
Guys, don't buy a PS5, the graphics will only get better.
> Apparently my parents' ~2009 Vizio LCD TV has about 50 ms of input lag and it's never bothered me. To be fair, I don't play FPS or fighting games on console; those would probably be much more noticeable.

That's because you aren't accustomed to something faster. You would absolutely feel the difference.
How impactful will changing textures be? In 3D games, textures are applied to polygons, which are lit depending on light intensity, colour, and angle towards the light. That alone can output to a wide colour space. Add reflections and other shader effects, and I'm pretty sure the final output can take advantage of 10-bit channels. Textures are just one of many inputs on the way to the final picture.
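That's consistent with how shading works: the math happens in floating point, so even 8-bit source textures produce in-between values that only a deeper output format can hold. A toy sketch with made-up numbers (not any real engine's code):

```python
# Toy example: one channel of an 8-bit albedo texel, lit by a scalar
# intensity in floating point. The shaded result lands between 8-bit
# output steps, so a 10-bit target preserves it more faithfully.

def quantize(value: float, bits: int) -> float:
    """Round a 0..1 value to the nearest step representable at this depth."""
    top = (1 << bits) - 1
    return round(value * top) / top

albedo = 200 / 255        # sampled from an 8-bit texture
light = 0.731             # made-up light intensity from the shader

shaded = albedo * light   # full-precision intermediate result
out8 = quantize(shaded, 8)
out10 = quantize(shaded, 10)

print(f"shaded={shaded:.6f}  8-bit={out8:.6f}  10-bit={out10:.6f}")
```

The point is only that the intermediate float carries more precision than either source; the deeper output just rounds less of it away.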
This is totally anecdotal, but I have an 8000, my dad has an 8500. I love my 8000, but his 8500 has enough edge light bleeding that I would have been shipping it back if it were mine.
This HDR stuff seems like a huge opportunity for people to have no idea what any of this shit means.
I'm sure by the time that 4K HDR sets become cheap there'll be a new standard to wait for.
My XBR65X810C doesn't have HDR, but I honestly don't think it really matters; it's still 4K and looks awesome. Just bought an HT-CT790 soundbar to go with it yesterday!
Man, people here are wrong. The KS8000's HDR lag is 22.6 ms; non-HDR is 20.6.
Period. No 20 + 30 crap.
Okay, thanks. But where does it say that going to 4K resolution adds 30 ms on top of the ~20 ms at 1080p?
> Man, people here are wrong. KS8000 HDR is 22.6 ms. Non-HDR is 20.6.
> Period. No 20 + 30 crap.

I posted the source of the +30 ms extra lag with 4K input in another thread.
Syrus, who is more likely to be right, you or a website that specializes in testing TVs?

http://uk.rtings.com/tv/reviews/by-...d-vs-1080p-full-hd-tvs-and-upscaling-compared
So that's another 30 ms of input lag on top of the 26.1 ms that it has at 1080p.
Is the JU7100 the exact same as the KS8000?
> The 4096th step could be 10,000 nits? I don't think banding will be an issue when an adjacent pixel can only be 9750 nits instead of 9999 nits. Or maybe they work some math magic and can still carry that much information by storing it in blocks, so you can display all the possible brightness levels inside a block as long as you don't have 4096 pixels in it (let's say 512 pixels), through lossy or lossless compression.

The 10k nits are meant to be a later target, so likely not part of the 12-bit roadmap. But that's one of the main functions of gamma in older display technologies (our eyes aren't linear, and neither are displays). HDR doesn't use a gamma function, though.
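The adjacent-step numbers can be checked directly from the SMPTE ST 2084 (PQ) constants. A quick sketch (just the published PQ math, not tied to any particular set) shows the top 12-bit codes land a couple of dozen nits apart, not ~250, so quantization near peak is much finer than the 9750-vs-9999 guess implies:

```python
# Luminance of the top few 12-bit PQ code values, using the SMPTE ST 2084
# EOTF. Shows how fine the quantization actually is near 10,000 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf_12bit(code: int) -> float:
    """Luminance in nits for a 12-bit PQ code value (0-4095)."""
    ep = (code / 4095) ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

for code in (4092, 4093, 4094, 4095):
    print(code, round(pq_eotf_12bit(code), 1), "nits")
```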
I barely know what I'm talking about.
Where are you reading 4k adds 30ms? I know this topic keeps coming up because the KS series Samsungs look like the only hope for a sub-30ms input lag TV in 4k/HDR this year but nobody seems to know for sure what the real number is. The poster a few spots up has empirical evidence that it is in the 30s or better but I've seen others, like you, claim it is higher without a source or referencing some other model TV. Would you mind clarifying?
I think they are wrong too but how do you know it is still 22.6 in 4k if the measurement was taken in 1080p? I think people are jumping the gun buying this thing until we know for sure, especially if their main reason is for PS4 Pro which is almost 2 months away.