Dave_6
Yup. I wouldn't advise messing around too much in there though. If you look on the AVSForum there is a guide.
Thanks, I'll check it out.
Interesting story: the 2017 models have a new option under the OLED menu called APL, or Average Picture Level. It's set to 48 by default, but you can change it, and from what I can tell it adjusts the ABL on the set.
I bought one to disable the auto-dimming feature that kicks in when a static image is on screen for a set time, as I found it was kicking in sometimes, during dark movies, when it shouldn't. I just bought one of those cheap all in one remotes from Argos. Worked great.
why would you disable a feature intended to prevent image retention?
Someone earlier in this thread did that and then left his TV on a static image in the YouTube app and came back home to a nasty surprise. :3
I have a dumb question. Why do all calibration suggestions for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?
So all new TVs from major companies have already been released this year, right? We can't expect new lines of TVs with HDMI 2.1 to be released this year?
Someone might, but generally I don't think so. If you are interested in VRR, it's possible that some TVs already out could be updated to include it, as it's not something that needs the bandwidth of 2.1 to function. It's something that can be added. Whether or not they do is a different matter altogether.
Biggest HDMI 2.1 features I'm looking forward to are VRR + bandwidth support for 4K @ 120Hz with full 4:4:4 chroma. It will be killer on an OLED or even a high-end LCD. These high-end sets already have native 120Hz 4K panels, just held back by HDMI 2.0 bandwidth limitations.
I don't know if Sony will release a new Z9 at the end of this year, but if they do I would not be shocked if it was the first high-end consumer UHD display to support HDMI 2.1.
2018 is gonna be really exciting. I'm personally going to be going with a new 2018 OLED + Nvidia Volta PC setup. Actually, I would not be shocked if Sony showed off a consumer CLEDIS, which I'd gladly pay a fuck ton for lol.
I wonder if the PS4 Pro would support all of this since it is HDMI 2.0. The Scorpio has the advantage of releasing late this year and adding hardware support for HDMI 2.1.
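As a back-of-the-envelope check on why HDMI 2.0's bandwidth holds those panels back (my own arithmetic sketch using the commonly cited link rates, not numbers from this thread):

```python
def video_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed video payload in Gbit/s (ignores blanking intervals,
    so real link requirements are somewhat higher)."""
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K @ 120Hz, 8-bit 4:4:4 -> ~23.9 Gbit/s of pixel data alone.
# HDMI 2.0 carries 18 Gbit/s raw (~14.4 Gbit/s effective after 8b/10b
# encoding), so this doesn't fit; HDMI 2.1's 48 Gbit/s link does.
print(round(video_gbps(3840, 2160, 120, 8), 1))
```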
I don't think Sony is prepared to ship an HDMI 2.1 set this year. But they are well overdue for a redesign of their board/processor setup. They were probably holding off for the spec changes before doing that, since it's a significant expenditure. Hopefully they will try to eliminate the port split at the same time, but we'll see.
e: As far as we know, Scorpio's VRR implementation isn't compatible with HDMI 2.1's implementation, and we have no indication MS is going to add HDMI 2.1.
This isn't to say it can't change, but Scorpio's spec had to be finalised before HDMI 2.1's was, and the changes needed would be substantial.
Microsoft never claimed to support HDMI 2.1 VRR. They are supporting AMD's FreeSync over HDMI, which is a different implementation that isn't supported in very many displays either.
I doubt they would be dumb enough to make such claims without already doing pre-testing and making sure whatever implementation the HDMI Forum is going with is what their new console supports.
Last week, we published the hardware spec for Microsoft's next Xbox - Project Scorpio. However, there was one little detail we held back, an aspect of the new console we didn't want to get lost in the noise. In the here and now its applications will be limited, but in the fullness of time, it may help to bring about a profound shift in how displays interface with games hardware. To cut a long story short, Scorpio supports AMD's FreeSync - and the upcoming variable refresh rate support baked into the next-gen HDMI 2.1 spec.
Going by this Eurogamer article, Scorpio indeed supports HDMI 2.1 VRR too.
http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr
Edit: And the Digital Foundry Video: https://www.youtube.com/watch?v=t18QbBdPK-8
The HDMI 2.1 bit is speculation on DF's part. AMD's current FreeSync over HDMI uses HDMI vendor extensions, not HDMI 2.1 VRR.
I have my set calibrated to an accurate one and one I actually enjoy looking at, and flip between the two. I can tell you now that I completely and utterly prefer the one calibrated using Warm1 instead of the more accurate Warm2. Whites look white, not blue, and skin-tones are more accurate. I'd rather have an image I enjoy watching than one that someone else tells me is right.
It is not preference, and you're not missing out; it's the universal standard of D65 reference, aka 6500K white light (technically 6504K). You are seeing a color-accurate picture, not one overly skewed to one color (i.e. more red or more blue) or an artificially enhanced image. You are seeing it how filmmakers often shoot and how theatres often present films. Only recently, in the LCD, plasma, and HDTV era, has it become skewed, with artificial enhancements and vivid/dynamic becoming the norm. It is why when I have guests over I have to put on dynamic or sports mode, or they complain that football looks dull or the ice looks yellow during the NHL playoffs.
Again though, unless you have the tools to properly calibrate yourself, you may also end up with inaccurate images, as some colors are oversaturated and need correction. Panels also vary, as does ambient room light. It is why you can use someone else's settings but should always tweak to your preference.
Blue tint to white is not neutral.
However, most preset "warm" (6500K) modes in TVs are way off and push too much red. The only way to properly calibrate is by using a spectrophotometer.
It's not hard to do at all, and the best investment I ever made was buying an i1Display Pro; it worked amazingly on all my high-end displays (plasma and OLED).
Lastly, people that just copy settings they find on forums most of the time don't even understand that panels vary from each other greatly, so if anything they might end up with a less accurate final image.
Edit: Properly calibrated color temperature should never make whites look horrible. If anything once you adjust and go back to color temperatures that push blue you'll find it very annoying.
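For anyone curious what a meter is actually reporting: white point in kelvin is derived from the measured CIE xy chromaticity. A minimal sketch using McCamy's well-known approximation (my own illustration; this is not code from any particular calibration package):

```python
def mccamy_cct(x, y):
    """Approximate correlated color temperature (K) from CIE 1931 xy
    chromaticity via McCamy's cubic formula. Reasonable accuracy near
    the blackbody locus, roughly 2800K to a bit past 6500K."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) comes out close to 6500K.
print(round(mccamy_cct(0.3127, 0.3290)))
```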
Not a dumb question. Warm is supposed to produce the most accurate color representation per ISF calibration methodology. You are supposed to set it to Warm2 and then calibrate to get the "right" level of colours and details. The other most common settings "blow out" colours, which looks attractive to the eye at first but may not be accurate (e.g. whites have a blue tint).
While I appreciate the reasoning behind the push to use the warm setting, I refuse to use it. I like cool settings; they just look better to me, and I hate the piss filter that comes with warm settings. I watch a lot of hockey, and on warm the rink is not white no matter what I do, and it doesn't look like it does when I am at the game in person. My cool or medium setting makes it look good to me. Trust me: do what makes it look good for you. As long as your other settings are reasonable, you'll be fine. There will be many who try to convince you otherwise, but do what looks good to you and don't feel bad for using cool.
Hmm, that is kind of shitty of them to word the article like they did, since they were the ones that got hands-on time with Scorpio.
Oh well, not too long till E3 before we get more news from MS directly.
Also, does anyone have examples of LG "super resolution" working? It doesn't seem to do much even on a 720p signal...
So it sounds like unless I know what I'm doing, sticking to colors that I think look good is the best option. No matter what I do, I cannot get whites to not look yellow with the "warm" preset... On both of my sets (last year's Hisense and the B7 OLED), standard picture with slight adjustments to contrast and color, plus turning off most of the "smoothing" crap, clearly gives the best picture (and whites in particular).
The HDMI 2.1 spec has not been published yet and the compliance tests have not been released. Until that happens, no one can honestly say that any piece of hardware is HDMI 2.1-compatible. The best they can do is say they intend to make it compatible.
It may only look yellow compared to what you're used to. Perhaps you could leave it for a few days and you'll adjust to it?
I tried to leave it on for one day. It looks yellow compared to my iPhone screen, Vita, and Surface. I know that those other devices are not really designed for the best movie viewing experience, but let's be honest here, all of us look at them more often than we look at the TV.
One day is not enough. At least for me, it took a few days because I was so programmed to use cool or neutral all my life. Now I can't go back to those settings.
All these other screens push blue. The Vita is by far the worst offender of the bunch, with terrible out-of-box settings.
I know with my iPhone 7 Plus the white point is at 6802K out of the box which is amazing. I know with my older iPhone 6 the white point was 7250K. Both tested with i1Display Pro.
I even tested the new Samsung S8 and it allows you to set white point which is very close to 6800K.
Another issue with calibrating without tools is that a lot of the time gamma is way off, which gives the image a flat look. When you calibrate to the proper gamma point, the image really pops. I calibrate to BT.1886 gamma, which is simply stunning on OLED and my Pioneer Kuro.
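For reference, BT.1886 isn't a single gamma number but a formula parameterized by the display's measured white and black luminance. A rough sketch of the reference EOTF (my own illustration; variable names are mine):

```python
def bt1886_eotf(v, lw=100.0, lb=0.0):
    """ITU-R BT.1886 reference EOTF: maps a normalized video signal
    v (0..1) to screen luminance in cd/m^2, given the display's
    measured white (lw) and black (lb) luminance."""
    gamma = 2.4
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# On an OLED (lb ~ 0) this collapses to a pure 2.4 power curve:
# full signal hits lw, half signal lands around 19% of lw.
print(bt1886_eotf(1.0), round(bt1886_eotf(0.5), 1))
```

On a display with raised blacks (lb > 0) the a and b terms lift the low end so shadow detail isn't crushed, which is why the same "2.4 gamma" target behaves differently on LCD and OLED.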
I can't disagree with any of this, but my eyes are broken at this point. I wouldn't be surprised that real life looks different to me because of staring at electronics all day.
I'm exaggerating of course, but getting used to one screen looking different than others is almost impossible. I might try again at some point when I'm more confident in my calibration skills.
Have you tried Warm1 instead of the default Warm2?
I'm not even talking about Warm2. Even Warm1 is pretty tough to work with. But to be fair, the Hisense's out-of-box picture is pretty good according to Rtings, and that picture is set to Medium. That's actually my preference...
This is 100% true; they plan to release the tests in Q2 (which started in April). However, I'm disappointed Digital Foundry are not more transparent with the information they got. Did someone at MS assure them, or did they just speculate themselves? If it was pure speculation on their part, the way they worded that article and video is very annoying; if it was someone from MS telling them, I'm fine with what they reported.
Oh well, E3 is a few weeks away, so we can confirm 100% what MS plans to support.
damn guys, all these 2017 models looking like world beaters is making me sad, haven't even had my B6 a week
There won't be such a thing as an 'endgame' display for the next 8 years at least. 2017 models don't have HDMI 2.1, so they are not the endgame yet.