Is HDR support really that big of a deal?
Is 100GB enough space for 4K content at high frame rates or 3D?
They need to confirm 48fps already.
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=ces 2014 dolby vision
Generally, it was the talk of CES last year. That, combined with a much wider gamut (and higher frame rates, if those ever catch on), makes content significantly more realistic - more so than the increase in resolution, at least for film / TV.
Are you arguing for content or as a storage medium?
While I do think they should have gone up to 4 layers, you have to remember they're using a new codec here. So in terms of image quality versus regular BD, we shouldn't really see a drop-off from lack of bitrate. And remember, they don't need to scale up the audio's overhead by nearly as much as the video (object-based audio like Atmos doesn't scale linearly).
> I wonder if a PS4 software update could play them.

I remember searching for info about whether the PS4 and Xbone could use brute force to decode h.265, since I have a 4K TV that doesn't include a decoder. I never got a definitive answer, but signs are looking positive. Something about an AMD paper which says that APUs similar to the ones in the consoles could theoretically do it, I think.
Easily. The 4k movie files on Sony's 4k player are around 40-50GB. Netflix 4k content is 25Mbps on the HEVC codec.
> Is 100GB enough space for 4K content at high frame rates or 3D?

In theory it should be with h.265 ... though I wonder if there could be problems in doing both concurrently?
> Video games already have HDR, I believe. So this won't provide any improvement in quality on that front, right?

Are you talking about simulated HDR lighting effects, or actual higher bit-depths (deep color)? I don't think many games are currently using high bit-depths ... but if they actually are, then yes, it would improve how they look. Current TVs aren't really designed to support them.
> I meant for content. Blu-ray to 4K Blu-ray seems uncomfortably like going from DVD to those WMV HD discs that were around for a while before HD-DVD and Blu-ray launched.

As I said above, I think they're generally on par with BD given the codec advancement.
Video games already have HDR, I believe. So this won't provide any improvement in quality on that front, right?
I'd throw the Netflix figure out because that's meant for streaming.
What bitrate are Sony 4K movies being played at? You've already hit the limit if they're 40 to 50GB.
If you do HFR, double the capacity. That's 80 to 100GB. Add in 3D and you no longer fit the disc.
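For anyone who wants to sanity-check these numbers, here's a rough back-of-the-envelope sketch; the bitrates and runtime are illustrative assumptions, not figures from any announced spec:

```python
# Stream size = bitrate x runtime. All figures below are illustrative guesses.
def size_gb(bitrate_mbps, runtime_min):
    """Gigabytes consumed by a stream at bitrate_mbps over runtime_min minutes."""
    bits = bitrate_mbps * 1_000_000 * runtime_min * 60
    return bits / 8 / 1e9  # bits -> bytes -> gigabytes

print(size_gb(50, 120))   # ~45 GB: a 2-hour movie at 50 Mbps, right around those 40-50GB Sony files
print(size_gb(100, 120))  # ~90 GB: naively doubling the bitrate for HFR nearly fills a 100GB disc
```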
> If you do HFR, double the capacity. That's 80 to 100GB. Add in 3D and you no longer fit the disc.

Unless my understanding of HDR is incorrect, it's really just increased bit-depth and some metadata?
HEVC is a lot more efficient than H.264.
Going from 8-bit to 10-bit is a 1.25x increase. From 8-bit to 12-bit is a 1.5x increase. And that's uncompressed data once it's outputting to your display. Compressed, it's likely less. There isn't a doubling.
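If it helps, the arithmetic behind those ratios is straightforward (this sketch assumes 4:2:0 chroma subsampling, which is what disc video uses):

```python
# Raw (uncompressed) cost per pixel at 4:2:0: one luma sample per pixel plus
# quarter-resolution Cb and Cr planes = 1.5 samples per pixel on average.
SAMPLES_PER_PIXEL = 1.5

def bits_per_pixel(bit_depth):
    return bit_depth * SAMPLES_PER_PIXEL

for depth in (10, 12):
    print(f"{depth}-bit vs 8-bit: {bits_per_pixel(depth) / bits_per_pixel(8):.2f}x")
# 10-bit: 1.25x, 12-bit: 1.50x - the uncompressed ceiling, before the codec shrinks it further
```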
> HFR, not HDR. Granted HFR doesn't automatically mean double the size because of double the frames, but there's a big impact on the data size as a result. I have doubts you'll see HFR and 3D for like Avatar 2 and have it fit on a single disc.

lol yeah sorry ... misread your post as HDR.
> And the WMV HD discs that I mentioned benefited from codec advancements, too. They also only had bit rates akin to Sony's Superbit DVD brand. Like I said, the similarities to 4K Blu are striking.

But this isn't just a codec improvement, it's a codec, bitrate, and storage size improvement. How is that at all similar to WMV HD? That was using DVD-9. This jump has far more in common with the original Blu-ray.
At the time I thought those WMV discs looked great. Not so much now that I have the hindsight of having watched >2,000 movies on Blu-ray.
> How many 160Mbps 4K discs have you watched?

showmethereceipts.gif
It seems rather gimmicky to me, just more marketing hoopla that doesn't actually improve the home theater experience.
> It seems rather gimmicky to me, just more marketing hoopla that doesn't actually improve the home theater experience.

I lol'd for realz
> The PS4 should easily handle a h265 decode, seems like the hold up is Sony itself-- probably due to the fact none of their separate divisions actually get along.

Hold up? The specification hasn't even been finalized, and players won't be hitting until the end of the year. How is there a hold-up if it doesn't even exist? It's almost a year away. Even if PS4 support is possible, why would it already be announced?
Must support Video CD?
WTF, does 1993 Hong Kong still exist out there or something?
showmethereceipts.gif
Why are you asking for a linear bitrate increase when not using the same codec? I mean sure, more is always better ... but you're making some weird leaps here.
I genuinely wonder if I'll rebuild my collection of favorite movies a third time.
What? It's anything but gimmicky.
> I genuinely wonder if I'll rebuild my collection of favorite movies a third time.

It'd be neat if studios gave discounts for movies that people already have registered on their UV account.
> This is barely a storage size improvement,

It's double.

> and by extension barely a bit rate improvement.

Bit rates are not equal when talking about differing codecs. Moreover, we don't even know the bitrate, to my knowledge. Did they say what the spin rate is for this drive or the proposed spec? That's what determines bitrates.
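To illustrate the spin-rate point: Blu-ray rates drive speed in multiples of 1x = 36 Mbit/s, so the mandated read speed, not the disc's capacity, is what caps the stream. A quick sketch (the speed multipliers and overhead fraction are made-up illustrations, since the spec isn't out):

```python
# Illustrative only: how drive speed, not capacity, sets the bitrate ceiling.
BD_1X_MBPS = 36  # Blu-ray's 1x transfer rate in Mbit/s

def max_stream_mbps(speed_multiplier, overhead_fraction=0.15):
    # overhead_fraction is a guess to leave headroom for muxing/error correction
    return BD_1X_MBPS * speed_multiplier * (1 - overhead_fraction)

print(max_stream_mbps(2))  # ~61 Mbps if the spec mandates 2x reads
print(max_stream_mbps(4))  # ~122 Mbps at 4x - capacity never enters the equation
```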
> I fully expect the triple layer discs to never see use. Funny that you used showmethereceipts.gif because I want to see the replicators who plan on offering three layers and at what cost.

I remember a time when BD releases were single-layer for movies and games. Dual layer was considered prohibitive due to costs and yields.
> HDR is emphasizing increasing the brightness of a scene - which of course attracts the random Best Buy consumer but often results in unnatural-looking colors and exposures. Just look at the abuse of 'HDR' in photography. Also not helping are the two competing standards for HDR - which will prevail? Will they be cross-compatible? Can current panels even really claim to be HDR?

It's not meant to increase average brightness for random people at Best Buy. And just because shitty photographers abuse something doesn't make it crap.
> IMO TV makers should be focusing on delivering useful improvements to their consumers such as:
> Improved black levels (through FALD or OLED)
> Expanded color gamut (Rec 2020)
> High bit display panels
> Better motion resolution
> etc
> Instead of marketing hype.

I assume when you say 'high bit display panels', you mean increased bit-depths (10-bit, 12-bit)? Here's the funny thing about higher bit-depths ... that's what HDR is for.
Might want to check this out - http://www.neogaf.com/forum/showthread.php?p=146723540
> LOL at that doctored comparison. If that's what Life of Pi looked like on a conventional LCD, they should all be burned in a fire. Also, doesn't that prove my point? The colors are graded differently and the brightness is obviously jacked up beyond the two other LCDs.

It's going to look doctored because it was a high dynamic range photo that has to be dithered down to work on our shit monitors.
> If you're telling me Samsung's 2015 panel will have 1000 nits (far short of the 4000 Dolby is suggesting), full P3 colorspace, and a 10-bit panel, then I applaud them greatly. I'm doubtful, however.

You want advances ... but then you disbelieve they can happen.
Still though, look into what people that actually attended CES last year had to say about Dolby Vision, etc. There's nothing really more I can say if you're simply going to disregard what the people that have actually seen it have to say.
I saw demonstrations of HDR and it left me unimpressed. These were shown in a darkened room and it was uncomfortable on the eyes. Ocean waves in the demo were, to me, unnaturally bright and, of course as is true with all these phony demos, the 'control' display's picture was absurdly subdued to enhance the difference.
Seriously, how dumb do these guys think we are?
Similarly, a demo of an expanded color gamut actually looked unnatural next to the control demo that appeared to have more natural colors.
We already have 4-layer consumer BDXL writers that work, and ROMs are less susceptible to error anyway. You can buy them on Amazon.
> I actually think what Dolby in particular is trying to do is good work but it seems to me that manufacturers today are saying 'this is what we think it should be, someday' instead of 'look at what we have achieved now!', meanwhile they're slapping HDR marketing terms on their panels that fall short of any decent standard.

It's certainly possible they did have the overall brightness too high, but the reality is they can't just magically jump from what were essentially higher-resolution 1080p displays to UHD in a year. Which is precisely why I've argued people should wait unless they're fine buying another set in a few years. Jumps this dramatic take time, but the fact is they've objectively improved a lot from last year's models ... which is important.
Here's one impression I read from CES showgoer recently on the UHD Alliance demo:
http://www.avsforum.com/forum/40-ol...775690-ces-2015-reactions-8.html#post30562361
I remember the Dolby Vision impressions being more positive.
> And those 100GB discs are $70.

That's because they're writable. It's never correlated well with ROM pricing. The mechanism to lay data is totally different. Even regular BD writables are still pricey.
> 100GB seems really low, unless I'm wrong in assuming that a 4K movie would need four times the space that a 1080p one would. Are they using different compression codecs or something?

h.265 (HEVC)
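The naive 4x assumption overstates it, too. A crude sizing model (the 1080p baseline and the ~50% HEVC savings figure are ballpark assumptions, not measured numbers):

```python
# Crude model: bitrate scales roughly with pixel count, then HEVC claws ~half back.
BASELINE_1080P_GB = 35                       # ballpark H.264 movie on a 50GB BD
PIXEL_RATIO = (3840 * 2160) / (1920 * 1080)  # UHD has exactly 4x the pixels
HEVC_SAVINGS = 0.5                           # commonly cited ~50% bitrate savings vs H.264

print(BASELINE_1080P_GB * PIXEL_RATIO * HEVC_SAVINGS)  # ~70 GB: fits on a 100GB disc
```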
I guess it comes down to whether you think you'll use it enough ... or whether it's better to just go the ISF route since they have the equipment? Obviously you have to consider the fact we're moving to a new color specification. So is it worth getting one now that only does Rec 709? Might want to wait until one is available with selectable gamuts?
Another cheap option would be to buy this - http://www.amazon.com/dp/B000V6LST0/?tag=neogaf0e-20
It comes with a card that has a red, green, and blue Rec 709 cellophane cutout so you can at least get approximate settings for your colors.
I doubt that's ever going to happen with how poorly it plays with refresh rates on TVs. You're better off hoping that HFR filmmaking graduates to 60 fps.
Are there even any TVs that support 144Hz?
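For what it's worth, the refresh-rate complaint comes down to divisibility: judder-free playback wants the panel refresh to be an integer multiple of the frame rate. A quick check across common panel rates:

```python
# Which frame rates map cleanly (integer pulldown) onto common panel refresh rates?
for hz in (60, 120, 144):
    for fps in (24, 48, 60):
        verdict = "clean" if hz % fps == 0 else "needs uneven pulldown"
        print(f"{fps}fps on {hz}Hz: {verdict}")
# 48fps only lands cleanly on 144Hz (3:3); 120Hz panels handle 24 and 60 but not 48.
```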