
Ultra HD Blu-Ray specs unveiled with 4K, HDR support


cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I wonder if a PS4 software update could play them.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Are you arguing for content or as a storage medium?

While I do think they should have gone up to 4, you have to remember they're using a new codec here. So in terms of image quality versus regular BD, we shouldn't really see a drop-off from lack of bitrate. And remember, they don't need to scale up the audio's overhead by nearly as much as the video (object-based audio like Atmos doesn't scale linearly).

I meant for content. Blu-ray to 4K Blu-ray seems uncomfortably like going from DVD to those WMV HD discs that were around for a while before HD-DVD and Blu-ray launched.
 
Not hyped.

But I will get it with the PS5.
Let's see if flagship 4K TVs have come down in price by then (and the specs are finalized, etc.). My 8-year-old TV will have to hold out a bit longer, I guess.
 
I wonder if a PS4 software update could play them.
I remember searching for info about whether the PS4 and Xbone could brute-force h.265 decoding in software, since I have a 4K TV that doesn't include a decoder. I never got a definitive answer, but the signs look positive: there's an AMD paper which, I think, says that APUs similar to the ones in the consoles could theoretically do it.
 
Easily. The 4K movie files on Sony's 4K player are around 40-50GB. Netflix 4K content is 25Mbps on the HEVC codec.

I'd throw the Netflix figure out because that's meant for streaming.

What bitrate are Sony 4K movies being played at? You've already hit the limit if they're 40 to 50GB

If you do HFR, double the capacity. That's 80 to 100GB. Add in 3D and you no longer fit the disc.
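To put rough numbers on that, here's some back-of-envelope math (a sketch of my own, assuming a 2-hour feature and ignoring audio and disc overhead, so these are ceilings rather than targets):

```python
# Average video bitrate a disc can sustain over a feature's runtime.
# Rough sketch: assumes a 2-hour film, ignores audio/subtitles/overhead.
def avg_video_mbps(disc_gb: float, runtime_hours: float = 2.0) -> float:
    megabits = disc_gb * 8 * 1000              # GB -> megabits
    return megabits / (runtime_hours * 3600)   # spread over the runtime

print(f"{avg_video_mbps(50):.0f} Mbps")    # ~56 Mbps for a 50GB dual-layer
print(f"{avg_video_mbps(100):.0f} Mbps")   # ~111 Mbps for a 100GB triple-layer
```

Naively double that for HFR and you can see how quickly the margin disappears.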
 

Raistlin

Post Count: 9999
Is 100GB enough space for 4K content at high frame rates or 3D?
In theory it should be with h.265 ... though I wonder if there could be problems in doing both concurrently?

Assuming 3D is handled similarly to MVC (the 3D extension to AVC), it doesn't double bandwidth ... it's more like 50% more. It isn't really two full video streams; it's a primary full video stream plus a secondary stream that's mostly reference data against the primary, since the two views are so similar.

Conceptually higher framerates kind of do the same for you. Part of the way compression works is you make references to prior frames. Only the changes require full bandwidth. So the funny thing is, the higher the framerate ... the less change per frame ... so the more references can be made. It doesn't compress linearly. Higher framerates actually compress better.

The further you push this, though, the more computationally expensive it gets ... but that's why h.265 is around. Besides using lessons learned to improve its efficiency, part of its gains come through longer references, since it targets a more modern hardware spec.
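Here's a toy model of that framerate argument (entirely my own illustration with invented numbers, not the actual HEVC algorithm):

```python
# Toy sketch: per-frame cost = fixed overhead (headers, motion vectors)
# + residual proportional to how much the scene changed since last frame.
# Doubling fps halves the change per frame, so the residual total stays
# flat; only the comparatively small overhead doubles. Numbers invented.
def encoded_bits_per_second(fps: int,
                            overhead_bits: int = 5_000,
                            bits_per_unit_change: int = 20_000,
                            scene_change_per_second: float = 100.0) -> float:
    change_per_frame = scene_change_per_second / fps
    frame_bits = overhead_bits + bits_per_unit_change * change_per_frame
    return frame_bits * fps

print(encoded_bits_per_second(24))   # 2,120,000 bits/s
print(encoded_bits_per_second(48))   # 2,240,000 bits/s ... ~1.06x, not 2x
```

Real encoders are far messier, but the shape of the argument holds: doubling frames only doubles the part of the stream that isn't references.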


Only time will tell I suppose ... though like with BD3D, I'm sure you can expect HFR and/or 3D Ultra HD BD discs to be movie-only. Any extras would be on a secondary disc (or potentially streamed online).
 

Raistlin

Post Count: 9999
Video games already have HDR, I believe. So this won't provide any improvement in quality on that front, right?
Are you talking about simulated HDR lighting effects, or actual higher bit-depths (deep color)? I don't think many games are currently using high bit-depths ... but if they are, then yes, it would improve how they look. Current TVs aren't really designed to support them.




I meant for content. Blu-ray to 4K Blu-ray seems uncomfortably like going from DVD to those WMV HD discs that were around for a while before HD-DVD and Blu-ray launched.
As I said above, I think they're generally on par with BD given the codec advancement.
 

Allard

Member
Video games already have HDR, I believe. So this won't provide any improvement in quality on that front, right?

Not comparable. High dynamic range in games refers to a lighting technique where a single light source can illuminate and dynamically change a large part of the scene, but it's a graphics/lighting-engine feature. HDR in TVs changes the range of light the display can produce per pixel. Darks stay dark while highlights could potentially 'blind' you, all while maintaining a clear, crisp image. Current TVs get washed out when they push the light source too high, leading to blacks that don't seem very 'black', or at least look harsher and more artificial. There's really no way to properly show what HDR is like without seeing it in person; TVs and monitors have essentially been held to the same brightness standard, measured in nits, for their entire life cycle. That standard is around 100 nits, while the new TV standard for HDR they're talking about is closer to 1,000 nits, which dramatically increases the range of light per pixel at a given point in time. At least that's how I understand it from looking into the technology over the past couple of weeks.
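For the curious, the transfer curve the HDR proposals are built around (Dolby's PQ curve, standardized as SMPTE ST 2084) shows where that extra range goes. This is just an illustrative sketch of the published formula, not a reference implementation:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized 0..1 signal value to
# absolute luminance in nits. Constants come from the published spec.
def pq_eotf(code: float, peak_nits: float = 10_000.0) -> float:
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = code ** (1 / m2)
    return peak_nits * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# Roughly half the signal range sits below ~100 nits (the old SDR ceiling),
# so shadow detail keeps its precision while highlights get huge headroom:
print(round(pq_eotf(0.5), 1))   # ~92.2 nits
print(round(pq_eotf(1.0)))      # 10000 nits
```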
 
I'd throw the Netflix figure out because that's meant for streaming.

What bitrate are Sony 4K movies being played at? You've already hit the limit if they're 40 to 50GB

If you do HFR, double the capacity. That's 80 to 100GB. Add in 3D and you no longer fit the disc.

HEVC is a lot more efficient than H.264.

But specifically, I don't think HFR 3D is ever gonna hit a mainstream format like the one 4K BD is shooting for. It would need more than a 100GB disc for sure. A digital projectionist should be able to verify those kinds of file sizes.
 

Raistlin

Post Count: 9999
If you do HFR, double the capacity. That's 80 to 100GB. Add in 3D and you no longer fit the disc.
Unless my understanding of HDR is incorrect, it's really just increased bit-depth and some metadata?

Going from 8-bit to 10-bit is a 1.25x increase. From 8-bit to 12-bit is a 1.5x increase. And that's the uncompressed data being output to your display. Compressed, it's likely less. There isn't a doubling.
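A quick sanity check on those multipliers (my own arithmetic, assuming 4K at 24fps with 4:2:0 chroma sub-sampling):

```python
# Uncompressed video bandwidth at different bit depths. With 4:2:0
# sub-sampling, chroma adds half again on top of the luma samples.
def uncompressed_mbps(bit_depth: int, width: int = 3840,
                      height: int = 2160, fps: int = 24) -> float:
    samples_per_frame = width * height * 1.5   # luma + 4:2:0 chroma
    return samples_per_frame * bit_depth * fps / 1e6

for depth in (8, 10, 12):
    print(f"{depth}-bit: {uncompressed_mbps(depth):,.0f} Mbps")
# 8-bit: 2,389 / 10-bit: 2,986 (1.25x) / 12-bit: 3,583 Mbps (1.5x)
```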
 
HEVC is a lot more efficient than H.264.

Sure, but streaming has to deal with the limitations of bandwidth in the US. Even with a more efficient codec, Netflix runs at a lower bitrate to get the data size down for streaming. I fully expect content on disc to be higher bitrate than the stream as a result, so we should throw out Netflix as a measure of data size.

Unless my understanding of HDR is incorrect, it's really just increased bit-depth and some metadata?

Going from 8-bit to 10-bit is a 1.25x increase. From 8-bit to 12-bit is a 1.5x increase. And that's the uncompressed data being output to your display. Compressed, it's likely less. There isn't a doubling.

HFR, not HDR. Granted, HFR doesn't automatically mean double the size just because there are double the frames, but there's a big impact on the data size as a result. I doubt you'll see HFR and 3D for something like Avatar 2 fit on a single disc.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
As I said above, I think they're generally on par with BD given the codec advancement.

And the WMV HD discs that I mentioned benefited from codec advancements, too. They also only had bit rates akin to Sony's Superbit DVD brand. Like I said, the similarities to 4K Blu are striking.
At the time I thought those WMV discs looked great. Not so much now that I have the hindsight of having watched >2,000 movies on Blu-ray.

How many 160Mbps 4K discs have you watched?
 

Raistlin

Post Count: 9999
HFR, not HDR. Granted HFR doesn't automatically mean double the size because of double the frames, but there's a big impact on the data size as a result. I have doubts you'll see HFR and 3D for like Avatar 2 and have it fit on a single disc.
lol yeah sorry ... misread your post as HDR.


Like I posted above though, neither 3D nor HFR scales linearly. But yes, it could be pushing the envelope. Certainly don't expect any extras, and they might have to split long movies. I certainly wouldn't expect improved sub-sampling for such movies. They'll have to stay in 4:2:0 land.
 

golem

Member
Is HDR support really that big of a deal?

It seems rather gimmicky to me, just more marketing hoopla that doesn't actually improve the home theater experience.

I remember searching for info about whether the PS4 and Xbone could brute-force h.265 decoding in software, since I have a 4K TV that doesn't include a decoder. I never got a definitive answer, but the signs look positive: there's an AMD paper which, I think, says that APUs similar to the ones in the consoles could theoretically do it.

The PS4 should easily handle an h.265 decode; seems like the hold-up is Sony itself -- probably due to the fact that none of their separate divisions actually get along.
 

Raistlin

Post Count: 9999
And the WMV HD discs that I mentioned benefited from codec advancements, too. They also only had bit rates akin to Sony's Superbit DVD brand. Like I said, the similarities to 4K Blu are striking.
At the time I thought those WMV discs looked great. Not so much now that I have the hindsight of having watched >2,000 movies on Blu-ray.
But this isn't just a codec improvement, it's a codec, bitrate, and storage size improvement. How is that at all similar to WMV HD? That was using DVD-9.

This jump has far more in common with the original Blu-ray.

How many 160Mbps 4K discs have you watched?
showmethereceipts.gif

Why are you asking for a linear bitrate increase when not using the same codec? I mean sure, more is always better ... but you're making some weird leaps here.
 

Raistlin

Post Count: 9999
It seems rather gimmicky to me, just more marketing hoopla that doesn't actually improve the home theater experience.
I lol'd for realz


The PS4 should easily handle an h.265 decode; seems like the hold-up is Sony itself -- probably due to the fact that none of their separate divisions actually get along.
Hold up? The specification hasn't even been finalized, and players won't be hitting until the end of the year. How is there a hold-up if it doesn't even exist? It's almost a year away. Even if PS4 support is possible, why would it already be announced?

Moreover, who said codec support is the only issue here? There are numerous HDMI and architecture questions that need to be answered before we can guess whether it's viable to do.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
But this isn't just a codec improvement, it's a codec, bitrate, and storage size improvement. How is that at all similar to WMV HD? That was using DVD-9.

This jump has far more in common with the original Blu-ray.


showmethereceipts.gif

Why are you asking for a linear bitrate increase when not using the same codec? I mean sure, more is always better ... but you're making some weird leaps here.

This is barely a storage size improvement, and by extension barely a bit rate improvement.
I fully expect the triple-layer discs to never see use. Funny that you used showmethereceipts.gif, because I want to see the replicators who plan on offering three layers, and at what cost.
 

golem

Member
What? It's anything but gimmicky.

HDR emphasizes increasing the brightness of a scene -- which of course attracts the random Best Buy consumer, but often results in unnatural-looking colors and exposures. Just look at the abuse of 'HDR' in photography. Also not helping are the two competing HDR standards: which will prevail? Will they be cross-compatible? Can current panels even really claim to be HDR?

IMO tv makers should be focusing on delivering useful improvements to their consumers such as:

Improved black levels (through FALD or OLED)
Expanded color gamut (Rec 2020)
High bit display panels
Better motion resolution
etc

Instead of marketing hype.
 

Raistlin

Post Count: 9999
This is barely a storage size improvement,
It's double

and by extension barely a bit rate improvement.
Bitrates are not equal when talking about different codecs. Moreover, we don't even know the bitrate, to my knowledge. Did they say what the spin rate is for this drive, or the proposed spec? That's what determines bitrates.

I fully expect the triple-layer discs to never see use. Funny that you used showmethereceipts.gif, because I want to see the replicators who plan on offering three layers, and at what cost.
I remember a time when BD releases were single-layer for movies and games. Dual layer was considered prohibitive due to costs and yields.

We already have working 4-layer consumer BDXL writers -- and pressed ROMs are less susceptible to error than writables anyway. You can buy them on Amazon.
 

Raistlin

Post Count: 9999
HDR emphasizes increasing the brightness of a scene -- which of course attracts the random Best Buy consumer, but often results in unnatural-looking colors and exposures. Just look at the abuse of 'HDR' in photography. Also not helping are the two competing HDR standards: which will prevail? Will they be cross-compatible? Can current panels even really claim to be HDR?
It's not meant to increase average brightness for random people at best buy. And just because shitty photographers abuse something doesn't make it crap.

Might want to check this out. Besides having some good info on HDR, it actually addresses some of the misinformation that you and others have fallen prey to. - http://www.projectorreviews.com/hom...ical-side/high-dynamic-range-and-hevc-update/

Plus this about the UHD Alliance, where everything is kind of coming together (including the stuff you're asking for below) http://www.neogaf.com/forum/showthread.php?p=146723540

May also want to look up some of the Dolby Vision reactions and info from last year's CES in general. Pretty much everyone was more impressed by it than 4K.

IMO tv makers should be focusing on delivering useful improvements to their consumers such as:

Improved black levels (through FALD or OLED)
Expanded color gamut (Rec 2020)
High bit display panels
Better motion resolution
etc

Instead of marketing hype.
I assume when you say 'high bit display panels', you mean increased bit-depths (10-bit, 12-bit)? Here's the funny thing about higher bit-depths ... that's exactly what HDR is for.

Most displays currently use 8-bit panels because that's all the dynamic range they have to sufficiently saturate. Increasing bit-depth is only useful if the dynamic range (which includes brightness) is increased; otherwise, multiple steps in luma values would simply crush into each other and you wouldn't be able to tell them apart. The corollary is that without higher bit-depths you cannot have HDR, otherwise you'd see banding, since the delta between adjacent steps would be too great.
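A crude way to see both halves of that argument (my own numbers, and assuming evenly spaced steps, which real transfer curves deliberately aren't):

```python
# Luminance covered by one code step, assuming (unrealistically) that
# steps are spread linearly from black to peak. Big steps -> banding.
def nits_per_step(peak_nits: float, bit_depth: int) -> float:
    return peak_nits / (2 ** bit_depth - 1)

print(f"{nits_per_step(100, 8):.2f}")    # ~0.39: today's SDR is fine at 8-bit
print(f"{nits_per_step(1000, 8):.2f}")   # ~3.92: HDR range on 8-bit -> banding
print(f"{nits_per_step(1000, 10):.2f}")  # ~0.98: HDR wants 10-bit or better
```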

Also of note regarding Dolby's initiative (addressed in the HDR link above): they're looking for increased ANSI contrast ratios, not just the sort of bullshit on/off contrast ratios you see routinely advertised. That's important because it's not just about improving the realism of things like specular highlights; it's also looking to improve the low end - shadow detail.


Manufacturers are working on all of the above, and that includes HDR.
 

golem

Member
It's not meant to increase average brightness for random people at best buy. And just because shitty photographers abuse something doesn't make it crap.

Might want to check this out - http://www.neogaf.com/forum/showthread.php?p=146723540

LOL at that doctored comparison. If that's what Life of Pi looked like on a conventional LCD, they should all be burned in a fire. Also, doesn't that prove my point? The colors are graded differently and the brightness is obviously jacked up beyond the two other LCDs.

If you're telling me Samsung's 2015 panel will have 1000 nits (far short of the 4000 Dolby is suggesting), the full P3 color space, and a 10-bit panel, then I applaud them greatly.

I'm doubtful, however.
 

Raistlin

Post Count: 9999
LOL at that doctored comparison. If that's what Life of Pi looked like on a conventional LCD, they should all be burned in a fire. Also, doesn't that prove my point? The colors are graded differently and the brightness is obviously jacked up beyond the two other LCDs.
It's going to look doctored because it was a high dynamic range photo that has to be dithered down to work on our shit monitors.

It's no different than using our shit monitors to demonstrate black levels. While you can't actually witness the real black level, you can however see the relative black levels when side by side.

If you're telling me Samsung's 2015 panel will have 1000 nits (far short of the 4000 Dolby is suggesting), the full P3 color space, and a 10-bit panel, then I applaud them greatly.

I'm doubtful, however.
You want advances ... but then you don't believe they can happen ;)

If you look into the thread I made regarding the UHD Alliance, I'm skeptical as well. Not necessarily that Samsung isn't hitting their specs (or at least getting near them) ... but no display in 2015 will come anywhere close to saturating what UHD can do. They won't. Actually, they can't, because of current HDMI frequency constraints, even if the panel could (which it can't). It's certainly gonna be a few years until we really have panels that come close to what UHD can do. But the point is, for the first time in quite a while we actually have manufacturers pushing to improve image quality beyond just resolution ... and we'll actually have content to support it.


Still though, look into what people that actually attended CES last year had to say about Dolby Vision, etc. There's nothing really more I can say if you're simply going to disregard what the people that have actually seen it have to say.
 

golem

Member
Still though, look into what people that actually attended CES last year had to say about Dolby Vision, etc. There's nothing really more I can say if you're simply going to disregard what the people that have actually seen it have to say.

I actually think what Dolby in particular is trying to do is good work, but it seems to me that manufacturers today are saying 'this is what we think it should be, someday' instead of 'look at what we have achieved now!' Meanwhile, they're slapping HDR marketing terms on panels that fall short of any decent standard.

Here's one impression I read recently from a CES showgoer on the UHD Alliance demo:

http://www.avsforum.com/forum/40-ol...775690-ces-2015-reactions-8.html#post30562361

I saw demonstrations of HDR and it left me unimpressed. These were shown in a darkened room and it was uncomfortable on the eyes. Ocean waves in the demo were, to me, unnaturally bright and, of course as is true with all these phony demos, the 'control' display's picture was absurdly subdued to enhance the difference.

Seriously, how dumb do these guys think we are?

Similarly, a demo of an expanded color gamut actually looked unnatural next to the control demo that appeared to have more natural colors.

I remember the Dolby Vision impressions being more positive.
 
The players had better debut at $299 or less. Heck, BDXL drives are readily available NOW on Amazon, Newegg, TigerDirect, etc. for under $75.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
We already have working 4-layer consumer BDXL writers -- and pressed ROMs are less susceptible to error than writables anyway. You can buy them on Amazon.

And those 100GB discs are $70.
 

Raistlin

Post Count: 9999
I actually think what Dolby in particular is trying to do is good work, but it seems to me that manufacturers today are saying 'this is what we think it should be, someday' instead of 'look at what we have achieved now!' Meanwhile, they're slapping HDR marketing terms on panels that fall short of any decent standard.

Here's one impression I read recently from a CES showgoer on the UHD Alliance demo:

http://www.avsforum.com/forum/40-ol...775690-ces-2015-reactions-8.html#post30562361



I remember the Dolby Vision impressions being more positive.
It's certainly possible they had the overall brightness too high, but the reality is they can't just magically jump from what were essentially higher-resolution 1080p displays to full UHD in a year. Which is precisely why I've argued people should wait unless they're fine buying another set in a few years. Jumps this dramatic take time, but the fact is they've objectively improved a lot over last year's models ... which is important.

A key factor here is Dolby is part of the UHD alliance, they're going to be steering the ship on many of the important facets of this.


I get that you're concerned some manufacturers will advertise a poor implementation of HDR as the second coming ... I appreciate that, and unfortunately expect that. That sort of thing has always happened with tech. Improvements are being made, and we need to separate the concept and intent of a technology (in this case HDR/high bit-depths) versus the poor or early implementers and their hype.
 

Raistlin

Post Count: 9999
And those 100GB discs are $70.
That's because they're writable. Writable pricing has never correlated well with ROM pricing; the mechanism for laying down data is totally different. Even regular BD writables are still pricey.


That said, I love when arguments like this have one side pick the most expensive option possible as a 'win'. There's a 3-pack of Sonys there for $45. And it's not like I searched around for that; it's on the first page without scrolling (5th item) when searching for 'bdxl 100gb'. But again, that doesn't even correlate to ROMs anyway.

:\
 
100GB seems really low, unless I'm wrong in assuming that a 4K movie would need four times the space that a 1080p one would. Are they using different compression codecs or something?
 

Raistlin

Post Count: 9999
100GB seems really low, unless I'm wrong in assuming that a 4K movie would need four times the space that a 1080p one would. Are they using different compression codecs or something?
h.265 (HEVC)

It's certainly enough for 4K. Where it gets a bit worrisome is stuff like higher bit-depths and improved chroma sub-sampling, since that all increases bandwidth. Particularly if they want to do higher framerates and/or 3D.
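For a sense of what better chroma alone costs (a ballpark of my own, counting raw samples per pixel):

```python
# Raw samples per pixel for common chroma sub-sampling schemes: one luma
# sample per pixel, plus however much chroma survives the sub-sampling.
SAMPLES_PER_PIXEL = {"4:2:0": 1.5, "4:2:2": 2.0, "4:4:4": 3.0}

for scheme, spp in SAMPLES_PER_PIXEL.items():
    print(f"{scheme}: {spp / 1.5:.2f}x the raw data of 4:2:0")
# 4:2:2 is ~1.33x and 4:4:4 is 2x -- before any bit-depth or framerate bumps.
```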

We'll have to see how that goes.
 

Arkanius

Member
I guess it comes down to whether you think you'll use it enough ... or whether it's better to just go the ISF route since they have the equipment? Obviously you have to consider the fact we're moving to a new color specification. So is it worth getting one now that only does Rec 709? Might want to wait until one is available with selectable gamuts?

Another cheap option would be to buy this - http://www.amazon.com/dp/B000V6LST0/?tag=neogaf0e-20

It comes with a card that has a red, green, and blue Rec 709 cellophane cutout so you can at least get approximate settings for your colors.

You are right.
AVS HD 709 has a pattern where I turn on my TV's blue-only mode and adjust Tint and Color until the bars stop flashing entirely. 80% of the time I find that specific adjustment makes a huge difference in colour accuracy on the TVs I calibrate (home and friends').
 

nOoblet16

Member
I doubt that's ever going to happen with how poorly it plays with refresh rates on TVs. You're better off hoping that HFR filmmaking graduates to 60 fps.

Are there even any TVs that support 144Hz?

How can they have 24fps on a TV with no artifacts but can't get 48 to work?
 