That reminded me of a certain crazy Russian.
Who now seems to think he and his "insider" colleagues are somewhat vindicated after this leak!
That reminded me of a certain crazy Russian.
Wow, so many games? Thanks for entertaining me. Oh, it's not even 1080p. Even better.
Wait, so it's 34 GB/s in either direction? Argh, I hate it when companies are unclear about whether a figure means read and write combined, or read or write.
Typically DRAM bandwidth is measured in either direction, not combined, right? So, say, dual-channel DDR3-1600 is 25.6 GB/s, and that covers data going both up and down; it's not a combined figure. What is the PS4's figure, combined or one direction?
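For what it's worth, here's a quick sanity check of that dual-channel DDR3-1600 number; the key point is that the 25.6 GB/s is the peak of the bus whichever way the data is flowing, not 25.6 GB/s up plus 25.6 GB/s down at the same time:

```cpp
// Back-of-the-envelope check of the dual-channel DDR3-1600 figure.
#include <cstdio>

int main() {
    const double transfersPerSec = 1600e6;   // DDR3-1600: 1600 MT/s
    const double bytesPerTransfer = 8;       // 64-bit wide channel
    const int channels = 2;                  // dual channel
    double peakGBs = transfersPerSec * bytesPerTransfer * channels / 1e9;
    // The bus is half-duplex, so this peak is shared between reads and writes.
    std::printf("peak bandwidth: %.1f GB/s\n", peakGBs);   // prints 25.6
}
```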
MSAA doesn't play well with deferred rendering, and seeing how most games use deferred, MSAA is out of the question; they have to use post-processing AA. Now, MSAA isn't perfect either: the foliage in FH2 is particularly bad in this regard, and sometimes it ends up looking horrendous.
You said that you don't know of any other game that uses 4xMSAA on current gen. So I said: UFC.
Okay, maybe there is a workaround, but will anyone apart from first-party devs get down to fully utilizing it? It seems like a ton of work compared to simply reducing resolution like it's done now.
No matter what your opinion on this is, the IQ in FH2 is among the best this gen, and it's easy to see.
Okay, maybe there is a workaround, but will anyone apart from first-party devs get down to fully utilizing it? It seems like a ton of work compared to simply reducing resolution like it's done now.
This might not be due to Forward+ itself, but I remember hearing from Playground that they decided not to render a depth buffer (as that requires extra steps with transparencies), since the Forward+ method they used doesn't require it, and without a depth buffer some effects might indeed have become too costly. But from interviews it seemed more like a development-driven decision rather than a performance one.

For FH2 again: yes, it has many light sources, but none of them cast shadows; only the sun does. Compare that to deferred renderers like Gran Turismo, DC and PCars, where shadows can be cast from street lights and headlights; it's much more impressive looking and requires a lot more processing power. Playing on X1 myself, FH2 looks very similar to GTA V in terms of graphics, except that in GTA V street lights actually cast dynamic shadows on all objects.
Async I/O from ESRAM (the third method I described) does require additional effort, and thus many probably won't do it. But thankfully, this time the way the GPU can split the buffers is much more flexible and comes with no performance hindrance compared to the 360, so it should require less work to deal with a constrained space.
For instance, now (after recent upgrades to the performance analysis tools) you can, during normal debugging, find out if you have resources in ESRAM that aren't making enough use of its advantages, or resources in DDR3 that could use them. And if you do know where to put a resource, you can flag it during allocation to assign it to either memory. You could do something like this: for this G-buffer, store the normal and specular data in ESRAM and the roughness attribute in DDR3, and the new APIs will take care of that for you. That's an extra step that isn't required on PS4, yes, so I don't know how developers will respond.
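Just to illustrate the idea (this is not the real XDK API; every type, pool and flag name below is made up), a minimal sketch of what "tag the resource at allocation time, spill to DDR3 when ESRAM is full" could look like:

```cpp
// Hypothetical sketch only: none of these types or pools are the real XDK API,
// they just illustrate splitting G-buffer attributes between ESRAM and DDR3.
#include <cstddef>
#include <cstdio>
#include <string>
#include <vector>

enum class MemPool { ESRAM, DDR3 };

struct RenderTarget {
    std::string name;
    int width, height, bytesPerPixel;
    MemPool pool;                              // requested placement
    std::size_t sizeBytes() const {
        return std::size_t(width) * height * bytesPerPixel;
    }
};

int main() {
    const std::size_t kEsramBudget = 32u * 1024 * 1024;   // 32 MB of ESRAM
    std::size_t esramUsed = 0;

    // Hot, bandwidth-heavy targets are flagged for ESRAM; cold data for DDR3.
    std::vector<RenderTarget> gbuffer = {
        {"normal+specular", 1920, 1080, 4, MemPool::ESRAM},
        {"albedo",          1920, 1080, 4, MemPool::ESRAM},
        {"depth",           1920, 1080, 4, MemPool::ESRAM},
        {"roughness",       1920, 1080, 1, MemPool::DDR3},
    };

    for (auto& rt : gbuffer) {
        // Spill to DDR3 when the requested target no longer fits in ESRAM.
        if (rt.pool == MemPool::ESRAM && esramUsed + rt.sizeBytes() > kEsramBudget)
            rt.pool = MemPool::DDR3;
        if (rt.pool == MemPool::ESRAM)
            esramUsed += rt.sizeBytes();
        std::printf("%-16s -> %s (%.1f MB)\n", rt.name.c_str(),
                    rt.pool == MemPool::ESRAM ? "ESRAM" : "DDR3",
                    rt.sizeBytes() / (1024.0 * 1024.0));
    }
    std::printf("ESRAM used: %.1f of 32 MB\n", esramUsed / (1024.0 * 1024.0));
}
```

The point is simply that the split is decided per render target up front, which is the extra step PS4 developers never have to think about.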
This might not be due to Forward+ itself, but I remember hearing from Playground that they decided not to render a depth buffer (as that requires extra steps with transparencies), since the Forward+ method they used doesn't require it, and without a depth buffer some effects might indeed have become too costly. But from interviews it seemed more like a development-driven decision rather than a performance one.
Well, I guess only time will tell whether these improvements make an actual difference, or whether we're gonna be stuck with 720p/900p until 2020.
Come on man, MS is playing the underdog now, pretending the X1 is 1.3 TFLOPS and shit, just to undercut Sony later on and reveal that it's actually 5 times more powerful, has a secret hidden GPU and can do raytracing at 1080p/60fps.... lol
Who now seems to think he and his "insider" colleagues are somewhat vindicated after this leak!
Could Crysis 3 run on the Xbone in 1080p? Probably not. Ryse uses a more advanced version of CE3 and has GI/SSR/SSS and whatnot, but it's 900p with dips below 30 fps.
So to you, technical improvement can only mean increasing resolution? And why do you exaggerate with "720p/900p" when there are 1080p games, and especially when in 2014 there weren't many 720p games at all?
Oh sorry, didn't see this before:
Sometimes we should check post history before entering a discussion.
Because 1080p should be assumed for new hardware after sitting on 720p 8 years
Because 1080p should be assumed for new hardware after sitting on 720p for 8 years. Hell, it was promised for last generation, and yet here it is, 2015, and we're still not getting 1080p games. It's downright pathetic; you would imagine that a megacorp with unlimited resources could build a decent enough machine to handle a resolution that has been standard for TVs for almost a decade.
I never understood the need for 3 OS's.
Because 1080p should be assumed for new hardware after sitting on 720p for 8 years. Hell, it was promised for last generation, and yet here it is, 2015, and we're still not getting 1080p games. It's downright pathetic; you would imagine that a megacorp with unlimited resources could build a decent enough machine to handle a resolution that has been standard for TVs for almost a decade.
One nice feature that I use is that if a game crashes, it doesn't take down the whole system. Just hit the home button, kill the game and re-start it.
Then you should be complaining about the PS4 too. Both consoles are weak for 1080p60 gaming compared to the stats of PC gaming rigs coming out contemporaneously. This is old console war garbage talk.
In regards to the OP: a really interesting read. I can only make heads or tails of half of it, but getting this info out to the people who care is certainly useful.
Yep. Getting an XB1 in this respect was like upgrading from OS 9 to Mac OS X; suddenly system issues and nonresponsive games weren't the same kind of problem, since they didn't bring down your entire system. Definitely an advantage this gen for all systems.
I think there's maybe 1 or 2 PS4 games that aren't 1080p. As far as 60fps goes, yes, hardware has a part to play, but devs could easily prioritize framerate over some of the graphical effects they have going on; most don't.
Because 1080p should be assumed for new hardware after sitting on 720p for 8 years. Hell, it was promised for last generation, and yet here it is, 2015, and we're still not getting 1080p games. It's downright pathetic; you would imagine that a megacorp with unlimited resources could build a decent enough machine to handle a resolution that has been standard for TVs for almost a decade.
One nice feature that I use is that if a game crashes, it doesn't take down the whole system. Just hit the home button, kill the game and re-start it.
And it even features 4xMSAA, which I don't know of in any other game on current gen.
The Order does 4xMSAA
Thanks, that makes it 3 at 3 different resolutions.
The Order does 4xMSAA
But only using 2/3 of the total screen because of this ugly cinematic resolution.
MS clearly tells devs what they can expect. Snapped apps won't "harm" the game OS; that's why they have the hypervisor, as explained before in this thread.
Because of this uncertain nature, Microsoft assert that the title must be able to tolerate a variance of up to 3%. Furthermore, you as a user can have a profound effect on the reserve. In some instances, the ESR (Extended System Reserve) will force the reserve to 4 percent of the GPU per frame. In other words, for certain actions you take, or for notifications and other prompts, the game will not have access to 100 percent of the GPU (even if it normally does) and instead has to operate with only 96% of the GPU available to it while it's drawing that particular frame of animation.
While it might not sound like a lot, it's about 50 GFLOPS of computing power that the game isn't able to use. Because of this, Microsoft suggest that you alter how the game runs while this happens, to fit into the lower performance budget. They suggest either downscaling the internally rendered image (the example given is 1440×810, a total of 1,166,400 pixels, compared to 1080p's native 2,073,600 pixels) and then upscaling it, or alternatively reducing effects quality (so, perhaps rendering things in the distance at a lower level of detail, or cutting back on certain shadow details). This will generally happen in situations where the title is in fill mode and other menus or items are on the screen, so users won't notice the drop in quality.
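Quick check of the two figures quoted above; the ~1.31 TFLOPS peak for the XB1 GPU is the commonly cited spec, not something stated in that passage:

```cpp
// Quick check of the "about 50 GFLOPS" and "1440x810" figures.
#include <cstdio>

int main() {
    const double gpuTflops = 1.31;            // assumed XB1 GPU peak (not from the doc)
    std::printf("4%% reserve ~= %.0f GFLOPS\n", gpuTflops * 1000.0 * 0.04);   // ~52

    const double native  = 1920.0 * 1080.0;   // 2,073,600 pixels
    const double reduced = 1440.0 * 810.0;    // 1,166,400 pixels
    std::printf("1440x810 has %.1f%% of the pixels of native 1080p\n",
                100.0 * reduced / native);    // ~56%
}
```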
While we've heavily discussed Microsoft increasing CPU allocation by freeing up the seventh CPU core, it's vital to remember that the proportion of CPU time available on the seventh core can vary drastically based on what's happening with the Xbox One. Microsoft assures developers they can count on having at least 50 percent of the core available at all times, but when the system must process certain commands spoken by the user (say, "Xbox go to friends"), 50 percent of the seventh core is required to run that task. So in effect, in the worst-case scenario, developers gain 50 percent of a single CPU core. If these commands are not running (which should be the majority of the time), this rises to 80% of the core's CPU time being available for the game. In effect, this means the amount of CPU performance available on the seventh core can vary by 30 percent, and currently the title isn't informed that this CPU performance is about to be snatched away from it. This clearly isn't an ideal situation, and Microsoft admit as much, pointing out that an updated SDK release will fix this issue and provide the game with a notification. Also, optimization isn't easy at the moment, because performance counters aren't providing details of what's happening on the Xbox One's seventh CPU core. This means developers can't really profile the performance and optimize as well as they should; again, Microsoft are keen to stress this will also be fixed ASAP.
However, there's no such thing as a free lunch, and the additional CPU power comes with conditions and trades attached - however, there is the potential for many games to benefit. Firstly, developers need to give up custom, game-specific voice commands in order to access the seventh core at all, while Kinect's infra-red and depth functionality is also disabled. Secondly, the amount of CPU time available to developers varies at any given moment - system-related voice commands ("Xbox record that", "Xbox go to friends") automatically see CPU usage for the seventh core rise to 50 per cent. At the moment, the operating system does not inform the developer how much CPU time is available, so scheduling tasks will be troublesome. This is quite important - voice commands during gameplay will be few and far between, meaning that 80 per cent of the core should be available most of the time. However, right now, developers won't know if and when that allocation will drop. It's a limitation recognised in the documentation, with Microsoft set to address that in a future SDK update.
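Purely as an illustration of the scheduling problem being described: the docs say the title currently isn't told how much of core 7 it has, so the query function below is a made-up stand-in for the notification Microsoft says a future SDK will add. It just shows how the per-frame budget on that core swings between the 50% and 80% cases:

```cpp
// Illustrative only: QueryCore7Availability() is a made-up stand-in for the
// notification that the leaked docs say doesn't exist yet.
#include <cstdio>

double QueryCore7Availability(bool voiceCommandActive) {
    // 50% of core 7 during system voice commands, 80% otherwise.
    return voiceCommandActive ? 0.5 : 0.8;
}

int main() {
    const double frameMs = 33.3;             // one 30 fps frame
    const bool cases[] = {false, true};
    for (bool voiceActive : cases) {
        double budgetMs = frameMs * QueryCore7Availability(voiceActive);
        // A title would only queue optional core-7 work (decompression, audio
        // mixing, particle updates) that fits inside this budget.
        std::printf("voice command %s -> %.1f ms of core-7 time this frame\n",
                    voiceActive ? "active" : "idle  ", budgetMs);
    }
}
```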
I think there's maybe 1 or 2 PS4 games that aren't 1080p.
Easy to tell this is an XB1 thread. Have people posted the PS4 flops count yet, or are we still on 1080p list wars?

I believe there's 4 in total -- UFC, Assassin's Creed Unity, Battlefield 4, and Watch_Dogs
I believe there's 4 in total -- UFC, Assassin's Creed Unity, Battlefield 4, and Watch_Dogs
The fact remains that after only 1 year MS do appear to be scraping the bottom of the barrel for any crumb of power available, and it already has an impact on games' stability (cf. the inability of devs to know if they have 50% or 80% of the 7th core available at any given time).
I believe there's 4 in total -- UFC, Assassin's Creed Unity, Battlefield 4, and Watch_Dogs
Doesn't at least one game drop the resolution dynamically on PS4... was it Wolfenstein?
Doesn't at least one game drop the resolution dynamically on PS4... was it Wolfenstein?
Isn't Killzone SF 1080i?
Isn't Killzone SF 1080i?
Yes.
No.
Doesn't at least one game drop the resolution dynamically on PS4... was it Wolfenstein?
KZ SF is 1080i in multiplayer.
Wolfenstein maintains a full 1080-pixel vertical resolution but dynamically alters the horizontal resolution. It's not something I've ever noticed when playing it. It's a very clever solution.
I hope we see more of this kind of technique.
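For anyone curious, here's a minimal sketch of the dynamic-horizontal-resolution idea: keep the height at 1080 and scale the width against last frame's GPU time. The damping factor and the minimum width are arbitrary picks for illustration, not anything taken from Wolfenstein or the SDK:

```cpp
// Minimal sketch of dynamic horizontal resolution (illustrative numbers only).
#include <algorithm>
#include <cstdio>

int ChooseRenderWidth(double lastGpuMs, double targetMs, int currentWidth) {
    // Scale pixel count toward the target, damped so the width doesn't oscillate.
    double scale = targetMs / lastGpuMs;
    int width = int(currentWidth * (1.0 + 0.85 * (scale - 1.0)));
    return std::clamp(width, 1152, 1920);    // never below 60% width, never above native
}

int main() {
    int width = 1920;
    const double targetMs = 16.6;            // 60 fps frame budget
    const double gpuTimes[] = {15.0, 18.5, 21.0, 16.0};
    for (double gpuMs : gpuTimes) {
        width = ChooseRenderWidth(gpuMs, targetMs, width);
        std::printf("gpu %.1f ms -> render at %dx1080\n", gpuMs, width);
    }
}
```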
I wouldn't count on it, unfortunately. It's a technique that's been known since Wipeout HD, and it has very rarely been used so far.
I wouldn't count on it becoming a standard, unfortunately. It's a technique that's been known since Wipeout HD, and it has very rarely been used so far.
What's strange is that, according to the Xbone SDK, there's an API for handling that automatically, so developers wouldn't even need to do anything other than set a flag in code.
Developers have access to the documentation and SDK components, so they knew about this from the start and will get any updated information. This is only interesting to people without access to an XBone SDK, who can't really develop for it anyway (or whom these documents won't apply to).

Damn, some developers might actually find some of these things interesting in the long run.
The answer is: it depends. The ESRAM manufacturing process differs from the main APU's, though, so it will always be more expensive to manufacture and to shrink.

By how much? I think it's hard to say how expensive it really is, as it's in one piece of hardware and part of the APU as a whole. And it's likely to get way cheaper as the die shrinks.
The problem with the Xbone is that this technique actually does not benefit it all that much. It's much more suited to the PS4.

What's strange is that, according to the Xbone SDK, there's an API for handling that automatically, so developers wouldn't even need to do anything other than set a flag in code.
As well as a few games last gen; still, it won't be a standard.

On the XB1, CoD:AW does this for the single-player campaign. It jumps between 1360x1080 and 1920x1080.
Sure, I know, I hadn't even realized I was writing EDRAM.

Not to be that guy, but it isn't EDRAM. It is ESRAM; they are very different things.
Thanks, that makes it 3 at 3 different resolutions.
EDRAM is limiting the high end of the resolution range...
Sure, I know, I hadn't even realized I was writing EDRAM. Edited.
And they are not that different!
I never understood the need for 3 OS's.
The problem with the Xbone is that this technique actually does not benefit it all that much. It's much more suited to the PS4.
EDRAM is limiting the high end of the resolution range, in many cases even 1080p, and jumping from 900p to 720p would not be as beneficial, though it's still viable.
I personally always saw this technique not only as a 'performance stabilizer' but also as an IQ enhancer, by running the game in 1.5x1.5 SSAA mode when there is GPU time to spare.
This scenario would never work on the Xbone, though, because of the EDRAM size limitation.
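Rough numbers on why the ESRAM size is the sticking point here, assuming plain 32-bit (4 bytes per pixel) colour and depth targets; real setups often need more targets than this:

```cpp
// Rough size check: 1.5x1.5 supersampling vs the 32 MB of ESRAM.
#include <cstdio>

double targetMB(int w, int h, int bytesPerPixel) {
    return w * double(h) * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    const double esramMB = 32.0;
    // One 32-bit colour target plus one 32-bit depth target, nothing else.
    double native1080p = 2.0 * targetMB(1920, 1080, 4);   // ~15.8 MB
    double ssaa15x     = 2.0 * targetMB(2880, 1620, 4);   // ~35.6 MB, already over budget
    std::printf("1080p colour+depth:     %.1f of %.0f MB ESRAM\n", native1080p, esramMB);
    std::printf("2880x1620 colour+depth: %.1f of %.0f MB ESRAM\n", ssaa15x, esramMB);
}
```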