Programming Issues on Xbox One - Jonathan Blow

It really seems like Sony is deciding to enter this generation by earnestly addressing and correcting all the major mistakes they made with hardware in the last few generations, while Microsoft is deciding to rest on its laurels and push a voice-activated tv remote into our living rooms, hoping for the best.

Not really. Lack of focus / rushed timetables did it.
 
 
That 30fps vs 60fps, even if technically possible, isn't going to happen.

Microsoft will just go "Both games must be pretty much the same or we won't let you publish it"

Exclusives however, I'm expecting a pretty big difference.

What if all the publishers find it easy to produce superior performance on the PS4? Will MS tell them all they can't make games? The same MS who boast (arguably) the weakest first party setup?

Of course that's assuming that they sell enough of these things to wield any clout of the magnitude required to dictate terms to publishers.

And then of course we are assuming that Sony will be a passive bystander in this arrangement, which is strange considering the message they are currently pushing.
 
This chart is deceptive. It's using Xbox One's actual memory frequency, while comparing it to PS4's effective memory frequency. I wouldn't say that's a fair comparison.

someone please correct me if I'm wrong

you're wrong. it's a fair comparison. it may be an "effective" frequency, but the bandwidth calculation listed is correct by the nature of the tech involved in gddr5

this is what i got from another forum

"DDR = Double data rate

DDR transmits TWICE per cycle. Hence the name (double data rate).

GDDR5 transmits on the high as well as the low edge so doubles again."

basically GDDR5 is frequency*4, so the GDDR5 in the PS4 runs @ 1375MHz but it can read/write 4 times per clock, so you get an effective "5500MHz"

the XB1 is also using an effective frequency. DDR3 can read/write 2 times per clock, so it should be divided by 2 to get the real value
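
if you want a rough sanity check on what those effective rates mean for bandwidth, here's a quick sketch (assuming the commonly reported 256-bit bus on both machines, and ignoring the XB1's ESRAM):

```python
# Rough peak-bandwidth check: effective transfer rate (MT/s) x bus width (bits) / 8 -> bytes/s.
# A 256-bit bus is assumed for both consoles; the XB1's ESRAM is not counted here.
def peak_bandwidth_gb_s(effective_mt_s, bus_width_bits=256):
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(f"PS4 GDDR5 @ 5500 effective: ~{peak_bandwidth_gb_s(5500):.0f} GB/s")  # ~176 GB/s
print(f"XB1 DDR3  @ 2133 effective: ~{peak_bandwidth_gb_s(2133):.0f} GB/s")  # ~68 GB/s
```

which is where the 176 GB/s vs 68 GB/s figures floating around come from.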
 

Hey if it's locked with triple buffered V-sync that is fine, but I just can't see a developer doing that when they can simply lock it at 30fps making everything easier on themselves IMO. 45fps would definitely be preferable over 30fps, but I'd rather get the visual fidelity even higher and drop that down to 30fps. And judging from the way this console generation has been, I bet we'll continue to see a lot of 30fps games in the future.
 
Was curious about this. Gets even more interesting - also says that Sony's tools and libraries are better than MS's at this point, and better integrated with Microsoft's own IDE!

[screenshots of Jonathan Blow's comments]

Who is this man?

I'm interested in hearing more stuff like this - now both consoles are revealed I hope we hear more from developers about their experience with each.
 
33% difference isn't going to yield a doubled frame rate.

You're assuming that the XONE would only be capable of 30 fps -- and nothing more. That 30fps vs. 60fps is deceiving.

Picture this: If the XONE could do, say, 45 fps at a variable rate for a game, then saying the PS4 version could hit 60fps becomes completely feasible. However, since an inconsistent framerate is unappealing to the human eye, devs would naturally lock it at 30fps on the XONE, but let the PS4 do 60fps. And so, you have the 30 vs 60 gap mentioned by Jonathan.
 
I find it hard to believe that MS could mandate platform parity and it would not immediately be leaked and very widely known with lots of disgruntled devs complaining that they had to intentionally cripple the Sony version.
 
Hey if it's locked with V-sync that is fine, but I just can't see a developer doing that when they can simply lock it at 30fps, making everything easier on themselves, IMO. 45fps would definitely be preferable over 30fps, but I'd rather get the visual fidelity even higher and drop that down to 30fps. And judging from the way this console generation has been, I still see a lot of 30fps in the future.

Locked 45 fps would be awful. Every other frame would be displayed for twice as long as the others. You can eliminate tearing but not the inconsistent frame times.
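
If you want to see why, here's a minimal sketch assuming a 60Hz display with V-sync (the numbers are just illustrative):

```python
import math

REFRESH_MS = 1000 / 60   # assumed 60Hz panel: a new image can only go up every ~16.7 ms
FRAME_MS   = 1000 / 45   # game locked to render a frame every ~22.2 ms

prev_vsync = 0
for i in range(1, 7):
    ready = i * FRAME_MS                      # when the frame finishes rendering
    vsync = math.ceil(ready / REFRESH_MS)     # first refresh after it's ready
    gap = (vsync - prev_vsync) * REFRESH_MS   # time since the previous frame went up
    print(f"frame {i}: new image after {gap:.1f} ms")
    prev_vsync = vsync
```

The gaps bounce between ~16.7 ms and ~33.3 ms instead of staying constant, which is the jank you see.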
 
That 30fps vs 60fps, even if technically possible, isn't going to happen.

Microsoft will just go "Both games must be pretty much the same or we won't let you publish it"

Exclusives however, I'm expecting a pretty big difference.

With the negative press the Xbox One is currently receiving: outdated set-top box focus, lower specs than its competition, 24-hour online check-in requirement, no used games/lending games without paying fees... it may not be in quite the position of power that the 360 was in to demand such things. But time will tell, of course.
 
Locked 45 fps would be awful. Every other frame would be displayed for twice as long as the others. You can eliminate tearing but not the inconsistent frame times.

Well there ya go, I experimented with locked 40-45 fps before and I wondered why it was so fucking janky. Every day is a school day, haha.
 
But they will. No one is going to optimise for the stronger of the two. Never have and never will. At least 3rd parties, anyway. Obviously 1st parties will, because that's all they have to work with.

Considering the PC centric approach of the PS4 and XBONE and how similar they are to each other I'd say that optimising for the stronger console is going to be quite easy.

Devs have been doing this for a long, long time now. They have to cater to a lot of PC configurations. It should be really easy on the new HD twins.

If what Blow said is true then it would be as simple as having High and Mid settings on a PC game.

Resolution and texture quality will have a considerable difference. Maybe even FPS on some games.
 
you're wrong. it's a fair comparison. it may be an "effective" frequency, but the bandwidth calculation listed is correct by the nature of the tech involved in gddr5

this is what i got from another forum

"DDR = Double data rate

DDR transmits TWICE per cycle. Hence the name (double data rate).

GDDR5 transmits on the high as well as the low edge so doubles again."

basically GDDR5 is frequency*4, so the GDDR5 in the PS4 runs @ 1375MHz but it can read/write 4 times per clock, so you get an effective "5500MHz"

the XB1 is also using an effective frequency. DDR3 can read/write 2 times per clock, so it should be divided by 2 to get the real value

I stand corrected /bow

Kind of lame that MS is using 1066 mem in the Xbox1... guess they weren't kidding when they said they didn't aim for the highest specs possible.
 
That 30fps vs 60fps, even if technically possible, isn't going to happen.

Microsoft will just go "Both games must be pretty much the same or we won't let you publish it"

Exclusives however, I'm expecting a pretty big difference.

This does happen. MS do tell devs and give them ultimatums against the PS3 version. Like it has to have motion controls or Kinect support if you're putting Move into it. Otherwise we're not releasing your game on our platform.
 
33% difference isn't going to yield a doubled frame rate.

Using that number seems a bit disingenuous, even blatantly false, with regard to improving the framerate. The PS4 GPU has 50% more computational power than the XBO's, and since you're using the XBO performance as a baseline from which to improve you should be stating the increase in power when going from there to the PS4 (50%), not the decrease in power when going the other way (which would be 33%, yes).

I'm not saying we're gonna see a lot of 30 vs 60 fps situations (higher res, larger textures, more effects, etc, seem more likely), but if you're gonna post performance numbers then do it right.
 
it's a case of "glass half empty vs glass half full"

if you take the power of the PS4 and divide it by the XB1 power, you get 1.5 which says the PS4 has 50% more gpu power than the XB1

if you take the power of the XB1 and divide it by the PS4's gpu power, you get .6666666 which says that the XB1 has around 33% less gpu power than the PS4

it's just looking at it in two different ways, but the overall difference remains the same.

768 shader cores for XB1 vs 1152 shader cores for PS4

So, it's still 50%.
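
same numbers in code form, if it helps (using the shader core counts above):

```python
# Same hardware gap, framed both ways (shader core counts from the posts above).
ps4_cores, xb1_cores = 1152, 768

print(f"PS4 over XB1: {ps4_cores / xb1_cores - 1:.0%} more")   # 50% more
print(f"XB1 under PS4: {1 - xb1_cores / ps4_cores:.0%} less")  # ~33% less
```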
 
This does happen. MS do tell devs and give them ultimatums against the PS3 version. Like it has to have motion controls or Kinect support if you're putting Move into it. Otherwise we're not releasing your game on our platform.

The difference is too large to enforce this. Bigger than PS2 vs Xbox. Their BS tactics won't work.
 
Using that number seems a bit disingenuous, even blatantly false, with regard to improving the framerate. The PS4 GPU has 50% more computational power than the XBO's, and since you're using the XBO performance as a baseline from which to improve you should be stating the increase in power when going from there to the PS4 (50%), not the decrease in power when going the other way (which would be 33%, yes).

Yeah, see the recent DF PS4 v Xbox One analysis thread title.

Digital Foundry: PS4's GPU is 50% faster

OP of that thread: Xbox GPU is 33% less powerful

:lol
 
That 30fps vs 60fps, even if technically possible, isn't going to happen.

Microsoft will just go "Both games must be pretty much the same or we won't let you publish it"

Exclusives however, I'm expecting a pretty big difference.

Then why did they allow FF13 on 360 while it was visually inferior to the PS3 version?

Also, there's the fact that such a move could backfire hard.
 
The Killzone Shadow Fall trailer was low framerate (certainly less than 25 in the opening). I'd love to see 60 fps standard though, as that was what I was used to on the Gamecube.
 
60 fps takes 3x the resources of 30 fps due to frame setup time.

Resolution is a much more likely difference.

Shouldn't it be closer to 2x to 2.5x?
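
Depends on how much fixed per-frame cost you assume. A quick sketch (the overhead figures are made up purely to illustrate the point):

```python
# How fixed per-frame work changes the cost of going from 30 fps to 60 fps.
# The overhead values are hypothetical; they stand in for per-frame setup
# that doesn't get cheaper when you halve the frame time.
def required_speedup(fixed_overhead_ms):
    budget_30 = 1000 / 30 - fixed_overhead_ms   # ms left for actual rendering at 30 fps
    budget_60 = 1000 / 60 - fixed_overhead_ms   # ms left at 60 fps
    return budget_30 / budget_60                # how much cheaper the rendering must become

for overhead in (0, 2, 5, 8):
    print(f"{overhead} ms fixed overhead -> rendering must be ~{required_speedup(overhead):.1f}x cheaper")
```

With no fixed cost it's exactly 2x; the multiplier only creeps toward 3x if a big chunk of each frame is overhead that doesn't scale down.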


The Killzone Shadow Fall trailer was low framerate (certainly less than 25 in the opening). I'd love to see 60 fps standard though, as that was what I was used to on the Gamecube.

Hard to say anything about the opening of that Killzone footage. You have a weird perspective on the Gamecube.
 
People keep mentioning "the cloud" being a big difference maker. But you have to realize that the games still have to come out with full capability for those people who only connect to the internet once per day. Remember? The games have to be playable when the internet is down for up to 24 hours.
 
People keep mentioning "the cloud" being a big difference maker. But you have to realize that the games still have to come out with full capability for those people who only connect to the internet once per day. Remember? The games have to be playable when the internet is down for up to 24 hours.

Well, good thing they won't have to account for that situation as you won't be playing any games then...
 
Not really. Lack of focus / rushed time tables did it.

I wouldn't say lack of focus. It's more that they're trying to push the same strategy they've been using for the past 2-3 years by adding more interactivity between the owner and their technology. This new console looks like their evolution from console manufacturer to set-top box creator.

Too bad Apple is still around and core gamers are usually early adopters.
 
Why are people acting like this hasn't happened before? It has.

Both consoles are running the same hardware this time round, except one is substantially quicker.

All devs have to do next gen is push the "make this game better" button, pretty much, and the game will be better on Sony's machine.

If they spent a further 10 minutes it could be much better.

MS have their work cut out.
 
The Killzone Shadow Fall trailer was low framerate (certainly less than 25 in the opening). I'd love to see 60 fps standard though, as that was what I was used to on the Gamecube.

This is what I thought, but apparently the videos were poorly encoded, and the DF analysis of a much better source revealed that the game practically never dipped below 30FPS.
 