Digital Foundry Performance Analysis: Fallout 4

Loudninja

Member
Any word on the AF on consoles?
We'll be giving a more thorough analysis of the game's visuals in our full Face-Off. But initial testing shows PS4 and Xbox One's core graphics settings are surprisingly close across the board. Texture maps are matched for resolution, with a generous level of anisotropic filtering across the ground for good measure. Each uses the same grade of screen-space ambient occlusion, to match PC's highest, and effects quality is identical too. With everything being so close in the visual stakes though, how does the frame-rate on these machines hold up?
 

Kezen

Banned
I know. I've been saying this for literally years now, when the popular opinion was still that console optimizations and coding to the metal would somehow allow console hardware to punch way above its weight.

It's true for the CPUs: you need better per-core performance to play DX11 games well, although entry-level hardware (because that's what the Core i3 line is) proves sufficient for now.

I'm impressed by this little 750 Ti. No idea how long it will hold out, but it has already vastly exceeded my expectations.

Who in 2013 could have said that such a tiny GPU would fare so well against consoles? No one.
Yet here we are.
 

thelastword

Banned
thelastword, at some point you will just have to accept that you are either overestimating the PS4's performance or underestimating the Core i3/750 Ti's performance. It's been two full years since the launch of the next generation; we've had dozens of retail next-gen multiplatform games, and the Core i3/750 Ti has managed to match or even beat the PlayStation 4 in the overwhelming majority of these games. The examples where the PS4 has managed to offer better performance are few and far between, merely a handful of games these past two years. You may not want to admit it, but these games are the exception, not the rule. You can't keep on talking about bad ports or lazy developers; the numbers don't lie. The numbers seem to indicate that the "bad ports" aren't the ones where the 750 Ti bests the PS4 but the ones where it loses to it.
Are you saying Project Cars was a bad port on PC? My crappy AMD CPU coupled with the 750 Ti runs RE-R2 at 60fps, and so does the XB1, btw. I've pointed out a million other games which should run much better on consoles: Saints Row, Alien Isolation, Xenoverse, Payday, Remake, etc., etc.

He has the game btw.
Does he have the XB1 version as well to compare?
 

Woffls

Member
Very glad I cancelled my Xbone pre-order and waited to see how the game performed at launch. The CPU core utilisation and low VRAM usage at 1080p is good news for my stupid setup (5820K, 750 Ti SC), but I won't be around my PC much for the next couple of months, so I might wait it out and see what patches do over the next couple of weeks.
 

Audette

Member
I asked a page back if my processor would hold up, and reading more into it, it seems like yeah, this is a more processor-heavy game. Fallout 3/NV/Skyrim all played fine, so I was hoping I could make it with this game. Damn, I hate my PC. I need to upgrade the CPU and the mobo but won't have any money anytime soon.

I guess I'll stick to my console pre-order of Fallout 4.
 

Rival

Gold Member
How does a game drop down to zero frames per second? I've never heard of that. Is it common on a console game?
 
It's true for the CPUs: you need better per-core performance to play DX11 games well, although entry-level hardware (because that's what the Core i3 line is) proves sufficient for now.

No doubt, but in this case we also have to take into account that the PC CPUs have to run an entire operating system on top of the game, as well as various other background tasks.
 

Durante

Member
It's true for the CPUs: you need better per-core performance to play DX11 games well, although entry-level hardware (because that's what the Core i3 line is) proves sufficient for now.

I'm impressed by this little 750 Ti. No idea how long it will hold out, but it has already vastly exceeded my expectations.

Who in 2013 could have said that such a tiny GPU would fare so well against consoles? No one.
Yet here we are.
In 2013 the "2x" myth was still extant and often considered fact.

Now we have a GPU which is more like 0.75x in raw numbers, keeping up pretty well on average.
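
For a rough sense of where that 0.75x comes from, here is a back-of-the-envelope sketch using public spec-sheet numbers (the usual cores x clock x 2 FLOPs-per-clock estimate; exact boost clocks vary by board, so treat the figures as approximate):

```python
# Rough FP32 throughput: cores * clock * 2 FLOPs per clock (FMA).
# Spec-sheet numbers only; real-world throughput varies per workload.
def tflops(cores, clock_mhz):
    return cores * clock_mhz * 1e6 * 2 / 1e12

ps4_gpu   = tflops(1152, 800)   # PS4: 1152 shaders @ 800 MHz      -> ~1.84 TFLOPS
gtx_750ti = tflops(640, 1085)   # 750 Ti: 640 cores @ ~1085 MHz    -> ~1.39 TFLOPS

print(f"750 Ti / PS4 = {gtx_750ti / ps4_gpu:.2f}x")  # ~0.75x
```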
 

omonimo

Banned
You're right that the FX-8150 has a slight advantage in The Witcher 3, but again, it's much faster than the i3 on paper, having twice as many threads (or, to put it another way, four times as many physical cores) and a 700MHz clock-speed advantage. That the 8150 comes out slightly ahead isn't something that should be celebrated, especially considering it's clocked significantly higher than the Jaguars in the consoles, which ties into the crux of my earlier post -- that the i3, despite being something of a "fake" quad core, can overcome the core/thread advantage the consoles have because of Intel's superior per-clock performance.
I don't know what to say. To me, The Witcher 3 seems like appreciable work CPU-wise on console. I can live with those kinds of 'compromises'. Fallout 4, no. Honestly, I don't really care to measure it against the CPU on PC when the final results are acceptable. It's quite different to point out how terrible the console CPUs are when the final result is about the worst possible.
 
Funny enough, last-gen games would receive lower scores on certain consoles when they performed and looked worse. That has certainly died this gen.

Every time there is a bad port on consoles, people use the i3/750 Ti as evidence of it keeping up with or outperforming the PS4. A bad port coupled with a capped framerate is compared to a console version, or console versions where the game drops to 0fps whilst walking. An i3/750 Ti is compared to console versions where a beefier GPU shows lower frames in a scene with heavy alpha, when any non-disingenuous person knows that the better GPU should always perform better in such scenarios. These types of arguments are truly amazing.


There must be a lot of bad ports then. Do you consider MGSV a bad port?
 

HORRORSHØW

Member
And Xbox One at 0 fps o.o

It has to be a patchable bug.
 

orochi91

Member
thelastword, at some point you will just have to accept that you are either overestimating the PS4's performance or underestimating the Core i3/750 Ti's performance. It's been two full years since the launch of the next generation; we've had dozens of retail next-gen multiplatform games, and the Core i3/750 Ti has managed to match or even beat the PlayStation 4 in the overwhelming majority of these games. The examples where the PS4 has managed to offer better performance are few and far between, merely a handful of games these past two years. You may not want to admit it, but these games are the exception, not the rule. You can't keep on talking about bad ports or lazy developers; the numbers don't lie. The numbers seem to indicate that the "bad ports" aren't the ones where the 750 Ti bests the PS4 but the ones where it loses to it.

Reading the replies over the past couple of pages, this seems to be a CPU problem.

There's nothing special about the 750 Ti, which is slightly worse than the PS4's GPU anyway.

The i3 is the one that's putting the AMD Jaguars to shame.
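
A crude way to see why, as a sketch: for a DX11 game bottlenecked on one or two heavy threads, per-thread speed is what matters, and per-thread speed scales roughly with IPC times clock. The IPC ratio below is a rough assumption for illustration, not a measurement:

```python
# Very crude single-thread model: relative performance ~ IPC * clock (GHz).
# The 2x IPC figure for Haswell over Jaguar is an assumed round number.
jaguar_core  = 1.0 * 1.6  # Jaguar core as baseline IPC, 1.6 GHz (PS4)
haswell_core = 2.0 * 3.5  # Haswell i3 core: ~2x assumed IPC, ~3.5 GHz

print(f"{haswell_core / jaguar_core:.1f}x per thread")  # ~4.4x under these assumptions
```

Under those assumptions, four fast threads beat eight slow cores whenever the engine can't spread its work evenly.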
 

Ein Bear

Member
Conventional wisdom is that this game was originally planned to be cross-gen, right? Damn, I'd love to be able to peer into the alternate universe where PS360 versions came out and see how they look.
 

thelastword

Banned
Call of Duty: Black Ops 3

The frame-rate graph is consistent with a triple-buffered presentation and completely skips the torn frames. If it were just the first few lines, I could understand, but tearing appears within the top 20% of the image and should be accounted for. It's a limitation of his tool in that case. It is not an easy game to analyze, so I'm not surprised. I just think he should have mentioned it.
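
To make the complaint concrete, here is a toy sketch of how a capture-analysis tool might classify frames; the ignore_top cutoff is a hypothetical stand-in for whatever heuristic the real tool uses, not DF's actual code:

```python
import numpy as np

def classify_frame(prev, cur, noise=4.0, ignore_top=0.05):
    """Toy classifier for two consecutive captured frames (H, W, 3 uint8).

    In a torn capture, scanlines above the buffer flip still hold the
    previous frame, so the per-line difference only switches on below
    the tear. ignore_top is a made-up cutoff: tears above it are
    treated as untorn frames.
    """
    per_line = np.abs(cur.astype(np.int16) - prev.astype(np.int16)).mean(axis=(1, 2))
    changed = per_line > noise
    if not changed.any():
        return "duplicate", None            # same frame scanned out again
    first_new = int(np.argmax(changed))     # first scanline with new content
    if first_new <= int(len(changed) * ignore_top):
        return "new", None                  # tear too close to the top: ignored
    return "torn", first_new                # tear line, in scanlines
```

Raising ignore_top to 0.20 would reproduce the behaviour described above: any tear in the top fifth of the image simply disappears from the graph.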


I hadn't actually kept up with Syndicate at all. Haven't seen it in action yet on real hardware. Was just thinking of the Unity bug.
I'll check it out, but I'm going mobile now.

I know. I've been saying this for literally years now, when the popular opinion was still that console optimizations and coding to the metal would somehow allow console hardware to punch way above its weight.
I don't buy this for the majority of releases. So many games got massive frame-rate boosts on consoles after many claimed they were CPU-bound issues: Project Cars, Borderlands, Unity, GTA5 and many more. At some point, I think people must confess that outside of some first-party studios' games and some stellar third-party titles (MGS5, Metro, Wolfenstein; I think Doom will hit its target too), this gen has generally shipped some really awful pieces of code on consoles. In that light, when we have entry-level PCs doing better at higher presets with ease, it paints the picture that the console hardware is not the problem.

Sometimes I think Nixxes, Bluepoint and HexaDrive should redo some of these awful releases on consoles for people to get a clue. I'd personally love for HexaDrive to redo Alien Isolation, Xenoverse, RE-R2, Remake, Payday, Darksiders 2, Fallout 4, Unity, Witcher 3... and the list goes on and on.
 

BennyBlanco

aka IMurRIVAL69
I wonder why DF doesn't switch to the modern budget card, the GTX 960?

It's a better card and not much more expensive.
 

Rival

Gold Member
I think it happens mostly when the game is loading the next area or whatever. Had plenty of 0fps moments in Killzone 2 and Bloodborne, for example.

Oh, OK. Maybe I would have noticed in Bloodborne had I been good enough to make it to a new area. I think Fallout will be PC for me. Hopefully they can fix some of these issues via patches.
 
"This is what happens when a game developer's ambition outgrows the static hardware."

-Eurogamer comments, November 2015

Hahahaha... incredible!

I don't know whether to laugh or cry that this is acceptable in 2015.
 
Indeed, in fact even The Witcher 3 suffers from this shadow LOD.

The thing to remember is that because Fallout 4's land is so barren, the lack of shadows becomes more obvious; if we were to remove the grass from The Witcher 3, we'd see a giant difference in shadows like this.

Now, one may say that Fallout 4 is barren compared to The Witcher 3, which is dense, and as such there is little reason to skip out on what few objects it has on display. They would usually be correct, until you realise that the overall draw distance in Fallout 4 is higher than The Witcher 3's, so it kind of evens out: even though the world is less dense, it covers a larger area than something like The Witcher 3. GTA has the advantage of being set in a city, so details like the lack of shadows at a distance on buildings don't show up as much; take any picture from the open area with a large backdrop and you'll see similar stuff.

Still, I think this is a bit extreme, as the grass and shadows are completely omitted rather than reduced in detail, and the geometry is significantly affected. In fact, I even see missing walls (lower right side, past the fence), which is not what I would expect from LOD: outright omitting an object from the scene, especially at that distance, only for it to pop back into existence when you approach the area.
All great points. I think the lack of shadows in the distance, rather than just a reduction in their and the objects' complexity, points to the game being draw-call limited on console. They can spit out the geometry and whatnot, but submitting more shadow draw calls is just too expensive for the CPUs.
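
As a rough illustration of that reasoning (all numbers invented for the example, not measured from the game): with shadow mapping, every shadow-casting object is re-submitted once per shadow cascade, so culling distant casters cuts CPU-side draw calls much faster than it cuts visible geometry.

```python
# Toy model: a shadow-mapped frame submits the main pass plus one extra
# pass per shadow cascade for every object that still casts a shadow.
def frame_draw_calls(visible_objects, shadow_casters, cascades=4):
    main_pass = visible_objects                 # one call per visible object
    shadow_passes = shadow_casters * cascades   # each caster redrawn per cascade
    return main_pass + shadow_passes

# Invented numbers: the same scene with two different shadow radii.
print(frame_draw_calls(2000, shadow_casters=1500))  # 8000 calls, long shadow range
print(frame_draw_calls(2000, shadow_casters=300))   # 3200 calls, short shadow range
```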
Well, this shows something pretty interesting, tbh.
 
Man, that first convo there:
lmao

Man, Gies is so defensive and combative. He's incapable of having human conversations with people. I replied to that thread and, likewise, he replied in a sarcastic, defensive, utterly perplexing way.

I honestly wonder why he even bothers. He hates the gaming community so much and is constantly at odds with people who read his articles/tweets; I just can't imagine why the meager salary he gets writing about videogames could possibly be worth the emotional battles he feels he has to wage every day.
 
Conventional wisdom is that this game was originally planned to be cross-gen, right? Damn, I'd love to be able to peer into the alternate universe where PS360 versions came out and see how they look.

That was always a rumor, but I don't think it's why the game looks/performs the way it does. More likely than not, it's an economic compromise... Creating a new engine for the game would cost a ton, and they likely put the majority of that budget towards creating a large world/lore with lots of interactivity, and just had to swallow that the game would suffer from the age of its engine. The expectations for Fallout games in the looks and performance department are pretty low, so it makes sense.

I think the cross-gen rumors were mostly driven by the really trivial fact that the Fallout 4 website shared a CSS file that had 360 and PS3 logos coded into it... But looking at it more deeply, that didn't seem to indicate much.
 

Kezen

Banned
No doubt, but in this case we also have to take into account that the PC CPUs have to run an entire operating system on top of the game, as well as various other background tasks.
A much heavier and thicker API/driver combo as well.

In 2013 the "2x" myth was still extant and often considered fact.
Now we have a GPU which is more like 0.75x in raw numbers, keeping up pretty well on average.
I wonder what kind of PC hardware "should" be required when consoles are pushed harder.
 