Any word on the AF on consoles?
We'll be giving a more thorough analysis of the game's visuals in our full Face-Off. But initial testing shows PS4 and Xbox One's core graphics settings are surprisingly close across the board. Texture maps are matched for resolution, with a generous level of anisotropic filtering across the ground for good measure. Each uses the same grade of screen-space ambient occlusion, matching PC's highest, and effects quality is identical too. With everything being so close in the visual stakes, though, how does the frame-rate on these machines hold up?
Are you saying Project Cars was a bad port on PC? My crappy AMD CPU coupled with the 750 Ti runs RE-R2 at 60fps, and so does the XB1, btw. I've pointed out a million other games which should run much better on consoles: Saints Row, Alien Isolation, Xenoverse, Payday, the Remake, etc.

thelastword, at some point you will just have to accept that you are either overestimating the PS4's performance or underestimating the Core i3/750 Ti's performance. It's been two full years since the launch of the next generation, we've had tens of retail next-gen multiplatform games, and the Core i3/750 Ti has managed to match or even beat the PlayStation 4 in the overwhelming majority of them. The examples where the PS4 has managed to offer better performance are few and far between, merely a handful of games these past two years. You may not want to admit it, but these games are the exception, not the rule. You can't keep talking about bad ports or lazy developers; the numbers don't lie. The numbers seem to indicate that the "bad ports" aren't the ones where the 750 Ti bests the PS4 but the ones where it loses to it.
"He has the game btw."

Does he have the XB1 version as well to compare?
RIP.
*Does a little PC is the master race dance*
Got a gtx 970
Buying ps4
"In 2013 the '2x' myth was still extant and often considered fact."

It's true for the CPUs: you need better per-core performance to play DX11 games well, although entry-level hardware (because that's what the Core i3 line is) proves sufficient for now.
I'm impressed by this little 750 Ti; no idea how long it will hold out, but it has already vastly exceeded my expectations.
Who in 2013 could have said that such a tiny GPU would fare so well against consoles? No one.
Yet here we are.
"I don't know what to say. To me, The Witcher 3 seems an appreciable piece of work CPU-wise on console; I can live with such 'compromises'. Fallout 4, nope. Honestly, I don't really care to measure it against the CPU on PC when the final results are acceptable. It's quite different to point out how terrible the console CPUs are when the final result is the worst possible."

You're right that the FX-8150 has a slight advantage in The Witcher 3, but again, it's much faster than the i3 on paper, having twice as many threads (or, to put it another way, four times as many physical cores) and a 700MHz clockspeed advantage. That the 8150 comes out slightly ahead isn't something that should be celebrated, especially considering it's clocked significantly higher than the Jaguars in the consoles. Which ties into the crux of my earlier post -- that the i3, despite being something of a "fake" quad core, can overcome the core/thread advantage the consoles have because of Intel's superior per-clock performance.
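To put rough numbers on that per-clock argument, here's a back-of-the-envelope sketch. The clock speeds are the commonly quoted ones (and I'm assuming a ~3.4GHz i3), while the IPC multipliers are loose illustrative assumptions with Jaguar normalised to 1.0, not measured values:

```python
# Back-of-the-envelope single-thread throughput: clock * IPC.
# IPC multipliers are illustrative assumptions, not benchmarks.
def single_thread(clock_ghz, ipc):
    return clock_ghz * ipc

jaguar_core = single_thread(1.6, 1.0)  # console Jaguar core
fx8150_core = single_thread(3.6, 1.1)  # Bulldozer core, assumed ~1.1x Jaguar per clock
i3_core     = single_thread(3.4, 2.0)  # Haswell i3 core, assumed ~2x Jaguar per clock

# A DX11 game is usually limited by its heaviest one or two threads,
# so these per-core numbers matter more than the total core count.
print(f"Jaguar {jaguar_core:.1f} | FX-8150 {fx8150_core:.1f} | i3 {i3_core:.1f}")
```

Under those assumptions the i3 core is roughly four times a Jaguar core and well ahead of a Bulldozer core, which is the whole argument in one line: the heaviest thread runs fastest on the chip with the best per-core throughput, not the one with the most cores.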
Funny enough, last gen games would receive lower scores on certain consoles when they performed and looked worse. That has certainly died this gen.
Every time there is a bad port on consoles, people use the i3/750 Ti as evidence of it keeping up with or outperforming the PS4. A bad port coupled with a capped framerate is compared to console versions where the game drops to 0fps whilst walking. An i3/750 Ti is compared to console versions where a beefier GPU shows lower frames in a scene with heavy alpha, when any non-disingenuous person knows that the better GPU should always perform better in such scenarios... These types of arguments are truly amazing.
"How does a game drop down to zero frames per second? I've never heard of that. Is it common on a console game?"

No, it's not common; it's just a bug that needs to be fixed.
Come on guys, he's a junior member and he probably doesn't know that the term is bannable. How about we inform him and help him avoid a mistake in the future instead of asking for his head on a platter?
How does a game drop down to zero frames per second? I've never heard of that. Is it common on a console game?
Jim Sterling on his experience: https://www.youtube.com/watch?v=yFUCt2Fk1DI&t=7m12s (at 7m12s if timestamp link doesn't work).
"Call of Duty Black Ops 3"

I'll check it out, but I'm going mobile now.
The frame-rate graph is consistent with a triple-buffered presentation and completely skips the torn frames. If it were just the first few lines, I could understand, but tearing appears within the top 20% of the image and should be accounted for. It's a limitation of his tool in that case. It is not an easy game to analyze, so I'm not surprised; I just think he should have mentioned it.
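For anyone curious how a capture tool spots tearing in the first place: conceptually, each captured frame is compared row by row against its neighbours, and the tear is the scanline where the image stops matching the old frame and starts matching the new one. A minimal sketch of the idea (a generic illustration, not DF's actual tool; frames are assumed to be HxWxC numpy arrays):

```python
import numpy as np

def find_tear(frame, prev_frame, next_frame, threshold=2.0):
    """Return the first row where `frame` switches from showing the
    old image to showing the new one, or None if the frame is untorn."""
    # Mean absolute difference per row against each neighbouring frame.
    diff_prev = np.abs(frame.astype(int) - prev_frame.astype(int)).mean(axis=(1, 2))
    diff_next = np.abs(frame.astype(int) - next_frame.astype(int)).mean(axis=(1, 2))

    matches_prev = diff_prev < threshold  # rows still showing the old frame
    matches_next = diff_next < threshold  # rows already showing the new frame

    # A torn capture matches the old image above the tear line and the
    # new image below it; an untorn one matches a single image throughout.
    if matches_prev.any() and matches_next.any() and not matches_prev.all():
        return int(np.argmax(matches_next))  # first row of the new image
    return None
```

A tool that only scans rows below some cutoff would miss tears in the top 20% of the image, which would square with the graph treating those frames as clean.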
I hadn't actually kept up with Syndicate at all. Haven't seen it in action yet on real hardware. Was just thinking of the Unity bug.
"I don't buy this for the majority of releases. So many games got massive frame boosts on consoles after many claimed they were CPU-bound issues: Project Cars, Borderlands, Unity, GTA5 and many more. At some point, I think people must confess that outside of some first-party studio games and some stellar third-party titles (MGS5, Metro, Wolfenstein; I think Doom will hit its target too), this gen has generally shipped some really awful pieces of code on consoles. In that light, when we have entry-level PCs doing better at higher presets like it's pie, it paints the picture that the console hardware is not the problem."

I know. I've been saying this for literally years now, when the popular opinion was still that console optimizations and coding to the metal would somehow allow console hardware to punch way above its weight.
I think it happens mostly when the game is loading the next area or whatever. I had plenty of 0fps moments in Killzone 2 and Bloodborne, for example.
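A "0fps" reading isn't as mysterious as it sounds: it usually just means nothing was presented to the screen for over a second, e.g. because a synchronous load blocked the render thread. A toy loop showing how that registers (the numbers and structure are made up for illustration):

```python
import time

def game_loop(frames=5, stall_on=3):
    """Toy render loop: one blocking load makes a frame take >1s,
    which a per-second FPS counter reports as 0fps."""
    last = time.perf_counter()
    for frame in range(frames):
        if frame == stall_on:
            time.sleep(1.5)  # synchronous area load on the render thread
        # ... simulate, render, present ...
        now = time.perf_counter()
        frame_ms = (now - last) * 1000
        last = now
        print(f"frame {frame}: {frame_ms:.0f} ms")

game_loop()
```

Any wall-clock second in which no frame completes shows up as a flat zero on the counter, even though the game hasn't actually crashed.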
After his "review" of BO3, I will never listen to another thing out of his mouth.
Got a gtx 970
Buying ps4
"All great points. I think the lack of shadows in the distance, rather than decreasing their and the objects' complexity, points to being draw-call-limited on console. They can spit out the geometry and whatnot, but calling more shadows is just too expensive for the CPUs."

Indeed; in fact even The Witcher 3 suffers from this shadow LOD.

The thing to remember is that because Fallout 4's land is so barren, the missing shadows become more obvious; if we were to remove the grass from The Witcher 3, we'd see a similarly giant difference in its shadows.

Now, one may say that Fallout 4 is barren compared to the dense Witcher 3, and as such there is little reason to skip the few objects it does have on display. They would usually be correct, until you realise that the overall draw distance in Fallout 4 is higher than The Witcher 3's, so it ends up about even: even though the world is less dense, it covers a larger area. GTA has the advantage of being set in a city, so details like missing shadows on distant buildings don't show up as much; take any picture of the open areas with a large backdrop and you'll see similar stuff.

Still, I think this is a bit extreme, as the grass and shadows are completely omitted rather than reduced in detail, and the geometry is significantly affected. In fact, I can even see missing walls (lower right side, past the fence), which is not what I'd expect from LOD: outright omitting an object from the scene at that distance, only for it to pop back into existence as you approach the area.
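The draw-call argument above is easy to see in sketch form: every shadow-casting light has to re-draw its casters into a shadow map, so draw calls scale roughly with lights x casters, and the cheapest CPU saving is to cull casters past a shadow distance far shorter than the view distance. Purely illustrative (not Bethesda's code; `position` is an assumed attribute):

```python
import math

def collect_shadow_casters(objects, camera_pos, shadow_distance):
    """Cull shadow casters by distance: anything past the cutoff stops
    casting a shadow, and so stops costing shadow-map draw calls."""
    return [o for o in objects
            if math.dist(o.position, camera_pos) <= shadow_distance]

# With a short shadow_distance, distant geometry still renders in the
# main pass (it's within view distance) but never enters any shadow
# map -- exactly the "visible but shadowless" look in the shots above.
```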
Well, this shows something pretty interesting, tbh.
But, why?
Fixed.

"This is what happens when a game developer's ambition outgrows their abilities on console optimization."
-Eurogamer comments, November 2015
Hahahaha... incredible!
I don't know whether to laugh or cry that this is acceptable in 2015.
Man, that first convo there:
lmao
Looks like there may be some inconsistency in texture streaming, with PC actually coming off worst in this DF comparison.
[DF comparison screenshots]
Hmmm... PS4 and X1 seem to have the same visual assets.
And what the fuck is up with texture streaming on PC? That's horrid.
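For context on how a streamer can end up showing worse textures on stronger hardware: most engines pick a target mip per texture from its screen-space size, then fit requests into a fixed memory budget; if the budget or the priority heuristic is mistuned on one platform, high mips never become resident even though the GPU could trivially handle them. A toy sketch of the idea (all names and the heuristic are made up for illustration):

```python
def pick_resident_mips(textures, budget_bytes):
    """Greedy mip streaming sketch. `textures` is a list of
    (name, top_mip_bytes, screen_coverage) tuples: serve the textures
    judged most important until the pool is full; the rest stay blurry."""
    resident, used = {}, 0
    # Prioritise by how much screen area the texture covers.
    for name, size, coverage in sorted(textures, key=lambda t: -t[2]):
        if used + size <= budget_bytes:
            resident[name] = "full mip"
            used += size
        else:
            resident[name] = "low mip"  # victim of the budget/heuristic
    return resident
```

If the PC build ships a conservative pool size or a priority heuristic tuned for console memory, nearby surfaces can lose out even on cards with VRAM to spare -- though that's speculation about the mechanism, not a diagnosis of this particular game.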
But, why?
Nah, better to feign outrage over a dumb meme.
Conventional wisdom is that this game was originally planned to be cross-gen, right? Damn, I'd love to be able to peer into the alternate universe where PS360 versions came out and see how they look.
"No doubt, but in this case we have to also take into account that the PC CPUs have to run an entire computer operating system on top of the game as well as various other background tasks."

A much heavier and thicker API/driver combo as well.
"I wonder what kind of PC hardware 'should' be required when consoles are pushed harder."

In 2013 the "2x" myth was still extant and often considered fact.
Now we have a GPU which is more like 0.75x in raw numbers keeping up pretty well on average.
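That 0.75x figure roughly checks out against the commonly quoted peak numbers (boost clocks vary by card, so treat this as approximate):

```python
# Commonly quoted peak shader throughput, approximate.
ps4_tflops = 1152 * 2 * 0.800e9 / 1e12      # 18 CUs * 64 lanes at 800MHz -> ~1.84
gtx750ti_tflops = 640 * 2 * 1.085e9 / 1e12  # 640 cores at ~1085MHz boost -> ~1.39

print(f"750 Ti ~{gtx750ti_tflops:.2f} TFLOPS, "
      f"{gtx750ti_tflops / ps4_tflops:.2f}x the PS4's ~{ps4_tflops:.2f}")
```

So on raw compute the 750 Ti sits at about three quarters of the PS4, which makes it holding even in multiplatform games all the more notable.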
Got a gtx 970
Buying ps4
Man, Gies is still fighting the good fight. I'm surprised he hasn't gotten a job offer from Microsoft yet.