spf instead of fps is clearly the better metric for the Xbox One version here.
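(If anyone wants the actual arithmetic behind the spf joke, here's a minimal sketch of the fps-to-frame-time conversion; the sample values are illustrative, not measurements from the video.)

```python
# Frame time is just the reciprocal of the frame rate.
def seconds_per_frame(fps: float) -> float:
    """How long one frame takes at a given frame rate (in seconds)."""
    if fps <= 0:
        return float("inf")  # a hard "0 fps" hitch = the frame effectively never arrives
    return 1.0 / fps

for fps in (30, 20, 1, 0.5):
    spf = seconds_per_frame(fps)
    print(f"{fps:>4} fps -> {spf * 1000:7.1f} ms ({spf:.2f} s) per frame")
# 30 fps is ~33 ms per frame; once you're hitching at 0.5 fps you really are
# measuring in seconds per frame, which is the whole joke.
```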
what console?
Knowing Greg, it's on PS4; matter of fact, I guarantee it. And please don't think the PS4 version doesn't have issues; it's already been reported by people who got it early that it has awful fps drops. Bethesda clearly fucked the console port up.
Funny enough that last gen, games would receive lower scores on certain consoles when they performed and looked worse. That has certainly died this gen.

From Eurogamer's review of the Xbox One version of Fallout 4:
http://www.eurogamer.net/articles/2015-11-09-fallout-4-review
Thanks Bethesda.
Every time there is a bad port on consoles, people use the i3/750ti as evidence of it keeping up with or outperforming the PS4. A bad port coupled with a capped framerate is compared to a console version, or console versions where the game drops to 0fps whilst walking. An i3/750ti is compared to console versions where a beefier GPU shows lower frames in a scene with heavy alpha, when any non-disingenuous person knows that the better GPU should always perform better in such scenarios..... These types of arguments are truly amazing.

Can't believe how well the i3/750ti combo still stacks up against consoles. Completely blowing out the ps4 even with better draw distance.
I wonder why they don't switch out the 750ti for a 960?
Yet, the lower powered card performs better (750ti) over the superior PS4 GPU and people praise that as a win for PC. I mean come on, hardly any distant shadows on the console versions and basic PCs are doing this with aplomb. Are these guys just not up to console development?

The PC requirements are weird: a 550Ti, which is a ~600GFLOPS card, or a 7870, which is near 2TFLOPS and 2-3x the performance of the 550Ti. I'm left wondering what hardware level you actually need.
The DF 750Ti seems to be bearable, but that's up at 1.7TFLOPS, rather than the minimum 550Ti... All weird.
Regarding the consoles...It's going to feel like a looong generation.
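(For reference, here's roughly where those GFLOPS/TFLOPS figures come from. Peak FP32 throughput is usually quoted as shaders x 2 FLOPs per clock x clock speed; the shader counts and clocks below are reference specs, so treat the results as ballpark theoretical peaks, not real-world performance.)

```python
# Theoretical peak FP32 throughput: shaders * 2 FLOPs/clock * clock (GHz).
# Note: Fermi parts like the 550 Ti run shaders at a "hot clock" (~2x core clock),
# so the shader clock is the one that counts for that card.
cards = {
    "GTX 550 Ti": (192, 1.80),   # 192 shaders @ ~1.8 GHz shader clock
    "GTX 750 Ti": (640, 1.02),   # 640 shaders @ ~1.02 GHz base
    "HD 7870":    (1280, 1.00),  # 1280 shaders @ 1.0 GHz
    "PS4 GPU":    (1152, 0.80),  # 1152 shaders @ 800 MHz
}

for name, (shaders, clock_ghz) in cards.items():
    print(f"{name:10s} ~{shaders * 2 * clock_ghz / 1000:.2f} TFLOPS")
# ~0.69, ~1.31, ~2.56 and ~1.84 TFLOPS respectively; so the "1.7" quoted for the
# 750 Ti above is a bit generous, and the 7870 minimum really is a big step up.
```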
I've never been in favor of DF just spitting out "consoles use some lows, some mediums, some highs", with the inevitable "oh, we were able to match console presets on our 750ti", that it can hang or even outclass..... Never showing the settings of the 750ti setup, never corroborating their stance on console settings..........

Looks like there may be some inconsistency in texture streaming, with PC actually coming off worst in this DF comparison.
Hmmmmm...lol@ all the people praising the pc 'port' and calling consoles a mess acting like they JUST decided to go for pc after the video.
Why is this a bug and the deathclaw fight not an even worse form of a bug? How would a weaker GPU ever outclass a superior GPU with heavy alpha all about? At least we know that Bethesda has a history with 0FPS drops, but this deathclaw fight is yet another low point in this release. Regardless, we have people pretending it's some hardware issue on the PS4 at that boss fight, as if the Xbox should ever perform better, at a similar 1080p to boot......

Because the dropping to zero is more a bug.
I was talking about Syndicate; NXgamer tried to replicate it on PS4 but he couldn't. If you say it was an issue in Unity as well, I'm sure he was aware, so he tested to see if it was eliminated from the engine once and for all... Clearly it was not.

Never shows what dips? The Fallout 4 "0 fps" screen is obviously showing just that. Are you talking about Assassin's Creed Unity now?
I know what you're talking about in ACU and I believe all of those issues have been fixed. It was NOT an Xbox One problem either. I encountered those massive "1 fps" issues on my PC playing ACU initially as well. It appeared in areas with certain materials (stained glass in particular). Nasty bug, but it was eliminated quickly.
Ok, I read that as Unity. Too much staring at the screen.
Which one is that exactly?

Ah. I read Unity for some reason. I dunno, sometimes we don't encounter things. NXG posted an inaccurate frame-rate analysis this week, yet I haven't seen lastword mention that. Weird.
Link broken.
A different time of day, but in this Gamersyde video the player goes to the exact same area as in that gif, and that hiccup doesn't happen:
[Gamersyde video]
Also, it seems to be the same point in the game.
PS: The video in the link is XB1 gameplay as well.
I don't understand why the worst possible cases are used to prove how weak the AMD processor is. I mean, The Witcher 3 has a far bigger FOV than this, and still Alien: Isolation and games like Fallout 4 are considered the best example of what's possible with such a CPU. Meh.

The Jaguar processors in the PS4/X1 are, put simply, awful. They're intended for low-power devices, not performance-intensive applications -- but that's the reality Sony and Microsoft resigned themselves to when both decided to place profitability well above power. Intel's per-clock performance is so far ahead of AMD's that the i3 simply doesn't need eight (well, six, technically) threads to come out ahead. Alien: Isolation is a good example: the i3-2100 manages to pip ahead of the FX-8150 by a few frames despite having half the threads, a quarter of the physical cores (the i3 is two physical cores + two logical cores), and a 700MHz slower clockspeed.
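(A crude back-of-the-envelope for the per-clock point, since it keeps coming up: treat usable CPU throughput as cores x clock x relative IPC. The relative IPC value below is an assumed, illustrative figure with Jaguar normalized to 1.0, not a benchmark; the core counts and clocks are the commonly cited specs.)

```python
# Toy model: usable game-facing CPU throughput ~ cores * clock (GHz) * relative IPC.
# Relative IPC values are assumptions for illustration only (Jaguar = 1.0).
cpus = {
    "PS4 Jaguar (6 cores for games, 1.6 GHz)": (6, 1.6, 1.0),
    "Desktop i3 (2 cores + HT, ~3.3 GHz)":     (2, 3.3, 2.0),  # assumed ~2x IPC, HT gains ignored
}

for name, (cores, ghz, rel_ipc) in cpus.items():
    score = cores * ghz * rel_ipc
    print(f"{name:42s} -> {score:5.1f} (arbitrary units)")
# Roughly 9.6 vs 13.2 in this toy model: two fast cores can match or beat six slow
# ones even before Hyper-Threading, unless the engine scales nearly perfectly
# across all six console threads.
```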
Bethesda has always fucked console ports up, which is why the PC versions of their games always run great. There are games on these consoles that put Fallout's visuals to shame and hold 30fps; the problem is Bethesda, and that's a fact.
Why do you think an i3/750ti is significantly able to outpace the PS4?
If the game utilizes multiple cores well, then shouldn't that be reflected in the PS4's performance? Especially since the PS4 GPU is up to the task?
*Does a little PC is the master race dance*
We still get posts like this in 2015. Amazing.
That's why, really. I mean, that stutter issue is more a bug I think.
But overall the fps is a bit smoother on PS4, yet that section suddenly gives Xbox One the advantage. I mean, it's a complete mess then if it's so inconsistent.
The nerve to release it that way. But I guess they win because it only gets very good reviews. They will never change, then.
It's not like the game is purely CPU-bound. These drops into the low 20s during some fights with heavy use of alpha effects on the PS4 (see the DF video) are without doubt due to some bottleneck in the graphics rendering.
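(To put those drops in frame-budget terms: if the renderer is GPU-bound, fps is just the reciprocal of the GPU frame time. The millisecond figures below are made-up illustrative timings, not profiler data, since nobody outside Bethesda has those.)

```python
# In a GPU-bound scene, output fps ~= 1000 / GPU frame time in milliseconds.
FRAME_BUDGET_MS = 1000.0 / 30  # ~33.3 ms to hold 30 fps

# Hypothetical GPU frame times: a typical scene vs. one buried in alpha effects.
for label, gpu_ms in (("normal scene", 31.0), ("alpha-heavy fight", 45.0)):
    fps = 1000.0 / gpu_ms
    over = gpu_ms - FRAME_BUDGET_MS
    print(f"{label:18s}: {gpu_ms:4.1f} ms ({over:+5.1f} ms vs budget) -> {fps:4.1f} fps")
# Blowing the 33.3 ms budget by ~12 ms of extra transparency shading/blending is
# all it takes to land in the low 20s, with the CPU never entering the picture.
```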
The sad thing is this fellow is probably older than 12.

*Does a little PC is the master race dance*
Not all firefights, I hope this is not what you're getting from this video. This was the only firefight where a frame advantage was highlighted for the XB1 version.

For the love of god Bethesda, please fix the 25fps firefights. PLEASE.
The XB1 version seems to run fine in firefights, what the fuck? Why? I am so confused, this game is great, but also annoying as fuck when it comes to performance and inventory management.
And we still have people who take them seriously.
He has the game btw.

Not all firefights, I hope this is not what you're getting from this video. This was the only firefight where a frame advantage was highlighted for the XB1 version.
I wonder what the CPU contingent are saying about the XB1 CPU now; here's an open-world game where the XB1 stays consistently lower frame-wise in the densest cities with the most NPCs.
Isn't that PC master race bullshit bannable? I'm sure I heard it was bannable.

*Does a little PC is the master race dance*
I find it strange that the 970 holds its own so well at 1440p, shouldn't its bandwidth hamper it somewhat?
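(Some rough numbers on that, with the caveat that it depends entirely on how bandwidth-hungry the engine is. The GB/s figures are the 970's published peaks; the pixel math is just the resolution ratio.)

```python
# Anything purely pixel-proportional (fill, blending, render-target traffic)
# scales with the pixel count when you step up from 1080p to 1440p.
px_1080p = 1920 * 1080
px_1440p = 2560 * 1440
print(f"1080p: {px_1080p:,} px")
print(f"1440p: {px_1440p:,} px ({px_1440p / px_1080p:.2f}x the pixels)")
# The 970's quoted peak is 224 GB/s (about 196 GB/s on the fast 3.5 GB partition),
# so a ~1.78x increase in per-pixel traffic only bites if the card was already
# close to bandwidth-bound at 1080p, which the DF results suggest it wasn't here.
```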
thelastword, at some point you will just have to accept that you are either overestimating the PS4's performance or underestimating the core i3/750Ti's performance. It's been two full years since the launch of the next generation, we've had tens of retail next-gen multiplatform games and the Core i3/750Ti has managed to match or even beat the Playstation 4 in the overwhelming majority of these games. The examples where the PS4 has managed to offer better performance are few and far between, merely a handful of games these past two years. You may not want to admit it but these games are the exception, not the rule. You can't keep on talking about bad ports or lazy developers, the numbers don't lie. The numbers seem to indicate that the "bad ports" aren't the ones where the 750Ti bests the PS4 but the ones where it loses to it.
Call of Duty Black Ops 3

Which one is that exactly?
I hadn't actually kept up with Syndicate at all. Haven't seen it in action yet on real hardware. Was just thinking of the Unity bug.

I was talking about Syndicate, NXgamer tried to replicate it on PS4 but he couldn't. If you say it was an issue in Unity as well, I'm sure he was aware so he tested to see if that was eliminated from the engine once and for all... Clearly it was not.
Serious question: GTX 570 + i5 2500K (OC'd to 4.70 GHz) PC, or console (PS4 or XB1)?
Which will run the game better?
Omg thank you so much, was about to ask how well my 970 would fare.
It's not the 750ti providing the win, it's the Intel Core CPU compared to the AMD cores in the consoles.