So when they talk about dips, are we going all the way into the 30s or does it stay in the 50s?
pretty minor dips everywhere
I don't get the quote about CPU weakness. I mean, the CPU is weak compared to its PC counterparts, but it's definitely more efficient and more powerful than the older 360 hardware. These games are older, so how can it limit a remaster? I don't see why the Xbone CPU would have any trouble handling games like this series. Isn't it more a matter of a bandwidth bottleneck?
You really think so? The texture work, per-pixel lighting, and shader quality were all a step up, but the dodgy frame-rate, very simplistic geometry, and mediocre animation bring it down. From that same period in 2001, I think Metal Gear Solid 2 stands the test of time much better, with superior animation, 60 fps, and an incredible attention to detail. Obviously it's much smaller in scope, but it looks less dated to my eyes.
Halo 2, though, is quite ugly by today's standards.
These are games originally developed and optimized exclusively for very specific hardware. These games probably don't scale that well or very easily. TLOU was apparently quite an undertaking to get remastered for the PS4, for example.
Halo 4 seems to be between 50 and 60, with at least one sustained drop to 40 based on the footage.
Ground textures as an example of uncanny valley, right...
Like TLOU, I find it really telling and odd that there are frame drops in games that were built for decade-old hardware.
It does make you wonder how a theoretical Halo 5 will look and run if these old games aren't always exactly 60fps.
Shouldn't be as much of an issue for a game built from the ground up for new hardware.
Another aspect shared with The Last of Us is a more inconsistent frame-rate, certainly in comparison with the more solid Halo 3. When we hit the magic 60, the experience is phenomenal, but the dips beneath are quite noticeable.
Except you said it yourselves: TLoUR stays at 60 most of the time, and most dips are completely unnoticeable.
But gotta extend some technical charity, I guess.
I am sorry, but that's not true. TLOU RE had improved textures (4x up-res), character models taken from the cut-scenes, and full 16x AF almost everywhere.
Here it's the real last-gen game directly ported with no improvements (low AF, which you even acknowledged in this article, and low-quality character models) except for the higher resolution.
Have you conveniently forgotten that TLOU RE uses 16x AF almost everywhere, higher-quality character models, and 4x better textures, so that you could shit on TLOU in passing?
And you are supposed to be the world's leading expert on these matters?
Even a poster named globalisateur has called them out on it.
Would you say that DF is biased towards Xbox?
Nope, they present facts, I believe.
Problem is that even all 8 Jaguar cores aren't quite as powerful as a desktop Ivy Bridge or Haswell Core i3. This doesn't bode well moving forward in the generation. Devs will have to implement new tricks.
And they will.
Is it possible for improvements via a day-one patch or future patches? Personally, I'm OK; not happy, but not annoyed. Also, why hasn't this thread exploded yet?
There's already one person in here shitting it up with console wars bullshit. Give it time.
The insta-switch thing seems a bit weird; without it, it would've been full 1080p. Will people even use it? Especially often?
It's kind of baffling that performance is wasted constantly for a switching feature that you benefit from so rarely in comparison.
343 had a vision they wanted and went for it. It is pretty much a gimmick, but I'll probably use it pretty frequently because it's neat tech. Is it necessary? Absolutely not. Is it worth the trade-off? Maybe? An option to toggle it off for full 1080p would be great for those who won't use it, though.
I don't see the need for it. Halo 2 is clearly ugly as sin, so no idea who would want to replay it in Xbox mode.