[DF] Guardians of the Galaxy: PS5 vs Xbox Series X/S - A Great Game But 60FPS Comes At A Cost

Irrelevant to this thread.
Not irrelevant. This is not the first title to show this, nor will it be the last… it goes back to the "minimum specs matter" argument some devs made before the generation started. Then again, you are making a strong effort not to read anything sent your way, so 🤷‍♂️. Keep blaming HW and SW strategy in one case and the devs in the other…
 

I asked you to quote this "blaming hardware" claim and you couldn't. It's not me who doesn't read.
I've asked you a direct question that you keep ignoring, and then you claim I'm the one doing that. I'll ask you again: if this is a hardware issue, as you seem to be claiming, then why is the Xbox One X version running so poorly?
 
[PC benchmark chart]
What the fking hell? 110 FPS average on a 2060 Super?

Lazy Port... omg
 
Probably designed and optimised for PC hardware, then given a straight dirty port to consoles with various settings turned off, etc.

Still looks and plays great on quality mode.

Hoping the devs give it some love and patch it.
 
I can't imagine what this game must run like on a PS4, or better yet a base Xbox One. It must be like 540p?

Also, with all the Nvidia tech on the PC version, I can totally see a scenario where that got the most love. This game looks genuinely incredible on my 3080 PC; those saying it ain't a looker are high. It easily looks as good as any big title released this year, including Ratchet.
 
Well, this is a disappointing turnout for next gen. 4K/30 is fine, but having one of the most aggressive performance-mode cutbacks we've seen for a last-gen game and still not holding 60 is quite something, especially when several games are running their 120 FPS modes at these kinds of settings.
 
It is crazy seeing the PS5 buckle in the alpha-heavy intro scene approaching Peter's house; the Series X stays at a solid 60 there according to the video.

There's something about this engine that is bringing these new consoles to their knees.

Also, obligatory but genuinely serious: the PS5 needs VRR, like, yesterday. It's getting beyond a joke now, because this gen is entering its second year and these demanding games aren't going to let up.
 
The grass looks weird without DLSS, so maybe there is something else going on reconstruction-wise in the console version that isn't in native PC (or I missed a setting somewhere on PC; I'll have another look). Maybe that is crushing console performance for some reason, as it is miles behind PC.

PS5 from DF video:
[screenshot]

2070s at 1080p:
[screenshot]

2070s at DLSS Quality 1440p:
[screenshot]
 
Forza Horizon 5 at 1080p60 with pretty much the same graphics as the XSX version, plus many other games (new and old) running great on Series S, including Ori and the Will of the Wisps at 4K60:

[gameplay gif]

Guardians of the Galaxy at 1080p30 with graphical cutbacks:

[gameplay gifs]

Never change GAF. Never change.
 
I can't wait to see the video on the original Xbox One. Place your bets!

240p and 15 fps?
I found a PS4 video:

[PS4 gameplay video]

It seems they didn't care much about the lowest common denominators with it (the Xbox One is surely worse).
We are near PS3/360 dark-times levels.
 
You are trying to have your cake and eat it too, and to keep moving the discussion until there are enough threads of conversation going at once.

Says the person now talking about the Sega Saturn.
The developers have done a terrible job here, easily proven by the fact that my 5500 XT can run the game at a near-constant 70fps at 1080p on high. It's less than half as powerful as my Series X and has trouble getting near the Series S in most games.
The One X version looks awful, with framerates wobbling around the mid-20s. So I'm holding the devs fully responsible for this shoddy effort on all consoles.
 
Could it be a case of the big boi consoles using unoptimised ultra-quality settings? Looking at the performance videos of low-to-mid-range cards (e.g. the 1660 Ti), they perform extremely well up to very high settings, but performance drops noticeably at ultra. Surely if a 1660 Ti can manage frame rates well above 60 at 1080p on very high, the consoles should easily be able to hit 1440p60 at similar very high settings. Wish DF would do a PC-equivalent-settings video to see what settings the consoles are actually using.
 
I can play Marvel's Spider-Man and Miles Morales in Performance RT (1440p@60fps with RT),

but for some reason a corridor game like GotG can't even run at 1080p@60fps on XSX and PS5. This is the laziest next-gen port I've seen so far.
 
I guess I'm finally losing my eyesight. I switch back and forth between performance and quality and I just do not see a "stark" difference. A difference, for sure, but not as much as I see between the frame rates. 🤷‍♂️
 
You have third-party and first-party devs that made the PS2 and PS3 sing, so?
I am not arguing that it is impossible to get good performance out of the XSS in isolation; nobody doubted you could, just as people did get good performance out of the PS2 and PS3.

You are going for a straw man here, as if the argument were that the HW is broken and no developer can get good performance out of it. People made the Saturn sing too, by that token. Edit: the :LOL: reaction is kind of expected ;).

The Series S hardware is not broken.
And anyone who thinks it is broken can get the Series X.
 
I vote that every time the PS5 does better than the XSX in these comparisons, we advocate the benefits of AMD SmartShift.

It could become the new Blast Processing :)
 
Now imagine if FH5 or Ori were made exclusively for the Series X 🤔
 
The GTX 10-series, Vega and 5700 cards are behind.
Could this game use some form of mesh/primitive shader?
If that's the case, it makes the console results even more surprising relative to my expectations.
You need a good deal of geometry to make primitive/mesh shaders useful. This game is designed to run on the XB1, with a low level of geometry.
 
Yep, that is what I assumed by default.
So what could explain the clear generational delta here? It seems DX12 Ultimate-capable hardware has an advantage here.
Edit: no, I missed the Radeon VII on the graph, but the results are unusual.
Edit 2: I was really tired (not only in this thread) and I also didn't see there was RT involved.
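For context, this is roughly the capability gate a renderer would use on PC before taking a mesh-shader geometry path; a minimal sketch assuming a D3D12 device (the function name and structure are illustrative, not anything from this game's engine):

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns true if the adapter reports DX12 Ultimate-class mesh shader
// support, so the engine can choose a mesh-shader geometry path instead
// of the classic vertex/index pipeline used on older GPUs and last-gen
// consoles. Illustrative helper, not from this game's code.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
    {
        return false; // runtime or driver too old to report the capability
    }
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}
```

That said, as noted above, a title built around Xbox One-level geometry has little reason to lean heavily on that path in the first place.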
 
I was surprised by the static 1080p resolution found by DF when others found DRS, notably on the XSS version (~1100p according to the YouTuber posted here), and I hear some people think the game runs at a higher resolution than 1080p.

I think we need VGTech's or NXGamer's input on this game before making final assessments.
 
I've never seen you be concerned about Sony games being cross-gen. Maybe my memory is just bad 🤭
I'm even concerned about mid-gen refreshes… imagine cross-gen, lol.
I guess you need to visit more Sony-related threads.


BTW, my comment wasn't even talking badly about the games or consoles… just that the games could be even better, and explaining why the game has the same features across X and S (hint: the devs chose not to do anything that couldn't be done on the S).
 
The X and S have the same feature set apart from 4K, just as Jason Ronald promised, so there's nothing that can be done on the X that can't be done on the S.
 
You have no idea.

I can write simple render code that will run on the X and not on the S, for obvious reasons… just think a bit about what you said.

If code was made to run on both, it is because it was made with that goal in mind… you avoid pushing the renderer's graphics and effects to the point where they won't run on the lower hardware.

It is the same on PC anyway.
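To make that concrete, here is a hedged sketch (all names and numbers are hypothetical, not from any shipped engine) of why renderer code written against the bigger console's memory budget simply does not fit on the smaller one, while a cross-gen renderer derives its sizes from whatever budget the platform reports; the figures used are roughly the publicly quoted 13.5 GB vs 8 GB of game-usable memory for Series X and Series S.

```cpp
#include <cstdint>
#include <iostream>

// Roughly the publicly quoted game-usable memory figures (illustrative).
constexpr uint64_t GiB = 1024ull * 1024 * 1024;
constexpr uint64_t kSeriesXBudget = 13ull * GiB + GiB / 2; // ~13.5 GiB
constexpr uint64_t kSeriesSBudget = 8ull * GiB;            // ~8 GiB

// Hypothetical pool sizing. Hard-coding the Series X numbers (4K targets,
// a huge streaming pool) produces a renderer that over-commits memory on
// Series S and cannot run there; deriving the sizes from the reported
// budget is what "made with that goal in mind" looks like in practice.
struct PoolSizes
{
    uint64_t streamingPool;
    uint64_t renderTargets;
};

PoolSizes SizeForBudget(uint64_t budget)
{
    // Leave headroom for game data, CPU-side systems, audio, etc.
    return { budget / 2, budget / 8 };
}

int main()
{
    const PoolSizes x = SizeForBudget(kSeriesXBudget);
    const PoolSizes s = SizeForBudget(kSeriesSBudget);
    std::cout << "Series X streaming pool: " << (x.streamingPool >> 30) << " GiB\n";
    std::cout << "Series S streaming pool: " << (s.streamingPool >> 30) << " GiB\n";
}
```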
 
Yeah? What features is the XSS missing?

It's probably easier to say the same experiences rather than features, but it's basically the same idea. You are playing the same levels as the same characters and so on; in this particular game the lawn just got mowed a bit better on the XSS. :messenger_winking_tongue:

Anyone who claims not to understand this type of thing is just trolling you, LOL. If there were levels missing on the XSS, or a crazy difference in NPC density, things that really changed the gameplay, there would be something to talk about. Having a lower frame rate or lower resolution/fidelity is obviously the expected end result.
 
I would LOVE to see a sincere answer to this question! All I see are laughing emojis, indicating they know they have no point.

That's the thing. Every time someone accuses MS of lying about the XSS, they can NEVER point to one core feature the XSS lacks that the XSX has. XSS games aren't missing levels or characters. The system has the same features, like Quick Resume and FPS Boost. The GPU is capable of ray tracing and SFS. Yet they continue to push the same lame narrative. It'd be funny if it wasn't so sad.
 
I will give a simple example to make it easier to understand, because I already answered the question in my first post.

If I made a game that pushes heavily on ray tracing on the Series X, it won't run on the Series S no matter the resolution.

Games that have the same features and effects across X and S, with just a change in resolution, like the FH5 the guy I quoted mentioned, are just that: games made to run on both, so they didn't push what they could do with the Series X at all.

I used ray tracing as an example to make it easier to understand, but it is the same for anything you render on the GPU… if you push to maximize the stronger hardware, you won't have the same effects on the lower hardware even if you decrease the resolution.

Games that are the same with just resolution and/or framerate differences are games that were already made with the lower hardware in mind.
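As a rough illustration of that last point (tier names and fields are hypothetical, not taken from any real engine), a cross-gen quality preset fixes the effect set on the smallest console and scales mostly resolution, because a largely resolution-independent effect sized for Series X headroom cannot be recovered on the S just by rendering fewer pixels; the alternative, as the next reply notes, is to drop the effect on the S tier outright:

```cpp
#include <cstdint>

// Hypothetical per-console quality tiers for a cross-gen renderer.
enum class Tier { SeriesS, SeriesX };

struct Preset
{
    uint32_t renderWidth;
    uint32_t renderHeight;
    // Ray-traced reflections carry costs that do not shrink much with
    // resolution (acceleration-structure upkeep, extra memory), so they
    // are either budgeted for the small console from day one or dropped
    // there entirely.
    bool rayTracedReflections;
};

Preset PresetFor(Tier tier)
{
    switch (tier)
    {
    case Tier::SeriesS: return {1920, 1080, false}; // effect dropped, not shrunk
    case Tier::SeriesX: return {3840, 2160, true};
    }
    return {};
}
```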
 
Or you just drop RT on XSS, as some games have.
 
I love how the usual suspects are trolling the XSS, when even the XSX and PS5 struggle to hold 60fps at 1080p with reduced settings.

Yeah, it's a problem with the XSS, nothing to do with the game or optimization.
 