TheRedRiders
Member
I was actually wondering for a while where the hell they were.

We haven’t seen Doncabesa and Riky in a little while. I am glad Hitman is 2160 on Xbox, otherwise we might never see those guys.

The more important (and interesting) question that should be asked is: what caused the developer to make a conscious choice to target a lower resolution and shadow quality in this particular game, and why?

It's also possible the difference in performance is much smaller. It probably is.
PS5 @ 1800p could be hitting a higher FPS average than the XSX @ 4K if both were unlocked.
Without dynamic resolution, and with a locked framerate, you get these big pixel output differences without really knowing what the real engine performance difference was. PS5 couldn't maintain 60fps at 4K? Dump it to 1800p... that is a quick way to "solve that problem" lol
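To make that concrete, here is a minimal sketch (in C, purely illustrative; the budget, the measured frame time and the scaling policy are all assumptions, not anything from IO Interactive's Glacier engine) of the kind of frame-time feedback loop a DRS system runs. With a fixed 1800p target, none of this logic exists, so the final pixel count tells you little about how much headroom either GPU actually had:

```c
#include <stdio.h>

/* Hypothetical dynamic-resolution heuristic: scale the render target
 * against a 60fps frame-time budget based on the last measured GPU time. */
int main(void) {
    const double budget_ms = 1000.0 / 60.0; /* ~16.7ms for 60fps */
    double gpu_ms = 18.2;                   /* assumed: last frame's GPU time */
    double scale  = 1.0;                    /* fraction of 3840x2160 per axis */

    if (gpu_ms > budget_ms)
        scale *= budget_ms / gpu_ms;        /* over budget: drop resolution */
    else if (gpu_ms < budget_ms * 0.9 && scale < 1.0)
        scale *= 1.05;                      /* comfortable: claw some back */

    printf("next frame renders at %dx%d\n",
           (int)(3840 * scale), (int)(2160 * scale));
    return 0;
}
```

A locked 1800p output is the degenerate case of this loop: the scale is pinned at roughly 0.83 per axis no matter how far under (or over) budget the GPU is.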
To me this is just the consequence of an engine developed for last-gen machines favouring TFs over clock speeds. The Series X can just brute force it with its TF advantage; the effect used for reflections is very expensive in terms of graphics computation, and if they used RT instead the machines would probably be a lot closer. Interesting nonetheless, but a cross-gen game is unlikely to show either machine in its best light.

The more important (and interesting) question that should be asked is: what caused the developer to make a conscious choice to target a lower resolution and shadow quality in this particular game, and why?
Up until now, all cross-gen games for XSX and PS5 have been (generally) equal in output (assuming the latest updates/patches are in play...).
Per the article - the FPS drops are rare and inconsequential:
"Put simply, it's 60 frames per second... with just one exception in our hours of play. In the Mendoza mission set in Argentina, it is possible to see Xbox consoles run between 50 to 60fps around a field towards the outskirts of the level, while PlayStation 5 remains constant at 60fps. Hopefully IO will look at improving this for owners of the Microsoft machines, but everything else we played ran flawlessly - bar a very slight stutter in a cutscene at the beginning of the Miami stage from Hitman 2, where all consoles dip from to near 40fps. At this point though, it feels more like a nitpick rather than anything that would have an impact on any specific purchasing recommendation."
The biggest difference is resolution and shadow quality:
"Meanwhile, there are additional tweaks to shadow quality too. Series S uses the equivalent to PC's low quality shadows (in common with the last-gen base console renditions of Hitman 3), while PlayStation 5 runs at medium (similar to One X and PS4 Pro) and Xbox Series X operates at high. The difference is fairly subtle across all three, but it's there nonetheless. In all other scenarios, all three next-gen consoles are the same and the overall presentation is first class. Yes, Series X's resolution advantage is there, and the Glacier engine thrives on precision, giving it a pristine edge. With that said, however, the lower resolution on PlayStation 5 is in no way a problem and still looks wonderful."
I hope that Sony is doing something similar for its hardware-heavy SSD/IO complex, because the situation is blatantly embarrassing in third-party games right now.

I remember RGT saying that Sony were sending out tutorials and code to first- and third-party studios so they could actively explore the GE and use it in different ways.
He also stated that a developer told him it's like Primitive Shaders on steroids. It's fair to say we won't see games take full advantage of it until around the second or third wave of games, which is like 2023 onwards.
If his leaks are correct in that the GE also heavily influenced RDNA 3's geometry handling, then it should be something special.
But why.
Sony hasn't updated the tools yet.

Why is it so weird that Hitman 3 is 1800p on PS5? So far all the 3rd-party games have been performing the same on both consoles, why now?
I wonder if they wanted to make sure the game is 1080p and 2160p on the Series S and X respectively. This could explain why they chose a static resolution that results in some frame drops instead of using dynamic res. Maybe their engine is not built to use DRS, idk.

I guess the big question is why not make both 4K with DRS?
Noted. Even though I was aware of the meaning of that post, I decided to add mine as an extra piece of info nevertheless.

Of course, but it was a joke just to emphasise how early in the generation we could have a Pro compared with the PS4.
That gif
Today of all days.
I assume the Sony TV and the PS5 only support some subset of the HDMI 2.1 API/functionality needed to get VRR working, but don't support it 100% ("not good enough"), so it may fail with other devices not made by Sony.

Not really, then they would have to whitelist the TV on the PS5 and the PS5 in the TV. I really don't think that it works that way. If it really works (currently), it is just a custom standard nobody knows about. Something we really don't want to see if there are standards that could work with any device.
But I really get it. HDMI 2.1 seems to be a mess.
Someone's PS5 be coming in like this
As explained before, these comparisons between game consoles and PC are not really interesting because it is impossible to make them fair: the PS5 (and XSX of course) are running effects at quarter resolution... We already said that, and Alex seems only now to share that point...
Alex talks about "build performance", which would mean "how long does it take to build/compile the game". It has nothing to do with the performance of the resulting code.

He's talking about the compiler architecture for the GPU, in which the more efficient the architecture of the compiler is, the better and more efficient the graphics code runs.
If that's the case, a quick patch with recompiled code for past games should demonstrate a zero-effort upgrade. Waiting....

He's talking about the compiler architecture for the GPU, in which the more efficient the architecture of the compiler is, the better and more efficient the graphics code runs. I'm assuming Alex has no clue what he's talking about, and I don't say that lightly. Dealer proved his absolute lack of common sense by citing Tom Warren as a credible source of info, that's how low he went, and this post he made now further proves that point. Citing Battaglia as a credible source of info when he knows jack about compiler architectures is a new low. But then again, it's Dealer we're talking about, so we're basically talking about an absolute nobody who will stoop far too low to get his nonsensical point across. By the way, he latches onto the bs spread by people like Astal and Misterxmedia, and I'm sure that on its own says a lot about him.
You mean old timers' day?

Hey guys,
Apparently all of you are in here now brainstorming how to spin the Series X win over the PS5 in Hitman 3 as a positive for PS5.
Is that true?
Someone in the DF Hitman 3 face-off said so...
lol
Maybe it's the first next-gen game without DRS.
Xbox Series X in native 4k has a few dips. PS5 in native 4k would probably have more dips.
In static resolution, PS5 at 1800p shows a constant 60fps.
Maybe with DRS, PS5 could get 4k in some scenarios.
But of course, clear Xbox Series X win.
That was in reference to the scoped screenshot. This is the only area in the game the xbox has any type of consistent drop. It's one small part of one level. 60fps 99.99% of the time, but you already know that.
I mean, Alex has his sources (but what sources?). Maybe the pulled-out-of-the-ass kind of source. But what am I saying? The more knowledgeable Xbox “insiders” out there like Dealer & Colt said it's the “tools”. So it must be the “tools”, right? Any bs they can conjure to make their plastic box of choice look disadvantaged. In this day and age, it's simply unacceptable to ship a devkit with unfinished game development tools to devs during the final stretch leading up to the consoles' launch. Some people do not realize how stupid that assertion sounds, how un-Microsoft it looks. I would never, in a million years, come to the assumption that Microsoft, a software development and cloud infrastructure powerhouse and developer of one of the two most popular OSes on planet earth, would ship unfinished software to game devs. A thing which would undoubtedly be detrimental to their reputation, apart from the fact that it sounds dumb as f@¢k. Microsoft themselves wouldn't stoop so low as to ship devkits with unfinished software dev tools in them. But dumbasses like Colt & Dealer find it perfectly reasonable to assume something that dumb. *Tsk tsk tsk* what absolute “brainlets”, I had no idea some people were that dumb.

If that's the case, a quick patch with recompiled code for past games should demonstrate a zero-effort upgrade. Waiting....
(9TF v 12TF)?

As I asked in the other thread, a 44% pixel-per-frame increase plus better shadows in Hitman 3 for XSX vs PS5 is significant.
It's actually so massive that one has to wonder whether the PS5 version received as much love as it should have. Would a 2-3TF performance difference (9TF v 12TF) really be enough to account for that?
What about CPU SMT? How much of an impact will that have, given that XSX has the option of running with or without it, whereas the PS5 is SMT-locked with regards to Hitman 3?

Not sure if he meant it, but it seems the recycled FUD won't stop. Even when it's actually nearly a tie here in this game, while PS5 is leading by a comfortable advantage in most games so far.
But if the game uses over 8 threads, SMT will win regardless. 1 or 2 threads are probably used by the OS, but the point stands: non-SMT would be more performant on Xbox if the game uses <=8 threads, while >8 threads is pretty much a wash with SMT. I would hope modern games are using SMT to get max performance on both consoles.

Yup, that's another thing. But if both aren't using identical resolution it's hard to compare, because PS5 being locked at 60fps in the same scene means it's actually 60+ fps. So overall it's a tie or a small advantage for PS5, even with XSX having the option to boost to 3.8GHz with 8 threads.
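A rough sketch of the thread-count arithmetic in the post above (POSIX C, illustrative only; the "1 or 2 threads for the OS" reservation is the poster's estimate, not a documented figure for either console):

```c
#include <stdio.h>
#include <unistd.h>

/* On an 8-core Zen 2, sysconf reports 16 logical processors with SMT
 * enabled and 8 with it disabled; a job system sized this way is where
 * the <=8 threads vs >8 threads distinction above comes from. */
int main(void) {
    long logical     = sysconf(_SC_NPROCESSORS_ONLN);
    long os_reserved = 2;   /* assumed: threads left for the OS */
    long workers     = logical > os_reserved + 1 ? logical - os_reserved : 1;

    printf("logical processors: %ld, game worker threads: %ld\n",
           logical, workers);
    return 0;
}
```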
44% resolution advantage for XSX.
More than 46.3% performance advantage for PS5.
Just swallow it as it is.
I doubt it's like that in Hitman 3 though. Chances are high that it's handling assets exactly the same way across all versions.

And no decompression going on in the CPU on PS5, unlike PC. It's really an interesting gen to watch and enjoy.
It's because many devs pushed for higher resolution on the One X version. If the One X version were the same resolution as the PS4 Pro's, every game would have this kind of performance advantage on Xbox.
Seems like the Hitman devs never give PlayStation much love. This is Hitman 2 running on Pro vs X. As you can see, the performance delta is massive even at the same resolution. Not many other 3rd-party games have differences this big between the two platforms. So yeah, the Hitman engine is not that well optimised for PlayStation.
What's so wrong about Alex's quote? Better compilers do increase the performance of the same code.

He's talking about the compiler architecture for the GPU, in which the more efficient the architecture of the compiler is, the better and more efficient the graphics code runs. I'm assuming Alex has no clue what he's talking about, and I don't say that lightly. Dealer proved his absolute lack of common sense by citing Tom Warren as a credible source of info, that's how low he went, and this post he made now further proves that point. Citing Battaglia as a credible source of info when he knows jack about compiler architectures is a new low. But then again, it's Dealer we're talking about, so we're basically talking about an absolute nobody who will stoop far too low to get his nonsensical point across. By the way, he latches onto the bs spread by people like Astal and Misterxmedia, and I'm sure that on its own says a lot about him.
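For what it's worth, that claim is easy to demonstrate: the same source gets faster when the compiler does more work, with no code changes at all. A toy example (the file name and iteration count are arbitrary):

```c
/* sum.c - build and time the identical source at two optimization levels:
 *   gcc -O0 sum.c -o sum_o0 && time ./sum_o0
 *   gcc -O2 sum.c -o sum_o2 && time ./sum_o2
 * -O2 applies register allocation, strength reduction and loop
 * optimizations; the source is untouched, yet the -O2 build runs
 * several times faster. */
#include <stdio.h>

int main(void) {
    double sum = 0.0;
    for (long i = 0; i < 400000000L; i++)
        sum += (double)(i % 7);
    printf("%f\n", sum);
    return 0;
}
```

None of which says anything about whether the GDK compilers actually were immature at launch, only that compiler quality does affect the performance of unchanged code.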
Which in turn allows them to be more productive, thus being able to improve things faster than before. Why shouldn't that be an improvement?

Alex talks about "build performance", which would mean "how long does it take to build/compile the game". It has nothing to do with the performance of the resulting code. That is something which makes the "change -> compile -> test" cycle faster, which reduces the idle time of the devs.
You are comparing something that's true during all the play time (resolution difference) with something that's only true a small number of times (framerate difference).

[UPDATED]
Nope:
1800p vs 2160p = 44% (total pixel count)
60fps (actually more than 60fps, to have solid 60fps all the time) vs 41fps = 46.3%
46.3 - 44 = more than a 2.3% advantage for PS5
Another usual win for PS5, but unknown how big exactly because solid 60fps is actually something around 60-80fps.
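Working the quoted numbers through (assuming "1800p" means 3200x1800; the 41fps figure is the poster's, not from the DF article):

```c
#include <stdio.h>

/* Reproduces the 44% and 46.3% figures above, and notes that ratios
 * compose multiplicatively: the residual is 1.463/1.44 - 1 ~= 1.6%,
 * a bit less than the "more than 2.3%" from subtracting percentages. */
int main(void) {
    double xsx_pixels = 3840.0 * 2160.0;            /* native 4K */
    double ps5_pixels = 3200.0 * 1800.0;            /* 1800p     */
    double res_adv = xsx_pixels / ps5_pixels - 1.0; /* ~0.440 */
    double fps_adv = 60.0 / 41.0 - 1.0;             /* ~0.463 */
    double residual = (1.0 + fps_adv) / (1.0 + res_adv) - 1.0;

    printf("resolution: %.1f%%, fps: %.1f%%, residual: %.1f%%\n",
           res_adv * 100, fps_adv * 100, residual * 100);
    return 0;
}
```

Either way, the whole exercise rests on treating one worst-case framerate reading as an average, which is the objection raised above.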
You talk as if rushed software doesn't exist. Should I point you to the number of post-release patches they have published for Windows? What's worse, shipping unfinished devkits or not shipping them at all?

I mean, Alex has his sources (but what sources?). Maybe the pulled-out-of-the-ass kind of source. But what am I saying? The more knowledgeable Xbox “insiders” out there like Dealer & Colt said it's the “tools”. So it must be the “tools”, right? Any bs they can conjure to make their plastic box of choice look disadvantaged. In this day and age, it's simply unacceptable to ship a devkit with unfinished game development tools to devs during the final stretch leading up to the consoles' launch. Some people do not realize how stupid that assertion sounds, how un-Microsoft it looks. I would never, in a million years, come to the assumption that Microsoft, a software development and cloud infrastructure powerhouse and developer of one of the two most popular OSes on planet earth, would ship unfinished software to game devs. A thing which would undoubtedly be detrimental to their reputation, apart from the fact that it sounds dumb as f@¢k. Microsoft themselves wouldn't stoop so low as to ship devkits with unfinished software dev tools in them. But dumbasses like Colt & Dealer find it perfectly reasonable to assume something that dumb. *Tsk tsk tsk* what absolute “brainlets”, I had no idea some people were that dumb.
This scenario is always going to be one where PlayStation has the edge (not Microsoft Edge); this is draw-call hell. So no, unless they bring a mower, there is not going to be any improvement, I think.
“We considered a lot of different ideas for how to support these features [in Nioh 2]”, he told DailyBits. “For a really intense action game, utilizing the haptic feedback too much may take away from the player’s experience with the game and could hurt their overall enjoyment of the title.”
“As a result, we implemented only a few of the most suitable ideas into the game. I would like to try and make a game that makes full use of the PS5’s haptic feedback feature in the future. It would turn into a game that couldn’t be experienced on the previous generation of hardware.”
“The plan for now is to have the Nioh team move on to work on new projects after the release of Nioh 2 Complete Edition. In order to ensure that future titles, including the possibility of Nioh 3, are titles that all of our fans can enjoy and look forward to, we are putting all of our effort into Nioh 2 CE.”
Code runs faster as long as you dedicate your time to optimizing it. The question isn't how good a compiler is, the question is how good the code is and how well written it is. You can have the world's best compiler and the fastest CPU around, but that won't do your code any good if it's filled with flaws that can and will affect performance. It's not the compiler that's the problem, it's the code and its quality. If the code is a lot of awful, messed-up spaghetti code, then you can bet performance and maintainability of the program and its code, respectively, will suffer in the long term. The tools excuse is the oldest one in the book.

What's so wrong about Alex's quote? Better compilers do increase the performance of the same code. Which in turn allows them to be more productive, thus being able to improve things faster than before. Why shouldn't that be an improvement?
You are comparing something that's true during all the play time (resolution difference) with something that's only true a small number of times (framerate difference).
Using those as absolute values is obviously a flawed comparison, as anyone with minimal math knowledge knows.
You should know better than to keep trolling.
Also a faster CPU.
Also much faster peak memory bandwidth.
When the RDNA2 features start kicking in this sort of gap could become the norm.
I'm not arguing about the tools thing, I'm talking about Alex's quote about how the compiler improves the code.

Code runs faster as long as you dedicate your time to optimizing it. The question isn't how good a compiler is, the question is how good the code is and how well written it is. You can have the world's best compiler and the fastest CPU around, but that won't do your code any good if it's filled with flaws that can and will affect performance. It's not the compiler that's the problem, it's the code and its quality. If the code is a lot of awful, messed-up spaghetti code, then you can bet performance and maintainability of the program and its code, respectively, will suffer in the long term.
Do you know if they were 90%, 95% or 35% finished?

Unfinished devkits and tools for a piece of hardware which Microsoft spent a good 3-4 years working on are unacceptable given the time spent on it. Shipping the tools 90% finished is understandable; shipping them half finished at the point at which the last devkits were sent to devs is unacceptable.
No, I haven't said that. I'm saying that to reach the numbers you are getting and comparing, you are assuming that the frame rate is always 41, which it clearly is not. That's basic math.

Sorry to tell you that, but you make yourself sound like a troll here. So FPS performance is fictional?