Next-Gen PS5 & XSX |OT| Console Tech Thread

Status
Not open for further replies.
Sure thing. Keep thinking that a 4 TFLOP machine can do everything a 12 TFLOP one can. The gap between the two is 300%; add a faster SSD, CPU, etc. The 4 TFLOP machine will not run anything optimized for XSX at all, BUT a 12 TFLOP machine will run anything a 4 TFLOP one can.

XSS is an extra step. Extra work, and time better spent on DLC or other projects.


You guys are talking past each other.

Lockhart can run any game code that the XSX can (not talking resolution or graphical detail).

The additional work which needs to be done is due to possible memory differences and getting the resolution and/or detail turned down enough to get it to run on Lockhart's 4 TF GPU.
 
Last edited:
The gap between the two is 300% and the targeted resolution is 1/4. What's your point? GPUs scale; they always have, depending on what you are targeting. Game design is not impacted by this.

I swear you are concern trolling and arguing in bad faith, completely ignoring the fact that this is how the industry has always worked.
It's actually 200%; 300% would be 4x.

But I still see what you are saying. Now riddle me this: what happens if devs don't target native 4K on the Series X? What happens if, like the UE5 demo, they have to settle for 1440p on this 12 TFLOP console? A third of that is 1.2 million pixels, or roughly 800p. Is that enough?
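The pixel math in that post checks out, by the way. Here is a quick sketch of it (the resolutions and the one-third power ratio come from the post above; the 16:9 aspect is my assumption):

```python
# Quick check of the scaling math above: a 12 TF -> 4 TF cut is a
# 1/3 power ratio, so keep a third of a 1440p frame's pixels and see
# what 16:9 height that buys you.

def scaled_height(height: int, ratio: float, aspect: float = 16 / 9) -> int:
    """Height of a 16:9 frame holding `ratio` times the original pixels."""
    pixels = (height * aspect) * height * ratio
    return round((pixels / aspect) ** 0.5)

print(2560 * 1440)                  # 3686400 pixels at native 1440p
print(2560 * 1440 // 3)             # 1228800, the ~1.2 million quoted above
print(scaled_height(1440, 1 / 3))   # 831, i.e. roughly "800p"
```

Same math gives 1080p as exactly a quarter of 4K, which is where the usual "XSX at 4K, Lockhart at 1080p" assumption comes from.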
 
Are you really saying that a PS4 can render anything a PS5 can? Oh really? It's much easier to port XSS code over to XSX than the other way around.

Games on PC have been optimized for the lowest common denominator since the PS2 days. Your $2,000 RTX 3080 Ti with its 16GB(?) of VRAM will just run code and assets optimized for a GTX 1660's 6GB of VRAM and brute-force them to 4K, 8K and higher.

Some developers may take the extra time and add 4K textures, but not change the cars, the faces, every asset, etc.; it will all look "last gen". That is why so many PC diehard fans whine about their $5,000 machines never really getting pushed to the limit.

The last game to do that, serve the "hardcore clientele" first, was Crysis... and the assets (the player models and faces are still top-tier), the world, essentially everything was made with high-end GPUs in mind (the tessellation battle).

Crysis SOLD like crap.

Your example doesn't make sense. Porting code from PS5 to PS4 is not the same as XSX to XSS. This has been stated many times. PS4 and PS5 do not share a CPU, SSD or GPU architecture. XSX and XSS share the same CPU and SSD, with the same GPU architecture clocked lower. Completely different scenario.
 
It's actually 200%; 300% would be 4x.

But I still see what you are saying. Now riddle me this: what happens if devs don't target native 4K on the Series X? What happens if, like the UE5 demo, they have to settle for 1440p on this 12 TFLOP console? A third of that is 1.2 million pixels, or roughly 800p. Is that enough?

That's up to the dev, and I don't think either console will mandate 4K. If you make a game that can only run at 1440p, that's your call and it will have to scale down appropriately, but I think that's a bit outside of what we are covering here.

The UE5 demo was like all the others: more a tech demo than real-world usage.
 
Are you really saying that a PS4 can render anything a PS5 can? Oh really? It's much easier to port XSS code over to XSX than the other way around.

Games on PC have been optimized for the lowest common denominator since the PS2 days. Your $2,000 RTX 3080 Ti with its 16GB(?) of VRAM will just run code and assets optimized for a GTX 1660's 6GB of VRAM and brute-force them to 4K, 8K and higher.

Some developers may take the extra time and add 4K textures, but not change the cars, the faces, every asset, etc.; it will all look "last gen". That is why so many PC diehard fans whine about their $5,000 machines never really getting pushed to the limit.

The last game to do that, serve the "hardcore clientele" first, was Crysis... and the assets (the player models and faces are still top-tier), the world, essentially everything was made with high-end GPUs in mind (the tessellation battle).

Crysis SOLD like crap.

EDIT:

Imagine Forza 7 being an XSS game, with 20 cars, each 250,000 polys(?), stereo sound, 2K textures or lower... 5GB VRAM + 2GB game/OS.
Imagine Forza 8 being an XSX game, with 50 to 200 cars (Le Mans-style design), each car 250,000 to 500,000 polys, 4K textures, 3D sound, yada... it fully eats up 13GB of VRAM.

XSX Forza 8 needs to be optimized to run on the XSS. The XSS game does not require any fiddling, bar some minor adjustments.

This isn't even worth it anymore. None of your comparisons or strawmen even make sense. I'll let others handle this, because you are literally comparing PS4 to PS5 to make a point, claiming that is somehow the same as Lockhart.

At a minimum, why would the car count even be different?
 
XSX and XSS share the same CPU and SSD, with the same GPU architecture clocked lower. Completely different scenario.

Wait, has this been confirmed yet? How are they going to hit the $199 price tag everyone seems to be alluding to?

I was under the impression that the SSD (maybe even just 512GB, and slower), VRAM and other parts got cut in half? Fewer GPU CUs alone won't be enough to reach said price range.
 
Wait, has this been confirmed yet? How are they going to hit the $199 price tag everyone seems to be alluding to?

I was under the impression that the SSD (maybe even just 512GB, and slower), VRAM and other parts got cut in half? Fewer GPU CUs alone won't be enough to reach said price range.
Nothing has been confirmed, but rumors from Tom Warren state the same CPU, 7.5GB of RAM for games, and the same SSD speeds.
 
Wait, has this been confirmed yet? How are they going to hit the $199 price tag everyone seems to be alluding to?

I was under the impression that the SSD (maybe even just 512GB, and slower), VRAM and other parts got cut in half? Fewer GPU CUs alone won't be enough to reach said price range.

No idea where the $199 price is coming from. Perhaps it has been speculated on this forum, but there haven't been any credible leads on the prices they're aiming for, and $199 seems crazy.

Again, maybe you should read the developer quote that I posted. It states that the SSD and CPU are the same.
 
Available. Compare to 13.5GB on Series X and PS5. Again, the requirements on this RAM are considerably different.
7.5GB available for games, Tom said, like 13.5GB for Series X. So 10GB total, I'd assume; a 6GB difference. Should help them save $40-50 on the BOM.

I'm guessing around 10GB total then. Also, it would make sense if the RAM was slower but not split in two like the XSX's.
 
That's up to the dev, and I don't think either console will mandate 4K. If you make a game that can only run at 1440p, that's your call and it will have to scale down appropriately, but I think that's a bit outside of what we are covering here.
What he's saying is that the assumption "XSX will run games at 4K, so XSS can easily just run them at 1080p" is flawed, because most likely a lot of games will be built in such a way that they run at a lower resolution to begin with on XSX, so cutting resolution by 3-4x for XSS would bring them down to ridiculous resolutions.

IMHO none of this matters anyway, because once you take for granted that PC is also part of the equation, that will end up being the lowest common denominator for a few years (since the minimum supported configuration will still definitely have worse components and worse optimization than the XSS). Since Microsoft has gone down the "all games will also be on Windows" route, XSS makes zero difference in the grand scheme of things right now.
 
Decided to ask a buddy of mine in the industry about Lockhart and Series X vs PS5 and what he thinks so far. He is working on Borderlands 3 for next gen right now.

Vetted by @Bill O'Rights

When I asked about Lockhart "holding back next gen," he had this to say:

"I really hate this term, as it doesn't make any sense in terms of how games are actually made, but I understand what people are trying to say. If you have ever designed a game, there isn't much you simply can or cannot do; it's all a matter of what you are willing to put time/resources into to get working correctly. Whenever a new generation of consoles happens, it allows creators to get things working faster and easier. Lockhart, when my team was first briefed on it, sounded really bad on paper.

Microsoft failed to provide real dev kits and details on the project. We didn't get any kind of Lockhart hardware until very recently. Before we actually had the hardware, we were given a profile on Anaconda dev kits that would mimic what Lockhart would be. But Microsoft never mentioned that it would have the same CPU and an SSD, or how much RAM they intended Lockhart to have. I suspect this was because they themselves hadn't decided. To put it bluntly, they released these profiles far too early. The tools they provided made us hate Lockhart.

That changed once we got Lockhart dev kits. It is indeed the same CPU and SSD, and getting up and running on this device was super easy compared to Anaconda running the Lockhart profile. We have been able to do the work we want on Anaconda and get it running on Lockhart without a ton of work, but it has required a bit more time to make sure the code runs on both machines in the same fashion. It's not something we are really worried about anymore. As the generation goes on, I feel like this will be the approach for many studios: you start on Anaconda and then optimize for Lockhart. There is nothing Lockhart can't do that Anaconda can.

The one thing I have heard that's concerning is that Lockhart dev kits are not common. It seems like Microsoft really wants to be able to use Anaconda to accurately portray Lockhart performance, and that has not been the experience my team has had. The profiles and tools on Anaconda are getting better at mimicking Lockhart, but if you don't have a Lockhart dev kit, I feel like you are not going to be able to see how your game accurately runs on Lockhart. Maybe this will change, but as of right now you really need a Lockhart dev kit to understand it. For smaller teams I could see the optimization process being more time-consuming, but the tools provided by Microsoft have come a long way. They make it very easy to jump from one kit to another, and the Lockhart kit is equipped with a lot of tools that help you see exactly where code needs to be looked at. Ray tracing is one area they seem to have focused on; they have made it very easy to adjust the levels."

I asked about PS5 dev kits vs Series X dev kits and which console has the upper hand:

"The PS5 dev kit is a bit easier to work with. It's well thought out and designed in ways that make it a bit easier to tweak and change things vs Anaconda. To say I prefer one over the other isn't really fair, because both are very good, but it's just a bit easier to work with the PS5. Anaconda has the upper hand in terms of us being able to really push effects. The difference will come down to effects over resolution for us. We have both dev kits pushing 4K/60 on Borderlands 3, and we have almost zero loading times on both kits. Looking at them side by side, the image is very similar."

Why is he still referring to it as "Anaconda" ?
 
Decided to ask a buddy of mine in the industry about Lockhart and Series X vs PS5 and what he thinks so far. He is working on Borderlands 3 for next gen right now.

Vetted by @Bill O'Rights

When I asked about Lockhart "holding back next gen," he had this to say:

"I really hate this term, as it doesn't make any sense in terms of how games are actually made, but I understand what people are trying to say. If you have ever designed a game, there isn't much you simply can or cannot do; it's all a matter of what you are willing to put time/resources into to get working correctly. Whenever a new generation of consoles happens, it allows creators to get things working faster and easier. Lockhart, when my team was first briefed on it, sounded really bad on paper.

Microsoft failed to provide real dev kits and details on the project. We didn't get any kind of Lockhart hardware until very recently. Before we actually had the hardware, we were given a profile on Anaconda dev kits that would mimic what Lockhart would be. But Microsoft never mentioned that it would have the same CPU and an SSD, or how much RAM they intended Lockhart to have. I suspect this was because they themselves hadn't decided. To put it bluntly, they released these profiles far too early. The tools they provided made us hate Lockhart.

That changed once we got Lockhart dev kits. It is indeed the same CPU and SSD, and getting up and running on this device was super easy compared to Anaconda running the Lockhart profile. We have been able to do the work we want on Anaconda and get it running on Lockhart without a ton of work, but it has required a bit more time to make sure the code runs on both machines in the same fashion. It's not something we are really worried about anymore. As the generation goes on, I feel like this will be the approach for many studios: you start on Anaconda and then optimize for Lockhart. There is nothing Lockhart can't do that Anaconda can.

The one thing I have heard that's concerning is that Lockhart dev kits are not common. It seems like Microsoft really wants to be able to use Anaconda to accurately portray Lockhart performance, and that has not been the experience my team has had. The profiles and tools on Anaconda are getting better at mimicking Lockhart, but if you don't have a Lockhart dev kit, I feel like you are not going to be able to see how your game accurately runs on Lockhart. Maybe this will change, but as of right now you really need a Lockhart dev kit to understand it. For smaller teams I could see the optimization process being more time-consuming, but the tools provided by Microsoft have come a long way. They make it very easy to jump from one kit to another, and the Lockhart kit is equipped with a lot of tools that help you see exactly where code needs to be looked at. Ray tracing is one area they seem to have focused on; they have made it very easy to adjust the levels."

I asked about PS5 dev kits vs Series X dev kits and which console has the upper hand:

"The PS5 dev kit is a bit easier to work with. It's well thought out and designed in ways that make it a bit easier to tweak and change things vs Anaconda. To say I prefer one over the other isn't really fair, because both are very good, but it's just a bit easier to work with the PS5. Anaconda has the upper hand in terms of us being able to really push effects. The difference will come down to effects over resolution for us. We have both dev kits pushing 4K/60 on Borderlands 3, and we have almost zero loading times on both kits. Looking at them side by side, the image is very similar."

This is thread-worthy, isn't it? A lot of new information (at least confirming the existence and nature of Lockhart).

Seems a shame to leave it buried in here when it answers a lot of the questions that have cluttered up the front page previously.
 
This is thread-worthy, isn't it? A lot of new information (at least confirming the existence and nature of Lockhart).

Seems a shame to leave it buried in here when it answers a lot of the questions that have cluttered up the front page previously.

I'd second the motion for it to be its own thread.
 
I'd second the motion for it to be its own thread.

Same here. It's vetted information, after all, and not your typical drive-by BS.

Also, notice how he said it's easier to make games on the PS5?

Maybe the fear over variable clocks making development difficult isn't justified.
 
Last edited:
I'd second the motion for it to be its own thread.

Well, we've had threads made out of 140-character tweets from people with a rumour.

This is a verified bit of information and runs to several paragraphs. It should really be given its time to breathe if that other crap is thread-worthy.

EDIT: Oh, and "seconding" the motion makes me feel like a brandy-drinking, cigar-smoking aristocrat. Thanks!

EDIT 2: On that theme, this is more information than a lot of "journalism" has concocted a piece from. So be ready for GAF → world → GAF...
 
Last edited:
It's actually 200%; 300% would be 4x.

But I still see what you are saying. Now riddle me this: what happens if devs don't target native 4K on the Series X? What happens if, like the UE5 demo, they have to settle for 1440p on this 12 TFLOP console? A third of that is 1.2 million pixels, or roughly 800p. Is that enough?

100% agree with this sentiment. People forget that memory availability is a big deal when it comes to writing performant code.

There are instances where more memory-hungry data structures will yield an order-of-magnitude speedup over others. In-memory caching of pre-computed results is a trivial example, as are the combinatorial search algorithms that underpin AI. So if you're just squeezing by at 1440p on XSX and now have to port down with nearly half the memory at your disposal, it will be more than just a matter of resolution and effect downscaling. Yeah, you'll probably cut out or compromise gameplay features.
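A minimal sketch of the trade-off described above (the "expensive query" here is a made-up stand-in, not code from any actual engine): caching pre-computed results spends RAM to turn repeated expensive work into a table lookup.

```python
from functools import lru_cache

def expensive_query(x: int, y: int) -> float:
    # Stand-in for costly per-frame work (AI pathing cost, visibility, etc.).
    return sum(((x + i) * (y + i)) ** 0.5 for i in range(50_000))

@lru_cache(maxsize=None)   # unbounded cache: trades memory for speed
def cached_query(x: int, y: int) -> float:
    return expensive_query(x, y)

cached_query(3, 4)                # miss: computes and stores the result
cached_query(3, 4)                # hit: returned straight from memory
print(cached_query.cache_info())  # shows hits=1, misses=1
```

On a memory-constrained target you would cap `maxsize` (or drop the cache entirely) and re-pay the compute cost on evictions, which is exactly the kind of non-resolution cost the post is talking about.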
 
This is thread-worthy, isn't it? A lot of new information (at least confirming the existence and nature of Lockhart).

Seems a shame to leave it buried in here when it answers a lot of the questions that have cluttered up the front page previously.
Thread made.
 
The gap between the two is 300% and the targeted resolution is 1/4. What's your point? GPUs scale; they always have, depending on what you are targeting. Game design is not impacted by this.

I swear you are concern trolling and arguing in bad faith, completely ignoring the fact that this is how the industry has always worked.

Please understand that this is not how the industry has always worked. The baseline for the industry is always the consoles, and usually one or two years after a console's release it becomes the lowest common denominator, because PC is always evolving; it's the nature of things. Bear in mind that right now the industry is working with a baseline of 1.3 TF; they don't need to downgrade anything for the PC market, just enhance, as the vast majority of PC gamers by now have GPUs as fast as or far faster than 1.3 TF.

What Lockhart is doing is undermining, right from the start, the baseline for next gen, which should be 10.2 TF. Can you imagine, 5 years from now, devs having to code for a 4 TF machine? It's as if devs today had to accommodate current-gen games to run on Xbox 360 and PS3. Do you really think games would be the same as they are now?

So imagine: 5 years from now, 10-12 TF will already be the bottom line of GPU performance, and on top of that devs will need to accommodate a 4 TF machine. This is madness when by then we will have GPUs that exceed that by orders of magnitude...

Even by then, Sony and Microsoft will have mid-gen refreshes boasting at least 25 TF machines. The tech industry moves forward, not backwards, but MS is really desperate, trying to find a shortcut to success with this short-term solution based on an attractive price and lots of marketing bullshit, to fool some people into thinking they are buying a next-gen machine when in fact they are buying a quick fix for MS to gain some market share, one that will be fully supported for maybe 2-3 years and then left behind...

Once again, the tech market moves forward, not backwards. MS will learn another lesson.
 
Last edited:
So, about Dirt 5 running at 120 fps: I found it quite impressive. Even though it obviously won't be 4K (I guess it's going to be 1080p), it's still quite impressive.

Hopefully AC Valhalla will have a 60 fps mode. I don't care much about 120 fps; it's cool, but I suspect there would be very big sacrifices graphics-wise to get there, which I don't think is worth it. 60 fps would be more than enough (and I don't think graphically demanding games can do 120 for obvious reasons), but if they can do 1080p/2K at 60 fps, that would be neat; that's the target I mostly want to be the focus.
 
That's up to the dev, and I don't think either console will mandate 4K. If you make a game that can only run at 1440p, that's your call and it will have to scale down appropriately, but I think that's a bit outside of what we are covering here.

The UE5 demo was like all the others: more a tech demo than real-world usage.

I feel that's the point SlimySnake was trying to make. If we are working under the assumption that all or most devs will target native 4K on PS5/Series X, then Lockhart has enough grunt, and more, to be a capable 1080p machine. The question is what happens when devs start going for photo-realistic, graphically intensive games and decide to scale back the resolution on Series X/PS5. Going from native 4K to 1800p, 1600p or 1440p will have a domino effect on Lockhart, as the power gap is relative across the three machines. It won't necessarily be able to output the desired 1080p all generation long, which puts a question mark against its long-term viability: a console generation generally lasts 10 years before devs stop making games for it, so how well do you think this machine will age in 2025 or 2027? These are all fair questions, IMO.

And while I agree with you that UE5 was a tech demo, it is still the most "next-gen" footage we have seen so far. It should serve as a reminder that higher-quality pixels > more pixels. The demo renders at 1440p and uses temporal injection to achieve 4K, yet the IQ on that thing can contest the native 4K games shown at the PS5 event. I'll go one step further: not only does it contest them, in some cases it outshines them.
 
Last edited:
So, about Dirt 5 running at 120 fps: I found it quite impressive. Even though it obviously won't be 4K (I guess it's going to be 1080p), it's still quite impressive.

Hopefully AC Valhalla will have a 60 fps mode. I don't care much about 120 fps; it's cool, but I suspect there would be very big sacrifices graphics-wise to get there, which I don't think is worth it. 60 fps would be more than enough (and I don't think graphically demanding games can do 120 for obvious reasons), but if they can do 1080p/2K at 60 fps, that would be neat; that's the target I mostly want to be the focus.
I'm hoping for the same, but we'll probably be out of luck. Ubisoft is going to roll with 30; there's a 95% chance, honestly.
 
It's actually 200%; 300% would be 4x.

But I still see what you are saying. Now riddle me this: what happens if devs don't target native 4K on the Series X? What happens if, like the UE5 demo, they have to settle for 1440p on this 12 TFLOP console? A third of that is 1.2 million pixels, or roughly 800p. Is that enough?

Project Athia was 4K, and the textures look to be about on par with the UE5 demo, IMO.



Maybe scaling back the res won't be needed, and UE5 was just unfinished?

 
Xbox Series X's Velocity Architecture Will "Greatly Help" Open World Games, Says Developer

In a recent interview with GamingBolt, Kavan talked about how the Xbox Series X's Velocity Architecture will be a big benefit to larger and open world games.

"This will greatly help large games – especially open world – because streaming is always an issue to deal with," he said. "It's not only about reading from SSD, but also providing the assets for the game. So yes, having hardware-level decompression and asset preprocessing might bring in a very interesting point for the overall smoothness."


 
Project Athia was 4K, and the textures look to be about on par with the UE5 demo, IMO.



Maybe scaling back the res won't be needed, and UE5 was just unfinished?

I think the problem is not the asset quality; it's Lumen, which was not using the hardware-accelerated path, and global illumination is heavy. And Epic said UE5 on PS5 (the geometry part) is comparable to Fortnite on PS4.

But this is only my thought; I really don't know.
 
Last edited:
I think the problem is not the asset quality; it's Lumen, which was not using the hardware-accelerated path, and global illumination is heavy. And Epic said UE5 on PS5 (the geometry part) is comparable to Fortnite on PS4.

But this is only my thought; I really don't know.
To be more precise, they said the time spent rendering geometry in the UE5 demo on PS5 was the same as rendering Fortnite on PS4.
 
Last edited:
Xbox Series X's Velocity Architecture Will "Greatly Help" Open World Games, Says Developer

In a recent interview with GamingBolt, Kavan talked about how the Xbox Series X's Velocity Architecture will be a big benefit to larger and open world games.




That's not possible; Alex Battaglia, aka Dictator, said the SSD wouldn't help open-world games. Whom should I trust? 🤔
 
Last edited:
Well, we've had threads made out of 140-character tweets from people with a rumour.

This is a verified bit of information and runs to several paragraphs. It should really be given its time to breathe if that other crap is thread-worthy.

EDIT: Oh, and "seconding" the motion makes me feel like a brandy-drinking, cigar-smoking aristocrat. Thanks!

EDIT 2: On that theme, this is more information than a lot of "journalism" has concocted a piece from. So be ready for GAF → world → GAF...

Please don't go too deep into my comment, lol. I just provided my thoughts on it.
 
Last edited:
Please don't go deep into my comment. I just provided my thoughts on it. Use ignore otherwise.

I didn't mean anything antagonistic by it, man; I'm just saying some of the threads that go up these days are kinda light on content. So I agreed with your seconding comment.

It just happened to be your post I replied to, but it was a tangential comment really...

:messenger_ok:
 
It is amazing how many people do not know that 1080p is 1/4 of 4K. Lockhart with 4 TF will have no problem handling next-gen game mechanics. This "it's gonna hold back" nonsense needs to stop.
 
It is amazing how many people do not know that 1080p is 1/4 of 4K. Lockhart with 4 TF will have no problem handling next-gen game mechanics. This "it's gonna hold back" nonsense needs to stop.

Is that really true, though? When working on the Lockhart version, wouldn't Microsoft need to divert resources from their big-console development team to make sure the little one is up and running?

In essence, it's time and effort that could have been used on the higher-performing spec to create/polish other content.

I am not saying it's a lot, but surely there has to be some consequence.
 
Hope the rumored Xbox Edinburgh is just a streaming machine... because if not, imagine having to optimize your game for 5 Xbox consoles: Xbox One, Xbox One X, Xbox Edinburgh, Xbox Lockhart and Xbox Series X.
 
Is that really true, though? When working on the Lockhart version, wouldn't Microsoft need to divert resources from their big-console development team to make sure the little one is up and running?

In essence, it's time and effort that could have been used on the higher-performing spec to create/polish other content.

I am not saying it's a lot, but surely there has to be some consequence.

I don't think so this gen. We have Zen 2, so every game can be 60 FPS if devs really want it; I think games will have a graphics mode + performance mode, even stuff made just for PS5/XSX. We just saw 4K30 with RT because it was just a first showing on YouTube.

Don't get me wrong, a 1080p device for the next 7 years is not a great proposition; it will age fast. It also makes me feel as if the XSX is more expensive than people think.

Remember, we have NO RDNA 2 prices yet; everyone is just guessing.
 
I don't think so this gen. We have Zen 2, so every game can be 60 FPS if devs really want it; I think games will have a graphics mode + performance mode, even stuff made just for PS5/XSX. We just saw 4K30 with RT because it was just a first showing on YouTube.

Don't get me wrong, a 1080p device for the next 7 years is not a great proposition; it will age fast. It also makes me feel as if the XSX is more expensive than people think.

Remember, we have NO RDNA 2 prices yet; everyone is just guessing.


I hate it when you make sense.
It's ok when it's someone else's post, but not mine.

Fs. I am not re-quoting you so i can slip under the radar on notifications.
Have to win somehow. :messenger_beaming:
 
Last edited:
I don't think so this gen. We have Zen 2, so every game can be 60 FPS if devs really want it; I think games will have a graphics mode + performance mode, even stuff made just for PS5/XSX. We just saw 4K30 with RT because it was just a first showing on YouTube.

Don't get me wrong, a 1080p device for the next 7 years is not a great proposition; it will age fast. It also makes me feel as if the XSX is more expensive than people think.

Remember, we have NO RDNA 2 prices yet; everyone is just guessing.


I hate it when you make sense.
It's ok when it's someone else's post, but not mine.

Fs. I am not re-quoting you so I can slip under the radar on notifications.
Have to win somehow. :messenger_beaming:

 
Is that really true, though? When working on the Lockhart version, wouldn't Microsoft need to divert resources from their big-console development team to make sure the little one is up and running?

In essence, it's time and effort that could have been used on the higher-performing spec to create/polish other content.

I am not saying it's a lot, but surely there has to be some consequence.

The argument that the Xbox Series S will restrict the development of games such that they won't tap into the full potential of the Xbox Series X is ridiculous.

The task of developing a game that is intended to run on different machines, each of which has a unique level of performance, is not novel. Game developers have succeeded in completing this task time and time again for decades, particularly in regard to PC games, since PCs are highly customizable and consequently encompass a wide range of performance levels.

Game developers tackle this task by simply creating several options for scalable graphical effects, such as screen resolution, texture resolution, asset density (e.g. NPCs, vehicles, etc), shadow quality, etc. They also allow certain graphical effects to be toggled on or off depending on whether or not said graphical effects are supported by a system's hardware (e.g. tessellation, ray tracing, variable rate shading, etc).

Hence, by simply creating two sets of pre-selected options for scalable graphical effects, one for the Xbox Series S and another for the Xbox Series X, and by making certain graphical effects exclusive to the Xbox Series X (e.g. ray tracing), developers can develop games that can run on both consoles but that tap into the full potential of the Xbox Series X.
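As a sketch of that approach (all the names and values here are hypothetical illustrations, not actual Xbox settings), the two sets of pre-selected options might look like:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GraphicsPreset:
    render_height: int      # vertical resolution of a 16:9 frame
    texture_quality: str    # e.g. a "2k" vs "4k" texture set
    shadow_quality: str
    ray_tracing: bool       # effects toggled off where unsupported/too costly

# One pre-selected bundle per console; the same game code reads whichever
# bundle matches the hardware it booted on.
PRESETS = {
    "series_s": GraphicsPreset(1080, "2k", "medium", ray_tracing=False),
    "series_x": GraphicsPreset(2160, "4k", "high", ray_tracing=True),
}

def preset_for(console: str) -> GraphicsPreset:
    return PRESETS[console]
```

The point is that the gameplay code never branches on the console; only the rendering settings bundle differs, which is exactly how PC quality presets have worked for years.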
 
It's gonna be funny when the 2 TF advantage ends up showing up as a few extra particle effects, and the 50% I/O advantage as half-a-second-quicker loading times. 😆

Multiplats will be basically the same, and the only real differences may appear in first-party titles.

Same as always.
 
The argument that the Xbox Series S will restrict the development of games such that they won't tap into the full potential of the Xbox Series X is ridiculous.

The task of developing a game that is intended to run on different machines, each of which has a unique level of performance, is not novel. Game developers have succeeded in completing this task time and time again for decades, particularly in regard to PC games, since PCs are highly customizable and consequently encompass a wide range of performance levels.

Game developers tackle this task by simply creating several options for scalable graphical effects, such as screen resolution, texture resolution, asset density (e.g. NPCs, vehicles, etc), shadow quality, etc. They also allow certain graphical effects to be toggled on or off depending on whether or not said graphical effects are supported by a system's hardware (e.g. tessellation, ray tracing, variable rate shading, etc).

Hence, by simply creating two sets of pre-selected options for scalable graphical effects, one for the Xbox Series S and another for the Xbox Series X, and by making certain graphical effects exclusive to the Xbox Series X (e.g. ray tracing), developers can develop games that can run on both consoles but that tap into the full potential of the Xbox Series X.

The same CPU & I/O performance, plus enough GPU and RAM to always have 1080p as your typical frame buffer, seems to add up to me.

It's still more work and time in development, but there's nothing about the hardware I can imagine that would rule out a game design on XSS that is otherwise possible on XSX.
 
The argument that the Xbox Series S will restrict the development of games such that they won't tap into the full potential of the Xbox Series X is ridiculous.

The task of developing a game that is intended to run on different machines, each of which has a unique level of performance, is not novel. Game developers have succeeded in completing this task time and time again for decades, particularly in regard to PC games, since PCs are highly customizable and consequently encompass a wide range of performance levels.

Game developers tackle this task by simply creating several options for scalable graphical effects, such as screen resolution, texture resolution, asset density (e.g. NPCs, vehicles, etc), shadow quality, etc. They also allow certain graphical effects to be toggled on or off depending on whether or not said graphical effects are supported by a system's hardware (e.g. tessellation, ray tracing, variable rate shading, etc).

Hence, by simply creating two sets of pre-selected options for scalable graphical effects, one for the Xbox Series S and another for the Xbox Series X, and by making certain graphical effects exclusive to the Xbox Series X (e.g. ray tracing), developers can develop games that can run on both consoles but that tap into the full potential of the Xbox Series X.

Since I have already been considering a Lockhart, that's good to hear.
 