Seriously, how will Xbox Series X exclusives make full use of ray tracing if all of its games will release on PC too?

Tenka Musou

Banned
Not a console wars thread, just genuinely curious.

If it's mandatory that Xbox games release on PC, does that mean Xbox exclusive developers have to design their games around a non-ray-tracing-capable PC as the baseline? That would mean all we'd get are tacked-on RT features like in Control and Metro, and no games designed from the ground up around ray tracing.

So for example, PS5 exclusive devs do not need to bother making a version of their games with baked lighting/fake reflections, but Xbox exclusive devs must make two versions with different lighting modules.

This is only taking into account the graphical side of things. What about AI applications of ray tracing, like that one developer recently mentioned? Are they gonna make two different AI behaviors?
 
If next gen is 7 years, I think it's safe to assume PC GPUs will catch up soon enough, if you could even say that PC GPUs have to catch up. Let's say, average PC GPUs.
 
For the first couple of years, all first-party games need to run on the original Xbox One too. And they'll always be tied to the Xbox Series S.

I'm not seeing why PC is the problem.
 
Not a console wars thread, just genuinely curious.

If it's mandatory that Xbox games release on PC, does that mean Xbox exclusive developers have to design their games around a non-ray-tracing-capable PC as the baseline? That would mean all we'd get are tacked-on RT features like in Control and Metro, and no games designed from the ground up around ray tracing.

So for example, PS5 exclusive devs do not need to bother making a version of their games with baked lighting/fake reflections, but Xbox exclusive devs must make two versions with different lighting modules.

This is only taking into account the graphical side of things. What about AI applications of ray tracing, like that one developer recently mentioned? Are they gonna make two different AI behaviors?

They will just raise minimum specs on PC.
 
You can turn Minecraft RT off at the press of a button on the keyboard. Not even a menu needed. I'd like to think there's an intelligent way to do this automatically if the game knows RT isn't doable on that machine 🤷🏻‍♂️
 
If next gen is 7 years, I think it's safe to assume PC GPUs will catch up soon enough, if you could even say that PC GPUs have to catch up. Let's say, average PC GPUs.
There will still be a huge number of non-RT-capable GPUs for the majority of these 7 years.
You can turn Minecraft RT off at the press of a button on the keyboard. Not even a menu needed. I'd like to think there's an intelligent way to do this automatically if the game knows RT isn't doable on that machine 🤷🏻‍♂️
But Minecraft already has a non-RT version, though. The RT was developed after the fact.
 
They will have a tick box to turn it on or off, or their minimum spec will require DX12 Ultimate compliant hardware. Simple as that.
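The "detect it and fall back automatically" idea mentioned above could be sketched roughly like this. This is a toy Python illustration, not a real engine API: `pick_lighting_path` and the tier values are made-up names standing in for a real capability query (such as a DX12 feature-support check at startup).

```python
# Toy sketch of an automatic RT fallback: at startup the game queries
# the hardware's reported ray-tracing tier and picks a lighting path.
# The tier value is a stand-in for a real capability check; tier 0
# means no hardware RT support.

def pick_lighting_path(raytracing_tier: int) -> str:
    """Return which lighting pipeline to use for the reported RT tier."""
    if raytracing_tier >= 1:
        return "raytraced"   # hardware RT available: RT GI/reflections on
    return "baked"           # fall back to baked lightmaps / screen-space tricks

# A non-RT GPU would report tier 0; an RTX/RDNA2-class card reports >= 1.
print(pick_lighting_path(0))  # baked
print(pick_lighting_path(1))  # raytraced
```

The point of the sketch is only that the toggle can be automatic rather than a user-facing tick box; the hard part the thread is arguing about is whether a "baked" path exists at all.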
 
...but Xbox exclusive devs must make two versions with different lighting modules.

This is only taking into account the graphical side of things. What about AI applications of ray tracing, like that one developer recently mentioned? Are they gonna make two different AI behaviors?
Just two words: DirectX 12 ;)
 
They will have a tick box to turn it on or off, or their minimum spec will require DX12 Ultimate compliant hardware. Simple as that.
So they tick a box and turn off RT. What happens then to the game's world?
Unless the developers have made a non-RT version of their game with baked lighting, non-RT reflections, etc., it won't be as "simple as that".
 
I doubt ray tracing will become the standard right away; they'll still launch some of the new games for the current gen, after all.
By the time devs completely abandon other lighting techniques, most GPUs on the market will already be able to handle it.
 
So they tick a box and turn off RT. What happens then to the game's world?
Unless the developers have made a non-RT version of their game with baked lighting, non-RT reflections, etc., it won't be as "simple as that".

It will be as simple as that for most games; we already have games that do it with ray tracing. Will it look like shit? Maybe. It's up to the devs to fall back on older rasterization technology, but lots of games now let you turn off modern effects like ambient occlusion, global illumination, and shadow effects. It can look like poop, but the option is there.
 
So they tick a box and turn off RT. What happens then to the game's world?
Unless the developers have made a non-RT version of their game with baked lighting, non-RT reflections, etc., it won't be as "simple as that".
It is as simple as that. Obviously the game won't look as good as it would with RT on, but using DX12 for that purpose is "simple". Now, I would understand your argument if a game mechanic relies heavily on RT reflections (such as watching an enemy's reflection or stuff like that). Those games would have to be re-written and have 2 versions: one for RT and one without RT.
 
I doubt ray tracing will become the standard right away; they'll still launch some of the new games for the current gen, after all.
By the time devs completely abandon other lighting techniques, most GPUs on the market will already be able to handle it.
On the contrary, I foresee everyone racing to put some form of RT in their games because it's the new hot shit. To what extent, we will see.
It will be as simple as that for most games; we already have games that do it with ray tracing. Will it look like shit? Maybe. It's up to the devs to fall back on older rasterization technology, but lots of games now let you turn off modern effects like ambient occlusion, global illumination, and shadow effects. It can look like poop, but the option is there.
It is as simple as that. Obviously the game won't look as good as it would with RT on, but using DX12 for that purpose is "simple". Now, I would understand your argument if a game mechanic relies heavily on RT reflections (such as watching an enemy's reflection or stuff like that). Those games would have to be re-written and have 2 versions: one for RT and one without RT.
You both fundamentally misunderstand the point of this thread.

There is no point in bringing up current RT-supporting games for comparison, because they were not built exclusively on RT; they already feature non-RT reflections and lighting. The RT was a feature slapped on top as an extra.

In the future, does that mean Xbox exclusive devs will be obligated to make a version of their game with old visual tricks and lighting to compensate for the lack of RT, while PS5 exclusive devs are not held to such an obligation?
 
On the contrary, I foresee everyone racing to put some form of RT in their games because it's the new hot shit. To what extent, we will see.
They would, while still maintaining classic illumination techniques, exactly like they're doing right now, precisely because they're still interested in selling their games to current-gen console owners and PC players with non-RT-ready hardware.
 
For the next few years, yeah, they probably will. Then DX12 Ultimate hardware will reach a tipping point where there is enough of it out there to justify leaving older hardware behind.

I'd be very impressed if Sony have a big-budget game designed solely around ray tracing within the next 2 years, although I'd imagine indie devs might have something experimental like that.
 
Yes, they will have to build two versions of the lighting system. PC isn't the problem, though, as developers could just say "fuck you unless you own an RTX card". The XB1 is the "problem".
 
They would, while still maintaining classic illumination techniques, exactly like they're doing right now, precisely because they're still interested in selling their games to current-gen console owners and PC players with non-RT-ready hardware.
Multiplatform games? for sure.

Exclusives though? That is the point of this thread: it seems like Xbox Series X exclusives won't fully realize the console's GPU potential when it comes to RT, unless devs make two distinct versions of their games.

Which would be like being stuck in a cross-gen limbo, only it lasts the entire generation
 
Multiplatform games? for sure.

Exclusives though? That is the point of this thread: it seems like Xbox Series X exclusives won't fully realize the console's GPU potential when it comes to RT, unless devs make two distinct versions of their games.

Which would be like being stuck in a cross-gen limbo, only it lasts the entire generation
Don't forget that a lot of PS exclusives will still come out for the ps4.
As for true PS5 and Xbox Series X exclusives, don't expect anything good to release until around the middle of the gen, when RT-ready cards will most likely already be popular.
 
... it seems like Xbox Series X exclusives won't fully realize the console's GPU potential when it comes to RT, unless devs make two distinct versions of their games.

Which would be like being stuck in a cross-gen limbo, only it lasts the entire generation
Care to tell us why?

For example, let's say the next Halo absolutely requires RT (which I doubt, but for the sake of argument let's say it does): Microsoft could simply put an NVIDIA RTX-series card or better (or an AMD RDNA2 card) under "System Requirements" and that's it.

Heck, I have a Core i5-6400 CPU in my PC with integrated graphics, and I know for a fact that there are games for PS4/Xbox that will never run on my system because they require a video card. In fact, there are millions of PCs with the exact same config, and I don't see video game developers catering to those PCs or even using them as the baseline.
 
Care to tell us why?

For example, let's say the next Halo absolutely requires RT (which I doubt, but for the sake of argument let's say it does): Microsoft could simply put an NVIDIA RTX-series card or better (or an AMD RDNA2 card) under "System Requirements" and that's it.
But that's not realistic at all considering how small the RTX adoption rate would be, at least in the first 3 years of the generation. They would cut off a HUGE portion of their active userbase.
 
Now, OP (and most people here) seem to have forgotten how the 90s and early 2000s were in terms of consoles and PCs. Back then, 3D graphics were a revolution among consoles, and many games couldn't be played on PC simply because there weren't enough video cards on the market. Heck, there wasn't even a 3D standard until OpenGL and Direct3D arrived. If a console game had a PC version that required Glide, then you were forced to buy a 3dfx video card, otherwise the game wouldn't run.

The real-time ray-tracing technique is as revolutionary as 3D was in the 90s. It will take some time for developers and gamers to get used to it.
 
I think Microsoft will make full RT-based games 1440p at 60 Hz max.

So they run OK.
 
Who said anything about it having to be full ray tracing? It could be GI, reflections, or both. It matters little to the discussion.
Even with that alone, you still won't get close to what is available on the high end of current GPUs, which were released 2 years ago. So to think the XSX will be held back by PC is pretty dumb.
 
But that's not really realistic at all considering how small RTX adaption rate would be, at least in he first 3 years of the generation. They would cut off a HUGE portion of their active userbase.
Resident Evil (1996) would like to have a word with you. The game was first released for PS1 (as a PS exclusive) and 1 year later was released for PC. It required a 3dfx video card, so if you had a Matrox, ATI, or another card, you were screwed and had to buy a 3dfx card. Did it matter for the success of that game? It didn't.
 
Not every single game will use ray tracing, so this is only a question for some developers and games. They will probably have to develop their game with the option to turn the feature on/off, which will lead to more workload.
 
There will still be a huge number of non-RT-capable GPUs for the majority of these 7 years.
Define "non-RT capable". My GTX 1070 does ray tracing all the time. Hell, my CPU traces rays whenever I fire a hitscan weapon in an FPS. I'm going to assume, for the sake of argument, that you mean ray tracing in the sense of simulating light... in which case GTX cards are still capable. Quake II RTX is playable on my 1070 if you're willing to lower the resolution. Java Minecraft has also had a couple of real-time ray tracing shader packs that reach arguably playable (consistent 30+) framerates for a while, the oldest being SEUS PTGI, which predates RTX's announcement by quite some way.
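For what it's worth, the hitscan point holds up: a hitscan shot really is just a ray cast. A minimal ray-sphere intersection test, the kind of math any CPU does for this, can be sketched in a few lines of Python (purely illustrative, not engine code):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t >= 0) hits a sphere.

    Substituting the ray into |p - center|^2 = radius^2 gives a quadratic
    in t; the ray hits iff there is a non-negative real root.
    """
    # Offset from sphere center to ray origin
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return False                        # ray misses entirely
    t = (-b - math.sqrt(disc)) / (2*a)      # nearest intersection
    if t < 0:
        t = (-b + math.sqrt(disc)) / (2*a)  # origin may be inside the sphere
    return t >= 0

# Shooting along +x at a unit sphere 10 units away hits; shooting along +y misses.
print(ray_hits_sphere((0, 0, 0), (1, 0, 0), (10, 0, 0), 1.0))  # True
print(ray_hits_sphere((0, 0, 0), (0, 1, 0), (10, 0, 0), 1.0))  # False
```

That's "ray tracing" in the geometric sense; what the thread means by RT-capable GPUs is doing millions of these per frame against full scene geometry for lighting.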
 
Did you know even games that support ray tracing have the option to turn it off? Maybe starting there would make you realize this thread was a stupid idea?
Read the thread, clown.
There is no point in bringing up current RT-supporting games for comparison, because they were not built exclusively on RT; they already feature non-RT reflections and lighting. The RT was a feature slapped on top as an extra.
Resident Evil (1996) would like to have a word with you. The game was first released for PS1 (as a PS exclusive) and 1 year later was released for PC. It required a 3dfx video card, so if you had a Matrox, ATI, or another card, you were screwed and had to buy a 3dfx card. Did it matter for the success of that game? It didn't.
The landscape of the PC market, especially in relation to multi-platform games, was ridiculously different then than it is today, which makes your example moot.
 
You do realize PC already has games that have ray tracing as an option? And that PCs without RTX cards can still play them? It's literally a toggle. I don't understand how this is an issue or a concern.
 
Non-RT graphics cards ray tracing slider: [|------------]
Xbox Series S ray tracing slider: [---|---------]
PlayStation 5 ray tracing slider: [-------|-----]
Xbox Series X ray tracing slider: [--------|----]
New high-end graphics cards ray tracing slider: [------------|]
 