
Should next-gen consoles scrap hardware raytracing?

quest

Not Banned from OT
I hope neither MS nor Sony wastes too much die space on RT. At this point it is just not ready for prime time. In a few years Nvidia and AMD will figure out much better methods to do it. This is how it always works: they introduce a new feature, it sucks performance-wise; if gamers and developers like it, they find ways to make it perform better and more efficiently. This is the infancy of RT; if it catches on, in a couple of years performance on PC will be 50 times better and consoles won't be worth using because of the disparity.
 

JLB

Banned
This reminds me of some discussions in the mid '90s, when many people argued that 3D graphics were a luxury that didn't make a lot of sense to push (pre-Voodoo era).
The same way everyone understood at the time, after trying a 3D game, that there was no way back, the same will happen with ray tracing.
 
What do you mean you hope there aren't any? They've been confirmed for both consoles months ago.
They confirmed hardware-accelerated raytracing; they haven't said anything about raytracing cores.
Microsoft invented DXR, and it wasn't just an Nvidia RTX feature; it's an overall API.
 

quest

Not Banned from OT
What do you mean you hope there aren't any? They've been confirmed for both consoles months ago.
MS has just said hardware, which I think will be the intersection testing engines that AMD laid out in patents this summer. The CUs will do the rest of the work, so it won't add much to the die. Sony, on the other hand, sounds like it will have tensor-like cores or some other acceleration to help RT, which will add to the die space and possibly reduce the number of CUs as a trade-off.
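Purely to illustrate the split described above (this is my own sketch, not AMD's actual patented design or any real API): in such a scheme the "intersection engine" would hardware-accelerate small, repetitive tests like ray-vs-bounding-box, while the shader CUs run the traversal logic around them. The standard slab test is the usual form of that box test:

```python
def ray_aabb_intersect(origin, direction, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    This is the kind of small, branch-light kernel that dedicated
    intersection hardware is good at, while traversal stays on the CUs.
    """
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return False  # ray parallel to this slab and outside it
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        if t0 > t1:
            t0, t1 = t1, t0  # order the two slab crossings
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return False  # the slab intervals don't overlap: a miss
    return t_far >= 0.0  # reject boxes entirely behind the ray
```

A GPU runs millions of these tests per frame while walking the acceleration structure, which is why a small fixed-function unit for them could cost relatively little die area.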
 
This reminds me of some discussions in the mid '90s, when many people argued that 3D graphics were a luxury that didn't make a lot of sense to push (pre-Voodoo era).
The same way everyone understood at the time, after trying a 3D game, that there was no way back, the same will happen with ray tracing.
We all know raytracing is better than rasterization; we are not disputing that. However, raytracing puddles and shadows at the cost of performance is, in any sense, stupid.
 
MS has just said hardware, which I think will be the intersection testing engines that AMD laid out in patents this summer. The CUs will do the rest of the work, so it won't add much to the die. Sony, on the other hand, sounds like it will have tensor-like cores or some other acceleration to help RT, which will add to the die space and possibly reduce the number of CUs as a trade-off.
I'm not sure Sony would be stupid enough to make raytracing cores. Of all the manufacturers in this industry, Sony isn't that stupid; they would never risk the die space for raytraced shadows and reflections.
 

DESTROYA

Member
I'm not worried about it; better to have it than not.
If one console has it and the other does not, the forums would be one fanboy shitshow about which one is better before they're even released.
People worry about this shit too much; let the consoles come out first and then we will see whether it's beneficial or not.
From what I've read both will have dedicated hardware, so I'm guessing the feature will be able to be toggled on and off in the settings like you can on PC.
 

Heinrich

Banned
MS has just said hardware, which I think will be the intersection testing engines that AMD laid out in patents this summer. The CUs will do the rest of the work, so it won't add much to the die. Sony, on the other hand, sounds like it will have tensor-like cores or some other acceleration to help RT, which will add to the die space and possibly reduce the number of CUs as a trade-off.

Yup. Exactly.
For MS, ray tracing won't be really taxing; it basically won't cost performance, so using RT is not a bad thing.
The PS5, on the other hand, does not have dedicated hardware for RT. So yeah, OP, there you are right: for the PS5 it will be very taxing.
 
So let me ask you: was the push for 3D graphics stupid? Because it also brought gigantic performance penalties back in the day.
The push for 3D graphics isn't stupid, but losing performance is stupid. Once they can do raytracing better, I'm all for it, but currently? Nope, not on consoles. They can implement raytraced puddles all they care to on PCs and in tech demos, but not at the cost of ruining my performance on console.
 

Journey

Banned
Here's some perspective...

Bumpmapping

Instead of adding textures to trees, roads and tires, should games have focused on draw distance, framerate and polygon throughput instead?

We may have ended up with a Halo that looked like this:

[image: HaloCE-Chain_Gun.png]
[image: HaloCE-Stealth_Tank_1.png]

Changing my fond memories of Halo: CE forever!
 
Here's some perspective...

Bumpmapping

Instead of adding textures to trees, roads and tires, should games have focused on draw distance, framerate and polygon throughput instead?

We may have ended up with a Halo that looked like this:

[image: HaloCE-Chain_Gun.png]
[image: HaloCE-Stealth_Tank_1.png]
Changing my fond memories of Halo: CE forever!
Polygons and textures are mandatory; even a layman knows that. However, raytraced puddles aren't mandatory. We can argue from here to Wuhan and back: raytraced puddles plus a performance loss isn't mandatory and never will be. In 10 years, yes, but right now, no way. It isn't necessary.

Having bigger, more detailed monster gods on screen in God of War, with all the gore, matters more than raytraced reflections and shadows.
 

hunthunt

Banned
We don't even know the consoles' specs and some of you are already discarding them as shit-tier, cheap-ass PCs far from the power of the Master Race 😆 (when probably only 1% of PC gamers have a PC with those specs).
 

Dane

Member
What some are saying is that DLSS may be the solution: the reconstruction sometimes matches native resolution well and frees up performance for ray tracing.
 
This is such a silly question to pose. You haven't even seen raytracing in next-gen games yet. You have no clue what you are trying to kill.
I didn't say kill raytracing; I said scrap raytracing cores and the current raytracing implementations like shadows and reflections. This reminds me of when Crysis added tessellation to a roadblock that didn't improve how the roadblock looked but simply killed performance:


[images: rVW67TE.jpg, kBl1N6v.jpg, o83Ensc.jpg, rGrm7Tv.jpg]
 

Amaranty

Member
I feel the current hardware raytracing is a stopgap. It is very limited, and it will be obsolete in a few years.

It's like back with the early GeForce cards, when we had fixed-function Transform and Lighting hardware acceleration.

It was not until the GeForce 3 that we got programmable pixel and vertex shaders, and they blew all the fixed T&L stuff out of the water.

And even then, it took a few more gens to get to general-purpose GPU uses.
Why do you believe raytracing is a stopgap? Jokes aside (from my first post), I do prefer higher framerates over better graphics, but if Sony and Microsoft are already implementing ray tracing in consoles, then it will probably become a standard in the future.
 

nkarafo

Member
It doesn't matter; consoles will always use the minimum accepted frame rate as the standard, so they might as well use ray tracing or any other fancy technique.
 
I don't remember a story ever being related to frames per second, and I find it really silly. Ten years ago people cried for 1080p 30fps; today they cry for 4K 60fps; in a few years they'll cry for 8K 120fps. It's astonishingly stupid.

The fact that you say a 30fps camera is juddery explains it all..... Proper rubbish. Images being thrown faster on screen has never had anything to do with a game's story. As I said before, I couldn't care less about 60fps in certain games, but everybody has different fetishes nowadays.
You are not talking about the same people in all these instances.

There are those who claim 30fps (or even less) is "just fine", even with today's hardware... But the push for locked 60fps is not new: Sega built the hardware for Daytona USA so that it would run at 60fps back in 1993, explained the reason why, and it all went downhill from there.

Anyway, as far as I know, those pushing for 8K are usually not the same people pushing for 120fps, but if I had to pick a camp I would sit firmly in the 1080p 120fps camp, well before 4K (I love 4K for photos; for videogames I don't think it's worth it on current hardware; better to use the power to render more detail, etc.)
 
You are not talking about the same people in all these instances.

There are those who claim 30fps (or even less) is "just fine", even with today's hardware... But the push for locked 60fps is not new: Sega built the hardware for Daytona USA so that it would run at 60fps back in 1993, explained the reason why, and it all went downhill from there.

Anyway, as far as I know, those pushing for 8K are usually not the same people pushing for 120fps, but if I had to pick a camp I would sit firmly in the 1080p 120fps camp, well before 4K (I love 4K for photos; for videogames I don't think it's worth it on current hardware; better to use the power to render more detail, etc.)
To me, 30fps is the baseline; if you can push 60, you're very welcome to, but as far as I'm concerned it depends on the game: most competitive games shoot for 60, and the absolute competitive shooters like Counter-Strike shoot for 120-144. 1080p 30fps is the standard for me; as long as they can push graphical fidelity to ultra with good AA/supersampling, I don't care.

I'm interested in a game's visuals, gameplay and story, and I think it's ridiculous to force developers to target 60fps or above; developers should be free to design and express their games any way they see fit.
 
Give me the option to turn it off when I want higher framerates, and I'll be happy.
Console games aren't luxuries like PC games; everything implemented in a console game is pixel-perfectly tuned to the design. If they use raytracing, then it's baked deep into the engine, not just a fancy toggled feature.
 
To me, 30fps is the baseline; if you can push 60, you're very welcome to, but as far as I'm concerned it depends on the game: most competitive games shoot for 60, and the absolute competitive shooters like Counter-Strike shoot for 120-144. 1080p 30fps is the standard for me; as long as they can push graphical fidelity to ultra with good AA/supersampling, I don't care.

I'm interested in a game's visuals, gameplay and story, and I think it's ridiculous to force developers to target 60fps or above; developers should be free to design and express their games any way they see fit.
Given the hardware of the upcoming consoles, and how similarly specced PCs perform, I can only hope we get 60fps and >1080p games. I do care about resolution, just not beyond the hardware's capabilities... Sure, give me 4K raytraced graphics with a fully interactive world at 344fps any day of the week, but we need to be somewhat pragmatic about our expectations... So I think most games will target native 4K at 30 or 60fps, and that we will still get more 60fps games than in the last generation. If we are lucky, many games will offer a lower-resolution/detail "performance" mode like they do on the pro consoles now; I think this is where the compromise will be made.
 
Given the hardware of the upcoming consoles, and how similarly specced PCs perform, I can only hope we get 60fps and >1080p games. I do care about resolution, just not beyond the hardware's capabilities... Sure, give me 4K raytraced graphics with a fully interactive world at 344fps any day of the week, but we need to be somewhat pragmatic about our expectations... So I think most games will target native 4K at 30 or 60fps, and that we will still get more 60fps games than in the last generation. If we are lucky, many games will offer a lower-resolution/detail "performance" mode like they do on the pro consoles now; I think this is where the compromise will be made.
This is why I thought upgradable or detachable memory or APUs were a good idea instead of mid-gen refreshes. If they could offer off-the-shelf memory cartridges that improve fps and resolution, that would be a fantastic idea.

Even on PC I've asked the same question: why not sell the GPU processor separately from the GPU board? Say you buy a GPU just as you buy a CPU and connect it to a GPU socket on the main board; you purchase a 2080 Ti processor and you purchase VRAM in any capacity, just as you do system RAM. Conventional main boards are old school; it's time they were refreshed.

We need more freedom in upgrading, and AMD and Nvidia are making a monopoly of how we play games. I need a 2080, but not at $1000, and I need 32GB of VRAM, not the locked 8GB that comes with the card.
 

DESTROYA

Member
Console games aren't luxuries like PC games; everything implemented in a console game is pixel-perfectly tuned to the design. If they use raytracing, then it's baked deep into the engine, not just a fancy toggled feature.
You don't know that; it can easily be implemented in the settings like they do in PC games.
Earlier this month Xbox Game Studios head Matt Booty confirmed the decree that all their new next-gen content must also work on the original 2013 Xbox One.

The first real warning sign that this may gimp content and prove difficult or even untenable for devs today.
If this turns out to be true, then since the Xbox One has no RT features, I'm guessing you'll be able to turn RT on or off so games run smoothly enough on the "older" hardware.
 
This is why I thought upgradable or detachable memory or APUs were a good idea instead of mid-gen refreshes. If they could offer off-the-shelf memory cartridges that improve fps and resolution, that would be a fantastic idea.
You don't know that; it can easily be implemented in the settings like they do in PC games.
Earlier this month Xbox Game Studios head Matt Booty confirmed the decree that all their new next-gen content must also work on the original 2013 Xbox One.

The first real warning sign that this may gimp content and prove difficult or even untenable for devs today.
If this turns out to be true, then since the Xbox One has no RT features, I'm guessing you'll be able to turn RT on or off so games run smoothly enough on the "older" hardware.
You're talking about cross-gen titles. If so, yes, CoD 2019 could probably have raytracing on Xbox Series X, but a game specifically designed for the Series X with raytracing isn't going to be a toggle on the menu screen like you have on PC. The Series X and PS5 aren't mid-gen refreshes; the PS4 Pro is simply a PS4 on steroids, so there you can have toggled settings, but the PS5 is a fresh generation.
 

GymWolf

Member
This reminds me of some discussions in the mid '90s, when many people argued that 3D graphics were a luxury that didn't make a lot of sense to push (pre-Voodoo era).
The same way everyone understood at the time, after trying a 3D game, that there was no way back, the same will happen with ray tracing.
Oh c'mon, the jump between 2D and 3D is totally different from having nicer shadows and reflections...

We already have great pre-canned shadows and reflections; it's not the same thing.
 

CJ_75

Member
60fps is never a goal for every game, and we're not downgrading next-gen consoles; we're simply arguing whether wasting resources on raytracing instead of traditional graphics is a good thing.

Sony and Microsoft are not Nintendo. They both focus on forward-thinking technologies like raytracing, VRR, Dolby Atmos, etc. It's called progression, just like they progressed in physical media from CD and DVD to Blu-ray, and in video output from sub-720p up to 4K with the latest consoles.
Buying a next-gen console without hardware raytracing support sounds like buying a new 4K/8K TV without HDR. Sure, it would still be able to output a great high-resolution picture, but it would definitely be less exciting.

At this stage, we just do not know the exact next-gen specs yet, and we are already arguing about trade-offs.
 
You're talking about cross-gen titles. If so, yes, CoD 2019 could probably have raytracing on Xbox Series X, but a game specifically designed for the Series X with raytracing isn't going to be a toggle on the menu screen like you have on PC. The Series X and PS5 aren't mid-gen refreshes; the PS4 Pro is simply a PS4 on steroids, so there you can have toggled settings, but the PS5 is a fresh generation.
If the game is built to have a raytracing feature, then it can be disabled. No one is going to build an engine in which every setting is turned on and can't be disabled; how would you test your shadows, lighting, ambient occlusion, LOD, textures, etc., if you couldn't turn off the other settings?

Consoles are "the poor man's PC". You won't get all the bells and whistles on the base-model car versus the high-end version, just like you won't get the functionality and settings options you can tinker with on a console versus the PC version. Hopefully this changes more in the upcoming console gen.

With the PS4 and Xbox One, consoles have been making strides toward being more like PCs by letting users change game modes, like a lower resolution output at 60fps versus a higher resolution output at 30fps.
 
If the game is built to have a raytracing feature, then it can be disabled. No one is going to build an engine in which every setting is turned on and can't be disabled; how would you test your shadows, lighting, ambient occlusion, LOD, textures, etc., if you couldn't turn off the other settings?

Consoles are "the poor man's PC". You won't get all the bells and whistles on the base-model car versus the high-end version, just like you won't get the functionality and settings options you can tinker with on a console versus the PC version. Hopefully this changes more in the upcoming console gen.

With the PS4 and Xbox One, consoles have been making strides toward being more like PCs by letting users change game modes, like a lower resolution output at 60fps versus a higher resolution output at 30fps.
Console games aren't made like PC games; everything is fixed, specifically engineered to run as well as it can on one console target. If raytracing is implemented, then it was meticulously engineered into the rendering pipeline. Consoles aren't PCs, where you target low, mid and ultra settings to work on the infinite array of PC rigs with different specs out there!
 
Sure, if you want to show ray-traced reflections with an added artistic touch. As a barebones example of what ray-traced reflections look like, I prefer the UE4 one. The breaking mirror wall never fails to amaze me.
Something has always bugged me: since we have SSR, wouldn't the logical next step be global reflections instead of raytracing?
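For what it's worth, SSR can't be made "global", because it only marches through what is already on screen. A toy 1D version of the screen-space march (the function name and shapes here are made up for illustration, not any engine's code) shows the limitation:

```python
def trace_ssr(depth_buffer, x, z, dx, dz, max_steps=64):
    """March a reflected ray across a toy 1D depth buffer.

    Returns the screen column the ray hits, or None when it leaves
    the screen or runs out of steps: the classic SSR failure cases
    that world-space ray tracing does not have.
    """
    for _ in range(max_steps):
        x += dx
        z += dz
        xi = int(round(x))
        if xi < 0 or xi >= len(depth_buffer):
            return None  # ray marched off-screen: no data to reflect
        if z >= depth_buffer[xi]:
            return xi    # ray went behind the stored depth: a hit
    return None
```

The moment the reflected ray leaves the screen, or the reflected object was never rendered (e.g. it's behind the camera), SSR has nothing to sample; that gap is exactly what ray tracing fills by intersecting actual world geometry.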
 
Sony and Microsoft are not Nintendo. They both focus on forward-thinking technologies like raytracing, VRR, Dolby Atmos, etc. It's called progression, just like they progressed in physical media from CD and DVD to Blu-ray, and in video output from sub-720p up to 4K with the latest consoles.
Buying a next-gen console without hardware raytracing support sounds like buying a new 4K/8K TV without HDR. Sure, it would still be able to output a great high-resolution picture, but it would definitely be less exciting.

At this stage, we just do not know the exact next-gen specs yet, and we are already arguing about trade-offs.
I'm all for raytracing except when it kills performance and other important graphical features. As I said before, what's the point of wasting resources on raytraced shadows and puddles instead of rendering a monster on screen with more polygons, textures and different effects?
 

JLB

Banned
Sure, if you want to show ray-traced reflections with an added artistic touch. As a barebones example of what ray-traced reflections look like, I prefer the UE4 one. The breaking mirror wall never fails to amaze me.

That Star Wars demo was made in UE4.
 

JLB

Banned
Oh c'mon, the jump between 2D and 3D is totally different from having nicer shadows and reflections...

We already have great pre-canned shadows and reflections; it's not the same thing.

Ray tracing is the missing piece needed to reach 100% photorealistic graphics. It's the culmination of almost 50 years of tech advances in the area.
 

GymWolf

Member
Ray tracing is the missing piece needed to reach 100% photorealistic graphics. It's the culmination of almost 50 years of tech advances in the area.
Holy hyperbole...

Not at this stage; right now it's a resource-hogging nice-to-have feature, nothing more.

Not even full-fledged CG films are really photorealistic, and they use far more complex ray tracing than anything that will come out of a 400-500 dollar console.

Hardware and optimization are nowhere near ready for this to be a revolution like 3D was.
 

VertigoOA

Banned
All techniques that effectively enhance the immersive experience of gaming are welcome.

I'd much rather see hardware used to improve games and standard interfaces than have CPU and GPU power applied to VR.
 
Ray tracing is the missing piece needed to reach 100% photorealistic graphics. It's the culmination of almost 50 years of tech advances in the area.
There's more to graphics than raytracing. You could say that as far as lighting goes raytracing is the holy grail, but as far as graphics go there's far more to be done even after raytracing. Mind you, Pixar and Hollywood have been using raytracing for years and have always updated their tech; a Pixar animation with raytracing in 1990 is different from a Pixar animation today.

There's far more to improve in games. First of all, graphics are currently rendered using polygons, which isn't a real representation of reality; everything in graphics is hollow. We should be using point clouds, as Dreams on PS4 does; until we overcome polygons, we're still only about 10% of the way to realistic rendering.
 

JLB

Banned
Holy hyperbole...

Not at this stage; right now it's a resource-hogging nice-to-have feature, nothing more.

Not even full-fledged CG films are really photorealistic, and they use far more complex ray tracing than anything that will come out of a 400-500 dollar console.

Hardware and optimization are nowhere near ready for this to be a revolution like 3D was.

The 3D revolution took long years. I fully recall how incredibly skeptical people were about 3D, and for many of the same reasons you mention.
Then we got the first great-looking arcade machines from Sega, then Voodoo made it possible for home PCs to play acceptable 3D games, and all of a sudden everyone was a believer.
It will happen exactly the same here: RT will become more and more achievable, eventually it will reach a good cost/benefit point, games will make use of it, there will be the first killer game using it, and from then on it will be the norm.
 

JLB

Banned
There's more to graphics than raytracing. You could say that as far as lighting goes raytracing is the holy grail, but as far as graphics go there's far more to be done even after raytracing. Mind you, Pixar and Hollywood have been using raytracing for years and have always updated their tech; a Pixar animation with raytracing in 1990 is different from a Pixar animation today.

There's far more to improve in games. First of all, graphics are currently rendered using polygons, which isn't a real representation of reality; everything in graphics is hollow. We should be using point clouds, as Dreams on PS4 does; until we overcome polygons, we're still only about 10% of the way to realistic rendering.

My prediction is that once we get machines capable of running games at a fully locked 144fps + mature HDR + native 8K + full, mature ray tracing, games will be 100%-ish photorealistic, to the point where it will be incredibly hard to tell whether they're real or virtual.
 
My prediction is that once we get machines capable of running games at a fully locked 144fps + mature HDR + native 8K + full, mature ray tracing, games will be 100%-ish photorealistic, to the point where it will be incredibly hard to tell whether they're real or virtual.
Games will eventually run with full raytracing and graphics will be photorealistic. However, 144fps will never be reached on consoles: given any powerful hardware, developers will always push graphics to the max at either 30fps or 60fps; they'll never prioritise 144fps. Such speeds will always be a PC thing.
 