RDNA already has raytracing based on shaders. That's what PS5 will probably use. XSX uses hardware-accelerated raytracing, which is exclusive to RDNA2.
Oh boy, I can't believe some people are still running with software RT for PS5 even after the system architect of the PS5 said this:
"There is ray-tracing acceleration in the GPU hardware," Cerny told Wired, "which I believe is the statement that people were looking for."
Could it be possible for Sony to have an off-chip solution to RT?
I've been wondering how they will approach RT. MS, luckily enough, has DXR, which is their baby and is already being used on PC.
It is very detailed, but we'll see in a couple of days. RDNA2 purely for Microsoft doesn't make a lot of sense, but like I said, there are a lot of details that make it look like the complete flow of the presentation.
> Well, if RDNA2 is heavily tied to DXR, it may mean AMD is designing their RT hardware to be tailor-made for DXR at a hardware level, which would give them an advantage against RTX. Sony would not need these hardware DXR implementations because Sony doesn't use DXR.

Could be the case, but I can't imagine that AMD would go for that. I would imagine that they are very open when it comes to their GPU and don't necessarily tie everything to DirectX. We'll know tomorrow.
> Oh boy, I can't believe some people are still running with software RT for PS5 even after the system architect of the PS5 said this: "There is ray-tracing acceleration in the GPU hardware," Cerny told Wired, "which I believe is the statement that people were looking for."

Shader-based raytracing acceleration is not software raytracing. Software raytracing would happen on the CPU.
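To make the distinction concrete: "software raytracing" means the intersection math runs on the CPU in ordinary code. Here is a toy ray-sphere test in Python (a minimal illustrative sketch, not anything from either console's SDK):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit,
    or None on a miss. Solves |o + t*d - c|^2 = r^2 for t, assuming
    the direction vector d is normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None  # a hit behind the ray origin doesn't count

# Ray from the origin straight down +z toward a unit sphere at z = 5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

Shader-based acceleration runs this same math in GPU compute/pixel shaders instead, and hardware RT adds dedicated units for the ray/box and ray/triangle tests.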
I think it's more likely they have it on-die, e.g. they will have a customized chip: RDNA1 clusters, two Zen 2 clusters, and some RT hardware.
> Thanks. With their own proprietary API?

They have to. Right now the only raytracing API is DirectX Raytracing. It's only now being ported to Vulkan by the Khronos Group and Microsoft.
A kind reminder:
Sources: AMD Created Navi For Sony's PlayStation 5, Vega Suffered (www.forbes.com)
"Sources reveal not only that the PS5 uses Navi graphics, but nearly 2/3 of AMD's engineering resources were devoted to the task."
> Wasn't that back then when certain GAF posters were claiming or speculating that Navi was also exclusive to Sony's next-gen console?

Yeah, but now that RDNA2 exists it looks like they don't want it anymore.

> Wasn't that back then when certain GAF posters were claiming or speculating that Navi was also exclusive to Sony's next-gen console?

When I think about it, imagine you are AMD.
Two arch-rivals approach you to develop for them; what will you do?
Probably have two completely disconnected teams working on the respective 'semi-customs'.
Now, what kind of questions are NOT OK to ask?
"Hey AMD, what semi-custom are you doing for my competitor," I guess.
But what kind of questions are OK to ask?
"Hey AMD, what kind of new tech is on the horizon for us to use in upcoming consoles?"
AMD's annual R&D budget is $1.2-1.6 billion. I think it's too expensive for Sony or Microsoft to have an entire architecture exclusively to themselves. I don't recall where I read it, but the budgets the two had were said to be about $200-300 million for "semi-custom" efforts (including process node bumps).
So major architectural development will be shared by both; separate semi-custom teams could do relatively minor customization (compared to a full-blown new arch), e.g. the two memory buses on PS4, "onion" and "garlic", vs. that strange memory config Microsoft used.

> When I think about it, imagine you are AMD. [...]

Who knows, if you use the resulting chips in enough devices it might be feasible. But you made a point: it's expensive. If Sony decided in 2017 to launch in 2019, there would have been only one technology available, RDNA1, so they would've spent 200 million on that. And the resulting Ariel APU reflects that: it's RDNA1 (Navi 10). At the point where they decided to go for 2020 instead, would they have spent another 200 million on another architecture? That's a lot of wasted money. I think they just continued with what they had and upped the clock speeds to get the most out of it.

> When I think about it, imagine you are AMD. [...]

But Sony don't have the PC market with an API or games.

> AMD is behind schedule

I think you are misreading it.
> They have to. Right now the only raytracing API is DirectX Raytracing. It's only now being ported to Vulkan by the Khronos Group and Microsoft.

You have no idea what you are talking about. Vulkan Ray Tracing was worked on by Nvidia, AMD, and Intel, with Nvidia contributing more than the others. Microsoft had nothing to do with it. Plus, it's been available to developers since 2018, the same year Turing was released.
Ok? They used Microsoft's open-source DXC compiler via SPIR-V to translate HLSL to Vulkan. Microsoft did not work on it. It was entirely done by Nvidia, with contributions to the extension from Intel and AMD.
> There is no Vulkan RT, there is NV's proprietary extension on Vulkan (VK_NV_).

Should be pretty obvious that's what I was talking about, since I emphasized Nvidia and the Turing architecture in my reply.
> Thanks. With their own proprietary API?

They will likely use an updated version of the PS4 API.
> It was just confirmed in the AMD meeting, RDNA2 raytracing was co-engineered with Microsoft.

Nope.
Fake rumor.
AMD confirmed that the Microsoft DXR 1.1 API was co-developed with them.
RDNA 2 is developed by AMD only.
They confirmed RDNA 2 is 7nm (not 7nm+) and powers the two next-gen consoles (PS5 and Xbox Series X).
Nope.
> It was just confirmed in the AMD meeting, RDNA2 raytracing was co-engineered with Microsoft.

Just stop making a fool out of yourself. They were talking about the API software lol.
> well to be fair, co-developing the API will most likely have indirectly influenced RDNA2's development.
> if you work together closely on the software side that can also mean results of that work can change how the hardware is designed.
> but it's not like even if it was fully co-developed with MS that it would be unavailable to Sony, it's not like Microsoft can't use Blu-ray just because Sony co-developed it.

All DirectX versions are co-developed with the hardware vendors.
DXR 1.0 was co-developed with Nvidia to support RTX.
DXR 1.1 was co-developed with AMD to support RDNA 2.
I thought you knew that at least.
You can't make an API without the vendor's hardware help.
> Well, MS could have co-designed RDNA2 with AMD, which could explain why RDNA2 would not end up in Sony consoles. Who knows tho. All tinfoil hat stuff at this point.

RDNA2 was developed by AMD and Microsoft. Hmm. Didn't know that. There may be some weight behind this story. Watching the AMD conference now.
> RDNA2 was developed by AMD and Microsoft. Hmm. Didn't know that. [...]

You did not know because it was not true lol
> You did not know because it was not true lol

That's not what I'm hearing.
He is trying to fool you.
The DXR 1.1 API was co-developed by AMD and Microsoft.
> That's not what I'm hearing.

That's exactly what he says.
> That's exactly what he says.

I just saw it. I understand it now. Again, not too much of a techie. Still learning. I know this tho: 12 TF > 9 TF.
I can transcribe it for you.
At the 26:14 mark.
> I just saw it. I understand it now. Again, not too much of a techie. Still learning. I know this tho: 12 TF > 9 TF.

So, do you know nothing?
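For what it's worth, the teraflop figures being thrown around come from a simple formula: peak FP32 TFLOPS = compute units x 64 shader lanes x 2 ops per clock (one FMA) x clock speed. A quick sketch below; the 52 CU / 1.825 GHz figure is the announced Series X spec, while the 36 CU / 2.0 GHz part is purely hypothetical, just to show where a "9 TF" number could come from:

```python
def peak_tflops(cus, clock_ghz, lanes=64, ops_per_clock=2):
    """Peak FP32 throughput of a GCN/RDNA-style GPU:
    compute units x 64 shader lanes x 2 FMA ops per clock x clock (GHz)."""
    return cus * lanes * ops_per_clock * clock_ghz / 1000.0

# Announced Xbox Series X configuration: 52 CUs at 1.825 GHz
print(round(peak_tflops(52, 1.825), 2))  # 12.15

# Hypothetical 36-CU part at 2.0 GHz, one way to land near "9 TF"
print(round(peak_tflops(36, 2.0), 2))  # 9.22
```

Peak numbers say nothing about sustained clocks under load or per-clock efficiency differences between architectures, which is why raw TF comparisons only go so far.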
> It's pretty obvious you either don't understand the difference between NV's proprietary extensions and "Vulkan RT" or are trying to mislead people.

DXR is a vendor-neutral API from Microsoft. Vulkan does not have a standard yet, but it is being worked on. In the meantime, IMG Tech has their extension for RT on Vulkan, and Nvidia released their extension, which was contributed to by both Intel and AMD; they are all trying to create a standard Vulkan RT solution that is cross-vendor. Read the freaking Vulkan 1.1.133 specifications.
DXR is an actual vendor-neutral standard for raytracing.
There is no analogue in Vulkan.
> Well, if RDNA2 is heavily tied to DXR, it may mean AMD is designing their RT hardware to be tailor-made for DXR at a hardware level, which would give them an advantage against RTX. Sony would not need these hardware DXR implementations because Sony doesn't use DXR.

Just because there are DirectX drivers for some hardware feature doesn't mean it has no use for equivalent functions in other APIs. MS pulled the same disingenuous strategy when they implied that only the Xbox One could use megatextures because of DX12 + ESRAM.