
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

Ovech-King

Member
I see them going with 16GB at 512GB/s on a 256-bit bus.

Mind you, the PS4's GPU had access to 8GB of RAM at 176GB/s, more than the Radeon 7870 at the time, with 40% less FLOPS. On top of that it also had 4x more memory available to the GPU. So we might see an 8.5TF Navi with 16GB at 512GB/s in PS5/Scarlett.

Thanks. I, of all people, would be happy with a dedicated 16GB of GDDR6 for games at $400. However, I think Sony is looking into the future with longevity in mind, from 2020 to ???, so I strongly believe 512GB/s at LAUNCH, three years after the X released at 320GB/s, sounds too conservative. Even more so since they went out of their way to release the PS4 with 8GB of GDDR5 seven years prior, with less revenue from the platform.
 
Last edited:

R600

Banned
Why would they use a 256-bit bus after already having 384-bit? If anything they would keep it.
I am going by OQA leak for PS5.



There is no need for a 384-bit bus if they are using 16Gbps or even better 18Gbps chips (I think those will only be used in dev kits, as they are far too highly clocked for consoles). A 384-bit bus would take more space on the die, and you would have to use 14Gbps or, even worse, 12Gbps chips.
 

Darius87

Member
When thinking about TFLOPS, just look at the 4th column:

CU | MHz  | RDNA TFLOPS | GCN TFLOPS (+25% IPC) | GPU / console estimate
40 | 1465 | 7.5   | 9.38  | =~ RX 5700
40 | 1525 | 7.81  | 9.76  |
40 | 1625 | 8.32  | 10.4  | =~ RX 5700 XT (base)
40 | 1680 | 8.6   | 10.75 |
44 | 1465 | 8.25  | 10.31 |
44 | 1525 | 8.59  | 10.74 | =~ Vega 56
44 | 1625 | 9.15  | 11.44 | =~ RX 5700 XT
44 | 1680 | 9.46  | 11.83 | =~ GTX 1080
48 | 1465 | 9.0   | 11.25 |
48 | 1525 | 9.37  | 11.71 |
48 | 1625 | 9.98  | 12.48 | =~ Vega 64
48 | 1680 | 10.32 | 12.9  |
52 | 1465 | 9.75  | 12.19 |
52 | 1525 | 10.15 | 12.69 |
52 | 1625 | 10.82 | 13.52 |
52 | 1680 | 11.18 | 13.98 | =~ RTX 2080
56 | 1465 | 10.5  | 13.13 | =~ Xbox Scarlett?
56 | 1525 | 10.93 | 13.66 | =~ PS5?
56 | 1625 | 11.65 | 14.56 |
56 | 1680 | 12.04 | 15.05 |
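For anyone wanting to check or extend the rows, the figures follow the standard peak-FP32 formula (CUs × 64 stream processors × 2 ops per clock), with the last column crediting RDNA's claimed ~25% IPC gain over GCN. A quick sketch in Python:

```python
def gpu_tflops(cus: int, mhz: int) -> float:
    """Peak FP32 TFLOPS = CUs x 64 shaders x 2 FLOPs/clock x clock rate."""
    return cus * 64 * 2 * mhz * 1e6 / 1e12

def gcn_equivalent(tflops: float, ipc_gain: float = 0.25) -> float:
    """GCN-'equivalent' throughput, crediting RDNA's claimed ~25% IPC gain."""
    return tflops * (1 + ipc_gain)

print(round(gpu_tflops(40, 1465), 2))                  # 7.5   (the RX 5700 row)
print(round(gcn_equivalent(gpu_tflops(56, 1525)), 2))  # 13.66 (the speculated PS5 row)
```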
 
This has become less and less true, as you yourself pointed out. Most of the co-processors listed also exist on PC, e.g. TrueAudio, graphics, video encode/decode, etc. (general file compression is still handled in software on the 8th gen, AFAIK). What does the 8th gen do in dedicated hardware that's left to brute force on PC at this point? Almost every single generation has reduced its reliance on bespoke hardware in a linear pattern. See PS2 - PS3 - PS4: the trend has marched towards generalized compute.
TrueAudio DSP is not a baseline feature in PCs, it only exists in certain AMD GCN GPUs and it has been deprecated in favor of compute/ray-traced audio in Polaris and beyond.

What's interesting here is that PS4 Pro/XB1X still have the TrueAudio DSP for BC reasons, despite having Polaris-based GPUs. PCs just don't have this semi-custom luxury (mixing old and new stuff).

Audio processing has been CPU-based for a long time on PCs (since Vista changed the audio stack) and you have so many CPU cores these days. Hardware-accelerated EAX has been abandoned, and older drivers just don't work.

Video encoding/decoding exists in most PCs, but I see famous Twitch streamers saying all the time that you need either an expensive multi-core CPU for streaming or a dedicated 2nd PC (isn't that a waste of electricity?).

File decompression is handled by a dedicated zlib ASIC in both consoles. Many current-gen games allow you to start playing almost immediately while they decompress and install the rest of the game (if they had to tax the Jaguar CPU with decompression duties, you wouldn't be able to play, at least not at playable framerates).
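For illustration, here's a software-side sketch of the kind of streaming decompression that ASIC offloads — Python's zlib standing in for the hardware unit, with a made-up payload:

```python
import zlib

# Toy payload standing in for compressed game data.
payload = zlib.compress(b"game data " * 10000)

# Streaming decompression: output becomes usable chunk by chunk,
# which is what lets a game start before the whole file is installed.
d = zlib.decompressobj()
out = bytearray()
for i in range(0, len(payload), 4096):      # feed 4 KB at a time
    out += d.decompress(payload[i:i + 4096])
out += d.flush()

print(len(out))  # 100000 bytes recovered
```

On console the same idea runs in fixed-function silicon, so the Jaguar cores never see this loop.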

Anyway, my point was: clearly Jaguar has been a big bottleneck on the 8th gen. So, in response to the comment about overstating the importance of the CPU in 3D rendering: that has only been true on PC, where the baseline has been way above Jaguar for years. It'll be an important jump in the console space.
It depends on what your definition of a higher "CPU baseline" is.

Baseline IPC/ST? Sure.

Baseline in terms of supporting modern instruction sets?

Not really:

Jaguar is on par with Haswell instruction set-wise (not IPC-wise, obviously).

There are lots of PC folks with old ass CPUs and Zen 2 is going to increase the baseline (not just in terms of IPC) even higher. Steam hardware stats don't lie. :)
 

ANIMAL1975

Member
So, help me out here, guys. We've known for some time now that Sony was very involved in the development of Navi, and that a team of AMD people was even designated to work with Sony engineers in the customization process... Why are we now assuming that devkits were still Vega, GCN, etc.?
 
How the fuck are we getting anything even remotely close to 8K with 36 CUs????

This is gonna be the smallest leap in graphics ever.
You gotta be high on crack if you think we are legitimately getting legit 8K next gen, lmao. 8K is just marketing speak for "we support HDMI 2.1" and that's it. PCs can barely run 4K 60fps on an RTX 2080. 8K is just 4K upscaled.
 

R600

Banned
So, help me out here, guys. We've known for some time now that Sony was very involved in the development of Navi, and that a team of AMD people was even designated to work with Sony engineers in the customization process... Why are we now assuming that devkits were still Vega, GCN, etc.?
Because the first dev kits went out late last year, which makes a 7nm SoC with Navi and Zen 2 extremely unlikely. They will probably just now start to receive dev kits with the real chip, but if I had to guess, only first party at the beginning.

This was my theory from the start, when a few sources let out info about a 12.9TF GPU in the dev kit. No way this was possible half a year ago, much less after we found out that a 13TF Navi GPU would require a tower PC case instead of a console box.
 

LordOfChaos

Member
Video encoding/decoding exists in most PCs, but I see famous Twitch streamers saying all the time that you need either an expensive multi-core CPU for streaming or a dedicated 2nd PC (isn't that a waste of electricity?).

...


It depends on what your definition of a higher "CPU baseline" is.

Baseline IPC/ST? Sure.

Baseline in terms of supporting modern instruction sets?

Not really:

Jaguar is on par with Haswell instruction set-wise (not IPC-wise, obviously).

There are lots of PC folks with old ass CPUs and Zen 2 is going to increase the baseline (not just in terms of IPC) even higher. Steam hardware stats don't lie. :)


That would be for video quality compared to dedicated hardware; on consoles, stream quality is kind of :poop:. You can get the same results just as efficiently using a dedicated encoder on a PC too. Even on consoles, pro streamers will use external capture hardware rather than the dedicated blocks inside, which take shortcuts to do it so efficiently.

Same reason high-end video production uses Xeons rather than Quick Sync, which is far faster.


Obviously I'm talking about performance when relating Jaguar and Zen 2: it's an important step up when coming from such a weak spot. Support all the modern features you want, but it's an objectively slow CPU core by now; smartphones passed it a long while back. The only saving grace was having 7 of them available, but even the aggregate performance isn't much anymore.




(note also the difference between "Supporting" AVX and making much better use of it)

Didn't think it would be controversial to say Zen 2 will be a very nice step up from where we are lol
 
Last edited:
8K is not happening anytime soon, you guys are funny. People are barely upgrading to 4K TVs these days, and the "Go" streaming apps from most TV networks are making people less interested in purchasing new TVs. So gaming companies are far from doing this; Sony will only do it to push their 8K TVs, and most will just use the 8K slogan as a marketing gimmick to catch people off guard.
 

Farrell55

Banned
I see them going with 16GB at 512GB/s on a 256-bit bus.

Mind you, the PS4's GPU had access to 8GB of RAM at 176GB/s, more than the Radeon 7870 at the time, with 40% less FLOPS. On top of that it also had 4x more memory available to the GPU. So we might see an 8.5TF Navi with 16GB at 512GB/s in PS5/Scarlett.
No, only 5GB is available to devs on the PS4.
And don't forget this 5GB is shared between main RAM and VRAM; in the best case we have ~3GB for the GPU!

The Radeon HD 7870 had 2GB of VRAM.
 
Last edited:

Let's see if AVX2 will become the new baseline for next-gen games or if we will still see petitions for non-AVX binaries.

People may not realize it, but decade-old PCs are holding back video games more than consoles.

Not to mention the SSD baseline (40x improvement) according to MS... good luck brute-forcing that. :)

(note also the difference between "Supporting" AVX and making much better use of it)
Jaguar's AVX support is good enough to emulate tri-core Xenon's VMX128 for 360 BC. I'd call you crazy if you told me 5 years ago that Jaguar at 1.75 GHz would be able to emulate PowerPC at 3.2 GHz, lol.

It's no coincidence that the equivalent emulator on PC has the exact same requirements.

 

xool

Member
A 384-bit bus would take more space on the die and you would have to use 14Gbps or, even worse, 12Gbps chips.

You can get 1GB (8Gb) chips at 16Gb/s ( https://www.samsung.com/semiconductor/dram/gddr6/ ) .. theoretically you could use 16 of those for a 512-bit bus, or a mixture for a 384-bit bus, etc.

[obv. right about space, and price too]

CU | MHz  | RDNA TFLOPS | GCN TFLOPS (+25% IPC) | GPU / console estimate
40 | 1465 | 7.5   | 9.38  | =~ RX 5700
40 | 1525 | 7.81  | 9.76  |
40 | 1625 | 8.32  | 10.4  | =~ RX 5700 XT (base)
40 | 1680 | 8.6   | 10.75 |
44 | 1465 | 8.25  | 10.31 |
44 | 1525 | 8.59  | 10.74 | =~ Vega 56
44 | 1625 | 9.15  | 11.44 | =~ RX 5700 XT
44 | 1680 | 9.46  | 11.83 | =~ GTX 1080
48 | 1465 | 9.0   | 11.25 |
48 | 1525 | 9.37  | 11.71 |
48 | 1625 | 9.98  | 12.48 | =~ Vega 64
48 | 1680 | 10.32 | 12.9  |
52 | 1465 | 9.75  | 12.19 |
52 | 1525 | 10.15 | 12.69 |
52 | 1625 | 10.82 | 13.52 |
52 | 1680 | 11.18 | 13.98 | =~ RTX 2080
56 | 1465 | 10.5  | 13.13 | =~ Xbox Scarlett?
56 | 1525 | 10.93 | 13.66 | =~ PS5?
56 | 1625 | 11.65 | 14.56 |
56 | 1680 | 12.04 | 15.05 |

Ugh. Fixed your table... (no need to thank me :) )

[edit - that's a useful table .. thank you :)]
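As a sanity check on the chip math in this post (the chip count and speed are the hypothetical values being discussed, not confirmed specs): each GDDR6 chip contributes a 32-bit channel, so bus width, capacity, and peak bandwidth fall straight out of the chip count:

```python
def gddr6_config(chips: int, chip_gb: int, pin_gbps: float):
    """Bus width, capacity, and peak bandwidth for a GDDR6 setup.
    Each chip is a 32-bit channel; GB/s = bus bits x Gb/s per pin / 8."""
    bus_bits = chips * 32
    capacity_gb = chips * chip_gb
    bandwidth_gbs = bus_bits * pin_gbps / 8
    return bus_bits, capacity_gb, bandwidth_gbs

# 16 of Samsung's 1 GB, 16 Gb/s chips, as suggested above:
print(gddr6_config(16, 1, 16))  # (512, 16, 1024.0): 512-bit bus, 16 GB, ~1 TB/s
```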
 
Last edited:

D-Dude

Member
The important thing to remember about "Terraflops" is that performance will vary depending on whether we're talking about a Monotheistic or Polytheistic die. Terraflops refers to Monotheistic performance, while Teraplots addresses Polytheistic die. Also, the presence of Kugo cores can impact the equation as well, for instance, when talking Kugoson-based GPUs.

If you talk in ancient egyptian I probably understand that better :p
 

CrustyBritches

Gold Member
If you talk in ancient egyptian I probably understand that better :p
It was from GAF's fake Pastebin phase, when next-gen was whatever you wanted it to be. It wasn't RDNA, it was UrDNA. Thusly, the Kugoson GPU was born. I miss those days.:messenger_moon:

If you were serious, then just go look at AMD's Navi 5700 XT/Pro slides for performance in games.
 
Yeah, I am lowering back down to team $399 and 9TF. People forget that Jim Ryan, Sony's CEO, said they want a faster transition this upcoming gen. Selling a PS5 for $499 doesn't match that goal. SonGoku SonGoku
 
There is no need for a 384-bit bus if they are using 16Gbps or even better 18Gbps chips (I think those will only be used in dev kits, as they are far too highly clocked for consoles). A 384-bit bus would take more space on the die, and you would have to use 14Gbps or, even worse, 12Gbps chips.
Consoles tend to use cheaper/slower (B-grade) DRAM chips and wide buses. Using highly clocked, A-grade chips doesn't make a lot of sense.

Both PS4 Pro and XB1X are a testament to this. What's interesting is that they use the exact same type of GDDR5 (same speed at 6.8 Gbps), but XB1X has a wider bus at 384-bit.

$399 = 256-bit
$499 = 384-bit
 
Last edited:
Yeah, I am lowering back down to team $399 and 9TF. People forget that Jim Ryan, Sony's CEO, said they want a faster transition this upcoming gen. Selling a PS5 for $499 doesn't match that goal. SonGoku SonGoku
Where did he say that and why would they want a "faster transition" to next-gen? Isn't the PS4 still selling well? That console will have pretty long legs for casuals (FIFA, Fortnite etc.), possibly as long as PS2. And let's not forget the cross-gen period, which will be longer this time around, I reckon.

If anything, it's MS that would want to abandon the tarnished XB1 brand ASAP, not Sony.
 

Imtjnotu

Member
I am going by OQA leak for PS5.



There is no need for a 384-bit bus if they are using 16Gbps or even better 18Gbps chips (I think those will only be used in dev kits, as they are far too highly clocked for consoles). A 384-bit bus would take more space on the die, and you would have to use 14Gbps or, even worse, 12Gbps chips.

If they went with 256-bit we get 576GB/s, so I'm guessing if they end up going with 320-bit we're somewhere in the range of 700GB/s, which falls in line with early rumors. 384-bit gives us 800+GB/s, and I don't think that's needed yet.

But you seem to know a shit ton more on the hardware side than I do.
 
Last edited:

Imtjnotu

Member
Still basing specs off the first-gen Navi GPU that doesn’t support Ray Tracing.

When will you learn folks.
This is why I don't think we're getting vanilla Navi. I'm not sure about Navi 20 either, but somehow both companies will put their own custom twists on RDNA.
 

Imtjnotu

Member
Where did he say that and why would they want a "faster transition" to next-gen? Isn't the PS4 still selling well? That console will have pretty long legs for casuals (FIFA, Fortnite etc.), possibly as long as PS2. And let's not forget the cross-gen period, which will be longer this time around, I reckon.

If anything, it's MS that would want to abandon the tarnished XB1 brand ASAP, not Sony.
It's selling well, but with the PS5 rumors sales have slowed. Switch will dominate this year until the next-gen consoles.
 

bitbydeath

Member
This is why I don't think we're getting vanilla Navi. I'm not sure about Navi 20 either, but somehow both companies will put their own custom twists on RDNA.

Even if they customized Navi 10 to allow for it, where would RT go? It looks like it's fighting for space as it is. Are they then gonna reduce the spec further to allow for RT?

It doesn't add up.
 
It's selling well, but with the PS5 rumors sales have slowed. Switch will dominate this year until the next-gen consoles.
Trust me, the average consumer is totally oblivious about PS5 rumors and he will gladly buy a PS4 Super Slim 7nm at $199 late this year (Black Friday/Xmas season).

Switch is doing well too, but it serves different gaming tastes.
 

Imtjnotu

Member
Even if they customized Navi 10 to allow for it, where would RT go? It looks like it's fighting for space as it is. Are they then gonna reduce the spec further to allow for RT?

It doesn't add up.
Based on the Scarlett die size from B3yond, I'm guessing they have an extra ~60mm² of space if they go with 40 CUs. If they go with more and let the extra CUs crunch the numbers, they can get away with 50 CUs.
 

Imtjnotu

Member
Trust me, the average consumer is totally oblivious about PS5 rumors and he will gladly buy a PS4 Super Slim 7nm at $199 late this year (Black Friday/Xmas season).

Switch is doing well too, but it serves different gaming tastes.
I somewhat beg to differ. The handful of casual gamers I work with have all asked me, "Is it true there is a PS5 coming?" These forums have a lot of reach into the casual side of games. But I do think a lot of it has to do with the Switch and rumors. I'm a PlayStation kind of guy, but I still bought a Switch for the portability and SSB.
 

FrostyJ93

Member
It's pretty obvious that the CPU and SSD are where most of the improvement will be. RAM and GPU are taking a backseat. Kind of a reversal of the current gen, where RAM and GPU were the focus.
 

Ovech-King

Member
If they went with 256-bit we get 576GB/s, so I'm guessing if they end up going with 320-bit we're somewhere in the range of 700GB/s, which falls in line with early rumors. 384-bit gives us 800+GB/s, and I don't think that's needed yet.

To help you out: if that leak is true and the chips are 18Gbps, you're looking at 720GB/s if they settle at 20GB of GDDR6, or 864GB/s if they go with 24GB. Given the release is in about 16 months, and for longevity, my money is on one of those two possible final specs.
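The arithmetic behind those two figures, for anyone following along (the 18Gbps pin speed and 2GB chip density are the leak's assumptions, not confirmed specs):

```python
def bandwidth_gbs(bus_bits: int, pin_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin speed (Gb/s) / 8."""
    return bus_bits * pin_gbps / 8

# 20 GB = ten 2 GB chips -> 320-bit bus; 24 GB = twelve -> 384-bit.
print(bandwidth_gbs(320, 18))  # 720.0 GB/s
print(bandwidth_gbs(384, 18))  # 864.0 GB/s
```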
 
Last edited:

Imtjnotu

Member
To help you out: if that leak is true and the chips are 18Gbps, you're looking at 720GB/s if they settle at 20GB of GDDR6, or 864GB/s if they go with 24GB. Given the release is in about 16 months, and for longevity, my money is on one of those two possible final specs.
Prices on RAM should also be dropping, especially with Samsung starting mass production last year. I wouldn't mind either of those at those speeds.
 
It's pretty obvious that the CPU and SSD are where most of the improvement will be. RAM and GPU are taking a backseat. Kind of a reversal of the current gen, where RAM and GPU were the focus.
How is the GPU taking a backseat? 8TF is a 7x leap over the PS4. Think of what Sony devs did with 1TF.
 

quest

Not Banned from OT
The question is how much bandwidth is really needed if both are sub-10TF? I thought one of the big Navi improvements was the compression to help alleviate AMD's excessive bandwidth usage. Would just under double Scorpio, ~600GB/s, be enough?
 

SonGoku

Member
Notice he keeps talking about not expecting much teraflops talk from MS and Sony.

You guys ain't getting more than 10TF.
DF is just taking the throw-shit-at-the-wall-and-see-what-sticks approach.
At best they are unreliable, at worst clickbaiting.
I'm staying with 36CU @ 1.8GHz,
Doesn't make sense, considering 64CU @ 1500MHz would consume the same or less.
Look at the Navi 5700 TDP: it is a 180W GPU with a 251mm² die. Just for the record, that is more TDP and a bigger die than either of the GPUs put in the PS4/Pro or Xbox One X. Put some RT there and it is clear anything above 9TF will be extremely unlikely.
64-72 CU GPUs would fit in 380-400mm² APUs, accounting for RT hardware. With that, 11-12TF is possible.
Nothing seems off to you? Perhaps it's the 14TF Navi that is a bit off, as this would be a 400mm² part with ~280W TDP (without Zen 2 and memory controllers, mind you)?
400mm² would fit a 72CU GPU on a Zen 2 APU with RT HW, and it would be way more power efficient than 36CU @ 1.8GHz. Nothing seems off to you about expecting 1.8GHz clocks and 18Gbps chips (more power hungry)?

A 384-bit bus would take more space on the die and you would have to use 14Gbps or, even worse, 12Gbps chips.
The die size calculations I did account for a 384-bit bus and RT HW.
Why would you be limited to 12-14Gbps chips?
after we found out that getting a 13TF Navi GPU would require a tower PC case instead of a console box
When did we find that out, lol? Source?
Selling a PS5 for $499 doesn't match that goal @SonGoku
Uh, what did I miss?
 
Last edited:

quest

Not Banned from OT
How is the GPU taking a backseat? 8TF is a 7x leap over the PS4. Think of what Sony devs did with 1TF.


The OG PS4 is close enough to 2TF to call it that. It is really a ~5x leap, the smallest we have ever seen in raw power, by a mile.
How does higher bandwidth affect game graphics?
I'm not an expert, but think of the fuel pump in a car engine. The bigger the fuel pump, the more fuel can be delivered to the engine. If an engine can't get enough fuel, it can't make its rated HP. But you can only send so much before it becomes pointless.
 

joe_zazen

Member
.
Physically the stronger SKU with a software toggle to bring it down to the weaker, surely, as the architectures would be the same otherwise. No way they would have just shipped the weaker one with no way for devs to see their enhancements for the stronger.



I'm finding it hard to believe that while launching the same year as next gen RDNA, concurrent to Scarlett, and being the first to mention RT at all, they would miss out on the hardware based version of it in next gen RDNA. They'd know the roadmap before any of us did.

Why do people think AMD is going to have an RT solution that will be any good, and not a big waste of die space? What is the reason their next-gen cards don't have it?
 

pawel86ck

Banned
Yes... "I am a 3rd party developer and my father works at Nintendo, so I know every part of Sony's and some 3rd parties' next-gen plans"
What's interesting is that that particular leak is very old, and since it was posted some of the information has indeed been confirmed, so it may be the real deal after all.

But I think it's possible that one year ago the PS5 dev kit was using the Vega architecture, so 13-14TF made sense back then, while the final product can end up around 10TF Navi.

But I still hope for the best scenario (around 12TF on the Navi architecture, not just 8TF). AMD surprised us with the Navi architecture while people were predicting the worst (some people said Navi would be even worse than Vega), and maybe we will see some additional surprises 1.5 years from now. I doubt Sony will use a 5700 XT equivalent (Navi 10), but more likely a Navi 10/20 mix, because the PS5 will support HW RT while Navi 10 doesn't support it, for sure.
 

quest

Not Banned from OT
.


Why do people think AMD is going to have an RT solution that will be any good, and not a big waste of die space? What is the reason their next-gen cards don't have it?

Glad to see someone else questioning AMD's RT solution; if Nvidia has issues, I have doubts. My guess for why Navi does not have it is that AMD does not have the resources for two next-generation consoles and graphics cards. Hence Navi being a hybrid solution until RDNA2. That is why I have serious doubts that both consoles would have different RT solutions.
 

SonGoku

Member
It's an assumption I made based on my belief about the Gonzalo and Reddit OQA leaks, the Scarlett SoC shot, and Navi 10 power consumption. For me, RT is neither here nor there when it comes to what I want out of next-gen = 60fps. I love how the Zen 2 CPU has become passé, and instead of people gushing over that, the focus is on shiny shit and RT, like I mused in my other post.

I have hope because MS has made some comments about 60fps. Hopefully it will become industry practice to just have "Performance" and "Visual" modes for games.
It makes no sense for Sony to go with obsolete tech; this gen they demonstrated the opposite:
The PS4 launched with a GCN feature set on par with the highest-end AMD GPU of that same year.
The PS4 Pro launched with a GPU that had the latest available AMD GPU tech (Polaris) and then some future features (Vega).

On top of that, Navi 10 is way too weak and power inefficient for a console; they need a bigger chip clocked lower.
 
Last edited:
Glad to see someone else questioning AMD's RT solution; if Nvidia has issues, I have doubts. My guess for why Navi does not have it is that AMD does not have the resources for two next-generation consoles and graphics cards. Hence Navi being a hybrid solution until RDNA2. That is why I have serious doubts that both consoles would have different RT solutions.

Here...

Seems pretty straightforward to me.

AMD is coordinating with partners before they release anything about Navi.

Sony have their own hand-rolled RT implementation, so they went ahead and announced it in Wired.

AMD were quiet about Navi RT until after the MS reveal at E3.

So it looks like MS is cherry-picking AMD RT features from RDNA 2.0, or whatever it's called these days.

Sony has no need to wait for a DirectX-compatible RT implementation from AMD. MS, on the other hand...

MS haven't a clue what's in the Sony RT implementation.

Whoever has the more elegant, streamlined RT solution will probably have die space to spare that can be utilised for additional CUs that a more bloated RT solution wouldn't be able to accommodate.
 

FrostyJ93

Member
DF is just taking the throw-shit-at-the-wall-and-see-what-sticks approach.
At best they are unreliable, at worst clickbaiting.

Doesn't make sense, considering 64CU @ 1500MHz would consume the same or less.

64-72 CU GPUs would fit in 380-400mm² APUs, accounting for RT hardware. With that, 11-12TF is possible.

400mm² would fit a 72CU GPU on a Zen 2 APU with RT HW, and it would be way more power efficient than 36CU @ 1.8GHz. Nothing seems off to you about expecting 1.8GHz clocks and 18Gbps chips (more power hungry)?


The die size calculations I did account for a 384-bit bus and RT HW.
Why would you be limited to 12-14Gbps chips?

When did we find that out, lol? Source?

Uh, what did I miss?

No, it's not clickbait. You just don't like what you hear, so you'll refuse to believe it until proven wrong.
 
Last edited:
DF is just taking the throw-shit-at-the-wall-and-see-what-sticks approach.
At best they are unreliable, at worst clickbaiting.

Doesn't make sense, considering 64CU @ 1500MHz would consume the same or less.

64-72 CU GPUs would fit in 380-400mm² APUs, accounting for RT hardware. With that, 11-12TF is possible.

400mm² would fit a 72CU GPU on a Zen 2 APU with RT HW, and it would be way more power efficient than 36CU @ 1.8GHz. Nothing seems off to you about expecting 1.8GHz clocks and 18Gbps chips (more power hungry)?


The die size calculations I did account for a 384-bit bus and RT HW.
Why would you be limited to 12-14Gbps chips?

When did we find that out, lol? Source?

Uh, what did I miss?
Did you read my whole post on how Jim Ryan wants a faster transition?
 

FrostyJ93

Member
How did you come to that conclusion?
I explained my thought process

Both the DF people and Matt from Era (can't find the exact quote) said there is a reason they aren't talking TF: because it's not going to be that high. DF have been alluding to this for months now. Stop playing dumb.
 

LordOfChaos

Member
.


Why do people think AMD is going to have an RT solution that will be any good, and not a big waste of die space? What is the reason their next-gen cards don't have it?


Next-gen RDNA, the one launching closer to concurrently with the next-gen consoles, will have it. RDNA 1.0 doesn't, I assume because of the current cost of 7nm die area, or because the development still isn't done. The first generation of their new architecture was all about increasing the efficiency per flop of traditional raster performance; with that under their belt they move on, I guess.

If it's any good, well, none of us can really know that yet, but even if it falls short of Nvidia's then year-plus-old solution, hardware-based RT still makes for many times more rays than software.

 
Last edited:

SonGoku

Member
Both the DF people and Matt from Era (can't find the exact quote) said there is a reason they aren't talking TF: because it's not going to be that high. DF have been alluding to this for months now. Stop playing dumb.
Matt commented about the specs not being locked yet, and MS/Sony trying to hide the final numbers from each other.
Not because the TFs are low or high. Sony hasn't revealed the PS5 yet; they've merely done a superficial tease, and didn't even mention RAM.
It makes no sense to give hard numbers without a proper reveal this far in advance.

DF are just throwing shit at the wall based on their interpretation of rumors; they have no insider info.
 
Last edited:

FrostyJ93

Member
Matt commented about the specs not being locked yet, and MS/Sony trying to hide the final numbers from each other.
Not because the TFs are low or high. Sony hasn't revealed the PS5 yet; they've merely done a superficial tease, and didn't even mention RAM.
It makes no sense to give hard numbers without a proper reveal this far in advance.

DF are just throwing shit at the wall based on their interpretation of rumors; they have no insider info.

Whatever you say, man. You're gonna be the one disappointed by a number that ultimately doesn't matter to your real-world experience, so -shrug-
 

SonGoku

Member
Whatever you say, man. You're gonna be the one disappointed by a number that ultimately doesn't matter to your real-world experience, so -shrug-
I'm not trying to change your mind; you can have your own expectations.
I'm just pointing out the flaw in using DF as a source for anything next-gen related. Put simply, they know nothing, and are using rumors and wild speculation to remain a relevant name.
If it's any good, well, none of us can really know that yet, but even if it falls short of Nvidia's then year-plus-old solution, hardware-based RT still makes for many times more rays than software.
I'm hoping it's more basic/simple than Nvidia's implementation, so that it's "free" fixed-function lighting effects without the big performance penalty we see in RTX on/off comparisons.
 
Last edited: