
Next-Gen PS5 & XSX |OT| Console tEch threaD

It's safe to say that both the PS5 and Xbox Scarlett are going to have Wi-Fi 6. Having Wi-Fi 6 connections at the consumer level is a whole other story... Has anyone experienced online play over a Wi-Fi 6 connection on PC?
I think most PC users use good ol' wired Ethernet. Home console users should do the same.

If we get better Wi-Fi, it will most likely be for wireless VR usage.
 
Since Microsoft is really pushing the cloud (streaming games is a huge part of it), will they finally be able to offload graphical compute to their servers and send the results back to Xbox Scarlett over Wi-Fi 6 (theoretically up to 10 Gbps), freeing the Zen 2 CPU and Navi RDNA GPU to prioritize other graphical demands? Maybe it's their secret sauce?
 
Come on guys.
We get 30fps because in "better graphics at 30fps or worse at 60fps" the former always wins.
Why would PS5 change that?
There's only one reason it could potentially change:


Read all the tweets; they explain how hard it is for artists these days, even on relatively "underpowered" consoles designed back in 2012.

All this AAA crunch is probably the reason TLOU2 got delayed, and if you ask me, I won't be surprised if it gets delayed once again until late 2020 (PS5 launch) to add some extra bling (RTX ON).

Since Microsoft is really pushing the cloud (streaming games is a huge part of it), will they finally be able to offload graphical compute to their servers and send the results back to Xbox Scarlett over Wi-Fi 6 (theoretically up to 10 Gbps), freeing the Zen 2 CPU and Navi RDNA GPU to prioritize other graphical demands? Maybe it's their secret sauce?
First of all, nobody has a 10 Gbps internet connection, so (W)LAN speed is irrelevant.

Regardless of that, it's a matter of input latency.
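To put the latency point in rough numbers, here's a minimal sketch with assumed, illustrative figures (the RTT, render and encode times are guesses, not measurements), comparing a 60 fps frame budget against a round trip to a remote render server:

```python
# Latency-budget sketch with assumed numbers: at 60 fps there are ~16.7 ms per
# frame, and any per-frame work offloaded to a server must fit the whole round
# trip (network + server render + encode/decode) inside that budget.

FRAME_RATE_HZ = 60
frame_budget_ms = 1000 / FRAME_RATE_HZ   # ~16.7 ms per frame

# Assumed, fairly optimistic figures for a nearby datacenter:
network_rtt_ms = 20       # internet round-trip time
server_render_ms = 8      # time the server spends on the offloaded work
encode_decode_ms = 5      # encode on the server + decode on the console

round_trip_ms = network_rtt_ms + server_render_ms + encode_decode_ms

print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"Round trip:   {round_trip_ms:.1f} ms")
print("Fits in one frame" if round_trip_ms <= frame_budget_ms
      else f"Adds ~{round_trip_ms / frame_budget_ms:.1f} frames of extra lag")
```

Even on a 10 Gbps LAN, the internet round trip alone blows past a single frame under these assumptions, which is why cloud assistance (if it happens at all) targets latency-tolerant work, not per-frame graphics.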
 
Last edited:

Imtjnotu

Member
Since Microsoft is really pushing the cloud (streaming games is a huge part of it), will they finally be able to offload graphical compute to their servers and send the results back to Xbox Scarlett over Wi-Fi 6 (theoretically up to 10 Gbps), freeing the Zen 2 CPU and Navi RDNA GPU to prioritize other graphical demands? Maybe it's their secret sauce?
 
Last edited:

Aceofspades

Banned
Since Microsoft is really pushing the cloud (streaming games is a huge part of it), will they finally be able to offload graphical compute to their servers and send the results back to Xbox Scarlett over Wi-Fi 6 (theoretically up to 10 Gbps), freeing the Zen 2 CPU and Navi RDNA GPU to prioritize other graphical demands? Maybe it's their secret sauce?

Short answer is no. Nothing over the internet can replace local hardware.
 
Since Microsoft is really pushing the cloud (streaming games is a huge part of it), will they finally be able to offload graphical compute to their servers and send the results back to Xbox Scarlett over Wi-Fi 6 (theoretically up to 10 Gbps), freeing the Zen 2 CPU and Navi RDNA GPU to prioritize other graphical demands? Maybe it's their secret sauce?

That’s not really how the internet works
 

Sosokrates

Report me if I continue to console war
I really don't see how the next-gen consoles can be more than 11.26 TFLOPS, unless they go $549 or higher.

44 CUs x 64 ALUs x 2 ops x 2000 MHz = 11,264,000 MFLOPS / 1,000,000 = 11.264 TFLOPS

In November 2020 an 11 TF RDNA2 card is going to be nearly top of the line; it will be on the same level as a 2080 Super.

I think at a $499 price point, 9-10 TFLOPS is a lot more realistic.
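For anyone who wants to sanity-check the arithmetic, here's a minimal sketch of the same back-of-the-envelope formula (the CU counts and the 2000 MHz clock are speculative inputs, not confirmed specs):

```python
def gpu_tflops(cus: int, clock_mhz: float, alus_per_cu: int = 64, ops_per_clock: int = 2) -> float:
    """FP32 TFLOPS = CUs * ALUs per CU * ops per clock (FMA counts as 2) * clock in MHz, scaled down."""
    mflops = cus * alus_per_cu * ops_per_clock * clock_mhz
    return mflops / 1_000_000  # MFLOPS -> TFLOPS

print(gpu_tflops(44, 2000))  # 11.264 TF, the figure above
print(gpu_tflops(36, 2000))  # 9.216 TF, closer to the $499 scenario
print(gpu_tflops(56, 2000))  # 14.336 TF, the 56-active-CU case discussed further down
```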
 

bitbydeath

Member
I really don't see how the next-gen consoles can be more than 11.26 TFLOPS, unless they go $549 or higher.

44 CUs x 64 ALUs x 2 ops x 2000 MHz = 11,264,000 MFLOPS / 1,000,000 = 11.264 TFLOPS

In November 2020 an 11 TF RDNA2 card is going to be nearly top of the line; it will be on the same level as a 2080 Super.

I think at a $499 price point, 9-10 TFLOPS is a lot more realistic.

56 CUs would send it over 11 TF, which is possible but admittedly on the high end.
 

Sosokrates

Report me if I continue to console war
56 CUs would send it over 11 TF, which is possible but admittedly on the high end.

56 CUs total? So 52 active?

Yes, any CU count up to RDNA's limit is possible, but the question is what's realistic. If we go by recent history, consoles have made a small profit: the PS4 and X1 had about a $380 BOM, so if next gen is $499 RRP then their BOM will be about $480, unless they go for a bigger loss, and I don't see any reason why they would.
 

bitbydeath

Member
56 CUs total? So 52 active?

Yes, any CU count up to RDNA's limit is possible, but the question is what's realistic. If we go by recent history, consoles have made a small profit: the PS4 and X1 had about a $380 BOM, so if next gen is $499 RRP then their BOM will be about $480, unless they go for a bigger loss, and I don't see any reason why they would.

It could realistically go as high as 60 CUs, 56 active.

What we do know is that AMD Ray Tracing is expected to arrive in next year's cards.


And it will obviously be a step up from their current cards.



Setting the minimum CU count for next-gen at 44 active.
 
In November 2020 an 11 TF RDNA2 card is going to be nearly top of the line
Top of the line? Closer to mid-range by late-2020 standards.

You're going to see 20 TF discrete GPUs next year, unless Nvidia goes crazy with Tensor/RT cores and compromises the FP32 ALU count.

AMD doesn't have that limitation.

It could realistically go as high as 60 CUs, 56 active.
56 CUs * 64 ALUs * 2 ops * 2000 MHz gives us 14.3 TF

Same as these 2 rumors/leaks:






The question is how they're going to cool that beast... maybe Sony has the answer with that V design/cooling patent. We'll see.
 

Sosokrates

Report me if I continue to console war
It could realistically go as high as 60 CUs, 56 active.

What we do know is that AMD Ray Tracing is expected to arrive in next year's cards.


And it will obviously be a step up from their current cards.



Setting the minimum CU count for next-gen at 44 active.

Yeah, at like $599.

Also, there's just as good a chance of 36 or 40 active CUs.
 
Last edited:

pawel86ck

Banned
Top of the line? Closer to mid-range by late-2020 standards.

You're going to see 20 TF discrete GPUs next year, unless Nvidia goes crazy with Tensor/RT cores and compromises the FP32 ALU count.

AMD doesn't have that limitation.


56 CUs * 64 ALUs * 2 ops * 2000 MHz gives us 14.3 TF

Same as these 2 rumors/leaks:






The question is how they're going to cool that beast... maybe Sony has the answer with that V design/cooling patent. We'll see.

It would be great, but this particular leak is without any doubt fake, because an "April 1" date is always used for jokes.
 
Yeah, at like $599.
$599 BOM is perfectly acceptable with a $100 loss.

They have highly profitable services/digital ecosystems, so why not?

We didn't have that luxury during the PS2/PS3 era.

Confirmation from Korean dank dark black: ReRAM in PS5. Link :messenger_winking_tongue:

Gaf onQ123 thread about reram: https://www.neogaf.com/threads/how-...large-amount-of-reram-as-the-ps5-ssd.1477282/
Huh, interesting!

QLC NAND is still a safe bet, but it will be funny if onQ123 is redeemed. :p
 

Sosokrates

Report me if I continue to console war
$599 BOM is perfectly acceptable with a $100 loss.

They have highly profitable services/digital ecosystems, so why not?

We didn't have that luxury during the PS2/PS3 era.


Huh, interesting!

QLC NAND is still a safe bet, but it will be funny if onQ123 is redeemed. :p

I'm talking about a $599 RRP with those specs.
Like I said, I see no reason to think they will be taking bigger losses than usual.
 
Come on guys.
We get 30fps because in "better graphics at 30fps or worse at 60fps" the former always wins.
Why would PS5 change that?
Keep in mind that this gen we had more games targeting 60 than the gen before, and even those targeting 30 hold it better. That's because once you reach a certain level of power, a given game just doesn't need more; developers won't always have the possibility or the desire to make things look better.

Obviously some teams will still target 30, but there will be fewer of them, unless ray tracing kills performance (like it does on PC) and they decide to use it anyway.
 

onQ123

Member
$599 BOM is perfectly acceptable with a $100 loss.

They have highly profitable services/digital ecosystems, so why not?

We didn't have that luxury during the PS2/PS3 era.


Huh, interesting!

QLC NAND is still a safe bet, but it will be funny if onQ123 is redeemed. :p

That looks more like someone taking info off the internet and making a story out of it.


Here is an actual link about Sony's ReRAM from 2019


 
Last edited:

Sosokrates

Report me if I continue to console war
You forget they also receive the parts much cheaper than consumers.

Indeed.
Depends what they go for really.

But I think they will have a BOM of $480-520 and an RRP of $499.

We should also remember that the 8-core Zen 2 will be more expensive than the 8-core Jaguar was in 2013; Zen 2 is a fully fledged desktop CPU, while the Jaguars were netbook-class. I think the SSD will be about the same price as the 500 GB HDDs were in 2013, because they will put NAND chips directly onto the motherboard, saving on cost. So maybe if they go with a total of 16 GB of RAM they might be able to get 56 total CUs, but I think it's unlikely. My bet is on 44-48.
 

Mr.XtremeGamer

Neo Member
The funny thing is what you guys are going to think after viewing the total nonsense Misterxmedia and all the hardcore fanboys from the Xbox side of things have posted. He blocked my main Twitter account after I called him out on his nonsense, so I created another one to spy on the absolute garbage he and his clueless fans are littering the place with. You'd be surprised at the amount of utter hubris and ignorance I came across. It seems they are so hellbent on PlayStation failing that they go to such extremes to prove the nextbox is superior without knowing anything substantial about it or the PS5. Take a look and see for yourselves.
 

We should also remember that the 8-core Zen 2 will be more expensive than the 8-core Jaguar was in 2013; Zen 2 is a fully fledged desktop CPU, while the Jaguars were netbook-class.
No, we have debunked this myth way too many times already:

https://adoredtv.com/zen2-chiplet-quality-examined-by-cost/ (spoiler alert: it's a $30 CPU with 7nm chiplet + 12nm I/O die, it's gonna be even smaller in a monolithic 7nm APU die)

You're not paying on the basis of whether it's a netbook or desktop IP; you're paying on the basis of die size.

8-core Zen @ 7nm requires roughly the same amount of die space as 8-core Jaguar @ 28nm. SonGoku did the calculations a long time ago.
 
Another myth we need to dispel (wafer prices):

2015:

[image: lithography wafer cost chart]


2019:

[image: 2019 wafer cost chart]


4 years ago a 14nm FinFET wafer cost $16000.

4 years later a 16nm FinFET (12-14nm is roughly the same process) wafer costs less than $6000.

3nm wafers are projected to cost over $15000 initially, almost as much as 14nm FinFET back in 2015. See a pattern here?

Why is that? Because building a new fab requires more money (better, more elaborate equipment) than the previous one, and you need to recoup that:


TL;DR: wafer prices are not static. Fab costs are insane, but improved yields and supply/demand curves mean the prices will drop over time.

There's a reason Nintendo waited until 2019 to take advantage of 16nm.
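To make the wafer-price/die-size point concrete, here's a minimal cost-per-die sketch. All inputs are assumptions for illustration (a 300 mm wafer, a ~360 mm² console-class APU, a guessed defect density, and the ~$6000 wafer price from the chart above); real yields vary a lot:

```python
import math

WAFER_DIAMETER_MM = 300   # standard wafer size
DIE_AREA_MM2 = 360        # assumed console-class APU area
DEFECT_DENSITY = 0.1      # defects per cm^2, assumed
WAFER_PRICE_USD = 6000    # ~16/14nm-class wafer price cited above

def dies_per_wafer(diameter_mm: float, die_area_mm2: float) -> int:
    """Common dies-per-wafer approximation: usable area minus edge losses."""
    radius = diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: exp(-area_cm2 * defect_density)."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

gross = dies_per_wafer(WAFER_DIAMETER_MM, DIE_AREA_MM2)
good = gross * die_yield(DIE_AREA_MM2, DEFECT_DENSITY)
print(f"{gross} gross dies, ~{good:.0f} good dies, ~${WAFER_PRICE_USD / good:.0f} per good die")
```

Under these assumptions you land somewhere around $50-60 per good APU die, which is why die size (and therefore CU count) is what the BOM argument really hinges on.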
 
Last edited:

joe_zazen

Member
Another myth we need to dispel (wafer prices):

2015:

[image: lithography wafer cost chart]


2019:

[image: 2019 wafer cost chart]


4 years ago a 14nm FinFET wafer cost $16000.

4 years later a 16nm FinFET (12-14nm is roughly the same process) wafer costs less than $6000.

3nm wafers are projected to cost over $15000 initially, almost as much as 14nm FinFET back in 2015. See a pattern here?

Why is that? Because building a new fab requires more money (better, more elaborate equipment) than the previous one, and you need to recoup that:


TL;DR: wafer prices are not static. Fab costs are insane, but improved yields and supply/demand curves mean the prices will drop over time.

There's a reason Nintendo waited until 2019 to take advantage of 16nm.

It's the cost of actually designing a chip on a smaller process that increases exponentially, separate from the manufacturing plant/equipment costs. I think that is where the increased costs come from, not the cost of the wafer. I.e. it will cost Sony a lot more to get a 7nm chip designed than it did for 28nm, even if wafer costs aren't that different.

At least that is what I have read.
 
It's the cost of actually designing a chip on a smaller process that increases exponentially, separate from the manufacturing plant/equipment costs. I think that is where the increased costs come from, not the cost of the wafer. I.e. it will cost Sony a lot more to get a 7nm chip designed than it did for 28nm, even if wafer costs aren't that different.

At least that is what I have read.
No, I wasn't talking about R&D/tape-out (you're right about the increase, but that's a one-off cost).

Manufacturing has tended to cost more since 20nm due to multi-patterning. EUV is the savior and holy grail of the silicon industry, because it will allow them to reduce the number of lithography masks (fewer manufacturing steps, so lower costs).

Not to mention the etching quality will drastically improve, which is crucial for many reasons (console longevity, higher clocks, less leakage, better density):

 

Mr.XtremeGamer

Neo Member
What? Are we now speculating that Scarlett will have 80 CUs? Where is R600 when you need him? Or does he only show up when PS5 is said to be stronger?

I love how Xbox fanboys always make the most ridiculous claims of power, only to be brought down to earth shortly after.
Fanboys are fanboys, man. They like to pull a lot of fantasies out of their ass, and they should stick them right back where they belong. Diehard Xbox fanboys, with Misterxmedia as the cherry on top, will go to great lengths to prove something, even if that something doesn't make sense to begin with. They claim they know how stuff in the computing industry works and call themselves insiders. They are insiders in their fantasy world, but the real world is different.
 
It's not the 80 CU part that's ridiculous (they can do whatever they want with experimental devkits, including having discrete Vega GPUs), it's the RDNA1 (Sony) vs RDNA2 (MS) part.

Both of them are going to be RDNA2 (due to HW RT), so what's the point of lowballing Sony? Unless you have an agenda (aka system wars) of course... fanboys gonna be fanboys, there's no cure for them. Let them dream!

5nm also seems a bit risky to me, I'm not sure if they can make it in time for a late 2020 release. Unless it gets delayed until 2021, which means Sony would get a next-gen headstart.
 