
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

nowhat

Member
I'm buying a new TV and I'm wondering, will next gen be 8K native you think, or should I just go with a 4K?
If you're being serious - 8K native, hell no. What we'll probably see is the same thing as with a Pro - non-native 8K resolution, but UI/HUD native 8K.

But unless you have a huge TV, or are glued to it, you will not see the difference (between 4K and 8K) in typical viewing conditions.
 

LordOfChaos

Member
World's first PCIe 4.0 CPU? I didn't know that.

Yep, Intel is really late to PCI-E 4.0; it would be a really good time for AMD to capitalize on it with Rome. The brand new Mac Pro is shipping PCI-E 3.0 when AMD will be shipping 4.0 in 10 days.


 

xool

Member
I'm buying a new TV and I'm wondering, will next gen be 8K native you think, or should I just go with a 4K?

If 8k tv/monitor gets market penetration fast enough I'm expecting another mid gen refresh to do 8k properly..

..not expecting baseline consoles to do 8k outside a couple special cases

you might want to check out the 8k size/viewing distances tables first too (my desk isn't big enough, and there isn't really a wall wide enough to fit an 8k tv )

[angular resolution of eye = 1 arcminute = 0.00029 rad .. at ~2 meters this gives a minimum resolvable feature size of 6×10⁻⁴ m = 0.6 mm .. with 8000 pixels stepped across the diagonal that means a 4.8 m diagonal]

Run that one again - with a TV less than ~5 m (16 ft) diagonal at over 2 m viewing distance you won't be able to detect the difference between pixels - at smaller screen sizes there'll be a subjective "sharpness" difference that money can't put a value on.

tldr for most people 8k is just free anti-aliasing
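The bracketed calculation above can be sanity-checked in a few lines (a rough sketch of the same math, assuming a 1-arcminute eye resolution and 8000 pixels stepped evenly across the diagonal):

```python
import math

def min_8k_diagonal_m(view_dist_m, pixels_on_diagonal=8000,
                      eye_res_rad=math.radians(1 / 60)):
    """Smallest screen diagonal (in meters) at which individual pixels
    of an ~8000-pixel image become resolvable at the given distance."""
    # Smallest feature the eye can resolve at this viewing distance
    feature_m = view_dist_m * eye_res_rad  # ~0.6 mm at 2 m
    # Diagonal needed so each pixel is at least that feature size
    return pixels_on_diagonal * feature_m

print(round(min_8k_diagonal_m(2.0), 2))  # ~4.65 m, close to the ~5 m above
```

So at a 2 m couch distance you'd need a screen diagonal of nearly 5 m before individual 8K pixels become distinguishable, which matches the "free anti-aliasing" conclusion.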
 

Fake

Gold Member
It depends... do you only care about resolution? A lot of TVs will serve you.

Do you have any clue about input lag, contrast, image quality...? Then you gotta spend some more money.
Actually, cheap TVs are terrible in many ways.
Image quality, sound quality from the TV speakers, game mode... All of those CAN be compromised.
They're overall bad. I highly recommend against them; take my experience as an example.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
If 8k tv/monitor gets market penetration fast enough I'm expecting another mid gen refresh to do 8k properly..

..not expecting baseline consoles to do 8k outside a couple special cases

you might want to check out the 8k size/viewing distances tables first too (my desk isn't big enough, and there isn't really a wall wide enough to fit an 8k tv )

[angular resolution of eye = 1 arcminute = 0.00029 rad .. at ~2 meters this gives a minimum resolvable feature size of 6×10⁻⁴ m = 0.6 mm .. with 8000 pixels stepped across the diagonal that means a 4.8 m diagonal]

Run that one again - with a TV less than ~5 m (16 ft) diagonal at over 2 m viewing distance you won't be able to detect the difference between pixels - at smaller screen sizes there'll be a subjective "sharpness" difference that money can't put a value on.

tldr for most people 8k is just free anti-aliasing

The main problem (outside of not being able to tell the difference, like you said) is there's literally no 8K content. 90% or more of the channels on cable/satellite are 1080p/i right now. And 4K TVs have been selling for reasonable prices for at least 3 years now.

Imagine the bitrate that would be needed to watch a Netflix TV show in 8K HDR.
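As a very rough illustration of that bitrate point (hypothetical numbers: assuming bitrate scales linearly with pixel count from a ~16 Mbps 4K stream, ignoring codec efficiency gains):

```python
def scaled_bitrate_mbps(base_mbps, base_res, target_res):
    """Naively scale streaming bitrate with pixel count."""
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_mbps * target_px / base_px

# 8K has exactly 4x the pixels of 4K, so the naive estimate is 4x the bitrate
print(scaled_bitrate_mbps(16, (3840, 2160), (7680, 4320)))  # 64.0 Mbps
```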
 

R600

Banned
5700xt-Time-Spy-Table.png


Matches Vega 7, RTX 2070.

Beats Vega64 by a margin.
 
5700xt-Time-Spy-Table.png


Matches Vega 7, RTX 2070.

Beats Vega64 by a margin.
At only 9.75TF, mind you. However, Nvidia is still more efficient at 8TF. The retail cost of the 5700 XT is around $450. While that may not reflect console cost, the RX 580 was $250 and was the chip used for the Xbox One X, which retailed at $500. I think we will be looking at $600 consoles if this GPU is used.
 

LordOfChaos

Member
5700xt-Time-Spy-Table.png


Matches Vega 7, RTX 2070.

Beats Vega64 by a margin.

If we got that I wouldn't be too upset. Vega 64++ performance in a cheaper part is pretty well what I was expecting a year ago, any surprises to the upside will just be added good news. That's a fair lead over the 64 too.
 

CrustyBritches

Gold Member
Seems to support AMD's claim that 5700 XT beats Vega 64 by 14%, this is even higher. Using AMD's own slides, I came out to Radeon VII being ~9% more powerful than 5700 XT, while the 5700XT is about 14% faster than Vega 64. On PC I don't think pulling ~230W will be a huge deal, but in the console space that's a lot of power consumption.
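Chaining those quoted percentage gaps (normalizing Vega 64 to 100; the 14% and 9% figures come from the slides mentioned above) gives a rough relative ladder:

```python
# Normalize Vega 64 to 100 and chain the quoted percentage gaps
vega64 = 100.0
navi_xt = vega64 * 1.14      # 5700 XT ~14% faster than Vega 64
radeon_vii = navi_xt * 1.09  # Radeon VII ~9% faster than the 5700 XT
print(round(navi_xt, 1), round(radeon_vii, 1))  # 114.0 124.3
```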
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
At only 9.75TF, mind you. However, Nvidia is still more efficient at 8TF. The retail cost of the 5700 XT is around $450. While that may not reflect console cost, the RX 580 was $250 and was the chip used for the Xbox One X, which retailed at $500. I think we will be looking at $600 consoles if this GPU is used.

Yeah but the PCs aren't coming out for another 18 months, so.......we'll be fine.
 

R600

Banned
At only 9.75TF, mind you. However, Nvidia is still more efficient at 8TF. The retail cost of the 5700 XT is around $450. While that may not reflect console cost, the RX 580 was $250 and was the chip used for the Xbox One X, which retailed at $500. I think we will be looking at $600 consoles if this GPU is used.
@1878MHz clocks, so actually 9.6TF

In any case, a card between the 5700 and the XT (~8.3TF) would match Vega64 performance and be a few percent slower than the Nvidia 2070.
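The TF figures being quoted follow from the standard peak-FP32 formula (TFLOPS = 2 ops per shader per clock × shader count × clock); a quick sketch, assuming the 5700 XT's 2560 stream processors:

```python
def tflops(shaders, clock_ghz):
    """Peak FP32 throughput: 2 ops (FMA) per shader per clock."""
    return 2 * shaders * clock_ghz / 1000

print(round(tflops(2560, 1.905), 2))  # 9.75 at the XT's rated boost clock
print(round(tflops(2560, 1.878), 2))  # 9.62 at the 1878 MHz clock above
```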
 

Whitecrow

Banned
Actually, cheap TVs are terrible in many ways.
Image quality, sound quality from the TV speakers, game mode... All of those CAN be compromised.
They're overall bad. I highly recommend against them; take my experience as an example.
Luckily, I use my TV only for gaming, and since image quality is important for gaming, I will not settle for less than a $1000+ QLED.
 

Imtjnotu

Member
At only 9.75TF, mind you. However, Nvidia is still more efficient at 8TF. The retail cost of the 5700 XT is around $450. While that may not reflect console cost, the RX 580 was $250 and was the chip used for the Xbox One X, which retailed at $500. I think we will be looking at $600 consoles if this GPU is used.
Retail GPU prices do not translate directly into console GPU prices.
 

LordOfChaos

Member
While this is true, I don't see AMD shoving anything bigger than the 5700XT into the console. Heat would be an even bigger issue than it is now.

Larger, slower-clocked chips can actually be a power-saving measure.

IGPs got larger for years, even in years when they went sideways on performance, because they could clock lower to do the same task, saving power.
 

Fake

Gold Member
So something between 1.0 and 2.0? Consoles are using an RDNA 1.5 (of course, take into consideration that RDNA 1.0 doesn't have RT support).
 

CyberPanda

Banned
2Mk1kQM.jpg

Dictator from DF weighing in.


BsBRmcl.jpg

Even they aren't buying crapgamer's video. What third-party dev is gonna go on a no-name YouTube video and break NDA, facing a potential lawsuit and also getting fired? Crapgamer, btw, has used MisterXMedia as a source for his info on various occasions. Yea, ew.

That is not good for Next-gen :(

Similar hardware to what's found in the Xbox One X can reach 8K in that benchmark.

PS. Assuming next-gen will use a similar GPU... it needs a way better one imo.
Yea, it’s best to temper expectations. Otherwise a lot are gonna be disappointed.

And looking at the benchmark :

- Nvidia's GPUs still have 25% higher IPC compared to AMD
- AMD needs to keep focusing on IPC increases now to fully catch up
- next year Nvidia will have 7nm chips, which will widen the gap again.
 

ethomaz

Banned
BsBRmcl.jpg

Even they aren't buying crapgamer's video. What third-party dev is gonna go on a no-name YouTube video and break NDA, facing a potential lawsuit and also getting fired? Crapgamer, btw, has used MisterXMedia as a source for his info on various occasions. Yea, ew.
What did the podcast say? I have an idea, but it is good to see what Crapgamer has to say.
 

DJ12

Member
Even they aren't buying crapgamer's video. What third-party dev is gonna go on a no-name YouTube video and break NDA, facing a potential lawsuit and also getting fired? Crapgamer, btw, has used MisterXMedia as a source for his info on various occasions. Yea, ew.
Isn't that because, despite devs saying either that they are on par or that PS5 is stronger, the Xbox fans just cannot get hold of the idea that Scarlett might be weaker? Hence anyone saying one console is significantly stronger gets grasped onto with a vice-like grip.

Anyway, that guy was obviously parroting what we have all read from the various pastebin and Twitter comments.
 

CrustyBritches

Gold Member
You are looking at the wrong results. I doubt the Xbox One X reaches 5-6K here, let alone 8.
Yeah, the RX 580 should be between 4500-5000 (gfx). I did see some results in the 3DMark database that showed ~9k (gfx), but those are probably incorrectly reported from a CrossFire setup.
 
Retail GPU prices do not translate directly into console GPU prices.
I know, that's why I mentioned that. Where I wrote "it may not reflect console cost", it is still a good way to gauge generalized cost. We aren't going to get $500 to $800 13TF GPUs in these consoles. It's stupid to think so.

This whole conversation reminds me of a time when people thought Ryzen was going to be in the Xbox One X. The timelines match up too (we're about 16 months out); 6-8 months before release, people thought it would have Ryzen and were let down when they found out it was still Jaguar, even though the evidence was right in their faces that it was never going to happen.


 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I know, that's why I mentioned that. Where I wrote "it may not reflect console cost", it is still a good way to gauge generalized cost. We aren't going to get $500 to $800 13TF GPUs in these consoles. It's stupid to think so.

This whole conversation reminds me of a time when people thought Ryzen was going to be in the Xbox One X. The timelines match up too (we're about 16 months out); 6-8 months before release, people thought it would have Ryzen and were let down when they found out it was still Jaguar, even though the evidence was right in their faces that it was never going to happen.



But couldn't we get 10 TF GPUs with ray-tracing hardware built-in?
 
But couldn't we get 10 TF GPUs with ray-tracing hardware built-in?
RT is already confirmed. With the 5700 XT at 9.75TF, I'd expect that number to go down rather than up. Let's all try to be a bit realistic about this and consider anything above that number to be a blessing from the gaming gods. They are not going to be using desktop hardware either.
 

Mass Shift

Member
ResilientBanana

Neither MS nor Sony EVER pays MSRP. They're typically only spending about a third of the retail cost that we pay for CPUs and graphics cards.

$800 GPU? The consumer pays full MSRP. Because MS and Sony are buying millions of these chips, they would only be paying about $266, IF they're even paying that much. AMD gives them great arrangements because they are both a part of its daily revenue streams.
 

R600

Banned
ResilientBanana

Neither MS nor Sony EVER pays MSRP. They're typically only spending about a third of the retail cost that we pay for CPUs and graphics cards.

$800 GPU? The consumer pays full MSRP. Because MS and Sony are buying millions of these chips, they would only be paying about $266, IF they're even paying that much. AMD gives them great arrangements because they are both a part of its daily revenue streams.
Actually less than that; $100-130 per APU, I'd say.

Note that they put a whole lot of $$ into the design of these APUs, especially on 7nm, and that is not counted into these costs, but it is when AMD/Nvidia sell you a GPU.

They have separate cooling and memory that are not counted into the SoC cost.
 

Imtjnotu

Member
ResilientBanana

Neither MS nor Sony EVER pays MSRP. They're typically only spending about a third of the retail cost that we pay for CPUs and graphics cards.

$800 GPU? The consumer pays full MSRP. Because MS and Sony are buying millions of these chips, they would only be paying about $266, IF they're even paying that much. AMD gives them great arrangements because they are both a part of its daily revenue streams.
Don't think he gets that. The 7870 was around $350 at launch, but Sony's PS4 APU cost was $105.
 
ResilientBanana

Neither MS nor Sony EVER pays MSRP. They're typically only spending about a third of the retail cost that we pay for CPUs and graphics cards.

$800 GPU? The consumer pays full MSRP. Because MS and Sony are buying millions of these chips, they would only be paying about $266, IF they're even paying that much. AMD gives them great arrangements because they are both a part of its daily revenue streams.
Actually less than that; $100-130 per APU, I'd say.

Note that they put a whole lot of $$ into the design of these APUs, especially on 7nm, and that is not counted into these costs, but it is when AMD/Nvidia sell you a GPU.

They have separate cooling and memory that are not counted into the SoC cost.
Don't think he gets that. The 7870 was around $350 at launch, but Sony's PS4 APU cost was $105.


Maybe I'm not making myself clear. I understand they do not pay MSRP for their materials. It is clear to me that they receive a discounted price and that they simply buy the SoC. I'm basing my opinions on the Xbox One X, the most recent console released, and the price it released at. If we are trying to stay in the $500 price range, I have a hard time believing we will go beyond something similar to the 5700 series range.
 

R600

Banned
https://www.google.com/amp/s/www.t3...ly-moly-xbox-two-has-a-hell-of-a-fight-coming This, and based on dev comments and rumors, PS5 is indeed a monster, as RuthenicCookie leaked.
Timespy DX12 benchmarks :

Navi XT (9.6TF - 1.83 clock) - 8.7K

Vega64 (12.6TF) - 7.4K

Vega56 (10.5TF) - 6.3K

Firestrike benchmark:

Gonzalo (Zen2 3700 @ 3.2/Navi) - 20K

Vega64 with 2700X - 22K+

Vega56 with 2700X - 18-19K

What to make of this? Navi XT has higher synthetic benchmarks than Vega64 and Vega56.

If the same turns out to be true in Firestrike as well, we can expect Navi XT to score 22K+ with Zen2 (by my estimates).

So what is inside the next-gen boxes? I would assume, as I have since the beginning, something between the 5700 and the XT.

Best to check everything yourself rather than fall into the trap of overhyping vloggers.
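One way to read those Timespy numbers yourself is score-per-TF, which is essentially the per-clock-efficiency comparison being made (scores and TF figures taken from the list above):

```python
# (graphics score, TFLOPS) from the Timespy list above
cards = {
    "Navi XT": (8700, 9.6),
    "Vega64":  (7400, 12.6),
    "Vega56":  (6300, 10.5),
}
per_tf = {name: score / tf for name, (score, tf) in cards.items()}
for name, eff in per_tf.items():
    print(f"{name}: ~{eff:.0f} points per TF")
```

Navi XT lands around 900 points per TF versus roughly 600 for the Vega cards, which is why raw TF comparisons across architectures keep misleading people.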
 

CrustyBritches

Gold Member
Don't think he gets that. The 7870 was around $350 at launch, but Sony's PS4 APU cost was $105.
The summer before the PS4 launch, the 7790 was $99, the 7850 2GB was $150, and the 7870 2GB GHz Edition was $175, all with free games. The RX 470 (faster than the PS4 Pro) was $179, and the Xbox One X got an RX 480 OC/RX 580, which launched at $239.

Retail pricing isn't a direct indicator of much, except that cards in the same performance tiers have been getting more expensive. Looking at average gaming power consumption: 7850 = 90W, 7870 = 105W, RX 470 = 125W, RX 480 = 163W. Peak system power consumption: PS4 Base = 148W, PS4 Pro = 155W, Xbox One X = 175W. 5700 Pro = 180W (TBP), 5700 XT = 225W (TBP).
 