Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
Jaguar was the best solution for the money around 2013. The alternatives were ARM Cortex-A15, IBM PowerPC A2 (a PPE replacement), and Intel Atom Silvermont.

Who gives a fuck about Jaguar in 2020? Read again what I'm saying, we need to throw those Jaguars behind us as far as possible, maybe ask Elon Musk to throw them into outer space.
 
I'm not sure where you're getting that information, but the world doesn't agree with that. Several devs have already called that out as BS. Can't wait for the July event myself to shed more light on the matter and clear the fog.
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.

I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.
 
AMD's SmartShift is not about cooling. A 1000-watt TDP cooling solution wouldn't solve the VRM power-feed budget.


Yes, because when you make a video game you always try to hit 100% GPU and CPU load all the time. Who cares what happens during gameplay; the only important thing is to reach that load on both chips.
 
Those are the main highlights: lossless, as @PaintTinJr and others have explained previously, compared to the lossy BCPack, which reaches higher numbers with quality compromises.

Looks like they are promoting Oodle texture BC7 RDO (lossy) as a close equivalent to non-RDO BC7.

Oodle Texture RDO can produce very high-quality encodings that are nearly visually indistinguishable from non-RDO encodings, but compress much more, simply by being a smart encoder that takes the rate cost of its choices into consideration.

Oodle Texture RDO can encode to the same quality as the non-RDO encoders at low lambda, and gradually decreases rate as lambda goes up.
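The lambda behaviour described here is the standard rate-distortion objective: among the candidate encodings of a block, pick the one minimizing distortion + λ·rate. A toy sketch of that selection rule (the candidate names and their distortion/bit costs are invented for illustration, not real BC7 modes):

```python
# Toy rate-distortion optimization: pick the candidate encoding that
# minimizes J = distortion + lambda * bits. Candidates are hypothetical.
candidates = [
    {"name": "exact",  "distortion": 0.0, "bits": 128},
    {"name": "near",   "distortion": 1.5, "bits": 64},
    {"name": "coarse", "distortion": 9.0, "bits": 32},
]

def pick(lam):
    # Lower lambda weights quality; higher lambda weights rate (size).
    return min(candidates, key=lambda c: c["distortion"] + lam * c["bits"])["name"]

print(pick(0.001))  # low lambda: quality dominates -> 'exact'
print(pick(0.1))    # mid lambda: cheaper encoding wins -> 'near'
print(pick(1.0))    # high lambda: rate dominates -> 'coarse'
```

This matches the quoted behaviour: at low lambda the RDO output equals the non-RDO choice, and rate gradually drops as lambda rises.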

Oodle Texture RDO can make BC7 encodings that are much more compressible. For example :

non-RDO BC7 :
Kraken : 1,048,724 -> 990,347 = 7.555 bpb = 1.059 to 1

RDO lambda=40 BC7 :
Kraken : 1,048,724 -> 509,639 = 3.888 bpb = 2.058 to 1

Modern games are using more and more BC7 textures because they provide much higher quality than BC1 (which suffers from chunky artifacts even at max quality). This means lots of game packages don't benefit as much from compression as we'd like. Oodle Texture RDO on BC7 fixes this.
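As a sanity check, the bpb and ratio figures quoted above follow directly from the byte counts (a quick sketch using only the numbers in the post):

```python
# Recompute the Kraken figures quoted above.
# bpb = compressed bits per original byte; ratio = original / compressed.
def stats(original_bytes, compressed_bytes):
    bpb = compressed_bytes * 8 / original_bytes
    ratio = original_bytes / compressed_bytes
    return round(bpb, 3), round(ratio, 3)

print(stats(1_048_724, 990_347))  # non-RDO BC7:      (7.555, 1.059)
print(stats(1_048_724, 509_639))  # RDO lambda=40 BC7: (3.888, 2.058)
```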
 
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.

I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.
I think it will be much improved: graphics, loading times (or lack thereof), more crowded environments, better animations, etc... It was already such a great game that more of the exact same would be great :D
 
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.

I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.

Glad I'm not listening to Microsoft then.

"We do believe in generations, and whether it's the DualSense controller, whether it's the 3D audio, whether it's the multiple ways that the SSD can be used... we are thinking that it is time to give the PlayStation community something new, something different, that can really only be enjoyed on PS5." - Jim Ryan, CEO of PlayStation.

 
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.

I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.
Yes, Xbox One was known for introducing new mechanics this gen :lollipop_neutral:

In Spider-Man, more than you think. That game was used to show how fast you can travel, and for this specific game that's important to the gameplay.
 
Yes, because when you make a video game you always try to hit 100% GPU and CPU load all the time. Who cares what happens during gameplay; the only important thing is to reach that load on both chips.
PS5 programmers have to budget their CPU usage for max GPU clock speed.
 
PS5 programmers have to budget their CPU usage for max GPU clock speed.

And where is that BS coming from? Mark Cerny said that both will be at max most of the time. Do you have any official sources declaring otherwise?

EDIT:

"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

 
Cross buy worked from PS3 to PS4... both fully networked consoles... it wasn't just a Vita thing.

And BC has nothing to do with this conversation.

I was just using it as an example of how the "mechanism" could differ for the PS4 to PS5 cross buy.

WTF is your problem?

My goodness, it just gets funnier and funnier. The scheme for cross buy, smart buy, smart delivery, whatever, ONLY applies to first party. ONLY.

If Activision wants to offer free current to next gen, they will do it on both consoles.

If EA wants to offer free current to next gen, they will do it on both consoles.

If Take-Two wants to offer free current to next gen, they will do it on both consoles.

If Capcom wants to offer free current to next gen, they will do it on both consoles.

So let's explore further, as maybe it's more complex :messenger_beaming:.

If Rockstar says GTA5 will be charged for on PS5, it will also be charged for on XSX.

If dev B says game Z will be charged for on PS5, it will also be charged for on XSX.

God, is it that hard to get? It's blowing my mind. Third-party pricing is set by third-party publishers, that's it.

But, but, Phil said there is a mechanism. Do people have no thought process of their own? OMG, a few fancy words and a nice name do the confusing trick, it seems.
 
And where is that BS coming from? Mark Cerny said that both will be at max most of the time. Do you have any official sources declaring otherwise?

EDIT:

"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Clock speed is not a true indication of ALU usage; e.g., heavy scalar integer workloads can yield high clock speeds while 256-bit AVX vector workloads yield lower clock speeds.

You are ignoring Mark Cerny's warning against AVX.
 
Glad I'm not listening to Microsoft then.

"We do believe in generations, and whether it's the DualSense controller, whether it's the 3D audio, whether it's the multiple ways that the SSD can be used... we are thinking that it is time to give the PlayStation community something new, something different, that can really only be enjoyed on PS5." - Jim Ryan, CEO of PlayStation.

You didn't answer my question... What's new about Spider-Man that has such an influence on the gameplay and must use the new CPU? Jim Ryan is a marketing guy; he was purely saying this to differentiate them from Microsoft. And he doesn't even mention the CPU, why do you think that is?
 
You didn't answer my question... What's new about Spider-Man that has such an influence on the gameplay and must use the new CPU? Jim Ryan is a marketing guy; he was purely saying this to differentiate them from Microsoft. And he doesn't even mention the CPU, why do you think that is?

Let's play the game first?
 
AMD's Smartshift tech is about sharing electrical energy between the CPU and GPU.



To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.

For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.

Mark Cerny has warned against AVX usage.

This is still inaccurate. You've been inaccurate on this for quite a few weeks now...

Yes Smartshift is about the efficient allocation of power. Your first sentence is a win!

No, the developer doesn't deal with separate cpu/gpu power budgets - that's exactly the opposite of what smartshift is doing. The power budget is unified.

Going further the developer doesn't directly control the power allocation at all - that is something the AMD tech does.

And smartshift has no direct influence on clock speeds.
 


I'd be more cautious about this.
There have been trailers/snippets/vids for the first Spider-Man where it looked way better/more interactive than the final game.
The footage from the Miles Morales spinoff was great and cool, but it is subject to change. It could look worse or better in the final product.
Can't wait for those consoles and games to be released and tested though.
 


Ah, the latest FUD drive, how sweet.

Current-gen Jaguar and HDD target-scope games will run 4K60.

Next-gen Zen 2 and SSD target games will run 4K30 with RT, or something like 1600p60 or higher depending on RT.

So current games like Destiny will be 4K60 on both, as it's easy peasy for both new consoles.

So the new strategy is to list how many 4K60 Jaguar and HDD current-gen target games there are on XSX? Will it not be them all :messenger_beaming:

Notice how there is nothing different yet lol
 
R&C is all about the SSD, not the CPU

Source? Until the portal is gone through, the current world is still in memory and being animated with effects and ray tracing.

The new world then gets loaded. Is the old world state saved or logged somewhere? And then the new world has new enemies, animations, and effects with ray tracing in play.

It's a save, load, and animate at 4K with lots of particle effects, animations, and RT, the whole process.
 
This is still inaccurate. You've been inaccurate on this for quite a few weeks now...

Yes Smartshift is about the efficient allocation of power. Your first sentence is a win!

No, the developer doesn't deal with separate cpu/gpu power budgets - that's exactly the opposite of what smartshift is doing. The power budget is unified.

Going further the developer doesn't directly control the power allocation at all - that is something the AMD tech does.

And smartshift has no direct influence on clock speeds.
From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
 
To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.

For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.

Mark Cerny has warned against AVX usage.
Cerny said it's capable of running at 3.5GHz and 2.23GHz, or close to those frequencies, "most of the time" within the set power budget. So if the CPU and GPU are running at 3.5GHz and 2.23GHz and something GPU-intensive is happening on screen at that moment, the CPU is barely doing anything and is slightly downclocked. Part of the CPU's power budget is then shifted to the GPU, which is ALREADY running at 2.23GHz; since the GPU is capped at 2.23GHz, the extra power doesn't increase the frequency any further.

What the power allocation from the CPU to the GPU does is help the GPU maintain that 2.23GHz clock for a longer period of time, thanks to the extra power-supply headroom. What Mark Cerny meant by the absolute worst-case scenario is workloads that would cause the GPU and CPU, already running at peak clock speeds, to exceed the set power budget provided to them. THAT's when both the GPU and CPU are slightly downclocked, in order to make sure that the set power budget isn't exceeded.
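The behaviour described above can be sketched as a toy model of a unified power budget where the CPU's unused allocation shifts to the GPU, capped at each component's own max-clock power (all wattage numbers here are invented for illustration, not Sony's figures):

```python
# Toy model of the SmartShift behaviour described above: one shared power
# budget; whatever the CPU doesn't use is available to the GPU, but the GPU
# can't exceed its own cap (the power needed at its max clock).
def allocate(total_budget, cpu_demand, cpu_cap, gpu_cap):
    """Return (cpu_power, gpu_power) under a shared budget."""
    cpu_power = min(cpu_demand, cpu_cap)
    gpu_power = min(total_budget - cpu_power, gpu_cap)
    return cpu_power, gpu_power

TOTAL, CPU_CAP, GPU_CAP = 200, 60, 160  # invented watts

# Light CPU load: the GPU soaks up the spare budget, up to its cap.
print(allocate(TOTAL, 20, CPU_CAP, GPU_CAP))  # (20, 160)

# Heavy CPU load: both still fit inside the shared budget, GPU gets less.
print(allocate(TOTAL, 60, CPU_CAP, GPU_CAP))  # (60, 140)
```

Note that with these invented numbers the two caps sum past the shared budget (60 + 160 > 200), which mirrors the "worst case" in the post: both chips at peak demand can't draw full power simultaneously, so clocks dip slightly.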
 
PS5 programmers have to budget their CPU usage for max GPU clock speed.
Could you stop talking out of your behind?

Here, a direct quote from Cerny:
"The CPU and GPU each have a power budget, of course, the GPU power budget is the larger of the two. If the CPU doesn't use its power budget – for example, if it is capped at 3.5GHz – then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Now what do you have to say?
 
Cerny said it's capable of running at 3.5GHz and 2.23GHz, or close to those frequencies, "most of the time" within the set power budget. So if the CPU and GPU are running at 3.5GHz and 2.23GHz and something GPU-intensive is happening on screen at that moment, the CPU is barely doing anything and is slightly downclocked. Part of the CPU's power budget is then shifted to the GPU, which is ALREADY running at 2.23GHz; since the GPU is capped at 2.23GHz, the extra power doesn't increase the frequency any further. What the power allocation from the CPU to the GPU does is help the GPU maintain that 2.23GHz clock for a longer period of time, thanks to the extra power-supply headroom. What Mark Cerny meant by the absolute worst-case scenario is workloads that would cause the GPU and CPU, already running at peak clock speeds, to exceed the set power budget provided to them. THAT's when both the GPU and CPU are slightly downclocked, in order to make sure that the set power budget isn't exceeded.
Again, clock speed is NOT a true indication of the ALU usage level.

From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core

Could you stop talking out of your behind?

Here, a direct quote from Cerny:
"The CPU and GPU each have a power budget, of course, the GPU power budget is the larger of the two. If the CPU doesn't use its power budget – for example, if it is capped at 3.5GHz – then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Now what do you have to say?
Again, clock speed is NOT a true indication of the ALU usage level.

From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
 
I'm sold on laying mine down. When I see it, it looks like the "people's eyebrow". 😅 Used to love that shit. So yeah, I've already assigned it a lovely space on the wall.
 
AMD's Smartshift tech is about sharing electrical energy between the CPU and GPU.



To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.

For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.

Mark Cerny has warned against AVX usage.

So, a higher TDP would enable the CPU and GPU to hit 3.5GHz and 2.23GHz without each component having to borrow power from the other? I guess the current setup is the best that Sony could achieve without causing the system to become too hot for the cooling system that they devised; otherwise, the console would be even larger than it is now.

Now I wonder which console will run quieter.
 
Those are the main highlights: lossless, as @PaintTinJr and others have explained previously, compared to the lossy BCPack, which reaches higher numbers with quality compromises.

Bo, I like you, but you should start to listen.
Before Oodle Texture, the compression was lossless.
Now, with the usage of Oodle Texture, the compression is not lossless.

You can expect Oodle Texture and BCPack to be about the same in quality. However, currently we can't say which one compresses better (in terms of size reduction) overall.
We can, however, expect something between 20-60% texture compression from original to final overall in both cases (depending on the texture).
 
Again, clock speed is NOT a true indication of the ALU usage level.

From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core

You seem not to understand that a fixed frequency is more prone to throttling than SmartShift? A constant frequency at max draws more unnecessary heat. Do you want us to go down this route now?
 
From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
In the same paragraph:
There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
It's because the devkits' frequencies are LOCKED. And what did I say in my previous post?

AMD's SmartShift technology is responsible for allocating power, based on the set power budget, to parts of the APU, not for managing the clock speeds of the CPU and GPU. You do realize that the only reason the Series X's CPU can hit 3.6GHz with SMT enabled and 3.8GHz with SMT disabled is that its GPU frequency is low (compared to PS5's), right? Running the CPU at frequencies higher than 3GHz and the GPU at 2GHz while keeping both frequencies locked at the same time with varying power supply levels? That's a thermal nightmare, and the console would catch fire by that point.
 
So, a higher TDP would enable the CPU and GPU to hit 3.5GHz and 2.23GHz without each component having to borrow power from the other? I guess the current setup is the best that Sony could achieve without causing the system to become too hot for the cooling system that they devised; otherwise, the console would be even larger than it is now.

Now I wonder which console will run quieter.
Nope. AMD's SmartShift is more than TDP management. SmartShift manages the shared power feed into the CPU and GPU.
 
Bo, I like you, but you should start to listen.
Before Oodle Texture, the compression was lossless.
Now, with the usage of Oodle Texture, the compression is not lossless.

You can expect Oodle Texture and BCPack to be about the same in quality. However, currently we can't say which one compresses better (in terms of size reduction) overall.
We can, however, expect something between 20-60% texture compression from original to final overall in both cases (depending on the texture).

Same quality? Not sure about that; what I read was completely different. The UE5 demo was also referring to lossless polygons.
 
Again, clock speed is NOT a true indication of the ALU usage level.

From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
The rest of the paragraph you quoted specifically explains that these devs are not "budgeting" their CPU usage, but rather chose to optimize with a locked performance profile, because they don't need the extra CPU power:

"However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny."
 
You seem not to understand that a fixed frequency is more prone to throttling than SmartShift? A constant frequency at max draws more unnecessary heat. Do you want us to go down this route now?
PS4 and PS4 Pro consoles have had Sony's fixed-frequency regime since 2013.
 
Nope. AMD's SmartShift is more than TDP management. SmartShift manages the shared power feed into the CPU and GPU.

Well, it's not using AMD's version, in case you were confused, and I doubt you are. They don't have a shared budget, but I know you're not interested in spreading accurate information because you didn't read the replies.
 
The rest of the paragraph you quoted specifically explains that these devs are not using SmartShift, but rather a locked performance profile, because they don't need the extra CPU power:

"However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny."
I own a mobile Ryzen APU, and I can set CPU usage to 50% via Windows power management in exchange for higher GPU clock speed. I have a GPU-bias profile for my Ryzen APU laptop. This behavior is replicable across multiple mobile Ryzen APUs with the same power design limit.

The reason the Ryzen Controller software exists for Ryzen laptop users is to raise the TDP limit from a 25-watt power design to a 40-watt power design. Increasing the power design from 25 watts to 40 watts also increases clock speeds for both the CPU and GPU.

I'm not even talking about TDP limits, since there's extra cooling headroom available at 25 watts.
 
Just look at these compression rates:

127 MB block compressed GPU textures, mix of BC1-7

78 MB with zip/zlib/deflate

70 MB with Oodle Kraken

40 MB with Oodle Texture + Kraken

https://cbloomrants.blogspot.com/2020/06/oodle-texture-slashes-game-sizes.html
The best thing about this is that devs right now can take all their assets and run them through those compression tools long before their games release, even if that's in November. We're all going to thank Oodle for our shorter downloads and (to me, more importantly) less space taken on the precious 800-ish GB drive.
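For reference, the quoted rates work out to the following savings (a quick check using only the numbers in the quote above):

```python
# Savings implied by the figures quoted above (127 MB of BC1-7 textures).
original = 127  # MB before compression
sizes = {  # MB after each compressor, from the quoted post
    "zip/zlib/deflate": 78,
    "Oodle Kraken": 70,
    "Oodle Texture + Kraken": 40,
}
for name, mb in sizes.items():
    ratio = original / mb      # compression ratio, original : compressed
    saved = 1 - mb / original  # fraction of the original size removed
    print(f"{name}: {ratio:.1f}:1 ({saved:.0%} smaller)")
```

This prints roughly 1.6:1 (39% smaller) for deflate, 1.8:1 (45%) for Kraken alone, and 3.2:1 (69%) for Oodle Texture + Kraken, which is where the "slashes game sizes" claim comes from.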


I only heard about NeoGAF in January, lurked for a month, then couldn't resist the memes and exciting chat. Signed up 99.99% for this thread. Gonna feel empty when it gets closed.
When I registered, the account-approval waiting time was around a full month and you couldn't register with a free email (like Gmail). This made FUD spreaders much more careful. Now we're into the full festival of stupid.

What do you mean by this?

I disagree with your opinion of The Medium; to me it looks good for a first-wave game, and it will have some interesting mechanics, which is good.
I'm going to bite a month of Game Pass to try that game just because it's set in my neighbourhood. I always find it incredibly immersive to play in real locations.

Who gives a fuck about Jaguar in 2020? Read again what I'm saying, we need to throw those Jaguars behind us as far as possible, maybe ask Elon Musk to throw them into outer space.

I disagree. Many millions of less lucky citizens of Earth will still be willing to buy a PS4 when the PS5 drops, because it'll be much cheaper. With the PS3, Sony sold a manufacturing licence to factories in India, which produced millions of consoles for third-world markets. I guess it's not going to be different this time, especially with all those great exclusives they're still releasing.

However, I'd send to space all those discless Xbox One S units lying around in shops. Poorer gamers won't benefit from those machines, as they're bound to an expensive subscription scheme.

Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.

I've seen particle effects in Returnal and R&C, water physics in Horizon, object deformations in All-Stars. I see a lot of CPU usage; you see nothing. That's because you have no idea about technology, just the same warring rhetoric. Go pick fights at your level, that is, in the meme threads.

God is it that hard to get it, its blowing my mind, 3rd party pricing is set by 3rd party publishers, thats it
I've just imagined what would happen if any of the platform holders made upgrades mandatory. Guess what Rockstar and EA would do? Release the same game with a new name, include 2 hours of content, and cash in. Greed always finds a way.

From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
Your quote comes from Digital Foundry, pet pixel counters with no knowledge of game development. People who are sponsored by Microsoft. They're as unbiased and objective as you are.
 
does that mean you need to download the whole thing with all the duplicates in XSX?

Nope, they have said multiple times that via Smart Delivery the right version of the game will be downloaded. So no duplicates.


Sorry for the mild rant, but I'm sick of all this worry and talk about current-gen and cross-gen games, as if we're excited about them! I, and many others, care about new, true next gen.

So I'm guessing you are not interested in the likes of Kena or the new Oddworld game?
 
I own a mobile Ryzen APU, and I can set CPU usage to 50% via Windows power management in exchange for higher GPU clock speed. I have a GPU-bias profile for my Ryzen APU laptop. This behavior is replicable across multiple mobile Ryzen APUs with the same power design limit.
Yeah? I own eyes and a brain with reading comprehension.
 