Because the old hardware will still be around and catered to, which limits the benefits you get from the new hardware. Not saying there won't still be huge benefits.
Considering devs already have an artistic and visual look in mind, they don't usually scale up hard enough for it to matter, especially on PC. I can understand more FPS and higher resolution, but some of us just want the next Crysis, and smaller details such as increased LODs and higher resolution shadows don't always give the visual leap some people want. I can see illumination, volumetric fog, and tessellation being that, though. I would also want terrain deformation, higher quality water physics, increased particle effects, destruction, splintering... etc. I think in the end we are still going to be limited by consoles. I hope not, though.
Of course they can. PC can run at higher settings than consoles.
On PC you can run all games at true 4K. Or higher if one desires.
Or you can run games at high refresh rates. 144 Hz displays are very common on PC, and some go to 240 Hz (quick frame-time math after this post).
Then there is ray-tracing. There are more games on PC that use ray-tracing, and most use much higher quality settings than on consoles.
And then there is the usual stuff: higher resolution shadows, volumetric fog and illumination, increased LODs, heavier use of tessellation.
And the list goes on and on.
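To put those refresh rates in perspective, here's the frame-time budget each one implies (a minimal sketch; the refresh rates themselves are the only inputs):

```python
# Frame-time budget at common PC refresh rates: the renderer must
# finish each frame within 1000 / Hz milliseconds to hold the rate.
for hz in (60, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms
```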
What they will actually end up being used for
It is all over for nvidia
Reminds me of PC modders who have to add rain and puddles onto every square inch of any game just to show off the RT reflections lmao
Honestly, we're what, 3 years or so after Nvidia pushed ray tracing really hard? What games have really shown a case for its use? Sony's big one is Spider-Man? I really can see that AMD is going for rasterization, and as time goes on it will get better at ray tracing.
I care about resolution, frames and overall IQ in a game. Ray tracing is not being used in games in any big meaningful way.
Wish I had a dollar for every time I’ve heard an AMD fanboy make a hyperbolic claim about some rumored next-gen CPU/GPU.
Except zen 3 was the real deal.
Bonus: once it releases, I bet you’ll be here saying “oh yeah well it’s too soon to judge, just wait until we get better drivers/games are optimized for this new architecture, then we’ll really see what it’s capable of!”
We’ll see. I hope we get some fierce competition but I am not going to project that my favorite company is for real gonna dominate this time based on some rumored specs for a future product. You’d think the fanboys would’ve learned by now.
That's about 41 and 61 TFLOPS for the top two if they were both clocked @ 2000 MHz (quick math after this post).
Makes current gen consoles seem like pipsqueaks.
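For what it's worth, that's just the standard FP32 throughput formula; a minimal sketch in Python, assuming the rumored shader counts of 10240 and 15360 for the top two parts (rumored figures, not confirmed specs):

```python
# FP32 TFLOPS = 2 ops per clock (FMA) x shader count x clock.
# The 10240 / 15360 shader counts are rumored RDNA3 figures, not confirmed.
def tflops(shaders: int, clock_mhz: int) -> float:
    return 2 * shaders * clock_mhz / 1e6

for shaders in (10240, 15360):
    print(f"{shaders} shaders @ 2000 MHz -> {tflops(shaders, 2000):.1f} TFLOPS")
# prints 41.0 and 61.4, i.e. roughly the 41 and 61 TFLOPS claimed above
```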
Wow so after literally 10 years of “just wait until next year’s CPU!!!” you finally guessed right. And all it took was a half-decade delay in Intel’s fab roadmap to allow them to catch up.
You seem triggered
Truly, you AMD fanboys have an almost prophetic ability to predict the outcome of future product match-ups, and this isn’t just another iteration of the hype cycle. I stand corrected.
Higher rendering resolution = more detail not lost on the screen.
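A quick look at the raw pixel counts behind that point (common resolutions only, nothing assumed):

```python
# Pixels per frame at common rendering resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MP ({w * h / base:.2f}x 1080p)")
# 1080p: 2.07 MP (1.00x), 1440p: 3.69 MP (1.78x), 4K: 8.29 MP (4.00x)
```

So "true 4K" pushes four times the pixels of 1080p, which is why it keeps fine detail that lower resolutions resolve away.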
You do realize that most games on PC have graphics options that can scale to higher settings.
Next year gonna be so crazy.
- RDNA3
- Lovelace
- Arc
Then on the CPU side you got Alder Lake coming in a couple days, then next year Zen3+, Zen4 and Raptor Lake second half. Shit bout to be too wild.
But to stay on topic, those MCM GPUs gonna be some real heaters, lmao
Will PC gamers even get to take advantage of any of this compute performance considering consoles exist?
Depends on what settings they use and how they use RT. Most games that use Ultra settings on PC never really LOOK ultra and still end up using a lot of resources that could be put to better use. With GPUs heading north of 40 to 60 TF, there will be a lot of headroom to do wayyy more instead of just a res bump and FPS bump.
Most next-gen games are already compromising with performance mode vs quality mode, RT on/off. The PC versions could just run the quality mode with full performance, ray tracing enabled, etc.
The price will be 3-5x that of a console, so what's the point of the comparison? Subtle trolling?
This thread has nothing to do with consoles, and yet you manage to shitpost a "humour" comparison with hardware from the future just to downplay another platform. At least try to be funny.
If you think facts and some mild humour is trolling then it's you who have the issue.
Seems more of a "you" problem.
You're stating the obvious, so what's the point? Wanna derail a thread about AMD cards into cards vs. consoles? Go on... troll
If you're that upset by someone referring to the consoles as "pipsqueaks" in comparison to such vastly more powerful hardware, perhaps it's time to take a break.
Derail? You're the only one with the issue.
You need to learn to read the thread title before posting.
You really need to learn to be less sensitive.
PCs have been more powerful than consoles for like 20 years or more, so I don't really see the point of bringing it up.
You need to learn to read.
The 580 came out because the 480 was crap. It used like 300 W and ran hot as hell.
There was no reason to buy a GTX 580, because it was massive overkill when it came out. Then Metro 2033 and AC Unity came out, and the 580 was a cripple; a whole new tier of performance was needed. With RT being a thing, games could very well make the shift to RT-only, like Metro Exodus, sooner rather than later once AMD joins the club to make use of those CUs.
Are you sure? There were rumors that the release slipped into Q1 2023.
It didn't just tape out, and it's not coming in Q4 2022. It taped out a while ago and it's coming Q2 2022.
Without the shortages it would be great as a consumer... Sad face
That is also exciting, but since I mostly use my PC for gaming, by the time that matters I'll be ready to upgrade anyway. They haven't even saturated PCIe 4.0 yet. I'm more excited about Microsoft's DirectStorage. That should be a game changer, and then I'll upgrade to a Gen4 NVMe. Right now I'm fine with the two Gen3s I have.
Plus finally implementing:
DDR5
Wi-Fi 6E
PCIe 5.0
NVMe SSDs with up to 16 GB/s read/write speeds (almost equal to PS3 and X360 RAM bandwidth; quick math after this list)
USB 4.0
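That 16 GB/s figure checks out as roughly the ceiling of a PCIe 5.0 x4 link; a back-of-the-envelope sketch, assuming 32 GT/s per lane with 128b/130b encoding:

```python
# Approximate one-direction bandwidth of a PCIe 5.0 x4 NVMe link.
gt_per_s = 32          # PCIe 5.0 raw transfer rate per lane (GT/s)
encoding = 128 / 130   # 128b/130b line-encoding efficiency
lanes = 4              # typical NVMe SSD link width

gb_per_s = gt_per_s * encoding * lanes / 8   # 8 bits per byte
print(f"~{gb_per_s:.1f} GB/s per direction")  # -> ~15.8 GB/s
```

That lands in the same ballpark as the ~22-26 GB/s main-memory bandwidth of the X360 and PS3, so the comparison in the list holds up.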
The shortages are a lie. In the beginning, sure, it was real. Not anymore. Nobody will convince me that they aren't manufacturing the shortage to keep prices high, since they see psychopaths are willing to part with their money for low-end stuff. lol
A 580 can play BF1 at 1080p medium at 60 fps. One of my friends still has one in his ancient Windows 7 machine.
And will be $1000+?
Start at 1500 euro? Will never happen.
Good luck finding one
Bondrewd claimed AMD's GPUs are on a 6 quarter cadence, meaning the first RDNA3 graphics card should appear up to a year and a half after RDNA2.
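A minimal sketch of that cadence math, assuming RDNA2's desktop launch in November 2020 (the RX 6800 series) as the anchor date:

```python
# 6 quarters = 18 months after the RDNA2 desktop launch.
from datetime import date

rdna2_launch = date(2020, 11, 18)   # RX 6800 series launch (assumed anchor)
months_later = 6 * 3                # 6 quarters

total = rdna2_launch.month - 1 + months_later
year, month = rdna2_launch.year + total // 12, total % 12 + 1
print(f"~{year}-{month:02d}, i.e. Q{(month - 1) // 3 + 1} {year}")
# -> ~2022-05, i.e. Q2 2022
```

Which lines up with the "Q2 2022" claim earlier in the thread; a Q1 2023 launch would stretch the cadence to roughly nine quarters.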
Lmao!!
Lucky for me my computer is named garbage, so please, throw it in.
When I get home from work today I’m throwing my 3080ti in the garbage.
?????????
AMD fanboys?