Wow! Rich!
So I'm looking to upgrade my 3600X. I was ready to press the buy button on a 5800X3D, and then someone sent me a bundle at a store here with a 5900X + 64GB of 3200MHz DDR4 RAM (2x 32GB) + Company of Heroes 3 (I'd sell this, honestly) for only $10 more than the 5800X3D alone.
What to do?
I have 32GB of 3200MHz RAM currently, but more RAM is more RAM, innit?
For gaming I'll play at 4K most of the time, but I use my PC for productivity a hell of a lot more than gaming.
If you use your PC more for productivity, then that 5900X deal is a no-brainer. Hell, it appears to be a better deal than the 5800X3D anyway with the RAM and a free game.
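Just to make the value math explicit, here's a tiny back-of-the-envelope sketch in Python. Only the $10 price gap comes from the post above; the RAM and game values are made-up placeholders, since the thread never gives the actual prices:

```python
# Rough value check on the 5900X bundle vs. the 5800X3D alone.
# Only the "$10 more" figure comes from the post above; the RAM and game
# values below are placeholder assumptions purely for illustration.
extra_cost_over_5800x3d = 10     # bundle price minus 5800X3D-alone price (from the post)
assumed_ram_value = 120          # hypothetical street price for 2x 32GB DDR4-3200
assumed_game_resale = 20         # hypothetical resale value for Company of Heroes 3

extras_value = assumed_ram_value + assumed_game_resale
print(f"Roughly ${extras_value} worth of extras for ${extra_cost_over_5800x3d} more,")
print("plus 12 cores instead of 8 for productivity work.")
```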
Wait a sec, I got the free game when I bought the 5800X3D a few weeks ago from Amazon. Isn't it a deal where you get it with either of those CPUs?
Not that I'm aware of. AMD changes promotions all the time. Even still, based on the quoted person's use case, the 5800X3D still wouldn't make sense, since they don't game that heavily and mainly use the machine for productivity.
OK, found the link to the promotion here. Apparently you get the game for the following:
Interesting discussion on Anand:
https://forums.anandtech.com/threads/the-8gb-not-enough-thread.2595331/page-15?view=date
And I think they're right. Nvidia is skimping on VRAM, and it will obsolete the fuck out of a lot of gaming PCs over the next few years.
But it's not just the 4060. Their higher-end cards are VRAM-starved as well. I just don't get it; AMD doesn't have this issue.
They've been doing this for several generations, offering the bare minimum on the lower-end SKUs. Reminds me of Apple and their paltry RAM offerings and upgrades on their iPhones.
Granted, 16GB isn't going to be a limiting factor any time soon. 12GB, on the other hand, is going to be problematic at 1440p+ when more demanding games release, without the use of DLSS or other TAAU upscaling.
For those of you with an Asus ROG X670E motherboard, or really any of the motherboards that support Zen 4 CPUs, there is a new BIOS update that just came out that apparently improves performance. I know the new Nvidia graphics driver definitely fixes the high GPU/CPU usage issue being reported, because Modern Warfare 2 was running in the mid-to-high 80s and wouldn't come down from that temperature or voltage until you restarted.
Now, with that fix coupled with this new BIOS, I've never had the hardware run so fast with overclocks while running so cool. The graphics card has always been solid (Gaming OC 4090), but now my temps don't even go above 60 anymore in Modern Warfare, which is semi-taxing. I do disable core isolation and apply certain other fixes in Windows 11, and this build is the most performant one I've ever had, given I'm pushing a PBO OC and DIGI+ VRM tweaks with a good GPU OC on top of that. Under the heaviest loads I get 71°C at most after an hour of gaming, and the CPU is as solid as I've ever had it, and that's saying a lot.
Helps that I'm running an Arctic Liquid Freezer II (push/pull) in a Corsair 7000D.
Yep, I saw it, but it's still a beta version for my mobo (X670E TUF Gaming). I'll wait for the final version.
Looking to upgrade my GPU.
Which do you feel is more important to newer games?
Higher VRAM or DLSS 3?
I’m deciding between a 4070ti and an AMD 7900XT. I’m not sure I really want a 12GB card in 2023, but frame generation seems to be decent, and Nvidia outperforms in Ray tracing.
That 20GB VRAM sure sounds nice though. Especially with Resident Evil 4, where I had to turn down settings cause my card’s only 8GB.
Depends on how much you value those things.
DLSS 3 is only really good for single-player games, as it adds latency. For competitive games it's something to avoid.
But it can look very smooth. It doesn't add CPU load for the generated frames, so it's less prone to CPU bottlenecks.
AMD said they are also developing FSR 3 with frame generation. But when it'll be released, and whether it'll be any good, is anyone's guess at this point.
Do you value RT? If so, then the 4070 Ti is the best choice.
But RT also increases VRAM usage.
VRAM usage in games is constantly increasing, but the point at which a given amount becomes an issue is difficult to predict.
12GB seems OK for most games today, but it's hard to tell for how long. 20GB is sure to last much longer.
The 4070 Ti is also more power efficient, and at a time when energy costs are high, that's something to take into account too (see the rough numbers sketched below).
Another thing to consider is that AMD has lower driver overhead, so if you have a less powerful CPU, that should factor into the decision.
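To put very rough numbers on the latency and energy points above, here's a small back-of-the-envelope sketch in Python. The 60 fps base framerate, the 60 W power-draw gap, the 3 hours of gaming per day, and the $0.30/kWh electricity price are all illustrative assumptions, not figures from the thread or from any review:

```python
# Back-of-the-envelope: frame-generation latency and running-cost deltas.
# Every input here is an illustrative assumption, not a measured value.

def added_latency_ms(base_fps: float) -> float:
    """Frame interpolation has to hold back the next real frame before it can
    generate the in-between one, so the added input lag is roughly one native
    frame time (plus the generation time itself, ignored here)."""
    return 1000.0 / base_fps

def yearly_energy_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Extra yearly cost of a card that draws `extra_watts` more while gaming."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"~{added_latency_ms(60):.1f} ms extra latency on a 60 fps base")                  # ~16.7 ms
print(f"~${yearly_energy_cost(60, 3, 0.30):.0f}/year for +60 W at 3 h/day, $0.30/kWh")   # ~$20
```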
DLSS 3.0 is nice to have, but not every game supports it or will. More VRAM can potentially benefit every game.
I hadn’t looked at the difference in efficiency, but yeah 4070ti wins that one. Even uses less power than my 3070ti, so that’s nice.
I’m looking for an all around card. If a game makes use of Ray tracing then I want to make use of it.
I’m going to keep reading up on the two, but I think they really are pretty evenly matched in benchmarks.
There's that, and AMD will be showing off FSR 3 at GDC tomorrow. Probably worth waiting to see what the deal is with that, if you're into fake frames.
I wouldn't buy anything with less than 16GB of VRAM. That's why I passed on the 3080 with 10 or 12GB. I got a 4080, which has 16GB, DLSS 3.0, and awesome RTX performance. Now that I've been using the card for a couple of months, I wish I had gotten the 4090 with 24GB, because I'm seeing games use up to 14GB of VRAM. I don't know if games use it just because it's there, or if they really need it. I'm not even playing at 4K.
If I had to pick between those two cards, I'd probably go for the 7900 XT, as long as you want more VRAM and aren't too bothered about ray tracing. I mean, the ray tracing on the 7900 XT ain't bad, but if you're coming from a 3070 Ti it might not be a huge upgrade.
If you want DLSS 3 and the best ray tracing, then get the 4070 Ti. 12GB would probably be fine if you're playing at 1080p-1440p. I made the decision to move up to the 4080 for the improved performance and the extra 4GB of VRAM.
But I built a new PC last year and still got kind of a middle of the road GPU, a 1080ti, ...
Damn, I think I'm getting old now...
Hardware Unboxed did a head-to-head comparison of those two GPUs recently.
From most benchmarks I've seen, if ray tracing is a high priority and you play at 1440p, the 4070 Ti is probably the one you want. It's also quite a bit more efficient a GPU. Want 4K? The 7900 XT and its 20GB will give you much more headroom and perhaps longevity. Nvidia cards do have better resale value, though.
For today, the best choice is the 4070 Ti. It just wins more than it loses.
But don't be surprised if, a couple of years from now, you have to upgrade again just because of the VRAM.
It's a 6-year-old GPU...
This (and price) is what initially drew me to AMD. For $800+ I'd like more than a couple of years.
AMD does appear to be better value, but then you have to deal with the quality control of the hardware and the quality of the drivers.
I just don't like paying so much for a GPU. When I built my 2012-ish PC (whenever Diablo III came out), I got a used 560 Ti for maybe 150 or something, and it served my purposes well then. And then for the recent PC built in 2022 I got the 1080 Ti from the same guy, coincidentally, used for about 150 as well, give or take. It's doing great for me so far, but I don't play many new games. The ones I've tried have run really well, or at least tolerably enough for my standards (Elden Ring, RE4 Remake, etc.). Mainly I just wanted a decent CPU and a decent amount of RAM. There's just no way I could stomach paying as much for some top-end GPU as I spent on the entire machine! So I'm pretty much in a cycle where, whenever I build a new PC, it just so happens my pal is upgrading his card (he likes to stay pretty up to date) and I'll take the old one! Then again, I'm the one guy still running 720p in 2023. I swear I'm buying a new monitor soon, lol.
I spent £1,200 on my 4080 and people might laugh at me for that, but I'm happy to pay extra.
Who cares what others think? If you've looked at the data relevant to what you want and come to a decision, that's fair enough.
I ended up getting the 5900X, lol. Productivity performance went up A LOT with this combined with the RAM upgrade.
"fake frames"? lol
Between enabling and disabling frame generation in Hogwarts Legacy or Cyberpunk, let me tell you, there is a huge difference between playing at 50fps and 100-140fps (Hogwarts) or 55fps and 105-120fps (Cyberpunk). If they are fake, they are damn convincing. There is basically no difference between real and fake frames. Yeah, yeah, I know the "fake" frames add latency or some shit, but I can't feel any difference. Maybe if it was an esports game it'd matter, but every esports game I have can hit 200+fps anyway.
That's what they are, though. Interpolated frames. I didn't say there's anything inherently wrong with the technology. But it's not exactly new either. Now it's just being done on GPUs, or going to be integrated into the driver/software stack.
I know what it is, but I've just seen a lot of people say it's fake, as if you shouldn't use it or see it as a selling point. Yeah, they're not being rendered the same way as the "real" frames, but the GPU is still generating them through AI. They may be "fake", but they give me better performance, so they're basically real frames to me.
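For anyone wondering what "interpolated frames" actually means, here's a toy sketch in Python/NumPy of the basic idea, the same trick TVs have done for years: synthesize an in-between frame from two real ones. To be clear, this naive blend is only an illustration; DLSS 3 uses motion vectors, optical flow and an AI model rather than a dumb average, but the latency argument falls out of even this simple version, since you need the later frame in hand before you can show the in-between one.

```python
import numpy as np

def naive_interpolated_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Synthesize an in-between frame by linearly blending two real frames.
    t = 0.5 lands halfway between them in time. Because frame_b must already
    exist, the interpolated frame can only be shown late, hence the added lag."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(frame_a.dtype)

# Tiny demo with fake 2x2 grayscale "frames"
a = np.zeros((2, 2), dtype=np.uint8)        # all-black frame
b = np.full((2, 2), 100, dtype=np.uint8)    # brighter frame
print(naive_interpolated_frame(a, b))       # all 50s: the halfway point
```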
How am I doing? How is the mobo? That's where my lack of knowledge is. I have a 3080, but will most likely be getting a 4090.
I already have a case, an 800W PSU, fans, and whatnot.
Looks fine, but a Z690 board is for 12th-gen CPUs. A 13th-gen CPU will work, but the BIOS may need to be updated to support it. I think Asus boards should let you update the BIOS without the CPU installed. If you want to avoid that, then for a 13th-gen CPU you'd get a Z790 board, as you shouldn't need to worry about the BIOS. Of course, if the Z690 is cheaper, then go with that. I'd recommend looking at the support site for that Z690 board to see if it supports BIOS updates without a CPU.
The RAM is 2x 16GB. I have a lot of games installed, so I'm going to use a 1TB drive for Windows, the 2TB for Steam, and the other 1TB for Xbox games. I'll get the Z790 then. I hate fucking with the BIOS; it always bricks on me. Thanks!
Why are you buying 1x 2TB SSD and 2x 1TB SSD?
I'd recommend going with 32GB RAM instead of 16GB unless you plan on doing that later.
Edit: seems you can update the BIOS on the Z690 even with a 13th-gen CPU, so you should be good.
My bad! I should learn to read better.
Might go with this: https://pcpartpicker.com/product/T2...x-atx-lga1700-motherboard-z790-aorus-elite-ax
Fair enough, I'd probably do the same. I have a boot drive for Windows and a drive for games. If I was fucking about with installing games from Microsoft's store, I'd want to put them on their own drive. I've had too many issues installing games from the MS Store, and it's resulted in me having to wipe the SSD.
The Z690 should be fine, but yeah, if you don't want to mess about with it, then go for the Z790. I was looking at that exact same board but the Z790 one, and it's expensive here. Hopefully you find a board that isn't much more expensive than the Z690.
All set up!
Just got the same GPU. Bonus points for the sweet air cooler! Feels like I'm seeing them more often these days.
I have a RAM-related query.
I currently only have 16GB in 2x 8GB sticks. I was going to add two more 8GB sticks, but found I could get 2x 16GB for only a little more. Obviously I'd get exactly the same speed and latency, but are there any weird performance effects from combining different-capacity modules?
So you'd have the two 8GB sticks paired together and the two 16GB sticks paired together, right? It should work fine, as long as you aren't trying to pair a 16GB stick with an 8GB one. I'm not an expert, though.
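As a rule-of-thumb sketch of why the pairing matters (plain Python, general dual-channel behaviour only, not specific to any board): if each memory channel ends up with the same total capacity, for example one 8GB and one 16GB stick per channel, the whole lot can interleave in dual channel; if the channels are lopsided, the mismatched portion typically drops back to single-channel speed (Intel calls this Flex Mode).

```python
# Rule-of-thumb sketch for mixing DIMM capacities on a dual-channel board.
# Purely illustrative; real behaviour depends on the board, BIOS and memory controller.

def channel_summary(channel_a_gb, channel_b_gb):
    """Given stick sizes (GB) on each channel, estimate how much capacity can
    interleave dual-channel and how much falls back to single-channel."""
    total_a, total_b = sum(channel_a_gb), sum(channel_b_gb)
    dual = 2 * min(total_a, total_b)     # the matched portion interleaves
    single = abs(total_a - total_b)      # the leftover runs single-channel
    return {"total_gb": total_a + total_b, "dual_channel_gb": dual, "single_channel_gb": single}

# 2x 8GB + 2x 16GB with one stick of each size per channel (the "paired" layout above):
print(channel_summary([8, 16], [8, 16]))   # all 48GB dual-channel
# Same sticks, but both 16GB modules crammed onto one channel:
print(channel_summary([8, 8], [16, 16]))   # 32GB dual-channel, 16GB single-channel
```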
Fellas, would a 30" desk be OK for a 48" OLED strictly for gaming? Reason being, lol, Best Buy has the 48" for the same price as a 42" if you're a Totaltech member (such a based membership), and it's just so damn tempting!
Big enough for the width? The 48" has a center stand, so you're fine there.
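If it helps to sanity-check the size, here's a quick geometry sketch in Python. The diagonal-to-width conversion is just standard 16:9 math; the 30" desk figure is straight from the question, though whether that 30" is the desk's width or its depth is the real question:

```python
import math

def panel_dimensions(diagonal_in: float, aspect_w: int = 16, aspect_h: int = 9):
    """Width and height of a flat panel from its diagonal and aspect ratio
    (screen only, ignoring bezels and the stand)."""
    diag_units = math.hypot(aspect_w, aspect_h)      # sqrt(16^2 + 9^2)
    return diagonal_in * aspect_w / diag_units, diagonal_in * aspect_h / diag_units

width, height = panel_dimensions(48)
print(f'48" 16:9 panel: ~{width:.1f}" wide x ~{height:.1f}" tall')   # ~41.8" x ~23.5"
print('That is wider than 30", but a center stand only needs a fraction of that to sit on.')
```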