Nvidia’s RTX 5050 GPU starts at $249

What's the legit use case for these?
Why would you buy a card knowing that it won't run anything?
Sounds like a straight-up HDMI port situation where it exists to give you a video output and nothing more.
Budget PCs and laptops. And the xx50 isn't a bad card by any means. It can run pretty much everything on the market, including the unoptimized mess of UE5 games, at high settings.

 
Nice! I'll get this as a backup/sidegrade in case my 2060 Super dies in the next 18 months.
I do most of my PC gaming on high-end CRT monitors anyway, so I don't need much.
It's not worth going all out on hardware for me until Zen 6/RTX 6000, since I'm not looking forward to any games in the near future other than "Kingmakers".
 
Hyperbole like this is so tiring, and it's why I can't take a lot of PC people seriously.

The 5050 is not a good deal by any means, and I'm super critical of basically everything Nvidia does, but to equate its existence to an HDMI output is stupid. There are thousands of games that will run incredibly well on a 5050, and the most played games on Steam will all run well. Not everything needs to be some huge 4K 120 fps monster capable of path tracing. The beauty of PC lies in its flexibility.
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.
 
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.
As I said, I don't think it's a good deal, but to compare a card that is fully capable of providing a good experience in thousands of games, including modern games, and is more powerful/capable than the top GPU on Steam to a simple HDMI output is stupid.

I'm sure there is a situation where I would consider it worthwhile depending on budget. A quick look at my local Micro Center's stock of GPUs shows it's basically a 3050 for $329 or a 9060 XT 8GB for $299. I had a friend ask me about building a PC the other day to finally move on from his PS4, and he asked what he could get for $700-750. I would tell him to buy a 5050 over the 9060 easily in that budget, simply due to how good Nvidia's software is.
 
Coming To America Bar GIF



RT on a 50 series
 
I've got an old Alienware X51 that has been sitting for years and years.
How CPU-bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
 
I've got an old Alienware X51 that has been sitting for years and years.
How CPU-bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
I can't imagine a 3770K is very usable nowadays. I would recycle it unless you really just plan on playing old games.
 
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.

It's really for OEMs who want to hit a specific price point. That's why I mentioned IGP killer...
 
Just stopping by to wonder how this release is garnering this much attention on a gaming enthusiast forum. Is anyone here even in the market for this tier of card?
 
I can't imagine a 3770K is very usable nowadays. I would recycle it unless you really just plan on playing old games.
A 3770K can still play a bunch of games decently, for its age.
There are a lot of fake benchmark videos, but this one looks real. The system listed has an RX 7600, 16GB DDR3 and an i7 3770 (not the K version, but they have very similar performance at stock frequencies).


Obviously it will have problems with more CPU-demanding games and unoptimized software, even if the software is old.
For example, Venetica, a game from 2009, runs badly on the 3770K, a high-end CPU from 2012, especially when you reach Venice; it hammers a single core/thread like crazy. Add DXVK and use Vulkan instead of DX9 and it runs quite a lot better.
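If anyone wants to try that, the DXVK setup is basically just dropping its d3d9.dll next to the game's .exe. Here's a rough Python sketch of the copy step; the paths are just placeholders for wherever you extracted the DXVK release and wherever the game is installed:

```python
# Minimal sketch: copy DXVK's 32-bit d3d9.dll next to a DX9 game's executable.
# Both paths below are hypothetical examples, not real install locations.
import shutil
from pathlib import Path

dxvk_x32 = Path(r"C:\Downloads\dxvk-release\x32")   # x32 folder of an extracted DXVK release
game_dir = Path(r"C:\Games\Venetica")                # the game's install folder

shutil.copy2(dxvk_x32 / "d3d9.dll", game_dir / "d3d9.dll")
print("Copied d3d9.dll into", game_dir)
```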

It has AVX but lacks AVX2 support. Some games "require" AVX2 even if they don't really use those extensions. The same thing happened with AVX, and it was usually patched out, officially or unofficially.
But several recent games seem to really demand it. I know of Alan Wake 2, Assassin's Creed Shadows, Monster Hunter Wilds, God of War Ragnarok, FF XVI and FF VII Rebirth. Some have workarounds, but it's possible that there is a significant performance hit on top of just running the game.
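If you're not sure what your own CPU exposes, it's quick to check. A rough sketch assuming the third-party py-cpuinfo package (pip install py-cpuinfo); on Linux you could also just grep /proc/cpuinfo:

```python
# Rough check of CPU instruction-set flags via the third-party py-cpuinfo package.
import cpuinfo

flags = cpuinfo.get_cpu_info().get("flags", [])
print("AVX: ", "avx" in flags)
print("AVX2:", "avx2" in flags)   # a 3770K should report AVX but not AVX2
```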

A system with a 3770K is limited to PCIe 3.0 and DDR3. This 5050 is PCIe 5.0 with only 8 lanes and 8GB of VRAM. Running out of VRAM is bad; doing so with crippled bandwidth is worse.
But it could be fine while staying within that 8GB framebuffer.
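For a rough sense of how much the link shrinks, here's a back-of-the-envelope calculation using the usual theoretical per-lane rates after 128b/130b encoding overhead (approximate numbers, ignoring protocol overhead):

```python
# Approximate usable bandwidth for an x8 link on different PCIe generations.
# Per-lane rate = transfer rate (GT/s) * 128/130 encoding efficiency / 8 bits per byte.
lanes = 8
gens = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}   # GT/s per lane

for gen, gts in gens.items():
    gb_per_s = gts * (128 / 130) / 8 * lanes
    print(f"{gen} x{lanes}: ~{gb_per_s:.1f} GB/s")

# Prints roughly 7.9 / 15.8 / 31.5 GB/s, so the same x8 card gets about a
# quarter of its link bandwidth on a PCIe 3.0 board once it spills past 8GB.
```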
 
A 3770K can still play a bunch of games decently, for its age.
There are a lot of fake benchmark videos, but this one looks real. The system listed has an RX 7600, 16GB DDR3 and an i7 3770 (not the K version, but they have very similar performance at stock frequencies).


Obviously it will have problems with more CPU-demanding games and unoptimized software, even if the software is old.
For example, Venetica, a game from 2009, runs badly on the 3770K, a high-end CPU from 2012, especially when you reach Venice; it hammers a single core/thread like crazy. Add DXVK and use Vulkan instead of DX9 and it runs quite a lot better.

It has AVX but lacks AVX2 support. Some games "require" AVX2 even if they don't really use those extensions. The same thing happened with AVX, and it was usually patched out, officially or unofficially.
But several recent games seem to really demand it. I know of Alan Wake 2, Assassin's Creed Shadows, Monster Hunter Wilds, God of War Ragnarok, FF XVI and FF VII Rebirth. Some have workarounds, but it's possible that there is a significant performance hit on top of just running the game.

A system with a 3770K is limited to PCIe 3.0 and DDR3. This 5050 is PCIe 5.0 with only 8 lanes and 8GB of VRAM. Running out of VRAM is bad; doing so with crippled bandwidth is worse.
But it could be fine while staying within that 8GB framebuffer.

I had the 3770K. It was an extremely long-lasting CPU, but modern games are just too CPU-intensive (UE5 games or PS5/XSX ports). Even when the game ran at 45–50 fps, as with the TLOU1 remake for example, I experienced stuttering and the game wasn't very playable.

However, the 3770K and the RTX 5050 should provide a good gaming experience with pre-2022 PC games and should also be capable of running some modern games.
 
I've got an old Alienware X51 that has been sitting for years and years.
How CPU-bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
You'll see a massive difference even on this old 3770. I had this CPU, and when I replaced a GTX 680 with a 1080 Ti, I saw 5-6x scaling in many games.
 
Just stopping by to wonder how this release is garnering this much attention on a gaming enthusiast forum. Is anyone here even in the market for this tier of card?
Pretty sure half of them didn't even know there was an RTX xx50 card, nor that it was perfectly viable for gaming, and they were just shocked by the headline.
 
PC game devs need to accept that they have to make games that work well with 8 GB of VRAM for the foreseeable future. It is not going away anytime soon, if ever.
 
I've got an old Alienware X51 that has been sitting for years and years.
How CPU-bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?

Very, if you're talking bottlenecks.

You're better off either with a GTX 1660 Ti / 1070 at best, so both CPU and GPU hit 100% usage in games, or buying a new system. Get them used for cheap.

If you want this card, then just buy one of the cheap i5-12400F + B760 + 32 GB DDR5-6000 combo kits you see online and pair it with that.
 
PC game devs need to accept that they have to make games that work well with 8 GB of VRAM for the foreseeable future. It is not going away anytime soon, if ever.
Making sure that games work on 8GB and making sure medium or low textures look acceptable are two different things though. The problem is usually in the latter, not the former.

And then you can run into the issue of not being able to use your card's best features like DLSS or frame gen, because those also increase memory usage. So it's really on the devs not just to make sure their games work, but also to leave enough overhead for the GPU to use its own features.
 
130W is a pretty weird load. Not sure what sort of device that's aimed at.

The mobile version seems more reasonable, but still, 100W at the high end is a lot of draw.

The main benefit of something like the RTX 3050 (6GB) is that the power draw basically allows you to just shove it into an old office PC.

Can't really do that here.
 
On the desktop side, the RTX 5050 will start at $249, draw up to 130W of power
So still no successor to the GTX 1650. The only reasonably powered options for SFF PCs are the RTX A2000 or Ada2000, at an absurd price new, and also overpriced used.
Meanwhile AMD offers exactly nothing competitive, even though their APUs show that they could do okay-ish in the <70W / slot-power-only GPU niche.
 
Making sure that games work on 8GB and making sure medium or low textures look acceptable are two different things though. The problem is usually in the latter, not the former.

And then you can run into the issue of not being able to use your card's best features like DLSS or frame gen, because those also increase memory usage. So it's really on the devs not just to make sure their games work, but also to leave enough overhead for the GPU to use its own features.
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures; they just reduce the distance at which the higher-quality textures kick in.
 
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures; they just reduce the distance at which the higher-quality textures kick in.
Either that or they drop the quality haphazardly to soup levels to hit specific memory limits, because their main targets are console specs and they don't really take the time to make sure all presets look acceptable. Honestly, I can see arguments for both sides here. They should do better, but at the same time not all developers have the time, budget or even know-how to optimize to perfection for lower specs. Kinda how some just drop settings randomly until the Series S version sort of works and call it a day. Ain't nobody got time for that. Both Nvidia and AMD should do better and not constrain developers; it's time for 8GB to die for new GPUs meant for gaming.
 
They are still selling the <$200 RX 6600.
Power draw is around 120W.
I am talking about cards running at 75W, powered only via the PCIe slot.
No one offers them anymore outside of workstation stuff.
AMD has the Radeon Pro W7500 there. Cheaper than the Ada2000 but also worse.

Simple workstations will probably keep running on P620s as long as they are available used, if no one offers any worthwhile successors. And dumb companies will buy the A4000 or A6000 cards they get offered as workstations™, with performance most will never need, wasting energy idling around and wasting silicon.
 
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures; they just reduce the distance at which the higher-quality textures kick in.
That's true, at least at 1080p in Doom: The Dark Ages and Star Wars Outlaws, and it's somewhat relevant for Indiana Jones too. In Doom: The Dark Ages and Outlaws I've failed to notice any meaningful texture differences. In Indiana Jones I was able to see texture differences with zoom, but without zoom I was not.


It is possible there would be bigger differences at 4K, where VRAM is more strained, but that's not really relevant for me as I play at 1080p, so this is just my perspective as a 1080p user.

I actually did the Indiana Jones comparison because of how poor the trees looked in that scene. I figured this was finally the moment where I'd go "yeah, it probably does not load proper textures here". Imagine my confusion when I realized those trees also looked poor on GeForce Now. These findings have actually stopped me from getting a 4060 Ti or 5060 Ti 16GB. I will just wait for GTA 6 and then decide what to do.
 
I had the 3770K. It was an extremely long-lasting CPU, but modern games are just too CPU-intensive (UE5 games or PS5/XSX ports). Even when the game ran at 45–50 fps, as with the TLOU1 remake for example, I experienced stuttering and the game wasn't very playable.
TLOU1 was a notably bad port on PC; it took a while, but they fixed a lot of the performance issues. It was bad even on high-end modern hardware, running out of VRAM and crashing on 8GB cards. I think it was one of the games requiring AVX2, but they added an alternative executable for just AVX soon after launch.

But yeah, you are right. A lot of modern games are just too demanding for a 4-core/8-thread CPU from over a decade ago.
I still believe that a significant part is just lack of optimization; UE5 is particularly bad. Maybe UE 5.6+ will really improve things.

There are also games with really bad frame pacing in general, even when using their built-in frame limiters or just VSync, but using RTSS can help a lot.


However, the 3770K and the RTX 5050 should provide a good gaming experience with pre-2022 PC games and should also be capable of running some modern games.
Exactly.
 
Either that or they drop the quality haphazardly to soup levels to hit specific memory limits, because their main targets are console specs and they don't really take the time to make sure all presets look acceptable. Honestly, I can see arguments for both sides here. They should do better, but at the same time not all developers have the time, budget or even know-how to optimize to perfection for lower specs. Kinda how some just drop settings randomly until the Series S version sort of works and call it a day. Ain't nobody got time for that. Both Nvidia and AMD should do better and not constrain developers; it's time for 8GB to die for new GPUs meant for gaming.
We're pretty far into the gen, with many performance-heavy games, for both right and wrong reasons, having come out. So far I haven't seen any drastic differences between 8GB and 12GB+ VRAM cards, definitely nothing like what people were claiming 3-5 years ago would happen. Some games may not work properly at ultra/epic/cinematic/whatever textures on 8GB cards, but one notch below does, and it hardly looks bad or even different.
 
And the comparisons to the RTX cards seem very flawed, as several games are skewed to AMD.
And there seem to be some results that simply don't match up for the 3060, as it should be 8-10% slower than a 2070S. But some of your pictures show a difference of 15% or more.
That's an average based on an aggregate, imperfect differential from TechPowerUp. Besides Frontiers of Pandora, which game is AMD-skewed?

The RTX 3060 isn't equal to the PS5, full stop. It's equal to the 2070. We established years ago that to get console-like performance, you need a 2070S-class card with 10GB+ of VRAM. There's nothing controversial about what I said, so I'm not sure why you insist on saying it's an RTX 3060-level card when the 3060 loses almost every time by a noticeable margin.
 
That's an average based on an aggregate, imperfect differential from TechPowerUp. Besides Frontiers of Pandora, which game is AMD-skewed?

The most notable AMD-skewed games are CoD and some AC titles.
But then there are the games that are just bad ports, which affect both brands. Several Sony ports fit into this category.

The RTX 3060 isn't equal to the PS5, full stop. It's equal to the 2070. We established years ago that to get console-like performance, you need a 2070S-class card with 10GB+ of VRAM. There's nothing controversial about what I said, so I'm not sure why you insist on saying it's an RTX 3060-level card when the 3060 loses almost every time by a noticeable margin.

Only in select games is the 2070S a match for the PS5.
You are taking very flawed data from DF to make such a comparison.
 
The main benefit of something like the RTX 3050 (6GB) is that the power draw basically allows you to just shove it into an old office PC.

Can't really do that here.

Power draw is around 120W.
I am talking about cards running at 75W, powered only via the PCIe slot.
No one offers them anymore outside of workstation stuff.
AMD has the Radeon Pro W7500 there. Cheaper than the Ada2000 but also worse.

Simple workstations will probably keep running on P620s as long as they are available used, if no one offers any worthwhile successors. And dumb companies will buy the A4000 or A6000 cards they get offered as workstations™, with performance most will never need, wasting energy idling around and wasting silicon.

Yeah, it's too bad no one wants to make a cheap GPU that's powered only by the PCIe slot. I have an old office PC sitting on my bench that I'd put a 5050 in if it were lower power.
 
We really need gaming APUs to fix this mess. If they were like $100 more than the equivalent 8-core with small graphics, they could fill this huge gap in the budget range. Plus they could go on integrated boards and get prices a little cheaper, even.
 