adamsapple
Or is it just one of Phil's balls in my throat?
starts at $249
Surejan.gif
starts at $249
What's the legit use case for these?
Why would you buy a card knowing that it won't run anything.
Budget PCs and laptops. And the xx50 isn't a bad card by any means. It can run pretty much everything in the market, including the unoptimized mess of UE5 games, at high settings.
Sounds like a straight up HDMI port situation where it exists to give you a video output and nothing more.
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.
Hyperbole like this is so tiring and why I can't take a lot of PC people seriously.
The 5050 is not a good deal by any means and I'm super critical of basically everything Nvidia does, but to equate its existence to being an HDMI input is stupid. There are 1000s of games that will run incredibly well on a 5050, and the most played games on Steam will all run well. Not everything needs to be some huge 4K 120 fps monster capable of path tracing. The beauty of PC lies in its flexibility.
As I said, I don't think it's a good deal, but to compare a card that is fully capable of providing a good experience in 1000s of games, including modern games, and is more powerful/capable than the top GPU on Steam to a simple HDMI output is stupid.
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.
I can't imagine a 3770K is very usable nowadays. I would recycle it unless you really just plan on playing old games.
I've got an old Alienware X51 that has been sitting for years and years.
How CPU bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
Recycle it you heathen.
Throw that old thing in the trash?
Would you recommend it to anyone in any possible situation? I'm struggling to think of one it's suited for. Seems like this exists for PC builders to put into boxes to scam uninformed buyers.
What's funny is that the Switch 2 is not a 3050, let alone a 5050; it's more like a 3030, and it retails for $450. The 5050 is an absolute monster in comparison.
The kind of thing you see on the shelf near the checkout lanes while you stand in line. Sure, I'll grab a Coke, a candy bar and a 5050.
A 3770K can still play a bunch of games decently, for its age.
I can't imagine a 3770K is very usable nowadays. I would recycle it unless you really just plan on playing old games.
A 3770K can still play a bunch of games decently, for its age.
There are a lot of fake benchmark videos, but this one looks real. The system listed has an RX 7600, 16GB DDR3 and an i7 3770, not the K version, but they have very similar performance at stock frequencies.
Obviously it will have problems with the more CPU demanding games and unoptimized software, even if it's old.
For example, Venetica, a game from 2009, runs badly on the 3770K, a high-end CPU from 2012, especially when you reach Venice. It hammers a single core/thread like crazy. Add DXVK and use Vulkan instead of DX9 and it runs quite a lot better.
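For anyone who hasn't used DXVK before: it ships replacement D3D DLLs that you drop next to the game's executable, so the game's DirectX calls get translated to Vulkan. A minimal sketch of that install step, assuming a downloaded DXVK release and with purely illustrative paths:

```python
# Minimal sketch: place DXVK's 32-bit d3d9.dll next to a DX9 game's exe
# so its Direct3D 9 calls are translated to Vulkan.
# Both paths below are made up; point them at your own DXVK release
# folder and game install directory.
import shutil
from pathlib import Path

dxvk_x32 = Path(r"C:\Downloads\dxvk-2.4\x32")   # hypothetical DXVK release folder
game_dir = Path(r"C:\Games\Venetica")           # hypothetical game folder

shutil.copy2(dxvk_x32 / "d3d9.dll", game_dir / "d3d9.dll")
print("DXVK d3d9.dll in place; the game should now render through Vulkan.")
```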
It has AVX but it lacks AVX2 support. Some games "require" AVX2 even if they don't really use those extensions. The same thing happened with AVX, and the requirement was usually patched out officially or unofficially.
But several recent games seem to really demand it. I know of Alan Wake 2, Assassin's Creed Shadows, Monster Hunter Wilds, God of War Ragnarok, FF XVI and FF VII Rebirth. Some have workarounds, but it's possible that there is a significant performance hit on top of just running the game.
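If you want to check what an old CPU actually reports before buying a game that lists AVX2, you can read its feature flags. A minimal sketch using the third-party py-cpuinfo package (an assumption; install it with pip first). On an Ivy Bridge 3770K it should show AVX but not AVX2:

```python
# Minimal AVX/AVX2 capability check for the current CPU.
# Requires the third-party py-cpuinfo package: pip install py-cpuinfo
from cpuinfo import get_cpu_info

flags = set(get_cpu_info().get("flags", []))
print("AVX :", "yes" if "avx" in flags else "no")    # 3770K (Ivy Bridge): yes
print("AVX2:", "yes" if "avx2" in flags else "no")   # 3770K: no; Haswell and newer: yes
```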
A system with a 3770K is limited to PCIe 3.0 and DDR3. This 5050 is PCIe 5.0 with only 8 lanes and 8GB of VRAM. Running out of VRAM is bad; doing so with crippled bandwidth is worse.
But it could be fine while within that 8GB framebuffer.
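To put rough numbers on the bandwidth point (back-of-the-envelope math, assuming the card drops to PCIe 3.0 signaling in an Ivy Bridge board and ignoring protocol overhead):

```python
# Rough PCIe bandwidth for an x8 card in different slot generations.
# 8 GT/s (PCIe 3.0) and 32 GT/s (PCIe 5.0) per lane, 128b/130b encoding,
# protocol overhead ignored.
per_lane_gbs = {
    "PCIe 3.0": 8 * 128 / 130 / 8,    # ~0.99 GB/s per lane
    "PCIe 5.0": 32 * 128 / 130 / 8,   # ~3.94 GB/s per lane
}

for gen, gbs in per_lane_gbs.items():
    print(f"{gen} x8: ~{gbs * 8:.1f} GB/s")
# PCIe 3.0 x8: ~7.9 GB/s   <- what a 5050 would get in a 3770K system
# PCIe 5.0 x8: ~31.5 GB/s  <- what the card was designed around
```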
You'll see a massive difference even on this old 3770. I had this CPU, and when I replaced my GTX 680 with a 1080 Ti, I saw 5-6x scaling in many games with this CPU.
I've got an old Alienware X51 that has been sitting for years and years.
How CPU bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
Pretty sure half of them didn't even know there was an RTX xx50 card, nor that it was perfectly viable for gaming, and were just shocked by the headline.
Just stopping by to wonder how this release is garnering this much attention from a gaming enthusiast forum? Is anyone here even in the market for this tier card?
PC game devs need to accept that they have to make games that work well with 8 GB of VRAM for the foreseeable future. It is not going away anytime soon, if ever.
I've got an old Alienware X51 that has been sitting for years and years.
How CPU bound would an old Intel 3770K with 16GB DDR3 be? I could rip out the old GTX 660 and fit a 5050 in there. The 330W power brick might be enough.
Yay? Nay? Throw that old thing in the trash?
Making sure that games work on 8GB and making sure medium or low textures look acceptable are two different things though. The problem is usually in the latter, not the former.
PC game devs need to accept that they have to make games that work well with 8 GB of VRAM for the foreseeable future. It is not going away anytime soon, if ever.
130W is a pretty weird load. Not sure what sort of device that's aimed at.
The mobile version seems more reasonable, but 100W at the high end is still a lot of draw.
So still no successor to the GTX 1650. The only reasonably powered possibilities for SFF PCs are the RTX A2000 or Ada 2000, for an absurd price new, and used they're also overpriced.
On the desktop side, the RTX 5050 will start at $249, draw up to 130W of power
They are still selling the RX 6600 for under $200.
Meanwhile AMD offering exactly nothing
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures, they just reduce the distance at which the higher quality textures kick in.
Making sure that games work on 8GB and making sure medium or low textures look acceptable are two different things though. The problem is usually in the latter, not the former.
And then you can run into the issue of not being able to use your card's best features, like DLSS or frame gen, because those also increase memory usage. And it's really not on the devs to make sure their games not only work but also leave enough overhead for the GPU to use its own features.
Either that or they drop the quality haphazardly to soup levels to hit specific memory limits, because their main targets are console specs and they don't really take the time to make sure all presets look acceptable. Honestly, I can see arguments for both sides here. They should do better, but at the same time not all developers have the time, budget or even know-how to optimize to perfection for lower specs. Kinda how some just drop settings randomly until the Series S version sort of works and call it a day. Ain't nobody got time for that. Both Nvidia and AMD should do better not to constrain developers; it's time for 8GB to die for new GPUs meant for gaming.
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures, they just reduce the distance at which the higher quality textures kick in.
Power draw around 120W.
They are still selling the RX 6600 for under $200.
That's true, at least at 1080p in Doom: The Dark Ages and Star Wars Outlaws, and it's somewhat relevant for Indiana Jones too. In Doom: The Dark Ages and Outlaws I've failed to notice any meaningful texture differences. In Indiana Jones I was able to see texture differences with zoom, but without zoom I was not able to.
It's less about making sure textures look acceptable and more about doing proper texture streaming. Modern games, from what I've seen, don't really drop the quality of textures, they just reduce the distance at which the higher quality textures kick in.
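The "reduce the distance at which high-quality textures kick in" behaviour described above is easy to picture as code. A toy sketch, with invented names and numbers rather than any particular engine's streamer: pick a mip level per object from distance, then bias everything toward lower-resolution mips until the resident set fits the VRAM budget.

```python
import math

# Toy model of distance-based texture streaming under a VRAM budget.
# All figures are made up for illustration; real engines are far more involved.

def desired_mip(distance_m: float, full_res_within_m: float = 10.0) -> int:
    """Mip 0 up close, one coarser mip level per doubling of distance."""
    return max(0, int(math.log2(max(distance_m, full_res_within_m) / full_res_within_m)))

def resident_bytes(mip0_bytes: int, mip: int) -> int:
    """Each mip level needs roughly a quarter of the memory of the one above."""
    return mip0_bytes >> (2 * mip)

objects = [("wall", 3.0), ("statue", 25.0), ("mountain", 400.0)]  # name, distance in metres
mip0 = 64 * 1024 * 1024      # 64 MB per object's full-res texture set (invented)
budget = 48 * 1024 * 1024    # VRAM left over for these textures (invented)

bias = 0
while sum(resident_bytes(mip0, desired_mip(d) + bias) for _, d in objects) > budget:
    bias += 1  # tighter budget -> high-res mips only "kick in" at shorter distances

for name, d in objects:
    print(name, "-> mip", desired_mip(d) + bias)
```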
Seems the only way is an APU.
I am talking about cards running on 75W, only via the PCIe slot
Yikes, marketing bullshit x)
5050 ~ 4060
Seems the only way is an APU.
NV marketing always was BS. Just interesting how NV makes 5050 = 4060 when the 5050 has fewer cores, despite there being zero IPC progress between Ada and Blackwell.
Yikes, marketing bullshit x)
TLOU1 was a notably bad port on PC; it took a while, but they fixed a lot of the performance issues. It was bad even on high-end modern hardware, running out of VRAM and crashing with 8GB cards. I think it was one of the games requiring AVX2, but they added an alternative executable for just AVX soon after launch.
I had the 3770K. It was an extremely long-lasting CPU, but modern games are just too CPU intensive (UE5 games or PS5/XSX ports). Even when the game ran at 45–50 fps, as with the TLOU1 remake for example, I experienced stuttering and the game wasn't very playable.
Exactly.
However, the 3770K and the RTX 5050 should provide a good gaming experience with pre-2022 PC games and should also be capable of running some modern games.
We're pretty far into the gen, with many performance-heavy games (for both right and wrong reasons) having come out. So far I haven't seen any drastic differences between 8GB and 12GB+ VRAM cards, definitely nothing like what people 3-5 years ago were claiming would happen. Some games may not work properly at ultra/epic/cinematic/whatever textures on 8GB cards, but one notch below does, and it hardly looks bad or even different.
Either that or they drop the quality haphazardly to soup levels to hit specific memory limits, because their main targets are console specs and they don't really take the time to make sure all presets look acceptable. Honestly, I can see arguments for both sides here. They should do better, but at the same time not all developers have the time, budget or even know-how to optimize to perfection for lower specs. Kinda how some just drop settings randomly until the Series S version sort of works and call it a day. Ain't nobody got time for that. Both Nvidia and AMD should do better not to constrain developers; it's time for 8GB to die for new GPUs meant for gaming.
That's an average based on an aggregate, imperfect differential from TechPowerUp. Besides Frontiers of Pandora, which game is AMD-skewed?
And the comparisons to the RTX cards seem very flawed, as several games are skewed to AMD.
And there seem to be some results that simply don't match up for the 3060, as it should be 8-10% slower than a 2070S, but some of your pictures show a difference of 15% or more.
Agreed. But it looks like with no frame gen, it's around the same speed as a 4060. Better than I expected, to be honest.
Yikes, marketing bullshit x)
That's an average based on an aggregate, imperfect differential from TechPowerUp. Besides Frontiers of Pandora, which game is AMD-skewed?
The RTX 3060 isn't equal to the PS5, full stop. It's equal to the 2070. We established years ago that to get console-like performance, you need a 2070S-class card with 10GB+ of VRAM. There's nothing controversial about what I said, so I'm not sure why you insist on saying it's an RTX 3060-level card when the 3060 loses almost every time by a noticeable margin.
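For what it's worth on the "aggregate differential" point: site-wide relative-performance figures are usually a mean of per-game ratios, so single titles can sit well above or below the headline number. A tiny illustration with made-up fps values, just to show how an ~11% average can coexist with individual 15%+ gaps:

```python
# Why an aggregate "card A is ~X% slower than card B" figure can disagree
# with individual game results. All fps values below are invented.
from statistics import geometric_mean

fps = {                  # (card_a, card_b) average fps per game
    "Game 1": (55, 60),
    "Game 2": (70, 82),
    "Game 3": (48, 50),
    "Game 4": (90, 108),
}

ratios = [a / b for a, b in fps.values()]
print(f"aggregate: card A is ~{(1 - geometric_mean(ratios)) * 100:.0f}% slower")
for game, (a, b) in fps.items():
    print(f"  {game}: {(1 - a / b) * 100:.0f}% slower")
```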
That article hugely exaggerates the actual benefits of it.
Waiting for RTX 6090 with 1GB VRAM
AMD researchers reduce graphics card VRAM capacity of 3D-rendered trees from 38GB to just 52 KB with work graphs and mesh nodes — shifting CPU work to the GPU yields tremendous results
What would take 38GB of VRAM to hold now takes just a measly 52KB. (www.tomshardware.com)
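As I understand it, the trick behind that headline is keeping only per-tree generation parameters resident and letting the GPU expand the actual geometry on the fly through work graphs/mesh nodes, instead of holding the fully expanded meshes in VRAM. Rough memory math with purely invented counts (not the paper's actual numbers):

```python
# Back-of-the-envelope: fully expanded tree meshes resident in VRAM vs.
# only per-tree generation parameters, expanded on the GPU when needed.
# Every count below is invented for illustration.
TREES          = 10_000
VERTS_PER_TREE = 128_000          # fully expanded mesh, made-up figure
BYTES_PER_VERT = 32               # position + normal + UV, roughly

expanded_gb = TREES * VERTS_PER_TREE * BYTES_PER_VERT / 1024**3

PARAMS_PER_TREE = 4               # e.g. seed, species, height, bend
BYTES_PER_PARAM = 4
procedural_kb = TREES * PARAMS_PER_TREE * BYTES_PER_PARAM / 1024

print(f"expanded meshes resident in VRAM: ~{expanded_gb:.0f} GB")
print(f"per-tree parameters only:         ~{procedural_kb:.0f} KB")
```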
I can't either, but I find them less pretentious than whatever the hell the "tech Jesus" channel is called.
Watched about 2-3 mins of it and then was done.
I can't really stand HWUB personally.
Your wrong sum triggers me more than it should.
Ordered 2 for SLI = RTX 1100.
The main benefit of something like the RTX 3050 (6GB) is that the power draw basically allows you to just shove it in an old office PC.
Can't really do that here.
Power draw around 120W.
I am talking about cards running on 75W, only via the PCIe slot
No one offers them anymore outside of workstation stuff.
AMD has the Radeon Pro W7500 there. Cheaper than the Ada 2000, but also worse.
Simple workstations will probably keep running on used P620s as long as they are available, if they don't offer any worthwhile successors. And dumb companies will buy the A4000 or A6000 cards they get offered as workstations™, with performance most will never need, wasting energy idling around and wasting silicon.