12VHPWR on RTX 5090 is Extremely Concerning

What has always bothered me the most about the controversial 12VHPWR connector is the people who defended it and blamed other users for "not knowing how to connect a cable". Fanboyism sucks, and I say that as an Nvidia user for several generations.
 
Damn, this shit is stressful.
I have an Asus TUF Gaming 1200W Gold PSU, and I feel like a detective checking which ATX spec this thing actually has..
And does it even matter??

The info from the store where I bought it says it's ATX 3.0, and there's no mention of it being ATX 3.1 anywhere except on the Asus page.. which should make me feel safe, but why is the 3.0 info everywhere else??

Damn, this shit makes me paranoid😂
 
My 4090, while OC'd, pulls 570 watts alone (that's what HWINFO64 shows). I have an MSI ATX 3, 1300 W PSU, with the 12VHPWR cable running directly from the PSU to the GPU. Never had an issue. But the thought of this thing pulling even more current is worrying as hell. I don't think I will be able to game without looking at my GPU every 5 minutes.
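If you'd rather watch the numbers than the card itself, a small script can log the draw for you. This is a minimal sketch using the pynvml package (nvidia-ml-py); device index 0 and the 5-second poll interval are assumptions, so adjust for your setup:

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the card is GPU 0

try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000           # NVML reports milliwatts
        limit = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
        print(f"GPU power draw: {watts:.0f} W (limit {limit:.0f} W)")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()

It only reports total board power, so it won't catch a single pin overheating, but it's less tedious than staring at the card every five minutes.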
 
Ok, it says ATX 3.1 on the Thermaltake Toughpower SFX 1000W PSU, so maybe I am ok
Oh no you're not. Did you know the latest driver update was just pulled because some GPUs were bricked?

Nah, just messing with you. Nvidia has been quiet about the driver issue other than acknowledging it.
 
Maybe it will just have 3 8pin connectors. I have not seen any rumors yet.
didn't the 4070 also use 12VHPWR? I thought every 4000 series card used that

either way, it's just not good looking. I want to upgrade to Nvidia but i'm not ready to spend twice what I wanted to pay just for a backup GPU
 
My 4070 has two 8-pin connectors. It also doesn't require much power, though.
 
What a time to be alive. Nintendo users are hoping for 30 fps most of the time at sub-native res, while PC enthusiasts have to take out proper house insurance for their GPU. Funny thing is, there's plenty of overlap between the two groups.
 
With GPUs this expensive, drawing this much power, such a flimsy, cheap connector is a terrible idea. Especially when these issues were already present on the 4000 series.
Nvidia has no respect for gamers.
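For context on how thin the margin actually is, some rough back-of-the-envelope math. The pin count and the ~9.5 A per-pin figure below are assumptions based on the Micro-Fit+ terminals the connector is commonly said to use, so treat the result as ballpark:

# Rough per-pin load on a 12VHPWR / 12V-2x6 connector at a 600 W draw.
# Assumptions: 6 current-carrying 12 V pin pairs, ~9.5 A rating per pin.
watts = 600
volts = 12.0
power_pins = 6
rating_per_pin = 9.5

amps_per_pin = watts / volts / power_pins  # ~8.3 A per pin if current shares evenly
headroom = rating_per_pin / amps_per_pin   # ~1.14x safety margin
print(f"{amps_per_pin:.1f} A per pin, {headroom:.2f}x headroom vs the rating")

If a couple of pins make poor contact and the current stops sharing evenly, the remaining pins get pushed past that rating, which is exactly the kind of failure people keep photographing.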
 
Does anyone understand whether this info means the PSU is ATX 3.0 or 3.1?
Sorry for the (probably) dumb question..
(attached image of the PSU's spec listing)
It's because the differences between 3.0 and 3.1 are so minor that many 3.0 units already meet or exceed the 3.1 spec, so the manufacturers have jumbled it all up.

A "real" 3.1 will say 12V-2x6 instead of 12VHPWR for the connector, but that's just a matter of pin length at the connector, and the cable itself would work in either plug.
 
I don't have any Nvidia stuff at the moment (except a 1650 in a laptop), but it was pretty obvious that a lot of it was user error. There are a lot of 'enthusiasts' out there with minimal skills but overinflated egos.
 
It's going to take some politician's house getting burned down by a 5090 for there to be any chance of Nvidia being taught a lesson. Oh, and it has to be a Republican's house, because Democrats would just lull people to sleep with kind words like "this is outrageous!" instead of saying "Nvidia cooked my children!!".
 
One potential "safety measure" would be connecting the 5090 via only some of the 8-pin connectors on the adapter that comes with the card?
Wasn't that reported to lower power usage down to 450 W while only losing about 5% performance?
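A software-side way to get roughly the same effect would be capping the power limit directly. Another minimal pynvml sketch; the 450 W target is just the figure mentioned above, and the set call needs admin/root privileges plus a driver that allows it:

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the 5090 is GPU 0

target_watts = 450  # the figure mentioned above; pick your own
current = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
print(f"Current limit: {current:.0f} W")

# NVML takes milliwatts; this call requires elevated privileges.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_watts * 1000)
print(f"New limit: {target_watts} W")

pynvml.nvmlShutdown()

The same thing can be done from the command line with nvidia-smi -pl 450, and either way the card still pulls that power through the same connector, so it limits the load rather than fixing the contact problem.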
 
Another one:



Not a report on the same case; HIS cable burned too, with the 5090 FE.

Hm, seems to be a Founders Edition issue.
No reports of this happening with partner cards?

Maybe they should have kept the big form factor with a bunch of extra cooling..
 
Well, even if I had the idea creeping up in the back of my mind that I should maybe go for a 5090 months from now if I were able to find one, this shuts that down completely. Seems like the 5080 will be the only viable choice for a couple more years. 😔
 