12VHPWR on RTX 5090 is Extremely Concerning

I tell you, this 50 series launch keeps getting better and better. IF Nvidia was trying to convince me not to get a 50 series card, they convinced me! I will stick with my old reliable 40 series.
 
23 amps through a single wire (rated for 10) and over 100°C in 10 minutes on a testbed with no known issues. That's a bit wow.

This has a chance of becoming a recall issue in the EU at least.
 
Impressive that the Astral is the only card with sensors on this connector to give you an actual warning if the cable is heating up or not connected properly.

Expensive or not, it's the safest card to buy it seems.

Glad I picked that one when I ordered it from Best Buy (it still hasn't shipped lol).
 
Outside of memes for a minute, I saw some interesting counterpoints from good sources



and Jonny Guru, THE Jonny fucking Guru, probably the #1 PSU reference worldwide, director of R&D of Corsair's PSU unit, with huge experience:



If it was that hot, he wouldn't be able to hold it in his hand. I don't know what his IR camera was measuring, but as Aris pointed out.... that wire would've melted. I've melted wires with a lot less current than that.
Also, the fact that the temperature at the PSU is hotter than the GPU is completely backwards from everything I've ever tested. And I've tested a lot. Right now I have a 5090 running Furmark 2 for an hour so far and I have 46.5°C at the PSU and 64.2°C at the GPU in a 30°C room. The card is using 575.7W on average.
Der8auer is smart. He'll figure things out sooner rather than later. I just think his video was too quick and dirty. Proper testing would be to move those connectors around the PSU interface. Unplug and replug and try again. Try another cable. At the very least, take all measurements at least twice. He's got everyone in an uproar and it's really all for nothing. Not saying there is no problem. I personally don't like the connector, but we don't have enough information right now and shouldn't be basing assumptions on some third-party cable from some Hong Kong outfit.

Maybe his setup is borked.

Will be interesting if GamersNexus drops in again with the answer like last time.
 
I just ran temps on my cable and connector under heavy load, 5090FE.

The cables nearest the connector and the exhaust got to 34°C, the connector 40°C, and the cable section at the bottom of the case was 28°C.
 
Outside of memes for a minute, I saw some interesting counterpoints from good sources

and Jonny Guru, THE Jonny fucking Guru, probably the #1 PSU reference worldwide, director of R&D of Corsair's PSU unit, with huge experience:

Maybe his setup is borked.

Will be interesting if GamersNexus drops in again with the answer like last time.


YouTubers…
GamersNexus is another YouTuber, FYI.
 
I'm an electrical wiring designer by trade. The pins in these GPU connectors are very similar (potentially identical) to the Molex Mini-Fit Jr family. The largest wire they can accept and stay within industrial code is 18 AWG. In his video, he measured 23 A (!!!!) going through a single one of these. Personally, I won't use Mini-Fit pins for anything other than inputs, instrument control voltage, and maybe the power for a button's ring light. Carrying an actual load through them is insanity. I don't see why they don't just move up to any number of higher-rated connector families. The cost difference is negligible.
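To put rough numbers on why that 23 A reading is so far out of line, here's a quick back-of-the-envelope sketch in Python. The 600 W board power, six current-carrying 12 V pins, and ~9.5 A per-pin rating are assumed ballpark figures for illustration, not official spec values:

```python
# Back-of-the-envelope per-pin current check for a 12VHPWR-style connector.
# Assumed, approximate numbers: 600 W board power, 12 V rail, six
# current-carrying 12 V pins, and a ~9.5 A per-pin rating.

BOARD_POWER_W = 600.0
RAIL_V = 12.0
LIVE_PINS = 6
PIN_RATING_A = 9.5

total_current = BOARD_POWER_W / RAIL_V        # ~50 A total on the 12 V side
ideal_per_pin = total_current / LIVE_PINS     # ~8.3 A if perfectly balanced
headroom = PIN_RATING_A / ideal_per_pin       # ~1.14x -- very little margin

measured_worst_pin = 23.0                     # figure quoted from the video
print(f"total current:      {total_current:.1f} A")
print(f"ideal per-pin:      {ideal_per_pin:.1f} A")
print(f"headroom vs rating: {headroom:.2f}x")
print(f"worst measured pin: {measured_worst_pin:.1f} A "
      f"({measured_worst_pin / PIN_RATING_A:.1f}x the assumed per-pin rating)")
```

Even with perfect current sharing there is barely any headroom, which is why one bad contact is enough to push a single wire well past its rating.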
 
My cable is still sitting in the Corsair box. Bought that PSU over a year ago in anticipation of a new GPU (at a reasonable price).
I have an RTX 4070 sipping power from a 1200-watt PSU.

Are there any details out there about how much, say, a 5090 build would add to one's electricity bill per month?
Take the wattage (600 W) and multiply by your electricity cost.
In my area it's 11.5 cents per kWh.
600 W / 1000 = 0.6 kW, so 0.6 kWh per hour × 11.5 cents per kWh ≈ 7 cents an hour.
If I were to play a big open-world game for 100 hours, that's 100 × $0.07, or about $7.
$7 just to run a 5090 as fast as it can go for 100 hours.
That poor cable never stood a chance.
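For anyone who wants to plug in their own numbers, here is the same arithmetic as a tiny Python helper; the 600 W draw and 11.5 ¢/kWh rate are just the figures from the post above:

```python
# Energy cost = (watts / 1000) * hours * price per kWh.
# 600 W and $0.115/kWh are the example numbers from the post; swap in your own.

def gpu_energy_cost(watts: float, hours: float, usd_per_kwh: float) -> float:
    kwh = watts / 1000.0 * hours
    return kwh * usd_per_kwh

print(f"per hour:  ${gpu_energy_cost(600, 1, 0.115):.3f}")    # ~$0.069
print(f"100 hours: ${gpu_energy_cost(600, 100, 0.115):.2f}")  # ~$6.90
```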
 
Why even buy this? What AMAZING new games are coming out this year? Weird looking doom game? AC shadows? MH weeb game? Lmao.

Max graphics settings + Native 4k resolution + high frame rate (above 100fps consistently). You can have two of those three things with lower tier cards, but if you want all three at the same time, you need the most powerful card you can get.

I think with an RTX 4090 if you crank Cyberpunk to the max in everything at 4k, with no frame generation on, it runs in the high 60s to low 80s most of the time. If you own a 4K gaming monitor, it can feel like you're missing out. PC gaming is a helluva drug.

I'm on a QHD 165hz monitor with an RTX 3080. My brain is not capable of detecting any difference above 90fps. I can barely notice any difference between 60 and 90. I'm just not sensitive to the higher refresh rates like some people are (my son for example). Yet I still feel this little itch telling me I should be playing all games at a rock solid 120fps, "just to be sure". It's totally dumb. I cannot justify it logically at all. But when I see a game that's not running buttery smooth, it gives me a tiny bit of disappointment and anxiety. The idea of eliminating that feeling by getting a shiny new toy is alluring for some reason. When I am able to buy an RTX 5080, I will. Not because I need it, not because it will do me any good, but because it will feel "right". And yes, I am part of the problem.
 
The secret to amazing PC gaming performance is a 1080p/240hz monitor. No single hardware purchase gave me a better FPS boost. Dumping 4k for a decade is a wise move. We can circle back in 2030 and see if the mass market hardware is there yet.
 
The secret to amazing PC gaming performance is a 1080p/240hz monitor. No single hardware purchase gave me a better FPS boost. Dumping 4k for a decade is a wise move. We can circle back in 2030 and see if the mass market hardware is there yet.
I'm on 1440p 360Hz now.

I did try 4K a while back and yeah, I could tell the difference, but after a couple of days I didn't care and my overall performance took a hit because of how much more demanding it was. I returned it for a 1440p 144Hz monitor, then recently upgraded to 360Hz.

My previous card was a 4080 and it was amazing of course, but going to 360Hz was too much for it. The only game I could play at 360Hz was Valorant. That's why I wanted to get a 5090. With the 5090 I can play at 360 fps in Cyberpunk with everything maxed out. Yes, it's "fake frames", but I can't tell any difference. Feels so much better.
 
I'm on 1440p 360Hz now.

I did try 4K a while back and yeah, I could tell the difference, but after a couple of days I didn't care and my overall performance took a hit because of how much more demanding it was. I returned it for a 1440p 144Hz monitor, then recently upgraded to 360Hz.

My previous card was a 4080 and it was amazing of course, but going to 360Hz was too much for it. The only game I could play at 360Hz was Valorant. That's why I wanted to get a 5090. With the 5090 I can play at 360 fps in Cyberpunk with everything maxed out. Yes, it's "fake frames", but I can't tell any difference. Feels so much better.
I did the opposite.
Had a 1440p 360hz OLED, and changed it for a 4K 240hz OLED.
I can't tell the difference between 240 and 360, but I enjoy the sharper picture on the 4K screen, both in games and in Windows and various software and stuff.
 
The secret to amazing PC gaming performance is a 1080p/240hz monitor. No single hardware purchase gave me a better FPS boost. Dumping 4k for a decade is a wise move. We can circle back in 2030 and see if the mass market hardware is there yet.
I just wish there were BIGGER 1080p/high-refresh displays. I'd get a 1080p/240Hz 50" before I'd get a 1440p or 4K display.
 
I think it's insane that people are paying $2K for a GPU that can't even run every game currently released at native 4K max settings and achieve 60 fps. There are still games where DLSS is necessary.

Please tell me why you are supporting Nvidia with these clear price gouging tactics? PC gamers with more money than brain cells are the reason why they keep doing this.
 
I'm an electrical wiring designer by trade. The pins in these GPU connectors are very similar (potentially identical) to the Molex Mini-Fit Jr family. The largest wire they can accept and stay within industrial code is 18 AWG. In his video, he measured 23 A (!!!!) going through a single one of these. Personally, I won't use Mini-Fit pins for anything other than inputs, instrument control voltage, and maybe the power for a button's ring light. Carrying an actual load through them is insanity. I don't see why they don't just move up to any number of higher-rated connector families. The cost difference is negligible.
Yeah. They were trying to answer the DIY community's desire for a smaller cable that would look better in the build. But they fucked up big time doing that, and also didn't put the fucking connector in a better position for cable management. Why the hell did they think that just making a smaller connector was a good enough solution? If they had kept the 2x 8-pin setup but enforced a different position for the connector on the board, it would have been more than enough for most setups to look cleaner.
 
Yeah. They were trying to answer the DIY community's desire for a smaller cable that would look better in the build. But they fucked up big time doing that, and also didn't put the fucking connector in a better position for cable management. Why the hell did they think that just making a smaller connector was a good enough solution? If they had kept the 2x 8-pin setup but enforced a different position for the connector on the board, it would have been more than enough for most setups to look cleaner.
Or just run four 10 AWG wires and have all the power their hearts desire.
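Out of curiosity, here is a rough I²R comparison of the stock six-wire harness (typically 16 AWG) against the four-wire 10 AWG idea from the post. The copper resistances are standard per-metre values; the 0.6 m length, 600 W / 12 V load, and perfect current sharing are assumptions for illustration:

```python
# Rough I^2*R loss comparison: six 16 AWG wires vs four 10 AWG wires.
# Standard copper resistance per metre; 0.6 m length and a 600 W / 12 V load
# are illustrative assumptions, with perfect current sharing.

OHM_PER_M = {"16awg": 0.01317, "10awg": 0.003277}
LENGTH_M = 0.6
TOTAL_A = 600.0 / 12.0  # ~50 A

def per_wire_loss_w(gauge: str, n_wires: int) -> float:
    current = TOTAL_A / n_wires
    resistance = OHM_PER_M[gauge] * LENGTH_M
    return current * current * resistance

for gauge, n in [("16awg", 6), ("10awg", 4)]:
    print(f"{n} x {gauge}: {TOTAL_A / n:.1f} A per wire, "
          f"{per_wire_loss_w(gauge, n):.2f} W dissipated per wire")
```

The heavier conductors would run cooler even while carrying more current each, which is the poster's point.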
 
It's because the differences between ATX 3.0 and 3.1 are very minor; many 3.0 units already meet or exceed 3.1, so the manufacturers have jumbled it all up.

A "real" 3.1 will say 12V-2x6 instead of 12VHPWR for the connector but that's just a matter of pin length at the connector and that cable would work in either plug.
What to do? I have a 1350 watt TT Toughpower.
 
It's because the differences between ATX 3.0 and 3.1 are very minor; many 3.0 units already meet or exceed 3.1, so the manufacturers have jumbled it all up.

A "real" 3.1 will say 12V-2x6 instead of 12VHPWR for the connector but that's just a matter of pin length at the connector and that cable would work in either plug.
There's also PCI-E 5.1 these days.

Luckily the updated Seasonic PRIME TX-1600 has both PCI-E 5.1 and ATX 3.1!
[Image: Seasonic x Noctua PRIME TX-1600]
 
Someone's salty. :messenger_winking_tongue:



He got caught, plain and simple. Der8auer said it can happen and he never said that it's safe. Hardware Busters said it was impossible to happen. One was right, and one was wrong. Hardware Busters should simply take the "L" and move forward, not act like a little child. Jesus.
 
That sucks, I liked the look of the Ventus and the Suprim, you got something else lined up?
Best Buy gave me store credit for the return, for when I want to order something. I still have a CLX 9800X3D / 4080 Super build that I haven't returned and think I will just keep. While at Best Buy they had a Cyberpower build, also with a 9800X3D but with a 5080, as an open-box buy, so I brought it home for the heck of it since I have two months to return it. But it's loud.

I guess that's why it was returned: it has a 360 mm rad on the CPU, but it's ramping up just installing games, not to mention when I really throw something at it. It's annoyingly loud.

I guess reseating the CPU cooler might fix it, but I'm not fucking with it.

Honestly, I'm still overly happy with the 4080 Super CLX prebuilt; it's very quiet.

I will likely just return this 5080 build, keep the nearly $5k credit while waiting for stock to pick up, and grab one I really want, avoiding ones like Cyberpower or iBuyPower.
 

The conclusion is to use a brand-new cable when you upgrade to a new GPU.


● From the tests of power supply and graphics card power cable combinations 1 through 6, whether the power supply connector is the old or new version has little effect on current balance. The condition of the graphics card power cable matters much more: frequent plugging and unplugging, or an aged cable, is more likely to cause current imbalance due to resistance changes. If you want to use an older power supply with a high-power graphics card, it is recommended to replace the graphics card power cable at the same time. The pin definitions of 12V-2×6 and 12VHPWR are completely compatible, so there will be no problems with that combination.
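The current-imbalance point is easy to see with a toy model: treat the six 12 V wires as parallel resistances between the PSU and the GPU, where each path's resistance is wire plus connector contact. The 6 mΩ baseline and the "worn contact" multiplier below are illustrative guesses, not measured values:

```python
# Toy current-divider model of an aging 12V-2x6 / 12VHPWR harness:
# six parallel paths, each = wire resistance + contact resistance.
# The 6 mOhm baseline and 3x degradation factor are illustrative guesses.

TOTAL_A = 50.0                 # ~600 W at 12 V
baseline = [0.006] * 6         # 6 mOhm per path, all contacts healthy

def split_currents(resistances):
    conductances = [1.0 / r for r in resistances]
    total_g = sum(conductances)
    return [TOTAL_A * g / total_g for g in conductances]

print("all healthy:  ", [round(i, 1) for i in split_currents(baseline)])

worn = baseline.copy()
worn[0] *= 3                   # one contact's resistance triples with wear
print("one worn pin: ", [round(i, 1) for i in split_currents(worn)])
# The worn path carries less current while the other five run hotter; if the
# outlier is instead an unusually LOW-resistance path, that single wire hogs
# the load -- the 20+ A single-wire scenario from the video.
```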
 
Outside of memes for a minute, I saw some interesting counterpoints from good sources

and Jonny Guru, THE Jonny fucking Guru, probably the #1 PSU reference worldwide, director of R&D of Corsair's PSU unit, with huge experience:

Maybe his setup is borked.

Will be interesting if GamersNexus drops in again with the answer like last time.

The main issue was that 12VHPWR was not built with appropriate safety margins. This meant that many issues, including manufacturing defects, could cause a cable to melt. Anyone who knows anything about manufacturing knows that defects happen, so when you ship thousands of cards/cables, there are going to be some failures.

This is supported by looking at failure rates. 4080 melting failures were extremely rare (only a couple of documented cases on the web), and no 4070 Ti failures have ever been reported, even though they use the same cable. The lower power consumption of these lower-end cards provides a larger margin of safety.

GamersNexus did a massive disservice to the community by immediately jumping to the "user error" conclusion when he managed to replicate the issue that way. As soon as he did that, Nvidia immediately latched onto the theory that allowed them to blame their customers, when in reality only some of the failures can be attributed to user error. Just because he could not replicate a failure under normal conditions with the parts he had does not mean that others did not have defective parts.
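To make the safety-margin argument concrete, here is a rough comparison of per-pin headroom across connectors and power levels. The per-pin ratings (about 8 A for a Mini-Fit Jr style 8-pin, about 9.5 A for a 12VHPWR-style pin) and the card power figures are approximate, assumed values; the point is the ratio, not the exact decimals:

```python
# Approximate per-pin headroom = pin rating / (power / 12 V / live pins).
# Pin ratings and card power draws below are ballpark, assumed figures.

def headroom(power_w: float, live_pins: int, pin_rating_a: float) -> float:
    per_pin_a = power_w / 12.0 / live_pins
    return pin_rating_a / per_pin_a

cases = {
    "8-pin PCIe @ 150 W":        (150, 3, 8.0),
    "12VHPWR, 4070 Ti (~285 W)": (285, 6, 9.5),
    "12VHPWR, 4080 (~320 W)":    (320, 6, 9.5),
    "12VHPWR, 5090 (~600 W)":    (600, 6, 9.5),
}
for name, (power, pins, rating) in cases.items():
    print(f"{name:>26}: {headroom(power, pins, rating):.2f}x headroom")
```

The lower-power cards sit on a much bigger cushion, which lines up with the point above about why 4070 Ti failures are essentially unheard of while the 600 W cards keep showing up in these threads.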
 