CuNi said:
Unless you are mining with them, even an 800W card will barely show up on the power bill. My 3090 is running at 500 watts overclocked, and my computer amounts to a rounding error on the energy bill. People WAY overplay the actual cost of computer power consumption and quote full 100% load peak draw power as the scary figure, and fail to talk about how 90%+ of the time the GPU sits at a much lower power profile.
You are wrong, and I already corrected someone else on this in another thread.
It may be "a rounding error on the energy bill" in America, but it surely isn't in the EU.
In the other thread, we had an example of a 200W increase and how much of a footprint it would leave on the energy bill in a hypothetical 8h-a-day, year-round scenario, and it came to around 140€ more per year.
And that was only a 200W INCREASE.
I can give you a more realistic scenario.
Let's assume the GPU draws 600W (GPU alone!!) and we run it for 21h a week (an average of 3h per day), which is realistic.
That GPU alone would cost me, at my current rate, 275,18€ per year. And that is the GPU only; no CPU or anything else is included.
The same GPU at 800W would already come to an insane 366,91€.
Yes, this is with 100% of the advertised power draw, but even if you knock 10% or even 20% off those figures, the price paid gets ridiculous.
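If you want to check the math or plug in your own numbers, here is a quick sketch of the calculation. The roughly 0,42 €/kWh rate is simply what my 275,18€ figure works out to; treat it as an assumption and swap in your own tariff.

```python
# Quick sketch of the yearly cost figures above. The 0.42 EUR/kWh rate is
# back-calculated from the 275,18 EUR figure (assumption: replace it with
# your own tariff to get your own numbers).
PRICE_PER_KWH_EUR = 0.42   # assumed rate, derived from the figures above
HOURS_PER_WEEK = 21        # 3 h per day of gaming, as in the example

def annual_cost_eur(watts: float, hours_per_week: float = HOURS_PER_WEEK,
                    price: float = PRICE_PER_KWH_EUR) -> float:
    """Yearly energy cost in euros for a component drawing `watts` while in use."""
    return watts / 1000 * hours_per_week * 52 * price

for gpu_watts in (600, 800):
    for load_factor in (1.0, 0.9, 0.8):  # 100 %, -10 %, -20 % of advertised draw
        cost = annual_cost_eur(gpu_watts * load_factor)
        print(f"{gpu_watts} W card at {load_factor:.0%} of rated draw: {cost:7.2f} EUR/year")
```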
And regarding idle power: this example assumes only 3h of gaming a day. Many people use the PC far longer each day for browsing, work, etc. While it isn't insanely high, idle draw, depending on the card, still easily reaches 40W and more.
And the rest of the system still has to be added to the whole cost calculation.
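Just to put rough numbers on those last two points, here is a small sketch using the same assumed rate; the idle hours and the rest-of-system wattage are my own illustrative guesses, not measured values.

```python
# Rough add-on estimates for idle draw and the rest of the system, using the
# same assumed 0.42 EUR/kWh rate. The idle hours and rest-of-system wattage
# are illustrative assumptions, not measurements.
PRICE_PER_KWH_EUR = 0.42      # assumed rate, same as the sketch above
IDLE_GPU_WATTS = 40           # idle draw "easily reaches 40W and more"
IDLE_HOURS_PER_WEEK = 5 * 7   # assumed 5 h/day of browsing/work on top of gaming
REST_OF_SYSTEM_WATTS = 150    # assumed CPU, board, fans, etc. while gaming
GAMING_HOURS_PER_WEEK = 21

def annual_cost_eur(watts, hours_per_week, price=PRICE_PER_KWH_EUR):
    # kWh per year times the price per kWh
    return watts / 1000 * hours_per_week * 52 * price

print(f"GPU idle draw:           {annual_cost_eur(IDLE_GPU_WATTS, IDLE_HOURS_PER_WEEK):6.2f} EUR/year")
print(f"Rest of system (gaming): {annual_cost_eur(REST_OF_SYSTEM_WATTS, GAMING_HOURS_PER_WEEK):6.2f} EUR/year")
```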
I don't know you, so I cannot judge your wealth, but to be honest, nearly 400€ per year spent on GPU energy alone is far from a rounding error to me.