GeForce RTX 4090 running at 3.0 GHz and 616 Watts in a GPU stress tool

gQx:

More power But sir... I said. More POWER. Did you see what you did sir f**k this s**t I'm out
Anatoli Diatlov: More power.
Random Comrade: But sir...
Anatoli Diatlov: I said. More POWER.
@ 600W it's not just a GPU stress tool. It's also testing the PSU, case, case fans, CPU cooler, etc.
southamptonfc:

@ 600W it's not just a GPU stress tool. It's also testing the PSU, case, case fans, CPU cooler, etc.
It also tests your circuit breaker (you have to account for the whole PC, the input wattage, and other devices connected to the same breaker), your air conditioning performance, and your wife's patience with electric bills and fan noise.
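To put the circuit-breaker point in rough numbers, here is a small back-of-the-envelope sketch. The individual wattages, the 230 V / 16 A circuit, and the air conditioner sharing it are illustrative assumptions, not figures from the article or this thread.

```python
# Rough estimate of how much of one household circuit a ~600 W GPU build eats.
# Every figure below is an assumption for illustration, not a measured value.

def circuit_headroom(loads_w, mains_v=230.0, breaker_a=16.0):
    """Return total watts, current drawn, and the fraction of the breaker rating used."""
    total_w = sum(loads_w.values())
    amps = total_w / mains_v
    return total_w, amps, amps / breaker_a

loads = {
    "gpu": 600,          # stressed GPU alone
    "rest_of_pc": 200,   # CPU, board, drives, fans (assumed)
    "psu_losses": 40,    # AC-to-DC conversion losses (assumed)
    "monitor": 50,
    "ac_unit": 900,      # window air conditioner on the same circuit (assumed)
}

total_w, amps, used = circuit_headroom(loads)
print(f"{total_w} W -> {amps:.1f} A, {used:.0%} of a 16 A breaker")
# The same ~1790 W on a 120 V / 15 A North American circuit is ~14.9 A, right at the limit.
```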
Well, if you look at all of the incandescent light bulbs you replaced in your home with LEDs, a 600 Watt GPU is palatable.
That power draw also tests your house's electrical installation. Some older houses (25-30 years old) have thinner wiring on some outlets, and equipment drawing high current can burn out that wiring over the years. My father, being an electrician, replaced the wiring in many houses because of that.
umeng2002:

Well, if you look at all of the incandescent light bulbs you replaced in your home with LEDs, a 600 Watt GPU is palatable.
600W of light bulbs spread throughout a house, which might not all be on simultaneously, isn't a big deal. Having 600W of incandescent bulbs all plugged into the same single outlet is. Remember too: that's 600W for the GPU alone, not the whole system. Nobody is going to pair a GPU like this with a 12400 or 5600G, so we're talking more like 800W for the whole PC in a realistic workload. Add another ~40W for AC-to-DC conversion losses. The average incandescent bulb is 60W, so we're talking 14 light bulbs being on simultaneously in the same room.
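The light-bulb arithmetic above, written out; the 800 W whole-system figure and the ~40 W conversion loss are the post's own assumptions, not measurements.

```python
# Reproduce the bulb comparison from the post above (all inputs are assumed).
gpu_w = 600             # GPU alone
system_w = 800          # assumed whole-PC draw with a high-end CPU
conversion_loss_w = 40  # assumed AC-to-DC conversion losses
bulb_w = 60             # typical incandescent bulb

wall_w = system_w + conversion_loss_w   # 840 W at the wall
bulbs = wall_w / bulb_w                 # 14 bulbs
print(f"{wall_w} W at the wall = {bulbs:.0f} sixty-watt bulbs lit in one room")
```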
schmidtbag:

600W of light bulbs spread throughout a house, which might not all be on simultaneously, isn't a big deal. Having 600W of incandescent bulbs all plugged into the same single outlet is. Remember too: that's 600W for the GPU alone, not the whole system. Nobody is going to pair a GPU like this with a 12400 or 5600G, so we're talking more like 800W for the whole PC in a realistic workload. Add another ~40W for AC-to-DC conversion losses. The average incandescent bulb is 60W, so we're talking 14 light bulbs being on simultaneously in the same room.
Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks out at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even be lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
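A rough way to sanity-check that guesstimate; the per-card gaming draws below are assumptions for illustration, not measurements from the thread.

```python
# Back-of-the-envelope version of the estimate above (card draws are assumed).
measured_peak_w = 550     # whole setup with a 3080, as reported from the poster's UPS
rtx3080_gaming_w = 320    # assumed typical gaming draw of the 3080
rtx4090_gaming_w = 430    # assumed typical gaming draw of a 4090 (not the 616 W stress figure)

rest_of_setup_w = measured_peak_w - rtx3080_gaming_w   # ~230 W for CPU, TV, peripherals, router
with_4090_w = rest_of_setup_w + rtx4090_gaming_w       # ~660 W, close to the ~650 W guess
print(f"Estimated peak with a 4090: ~{with_4090_w} W")
```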
If I remember correctly, the maximum for the new 16-pin power connector is 660W. So, that's pretty close.
Nopa:

If I remember correctly, the maximum for the new 16-pin power connector is 660W. So, that's pretty close.
It can take more if they keep splashing some liquid nitrogen on it!
Glottiz:

Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks out at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even be lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
We're talking about an overclocked 4090 here, not a presumably stock 3080. So yes, it is a realistic workload. GPUs are commonly the bottleneck in games, so games would be pushing this to 600W, give or take a dozen watts. Note that if the CPU were under full load along with the GPU, the total wattage would probably get closer to 900W, which is why I was saying that realistically, it would be lower.
Guess it's time to build that custom loop with a car radiator that I've always thought about.
I think it's about time we had a discussion about the design of modern computers, scrapped what exists now, and got new technology, because this is an appalling situation. Buying one of these? I think I would rather watch a 3-hour documentary about how Gorgonzola Larson does yoga in preparation for pretending to be an actress in Captain Marvel.
Notice that new, hot-running hardware is released after the summer, when temps in the home are dropping. People need the extra heat anyhow, so they don't care as much.
schmidtbag:

We're talking about an overclocked 4090 here, not a presumably stock 3080. So yes, it is a realistic workload. GPUs are commonly the bottleneck in games, so games would be pushing this to 600W, give or take a dozen watts. Note that if the CPU were under full load along with the GPU, the total wattage would probably get closer to 900W, which is why I was saying that realistically, it would be lower.
Are you being deliberately obtuse, or do you just enjoy spreading hysteria? In any case, you'll be proven wrong when reviews come out in a few days and the 4090 won't be running at 600W in games.
Glottiz:

Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks out at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even be lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
The "3GHz" in the title already gave the context, welcome to the thread 😉
Glottiz:

Yup, and everyone memeing about power usage fails to realize that the 4090 will actually be much more power efficient. Let's say you play a game at 4K with a 60fps cap. On a 3090, power usage is about 400W, but a 4090 playing that same game at 4K 60 with the same settings will only use ~200W.
Lol, no one with a functioning brain will buy a 4090 to run games at the same settings and fps as you did with your old GPU... that's like the most retarded notion ever.
Dragam1337:

Lol, no one with a functioning brain will buy a 4090 to run games at the same settings and fps as you did with your old GPU... that's like the most retarded notion ever.
What, are you gonna remaster the game yourself and invent higher settings, or buy an 8K TV? Not all games stress the GPU at 100% all the time. If I play a game that already runs at max settings on my display, at least the benefit from the 4090 would be a smaller power bill.
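For what it's worth, the "smaller power bill" argument can be put into rough numbers; the 200 W saving, play time, and electricity price below are all assumptions, not figures from the thread.

```python
# Hypothetical yearly saving if a framerate-capped game draws ~200 W less on the newer card.
saving_w = 200          # assumed: ~400 W on the old card vs ~200 W on the new one
hours_per_day = 3       # assumed daily gaming time
price_per_kwh = 0.30    # assumed electricity price per kWh

kwh_per_year = saving_w / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, roughly {kwh_per_year * price_per_kwh:.0f} per year saved")
```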
Glottiz:

What, are you gonna remaster the game yourself and invent higher settings, or buy an 8K TV?
I run games at 8K, yes. Not that it's of any relevance to your comment; anyone with a functioning brain doesn't buy a new GPU to run games at the exact same settings and fps... that would be super retarded.