GeForce RTX 4090 running at 3.0 GHz and 616 W while running a GPU stress tool

Love how they have a picture of an apple covering the temp and hotspot.
We need some water over here! 😀
Legit 4k game benchmarks or it doesn't count.
Maddness:

Love how they have a picture of an apple covering the temp and hotspot.
If that temperature graph also runs from 0 to 100, then going by the fan speed graph it's sitting at 35C.
lol
Your power company will send you an invitation to their New Year's gala if you overclock your 4090.
LN2, probably, with 600W and what look like temps you can't reach on air.
Whatever. I just don't get it. How dare they charge almost $1,000 for an xx70-level card, a.k.a. an RTX 4070 12GB? How on earth can they get away with it?
mohiuddin:

Whatever. I just don't get it. How dare they charge almost $1,000 for an xx70-level card, a.k.a. an RTX 4070 12GB? How on earth can they get away with it?
They won't. They can charge that, but I can get a used 3090 Ti for $800, a 3090 for $700, and so on, all day, every day. There are literally millions of them. That 12GB 4080 will not sell at that price.
Or it'll be like the Japanese Gentlemen's Agreement, but with 450W instead of 276hp: "it's not our fault you overclocked the graphics card beyond all the specs, standards, and limits" 😀
mohiuddin:

Whatever. I just don't get it. How dare they charge almost $1,000 for an xx70-level card, a.k.a. an RTX 4070 12GB? How on earth can they get away with it?
On top of that, no wonder they focused on 4090 performance on stage. The 3090 Ti down to the 3080 used GA102, and the 3070 Ti down to the 3060 Ti used GA104; the 40xx cards use completely different dies, and AD102 in the 4090 is nowhere near AD103 or AD104 in the 4080s and below. Maybe this is some kind of stunt to show a product having an edge over the upcoming AMD cards. Also, the 3090's core count was much closer to the full die (+2.4% for the Ti) than the 4090's is to its headroom (+10.9% to full AD102).

Simple maths:
3090 - 10496 cores; Ti is 10752 (+2.4%)
3080 - 8704 cores - 83% of 3090 (12GB variant 8960; Ti with 10240 is almost a 3090)
3070 Ti - 6144 cores - 59% of 3090
3070 - 5888 cores - 56% of 3090
3060 Ti - 4864 cores - 46% of 3090 (and a different chip)
4090 - 16384 cores (of 18176 max?, +10.9%)
4080 16GB - 9728 cores - 59% of 4090... this is already a completely different card
4080 12GB - 7680 cores - 47% of 4090, 79% of 4080 16GB

I know they can position the products however they want, but by relative core count, if we used 3000-series Ampere names, the lineup would look like this:
4090 (baseline 4090)
4070 Ti (4080 16GB)
4060 Ti (4080 12GB)

What a lovely coincidence that the core counts of the 4080s relative to the flagship match the ranges of the 3070 Ti and 3060 Ti so nicely, both of those being GA104 cards. Maybe they predicted that some people would smell the naming BS and wanted them to buy the 16GB version "because 12GB is the 4070", when in fact the two look more like a "4070 Ti" and a "4060 Ti".
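For anyone who wants to double-check those ratios, a minimal sketch in Python (core counts copied from the post above, so any deviation from Nvidia's official specs comes from the post, not the maths):

```python
# Relative core counts vs. each generation's flagship (numbers from the post above).
AMPERE_FLAGSHIP = 10496  # RTX 3090
ADA_FLAGSHIP = 16384     # RTX 4090

cards = [
    ("3080", 8704, AMPERE_FLAGSHIP),
    ("3070 Ti", 6144, AMPERE_FLAGSHIP),
    ("3070", 5888, AMPERE_FLAGSHIP),
    ("3060 Ti", 4864, AMPERE_FLAGSHIP),
    ("4080 16GB", 9728, ADA_FLAGSHIP),
    ("4080 12GB", 7680, ADA_FLAGSHIP),
]

for name, cores, flagship in cards:
    print(f"{name}: {cores}/{flagship} = {cores / flagship:.0%} of the flagship")
```

This prints 83%, 59%, 56%, and 46% for the Ampere cards and 59% and 47% for the two 4080s, matching the post's figures.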
mohiuddin:

Whatever. I just don't get it. How dare they charge almost $1,000 for an xx70-level card, a.k.a. an RTX 4070 12GB? How on earth can they get away with it?
Because they know that if they put out a 4050 XTRi RS Super with a 64-bit bus and a mouse-on-a-wheel GPU, but put 24GB of VRAM on it, people would buy it.
haste:

lol 615.8W
This is the Max (not Avg or Min) power draw; it resets when GPU-Z is closed and reopened.
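To illustrate why a latched Max reading says little about sustained draw, here's a toy example; the sample values are made up, not real GPU-Z data:

```python
# Max vs. average of a noisy power trace (hypothetical per-second readings, not GPU-Z data).
samples_w = [420, 455, 610, 438, 447, 615.8, 430, 442]

print(f"Max: {max(samples_w):.1f} W")                   # what a latched 'Max' column shows
print(f"Avg: {sum(samples_w) / len(samples_w):.1f} W")  # closer to what you actually pay for
```

A single brief spike can pin the Max column near 616 W while the average stays far lower.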
pharma:

This is the Max (not Avg or Min) power draw; it resets when GPU-Z is closed and reopened.
Yup, and everyone memeing about the power usage fails to realize that the 4090 will actually be much more power efficient. Say you play a game at 4K with a 60 fps cap: on a 3090 the power usage is about 400W, but a 4090 playing the same game at 4K 60 with the same settings will only use ~200W.
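Taking those ballpark figures at face value (they're assumptions in the post, not measurements), the efficiency claim is easy to put in numbers:

```python
# Frames per watt at a 60 fps cap, using the post's ballpark power figures (assumed, not measured).
fps_cap = 60
power_3090_w = 400
power_4090_w = 200

eff_3090 = fps_cap / power_3090_w  # 0.15 fps/W
eff_4090 = fps_cap / power_4090_w  # 0.30 fps/W
print(f"3090: {eff_3090:.2f} fps/W, 4090: {eff_4090:.2f} fps/W, "
      f"ratio {eff_4090 / eff_3090:.1f}x")
```

Same frames delivered for half the energy, i.e. twice the frames per watt, under those assumed numbers.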
Glottiz:

Yup, and everyone memeing about the power usage fails to realize that the 4090 will actually be much more power efficient. Say you play a game at 4K with a 60 fps cap: on a 3090 the power usage is about 400W, but a 4090 playing the same game at 4K 60 with the same settings will only use ~200W.
Nothing is power efficient when overclocked.
Glottiz:

Yup, and everyone memeing about the power usage fails to realize that the 4090 will actually be much more power efficient. Say you play a game at 4K with a 60 fps cap: on a 3090 the power usage is about 400W, but a 4090 playing the same game at 4K 60 with the same settings will only use ~200W.
This is 100% true, because I do the same with my 6900 XT: cap the framerate to a level where my max draw is 280W, then let it go full tilt when I want all the FPS. But the issue is, the majority who buy top of the line want all the FPS, so at 4K 120 FPS or better it will be staying at the 600W mark.

My thing is, I want to know how hot the new TINY 12-pin connector gets when it's being hit with 600W / 50A for an hour or two. And then if you add in factors like a poorly cooled case, or that new fragile power adapter being bent, etc., how hot will it get then?
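Back-of-the-envelope on the connector question: 600 W at 12 V is 50 A, split across the plug's six 12 V contacts. Here's a rough sketch of the resistive self-heating, where the per-contact resistance is an assumed placeholder rather than a measured spec (worn or badly seated contacts can be far worse):

```python
# I^2*R self-heating estimate for a 12 V GPU power connector.
# The 5 mOhm per-contact resistance is an assumption for illustration, not a spec.
power_w = 600.0
voltage_v = 12.0
power_pins = 6            # six 12 V contacts, each paired with a ground return
contact_res_ohm = 0.005   # assumed resistance per contact

total_current_a = power_w / voltage_v        # 50 A
per_pin_a = total_current_a / power_pins     # ~8.3 A per contact
heat_per_pair_w = per_pin_a**2 * contact_res_ohm * 2  # 12 V pin + its ground return
total_heat_w = heat_per_pair_w * power_pins

print(f"{total_current_a:.0f} A total, {per_pin_a:.1f} A per pin, "
      f"~{total_heat_w:.1f} W of heat in the connector")
```

Even the ~4 W this yields, concentrated in a small plastic shell with no airflow, gets noticeably warm, and the heat per pin rises with the square of the current if the pins share the load unevenly, which is exactly the commenter's worry.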
"More power."
"But sir..."
"I said, MORE POWER."
"Did you see what you did, sir? F**k this s**t, I'm out."
I once had a computer that ran BOINC with an overclocked FX CPU and 3x GPUs running at 100% load simultaneously and it drew less power from the wall than this GPU by itself. I ran this in winter months to keep my bedroom warm (back then I had electric heat so I figured I'd make the most of it), and it was enough to turn off the radiator for that room. This is stupid.
Ven0m:

Or it'll be like the Japanese Gentlemen's Agreement, but with 450W instead of 276hp: "it's not our fault you overclocked the graphics card beyond all the specs, standards, and limits" 😀
I love this idea/concept. For me, Intel, AMD and Nvidia should sit together and agree on maximum power values that could be used by their CPUs and GPUs. And if anyone breached those values, they would have to pay a penalty to the others who managed to stay within the agreed limits. I know this is almost impossible to happen, but it would be really good.
H83:

I love this idea/concept. For me, Intel, AMD and Nvidia should sit together and agree on maximum power values that could be used by their CPUs and GPUs. And if anyone breached those values, they would have to pay a penalty to the others who managed to stay within the agreed limits. I know this is almost impossible to happen, but it would be really good.
@Ven0m I thought it was the Japanese government that was cracking down on them? That's why kei vehicles are so weird: the restrictions were put in place by the government, and manufacturers tried to make the most of those restrictions. The Subaru Sambar is possibly the best example of this.

In any case, the biggest difference here is that those car manufacturers made pretty much everything in their cars. AMD and Nvidia pretty much just make the chips themselves but none of the rest of the PC, so they have no incentive to agree on such things. The market determines the demand, and being a duopoly, one of them just has to be slightly better than the other (Nvidia has a better overall platform, AMD has better prices), and if you don't like it, that sucks for you. PSU and motherboard manufacturers aren't going to care, because it just helps drive sales of more valuable components. OEMs are probably thrilled about it, because by cheaping out on cooling, the parts will surely die prematurely, which means people will be forced to buy upgrades/replacements sooner.

So, I think a large government will have to step in and basically tax the AIB manufacturer for making components that fall below a certain performance-per-watt. Since the AIB partners have razor-thin margins, any additional costs would incentivize them to either optimize or otherwise lower the performance of the GPU. And since the chip manufacturer doesn't want to see their stats lowered, they are then incentivized to make chips more efficient.