NVIDIA GeForce RTX 4060: 3DMark scores leak

Moonbogg:
Look at the huge difference having 12GB makes for the 3060. Good lord, there is no excuse for gimping this thing with only 8GB of RAM. If this wasn't a monopoly, Nvidia would be getting destroyed over releases like this. Absolute garbage.
Undying:
Even RTX 2060 Super users should skip upgrading to this card. It looks so disappointing.
Moonbogg:

Look at the huge difference having 12GB makes for the 3060. Good lord, there is no excuse for gimping this thing with only 8GB of RAM. If this wasn't a monopoly, Nvidia would be getting destroyed over releases like this. Absolute garbage.
If you are playing at 1080p, as you should with a card like the 3060, the amount of memory matters less in this case than the brutally cut bandwidth: the 12GB version has 360GB/s, whereas the 8GB version has a pitiful 240GB/s.
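For reference, those two bandwidth figures fall straight out of the bus width and the memory data rate. A quick sanity check in Python, using the published specs of the two 3060 variants:

```python
# Peak GDDR6 bandwidth (GB/s) = bus width in bytes * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 15.0))  # RTX 3060 12GB: 192-bit @ 15 Gbps -> 360.0
print(bandwidth_gb_s(128, 15.0))  # RTX 3060 8GB:  128-bit @ 15 Gbps -> 240.0
```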
wavetrex:
And let's not forget, the original 12GB 3060 was a rather weak card compared to the high end (the 3080), with less than half the core count and half or less the measured performance. And two and a half years later we're getting a 20% "upgrade" with less memory for more money. Atrocious.
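The "less than half the core count" part checks out against the published CUDA core counts (3584 for the 3060, 8704 for the 3080):

```python
cores_3060, cores_3080 = 3584, 8704  # published CUDA core counts
print(f"{cores_3060 / cores_3080:.0%}")  # ~41%, well under half
```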
wavetrex:

And let's not forget, the original 12GB 3060 was a rather weak card compared to the high end (the 3080), with less than half the core count and half or less the measured performance. And two and a half years later we're getting a 20% "upgrade" with less memory for more money. Atrocious.
Yup, it was barely faster than the 2060 Super (a TU106-die card).
Undying:

Even RTX 2060 Super users should skip upgrading to this card. It looks so disappointing.
The 2060 12GB actually had the same number of shaders as the 2060 Super, plus 12GB of memory. It was a very overlooked card because it cost a lot less than the 3060 but performed similarly.
Kaarme:

If you are playing at 1080p, as you should with a card like the 3060, the amount of memory matters less in this case than the brutally cut bandwidth: the 12GB version has 360GB/s, whereas the 8GB version has a pitiful 240GB/s.
Why "should"? I had no issues playing at 1440p with my derelict GTX 1060. Did I ever have to resort to low/mid settings, admittedly even the lowest I've ever used? Sure did. But I was still able to stay over 48fps in every case. I don't see why a 3060 can't do 1440p as long as you are reasonable.
Venix:

Why "should"? I had no issues playing at 1440p with my derelict GTX 1060. Did I ever have to resort to low/mid settings, admittedly even the lowest I've ever used? Sure did. But I was still able to stay over 48fps in every case. I don't see why a 3060 can't do 1440p as long as you are reasonable.
Yeah, that was just my personal opinion, nothing more. I play at 1080p and got a 4070 for it, thinking it might be enough for a while. I'm not going to claim my opinion is any better than yours or anyone else's, within reason. It's up to every individual gamer how they enjoy their games. That being said, according to this article, Nvidia markets the 4060 for 1080p, and the 4060 is stronger than the 3060. It could be that the 3060 12GB would beat the 4060 at 1440p in extremely memory-critical cases. AMD already proved in the past that a large cache loses its edge as the resolution rises.
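The cache point is easy to see with rough numbers: render targets grow with the pixel count while the on-die cache stays fixed, so hit rates fall as resolution rises. A back-of-the-envelope sketch; the cache size and the single RGBA16F target are illustrative, not a model of any specific GPU:

```python
# Working-set growth with resolution: one RGBA16F render target (8 bytes/pixel)
# against an illustrative on-die cache. Real workloads juggle many buffers.
BYTES_PER_PIXEL = 8
CACHE_MB = 32  # illustrative; recent GPUs carry tens of MB of L2/Infinity Cache

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("2160p", 3840, 2160)]:
    mb = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: {mb:5.1f} MB per target ({mb / CACHE_MB:.0%} of cache)")
```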
Kaarme:

Yeah, that was just my personal opinion, nothing more. I play at 1080p and got a 4070 for it, thinking it might be enough for a while. I'm not going to claim my opinion is any better than yours or anyone else's, within reason. It's up to every individual gamer how they enjoy their games. That being said, according to this article, Nvidia markets the 4060 for 1080p, and the 4060 is stronger than the 3060. It could be that the 3060 12GB would beat the 4060 at 1440p in extremely memory-critical cases. AMD already proved in the past that a large cache loses its edge as the resolution rises.
You got a 4070 for 600 EUR to play at 1080p? That's silly. It should easily handle 1440p, at least for now.
Undying:

You got a 4070 for 600 EUR to play at 1080p? That's silly. It should easily handle 1440p, at least for now.
Yes and no. If you aim for 60Hz, it can do 1440p maxed out, maybe even with ray tracing; when 144Hz or more gets involved, things get tricky! It depends on what your aim is. Me, for example: as long as my 0.1% lows rarely or never drop below 48fps (where FreeSync kicks in), I'm fine with it. Some others want way, way more than that.
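For anyone unfamiliar with the metric: a "0.1% low" is the frame rate corresponding to the slowest 0.1% of frame times. A minimal sketch with synthetic data; real numbers would come from a frame-time logger such as PresentMon or CapFrameX:

```python
import random

random.seed(1)
# Synthetic frame times in ms; a real log would come from a frame-time capture.
frame_times = sorted(random.gauss(8.0, 1.5) for _ in range(10_000))

# 0.1% low = fps at the 99.9th-percentile frame time (the slowest frames).
worst = frame_times[int(len(frame_times) * 0.999)]
print(f"0.1% low: {1000 / worst:.1f} fps")
```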
As long as AV1 encoding is the same as on the 4090, I'm in.
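For anyone wanting to try it: Ada's AV1 encoder is exposed through NVENC, and FFmpeg builds with NVENC AV1 support offer it as av1_nvenc. A minimal sketch; the filenames and bitrate are placeholders:

```python
import subprocess

# Hardware AV1 encode via NVENC (requires an FFmpeg build with av1_nvenc
# and an RTX 40-series GPU). Filenames and bitrate are placeholders.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "av1_nvenc",  # Ada's hardware AV1 encoder
    "-b:v", "8M",         # illustrative target bitrate
    "output.mkv",
], check=True)
```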
So is that comparing DLSS vs. DLSS vs. frame generation, and without? Or actual numbers?
schmidtbag:
I think 8GB is adequate for this GPU, but the price is too much for an 8GB card.
Undying:

You got a 4070 for 600 EUR to play at 1080p? That's silly. It should easily handle 1440p, at least for now.
I assume it's for higher framerates (like 120Hz+) and perhaps to enable DXR without DLSS.
Moonbogg:

Look at the huge difference having 12GB makes for the 3060. Good lord, there is no excuse for gimping this thing with only 8GB of RAM. If this wasn't a monopoly, Nvidia would be getting destroyed over releases like this. Absolute garbage.
The amount of memory accounts for literally none of the difference. The difference is due to the 3060 12GB having 50% more memory bandwidth.
ttnuagmada:

The amount of memory accounts for literally none of the difference. The difference is due to the 3060 12GB having 50% more memory bandwidth.
Despite my previous comment, I somewhat disagree. There are a lot of people who [needlessly] use 4K-optimized textures at 1080p in poorly optimized games, and that will use more than 8GB. However, you are right that the added bandwidth makes much more of a difference.
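To put rough numbers on that: a texture's VRAM cost depends on its own resolution, not the screen's. For a BC7-compressed color map (1 byte per texel) with a full mip chain (about one third extra):

```python
# VRAM cost of a BC7 texture (1 byte/texel) including its mip chain (+1/3).
def texture_mb(side: int) -> float:
    return side * side * (4 / 3) / 2**20

print(f"2048x2048: {texture_mb(2048):.1f} MB")  # ~5.3 MB
print(f"4096x4096: {texture_mb(4096):.1f} MB")  # ~21.3 MB
```

Four times the cost per material, regardless of output resolution, which is how a few hundred "4K" textures can overrun 8GB.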
schmidtbag:

I assume it's for higher framerates (like 120Hz+) and perhaps to enable DXR without DLSS.
I understand the need for performance, but 1440p, especially native, looks so much better that I, for one, can never go back after seeing it. Even upscaled 1440p looks better than native 1080p. It kinda feels like a waste.
Bet they spent more time optimizing for 3DMark than for actual games.
Undying:

I understand the need for performance, but 1440p, especially native, looks so much better that I, for one, can never go back after seeing it. Even upscaled 1440p looks better than native 1080p. It kinda feels like a waste.
I assume you mean you prefer 1440p@60Hz rather than 1080p@120Hz? Personally, I would share that sentiment, but it depends on the kind of game you play and what your setup looks like. Meanwhile, going from 1440p to 2160p isn't quite as impressive. Don't get me wrong, there is absolutely a difference, but in a lot of cases the performance penalty isn't worth it, whereas (to me) going from 1080p to 1440p almost always is. Really, the only kind of display whose appeal I can't comprehend is the ultrawide, especially the non-curved ones. You're paying more for a cropped display. You could argue "it's good for peripheral vision in games", but in most games where that matters, you can increase the FOV as necessary. Meanwhile, there's so much content that doesn't work well on such a display.
schmidtbag:

I assume you mean you prefer 1440p@60Hz rather than 1080p@120Hz? Personally, I would share that sentiment, but it depends on the kind of game you play and what your setup looks like. Meanwhile, going from 1440p to 2160p isn't quite as impressive. Don't get me wrong, there is absolutely a difference, but in a lot of cases the performance penalty isn't worth it, whereas (to me) going from 1080p to 1440p almost always is. Really, the only kind of display whose appeal I can't comprehend is the ultrawide, especially the non-curved ones. You're paying more for a cropped display. You could argue "it's good for peripheral vision in games", but in most games where that matters, you can increase the FOV as necessary. Meanwhile, there's so much content that doesn't work well on such a display.
There is nothing more immersive than a curved ultrawide display, and I've been using one for a while now. If we're talking gaming as the primary use, there is plenty of content for such a display; 21:9 has become mainstream. I do prefer higher visual fidelity, so if I can run native at ultra settings without upscaling at 60fps, rather than upscaled with dialed-down settings at 120fps, I'll probably go that route, of course for single-player games. Can't go back to 16:9. https://abload.de/img/untitledovi06.png
Undying:

There is nothing more immersive than a curved ultrawide display, and I've been using one for a while now. If we're talking gaming as the primary use, there is plenty of content for such a display; 21:9 has become mainstream. I do prefer higher visual fidelity, so if I can run native at ultra settings without upscaling at 60fps, rather than upscaled with dialed-down settings at 120fps, I'll probably go that route, of course for single-player games. Can't go back to 16:9.
But that's my point: 16:9 with an increased FOV is even more immersive. It's the same exact thing except the top and bottom aren't cropped.
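The geometry behind that: with the usual Hor+ scaling, the horizontal FOV follows from the vertical FOV and the aspect ratio, so a 16:9 screen can match an ultrawide's horizontal view if the game exposes an FOV slider. A sketch of the standard conversion:

```python
import math

# Hor+ scaling: horizontal FOV derived from a fixed vertical FOV.
def hfov_deg(vfov_deg: float, aspect: float) -> float:
    return math.degrees(2 * math.atan(math.tan(math.radians(vfov_deg) / 2) * aspect))

vfov = 58.7  # the vertical FOV that gives ~90 degrees horizontal at 16:9
print(f"16:9: {hfov_deg(vfov, 16 / 9):.1f} deg")  # ~90.0
print(f"21:9: {hfov_deg(vfov, 21 / 9):.1f} deg")  # ~105.4
```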
schmidtbag:

But that's my point: 16:9 with an increased FOV is even more immersive. It's the same exact thing except the top and bottom aren't cropped.
You'd think that, but no, trust me.