Well, I did not try my RTX 4090 at 1080p low settings, but I did try my 13700KF at 6200 MHz (5700 MHz all-core) and got this result. A little better than a stock 13900K.
You have been an Intel fanboy for years; with every AMD or Intel release you state that Intel is king.
This happens with many different products: cars, phones, specific shops, even things like ISPs.
Subconsciously, nobody wants to admit to falling for a bad deal, so they ignore any negatives and play up the positives.
LOL awesome, you took the deep dive. I love AMD, but if they can't get their crap together, Intel it is. I knew you were at Microcenter (you told me; I did the same thing, AMD to Intel). F Spider-Man, they lost AMD fans on that one.
Oops, I forgot: see if your 13900K can keep up with my 13700KF. I know it can kill in multi-thread, but single-thread?
People are arguing over figures.
If anybody played a game on both an AMD rig and an Intel rig using the fastest of each, but didn't know which rig was which, they would have to guess.
In real-world gaming you can't tell the difference between the two.
That's right. This is no different from fast cars and drag racing: it's not as if you would notice the 0.5-second difference racing down the strip. You only know because of the speed camera.
But there will always be people who will upgrade because it's "new".
We see it all the time in the consumer world we live in.
A 13900K and a 4090 paired together draw close to 800 watts under load. And that's just the GPU and CPU by themselves.
OVER 800, requiring a 1300W Platinum unit to be safe, since even a 1000W PSU might struggle.
And all that energy ends up as heat in the room. That's fine in winter, but in summer an 800W+ heater in your home is going to be a problem.
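To put rough numbers on the PSU question, here is a back-of-the-envelope sketch in Python. Every wattage and the ~65% load rule of thumb below are my own assumptions, not measured figures:

# Rough PSU-sizing sketch; all figures here are assumptions, not measurements.
cpu_w = 300       # 13900K under a heavy all-core load (assumed)
gpu_w = 450       # RTX 4090 at its stock power limit (assumed)
rest_w = 100      # board, RAM, drives, fans (assumed)
total_w = cpu_w + gpu_w + rest_w     # ~850 W sustained
psu_w = round(total_w / 0.65, -2)    # aim to load the PSU to ~65% of its rating
print(f"~{total_w} W sustained -> ~{psu_w:.0f} W PSU")

With those assumed numbers it lands on roughly a 1300W unit, which is in the same ballpark as the recommendation above.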
Only if you run a stress test on both at the same time. Cap a 13900K at 90W and you'll see no difference in gaming; the 4090 actually stays under 350W on average too.
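For anyone who wants to try that kind of cap on Linux, a minimal sketch using the kernel's intel_rapl powercap sysfs interface might look like this. Treat the domain index intel-rapl:0 as an assumption; which domain maps to the CPU package varies by system, so check yours first:

# Sketch: cap the CPU package at 90 W via the Linux intel_rapl powercap interface.
# Must run as root; intel-rapl:0 being the package domain is an assumption.
LIMIT_FILE = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

with open(LIMIT_FILE, "w") as f:
    f.write(str(90 * 1_000_000))  # constraint_0 is the long-term limit (PL1), in microwatts

On Windows, the equivalent would normally be setting PL1 in the BIOS or in Intel XTU.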
I said 1-3 FPS, bud, not %, and I used the CPU benches on Guru3D as a point of reference. My point being, all things considered and for the vast majority, the latest AMD and Intel CPUs are neither evolutionary nor revolutionary.
What I meant is that a 3090 is too slow to really demand a lot from a CPU at 4K; you'd be fine running it with an 11400 most of the time at that resolution.
I really didn't see that much difference between a 4090 + 13900K and, let's say, a 7950X, 5800X3D, or 12900K, tbh, as of now. The GPU is not fast enough to make that massive a difference between these CPUs.
I never considered the 11400 and whatnot to be high-end enough.
OK, I see what you mean, but still: how many 4090 owners are there compared to 30-series owners, and how many will upgrade the CPU over the GPU? I've no idea, but my point is that given the jump in gaming performance, and certainly considering the power draw and heat output, there's probably little incentive for gamers (as opposed to hardware enthusiasts) to upgrade.
This review doesn't show the full picture; find a review with a 4090, that's more like it.
The 4K benchmarks are all plus or minus identical across the different CPUs here; that's down to the GPU bottleneck (see the sketch below).
With a 4090 the 13900K shines even brighter, especially at 6 GHz with the 4090 on a water loop.
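A toy way to see why the 4K numbers converge when the GPU is the limit; every frame rate below is invented for illustration:

# Toy model of a GPU bottleneck; all frame rates are made up for illustration.
def effective_fps(cpu_fps, gpu_fps):
    # The slower stage of the pipeline limits the delivered frame rate.
    return min(cpu_fps, gpu_fps)

gpu_4k = 95.0  # hypothetical GPU-limited frame rate at 4K
for name, cpu_fps in [("slower CPU", 160.0), ("faster CPU", 220.0)]:
    print(name, "->", effective_fps(cpu_fps, gpu_4k), "fps")
# Both print 95.0: the CPU gap only shows up once the GPU stops being the limit.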
TVB?
From a gamer's perspective, Intel 13th gen or Ryzen 7000 isn't worth the upgrade for any of us on 12th gen or AM4. I swapped my 5900X for a 5800X3D instead; going by all these in-game benchmarks, it's a cheap upgrade after reselling the 5900X.