Intel Core i9-11900K processor review

kapu:

Stock 5600X will never draw 95 W, not mine at least. I'm running 1.35 V @ 4.7 GHz all-core now for testing and I'm below 70 W gaming. I call BS 🙂
What do you call BS? The numbers you see, because yours don't match? If it draws 70 W with a 6800 at 1440p, then it's neither a 65 W CPU nor is 70 W its limit.
Undying:

When I saw the 3060 got a recommended badge, and now the 11900K, I realized we should probably ignore it in the future. It means nothing at this point.
Stop whining about badges, really. The 3060 is a good card; your problem is not that the 3060 got it, but that the 6700 XT didn't. Though giving anything but a facepalm to the 11900K is too much.
It's an absolute no-brainer - aren't the Intel motherboards a whole heap more expensive too? I certainly wouldn't 'Recommend' the 11900K. £540 is what I got my 5900X for. I can't describe how happy I am that I jumped in on launch day, knowing now that it completely owns the Intel top model.
Everyone's favorite review YT channel 😀 [youtube=_jVAfk4AG3A]
cucaulay malkin:

What do you call BS? The numbers you see, because yours don't match? If it draws 70 W with a 6800 at 1440p, then it's neither a 65 W CPU nor is 70 W its limit.
It will never pass 70 W at pure stock. Mine is manually overclocked and the limits are sky-high, but it will draw 90 W under AVX2; gaming doesn't even come close. In the video you posted, that 5600X runs with PBO enabled, not stock 🙂 Please, it's the worst CPU of the decade, don't defend it. It's UTTER CRAP. https://www.guru3d.com/index.php?ct=articles&action=file&id=70609
Undying:

Every review site - waste of silicon
Guru3D - recommended
Eh.
G3D desperately needs new badges, for the full range:
- What were they thinking?
- Dumpster fire
- Meh. Whatever.
- Not bad at all
- Awesomium-256

This one definitely deserves a "What were they thinking?" badge.
Administrator
I am getting a little tired of the crap I need to take in my own house based on our awards. READ, do not just look at photos, charts, and awards. Literally in the last words: "You can shave off two tenners if you opt for the KF (no iGPU). You can also step down a notch to the i7-11700K at 399 USD, of course. Regardless, we'll happily hand out a recommendation if you can mentally and physically manage that power envelope and accompanying heat levels."
MOD: Removed, this is already posted in this thread. No need to repeat it.
Hilbert Hagedoorn:

I am getting a little tired of the crap I need to take in my own house based on our awards. READ, do not just look at photos, charts, and awards. Literally in the last words: "You can shave off two tenners if you opt for the KF (no iGPU). You can also step down a notch to the i7-11700K at 399 USD, of course. Regardless, we'll happily hand out a recommendation if you can mentally and physically manage that power envelope and accompanying heat levels."
Why not listen to your own words then, and don't give badges to a product that obviously doesn't deserve any recommendation (in this case you need a "shitty product" badge). Guru3D means something, and you recommend this product...
Administrator
kapu:

Why not listen to your own words then, and don't give badges to a product that obviously doesn't deserve any recommendation (in this case you need a "shitty product" badge). Guru3D means something, and you recommend this product...
Guess what? I will decide what I deem applicable, not you. If you have a problem with it, start your own damn website and write your own articles. The disrespect and wording here are starting to really annoy me. A shitty product badge? What the hell, are you 12 years old?
They may well be cheap, but?
While I'm nobody to get attention here, I do agree that @hilbert is obviously right on both points: it's his site with his articles, and how people put words together can get annoying. BUT I do have to agree with some users that the Guru3D badge "system" (if you can call it that) would benefit from an overhaul with some additional badges, etc. I've also noticed on some product reviews here that the badge seems unfitting and just confusing.
Hilbert Hagedoorn:

Guess what? I will decide what I deem applicable, not you. If you have a problem with it, start your own damn website and write your own articles. The disrespect and wording here are starting to really annoy me. A shitty product badge? What the hell, are you 12 years old?
You seem very moved, which means there is something to the subject 🙂 It's not just me. Channel that anger at a product that deserves a good bashing. https://www.meme-arsenal.com/memes/f3a49a04f6fba5863bf46b4df7daeea9.jpg
kapu:

Please, it's the worst CPU of the decade, don't defend it. It's UTTER CRAP.
What HH said is true; talking to you is like having an argument with a middle schooler.
kapu:

You seem very moved, which means there is something to the subject 🙂 It's not just me. Channel that anger at a product that deserves a good bashing. https://www.meme-arsenal.com/memes/f3a49a04f6fba5863bf46b4df7daeea9.jpg
You gotta understand this is not a YT comment section, and we like it that way.
I love Hilbert's reviews; this is one of the first sites I always go to for them. Recommended can be read different ways - certainly recommended if you're an Intel fan who has deep pockets and can handle the heat - but I feel the message going out to people really has to be "It's fine, but if you can buy a 5900X, do that instead." I imagine reading the review would give you that sense, though.
Administrator
Fox2232:

As you are making normalized 3.5 GHz tests for IPC, would it be possible to add a 3.5 GHz test which loads all cores and records a "score" and "power draw"? That could give some standardized power-efficiency figure for those who seek it, and might show differences between each generation/manufacturer.
Hmm, I'd have to think about that. Normalizing frequency says something about the actual IPC, the performance of the architecture, but little about energy efficiency. You can measure it; you'll get the nominal wattage needed for 3500 MHz on all cores. However, in the end, IPC x frequency (and voltage) is your performance and your relevant energy consumption, and that is what really matters.
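To make that concrete, here is a toy perf-per-watt calculation (a minimal Python sketch; all numbers are made-up placeholders, not measurements from this review). It illustrates why a wattage taken at a locked 3.5 GHz says little about efficiency at stock clocks: performance scales with IPC x frequency, while power climbs much faster once voltage rises with the clock.

# Toy illustration: frequency-normalized power vs. efficiency at stock clocks.
# All figures are hypothetical placeholders, not measured values.

def performance(ipc: float, freq_ghz: float) -> float:
    """Relative throughput: instructions per cycle times cycles per second."""
    return ipc * freq_ghz

def perf_per_watt(ipc: float, freq_ghz: float, watts: float) -> float:
    return performance(ipc, freq_ghz) / watts

# The same hypothetical CPU measured twice: locked at 3.5 GHz, and at stock boost.
locked = perf_per_watt(ipc=1.0, freq_ghz=3.5, watts=60.0)
stock = perf_per_watt(ipc=1.0, freq_ghz=4.8, watts=140.0)

print(f"perf/W @ 3.5 GHz locked: {locked:.3f}")   # ~0.058
print(f"perf/W @ stock boost:    {stock:.3f}")    # ~0.034, worse despite higher raw performance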
cucaulay malkin:

You gotta understand this is not a YT comment section, and we like it that way.
Maybe he wandered into the News section thinking it's the AMD subforum. And if you're gonna post anything good about the competition in the AMD subforum, I believe you have to have a court order, yes?
tty8k:

The max wattage in the prime extreme test is not realistic for real-world usage. It was the same with the 10700K: websites bashing it for 270 W consumption and spending tons on custom water cooling. Guess what? I have a 10700K in my house overclocked to 5 GHz. In gaming it averages 60-80 W with 130 W peaks; in Blender/Cinebench AVX, 190 W. Peak temp in Blender is 80 °C with a Noctua D15. Edit: I know this is an optimized scenario with CPU tuning, but still. Also, the motherboard manufacturers should take a second look at those auto values for Intel; many of them are pumping too-high numbers for a stock CPU.
Doesn't really change my viewpoint. The 5900X is better, more efficient, more future-proof, and the same price (well, it was for me anyway).
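For anyone wanting to sanity-check real-world draw the way tty8k describes, rather than quoting worst-case stress numbers, here is a minimal sketch (assuming Linux with the intel_rapl driver; the sysfs path below is the common package-0 domain, may differ per system, and usually needs root to read) that samples CPU package power once per second while you game or render.

import time

RAPL = "/sys/class/powercap/intel-rapl:0"  # package-0 RAPL domain on most Intel systems

def read_uj(path: str) -> int:
    # RAPL exposes a cumulative energy counter in microjoules
    with open(path) as f:
        return int(f.read().strip())

max_range = read_uj(f"{RAPL}/max_energy_range_uj")  # counter wrap-around point
prev = read_uj(f"{RAPL}/energy_uj")

while True:
    time.sleep(1.0)
    cur = read_uj(f"{RAPL}/energy_uj")
    delta_uj = (cur - prev) % max_range  # handles the counter wrapping around
    prev = cur
    print(f"package power: {delta_uj / 1_000_000:.1f} W")  # microjoules per second -> watts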
Fox2232:

Then it is good that some have it, because nVidia news in the "AMD specific news/rumor" thread was not bashed in the past. I have seen quite a few discussions there. It is about the tone people post with. Even I have recommended nVidia GPUs to some members in the AMD section, simply because their use case did not have a great product on the red side. But some people go there and at first glance make troll posts.
Not going to argue any further because it's completely off topic. But I won't let this fly either:
Fox2232:

Go find yourself a suitable section for a new DLSS thread
/done here
David3k:

This would actually mean that they can hit relevant performance marks in desktop for 2020. But even with the IPC gain, the performance-per-watt would still be the real question because we don't really know right now how high they're going to have to clock these. Performance-per-watt is still up in the air and we only have an idea of where it's going to land (based on the 14nm+++ node).
I wrote that back in November 2019, when we all still assumed that Rocket Lake was coming out in 2020, and I guess we all know now where the perf-per-watt landed. Based on what I knew and lessons from history, it was almost a given to me that Rocket Lake was going to consume insane amounts of power. What I didn't know was that it was going to come with little to no performance advantage over the 10900K. I mean, I never thought they wouldn't at least come out on top in terms of pure performance, considering the efficiency they sacrificed to get here. Rocket Lake was supposed to have an IPC increase to match Zen 2 or Zen 3, but with a clock speed that would make them pale in comparison. How, then, did this happen?

At any rate, here's hoping that Alder Lake will bring us back to performance parity with AMD, and/or that future ARM desktop CPUs will have hardware x86/x64-to-AArch64 translation layers to accelerate x86/x64 code translation/execution on ARM CPUs, so we can start migrating to greener pastures.
Fox2232:

It is about the tone people post with.
Exactly. This is the problem.