Retro review: Intel Sandy Bridge Core i7 2600K - 2018 review


https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Great review Boss.
https://forums.guru3d.com/data/avatars/m/226/226864.jpg
Nice. I'd love to see one of those with the Core i7 3930K or 3960X Sandy Bridge E (including 4-4.5GHz overclock) as well. I somehow doubt it's worth an upgrade right now, especially when taking the high prices into account.
https://forums.guru3d.com/data/avatars/m/254/254955.jpg
Do people really exist who use the 2600K at stock clocks? What's the point of the review? Let's make a strawpoll and ask 2600K owners what clocks they run this CPU at; I think 90% of users run it overclocked. And where are the more processor-oriented games like BF1 etc.? MAX and LOW FPS? ..
https://forums.guru3d.com/data/avatars/m/47/47947.jpg
Thx man. I'm still fine with my 2500k @4.5 and a 1060. πŸ™‚
https://forums.guru3d.com/data/avatars/m/173/173869.jpg
Running a 3930K @ 4.5. Not even thinking of upgrading; this is a strong beast for gaming and other tasks. And I still can't believe the price in 2011 vs. 2018.
data/avatar/default/avatar15.webp
Great article. I still use a 3770K @ 4.7 and a 1080 Ti. That lets me game at high resolutions using DSR, so my GPU is usually the bottleneck. The only title where I feel the need for a better CPU is AC Origins; certain places in that game struggle to hold 60fps, but then again, that's at ultra settings. For me, the fact remains that a Β£1000 upgrade is still just not worth it. Perhaps if I had a weaker GPU and played at 1080p, it might be. The number of times I have nearly bitten the bullet and upgraded is crazy, but at my gaming resolutions the extra 3-8 fps surely isn't worth it, is it? Especially when we are talking something like 110 fps when it could be 116 fps with an 8700K.
https://forums.guru3d.com/data/avatars/m/115/115462.jpg
For 1440p and above it's still more than enough for most games (with the exception of the ones that use more CPU power/cores, like BF1 for example). A legendary CPU, really. 😱
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
Hilbert Hagedoorn:

Gaming, however, was an interesting topic. Here the reality is simple: the 2600K runs out of juice in CPU-bound situations like low resolutions. The fact remains, though, that at 1080p it still has enough oomph to deliver decent enough numbers on anything below a GTX 1080; not hugely great, but certainly decent enough. When we take the GPU out of the equation and look solely at 720p performance, you can see and measure a rather dramatic effect where Sandy Bridge limps behind. But let's always remember: a GPU bottleneck is far more apparent than a CPU bottleneck.
To me, and you don't seem to mention this at all, it looks like there's no difference at all in QHD (1440p). So if you're on Sandy Bridge and are split between getting a new GPU + a completely new system vs. getting a new GPU + a 1440p G-Sync monitor, you're probably well advised to get the GPU and the 1440p monitor. If, on the other hand, you're all about 1080p or lower + the highest FPS you can get ("300FPS competitive stuff"), you should opt for a new platform instead. (Of course, if you have loads of money to throw around, you'll upgrade everything... πŸ˜›)
https://forums.guru3d.com/data/avatars/m/260/260048.jpg
Mega Guide: do you have a 1080p screen, or 1440p/4K? 1080p: yes, worth an upgrade. 1440p/4K: no, keep using it, as the GPU is the bottleneck.
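The reasoning in that guide boils down to a simple frame-rate model: the system delivers whichever rate its slowest component can sustain. A minimal sketch of that idea (the min() model and the example FPS numbers are illustrative assumptions, not figures from the review):

```python
# Rough bottleneck model: delivered FPS is capped by the slower component.
# The numbers below are illustrative, not measured.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate the whole system can sustain."""
    return min(cpu_fps, gpu_fps)

def bottleneck(cpu_fps: float, gpu_fps: float) -> str:
    """Which component is holding the frame rate back."""
    return "CPU" if cpu_fps < gpu_fps else "GPU"

# At 1080p a fast GPU outruns an old quad-core: CPU-bound, so a CPU upgrade helps.
print(bottleneck(cpu_fps=110, gpu_fps=160))  # CPU

# At 1440p/4K the GPU becomes the limit: a new CPU barely moves the needle.
print(bottleneck(cpu_fps=110, gpu_fps=75))   # GPU
```

This is why a faster CPU only shows up in benchmarks once the GPU stops being the ceiling, which is exactly what the 720p results in the review demonstrate.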
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
burebista:

Thx man. I'm still fine with my 2500k @4.5 and a 1060. πŸ™‚
When you see that my 1st child still uses an old Core 2 Q9550 OC'd with a 1060, and R6 Siege puts everything at max by default at 1440... then of course you are still fine with your 2500K πŸ™‚
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Stock 2600K @ 3.4GHz. Add 30-40% to the benches, since these chips can easily OC that far.
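That 30-40% figure roughly matches simple clock scaling (my arithmetic, assuming near-linear scaling with frequency, which real workloads only approximate):

```python
# Headroom of a stock 2600K, assuming performance scales ~linearly with clock.
stock_ghz = 3.4   # 2600K base clock
oc_ghz = 4.5      # a common easy overclock for these chips

uplift = (oc_ghz - stock_ghz) / stock_ghz
print(f"{uplift:.0%}")  # 32%
```

A 4.7-4.8GHz chip lands closer to the top of that 30-40% range; memory speed adds a bit more on top, which is why the stock DDR3-1333 configuration in the review understates what most owners actually ran.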
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
Well, you all bought in your overclocking capability, which has nothing to do with the cleverness of the old days of overclocking; back then it was all done at real risk, not paid for. What it basically means is that you didn't OC at all; it was bought-in speed.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
Ourasi:

A 2600K overclocked to ~4.5GHz with 16GB 2133MHz RAM turns into a whole other story. Running it at stock with 1333MHz RAM is kinda "what is wrong with this picture" stuff, but hey, some grandma might run it at stock as an "interweb thingy" πŸ˜›
Non-OC Intel 2*00s are not as weak as you think...
data/avatar/default/avatar30.webp
Great review HH!! Thanks.
data/avatar/default/avatar03.webp
Stock clocks and DDR3-1333? Wish you had at least thrown a 4.5GHz+ config with some faster RAM into the mix.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
ttnuagmada:

Stock clocks and DDR3-1333? Wish you had at least thrown a 4.5GHz+ config with some faster RAM into the mix.
Yeah, that's what I wanted to do as an extra (tweaking); the H67 board, however, did not allow CPU tweaking. Memory-wise I had to choose: either faster 8GB or 16GB at reference JEDEC speeds. I really wanted 16GB of memory as opposed to the faster 8GB I had sitting there (all dusty, btw). Also, the H67 mobo had issues with faster memory (which was a normal thing in the pre-XMP days πŸ˜‰ )
https://forums.guru3d.com/data/avatars/m/189/189438.jpg
Not sure about PCIe lane improvements; my last six CPUs (2500K, 4790K, G4560, 6600K, 6700K and 7700K) have all been limited to 16 lanes, whereas with my 4820K I had 40 on the CPU to play with, and I didn't have the option of M.2 PCIe x4 back then.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Still a better CPU than mine, which is a perfect example of how Intel didn't bother to do anything during all those years when AMD couldn't put up a fight. Three whole generations between that CPU and mine, yet the old basic i7k still beats the much newer i5k. But I'm glad Intel made hundreds of billions of profit in the meantime. Not a single cent of it was used for the benefit of the customers. I'm so glad AMD was finally able to make a comeback. It was such a sick market.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Ourasi:

Here you can see what my 2600K does at 4.7GHz with 16GB 2133MHz RAM in AIDA64. Would have been fun to see you run the entire benchmark suite at these settings πŸ˜›
Did you just volunteer to help HH with your board and those benches? πŸ˜€
https://forums.guru3d.com/data/avatars/m/253/253070.jpg
Great article, was a pleasant read. But I swear, the "time to upgrade?" question comes round in full force with each hardware release. And usually the answers to those questions were always as valid as they were predictable: stick with what you have and be content, or get an upgrade if you need it and can afford it. At least with the Ryzen and Coffee Lake releases, people started to feel a leap in improvement. Just kinda happy the years of Ivy Bridge/Haswell/Skylake quad-core upgrade discussions have started to dull out.