AMD Ryzen Quad-Core 2+2 versus 4+0 Core Setups Analyzed

So if it's not the CCXs being the problem, then what is? Well, probably a mix of slower IPC, code that isn't optimized to do things the Ryzen way, and a lack of OC headroom: 4.1GHz versus Intel's 7700K @ 4.8GHz. Maybe if they could match that, they might have a chance... Still doesn't put me off Ryzen though; I'd rather happily pay AMD than be extorted by Intel.
...all except for those who want the best gaming performance...
Then you have the question of whether the best gaming performance is shown by average framerates, by fairly useless "minimum" and "maximum" numbers, or by things like frame latency, where it all gets much muddier. Look at the frame plotting graphs and notice the spikes of the 7700K, even in games it does well in. [youtube]RZS2XHcQdqA[/youtube]
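To make the average-versus-frame-latency point concrete, here is a minimal sketch with two hypothetical frametime traces (invented numbers, not data from the review or this thread): the second trace has the higher average FPS but far worse 1% lows.

```python
# Minimal sketch: why average FPS can hide stutter. Both traces are
# hypothetical; "1% low" here means the average FPS of the slowest 1% of frames.
import statistics

def summarize(frametimes_ms):
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    fps = [1000.0 / t for t in frametimes_ms]
    worst = sorted(frametimes_ms, reverse=True)          # slowest frames first
    one_pct = max(1, len(worst) // 100)
    one_pct_low = 1000.0 / statistics.mean(worst[:one_pct])
    return avg_fps, min(fps), max(fps), one_pct_low

smooth = [12.5] * 400                   # steady 80 fps, no spikes
spiky  = [10.0] * 396 + [45.0] * 4      # ~97 fps average, but four 45 ms spikes

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    avg, lo, hi, p1 = summarize(trace)
    print(f"{name}: avg {avg:.0f} fps, min {lo:.0f}, max {hi:.0f}, 1% low {p1:.0f}")
```

The spiky trace wins on average FPS, yet its 1% low drops to roughly 22fps, which is exactly the kind of behaviour the frame plotting graphs expose and the averages hide.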
All this fighting and arguing is useless. Your most persuasive vote is your wallet. And speaking of which... I'm seeing quite a few people giving a universal, positive "yes" to Ryzen, but very few actually voting with the wallet. The 6-core is the last train for AMD this gen, imho. No one is going to upgrade from Sandy+ to a quad core that is barely better. AMD needs time to polish, but alas... All in all decent. Could have been worse, could have been better.
I really hope the next revision of Ryzen will support quad-channel memory, or possibly even eight channels.
All this fighting and arguing is useless. Your most persuasive vote is your wallet. And speaking of which... I'm seeing quite a few people giving a universal, positive "yes" to Ryzen, but very few actually voting with the wallet. The 6-core is the last train for AMD this gen, imho. No one is going to upgrade from Sandy+ to a quad core that is barely better. AMD needs time to polish, but alas... All in all decent. Could have been worse, could have been better.
True, and I did just that a few days ago. I was undecided between sticking with Intel again or trying AMD, and in the end I bought a 7600K and an ASUS board with a nice rebate. The reasons for this choice are that my PC is primarily for gaming, Ryzen's weak spot, and for me Ryzen's platform is too "raw" for my liking, with too many problems, some small, others big, like people bricking motherboards trying to update the BIOS... Don't know if I made the right choice, but now it's done. I still wish AMD all the luck and I hope they kick Intel's ass once again.
Does anyone have a benchmark on Dota 2 after the Ryzen update? The only thing I can find is a guy on the Chinese forum claiming he went from ~130 to ~160 fps.
I have seen "unofficial" Linux benchmarks that seemed to yield similar results. I too have tried finding some more legit benchmarks, and never found anything.
Also, all this conversation about quads being the better choice for a CPU in 2017 reminds me of some equally surreal conversations in the graphics subforum, where people were insisting that the GTX 960 2GB was a better choice than the R9 380 4GB, because it was a bit faster in some titles and "you can't use the extra memory on such a low-end GPU anyway".
I wouldn't consider it that black and white. I myself am very likely to get the 4c/8t Ryzen, because I know I won't be needing any more than that for a long while. Most games revolve around what consoles can do. Current-gen consoles may have 8 cores, but they're clocked low and most games only use 6 of them; the XB1 only recently permitted access to the 7th core. Do a little overclock and you should be able to play just about any game reliably. "Reliably" is an important word.

Of course, a 6c/12t CPU will last you longer and will offer more performance in most realistic gaming scenarios, and there are a handful of games that can take advantage of it. But if 60FPS is your goal, an overclocked quad core will work too. The most important thing to keep in mind is that developers target the widest audience possible. Even a 4c/8t i7 isn't cheap, and more threads haven't been widely adopted by consumers. Perhaps Ryzen will help change this trend, but from what I've personally seen, there just isn't currently a reason to have more than 8 threads for games, and I don't think this will change for at least 2 or 3 years. Most games on PC still only use 4 threads.
Was thinking the same. Not many have upgraded to Ryzen, but the lack of brackets for coolers, low motherboard availability and teething issues are partly to blame for that. I've upgraded from Sandy to a 4-core and I'm not complaining. In fact, I love my new rig. Funny thing is, I see more people upgrading their Sandy+ to modern Intels.
I guarantee you that if I went to get a CPU now, it would be a Taichi mobo with a 1700 and 3200MHz GeIL memory. But I don't really need a CPU yet; I'm actually holding my cash for the GPU upgrade first. I also can't say that I would ever recommend to anyone getting a new CPU right now to go with a CPU with fewer threads for the same money, given the current IPC differences.
So if it's not the CCXs being the problem, then what is? Well, probably a mix of slower IPC, code that isn't optimized to do things the Ryzen way, and a lack of OC headroom: 4.1GHz versus Intel's 7700K @ 4.8GHz. Maybe if they could match that, they might have a chance... Still doesn't put me off Ryzen though; I'd rather happily pay AMD than be extorted by Intel.
4.1GHz 8-core versus 4.8GHz 4-core: the 8-core will smack it down every day, except in programs that only use 4 threads or fewer. AKA gimping the 8-core to make it look worse.
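For a rough sense of the numbers being argued here, a naive core-count times clock model (my own simplification; it ignores IPC, SMT and memory entirely) shows why the outcome flips with thread count:

```python
# Naive back-of-the-envelope model: usable throughput ~ usable cores x clock.
# Deliberately ignores IPC, SMT and memory; it only illustrates the
# thread-count argument, not real benchmark results.
def throughput(cores, clock_ghz, threads_used):
    return min(cores, threads_used) * clock_ghz

for threads in (4, 8):
    octa = throughput(8, 4.1, threads)   # 8 cores @ 4.1 GHz
    quad = throughput(4, 4.8, threads)   # 4 cores @ 4.8 GHz
    print(f"{threads} threads: 8c@4.1GHz = {octa:.1f} core-GHz, "
          f"4c@4.8GHz = {quad:.1f} core-GHz")
```

With 4 threads the quad's clock advantage wins (19.2 vs 16.4 core-GHz); with 8 threads the 8-core pulls far ahead (32.8 vs 19.2), which is the point about 4-thread programs "gimping" the 8-core.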
Then you have the question of whether the best gaming performance is shown by average framerates, by fairly useless "minimum" and "maximum" numbers, or by things like frame latency, where it all gets much muddier. Look at the frame plotting graphs and notice the spikes of the 7700K, even in games it does well in.
It's crazy how much Ryzen performance changed with some updates and an overclock on the RAM.
Then you have the question of whether the best gaming performance is shown by average framerates, by fairly useless "minimum" and "maximum" numbers, or by things like frame latency, where it all gets much muddier. Look at the frame plotting graphs and notice the spikes of the 7700K, even in games it does well in.
I'm not planning on a new computer anytime soon, but Ryzen looks to be exactly what I would be looking for. I do about 70% gaming and 30% video conversion/3D object creation and manipulation. I've always been a 60FPS/Hz + V-sync person, so the idea that Ryzen has higher lows with lower highs makes it sound like the perfect choice for a rig with a high-end GPU and somebody looking for a 60FPS lock at the highest graphical settings allowable. Can't wait to see the Ryzen refresh with higher clocks and 4000MHz RAM, paired with a Vega GPU.
Seems 4+0 performs just the same as 2+2, and both are worse than 4+4 or 3+3, if there even is a difference. That would mean the interconnect isn't the issue after all. To me that conclusion is actually positive.
Yes, I agree, it is positive news, because the CCX layout is part of the architecture and can't be changed. They still need to find out why Ryzen is slightly underperforming in games, though. It's not the Windows 10 scheduler; AMD admitted it wasn't that. But I remember them saying that games would need to be developed with Ryzen in mind, so I think it's down to software developers to take advantage of the Ryzen architecture, and this could take some time. The increasing availability of faster RAM kits for Ryzen will help too.
Ya, let's compare best-case-scenario Ryzen to a 7700K that is gimped. You realize that even the 7700K scales well with higher RAM speeds, and most samples should be able to do at least 5.1GHz. Bench a 7700K with 4000MHz+ RAM at 5.1GHz, and let's see if the Ryzen CPU still wins.
There's nothing gimped about the tests there. What's gimped is taking a program that utilizes at most 4 threads and claiming the i7 7700K is faster because 4 of its cores are faster than 4 of the 8 cores of another CPU. Whoop-de-frickin'-do. Btw, GREAT FIRST POST. :thumbup::thumbup::thumbup::thumbup:
Actually, the i5-7600K is already quite a bit behind the i7-7700K in some games. The time of plain quad-core CPUs without SMT is over; the next step is quads with SMT.
Agree.
This reminds me of the Core 2 Quads, which were two dual-core dies in one package (2+2), up against the Phenom IIs, which were four cores on a single die, and the Intels did very well for themselves. Granted, the C2Qs didn't have any of this new technology either, but people still complained that the C2Qs were not true quad-core CPUs. I know the new AMD CPUs have a penalty that causes a slowdown when threads talk to cores on a separate CCX. According to those graphs there really isn't that much difference between a CPU that is 2+2 and a CPU that is 4+0. However, I wouldn't have put that high an OC on the 7600K and the 7700K; I would have left them at stock, including the turbo clock frequencies. Plus I would have added an Intel hex-core CPU into the mix, since there is a hex-core Ryzen in there as well. Yeah, I know the purpose of this graph was to analyze 2+2 CPUs vs 4+0 CPUs.
YouTube tech guru Steve of AdoredTV (the Scottish guy in Sweden) put up an amazing video that essentially proves that while CPU gaming tests at low resolutions show which CPUs are better at gaming TODAY, they have dubious accuracy at best for predicting the long-term viability of any given CPU in gaming. He points out that while the i5-2500K was 10% faster than the FX-8350 back when the FX-8350 came out, in modern games with more advanced GPUs the same low-resolution tests now show the FX-8350 to be more than 10% faster than the i5-2500K. The reason is that modern games are more thread-aware and are much better at taking advantage of more threads. This doesn't invalidate the accuracy of those original gaming tests; it just shows that the situation can change completely, and what was great back then may be worse today for completely different reasons. We all knew that the Bulldozer design philosophy was too far ahead of its time for its own good, since no software developers were making games for eight threads. The thing is, AMD claimed that this was the principal reason FX was suffering, and AdoredTV's video vindicates them by proving their claims to be correct. AdoredTV video in question, "The Press Loses The Plot": https://www.youtube.com/watch?v=ylvdSnEbL50
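To make the thread-scaling argument concrete, here is a rough Amdahl's-law style sketch (the 0.6 per-core speed ratio and the parallel fractions are numbers assumed purely for illustration, not figures from the video):

```python
# Rough Amdahl's-law sketch of how a slower-per-core 8-thread CPU can close on
# (or pass) a faster 4-core as games become more thread-aware. The per-core
# ratio (0.6) and the parallel fractions are assumptions for illustration only.
def relative_frame_time(parallel_fraction, cores, per_core_speed):
    serial = 1.0 - parallel_fraction
    return (serial + parallel_fraction / cores) / per_core_speed

for p in (0.5, 0.8, 0.95):
    quad = relative_frame_time(p, cores=4, per_core_speed=1.0)  # fast quad (2500K-like)
    octa = relative_frame_time(p, cores=8, per_core_speed=0.6)  # slower 8-thread (FX-like)
    print(f"parallel fraction {p:.2f}: quad {quad:.2f}, octa {octa:.2f} (lower = faster)")
```

In this toy model the 8-thread part only pulls ahead once the game is almost entirely parallel, which is also why both sides of the argument below can find benchmarks that back them up.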
I just watched another YouTuber who debunked this guy's benchmarks with benchmarks of his own. Admittedly it's an FX-8370, but there's really no difference between an 8350 and an 8370. https://www.youtube.com/watch?v=76-8-4qcpPo
I just watched another YouTuber who debunked this guy's benchmarks with benchmarks of his own. Admittedly it's an FX-8370, but there's really no difference between an 8350 and an 8370. https://www.youtube.com/watch?v=76-8-4qcpPo
I'm positive even an FX-9590 would lose to a stock 2500K when it comes to gaming.
I just watched another YouTuber who debunked this guy's benchmarks with benchmarks of his own. Admittedly it's an FX-8370, but there's really no difference between an 8350 and an 8370. https://www.youtube.com/watch?v=76-8-4qcpPo
Ouch, AdoRed Leader got served 😀
If a CPU keeps a game's minimum frame rate at 30fps or above, it has done its job, and a CPU with a minimum of 80fps won't look or feel any different. Go to YouTube and seek out the channel "Testing Games" and you'll see them pit different CPUs and different GPUs against each other in the same game. They display it as a split screen, and you know what? Regardless of what BS anyone says about "this is better" or "that is better", they all look exactly the same as each other. You see the FPS in the corners, and sometimes they're as far apart as 40fps, but the games still look exactly the same throughout the entire vid. Why do you think that is? https://www.youtube.com/channel/UCueyCLPsCxjk1ZwtAmu6Gvw/videos Don't take my word for it, pick one, pick ANY one! They all look the same and they all look fine (and they DO have an FX-8350 in there too!).
30fps minimum is the same gameplay as 80fps minimum? OH LORDIE! You for real??
How would either of them "lose" when both keep a minimum frame rate of above 30fps? The way people talk, my FX-8350 should be dead by now, unable to run any of the newest games. And yet, it does, perfectly. Can you explain this?
The best thing to do when the CPU is dropping to 30fps might be to set a half-refresh-rate cap and keep it at a steady 30fps. Dropping to 30fps is what's considered borderline playable these days.
It's called borderline because it hasn't hit below playable yet. I did also say minimum frame rate, which means that 30fps is the LOWEST that happens including all dips. If your minimum frame rate is 30 or higher, you're laughing.
No, you're not laughing. 30-60-30fps might be the worst gameplay experience there is. Why are we discussing these basics...
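To put numbers behind the 30-60-30 complaint and the half-refresh-rate suggestion above, here is a small sketch of how double-buffered V-sync on a 60Hz display quantizes frame delivery (the render times are invented for illustration; real V-sync implementations vary):

```python
# Sketch of double-buffered V-sync at 60 Hz: each frame is held until the next
# refresh boundary, so a frame that misses the 16.7 ms budget is shown for
# 33.3 ms. Render times below are invented for illustration.
REFRESH_MS = 1000.0 / 60.0

def presented_times(render_times_ms):
    # round each frame up to the next multiple of the refresh interval
    return [round(REFRESH_MS * -(-t // REFRESH_MS), 1) for t in render_times_ms]

uneven = [15.0, 25.0, 15.0, 25.0, 15.0, 25.0]   # hovers either side of the budget
capped = [30.0] * 6                              # half-refresh-rate cap, steady 30 fps

print("uneven:", presented_times(uneven))   # alternates 16.7 / 33.3 ms -> judder
print("capped:", presented_times(capped))   # constant 33.3 ms -> even pacing
```

The uneven trace flips between 16.7ms and 33.3ms presented frames, which is the 30-60-30 judder being described, while the capped trace presents every frame for the same 33.3ms.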