AMD Ryzen 2600 Benchmark Spotted

https://forums.guru3d.com/data/avatars/m/196/196284.jpg
fl2015:

I didn't even want to mention SB in this thread until the other guy said it was a night & day difference for him switching to Ryzen.
Differences can be heavily dependent on workload. For me, there was a huge difference between an i5 6600K and my Ryzen 5 1600. There are other uses for processors and graphics cards than just gaming, and it's in those use cases where the biggest differences show.
https://forums.guru3d.com/data/avatars/m/147/147322.jpg
From what I read, any AM4 mobo with an older BIOS will have issues booting with the 2400G. There are also mobos that can flash the BIOS without a CPU present - any examples of which motherboards offer that?
https://forums.guru3d.com/data/avatars/m/147/147322.jpg
I guess the Asus Strix B350-F does not have a BIOS flashback option and I would need to buy a higher model, right? Pity.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Pinscher:

Here is an article on 14nm LPP vs 12nm LP. My bad, it's from AnandTech. https://www.anandtech.com/show/11854/globalfoundries-adds-12lp-process-tech-amd-first-customer I can't find the article I read on 14nm FinFET to 14LPP, but I found a similar one. https://www.anandtech.com/show/9959/samsung-announces-14lpp-mass-production This is relevant as GloFo botched their own 14nm and had to license Samsung's... It was a potential 15%... anyways, whatever, right? It's just over-promising from GloFo, which seems standard these days.
There are different implementations of the same node for different applications; some of the changes listed for a low-power product (mobile) may not be compatible with a high-performance part. Also, maximum clock speed on a chip is only partially governed by the manufacturing process (cell libraries make a big difference here) - a good example of this is the 1050 Ti, which can be clocked to ~1.9GHz on Samsung's 14nm LPP node, while the similarly sized RX 560 on the same node maxes out at about ~1450MHz from my quick googling. I agree with D3M1G0D - I didn't expect Zen+ to get much other than a slight clock speed/power adjustment. I think most of the gains will come with Zen 2.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Pinscher:

My friend, I'm only trying to point out that a 2600 being labeled as an upgrade over the 1600 with such a % speed boost is false to all us enthusiasts. I could take a marker and mark my 1600 with a 2 and change my clock speeds to match the 2600, and my performance, according to other provided metrics, would be nearly identical in ST.
According to what reliable data? The only point I'm trying to make is nobody should be jumping to any conclusions, good or bad, based on the data available to us. Though, so far it seems Zen+ will at least be better. Whether it is worth getting is a different story, because there's not enough data.
Here is an article on 14nm LPP vs 12nm LP. My bad, it's from AnandTech. https://www.anandtech.com/show/11854/globalfoundries-adds-12lp-process-tech-amd-first-customer
That still doesn't mention where 25% is coming from. It mentions 15% higher transistor density and 10% higher frequency potential, but those are not percentages you can add. I don't recall transistor density having a proportionate impact on performance (in other words, 15% higher density doesn't translate to 15% faster calculations), and the performance you get out of clock speed is dependent upon other factors like IPC. All that being said, the 2600 will likely be around 10-15% faster clock-per-clock than the 1600, while using less energy and having better overclock potential. Personally, I wouldn't consider 10-15% worth upgrading from a 1600, but if you can clock to something like 4.4GHz, that could be worth it.
I can't find the article I read on 14nm FinFET to 14LPP, but I found a similar one. https://www.anandtech.com/show/9959/samsung-announces-14lpp-mass-production This is relevant as GloFo botched their own 14nm and had to license Samsung's... It was a potential 15%... anyways, whatever, right? It's just over-promising from GloFo, which seems standard these days.
I don't doubt any of that; my doubts come from the predicted performance of the 2600 that you mention. Going from 14nm to 12nm is not going to affect performance that much.
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Pinscher:

Well, you make a good point. Zen 2, currently in the form of the 2600 engineering sample, isn't higher clocks, and clock for clock it's not higher IPC.
Zen 2 doesn't have an engineering sample, nor is the 2600 Zen 2 - it's Zen+. If you think I'm being a stickler on names, then you don't understand what Zen+ is vs Zen 2. The rest of your message goes on about Zen+, so I assume you meant Zen+, not Zen 2, hopefully.
Pinscher:

Releasing Zen+ at a slightly higher base clock and a turbo under 4.1GHz isn't anything to write home about. This isn't the upgrade path we were promised with 3 generations of Zen on the AM4 platform. Segues are not upgrades.
I'm not sure what you're on about in regards to what we were "promised"; Zen+ was always "promised" to be a slight improvement. This leads me back to my original question of whether you understand the difference between Zen+ and Zen 2. Zen 2 is supposed to be the performance improvement. However, without knowing where the frequency cap will be on Zen+, how exactly can you even say it isn't an upgrade path? And with it being a faster processor than a 1600, if released for the same MSRP as the 1600 is currently, then how is it not an upgrade path? I would agree in general that going from the 1600 to the 2600 would not be much of an upgrade, but then again, most people aren't upgrading their CPUs every year, and if they do, they are likely upgrading to the next model up, not the same model of a different generation. It has realistically never made sense to do that within 1 year for a CPU.
Pinscher:

Let's both hope that something changes in the near future, as I really did want to get three generations of upgrades out of AM4. That's why I blew a load of money on the memory, to carry me through the lifespan of the product line.
Why would you want to upgrade the CPU 3 times in 3 years? Graphics card, sure, I get it. SSD, and maybe even HDD depending on what you are doing with them, sure, I get it. CPU? That makes no sense, and never has. Once every 2-4 years, sure. The more I read your message, the less sense you make.
Pinscher:

Here is an article on 14nm LPP vs 12nm LP. My bad, it's from AnandTech. https://www.anandtech.com/show/11854/globalfoundries-adds-12lp-process-tech-amd-first-customer
From that article: "GlobalFoundries promises that its 12LP provides a 15% higher transistor density and enables a 10% higher frequency potential (at the same power and complexity) compared to “16/14nm FinFET solutions on the market today”." That's 15% higher transistor density, and 10% higher frequency POTENTIAL. Meaning, most Ryzens can currently get to 4.1GHz, right? That means if what they said is true, then that cap should be lifted to roughly 4.5GHz. Since we don't know whether that's true or not, I'm not sure what you are complaining about. Unless you're complaining that the 2600 is 3.4GHz rather than 3.5GHz, which would be a 10% increase from the 1600 - in which case, sure, I guess? It still doesn't make sense to complain.
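To spell out that arithmetic in a quick sketch (taking the ~4.1GHz OC ceiling and the 1600's 3.2GHz base clock discussed in this thread as givens - these are the thread's rough figures, not official specs):

```python
# Quick arithmetic on GloFo's "10% higher frequency potential" claim.
# Assumptions: ~4.1GHz as the typical first-gen Ryzen OC ceiling and
# 3.2GHz as the 1600's base clock, per the discussion above.
oc_ceiling_ghz = 4.1
base_1600_ghz = 3.2

print(f"OC ceiling + 10%: {oc_ceiling_ghz * 1.10:.2f} GHz")  # ~4.51 GHz
print(f"1600 base + 10%:  {base_1600_ghz * 1.10:.2f} GHz")   # ~3.52 GHz vs the 2600's 3.4
```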
Pinscher:

14nm LPE to LPP is 15%, as expressed by GloFo/Samsung. 14nm LPP to 12nm LP is 10%, as expressed by GloFo. These add to a potential 25% performance bump when going from 14nm LPE to 12nm LP.
You keep throwing these numbers around, and when asked for proof of these statements, you give one article that didn't even say what you're saying, and another article that has nothing to do with AMD per se, but rather idealistic figures from Samsung. The Samsung article you posted is about the switching speed of the transistors, which, sure, could translate into a 15% increase in CPU speed; the problem is, that does not take into account the architecture of the processor. If all processors were bound by the switching speed of the transistor and only that, then all processors would have the same frequency limit. And we all know that's not the case.
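A toy model of that last point (the ~15ps gate delay and the gate counts below are made-up illustrative numbers, not real figures for any of these chips): the clock ceiling is set by how many gate delays must fit into one pipeline stage, which is an architectural choice.

```python
# Toy critical-path model: max clock depends on how many gate delays
# must fit in one pipeline stage - an architecture choice - not just
# on how fast an individual transistor switches.
# Gate delay and gate counts are illustrative numbers only.
GATE_DELAY_PS = 15.0

def f_max_ghz(gates_in_critical_path: int) -> float:
    period_ps = gates_in_critical_path * GATE_DELAY_PS
    return 1000.0 / period_ps  # 1 GHz corresponds to a 1000ps period

print(f"long stage (30 gates):  {f_max_ghz(30):.2f} GHz")  # ~2.22 GHz
print(f"short stage (15 gates): {f_max_ghz(15):.2f} GHz")  # ~4.44 GHz
# Same transistors, ~2x difference in clock ceiling - one reason chips
# on the same node (e.g. 1050 Ti vs RX 560) can clock so differently.
```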
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Pinscher:

14nm LPE to LPP is 15%, as expressed by GloFo/Samsung. 14nm LPP to 12nm LP is 10%, as expressed by GloFo. These add to a potential 25% performance bump when going from 14nm LPE to 12nm LP. I'm just reading and expressing what has already been said by those far smarter than myself. When reading the numbers the pros are providing for the potential performance increases from the switch from 14nm LPE to 12nm LP, you can formulate an expectation.
But... you can't add up those numbers like that, and nobody said they could be. To put it in another perspective: if you reduced the weight of a car by 10%, that doesn't make it 10% faster or 10% more fuel efficient. It will accelerate faster and it will use less fuel in doing so, but neither will be 10% better. Shrinking the transistors is the same idea. A 10% reduction doesn't yield a 10% boost in speed, nor does it yield a 10% decrease in energy usage; it will go faster, but not 10% faster, and it will use less energy, but not 10% less energy.

The 15% boost from LPE to LPP refers to clock speeds, which doesn't affect IPC compared to last-gen hardware. What all this means is, when comparing clock to clock, we're going to see a minimal (definitely less than 10%) performance improvement from the transistor shrink, and we can OC at least 15% higher. Any other performance improvements will be related to changes in the architecture itself. Zen+ is not a significant upgrade, but rather a refinement of Summit Ridge. I don't remember what claims AMD made about that, but the performance gain is likely going to be less than 15%, and whatever their claim is, it likely already includes the performance boost of the die shrink.

TL;DR: We're probably going to see up to a 15% overall improvement in the 2600 vs the 1600, and we can potentially OC 15% higher.
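To sketch why those percentages don't simply add (a toy calculation taking the quoted figures at face value; the "best case" framing is my own assumption):

```python
# Why "15% + 10% = 25%" doesn't work: the two figures measure different
# things (density vs. frequency potential), and only frequency feeds
# into single-thread performance at all directly.
density_gain = 0.15     # more transistors per mm^2 - doesn't speed up a core
freq_potential = 0.10   # upper bound on clock improvement

naive_sum = density_gain + freq_potential
# Best case for performance: the full frequency potential is realized
# and IPC is unchanged, so performance scales with clock alone.
best_case_perf = (1 + freq_potential) - 1

print(f"naive '15% + 10%':     {naive_sum:.0%}")       # 25%
print(f"realistic upper bound: {best_case_perf:.0%}")  # 10%, from clocks only
```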
For me, I think somewhere near half of what's being expressed is likely to be accurate. Or, it's safe for us to expect half of what they promise as the result of their efforts.
If by half you're referring to half of 25% then yes, I agree. It is very unlikely the 2600 will be more than 15% faster clock-per-clock than the 1600.
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
D3M1G0D:

So what, you bought your CPU only as a stopgap or temporary measure? I don't know about you, but when I buy a CPU I expect it to last for a very long time (I did NOT buy a Ryzen 7 expecting to replace it within a year or two). In fact, I would be perfectly happy still using my 4-year old Core i7 4790K for everyday usage, except that I wanted more cores for grid computing and AMD offered more cores and threads for far less.
Well, I did buy my 1500X as a temp measure. I couldn't afford a 1600X when I switched from my 3930K setup, and I was not gonna get an Intel system again. EFF that. And I still don't regret it. I needed/wanted M.2 support and better IPC, and was not gonna buy a f**king overpriced 4790K quad core. I will be getting the R5 1600(X) refresh because I don't have one. It will be a nice upgrade. And guess what, I'll get it with a super cheap board combo so that I can throw my 1500X in that and make a cheap modern HTPC. My HTPC is still a Q6600 @ 3GHz, 6GB DDR2-800, and an 8400GS. I wanna finally have an HTPC that handles 4K.
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
mikev190:

When you say multi-threaded, don't you mean two threads? Also, the graphics engine and UI all run off the same thread, which is why not even the most insane PC can play this game at the 10 preset. Unless you call 50% GPU usage and 8% CPU usage @ 1440p with 150% render scale showing off the multicore usage. If you're going to try to discredit what I'm saying, please at least know how WoW works.
That's actually the funniest thing I've read all day. Thanks, I needed that laugh. I had a really crappy day and I appreciate you posting something so stupid just to make me laugh. If all you're getting is 50% GPU usage and 8% CPU usage, it's time to let someone configure your system who actually knows what they're doing. When I stopped playing, I was running an i7 2600K @ 4.5GHz with a GTX 970 at 70-80% CPU usage and 95-98% GPU usage, with the graphics settings maxed, in a 25-man raid. I'd be willing to bet you're also one of the ones that didn't believe WoW supported DX10 just because you couldn't switch to it in the UI settings. (Here's a hint: it required a switch be added to run the game in DX10.)
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Pinscher:

My friend, if you're talking about affording CPUs, why are you buying the ones that don't come with a cooler and don't perform any better than the ones that come with a cooler? The logic is lost on me. The 1500X is $175 w/o cooler and the 1600 is $219.99 w/cooler. Both will run 3.9GHz. The 1500X needs a cooler, so +$30 at least.
The 1500X comes with the Wraith Spire cooler, so I'm not sure what you're on about. Also, you should probably not spam 4 (edit: 5 now, as I was writing this post) posts that could have been 1.
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
Pinscher:

My friend, if you're talking about affording CPUs, why are you buying the ones that don't come with a cooler and don't perform any better than the ones that come with a cooler? The logic is lost on me. The 1500X is $175 w/o cooler and the 1600 is $219.99 w/cooler. Both will run 3.9GHz. The 1500X needs a cooler, so +$30 at least.
I bought one with a cooler, so nice try. Plus I got this DeepCool aftermarket cooler for $8 on Amazon: https://i.imgur.com/1zhUccTh.png The 1600 was $249 with no cooler back in September when I built my Ryzen rig; the 1500X was $189 with the cooler. $129 for the motherboard, $189 for the CPU, $158 for the RAM - I had a $500 budget. I already had the PSU, case, GPU, and hard drives from my 3930K build. I sold my 3930K setup for $375 for a quick sale; it sold 11 hours after posting. So I don't know why you are tripping on CPUs without coolers. AM2/AM3 coolers work on Ryzen with the snap retention brackets.
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
mikev190:

Those figures you chucked out about the 2600K and 970 just prove my point about your knowledge. Of course they are going to have higher usage - it's basic common sense. The 970 and 2600K are much weaker, so that higher percentage you got is because it takes a 970 a hell of a lot more effort to push 60 fps than a 1080 Ti, for example. It's like saying my CPU in AC Origins is weaker than my old man's 4790K because my usage is 40% and his is 80%. Basically you've used the most retarded methodology, instead of just posting a screenshot of WoW's "epic core scaling". Like I said, learn how WoW works and is programmed, and please remember that a 13-year-old game has 0 Ryzen support. Stick to your old games and mid-range hardware, because you've clearly only seen the power of this hardware in reviews. Bringing DX10 into the conversation just shows you're trying to take the chip on your shoulder out here. Anyone with half a brain knows DX10 is a fail, like your knowledge of WoW and hardware scaling. Also, prove this excellent CPU usage then, because when I had an i5 2500K I never went over 50% usage in WoW - and don't tell me it's because of HT. Put your money where your mouth is. The "let someone know what they were doing" part was the highlight of your factless comeback. I've spent 100s of hours tweaking Ryzen and my 1080 Ti, when all you've got is baseless opinions and a lack of real hands-on experience with top-end hardware. Rather disappointing coming from a so-called "ancient guru". Read and learn: https://us.battle.net/forums/en/wow/topic/20757426421?page=4
Let's see... where to begin...

I stopped playing WoW in 2014, prior to the launch of Warlords of Draenor. I logged in for a couple hours in January 2015 to burn a game time card I had received. Haven't touched the game since. The next step up from my i7 2600K would have been an i7 3770K at that time, and the next step up from my GTX 970 would have been a GTX 980. So, hardly "mid-range" hardware for the time.

That thread you posted is rather irrelevant. The OP provides no information necessary to determine what his issue is. If he has the client set for DX9, of course it's only going to use 1 core. In the 9th post in the thread, someone mentions using 2 GTX 970s in SLI... but WoW doesn't support SLI. Never actually has. Then, at the bottom of the 2nd page, the OP states that the API and graphics settings make no difference, which is incorrect. Also in that thread, it's mentioned that simply adding support for Vulkan would allow WoW to run in Linux, which is also incorrect. And a later post in that same thread, on the same page you linked to, states that you're wrong, along with an explanation of how WoW's multi-core implementation actually works.

I mentioned DX10 because you claim to know everything about WoW. Blizzard added DX10 support in The Burning Crusade, when they updated the "game engine". Blizzard removed DX10 support when Wrath of the Lich King was released.

Now, on to the rest. WoW's "game engine" was in fact updated during The Burning Crusade to add multi-core support. It's really not my fault you're too stupid to comprehend what is being said. You're bitching and whining like a little kid without even the slightest clue what you're saying. Multi-core support is NOT the same as multi-threading support, which is also different from supporting SMT. As a point of fact, if WoW had "zero" Ryzen support (as you claim), the game wouldn't run at all. Since WoW is coded to run on x86 hardware, it has support for Ryzen built in. But why should you worry about actual facts now when you haven't so far...

That "mid-range hardware" statement really hurt... lol. I think that was about the most pathetic thing in your entire post. I mean, I was pulling 60fps in 25-man raids back in 2014 with "mid-range" hardware, so I really have nothing to feel bad about. As for providing you proof, there's really no need. You're the only one that has an ego problem, and regardless of anything I could post as "proof", you'd still claim it was false.
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
mikev190:

I guess I upset this butthurt strawberry tho.
Nope... still laughing at your stupidity. You prove your own stupidity by referencing threads on other forums that leave out the details necessary to determine the cause of the issue... How about finding some actually relevant threads if you're going to keep posting?
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I'm probably going to regret butting into this, but I think there are points that need to be made, from an unbiased perspective. In other words, I'm not addressing anyone specifically:
* I think we can all agree that WoW uses multiple threads, and just about any Ryzen can play WoW just fine.
* I think we can all agree that the game plays fine on even a quad core.
* It is important to distinguish that a program/game using multiple threads doesn't mean it's dynamically multi-threaded. In the vast majority of modern games, threads have pre-set functions. Most games now need a minimum of 2 cores/threads in order to be even slightly playable, because there's almost always a bare minimum of 1 thread for game logic and another thread dedicated to rendering. Meanwhile, some games have another thread dedicated to post-processing, another thread for physics, and so on (see the sketch after this post).
* Having said my 3rd point, StarCraft 2, another very popular (but newer) game from Blizzard, doesn't see any significant performance improvements when you get beyond a quad core. As for WoW, I decided to look up benchmarks myself, from sources nobody here provided, and confirmed that WoW does not take advantage of more than 4 physical cores. However, WoW does slow down with CPUs that use 2c/4t. To my understanding, these benchmarks did not include a local server process, which obviously would take advantage of more cores.
* Continuing my 3rd point again, even modern games designed for many-thread CPUs have a limit. From what I recall, Ashes of the Singularity for a while could "only" take advantage of 12 threads, whereas now it is limited to 16. You can keep adding cores and it won't use them. The point is, I am pretty confident there is currently no game in existence that will dynamically use every thread available to it.
* Whether this is relevant to the topic at hand or not, 8-threaded CPUs are not going to be obsolete for several years. Even though there are games like Ashes that can use more threads, Ashes is still plenty playable on even 6- or 4-threaded CPUs.

Now, does anybody disagree with anything I just said? Is there any argument still left open that I didn't address (you guys have a lot of rambling to sift through)?

@mikev190 Regardless of how right or wrong you may be (I'm not making that call), the problem is you let sykozis (someone who I find is frequently and needlessly antagonistic) get to you. Your points will not get far, regardless of evidence, when you start insulting people - especially insulting them in a way that has nothing to do with anything they said. And to clarify, I'm not ganging up on you; I'm not addressing sykozis here because doing so is a waste of my time. You seem to be the more reasonable person, though at this point a bit in a panic.
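A minimal sketch of that fixed-role threading pattern (hypothetical structure for illustration only - not WoW's, or any real engine's, actual code):

```python
# Fixed-role threading: each thread has one pre-set job, so core count
# beyond the number of roles buys nothing. Illustrative sketch only.
import threading
import queue

render_queue: "queue.Queue[str | None]" = queue.Queue()

def game_logic() -> None:
    for frame in range(3):
        state = f"frame {frame} state"  # simulate/update the world
        render_queue.put(state)         # hand the result to the renderer
    render_queue.put(None)              # sentinel: no more frames

def renderer() -> None:
    while (state := render_queue.get()) is not None:
        print(f"rendering {state}")     # draw the frame

# Exactly two worker threads, regardless of core count: a 16-core CPU
# runs this no faster than a 4-core one, because no code path exists
# that would put the extra cores to work.
threads = [threading.Thread(target=game_logic), threading.Thread(target=renderer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```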
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
schmidtbag:

I'm probably going to regret butting into this, but I think there are points that need to be made, from an unbiased perspective.
I simply stated a fact. With TBC, Blizzard added multi-core support to WoW. Apparently, there's still no support for SMT, though. Multi-core support and multi-thread support, as you stated, don't imply that every available core is going to be used. They don't even imply that every running thread won't end up utilizing the same core, even though they shouldn't. My intention was to stop at that first post. His response was both unnecessary and confrontational, just as some of yours have been. If you're going to make posts that appear confrontational to those you're responding to, expect the same in return. Attacking someone, directly or indirectly, as you've chosen to do here, doesn't exactly make you look innocent...
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Let me reiterate something here. If you have an issue with a member, or feel harassed or whatever, you press the report button and a moderator will look into it; actions will be taken if there's a valid reason for that. No matter if you're right or wrong, your language always needs to be respectful towards each other, even if you would be right in a certain situation and feel wronged. Keep your cool and, if needed, use the report button. Swearing, cursing and whatnot will get you nowhere.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
sykozis:

I simply stated a fact.
At first, neither of you were aggressive; both of you made a point. Again, I am not going to pick sides here. His first response to you had a sense of pretentiousness in it, but it wasn't all that hostile. You, meanwhile, responded to him very rudely and antagonistically: https://forums.guru3d.com/threads/amd-ryzen-2600-benchmark-spotted.419567/page-3#post-5521872 I understand why you responded the way you did, but it was too much. To clarify, I'm not justifying mikev190's behavior at all, but I'm sure if you had approached this differently, he wouldn't be banned right now. By the time fox2232 responded (who, BTW, kept their cool), he was already too fired up and in a panic.
My intention was to stop at that first post. His response was both unnecessary and confrontational, just as some of yours have been. If you're going to make posts that appear confrontational to those you're responding to, expect the same in return.
Your first intent and response went fine. I agree that the last sentence of his was unnecessary, but it was you who turned up the heat, a lot. Being confrontational in and of itself isn't a problem. It is possible to be polite about it and have a civilized discussion, even if you disagree. Note how earlier in this thread, I was in a disagreement with someone else, who I confronted. The difference is I wasn't laughing at the person and insulting their intelligence, and it ended peacefully. Even if someone's response to you is wrong, you don't need to fight fire with fire. I agree that it is reasonable to expect someone to respond to negativity with negativity, but is doing so justified? (The answer is no, because look where it got us.)
Attacking someone, directly or indirectly, as you've chosen to do here, doesn't exactly make you look innocent....
I'm not looking for innocence, I was looking to put an end to bickering about something extremely petty without someone getting banned. I guess I was too late.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
Aura89:

No matter which BIOS I try, no matter what memory settings I use, I have had a constant cold boot issue where it many times decides not to boot the OS, after which I'll have to reset the PC, or shut it fully down again, and then it'll, hopefully, maybe boot up.
This was fixed for me in one of the new BIOS updates - this one:
Version 3008 2017/12/13 CROSSHAIR VI HERO BIOS 3008 Update to AGESA 1071 for new upcoming processors.
They also released this one, which added even better memory support:
Version 3502 2018/01/30 CROSSHAIR VI HERO BIOS 3502 Update to AGESA 1000a for new upcoming processors.
They even have a newer BIOS too, but it's in beta (version 6001), and for some reason it's to improve performance with Ryzen+Vega CPUs, which I don't understand, as these boards have no video outputs on the I/O. LMAO 😀

I have this memory kit >>>>link<<<< For some reason the price has skyrocketed; it was around £130 when I first bought it. 😱 I had the cold boot issue where it would not train the memory properly: it would switch off the system, reboot once, and have a long boot, after which I would go into the BIOS and see that the memory had defaulted back to 2133MHz or 2400MHz, I can't remember. I would then have to reload my OC profile and restart, and it would set the correct RAM speed.

I am currently running BIOS 3502 (2018/01/30) and have this kit running perfectly overclocked to 3466MHz CL14 1T @ 1.4v. (I could probably go lower on the voltage, but it's working absolutely rock solid; I did 10 runs of the Asus RealBench benchmark and it passed with flying colours.) Here is my CPU-Z screenshot: https://image.ibb.co/hUkudH/Untitled.jpg 1729.2 x 2 = 3458.4MHz (give or take, 3466MHz; this is with setting XMP using the DOCP setting, leaving the timings at the default CL14 1T, then manually setting 3466MHz and upping the voltage to 1.4v).

EDIT: My 1800X CPU is at 4.15GHz across all 16 threads @ 1.4v, with SOC @ 1.15v. You might need to up the SOC voltage to maintain higher RAM speeds with your kit. DON'T GO ABOVE 1.2V though, and don't just set 1.2V - work your way up, as higher SOC voltage can really ramp up CPU temps. Default should be around 1.1V, by the way. You can also go to the DIGI+ section in the BIOS and crank up LLC (I use level 2), set CPU spread spectrum to disabled, set the power phase for CPU, RAM, and PCIE to extreme, and then set the RAM boot voltage above your running voltage - so if your DDR4 kit needs 1.35V at its rated speed, set 1.36V (0.01V difference). This can help get it past the cold boot issue.
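A quick check on those numbers (a trivial sketch using only the figures from this post):

```python
# Sanity check: DDR transfers twice per clock, so CPU-Z's reported
# memory clock is half the effective rating, and the suggested boot
# voltage is a 0.01V offset over the kit's rated voltage.
cpu_z_mem_clock_mhz = 1729.2
dram_rated_v = 1.35

print(f"effective rate: {cpu_z_mem_clock_mhz * 2:.1f} MT/s")  # 3458.4, i.e. the 3466 profile
print(f"boot voltage:   {dram_rated_v + 0.01:.2f} V")         # 1.36
```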
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
CPC_RedDawn:

This was fixed for me in one of the new BIOS updates - this one...
3008 definitely didn't fix it for me; that was the one I was on when I posted that. I have since upgraded to 6001 and have not had the issue with that so far, though I'm not holding out hope that it's fixed - I won't be surprised if it happens again. That being said, I have had one case so far with 6001 where my PC started up and then shut down again for no reason, automatically starting back up and never giving me video. I had to unplug the PC from the wall to get it working again. That had never happened before 6001, so I don't know if it was just a fluke or what. This is the kit I have: https://www.amazon.com/gp/product/B01ACODPHI/ref=oh_aui_search_detailpage?ie=UTF8&psc=1 And I've only been running it at 2666MHz; I will try 3200 soon if I don't have any more issues, and I'll keep the 1.36v in mind. As for the SOC voltage, I'll keep that in mind too. I'm just hoping I won't have any more issues - even though the cold boot issue is generally pretty easy to get around, it's just plain annoying lol
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
Pinscher:

You seem well equipped for computer shopping. I forget prices change all over the world; funny the 1600 cost more than the 1600X. That's a nice-looking bling cooler for $8 - holy, what a steal. Miles ahead of the stock cooler, for sure. I enjoy my 1600, I hope you do too.
Thank you. Normally it's a $20+ cooler. Yeah, I get good deals all the time; I get lucky. I paid $4 for a Nighthawk X6, and also $4 for a BNIB EVGA GTX 660 2GB last year.