Intel Shows 28-core processor die-shot

A new socket, why? Just because. Not buying another Intel product until they stop being silly.
On the other hand, these are server products. I don't know many companies that upgrade the CPUs in their servers. They'd rather buy a new server with more current technology and a fresh warranty.
The only real reasons for a new CPU socket are new memory tech and width (channels). So, basically, we should have had one socket per DDR type (on the same number of channels). Of course, more channels (like the HEDT platform) - another socket. But 1156, 1155, 1151, 1150... yeah, that is completely silly. Yay for monopoly \o/
man, all these socket changes, can't they just make an oversized socket and utilize the pins that are needed, with the extra pins used at a later date when they change the architecture? They must have inside knowledge of new memory bandwidth/tech and all that stuff waaaay before they even think about making a CPU. Hate having to buy a whole new system each time a high end CPU is released, granted the X99 has lasted for ages now.
I'm finding it hilarious how people are whining about a new socket for a series of CPUs that they likely will never be able to afford for themselves. These aren't consumer-level products... Keep in mind, AMD is also doing a new socket for their stupidly large server CPUs too.
I'm finding it hilarious how people are whining about a new socket for a series of CPUs that they likely will never be able to afford for themselves. These aren't consumer-level products... Keep in mind, AMD is also doing a new socket for their stupidly large server CPUs too.
I find it funny too lol I mean these are enterprise grade server parts, not something anyone here is buying for their little desktop computer at home lol I see no problem with a new socket for consumer grade processors every other gen either tbh
:) Only working idiots or spoiled brats care about Intel nowadays tbh. An idiotic, overpriced company. They only want us to pay more for less each new generation. Their innovation skills are not equal to their R&D size. But, whatever, move on...
You have a pretty closed mind. Basically you're saying anyone who has money is an idiot... Well, newsflash :stewpid:. Intel is still the best performing. Now, whether that's worth the cost to someone - that's another story.
I find it funny too lol I mean these are enterprise grade server parts, not something anyone here is buying for their little desktop computer at home lol I see no problem with a new socket for consumer grade processors every other gen either tbh
I've stated in the past that the Guru3D reader-base doesn't understand enterprise-grade hardware and that articles like this should probably be avoided for that very reason (I doubt Intel appreciates the negative publicity), though I've been told off for making such statements.

Personally, I do see a problem with new consumer-grade sockets every other gen. If making a new socket is a necessity, then by all means go for it. But I have a very hard time believing Intel needed a new socket for the CPUs that fit in 1155, 2011, and 1150. As for 1151, I'll give that a pass, since that was transitioning to DDR4.

Meanwhile, look at what AMD accomplished - if you have an AM3 (non-plus) CPU, it can fit in AM2+, AM3, AM3+, and in some cases, AM2 boards. These are not pin-identical, and this spans DDR2 to DDR3. Based on the release dates of the sockets themselves, that's a range of roughly 5 years. If you count until AM3+ was superseded by AM4, that would be 11 years. That's impressive. Sure, AM3 CPUs were slower than their Intel counterparts, but not by a wide margin. From another perspective, AMD also had at least 2 completely different architectures compatible with socket AM3+.

Intel has the funding to make a socket last, but it wouldn't surprise me if they have a deal with motherboard manufacturers where they're expected to break compatibility in order to increase sales. If Intel went AMD's route, motherboard manufacturers would likely lose tens of millions of dollars every year. Think of it this way: motherboard manufacturers maybe give Intel a small bribe for making a new socket, allowing them to make an entire set of sales whenever Intel releases a new CPU. As for Intel - they don't care. They have their own fabrication facilities, so it's not like they have to pay a 3rd party a hefty price for changing designs. They're going to make sales whether they change the socket or not, so the little bribe they get from the motherboard manufacturers likely pays for the expense of changing things up. In the end, it's a win-win for both companies. The consumer never knows the secrets behind these designs, so they'll never know if the change in socket was ever a necessity.
So which is it - Intel making the killing, or the MB manufacturers? You two contradict each other.
There is no contradiction - like I said in my post, both Intel and the mobo companies will return a profit. Both of them have to spend a little extra in design changes, but that's worth it if it ensures both sides make sales. Without changing the socket, only Intel will profit. If Intel kept the same socket for most of their consumer-level DDR3 CPUs, mobo manufacturers would be at a massive loss, because nobody would need to buy a new board when upgrading the CPU. Since Intel isn't really in a position to anger mobo companies, they likely made a deal to change things up once in a while.
There has to be more to the socket change than you are making it out to be. You used AM2/AM2+/AM3/AM3+ as an example. I would like to point out that those sockets held AMD back. They did not allow the PCIe controller to be moved to the CPU die, they held back available memory bandwidth on the FX processors, and overall they hindered AMD from moving forward because they kept having to look back.
"has to be" is as presumptuous as saying "must not be". Some of the CPU architectures (like Ivy Bridge) were supported on multiple sockets. Of the sockets I mentioned, I am not aware of any differences between them that warrant an entirely new backward-incompatible socket. It's one thing to prevent an old CPU being used in a new motherboard (much like what AMD did with AM2 CPUs on AM3 motherboards) but I can't find any valid excuse for what Intel did, other than profit. As for AMD, I do agree that them holding onto the same socket probably hurt them (in terms of performance) more than helped. Bulldozer likely would've been better if they did a fresh new socket.
And as new tech comes out, people would upgrade their motherboards more and keep their CPU, in your fantasy world. Again, MB manufacturers do not gain anything from socket changes.
MB manufacturers absolutely do gain from socket changes, if it means they make a sale from a new CPU purchase. Think of it from my perspective: I have a socket AM3 motherboard. When I bought it, AM3+ wasn't a thing. When Piledriver came around, it was compatible with my chipset. All I had to do was update my BIOS and I could upgrade my CPU. I made that decision because I didn't need to buy another motherboard. In other words, an MB manufacturer lost out on a sale even though I still got a CPU upgrade. Intel makes a hell of a lot more sales than AMD. If everyone who owned a Sandy Bridge could upgrade to a Haswell without needing to upgrade their mobo, then these companies would lose millions. This is business - their income is more important than a convenience that customers don't even have to know could exist.
And again, most people will want to upgrade their CPU with the latest available to match their new motherboard, so not only do they get all the latest tech - PCI-e spec bumps, USB upgrades, M.2, Thunderbolt (may you rest in peace (no seriously, don't come back)), updates in LAN and on-board WiFi/Bluetooth - they will also want the latest instruction sets that come with updated CPU architectures, as well as the bump in IPC.
Most of the new tech you mentioned is totally irrelevant to most people. Using my mobo as example again, it is nearly 7 years old and yet it can still wholly handle a GTX 1080 (where the CPU would be the main bottleneck). There are PCIe converter cards for M.2 cards, and they're not expensive. Thunderbolt, as you have established, was not of anyone's interest. Gigabit LAN has been available to consumers for about a decade and is still pretty much all you'll find; most people don't have routers that support 4 or 10Gbps. Most motherboards don't come with wifi; if you want to upgrade, it's not hard to just buy a compatible replacement card. Most new instruction sets are rarely taken advantage of for most consumer-level applications. Most people aren't willing to spend hundreds of dollars on just the CPU alone for a 5-15% performance bump in IPC.
TL;DR If you are that pissed about socket changes, you probably don't like Intel to begin with and this is just something you latched onto as your go-to argument against the company.
Considering how seldom I upgrade, socket changes are the least of my problems when it comes to Intel. I think you are gravely overestimating how much this bothers me.
You love speaking out of both sides of your mouth don't you. In the process of trying to discredit my POV you proved my point. Thank you.
You love to exaggerate things and jump to conclusions, don't you. Seeing as you haven't elaborated (after all, there are a lot of Ps in your POV), whatever you deemed discredited is left only to your interpretation. In other words, your claim is meaningless, and this sense of superiority you have contributes nothing. But by all means, say whatever you want if, in your mind, this prevents your feelings from being hurt. As much as you'd like to think otherwise, my goal wasn't to ridicule; it was nothing more than a hypothesis, which you decided to ridicule. I stated I dislike the multiple sockets, but I'm not ridiculing Intel (or MB manufacturers) for doing so, because I understand the potential reasoning.
You first say that changing sockets is not needed and use AMD as proof, then only one post later admit that not changing the socket for Bulldozer held it back. Then you sit here and say things like "I can just use add-on cards for all those things that new boards have". News flash: an NVMe SSD on an AM3 990FX board through a PCIe adapter will probably be no faster than a SATA3 drive. You posted a wall of text for really no reason.
Here you go again, jumping to conclusions and failing to understand simple concepts. Using the same socket and northbridge on a completely different architecture is bound to hold back performance. It wasn't "needed" for AMD to change sockets, though maybe they should have. This is not by any means the same scenario Intel was in. Ivy Bridge alone exists on FIVE different CPU sockets (4 if you don't want to get nitpicky); 8 if you include BGA. 3 of those sockets were for consumer-level motherboards, and at least 2 of them targeted the same demographic. So seriously, think about it - you are insisting that there must be good reasons to have at least 3 CPU sockets for the same architecture and same type of PC. How do you not find this questionable? Anyway, NVMe SSDs are largely based on PCIe, so yes, it will in fact be faster than SATA3. Let's not pull numbers out of nowhere, ok? At least my rants actually have substance and reason.
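To put rough numbers on that, here's a quick back-of-envelope sketch (my assumptions: a 990FX board offers PCIe 2.0, which carries roughly 500 MB/s of usable bandwidth per lane, and SATA3's 6 Gb/s works out to roughly 600 MB/s after encoding overhead):

# Back-of-envelope comparison: NVMe through a PCIe 2.0 x4 adapter
# vs. the SATA3 interface ceiling.
pcie2_per_lane_mb_s = 500   # rough usable MB/s per PCIe 2.0 lane
lanes = 4                   # typical M.2/NVMe adapter card
sata3_ceiling_mb_s = 600    # 6 Gb/s minus 8b/10b encoding overhead

nvme_link_mb_s = pcie2_per_lane_mb_s * lanes  # ~2000 MB/s
print(f"NVMe over PCIe 2.0 x4: ~{nvme_link_mb_s} MB/s")
print(f"SATA3 ceiling:         ~{sata3_ceiling_mb_s} MB/s")
print(f"Headroom:              ~{nvme_link_mb_s / sata3_ceiling_mb_s:.1f}x")

Even on an old board, the link alone has over 3x the bandwidth SATA3 can ever deliver, so "no faster than a SATA3 drive" doesn't hold up.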
The original point I was making here is that the socket changes, in the grand scheme of things, affect no one.
That applies strictly to people who have the money to always get the latest and greatest. Even Intel and MB manufacturers have to address the costs of changing sockets, but like I said, that's worth it for them in the end if it ensures more sales. Most people on a budget would rather re-use a perfectly capable motherboard.
BTW you are basing this on ONE "unnecessary" socket change: 1150-1155.
I explicitly stated 1150, 1155, AND 2011. Please keep up.
Make up your mind!
My phrasing and mindset are consistent. You are the one who chooses to see things in a different way.
5? Care to elaborate?
Ivy Bridge is compatible with the following sockets:
- LGA 1155
- LGA 2011
- LGA 2011-1 (this is the one that I find nitpicky)
- EDIT: LGA 2011-3, for some motherboards
- LGA 1356
- Socket G2
And these are the BGA models, which I have excused for argument's sake:
- BGA-1023
- BGA-1224
- BGA-1284
Faster but not near its theoretical performance
If you're on enough of a budget that you need to stick with a mobo that doesn't have M.2, you're going to care more about actual real-world performance than theoretical. And at that, in most cases, there is hardly a perceivable difference between M.2 and SATAIII: https://techreport.com/review/29221/samsung-950-pro-512gb-ssd-reviewed/4
You have spent as much on CPU upgrades for that 890FX board over the past 6 years as a Z77 and a 3570K would have cost. You could have bought those instead of that Bulldozer and would not NEED an upgrade right now.
I have spent roughly $220 on CPUs for this motherboard. As of right now, my CPU handles every game I have played at 60FPS @ 1080p, except Starcraft II under intense battles. So, not only were both of my CPUs combined roughly the same cost as a new 3570k, but I'd have also had to buy a new motherboard as well. In other words, I'd be spending more for no perceivable user experience. I couldn't care less about synthetic benchmarks or theoretical performance.
1150 and 1155 are dual channel DDR3. 2011 is quad channel DDR3 with no iGPU, and supports up to an 8-core CPU. 2011 v3 is quad channel DDR4. Try and keep up.
I apparently wasn't aware the 115# sockets were so crippled compared to 2011. I thought they could handle more than quad cores, so yes, maybe socket 2011 does have a distinguished reason to exist (my confusion probably came from i7s in both sockets). I actually explicitly didn't mention 2011v3 because I knew there were several changes where they weren't very comparable, but the funny thing is some motherboards are backward and forward compatible. Anyway, you still haven't addressed the necessity of 1150 vs 1155. And more importantly, there is still no good explanation as to why, for example, socket 1151 CPUs (to my knowledge) aren't backward compatible with 1150 mobos. Many (if not all) 1151 CPUs are DDR3 and DDR4 compatible. There may be other combinations but I don't feel like researching all of them. Again - I don't care that much about there being new sockets, but the intentional breakage of backward and/or forward compatibility is my gripe.
2011 and 2011-1 are the same socket. AFAIK 2011 v3 does not support IB at all (because DDR4), and LGA 1356 is xenon, are you serious?
2011-1 may be the same socket but apparently (as Intel would like us to believe) they aren't functionally the same, as can be said about 2011 v3. Like I said, that one was nitpicky, so I wasn't counting that. As for whether IB is compatible with 2011 v3, like I said, that depends on the motherboard. According to Intel themselves, it is both backward and forward compatible, but it is up to the motherboard devs to allow for that. It's no different than AM3 boards using AM3+ CPUs - some may be fully equipped to do so, but the mobo manufacturer may deny that. That being said, IB isn't officially supported on 2011 v3. As for seriousness... there is no "Xenon", it's Xeon. And socket 2011 was primarily focused around Xeons with a handful of desktop parts, so what's your point?
Really?!?! Intel has officially listed support for DDR3L standard, not DDR3 (this is notebook only).
*sigh* seriously? I gave you the benefit of the doubt over my own nitpickiness by excluding stuff like 2011-1, and yet you're getting anal over DDR3L? Get a grip.
THERE IS NO 2011-1, it's only 2011. Short and long: 2011-1 IS 2011.
Uh oh, somebody missed nap time. From wikipedia: "LGA 2011-1 (Socket R2), an updated generation of the socket and the successor of LGA 1567, is used for Ivy Bridge-EX (Xeon E7 v2)[6] and Haswell-EX (Xeon E7 v3) CPUs, which were released in February 2014 and May 2015, respectively." So, mind explaining how v3 is worthy of being distinguished but this isn't? I don't see how the differences from v3 to -1 are effectively any different than from -1 to plain 2011. Both of them are simply upgrades allowing different CPUs to run, but they're physically the same socket.
DDR3L is low voltage SODIMM RAM for laptops and AIO computers.
Again, you're being nitpicky and it's not contributing anything to your argument. But anyway, DDR3L does in fact exist on desktops. I have built micro ATX PCs that use it. Example: https://www.newegg.com/Product/Product.aspx?Item=9SIA0FU42J4496
You claimed 2011-1 is desktop, so you aren't being nit-picky, you are being ignorant. It's again the dual socket.
I said no such thing, and never implied it.
LGA 2011 enthusiast consumer
2011 also has Xeons. There are also Xeons for sockets 1150, 1155, and other "consumer" sockets. As you should be able to see, this gets confusing real fast. Note I have 2 separate arguments, which you seem to be mixing together:
1. Ivy Bridge exists on several sockets despite there not being backward compatibility among most of them.
2. Intel makes various sockets (such as 1150 and 1155) despite there not being any noticeably good reason to do so.
You are assuming there is no reason to do so, but there very well may be a reason. This is the difference I have found in a quick search.
Can you seriously lay off the drugs and stop claiming things that were never said or implied? I never said anything about whether Intel should or shouldn't use Xeons in sockets 115#. I couldn't care less if someone chooses to do so. Honestly, I'm thankful Intel gives us the option, regardless of whether it's a better choice or not. But this has nothing to do with what we're discussing.
Are you purposely ignoring the answer I gave to your second question just so you can continue to believe you have some moral superiority?
Moral superiority? Not sure how morals have anything to do with this... The IGP and audio are completely unacceptable reasons to prevent backward compatibility. If backward compatibility were allowed, people would understand any lost features are a result of using the "wrong" socket (for example: using an AM3 CPU in an AM2 mobo and not having DDR3 access). To my knowledge, most motherboards (regardless of socket) came with their own audio chipset regardless of what the CPU offered. The CPU's audio is likely for things like DP or HDMI.
Your first question is more of a straw-man so I will just leave it at that.
Actually it directly relates to what I was talking about the entire time, but it's no surprise your series of inaccurate presumptions and claims would lead you to think this way. EDIT: BTW, real suspicious timing of your post edits. I'm sure I'll be seeing more of that. But, that's what I get for over-cropping quotes.
So let's get this straight: you started off saying Ivy Bridge is supported by 5 sockets; turns out it's only 3, and one is mobile.
Actually, I specifically clarified that I was only referring to 3 of the 5 sockets. But, you were too busy ranting about too much text. This once again is your issue.
You ask for a reason for socket changes. Some are given with proof and now you start injecting your opinion into things.
I also specifically stated (directly quoting from one of my posts) "...I am not aware of any differences between them that warrant an entirely new backward-incompatible socket". A couple of tweaks to a GPU and an audio chipset nobody uses do NOT warrant a backward-incompatible socket. Again, I don't care if it's a new socket - that in and of itself is not the problem. Regardless - from the very beginning of this discussion, I made it very clear that my statements were opinionated. Yet here you are, acting like this is some revelation and weakness in my argument. Even though you're aware of this, you can't seem to just let it go.
You keep using AMD as a point of reference, and everyone can see they screwed up with AM3+ (this is my opinion).
I accept the idea that AM3+ was a screw-up. Convenient for consumers, but a screw-up. But I was focusing more on AM2 vs AM3. The architectures between those sockets didn't vary that much and are more comparable to what Intel could be doing.
iGPU is used by more people than dGPU, so if Intel found that handling it differently on Haswell and onward made a difference in power consumption, then go for it, I say. Remember there was a marked improvement from Ivy Bridge to Haswell in both iGPU performance (like 90% improvement in some instances) and power consumption.
Are you looking at desktop PCs or all PCs? Because laptops and servers would really skew those results. Laptops, as you have pointed out, use a different socket too. When just looking at desktops (where sockets 1150 and 1155 are used), the percentage of IGP users would drop considerably. Regardless, I do understand there are still likely tens of thousands of desktop IGP users out there. However, the differences between the chipsets should not prevent people from using the IGP. Maybe a feature or two, but the IGP as a whole ought to still be usable. If Intel made a different socket because the GPU architecture became a complication, I would accept that. But that's not what happened, because Ivy Bridge used the same GPU architecture on both sockets. EDIT: GPU performance doesn't say much about whether a socket should change. The IGPs for FM2 range from uselessly slow to modestly good, but it's all 1 socket.
Intel sockets are confusing *__*