Apple M1 chip outperforms Intel Core i7-11700K in PassMark (single-thread)

OldManRiver:

Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
Care to explain more? I think that x86 has not really changed all that much since its invention and remains a CISC system.
Raserian:

Care to explain more? I think that x86 has not really changed all that much since its invention and remains a CISC system.
To the user, x86 architecture is *very* CISC. Inside the chip it can be a different story. A modern high-end Intel chip takes those CISC instructions and translates (decodes) them into an internal RISC-like instruction set, executes those in parallel, and can often “execute” (retire is Intel’s term for when an instruction completes) as many as four at a time; that was already possible in 2015 on Haswell-class machines. Today’s Intel and AMD x86 processors do not execute x86 directly: instructions get decoded and translated into micro-instructions in the processor front end, and the back end that executes those micro-instructions looks a lot more like a RISC processor.
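To make that concrete, here is a minimal sketch (my own illustration, assuming GCC-style inline assembly on an x86-64 machine; the exact micro-op split is microarchitecture-specific and not architecturally visible). One CISC-style read-modify-write instruction gets cracked by the front end into separate load, add, and store micro-ops:

    /* One x86 instruction in, several internal micro-ops out (conceptually).
       Build with: gcc -O2 demo.c (x86-64 only). */
    #include <stdio.h>

    int main(void) {
        int counter = 41;
        /* "addl $1, mem" is a single architectural instruction, but the core
           internally issues roughly: tmp = load(&counter); tmp += 1;
           store(&counter, tmp). */
        __asm__ volatile ("addl $1, %0" : "+m" (counter));
        printf("counter = %d\n", counter); /* prints 42 */
        return 0;
    }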
schmidtbag:

A trend in what, specifically?
Specifically, SoCs (which implies moving away from the modular parts we have on desktops nowadays) and non-x86 processors.
OldManRiver:

Pretty much every modern processor is internally RISC with a front-end CISC decoder. Surprising to see so many believe they are still primarily CISC-based. That has not been true for a long time.
You say that as though it changes anything discussed to any significant degree, because as Astyanax pointed out, the end result of many modern architectures is something that is very much CISC. By your logic, that's like saying "seaweed isn't a plant, it's a protist, so it's not a vegetable": the things that separate seaweeds from plants don't stop them from being a vegetable. All that being said, MIPS is a true RISC architecture that is still fairly modern. Even today, the floating-point instructions aren't integral to the architecture, and yet MIPS can still run a full OS.
fredgml7:

Specifically, SoCs (which implies moving away from the modular parts we have on desktops nowadays) and non-x86 processors.
Desktop and laptop PCs have become more and more like SoCs. For one big example, there is no longer a separate northbridge and southbridge. I wouldn't be surprised if within the next decade there is no longer a discrete motherboard chipset at all. I think it'll still be a long while until RAM becomes fully integrated, like it is in many ARM platforms. As for non-x86 processors, I think that's very possible. Most tablets run on ARM and have obsoleted desktops and laptops for most people. Apple is moving away from x86. Many of the tools we use have become cloud-based, where it doesn't matter what architecture you run. The Chinese government is trying to move to MIPS. For Linux users, there's rising interest in POWER, ARM, and RISC-V. MS's second attempt at ARM is actually usable, and I'm sure they're just trying to prepare for the future in case ARM becomes a serious contender. So yeah, I think it's very possible x86 will lose quite a lot of popularity. It will basically just remain popular with those who want raw CPU processing power, and that's a dying trend thanks to things like OpenCL and CUDA.
Magic power of ASUS DARK HERO DOCS
[attached screenshot: Capture54377.PNG]
I've seen loads of videos about the M1 in previews and post release. As a home recording enthusiast, it's very tempting, especially as more software is starting to support it (although there's still a while to go yet). I'm sure a year from now it will be very compelling. It runs emulated Windows apps pretty decently too. I'll definitely be watching its progress in the coming years.
Undying:

Is there anything else Apple's chip excels at besides PassMark?
Yes, most benchmarks and programs. Everyone was sceptical about it (don't forget that the first version is a tablet CPU), but the more cores they put in it for computers, the more I like it... The good news is that they will make even more powerful versions with more cores. The bad point is that it is Apple exclusive... 🙁
All this talk about CISC and RISC is completely irrelevant. All modern CPUs just translate incoming command streams to whatever they use internally. It's really a pity it's an Apple exclusive. I could even see Apple going to RISC-V in 20 years, just so they're not at the mercy of Nvidia.
OldManRiver:

You meant RISC. I said that because of ridiculous backward claims like that. I think it's funny you cite Sora, who for once (clearly based on my post) got it more or less correct. You got it wrong. Look elsewhere to incite conflict. Thanks.
I meant what I said. Who the hell is Sora? How can I be "more or less correct" while also getting it wrong? You're the one getting pedantic about what is actually RISC. If you don't want to incite conflict, you're barking up the wrong tree.
OldManRiver:

You're still arguing with me because? "More or less correct" was not addressed to you. I am not arguing "what is RISC" at all. Learn to read. You're still seeking conflict, duh. Well no, it is not actually irrelevant what the internal processor structure is. If it were, there would be no differences in uarch... 'cause it would be irrelevant. Man... poor-quality posting around here.
May I suggest you scream at them to get off your lawn?
schmidtbag:

A trend in what, specifically? These days, computer hardware is becoming less and less interesting. The days of overclocking are nearing an end (now it's basically just providing a cooling solution capable of maintaining boost clocks). All motherboards are pretty much the same thing with ever-so-slightly different variations, and even then, they're mostly just black with RGB heatsinks. I wouldn't be surprised if in a few years all motherboards come with soldered-on CPUs. It wouldn't matter if the chip came to PCs - most of the performance enhancements are in Apple's software. The CPU itself is otherwise mediocre. ARM chips are also notorious for overclocking poorly.
I'd say on Zen 2/3 at least there is a lot of headroom for OC/UV and RAM tweaking. Much more than there ever was on the Intel side. It was quite fun for me 🙂 Zen 3 especially is the most tweakable CPU in recent history.
It's faster than AMD CPUs too, but the point now is to s*it on Rocket Lake as hard as possible. Sad, really.
What good is how fast any of these chips are if they're exclusive to a locked-down platform that can't even be used as a normal PC? An iPhone might have the fastest chips, but it's worthless to me considering you practically don't own it with how tightly controlled it is. Apple's PCs are just a half step away from that now that they're no longer x86. Last I heard, you can't even do something as simple as change your default music app; wouldn't want those filthy peasants who paid over a thousand dollars to rent the phone to have the option of using a different music provider! It could be twice as fast as an 11700K and still be worthless as far as I'm concerned if I can't use it for anything that has any value or interest to me. No, real gaming in any meaningful capacity is not coming to Apple platforms in the near future. As always, Apple can go suck a fat one.
gx-x:

It's faster than AMD CPUs too, but the point now is to s*it on Rocket Lake as hard as possible. Sad, really.
Yes, kinda, but I think the focus on Intel is because Apple dropped Intel to move to their own CPU design.
OldManRiver:

You're still arguing with me because? "More or less correct" was not addressed to you. I am not arguing "what is RISC" at all. Learn to read. You're still seeking conflict, duh. Well no, it is not actually irrelevant what the internal processor structure is. If it were, there would be no differences in uarch... 'cause it would be irrelevant. Man... poor-quality posting around here.
Sure there are, but in the end you have microcode. You may change the front end, and different architectures require different decoders, sure, but from that point on you can actually reuse resources. It can't be 100% unaffected, but front ends have managed this for quite some time now.
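As a toy illustration of that split (all names here are made up; real micro-op encodings are undocumented and vendor-specific), the front end owns the architecture-specific decode step, while the back end only ever sees generic micro-ops and is therefore the reusable part:

    /* Toy model: an architecture-specific front end decodes one instruction
       into generic micro-ops; the back end that executes them never sees
       the original ISA. Entirely illustrative. */
    #include <stdio.h>

    typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop;

    /* Hypothetical decode of x86 "add [mem], reg": one CISC instruction in,
       three RISC-like micro-ops out. */
    static const uop decoded[] = { UOP_LOAD, UOP_ADD, UOP_STORE };

    /* The back end does not care which ISA a micro-op came from. */
    static void backend_execute(uop u) {
        static const char *names[] = { "load", "add", "store" };
        printf("executing micro-op: %s\n", names[u]);
    }

    int main(void) {
        for (size_t i = 0; i < sizeof decoded / sizeof decoded[0]; i++)
            backend_execute(decoded[i]);
        return 0;
    }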
Venix:

Yes, kinda, but I think the focus on Intel is because Apple dropped Intel to move to their own CPU design.
That's the real reason, yes, but let's be real: no one likes Intel, the raging weasel assholes who held a monopoly from 2006 to 2017 on everything but the low end after they used every scumfuck illegal tactic imaginable to nearly sink AMD and never really paid any meaningful price for doing so. The only reason they aren't a galaxy beyond AMD right now is that they sat around with their thumbs up their asses for over a decade: a decade of quad-core CPUs at asshole scalper-like prices, with each "generation" being either the exact same or nearly the same crap as the previous. The only reason I want Intel to succeed is the only reason any of us should want those monumental cartoon-level villains to succeed: for our own good. The market is still horrendously bad even with two players, and the last thing we need is 11 years of AMD robbing us for the same barely-changing garbage year after year. It's the same reason I hope Intel somehow successfully enters the nearly FUBAR GPU market.
schmidtbag:

Take, for example, Clear Linux (which is developed by Intel): for years it has had results ranging from 5% faster to more than twice as fast, using the exact same CPU. All they did was optimize libraries and programs to use more of the CPU's instructions. Since AMD shares many of the same instructions, they too saw a performance increase, though it often wasn't as significant. As a result of Intel's efforts, more [open source] programs have baked in their optimizations. Really goes to show how much performance modern software is leaving on the table, often because devs are too lazy.
10% ahead of Ubuntu, according to the latest 62-test Phoronix benchmark. Very nice. Focusing all of Intel's might to come up with 10%, while admirable, does not really substantiate the lazy-devs accusation.
Neo Cyrus:

That's the real reason, yes, but let's be real: no one likes Intel, the raging weasel assholes who held a monopoly from 2006 to 2017 on everything but the low end after they used every scumfuck illegal tactic imaginable to nearly sink AMD and never really paid any meaningful price for doing so.
So put them in jail for 17 hundred years. Why does the entire company have to suffer for the sins of a few top criminals? Oh, but jail is where we put common criminals. /politics
Noisiv:

10% ahead of Ubuntu, according to the latest 62-test Phoronix benchmark. Very nice. Focusing all of Intel's might to come up with 10%, while admirable, does not really substantiate the lazy-devs accusation. So put them in jail for 17 hundred years. Why does the entire company have to suffer for the sins of a few top criminals? Oh, but jail is where we put common criminals. /politics
Depending on the application, it might be more than 10%, but this has nothing to do with developers. Intel is using their compiler to do this; they could have chosen to optimize GCC if they were honest about it, but they know that sort of shit doesn't fly there. Developers are not responsible for compiler optimizations.
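For a concrete (if simplified) picture of what this kind of compiler-driven optimization looks like, here is a sketch assuming GCC on x86-64; the flags are standard GCC options, nothing Clear-Linux-specific:

    /* The same portable C built with different flags ends up using very
       different instructions, which is where the "free" speedup comes from:
         gcc -O2 sum.c                     -> scalar x86-64 code
         gcc -O3 -mavx2 -ffast-math sum.c  -> auto-vectorized AVX2 code
       (-ffast-math lets GCC reorder the floating-point accumulation) */
    #include <stdio.h>
    #include <stddef.h>

    static float sum(const float *a, size_t n) {
        float s = 0.0f;
        for (size_t i = 0; i < n; i++)
            s += a[i];   /* candidate loop for auto-vectorization */
        return s;
    }

    int main(void) {
        float a[8] = { 1, 2, 3, 4, 5, 6, 7, 8 };
        printf("sum = %f\n", sum(a, 8)); /* prints 36.000000 */
        return 0;
    }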
Noisiv:

So put them in jail for 17 hundred years. Why does the entire company have to suffer for the sins of a few top criminals? Oh, but jail is where we put common criminals. /politics
I've always said these types of colossal crimes will never stop until actual hard prison time is handed out to those responsible as punishment. Otherwise the punishment will always remain a pre-calculated cost of doing business. The cost to Intel for doing it (which I'm not sure they've even paid out yet) was absolutely laughable and an insult to us all. That being said: frack Intel. I'll always be a hardcore anything-but-Intel buyer so long as any viable alternative exists. Many of the same assholes from back then are still running the company. Their new Chancellor of Propaganda (or whatever his title is, but that's literally his job) is just a small result of that. He's already been breaking laws left and right, laws for which lawsuits against Intel set precedent.
You do know the M1 has roughly triple the transistors of a 5900X/5950X or an Intel 11700? It's an awesome SoC Apple made, but it's also quite a bit more expensive to make than AMD or Intel processors. Apple can absorb a lot of the price because they don't really sell hardware; they sell services attached to the hardware and make their money from that. For AMD and Intel, making something like this would be more akin to suicide. Google and Microsoft could do it. I guess the SoCs in the PS5 and Xbox Series X are the most comparable things.