AMD Security Announcement on Fallout, RIDL and ZombieLoad Attack

waltc3:

Foremost, though, is that every time you reinstall the OS you have to reapply the OS patches.
Not exactly true, with Windows 10 anyway. Every time I reformat and reinstall Windows 10, I grab the latest version of it on a bootable USB drive. I believe they only update the installation media on the big feature updates, so the current one is 1809 without the last 4-5 months of updates, but once 1903 comes out that won't be the case. Hence why I say it's not exactly true: it depends on whether you use the most up-to-date version of the installation media, as well as which exact patch someone might be concerned about. And it also depends on the OS.
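For what it's worth, if you want to check which build an install actually ended up on (and therefore roughly how stale the installation media was), a rough Python sketch like the one below does it using only the standard library. The build-to-version mapping in the comment is just the well-known one for 1809 and 1903; nothing else here is specific to any one machine.

```python
# Minimal sketch: print the Windows release and build number so you can tell
# whether the installation media you used was actually current.
# (Windows 10 1809, for example, is build 17763; 1903 is build 18362.)
import platform
import sys

if sys.platform != "win32":
    sys.exit("This check only applies to Windows.")

release, version, csd, ptype = platform.win32_ver()
print(f"Windows release: {release}")   # e.g. '10'
print(f"Build string:    {version}")   # e.g. '10.0.17763'
```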
Alessio1989:

I can imagine that datacenter clients are sooo happy right now... They cannot, or they will be destroyed in a never-ending loop of legal trials by datacenter/HTPC OEMs and clients
The mounting evidence says they didn't care. However, I'm sure they'll start caring soon™
Is there some reason AMD security vulnerabilities always get announced shortly after the Intel ones? And why are they always given such cheesy names? Sure, Meltdown was a little bit cheesy, but Intel wasn't the only one affected either.
er557:

Go ahead, switch to AMD, when such overhyped vulnerabilities will easily be patched with a microcode update with minimal performance impact. It is good that research is being done on the matter, and I don't feel any less secure. With AMD you get lower per-core performance, a low-efficiency interconnect, and lower performance in games and productivity software. It is definitely not a reason to bash Intel over this.
"Minimal" my ass. Have you seen before and after benchmarks of OSes and microcode mitigations applied? Because in a lot of cases, the performance difference is so great that in cases where Intel had a lead over Ryzen, they lost that lead.
Moderator
jwb1:

You AMD fanboys realize these vulnerabilities have been around since 2008 or something and no one has been attacked due to them.... Now there are actual patches out in software, and soon microcode updates. Safer today than yesterday. But go ahead, make a mountain out of an anthill and enjoy your lower-performing AMD in every application.
Can you not troll topics like this? This is now the second warning you're getting, no more of this.
Can hate towards Intel also be moderated, please?
Moderator
SniperX:

Can hate towards Intel also be moderated, please?
Agreed, but so far in this topic I haven't seen any hate.
I'm so happy with my Athlon 200GE right now, soon to be as fast as the 9900k 😛
:)
jwb1:

You should re-read, then. Here are a few choice ones from people.
None of that is hate. Sorry you feel personally attacked because a company that isn't you (or is it?) has people who are rightly upset about its practices, security flaws, and prices. What is wrong is when you start throwing around factless, blatantly wrong information. Yes, you blatantly and factually did this. Learn what facts are before you post.
Moderator
jwb1:

You should re-read, then. Here are a few choice ones from people. BTW, we all seem to be forgetting that on the day this was announced, Intel had patches available. But nah, they're just an evil business that only cares about money. Because, you know, AMD doesn't care about money, and AMD didn't have its own share of security issues. This whole press piece by AMD is fanboy trash PR talk. But let's all just focus on how much we hate evil Intel.
The topic is not up for discussion; if I saw something of the sort happening on either side, I would take care of it. Easy as that. But sure, don't listen to warnings, don't follow the rules. That will help you out.
Clawedge:

Yes AMD! Kick Intel when they are down. Intel is getting just what it deserves for suppressing competition and price gouging! Karma is a bitch!
Wish I could say that too, but I got my comment deleted a while back for something like that. Anywho, WELL SAID!
I'm starting to think Intel knew about all these issues/flaws but kept silent about them just for the extra performance benefits.
Intel's stock is higher today than it was prior to the announcement. I'm sure that has a lot to do with overall market conditions and the health of the company, but it does make you wonder what non-technical people (most people and, by extension, most investors) think when they see something like this. Personal observations from working in IT - people care FAR too little about security of their devices until something terrible happens. Most users don't even grasp the concept that their devices could be compromised on a hardware level or through no fault of their own.
SamuelL421:

Personal observations from working in IT - people care FAR too little about security of their devices until something terrible happens. Most users don't even grasp the concept that their devices could be compromised on a hardware level or through no fault of their own.
Working in IT and having been on both sides, most security guys are out of touch with reality, asking you to roll out everything to every system in a matter of days. I've had too many bloody fights with them. It's important to keep security up, but at the same time you need to keep the business running, so there always has to be a balance between security and business continuity.
Fox2232:

While I really dislike nVidia, I'll have to correct you a bit. Turing delivers more transistors per $ than older generations. The fact that those cards are not much faster (gaming-wise) per $ than the older generation while having many more transistors is not pleasant. But that's because nVidia added proper FP16 performance, which AMD has had for quite some time. And they added special new functions which were not cheap in terms of transistor count either. They added so many transistors that Turing's gaming performance per transistor at the same clock as Vega is practically the same. Except that AMD still has ~30% higher compute performance and nVidia still holds ~30% lower power consumption. (I personally waited a very long time for nVidia to deliver decent compute, because without that, game studios would not utilize compute. Now the doors are open to a whole new world of magic.)
More transistors, sure, but:
- it's not something the customer really cares about
- the increase in price is not proportionate to the chip size increase; it's actually far, far off. The difference in cost to make the bigger chip would be below $50, but Nvidia decided to hike the price by $700. There is no sympathy at all from me for that.
xrodney:

- the increase in price is not proportionate to the chip size increase; it's actually far, far off. The difference in cost to make the bigger chip would be below $50, but Nvidia decided to hike the price by $700. There is no sympathy at all from me for that.
You realize that's not how that works, right? Just take the GTX 1080 Ti vs the RTX 2080 Ti: a $300 price difference (not 700 like you imply; it'd be 400-500 if you want to take non-MSRP prices, which have nothing to do with Nvidia, as that price hike goes to the third-party manufacturer such as MSI, or the retailer such as Newegg, or both, so I'm going to stick with the $300 difference).

GTX 1080 Ti: 471 mm². RTX 2080 Ti: 754 mm². That's a 60% increase in die area. But that doesn't mean it costs 60% more to make, or that you get 60% fewer dies per wafer. No, not even remotely. Because the die is 60% bigger, the likelihood of defects rises fairly dramatically, and that affects costs. It's likely, given how many fewer dies they get per wafer, that the die itself costs 2-3 times more, could even be more. That said, we can't know the exact number, since it isn't even publicly known. Then you have to factor in the fact that GDDR6 is more expensive than GDDR5X.

But I want to make an extra point on the die cost. One of the biggest reasons AMD has been so competitive with its high-core-count processors is that they don't use a monolithic die like Intel does. This helps TREMENDOUSLY for AMD in regards to costs and being able to undercut Intel yet still make good money. Their dies are so small that the number of defective ones relative to cost isn't a problem, whereas Intel's much larger dies carry a much larger cost due to defects. Yes, I understand that's AMD and Intel and not GPUs, but the principle is the same: large dies have bad yields, which means higher costs, not just the increase in cost of the physical die material.

And lastly, none of this takes into consideration R&D for the features that are in the RTX lineup but not in the GTX lineup. You can say "oh, that doesn't matter" all you want, but it does. A company doesn't put millions and possibly billions into R&D and not factor that into the cost of a product. A CUSTOMER may not care about this cost, but if the company wants to survive, they HAVE to worry about it even if the customer refuses to. It's not JUST the physical material costs that need to be taken into consideration, but also the expected number of units to be sold, and at what margins, to pay for the R&D. As consumers we will never fully know either of these numbers.
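To put rough numbers on the "bigger die = worse yields = higher cost per good die" argument, here's a back-of-the-envelope sketch using a simple Poisson yield model. The wafer cost and defect density below are invented placeholders (neither TSMC nor Nvidia publishes them), so treat the output as an illustration of how the cost scales, not as real pricing:

```python
# Back-of-the-envelope die cost comparison for a 471 mm^2 vs 754 mm^2 die.
# Assumptions (placeholders, not published figures): 300 mm wafer,
# defect density of 0.1 defects/cm^2, Poisson yield model, $8000 per wafer.
import math

WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 8000          # assumed, for illustration only
DEFECT_DENSITY_PER_CM2 = 0.1   # assumed, for illustration only

def dies_per_wafer(die_area_mm2: float) -> float:
    """Classic approximation: usable wafer area minus an edge-loss term."""
    r = WAFER_DIAMETER_MM / 2
    return (math.pi * r**2) / die_area_mm2 \
        - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2 * die_area_mm2)

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of dies with zero defects, assuming Poisson-distributed defects."""
    area_cm2 = die_area_mm2 / 100
    return math.exp(-DEFECT_DENSITY_PER_CM2 * area_cm2)

for name, area in [("GTX 1080 Ti (471 mm^2)", 471), ("RTX 2080 Ti (754 mm^2)", 754)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: ~{gross:.0f} dies/wafer, yield {poisson_yield(area):.0%}, "
          f"~{good:.0f} good dies, ~${WAFER_COST_USD / good:.0f} per good die")
```

With those made-up inputs, the 754 mm² die comes out a bit over twice as expensive per good die as the 471 mm² one, which is in the same ballpark as the "2-3 times" guess above; the real ratio depends entirely on the actual defect density and wafer price.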
HWgeek:

https://www.intel.com/content/www/us/en/architecture-and-technology/mds.html 14% on Storage and 19% on Server Side Java with HT OFF... https://www.intel.com/content/dam/www/public/us/en/images/corporate/16x9/mds-server-hton-16x9.png https://www.intel.com/content/dam/www/public/us/en/images/corporate/16x9/mds-server-htoff-16x9.png
ouchie
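If you're on Linux and want to see how your own box reports this, newer kernels expose the mitigation status (including MDS) under /sys, along with the SMT state that Intel's HT-off numbers above hinge on. A small sketch like this dumps both; it only reads files that exist on reasonably recent kernels:

```python
# Quick check of what the (Linux) kernel reports for MDS and the other
# speculative-execution issues. Requires a kernel new enough to expose
# /sys/devices/system/cpu/vulnerabilities.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if not vuln_dir.is_dir():
    print("Kernel does not expose vulnerability status here.")
else:
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")

# SMT/Hyper-Threading state (the MDS advice hinges on it):
smt = Path("/sys/devices/system/cpu/smt/control")
if smt.exists():
    print(f"smt: {smt.read_text().strip()}")
```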
jwb1:

You AMD fanboys realize these vulnerabilities have been around since 2008 or something and no one has been attacked due to them.... Now there are actual patches out in software, and soon microcode updates. Safer today than yesterday. But go ahead, make a mountain out of an anthill and enjoy your lower-performing AMD in every application.
That we know of, and that's the beauty of these types of exploits: they leave no evidence.
er557:

Go ahead, switch to AMD, when such overhyped vulnerabilities will easily be patched with a microcode update with minimal performance impact. It is good that research is being done on the matter, and I don't feel any less secure. With AMD you get lower per-core performance, a low-efficiency interconnect, and lower performance in games and productivity software. It is definitely not a reason to bash Intel over this.
Fanboy to the max. Intel deserves any bashing they get from this. Time to ditch my Intel server. Threadripper, here I come.
SniperX:

Can hate towards Intel also be moderated, please?
What hate towards Intel? This forum is so heavily Intel/NVidia biased it's embarrassing. AMD could release products that have twice the performance of anything Intel or NVidia have in their respective markets, at half the power consumption of the competing products, and people on this forum would still bash AMD.....
schmidtbag:

"Minimal" my ass. Have you seen before and after benchmarks of OSes and microcode mitigations applied? Because in a lot of cases, the performance difference is so great that in cases where Intel had a lead over Ryzen, they lost that lead.
I have an i3 7130U based laptop and an i3 380m based laptop. The 7130U is "patched" and noticeably slower than the i3 380m..... In fact, the i3 7130U is comparable in (perceived) performance to the Celeron N4100 based laptop it was bought to replace.... I noticed the same performance loss on my i3 7100U based laptop a few months ago. I'm scared to see what the new patches will do to the performance of the 7130U....
My last couple of Intel CPUs (including my current 4770K) have served me well, but I am pretty certain my next PC will be back to AMD when the new Ryzen comes out.
sykozis:

I have an i3 7130U based laptop and an i3 380m based laptop. The 7130U is "patched" and noticeably slower than the i3 380m..... In fact, the i3 7130U is comparable in (perceived) performance to the Celeron N4100 based laptop it was bought to replace.... I noticed the same performance loss on my i3 7100U based laptop a few months ago. I'm scared to see what the new patches will do to the performance of the 7130U....
I myself have a laptop with an i3-4100U. For many years it's been able to keep up just fine with my daily workloads, but ever since 2019, when pretty much all of the mitigations were applied, I've been noticing performance dips, and my CPU usage when watching YouTube videos has gone up dramatically. It's still usable, but I think I'm ready for an upgrade now. The only reason I never mentioned any of this earlier is that I don't like to use personal anecdotes as evidence to back up a point. Haha, so the only reason I'm saying any of this now is to say I feel your pain.
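If you want to put an actual number on that rather than eyeballing it, a quick sampling script can log CPU load while a video plays, before and after a patch round. This sketch assumes the third-party psutil package is installed (it is not part of the standard library):

```python
# Sample overall CPU utilisation for a minute, e.g. while a YouTube video plays.
# Requires the third-party psutil package (pip install psutil).
import psutil

samples = []
for _ in range(60):
    samples.append(psutil.cpu_percent(interval=1))  # one-second average per sample

print(f"avg CPU: {sum(samples) / len(samples):.1f}%  peak: {max(samples):.1f}%")
```

Run it under the same playback conditions on the patched and unpatched setups and compare the averages; that gives a rough but concrete measure of the dip you're seeing.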