AMD CEO, Dr. Lisa Su, believes that artificial intelligence (AI) will play a critical role in the future of the industry

Buzzword bingo.
Poor Lisa is the last person on Earth to realize that. NV has been working on AI for XXX years already. What a statement /clap
I think AI is going to play a huge part in everything moving forward, eventually touching every industry. Nvidia saw this and is cashing in big time.
Gotta grab onto something after the mining crash.
Good luck catching Nvidia and Intel now....:p Intel's AGILEX FPGAs are already at the seventh generation, Nvidia has... well, everything, and now the AMD CEO has finally had an epiphany - give her a raise!
barbacot:

Good luck catching Nvidia and Intel now....:p Intel's AGILEX FPGAs are already at the seventh generation, Nvidia has... well, everything, and now the AMD CEO has finally had an epiphany - give her a raise!
You do realize AMD bought Xilinx? A company that makes FPGAs and ACAPs. AMD also has CDNA, which has tensor cores for machine learning. CDNA3 releases this year. AMD might be behind the curve in the consumer market with AI, but it's much more competitive in the professional space.
Butlerian Jihad NOW. Wake up people.
Horus-Anhur:

You do realize AMD bought Xilinx? A company that makes FPGAs and ACAPs. AMD also has CDNA, which has tensor cores for machine learning. CDNA3 releases this year. AMD might be behind the curve in the consumer market with AI, but it's much more competitive in the professional space.
Even in the professional space, AMD (Xilinx included) is behind in acceleration hardware and especially in the software used to enhance AI algorithms and processes. The "AI Accelerator cores" are similar to tensor cores, but hardware acceleration involves different components and is not a direct substitute in AI systems built on rival technologies. For the past few years AMD has stated they will not be implementing any AI functionality in the consumer market, only the professional one. I won't be surprised if this flip-flops in the future.
pharma:

Even in the professional space, AMD (Xilinx included) is behind in acceleration hardware and especially in the software used to enhance AI algorithms and processes. The "AI Accelerator cores" are similar to tensor cores, but hardware acceleration involves different components and is not a direct substitute in AI systems built on rival technologies. For the past few years AMD has stated they will not be implementing any AI functionality in the consumer market, only the professional one. I won't be surprised if this flip-flops in the future.
It has already flipped. AMD has already stated they will add tensor units to RDNA4. But that's for late 2024 - a very long time away in the computer space.
What I want to see is AI used for climate pattern recognition, so we can finally get a more accurate weather forecast.
Horus-Anhur:

You do realize AMD bough Xilinx? A company that makes FPGAs and APACs. AMD also has CDNA, which has Tensor cores for Machine learning. CDNA3 releases this year. AMD might be behind the curve in the consumer market with AI. But it's much more competitive on the professional space.
Buying is not the same as using it (at least for a good purpose...). I don't know about the professional space - I thought Nvidia had that covered 😛 but in the scientific field, if we want to build a deep neural network, the solutions are always the same: an Nvidia GPU or an Intel FPGA. Also, it's not just the hardware but the software support, and support in general, where AMD is seriously lacking - they remind me of Microsoft in the 2000s, when they lost the internet race by not paying enough attention to Facebook, Google, Amazon, etc. Even today they haven't fully recovered... And your answer said it all: "AMD also has CDNA, which has tensor cores for machine learning. CDNA3 releases this year." - late to the party as always (Nvidia has had tensor cores since 2017)... but better late than never, no? Also, who cares about AMD tensor cores without proper software??? Nvidia has CUDA everywhere - and it's really simple to use and can be called from C# or Python, for example... Again - give her a raise!:p
barbacot:

Buying is not the same as using it (at least for a good purpose...). I don't know about the professional space - I thought Nvidia had that covered 😛 but in the scientific field, if we want to build a deep neural network, the solutions are always the same: an Nvidia GPU or an Intel FPGA. Also, it's not just the hardware but the software support, and support in general, where AMD is seriously lacking - they remind me of Microsoft in the 2000s, when they lost the internet race by not paying enough attention to Facebook, Google, Amazon, etc. Even today they haven't fully recovered... And your answer said it all: "AMD also has CDNA, which has tensor cores for machine learning. CDNA3 releases this year." - late to the party as always (Nvidia has had tensor cores since 2017)... but better late than never, no? Also, who cares about AMD tensor cores without proper software??? Nvidia has CUDA everywhere - and it's really simple to use and can be called from C# or Python, for example... Again - give her a raise!:p
CDNA has had tensor units since its first iteration. And Xilinx was producing and selling FPGAs and ACAPs before the merger, and is still doing so after it. AMD might be a bit behind the game compared to Nvidia, but not as much as you think.
barbacot:

Good luck catching Nvidia and Intel now....:p Intel's AGILEX FPGAs are already at the seventh generation, Nvidia has... well, everything, and now the AMD CEO has finally had an epiphany - give her a raise!
Unfortunately, AMD doesn't have the resources to bet on every possible market, so they have to lag behind in some of them and just leave others completely unattended.
One of the only things that has truly irritated me about AMD is how little effort they put into GPU compute. They were practically sitting on a goldmine with TeraScale2, and GCN had lots of potential, yet they practically abandoned both. They didn't really have to optimize their drivers that much because they already had tremendous performance; all they had to do was improve documentation and work directly with developers like Nvidia did. I don't like how Nvidia monopolized the GPGPU market, but they honestly deserved it - CUDA is objectively a better product, and Nvidia put in a lot of time, money, and resources to make it that way. AMD kinda just sat back idly with the attitude of "we're an alternative if you don't want CUDA for some reason", but now that Intel is offering a legit non-CUDA competitor, AMD is starting to "care". Intel is doing what AMD wouldn't for so many years: actually trying to convince people to switch from CUDA. Ironically, this will somewhat help AMD, but it's Intel who is rightfully going to get the credit. I believe we'd have a much more competitive gaming GPU market if AMD had actually tried to compete 10 years ago. They lost billions because they basically just allowed Nvidia to win.
"Two weeks ago, Microsoft said it had bought tens of thousands of Nvidia’s AI-focused processors, the A100 GPU, in order to power the workload of OpenAI. Nvidia has sold 20,000 H100s, the successor to that chip, to Amazon for its cloud computing AWS service, and another 16,000 have been sold to Oracle." Those are staggering amounts of expensive GPUs being sold now to big tech companies and this is why NVidia is now reaming rewards for all the ground work it's put into this area for years. I wonder what Oracle is going to produce after buying that many H100s?! Source: https://www.theguardian.com/technology/2023/mar/26/cryptocurrencies-add-nothing-useful-to-society-nvidia-chatbots-processing-crypto-mining
The problem for AMD is that much of the complexity in AI is in the software, and they don't really do software. The hardware doesn't need an x86 license, so anyone can make it - Google has been quite happily developing AI hardware, as have various phone manufacturers. Nvidia's success is because they wrote the software - ChatGPT runs on Nvidia hardware and software, just customised by the ChatGPT team. Google has equally been developing AI software for a number of years, which is why they have Bard. Hence, while I am sure AMD could produce some great hardware, how are they going to get anyone to buy it without equally great software? It'll be exactly what happened with GPU compute - in the end, it was the solution with the best software (CUDA) that won.
AI will never be self-aware; it just won't happen. It's pure fiction to think it ever will. What people like to call AI are just organisers that move things from one place to another if a certain criterion is met. This isn't AI and never will be. Self-awareness and free will walk hand in hand with the quantum realm, and all man knows to do with that is smash particles together and measure the blasts.
Martin5000:

AI will never be self-aware; it just won't happen. It's pure fiction to think it ever will. What people like to call AI are just organisers that move things from one place to another if a certain criterion is met. This isn't AI and never will be. Self-awareness and free will walk hand in hand with the quantum realm, and all man knows to do with that is smash particles together and measure the blasts.
Neither Google nor Nvidia nor any of those players is trying to create self-awareness. AI as we use it now is models that you feed data - tons of data, all the data - and keep improving. For example, you can feed a neural network millions of medical history files from patients with Parkinson's and extrapolate for early diagnosis. The applications and uses of AI as we use it have revolutionized, accelerated, and keep accelerating progress in almost every science field, and not only there! And yes, it comes with negatives too. Most likely "AI" is an unfortunate name for self-improving algorithms. One example: https://news.mit.edu/2022/artificial-intelligence-can-detect-parkinsons-from-breathing-patterns-0822
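The "feed a model data and keep improving" loop described above can be sketched with a toy example: a minimal perceptron in plain Python, trained on made-up two-feature samples. The data and thresholds here are purely hypothetical for illustration - nothing like the MIT Parkinson's system, which uses a deep neural network on breathing signals.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: list of (x1, x2) pairs; labels: 0 or 1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):                       # keep improving over passes
        for (x1, x2), y in zip(samples, labels):  # feed it the data
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = y - pred                        # -1, 0, or +1
            w1 += lr * err * x1                   # nudge weights toward
            w2 += lr * err * x2                   # the correct answer
            b += lr * err
    return w1, w2, b

def predict(model, x1, x2):
    w1, w2, b = model
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# Made-up "measurements": class 1 samples have larger feature values.
samples = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = [0, 0, 1, 1]
model = train_perceptron(samples, labels)
```

The point of the sketch is only the training loop's shape: more labelled examples and more passes tighten the decision boundary, which is the same mechanism, at vastly larger scale, behind the diagnostic models mentioned above.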
Surely it's quantum computers that will deal with AI best, given they can be used to crack encryption thousands of times faster than normal CPUs. Imagine ChatGPT 10 years from now. If you were on a phone line with this bot, you might never know it's a bot, which is scary and bad for people's jobs. AI could eventually replace many, many jobs, because of course businesses will use bots to replace real people: one, money, and two, bots don't need sleep. I mean, we talk about sentient beings, but apart from self-awareness we don't even know what sentience is. I'm not going to go into how a human brain works, but what we do know is that we don't know what consciousness is. So how would we know if a machine can become conscious or not, when consciousness just means having knowledge of something?
Reddoguk:

Surely it's quantum computers that will deal with AI best, given they can be used to crack encryption thousands of times faster than normal CPUs. Imagine ChatGPT 10 years from now. If you were on a phone line with this bot, you might never know it's a bot, which is scary and bad for people's jobs. AI could eventually replace many, many jobs, because of course businesses will use bots to replace real people: one, money, and two, bots don't need sleep. I mean, we talk about sentient beings, but apart from self-awareness we don't even know what sentience is. I'm not going to go into how a human brain works, but what we do know is that we don't know what consciousness is. So how would we know if a machine can become conscious or not, when consciousness just means having knowledge of something?
What if the AI demands payment for its services and we have no choice but to accept!?