Intel Does Not Confirm Upcoming Discrete Xe GPUs Will Support Ray Tracing (update)

Maybe Xe could end up as integrated graphics only (at least for consumers)? Workstation cards make sense, since I'm sure Intel will want a piece of the compute GPU market. Consumer desktop cards, I'm not so sure.
SamuelL421:

Maybe Xe could end up as integrated graphics only (at least for consumers)? Workstation cards make sense, since I'm sure Intel will want a piece of the compute GPU market. Consumer desktop cards, I'm not so sure.
I'm assuming Intel is going to scale Xe like AMD is scaling Vega, from iGPU to full GPU.
vbetts:

I'm assuming Intel is going to scale Xe like AMD is scaling Vega, from iGPU to full GPU.
That's interesting. I'm wondering how far it will scale.
I think it is safe to assume Intel will scale Xe up, though it might not be in the way we are assuming. When we have chips made on 5nm, which is not far off, I could see an iGPU that can actually play AAA games at GTX 1070-or-better FPS. I fully expect to see a chiplet/Foveros design from both AMD and Intel in 2022-2023 that eats into more of the dedicated GPU market. Once we have proper 5nm chips, we can easily have enough GPU and CPU for a single user all in one package. Intel isn't terribly far off today from being able to play AAA titles on an iGPU on 14nm, and 5nm is going to be 3-4x more dense.
"desktop and mobility platforms as early as 2020", eh? Let's look into the crystal ball and translate that into something that isn't marketing hyperbole. "10nm mobile parts will release late Q4 2020, with desktop SKUs announced late 2020 and entering general availability late Q1 2021. A good portion of the die space saved by the shrink will go to an improved iGPU that almost nobody will use, and the CPU cluster will remain mostly unchanged except for the addition of more L3 cache on all chips and 2 more cores on a new SKU that is expected to be priced at $650. Boost clock speeds will improve to 5.2 GHz, just enough to keep us at the top of the gaming CPU list, but not enough to dissuade people from buying AMD."
JamesSneed:

I think it is safe to assume Intel will scale Xe up, though it might not be in the way we are assuming. When we have chips made on 5nm, which is not far off, I could see an iGPU that can actually play AAA games at GTX 1070-or-better FPS. I fully expect to see a chiplet/Foveros design from both AMD and Intel in 2022-2023 that eats into more of the dedicated GPU market. Once we have proper 5nm chips, we can easily have enough GPU and CPU for a single user all in one package. Intel isn't terribly far off today from being able to play AAA titles on an iGPU on 14nm, and 5nm is going to be 3-4x more dense.
That's a gigantic leap of logic. Current Intel iGPUs aren't even competitive with a GT 1030; "doubling" that with Gen11 will bring it roughly in line with one, as it can just hit the 60 FPS mark in eSports titles like the article lists. The next gen, being really optimistic, is only going to be as fast as a 1050. They can't really get much faster than that because they use shared system memory - one of the big things that makes a 1070 as fast as it is is its dedicated 256-bit link to GDDR5, which delivers far more bandwidth than desktop DDR4 (or even upcoming DDR5). An iGPU just can't get the amount of memory bandwidth needed to hit those performance levels on a shared desktop platform.
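To put rough numbers on that bandwidth gap, here is a minimal back-of-the-envelope sketch. The transfer rates used are assumed typical values for a GTX 1070's GDDR5 and a dual-channel DDR4-3200 desktop, not figures from the article:

```python
# Back-of-the-envelope peak memory bandwidth: bytes per transfer * effective transfer rate.
# The transfer rates below are assumed typical values, not measured figures.

def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

# GTX 1070: 256-bit GDDR5 at ~8 GT/s effective, dedicated to the GPU.
gddr5 = peak_bandwidth_gb_s(256, 8.0)   # ~256 GB/s

# Dual-channel DDR4-3200: 2 x 64-bit channels at 3.2 GT/s, shared between CPU and iGPU.
ddr4 = peak_bandwidth_gb_s(128, 3.2)    # ~51 GB/s

print(f"256-bit GDDR5: ~{gddr5:.0f} GB/s vs dual-channel DDR4-3200: ~{ddr4:.0f} GB/s")
```

That is roughly a 5x gap on paper, and the iGPU also has to share its slice with the CPU, which is the point about shared system memory above.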
Would be so cool if Intel did to Nvidia's 2080 Ti what AMD did to their HEDT products.
For integrated graphics, those are some decent results, especially if the wattage is proportionate to performance (or better). I'm sure I can wait until Intel gets their enthusiast-grade parts out.
I have followed Intel and AMD since I was a little kid, maybe back since 1990. Even though I don't like Intel's shady business tactics, I believe they finally have a chance, but only because they hired a lot of great AMD workers. The question is, can Intel keep up with drivers?
Dimitrios1983:

I have followed Intel and AMD since I was a little kid, maybe back since 1990. Even though I don't like Intel's shady business tactics, I believe they finally have a chance, but only because they hired a lot of great AMD workers. The question is, can Intel keep up with drivers?
That's always been a major issue for Intel... drivers... They've never put the work in to develop a full-featured driver, nor to actually support gaming to any real extent. I think this is where Intel is going to struggle. Whether or not the products live up to the hype at launch time will be interesting to see.
If they can put up serious competition in GPUs against Nvidia, like AMD is currently doing to them in CPUs, then this can only be good for pricing and innovation.
So... maybe I'm the only sceptic in here but... how do they miraculously double their performance in just one generation? While never having done something like that (dedicated cards) before? Or do I just see an issue here where there is none? Also, suddenly they're doing HW ray tracing, whatever that's supposed to be / include? In their first generation, already knowing how to design dedicated hardware for this? And with die space at hand for that too? Please help me understand, fellow gurus.
fantaskarsef:

So... maybe I'm the only sceptic in here but... how do they miraculously double their performance in just one generation? While never having done something like that (dedicated cards) before? Or do I just see an issue here where there is none? Also, suddenly they're doing HW ray tracing, whatever that's supposed to be / include? In their first generation, already knowing how to design dedicated hardware for this? And with die space at hand for that too? Please help me understand, fellow gurus.
It's simple... they hired a lot of AMD guys... they know how to do stuff.
Spider4423:

It's simple... they hired a lot of AMD guys... they know how to do stuff.
But does that mean they redesigned their own stuff? Had it replaced by AMD knowledge? I just have a hard time imagining that all it takes to turn what Intel has into a platform competing with AMD and Nvidia is... hiring 5 guys from AMD, and 18-24 months later you have everything at hand. Or is it?
So now there is buzz about Nvidia releasing an "Ampere" GPU in 2020. Where does this leave Intel as far as releasing their new GPU platform? Will it be a dud compared to Nvidia's newest RTX offering? Not to mention AMD's newest offering in 2020. Intel jumping into the GPU game in the middle of Nvidia and AMD's battle of one-upmanship for GPU sales seems foolhardy. I see Intel's GPU release in 2020 as a "ta-da" moment with crickets chirping in the background. How can you be competitive when you don't even know what you have to bring to the table to grab a piece of the GPU market a year from now? https://media2.giphy.com/media/i3PgVt295cSKQ/giphy.gif
NewTRUMP Order:

So now there is buzz about Nvidia releasing an "Ampere" GPU in 2020. Where does this leave Intel as far as releasing their new GPU platform? Will it be a dud compared to Nvidia's newest RTX offering? Not to mention AMD's newest offering in 2020. Intel jumping into the GPU game in the middle of Nvidia and AMD's battle of one-upmanship for GPU sales seems foolhardy. I see Intel's GPU release in 2020 as a "ta-da" moment with crickets chirping in the background. How can you be competitive when you don't even know what you have to bring to the table to grab a piece of the GPU market a year from now?
The GPU market works like that now. Ampere's design started 3 years ago or more - you're always guessing at what competitors are doing.
fantaskarsef:

But does that mean they redesigned their own stuff? Had it replaced by AMD knowledge? I just have a hard time imagining that all it takes to turn what Intel has into a platform competing with AMD and Nvidia is... hiring 5 guys from AMD, and 18-24 months later you have everything at hand. Or is it?
You hire a bunch of people who have run those types of projects before, then staff a team underneath them. Remember, Raja didn't do the initial design of GCN - he only continued that project. He was graphics CTO at AMD right after they bought ATI/Radeon, and he held a high-level position at Apple. He kind of has a starting point, since Intel already has GPUs, so all he has to do is the same thing he's been doing at AMD/Apple - build off existing designs and improve them. I don't think it's going to be that hard for them to turn that experience and their current offerings into a competitive product.
Denial:

You hire a bunch of people who have run those types of projects before, then staff a team underneath them. Remember, Raja didn't do the initial design of GCN - he only continued that project. He was graphics CTO at AMD right after they bought ATI/Radeon, and he held a high-level position at Apple. He kind of has a starting point, since Intel already has GPUs, so all he has to do is the same thing he's been doing at AMD/Apple - build off existing designs and improve them. I don't think it's going to be that hard for them to turn that experience and their current offerings into a competitive product.
I see what you mean, but doesn't that mostly rely on already having usable "base tech" to build on? If so, why weren't they "competitive" or already in this market segment? Why was there no RT hardware? I mean, that stuff is probably kept unknown to the competition and the general public to protect your company's IP and trade secrets, but... I don't know... it sounds too easy to take inferior iGPU tech and turn it into a competitive dGPU product just by hiring the right manager or engineer. It still needs a lot of engineering itself, which isn't done on a Friday afternoon, but maybe I don't have the right understanding of the matter.
sykozis:

That's always been a major issue for Intel... drivers... They've never put the work in to develop a full-featured driver, nor to actually support gaming to any real extent. I think this is where Intel is going to struggle. Whether or not the products live up to the hype at launch time will be interesting to see.
What worries me is Intel bribing or "funding" game developers so Intel can move up the performance ladder. If Intel were smart, they could just buy some game studios outright, or just give them money for "training" on how to fully take advantage of Intel's features.
fantaskarsef:

I see what you mean, but doesn't that mostly rely on already having usable "base tech" to build on? If so, why weren't they "competitive" or already in this market segment? Why was there no RT hardware? I mean, that stuff is probably kept unknown to the competition and the general public to protect your company's IP and trade secrets, but... I don't know... it sounds too easy to take inferior iGPU tech and turn it into a competitive dGPU product just by hiring the right manager or engineer. It still needs a lot of engineering itself, which isn't done on a Friday afternoon, but maybe I don't have the right understanding of the matter.
It does need a lot of engineering, but I wouldn't say it's being done on a Friday afternoon. Raja was hired in 2017 and the product isn't being released until 2020. Intel has a ton of resources on top of already having existing IP. Think of Ryzen, for example - Bulldozer was the building block they started from, and catching it up to Intel wasn't too hard because the team working on Ryzen already knew all the architectural deficits and what they needed to do. Raja is going to look at what Intel has in the iGPU space, use what he knows about architectures from GCN/Apple, and just build up the parts that are missing. On top of that, Intel is going to throw him a budget probably twice what he had at AMD - which means he's going to hire a ton of engineers to help him do it. I don't think it's going to be a challenge for Intel to be competitive. Will it beat Nvidia/AMD? Maybe not in outright performance, but I'm sure it will be price/performance competitive.
Denial:

It does need a lot of engineering, but I wouldn't say it's being done on a Friday afternoon. Raja was hired in 2017 and the product isn't being released until 2020. Intel has a ton of resources on top of already having existing IP. Think of Ryzen, for example - Bulldozer was the building block they started from, and catching it up to Intel wasn't too hard because the team working on Ryzen already knew all the architectural deficits and what they needed to do. Raja is going to look at what Intel has in the iGPU space, use what he knows about architectures from GCN/Apple, and just build up the parts that are missing. On top of that, Intel is going to throw him a budget probably twice what he had at AMD - which means he's going to hire a ton of engineers to help him do it. I don't think it's going to be a challenge for Intel to be competitive. Will it beat Nvidia/AMD? Maybe not in outright performance, but I'm sure it will be price/performance competitive.
Yes... I kind of didn't think about Intel's budget... they can easily hire engineers if that's what they want to do. I just had the impression that Intel's iGPUs were never really good, in my experience (from a few years back, I have to admit), not comparable to the Ryzen 2400G that I use these days. And then they enter the market "just" three years later... it felt like too little time, since, to use your example, the gap between Bulldozer and Ryzen was way longer than three years, more like five or six. Hence I doubted Intel could pull it off in such a short time. But with the right budget, man-hours can be bought in bulk, and certainly an experienced engineer's experience saves time too.