Raytracing without RTX: Nvidia Pascal receives DXR support via driver

https://forums.guru3d.com/data/avatars/m/216/216235.jpg
I suspect this move is meant to counter Crytek's.
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
The comments at wccftech hurt my brain
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Killian38:

Nvidia just turned your 1080TI into a 2080! Please don't get upset.
Except ray tracing on the 1080 Ti won't perform even remotely well. It's great that they're enabling this, for the simple fact that it'll show how much of a leap dedicated hardware currently provides, but from an actual usability standpoint there won't be any reason to try to use it. Maybe on the 1660s there will be some usefulness, since they can execute FP32 and INT32 concurrently. Since apparently you didn't read the article: https://cdn.wccftech.com/wp-content/uploads/2019/03/2019-03-18_23-13-22-1480x827.png https://cdn.wccftech.com/wp-content/uploads/2019/03/2019-03-18_23-14-13-1480x823.png Nvidia just turned your 1080 Ti into a 2080, but you get 43 fewer FPS! Yay! Always wanted to play games at 18 fps! Again, the only point of this is to showcase how important and necessary dedicated hardware is.
https://forums.guru3d.com/data/avatars/m/250/250667.jpg
Killian38:

Nvidia just turned your 1080TI into a 2080! Please don't get upset.
My 1080 Ti is out of control, won't stop humping my leg, better go play to calm him down.
https://forums.guru3d.com/data/avatars/m/105/105985.jpg
Oh, no way, I must have missed that news. Too bad I'm rolling on a 750 Ti.
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
dannyo969:

Seems like a marketing pitch, Pascal doesn't have RT cores or Tensor cores. I question Nvidia's intentions sometimes... Wouldn't they want people to buy new RTX cards? From a business standpoint, it seems this would slow RTX sales further. Don't even get me started on the 1660 Ti and 1660, just why... They're competing with their own cards. Buy a damn RTX 2060 or GTX 1070.
This should have been this way from the beginning. The biggest complaints people have about RTX cards are the pricing, the performance relative to Pascal, and the performance of ray tracing. Since no one has had anything but Nvidia's word about "how much faster" Turing is than Pascal at ray tracing, many people simply didn't care. They looked at the "up to 6 times the performance of Pascal at ray tracing" claim as a meaningless number, since it wasn't something they or any reviewer could verify and legitimize.

Now that'll change, and we'll see exactly where things stand and exactly how much of a leap the RTX cards actually are at ray tracing. Yes, there will still be people saying it's not good enough, there always are, but for some, hopefully, it'll showcase where the future may go. People have to remember: the current RTX cards are the first iteration, first iterations are always the worst, and the technology generally leaps from there once they figure out how to do it well (again, remember tessellation, which is hardly mentioned anymore because it's no longer the big performance hit it was in its first iteration). But I'm sure there will still be people complaining for the sake of complaining.
data/avatar/default/avatar26.webp
Makes no sense, because the 10-series cards wouldn't be able to handle it. The RTX cards can barely handle it natively. Unless DirectX can handle a lot of the overhead needed, but that is still pushing it. As mentioned, maybe the 16xx-based cards could handle it because of the new technologies they have.
https://forums.guru3d.com/data/avatars/m/239/239932.jpg
MS allows certain DX12 games to work on Windows 7, and now Nvidia is enabling DXR on Pascal? Wut. Performance will be awful, so it won't even matter.
data/avatar/default/avatar35.webp
RzrTrek:

Another disappointing keynote with way too much focus on AI, data analytics and cloud "gaming"...
It's GTC, it's not a consumer-focused show. AI and data analytics are essentially the key focus points of GTC as a whole. Maybe there will be some consumer-focused news later this week from GDC, which is happening at the same time.
https://forums.guru3d.com/data/avatars/m/243/243189.jpg
I don't know if this is a response to Crytek or what, but this is a hilarious move. I am all for expanding feature sets, but it seems this will simply confuse the product lines further, and I wonder if it will ever be real-world functional on the 16xx cards. Shame the graphs don't include the 1660 Ti vs. the 2070 and 2080 for RTX workloads, so we could see whether it improves on the 1080 Ti there. I am also confused: if this is done on the INT32 cores, what exactly are the RT cores bringing to the table? Also, 3 times the performance, but with DLSS enabled? Still going for the hard sell on DLSS, it seems.
https://forums.guru3d.com/data/avatars/m/263/263487.jpg
Aura89:

Always wanted to play games at 18fps! Again the only point of this is to showcase how important and necessary it is to have dedicated hardware
Or we could drop the resolution to 1080p and get cinematic 24 fps! \m/ \m/ Yeah, it's a marketing ploy. I'm still not buying your overpraised RTX gimmick cards, Nvidia 😀
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
DrKeo:

but the RT cores are very important in order to bring RT to the mainstream.
I wonder about that. Between Huang's (artificially) fabulous RTX On/Off demo back then and the fact that the first RTX game struggled to run decently on the super-expensive 2080 Ti with RTX on, leaving customers with the lasting notion that Turing RTX is something of a failure, I'm not sure it was such a success that you could call it important in bringing ray tracing to the mainstream. If anything, Nvidia managed to make ray tracing seem nigh impossible in the mainstream (the 2080 Ti, at its price, is far from mainstream). It's up to others, like the Crytek demo running on a far cheaper GPU, to show it can be made to work in the mainstream. Without RTX.
https://forums.guru3d.com/data/avatars/m/256/256969.jpg
Interesting move. I wonder why some suggest it might be a move to counter the Crytek presentation; it's not like these kinds of things are decided overnight. Fact is, they realized some ray tracing features run on the GTX 10xx series. Keep in mind that you can run the ray-traced pass at a lower resolution, so lots of GPUs can use ray tracing, but only the RTX cards will be able to perform at higher resolutions. The Crytek presentation is inspiring, as it shows that you can already get incredible results without dedicated hardware; just imagine how far you can go with those RTX cards, then.
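A rough back-of-envelope on the "ray pass at lower resolution" point above: a minimal sketch, assuming primary-ray cost scales linearly with pixel count (the resolutions and one-ray-per-pixel figure are illustrative, not from the article).

```python
# Sketch: the number of primary rays scales with pixel count, so
# tracing the ray pass at half resolution in each dimension cuts
# the ray work to roughly a quarter. Illustrative numbers only.

def primary_rays(width, height, rays_per_pixel=1):
    """Primary rays needed for one frame at the given resolution."""
    return width * height * rays_per_pixel

full = primary_rays(3840, 2160)   # ray pass at native 4K
half = primary_rays(1920, 1080)   # ray pass at quarter pixel count

print(full // half)  # -> 4: the half-resolution pass is ~4x cheaper
```

This is why even a GPU without dedicated RT hardware can show convincing ray-traced effects: the traced pass can run at a fraction of the display resolution and be upscaled or filtered before compositing.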
https://forums.guru3d.com/data/avatars/m/29/29917.jpg
Moderator
Yes! GPU baking 😀
data/avatar/default/avatar20.webp
RooiKreef:

I never understood the reasoning behind special cores that can only do RT and AI when you could have just built a bigger GPU with a higher total core count and done everything on it.
The reason is simple: general-purpose hardware is always going to be slower, while specialized hardware is really fast. That's why RT cores and Tensor cores exist: they are specialized for their specific tasks and do them at incredible speed. NVIDIA showed some other comparisons with one monster Pascal GPU (not available to the public), basically the shader count of four 1080 Tis glued together, and it barely matches the performance of a 2080 with RT cores. You would never reach that speed with conventional shader-only designs, because consumer chips would never get that big. Even if all the die area used by RT and Tensor cores went to normal shaders, you would maybe gain 20% more shaders, not 4 times as many. And thus, we have dedicated hardware.
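The gap described above can be put in numbers: a quick sketch using the commenter's own figures (the "4x 1080 Ti" and "~20% more shaders" estimates come from the post, not from measured die data; the CUDA core counts are the published specs).

```python
# Illustrative arithmetic for the shader-only vs. RT-core tradeoff.
# Core counts are published specs; the scaling figures are the
# commenter's estimates, not measurements.

shaders_1080ti = 3584                  # CUDA cores in a GTX 1080 Ti
shaders_2080 = 2944                    # CUDA cores in an RTX 2080

# Shader count of the hypothetical "four 1080 Tis glued together":
brute_force_needed = 4 * shaders_1080ti

# Best case if the RT/Tensor die area were spent on ~20% more shaders:
shader_only_gain = 1.20
best_shader_only = int(shaders_2080 * shader_only_gain)

print(brute_force_needed)  # -> 14336 shaders needed to match via brute force
print(best_shader_only)    # -> 3532 shaders actually attainable on the same die
```

The point of the sketch: reclaiming the specialized hardware's area buys a few hundred extra shaders, while matching the RT cores' throughput with shaders alone would take roughly four full chips' worth.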
data/avatar/default/avatar04.webp
HardwareCaps:

So basically, Turing's main unique feature, ray tracing, is going to be supported by the previous generation as well? Why would anyone get an RTX card now? It literally makes no sense.
You would get an RTX card because on previous-gen cards this is going to mean low-quality effects and slow performance. Without dedicated RT cores, they can barely do anything without seriously dropping FPS (and you thought DXR on RTX cards was slow already? Think again!). So basically, the same reason you always get a new card: they're faster.

This announcement really doesn't change anything. People interested in ray-tracing effects will still want an RTX card, because it's the only way to really make use of them. People who aren't really interested can get a 16-series card, or scavenge up an old 10-series, but as a bonus they get a taste of ray tracing at entry-level quality and low performance. If anything, this makes RTX look better, since you can truly see the performance cost it would have on the previous generation of GPUs. And it also opens the door for more developers to work with it, since the market of people with DXR support suddenly increases vastly.
data/avatar/default/avatar13.webp
To me, it seems the reason for this decision is just to get more game developers to add DXR ray tracing to their games, even if it's only a few small effects, so frame rates stay acceptable on the 10 and 16 series. Then the RTX cards will naturally see significant performance advantages in these games compared to AMD and Nvidia's previous cards.
https://forums.guru3d.com/data/avatars/m/142/142982.jpg
To me this is not a hilarious move but a normal one. Nvidia knew about this, but they were hoping it would happen later on so they could squeeze some money out of customers. Maybe this is one of the reasons they also launched the 16xx series. All graphics cards that fully support DirectX 12 support Microsoft's DXR API (the API responsible for real-time ray tracing); we only need to know the impact on performance, but the rest is there. And Nvidia needs to bring support for this tech simply because it will be adopted by every game developer. The new consoles will also support it. The 2xxx series brings some nice things to the table, but until we see a real comparison we can't say whether it holds up or has limitations, and so on. Now the real race for DXR real-time ray tracing starts.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
dannyo969:

Seems like a marketing pitch, Pascal doesn't have RT cores or Tensor cores. I question Nvidia's intentions sometimes... Wouldn't they want people to buy new RTX cards? From a business standpoint, it seems this would slow RTX sales further. Don't even get me started on the 1660 Ti and 1660, just why... They're competing with their own cards. Buy a damn RTX 2060 or GTX 1070.
I'm wondering if this is Nvidia's response to Crytek's RT reflections demo, where the Vega 56 seemed to do so well. It sort of caught Nvidia with their pants down, so they had to cobble together some response to show Pascal is still relevant (even on some feeble level) versus old Vega on the RT front.