NVIDIA Releases SDK allowing Global Illumination through Ray-Tracing for any GPU supporting DXR

I call it a preemptive strike. I wonder if developers working on the Series X will get caught by it. Already-developed code is very tempting to use... don't take the bait.
This is GameWorks all over again.
David3k:

This is GameWorks all over again.
No it isn't. And the only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
I see this as a positive move.
This is basically Nvidia seeing AMD is supporting raytracing on the big two next gen consoles.
I hope it will really be optimized, and not the same old story of forcing every consumer to buy an expensive new card, no matter the brand, just to be allowed to see lighting effects or shadows...
Astyanax:

No it isn't. And the only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
I always thought that standard open libraries are the best way to build visuals in a wide range of games, instead of the usual marketing campaigns that lock common graphic effects into a few sponsored games (like anti-aliasing or physics), making even their strongest last-gen cards look like toys and leaving a pile of downgrades and bugs behind... But if the GTX 10 series can deliver enough performance and visuals too, maybe everyone can enjoy ray-traced effects in new games, at least at 1080p. It's quite clear now that David Wang from AMD already knew what would happen and that they were waiting for the next-gen consoles to offer DXR on every new card. Let's see what comes next... fingers crossed for physics and AI on every CPU and ray-traced effects on every GPU 😀
Astyanax:

No it isn't. And the only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
Lots of Nvidia users hated GameWorks too. I did, and I was running 2x GTX 970s at the time. My 290X CrossFire ran GimpWorks just fine. It's the fact that Nvidia forced developers to lock AMD out of the game's source code and engine if the game used GimpWorks. AMD couldn't even have a driver ready for the game until after launch.
Don't confuse the old GameWorks with its closed binaries with the GameWorks of today, which is completely open. Most GameWorks libraries are now fully open.
Agonist:

It's the fact nvidia forced developers to lock AMD out of the game source code and engine if the games use gimpworks. AMD could not even have a driver ready for the game till after launch.
Source it. I've never seen such a thing. For a while Nvidia didn't provide source for GameWorks libraries, but they do now - I've never seen them lock AMD out of a game's source or engine, ever. In fact, the two times AMD did mention GameWorks it didn't even make sense. With Project Cars, AMD responded to a poster who claimed GPU PhysX was destroying AMD's performance (the game didn't even have GPU PhysX); the developer then said AMD had access to the game for months with no communication whatsoever to the developer:
With the complaints flowing in thick and fast, Project Cars developer Slightly Mad Studios joined the fray and proceeded to place the blame for the game's issues squarely on AMD. "We've provided AMD with 20 keys for game testing as they work on the driver side," said Slightly Mad Studios' Ian Bell. "But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips. We're reaching out to AMD with all of our efforts. We've provided them 20 keys as I say. They were invited to work with us for years, looking through company mails the last I can see [AMD] talked to us was October of last year. Categorically, Nvidia have not paid us a penny. They have though been very forthcoming with support and co-marketing work at their instigation."
With Witcher 3 Richard Huddy said:
"We've been working with CD Projeckt Red from the beginning," said Huddy. "We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."
But there are videos of the game with HairWorks a year before it shipped. The whole thing was demoed at GDC over a year before it launched. They knew it had HairWorks, yet they never requested builds with HairWorks (presumably - or they did and just did nothing), and they knew their architecture's geometry performance was trash. So I guess they decided to just handwave the whole thing away with "ehh, we need source code!" And now the source is available for all GameWorks libraries, and yet no magical drivers came out and fixed the performance or the issues AMD had.
Denial:

Source it. I've never seen such a thing. For a while Nvidia didn't provide source for GameWorks libraries, but they do now - I've never seen them lock AMD out of a game's source or engine, ever. In fact, the two times AMD did mention GameWorks it didn't even make sense. With Project Cars, AMD responded to a poster who claimed GPU PhysX was destroying AMD's performance (the game didn't even have GPU PhysX); the developer then said AMD had access to the game for months with no communication whatsoever to the developer: With Witcher 3 Richard Huddy said: But there are videos of the game with HairWorks a year before it shipped. The whole thing was demoed at GDC over a year before it launched. They knew it had HairWorks, yet they never requested builds with HairWorks (presumably - or they did and just did nothing), and they knew their architecture's geometry performance was trash. So I guess they decided to just handwave the whole thing away with "ehh, we need source code!" And now the source is available for all GameWorks libraries, and yet no magical drivers came out and fixed the performance or the issues AMD had.
The evidence: https://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs And every old PhysX game I played didn't run well enough even on their own cards, for quite common physics that even my crappy GT 710 DDR3 can push nowadays... Nvidia says they don't pay as a sponsor, that they only give away graphics cards and manpower (a lot of money) to implement their "GameWorks", but we must always remember what happened with the Watch Dogs and Batman: Arkham Knight ports to PC... I want to think things were a little different with CD Projekt Red, because they are a reliable studio and they didn't allow too much crap into their game, but again, 64x tessellation on hair... it looks like a desperate move to cripple everything for no good reason. Every GPU maker looks for an advantage over rivals: AMD does it through consoles (in the past TressFX, Mantle, etc.), trying to improve performance on their hardware but giving that progress to everyone, while Nvidia always found a way to make every card on the market feel poor except their new-gen flagship, while pushing prices higher than ever for a consumer part... I know what to expect, but I hope for an open implementation in new games.
Luc:

The evidence: https://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs And every old PhysX game I played didn't run well enough even on their own cards, for quite common physics that even my crappy GT 710 DDR3 can push nowadays... Nvidia says they don't pay as a sponsor, that they only give away graphics cards and manpower (a lot of money) to implement their "GameWorks", but we must always remember what happened with the Watch Dogs and Batman: Arkham Knight ports to PC... I want to think things were a little different with CD Projekt Red, because they are a reliable studio and they didn't allow too much crap into their game, but again, 64x tessellation on hair... it looks like a desperate move to cripple everything for no good reason. I know what to expect, but I hope for an open implementation in new games.
How is this a source? Where in this article does it say that the developer locked AMD out of the engine and/or source code for the game but gave Nvidia access? Furthermore:
"Batman AA is not our property. It is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute too and QA for their customers, not NVIDIA. If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal engine does not support in game AA, so we added it and QAed it for our customers. As Eidos confirmed (Not allowed to post links here, but check PCper for Eidos' statement) AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD starts working with developers to make their HW work in a proper way. That's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out. AMD just did not do their work."
From PCPer:
The developer relations team at NVIDIA is significantly larger, has a significantly larger budget and in general works with more developers than AMD's. In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.
Throughout this whole period of time - from when AMD purchased ATi to, honestly, pretty recently - AMD was notorious for not supporting game developers. S2 Games, for example, made the biggest stink about it with Savage 2, where they claimed they actually had Nvidia help them get the game working on AMD hardware because AMD didn't have the resources to help indie developers. Also, Richard Huddy just straight up lies left and right anyway. Remember when he said 4GB of HBM = 12GB of GDDR5? Fury X owners remember.
Every GPU maker looks for an advantage over rivals: AMD does it through consoles (in the past TressFX, Mantle, etc.), trying to improve performance on their hardware but giving that progress to everyone, while Nvidia always found a way to make every card on the market feel poor except their new-gen flagship, while pushing prices higher than ever for a consumer part...
TressFX had nearly the same problem that AMD claimed with HairWorks: https://www.pcgamer.com/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/ Notice how, unlike AMD, they just said they were working on drivers and with the developer to fix the issue instead of pointing fingers. Oh, and they did it without the source code, because TressFX's source wasn't released until months later. Don't get me wrong - I think AMD is a completely different company under Lisa Su... most of the people from that time are long gone. But a lot of the things people bring up are from a time when AMD's and Nvidia's architectures were moving in radically different directions and both companies were trying to develop features optimized for their own products. I think it's asinine to keep bringing them up, especially when most of the issues with GameWorks (for example, it being closed source) are no longer issues.
Sadly, Richard Huddy is still around and still lying.
Denial:

How is this a source? Where in this article does it say that the developer locked AMD out of the engine and/or source code for the game but gave Nvidia access?
From the source: "AMD's Ian McNaughton in his recent blog thread said that they had confirmed this by an experiment where they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option is not available for the retail game as there is a secure-rom." I can confirm that the developer later changed their game core and deleted/modified the whitelist to allow AMD to use AA, after months of criticism from the whole gaming community. In the end they did it amid poor excuses, in an attempt to clean up their public image, and it looks like they accomplished it.
Denial:

Throughout this whole period of time - from when AMD purchased ATi, to honestly pretty recently - AMD was notorious for not supporting game developers. S2 games for example made the biggest stink about it with Savage 2. Where they claimed they actually had Nvidia help them get the game working on AMD hardware because AMD didn't have the resources to help indie game developers.
I didn't know about that game, or whether it was popular or niche in its day, so I can't really discuss it, or whether AMD had the money or it was their fault at all, but I'd like to trust you.
Denial:

Also Richard Huddy just straight up lies left and right anyway. Remember when he said 4GB of HBM = 12GB of GDDR5? Fury X owners remember.
Yeah, a big fail by everyone there; they lied trying to hide the problem, and the hype made things worse.
Denial:

TressFX had the nearly the same problem that AMD claimed with HairWorks: https://www.pcgamer.com/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/ Notice how unlike AMD, they just said they were working on drivers and with developer to fix the issue instead of pointing fingers. Oh and they did it without the source code because TressFX didn't release source until months later.
Yes, AMD did the same thing they blame others for. Nvidia handled it well: first they improved stability, then they improved performance once the source code was available. TressFX seemed rushed; I remember even Radeon drivers weren't performing well when the game was released. But there's a big difference in timing: releasing the source code shortly after launching the game (dirty) isn't the same as releasing the PhysX code openly after 10 years of being closed, and I don't know how long GameWorks was closed. They must avoid those strategies; that's why this announcement is so encouraging, but we know how those old dogs used to be...
Denial:

Don't get me wrong - I think AMD is a completely different company under Lisa Su.. most of the people at that time are long gone. But a lot of these things people bring up are from a time when AMD/Nvidia's architectures were moving in radically different directions and both companies were trying to develop features that were optimized for their own products. I think it's asinine to keep bringing them up, especially when most of the issues with Gameworks (for example it being closed source) is no longer an issue.
I hope you are right, because we don't know how AMD's RDNA2 will play with RTX on, and there's a possibility it will only work well in next-gen console ports. Lisa's leadership is great, but she already threw the Radeon VII at gamers when David Wang said that chip wasn't for gaming. Maybe Intel can balance the situation between them; it could be a funny story. PS: sorry for the long post and my repetitive and/or bad English 😳
Astyanax:

No it isn't. And the only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
Neither did Nvidia. GameWorks broke nearly every game it was implemented in, even on Nvidia cards. Watch Dogs: broken. Witcher 3: broken. Arkham Knight: broken. The Metro series: broken. Crysis 2: broken. Those are just a few I can remember, but nearly every time they added these features it destroyed the frame rate on AMD cards, and on Nvidia cards too. This is why it's common knowledge that if you want a quick fix for better performance you ALWAYS disable these stupid features first and forget they even exist. Open standards are always better; this closed-off garden Nvidia keeps locking themselves in does nothing but hurt the PC gaming industry. The sooner this stuff goes away, the better for us all.
CPC_RedDawn:

Neither did Nvidia. GameWorks broke nearly every game it was implemented in, even on Nvidia cards. Watch Dogs: broken. Witcher 3: broken. Arkham Knight: broken. The Metro series: broken. Crysis 2: broken. Those are just a few I can remember, but nearly every time they added these features it destroyed the frame rate on AMD cards, and on Nvidia cards too. This is why it's common knowledge that if you want a quick fix for better performance you ALWAYS disable these stupid features first and forget they even exist. Open standards are always better; this closed-off garden Nvidia keeps locking themselves in does nothing but hurt the PC gaming industry. The sooner this stuff goes away, the better for us all.
I played more than half of these games fine at launch (I never played WD or AK). No idea what you're talking about with the others - I played Crysis 2 on an AMD card; the game was bad, but it ran fine. Also, all of Nvidia's stuff is open now, including the item mentioned in the OP.
Denial:

I played more than half of these games fine at launch (I never played WD or AK). No idea what you're talking about with the others - I played Crysis 2 on an AMD card; the game was bad, but it ran fine. Also, all of Nvidia's stuff is open now, including the item mentioned in the OP.
Watch Dogs was a mess; AK had many other issues too, and GameWorks just added to them. Witcher 3's HairWorks was terrible unless you had a high-end GPU; they should have used TressFX instead. Metro had PhysX and you took about a 20% performance hit; same goes for Metro Exodus, where disabling all the Nvidia features gets you 40% more performance for barely any difference in IQ, or you could enable RT for a 50-60% drop in performance. Crysis 2 had an insane amount of tessellation left in, which Nvidia knew about, and it killed performance on AMD cards. Thank god it's finally open source; nice to see that open source always wins in the end.
CPC_RedDawn:

Watch Dogs was a mess; AK had many other issues too, and GameWorks just added to them. Witcher 3's HairWorks was terrible unless you had a high-end GPU; they should have used TressFX instead. Metro had PhysX and you took about a 20% performance hit; same goes for Metro Exodus, where disabling all the Nvidia features gets you 40% more performance for barely any difference in IQ, or you could enable RT for a 50-60% drop in performance. Crysis 2 had an insane amount of tessellation left in, which Nvidia knew about, and it killed performance on AMD cards. Thank god it's finally open source; nice to see that open source always wins in the end.
TressFX was never used on more than one asset in a game, and at the time doing one character took tons of development time; it also had no fur support - it couldn't have been used in Witcher 3. To this day HairWorks instances better than TressFX - which is why basically no games use TressFX, and the ones that do limit it to very few characters. RT is in plenty of games with good results and it keeps improving - I don't think you can compare RT to the other GameWorks features. What do you mean, "Nvidia knew about"? The underwater tessellation was debunked (it gets culled). The rest is explained here:
At the time Crysis 2 came out, Crytek didn't yet improve their parallax occlusion mapping technique. Parallax occlusion mapping, or POM, is a shader trick used to make a surface look like it has depth to it. An example of it being used in the Crysis 2 CE3 build:
http://www.simonfuchs.net/folio/gfx/tutorials/02_pom/13_pom_animation.gif
http://docs.cryengine.com/display/SDKDOC2/Silhouette+POM
After all this time, they finally improved their POM to influence the silhouette. It's still an insanely expensive effect to use, but it has 2 major advantages. 1: It does run much, much faster than tessellation on very noisy surfaces. 2: Gamers have yet to figure out what a shader actually is. You can get people forcing on wireframe mode and poking around in there, but the average gamer still has zero concept of what a shader effect is or what it does. So, if performance goes down, they'll just write it off as something else influencing the performance. Including this link for the sake of completeness:
https://lh3.googleusercontent.com/ABCa7x6lZTEJLdFh-dFqOfrffLn6azgd57ekud_FyliE0ZknHHvuIBwgOjJKOYiyR05B8YaLA_Lo95KEg7N80BAKM-Ky7y6TyB4DB9QLQ0-srsFRFkbhXFD70YsdlUJ3PhEOT3o
This is Epic's POM integration into UE4. Of course, like any sensible developer, they warn against overusing it since it is a very expensive shader effect. Image source: https://forums.unrealengine.com/showthread.php?72647-Engine-Features-Preview-6-11-2015
This has nothing to do with Nvidia. There have been some Nvidia GameWorks features that ruin performance with questionable benefits, but again that's easily explained by the divergence in architectures or Nvidia just trying things (like the voxel-based tracing in The Division). People have to realize that these things aren't developed in a vacuum, though. Throughout the years I've seen people say "Nvidia intentionally made this to sabotage AMD's performance", but in the case of HairWorks it was in development for 8 years. https://developer.download.nvidia.com/presentations/2008/SIGGRAPH/RealTimeHairRendering_SponsoredSession2.pdf Here is a 2008 SIGGRAPH presentation that is the basis for HairWorks. Nvidia felt GPU architectures were going to go more towards geometry, but they didn't - AMD pushed compute hard, won all the console contracts, pushed compute into the consoles, and a lot of the tech Nvidia was developing for its own architectures became somewhat obsolete. In the meantime AMD's tessellation performance was lackluster, and they've since made lots of attempts to improve it.
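For anyone unfamiliar with the technique described in the quote above: parallax occlusion mapping fakes depth on a flat surface by marching a ray through a height map in texture space and shifting the texture lookup to wherever the ray hits. Below is a minimal CPU-side sketch of that idea, assuming a simple linear search (a production pixel shader would add a refinement pass); the types and names are illustrative, not taken from CryEngine or any real SDK.

```cpp
// Illustrative sketch of the parallax occlusion idea: march a ray through a
// height field in texture space and return the shifted UV where it hits.
// In a real game this runs per-pixel in a shader; this is CPU-side code kept
// compilable for clarity, not engine code.
struct Vec2 { float x, y; };

template <typename HeightFn>   // HeightFn: float(Vec2 uv), heights in [0, 1]
Vec2 parallaxOcclusionUV(Vec2 uv, Vec2 viewDirTS, float heightScale,
                         int numSteps, HeightFn sampleHeight)
{
    const float layerDepth = 1.0f / static_cast<float>(numSteps);
    // UV shift per step, along the view direction projected into texture space.
    const Vec2 deltaUV = { viewDirTS.x * heightScale * layerDepth,
                           viewDirTS.y * heightScale * layerDepth };

    Vec2  currentUV    = uv;
    float rayDepth     = 0.0f;
    float surfaceDepth = 1.0f - sampleHeight(currentUV); // height map read as depth

    // Step the ray down through depth layers until it dips below the surface.
    while (rayDepth < surfaceDepth && rayDepth < 1.0f) {
        currentUV.x -= deltaUV.x;
        currentUV.y -= deltaUV.y;
        rayDepth    += layerDepth;
        surfaceDepth = 1.0f - sampleHeight(currentUV);
    }
    return currentUV; // sample albedo/normals here to get the apparent depth
}
```

The cost scales with the step count and with how noisy the height field is, which is the trade-off the quoted post is getting at when it compares POM against tessellation.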
Denial:

TressFX was never used on more than one asset in a game, and at the time doing one character took tons of development time; it also had no fur support - it couldn't have been used in Witcher 3. To this day HairWorks instances better than TressFX - which is why basically no games use TressFX, and the ones that do limit it to very few characters. RT is in plenty of games with good results and it keeps improving - I don't think you can compare RT to the other GameWorks features. What do you mean, "Nvidia knew about"? The underwater tessellation was debunked (it gets culled). The rest is explained here: This has nothing to do with Nvidia. There have been some Nvidia GameWorks features that ruin performance with questionable benefits, but again that's easily explained by the divergence in architectures or Nvidia just trying things (like the voxel-based tracing in The Division). People have to realize that these things aren't developed in a vacuum, though. Throughout the years I've seen people say "Nvidia intentionally made this to sabotage AMD's performance", but in the case of HairWorks it was in development for 8 years. https://developer.download.nvidia.com/presentations/2008/SIGGRAPH/RealTimeHairRendering_SponsoredSession2.pdf Here is a 2008 SIGGRAPH presentation that is the basis for HairWorks. Nvidia felt GPU architectures were going to go more towards geometry, but they didn't - AMD pushed compute hard, won all the console contracts, pushed compute into the consoles, and a lot of the tech Nvidia was developing for its own architectures became somewhat obsolete. In the meantime AMD's tessellation performance was lackluster, and they've since made lots of attempts to improve it.
Excuse me, most of those links didn't work for me. I don't get your point. First, you accused AMD of not supporting an indie game that was already sponsored by Nvidia, because the developers couldn't get their own software working on Radeon hardware; it was AMD's fault because they didn't have the money for that project, and we know that could be true. Then you excused Nvidia for their many broken GameWorks games, games that already worked nicely (30 fps 🙄) on x86 consoles. The only thing those ports had in common was Nvidia sponsorship, because the rest of the console ports (most games) worked much better. Arkham Knight was taken off the market and repaired for almost a year... I can't believe the only problem here was that Nvidia planned to use more geometry, because they never had trouble offering strong compute performance, and their sponsored games only offered overkill geometry in hair, fur and god rays, while fog, smoke and fire were so uncommon that I don't remember seeing them since Arkham Asylum's unnecessarily stacked volumetric fog... it looked so unoptimized that only the strongest PhysX card could run it... it worked really well most of the time, but in some places it was piled on so heavily that it crippled performance, dropping from more than 100 fps to almost 20. Everybody knows that Nvidia can be the strongest graphics player in the market without dirty tricks, and I expect any contender to avoid them as well. I hope I didn't sound harsh; that isn't my mood 😛
In Batman: AA, the AA option was intentionally disabled on AMD by the studio because AMD didn't supply any devrel to verify that the AA actually worked on AMD parts.
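For context on the Batman: AA vendor check argued about above: the behaviour described (an AA option that only appears when an NVIDIA adapter is reported, and comes back when the device ID is spoofed) amounts to gating a feature on the adapter's PCI vendor ID. Below is a hedged sketch of what such a check can look like, using DXGI purely for illustration (Batman: Arkham Asylum was a D3D9 title, and showAAOption is an invented name; this is not the game's actual code).

```cpp
// Hypothetical sketch: expose an in-game AA toggle only when the primary
// adapter reports NVIDIA's vendor ID. Illustrative only.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

bool showAAOption()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return false;

    IDXGIAdapter* adapter = nullptr;
    bool allow = false;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // 0x10DE is NVIDIA's PCI vendor ID (0x1002 is AMD/ATI's). Spoofing the
        // reported ID is exactly how the experiment quoted earlier in the
        // thread re-enabled the option on Radeon hardware.
        allow = (desc.VendorId == 0x10DE);
        adapter->Release();
    }
    factory->Release();
    return allow;
}
```

Checks like this are normally used for vendor-specific workarounds rather than feature gating, which is why the Batman case drew so much criticism.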