NVIDIA Releases SDK allowing Global Illumination through Ray-Tracing for any GPU supporting DXR
MBTP
I call it a preemptive strike.
I wonder if developers working on the Series X will get hooked by that.
It's very tempting to use already developed code...
Don't bite that bait.
David3k
This is GameWorks all over again.
Astyanax
Maddness
I see this as a positive move.
KissSh0t
This is basically Nvidia seeing that AMD is supporting ray tracing on the two big next-gen consoles.
Luc
I hope it will really be optimized, and not the same old story of forcing every consumer to buy an expensive new card, no matter the brand, just to be allowed to see lighting effects or shadows...
I've always thought that standard open libraries are the best way to build visuals across a wide range of games, instead of the usual marketing campaigns that lock common graphic effects (like anti-aliasing or physics) into a few sponsored games, making even their strongest last-gen cards look like toys and leaving a pile of downgrades and bugs behind...
But if the GTX 10 series can deliver enough performance and decent visuals, maybe everyone can enjoy ray-traced effects in new games, at least at 1080p.
It's quite clear now that David Wang from AMD already knew what would happen, and that they were waiting for the next-gen consoles to offer DXR on every new card.
Let's see what comes next... fingers crossed for physics and AI on every CPU and ray-traced effects on every GPU.
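On the "DXR on every GPU" point, here is a minimal sketch (standard D3D12, not the new SDK itself) of how an application can ask the driver whether DXR is exposed at all, via the CheckFeatureSupport query; treat it as an illustration rather than production code:

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter; link against d3d12.lib.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier; DXR requires Tier 1.0 or higher.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        const bool dxr = opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
        std::printf("DXR supported: %s (tier %d)\n", dxr ? "yes" : "no",
                    static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}

Anything that reports Tier 1.0 or above can run DXR workloads, including cards where the driver implements it without dedicated hardware, though performance varies a lot.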
Agonist
Alessio1989
Don't confuse the old GameWorks with closed binaries and the GameWorks of today. Most of GameWorks is now completely open.
Denial
There were videos of the game with Hairworks a year before it shipped. The whole thing was demoed at GDC over a year before launch. They knew it had Hairworks, yet they never requested builds with Hairworks (or they did and just did nothing), and they knew their architecture's geometry performance was weak. So I guess they decided to just handwave the whole thing away with "ehh, we need source code!" And now the source is available for all GameWorks libraries, yet no magical drivers came out and fixed the performance issues AMD had.
Source it. I've never seen such a thing. For a while Nvidia didn't provide source for GameWorks libraries, but they do now - I've never seen them lock AMD out of a game's source or engine, ever. In fact, the two times AMD did call out GameWorks, it didn't even make sense. With Project CARS, AMD responded to a poster who claimed GPU PhysX was destroying AMD's performance (the game didn't even use GPU PhysX); the developer responded and said AMD had access to the game for months with no communication whatsoever to the developer:
With Witcher 3, Richard Huddy said:
Luc
But there is:
https://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs
And every old PhysX game I played didn't run well enough on their own cards, for quite common physics that even my crappy GT 710 DDR3 can push nowadays...
Nvidia says they don't pay as a sponsor, that they only give away graphics cards and workforce (a lot of money) to implement their "GameWorks", but we should always remember what happened with the Watch Dogs and Batman: Arkham Knight ports to PC...
I want to think things were a little different with CD Projekt Red, because they are a reliable studio and they didn't allow too much crap into their game, but again, 64x tessellation on hair... looks like a desperate move to cripple everything for no reason.
Every GPU maker looks for an advantage over its rivals: AMD does it through consoles (and in the past TressFX, Mantle, etc.), trying to improve performance on their own hardware but giving that progress to everyone, while Nvidia always found a way to make every card on the market feel poor except their new flagship, while pushing prices higher than ever for a consumer part...
I know what to expect, but I hope for an open implementation in new games.
The evidence:
Denial
Remember when he said 4GB of HBM = 12GB of GDDR5? Fury X owners remember.
TressFX had nearly the same problem that AMD claimed HairWorks had:
https://www.pcgamer.com/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/
Notice how, unlike AMD, they just said they were working on drivers and with the developer to fix the issue instead of pointing fingers. Oh, and they did it without the source code, because TressFX's source wasn't released until months later.
Don't get me wrong - I think AMD is a completely different company under Lisa Su... most of the people from that time are long gone. But a lot of the things people bring up are from a time when AMD's and Nvidia's architectures were moving in radically different directions and both companies were trying to develop features optimized for their own products. I think it's asinine to keep bringing them up, especially when the main issue with GameWorks (it being closed source) is no longer an issue.
How is this a source? Where in this article does it say that the developer locked AMD out of the engine and/or source code for the game but gave Nvidia access?
Furthermore:
From PCPer:
Throughout this whole period of time - from when AMD purchased ATi to, honestly, pretty recently - AMD was notorious for not supporting game developers. S2 Games, for example, made the biggest stink about it with Savage 2: they claimed they actually had Nvidia help them get the game working on AMD hardware because AMD didn't have the resources to help indie game developers.
Also Richard Huddy just straight up lies left and right anyway.
Astyanax
Sadly, Richard Huddy is still around and still lying.
Luc
From the source: "AMD's Ian McNaughton in his recent blog thread said that they had confirmed this by an experiment where they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option is not available for the retail game as there is a secure-rom."
I can confirm that the developer changed their game core and deleted/modified their whitelist to allow AMD cards to use AA, after months of criticism from the whole gaming community. In the end they did it amid poor excuses, in an attempt to clean up their public image, and it looks like they succeeded.
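To illustrate the kind of check the quote describes, here is a minimal hypothetical sketch of a vendor-ID whitelist gating an AA option; the struct, function, and logic are made up for illustration and are not the game's actual code (PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD):

#include <cstdint>
#include <cstdio>

struct AdapterInfo {
    uint32_t vendorId;  // as reported by the graphics driver
};

// Hypothetical whitelist: only expose the in-game AA option on NVIDIA hardware.
bool AllowInGameAA(const AdapterInfo& adapter) {
    return adapter.vendorId == 0x10DE;
}

int main() {
    AdapterInfo radeon{0x1002};   // ATI/AMD card as normally reported
    AdapterInfo spoofed{0x10DE};  // same card with its reported ID changed
    std::printf("Radeon:  AA option %s\n", AllowInGameAA(radeon) ? "shown" : "hidden");
    std::printf("Spoofed: AA option %s\n", AllowInGameAA(spoofed) ? "shown" : "hidden");
    return 0;
}

Changing the reported vendor ID flips that check, which matches the experiment McNaughton describes.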
I didn't know about that game, nor whether it was popular or niche in its time, so I can't discuss it, nor whether AMD had the money or it was their fault at all, but I'd like to trust you.
Yeah, big fail by everyone there; they lied trying to hide the problem and the hype made things worse.
Yes, AMD did the same thing they blame others for. Nvidia handled it well: first they improved stability, then they upgraded performance with the source code. TressFX seemed rushed; I remember even Radeon drivers weren't performing well when the game was released. But there is a big difference in timing: releasing the source code only after launching the game (dirty) isn't the same as releasing the PhysX code openly after 10 years of being closed, and I don't know how long GameWorks stayed closed.
They need to avoid those strategies; that's why this announcement is so encouraging, but we know how those old dogs used to behave...
I hope you are right, because we don't know how AMD's RDNA2 will perform with RTX effects on, and there's the possibility that it will only work in next-gen console ports.
Lisa's leadership is great, but she still threw the Radeon VII at gamers when David Wang said that chip wasn't for gaming.
Maybe Intel can balance the situation between them; it could be an interesting story.
PS: sorry for the long post and my repetitive and/or bad English.
Denial
CPC_RedDawn
Denial
https://developer.download.nvidia.com/presentations/2008/SIGGRAPH/RealTimeHairRendering_SponsoredSession2.pdf
Here is a 2008 Siggraph presentation that is the basis for Hairworks.
Nvidia thought GPU architectures were going to move more toward geometry, but they didn't - AMD pushed compute hard, won all the console contracts, pushed compute into the consoles, and much of the tech Nvidia had been developing for its architectures became somewhat obsolete. In the meantime AMD's tessellation performance was lackluster, and they've since made many attempts to improve it.
TressFX was never used on more than one asset in a game, and at the time doing one character took tons of development time; it also had no fur support, so it couldn't have been used in The Witcher 3. To this day Hairworks instances better than TressFX - which is why basically no games use it, and the ones that do limit it to very few characters.
RT is in plenty of games with good results and it keeps improving - I don't think you can compare RT to other Gameworks features.
What do you mean, "Nvidia knew about it"? The underwater tessellation claim was debunked (that geometry gets culled). The rest is explained here:
It has nothing to do with Nvidia.
There have been some Nvidia GameWorks features that ruin performance for questionable benefit, but again, that's easily explained by the divergence in architectures or by Nvidia just trying things (like the voxel-based tracing in The Division). People have to realize that these things aren't developed in a vacuum, though. Over the years I've seen people say "Nvidia intentionally made this to sabotage AMD's performance," but in the case of Hairworks it was in development for eight years.
Luc
Astyanax
Batman's AA was intentionally disabled on AMD by the studio because AMD didn't supply any devrel to verify the AA actually worked on AMD parts.