Microsoft Adds DirectX 12 Feature-level 12_2, and Turing, Ampere and RDNA2 Support it

https://forums.guru3d.com/data/avatars/m/260/260103.jpg
Good to see everyone is getting in on it.
data/avatar/default/avatar06.webp
Isn't the PS5 RDNA2 too, and so also supporting mesh shaders? (With their own API, not DX12.)
data/avatar/default/avatar10.webp
Oh OK, so the PS5 may not have RDNA2 as-is, but something else. Weird. For anyone who buys purely on specs, this may be a deal breaker. I just want the new PlayStation, so for me it's not a big deal.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
The PS5 will have roughly equal strength to the Xbox Series X, just fewer features. It all depends on whether those features are going to be used by developers. Now Turing, Ampere and RDNA2 are viable; not Vega, not RDNA 1.5 or 1. Tech is moving on, fast and unforgiving. Cheer up, gurus! 😎
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Tier 0.9? That's an interesting little thing for that sampler feedback. EDIT: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html Well, it does something, probably nothing too important. There are already multiple tiers for several of the D3D12_2 features too, interesting. 🙂 (And it seems ray tracing also has incremental ones, like 1.1, going by the Ampere GPU capabilities. Hmm, it probably means nothing much overall.)
https://forums.guru3d.com/data/avatars/m/220/220214.jpg
Can someone forward on all the DX12 documentation to Microsoft - they seem to need it to finish developing Flight Simulator 2020... appears they were only given the DX11 docs.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
geogan:

Can someone forward on all the DX12 documentation to Microsoft - they seem to need it to finish developing Flight Simulator 2020... appears they were only given the DX11 docs.
Microsoft didn't develop Flight Simulator 2020.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
K.S.:

So much for early adopters of RDNA
The RTX 2060 Super and RTX 2070 Super are like a steal compared to the 5600 XT, 5700 and 5700 XT. RDNA1 is underwhelming and looks worse by the day, missing so many features.
https://forums.guru3d.com/data/avatars/m/261/261821.jpg
itpro:

The RTX 2060 Super and RTX 2070 Super are like a steal compared to the 5600 XT, 5700 and 5700 XT. RDNA1 is underwhelming and looks worse by the day, missing so many features.
I'd say the 2060 Super is completely useless. You can get a 5700 XT for the same price with a lot more power. Ray tracing performance is horrible on the 2060 Super (even on my 2080 it's horrible), and there are still barely any games to play with it. Same with DLSS 2.0: barely any games support it, and 1.0 is crap.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
riot83:

I'd say the 2060 Super is completely useless. You can get a 5700 XT for the same price with a lot more power. Ray tracing performance is horrible on the 2060 Super (even on my 2080 it's horrible), and there are still barely any games to play with it. Same with DLSS 2.0: barely any games support it, and 1.0 is crap.
TechPowerUp's game summary puts the 5700 XT at 5% faster than a 2060 Super across all three resolutions. I don't know if I'd call that "a lot more power", unless of course you're talking about the 5700 XT's 35 W higher power consumption.

Having the 2060 Super gives you mesh shaders, VRS, RT and DLSS. Yes, there are relatively few games that use them, but with consoles coming with support for most of these technologies, more games in the future will use RT/mesh/VRS. Further, some developers are looking at using RT for things other than graphics, like sound for example, which shouldn't tax the RT hardware as much as reflections/shadows/etc., so even if you think the 2060 Super's performance in current RT games isn't enough, it might be in those circumstances. Then there are other advantages: the video encoder/decoder is just generally better; RTX Voice, if that's something you're interested in; all the weird Nvidia Experience/shader bullshit that no one probably uses; various VR improvements no one probably uses either, but maybe they do.

So I don't know, I think calling it completely useless is a bit of a stretch. Even if you ignored the massive feature advantages of the Nvidia card, it's a 5% performance difference for roughly the same price; no one is batting an eye at that. All this being said, with both companies on the verge of releasing new cards within the next couple of months, you probably shouldn't be buying either.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
It is so sad from a consumer's perspective. Early adopters of the RTX 20 series got new features and a head start on the RTX 30 series era. RDNA early adopters got only the badge of casual AMD fan and loyal consumer. RDNA 2 will make RDNA 1 go EOL quicker than ever. AMD did it wrong again. "Poor Vega", "poor small Navi". Let's hope big Navi is rich and successful; enough with the technology poverty.
https://forums.guru3d.com/data/avatars/m/282/282392.jpg
karma777police:

Ray tracing is useless until Nvidia releases the 4000 series. Around that time the industry will pick up on it in a more meaningful way. In the meantime you are wasting your money.
Enjoy CP2077 without ray tracing then buddy!
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
Everything depends on how devs will actually work with the API(s)... remember DirectX 10 with Crysis 1... :(;)
https://forums.guru3d.com/data/avatars/m/247/247876.jpg
Why should I care about all these new tiers if Cyberpunk was developed without them? For now I am good with DX 12_1.
data/avatar/default/avatar24.webp
DannyD:

Enjoy CP2077 without ray tracing then buddy!
Proper SSR will look almost as good without the performance hit, and we're not even doing fully ray-traced scenes yet. Perhaps the 3090 will manage 1440p 60 fps with RT in most games, and maybe even 4K, but I wouldn't hold my breath for anything other than the 3090 to run RT smoothly at 1440p, not without the help of DLSS anyway.
data/avatar/default/avatar19.webp
So GeForce will remain at Tier 0.9 for sampler feedback. Not a big deal anyway...
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
craycray:

nVidia has 80% dGPU market share. Did you forget that post? So, I think it could absolutely be generalised when 80% of us are currently not using AMD.
That's... not how that works.
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
mbk1969:

Why should I care for all these new tiers if Cyberpunk was developed without them? For now I am good with DX 12_1.
Until there's been, say, a year or so of compatible hardware on the market, can there even be development directly against 12_2? You need at least a decent segment of the PC market owning hardware that can actually use these features. It's a bit like Windows 7 and the lowest-common-denominator development target: unless developers (and, perhaps more importantly, publishers) start giving PC ports far more time, or non-simultaneous release dates (though that usually also backfires a bit), I don't expect many studios to target both 12_0/12_1 and 12_2 now that there's a hardware divide, much as the features will probably be well received and grow more utilized over time, at least in engines with low-level API support. 🙂

It took years for D3D11 to surpass D3D9, and while D3D12 and Vulkan aren't intended as outright replacements, their usage is also still fairly small; support is improving, engine adoption is getting better, and the new console hardware might push a bit of a shift overall. But with higher GPU pricing, outside of the enthusiast segment I don't see NVIDIA's 3000-series or AMD's 6000-series (?) cards taking hardware market majority that quickly. We might still see 12_2 implemented in part or utilized somewhat in games, though for ray tracing I would assume NVIDIA's RTX implementation will be the more standard one; we'll see. By the time coding for these features is common, we'll probably have a few different generations of 12_2-compatible GPU hardware. (And then Microsoft will go and make something newer, starting it all over again; not that that's a bad thing as such.) It will be interesting to see what this can do, but I don't expect to see the primary benefits, besides the new GPUs being really fast, anytime soon, given that developers target commonly available hardware and OSes.

Plus, with everything currently in development, jamming in 12_2 support for the PC version could go any number of ways, from minor support to outright skipping it. The Xbox Series X does make for an interesting hardware divider here, and in how ports will happen as a result of it having these hardware features and support now. (I don't know much about the PS5, actually; interesting to hear about the GPU differences.) For now at least we're getting what are hopefully nice and fast GPUs, especially if the base cost is going to go up again, but we'll see about that too once the details, and eventually the actual reviews and market availability, show what NVIDIA's and AMD's new cards can do.
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
craycray:

Right.... So how does it work? Please don’t spare any details, enlighten us.
It really shouldn't need to be explained, honestly. Even if that 80% figure were true (it isn't), 80% isn't everybody and never has been; you were just reading what you wanted to hear rather than what was actually stated. You can't make the claim you made, and you can't speak for everyone. Simple math can figure that one out; I really shouldn't need to be explaining this to you...
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
JonasBeckman:

Tier 0.9? That's an interesting little thing for that sampler feedback. EDIT: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html Well, it does something, probably nothing too important. There are already multiple tiers for several of the D3D12_2 features too, interesting. 🙂 (And it seems ray tracing also has incremental ones, like 1.1, going by the Ampere GPU capabilities. Hmm, it probably means nothing much overall.)
Microsoft wanted to go just a bit further in their sampler feedback design than what Turing implemented, but they didn't want to lock Turing out of the 12_2 feature level either.
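For anyone curious how these tiers fit together: feature level 12_2 is essentially a bundle of minimum tiers for the headline features discussed above. Here's a rough sketch of that check as plain logic. The tier constants are modelled on the D3D12 enum values (e.g. `D3D12_RAYTRACING_TIER_1_1` is 11, `D3D12_SAMPLER_FEEDBACK_TIER_0_9` is 90); on real hardware you would query `ID3D12Device::CheckFeatureSupport` rather than hard-code anything, so treat this as an illustration, not the actual runtime check.

```cpp
#include <cassert>

// Hypothetical summary of the per-feature tiers a device must report
// to qualify for D3D12 feature level 12_2. Numeric values mirror the
// D3D12 enums (RAYTRACING_TIER_1_1 == 11, MESH_SHADER_TIER_1 == 10,
// VARIABLE_SHADING_RATE_TIER_2 == 2, SAMPLER_FEEDBACK_TIER_0_9 == 90).
struct DeviceCaps {
    int raytracingTier;      // DXR tier reported by the driver
    int meshShaderTier;      // mesh shader support tier
    int vrsTier;             // variable-rate shading tier
    int samplerFeedbackTier; // sampler feedback tier (0.9 or 1.0)
};

// A device qualifies for 12_2 only if every feature meets its minimum tier;
// this is why Tier 0.9 exists: it sets the bar where Turing already is.
bool SupportsFeatureLevel12_2(const DeviceCaps& caps) {
    return caps.raytracingTier      >= 11   // DXR 1.1
        && caps.meshShaderTier      >= 10   // mesh shaders, tier 1
        && caps.vrsTier             >= 2    // VRS tier 2
        && caps.samplerFeedbackTier >= 90;  // sampler feedback tier 0.9
}
```

The point of the "all minimums must be met" shape is exactly the thread's topic: Turing, Ampere and RDNA2 clear every bar, while RDNA1, Vega and Pascal each miss at least one and therefore stay at 12_0/12_1.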