Unity Adding NVIDIA DLSS Support to Their Game Engine

Good, bring more, that will pressure AMD. Competition = progress.
kapu:

Good, bring more, that will pressure AMD. Competition = progress.
I think they've had enough pressure to show their own image reconstruction technology ever since DLSS 2.0 came out more than a year ago. Adding DLSS to all major engines would ensure fantastic longevity for RTX 20/30 cards. When Pascal owners have to start dropping resolution scaling, Turing owners can use DLSS instead. Frankly this is amazing, I wish my 1070 had it.
cucaulay malkin:

I think they've had enough pressure to show their own image reconstruction technology ever since DLSS 2.0 came out more than a year ago. Adding DLSS to all major engines would ensure fantastic longevity for RTX 20/30 cards. When Pascal owners have to start dropping resolution scaling, Turing owners can use DLSS instead. Frankly this is amazing, I wish my 1070 had it.
Imagine if the 1080 Ti had it. That would be nuts, like the longest high-performing card in history 😀 DLSS 2.0 is not enough pressure. Need wide support. Progress is progress, anything is good.
kapu:

DLSS 2.0 is not enough pressure.
🙄 OF COURSE IT ISN'T. I mean, what kind of AMD owner would like a feature like that?
It's a pity the main AI neural net has to be trained using images of the game that needs it, so it doesn't work so well in other, untrained games or applications. Or does it? Not sure. Sounds like it from what I read.
geogan:

It's a pity the main AI neural net has to be trained using images of the game that needs it, so it doesn't work so well in other, untrained games or applications. Or does it? Not sure. Sounds like it from what I read.
It looks like anything can be reconstructed with DLSS 2.0, but to get satisfactory results you need training.
cucaulay malkin:

🙄 OF COURSE IT ISN'T. I mean, what kind of AMD owner would like a feature like that?
I think any user would like that. For now I don't need it, but in two years the card will age and that would help a lot.
Fox2232:

AMD's version, which is supposedly going to work even on GCN cards, will very likely work on the 1080 Ti too.
If something seems too good to be true, it probably is. At this point AMD still doesn't know which direction they're going with (DirectML or no DirectML?). My two cents: if FSR can be used on all hardware, the image quality will turn out terrible (maybe better than upscaling + CAS, but nowhere near DLSS). There are also other upscaling techniques, like Unreal's Temporal Upsampling, which work out pretty well but still lose to DLSS.
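For anyone wondering what "upscaling + CAS" actually does mechanically, here's a rough single-channel sketch: a naive spatial upscale followed by a contrast-adaptive sharpening pass. To be clear, this is not AMD's FidelityFX CAS shader; the function names, neighbourhood, weights and the sharpness mapping are my own simplifications for illustration. It only shows the core idea (sharpen more where local contrast has headroom, less at already-strong edges), which is also why a purely spatial pass like this can't recover detail the way a temporal reconstructor such as DLSS can.

```python
import numpy as np

def upscale_nearest(img, scale):
    """Naive spatial upscale -- roughly what plain resolution scaling gives you."""
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def cas_like_sharpen(img, sharpness=0.5):
    """Simplified contrast-adaptive sharpening on a single channel in [0, 1].

    Illustrative only: not the FidelityFX implementation.
    """
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]

    mn = np.minimum.reduce([c, n, s, w, e])
    mx = np.maximum.reduce([c, n, s, w, e])

    # Sharpen strongly where the local range has headroom, gently near hard edges.
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))
    wgt = -amount * (0.0625 + 0.125 * sharpness)   # negative ring weight

    out = (c + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt)
    return np.clip(out, 0.0, 1.0)

# Usage: render at half resolution, upscale, then sharpen.
low_res = np.random.rand(540, 960).astype(np.float32)   # stand-in for a rendered frame
output = cas_like_sharpen(upscale_nearest(low_res, 2), sharpness=0.7)
```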
This is the perfect example of how "small" AMD is compared to both Nvidia and Intel: they can afford to create software features alongside their hardware, something that's much more difficult for AMD because they don't possess the same financial resources.
H83:

This is the perfect example of how "small" AMD is compared to both Nvidia and Intel: they can afford to create software features alongside their hardware, something that's much more difficult for AMD because they don't possess the same financial resources.
First of all, it takes a lot of planning in advance. DLSS was just an afterthought: it started with putting tensor cores on Volta, then using them on Turing, then improving them on Ampere to cut the number of tensor cores while still having the same performance available. The 2080 Ti has 544 tensor cores, the 3070 has 180-something. Mobile entry-level cards like the 3050 are going to get it this generation already. It was a long-term investment that will pay off. AMD too often has to whip something up fast, and there is no guarantee they won't have to search for other solutions for later generations.
I almost bought a 5700 XT last year, but I decided to spring for the 2070 Super for a few $ more (could have done a plain 2070 I suppose) because of its RT cores, and I don't regret it.
Noisiv:

Sure it's equivalent, that's why they hit an all-time rock bottom:
Honestly the 6k series is a good product and I think its market share would be better than the 5k series'; I just think that, given the shortage/demand situation, AMD would rather spend its capacity with TSMC on Ryzen dies, which net them way larger profit margins than the GPUs.
After I bought a 3080 last December and saw for myself what DLSS 2.0 can do, I replaced it with a 6800 XT. Good riddance. DLSS is a gimmick! I prefer RAW.
valentyn0:

After I bought a 3080 last December and saw for myself what DLSS 2.0 can do, I replaced it with a 6800 XT. Good riddance. DLSS is a gimmick! I prefer RAW.
Lol
geogan:

It's a pity the main AI neural net has to be trained using images of the game that needs it, so it doesn't work so well in other, untrained games or applications. Or does it? Not sure. Sounds like it from what I read.
DLSS 2 does not use per-game training.
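To expand on that: DLSS 2.x ships one generalized network, trained by Nvidia across many titles, and feeds it the current jittered low-resolution frame, the engine's motion vectors, and the previously accumulated output; a game only has to expose those inputs, not supply training data. The sketch below is not DLSS itself; it is just the classic temporal-reprojection step that such reconstructors (and Unreal's Temporal Upsampling) build on, with function names and array shapes I've made up for illustration, so you can see where the motion vectors and the history frame come in.

```python
import numpy as np

def reproject(history, motion):
    """Fetch each pixel of the previous accumulated frame along its motion vector.

    history: (H, W) float32, last frame's accumulated output
    motion:  (H, W, 2) per-pixel displacement in pixels (dy, dx) since the last
             frame, as an engine would write into its motion-vector buffer
             (shapes are assumptions for this toy example)
    """
    h, w = history.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.round(ys - motion[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - motion[..., 1]).astype(int), 0, w - 1)
    return history[src_y, src_x]

def temporal_accumulate(history, current, motion, alpha=0.1):
    """One TAA-style accumulation step: blend the new jittered sample into the
    reprojected history. DLSS replaces the fixed blend/clamp heuristics here
    with a network trained once across many games -- hence no per-game training."""
    if history is None:
        return current
    return alpha * current + (1.0 - alpha) * reproject(history, motion)

# Usage over a few frames (static camera, random stand-in data); with real
# jittered renders the history converges toward a more detailed image.
history = None
for frame in range(4):
    current = np.random.rand(1080, 1920).astype(np.float32)
    motion = np.zeros((1080, 1920, 2), dtype=np.float32)
    history = temporal_accumulate(history, current, motion)
```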
valentyn0:

After I bought a 3080 last December and saw for myself what DLSS 2.0 can do, I replaced it with a 6800 XT. Good riddance. DLSS is a gimmick! I prefer RAW.
After I bought a 6900 XT last December and saw for myself what CAS can do, I replaced it with a 3090. Good riddance. CAS is a gimmick! I prefer RAW.
Fox2232:

Yeah, ... well, ... no? CAS runs on Nvidia's HW too. You could have seen CAS in action no matter what make of GPU you had, as long as it was capable of starting a game that uses it. (Or inject one of the CAS variants via ReShade.)
Jesus Christ, can't you see sarcasm from a mile away? You really have selective vision, man.
pharma:

Outriders' DLSS does a lot more than just improve performance | Rock Paper Shotgun April 15, 2021
It also adds in extra details missing at native resolution ... Admittedly, at a quick glance, it doesn't look like much has really changed with DLSS Quality enabled, and in motion you'd probably be hard-pushed to notice the difference as well. But if you like to take your time in Outriders once you've finished clearing its rooms of goons and trying on all the different kind of space trousers they've left behind, then it's definitely worth switching on, as I think it not only makes textures look sharper and more defined, but it also adds in extra details you just don't get when playing without it.
Fox2232:

No, sorry. Only false equivalence where it was not appropriate. Anyway, even that text does not qualify as a dictionary-definition expression of sarcasm. And stating that it is a sarcastic expression is kind of confusing in terms of: "What sentiment did you want to express again?" But maybe 'sarcasm' has a different meaning/use in your language. Some things are relative, some are not. Same way as some people notice IQ degradation with DLSS and some do not.
False equivalence? Like how you keep replying to my sarcastic comment and turn a blind eye to the original one? Every time we exchange posts you conveniently ignore things you just don't wanna see.