Starfield PC Version Exclusively Optimized for AMD: No NVIDIA DLSS Support

Valken:

Cannot believe all the Gurus here shilling for upscalers when I thought everyone would be cheering for full metal native GPU rendering! Gonna wait for the GOTY version and RTRT 4K 120 FPS patch!
Have they been brainwashed like the others into thinking non-native performance is where it's at? I care about native rendering; upscalers are a last-ditch resort, not something that should be treated as the de facto standard.
TheDeeGee:

Native rendering clearly means moving towards 750-watt GPUs; do you want that?
Native rendering would be easier if they stopped putting 128-bit or 192-bit buses on GPUs. Your 4070 Ti clearly cannot render higher native resolutions because it's limited in many ways. A 7900 XT with a 320-bit bus and 20 GB of VRAM can do it much better, for instance.
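(As a rough illustration of the bus-width point above: memory bandwidth scales with bus width times effective data rate. The specs plugged in below are the commonly cited ones for these two cards and are an assumption added here, not something stated in the thread.)

```python
# Back-of-the-envelope memory bandwidth: bus width (bits) * data rate (Gbps) / 8 bits-per-byte.
# Card specs below are the commonly cited ones and are assumptions for illustration.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

cards = {
    "RTX 4070 Ti (192-bit, 21 Gbps GDDR6X)": (192, 21.0),
    "RX 7900 XT (320-bit, 20 Gbps GDDR6)": (320, 20.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")

# Prints ~504 GB/s vs ~800 GB/s of raw bandwidth; note that the 4070 Ti's much larger
# L2 cache offsets part of that gap in practice, so raw bandwidth isn't the whole story.
```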
Undying:

Native rendering would be easier if they stopped putting 128-bit or 192-bit buses on GPUs. Your 4070 Ti clearly cannot render higher native resolutions because it's limited in many ways. A 7900 XT with a 320-bit bus and 20 GB of VRAM can do it much better, for instance.
Then why does AMD have FSR?
Fine with me. Seems like DLSS is starting to be used as a shortcut to ignore actual optimization. Anything besides the Quality setting looks bad, like a graphics cheat a cheap console would use. *Not a fan of FSR either. The 5090 is probably three years out, so let's hope FSR isn't required.
Denial:

Then why does AMD have FSR?
You don't have to use it, is my point. The 7900 XT will destroy the 4070 Ti in Starfield, and not only because it's AMD-sponsored, but because of how much memory a modded Skyrim or Fallout can use, and this is a huge game. AMD will make sure Nvidia cards end up very limited. You're basically forced to use DLSS and frame generation, but with AMD you run it native.
JiveTurkey:

Fine with me. Seems like DLSS is starting to be used as a shortcut to ignore actual optimization. Anything besides the Quality setting looks bad, like a graphics cheat a cheap console would use.
On the contrary, DLSS/FSR is used by AMD/NV as an upsell for underpowered (and overpriced) hardware. Unoptimized games are often a product of botched QA and lazy asset work riding on heavy engines like Unreal.
Hilbert Hagedoorn:

This collaboration ensures that the highly anticipated space exploration game is being developed to fully leverage the capabilities of Ryzen CPUs and Radeon 7000 series GPUs, providing enhanced perfor... Starfield PC Version Exclusively Optimized for AMD: No NVIDIA DLSS Support
I don't understand this at all -- doesn't FSR2 take very similar inputs to DLSS? If you're going to bother to implement FSR2, it seems like a no-brainer to plug in DLSS as well, unless I'm missing something. It doesn't seem like it would be a lot of extra work for them, but I could be wrong. I don't think users should have to mod in support for key features like this.
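(To illustrate the "similar inputs" point: FSR2 and DLSS both consume roughly the same per-frame data as a TAA pass, which is why adding the second upscaler is usually considered cheap once the first is wired up. The sketch below is hypothetical; the type and function names are placeholders, not the actual AMD/NVIDIA SDK calls.)

```python
# Hypothetical sketch: both temporal upscalers consume essentially the same per-frame inputs,
# so an engine that gathers them once can dispatch to either backend.
from dataclasses import dataclass
from enum import Enum, auto

class Upscaler(Enum):
    FSR2 = auto()
    DLSS = auto()

@dataclass
class UpscalerInputs:
    color: bytes            # low-resolution color buffer
    depth: bytes            # depth buffer
    motion_vectors: bytes   # per-pixel motion vectors (the same data TAA needs)
    jitter: tuple           # sub-pixel camera jitter offset for this frame
    exposure: float         # scene exposure value

def fsr2_evaluate(inputs: UpscalerInputs, output_res: tuple) -> str:
    # Placeholder: a real integration would call into the FSR2 SDK here.
    return f"FSR2 output at {output_res}"

def dlss_evaluate(inputs: UpscalerInputs, output_res: tuple) -> str:
    # Placeholder: a real integration would call into the DLSS SDK here.
    return f"DLSS output at {output_res}"

def upscale(backend: Upscaler, inputs: UpscalerInputs, output_res: tuple) -> str:
    # One set of inputs, two backends: the per-frame data handed over is the same.
    if backend is Upscaler.FSR2:
        return fsr2_evaluate(inputs, output_res)
    return dlss_evaluate(inputs, output_res)

frame = UpscalerInputs(color=b"", depth=b"", motion_vectors=b"", jitter=(0.25, -0.25), exposure=1.0)
print(upscale(Upscaler.FSR2, frame, (3840, 2160)))
print(upscale(Upscaler.DLSS, frame, (3840, 2160)))
```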
fellix:

On the contrary, DLSS/FSR is used by AMD/NV as an upsell for underpowered (and overpriced) hardware. Unoptimized games are often a product of botched QA and lazy asset work riding on heavy engines like Unreal.
Seems like so many games (especially UE4 games) coming out lately just run horribly on the CPU side of things. UE games in particular often have poor thread scaling and horrendous asset streaming/traversal hitching, and some have shader compilation stutter on top of that. I don't understand why publishers/devs deem that kind of thing acceptable. I'd even take a return to RAGE-style aggressive asset fade-in if it meant zero stuttering and better 1% lows. I love graphics, but as I've said a hundred times, consistent smooth performance matters more -- to me at least. Zero stuttering would be a top priority if it were up to me.
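(As a side note on the "1% lows" mentioned above, one common way to compute the figure from a frametime capture is sketched below; the frametime numbers are made up for illustration.)

```python
# 1% low FPS: one common definition is the average FPS across the slowest 1% of frames.
# The capture below is fabricated: mostly 60 FPS frames with a few 30 FPS stutter spikes.
frametimes_ms = [16.7] * 990 + [33.3] * 10

def one_percent_low_fps(frametimes):
    worst = sorted(frametimes, reverse=True)      # slowest frames first
    count = max(1, len(worst) // 100)             # the slowest 1% of the capture
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"average: {avg_fps:.1f} FPS, 1% low: {one_percent_low_fps(frametimes_ms):.1f} FPS")
# ~59 FPS average vs ~30 FPS 1% low: a handful of long frames barely move the average,
# which is why stutter shows up in the lows rather than in the average FPS number.
```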
Undying:

Native rendering would be easier if they stopped putting 128-bit or 192-bit buses on GPUs. Your 4070 Ti clearly cannot render higher native resolutions because it's limited in many ways. A 7900 XT with a 320-bit bus and 20 GB of VRAM can do it much better, for instance.
It will run 1440p just fine, what are you talking about?
emperorsfist:

Great, so it's "optimised" to use the shittiest of the mostly shitty upscalers. Lovely. Does that mean they know the main game is going to run poorly out the door?
Maybe my mind is playing tricks on me, but at least on my monitor some checkerboard-rendering upscales (PS4 Pro) look noticeably better to me than FSR2. I'm happy to have FSR2 as an option, and I'm also happy FSR1 exists, as it's at least better than the old bilinear/bicubic scalers. But if you're going to bother putting in FSR2, you might as well plug in DLSS too, as IIRC the inputs are quite similar (motion vectors and the other inputs required for TAA).
There have been Nvidia-sponsored games with and without FSR, and there have been AMD-sponsored games with and without DLSS too. BOTH companies are guilty of paying for one to be used over the other. DLSS requires more work to integrate; FSR doesn't. And FSR supports both consoles AND nearly every single GPU released in the last 10 years, while DLSS doesn't. Why spend more time implementing DLSS when only people with RTX cards can use it? FSR can be used by nearly everyone, including consoles. Yes, DLSS is the better option in terms of IQ, but not everyone can use it, and having DLSS become the de facto solution means you are in favour of a monopoly, seeing as it's a closed-off tech. Wanting FSR to win and improve over time is the better choice, as more people have the option to use it regardless of their GPU brand. Regardless of AMD's money bags, I think FSR is going to become more prevalent than DLSS. Nvidia NEEDS to pay companies to include DLSS, whereas AMD doesn't, seeing as FSR's code is open source. I think this is another case similar to FreeSync vs. G-Sync, where the open model won. If AMD takes FSR3 closed source, I will not defend it.
TheDeeGee:

It will run 1440p just fine, what are you talking about?
Clearly, when Nvidia users complain about a game not having Nvidia features, they're not sure how it will run.
GeniusPr0:

The solution to no DLSS is a 4090. Time to pick up extra shifts at McDonald's.
Yip, 95% of the negativity posted on the forums has no relevance to me, ha ha ha. #Go4090OrGoHome
rl66:

DLSS, XeSS, FSR2... all of that is crap meant to make you feel like you have a powerful GPU instead of actually having a powerful GPU... It's a mirage. For now, GPUs (whatever the brand) can only handle basic RT; those tricks only make high FPS possible with dramatic degradation of image quality. But I agree that most people play on bad-quality screens, so something a bit worse can't be noticed there (lol).
Personally I don't mind, so long as the resulting image quality is close enough to true native + TAA to justify the performance uplift. If it weren't for image reconstruction, RT would not have been feasible for a few more generations. I know that for people with lower-end GPUs, FSR1 and 2 have been very useful for running more recent games at an acceptable level. We don't have to use them, after all; they're just nice options for users to have. FSR1 and 2 aren't really good enough for me, and I wouldn't use them unless I was very performance-constrained, but modern DLSS in its Quality mode looks very good to my eye (and stable, with very little shimmer in motion) compared to native + TAA at a 1440p output. Older versions of DLSS (especially those prior to 2.0), though, looked pretty terrible imo.
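(For reference on the Quality mode mentioned above: DLSS and FSR2 expose similar presets, with commonly documented per-axis render-scale factors of roughly 0.67 for Quality, 0.58 for Balanced and 0.5 for Performance. The sketch below just works out what those factors mean at a 1440p output; the exact factors are an assumption here, not something stated in this thread.)

```python
# Internal render resolution per upscaler preset, using the commonly documented
# per-axis scale factors (approximate; treated as assumptions here).
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
output_w, output_h = 2560, 1440  # 1440p output, as in the post above

for name, scale in presets.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name}: renders ~{w}x{h}, reconstructed to {output_w}x{output_h}")

# Quality at 1440p renders at roughly 1707x960, which is why it can look close to
# native + TAA while costing noticeably less GPU time per frame.
```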
Undying:

Clearly, when Nvidia users complain about a game not having Nvidia features, they're not sure how it will run.
We all know how it will run. It's an AAA game; it will be a s**t show regardless of what it's optimized for. On console it runs at 30 FPS with FSR, and that says enough, I think.
BLEH!:

Both consoles run on AMD hardware, so it makes sense.
I wonder if this is part of the reason why AMD GPUs sometimes have an advantage when CPU-bound on PC. Maybe it has something to do with bespoke console optimizations carrying over, or a lack of focus on Nvidia-specific optimization work? I am probably completely wrong about this. It could be due to who knows what; some say it's a scheduler difference, others say it's a driver thing, and others still say it's fabricated and totally game-dependent. What do I know?
Good news for my Steam Deck; I probably won't need FSR on my main rig.
Kool64:

Good news for my Steam Deck; I probably won't need FSR on my main rig.
Isn't the Deck half as powerful as an Xbox?
TheDeeGee:

Isn't the Deck half as powerful as an Xbox?
The screen is 720p, so half the pixels with half the power. Also, I'm in awe that the same engine is returning for another round. Considering Skyrim runs on everything, including the Switch, with decent performance, it might not be an impossible wish that this game will be playable with decent image quality across the board.
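(A quick arithmetic check on the "half the pixels" point; the Deck's actual panel resolution of 1280x800 is an added detail here, not something from the post above.)

```python
# Pixel counts behind the "1/2 the pixels with 1/2 the power" argument,
# taking 1080p as the console rendering target for comparison.
resolutions = {
    "Steam Deck panel (1280x800)": (1280, 800),
    "720p (1280x720)": (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
}
baseline = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} px ({w * h / baseline:.0%} of 1080p)")

# The Deck's 1280x800 panel is ~49% of 1080p's pixel count, so "half the pixels"
# holds up roughly, assuming the comparison point is a 1080p console target.
```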
BlindBison:

Seems like so many games (especially UE4 games) coming out lately just run horribly on the CPU side of things. UE games in particular often have poor thread scaling and horrendous asset streaming/traversal hitching, and some have shader compilation stutter on top of that. I don't understand why publishers/devs deem that kind of thing acceptable. I'd even take a return to RAGE-style aggressive asset fade-in if it meant zero stuttering and better 1% lows. I love graphics, but as I've said a hundred times, consistent smooth performance matters more -- to me at least. Zero stuttering would be a top priority if it were up to me.
A lot of poorly coded games. It's not always the dev team's fault, as sometimes they aren't given enough resources to finish the game properly.