Battlefield V: GeForce RTX 2080 Ti Rotterdam Gameplay

warlord:

Well, if I were Nvidia, providing the overall best and most innovative product, I would perhaps do the same as they do. That doesn't mean it is an ethical or proper thing to do.
Again I agree with you, except that business is in no way ethical. Ethics always cost money, and no sane company pays that cost unless the PR gains earn it back. That's what people don't get... a well-run company doesn't need to be seen as "the good guys", and being seen as the good guys doesn't miraculously make you earn more. We all know why this is an issue: lack of competition is what makes €1300 GPUs happen and keeps €500 quad-cores the standard for years. We need more competition.
warlord:

Nobody cares about innovation, only price. It is all about money. Always. You can't sell a product that's 50% faster for 100% more than the previous model's price. You shouldn't do it. Period.
Yes, no one cares about innovation, that's why new technology keeps coming out and is extremely high priced for the first few (to many) years... because no one cares or wants anything new, nothing innovative, we all like the same thing over and over again. We're all stuck on VHS tapes and gigantic cell phones... no wait, heck, we don't even have cell phones! We're on landlines, because obviously innovation = bad and no one wants that... totally makes sense.

And define 50% faster? Or are you deciding to gauge the speed of the GPU by one metric rather than all metrics? If the ray tracing is, what was it, 6 times the speed of previous generations, does that not count for anything? No, I'm sure to you it doesn't, because you don't care about innovation...

Also, $699 x 2 = just shy of $1400, while the RTX 2080 Ti MSRP (non-FE, and regardless of other companies' price gouging) is $999. So I'm not sure how you got "100% more" when it's about 43% more expensive, not 100%. And I'm sure you're going to come back with "Well, where can you find the RTX 2080 Ti for $999? Yeah! You can't!", and really, I don't care. If manufacturers do not abide by or offer any products at MSRP, that's not Nvidia's fault. But whether the MSRP is respected remains to be seen within the first two months or so after release.
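For what it's worth, here is a quick sanity check of the percentage math, using only the MSRPs quoted in the post above; just a sketch, and the prices are the poster's figures, not official confirmation:

```python
# Quick sanity check of the percentage math above, using the MSRPs quoted
# in the post ($699 previous flagship, $999 RTX 2080 Ti non-FE).
prev_msrp = 699
new_msrp = 999

increase_pct = (new_msrp - prev_msrp) / prev_msrp * 100
print(f"Price increase: {increase_pct:.0f}%")        # ~43%, not 100%

# A "100% more" price would instead mean doubling it:
print(f"100% more would be ${prev_msrp * 2}")        # $1398, "just shy of $1400"
```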
I am excited for it since I'm mainly an SP/co-op player at a slower pace; I would notice all the fidelity. It's still a bit of a bummer that it's mainly for specular surfaces, but still not bad. The video did say they were getting sub-30 FPS at 4K with RT on, on a beta build. That is not bad for RT 1.0. But my mantra has been: never buy 1.0. Wait for 1.1 or 2.0 for better value. Unless AMD has some secret RT tech in development for their next-gen GPU, or if ARMA 5 gets RT!!! That would blow my socks off, since they use a full 3D first-person/third-person model... and 30 FPS in ARMA is relatively smooth already...
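As a very rough sketch of why 4K with RT is so much heavier than the 1080p figures discussed later in the thread, here is a naive pixel-count comparison; the assumption that RT cost scales with pixel count is mine and is only a crude approximation:

```python
# Naive pixel-count comparison between 4K and 1080p. Assumption (crude):
# RT cost scales with the number of pixels shaded, which it won't exactly.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
ratio = pixels_4k / pixels_1080p
print(f"4K has {ratio:.0f}x the pixels of 1080p")              # 4x

fps_at_4k = 30   # the "sub 30 FPS @ 4K, RT on" beta figure from the video
print(f"Naive 1080p estimate: {fps_at_4k * ratio:.0f} fps")    # ~120 fps
# DICE's actual stated target is 60 fps at 1080p, so the real cost clearly
# does not shrink in proportion to pixel count alone.
```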
Just have to wait for the reviews on 4K gaming on these cards.
macdaddy:

Just have to wait for the reviews on 4K gaming on these cards.
Probably the most sensible thought to have been uttered on these forums ever.
I think the primary reason GPU PhysX isn't used is that the CPU is simply fast enough for the level of physics games tend to implement. More complex systems just clutter the screen uselessly, can't be used for gameplay-oriented stuff in multiplayer, and obviously the fact that it's Nvidia-hardware-only is bad. PhysX itself is fine, though... it's used in thousands of games.
Two things:

1. Gfx: 1080p with RTX is a more realistic image than 4K without RTX; as someone said before me, photorealistic 1080p hasn't been achieved yet. I would prefer photorealistic 1080p over 4K with lower-quality graphics. Don't forget most PC gamers sit close to a monitor, so does the eye prefer more resolution or more photorealism?

2. It looks nice, but simple things are missing, meaning photorealism may be a step closer but the environment isn't. Check at 30 seconds, when the StG sprays the wall: very few effects. Imagine shooting a wall at that range, you'd have dust, brick fragments and a good chance of putting your eye out 🙂
I don't really understand all this arguing about tech we don't yet fully understand. We have no real data on how these cards perform. Are they a leap forward or are they just a small step? Well, with no good numbers from any trusted sources to compare, this all seems moot.
0blivious:

I don't really understand all this arguing about tech we don't yet fully understand. We have no real data on how these cards perform. Are they a leap forward or are they just a small step? Well, with no good numbers from any trusted sources to compare, this all seems moot.
Being able to do ray tracing in real time is a big step forward; ray tracing has been around forever, though only used for pre-rendered CG still images.
0blivious:

I don't really understand all this arguing about tech we don't yet fully understand. We have no real data on how these cards perform. Are they a leap forward or are they just a small step? Well, with no good numbers from any trusted sources to compare, this all seems moot.
True, but some aspects of the Turing launch invite intense speculation (or suspicion) and, naturally, the loud chatter that comes with it. Nvidia's focus on RT with these cards, to the exclusion of regular performance, seems like a big red flag. If regular performance is mediocre (relative to the big gains of Pascal and Maxwell), which may cause disappointment or negativity, then what is a marketing dept to do? Easy: make the entire focus a new feature and declare it the next big revolution in graphics. Which it may be, but it may also, unfortunately, be better demonstrated with more capable future hardware. Don't do it half-assed or half-baked!

People paying $1000 for a GPU do not want to hear of less than 60 fps at 1080p, especially when most of them are likely already on high-res monitors. And what are RTX 2070 owners supposed to do with RT games? Disable it?! That's why we can't shut up and act as if nothing silly is going on. Yes, it all seems 'moot' without reviews, but with all the time they've had to tweak and tune their demos, if that's the best they can come up with on a major GPU launch, then they either handled it badly, or RT is more over-hyped than it should be if the cards can't handle it with adequate performance.
warlord:

Innovation should come with somewhat realistic pricing for innovative products. If you really like overpriced products to justify someone's R&D costs, that's your call. I assume we cannot call it proper or ethical. It comes back to money and not technological progress at all.
Again, you're acting like this is a new thing. Look at anything that has been a leap in innovation/technology/etc., and it comes to the same thing. VHS, DVD, Blu-ray: all were massively expensive when they initially came out compared to previous technology. Flat-screen technology (LCD), 720p/1080p displays, 4K, 8K, OLED, LED, plasma: all were massively expensive technologies when they first came out (8K and OLED still are, obviously). R&D plus the actual cost of producing new, advanced technology is far higher than you are giving credit for.

If the numbers match up and ray tracing is 6 times faster on RTX cards than on previous generations, that is a MASSIVE gain, and we should all be ecstatic that we are FINALLY getting to the point where ray tracing is a possibility, because it'll only get better from here. This is one of the biggest leaps we have had in YEARS.
Aura89:

Again, you're acting like this is a new thing. Look at anything that has been a leap in innovation/technology/etc., and it comes to the same thing. VHS, DVD, Blu-ray: all were massively expensive when they initially came out compared to previous technology. Flat-screen technology (LCD), 720p/1080p displays, 4K, 8K, OLED, LED, plasma: all were massively expensive technologies when they first came out (8K and OLED still are, obviously). R&D plus the actual cost of producing new, advanced technology is far higher than you are giving credit for.
But these were practical working technologies, not bound by certain conditions or limitations for them to work properly.
Aura89:

If the numbers match up and ray tracing is 6 times faster on RTX cards than on previous generations, that is a MASSIVE gain, and we should all be ecstatic that we are FINALLY getting to the point where ray tracing is a possibility, because it'll only get better from here. This is one of the biggest leaps we have had in YEARS.
6 times faster, but you still can't play it at anything beyond 1080p 60 Hz... yes, we should be 'ecstatic'. 🙄
Am I pleased ray tracing is here - hell yes! Is it ready yet - not so much!
People have just forgotten how much new features typically drag down performance. This feels similar to when GPU PhysX destroyed performance unless you got a second card for it, or when tessellation was the first thing you turned off on the HD 5870. I admit that there are some good showcases of how raytracing genuinely can improve the game, but until we get next-gen consoles possibly supporting it, I still don't see it as more than how GPU PhysX was.
alanm:

6 times faster, but you still can't play it at anything beyond 1080p 60 Hz... yes, we should be 'ecstatic'. 🙄
No one is being forced to use the ray tracing in games. Would you rather the effort toward ray tracing just never be made? Or do you have unrealistic expectations that ray tracing should instantaneously be 20 times faster than previous generations...
alanm:

But these were practical working technologies, not bound by certain conditions or limitations for them to work properly.
Not really sure what you mean by this. Practical working technologies? VHS movies were once 100 dollars just for the movies... let alone the player, and that list can go on. And not bound by certain working conditions or limitations? Again, what? You had to have all-new equipment, except maybe the actual TV, and even then that's not necessarily true when it comes to requiring new video/audio inputs to take advantage of the new technology, or at least a converter that would destroy the advantages anyway. And it's not like competing technologies worked with each other either. You couldn't play Betamax on VHS, or Blu-ray on HD DVD, etc. When it comes to other types of technology, proprietary IS the game.
Yxskaft:

People have just forgotten how much new features typically drag down performance. This feels similar to when GPU PhysX destroyed performance unless you got a second card for it, or when tessellation was the first thing you turned off on the HD 5870. I admit that there are some good showcases of how raytracing genuinely can improve the game, but until we get next-gen consoles possibly supporting it, I still don't see it as more than how GPU PhysX was.
Actually, that's a good point. I'd be curious to see just how much a second card in SLI would improve the ray tracing capability and overall framerates. For example, in Battlefield V they're targeting 60 fps at 1080p with ray tracing enabled; I wonder if, with two 2080 Tis in SLI, you'd then get close to double the performance, i.e. 120 fps - now that's starting to get more respectable! Although that would mean paying for two RTX 2080 Tis!! But it would be interesting to see if the ray tracing side of SLI scales the same as 'normal' game rendering without ray tracing enabled.
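A tiny sketch of that scaling question, using the 60 fps / 1080p target mentioned above as the baseline; the scaling efficiencies are hypothetical placeholders, since nobody has SLI numbers for these cards yet:

```python
# Hypothetical SLI scaling for the 60 fps / 1080p RT target mentioned above.
# The efficiency values below are illustrative placeholders, not benchmarks.
baseline_fps = 60   # DICE's stated 1080p target with ray tracing enabled

for efficiency in (1.0, 0.9, 0.7, 0.5):
    fps = baseline_fps * (1 + efficiency)
    print(f"{efficiency:.0%} SLI scaling -> {fps:.0f} fps")
# Perfect (100%) scaling gives 120 fps; typical AFR scaling lands lower, and
# how the ray tracing portion itself scales across two GPUs is unknown.
```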
The game's graphics look amazing even without RT. So unless RT becomes so good that it's a huge difference, I suspect most people won't care about it, especially with the perf hit.
Robbo9999:

Actually, that's a good point. I'd be curious to see just how much a second card in SLI would improve the ray tracing capability and overall framerates. For example, in Battlefield V they're targeting 60 fps at 1080p with ray tracing enabled; I wonder if, with two 2080 Tis in SLI, you'd then get close to double the performance, i.e. 120 fps - now that's starting to get more respectable! Although that would mean paying for two RTX 2080 Tis!! But it would be interesting to see if the ray tracing side of SLI scales the same as 'normal' game rendering without ray tracing enabled.
It'll be interesting to see how SLI does on these cards in general, given the NVLink bridge.
Aura89:

It'll be interesting to see how SLI does on these cards in general, given the NVLink bridge.
Well, apparently NVLink will still just work as normal SLI, only with increased bandwidth. So there shouldn't be too much of a difference, with the exception of TAA perhaps working with SLI on the 2000 series, due to the increased bandwidth.
I find the hyping of "RTX" interesting, since RTX was talked about months ago with little interest. The Titan V also supports RTX as defined by Nvidia, so these cards don't actually have anything new over Volta as far as feature set is concerned. Ray tracing also isn't impossible to do on "regular" video cards. Nvidia claims they can do 10 gigarays/s which, while impressive, probably does not require specialized hardware to reach "comparable" numbers; for example, it has been highlighted here (https://www.reddit.com/r/Amd/comments/9bj93j/technical_informationpaper_for_raytracing_test/) that a 290X could do 4.4 gigarays/s, which, while less than half the speed, suggests it may not be completely out of reach for something like a 1080 Ti and possibly Vega.
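Taking the figures in that post at face value (Nvidia's claimed 10 gigarays/s and the ~4.4 gigarays/s reported for the 290X in the linked thread, which may not have been measured the same way), here is a quick ratio check plus a rays-per-pixel budget at the 1080p/60 target discussed earlier:

```python
# Ratio check on the gigaray figures quoted above. 10 Grays/s is Nvidia's
# marketing claim for the RTX 2080 Ti; 4.4 Grays/s is the 290X figure from
# the linked thread. Whether they were measured the same way is unknown.
turing_grays = 10e9
r9_290x_grays = 4.4e9
print(f"290X figure is {r9_290x_grays / turing_grays:.0%} of the Turing claim")  # 44%

# Rays-per-pixel budget if all 10 Grays/s went into one 1080p frame at 60 fps:
pixels = 1920 * 1080
fps = 60
rays_per_pixel = turing_grays / (pixels * fps)
print(f"~{rays_per_pixel:.0f} rays per pixel per frame")   # roughly 80
```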