Sapphire Releases Custom RX Vega 64 and RX Vega 56 Nitro+

Vega 56 with three power connectors? I reckon it comes with a built-in laser cannon to use all that power.
QBI:

Sadly not FPS. Still pales in comparison to 1080Ti even at high resolutions.
Yes, but they're in the 1070 (Ti or not) tier...
£750 for the Nitro+ 64 here in the UK, when you can get the much faster 1080 Ti cheaper, and the 56 is £560, which is more than the faster 1080. Ridiculous pricing.
Very impressive noise reduction - massive, really - in comparison to reference Vega, and that's while increasing performance too!
QBI:

Sadly not FPS. Still pales in comparison to 1080Ti even at high resolutions.
Did you seriously try to compare the Vega 56 to a 1080 Ti? That is so ignorant that you are either a moronic nvidiot fanboy or just not smart at all... Good thing about the 3 PCIe connectors: miner assholes might leave this card alone so we can actually buy them. Did people forget the EVGA FTW 580 3GB had 3x PCIe, and there are more recent Nvidia cards that do too? I swear, people, for whatever damn reason, follow the sheeple trend to hate AMD for any reason possible. For anyone who wants AMD or has FreeSync, even on a Fury X, the Vega 56 is a nice upgrade. People act like it is the HD 2900 launch or something. Guess people want 2006-2008 all over again; that was an absolutely horrible GPU time frame, and nvidiot fanboys were bad back then, before social media and GameWorks. The prices of these cards are really not AMD's or the AIBs' fault. I even have a feeling they are pricing them high initially so these miner c****s don't buy them. This is way worse than the 290X launch and the rape of the HD 7950/70s for mining as well.
Agonist:

Did you seriously try to compare the Vega 56 to a 1080 Ti? That is so ignorant that you are either a moronic nvidiot fanboy or just not smart at all...
They are comparable in price, so it's comparable. The Nitro is 580 euros, more than I paid for my Ti.
sammarbella:

Well, as Agent-A01 said, you can compare them by a reasonable factor: price. People buying this GPU (at this price) must be ignorant, or AMD fanboys, or just not smart at all, or simply chose to tie themselves to AMD GPUs by buying a FreeSync monitor because it costs less than G-Sync ones. People "chained" to FreeSync monitors can "enjoy" the lower price they paid compared to a G-Sync monitor, playing at fewer FPS with a more expensive GPU. (!?!) I'll wait until HDMI 2.1-enabled monitors and TVs are released to buy a new monitor AND GPU. Adaptive sync enforced in the standard will end this FreeSync/G-Sync war. Hopefully by Q3 2018.
Sadly, nothing will ever be standard if Nvidia has anything to say about it. They want everything to be their standard.
sammarbella:

HDMI 2.1 makes variable refresh rate mandatory in the standard: https://www.anandtech.com/show/12095/hdmi-21-specification-released It's not some optional HDMI extension that only AMD FreeSync-labeled monitors use right now.
Well, I am very glad to hear this, but it still leaves doubts about Nvidia. I still don't wanna support Nvidia, though. It's not fanboyism; I've just hated Nvidia the past few years. The tragedy I experienced trying to use 3x 21:9 monitors with Nvidia on Win 10 x64 was not fun. Everything worked right out of the box when I sold off my GTX 970 SLI setup and got 2x R9 290s. Nvidia still sucks when using different monitors for Surround and using DVI-D on Win 10, and even Win 7 x64, with 21:9. Ironically, it works with no issues on my HD 5770 CrossFire setup, even on Win 7 x64.
sammarbella:

I have no doubt Nvidia will do its best to sell more GPUs to GAMERS and get its share from G-Sync monitors sold to GAMERS. When HDMI 2.1 monitors become available it will obviously benefit Nvidia more than AMD: free sync tech for all gamers will expose even more the lack of performance per dollar of AMD GPUs in a PC gaming setup. AMD's driver team puts extra effort into niche features and mining support; Nvidia's drivers provide the performance and the features that the majority of its GAMER customers need. Take a look at the new AMD drivers: a Connect tab and fancy backgrounds 2.0. It's entirely AMD's fault that Vega can't compete in performance AND/OR price with Nvidia's Pascal (56 vs 1070 Ti, 64 vs 1080), being a year late to the party. Blind tests comparing the gaming experience of a cheaper FreeSync monitor with an AMD Vega vs a G-Sync monitor with a Pascal, both locked at 60 Hz (or 100 Hz), are not going to increase AMD's gaming GPU market share; only competitive performance AND prices will. We need competition (performance AND price) from AMD gaming GPUs, not PR stunts, fancy connectors/backgrounds, and mining love.
Well, as a FreeSync user, it's far better than no FreeSync or G-Sync; I've used both. And I have turned FreeSync off and just used Enhanced Sync: it's not as good. You can ramble on all you want, but the drivers are on point with AMD. It's not just social media crap they add. The drivers are rather easy to navigate compared to the old ones, and are very fast and responsive. I cringe when I need to open NCP and set things on my server. PR stunts are something AMD does need, sadly. There are people who blindly believe, due to forums and fanboy friends, that AMD is complete garbage for gaming, when it is far from the case. GameWorks did wonders to fuel that nonsense. The fact that you can toggle iChill, the frame rate cap, and even FreeSync in-game now with the overlay is just bitching. Period. Nvidia cannot do that kind of crap with GeForce Experience, which I use on my server. I fully agree that AMD needs to bring better GPUs to market, but the Nvidia fanboys and miners are ruining that for AMD users. I do agree that at times AMD shoots themselves in the foot; saying the Fury X was an overclocker's dream was very stupid. But once the drivers got right, the Fury X ripped. I still miss mine. I prefer AMD due to their drivers as of the past few years. NCP is a damn joke; even on a 960 Evo it's rather slow. The fact that I need NCP and GeForce Experience to do the same as AMD's drivers says a lot to me. Other than game streaming to a device, AMD isn't really lacking much beyond raw GPU power now. But if people don't actually support them beyond miners, they can't deliver either, can they?
sammarbella:

Miners and gamers look for the same thing: performance. Miners "support" AMD because they get mining performance. They don't buy more expensive AMD GPUs because they love the brand. 😀 When gamers can get gaming performance from AMD GPUs they will "support" it too.
They don't just look for performance, but value as well. Most Nvidia fans are obsessed with raw performance numbers, even ones with 1080p 60 Hz screens. The obsession with Nvidia dropping stupidly expensive Titan cards, and how many people buy them and how popular they are, shows this. Honestly, most AMD users look for value, and once again you, like many, fail to remember this forum is not the average user base of gamers; we're all high end for the most part. AMD will never do anything right in the GPU market, even when they do; it's a trendy thing to always hate on AMD. They pushed really hard to bring something good but ended up short with Vega. It shows, but I think things would be different if gamers could actually buy the damn cards. It's terrible in the US to even try to get one before they are gone in minutes and assholes put them on eBay for stupidly high prices. AMD does well in the mid range, the RX 470/RX 480 bracket; Nvidia still sucks there, always bringing crippled cards in that area. AMD did fail with the RX 460/560 in my opinion. I had an unlocked RX 460 4GB and it was kinda slow for its price point vs the GTX 1050 Ti.
Man, I wonder if having three 8-pin power connectors will have an effect on how well the card overclocks, by having extra power available. Anyway, it took the AIB manufacturers a good while to release these cards, which kind of reminds me of the R9 290/290X launch; it took the AIB manufacturers a good while to release those cards as well, which is another reason why AMD is suffering with Vega, while performance is a different story entirely with the Vega cards. One can only imagine what the TDP is going to be with three 8-pin power connectors.
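To put rough numbers on what a third 8-pin theoretically buys (a back-of-the-envelope sketch of my own, using only the PCIe spec ratings of 150 W per 8-pin connector and 75 W from the x16 slot; actual draw is capped by the card's BIOS power limit, as the reply below notes):

```python
# Back-of-the-envelope only: theoretical board power ceiling implied by
# the PCIe spec connector ratings, not the card's actual TDP.

EIGHT_PIN_W = 150  # W, spec rating of one 8-pin PEG connector
SLOT_W = 75        # W, spec rating of the PCIe x16 slot

def power_ceiling(num_eight_pin: int) -> int:
    """Theoretical max board power (W) for a given 8-pin connector count."""
    return num_eight_pin * EIGHT_PIN_W + SLOT_W

print(power_ceiling(2))  # reference Vega 64: 2x 8-pin -> 375 W
print(power_ceiling(3))  # Nitro+ with a third 8-pin -> 525 W
```

So the third connector raises the theoretical ceiling to 525 W; how much of that headroom overclocking can actually use depends on the BIOS power limit, as discussed next.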
Fender178:

Man, I wonder if having three 8-pin power connectors will have an effect on how well the card overclocks, by having extra power available. Anyway, it took the AIB manufacturers a good while to release these cards, which kind of reminds me of the R9 290/290X launch; it took the AIB manufacturers a good while to release those cards as well, which is another reason why AMD is suffering with Vega, while performance is a different story entirely with the Vega cards. One can only imagine what the TDP is going to be with three 8-pin power connectors.
In theory: yes... but since both AMD and Nvidia put a limit in their BIOS: no. For the past 2 or 3 years, with a few exceptions, even the less expensive non-custom PCBs can reach more or less the same frequency as the top custom ones, given a well-ventilated chassis. But I would still get them... for the looks and the silence.
rl66:

In theory: yes... but since both AMD and Nvidia put a limit in their BIOS: no. For the past 2 or 3 years, with a few exceptions, even the less expensive non-custom PCBs can reach more or less the same frequency as the top custom ones, given a well-ventilated chassis. But I would still get them... for the looks and the silence.
That is true. But maybe with a custom BIOS, if someone can make one, it could get rid of that limit. Or have a switch, just like this card has, with an OC mode that increases the power limit. My GTX 1070 has such a feature: it can increase the power limit a tad via the dual-BIOS switch. Yeah, the AIB cards are quieter than the reference cards and look a lot better too.
"They run much softer and lower in temperatures. Sapphire uses a three fan cooler, the middle one weirdly enough turning in the opposite direction compared to the other two?" That actually makes sense because where the fans border, air does not only carry an axial and radial velocity, but also a tangential velocity. That means, if they'd be spinning the same direction, air in-between the fans would 'collide' coming form opposite directions. That would create turbulence an less cooling efficiency, and therefore unwanted noise and higher temps. Looks pretty sick.