AMD Fiji XT Photo

This doesn't look like it's an active adapter, and that's a problem for me. I need an active adapter, and those are expensive. Oh, and it must be an active DP-to-DL-DVI adapter, not single-link DVI.
Yup, it needs to be an active adapter, and DL-DVI for higher than 60 Hz.
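For anyone wondering where that 60 Hz cutoff comes from: single-link DVI tops out at a 165 MHz pixel clock, dual-link at roughly double that. A rough Python sketch shows why 1080p at 120 Hz needs dual-link; the blanking overheads here are approximations in the spirit of reduced-blanking timings, not exact standard values:

```python
# Rough pixel-clock estimates: why 120 Hz needs dual-link DVI.
# Blanking overheads (~8% horizontal, ~4% vertical) are assumptions,
# not exact CVT-RB timing math.

SL_DVI_MAX_MHZ = 165.0   # single-link DVI TMDS limit
DL_DVI_MAX_MHZ = 330.0   # dual-link DVI (two TMDS links)

def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.08, v_blank=1.04):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * h_blank * height * v_blank * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 120), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    if clk <= SL_DVI_MAX_MHZ:
        verdict = "single-link is enough"
    elif clk <= DL_DVI_MAX_MHZ:
        verdict = "needs dual-link"
    else:
        verdict = "beyond even dual-link"
    print(f"{w}x{h}@{hz}Hz ~ {clk:.0f} MHz -> {verdict}")
```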
But... but... what about 120 Hz monitors that use dual-link DVI ports???
There is DP... if you have DP, you have DVI / DVI-I; you just have to get an adapter on one or two DP ports to get the equivalent. *edit* Btw, DVI is starting to disappear from Nvidia cards too; it's not an "AMD is crap, it doesn't do DVI" thing... like analog VGA, it's starting to fade away. That's life :P
I must also say I get why there's no DVI, because that's not what this card is designed for. Remember, they will have a water-cooled and an air-cooled Fiji version, so maybe the air-cooled one will come with DVI. They figured that if you're still running DVI with this card, you're doing it wrong. Let's wait and see, and hope they bring out a card with DVI, but it doesn't bother me that much, because given how AMD is doing, and the recent thing about the drivers being way below par, it's a no-brainer for me to go Nvidia. I'm not fully against AMD, because I know they can do better. If only the drivers were on time and they at least had better PR, instead of worrying about this "The Fixer" nonsense, I would be happy to buy an AMD card again.
What "recent thing about the drivers being way below par"...? I'm not familiar with that--I've got ~40 games installed in my current Win10 10122 build at home, from very recently released games to 20-year-old-games +, and I don't have a problem with the Catalysts--they run great in all of those games--don't have single one that simply won't run (or run optimally) because of the current Cats (and I'm running a beta WDDM 2.0 driver from AMD at the moment--the WDDM 1.3 drivers are even better, imo.) But I don't have xFire, either (I used to have it--a pair of 4850's, until I bought a single 1GB 5770 that performed as well as both of them in xFire.) From what I've read, the grass for xFire is no greener at all with SLI, as nV certainly has its own set of problems.
One of my clan mates went from 3-way R9 290 CrossFire to a GTX 980 and is playing on a ROG Swift now, and he told me it was so worth it.
What do you expect him to say?...;) If I'd lost my shirt on a lateral move like that, I'd want to say the same thing--but then, I'd never have done that in the first place...! Heh...;) My driver support from AMD is great at the moment--everything is changing with respect to SLI/xFire with DX12--but I'll admit I don't see things from a multi-gpu perspective anymore, either...
Good for you, man. You haven't played The Witcher yet. No CrossFire support, and even an R9 290X is not enough to get what I want; I need two cards, so you're talking nonsense. Also there's Project CARS, and check how long it took to get Mantle to work in BF4. You're not the only person in the world, OK?
Pretty bloody sad effort that it only has DisplayPort 1.2a... So as we move into the 4K era, AMD releases a top-of-the-line card that does not support already-released monitors with DisplayPort 1.3, which could allow high refresh rates or 3D at 3840×2160... Seems pretty shortsighted.
Fiji's no good for 4K; it's only a 1080p card, as it's only got 4GB of memory.
There is no problem with just using a DP-to-DVI adapter, like this one for example.
The only problem I see with that adapter is that it only supports up to 1200p. So if you wanted to use a higher resolution you would be out of luck.
I am and I will 😀
Wait, this reminds me of somebody.... 😀
Who? 😀
Well, if it's only $600, that would be amazing. So the first rumored $850 isn't true?
If that ROP count is true... I question how well it will do at 4K vs. the GTX 980 Ti.
Idk, looking at those specs it should be beating the 980 Ti by a good margin. And if $600 is true, then even better.
I thought Hilbert said HBM1 could only do 4GB
HBM1 can do 8GB. A bunch of reviewers said that AMD's implementation of HBM1 is limited to 4GB, including Hilbert, Ars, and PC Perspective (who also interviewed AMD, and AMD kind of hinted that it's limited to 4GB). All of AMD's marketing material definitely makes it seem like it's limited to 4GB as well. None of this really made sense to me, so I emailed Ryan Smith from Anandtech about it:
HBM as a spec can scale out to 8 stacks. However whether AMD’s implementation supports that is another matter entirely. AMD is hinting at the fact that this will only be a 4GB card, though that’s something I’m reserving judgement for until the official specs have been announced. Both AMD and NVIDIA are known to engage in misdirection when it serves them, and in either case there’s no rush to judge. The HBM stacks themselves only come in 1GB currently, so you have to go with 8 stacks to get 8GB.
It doesn't make sense to me for AMD to release a 4GB HBM card. I think it would be a huge mistake if they did. I personally won't consider buying it unless it's 8GB. That being said, it's possible that it's limited to 4GB, and if that's the case, well, then I'll be sad.
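Put differently, the stack arithmetic from that reply works out as in this minimal Python sketch, using the commonly reported first-gen HBM figures (1GB and a 1024-bit interface per stack, 1 Gbps per pin); these describe the HBM1 spec as reported, not confirmed Fiji numbers:

```python
# First-gen HBM capacity and bandwidth per stack count, using the
# commonly reported spec figures (assumptions, not official Fiji specs).

GB_PER_STACK = 1           # first-gen HBM stacks come as 1GB each
BUS_BITS_PER_STACK = 1024  # each stack has its own 1024-bit interface
GBPS_PER_PIN = 1.0         # 500 MHz DDR -> 1 Gbps per pin

def hbm_config(stacks):
    capacity_gb = stacks * GB_PER_STACK
    bus_bits = stacks * BUS_BITS_PER_STACK
    bandwidth_gbs = bus_bits * GBPS_PER_PIN / 8  # bits -> bytes
    return capacity_gb, bus_bits, bandwidth_gbs

for stacks in (4, 8):
    cap, bus, bw = hbm_config(stacks)
    print(f"{stacks} stacks: {cap}GB, {bus}-bit bus, {bw:.0f} GB/s")
# 4 stacks: 4GB, 4096-bit bus, 512 GB/s  (the rumored Fiji config)
# 8 stacks: 8GB, 8192-bit bus, 1024 GB/s
```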
Moderator
I thought Hilbert said HBM1 could only do 4GB?
I think there was something else, though, about HBM1 and the memory, as in it can only do stacks in certain configurations, such as 4×1024 or 8×1024 stacks? I'll have to find that slide and info from AMD again.
http://www.hotchips.org/wp-content/uploads/hc_archives/hc26/HC26-11-day1-epub/HC26.11-3-Technology-epub/HC26.11.310-HBM-Bandwidth-Kim-Hynix-Hot%20Chips%20HBM%202014%20v7.pdf
According to SK Hynix (who makes it), it can do 8GB. The problem is that all of AMD's HBM marketing material only shows it as 4 stacks. Plus, AMD's own guy basically went on a little blurb about 4GB of memory:
An obvious concern is the limit of 4GB of memory for the upcoming Fiji GPU – even though AMD didn’t verify that claim for the upcoming release, implementation of HBM today guarantees that will be the case. Is this enough for a high end GPU? After all, both AMD and NVIDIA have been crusading for larger and larger memory capacities including AMD’s 8GB R9 290X offerings released last year. Will gaming suffer on the high end with only 4GB? Macri doesn’t believe so; mainly because of a renewed interest in optimizing frame buffer utilization. Macri admitted that in the past very little effort was put into measuring and improving the utilization of the graphics memory system, calling it “exceedingly poor.” The solution was to just add more memory – it was easy to do and relatively cheap. With HBM that isn’t the case as there is a ceiling of what can be offered this generation. Macri told us that with just a couple of engineers it was easy to find ways to improve utilization and he believes that modern resolutions and gaming engines will not suffer at all from a 4GB graphics memory limit. It will require some finesse from the marketing folks at AMD though…
Which, again, kind of makes me think it's 4GB. Why would Macri go into detail about frame buffer optimization, or even mention it, if it's an 8GB card? Unless of course AMD is intentionally trying to misdirect people. But why would they do that? I know people who have already bought Nvidia cards since finding out it was a 4GB card. All it's doing is hurting them. Makes no sense to me.
Double posting, but I should also point out that the TDP of that card makes no sense to me given the specs. 40% more shaders, 8GB of HBM (which effectively cancels out the power savings from HBM), 28nm, and a 300W TDP? Uh, what is this magic? Sorry, but no ****ing way. Either the TDP is wrong or the specs are off. Honestly, if I were expecting AMD to have a winning card, those are definitely the specs. But the TDP doesn't make sense to me.
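For what it's worth, the memory side of that TDP question can be sanity-checked with the GB/s-per-watt figures from AMD's public HBM slides (10.66 for GDDR5, "35+" for HBM); everything else in this Python sketch is an assumption, not a measurement:

```python
# Rough memory-power comparison at the rumored Fiji bandwidth, using
# AMD's published GB/s-per-watt slide figures. Illustrative only.

TARGET_BANDWIDTH = 512       # GB/s, the rumored Fiji figure
GDDR5_GBS_PER_WATT = 10.66   # AMD slide figure for GDDR5
HBM_GBS_PER_WATT = 35.0      # AMD slide figure ("35+") for HBM

gddr5_watts = TARGET_BANDWIDTH / GDDR5_GBS_PER_WATT  # ~48W
hbm_watts = TARGET_BANDWIDTH / HBM_GBS_PER_WATT      # ~15W

print(f"GDDR5 @ {TARGET_BANDWIDTH} GB/s: ~{gddr5_watts:.0f}W")
print(f"HBM   @ {TARGET_BANDWIDTH} GB/s: ~{hbm_watts:.0f}W")
print(f"Headroom freed for the GPU: ~{gddr5_watts - hbm_watts:.0f}W")
```

By this rough math, HBM frees up something like 30W of board power, which is real but nowhere near enough on its own to explain 40% more shaders in the same 300W envelope, so the skepticism above stands.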
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Don't expect an 8GB version, unless AMD decided to keep both Intel and Nvidia in a fog, not expecting any competition, and then hit them with a sudden blow from both Carrizo and the GPUs. I only want to see how Fiji's 4GB of VRAM utilization compares against the R9 290X's 4GB while running the same settings in benchmarks/games that get close to 3~4GB. (And even if it is the same, that would be enough for me, but I would be pissed at AMD for throwing "improved utilization / buffers" lines at us.)
I actually kind of wonder if AMD will have some special kind of AA available that uses the increased bandwidth to its advantage. That would be neat. But yeah, I'm not really expecting an 8GB card; I just think it should have been 8GB. And I know for most people 4GB is plenty, but I was kind of hoping I could finally go 4K with this card and be happy with it.
Radeon 'Fury' (Rumour)
AMD's upcoming Titan-equivalent flagship Radeon graphics card is rumored to be dubbed "Fury", while the R9 390X is to be based on an enhanced Hawaii. We had previously pointed out that AMD's 300-series naming structure simply would not allow for a 300-series Fiji-based graphics card to exist.
More @ Wccf and Videocardz
4GB isn't enough. If a company releases a top-end card that requires compromises in games that came out before it was released, then that card has failed. GTA V easily surpasses 4GB at 4K. I actually find that many games I try to play require texture compromises at 4K as is, on a GTX 980. Dying Light, Far Cry 4: all of them required me to turn down to High textures to maintain a stutter-free experience at 4K. If I'm paying over 500 dollars for a video card, that's unacceptable, especially for one that's calling itself a 4K card.
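To put rough numbers on the 4K squeeze: the render targets themselves are fairly modest, and it's the high-resolution textures that eat the gigabytes. An illustrative Python sketch, where the buffer counts and texture sizes are assumptions, not figures from any particular game:

```python
# Rough sketch of 4K VRAM pressure. All buffer counts and texture
# sizes below are illustrative assumptions, not game measurements.

def mb(num_bytes):
    return num_bytes / (1024 * 1024)

W, H = 3840, 2160
BPP = 4  # bytes per pixel for an RGBA8 target

# A deferred renderer might keep several full-resolution targets:
targets = {
    "back buffer (x2, double-buffered)": 2 * W * H * BPP,
    "G-buffer (4 RGBA8 targets)":        4 * W * H * BPP,
    "depth/stencil (32-bit)":            W * H * 4,
}
for name, size in targets.items():
    print(f"{name}: {mb(size):.0f} MB")
print(f"render targets total: {mb(sum(targets.values())):.0f} MB")

# vs. textures: one uncompressed 4096x4096 RGBA8 texture with a full
# mip chain (+~1/3) is ~85 MB; hundreds of them is where the GBs go.
tex = 4096 * 4096 * 4 * 4 / 3
print(f"one uncompressed 4K texture with mips: {mb(tex):.0f} MB")
```

The render targets come to a couple hundred MB even in this generous setup, so a game blowing past 4GB at 4K is mostly texture data, which is exactly where the "turn down to High textures" compromise bites.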
Moderator
I don't think this card will be released as a high-end card (it is marketed as one, though); it's more or less a proof of concept for HBM, to show what it can do. At this point they aren't worried about 4K, because everyone knows that 4K resolution requires more than 4GB of VRAM. They're more interested in seeing how HBM pans out with its high bandwidth and clocks.