Radeon Technologies Group - January 2016 - AMD Polaris Architecture
vbetts
Moderator
So then Polaris is a new GCN based architecture?
ruiner13
Guessing much of that power savings is due to the 14nm lithography. Once nVidia shrinks their dies and gets HBM2, they will also have substantial power savings. Hilbert, I do appreciate you calling out the apples-to-oranges comparison they did against the nVidia 950... "our 2016 models are better than the competition's 2015 models (which aren't substantially changed from the 2014 models)!" I should hope so!
Denial
schmidtbag
Twiddles
System used: i7-4790K 4x4GB DDR4. Wait whut?!
Source: http://content.hwigroup.net/images/articles/AMD%20Polaris%205.jpg
Noisiv
http://blog.neweggbusiness.com/news/best-selling-video-cards-of-2015/
And getting murdered is what you forgot to mention.
Having to use the big-small Tonga die and the feature-less Pitcairn to fight the mid-sized GM206 is anything but GOOD ENOUGH.
But it's actually good compared to the state of AMD's high-end GPU market share.
[spoiler]
Top Five Video Cards Ranked by Sales (Total Revenue)
EVGA GeForce GTX 970 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
PNY Quadro K4200 4 GB PCI-e 2.0 x16 Workstation Video Card
EVGA GeForce GTX TITAN X 12 GB PCI-e 3.0 Superclocked Video Card
EVGA GeForce GTX 980 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
EVGA GeForce GTX 980 Ti 6 GB PCI-e 3.0 x16 SC+ w/ACX BP Video Card
Quote:
Top Five Video Cards Ranked by Volume (Units Sold)
EVGA GeForce GTX 970 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
EVGA GeForce GTX 960 2 GB PCI-e 3.0 x16 SuperSC ACX 2.0+ Video Card
PNY Quadro K620 2 GB PCI-e 2.0 x16 Workstation Video Card
PNY Quadro K4200 4 GB PCI-e 2.0 x16 Workstation Video Card
PNY Quadro K2200 4 GB PCI-e 2.0 x16 Workstation Video Card
[/spoiler]
Noisiv
http://www.3dcenter.org/artikel/launch-analyse-nvidia-geforce-gtx-950/launch-analyse-nvidia-geforce-gtx-950-seite-2
Even a custom OC-ed GTX 950's peak gaming power does not reach 140W:
https://www.techpowerup.com/reviews/EVGA/GTX_950_SSC/28.html
AMD's labs strike again 😀
GTX 950 - 140W? LMAO.
85W averaged across 9 reviews, G3D included.
Noisiv
I did now. And it sounds very impressive.
OTOH, 86W at the wall... that's like 70W real consumption. Which means the card + CPU are pulling only 10-20W above idle.
I'd like to be wrong, but I think someone there needs to retake their electronics lab class 😀
edit: on second thought, it is a double node jump, so it's quite possible :banana:
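The wall-to-DC conversion above can be sketched as a quick back-of-envelope. The PSU efficiency figure here is an assumption on my part (roughly what a mid-range unit manages at low load), not something stated in the post:

```python
# Back-of-envelope conversion of the 86 W wall reading to DC power
# actually delivered to the components. The 81% PSU efficiency is an
# assumed value for a mid-range PSU at low load, not a measurement.
wall_watts = 86
psu_efficiency = 0.81  # assumption; real efficiency varies with load and unit

dc_watts = wall_watts * psu_efficiency
print(round(dc_watts))  # ~70 W at the components, matching the "70W real" estimate
```

Whether that 70 W then leaves only 10-20 W of load headroom depends entirely on what the idle baseline of that i7-4790K test rig is, which the slide doesn't state.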
edilsonj
Noisiv
Denial
My 4790K stress tests at about 45-50W. A GTX 950 has a TDP of 90W.
It means the Polaris GPU is running at about 35-40W, which is definitely impressive, but it is on a smaller node, so we have no idea how much of that is architecture improvement vs. node shrink. You also have to remember that AMD's architecture is heavily favored in Star Wars Battlefront. For example, a 950 outperforms a 370 by about 20+% in nearly every other title, but in Battlefront the 370 ties the 950.
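The breakdown above can be written out the same way; every number here is the poster's estimate, not a measurement, and motherboard/RAM/fan draw is lumped in with the CPU, which is why the result lands slightly above the quoted range:

```python
# Splitting the 86 W system figure into CPU and GPU shares using the
# poster's estimates (i7-4790K at ~45-50 W under load). Whatever is
# left over is attributed to the Polaris card.
system_watts = 86
cpu_low, cpu_high = 45, 50  # poster's stress-test estimate for the 4790K

gpu_low = system_watts - cpu_high   # CPU at the high end leaves less for the GPU
gpu_high = system_watts - cpu_low
print(gpu_low, gpu_high)  # prints "36 41", roughly the claimed 35-40 W
```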
Noisiv
Athlonite
I wonder what the reason is for not having this already on current GCN 1.2 arch cards like the 280/285 and above. I mean, isn't the whole point of GCN that it's programmable?
Multimedia: H.265 Main 10 decode up to 4K / 4K H.265 encode at 60 FPS.
I can already do it using MPC-HC + MadVR, so why can't AMD do it via drivers, just like they do for H.264?
theoneofgod
Noisiv
Yes, they can do it "via drivers",
but without a dedicated hardware accelerator, H.265 encoding will be the same as now: slow as &*#*
thesebastian
My next monitor will be 4K for sure, ideally 120Hz with DP 1.3.
But with this concept of HDR, I don't know what to think. Which would be better: 4K 60Hz with HDR, or 4K 120Hz without HDR?
Maybe it's possible to use HDR for the desktop and non-FPS games, and just disable HDR for FPS games.
Anyway, what about something like 4K at 75-90 FPS with HDR? That would be my best choice for all environments (with FreeSync or G-Sync or whatever it will be called enabled, of course).
Maddness
Truder