Radeon Technologies Group - January 2016 - AMD Polaris Architecture


https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
So then Polaris is a new GCN-based architecture?
https://forums.guru3d.com/data/avatars/m/253/253059.jpg
Guessing much of that power savings is due to the 14nm lithography. Once Nvidia shrinks their dies and gets HBM2, they will also have substantial power savings. Hilbert, I do appreciate you calling out the apples-to-oranges comparison they did to the Nvidia 950... "our 2016 models are better than the competition's 2015 models (which aren't substantially changed from the 2014 models)!" I should hope so!
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
So then Polaris is a new GCN-based architecture?
Yeah, it would seem that way. Which is honestly good -- AMD doesn't need to waste its resources redesigning an architecture. GCN is fine; it just needed to bring power consumption down. 14/16nm will obviously help that, but the architecture itself needs to bring improvements as well if they want to compete with Nvidia, which I'm sure RTG is doing with Polaris. I find it really weird that they are dual-sourcing parts out of GF/TSMC. I wonder if 14nm will be restricted to APU stuff. It seems weird that they would have to pay the overhead of dual-sourcing in their main GPU line.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Yeah, it would seem that way. Which is honestly good -- AMD doesn't need to waste its resources redesigning an architecture. GCN is fine; it just needed to bring power consumption down. 14/16nm will obviously help that, but the architecture itself needs to bring improvements as well if they want to compete with Nvidia, which I'm sure RTG is doing with Polaris.
I agree. Hell, GCN is good enough that AMD is still rebranding GCN 1.0 GPUs (even though nobody seems to really like it when they do that). As long as neither AMD nor Nvidia can figure out how to get decent FPS on 4K screens, working on power efficiency is the next best thing. Both companies have GPUs that are more than good enough for 1080p and sufficient for 2K, so as long as they keep that performance while lowering the watts, I'm happy.
data/avatar/default/avatar08.webp
I agree. Hell, GCN is good enough that AMD is still rebranding GCN 1.0 GPUs
And getting murdered is what you forgot to mention. Having to use the big Tonga die and feature-less Pitcairns to fight the mid-sized GM206 is anything but GOOD ENOUGH. But it's actually good compared to the state of AMD's high-end GPU market share.
[spoiler]
Top Five Video Cards Ranked by Sales (Total Revenue):
EVGA GeForce GTX 970 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
PNY Quadro K4200 4 GB PCI-e 2.0 x16 Workstation Video Card
EVGA GeForce GTX TITAN X 12 GB PCI-e 3.0 Superclocked Video Card
EVGA GeForce GTX 980 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
EVGA GeForce GTX 980 Ti 6 GB PCI-e 3.0 x16 SC+ w/ACX BP Video Card

Top Five Video Cards Ranked by Volume (Units Sold):
EVGA GeForce GTX 970 4 GB PCI-e 3.0 x16 Superclocked ACX 2.0 Video Card
EVGA GeForce GTX 960 2 GB PCI-e 3.0 x16 SuperSC ACX 2.0+ Video Card
PNY Quadro K620 2 GB PCI-e 2.0 x16 Workstation Video Card
PNY Quadro K4200 4 GB PCI-e 2.0 x16 Workstation Video Card
PNY Quadro K2200 4 GB PCI-e 2.0 x16 Workstation Video Card
[/spoiler]
http://blog.neweggbusiness.com/news/best-selling-video-cards-of-2015/
data/avatar/default/avatar16.webp
That was the overall system power consumption that was given.
Except that it wasn't. The article says: "Polaris card on medium 1080p scored 60 fps and consumed 86W." A 4790K rig is around 60-ish watts at idle. 86W while gaming, really?
data/avatar/default/avatar06.webp
People buy more Quadros than 370s/380s? Anyone else think this is strange?
It's probably skewed by god-knows-what, but it's still telling.
data/avatar/default/avatar25.webp
I did now. And it sounds very impressive. OTOH, 86W at the wall... that's like 70W real consumption. Which means card + CPU are pulling 10-20W above idle. I'd like to be wrong, but I think someone there needs to retake their electronics lab class 😀 Edit: on second thought, it is a double node jump, so it's quite possible :banana:
data/avatar/default/avatar16.webp
Total system consumption.
Yeah, we covered that already. Looks like Polaris alone is around 30W.
GTX 950 rig: 140W (in-game) - 90W (GPU TDP) = 50W (idle baseline)
Polaris rig: 86W (in-game) - 50W (idle baseline) = ~36W (Polaris + CPU)
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
My 4790K stress tests at about 45-50W. A GTX 950 has a TDP of 90W. That means the Polaris GPU is running at about 35-40W, which is definitely impressive, but it is on a smaller node, so we have no idea how much of that is architecture improvements vs. node shrink. You also have to remember that AMD's architecture is heavily favored in Battlefront. For example, a 950 outperforms a 370 in nearly every other title by about 20+%, but in Battlefront the 370 ties the 950.
data/avatar/default/avatar03.webp
A GTX 950 has a TDP of 90W
That's the exact number I used above, and it returns 50W idle for their rig with the 950. Using that same 50W as the Polaris rig's idle state gives us ~36W above idle while gaming, and the CPU is certainly not sitting idle while in-game.
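For anyone who wants to redo this napkin math, here's a quick sketch. The wattages are the ones quoted in this thread; the simplifications (the 950 draws exactly its TDP in-game, both demo rigs are otherwise identical, ~85% PSU efficiency) are assumptions, not anything AMD stated.
[code]
# Napkin math from this thread's numbers. Assumptions (not from AMD):
# the GTX 950 draws exactly its 90W TDP in-game, both demo rigs are
# otherwise identical, and the PSU is ~85% efficient at this load.

GTX950_RIG_WALL = 140    # W at the wall while gaming (thread figure)
GTX950_TDP = 90          # W, Nvidia's official GTX 950 TDP
POLARIS_RIG_WALL = 86    # W at the wall while gaming (AMD demo figure)
PSU_EFFICIENCY = 0.85    # assumed; the "86W at the wall is ~70W real" point

# Everything except the GPU (CPU, board, RAM) under the same load:
baseline = GTX950_RIG_WALL - GTX950_TDP          # 50 W

# With that baseline, the Polaris card plus any extra CPU load draws:
polaris_and_cpu = POLARIS_RIG_WALL - baseline    # 36 W

# Net of PSU losses, the card itself lands near the 30W guessed above:
polaris_net = polaris_and_cpu * PSU_EFFICIENCY   # ~31 W

print(f"Baseline (rig minus GPU): ~{baseline} W")
print(f"Polaris + CPU delta:      ~{polaris_and_cpu} W")
print(f"Net of PSU losses:        ~{polaris_net:.0f} W")
[/code]
Either way the estimate lands in the 30-40W band this thread converged on, with the usual caveat that TDP is not measured draw.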
https://forums.guru3d.com/data/avatars/m/183/183421.jpg
I wonder what the reason is for not having this already on current GCN 1.2 arch cards like the 280/285 and above. I mean, isn't the whole point of GCN that it's programmable? Multimedia: H.265 Main 10 decode up to 4K / 4K H.265 encode at 60 fps. I can already do it using MPC-HC + madVR, so why can't AMD do it via drivers just like they do for H.264?
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
I wonder what the reason is for not having this already on current GCN 1.2 arch cards like the 280/285 and above. I mean, isn't the whole point of GCN that it's programmable? Multimedia: H.265 Main 10 decode up to 4K / 4K H.265 encode at 60 fps. I can already do it using MPC-HC + madVR, so why can't AMD do it via drivers just like they do for H.264?
Software decoding?
data/avatar/default/avatar15.webp
Yes, they can do it "via drivers", but without a dedicated hardware accelerator, H.265 encoding will be the same as now: slow as &*#*
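If you want to see what the unaccelerated path costs on your own box, here's a rough sketch with ffmpeg (assuming it's on your PATH and built with libx265; "clip.mkv" is just a placeholder for any video file).
[code]
# Rough sketch: list the hw-acceleration backends this ffmpeg build
# exposes, then time a pure-software H.265 encode -- the exact path
# that crawls without a dedicated accelerator.
import subprocess

# Available hw backends (e.g. dxva2/d3d11va on Windows, vaapi/vdpau on Linux):
subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"], check=True)

# CPU-only HEVC encode benchmark: libx265 does all the work in software;
# -benchmark reports the time spent and "-f null -" discards the output.
subprocess.run(
    ["ffmpeg", "-hide_banner", "-benchmark", "-i", "clip.mkv",
     "-c:v", "libx265", "-f", "null", "-"],
    check=True,
)
[/code]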
https://forums.guru3d.com/data/avatars/m/263/263507.jpg
My next monitor will be 4K for sure, if possible 120Hz with DP 1.3. But with this concept of HDR, I don't know what to think. What would be better: 4K 60Hz with HDR, or 4K 120Hz without HDR? Maybe it's possible to use HDR for desktop and non-FPS games, and just disable HDR for FPS games. Anyway, what about something like 4K at 75-90 fps with HDR? That would be my best choice for all environments (FreeSync or G-Sync or whatever it will be called, enabled of course).
https://forums.guru3d.com/data/avatars/m/260/260103.jpg
Are you dying this year or are you leaving the solar system?
He's on his way to the Polaris star. 🙂
https://forums.guru3d.com/data/avatars/m/169/169351.jpg
He's on his way to the Polaris star. 🙂
But it's impossible to go to Polaris; it requires a system permit which is currently impossible to obtain, possibly because it's the Thargoid home system... Sorry, I couldn't help but do an Elite plug. On topic: I knew Polaris was going to be GCN 1.3/2.0; that article released last week on Fudzilla totally had it wrong (well, the write-up was conflicted anyway). I'm looking forward to 2016's GPU market, that's for sure!