ZEN2 8c/16t with NAVI GPU Spotted - AMD Console chip?

slyphnier:

Sony staying silent or not, it's going to be a similar SoC to the Xbox anyway, probably with some slight changes/differences, but still the same AMD SoC base. Or is there any potential alternative?
I think there's basically no alternative for the console makers, as Intel still has no GPU or APU to speak of, and I doubt anyone wants to work with NVIDIA after the first Xbox. So it's very likely that the new consoles will have very similar, or even exactly the same, hardware. And again, as with the current generation, the biggest difference will be the available games and exclusives.
Backstabak:

I think there's basically no alternative for the console makers, as Intel still has no GPU or APU to speak of, and I doubt anyone wants to work with NVIDIA after the first Xbox. So it's very likely that the new consoles will have very similar, or even exactly the same, hardware. And again, as with the current generation, the biggest difference will be the available games and exclusives.
After the first Xbox, the PS3 had an Nvidia GPU, and now Nintendo does.
AlbertX:

After the first Xbox, the PS3 had an Nvidia GPU, and now Nintendo does.
Yeah, and after working with Nvidia on the PS3, Sony didn't want to work with them again. There wasn't really anything else available for the Switch at the time.
So we're seeing more medium-clocked cores (this time with HT) and a mid-range GPU? Why am I not impressed...
It makes 8c/16t the standard. This just changed my mind about getting an R5 3600; I'm looking at the R7 3700 instead.
fantaskarsef:

So we're seeing more medium-clocked cores (this time with HT) and a mid-range GPU? Why am I not impressed...
TDP can't be too high on a console.
Moderator
fantaskarsef:

So we're seeing more medium-clocked cores (this time with HT) and a mid-range GPU? Why am I not impressed...
Considering the move from Jaguar and a GCN/Polaris-based platform, performance will definitely be a lot better for the console market.
fantaskarsef:

So we're seeing more medium-clocked cores (this time with HT) and a mid-range GPU? Why am I not impressed...
Consoles never had impressive specs. I'd say this is actually a pretty substantial upgrade over last gen. Remember: the performance you get out of console hardware does not correlate directly with PC hardware. Build a PC with the same specs as the original XB1 and I'm sure you'll struggle to sustain 1080p@30FPS.
Undying:

It makes 8c/16t the standard. This just changed my mind about getting an R5 3600; I'm looking at the R7 3700 instead.
Don't forget - these are low-power cores. In multithreaded tasks, a single 3600 core will most likely outperform 2 cores for these consoles. I'm sure they're going to reserve a core or 2 for the OS like they did with last-gen. Considering how much more people are demanding of their consoles (like streaming or recording), it's reasonable to expect they'll still reserve 2 cores. So, unless you expect to do that stuff yourself, you can most likely get by with the 3600 and not see any problems.

Besides, look at all of the current console games ported to PC. It's an 8-core CPU, but practically none of those games will bog down something like an 8600K. In most cases, an overclocked 7600K is sufficient. Remember: we're often playing games at at least twice the framerate. If you don't want to take any risks and have money to spare, the 3700 is probably the smarter choice, but I'm willing to bet the 3600 will be sufficient for anyone who wants to play next-gen games and doesn't intend to run intensive background processes (like streaming). Alternatively, if all you care about is matching the consoles' core count, you could get the 2700. Those are a steal, and they offer more than enough performance.
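As a rough sanity check on that core-for-core claim (illustrative numbers only: this assumes the console chip runs somewhere around its rumored 1.6-1.8GHz base clock with 2 of its 8 cores reserved for the OS, that an R5 3600 sustains roughly 4.0GHz all-core, and that IPC is comparable since both are Zen 2 - none of which is confirmed):

# Back-of-envelope CPU throughput comparison with assumed, unconfirmed clocks.
# Treats per-core throughput as proportional to frequency and ignores SMT,
# memory bandwidth, and OS overhead on the PC side.

def core_ghz(cores, ghz):
    # total "core-GHz" available to games
    return cores * ghz

console = core_ghz(cores=8 - 2, ghz=1.8)  # assume 2 of 8 cores reserved for the OS
desktop = core_ghz(cores=6, ghz=4.0)      # assumed all-core boost for an R5 3600

print(f"Console game cores: ~{console:.1f} core-GHz")
print(f"R5 3600:            ~{desktop:.1f} core-GHz")
print(f"Ratio:              ~{desktop / console:.1f}x")

Crude as it is, it comes out to roughly a 2x per-core advantage for the desktop chip, which is in the same ballpark as the "one 3600 core per two console cores" estimate above.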
schmidtbag:

Don't forget - these are low-power cores. In multithreaded tasks, a single 3600 core will most likely outperform 2 cores for these consoles.
I'm actually not sure if this is true. I've been playing around with power limiting/scaling on my 3900 based on a suggestion by @Fox2232, and the scaling on this processor is extremely weird. I can cap it down to like 70W and only lose like 10% sustained performance. It doesn't make any sense. I think the TDP on the console is obviously going to be low and balanced towards the GPU, but it wouldn't surprise me if the CPU performance is within 10-15% of a full 3700 or 3600, or whatever the 8-core ends up being.
Denial:

I'm actually not sure if this is true. I've been playing around with power limiting/scaling on my 3900 based on a suggestion by @Fox2232, and the scaling on this processor is extremely weird. I can cap it down to like 70W and only lose like 10% sustained performance. It doesn't make any sense. I think the TDP on the console is obviously going to be low and balanced towards the GPU, but it wouldn't surprise me if the CPU performance is within 10-15% of a full 3700 or 3600, or whatever the 8-core ends up being.
I'm not sure I understand what you're saying. What are you doing to cap off the wattage? The IPC should remain pretty constant regardless of what your frequency is, but, just because the wattage is lower, that doesn't mean the frequency isn't affected (doesn't mean it is, either). It's worth pointing out that the Ryzen 3000 series seems to sort of have a mind of its own, where for example if you lower the voltage too much, it automatically underclocks (but it might not report that to Task Manager). Gamers Nexus has an article on that behavior that goes into more depth. All that being said, when you account for the actual current frequency, a 3.6GHz core of the exact same architecture should be approximately twice as fast as a 1.8GHz core. In the case of comparing these consoles to a PC, there will be some deviation due to stuff like other IRQ'ing devices, RAM speed, background processes, and cache size, but, that's why I said "approximately".
Moderator
schmidtbag:

I'm not sure I understand what you're saying. What are you doing to cap off the wattage? The IPC should remain pretty constant regardless of what your frequency is, but, just because the wattage is lower, that doesn't mean the frequency isn't affected (doesn't mean it is, either). It's worth pointing out that the Ryzen 3000 series seems to sort of have a mind of its own, where for example if you lower the voltage too much, it automatically underclocks (but it might not report that to Task Manager). Gamers Nexus has an article on that behavior that goes into more depth. All that being said, when you account for the actual current frequency, a 3.6GHz core of the exact same architecture should be approximately twice as fast as a 1.8GHz core. In the case of comparing these consoles to a PC, there will be some deviation due to stuff like other IRQ'ing devices, RAM speed, background processes, and cache size, but, that's why I said "approximately".
Lower wattage is usually obtained by lowering the clock rate - that, or limiting boost clocks or which cores can boost.
vbetts:

Lower wattage is usually obtained by lowering the clock rate - that, or limiting boost clocks or which cores can boost.
Right, but that's why I find it a bit confusing. If you lower the clock rate then of course the performance will drop. Denial didn't specify what the frequency was. Ryzen from the very beginning seems to increase exponentially in wattage as you get closer and closer to the limit of the chip (which so far has typically been somewhere between 4.0-4.4GHz, depending on silicon lottery). So - without much other information, it's not much of a surprise that you could chop off 30% of the wattage for a 10% performance loss. That being said, the part I find confusing is whether or not he's saying the IPC is being affected. So hypothetically, let's say there was a 30% drop in frequency, but benchmarks only showed a 10% loss in performance, that would be very weird.
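To put rough numbers on that "exponential" behavior (purely illustrative: the voltage points below are invented, merely shaped like a typical Zen 2 voltage/frequency curve, and dynamic power is approximated as proportional to V² x f):

# Illustration of why the last few hundred MHz cost disproportionate power.
# Dynamic power ~ C * V^2 * f; the capacitance term cancels out in the ratios.
# The (GHz, V) pairs are made up for illustration, not measured values.

curve = [
    (3.0, 0.95),
    (3.6, 1.05),
    (4.0, 1.20),
    (4.3, 1.40),
]

top_f, top_v = curve[-1]
top_power = top_v ** 2 * top_f

for f, v in curve:
    perf = f / top_f
    power = (v ** 2 * f) / top_power
    print(f"{f:.1f} GHz @ {v:.2f} V -> ~{perf:.0%} performance, ~{power:.0%} power")

With numbers like these, backing off from 4.3GHz to 4.0GHz costs about 7% frequency but sheds roughly a third of the power - the same ballpark as the 70W cap and ~10% loss described above, and it's also why a console SoC can sit much further down the curve without giving up proportional performance.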
schmidtbag:

That being said, the part I find confusing is whether or not he's saying the IPC is being affected. So hypothetically, let's say there was a 30% drop in frequency, but benchmarks only showed a 10% loss in performance, that would be very weird.
Nah, not the IPC - the clocks definitely drop automatically, but like you said, the power is exponential. I just don't think this chip is ever going to need to run at its base frequency of 1.8 (the picture actually says 1.6; not sure where the 1.8 is coming from in the article). I use PPT to control the power, and the 10% performance loss comes from actual benchmarks, not just looking at frequency - but the frequencies basically drop 10%.
Denial:

Nah, not the IPC - the clocks definitely drop automatically, but like you said, the power is exponential. I just don't think this chip is ever going to need to run at its base frequency of 1.8 (the picture actually says 1.6; not sure where the 1.8 is coming from in the article).
Well yeah, I wasn't really questioning power consumption though. All I'm saying is that core-for-core, once you account for clocks, the 3600 is roughly twice as fast as what the next-gen consoles will be. Even if you're right that the consoles won't typically run at base clocks, the same goes for a PC user running a Zen2 CPU as well. To your point though, it will be much more power-hungry. On the other hand, the 3700 Undying was considering isn't going to be a whole lot better in that regard.

Just a side note: since 1.8GHz is the minimum spec, it's worth pointing out that a [good] game developer is basically obligated to assume that's the frequency they have to work with. Since it is very possible the system will clock that low, they must ensure that the game will run smoothly at that speed. That being said, 1.8GHz, in most cases, will be the fastest we should expect of that CPU when all cores are in use. For games that don't need all available cores, that's where the boost clocks come in handy, without compromising TDP.
I use PPT to control the power consumption, and the 10% performance loss comes from actual benchmarks, not just looking at frequency - but the frequencies basically drop 10%. You can just cut a lot of power for that 10%, though.
Yup, definitely. Haha I've been in this sort of dichotomy with my CPU, where I have some headroom to overclock by another 150MHz. But the added heat (and therefore fan noise) and power consumption just isn't worth it since I'm already so close to the ceiling. On the other hand, I really don't care about preserving this CPU since I bought it knowing it was on the verge of obsolescence, so part of me is just like "let's just crank up until the crank breaks". Meanwhile, I could actually try to preserve this CPU and undervolt+underclock just a little bit, where it could make for a decent home server.
vbetts:

Considering the move from Jaguar and a GCN/Polaris-based platform, performance will definitely be a lot better for the console market.
While you are absolutely right of course, it also makes it predictable. So that's why I'm simply not surprised.
schmidtbag:

Consoles never had impressive specs. I'd say this is actually a pretty substantial upgrade over last gen.
Sure... 5 years later, it's easy to do that. Which is not saying the performance will be bad, too low, or whatever you think. It's just... a substantial upgrade, like you said @schmidtbag , but this is no revolution. Which still leaves my humble opinion intact: I'm simply not impressed. Whatever else you want to read into it is not what I'm saying. 🙂
fantaskarsef:

Which is not saying the performance will be bad, too low, or whatever you think. It's just... a substantial upgrade, like you said @schmidtbag , but this is no revolution. Which still leaves my humble opinion intact: I'm simply not impressed. Whatever else you want to read into it is not what I'm saying. 🙂
But that's my point - no console's hardware was ever a revolution, so why should this be any different? I suppose you could say the original GameBoy was revolutionary, since it was the first truly portable gaming system that was actually decent, but people were marveling more at the fact that it was portable than at its level of performance. After all, it was just black and white. EDIT: The same could be said about the first Atari.
schmidtbag:

But that's my point - no console's hardware was ever a revolution, so why should this be any different? I suppose you could say the original GameBoy was revolutionary, since it was the first truly portable gaming system that was actually decent, but people were marveling more at the fact that it was portable than at its level of performance. After all, it was just black and white. EDIT: The same could be said about the first Atari.
Yes, I do remember my GameBoy. Should get it out of the basement 😉 But yeah, when we're both saying the same thing - as expected / not impressed - why do you feel any need to reply to my post? I already said you are right in the exact quote you put into yours, my fellow guru.
Call me skeptical, but both Sony and MS said their CPUs were woefully underpowered and that they won't make that mistake again - yet to me it looks like it's going to happen again. I get that most consoles try to stay under 300 watts, and for the most part around 200 watts, but these low clocks still worry me. There's only so much work that can be offloaded to the GPU to make up for the CPU's lack of power, unless there's a huge difference between the Jaguar chip and Zen2 at the same clocks. Any game with a lot of physics suffered greatly due to the CPU being weak, particle effects too - FPS generally plummeted in those games.
Console makers learned their lesson over the years not to overspec their machines. If you look at the historical sales of consoles and handhelds, the vast majority of the best-selling ones in their respective generations were the lower-spec'd ones - Gameboy, DS, 3DS, PSOne, PS2, Wii, PS4 (which was cheaper at launch). So, there's little reason to invest in cutting-edge hardware, since it doesn't translate into more sales, and selling hardware at a loss is Business 101 bad practice. Also, at this point, there's very little standing in the way of making games run on lower-spec'd machines - just look at some Switch ports. So, I believe consoles won't be using high-spec hardware ever again. Maybe console makers will offer some "premium" SKUs like the PS4 Pro or XB1X as an option, but that's as far as they will go, IMHO.
Ricardo:

Console makers learned their lesson over the years not to overspec their machines. If you look at the historical sales of consoles and handhelds, the vast majority of the best-selling ones in their respective generations were the lower-spec'd ones - Gameboy, DS, 3DS, PSOne, PS2, Wii, PS4 (which was cheaper at launch). So, there's little reason to invest in cutting-edge hardware, since it doesn't translate into more sales, and selling hardware at a loss is Business 101 bad practice. Also, at this point, there's very little standing in the way of making games run on lower-spec'd machines - just look at some Switch ports. So, I believe consoles won't be using high-spec hardware ever again. Maybe console makers will offer some "premium" SKUs like the PS4 Pro or XB1X as an option, but that's as far as they will go, IMHO.
Herein lies the problem: they both said their systems will be 4K and 8K capable, with ray tracing... High-end GPUs have issues with that stuff - most $300 GPUs can't even do 4K with acceptable FPS, and $800 GPUs still struggle to maintain 60 FPS at 4K, so forget it once ray tracing is added, and forget it some more if they really think 8K is possible for anything but movies.

I'm still waiting for consoles to hit 1080p @ 60 as a requirement, because sub-1080p is unacceptable and 30 FPS is only acceptable in slow-paced games. The only time 30 FPS is acceptable in a fast-paced game is when there are ZERO frame-pacing issues (which is still a problem), and even then 60 FPS is preferable. There's a huge difference between how, say, Nioh runs in performance mode - for the most part 60 FPS at 720p or thereabouts, with huge drops when particle effects kick in - and Nioh in movie mode at 1080p, locked to 30 FPS, still with the huge drops. Even though Nioh looks rough in performance mode, it plays 1000% better at 60 FPS, and anyone who plays at 30 and then at 60 FPS will say the same thing unless they're blind. The CPU has to be strong enough to at least keep up with the GPU it's coupled with, not underpowered like it was in the PS4/Xbox One - and even in the Pro/X the CPU was still way underpowered.