AMD to Unleash Some Fury?

You'll only need a DP to DVI adapter
HDMI 1.0 mode can emulate DVI, so if you still have a 1080p monitor with only DVI, you're perfectly good to go with a passive HDMI-DVI adapter. In fact, even if you have DisplayPort only, you just need a passive Dual-Mode (DP++) adapter which enables HDMI/DVI emulation on the DisplayPort output.
HDMI 1.1 or whatever it was is still rampant and doesn't support 120Hz. Plus there's been that long-standing bug in Nvidia drivers with RGB color range. DP is okay I guess, but it has little reason to exist. Saying DVI is "old technology" is kind of funny. It's a digital signal interface. If anything, the only reason it's being phased out is for new content protection standards or some other nonsense.
What I meant by an old version of HDMI was 1.4a. Also, what I meant to say is that DVI is an old digital format, and DP and HDMI are being geared to do the same thing as DVI. Finally, there is a reason for DP to exist: before HDMI 2.0 came out, DP was the only format that could handle 4K @ 60Hz for those who wanted 4K at that time, and now as of HDMI 2.0 it too can handle 4K @ 60Hz.
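For a rough sense of why that was, here's a back-of-the-envelope sketch (illustrative Python, using approximate nominal link data rates, not official figures) of the bandwidth 4K @ 60Hz needs versus what HDMI 1.4, DP 1.2 and HDMI 2.0 can carry:
[code]
# Rough, illustrative comparison: bandwidth needed for 4K @ 60 Hz vs. the
# approximate usable data rates of common display links (nominal figures).

def video_data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s, using total timing incl. blanking."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# CTA-861 4K @ 60 Hz timing is 4400 x 2250 total pixels (594 MHz pixel clock).
need = video_data_rate_gbps(4400, 2250, 60)          # ~14.3 Gbit/s

links_gbps = {                 # approximate payload rates after 8b/10b coding
    "HDMI 1.4": 8.16,          # 10.2 Gbit/s raw TMDS
    "DP 1.2":   17.28,         # HBR2, 21.6 Gbit/s raw
    "HDMI 2.0": 14.4,          # 18 Gbit/s raw TMDS
}

print(f"4K @ 60 Hz (8-bit RGB) needs about {need:.2f} Gbit/s")
for name, rate in links_gbps.items():
    verdict = "enough" if rate >= need else "not enough"
    print(f"{name:>8}: {rate:5.2f} Gbit/s -> {verdict}")
[/code]
Which lines up with the post: before HDMI 2.0, DisplayPort 1.2 was the only common output with the headroom for 4K @ 60Hz.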
You know, some of us actually had those. I had a VooDoo 3 2000 16MB, and I happened to be using an AIW 128 Pro some time later. My cousin had an Xpert card of that generation. And then I got a TNT2 32MB M64. That loony boy from Mars up the page apparently forgot to mention he paired his ATi card with a 386, because while the ATi Rage 128 was quite a bit weaker, it was not delivering 10fps in games of that time. And it was soon replaced by the Rage 128 Pro. One thing which was apparent at that time was IQ. Voodoo 3/TNT2 produced textures which would remind you of a whore's face in the rain: makeup melted all over, nothing sharp. That's where ATi/Matrox were clear winners back then. I played mostly at 800x600/1024x768, and at those resolutions ATi delivered better image quality per frame rate. But hey, I had to learn the hard way with those 3Dfx/nVidia GPUs. There is no question that the TNT2 with its 32MB lasted me a long time, since it had a bit higher performance, and it got me used to poor IQ. With it, it hardly mattered if the 3D scene was rendered at 640x480 or 1024x768; textures made everything blurry in the same way. Here is some comparison: http://www.tomshardware.com/reviews/ati-rage-fury-pro-review,133-6.html
Looney boy...? :stewpid: Heh...;) This forum does not tolerate prejudice against Martians, I'll have you know.

Tom's Hardware back in those days was about as trustworthy as a soup-kitchen prostitute...;) I have no doubt that Dr. Pabst-Blue-Ribbon (as I used to call him) was being goosed under the table by nVidia in those days to slant his "reviews" in a pro-nVidia direction. There were several long-winded nVidia scandals in those days that I enjoyed. I remember when Pabst posted his review and compared the stock-clocked 3dfx V3 to an overclocked TNT2 running at 175MHz--which was fine, except that when the TNT2 shipped, nVidia shipped it at 150MHz. (The V3 ran circles around that product, easily.)

I only used a TNT2 for 2d rendering, coupled with a V2 (while in those days you could count the number of decent D3d titles on one hand, I had dozens of GLIDE titles--if you didn't run GLIDE in those days you were not actually a "gamer" and it was obvious you didn't do much of it...;) I think you can at least agree with me on that much.) When I bought the V3 it allowed me to rip out my TNT2/V2 combo--the V3 was very impressive--it was faster than V2 SLI and could run 3d-accelerated GLIDE titles in higher resolutions than V2 SLI, too. (Up to 1600x1200, IIRC.) If 3dfx had not made such a catastrophic mistake in buying STB's Mexican fab (which pulled them under with surprise debt), I have no doubt they'd still be here today. It was ironic: the V3 was the best product 3dfx ever made (along with the V5.5k, which I also had later on), but those products came out of the STB acquisition--and that's what drove 3dfx under.

Most of those idiots ran "screen shots" of the Voodoo3 that actually left out half of the visible on-screen image, because they (deliberately) chose not to understand that existing frame-grab software at the time couldn't capture the post-processing filtering that 3dfx was using. (We owe 3dfx so much even now--like post-process image enhancement and FSAA--especially FSAA.) Even Anand posted V3 screen shots which were horrible looking, and which he admitted *in the body of his V3 review* were not representative of what he saw on the screen--and even he for some mysterious reason chose not to simply ring up 3dfx and ask why. (Shortly after the V3 shipped, 3dfx released frame-grab software that would capture the image as it actually appeared on screen--but Anand never bothered to correct the initial negative impression.)

Try and remember that we are talking about *3d* here--not 2d. I owned at least two Matrox Millenniums--great, wonderful 2d cards at the time--but they sucked @ 3d. Completely sucked. All of them did when compared to the V3. ATI was almost as bad at 3d as Matrox, IIRC. What makes 3d useful, you ask? "Running at a playable frame-rate," I answer. Who cares if the Matrox cards and the ATi cards could generate fantastic-looking static screen shots? I didn't care about that at all, because what good is that if your frame-rate is literally 5-10 frames per second? A: No good at all. Also...who cares about the cpu? The whole point of 3d accelerators is that *they* do the heavy lifting so that the cpu can do something else. What you need is a refresher course...;) Go fire up a copy of Unreal and run it in software mode...and you'll be amazed at what cpus cannot do at all in terms of 3d acceleration--even today's cpus, let alone a '386. When people talk about "lousy ATi drivers" it is that period--the pre-ArtX ATi--that they are talking about.
Today, the only people who complain at all about AMD drivers are Crossfire owners, and nVidia SLI owners are in the exact same boat. That's because neither Crossfire nor SLI is supported by either D3d or OpenGL--and never has been. It's all too easy for a developer to build a game engine that just so happens not to be amenable to forced FSAA or xFire/SLI--but too often consumers blame the IHV instead of the APIs. But that day will soon be behind us, as D3d12 and the next iteration of OpenGL will both, finally and at long last, support multi-gpu rendering. As for me, I've owned nothing except AMD GPUs since 2002, when I bought an R300 (9700 Pro). Post the ArtX acquisition, ATi/AMD is a different company and their products pretty much sell themselves, imo.
[youtube]Myi3MosRuy8[/youtube] :D 😀 😀
Hmm, GeForce 980 Ti at $649. Let's hope AMD can match that price tag!
So excited about a comparison of Fiji and the 980 Ti 😀 😀 😀
Moderator
Matrox is still around: http://www.matrox.com/en/
They do mostly encoder and signage stuff. I use Matrox solutions daily. 😀
^ A small freq. bump is nothing unusual; they couldn't have done that on the 290X back then - it was already power hungry enough @ 512-bit as it was. This time it's a lot different.
I really hope it has 8GB. I still don't think the price of that previous leak is accurate though.
You quoted that pic, and in the AMD CPL it clearly says 8192MB. Or are you still doubting that? 🤓 From what I saw, the latest leak said $600-650.
Yep, by UHD 4K and MSAA for sure; it's basically 4x more bandwidth/data compared to 1080p.
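The 4x figure is just the pixel-count ratio; a tiny sketch (illustrative numbers only, assuming 32-bit colour plus 32-bit depth per sample) shows how quickly the framebuffer grows once MSAA is added on top:
[code]
# 4K UHD has exactly 4x the pixels of 1080p, so framebuffer traffic scales
# roughly 4x before MSAA; with 4x MSAA the sample count multiplies again.

uhd = 3840 * 2160      # 8,294,400 pixels
fhd = 1920 * 1080      # 2,073,600 pixels
print(uhd / fhd)       # 4.0

bytes_per_sample = 4 + 4   # assumed: 32-bit colour + 32-bit depth
for res_name, pixels in (("1080p", fhd), ("4K", uhd)):
    for msaa in (1, 4):
        mb = pixels * msaa * bytes_per_sample / 2**20
        print(f"{res_name}, {msaa}x MSAA: ~{mb:.0f} MB colour+depth")
[/code]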
Can also see the HBM, all 4 of them, so it looks like it is 4GB, as they are 1GB per stack, aren't they?
Can also see the HBM, all 4 of them, so it looks like it is 4GB, as they are 1GB per stack, aren't they?
Yeah, HBM v1 is limited to 1GB per stack, so unless there's more on the backside somehow, it's 4GB.
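For reference, the capacity arithmetic, assuming the commonly quoted first-gen HBM configuration of four 2Gbit DRAM dies per stack:
[code]
# Illustrative HBM1 capacity math (assumes 4 DRAM dies of 2 Gbit each per
# stack, the commonly quoted first-generation configuration).

dies_per_stack    = 4
gbit_per_die      = 2
stacks_on_package = 4                                # the four stacks visible around Fiji

gb_per_stack = dies_per_stack * gbit_per_die / 8     # -> 1 GB per stack
total_gb     = stacks_on_package * gb_per_stack      # -> 4 GB total
print(f"{gb_per_stack:.0f} GB per stack, {total_gb:.0f} GB total")
[/code]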
^ A small freq. bump is nothing unusual; they couldn't have done that on the 290X back then - it was already power hungry enough @ 512-bit as it was. This time it's a lot different. You quoted that pic, and in the AMD CPL it clearly says 8192MB. Or are you still doubting that? 🤓 From what I saw, the latest leak said $600-650.
I'm still doubting it.
I wonder if it is possible to have 2 sets of 4 1GB stacks for a total of 8GB on the card, or if that isn't possible due to HBM v1 limitations.
To me, in that picture of the Fiji die it looks like 2 sets of 4 1GB chips.
Each set is 4x1GB, so 2 sets would total 8GB. It's only that those two squares below the upper side don't have chips on them, just a thermal paste footprint...