AMD to Unleash Some Fury?

This one reminds me of the movie Kung Fury!
The new GPUs will be based on the same architecture as the 285, which is GCN 1.2.
I had an ATI RageIIc and then I got a RivaTNT 2 Pro... it was unbelievable. Things actually worked, no rendering problems, and it actually accelerated games instead of slowing them down. The RageIIc did such a great job ruining my childhood gaming experience that I still hold a grudge towards AMD. On top of that, when I gave ATI a second chance, the X1800 XT died right after its one-year warranty ended. So if I appear to lean towards the green side, now you know why.
Yeah, brings back old memories 🙂
Makes sense, as anyone who buys one will be furyous.
I had a crappy ATI onboard chip in my PC, and a friend of mine got one that had an Nvidia TNT2 in it... 🙁
I hope AMD doesn't do this...! People do not have fond memories of the pre-ArtX ATi, and I'm one of them. The Rage Fury was an awful product--it made me *furious* on more than one occasion.

Anecdote: I'd been running a 16MB Voodoo3 on a 2048x1536 Windows desktop when I decided to give the Rage Fury a shot... ROOB I couldn't get the card to support that resolution for the Windows desktop. I had to *call ATi* at the time and *speak to a driver programmer*, who was interested in learning why I wanted to run my desktop at such a high resolution--and I learned that ATi "had no plans" to support such resolutions in the future (lol). That is where I first learned how to play with a Windows install .inf to manipulate the resolutions a given display adapter could use.

But that wasn't the worst of it: the 3D ran like a slide show--it was really, really fast if you could coax it to 10 fps @ 640x400--while I was used to 3dfx's very playable 25-35 fps GLIDE gaming. I had a TNT1/2 in those days (which I had used with a 3dfx V2) that wasn't much better--nVidia loved to play up the fact that 3dfx wasn't running in 24-bit color, but the fact was that 3dfx's 16/22-bit GLIDE hybrid mode ran much better than either TNT product could in 24-bit D3D/OpenGL. But the Rage Fury was even worse than the TNT/2...;) I returned it to Best Buy a few days later for a full refund--that was my last ATi product until the company turned the corner with R300 (courtesy of ArtX). Pre-ArtX was not a good chapter in ATi's 3D-accelerator history, imo. In those days, 3dfx made everyone look bad...;)

But I'll never forget my Rage Fury, and that's not such a good thing for ATi. AMD has in the past several years endeavored to distance itself from the old ATi marketing--I think that was/is the best course for the company. Looking back like that just seems sort of lame.
If they want to go retro over something people have good memories of then the upcoming Fiji products should be tagged "R300A", or something else evocative of a bright period in ATi's past. "Rage Fury" surely was not it...:D
I'm not sure I'm a fan of this. AMD's naming scheme is already a pretty complicated mess, adding the word "Fury" in the mix doesn't make things any easier. I also think they could easily do without all the rebrands. Drop the X suffix, take out all the un-changed rebrands (such as the current R9 360), and they have a decent lineup that's mostly (though not 100%) new tech. They don't NEED to have so many options. If they want to get rid of old stock, keep the old names as they are, continue to sell them, and just drop the price. If someone is intending to get something like an R9 360, they're obviously not striving for the latest and greatest.
If AMD misses on this release they will be known as Radeon Fur-e not Fury!:butt:
lol you mean furry?
Not sure if the new Radeon Fury card will beat the Titan X or not, but it will give it a run for its money, and the same goes for the 980 Ti. If I were going to purchase a 3xx card, I would get a 390 (yeah, I know it's the same as the 290X), but 8GB of VRAM can be very useful. Wasn't the Rage Fury a dual-GPU card? I remember reading about that GPU in the history of GPUs right here on this site.
Idk, seems to me that Kepler cards can't compete with the current 280/290s. The last few games proved that. That's why a "fix" is on the way, but I wouldn't count on it.
I like the bandwidth jump; I'm very interested in seeing how that works out.
ATI Rage!
Volcanic Islands, Krakatoa, Fiji, Fukushima, Hiroshima... It finally dawned on them that setting themselves up for another round of African village electricity jokes is not the best way to kick-start a new GPU http://abload.de/img/23_13_22hvkii.gif
After the announcement that all the 300 series would be rebrands, I figured they would do a funny naming scheme like Titan. Hopefully it's not a joke like Titan as well.
I hope a Titan-like naming scheme won't cause them to use Titan-like overpricing...
Why has no one picked up on this yet: "monitor connectivity is HDMI and DP, no more DVI" NO MORE DVI????? Also, doesn't HDMI cause lag? I cannot understand why they are dropping DVI when it is still valid for 99% of gamers. How many people have a DP monitor? I know I don't. Which means a new monitor if I want to upgrade to the fastest AMD card.
I've never heard of HDMI causing lag. Are you referring to older HDMI standards not being able to support 60fps at 4K? If you're referring to input lag, I've never heard of there being notable input lag differences between input methods with most of it coming from the monitor's scaler. As for DP, every monitor I've had in an eternity has had it. I was going to say the last 5 years but it's probably been since 2008.
Why has no one picked up on this yet: "monitor connectivity is HDMI and DP, no more DVI" NO MORE DVI????? Also, doesn't HDMI cause lag? I cannot understand why they are dropping DVI when it is still valid for 99% of gamers. How many people have a DP monitor? I know I don't. Which means a new monitor if I want to upgrade to the fastest AMD card. How much money is that going to cost?
You'll only need a DP to DVI adapter, or go Nvidia, as they still have DVI.
Why has no one picked up on this yet: "monitor connectivity is HDMI and DP, no more DVI" NO MORE DVI????? Also, doesn't HDMI cause lag? I cannot understand why they are dropping DVI when it is still valid for 99% of gamers. How many people have a DP monitor? I know I don't. Which means a new monitor if I want to upgrade to the fastest AMD card. How much money is that going to cost?
DVI is old technology and needs to be phased out. I have used HDMI for a good while with no input lag, unless you try running stuff it wasn't made for. Quite a few people have DP monitors, because plenty of monitors come with both HDMI and DP. DP is also the new DVI because it can support 120 and 144Hz. I don't think HDMI could support 120 and 144Hz at most resolutions until HDMI 2.0 was released. So pretty much HDMI and DP can do everything DVI can.
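The connector debate above really comes down to raw link bandwidth. As a rough back-of-the-envelope sketch (the per-link data rates below are nominal figures after 8b/10b coding and the calculation ignores blanking intervals, so real requirements run somewhat higher):

```python
# Rough sketch of uncompressed video bandwidth vs. display-link capacity.
# Link figures are nominal data rates after 8b/10b coding; blanking
# intervals are ignored, so real-world requirements are ~10-20% higher.

def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s, excluding blanking."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Nominal max video data rates in Gbit/s for each interface
links = {
    "Dual-link DVI": 7.92,   # 2 x 165 MHz pixel clock x 24 bpp
    "HDMI 1.4":      8.16,   # 10.2 Gbit/s TMDS, 8b/10b coded
    "HDMI 2.0":      14.4,   # 18 Gbit/s TMDS, 8b/10b coded
    "DP 1.2 (HBR2)": 17.28,  # 4 lanes x 5.4 Gbit/s, 8b/10b coded
}

modes = {
    "1080p @ 144 Hz": (1920, 1080, 144),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
}

for label, (w, h, hz) in modes.items():
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{label}: ~{need:.1f} Gbit/s -> fits on: {', '.join(fits)}")
```

This shows why 1440p at 144Hz or 4K at 60Hz needs HDMI 2.0 or DisplayPort, while dual-link DVI tops out around 1080p/144 or 1600p/60.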
After the announcement that all the 300 series would be rebrands, I figured they would do a funny naming scheme like Titan. Hopefully it's not a joke like Titan as well.
AMD Radeon Olympia