AMD Pulls the Plug on ATI


garfoldsomeoneelse

Charming, But Stupid
Mar 22, 2009
2,908
0
0
NullMad said:
This thread is impressively full of misinformation and bad reading comprehension. Not to mention fanboyism.

AMD acquired ATI a few years back, and all it's phasing out is the branding. The cards will be on the shelves same as always; the sticker will now say AMD, and consumer graphics cards will keep the Radeon name. This is purely a marketing shift - much smaller than the GlobalFoundries split a year ago, and few outside tech circles even heard of that.

As for ATI cards being unsupported, they have had a monthly driver release schedule for a number of years now, so I fail to see how that qualifies as "unsupported". There are issues with that approach, but they are most certainly trying.
Welcome to the Escapist. I like you.
 

nelsonr100

New member
Apr 15, 2009
303
0
0
So does this mean AMD are ceasing producing graphics cards completely? Or just that the ATI brand is being taken away? Hopefully the second, as the recent run of cards has been really top notch and has beaten Nvidia on price and performance.

I'm running an ATI Radeon HD 4890 at the moment and it's great.
 

aaaaaDisregard

New member
Feb 16, 2010
62
0
0
Who cares? Most people I know are aware of the fact that AMD bought ATi, and the rest just don't care about such things as who makes their video chip because they buy their PCs prebuilt. Personally I don't give a damn what label is on the back of my Radeon 5850 PCB - AMD or ATi.
They really should have done this long ago, so more people would know that the AMD brand now includes video cards.

nelsonr100 said:
So does this mean AMD are ceasing producing graphics cards completely? Or just that the ATI brand is being taken away?
Nobody's gonna get out of the video chip business, as it generates profit for AMD, which can't always be said about its CPU division. Plus a powerful GPU architecture will be a serious, if not the only, advantage in the battle with Intel on the hybrid processor front. They aren't going to scrap it, believe me.
 

Radelaide

New member
May 15, 2008
2,503
0
0
josh797 said:
so is nvidia now a monopoly? i mean who else makes graphics cards? hmmmm i smell a lawsuit acomin'
Nope, since AMD are releasing Fusion, which is a combined CPU/GPU and not just a standalone GPU.
 

nelsonr100

New member
Apr 15, 2009
303
0
0
aaaaaDisregard said:
Who cares? Most people I know are aware of the fact that AMD bought ATi, and the rest just don't care about such things as who makes their video chip because they buy their PCs prebuilt. Personally I don't give a damn what label is on the back of my Radeon 5850 PCB - AMD or ATi.
They really should have done this long ago, so more people would know that the AMD brand now includes video cards.

nelsonr100 said:
So does this mean AMD are ceasing producing graphics cards completely? Or just that the ATI brand is being taken away?
Nobody's gonna get out of the video chip business, as it generates profit for AMD, which can't always be said about its CPU division. Plus a powerful GPU architecture will be a serious, if not the only, advantage in the battle with Intel on the hybrid processor front. They aren't going to scrap it, believe me.
Yeah, I thought it was weird when I saw the news article, as ATI is pretty much the most profitable part of AMD and has recently been very successful with the 4800 and 5800 series. I think the way the article is titled and written is confusing a lot of people on here. AMD are just "absorbing" ATI fully into their brand.
 

Funkysandwich

Contra Bassoon
Jan 15, 2010
759
0
0
I've always preferred Nvidia cards, but I only buy cards from decent manufacturers, so I've never had one fail yet.

I don't think people realize that it's generally not Nvidia's or ATI's fault when a card fails; it's the fault of the company that actually built the thing.
 

manaman

New member
Sep 2, 2007
3,218
0
0
Ultratwinkie said:
GIJames said:
It's about time.

Anyway, NVIDIA FTW!
nvidia is overpriced and is so badly constructed it lasts only 2 years.
You do realize that nVidia does not make graphics cards, right? They design them. ATI does the same, but they also put out their own line as well. Many of the companies that make ATI cards also make nVidia cards, with a few exceptions.

Oh, and did I mention that you don't know what you are talking about if you really think that's true? I mean, you wouldn't be fibbing to us, would you? There is no way you would state something obviously untrue and inflammatory just to incite fanboys from the other camp, would you?

I have owned both nVidia-designed cards and ATI-designed cards, just like I have switched between AMD and Intel as the prices and quality have changed over the years for both of them. As I start a new cycle of buying parts here shortly, I will take another look and see who is on top, how much I want to spend, and other factors, and make my decision.
 

manaman

New member
Sep 2, 2007
3,218
0
0
Ultratwinkie said:
except if its a billion graphic card recall i can. something that big is just neglect.
Now you are just lying. There is no way any graphics card has ever sold a billion units.

From your description it sounds like the company making them had improperly installed heat sinks - usually bad thermal paste, or sometimes a poor mechanical mount; even a failed fan can cause temperatures to spike. A poorly maintained wave soldering unit, cheap supplies, or an improper mask in the manufacturing process could also be to blame. You can actually get those components so hot they melt the solder when you overtax them (and how hard you can push them drops off quickly as cooling capacity diminishes). Solder can then flow and cause shorts, and you get a pop, pop, fizzle, possible fire, etc. So yes, what you describe is a manufacturing problem, not a chipset problem.

You don't get a show when it's just a single component failing. Open circuits tend not to be flashy - maybe a slight smell of burnt electronics, and that's it.
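To put rough numbers on the cooling point above (purely illustrative figures, not specs for any real card), here's a minimal sketch of the standard steady-state thermal model: chip temperature rises linearly with power times the cooler's thermal resistance, so the power a card can safely dissipate drops fast once the paste dries out or the fan dies.

```python
# Back-of-the-envelope thermal headroom sketch.
# All numbers are assumptions for illustration, not real card specs.
T_AMBIENT_C = 40.0   # assumed air temperature inside the case
T_MAX_C = 105.0      # assumed maximum safe chip temperature

def max_safe_power(r_theta_c_per_w: float) -> float:
    """Max power (W) dissipatable before the chip exceeds T_MAX_C.

    Steady-state model: T_chip = T_ambient + P * R_theta, where R_theta is
    the cooler's total thermal resistance in degrees C per watt.
    """
    return (T_MAX_C - T_AMBIENT_C) / r_theta_c_per_w

# A healthy cooler versus a degraded one (assumed resistances):
for label, r_theta in [("good cooler", 0.4), ("dried paste", 0.8), ("failed fan", 2.0)]:
    print(f"{label:12s} R = {r_theta} C/W -> max ~{max_safe_power(r_theta):.0f} W")
# Output drops from roughly 160 W to roughly 80 W to roughly 30 W:
# double the thermal resistance and you halve the power budget, which is why
# a card that ran fine at stock load starts cooking itself once the cooler degrades.
```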
 

Liberaliter

New member
Sep 17, 2008
1,370
0
0
As far as I hear, they are only getting rid of the ATI name; this is because AMD has owned ATI for several years now.
 

minus_273c

Knackered Old Shit
Nov 21, 2009
126
0
0
DominicxD said:
Ultratwinkie said:
so i take it my ATI card is now an unsupported piece of crap?
Hate to break it to you mate, but it has been that for a good few years now.
Given the quality of the last few driver releases for my card (a 2GB 5970), I agree. When I find a driver that doesn't dump me to the desktop during BFBC2, I find it's disabled one of the cores. I've currently got one that's OK on both counts, and I'm sticking with it.
 

manaman

New member
Sep 2, 2007
3,218
0
0
Ultratwinkie said:
manaman said:
Ultratwinkie said:
except if its a billion graphic card recall i can. something that big is just neglect.
Now you are just lying. There is no way any graphics card has ever sold a billion units.

From your description it sounds like the company making them had improperly installed heat sinks - usually bad thermal paste, or sometimes a poor mechanical mount; even a failed fan can cause temperatures to spike. A poorly maintained wave soldering unit, cheap supplies, or an improper mask in the manufacturing process could also be to blame. You can actually get those components so hot they melt the solder when you overtax them (and how hard you can push them drops off quickly as cooling capacity diminishes). Solder can then flow and cause shorts, and you get a pop, pop, fizzle, possible fire, etc. So yes, what you describe is a manufacturing problem, not a chipset problem.

You don't get a show when it's just a single component failing. Open circuits tend not to be flashy - maybe a slight smell of burnt electronics, and that's it.
maybe not a single card but a collection of cards. look up nvidia recall in google and you get a metric SHIT TON of them.
Well, since you don't want to accept that nVidia and ATI both design chipsets and that nVidia does not manufacture any boards, maybe by showing you some Google results you will see that basing your opinion entirely on one bad experience and one Google search isn't the best idea.

ATI Recall - 1,380,000 results.
nVidia Recall - 932,000 results.

By your logic your brand of choice is more of a failure, but it's not. Neither brand is better overall than the other; they both lag behind and lead each other all the time. They both have excellent brands that manufacture their cards, and they both have poorer ones.

I am not sure you understand exactly what nVidia and ATI do. I don't think you have quite grasped that the bare GPU chip itself is what nVidia and ATI make. ATI happens to also slap theirs onto a board with their name on it as well, but essentially both make chipsets and license out their name and board designs only.

I was going to go on about board design with pictures and everything, but I made a better comparison along the way and went with it.

You don't blame your CPU because your motherboard fails, do you? Think of it more that way. nVidia and ATI make the equivalent of a CPU for the graphics card; someone else makes the board and buys the other components that are used. Buying a cheap graphics card would be the equivalent of buying a cheap motherboard, then blaming your CPU manufacturer when it fails. It's not a perfect comparison, but it's the best I can think of that is easy to follow.

The fact is, the chipsets are almost never the failure points in either company's products.

Oh, and I think you will find custom ATI chipsets working smoothly in both the Wii and the 360. You will also find a Sony and nVidia co-designed GPU in the PS3.

Obviously you had one bad experience that you are going to stick with as the model of all things to come, hell or high water, but that doesn't seem to be the case for the rest of the world. You can't even really generalize the products into more expensive or higher quality because of how varied they are cost-wise and power-wise. You can just know that sometimes one will have an advantage over the other. Power-wise, nVidia was leading for a long time; ATI has closed in and has the value end cornered, but that may have all changed in the months since I was last looking at video cards.
 

ActionDan

New member
Jun 29, 2009
1,002
0
0
What the FUCK? I thought they were doing well with their dedicated graphics. Woah, what a bunch of selfish fucks.
 

Andy Chalk

One Flag, One Fleet, One Cat
Nov 12, 2002
45,698
1
0
Alphalpha said:
Wow, dude. That is one serious level of obsession you've got going there. You're entirely wrong, of course, about both my "gross misconception" of what "pull the plug" means and how it applies in this situation, but aside from pointing out that obvious fact I'm not going to debate the point further. Sky is blue, grass is green, etc.

But seriously. I'm impressed.
 

JediMB

New member
Oct 25, 2008
3,094
0
0
So much ignorance/idiocy in this thread. Ugh.

Well, I used to be a 3DFX person. Had a Voodoo 2 for the longest time. Well, first one and then another, since the first one unexpectedly died from faulty memory.

Switched to ATi when 3DFX was acquired by nVIDIA, and got myself a Radeon 9800 PRO.

When that suddenly died on me, it left a bad taste and I got a GeForce 6600 GT. The GPU performed well, but the card manufacturer screwed the fan up. Still, I had to keep on using it until it was time for a completely new computer and...

A GeForce GTX260. Performed awesomely. Unfortunately, though, I got a "Black Edition" card manufactured by XFX, and they screwed up the factory overclocking. Still, not nVIDIA's fault, and I love the performance and PhysX support, so my next card will be an nVIDIA too, and most likely a GTX460.

I haven't had much luck with graphics cards, have I? :p
 

aaaaaDisregard

New member
Feb 16, 2010
62
0
0
manaman said:
You don't blame your CPU because your motherboard fails, do you? Think of it more that way. nVidia and ATI make the equivalent of a CPU for the graphics card; someone else makes the board and buys the other components that are used. Buying a cheap graphics card would be the equivalent of buying a cheap motherboard, then blaming your CPU manufacturer when it fails. It's not a perfect comparison, but it's the best I can think of that is easy to follow.

The fact is, the chipsets are almost never the failure points in either company's products.

Oh, and I think you will find custom ATI chipsets working smoothly in both the Wii and the 360. You will also find a Sony and nVidia co-designed GPU in the PS3.

Obviously you had one bad experience that you are going to stick with as the model of all things to come, hell or high water, but that doesn't seem to be the case for the rest of the world. You can't even really generalize the products into more expensive or higher quality because of how varied they are cost-wise and power-wise. You can just know that sometimes one will have an advantage over the other. Power-wise, nVidia was leading for a long time; ATI has closed in and has the value end cornered, but that may have all changed in the months since I was last looking at video cards.
You are partially right here. AMD and NVIDIA indeed only design GPUs and then outsource their production to foundries like TSMC. After that they design the PCB (the board to which the GPU, memory and cooler are attached) and outsource production of the cards based on it (and of the PCBs themselves) to the partner companies.
By the way, all high-end video cards are designed by AMD and NVIDIA at first and manufactured by a single contract manufacturer (AFAIK in NVIDIA's case it's Foxconn), at least in the first few months, so you can't blame malfunctions on those who just glue their brand sticker onto a card and sell it (ASUS, MSI, Gigabyte, EVGA, BFG, you name it - everyone). Custom designs for high-end cards are allowed much later.

There are a couple of things however.

First, the GPU chip basically defines what the PCB and the resulting card will look like - if the chip is complex and big as hell (like GT200 was some time ago and like GF100 is right now), it'll emit a lot of heat, and you just have to draw that heat off somehow.
Current air cooling technology can't be significantly improved, which means that you can't make an air cooler MUCH more effective without raising its price (using expensive materials for the radiator, adding heat pipes, using some sophisticated thermal interface like liquid metal) and/or its size (a bigger radiator dissipates more heat than a small one built the same way, and a bigger fan results in better airflow).

There are more advanced cooling solutions, but they are all more expensive and usually less reliable (water, liquid metal, phase change, thermoelectric cooling, etc.), so if you have a massive chip with a high TDP, you've got to either make the board (and cooler) big or tolerate very high temperatures. High temperature definitely lowers the life expectancy of any card, and any small production defect is more likely to result in permanent damage to such a card.
Oh, and high TDP means high power consumption, so you have to use more expensive components like capacitors and make the PCB design more complex. More such components further increase the risk of malfunction (high temperatures and all), like capacitor leakage (unless you use solid ones, which are expensive).

Second, heat isn't everything - there are other factors. If you make a GPU with a 512-bit memory bus, then you have to provide it with enough memory chips and complex circuitry. That's expensive.
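As a quick illustration of the memory bus point (the 32-bit-per-chip width below is an assumption typical of GDDR parts of that era, not a figure from this thread):

```python
# How many memory chips a wide bus forces onto the board.
# BITS_PER_CHIP = 32 is an assumed, era-typical GDDR chip width.
BUS_WIDTH_BITS = 512
BITS_PER_CHIP = 32

chips_needed = BUS_WIDTH_BITS // BITS_PER_CHIP
print(chips_needed)  # 16 chips, each needing its own traces, routing and power on the PCB
```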

The GPU defines the video card's design (and thus its reliability and price), and with high-end models it's too expensive for most companies to design their own cards even after the GPU maker allows it (you can see ASUS with its MARS cards or, earlier, Sapphire with its weird stuff like the Radeon HD 3850 X3, but that's done mostly for the company's image).
So yeah, the direct cause of most malfunctions is some manufacturing defect, and minor manufacturing problems are unavoidable (you can't yield 100% of the product), but poor chip design results in more complex products, which are harder to produce, so there will be more problems (and a higher price).
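A simple way to see the "you can't yield 100%" point is a basic Poisson defect model: the chance a die escapes fatal defects falls off exponentially with die area, so a huge complex chip yields far worse than a small one on the same process. The defect density and die areas below are assumptions for illustration, not real foundry figures.

```python
# Minimal Poisson yield sketch: yield = exp(-defect_density * die_area).
# Defect density and die areas are illustrative assumptions, not foundry data.
import math

DEFECTS_PER_CM2 = 0.5  # assumed fatal-defect density for the process

def die_yield(area_cm2: float) -> float:
    """Expected fraction of dice with zero fatal defects."""
    return math.exp(-DEFECTS_PER_CM2 * area_cm2)

for name, area_cm2 in [("small GPU, ~1.5 cm^2", 1.5), ("huge GPU, ~5.5 cm^2", 5.5)]:
    print(f"{name}: ~{die_yield(area_cm2):.0%} good dice")
# The small die comes out around 47% good; the huge die around 6%.
# Fewer good dice per wafer means a more expensive chip, which feeds straight
# into the card's price before the board maker even touches it.
```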

That's why the first GT200-based GeForces (GTX 280/260) were very unreliable and hugely expensive (I bought a GTX 260 at launch and it burned out in a week; a guy in the store told me a lot of people were returning those cards at the time), while the Radeon 4850/4870 were cheap and relatively durable.

Sorry for such a long post - I just mean that the GPU designer is largely responsible not just for the card's performance, but for its reliability too. Long gone are the days when any Chinese sweatshop could produce its own cut-down design and sell cards cheaply under a no-name brand.
 

manaman

New member
Sep 2, 2007
3,218
0
0
aaaaaDisregard said:
manaman said:
You don't blame your CPU because your motherboard fails, do you? Think of it more that way. nVidia and ATI make the equivalent of a CPU for the graphics card; someone else makes the board and buys the other components that are used. Buying a cheap graphics card would be the equivalent of buying a cheap motherboard, then blaming your CPU manufacturer when it fails. It's not a perfect comparison, but it's the best I can think of that is easy to follow.

The fact is, the chipsets are almost never the failure points in either company's products.

Oh, and I think you will find custom ATI chipsets working smoothly in both the Wii and the 360. You will also find a Sony and nVidia co-designed GPU in the PS3.

Obviously you had one bad experience that you are going to stick with as the model of all things to come, hell or high water, but that doesn't seem to be the case for the rest of the world. You can't even really generalize the products into more expensive or higher quality because of how varied they are cost-wise and power-wise. You can just know that sometimes one will have an advantage over the other. Power-wise, nVidia was leading for a long time; ATI has closed in and has the value end cornered, but that may have all changed in the months since I was last looking at video cards.
You are partially right here. AMD and NVIDIA indeed only design GPUs and then outsource their production to foundries like TSMC. After that they design the PCB (the board to which the GPU, memory and cooler are attached) and outsource production of the cards based on it (and of the PCBs themselves) to the partner companies.
By the way, all high-end video cards are designed by AMD and NVIDIA at first and manufactured by a single contract manufacturer (AFAIK in NVIDIA's case it's Foxconn), at least in the first few months, so you can't blame malfunctions on those who just glue their brand sticker onto a card and sell it (ASUS, MSI, Gigabyte, EVGA, BFG, you name it - everyone). Custom designs for high-end cards are allowed much later.

There are a couple of things however.

First, the GPU chip basically defines what the PCB and the resulting card will look like - if the chip is complex and big as hell (like GT200 was some time ago and like GF100 is right now), it'll emit a lot of heat, and you just have to draw that heat off somehow.
Current air cooling technology can't be significantly improved, which means that you can't make an air cooler MUCH more effective without raising its price (using expensive materials for the radiator, adding heat pipes, using some sophisticated thermal interface like liquid metal) and/or its size (a bigger radiator dissipates more heat than a small one built the same way, and a bigger fan results in better airflow).

There are more advanced cooling solutions, but they are all more expensive and usually less reliable (water, liquid metal, phase change, thermoelectric cooling, etc.), so if you have a massive chip with a high TDP, you've got to either make the board (and cooler) big or tolerate very high temperatures. High temperature definitely lowers the life expectancy of any card, and any small production defect is more likely to result in permanent damage to such a card.
Oh, and high TDP means high power consumption, so you have to use more expensive components like capacitors and make the PCB design more complex. More such components further increase the risk of malfunction (high temperatures and all), like capacitor leakage (unless you use solid ones, which are expensive).

Second, heat isn't everything - there are other factors. If you make a GPU with a 512-bit memory bus, then you have to provide it with enough memory chips and complex circuitry. That's expensive.

The GPU defines the video card's design (and thus its reliability and price), and with high-end models it's too expensive for most companies to design their own cards even after the GPU maker allows it (you can see ASUS with its MARS cards or, earlier, Sapphire with its weird stuff like the Radeon HD 3850 X3, but that's done mostly for the company's image).
So yeah, the direct cause of most malfunctions is some manufacturing defect, and minor manufacturing problems are unavoidable (you can't yield 100% of the product), but poor chip design results in more complex products, which are harder to produce, so there will be more problems (and a higher price).

That's why the first GT200-based GeForces (GTX 280/260) were very unreliable and hugely expensive (I bought a GTX 260 at launch and it burned out in a week; a guy in the store told me a lot of people were returning those cards at the time), while the Radeon 4850/4870 were cheap and relatively durable.

Sorry for such a long post - I just mean that the GPU designer is largely responsible not just for the card's performance, but for its reliability too. Long gone are the days when any Chinese sweatshop could produce its own cut-down design and sell cards cheaply under a no-name brand.
I was keeping it simple for the other poster. Heat concerns are pretty much the number one reason for failure. Doing something as simple as looking at failure rates shows that the manufacturer does make a difference in both graphics cards and motherboards.
 

Alphalpha

New member
Jan 11, 2010
62
0
0
Andy Chalk said:
Alphalpha said:
Wow, dude. That is one serious level of obsession you've got going there. You're entirely wrong, of course, about both my "gross misconception" of what "pull the plug" means and how it applies in this situation, but aside from pointing out that obvious fact I'm not going to debate the point further. Sky is blue, grass is green, etc.

But seriously. I'm impressed.
I'm glad I impressed you, but you make a convincing argument. I now realize that I was entirely wrong, and am quite embarrassed at missing such an obvious fact.

Please accept my sincere apologies.