Sir Scoops-a-lot himself, Fuad Abazovic at The Inquirer, posted a news story saying that on Monday AMD and ATI will propose a merger to their shareholders. We've heard the AMD/ATI rumors for quite a while now, and if I knew for sure that it was going to happen I'd leave some veiled hint in an article that it would, but honestly I'm not sure. The idea of it happening, however, is quite intriguing and in my opinion would be tremendous for AMD.
As soon as you bring up the idea of a merger, everyone immediately asks "why not NVIDIA?" The problem with NVIDIA is that they are far too Intel-like a company (see: my dinner table discussion from my Core 2 review).
The other important thing to consider is that if AMD and ATI merged, a very important market for NVIDIA chipsets would cease to exist, forcing NVIDIA to turn to Intel for scraps off the table, which the link above explains would not be a healthy lifestyle choice for NVIDIA.
Strategically it makes sense: if GPUs are becoming more CPU-like with each generation, then there may come a time when either a CPU company drives a GPU company out of business, or vice versa. By combining talents early, you create a company that would be very well prepared for the convergence of the CPU and GPU.
Don't look at the idea of a merger as AMD having a closer chipset partner, but imagine what else could come from it.
Imagine a Socket-AM2 GPU, with incredibly low latency access to a Socket-AM2 CPU. A huge strength of just about any gaming console (see: PS3) is the extremely high bandwidth, low latency interconnect that exists between the CPU and GPU.
Imagine a stripped down graphics core stuck on an Athlon 64 processor die; all of a sudden integrated graphics wouldn't be so terrible. Taking this "what-if" even further, it may just make Intel produce better integrated graphics solutions. And that would make my good friend Mark Rein quite happy ;)
But more importantly, imagine a GPU company that was no longer fabless. If ATI were able to gain a manufacturing advantage over NVIDIA, you'd see more transistors, clocked higher and running cooler, in ATI GPUs than in NVIDIA's at any given product cycle.
Simply merging the two companies wouldn't announce the death of NVIDIA or anything like that, because both would have to work extremely hard to actually overcome their weaknesses and make some of these fantasies happen. But if they could, well, that would be reason enough to merge.
Update from the Lab: Derek is hard at work on a Prey GPU performance comparison, Josh Venning has just started on our long overdue silent GPU roundup, and I've been testing Intel's Core 2 Duo E6400 & E6300 non-stop. Between benchmark setups I've also been working on the HD-DVD/Blu-ray piece and preparing for my X2 EE/EE SFF article. Tomorrow I'll officially start some of the performance per watt testing for the X2 EE/EE SFF piece - thanks for your feedback on what you'd like to see in that article, it's helped tremendously.
Take care and have a great weekend :)
8 Comments
Rike - Saturday, July 22, 2006 - link
OK, let's go dreaming: a dual socket AM2 mobo with HTT 3.0 running a quad-core K8L in one socket and a dual core ATI GPU in the other talking to each other over the HT link. And as long as I'm dreaming, I'd have lots of software that could really use my fun hardware!
Well, if the merger goes through, that could certainly change the landscape in the tech world yet again.
BigLan - Monday, July 24, 2006 - link
But where is the memory? While the GPU would have great access to the CPU, it'll have to deal with system memory, which is slower (1066MHz DDR2 vs. 1600MHz+ GDDR3/4) and has much longer traces. For anything past mid-range cards this will be a deal breaker. The solution would either be a memory daughter card for the GPU, which would complicate designs and have a very small market share imo, or embedded on-die memory, which ATI tried with mobile parts a few years back. Neither of those sound enticing to me.
Jjoshua2 - Sunday, July 23, 2006 - link
Looking forward to the EE article; I want to see if it's worth the extra cost to buy EE or just a higher model non-EE when overclocking.

RichUK - Saturday, July 22, 2006 - link
Change is good; this would mark the beginning of a big evolutionary leap for both companies.

gnumantsc - Saturday, July 22, 2006 - link
Seems like AMD is really going to buy ATI with this link: http://www.globeinvestor.com/servlet/WireFeedRedir... Seems like it should not be far off, and hopefully Intel won't be the #1 GPU seller anymore with their crippleware of a videocard.
tuteja1986 - Saturday, July 22, 2006 - link
This could either turn out good for ATI or we are going to have a monopoly in the high end GPU market. I am really worried about what plans AMD has for ATI's high end GPU sector.

gnumantsc - Saturday, July 22, 2006 - link
I remember reading somewhere on the net that Intel did block ATI from making any future chipsets for Intel, so maybe the rumours are true?

microAmp - Friday, July 21, 2006 - link
Your 1st link in the Core 2 Duo points back to the same article instead of the previous one: http://www.anandtech.com/cpuchipsets/showdoc.aspx?...