35 Comments
Richard - Saturday, July 23, 2005 - link
#14 - Uh, why? I have a mobo that's AGP and it supports every Athlon 64 Socket 939 chip out today, INCLUDING dual core (I have one in my system right now overclocked to 2.8GHz). You are wrong about AGP being outdated. The AGP bus wasn't even the limiting factor.
Heron Kusanagi - Friday, July 22, 2005 - link
Anand, what do you think about CrossFire technology now that a system (albeit pre-release) is tested?

I really hope ATI's entry-level R520 comes in a CrossFire flavour at launch. Then those who need an X850XT upgrade can do that, while the hardcore upgraders can get the top-end R520 CrossFire. I would love an R520 CrossFire system, but hopefully ATI doesn't release an entry R520 with greatly reduced pipelines (like how the X800Pro and X850Pro were crippled just by losing 4 pipelines).
daniel - Wednesday, July 20, 2005 - link
#32 - one word: overhead. Something like that is going to take up processing time, which reduces performance. It makes more sense when you're talking about computer clusters, because they're not always going to be the same performance, and anyway those are single-coded tasks. With, say, a 7800GTX, you're talking about 24 pipelines, 16 ROPs, 8 vertex engines - and suddenly it's a lot more complicated than just "execute the next line of code." You need a master telling a slave which lines to render and which to skip (they both have the same thing in memory), and that's going to take away performance; it just doesn't make much sense. SFR is great because each card knows exactly what part of the frame it's going to process and render, and it just does it. There's also some load balancing there - it will change how much goes to each card based on how hard its half is. And if for some reason (Doom 3) SFR is not good (I don't understand why myself), AFR works pretty efficiently, too. NVIDIA put millions into research for SLI over 3 years. I think they got the best plan ;)

EvolutionBunny - Wednesday, July 20, 2005 - link
Some people can really think weird. Both nVidia and ATI try algorithms like even-odd rendering, or splitting the frame to be rendered in two, to give both cards equal work.

There's an even more flexible and possibly faster way to process the scene, and you won't have the trouble of needing two of the same cards (nv) or cards of the same speed. It's called dynamic load balancing (DLB) - commonly used in computer clustering techniques for high-performance computing.
DLB is easy to understand and has the advantage that the superior processor will do more work without losing its speed advantage. DLB doesn't require special algorithms to split up the scene: just start from the beginning, and whichever card is available gets the next line to process.
Just a thought.
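For what it's worth, the scanline-queue idea above can be sketched in a few lines. This is my own toy illustration, not ATI's or NVIDIA's actual scheduler; the card names and speed numbers are made up. Each worker pulls the next unrendered line from a shared queue as soon as it finishes its current one, so a faster card naturally ends up doing more of the frame without any up-front split:

```python
import queue
import threading

def render_frame(num_lines, card_speeds):
    """Toy dynamic load balancing: cards pull scanlines from a shared queue.

    card_speeds maps a (hypothetical) card name to a relative speed;
    a faster card finishes lines sooner and so drains more of the queue.
    """
    lines = queue.Queue()
    for i in range(num_lines):
        lines.put(i)

    done = {name: [] for name in card_speeds}

    def worker(name):
        while True:
            try:
                line = lines.get_nowait()  # grab the next undrawn line
            except queue.Empty:
                return
            # Simulate render time: a faster card spends less time per line.
            for _ in range(10_000 // card_speeds[name]):
                pass
            done[name].append(line)

    threads = [threading.Thread(target=worker, args=(n,)) for n in card_speeds]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return done

result = render_frame(768, {"fast_card": 4, "slow_card": 1})
# Every scanline is rendered exactly once, split between the two cards.
assert sorted(result["fast_card"] + result["slow_card"]) == list(range(768))
```

No per-card split has to be computed in advance, which is the appeal; the cost (and the catch in a real GPU pair) is the synchronization on the shared queue.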
null_pointer_us - Wednesday, July 20, 2005 - link
Anand, I would also like to see some LCD TV articles. There is supposed to be a new generation of them due out this fall based on new technology that significantly beats the current crop. What is the new technology; which displays will have it; and how can I connect them to my PVR's Radeon 9700 Pro? :-)

Anonymous - Wednesday, July 20, 2005 - link
Well #29, 6800U SLI tied on BF2 (1600x1200 4xAA, what a high-end type will probably use unless you've got a 2048 display, which is unlikely), smashed the 7800GTX on Doom 3, beat it on EverQuest 2, tied it on Guild Wars, tied it on Half-Life 2, got beat by the 7800 on Splinter Cell, was slightly faster in UT2004, and crushed the 7800 on Wolfenstein. Clearly you didn't read the Anand review.

Josh - Wednesday, July 20, 2005 - link
#26 - clearly you didn't read the anand review then, because the 7800GTX generally beats the 6800U-SLI setup slightly.

haf - Tuesday, July 19, 2005 - link
#27 - that is a ridiculous statement, that the X800XL does not compete with the 6800GT. Sorry to step on toes, but I buy bang for the buck - and the X800XL gives you 6800GT performance. The 6800GT does Doom 3 better, but the X800XL kills the GT in Half-Life 2.

Cliff - Tuesday, July 19, 2005 - link
Anand,

A new LCD TV just came out this month, and I would love to see a review from you guys on it. It's the Syntax LT32HVE. It's the 32-inch model with their iDEA technology. Until now, only models up to 26 inches had this, and from what some people are saying, it makes a good bit of difference. It's just over $1K, and seems to be a good value for its size and supposed performance. Keep up the good work!
Daniel - Tuesday, July 19, 2005 - link
So it takes a lot of power, but in most cases 2 6800Us well outperform a 7800GTX... I would hardly call it a waste of money; sometimes it's better to be able to buy one now and one later.

Also, a 6800GT will pwn an X800XL in anything OpenGL and is about tied/a little better in D3D games. Also, PCIe is definitely faster and better than AGP: twice as fast in sheer bandwidth (even if it's not used yet), and the fact that it has the same speed up or down has made TurboCache/HyperMemory possible, making cheaper cards that can actually do something (as opposed to Intel integrated trash). And what's this about PCIe requiring more power than AGP? Learn to read - it supplies more power.
jkostans - Tuesday, July 19, 2005 - link
If you are looking for a high-end card now, it would be stupid not to wait for the R520 to come out. If it's better, then get that; if it's worse, then the 7800 will be cheaper. Oh, and CrossFire/SLI is a waste of money no matter how you look at it. The next-gen product will always match or come very close to exceeding the performance for a lot less (not to mention the ridiculous power requirements).

Anonymous - Tuesday, July 19, 2005 - link
It all depends on what market you are in. Nvidia is the top performer if money is not an issue, via SLI or the new GTX. Bang for the buck is ATI... an X800XL outperforms a 6800GT at a lower price.

Strages - Tuesday, July 19, 2005 - link
#22, you make me cry.

SLI has sold well into the millions of units. The 6600GT is the most popular card on the market, and there are many out there with SLI systems based on these.
First, the nForce4 is the most successful chipset launch ever by nVidia, even beating out the nForce2 in both launch partners and volume. Secondly, being the fastest, most feature-rich, and most overclockable chipset on the market (or even ever, comparatively) is far from buggy. NForce4 has dominated the AMD market and has set itself up to dominate the Intel enthusiast market as well.
The 7800GTX is actually starting to sell pretty well for an ultra high-end card.
And last but not least, AGP-based cards use the same amount of power as PCIe-based cards; they just pull power from different places (PCIe supplies 75W through the slot vs. 25W for AGP slots).
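As a back-of-the-envelope illustration of what those slot budgets imply (the helper function and the 70W card figure are hypothetical; the 75W/25W limits are the ones quoted above): the shortfall is exactly what auxiliary Molex/6-pin connectors exist to cover.

```python
def aux_power_needed(card_draw_watts, slot):
    """Watts that must come from auxiliary connectors, given how much
    the slot itself can deliver: 75W for a PCIe x16 slot, roughly 25W
    for a plain AGP slot (figures as quoted in the comment above)."""
    slot_limit = {"pcie": 75, "agp": 25}[slot]
    return max(0.0, float(card_draw_watts) - slot_limit)

# A hypothetical ~70W GPU: the PCIe version can run off the slot alone,
# while the AGP version needs ~45W from an auxiliary connector.
print(aux_power_needed(70, "pcie"))  # 0.0
print(aux_power_needed(70, "agp"))   # 45.0
```

Same card, same total draw; only where the watts enter the board changes.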
Careful ElJefe, your fanboy is showing.
ElJefe - Tuesday, July 19, 2005 - link
AGP is the same speed as PCIe.

Hardly a soul besides a minute percentage of people ever gets another video card for SLI. Really, there has been no point, besides for about 3 months almost a year ago, to even use 2 cards. The nVidia 6600GT was the only thing worth it, but people said, ehhh... screw it, let's get a 6800GT or Ultra and see if I can scrape up some cash for another one; if not, I still get above 70 frames in the games out there at the highest resolutions.

Then the 7800 comes out, blowing away any SLI. And no game really needs it. I mean, no normal person with a monitor under 23 inches needs it.

ATI makes All-in-Wonder cards; also, the 2D text is better; and AGP chipsets use about 40 watts less power without being more than 1-2 frames slower at maximum, compared to actively cooled/noisy nForce4 SLI chipsets, which are also still buggy comparatively.

The new ULi one is king if it comes out to market soon. It runs PCIe and AGP at full tilt.

(AGP is All-in-Wonder territory besides the severely slow X600, which wouldn't pass for a 9800 Pro even.)
dornick - Monday, July 18, 2005 - link
This is the most ridiculous thing I've ever heard. Let's simulate Quad-CrossFire!!! by rendering only an eighth of the frames!!!

d_jedi - Monday, July 18, 2005 - link
I'm willing to wait for R520.. ATI better have something good, though!

sixpak - Monday, July 18, 2005 - link
#18 No, this is irrelevant. Similar to Sony's PS3 stunt: get people to wait on ATI while Nvidia actually has something shipping. Why irrelevant? Well, it's business as usual; don't get so worked up over it. When the product is here for real, then it's time.

geo - Sunday, July 17, 2005 - link
Yes, #15 & #17 are the crux of the matter. Which is the right view? It makes a difference.
Daniel - Sunday, July 17, 2005 - link
#15 - you should read Anand's post more carefully.

"ATI distributed a special driver to their partners prior to the Computex launch that was designed to simulate CrossFire performance, by only rendering odd frames (effectively doubling the frame rate and simulating AFR performance). Although we can't confirm that we also ran with this driver back at Computex, chances are we probably did. But more importantly, the reviews you've seen where a pair of slave cards are used aren't actually testing CrossFire, they are simply simulating the performance of CrossFire by rendering half the frames."
Nothing about rendering even frames here. See, doubling the rate assumes no overhead whatsoever, which is ridiculous. First of all, you are using double the bandwidth to get the memory info to both cards. Then, AFR isn't as simple as saying "you render odd, you render even" - it has to go through careful load balancing to figure out what the overall frame rate is, where to render and skip, and then what is odd and even. Easier said than done. Then this all has to be sent across the, er, DVI/whatever weird interface they call it and put back together into a single coherent video stream. All that needs processing time/bandwidth and will take away more than a few FPS. That's why double is not realistic - not on SLI, not on CrossFire. ATI just doubling the frame rate is lying.
reactor - Sunday, July 17, 2005 - link
Well, I agree with you; I don't think anyone in their right mind should spend $350 on an AGP card. I meant more along the lines of those who are on old tech already and looking for one last upgrade just to keep them going till they can get the new system. Remember, most people still on AGP will have to upgrade to S939 from what they are on. So it's not just buying a new motherboard and GPU; it's also a CPU, maybe even more. With these ULi boards just reviewed, that may be a better upgrade path for those trying to migrate.

Basically I meant ATI needs all the help they can get right now, and keeping the AGP market open may be beneficial to them.
Anonymous - Sunday, July 17, 2005 - link
Why does everyone misunderstand what is happening here? The drivers are sending even frames to one card and odd frames to the other card. The cards are rendering every single frame. However, because there is no compositing chip, only every other frame can be sent to the monitor. That's all.

Creathir - Saturday, July 16, 2005 - link
#13 - They def. do have their work cut out for them; however, I disagree with you on the AGP market. You would literally be INSANE to buy a $350+ AGP card, which is outdated technology. That would be like going to the dealer and DEMANDING to pay $20,000 for a Gremlin. It would be stupid to waste that much money and not upgrade your motherboard to a technology that has been out for a year+. (A year in the PC world is HUGE.)
No... I do not agree that the "AGP" market would keep someone afloat. ATI def. needs something more than the hope that no one upgrades their PC in order to put them on the same level they were on a year ago.
- Creathir
reactor - Saturday, July 16, 2005 - link
Fanboys are funny to watch...

ATI really needs to step it up here; I completely agree that they need to launch R520 and CrossFire at the same time. Added to that, if they don't paper launch, I see them making a big comeback. But they have their work cut out for them. One advantage they do have is saying they will produce AGP cards, which is still a big market. If they play it right, they will gain back most of, if not all that they lost to nvidia (sales, market share or w/e).
time will tell..
at80eighty - Saturday, July 16, 2005 - link
#9 nature -

See, son, the thing is - when you want people to *believe* you don't give a shit, it helps when you don't yell caffeine-fuelled gems such as "SO F U C K Y O U"
/thought i'd throw you a little freebie there..
Creathir - Saturday, July 16, 2005 - link
I'm no NVIDIA fanboy, but when only one company has come back and beat ATI down like they have, I become slightly impressed. I had a 9700 Pro until March of this year, and when it came around to upgrading to Athlon 64, ATI had NOTHING compared to the performance SLI was delivering. NVidia has screwed things up in the past *cough*GeForce FX*cough*, but they had their superb chipset to keep them afloat. Now they have dealt this massive next-gen card (the 7800) that is causing ATI to retool some things. This is quite amazing, considering a year ago ATI was (IMO) in control of the graphics market. I certainly am no fanboy, except I want the best, and right now, ATI is not looking too hot.

- Creathir
Daniel - Friday, July 15, 2005 - link
ATI fanboys... they're so desperate they're putting 6600GTs in SLI!

nature - Friday, July 15, 2005 - link
I love how everyone that posted thinks ATI is going to be bought up even though they only lost 20 million in profit last quarter OUT OF 550 million. Sorry to burst your bubble, fanboys, but just because a company does badly in one quarter does not mean it will be sold to, quote, "Creative/Intel/AMD". Get realistic and look at the facts, fools.

Competition is good, so don't rant and rave about a couple months of delay on R520 and CrossFire; when it finally does come out, NVIDIA will probably drop prices on SLI and the 7800/7600 series.
And furthermore ATI is just showing that they want the best friggin product that they can get to the public, with no bugs.
Please fanboys bitch about ATI all you want but the reality is no one listens to you anyway.
And just so you don't call me a fanboy, I have 2 6600GTs in SLI. SO F U C K Y O U
Daniel - Friday, July 15, 2005 - link
#6 - that's not the point. Remember the "exclusive benchmark" showing the massive Doom 3 performance increase when CrossFire was "launched"? Well, I guess when they got that exclusive benchmark they forgot to mention it was fake.

http://www.anandtech.com/video/showdoc.aspx?i=2432...
"Even at this early stage, performance and stability were both impressive. The system that we were running had just been assembled hours earlier and didn't crash at all during our testing. In fact, the system was so new that the motherboard manufacturer who let us test with their hardware hadn't even seen it running - it was their first time as well as ours.
The performance of the solution was equally impressive; at 1024x768, the dual GPU CrossFire setup improved performance by 49%. At 1280x1024 and 1600x1200, the performance went up by 72% and 86% respectively. We had our doubts that ATI would be able to offer performance scaling on par with what we've seen on NVIDIA's SLI, but these initial numbers, despite being run on early hardware/drivers, are quite promising. "
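For what it's worth, those quoted gains can be restated as scaling efficiency: the measured speedup as a fraction of the perfect 2x that two GPUs could give. The helper below is just illustrative arithmetic on the numbers in the quote, nothing more:

```python
def scaling_efficiency(gain_percent, num_gpus=2):
    """Speedup over a single card, as a fraction of ideal linear scaling."""
    speedup = 1 + gain_percent / 100
    return speedup / num_gpus

# The quoted CrossFire gains of 49%, 72%, and 86% work out to roughly
# 74%, 86%, and 93% of ideal 2-GPU scaling: efficiency climbs as higher
# resolutions make the GPUs, not the CPU, the bottleneck.
for res, gain in [("1024x768", 49), ("1280x1024", 72), ("1600x1200", 86)]:
    print(res, round(scaling_efficiency(gain), 2))
```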
#7, if they get bought out, then there's no more real competition till the new company gets its act together, and we won't see such dramatic performance increases on cards. And I hope they get flamed plenty on the forums ;)
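(By the way, the scaling percentages quoted above are just frame-rate arithmetic; here's a minimal sketch, with made-up FPS numbers picked only to reproduce the 49% case - none of these frame rates come from the article:)

```python
def scaling_percent(single_fps: float, dual_fps: float) -> float:
    """Percent improvement of a dual-card result over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical frame rates; only the formula mirrors the quoted math.
print(scaling_percent(50.0, 74.5))  # 49.0 -> matches the 1024x768 claim
```

So "49% scaling" just means the dual setup pushed roughly 1.49x the frames of a single card at that resolution.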
kleinwl - Friday, July 15, 2005 - link
#2 Ahhh... I guess I misunderstood. I thought the drivers were shipping even frames to one card and odd frames to the other card... thus creating the cross-fire. If all they were doing is dropping every other frame so they could "double" the frame rate... that is total B.S. and they deserve the pummeling they are receiving in the forums. I wish NVIDIA would get the 7800GT and midrange cards out so ATI would really start to hurt... just to enforce the need to "keep up". Still and all, I think I understand that if they have such low yields that only 10 good chips come off a $3000 wafer, there is no chance that they can sell anything.
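(For the record, the even/odd scheme I had in mind is alternate frame rendering, AFR: frame n goes to card n mod 2. A toy sketch of that dispatch - the GPU class is a made-up stand-in, not anyone's actual driver code:)

```python
# Toy alternate-frame-rendering (AFR) dispatcher: even-numbered frames go to
# card 0, odd-numbered frames to card 1. FakeGPU is invented for illustration.
class FakeGPU:
    def __init__(self, name: str):
        self.name = name
        self.rendered = []

    def render(self, frame: int) -> None:
        self.rendered.append(frame)

gpus = [FakeGPU("card0"), FakeGPU("card1")]
for frame in range(6):
    gpus[frame % 2].render(frame)  # AFR: alternate whole frames

print(gpus[0].rendered)  # [0, 2, 4]
print(gpus[1].rendered)  # [1, 3, 5]
```

Both cards render real, complete frames - which is exactly why dropping every other frame instead would be a cheat.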
gwwfps - Friday, July 15, 2005 - link
In the HKEPC article, they actually pointed out that the cards were just simulating the performance of real CrossFire mode. They stated that the article is just to give people an idea of what the actual performance might be.
Creathir - Friday, July 15, 2005 - link
This is rather interesting... The once-mighty ATI has really managed to put themselves at quite a disadvantage. I would say NVIDIA really socked 'em in the gut with the G70. If the release of ONE card is causing them to hold off on the release of an entire chipset/technology, and given the current yield issues they are having, ATI really is in for a hurt. I would not be surprised to see someone like Creative/Intel/AMD pick up ATI in the near future. It would not surprise me at all...
- Creathir
Tim - Friday, July 15, 2005 - link
#3 -- I got a little button happy. Forgot to fill in the form fields.
Anonymous - Friday, July 15, 2005 - link
It's all shenanigans. If you buy one of the new R520 "master" boards and pair it with anything other than an exact copy, you are wasting your money. Since their system defaults to the smallest RAM size and each card is rendering at its native speed, the faster card will wait forever on anything even remotely slower than itself. And if you are doing AFR, you will need your Dramamine to play anything short of FreeCell. Moral of the story: unless you just bought a new X800 XT or X800 XT Platinum, you are going to have to buy at the very least a "master" card and an equivalent "slave".
(yes, yes, I know that in order to use SLI you must buy two brand new SLI cards. But ATI seems to want to make people think that they will be able to create blazing framerates with their current 9800 and a new "master" card. But that just isn't realistic.)
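(One simple way to put numbers on the mismatch problem: if the two cards have to stay in lockstep, the pair is gated by the slower one. A back-of-envelope model - all the frame rates and the pairing rule itself are my own invention, not anything ATI has published:)

```python
# Back-of-envelope: two cards alternating frames in lockstep can't outrun
# twice the slower card's rate - the fast card stalls waiting for its partner.
# All numbers are invented for illustration.
def paired_fps(fps_fast: float, fps_slow: float) -> float:
    return min(fps_fast + fps_slow, 2.0 * fps_slow)

print(paired_fps(100.0, 100.0))  # 200.0 -> matched cards scale fully
print(paired_fps(100.0, 40.0))   # 80.0  -> gated by the slow card
```

Pair a master card with an old 9800 and under this model you're paying for the fast card's idle time.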
Daniel - Friday, July 15, 2005 - link
#1, what do you want those drivers for? They're lies! Rendering every odd frame? It's a trick, apparently not one beneath ATI, to hype CrossFire - I'll believe it when I see it. Clearly R520 wasn't so well designed; they tried to do too much and they've had to tape it out 3 times. Still talking about a launch in September? Seeing ATI's recent behavior, I doubt stuff will be available immediately for cheap, though anything's better than the X800/X850 XT PE debacle. They've been beaten to market by NVIDIA; now let's see what they can come up with... a year late. When I see Doom 3 / Half-Life 2 numbers, then I'll decide...
kleinwl - Friday, July 15, 2005 - link
ATI should release the drivers rather than making us all wait for their "master" cards. The only thing they are doing by waiting is letting NVIDIA make more inroads into their market share. At this point, it's not about the fact that they can boost their profit by $50 a card for the "master" CrossFire card... but rather keeping their market share from eroding further against NVIDIA. A freebie like these drivers would go a long way toward keeping their loyal customers.