
  • ahtoh - Wednesday, July 6, 2016 - link

    AMD is dead
  • zmeul - Wednesday, July 6, 2016 - link

    gimping the card for the win
  • vladx - Wednesday, July 6, 2016 - link

    AMassiveDisaster
  • Despoiler - Wednesday, July 6, 2016 - link

    There's no gimping in the primary fix. I guess you aren't technically competent enough to understand how they're fixing the issue.
  • Morawka - Wednesday, July 6, 2016 - link

    it's not a fix; it's still out of spec, and it can still receive an ITC ban for being out of spec
  • wira123 - Wednesday, July 6, 2016 - link

    did you even read the da** thing?
    performance was improved by a few fps vs the Crimson 16.6.2 driver
  • yeright - Wednesday, July 6, 2016 - link

    because nVidia doesn't have issues?
    http://www.tomshardware.com/news/nvidia-vive-displ...
  • miribus - Wednesday, July 6, 2016 - link

    Everyone, all of them, releases technical goofs like these at some point, and more than once. This one is pretty minor; even at slightly lower clockspeeds it is still a good value card. Finally, the only reason you would "gimp" the card and choose option #2 is if you really wanted to impress everyone by maintaining a fully PCIe-spec-compliant system. Good for you?
  • nevcairiel - Wednesday, July 6, 2016 - link

    To be fair, there are two parties involved in a DP connection, so without deep technical analysis, shoving the entire blame onto NVIDIA and none onto the Vive seems rather one-sided.
  • BurntMyBacon - Wednesday, July 6, 2016 - link

    You are right that without further evidence you can't come to any conclusions. However, (right or wrong) it's pretty easy for people to place at least part of the blame at nVidia's feet, given that the same issue doesn't exist for its competitor.

    Note: it doesn't have to be fully nVidia's fault for someone to make a valid claim that they have some issues. There are other companies that have been completely in the wrong (granted, a different situation), but the rest of the industry still had to change to match them. In the end, who is said to have issues?
  • vanilla_gorilla - Wednesday, July 6, 2016 - link

    >In the end, who is said to have issues?

    Impossible to say without knowing details. If NVIDIA is respecting the specification exactly and everyone else is just a little more forgiving, is NVIDIA "wrong"?
  • Morawka - Wednesday, July 6, 2016 - link

    Nvidia's DisplayPort is 1.4; the RX 480 only has 1.3. This is probably why it's not working. The Vive probably requires a certification for it to recognize the device, and since the 1.4 certification process is not done, the Pascal series can't provide one. HTC needs to issue a hotfix and not require the certification.
  • bill4 - Wednesday, July 6, 2016 - link

    Then why doesn't Nvidia say that? When contacted by the media, Nvidia has refused to comment on this Vive/DisplayPort issue.

    This leads me to believe they screwed up big time somehow, and know it. However, being Nvidia, they don't have to answer to anybody or provide customer service. Their legions of fanboys will back them regardless and continue to overpay for their products.
  • noealo - Wednesday, July 6, 2016 - link

    Any product with a dual HDMI/DP connector probably just has a DP>HDMI adaptor inside. It is entirely possible that the Vive DP>HDMI adaptor has compatibility issues of its own.

    A bigger test is how many DP>HDMI adaptors work with the 1080/1070, as there are active and passive types, and some require special non-standard support from the card. It is possible that with the newer DP standard support in the 1080/1070, some out-of-spec adaptors no longer work.
  • bill4 - Wednesday, July 6, 2016 - link

    AMD cards work fine
  • Salbrox - Wednesday, July 6, 2016 - link

    But they managed to improve performance on the card; is this not better?
  • Michael Bay - Wednesday, July 6, 2016 - link

    >managed
    Oh please. This shit is planned by marketing months ahead of release.
  • Murloc - Wednesday, July 6, 2016 - link

    the improvement would have come anyway if it's due to drivers.
  • smilingcrow - Wednesday, July 6, 2016 - link

    Advanced Managed Disasters!
  • prisonerX - Wednesday, July 6, 2016 - link

    Your brain is dead
  • Geranium - Wednesday, July 6, 2016 - link

    I wonder how much power those 6-pin-less GTX 950s consume?
  • AS118 - Wednesday, July 6, 2016 - link

    Don't be a fanboy; the 480s are sold out everywhere online, which is good for ALL gamers. Hopefully AMD gains enough market share that NVidia stops being as much of a near-monopoly.
  • joeycagle - Wednesday, July 6, 2016 - link

    It appears to be sold out everywhere. I came across a pre-order on Amazon, however, that said it wouldn't be available until July 13. It was one of the few listings that wasn't at a rip-off price (actually at MSRP, rather than the 300-some-odd dollars they're mostly being sold for there), so I went ahead and pre-ordered. That was yesterday. Today, I found out it has actually shipped and will be here tomorrow.

    So if you want one, go to Amazon for a pre-order. Just be sure you're not overpaying. You may find that the next day it's actually being shipped right then.
  • bill4 - Wednesday, July 6, 2016 - link

    Someone better tell their stock, which is up like 300% over the past year.
  • GTRagnarok - Wednesday, July 6, 2016 - link

    #1 sounds good to me. The 6-pin connector is rated pretty conservatively, isn't it? There's plenty of power headroom.
  • T1beriu - Wednesday, July 6, 2016 - link

    The 6-pin on the RX 480 is actually routed and used as an 8-pin. It can handle 250W.

    The 6-pin feeds the vcore, and the PCIe bus feeds the memory.
  • KateH - Wednesday, July 6, 2016 - link

    Could you clarify what you mean by "routed and used as an 8-pin"? I've looked at bare PCB shots for the RX 480 and I see a 6-pin connector with 6 leads soldered to 6 through-holes on the board. The limiting factor for power delivery in this case is that each wire going from the PSU to the GPU has a limit to the current it can safely carry, as does each pin inside the connectors on the PSU cable and GPU.

    I guess I could see a 6-pin connector handling 250W @ 12V, but that would be roughly 20A per pin, which would have to mean 16 or 14-gauge wires from the PSU to the GPU and a lot of trust put in the power connectors on the GPU and PSU...
  • KateH - Wednesday, July 6, 2016 - link

    I did my math wrong on those amperage ratings. Still. Even if the connector on RX480 GPUs is rated for 250W, the 6-pin connectors on power supplies absolutely cannot be guaranteed to be.
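
    Redoing the arithmetic, here's a quick Python sketch (I'm assuming all three 12V positions on the 6-pin are actually populated, which isn't guaranteed on every cable):

        RAIL_V = 12.0      # nominal 12V rail
        LIVE_WIRES = 3     # 6-pin with all three 12V positions wired
        for watts in (75, 150, 250):
            amps_per_wire = watts / RAIL_V / LIVE_WIRES
            print(f"{watts}W -> {amps_per_wire:.1f}A per 12V wire")
        # 75W -> 2.1A, 150W -> 4.2A, 250W -> 6.9A per wire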
  • blahsaysblah - Wednesday, July 6, 2016 - link

    A 6-pin is wired as 3x 12V lines, 2 grounds, and 1 sense pin, where the 3rd 12V is not supposed to be connected. An 8-pin uses that 3rd 12V (same location) and adds a 2nd sense pin and a 3rd ground. The RX 480, because its power controller can detect a no-power condition, does not need the sense pin. So they actually wired their port to have the 3rd 12V line active, and turned the sense pin into a ground (which is what it is on the PSU side). You can safely use a 6-pin there and pull "150W" 8-pin power as long as the power supply has enough overall rating for it.

    Technically, ATX12V version 2.2 (~2006; 2.4 is 2013) required the pins to switch to HCS (High Current Series), so if your motherboard/PSU follows that, they can drive 9A per wire (assuming they don't cheap out on wire). It was 6A per wire before. There are also 11A HCS Plus pins that may be used by high-end motherboards and power supplies.

    Side note: the ATX 20-pin has one 12V wire and the 24-pin has two. The whole point of the +4 was to safely get more power to the PCI-E bus via the motherboard.
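
    A quick Python sketch of what those per-pin ratings imply (the 6A/9A/11A figures are the ones quoted above; whether a given PSU's wires and terminals actually meet them is a separate question):

        RAIL_V = 12.0
        LIVE_WIRES = 3   # 3rd 12V position wired up, as described above
        for series, amps in (("standard", 6.0), ("HCS", 9.0), ("HCS Plus", 11.0)):
            print(f"{series}: up to {RAIL_V * amps * LIVE_WIRES:.0f}W across three 12V wires")
        # standard: 216W, HCS: 324W, HCS Plus: 396W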
  • blahsaysblah - Wednesday, July 6, 2016 - link

    EDIT: For example, I had no idea about any of this before the RX 480. Back then, I bought wire matching what came with my PSU when I created my own set of custom power cables. I bought 18AWG tin-clad copper (16 strands of 30AWG) rated for 300V/7A. So even using my lower 7A rating, that 6-pin (assuming the RX 480 port isn't the weak link) can pull 3 x 12V x 7A, or 252W. However, the same technically applies to the power supply: the actual internal traces and contacts only have to support 150W, or 50W/4A per wire. The pins may be rated for 9A, but nothing mentions a power requirement other than 75W/150W. Anyway, there is no technical reason the 6-pin cannot safely provide 150W of 8-pin power. The 3rd 12V line is there in the spec, just not used. And the missing 3rd ground is more of the liability (which they fixed, because they don't need the sense pin).
  • nevcairiel - Wednesday, July 6, 2016 - link

    That is assuming the power supply actually has the 3rd 12V connected there. If it's part of a 6+2 setup, then it will, but if it's a cheaper/older PSU with only a pure 6-pin, maybe it doesn't?

    The key point is: even though you could use that kind of weird setup, it's not following the specification, and that's just bad design for "marketing" reasons like "we can't have an 8-pin on there, it will make our card look inefficient". Guess what: it still is inefficient.
  • bill4 - Wednesday, July 6, 2016 - link

    The 480 is rated at 150 watts (sometimes exceeds that a bit, to be fair); the 1060, which will perform similarly, is rated at 120 watts. Not a big difference.

    Then you come to the fact that the 480 has more teraflops (5.8 vs 4.4), a 256-bit bus vs 192-bit, and comes with much more VRAM. You could probably argue the Nvidia card is the inefficient one.

    However, efficiency is a nonsense made-up thing from Nvidia fanboys anyway. It doesn't matter how many watts a card uses (within reason; all video cards under 300 watts are fine). Two things matter: price and performance. If you want to use fewer watts, go buy an IGP. It uses way fewer watts than an Nvidia card.
  • bill4 - Wednesday, July 6, 2016 - link

    At least the 960 Strix exceeded the PCI-E slot spec by a lot (probably a lot more Nvidia cards did; it was just one of only a few Nvidia cards ever tested), so Nvidia did the same thing, and their fanboys made sure no stink was raised back then. It was 1000% ignored. So you can't say anything now.
  • Daniel Egger - Wednesday, July 6, 2016 - link

    > A 6-pin is wired as 3x 12V lines, 2 grounds, and 1 sense pin, where the 3rd 12V is not supposed to be connected.

    It's not supposed to be *used*; in many cases the connectors are actually 6+2-pin for higher flexibility. In that case you'll always have the third 12V pin. However, what goes in must come out too, and there the 3rd GND would be regrettably missed...

    Still, I don't see any reason why you couldn't reliably draw more than 75W from that connector; cable lengths are reasonably short, and both wires and plugs are much more confidence-inspiring than routing the allowed 75W through traces on the motherboard.
  • BurntMyBacon - Wednesday, July 6, 2016 - link

    @Daniel Egger: "Still I don't see any reason why you couldn't draw more than 75W reliably from that connector; cable lengths are reasonable short and both wires and plugs much more confidence inspiring than routing the allowed 75W through traces on the mainboard."

    Given the build specifications used by most quality PSU manufacturers, you can. Bargain-basement PSU manufacturers and budget OEM builds sometimes toe the line with respect to what they design their PSUs to handle.

    With respect to motherboard traces: because wide, flat traces present a lot of surface area for their volume, they shed heat well and are much more capable of carrying a load than most give them credit for. While their cross-section is extremely small compared to a wire, their surface area is not. Think about the burnouts you've seen. Usually you'll see charring at connectors, at devices on the board, or at their respective solder joints, but how often do you see the traces themselves get burnt? Have you ever seen traces burn when the connector or chip did not?
  • BurntMyBacon - Wednesday, July 6, 2016 - link

    @KateH: "Still. Even if the connector on RX480 GPUs is rated for 250W, the 6-pin connectors on power supplies absolutely cannot be guaranteed to be."

    True. You can, however, be sure that if your connector is a 6+2-pin that pigtails the extra two pins off the ones next to them, the connector will have no issue, as it is already designed to carry the required current over the six wires that will be connected.
  • bill4 - Wednesday, July 6, 2016 - link

    Awesome, I have a 6-pin like you describe, with the two extra pins for the 8-pin on the side...
  • Flunk - Wednesday, July 6, 2016 - link

    Any PSU with a 6+2-pin connector IS rated for 150W, so this only applies to pure 6-pin connectors.
  • BlueBlazer - Wednesday, July 6, 2016 - link

    Wrong. Watch https://www.youtube.com/watch?v=plC7tOYIqBw&t=... from 54 minutes onwards: not all of the power came from the 6-pin connector. Tom's Hardware also retested and reanalyzed the graphics card http://www.tomshardware.com/reviews/amd-radeon-rx-... and found that about half of its VRM phases draw power from the PCI Express x16 slot instead.
  • bill4 - Wednesday, July 6, 2016 - link

    What are you talking about? AMD is shifting a bit of the load from the slot to the plug (they only need to shift a few watts). That's all. So if before, under heavy load, it was like 80W from the slot and 70W from the plug, now it will be reversed and the slot will be in spec.
  • KateH - Wednesday, July 6, 2016 - link

    Generally, yeah. The official limit is largely about maintaining a margin of safety that accounts for wire gauge and connector quality; keeping power under 75W (6.25A @ 12V) means that using a cheap PSU with 22AWG leads on the PCIe power lines, or an older supply with a Molex/SATA-to-PCIe adapter, won't intrinsically create an unsafe situation.

    Assuming good connectors, common PSUs with 20 or 18AWG 12V supply leads ought to be able to safely push north of 100W through the 6-pin, and high-quality PSUs with 16AWG leads could safely do a full 150W, *PROVIDED* the connectors on both the GPU and supply line are of good quality and making good contact. This is the rub, and the thing that can burn up people's GPUs.
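
    To put rough numbers on that (a Python sketch; the per-conductor current values are conservative rules of thumb I'm assuming for short PSU leads, not anyone's datasheet):

        RAIL_V, LIVE_WIRES = 12.0, 3
        rough_amps = {22: 3.0, 20: 5.0, 18: 7.0, 16: 10.0}   # AWG -> assumed safe continuous amps
        for awg in sorted(rough_amps, reverse=True):
            print(f"{awg}AWG leads: ~{RAIL_V * rough_amps[awg] * LIVE_WIRES:.0f}W through the 6-pin")
        # 22AWG ~108W, 20AWG ~180W, 18AWG ~252W, 16AWG ~360W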
  • hedon - Wednesday, July 6, 2016 - link

    When there's no rival to NVidia or Intel, we're going to be close to dead.
  • vladx - Wednesday, July 6, 2016 - link

    Bullshit. Even with AMD gone, they would still need to improve to sell new products over their old ones. Not that I want AMD gone; like I said elsewhere, someone like Apple buying AMD would be great. Apple + AMD is the perfect match.
  • Gich - Wednesday, July 6, 2016 - link

    Sure, they have to improve, but they can give out just the bare minimum... look at how much Intel has improved since Bulldozer!
  • Scali - Wednesday, July 6, 2016 - link

    You mean stuff like this? http://www.pcworld.com/article/3050466/hardware/ea...
    Yeah, pretty impressive indeed!
  • atlantico - Wednesday, July 6, 2016 - link

    $4,115
  • Scali - Thursday, July 7, 2016 - link

    Price is just an arbitrary number. Innovation is about developing new technology, and this CPU is proof that Intel continues to develop and improve its technology.
    Someone said to "look at how much Intel has improved since Bulldozer"; well, there you have it.
  • barleyguy - Wednesday, July 6, 2016 - link

    As another data point, the original Pentium came out at $1,200 (a lot more if you adjust for inflation) for a chip that at first had very little performance advantage over the top 486s, and it also had a major bug. That's where we'll be back to if Intel doesn't have any competition. The NVidia situation would be similar.

    (I build at least one AMD-based computer a year, either for myself or a friend, and am also an AMD stockholder. I do both of these things partially to support competition.)

    (I build at least one AMD based computer a year either for myself or a friend, and am also an AMD stockholder. I do both of these things partially to support competition.)
  • Scali - Thursday, July 7, 2016 - link

    The Pentium was a technological quantum leap over the 486, offering a fully superscalar integer pipeline and a pipelined FPU to boot. Well-optimized integer code can run almost twice as fast on a Pentium as on a 486 at the same clockspeed; FPU code can be 3-4 times as fast.
    Not to mention that Pentiums also clocked much higher than 486s because of the deeper pipeline design.
  • bill4 - Wednesday, July 6, 2016 - link

    Samsung was looking at buying AMD... I think that'd be better. Apple would have no use for AMD outside of being a parts supplier. All AMD's business is nothing to AMD.

    BTW, Nvidia is already gouging people; look at the "Founders Edition" crap, and the fact that they ALREADY charge $50-100 more than AMD for the same performance in most cases. Because AMD has such small market share, Nvidia can already act like a monopoly to some extent. It will only get much worse if AMD goes out of business. Luckily, AMD stock is up 300% this year.

    Anyway, it's what Nvidia fanboys wanted. So they will enjoy paying more to make Nvidia richer.
  • Michael Bay - Friday, July 8, 2016 - link

    AMD stock is literally pond scum at this point. Nobody cares if it goes up a little. Then again, if "AMD's business is nothing to AMD", that is how it should be.

    And they wish they invented this founders whatever thing. Unfortunately, you first have to have something people will want to buy.
  • hedon - Wednesday, July 6, 2016 - link

    I mean that when there's no competitor, or less competition, the leader puts less effort into improvement/innovation.
  • KateH - Wednesday, July 6, 2016 - link

    Ouch. Even if the performance losses are minimal with the new driver, this is putting a big damper on the launch of what otherwise looks like a good card. And this all could've been avoided if AMD had put a dang 8-pin power connector on the reference design. C'mon, what were they thinking going with 6 leads? I think the last time I saw that on an enthusiast card was the 8800 GT.
  • Weyoun0 - Wednesday, July 6, 2016 - link

    AMD was pretending to be power efficient, but that has always been their weakness, so it came back to bite them in the ass.
  • KateH - Wednesday, July 6, 2016 - link

    Hopefully the OEMs are scrambling to revise their PCBs to use 8-pin power. It would be a very easy thing to do; astounding that this slipped past QA. My guess is that management forced it on the engineers :(
  • T1beriu - Wednesday, July 6, 2016 - link

    The 6-pin used on the RX 480 is pimped out by AMD to be used as an 8-pin. It can feed 250W to the GPU just by itself.
  • KateH - Wednesday, July 6, 2016 - link

    I have serious concerns about this statement being put out there... 6 pins is 6 pins, period. Now, perhaps AMD sourced GPU-side ("female" plug) power connectors that are rated for 250W, and perhaps AMD ensured the trace width between the connector and the VRMs is sufficient for 250W, but AMD cannot guarantee that any power supply used with an RX 480 is designed to handle any more than the 75W spec being pushed through a single 6-pin lead.
  • bill4 - Wednesday, July 6, 2016 - link

    Power efficiency is some fake made-up thing from Nvidia fanboys, because Nvidia has a lead in that area. It doesn't matter, unless you are a grandma or a nun. Oh no, a 480 draws 30 watts more than a 1060... who gives a flying sh*t.

    The funny thing is all the Nvidia fanboys talking about power efficiency are overclockers. That's the worst thing you can do for your "power efficiency" (such a fake term).

    Also, the 480 has 5.8 teraflops at 150 watts; the 1060 has 4.4 at 120. Plus the 480 has a larger memory bus, more VRAM, more TMUs, etc. So technically AMD is more efficient.
  • Scali - Thursday, July 7, 2016 - link

    Almost correct :)
    The 1060 is more efficient because it gets better performance than the RX 480 while having a smaller bus, less VRAM, fewer TMUs, and a lower TDP.
  • blzd - Sunday, July 10, 2016 - link

    "Power efficiency doesn't matter".

    lol nice try AMD.
  • artk2219 - Tuesday, July 12, 2016 - link

    Nope, someone is forgetting the GTX 200, 400, and 500 series, with their blow dryers, overheating, and high power consumption. Granted, Nvidia learned from those designs and introduced the wonderful cards we have today. But at the time AMD was the more efficient brand, and Nvidia really didn't reach parity until the 600 series. It's funny how people seem to forget things from only 4 years ago.
  • TrantaLocked - Wednesday, July 6, 2016 - link

    What are you talking about? There will be no performance losses; they're shifting power from the PCI-E slot to the 6-pin. The Nvidia shills in this thread are fucking ridiculous.
  • StrangerGuy - Wednesday, July 6, 2016 - link

    I didn't know that violating existing standards, or lowering a shipped product's release performance as a "fix", would be remotely considered acceptable practice in any industry, or that demanding baseline QC would be called shilling for NV.
  • Michael Bay - Wednesday, July 6, 2016 - link

    I wonder how many of AMD's clients are explicit leftists and bern victims.
    Same rhetoric patterns.
  • atlantico - Wednesday, July 6, 2016 - link

    All the people whining about "regulation" and "standards" are nvidiots, and leftists and bern victims. AMD is all about, screw overbearing regulation, as long as it works. If it doesn't work, the market will take care of itself. Commie.
  • Michael Bay - Wednesday, July 6, 2016 - link

    Oh, the market is taking care of AMD, indeed.
  • Macpoedel - Wednesday, July 6, 2016 - link

    I'm guessing you have never overclocked, because you want all of your parts adhering to an overly strict standard. If RX 480s start burning down PCs, that'll be AMD's problem, but otherwise I don't see how any hardware enthusiast makes a problem out of this, other than for the fun of throwing rocks at a company.
  • K_Space - Wednesday, July 6, 2016 - link

    First off, thanks to KateH & blahsaysblah for the awesome comments/analysis on power routing through the 6(+2) connector; it's these kinds of comments that really add flavour to AT. However, I am curious as to why you think this is a damper on the launch. Technically the card is a 1080 muncher (the resolution, not the NV card), so even with the compatibility option enabled, a game at that resolution won't suddenly become unplayable. Certainly a marketing disaster (and food for fanboys from both sides), but I'd argue discerning gamers/technology nerds shouldn't be concerned. Looking forward to the updated benchmarks with both options. Ryan, will you enable rail testing for this particular scenario?
  • bill4 - Wednesday, July 6, 2016 - link

    The performance losses will be zero, actually; the default solution just shunts some power draw away from the slot and onto the plug (supposedly more equipped to handle it). Alternately, AMD is offering a "compatibility" toggle that ensures neither slot nor plug runs out of spec (this is the one that likely requires some slight power tuning/performance loss). I'm sure about 1% of people will opt in to that... most will opt for higher performance.

    As Anand notes, power draw is related to frequency by roughly a cube function, so it takes very little downclocking to shave off a lot of power. So I'm going to guess the performance loss in compatibility mode will be like 3-5% (AMD is PRing that loss away by claiming the new driver increases performance by 3% with per-game optimizations, making up for it). Personally I won't opt in to compatibility mode.
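
    If you want to see why a little downclocking goes a long way, here's a sketch of that cube relationship in Python (the 150W baseline is just illustrative):

        BASE_POWER = 150.0   # watts, illustrative board power
        for downclock in (0.02, 0.05, 0.10):
            scale = (1.0 - downclock) ** 3   # P roughly ~ f^3 when voltage tracks frequency
            print(f"-{downclock:.0%} clock -> ~{BASE_POWER * scale:.0f}W ({1 - scale:.0%} less power)")
        # -2% clock -> ~141W (6% less), -5% -> ~129W (14% less), -10% -> ~109W (27% less)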
  • StrangerGuy - Wednesday, July 6, 2016 - link

    Either way, RX 480 users are entirely justified in demanding a refund for out-of-spec 6-pin operation or downclocking after launch; there is no weaseling out of this one for AMD.
  • duploxxx - Wednesday, July 6, 2016 - link

    hahah nice joke. so all those nvidia cards where there is no 4GB but only 3.5GB can do the same????
  • Scali - Wednesday, July 6, 2016 - link

    Which nVidia cards would that be?
    Not the GTX 970, at least, because if you remove the cooler you can easily see that there is 4GB of memory physically present.
    And if you run some DirectX diagnostic tools, you can easily see it report two segments of memory, one of 3.5GB and one of 0.5GB, totaling 4GB.
  • Drumsticks - Wednesday, July 6, 2016 - link

    Wouldn't they have been breaking spec by not allowing full high-bandwidth access to all 4GB of that memory at once...?
  • vanilla_gorilla - Wednesday, July 6, 2016 - link

    What "spec" would that be?
  • Drumsticks - Wednesday, July 6, 2016 - link

    The ones their own technical marketing team provided everybody in September 2014? They advertised 64 ROPs and 2MB of L2 cache, along with that 4GB of memory. A full 1/8th of the ROPs and L2 wasn't actually there, despite being marketed as such to everybody. On top of that, in practice it's difficult (and generally not done, for this reason) to access both segments of memory at once, meaning the cards really mainly use 3.5GB of memory.

    http://www.anandtech.com/show/8935/geforce-gtx-970...

    It's possible to like a company and still hold them accountable for their screwups. That's what drives them to make a better product, after all.
  • Geranium - Wednesday, July 6, 2016 - link

    It's not a problem, it's a feature.
  • Scali - Thursday, July 7, 2016 - link

    No. In practice you can rarely reach the full theoretical bandwidth for a sustained period of time anyway, so nobody would expect that. That's what we have benchmarks for.
  • D. Lister - Wednesday, July 6, 2016 - link

    "...RX480 users are entirely justified to demand a refund..."

    "...so all those nvidia cards where there is no 4GB but only 3.5 can do the same ?"

    Apparently some people did get a refund for their 970s. Although, and this may be of some passing significance, the memory issue couldn't really cause any hardware damage.

    I find it intriguing how, in the past, when AMD messed up, we would be reminded of how they came up with 64-bit x86 CPUs, how they put the memory controller on the CPU, etc., etc. Nowadays? Instead of talking about the great innovations AMD has come up with, people go straight to pointing out the flaws of the competition. I have a feeling there is something interesting to be learned from this subtle change in attitudes.
  • tamalero - Wednesday, July 6, 2016 - link

    I think the most comparable issue would be those drivers from Nvidia (3 in a row?) that burned up cards,
    all because they botched the fan-vs-temperature behavior.
  • Peter2k - Wednesday, July 6, 2016 - link

    There's still a lawsuit about that going on, I think.
    And some got refunds, I believe.
  • TrantaLocked - Wednesday, July 6, 2016 - link

    Did you not read the article? This driver includes a toggle to limit power for people who want within-spec power usage.
  • miribus - Wednesday, July 6, 2016 - link

    You're right that they are justified; it is a mistake, after all. I won't disagree with you there. That said, it is still an objectively good card even if it suffered slightly slower clockspeeds. It's embarrassing, but is it suddenly not a good card? It's just a marginally slower one (I assume; who knows for sure yet). If we're literally talking a difference of a few frames, I'm still buying the $200 card, even if underclocking is somehow my only option. It would still be the right card for me at that price. The GTX 1060 is faster than the 8GB version at $250 (according to leaks)? Fine, but I'm not buying that card. I'm buying the $200 one, that's my budget, as soon as I can, giving it 8 pins, and overclocking it.
  • Peter2k - Wednesday, July 6, 2016 - link

    There are reports that 4GB RX 480s are actually 8GB versions.
    They can be unlocked as well.
  • Macpoedel - Wednesday, July 6, 2016 - link

    Well, that is very debatable, because the card isn't downclocking at all. The RX 480 has a boost clock of 1266MHz and a base clock of 1120MHz, both of which remain the same. AMD doesn't have to change any of the official specifications, so everything on that box still applies.

    The default option is out of spec, yes, but the fact that there's an option in the drivers to force it back into spec should be sufficient.

    Anyway, we've all been overclocking our parts for years; no one has cared at all about a bit more or less current on the board power, but all of a sudden this is a problem. I haven't seen any hard proof of anyone frying their motherboard over this, and even if it has happened, that would be as much the motherboard maker's fault as AMD's.
  • Michael Bay - Wednesday, July 6, 2016 - link

    "Choose your fix" is never a good approach.
    Seriously, how could they miss this stuff in the first place? It should be in basic internal testing.
  • Weyoun0 - Wednesday, July 6, 2016 - link

    It was not missed. AMD was trying to pull a fast one. Unless AMD is hiring HS dropouts as engineers, which is just as likely given AMD's rapid decline in recent years.
  • Drumsticks - Wednesday, July 6, 2016 - link

    I'm not sure I believe this. I don't really see any benefit for them. What "fast one" can they pull and benefit from by routing extra power through the motherboard? It certainly doesn't grant more performance than sending it through the 6 pin.
  • vladx - Wednesday, July 6, 2016 - link

    By using only a 6-pin, AMD tried to hide how much more inefficient their card is compared to Nvidia's. AMD's newest tech Pascal barely keeps up with Nvidia's last-generation Maxwell. Of course it backfired quite spectacularly, as deserved for all their deceptive marketing at every product release in the last 10 years.
  • vladx - Wednesday, July 6, 2016 - link

    *AMD's newest tech Polaris I mean.

    That's what happens when both AMD and Nvidia use somewhat similar names.
  • Drumsticks - Wednesday, July 6, 2016 - link

    Second comment/edit: It's worth noting that, for whatever reason, AMD has stated the 480 is supposed to be around 2x perf/watt, but the 470 should be about 2.7x. If that's true, it does in fact bring them pretty close to Pascal, maybe even on all products going forward. It still has to be proven true, but we'll see soon enough.
  • Drumsticks - Wednesday, July 6, 2016 - link

    Well sure, using a 6 pin looks good for them. But why would they route so much power through the motherboard? Given that, it looks much more like carelessness imo.
  • Geranium - Wednesday, July 6, 2016 - link

    Yeah, that's why we see CLCs, hybrids, and three fans on efficient NVIDIA GPUs.
  • benzosaurus - Wednesday, July 6, 2016 - link

    If they were hiring HS drop-outs as engineers, I'd totally go work there. I could get a job with my BS without "at least 5 years experience with…". Friggin' job market.
  • watzupken - Wednesday, July 6, 2016 - link

    I think AMD needs to move on from the GCN architecture to stay competitive. GCN has its pros, but it seems to have more downsides, particularly from a power-efficiency standpoint. Perhaps GF's 14nm is not as good as TSMC's 16nm, but that probably only adds to the inefficiencies of the GCN architecture.
  • tipoo - Wednesday, July 6, 2016 - link

    Unfortunately we won't see a new architecture until Navi in 2018
  • tamalero - Wednesday, July 6, 2016 - link

    They are probably not focusing on gaming again, after all.
    They are always trying the "all-around" approach to feed both professional cards AND gaming.
    Unlike Nvidia, which either crippled compute performance, or did poorly on gaming but amazingly on compute.
  • Macpoedel - Wednesday, July 6, 2016 - link

    Would it help if AMD gave each iteration of GCN a different name instead of vague version numbers? Because that's basically what Nvidia does; you could call Pascal CUDA 6.1.
  • Geranium - Wednesday, July 6, 2016 - link

    TSMC is now an Apple supplier. They plan for Apple first, then others.
  • D. Lister - Wednesday, July 6, 2016 - link

    "we assembled a worldwide team this past weekend to investigate and develop a driver update to improve the power draw."

    Sounds very impressive, this worldwide team that has been assembled. Sort of like the AMD Avengers. Lisa Su would be their Samuel L. Jackson, I guess. But I digress; hopefully next time they'll assemble their worldwide team BEFORE a product launch.
  • Michael Bay - Wednesday, July 6, 2016 - link

    I imagined Lisa Su saying the M word every other second and it wasn't pretty.
    Huang would make a great villain though.
  • D. Lister - Wednesday, July 6, 2016 - link

    "I imagined Lisa Shu saying the M word every other second and it wasn`t pretty."

    lol, she could pull it off though, she looks like a tough lady. With a shaved head and an eye-patch, she could be practically intimidating.

    "Huang would make a great villain though."

    Many from a competing brand's fan club already think he is one. :D
  • RBFL - Wednesday, July 6, 2016 - link

    Did no one else get an Apollo 13 vibe? Engineers sweating out the holiday, working out how to balance all the power draws.
  • miribus - Wednesday, July 6, 2016 - link

    If I had a card and a PSU with 8-pin power, I would: 1) make sure AMD didn't do something intensely stupid like shorting pins 2 & 6 together on the power connector on their card, and also make sure the middle power pin on the card actually ohms out to the other power pins... AMD might not have connected it, since it technically wouldn't need to be there per the spec. 2) Use my 6+2-pin connector. 3) Short pins 4 & 8. 4) Check power draw. That doesn't help people without those connectors, and I think trying to run on more common (cheaper) power supplies for higher adoption clouded their judgement when considering the issues that could happen.
    "Normal" 6-pin power supplies, even though they have 1 fewer circuit, are often laughably under-specified; look up the datasheets yourself. The wiring used on this stuff is rated for several amps, and that's the crappy stuff. There is a long list of reasons why ratings and tolerances are specified the way they are, but TL;DR: your PSU, unless it is really, really, really terribly made, can absolutely handle 1 or 2 more amps on those wires. Don't believe me? Ask your PSU vendor how many amps at 12V the wires on that connector can really do. If they're honest, they'll tell you well north of 75W.
    I design motherboards for a living, including power delivery similar enough to what is used in PSUs, and we have to (or really, opt to) stay within PCIe specs on our end for obvious reasons. I can tell you for a fact that the 75W mark for the 6-pin power connector is an extremely loose specification, and again, we're talking a margin of maybe 2 amps.
    On a PSU that does not have a 6+2-pin connector, one pin is missing: the middle upper power pin. That design limits you to 75W for absolutely no other reason than that they figured it was the safest possible specification to go with. It did not at all account for kW enthusiast PSUs, and it didn't have to. PCIe specs aren't law; there are no PCIe police per se. But the point of the specification is that if you break the spec for your own purposes, which is totally legal, you can't say "it is fully compliant", etc. You would be running out of PCIe spec, but not out of the PSU's spec.
    But let's say you DO have a 6+2-pin connector. There is no good reason it wouldn't be "fully" compliant, except for the extremely technical interpretation that a 6-pin connector is, per the spec, 75W, period. Truthfully? The wires and connector likely do way north of 6 amps apiece on your PSU, and I'm being really conservative. *If you've got a 6+2, you've definitely got the power. Take those +2 that are flopping out there in the breeze and short them together, and your PSU can suddenly supply 150W because it thinks you have a more powerful card.
    *I haven't tried this and it is a theory, but if you have a multimeter it is easy to do safely. If I had a card I would absolutely try it. If someone wants to try it and has a 6+2 PSU, a multimeter, and obviously an RX 480, I'll walk you through it... I really want to know if it would work. It would not only be fine stock, but it would probably be more overclockable.
    Somehow AMD is going to force the card to draw a few (very safe) extra amps from your PSU via software; that is interesting. Tricking your PSU into thinking your 6-pin card is really an 8-pin card requires hardware; I'm not positive how they would do it in software.
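
    To make the sense-pin part concrete, here's how I picture the card-side detection working (a Python sketch of my theory above, not AMD's actual logic):

        def allowed_connector_watts(sense0_grounded: bool, sense1_grounded: bool) -> int:
            """Power the card may assume is available from the plug, per my reading of the spec."""
            if sense0_grounded and sense1_grounded:
                return 150   # both sense lines pulled to ground: an 8-pin plug is present
            if sense0_grounded:
                return 75    # only the 6-pin sense line present
            return 0         # no plug detected; don't power up

        # Shorting the dangling +2 (the pins 4 & 8 trick above) is what would pull
        # the second sense line low and unlock the 150W case.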
  • riottime - Wednesday, July 6, 2016 - link

    I was going to buy an RX 480 to replace my aging HD 5850, but now not so much. I'll wait and see how Vega turns out later this year instead.
  • atlantico - Wednesday, July 6, 2016 - link

    Just get a third-party RX 480 with a custom cooler; they'll also have 8-pin power connectors. You'll only have to wait a couple of weeks, and it's a great replacement for an HD 5850.
  • vladx - Wednesday, July 6, 2016 - link

    Sorry but it got delayed to next year: http://www.fudzilla.com/news/graphics/41034-vega-1...

    Probably AMD having issues as usual.
  • FriendlyUser - Wednesday, July 6, 2016 - link

    So, to sum it up:
    1. No risk to anyone's motherboard with the new driver.
    2. For people with intolerant PSUs, PCIe compatibility can be ensured at minimal performance loss.

    We're all waiting for the benchmarks, of course, but I don't expect a seismic shift in performance. Experience has shown that it is perfectly possible to run the RX 480 at slightly lower power targets with minimal change in performance.
  • tipoo - Wednesday, July 6, 2016 - link

    Seems like the fix and the lower-power toggle all but confirm a last-minute overclock. They also say the 3% boost in games the new driver provides should substantially offset any impact from the lower-power setting, so it seems pretty clear to me that they got worried and pushed the card past what they found was its peak efficiency point, or at least past the better tradeoff of efficiency to performance (since power use scales more than linearly with voltage and frequency, while performance scales less than linearly with clock), in favor of boosting it a bit more.

    The whole debacle doesn't seem worth it if 3-5% is all they squeezed out.
  • prisonerX - Wednesday, July 6, 2016 - link

    It's pretty amusing (and somewhat sad) that a bunch of people who would routinely overclock anything they can are wailing and moaning about a card being a little out of power spec.

    Oh the hypocrisy!
  • Agent Smith - Wednesday, July 6, 2016 - link

    Well said!!
  • vladx - Wednesday, July 6, 2016 - link

    You forgot just one LITTLE detail: the RX 480 is a budget card mainly targeted at people with budget systems.

    Tech site users are not indicative of the general customer population buying this card.
  • silverblue - Wednesday, July 6, 2016 - link

    I do love how a $200 to $250 card is "budget" to a lot of people commenting on these sorts of articles. Are the 970 and 390 budget cards?

    My 950 is "budget" at half the price of the 480; I'd class the 480 as mainstream at worst.
  • barleyguy - Wednesday, July 6, 2016 - link

    Agreed. $250 isn't really a low budget video card for non-enthusiasts. Also, the reference cards are absolutely going to enthusiasts. They are the early adopters that bought things on Day 1 while fighting high demand. The casual users that would normally be in the budget market won't be buying these until much later, when most of the cards will be custom.

    Personally I'm planning to get an RX 470. It should be about 50 watts lower in power usage and still good enough for my 1080p monitor.
  • Peter2k - Wednesday, July 6, 2016 - link

    It's pretty amusing that some still don't get it.

    No one cares that the RX 480 pulls more power than the stated TDP.
    Every card does this; every part you OC does this.
    Everyone knows.

    People care that the card draws more power over the weakest link, the PCIe slot.
    And it's a cheaper card for cheaper systems.
    The boards I've heard about getting (supposedly) damaged were old AMD boards to begin with.

    AMD actually is bringing out a fix, because it's true:
    the card will draw more power over the power connector,
    the part that can actually handle the additional current.
  • pencea - Wednesday, July 6, 2016 - link

    "we’ll be back in a few days with our full review"

    Right... That's whats written in both preview articles of the GTX 1080 & 480X preview. It's been over a month, and yet still no reviews of the GTX 1080 or GTX 1070 in sight.

    As for the 480X it's officially been a week and no reviews for it too...
  • silverblue - Wednesday, July 6, 2016 - link

    Ryan said that the 480 review would be quickly followed by the 1080 review.
  • catavalon21 - Wednesday, July 6, 2016 - link

    I do have to say, this is one of the more eyebrow-raising cases I can recall of products garnering a lot of attention only to get, well, nothing special in the way of review treatment. Some previews, some promises of more, and yes, there are benchmarks for all 3 cards in the AT GPU 2016 charts section, but it's just bizarre. Still, I keep coming back for more...
  • catavalon21 - Wednesday, July 6, 2016 - link

    ...because irritating brand-trash-talk notwithstanding, even in these 11 screens of comments there is a lot of very good technical discussion on the issue.
  • BrokenCrayons - Wednesday, July 6, 2016 - link

    I'd like to see the new performance numbers of the card once it's using the so-called compatibility mode before getting very excited one way or the other about the issue, the company, or its competitors. In the end, though I like the 480 so far, mainly for its price, I'm probably not going to end up making a purchase, since I want a GPU that doesn't need an external power connector at all AND is within spec for the power it demands from the expansion slot. For that to happen on 14/16nm, it's going to be a waiting game while AMD and NV get around to producing their smaller/better little GPUs.
  • enryu - Friday, July 8, 2016 - link

    From the quote on the 3% extra game performance:
    "... System config: Core i7-5960X, 16GB DDR4-2666MHz, Gigabyte X99-UD4, Windows 10 64-bit"

    Does this mean AMD uses an Intel test platform? Or are those benchmarks not from AMD?
  • Michael Bay - Friday, July 8, 2016 - link

    They use the most powerful CPU available to isolate GPU performance. In this case it was the 5960X.
