commenter001 - Friday, February 12, 2021 - link
annnnnnnnd it's sold out
Achaios - Friday, February 12, 2021 - link
Out Of Stock.
bigboxes - Tuesday, February 23, 2021 - link
annnnnnnnd it's selling for twice its MSRP!
raywin - Friday, February 12, 2021 - link
is this another Fictional Edition, or are they going to make enough to meet demand? Not that I am bitter...
Yojimbo - Saturday, February 13, 2021 - link
This is one of those rare cases where the answer to a "this or that" question is 'no'. No, it's not a Fictional Edition. No, they won't make enough to meet demand.
Spunjji - Monday, February 15, 2021 - link
This one's hardly worth the money for gamers anyway. They're really pushing their creative lie about "doubling" CUDA cores and FP32 FLOPS to cover for the fact that this will have very little advantage over the preceding generation in games, because it can't render at resolutions high enough to benefit substantially from that dynamic reassignment of INT32 resources to FP32. I'll be surprised if this substantially outperforms the 2060 Super at 1080p.
In four generations, Nvidia have gone from having their third-rank card be a price/performance bargain as was traditional (GTX 970), to having their succeeding fourth-rank card be *substantially* more powerful than that for less money (GTX 1060), to charging third-rank video card prices for their fifth-rank card and for meagre performance increases. I'm not going to reward that with a purchase.
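As a rough illustration of the INT32/FP32 point above - a simplified per-scheduler issue model of my own, not anything from Nvidia's documentation; the ~26% INT32 share is an assumed, commonly cited figure for game shaders, and memory bandwidth, ROPs and occupancy are all ignored:

```python
# Turing: 1 dedicated FP32 pipe + 1 dedicated INT32 pipe per SM partition.
# Ampere: 1 dedicated FP32 pipe + 1 pipe that can run either FP32 or INT32.
def ampere_vs_turing_speedup(int_fraction: float) -> float:
    """Relative throughput for an instruction stream with the given INT32 share."""
    assert 0.0 <= int_fraction <= 0.5, "toy model only covers FP32-dominated mixes"
    turing_cycles = 1.0 - int_fraction    # INT32 co-issues for free on its own pipe
    ampere_cycles = 0.5                   # work split evenly across the two pipes
    return turing_cycles / ampere_cycles  # = 2 * (1 - int_fraction)

for share in (0.0, 0.26, 0.5):
    print(f"INT32 share {share:.0%}: ~{ampere_vs_turing_speedup(share):.2f}x")
# Pure FP32 gives the marketed 2x; a ~26% INT32 mix gives ~1.5x; a 50/50 mix gives 1x.
```

Under those assumptions the "doubled" CUDA core count buys roughly 1.5x the shader throughput in a typical game mix, before resolution and bandwidth limits shrink it further.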
Yojimbo - Tuesday, February 16, 2021 - link
There's no lie, there's just an architecture change, and it wasn't done for marketing purposes. Enthusiasts look at game benchmarks to see how fast a card is, not the number of CUDA cores. And non-enthusiasts are not going to look at CUDA core counts, either. I imagine they look at the product number (3060 vs 3070). Of course it's going to substantially outperform the 2060 SUPER.
Firstly, with the 970 you are referencing a nadir in pricing. You are also referencing a card that had 3.5 GB of fast memory and 0.5 GB of slower memory to cut costs. But, that being said, GPU average selling prices, in terms of what customers are looking to pay, have been steadily climbing over the past 20 years. NVIDIA's rejiggering of their numbering schemes with respect to die sizes simply reflects that evolution of the market. You should probably buy the card that suits your value proposition. "Not rewarding" a company going about its business is an irrational thing. The prices that are charged are the prices of the market. NVIDIA doesn't set them. And if NVIDIA really were charging "third-rank video card prices for their fifth-rank card" their gross margins would be colossal now compared to back then, and that just isn't the case. The margins are pretty much the same as they were back then. It's the naming scheme that has changed, not the price being charged for the underlying hardware. In other words, a 3060 is a lot more expensive to make than a 960 or 1060. These are facts. You might not like the facts but there they are. Otherwise NVIDIA's gaming gross margins would be absolutely skyrocketing, and they are not.
brucethemoose - Friday, February 12, 2021 - link
Parts of the (non-cryptocurrency) compute crowd will be willing to shell out for this card too. 12GB of VRAM for under $500 is a bargain, especially with RTX/tensor cores and a nice transcoding block.
sonny73n - Friday, February 12, 2021 - link
Midrange cards over $300? Whatever, shill.
Qasar - Saturday, February 13, 2021 - link
um IF you actually read the article, then you would have seen the RTX 3060 launch price of $329, and the Ti at $399. Who's the shill?
brucethemoose - Saturday, February 13, 2021 - link
Uhhhhh, I don't set the pricing?
Udyr - Monday, February 15, 2021 - link
See the bottom of the specification comparison chart.
brucethemoose - Monday, February 15, 2021 - link
I still don't really get what Sonny is saying. The compute crowd will be willing to pay a whole lot more than the $330 MSRP.
RadiclDreamer - Tuesday, February 16, 2021 - link
I wouldn't touch this, and I don't know a single person who would.
Prices are getting stupid and stock is non-existent. I'll hold my 1080 until it molds before I'll pay some scalper a premium for a card. Even at MSRP they are overpriced.
ozzuneoj86 - Friday, February 12, 2021 - link
It's pretty lame that any GPU news is basically meaningless for the vast majority of us at this point, unless someone releases a GPU that simply cannot be used for mining... which isn't going to happen. My 970 with an Arctic Accelero III will probably still be in my main system this time next year unless something crazy happens.
Even if stock is available near MSRP, I have very little interest in paying over $300 for a **60 level GPU. The only thing working in the 3060's favor is that the 2060 was actually more expensive at launch... but that pricing was so bad it was likely intentional, to set a new standard for "normal" pricing of mid-range GPUs.
Gone are the days of getting a **60 on sale for under $200, or a **70 on sale for under $300.
raywin - Friday, February 12, 2021 - link
This ^^^^^^
sonny73n - Friday, February 12, 2021 - link
"Gone are the days of getting a **60 on sale for under $200, or a **70 on sale for under $300."This or the dollar has lost its value.
Spunjji - Monday, February 15, 2021 - link
Not "or", both are true - but the scale of the difference is mostly about Nvidia realising they could use their video game customers to bootstrap their AI business and, as a result, saddling us with ever-more-expensive chip and board designs for ever-diminishing returns.Of course, some competition from AMD would have helped a bit. You can track each price bump very obviously to the lack of a "true" competitor (features and performance) from AMD. We're not likely to see that situation reversed now though - even if AMD could make enough cards to fulfil demand, they have no incentive to harm their own margins when Nvidia have blazed such a bright trail for them - just like Apple and Samsung with flagship smartphone prices.
Tomatotech - Friday, February 12, 2021 - link
If you have decent internet and don't play crazy fast reaction games, take a look at trialing a cloud gaming subscription, e.g. GeForce Now. I have it and am pretty pleased with it, and it's cheap compared to something like Shadow Cloud. Shadow offers better quality but costs more and is more fiddly to look after.
Yojimbo - Saturday, February 13, 2021 - link
It's a good point, but hardly anyone will actually do it. That's probably good for you, because if people did sign up en masse the service would become swamped. On the other hand, although NVIDIA can't ensure cards get sold to gamers instead of miners, they can make sure they get the cards themselves instead of anyone else buying them. But it would still take months to implement such a thing (I doubt they can just buy all of MSI's cards, for example; they'd have to sell the modules to manufacturers with the understanding that the cards are for them), so there would be a painful period until they secured the supply that might turn people off to the service. Well, maybe they do have enough power to negotiate the purchase of already-made cards under not-too-unfavorable terms, I don't know.
RadiclDreamer - Tuesday, February 16, 2021 - link
Streaming game services are awful. Even with my gig synchronous fiber connection it isn't anywhere close to being responsive enough for me to see it as anything other than a novelty.
Unashamed_unoriginal_username_x86 - Saturday, February 13, 2021 - link
Honestly, I'm thankful I haven't had any money to shell out for a GPU since forever. All tech news has been meaningless to me, I just like to hear about numbers going up and what those funny numbers mean.
VoraciousGorak - Saturday, February 13, 2021 - link
Indeed. I got lucky enough to pick up a janky mined-out 1080 Ti super cheap just after the last crypto bust. I have to set the power target at 60% to not get memory artifacts, but that's still plenty for 1440p 144Hz. Which is good, because that and my 2700X are probably gonna have to hold out for at least a few more years....
Spunjji - Monday, February 15, 2021 - link
That's very lucky indeed - the 1080Ti is one of those cards that I frowned at on release for the sheer cost, but has turned out to be an unexpected long-term bargain vs. buying mid-tier cards and replacing them more frequently.
basroil - Saturday, February 13, 2021 - link
I'm OK with raising the price target to $350 for the XX60 range and $450 for the XX70 range; it's in line with inflation elsewhere and the cost-to-performance ratio is still better than the past range by quite a bit. I'm not OK with board partners turning a blind eye to scalpers and miners. The first they can actually stop most of the time by working with the FBI/customs and borders/DOJ, since many scalpers are selling them to miners in Iran, a huge violation of export compliance.
Spunjji - Monday, February 15, 2021 - link
"it's in line with inflation elsewhere"Not even close. The GTX 960 was $199 in 2014, which is $220 in 2021 dollars. Nvidia added $50 going from the 960 to the 1060, and then another $100 on top of that up to the 2060 at $399 on release. The fact that they haven't been *quite* so egregious with the 3060 is only a reflection of expected competition from AMD, and doesn't magically erase all of that interim gouging - we'd be at $220 if it were just inflation, and I'd be willing to contemplate $275 to account for the ever-increasing design complexity (that we didn't actually ask for).
"cost to performance ratio is still better than the past range by quite a bit"
Only by a little, not "quite a bit", and certainly not in line with what you'd expect after a die-shrink - again, they've been massively *under-performing* on that metric for at least 3 years now. The power of the human mind to normalise this stuff is amazing.
RadiclDreamer - Tuesday, February 16, 2021 - link
Incredibly well said. Nvidia and AMD are BOTH pushing to see how high they can get prices.
euskalzabe - Tuesday, February 16, 2021 - link
This, a thousand times. Perfectly put. Not many of us seem to have historical perspective these days.
Spunjji - Monday, February 15, 2021 - link
Yup. There's no way in hell I'm following Nvidia down that merry little path; especially not for a GPU that - when assessed objectively - has a pathetic performance increase over its predecessor.
plonk420 - Friday, February 12, 2021 - link
Heh, I was just discussing how I missed flagships (like the HD 5870) being under $400. Noticed this article while reading the 5870/GTX 480 reviews 😂
evilspoons - Friday, February 12, 2021 - link
For a fair comparison, remember the 5870 is 11 1/2 years old at this point. A $399 card is now a $499 card just by inflation.
prime2515103 - Friday, February 12, 2021 - link
Inflation from 2010 through 2020 was a total of 3.538%. If you want to add 2000 to 2009, that's another 2.54% for a total of 6.07%. This does not account for a 25% price increase.
ManuelDiego - Friday, February 12, 2021 - link
More like 20% (https://www.in2013dollars.com/us/inflation/2010?am... from 2010 to 2020, and 52% from 2000 to 2020 (all of this is for the US, I'm assuming). So yes, a $400 GPU of 10 years ago is a $500 GPU today just by inflation.
Seeing as you just seem to be adding the rates, I don't think you understand what compound interest is.
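To make the compounding point concrete, here is a quick sketch (the 2% annual rate is an assumption for illustration, not actual CPI data):

```python
# Adding annual rates understates inflation; each year's increase applies to an
# already-inflated base, so the rates have to be compounded.
annual_rate = 0.02
years = 10

added = annual_rate * years                   # naive sum: 20.0%
compounded = (1 + annual_rate) ** years - 1   # compounded: ~21.9%

price_2010 = 400.0
print(f"Added:      {added:6.1%} -> ${price_2010 * (1 + added):.0f}")
print(f"Compounded: {compounded:6.1%} -> ${price_2010 * (1 + compounded):.0f}")
# The gap grows with the rate and the number of years; with the real CPI series the
# 2010-2020 figure works out to roughly 19-20%, nowhere near the 3.5% quoted above.
```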
fcth - Friday, February 12, 2021 - link
Yeah, no. I don't know where you are getting that number but it is way, way off. The Fed targets inflation at 2%, and while we haven't always hit that over the past decade, a six percent figure over two decades is absurdly low.
According to the Bureau of Labor Statistics, inflation in the Consumer Price Index since September 2009 (when the Radeon 5870 launched) is 21%. This means that the Radeon 5870 launch price of $379 would be $459 in 2021 dollars.
Spunjji - Monday, February 15, 2021 - link
$486.50, to be precise, which would nearly get you a 3070 at its theoretical MSRP, and in reality get you a 3060 Ti with a little change (if you could find it in stock).
Spunjji - Monday, February 15, 2021 - link
Although the 5870 was a $379 card, so that's actually $462 now. Oof.
FatFlatulentGit - Friday, February 12, 2021 - link
I'm still baffled by the lowest end card having more VRAM than all but the highest end card.
krazyfrog - Friday, February 12, 2021 - link
The way I see it, NVIDIA's previous 30-series launches happened before AMD announced its 6000 series cards with 16GB memory all-around. That had to have caused NVIDIA to rethink its strategy and so all new launches going forward should hopefully have more VRAM.
FatFlatulentGit - Friday, February 12, 2021 - link
But they're still behind on the 3070 and 3080, which just looks really weird. There were rumors of 20GB versions, but those have yet to materialize. Then again, all of these cards seem to have difficulty materializing these days.
fcth - Friday, February 12, 2021 - link
I'm still guessing that Nvidia originally was planning more VRAM on the higher-end cards, but then GDDR6X prices were too high so they cut their allocations in half, given that few current applications need that much RAM. GDDR6 isn't as bad (the Radeon 5500 XT I bought last year for $200 has 8GB of GDDR6), but they also cut down the 3070's allocation so as to not embarrass the high-end cards. But for the 3060, 6GB was really not going to be enough - current games can fill that, to say nothing of future titles - so they were stuck with either going to 12GB or going with a wider bus to accommodate 8GB (and ended up doing both).
Yojimbo - Saturday, February 13, 2021 - link
Why? That's somewhat like being baffled that a cheaper car has a bigger gas tank, except the size of the gas tank carries no status. I guess it has something to do with an innate notion of hierarchy.
Practically speaking, though, a graphics card just needs enough VRAM. The 3060 has quite a bit more than it needs because of marketing and the technical specifics of VRAM products. For a certain amount of compute, a graphics card needs a certain minimum memory capacity and a certain minimum memory bandwidth. But the capacities and bandwidths come in chunky steps. Apparently it's more expensive to have a wider memory bus than needed than to have more capacity than needed. NVIDIA doesn't want 6 GB, perhaps mostly for marketing reasons, or perhaps it's getting very close to the minimum VRAM they think the card needs. But if they wanted to go with 8 GB they'd need to increase the memory bus width and add more memory controllers to their die. That extra bandwidth would mostly be wasted, so they chose to waste the extra 4 GB of memory capacity instead by going with the smaller bus width and the higher capacity chips to result in 12 GB of VRAM.
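A small sketch of those "chunky steps" - assuming the usual GDDR6 arrangement of one 32-bit chip per channel, with chips available in 1 GB and 2 GB densities (my assumed figures, not anything from the article):

```python
def possible_capacities(bus_width_bits: int, chip_densities_gb=(1, 2)):
    """VRAM capacities reachable with one GDDR6 chip per 32-bit channel."""
    chips = bus_width_bits // 32
    return [chips * density for density in chip_densities_gb]

for bus in (128, 192, 256, 320):
    print(f"{bus:>3}-bit bus -> {possible_capacities(bus)} GB")
# A 192-bit bus gives 6 GB or 12 GB; hitting 8 GB means widening to 256-bit and
# adding memory controllers to the die, which is the trade-off described above.
```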
edzieba - Sunday, February 14, 2021 - link
It's based on memory bandwidth requirements, not memory capacity requirements. Capacity requirements are FAR below what people think they are even at 'extreme' resolutions (an 8k framebuffer is barely over 125MB). Just looking at 'utilised capacity' tells you basically nothing about actual capacity requirements: any half-competent game engine will try and fill VRAM with every asset in that level up until full capacity or running out of data to load. All that data is marked as flushable, and will instantly be overwritten if required for actual in-flight data (i.e. cached data is functionally identical to empty VRAM).
Spunjji - Monday, February 15, 2021 - link
I think many people are making guesses about where VRAM requirements might go in the near future based on the latest crop of consoles, which isn't entirely unreasonable. I think it's fair to have questions like that when you're coughing up $500+ for high-end hardware!
Yojimbo - Monday, February 15, 2021 - link
People are just nervous about running out of scarce resources. There's no reason to it. They aren't making any guesses. But regardless, what does that have to do with a higher-tier card having a lower RAM capacity than a lower-tier card? Is that the kind of reason you mean? Worrying that the higher-tier card has too little because the lower-tier card has more?
Sttm - Friday, February 12, 2021 - link
Where is my 16GB 3080 Ti at?
fcth - Friday, February 12, 2021 - link
Newest rumors say the 3080 Ti will only get 12GB (though presumably the full 384-bit memory bus from the 3090 will be enabled to accommodate the extra chips).
Alistair - Friday, February 12, 2021 - link
It isn't that exciting except for the possibility of being in stock. The 3060 Ti is a much better GPU for almost the same price.
Spunjji - Monday, February 15, 2021 - link
Indeed. Ampere only really comes into its own at 1440p or above.
powerarmour - Friday, February 12, 2021 - link
Hey maybe we'll actually get an Ampere review! (too soon?)
fcth - Friday, February 12, 2021 - link
I'm still hoping the review will be live by the time we can buy the cards at MSRP!
Sychonut - Friday, February 12, 2021 - link
Looking forward to the unavailability.
Tomatotech - Friday, February 12, 2021 - link
Too little, too late, too much money. I was going to upgrade from a 1060, but have abandoned all hope of upgrading. I bought a cloud gaming subscription and am very happy with it. Currently on GeForce Now. £50 per year (£25 every 6 months) for a 1080 / 2080 (varies) for many (but not all) of my favourite games is mighty fine.
Your loss, NVIDIA.
mobutu - Saturday, February 13, 2021 - link
The GeForce Now subscription you pay monthly/yearly goes to NVIDIA, so they haven't lost anything.
Tomatotech - Saturday, February 13, 2021 - link
£25 every six months and I'll leave for a different service the moment a better option comes up, versus however many hundreds of pounds a new 3000 series card costs?
GeForce Now is probably profitable (if not running at a small loss as it's early days for them) as it runs (I think) on unused cloud capacity, but the profit margin can't be as large as selling a whole card to me. So yes it's their loss. Not that they care, as they don't have any spare cards to sell anyway.
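For what it's worth, the arithmetic behind that comparison looks roughly like this (the card price and lifespan below are my own assumptions, not real figures):

```python
# Subscription vs. buying a card outright, back-of-the-envelope.
sub_per_year = 2 * 25.0        # £25 every six months
card_price = 400.0             # assumed street price of a 3060-class card
card_lifespan_years = 4        # assumed useful life before the next upgrade

print(f"Card, amortised:  ~£{card_price / card_lifespan_years:.0f}/year")
print(f"Subscription:      £{sub_per_year:.0f}/year")
print(f"Break-even after: ~{card_price / sub_per_year:.0f} years of subscribing")
# With these assumptions the subscription stays cheaper for a long time - though,
# unlike a card, there's nothing to keep or resell at the end of it.
```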
Yojimbo - Saturday, February 13, 2021 - link
I wouldn't venture to guess the economics of GeForce Now versus cards. I think not even NVIDIA really knows that. There's a certain unknown longer-term value to the growth of the service regardless of the near-term financial implications. And NVIDIA's purpose with building GeForce Now is to build a service that won't have a better option, exactly like their purpose with selling GPU modules is to offer graphics cards that don't have a better option. My guess is that NVIDIA's margins on GeForce Now should eventually be greater than for selling modules to AIBs, because they are providing more of the service. You can imagine that whatever other service you sign up for may be buying their cards from NVIDIA for that service.
An argument of greater liquidity in the gaming-as-a-service market is not an argument against gaming as a service, as NVIDIA may gain from that liquidity as much as they lose from it (you may not be locked into a 300-pound purchase, but neither is the buyer of AMD cards). And as mentioned, even if you switch to another service you may be using NVIDIA's cards.
But I don't see why you are blaming NVIDIA here as if they don't want to sell a card to you, or as if they wouldn't be making more cards to sell to you if they could. Automobile plants are completely shutting down because of a lack of microchips. There are all sorts of problems with supplies at the moment.
Lucky Stripes 99 - Friday, February 12, 2021 - link
Can't say that I am happy to see the upward trend in power consumption continue. Looks like an RTX 3050 will be in my future.
Yojimbo - Saturday, February 13, 2021 - link
That's going to be a long-term trend in computing.
RSAUser - Sunday, February 14, 2021 - link
It won't. Nvidia is about at the max for their higher-end cards; the lower end might go up a bit more than this, but I doubt it, tbh.
This is mostly a case of moving to a process node that's slightly worse for power while trying to clock it at the highest it can hold before issues arise - notice how they all have about 1850 MHz as top clocks.
You should treat these the same way AMD's hot cards were treated: clock them 10% lower than their max and enjoy the large reduction in power and fan noise.
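A very rough first-order model of why that works - assuming dynamic power scales as C·V²·f and that voltage tracks frequency near the top of the V/f curve, with leakage and real board behaviour ignored:

```python
def relative_dynamic_power(clock_scale: float) -> float:
    """Dynamic power ~ f * V^2, with V assumed to scale with f in this region."""
    voltage_scale = clock_scale
    return clock_scale * voltage_scale ** 2

saving = 1.0 - relative_dynamic_power(0.90)
print(f"-10% clock -> roughly {saving:.0%} less dynamic power")
# ~27% under these assumptions, which is why a small downclock/undervolt buys such
# a large drop in heat and fan noise on cards pushed to the edge of their V/f curve.
```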
Yojimbo - Monday, February 15, 2021 - link
The power will continue to go up because Dennard scaling has broken down. The limit on the power is a limit on how much power can be delivered and how much heat can be dissipated economically in a client setting. People already get triple-fan solutions, so eventually they can push it up to 400 or 500 watts. That's why NVIDIA introduced the 12-pin connector.
NVIDIA introduced a more power-efficient architecture with Kepler and Maxwell. And in the Pascal generation finfets were introduced. Those things kept power requirements from creeping up for a good 6 years. But there are fewer and fewer efficiency gains to wring out as the architectures mature, and the power scaling going from planar fets to finfets was a one-time thing. Samsung's 8 nm process is probably a bit behind the curve in terms of power efficiency, so I wouldn't expect an increase in power usage next generation, but in subsequent generations it's going to go up.
486 machines came with about 200-watt power supplies. Increasing power usage has been a long-term trend, but more recently it seems to have been increasing faster, and that despite power efficiency now being a primary design parameter even in desktop processors (it's a primary parameter now because of the breakdown in Dennard scaling, and the desktop was the last segment to be concerned with power efficiency, after mobile and server; one can probably look at Intel's Alder Lake, with its inclusion of the small Gracemont cores, as a continuation in that direction).
BlueScreenJunky - Saturday, February 13, 2021 - link
Do they have a release date for the 3080 in the EU?
Peskarik - Saturday, February 13, 2021 - link
Ha! Ha! Ha!
I can tell you something, Nvidia's GeForce RTX 5090 is getting released soon!
Who needs the unobtanium 3xxx series when there is a 5xxx series on the horizon!
Ah, yes, the 4xxx series was released, but you cannot get it either.
Peskarik - Saturday, February 13, 2021 - link
I LOVE how the world turned into the USSR in less than a year.
A new product is released; buy it... in 10 years, if you are lucky, from a scalper.
Lovely.
Sivar - Saturday, February 13, 2021 - link
I am more interested in the 3080. When is its release date?
haukionkannel - Saturday, February 13, 2021 - link
It was released in a Galaxy far far away!
iranterres - Sunday, February 14, 2021 - link
Any GPU news these days is as useful as a paperweight.
eastcoast_pete - Sunday, February 14, 2021 - link
Now if we only knew which retailer will get the one 3060 card that'll be available to the one lucky buyer on the 25th. As for the rest of us, it's vaporware or scalper central.
jbwhite1999 - Monday, February 15, 2021 - link
I'm not a huge gamer - I'd be happy with a 6GB 3050 (to replace my 3GB 1060 that I bought used on eBay like 3 years ago). What I'd love to see is something with decent power, but ideally to fit in the 75W PCIe slot power budget, for maybe $199 or so. Hopefully nVIDIA will go one more level down the stack.
As to people complaining about supply - the whole chip industry is in turmoil. The net is that there haven't been any 6" wafer foundries built in 20 years, and as everyone wants chips, there just isn't capacity. nVIDIA is competing with companies supplying Lenovo, HP, and Dell - and they will not get first dibs. I mean, Ford has cut down on F-150s and Chevy has idled some plants.
The next thing that is going to happen is that prices will be going UP.
Yojimbo - Monday, February 15, 2021 - link
The issue affecting automobiles is a different one from what's affecting NVIDIA and AMD. And NVIDIA is supplying Lenovo, HP, and Dell. NVIDIA is near the top of the food chain. Ahead of it are the flagship smartphone SoC manufacturers because of the yearly turnover of that market, and especially Apple because of its size. But in terms of leading-edge high-powered chips NVIDIA and AMD are at the top of the chain. And the leading-edge chip manufacturers have it better than those trying to produce on cheaper older nodes, because the investment is going into the leading edge as that's where the foundries can make the best margins.
But foundry capacity is not the biggest issue for NVIDIA. I think the issue is getting enough of various components used in the manufacture of chips. One of these components that has been reported as being in short supply is a certain high-insulation matrix for routing the signals and power between the PCB and the compute area of the chip. This high-performance version is only necessary in chips that require the higher power densities, so it is affecting NVIDIA's GPUs while not affecting many other chips. I don't know exactly what needs it and what doesn't. I also don't know why it's in short supply. From what I can gather, one Japanese company (Ajinomoto) makes the material, which does not seem to be in short supply, and a few Taiwanese companies use that to provide the component for chip manufacturing. That's where the supply constraint seems to be, and those companies are increasing their production but it is taking time to do so.
I believe those auto chip manufacturers are mostly producing on 8 inch wafers, by the way, not 6 inch wafers. I think the machines are not made any more so they can't really build new fabs for them. But it costs less to produce on those processes so that's why they've designed their chips for them. Many applications have been switching from 12 inch back to 8 inch in recent years. But they are going to have to design a certain amount of their chips for 12 inch wafer processes going forward, I would imagine. Unless they are willing to have no options every time there's a demand surge or supply crunch.
In any case, AMD is certainly also dealing with foundry space restrictions, whether NVIDIA is or is not, and there are also reports of a lack of GDDR6 DRAM, which is something that is fixable because the DRAM manufacturers have spare capacity, but it takes them time to use it to catch up to demand in one particular area.
domboy - Monday, February 15, 2021 - link
I wonder if the assumed 3050 is going to have the same supply issues as the xx60 and higher cards due to mining. Are the xx50 class cards good for mining? I haven't really followed mining so really have no idea...
Gothmoth - Wednesday, February 17, 2021 - link
Who the F*** cares? Everything is sold out and will be sold out.
Here in Germany you may get a GT 710 with 2GB DDR5 for 90-110 euros if you are very lucky.
And most of the tech press is reporting about new stuff as if nothing has changed.
It's like Michelin reporting on fantastic restaurants in North Korea... you're writing fiction at this point.
Teradoom - Wednesday, February 17, 2021 - link
Their production costs are not 'our' (the consumer's) concern. However, the cost of their products is, as is 'our' willingness to fuel this brand-based consumer ideology of success (e.g. Apple, Samsung, etc.), which leads to more expensive products with each reiteration of the same. If the RTX 3060 is indeed going to cost 350-plus dollars, then they should abandon the branding nomenclature altogether and give their products individual names.
anemusek - Saturday, February 20, 2021 - link
Nobody is saying it, but reducing mining performance is tantamount to reducing overall GPGPU performance - in all applications that use CUDA or OpenCL, not just the ROPs and TMUs. Nvidia has chosen the worst possible way to increase accessibility: at the expense of customers, solely for their own profit. No one doubts that they didn't like the prospect of used GPUs being sold...
dlasher - Monday, February 22, 2021 - link
When are they going to learn? Paid pre-orders FTW please