12 Comments

  • ballsystemlord - Friday, March 27, 2020 - link

    Are there any details on Micron's process nodes? I tried wikichip.org without success.
    The naming (1y, 1z) is different from other fabs'.
  • eek2121 - Friday, March 27, 2020 - link

    https://www.anandtech.com/show/14593/microns-dram-...
  • eek2121 - Friday, March 27, 2020 - link

    Also https://www.anandtech.com/show/14746/micron-mass-p...
  • kpaczari - Friday, March 27, 2020 - link

    TechInsights has lots of info:
    https://www.semiconductor-digest.com/2019/09/13/dr...
    https://www.techinsights.com/blog/techinsights-mem...
  • ballsystemlord - Friday, March 27, 2020 - link

    Nice find!
  • eastcoast_pete - Friday, March 27, 2020 - link

    Any information or rumors on initial large volume customers?
    While very unlikely, HBM/HBM2 memory could differentiate the upcoming PS5 from the new Xbox or vice versa. At current specs, the GPU of either would make good use of the high bandwidth, and guarantee a large volume for years to come.
    However, a more realistic first use is in the upcoming higher end GPUs from NVIDIA, Intel and AMD.
  • Slash3 - Friday, March 27, 2020 - link

    Both upcoming consoles have already confirmed the use of GDDR6.
  • eastcoast_pete - Friday, March 27, 2020 - link

    I know. GDDR6 should be plenty fast and is likely much cheaper.
  • Diogene7 - Saturday, March 28, 2020 - link

    It seems that a 32GB DDR4-2666 SO-DIMM is approximately 150€ to 200€, with ~20% VAT included, as of March 2020 (according to www.amazon.fr).

    Could anyone give an approximate idea of what 32GB of GDDR6 versus 32GB of HBM2 would cost as of March 2020? It would help to get a rough sense of how much money we are talking about...

    Ex: 32GB of GDDR6 is said to be ~50% more expensive than (so 1.5x the price of) 32GB of DDR4, so 32GB of GDDR6 would be between 225€ and 300€; and HBM2 is said to be 150% more expensive than (so 2.5x the price of) GDDR6, so 32GB of HBM2 would be between roughly 560€ and 750€.
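    The multiplier arithmetic above can be sketched in a few lines. This is a minimal Python back-of-the-envelope calculation; the DDR4 price range comes from the comment, and the 1.5x / 2.5x multipliers are the commenter's assumptions, not market data.

    ```python
    # Hypothetical price estimate based on the assumptions in the comment above.
    ddr4_low, ddr4_high = 150.0, 200.0  # EUR, 32GB DDR4 SO-DIMM, March 2020

    gddr6_multiplier = 1.5  # assumed: GDDR6 ~50% more expensive than DDR4
    hbm2_multiplier = 2.5   # assumed: HBM2 ~150% more expensive than GDDR6

    gddr6_low, gddr6_high = ddr4_low * gddr6_multiplier, ddr4_high * gddr6_multiplier
    hbm2_low, hbm2_high = gddr6_low * hbm2_multiplier, gddr6_high * hbm2_multiplier

    # gddr6 range: 225.0 to 300.0 EUR
    # hbm2 range:  562.5 to 750.0 EUR
    ```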
  • extide - Sunday, March 29, 2020 - link

    You have to be careful -- the actual memory itself isn't THAT much more expensive than GDDR6. It is likely somewhat pricier because of the stacking, which makes it harder to produce, but not by much. However, the total implementation cost must also include the interposer -- whether a full-size interposer like everyone but Intel uses, or the mini EMIB bridges that Intel uses -- which adds further cost. None of these actual costs are public, so it's really all speculation.
  • TheJian - Sunday, March 29, 2020 - link

    Except for certain use cases (servers, some pro stuff), this crap is pointless and expensive. Stick to easy-to-produce GDDR5X/GDDR6-type stuff. Not sure why they didn't just do GDDR6X vs. HBM2. All HBM/HBM2 has done for AMD is kill NET INCOME on flagship cards (gaming, that is; surely they make some on pro stuff). It was the cause of shortages multiple times, cost issues multiple times, delays...Jeez, why bother with this crap for gaming stuff? Can you prove it is NEEDED? No? Then why the heck are we still having to deal with this crap memory for gaming?

    We almost got screwed by rambus/Intel ages ago, why are they shoving this crap down our throats now? PROVE you need the bandwidth on the gaming side or drop this crap AMD/NV (NV joining it seems too for gaming, stupid if so). Go cheaper unless you can prove bandwidth is an issue. So far I have seen NOTHING to prove we need the memory on a gaming card.

    Is the Titan RTX bandwidth-starved with 24GB of GDDR6? Nope, 672GB/s seems enough. I doubt NV's next release this xmas (or whatever) will be massively faster than the Titan RTX, so prove you need it or forget raising the costs for NOTHING (talking to BOTH sides). Drop this expensive buzzword memory until you can PROVE you need it. And don't say 4K; nobody plays there, and you'd still have to prove it. Yeah, I call under 5% NOBODY (that's pretty much the total of 1440p + 4K!). Wasting time on 4K testing in every card review is dumb; do more 1080p, where 65% play. I mean, should you write for 98% of your readers or 2%? Only a fool in publishing wastes time on nobody. I haven't looked at a 4K result in your reviews (anyone's, LOL) since the first one hit your site :) I still won't be on 4K for my next monitor...ROFL. Still no 4K TV either, and I can easily afford ~$800 for a great one at Costco, but my 1080p Samsung/LG sets still work fine (61/65in, both look great even with a good 720p rip - not how AnandTech tests, mind you...trash settings here) ;)
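    The 672GB/s figure quoted above can be sanity-checked with the standard peak-bandwidth formula: bus width (in bytes) times per-pin data rate. The Titan RTX numbers (384-bit bus, 14 Gbps GDDR6) are public specs; the HBM2 comparison point assumes a typical four-stack, 4096-bit configuration at 2.0 Gbps per pin, which is an illustrative assumption, not a claim about any specific product.

    ```python
    # Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
    def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    titan_rtx = peak_bandwidth_gbs(384, 14.0)    # GDDR6, 384-bit bus -> 672.0 GB/s
    hbm2_4stack = peak_bandwidth_gbs(4096, 2.0)  # assumed 4-stack HBM2 -> 1024.0 GB/s
    ```

    This shows why HBM2 only pays off when a design genuinely needs more bandwidth than a wide GDDR6 bus can deliver.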
  • lilkwarrior - Sunday, March 29, 2020 - link

    This stuff is first and foremost for pro usage. Gamers are a very broad audience that is hardly willing to pay for the high-end GPUs that warrant HBM memory.

    On top of that, games have been stuck catering to Windows 7 users; as a result, game developers have had to write games that are extremely inefficient at leveraging modern hardware like an HBM2 GPU.

    That has only changed recently, with Windows 7 dying and next-gen consoles releasing, which lets game developers enforce far higher baselines moving forward (WDDM 2.0 FINALLY being the baseline for DX12 & Vulkan, and Windows 10-exclusive DX12 features now labeled DX12 Ultimate to make it easier to communicate, once the next-gen consoles drop, which games leave Windows 7 behind, so gamers can accommodate by upgrading).
