Combine this with an interesting paper[1] from summer 2023 on HBM paired with Xeon processors, which would now allow for 144 GB on a single CPU. In theory, at least.
1: https://lenovopress.lenovo.com/lp1738-implementing-intel-hig...
Not true. They announced completion of development, with the intention to begin mass production in H1. So a couple of months out, at least.
> Micron's memory roadmap for AI is further solidified with the upcoming release of a 36 GB 12-Hi HBM3E product in March 2024.
So likely competitive timing with Samsung for 36 GB 12-Hi.
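The 36 GB 12-Hi figure works out cleanly if you assume 24 Gbit DRAM dies in the stack, which is a back-of-envelope sketch rather than anything from Micron's release:

```python
# Back-of-envelope: capacity of a 12-Hi HBM3E stack.
# Assumes 24 Gbit (3 GB) DRAM dies -- an inference, not a confirmed spec.
dies_per_stack = 12           # "12-Hi" = 12 stacked DRAM dies
die_capacity_gb = 24 / 8      # 24 Gbit per die = 3 GB
stack_capacity_gb = dies_per_stack * die_capacity_gb
print(stack_capacity_gb)      # 36.0, matching the announced 36 GB stack
```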
48GB consumer cards (or 96GB pro cards) would sell like hotcakes if AMD/Intel dare to break the artificial VRAM segmentation status quo.
Personally I'd love to have as much VRAM as possible (and as high a bandwidth as possible too) to mess around with simulations, but that's definitely a pro workload.
I'd love to see a flagship card offer a stupid-amounts-of-VRAM spec option, like an RTX 4090 with 32-48 GB of VRAM, just to see what happens with it on the market.
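One reason 48 GB consumer cards would move: weights for local model inference scale linearly with parameter count and precision. A rough sketch, with illustrative model sizes (the weights-only formula ignores activations and KV cache, which add more on top):

```python
# Rough VRAM needed just to hold model weights.
# Illustrative numbers, not a spec for any particular card or model.
def weights_gb(params_billion: float, bits_per_param: int) -> float:
    """GB of memory for the weights alone."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

print(weights_gb(70, 16))  # 140.0 -- a 70B model at fp16 needs multiple cards
print(weights_gb(70, 4))   # 35.0  -- 4-bit quantized, fits a 48 GB card
print(weights_gb(13, 16))  # 26.0  -- 13B at fp16, over today's 24 GB flagships
```

The 24 GB ceiling on current consumer flagships is exactly what keeps the 13B-at-fp16 and quantized-70B cases in pro-card territory.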
AMD is doing the same thing; the only high-memory cards they put out (MI300) are for data centers.
https://investors.micron.com/news-releases/news-release-deta...
- CAS latency?
- Wattage?
Also: https://piped.video/watch?v=2G4_RZo41Zw (is this memory the same size as 5 nm?)
Not naming the company, but it seems like HBM manufacturers might be going all-in to benefit from Nvidia's stock surge.