Optane products were sold as DIMMs with single-DIMM capacities as high as 512 GB, and an Intel memory controller could make them look like DRAM to the rest of the system.
512 GB.
It was slower than conventional DRAM.
But for AI models, Optane may have had an advantage: the underlying 3D XPoint media is bit-addressable.
I'm not aware of any memory controller that actually exposed that single-bit granularity; Optane was fighting to create a niche for itself between DRAM and NAND flash, pretending to be both when it was neither.
An architecture that puts bit-level operations and computational units in the same device as massive storage has yet to be developed.
AI GPUs approximate such an architecture by plopping 16 GB or more of HBM next to a sea of little dot-product engines.