The $20/month subscription gives you access to the commercial models, but you generally have to run the open-weight models yourself. With the unified memory you can trivially run the larger 70B+ models; a rough sizing sketch is below.
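As a ballpark for what "fits" means (assuming ~4-bit quantization and some overhead for the KV cache and runtime; these are my own rough numbers, not benchmarks):

    # ballpark memory footprint for a quantized model
    # assumption: ~4 bits per weight plus ~20% overhead (KV cache, runtime)
    def approx_model_gb(params_billions, bits=4, overhead=1.2):
        return params_billions * bits / 8 * overhead  # result in GB

    print(approx_model_gb(70))   # ~42 GB: a 4-bit 70B fits comfortably in 128 GB of unified memory
    print(approx_model_gb(34))   # ~20 GB: mid-size models fit even on 32-64 GB machines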
AI researchers generally have to use CUDA, since the ecosystem for training and fine-tuning is still mostly CUDA-only, but those who only occasionally need to run custom/local models for inference will likely find high-end Macs a good fit for their use cases.
Personally, I’m reasonably happy with GPT-4 and GitHub Copilot, and I’ve occasionally used Midjourney, though I cancelled my subscription since I’m not currently generating any images. Are there important apps that I’m missing?
If Apple releases on-device AI, it will be an effective way of getting people to upgrade the way they used to, but haven't needed to recently. For example, I bought Pro-level computers in my younger years, but now I would only consider an MBA, mini, or iMac. They could get me to go for a Pro, though, if that were the only way to get more RAM for better AI performance. It would also likely shorten upgrade cycles, since newer computers would have the latest and greatest performance. When I bought my M2 MBA years ago I expected it to last me a long time. Now I'm not so sure, since I don't have a ton of RAM.
Not comparable. An M2 Ultra with 128GB of RAM has 800 GB/s of memory bandwidth.
Maximum bandwidth for a DDR5 Intel 14900K system is 89.6 GB/s. https://www.intel.com/content/www/us/en/products/sku/236773/...
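To put that gap in practical terms: single-stream token generation is mostly memory-bandwidth-bound, so a crude upper bound on tokens/sec is bandwidth divided by the weight size it has to stream per token. A sketch under that assumption (real throughput is lower, and it ignores compute limits, caching, and batching):

    # crude ceiling on decode speed: each generated token streams all weights once
    # assumption: bandwidth-bound decoding, single stream, ~42 GB for a 4-bit 70B model
    def approx_tokens_per_sec(bandwidth_gb_s, model_gb):
        return bandwidth_gb_s / model_gb

    print(approx_tokens_per_sec(800, 42))   # M2 Ultra: ~19 tok/s ceiling
    print(approx_tokens_per_sec(89.6, 42))  # DDR5 14900K: ~2 tok/s ceiling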