For me, there are plenty of old-school models which are still very useful, run fine on a modern, fast CPU, and would do well with NumPy, and even more so with CuPy. Step through:
https://huggingface.co/learn/nlp-course/chapter1/1
And there's a pile of awesome. It feels pretty lame compared to GPT-4o, ChatGPT, or even GPT-3, but it's still super-useful a lot of the time, and not too resource-intensive.
(Disclaimer: That's the original Hugging Face course, and it's politely structured to work on reasonable machines too. They have other courses which require a moderate GPU, and plenty of models which require crazy hardware.)
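To give a flavor of what "runs fine on CPU" looks like in practice, here's a minimal sketch using the `transformers` pipeline API that the course teaches. It assumes `transformers` (plus a backend like PyTorch) is installed; the DistilBERT checkpoint named below is the one I'd reach for, since it's small enough to run comfortably without a GPU.

```python
from transformers import pipeline

# DistilBERT fine-tuned on SST-2: a small, old-school model that is
# perfectly happy on a CPU. device=-1 explicitly forces CPU inference.
clf = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    device=-1,
)

result = clf("This old-school model is still plenty useful.")
print(result)
```

The first call downloads the checkpoint (a few hundred MB); after that, inference on short texts takes well under a second per input on a modern CPU.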