Assuming there is a development that makes GPUs obsolete, I think it's safe to assume that whatever replaces them at scale will still take the form of a dedicated AI card/rack, for a few reasons:
1. Tight integration is necessary to address fundamental compute constraints like memory latency.
2. Economies of scale.
3. Opportunity cost to AI orgs. Meta, OpenAI, etc. want 50k H100s to arrive in a shipping container and plug in, so they can focus on their value-add.
Everyone would have to readjust to this paradigm. Even if next-gen AI runs better on CPUs, Intel won't suddenly be signing contracts to sell 1,000,000 Xeons plus 1,000,000 motherboards, etc.
Also, Nvidia has ~$25bn cash on hand and almost $10bn in yearly R&D spend. They've been an AI-first company for over a decade now; they're better prepared to pivot than anyone else.
Edit: nearly forgot - Nvidia can issue 5% new stock and raise $100B like it's nothing.