AI can be used in ways that lead to deeper understanding. If a student uses AI for practice problems, essay feedback, or a different explanation of something they struggle with, those methods of learning should translate into actual knowledge that can serve as the foundation for future learning or work, and that can be evaluated without access to AI.
That actual knowledge is really important. Literacy and numeracy are not the same thing as mental arithmetic. Someone who can't read the literature in their field (whether that's a Nature paper, a business proposal, or a marketing tweet) shouldn't rely on AI to think for them, and universities certainly shouldn't be encouraging that or endorsing it through a degree.
I think the most important thing about that kind of deeper knowledge is that it's "frictional", as the original essay says. The highest-rated professors aren't necessarily the ones I've learned the most from, because deep learning is hard and exhausting. Students, by definition, don't know what's important and what isn't. If someone has done that intellectual labor and then finds AI works well enough, great. But that's a far cry from being reliant on AI output while incapable of understanding its limitations.