Society is set up in such a way that people are trained to:
1. Respect authority figures
2. Follow what your parents tell you to do
3. Believe that your education is the most important thing in your life (between the ages of 10 and 30)
All of these things lead to situations where people are more than willing to take out massive loans, because everyone in their life tells them "it will be ok!" and "don't think about the money right now, it's about the experience," which is one of the most disgusting ideas I think is perpetuated. When your parents, your teachers, your politicians, your peers, etc. are all pushing something, it's hard to think "that's a bad idea."
By the time you graduate high school, it's incredibly clear that studying finance or engineering will give you a higher income than studying most liberal arts majors. That's so universally known in our culture (and I say this as someone from a terrible neighborhood where nobody went to college) that it's almost absurd.
Nobody signs up for a liberal arts major, sticks with it all the way through college (even while watching their low-to-zero internship pay next to the high pay their peers in other majors are earning), and then, on graduating, says "wow, I'm shocked!"
So why not put a similar warning on it anyway? We already expect homeopathic medications to include "Not an FDA-approved treatment for X" on the label, even though everyone supposedly knows better.