I agree that the situation the author outlines is unsatisfactory, but it's mostly the fault of the education system (and by extension the post author). With a class writing exercise like the author describes, of course the students are going to use an LLM; they would be stupid not to if their classmates are using it.
The onus should be on the educators to reframe how they teach and how they test. It's strange how the author can't see this.
Universities and schools must change how they do things with respect to AI, otherwise they are failing the students. I am aware that AI has many potential and actual problems for society but AI, if embraced correctly, also has the potential to transform the educational experience in positive ways.
Why would they be stupid? Were people before LLMs stupid for not asking a smarter classmate, parent, or paid contractor to solve the homework for them?
A large part of education is learning things that can be easily automated, because you can't learn hard things without first learning easy things. Nothing has conceptually changed in this regard, just as Wolfram Alpha didn't change the way differentiation is taught.
I agree that it's not ideal to let assignments that weren't designed with external sources in mind significantly affect the final grade. But I think this is a minor, easily fixable point rather than some failure of the whole education system.
In American universities where your GPA from your in-class assessments forms part of your final grade? Yes, absolutely.
Where I came from you do your learning in class and your assessment in a small, short set of exams (and perhaps one graded essay) at the end of each year. That seems far more conducive to learning things without having to juggle two competing objectives the whole time.
Whether not doing everything to maximize your GPA is "stupid" (literally or figuratively) is a good question too.
But even if your assignments influence your GPA, they're rarely the only thing that does, and not doing assignments will harm your ability to perform in the midterm/exam/whatever.
https://chatgpt.com/share/6817fe76-973c-8011-acf3-ef3138c144...
https://www.reddit.com/r/ChatGPT/comments/1hun3e4/my_little_...
I don't know what the answer is. I'm old school; if it were up to me I'd bring back slide rules and log tables, because that's such a visual and tactile way of getting to know mathematics and numbers.
It's interesting to consider how AI is affecting humans' cognition skills. Is it going to make us stupid or free us up to use our mental capacities for higher level activities? Or both?
It's only stupid if you try to optimize for the wrong things (finishing quickly, just getting a pass).
I'd say it's very smart not to rely on LLMs, copy homework from someone else, or take similar shortcuts, because then you're optimizing for learning, which will help you more in the long run.
> Universities and schools must change how they do things with respect to AI, otherwise they are failing the students.
Hard disagree.
Students need to answer a fundamental question for themselves:
Am I here to learn or to get a passing grade?
If it is the former, the latter doesn't really matter. If it is the latter, the former was not the point to begin with.
University is like a supermarket.
For some, they go there with a loose idea of what they want only to find ingredients not previously considered, often ending up with a better dining experience because of it.
For others, it is aisle after aisle of crap "Uber Eats" can deliver already made and without the hassle of having to cook it.
To each their own.