To begin with, it would be a system that doesn't tell people to use Elmer's glue to keep the cheese from sliding off their pizza, a suggestion that displays a fundamental lack of understanding of... everything. At minimum, it needs to reliably solve hard, novel, but well-defined problems as well as a cohesive group of highly intelligent people could. It's certainly not AGI until it can do a better job than the most experienced, talented, and intelligent knowledge workers out there.
Every major advancement (which LLMs certainly are) has caused some disruption in the fields it affected, but that isn't a useful criterion for distinguishing a "crude but useful tool" from "AGI".