Indeed. But they won't get to "AGI", because that goal isn't even remotely well defined. A "human-level" intelligence implies a large number of properties that cannot exist inside an inference machine. Dreams, for example, might be considered part of "human-level" intelligence. Will the machine dream?
What happens if you turn a "human-level" intelligence off? Did you kill someone?
AGI is a pipe dream; moreover, it's not even something anyone actually wants.