The paper states for the main result:
>In this section, we show gradient descent with a constant positive step size converges to the global minimum with a linear rate.
This is rather ambiguous: it reads as a guarantee that GD *will* converge to the global minimum, and at a linear rate to boot. But I suspect the intended claim is conditional — "*if* it converges to the global minimum, *then* it converges at a linear rate" — phrased, perhaps purposefully, to sound like the unconditional statement.
Could you comment on whether GD does or does not find the global minimum of the integer factorization cost function discussed in the following comment?
https://news.ycombinator.com/item?id=18439287
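For concreteness, here's the kind of thing I have in mind. This is only a sketch under an assumption: the linked comment doesn't pin down the exact cost function, so I'm using the common toy formulation `L(p, q) = (p*q - N)**2` over the reals. Plain GD happily drives this loss to zero, but it lands on the continuous hyperbola `p*q = N`, generally nowhere near an integer factor pair:

```python
# Toy sketch. ASSUMPTION: the factorization cost is L(p, q) = (p*q - N)^2
# over real p, q -- the linked comment may define it differently.
N = 143.0  # 11 * 13

def grad(p, q):
    # dL/dp = 2*(pq - N)*q ; dL/dq = 2*(pq - N)*p
    r = 2.0 * (p * q - N)
    return r * q, r * p

p, q = 3.0, 20.0  # arbitrary real-valued starting point
lr = 1e-4         # constant step size, as in the quoted claim
for _ in range(5000):
    gp, gq = grad(p, q)
    p -= lr * gp
    q -= lr * gq

# GD reaches a global minimum of the *relaxed* problem: p*q is
# essentially N, but (p, q) is not an integer factor pair of 143.
print(p, q, p * q)
```

So even when the theorem's hypotheses hold and GD does converge linearly to a global minimum of the real-valued surrogate, that says nothing about recovering the integer factors — the relaxation has a continuum of global minima, and rounding the one you land on generally doesn't factor N.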