QEMU briefly used -Og for its debug builds, but switched back to -O0. In practice, -Og produces many more situations where gdb just says "&lt;optimized out&gt;" instead of telling you the values of variables, arguments in stack backtraces, and so on. If -Og really were "optimize where possible without breaking the debug illusion", that would be great, but in my experience it absolutely was not, and now I'm pretty wary of going back and trying it again when -O0 works just fine for me for debug...
What version of GCC/GDB were they using? I've been using GCC/GDB 7 for embedded development and haven't really seen a problem with it. -Og is also almost a hard requirement there, because otherwise the generated code is much larger and much slower, which impacts the runtime behaviour to a significant extent.
This would have been gcc 5.4.0, as shipped by Ubuntu 16.04. Certainly it's possible that newer gcc versions do better, but from my point of view, if they started out with something that breaks the debug illusion, it indicates that their definition of the feature is wildly different from mine. Once bitten, twice shy.