> IMO a full GC is just the wrong level of abstraction for anything that's timing relevant, which includes all UI threads.
Hard real-time GC systems exist; for them you can prove that pauses never exceed a fixed number of milliseconds. They're certainly applicable to programs with a UI.
Can you prove that dropping a reference doesn't free an arbitrarily large number of objects? For a specific program you can probably convince yourself that you never see arbitrarily long refcount-release pauses, but any change you make to the code can invalidate that analysis.
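A minimal sketch of that failure mode, in Rust with `Rc` as a stand-in for any refcounting scheme (the same applies to ARC in Swift or `shared_ptr` in C++): releasing the last reference to a large structure frees everything it owns at that single release point, so the pause grows with the size of the owned graph.

```rust
use std::rc::Rc;
use std::time::Instant;

fn main() {
    // Build a large object graph owned by a single refcounted handle.
    let graph: Rc<Vec<Box<[u8; 64]>>> =
        Rc::new((0..1_000_000).map(|_| Box::new([0u8; 64])).collect());

    let start = Instant::now();
    // Dropping the last reference frees all one million boxes right here:
    // the cost of this one "release" is proportional to the graph size,
    // not to anything the surrounding code can see locally.
    drop(graph);
    println!("final release took {:?}", start.elapsed());
}
```

Nothing at the call site hints that this particular drop is the expensive one, which is why a local code change elsewhere (e.g. moving the last strong reference onto the UI thread) can silently reintroduce the long pause.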
A hard real-time GC, by contrast, stays hard real-time no matter how the code changes.