If your use case requires a minimum bound, use another algorithm.
The only way for the total amount of leaked memory to grow is for a new pointer-like value to replace something else on the stack, which releases whatever was retained there before. And assuming there's some maximum stack size, there's only a finite amount of such space.
The end result is that even if some code path generates an integer that looks like a reference, it can only leak one object (which could admittedly be massive, but with a generational GC or similar there's no guarantee that all garbage gets collected in any single GC cycle anyway). And if codegen clears stack slots when their lifetimes end, you need one such integer-holding frame on the stack for every object you want leaked.
Luckily, sorting is something you can easily swap out for another implementation if the default one doesn't fit your use case, unlike the GC built into the single language implementation that your customer uses.
Branch misprediction is a permanent loss. You will never get those nanoseconds back.
Space inefficiencies require you to install more memory.