> But I wonder why it's not trivial to throw a bunch of different inputs at your cipher functions and check that the execution times all fall within some epsilon tolerance?
My guess is that the GC introduces pauses, and therefore nondeterminism, into any measurement of how long something takes.
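A minimal sketch of why such a naive timing test is shaky (the `lookup` function and all names here are hypothetical, chosen for illustration): even for an obviously leaky early-exit comparison, any individual sample can be skewed by a GC pause, scheduler preemption, or CPU frequency scaling, so a simple "all within epsilon" check over raw timings produces false positives and false negatives.

```python
import gc
import statistics
import time

def lookup(secret: bytes, guess: bytes) -> bool:
    # Hypothetical early-exit comparison -- exactly the kind of
    # function a timing test would try to flag as non-constant-time.
    for a, b in zip(secret, guess):
        if a != b:
            return False
    return len(secret) == len(guess)

def time_call(fn, *args, reps=1000):
    # Collect raw per-call timings in nanoseconds.
    samples = []
    for _ in range(reps):
        t0 = time.perf_counter_ns()
        fn(*args)
        samples.append(time.perf_counter_ns() - t0)
    return samples

secret = b"s" * 32
near = b"s" * 31 + b"x"   # differs only in the last byte
far = b"x" * 32           # differs in the first byte

gc.collect()  # reduce (but not eliminate) the chance of a mid-run pause
near_ns = statistics.median(time_call(lookup, secret, near))
far_ns = statistics.median(time_call(lookup, secret, far))

# Individual samples are noisy: a GC pause can make a "fast" input look
# slow on any given run, so a per-sample epsilon check is unreliable.
# A robust statistic over many repetitions is needed to see the leak.
print(near_ns, far_ns)
```

Note that the sketch has to fall back on a median over many repetitions precisely because single measurements are polluted by the nondeterminism the comment describes; GC pauses are one source of that noise, alongside the OS scheduler and the CPU itself.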