Another possibility would be two adjacent 32-bit ints merging into one plausible 64-bit pointer. For a 2GB heap, that requires the high half to hit one of the two valid values in the range 1≤x≤32767 (a 2GB heap spans at most two 4GiB-aligned windows of address space, and user-space addresses below 2^47 have high halves under 32768, so the two candidate values are small ints in a reasonably frequent range for general-purpose numbers); the bottom half can be anything. Whether such packing can happen on-stack depends on codegen (and, to a lesser extent, on the program, as one can just write "12345L<<32" and get a value with the 2-in-32767 (0.006%) chance of hitting the heap). But even then, fitting a million such pairs on the stack only gives ~61 potential false roots, and for those to be significant, some must hit massive objects, which brings back a roughly 1-in-a-billion factor unless there are large cyclic object subgraphs.
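The back-of-envelope numbers above work out as follows (a sketch, assuming the 2GB heap's two possible high-half values both land in the 1..32767 range, and that the low half contributes nothing):

```python
# Odds that a random small int (1..32767) in the high half of an
# adjacent int pair matches one of the heap's two high-half values.
per_slot = 2 / 32767                  # ~0.006% per 64-bit slot

stack_slots = 1_000_000               # a million such int pairs on the stack
expected_false_roots = stack_slots * per_slot

print(f"per-slot odds: {per_slot:.6%}")        # ~0.006104%
print(f"expected false roots: {expected_false_roots:.1f}")  # ~61.0
```

So even a pathologically packed stack yields only a few dozen candidate false roots, before the per-root odds of pinning a large object subgraph are factored in.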