Okay, you've mentioned Kolmogorov complexity, and that's a reasonable approach. Under that definition, no program that produces a random sequence can be shorter than the sequence itself, i.e. the sequence has maximum entropy. The reason is that a program shorter than its output implies that a pattern is being exploited, and that, in turn, implies the sequence isn't random (a random sequence by definition has no exploitable patterns).
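To make that concrete, here is a minimal sketch (my own illustration, not part of the original argument): a very long but patterned string can be produced by a tiny program, so its Kolmogorov complexity is bounded by the program's length, far below the string's own length. A truly random string of the same length would admit no such short description.

```python
# A million-character string emitted by a program only a few
# dozen characters long: its Kolmogorov complexity is at most
# roughly the length of this source line, nowhere near 10**6.
patterned = "01" * 500_000

print(len(patterned))  # 1000000 characters of output
```

A random million-character string, by contrast, would essentially have to be stored literally inside any program that prints it.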
Another test, based on the same idea, is to hand the "random" sequence to a hypothetical ideal compression algorithm (not one that we know of or could actually write) and see what happens. If the compression algorithm cannot make the sequence smaller, the sequence is random. This is really the same criterion restated: a compressible sequence has an exploitable pattern, and an incompressible one has maximum entropy.
But all of these ways of thinking about randomness rely on the sequence being infinite in size (or on an inexhaustible function as the source of test values). If that condition isn't met, and we acquire only a finite sequence of N numbers, one can argue that the next N values might simply repeat the same pattern, and therefore that the sequence isn't random; no finite sample can rule this out. That objection can only be answered by defining the problem in terms of an infinite sequence.