As a student I remember thinking that this was incredibly alarming. Without a good understanding of this concept, how can we be sure any of what we're saying is even remotely correct?
What Knuth has given us here is the ideal treatment of the subject, essentially putting the question of what randomness is to rest for good. He starts with a comprehensive, entertaining history of the landscape (in short: researchers ping-ponged between definitions of randomness so strict that nothing qualified as random, and so loose that even pathologically predictable sequences counted) before finally proving a theorem that completely solves the problem.
It is a monumental accomplishment in the field, and it is quite shocking to me that it's still so obscure. It's one of my favorite results in all of CS, anywhere.
If you haven't had the chance, I highly recommend the rest of this volume (Seminumerical Algorithms). It is just stuffed with other surprising, amazing results.
But picking up a like-new used copy from Amazon for less than $10 with shipping made it too good to pass up, and, as with the rest of Knuth, every time I read a few pages I learn something new.
http://www.av8n.com/computer/htm/secure-random.htm
---
"My favorite proverb of all is the one that says for every proverb, there is an equal and opposite proverb. In this case, we should note that the proverb about not putting all your eggs in one basket is not necessarily a sound engineering principle. The right answer depends on the margin of error and on the per-basket failure rate. It also greatly depends on the chance of correlated, uncorrelated, and anti-correlated basket failures. Sometimes the best way to get your eggs from point A to point B is to put them all in one basket and take really good care of that basket."
---
If you don't believe me, read the chapter. Early probability theorists (e.g., von Mises, Kolmogorov) literally started thinking about randomness in order to define probability.
EDIT: And I suppose it's worth pointing out that pseudorandomness is not at all the same thing as randomness. The fact that you seem to use them interchangeably is not a good sign IMHO.
EDIT 2: Why the unexplained downvote, HN? :(
So, just to get it done with, I pick 1, 2, 3, 4, 5, 6, and 7 on one combo and 8, 9, 10, 11, 12, 13, and 14 on the other. I hand it to him, and he throws his hands in the air and angrily says, "What the hell? You just wasted two combinations, these numbers are never going to get drawn!"
It handily illustrates how hard it is to grasp true randomness and probability, and even if you do get it, you'll sometimes be caught off guard when your psychological biases kick in.
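For concreteness (the story doesn't say which lottery, so assume a hypothetical pick-7-of-35 game): the "silly" ticket 1 through 7 has exactly the same chance as any random-looking pick.

    from math import comb

    # Hypothetical pick-7-of-35 lottery: every 7-number combination is
    # equally likely, so 1..7 is no worse (or better) than any other ticket.
    total = comb(35, 7)
    print(f"possible tickets: {total:,}")        # 6,724,520
    print(f"P(1,2,3,4,5,6,7)   = {1/total:.3e}")
    print(f"P(any other combo) = {1/total:.3e}") # identical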
And it is not the only way to get good independent streams, either; running a cipher in counter mode is arguably a superior way to do it, cf. [1] and the sketch below.
[1] http://www.thesalmons.org/john/random123/papers/random123sc1...
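In that spirit, here is a toy counter-based generator (my own sketch, not the Random123 code; SHA-256 stands in for a real keyed counter-mode primitive like Philox or Threefry). The k-th output of a stream is just f(key, k), so independent streams and random access both come for free.

    import hashlib
    import struct

    def counter_rng(key: bytes, counter: int) -> int:
        """Return the counter-th 64-bit output of the stream named by key."""
        block = hashlib.sha256(key + struct.pack("<Q", counter)).digest()
        return struct.unpack("<Q", block[:8])[0]

    # Each worker gets its own key, hence its own independent stream, and
    # any element can be generated on demand, in any order.
    stream_a = [counter_rng(b"worker-0", i) for i in range(3)]
    stream_b = [counter_rng(b"worker-1", i) for i in range(3)]
    print(stream_a)
    print(stream_b)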
"The simplest idea of stability is constancy, or invariance. A thing that has no possibility to change is, by definition, immune to external perturbations. [...] Invariance is an important concept, but also one that has been shattered by modern ideas of physics. What was once considered invariant, is usually only apparently invariant on a certain scale. When one looks in more detail, we find that we may only have invariance of an average." - Mark Burgess, In Search of Certainty: The Science of Our Information Infrastructure (2013)
This accords well with the opening quotation: "Lest men suspect your tale untrue, / Keep probability in view." - John Gay, English poet and dramatist and member of the Scriblerus Club (1727). https://en.wikipedia.org/wiki/John_Gay