To explain things even further, let's say that the perceptual algorithm produces a false positive 1% of the time. That is, 1 in every 100 completely normal pictures is incorrectly matched with some picture in the child pornography database. There's no reason to think (at least none springs to mind; happy to hear suggestions) that a false positive on one image makes a false positive on another image any more likely, so the matches can be treated as independent events. Thus, if you have a phone with 1000 pictures on it, and it takes 40 matches to trigger a report, there's less than a 1 in a trillion probability of that happening if the pictures are all normal.
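Under that independence assumption, the number of false matches among 1000 normal pictures follows a binomial distribution with n = 1000 and p = 0.01. A quick sketch to check the "1 in a trillion" figure (the 1%, 1000, and 40 values are just the illustrative numbers from above, not parameters of any real system):

```python
from math import comb

def binomial_tail(n: int, p: float, k: int) -> float:
    """P(X >= k) for X ~ Binomial(n, p): probability of
    at least k false matches among n independent pictures."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# 1000 pictures, 1% false-positive rate per picture, threshold of 40 matches
prob = binomial_tail(1000, 0.01, 40)
print(prob)
```

The expected number of false matches is only 10 (1000 × 0.01), so needing 40 puts you deep in the tail; the computed probability comes out well under 10⁻¹², consistent with the claim.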