Even if we accept that a photo of your kids in the bath could match the hash of a known, already-identified CSAM image (a real stretch), under this system the voucher payload does not contain the private key needed to decrypt the picture, so nobody can evaluate the photo or flag it as anything. Human reviewers only ever see a “visual derivative” based on the perceptual mechanism used to create the hash, not the actual photo.
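To see why “visually similar subject” doesn’t mean “matching hash”: perceptual hashing only tolerates small, encoding-level differences of the *same* image, not different photos of similar scenes. Here’s a minimal toy sketch using 64-bit hashes and a Hamming-distance threshold — the real system (Apple’s NeuralHash plus private set intersection) is far more involved, and the hashes and threshold below are purely illustrative assumptions:

```python
def hamming(a: int, b: int) -> int:
    """Count differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def matches(photo_hash: int, known_hash: int, threshold: int = 4) -> bool:
    """A photo 'matches' only if its hash is nearly identical to a
    database hash. Different photos of a similar subject (any bathtub
    picture) produce unrelated hashes and stay far above the threshold."""
    return hamming(photo_hash, known_hash) <= threshold

known = 0xDEADBEEFCAFEF00D          # illustrative hash of one specific known image
near_duplicate = known ^ 0b101      # same image re-encoded: only 2 bits differ
unrelated = 0x0123456789ABCDEF     # a different photo, even of a similar scene

print(matches(near_duplicate, known))  # True
print(matches(unrelated, known))       # False
```

The point is that matching is against specific known images, not image categories, so an innocent photo of the same subject matter simply never lands near a database entry.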
Other companies have run CSAM scanning and reporting with far fewer safeguards, and the “my kid in the bathtub” scenario doesn’t seem to have actually been a problem.