Unfortunately, it's now vital to child safety that iPhone users never be allowed to jailbreak their phones. Apple's recently announced CSAM photo-detection feature relies on Apple being sure that every device is running the real NeuralHash machine-learning model, not a fake that emits random output indistinguishable from a non-matching photo.
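To make the spoofing concern concrete, here's a toy sketch (not Apple's actual pipeline): `toy_neuralhash` below is just a truncated SHA-256 stand-in for the real NeuralHash, which is a perceptual ML model, and the database contents are made up. The point is only that if a tampered device substitutes a model emitting random bits, server-side matching against a hash database silently stops working.

```python
import hashlib
import os

HASH_BYTES = 12  # NeuralHash outputs ~96 bits; this toy uses the same size

def toy_neuralhash(image_bytes: bytes) -> bytes:
    # Stand-in for NeuralHash: the real model maps visually similar
    # images to nearby hashes; this toy just truncates a SHA-256 digest.
    return hashlib.sha256(image_bytes).digest()[:HASH_BYTES]

def fake_neuralhash(image_bytes: bytes) -> bytes:
    # What a jailbroken device could swap in: random bits that look
    # exactly like a non-matching hash to the server.
    return os.urandom(HASH_BYTES)

# Hypothetical known-image database (illustrative only).
known_hashes = {toy_neuralhash(b"known-flagged-image")}

photo = b"known-flagged-image"
print(toy_neuralhash(photo) in known_hashes)   # → True: honest model matches
print(fake_neuralhash(photo) in known_hashes)  # → False (with overwhelming probability)
```

Since a random 96-bit hash collides with a database entry only with negligible probability, the server can't distinguish "no flagged photos" from "model replaced" — which is exactly why the scheme depends on device integrity.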
So I'd say the odds of them loosening their grip here are pretty slim.