That's the CSAM part. The iMessage part uses an on-device neural network to detect explicit photos.
What happens when Apple, or a government, mandates an expansion of the CSAM database to cover new categories of material? Apple has already built a neural network that can detect novel explicit material...
Also, what happens when the USG mandates that Top Secret classified material be added to the database? Or when Russia mandates that homosexual pornography be added?
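The slippery-slope concern is technically trivial to realize, because the matching code has no notion of what its hashes represent. Here is a minimal sketch (not Apple's actual NeuralHash pipeline; `perceptual_hash` is a crude stand-in and the sample inputs are invented for illustration):

```python
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    # Hypothetical stand-in for a perceptual hash like NeuralHash.
    # A real perceptual hash survives resizing/re-encoding; SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

# The on-device database is just a set of opaque hashes.
# Nothing here encodes *what* kind of content they identify.
database = {perceptual_hash(b"known-csam-sample")}

def scan(image_bytes: bytes, db: set) -> bool:
    # Flag the image if its hash appears in the database.
    return perceptual_hash(image_bytes) in db

# "Expansion" is a one-line update: a mandate to flag classified
# documents, or any other content class, just ships more hashes.
database.add(perceptual_hash(b"classified-document-sample"))

print(scan(b"classified-document-sample", database))  # True
```

The point of the sketch: the scanning logic is content-agnostic, so the only safeguard against scope creep is policy over which hashes get shipped, not anything in the code itself.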