There has been some outcry over Apple's program to detect child sexual abuse material (CSAM) in photos uploaded to iCloud Photos.
In a statement, Apple says it is dedicated to protecting children throughout its ecosystem, wherever its products are used, and that it supports innovation in this area. The company states that it has developed robust protections at every level of its software platform and, as part of this commitment, uses image matching technology to help find and report child exploitation.
Much like spam filters in email, its systems use electronic signatures to find suspected child exploitation, and each match is validated with individual human review. Accounts containing such material violate Apple's terms of service and will be disabled.
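To illustrate the general idea behind such signature-based matching, here is a minimal sketch in Python. It is not Apple's implementation (the announced iCloud Photos system reportedly uses a proprietary perceptual hash, NeuralHash, combined with on-device cryptographic matching); this sketch assumes the third-party Pillow and imagehash packages, and the hash value, distance threshold, and file name are hypothetical.

```python
# Illustrative sketch of signature-based image matching. NOT Apple's system;
# it only shows the general idea of comparing an image's fingerprint against
# a database of fingerprints of known prohibited images.
# Requires the third-party packages Pillow and imagehash.

from PIL import Image
import imagehash

# Hypothetical database of fingerprints of known prohibited images,
# as would be supplied by a child-safety organization.
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1d1d1d1d1d1d1d1"),  # placeholder value
}

# Tolerance for benign transformations such as resizing or recompression.
MAX_HAMMING_DISTANCE = 5

def matches_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    # ImageHash subtraction yields the Hamming distance between fingerprints.
    return any(candidate - known <= MAX_HAMMING_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    if matches_known_image("upload.jpg"):
        print("Flagged for individual human review")  # never auto-actioned
    else:
        print("No match")
```

Note that a match in such a scheme only flags content for human review, mirroring the validation step Apple describes, rather than triggering automatic action.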
Source: 9to5Mac