Category: Apple | Aug 24, 2021 | Author: Admin

"Apple is the largest platform for child pornography distribution"


There has been considerable outcry over Apple's program to detect child sexual abuse material (CSAM) in photos uploaded to iCloud Photos.

Manual review of your email
Apple has now confirmed to 9to5Mac that it has scanned iCloud Mail for child abuse material since 2019. The company uses "electronic signatures to uncover suspected child exploitation. Every registered suspicion is checked manually."


It aroused 9to5Mac's curiosity that one of Apple's executives could state that "Apple is the largest platform for distributing child pornography": how could he claim that if the company did not scan iCloud Photos?


Several clues
9to5Mac also had a couple of other clues suggesting that Apple must have been running some form of CSAM scanning. An archived version of Apple's child safety page states:


Apple is dedicated to protecting children throughout the ecosystem, no matter where the products are used, and we support innovation in this area.


We have developed robust systems at all software levels throughout the chain. As part of this, Apple uses image matching technology to find and report child abuse.


Like spam filters in e-mail, we use electronic signatures to find suspected exploitation of children. We secure each match with individual review. Accounts with such content violate our Terms of Service and will be disabled.


Screening for illegal images
Another Apple executive told a technology conference that the company uses screening technology to look for illegal images and disables accounts when it finds material showing child exploitation, but did not say how the material is detected.


Apple has confirmed to 9to5Mac that it has been scanning both outgoing and incoming iCloud Mail for CSAM since 2019. iCloud Mail is not end-to-end encrypted, so scanning attachments as they pass through Apple's servers is trivial.
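Apple has not published details of its mail-scanning pipeline, but "electronic signature" matching of the kind described here can be sketched as a hash lookup against a list of known illegal files. The hash set, values, and function names below are illustrative assumptions, not Apple's implementation:

```python
import hashlib

# Illustrative denylist of known-bad file hashes (hex SHA-256 digests).
# The entry below is just the SHA-256 of b"foo", used as a stand-in.
# Production systems typically use perceptual hashes (PhotoDNA-style)
# so that re-encoded or resized images still match; a plain
# cryptographic hash only catches byte-identical copies.
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_content(attachment: bytes) -> bool:
    """Return True if the attachment's hash appears in the denylist."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

In line with the quoted policy, a match would be queued for manual review rather than acted on automatically, since a hash hit alone is only a "registered suspicion".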


“Limited scan of other data”
Apple also stated that it performs a limited scan of other data, but would not say what that data is, beyond suggesting that it happens on a "small scale".

"Risk of abuse by the authorities"
Two Princeton University researchers say they built a prototype scanning system based on the same approach as Apple's, but abandoned the work because of the obvious risk that governments could abuse such systems.

Source: 9to5Mac
