Image credit: Pexels

Apple says any expansion of CSAM detection outside of the US will occur on a per-country basis

August 6, 2021

Via: 9to5Mac

Apple’s new feature for detecting Child Sexual Abuse Material (CSAM) in iCloud Photos will launch first in the United States, as 9to5Mac reported yesterday. Apple confirmed today, however, that any expansion beyond the US will occur on a country-by-country basis, depending on local laws and regulations.

The feature will allow Apple to detect known CSAM images stored in iCloud Photos. It has faced considerable pushback from privacy advocates and security researchers, but Apple maintains that it is necessary to protect children and that it has been designed with privacy in mind.

Read More on 9to5Mac