
Apple sued for failing to implement CSAM detection tools in iCloud


Apple is being sued by victims of child sexual abuse over its failure to follow through with a plan to scan iCloud for child sexual abuse material (CSAM), The New York Times reports. In 2021, Apple announced it was working on a tool that would flag images depicting such abuse and notify the National Center for Missing and Exploited Children. But the company faced immediate backlash over the privacy implications of the technology and ultimately abandoned the plan.

The lawsuit, filed Saturday in Northern California, seeks more than $1.2 billion in damages for a potential group of 2,680 victims, according to the NYT. It claims that, after Apple showed off its planned child safety tools, the company "failed to implement those designs or take any measures to detect and limit" CSAM on its devices, causing harm to the victims because the images continued to circulate. Engadget has contacted Apple for comment.

In a statement to The New York Times, Apple spokesperson Fred Sainz said of the lawsuit, "Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." The lawsuit comes just a few months after Apple was accused of underreporting CSAM by the UK's National Society for the Prevention of Cruelty to Children (NSPCC).


