Apple Update Will Check iPhones For Images Of Child Sexual Abuse
NDTV
"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)," Apple said in an online post.
Apple said Thursday that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to its online storage in the United States, a move privacy advocates say raises concerns.

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)," Apple said in an online post.

New technology will allow software powering Apple mobile devices to match photos on a user's phone against a database of known CSAM images provided by child safety organizations, then flag matching images as they are uploaded to Apple's iCloud storage, according to the company.

However, several digital rights organizations say the changes to Apple's operating systems create a potential "backdoor" into devices that could be exploited by governments or other groups.
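At its core, the described mechanism is database matching: compute a fingerprint of each photo and check it against a list of fingerprints of known images. The sketch below is a deliberately simplified toy, not Apple's actual system, which uses a perceptual hash (NeuralHash) and cryptographic private-set-intersection protocols rather than plain cryptographic digests; the hash set and function names here are hypothetical.

```python
import hashlib

# Hypothetical database of known-image digests. This entry is the
# SHA-256 of the empty byte string, used only so the toy example
# has something to match against.
KNOWN_IMAGE_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def flag_on_upload(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known database entry.

    Note: a real system would use a perceptual hash so that resized or
    re-encoded copies of the same image still match; an exact digest
    like SHA-256 only matches byte-identical files.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```

The key limitation the comment notes is why exact digests are not used in practice: changing a single byte of the file changes the SHA-256 entirely, whereas perceptual hashes tolerate benign transformations.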