Apple to scan iPhones, iPads for images of child sex abuse
Al Jazeera
Apple Inc. said it will launch new software later this year that will analyze photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to determine whether they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material.

The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

If Apple detects a threshold of sexually explicit photos of children in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies. Apple said images are analyzed on a user’s iPhone and iPad in the U.S. before they are uploaded to the cloud.
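The system described above checks photos on the device against fingerprints of known abuse imagery before upload, and escalates an account for human review only once a threshold of matches is reached. The sketch below illustrates that match-and-threshold flow in simplified form; it is not Apple's actual detection code. The hash function, fingerprint database, and threshold value are placeholder assumptions (Apple's real system uses a perceptual "NeuralHash" and a database supplied by NCMEC).

```swift
import Foundation
import CryptoKit

// Placeholder fingerprint: SHA-256 stands in for Apple's perceptual hash
// only to keep this sketch self-contained and runnable.
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of fingerprints of known abuse imagery.
// In the real system this is provided by NCMEC and ships with the OS.
func loadKnownFingerprints() -> Set<String> {
    return []
}

// Assumed threshold: only after this many matches is the account
// flagged for manual review (the exact value is not given in the article).
let reviewThreshold = 30

struct ScanResult {
    var matchCount = 0
    var flaggedForManualReview = false
}

// Check each photo on-device before it is uploaded to iCloud Photos.
func scanBeforeUpload(photos: [Data], knownFingerprints: Set<String>) -> ScanResult {
    var result = ScanResult()
    for photo in photos {
        if knownFingerprints.contains(imageFingerprint(photo)) {
            result.matchCount += 1
        }
    }
    // Below the threshold nothing happens; above it, the flagged material
    // is reviewed by a person before any report is made to NCMEC.
    result.flaggedForManualReview = result.matchCount >= reviewThreshold
    return result
}
```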