Available with iOS 15, iPadOS 15, and macOS Monterey, features like Communication Safety will make your online experience even safer.
In a press release, Apple announced that the operating systems shipping this fall will include new features aimed at protecting children and limiting the spread of child sexual abuse material across its communication platforms.
To that end, the Cupertino giant will improve three areas, working in collaboration with child-safety experts.
- The Messages system app will use on-device machine learning to detect and warn about sensitive material without compromising the privacy of communications; Apple will not have access to the messages.
- Future versions of iOS and iPadOS will use new cryptographic techniques to limit the spread of Child Sexual Abuse Material (CSAM).
- Siri and Search will provide parents and children with more detailed information on the subject and will help them if they encounter unsafe situations.
Communication Safety
The Communication Safety feature of the Messages app on iPhone, iPad, and Mac will warn about sexually explicit photos, whether received or sent. On-device machine learning will analyze image attachments and automatically blur any that contain explicit material.
When the user tries to view a photo flagged by the system in the Messages app, they will see a generic warning that the image may contain nudity and could be disturbing. If the Apple device belongs to a child, parents can enable an option to receive a notification whenever sensitive material is received or sent.
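The flow described above can be sketched as a simple decision function. Everything here is an illustrative assumption: the names, the opt-in flag, and the classifier input are stand-ins, since Apple's on-device model and exact policies are not public.

```python
# Toy sketch of the Communication Safety decision flow in Messages.
# All identifiers are hypothetical; Apple's implementation is not public.

from dataclasses import dataclass

@dataclass
class Account:
    is_child: bool
    parental_alerts_enabled: bool  # opt-in, configured by parents

def handle_incoming_photo(flagged_explicit: bool, account: Account) -> list[str]:
    """Return the list of actions taken for one received photo."""
    actions = []
    if flagged_explicit:
        actions.append("blur_photo")            # darken the image
        actions.append("show_generic_warning")  # "may contain nudity"
        if account.is_child and account.parental_alerts_enabled:
            actions.append("notify_parents")    # only on child accounts, only if enabled
    else:
        actions.append("display_normally")
    return actions

# A flagged photo on a child account with parental alerts enabled:
print(handle_incoming_photo(True, Account(is_child=True, parental_alerts_enabled=True)))
```

Note that in this sketch the classification happens entirely on the device; nothing is sent to Apple, which matches the privacy claim in the announcement.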
Analysis of photos in iCloud
Starting with iOS 15 and iPadOS 15, Apple will be able to detect child sexual abuse material (CSAM) among the photos a user stores in iCloud Photos. This will allow the Californian company to forward reports to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works with US law enforcement agencies.
In summary, Apple explains, the new system performs on-device matching: before a photo is uploaded to iCloud Photos, it is compared against a database of known CSAM image hashes provided by NCMEC and other child-safety organizations.
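In outline, hash-based matching works like the toy sketch below. The digest function is a deliberate simplification: Apple's actual system uses NeuralHash, a perceptual hash robust to resizing and re-encoding, and wraps the comparison in cryptographic protocols (private set intersection and threshold secret sharing) so that individual matches are not revealed to the device or to Apple below a threshold.

```python
# Toy illustration of matching photos against a database of known hashes.
# The digest is a stand-in (an exact cryptographic hash), NOT NeuralHash;
# a real perceptual hash also matches visually similar variants.

import hashlib

def photo_digest(photo_bytes: bytes) -> str:
    """Stand-in digest for a photo's contents."""
    return hashlib.sha256(photo_bytes).hexdigest()

# Database of known-CSAM hashes (illustrative placeholder content).
known_hashes = {photo_digest(b"known-flagged-image")}

def matches_known_database(photo_bytes: bytes) -> bool:
    """True only if the photo's digest appears in the known database."""
    return photo_digest(photo_bytes) in known_hashes

print(matches_known_database(b"known-flagged-image"))  # True
print(matches_known_database(b"family-vacation"))      # False
```

The key design point the sketch preserves is that matching is against *known* images only: the system does not try to classify new photos as explicit, it only recognizes entries already in the database.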