Apple expands iPhone child safety feature to the UK

Long after its initial announcement back in August, and following considerable controversy over a related, as-yet-unreleased component, Apple is expanding to the UK an iPhone feature designed to protect children against sending or receiving sexual content.

Communication Safety in Messages eventually launched as part of the iOS 15.2 point update in December, but until now it has been limited to the US. The Guardian broke the news that Apple plans to bring the feature to the UK, although the timeframe remains unclear.

When enabled on a child’s device, the feature uses an on-device AI tool to scan all photos received over Messages for nudity. If nudity is detected, the image is blurred and the child is shown a warning that it may contain sensitive content, along with links to helpful resources. The tool likewise scans photos the child attempts to send; if nudity is detected, the child is advised not to send the material and encouraged to contact a trusted adult.

“Messages analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple explains. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”
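To make that flow concrete, here is a minimal Swift sketch of the kind of decision logic described above. It is an illustration only: the `NudityClassifier` protocol, the confidence threshold, and the resources URL are all hypothetical stand-ins, since Apple has not published its implementation.

```swift
import Foundation

// Hypothetical on-device classifier; Apple's actual model is private.
// Returns a confidence score between 0 and 1 that an image contains nudity.
protocol NudityClassifier {
    func nudityScore(for imageData: Data) -> Double
}

/// Illustrative outcome for a single Messages attachment.
enum AttachmentPresentation {
    case showNormally
    case blurWithWarning(resources: [URL])
}

/// Sketch of the decision flow: analysis happens entirely on-device, and the
/// result only changes how the image is presented — nothing is sent to Apple,
/// the parent, or anyone else.
func presentation(for imageData: Data,
                  classifier: NudityClassifier,
                  threshold: Double = 0.9) -> AttachmentPresentation {
    let score = classifier.nudityScore(for: imageData)
    guard score >= threshold else { return .showNormally }

    // Blur the image and surface child-safety resources alongside the warning.
    // Placeholder URL for illustration.
    let helpResources = [URL(string: "https://www.apple.com/child-safety/")!]
    return .blurWithWarning(resources: helpResources)
}
```

The key design point, mirrored in the sketch, is that the classification result only affects how the image is displayed on the device; it never triggers a network call or an outside notification.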

Reflecting the pushback Apple experienced from privacy groups, the feature has been watered down from its initial plans. As originally conceived, it included an option for parents to be notified automatically if nudity was detected in images sent or received by children under the age of 13. Apple removed that aspect after concerns were raised about the risk of parental violence or abuse. The feature now lets children message a trusted adult if they choose, separately from the decision to view the image, and parents are not notified unless the child chooses to tell them.

A raft of child-safety features, which also included a controversial AI tool that hashes photos uploaded to iCloud and compares them against a database of known Child Sexual Abuse Material (CSAM), was originally slated to appear as part of last year’s iOS 15 software update. Apple delayed the CSAM component late last year and has yet to implement it.
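For readers curious about that delayed iCloud component, the sketch below illustrates the basic idea of matching uploads against a database of known hashes. It is purely illustrative: a plain SHA-256 digest stands in for Apple’s proprietary perceptual hash (NeuralHash), the threshold and private-set-intersection layers of Apple’s published design are omitted entirely, and the function names and example data are made up.

```swift
import CryptoKit
import Foundation

// Illustrative only: a cryptographic SHA-256 digest stands in for Apple's
// perceptual hash. A real system needs a perceptual hash so that resized or
// lightly edited copies of a known image still match.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Sketch of the matching step: an uploaded image is flagged only if its
/// digest appears in a database of digests of known material. The database
/// here is hypothetical and empty.
func shouldFlag(uploadData: Data, knownDigests: Set<String>) -> Bool {
    knownDigests.contains(digest(of: uploadData))
}

// Example usage with made-up data.
let database: Set<String> = []           // would hold digests of known images
let photo = Data([0x01, 0x02, 0x03])     // stand-in for real image bytes
print(shouldFlag(uploadData: photo, knownDigests: database))   // false
```

The point of the hash-based approach is that the comparison works on fingerprints of known images rather than on an interpretation of the photo’s content, which is a different mechanism from the Messages nudity detection described above.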

What does this mean for me?

US readers are unaffected by this news, as the feature has been active since iOS 15.2. If Communication Safety in Messages is expanding to a second country, we can infer that Apple is pleased with the results and unlikely to backtrack and remove it in the US. The feature only affects images sent and received in Messages and doesn’t scan any photos stored in your child’s Photo Library.

UK readers who have children will soon have the option (it’s disabled by default) to enable the feature on their kids’ handsets and thereby activate on-device scanning for potentially sexual content. But as explained above, the results of these scans will not automatically be shared with parents. If you plan to enable the feature, it would be wise to make it part of a wider discussion about the dos and don’ts of digital sharing.
