Apple CSAM detection: Conversation Safety for Messages is coming to the UK

Apple announced that it will begin scanning photos uploaded to iCloud Photos for potential child sexual abuse material (CSAM). The plan has come under a great deal of scrutiny and generated some outrage, including from Apple employees. Here’s what you need to know about the new technology before it rolls out.

Apple and CSAM Scanning: The latest news

Apr. 21, 2022: The Conversation Safety feature for Messages is coming to the U.K., though a timeline has not been announced.

Dec. 15, 2021: Apple removed references to the CSAM system from its website. Apple says the CSAM feature is still “delayed” and not canceled.

Dec. 14, 2021: Apple released the Conversation Safety feature for Messages in the official iOS 15.2 update. The CSAM feature was not released.

Nov. 10, 2021: The iOS 15.2 beta 2 includes the less-controversial Conversation Safety feature for Messages. It relies on on-device scanning of images, but it doesn’t match images against a known database and must be turned on by a parent account. This is different from CSAM detection. Get the details.

Sept. 3, 2021: Apple announced that it will delay the release of its CSAM features in iOS 15, iPadOS 15, watchOS 8, and macOS 12 Monterey until later this year. The features will be part of an OS update.

Aug. 24, 2021: Apple has confirmed to 9to5Mac that it is already scanning iCloud emails for CSAM using image matching technology.

Aug. 19, 2021: Nearly 100 policy and rights groups published an open letter urging Apple to drop plans to implement the system in iOS 15.

Aug. 18, 2021: After a report that the NeuralHash system underlying Apple’s CSAM tech had been spoofed, Apple said the system “behaves as described.”

Aug. 13, 2021: In an interview with Joanna Stern from The Wall Street Journal, senior vice president of Software Engineering Craig Federighi said Apple’s new technology is “widely misunderstood.” He further explained how the system works, as outlined below. Apple also released a document with more details about the safety features in Messages and the CSAM detection feature.

The basics

What are the technologies Apple is rolling out? 

Apple will be rolling out new anti-CSAM features in three areas: Messages, iCloud Photos, and Siri and Search. Here’s how each of them will be implemented, according to Apple.

Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.

iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

Siri and Search: Siri and Search will provide additional resources to help children and parents stay safe online and get help with unsafe situations.

When will the system arrive?

Apple announced in early September that the system will not be available at the fall release of iOS 15, iPadOS 15, watchOS 8, and macOS 12 Monterey. The features will be available in OS updates later this year.

[Image: The Apple OS family. The new CSAM detection tools will arrive with the new OSes later this year. Image: Apple]

Why is the system releasing now?

In an interview with Joanna Stern of The Wall Street Journal, Craig Federighi said the reason the system is arriving with iOS 15 is that “we figured it out.”

CSAM scanning

Does the scanning tech mean Apple will be able to see my photos?

Not exactly. Here’s how Apple explains the technology: Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices. As Apple explains, the system looks strictly for “specific, known” images. An Apple employee will only see photos flagged as matching known CSAM hashes, and even then only after a match threshold is met.
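
For those who want a more concrete picture, here is a rough sketch of that on-device matching step in Swift. Apple has not published NeuralHash or its database format, so the hash function, type names, and lookup below are illustrative assumptions, not Apple’s actual code:

```swift
import Foundation

// A minimal sketch of on-device matching, assuming a perceptual hash function
// (Apple's real NeuralHash is not public) and a local copy of the blinded hash
// database shipped with the OS. All names here are illustrative.

typealias PerceptualHash = String

// Hypothetical stand-in for NeuralHash; the real algorithm is proprietary and
// is designed to be robust to resizing, cropping, and re-encoding.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    String(imageData.hashValue, radix: 16) // placeholder only
}

struct CSAMMatcher {
    // Hashes derived from NCMEC's database, stored in unreadable (blinded) form on device.
    let knownHashes: Set<PerceptualHash>

    /// Returns true if the image's hash matches a known entry.
    /// Only the match result travels to iCloud, attached to the upload.
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(perceptualHash(of: imageData))
    }
}
```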

But Apple is scanning photos on my device, right?

Yes and no. The system is multi-faceted. Apple says it does not work for users who have iCloud Photos disabled, though it’s not totally clear whether scanning is performed only on images uploaded to iCloud Photos, or whether all images are scanned and the results (a hash match or not) are only sent along with a photo when it’s uploaded to iCloud Photos. Federighi said “the cloud does the other half of the algorithm,” so while photos are scanned on the device, the system requires iCloud to fully work. He emphatically stated that it is “literally part of the pipeline for storing images in iCloud.”

What happens if the system detects CSAM images?

Since the system only works with CSAM image hashes provided by NCMEC, it will only report photos in iCloud Photos that are known CSAM. If it detects matches above a certain threshold (Federighi said it’s “something on the order of 30”), Apple will conduct a human review before deciding whether to make a report to NCMEC. Apple says there is no automated reporting to law enforcement, though it will report confirmed instances to the appropriate authorities.
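
As a rough illustration of that threshold logic (the exact mechanics are Apple’s to define; the threshold value and names below are merely stand-ins), it works something like this:

```swift
import Foundation

// A minimal sketch of the threshold gate Federighi describes: individual matches
// are not actionable on their own; only once an account crosses roughly 30 matches
// is the case surfaced for human review. Names and values are illustrative.

struct AccountMatchState {
    let reviewThreshold = 30      // "something on the order of 30"
    private(set) var matchCount = 0

    mutating func recordMatch() {
        matchCount += 1
    }

    /// Human review happens only after the threshold is crossed;
    /// there is no automated report to law enforcement.
    var needsHumanReview: Bool {
        matchCount >= reviewThreshold
    }
}

var state = AccountMatchState()
(1...30).forEach { _ in state.recordMatch() }
print(state.needsHumanReview)   // true only once the threshold is reached
```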

Could the system mistake an actual photo of my child as CSAM?

It’s extremely unlikely. Since the system only scans for known images, Apple says the likelihood that it would incorrectly flag any given account is less than one in one trillion per year. Even if that were to happen, a human review would catch it before it escalated to the authorities. Additionally, there is an appeals process in place for anyone who believes their account was flagged and disabled in error.
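
To see why a match threshold makes an account-level false positive so improbable, here is a back-of-the-envelope calculation. The per-image false-match rate and library size below are purely assumed numbers for illustration, not figures Apple has published:

```swift
import Foundation

// Illustrative (not Apple's published analysis): with a tiny per-image
// false-match rate, requiring ~30 independent matches before review drives
// the chance of flagging an innocent account toward zero.

let perImageFalseMatchRate = 1e-6   // assumed collision probability for a benign photo
let photosPerYear = 10_000.0        // assumed uploads per account per year
let threshold = 30                  // matches required before human review

// Log of the binomial probability of exactly k false matches out of n photos,
// computed in log space to avoid floating-point underflow.
func logBinomialPMF(k: Int, n: Double, p: Double) -> Double {
    let logChoose = lgamma(n + 1) - lgamma(Double(k) + 1) - lgamma(n - Double(k) + 1)
    return logChoose + Double(k) * log(p) + (n - Double(k)) * log(1 - p)
}

// P(at least `threshold` false matches), approximated by the first terms of the tail.
let pAccountFlagged = (threshold...(threshold + 50))
    .map { exp(logBinomialPMF(k: $0, n: photosPerYear, p: perImageFalseMatchRate)) }
    .reduce(0, +)

print(pAccountFlagged)   // astronomically small under these assumptions
```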

A report in August, however, seemingly proved that the system is fallible. GitHub user AsuharietYgva reportedly outlined details of the NeuralHash system Apple uses, while user dxoigmn claimed to have tricked the system with two different images that produce the same hash. In response, Apple defended the system, telling Motherboard that the version examined “is a generic version and not the final version that will be used for iCloud Photos CSAM detection.” In a document analyzing the security threat, Apple said, “The NeuralHash algorithm [… is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described.”

Can I opt out of the iCloud Photos CSAM scanning?

No, but you can disable iCloud Photos to prevent the feature from working. It is unclear if doing so would fully turn off Apple’s on-device scanning of photos, but the results of those scans (matching a hash or not) are only received by Apple when the image is uploaded to iCloud Photos.

[Image: Safety warnings for images in Messages. Image: Apple]

Messages

Is Apple scanning all of my photos in Messages too?

Not exactly. Apple’s safety measures in Messages are designed to protect children and are only available for child accounts set up as families in iCloud.

So how does it work?

Communication safety in Messages is a completely different technology from CSAM scanning for iCloud Photos. Rather than comparing image hashes against known images of child sexual abuse, it uses an on-device machine learning algorithm to analyze images sent or received in Messages for sexually explicit content. Images are not shared with Apple or any other agency, including NCMEC. It is a system parents can enable on child accounts to warn them (and the child) before sexually explicit material is sent or received.

Can parents opt out?

Parents need to specifically enable the new Messages image scanning feature on the accounts they have set up for their children, and it can be turned off at any time.

Will iMessages still be end-to-end encrypted?

Yes. Apple says communication safety in Messages doesn’t change the privacy features built into Messages, and Apple never gains access to communications. Furthermore, none of the communications, image evaluations, interventions, or notifications are available to Apple.

What happens if a sexually explicit image is discovered?

When the parent has this feature enabled for their child’s account and the child sends or receives a sexually explicit image, the photo will be blurred and the child will be warned, presented with resources, and reassured that it is okay if they do not want to view or send the photo. For accounts of children age 12 and under, parents can also set up notifications that are sent if the child confirms and goes on to view or send an image that has been determined to be sexually explicit.
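
Put together, the flow might look roughly like the sketch below. The classifier, account fields, and action type are hypothetical stand-ins; Apple has not published this code or API:

```swift
import Foundation

// A minimal sketch of the communication safety flow described above, assuming a
// hypothetical on-device classifier. Nothing leaves the device; all names are illustrative.

struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool   // parents must opt the account in
    let parentalNotificationsEnabled: Bool // only offered for children 12 and under
}

// Hypothetical stand-in for the on-device ML classifier.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    false // placeholder; a real classifier would run a local model here
}

enum MessageImageAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

func handleIncomingImage(_ imageData: Data, for account: ChildAccount) -> MessageImageAction {
    guard account.communicationSafetyEnabled, isSexuallyExplicit(imageData) else {
        return .showNormally
    }
    // The image is blurred and the child is warned; parents can be notified only if
    // the child is 12 or under, notifications are enabled, and the child proceeds anyway.
    let canNotify = account.age <= 12 && account.parentalNotificationsEnabled
    return .blurAndWarn(notifyParentsIfViewed: canNotify)
}
```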

Siri and Search

What’s new in Siri and Search?

Apple is expanding guidance in Siri and Search to help people find resources for reporting CSAM and to give children and parents additional resources for staying safe online and getting help with unsafe situations. Apple is also updating Siri and Search to intervene when users search for CSAM-related queries; the interventions will explain that interest in this topic is harmful and problematic, and will point to resources from partners for getting help with the issue.

The controversy

So why are people upset?

While most people agree that Apple’s system is appropriately limited in scope, experts, watchdogs, and privacy advocates are concerned about the potential for abuse. For example, Edward Snowden, who exposed global surveillance programs by the NSA and is living in exile, tweeted “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.” Additionally, the Electronic Frontier Foundation criticized the system and Matthew Green, a cryptography professor at Johns Hopkins, explained the potential for misuse with the system Apple is using.

A number of people pointed out that these scanning technologies are effectively (somewhat limited) mass surveillance tools. Not dissimilar to the tools that repressive regimes have deployed — just turned to different purposes.

— Matthew Green (@matthew_d_green) August 5, 2021

People are also concerned that Apple is sacrificing the privacy built into the iPhone by using the device itself to scan for CSAM images. While many other services scan for CSAM, Apple’s system is unique in that it matches images on the device rather than scanning them after they have been uploaded to the cloud.

Can the CSAM system be used to scan for other image types?

Not at the moment. Apple says the system is designed only to scan for CSAM images. However, Apple could theoretically augment the hash list to look for known images related to other things, such as LGBTQ+ content, though the company has repeatedly said the system is designed only for CSAM.

Is Apple scanning any other apps or services?

Apple recently confirmed that it has been scanning iCloud emails using image-matching technology on its servers. However, it insists that iCloud backup and photos are not part of this system. Of note, iCloud emails are not encrypted on Apple’s servers, so scanning images is an easier process.

What if a government forces Apple to scan for other images?

Apple says it will refuse such demands. 

[Image: How Apple’s CSAM hash-matching system works. Image: Apple]

Do other companies scan for CSAM images?

Yes. Most cloud services, including Dropbox, Google, Microsoft, and Facebook, have systems in place to detect CSAM images. These all operate by decrypting your images in the cloud in order to scan them.

Can the Messages technology make a mistake?

Federighi says the system “is very hard” to fool. However, while he said Apple “had a tough time” coming up with images to fool the Messages system, he admitted that it’s not foolproof.

Could Apple be blocked from implementing its CSAM detection system?

It’s hard to say, but it’s likely that there will be legal battles both before and after the new technologies are implemented. On August 19, more than 90 policy and rights groups published an open letter urging Apple to abandon the system: “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter said.
