August 30, 2021

Privacy Comparison: Apple’s CSAM detection system vs. “Hey, Siri”

What are your thoughts on the differences in privacy concerns between Apple’s recently announced Child Sexual Abuse Material (CSAM) detection system and voice-activated technology such as “Hey, Siri”, “Hey, Google”, and “Alexa…”?

**CSAM Detection**
The CSAM detection system does not scan the contents of iCloud-stored photos; rather, it compares a hash of each photo against a database of hashes of known CSAM (yes, it’s a neural hash and not a cryptographic hash, but it’s **not** the same as scanning the contents of a photo).
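To make the distinction concrete, here is a toy sketch of hash-based matching. This is **not** Apple’s NeuralHash — it uses a simple “average hash” over a hypothetical 8×8 grayscale thumbnail — but it illustrates the key property: only a short fingerprint is compared against a blocklist, never the image contents themselves.

```python
# Toy illustration (NOT Apple's NeuralHash): an "average hash"
# perceptual hash over an 8x8 grayscale thumbnail, compared against
# a blocklist of known hashes. The matcher only ever sees the hash,
# never the pixels.

def average_hash(pixels):
    """pixels: flat list of 64 grayscale values (a hypothetical 8x8 thumbnail)."""
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: 1 if brighter than the mean, else 0.
    return sum((1 << i) for i, p in enumerate(pixels) if p > mean)

def matches_blocklist(pixels, blocklist, max_hamming=0):
    """Perceptual hashes tolerate small edits, so compare by Hamming distance."""
    h = average_hash(pixels)
    return any(bin(h ^ known).count("1") <= max_hamming for known in blocklist)

# Example: a gradient "image" and its mirror
img = list(range(64))
blocklist = {average_hash(img)}
print(matches_blocklist(img, blocklist))                 # True: hash is on the list
print(matches_blocklist(list(reversed(img)), blocklist)) # False: different image
```

Note the design choice a perceptual hash makes: unlike a cryptographic hash, visually similar images produce similar bit patterns, which is why matching is done by Hamming distance rather than exact equality.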

**Voice Activated Assistants**
Voice-activated assistants wait for a specific wake word before they start actively listening for commands. While users *can* opt into sending voice recordings to the company for various reasons (usually under the assumption that it helps improve the service), the company isn’t listening to or storing your conversations outside of those voice commands.
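The wake-word model described above can be sketched as a simple filter: audio is checked on-device, and only the utterance following a detected wake word is forwarded. This is a hypothetical simplification (real assistants use on-device acoustic models, not transcribed text), and the wake phrase here is made up:

```python
# Minimal sketch (hypothetical; not any vendor's actual pipeline):
# the device checks audio locally for a wake word and forwards only
# what follows it; everything else is dropped on-device.

WAKE_WORD = "hey assistant"  # hypothetical wake phrase

def process_audio_stream(transcribed_chunks):
    """Yield only the speech that follows a detected wake word."""
    armed = False
    for chunk in transcribed_chunks:
        if armed:
            yield chunk          # this is the only audio the service sees
            armed = False
        elif WAKE_WORD in chunk.lower():
            armed = True         # wake word heard; capture the next utterance
        # all other chunks are discarded locally, never uploaded

stream = ["idle chatter", "Hey assistant", "play a song", "more chatter"]
print(list(process_audio_stream(stream)))  # ['play a song']
```

The point of the sketch is the privacy boundary: detection happens locally, so the surrounding conversation (“idle chatter”, “more chatter”) never leaves the device.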

Is the CSAM detection system simply the new “privacy fad topic” of the year? I can’t remember the last time someone told me they were concerned about their privacy while having Alexa play them a song…

Comments

John_Ruth

Since I don’t have Siri active or an Alexa, I don’t care what the intent is. It’s still a problem.

Voice control is the same reason I haven’t accepted the TOS for my TV I got last December.
