Hacker News

That's the CSAM part. The iMessage part uses an on-device neural network classifier to detect explicit photos.

What happens when Apple, or the government, mandates expanding the CSAM database to cover new categories of material? Apple has already built a neural network that can detect previously unseen explicit material...
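The distinction the two paragraphs above draw can be made concrete. A minimal sketch, with a toy 64-bit hash and made-up names (this is illustrative only, not Apple's actual NeuralHash or classifier): database matching can only flag images someone has already enrolled, while a classifier scores images it has never seen, which is exactly what makes expanding the categories a policy decision rather than a technical limit.

```python
# Illustrative sketch only. The hashes, threshold, and function names
# are hypothetical; they are not Apple's NeuralHash or its model.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

# The "CSAM part": match against a fixed database of hashes of known
# images. It can only ever flag content already in the database.
KNOWN_HASHES = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}

def matches_database(image_hash: int, threshold: int = 4) -> bool:
    return any(hamming(image_hash, h) <= threshold for h in KNOWN_HASHES)

# The "iMessage part": a classifier produces a score for *any* image,
# including new material no one has seen before. Stand-in cutoff logic.
def classifier_flags(score: float, cutoff: float = 0.9) -> bool:
    return score >= cutoff
```

A near-duplicate of an enrolled image (a couple of flipped bits) still matches the database, but a brand-new image never will; the classifier has no such restriction, which is the expansion worry.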

Also, what happens when the USG mandates that Top Secret classified material must also be added to the database? Or when Russia mandates that homosexual pornography must be added to the database?



> Also, what happens when the USG mandates that Top Secret classified material must also be added to the database?

That doesn't appear to have happened in the 10+ years this kind of scanning has already been happening at various cloud providers - what specifically about Apple moving it from iCloud Photos down to the phone makes this more likely now?



