Apple has announced plans to scan iPhones for images of child abuse, a move that immediately raised concerns about user privacy and surveillance.
Has Apple’s iPhone become an iSpy?
Apple says its system is automated and does not scan the images themselves; instead, it uses a form of hash matching to identify known instances of child sexual abuse material (CSAM), and it says fail-safes are in place to protect privacy.
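To illustrate the general idea of hash matching, here is a minimal Python sketch. This is not Apple's actual implementation: Apple's system reportedly relies on a perceptual "NeuralHash" and on-device matching against hashes supplied by child-safety organizations, details it has not fully published. The sketch below assumes a simple cryptographic hash compared against a hypothetical database of known hashes.

```python
import hashlib

# Hypothetical database of hashes of known CSAM images.
# In practice such a database is maintained by child-safety
# organizations and distributed as hashes only, never as images.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image contents."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash, as reportedly used by Apple, is designed to also match visually similar versions of the same image.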