Apple is reportedly developing a tool that would scan iPhone photos for child sexual abuse material (CSAM) by comparing image hashes against a database of known material. The matching is said to run on the user’s device rather than in the cloud, which is reportedly framed as better for security and privacy.
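The core idea of hash-based matching can be sketched in a few lines. This is a deliberately simplified illustration, not Apple's system: it uses a cryptographic hash (SHA-256), which only matches byte-identical files, whereas the system described in reports reportedly relies on perceptual hashing that tolerates resizing and re-encoding. The blocklist values and function name here are hypothetical.

```python
import hashlib

# Hypothetical set of digests of known-flagged images (illustrative values only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash the image and check it against the known-hash set.

    Note: SHA-256 matches only byte-identical files; a perceptual hash
    (as reportedly used in practice) is designed to survive common
    transformations like compression or scaling.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_hash(b"example-flagged-image-bytes"))  # True
print(matches_known_hash(b"some-other-photo-bytes"))       # False
```

Because the lookup is a simple set membership test, it can run entirely on-device without uploading the photo itself, which is the privacy argument reports attribute to the design.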
from Gadgets 360 https://ift.tt/2VuUxHV