Apple to begin reporting Child Sexual Abuse Material (CSAM)

Speakeasy Security - A podcast by Tony Anscombe

Apple recently announced it will begin reporting Child Sexual Abuse Material (CSAM) to law enforcement with the latest iOS 15 update. The new system aims to identify images using a process called hashing, which turns images into numbers. On this episode, we discuss how Apple's new system will work and how this bold step in combating child sexual abuse is being received by privacy-sensitive users around the world.

Links:
Apple to combat Child Sexual Abuse Material: https://www.cnbc.com/202...
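
To illustrate the hashing idea mentioned above, here is a minimal Python sketch: an image's data is reduced to a fixed-size number (a hash) that can be compared against a database of hashes of known images. This is an illustration under simplifying assumptions only; Apple's actual system uses a perceptual hash (NeuralHash) with privacy-preserving on-device matching, not the plain cryptographic hash, example digest, or file name shown here.

```python
# Illustrative sketch: hashing turns an image's bytes into a number that
# can be compared against a database of known hashes. This is NOT how
# Apple's system works in detail (it uses a perceptual hash, NeuralHash,
# plus cryptographic matching protections); it only shows the basic idea.
import hashlib

# Hypothetical database of hex digests of known images.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def hash_image(path: str) -> str:
    """Turn an image file into a fixed-size number (a SHA-256 hex digest)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def matches_known_image(path: str) -> bool:
    """Check whether the image's hash appears in the known-hash database."""
    return hash_image(path) in KNOWN_HASHES

if __name__ == "__main__":
    # "photo.jpg" is a hypothetical file name used for demonstration.
    print(matches_known_image("photo.jpg"))
```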