
Apple criticised for system that detects child abuse

  • Update Time : Saturday, August 7, 2021

Apple is facing criticism over a new system that detects child sexual abuse material (CSAM) on US users’ devices.

The technology will search for matches against known CSAM before an image is uploaded to iCloud Photos.
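The matching step described above can be sketched in simplified form. This is a hypothetical illustration, not Apple's actual NeuralHash algorithm: it stands in a plain cryptographic digest for Apple's perceptual hash, and the database and function names are assumptions.

```python
import hashlib

# Placeholder database of digests of known CSAM images
# (illustrative only; Apple's real system uses hashes supplied
# by child-safety organisations, not a local set like this).
KNOWN_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Digest an image's bytes. A stand-in for a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """Check an image against the known-hash database before upload."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

A real perceptual hash would also match slightly altered copies of an image, which a cryptographic digest like SHA-256 cannot do; the sketch only shows the lookup structure.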


But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens, BBC reported.

WhatsApp head Will Cathcart called Apple’s move “very concerning”.

Apple said that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

The system will report a match, which is then manually reviewed by a human. Apple can then take steps to disable the user’s account and report the matter to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques – as Apple only learns about users’ photos if they have a collection of known child sex abuse material in their iCloud account.
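The privacy claim above rests on a threshold: an account is flagged for human review only once it accumulates a collection of matches, not on a single hit. A minimal sketch of that gating logic, with an illustrative threshold value and function names that are assumptions rather than Apple's:

```python
# Hypothetical sketch of threshold-gated flagging. The threshold
# value is illustrative; Apple has not been quoted here on the
# exact number required before review.
MATCH_THRESHOLD = 30

def should_flag_for_review(match_count: int) -> bool:
    """Only escalate to human review once matches cross the threshold."""
    return match_count >= MATCH_THRESHOLD
```

Gating on a collection of matches is what lets Apple claim it learns nothing about accounts with few or no matches.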
