
Apple DELAYS controversial plan to scan iPhones for child abuse images following privacy backlash

Apple has announced it will “take additional time” in the coming months to work on plans for flagging child sexual abuse material (CSAM), amid concerns from activists and rights groups over censorship and privacy issues.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said in a statement on Friday.

The delay follows a controversial announcement that was immediately met with calls to abandon the plans from civil rights groups, including the American Civil Liberties Union (ACLU).

Apple’s technology would scan photos and conversations for CSAM. The company has claimed the system would still protect individual privacy, because it neither identifies the overall content of a picture or conversation nor requires Apple to be in possession of either, though many critics have voiced their doubts.


The system uses a database of reference ‘image hashes’ to recognize specific content to be flagged, though security experts have warned that such technology could be manipulated, or that innocent images could be misidentified.
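The basic hash-lookup idea can be sketched as follows. This is a minimal illustration with made-up hash values, not Apple’s actual system: Apple reportedly uses a perceptual hash (“NeuralHash”) that tolerates resizing and re-encoding, plus cryptographic protocols to keep the database and any matches private, none of which is replicated here.

```python
import hashlib

# Hypothetical reference database of known image hashes
# (illustrative values only, derived from placeholder bytes).
known_hashes = {
    hashlib.sha256(b"example-flagged-image").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the reference database.

    A cryptographic hash like SHA-256 only matches byte-identical files,
    which is why real systems use perceptual hashing instead; this sketch
    shows only the database-lookup step.
    """
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"example-flagged-image"))  # True
print(is_flagged(b"some-other-photo"))       # False
```

The fragility of exact hashing is also why critics warn about manipulation: perceptual hashes that tolerate small changes can, conversely, be attacked with crafted collisions.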

Even Apple employees have reportedly expressed concerns with the detection technology, worrying that it could be used to work around encryption protections, that it could easily misidentify and flag some photos – or even that some governments could exploit it to find other material. Apple maintains that it will refuse any requests from governments to use the system for anything other than child abuse images.

“iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent,” read a letter from a coalition of more than 90 activist groups to Apple CEO Tim Cook on the potential changes. 

The exact timeline for the current delay is unknown, but the new detection system was originally intended to be in use sometime this year.
