Apple will delay its iPhone CSAM scanning plans


Over the past few weeks, Apple has come under fire for its plan to roll out a scanning feature in iOS 15 that would search users’ iPhones for child sexual abuse material, or CSAM. Apple’s plan involved using a technology called NeuralHash to seek out CSAM on every iPhone in the United States.


On the surface, this seems like a great idea, but some privacy experts and advocates feel it opens a backdoor for Apple or others to abuse the NeuralHash system. While scanning for CSAM may seem bold and commendable, the same infrastructure could leave users open to being scanned for pretty much anything.

Apple said on Friday that it is pausing the testing of the CSAM tool to gather more feedback and try to improve the system.

“Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

MSN

While Apple is pausing the rollout and testing, it still insists that consumer privacy is of the utmost importance. The company says the CSAM tool would only turn photos on iPhones and iPads into unreadable hashes, essentially long strings of numbers. Apple would then compare those hashes against the National Center for Missing and Exploited Children’s database of known CSAM hashes, and only photos whose hashes matched would be reported.
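As a rough illustration of the matching idea Apple describes, and only that, here is a minimal Python sketch. It uses an ordinary cryptographic digest and made-up hash values as stand-ins; Apple’s actual system relies on its proprietary NeuralHash perceptual hash, private set intersection, and a threshold before any human review, none of which are reproduced here. All names and values below are hypothetical.

```python
# Illustrative sketch only: not Apple's NeuralHash implementation.
# A plain SHA-256 digest stands in for the on-device perceptual hash,
# and KNOWN_HASHES stands in for the NCMEC-supplied database.
import hashlib
from pathlib import Path

# Hypothetical database of known-image fingerprints.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_path: Path) -> str:
    """Turn a photo into an unreadable fixed-length hash."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def photos_to_report(photo_paths):
    """Return only the photos whose hashes match the known database;
    everything else is never flagged."""
    return [p for p in photo_paths if fingerprint(p) in KNOWN_HASHES]

if __name__ == "__main__":
    library = list(Path("~/Pictures").expanduser().glob("*.jpg"))
    print(f"{len(photos_to_report(library))} of {len(library)} photos matched")
```

The point of the design, as Apple frames it, is that the comparison happens on hashes rather than on viewable images, and only matches against the known database are ever surfaced.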

Still, privacy advocates like the EFF and Edward Snowden think Apple should abandon its surveillance plans altogether.

The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship. These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a new mass surveillance system to spy on citizens.

EFF

It will be interesting to see how this all plays out. For now, the CSAM scanning is on hold.

What do you think of this story? Please share your thoughts on any of the social media pages listed below. You can also comment on our MeWe page by joining the MeWe social network.
