Apple’s plans to scan for child sexual abuse images draw criticism from security experts

Over the summer, Apple announced plans to automatically scan iPhone photos for child sexual abuse material (CSAM), a move that immediately drew criticism from some privacy experts.

Today, a group of leading computing and security experts published a paper documenting their concerns about the proposed software, according to ComputerWeekly.com. The non-peer-reviewed paper was released through Columbia University and arXiv, an open-access research repository.

Many internet companies scan for CSAM, including social media giants like Facebook, Vox reports. But while those scans take place on remote servers, Apple’s plan would scan photos destined for iCloud accounts on personal devices like iPhones. In addition, Apple plans to let parents opt in to scanning of photos sent via Messages by users under the age of 12, Vox reports.

Scanning photos on the phone itself, as opposed to on centralized servers, is called “client-side scanning,” or CSS – and that is what the authors of the new paper are worried about. The problem, they say, is not scanning for CSAM itself, but the potential future uses of CSS technology.
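In rough terms, both approaches boil down to comparing a fingerprint of an image against a list of fingerprints of known illegal images; what differs is only where that comparison runs. The following is a minimal illustrative sketch, not Apple’s actual system: all names here are hypothetical, and real systems use perceptual hashes (such as Apple’s NeuralHash or Microsoft’s PhotoDNA) that tolerate resizing and re-encoding, not an exact cryptographic hash.

```python
import hashlib

# Hypothetical blocklist of digests of known flagged images. In a real
# deployment this list is derived from databases maintained by child-safety
# organizations; the entry below is a made-up placeholder.
BLOCKLIST = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears on the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

# Server-side scanning runs this check after the photo is uploaded;
# client-side scanning (CSS) runs the same check on the device itself,
# before anything leaves the phone.
print(is_flagged(b"known-flagged-image-bytes"))  # True
print(is_flagged(b"an-ordinary-family-photo"))   # False
```

The privacy debate hinges on that last comment: once the matching code lives on the device, the blocklist it checks against could in principle be expanded to target other content.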

Client-side scanning (CSS) could be extremely problematic for privacy and security

“Even if it was initially deployed to search for child sexual abuse imagery, which is clearly illegal content, there would be tremendous pressure to expand its reach,” the authors write. “We would then be hard-pressed to find a way to resist its expansion or to control abuse of the system.”

The authors raise concerns that the technology could be turned on everything from political activism to organized crime, terrorism, and LGBTQ+ people, depending on whom a given government targets. If this technology opened a door into people’s phones, it is easy to imagine the pressure to widen that door to search for images associated with unpopular political views, or for anti-government messages under authoritarian regimes.

The initial backlash prompted the company to rework, but maintain, the iPhone CSS proposal, Ars Technica reports. Apple has also published a set of answers to frequently asked questions on the subject. In response to a question about whether governments could force Apple to search for things other than CSAM, the company said it would “refuse any such demands” and that “this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Apple has stood firm on privacy concerns before, as with the FBI’s request to unlock the San Bernardino shooter’s iPhone in 2016. Although the FBI ultimately gained access to the phone by other means, Apple maintained its refusal to help, according to The Verge.

But the concerns are not limited to state actors. The paper also cites worries about individuals such as corrupt police officers or even family members. “Consider, for example, a woman who is considering escaping from an abusive or controlling partner who mistreats her and her children,” the authors write. “The perpetrator often installs stalkerware on family phones and may use ‘smart home’ infrastructure such as video doorbells to monitor them.”

Apple has responded to questions such as whether the feature will “prevent children living in abusive homes from seeking help” by stating on its website that “the communication safety feature applies only to sexually explicit photos shared or received in Messages. Other communications that victims can use to seek help, including text in Messages, are unaffected,” and noting that it is adding guidance in Siri and Search to provide more support for victims of abuse.

The response to the proposal has not been uniformly negative – The Verge notes that some experts welcome the prospect of intercepting CSAM. And for children growing up in an internet-centric society, online protection is extremely important.

Keep your kids offline as much as possible, and educate them when they are online

One of the most important things you can do, besides keeping your kids off the internet and smartphones for as long as possible, is to talk to them about internet safety and data privacy, including making them aware of the realities of surveillance and data collection, and of the bad actors in today’s online world.

As the world moves further online, it will be difficult to find workable solutions that protect people, especially children, and keep our data out of the hands of everyone from criminals to authoritarian governments to private companies.

The authors of the new paper note some of the broader implications of mass surveillance. “The introduction of scanning on our personal devices – devices that hold everything from to-do notes to texts and photos of loved ones – tears at the heart of citizens’ privacy,” they write. “Such mass surveillance can have a significant chilling effect on freedom of speech and, indeed, on democracy itself.”

