
Privacy rights and child protection - the case of Apple

Updated: Jul 21, 2022

BLOG POST

Update: Since publishing this article, on September 3rd 2021 Apple announced that they would be pausing the implementation of the planned child safety features discussed below. We are deeply disappointed in this announcement, especially as it comes without a clear indication of its extent and details. We at Protect Children will continue to fight to ensure the best interests of children in all environments. Privacy rights and the protection of children from abuse and exploitation must and can co-exist in balance. The arguments regarding the balance between privacy rights and child protection efforts presented in the article remain extremely relevant.


Read LL.M candidate Matilda Sandvik’s article (published prior to Apple’s announcement to pause the measures) below.



Apple’s new child safety measures, announced in August 2021 and currently available only in the United States, include three significant measures for global child protection: client-side CSAM detection, communications safety in iMessage, and expanded guidance that directs users searching for CSAM-related topics towards sources of help.


Apple has taken significant and highly technologically sophisticated measures to offer the highest possible level of privacy while still detecting CSAM and preventing child sexual abuse, with a reported error rate in the CSAM detection technology of one in one trillion. Nevertheless, the new features have been met with reactions ranging from concern to anger, largely from privacy rights advocates. Conspiracy theories range from Apple handing the technology over to authoritarian states, where it will wreak havoc, to Apple using child safety as a cover while its real intent is to surveil its users.


Now, regarding the client-side CSAM detection (which seems to be the feature under the heaviest fire), it is noteworthy that perceptual hashing is not something new. What Apple is proposing is not unprecedented: content on various commonly used platforms has been analyzed in similar ways for more than a decade, yet such widespread concerns about malicious use of these technologies have not been raised before. And to clear up a common misconception brought up in the media: this function does not allow Apple to manually analyze content to recognize nudity and then check whether the image contains child sexual abuse. Images are only matched against already known hashes supplied by third parties, and Apple can only access the material once a certain threshold of matches is exceeded.
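To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python of the matching-and-threshold logic described above. It is not Apple's NeuralHash or its cryptographic protocol; the hash function, the threshold value, and all names are hypothetical stand-ins.

```python
# Illustrative sketch only: a toy model of perceptual-hash matching with a
# reporting threshold. Apple's published design enforces the threshold
# cryptographically; here it is shown as plain logic for clarity.
from typing import List, Set

REPORT_THRESHOLD = 30  # hypothetical number of matches before any review


def average_hash(pixels: List[List[int]]) -> str:
    """Toy perceptual hash: each pixel becomes 1 if it is brighter than the
    image's mean brightness, 0 otherwise. Visually similar images therefore
    produce similar bit strings (real systems use far more robust hashes)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)


def count_matches(image_hashes: List[str], known_hashes: Set[str]) -> int:
    """Count how many device-side hashes match the set of already known
    hashes supplied by third parties."""
    return sum(1 for h in image_hashes if h in known_hashes)


def threshold_exceeded(image_hashes: List[str], known_hashes: Set[str]) -> bool:
    """Nothing becomes reviewable until the match count passes the threshold;
    individual matches reveal nothing on their own."""
    return count_matches(image_hashes, known_hashes) >= REPORT_THRESHOLD
```

The point of the threshold is the same as in the paragraph above: no single image, and no content at all below the threshold, is ever exposed for review.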


As those of us who work in the field of child protection know, and as becomes painfully clear in our daily work, the spread of CSAM is a very real and widespread problem globally. We also know that an increasing number of images originate from children’s own devices, which makes the communications safety feature extremely relevant. As technology has continued to develop, so has the spread of illegal material. Technology, while the catalyst for the problem, can and must be a principal component in the battle against it. Without the involvement of the tech sector, the battle will be lost. Apple’s new measures set a crucial precedent for other companies to join the battle and better protect children online.


Privacy rights advocates ought to remember that CSAM detection is not fundamentally opposed to privacy; on the contrary, it is the only way to ensure and advance the privacy rights of the child victims portrayed in the imagery. Additionally, amid all the criticism, I have so far not seen any alternative suggestions proposed. And no, a solution where tech companies are no longer allowed to detect or scan their platforms for CSAM or other harmful material is not an option: this would also go against the very foundations of human rights law. At Protect Children, we promote all rights of children and recognize privacy rights as fundamental. However, due to the indivisibility of human rights, we cannot allow privacy rights, although fundamental, to take precedence over children’s right to protection from abuse.


The discussion regarding CSAM detection and its impact on privacy rights has for too long been severely polarized, and polarized discussions do not serve the best interests of any stakeholder. At the same time, it is important for child rights advocates to be open to listening to privacy concerns, as they are not without foundation. The only way to move forward is by finding a compromise, and companies such as Apple that attempt to create technologies respecting the wishes of both sides should be applauded. Flaws can, and should, be noted without the need to dismiss the entire idea. All stakeholders must accept that a solution which completely satisfies everyone involved does not exist, but that does not mean we should give up on finding the next best option.


Matilda Sandvik, LL.M candidate, Project Employee


matilda.sandvik(at)protectchildren.fi


Matilda is finalizing her LL.M studies in international law, specializing in human rights.
