Apple has just released the latest version of the iPhone’s operating system, iOS 15.2. Among various bug fixes is a new feature aimed at protecting young users: a client-side program (meaning it runs entirely on the user’s device) that automatically detects when nude images are being sent or received. A flagged image is hidden unless the user opts in to view it, and the user can also choose to notify a trusted adult. The feature is meant to deter teens and tweens from sharing nude images, warning them about sensitive media that might “show the private body parts that you cover with a bathing suit.”
The new program works by using AI to identify images sent via iMessage that Apple deems concerning. A flagged image is blurred, and the user is warned that it might be inappropriate; they can then choose whether to view it and whether to notify a trusted adult.
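As a rough illustration of that flow, the Python sketch below mirrors the steps described above: classify the image, blur it if flagged, then hand the decision to the user. The detect_nudity stub, the 0.9 threshold, and the option names are hypothetical placeholders; Apple has not published the internals of its on-device model.

```python
def detect_nudity(image_bytes: bytes) -> float:
    """Return a confidence score in [0, 1] that the image contains nudity.
    Stubbed here; the real classifier runs entirely on the device."""
    return 0.0  # placeholder score

def handle_incoming_image(image_bytes: bytes, threshold: float = 0.9) -> dict:
    score = detect_nudity(image_bytes)
    if score < threshold:
        # Nothing flagged: show the image normally.
        return {"action": "display", "blurred": False}
    # Flagged: blur the image and let the user decide what happens next.
    return {
        "action": "prompt",  # warn that the image may be sensitive
        "blurred": True,     # stays blurred until the user opts in to view
        "options": ["view_anyway", "notify_trusted_adult", "dismiss"],
    }

print(handle_incoming_image(b"<incoming image bytes>"))
```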
This feature is one of a triad that Apple planned to introduce in iOS 15 (released a few months ago) but delayed due to public backlash. The other two features, which have yet to be implemented, are also aimed at preventing the distribution of child pornography. The first, and largely uncontroversial, feature offers resources when a user searches for topics related to child abuse or sexual exploitation. The second, and most controversial, feature aims to detect child pornography in users’ iCloud photo libraries. To accomplish this, Apple would distribute hashes (digital fingerprints) of thousands of known child-pornography images and videos (according to The Verge) to every user’s device and compare them against the photos in the user’s iCloud library; a match between those fingerprints and a user’s photos would indicate that the user possessed child pornography. Repeated matches would alert Apple engineers, who would then alert law enforcement.
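To illustrate the mechanism, here is a minimal Python sketch of hash-based matching. It uses SHA-256 for simplicity; Apple’s actual proposal relied on a perceptual hash called NeuralHash, which is designed so that visually similar copies of an image (resized, recompressed) still produce matching fingerprints, along with cryptographic safeguards that keep matches hidden until a threshold is crossed. The function names and threshold value here are illustrative assumptions, not Apple’s real system.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Digital fingerprint of an image. SHA-256 matches only byte-identical
    files; a perceptual hash like NeuralHash would also match near-copies."""
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints of known abusive images, supplied by child-safety groups.
# Only hashes are ever distributed -- never the images themselves.
known_bad_hashes = {fingerprint(b"<bytes of a known flagged image>")}

def scan_library(user_images: list[bytes], alert_threshold: int = 3) -> bool:
    """Compare each photo's fingerprint against the known list. Under
    Apple's proposal, only repeated matches would trigger human review."""
    matches = sum(fingerprint(img) in known_bad_hashes for img in user_images)
    return matches >= alert_threshold

# One match is below the threshold, so nothing is flagged.
print(scan_library([b"vacation photo", b"<bytes of a known flagged image>"]))
```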
Privacy advocates argue that the third feature would seriously compromise users’ digital security, since it gives Apple a way to examine users’ iCloud libraries without their consent. They worry that Apple could later be compelled to use this backdoor to search users’ accounts for other kinds of incriminating evidence, a capability that could be exploited by foreign governments, particularly China, to unjustly prosecute their citizens. Supporters of the feature counter that Apple has the power to disrupt the sharing of child pornography in a corner of the web that is otherwise difficult to police, and that it would be unethical for Apple not to use its tools for good.
After originally planning to release all three features in the fall, Apple, in a rare move, admitted that their potential impacts needed to be reexamined. As of now, the Messages image-detection feature is the only part of the original plan to come to fruition; the other two remain on hold indefinitely.