Apple drops all mention of its CSAM detection feature
Apple removes links to its child pornography detection tool. Calming things down before a new version?
A few months ago, Apple surprised many by announcing its intention to introduce a child pornography detection feature (CSAM detection). While the idea behind the feature is noble, and its implementation could help detect and track illegal content and activity effectively, the fact that the Cupertino company would scan all users' photos in the process proved hard for many to accept.
Apple removes links to its child pornography detection tool
Unsurprisingly, the controversy was not long in coming. Apple did everything it could to defend its position and its project, but little by little it had to back down. Today, all links to this feature have been removed from its website, as noted by Florian Szymanke.
Links to CSAM scanning were still present on the Cupertino company's website last Friday, according to the report. Now they are gone. This does not mean, however, that Apple has completely abandoned the feature. As 9to5Mac points out, it is possible that Apple simply removed these links while it rethinks the feature in order to win over the general public.
Calming things down before a new version?
As mentioned above, detecting child pornography is a good thing. The concerns relate to the consequences and possible misuses of this technology. Apple could use this algorithm to collect new data about its users, and some even fear that the feature could be abused by governments seeking to infiltrate devices in search of information on, for example, minorities, activists, or political opponents.
For now, no one knows whether this feature will ever return, or in what form.