Apple wants to use user data to test the effectiveness of its parental control feature, but only after consent. The nude photo filter is not yet available in Germany.
Apple plans to collect usage data on the iMessage nude photo filter. With user consent, the company wants to gather data on the use of the feature, called "communication safety", ultimately in order to improve it. This should also help gauge the effectiveness of the nude photo filter, according to a description text that first appeared in the third beta of iOS 15.6.
Data collection planned for iOS 15.6
Apple wants to collect data that reveals, among other things, what proportion of all photos sent and received are "unique" photos flagged by machine learning. The company is also interested in how often users make use of other options of the parental control feature, including the button that points to help resources.
These are only examples of the data to be collected in aggregate form, Apple emphasizes in text found in the code of the next iOS version, which a developer disclosed. The collected information is also not to be linked to the Apple ID or other user data. The feature is designed to protect personal information and let users control what they share, the text continues.
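Aggregate, identity-free collection of the kind described above could look roughly like this. This is a minimal Python sketch under the assumption that only counters and ratios, never identifiers, leave the device; all names here are hypothetical, not Apple's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class UsageCounters:
    """Hypothetical on-device counters; note there is no Apple ID field."""
    photos_total: int = 0      # photos sent and received via Messages
    photos_flagged: int = 0    # unique photos the ML classifier flagged
    help_button_taps: int = 0  # taps on the help-resources button

    def record_photo(self, flagged: bool) -> None:
        self.photos_total += 1
        if flagged:
            self.photos_flagged += 1

    def aggregate(self) -> dict:
        """Only aggregate numbers are reported, no per-photo data."""
        ratio = (self.photos_flagged / self.photos_total
                 if self.photos_total else 0.0)
        return {
            "flagged_ratio": round(ratio, 3),
            "help_button_taps": self.help_button_taps,
        }
```

For example, recording four photos of which one was flagged would yield a `flagged_ratio` of 0.25, with no link back to any individual photo or user.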
iMessage nude photo filter not yet in Germany
Since iOS 15.2, parents in a first set of countries have been able to activate the nude photo filter on the devices of their underage children. Adults cannot turn on the filter for their own communication. After launching in the USA, the child safety feature recently arrived in the UK as well. Whether and when it will come to Germany remains unclear. Once activated, the operating system locally analyzes photos received and sent via iMessage in Apple's Messages app. Photos classified as nude images are then automatically blurred and displayed with a warning about "possibly sensitive" content, but can still be viewed or sent with a tap.
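The receive flow described above, local classification, blurring plus a warning, and an optional reveal on tap, can be sketched as follows. The classifier and the message model are illustrative stand-ins, not Apple's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional


def classify_nudity(photo: bytes) -> bool:
    """Stand-in for the on-device ML classifier (hypothetical).
    A toy byte-prefix check replaces the real model for illustration."""
    return photo.startswith(b"NUDE")


@dataclass
class IncomingPhoto:
    data: bytes
    blurred: bool = False
    warning: Optional[str] = None


def receive(photo: bytes) -> IncomingPhoto:
    """Analyze locally; flagged photos are blurred and carry a warning."""
    msg = IncomingPhoto(data=photo)
    if classify_nudity(photo):
        msg.blurred = True
        msg.warning = "possibly sensitive"
    return msg


def tap_to_reveal(msg: IncomingPhoto) -> IncomingPhoto:
    """The user can still choose to view the photo with a tap."""
    msg.blurred = False
    return msg
```

The key design point the article describes is that `classify_nudity` runs entirely on the device: the photo itself never has to leave it for the filter to work.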
Originally, Apple had planned to automatically notify parents of children under 13 when nude photos were viewed or sent. After massive criticism, including from child protection organizations, that function was dropped before its introduction. How reliably Apple's filter detects nude photos is not known. The detection of nudity and pornographic images relies on machine learning, which can make mistakes and be deceived, as Apple's software chief admitted in the run-up to the launch.