Parents will be able to activate the feature on their kids' iOS devices
Apple is pushing ahead with its nudity-detecting child protection feature in the Messages app for iOS 15.2, but parents will have to turn it on.
When Apple first revealed its child protection features, they were met with a fairly critical response, resulting in a delay of the planned roll-out.
The biggest privacy concern, Apple scanning iCloud photos for Child Sexual Abuse Material (CSAM), is still on hold, but according to Bloomberg, the Messages update is slated for release with iOS 15.2.

Apple says it won't be on by default, however, and that image analysis will happen on-device, so the company won't have access to potentially sensitive material.
According to Apple, once enabled, the feature will use on-device machine learning to detect whether photos sent or received in Messages contain explicit material.
The feature will blur potentially explicit incoming images and warn the child, and it will also warn them if they're trying to send something that might be explicit.

In both cases, the child will also have the option to contact a parent and tell them what’s going on.
In a list of Frequently Asked Questions, Apple states that for child accounts 12 and under, the child will be warned that a parent will be contacted if they view or send explicit material.
For child accounts ages 13 to 17, the child is warned of the potential risk, but parents will not be contacted.
In the same FAQ, Apple insists that none of this information will be shared with anyone else, including Apple, law enforcement, or the NCMEC (National Center for Missing & Exploited Children).
These new child safety options for Messages should be available in the upcoming iOS 15.2 update, which is expected to roll out sometime this month, according to Macworld.