Apple’s Delaying Its Campaign to Protect Kids from Sexual Predators

Tech giant Apple recently announced plans to begin testing a new system that would scan iPhone and iCloud accounts for photos matching a database of known child pornography and sexual abuse images. When a match is found and verified, Apple would alert the appropriate authorities. The announcement was part of a larger push by the company to focus on child safety.

However, Apple has announced it would delay the rollout of the features in response to backlash from its customers and from privacy advocates. Per CNBC News, “After objections about privacy rights, Apple said Friday it will delay its plan to scan users’ photo libraries for images of child exploitation.”

How does Apple’s child protection technology work?

The tech was supposed to build on an existing system used by Apple, Gmail, and other cloud providers. CNBC explains that “the system does not scan a user’s photos, but instead looks for known digital ‘fingerprints’ that it matches against the CSAM database. If the system detects enough images on a user’s account, it is then flagged to a human monitor who can confirm the imagery and pass the information along to law enforcement if necessary.”
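Conceptually, the matching CNBC describes works like checking each photo's fingerprint against a blocklist and escalating only past a threshold. The sketch below is purely illustrative: Apple's actual system uses a perceptual hash (NeuralHash) with cryptographic safeguards, not a plain file hash, and the hash value and threshold here are made up for demonstration.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-supplied fingerprint database.
# (This example value is just the SHA-256 of an empty file.)
KNOWN_FINGERPRINTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

# Illustrative threshold: an account is flagged for human review
# only after enough matches accumulate, reducing false positives.
REVIEW_THRESHOLD = 30

def fingerprint(photo_bytes: bytes) -> str:
    """Compute a digital fingerprint of a photo (simplified to SHA-256)."""
    return hashlib.sha256(photo_bytes).hexdigest()

def count_matches(photos: list) -> int:
    """Count how many photos match the known-fingerprint database."""
    return sum(1 for p in photos if fingerprint(p) in KNOWN_FINGERPRINTS)

def should_escalate(photos: list) -> bool:
    """Flag the account to a human reviewer only past the threshold."""
    return count_matches(photos) >= REVIEW_THRESHOLD
```

Note the design point the article highlights: the system never inspects image content directly; it only compares fingerprints, and a human reviewer is involved before anything reaches law enforcement.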

Privacy and security proponents raised alarms about the image scanning, concerned that private and innocent family photos could be mistakenly flagged as CSAM. Apple said the system is limited to material flagged by the National Center for Missing and Exploited Children (NCMEC) and uploaded to iCloud Photos. As an extra safeguard, if a photo generates an alert, Apple and NCMEC must review it before law enforcement is notified.

According to the company’s announcement, “Apple only learns about users’ photos if they have a collection of known CSAM (child sexual abuse material) in their iCloud Photos account.”

Although Apple’s Messages app already has a device-scanning setting for children’s accounts that detects sexually explicit content, the new setting adds a further layer of security. If a child sends or receives this type of content, the image is blurred and the child receives an alert. The new setting, which parents can enable on their family iCloud account, also warns the child that if they choose to send or view the image, their parents or guardians will receive a message.

It is unclear how long Apple will delay the new tools.

What were Apple’s expanded protections?

In its initial announcement, Apple discussed new safety features designed to protect children from online sexual predators and limit the spread of Child Sexual Abuse Material, also called CSAM. The suite encompasses three areas:

  • CSAM detection on iOS and iPadOS to limit the spread of these images online, helping provide information to law enforcement while protecting user privacy.
  • Enhanced tools within the Messages app, using on-device machine learning to warn parents if their child is about to view sensitive content, while keeping private messages unreadable by Apple.
  • Updates to Siri and Search to help parents or children who encounter unsafe sexual situations, as well as intervene when users attempt to search for CSAM or CSAM-related topics.

The company noted in its press release that these “features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.”

Keeping your children protected from online predators

Especially in these ever-changing times, it can be difficult to keep up with what your child is doing online. During last year’s pandemic lockdown, calls from young people under age 18 to the National Sexual Abuse Hotline rose by 22 percent. Further, reports of online child sex abuse quadrupled from the same period in 2019. How can you keep your child safe? The U.S. Department of Justice (DOJ) offers some valuable advice.

  • Discuss online safety with your child, teach them how to spot red flags, and encourage open communication with you or another trusted adult if they feel unsafe.
  • Supervise younger children’s internet use, and periodically check their online profiles. Think about setting internet time limits or usage only in common areas of the home.
  • Review apps, social media, or games before your child uses them, especially any that include messaging, video chats, anonymity, or file uploads; these features are frequently exploited by online predators.
  • Utilize parental controls and privacy settings for your child’s online activities.
  • Talk to your kids about the importance of not sharing personal and identifying information or personal photos online, especially to people they don’t know. Remind them that the internet is forever.
  • Be alert to potential signs of online sexual abuse, such as changes in your child’s electronics use, concealing their online activity, or changes in behavior such as withdrawal, emotional outbursts, depression, or anxiety.

Finally, the DOJ states: “Immediately report suspected online enticement or sexual exploitation of a child by calling 911, contacting the FBI at tips.fbi.gov, or filing a report with the National Center for Missing & Exploited Children (NCMEC) at 1-800-843-5678 or report.cybertip.org.”

At Taylor & Ring, our Los Angeles sexual assault attorneys protect children from predators, online and off. If your child was harmed by a sexual predator, let us help. We are here to listen and hold the person that hurt your child responsible for their actions. To schedule a free consultation, call our office at 310-776-6390, or complete our contact form.