Apple Will Now Monitor iPhone and iCloud Pictures for Child Abuse

The features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

By Emily Rella Aug 06, 2021

Opinions expressed by Entrepreneur contributors are their own.

Apple Inc. announced Thursday that it’s implementing a new system to check iPhone images before they’re uploaded to the iCloud storage service, to be sure no images match known images of child sexual abuse.

During a press conference, Apple explained that the service will turn device images into an unreadable series of hashes, or complex numbers, which will be matched against a database of hashes from the National Center for Missing and Exploited Children.
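At a high level, hash matching works by reducing each image to a fixed-length digest and checking that digest against a database of digests of known abuse material, so the images themselves never need to be compared or read directly. The sketch below is a deliberately simplified illustration of that membership check using a plain cryptographic hash; Apple's actual system reportedly uses a perceptual hash ("NeuralHash") and cryptographic matching techniques that this example does not attempt to reproduce, and the `KNOWN_HASHES` set here is a placeholder.

```python
import hashlib

# Placeholder database of digests of known images. In the real system this
# would be derived from NCMEC's database, not hard-coded values.
KNOWN_HASHES = {
    # SHA-256 digest of the empty byte string, used here purely as a stand-in
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(image_bytes: bytes) -> str:
    """Turn image bytes into an unreadable fixed-length digest.

    A cryptographic hash is used for illustration; a production system
    would use a perceptual hash so that minor edits to an image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Check whether the image's digest appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES

# The check reveals only whether a digest matches, not the image contents.
print(matches_known_database(b""))      # matches the placeholder entry
print(matches_known_database(b"cat"))   # does not match
```

The design point this illustrates is that the comparison happens over digests, not pixels: the database holds no images, and a non-matching photo reveals nothing about itself beyond its hash.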

According to a note on Apple’s website, this is just one part of the brand’s new child safety initiative: “First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.”

The Siri and Search functions in iOS will also play a part in combating child abuse: they’ll provide parents and children with information and help in unsafe situations, and intervene when users attempt to search for abuse-related topics.

Apple, which markets itself as a secure and private option for consumers, was careful to highlight that these steps are not meant to infringe on privacy.

The features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.


Emily Rella

Senior News Writer
Emily Rella is a Senior News Writer at Entrepreneur.com. Previously, she was an editor at Verizon Media. Her coverage spans features, business, lifestyle, tech, and entertainment. She is a 2015 graduate of Boston College and a Ridgefield, CT native. Find her on Twitter at @EmilyKRella.