Apple’s New Tool Will Scan iPhones and Report Child Sexual Abuse Material if Found

Image credit: Apple.com

Tech giant Apple has announced that it will add new technology to its iOS, macOS, watchOS and iMessage platforms. The technology will identify child sexual abuse material (CSAM) and report it to the relevant authorities if such material is found.

The company has announced that its upcoming devices will include new cryptographic techniques to prevent such images from spreading, while also maintaining user privacy. The feature will ship with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.

Apple is also introducing new features for the safety of children. According to Apple, the new child safety features were designed in partnership with child safety experts. They include new communication tools that let parents keep a closer eye on their children’s online activity.


The company is also bringing a feature to iMessage that uses on-device machine learning (ML) to identify sensitive content in received images. If sensitive content is detected, iMessage will blur the picture and warn the child. If the child views the photo anyway, a notification will be sent to the parent. The same applies in the other direction: if the child tries to send such a picture, the parent will be notified as well.
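The flow described above can be sketched as follows. This is a minimal illustration, not Apple's implementation: `looks_sensitive` is a hypothetical stand-in for the on-device ML classifier, and the threshold, scores, and action strings are all invented for the example.

```python
# Sketch of the iMessage child-safety flow described in the article.
# `looks_sensitive` is a hypothetical stand-in for Apple's on-device
# ML classifier; in reality the analysis happens inside iMessage.

def looks_sensitive(image_score: float, threshold: float = 0.8) -> bool:
    """Pretend classifier: flag images whose score exceeds a threshold."""
    return image_score >= threshold

def handle_incoming_image(image_score: float, child_opened_it: bool) -> list[str]:
    """Return the actions iMessage would take for a received image."""
    actions = []
    if looks_sensitive(image_score):
        # Sensitive content: blur first, warn the child, and only
        # notify the parent if the child chooses to view it anyway.
        actions.append("blur image")
        actions.append("warn child")
        if child_opened_it:
            actions.append("notify parent")
    else:
        actions.append("show image")
    return actions

print(handle_incoming_image(0.95, child_opened_it=True))
```

The same decision logic would apply to outgoing images: the parent is notified if the child attempts to send a flagged picture.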

CSAM

Additionally, to track CSAM online, Apple will use a database of known CSAM image hashes provided by NCMEC (National Center for Missing & Exploited Children) and other child safety organizations. Photos will be compared against this database before they are stored in iCloud Photos. If known CSAM is found in an account, Apple will manually review each report and disable the user’s account if the report is confirmed. The company will also report the account to NCMEC.
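The compare-before-upload idea can be illustrated with a short sketch. Note the heavy simplification: Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic protocols so matches stay hidden until a threshold is crossed, whereas this example uses an ordinary SHA-256 digest and a plain counter, and `REPORT_THRESHOLD` is an invented number.

```python
import hashlib

# Illustrative sketch of matching photos against a database of known
# CSAM fingerprints before iCloud upload. The SHA-256 digests and
# REPORT_THRESHOLD below are stand-ins, not Apple's actual mechanism.

REPORT_THRESHOLD = 3  # hypothetical: escalate after this many matches

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(photos: list[bytes], known_hashes: set[str]) -> bool:
    """Return True if the account should be escalated for human review."""
    matches = sum(1 for photo in photos if fingerprint(photo) in known_hashes)
    return matches >= REPORT_THRESHOLD
```

In the real design, the device cannot learn which photos matched and Apple cannot read the match results until the threshold is exceeded; here that is collapsed into a simple counter purely to show the flow.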

A similar feature is coming to Siri and Search. If users search for CSAM-related material, the new feature will intervene and stop them, and additional information will be provided to help keep children and parents safe online. The company has also said that user privacy will be preserved even after these features are introduced. They will initially launch only in the US.

