
Apple’s Advanced Child Protection Program

Apple is a company that genuinely does a lot for its users' security. The App Store carries a minimum of unwanted applications, third-party developers get as little access to our content as possible, and we in turn are given tools to manage our own content and the information we choose to share.

New security program

On its official website, Apple has published a new program, Expanded Protections for Children. It is a set of measures consisting of three parts:

  1. Safety while communicating in iMessage.
  2. Expanded guidance in Siri and Search on such issues.
  3. Detection of child sexual abuse material (CSAM).

Let’s go over each part.

Part 1: Security in iMessage

The iMessage app (Messages) will gain a new Communication Safety feature. It warns children, and can alert their parents, when sexually explicit photos are received or sent.

When an explicit photo arrives, an on-device machine learning algorithm recognizes it and blurs it. The child is warned, and if they open the photo anyway, the parents can be notified.
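Conceptually, the flow reduces to a classify-blur-warn-notify chain. Below is a minimal sketch of that chain in Python; the classifier, the function names, and the under-13 age cutoff are illustrative assumptions, not Apple's actual API.

```python
# Toy sketch of the Communication Safety flow. Every name here is an
# assumption for illustration: Apple's real classifier is an unpublished
# on-device ML model, and parental alerts are an opt-in family setting.

def is_explicit(photo: bytes) -> bool:
    """Stand-in for the on-device ML model."""
    return b"explicit" in photo

def handle_incoming_photo(photo: bytes, child_age: int, parental_alerts: bool) -> None:
    if not is_explicit(photo):
        print("deliver the photo normally")
        return
    print("show a blurred preview and warn the child")
    child_opened_anyway = True          # pretend the child tapped through
    if child_opened_anyway and child_age < 13 and parental_alerts:
        print("notify the parents")

handle_incoming_photo(b"explicit sample", child_age=10, parental_alerts=True)
```

Note that everything in this sketch happens on the device itself; nothing is sent to Apple.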

Photo protection

The feature is undeniably useful. On the other hand, iMessage is not as popular a tool as social networks. And how often do strangers text you via iMessage? They would, after all, need to know your phone number.


Part 2: Siri training

The voice assistant Siri will be “trained” to give competent answers when, for example, a child or an adult asks how to seek specialized help, or wants to report child abuse or exploitation.

Siri training

The feature is quite useful: users are less likely to fumble in search engines and less likely to land on unwanted results.

Part 3: CSAM

CSAM stands for Child Sexual Abuse Material.

There is a non-profit organization, the National Center for Missing & Exploited Children (missingkids.org), which works in collaboration with US law enforcement agencies.

Organization

A database of hashes (codes) will be built from known abuse images collected by the center.

A hash function is an algorithm that converts an array of data (in our case, images) into a short code.

These codes are collected into one large database and handed to Apple, which can then compare your photos against it and find matches.
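To make the matching idea concrete, here is a minimal sketch in Python. It uses a toy “average hash” (via the Pillow library) rather than Apple's proprietary NeuralHash, and an invented database entry; the principle of comparing short codes instead of raw pixels is the same.

```python
# Toy perceptual hashing and database lookup. This is an "average hash",
# not Apple's NeuralHash; the database entry below is invented.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size, grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hash codes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hash codes of known images.
known_hashes = {0x8F3C0A1152E790BD}

def matches_database(path: str, max_distance: int = 4) -> bool:
    """A near-duplicate image produces a nearby code, so we allow
    a few differing bits rather than requiring exact equality."""
    h = average_hash(path)
    return any(hamming(h, k) <= max_distance for k in known_hashes)
```

Unlike a cryptographic hash, a perceptual hash is designed so that a slightly edited image (resized, recompressed) still lands near the original code.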

A user's photos normally live on the device and in iCloud. In theory, Apple would have to scan both. But scanning photos in iCloud takes computing power, which is an additional cost. And how do you get at the devices themselves?


This is where an ingenious technology, NeuralHash, comes in. Before a photo is uploaded to iCloud, it is scanned and hashed directly on your iPhone. The result is encrypted into a “safety voucher” that is attached to the photo, and only then is the photo uploaded to cloud storage.

Photo analysis

That is, every photo that reaches iCloud has already been analyzed, right on your iPhone, using the power of your own device.
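In code terms, the client-side step might look roughly like the sketch below. Here the safety voucher is a plain dictionary and the “hashing” a stand-in; Apple's real scheme hashes with an ML model and encrypts the voucher so that it only becomes readable after enough matches accumulate on the server.

```python
# Sketch of the on-device step: hash locally, wrap the result in a
# "safety voucher", and upload photo and voucher together. All names
# and the unencrypted voucher are placeholders for Apple's real scheme.
import json

def neural_hash(photo: bytes) -> int:
    """Stand-in for the NeuralHash model; not the real algorithm."""
    return hash(photo) & 0xFFFFFFFF

def make_safety_voucher(photo: bytes) -> bytes:
    voucher = {"hash": neural_hash(photo)}
    return json.dumps(voucher).encode()   # real vouchers are encrypted

def upload_to_icloud(photo: bytes) -> None:
    voucher = make_safety_voucher(photo)  # computed by the device itself
    print(f"uploading {len(photo)} bytes with voucher {voucher!r}")

upload_to_icloud(b"vacation.jpg contents")
```

The point of the design is that the expensive analysis runs on millions of iPhones instead of in Apple's data centers.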

Then a special algorithm comes into play. Once hashes from the center's database match hashes from your library, and some critical mass of matches accumulates, the data is handed to Apple, reviewed (privately), and then passed on to the National Center for Missing & Exploited Children and to the police.
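The server-side logic boils down to a threshold check, roughly as in the sketch below. The threshold value and all names are assumptions; Apple says only that a critical mass of matches is required before anything is reviewed.

```python
# Threshold logic: a single match does nothing by itself; only when the
# match count crosses a threshold is the account escalated for human
# review. THRESHOLD and all names here are illustrative assumptions.
THRESHOLD = 30

def review_account(library_hashes: list[int], known_db: set[int]) -> None:
    matches = sum(1 for h in library_hashes if h in known_db)
    if matches >= THRESHOLD:
        print("escalate to human review, then to NCMEC if confirmed")
    else:
        print(f"{matches} match(es): below threshold, nothing happens")

review_account(library_hashes=[101, 202, 303], known_db={202, 999})
```

Until the threshold is reached, in Apple's design the vouchers cannot even be decrypted, so isolated false matches stay invisible.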

CSAM system

In that case, your Apple ID account, along with all its data, will be blocked. You will, of course, have the right to appeal, but many questions remain open.

For example, how will Apple's experts double-check the algorithm? The only way is to look at the photos that reach them. What if there are no real matches, just personal photos of, say, my own child?

Bear in mind that other large companies (Microsoft, Google, Facebook, Twitter) already scan the public internet for such content in order to take it down. Apple, by contrast, has decided to analyze the media library sitting on your device, using your device's own resources.


The rollout will begin in future updates. The iMessage and Siri measures will arrive only with iOS 15, iPadOS 15, and macOS Monterey. The on-device scanning algorithm, however, will run only on iOS and iPadOS devices, and no firmware version has been officially named, which means the update could “arrive” on any version of iOS at any time.

For now the algorithm will operate only in the United States, but that is only for now. I am not at all opposed to large companies scanning the network for prohibited content, banning it, and exposing offenders. But digging into my personal device is an entirely different matter.

The banal truth is that after this news, offenders will simply dump their forbidden material onto a flash drive or a PC and view it there, while millions of devices around the world get scanned.

Reader poll (2 votes): “The system is desirable” – 100%; “I am against scanning my device” – 0%.
