Apple delays launch of child abuse detection technology

The US company had planned to launch the feature on family iCloud accounts along with iOS 15, iPadOS 15 and macOS Monterey

Apple has delayed by a few months the rollout of its technology to detect and combat the distribution of child sexual abuse images through the company’s services, such as iCloud, and will introduce changes before launch.

The US company had planned to launch its new technology on family iCloud accounts with iOS 15, iPadOS 15 and macOS Monterey, but the rollout will now be delayed for a few months due to “feedback from customers, advocacy groups, researchers and others,” as it acknowledged in a statement.

“We have decided to take additional time over the coming months to collect input and make improvements before releasing these child safety features,” the company said.

This technology, which Apple announced in August but had not yet deployed, was designed to protect children from predators who use the company’s communication tools to contact and exploit minors, as well as to curb the dissemination of such content.

These “new cryptographic applications” make it possible to detect images of this kind stored in iCloud. The method does not scan images in the cloud; instead, before photos are uploaded to iCloud, the device compares them against a database of known images provided by child safety organizations.

What is compared is not the image itself but its “hash”, a kind of digital fingerprint. A cryptographic technique called ‘private set intersection’ determines whether there is a match without revealing the result, which is attached to the image once it is uploaded to iCloud.
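To illustrate the general idea, here is a minimal Python sketch of on-device matching against a set of known fingerprints. It is not Apple’s actual system: the hash function, database and function names are hypothetical stand-ins, and the real design relies on a perceptual hash and private set intersection rather than a plain lookup.

import hashlib

# Hypothetical database of fingerprints of known abuse images, as would be
# supplied by child safety organizations. Real deployments use perceptual
# hashes (Apple's is called NeuralHash), not a plain cryptographic hash.
KNOWN_HASHES = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def fingerprint(image_bytes: bytes) -> str:
    # A cryptographic hash is used here only for illustration: it matches
    # byte-identical files, whereas a perceptual hash also matches resized
    # or re-encoded copies of the same picture.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # On-device check performed before the photo is uploaded to the cloud.
    return fingerprint(image_bytes) in KNOWN_HASHES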

A second technique, ‘threshold secret sharing’, ensures a high degree of confidence in the matches: only when an account exceeds a threshold of known matches does Apple receive an alert, which is then passed to human reviewers. If the matches are confirmed, the user’s account is disabled and a report is sent to the relevant child safety organizations and law enforcement.
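The threshold behaviour can be sketched in the same spirit. The record_match helper and THRESHOLD constant below are invented for illustration, and the counter only mimics the visible effect; in the real scheme, threshold secret sharing keeps the match results cryptographically unreadable to Apple until enough of them accumulate.

from collections import defaultdict

# Illustrative value; Apple described a threshold on the order of 30 matches,
# chosen to keep the chance of falsely flagging an account negligible.
THRESHOLD = 30

match_counts = defaultdict(int)  # account id -> number of recorded matches

def record_match(account_id: str) -> bool:
    # Record one positive match for the account. Only when the count reaches
    # the threshold is an alert raised for human review; below it, nothing
    # is revealed or acted upon.
    match_counts[account_id] += 1
    return match_counts[account_id] >= THRESHOLD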

K. Tovar

Source: dpa
