Apple’s child-abuse scanner has serious flaw, researchers say

Illustration by Alex Castro / The Verge

Researchers have found a flaw in iOS’s built-in hash function, raising new concerns about the integrity of Apple’s CSAM-scanning system. The flaw affects the hashing system, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures.
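To make the matching idea concrete, here is a minimal Python sketch of checking a perceptual hash against a database of known hashes. This is a simplification, not Apple's actual protocol: the real system wraps the comparison in cryptographic machinery (blinded hashes and private set intersection) so that neither the device nor the server learns anything about non-matching images. The names `compute_neuralhash` and `KNOWN_CSAM_HASHES` below are hypothetical placeholders.

```python
# Simplified illustration of hash-list matching (NOT Apple's actual protocol).
# Apple's deployed design hides the comparison behind cryptographic blinding,
# so non-matches reveal nothing; this sketch only shows the basic
# "is this hash in the known set?" idea with hypothetical names.

KNOWN_CSAM_HASHES = {
    "59a34eabe31910abfb06f308",  # example hex digests, not real entries
    "32dac883f7b91bbf45a48296",
}

def is_known_match(image_path: str, compute_neuralhash) -> bool:
    """Return True if the image's perceptual hash appears in the known list."""
    digest = compute_neuralhash(image_path)  # compute_neuralhash is hypothetical
    return digest in KNOWN_CSAM_HASHES
```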

On Tuesday, a GitHub user called Asuhariet Ygvar posted code for a reconstructed Python version of NeuralHash, which he claimed to have reverse-engineered from previous versions of iOS. The GitHub post also includes instructions on how to extract the NeuralHash model files from a current macOS or iOS build.
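The reconstructed pipeline, as described in the GitHub post, computes a 96-bit hash by running an image through the extracted neural network and binarizing a seeded projection of its output. The sketch below follows that general recipe and should run with `onnxruntime`, `numpy`, and `Pillow` installed, but the file names, the 128-byte header skip, and the exact preprocessing are assumptions about the extracted artifacts rather than a verified copy of the repository's code.

```python
# Hedged sketch of the NeuralHash computation pipeline described in the
# GitHub project. Assumes the model has already been extracted from an
# OS build and converted to ONNX, and that the 96x128 projection-seed
# matrix has been dumped to a raw file; those file layouts are assumptions.
import sys
import numpy as np
import onnxruntime
from PIL import Image

MODEL_PATH = "model.onnx"                  # assumed ONNX conversion of the model
SEED_PATH = "neuralhash_128x96_seed1.dat"  # assumed raw seed dump

def compute_neuralhash(image_path: str) -> str:
    # Load the converted model and the projection seed (assumed 128-byte header).
    session = onnxruntime.InferenceSession(MODEL_PATH)
    raw = open(SEED_PATH, "rb").read()[128:]
    seed = np.frombuffer(raw, dtype=np.float32).reshape(96, 128)

    # Preprocess: resize to the network's 360x360 input and scale to [-1, 1].
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img).astype(np.float32) / 255.0
    arr = (arr * 2.0 - 1.0).transpose(2, 0, 1).reshape(1, 3, 360, 360)

    # Run the network, project its 128-dim output with the seed matrix,
    # and binarize: the sign of each projection becomes one bit of the hash.
    input_name = session.get_inputs()[0].name
    output = session.run(None, {input_name: arr})[0].flatten()
    bits = "".join("1" if v >= 0 else "0" for v in seed.dot(output))
    return "{:0{}x}".format(int(bits, 2), len(bits) // 4)

if __name__ == "__main__":
    print(compute_neuralhash(sys.argv[1]))
```

Because the hash depends on the sign of floating-point projections, small perturbations to an image can flip bits, which is exactly the property the researchers probed when constructing collisions.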

“Early tests show that it can tolerate image resizing and compression, but not...


from The Verge - All Posts https://ift.tt/2W6RhCJ