Apple appeals ruling in favor of security research firm that probes programs such as child abuse image detection

Apple on Tuesday appealed its loss in a copyright case against security startup Corellium, whose tools help researchers examine programs like Apple's proposed new method for detecting images of child sexual abuse.

A federal judge last year rejected Apple's copyright claims against Corellium, which makes a simulated iPhone that researchers use to study how the tightly locked-down devices work.

Security professionals are among Corellium's core customers, and the flaws they uncover have been reported to Apple for cash bounties and used elsewhere, including by the FBI to crack the phone of a mass shooter who killed several people in San Bernardino, California.

Apple makes its software difficult to review, and the specialized research phones it offers select experts come with a number of limitations. The company declined to comment.

The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

Experts said they were also surprised that Apple revived a fight against a major provider of research tools just after arguing that researchers would provide a check on its controversial plan to scan customer devices.

“Enough is enough,” said Corellium chief executive Amanda Gorton. “Apple can't pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal.”

Under the plan announced earlier this month, software will automatically check photos uploaded to iCloud online storage from phones or computers to see whether they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will check whether the images are illegal, then cancel the account and refer the user to law enforcement.
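For readers who want a concrete picture, here is a minimal sketch of the threshold-matching logic described above. It is illustrative only: it uses a plain cryptographic hash and matching in the clear, whereas Apple's actual design uses a perceptual hash (NeuralHash) and cryptographic techniques so that matches are not revealed below the threshold, none of which is reproduced here. The function names and threshold value are hypothetical.

```python
import hashlib
from typing import Iterable, Set

# Hypothetical threshold: an account is escalated only once this many
# of its uploads match known identifiers, so a single false match
# cannot trigger a report on its own.
MATCH_THRESHOLD = 30

def image_identifier(image_bytes: bytes) -> str:
    # Stand-in "digital identifier". Apple's system derives a perceptual
    # hash that tolerates resizing and re-encoding; SHA-256 is used here
    # only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: Iterable[bytes], known_ids: Set[str]) -> int:
    # Count uploaded photos whose identifier appears in the known set.
    return sum(1 for img in uploads if image_identifier(img) in known_ids)

def needs_human_review(uploads: Iterable[bytes], known_ids: Set[str]) -> bool:
    # Only once enough matches accumulate is the account flagged for the
    # manual check, and possible referral, described above.
    return count_matches(uploads, known_ids) >= MATCH_THRESHOLD
```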

“‘We will prevent abuse of these child safety mechanisms by relying on people not bypassing our copy protection mechanisms’ is a rather internally incoherent argument,” tweeted David Thiel of the Stanford Internet Observatory.

Digital rights groups have objected to the plan, arguing that Apple has marketed itself as devoted to privacy while other companies scan content only after it has been stored or shared online.

One of their main arguments was that governments could in theory force Apple to scan for banned political material as well, or to target a single user.

Defending the program, Apple executives said researchers could verify the list of banned images and examine what data was sent to the company in order to keep it honest about what it was searching for and from whom.

A senior executive said such reviews made the system better for privacy overall than would have been possible if the scanning had taken place in Apple's own storage, where the code is kept secret.

© Thomson Reuters 2021
