Apple Inc (AAPL.O) said on Friday that it would roll out a system for screening images of child abuse on a country-by-country basis, depending on local laws, starting in the United States.
In announcing the system, the company said it will check pictures for such images before they are uploaded from iPhones in the United States to its iCloud storage.
Child safety advocates applauded Apple for joining companies such as Facebook Inc (FB.O), Microsoft Corp (MSFT.O), and Alphabet Inc’s (GOOGL.O) Google in implementing similar safeguards.
Apple's decision to check images on the iPhone itself has raised concerns that the company is reaching into customers' devices in ways that governments could abuse. Most other large technology firms examine photographs only after they have been uploaded to servers.
In a media briefing on Friday, the company said plans to extend the service would follow the laws of each country in which it operates.
Apple says safeguards in its system, such as "safety vouchers" passed from the iPhone to Apple's servers that contain no usable data on their own, will shield the company from government demands to identify content other than child abuse pictures.
Apple also said it has a human review process that serves as a safeguard against misuse by governments: the company will not forward reports generated by its picture-checking system to law enforcement unless the review finds evidence of child abuse images.
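The flow described above can be illustrated with a minimal sketch. This is not Apple's actual implementation; the names (`Voucher`, `REPORT_THRESHOLD`, `should_escalate`) and the specific threshold are hypothetical, standing in for the idea that individual vouchers reveal nothing on their own and that a report is forwarded only after a match threshold is crossed and a human reviewer confirms the content:

```python
# Illustrative sketch only -- NOT Apple's implementation.
# All names and values here are hypothetical.
from dataclasses import dataclass

REPORT_THRESHOLD = 30  # hypothetical: matches needed before human review


@dataclass
class Voucher:
    """One voucher per uploaded image; carries only a match flag here."""
    account_id: str
    matched: bool  # did the on-device check flag this image?


def should_escalate(vouchers: list[Voucher]) -> bool:
    """Escalate to human review only once matches pass a threshold."""
    matches = sum(1 for v in vouchers if v.matched)
    return matches >= REPORT_THRESHOLD


def report_to_authorities(vouchers: list[Voucher],
                          human_confirms_csam: bool) -> bool:
    """A report goes to law enforcement only if reviewers confirm abuse imagery."""
    return should_escalate(vouchers) and human_confirms_csam
```

The point of the two-stage gate is that neither the automated match count nor the human review alone triggers a report; both conditions must hold.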
Regulators are intensifying their demands that technology firms do more to remove unlawful material from the internet. For the past several years, law enforcement and politicians have invoked the plague of child abuse material to criticize strong encryption, much as they previously cited the need to combat terrorism.
Some of the resulting laws, including one in the United Kingdom, could be used to secretly compel tech firms to act against their own customers.
Apple's strategy may deflect government meddling, or help the company comply with anticipated European Union directives. But many security experts believe the privacy champion is making a grave mistake by demonstrating its willingness to reach into customer phones.
"It may have diverted attention away from this one subject in the United States, but it will draw demands to screen for terrorist and extremist material in other parts of the world," said Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory.
Politically powerful copyright holders in Hollywood and elsewhere may even argue that their digital rights should be enforced in this manner, she said.
Facebook's WhatsApp, the world's biggest fully encrypted messaging service, is also under pressure from governments that want to know what people are saying, and it worries that the pressure will only grow. On Friday, WhatsApp chief Will Cathcart unleashed a torrent of criticism at Apple, claiming that the company's new design was flawed.
For decades, people have used personal computers to store their private information, yet no law has ever required that every desktop, laptop, or phone in the world be scanned for illegal material, Cathcart said. "It's not the way technology developed in free societies operates."
Apple's specialists argued that the company was not actually intruding into people's phones, since data sent from its devices must clear several hurdles. Banned content, for example, is identified by monitoring organizations, and those identifiers are embedded identically into Apple's operating systems across the globe, making them more difficult to alter for any one country.
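The identifier check described above can be sketched roughly as follows. This is an assumption-laden illustration, not Apple's system: Apple's actual design uses a perceptual "NeuralHash" and cryptographic blinding, whereas this sketch substitutes a plain SHA-256 lookup against a hypothetical list shipped inside the OS:

```python
# Illustrative sketch, NOT Apple's NeuralHash pipeline: a plain
# exact-hash lookup against an identifier list baked into the OS image.
import hashlib

# Hypothetical identifier list, shipped identically on every device
# worldwide, so it cannot quietly vary by country.
BANNED_HASHES = {
    hashlib.sha256(b"known-banned-sample").hexdigest(),
}


def image_identifier(image_bytes: bytes) -> str:
    """Derive an identifier for an image.

    Real systems use perceptual hashes that survive resizing and
    re-encoding; SHA-256 stands in here for simplicity.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def flag_before_upload(image_bytes: bytes) -> bool:
    """Check an image against the embedded list before iCloud upload."""
    return image_identifier(image_bytes) in BANNED_HASHES
```

Because the list is compiled into the operating system rather than fetched per-user, tampering with it for a single target would require shipping a different OS build, which is the hurdle Apple's specialists were pointing to.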
Some analysts said they saw one reason to believe Apple had not made a fundamental shift in its strategy.
According to a Reuters report last year, Apple had been working on making iCloud backups end-to-end encrypted, which would have prevented the firm from turning over readable copies to law enforcement agencies. The company abandoned the plan after the FBI objected.
Alex Stamos, founder of the Stanford Internet Observatory, speculated this week that Apple may be taking these steps to stave off expected criticism before switching on that encryption later this year.
Apple declined to comment on its future product plans.