"What happens on your iPhone stays on your iPhone." Remember that huge advertisement covering an entire building? At CES 2019 in Las Vegas, Apple, which did not even exhibit at the show, gave itself the best advertisement there. In recent years, Apple has held high the banner of protecting user privacy in the technology industry, making it one of the biggest selling points of its products under the slogan that the iPhone protects users' privacy.
Not only that, Apple has also taken aim at Internet companies that profit from users' data. Cook has repeatedly and publicly criticized companies such as Facebook for selling user privacy for profit, emphasizing that Apple would never touch user data. This year Apple even updated iOS to let users decide whether to allow data tracking, a change that dealt a major blow to Internet companies such as Facebook.
To safeguard the security of user data, Apple has also withstood pressure from the US government over the past few years, firmly rejecting its request to build a data-access back door into the iPhone in the name of counter-terrorism. In addition, in 2016 and 2020, Apple publicly refused at least twice to unlock terrorists' iPhones at the government's request, even going to court against the government.
The background of these two incidents was the shooting in San Bernardino, California and the shooting at a naval base in Florida, with Islamic extremist forces behind the gunmen. The FBI hoped to learn the terrorists' motives and contacts from their phones' communication records. Because of Apple's firm resistance, the US government ultimately turned to a third-party company to crack the devices and extract the data from the terrorists' phones.
Yet Apple, the company that supposedly protects user privacy unconditionally, has now seemingly shot itself in the foot by publicly announcing that it will scan users' files, including photos stored locally on the iPhone. Even if the starting point is to uphold justice, the move is sharply at odds with Apple's style and has sparked heated debate about user privacy. What exactly is going on?
Earlier this month, Apple publicly announced that an upcoming upgrade of iOS and iPadOS would add a new function that scans users' devices locally for photos related to child sexual abuse and compares them against the official US child sexual exploitation database. If a large number of illegal pictures are found, Apple will notify law enforcement.
This feature is part of Apple's broader program to combat child sexual abuse. In addition, the iMessage service will gain a parental-control function: if minors aged 12 or under send or view nude photos through iMessage, their parents will receive a system alert.
Even if preventing child exploitation and human trafficking serves justice, scanning users' local pictures is still highly sensitive and has raised many controversies and concerns. Ten days after the news was released, Craig Federighi, the Apple senior vice president in charge of software engineering, finally had to come out and clarify.
Federighi explained the function in detail: when iPad and iPhone users upload local pictures to iCloud, Apple scans those pictures on the device and compares them against the hash database of the National Center for Missing and Exploited Children (NCMEC). If a large number of pictures match, it indicates that the user possesses images of missing or sexually abused children; Apple will forward the case to its review team and, after reconfirmation, notify law enforcement, while the user will not be alerted. However, Apple will also suspend the user's iCloud account, which amounts to notifying the user indirectly.
It is worth noting that the system triggers an alert only when a large number of illegal pictures are found by comparison; a single matching picture will not, which is clearly meant to avoid false identifications. Moreover, Apple will bring in third-party reviewers to confirm whether the content flagged by the system is illegal before the police are notified. Apple has not, however, disclosed the threshold for how much illegal content is required, so outsiders do not know how many illegal pictures it takes to trigger the process automatically.
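To make the threshold mechanism concrete, here is a minimal sketch in Swift. It is not Apple's implementation: the per-image match results and the threshold value of 30 are invented placeholders, since Apple has not disclosed the real number.

```swift
// Hypothetical sketch of threshold-gated flagging: no single match is reported
// on its own; escalation happens only after the number of matches crosses a
// placeholder threshold, limiting the impact of any single false positive.
struct MatchCounter {
    let threshold: Int                 // placeholder; the real value is undisclosed
    private(set) var matches = 0

    // Record one scan result; return true once human review should be triggered.
    mutating func record(isMatch: Bool) -> Bool {
        if isMatch { matches += 1 }
        return matches >= threshold
    }
}

var counter = MatchCounter(threshold: 30)        // arbitrary stand-in value
let scanResults = [false, true, false, true]     // made-up per-image match results
for result in scanResults {
    if counter.record(isMatch: result) {
        print("Threshold reached: escalate account for human review")
        break
    }
}
```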
Apple said earlier that the scanning function will launch first only in the United States, and will come to other countries later as appropriate. Federighi further explained that the CSAM database is bundled with the iOS and iPadOS system upgrade and is activated only in the US market: "Our devices in China, Europe and the United States all use the same software, including the same database."
Why would Apple, the company that claims to care most about user privacy, publicly announce that it will scan users' local photos? Are the photos on a user's own iPhone and iPad entirely private? And are Internet companies responsible for this kind of illegal content?
Section 230 of the Communications Decency Act of 1996 provides that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." The clause is widely regarded as a liability shield for Internet companies.
The clause carries two implications: Internet companies are not liable for third-party information on their platforms, so content published by users is not attributed to the platform; and they are not liable for removing platform content in good faith, retaining the right to delete user posts according to their own moderation standards.
But this shield is not boundless. In 2018, the US Congress passed anti-sex-trafficking legislation (FOSTA-SESTA) that explicitly requires Internet companies to report and remove such illegal content from their platforms or face legal penalties, and child pornography is explicitly included. This was the first crack in the Section 230 exemption.
Under this law, if a user enables iCloud photo upload, the pictures on the local device are connected to the Internet, and Apple, as an Internet platform, then has both the responsibility and the authority to report and handle such illegal content. For most iPhone users, iCloud photo upload is turned on by default.
In fact, major American Internet companies have long scanned user photo libraries to identify content related to child pornography. Facebook, Twitter and Reddit compare and scan the content users keep on their platforms, whether or not it has been publicly posted. Like Apple, they match it against NCMEC's database and alert the police when violations are found.
More importantly, child pornography is not protected speech under the First Amendment of the US Constitution. Merely possessing child pornography is illegal, whether or not it is disseminated. US laws against child sexual exploitation and child trafficking explicitly make the production, distribution, receipt and possession of such content federal felonies, punishable by at least 5 and up to 30 years in prison.
In Europe, Internet regulation in this area came even earlier. As early as 2013, the United Kingdom drew up rules requiring Internet service providers (ISPs) to block pornographic content by default, so users must manually turn off the filter before they can visit pornographic websites. Around the same time, the British government also asked search providers such as Google and Microsoft to remove child pornography from search results, though such material rarely appears in search results in any case.
Even though American Internet companies have long been cracking down on child pornography, Apple's announcement that it would scan users' pictures still raised many questions. Apple confirmed to American media that it had previously scanned iCloud Mail content, but had never scanned users' iCloud Photos data. Matt Tait, a network security expert, pointed out that child pornography is already a bright red line strictly enforced by US law enforcement: agencies can easily obtain a subpoena asking Apple to scan users' iCloud photo libraries, and Apple would be willing to comply with such a request.
So what is the difference between Apple's approach and that of other Internet companies? Other Internet companies scan content after users upload it to cloud servers, whereas Apple scans on the device itself (provided iCloud synchronization is turned on). Why did Apple choose local scanning?
Federighi, Apple's senior vice president, stressed that the scanning mechanism runs locally on the device rather than as an online comparison, and for now applies only to the US market: "If someone scans pictures in the cloud, who knows what else might be scanned (implying it could get out of control). In our design, NCMEC's database is delivered to the device locally."
The new iOS and iPadOS releases will include a tool called NeuralHash, which breaks an image down into segments for labeling and recognition and compares the result against millions of entries in the NCMEC database. When matching illegal content is found, a "safety voucher" containing the flagged content is generated on the device, and it is only sent to Apple's reviewers once such illegal content accumulates past a certain threshold.
Erik Neuenschwander, Apple's head of privacy, emphasized in particular that if a user turns off iCloud photo synchronization, the NeuralHash tool will not run at all, and Apple will not, and cannot, scan for child pornography pictures.
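The on-device flow described above (hash the image, compare it against a bundled database, generate a voucher only on a match, and stay dormant when iCloud Photos is off) can be sketched roughly as follows. This is an illustrative Swift sketch under stated assumptions, not NeuralHash or Apple's code: a SHA-256 digest stands in for the perceptual hash, and SafetyVoucher, the hash list, and the review threshold are hypothetical names and values.

```swift
import CryptoKit
import Foundation

// Hypothetical voucher produced only when an image's hash matches an entry
// in the known-CSAM hash list (the name and format are invented here).
struct SafetyVoucher {
    let imageID: UUID
    let matchedHash: String
}

final class LocalPhotoScanner {
    private let knownHashes: Set<String>    // stand-in for the NCMEC-derived database
    private let reviewThreshold: Int        // the real threshold is undisclosed
    private let iCloudPhotosEnabled: Bool   // scanning only applies with iCloud Photos on
    private var vouchers: [SafetyVoucher] = []

    init(knownHashes: Set<String>, reviewThreshold: Int, iCloudPhotosEnabled: Bool) {
        self.knownHashes = knownHashes
        self.reviewThreshold = reviewThreshold
        self.iCloudPhotosEnabled = iCloudPhotosEnabled
    }

    // Scan one image. Returns true once enough vouchers have accumulated to
    // warrant human review. If iCloud Photos is off, nothing is scanned at all,
    // mirroring the behavior described in the text above.
    func scan(imageData: Data, imageID: UUID) -> Bool {
        guard iCloudPhotosEnabled else { return false }

        // SHA-256 is only a placeholder for a perceptual hash such as NeuralHash.
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()

        if knownHashes.contains(hex) {
            vouchers.append(SafetyVoucher(imageID: imageID, matchedHash: hex))
        }
        return vouchers.count >= reviewThreshold
    }
}
```

The key design difference glossed over here is that a perceptual hash such as NeuralHash is built so that visually similar images yield matching values, whereas an exact digest like SHA-256 matches only byte-identical files; the exact digest is used only to keep the sketch self-contained and runnable.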
Even though Apple has disclosed this scanning function as openly and transparently as it can, it has still drawn dissatisfaction and concern from Internet privacy organizations. After all, this is the first time Apple has scanned and monitored users' local files. If this time it is to combat child pornography, what will it be next time: counter-terrorism?
Federighi, Apple's senior vice president, stressed again that Apple will not expand the scope of data scanning beyond illegal CSAM child pornography. He said, "If a government asks us to scan data other than CSAM content, we will refuse outright. And if you have no confidence in that and don't believe Apple would dare refuse, we have also built a multi-layer protection mechanism. We want to reassure users that you don't need to trust any company, or any country, when it comes to what files get scanned."
Apple has indeed publicly rejected the US government's requests for an anti-terrorism back door and for unlocking phones, and has not been afraid to fight administrative orders in court. But when faced with lawful subpoenas and orders from the judiciary, Apple can only accept and comply; those it is powerless to resist.
Just two months ago, Apple admitted that in February 2018, at the request of the Trump administration's Justice Department, it handed over a set of user identifiers to confirm the email addresses and telephone numbers of certain users. The Trump administration, then in the midst of the "Russiagate" investigation, was secretly probing who had leaked government secrets to Democratic members of Congress.
Apple cooperated with the Trump administration's investigation because it had received a federal grand jury subpoena that came with a gag order. That meant Apple not only had to obey the order and hand the data over to the government, but also had to keep the matter secret until the gag order expired in May this year. Besides Apple, Microsoft also received such a subpoena with confidentiality requirements.
According to the Electronic Frontier Foundation, the US government has never shied away from seeking access to users' encrypted communications, pressuring technology companies to help it obtain data through search warrants and court orders. Although Apple has always promised to resist government orders that would weaken user privacy protections, once this scanning mechanism launches, Apple will have to keep resisting government pressure to expand its scope.
Nor is this situation limited to the United States. In 2018, the "Five Eyes" alliance (the United States, Britain, Canada, New Zealand and Australia) made clear that if technology companies refused to provide encrypted data, they would obtain it through technical, law-enforcement and legislative means. Britain's Investigatory Powers Act (IPA) likewise requires telecom operators to provide assistance when the government issues data-collection orders by technical means.
Although Apple built this scanning mechanism into iOS and iPadOS for the just cause of combating child pornography and sexual exploitation, it has also planted a dangerous mine for itself. In 2016, when Apple refused the US Justice Department's demand to unlock the San Bernardino terrorist's iPhone, it publicly stated that it had built no back door of any kind into iOS. But what about this time? "Once you build it (the back door), they will come. Apple's move has opened the back door for countries around the world to strengthen surveillance and censorship," the Electronic Frontier Foundation concluded.
"What happens in the iPhone stays in the iPhone", Apple's iconic promise now needs to add an exemption clause-if it doesn't involve child pornography, will there be more exceptions in the future?