Su Kui: Personal data protection is the "fatal weakness" of this platform.
By Su Kui, Observer Network columnist

There is no doubt that Internet platforms are the protagonists of the information age. Super scale, monopoly and enormous power are their defining characteristics, and they can affect the lives, interests and even livelihoods of countless people. A platform has multiple identities: it is at once a traditional enterprise, an economic form (the platform economy, with the platform at its core), and even a virtual city-state. As Zuckerberg famously put it, "In a lot of ways Facebook is more like a government than a traditional company."

The platform is the absolute master of the platform economy. It can make, interpret and enforce rules at will, and those rules can affect hundreds of millions of people. They are in essence no different from national laws and regulations, yet they need not follow the due process required of national legislation, adjudication and law enforcement. Since the birth of the modern state, governments have never faced such an opponent; it is, one might say, a change unseen in three thousand years.

In the information age, how to govern platforms is a huge challenge everywhere in the world. Traditional tools such as antitrust, consumer protection and fair competition remain the main paths of platform governance worldwide. But these tools were all developed in the traditional economic era; in the face of the giants of the information age, they are often stretched thin, powerless and ineffective.

Personal data protection is a new tool developed in the information age. Although its original purpose is to protect personal information, the personal information of countless individuals aggregates into big data, and with big data an algorithm can grow from lifeless code into an unruly beast. In other words, personal information protection is precisely the algorithm's fatal weakness.

Several recent personal-information-protection cases involving platforms in Europe and America are thought-provoking. Personal data protection gives individuals greater power to counterbalance the platform within the power structure of the platform economy, and it may also be a new path of platform governance.

Labor service platforms

Internet platforms are not a homogeneous category; there are huge differences between types, from social and search platforms centered on information services to e-commerce platforms that generate massive information. Labor service platforms, represented by ride-hailing and food delivery, can be regarded as a variant of the latter: the platform is a virtual market through which services are sold. More importantly, these labor platforms have gathered millions or even tens of millions of working people in cities around the world. Compared with other platforms, they not only serve people but also hold the information of countless people. These platforms therefore have a closer relationship with people, which brings greater challenges, and farther-reaching influence, to social governance.

The people on these platforms rely on the information the platform provides in order to make a living; they generate a huge amount of information resources for the platform, and they are managed and controlled through that information. Compared with workers on other e-commerce platforms, the workers on such platforms have almost no autonomy, yet traditional labor protections have nothing to do with them, because the platform treats them as micro-entrepreneurs. Relying on information, the platform precisely manages a labor force of a size that exceeds the limit of any organization in human history. It is, one might say, a miracle, and the core of that miracle is the algorithm hidden behind the platform.

The ride-hailing platform is perhaps the most controversial Internet platform, the one that has so far brought the most problems to social governance. Uber, founded in 2009, pioneered the ride-hailing industry: it has about 100 million monthly active consumers, nearly 20 million daily orders worldwide (including food delivery), more than 4 million drivers on the platform, and an average of 50,000 new drivers joining every month. Its hard-line confrontation with regulators around the world has brought it countless troubles, and a corporate culture that despises rules and law has inflicted internal injuries on the enterprise itself; in 2017 founder Travis Kalanick was even driven out of the company. Beyond the well-known struggles with local regulators, the conflict between Uber and the drivers on its platform has grown increasingly acute, and discord with consumers flares up from time to time.

The consequences of leaking personal information are serious.

On January 20, 2021, before leaving the White House, Trump announced a long list of pardons covering as many as 73 people. Steve Bannon, his former chief strategist, appeared on the list without any suspense, but some names were unexpected. Anthony Levandowski, the former Uber executive in charge of its self-driving business, was lucky enough to escape the prison term scheduled to begin on February 7. Peter Thiel, one of Trump's few supporters in Silicon Valley, is said to have been behind it.

By contrast, another former Uber executive, chief security officer Joe Sullivan, could only lament his unfair fate. Yet it was entirely self-inflicted. On August 20, 2020, Sullivan was charged by the federal Department of Justice with obstruction of justice and concealing a felony, facing a sentence of up to eight years. It all began in 2016, with the cover-up of a breach of the personal information of about 57 million drivers and passengers.

Sullivan was no novice: he had been chief security officer of the Internet giant Facebook and had rich experience in information security and personal information protection. Yet after learning that hackers had breached Uber and obtained personal information held by the platform, he did not promptly report to the relevant government authorities as required by law. Instead, after consulting with then-CEO Travis Kalanick, he paid the hackers $100,000 in hush money and had the incident recast as the platform proactively inviting hackers to find security vulnerabilities: the hush money became a bug bounty.

A personal data breach of this scale (covering both drivers and passengers on the platform) naturally would not be overlooked by the relevant European and American regulators. Under EU rules, a data breach must be reported to the supervisory authority within 72 hours. Accordingly, the Dutch Data Protection Authority, in whose jurisdiction Uber's European headquarters sits, imposed a fine of 600,000 euros in 2018. The UK's Information Commissioner's Office (ICO) likewise imposed a fine of 385,000 pounds.

Compared with their European counterparts, American regulators came down harder. The California Attorney General and the San Francisco City Attorney, together with their counterparts across all 50 states and Washington, D.C., took legal action, and in September 2018 Uber settled for as much as $148 million. The view sometimes heard in China, that American legislation and supervision on personal information protection are comparatively lax, may not hold.

Supervision goes deep into the black box of enterprises

The Federal Trade Commission (FTC) is the enforcement agency for fair competition and consumer protection. After learning of the incident, the FTC quickly announced that the settlement agreement reached with Uber in August 2017 was void and that the settlement terms would have to be renegotiated.

In October 2018, the FTC announced that a new settlement agreement had been reached. Under the free-market-minded Trump administration, the FTC did show Uber a measure of mercy: unlike the state governments, it imposed no heavy penalties, indeed no fines at all. But it set out meticulous requirements for Uber's internal information-security management. In contrast to a results-oriented strategy aimed at external outcomes, the FTC agreement is a process-oriented strategy aimed at internal management. That is, in the spirit of advance prevention, it comprehensively supervises the potential personal-information-protection loopholes inside the enterprise and brings the enterprise's internal information-security management into the scope of public administration. This can be called a new model of personal information protection.

In fact, China's ongoing personal-information-protection legislation (the already-enacted Cybersecurity Law, together with the Data Security Law and Personal Information Protection Law being drafted as of 2020) adopts the same idea, although compared with the FTC settlement, China's platform enterprises enjoy more freedom. The FTC settlement established a relatively complete enterprise information-security system, whose main elements include:

Uber must immediately establish a comprehensive personal information (privacy) protection program; all plans, procedures and trainings must be documented in writing, and a dedicated officer must be appointed to take charge of the program.

Conduct personal information (privacy) risk investigation and assessment, identify risk points, and formulate rectification plans; establish a scheme for dynamically monitoring and assessing personal information (privacy) risks.

Establish a third-party information (privacy) audit and assessment system. Third-party auditors must hold professional qualifications and have more than three years of experience, and must be approved by the FTC's consumer protection division. Within 10 days of completing an assessment, the report must be sent to the FTC for its records. The first third-party audit must be completed within six months of the agreement taking effect, and audits must recur at least once every two years thereafter.

Establish a system for recording and reporting information-security incidents.

Establish a document sign-off and training system: all relevant company personnel must study the FTC settlement documents and sign to confirm they have done so, with the training records kept in writing.

One year after the order takes effect, Uber must submit a compliance report to the FTC, sworn to be true and accurate on pain of perjury. Changes to the company's organization or corporate entities must be notified to the FTC within 14 days, to facilitate inspection and supervision by the regulator.

Respond to the regulator promptly: Uber must reply within 10 days of receiving an FTC inquiry about information (privacy) security. Compliance reports and materials must be sworn on pain of perjury, and detailed records must be kept for future reference.

Establish information-security records, including records on relevant employees (including reasons for departure), complaints from platform users, all materials demonstrating the company's implementation of the order, the company's external publicity and commitments regarding personal information (privacy) protection measures, information-security assessment, audit and rectification reports, records of security-vulnerability bounty payments, and law-enforcement subpoenas, investigations and explanatory materials.

Screenshot of Forbes' online report on the case.

Beyond the huge settlement payment, the agreement the California Attorney General reached jointly with the states and Uber in fact imposed a similar series of information-security compliance requirements on the enterprise: a chief security officer; password protection for cloud storage and strengthened identity authentication; information-security staff training and a system of penalties for violations; a qualification system for third-party information auditors (requiring more than five years of experience); personal information security as a standing topic for the board of directors; lawyer participation in incident confirmation and reporting; and an internal whistleblowing system.

Confucius said: "To put people to death without first instructing them is called cruelty; to demand results without first giving warning is called tyranny." Taking Uber's violations as the occasion, federal and state regulators in the United States have built a very strict internal personal-information-protection mechanism, with particular emphasis on internal departmental supervision, external audit and employee training, pushing platform regulation from an after-the-fact, punishment-oriented model toward a before-the-fact, prevention-oriented model.

Platform enterprises are no longer black boxes. Through a well-designed internal supervision mechanism, the platform's protections become more transparent and the goal of personal information protection more attainable. The significance goes further: only when personal information protection is reliable do the other governance functions of the platform economy become possible.

The algorithm rules the platform

Uber, for its part, had reasons for acting against the four drivers who brought this case. According to Uber, all four had manipulated (improperly used) the driver app through fraud or illegitimate use. Put plainly, the platform's algorithm judged their use of the driver app to be a serious violation (Uber does not speak of terminating a contract, but permanently freezing an account amounts to the same thing), whether because a driver chose to wait for surge pricing to rise, or because he installed unauthorized apps that altered the phone's state (such as spoofing its location); and these decisions were all made by the platform's algorithm. In other words, all four drivers were found in violation by the algorithm and cleared out (fired) by the algorithm.

The drivers, however, saw it differently. They believed they were merely exercising the freedom to choose their own working hours, the very freedom the Uber platform has always advertised to drivers. It is precisely because of this freedom that drivers are regarded not as employees of the platform but as freelancers, even so-called "micro-entrepreneurs": the platform and the drivers are partners. The four drivers denied any fraud or misconduct, and Uber gave them no opportunity to be heard, leaving their fate in the hands of the algorithm. They therefore filed suit in the Amsterdam District Court in the Netherlands, where Uber's European headquarters is located.

The drivers' main legal basis for suing Uber is the EU General Data Protection Regulation (GDPR), which came into effect in May 2018. Under Article 15 of the GDPR, data subjects have the right of access to their personal data, the right to know the purposes of processing and the categories of data concerned, and the right to request correction of inaccurate data. Article 22 deals with automated decision-making and profiling: "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."

Under the GDPR's definition, "profiling" means any automated processing of personal data to evaluate certain personal aspects of a natural person, in particular to analyse or predict that person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. The so-called automated decision-making (profiling) is, plainly, the platform algorithm; this is in essence an algorithm-regulation clause. Article 22 also lists exceptions, however, including (a) where the decision is necessary for entering into, or performance of, a contract between the data subject and the data controller; ... (c) where the decision is based on the data subject's explicit consent.

The dispute in this lawsuit obviously centers on the interpretation of Article 22. What is the standard for a decision "based solely on automated processing"? How much human intervention suffices to take a decision outside that category? And given that the platform must make tens of millions, even hundreds of millions, of decisions about drivers every day (dispatching, pricing, rating and so on), can automated decision-making be regarded as necessary for performing the platform's contract with drivers?
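The Article 22 questions above can be made concrete in code. What follows is a minimal, purely illustrative Python sketch, not Uber's actual system; every name in it is hypothetical. It shows the compliance pattern at stake: an algorithmic flag with legal or similarly significant effect cannot become a final decision on its own unless an Article 22(2) exception (contractual necessity or explicit consent) applies; otherwise it must pass through meaningful human review.

```python
from dataclasses import dataclass

@dataclass
class AlgorithmicFlag:
    """A hypothetical flag raised by a platform's detection algorithm."""
    driver_id: str
    reason: str                # e.g. "suspected GPS spoofing"
    significant_effect: bool   # an account freeze has legal/similar effect

def requires_human_review(flag: AlgorithmicFlag,
                          explicit_consent: bool,
                          necessary_for_contract: bool) -> bool:
    # Decisions without legal or similarly significant effect fall
    # outside Article 22 entirely.
    if not flag.significant_effect:
        return False
    # Article 22(2) exceptions: contractual necessity or explicit consent
    # permit a solely automated decision.
    if explicit_consent or necessary_for_contract:
        return False
    # Otherwise the automated flag alone cannot be final: a human must
    # meaningfully review it before it takes effect.
    return True

flag = AlgorithmicFlag("driver-42", "suspected GPS spoofing", True)
print(requires_human_review(flag, explicit_consent=False,
                            necessary_for_contract=False))  # True
```

Much of the litigation turns on exactly the inputs this sketch takes for granted: whether a manager's sign-off counts as "meaningful" review, and whether mass dispatching and pricing decisions qualify as contractually necessary.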

Rebalancing the platform's power structure

The drivers want to exercise data portability through a third-party non-profit organization, Worker Info Exchange (WIE): Uber would transfer the personal data generated by drivers at work directly to WIE, a third-party data intermediary acting as a personal data trust. WIE could then help drivers analyze their own behavior; monitor whether the platform algorithm's actual logic matches its declared logic, and whether it is fair and reasonable; assess whether the platform correctly calculates service prices against drivers' real volume and quality of work; and analyze why different drivers' scores differ.

By holding, and being able to process, these key data, drivers can reduce or eliminate the platform's information advantage; they may then negotiate with the platform on a more equal footing and become a more evenly matched counterparty.

Under the law, Uber has an obligation to provide the relevant data to the data subject (the driver) within 30 days. But Uber did not provide all of the data the drivers requested; the key items were in effect refused. The data it did provide included order data (pick-up and drop-off times, passenger payments and so on) but not drivers' log-on and log-off times or complete GPS location records. These bear directly on the determination of working time (the British Court of Appeal has ruled that a driver's working time runs from the moment he logs on, not from the moment he accepts an order), and the platform does not want drivers to hold data that may be used against it.
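The stakes of that working-time point are easy to see in code. Here is a minimal sketch over hypothetical exported records (the field names are illustrative, not Uber's actual export schema): the same day's data yields very different totals depending on whether working time is counted from log-on, as the UK courts held, or only across accepted trips.

```python
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M"

# Hypothetical records a driver might receive via a data-access request.
sessions = [  # logged-on periods
    {"online": "2021-02-01T08:00", "offline": "2021-02-01T14:00"},
]
trips = [     # accepted and completed trips within that session
    {"accepted": "2021-02-01T08:40", "completed": "2021-02-01T09:10"},
    {"accepted": "2021-02-01T10:00", "completed": "2021-02-01T11:30"},
]

def hours(start: str, end: str) -> float:
    """Elapsed hours between two timestamps."""
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return delta.total_seconds() / 3600

# Working time as the UK courts measured it: the whole logged-on period.
online_hours = sum(hours(s["online"], s["offline"]) for s in sessions)

# Working time under a narrower, trip-only reading.
trip_hours = sum(hours(t["accepted"], t["completed"]) for t in trips)

print(online_hours, trip_hours)  # 6.0 2.0
```

The 4-hour gap between the two figures is exactly the waiting time in dispute, which is why the log-on and log-off records the platform withheld matter so much to the drivers.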

Uber, for its part, has its own explanation: the data not provided either does not exist, or could not be handed to the drivers without harming the privacy rights of others.

If the data above is at least something a driver could record for himself, how the algorithm that manages drivers actually works is something only the platform can know. What the drivers most want to know is the secret of the platform algorithm: how it profiles a driver from the driver's behavior and passenger-rating data, and how that profile affects the driver's interests. For example, what is the relationship between a driver's rating and the dispatch mechanism? How does a driver's score trigger the platform's removal mechanism (which drivers regard as dismissal)?

In the complaint, the drivers specifically pointed out that Uber did not respond to their request for the tag data attached to drivers. This is truly the core of the whole controversy. Uber does not consider these tags to be personal data the drivers are entitled to see; indeed the two sides differ hugely over what counts as personal information at all. Moreover, this material would almost certainly be used in the employment-status case between drivers and the platform then before the UK Supreme Court, a case Uber could not afford to lose.

The plaintiff driver Peleg also hinted at what the material would be used for: "We want to take a look at this Orwellian work world" (meaning a strictly ruled, dehumanized society) "where the workers are ruled by machines and have no power."

In fact, the drivers also have a legal basis for demanding the secret of the algorithm. Under Article 15 of the GDPR, the data subject has the right to obtain from the controller meaningful information about the existence of automated decision-making, including profiling, about the logic involved, and about the envisaged consequences of such processing for the data subject.

Although Uber denied that the drivers' removal was done by an algorithm (an Uber spokesperson claimed that managers made the removal decisions after the algorithm detected driver violations), it is hard to square this with material on the company's own website introducing its fraud-detection algorithm "Mastermind", which states plainly that the system is responsible for monitoring and policing driver fraud. And the privacy policy of the driver app says in black and white that Uber uses automated decision-making to deactivate drivers suspected of violations.

The platform algorithm determines a driver's job opportunities, workload and income, and can even cost him the platform's work forever; there is little disagreement about its importance to drivers. But the logic of automated data processing, the secret of the algorithm, is also the platform's trade secret. How far the right of access under Article 15 requires disclosure, and how to strike a balance between personal data protection and the enterprise's intellectual property (trade secrets), will test the judges' wisdom.

This is arguably the most important case since the GDPR came into force, and its significance reaches far beyond the four drivers involved (who are in fact organizing more ride-hailing drivers and delivery riders to join). Its adjudication standard will amount, in effect, to another piece of EU personal-information legislation. It is almost certain that the case will ultimately reach the EU's highest court, the Court of Justice of the European Union.

The canary of the platform economy

"For everyone, this is a battlefield related to the future," said driver Peleg. The odd jobs on Uber platform are canaries in underground mines in history. This is a battle that can't be lost.

Data is power, and the platform economy further magnifies the traditional information advantage that enterprises hold as employers. Realizing the right to data portability through third-party data trusts, and empowering drivers with their own data: who has more motivation to supervise the platform than the ride-hailing drivers and delivery riders themselves? This is a brand-new approach to platform regulation, governing big data with information rights.