Hu Ling: Face-Scanning – Identity System, Personal Information and Legal Regulation | Frontiers
Face recognition technology is being applied in more and more scenarios: entering railway stations, residential communities, and campuses increasingly requires “scanning your face”. These scenarios involve not only public service venues and facilities but also private service activities, and the scope of application is wide. However, the legal, ethical, and cultural basis for using this technology has so far not been fully discussed in China’s local context. In the article “Face-Scanning: Identity System, Personal Information and Legal Regulation”, Associate Professor Hu Ling of Shanghai University of Finance and Economics analyzes in depth the specific scenarios in which face-scanning is embedded and applied, examines the identity authentication function of face-scanning, points out that face-scanning shifts the exercise of authentication power from a distributed to a centralized process, and offers the dichotomy of grid and network as ideal types of governance.
I. Face-Scanning as a Legal System of Identity
(1) The face is a universal identifier
From a social point of view, the face has always been a universal identifier through which people recognize and verify one another.
In the traditional acquaintance society, people did not recognize members of their own kind by sound and smell, as other animals do, but by faces. In modern society, the face’s authentication role among members of stable, small-scale communities has been replaced by the large-scale authentication functions provided by the state and by work units. As a result, face-scanning has gained new social value, its technical dimension has become increasingly prominent, and it reflects the construction of a new digital infrastructure. Setting aside the shell of technical applications, face-scanning in the digital age continues to display strong attributes of an identity-based legal system. The use of face-scanning technology must first serve the goal of that system, namely the social and legal governance of mobility.
(2) How face-scanning is embedded in national governance: from segmentation to interconnection
Comparing face-scanning and its features in the network and in the grid further highlights the governance significance of this technology-legal system across different scenarios:
The ubiquitous cameras of the network society represent the “eyes of power” and may feel oppressive to social subjects, but the problem is not the technology itself. Rather, large-scale mobility makes it difficult for a stable social order and stable social relations to form in the short term, so the various mechanisms and unwritten norms of the traditional acquaintance society can no longer function to reduce risk. Face-scanning is therefore nothing but a technological-legal institutional form embedded in the process of contemporary social governance: it exists to cope with mobility and is part of overall security and trust, not merely a neutral technology.
II. Face Recognition and the Centralization of Authentication Power
(1) When authentication power meets face-scanning
The authentication system is one of the core legal systems of the state. It enables effective governance by creating a unique, authoritative identity for each citizen, issuing identity documents, and using those documents for statistics and administration in population, taxation, and public services. The logical process of face authentication involves two types of face data: face photos collected in advance on the basis of a prior legal relationship, and face data files generated in real time. As a form of authentication through information systems, face recognition upgrades the authentication infrastructure and can, to some extent, reduce the risk of leakage that traditional ID cards face during the authentication process.
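The two-file matching logic described above can be illustrated with a minimal sketch. The embedding function, threshold, and variable names below are hypothetical assumptions standing in for whatever model and policy a real authentication system would use; the point is only that authentication compares a stored reference with a live capture, so the two data types carry different risk profiles.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # assumed cutoff; real systems tune this per scenario


def embed(face_image: np.ndarray) -> np.ndarray:
    """Placeholder for a face-embedding model: returns a unit-length feature vector."""
    vec = face_image.astype(np.float64).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)


def authenticate(enrolled_photo: np.ndarray, live_capture: np.ndarray) -> bool:
    """Compare the photo collected in advance (first data type) with the capture
    generated in real time (second data type); both arrays must share one shape here."""
    similarity = float(embed(enrolled_photo) @ embed(live_capture))
    return similarity >= SIMILARITY_THRESHOLD
```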
(2) Two paths of face authentication
There is a clear distinction between the paths of decentralized authentication (the grid) and centralized authentication (the network). In the former, any physical space or device may authenticate a user’s identity by face-scanning, so different face data files may be generated and scattered across the servers of different authenticating parties, making supervision costly. In the latter, specific apps relying on a platform centrally use the face recognition functions provided by mobile phone hardware or by the platform. Acting as an authentication agent, the platform can both reduce the cost of supervision and lower the compliance cost of small and medium-sized apps. As these two approaches to authentication are embedded in practice, face-scanning technology promotes centralized authentication in disguise: it has objectively contributed to a reduction in the number of authenticating parties in society and to the concentration of authentication power.
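The contrast between the two paths can be sketched as follows. The class names and in-memory stores are purely illustrative assumptions, not any real platform’s API; they only show where the face data files end up in each path.

```python
from typing import Dict


class GridVerifier:
    """Decentralized path: each venue or device keeps its own face templates,
    so data files are scattered across many authenticating parties."""

    def __init__(self) -> None:
        self.templates: Dict[str, bytes] = {}  # user_id -> locally stored template

    def enroll(self, user_id: str, template: bytes) -> None:
        self.templates[user_id] = template

    def verify(self, user_id: str, capture: bytes) -> bool:
        return self.templates.get(user_id) == capture  # placeholder comparison


class PlatformAuthAgent:
    """Centralized path: one platform verifies on behalf of many apps and
    returns only a yes/no assertion, so the relying apps never hold face data."""

    def __init__(self) -> None:
        self.templates: Dict[str, bytes] = {}

    def enroll(self, user_id: str, template: bytes) -> None:
        self.templates[user_id] = template

    def assert_identity(self, app_id: str, user_id: str, capture: bytes) -> bool:
        # app_id identifies the relying app; it receives the result, not the data
        return self.templates.get(user_id) == capture
```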
III. Behavior Recognition and the Legal Nature of Facial Data
(1) Public authorities and face recognition
Face-scanning for public safety purposes goes beyond the private-law principle of “informed consent” for the use of personal information, but notice and disclosure are still formally required. What is involved here is mainly the collection and processing of information about citizens’ behavior in public places, which has entered the category of public data.
Face recognition by public agencies changes the relationship between public and private power that existed under traditional technical devices. First, it further extends the surveillance logic of the “panopticon” described since Bentham and Foucault and increases deterrence against potential criminal behavior. Second, facial recognition can be applied indiscriminately to both criminal offenses and ordinary violations. Third, the database access and analysis behind face-scanning undoubtedly expand the law-enforcement and evidence-collection capabilities of public agencies (especially public security organs), which calls for more procedural notification, disclosure, and public communication and participation. Finally, even in non-private spaces there is a degree of reasonable expectation of privacy: people can accept surveillance aimed at unspecified persons in public places, but not video recording directed at themselves or at private spaces. It is therefore necessary to regulate the location and number of public-safety cameras and to make them visible in public spaces so as to enhance disclosure and deterrence.
(2) Private services and face recognition
Private service providers generate, obtain, and use facial data files on the basis of standard-form user agreements, and may claim that because the files were created through the company’s technical labor, they are entitled to use the data exclusively as means of production. As long as the use of such data can accurately identify and be traced back to a specific subject and track the behavior of the same person, the use of face data files should be treated as the use of personal information. Once private service providers help users open new accounts by face-scanning, the behavioral data subsequently generated will shape and constrain users’ future activities, feeding back into the production process of the digital economy. It is the accumulation of activity data through face-scanning that makes identity recognition more valuable in online scenarios.
IV. Personal Information, Face-Scanning Risks, and Legal Responses
(1) Risks of face-scanning and their general regulation
In the data collection stage, as long as the formal requirements of informed consent are satisfied and the purpose and use of collection are stated, it is in fact difficult to completely prevent the collection of faces or photos. In the data storage stage, commentators generally believe that face data may be leaked and that, given the uniqueness of the face, a leak would cause irreparable losses. This assumption is problematic: first, the actual level of leakage risk is unclear; second, the different risks of face photos collected in advance and of 3D data files generated in real time need to be distinguished; third, at least for now, the cost of forging face models makes large-scale misuse nearly impossible. The fear of face data theft therefore stems more from people’s psychological sensitivity, amplified by mass media publicity; in fact, the probability of face theft is extremely low.
Except in extreme cases where face data files are forged and used fraudulently, the real risk of face-scanning lies mainly in the failure of substantive informed consent. The following institutional measures can be adopted for the collection and storage of face data: first, in areas where information security must be ensured, multi-factor login authentication should be mandatory; second, following the principle of classification and grading, authentication methods should be systematically planned for different scenarios to prevent the mixing of basic and secondary identities; third, the retention period of face data files generated in real time can be limited to reduce the probability of leakage; finally, professional certification of face recognition software and hardware should be strengthened, with hierarchical management similar to the cybersecurity classified protection system.
In the data analysis stage, current forms of digital discrimination show that a face-scanning service that appears de-identified on the surface may, under the ideology of technical equality, conceal continuing identity-based discrimination inside the black box. Face recognition merely provides service providers or authenticating parties with broader, lower-cost login services, but it objectively replaces a diversified account system with a unified one.
(2) Methodological reflection on scenario-based regulation
It is neither possible nor necessary for the regulation of face-scanning technology to override existing behavioral goals and norms. The first step is to recognize the significance of digital identity in social transformation and to discuss it within the network/grid dichotomy. The second step is to use authentication/identification as a criterion for demarcating more specific scenarios, identify the legal relationship on which consensus has been reached, and then distinguish the different institutional constraints on public authorities and private service providers. The final step is to discuss in further detail the specific rules of personal information protection and algorithm supervision while strengthening individual rights, evaluate risks through cost-benefit analysis, and formulate solutions and policy recommendations. Some of the specific issues raised by face-scanning need to be traced back to the basic legal relationship and principled consensus of each scenario.
First, education and affective computing. In educational settings such as school classrooms, using cameras and artificial intelligence to record and analyze students’ performance undermines students’ capacity for self-directed learning, and legal restrictions should be imposed on such activities in schools. Second, employer surveillance. On open flexible-employment platforms, the platform owner strengthens control over platform workers through face-scanning; this behavior can be interpreted under the judicial standard of personal/economic subordination, so as to secure rights and interests for digital workers. Third, social norms. Unlawful behavior (such as running a red light) has often been restrained through the public display of portraits and embedded social credit. We should not blindly adopt face-scanning technology and expand the space in which power constrains individual behavior merely because the implementation cost is low; we should instead recognize the limits of individuals’ cognitive capacity and continue to leave part of the underlying space of social action to the social norms individuals are familiar with. Otherwise, ordinary people will keep violating large numbers of norms they cannot recognize, and deterrence will not work.
V. Conclusion
First, face-scanning is a legal system of identity: it performs an identity authentication function and continues the social role of the face as a universal identifier, which means that authentication power is greatly enhanced. Second, face-scanning shifts the exercise of authentication power from distributed to centralized and separates authentication information from its carrier; the law still needs to distinguish the different constraints on face-scanning surveillance carried out for public safety and for private purposes. Third, although there is no need to panic, attention must still be paid to the possible risks of face-scanning, especially the automated integration of personal information behind it and the possible forms of use, discrimination, and alienation it may enable. Finally, the dichotomy of grid and network as ideal types of governance provides a useful perspective for understanding face-scanning and is also a specific application and extension of scenario theory.