CN112313661A - Method for verifying user identity and electronic equipment - Google Patents

Method for verifying user identity and electronic equipment

Info

Publication number
CN112313661A
Authority
CN
China
Prior art keywords
user
face information
identity
information
pieces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880094835.4A
Other languages
Chinese (zh)
Inventor
彭敏
李昌婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN112313661A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Collating Specific Patterns (AREA)

Abstract

Embodiments of this application disclose a method for verifying user identity and an electronic device, relating to the field of terminals. The method can verify the identities of multiple users whose face information is collected at the same time during identity verification, providing proof that multiple users are present and thereby reducing risks to user privacy and property. The method includes: the electronic device runs a target application and enables a multi-user joint identity authentication function provided by the target application; the electronic device displays a first preview interface captured by a camera and prompts, in the first preview interface, for the face information of a plurality of preset users; the electronic device acquires N pieces of face information to be verified from the first preview interface, where N is an integer greater than 1; and if each of the N pieces of face information to be verified is the face information of a preset user, the electronic device is allowed to continue running the target application.

Description

Method for verifying user identity and electronic equipment

Technical Field
The present application relates to the field of terminals, and in particular, to a method for verifying a user identity and an electronic device.
Background
At present, authenticating a user by collecting the user's face information is widely used in various electronic devices. For example, a mobile phone in the locked-screen state can acquire the user's face information through its camera. If the acquired face information matches the prestored face information of an authorized user, the user passes identity verification and the mobile phone can automatically unlock the screen.
In some cases, however, multiple users need to be authenticated in the same scenario. For example, signing a contract online, borrowing or lending, handling certificates (e.g., a marriage certificate), or opening a joint account requires multiple users to confirm the same information in person. If the face information of each user is collected one by one for authentication, then even if every user passes, it is difficult to prove that the users confirmed the information at the same time and in the same place. For example, after user A passes authentication with his or her own face information, a video of user B could be used to forge user B's face information and pass authentication on user B's behalf. In this way, user A could enter into a contract with user B even though user B is not present. Clearly, such an authentication method poses a great potential safety hazard to user privacy and property.
Disclosure of Invention
This application provides a method for verifying user identity and an electronic device, which can verify the identities of multiple users whose face information is collected at the same time during identity verification and provide proof that multiple users are present, thereby reducing risks to user privacy and property.
To achieve the above objective, the following technical solutions are provided:
In a first aspect, this application provides a method for verifying user identity, including: the electronic device runs a target application; when a multi-user joint identity authentication function provided by the target application is enabled, the electronic device displays a first preview interface captured by a camera and prompts, in the first preview interface, for the face information of a plurality of preset users; the electronic device acquires N (N is an integer greater than 1) pieces of face information to be verified from the first preview interface, that is, the plurality of preset users must present their faces in the first preview interface at the same time for identity verification. If each of the N pieces of face information to be verified is the face information of a preset user, which indicates that the preset users completed identity authentication at the same time and place, the electronic device may continue running the target application; for example, it may open the target application or run the next function after authentication succeeds. If the N pieces of face information to be verified include face information of a non-preset user, the multi-user joint identity verification fails and the electronic device may stop running the target application. This ensures the accuracy and security of multi-user identity authentication and thereby reduces risks to user privacy and property.
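For illustration only, the following Kotlin sketch shows one way the on-device flow of the first aspect could be implemented. All types (FaceTemplate, FaceRecognizer, PreviewFrame, JointFaceVerifier) are hypothetical placeholders introduced for this example; they are not part of the disclosed embodiments or of any real Android API.

```kotlin
// Minimal sketch of the first-aspect flow. FaceTemplate, FaceRecognizer and
// PreviewFrame are hypothetical placeholders, not a real Android API.
class FaceTemplate(val features: FloatArray)

class PreviewFrame            // stands in for one camera preview image

interface FaceRecognizer {
    fun extractFaces(frame: PreviewFrame): List<FaceTemplate>   // every face in the frame
    fun matches(a: FaceTemplate, b: FaceTemplate): Boolean      // same person?
}

class JointFaceVerifier(
    private val recognizer: FaceRecognizer,
    private val presetUsers: List<FaceTemplate>                 // enrolled for this target app
) {
    /** True only if every face in the frame belongs to a preset user
     *  and every preset user appears exactly once. */
    fun verify(frame: PreviewFrame): Boolean {
        val candidates = recognizer.extractFaces(frame)
        if (candidates.size != presetUsers.size) return false   // all N users must be present together
        val unmatched = presetUsers.toMutableList()
        for (face in candidates) {
            val hit = unmatched.indexOfFirst { recognizer.matches(face, it) }
            if (hit < 0) return false                           // a non-preset user was captured
            unmatched.removeAt(hit)                             // count each preset user once
        }
        return true                                             // allow the target app to continue
    }
}
```

A caller would feed the frame acquired from the first preview interface to verify() and let the target application continue only when it returns true; otherwise the application would be stopped.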
For example, the target application may be a game application. Performing multi-user joint authentication when the game is opened ensures that the user uses the game application with the authorization of other users, improving supervision of the game application's use. Alternatively, the target application may be a payment application. Performing multi-user joint identity authentication during a property transaction ensures that all preset users know of and authorize the transaction, improving transaction security and reducing the probability of property loss.
In one possible design method, before the electronic device enables the multi-user joint authentication function provided by the target application, the method further includes: the electronic device prompts the user to set the number of users to be authenticated during multi-user joint identity authentication; in response to the number N entered by the user, the electronic device displays a second preview interface captured by the camera and acquires the face information of N users from the second preview interface; and the electronic device takes the N users as the N preset users and establishes a correspondence between the target application and the face information of the N preset users.
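A minimal sketch of this enrollment step, reusing the hypothetical FaceRecognizer, FaceTemplate, and PreviewFrame types from the previous example (again, assumptions for illustration, not the actual implementation):

```kotlin
// Illustrative enrollment step: capture the N faces shown in the second
// preview interface and associate them, in order, with the target application.
class EnrollmentStore {
    private val presetsByApp = mutableMapOf<String, List<FaceTemplate>>()

    fun enroll(appId: String, recognizer: FaceRecognizer,
               secondPreview: PreviewFrame, n: Int): Boolean {
        val faces = recognizer.extractFaces(secondPreview)
        if (faces.size != n) return false        // all N users must appear in the frame
        presetsByApp[appId] = faces              // order in the preview is preserved
        return true
    }

    fun presetUsersFor(appId: String): List<FaceTemplate>? = presetsByApp[appId]
}
```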
In a possible design method, after the electronic device enables the multi-user joint authentication function provided by the target application, the method further includes: the electronic device acquires the face information of the N preset users corresponding to the target application. After the electronic device acquires the N pieces of face information to be verified from the first preview interface, the method further includes: the electronic device determines, by comparing the N pieces of face information to be verified with the face information of the N preset users, that each piece of face information to be verified is the face information of a preset user. Subsequently, every time multi-user joint identity authentication is performed in the target application, the N preset users must participate together to complete it, thereby providing proof that multiple users are present when the target application is used.
In a possible design method, after the electronic device acquires the face information of the N preset users from the second preview interface, the method further includes: the electronic device records the order of the face information of the N preset users in the second preview interface. After the electronic device acquires the N pieces of face information to be verified from the first preview interface, the method further includes: the electronic device determines that the order of the N pieces of face information to be verified in the first preview interface is the same as the order of the face information of the N preset users in the second preview interface. The multi-user joint identity verification succeeds only when the two orders are the same, which improves the accuracy of the verification.
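The order check described above could look roughly like this, again using the hypothetical helper types; how the order is determined (for example, left to right in the preview) is itself an assumption and is left to the caller.

```kotlin
// Order check: the order of faces in the first preview interface must match
// the order recorded at enrollment in the second preview interface.
// Assumes the caller has already sorted the faces (e.g. left to right).
fun verifyWithOrder(
    recognizer: FaceRecognizer,
    enrolledInOrder: List<FaceTemplate>,     // order recorded during enrollment
    capturedInOrder: List<FaceTemplate>      // order observed during verification
): Boolean {
    if (capturedInOrder.size != enrolledInOrder.size) return false
    return capturedInOrder.zip(enrolledInOrder)
        .all { (seen, enrolled) -> recognizer.matches(seen, enrolled) }
}
```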
In a possible design method, the electronic device enabling the multi-user joint authentication function provided by the target application includes: the electronic device automatically enables the multi-user joint identity authentication function when running the target application; or the electronic device enables the multi-user joint identity authentication function in response to user input while running the target application.
In one possible design method, the electronic device enabling the multi-user joint authentication function provided by the target application includes: the electronic device enables the multi-user joint identity authentication function when starting the target application; or the electronic device enables the multi-user joint identity authentication function when running a first function provided by the target application.
In a possible design method, if each of the N pieces of face information to be verified is the face information of a preset user, allowing the electronic device to continue running the target application includes: allowing the electronic device to continue running the target application for a preset duration. After the preset duration expires, the target application may require the users to perform multi-user joint identity authentication again, which prevents excessive use of the target application over long periods.
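A tiny illustrative sketch of this time-limited authorization, assuming a millisecond-based preset duration (the unit and the exact policy are assumptions):

```kotlin
// Time-limited authorization: after a successful joint verification the app
// may run only for presetDurationMillis before it must ask for another one.
class JointAuthSession(private val presetDurationMillis: Long) {
    private var grantedAt = Long.MIN_VALUE

    fun grant(now: Long = System.currentTimeMillis()) { grantedAt = now }

    fun isStillValid(now: Long = System.currentTimeMillis()): Boolean =
        grantedAt != Long.MIN_VALUE &&
            now - grantedAt <= presetDurationMillis   // expired => verify again
}
```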
In a second aspect, this application provides a method for verifying user identity, including: the electronic device detects that a user opens a first function of a target application, where the first function requires multi-user joint identity authentication of N users (N is an integer greater than 1); the electronic device displays a first input interface and prompts, in the first input interface, for the identity information of the N users; the electronic device acquires N pieces of identity information from the first input interface; the electronic device displays a first preview interface captured by the camera and prompts the N users to enter their face information in the first preview interface; the electronic device acquires N pieces of face information from the first preview interface, that is, the face information of multiple users is collected simultaneously in the first preview interface for identity verification; the electronic device then sends a verification request containing the N pieces of identity information and the N pieces of face information to a server, so that the server performs multi-user joint identity authentication on the N users; and if a message indicating that the multi-user joint identity authentication succeeded is received from the server, which indicates that the first function has been authorized by the multiple users, the electronic device may execute the first function.
If a message indicating that the multi-user joint identity authentication failed is received from the server, the electronic device may refuse to execute the first function. This ensures the accuracy and security of multi-user identity authentication and thereby reduces risks to user privacy and property.
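As a hedged illustration of the client side of the second aspect, the sketch below bundles the N pieces of identity information and face information into a single request and executes the first function only on a success response. AuthApi, IdentityInfo, and VerificationRequest are assumed names, not a documented service interface.

```kotlin
// Client-side sketch of the second aspect. AuthApi, IdentityInfo and
// VerificationRequest are assumed names, not a documented service.
data class IdentityInfo(val name: String, val idNumber: String)

data class VerificationRequest(
    val identities: List<IdentityInfo>,   // N pieces of identity information
    val faceImages: List<ByteArray>       // N pieces of face information from one preview
)

interface AuthApi {
    suspend fun verifyJointIdentity(request: VerificationRequest): Boolean
}

suspend fun runFirstFunction(
    api: AuthApi,
    request: VerificationRequest,
    firstFunction: suspend () -> Unit
) {
    require(request.identities.size > 1 &&
            request.identities.size == request.faceImages.size)
    if (api.verifyJointIdentity(request)) {
        firstFunction()   // all N users verified together: execute the first function
    } else {
        // joint authentication failed: refuse to execute the first function
    }
}
```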
For example, the target application may be an online contracting application that performs multi-user joint identity verification on all parties to a contract, providing proof that the parties are present and avoiding subsequent contract disputes. For another example, the target application may be an application for handling certificates or joint-account business online; multi-user joint authentication of all parties proves that they are present and authorize the business, preventing another user from impersonating a legitimate user who is not present in order to handle the business. For another example, the target application may be an application that handles property shared by multiple users (e.g., property shared by a couple) online; multi-user joint authentication of all owners of the shared property ensures that they are present and know the transaction contents, reducing risks to user property.
In one possible design method, before the electronic device sends the verification request to the server, the method further includes: the electronic device displays the correspondence between the N pieces of identity information and the N pieces of face information, and prompts the user to confirm that the first function will be executed using the N pieces of identity information and the N pieces of face information. The electronic device sending the verification request to the server includes: if the user's confirmation of the N pieces of identity information and the N pieces of face information is detected, the electronic device sends the verification request to the server.
In one possible design method, before the electronic device detects that the user opens the first function of the target application, the method further includes: the electronic equipment acquires identity information input by a user in a second input interface; the electronic equipment acquires face information input by a user in a second preview interface; the electronic equipment sends a registration request to the server, wherein the registration request comprises the identity information and the face information of the user, so that the server establishes the corresponding relationship between the identity information and the face information of the user, namely, the registration process of the multi-user joint identity authentication function is completed.
In a third aspect, this application provides a method for verifying user identity, including: a server receives a verification request sent by an electronic device, where the verification request includes N pieces of identity information and N pieces of face information to be verified; the server acquires N pieces of registered face information corresponding to the N pieces of identity information, the server storing a correspondence between each registered user's identity information and face information; the server determines that the N pieces of registered face information correspond one to one with the N pieces of face information to be verified; and the server sends a message indicating that the multi-user joint identity verification succeeded to the electronic device.
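A rough server-side sketch of this third-aspect check, assuming the i-th piece of face information belongs to the i-th piece of identity information and that faces are compared through an assumed FaceMatcher abstraction:

```kotlin
// Server-side sketch of the third aspect: look up the registered face for each
// submitted identity and require a one-to-one match with the faces to verify.
interface FaceMatcher {
    fun matches(registered: ByteArray, submitted: ByteArray): Boolean
}

class JointAuthServer(
    private val registeredFaces: Map<String, ByteArray>,  // id number -> registered face
    private val matcher: FaceMatcher
) {
    fun verify(idNumbers: List<String>, facesToVerify: List<ByteArray>): Boolean {
        if (idNumbers.size != facesToVerify.size) return false
        return idNumbers.zip(facesToVerify).all { (id, face) ->
            val registered = registeredFaces[id] ?: return false   // identity not registered
            matcher.matches(registered, face)
        }
    }
}
```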
In one possible design method, before the server receives the verification request sent by the electronic device, the method further includes: the method comprises the steps that a server receives a registration request sent by electronic equipment, wherein the registration request comprises identity information and face information of a user; the server establishes a corresponding relationship between the identity information and the face information of the user.
In a possible design method, before the server establishes the correspondence between the user's identity information and face information, the method further includes: the server verifies in a preset database that the user's identity information is correct, which prevents an illegitimate user from registering for the multi-user joint identity authentication function.
In a possible design method, the user's identity information includes the user's portrait (avatar) information. After the server receives the registration request sent by the electronic device, the method further includes: the server determines that the portrait information in the identity information matches the face information, which ensures the authenticity and validity of the face information and identity information uploaded by the user.
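The two registration-time checks described above (identity lookup in a preset database and portrait-to-face comparison) might be combined as in this illustrative sketch, which reuses the assumed FaceMatcher abstraction; the data model is an assumption for the example.

```kotlin
// Registration-time checks: validate the identity against a preset database
// and compare the portrait carried in the identity information with the
// submitted face before storing the identity-to-face correspondence.
class RegistrationService(
    private val officialRecords: Set<String>,          // preset database of valid ID numbers
    private val matcher: FaceMatcher,                   // assumed matcher from the sketch above
    private val store: MutableMap<String, ByteArray>    // id number -> registered face
) {
    fun register(idNumber: String, idPortrait: ByteArray, face: ByteArray): Boolean {
        if (idNumber !in officialRecords) return false        // reject incorrect identity information
        if (!matcher.matches(idPortrait, face)) return false  // portrait must match the live face
        store[idNumber] = face                                // establish the correspondence
        return true
    }
}
```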
In a fourth aspect, this application provides an electronic device, including: a processing unit configured to run a target application and enable a multi-user joint identity authentication function provided by the target application; a display unit configured to display a first preview interface captured by a camera and prompt, in the first preview interface, for the face information of a plurality of preset users; and an acquiring unit configured to acquire N pieces of face information to be verified from the first preview interface, where N is an integer greater than 1. The processing unit is further configured to allow the electronic device to continue running the target application if each of the N pieces of face information to be verified is the face information of a preset user.
In a possible design method, the display unit is further configured to prompt the user to set the number of users to be authenticated during multi-user joint identity authentication, and to display, in response to the number N entered by the user, a second preview interface captured by the camera, from which the face information of N users is acquired; and the processing unit is further configured to take the N users as the N preset users and establish a correspondence between the target application and the face information of the N preset users.
In a possible design method, the obtaining unit is further configured to obtain face information of N preset users corresponding to the target application; and the processing unit is further used for determining that each piece of face information to be verified is the face information of the preset user by comparing the N pieces of face information to be verified with the face information of the N preset users.
In a possible design method, the processing unit is further configured to record, in the memory, the order of the face information of the N preset users in the second preview interface, and to determine that the order of the N pieces of face information to be verified in the first preview interface is the same as the order of the face information of the N preset users in the second preview interface.
In a possible design method, the processing unit is specifically configured to automatically enable the multi-user joint identity authentication function when the target application is running, or to enable the multi-user joint identity authentication function in response to user input while the target application is running.
In a possible design method, the processing unit is specifically configured to enable the multi-user joint identity authentication function when the target application is started, or to enable it when a first function provided by the target application is run.
In a possible design method, if each piece of face information to be verified in the N pieces of face information to be verified is face information of a preset user, the processing unit is specifically configured to allow the electronic device to continue to run the target application within a preset time period.
In a fifth aspect, this application provides an electronic device, including: an acquiring unit configured to detect that a user opens a first function of a target application, where the first function requires multi-user joint identity authentication of N users and N is an integer greater than 1; a display unit configured to display a first input interface, prompt, in the first input interface, for the identity information of the N users, display a first preview interface captured by a camera, and prompt the N users to enter their face information in the first preview interface; the acquiring unit being further configured to acquire N pieces of identity information from the first input interface and N pieces of face information from the first preview interface; a communication unit configured to send a verification request containing the N pieces of identity information and the N pieces of face information to a server, so that the server performs multi-user joint identity authentication on the N users; and a processing unit configured to execute the first function if a message indicating that the multi-user joint identity authentication succeeded is received from the server.
In a possible design method, the display unit is further configured to display a correspondence between the N pieces of identity information and the N pieces of face information, and prompt a user to confirm that a first function is executed using the N pieces of identity information and the N pieces of face information; if the confirmation operation of the user on the N pieces of identity information and the N pieces of face information is detected, the communication unit sends the verification request to the server.
In a possible design method, the obtaining unit is further configured to obtain identity information input by a user in a second input interface; acquiring face information input by a user in a second preview interface; the communication unit is further configured to send a registration request to the server, where the registration request includes the identity information and the face information of the user, so that the server establishes a correspondence between the identity information and the face information of the user.
In a sixth aspect, the present application provides a server, comprising: the communication unit is used for receiving an authentication request sent by the electronic equipment, wherein the authentication request comprises N pieces of identity information and N pieces of face information to be authenticated; the processing unit is used for acquiring N pieces of registered face information respectively corresponding to the N pieces of identity information, and the server stores the corresponding relation between the identity information of each registered user and the face information; determining that the N registered face information correspond to the N face information to be verified one by one; and the communication unit is also used for sending a message of successful multi-user joint identity verification to the electronic equipment.
In a possible design method, the communication unit is further configured to receive a registration request sent by the electronic device, where the registration request includes identity information and face information of a user; the processing unit is further configured to establish a correspondence between the identity information of the user and the face information.
In a possible design method, the processing unit is further configured to query the identity information of the user as correct identity information in a preset database.
In one possible design method, the identity information of the user comprises head portrait information of the user; the processing unit is further configured to determine that the avatar information in the identity information corresponds to the face information.
In a seventh aspect, this application provides an electronic device, including a processor; a communication module, an input device, an output device, and a memory that are coupled to the processor; and one or more computer programs. The one or more computer programs are stored in the memory; when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the method for verifying a user identity according to any one of the first aspect or the second aspect.
Illustratively, the processor is configured to: running a target application and opening a multi-user joint identity authentication function provided by the target application; the output device is to: displaying a first preview interface captured by a camera, and prompting to input face information of a plurality of preset users in the first preview interface; the input device is to: acquiring N pieces of face information to be verified from the first preview interface, wherein N is an integer greater than 1; the processor is further configured to: and if each piece of face information to be verified in the N pieces of face information to be verified is the face information of a preset user, allowing the electronic equipment to continue to run the target application.
Illustratively, the input device is configured to: detecting a first function of opening a target application by a user, wherein the first function is a function of multi-user joint identity authentication of N users, and N is an integer greater than 1; the output device is to: the electronic equipment displays a first input interface and prompts the input of identity information of N users in the first input interface; displaying a first preview interface captured by a camera, and prompting the N users to input face information in the first preview interface; the processor is configured to: acquiring N pieces of identity information from the first input interface; acquiring N pieces of face information from the first preview interface; instructing the communication module to send an authentication request to a server, where the authentication request includes the N identity information and the N face information, so that the server performs multi-user joint identity authentication on the N users; and if a message of successful multi-user joint identity authentication sent by the server is received, executing the first function.
In an eighth aspect, this application provides a server, including a processor; a communication module and a memory that are coupled to the processor; and one or more computer programs. The one or more computer programs are stored in the memory; when the server runs, the processor executes the one or more computer programs stored in the memory, so that the server performs the method for verifying a user identity according to any one of the third aspect.
Illustratively, the communication module is configured to: receiving a verification request sent by electronic equipment, wherein the verification request comprises N pieces of identity information and N pieces of face information to be verified; the processor is configured to: acquiring N pieces of registered face information respectively corresponding to the N pieces of identity information, wherein the corresponding relation between the identity information of each registered user and the face information is stored in the memory; determining that the N registered face information correspond to the N face information to be verified one by one; the communication module is further to: and sending a message of successful multi-user joint identity verification to the electronic equipment.
In a ninth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of verifying the identity of a user as described in any one of the first or second aspects.
In a tenth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on a server, cause the server to perform the method of verifying the identity of a user as described in any one of the third aspect.
In an eleventh aspect, the present application provides a computer program product for causing an electronic device to perform the method of verifying the identity of a user according to any one of the first or second aspects, when the computer program product is run on the electronic device.
In a twelfth aspect, the present application provides a computer program product, which, when run on a server, causes the server to perform the method of verifying the identity of a user as claimed in any one of the third aspects.
It is to be understood that the electronic devices according to the fourth, fifth, and seventh aspects, the servers according to the sixth and eighth aspects, the computer storage media according to the ninth and tenth aspects, and the computer program products according to the eleventh and twelfth aspects are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above; details are not described herein again.
Drawings
Fig. 1 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is an architecture diagram of an operating system in an electronic device according to an embodiment of the present application;
Fig. 3 is a first schematic flowchart of a method for verifying a user identity according to an embodiment of the present application;
Fig. 4 is a schematic diagram of scenario 1 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 5 is a schematic diagram of scenario 2 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 6 is a schematic diagram of scenario 3 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 7 is a schematic diagram of scenario 4 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 8 is a schematic diagram of scenario 5 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 9 is a schematic diagram of scenario 6 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 10 is a second schematic flowchart of a method for verifying a user identity according to an embodiment of the present application;
Fig. 11 is a schematic diagram of scenario 7 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 12 is a schematic diagram of scenario 8 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 13 is a schematic diagram of scenario 9 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 14 is a schematic diagram of scenario 10 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 15 is a third schematic flowchart of a method for verifying a user identity according to an embodiment of the present application;
Fig. 16 is a schematic diagram of scenario 11 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 17 is a schematic diagram of scenario 12 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 18 is a schematic diagram of scenario 13 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 19 is a schematic diagram of scenario 14 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 20 is a fourth schematic flowchart of a method for verifying a user identity according to an embodiment of the present application;
Fig. 21 is a fifth schematic flowchart of a method for verifying a user identity according to an embodiment of the present application;
Fig. 22 is a schematic diagram of scenario 15 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 23 is a schematic diagram of scenario 16 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 24 is a schematic diagram of scenario 17 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 25 is a schematic diagram of scenario 18 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 26 is a schematic diagram of scenario 19 of a method for verifying a user identity according to an embodiment of the present application;
Fig. 27 is a second schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 28 is a third schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 29 is a first schematic structural diagram of a server according to an embodiment of the present application;
Fig. 30 is a second schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The method for verifying the user identity provided in the embodiment of the present application may be applied to a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device, a virtual reality device, and the like.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a motor 191, an indicator 192, a camera 193, a display screen 194, and a Subscriber Identity Module (SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationships between the modules illustrated in this embodiment of the present application are merely examples and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may alternatively use an interface connection manner different from those in the foregoing embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In an embodiment of the present application, the camera 193 may send the captured image to a processor (e.g., a GPU processor) which extracts one or more face information from the image through a certain face recognition algorithm.
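As a hedged illustration of this step, the sketch below models a face-recognition algorithm that returns the face regions found in a captured frame; FrameData, FaceRegion, and FaceDetectionAlgorithm are placeholder types introduced for the example, not concrete Android APIs.

```kotlin
// Placeholder pipeline: each captured frame is handed to a face-recognition
// algorithm that returns the regions of the faces it finds.
class FrameData(val pixels: ByteArray, val width: Int, val height: Int)

data class FaceRegion(val left: Int, val top: Int, val right: Int, val bottom: Int)

interface FaceDetectionAlgorithm {
    fun detect(frame: FrameData): List<FaceRegion>   // one entry per detected face
}

// During multi-user joint verification the number of detected faces can be
// checked against the expected number of preset users N.
fun hasExpectedFaceCount(frame: FrameData, detector: FaceDetectionAlgorithm, n: Int): Boolean =
    detector.detect(frame).size == n
```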
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensor module 180 may specifically include one or more of a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, or a bone conduction sensor, which is not limited in this embodiment.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 100 employs esims, namely: an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
The application package may include camera, gallery, calendar, call, map, navigation, bluetooth, music, video, short message, etc. applications.
In the embodiment of the present application, as shown in fig. 2, the application layer may further include a target application that needs to authenticate multiple users at the same time.
For example, the target application may be a game application. When the user of the game application is younger than a certain age (for example, 13 years old), the identities of the user and the user's guardian may be verified simultaneously when the game application is opened, so as to ensure that the user uses the game application with the guardian's permission or accompaniment.
For another example, the target application may be a payment-type application. When the user uses the payment-type application to make a large-amount transaction, the payment-type application may require the related persons participating in the transaction to perform identity authentication together, so as to ensure that all parties to the transaction are aware of it.
For another example, the target application may be an application issued by a government agency for handling related business online. For example, when the ownership of a vehicle is transferred, both the old owner and the new owner are required to handle the transfer on site, and the identities of the old owner and the new owner can be verified by using the application. For another example, when a marriage certificate is handled, both parties are required to complete the relevant procedures on site, and their identities can be verified by using the application. For another example, when a contract is signed, both the first party and the second party are required to confirm the contract content on site, and their identities can be verified by using the application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include an authentication service that may provide multi-user federated authentication functionality to the target application. For example, the target application may invoke an authentication service to acquire face information of the user for registration, authentication, and other processes. For example, the authentication service may acquire face information including faces of a plurality of users by driving an image sensor such as a camera. Moreover, the authentication service can also calculate the number of faces contained in the collected face information, perform living body detection on the face information, and the like. In addition, the authentication service can also extract the face characteristics of each face in the face information and compare the extracted face characteristics with the face characteristics of the registered users, so as to determine whether the multiple users currently carrying out identity verification are legal users.
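The following minimal Java sketch illustrates the kind of interface such an authentication service might expose to a target application. All class and method names here are illustrative assumptions for this description, not actual Android framework APIs.

```java
// Hypothetical sketch of an application-framework-layer authentication service.
// Names are assumptions; the real service and its APIs are not specified here.
import java.util.List;

public interface MultiUserAuthService {

    /** Result of one joint-authentication attempt. */
    final class AuthResult {
        public final boolean passed;        // true only if every captured face matched a preset user
        public final int facesDetected;     // number of faces found in the captured frame

        public AuthResult(boolean passed, int facesDetected) {
            this.passed = passed;
            this.facesDetected = facesDetected;
        }
    }

    /** Drives the camera, counts faces, runs liveness detection, and compares face features. */
    AuthResult authenticateUsers(String appId, int expectedUserCount);

    /** Registers one user's identity information together with extracted face features. */
    void registerUser(String identityInfo, float[] faceFeatures);

    /** Returns the identities configured as authenticated users for the given application. */
    List<String> getAuthenticatedUsers(String appId);
}
```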
It should be noted that the authentication service may be operated in a common execution environment (REE), or may be operated in a high-security environment such as a Trusted Execution Environment (TEE) or a Secure Element (SE). Of course, a part of the program of the authentication service may be run in the REE, and another part of the program may be run in the TEE, which is not limited in this embodiment.
In addition, the application framework layer may also include a drawing service (for example, SurfaceFlinger), a window manager, content providers, a telephony manager, a resource manager, a notification manager, and the like.
The drawing service may be used to construct display pages for applications. A display page may be composed of one or more views. For example, a display page including a short-message notification icon may include a view for displaying text and a view for displaying a picture. The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like. The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like. The telephony manager is used to provide communication functions of the electronic device 100, for example, management of call status (including connected, hung up, and the like). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction, for example, to notify of download completion or to provide message alerts. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window, for example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and the like, which is not limited in this embodiment of the application.
For example, before a target application in the electronic device 100 needs to authenticate multiple users simultaneously, each user needing to authenticate needs to register their own identity information and face information in the electronic device 100.
Taking a cell phone as an example of the electronic device 100, a user using the cell phone may be one or more family members in a home. In order to facilitate each subsequent user to use the multi-user joint identity authentication function provided by the target application in the mobile phone, each member in the home can register own identity information and face information in the mobile phone through the following steps S301 to S305. As shown in fig. 3, steps S301 to S305 specifically include:
S301, the mobile phone prompts the user to input identity information.
For example, the handset may prompt the user to enter identity information when the user installs or first uses the target application. Taking the target application as the game APP for example, as shown in fig. 4 (a), after the user opens the registration function of the game APP in the mobile phone, the mobile phone may display a registration interface 401. In the registration interface 401, there are provided filling options of identity information such as name, age, sex, mobile phone number, and identification number of the registered user, so as to prompt the user to input corresponding identity information in the registration interface 401.
Alternatively, the identity information entered by the user may not be directly associated with the target application. For example, a cell phone may provide multi-user federated authentication functionality in a system setting. When it is detected that the user turns on the function for the first time, the handset may display a registration interface 402 as shown in (b) of fig. 4. Similar to the registration interface 401, the registration interface 402 may also be configured with one or more filling options of identity information, such as name, age, gender, mobile phone number, and identification number of the registered user, which is not limited in this embodiment.
S302, in response to the input operation of user A, the mobile phone acquires the identity information of user A.
Taking the above-mentioned registration interface 402 as an example, as shown in fig. 5, a user who wishes to register (for example, user A) may input his or her own identity information, such as name, age, gender, mobile phone number, and identification number, in the corresponding filling options of the registration interface 402. If it is detected that the user clicks the next button 403 in the registration interface 402, the mobile phone may store the identity information input by the user in the registration interface 402 at this time as the identity information of user A.
S303, the mobile phone prompts the user to input face information.
For example, after detecting that the user clicks the next button 403, the mobile phone may call a camera (e.g., a front-facing camera) of the mobile phone to obtain a currently captured preview interface, and detect whether face information exists in the preview interface through a preset face detection algorithm. As shown in fig. 6, the cell phone may display the camera-captured content in the preview interface 601, for example, the cell phone may display the camera-captured content in a preset area 602 of the preview interface 601. The mobile phone may prompt the user to adjust the position of the face in the preview interface 601, so that the mobile phone may detect the face of the user in the preset area 602 of the preview interface 601.
S304, the mobile phone obtains the face information input by the user A by using the camera.
As shown in fig. 7, after user A completely puts his or her face into the preset area 602 of the preview interface 601, the mobile phone can identify the face information in the preset area 602 and store the identified face information locally. For example, after the mobile phone detects a face in the preset area 602, corresponding face features may be extracted according to the image information in the preset area 602. For example, the extracted face features may be a feature matrix, and the mobile phone may store the extracted feature matrix locally.
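As a concrete illustration of storing such a feature matrix locally, the following sketch assumes the detection step has already produced a flattened float[] feature vector; the file naming and storage location are assumptions, not part of the original description.

```java
// Minimal sketch: persist an extracted face-feature vector for later comparison.
// The "face_<userId>.bin" file name is an assumption used only for illustration.
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class FaceFeatureStore {

    /** Writes the feature vector length followed by its values to a local file. */
    public static void saveFeatures(String userId, float[] features) throws IOException {
        try (DataOutputStream out =
                     new DataOutputStream(new FileOutputStream("face_" + userId + ".bin"))) {
            out.writeInt(features.length);
            for (float value : features) {
                out.writeFloat(value);
            }
        }
    }
}
```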
After the face information of user A is acquired, as shown in fig. 7, the mobile phone may further display a prompt 701 indicating that the face information is successfully entered. Subsequently, the mobile phone can authenticate the identity of user A by using the face information entered during the registration of user A. In addition, after the face information of user A is acquired, as shown in fig. 7, the mobile phone may further display a return button 702 and a next button 703. If it is detected that the user clicks the return button 702, the mobile phone may delete the face information acquired this time and return to the preview interface 601 to capture and detect face information again. If it is detected that the user clicks the next button 703, it indicates that the user confirms using the face information entered this time as the basis for subsequent identity authentication, and at this time the mobile phone may continue to execute step S305.
Of course, if the mobile phone does not recognize the face information in the preset area 602 within the preset time, or detects that the face information entered by the user is incomplete many times, the mobile phone may display the prompt information indicating that the registration has failed, and exit the registration process.
S305, the mobile phone establishes a corresponding relation between the identity information and the face information of the user A and prompts the user A to register successfully.
In step S305, as shown in fig. 8, the mobile phone may display the identity information and the face information of user A acquired in steps S302 and S304, respectively, on the touch screen, and prompt the user to confirm. If it is detected that the user clicks the confirm button 801, indicating that the user agrees to bind the identity information and the face information entered this time, the mobile phone may establish a correspondence between the identity information and the face information of user A in a preset database. For example, a face feature library and an identity information library may be set in the mobile phone in advance, and the mobile phone may store the acquired face information of user A in the face feature library and generate a face feature index for the face information. Furthermore, the mobile phone can store the correspondence between the face feature index of user A and the identity information of user A in the identity information library. In this way, when the user identity is subsequently verified, the mobile phone can search the identity information library for the face feature index corresponding to the identity information input by the user, and then search the face feature library for the corresponding face information according to the face feature index. Of course, if it is detected that the user clicks the return button 802 shown in fig. 8, the mobile phone may return to the previous menu so that the user can modify the input identity information and/or face information, which is not limited in this embodiment of the application.
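One possible in-memory form of the face feature library and identity information library described above is sketched below. The index generation and data types are assumptions for illustration; the description does not prescribe a particular storage format.

```java
// Illustrative sketch of the two preset databases: a face feature library keyed by a
// generated face-feature index, and an identity information library mapping that index
// to the user's identity information.
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class RegistrationStore {

    private final Map<String, float[]> faceFeatureLibrary = new HashMap<>();
    // face feature index -> identity information
    private final Map<String, String> identityLibrary = new HashMap<>();

    /** Stores the face features, generates a feature index, and binds it to the identity information. */
    public String register(String identityInfo, float[] faceFeatures) {
        String featureIndex = UUID.randomUUID().toString();
        faceFeatureLibrary.put(featureIndex, faceFeatures);
        identityLibrary.put(featureIndex, identityInfo);
        return featureIndex;
    }

    /** Looks up the registered face features for a given identity, as used during later verification. */
    public float[] featuresForIdentity(String identityInfo) {
        for (Map.Entry<String, String> entry : identityLibrary.entrySet()) {
            if (entry.getValue().equals(identityInfo)) {
                return faceFeatureLibrary.get(entry.getKey());
            }
        }
        return null;
    }
}
```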
In addition, after detecting that the user clicks the confirm button 801, as shown in fig. 9, the mobile phone may further display a message 901 that the user a successfully registers in the display screen. Also, the handset may also display an option 902 to add other registered users. If the user is detected to click on option 902, the handset may proceed to perform the above steps S301-S305 to register other users (e.g., user B and user C, etc.) who may use the multi-user federated authentication function.
In this way, different users can complete the registration process of the multi-user joint identity authentication function in the mobile phone through the above steps S301 to S305. Each registered user establishes a correspondence between his or her own identity information and face information in the mobile phone. Subsequently, when the mobile phone needs to authenticate multiple users at the same time to provide presence certificates of the multiple users, the mobile phone can respectively authenticate multiple pieces of face information appearing in the same collected image. When each piece of face information is the face information of a registered user, the mobile phone can determine that the multi-user joint identity authentication is passed, so that the identity authentication of multiple users is completed at the same time and place, thereby ensuring the accuracy and security of multi-user identity authentication.
After multiple users register the multi-user joint identity authentication function in the mobile phone, before using this function, a user may enable the multi-user joint identity authentication function in the target application through the following steps S1001 to S1006. Of course, if the user does not need to use the multi-user joint identity authentication function in the target application (e.g., the game APP described above), for example, if the user does not need to be accompanied by a guardian when using the game APP, the user may skip the following steps S1001 to S1006 and open the game APP directly.
Illustratively, as shown in fig. 10, the steps S1001 to S1006 specifically include:
S1001, after detecting that the user starts the multi-user joint identity authentication function in the target application, the mobile phone prompts the user to select the number of users for the multi-user joint identity authentication.
Taking the game APP as an example of the target application, as shown in (a) in fig. 11, when the mobile phone detects that the user opens the game APP, if the multi-user joint identity authentication function in the game APP is not enabled, the mobile phone may display an enable button 1101 for enabling the multi-user joint identity authentication function. When it is detected that the user clicks the enable button 1101, it indicates that the user wishes to start the multi-user joint identity authentication function when subsequently opening the game APP, so as to prove that the user uses the game APP together with others. At this time, as shown in (b) in fig. 11, the mobile phone may prompt the user to input the number of users who will subsequently participate in the multi-user joint identity authentication.
In some embodiments, the mobile phone can also automatically enable the multi-user joint identity authentication function for the user in the target application. For example, when detecting that the user opens the game APP, the mobile phone can automatically start the front camera to acquire an image of the user using the game APP. If it is determined from the user image that the user using the game APP is young (for example, under 10 years old), the mobile phone may automatically enable the multi-user joint identity authentication function and prompt the user to select the number of users for the multi-user joint identity authentication.
S1002, when the number of the users is N (N is more than 1), the mobile phone prompts to input N pieces of face information in a preview interface.
For example, if the number of users when the user selects the multi-user federated authentication is 2, after the mobile phone detects that the user selects the option of 2, as shown in fig. 12, the mobile phone may call its camera (e.g., a front camera) to obtain a currently captured preview interface 1201 and display the preview interface 1201. Moreover, the mobile phone may prompt the user to input the face information of the user 1 and the user 2 at a designated position of the preview interface 1201 in the preview interface 1201.
For example, when the number of users selected for the multi-user joint identity authentication is 2, the mobile phone may mark two areas, namely, the area 1202 and the area 1203 shown in fig. 12, in the preview interface 1201 according to the number of users. The two areas are used to acquire the face information of user 1 and user 2, respectively. In this way, user A and user B participating in the multi-user joint identity authentication may adjust their positions in the preview interface 1201 according to the prompts in the preview interface 1201, so that the mobile phone can detect the face information of user 1 (e.g., user A) in the area 1202 of the preview interface 1201 and detect the face information of user 2 (e.g., user B) in the area 1203 of the preview interface 1201.
S1003, the mobile phone acquires N pieces of face information from the preview interface.
And S1004, the mobile phone determines that the N pieces of face information are all face information of the registered user.
As shown in fig. 13, after the mobile phone detects face information in both the area 1202 and the area 1203 of the preview interface 1201, it can be determined that the number of faces at this time is the same as the number of users (i.e., 2 users) set by the user, and then the mobile phone can determine whether the face information in the area 1202 and the area 1203 is the face information of registered users that has already been registered one by one. If the number of the collected face information is different from the number of the users set by the user, the mobile phone can prompt the user that the number of the faces input in the preview interface 1201 is incorrect, and return to the preview interface 1201 to collect the face information in the preview interface 1201 again.
For example, the mobile phone may match the face information extracted from the area 1202 with the face information of the registered user A, user B, and user C in the face feature library. If the face information in the area 1202 matches the face information of user A, indicating that the face captured in the area 1202 is the face of user A, the mobile phone can search for the identity information of user A in the identity information library according to the face feature index of user A.
Similarly, the mobile phone may match the face information extracted from the area 1203 with the face information of the registered user A, user B, and user C in the face feature library. If the face information in the area 1203 matches the face information of user B, indicating that the face captured in the area 1203 is the face of user B, the mobile phone can search for the identity information of user B in the identity information library according to the face feature index of user B. In this way, the mobile phone can determine that the 2 pieces of face information collected in the preview interface 1201 are both the face information of registered users.
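The count check and per-area matching in steps S1003 and S1004 could, for example, take the following form. The similarity metric and threshold are assumptions; the description only requires that each collected face match some registered user.

```java
// Sketch: the number of captured faces must equal the number selected by the user, and
// every captured face must match a distinct registered user.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class JointEnrollmentCheck {

    public static boolean allFacesRegistered(List<float[]> capturedFaces,
                                             int expectedCount,
                                             Map<String, float[]> registeredUsers,
                                             double threshold) {
        if (capturedFaces.size() != expectedCount) {
            return false; // incorrect number of faces entered in the preview interface
        }
        List<String> alreadyMatched = new ArrayList<>();
        for (float[] captured : capturedFaces) {
            String matchedUser = null;
            for (Map.Entry<String, float[]> user : registeredUsers.entrySet()) {
                if (!alreadyMatched.contains(user.getKey())
                        && cosineSimilarity(captured, user.getValue()) > threshold) {
                    matchedUser = user.getKey();
                    break;
                }
            }
            if (matchedUser == null) {
                return false; // at least one face is not the face of a registered user
            }
            alreadyMatched.add(matchedUser);
        }
        return true;
    }

    /** Cosine similarity is one possible metric; the description does not fix a specific one. */
    static double cosineSimilarity(float[] a, float[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB) + 1e-9);
    }
}
```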
In addition, if the mobile phone determines that the one or more face information collected this time is not the face information of the registered user, the mobile phone can display a registration button to prompt the unregistered user to register the identity information and the face information in the mobile phone. If the user is detected to click the registration button, the mobile phone can register the identity information and the face information of the new user through the steps S301-S305.
In some embodiments of the present application, after the mobile phone detects the face information from the above-mentioned area 1202 and area 1203, it may further perform living body detection on the face information in the area 1202 and area 1203, that is, it is determined that the face image captured by the mobile phone belongs to a real living user, rather than using fake face information such as a picture or a video. For example, the mobile phone can detect whether or not the temperature distribution in the region 1202 and the region 1203 matches the face surface temperature distribution using the temperature sensor, and if the face surface temperature distribution matches, it can be considered that the living body detection is passed. For another example, the mobile phone may further require the user to blink, open the mouth, or read a text to identify whether the user in the area 1202 and the area 1203 is a living user, which is not limited in this embodiment of the application.
S1005, the mobile phone prompts the user to confirm the correspondence between the N pieces of face information and the corresponding N pieces of identity information.
S1006, if the confirmation operation of the user is detected, the mobile phone establishes a correspondence among the target application, the N pieces of face information, and the N pieces of identity information.
For example, after the mobile phone determines that the face information in the area 1202 and the area 1203 is the face information of the registered user a and the registered user B, as shown in fig. 14, the mobile phone may display the face information and the identity information of the user a and the face information and the identity information of the user B on a touch screen to prompt the user to confirm. If the user is detected to click the confirmation button 1401, the user agrees to perform multi-user joint authentication by the user A and the user B when the game APP is subsequently run. Then, the mobile phone may establish a correspondence between the game APP, the face information and the identity information of the user a, and the face information and the identity information of the user B.
For example, an authentication registry as shown in Table 1 may be stored in the mobile phone in advance, and the authentication registry is used to record the correspondence between each application for which the multi-user joint identity authentication function is enabled and the corresponding preset users. In this embodiment of the application, the preset users corresponding to an application may be referred to as the authenticated users of the application. For example, the authenticated users of the game APP include user A and user B. If it is detected that the user clicks the confirm button 1401 shown in fig. 14, the mobile phone can record the ID of the game APP (e.g., the package name of the application) and the related information (e.g., identity information and face information) of the authenticated users (i.e., user A and user B) in the authentication registry. In this way, when it is subsequently detected that the user opens the game APP, the mobile phone can determine, through the authentication registry shown in Table 1, whether the users currently performing identity authentication are the authenticated users recorded in the authentication registry.
TABLE 1
ID of application | Identity information | Face information
Game APP ID | Identity information of user A | Face information of user A
Game APP ID | Identity information of user B | Face information of user B
Payment APP ID | Identity information of user B | Face information of user B
Payment APP ID | Identity information of user C | Face information of user C
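The authentication registry in Table 1 could be represented, for example, as a mapping from an application ID to the identity and face information of its authenticated users. The sketch below is illustrative; field names and types are assumptions.

```java
// Illustrative in-memory form of the authentication registry in Table 1.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AuthenticationRegistry {

    public static final class AuthenticatedUser {
        public final String identityInfo;
        public final float[] faceFeatures;

        public AuthenticatedUser(String identityInfo, float[] faceFeatures) {
            this.identityInfo = identityInfo;
            this.faceFeatures = faceFeatures;
        }
    }

    // application ID (e.g., package name) -> authenticated users bound to that application
    private final Map<String, List<AuthenticatedUser>> registry = new HashMap<>();

    /** Records the authenticated users bound to an application, e.g., user A and user B for the game APP. */
    public void bind(String appId, List<AuthenticatedUser> users) {
        registry.put(appId, new ArrayList<>(users));
    }

    /** Looks up the authenticated users for the application currently being opened. */
    public List<AuthenticatedUser> authenticatedUsersOf(String appId) {
        return registry.getOrDefault(appId, new ArrayList<>());
    }
}
```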
It should be noted that the process of enabling the multi-user joint identity authentication function in the target application of the mobile phone (i.e., the above-mentioned S1001-S1006) only needs to be completed by the user once in the target application. That is, after the user sets user A and user B as the authenticated users of the multi-user joint identity authentication function in the target application through the above S1001 to S1006, user A and user B need to participate together each time the multi-user joint identity authentication is performed in the target application, thereby providing a multi-user presence certificate for using the target application.
Of course, the user may also change the authenticated users who perform the multi-user joint identity authentication in the target application by performing the above steps S1001 to S1006 again. For example, the authenticated users may be changed from user A and user B to user A and user C by re-enabling the multi-user joint identity authentication function, which is not limited in this embodiment of the application.
After a user sets a user A and a user B as authentication users by starting a multi-user joint identity verification function in a target application, if the user is detected to open the multi-user joint identity verification function in the target application, the mobile phone can execute the following steps S1501-S1505 to perform multi-user joint identity verification. Illustratively, as shown in fig. 15, steps S1501 to S1505 specifically include:
S1501, after detecting that the user opens the multi-user joint identity authentication function in the target application, the mobile phone determines the authenticated users corresponding to the target application.
Still taking the game APP as the target application example, the mobile phone may display an authentication button for performing the multi-user joint identity authentication when entering the game APP, or may display the authentication button when entering a certain function of the game APP. As shown in fig. 16, after detecting that the user opens the game APP, if the user has enabled the joint multi-user authentication function in the target application (i.e., S1001-S1006 described above), the mobile phone may display an authentication button 1601 to prompt the user to perform joint multi-user authentication.
In some embodiments, the mobile phone may also automatically turn on the multi-user joint authentication function in some scenarios. For example, after detecting that the user opens the game APP, the mobile phone may require the user to input the user age, and if the user age is small (e.g., less than 10 years old), the mobile phone may automatically open the multi-user joint authentication function in the game APP. Or, the user can set the mobile phone to be in a child mode, and if the mobile phone is in the child mode when the game APP is opened by the user, the mobile phone can automatically open the multi-user joint identity authentication function in the game APP. Or, the mobile phone may also monitor the running time of the game APP, and when the running time of the game APP in the mobile phone is longer than a preset time (e.g., 2 hours), the mobile phone may automatically turn on the multi-user joint identity authentication function in the game APP.
After the mobile phone opens the multi-user joint identity verification function of the game APP, the mobile phone can acquire the relevant information of the authenticated user corresponding to the game APP in the authentication registration table shown in the table 1 according to the ID of the game APP. For example, the authenticated users corresponding to the game APP include user a and user B. Then, the mobile phone may obtain the corresponding face feature index in the identity information base according to the identity information of the user a, and further, the mobile phone may obtain the face information of the user a in the face feature base according to the face feature index of the user a. Similarly, the mobile phone can acquire the corresponding face feature index in the identity information base according to the identity information of the user B, and further, the mobile phone can acquire the face information of the user B in the face feature base according to the face feature index of the user B.
S1502, the mobile phone prompts the user to input face information of N (N is larger than 1) authenticated users in a preview interface.
After the mobile phone determines that the authenticated users corresponding to the game APP are user a and user B, as shown in fig. 17, the mobile phone may call a camera (e.g., a front camera) of the mobile phone to obtain a currently captured preview interface 1701 and display the preview interface 1701. Also, since the phone has determined that the game APP corresponds to 2 authenticated users, the phone may mark two regions (i.e., region 1702 and region 1703) in the preview interface 1701 to prompt the user to enter face information for each authenticated user in the two regions, as also shown in fig. 17.
Further, the mobile phone may prompt the user to input face information in the preview interface 1701 according to the position relationship of the plurality of authenticated users when the multi-user joint identity authentication function is enabled, and the subsequent mobile phone may determine whether the multi-user joint identity authentication passes or not according to the position relationship of each face information in the preview interface 1701. For example, when the multi-user joint authentication function is enabled, the face information of the user a is on the left side of the preview interface, and the face information of the user B is on the right side of the preview interface. Then, when actually performing the multi-user federated authentication, the user a needs to enter his face information into the left side of the preview interface (i.e., in the area 1702), and the user B needs to enter his face information into the right side of the preview interface (i.e., in the area 1703), otherwise, the mobile phone may determine that this multi-user federated authentication does not pass.
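A simple way to express this optional position check is sketched below: each authenticated user is bound to a region when the function is enabled, and verification passes only if every face appears in the region bound to that user. The left-to-right ordering convention is an assumption for illustration.

```java
// Sketch of the position-relationship check described above.
import java.util.List;

public class PositionCheck {

    /**
     * expectedOrder: user IDs bound to the preview regions (e.g., left to right) when the
     * joint authentication function was enabled.
     * observedOrder: user IDs recognized in the same regions during this verification.
     */
    public static boolean positionsMatch(List<String> expectedOrder, List<String> observedOrder) {
        if (expectedOrder.size() != observedOrder.size()) {
            return false;
        }
        for (int i = 0; i < expectedOrder.size(); i++) {
            if (!expectedOrder.get(i).equals(observedOrder.get(i))) {
                return false; // a user entered face information in the wrong region
            }
        }
        return true;
    }
}
```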
And S1503, the mobile phone acquires N pieces of face information from the preview interface.
S1504, the mobile phone determines that the N pieces of face information are face information of the authenticated user.
If the mobile phone extracts face information in both the area 1702 and the area 1703 of the preview interface 1701, and determines that the number of users currently performing identity authentication is the same as the number of authenticated users, the mobile phone can determine one by one whether the face information in the area 1702 and the area 1703 is the face information of the authenticated users (i.e., user A and user B).
For example, the authenticated user corresponding to the area 1702 is user A, so the mobile phone may compare the face information collected in the area 1702 with the face information of user A. When the similarity between the face information in the area 1702 and the face information of user A is greater than a preset value, it may be determined that the face information in the area 1702 is the face information of the authenticated user A.
Similarly, the authenticated user corresponding to the area 1703 is the user B, and then the mobile phone may compare the face information collected in the area 1703 with the face information of the user B. When the similarity between the face information in the area 1703 and the face information of the user B is greater than a preset value, it may be determined that the face information in the area 1703 is the face information of the authenticated user B. In this way, the mobile phone can determine that the 2 pieces of face information collected in the preview interface 1701 are face information of 2 authenticated users preset in steps S1001 to S1006 by the user, and the positional relationship of the 2 pieces of face information in the preview interface 1701 also matches the positional relationship set in steps S1001 to S1006 by the user.
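In this step, each marked area is compared only against the authenticated user bound to that area, and the comparison passes when the similarity exceeds a preset value. The sketch below illustrates this; the particular similarity score used here (based on mean squared difference) is only one possible choice, since the description does not fix a metric.

```java
// Sketch of the per-area comparison in S1504: captured features are compared with the
// registered features of the authenticated user bound to that area.
public class RegionBoundVerification {

    public static boolean verifyRegion(float[] capturedFeatures,
                                       float[] registeredFeatures,
                                       double presetThreshold) {
        return similarity(capturedFeatures, registeredFeatures) > presetThreshold;
    }

    /** A simple similarity score in (0, 1]: higher means the two feature vectors are closer. */
    static double similarity(float[] a, float[] b) {
        double sumSquaredDiff = 0;
        for (int i = 0; i < a.length; i++) {
            double diff = a[i] - b[i];
            sumSquaredDiff += diff * diff;
        }
        return 1.0 / (1.0 + sumSquaredDiff / a.length);
    }
}
```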
S1505, the mobile phone displays the message of successful multi-user joint identity authentication and continues to run the target application.
As shown in fig. 18 (a), if N pieces of face information acquired by the mobile phone in the multi-user joint authentication process correspond to the face information of N authenticated users of the game APP one to one, the mobile phone may display a message 1801 that the multi-user joint authentication is successful. Certainly, after the multi-user joint identity authentication is successful, it is stated that the user has the right to use the game APP, and then the mobile phone may continue to run the target application (i.e., the game APP), for example, the mobile phone may automatically open the game APP to enter the home page of the game APP.
Alternatively, if the mobile phone determines that one or more pieces of face information acquired in the multi-user joint authentication process are not face information of the authenticated user, the mobile phone may display a message indicating that the multi-user joint authentication has failed, as shown in (b) of fig. 18. At this moment, the user does not have the permission to use the game APP, and the mobile phone cannot enter the home page of the game APP to continue to run the game APP.
Further, after the multi-user joint identity authentication succeeds, the mobile phone can set a validity period for the multi-user joint identity authentication. Taking a validity period of 2 hours as an example, after the user passes the multi-user joint identity authentication in the game APP, the mobile phone can allow the user to use the game APP for 2 hours. Then, if the mobile phone detects that the game APP is still running in the foreground after 2 hours, the mobile phone may prompt the user to perform the multi-user joint identity authentication again. That is, after the running time of the game APP exceeds the validity period, the mobile phone may repeat the above steps S1501 to S1505 to perform the multi-user joint identity authentication, so as to prevent the user from becoming addicted to the game.
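The validity-period handling could be implemented along the following lines; the timer class and its use of wall-clock time are assumptions for illustration.

```java
// Sketch: record the time of a successful joint authentication and require
// re-authentication once the target application has run past the validity period.
public class AuthValidityTimer {

    private final long validityMillis;
    private long lastSuccessMillis = -1;

    public AuthValidityTimer(long validityMillis) {
        this.validityMillis = validityMillis;
    }

    public void onAuthSuccess() {
        lastSuccessMillis = System.currentTimeMillis();
    }

    /** Returns true when the joint authentication has expired and must be repeated. */
    public boolean needsReauthentication() {
        return lastSuccessMillis < 0
                || System.currentTimeMillis() - lastSuccessMillis > validityMillis;
    }
}
```

For the 2-hour example above, such a timer would be created as `new AuthValidityTimer(2 * 60 * 60 * 1000L)` and polled while the game APP runs in the foreground.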
In addition, in the foregoing embodiment, the game APP is used as an example of the target application. It can be understood that the mobile phone may also set the multi-user joint identity authentication function in other applications to simultaneously verify the identities of multiple users, so that a multi-user presence certificate is provided when a certain function in the mobile phone is performed. For example, the target application may also be a payment APP. Assume that the user sets the authenticated users of the multi-user joint identity authentication function in the payment APP to user B and user C through the above steps S1001 to S1006. Then, as shown in fig. 19, when the mobile phone detects that the transaction amount in the payment APP exceeds a preset limit (for example, 20,000), the mobile phone may prompt the users to input the face information of user B and user C for multi-user joint identity authentication. If user B and user C pass the multi-user joint identity authentication in the payment APP, it indicates that the authenticated users bound to the payment APP (namely user B and user C) are aware of and authorize the large-amount transaction, and the mobile phone can continue to complete the transaction, thereby improving security during property transactions and reducing the probability of property loss of the user.
That is, if the multi-user joint identity authentication is triggered when the target application is opened, then after the multi-user joint identity authentication passes, the mobile phone can continue to run the target application, that is, the mobile phone can open the target application and enter its display interface; if the multi-user joint identity authentication is triggered when a certain function (such as the payment function) of the target application is run, then after the multi-user joint identity authentication passes, the mobile phone can continue to run the target application, that is, continue to execute that function.
It should be noted that, the above embodiments are exemplified by registering locally in a mobile phone, enabling and implementing a multi-user joint authentication function. It can be understood that the handset can also complete the registration and authentication process of the multi-user joint identity authentication function through interaction with the server.
Illustratively, similar to the above steps S301-S305, before using the multi-user joint authentication function provided by the target application, the handset may register the identity information and face information of the user in the server through the following steps S2001-S2008. As shown in fig. 20, steps S2001-S2008 specifically include:
S2001, the mobile phone prompts the user to input identity information.
S2002, in response to the input operation of user A, the mobile phone acquires the identity information of user A.
S2003, the mobile phone prompts the user to input face information.
S2004, the mobile phone acquires the face information input by user A by using the camera.
The implementation process of steps S2001-S2004 is the same as the implementation process of steps S301-S304 in the above embodiment, and therefore, the description thereof is omitted here.
S2005, the mobile phone sends a registration request to the server, and the registration request comprises the identity information and the face information of the user A.
After the mobile phone acquires the face information and the identity information of the user a, the face information and the identity information can be encrypted and carried in a registration request to be sent to the server, and the server executes the following steps S2006-S2008 to complete the registration process of the user a.
And S2006, the server determines that the identity information of the user A is effective identity information.
After receiving the identity information and the face information of user A, the server can search whether identity information or face information identical to the received information is already stored in the server. If neither is found, it indicates that the identity information and the face information of user A sent this time have not yet been registered in the server.
Then, the server can verify whether the identity information of user A is real and valid identity information through the public security department or another authoritative identity database. For example, the identity information of user A sent by the mobile phone includes the name and identification number of user A. The server may request the identity database to query whether there is a user matching the name and identification number of user A. If a matching user can be found in the identity database, it indicates that the identity information input by user A in the mobile phone is real and valid. Otherwise, the server can send a registration result indicating registration failure to the mobile phone, indicating that the identity information input by user A in the mobile phone is incorrect.
In some embodiments of the present application, the identity information of the user a sent by the mobile phone may include face information of the user a. For example, when prompting the user to input the identity information, the mobile phone may require the user to input a photo of the identity card, and the user identity card is generally printed with a head portrait of the user. At this time, the identity information acquired by the mobile phone is the information on the identity card of the user a, and the information also includes the face information of the user a. Then, before the server verifies the identity information of the user a in the identity database, the server may compare the face information in the identity information of the user a with the face information acquired by the mobile phone this time. If the similarity of the two pieces of face information is greater than the threshold value, it is indicated that the identity information and the face information uploaded by the user A are corresponding. Furthermore, the server can verify whether the identity information of the user A is valid identity information or not from the identity database, so that the authenticity and validity of the face information and the identity information uploaded by the user A are ensured.
S2007, the server establishes a corresponding relation between the identity information and the face information of the user A.
If the identity information input by the user A in the mobile phone is determined to be valid identity information, the server can store the identity information and the face information of the user A in the server and establish the corresponding relation between the identity information and the face information of the user A. By the registration method, the server can store the corresponding relation between the identity information and the face information of a plurality of users. Subsequently, the server can search the corresponding face information according to the identity information of the user, and then can verify the identity of the user according to the face information.
S2008, the server sends the registration result of the successful registration of the user a to the mobile phone, so that the mobile phone displays the registration result.
After the server establishes the corresponding relationship between the identity information and the face information of the user A, the server indicates that the user A finishes registering in the server, and at the moment, the server can send a registration result that the user A successfully registers to the mobile phone. After receiving the registration result, the mobile phone can display the message of successful registration to prompt the user that the registration is successful.
Unlike the above steps S301 to S305, both the identity information and the face information of the user are stored in the server. Therefore, each user who needs to register can use a different mobile phone to complete the registration process through the above steps S2001-S2008. For example, user B may register his or her identity information and face information in the server through user B's own mobile phone; user C may also register his or her identity information and face information in the server through user C's own mobile phone.
Subsequently, when the target application in the mobile phone needs to authenticate a plurality of users at the same time to provide the presence certificates of the plurality of users, the mobile phone can acquire images containing a plurality of pieces of face information at the same time and send the images to the server. And the server performs multi-user joint identity authentication on the plurality of face information acquired at this time according to the face information of the registered user. If the multi-user combined identity authentication passes, the multiple users finish the identity authentication at the same time and place, and the accuracy and the safety of the multi-user identity authentication are ensured.
After a plurality of users participating in the multi-user joint identity authentication successfully register own identity information and face information in the server through the mobile phone respectively, if the fact that a user opens the multi-user joint identity authentication function in the target application is detected, the mobile phone can execute the following steps S2101-S2108 to carry out the multi-user joint identity authentication. Exemplarily, as shown in fig. 21, the steps S2101 to S2108 specifically include:
S2101, after detecting that the user opens the multi-user joint identity authentication function in the target application, the mobile phone prompts the M (M is more than 1) users participating in the multi-user joint identity authentication to input their identity information respectively.
S2102, the mobile phone obtains M identity information input by M users respectively.
In the embodiment of the application, a multi-user joint identity authentication function can be set in the target application. For example, when the target application is a game APP, the multi-user joint authentication may be performed when the game APP is opened. For another example, when the target application is a payment APP, multi-user federated authentication may be performed when a large amount of transactions are conducted. For another example, when the target application is an application related to online certificate handling, multi-user federated authentication may be performed on multiple users related to the certificate while the certificate is being handled.
Taking the vehicle management APP as an example of the target application, the vehicle management APP provides the user with a function of handling vehicle ownership transfer procedures online. As shown in fig. 22, an ownership transfer button 2201 is displayed on the home page of the vehicle management APP. If it is detected that the user clicks the ownership transfer button 2201, it indicates that the user (e.g., user A) wishes to transfer the ownership of his or her vehicle to another user (e.g., user B). At this time, the vehicle management APP needs to perform multi-user joint identity authentication on the two users participating in the ownership transfer, so as to ensure that neither user participating in the transfer later disputes the transfer procedure.
Therefore, if it is detected that the user clicks the ownership transfer button 2201, it indicates that the user has opened the multi-user joint identity authentication function in the vehicle management APP. At this time, as shown in (a) in fig. 23, the mobile phone may jump to the input interface 2301 of the vehicle management APP, and the mobile phone may prompt the owner before the transfer (i.e., the old owner) to input his or her identity information in the input interface 2301. After user A, as the old owner, inputs his or her identity information in the input interface 2301, user A may click the next button 2302 to submit the identity information. If it is detected that the user clicks the next button 2302, the mobile phone can obtain the identity information input by user A in the input interface 2301.
In addition, after detecting that the user clicks the next button 2302 in the input interface 2301, as shown in (b) in fig. 23, the mobile phone may jump to the input interface 2303 of the vehicle management APP, and the mobile phone may prompt the owner after the transfer (i.e., the new owner) to input his or her identity information in the input interface 2303. After user B, as the new owner, inputs his or her identity information in the input interface 2303, user B may click the next button 2304 to submit the identity information. If it is detected that the user clicks the next button 2304 in the input interface 2303, the mobile phone can obtain the identity information input by user B in the input interface 2303.
In the above embodiment, acquisition of the identity information of user A and user B by the mobile phone is used as an example. The number of users participating in the multi-user joint identity authentication function may differ in different application scenarios. For example, when a tripartite agreement is signed online in the target application, the mobile phone needs to acquire the identity information of three users.
S2103, the mobile phone prompts M users participating in the multi-user joint identity verification function to input face information in the same preview interface.
Still taking the vehicle management APP as an example of the target application, after the mobile phone acquires the identity information of the two users, namely, the old owner (user A) and the new owner (user B), as shown in fig. 24, the mobile phone may call a camera (e.g., the front camera) of the mobile phone to acquire a currently captured preview interface 2401 and display the preview interface 2401. In addition, the mobile phone can mark an input area 2402 for the new owner's face information and an input area 2403 for the old owner's face information in the preview interface 2401, so as to prompt user A and user B to input their own face information in the preview interface 2401.
S2104, the mobile phone acquires M pieces of face information to be verified from the preview interface.
After the mobile phone displays the preview interface 2401, whether a human face exists in the input area 2402 and the input area 2403 can be detected in real time. When the human faces are detected in the input region 2402 and the input region 2403, the mobile phone can extract the human face information in the input region 2402 and the human face information in the input region 2403. The two pieces of face information are face information waiting for verification during subsequent multi-user joint identity verification.
That is to say, when identity authentication needs to be performed on multiple users, the verification method provided in this embodiment of the application may collect the face information of the multiple users in the same preview interface, so as to ensure that the multiple users are aware of and confirm certain information (for example, the above-mentioned ownership transfer procedure) at the same time and in the same scene. The verification method provides a presence certificate for multiple users, and can reduce the potential safety hazard caused by information asymmetry in transactions in which multiple users participate.
S2105, the mobile phone sends a verification request to the server, wherein the verification request comprises the M identity information and the M face information to be verified.
After the mobile phone acquires the identity information of the old owner (user A) and the new owner (user B) and the face information to be verified of user A and user B, the mobile phone can encrypt the identity information and the face information using an encryption scheme agreed upon with the server. Furthermore, the mobile phone can carry the encrypted M pieces of identity information and M pieces of face information to be verified in the verification request and send the verification request to the server, thereby avoiding the potential safety hazard of information leakage during the interaction between the mobile phone and the server.
In some embodiments, as shown in fig. 25, before the mobile phone sends the verification request to the server, the collected identity information and face information of the new owner and the old owner may be displayed to the user, and the user determines whether to use the collected identity information and face information for the ownership transfer. If it is detected that the user clicks the confirm button 2401 shown in fig. 25, the mobile phone may send the verification request to the server.
For example, after detecting that the user clicks the confirm button 2401, the mobile phone may generate a digest of the verification request by using a cryptographic algorithm such as a hash algorithm, for example, based on the collected face information and the ownership transfer operation content confirmed by the user. Furthermore, the mobile phone can carry the encrypted operation content, identity information, face information, and digest in the verification request and send the verification request to the server.
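For instance, the digest could be a SHA-256 hash computed over the confirmed operation content and the collected face information, as in the sketch below. The exact payload layout and algorithm choice are assumptions; the description only requires a cryptographic digest such as a hash.

```java
// Sketch: compute a digest over the operation content and the face information bytes,
// to be carried in the verification request for an integrity check on the server side.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class RequestDigest {

    public static byte[] digest(String operationContent, byte[] faceInfoBytes)
            throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(operationContent.getBytes(StandardCharsets.UTF_8));
        md.update(faceInfoBytes);
        return md.digest();
    }
}
```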
S2106, the server respectively searches the corresponding M pieces of registered face information according to the M pieces of identity information.
S2107, the server determines that the M registered face information corresponds to the M face information to be verified one by one.
In step S2106, after the server receives the authentication request sent by the mobile phone, if the authentication request is encrypted, the server may decrypt the authentication request using a corresponding decryption algorithm to obtain the identity information and the face information to be authenticated carried in the authentication request.
For example, the server may decrypt the operation content, the identity information, the face information, and the digest carried in the verification request using the corresponding decryption algorithm. Furthermore, the server can verify the integrity of the received face information and operation content through the digest, so as to ensure that the information received by the server has not been tampered with and that the user is aware of the operation content (namely the vehicle ownership transfer operation) corresponding to this authentication. This avoids the situation in which an attacker, after intercepting the user's face information, uses it to request authentication for other operations without the user's knowledge.
After the server obtains the M pieces of identity information in the verification request, it can further determine whether the multiple users performing the multi-user joint identity authentication are all registered users registered in the server. Because each user stores his or her own identity information and corresponding face information in the server during registration, the server can search whether the received identity information of user A and user B is stored locally. If the identity information of user A and user B is stored, it indicates that user A and user B requesting the multi-user joint identity authentication are both registered users. Then, the server can search for the face information corresponding to the identity information of user A and the face information corresponding to the identity information of user B, that is, the registered face information.
In step S2107, after the server acquires the registered face information corresponding to the identity information of user A, the server may compare that registered face information with the face information to be verified of user A carried in the verification request. If the similarity between the two is greater than a threshold, the authentication of user A participating in the multi-user joint authentication succeeds.
Similarly, after the server acquires the registered face information corresponding to the identity information of user B, the server may compare that registered face information with the face information to be verified of user B carried in the verification request. If the similarity between the two is greater than the threshold, the authentication of user B participating in the multi-user joint authentication succeeds. If the authentication of user A and the authentication of user B both succeed, it is determined that the multi-user joint authentication succeeds. The multi-user joint identity authentication thus provides a proof of presence for user A and user B, ensuring that both users participating in the vehicle ownership transfer authorize and confirm the transfer formalities at the same time and in the same scene, and avoiding property loss caused by another user impersonating user A or user B to complete the transfer.
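The similarity measure and threshold are not specified in the embodiment; the following sketch assumes the face information has already been converted into fixed-length embedding vectors and uses cosine similarity with an illustrative threshold, with the joint result requiring every user to pass:

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # illustrative value; the embodiment only requires "greater than the threshold"

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def joint_authentication(registered: list[np.ndarray], to_verify: list[np.ndarray]) -> bool:
    """Multi-user joint authentication succeeds only if every piece of face
    information to be verified matches its registered counterpart."""
    return all(cosine_similarity(r, v) > SIMILARITY_THRESHOLD
               for r, v in zip(registered, to_verify))
```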
S2108, the server sends a message that the multi-user joint identity authentication is successful to the mobile phone.
After the authentication of both user A and user B succeeds, the server may send a message indicating that the multi-user joint authentication succeeded to the mobile phone. At this time, as shown in fig. 26, the mobile phone may display a message indicating that the multi-user joint authentication succeeded. In addition, the mobile phone may automatically jump to the next operation interface of the vehicle ownership transfer procedure (such as an online number selection interface), so that the user can continue to complete the remaining formalities of the transfer.
In addition, if the authentication of user A and/or user B participating in the multi-user joint authentication fails, the server may determine that the multi-user joint authentication fails. The server may then send a message indicating the failure of the multi-user joint authentication to the mobile phone, and the mobile phone may display this failure message. In this case, the mobile phone does not jump to the next operation interface of the vehicle ownership transfer procedure, which avoids damage to the user's property caused by an illegal user impersonating a legitimate user to complete the transfer.
It should be noted that, in the above embodiments, the vehicle management APP is used as an example of the target application. It can be understood that the identity authentication method provided in the embodiments of the present application can also be applied to other scenarios that require simultaneous verification of multiple users' identities. For example, when an online contract-signing application is used, the above authentication method can be used to perform multi-user joint authentication on the parties to the contract, so as to provide a proof of multi-user presence and avoid subsequent contract disputes. For another example, when a marriage certificate or a joint account is handled online, the above authentication method can be used to perform multi-user joint authentication on the parties handling the business, proving that all parties are present and have authorized the business, and preventing other users from impersonating legitimate users to handle the related business. For another example, when the shared property of multiple users (such as the joint property of a couple) is handled online, the above authentication method can be used to perform multi-user joint authentication on all owners of the shared property, ensuring that all owners are present and aware of the transaction content, thereby reducing the potential safety hazard to the users' property.
In some embodiments, fig. 27 shows a schematic structural diagram of a possible electronic device involved in the above embodiments, where the electronic device includes: an acquisition unit 2701, a processing unit 2702, a display unit 2703, and a communication unit 2704.
The acquisition unit 2701 is configured to support the electronic device to execute the processes S302 and S304 in fig. 3, the process S1003 in fig. 10, the process S1503 in fig. 15, the processes S2002 and S2004 in fig. 20, and the processes S2102 and S2104 in fig. 21; the processing unit 2702 is configured to support the electronic device to perform the process S305 in fig. 3, and the processes S1004-S1006 in fig. 10, and the processes S1504-S1505 in fig. 15; the display unit 2703 is used to support the electronic apparatus to execute the processes S301 and S303 in fig. 3, and the processes S1001 to S1002 in fig. 10, and the processes S1501, S1502, and S1505 in fig. 15, and the processes S2001 and S2003 in fig. 20, and the processes S2101 and S2103 in fig. 21; the communication unit 2704 is used to support the electronic apparatus to execute the processes S2005 and S2008 in fig. 20, and the processes S2105 and S2108 in fig. 21. All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In other embodiments, fig. 28 shows a schematic structural diagram of a possible electronic device according to the above embodiments. The electronic device includes a processor 2801, a memory 2802, an input device 2803, an output device 2804, and a communication module 2805. The number of the processor 2801, the memory 2802, the input device 2803, the output device 2804, and the communication module 2805 may be one or more (one example is shown in fig. 28), and communication therebetween may be performed by a bus 2806.
The processor 2801 may specifically be the processor 110 shown in fig. 1. The processor 2801 may be used to control and manage the operation of the electronic device. For example, the processor 2801 may be a Central Processing Unit (CPU), a GPU, a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination that implements computing functions, for example a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.
The memory 2802 may specifically be the internal memory 121 and the external memory 120 shown in fig. 1. The memory 2802 may include high-speed Random Access Memory (RAM), and may also include non-volatile memory, such as magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
The input device 2803 may be a device that receives information input by a user, such as the microphone 170C or a touch sensor in the sensor module 180 shown in fig. 1.
The output device 2804 may be a device such as a display for displaying information input by a user, information provided to the user, and various menus of the electronic device, and the display may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. Output device 2804 may embody, for example, speaker 170A or display screen 194 as shown in fig. 1. Additionally, a touch sensor may be integrated with the display screen 194 for collecting touch events thereon or thereabout and transmitting the collected touch information to other devices (e.g., a processor, etc.).
The communication module 2805 may be a transceiver, a transceiver circuit, a communication interface, or the like, for example a Bluetooth device, a Wi-Fi device, or a peripheral interface. The communication module 2805 may specifically be the radio frequency module 150, the communication module 160, and the like shown in fig. 1.
In some embodiments, fig. 29 shows a schematic diagram of a possible structure of the server involved in the above embodiments, where the server includes: a communication unit 2901, a processing unit 2902, and a determination unit 2903.
The communication unit 2901 is for supporting the server to execute the processes S2005 and S2008 in fig. 20, and the processes S2105 and S2108 in fig. 21; the processing unit 2902 is configured to support the server to execute the process S2007 in fig. 20, and the process S2106 in fig. 21; the determination unit 2903 is used to support the server to execute the process S2006 in fig. 20, and the process S2107 in fig. 21. All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In other embodiments, fig. 30 shows a schematic diagram of a possible structure of the server according to the above embodiments. The server includes a processor 3001, a memory 3002, and a communication module 3003. The number of the processor 3001, the memory 3002, and the communication module 3003 may be one or more (one of each is shown as an example in fig. 30), and they may communicate with each other through a bus 3004.
The processor 3001 may be used to control and manage the operation of the server. For example, the processor 3001 may be a Central Processing Unit (CPU), a GPU, a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination that implements computing functions, for example a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.
The memory 3002 is used to store program codes and data for the server. For example, the memory 3002 may include high-speed Random Access Memory (RAM), and may also include non-volatile memory, such as magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
The communication module 3003 is used to support communication of the server with other network entities. For example, the communication module 3003 may be a transceiver, a transceiver circuit, a communication interface, or the like.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to perform all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product that is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, or the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (30)

  1. A method of verifying a user's identity, comprising:
    the electronic equipment runs a target application and opens a multi-user joint identity authentication function provided by the target application;
    the electronic equipment displays a first preview interface captured by a camera and prompts the input of face information of a plurality of preset users in the first preview interface;
    the electronic equipment acquires N pieces of face information to be verified from the first preview interface, wherein N is an integer greater than 1;
    and if each piece of face information to be verified in the N pieces of face information to be verified is the face information of a preset user, allowing the electronic equipment to continue to run the target application.
  2. The method according to claim 1, further comprising, before the electronic device opens the multi-user federated authentication function provided by the target application:
    the electronic equipment prompts a user to set the number of users for authenticating the user when the multi-user joint identity authentication is carried out;
    responding to the number N of users input by the users, displaying a second preview interface captured by the camera by the electronic equipment, and acquiring the face information of the N users from the second preview interface;
    and the electronic equipment takes the N users as N preset users, and establishes the corresponding relation between the target application and the face information of the N authenticated users.
  3. The method according to claim 2, further comprising, after the electronic device opens the multi-user federated authentication function provided by the target application:
    the electronic equipment acquires face information of N preset users corresponding to the target application;
    after the electronic device acquires N pieces of face information to be verified from the first preview interface, the method further includes:
    and the electronic equipment determines that each piece of face information to be verified is the face information of the preset user by comparing the N pieces of face information to be verified with the face information of the N preset users.
  4. The method according to claim 2 or 3, wherein after the electronic device obtains the face information of the N authenticated users from the second preview interface, the method further comprises:
    the electronic equipment records the sequence among the face information of the N preset users in the second preview interface;
    after the electronic device acquires N pieces of face information to be verified from the first preview interface, the method further includes:
    and the electronic equipment determines the sequence of the N pieces of face information to be verified in the first preview interface to be the same as the sequence of the N pieces of face information of the preset users in the second preview interface.
  5. The method according to any one of claims 1-4, wherein the electronic device opens a multi-user federated identity authentication function provided by the target application, comprising:
    the electronic equipment automatically opens the multi-user joint identity authentication function when running the target application; or,
    and when the electronic equipment runs the target application, responding to user input to open the multi-user joint identity authentication function.
  6. The method according to any one of claims 1-5, wherein the electronic device opens a multi-user federated identity authentication function provided by the target application, comprising:
    the electronic equipment opens the multi-user joint identity authentication function when starting the target application; or,
    and the electronic equipment opens the multi-user joint identity authentication function when running the first function provided by the target application.
  7. The method according to any one of claims 2 to 6, wherein if each piece of face information to be verified in the N pieces of face information to be verified is face information of a preset user, allowing the electronic device to continue running the target application includes:
    and if each piece of face information to be verified in the N pieces of face information to be verified is the face information of a preset user, allowing the electronic equipment to continue to operate the target application within a preset time.
  8. A method of verifying a user's identity, comprising:
    the method comprises the steps that the electronic equipment detects a first function of opening a target application by a user, wherein the first function is a function of multi-user joint identity authentication of N users, and N is an integer greater than 1;
    the electronic equipment displays a first input interface and prompts the input of identity information of N users in the first input interface;
    the electronic equipment acquires N pieces of identity information from the first input interface;
    the electronic equipment displays a first preview interface captured by a camera and prompts the N users to input face information in the first preview interface;
    the electronic equipment acquires N pieces of face information from the first preview interface;
    the electronic equipment sends a verification request to a server, wherein the verification request comprises the N pieces of identity information and the N pieces of face information, so that the server performs multi-user joint identity verification on the N pieces of users;
    and if a message of successful multi-user joint identity authentication sent by the server is received, the electronic equipment executes the first function.
  9. The method of claim 8, before the electronic device sends the authentication request to the server, further comprising:
    the electronic equipment displays the corresponding relation between the N pieces of identity information and the N pieces of face information and prompts a user to confirm that the first function is executed by using the N pieces of identity information and the N pieces of face information;
    wherein the electronic device sends a verification request to a server, comprising:
    and if the confirmation operation of the user on the N pieces of identity information and the N pieces of face information is detected, the electronic equipment sends the verification request to a server.
  10. The method according to claim 8 or 9, before the electronic device detects that the user opens the first function of the target application while running the target application, further comprising:
    the electronic equipment acquires identity information input by a user in a second input interface;
    the electronic equipment acquires face information input by a user in a second preview interface;
    the electronic equipment sends a registration request to the server, wherein the registration request comprises the identity information and the face information of the user, so that the server establishes the corresponding relationship between the identity information and the face information of the user.
  11. A method of verifying a user's identity, comprising:
    the method comprises the steps that a server receives a verification request sent by electronic equipment, wherein the verification request comprises N pieces of identity information and N pieces of face information to be verified;
    the server acquires N pieces of registered face information respectively corresponding to the N pieces of identity information, and the server stores the corresponding relation between the identity information of each registered user and the face information;
    the server determines that the N pieces of registered face information correspond to the N pieces of face information to be verified one by one;
    and the server sends a message of successful multi-user joint identity verification to the electronic equipment.
  12. The method of claim 11, before the server receives the authentication request sent by the electronic device, further comprising:
    the server receives a registration request sent by the electronic equipment, wherein the registration request comprises identity information and face information of a user;
    and the server establishes a corresponding relation between the identity information and the face information of the user.
  13. The method according to claim 12, before the server establishes the correspondence between the identity information and the face information of the user, further comprising:
    and the server queries a preset database to verify that the identity information of the user is correct identity information.
  14. The method according to claim 12 or 13, wherein the identity information of the user comprises avatar information of the user; after the server receives the registration request sent by the electronic device, the method further includes:
    and the server determines that the head portrait information in the identity information corresponds to the face information.
  15. An electronic device comprising a processor, and an input device, an output device, and a memory all coupled to the processor; wherein,
    the processor is configured to: running a target application and opening a multi-user joint identity authentication function provided by the target application;
    the output device is to: displaying a first preview interface captured by a camera, and prompting to input face information of a plurality of preset users in the first preview interface;
    the input device is to: acquiring N pieces of face information to be verified from the first preview interface, wherein N is an integer greater than 1;
    the processor is further configured to: and if each piece of face information to be verified in the N pieces of face information to be verified is the face information of a preset user, allowing the electronic equipment to continue to run the target application.
  16. The electronic device of claim 15,
    the output device is further to: prompting a user to set the number of users for authenticating the user when the multi-user joint identity authentication is performed;
    the input device is further to: receiving the number N of users input by the users;
    the output device is further to: displaying a second preview interface captured by the camera;
    the processor is further configured to: acquiring face information of N users from the second preview interface; and taking the N users as N preset users, and establishing a corresponding relation between the target application and the face information of the N authenticated users.
  17. The electronic device of claim 16,
    the processor is further configured to: acquiring face information of N preset users corresponding to the target application; and determining that each piece of face information to be verified is the face information of the preset user by comparing the N pieces of face information to be verified with the face information of the N preset users.
  18. The electronic device of claim 16 or 17,
    the processor is further configured to: recording the sequence of the face information of the N preset users in the second preview interface in the memory; and determining the sequence of the N pieces of face information to be verified in the first preview interface, wherein the sequence of the N pieces of face information to be verified in the first preview interface is the same as the sequence of the N pieces of face information of the preset users in the second preview interface.
  19. The electronic device of any of claims 16-18,
    the processor is specifically configured to: opening the multi-user joint identity authentication function when the target application is started; or the multi-user joint identity authentication function is opened when the first function provided by the target application is operated.
  20. The electronic device of any of claims 16-19,
    the processor is specifically configured to: and if each piece of face information to be verified in the N pieces of face information to be verified is the face information of a preset user, allowing the electronic equipment to continue to operate the target application within a preset time.
  21. An electronic device comprising a processor, and a communication module, an input device, an output device, and a memory all coupled to the processor; wherein,
    the input device is to: detecting a first function of opening a target application by a user, wherein the first function is a function of multi-user joint identity authentication of N users, and N is an integer greater than 1;
    the output device is to: the electronic equipment displays a first input interface and prompts the input of identity information of N users in the first input interface; displaying a first preview interface captured by a camera, and prompting the N users to input face information in the first preview interface;
    the processor is configured to: acquiring N pieces of identity information from the first input interface; acquiring N pieces of face information from the first preview interface; instructing the communication module to send an authentication request to a server, where the authentication request includes the N identity information and the N face information, so that the server performs multi-user joint identity authentication on the N users; and if a message of successful multi-user joint identity authentication sent by the server is received, executing the first function.
  22. The electronic device of claim 21,
    the output device is further to: displaying the corresponding relation between the N pieces of identity information and the N pieces of face information, and prompting a user to confirm that the first function is executed by using the N pieces of identity information and the N pieces of face information;
    the input device is further to: detecting the confirmation operation of the user on the N pieces of identity information and the N pieces of face information;
    the processor is specifically configured to: and if the confirmation operation of the user on the N pieces of identity information and the N pieces of face information is detected, indicating the communication module to send the verification request to the server.
  23. The electronic device of claim 21 or 22,
    the input device is further to: acquiring identity information input by a user in a second input interface; acquiring face information input by a user in a second preview interface;
    the processor is further configured to: and indicating the communication module to send a registration request to the server, wherein the registration request comprises the identity information and the face information of the user, so that the server establishes the corresponding relationship between the identity information and the face information of the user.
  24. A server comprising a processor, and a communication module and a memory both coupled to the processor; wherein,
    the communication module is configured to: receiving a verification request sent by electronic equipment, wherein the verification request comprises N pieces of identity information and N pieces of face information to be verified;
    the processor is configured to: acquiring N pieces of registered face information respectively corresponding to the N pieces of identity information, wherein the corresponding relation between the identity information of each registered user and the face information is stored in the memory; determining that the N registered face information correspond to the N face information to be verified one by one;
    the communication module is further configured to: and sending a message of successful multi-user joint identity verification to the electronic equipment.
  25. The server according to claim 24,
    the communication module is further configured to: receiving a registration request sent by the electronic equipment, wherein the registration request comprises identity information and face information of a user;
    the processor is further configured to: querying a preset database to verify that the identity information of the user is correct identity information; and establishing a corresponding relation between the identity information and the face information of the user.
  26. The server according to claim 25, wherein the identity information of the user comprises avatar information of the user;
    the processor is further configured to: and determining that the head portrait information in the identity information corresponds to the face information.
  27. A computer-readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform a method of verifying a user identity as claimed in any one of claims 1-7 or claims 8-10.
  28. A computer-readable storage medium having instructions stored thereon, which when run on a server, cause the server to perform a method of verifying a user identity according to any one of claims 11-14.
  29. A computer program product comprising instructions for causing an electronic device to perform the method of verifying the identity of a user according to any one of claims 1-7 or claims 8-10 when the computer program product is run on the electronic device.
  30. A computer program product comprising instructions for causing a server to perform the method of verifying the identity of a user according to any one of claims 11 to 14 when the computer program product is run on the server.
CN201880094835.4A 2018-12-21 2018-12-21 Method for verifying user identity and electronic equipment Pending CN112313661A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/122765 WO2020124579A1 (en) 2018-12-21 2018-12-21 Method for verifying user identity, and electronic device

Publications (1)

Publication Number Publication Date
CN112313661A true CN112313661A (en) 2021-02-02

Family

ID=71102482

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880094835.4A Pending CN112313661A (en) 2018-12-21 2018-12-21 Method for verifying user identity and electronic equipment

Country Status (2)

Country Link
CN (1) CN112313661A (en)
WO (1) WO2020124579A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113704727A (en) * 2021-07-02 2021-11-26 深圳市赛云数据有限公司 SIM card identity verification management and updating device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792188B2 (en) * 2020-08-05 2023-10-17 Bank Of America Corporation Application for confirming multi-person authentication
CN115550415B (en) * 2022-02-28 2023-08-04 荣耀终端有限公司 Device connection method and electronic device
CN114915486A (en) * 2022-06-02 2022-08-16 北京天融信网络安全技术有限公司 Identity authentication method, device, system, electronic equipment and medium
CN116702100B (en) * 2022-10-21 2024-04-16 荣耀终端有限公司 Authority management method and electronic equipment
CN116935479B (en) * 2023-09-15 2023-12-15 纬领(青岛)网络安全研究院有限公司 Face recognition method and device, electronic equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN200950262Y (en) * 2005-11-11 2007-09-19 北京数字奥森科技有限公司 Mobile terminal
US8145562B2 (en) * 2009-03-09 2012-03-27 Moshe Wasserblat Apparatus and method for fraud prevention
CN202197300U (en) * 2010-08-05 2012-04-18 北京海鑫智圣技术有限公司 Mobile face identification system
CN202210263U (en) * 2011-06-30 2012-05-02 汉王科技股份有限公司 Face recognition device
CN102945366B (en) * 2012-11-23 2016-12-21 海信集团有限公司 A kind of method and device of recognition of face
CN103580867A (en) * 2013-08-01 2014-02-12 百度在线网络技术(北京)有限公司 Trading method and trading system
CN106326862A (en) * 2016-08-25 2017-01-11 广州御银自动柜员机技术有限公司 Multi-face pickup device


Also Published As

Publication number Publication date
WO2020124579A1 (en) 2020-06-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination