CN114973428A - Biological information sharing method, electronic device and medium thereof

Info

Publication number: CN114973428A
Application number: CN202110207789.8A
Authority: CN (China)
Prior art keywords: biological information, user, electronic device
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 李实
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202110207789.8A (CN114973428A)
Priority to PCT/CN2021/143626 (WO2022179308A1)
Publication of CN114973428A

Classifications

    • G (Physics) > G06 (Computing; calculating or counting) > G06F (Electric digital data processing)
    • G06F 18/00 Pattern recognition
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 21/60 Protecting data
    • G06F 21/602 Providing cryptographic facilities or services
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G06F 21/6254 Protecting personal data by anonymising data, e.g. decorrelating personal data from the owner's identification
    • G (Physics) > G06 > G06V (Image or video recognition or understanding) > G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/70 Multimodal biometrics, e.g. combining information from different biometric modalities

Abstract

The application relates to the technical field of biometric identification, and in particular to a biological information sharing method, an electronic device and a medium thereof. The method includes: a first electronic device acquires first biological information, which the first electronic device uses for identity recognition; in response to a first operation, the first electronic device obtains second biological information based on the first biological information, the first electronic device being logged in to a system account before it detects the first operation; the first electronic device sends the second biological information to a server; a second electronic device is logged in to the same system account as the first electronic device; in response to a second operation, the second electronic device sends a biological information acquisition request to the server; the second electronic device receives the second biological information from the server; and the second electronic device obtains third biological information based on the second biological information, which the second electronic device uses for identity recognition, thereby realizing biological information sharing between devices.

Description

Biological information sharing method, electronic device and medium thereof
Technical Field
The present application relates to the field of biometric identification technologies, and in particular, to a biometric information sharing method, and an electronic device and medium thereof.
Background
At present, biometric technology is widely used in many application scenarios, one of which is identity verification on an electronic device (e.g., a mobile phone). For example, identity authentication is performed on a mobile phone using biological information such as a fingerprint or a facial image, so that the mobile phone, or an application on it, is unlocked or logged in after authentication succeeds. When a user replaces a mobile phone or starts using a tablet computer, one way for the new device to obtain biometric information is to share the biological information stored on the phone the user originally held with the replacement phone or the tablet computer. However, this approach requires the original phone storing the specified biological information to exist and be available, so the replacement phone depends strongly on the original phone. Once the original phone is lost, not nearby, or cannot be used normally, the biological information stored on it cannot be shared with the replacement phone. Thus, current technical solutions cannot share biometric information between devices when the device originally held by the user (e.g., an old device) is lost, not nearby, or cannot be used normally.
Disclosure of Invention
The embodiment of the application provides a biological information sharing method, electronic equipment and a medium thereof.
In a first aspect, an embodiment of the present application provides a biological information sharing method, where the method includes:
a first electronic device obtains first biological information, where the first biological information is used by the first electronic device for identity recognition;
in response to a first operation, the first electronic device obtains second biological information based on the first biological information, where the first electronic device is logged in to a system account before it detects the first operation;
the first electronic device sends the second biological information to a server;
a second electronic device logs in to the system account;
in response to a second operation, the second electronic device sends a biological information acquisition request to the server;
the second electronic device receives the second biological information from the server;
the second electronic device obtains third biological information based on the second biological information, where the third biological information is used by the second electronic device for identity recognition; and
the system account logged in to by the first electronic device and the system account logged in to by the second electronic device are the same.
In the embodiment of the present application, it can be understood that the first biological information and the second biological information may be the same, each being at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information. The second biological information and the third biological information may likewise be the same, each being at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information. The fingerprint feature information is obtained by feature extraction from the fingerprint information; the facial image feature information is obtained by feature extraction from the facial image information; and the iris feature information is obtained by feature extraction from the iris information.
Further, it is understood that the first biological information and the second biological information may be different, and the first biological information includes at least one of fingerprint information, fingerprint feature information, face image feature information, iris information, and iris feature information. The second biometric information may be obtained by encrypting the first biometric information. Correspondingly, the second biological information and the third biological information may also be different, where the second biological information includes at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information. The third biological information may be obtained by decrypting the second biological information.
Further, it is understood that the first biological information and the third biological information may be the same, each being at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information. The first biological information and the third biological information may also be different; for example, the first biological information is facial image information and the third biological information is facial image feature information.
It is understood that the first electronic device and the second electronic device suitable for the embodiments of the present application may be various devices having a biological information entry function, for example, a mobile phone, a computer, a laptop computer, a tablet computer, a television, and the like. The first electronic device and the second electronic device may be the same kind of product, for example, both the first electronic device and the second electronic device are mobile phones. The first electronic device and the second electronic device may also be different types of products, for example, the first electronic device is a mobile phone, and the second electronic device is a tablet computer.
The first operation may be the user selecting the "one-end entry, multi-end use" function for biological information on the operation interface of the first electronic device, for example, the user clicking an "OK" control corresponding to that function. The second operation may be the user selecting, on the operation interface of the second electronic device, the operation of sending the biological information acquisition request to the server, for example, the user clicking a "start acquisition" control corresponding to acquiring data for face recognition from the server.
According to the method and the device, biological information sharing between devices is realized based on the system account, and the user experience is improved.
In one possible implementation of the above aspect, the first biological information includes at least one of fingerprint information, fingerprint feature information, face image feature information, iris information, and iris feature information.
In one possible implementation of the foregoing aspect, the obtaining, by the first electronic device, of the second biological information based on the first biological information includes:
the first electronic device encrypting the first biological information to obtain the second biological information;
and the obtaining, by the second electronic device, of the third biological information based on the second biological information includes:
the second electronic device decrypting the second biological information to obtain the third biological information.
In the embodiment of the present application, for example, the first electronic device encrypts the facial image with a password to obtain an encrypted facial image. It can be understood that the password may be one known to the user, for example any one or more of the system account name of the first electronic device, the account password, and the screen-lock password of the first electronic device; the system account of the first electronic device may be a user-defined user name, for example the user's mobile phone number, and the screen-lock password of the first electronic device may be a user-defined character string or gesture. As another example, the password used by the first electronic device to encrypt the facial image may be a custom password, e.g., a custom name, character string, or gesture. Correspondingly, the second electronic device decrypts the encrypted facial image obtained from the server with the password that encrypted it, obtaining the facial image.
For another example, the first electronic device obtains a face feature by performing feature processing on the obtained face image, and then encrypts the face feature by using a password to obtain an encrypted face feature. Correspondingly, the second electronic equipment decrypts the encrypted facial features acquired from the server by using the password for encrypting the facial features to obtain the facial features.
For another example, the first electronic device performs desensitization processing on the face image to obtain a desensitized face image, performs feature processing on the desensitized face image to obtain desensitized face features, and encrypts the desensitized face features by using a password to obtain encrypted desensitized face features. The second electronic device decrypts the encrypted desensitized facial features obtained from the server with a password that encrypts the desensitized facial features to obtain the desensitized facial features, but is not limited thereto.
The password may be at least one of the screen-lock password of the first electronic device, the system account name, the account password, and a user-defined password, or a combination of several of these, but is not limited thereto.
It can be understood that the first electronic device may automatically encrypt the first biological information with the password to obtain the second biological information, or may encrypt the first biological information with a password entered by the user only after that password has been obtained.
Likewise, the second electronic device may automatically decrypt the second biological information with the password to obtain the third biological information, or, after obtaining a password entered by the user, may decrypt the second biological information with that password to obtain the third biological information.
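The embodiments above do not name a specific encryption algorithm. Purely as a minimal illustrative sketch (not the patented method itself), the following Python snippet assumes a symmetric key derived from the user-known password with PBKDF2 and used via Fernet from the `cryptography` package; all function and variable names are illustrative assumptions.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a Fernet-compatible key from a user-known password (PBKDF2-HMAC-SHA256)."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(password.encode("utf-8")))


def encrypt_biometric(first_info: bytes, password: str) -> bytes:
    """First device: turn the first biological information into the 'second' (encrypted) form."""
    salt = os.urandom(16)
    token = Fernet(derive_key(password, salt)).encrypt(first_info)
    return salt + token  # prepend the salt so the other device can re-derive the key


def decrypt_biometric(blob: bytes, password: str) -> bytes:
    """Second device: recover the 'third' biological information from the downloaded blob."""
    salt, token = blob[:16], blob[16:]
    return Fernet(derive_key(password, salt)).decrypt(token)
```

Here the salt is simply prepended to the ciphertext for illustration; an actual device would more likely rely on its secure key store and authenticated channels.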
In one possible implementation of the foregoing aspect, the obtaining, by the first electronic device, of the second biological information based on the first biological information includes:
the first electronic device performing desensitization processing, feature extraction processing, and encryption processing on the first biological information to obtain the second biological information;
and the obtaining, by the second electronic device, of the third biological information based on the second biological information includes:
the second electronic device decrypting the second biological information to obtain the third biological information;
and the second electronic device, during identity recognition, acquiring fourth biological information, performing desensitization processing and feature extraction processing on the fourth biological information to generate fifth biological information, and comparing the fifth biological information with the third biological information to perform identity recognition.
In the embodiment of the present application, the first biological information may include at least one of fingerprint information, face image information, and iris information. The second biological information may include at least one of desensitized fingerprint characteristic information, desensitized image characteristic information, and desensitized iris characteristic information.
The fourth biological information may include at least one of fingerprint information, face image information, iris information; the fifth biological information may include at least one of desensitization fingerprint characteristic information, desensitization image characteristic information, and desensitization iris characteristic information.
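As a rough sketch of how the comparison in this implementation could look, the snippet below assumes the fifth and third biological information are feature vectors and compares them with cosine similarity; `desensitize` and `extract_features` are hypothetical callables standing in for the device's desensitization and feature extraction modules, and the threshold is an arbitrary assumption.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def identify(raw_capture, third_info: np.ndarray, desensitize, extract_features,
             threshold: float = 0.8) -> bool:
    """Fresh capture -> desensitize -> extract 'fifth' features -> compare with 'third'."""
    fifth_info = extract_features(desensitize(raw_capture))
    return cosine_similarity(fifth_info, third_info) >= threshold
```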
In one possible implementation of the foregoing aspect, the encrypting, by the first electronic device, of the first biological information to obtain the second biological information includes:
the first electronic device displaying a first user interface, where the first user interface is used to instruct the user to input a password;
and the first electronic device encrypting the first biological information based on the password;
and the decrypting, by the second electronic device, of the second biological information to obtain the third biological information includes:
the second electronic device displaying a second user interface, where the second user interface is used to instruct the user to input the password;
and the second electronic device decrypting the second biological information based on the password.
In one possible implementation of the above aspect, the method further comprises:
and the second electronic equipment prompts the category information for identity recognition.
In this embodiment of the application, the second electronic device may further prompt the user with the category information of the biological information acquired from the server, so as to indicate which biological information collector needs to be triggered during identity recognition. For example, if the second electronic device obtains a facial image, it prompts the user that the category of the biological information obtained from the server is a facial image, so that the user knows that the collector to be triggered for identity recognition on the second electronic device is its camera.
For another example, if the second electronic device obtains a fingerprint, it prompts the user that the category of the biological information obtained from the server is a fingerprint, so that the user knows that the collector to be triggered for identity recognition is the fingerprint sensor of the second electronic device and that the finger should be placed on that sensor.
In one possible implementation of the foregoing aspect, the method further includes: the second electronic device acquiring the biological information of the current user;
and matching the biological information of the current user with the second biological information, and storing the second biological information if the two match as biological information of the same user.
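A minimal sketch of this match-then-store step, assuming the biological information has already been reduced to feature vectors and that a similarity threshold decides whether they belong to the same user; the names and the threshold value are assumptions, not part of the application.

```python
import numpy as np


def store_if_same_user(second_info: np.ndarray, current_user_info: np.ndarray,
                       save, threshold: float = 0.8) -> bool:
    """Persist the downloaded template only if it matches the current user's biometrics."""
    sim = float(np.dot(second_info, current_user_info) /
                (np.linalg.norm(second_info) * np.linalg.norm(current_user_info) + 1e-12))
    if sim >= threshold:
        save(second_info)  # e.g. write into the device's secure storage
        return True
    return False
```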
In a second aspect, an embodiment of the present application further provides a biological information sharing method, applied to an electronic device, where the method includes: acquiring first biological information, wherein the first biological information is used by the electronic equipment for identity recognition;
in response to a first operation, obtaining second biological information based on the first biological information, where the electronic device is logged in to a system account before it detects the first operation;
and sending the second biological information to a server.
In the embodiment of the present application, it can be understood that the first biological information and the second biological information may be the same, each being at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information. The fingerprint feature information is obtained by feature extraction from the fingerprint information; the facial image feature information is obtained by feature extraction from the facial image information; and the iris feature information is obtained by feature extraction from the iris information.
Further, it is understood that the first biological information and the second biological information may be different, and the first biological information includes at least one of fingerprint information, fingerprint feature information, face image feature information, iris information, and iris feature information. The second biological information may be obtained by encrypting the first biological information.
It is understood that the electronic device suitable for the embodiment of the present application may be various devices having a biological information entry function, for example, a mobile phone, a computer, a laptop computer, a tablet computer, a television, and the like.
The first operation may be the user selecting the "one-end entry, multi-end use" function for biological information on the operation interface of the electronic device, for example, the user clicking the "OK" control corresponding to that function.
According to the method and the device, biological information sharing between devices is realized based on the system account, and the user experience is improved.
In one possible implementation of the above aspect, the first biological information includes at least one of fingerprint information, fingerprint feature information, facial image feature information, iris information, and iris feature information.
In one possible implementation of the foregoing aspect, the obtaining of the second biological information based on the first biological information includes:
encrypting the first biological information to obtain the second biological information.
In one possible implementation of the foregoing aspect, the obtaining of the second biological information based on the first biological information includes:
performing desensitization processing, feature extraction processing, and encryption processing on the first biological information to obtain the second biological information.
In the embodiment of the present application, the first biological information may include at least one of fingerprint information, face image information, and iris information. The second biological information may include at least one of desensitized fingerprint characteristic information, desensitized image characteristic information, and desensitized iris characteristic information.
In one possible implementation of the foregoing aspect, the encrypting the first biological information to obtain the second biological information includes:
the electronic equipment displays a first user interface, wherein the first user interface is used for instructing a user to input a password;
the electronic device encrypts the first biological information based on the password.
In a third aspect, an embodiment of the present application further provides a biological information sharing method, applied to an electronic device, where the method includes: logging in to a system account;
in response to a first operation, sending a biological information acquisition request to a server;
receiving first biological information from the server;
and obtaining second biological information based on the first biological information, wherein the second biological information is used by the electronic equipment for identity recognition.
In the embodiment of the present application, it can be understood that the first biological information and the second biological information may be the same and are at least one of fingerprint information, fingerprint feature information, desensitization fingerprint feature information, facial image feature information, desensitization facial image feature information, iris feature information, and desensitization iris feature information.
Further, it is understood that the first biometric information and the second biometric information may not be the same, and the first biometric information may include at least one of encrypted fingerprint information, encrypted fingerprint characteristic information, encrypted desensitized fingerprint characteristic information, encrypted facial image characteristic information, encrypted desensitized facial image characteristic information, encrypted iris characteristic information, encrypted desensitized iris characteristic information. The second biological information may include at least one of fingerprint information, fingerprint feature information, desensitized fingerprint feature information, facial image feature information, desensitized facial image feature information, iris feature information, desensitized iris feature information.
It is understood that the electronic device suitable for the embodiment of the present application may be various devices having a biological information entry function, for example, a mobile phone, a computer, a laptop computer, a tablet computer, a television, and the like.
The first operation may be the user selecting, on the operation interface of the electronic device, the operation of sending the biological information acquisition request to the server, for example, the user clicking a "start acquisition" control corresponding to acquiring data for face recognition from the server.
According to the method and the device, biological information sharing between devices is realized based on the system account, and the user experience is improved.
In one possible implementation of the above aspect, the second biometric information includes at least one of fingerprint information, fingerprint feature information, desensitized fingerprint feature information, facial image feature information, desensitized image feature information, iris feature information, desensitized iris feature information.
In one possible implementation of the foregoing aspect, the obtaining of the second biological information based on the first biological information includes:
decrypting the first biological information to obtain the second biological information.
For example, the electronic device decrypts an encrypted facial image acquired from the server with the password that encrypted it, obtaining the facial image.
For another example, the electronic device decrypts the encrypted facial features obtained from the server with the password that encrypted them, obtaining the facial features.
For another example, the electronic device decrypts the encrypted desensitized facial features obtained from the server with the password that encrypted them, obtaining the desensitized facial features; however, the present application is not limited thereto.
The password may be at least one of the screen-lock password of the electronic device, the system account name, the account password, and a user-defined password, or a combination of several of these, but is not limited thereto.
It can be understood that the electronic device can automatically decrypt the first biological information by using the password to obtain the second biological information; or, in the case of acquiring the password input by the user, the password input by the user is used to decrypt the first biological information to obtain the second biological information.
In one possible implementation of the foregoing aspect, the obtaining of the second biological information based on the first biological information includes:
decrypting the first biological information to obtain the second biological information;
and the electronic device, during identity recognition, acquiring third biological information, performing desensitization processing and feature extraction processing on the third biological information to generate fourth biological information, and comparing the fourth biological information with the second biological information to perform identity recognition.
In this embodiment, the first biometric information may include at least one of encrypted desensitized fingerprint characteristic information, encrypted desensitized image characteristic information, and encrypted desensitized iris characteristic information. The second biological information may include at least one of desensitized fingerprint characteristic information, desensitized image characteristic information, and desensitized iris characteristic information.
The third biological information may include at least one of fingerprint information, face image information, iris information; the fourth biometric information may include at least one of desensitized fingerprint characteristic information, desensitized image characteristic information, and desensitized iris characteristic information.
In one possible implementation of the foregoing aspect, the decrypting the first biological information to obtain the second biological information includes:
displaying a second user interface for instructing a user to enter a password;
decrypting the first biometric information based on the password.
In one possible implementation of the above aspect, the method further comprises:
the electronic device prompting the user with the category information of the biological information used for identity recognition.
In the embodiment of the application, the electronic device may further prompt the user with the category information of the biological information acquired from the server, so as to indicate which biological information collector needs to be triggered during identity recognition. For example, if the electronic device acquires a facial image, it prompts the user that the category of the biological information acquired from the server is a facial image, so that the user knows that the collector to be triggered for identity recognition on the electronic device is its camera.
For another example, if the electronic device acquires a fingerprint, it prompts the user that the category of the biological information acquired from the server is a fingerprint, so that the user knows that the collector to be triggered for identity recognition is the fingerprint sensor of the electronic device and that the finger should be placed on that sensor.
In one possible implementation of the foregoing aspect, the method further includes: acquiring the biological information of the current user;
and matching the biological information of the current user with the second biological information, and storing the second biological information if the two match as biological information of the same user.
In an embodiment of the present application, the second biological information includes at least one of fingerprint information, fingerprint feature information, desensitization fingerprint feature information, facial image feature information, desensitization image feature information, iris feature information, and desensitization iris feature information. The biological information of the current user includes at least one of fingerprint information, fingerprint feature information, desensitization fingerprint feature information, face image feature information, desensitization image feature information, iris feature information, and desensitization iris feature information.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium having instructions stored thereon which, when executed on an electronic device, cause the electronic device to implement the method of any one of the possible implementations of the second aspect and the third aspect.
In a fifth aspect, an embodiment of the present application further provides an electronic device, including:
a memory for storing instructions; and
a processor configured to execute the instructions to implement the method of any one of the possible implementations of the second aspect and the third aspect.
Drawings
Fig. 1 illustrates an application scenario diagram of a biometric information sharing method according to some embodiments of the present application.
Fig. 2 shows a schematic flow diagram of sharing biometric information by the cell phone 100, according to some embodiments of the present application.
Fig. 3A to 3F illustrate schematic diagrams of an operation interface for facial data entry on the cell phone 100, according to some embodiments of the present application.
Fig. 4 is a schematic diagram illustrating a facial image processing procedure according to an embodiment of the present application.
Fig. 5 illustrates a flow diagram of a method of biometric information sharing, according to some embodiments of the present application.
Fig. 6A-6C illustrate an interface diagram for operating a cell phone 300 to obtain facial data from a server 200, according to some embodiments of the present application.
Fig. 7 illustrates a schematic diagram of an electronic device 300, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, a biological information sharing method and an electronic device and medium thereof.
Fig. 1 is a schematic diagram illustrating an application scenario of a biological information sharing method. The scenario includes an electronic device 100, a server 200, and at least one electronic device 300.
It is understood that the electronic device 100 and the electronic device 300 may be electronic devices capable of logging in to the same system account, where the system account includes a system account name and a password. The system account may be, but is not limited to, an Android system account, a HarmonyOS (HongMeng) system account, an iOS system account, and the like.
It is understood that the electronic device 100 and the electronic device 300 may be the same kind of product, for example, the electronic device 100 and the electronic device 300 are both mobile phones. The electronic device 100 and the electronic device 300 may also be different products, for example, the electronic device 100 is a mobile phone, and the electronic device 300 is a tablet computer.
In this scenario, after logging in to a system account, the electronic device 100 sends biometric information to the server 200, and the server 200 receives and stores the biometric information sent by the electronic device 100. The electronic device 300 can log in to the same system account as the electronic device 100 and can then obtain the biometric information from the server 200 to provide user identification services, which may include unlocking the screen of the electronic device 300 with the user's face, unlocking an application logged in on the electronic device 300 with the user's face, and authenticating the user during payment.
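The application does not specify the server-side interface; the following is only an illustrative in-memory stand-in for server 200, keyed by system account, to make the upload/download flow concrete. The class name, method names, and sample account are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class BiometricShareServer:
    """Stand-in for server 200: one encrypted biometric blob per system account."""
    _store: Dict[str, bytes] = field(default_factory=dict)

    def upload(self, system_account: str, encrypted_blob: bytes) -> None:
        # called by electronic device 100 after the user opts in to sharing
        self._store[system_account] = encrypted_blob

    def download(self, system_account: str) -> Optional[bytes]:
        # called by electronic device 300, which is logged in to the same account
        return self._store.get(system_account)


# usage: device 100 uploads, device 300 (same account) downloads
server = BiometricShareServer()
server.upload("user@example.com", b"...encrypted facial features...")
blob = server.download("user@example.com")
```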
It is to be understood that, in the embodiments of the present application, the biometric information is information used for verifying the identity of the user, such as, but not limited to, fingerprint information (hereinafter, simply referred to as a fingerprint), facial image information (hereinafter, simply referred to as a facial image), iris information (hereinafter, simply referred to as an iris), and the like.
Furthermore, it is understood that the electronic device 100 and the electronic device 300 suitable for the embodiments of the present application may be various devices having a biological information entry function, for example, a mobile phone, a computer, a laptop computer, a tablet computer, a television, and the like.
Further, it is understood that the server 200 may be a hardware server or may be embedded in a virtualized environment. For example, according to some embodiments of the present application, the server 200 may be a virtual machine executing on a hardware server that includes one or more other virtual machines, i.e., a cloud server.
The following further explains the technical solution of the present application by taking the electronic device 100 as the mobile phone 100, the electronic device 300 as the mobile phone 300, and the biological information as the facial image as an example.
The following describes a stage in which the mobile phone 100 acquires a face image of the user and transmits the face image to the server.
Fig. 2 shows a schematic flow chart of the sharing of biometric information by the mobile phone 100 according to some embodiments of the present application, and as shown in fig. 2, the method includes:
step 201: the mobile phone 100 detects a user operation for acquiring a face image, and acquires the face image of the user.
It can be understood that when the user performs the authentication on the mobile phone 100 through the biological information such as the facial image, the facial images at multiple angles need to be collected to improve the speed and accuracy of the authentication performed by the mobile phone. Therefore, in the embodiment of the present application, the facial image of the user may include at least one frame of facial image. The at least one frame of face image may be a face image of at least one angle, such as a front photograph, a side photograph, and the like of a human face.
It is understood that the facial image of the user may be captured by a device having a camera function, such as a camera of the cell phone 100.
In some embodiments, the handset 100 may also store the face image of the user in the event that the face image is acquired, the face image being used for user identification services and the like of the handset 100.
In some embodiments, the user may turn on the camera of the cell phone 100 in the system software to capture an image of the user's face. For example, fig. 3A to 3F are schematic diagrams illustrating an operation interface for a user to enter a facial image on the mobile phone 100 according to an embodiment of the present application.
As shown in Fig. 3A, the mobile phone 100 has logged in to the system software using the acquired system account and password and displays its settings interface. The settings interface of the mobile phone 100 includes a "biometric and password" control 10, through which the user can reach the settings for acquiring the user's biometric information. After the user clicks the "biometric and password" control 10, the mobile phone 100 detects this operation and enters the biometric and password setup interface. Alternatively, the user may select the "biometric and password" control 10 by voice, but the present application is not limited thereto.
As shown in fig. 3B, the handset 100 displays a biometric and password setup interface that includes controls for face recognition settings, e.g., a "face recognition" control 20.
The user may select a setting control for face recognition by a clicking operation, for example, the user clicks the "face recognition" control 20 to enter the face recognition interface. In addition, the user may select the "face recognition" control 20 by means of voice, but is not limited thereto.
As shown in fig. 3C, the mobile phone 100 displays a face recognition interface, and the user can select a face entry function through a click operation on the face recognition interface, for example, the user clicks the "start entry" control 30, and the mobile phone 100 detects the operation, and then the mobile phone 100 can start to enter facial data (for example, facial images) through the camera. In addition, the user may select the "start entry" control 30 by way of voice, but is not limited thereto.
Only the "start entry" control 30 is shown in fig. 3C, but in other embodiments, after the user clicks the "face recognition" control 20, the mobile phone 100 detects from the server 200 that the mobile phone 100 has stored a facial image in the server 200 before, and the face recognition interface in fig. 3C may also display prompt information for acquiring data for face recognition from the server and a "start acquisition" control corresponding to the prompt information.
Only the "start entry" control 30 is shown in fig. 3C, but in some other embodiments, after the user clicks the "face recognition" control 20, the face recognition interface in fig. 3C of the mobile phone 100 may further display prompt information for acquiring data for face recognition from the server and a "start acquisition" control corresponding to the prompt information. After the user clicks the "start acquiring" control, if the mobile phone 100 detects that the mobile phone 100 stores a facial image in the server 200 before the start of acquiring the facial image from the server 200, the acquired facial image is displayed, otherwise, the prompt information of the facial image corresponding to the system account of the mobile phone 100 does not exist in the server 200 is displayed. But is not limited thereto.
It is understood that the mobile phone 100 may store the entered facial image for the user's identification service, for example, the user unlocks the screen of the mobile phone 100 by the face, the user unlocks the application program logged into the mobile phone 100 by the face, the user's authentication during the payment process, and the like.
It is to be understood that, in other embodiments, a facial image entered by the user is already stored in the mobile phone 100 before the user clicks the "face recognition" control 20 on the operation interface of the mobile phone 100, so that the user can directly use the facial image stored on the mobile phone 100 without activating its facial image entry function.
Step 202: in a case where the mobile phone 100 detects a user operation to transmit a face image to the server 200, a face image transmission instruction for instructing the mobile phone 100 to transmit the face image to the server 200 is generated.
It will be appreciated that, in some embodiments, when one device (e.g., the mobile phone 100) enters the user's facial image, the system may prompt the user about the "one-end entry, multi-end use" function; if the user chooses to use this function, the user's facial image is sent to the server so that one or more other devices (e.g., the mobile phone 300) can obtain it from the server (e.g., the server 200).
For example, as shown in Fig. 3D, after the mobile phone 100 finishes capturing the facial image of the user, the mobile phone 100 displays a facial data entry interface that includes a prompt area 40 with the prompt information: "Use the 'one-end entry, multi-end use' function?". The prompt area 40 also includes an "OK" control 41 and a "Cancel" control 42 for the user to choose from. When the user clicks the "OK" control 41, the mobile phone 100 detects this operation and generates a facial image sending instruction.
Step 203: the cellular phone 100 transmits the face image information to the server 200.
It is understood that after the mobile phone 100 acquires the face image, the user may log in to system software (system application) by entering a system account and a password on the mobile phone 100 before the mobile phone 100 transmits the face image information to the server 200, and then operate in the system software to transmit the acquired face image of the user to the server 200.
Further, it is understood that in other embodiments, before the mobile phone 100 acquires the facial image, the user may input a system account and a password on the mobile phone 100, the mobile phone 100 acquires the system account and the password input by the user and logs in to system software (system application), and then operates in the system software to transmit the acquired facial image of the user to the server 200.
Step 204: the server 200 receives the face image transmitted from the cellular phone 100 and stores the face image.
Step 205: the server 200 sends a feedback message to the handset 100 that the face image was successfully received and stored.
Step 206: the handset 100 prompts the user with a message that the server 200 successfully received and stored the facial image.
For example, as shown in Fig. 3F, after the server 200 successfully stores the facial image, it sends the mobile phone 100 a prompt message indicating that storage succeeded, and the mobile phone 100 displays it; for example, the mobile phone 100 displays on a prompt interface: "Successfully stored on the server." It will be appreciated that the prompt interface shown in Fig. 3F may also include an "OK" control 60; after the user finishes reading the prompt information and clicks the "OK" control 60, the prompt information disappears.
In addition, in some other embodiments, the mobile phone 100 detects a user operation for acquiring a face image, and acquires the face image of the user. The mobile phone 100 performs feature extraction on the face image to obtain facial features (for example, the mobile phone 100 performs feature extraction on the face image through a facial feature extraction module to obtain the facial features), and when the mobile phone 100 detects a user operation of sending the facial features to the server 200, a facial feature sending instruction is generated, where the facial feature sending instruction is used to instruct the mobile phone 100 to send the facial features to the server 200. The handset 100 sends the facial feature information to the server 200. The server 200 receives the facial features transmitted by the handset 100 and stores the facial features. The server 200 sends a feedback message to the handset 100 that the facial features were successfully received and stored. The handset 100 prompts the user with a message that the server 200 successfully received and stored the facial features.
In addition, in some other embodiments, the mobile phone 100 detects a user operation for acquiring a facial image and acquires the facial image of the user. The mobile phone 100 performs desensitization processing on the facial image to obtain a desensitized facial image, and then performs feature extraction on the desensitized facial image to obtain desensitized facial features (for example, through a facial feature extraction module). When the mobile phone 100 detects a user operation of sending the desensitized facial features to the server 200, it generates a desensitized facial feature sending instruction, which instructs the mobile phone 100 to send the desensitized facial features to the server 200. The mobile phone 100 sends the desensitized facial features to the server 200. The server 200 receives the desensitized facial features sent by the mobile phone 100 and stores them. The server 200 sends the mobile phone 100 a feedback message that the desensitized facial features were successfully received and stored. The mobile phone 100 prompts the user with a message that the server 200 successfully received and stored the desensitized facial features.
In the embodiment of the present application, in view of the fact that distributed terminal interconnection has become a trend of future development, a facial image of a user is stored on one device (for example, the server 200) and can be acquired by the user's other devices. In this way, when the device originally held by the user (e.g., the mobile phone 100) is lost, not nearby, or cannot be used normally, the entered biological information (e.g., the facial image) can still be securely synchronized through the server 200 to a trusted device newly held by the user (e.g., the mobile phone 300), saving the user from re-entering the facial image on the new device. This realizes the "one-end entry, multi-end use" biological information sharing technique and improves the user experience.
In addition, in some other embodiments, when the mobile phone 100 transmits biometric information (e.g., the user's facial image) to the server 200, the information might be intercepted by other devices and the user's real biometric information (e.g., the real facial image) leaked. Therefore, the mobile phone 100 encrypts the biological information (e.g., the facial image) before sending it to the server 200, which avoids such leakage to some extent: even if the information is intercepted by a third-party device while the mobile phone 100 sends the encrypted biological information to the server 200, the user's real biological information (e.g., the real facial image) is not exposed. This realizes sharing of encrypted biological information among distributed multi-terminal devices while ensuring the security of the user's data (e.g., the real facial image).
In some embodiments, the mobile phone 100 detects a user operation for acquiring a facial image and acquires the facial image of the user. When the mobile phone 100 detects a user operation of sending the facial image to the server 200, the mobile phone 100 encrypts the facial image with a password to obtain an encrypted facial image and generates a facial image sending instruction, which instructs the mobile phone 100 to send the encrypted facial image to the server 200. The mobile phone 100 sends the encrypted facial image to the server 200. The server 200 receives the encrypted facial image sent by the mobile phone 100 and stores it. The server 200 sends the mobile phone 100 a feedback message that the encrypted facial image was successfully received and stored. The mobile phone 100 prompts the user with a message that the server 200 successfully received and stored the facial image.
It is to be understood that the password may be a password known by the user, for example, any one or more of a system account number of the mobile phone 100, a password, and a screen locking password of the mobile phone 100; the system account of the mobile phone 100 may be a user-defined user name, such as a mobile phone number of the user. The lock screen password of the mobile phone 100 can be a user-defined character or gesture.
As another example, the password for the cell phone 100 to encrypt the encrypted facial image may be a custom password, such as a custom name, a character, a custom gesture.
Since a password known to the user is generally known only to that user, the user can decrypt the encrypted facial image with it on any of the user's devices.
It can be understood that the mobile phone 100 can automatically encrypt the face image by using the password to obtain the encrypted face image; or, in the case of acquiring the password input by the user, the face image may be encrypted by the password input by the user to obtain the encrypted face image.
For example, as shown in Fig. 3D, after the mobile phone 100 finishes capturing the facial image of the user, the mobile phone 100 displays a facial data entry interface that includes a prompt area 40, and the prompt area 40 further includes a password input box 41 for the user to enter a password. The prompt area 40 also includes an "OK" control 41 and a "Cancel" control 42 for the user to choose from. In addition, in other embodiments, the prompt area 40 may also include the prompt information: "Please enter a password".
In some embodiments, the mobile phone 100 detects a user operation to acquire a face image, and acquires the face image of the user. In the case where the mobile phone 100 detects a user operation to transmit a face image to the server 200, the mobile phone 100 obtains a face feature by performing feature processing on the face image, then encrypts the face feature with a password to obtain an encrypted face feature, and generates a face image transmission instruction for instructing the mobile phone 100 to transmit the encrypted face feature to the server 200. The handset 100 sends the encrypted facial features to the server 200. The server 200 receives the encrypted facial features transmitted by the handset 100 and stores the encrypted facial features. The server 200 sends a feedback message to the handset 100 that the encrypted facial features were successfully received and stored. The handset 100 prompts the user with a message that the server 200 successfully received and stored the encrypted facial features.
In some embodiments, the mobile phone 100 detects a user operation to acquire a face image, and acquires the face image of the user. Under the condition that the mobile phone 100 detects the user operation of sending the face image to the server 200, the mobile phone 100 performs desensitization processing on the face image to obtain a desensitized face image, performs feature processing on the desensitized face image to obtain desensitized face features, encrypts the desensitized face features by using passwords to obtain encrypted desensitized face features, generates a face image sending instruction, and the face image sending instruction is used for instructing the mobile phone 100 to send the encrypted desensitized face features to the server 200. The handset 100 sends the encryption desensitized facial features to the server 200. The server 200 receives the encryption desensitized facial features transmitted by the handset 100 and stores the encryption desensitized facial features. The server 200 sends a feedback message to the handset 100 that the encrypted desensitized facial features were successfully received and stored. The handset 100 prompts the user for a message that the server 200 successfully received and stored the encrypted desensitized facial features.
It can be understood that desensitization processing, that is, data desensitization, refers to deforming sensitive information according to desensitization rules so as to reliably protect sensitive private data. Because desensitization deforms, and thereby hides, the sensitive content within the data, data desensitization technology can protect the data as a whole.
Fig. 4 is a schematic diagram illustrating the processing procedure of a facial image. As shown in fig. 4, before desensitization the facial image is the user's real facial image, in which the face is clear and relatively complete. The desensitized facial image obtained after desensitization hides this clear and complete facial information, so the user's real facial image cannot easily be recovered from the desensitized facial image, which ensures the security of the user's real facial image to a certain extent.
Specifically, the mobile phone 100 performs desensitization processing on the user's facial image to obtain a desensitized facial image of the user. The mobile phone 100 extracts the user's desensitized facial features from the desensitized facial image and stores them. The mobile phone 100 then encrypts the desensitized facial features and sends the encrypted desensitized facial features to the server 200.
For example, with continued reference to fig. 4, the facial image capture module 401 of the mobile phone 100 is configured to capture facial images and may be, for example, a camera of the mobile phone 100.
The mobile phone 100 further includes a desensitization module 402 configured to perform desensitization processing on the user's facial image through a desensitization algorithm to obtain a desensitized facial image of the user. The desensitization module 402 may also be called an image scrambling/encryption module; it desensitizes pictures containing personal sensitive information, so the risk of privacy disclosure is reduced starting at the end side (the mobile phone 100).
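As one possible concrete desensitization rule for the desensitization module 402, the following Python sketch pixelates the image so that the clear, complete face can no longer be recovered; pixelation is only an illustrative assumption, since the application does not fix a particular desensitization algorithm.

```python
# Minimal sketch of a desensitization rule: replace each block x block tile
# with its mean value (pixelation), deforming the sensitive facial detail
# while keeping coarse structure for later feature extraction.
import numpy as np

def desensitize(face_image: np.ndarray, block: int = 8) -> np.ndarray:
    """Pixelate a grayscale or color face image (illustrative rule only)."""
    h, w = face_image.shape[:2]
    out = face_image.astype(np.float32)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[...] = tile.mean(axis=(0, 1), keepdims=True)  # flatten detail in the tile
    return out.astype(face_image.dtype)
```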
It is to be appreciated that, in some embodiments, the desensitized facial features may include a desensitized facial feature vector extracted from the desensitized facial image, which may characterize the user's skin tone, facial features (eyes, nose, mouth, ears), eyebrows, face shape, and so forth. The desensitized facial feature vector is a one-dimensional or multi-dimensional feature vector, that is, it contains features in one or more dimensions.
In some embodiments, feature vector extraction may be performed on the desensitized face images by a Local Binary Patterns (LBP) algorithm, a Histogram of Oriented Gradients (HOG) algorithm, a Haar-like algorithm, or the like, to obtain desensitized face feature vectors corresponding to each desensitized face image.
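For instance, a desensitized facial feature vector could be obtained with the LBP algorithm mentioned above roughly as in the following sketch, which assumes scikit-image; the parameter values are illustrative choices rather than values given by this application.

```python
# Hedged sketch: LBP histogram as a one-dimensional feature vector for a
# desensitized (grayscale) face image. Parameters are illustrative only.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_feature_vector(desensitized_face: np.ndarray,
                       points: int = 8, radius: int = 1) -> np.ndarray:
    """Return a normalized uniform-LBP histogram for a 2-D grayscale image."""
    lbp = local_binary_pattern(desensitized_face, points, radius, method="uniform")
    # Uniform LBP yields values in [0, points + 1], hence points + 2 histogram bins.
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2))
    return hist.astype(np.float32) / max(hist.sum(), 1)
```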
Continuing to refer to fig. 4, as shown in fig. 4, the mobile phone 100 further includes a desensitization facial image feature extraction module 403, where the desensitization facial image feature extraction module 403 is configured to perform feature extraction on the desensitization facial image to extract desensitization facial features in the desensitization facial image.
The neural network model used in the desensitized facial image feature extraction module 403 is generated by training on a data set of desensitized facial images. In the training phase, the desensitized facial image training data set is input into the desensitized facial image feature extraction module 403 until desensitized facial image features meeting a preset condition are generated. The preset condition may be that the number of training iterations reaches a preset number, or that the loss function falls below a preset value during training, but is not limited thereto.
In the application stage, desensitized facial images are input into the trained desensitized facial image feature extraction module 403 to obtain desensitized facial features. In this way, the neural network model used in existing end-side devices, which is trained on original images, is replaced with a neural network model trained on desensitized facial images.
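A training loop for such a feature extraction model might be sketched as follows, assuming PyTorch and a triplet-style embedding loss; the network structure, single-channel input, loss function, and stopping thresholds are illustrative assumptions only, not details disclosed by this application.

```python
# Illustrative training sketch for a feature extractor trained on desensitized
# face images. Stops when a preset epoch count is reached or the loss falls
# below a preset value, matching the preset conditions described above.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    def __init__(self, embedding_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # single-channel input (assumption)
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embedding_dim),
        )

    def forward(self, x):
        return self.net(x)

def train(model, loader, max_epochs: int = 50, loss_threshold: float = 0.05):
    """Train on triplets of desensitized images until a preset condition is met."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.TripletMarginLoss()
    for epoch in range(max_epochs):                    # preset number of iterations
        for anchor, positive, negative in loader:      # triplets of desensitized images
            loss = criterion(model(anchor), model(positive), model(negative))
            opt.zero_grad()
            loss.backward()
            opt.step()
        if loss.item() < loss_threshold:               # preset loss value reached
            break
```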
It will be appreciated that after the handset 100 has obtained desensitized facial features, the handset 100 may store the desensitized facial features as a reference for facial recognition of the user.
With continued reference to fig. 4, as shown in fig. 4, the handset 100 also includes a memory 404, the memory 404 being used to store desensitized facial features of the user.
It will be appreciated that the mobile phone 100 may use a password to encrypt the desensitized biometric information. The password may be one already known to the user, for example any one or more of the system account number of the mobile phone 100, the account password, and the lock screen password of the mobile phone 100. The system account of the mobile phone 100 may be a user-defined user name, such as the user's mobile phone number, and the lock screen password of the mobile phone 100 may be a user-defined character string or gesture.
As another example, the password used by the mobile phone 100 for encryption may be a custom password, such as a custom name, a character string, or a custom gesture.
Since a user-known password is generally known only to the user, the user can decrypt the encrypted information with that password on any of the user's devices. For example, when the user enters a facial image to generate the desensitized facial features, the mobile phone 100 may prompt the user to set a custom password (e.g., a custom name, characters, or a custom gesture) for encrypting the desensitized facial features.
It is to be understood that, in the embodiment of the present application, various encryption algorithms may be used to encrypt the desensitized facial feature template. For example, a symmetric block cipher may be used, so that the user can decrypt the encrypted desensitized facial feature template with the same password on the mobile phone 300. Symmetric block ciphers include, but are not limited to, the AES (Advanced Encryption Standard) algorithm, the SM4 algorithm, the SIMON algorithm, the SPECK algorithm, and the like.
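For example, encrypting the desensitized facial feature template with AES in GCM mode (one of the symmetric options listed above; SM4, SIMON, or SPECK would follow the same pattern) might look like the following sketch. It reuses the hypothetical derive_key() helper from the earlier key-derivation sketch, and the salt-nonce-ciphertext layout is an illustrative choice, not a format defined by this application.

```python
# Hedged sketch: password-based symmetric encryption of the desensitized
# facial feature template. derive_key() is the PBKDF2 sketch shown earlier.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_template(template: bytes, user_known_password: str) -> bytes:
    salt = os.urandom(16)
    key = derive_key(user_known_password, salt)    # same password-derived key on both devices
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, template, None)
    return salt + nonce + ciphertext               # everything the other device needs to decrypt
```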
It will be appreciated that, in some embodiments, after one device (e.g., the mobile phone 100) has generated the user's desensitized facial features, the system may ask the user whether to use the "one-end entry, multi-end use" biometric function. If the user chooses to use this function, the device encrypts the desensitized facial features and sends the encrypted desensitized facial features to the server, so that another device (e.g., the mobile phone 300) can obtain the user's desensitized facial features from the server.
For example, as shown in fig. 3D, after the mobile phone 100 finishes capturing the facial image of the user, the mobile phone 100 displays a facial data entry interface that includes a prompt area 40 with the prompt information: "Use the 'one-end entry, multi-end use' function?". The prompt area 40 also includes an "OK" control 41 and a "Cancel" control 42 for the user to select. When the user clicks the "OK" control 41, the mobile phone 100 detects this operation and displays an interface instructing the user to enter a password for encrypting the desensitized facial features.
As shown in fig. 3E, the interface includes a prompt area 50, and the prompt area 50 further includes a password input box 51 for the user to input a password. The prompt area 50 also includes an "OK" control 52 and a "Cancel" control 53 for the user to select.
Further, in other embodiments, the prompt area 50 may also include the prompt information: "Please enter password".
Taking the system account number and password of the mobile phone 100 as an example, after the user enters the system account number and password of the mobile phone 100 and the lock screen password of the mobile phone 100 in the password input box 51 and clicks the "OK" control 52, the mobile phone 100 responds to this operation, encrypts the desensitized facial features using the system account number and password of the mobile phone 100 together with the lock screen password of the mobile phone 100 to obtain the encrypted desensitized facial features, and generates an instruction for sending the encrypted desensitized facial features to the server 200. It will be appreciated that if the user clicks the "Cancel" control 53, the mobile phone 100 closes the entry interface and the interface disappears.
In some embodiments, the server 200 performs integrity verification on the encrypted desensitized facial features before storing them, where the verification operation may check, for example, whether the data packet carrying the encrypted desensitized facial features and the storage instruction is damaged or whether data has been lost. For example, as shown in fig. 3F, after successfully storing the encrypted desensitized facial features, the server 200 sends the mobile phone 100 a prompt message indicating that storage was successful, and the mobile phone 100 displays, for example, the prompt: "Successfully stored on the server".
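The server-side integrity verification could, for instance, be sketched as follows; the assumption that the client sends a digest and a declared length alongside the payload is purely illustrative, and the field names are placeholders rather than anything prescribed by this application.

```python
# Illustrative server-side integrity check before storing the encrypted,
# desensitized facial features: detect a damaged or truncated data packet.
import hashlib

def verify_and_store(payload: bytes, declared_sha256: str, declared_length: int,
                     store) -> bool:
    if len(payload) != declared_length:                          # data lost in transit
        return False
    if hashlib.sha256(payload).hexdigest() != declared_sha256:   # packet damaged
        return False
    store(payload)                                               # store only after verification
    return True
```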
In the embodiment of the present application, in view of the fact that distributed terminal interconnection has become a trend of future development, the user's encrypted desensitized biological information (e.g., encrypted desensitized facial features) is stored in a device such as the server 200 and can be acquired by the user's other devices. In this way, if a device originally held by the user (e.g., the mobile phone 100) is lost, not nearby, or cannot be used normally, the originally held device on which the biological information (e.g., facial images) was entered can, through the server 200, securely synchronize the encrypted desensitized biological information (e.g., the encrypted desensitized facial features) to a trusted device newly held by the user (e.g., the mobile phone 300), so the user does not have to re-enter biological information on the newly held device. This realizes the "one-end entry, multi-end use" biological information sharing technique and improves the user experience.
Next, the stage of the mobile phone 300 acquiring the face image from the server 200 will be described.
After the mobile phone 100 transmits the facial image to the server 200, another terminal held by the user, such as the mobile phone 300, may acquire the facial image from the server 200 provided it logs in to the same system account as the mobile phone 100. Still taking the mobile phone 300 as an example, the technical solution by which the mobile phone 300 obtains the facial image from the server 200 is further described below.
When the user uses the mobile phone 300 and wants to acquire the facial image from the server 200, the mobile phone 300 first logs in to the system software (system application) with the system account and password, and then acquires the biological information.
Fig. 5 shows a flow diagram of a biological information sharing method according to some embodiments of the present application, and as shown in fig. 5, the method specifically includes:
step 501: the mobile phone 300 detects a user operation for acquiring a face image, and generates a face image acquisition request.
Step 502: the handset 300 transmits a face image acquisition request to the server 200. Wherein the face image acquisition request is for requesting acquisition of a face image from the server 200.
For example, fig. 6A to 6C are schematic diagrams illustrating an operation interface for operating the mobile phone 300 to acquire face data from the server 200 according to an embodiment of the present application.
As shown in fig. 6A, the mobile phone 300 displays a face recognition interface that includes a "start acquisition" control 70. Assuming that the user chooses to acquire data for face recognition from the server 200, for example by clicking the "start acquisition" control 70, the mobile phone 300 generates a facial image acquisition request in response to this user operation.
Step 503: the server 200 transmits the face image to the handset 300.
Step 506: the cellular phone 300 stores the face image acquired from the server 200 for the user identification service.
The mobile phone 300 may store the received facial image in a secure storage area to ensure the security of data storage. The secure storage area may be a secure area in the chip, and applications, the memory, the screen, and the like in the mobile phone access this secure area only through very strict security authentication. For example, the secure storage area may be provided by the Trusted Execution Environment operating system (TEE OS) of a Huawei chip.
As shown in fig. 6B, when the mobile phone 300 acquires a face image for face recognition from the server, the mobile phone 300 displays prompt information: "acquired biological information! ".
In addition, the mobile phone 300 may prompt the user with the category information of the biological information acquired from the server, so as to indicate which biological information collector needs to be triggered for identity recognition. For example, if the mobile phone 300 acquires a facial image, the mobile phone 300 prompts the user that the category of the biological information acquired from the server is a facial image, so the user knows that the collector the mobile phone 300 needs to trigger for subsequent identity recognition is the camera of the mobile phone 300.
For another example, if the mobile phone 300 acquires a fingerprint, the mobile phone 300 prompts the user that the category of the biological information acquired from the server is a fingerprint, so the user knows that the collector the mobile phone 300 needs to trigger for subsequent identity recognition is the fingerprint sensor of the mobile phone 300, and that the user should place a finger on the fingerprint sensor for identity recognition.
Further, it is understood that the user knows the category of the acquired biological information before the cellular phone 300 acquires the biological information from the server 200. For example, the user needs the mobile phone 300 to acquire a face image from the server 200, the user selects an operation of acquiring the face image on the mobile phone 300, and the mobile phone 300 transmits a face image acquisition request to the server 200 in response to this, thereby acquiring the face image from the server 200.
For another example, the user needs the mobile phone 300 to acquire the fingerprint from the server 200, the user selects the operation of acquiring the fingerprint on the mobile phone 300, and the mobile phone 300 sends a fingerprint acquisition request to the server 200 in response to the operation, so as to acquire the fingerprint from the server 200.
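A biological information acquisition request that carries the category information (facial image, fingerprint, and so on) might be sketched as follows; the endpoint, query parameters, and account field are hypothetical placeholders introduced only for illustration and are not defined by this application.

```python
# Illustrative sketch of a category-aware acquisition request sent by the
# mobile phone 300 to the server. All names below are placeholders.
import requests

def request_biometric(server_url: str, account: str, category: str) -> bytes:
    """Fetch the stored biological information of the given category for this account."""
    resp = requests.get(
        f"{server_url}/biometrics",
        params={"account": account, "category": category},  # e.g. "face_image" or "fingerprint"
        timeout=10,
    )
    resp.raise_for_status()
    return resp.content      # e.g., the encrypted desensitized facial features

# Usage (hypothetical): request_biometric("https://server.example", "user-account", "face_image")
```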
It is understood that the user identification service may include unlocking a screen of the mobile phone 300 by a user's face, unlocking an application program in the mobile phone 300 by the user's face, and authenticating the user during a payment process.
In addition, in some other embodiments, when the server 200 transmits biological information (e.g., the user's facial image) to the mobile phone 300, the information might be stolen by other devices and the user's real biological information (e.g., the user's real facial image) might be leaked. Therefore, the server 200 sends encrypted biological information (e.g., the user's encrypted facial image) to the mobile phone 300, which can avoid leakage of the user's biological information to some extent: even if the information is stolen by a third-party device while the server 200 is sending it to the mobile phone 300, the user's real biological information (e.g., the real facial image) is not leaked. In this way, the function of sharing encrypted biological information among distributed multi-terminal devices is realized while the security of the user's data (e.g., the user's real facial image) is ensured.
In some embodiments, the mobile phone 300 detects a user operation for acquiring desensitized facial features and generates a desensitized facial feature acquisition request. The mobile phone 300 sends the desensitized facial feature acquisition request to the server 200, where the request is used to request the desensitized facial features from the server 200. The server 200 sends the desensitized facial features to the mobile phone 300, and the mobile phone 300 stores the desensitized facial features for the user identification service.
In some embodiments, the mobile phone 300 detects a user operation to acquire a facial feature, and generates a facial feature acquisition request. The handset 300 sends a facial feature acquisition request to the server 200. Wherein the facial feature acquisition request is for requesting acquisition of facial features from the server 200. The server 200 sends the facial features to the handset 300. The handset 300 stores facial features for use in a user identification service.
In some embodiments, the mobile phone 300 detects a user operation to acquire an encrypted face image, and generates an encrypted face image acquisition request. The handset 300 transmits an encrypted face image acquisition request to the server 200. Wherein the encrypted face image acquisition request is for requesting acquisition of an encrypted face image from the server 200. The server 200 transmits the encrypted face image to the handset 300. The mobile phone 300 decrypts the encrypted face image with the password for encrypting the face image to obtain the face image, and the mobile phone 300 stores the face image for the user identification service.
In some embodiments, the handset 300 detects a user operation to acquire an encrypted facial feature, and generates an encrypted facial feature acquisition request. The handset 300 transmits an encrypted facial feature acquisition request to the server 200. Wherein the encrypted facial feature acquisition request is for requesting acquisition of the encrypted facial feature from the server 200. The server 200 sends the encrypted facial features to the handset 300. The mobile phone 300 decrypts the encrypted facial features with the password for encrypting the facial features to obtain the facial features, and the mobile phone 300 stores the facial features for the user identification service.
In some embodiments, the handset 300 detects a user operation to acquire encryption-desensitized facial features, generating an encryption-desensitized facial feature acquisition request. The handset 300 sends an encryption desensitization facial feature acquisition request to the server 200. Wherein the encryption desensitization facial feature acquisition request is for requesting acquisition of encryption desensitization facial features from the server 200. The server 200 sends the encryption desensitized facial features to the handset 300. The mobile phone 300 decrypts the encrypted desensitized facial features with the password for encrypting the desensitized facial features to obtain the desensitized facial features, and the mobile phone 300 stores the desensitized facial features for the user identification service.
For example, fig. 6A to 6C are schematic diagrams illustrating an operation interface for a user to operate the mobile phone 300 to obtain face data from the server 200 according to an embodiment of the present application.
As shown in fig. 6A, the mobile phone 300 displays a face recognition interface that includes a "start acquisition" control 70. Assuming that the user chooses to retrieve data for facial recognition from the server 200, for example by clicking the "start acquisition" control 70, the mobile phone 300 generates a desensitized facial feature acquisition request in response to this user operation.
As shown in fig. 6B, after the handset 300 acquires desensitized facial features for facial recognition from the server, the handset 300 displays a prompt: "acquired biological information! "
It can be understood that the mobile phone 300 may automatically decrypt the encrypted facial image with the password to obtain the facial image; alternatively, after acquiring a password input by the user, it may decrypt the encrypted facial image with that password to obtain the facial image.
For example, as shown in fig. 6C, when the mobile phone 300 receives the encrypted desensitized facial features sent by the server 200, the mobile phone 300 displays an acquisition interface that includes a prompt area 80, and the prompt area 80 includes a password input box 81 for the user to input a password. The prompt area 80 also includes an "OK" control 82 and a "Cancel" control 83 for the user to select. In addition, in other embodiments, the prompt area 80 may also include the prompt information: "Please enter password!".
It is to be appreciated that the user may enter a password on the handset 300 that encrypts the encryption-desensitized biometric information to decrypt the encryption-desensitized biometric information.
For example, after the user enters the system software account number and password of the mobile phone 100 and the lock screen password of the mobile phone 100 in the password input box 81 and clicks the "OK" control 82, the mobile phone 300, in response to this operation, decrypts the encrypted desensitized facial features using the system software (system application) account number and password of the mobile phone 100 together with the lock screen password of the mobile phone 100 as the password, thereby obtaining the desensitized facial features.
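The corresponding decryption on the mobile phone 300 side could be sketched as follows, mirroring the earlier encryption sketch; derive_key() and the salt-nonce-ciphertext layout are the same illustrative assumptions as before, not a format specified by this application.

```python
# Counterpart sketch: the newly held device re-derives the key from the same
# user-known password and decrypts the encrypted desensitized facial features.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def decrypt_template(blob: bytes, user_known_password: str) -> bytes:
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]   # layout assumed above
    key = derive_key(user_known_password, salt)                   # same PBKDF2 sketch as before
    # AES-GCM authentication fails (raises) if the password/key is wrong or the data was tampered with.
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```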
It is understood that the user identification service may include unlocking a screen of the mobile phone 300 by a human face, unlocking an application program logged in the mobile phone 300 by a human face, authentication during payment, and the like.
The mobile phone 300 may store the received and decrypted desensitized facial features in a secure storage area to ensure the security of data storage. The secure storage area may be a secure area in the chip, and applications, the memory, the screen, and the like in the mobile phone access this secure area only through very strict security authentication. For example, the secure storage area may be provided by the Trusted Execution Environment operating system (TEE OS) of a Huawei chip.
In some embodiments of the present application, after the mobile phone 300 acquires the facial image from the server 200, the user of the mobile phone 300 may be further authenticated to ensure that the user corresponding to the facial image acquired from the server 200 and the user of the mobile phone 100 are the same person. This prevents another user who has learned the system account number of the mobile phone 100, the system software (system application) password of the mobile phone 100, and the lock screen password of the mobile phone 100 from stealing the facial data of the user holding the mobile phone 100, further reducing the risk that the user's facial data is stolen. This is described in detail below with reference to the accompanying drawings.
Step 501: the mobile phone 300 detects a user operation to acquire a face image, and generates a face image acquisition request.
Step 502: the mobile phone 300 transmits a face image acquisition request to the server 200. Wherein the face image acquisition request is for requesting acquisition of a face image from the server 200.
Step 503: the server 200 transmits the face image to the handset 300.
Step 504: the mobile phone 300 acquires a face image of the holder of the mobile phone 300.
In order to quickly verify whether the users of the mobile phone 100 and the mobile phone 300 are the same person, the facial image of the user acquired by the mobile phone 100 includes at least one frame, and each frame may be captured from at least one angle, such as a front view or a side view of the face. The facial image of the holder of the mobile phone 300, which is used to verify whether the holder of the mobile phone 300 and the facial image acquired from the server represent the same person, may be only a single frame captured at a certain angle.
Step 505: the mobile phone 300 matches the face image of the holder of the mobile phone 300 with the face image acquired from the server 200, and determines whether the matching is successful, that is, determines whether the face image of the holder of the mobile phone 300 and the face image acquired from the server are characterized as the same person. If yes, go to step 506; if not, go to step 507.
In order to verify that the user newly holding the mobile phone 300 and the user originally holding the mobile phone 100 are the same user, the mobile phone 300 matches the facial image of its holder against the facial image acquired from the server 200. If the comparison passes, the two are verified to be the same user. Otherwise, the user newly holding the mobile phone 300 is not the same user as the user who originally held the mobile phone 100, and the user is prompted that authentication failed, since there is a risk that the system account number of the mobile phone 100, the system software (system application) password of the mobile phone 100, and the lock screen password of the mobile phone 100 have been stolen from the originally held mobile phone 100. The user may then re-enter a facial image on the mobile phone 300 or provide more user information for user authentication.
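The comparison in step 505 could, for example, be sketched as a cosine-similarity check between the feature vector extracted from the holder's newly captured face and the feature vector obtained from the server; the similarity threshold is an illustrative assumption rather than a value disclosed by this application.

```python
# Hedged sketch of the step-505 decision: cosine similarity with a threshold.
import numpy as np

def is_same_person(local_features: np.ndarray,
                   server_features: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Return True if the two feature vectors likely represent the same person."""
    a = local_features / (np.linalg.norm(local_features) + 1e-12)
    b = server_features / (np.linalg.norm(server_features) + 1e-12)
    return float(np.dot(a, b)) >= threshold   # success -> step 506, failure -> step 507
```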
Step 506: the cellular phone 300 stores the face image acquired from the server 200 for the user identification service.
Step 507: the handset 300 prompts the user for information of the failure of the matching.
In addition, in other embodiments, the mobile phone 300 obtains an encrypted facial image from the server 200, decrypts it with the password used to encrypt the facial image to obtain the facial image, and then verifies whether the user newly holding the mobile phone 300 and the user originally holding the mobile phone 100 are the same user. If they are verified to be the same user, the mobile phone 300 stores the facial image for the user identification service. The verification operation is the same as above and is not described here again.
In addition, in some other embodiments, after the mobile phone 300 obtains the facial features from the server 200, it can likewise be verified whether the user newly holding the mobile phone 300 and the user originally holding the mobile phone 100 are the same user. If so, the mobile phone 300 stores the facial features for the user identification service. The verification operation is the same as above and is not described here again.
In addition, in other embodiments, the mobile phone 300 obtains encrypted facial features from the server 200 and decrypts them with the password used to encrypt the facial features to obtain the facial features. If the user newly holding the mobile phone 300 is verified to be the same user as the user who originally held the mobile phone 100, the mobile phone 300 stores the facial features for the user identification service. The verification operation is the same as above and is not described here again.
In addition, in other embodiments, the mobile phone 300 obtains the encrypted desensitized facial features from the server 200: the server 200 sends the encrypted desensitized facial features to the mobile phone 300, and the mobile phone 300 decrypts them with the password used to encrypt the desensitized facial features to obtain the desensitized facial features. The mobile phone 300 stores the desensitized facial features for the user identification service only if the user newly holding the mobile phone 300 is verified to be the same user as the user who originally held the mobile phone 100. To perform this verification, the mobile phone 300 captures a facial image of its holder, performs desensitization processing on it to obtain a desensitized facial image for verification, and extracts desensitized facial features from that image. The mobile phone 300 then matches the desensitized facial features extracted from the verification image against the decrypted desensitized facial features obtained from the server 200 and determines whether the matching succeeds, that is, whether the holder of the mobile phone 300 and the facial data acquired from the server represent the same person. If so, the mobile phone 300 stores the decrypted desensitized facial features for the user identification service; if not, the mobile phone 300 prompts the user that the matching failed.
With the above scheme, the user of the mobile phone 300 is authenticated to ensure that the user corresponding to the desensitized facial features and the user of the mobile phone 100 are the same person. For example, after decrypting the encrypted desensitized facial features sent by the server 200, the mobile phone 300 still verifies the desensitized facial features to confirm that the user holding the mobile phone 300 and the user holding the mobile phone 100 are the same user. This prevents another user who has learned the system account number of the mobile phone 100, the system software (system application) password of the mobile phone 100, and the lock screen password of the mobile phone 100 from stealing the user's facial data, thereby reducing the risk of facial data theft.
Fig. 7 illustrates a schematic structural diagram of an electronic device 300, according to some embodiments of the present application,
the electronic device includes an input unit 302, a processor unit 303, an output unit 306, a communication unit 301, a storage unit 304, a peripheral interface 305, a power supply 307, and the like. These components communicate over one or more buses. Those skilled in the art will appreciate that the configuration of the electronic device shown in fig. 7 is not intended to limit embodiments of the present application, and may be a bus or star configuration, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present application, the electronic device may be any electronic device with a biometric function (mainly face recognition and face comparison), including but not limited to a smart phone, a notebook computer, a tablet computer, and the like.
The input unit 302 is used for enabling interaction of a user with the electronic device and/or input of information into the electronic device. For example, the input unit 302 may receive numeric or character information input by a user to generate a signal input related to user setting or function control. In the embodiment of the present application, the input unit 302 may be any device for acquiring a biometric image of a user, and may be a camera.
The processor unit 303 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and executes various functions of the electronic device and/or processes data by operating or executing software programs and/or modules stored in the storage unit 304 and calling data stored in the storage unit 304. The processor unit 303 may be composed of an Integrated Circuit (IC), for example, a single packaged IC, or a plurality of packaged ICs with the same or different functions. For example, the Processor Unit 303 may include only a Central Processing Unit (CPU), or may be a combination of a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), and a control chip (e.g., a baseband chip) in the communication Unit 301. In the embodiment of the present application, the processor unit 303 mainly refers to a CPU or a GPU, which may be a single operation core or may include multiple operation cores. In the embodiment of the application, the operations of the image encryption module, the encrypted biological characteristic identification module and the face identification/comparison module are all performed in the processor.
The communication unit 301 is configured to establish a communication channel through which the electronic device connects to a remote server, and to upload data to and download data from the remote server. The communication unit 301 may include a Wireless Local Area Network (WLAN) module, a Bluetooth module, an NFC module, a baseband module, and other communication modules, together with the Radio Frequency (RF) circuits corresponding to these modules, and is configured to perform WLAN communication, Bluetooth communication, NFC communication, infrared communication, and/or cellular communication, such as Wideband Code Division Multiple Access (W-CDMA) and/or High Speed Downlink Packet Access (HSDPA). The communication module is used to control communication of each component in the electronic device and can support Direct Memory Access (DMA).
The radio frequency circuit is used for receiving and sending signals during information transmission and reception or during a call. For example, after downlink information of the base station is received, it is processed by the processing unit; in addition, uplink data is transmitted to the base station. For another example, after information sent by an external NFC device is received, the processing unit processes the information and sends the processing result back to the external NFC device. Typically, the radio frequency circuitry includes well-known circuitry for performing these functions, including but not limited to an antenna system, a radio frequency transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. In addition, the radio frequency circuitry may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like. The embodiment of the present application involves the communication function between an electronic device (for example, a terminal device) and a server (for example, a cloud server), for which the communication unit 301, such as the wireless local area network communication module and the mobile communication network module, is mainly used.
The output unit 306 includes, but is not limited to, an image output unit 306 and a sound output unit 306. The image output unit 306 is used for outputting text, pictures and/or video. The image output unit 306 may include a Display panel, such as a Display panel configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), a Field Emission Display (FED), and the like. Alternatively, the image output unit may include a reflective display, such as an electrophoretic (electrophoretic) display, or a display using an Interferometric Modulation of Light (Interferometric) technique. The image output unit may include a single display or a plurality of displays of different sizes. In the embodiment of the present invention, the image output unit 306 can be a liquid crystal display.
The storage unit 304 may be used to store software programs and modules, and the processing unit executes various functional applications of the electronic device and implements data processing by running the software programs and modules stored in the storage unit 304. The storage unit 304 mainly includes a program storage area and a data storage area, where the program storage area may store an operating system and application programs required for at least one function, such as a sound playing program or an image processing program, and the data storage area may store data created according to the use of the electronic device (such as picture data and a phonebook). In an embodiment of the present invention, the storage unit 304 may include a volatile memory, such as a Non-Volatile Random Access Memory (NVRAM), a Phase Change Random Access Memory (PRAM), or a Magnetoresistive Random Access Memory (MRAM), and may further include a non-volatile memory, such as at least one magnetic disk memory device, an Electrically Erasable Programmable Read Only Memory (EEPROM), or a flash memory device such as a NOR flash memory or a NAND flash memory. The non-volatile memory stores the operating system and the application programs executed by the processing unit. The processing unit loads operating programs and data from the non-volatile memory into memory and stores digital content in the mass storage device. The operating system includes various components and/or drivers for controlling and managing conventional system tasks, such as memory management, storage device control, and power management, as well as for facilitating communication between various hardware and software components. In the embodiment of the present invention, the operating system may be the Android system developed by Google, the iOS system developed by Apple, the Windows operating system developed by Microsoft, or an embedded operating system such as VxWorks. In the embodiment of the application, the image encryption module, the encrypted biological characteristic identification module, and the face identification/comparison module are all stored in the program storage area, and the desensitized biological characteristic information in the terminal device is stored in the data storage area.
The power supply 307 is used to power the various components of the electronic device to maintain its operation. As a general understanding, the power source 307 may be a built-in battery, such as a common lithium ion battery, a nickel metal hydride battery, etc., and also include an external power source 307 for directly supplying power to the electronic device, such as an AC adapter, etc. In some implementations of embodiments of the present application, the power source 307 may be more broadly defined and may include, for example, a power source 307 management system, a charging system, a power source 307 fault detection circuit, a power source 307 converter or inverter, a power source 307 status indicator (e.g., a light emitting diode), and any other components associated with power generation, management, and distribution of an electronic device.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), Random Access Memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible machine-readable memory for transmitting information using the internet in the form of electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signal digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodological feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments may not be included or may be combined with other features.
It should be noted that, in each device embodiment of the present application, each unit/module is a logical unit/module, and physically, one logical unit/module may be one physical unit/module, or a part of one physical unit/module, and may also be implemented by a combination of multiple physical units/modules, where the physical implementation manner of the logical unit/module itself is not the most important, and the combination of the functions implemented by the logical unit/module is the key to solving the technical problem provided by the present application. Furthermore, in order to highlight the innovative part of the present application, the above-mentioned device embodiments of the present application do not introduce units/modules which are not so closely related to solve the technical problems presented in the present application, which does not indicate that no other units/modules exist in the above-mentioned device embodiments.
It is noted that, in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (19)

1. A biometric information sharing method, the method comprising:
the method comprises the steps that first biological information is obtained by first electronic equipment, wherein the first biological information is used by the first electronic equipment for identity recognition;
responding to a first operation, and obtaining second biological information by the first electronic equipment based on first biological information, wherein the first electronic equipment is in a login system account state before the first electronic equipment detects the first operation;
the first electronic equipment sends the second biological information to a server;
the second electronic equipment logs in the system account;
in response to the second operation, the second electronic device sends a biological information acquisition request to the server;
the second electronic device receiving the second biometric information from the server;
the second electronic equipment obtains third biological information based on the second biological information, wherein the third biological information is used by the second electronic equipment for identity recognition;
the logged system accounts of the first electronic device and the second electronic device are the same.
2. The method of claim 1, wherein the first biometric information comprises at least one of fingerprint information, fingerprint characteristic information, facial image characteristic information, iris information, and iris characteristic information.
3. The method according to claim 1 or 2, wherein the second biological information obtained by the first electronic device based on the first biological information comprises:
the first electronic equipment encrypts the first biological information to obtain second biological information;
the second electronic device obtains the third biological information based on the second biological information, and the third biological information includes:
and the second electronic equipment decrypts the second biological information to obtain the third biological information.
4. The method according to claim 1 or 2, wherein the second biological information obtained by the first electronic device based on the first biological information comprises:
the first electronic equipment desensitizes the first biological information, extracts features and encrypts the first biological information to obtain second biological information;
the second electronic device obtains the third biological information based on the second biological information, and the third biological information includes:
the second electronic equipment decrypts the second biological information to obtain the third biological information;
and when the second electronic equipment performs identity recognition, acquiring fourth biological information, performing desensitization processing and feature extraction processing on the fourth biological information to generate fifth biological information, and comparing the fifth biological information with the third biological information to perform identity recognition.
5. The method of claim 3, wherein the first electronic device encrypts the first biological information to obtain the second biological information, and comprises:
the first electronic equipment displays a first user interface, and the first user interface is used for instructing a user to input a password;
the first electronic equipment encrypts the first biological information based on the password;
the second electronic device decrypts the second biological information to obtain the third biological information, and the method includes:
the second electronic equipment displays a second user interface, and the second user interface is used for indicating a user to input the password;
the second electronic device decrypts the second biometric information based on the password.
6. The method according to any one of claims 1 to 3, further comprising:
and the second electronic equipment prompts the category information for identity recognition.
7. A biological information sharing method applied to electronic equipment is characterized by comprising the following steps:
acquiring first biological information, wherein the first biological information is used by the electronic equipment for identity recognition;
responding to a first operation, and obtaining second biological information based on first biological information, wherein the electronic equipment is in a login system account state before the electronic equipment detects the first operation;
and sending the second biological information to a server.
8. The method of claim 7, wherein the first biometric information comprises at least one of fingerprint information, fingerprint characteristic information, facial image characteristic information, iris information, and iris characteristic information.
9. The method according to claim 7 or 8, wherein the second biological information obtained based on the first biological information comprises:
and encrypting the first biological information to obtain the second biological information.
10. The method according to claim 7 or 8, wherein the second biological information obtained based on the first biological information comprises:
and carrying out desensitization processing, feature extraction processing and encryption processing on the first biological information to obtain second biological information.
11. The method of claim 9, wherein the encrypting the first biometric information to obtain the second biometric information comprises:
the electronic equipment displays a first user interface, wherein the first user interface is used for instructing a user to input a password;
the electronic device encrypts the first biological information based on the password.
12. A biological information sharing method applied to electronic equipment is characterized by comprising the following steps:
logging in a system account;
in response to the first operation, sending a biological information acquisition request to the server;
receiving first biological information from the server;
and obtaining second biological information based on the first biological information, wherein the second biological information is used by the electronic equipment for identity recognition.
13. The method of claim 12, wherein the second biometric information comprises at least one of fingerprint information, fingerprint characteristic information, desensitized fingerprint characteristic information, facial image characteristic information, desensitized image characteristic information, iris characteristic information, desensitized iris characteristic information.
14. The method according to claim 12 or 13, wherein the deriving the second biological information based on the first biological information comprises:
and decrypting the first biological information to obtain the second biological information.
15. The method according to claim 12 or 13, wherein the deriving the second biological information based on the first biological information comprises:
decrypting the first biological information to obtain second biological information;
when the electronic equipment performs identity recognition, third biological information is obtained, desensitization processing and feature extraction processing are performed on the third biological information to generate fourth biological information, and the fourth biological information is compared with the second biological information to perform identity recognition.
16. The method of claim 14, wherein decrypting the first biometric information to obtain the second biometric information comprises:
displaying a second user interface for instructing a user to enter a password;
decrypting the first biometric information based on the password.
17. The method according to any one of claims 12 to 14, further comprising:
category information of the biological information for identification is presented.
18. A computer-readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to implement the method of any one of claims 7-11, 12-17.
19. An electronic device, comprising:
a memory for storing instructions, an
A processor for executing the instructions to implement the method of any one of claims 7 to 11, 12 to 17.
CN202110207789.8A 2021-02-24 2021-02-24 Biological information sharing method, electronic device and medium thereof Pending CN114973428A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110207789.8A CN114973428A (en) 2021-02-24 2021-02-24 Biological information sharing method, electronic device and medium thereof
PCT/CN2021/143626 WO2022179308A1 (en) 2021-02-24 2021-12-31 Biological information sharing method, electronic device and medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110207789.8A CN114973428A (en) 2021-02-24 2021-02-24 Biological information sharing method, electronic device and medium thereof

Publications (1)

Publication Number Publication Date
CN114973428A true CN114973428A (en) 2022-08-30

Family

ID=82973454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110207789.8A Pending CN114973428A (en) 2021-02-24 2021-02-24 Biological information sharing method, electronic device and medium thereof

Country Status (2)

Country Link
CN (1) CN114973428A (en)
WO (1) WO2022179308A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150016697A1 (en) * 2013-07-10 2015-01-15 Apple Inc. Finger biometric sensor data synchronization via a cloud computing device and related methods
CN105474573B (en) * 2013-09-19 2019-02-15 英特尔公司 For synchronizing and restoring the technology of reference template
CN103763315B (en) * 2014-01-14 2016-12-07 北京航空航天大学 A kind of trust data access control method being applied to mobile device cloud storage
US10111100B2 (en) * 2014-08-25 2018-10-23 Microsoft Technology Licensing, Llc Multidevice authentication
SE1750282A1 (en) * 2017-03-13 2018-09-14 Fingerprint Cards Ab Updating biometric data templates
CN110366725A (en) * 2017-06-23 2019-10-22 惠普发展公司,有限责任合伙企业 Biometric data synchronizer

Also Published As

Publication number Publication date
WO2022179308A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
CN108604345B (en) Method and device for adding bank card
US10009327B2 (en) Technologies for secure storage and use of biometric authentication information
US11909884B2 (en) Secure distributed information system for public device authentication
CN109328348B (en) Service authentication method, system and related equipment
EP2905715B1 (en) Method, system and terminal for encrypting/decrypting application program on communication terminal
CN112771826A (en) Application program login method, application program login device and mobile terminal
US20160119143A1 (en) User identity authenticating method, terminal, and server
CA2665961C (en) Method and system for delivering a command to a mobile device
CN108763917B (en) Data encryption and decryption method and device
CN103279411A (en) Method and system of entering application programs based on fingerprint identification
CN107733652B (en) Unlocking method and system for shared vehicle and vehicle lock
US9276748B2 (en) Data-encrypting method and decrypting method for a mobile phone
US10880091B2 (en) Control method for enrolling face template data and related product
CN105281907B (en) Encrypted data processing method and device
CN107766713B (en) Face template data entry control method and related product
CN105577619B (en) Client login method, client and system
KR20170124953A (en) Method and system for automating user authentication with decrypting encrypted OTP using fingerprint in mobile phone
US11405782B2 (en) Methods and systems for securing and utilizing a personal data store on a mobile device
US9977907B2 (en) Encryption processing method and device for application, and terminal
KR101482321B1 (en) Method for Substituting Password of Certificate by using Biometrics
CN108322907B (en) Card opening method and terminal
CN114582048B (en) NFC-based vehicle door control method, mobile terminal and vehicle
CN114973428A (en) Biological information sharing method, electronic device and medium thereof
WO2017076287A1 (en) Method and device for pairing bluetooth devices
KR20160046655A (en) Apparatus and method for user authentication using subscriber identification module

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination