CN116010917A - Privacy-protected image processing method, identity registration method and identity authentication method



Publication number
CN116010917A
Authority
CN
China
Prior art keywords
authenticated
key
auxiliary data
hash value
determining
Prior art date
Legal status
Pending
Application number
CN202111229310.7A
Other languages
Chinese (zh)
Inventor
汤林鹏
邰骋
张舒畅
王心安
张青笛
刘勤
张之蔚
Current Assignee
Beijing Jianmozi Technology Co ltd
Original Assignee
Moqi Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Moqi Technology Beijing Co ltd filed Critical Moqi Technology Beijing Co ltd
Priority to CN202111229310.7A priority Critical patent/CN116010917A/en
Priority to PCT/CN2022/126690 priority patent/WO2023066374A1/en
Publication of CN116010917A publication Critical patent/CN116010917A/en
Pending legal-status Critical Current


Abstract

The application provides a privacy-preserving image processing method, an identity registration method, and an identity authentication method. The image processing method includes: acquiring an image to be processed, where the image to be processed includes a first biometric region and the first biometric region includes a plurality of first feature points; determining a first biometric template of the first biometric region according to the first biometric region, where the first biometric template includes biometric representations corresponding to the plurality of first feature points; and performing an encoding operation on the first biometric template to obtain first auxiliary data. The method can improve the irreversibility of a biometric recognition method and ensure revocability and unlinkability, thereby protecting user privacy.

Description

Privacy-protected image processing method, identity registration method and identity authentication method
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an identity registration method, and an identity authentication method for privacy protection.
Background
Biometric identification is an important means of modern identity management and access control systems. Because of the strong and permanent connection between an individual and their biometric traits, exposing a registered user's biometric data to an attacker severely compromises user privacy. Currently, few biometric methods can guarantee the irreversibility, revocability, and unlinkability required of a biometric system without significantly degrading recognition performance.
Disclosure of Invention
In view of this, an object of the embodiments of the present application is to provide a privacy-preserving image processing method, an identity registration method, and an identity authentication method that improve the irreversibility of a biometric recognition method and ensure revocability and unlinkability, thereby protecting user privacy.
In a first aspect, an embodiment of the present application provides an image processing method for privacy protection, including:
acquiring an image to be processed, where the image to be processed includes a first biometric region, and the first biometric region includes a plurality of first feature points;
determining a first biometric template of the first biometric region according to the first biometric region, where the first biometric template includes biometric representations corresponding to the plurality of first feature points;
performing an encoding operation on the first biometric template to obtain first auxiliary data, where obtaining the first auxiliary data includes one of the following:
performing a feature transform on the first biometric template to obtain the first auxiliary data, where the feature transform is determined according to a key corresponding to the first auxiliary data;
quantizing the first biometric template to obtain a quantized value; determining an error-correction codeword according to the key corresponding to the first auxiliary data; and performing a first transformation on the error-correction codeword to obtain the first auxiliary data; where determining the error-correction codeword according to the key corresponding to the first auxiliary data includes determining the error-correction codeword according to the key corresponding to the first auxiliary data and the quantized value, and/or performing the first transformation on the error-correction codeword to obtain the first auxiliary data includes performing the first transformation on the error-correction codeword according to the quantized value to obtain the first auxiliary data; the first transformation is irreversible;
quantizing the first biometric template to obtain a quantized value; determining an error-correction codeword; performing a second transformation on the error-correction codeword to obtain the first auxiliary data; and generating the key corresponding to the first auxiliary data according to the quantized value; where determining the error-correction codeword and performing the second transformation on it to obtain the first auxiliary data includes either randomly determining the error-correction codeword and performing the second transformation on it according to the quantized value, or determining the error-correction codeword according to the quantized value and then performing the second transformation on it; the second transformation is irreversible;
wherein the encoding operation comprises an irreversible transformation.
In a second aspect, an embodiment of the present application provides an identity registration method, including:
the first auxiliary data in the registration information of the object to be registered is determined by the above-described privacy-preserving image processing method.
In a third aspect, an embodiment of the present application provides an identity authentication method, including:
acquiring an image to be authenticated of an object to be authenticated, where the image to be authenticated includes a second biometric region, and the second biometric region includes a plurality of second feature points;
determining a biometric template to be authenticated of the second biometric region according to the second biometric region;
acquiring an identity authentication result of the object to be authenticated, where the identity authentication result is determined according to the biometric template to be authenticated and database auxiliary data;
where the database auxiliary data includes at least one item of first auxiliary data, and the first auxiliary data is determined by the identity registration method described above.
In a fourth aspect, an embodiment of the present application provides an identity registration method, including:
receiving registration information sent by terminal equipment, wherein the registration information is determined by the identity registration method;
the registration information is stored in a database, the registration information comprising first assistance data.
In a fifth aspect, embodiments of the present application provide a key usage method, including:
performing identity authentication on the object to be authenticated by using the identity authentication method described in the third aspect;
if the identity authentication of the object to be authenticated is successful, performing one or more of digital signature, message encryption, message decryption, application login, and digital wallet management by using the verified key determined by the identity authentication method in the third aspect;
where the verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated that is used to generate the verified multi-factor key to be authenticated.
In a sixth aspect, an embodiment of the present application provides a digital signature method, including:
determining registration information by the identity registration method of the second aspect; where the registration information includes first auxiliary data; the key corresponding to the first auxiliary data is a first private key; the first private key corresponds to a first public key; and the first private key and the first public key are generated in a trusted execution environment;
and sending the first public key to a signature verifier, so that the signature verifier can use the first public key to verify a digital signature generated with the first private key.
In a seventh aspect, an embodiment of the present application provides a digital signature method, including:
performing identity authentication on the object to be authenticated by using the identity authentication method described in the third aspect;
if the identity authentication of the object to be authenticated is successful, signing the information to be signed by using the first key to be authenticated, which is determined by the identity authentication method in the third aspect and passes the verification, to obtain signature data with a digital signature;
The signature data is sent to a signature verifier so that the signature verifier can verify the digital signature of the signature data using the public key corresponding to the first key to be authenticated; the public key corresponding to the first key to be authenticated is sent to the signature verifier through the method in the fifth aspect.
In an eighth aspect, an embodiment of the present application provides a message decryption method, including:
determining registration information by the identity registration method of the second aspect; wherein the registration information includes first assistance data; the secret key corresponding to the first auxiliary data is a second private key; the second private key corresponds to a second public key; the second private key and the second public key are generated in a trusted execution environment;
and sending the second public key to a message encryptor.
In a ninth aspect, an embodiment of the present application provides a message decryption method, including:
receiving data to be decrypted sent by a message encrypting party;
performing identity authentication on the object to be authenticated by using the identity authentication method described in the third aspect;
if the identity authentication of the object to be authenticated is successful, decrypting the data to be decrypted by using a second key to be authenticated, which is determined by the identity authentication method in the third aspect and passes the verification, to obtain decrypted data;
The public key corresponding to the second key to be authenticated is sent to the message encryptor through the method described above, and the data to be decrypted is encrypted with the public key corresponding to the second key to be authenticated.
In a tenth aspect, an embodiment of the present application provides an application login method, including:
performing identity authentication on the object to be authenticated by using the identity authentication method described in the third aspect;
if the identity authentication of the object to be authenticated is successful, sending a third key to be authenticated, which is determined by the identity authentication method in the third aspect and passes verification, to an application server so as to log in a target application program; or sending the third key to be authenticated and the user identifier which are determined by the identity authentication method and pass the verification to an application server so as to log in a target application program.
In an eleventh aspect, an embodiment of the present application provides a blockchain node information synchronization method applied to a current blockchain node on a blockchain, where the blockchain includes a plurality of blockchain nodes, including:
determining registration information by the registration method of the second aspect, wherein the registration information includes first auxiliary data; the secret key corresponding to the first auxiliary data is a third private key; the third private key corresponds to a third public key; the third private key and the third public key are generated in a trusted execution environment;
Broadcasting the third public key to the other blockchain nodes on the blockchain.
In a twelfth aspect, an embodiment of the present application provides an image processing apparatus for privacy protection, including:
a first acquisition module, configured to acquire an image to be processed, where the image to be processed includes a first biometric region, and the first biometric region includes a plurality of first feature points;
a first determining module, configured to determine a first biometric template of the first biometric region according to the first biometric region, where the first biometric template includes biometric representations corresponding to the plurality of first feature points;
an encoding module, configured to perform an encoding operation on the first biometric template to obtain first auxiliary data;
the encoding module is implemented in any one of the following ways:
performing a feature transform on the first biometric template to obtain the first auxiliary data, where the feature transform is determined according to a key corresponding to the first auxiliary data;
quantizing the first biometric template to obtain a quantized value; determining an error-correction codeword according to the key corresponding to the first auxiliary data; and performing a first transformation on the error-correction codeword to obtain the first auxiliary data; where determining the error-correction codeword according to the key corresponding to the first auxiliary data includes determining the error-correction codeword according to the key corresponding to the first auxiliary data and the quantized value, and/or performing the first transformation on the error-correction codeword to obtain the first auxiliary data includes performing the first transformation on the error-correction codeword according to the quantized value; the first transformation is irreversible;
quantizing the first biometric template to obtain a quantized value; determining an error-correction codeword; performing a second transformation on the error-correction codeword to obtain the first auxiliary data; and generating the key corresponding to the first auxiliary data according to the quantized value; where determining the error-correction codeword and performing the second transformation on it includes either randomly determining the error-correction codeword and performing the second transformation on it according to the quantized value, or determining the error-correction codeword according to the quantized value and then performing the second transformation on it; the second transformation is irreversible;
wherein the encoding operation comprises an irreversible transformation.
In a thirteenth aspect, an embodiment of the present application provides an identity registration apparatus, including:
a second determining module, configured to determine first auxiliary data in registration information of an object to be registered by the method described in the first aspect.
In a fourteenth aspect, an embodiment of the present application provides an identity authentication device, including:
a second acquisition module, configured to acquire an image to be authenticated of an object to be authenticated, where the image to be authenticated includes a second biometric region, and the second biometric region includes a plurality of second feature points;
a third determining module, configured to determine a biometric template to be authenticated of the second biometric region according to the second biometric region, where the biometric template to be authenticated includes a plurality of biometric data corresponding to the plurality of second feature points;
a fourth determining module, configured to determine an identity authentication result of the object to be authenticated according to the biometric template to be authenticated and database auxiliary data;
where the database auxiliary data includes at least one item of first auxiliary data, and the first auxiliary data is included in registration information determined by the identity registration method of the second aspect.
In a fifteenth aspect, an embodiment of the present application provides an identity registration apparatus, including:
a first receiving module, configured to receive registration information sent by a terminal device, where the registration information is determined by the identity registration method provided in the second aspect;
and the second storage module is used for storing the registration information into a database, wherein the registration information comprises first auxiliary data.
In a sixteenth aspect, an embodiment of the present application provides an identity authentication device, including:
the second receiving module is used for receiving the authentication request sent by the terminal equipment;
a database auxiliary data determining module, configured to determine, according to the authentication request, the database auxiliary data from the registration information stored in the database by the identity registration method provided in the fourth aspect; where the database auxiliary data includes at least one item of first auxiliary data.
In a seventeenth aspect, embodiments of the present application further provide an electronic device, including a processor and a memory storing machine-readable instructions executable by the processor; when the electronic device runs, the instructions, when executed by the processor, perform the steps of the methods described above.
In an eighteenth aspect, embodiments of the present application also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above-described method.
According to the privacy-preserving image processing method, the identity registration method, the identity authentication method, the apparatuses, the electronic device, and the computer-readable storage medium provided above, an irreversible encoding operation is performed on the biometric template that reflects user privacy. Because only the result of the irreversible encoding operation is stored, the original biometric template cannot be recovered even if that result leaks, which greatly reduces the risk of privacy leakage. Meanwhile, the decoding operation corresponding to the encoding operation, or the matcher used in the authentication phase, is fault tolerant, so authentication can be performed based on a biometric template that is sufficiently similar, but not identical, to the one used at registration. The method provided by the embodiments of the present application is therefore suitable for biometric recognition while protecting privacy.
In order to make the above objects, features and advantages of the present application more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an operating environment provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a blockchain system 200 according to an embodiment of the present application;
fig. 3 is a schematic block diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a flowchart of an image processing method for privacy protection provided in an embodiment of the present application;
fig. 5 is a block schematic diagram of an image processing apparatus for privacy protection according to an embodiment of the present application;
FIG. 6 is a flowchart of an identity authentication method according to an embodiment of the present application;
FIG. 7 is a flowchart of an identity registration method according to an embodiment of the present application;
Fig. 8 is a flowchart of another identity authentication method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
The inventors of the present application provide a privacy-preserving image processing method, an identity registration method, and an identity authentication method in which the data obtained after the encoding operation is performed on the biometric template satisfies the following properties:
Irreversibility (non-invertibility): the biometric data should be processed by an irreversible transformation prior to storage, such that in any case it is difficult to recover the original biometric feature from the irreversible transformation result alone. This property prevents stored biometric data from being misused to launch spoofing or replay attacks, improving the security of the biometric system. The harder it is to recover the original biometric feature from the transformation result, the higher the security of the biometric system.
Revocability (renewability): this property makes it possible to revoke a protected biometric reference and reissue a new instance of it when the biometric database is compromised.
Unlinkability: it is computationally difficult to determine whether two or more protected biometric references originate from the same biometric feature of a user. Unlinkability prevents cross-matching across different applications, thereby protecting user privacy.
The inventive concept of the present application is described below by means of some embodiments.
Example 1
To facilitate an understanding of the present embodiment, a description is first given of an operating environment in which the methods disclosed in the embodiments of the present application are performed.
The registration method and the authentication method provided by the embodiments of the present application may be implemented by a terminal device alone, by a server alone, or by a terminal device and a server in cooperation; in the cooperative case, the server and the terminal device interact with each other. Fig. 1 is a schematic diagram of the interaction between a server and a terminal device according to an embodiment of the present application. The server 110 is communicatively coupled, via a network, to one or more first terminal devices 120 for data communication or interaction. The server 110 may be a web server, a database server, or the like. The first terminal device 120 may be a personal computer (PC), a tablet computer, a smartphone, a personal digital assistant (PDA), an attendance (card-punching) device, a payment device, etc.
The first terminal device 120 may determine registration information based on the acquired image and send the registration information to the server 110 to implement registration.
Optionally, the server 110 may also be communicatively connected to one or more second terminal devices 130 for data communication or interaction.
The second terminal device 130 may perform identity authentication on the object included in the acquired image based on the image.
In some embodiments, identity registration and identity authentication may be implemented on the same terminal device, in which case the first terminal device 120 and the second terminal device 130 may be the same terminal device. For example, the first terminal device 120 may be an attendance device, an access control device, a payment device, a personal mobile device, or the like.
In other embodiments, the registration of the identity and the authentication of the identity may be implemented in different terminal devices, and in this case, the first terminal device 120 and the second terminal device 130 may be different terminal devices. For example, the first terminal device 120 may be a personal mobile device and the second terminal device 130 may be a face recognition device in a subway.
Fig. 2 is a schematic diagram of a blockchain system 200 according to an embodiment of the present application. In the example shown in fig. 2, the various blockchain nodes in the blockchain system 200 are communicatively coupled for data communication or interaction through a network.
The blockchain nodes may be web servers or database servers (blockchain nodes 210, 220, 230 in fig. 2), or personal computers (PCs) (node 250 in the figure), tablet computers, smartphones (blockchain node 240 in the figure), personal digital assistants (PDAs), and the like.
Optionally, the blockchain system 200 described above may be a consortium chain. Optionally, the blockchain system 200 may include a first blockchain node for performing steps such as verifying digital certificates, and a second blockchain node for issuing new digital certificates or updating the status of issued digital certificates. The first blockchain node used for verification may be the same node as, or a different node from, the second blockchain node used for issuing or updating digital certificates.
In this embodiment, other blockchain nodes in the blockchain system 200 may store blocks of the blockchain.
In this embodiment, each block of the blockchain stored at each blockchain node in the blockchain system 200 records each digital certificate on the blockchain, the action associated with that digital certificate, and the status of the digital certificate after the action is performed.
A block of a blockchain is a data structure that records transactions. Each block consists of a block header and a block body: the block body only records the transaction information of the most recent period, while most functions of the blockchain are realized through the block header.
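As a rough illustration of the block structure just described, the following sketch models a block as a header plus a body of transactions; the field names are assumptions made for this example and are not defined by the application.

```python
# Minimal sketch of a block: a header (which carries most blockchain functions)
# plus a body that only records the transactions of the recent period.
from dataclasses import dataclass
from typing import List

@dataclass
class BlockHeader:
    prev_hash: str      # hash linking to the previous block's header
    merkle_root: str    # digest summarizing the transactions in the body
    timestamp: int

@dataclass
class Block:
    header: BlockHeader
    body: List[str]     # transaction records for the recent period
```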
Fig. 3 shows a schematic block diagram of the electronic device. The electronic device 300 may include a memory 311, a memory controller 312, a processor 313, a peripheral interface 314, an input/output unit 315, and a display unit 316.
The above-mentioned memory 311, memory controller 312, processor 313, peripheral interface 314, input/output unit 315 and display unit 316 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The processor 313 is used to execute executable modules stored in the memory.
The memory 311 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), etc. The memory 311 is configured to store a program, and the processor 313 executes the program after receiving an execution instruction. The methods executed by the electronic device 300, as defined by the processes disclosed in any embodiment of the present application, may be applied to or implemented by the processor 313.
The processor 313 may be an integrated circuit chip having signal processing capabilities. The processor 313 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, capable of implementing or executing the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor or any conventional processor.
The peripheral interface 314 couples various input/output devices to the processor 313 and the memory 311. In some embodiments, the peripheral interface 314, the processor 313, and the memory controller 312 may be implemented in a single chip. In other examples, they may be implemented by separate chips.
The input/output unit 315 is used for the user to provide input data. The input/output unit 315 may be, but is not limited to, a mouse, a keyboard, and the like.
The display unit 316 described above provides an interactive interface (e.g., a user-operated interface) between the electronic device 300 and a user or is used to display image data to a user reference. In this embodiment, the display unit may be a liquid crystal display or a touch display. In the case of a touch display, the touch display may be a capacitive touch screen or a resistive touch screen, etc. supporting single-point and multi-point touch operations. Supporting single-point and multi-point touch operations means that the touch display can sense touch operations simultaneously generated from one or more positions on the touch display, and the sensed touch operations are passed to the processor for calculation and processing.
The electronic device 300 may be a server shown in fig. 1, a terminal device shown in fig. 1, or a blockchain node in the blockchain system 200 shown in fig. 2. It will be appreciated by those of ordinary skill in the art that the architecture shown in fig. 3 is merely illustrative and not limiting of the architecture of the electronic device 300, and that the architecture of the various servers, terminal devices, and blockchain nodes in fig. 1 or 2 may also have more or fewer components than the electronic device 300 shown in fig. 3, or may have a different configuration than that shown in fig. 3. For example, the first terminal device and the second terminal device shown in fig. 1 may further include an acquisition device for acquiring image or audio data. For another example, the server shown in fig. 1 may not include the display unit shown in fig. 3. For another example, the blockchain node shown in fig. 2 may also include a positioning device for positioning the blockchain node.
The electronic device 300 in the present embodiment may be used to perform each step in each method provided in the embodiments of the present application. The implementation of each method is described in detail below by means of several embodiments.
Example two
Referring to fig. 4, a flowchart of an image processing method for privacy protection according to an embodiment of the present application is shown. The specific flow shown in fig. 4 will be described in detail.
In step 410, an image to be processed is acquired.
The image to be processed comprises a first biological characteristic area, and the first biological characteristic area comprises a plurality of first characteristic points.
The first biometric region in the image to be processed may be a fingerprint region, a palm print region, a finger vein region, a palm vein region, a face region, an iris region, or the like. A first feature point is a point in the first biometric region that characterizes the biometric trait, such as a facial key point or a minutia of a fingerprint, palm print, finger vein, or palm vein. When the first biometric region is a face region, the first feature points may be key points representing the nose tip, the corner of the eye, the eyebrows, and so on. When the first biometric region is a fingerprint region, the first feature points may be minutiae representing ridge bifurcations, ridge endings, and so on.
The image to be processed can be obtained by a contact or non-contact acquisition mode. For example, the non-contact acquisition mode may be to obtain an image to be processed by photographing. In order to make the acquired image to be processed contain more characteristic points so that the image to be processed contains enough information available for encoding, images obtained by acquiring the same biological feature multiple times can be fused to form the image to be processed.
In one embodiment, the image to be processed is acquired by a non-contact acquisition mode, and the biological feature included in the first biological feature area may be at least one of a fingerprint, a palm print, a finger vein, and a palm vein. The first biometric region includes a number of feature points greater than 300.
The security level of a privacy-preserving biometric system is limited by its FAR (false acceptance rate). A face is not as unique as a fingerprint or palm print: even the best facial recognition systems today achieve a FAR no lower than about one in a million. In contrast, a fingerprint is far more distinctive; it has been estimated that a fingerprint with 36 minutiae can take on as many as 1.95×10^36 distinct combinations. The face, whose eyes, nose, and other features are constrained in their arrangement rather than arbitrary, has lower entropy and therefore carries less information than palm prints and palm veins.
In fact, compared with traditional contact-based methods, non-contact acquisition can capture more minutia information. By combining several fingerprints and a large-area palm print, a biometric region with more than 300 feature points can be collected in a single acquisition. This number of feature points provides enough distinctive, encodable information from a single image, without aligning and fusing multiple images, which raises the security level of the biometric recognition algorithm to a very high standard.
Step 420, determining a first biometric template of the first biometric region according to the first biometric region.
The first biometric template includes biometric representations corresponding to the plurality of first feature points.
Illustratively, the biometric template is obtained by feature extraction of a biometric region, where the feature extraction is reversible.
Illustratively, the biometric template may be presented in the form of a vector or data set, or the like.
Illustratively, a biometric template includes a plurality of biometric representations in a one-to-one correspondence with a plurality of feature points, each biometric representation being local information of its corresponding feature point, such that the biometric template includes a sufficiently high degree of differentiation.
In one particular implementation, the biometric template includes a plurality of biometric representations that are in one-to-one correspondence with a plurality of feature points, each biometric representation corresponding to a feature point including information describing the feature point, such as a location, an angle of the feature point, a relative location, a relative angle with other feature points, and the like.
It will be appreciated that the more information contained in a biometric template, the higher the degree of discrimination of the biometric template. The biometric template needs to have a high enough degree of discrimination to ensure that the FAR is within acceptable limits, and the specific degree of discrimination requirements can be determined based on the size of the base for biometric identification. It will be appreciated that the larger the base size, the higher the degree of discrimination required for the biometric template.
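Purely as an illustration of the template structure described above, the following sketch shows one possible in-memory form of a biometric template; the class and field names are assumptions made for this example, not structures defined by the application.

```python
# One feature point maps to one biometric representation; a template is the
# collection of these representations.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class BiometricRepresentation:
    x: float                                        # position of the feature point
    y: float
    angle: float                                    # orientation of the feature point, in degrees
    relative_geometry: List[Tuple[float, float]] = field(default_factory=list)
    # relative position/angle with respect to other feature points

@dataclass
class BiometricTemplate:
    representations: List[BiometricRepresentation]  # one entry per feature point
```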
Step 430, performing encoding operation on the first biometric template to obtain first auxiliary data.
The encoding operation used in step 430 above includes an irreversible transformation. That the encoding operation includes an irreversible transformation means that at least one of the steps involved in the encoding operation is irreversible, making it difficult to determine the biometric template by inverse transformation from the first auxiliary data alone. In this way, privacy protection is achieved.
Meanwhile, the natural fuzziness of biometric traits requires that either the decoding operation be fault tolerant (in which case the encoding operation uses an error-correction code such as BCH or RS, and the decoding operation uses the corresponding error-correction decoding) or the matcher be fault tolerant, so that authentication can be performed based on a biometric template that is similar, but not identical, to the one used at registration.
The encoding operations meeting the above requirements may be implemented in a number of alternative ways.
In a first alternative embodiment, step 430 may be implemented as:
s100, performing feature conversion on a first biological feature template to obtain first auxiliary data; the feature transformation is determined from a key corresponding to the first helper data.
It should be noted that "the key corresponding to the first auxiliary data" does not mean that the key is obtained after the first auxiliary data exists; rather, if a key is used to generate the first auxiliary data, that key is referred to as the key corresponding to the first auxiliary data.
The key corresponding to the first auxiliary data may be input by a user to be registered corresponding to the image to be processed or generated according to the input of the user to be registered. The key corresponding to the first auxiliary data is generated, for example, by performing one or more of format conversion, bit padding, and verification information addition on the input of the user to be registered.
In this alternative embodiment, in the registration phase, F_k(X) = H; in the authentication phase, F_k'(X') = H'. Here F is the feature transform function, k is the key corresponding to the first auxiliary data, k' is the key acquired in the authentication phase, X and X' are the biometric templates in the registration and authentication phases respectively, and H and H' are the auxiliary data in the registration and authentication phases respectively. If X and X' are close enough and k' = k, then H and H' are close enough. In this alternative embodiment, the matcher used in the authentication phase determines whether H and H' are sufficiently close; the fault tolerance lies in the matcher.
Specific examples of the first alternative embodiment include BioHashing and robust hashing.
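The following is a hedged sketch of a key-dependent feature transform in the spirit of BioHashing, given as one way the transform F_k could look; the projection construction, output length, and binarization are assumptions of this example, not the scheme fixed by the application.

```python
# Key-dependent feature transform: the key seeds a random projection, and the
# projected template is binarized to produce the auxiliary data H = F_k(X).
import numpy as np

def feature_transform(template: np.ndarray, key: bytes, out_bits: int = 256) -> np.ndarray:
    rng = np.random.default_rng(int.from_bytes(key, "big"))      # key k determines F_k
    projection = rng.standard_normal((out_bits, template.size))  # key-seeded projection matrix
    return (projection @ template.ravel() > 0).astype(np.uint8)  # binarized output H

# Registration: H = feature_transform(X, k); authentication: H' = feature_transform(X', k').
# A fault-tolerant matcher then checks whether H and H' are close enough
# (for example, by Hamming distance against a threshold).
```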
In a second alternative embodiment, step 430 may be implemented as:
s109, quantifying according to the first biological characteristic template to obtain a quantified value;
It will be appreciated that the biometric representations contained in the biometric template are discrete but at a high resolution; before the encoding operation they are further discretized, i.e., quantized.
Quantizing according to the first biometric template can be understood as quantizing the biometric representation corresponding to each first feature point onto a lattice space. For example, a biometric representation may be the position and angle of a feature point, <i, j, θ>, where i, j, and θ range over 1-32, 1-16, and 1-8 respectively and may take integer or fractional values. For convenience of encoding, it is desirable that the quantized i, j, θ take integer values in 1-32, 1-16, and 1-8 respectively, so that the lattice space has size 32×16×8 and resolution 1.
Quantizing according to the first biometric template may include rounding and/or computing on each element of the biometric representation separately, and/or rounding/computing across elements. In one embodiment, the quantization rounds each element separately; for example, the biometric representation <3.5, 5.1, 60.5> has the corresponding quantized value <4 (rounded), 5 (rounded), 2 (60.5° mapped into one of the 8 angle bins covering 360°)>. In another embodiment, the quantization rounds each element separately and then concatenates the elements into a multi-digit number; for example, the biometric representation <3.5, 5.1, 60.5> has the corresponding quantized value 452 (round to <4, 5, 2>, then concatenate).
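A minimal sketch of the quantization just described, assuming the 32×16×8 lattice and an angle range of 360° split into 8 bins; the rounding conventions are assumptions chosen so that the example values above are reproduced.

```python
# Quantize <i, j, theta> onto the lattice, then optionally concatenate the bins.
import math

def quantize(i: float, j: float, theta_deg: float) -> tuple:
    qi = round(i)                              # 1..32
    qj = round(j)                              # 1..16
    qt = max(1, math.ceil(theta_deg / 45.0))   # 360 degrees split into 8 bins -> 1..8
    return (qi, qj, qt)

def quantize_concat(i: float, j: float, theta_deg: float) -> int:
    qi, qj, qt = quantize(i, j, theta_deg)
    return int(f"{qi}{qj}{qt}")                # e.g. <3.5, 5.1, 60.5> -> (4, 5, 2) -> 452

print(quantize(3.5, 5.1, 60.5))         # (4, 5, 2)
print(quantize_concat(3.5, 5.1, 60.5))  # 452
```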
S110, determining error correction code words according to the secret keys corresponding to the first auxiliary data;
the key corresponding to the first auxiliary data may be a user input or a system generation, wherein the system generation comprises a random generation or a generation according to a user input. The key corresponding to the first auxiliary data is k-dimensional. The key corresponding to the first auxiliary data is generated, for example, by performing one or more of a format conversion, bit number padding, and verification information addition on the user input. The determining of the error correction code word according to the key corresponding to the first auxiliary data may be determining the error correction code word from an n-dimensional finite field according to the key corresponding to the first auxiliary data, where the error correction code word is n-dimensional (n > k), so that the decoding algorithm corresponding to the encoding algorithm has fault tolerance. It can be understood that the greater n is than k, the stronger the fault tolerance, and the greater the amount of computation required for the encoding operation.
Determining the error correction code word from the key corresponding to the first auxiliary data includes determining the error correction code word from the key corresponding to the first auxiliary data or determining the error correction code word from the key corresponding to the first auxiliary data and the quantized value.
There are various ways of determining the n-dimensional error-correction codeword according to the k-dimensional key corresponding to the first auxiliary data; for example, the codeword may be determined by an error-correction code such as BCH. Illustratively, the k-dimensional key may be converted into an n-dimensional vector by multiplying it with a k×n encoding matrix.
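A hedged sketch of the matrix-based extension from a k-dimensional key to an n-dimensional codeword; the random 0/1 matrix below merely stands in for a real BCH-style generator matrix and is an assumption of this example.

```python
# Extend a k-dimensional key to an n-dimensional codeword with a k x n matrix
# over GF(2): c = key . G (mod 2).
import numpy as np

def encode_key(key_bits: np.ndarray, generator: np.ndarray) -> np.ndarray:
    return (key_bits @ generator) % 2          # n-dimensional error-correction codeword

k, n = 16, 31
rng = np.random.default_rng(0)
G = rng.integers(0, 2, size=(k, n))            # stand-in for a real generator matrix
key_bits = rng.integers(0, 2, size=k)
codeword = encode_key(key_bits, G)
```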
There are also various ways of determining the error-correction codeword according to the key corresponding to the first auxiliary data and the quantized value; for example, the codeword may be determined by RS (Reed-Solomon) error-correction encoding. Specifically, a polynomial of degree (k-1) is generated from the key corresponding to the first auxiliary data (i.e., the key supplies the k coefficients of the polynomial), and the polynomial is evaluated at n quantized values to obtain the n-dimensional error-correction codeword.
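A hedged sketch of the polynomial-based variant: the key supplies the k coefficients of a degree-(k-1) polynomial that is evaluated at the n quantized values. The prime modulus and the sample values below are assumptions of this example.

```python
# Bind the key to the quantized values by evaluating a key-defined polynomial
# at each quantized value (Reed-Solomon / fuzzy-vault style).
P = 2**16 + 1   # illustrative prime modulus

def poly_eval(coeffs, x, p=P):
    # Horner evaluation of sum(coeffs[i] * x**i) mod p
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

def codeword_from_key(key_coeffs, quantized_values):
    # one polynomial value per quantized value -> n-dimensional codeword
    return [poly_eval(key_coeffs, x) for x in quantized_values]

codeword = codeword_from_key([7, 3, 11], [452, 318, 121, 964, 205])  # k=3, n=5
```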
S120, performing first transformation processing on the error correction code words to obtain first auxiliary data.
Performing a first transform process on the error correction code codeword to obtain first auxiliary data may include: performing a first transformation process on the error correction code words independent of the quantized values; and the method can also comprise the step of carrying out first transformation processing on the error correction code words according to the quantized values to obtain first auxiliary data, thereby realizing protection of the error correction code words.
When the codeword of the error correction code is determined according to the key corresponding to the first auxiliary data and not determined according to the quantized value, the first transformation process of S120 needs to be related to the quantized value, so that binding of the key corresponding to the codeword of the error correction code, i.e. the first auxiliary data, and the quantized value, i.e. the biometric template, is achieved in the encoding operation, and protection of the key corresponding to the first auxiliary data and the quantized value, i.e. the biometric template, is further achieved. When the error correction codeword is determined according to the key and the quantized value corresponding to the first auxiliary data, the first transformation process of S120 may be related to the quantized value or may be unrelated to the quantized value, because S110 already binds the key and the quantized value corresponding to the first auxiliary data, and protection of the key and the quantized value corresponding to the first auxiliary data, that is, the biometric template, may be achieved regardless of whether S120 is related to the quantized value.
That is, at least one of steps S110 and S120 uses the quantized value obtained in step S109; in other words, the quantized value is used when, or after, the k-dimensional key corresponding to the first auxiliary data is extended to n dimensions, thereby binding the quantized value and the key corresponding to the first auxiliary data in the encoding operation.
In this alternative embodiment, in the registration phase, c = Enc(k) or c = Enc(k, X), and the first transformation is performed on c to obtain H; in the authentication phase, Dec(X', H) = k'. Here Enc is the step of determining the error-correction codeword, Dec is the decoding operation, k is the key corresponding to the first auxiliary data, k' is the key obtained by the decoding operation, X and X' are the quantized values in the registration and authentication phases respectively, and H is the first auxiliary data. If X and X' are close enough, then k' = k.
Specific examples of the second alternative embodiment include fuzzy commitment and fuzzy vault.
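A minimal fuzzy-commitment-style sketch of this second embodiment, assuming the codeword and the quantized template are bit strings of equal length; the error-correction decoder itself is only indicated in a comment and is not part of this example.

```python
# The codeword is masked with the quantized biometric bits, so the stored
# auxiliary data H reveals neither the codeword nor the template on its own.
import numpy as np

def commit(codeword_bits: np.ndarray, quantized_bits: np.ndarray) -> np.ndarray:
    return codeword_bits ^ quantized_bits           # registration: H = c XOR X

def recover(helper_bits: np.ndarray, quantized_bits2: np.ndarray) -> np.ndarray:
    noisy_codeword = helper_bits ^ quantized_bits2  # authentication: c' = H XOR X'
    # A real system would now run error-correction decoding (e.g. BCH/RS) on
    # noisy_codeword to recover the key k'; if X and X' are close enough, k' == k.
    return noisy_codeword
```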
In a third alternative embodiment, step 430 may be implemented as:
s129, quantifying according to the first biological characteristic template to obtain a quantified value;
for a description of this step, see S109.
S130, determining error correction code words;
in this embodiment, the key corresponding to the first auxiliary data is not acquired before the error correction code word is determined, and the error correction code word is not generated from the key corresponding to the first auxiliary data.
Illustratively, step S130 includes one of: randomly determining the error-correction codeword; determining the error-correction codeword according to the quantized value; or determining the error-correction codeword according to the quantized value and a random quantity. When the quantized value is used in determining the codeword, RS error-correction encoding may be used; when the quantized value is not used, BCH, RS, or another error-correction code may be used. To randomly determine a codeword, for example, a k-dimensional vector is randomly generated and multiplied by a k×n encoding matrix; to determine a codeword from the quantized value and a random quantity, for example, a polynomial of degree (k-1) is randomly generated (i.e., its k coefficients are random) and evaluated at n quantized values to obtain an n-dimensional codeword. When the error-correction codeword is determined randomly, or from the quantized value and a random quantity, different codewords can be produced for the same quantized value, which guarantees the revocability of the biometric recognition algorithm.
S140, performing second transformation processing on the error correction code words to obtain first auxiliary data.
The second transform process may or may not be related to the quantized value.
When the error-correction codeword is randomly determined, the second transformation needs to be related to the quantized value, so that the codeword is bound to the quantized value, i.e., to the biometric template, and the quantized value (the biometric template) is thereby protected. When the error-correction codeword is determined according to the quantized value, or according to the quantized value and a random quantity, the quantized value and the codeword have already been bound in S130, so the quantized value (the biometric template) is protected regardless of whether the second transformation is related to the quantized value.
That is, at least one of steps S130 and S140 uses the quantized value obtained in step S129, thereby implementing an encoding operation on the quantized value.
S150, generating a key corresponding to the first auxiliary data according to the quantized value;
Illustratively, the key corresponding to the first auxiliary data is generated from the quantized value by a strong extractor; that is, the key corresponding to the first auxiliary data is derived from the biometric region itself.
In this alternative embodiment, in the registration phase, c = Enc(r) or c = Enc(X), k = Ext(X), and the second transformation is performed on c to obtain H; in the authentication phase, Dec(X', H) = X1 and k' = Ext(X1). Here r is a random quantity, Enc is the step of determining the error-correction codeword, Dec is the decoding operation, Ext is the step of extracting a key from a quantized value, k is the key corresponding to the first auxiliary data, X and X' are the quantized values in the registration and authentication phases respectively, X1 is the quantized value recovered by error correction, and H is the first auxiliary data. If X and X' are close enough, then X1 = X and therefore k' = k.
A specific example of the third alternative embodiment is the fuzzy extractor.
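A hedged fuzzy-extractor-style sketch of this third embodiment; a salted SHA-256 hash stands in for the strong extractor Ext, and the error-correction decoder is passed in as a stub, both of which are assumptions of this example.

```python
# A random codeword protects the quantized value, and the key is derived from
# the quantized value itself: H = X XOR c, k = Ext(X).
import hashlib
import numpy as np

def register(quantized_bits: np.ndarray, codeword_bits: np.ndarray, salt: bytes):
    helper = quantized_bits ^ codeword_bits                           # H, the first auxiliary data
    key = hashlib.sha256(salt + quantized_bits.tobytes()).digest()    # k = Ext(X)
    return helper, key

def authenticate(helper: np.ndarray, quantized_bits2: np.ndarray, salt: bytes, decode):
    noisy_codeword = helper ^ quantized_bits2                         # errors where X' differs from X
    codeword = decode(noisy_codeword)                                 # error-correction decoding (stub)
    recovered_x = helper ^ codeword                                   # X1
    return hashlib.sha256(salt + recovered_x.tobytes()).digest()      # k' = Ext(X1)
```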
According to the above method, an irreversible encoding operation is performed on the biometric template that reflects user privacy. On the one hand, because only the result of the irreversible encoding operation is stored, the original biometric template cannot be recovered even if that result leaks, which greatly reduces the risk of privacy leakage; the biometric recognition method therefore satisfies the irreversibility property. On the other hand, a key or a random quantity is used in the irreversible encoding operation, so different auxiliary data can be generated for the same biometric template, and the biometric recognition method satisfies the revocability property. Meanwhile, the decoding operation corresponding to the encoding operation, or the matcher used in the authentication phase, is fault tolerant and can authenticate based on a biometric template that is similar, but not identical, to the one used at registration. The method provided by the embodiments of the present application is therefore suitable for biometric recognition while protecting privacy.
In some embodiments, the biometric representation corresponding to the first biometric region comprises: biometric data and accurate descriptors. Step 420 may include:
s4201, determining biometric data corresponding to the plurality of first feature points according to the first biometric region;
Illustratively, the biometric data are in one-to-one correspondence with the feature points and describe orientation information such as the position and angle of the corresponding feature point and its relative position and relative angle with respect to other feature points.
It will be appreciated that, due to the natural fuzziness of biometric traits, even when two feature points in two images acquired at different times correspond to the same biological position of the same object (e.g., the same minutia on a fingerprint, or the same key point on a face), the biometric data corresponding to the two feature points may be similar but not identical. Therefore, the feature data corresponding to feature points A and the feature data corresponding to feature points A' acquired at two different times are almost always similar but different, where the feature points A correspond to biological positions w1, w2, ..., wn of an object and the feature points A' correspond to the same biological positions w1, w2, ..., wn of the same object.
S4202, determining accurate descriptors corresponding to the first feature points according to the first biological feature area.
The accurate descriptors are in one-to-one correspondence with the feature points, and are used for describing information such as texture features, direction fields, frequency domain vectors, colors and the like of local areas where the corresponding feature points are located.
An accurate descriptor means that, for two corresponding feature points in two images acquired at different times from the same biological position (e.g., the same minutia on a fingerprint, or the same key point on a face) of the same object (for example, feature point A1 in the first image to be processed and feature point A2 in the second image to be processed both correspond to the same minutia on a fingerprint), the accurate descriptors extracted for the two feature points (the accurate descriptors corresponding to A1 and A2) are identical with high probability (e.g., a probability greater than 90%).
In a biometric recognition algorithm based on the fault tolerance of the matcher, using descriptors in addition to the biometric data adds information available for encoding, further improving the security of the biometric recognition method. Using accurate descriptors, rather than fuzzy descriptors, reduces the probability of authentication failures caused by the double fuzziness of the biometric data and the descriptors, thereby increasing the stability of the biometric recognition method.
For a biometric recognition algorithm based on the fault tolerance of the decoding operation, the security of the whole method depends mainly on the number of feature points used to generate the biometric template and on the size of the lattice space used to quantize the feature points. When descriptors are in one-to-one correspondence with feature points, an accurate descriptor is acquired for each feature point in addition to its biometric data. If the lattice space is correspondingly enlarged because of the descriptors, it becomes harder for an attacker to guess the real feature points from the lattice space, which further improves the security of the biometric recognition method. For example, if the biometric feature is expressed as <i, j, θ, d> with value ranges 1 to 32, 1 to 16, 1 to 8, and 1 to 10 respectively, the lattice space can be enlarged to 32×16×8×10. If the lattice space is not enlarged for the descriptor, then in the mapping of feature points to the lattice space the contribution of the biometric data decreases and the contribution of the descriptor increases; since the biometric data is fuzzy while the descriptor is accurate, the proportion of fuzzy information in the mapping decreases and the proportion of accurate information increases, which reduces the probability of authentication failure caused by the inherent fuzziness of the biometric feature and thus increases the stability of the biometric recognition method. In addition, compared with a fuzzy descriptor, when an accurate descriptor is used the authentication stage only needs to correct the fuzziness of the biometric data and does not need to further correct fuzziness introduced by the descriptor.
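As an illustration of the lattice-space discussion above, the following Python sketch maps a single feature point into the lattice, with and without enlarging the lattice for the accurate descriptor; the function name and the indexing rule are illustrative assumptions, not details of this application.

I_BINS, J_BINS, THETA_BINS, D_BINS = 32, 16, 8, 10   # value ranges from the example above

def lattice_index(i, j, theta, d=None):
    """Map a feature point to a lattice cell index.
    Without d: 32*16*8 = 4096 cells. With the accurate descriptor d the lattice
    is enlarged to 32*16*8*10 = 40960 cells, so guessing a real point from the
    lattice space becomes harder."""
    idx = ((i - 1) * J_BINS + (j - 1)) * THETA_BINS + (theta - 1)
    if d is not None:
        idx = idx * D_BINS + (d - 1)
    return idx

print(lattice_index(3, 5, 2))        # quantized value from the biometric data only
print(lattice_index(3, 5, 2, d=7))   # quantized value in the enlarged lattice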
Because of the natural fuzziness of biometric features, extracting the descriptor of a feature point in a conventional manner yields only a fuzzy descriptor, not an accurate one. Embodiments of this application determine accurate descriptors in the following ways:
Accurate descriptor determination mode 1: inputting the image to be processed and/or the description information of the image to be processed into a descriptor extraction model for processing, to obtain the accurate descriptors corresponding to the plurality of first feature points.
The input of the descriptor extraction model may be the image to be processed, or may be both the image to be processed and the description information of the image to be processed.
The description information of the image to be processed can be extracted through an image processing method or an algorithm model, and can be presented in the forms of vectors, matrixes, graphs and the like. For example, the descriptive information is a vector characterizing the ridge frequencies of the regions in the image to be processed.
Before extracting descriptors of an image to be processed using the descriptor extraction model, the descriptor extraction model needs to be trained.
If the input of the descriptor extraction model includes the image to be processed, the training samples of the descriptor extraction model may be multiple pairs of sample images whose matching condition is known.
By way of example, a known matching condition means that it is known whether sample image P1 and sample image P2 match and, if they do, which feature points match. For example, suppose P1 and P2 match, P1 contains 6 minutiae P10-P15 and P2 contains 6 minutiae P20-P25, whose corresponding accurate descriptors are d10-d15 and d20-d25 respectively, and the following minutia pairs in P1 and P2 are known to match: P11-P23, P13-P25, P12-P20, P10-P22, P14-P21. Knowing the matching situation between the minutiae, a loss value over the descriptor differences of the 6×6 = 36 minutia pairs can be computed from the predicted descriptor difference of each pair (see Table 2) and the ground-truth value GT of the descriptor difference of each pair (for a matched pair, the GT of the difference between the two descriptors is 0; for an unmatched pair, it is infinity; see Table 1). The network parameters of the descriptor extraction model are then updated according to this loss value, which must be designed so that the accurate descriptors of matched minutiae in P1 and P2 are as consistent as possible while the accurate descriptors of unmatched minutiae differ as much as possible.
TABLE 1 GT of descriptor differences for pairs of minutiae points (0 = matched pair, ∞ = unmatched pair)

        P20   P21   P22   P23   P24   P25
P10      ∞     ∞     0     ∞     ∞     ∞
P11      ∞     ∞     ∞     0     ∞     ∞
P12      0     ∞     ∞     ∞     ∞     ∞
P13      ∞     ∞     ∞     ∞     ∞     0
P14      ∞     0     ∞     ∞     ∞     ∞
P15      ∞     ∞     ∞     ∞     ∞     ∞

TABLE 2 Predicted values of descriptor differences for minutiae pairs
If the input of the descriptor extraction model includes the image to be processed and the description information of the image to be processed, the training samples of the descriptor extraction model may be a plurality of pairs of sample images and respective description information of which the matching condition is known. Specific training methods are described above and are not described here.
In this embodiment, the image to be processed and its description information are used as the input of the descriptor extraction model, so that the descriptors corresponding to all the feature points in the image to be processed, each describing the feature information of the sub-region in which that feature point is located, can be obtained in a single pass.
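The training objective described above for the descriptor extraction model (pulling the descriptor difference of matched minutiae toward its ground truth of 0 while pushing unmatched minutiae apart) can be sketched as follows; the hinge margin standing in for the infinite GT difference, the Euclidean distance, and the descriptor dimension are assumptions made only for illustration.

import numpy as np

def descriptor_pair_loss(desc1, desc2, match_matrix, margin=4.0):
    """Contrastive-style loss over all minutia pairs of two sample images.
    desc1: (6, D) predicted descriptors for P10-P15, desc2: (6, D) for P20-P25.
    match_matrix[a, b] is True when minutia a of P1 matches minutia b of P2.
    Matched pairs: pull the descriptor difference toward the GT value 0.
    Unmatched pairs (GT "infinity"): push the difference beyond a finite margin,
    which is an assumed stand-in for the infinite GT in Table 1."""
    diffs = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=-1)  # (6, 6)
    matched_loss = np.where(match_matrix, diffs ** 2, 0.0)
    unmatched_loss = np.where(~match_matrix, np.maximum(0.0, margin - diffs) ** 2, 0.0)
    return (matched_loss + unmatched_loss).mean()

# Matches known from the example: P11-P23, P13-P25, P12-P20, P10-P22, P14-P21.
match = np.zeros((6, 6), dtype=bool)
for a, b in [(1, 3), (3, 5), (2, 0), (0, 2), (4, 1)]:
    match[a, b] = True
loss = descriptor_pair_loss(np.random.randn(6, 8), np.random.randn(6, 8), match)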
Accurate descriptor determination mode 2: and determining local images of the neighborhoods where the first feature points are located according to the position data of the first feature points, and inputting the local images of the first feature points and/or description information of the local images into a descriptor extraction model to obtain accurate descriptors corresponding to the first feature points.
Likewise, before extracting descriptors of an image to be processed using the descriptor extraction model, training of the descriptor extraction model is required.
The extraction of the description information of the local image and the model training mode are similar to the accurate descriptor determination mode 1, and are not repeated. It will be appreciated that the goal of model training is also to make accurate descriptors of matching minutiae as consistent as possible and accurate descriptors of non-matching minutiae as far apart as possible.
In this embodiment, local data corresponding to the feature points are used as the input of the descriptor extraction model. The model can therefore be smaller, with less computation, faster operation, and more accurate results; however, preprocessing such as cropping the local images from the image to be processed is required, and the model must be run multiple times to obtain the descriptors corresponding to all the feature points.
Accurate descriptor determination mode 3: for each first feature point in the plurality of first feature points, determining a fuzzy descriptor of the first feature point, clustering the fuzzy descriptor of the first feature point to a category center descriptor closest to the fuzzy descriptor, and taking the category center descriptor closest to the fuzzy descriptor as an accurate descriptor corresponding to the first feature point; the category center descriptor is obtained by clustering a plurality of fuzzy descriptors.
Illustratively, the fuzzy descriptor of the first feature point is determined from a local image of a neighborhood in which the first feature point is located.
In this embodiment, the accurate descriptor is determined in three steps. First, descriptors are extracted for a large number of feature points (for example, on the order of one million) to obtain fuzzy descriptors in one-to-one correspondence with those feature points. Second, the fuzzy descriptors are clustered to obtain a number of category-center descriptors. Third, for each first feature point among the plurality of first feature points, its fuzzy descriptor is determined, clustered to the nearest category-center descriptor, and that nearest category-center descriptor is taken as the accurate descriptor corresponding to the first feature point. In this way a fuzzy descriptor can be converted into an accurate descriptor.
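A minimal sketch of this three-step procedure is given below; the use of plain k-means to obtain the category centers, the pool size, and the descriptor dimension are illustrative assumptions.

import numpy as np

def build_category_centers(fuzzy_descriptors, k, iters=20, seed=0):
    """Offline step: cluster a large pool of fuzzy descriptors (e.g. ~1e6 in practice)
    into k category-center descriptors with plain k-means (an assumed choice)."""
    rng = np.random.default_rng(seed)
    centers = fuzzy_descriptors[rng.choice(len(fuzzy_descriptors), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(
            fuzzy_descriptors[:, None, :] - centers[None, :, :], axis=-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = fuzzy_descriptors[labels == c].mean(axis=0)
    return centers

def accurate_descriptor(fuzzy_descriptor, centers):
    """Online step: snap a feature point's fuzzy descriptor to the closest
    category center, which is then used as its accurate descriptor."""
    idx = np.argmin(np.linalg.norm(centers - fuzzy_descriptor, axis=1))
    return idx, centers[idx]

pool = np.random.randn(1000, 8)          # stand-in for a large descriptor pool
centers = build_category_centers(pool, k=16)
label, acc = accurate_descriptor(np.random.randn(8), centers)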
Determining the accurate descriptors in the above ways makes it highly probable (for example, a probability greater than 90%) that the accurate descriptors of two corresponding feature points of the same biometric location of the same object on images acquired at two different times are identical, which greatly reduces the need to correct the descriptors and improves the decoding speed.
In this embodiment, the biometric representation contains biometric data and accurate descriptors, which can be used in the various sub-steps of step S430. When the accurate descriptor is used in the biometric template, in the quantized value obtained from the biometric template, or in the feature conversion step performed according to the biometric template, the security and/or stability of the biometric recognition method can be improved; when the accurate descriptor is used in the first transformation process and the second transformation process, the operation speed of the biometric recognition method can be improved compared with using a fuzzy descriptor. It is understood that when the accurate descriptor is used only in the biometric template and/or in the quantized value obtained from the biometric template, the feature conversion/first transformation process/second transformation process in step S430 may be independent of the accurate descriptor; when the accurate descriptor is used only in the feature conversion/first transformation process/second transformation process of step S430, the feature conversion according to the first biometric template may convert only the biometric data in the first biometric template, and the quantization according to the first biometric template may be performed only according to the biometric data in the first biometric template.
When step S430 is implemented by the first to third alternative embodiments, accurate descriptors may be used in S100, S109 and S120, S129 and S150, respectively.
Specifically, when step S430 is implemented by the first alternative embodiment, there are 3 ways to use the exact descriptor in S100:
mode 1: performing feature conversion on the biological feature data and the accurate descriptors contained in the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to a key corresponding to the first auxiliary data;
mode 2: performing feature conversion on the biological feature data in the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to the key corresponding to the first auxiliary data and the accurate descriptor;
mode 3: performing feature conversion on the biological feature data and the accurate descriptors contained in the first biological feature template to obtain first auxiliary data; the feature transformation is determined from the key corresponding to the first auxiliary data and the accurate descriptor.
In these three ways, using descriptors adds information to the encoding compared with not using descriptors, so the security of the biometric recognition method is further improved. Using accurate descriptors reduces the probability of authentication failure caused by the double fuzziness of the biometric data and the descriptors, thereby increasing the stability of the biometric recognition method.
When step S430 is implemented by the second alternative embodiment, the exact descriptors may be used in S109 and S120.
When using the exact descriptor in S109, the exact descriptor may be used in step S1091 and/or step S1092.
Step S1091, performing quantization according to the first biometric template, includes: quantizing the biometric data and the accurate descriptors included in the first biometric template.
It is understood that even if accurate descriptors are included in the biometric representation, the quantization of S1091 may use only the biometric data and not the descriptors. When the accurate descriptor is used in S1091, both the case of enlarging the lattice space and the case of not enlarging it are possible.
As described above, when the biometric data and the accurate descriptors are quantized into the lattice space, the security of the biometric recognition method can be improved if the lattice space is enlarged compared with not using descriptors, and the stability of the biometric recognition method can be improved if the lattice space is not enlarged compared with not using descriptors.
Step S1092, the obtaining the quantized value includes: performing third transformation processing on the quantized result in the step S1091 to obtain a quantized value, wherein the third transformation processing is determined according to the accurate descriptor; the third transformation process is reversible or irreversible transformation;
Illustratively, the quantized value is X = Φ_d(T), where T is the quantized result obtained in step S1091 and Φ_d is the third transformation process determined according to the accurate descriptor d.
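A minimal sketch of such a third transformation process is given below, assuming a concrete Φ_d that shifts each quantized cell by an offset derived from the accurate descriptor of the same feature point; the hash-based offset and the lattice size are assumptions made only for illustration.

import hashlib

LATTICE_SIZE = 32 * 16 * 8   # lattice cells for (i, j, theta) in the running example

def third_transform(quantized_result, accurate_descriptors):
    """X = Phi_d(T): shift each quantized cell by an offset derived from the
    accurate descriptor of the same feature point (an assumed concrete Phi_d).
    The same transform is applied again at authentication, so it need not be inverted."""
    out = []
    for t, d in zip(quantized_result, accurate_descriptors):
        offset = int.from_bytes(hashlib.sha256(bytes([d])).digest()[:4], "big")
        out.append((t + offset) % LATTICE_SIZE)
    return out

X = third_transform([17, 901, 2044], [3, 9, 1])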
In this embodiment, to guess the real points an attacker must guess not only T (when T is determined from the biometric data only, guessing T means guessing the biometric data) but additionally the descriptor d, so the security of the biometric recognition method can be further improved.
In the authentication stage, the biometric template to be authenticated that is acquired at that stage can be quantized in the same way to obtain a quantized value to be authenticated, and authentication can be performed through a decoding operation on that quantized value without restoring the quantized result prior to the third transformation process; the third transformation process may therefore be either a reversible or an irreversible transformation.
That is, the use of additional accurate descriptor information in S109 can further enhance the security and/or stability of the biometric identification algorithm. If only the exact descriptor is used in S1091 and the lattice space is not increased, the stability is improved; if only the accurate descriptor is used in S1091 and the lattice space is increased, the security is improved; if only the exact descriptor is used in S1092, security is improved; if the exact descriptor is used in S1092 and S1091 and the lattice space is not increased, stability and safety are improved; if the exact descriptor is used and the lattice space is increased in S1092 and S1091, security is improved.
When the accurate descriptor is used in the first transformation process of S120, the operation speed at authentication time is significantly improved relative to using a fuzzy descriptor.
In a first embodiment 2.1 of the second alternative embodiment, steps S110 and S120 may be implemented as:
S1101, determining parameters of a first algebraic curve according to the key corresponding to the first auxiliary data;
In a specific embodiment, the key corresponding to the first auxiliary data is k-dimensional, the first algebraic curve is a polynomial f of order k, and the coefficients of the polynomial are determined according to the key corresponding to the first auxiliary data.
S1103, mapping a first quantized value among the quantized values onto the first algebraic curve to obtain a first mapped value corresponding to the first quantized value, wherein the error correction code codeword comprises the plurality of mapped values.
Illustratively, the first quantized value is any one of the quantized values. Mapping the first quantized value onto the first algebraic curve yields the first mapped value, and in this way the mapped values in one-to-one correspondence with the quantized values are obtained; the error correction code codeword comprises these mapped values. The point whose first coordinate component is the first quantized value and whose second coordinate component is the first mapped value may be referred to as the first point, and the set of points whose first and second coordinate components are the quantized values and the mapped values is referred to as the first point set.
In a specific embodiment, the quantized value X includes a plurality of quantized values x1, x2, …, xn in one-to-one correspondence with the feature points. For each quantized value, its corresponding mapped value y1 = f(x1), y2 = f(x2), …, yn = f(xn) is generated. The error correction code codeword includes y1, y2, …, yn. The first point set is {(x1, y1), (x2, y2), …, (xn, yn)}. The first point is one point in the first point set.
S1201, generating a hash point set;
in a specific embodiment, the hash points in the set of hash points may be randomly generated and the first coordinate component of the hash points is not equal to any quantized value.
S1203, determining the first auxiliary data according to the first auxiliary point set and the hash point set; wherein a first auxiliary point in the first auxiliary point set has a first functional relationship with a first point on the first algebraic curve, the first point having the first quantized value as its first coordinate component and the first mapped value as its second coordinate component, and the first coordinate component of the first auxiliary point being determined from the first quantized value.
The first point set has already been obtained in step S1103. It may be used directly as the first auxiliary point set, in which case the first functional relationship is the identity mapping; alternatively, the first point set is transformed to obtain the first auxiliary point set, and the first functional relationship is determined by that transformation.
In a specific embodiment, the union of the first set of auxiliary points and the set of hash points is taken as the first auxiliary data.
In the authentication stage, enough points in the first point set need to be restored so that the parameters of the first algebraic curve can be determined and the key corresponding to the first auxiliary data can then be determined.
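The encoding of embodiment 2.1 can be sketched as follows, taking the first point set directly as the first auxiliary point set (identity first functional relationship) and assuming polynomial arithmetic over a small prime field; the field size and the number of hash points are illustrative assumptions.

import random

P = 65537                     # assumed prime field for the polynomial arithmetic

def eval_poly(coeffs, x):
    """Evaluate f(x) whose coefficients are taken from the key (S1101/S1103)."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % P
    return y

def encode_first_auxiliary_data(key_coeffs, quantized_values, n_hash_points=50):
    """Map each quantized value onto the algebraic curve (real points), generate
    random hash points whose first coordinate differs from every quantized value
    (S1201), and output the union as the first auxiliary data (S1203)."""
    real_points = [(x, eval_poly(key_coeffs, x)) for x in quantized_values]
    used = set(quantized_values)
    hash_points = []
    while len(hash_points) < n_hash_points:
        x = random.randrange(P)
        if x not in used:
            used.add(x)
            hash_points.append((x, random.randrange(P)))   # off the curve with high probability
    vault = real_points + hash_points
    random.shuffle(vault)                                  # hide which points are real
    return vault

H1 = encode_first_auxiliary_data(key_coeffs=[123, 45, 67], quantized_values=[17, 901, 2044, 88])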
In the case that the first auxiliary point set is obtained by performing the transformation processing on the first point set, S120 may further include: s1202, performing fourth transformation processing on the first points to obtain first auxiliary points in the first auxiliary point set; the fourth transform process is determined from the exact descriptor, the fourth transform process being reversible.
It will be appreciated that determining the auxiliary point for each point in the first set of points also results in a first set of auxiliary points that is made up of auxiliary points.
For example, the first point set is {(x1, y1), (x2, y2), …, (xn, yn)} and the first auxiliary point set is {(x1', y1'), (x2', y2'), …, (xn', yn')}, with (xi', yi') = Φ_d(xi, yi) = (Φ_d^x(xi), Φ_d^y(xi, yi)) for i = 1..n, where Φ_d denotes the fourth transformation process determined from the accurate descriptor, Φ_d^x denotes the first transformation component used to obtain the first coordinate component of the first auxiliary point, and Φ_d^y denotes the second transformation component used to obtain the second coordinate component of the first auxiliary point. It will be appreciated that Φ_d may contain only the second transformation component and no first transformation component, i.e., xi' = xi.
The fourth transformation process is required to satisfy (x, y) ↦ (Φ_d^x(x), Φ_d^y(x, y)), where Φ_d^x(x) indicates that the first transformation component is independent of the second coordinate component of the first point but may depend on its first coordinate component, and Φ_d^y(x, y) indicates that the second transformation component may depend on both the first coordinate component and the second coordinate component of the first point.
The fourth transformation process needs to satisfy the above condition to ensure that, in the authentication stage, the point in the first auxiliary point set can be determined from the auxiliary data according to the quantized value corresponding to the biometric template to be authenticated, and the fourth transformation process needs to be reversible to ensure that the point in the first auxiliary point set corresponding to the first auxiliary point in the auxiliary data can be restored according to the first auxiliary point in the auxiliary data.
For example, suppose first and second auxiliary data are generated twice from the same biometric template using embodiment 2.1, the first algebraic curve determined by the key corresponding to the first auxiliary data is f, the second algebraic curve determined by the key corresponding to the second auxiliary data is g, and the fourth transformation process is an exclusive-or of the descriptor d with the error correction code codeword f(x) / g(x), so that the first auxiliary data is H1 = {(xi, f(xi) ⊕ di)} ∪ {(x', f(x') ⊕ ε)} and the second auxiliary data is H2 = {(xi, g(xi) ⊕ di)} ∪ {(x', g(x') ⊕ δ)}, where i = 1..n indexes the true minutiae, x' denotes the hash points, and ε and δ are random noise. If the descriptor d is fuzzy, a further round of error correction code encoding Enc must be applied to the codewords f(x) / g(x) so that d can be corrected, in which case H1 = {(xi, Enc(f(xi)) ⊕ di)} ∪ {(x', Enc(f(x')) ⊕ ε)} and H2 = {(xi, Enc(g(xi)) ⊕ di)} ∪ {(x', Enc(g(x')) ⊕ δ)}. If an attacker obtains H1 and H2 and computes H1 ⊕ H2, the result is {(xi, Enc(f(xi)) ⊕ Enc(g(xi)))} ∪ {(x', Enc(f(x')) ⊕ Enc(g(x')) ⊕ ε ⊕ δ)}. The second coordinate component of every true minutia then corresponds to a codeword of the error correction code Enc, whereas the second coordinate component of a hash point, owing to the randomness of the noise, generally does not; the attacker can therefore screen out most of the true minutiae from the auxiliary data, and the biometric recognition algorithm is insecure. When accurate descriptors are used, f(xi) and g(xi) do not need to be error-correction encoded, and this problem is avoided.
In a second embodiment 2.2 of the second alternative embodiment, the boundary between steps S110 and S120 is not strictly drawn, and they may be implemented as:
S1105, determining parameters of a first algebraic curve according to the key corresponding to the first auxiliary data. See step S1101.
S1107, determining parameters of a first mapping relationship according to the quantized values and the first algebraic curve;
wherein the first mapping relationship holds between a first set and a second set.
The first set and the second set are introduced only to describe the first mapping relationship; they do not need to be generated or stored.
A first subset of the first set is determined from the quantized values. A subset point is a point whose first coordinate component is a first value in the first subset and whose second coordinate component is the value in the second set that satisfies the first mapping relationship with that first value; such a subset point has a second functional relationship with a second point on the first algebraic curve, where the first coordinate component of the second point is the quantized value corresponding to the first value. A complement point is a point whose first coordinate component is a value in the complement of the first subset within the first set and whose second coordinate component is the value in the second set that satisfies the first mapping relationship with that value; all or most complement points do not have the second functional relationship with any point on the first algebraic curve.
It is understood that the first value is any value in the first subset.
That is, the first mapping relationship must be such that the first and second coordinate components of both the subset points and the complement points satisfy the first mapping relationship, that the subset points have the second functional relationship with second points on the first algebraic curve, and that all or most of the complement points do not have the second functional relationship with points on the first algebraic curve.
By way of example, the first mapping relationship may be expressed as shown in Equation 1.1:
V(x) = f(x) + (x − x1)(x − x2)…(x − xn)    (Equation 1.1)
where x denotes an element of the first set, V(x) denotes the element of the second set corresponding to x, f(x) is the first algebraic curve, and A = {x1, x2, …, xn} is the set corresponding to the quantized values (i = 1, …, n). For a subset point, the second term in Equation 1.1 is 0 and the second functional relationship is the identity mapping.
In this embodiment, the first mapping relationship is a polynomial of degree n, and V(x) is expanded to obtain the coefficient of each power of x.
S1109, determining the first auxiliary data according to the parameters of the first mapping relation.
Illustratively, the parameter of the first mapping relationship is taken as the first auxiliary data. For example, a polynomial coefficient is used as the first auxiliary data.
In this embodiment, the subset points (i.e., the real points) and the complement points (i.e., the hash points) conform to the same first mapping relationship, and only the parameters of the first mapping relationship need to be stored, which greatly saves storage space.
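A minimal sketch of steps S1105 to S1109 is given below, assuming the concrete form V(x) = f(x) + (x − x1)(x − x2)…(x − xn) over a prime field as in Equation 1.1; the field size is an assumption, and only the coefficients of V are kept as the first auxiliary data.

P = 65537                      # assumed prime field

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def poly_add(a, b):
    n = max(len(a), len(b))
    a, b = a + [0] * (n - len(a)), b + [0] * (n - len(b))
    return [(x + y) % P for x, y in zip(a, b)]

def first_auxiliary_data(key_coeffs, quantized_values):
    """Coefficients of V(x) = f(x) + prod_i (x - x_i): at a quantized value x_i the
    second term vanishes, so V(x_i) = f(x_i) (the subset points lie on the curve via
    the identity second functional relationship), while other x generally do not."""
    prod = [1]                                    # constant polynomial 1
    for x in quantized_values:
        prod = poly_mul(prod, [(-x) % P, 1])      # multiply by (X - x)
    return poly_add(key_coeffs, prod)             # only these coefficients are stored

V = first_auxiliary_data([123, 45, 67], [17, 901, 2044, 88])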
In a specific embodiment, step S1107 includes: determining the parameters of the first mapping relationship according to the quantized values, the accurate descriptors, and the first algebraic curve.
That is, when the accurate descriptor is used, the first mapping relationship may be determined from the accurate descriptor.
The first mapping relationship may then be expressed in forms such as Equation 1.3 or Equation 1.5, which incorporate a descriptor-related transformation Φ_d into the right-hand side of Equation 1.1, where x denotes an element of the first set, V(x) the element of the second set corresponding to x, f(x) the first algebraic curve, A = {x1, x2, …, xn} the set corresponding to the quantized values (i = 1, …, n), and Φ_d a transformation process related to the accurate descriptor. The second functional relationship is determined according to Φ_d.
It will be appreciated that Φ_d needs to satisfy the same condition as the fourth transformation process, to ensure that in the authentication stage the subset points can be screened out of the auxiliary data according to the quantized values corresponding to the biometric template to be authenticated. At the same time, Φ_d needs to be reversible, to ensure that the second point can be restored from the auxiliary data. It will be appreciated that in Equations 1.3 and 1.5, Φ_d includes only the second transformation component.
In a third embodiment 2.3 of the second alternative embodiment, steps S110 and S120 may be implemented as:
S1111, determining the error correction code codeword according to the key corresponding to the first auxiliary data;
In a specific embodiment, the key corresponding to the first auxiliary data is k-dimensional, the codebook includes a plurality of n-dimensional error correction code codewords, and the codeword corresponding to the key can be determined from the codewords of the codebook according to the key corresponding to the first auxiliary data.
Illustratively, the key corresponding to the first auxiliary data is expanded into an n-dimensional vector by zero padding, and the codeword closest to that n-dimensional vector is selected from the codebook as the error correction code codeword corresponding to the key. Alternatively, the k-dimensional key corresponding to the first auxiliary data is multiplied by a k×n coding matrix to obtain an n-dimensional vector that serves as the error correction code codeword.
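The coding-matrix variant can be sketched as follows; the toy generator matrix and the dimensions k and n are illustrative assumptions.

import numpy as np

def key_to_codeword(key_bits, G):
    """c = key * G over GF(2): expand the k-dimensional key corresponding to the
    first auxiliary data into an n-dimensional error correction code codeword."""
    return (np.array(key_bits) @ G) % 2

k, n = 4, 12
G = np.hstack([np.eye(k, dtype=int)] * (n // k))   # assumed toy generator matrix (repetition-like)
c = key_to_codeword([1, 0, 1, 1], G)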
S1205, determining a permutation operation according to the biometric vector corresponding to the quantized values, and applying the permutation operation to the error correction code codeword to obtain the first auxiliary data.
In this step, the permutation operation binds the error correction code codeword to the quantized values within the encoding operation, that is, it binds the key corresponding to the first auxiliary data to the biometric template.
It is understood that the quantized values themselves may be in the form of vectors, and may be considered as biological vectors. When the quantized value is not in a vector form, the quantized value can be converted into a corresponding biological vector.
For example, the quantized values include 64 quantized values corresponding to 64 fingerprint minutiae. Each quantized value corresponds to biometric data represented by the minutia position and angle (i, j, θ), where i, j, and θ occupy 5, 4, and 3 bits respectively, and a further 6 bits represent the descriptor corresponding to the quantized value, so each quantized value corresponds to an 18-bit vector and the biometric vector corresponding to the 64 quantized values totals 18 × 64 bits.
Illustratively, step S1205 may be expressed as H = K_X(c), where K_X is the permutation operation determined from X. The permutation operation may include translation, rotation, and the like with respect to the biometric vector X, or may be a matrix operation with the biometric vector X. For example, the permutation operation determined from the biometric vector corresponding to the quantized values may be K_X(c) = X ⊕ c, i.e., an exclusive-or with the biometric vector, where X ⊕ c essentially characterizes the amount of translation of c relative to X. It will be appreciated that some permutation operations require the biometric vector X and the error correction code codeword c to have the same length.
For example, the result of the permutation operation may be used directly as the first auxiliary data, i.e., H = K_X(c), or the result of the permutation operation may be subjected to subsequent transformation processing and then used as the first auxiliary data.
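A minimal sketch of the exclusive-or permutation K_X(c) = X ⊕ c described above is given below; the bit-list representation is an assumption made only for illustration.

def bind_codeword_to_biometric(bio_vector_bits, codeword_bits):
    """H = K_X(c) = X XOR c: the permutation operation determined by the biometric
    vector X translates the codeword c, binding the key corresponding to the first
    auxiliary data to the biometric template; X and c must have the same length."""
    assert len(bio_vector_bits) == len(codeword_bits)
    return [x ^ c for x, c in zip(bio_vector_bits, codeword_bits)]

H = bind_codeword_to_biometric([1, 0, 1, 1, 0, 1], [0, 1, 1, 0, 0, 1])
# At authentication, XORing H with a close-enough biometric vector yields a noisy
# codeword that the error correction code can decode back to c.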
In a specific embodiment of performing a subsequent transform process on a result of a permutation operation to obtain first auxiliary data, the applying the permutation operation to the error correction code word to obtain first auxiliary data includes:
s1205a, acting the replacement operation on the error correction code word to obtain a replacement operation result;
s1205b, performing fifth transformation processing on the replacement operation result to obtain first auxiliary data; the fifth transformation is determined from the exact descriptor; the fifth transformation process is reversible transformation to ensure that the replacement operation result is restored according to the first auxiliary data, and then the key corresponding to the first auxiliary data is restored.
The input of the fifth transformation process may be, for example, the permutation result K_X(c) (in which case H = Φ_d(K_X(c))), or both the permutation result K_X(c) and the biometric vector X (in which case H = Φ_d(X, K_X(c))).
When step S430 is implemented by the third alternative embodiment, the exact descriptor may be used in S129 and S140.
The manner and effect of using the accurate descriptor in S129 is described with reference to S109, and will not be described again.
In embodiment 3.1 of the third alternative embodiment, steps S130 to S150 may be implemented as:
S1301, randomly determining an error correction code codeword;
S1401, determining a permutation operation according to the biometric vector corresponding to the quantized values, and applying the permutation operation to the error correction code codeword to obtain the first auxiliary data;
This embodiment is similar to embodiment 2.3, except that the error correction code codeword is determined randomly rather than according to the key corresponding to the first auxiliary data.
And S150, generating a key corresponding to the first auxiliary data according to the quantized value.
Illustratively, the k-dimensional key is generated based on the quantized value and a random amount, such that different keys may be generated based on the same biometric template.
Illustratively, the random amount is included in the first assistance data.
In embodiment 3.2 of the third alternative embodiment, steps S130 to S150 may be implemented as:
S1303, determining the codeword closest to the biometric vector corresponding to the quantized values as the error correction code codeword;
S1403, obtaining the first auxiliary data according to the difference between the biometric vector and the error correction code codeword.
Illustratively, if the biometric vector is X and the error correction code codeword is C, then H = X − C.
And S150, generating a key corresponding to the first auxiliary data according to the quantized value.
Illustratively, the k-dimensional key is generated based on the quantized value and a random amount, such that different keys may be generated based on the same biometric template.
Illustratively, the random amount is included in the first assistance data.
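Embodiment 3.2 can be sketched as follows, assuming binary vectors (so that the difference is an exclusive-or), a toy codebook, and a hash of the quantized values together with the random amount as the key derivation; all of these concrete choices are assumptions.

import hashlib, secrets

CODEBOOK = [[0] * 8, [1] * 8, [0, 1] * 4, [1, 0] * 4]   # assumed toy codebook

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def register(bio_vector, quantized_values, key_len=16):
    """S1303: pick the codeword closest to the biometric vector.
    S1403: store the difference H = X - C (XOR for binary vectors).
    S150:  derive the key from the quantized values plus a random amount, so the
           same template can yield different keys; the random amount is kept
           inside the first auxiliary data."""
    C = min(CODEBOOK, key=lambda c: hamming(bio_vector, c))
    H = [x ^ c for x, c in zip(bio_vector, C)]
    random_amount = secrets.token_bytes(16)
    material = str(quantized_values).encode() + random_amount
    key = hashlib.sha256(material).digest()[:key_len]
    first_auxiliary_data = {"difference": H, "random_amount": random_amount}
    return first_auxiliary_data, key

aux, key = register([1, 0, 1, 1, 0, 1, 0, 0], [17, 90, 204, 88])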
It is to be understood that in steps S1401 and 1403, the result of the substitution operation or the result after the calculation of the difference may be directly used as a part of the first auxiliary data, and the result after the subsequent conversion processing based thereon may be used as a part of the first auxiliary data. It is understood that the random amount used when generating the key corresponding to the first auxiliary data according to the quantized value may also be included in the first auxiliary data.
Illustratively, obtaining the first auxiliary data in steps S1401 and S1403 includes: obtaining the first auxiliary data through a sixth transformation process; the sixth transformation process is determined from the accurate descriptor and is reversible. The sixth transformation process is analogous to the fifth transformation process described above.
It can be appreciated that if the fourth transformation process, the Φ_d in embodiment 2.2, the fifth transformation process, or the sixth transformation process is related to the descriptor and the descriptor is fuzzy, then in the authentication stage the fuzziness of the descriptor must be corrected before the fuzziness of the quantized values can be corrected; using accurate descriptors reduces this two-layer error correction to a single layer, thereby improving the decoding speed.
In a specific embodiment, the acquisition, the use (e.g., step S430, or the part of S430 involving the quantized value and the key corresponding to the first auxiliary data), and/or the transmission (e.g., sending from the terminal to the server) of the key corresponding to the first auxiliary data are performed in the trusted execution environment (TEE) of the terminal device.
Therefore, the secret key can be prevented from being leaked and the security of the biological identification system can be further guaranteed.
Since the image to be processed contains a biometric region and therefore involves user privacy, it would be ideal for every step, from acquiring the image to be processed, through preprocessing, to obtaining the biometric template and then the auxiliary data from the biometric template, to be performed in the TEE. However, the TEE is limited in both computing power and storage space, making it difficult to execute every step there, so each step may be executed in the TEE with a certain priority. In one embodiment, step S420 is divided into sub-steps in one of two ways, and each sub-step is executed in the TEE with a certain priority:
Division mode 1, step S420 includes: determining a first intermediate result according to the first biological feature region, and determining a first biological feature template according to the first intermediate result;
wherein determining the first intermediate result according to the first biometric region is performed in the trusted execution environment with a first priority, and determining the first biometric template according to the first intermediate result is performed in the trusted execution environment with a second priority.
In this way, step S420 is divided into two segments, and when the TEE resources are sufficient, both segments can be executed in the TEE, otherwise, both segments are executed in the TEE with a certain priority.
Dividing mode 2, determining a second intermediate result according to the first biological characteristic area; determining a third intermediate result according to the second intermediate result; determining a first biometric template from the third intermediate result;
wherein determining the second intermediate result according to the first biometric region is performed in the trusted execution environment with a first priority; determining the first biometric template according to the third intermediate result is performed in the trusted execution environment with a second priority; and determining the third intermediate result according to the second intermediate result is performed in the trusted execution environment with a third priority.
In this way, step S420 is divided into three segments, and when the TEE resources are sufficient, the three segments may be executed in the TEE, otherwise the three segments are executed in the TEE with a certain priority.
Wherein the third priority is lower than the first priority, and the third priority is lower than the second priority. The first priority and the second priority are not limited, and "first" and "second" do not represent the priority levels.
It is understood that steps not performed in the TEE are performed in the REE (rich execution environment). On the one hand, the shallow transformation from the first biometric region to an intermediate result is strongly correlated with the first biometric region and involves user privacy, so it is placed in the TEE with a high priority (the first priority); on the other hand, given that step S430 is already performed in the TEE, placing the part of step S420 that connects closely to step S430 in the TEE does not increase the communication cost between the TEE and the REE, so it may also be placed in the TEE with a high priority (the second priority). Based on these two considerations, in division mode 2 the step of determining the third intermediate result from the second intermediate result is neither strongly correlated with the first biometric region nor closely connected to step S430, and may therefore be executed in the TEE with the lowest priority. Whether the first or the second priority is higher can be chosen according to whether it is more important to protect data that is more closely related to the first biometric region or to reduce the communication cost between the TEE and the REE.
Example III
Based on the same application conception, the embodiment of the present application further provides a privacy-preserving image processing device corresponding to the privacy-preserving image processing method, and since the principle of solving the problem by the device in the embodiment of the present application is similar to that of the foregoing embodiment of the privacy-preserving image processing method, the implementation of the device in the embodiment of the present application may refer to the description in the embodiment of the foregoing method, and the repetition is omitted.
The respective modules in the privacy-preserving image processing apparatus in this embodiment are configured to execute the respective steps in the above-described method embodiments. As shown in Fig. 5, the privacy-preserving image processing apparatus in this embodiment may include: a first acquisition module 510, a first determination module 520, and an encoding module 530, wherein:
a first obtaining module 510, configured to obtain an image to be processed, where the image to be processed includes a first biometric area, and the first biometric area includes a plurality of first feature points;
a first determining module 520, configured to determine a first biometric template of the first biometric region according to the first biometric region, where the first biometric template includes a plurality of biometric data corresponding to the plurality of first feature points;
The encoding module 530 is configured to perform an encoding operation according to the first biometric template to obtain first auxiliary data; wherein the encoding operation comprises an irreversible transformation.
Example IV
The identity registration method provided by the embodiment of the present application is applied to a terminal device, and the specific flow of the embodiment is explained in detail below in combination with the steps of the identity registration method.
In step 610, first auxiliary data in registration information of an object to be registered is determined by a privacy-preserving image processing method.
The privacy-preserving image processing method in this embodiment may be the privacy-preserving image processing method provided in the second embodiment, so the obtaining manner of the first auxiliary data in this embodiment may refer to the description in the second embodiment, and is not repeated here.
The identity registration method in this embodiment may be applied to a terminal device.
The registration method of the present embodiment may further include storing the registration information in a database. The database may be, for example, a database of the terminal device or a database of a server in communication with the terminal device.
When the first auxiliary data in the registration information is determined according to the first embodiment of step S430, the registration information may include only the first auxiliary data, and the authentication result can be determined by comparing the auxiliary data to be authenticated generated in the authentication stage with the first auxiliary data. When the first auxiliary data in the registration information is determined according to the second or third embodiment of step S430, the key to be authenticated can be determined from the decoding result of the error correction code in the authentication stage, and the authentication result can be determined by checking the key to be authenticated against the key corresponding to the first auxiliary data. When the first auxiliary data in the registration information is determined according to the second or third embodiment of step S430, the biometric feature may also be combined with other factors for multi-factor authentication in order to further improve system security. One possible multi-factor scheme is as follows: in the registration stage, the key corresponding to the first auxiliary data and the other factor are combined by a transformation process to obtain a first multi-factor key; in the authentication stage, the key to be authenticated corresponding to the auxiliary data is decoded, the other factor is acquired again, the key to be authenticated and the newly acquired factor are combined by a transformation process similar to that of the registration stage to obtain a multi-factor key to be authenticated, and the multi-factor key to be authenticated is checked against the first multi-factor key to determine the authentication result. It will be appreciated that when multi-factor authentication is used, the key to be authenticated may be checked in addition to checking the multi-factor key to be authenticated.
When the first auxiliary data in the registration information is determined according to the second or third embodiment in step S430, since the key to be authenticated and/or the multi-factor key to be authenticated need to be checked in the authentication phase, the registration information also needs to include information required for checking the key to be authenticated and/or the multi-factor key to be authenticated.
In a specific embodiment, the verifying the key may include verifying according to a hash value of the key, where the registration information may further include: and the first hash value and the second hash value are respectively used for verifying the key to be authenticated and the multi-factor key to be authenticated in the authentication stage according to the first hash value generated by the key corresponding to the first auxiliary data and/or the second hash value generated by the first multi-factor key. It will be appreciated that the first hash value, the second hash value may be determined from the key, may be determined from the key and a default hash parameter, and may be determined from the key and a set (non-default) hash parameter, wherein the hash parameter includes a salt value and/or a number of rounds of the hash operation. When the set hash parameters are used for carrying out hash operation, different hash parameters can be used for different users to be registered, or different hash parameters can be used for different application scenes of the same user, so that the biological recognition system can meet the revocability to a greater extent. It can be understood that when the set hash parameter is used for performing the hash operation, the registration information needs to additionally include the hash parameter, so that the hash operation is performed on the key to be authenticated obtained by decoding or the multi-factor key to be authenticated determined by the key to be authenticated according to the same hash parameter in the authentication stage.
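A minimal sketch of computing the first hash value with settable hash parameters is given below; PBKDF2 is used here only as an assumed concrete construction for a salted, multi-round hash.

import hashlib, secrets

def first_hash_value(key_bytes, salt=None, rounds=100_000):
    """Hash the key corresponding to the first auxiliary data with settable hash
    parameters (salt and number of rounds). Using different parameters per user or
    per application scene supports revocability; if non-default parameters are used
    they must also be stored in the registration information."""
    salt = salt if salt is not None else b""          # default parameters: no salt
    return hashlib.pbkdf2_hmac("sha256", key_bytes, salt, rounds)

per_user_salt = secrets.token_bytes(16)
h1 = first_hash_value(b"key-corresponding-to-first-auxiliary-data", per_user_salt, rounds=200_000)
# The authentication stage repeats the same operation on the decoded key to be
# authenticated and compares the result with the stored first hash value.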
Based on the above description, the content contained in the registration information may include, but is not limited to, several cases listed in table 3:
TABLE 3 Contents of the registration information
If the registration information further includes the first hash value, correspondingly, the identity registration method in this embodiment may further include:
step 611, performing a hash operation according to the key corresponding to the first auxiliary data, to determine a first hash value in the registration information. The hash operation may be performed according to the key corresponding to the first auxiliary data, or may be performed according to the key corresponding to the first auxiliary data and a first hash parameter.
If the registration information further includes the second hash value, correspondingly, the identity registration method in this embodiment may further include:
step S612, performing a ninth transformation process according to the key corresponding to the first auxiliary data and the first transformation key to obtain a first multi-factor key;
step S613, performing a hash operation according to the first multi-factor key, and determining a second hash value in the registration information. The hash operation is performed according to the first multi-factor key, or may be performed according to the first multi-factor key and the second hash parameter.
As can be seen from the foregoing, this embodiment corresponds to a multi-factor authentication scenario, where the other factor may be another biometric feature or a user password; preferably, the other factor is accurate rather than fuzzy. In this embodiment the other factor is taken to be a user password as an example: the user password is used directly as the first transformation key, or the first transformation key is generated from the received user password. The first transformation key is generated from the user password, for example, by performing one or more of format conversion, bit-number padding, and addition of verification information on the user password.
Illustratively, the key corresponding to the first auxiliary data and the first transformation key may be combined by a ninth transformation process, which may be any reversible or irreversible transformation. For example, the ninth transformation process symmetrically encrypts the first transformation key with the key corresponding to the first auxiliary data, or symmetrically encrypts the key corresponding to the first auxiliary data with the first transformation key. Compared with authenticating each factor independently, combining the factors allows the combined result to be checked once in the authentication stage to determine whether each individual factor is correct.
The step of obtaining the first multi-factor key may include performing a transformation process according to the key corresponding to the first auxiliary data and the first transformation key, and adding verification information to a transformation result to obtain the first multi-factor key. For example, redundancy bits are determined from the transform result, and the first multi-factor key includes the transform result and the redundancy bits.
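The ninth transformation process and the first multi-factor key can be sketched as follows; deriving the first transformation key by hashing the user password, combining the two keys by exclusive-or instead of full symmetric encryption, and the two-byte redundancy are simplifying assumptions made for illustration.

import hashlib

def first_transformation_key(user_password: str, length=32) -> bytes:
    """Format-convert and pad the user password into a fixed-length key
    (an assumed concrete realisation of the conversion/padding step)."""
    return hashlib.sha256(user_password.encode("utf-8")).digest()[:length]

def ninth_transformation(aux_key: bytes, transform_key: bytes) -> bytes:
    """Combine the key corresponding to the first auxiliary data with the first
    transformation key. XOR is used here only as a simple stand-in for the symmetric
    encryption described above; any reversible or irreversible combination could be used."""
    return bytes(a ^ b for a, b in zip(aux_key, transform_key))

def first_multi_factor_key(aux_key: bytes, user_password: str) -> bytes:
    combined = ninth_transformation(aux_key, first_transformation_key(user_password))
    redundancy = hashlib.sha256(combined).digest()[:2]    # verification information
    return combined + redundancy

mf_key = first_multi_factor_key(b"0123456789abcdef0123456789abcdef", "correct horse")
second_hash_value = hashlib.sha256(mf_key).digest()       # stored in the registration information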
It is understood that the registration information may also include both the first hash value and the second hash value to verify both the key to be authenticated and the multi-factor key to be authenticated, and the identity registration method includes steps S611, 612 and 613.
In an alternative embodiment, the key may be preliminarily verified before being verified via the hash value, and only keys that pass the preliminary verification undergo the subsequent verification. For example, the preliminary verification may be performed using verification information contained in the key to be authenticated or in the multi-factor key to be authenticated itself; redundancy bits included in the key may be used for this purpose. The key to be authenticated and the multi-factor key to be authenticated can also be preliminarily verified by means of an additional check value. It can be appreciated that the preliminary verification may consume less computation and/or require less network communication than the subsequent verification, and narrowing the range of keys that require subsequent verification through the preliminary verification can improve authentication efficiency.
When the key to be authenticated and/or the multi-factor key to be authenticated are/is preliminarily verified by means of the additional verification value, the registration information can further comprise a first verification value corresponding to the key corresponding to the first auxiliary data and/or a second verification value corresponding to the first multi-factor key. Correspondingly, the identity registration method provided by the embodiment of the application can further include: and generating a first check value according to the key corresponding to the first auxiliary data and/or generating a second check value according to the first multi-factor key.
For example, if the registration information includes the first check value and/or the second check value, the first check value and/or the second check value need to be stored in association with the corresponding first auxiliary data.
The check value may be a result of performing an operation on a value corresponding to a part or all of bits in the key, or may be a value corresponding to a part of bits in the key.
Illustratively, the verification information contained in the key to be authenticated or in the multi-factor key to be authenticated itself can be understood as a check value hidden in the key as redundant information.
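A minimal sketch of such a check value used for preliminary verification is given below; the truncated hash and the two-byte length are assumptions.

import hashlib

def check_value(key_bytes: bytes, length=2) -> bytes:
    """Short check value derived from the key (here: a truncated hash over all key
    bytes). It only filters obviously wrong candidates cheaply; the full hash-value
    comparison is still required afterwards."""
    return hashlib.sha256(key_bytes).digest()[:length]

def preliminary_check(candidate_key: bytes, stored_check_value: bytes) -> bool:
    return check_value(candidate_key, len(stored_check_value)) == stored_check_value

stored = check_value(b"key-corresponding-to-first-auxiliary-data")
ok = preliminary_check(b"key-corresponding-to-first-auxiliary-data", stored)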
Example five
Based on the same application conception, the embodiment of the application also provides an identity registration device corresponding to the identity registration method, and since the principle of solving the problem of the device in the embodiment of the application is similar to that of the embodiment of the identity registration method, the implementation of the device in the embodiment of the application can be referred to the description in the embodiment of the method, and the repetition is omitted.
Each module in the identity registration apparatus in this embodiment is configured to perform each step in the above-described method embodiment. The identity registration apparatus includes: and a second determination module.
And a second determining module, configured to determine first auxiliary data in registration information of an object to be registered by using the privacy-preserving image processing method provided in the foregoing embodiment.
Example six
Fig. 6 is a flowchart of an identity authentication method according to an embodiment of the present application. The method of the present embodiment may be performed by a terminal device. The device performing the authentication method may be the same as or different from the device performing the registration method. The specific flow shown in fig. 6 will be described in detail.
Step 710, obtaining an image to be authenticated of the object to be authenticated.
The image to be authenticated includes a second biometric area including a plurality of second feature points therein.
In this embodiment, the image to be authenticated may be obtained by using the method for obtaining the image to be processed according to the above embodiment.
And step 720, determining a biometric template to be authenticated of the second biometric region according to the second biometric region.
It should be noted that the biometric template to be authenticated may be understood in a broad sense, and the biometric template to be authenticated may include biometric representations corresponding to the plurality of second feature points, or may include intermediate results for determining biometric representations corresponding to the plurality of second feature points.
It will be appreciated that if the decoding operation is performed at the terminal, the biometric template to be authenticated may include biometric representations corresponding to a plurality of second feature points. For example, at registration, a first biometric template is determined from the first biometric region via feature extraction steps a-e. During authentication, determining a biometric template to be authenticated through the steps a-e. If the decoding operation is completed at the server, the to-be-authenticated biometric template extracted by the terminal may include biometric representations corresponding to the plurality of second feature points, or may include intermediate results for determining biometric representations corresponding to the plurality of second feature points, where the server completes determining biometric representations corresponding to the plurality of second feature points according to the intermediate results and subsequent steps. For example, at registration, a first biometric template is determined from the first biometric region via feature extraction steps a-e. During authentication, the terminal equipment can determine the biometric template to be authenticated through the steps a-b, the server completes the steps c-e, or the terminal equipment determines the biometric template to be authenticated through the steps a-e.
That is, the terminal device may determine the biometric template to be authenticated of the second biometric region from the second biometric region by at least the first half of the feature extraction step that is the same as determining the first biometric template from the first biometric region.
Step 730, obtaining an identity authentication result of the object to be authenticated.
The identity authentication result is determined according to the biometric template to be authenticated and the base auxiliary data.
It will be appreciated that the step of determining the identity authentication result of the object to be authenticated from the biometric template to be authenticated and the base auxiliary data may be performed at the terminal or at the server, or may be performed by the terminal and the server in cooperation, which is not limited herein. The base auxiliary data comprises at least one first auxiliary data; the first auxiliary data is included in the registration information determined by the above-described identity registration method embodiment.
In one implementation manner, the identity authentication result may be determined by the server, and obtaining the identity authentication result of the object to be authenticated includes: receiving the identity authentication result of the object to be authenticated sent by the server. The identity authentication result may include one or more of: whether authentication succeeded or failed, the user identification of the successfully authenticated user, the key to be authenticated that passed verification, and the matching hash value.
In this implementation manner, the biometric template to be authenticated or the quantized value to be authenticated may be sent through a secure channel pre-established between the terminal and the server, so that the server determines an identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data, or determines an identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data.
It can be appreciated that, when step S430 is implemented according to the first specific embodiment, the biometric template to be authenticated is sent through the secure channel, so that the server determines the identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data. When step S430 is implemented according to the second or third specific embodiment, if the terminal device sends the biometric template to be authenticated, the server needs to perform the step of obtaining the quantized value to be authenticated from the biometric template to be authenticated (if the biometric template to be authenticated does not consist of the biometric representations corresponding to the second feature points, the server first obtains the biometric representations corresponding to the second feature points from the biometric template to be authenticated and then obtains the quantized value to be authenticated from them), and then determines the identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data. If the terminal device sends the quantized value to be authenticated, the server only needs to perform the step of determining the identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data.
To ensure the security of data transmission, a secure channel may be established between the terminal and the TEE of the server.
In this implementation manner, since the computing power of the server is generally stronger than that of the terminal device, the server can generally determine the identity authentication result of the object to be authenticated from the biometric template to be authenticated (or the quantized value to be authenticated) and the base auxiliary data faster than the terminal device can; however, data such as the biometric template to be authenticated or the quantized value to be authenticated, the user identification to be authenticated, the key to be authenticated that passed verification, the multi-factor key to be authenticated that passed verification, and the matching hash value need to be transmitted.
In another implementation manner, the identity authentication result of the object to be authenticated is determined according to the biometric template to be authenticated and the base auxiliary data through the terminal equipment or through the cooperation of the terminal equipment and the server.
Corresponding to the first to third embodiments in step S430, the identity authentication result of the object to be authenticated may be determined according to the biometric template to be authenticated and the base auxiliary data in different manners.
Corresponding to the second (steps S109, S110-S120) or third embodiment (steps S129, S130-S150) of step S430, step S730 includes:
S740, quantizing the biometric template to be authenticated to obtain a quantized value to be authenticated.
Illustratively, the biometric template to be authenticated of S720 includes the biometric representations corresponding to the plurality of second feature points; correspondingly, step S740 is analogous to the quantization of the first biometric template in step S430 to obtain the quantized value. It will be appreciated that if the biometric representation of the enrolment phase includes an accurate descriptor, the authentication phase also needs to determine the accurate descriptor to be authenticated in the same manner. If the accurate descriptor is used in step S430, then step S740 also needs to use the accurate descriptor to be authenticated in the same manner. For example, if the accurate descriptor is used in S1091 and/or S1092 in step S430, the accurate descriptor to be authenticated is used in S7401 and/or S7402 in step S740.
Step S7401, quantifying according to the biometric template to be authenticated, includes: the biometric data and the accurate descriptor included in the biometric template to be authenticated are quantized in the same manner as in S1091.
Step S7402, obtaining the quantized value to be authenticated, includes: performing a third transformation process similar to that of step S1092 on the quantization result of step S7401 to obtain the quantized value to be authenticated, where the third transformation process is determined according to the accurate descriptor to be authenticated; the third transformation process is a reversible or an irreversible transformation.
Illustratively, the quantized value to be authenticated X' = Φ_d(T), where T is the quantization result obtained in step S7401 and Φ_d is the third transformation process determined by the accurate descriptor to be authenticated.
For example, the quantization result obtained in S1091 is T = (t1, t2, …, tn), the accurate descriptor corresponding to the feature point of each quantization result is D = (d1, d2, …, dn), and in S1092 x1 = Φ_d1(t1) is set, so that X = (x1, x2, …, xn) can be determined. Correspondingly, the quantization result obtained in S7401 is T' = (t1', t2', …, tn'), the accurate descriptor to be authenticated corresponding to the feature point of each quantization result is D' = (d1', d2', …, dn'), and in S7402 x1' = Φ_d1'(t1'), so that X' = (x1', x2', …, xn') can be determined.
As such, when the image to be processed and the image to be authenticated contain the same biological location of the same object, X and X' are the same or sufficiently similar.
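To make the above quantization process concrete, the following is a minimal Python sketch in which the third transformation Φ_d is a modular shift keyed by a hash of the descriptor; this concrete transform, the alphabet size and the helper names are illustrative assumptions, the only property carried over from the text being that registration and authentication apply the same descriptor-determined transformation to the quantization results.

```python
# Minimal sketch of per-feature-point quantization with a descriptor-keyed
# third transformation. The modular-shift transform phi_d is a hypothetical
# stand-in; the text only requires that the registration and authentication
# stages apply the same descriptor-determined transform.
import hashlib

Q = 257  # hypothetical size of the quantization alphabet

def phi_d(t: int, descriptor: bytes) -> int:
    """Third transformation determined by the (to-be-authenticated) descriptor."""
    shift = int.from_bytes(hashlib.sha256(descriptor).digest()[:4], "big")
    return (t + shift) % Q

def quantize_template(quantization_results, descriptors):
    """X = (phi_d1(t1), phi_d2(t2), ...), one element per feature point."""
    return [phi_d(t, d) for t, d in zip(quantization_results, descriptors)]

# Registration: T, D -> X ; authentication: T', D' -> X'.
X  = quantize_template([12, 200, 77], [b"d1", b"d2", b"d3"])
X2 = quantize_template([12, 200, 77], [b"d1", b"d2", b"d3"])
assert X == X2  # same biometric part and same descriptors -> same quantized value
```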
S750, performing decoding operation corresponding to the encoding operation according to the quantized value to be authenticated and the base auxiliary data, and determining at least one key to be authenticated corresponding to at least one first auxiliary data contained in the base auxiliary data;
it will be appreciated that the base auxiliary data is stored locally at the terminal or at the server. When the base auxiliary data is stored at the terminal, the terminal can determine the base auxiliary data from the registration information. When the base auxiliary data is stored at the server, the server can determine the base auxiliary data from the registration information and send it to the terminal.
For the 1:N case, the base auxiliary data may be the full set of first auxiliary data in the stored registration information, or may be the first auxiliary data remaining after preliminary screening. For example, the base auxiliary data is screened from the full set of first auxiliary data according to limiting factors such as the user identification, the device identification, the registration time, the authentication time, the authentication place, low-discrimination features extracted from the image to be authenticated, low-discrimination features extracted from the biometric template to be authenticated, and low-discrimination features extracted from the quantized value to be authenticated. For the 1:1 case, the base auxiliary data may be first auxiliary data that is uniquely determined based on the user identification, the device identification, etc. It will be appreciated that if the base auxiliary data is prescreened with a limiting factor, the limiting factor and the first auxiliary data need to be stored in association.
Illustratively, the decoding operation corresponding to the encoding operation includes an error-correction-code decoding operation. The error-correction-code decoding operation needs to correspond to the error-correction-code encoding operation used at encoding time, and may be, for example, BCH decoding or RS decoding. An error correction codeword is used in step S430; therefore, when the quantized value to be authenticated is close enough to the quantized value corresponding to the first biometric template, the codeword determined in step S430 can be recovered by error correction decoding from the quantized value to be authenticated and the first auxiliary data, and the key corresponding to the first auxiliary data is then obtained from the recovered codeword; alternatively, the quantized value corresponding to the first biometric template is determined from the recovered codeword, and the key corresponding to the first auxiliary data is then obtained from that quantized value.
It can be understood that, for one first auxiliary data in the base auxiliary data, when the first auxiliary data is the first auxiliary data included in the registration information determined in the registration stage of the object to be authenticated, the key to be authenticated determined according to the quantization value to be authenticated and the first auxiliary data is equal to the key corresponding to the first auxiliary data. When the first auxiliary data is not the first auxiliary data contained in the registration information determined by the object to be authenticated in the registration stage, the key to be authenticated cannot be determined according to the quantized value to be authenticated and the first auxiliary data or the determined key to be authenticated is inconsistent with the key corresponding to the first auxiliary data. It is understood that the inability to determine the key to be authenticated means that the key to be authenticated cannot be determined within polynomial time.
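As an illustration of this decoding direction, the following sketch assumes a fuzzy-commitment-style construction in which the first auxiliary data is the XOR of the quantized value with an error correction codeword encoding the key; a 3x repetition code stands in for the BCH/RS codes mentioned above, and all values are illustrative.

```python
# Sketch: at registration H = X XOR C, with C an error-correcting codeword that
# encodes the key; at authentication, C' = X' XOR H is decoded to recover the key
# when X' is close enough to X. A 3x repetition code replaces BCH/RS here.
def repetition_encode(bits):
    return [b for b in bits for _ in range(3)]

def repetition_decode(bits):
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

key_bits = [1, 0, 1, 1]
X = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1]        # quantized value at registration
H = xor(X, repetition_encode(key_bits))          # first auxiliary data
X_auth = [0, 1, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0]    # close to X (2 bit errors)
recovered = repetition_decode(xor(X_auth, H))    # key to be authenticated
assert recovered == key_bits
```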
The following illustrates how decoding operations corresponding to the encoding operations are performed.
When the second embodiment is implemented as embodiment 2.1, in which the first auxiliary data is a set of points, step S750 may be implemented as follows:
s7501, taking one first auxiliary data in at least one first auxiliary data as current first auxiliary data, screening query points in the current first auxiliary data according to a quantized value to be authenticated, and obtaining a query point set corresponding to the current first auxiliary data, wherein the query point set comprises a plurality of query points; the first coordinate component of the query point is determined according to the quantized value to be authenticated.
For example, each first auxiliary data in the at least one first auxiliary data may be sequentially used as the current first auxiliary data, or the at least one first auxiliary data may be screened, and each first auxiliary data in the screened first auxiliary data may be sequentially used as the current first auxiliary data.
Illustratively, the first coordinate component of the query point is a quantized value to be authenticated; for example, for the quantized values to be authenticated X' = (x1', x2', …, xn'), the first coordinate components of the query points in the query point set are x1', x2', …, xn', respectively.
For example, if the points in the first auxiliary point set were obtained by performing the fourth transformation process on the first points in the registration stage, the first coordinate component of the query point is obtained by applying, to the quantized value to be authenticated, the inverse of the first transformed component of the fourth transformation process. This can be expressed as (Φ_d^x)^(-1)(X'), where Φ_d^x denotes the first transformed component of the fourth transformation process used to obtain the first coordinate component of the first auxiliary point. It will be appreciated that the accurate descriptor to be authenticated is used in this inverse transformation.
S7502, according to the query point set, recovering the key to be authenticated corresponding to the current auxiliary data by using a decoding algorithm.
For example, if in the registration stage, the point in the first auxiliary point set is directly obtained by the first point, the key to be authenticated corresponding to the current auxiliary data is recovered by using a decoding algorithm according to the query point set.
For example, if in the registration stage the points in the first auxiliary point set were obtained by performing the fourth transformation process on the first points, a query curve point set is obtained from the query point set (the inverse of the second transformed component of the fourth transformation process is applied to the second coordinate component of each query point in the query point set, where the accurate descriptor to be authenticated is used in the inverse transformation), and the key to be authenticated corresponding to the current auxiliary data is recovered from the query curve point set by using a decoding algorithm. The decoding algorithm is, for example, an algorithm that recovers the polynomial by Lagrange interpolation.
It can be appreciated that if there are enough points in the query point set or the query curve point set to coincide with the points in the first point set, the key to be authenticated corresponding to the current auxiliary data is consistent with the key corresponding to the first auxiliary data.
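The following sketch illustrates steps S7501-S7502 under the assumption that the first auxiliary data is a fuzzy-vault-style point set over a small prime field, where genuine points lie on a secret polynomial whose constant term carries the key and the remaining points are chaff; the field size, the polynomial and the point values are illustrative, and the decoding algorithm shown is plain Lagrange interpolation.

```python
# Sketch of S7501-S7502: screen query points whose first coordinate matches the
# quantized values to be authenticated, then recover the key (the constant term
# of the secret polynomial) by Lagrange interpolation over GF(p).
P = 97  # small prime field, for illustration only

def lagrange_constant_term(points, p=P):
    """Recover f(0) of the interpolating polynomial by Lagrange interpolation."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * (-xj)) % p
                den = (den * (xi - xj)) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret

# f(x) = 3x^2 + 5x + 42 over GF(97); the key is the constant term 42.
vault = [(10, (3*10*10 + 5*10 + 42) % P), (20, (3*20*20 + 5*20 + 42) % P),
         (30, (3*30*30 + 5*30 + 42) % P), (55, 7), (71, 88)]  # last two are chaff
X_auth = [10, 20, 30]                             # quantized values to be authenticated
query = [pt for pt in vault if pt[0] in X_auth]   # S7501: screen query points
assert lagrange_constant_term(query) == 42        # S7502: recover the key
```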
When the second embodiment is implemented as embodiment 2.2, in which the first auxiliary data is the parameters of the first generation curve, step S750 may be implemented as follows:
s7503, taking one first auxiliary data in at least one first auxiliary data as current first auxiliary data, determining a query point corresponding to the current first auxiliary data according to a quantized value to be authenticated, and obtaining a query point set corresponding to the current first auxiliary data, wherein the query point set comprises a plurality of query points; the first coordinate component of the query point is determined according to the quantized value to be authenticated, and the second coordinate component of the query point and the first coordinate component of the query point meet the current mapping relation corresponding to the current first auxiliary data;
For example, each first auxiliary data in the at least one first auxiliary data may be sequentially used as the current first auxiliary data, or the at least one first auxiliary data may be screened, and each first auxiliary data in the screened first auxiliary data may be sequentially used as the current first auxiliary data.
Illustratively, the first coordinate component of the query point is a quantized value to be authenticated; for example, for the quantized values to be authenticated X' = (x1', x2', …, xn'), the first coordinate components of the query points in the query point set are x1', x2', …, xn', respectively. The parameter values of the first generation curve are recorded in the current first auxiliary data, so the second coordinate component of the query point can be determined from the first coordinate component of the query point.
Illustratively, if the first auxiliary data in the registration stage was derived using the accurate descriptor, the first coordinate component of the query point is obtained by applying the inverse transformation of Φ_d^x to the quantized value to be authenticated. This can be expressed as (Φ_d^x)^(-1)(X'), where Φ_d^x is the Φ_d^x used in determining the first mapping relation in embodiment 2.2. It will be appreciated that the accurate descriptor to be authenticated is used in this inverse transformation.
S7504, recovering the key to be authenticated corresponding to the current auxiliary data by using a decoding algorithm according to the query point set.
See description of step S7502.
The decoding algorithm is, for example, an algorithm that recovers the polynomial by Lagrange interpolation.
It can be appreciated that if there are enough points in the query point set or the query curve point set to coincide with points in the second point set, the key to be authenticated corresponding to the current auxiliary data is consistent with the key corresponding to the first auxiliary data.
In embodiments 2.1 and 2.2, the error correction codewords are determined from the auxiliary data by error correction decoding (RS decoding), and the key to be authenticated is determined from the error correction codewords by Lagrange interpolation.
When the second embodiment is implemented as embodiment 2.3, step S750 may be implemented as follows:
s7505, taking one first auxiliary data in the at least one first auxiliary data as current first auxiliary data, determining inverse operation of replacement operation according to the to-be-authenticated biological vector corresponding to the to-be-authenticated quantized value, applying the inverse operation of replacement operation on the current first auxiliary data or the preprocessed current first auxiliary data, and determining error correction code words corresponding to the current first auxiliary data;
it is understood that the substitution operation in step S7505 is the same as the substitution operation in step S1205, the substitution operation in step S1205 is determined according to the biological vector corresponding to the quantized value, and the substitution operation in step S7505 is determined according to the biological vector to be authenticated corresponding to the quantized value to be authenticated. It can be understood that the actual execution process may not have the step of determining the replacement operation in step S7505 according to the to-be-authenticated biological vector corresponding to the to-be-authenticated quantized value, and may directly determine the inverse operation of the replacement operation.
If step S1205 includes step S1205b, the current first auxiliary data is preprocessed, where the preprocessing includes performing the inverse operation of the fifth transformation on the current first auxiliary data, i.e. Φ_d^(-1)(H) or Φ_d^(-1)(X', H). It will be appreciated that the accurate descriptor to be authenticated is used in this inverse operation.
S7506, obtaining the key to be authenticated corresponding to the current first auxiliary data according to the error correction code word corresponding to the current first auxiliary data.
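A minimal sketch of the inverse-substitution step S7505 follows, under the assumption that the substitution operation is a coordinate permutation derived from the ordering of the biometric vector; this concrete construction is hypothetical, and it only illustrates that a biometric vector to be authenticated which orders the same way as at registration yields the same permutation and therefore recovers the same error correction codeword.

```python
# Sketch of S7505: apply the inverse of the biometric-vector-determined
# substitution (here: a sort-order permutation, a hypothetical construction)
# to the first auxiliary data to recover the error correction codeword.
def permutation_from_vector(bio_vector):
    """Permutation given by the sort order of the biometric vector (illustrative)."""
    return sorted(range(len(bio_vector)), key=lambda i: bio_vector[i])

def apply(perm, data):
    return [data[i] for i in perm]

def invert(perm, data):
    out = [0] * len(perm)
    for pos, i in enumerate(perm):
        out[i] = data[pos]
    return out

codeword = [5, 9, 1, 7]
bio_vec = [0.8, 0.1, 0.5, 0.3]                   # biometric vector at registration
H = apply(permutation_from_vector(bio_vec), codeword)   # first auxiliary data
bio_vec_auth = [0.79, 0.12, 0.49, 0.31]          # similar vector at authentication
recovered = invert(permutation_from_vector(bio_vec_auth), H)
assert recovered == codeword  # same ordering -> same permutation -> same codeword
```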
When the third embodiment is implemented as embodiment 3.1, step S750 may be implemented as follows:
s7507, as with S7505, it is understood that the substitution operation of step S7507 is the same as the substitution operation in step S1401, the substitution operation of step S1401 is determined according to the biological vector corresponding to the quantized value, and the substitution operation of step S7507 is determined according to the biological vector to be authenticated corresponding to the quantized value to be authenticated. It can be understood that the actual execution process may not have the step of determining the replacement operation in step S7507 according to the to-be-authenticated biological vector corresponding to the to-be-authenticated quantized value, and may directly determine the inverse operation of the replacement operation.
Illustratively, if the sixth transformation is performed in S1401, the current first auxiliary data needs to be preprocessed, the preprocessing including performing the inverse operation of the sixth transformation on the current first auxiliary data, i.e. Φ_d^(-1)(H) or Φ_d^(-1)(X', H). It will be appreciated that the accurate descriptor to be authenticated is used in this inverse operation.
S7508, restoring the biological vector corresponding to the quantized value used when the first auxiliary data is currently restored according to the error correction code words determined in S7507;
s7509, determining a key to be authenticated corresponding to the current first auxiliary data according to the quantized value.
The key to be authenticated corresponding to the current first auxiliary data is generated according to the quantized value and the random amount analyzed in the current first auxiliary data.
When the third embodiment is implemented as embodiment 3.2, step S750 may be implemented as follows:
s7510, taking one first auxiliary data in the at least one first auxiliary data as current first auxiliary data, and determining an error correction code word corresponding to the current first auxiliary data according to the current first auxiliary data and the biological vector to be authenticated;
for example, each first auxiliary data in the at least one first auxiliary data may be sequentially used as the current first auxiliary data, or the at least one first auxiliary data may be screened, and each first auxiliary data in the screened first auxiliary data may be sequentially used as the current first auxiliary data.
Illustratively, if the sixth transformation is performed in S1403, the current first auxiliary data needs to be preprocessed, the preprocessing including performing the inverse operation of the sixth transformation on the current first auxiliary data, i.e. Φ_d^(-1)(H) or Φ_d^(-1)(X', H). It will be appreciated that the accurate descriptor to be authenticated is used in this inverse operation.
S7511, recovering the biological vector corresponding to the quantized value used when the first auxiliary data is currently recovered according to the error correction code words determined in S7510;
s7512, determining a key to be authenticated corresponding to the current first auxiliary data according to the quantized value.
The key to be authenticated corresponding to the current first auxiliary data is generated according to the quantized value and the random amount analyzed in the current first auxiliary data.
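As a sketch of the key derivation in S7509/S7512, the example below assumes the key is derived from the recovered quantized value and the random amount parsed from the current first auxiliary data by a hash-based derivation; the use of SHA-256 and the byte encodings are illustrative assumptions.

```python
# Sketch of S7509/S7512: derive the key to be authenticated from the recovered
# quantized value and the random amount parsed from the current first auxiliary
# data. SHA-256 is a hypothetical choice of derivation function.
import hashlib

def derive_key(quantized_value: bytes, random_amount: bytes) -> bytes:
    return hashlib.sha256(quantized_value + random_amount).digest()

# The registration stage derives the key the same way, so a correctly recovered
# quantized value plus the stored random amount reproduces the same key.
assert derive_key(b"recovered-X", b"random-r") == derive_key(b"recovered-X", b"random-r")
```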
S760, checking the key to be authenticated, and determining the identity authentication result of the object to be authenticated.
Corresponding to the second (steps S109, S110-S120) or third embodiment (steps S129, S130-S150) of step S430, in one specific embodiment, multi-factor authentication is performed, where step S730 includes:
S740, quantizing the biometric template to be authenticated to obtain a quantized value to be authenticated.
S750, performing decoding operation corresponding to the encoding operation according to the quantized value to be authenticated and the base auxiliary data, and determining at least one key to be authenticated corresponding to at least one first auxiliary data contained in the base auxiliary data;
S770, performing ninth transformation processing in S612 according to part or all of the key to be authenticated and the second transformation key to obtain a multi-factor key to be authenticated;
the second transformation key is determined from the user password received during the authentication phase, for example directly using the user password during the authentication phase as the second transformation key or generating the second transformation key from the received user password. The second transformation key is generated from the user password, for example, by performing one or more of format conversion, bit number filling, and verification information addition on the user password. It will be appreciated that the second transformation key is determined by the user password during the authentication phase in the same manner as the first transformation key is determined by the user password during the enrolment phase.
It will be appreciated that the registration stage may use the key corresponding to the first auxiliary data and the first transformation key as variables of the ninth transformation process, and the authentication stage may use the key to be authenticated and the second transformation key as variables of the ninth transformation process. That is, the ninth transformation process variables of the registration phase and the authentication phase are different, and the parameters are the same.
Illustratively, performing a ninth transformation process according to a part of the key to be authenticated and the second transformation key to obtain the multi-factor key to be authenticated may be performing the ninth transformation process according to the key to be authenticated and the second transformation key that pass the preliminary verification to obtain the multi-factor key to be authenticated; or performing ninth transformation processing according to the target key to be authenticated and the second transformation key which pass the verification, and obtaining the multi-factor key to be authenticated.
It can be understood that the authentication efficiency can be improved by performing transformation processing on part of the keys to be authenticated instead of all the keys to be authenticated.
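A sketch of the ninth transformation for the multi-factor case follows, assuming it is realized as an HMAC-SHA256 combination of a recovered key with the password-derived transformation key; the HMAC construction and the password padding are illustrative assumptions, the only requirement carried over from the text being that the registration stage and the authentication stage apply the same transformation with the same parameters.

```python
# Sketch of the ninth transformation: combine a key with the password-derived
# transformation key. HMAC-SHA256 and the padding rule are hypothetical choices.
import hmac, hashlib

def transformation_key(password: str) -> bytes:
    # format conversion + bit-number filling of the user password (illustrative)
    return password.encode("utf-8").ljust(32, b"\x00")

def multi_factor_key(key: bytes, transform_key: bytes) -> bytes:
    return hmac.new(transform_key, key, hashlib.sha256).digest()

# Registration: key k + first transformation key -> stored second hash value.
# Authentication: key to be authenticated + second transformation key -> compared.
k = b"recovered-key"
mfk_reg  = multi_factor_key(k, transformation_key("correct horse"))
mfk_auth = multi_factor_key(k, transformation_key("correct horse"))
assert mfk_reg == mfk_auth  # same key and same password -> same multi-factor key
```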
S780, checking the multi-factor key to be authenticated, and determining an identity authentication result of the object to be authenticated.
Specifically, the verifying the multi-factor key to be authenticated in step S780 includes:
S7801, determining the multi-factor hash values to be authenticated corresponding to part or all of the multi-factor keys to be authenticated according to the part or all of the multi-factor keys to be authenticated, or determining the multi-factor hash values to be authenticated corresponding to part or all of the multi-factor keys to be authenticated according to the part or all of the multi-factor keys to be authenticated and the second hash parameters corresponding to the first auxiliary data corresponding to those multi-factor keys to be authenticated;
For one multi-factor hash value to be authenticated, if there is a base second hash value that matches the multi-factor hash value to be authenticated, the verification result of the multi-factor key to be authenticated corresponding to that multi-factor hash value to be authenticated is that verification passed.
The base second hash value is the second hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base second hash value is a second hash value meeting a specified condition; the second hash value is included in the registration information.
It can be appreciated that in this embodiment, the key is verified based on the hash value of the key.
If the registration information is stored, the second hash value is stored in association with the first auxiliary data, and after determining which first auxiliary data are contained in the base auxiliary data, the second hash value corresponding to the first auxiliary data can be queried, and at this time, the base second hash value is the second hash value corresponding to the first auxiliary data contained in the base auxiliary data. If the second hash value is not associated with the first auxiliary data when the registration information is stored, it is determined that the first auxiliary data included in the base auxiliary data cannot be queried for the second hash value corresponding to the first auxiliary data, and at this time, the base second hash value may be a second hash value satisfying a specified condition, for example, a full-scale second hash value or a second hash value narrowed according to some limiting factors. Limiting factors are, for example, user identification, device identification, registration time, authentication location, etc. It will be appreciated that if the base second hash value is a second hash value that is narrowed down by a limiting factor, then the limiting factor and the second hash value need to be stored in association.
After the second hash value of the base is determined, the multi-factor hash value to be authenticated and the second hash value of the base can be compared, and for one multi-factor hash value to be authenticated, if the second hash value of the base matched with the multi-factor hash value to be authenticated exists in the second hash value of the base, the verification result of the multi-factor key to be authenticated corresponding to the multi-factor hash value to be authenticated is verification passing. For example, the first auxiliary data of the base contains 10 first auxiliary data, 10 multi-factor hash values to be authenticated are determined according to the 10 first auxiliary data, and the second hash value of the base is 50000. If one of the 10 multi-factor hash values to be authenticated is matched with the second hash value of the base, the verification result of the multi-factor key to be authenticated corresponding to the multi-factor hash value to be authenticated is verification passing.
It can be understood that when the device for generating the multi-factor hash value to be authenticated is not the same device as the device for storing the registration information, the multi-factor hash value to be authenticated and the second hash value of the base are required to be located in the same device through data transmission to perform comparison. For example, the registration information is stored in the server, the multi-factor hash value to be authenticated is generated by the terminal device, and the step of comparing the multi-factor hash value to the second hash value of the base can be performed by the terminal or the server.
It should be noted that, in S7801, only the hash value of a part of the multi-factor key to be authenticated may be calculated, for example, only the hash value of the multi-factor key to be authenticated that passes the preliminary verification is calculated.
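The hash-based verification of S7801 can be sketched as a set-membership check: each candidate multi-factor key to be authenticated is hashed, optionally salted with the second hash parameter stored alongside its first auxiliary data, and looked up among the base second hash values. SHA-256 and the field values below are illustrative assumptions.

```python
# Sketch of S7801: hash candidate multi-factor keys (with a per-record second
# hash parameter as optional salt) and check membership in the base second hash
# values determined from the registration information.
import hashlib

def second_hash(mf_key: bytes, hash_param: bytes = b"") -> str:
    return hashlib.sha256(hash_param + mf_key).hexdigest()

base_second_hashes = {second_hash(b"enrolled-mf-key", b"salt-1")}  # from registration

candidates = [(b"enrolled-mf-key", b"salt-1"), (b"wrong-mf-key", b"salt-2")]
matching = [second_hash(k, p) for k, p in candidates
            if second_hash(k, p) in base_second_hashes]
assert len(matching) == 1  # the matching hash value identifies the key that passed
```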
In one embodiment, the verifying the key to be authenticated in step S760 includes:
s7601, determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated, or determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated and first hash parameters corresponding to first auxiliary data corresponding to the keys to be authenticated;
For one hash value to be authenticated, if there is a base first hash value that matches the hash value to be authenticated, the verification result of the key to be authenticated corresponding to that hash value to be authenticated is that verification passed.
The base first hash value is the first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base first hash value is a first hash value meeting a specified condition; the first hash value is included in the registration information.
The explanation for S7601 can be analogized to S7801.
When step S7801 includes determining the multi-factor hash values to be authenticated corresponding to a part of the multi-factor keys to be authenticated, the method further includes, before S7801:
and S7800, performing preliminary verification on the multi-factor key to be authenticated to obtain the multi-factor key to be authenticated which passes the preliminary verification.
It can be understood that the partial multi-factor key to be authenticated in step S7801 refers to the multi-factor key to be authenticated that passes the preliminary verification.
S7800 includes S7800a, performing a preliminary verification of the multi-factor key to be authenticated using the verification information in the multi-factor key to be authenticated, or,
s7800b, generating a second check value to be authenticated according to the multi-factor key to be authenticated, and comparing the second check value to be authenticated with a second check value of a base to perform preliminary check;
the second check value of the base is a second check value corresponding to the first auxiliary data corresponding to the multi-factor key to be authenticated in the database. As described above, when the registration information includes the second check value, the second check value is stored in association with the first auxiliary data, and the corresponding second check value can be determined by determining one first auxiliary data. The multi-factor key to be authenticated is generated according to a certain first auxiliary data in the auxiliary data of the base, and the second check value corresponding to the first auxiliary data is the second check value of the base.
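A sketch of the preliminary check of S7800b follows, assuming the check value is a short non-cryptographic checksum (a truncated CRC32 is a hypothetical choice) stored in association with the first auxiliary data; only candidates passing this cheap check proceed to the hash comparison of S7801.

```python
# Sketch of S7800b: compare a short check value derived from the multi-factor key
# to be authenticated with the base second check value stored alongside the first
# auxiliary data. A truncated CRC32 is a hypothetical choice of check function.
import zlib

def check_value(key: bytes) -> int:
    return zlib.crc32(key) & 0xFFFF  # short, non-cryptographic check value

base_second_check = check_value(b"enrolled-mf-key")  # stored at registration

candidates = [b"enrolled-mf-key", b"wrong-mf-key"]
passed_preliminary = [k for k in candidates if check_value(k) == base_second_check]
# Only the keys passing this preliminary check are hashed and compared against
# the base second hash values in S7801.
```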
Similarly, when step S7601 includes determining the hash values to be authenticated corresponding to a part of the keys to be authenticated, the method further includes, before S7601:
s7600, carrying out preliminary verification on the key to be authenticated to obtain the key to be authenticated which passes the preliminary verification.
The description of S7600 may be analogous to the description of S7800.
It is understood that the partial key to be authenticated in step S7601 refers to the key to be authenticated that is preliminarily verified.
It can be understood that if the verification result of the key to be authenticated and/or the multi-factor key to be authenticated is that the verification is passed, the identity authentication result of the object to be authenticated is that the authentication is successful.
In a specific embodiment, the registration information further includes a user identification. The identity authentication method further comprises the following steps:
s790: determining a user identifier corresponding to a matching hash value according to the matching hash value;
illustratively, the matching hash value may be generated at the terminal device;
wherein the determining, according to the matching hash value, the user identifier corresponding to the matching hash value includes:
s7901, sending the matched hash value to a server, so that the server queries a user identifier corresponding to the matched hash value according to the matched hash value;
S7902, receiving a user identifier corresponding to the matched hash value;
or, S7903, inquiring, according to the matching hash value, a user identifier corresponding to the matching hash value;
the user identification may be understood in a broad sense, including information such as a user name, a document number, etc. The matching hash value is a hash value to be authenticated, which corresponds to the key to be authenticated, the verification result of which is verified, or the matching hash value is a multi-factor hash value to be authenticated, which corresponds to the multi-factor key to be authenticated, the verification result of which is verified.
In the registration stage, the user identification corresponding to the first auxiliary data can be stored in association with the first auxiliary data. Because it is undesirable for the user identification to be associated with sensitive information, when the first auxiliary data is stored in association with sensitive information, the user identification may be stored independently of the first auxiliary data, so as to avoid associating the user identification with the sensitive information.
The sensitive information is, for example, low-discrimination features extracted from the image to be processed used at the time of registration, low-discrimination features extracted from the first biometric template, low-discrimination features extracted from the quantized value, the registration time, and the like.
For example, the first auxiliary data may be stored in association with the sensitive information, hash parameters, check values and other information in a first data table, and the user identification may be stored in association with the first hash value and/or the second hash value in a second data table. In this way, when it is determined that there is a base first hash value matching the hash value to be authenticated of a key to be authenticated, and/or that there is a base second hash value matching the multi-factor hash value to be authenticated of a multi-factor key to be authenticated, the user identification can be determined in the second data table according to the matching hash value. The matching hash value is the hash value to be authenticated corresponding to a key to be authenticated whose verification result is that verification passed (i.e., the base first hash value matched with that hash value to be authenticated), or the multi-factor hash value to be authenticated corresponding to a multi-factor key to be authenticated whose verification result is that verification passed (i.e., the base second hash value matched with that multi-factor hash value to be authenticated). It can be understood that, under normal circumstances, when the key to be authenticated is used for verification to determine the identity authentication result, the hash value to be authenticated corresponding to the key to be authenticated that passed verification is the matching hash value; when the multi-factor key to be authenticated is used for verification to determine the identity authentication result, the multi-factor hash value to be authenticated corresponding to the multi-factor key to be authenticated that passed verification is the matching hash value.
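To make the two-table layout concrete, the sketch below keeps the first auxiliary data together with its hash parameter and sensitive features in a first table and maps hash values to user identifications in a second table, so that the user identification never shares a record with the sensitive information; the table and field names are illustrative assumptions.

```python
# Sketch of the two-table storage and the lookup of S790/S7903: the second data
# table maps the (first or second) hash value to the user identification, so the
# matching hash value found during verification directly yields the user.
import hashlib

enrolled_key = b"enrolled-mf-key"
second_hash_value = hashlib.sha256(enrolled_key).hexdigest()

first_data_table = {  # first auxiliary data + sensitive info, no user identification
    "record-001": {"first_auxiliary_data": b"helper-bytes",
                   "hash_param": b"salt-1",
                   "low_discrimination_features": b"coarse-features"},
}
second_data_table = {  # hash value -> user identification, no sensitive info
    second_hash_value: "user-42",
}

def lookup_user(matching_hash_value: str):
    """Query the user identification corresponding to the matching hash value."""
    return second_data_table.get(matching_hash_value)

assert lookup_user(second_hash_value) == "user-42"
```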
When the first embodiment is adopted in step S430, step S730 includes:
s800, performing the coding operation according to the biometric template to be authenticated and the authentication key to obtain auxiliary data to be authenticated;
the authentication key is illustratively entered by or determined from the input of the object to be authenticated during the authentication phase. The authentication key is generated, for example, by performing one or more of the processes of format conversion, bit padding, and verification information addition on the input of the object to be authenticated. It can be understood that the authentication key and the key corresponding to the first auxiliary data in the first implementation mode of the registration stage are obtained by the same processing.
In this embodiment, the encoding operation is the feature conversion of the same form in step S100, but the feature conversion in this embodiment is determined based on the authentication key.
In the present embodiment, in the registration stage, F_k(X) = H; in the authentication stage, F_k'(X') = H'. Here F is the feature transformation function, k is the key corresponding to the first auxiliary data, k' is the authentication key acquired in the authentication stage, X and X' are the biometric templates of the registration stage and the authentication stage respectively, and H and H' are the auxiliary data of the registration stage and the authentication stage respectively, where the auxiliary data of the authentication stage is the auxiliary data to be authenticated.
S801, determining an identity authentication result of the object to be authenticated according to a comparison result of the auxiliary data to be authenticated and the auxiliary data of the base.
If X and X' are close enough and k' = k, then H and H' are close enough. Therefore, the identity authentication result of the object to be authenticated can be determined according to the comparison result of the auxiliary data to be authenticated and the base auxiliary data.
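The comparison in the first embodiment can be sketched as follows; the keyed-offset transform F and the distance threshold are hypothetical stand-ins for the feature conversion and the comparison rule, chosen only to show that an equal key and a sufficiently similar template yield auxiliary data to be authenticated that is sufficiently close to the base auxiliary data.

```python
# Sketch of the first embodiment: H = F_k(X) at registration, H' = F_k'(X') at
# authentication, and the identity authentication result follows from comparing
# H' with the base H. The keyed-offset transform and tolerance are illustrative.
import hashlib

def F(key: str, template):
    # keyed feature transformation (illustrative): add key-derived offsets
    offsets = hashlib.sha256(key.encode()).digest()
    return [x + offsets[i % len(offsets)] for i, x in enumerate(template)]

def close_enough(h1, h2, tol=3):
    return sum(abs(a - b) for a, b in zip(h1, h2)) <= tol

H = F("k", [10, 20, 30, 40])         # registration: base auxiliary data
H_auth = F("k", [11, 20, 29, 40])    # authentication: k' = k and X' close to X
assert close_enough(H, H_auth)       # comparison result -> authentication succeeds
```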
Example seven
Based on the same application conception, the embodiment of the application also provides an identity authentication device corresponding to the identity authentication method, and since the principle of solving the problem of the device in the embodiment of the application is similar to that of the embodiment of the identity authentication method, the implementation of the device in the embodiment of the application can be referred to the description in the embodiment of the method, and the repetition is omitted.
The embodiment of the application provides the functional modules of an identity authentication device. Each module in the identity authentication device in this embodiment is configured to perform each step in the above-described method embodiment. The identity authentication device comprises: a second acquisition module, a third determination module and a third acquisition module; wherein:
the second acquisition module is used for acquiring an image to be authenticated of an object to be authenticated, wherein the image to be authenticated comprises a second biological characteristic area, and the second biological characteristic area comprises a plurality of second characteristic points;
A third determining module, configured to determine a biometric template to be authenticated of the second biometric region according to the second biometric region;
the third acquisition module is used for acquiring an identity authentication result of the object to be authenticated, wherein the identity authentication result is determined according to the biometric template to be authenticated and the base auxiliary data.
Wherein the base auxiliary data comprises at least one first auxiliary data; the first auxiliary data is included in the registration information determined by the identity registration method provided by the fourth embodiment.
Example eight
Fig. 7 is a flowchart of an identity registration method according to an embodiment of the present application. The method in this embodiment is similar to the identity registration method provided in the fourth embodiment, and is different in that the method provided in the fourth embodiment is based on the identity registration method provided at the terminal device side, and the method provided in this embodiment is based on the identity registration method provided at the server side. The specific flow shown in fig. 7 will be described in detail.
Step 810, receiving registration information sent by a terminal device.
The above registration information may be determined by the identity registration method provided in the fourth embodiment. The identity registration method in the present embodiment can be applied to a server.
Step 820, storing the registration information in a database.
The registration information includes first assistance data.
Corresponding to the various scenarios shown in Table 3, the registration information further includes a first hash value and a first hash parameter, and/or a second hash value and a second hash parameter.
It should be noted that, when the registration information includes a hash parameter, the hash parameter must be stored in association with the first auxiliary data, so that after the key to be authenticated or the multi-factor key to be authenticated is determined according to the first auxiliary data, the corresponding hash value to be authenticated or multi-factor hash value to be authenticated can be determined according to the hash parameter. Accordingly, when the hash value is stored in step S820, the stored hash parameter is associated with the first auxiliary data. As described above, the registration information may further include a first check value and/or a second check value; when the first check value and/or the second check value are included in the registration information, they are stored in the database in association with the first auxiliary data, so that after the key to be authenticated or the multi-factor key to be authenticated is determined according to the first auxiliary data, it can be preliminarily checked according to the check value.
Example nine
Based on the same application conception, the embodiment of the application also provides an identity registration device corresponding to the identity registration method, and since the principle of solving the problem of the device in the embodiment of the application is similar to that of the embodiment of the identity registration method, the implementation of the device in the embodiment of the application can be referred to the description in the embodiment of the method, and the repetition is omitted.
The embodiment of the application provides the functional modules of an identity registration device. Each module in the identity registration apparatus in this embodiment is configured to perform each step in the above-described method embodiment. The identity registration apparatus includes: a receiving module and a second storage module; wherein:
the receiving module is used for receiving registration information sent by the terminal equipment, wherein the registration information is determined by the identity registration method;
and the second storage module is used for storing the registration information into a database, wherein the registration information comprises first auxiliary data.
Example ten
Fig. 8 is a flowchart of an identity authentication method according to an embodiment of the present application. The identity authentication method provided in this embodiment is similar to the identity authentication method provided in the sixth embodiment, and is different in that the identity authentication method provided in the sixth embodiment is based on the identity authentication method provided by the terminal device, and the identity authentication method provided in this embodiment is based on the identity authentication method of the server. The specific flow shown in fig. 8 will be described in detail.
Step 910, receiving an authentication request sent by the terminal device.
And step 920, determining the auxiliary data of the base from the registration information according to the authentication request.
Wherein the base auxiliary data comprises at least one first auxiliary data.
For the 1:N case, the bottom library auxiliary data can be the total first auxiliary data in the stored registration information, or can be the first auxiliary data after preliminary screening. For example, the base auxiliary data is screened from the full first auxiliary data according to the user identification, the equipment identification, the registration time, the authentication place, the low-discrimination features extracted from the image to be authenticated, the low-discrimination features extracted from the biometric template to be authenticated, the low-discrimination features extracted from the quantized value to be authenticated, and other limiting factors. For a 1:1 scenario, the base assistance data may be first assistance data that is uniquely determined based on user identification, device identification, etc. It will be appreciated that the information used to determine the vault assistance data from the registration information may be sent by the terminal device to the server, which may be included in the authentication request or may be sent independently of the authentication request.
Wherein the registration information is the registration information stored in the database by the identity registration method.
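A sketch of determining the base auxiliary data in step 920 follows, assuming the registration records are stored together with their limiting factors so that the set can be narrowed for the 1:1 case or kept broad for the 1:N case; the record structure and filter keys are illustrative assumptions.

```python
# Sketch of step 920: for 1:1 the record is picked uniquely by user/device
# identification; for 1:N the full set is kept or narrowed by limiting factors
# stored alongside the first auxiliary data. Field names are illustrative.
records = [
    {"first_auxiliary_data": b"aux-A", "user_id": "user-42", "device_id": "dev-1"},
    {"first_auxiliary_data": b"aux-B", "user_id": "user-43", "device_id": "dev-2"},
]

def base_auxiliary_data(records, user_id=None, device_id=None):
    """Return all records matching the given limiting factors (1:N keeps them all)."""
    out = records
    if user_id is not None:
        out = [r for r in out if r["user_id"] == user_id]
    if device_id is not None:
        out = [r for r in out if r["device_id"] == device_id]
    return out

assert len(base_auxiliary_data(records)) == 2                      # 1:N, no screening
assert len(base_auxiliary_data(records, user_id="user-42")) == 1   # 1:1, unique record
```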
The identity authentication method provided by the embodiment can authenticate the biometric template to be authenticated based on the determined base auxiliary data. Specifically, the authentication can be performed at the server, or the base auxiliary data can be sent to the terminal device, and the terminal device or the terminal device and the server cooperate to perform the authentication.
In one implementation manner of authentication by the server, the identity authentication method provided in this embodiment may further include the following steps 930 to 950.
Step 930, receiving a biometric template to be authenticated or a quantized value to be authenticated, which is sent by the terminal device, through a secure channel;
the specific embodiment of step S930 may be described in S730.
Step 940, determining an identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data;
or determining an identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data;
or, quantizing according to the biometric template to be authenticated to obtain a quantized value to be authenticated, and determining an identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data;
The to-be-authenticated quantized value is obtained by quantization according to the to-be-authenticated biometric template.
It can be understood that if the biometric template to be authenticated is received, the server performs the step of obtaining a quantized value to be authenticated by quantizing according to the biometric template to be authenticated; see S740 for a detailed description of this step. If the quantized value to be authenticated is received, the server does not need to execute the step of obtaining the quantized value to be authenticated by quantizing according to the biometric template to be authenticated.
The specific embodiment of step S940 may be described in S730.
Step 950, sending the identity authentication result of the object to be authenticated to the terminal device.
It is understood that the identity authentication result may include one or more of: whether authentication succeeded or failed, the user identification of the successfully authenticated user, the key to be authenticated that passed verification, and the matching hash value.
In one embodiment, corresponding to the second and third embodiments of S430, step 940 includes:
s960, performing decoding operation corresponding to the encoding operation according to the quantized value to be authenticated and the base auxiliary data, and determining at least one key to be authenticated, which corresponds to at least one first auxiliary data contained in the base auxiliary data one by one;
S970, checking the key to be authenticated, and determining the identity authentication result of the object to be authenticated.
The description of S960 and S970 is referred to the description of S750 and S760. The only difference is that S960 and S970 are performed by the server, and S750 and S760 are performed by the terminal device.
In one embodiment, the verifying the key to be authenticated in step S970 includes:
s9701, determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated, or determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated and first hash parameters corresponding to first auxiliary data corresponding to the keys to be authenticated;
For one hash value to be authenticated, if there is a base first hash value that matches the hash value to be authenticated, the verification result of the key to be authenticated corresponding to that hash value to be authenticated is that verification passed.
The base first hash value is the first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base first hash value is a first hash value meeting a specified condition; the first hash value is included in the registration information.
S7601 can be seen for a description of S9701. The only difference is that S9701 is performed by the server and S7601 is performed by the terminal device.
When step S9701 includes determining the hash values to be authenticated corresponding to a part of the keys to be authenticated, the method further includes, before S9701:
s9700, carrying out preliminary verification on the key to be authenticated to obtain the key to be authenticated which passes the preliminary verification.
See S7600 for a description of S9700. The only difference is that S9700 is performed by the server and S7600 is performed by the terminal device.
For the case where the terminal device performs authentication, or the terminal device and the server cooperate to perform authentication, the identity authentication method further comprises S921: the server sends the base auxiliary data to the terminal device, where the base auxiliary data is used by the terminal device to determine the identity authentication result of the object to be authenticated.
In a specific embodiment, when the terminal device and the server cooperate to perform authentication, the server may receive the hash value to be authenticated and/or the multi-factor hash value to be authenticated, and the server performs the step of comparing the multi-factor hash value to be authenticated with the base second hash value and/or comparing the hash value to be authenticated with the base first hash value. Alternatively, the server may issue the base first hash value and the base second hash value, and the terminal performs the step of comparing the multi-factor hash value to be authenticated with the base second hash value and/or comparing the hash value to be authenticated with the base first hash value. That is, the server may also perform at least one of the following steps. It will be appreciated that when two of SA1, SA2, SB1 and SB2 are performed, one of SA1 and SA2 and one of SB1 and SB2 may be selected respectively; SA1 and SA2 cannot both be performed at the same time, nor can SB1 and SB2.
SA1, transmitting a first hash value of a base to the terminal equipment so that the terminal equipment compares the hash value to be authenticated with the first hash value of the base and determines the first hash value of the base matched with the hash value to be authenticated;
SA2, receiving a hash value to be authenticated sent by the terminal equipment, comparing the hash value to be authenticated with the first hash value of the base, and determining the first hash value of the base matched with the hash value to be authenticated;
SB1, transmitting a second hash value of a base to the terminal equipment so that the terminal equipment compares the multi-factor hash value to be authenticated with the second hash value of the base and determines the second hash value of the base matched with the multi-factor hash value to be authenticated;
SB2, receiving a multi-factor hash value to be authenticated sent by the terminal equipment, comparing the multi-factor hash value to be authenticated with the second hash value of the base, and determining the second hash value of the base matched with the multi-factor hash value to be authenticated;
wherein the hash value to be authenticated is determined according to the base auxiliary data; the first hash value of the base is a first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the first hash value of the base is a first hash value meeting specified conditions in the database; the multi-factor hash value to be authenticated is determined according to the base auxiliary data; the second hash value of the base is a second hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the second hash value of the base is a second hash value meeting specified conditions in the database.
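As a rough illustration of steps SA1/SA2 and SB1/SB2 (not the patented scheme itself), the following Python sketch uses hypothetical helper names (`hash_key`, `find_matching_base_hash`) and placeholder data to show how a hash value derived from a recovered key could be compared against the base first or second hash values; depending on which data is transmitted, the same comparison can run at either the terminal device or the server.

```python
# Minimal sketch, assuming hypothetical helper names and placeholder data;
# not the patented scheme itself.
import hashlib
import hmac


def hash_key(key: bytes, salt: bytes = b"", rounds: int = 1) -> bytes:
    """Hash a (multi-factor) key with an optional salt and round count
    (the first/second hash parameters mentioned in the text)."""
    digest = key
    for _ in range(max(rounds, 1)):
        digest = hashlib.sha256(salt + digest).digest()
    return digest


def find_matching_base_hash(candidate: bytes, base_hashes: list):
    """Return the base hash value matching the candidate, or None (SA/SB comparison)."""
    for base_hash in base_hashes:
        if hmac.compare_digest(candidate, base_hash):
            return base_hash
    return None


# The comparison may run at the terminal (after SA1/SB1 deliver the base hashes)
# or at the server (after SA2/SB2 deliver the candidate hash).
recovered_key = b"key-recovered-with-the-base-auxiliary-data"   # placeholder
candidate = hash_key(recovered_key, salt=b"demo-salt", rounds=2)
base_first_hashes = [candidate]                                  # toy base for the demo
print("match" if find_matching_base_hash(candidate, base_first_hashes) else "no match")
```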
In a specific embodiment, the registration information further includes a user identification. The identity authentication method further comprises the following steps:
S992, querying, in a second data table of the database, the user identifier corresponding to the matching hash value according to the matching hash value;
for example, the matching hash value may be generated at the server, or may be generated at the terminal device and then transmitted to the server;
S993, sending the user identifier corresponding to the matching hash value to the terminal device;
wherein the database comprises a first data table and a second data table; the first auxiliary data is stored in the first data table; the registration information also comprises a user identifier; the user identifier and the second hash value are stored in association in the second data table; and/or, the user identifier and the first hash value are stored in association in the second data table;
the matching hash value is a hash value to be authenticated corresponding to a key to be authenticated whose verification result is a pass, or the matching hash value is a multi-factor hash value to be authenticated corresponding to a multi-factor key to be authenticated whose verification result is a pass.
For a related description, refer to step S790. Steps S992-S993 are executed at the server.
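A minimal sketch of the two-table layout and the S992 lookup follows; the table and column names are illustrative and not taken from the patent.

```python
# Illustrative two-table layout: the first data table holds first auxiliary data,
# the second associates a user identifier with the first and/or second hash value.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE first_data_table (aux_id INTEGER PRIMARY KEY, first_auxiliary_data BLOB)")
conn.execute("CREATE TABLE second_data_table (user_id TEXT, first_hash BLOB, second_hash BLOB)")
conn.execute("INSERT INTO first_data_table VALUES (1, ?)", (b"encoded-biometric-template",))
conn.execute("INSERT INTO second_data_table VALUES (?, ?, ?)", ("user-001", b"h1", b"h2"))


def lookup_user(matching_hash: bytes):
    """S992: query the user identifier associated with the matching hash value."""
    row = conn.execute(
        "SELECT user_id FROM second_data_table WHERE first_hash = ? OR second_hash = ?",
        (matching_hash, matching_hash),
    ).fetchone()
    return row[0] if row else None


print(lookup_user(b"h2"))  # -> user-001, which S993 then sends to the terminal device
```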
Corresponding to the first embodiment of step S430, in one embodiment, step 940 includes,
performing the coding operation according to the biometric template to be authenticated and the authentication key to obtain auxiliary data to be authenticated;
and determining the identity authentication result of the object to be authenticated according to the comparison result of the auxiliary data to be authenticated and the auxiliary data of the database.
For a description of these two steps, see S800 and S801; the only difference is that these two steps are performed at the server, while S800 and S801 are performed at the terminal device.
It can be understood that if the terminal device needs to use the first hash parameter or the second hash parameter, the first check value or the second check value, the base first hash value or the base second hash value, or the user identifier associated with the base first hash value or the base second hash value, the server also sends the relevant data to the terminal device.
Example eleven
Based on the same inventive concept, an embodiment of the present application further provides an identity authentication apparatus corresponding to the identity authentication method. Since the principle by which the apparatus solves the problem is similar to that of the identity authentication method embodiment, the implementation of the apparatus may refer to the description of the method embodiment, and repeated details are omitted.
Each module in the identity authentication apparatus in this embodiment is configured to perform a corresponding step in the above-described method embodiment. The identity authentication apparatus comprises a second receiving module and a base auxiliary data determining module; wherein:
the second receiving module is used for receiving the authentication request sent by the terminal equipment;
the base auxiliary data determining module is used for determining base auxiliary data from the registration information according to the authentication request; wherein the base auxiliary data comprises at least one first auxiliary data.
Example twelve
An embodiment of the present application further provides a key usage method. The specific flow of the method is described in detail below.
Step 1010, identity authentication is performed on the object to be authenticated using an identity authentication method.
If the identity authentication of the object to be authenticated is successful, step 1020 is executed.
Alternatively, the identity authentication method used in this embodiment may be similar to the identity authentication method provided in the sixth or tenth embodiment; for details of step 1010, refer to the description in the sixth or tenth embodiment, which is not repeated here.
It can be understood that the identity authentication method of step S1010 needs to be one in which the key to be authenticated or the multi-factor key to be authenticated is determined during authentication, that is, the identity authentication result is determined by verifying the key to be authenticated or the multi-factor key to be authenticated. The generation and verification of the key to be authenticated or the multi-factor key to be authenticated may be performed at the terminal and/or the server. If the verified key to be authenticated or the verified multi-factor key to be authenticated is determined at the server, the server needs to send it to the terminal device so that the terminal device can use the verified key for subsequent application processing. Typically, key transfer between the server and the terminal device takes place between the TEEs of both over a secure channel. For example, the verified key may be included in the authentication result sent to the terminal device by the server.
Step 1020, performing one or more of digital signature, message encryption, message decryption, application login, and digital wallet management by using the verified key determined by the identity authentication method.
The verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated used to generate a verified multi-factor key to be authenticated.
It can be understood that the identity authentication method used in step S1010 performs identity authentication by verifying the key to be authenticated or the multi-factor key to be authenticated, so that when the identity authentication passes, a verified key to be authenticated or a verified multi-factor key to be authenticated has necessarily been determined.
If the authentication stage verifies the key to be authenticated, the verified key comprises the verified key to be authenticated; if the authentication stage verifies the multi-factor key to be authenticated, the verified key may comprise the verified multi-factor key to be authenticated, and may also comprise the key to be authenticated used to generate the verified multi-factor key to be authenticated.
In the embodiments of the present application, the key used in the registration stage can be recovered from a biometric that is sufficiently close to the one used at registration, together with the auxiliary data, so that one or more of digital signature, message encryption, message decryption, application login, and digital wallet management can be performed with the recovered key.
Example thirteen
The embodiment of the application provides a digital signature method. The specific flow of the method will be described in detail below.
In step 1110, registration information is determined by an identity registration method.
Wherein the registration information includes first assistance data.
The identity registration method used in this embodiment may be similar to the identity registration method provided in the fourth embodiment; for further details regarding the identity registration method of step 1110, refer to the description in the fourth embodiment, which is not repeated here.
It should be noted that the registration method in step S1110 needs to correspond to the second or third implementation of step S430, that is, the authentication method corresponding to the registration method performs identity authentication by generating and verifying the key to be authenticated or the multi-factor key to be authenticated.
In step 1120, a first public key corresponding to the first private key is generated.
The first private key is a key corresponding to the first auxiliary data or a first multi-factor key corresponding to the first auxiliary data.
For example, a key corresponding to the first assistance data involved in the registration phase or a first multi-factor key may be used as the first private key, and the first public key may be generated based on the first private key.
For example, a key pair of a first private key and a first public key may be generated, the first private key may be used as a key corresponding to the first auxiliary data, or a first multi-factor key may be generated based on the key corresponding to the first auxiliary data. At this time, step S1120 may be performed prior to step S1110.
Step 1130, sending the first public key to a signature verifier, so that the signature verifier can use the first public key to verify a digital signature generated with the first private key.
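As a hedged illustration of steps 1120-1130 (the patent does not fix a signature algorithm), the sketch below assumes an Ed25519 key pair whose 32-byte seed is derived from the key corresponding to the first auxiliary data; all names are placeholders, and only the public key leaves the device.

```python
# Sketch under stated assumptions: Ed25519 is chosen purely for concreteness;
# the seed stands in for the key corresponding to the first auxiliary data
# (or the first multi-factor key).
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key_for_first_auxiliary_data = b"key-obtained-at-registration"   # placeholder
seed = hashlib.sha256(key_for_first_auxiliary_data).digest()     # 32-byte seed

first_private_key = Ed25519PrivateKey.from_private_bytes(seed)   # the first private key
first_public_key = first_private_key.public_key()                # step 1120

# Step 1130: send only the public key to the signature verifier.
public_bytes = first_public_key.public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)
print(public_bytes.hex())
```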
Example fourteen
The embodiment of the application provides a digital signature method. The specific flow of the method will be described in detail below.
Step 1210, performing identity authentication on the object to be authenticated using an identity authentication method.
If the identity authentication of the object to be authenticated is successful, step 1220 is executed.
The explanation of step S1210 is referred to step S1010.
Step 1220, signing the information to be signed with the verified key determined by the identity authentication method, to obtain signature data carrying a digital signature.
For example, the hash value of the information to be signed can be asymmetrically encrypted with the verified key to obtain a digital signature, and the digital signature together with the information to be signed is used as the signature data.
Step 1230, sending the signature data to a signature verifier, so that the signature verifier uses the public key corresponding to the verified key to verify the digital signature of the signature data.
Illustratively, the signature verifier has received at least one public key prior to receiving the signature data. The at least one public key is sent to the signature verifier by the method of embodiment thirteen. The signature verifier can determine which public key to use for verification based on information contained in or attached to the signature data.
After receiving the signature data, the signature verifier can calculate the hash value of the data to be signed in the signature data and compare it with the result of decrypting the digital signature with the public key; if the two are consistent, the verification passes. The public key here refers to the public key for signature verification, which the signature verifier determines according to information contained in or attached to the signature data.
The public key corresponding to the verified key is sent to the signature verifier by the method described in embodiment thirteen; the verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated used to generate a verified multi-factor key to be authenticated. See in particular the description of embodiment twelve.
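The following sketch illustrates steps 1220-1230 together with the verifier-side check, again assuming Ed25519 for concreteness (the text describes a generic sign-then-verify flow without fixing an algorithm); the generated key is a placeholder for the verified key.

```python
# Minimal sign/verify sketch; the generated key stands in for the verified key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

verified_key = Ed25519PrivateKey.generate()      # placeholder for the verified key
message = b"information to be signed"

# Step 1220: produce the digital signature; signature data = (message, signature).
signature = verified_key.sign(message)

# Verifier side: the corresponding public key was distributed beforehand
# (embodiment thirteen); verification raises InvalidSignature on mismatch.
public_key = verified_key.public_key()
try:
    public_key.verify(signature, message)
    print("signature verification passed")
except InvalidSignature:
    print("signature verification failed")
```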
Example fifteen
The embodiment of the application provides a message decryption method. The specific flow of the method will be described in detail below.
Step 1410, determining registration information by an identity registration method, where the registration information includes first auxiliary data. For a specific explanation of this step, refer to step S1110.
Step 1420, generating a second public key corresponding to the second private key; the second private key is a key corresponding to the first auxiliary data or a first multi-factor key corresponding to the first auxiliary data.
Step 1430, sending the second public key to the message encryptor.
For a description of the second public key, the second private key, reference may be made to the first public key, the first private key.
Example sixteen
The embodiment of the application provides a message decryption method. The specific flow of the method will be described in detail below.
In step 1510, the data to be decrypted sent by the message encryptor is received.
Step 1520, identity authentication is performed on the object to be authenticated using an identity authentication method.
See S1010 for a description of S1520.
If the identity authentication of the object to be authenticated is successful, step 1530 is executed.
Step 1530, decrypting the data to be decrypted with the verified key determined by the identity authentication method, to obtain decrypted data.
The public key corresponding to the verified key is sent to the message encryptor by the method provided in embodiment fifteen; the verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated used to generate a verified multi-factor key to be authenticated; the data to be decrypted is encrypted with the public key corresponding to the verified key.
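A minimal sketch of the encrypt-then-decrypt flow follows, assuming RSA-OAEP purely for concreteness; the patent only requires that the data be encrypted with the public key corresponding to the verified key, and all names here are placeholders.

```python
# Sender encrypts with the public key distributed in embodiment fifteen; after
# authentication succeeds, the holder of the verified key decrypts (step 1530).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

verified_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
second_public_key = verified_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

data_to_be_decrypted = second_public_key.encrypt(b"confidential message", oaep)
decrypted_data = verified_private_key.decrypt(data_to_be_decrypted, oaep)
print(decrypted_data)
```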
Example seventeen
The embodiment of the application provides an application login method. The specific flow of the method will be described in detail below.
Step 1610, identity authentication is performed on the object to be authenticated using an identity authentication method.
If the identity authentication of the object to be authenticated is successful, step 1620 is executed.
See in particular the description of S1010.
Step 1620, logging in to the target application program by using the verified key determined by the identity authentication method; or logging in to the target application program by using the verified key and the user identifier determined by the identity authentication method.
It will be appreciated that the verified key is determined by the identity authentication method, and if the verified key is globally unique within the target application program, for example a sufficiently long and essentially random character string, the target application program can be logged in to using only the verified key. If the identity authentication method also determines a user identifier, the user identifier and the verified key can be used together to log in to the target application program.
It is understood that, by way of example, S1620 includes: performing a hash operation on the verified key (the salt value and/or number of rounds of the hash operation can be chosen as needed) and sending the hash result to the application server to log in to the target application program, or sending the hash result and the user identifier to the application server to log in to the target application program.
The verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated used to generate a verified multi-factor key to be authenticated.
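As a hedged sketch of S1620, the snippet below hashes the verified key with a configurable salt and iteration count (PBKDF2 is assumed here as one possible choice) and assembles a hypothetical login request; the field names are illustrative, not taken from the patent.

```python
# The hash result, optionally together with the user identifier, is what would be
# sent to the application server as the login credential.
import hashlib
import json

verified_key = b"key-recovered-during-authentication"    # placeholder
salt, rounds = b"app-specific-salt", 100_000              # selectable hash parameters

login_token = hashlib.pbkdf2_hmac("sha256", verified_key, salt, rounds).hex()
login_request = {"user_id": "user-001", "token": login_token}  # user_id may be omitted
print(json.dumps(login_request))
```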
Example eighteen
The embodiment of the application provides a block chain node information synchronization method. The blockchain node information synchronization method provided by the embodiment is applied to a current blockchain node on a blockchain, wherein the blockchain comprises a plurality of blockchain nodes, and the specific flow of the method is described in detail below.
In step 1710, registration information is determined by an identity registration method.
The registration information includes first assistance data;
see in particular the description of S1110.
Step 1720, generating a third public key corresponding to the third private key; the third private key is a key corresponding to the first auxiliary data or a first multi-factor key corresponding to the first auxiliary data.
For a description of the third private key and the third public key, reference is made to the description of the first private key, the first public key.
Step 1730, broadcasting the third public key to the other blockchain nodes on the blockchain.
In this embodiment, after the third public key is broadcast by the current blockchain node, transaction data generated at the current blockchain node can be verified at any node in the blockchain system.
Example nineteen
The embodiment of the application provides a block chain node information synchronization method. The blockchain node information synchronization method provided by the embodiment is applied to a current blockchain node on a blockchain, wherein the blockchain comprises a plurality of blockchain nodes, and the specific flow of the method is described in detail below.
Step 1730, identity authentication is performed on the object to be authenticated using an identity authentication method.
If the identity authentication of the object to be authenticated is successful, step 1740 is performed.
See in particular the description of S1010.
Step 1740, signing the transaction information with the verified key determined by the identity authentication method, to obtain transaction data carrying a digital signature;
Step 1750, broadcasting the transaction data to the other blockchain nodes on the blockchain, so that the other blockchain nodes verify the digital signature of the transaction data by using the public key corresponding to the verified key;
the public key corresponding to the verified key is sent to the signature verifier by the method of embodiment eighteen; the verified key includes a verified key to be authenticated, a verified multi-factor key to be authenticated, or a key to be authenticated used to generate a verified multi-factor key to be authenticated.
For a detailed description, see embodiment fourteen.
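A rough sketch of steps 1740-1750 from the viewpoint of a single node follows, reusing the Ed25519 assumption from the earlier sketches; the transaction structure is entirely hypothetical and broadcasting is reduced to a local check.

```python
# The signing node signs the transaction info with its verified key; any node that
# already holds the broadcast third public key (embodiment eighteen) can verify it.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

node_verified_key = Ed25519PrivateKey.generate()
third_public_key = node_verified_key.public_key()   # previously broadcast to other nodes

transaction_info = json.dumps({"from": "A", "to": "B", "amount": 10}).encode()
transaction_data = {"info": transaction_info, "sig": node_verified_key.sign(transaction_info)}

# At another blockchain node:
try:
    third_public_key.verify(transaction_data["sig"], transaction_data["info"])
    print("transaction signature accepted")
except InvalidSignature:
    print("transaction signature rejected")
```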
In addition, the embodiments of the present application further provide apparatuses corresponding to the methods of embodiments twelve to nineteen; for the specific apparatus embodiments, refer to the descriptions in the summary and in the corresponding method embodiments.
It should be noted that, preferably, in each embodiment of the present application the generation (for example, by the system or by the decoding operation) and the use (for example, hash calculation and transformation processing) of the keys (the key corresponding to the first auxiliary data, the first transformation key, the second transformation key, the authentication key, the first public key, the first private key, the second public key, the second private key, the third public key, and the third private key) are executed in a device TEE (including the terminal device TEE and the server TEE), and keys are transmitted between device TEEs over a secure channel between the devices.
Furthermore, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor performs the steps of the respective method embodiments described above.
The computer program product of each method provided in the embodiments of the present application includes a computer readable storage medium storing a program code, where the program code includes instructions for executing the steps in each method embodiment described above, and specifically reference may be made to the method embodiment described above, which is not described herein.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes. It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the same, but rather, various modifications and variations may be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (56)

1. A privacy-preserving image processing method, comprising:
acquiring an image to be processed, wherein the image to be processed comprises a first biological characteristic area, and the first biological characteristic area comprises a plurality of first characteristic points;
Determining a first biological characteristic template of the first biological characteristic region according to the first biological characteristic region, wherein the first biological characteristic template comprises biological characteristic representations corresponding to the plurality of first characteristic points;
performing encoding operation on the first biological characteristic template to obtain first auxiliary data, wherein the first auxiliary data comprises one of the following items:
performing feature conversion according to the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to a key corresponding to the first auxiliary data;
quantizing according to the first biological characteristic template to obtain a quantized value; determining an error correction code word according to a key corresponding to the first auxiliary data; performing first transformation processing on the error correction code word to obtain first auxiliary data; wherein the determining the error correction code word according to the key corresponding to the first auxiliary data includes: determining an error correction code word according to the key corresponding to the first auxiliary data and the quantized value; and/or, the performing a first transformation on the error correction code word to obtain first auxiliary data, including: performing first transformation processing on the error correction code words according to the quantized values to obtain first auxiliary data;
Quantizing according to the first biological characteristic template to obtain a quantized value; determining an error correction code word; performing second transformation processing on the error correction code word to obtain first auxiliary data; generating a key corresponding to the first auxiliary data according to the quantized value; wherein an error correction code codeword is determined; performing a second transformation process on the error correction code word to obtain first auxiliary data, including: randomly determining error correction code words; performing second transformation processing on the error correction code words according to the quantized values to obtain first auxiliary data; or determining an error correction code word according to the quantized value, and performing second transformation processing on the error correction code word to obtain first auxiliary data;
wherein the encoding operation comprises an irreversible transformation.
2. The method of claim 1, wherein the first biometric template comprises biometric data and an accurate descriptor, the determining the first biometric template for the first biometric region from the first biometric region comprising:
determining biological characteristic data corresponding to the plurality of first characteristic points according to the first biological characteristic area;
And determining accurate descriptors corresponding to the plurality of first feature points according to the first biological feature area.
3. The method of claim 2, wherein determining the exact descriptors corresponding to the plurality of first feature points from the first biometric region comprises:
inputting the image to be processed and/or the description information of the image to be processed into a descriptor extraction model for processing to obtain accurate descriptors corresponding to the plurality of first feature points;
or determining local images of the neighborhoods where the first feature points are located according to the position data of the first feature points, and inputting the local images of the first feature points and/or description information of the local images into a descriptor extraction model to obtain accurate descriptors corresponding to the first feature points;
or, for each first feature point in the plurality of first feature points, determining a fuzzy descriptor of the first feature point, clustering the fuzzy descriptor of the first feature point to a category center descriptor closest to the fuzzy descriptor, and taking the category center descriptor closest to the fuzzy descriptor as an accurate descriptor corresponding to the first feature point; the category center descriptor is obtained by clustering a plurality of fuzzy descriptors.
4. A method according to claim 2 or 3, characterized in that the method comprises at least one of the following:
performing feature conversion according to the first biological feature template to obtain first auxiliary data, including: performing feature conversion on the biological feature data and the accurate descriptors contained in the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to a key corresponding to the first auxiliary data;
performing feature conversion according to the first biological feature template to obtain first auxiliary data, including: performing feature conversion on the first biological feature data in the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to the key corresponding to the first auxiliary data and the accurate descriptor;
performing feature conversion according to the first biological feature template to obtain first auxiliary data, including: performing feature conversion on the biological feature data and the accurate descriptors contained in the first biological feature template to obtain first auxiliary data; the feature conversion is determined according to the key corresponding to the first auxiliary data and the accurate descriptor;
quantifying according to the first biometric template, comprising: quantizing the biological feature data and the accurate descriptors contained in the first biological feature template to obtain quantized values;
the obtaining of the quantized value comprises: performing third transformation processing on a quantization result obtained by quantizing according to the first biometric template, to obtain the quantized value, wherein the third transformation processing is determined according to the accurate descriptor; the third transformation processing is a reversible or irreversible transformation;
the first transformation processing is performed on the error correction code word to obtain first auxiliary data, including: performing first transformation processing on the error correction code words by using the accurate descriptors to obtain first auxiliary data;
the second transforming the error correction code word to obtain first auxiliary data includes: and carrying out second transformation processing on the error correction code words by using the accurate descriptors to obtain first auxiliary data.
5. The method of claim 4, wherein:
determining an error correction code word according to a key corresponding to the first auxiliary data, including:
determining parameters of a first algebraic curve according to the key corresponding to the first auxiliary data;
mapping first quantized values among the quantized values onto the first algebraic curve to obtain first mapping values corresponding to the first quantized values, wherein the error correction code codeword comprises a plurality of first mapping values;
Performing a first transformation process on the error correction code word to obtain first auxiliary data, including:
generating a hash point set;
determining first auxiliary data according to the first auxiliary point set and the hash point set; wherein a first auxiliary point in the first auxiliary point set has a first functional relationship with a first point on the first algebraic curve, the first point taking the first quantized value as its first coordinate component and the first mapping value as its second coordinate component; the first coordinate component of the first auxiliary point is determined from the first quantized value.
6. The method according to claim 5, wherein: performing a first transformation process on the error correction code word to obtain first auxiliary data, and further including:
performing fourth transformation processing on the first points to obtain first auxiliary points in the first auxiliary point set; the fourth transform process is determined from the exact descriptor, the fourth transform process being reversible.
7. The method of claim 4, wherein:
determining an error correction code word according to a key corresponding to the first auxiliary data; performing a first transformation process on the error correction code word to obtain first auxiliary data, including:
determining parameters of a first algebraic curve according to the key corresponding to the first auxiliary data;
determining parameters of a first mapping relation according to the quantized values and the first algebraic curve;
determining the first auxiliary data according to the parameters of the first mapping relation;
wherein the first mapping relation exists between a first set and a second set, and a first subset of the first set is determined by the quantized values; a point taking a first value in the first subset as its first coordinate component and taking the point in the second set that satisfies the first mapping relation with the first value as its second coordinate component has a second functional relation with a second point on the first algebraic curve, whereas a point taking a value in the complement of the first subset within the first set as its first coordinate component and taking the point in the second set that satisfies the first mapping relation with that value as its second coordinate component does not have the second functional relation with any point on the first algebraic curve; the first coordinate component of the second point is the quantized value corresponding to the first value.
8. The method of claim 7, wherein determining parameters of a first mapping relation according to the quantized values and the first algebraic curve comprises: determining parameters of the first mapping relation according to the quantized values, the accurate descriptors, and the first algebraic curve.
9. The method of claim 4, wherein:
determining an error correction code word according to a key corresponding to the first auxiliary data, including:
determining the error correction code word according to the key corresponding to the first auxiliary data;
performing a first transformation process on the error correction code word to obtain first auxiliary data, including:
determining a permutation operation according to the biological vector corresponding to the quantized value; and applying the permutation operation to the error correction code codeword to obtain first auxiliary data.
10. The method of claim 9, wherein:
said applying said permutation operation on said error correction code codeword to obtain first auxiliary data comprising:
applying the permutation operation to the error correction code codeword to obtain a permutation result; performing fifth transformation processing on the permutation result to obtain first auxiliary data; the fifth transformation processing is determined from the accurate descriptor; the fifth transformation processing is a reversible transformation.
11. The method of claim 4, wherein:
randomly determining error correction code words; performing a second transformation on the error correction code word according to the quantized value to obtain first auxiliary data, including:
randomly determining an error correction code codeword; determining a permutation operation according to the biological vector corresponding to the quantized value; and applying the permutation operation to the error correction code codeword to obtain first auxiliary data;
or,
determining an error correction code word according to the quantized value, performing second transformation processing on the error correction code word to obtain first auxiliary data, including:
determining that a codeword with the closest biological vector distance corresponding to the quantized value is an error correction code codeword; and obtaining first auxiliary data according to the difference between the biological vector and the error correction code word.
12. The method of claim 11, wherein:
the obtaining the first auxiliary data includes: obtaining first auxiliary data through sixth transformation processing; the sixth transformation is determined from the exact descriptor, the sixth transformation being reversible.
13. The method according to any one of claims 1-12, wherein the acquiring the image to be processed comprises:
and acquiring the image to be processed in a non-contact acquisition mode, wherein the first biological characteristic region is at least one of a fingerprint, a palm print, a finger vein and a palm vein, and the number of characteristic points contained in the first biological characteristic region is more than 300.
14. The method according to any one of claims 1-13, applied to a terminal device, wherein a trusted execution environment is provided on the terminal device;
the acquiring, using or/and transmitting process of the key corresponding to the first auxiliary data is executed in the trusted execution environment.
15. The method of claim 14, wherein the determining a first biometric template for the first biometric region from the first biometric region comprises:
determining a first intermediate result according to the first biological feature region, and determining a first biological feature template according to the first intermediate result; wherein said determining a first intermediate result from said first biometric area is performed in said trusted execution environment with a first priority; determining that a first biometric template is executed in the trusted execution environment at a second priority based on the first intermediate result;
or determining a second intermediate result according to the first biological characteristic area; determining a third intermediate result according to the second intermediate result; determining a first biometric template from the third intermediate result; wherein said determining a second intermediate result from said first biometric area is performed in said trusted execution environment with a first priority; determining that a first biometric template is executed in the trusted execution environment at a second priority based on the third intermediate result; said determining, from said second intermediate result, that a third intermediate result is executed in said trusted execution environment at a third priority; wherein the third priority is lower than the first priority, and the third priority is lower than the second priority.
16. An identity registration method, comprising:
first assistance data in registration information of an object to be registered is determined by the method of any one of claims 1-15.
17. The method of claim 16, wherein the registration information further comprises: a first hash value; the method further comprises the steps of: performing hash operation according to a key corresponding to the first auxiliary data, and determining a first hash value in the registration information;
and/or, the registration information further includes: the first hash value and the first hash parameter, the method further comprising: performing hash operation according to the key corresponding to the first auxiliary data and the first hash parameter, and determining a first hash value in the registration information;
and/or, the registration information includes a second hash value; the method further comprises the steps of: performing ninth transformation processing according to the key corresponding to the first auxiliary data and the first transformation key to obtain a first multi-factor key; performing hash operation according to the first multi-factor key, and determining a second hash value in the registration information;
and/or, the registration information includes a second hash value and a second hash parameter, the method further comprising: performing ninth transformation processing according to the key corresponding to the first auxiliary data and the first transformation key to obtain a first multi-factor key; performing hash operation according to the first multi-factor key and the second hash parameter, and determining a second hash value in the registration information;
Wherein the first hash parameter comprises a salt value and/or a round number used by hash operation, and the second hash parameter comprises a salt value and/or a round number used by hash operation.
18. The method according to claim 16 or 17, wherein the key corresponding to the first auxiliary data and/or the first multi-factor key comprises verification information;
or,
the method further comprises the steps of:
generating a first check value according to a key corresponding to the first auxiliary data and/or generating a second check value according to the first multi-factor key; the registration information includes the first check value and/or the second check value.
19. An identity authentication method, comprising:
acquiring an image to be authenticated of an object to be authenticated, wherein the image to be authenticated comprises a second biological characteristic area, and the second biological characteristic area comprises a plurality of second characteristic points;
determining a biometric template to be authenticated of the second biometric region according to the second biometric region;
acquiring an identity authentication result of the object to be authenticated, wherein the identity authentication result is determined according to the biometric template to be authenticated and the database auxiliary data;
Wherein the base assistance data comprises at least one first assistance data; the first assistance data is determined by an identity registration method according to any one of claims 16 to 18.
20. The method of claim 19, wherein the method further comprises:
the biometric template to be authenticated or the quantized value to be authenticated is sent through a secure channel, so that a server can determine an identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data, or the server can determine the identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data; the quantized value to be authenticated is obtained by quantization according to the biometric template to be authenticated;
the step of obtaining the identity authentication result of the object to be authenticated comprises the following steps: and receiving an identity authentication result of the object to be authenticated, which is sent by the server.
21. The method according to claim 19, wherein the obtaining the identity authentication result of the object to be authenticated includes:
quantizing according to the biometric template to be authenticated to obtain a quantized value to be authenticated;
According to the quantized value to be authenticated and the base auxiliary data, decoding operation corresponding to the encoding operation is carried out, and at least one key to be authenticated corresponding to at least one first auxiliary data contained in the base auxiliary data is determined;
and checking the key to be authenticated, and determining an identity authentication result of the object to be authenticated.
22. The method according to claim 19, wherein the obtaining the identity authentication result of the object to be authenticated includes:
quantizing according to the biometric template to be authenticated to obtain a quantized value to be authenticated;
according to the quantized value to be authenticated and the base auxiliary data, decoding operation corresponding to the encoding operation is carried out, and at least one key to be authenticated corresponding to at least one first auxiliary data contained in the base auxiliary data is determined;
obtaining a multi-factor key to be authenticated according to part or all of the key to be authenticated and the second transformation key;
and verifying the multi-factor key to be authenticated, and determining an identity authentication result of the object to be authenticated.
23. The method of claim 22, wherein the obtaining the identity authentication result of the object to be authenticated further comprises:
Verifying the at least one key to be authenticated to obtain a target key to be authenticated which passes verification;
the obtaining the multi-factor key to be authenticated according to part or all of the key to be authenticated and the second transformation key comprises the following steps:
and obtaining the multi-factor key to be authenticated according to the target key to be authenticated and the second transformation key.
24. The method according to claim 22 or 23, wherein said verifying the multi-factor key to be authenticated comprises:
determining, according to part or all of the multi-factor keys to be authenticated, the multi-factor hash values to be authenticated corresponding to that part or all of the multi-factor keys to be authenticated; or determining, according to part or all of the multi-factor keys to be authenticated and the second hash parameters corresponding to the first auxiliary data corresponding to those multi-factor keys to be authenticated, the multi-factor hash values to be authenticated corresponding to that part or all of the multi-factor keys to be authenticated;
for a multi-factor hash value to be authenticated, if a second hash value of the base library matched with the multi-factor hash value to be authenticated exists in the second hash value of the base library, the verification result of the multi-factor key to be authenticated corresponding to the multi-factor hash value to be authenticated is verification passing;
the second hash value of the base library is a second hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the second hash value of the base library is a second hash value meeting a specified condition; the second hash value is included in the registration information.
25. The method according to any of claims 21, 23-24, wherein said verifying the key to be authenticated comprises:
determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated, or determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to part or all of the keys to be authenticated and first hash parameters corresponding to first auxiliary data corresponding to the keys to be authenticated;
for one hash value to be authenticated, if the first hash value of the base library matched with the hash value to be authenticated exists in the first hash value of the base library, the verification result of the key to be authenticated corresponding to the hash value to be authenticated is verification passing;
the first hash value of the base library is a first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the first hash value of the base library is a first hash value meeting a specified condition; the first hash value is included in the registration information.
26. The method according to any of claims 21-25, wherein said verifying said multi-factor key to be authenticated further comprises:
performing preliminary verification on the multi-factor key to be authenticated to obtain a multi-factor key to be authenticated which passes the preliminary verification;
the preliminary verification of the multi-factor key to be authenticated comprises: performing preliminary verification on the multi-factor key to be authenticated by using the verification information in the multi-factor key to be authenticated, or generating a second check value to be authenticated according to the multi-factor key to be authenticated and comparing the second check value to be authenticated with a second check value of the base to perform preliminary verification;
the second check value of the base is a second check value corresponding to first auxiliary data corresponding to the multi-factor key to be authenticated in the database;
and/or the number of the groups of groups,
the verifying the key to be authenticated further includes:
performing preliminary verification on the key to be authenticated to obtain a key to be authenticated which passes the preliminary verification;
performing preliminary verification on the key to be authenticated, including: performing preliminary verification on the key to be authenticated by using the verification information in the key to be authenticated, or generating a first check value to be authenticated according to the key to be authenticated and comparing the first check value to be authenticated with a first check value of the base to perform preliminary verification;
The first check value of the base is a first check value corresponding to first auxiliary data corresponding to the key to be authenticated in the database.
27. The method according to any one of claims 21-26, further comprising:
determining a user identifier corresponding to a matching hash value according to the matching hash value;
wherein the determining the user identifier of the object to be authenticated according to the matching hash value includes:
sending the matched hash value to a server, so that the server inquires a user identifier corresponding to the matched hash value according to the matched hash value; receiving a user identifier corresponding to the matching hash value;
or inquiring a user identifier corresponding to the matched hash value according to the matched hash value;
the matching hash value is the hash value to be authenticated corresponding to a key to be authenticated whose verification result is a pass, or the matching hash value is the multi-factor hash value to be authenticated corresponding to a multi-factor key to be authenticated whose verification result is a pass.
28. The method according to claim 19, wherein the obtaining the identity authentication result of the object to be authenticated includes:
Performing the coding operation according to the biometric template to be authenticated and the authentication key to obtain auxiliary data to be authenticated;
and determining the identity authentication result of the object to be authenticated according to the comparison result of the auxiliary data to be authenticated and the auxiliary data of the database.
29. An identity registration method, comprising:
receiving registration information sent by a terminal device, wherein the registration information is determined by an identity registration method according to any one of claims 16 to 18;
the registration information is stored in a database, the registration information comprising first assistance data.
30. The method of claim 29, wherein the registration information further comprises: a first hash value; the storing the registration information in a database further includes: storing the first hash value into the database;
or/and, the registration information further includes: a first hash value and a first hash parameter; the storing the registration information in a database further includes: storing the first hash value into the database, and storing first auxiliary data and the first hash parameter in the database in an associated manner;
Or/and, the registration information includes a second hash value; the storing the registration information in a database further includes: storing the second hash value in the database;
or/and, the registration information comprises a second hash value and a second hash parameter; the storing the registration information in a database further includes: storing the second hash value into the database, and storing the first auxiliary data and the second hash parameter in the database in an associated manner;
wherein the first hash parameter comprises a salt value and/or a round number used by hash operation, and the second hash parameter comprises a salt value and/or a round number used by hash operation.
31. The method according to claim 29 or 30, wherein the key corresponding to the first auxiliary data and/or the first multi-factor key comprises verification information; or,
the registration information further comprises a first check value and/or a second check value, and the first check value and/or the second check value are/is stored in the database in association with the first auxiliary data.
32. An identity authentication method, the method comprising:
receiving an authentication request sent by a terminal device;
Determining base assistance data from registration information stored in a database by the method of any one of claims 29-31 in accordance with the authentication request; wherein the base assistance data comprises at least one first assistance data.
33. The method of claim 32, the method further comprising:
receiving a to-be-authenticated biometric template or a to-be-authenticated quantized value sent by a terminal device through a secure channel;
determining an identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data; or determining an identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data; or, quantizing according to the biometric template to be authenticated to obtain a quantized value to be authenticated, and determining an identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base auxiliary data; the quantized value to be authenticated is obtained by quantization according to the biometric template to be authenticated;
and sending the identity authentication result of the object to be authenticated to the terminal equipment.
34. The method of claim 33, wherein the determining the identity authentication result of the object to be authenticated according to the quantized value to be authenticated and the base assistance data comprises:
According to the quantized value to be authenticated and the base auxiliary data, decoding operation corresponding to the encoding operation is carried out, and at least one key to be authenticated corresponding to at least one first auxiliary data contained in the base auxiliary data is determined;
and checking the key to be authenticated, and determining an identity authentication result of the object to be authenticated.
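One common way to realize such a decoding operation is a fuzzy-commitment-style unbinding followed by error correction. The Python sketch below assumes the quantized values are bit lists, that each first auxiliary data was formed by XOR-binding an error-correction codeword to the enrolled quantized value, and that a simple repetition code stands in for whatever error correction code the encoding operation actually uses; the names repetition_decode and decode_candidate_keys are illustrative only.

```python
def repetition_decode(codeword_bits, factor=5):
    """Toy error correction: majority vote over blocks of `factor` repeated bits."""
    return [int(sum(codeword_bits[i:i + factor]) > factor // 2)
            for i in range(0, len(codeword_bits), factor)]

def decode_candidate_keys(quantized_to_authenticate, base_auxiliary_data, factor=5):
    """For each first auxiliary data, undo the binding with the quantized value to be
    authenticated and error-correct the result into a key to be authenticated."""
    candidates = []
    for first_aux in base_auxiliary_data:       # first_aux ~ codeword XOR enrolled bits
        noisy = [a ^ q for a, q in zip(first_aux, quantized_to_authenticate)]
        candidates.append(repetition_decode(noisy, factor))
    return candidates

# toy data: a probe differing from the enrolled bits in one position still yields the key
enrolled = [1, 0, 1, 1, 0] * 16
probe = enrolled[:]
probe[3] ^= 1                                   # simulate biometric noise
key_bits = [1, 0, 0, 1] * 4
aux = [k ^ e for k, e in zip([b for b in key_bits for _ in range(5)], enrolled)]
print(decode_candidate_keys(probe, [aux]))      # recovers key_bits
```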
35. The method of claim 34, wherein verifying the key to be authenticated comprises:
determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to the part or all of the keys to be authenticated, or determining a hash value to be authenticated corresponding to part or all of the keys to be authenticated according to the part or all of the keys to be authenticated and the first hash parameter corresponding to the first auxiliary data corresponding to the keys to be authenticated;
for one hash value to be authenticated, if a base library first hash value matching the hash value to be authenticated exists among the base library first hash values, a verification result of the key to be authenticated corresponding to the hash value to be authenticated is that verification passes;
wherein the base library first hash value is a first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base library first hash value is a first hash value meeting a specified condition; the first hash value is included in the registration information. (An illustrative sketch of this hash-based verification follows this claim.)
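A minimal Python sketch of this verification step, assuming the same iterated salted SHA-256 construction sketched after claim 30; the helper names salted_hash and verify_candidate_keys and the dictionary layout of the hash parameters are illustrative assumptions.

```python
import hashlib

def salted_hash(key, salt, rounds):
    digest = key
    for _ in range(rounds):
        digest = hashlib.sha256(salt + digest).digest()
    return digest

def verify_candidate_keys(candidate_keys, base_first_hash_values, first_hash_parameters):
    """Hash each key to be authenticated with the first hash parameter stored for its
    first auxiliary data; verification passes when a base library first hash value matches."""
    results = []
    for key_bits, params in zip(candidate_keys, first_hash_parameters):
        hash_to_authenticate = salted_hash(bytes(key_bits), params["salt"], params["rounds"])
        results.append(hash_to_authenticate in base_first_hash_values)
    return results
```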
36. The method of claim 34 or 35, wherein
the verifying the key to be authenticated further comprises:
performing preliminary verification on the key to be authenticated to obtain a key to be authenticated that passes the preliminary verification;
wherein performing preliminary verification on the key to be authenticated comprises: performing preliminary verification on the key to be authenticated using the verification information in the key to be authenticated, or generating a first check value to be authenticated according to the key to be authenticated and comparing the first check value to be authenticated with a base library first check value to perform the preliminary verification;
wherein the base library first check value is the first check value corresponding, in the database, to the first auxiliary data corresponding to the key to be authenticated. (An illustrative pre-filter sketch follows this claim.)
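The check value can be read as a cheap pre-filter applied before the more expensive hash comparison. The Python sketch below assumes a truncated CRC-32 as the check value, which is purely an illustrative choice; the claim does not fix how the check value is generated.

```python
import zlib

def first_check_value(key):
    # deliberately short, non-cryptographic check value: a cheap pre-filter only
    return zlib.crc32(bytes(key)) & 0xFFFF

def preliminary_verification(candidate_keys, base_first_check_values):
    """Discard keys to be authenticated whose check value does not match the first
    check value stored with the corresponding first auxiliary data."""
    return [key for key, base_cv in zip(candidate_keys, base_first_check_values)
            if first_check_value(key) == base_cv]
```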
37. The method of claim 33, wherein the determining the identity authentication result of the object to be authenticated according to the biometric template to be authenticated and the base auxiliary data comprises:
performing the encoding operation according to the biometric template to be authenticated and an authentication key to obtain auxiliary data to be authenticated;
and determining the identity authentication result of the object to be authenticated according to a comparison result between the auxiliary data to be authenticated and the base auxiliary data. (An illustrative re-encoding sketch follows this claim.)
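A minimal Python sketch of this re-encoding comparison, under the same toy repetition-code binding used in the earlier sketches; the exact-equality comparison is an assumption made only to keep the example short, whereas a real system would use a tolerant comparison of the auxiliary data.

```python
def encode(template_bits, key_bits, factor=5):
    """Stand-in for the claimed encoding operation: repetition-encode the key and
    bind it to the quantized biometric template by XOR."""
    codeword = [b for b in key_bits for _ in range(factor)]
    return [c ^ t for c, t in zip(codeword, template_bits)]

def authenticate_by_reencoding(template_to_authenticate, authentication_key, base_auxiliary_data):
    """Re-run the encoding with the template to be authenticated and compare the
    resulting auxiliary data to be authenticated against the base auxiliary data."""
    aux_to_authenticate = encode(template_to_authenticate, authentication_key)
    return any(aux_to_authenticate == base_aux for base_aux in base_auxiliary_data)
```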
38. The method of claim 32, further comprising: sending the base auxiliary data to the terminal device.
39. The method of claim 38, further comprising at least one of:
sending a base library first hash value to the terminal device, so that the terminal device compares a hash value to be authenticated with the base library first hash value and determines the base library first hash value matching the hash value to be authenticated;
receiving a hash value to be authenticated sent by the terminal device, comparing the hash value to be authenticated with the base library first hash value, and determining the base library first hash value matching the hash value to be authenticated;
sending a base library second hash value to the terminal device, so that the terminal device compares a multi-factor hash value to be authenticated with the base library second hash value and determines the base library second hash value matching the multi-factor hash value to be authenticated;
receiving a multi-factor hash value to be authenticated sent by the terminal device, comparing the multi-factor hash value to be authenticated with the base library second hash value, and determining the base library second hash value matching the multi-factor hash value to be authenticated;
wherein the hash value to be authenticated is determined according to the base auxiliary data; the base library first hash value is a first hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base library first hash value is a first hash value meeting a specified condition in the database; the multi-factor hash value to be authenticated is determined according to the base auxiliary data; and the base library second hash value is a second hash value corresponding to the first auxiliary data contained in the base auxiliary data, or the base library second hash value is a second hash value meeting a specified condition in the database.
40. The method of any one of claims 32-39, wherein the database comprises a first data table and a second data table, the method further comprising:
querying, according to a matching hash value, the second data table of the database for a user identifier corresponding to the matching hash value;
wherein the matching hash value is a hash value to be authenticated corresponding to a key to be authenticated whose verification result is that verification passes, or the matching hash value is a hash value to be authenticated corresponding to a multi-factor key to be authenticated whose verification result is that verification passes;
and wherein the first indexing feature and the first auxiliary data are stored in association in the first data table; the registration information comprises the second hash value, and the user identifier and the second hash value are stored in association in the second data table, and/or the registration information comprises the first hash value, and the user identifier and the first hash value are stored in association in the second data table. (An illustrative two-table layout follows this claim.)
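A toy Python sketch of the two-table layout and the lookup of the user identifier; the table and column names, and the literal byte values, are assumptions made purely for illustration.

```python
# Hypothetical two-table layout standing in for the claimed database tables.
first_data_table = [
    {"first_indexing_feature": b"\x01\x02\x03", "first_auxiliary_data": b"\x10" * 8},
]
second_data_table = [
    {"user_identifier": "user-1", "first_hash_value": b"\xaa" * 32},
]

def query_user_identifier(matching_hash_value):
    """Query the second data table for the user identifier stored in association
    with the matching hash value."""
    for row in second_data_table:
        if row["first_hash_value"] == matching_hash_value:
            return row["user_identifier"]
    return None

print(query_user_identifier(b"\xaa" * 32))   # -> "user-1"
```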
41. A key use method, comprising:
authenticating an object to be authenticated using the identity authentication method of any one of claims 21-27;
if the identity authentication of the object to be authenticated succeeds, performing one or more of digital signature, message encryption, message decryption, application login, and digital wallet management using the verified key determined by the identity authentication method of any one of claims 21-27;
wherein the verified key comprises a key to be authenticated that passes verification, a multi-factor key to be authenticated that passes verification, or a key to be authenticated that is used to generate a multi-factor key to be authenticated that passes verification.
42. A digital signature method, comprising:
determining registration information by the identity registration method of any one of claims 16-18, wherein the registration information comprises first auxiliary data;
generating a first public key corresponding to a first private key, wherein the first private key is the key corresponding to the first auxiliary data or the first multi-factor key corresponding to the first auxiliary data;
and sending the first public key to a signature verification party, so that the signature verification party verifies, using the first public key, a digital signature generated using the first private key.
43. A digital signature method, comprising:
authenticating an object to be authenticated using the identity authentication method of any one of claims 21-27;
if the identity authentication of the object to be authenticated succeeds, signing information to be signed using the verified key determined by the identity authentication method of any one of claims 21-27 to obtain signature data carrying a digital signature;
and sending the signature data to a signature verification party, so that the signature verification party verifies the digital signature of the signature data using a public key corresponding to the verified key; wherein the public key corresponding to the verified key is sent to the signature verification party by the method of claim 42; and the verified key comprises a key to be authenticated that passes verification, a multi-factor key to be authenticated that passes verification, or a key to be authenticated that is used to generate a multi-factor key to be authenticated that passes verification. (An illustrative signing and verification sketch follows this claim.)
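A minimal Python sketch of the flow across claims 42 and 43, assuming the `cryptography` package and assuming, purely for illustration, that the verified key is reduced to a 32-byte Ed25519 seed via SHA-256; the helper name keypair_from_verified_key and the sample byte strings are hypothetical.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def keypair_from_verified_key(verified_key):
    seed = hashlib.sha256(verified_key).digest()          # 32-byte private seed
    private_key = Ed25519PrivateKey.from_private_bytes(seed)
    return private_key, private_key.public_key()

# registration side (claim 42): derive the key pair and publish the first public key
private_key, public_key = keypair_from_verified_key(b"key bound to first auxiliary data")

# signing side (claim 43): sign after successful identity authentication
message = b"information to be signed"
signature = private_key.sign(message)

# signature verification party: verify with the published public key
public_key.verify(signature, message)                     # raises InvalidSignature on failure
```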
44. A message decryption method, comprising:
determining registration information by the identity registration method of any one of claims 16-18, wherein the registration information comprises first auxiliary data;
generating a second public key corresponding to a second private key, wherein the second private key is the key corresponding to the first auxiliary data or the first multi-factor key corresponding to the first auxiliary data;
and sending the second public key to a message encrypting party.
45. A message decryption method, comprising:
receiving data to be decrypted sent by a message encrypting party;
authenticating an object to be authenticated using the identity authentication method of any one of claims 21-27;
if the identity authentication of the object to be authenticated succeeds, decrypting the data to be decrypted using the verified key determined by the identity authentication method of any one of claims 21-27 to obtain decrypted data;
wherein the public key corresponding to the verified key is sent to the message encrypting party by the method of claim 44; the verified key comprises a key to be authenticated that passes verification, a multi-factor key to be authenticated that passes verification, or a key to be authenticated that is used to generate a multi-factor key to be authenticated that passes verification; and the data to be decrypted is encrypted using the public key corresponding to the verified key. (An illustrative hybrid encryption sketch follows this claim.)
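A Python sketch of one way such public-key encryption and decryption could be realized, assuming the `cryptography` package and a hybrid X25519 plus HKDF plus AES-GCM construction; this construction and the reduction of the verified key to a 32-byte X25519 private key via SHA-256 are assumptions for illustration, not the claimed scheme.

```python
import hashlib, os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_symmetric_key(shared_secret):
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(shared_secret)

# receiver (claims 44-45): second private key derived from the verified key
second_private = X25519PrivateKey.from_private_bytes(hashlib.sha256(b"verified key").digest())
second_public = second_private.public_key()               # sent to the message encrypting party

# message encrypting party: ephemeral ECDH against the second public key, then AES-GCM
ephemeral = X25519PrivateKey.generate()
enc_key = derive_symmetric_key(ephemeral.exchange(second_public))
nonce = os.urandom(12)
ciphertext = AESGCM(enc_key).encrypt(nonce, b"message to protect", None)

# receiver, after successful identity authentication: recover the key and decrypt
dec_key = derive_symmetric_key(second_private.exchange(ephemeral.public_key()))
plaintext = AESGCM(dec_key).decrypt(nonce, ciphertext, None)
```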
46. An application login method, comprising:
authenticating an object to be authenticated using the identity authentication method of any one of claims 21-27;
if the identity authentication of the object to be authenticated succeeds, logging in to a target application program using the verified key determined by the identity authentication method of any one of claims 21-27; or logging in to the target application program using the key and the user identifier determined by the identity authentication method of claim 27;
wherein the verified key comprises a key to be authenticated that passes verification, a multi-factor key to be authenticated that passes verification, or a key to be authenticated that is used to generate a multi-factor key to be authenticated that passes verification.
47. The method of claim 46, wherein the logging in to the target application program using the verified key determined by the identity authentication method of any one of claims 21-27 comprises: performing a hash operation according to the verified key and sending the hash operation result to an application server to log in to the target application program; or, the logging in to the target application program using the key and the user identifier determined by the identity authentication method of claim 27 comprises: performing a hash operation according to the verified key and sending the hash operation result and the user identifier to an application server to log in to the target application program. (An illustrative login payload sketch follows this claim.)
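A minimal Python sketch of the login payload of claim 47; the function name login_request, the domain-separation prefix, and the dictionary keys are assumptions, and the transport to the application server is left out.

```python
import hashlib

def login_request(verified_key, user_identifier=None):
    """Hash the verified key and send the result (optionally with the user identifier)
    to the application server to log in to the target application program."""
    login_proof = hashlib.sha256(b"app-login:" + verified_key).hexdigest()
    request = {"login_proof": login_proof}
    if user_identifier is not None:
        request["user_identifier"] = user_identifier
    return request

print(login_request(b"verified key bytes", user_identifier="user-1"))
```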
48. A blockchain node information synchronization method, applied to a current blockchain node on a blockchain, the blockchain comprising a plurality of blockchain nodes, the method comprising:
determining registration information by the identity registration method of any one of claims 16-18, the registration information comprising first auxiliary data;
generating a third public key corresponding to a third private key, wherein the third private key is the key corresponding to the first auxiliary data or the first multi-factor key corresponding to the first auxiliary data;
and broadcasting the third public key to the other blockchain nodes on the blockchain.
49. A blockchain node information synchronization method, applied to a current blockchain node on a blockchain, the blockchain comprising a plurality of blockchain nodes, the method comprising:
authenticating an object to be authenticated using the identity authentication method of any one of claims 21-27;
if the identity authentication of the object to be authenticated succeeds, signing transaction information using the verified key determined by the identity authentication method of any one of claims 21-27 to obtain transaction data carrying a digital signature;
and broadcasting the transaction data to the other blockchain nodes on the blockchain, so that the other blockchain nodes verify the digital signature of the transaction data using the public key corresponding to the verified key; wherein the public key corresponding to the verified key is broadcast to the other blockchain nodes by the method of claim 48; and the verified key comprises a key to be authenticated that passes verification, a multi-factor key to be authenticated that passes verification, or a key to be authenticated that is used to generate a multi-factor key to be authenticated that passes verification. (An illustrative node signing sketch follows this claim.)
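A Python sketch of the node-side flow across claims 48 and 49, assuming the `cryptography` package; the node identifier, the in-memory registry standing in for the on-chain broadcast, and the derivation of the third private key from a verified key via SHA-256 are all illustrative assumptions.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

node_public_keys = {}                                      # what peers learned from broadcasts

# claim 48: the current node derives the third key pair and broadcasts the public key
third_private = Ed25519PrivateKey.from_private_bytes(hashlib.sha256(b"node verified key").digest())
node_public_keys["node-1"] = third_private.public_key()

# claim 49: after successful identity authentication, sign and broadcast a transaction
transaction = b'{"from": "node-1", "amount": 1}'
signed_tx = {"node": "node-1", "tx": transaction, "sig": third_private.sign(transaction)}

# other blockchain nodes: verify the digital signature with the broadcast public key
try:
    node_public_keys[signed_tx["node"]].verify(signed_tx["sig"], signed_tx["tx"])
    accepted = True
except InvalidSignature:
    accepted = False
```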
50. A privacy-preserving image processing apparatus, comprising:
a first acquisition module, configured to acquire an image to be processed, wherein the image to be processed comprises a first biological characteristic region, and the first biological characteristic region comprises a plurality of first characteristic points;
a first determining module, configured to determine a first biological characteristic template of the first biological characteristic region according to the first biological characteristic region, wherein the first biological characteristic template comprises biological characteristic representations corresponding to the plurality of first characteristic points;
and an encoding module, configured to perform an encoding operation on the first biological characteristic template to obtain first auxiliary data;
wherein the encoding module is implemented in any one of the following manners:
performing feature transformation on the first biological characteristic template to obtain the first auxiliary data, wherein the feature transformation is determined according to a key corresponding to the first auxiliary data;
performing quantization according to the first biological characteristic template to obtain a quantized value; determining an error correction codeword according to the key corresponding to the first auxiliary data; and performing a first transformation process on the error correction codeword to obtain the first auxiliary data; wherein the determining an error correction codeword according to the key corresponding to the first auxiliary data comprises: determining the error correction codeword according to the key corresponding to the first auxiliary data and the quantized value; and/or the performing a first transformation process on the error correction codeword to obtain the first auxiliary data comprises: performing the first transformation process on the error correction codeword according to the quantized value to obtain the first auxiliary data; and the first transformation process is an irreversible transformation;
performing quantization according to the first biological characteristic template to obtain a quantized value; determining an error correction codeword; performing a second transformation process on the error correction codeword to obtain the first auxiliary data; and generating the key corresponding to the first auxiliary data according to the quantized value; wherein the determining an error correction codeword and performing a second transformation process on the error correction codeword to obtain the first auxiliary data comprises: randomly determining the error correction codeword and performing the second transformation process on the error correction codeword according to the quantized value to obtain the first auxiliary data; or determining the error correction codeword according to the quantized value and performing the second transformation process on the error correction codeword to obtain the first auxiliary data; and the second transformation process is an irreversible transformation;
wherein the encoding operation comprises an irreversible transformation. (An illustrative sketch of the second manner follows this claim.)
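A minimal Python sketch of the second manner of the encoding module, under the same toy assumptions used in the earlier sketches: thresholding stands in for the quantization, a repetition code stands in for the error correction code, and the XOR binding stands in for the first transformation process, while the claim additionally requires the transformation to be irreversible, which a real implementation would have to supply.

```python
import secrets

def quantize(first_template):
    """Stand-in quantization: threshold real-valued feature representations to bits."""
    return [1 if value >= 0 else 0 for value in first_template]

def repetition_encode(key_bits, factor=5):
    """Stand-in error correction code: repeat each key bit `factor` times."""
    return [bit for bit in key_bits for _ in range(factor)]

def encode_first_auxiliary_data(first_template, key_bits):
    """Determine a codeword from the key corresponding to the first auxiliary data,
    then transform it using the quantized value of the biological characteristic template."""
    quantized_value = quantize(first_template)
    codeword = repetition_encode(key_bits)
    return [c ^ q for c, q in zip(codeword, quantized_value)]

key_bits = [secrets.randbits(1) for _ in range(16)]
first_template = [secrets.randbits(8) - 128 for _ in range(80)]   # toy feature values
first_auxiliary_data = encode_first_auxiliary_data(first_template, key_bits)
```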
51. An identity registration apparatus, comprising:
a second determining module, configured to determine first auxiliary data in registration information of an object to be registered by the method of any one of claims 1-15.
52. An identity authentication device, comprising:
a second acquisition module, configured to acquire an image to be authenticated of an object to be authenticated, wherein the image to be authenticated comprises a second biological characteristic region, and the second biological characteristic region comprises a plurality of second characteristic points;
a third determining module, configured to determine a biometric template to be authenticated of the second biological characteristic region according to the second biological characteristic region;
and a third acquisition module, configured to acquire an identity authentication result of the object to be authenticated, wherein the identity authentication result is determined according to the biometric template to be authenticated and the base auxiliary data;
wherein the base auxiliary data comprises at least one first auxiliary data, and the first auxiliary data is determined by the identity registration method of any one of claims 16-18.
53. An identity registration apparatus, comprising:
a first receiving module, configured to receive registration information sent by a terminal device, wherein the registration information is determined by the identity registration method of any one of claims 16-18;
and a second storage module, configured to store the registration information in a database, wherein the registration information comprises first auxiliary data.
54. An identity authentication device, comprising:
a second receiving module, configured to receive an authentication request sent by a terminal device;
and a base auxiliary data determining module, configured to determine base auxiliary data from registration information stored in a database by the method of any one of claims 29-31 according to the authentication request; wherein the base auxiliary data comprises at least one first auxiliary data.
55. An electronic device, comprising: a processor and a memory storing machine-readable instructions executable by the processor, wherein the machine-readable instructions, when executed by the processor while the electronic device is running, perform the steps of the method of any one of claims 1 to 49.
56. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any one of claims 1 to 49.
CN202111229310.7A 2021-10-21 2021-10-21 Privacy-protected image processing method, identity registration method and identity authentication method Pending CN116010917A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111229310.7A CN116010917A (en) 2021-10-21 2021-10-21 Privacy-protected image processing method, identity registration method and identity authentication method
PCT/CN2022/126690 WO2023066374A1 (en) 2021-10-21 2022-10-21 Privacy protection based image processing method, identity registration method, and identity authentication method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111229310.7A CN116010917A (en) 2021-10-21 2021-10-21 Privacy-protected image processing method, identity registration method and identity authentication method

Publications (1)

Publication Number Publication Date
CN116010917A true CN116010917A (en) 2023-04-25

Family

ID=86023561

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111229310.7A Pending CN116010917A (en) 2021-10-21 2021-10-21 Privacy-protected image processing method, identity registration method and identity authentication method

Country Status (1)

Country Link
CN (1) CN116010917A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116756718A (en) * 2023-08-14 2023-09-15 安徽大学 U-Sketch-based biological feature data error correction method, system and tool
CN116756718B (en) * 2023-08-14 2023-12-01 安徽大学 U-Sketch-based biological feature data error correction method, system and tool
CN117218685A (en) * 2023-10-18 2023-12-12 湖南工商大学 Biological feature recognition method considering feature template protection
CN117318941A (en) * 2023-11-29 2023-12-29 合肥工业大学 Method, system, terminal and storage medium for distributing preset secret key based on in-car network
CN117318941B (en) * 2023-11-29 2024-02-13 合肥工业大学 Method, system, terminal and storage medium for distributing preset secret key based on in-car network

Similar Documents

Publication Publication Date Title
Lee et al. Cancelable fingerprint templates using minutiae-based bit-strings
Li et al. An effective biometric cryptosystem combining fingerprints with error correction codes
Sandhya et al. Securing fingerprint templates using fused structures
Lee et al. Biometric key binding: Fuzzy vault based on iris images
Cimato et al. Privacy-aware biometrics: Design and implementation of a multimodal verification system
CN105471575B (en) Information encryption and decryption method and device
Jain et al. Fingerprint template protection: From theory to practice
CN116010917A (en) Privacy-protected image processing method, identity registration method and identity authentication method
Wu et al. A face based fuzzy vault scheme for secure online authentication
US11227037B2 (en) Computer system, verification method of confidential information, and computer
Benhammadi et al. Password hardened fuzzy vault for fingerprint authentication system
Billeb et al. Biometric template protection for speaker recognition based on universal background models
Sadhya et al. Review of key‐binding‐based biometric data protection schemes
Chafia et al. A biometric crypto-system for authentication
Kaur et al. Cryptographic key generation from multimodal template using fuzzy extractor
Sandhya et al. Cancelable fingerprint cryptosystem using multiple spiral curves and fuzzy commitment scheme
Shi et al. Fingerprint recognition strategies based on a fuzzy commitment for cloud-assisted IoT: a minutiae-based sector coding approach
Martínez et al. Secure crypto-biometric system for cloud computing
Chiou Secure method for biometric-based recognition with integrated cryptographic functions
CN114117383A (en) Registration method, authentication method and device
Dong et al. Secure chaff-less fuzzy vault for face identification systems
Wang et al. A novel template protection scheme for multibiometrics based on fuzzy commitment and chaotic system
Sadhya et al. Design of a cancelable biometric template protection scheme for fingerprints based on cryptographic hash functions
Yang et al. A Delaunay triangle group based fuzzy vault with cancellability
Baghel et al. An enhanced fuzzy vault to secure the fingerprint templates

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230821

Address after: Room 1507-1512, 13th Floor, No. 27 Zhichun Road, Haidian District, Beijing, 100083

Applicant after: Beijing jianmozi Technology Co.,Ltd.

Address before: 100000 1024, 1f, building 5, yard 5, Jiangtai Road, Chaoyang District, Beijing

Applicant before: Moqi Technology (Beijing) Co.,Ltd.

TA01 Transfer of patent application right