CN117351616A - Self-service terminal identity recognition method and device


Info

Publication number
CN117351616A
Authority
CN
China
Prior art keywords
characteristic
limb
feature
biological
similarity
Prior art date
Legal status
Pending
Application number
CN202311322346.9A
Other languages
Chinese (zh)
Inventor
苏泽华
刘亦宋
Current Assignee
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202311322346.9A
Publication of CN117351616A
Legal status: Pending

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F19/00Complete banking systems; Coded card-freed arrangements adapted for dispensing or receiving monies or the like and posting such transactions to existing accounts, e.g. automatic teller machines
    • G07F19/20Automatic teller machines [ATMs]
    • G07F19/207Surveillance aspects at ATMs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L9/3226Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using a predetermined code, e.g. password, passphrase or PIN
    • H04L9/3231Biological data, e.g. fingerprint, voice or retina
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40Network security protocols

Abstract

A self-service terminal identity recognition method and device, usable in the financial field and in other fields. The method comprises the following steps: obtaining a first biological dynamic feature and a first appearance feature from acquired first biometric information and first image information; performing limb feature recognition on acquired second image information, using acquired first ranging data as compensation, to obtain first limb feature data; suspending the current business flow when the first ranging data is not smaller than a departure distance threshold; if acquired second ranging data is smaller than the preset departure distance threshold, obtaining a second biological dynamic feature, a second appearance feature and second limb feature data from acquired second biometric information and third image information; comparing the corresponding features pairwise to obtain several feature similarities; and dynamically weighting these similarities with preset weights to obtain a multi-modal fusion similarity, from which the customer identification result is determined. The method and device improve the accuracy of customer identification, safeguard the security of the customer account and improve the customer experience.

Description

Self-service terminal identity recognition method and device
Technical Field
The invention relates to the technical field of self-service terminals, in particular to a self-service terminal identity recognition method and device.
Background
To prevent a subsequent customer from accessing, without authorization, the personal data of a customer who forgot to log out before leaving the site after using a self-service terminal, bank self-service terminals require identity verification for every service. Consequently, when a customer needs to transact several services at the self-service terminal, identity verification must be completed before each service starts, by inserting an ID card, scanning a mobile-banking code, face recognition or similar means, which harms the customer's service-handling experience.
The prior art discloses a banking-business processing method that provides a self-service terminal control method based on face recognition: when the business system of the self-service terminal reaches a preset key progress node, the self-service terminal calls a camera to take a photograph of the operator's face and, by comparing it with the reserved face photograph of the user currently logged into the business system, judges whether the operator is the logged-in user, so that explicit identity verification is not required for every service.
However, this prior art has the following problems. Poor user experience: to ensure the accuracy of face recognition, the operator is generally required to remove face coverings such as masks and sunglasses and to face the camera while the photograph is taken, so the identity recognition process is not imperceptible to the customer and the service-handling experience is still affected. Low customer-identification accuracy: the customer's identity is recognized only from a face photograph, so the recognition means is single, the recognition accuracy is low, and potential safety hazards may arise for the customer's account.
Disclosure of Invention
In view of the problems in the prior art, the main purpose of the embodiments of the present invention is to provide a self-service terminal identity recognition method and device that improve customer-identification accuracy and the customer's experience of use.
In order to achieve the above object, an embodiment of the present invention provides a self-service terminal identity recognition method, including:
acquiring first biological characteristic information and first image information which are acquired by a self-service terminal and authorized by a client, and extracting characteristics of the first biological characteristic information and the first image information to obtain a first biological dynamic characteristic and a first appearance characteristic;
acquiring first ranging data acquired by a self-service terminal, if the first ranging data is not less than a preset handling distance threshold, acquiring second image information authorized by a client through the self-service terminal, and performing limb feature identification on the second image information by utilizing the first ranging data to acquire first limb feature data;
when the first ranging data is not smaller than a preset departure distance threshold, suspending the current business flow;
acquiring second ranging data acquired by the self-service terminal, if the second ranging data is smaller than a preset departure distance threshold, acquiring second biological characteristic information and third image information authorized by a client through the self-service terminal, acquiring second biological dynamic characteristics and second appearance characteristics according to the second biological characteristic information and the third image information, and performing limb characteristic identification on the third image information by utilizing the second ranging data to acquire second limb characteristic data;
respectively comparing the first biological dynamic characteristic with the second biological dynamic characteristic, the first appearance characteristic with the second appearance characteristic, and the first limb characteristic data with the second limb characteristic data to obtain biological characteristic similarity, appearance characteristic similarity and limb characteristic matching degree;
and dynamically weighting the biological feature similarity, the appearance feature similarity and the limb feature matching degree by using a preset weighting value to obtain multi-modal fusion similarity, and determining a customer identification result according to the multi-modal fusion similarity.
Optionally, in an embodiment of the present invention, performing feature extraction on the first biometric information to obtain a first biometric feature includes:
extracting the characteristics of the first biological characteristic information to obtain the heart rate, the heart beat amplitude, the respiratory rate and the respiratory rhythm of the current client;
and obtaining a breathing rhythm characteristic sequence by utilizing the breathing rhythm, and taking the heart rate, the heartbeat amplitude, the breathing frequency and the breathing rhythm characteristic sequence of the current client as first biological dynamic characteristics.
Optionally, in an embodiment of the present invention, comparing the first biological dynamic feature with the second biological dynamic feature to obtain the biological feature similarity includes:
Calculating the similarity of the breathing rhythm feature sequence in the first biological dynamic feature and the breathing rhythm feature sequence in the second biological dynamic feature to obtain breathing rhythm feature similarity;
and calculating the similarity of the breathing rhythm characteristics, the heart rate, the heartbeat amplitude and the breathing frequency in the first biological dynamic characteristics and the second biological dynamic characteristics to obtain the biological characteristic similarity.
Optionally, in an embodiment of the present invention, performing limb feature recognition on the second image information by using the first ranging data, to obtain first limb feature data includes:
selecting a plurality of limb characteristic points from the second image information, and determining image coordinates corresponding to each limb characteristic point according to the moment corresponding to each limb characteristic point;
and taking the first ranging data as compensation, and carrying out normalization calculation by utilizing image coordinates corresponding to each limb characteristic point to obtain the first limb characteristic data.
Optionally, in an embodiment of the present invention, comparing the first limb feature data with the second limb feature data to obtain the limb feature matching degree includes:
and carrying out matched filtering detection on the first limb characteristic data and the second limb characteristic data by using a preset farthest confidence distance to obtain limb characteristic matching degree.
Optionally, in an embodiment of the present invention, determining the client identification result according to the multimodal fusion similarity includes:
and if the multi-mode fusion similarity is not smaller than the similarity discrimination threshold, determining that the client identification result is the same client.
Optionally, in an embodiment of the present invention, the method further includes: and if the client identification result is the same client, recovering the suspended current business flow.
The embodiment of the invention also provides a self-service terminal identity recognition device, which comprises:
the biological feature module is used for acquiring first biological feature information and first image information which are acquired by the self-service terminal and authorized by a client, and extracting features of the first biological feature information and the first image information to obtain a first biological dynamic feature and a first appearance feature;
the limb characteristic module is used for acquiring first ranging data acquired by the self-service terminal, acquiring second image information authorized by a client through the self-service terminal if the first ranging data is not smaller than a preset handling distance threshold value, and carrying out limb characteristic identification on the second image information by utilizing the first ranging data to acquire first limb characteristic data;
The service suspension module is used for suspending the current business flow when the first ranging data is not smaller than a preset departure distance threshold;
the distance measurement data module is used for acquiring second distance measurement data acquired by the self-service terminal, acquiring second biological characteristic information and third image information authorized by a client through the self-service terminal if the second distance measurement data is smaller than a preset departure distance threshold value, acquiring second biological dynamic characteristics and second appearance characteristics according to the second biological characteristic information and the third image information, and carrying out limb characteristic identification on the third image information by utilizing the second distance measurement data to acquire second limb characteristic data;
the similarity calculation module is used for comparing the first biological dynamic characteristic with the second biological dynamic characteristic, the first appearance characteristic with the second appearance characteristic, the first limb characteristic data with the second limb characteristic data respectively to obtain biological characteristic similarity, appearance characteristic similarity and limb characteristic matching degree;
the client identification module is used for dynamically weighting the biological feature similarity, the appearance feature similarity and the limb feature matching degree by using a preset weighting value to obtain multi-modal fusion similarity, and determining a client identification result according to the multi-modal fusion similarity.
Optionally, in an embodiment of the present invention, the biometric module includes:
the dynamic parameter unit is used for carrying out feature extraction on the first biological feature information to obtain the heart rate, the heartbeat amplitude, the respiratory rate and the respiratory rhythm of the current client;
the dynamic characteristic unit is used for obtaining a breathing rhythm characteristic sequence by utilizing the breathing rhythm, and taking the heart rate, the heartbeat amplitude, the breathing frequency and the breathing rhythm characteristic sequence of the current client as first biological dynamic characteristics.
Optionally, in an embodiment of the present invention, the similarity calculation module includes:
the breathing rhythm unit is used for calculating the similarity of the breathing rhythm characteristic sequence in the first biological dynamic characteristic and the breathing rhythm characteristic sequence in the second biological dynamic characteristic to obtain breathing rhythm characteristic similarity;
and the biological feature unit is used for calculating the similarity of the breathing rhythm features, the heart rate, the heartbeat amplitude and the breathing frequency in the first biological dynamic feature and the second biological dynamic feature to obtain the biological feature similarity.
Optionally, in an embodiment of the present invention, the limb feature module includes:
the feature point unit is used for selecting a plurality of limb feature points from the second image information and determining image coordinates corresponding to each limb feature point according to the moment corresponding to each limb feature point;
And the limb characteristic unit is used for taking the first ranging data as compensation, and carrying out normalization calculation by utilizing the image coordinates corresponding to each limb characteristic point to obtain the first limb characteristic data.
Optionally, in an embodiment of the present invention, the similarity calculation module is further configured to perform matched filtering detection on the first limb feature data and the second limb feature data by using a preset farthest confidence distance, so as to obtain the limb feature matching degree.
Optionally, in an embodiment of the present invention, the client identification module is further configured to determine that the client identification result is the same client if the multi-mode fusion similarity is not less than the similarity discrimination threshold.
Optionally, in an embodiment of the present invention, the apparatus further includes: and the service recovery module is used for recovering the suspended current service flow if the client identification result is the same client.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above method when executing the program.
The present invention also provides a computer-readable storage medium storing a computer program for executing the above method by a computer.
The invention also provides a computer program product comprising computer programs/instructions which when executed by a processor implement the steps of the above method.
According to the invention, through a multi-mode customer identification mode integrating visual features and biological features, the accuracy of customer identification is improved, the safety of a customer account is ensured, and the relevant feature information of the customer is acquired through the self-service terminal, so that automatic customer identity verification is realized on the premise that the customer does not feel, the customer does not need to perform identity verification operations such as inserting identity cards for many times when continuously handling multiple businesses, and the customer business handling experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a self-service terminal identity recognition method according to an embodiment of the present invention;
FIG. 2 is a flow chart of obtaining a first biodynamic characteristic in an embodiment of the present invention;
FIG. 3 is a flow chart of obtaining similarity of biological features according to an embodiment of the invention;
FIG. 4 is a flowchart of obtaining first limb characteristic data according to an embodiment of the present invention;
FIG. 5 is a flowchart of a method for self-service terminal identity recognition in an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a self-service terminal identity recognition device according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a biometric module according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a similar computing module according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a limb feature module according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a self-service terminal identity recognition device according to another embodiment of the present invention;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
The embodiments of the present invention provide a self-service terminal identity recognition method and device that can be used in the financial field as well as in any other field; the application field of the self-service terminal identity recognition method and device is not limited.
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a flowchart of a self-service terminal identity recognition method according to an embodiment of the present invention, where an execution subject of the self-service terminal identity recognition method provided by the embodiment of the present invention includes, but is not limited to, a computer. According to the invention, through a multi-mode customer identification mode integrating visual features and biological features, the accuracy of customer identification is improved, the safety of a customer account is ensured, and the relevant feature information of the customer is acquired through the self-service terminal, so that automatic customer identity verification is realized on the premise that the customer does not feel, the customer does not need to perform identity verification operations such as inserting identity cards for many times when continuously handling multiple businesses, and the customer business handling experience is improved. The method shown in the figure comprises the following steps:
step S1, acquiring first biological characteristic information and first image information which are acquired by a self-service terminal and authorized by a client, and extracting the characteristics of the first biological characteristic information and the first image information to obtain a first biological dynamic characteristic and a first appearance characteristic.
The first biometric information and the first image information are anonymous feature information of the current customer collected through the millimeter wave radar module and the camera module of the self-service terminal, in preparation for the subsequent feature comparison. This step is executed, with the customer's authorization, after the customer completes an identity authentication operation such as inserting an ID card, scanning a mobile-banking code or face recognition.
Further, the first biometric information of the client includes static parameters (heart rate, heart beat amplitude, respiratory rate) and dynamic parameters including respiratory rhythm.
Furthermore, the heart rate, the heart beat amplitude, the breathing frequency and the breathing rhythm can be extracted from the first biological characteristic information, the breathing rhythm characteristic sequence can be extracted by utilizing the breathing rhythm, and the heart rate, the heart beat amplitude, the breathing frequency and the breathing rhythm characteristic sequence are used as the first biological dynamic characteristics.
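For illustration, the first biological dynamic feature can be pictured as a small record holding the three static parameters together with the breathing-rhythm feature sequence. The container below is only an assumed representation used by the sketches later in this document, not a structure defined by this application.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class BioDynamicFeature:
    """Illustrative container for one biological dynamic feature (step S1 / S102)."""
    heart_rate: float             # beats per minute
    heartbeat_amplitude: float    # relative amplitude reported by the radar module
    respiratory_rate: float       # breaths per minute
    rhythm_features: np.ndarray   # breathing-rhythm feature sequence (FFT of the recorded rhythm)

# Example values; the concrete numbers carry no meaning from the patent itself.
first_feature = BioDynamicFeature(72.0, 1.0, 16.0, np.zeros(256))
```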
Further, the first appearance characteristic of the customer is extracted from the customer's first image information, specifically including, for example, clothing color, whether glasses are worn, and face shape, and is recorded as a feature vector, thereby obtaining the first appearance characteristic.
Step S2, acquiring first ranging data acquired by the self-service terminal, if the first ranging data is not smaller than a preset handling distance threshold, acquiring second image information authorized by a client through the self-service terminal, and performing limb feature recognition on the second image information by utilizing the first ranging data to acquire first limb feature data.
In this step, the first ranging data acquired through the self-service terminal is used to judge whether the customer shows signs of leaving the self-service terminal; if so, the customer's motion features are recorded and the business flow is suspended, preventing the next operator of the self-service terminal from reading the personal information of the currently logged-in customer illegally and without authorization.
Further, the first ranging data is the distance between the current customer and the self-service terminal; when it is not smaller than the preset handling distance threshold, the customer is considered ready to leave the self-service terminal. Image information of the current customer, namely the second image information, is then acquired through the self-service terminal, and limb feature recognition is performed on the second image information with the first ranging data as compensation to obtain the first limb feature data.
Specifically, because the pixel area occupied by the customer in the image shrinks as the customer moves away from the self-service machine, which affects the accuracy of image recognition, compensating according to the distance measured by the millimeter wave radar module of the self-service terminal greatly improves the recognition accuracy of the limb features.
And S3, suspending the current business flow when the first ranging data is not smaller than a preset departure distance threshold value.
And when the first distance measurement data is not smaller than a preset leaving distance threshold value, the client is indicated to leave the self-service terminal at the moment, and the business process operated by the client at the self-service terminal is suspended, so that the next operator is prevented from knowing or operating the information and business of the current client.
Step S4, acquiring second ranging data acquired by the self-service terminal, if the second ranging data is smaller than a preset departure distance threshold, acquiring second biological characteristic information and third image information authorized by a client through the self-service terminal, acquiring second biological dynamic characteristics and second appearance characteristics according to the second biological characteristic information and the third image information, and carrying out limb characteristic identification on the third image information by utilizing the second ranging data to acquire second limb characteristic data.
When the next operator approaches the self-service terminal, a customer identification process is needed to judge whether the next operator and the previous operator are the same customer. If they are the same customer, the suspended business flow can be restored and the customer does not have to go through identity recognition again, improving the user experience. If they are not the same customer, the business flow is not restored, which effectively safeguards the customer's account.
Further, when the distance between the next operator and the self-service terminal collected by the self-service terminal, namely the second ranging data, is smaller than the preset departure distance threshold, this indicates that the next operator is approaching and preparing to operate the self-service terminal. At this time, the self-service terminal collects the second biometric information and the third image information of the operator.
Specifically, the second biometric information, like the first biometric information, comprises static parameters (heart rate, heartbeat amplitude, respiratory rate) and dynamic parameters including the breathing rhythm. Likewise, the second biological dynamic feature is extracted from the second biometric information using the same feature extraction as for the first, and consists of the operator's heart rate, heartbeat amplitude, respiratory rate and breathing-rhythm feature sequence.
Further, as with the first image information, the third image information is subjected to appearance feature extraction including, for example, clothing color, whether to wear glasses, face shape, etc., recorded as feature vectors, thereby obtaining the second appearance feature of the current operator.
Further, as with the first limb feature data, the second ranging data is used as compensation to perform limb feature recognition on the third image information, so as to obtain second limb feature data.
And S5, comparing the first biological dynamic characteristic with the second biological dynamic characteristic, the first appearance characteristic with the second appearance characteristic, and the first limb characteristic data with the second limb characteristic data to obtain biological characteristic similarity, appearance characteristic similarity and limb characteristic matching degree.
Wherein the first biodynamic characteristic of the last customer is compared with the second biodynamic characteristic of the current operator, the first and second appearance characteristics, the first and second limb characteristic data. Specifically, the biological feature similarity, the appearance feature similarity and the limb feature matching degree of the previous client and the current operator can be obtained through calculation of the similarity and the matching degree.
And S6, dynamically weighting the biological feature similarity, the appearance feature similarity and the limb feature matching degree by using a preset weighting value to obtain multi-modal fusion similarity, and determining a customer identification result according to the multi-modal fusion similarity.
The preset weighting values comprise a weighting value of the appearance feature, a weighting value of the limb movement feature and a weighting value of the biological feature, and the weighting value is used for carrying out multi-mode fusion on the biological feature similarity, the appearance feature similarity and the limb feature matching degree to obtain the multi-mode fusion similarity.
Further, the multi-modal fusion similarity is compared with a preset similarity threshold; if it is not smaller than the threshold, the operator who arrives later is considered to be the same person as the previous customer, and the customer identification result is that they are the same customer.
Furthermore, after the customer is identified as the same person, the suspended business process after the customer leaves can be directly recovered, operations such as re-authorization identification and the like are not required, and the customer experience is greatly improved.
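Taken together, steps S1 to S6 form a simple sensing-and-decision loop. The sketch below illustrates that loop in Python under stated assumptions: the `terminal` object and every helper it calls (ranging, image and biometric capture, feature extraction, comparison and fusion) are hypothetical names introduced only for this example, and the threshold values are illustrative, not values fixed by this application.

```python
import time

# Illustrative threshold values; the method leaves the concrete numbers open.
HANDLE_DIST = 0.8      # handling-distance threshold, metres
LEAVE_DIST = 1.5       # departure-distance threshold, metres
SIM_THRESHOLD = 0.85   # multi-modal fusion similarity discrimination threshold

def identity_tracking_loop(terminal, features, similarity):
    """One pass of the S1-S6 flow.

    terminal   - hypothetical sensor/session interface (radar, camera, session control)
    features   - hypothetical feature-extraction routines
    similarity - hypothetical comparison and fusion routines
    """
    # S1: after the customer's authorised login, record the reference features.
    dyn1 = features.bio_dynamic(terminal.collect_biometrics())
    app1 = features.appearance(terminal.collect_image())

    # S2: once the customer is beyond the handling distance, record limb features.
    while terminal.measure_distance() < HANDLE_DIST:
        time.sleep(0.2)
    d1 = terminal.measure_distance()
    limb1 = features.limb(terminal.collect_image(), d1)

    # S3: suspend the business flow once the departure distance is reached.
    while terminal.measure_distance() < LEAVE_DIST:
        time.sleep(0.2)
    terminal.suspend_session()

    # S4: when someone comes back within the departure distance, re-collect features.
    while terminal.measure_distance() >= LEAVE_DIST:
        time.sleep(0.2)
    d2 = terminal.measure_distance()
    dyn2 = features.bio_dynamic(terminal.collect_biometrics())
    img3 = terminal.collect_image()
    app2, limb2 = features.appearance(img3), features.limb(img3, d2)

    # S5-S6: compare each modality, fuse with dynamic weights, and decide.
    fused = similarity.fuse(
        similarity.bio(dyn1, dyn2),
        similarity.appearance(app1, app2),
        similarity.limb(limb1, limb2),
        n_people=terminal.count_people(),
    )
    if fused >= SIM_THRESHOLD:
        terminal.resume_session()          # same customer: no re-authentication
    else:
        terminal.request_authentication()
```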
As an embodiment of the present invention, as shown in fig. 2, performing feature extraction on the first biometric information to obtain a first biometric feature includes:
step S11, extracting features of the first biological feature information to obtain the heart rate, the heartbeat amplitude, the respiratory rate and the respiratory rhythm of the current client;
step S12, a breathing rhythm characteristic sequence is obtained by utilizing the breathing rhythm, and the heart rate, the heartbeat amplitude, the breathing frequency and the breathing rhythm characteristic sequence of the current client are used as first biological dynamic characteristics.
The first biometric information of the client comprises static parameters (heart rate, heart beat amplitude, respiratory rate) and dynamic parameters, including respiratory rhythm.
Furthermore, by adopting a conventional mode, the heart rate, the heart beat amplitude, the breathing frequency and the breathing rhythm can be extracted from the first biological characteristic information, the breathing rhythm characteristic sequence can be extracted by utilizing the breathing rhythm, and the heart rate, the heart beat amplitude, the breathing frequency and the breathing rhythm characteristic sequence are used as the first biological dynamic characteristics.
In this embodiment, as shown in fig. 3, comparing the first biological dynamic feature with the second biological dynamic feature to obtain the biological feature similarity includes:
step S51, similarity calculation is carried out on the breathing rhythm feature sequence in the first biological dynamic feature and the breathing rhythm feature sequence in the second biological dynamic feature, so that breathing rhythm feature similarity is obtained;
step S52, similarity calculation is carried out on the breathing rhythm feature similarity, the heart rate, the heartbeat amplitude and the breathing frequency in the first biological dynamic feature and the second biological dynamic feature, and the biological feature similarity is obtained.
And calculating the similarity of the breathing rhythm characteristic sequence in the first biological dynamic characteristic and the breathing rhythm characteristic sequence in the second biological dynamic characteristic, so that the breathing rhythm characteristic similarity of the last client and the current operator can be obtained. After the similarity of the breathing rhythm features is obtained, the similarity calculation is carried out by combining the heart rate, the heartbeat amplitude and the breathing frequency in the first biological dynamic feature and the second biological dynamic feature, so that the biological feature similarity of the previous client and the current operator can be obtained.
As an embodiment of the present invention, as shown in fig. 4, performing limb feature recognition on the second image information by using the first ranging data, to obtain first limb feature data includes:
Step S21, selecting a plurality of limb characteristic points from the second image information, and determining image coordinates corresponding to each limb characteristic point according to the moment corresponding to each limb characteristic point;
and S22, taking the first ranging data as compensation, and carrying out normalization calculation by utilizing image coordinates corresponding to each limb characteristic point to obtain the first limb characteristic data.
The image coordinates of the n-th feature point at the t_n-th moment are recorded; a plurality of limb feature points are selected in total, and the corresponding recording duration is noted. Meanwhile, the distance at moment t_n is recorded through the millimeter wave radar sensor module of the self-service terminal, and normalization is then performed with the first ranging data as compensation to obtain the first limb feature data.
As an embodiment of the present invention, comparing the first limb feature data with the second limb feature data to obtain the limb feature matching degree includes: and carrying out matched filtering detection on the first limb characteristic data and the second limb characteristic data by using a preset farthest confidence distance to obtain limb characteristic matching degree.
Matched-filtering detection is performed on the limb feature data using a preset farthest confidence distance: if the person to be detected was far away at a sampling moment, that point is given a lower confidence in the matched-filtering detection, and if the person's distance exceeds the farthest confidence distance, the point is given no credence at all; the limb feature matching degree is thereby obtained.
As one embodiment of the present invention, determining the customer identification result according to the multi-modal fusion similarity includes: if the multi-mode fusion similarity is not smaller than the similarity discrimination threshold, determining that the client identification result is the same client.
In this embodiment, the method further includes: and if the client identification result is the same client, recovering the suspended current business flow.
The multi-modal fusion similarity is compared with a preset similarity threshold; if it is not smaller than the threshold, the operator who arrives later is considered to be the same person as the previous customer, and the customer identification result is that they are the same customer.
Furthermore, after the customer is identified as the same person, the suspended business process after the customer leaves can be directly recovered, operations such as re-authorization identification and the like are not required, and the customer experience is greatly improved.
In a specific embodiment of the present invention, as shown in fig. 5, the present invention overcomes the problems of low recognition accuracy and poor user experience in the prior art and provides a multi-modal customer identification scheme that fuses visual features and biological features. The customer's feature information is collected through the camera and the millimeter wave radar, and automatic customer identity verification is realized without the customer perceiving it, so that the customer does not need to repeat identity verification operations such as inserting an ID card when handling several services in succession, which improves the customer's service-handling experience.
S100, recording the biological characteristics of the client, and acquiring anonymous biological characteristic information of the client through a millimeter wave radar module and a camera module of the self-service terminal to prepare for subsequent characteristic comparison. The step is executed after the customer completes the operations of identity authentication such as inserting an identity card, mobile phone bank code scanning, face recognition and the like.
The specific flow is as follows: s101 biological feature collection, S102 biological dynamic feature extraction and S103 appearance feature extraction.
In S101 biological feature collection, the customer's biological features are collected through the millimeter wave radar module. The biometric features comprise static parameters (heart rate, heartbeat amplitude, respiratory rate) and dynamic parameters (respiratory rhythm). The heart rate of the current customer is recorded as h_c, the heartbeat amplitude as a_c and the respiratory rate as r_c. Let the number of initially recorded breathing-rhythm points be N and record the initial breathing-rhythm sequence as R_c, whose n-th point is written R_c(n), with n taking values n = 1, …, N.
Further, S102 biological dynamic feature extraction extracts the breathing-rhythm feature sequence F_c from the initially recorded breathing-rhythm sequence R_c. The specific operation is as follows: an N-point Fast Fourier Transform (FFT) is applied to R_c to obtain the length-N feature sequence F_c.
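To make S102 concrete, the sketch below computes such a feature sequence with NumPy. Treating the FFT magnitude as the feature sequence, and the synthetic 0.25 Hz rhythm used as input, are illustrative assumptions; the text above only specifies an N-point FFT of the recorded rhythm.

```python
import numpy as np

def breathing_rhythm_features(rhythm: np.ndarray) -> np.ndarray:
    """Turn an initially recorded breathing-rhythm sequence R_c (N samples) into
    a feature sequence F_c via an N-point FFT, as described in step S102.
    Using the magnitude spectrum is an assumption on top of the text."""
    spectrum = np.fft.fft(rhythm, n=len(rhythm))
    return np.abs(spectrum)

# Example: a synthetic 0.25 Hz breathing rhythm sampled at 10 Hz for about 26 s.
t = np.arange(256) / 10.0
rhythm = np.sin(2 * np.pi * 0.25 * t)
features = breathing_rhythm_features(rhythm)
print(features.shape)  # (256,)
```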
Further, in S103 the image information of the customer is obtained through the camera module, and the appearance characteristics of the customer in the picture, such as clothing color, whether glasses are worn and face shape, are extracted and recorded as the appearance feature vector K_c, whose m-th appearance feature value is written K_c(m), with m = 1, …, M. In this step the features can be extracted with existing algorithms, such as color recognition and neural network methods.
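As one possible illustration of S103, the sketch below packs already-detected appearance attributes into a feature vector K_c. The attribute set and the encoding (normalized RGB clothing color, a glasses flag, a one-hot face shape) are assumptions, and the upstream detectors (color recognition, neural-network classifiers) are outside the scope of the sketch.

```python
import numpy as np

# Hypothetical face-shape vocabulary; the patent does not fix one.
FACE_SHAPES = ["oval", "round", "square", "long", "heart"]

def appearance_vector(clothing_rgb, wears_glasses, face_shape):
    """Assemble an appearance feature vector K_c from detected attributes."""
    rgb = np.asarray(clothing_rgb, dtype=float) / 255.0       # clothing color in [0, 1]
    glasses = np.array([1.0 if wears_glasses else 0.0])        # glasses flag
    shape = np.zeros(len(FACE_SHAPES))                         # one-hot face shape
    shape[FACE_SHAPES.index(face_shape)] = 1.0
    return np.concatenate([rgb, glasses, shape])

k_c = appearance_vector((30, 60, 200), True, "oval")
print(k_c.round(2))
```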
S200, identifying a customer leaving action, judging whether the customer has a sign of leaving the self-service terminal through a millimeter wave radar module and a camera module of the self-service terminal, if so, recording the action characteristics of the customer and suspending a business process, and preventing the next operator of the self-service terminal from illegally and unauthorized reading the personal information of the current login customer.
The specific flow is as follows: s201 leaving intention judgment, S202 limb movement feature acquisition and S203 leaving behavior confirmation.
In S201 leaving intention judgment, the customer is ranged through the millimeter wave radar module, the distance between the customer and the self-service terminal is recorded as d_m, and the maximum business-handling distance threshold is recorded as d_handle. When d_m ≥ d_handle, the customer is considered to be in a ready-to-leave state and process S202 is executed; otherwise, the current process is executed in a loop.
Further, in S202 limb movement feature acquisition, image information of the customer is acquired through the camera module. A total of I_B limb feature points are selected and recorded over T_B sampling instants; the image coordinates of the i_B-th feature point at instant t_B are recorded as P_c(i_B, t_B), where i_B takes values i_B = 1, …, I_B and t_B takes values t_B = 1, …, T_B. At the same time, the millimeter wave radar sensor module records the distance d_c(t_B) at instant t_B.
Specifically, the normalized limb feature points L_c(i_B, t_B) are obtained using equation (1), where |x| is the absolute-value operation. Because the pixel area occupied by the customer in the image is smaller when the customer is far away from the self-service machine, which affects the accuracy of image recognition, the invention compensates according to the distance measured by the millimeter wave radar module, thereby improving the accuracy of limb feature recognition.
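A minimal sketch of the distance-compensated normalization described around equation (1) follows. The exact formula is not reproduced in this text, so the specific compensation used here (scaling each image coordinate P_c(i_B, t_B) by the radar distance d_c(t_B) and then normalizing by the largest absolute value) is an assumption that only mirrors the stated intent.

```python
import numpy as np

def normalized_limb_features(coords: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Distance-compensated limb feature points (cf. equation (1)).

    coords    : array of shape (I_B, T_B, 2), image coordinates of I_B limb
                feature points over T_B sampling instants.
    distances : array of shape (T_B,), radar range d_c(t_B) at each instant.
    """
    # Rescale distant samples back up, then normalize so values lie in [-1, 1].
    compensated = coords * distances[None, :, None]
    return compensated / np.max(np.abs(compensated))

coords = np.random.rand(8, 20, 2) * 480        # 8 feature points over 20 frames
distances = np.linspace(0.8, 1.6, 20)          # customer walking away
print(normalized_limb_features(coords, distances).shape)  # (8, 20, 2)
```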
Further, in step S203 departure confirmation, the millimeter wave radar module confirms whether the customer has left: when the distance d_m is not smaller than a given departure threshold d_leave, i.e. d_m ≥ d_leave, the customer is considered to be in a leaving state, and the system suspends the business process and executes step S301; otherwise, the current process is executed in a loop.
S300, comparing the biological characteristics of the clients, and judging whether the current operator is the last client completing identity authentication on the self-service terminal, if so, recovering the previously suspended business flow, and if not, requesting the current operator to conduct identity authentication.
On the one hand, this process avoids the situation where the customer is misjudged as having left because of an accidental error in the leave-recognition process of S200, so that the customer's business flow is suspended and identity authentication has to be performed again; on the other hand, it avoids the situation where the customer has to leave the self-service terminal briefly to seek help from branch staff, so that the business flow is suspended and identity authentication has to be performed again when the customer returns.
The specific flow is as follows: s301 customer detection, S302 limb movement feature collection, S303 biological feature collection, S304 limb movement feature comparison, S305 appearance feature comparison, S306 biological dynamic feature comparison, S307 biological comprehensive feature comparison, S308 multi-mode feature fusion comparison and S309 identity discrimination.
In S301 customer detection, customers approaching the self-service terminal are ranged through the millimeter wave radar module; when d_m < d_leave, a customer is considered ready to use the self-service terminal and process S302 is entered; otherwise, the current process is executed in a loop.
Further, S302 limb movement feature acquisition is similar to step S202: the camera module acquires image information of the current operator and the operator's limb movement features are recorded, with the image coordinates of the i_B-th feature point at instant t_B written P_o(i_B, t_B) and the corresponding distance recorded as d_o(t_B); the normalized limb feature points L_o(i_B, t_B) are obtained with formula (2), which is analogous to formula (1).
further, the S303 biometric feature collection is similar to the processes S101 and S102, and the user is subjected to biometric feature recording through the millimeter wave radar module, and the heart rate of the current operator is recorded asThe heart beat amplitude is +.>RespirationFrequency of->Initial recording sequence of breathing rhythm->For->The number of the points is +.>Fast Fourier Transform (FFT) of (2) to obtain a length +. >Is a current operator breathing rhythm feature sequence +.>Meanwhile, the camera module is used for acquiring a scene picture, and the number N of people in the picture is obtained through analysis people
Further, S304 limb movement feature comparison uses a matched-filtering detection method combined with a distance confidence coefficient: the matching degree ρ_B between the current operator's limb movement feature sequence L_o and the original customer's limb movement feature sequence L_c is calculated with formula (3).
In formula (3), D_conf is the farthest confidence distance and ⟨x⟩_1 denotes x clipped at 1, i.e. the expression takes the value 1 whenever x > 1. The formula means that the farther away the person to be detected was at a sampling instant, the lower the confidence of that point in the matched-filtering detection; when the person's distance exceeds the farthest confidence distance, the point is given no credence at all.
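The behaviour described for formula (3) can be sketched as a confidence-weighted correlation. Formula (3) itself is not reproduced in this text, so the linear confidence ramp and the normalized correlation below are assumptions that merely follow the stated behaviour (lower confidence for samples taken far away, no credence beyond D_conf).

```python
import numpy as np

def limb_matching_degree(l_c, l_o, dist_c, dist_o, d_conf=2.0):
    """Matching degree rho_B between limb feature sequences (cf. formula (3)).

    l_c, l_o       : normalized limb feature arrays of shape (I_B, T_B, 2) for the
                     original customer and the current operator.
    dist_c, dist_o : radar distances of shape (T_B,) at the sampling instants.
    d_conf         : the 'farthest confidence distance' D_conf; samples recorded
                     beyond it contribute nothing.
    """
    # Per-instant confidence: 1 when close, falling linearly to 0 at D_conf (assumption).
    conf = np.clip(1.0 - np.maximum(dist_c, dist_o) / d_conf, 0.0, 1.0)
    a = np.asarray(l_c, float) * conf[:, None]
    b = np.asarray(l_o, float) * conf[:, None]
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.sum(a * b) / denom) if denom > 0 else 0.0
```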
further, the comparison of the appearance characteristics of S305 is similar to the process S103, the camera module is used to obtain the image information of the client, and the appearance characteristics of the current operator in the picture are extracted and recordedAnd->Similarity of (2)Calculated using the following equation (4):
further, S306 biodynamic characteristic alignment calculates a current operator breathing rhythm feature sequence by equation (5) as followsAnd respiratory rhythm characteristic sequence->Similarity of->
Further, S307 biological comprehensive feature comparison calculates, with formula (6), the biological feature similarity ρ_β between the current operator and the original customer, combining the breathing-rhythm feature similarity ρ_R with the heart rate, heartbeat amplitude and respiratory rate.
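The comparison steps S305 to S307 reduce to similarity functions over the recorded vectors and sequences. Equations (4) to (6) are not reproduced in this text, so the concrete choices below (cosine similarity for the appearance vectors and for the breathing-rhythm spectra, and a Gaussian-style factor over the static parameters for the comprehensive biological similarity) are illustrative assumptions.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom > 0 else 0.0

def appearance_similarity(k_c, k_o):
    """rho_K, cf. equation (4): compare the two appearance feature vectors."""
    return cosine(np.asarray(k_c, float), np.asarray(k_o, float))

def rhythm_similarity(f_c, f_o):
    """rho_R, cf. equation (5): compare the two breathing-rhythm feature sequences."""
    return cosine(np.asarray(f_c, float), np.asarray(f_o, float))

def bio_similarity(f_c, f_o, static_c, static_o, scale=(10.0, 1.0, 5.0)):
    """rho_beta, cf. formula (6): combine the rhythm similarity with the static
    parameters (heart rate, heartbeat amplitude, respiratory rate).
    The multiplicative Gaussian factor over the scaled differences is illustrative."""
    diff = (np.asarray(static_c, float) - np.asarray(static_o, float)) / np.asarray(scale)
    return rhythm_similarity(f_c, f_o) * float(np.exp(-0.5 * np.dot(diff, diff)))
```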
Further, S308 multi-modal feature fusion comparison calculates the multi-modal fusion similarity ρ between the current operator and the original customer with formula (7). To address the reduced precision of the millimeter wave sensor when several people are present, the invention uses the number of people detected by the camera (see process S303) to dynamically weight the similarities obtained from the millimeter wave sensor when performing the multi-modal fusion.
In formula (7), ω_K is the weight of the appearance features, ω_B the weight of the limb movement features and ω_β the weight of the biological features; each weight defaults to 1, and the specific values can be adjusted according to the precision of the millimeter wave radar module and the resolution and shooting angle of the camera module. If the camera module has a high resolution and its shooting angle lets it view the customer head-on, the data acquired through the camera module are considered highly reliable, and the weight ω_K can accordingly be increased.
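The dynamic weighting of S308, together with the S309 decision described next, can be sketched as follows. The exact form of formula (7), and in particular how the detected head count N_people discounts the radar-derived similarity, is not reproduced in this text, so the 1/N_people attenuation of the biological weight used here is only an assumption.

```python
import numpy as np

def fuse_similarities(rho_beta, rho_k, rho_b, n_people=1,
                      w_beta=1.0, w_k=1.0, w_b=1.0, threshold=0.85):
    """Multi-modal fusion similarity rho (cf. formula (7)) plus the S309 decision.

    rho_beta, rho_k, rho_b : biological, appearance and limb-movement similarities.
    n_people               : head count from the camera (process S303); more people
                             mean less reliable radar data, so the biological weight
                             is attenuated here (assumed 1/n_people law).
    w_*                    : base weights omega_beta, omega_K, omega_B, default 1.
    """
    w_beta_eff = w_beta / max(n_people, 1)
    weights = np.array([w_beta_eff, w_k, w_b])
    sims = np.array([rho_beta, rho_k, rho_b])
    rho = float(np.dot(weights, sims) / weights.sum())
    return rho, rho >= threshold   # same-customer decision of S309

rho, same = fuse_similarities(0.92, 0.88, 0.81, n_people=2)
print(round(rho, 3), same)
```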
Further, S309 identity discrimination is used to determine whether the current operator is the last customer to complete identity authentication on the self-service terminal. Recording deviceThe similarity discrimination threshold is fused in a multi-mode way, when +.>When the new arriving customer and the original customer are considered to be the same person, the system will resume the previously suspended business process and execute process S201.
According to the invention, visual features and inherent biological features are fused to identify the target in a multi-modal way, which improves the recognition accuracy. Specifically, a matched-filtering detection method combined with distance confidence is used when comparing limb movements; the biological features are dynamically weighted with the visual detection result in the final comprehensive comparison; and both static and dynamic features are used. In addition, to address the fact that a distant target occupies a smaller area of the visual picture, the ranging function of the millimeter wave radar sensor is used to correct the obtained limb movement feature data, compensating for the error a purely visual algorithm would incur when the person is far away or the camera resolution is low.
In addition, the invention uses the customer's anonymous feature information, which speeds up business handling without requiring additional authorization operations from the customer. The anonymous feature information used by the invention can be collected without the customer perceiving it, does not interfere with the customer's continuous business handling, and improves the customer experience.
Fig. 6 is a schematic structural diagram of a self-service terminal identity recognition device according to an embodiment of the present invention, where the device includes:
the biological feature module 10 is configured to obtain first biological feature information and first image information which are acquired by the self-service terminal and authorized by a client, and perform feature extraction on the first biological feature information and the first image information to obtain a first biological dynamic feature and a first appearance feature;
The limb feature module 20 is configured to obtain first ranging data collected by the self-service terminal, and if the first ranging data is not less than a preset handling distance threshold, obtain second image information authorized by the client through the self-service terminal, and perform limb feature recognition on the second image information by using the first ranging data to obtain the first limb feature data;
a service suspension module 30, configured to suspend the current service flow when the first ranging data is not less than a preset departure distance threshold;
the ranging data module 40 is configured to acquire second ranging data acquired by the self-service terminal, acquire second biometric information and third image information authorized by the client through the self-service terminal if the second ranging data is smaller than a preset departure distance threshold, acquire a second biometric feature and a second appearance feature according to the second biometric information and the third image information, and perform limb feature identification on the third image information by using the second ranging data to acquire second limb feature data;
the similarity calculation module 50 is configured to compare the first biodynamic characteristic with the second biodynamic characteristic, the first appearance characteristic with the second appearance characteristic, and the first limb characteristic data with the second limb characteristic data, respectively, so as to obtain a biological characteristic similarity, an appearance characteristic similarity and a limb characteristic matching degree;
The client identification module 60 is configured to dynamically weight the biometric similarity, the appearance feature similarity, and the limb feature matching degree by using a preset weight value to obtain a multi-modal fusion similarity, and determine a client identification result according to the multi-modal fusion similarity.
As an embodiment of the present invention, as shown in fig. 7, the biometric module 10 includes:
the dynamic parameter unit 11 is used for extracting the characteristics of the first biological characteristic information to obtain the heart rate, the heartbeat amplitude, the respiratory rate and the respiratory rhythm of the current client;
the dynamic feature unit 12 is configured to obtain a breathing rhythm feature sequence by using the breathing rhythm, and take the heart rate, the heartbeat amplitude, the breathing frequency and the breathing rhythm feature sequence of the current client as first biological dynamic features.
In the present embodiment, as shown in fig. 8, the similarity calculation module 50 includes:
a breathing rhythm unit 51, configured to calculate a similarity between a breathing rhythm feature sequence in the first biodynamic characteristic and a breathing rhythm feature sequence in the second biodynamic characteristic, so as to obtain a breathing rhythm feature similarity;
and the biometric feature unit 52 is configured to calculate similarity of the breathing rhythm feature similarity, and the heart rate, the heartbeat amplitude, and the breathing frequency in the first biometric feature and the second biometric feature, so as to obtain the biometric feature similarity.
As shown in fig. 9, as one embodiment of the present invention, the limb characteristics module 20 includes:
a feature point unit 21, configured to select a plurality of limb feature points from the second image information, and determine image coordinates corresponding to each limb feature point according to a time corresponding to each limb feature point;
the limb characteristic unit 22 is configured to perform normalization calculation using the first ranging data as compensation and using the image coordinates corresponding to each limb characteristic point to obtain first limb characteristic data.
The similarity calculation module 50 is further configured to perform matched filtering detection on the first limb feature data and the second limb feature data by using a preset farthest confidence distance, so as to obtain a limb feature matching degree.
The client identifying module 60 is further configured to determine that the client identifying result is the same client if the multimodal fusion similarity is not less than the similarity discrimination threshold.
As an embodiment of the present invention, as shown in fig. 10, the apparatus further includes: the service restoration module 70 is configured to restore the suspended current service flow if the client identification result is the same client.
Based on the same application conception as the self-service terminal identity recognition method, the invention also provides the self-service terminal identity recognition device. Because the principle of the self-service terminal identity recognition device for solving the problem is similar to that of a self-service terminal identity recognition method, the implementation of the self-service terminal identity recognition device can refer to the implementation of the self-service terminal identity recognition method, and repeated parts are not repeated.
According to the invention, through a multi-mode customer identification mode integrating visual features and biological features, the accuracy of customer identification is improved, the safety of a customer account is ensured, and the relevant feature information of the customer is acquired through the self-service terminal, so that automatic customer identity verification is realized on the premise that the customer does not feel, the customer does not need to perform identity verification operations such as inserting identity cards for many times when continuously handling multiple businesses, and the customer business handling experience is improved.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above method when executing the program.
The invention also provides a computer program product comprising computer programs/instructions which when executed by a processor implement the steps of the above method.
The present invention also provides a computer-readable storage medium storing a computer program for executing the above method by a computer.
As shown in fig. 11, the electronic device 600 may further include a communication module 110, an input unit 120, an audio processor 130, a display 160 and a power supply 170. It is noted that the electronic device 600 need not include all of the components shown in fig. 11; in addition, the electronic device 600 may further include components not shown in fig. 11, for which reference may be made to the related art.
As shown in fig. 11, the central processor 100, sometimes also referred to as a controller or operation control, may include a microprocessor or other processor device and/or logic device; the central processor 100 receives inputs and controls the operation of the various components of the electronic device 600.
The memory 140 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable medium, a volatile memory, a non-volatile memory, or another suitable device. It may store information relating to failures, as well as a program for processing such information, and the central processor 100 can execute the program stored in the memory 140 to realize information storage, processing and the like.
The input unit 120 provides an input to the central processor 100. The input unit 120 is, for example, a key or a touch input device. The power supply 170 is used to provide power to the electronic device 600. The display 160 is used for displaying display objects such as images and characters. The display may be, for example, but not limited to, an LCD display.
The memory 140 may be a solid-state memory, such as a read-only memory (ROM), a random access memory (RAM), a SIM card, or the like. It may also be a memory that holds information even when powered off, that can be selectively erased and provided with further data, an example of which is sometimes referred to as an EPROM or the like. The memory 140 may also be some other type of device. The memory 140 includes a buffer memory 141 (sometimes referred to as a buffer). The memory 140 may include an application/function storage 142 for storing application programs and function programs, or a flow for the central processor 100 to execute the operations of the electronic device 600.
The memory 140 may also include a data store 143, the data store 143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by the electronic device. The driver storage 144 of the memory 140 may include various drivers of the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, address book applications, etc.).
The communication module 110 is a transmitter/receiver 110 that transmits and receives signals via an antenna 111. A communication module (transmitter/receiver) 110 is coupled to the central processor 100 to provide an input signal and receive an output signal, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, etc., may be provided in the same electronic device. The communication module (transmitter/receiver) 110 is also coupled to a speaker 131 and a microphone 132 via an audio processor 130 to provide audio output via the speaker 131 and to receive audio input from the microphone 132 to implement usual telecommunication functions. The audio processor 130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 130 is also coupled to the central processor 100 so that sound can be recorded locally through the microphone 132 and so that sound stored locally can be played through the speaker 131.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and embodiments of the present invention have been described in detail with reference to specific examples, which are provided only to facilitate understanding of the method and core ideas of the present invention. Meanwhile, since those skilled in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present invention, the contents of this description should not be construed as limiting the present invention.

Claims (10)

1. The self-service terminal identity recognition method is characterized by comprising the following steps of:
acquiring first biological characteristic information and first image information which are acquired by a self-service terminal and authorized by a client, and extracting characteristics of the first biological characteristic information and the first image information to obtain a first biological dynamic characteristic and a first appearance characteristic;
acquiring first ranging data acquired by the self-service terminal, if the first ranging data is not smaller than a preset handling distance threshold, acquiring second image information authorized by a client through the self-service terminal, and performing limb feature recognition on the second image information by utilizing the first ranging data to acquire first limb feature data;
when the first ranging data is not smaller than a preset leaving distance threshold value, suspending the current business process;
acquiring second ranging data acquired by the self-service terminal, if the second ranging data is smaller than a preset leaving distance threshold, acquiring second biological characteristic information and third image information authorized by a client through the self-service terminal, acquiring second biological dynamic characteristics and second appearance characteristics according to the second biological characteristic information and the third image information, and carrying out limb characteristic identification on the third image information by utilizing the second ranging data to acquire second limb characteristic data;
respectively comparing the first biological dynamic characteristic with the second biological dynamic characteristic, the first appearance characteristic with the second appearance characteristic, and the first limb characteristic data with the second limb characteristic data, to obtain a biological characteristic similarity, an appearance characteristic similarity and a limb characteristic matching degree;
and dynamically weighting the biological characteristic similarity, the appearance characteristic similarity and the limb characteristic matching degree by using a preset weighting value to obtain a multi-modal fusion similarity, and determining a client identification result according to the multi-modal fusion similarity.
2. The method of claim 1, wherein extracting characteristics of the first biological characteristic information to obtain the first biological dynamic characteristic comprises:
extracting the characteristics of the first biological characteristic information to obtain the heart rate, the heartbeat amplitude, the respiratory rate and the respiratory rhythm of the current client;
and obtaining a respiratory rhythm characteristic sequence by utilizing the respiratory rhythm, and taking the heart rate, the heartbeat amplitude, the respiratory rate and the respiratory rhythm characteristic sequence of the current client as the first biological dynamic characteristic.
3. The method of claim 2, wherein comparing the first biological dynamic characteristic with the second biological dynamic characteristic to obtain the biological characteristic similarity comprises:
calculating the similarity between the respiratory rhythm characteristic sequence in the first biological dynamic characteristic and the respiratory rhythm characteristic sequence in the second biological dynamic characteristic to obtain a respiratory rhythm characteristic similarity;
and calculating the similarity of the respiratory rhythm characteristic similarity, the heart rate, the heartbeat amplitude and the respiratory rate in the first biological dynamic characteristic and the second biological dynamic characteristic to obtain the biological characteristic similarity.
4. The method of claim 1, wherein performing limb feature recognition on the second image information by using the first ranging data to obtain the first limb feature data comprises:
selecting a plurality of limb characteristic points from the second image information, and determining image coordinates corresponding to each limb characteristic point according to the moment corresponding to each limb characteristic point;
and taking the first ranging data as compensation, and carrying out normalization calculation by utilizing image coordinates corresponding to each limb characteristic point to obtain the first limb characteristic data.
5. The method of claim 1, wherein comparing the first limb feature data with the second limb feature data to obtain the limb feature matching degree comprises:
carrying out matched filtering detection on the first limb feature data and the second limb feature data by using a preset farthest confidence distance to obtain the limb feature matching degree.
6. The method of claim 1, wherein determining the client identification result according to the multi-modal fusion similarity comprises:
if the multi-modal fusion similarity is not smaller than a similarity discrimination threshold, determining that the client identification result is the same client.
7. The method of claim 6, wherein the method further comprises: if the client identification result is the same client, resuming the suspended current business process.
8. A self-service terminal identity recognition device, the device comprising:
the biological feature module is used for acquiring first biological feature information and first image information which are acquired by a self-service terminal and authorized by a client, and performing feature extraction on the first biological feature information and the first image information to obtain a first biological dynamic feature and a first appearance feature;
the limb characteristic module is used for acquiring first ranging data acquired by the self-service terminal, acquiring second image information authorized by a client through the self-service terminal if the first ranging data is not smaller than a preset handling distance threshold value, and carrying out limb characteristic identification on the second image information by utilizing the first ranging data to acquire first limb characteristic data;
the service suspension module is used for suspending the current business process when the first ranging data is not smaller than a preset leaving distance threshold value;
the ranging data module is used for acquiring second ranging data acquired by the self-service terminal, acquiring second biological characteristic information and third image information authorized by a client through the self-service terminal if the second ranging data is smaller than a preset leaving distance threshold value, obtaining a second biological dynamic characteristic and a second appearance characteristic according to the second biological characteristic information and the third image information, and carrying out limb characteristic identification on the third image information by utilizing the second ranging data to acquire second limb characteristic data;
the similarity calculation module is used for comparing the first biological dynamic characteristic with the second biological dynamic characteristic, the first appearance characteristic with the second appearance characteristic, the first limb characteristic data with the second limb characteristic data respectively to obtain biological characteristic similarity, appearance characteristic similarity and limb characteristic matching degree;
and the client identification module is used for dynamically weighting the biological characteristic similarity, the appearance characteristic similarity and the limb characteristic matching degree by using a preset weighting value to obtain a multi-modal fusion similarity, and determining a client identification result according to the multi-modal fusion similarity.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program for executing the method of any one of claims 1 to 7 by a computer.
CN202311322346.9A 2023-10-12 2023-10-12 Self-service terminal identity recognition method and device Pending CN117351616A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311322346.9A CN117351616A (en) 2023-10-12 2023-10-12 Self-service terminal identity recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311322346.9A CN117351616A (en) 2023-10-12 2023-10-12 Self-service terminal identity recognition method and device

Publications (1)

Publication Number Publication Date
CN117351616A true CN117351616A (en) 2024-01-05

Family

ID=89358965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311322346.9A Pending CN117351616A (en) 2023-10-12 2023-10-12 Self-service terminal identity recognition method and device

Country Status (1)

Country Link
CN (1) CN117351616A (en)

Similar Documents

Publication Publication Date Title
KR100902199B1 (en) Authentication apparatus and method for controlling the authentication apparatus, electronic device provided with authentication apparatus, program for controlling authentication apparatus, and recording media storing the program
US11783335B2 (en) Transaction confirmation and authentication based on device sensor data
CN102201055A (en) Information processing device, information processing method, and program
CN110175849B (en) Collecting method, device, equipment, server and system
CN109766785A (en) A kind of biopsy method and device of face
CN112597850B (en) Identity recognition method and device
CN111626371A (en) Image classification method, device and equipment and readable storage medium
CN108877787A (en) Audio recognition method, device, server and storage medium
CN112036331A (en) Training method, device and equipment of living body detection model and storage medium
CN106471440A (en) Eye tracking based on efficient forest sensing
CN109766755A (en) Face identification method and Related product
CN108229375B (en) Method and device for detecting face image
CN111368811A (en) Living body detection method, living body detection device, living body detection equipment and storage medium
CN111738199B (en) Image information verification method, device, computing device and medium
JP2022177229A (en) Liveness detection verification method, liveness detection verification system, recording medium, and method for training liveness detection verification system
CN110929237A (en) Identity verification system, method and device and information verification system
CN111783677B (en) Face recognition method, device, server and computer readable medium
CN111931617B (en) Human eye image recognition method and device based on image processing and self-service terminal
CN117351616A (en) Self-service terminal identity recognition method and device
JP2018136701A (en) Face authentication fund transferring system
CN114882576B (en) Face recognition method, electronic device, computer-readable medium, and program product
CN114495292A (en) Identity recognition method, device, equipment and readable storage medium
CN113420279A (en) Password input method and device
CN113657293A (en) Living body detection method, living body detection device, electronic apparatus, medium, and program product
CN114742561A (en) Face recognition method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination