CN112949403B - Reliable user authentication method and system based on biological characteristics of mandible - Google Patents

Reliable user authentication method and system based on biological characteristics of mandible

Info

Publication number
CN112949403B
CN112949403B (application CN202110137465.1A)
Authority
CN
China
Prior art keywords
mandible
vibration
user
gradient
biological
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110137465.1A
Other languages
Chinese (zh)
Other versions
CN112949403A (en)
Inventor
刘建伟
宋文帆
沈乐明
韩劲松
任奎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202110137465.1A priority Critical patent/CN112949403B/en
Publication of CN112949403A publication Critical patent/CN112949403A/en
Priority to PCT/CN2021/114402 priority patent/WO2022160691A1/en
Application granted granted Critical
Publication of CN112949403B publication Critical patent/CN112949403B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/14Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction

Abstract

The invention discloses a reliable user authentication method based on the biological characteristics of the mandible, which comprises: acquiring, with an inertial measurement unit, the six-axis vibration signals generated when a user vocalizes with the throat; preprocessing the vibration signal of each axis into a gradient array; and inputting the gradient array into a mandible biological feature extractor to obtain the mandible biological features, according to which the user is registered and authenticated. The user's throat vocalization generates a vibration signal that drives the mandible to vibrate, and the earphone collects the vibration signal containing the biological characteristics. The invention uses an inertial measurement unit to capture the vibration characteristics of the mandible, separates positive from negative vibration characteristics by the sign of the gradient, and extracts the biological characteristics of the mandible from the vibration signal with a two-branch deep neural network. The biometric vector is converted into a revocable biometric vector using a Gaussian matrix to prevent replay attacks.

Description

Reliable user authentication method and system based on biological characteristics of mandible
Technical Field
The invention belongs to the field of user authentication, and particularly relates to a reliable user authentication method for collecting a vibration signal containing biological characteristics of a mandible by using an inertial measurement unit in an earphone and extracting the biological characteristics by using a deep neural network.
Background
User authentication techniques are widely deployed in security- and privacy-related applications in people's daily lives, for example, the door lock of a user's house, the unlocking of a smartphone, and identity checks at the entrances and exits of transportation hubs such as high-speed rail stations and airports. Existing user authentication techniques can be classified, according to whether a biometric feature is used, into biometric-based user authentication techniques and non-biometric-based authentication techniques. Both have their own disadvantages.
Non-biometric-based authentication techniques often use knowledge or devices as authentication credentials. For example, the password or pattern unlock of a mobile phone is a knowledge credential, while a user's identity card or ID card is a device credential; both authentication modes are simple and easy to use. However, the knowledge-based method requires the user to remember sufficiently complicated knowledge to achieve sufficient security, which is inconvenient for the user. The device-based method poses a serious security hazard once the device is lost, because anyone who possesses the device can be authenticated.
In order to solve the inconvenience and insecurity of non-biometric authentication, biometric-based authentication techniques have been proposed and widely studied. Existing biometric-based user authentication techniques include those based on in vitro (body-surface) biometrics and those based on in vivo biometrics. For example, fingerprints and facial features are in vitro features because they can be captured from the body surface, whereas brain waves and heartbeat features are in vivo biological features because they are extracted from the dynamics of organs inside the body. Although authentication based on in vitro features is very convenient in feature collection, fingerprints and facial features are easy for attackers to steal and copy, so the security of in vitro-based authentication is not as good as that of in vivo-based authentication. However, authentication based on in vivo biometric features is inconvenient in feature collection, and the features themselves are not stable enough: for example, brain wave collection devices are cumbersome and not suited to long-term wear, and heartbeat features are susceptible to mood swings and the like. Therefore, a reliable user authentication technique is urgently needed that captures stable biometric features through a simple acquisition method.
With the rapid development of hardware, inertial measurement units can now be deployed in existing earphone devices, which makes it possible to acquire the vibration signal of the mandible with an inertial measurement unit. In addition, the earphone is expected to become an important next-generation computing platform, with deep neural networks deployed in earphones for real-time language translation. It is therefore possible to extract the biological features of the mandible (in vivo features) from the vibration signal using a deep neural network in the earphone. The invention provides a reliable authentication method based on the biological characteristics of the mandible, which collects mandible vibration signals with the inertial measurement unit in an earphone and extracts the in vivo mandible biological features with a deep neural network.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a user authentication method which uses a simple inertial measurement unit in an earphone to collect in-vivo vibration signals and uses a deep neural network to extract stable and reliable in-vivo biological characteristics.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a method for reliable user authentication based on a biometric characteristic of a mandible, comprising the steps of:
acquiring, with an inertial measurement unit, the six-axis vibration signals generated when the user vocalizes with the throat, wherein the inertial measurement unit comprises an accelerometer and a gyroscope. The user can produce an "emm" humming sound with the mouth closed to make the mandible vibrate; the vibration is transmitted along the mandible to the ear and received by the inertial measurement unit in the earphone.
Removing abnormal values from the vibration signal of each axis, filtering, applying dispersion (min-max) normalization, calculating gradient values, dividing the gradients into positive and negative vibration signals, and finally splicing them to form a gradient array.
And inputting the gradient array into a mandible biological characteristic extractor, obtaining the mandible biological characteristic, and registering and authenticating the user according to the mandible biological characteristic.
Further, the vibration signals of the six axes are N sampling points taken from the vibration starting point, and N is not less than 60.
Further, the vibration starting point is determined by the following method:
the original signal is divided into windows of ten values each, and the standard deviation of all signal values in each window is calculated. If the standard deviation of a window is greater than 250 and the standard deviations of the two subsequent windows are greater than 100, the first signal value of the window whose standard deviation exceeds 250 is considered the vibration onset.
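For illustration, this start-point detection can be sketched as follows. This is a minimal sketch rather than the patented implementation: the function name and array handling are assumptions, while the ten-value window and the 250/100 standard-deviation thresholds come from the description.

```python
import numpy as np

def find_vibration_start(signal, win=10, thr_first=250.0, thr_next=100.0):
    """Return the index of the first sample of the window that starts the
    vibration, or None if no qualifying window is found."""
    n_win = len(signal) // win
    stds = [np.std(signal[i * win:(i + 1) * win]) for i in range(n_win)]
    for i in range(n_win - 2):
        # a window with std > 250 followed by two windows with std > 100
        if stds[i] > thr_first and stds[i + 1] > thr_next and stds[i + 2] > thr_next:
            return i * win
    return None
```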
Further, removing the abnormal values specifically includes: finding the abnormal values with an MAD detection algorithm, and replacing each abnormal value with the mean of its two preceding and two following normal values to achieve noise reduction.
Further, the high-pass filtering is performed with a Butterworth filter with a cutoff frequency of 20 Hz, because the frequency of human vocalization is typically above 150 Hz while the frequencies caused by body motion are typically below 10 Hz. This filtering eliminates low-frequency components unrelated to the biological characteristics of the mandible.
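A hedged sketch of these two preprocessing steps is given below. The MAD threshold of 3 scaled deviations and the filter order of 4 are assumptions (the description only names the MAD detector and the 20 Hz Butterworth cutoff), and the 350 Hz sampling rate is taken from a later passage of the description.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def replace_outliers_mad(x, thresh=3.0):
    """Replace MAD outliers with the mean of their neighbouring normal values."""
    x = np.asarray(x, dtype=float).copy()
    med = np.median(x)
    mad = np.median(np.abs(x - med)) or 1e-9          # avoid division by zero
    outliers = set(np.where(np.abs(x - med) / (1.4826 * mad) > thresh)[0])
    for i in outliers:
        # mean of the two preceding and two following normal values
        neighbours = [x[j] for j in (i - 2, i - 1, i + 1, i + 2)
                      if 0 <= j < len(x) and j not in outliers]
        if neighbours:
            x[i] = np.mean(neighbours)
    return x

def highpass_20hz(x, fs=350.0, order=4):
    """Butterworth high-pass with a 20 Hz cutoff (filter order is an assumption)."""
    b, a = butter(order, 20.0 / (fs / 2.0), btype="highpass")
    return filtfilt(b, a, x)
```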
Further, interpolation processing is carried out on the positive and negative vibration signals, and the dimension of the gradient array is kept consistent.
Further, the mandible biological feature extractor is a trained neural network, classifier, or similar model. Preferably, the mandible biometric extractor is a two-branch deep neural network into which the positive and negative gradient features are input, respectively, to obtain a biometric vector of dimension (1, 512). Training is done by the system manufacturer.
Further, after obtaining the biological characteristics of the mandible, the method also comprises the following steps:
Multiplying the mandible biometric vector by a Gaussian matrix converts it into a revocable mandible biometric vector. If the Gaussian matrix is denoted G and the biological feature vector output by the feature extractor is denoted M_1, the transformed revocable mandible biometric vector can be calculated using the following formula:
M = M_1 × G.
Once the revocable biometric template in the earphone is stolen, the user can change the Gaussian matrix G and generate a new revocable biometric template. A replay of the stolen template is then rejected, because the similarity between the old template and the new template is low.
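A minimal sketch of this revocable-template transform is shown below; the square 512×512 Gaussian matrix and the use of a random seed as the revocable "key" are illustrative assumptions.

```python
import numpy as np

def make_gaussian_matrix(dim=512, seed=0):
    """Generate the Gaussian matrix G; changing the seed revokes old templates."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((dim, dim))

def to_revocable(feature_vec, G):
    """feature_vec: mandible biometric vector of shape (1, 512); returns M = M_1 x G."""
    return feature_vec @ G
```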
Further, the process of registering the user according to the biological characteristics of the mandible is as follows:
and storing the acquired biological characteristics of the mandible of the registered user to finish the registration.
The process of authenticating the user according to the biological characteristics of the mandible is as follows: the similarity between the mandible biological characteristics of the user being authenticated and the mandible biological characteristics stored during registration is calculated; if the similarity is greater than a threshold, the authentication is accepted, otherwise it is rejected. The similarity is computed with the cosine measure, i.e., the cosine similarity between the new revocable mandible biological feature vector and the stored revocable biological feature vector template. If the obtained similarity is larger than the preset acceptance threshold, the authentication is accepted; otherwise it is refused.
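As a sketch, the cosine-similarity decision can be written as follows; the 0.57 acceptance threshold is taken from the embodiment described later, and the function names are illustrative.

```python
import numpy as np

def cosine_similarity(a, b):
    a, b = np.ravel(a), np.ravel(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def authenticate(new_revocable_vec, stored_template, threshold=0.57):
    """Accept the user if the revocable vectors are similar enough."""
    return cosine_similarity(new_revocable_vec, stored_template) > threshold
```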
The user only needs to vocalize for 0.2 seconds to provide 60 vibration sampling points, since the sampling rate of the inertial measurement unit may be 350 Hz.
The formula for processing the data (amplitude) of each axis by dispersion normalization is as follows:
v'_i = (v_i − v_min) / (v_max − v_min),
where v'_i, v_i, v_min, and v_max are the normalized vibration data value, the vibration data value before normalization, the minimum value of the axis, and the maximum value of the axis, respectively.
The formula for finding the gradient for each axis is as follows:
g_i = (v'_{i+1} − v'_i) / |t_{i+1} − t_i|,
where g_i, v'_i, and |t_{i+1} − t_i| are the i-th gradient value, the i-th normalized vibration data value, and the time difference between the i-th and (i+1)-th data, respectively.
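The two formulas above correspond to the following sketch; a uniform sampling interval of 1/350 s is assumed in place of explicit time stamps.

```python
import numpy as np

def minmax_normalize(v):
    """Dispersion (min-max) normalization of one axis."""
    v = np.asarray(v, dtype=float)
    return (v - v.min()) / (v.max() - v.min() + 1e-12)

def gradients(v_norm, dt=1.0 / 350.0):
    """g_i = (v'_{i+1} - v'_i) / |t_{i+1} - t_i| with a constant time step dt."""
    return np.diff(v_norm) / dt
```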
The invention also provides an authentication system of the reliable user authentication method based on the biological characteristics of the mandible, which comprises the following steps:
and the inertia measurement unit is used for acquiring six-axis vibration signals generated when a user produces sound in throat.
And the signal processing module is used for removing abnormal values of the vibration signals of each axis, filtering, normalizing dispersion, calculating gradient values, dividing the gradient values into positive and negative vibration signals, and finally splicing to form a gradient array.
And the mandible biological feature extractor is used for extracting the mandible biological features according to the gradient array.
And the registration and authentication module is used for storing the mandible biological characteristics generated during user authentication and authenticating according to the mandible biological characteristics during user authentication.
The authentication system can be a stand-alone system or built into an existing device that comprises an inertial measurement unit, such as an earphone; it can be obtained by embedding the trained mandible biometric extraction network and the registration and authentication module in the earphone system when the earphone manufacturer produces the earphone, and can thus be used for authentication on the device.
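For orientation, the four modules could be wired together roughly as sketched below; `preprocess` and `extractor` stand in for the signal processing module and the trained mandible biometric extractor, and the class and method names are assumptions, not the patented implementation.

```python
import numpy as np

class MandibleAuthSystem:
    """Structural sketch: enrollment stores a revocable template, authentication compares against it."""

    def __init__(self, preprocess, extractor, G, threshold=0.57):
        self.preprocess = preprocess   # raw six-axis signal -> (2, 6, 30) gradient array
        self.extractor = extractor     # gradient array -> (1, 512) biometric vector
        self.G = G                     # Gaussian matrix acting as the revocable key
        self.threshold = threshold
        self.template = None

    def _revocable(self, raw_signal):
        feature = self.extractor(self.preprocess(raw_signal))
        return feature @ self.G

    def enroll(self, raw_signal):
        self.template = self._revocable(raw_signal)

    def authenticate(self, raw_signal):
        probe = self._revocable(raw_signal)
        a, b = np.ravel(probe), np.ravel(self.template)
        sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
        return sim > self.threshold
```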
Compared with prior user authentication technologies, the invention generates a vibration signal through throat vocalization that drives the mandible to vibrate, and collects the vibration signal containing the biological characteristics with the earphone. The invention uses an inertial measurement unit to capture the vibration characteristics of the mandible, separates positive from negative vibration characteristics by the sign of the gradient, and extracts the biological characteristics of the mandible from the vibration signal with a two-branch deep neural network. The biometric vector is converted into a revocable biometric vector using a Gaussian matrix to prevent replay attacks.
Drawings
FIG. 1 is an authentication flow diagram of the present invention;
FIG. 2 is a schematic view of a mandible vibration model;
FIG. 3 is a block diagram of a dual branch neural network;
FIG. 4 is a diagram of authentication performance;
Detailed Description
Aiming at the situation that the usability and the security of existing user authentication technologies cannot both be well satisfied at the same time, the invention provides a method in which an inertial measurement unit on a wearable device such as an earphone captures secure, reliable, and highly discriminative biological characteristics for authentication. FIG. 2 shows the vibration model of the mandible. Assume a force F_P produces the positive vibration signal, the mass of the mandible is m, and the damping and elastic coefficients of the vibration produced by the body components on the two sides of the mandible are (c_1, c_2) and (k_1, k_2). Then, according to Newton's second law:
F_P(t) = m x″(t) + c_1 x′(t) + (k_1 + k_2) x(t),
where x(t) is the positive vibrational displacement of the mandible. After Fourier transform, the following equation is obtained:
X_P(w) = F_P(0) / ((k_1 + k_2) − m w² + i c_1 w),
where w, X_P(w), i, and F_P(0) are the frequency component of the vibration signal, the frequency spectrum of the vibration signal, the imaginary unit, and the instantaneous force that produces the positive vibration, respectively. After the vibration is transmitted to the earphone, the frequency spectrum of the vibration signal can be represented as:
X_P^E(w) = e^(−a·d) · F_P(0) / ((k_1 + k_2) − m w² + i c_1 w),
where a and d are the attenuation coefficient and propagation distance, respectively, of the vibration signal. Similar to the propagation law of the positive vibration signal, the frequency spectrum of the negative vibration signal arriving at the earphone can be expressed as:
X_N^E(w) = e^(−a·d) · F_N(0) / ((k_1 + k_2) − m w² + i c_2 w).
thus one period of vibration (mandible shift from a central position to a positive edge and from a central position to a negative edge) can be expressed as:
X^E(w) = X_P^E(w) + X_N^E(w) = e^(−a·d) · [F_P(0) / ((k_1 + k_2) − m w² + i c_1 w) + F_N(0) / ((k_1 + k_2) − m w² + i c_2 w)].
It can be seen that m, c_1, c_2, k_1, and k_2 in this formula are all biological features of the mandible and are distinguishable among different persons. Therefore, capturing a vibration signal containing the biological characteristics of the mandible with the inertial measurement unit of the earphone is a viable authentication method. The method of the invention is further illustrated with reference to the accompanying drawings and a specific example:
a method for reliable user authentication based on the biometric features of the mandible, whose brief flow is shown in fig. 1, is specifically completed in the following five steps:
step 1) the user vocalizes with the throat for 0.2 seconds. The vibration of the throat may drive the mandible of the user to vibrate. The vibration signal is captured by an inertial measurement unit in the headset as it passes along the mandible at the headset. The vibration signal is a signal containing a biometric characteristic of the user's mandible.
Step 2) The original vibration signals collected by the inertial measurement unit are preprocessed to remove noise and irrelevant components from the signal data, finally yielding a signal array.
The original signal is divided into windows of ten values each, and the standard deviation of all signal values in each window is calculated. If the standard deviation of a window is greater than 250 and the standard deviations of the two subsequent windows are greater than 100, the vibration is considered to start at the first data point of the window whose standard deviation exceeds 250. A run of 60 consecutive data values of the vibration signal is then selected for each axis.
Abnormal values caused by hardware imperfections or human body movement are found in each axis using the MAD algorithm. Each outlier is then replaced by the average of its two preceding and two following normal values.
The data of each axis are high-pass filtered with a Butterworth filter with a cutoff frequency of 20 Hz, eliminating low-frequency components irrelevant to the biological characteristics of the mandible and leaving cleaner data.
Since the starting vibration value is different for each axis, dispersion normalization is required for each axis. Otherwise the contribution of the axis with the smaller starting value will be masked in the subsequent processing by the contribution of the axis with the larger starting value. The normalized six-axis data is concatenated to form a signal array with dimension (6, 60).
Step 3) Calculate the positive-direction and negative-direction gradient features.
Positive and negative vibrations are difficult to distinguish in the signal array obtained in steps 1) and 2), so the gradient is computed for each axis and the positive and negative vibrations are separated according to the sign of the gradient: gradients greater than or equal to 0 belong to the positive vibration, and the other gradients belong to the negative vibration.
The positive and negative gradients are interpolated linearly, respectively, so that the gradient in each direction for each axis contains 30 values. The gradients of all axes are spliced together to form a gradient array with dimension (2, 6, 30).
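A sketch of this separation and interpolation step is shown below; the helper names are assumptions, while the split by gradient sign, the 30-value linear interpolation, and the (2, 6, 30) output shape follow the description.

```python
import numpy as np

def resample_linear(values, length=30):
    """Linearly interpolate a 1-D sequence to a fixed number of values."""
    values = np.asarray(values, dtype=float)
    if values.size == 0:
        return np.zeros(length)
    if values.size == 1:
        return np.full(length, values[0])
    return np.interp(np.linspace(0, 1, length), np.linspace(0, 1, values.size), values)

def build_gradient_array(gradients_per_axis, length=30):
    """gradients_per_axis: six gradient vectors (one per axis); returns shape (2, 6, 30)."""
    pos, neg = [], []
    for g in gradients_per_axis:
        g = np.asarray(g, dtype=float)
        pos.append(resample_linear(g[g >= 0], length))   # positive vibration part
        neg.append(resample_linear(g[g < 0], length))    # negative vibration part
    return np.stack([np.stack(pos), np.stack(neg)])
```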
Step 4) The gradient array obtained in step 3) is input into the mandible biometric extractor to obtain a mandible biometric vector. The vector is multiplied by a Gaussian matrix and stored in the earphone as a revocable authentication template.
In this embodiment, the mandible biometric extractor is a trained neural network whose structure is shown in FIG. 3. It comprises two convolution branches, which take the positive gradient and the negative gradient as inputs, respectively. Each convolution branch contains three convolutional layers, each followed by a batch normalization function and a ReLU activation function. The outputs of the two branches are concatenated and fed into two fully-connected layers. The output of the first fully-connected layer is the mandible biometric vector extracted from the gradient feature array.
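A hedged PyTorch sketch of such a two-branch extractor follows. The channel widths, kernel sizes, and the size of the second fully-connected layer are illustrative assumptions; only the overall structure (three convolutional layers per branch with batch normalization and ReLU, concatenation, two fully-connected layers, and a 512-dimensional output of the first fully-connected layer) is taken from the description.

```python
import torch
import torch.nn as nn

class ConvBranch(nn.Module):
    """One convolution branch: three Conv1d layers, each with BatchNorm and ReLU."""
    def __init__(self, in_channels=6):
        super().__init__()
        layers, channels = [], [in_channels, 32, 64, 128]
        for c_in, c_out in zip(channels[:-1], channels[1:]):
            layers += [nn.Conv1d(c_in, c_out, kernel_size=3, padding=1),
                       nn.BatchNorm1d(c_out),
                       nn.ReLU()]
        self.net = nn.Sequential(*layers)

    def forward(self, x):               # x: (batch, 6, 30)
        return self.net(x).flatten(1)   # (batch, 128 * 30)

class MandibleExtractor(nn.Module):
    """Two-branch extractor for the positive and negative gradient features."""
    def __init__(self, num_classes=30):
        super().__init__()
        self.pos_branch = ConvBranch()
        self.neg_branch = ConvBranch()
        self.fc1 = nn.Linear(2 * 128 * 30, 512)   # first FC layer -> biometric vector
        self.fc2 = nn.Linear(512, num_classes)    # second FC layer, used during training

    def forward(self, pos, neg):
        feat = torch.cat([self.pos_branch(pos), self.neg_branch(neg)], dim=1)
        embedding = self.fc1(feat)                # (batch, 512) mandible biometric vector
        return embedding, self.fc2(embedding)
```

For example, `MandibleExtractor()(torch.randn(1, 6, 30), torch.randn(1, 6, 30))` returns the (1, 512) embedding together with the auxiliary classification logits.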
Step 5) The user vibrates the throat for 0.2 seconds so that the earphone captures the mandible vibration signal. After processing through steps 2) and 3), the signal is input into the feature extractor to obtain a new mandible biometric vector. This vector is converted into a revocable biometric vector and its similarity with the template in the earphone is calculated. If the similarity is greater than 0.57, the authentication is accepted; otherwise it is rejected.
Aiming at the situation that the security and usability of existing user authentication technologies cannot be guaranteed at the same time, the invention provides a secure and reliable user authentication method that can be realized on an earphone. The user does not need any limb interaction with the authentication device: with only throat vibration, the inertial measurement unit in the earphone captures a vibration signal containing the biological characteristics of the mandible, highly discriminative mandible biological features are extracted through multi-step preprocessing and a deep network, and replay-resistant secure authentication is realized through the Gaussian matrix and similarity calculation. The false rejection rate and false acceptance rate on experimental data from 30 persons are shown in FIG. 4; the error rate is below 2%.
It should be understood that the above example is only for clarity of illustration and is not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments here, and obvious variations or modifications derived therefrom remain within the scope of the invention.

Claims (9)

1. A method for reliable user authentication based on a biometric characteristic of a mandible, comprising the steps of:
acquiring vibration signals of six axes generated by a user when the user produces sound in throat by using an inertia measuring unit;
removing abnormal values from the vibration signals of each axis, filtering, normalizing dispersion and calculating gradient values to divide the gradient values into positive and negative gradients;
inputting the positive gradient and the negative gradient into a mandible biological feature extractor to obtain the mandible biological feature, and registering and authenticating the user according to the mandible biological feature; the mandible biological feature extractor adopts a trained double-branch neural network, and the double-branch neural network comprises two convolution branches which respectively correspond to an input positive gradient and a negative gradient.
2. The method of claim 1, wherein the vibration signals of the six axes are N sampling points taken from the vibration start point onward, N being not less than 60.
3. The method of claim 2, wherein the vibration initiation point is determined by:
dividing the original signal into windows of ten values each, and calculating the standard deviation of all signal values in each window; if the standard deviation of a window is greater than 250 and the standard deviations of the two subsequent windows are greater than 100, the first signal value of the window whose standard deviation exceeds 250 is considered as the vibration onset.
4. The method for reliable user authentication based on the biometric characteristics of the mandible according to claim 1, wherein removing the abnormal values is specifically: the outliers are found using the MAD detection algorithm, and each outlier is replaced by the mean of the two normal values preceding it and the two normal values following it.
5. The method of claim 1, wherein the filtering process has a cutoff frequency of 20 Hz.
6. The method of claim 1, further comprising interpolating positive and negative vibration signals to maintain the dimensions of the gradient array consistent.
7. The method of claim 1, wherein obtaining the biometric information of the mandible further comprises:
multiplying the mandible biometric vector by a Gaussian matrix to convert it into a revocable mandible biometric vector.
8. The method of claim 1, wherein the registering the user according to the biometric features of the mandible comprises:
storing the acquired biological characteristics of the mandible of the registered user to finish registration;
the process of authenticating the user according to the biological characteristics of the mandible is as follows: and performing similarity calculation on the mandible biological characteristics of the authenticated user and the mandible biological characteristics stored during registration, if the similarity is greater than a threshold value, accepting the authentication, and otherwise, rejecting the authentication.
9. The authentication system of the reliable user authentication method based on the biometric characteristic of the mandible according to any one of claims 1 to 8, comprising:
the inertia measurement unit is used for acquiring vibration signals of six axes generated when a user produces sound in throat;
the signal processing module is used for removing abnormal values of the vibration signals of each axis, filtering, normalizing dispersion and calculating gradient values to divide the gradient values into positive and negative gradients;
the mandible biological feature extractor is used for extracting the mandible biological feature according to the positive and negative gradients;
and the registration and authentication module is used for storing the mandible biological characteristics generated during user authentication and authenticating according to the mandible biological characteristics during user authentication.
CN202110137465.1A 2021-02-01 2021-02-01 Reliable user authentication method and system based on biological characteristics of mandible Active CN112949403B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110137465.1A CN112949403B (en) 2021-02-01 2021-02-01 Reliable user authentication method and system based on biological characteristics of mandible
PCT/CN2021/114402 WO2022160691A1 (en) 2021-02-01 2021-08-24 Reliable user authentication method and system based on mandibular biological features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110137465.1A CN112949403B (en) 2021-02-01 2021-02-01 Reliable user authentication method and system based on biological characteristics of mandible

Publications (2)

Publication Number Publication Date
CN112949403A CN112949403A (en) 2021-06-11
CN112949403B true CN112949403B (en) 2022-08-23

Family

ID=76240896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110137465.1A Active CN112949403B (en) 2021-02-01 2021-02-01 Reliable user authentication method and system based on biological characteristics of mandible

Country Status (2)

Country Link
CN (1) CN112949403B (en)
WO (1) WO2022160691A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112949403B (en) * 2021-02-01 2022-08-23 浙江大学 Reliable user authentication method and system based on biological characteristics of mandible

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101301063B1 (en) * 2013-07-05 2013-08-28 (주)드림텍 Method of manufacturing fingerprint recognition home key using high dielectric constant material and fingerprint recognition home key structure thereof
CN108574701A (en) * 2017-03-08 2018-09-25 理查德.A.罗思柴尔德 System and method for determining User Status
CN110363120A (en) * 2019-07-01 2019-10-22 上海交通大学 Intelligent terminal based on vibration signal touches authentication method and system
CN111371951A (en) * 2020-03-03 2020-07-03 北京航空航天大学 Smart phone user authentication method and system based on electromyographic signals and twin neural network
CN112149638A (en) * 2020-10-23 2020-12-29 贵州电网有限责任公司 Personnel identity recognition system construction and use method based on multi-modal biological characteristics

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103218841B (en) * 2013-04-26 2016-01-27 中国科学技术大学 In conjunction with the three-dimensional vocal organs animation method of physiological models and data-driven model
US9858403B2 (en) * 2016-02-02 2018-01-02 Qualcomm Incorporated Liveness determination based on sensor signals
US10051112B2 (en) * 2016-12-23 2018-08-14 Google Llc Non-intrusive user authentication system
KR20200024602A (en) * 2018-08-28 2020-03-09 삼성전자주식회사 Learning method and apparatus of user terminal
CN109711350B (en) * 2018-12-28 2023-04-07 武汉大学 Identity authentication method based on lip movement and voice fusion
CN112949403B (en) * 2021-02-01 2022-08-23 浙江大学 Reliable user authentication method and system based on biological characteristics of mandible

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101301063B1 (en) * 2013-07-05 2013-08-28 (주)드림텍 Method of manufacturing fingerprint recognition home key using high dielectric constant material and fingerprint recognition home key structure thereof
CN108574701A (en) * 2017-03-08 2018-09-25 理查德.A.罗思柴尔德 System and method for determining User Status
CN110363120A (en) * 2019-07-01 2019-10-22 上海交通大学 Intelligent terminal based on vibration signal touches authentication method and system
CN111371951A (en) * 2020-03-03 2020-07-03 北京航空航天大学 Smart phone user authentication method and system based on electromyographic signals and twin neural network
CN112149638A (en) * 2020-10-23 2020-12-29 贵州电网有限责任公司 Personnel identity recognition system construction and use method based on multi-modal biological characteristics

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Improved Throat Vibration Sensing with a Flexible 160-GHz Radar through Harmonic Generation";Martin Geiger.et al;《IEEE/MTT-S International Microwave Symposium》;20180820;全文 *
"TouchPass: Towards Behavior-irrelevant on-touch User Authentication on Smartphones Leveraging Vibrations ";Xiangyu Xu.et al;《ACM》;20200921;全文 *

Also Published As

Publication number Publication date
CN112949403A (en) 2021-06-11
WO2022160691A1 (en) 2022-08-04

Similar Documents

Publication Publication Date Title
US10566007B2 (en) System and method for authenticating voice commands for a voice assistant
US10867019B2 (en) Personal authentication device, personal authentication method, and personal authentication program using acoustic signal propagation
CN104834849B (en) Dual-factor identity authentication method and system based on Application on Voiceprint Recognition and recognition of face
US10628568B2 (en) Biometric recognition system
El_Rahman Biometric human recognition system based on ECG
CN112949403B (en) Reliable user authentication method and system based on biological characteristics of mandible
Lee et al. Wearable Bio-Signal (PPG)-Based Personal Authentication Method Using Random Forest and Period Setting Considering the Feature of PPG Signals.
Duraibi Voice biometric identity authentication model for iot devices
Dey et al. Electrocardiogram feature based inter-human biometric authentication system
WO2022268183A1 (en) Video-based random gesture authentication method and system
CN112347450A (en) Identity verification method based on blink sound signal
Liu et al. Mandipass: Secure and usable user authentication via earphone imu
CN111444830A (en) Imaging method and device based on ultrasonic echo signal, storage medium and electronic device
CN113241081B (en) Far-field speaker authentication method and system based on gradient inversion layer
CN204576520U (en) Based on the Dual-factor identity authentication device of Application on Voiceprint Recognition and recognition of face
CN110197172B (en) Method and device for identity authentication based on photoelectric blood vessel volume information
CN115935314A (en) User identity authentication method based on wearable device motion sensor
Nickel et al. Does a cycle-based segmentation improve accelerometer-based biometric gait recognition?
Wahid et al. A Gaussian mixture models approach to human heart signal verification using different feature extraction algorithms
CN109100948A (en) A kind of intelligent home control system having identification verification function
CN114676413A (en) Wearable device identity verification method through reprogrammable vibration signal
Mohanta et al. Development of multimodal biometric framework for smartphone authentication system
Chang et al. Vogue: Secure user voice authentication on wearable devices using gyroscope
Sidek et al. The study of ppg and apg signals for biometric recognition
CN111444489A (en) Double-factor authentication method based on photoplethysmography sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant