CN106778179B - Identity authentication method based on ultrasonic lip language identification - Google Patents

Identity authentication method based on ultrasonic lip language identification

Info

Publication number
CN106778179B
CN106778179B (application CN201710006640.7A)
Authority
CN
China
Prior art keywords
user
lip language
signal
lip
ultrasonic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710006640.7A
Other languages
Chinese (zh)
Other versions
CN106778179A (en)
Inventor
王晓亮
谭佳瑶
陆桑璐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Original Assignee
Nanjing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University filed Critical Nanjing University
Priority to CN201710006640.7A priority Critical patent/CN106778179B/en
Publication of CN106778179A publication Critical patent/CN106778179A/en
Application granted granted Critical
Publication of CN106778179B publication Critical patent/CN106778179B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An identity authentication method based on ultrasonic lip language identification comprises the following steps: (1) a signal transmitting source emits an ultrasonic signal, and a signal receiving source receives the signal reflected from the mouth; (2) lip language features are extracted from the collected reflection signals, including the spectral feature corresponding to a lip language event and the user's dynamic lip contour feature; (3) the user is identified and authenticated according to the lip language features extracted in step (2). The beneficial effects of the invention are: the characteristic parameters for identity authentication are obtained through ultrasonic sensing without any additional customized hardware, so the deployment cost is low and the application scenarios of ultrasonic technology on mobile terminals are expanded; by exploiting the differences in users' behavioral habits when expressing the same lip language event, combined with biometric features, the method closes security loopholes of existing technologies such as facial recognition and fingerprint recognition and achieves efficient identity authentication.

Description

Identity authentication method based on ultrasonic lip language identification
Technical Field
The invention relates to the technical field of identity authentication, in particular to an identity authentication method based on ultrasonic lip language identification.
Background
In today's information society, accurate identity authentication is a necessary prerequisite for system security. Traditional identity authentication techniques are usually based either on secret content (text passwords, ID numbers) or on customized authentication tokens (identification cards, student cards). However, the former can be copied or stolen by technical means, and the latter is easily lost or damaged. It is therefore necessary to introduce biometric information and the learning of user behavioral habits into identity authentication systems. On the other hand, existing biometric modalities such as facial recognition, fingerprint recognition and voiceprint recognition can be captured by malicious users through photographing, fingerprint copying, audio recording and the like, which brings potential security risks to the system.
Disclosure of Invention
The technical problem addressed by the invention is to provide an identity authentication method based on ultrasonic lip language identification that closes the security loopholes of existing face recognition technology and achieves efficient authentication.
In order to solve the technical problem, the invention provides an identity authentication method based on ultrasonic lip language identification, which comprises the following steps:
(1) the signal transmitting source transmits an ultrasonic signal, and the signal receiving source receives a reflected signal from the mouth;
(2) performing lip language feature extraction on the collected reflection signals, wherein the extracted lip language features comprise a frequency spectrum feature corresponding to a section of lip language event and a dynamic lip contour feature of a user;
(3) identifying and authenticating the user according to the lip language features extracted in step (2).
Preferably, in the step (2), the method for extracting the dynamic lip contour features of the user includes the following steps:
(1) the mobile terminal generates an ultrasonic signal so that the ultrasonic signal covers the face of the user;
(2) for a lip language event, denoise and filter the ultrasonic signal reflected by the user's face to obtain a mouth reflection signal containing only the user's mouth movement; a lip language event refers to a sentence defined by the user as the password;
(3) use the changes in signal frequency and phase caused by the user's mouth movement as the identifying fingerprint information.
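The frequency and phase changes in step (3) might be extracted as sketched below, assuming coherent IQ demodulation of a single ultrasonic tone; the sample rate (48 kHz), carrier (20 kHz) and moving-average filter are illustrative choices, not values from the patent:

```python
import numpy as np

def doppler_features(rx, fs=48_000, fc=20_000, lp_win=64):
    """Recover the frequency/phase changes that mouth movement imprints on a
    reflected ultrasonic tone. fs, fc and lp_win are illustrative values,
    not specified by the patent."""
    t = np.arange(len(rx)) / fs
    # Coherent IQ demodulation: shift the fc carrier down to baseband.
    baseband = rx * np.exp(-2j * np.pi * fc * t)
    # Crude low-pass (moving average, applied twice) removes the 2*fc image.
    kernel = np.ones(lp_win) / lp_win
    for _ in range(2):
        baseband = np.convolve(baseband, kernel, mode="same")
    phase = np.unwrap(np.angle(baseband))            # radians
    freq_dev = np.diff(phase) * fs / (2 * np.pi)     # Hz offset from fc
    return phase, freq_dev

# Demo: a reflector moving at constant speed shifts the echo by +15 Hz.
fs, fc = 48_000, 20_000
t = np.arange(fs) / fs
rx = np.cos(2 * np.pi * (fc + 15) * t)
_, freq_dev = doppler_features(rx, fs, fc)
core = freq_dev[len(freq_dev) // 4 : -(len(freq_dev) // 4)]
estimate = float(np.median(core))                    # close to 15 Hz
```

On the synthetic echo above, `estimate` lands close to the 15 Hz Doppler shift; real mouth-reflection signals would of course also need the denoising and filtering described in step (2).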
Preferably, in step (2), the method for extracting the specific lip language event expressed by the user includes the following steps:
(1) segment the reflection signal containing only mouth movement, extract spectral feature parameters from each segment, and match each segment to a corresponding voice event;
(2) jointly learn over the voice events of the segments and map them to the corresponding lip language event.
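The segmentation and voice-event matching step can be sketched under simple assumptions (short-time-energy segmentation, unit-norm magnitude spectra, cosine-similarity matching); the patent does not specify these particulars:

```python
import numpy as np

def segment_by_energy(sig, frame=256, thresh_ratio=0.2):
    """Split the mouth reflection signal into active segments by short-time
    energy. Frame size and threshold ratio are illustrative assumptions."""
    n = len(sig) // frame
    energy = (sig[: n * frame].reshape(n, frame) ** 2).sum(axis=1)
    active = energy > thresh_ratio * energy.max()
    segments, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i                                    # segment opens
        elif not is_active and start is not None:
            segments.append((start * frame, i * frame))  # segment closes
            start = None
    if start is not None:
        segments.append((start * frame, n * frame))
    return segments

def spectral_feature(seg, n_fft=512):
    """Unit-norm magnitude spectrum as the segment's spectral feature."""
    spec = np.abs(np.fft.rfft(seg, n_fft))
    return spec / (np.linalg.norm(spec) + 1e-12)

def match_voice_event(feature, templates):
    """Nearest enrolled voice event by cosine similarity."""
    return max(templates, key=lambda name: float(feature @ templates[name]))
```

With two tone bursts separated by silence, `segment_by_energy` returns two segments and each segment matches its own enrolled template; the joint-learning step that maps the matched voice events to a lip language event is left abstract here.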
Preferably, in step (3), the identification and authentication of the user includes the following steps:
(1) recording user password information: the mobile terminal records the ultrasonic reflection signal influenced by a passage of the user's lip language event and uses it as the identification password;
(2) matching spectral features: when user identity authentication is performed, the user repeats the same lip language passage, and authentication proceeds through feature extraction and matching between the current reflected signal and the recorded signal; the spectral features of the signal reflected by the current user's mouth movement are extracted and compared with the spectral features of the lip language event in the mobile terminal's key, and if they do not match, the user is illegitimate;
(3) matching lip contour features: different users have different lip contours, so once the spectral features of the lip language event match, biometric authentication of the user must be further performed; the mouth reflection signals are imaged to construct the current user's dynamic lip contour features, which are matched against the lip contour features of all users associated with that lip language event in the user database; if the match succeeds, authentication passes, otherwise the user is illegitimate.
Preferably, the mobile terminal comprises an information acquisition module and an unlocking control module; the information acquisition module comprises a signal transmitting submodule and a signal receiving submodule; the signal transmitting submodule is a loudspeaker of the mobile terminal and sends ultrasonic signals; the signal receiving submodule is a microphone of the mobile terminal and receives a reflected signal from the face of a user; the unlocking control module comprises an identification sub-module, a matching sub-module and an authentication sub-module; the identification submodule extracts real-time lip language features of the reflected signals acquired by the signal receiving module and generates an access key of the mobile terminal; the matching submodule is used for matching the lip language characteristics of the current access user with the information of the user database and judging whether the lip language characteristics are matched with the preset lip language secret key of the mobile terminal; and the authentication sub-module performs identity authentication according to whether the matching degree exceeds a threshold value, if so, the identity authentication of the mobile terminal is passed, and if not, the mobile terminal keeps a locking state and judges that an illegal user breaks into the mobile terminal.
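The identification, matching and authentication sub-modules of the unlocking control module described above might look like the following minimal sketch, assuming a magnitude-spectrum feature, cosine-similarity matching, and a 0.9 threshold (all illustrative; the patent fixes none of these):

```python
import numpy as np

class UnlockController:
    """Sketch of the unlocking control module: identification, matching and
    authentication sub-modules collapsed into one class for illustration."""

    def __init__(self, enrolled_reflection, threshold=0.9):
        # Enrollment: store the lip language key extracted from a recorded
        # reflection signal (the patent's "identification password").
        self.key = self._identify(enrolled_reflection)
        self.threshold = threshold

    @staticmethod
    def _identify(reflected):
        """Identification sub-module: raw reflection -> unit-norm spectrum."""
        feat = np.abs(np.fft.rfft(np.asarray(reflected, dtype=float)))
        return feat / (np.linalg.norm(feat) + 1e-12)

    def _match(self, feature):
        """Matching sub-module: cosine similarity against the stored key."""
        return float(feature @ self.key)

    def authenticate(self, reflected):
        """Authentication sub-module: unlock only if the matching degree
        exceeds the threshold; otherwise the terminal stays locked."""
        return self._match(self._identify(reflected)) >= self.threshold
```

Enrolling one synthetic reflection and replaying it authenticates; a signal with a different spectral signature falls below the threshold and the terminal stays locked.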
The beneficial effects of the invention are: the characteristic parameters for identity authentication are obtained through ultrasonic sensing without any additional customized hardware, so the deployment cost is low and the application scenarios of ultrasonic technology on mobile terminals are expanded; by exploiting the differences in users' behavioral habits when expressing the same lip language event, combined with biometric features, the method closes security loopholes of existing technologies such as facial recognition and fingerprint recognition and achieves efficient identity authentication.
Drawings
Fig. 1 is an example of an application scenario of the present invention.
Fig. 2 is a functional block diagram of the mobile terminal of the present invention.
FIG. 3 is a flow chart of the method for recognizing lip language features according to the present invention.
Fig. 4 is a flowchart illustrating an identity authentication method according to the present invention.
Detailed Description
Fig. 1 shows an example application scenario of the present invention. After the mobile terminal receives an access request from a user, the key information transmitted with the request is forwarded to the user database. The database identifies and matches the key information of all current users; the user with the highest matching degree passes authentication and is the legitimate user of the mobile terminal. All other users are illegitimate, and the mobile terminal remains locked.
Fig. 2 shows the overall structure of the mobile terminal according to the present invention. The invention provides an identity authentication technique based on ultrasonic lip language identification. By function, the system is divided into an information acquisition module and an unlocking control module. The information acquisition module comprises a signal transmitting submodule and a signal receiving submodule: the transmitting submodule is the loudspeaker of the mobile terminal and emits the ultrasonic signal; the receiving submodule is the microphone of the mobile terminal and receives the signal reflected from the user's face. The unlocking control module comprises an identification submodule, a matching submodule and an authentication submodule. The identification submodule extracts real-time lip language features from the reflected signals acquired by the receiving submodule and generates the access key of the mobile terminal. The matching submodule matches the lip language features of the current user against the user database and judges whether they match the preset lip language key of the mobile terminal. The authentication submodule performs identity authentication according to whether the matching degree exceeds a threshold: if so, authentication passes; if not, the mobile terminal remains locked and the access is judged to be an intrusion by an illegitimate user.
As shown in fig. 3, the lip language feature recognition method of the ultrasonic lip language identity authentication technique includes the following steps:
(1) denoise and filter the ultrasonic signal reflected by the user's face to obtain a mouth reflection signal containing only the user's mouth movement.
(2) using the principle of ultrasonic imaging, construct the user's dynamic lip contour features, which serve as the user's biometric features, from the different reflection and scattering properties of different parts of the mouth.
(3) segment the reflection signal containing only mouth movement, extract spectral feature parameters from each segment, and match each segment to a corresponding voice event.
(4) jointly learn over the voice events of the segments and map them to the corresponding lip language content.
As shown in fig. 4, the identity authentication procedure of the mobile terminal under the ultrasonic lip language identity authentication technique includes the following steps:
(1) establishing a user database: the user selects any lip language event as the key for identity authentication of the mobile terminal, and the lip language feature extraction method of fig. 3 is used to train the user's dynamic lip contour feature and the corresponding spectral feature when speaking that event. The resulting training set serves as the key for user identity authentication.
(2) identification and authentication of the user: to identify a user accessing the mobile terminal, after the lip language feature sample set of the current user is obtained, the matching degree between this sample set and the training set in the lip language information database is analyzed. The matching-degree analysis is subdivided into the following two parts:
(a) matching spectral features: extract the spectral features of the signal reflected by the current user's mouth movement and judge whether they match the spectral features of the lip language event in the mobile terminal key; if not, the user is illegitimate, otherwise proceed to the next step.
(b) matching lip contour features: different users have different lip contours, so once the spectral features of the lip language event match, biometric authentication of the user must be further performed. The mouth reflection signals are imaged to obtain the current user's dynamic lip contour features, which are matched against the lip contour features of all users associated with that lip language event in the user database; if the match succeeds, authentication passes, otherwise the user is illegitimate.
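The two-part matching-degree analysis above can be sketched as a spectral gate followed by a per-user contour search; the cosine measure and the 0.9 thresholds are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def two_stage_auth(spec_feat, contour_feat, key_spec, user_contours,
                   spec_thresh=0.9, contour_thresh=0.9):
    """(a) The spectral feature must match the enrolled lip language key;
    (b) the dynamic lip contour feature must then match an enrolled user.
    Returns the matched user name, or None for an illegitimate user."""
    if cosine(spec_feat, key_spec) < spec_thresh:
        return None                      # wrong passphrase: reject at stage (a)
    for user, enrolled in user_contours.items():
        if cosine(contour_feat, enrolled) >= contour_thresh:
            return user                  # passphrase and biometrics both match
    return None                          # right words, wrong lips: reject at (b)
```

The early return at stage (a) mirrors the flow of fig. 4: an intruder who does not know the passphrase is rejected before any biometric comparison is attempted.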
The invention uses detection of the user's mouth movements as the user's fingerprint. Because no voice needs to be produced, the password cannot be captured by audio recording equipment, which gives good security. A lip language event is a word or sentence preset by the user, and other users cannot obtain it by photographing or recording, so the method can be widely applied to fields such as mobile phone unlocking and payment password authentication.
While the invention has been shown and described with respect to the preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

Claims (3)

1. An identity authentication method based on ultrasonic lip language identification is characterized by comprising the following steps:
(1) the signal transmitting source transmits an ultrasonic signal, and the signal receiving source receives a reflected signal from the mouth;
(2) performing lip language feature extraction on the collected reflection signals, wherein the extracted lip language features comprise a frequency spectrum feature corresponding to a section of lip language event and a dynamic lip contour feature of a user; the method for extracting the dynamic lip contour features of the user comprises the following steps:
(21) the mobile terminal generates an ultrasonic signal so that the ultrasonic signal covers the face of the user;
(22) for a lip language event, denoise and filter the ultrasonic signal reflected by the user's face to obtain a mouth reflection signal containing only the user's mouth movement; a lip language event refers to a sentence defined by the user as the password;
(23) use the changes in signal frequency and phase caused by the user's mouth movement as the identifying fingerprint information;
(3) identifying and authenticating the user according to the lip language features extracted in the step (2); the identification and authentication of the user comprises the following steps:
(31) recording user password information: the mobile terminal records the ultrasonic reflection signal influenced by a passage of the user's lip language event and uses it as the identification password;
(32) matching spectral features: when user identity authentication is performed, the user repeats the same lip language passage, and authentication proceeds through feature extraction and matching between the current reflected signal and the recorded signal; the spectral features of the signal reflected by the current user's mouth movement are extracted and compared with the spectral features of the lip language event in the mobile terminal's key, and if they do not match, the user is illegitimate;
(33) matching lip contour features: different users have different lip contours, so once the spectral features of the lip language event match, biometric authentication of the user must be further performed; the mouth reflection signals are imaged to construct the current user's dynamic lip contour features, which are matched against the lip contour features of all users associated with that lip language event in the user database; if the match succeeds, authentication passes, otherwise the user is illegitimate.
2. The identity authentication method based on ultrasonic lip language identification according to claim 1, wherein in the step (2), the method for extracting a specific lip language event expressed by the user comprises the following steps:
(1) segment the reflection signal containing only mouth movement, extract spectral feature parameters from each segment, and match each segment to a corresponding voice event;
(2) jointly learn over the voice events of the segments and map them to the corresponding lip language event.
3. The identity authentication method based on ultrasonic lip language identification according to claim 1, wherein the mobile terminal comprises an information acquisition module and an unlocking control module; the information acquisition module comprises a signal transmitting submodule and a signal receiving submodule; the signal transmitting submodule is a loudspeaker of the mobile terminal and sends ultrasonic signals; the signal receiving submodule is a microphone of the mobile terminal and receives a reflected signal from the face of a user; the unlocking control module comprises an identification sub-module, a matching sub-module and an authentication sub-module; the identification submodule extracts real-time lip language features of the reflected signals acquired by the signal receiving module and generates an access key of the mobile terminal; the matching submodule is used for matching the lip language characteristics of the current access user with the information of the user database and judging whether the lip language characteristics are matched with the preset lip language secret key of the mobile terminal; and the authentication sub-module performs identity authentication according to whether the matching degree exceeds a threshold value, if so, the identity authentication of the mobile terminal is passed, and if not, the mobile terminal keeps a locking state and judges that an illegal user breaks into the mobile terminal.
CN201710006640.7A 2017-01-05 2017-01-05 Identity authentication method based on ultrasonic lip language identification Active CN106778179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710006640.7A CN106778179B (en) 2017-01-05 2017-01-05 Identity authentication method based on ultrasonic lip language identification


Publications (2)

Publication Number Publication Date
CN106778179A CN106778179A (en) 2017-05-31
CN106778179B (en) 2021-07-09

Family

ID=58949500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710006640.7A Active CN106778179B (en) 2017-01-05 2017-01-05 Identity authentication method based on ultrasonic lip language identification

Country Status (1)

Country Link
CN (1) CN106778179B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107358085A (en) * 2017-07-28 2017-11-17 惠州Tcl移动通信有限公司 A kind of unlocking terminal equipment method, storage medium and terminal device
NO347923B1 (en) * 2017-09-15 2024-05-13 Elliptic Laboratories Asa User Authentication Control
CN107784215B (en) * 2017-10-13 2018-10-26 上海交通大学 Audio unit based on intelligent terminal carries out the user authen method and system of labiomaney
CN111507216A (en) 2017-11-03 2020-08-07 阿里巴巴集团控股有限公司 Method and device for identifying illegal behaviors in unattended scene
CN108959866B (en) * 2018-04-24 2020-10-23 西北大学 Continuous identity authentication method based on high-frequency sound wave frequency
CN109446774B (en) * 2018-09-30 2021-11-30 山东知味行网络科技有限公司 Identity recognition application method and system
CN109711350B (en) * 2018-12-28 2023-04-07 武汉大学 Identity authentication method based on lip movement and voice fusion
CN110600058A (en) * 2019-09-11 2019-12-20 深圳市万睿智能科技有限公司 Method and device for awakening voice assistant based on ultrasonic waves, computer equipment and storage medium
CN111091831B (en) * 2020-01-08 2023-04-07 上海交通大学 Silent lip language recognition method and system
CN111552941B (en) * 2020-04-22 2022-04-26 歌尔科技有限公司 Terminal unlocking method and device, electronic equipment and readable storage medium
WO2022028207A1 (en) * 2020-08-03 2022-02-10 华为技术有限公司 Speech recognition method, apparatus, device and system, and computer readable storage medium
CN113011245B (en) * 2021-01-28 2023-12-12 南京大学 Lip language identification system and method based on ultrasonic sensing and knowledge distillation
CN114676735A (en) * 2022-04-21 2022-06-28 歌尔股份有限公司 Lip language identification method and device and computer readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102200829A (en) * 2010-03-24 2011-09-28 南京大学 Body movement recognition device used for virtual reality input
CN103226386A (en) * 2013-03-13 2013-07-31 广东欧珀移动通信有限公司 Gesture identification method and system based on mobile terminal
CN103885590A (en) * 2014-03-10 2014-06-25 可牛网络技术(北京)有限公司 Method and user equipment for obtaining user instructions
CN103970260A (en) * 2013-01-31 2014-08-06 华为技术有限公司 Non-contact gesture control method and electronic terminal equipment
CN104200146A (en) * 2014-08-29 2014-12-10 华侨大学 Identity verifying method with video human face and digital lip movement password combined
CN104657650A (en) * 2015-01-06 2015-05-27 三星电子(中国)研发中心 Method and device for data input or authentication
CN105787428A (en) * 2016-01-08 2016-07-20 上海交通大学 Method for lip feature-based identity authentication based on sparse coding
WO2016139655A1 (en) * 2015-03-01 2016-09-09 I Am Real Ltd. Method and system for preventing uploading of faked photos


Also Published As

Publication number Publication date
CN106778179A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106778179B (en) Identity authentication method based on ultrasonic lip language identification
CN104361276B (en) A kind of multi-modal biological characteristic identity identifying method and system
Lu et al. LipPass: Lip reading-based user authentication on smartphones leveraging acoustic signals
RU2738325C2 (en) Method and device for authenticating an individual
Frischholz et al. BioID: a multimodal biometric identification system
CN104834849B (en) Dual-factor identity authentication method and system based on Application on Voiceprint Recognition and recognition of face
CN109450850B (en) Identity authentication method, identity authentication device, computer equipment and storage medium
Kim et al. Person authentication using face, teeth and voice modalities for mobile device security
US20190013026A1 (en) System and method for efficient liveness detection
US6810480B1 (en) Verification of identity and continued presence of computer users
Bigun et al. Multimodal biometric authentication using quality signals in mobile communications
CN106709402A (en) Living person identity authentication method based on voice pattern and image features
CN106599866A (en) Multidimensional user identity identification method
CN104376250A (en) Real person living body identity verification method based on sound-type image feature
CN103678977A (en) Method and electronic device for protecting information security
CN109920435B (en) Voiceprint recognition method and voiceprint recognition device
CN107784215B (en) Audio unit based on intelligent terminal carries out the user authen method and system of labiomaney
KR20190085731A (en) Method for user authentication
CN112491844A (en) Voiceprint and face recognition verification system and method based on trusted execution environment
CN112132996A (en) Door lock control method, mobile terminal, door control terminal and storage medium
US20120330663A1 (en) Identity authentication system and method
Alegre et al. Evasion and obfuscation in automatic speaker verification
CN204576520U (en) Based on the Dual-factor identity authentication device of Application on Voiceprint Recognition and recognition of face
KR20190119521A (en) Electronic apparatus and operation method thereof
Vajaria et al. Evaluation and analysis of a face and voice outdoor multi-biometric system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant