CN108959880B - Authentication method, authentication apparatus, and computer-readable storage medium - Google Patents


Info

Publication number
CN108959880B
CN108959880B (application CN201810575116.6A)
Authority
CN
China
Prior art keywords
face
infrared
authorized user
template
infrared camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810575116.6A
Other languages
Chinese (zh)
Other versions
CN108959880A (en)
Inventor
张学勇 (Zhang Xueyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810575116.6A (granted as CN108959880B)
Publication of CN108959880A
Priority to PCT/CN2019/083370 (published as WO2019233199A1)
Priority to US16/424,426 (published as US10942999B2)
Priority to EP19178415.6A (granted as EP3579131B1)
Application granted
Publication of CN108959880B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 Detection; Localisation; Normalisation
    • G06V 40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

The verification method disclosed by the invention includes: determining whether a movable module is triggered, the movable module being housed in a housing, able to extend out of the housing, and including a bracket, an infrared camera, and a structured light projector; if the movable module is triggered, moving the infrared camera and the structured light projector out of the housing on the bracket and initializing the infrared camera and the structured light projector; acquiring an infrared image through the infrared camera; determining whether a face is present in the infrared image; when a face is present in the infrared image, determining whether the face matches a face template of an authorized user while acquiring a laser pattern through the structured light projector and the infrared camera, obtaining a depth image from the laser pattern, and determining whether the depth image matches a depth template of the authorized user; and passing verification when the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user. The invention also discloses a verification apparatus and a computer-readable storage medium.

Description

Authentication method, authentication apparatus, and computer-readable storage medium
Technical Field
The present invention relates to the field of consumer electronics technologies, and in particular, to a verification method, a verification apparatus, and a computer-readable storage medium.
Background
Existing camera modules used for face unlocking are typically mounted on the front housing of a mobile phone, which prevents the display on the front housing from occupying the full face of the phone. To allow a true full screen, the camera module can be made to selectively hide inside the phone or extend out of it. With such a retractable module, however, face unlocking takes longer, which degrades the user experience.
Disclosure of Invention
The embodiment of the invention provides a verification method, a verification device and a computer readable storage medium.
The verification method of embodiments of the invention includes:
determining whether a movable module is triggered, the movable module being housed in a housing and able to extend out of the housing, and including a bracket, an infrared camera disposed on the bracket, and a structured light projector disposed on the bracket;
if the movable module is triggered, moving the infrared camera and the structured light projector toward the outside of the housing on the bracket so that they extend out of the housing, and initializing the infrared camera and the structured light projector;
acquiring an infrared image through the infrared camera;
determining whether a face is present in the infrared image;
when a face is present in the infrared image, determining whether the face matches a face template of an authorized user while acquiring a laser pattern through the structured light projector and the infrared camera, obtaining a depth image from the laser pattern, and determining whether the depth image matches a depth template of the authorized user; and
passing verification if the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user.
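The claimed flow (steps 01 through 06) can be sketched in Python. Every class and function below is a hypothetical stand-in invented for illustration, not an interface disclosed by the patent; the thread pool models the synchronous start of face matching and laser-pattern acquisition.

```python
from concurrent.futures import ThreadPoolExecutor

class MovableModule:
    """Dummy pop-up module: bracket, IR camera, structured light projector."""
    def __init__(self, triggered=True):
        self._triggered = triggered
        self.extended = False
    def is_triggered(self):                  # step 01
        return self._triggered
    def extend_and_initialize(self):         # step 02
        self.extended = True
    def capture_ir(self):                    # step 03
        return "ir-image-with-face"
    def capture_laser_pattern(self):         # part of step 05
        return "laser-pattern"

def contains_face(ir):      return "face" in ir       # step 04 (toy check)
def match_face(ir, tmpl):   return ir == tmpl         # step 051 (toy check)
def depth_from_pattern(p):  return "depth:" + p       # step 053 (toy mapping)
def match_depth(d, tmpl):   return d == tmpl          # step 054 (toy check)

def verify(module, face_template, depth_template):
    if not module.is_triggered():
        return False
    module.extend_and_initialize()
    ir = module.capture_ir()
    if not contains_face(ir):
        return False
    # Step 05: the 2D face match and the laser-pattern capture start at
    # the same moment, in parallel, once a face is known to be present.
    with ThreadPoolExecutor(max_workers=2) as pool:
        face_job = pool.submit(match_face, ir, face_template)
        pattern_job = pool.submit(module.capture_laser_pattern)
        depth = depth_from_pattern(pattern_job.result())
        depth_ok = match_depth(depth, depth_template)
        return face_job.result() and depth_ok          # step 06
```

Under this sketch, verification passes only when both the planar match and the depth match succeed, mirroring the conjunction in step 06.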
The verification apparatus of embodiments of the invention includes:
a movable module housed in a housing and able to extend out of the housing, the movable module including a bracket, an infrared camera disposed on the bracket, and a structured light projector disposed on the bracket; when the movable module is triggered, the bracket carries the infrared camera and the structured light projector toward the outside of the housing so that they extend out of the housing, and the infrared camera and the structured light projector are initialized; the infrared camera is configured to acquire an infrared image; the infrared camera and the structured light projector are configured to acquire a laser pattern; and
a processor configured to:
determine whether the movable module is triggered;
determine whether a face is present in the infrared image;
when a face is present in the infrared image, determine whether the face matches a face template of an authorized user;
when a face is present in the infrared image, obtain a depth image from the laser pattern and determine whether the depth image matches a depth template of the authorized user; and
determine that verification passes when the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user.
The verification apparatus of embodiments of the invention includes:
a movable module housed in a housing and able to extend out of the housing, the movable module including a bracket, an infrared camera disposed on the bracket, and a structured light projector disposed on the bracket; when the movable module is triggered, the bracket carries the infrared camera and the structured light projector toward the outside of the housing so that they extend out of the housing, and the infrared camera and the structured light projector are initialized; the infrared camera is configured to acquire an infrared image; the infrared camera and the structured light projector are configured to acquire a laser pattern;
a first determination module configured to determine whether the movable module is triggered;
a second determination module configured to determine whether a face is present in the infrared image;
a third determination module configured to determine, when a face is present in the infrared image, whether the face matches a face template of an authorized user;
an acquisition module configured to obtain a depth image from the laser pattern when a face is present in the infrared image;
a fourth determination module configured to determine, when a face is present in the infrared image, whether the depth image matches a depth template of an authorized user; and
a verification module configured to determine that verification passes when the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user.
The computer-readable storage medium of the embodiments of the present invention stores one or more computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the authentication method of the above-described embodiments.
In the verification method, verification apparatus, and computer-readable storage medium of embodiments of the invention, after a face is determined to be present in the infrared image, determining whether the face matches the face template of an authorized user and acquiring the laser pattern through the structured light projector and the infrared camera are executed synchronously, which reduces the execution time of the verification method and improves the verification speed.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a verification method according to certain embodiments of the present invention.
FIGS. 2 to 4 are schematic structural views of a verification apparatus according to embodiments of the present invention.
FIGS. 5 to 9 are schematic flow charts of verification methods according to certain embodiments of the present invention.
FIGS. 10 and 11 are schematic structural views of verification apparatuses according to certain embodiments of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be fixed, removable, or integral; it may be mechanical or electrical, or the elements may communicate with each other; and it may be direct or through an intermediate medium, including an internal communication between two elements or an interaction between two elements. Those skilled in the art can understand the specific meanings of the above terms in the present invention according to the specific situation.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure, the components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the present invention. Furthermore, reference numerals and/or letters may be repeated in the various examples; such repetition is for simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize that other processes and/or materials may be applied.
Referring to fig. 1 to 3, a verification method according to an embodiment of the present invention includes:
01, determining whether the movable module 10 is triggered, the movable module 10 being housed in the housing 101 and able to extend out of the housing 101, and including a bracket 11, an infrared camera 12 disposed on the bracket 11, and a structured light projector 13 disposed on the bracket 11;
02, if the movable module 10 is triggered, moving the infrared camera 12 and the structured light projector 13 toward the outside of the housing 101 on the bracket 11 so that they extend out of the housing 101, and initializing the infrared camera 12 and the structured light projector 13;
03, acquiring an infrared image through the infrared camera 12;
04, determining whether a face is present in the infrared image;
05, when a face is present in the infrared image, determining whether the face matches a face template of an authorized user while acquiring a laser pattern through the structured light projector 13 and the infrared camera 12, obtaining a depth image from the laser pattern, and determining whether the depth image matches a depth template of the authorized user; and
06, if the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user, determining that verification passes.
Step 02 includes:
021, moving the infrared camera 12 and the structured light projector 13 toward the outside of the housing 101 on the bracket 11 so that they extend out of the housing 101; and
022, initializing the infrared camera 12 and the structured light projector 13.
Step 05 includes:
051, when a face is present in the infrared image, determining whether the face matches a face template of an authorized user;
052, when a face is present in the infrared image, acquiring a laser pattern through the structured light projector 13 and the infrared camera 12;
053, obtaining a depth image from the laser pattern; and
054, determining whether the depth image matches a depth template of the authorized user.
Step 051 and step 052 are executed synchronously: they start at the same time, specifically at some moment after it has been determined that a face is present in the infrared image.
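The synchronous start of steps 051 and 052 can be made concrete with a small threading sketch. The task bodies are placeholders, and the barrier is one possible way to release both tasks at the same instant; it is an illustrative assumption, not the patent's mechanism.

```python
import threading

results = {}
start = threading.Barrier(2)   # both threads pass this point together

def step_051_match_face():
    start.wait()               # released at the same instant as step 052
    results["face_ok"] = True          # placeholder for template matching

def step_052_acquire_pattern():
    start.wait()               # released at the same instant as step 051
    results["pattern"] = "laser-dots"  # placeholder for projector + camera

t1 = threading.Thread(target=step_051_match_face)
t2 = threading.Thread(target=step_052_acquire_pattern)
t1.start(); t2.start()
t1.join(); t2.join()
```

Because neither thread proceeds past the barrier until both have reached it, the two steps share a common start time, which is what "synchronous execution" means here.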
Referring to FIGS. 2 and 3, the verification apparatus 100 of embodiments of the invention includes a housing 101, a movable module 10, and a processor 40. The movable module 10 includes a bracket 11, an infrared camera 12, and a structured light projector 13. The processor 40 is connected to the infrared camera 12 and the structured light projector 13. The movable module 10 is housed in the housing 101 and can extend out of the housing 101. Specifically, the housing 101 includes a head portion 102 and a tail portion 103 opposite to each other, a front surface 104 connecting the head portion 102 and the tail portion 103, and a back surface (not shown) opposite to the front surface 104. The front surface 104 carries a display screen 105, which is a full screen, and the movable module 10 is disposed at the end of the housing 101 near the head portion 102. The verification apparatus 100 may be any of a mobile phone, a tablet computer, a smart bracelet, or a smart helmet; the embodiments of the present invention are described using a mobile phone as an example.
The processor 40 is configured to: determine whether the movable module 10 is triggered; determine, after the movable module 10 is triggered and the infrared camera 12 has acquired an infrared image, whether a face is present in the infrared image; when a face is present, determine whether the face matches a face template of an authorized user while the structured light projector 13 and the infrared camera 12 acquire a laser pattern; obtain a depth image from the laser pattern; determine whether the depth image matches a depth template of the authorized user; and determine that verification passes when the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user. That is, the processor 40 is configured to perform steps 01, 03, 04, 051, 053, 054, and 06.
When the movable module 10 is triggered, the bracket 11 carries the infrared camera 12 and the structured light projector 13 out of the housing 101 so that they protrude from it, and the infrared camera 12 and the structured light projector 13 are initialized. That is, the bracket 11 performs step 021; the infrared camera 12 and the structured light projector 13 together perform step 022; and the bracket 11, the infrared camera 12, and the structured light projector 13 together perform step 02. Step 021 may be executed before step 022; alternatively, steps 021 and 022 may be executed simultaneously. Compared with executing step 021 first and then step 022, simultaneous execution advances the preparation (initialization) of the infrared camera 12 and the structured light projector 13 and thereby reduces the overall execution time of the verification method.
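The time saved by overlapping steps 021 and 022 can be shown with back-of-the-envelope arithmetic: running them in parallel bounds the preparation time by the slower task rather than by the sum. The durations below are invented purely for illustration.

```python
# Hypothetical durations, in milliseconds (not values from the patent).
EXTEND_MS = 300   # step 021: bracket carries the cameras out of the housing
INIT_MS = 200     # step 022: camera and projector driver start-up

sequential_ms = EXTEND_MS + INIT_MS    # step 021 first, then step 022
parallel_ms = max(EXTEND_MS, INIT_MS)  # steps 021 and 022 together
saved_ms = sequential_ms - parallel_ms # time advanced by overlapping
```

With these illustrative numbers the overlap hides the entire initialization behind the mechanical extension, saving 200 ms of preparation time.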
The infrared camera 12 can be used to acquire infrared images, and in particular, the infrared camera 12 is used to acquire infrared images after the movable module 10 is triggered. That is, the infrared camera 12 is used to perform step 03.
The infrared camera 12 and the structured light projector 13 can be used together to acquire the laser pattern; specifically, they do so when a face is present in the infrared image. That is, the infrared camera 12 and the structured light projector 13 together execute step 052.
Operations that trigger the movable module 10 include: moving and/or rotating the verification apparatus 100 in a predetermined manner (for example, the user turns the head portion 102 toward the front surface 104); lighting up the display screen 105 (for example, the user presses a key of the verification apparatus 100, or double-taps the display screen 105 while it is off); starting a face-detection application in the verification apparatus 100 (for example, the user opens face-recognition software to determine whether he or she is authorized to use the verification apparatus 100); or clicking a button/key that starts face detection within a running application (for example, in payment software, the user clicks the button for paying by face). Each of these operations also starts the infrared camera 12 and the structured light projector 13, and the initialization operation is performed after they are started.
When the user needs to perform face recognition using the infrared camera 12 and the structured light projector 13, performing any of the above operations (moving and/or rotating the verification apparatus 100 in a predetermined manner, lighting up the display screen 105, starting a face-detection application, or clicking a button/key that starts face detection in a running application) generates a trigger signal for the movable module 10, and the processor 40 determines whether the movable module 10 is triggered according to whether this trigger signal has been received.
The verification apparatus 100 further includes a driving assembly 31 disposed in the housing 101 and connected to the bracket 11; the driving assembly 31 drives the bracket 11 to move. The driving assembly 31 includes a driving motor, and the processor 40 is connected to the driving assembly 31 and controls it to drive the movable module 10 when the movable module 10 is triggered.
Initializing the infrared camera 12 and the structured light projector 13 includes starting the driver of the infrared camera 12 so that it is ready to shoot, and starting the driver of the structured light projector 13 so that it is ready to project infrared laser light.
The infrared camera 12 may acquire the infrared image after the bracket 11 has moved into place and stopped; because the bracket 11 is then stable, the infrared image is sharp, which aids the subsequent face-determination steps and reduces the chance of having to shoot multiple frames. Alternatively, the infrared camera 12 may acquire the infrared image once it is fully exposed outside the housing 101 but before the bracket 11 has stopped moving, for example when the moving speed of the bracket 11 has fallen below one third of its maximum speed. This advances the moment of image capture, further reduces the overall execution time of the verification method, and improves the user experience.
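The early-capture condition described above can be expressed as a small predicate: shoot once the camera is clear of the housing and the bracket has slowed below one third of its top speed (a stopped bracket trivially satisfies the speed test). The function and parameter names are assumptions made for illustration.

```python
def may_capture(camera_exposed: bool, bracket_speed: float,
                max_speed: float) -> bool:
    """Return True when the IR camera may shoot.

    camera_exposed: whether the camera is fully outside the housing.
    bracket_speed:  current bracket speed (0.0 once it has stopped).
    max_speed:      the bracket's maximum speed; the one-third threshold
                    comes from the example in the description above.
    """
    return camera_exposed and bracket_speed < max_speed / 3.0
```

This keeps the conservative behavior (capture after stopping) as a special case while permitting the earlier capture that shortens the verification flow.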
Determining whether a face is present in the infrared image may include: extracting features of the infrared image with a particular algorithm, matching the extracted features against known face feature vectors, and deciding from the matching result whether the infrared image is a face image. The features of the infrared image may be extracted with an Active Shape Model (ASM), the Local Binary Pattern (LBP) algorithm, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), or the like.
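As a simplified stand-in for this feature-matching step (a real system would use ASM, LBP, PCA, or LDA features as noted above), an extracted feature vector can be compared to a known face feature vector by cosine similarity. The 0.9 threshold and the function names are illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def looks_like_face(features, face_vector, threshold=0.9):
    """Decide face presence from similarity to a known face vector."""
    return cosine(features, face_vector) >= threshold
```

Identical vectors score 1.0 and pass; orthogonal vectors score 0.0 and are rejected, which is the shape of the decision the description sketches.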
The authorized user may be the owner of the verification apparatus 100, or a relative or friend of the owner. The face template of the authorized user may be pre-stored inside or outside the verification apparatus 100; it may be an infrared image of the authorized user's face, which may be a planar image.
When a face is present in the infrared image, the structured light projector 13 projects laser light toward a target object (outside the verification apparatus 100), and the infrared camera 12 acquires the laser pattern on the target object. The depth template of the authorized user may be a face depth template pre-stored inside or outside the verification apparatus 100; it may be a depth image of the authorized user's face, obtained by structured-light detection.
When the processor 40 determines that the infrared image matches the face template of the authorized user, the currently captured infrared image and the pre-stored face template can be taken to originate from the same user. However, because the face template and the infrared image are both planar images, this check is easy to forge, for example with a two-dimensional photograph. Having the processor 40 further determine whether the depth image of the target object matches the depth template of the authorized user therefore better verifies that the current user is the user whose depth template is pre-stored. When the processor 40 determines that the depth image matches the depth template of the authorized user, verification passes, and the current user obtains the corresponding operation rights of the verification apparatus 100, such as screen unlocking and payment.
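The two-stage decision described above reduces to a conjunction: a flat photograph may pass the planar infrared check but produces no face-shaped depth map, so both checks must succeed. A minimal sketch, with illustrative boolean inputs:

```python
def grant_access(ir_matches_template: bool,
                 depth_matches_template: bool) -> bool:
    """Pass verification only when both the planar (2D) infrared match
    and the structured-light depth match succeed; a photo spoof fails
    the second check even if it fools the first."""
    return ir_matches_template and depth_matches_template
```

The second operand is what defeats the two-dimensional-photo attack the description warns about.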
In the verification apparatus 100 and verification method of embodiments of the invention, after a face is determined to be present in the infrared image, determining whether the face matches the face template of an authorized user and acquiring the laser pattern through the structured light projector 13 and the infrared camera 12 are executed synchronously, which reduces the execution time of the verification method and improves the verification speed.
Referring to fig. 1, fig. 2 and fig. 4, a verification apparatus 100 according to an embodiment of the present invention includes a movable module 10, a first determining module 21, a second determining module 22, a third determining module 23, a fourth determining module 24, a verifying module 25 and an obtaining module 27.
The first determining module 21 is configured to determine whether the movable module 10 is triggered, that is, the first determining module 21 may be configured to execute step 01.
The second determining module 22 is configured to determine whether a human face exists in the infrared image after the infrared camera 12 acquires the infrared image, that is, the second determining module 22 may be configured to execute step 04.
The third judging module 23 is configured to judge whether the face is matched with the face template of the authorized user when the face exists in the infrared image, that is, the third judging module 23 may be configured to execute step 051.
The obtaining module 27 is configured to obtain a depth image according to the laser pattern when a human face exists in the infrared image, that is, the obtaining module 27 is configured to perform step 053.
The fourth judging module 24 is configured to judge whether the depth image matches the depth template of the authorized user, that is, the fourth judging module 24 is configured to execute step 054. The infrared camera 12, the structured light projector 13, the third determining module 23, the obtaining module 27 and the fourth determining module 24 are used together to execute step 05.
The verification module 25 is configured to determine that verification passes when the face matches the face template of the authorized user and the depth image matches the depth template of the authorized user. That is, the verification module 25 is used to perform step 06.
The bracket 11 is used to perform step 021; the infrared camera 12 and the structured light projector 13 are used to perform step 022; and the bracket 11, the infrared camera 12, and the structured light projector 13 together perform step 02. The infrared camera 12 performs step 03, and together with the structured light projector 13 performs step 052.
The steps performed by the first determining module 21, the second determining module 22, the third determining module 23, the fourth determining module 24, the verifying module 25 and the obtaining module 27 may be performed by the processor 40.
The first determination module 21 is connected to the driving assembly 31, the infrared camera 12 and the laser projector 13, so that after the first determination module 21 determines that the movable module 10 is triggered, the first determination module 21 can transmit a signal to the driving assembly 31, the infrared camera 12 and the laser projector 13, so that the driving assembly 31, the infrared camera 12 and the laser projector 13 execute step 02.
The second determination module 22 connects the infrared camera 12 and the laser projector 13, so that the infrared image acquired by the infrared camera 12 can be transmitted to the second determination module 22, and the second determination module 22 can transmit a signal to the infrared camera 12 and the laser projector 13 after determining that a human face exists in the infrared image, so that the infrared camera 12 and the laser projector 13 execute step 052.
The obtaining module 27 is connected to the infrared camera 12 and the fourth determining module 24, so that the obtaining module 27 can receive the laser pattern obtained by the infrared camera 12, and the obtaining module 27 can transmit the depth image to the fourth determining module 24 after generating the depth image.
In the verification apparatus 100 and the verification method of the embodiment of the present invention, after it is determined that a face exists in the infrared image, the determination of whether the face matches the face template of the authorized user and the acquisition of the laser pattern through the structured light projector 13 and the infrared camera 12 are performed synchronously, so that the time taken by the verification apparatus 100 to execute the verification method is reduced and the verification speed is improved.
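The synchronous execution described above can be sketched as follows. This is only an illustrative Python model of the control flow, not firmware from the patent; every callable (`match_face`, `acquire_depth_image`, `match_depth`) is a hypothetical stand-in for the corresponding hardware or processor operation.

```python
import threading

def verify(infrared_image, face_template, depth_template,
           match_face, acquire_depth_image, match_depth):
    """Sketch of step 05: run face matching (step 051) and depth
    acquisition/matching (steps 052-054) in parallel, then combine
    the results as in step 06. All callables are hypothetical."""
    results = {}

    def face_branch():                        # step 051
        results["face_ok"] = match_face(infrared_image, face_template)

    def depth_branch():                       # steps 052-054
        depth_image = acquire_depth_image()   # projector 13 + IR camera 12
        results["depth_ok"] = match_depth(depth_image, depth_template)

    t1 = threading.Thread(target=face_branch)
    t2 = threading.Thread(target=depth_branch)
    t1.start(); t2.start()
    t1.join(); t2.join()

    # step 06: verification passes only if both comparisons match
    return results["face_ok"] and results["depth_ok"]
```

Running both branches concurrently is what lets the total latency approach the slower of the two checks rather than their sum.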
Referring to fig. 2 and 3, in some embodiments, the movable module 10 further includes an infrared fill-in light 14, and the infrared fill-in light 14 may be configured to emit infrared light, and the infrared light is reflected by an object and then received by the infrared camera 12. Specifically, when the infrared camera 12 is used for acquiring an infrared image, the infrared fill-in light 14 is used for generating infrared light outwards so as to enhance the intensity of the infrared camera 12 receiving the infrared light reflected by an object, thereby improving the definition of the infrared image.
Referring to fig. 2 and 3, in some embodiments, the movable module 10 further includes at least one of a front camera 15, a receiver 16, a light sensor 17, a proximity sensor 18, a rear camera 19, and a light supplement lamp 191. The front camera 15, the receiver 16, the light sensor 17, or the proximity sensor 18 need not be disposed on the front face 104 of the housing 101, and the display screen 105 may occupy the entire front face 104, in which case the display screen 105 is a full screen. The rear camera 19 need not be disposed on the back surface of the housing 101, so that the back surface of the housing 101 has better integrity and a more beautiful appearance.
The width W1 of the bracket 11 of the present embodiment is equal to the width W2 of the housing 101. The bracket 11 may be an integrated structure that simultaneously fixes the light sensor 17, the infrared fill-in light 14, the infrared camera 12, the proximity sensor 18, the receiver 16, the rear camera 19, the visible light fill-in light 191, the front camera 15 and the laser projector 13. Alternatively, the bracket 11 may include a first sub-bracket for fixing the light sensor 17, the infrared fill-in light 14, the infrared camera 12, the proximity sensor 18, the receiver 16, the front camera 15 and the laser projector 13, and a second sub-bracket for fixing the rear camera 19 and the visible light fill-in light 191, where the first sub-bracket and the second sub-bracket are assembled together; specifically, the first sub-bracket and the second sub-bracket are connected by at least one of screwing, clamping, gluing and welding, or a combination thereof. The bracket 11 is provided with a light-passing hole (not shown) at the end corresponding to the head 102 (the top surface of the bracket 11), and the light sensor 17 is installed at a position corresponding to the light-passing hole so that light outside the verification apparatus 100 (or the bracket 11) can be transmitted to the light sensor 17.
Referring to fig. 5, in some embodiments, the steps of acquiring the laser pattern through the structured light projector 13 and the infrared camera 12 and acquiring the depth image according to the laser pattern (steps 052 and 053) include:
0521: projecting laser light by a laser projector 13;
0522: acquiring a laser pattern modulated by an object through the infrared camera 12; and
0531: the laser pattern is processed to obtain a depth image.
The laser projector 13 may be used to perform step 0521, the infrared camera 12 may be used to perform step 0522, and the processor 40 may be used to perform step 0531. That is, the laser projector 13 is used to project laser light, the infrared camera 12 is used to acquire the laser pattern modulated by the object, and the processor 40 is configured to process the laser pattern to obtain the depth image.
The processor 40 may store calibration information of the laser projected by the laser projector 13, and the processor 40 obtains depth information of different positions of the target object by processing the laser pattern together with the calibration information, and forms the depth image. The laser projected by the laser projector 13 may be infrared light. The modulated laser pattern also differs depending on the material onto which the laser is projected; for example, the modulated laser pattern is different when the laser is projected onto human skin, rubber, wood and other materials. Therefore, material information of the target object is also reflected in the depth image, and only when the material is human skin can the depth image match the depth template and pass the verification.
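The patent does not specify how the depth information is computed from the modulated laser pattern and the calibration information. One common structured-light approach triangulates depth from the disparity between the observed pattern and a calibrated reference pattern; the sketch below shows that generic relation, with all parameter names and values being illustrative assumptions.

```python
def depth_from_disparity(disparity_px, baseline_mm, focal_px):
    """Generic structured-light triangulation: depth is inversely
    proportional to the disparity (in pixels) between the observed
    laser pattern and the calibrated reference pattern.

    baseline_mm: projector-to-camera baseline (illustrative units)
    focal_px:    camera focal length in pixels (illustrative)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Larger disparity -> the surface is closer to the camera.
    return baseline_mm * focal_px / disparity_px
```

Applying this per pixel over the whole laser pattern yields a depth map analogous to the depth image described above.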
Referring to fig. 3, in some embodiments, determining whether a face exists in an infrared image is performed in a Trusted Execution Environment (TEE) 41; and/or
Judging whether the matching of the face in the infrared image and the face template of the authorized user is performed in a trusted execution environment; and/or
Determining whether the depth image matches the depth template of the authorized user is performed in the trusted execution environment.
Step 04, and/or step 051, and/or step 054 are performed in a trusted execution environment. Specifically, the processor 40 is further configured to form the trusted execution environment (TEE) 41 and an untrusted execution environment (REE) 42, where both the code and the memory area in the trusted execution environment 41 are controlled by an access control unit and cannot be accessed by a program in the untrusted execution environment 42. Specifically, the trusted execution environment 41 may receive the image (the infrared image or the depth image) transmitted to it by the infrared camera 12 and output the comparison result, and the image data and programs in the trusted execution environment 41 cannot be accessed by programs in the untrusted execution environment 42.
Specifically, when the step of determining whether the face exists in the infrared image (step 04) is executed in the trusted execution environment 41, the infrared image is transmitted to the trusted execution environment 41 to be processed to determine whether the face exists in the infrared image, and the trusted execution environment 41 outputs the comparison result (including the presence of the face in the infrared image and the absence of the face in the infrared image); when the step of judging whether the face in the infrared image is matched with the face template of the authorized user (step 051) is carried out in the trusted execution environment 41, the infrared image is transmitted to the trusted execution environment 41 to be processed so as to judge whether the face in the infrared image is matched with the face template of the authorized user, and the trusted execution environment 41 outputs the comparison result (including matching between the face and the face template and mismatching between the face and the face template); when the step of determining whether the depth image matches the depth template of the authorized user (step 054) is performed in the trusted execution environment 41, the depth image is transmitted to the trusted execution environment 41 for processing and determining whether the depth image matches the depth template of the authorized user, and the trusted execution environment 41 outputs the comparison result (including matching the depth image with the depth template and mismatching the depth image with the depth template); the comparison result may be output to the untrusted execution environment 42.
Step 04, and/or step 051, and/or step 054 of the present embodiment are executed in the trusted execution environment, so that the risk that the depth image and/or the infrared image of the user is leaked due to reading of the depth image and/or the infrared image by the untrusted execution environment can be reduced, and the security of the authentication apparatus 100 is improved.
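The TEE/REE split described above can be modeled as follows. This is purely an illustrative sketch: real trusted execution environments (e.g. ARM TrustZone) enforce the isolation in hardware, and the class and method names here are hypothetical, not from the patent.

```python
class TrustedExecutionEnvironment:
    """Illustrative model of the TEE 41 / REE 42 boundary: the
    biometric templates stay inside the TEE object, and only the
    boolean comparison result is exported to the untrusted side."""

    def __init__(self, face_template, depth_template):
        # Templates are private to the TEE; nothing outside reads them.
        self._face_template = face_template
        self._depth_template = depth_template

    def compare_face(self, infrared_image, extract_features):
        # Step 051: only the match/no-match result leaves the TEE.
        return extract_features(infrared_image) == self._face_template

    def compare_depth(self, depth_image, extract_features):
        # Step 054: same pattern for the depth comparison.
        return extract_features(depth_image) == self._depth_template
```

The untrusted side only ever sees the returned booleans, which mirrors the patent's point that the images and templates cannot be read, tampered with, or stolen by programs in the untrusted execution environment 42.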
In some embodiments, the processor 40 includes an Application Processor (AP) 43 and a microprocessor 44. Both the trusted execution environment 41 and the untrusted execution environment 42 are formed on the application processor 43. The microprocessor 44 is connected to the infrared camera 12 and is configured to obtain the infrared image and the laser pattern; the microprocessor 44 processes the laser pattern to obtain the depth image. Specifically, calibration information of the laser projected by the laser projector 13 may be stored in the microprocessor 44, and the microprocessor 44 obtains depth information of different positions of the target object by processing the laser pattern together with the calibration information, and forms the depth image. Specifically, the microprocessor 44 and the infrared camera 12 may be connected by an Inter-Integrated Circuit (I2C) bus 50; the microprocessor 44 may provide the infrared camera 12 with a clock signal for acquiring the infrared image; and the infrared image and the laser pattern acquired by the infrared camera 12 may be transmitted to the microprocessor 44 through a Mobile Industry Processor Interface (MIPI) 441. The microprocessor 44 is also connected to the laser projector 13; in particular, the laser projector 13 may be connected to a Pulse Width Modulation (PWM) interface 442 of the microprocessor 44. The microprocessor 44 is connected to the application processor 43 and transmits the infrared image and the depth image into the trusted execution environment 41. In other embodiments, the laser projector 13 may also be connected to the application processor 43 through the integrated circuit bus 50, and the application processor 43 may be used to enable the laser projector 13.
The microprocessor 44 may be a processing chip, the application processor 43 may be used for resetting the microprocessor 44, waking up (wake) the microprocessor 44, debugging (debug) the microprocessor 44, and the like, and the microprocessor 44 may be connected to the application processor 43 through the mobile industry processor interface 441, and specifically, the microprocessor 44 is connected to the trusted execution environment 41 of the application processor 43 through the mobile industry processor interface 441 to directly transfer data in the microprocessor 44 to the trusted execution environment 41.
The microprocessor 44 may receive the infrared image acquired by the infrared camera 12 and transmit the infrared image to the trusted execution environment 41 through the mobile industry processor interface 441, so that the infrared image output from the microprocessor 44 does not enter the untrusted execution environment 42 of the application processor 43 and cannot be acquired by other programs, thereby improving the information security of the verification apparatus 100. Meanwhile, the application processor 43 compares, in the trusted execution environment 41, whether the face in the infrared image matches the face template, and then outputs the comparison result; during this comparison, the infrared image and the face template cannot be acquired, tampered with or stolen by other programs, which further improves the information security of the verification apparatus 100. Similarly, the depth image and the depth template cannot be acquired, tampered with or stolen by other programs, thereby improving the information security of the verification apparatus 100.
When the verification apparatus 100 includes the infrared fill-in light 14, the infrared fill-in light 14 and the application processor 43 may be connected through the integrated circuit bus 50, the application processor 43 may be configured to enable the infrared fill-in light 14, the infrared fill-in light 14 may also be connected with the microprocessor 44, and specifically, the infrared fill-in light 14 may be connected to the pulse width modulation interface 442 of the microprocessor 44.
Referring to fig. 2 and 6, in some embodiments, the verification method further includes:
07, if the face in the infrared image is not matched with the face template of the authorized user, the verification is not passed; and/or if the depth image does not match the depth template of the authorized user, the authentication is not passed.
Processor 40 is further configured to perform step 07, that is, processor 40 is configured to determine that the authentication is not passed if the face in the infrared image does not match the face template of the authorized user; and/or processor 40 is configured to determine that the authentication fails when the depth image does not match the depth template of the authorized user. In other embodiments, step 07 further comprises: and if the human face does not exist in the infrared image, the verification is not passed.
Specifically, step 06 does not need to be performed when the face in the infrared image does not match the face template of the authorized user, or when the depth image does not match the depth template of the authorized user. When the verification fails, the processor 40 may control the display screen 105 to display a prompt such as "Verification failed, please try again", or the processor 40 may control the verification apparatus 100 to generate a predetermined vibration to prompt the user that the verification has failed. At this time, the movable module 10 may remain extended outside the housing 101; alternatively, the movable module 10 may be moved into the housing 101.
Referring to fig. 7, in some embodiments, the verification method further includes:
081, if no human face exists in the infrared image, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03); or
082, if the face in the infrared image is not matched with the face template of the authorized user, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03); and/or
083, if the depth image does not match the depth template of the authorized user, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03).
Specifically, the infrared camera 12 is further configured to obtain an infrared image when a human face does not exist in the infrared image, or when the human face in the infrared image does not match the human face template of the authorized user, and/or when the depth image does not match the depth template of the authorized user. For example, after the infrared camera 12 acquires the infrared image, if the processor 40 (or the second determining module 22) determines that the infrared image does not have a human face, the infrared camera 12 acquires the infrared image again (return to step 03); after the infrared camera 12 acquires the infrared image and the processor 40 (or the second judging module 22) judges that the face exists in the infrared image, if the processor 40 (or the third judging module 23) judges that the face in the infrared image is not matched with the face template of the authorized user, the infrared camera 12 acquires the infrared image again (returning to execute step 03); after the infrared camera 12 acquires the infrared image and the processor 40 (or the second judging module 22) judges that the face exists in the infrared image, if the processor 40 (or the fourth judging module 24) judges that the depth image does not match the depth template of the authorized user, the infrared camera 12 acquires the infrared image again (return to execute step 03); after the infrared camera 12 acquires the infrared image and the processor 40 (or the second judging module 22) judges that the face exists in the infrared image, if the processor 40 (or the third judging module 23) judges that the face in the infrared image does not match the face template of the authorized user and the processor 40 (or the fourth judging module 24) judges that the depth image does not match the depth template of the authorized user, the infrared camera 12 acquires the infrared image again (returning to execute step 03). 
In other embodiments, the verification method further comprises: if the face in the infrared image matches the face template of the authorized user and the depth image does not match the depth template of the authorized user, the step of obtaining the laser pattern through the structured light projector 13 and the infrared camera 12 is returned (step 052).
In the verification apparatus 100 and the verification method of this embodiment, the infrared camera 12 is further configured to acquire the infrared image again when no face exists in the infrared image, when the face in the infrared image does not match the face template of the authorized user, and/or when the depth image does not match the depth template of the authorized user. The movable module 10 does not need to retract into the housing 101 and then extend out of the housing 101 again before the infrared camera 12 can acquire the infrared image, which reduces the time taken by the verification apparatus 100 to execute the verification method and improves the success rate of passing the verification.
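The retry behavior of steps 081-083 can be sketched as a loop that keeps returning to step 03 on any failed check. The callables and the `max_attempts` bound (corresponding to the predetermined number of times discussed below) are illustrative assumptions, not details from the patent.

```python
def capture_until_verified(capture_ir, has_face, face_matches,
                           depth_matches, max_attempts):
    """Sketch of steps 081-083: on any failed check the method returns
    to step 03 (re-acquire the infrared image) without retracting the
    movable module. All callables are hypothetical stand-ins."""
    for _ in range(max_attempts):
        ir = capture_ir()                         # step 03
        if not has_face(ir):                      # step 04 failed -> retry
            continue
        if face_matches(ir) and depth_matches():  # steps 051 / 052-054
            return True                           # step 06: verified
    return False  # predetermined number of attempts exhausted
```

Returning `False` here is the point at which the embodiment of step 09 would retract the movable module 10 into the housing 101.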
Referring to fig. 8, in some embodiments, after the infrared camera 12 has continuously acquired infrared images a predetermined number of times, the verification method further includes:
09, if the infrared image does not have a human face, the movable module 10 moves to be accommodated in the housing 101; or, if the face in the infrared image is not matched with the face template of the authorized user, the movable module 10 moves to be accommodated in the shell 101; and/or if the depth image does not match the depth template of the authorized user, the movable module 10 moves to be housed within the housing 101.
Specifically, after the infrared camera 12 has continuously acquired infrared images the predetermined number of times, the movable module 10 is further configured to move to be accommodated in the housing 101 when no face exists in the infrared images, when the face in the infrared images does not match the face template of the authorized user, and/or when the depth image does not match the depth template of the authorized user.
The predetermined number of times may be two, three, four, five, or any number of times, and this embodiment is exemplified by two times: when the movable module 10 is accommodated in the housing 101 and the movable module 10 is triggered, the movable module 10 extends out of the housing 101 to expose the infrared camera 12 out of the housing 101, the infrared camera 12 acquires an infrared image for the first time, if the infrared image does not have a human face, the movable module 10 remains exposed out of the housing 101, and at this time, the infrared camera 12 acquires an infrared image for the second time (return to execute step 03); if the infrared image acquired by the infrared camera 12 for the second time does not have a human face, the movable module 10 moves to be accommodated in the housing 101. When the movable module 10 is accommodated in the housing 101 and the movable module 10 is triggered, the movable module 10 extends out of the housing 101 to expose the infrared camera 12 out of the housing 101, the infrared camera 12 acquires an infrared image for the first time, if the infrared image does not have a human face, the movable module 10 remains exposed out of the housing 101, and at this time, the infrared camera 12 acquires an infrared image for the second time (return to execute step 03); if a human face exists in the infrared image acquired by the infrared camera 12 for the second time, but the human face in the infrared image acquired for the second time does not match with the human face template, the movable module 10 moves to be accommodated in the housing 101. 
When the movable module 10 is accommodated in the housing 101 and the movable module 10 is triggered, the movable module 10 extends out of the housing 101 to expose the infrared camera 12 out of the housing 101, the infrared camera 12 acquires an infrared image for the first time, if the infrared image does not have a human face, the movable module 10 remains exposed out of the housing 101, and at this time, the infrared camera 12 acquires an infrared image for the second time (return to execute step 03); if the infrared image acquired by the infrared camera 12 for the second time has a human face, but the depth image acquired by the laser projector 13, the infrared camera 12, and the processor 40 (the acquisition module 27) does not match the depth template, the movable module 10 moves to be accommodated in the housing 101. When the movable module 10 is accommodated in the housing 101 and the movable module 10 is triggered, the movable module 10 extends out of the housing 101 to expose the infrared camera 12 out of the housing 101, the infrared camera 12 acquires an infrared image for the first time, if the infrared image does not have a human face, the movable module 10 remains exposed out of the housing 101, and at this time, the infrared camera 12 acquires an infrared image for the second time (return to execute step 03); if there is a human face in the infrared image acquired by the infrared camera 12 for the second time, but the human face in the infrared image acquired for the second time does not match the human face template, and the depth image acquired by the laser projector 13, the infrared camera 12, and the processor 40 (the acquisition module 27) does not match the depth template, the movable module 10 moves to be accommodated in the housing 101.
When the movable module 10 starts moving toward the inside of the housing 101, both the infrared camera 12 and the laser projector 13 are turned off. Alternatively, after the infrared camera 12 has continuously acquired infrared images the predetermined number of times, the movable module 10 is further configured to turn off both the infrared camera 12 and the laser projector 13 when no face exists in the infrared images, when the face in the infrared images does not match the face template of the authorized user, and/or when the depth image does not match the depth template of the authorized user.
In the verification apparatus 100 and the verification method of this embodiment, after the infrared camera 12 has continuously acquired infrared images the predetermined number of times, the movable module 10 is further configured to move to be accommodated in the housing 101 when no face exists in the infrared images, when the face in the infrared images does not match the face template of the authorized user, and/or when the depth image does not match the depth template of the authorized user. This avoids the infrared camera 12 and/or the laser projector 13 continuing to operate after the verification has failed multiple times, and avoids the movable module 10 remaining exposed outside the housing 101, which would affect the appearance of the verification apparatus 100.
Referring to fig. 9 and 10, in some embodiments, the bracket 11 is provided with a reference position, and the step of moving the infrared camera 12 and the structured light projector 13 out of the housing 101 with the bracket 11 to extend out of the housing 101 (step 021) includes:
0211, detecting whether the reference position on the movable module 10 reaches a preset position;
0212, if the reference position reaches the preset position, the movement of the support 11 is stopped.
The verification device 100 further includes a detection module 26. A reference position is provided on the bracket 11, and the detection module 26 is configured to detect whether the reference position on the movable module 10 reaches the preset position; when the reference position reaches the preset position, the bracket 11 stops moving.
The reference position may be a position where a stopper (e.g., a stopper protrusion) or a positioning portion (e.g., a positioning groove) on the bracket 11 is located. The preset position is a position fixed with respect to the housing 101; specifically, the preset position may be a position where a limiting portion (e.g., a limiting protrusion) on the housing 101 is located. When the movable module 10 is housed in the housing 101, the distance between the reference position and the preset position is the maximum stroke of the movable module 10. The detection module 26 may be a detection circuit connected with a position switch (which may be a travel switch): the position switch is disposed at the preset position, and the bracket 11 is provided, at the reference position, with a protrusion capable of triggering the position switch. When the reference position on the bracket 11 moves to the preset position, the bracket 11 triggers the position switch and this is detected by the detection circuit, so that the detection module 26 can detect whether the reference position of the bracket 11 has moved to the preset position.
Referring to fig. 10, in some embodiments, the detection module 26 includes a magnetic device 261 and a hall sensor 262, the magnetic device 261 is disposed at a reference position, and the hall sensor 262 is disposed at a predetermined position. Specifically, when the magnetic device 261 moves to the preset position, the magnetic device 261 aligns with the hall sensor 262 and changes the signal on the hall sensor 262, and it can be determined whether the magnetic device 261 (or the support 11) reaches the preset position according to the signal change of the hall sensor 262.
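The stop condition of steps 0211-0212 can be modeled as a simple polling loop: drive the bracket outward until the Hall sensor reading indicates the magnetic device has reached the preset position, then stop. The function names, threshold, and safety cutoff below are illustrative assumptions, not details from the patent.

```python
def extend_bracket(read_hall, step_motor, threshold, max_steps=1000):
    """Sketch of steps 0211-0212: move the bracket until the magnetic
    device 261 aligns with the Hall sensor 262 (the reading crosses the
    threshold), then stop. All names and units are hypothetical."""
    for _ in range(max_steps):
        if abs(read_hall()) >= threshold:  # 0211: reference at preset position
            return True                    # 0212: stop moving the bracket
        step_motor()                       # keep driving the bracket outward
    return False                           # safety cutoff: travel exhausted
```

Detecting alignment via the change in the Hall sensor signal, rather than dead-reckoning motor steps, is what makes the stop position repeatable relative to the housing.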
Referring to fig. 11, a computer-readable storage medium 60 is further provided according to an embodiment of the present invention. The computer-readable storage medium 60 is used in the authentication apparatus 100 of the above embodiment, and the computer-readable storage medium 60 is used for storing one or more computer-executable instructions, when the one or more computer-executable instructions are executed by the one or more processors 40, the processors 40 execute the following steps:
01, judging whether the movable module 10 is triggered or not;
02, if the movable module 10 is triggered, the control bracket 11 carries the infrared camera 12 and the structured light projector 13 to move towards the outside of the shell 101 so as to extend out of the shell 101, and controls the infrared camera 12 and the structured light projector 13 to initialize;
03, controlling the infrared camera 12 to acquire an infrared image;
04, judging whether a human face exists in the infrared image;
05, when a face exists in the infrared image, judging whether the face is matched with a face template of an authorized user, controlling the structured light projector 13 and the infrared camera 12 to acquire a laser pattern, acquiring a depth image according to the laser pattern and judging whether the depth image is matched with the depth template of the authorized user; and
and 06, if the face is matched with the face template of the authorized user and the depth image is matched with the depth template of the authorized user, the verification is passed.
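Taken together, the stored instructions 01-06 amount to the following control flow. This Python sketch is illustrative only; each callable is a hypothetical stand-in for the hardware or comparison step the patent assigns to the processor 40.

```python
def run_verification(triggered, extend_and_init, capture_ir, has_face,
                     face_matches, capture_depth, depth_matches):
    """End-to-end sketch of steps 01-06 as executed from the
    computer-readable storage medium 60. Every callable is a
    hypothetical stand-in for the corresponding step."""
    if not triggered():                  # 01: movable module triggered?
        return False
    extend_and_init()                    # 02: extend bracket, init camera/projector
    ir = capture_ir()                    # 03: acquire infrared image
    if not has_face(ir):                 # 04: face present?
        return False
    face_ok = face_matches(ir)           # 051: compare with face template
    depth = capture_depth()              # 052/053: laser pattern -> depth image
    depth_ok = depth_matches(depth)      # 054: compare with depth template
    return face_ok and depth_ok          # 06: both must match to pass
```

In the embodiment, steps 051 and 052-054 run synchronously rather than sequentially as written here; the sketch only shows the logical dependencies between the steps.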
When the computer-executable instructions are executed by the one or more processors 40, the processors 40 may further perform the steps of:
0521: controlling the laser projector 13 to project laser;
0522: controlling the infrared camera 12 to acquire a laser pattern modulated by the object; and
0531: the laser pattern is processed to obtain a depth image.
When the computer-executable instructions are executed by the one or more processors 40, the processors 40 may further perform the steps of:
07, if the face in the infrared image is not matched with the face template of the authorized user, the verification is not passed; and/or if the depth image does not match the depth template of the authorized user, the authentication is not passed.
When the computer-executable instructions are executed by the one or more processors 40, the processors 40 may further perform the steps of:
081, if no human face exists in the infrared image, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03); or
082, if the face in the infrared image is not matched with the face template of the authorized user, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03); and/or
083, if the depth image does not match the depth template of the authorized user, returning to the step of acquiring the infrared image by the infrared camera 12 (step 03).
After the infrared camera 12 has continuously acquired infrared images a predetermined number of times, when the computer-executable instructions are executed by the one or more processors 40, the processor 40 may further perform the steps of:
09, if the infrared image does not have a human face, controlling the movable module 10 to move to be accommodated in the shell 101; or, if the face is not matched with the face template of the authorized user, the movable module 10 is controlled to move to be accommodated in the shell 101; and/or, if the depth image does not match the depth template of the authorized user, controlling the movable module 10 to move to be housed within the housing 101.
When the computer-executable instructions are executed by the one or more processors 40, the processors 40 may further perform the steps of:
0211, the control detection module 26 detects whether the reference position on the movable module 10 reaches a preset position; and
0212, if the reference position reaches the preset position, the control bracket 11 stops moving.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, various steps or methods may be performed by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for performing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried out in the above method may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
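For orientation before the claims, the two-stage verification the embodiments describe (trigger check, extension with in-flight initialization, infrared face match, then structured-light depth match) might be condensed as follows. This is an illustrative sketch only; every identifier is invented for the example.

```python
# Condensed sketch of the two-stage verification flow.
# All identifiers are illustrative, not taken from the patent text.

def authenticate(module, camera, projector, face_tpl, depth_tpl):
    if not module.is_triggered():
        return False
    module.extend()                          # bracket drives both devices out
    camera.initialize()                      # both devices initialize while the
    projector.initialize()                   # bracket is still extending
    face = camera.detect_face()              # infrared image, camera fully exposed
    if face is None or not face_tpl.matches(face):
        return False                         # 2D (infrared) check failed
    depth = projector.capture_depth(camera)  # depth image from the laser pattern
    return depth_tpl.matches(depth)          # 3D (depth) check decides the result
```

Initializing the camera and projector while the bracket is still travelling is the latency point the embodiments emphasize: by the time the camera clears the housing, it is ready to capture.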

Claims (17)

1. A method of authentication, the method comprising:
judging whether a movable module is triggered, wherein the movable module is accommodated in a shell and can extend out of the shell, and comprises a bracket, an infrared camera arranged on the bracket and a structured light projector arranged on the bracket;
if the movable module is triggered, the bracket drives the infrared camera and the structured light projector to move towards the outside of the shell so as to extend out of the shell, and the infrared camera and the structured light projector are initialized in the process that the bracket extends out of the shell;
when the infrared camera is completely exposed out of the shell and the support has not stopped moving, acquiring an infrared image through the infrared camera;
judging whether a human face exists in the infrared image or not;
when a face exists in the infrared image, judging whether the face is matched with a face template of an authorized user, acquiring a laser pattern through the structured light projector and the infrared camera, acquiring a depth image according to the laser pattern, and judging whether the depth image is matched with the depth template of the authorized user; and
if the face is matched with a face template of an authorized user and the depth image is matched with a depth template of the authorized user, the verification is passed;
the support is provided with a reference position, and the verification method comprises the following steps:
detecting whether the reference position on the movable module reaches a preset position or not, wherein the distance between the reference position and the preset position is the maximum stroke of the movable module;
and if the reference position reaches a preset position, stopping the movement of the bracket.
2. The authentication method according to claim 1, wherein said determining whether a face is present in the infrared image is performed in a trusted execution environment; and/or
Judging whether the face is matched with a face template of an authorized user or not is carried out in a trusted execution environment; and/or
The determining whether the depth image matches a depth template of an authorized user is performed in a trusted execution environment.
3. The authentication method according to claim 1, further comprising:
if the face is not matched with the face template of the authorized user, the verification is not passed; and/or
if the depth image does not match the depth template of the authorized user, the verification is not passed.
4. The authentication method according to claim 1, further comprising:
if the human face does not exist in the infrared image, returning to the step of acquiring the infrared image through the infrared camera; or
If the face is not matched with the face template of the authorized user, returning to the step of acquiring the infrared image through the infrared camera; and/or
And if the depth image is not matched with the depth template of the authorized user, returning to the step of acquiring the infrared image through the infrared camera.
5. The authentication method as claimed in claim 4, wherein after the infrared camera has continuously acquired infrared images a predetermined number of times, the authentication method further comprises:
if the infrared image does not have a human face, the movable module moves to be accommodated in the shell; or
If the face is not matched with the face template of the authorized user, the movable module moves to be accommodated in the shell; and/or
If the depth image does not match the depth template of the authorized user, the movable module moves to be accommodated in the shell.
6. An authentication apparatus, characterized in that the authentication apparatus comprises:
the movable module is accommodated in the shell and can extend out of the shell, the movable module comprises a support, an infrared camera arranged on the support and a structured light projector arranged on the support, when the movable module is triggered, the support is used for driving the infrared camera and the structured light projector to move towards the outside of the shell so as to extend out of the shell, and the infrared camera and the structured light projector are initialized in the process that the support extends out of the shell; the infrared camera is used for acquiring infrared images when the infrared camera is completely exposed out of the shell and the support has not stopped moving; the infrared camera and the structured light projector are used for acquiring laser patterns; and
a processor to:
judging whether the movable module is triggered or not;
judging whether a human face exists in the infrared image or not;
when a face exists in the infrared image, judging whether the face is matched with a face template of an authorized user;
when a human face exists in the infrared image, acquiring a depth image according to the laser pattern and judging whether the depth image is matched with a depth template of an authorized user; and
when the face is matched with a face template of an authorized user and the depth image is matched with a depth template of the authorized user, determining that the verification is passed;
the verification device further comprises a detection module, a reference position is arranged on the support, and the detection module is used for detecting whether the reference position on the movable module reaches a preset position; when the reference position reaches a preset position, the support stops moving, and the distance between the reference position and the preset position is the maximum stroke of the movable module.
7. The authentication device according to claim 6, wherein the processor is further configured to form a trusted execution environment, and the determining whether a human face is present in the infrared image is performed in the trusted execution environment; and/or
Judging whether the face is matched with a face template of an authorized user in the trusted execution environment; and/or
The determining whether the depth image matches a depth template of an authorized user is performed in the trusted execution environment.
8. The authentication device of claim 6, wherein the processor is further configured to:
when the face is not matched with the face template of the authorized user, determining that the verification is not passed; and/or
determining that the authentication fails when the depth image does not match a depth template of an authorized user.
9. The authentication device according to claim 6, wherein the infrared camera is further configured to acquire an infrared image when a human face does not exist in the infrared image; or
The infrared camera is also used for acquiring an infrared image when the face is not matched with a face template of an authorized user; and/or
The infrared camera is further used for acquiring the infrared image when the depth image is not matched with the depth template of the authorized user.
10. The authentication device according to claim 9, wherein after the infrared camera continuously acquires the infrared image for a predetermined number of times,
the movable module is also used for moving to be accommodated in the shell when no human face exists in the infrared image; or
The movable module is also used for moving to be accommodated in the shell when the face is not matched with a face template of an authorized user; and/or
The movable module is also used for moving to be accommodated in the shell when the depth image does not match a depth template of an authorized user.
11. The authentication device according to claim 6, wherein the detection module comprises a magnetic device and a Hall sensor, the magnetic device is disposed at the reference position, and the Hall sensor is disposed at the preset position.
12. The authentication device according to any one of claims 6 to 11, wherein the authentication device comprises any one of a mobile phone, a tablet computer, a smart band, and a smart helmet.
13. A computer readable storage medium for use with an authentication apparatus comprising a moveable module housed within and extendable from a housing, the moveable module comprising a cradle, an infrared camera disposed on the cradle, and a structured light projector disposed on the cradle, the cradle being movable with the infrared camera and the structured light projector toward the outside of the housing to extend out of the housing, the infrared camera being capable of acquiring an infrared image when the infrared camera is fully exposed outside of the housing and the cradle has not stopped moving, the infrared camera and the structured light projector being capable of acquiring a laser light pattern, wherein the computer readable storage medium is for storing one or more computer executable instructions that, when executed by one or more processors, the one or more processors perform the following authentication method:
judging whether the movable module is triggered or not;
if the movable module is triggered, controlling the bracket to drive the infrared camera and the structured light projector to move towards the outside of the shell so as to extend out of the shell, and controlling the infrared camera and the structured light projector to initialize in the process that the bracket extends out of the shell;
controlling the infrared camera to acquire an infrared image;
judging whether a human face exists in the infrared image or not;
when a face exists in the infrared image, judging whether the face is matched with a face template of an authorized user, controlling the structured light projector and the infrared camera to acquire a laser pattern, acquiring a depth image according to the laser pattern and judging whether the depth image is matched with the depth template of the authorized user; and
if the face is matched with a face template of an authorized user and the depth image is matched with a depth template of the authorized user, the verification is passed;
the authentication device further comprises a detection module having a reference position disposed on the support, the one or more processors further performing the following authentication method when the one or more computer-executable instructions are executed by the one or more processors:
controlling the detection module to detect whether the reference position on the movable module reaches a preset position, wherein the distance between the reference position and the preset position is the maximum stroke of the movable module; and
and if the reference position reaches a preset position, controlling the support to stop moving.
14. The computer-readable storage medium of claim 13, wherein the determining whether a human face is present in the infrared image is performed in a trusted execution environment; and/or
Judging whether the face is matched with a face template of an authorized user or not is carried out in a trusted execution environment; and/or
The determining whether the depth image matches a depth template of an authorized user is performed in a trusted execution environment.
15. The computer-readable storage medium of claim 13, wherein when the one or more computer-executable instructions are executed by the one or more processors, the one or more processors further perform the following authentication method:
if the face is not matched with the face template of the authorized user, the verification is not passed; and/or
if the depth image does not match the depth template of the authorized user, the verification is not passed.
16. The computer-readable storage medium of claim 13, wherein when the one or more computer-executable instructions are executed by the one or more processors, the one or more processors further perform the following authentication method:
if the human face does not exist in the infrared image, returning to the step of controlling the infrared camera to acquire the infrared image; or
If the face is not matched with the face template of the authorized user, returning to the step of controlling the infrared camera to acquire the infrared image; and/or
And if the depth image is not matched with the depth template of the authorized user, returning to the step of controlling the infrared camera to acquire the infrared image.
17. The computer-readable storage medium of claim 16, wherein after the infrared camera has continuously acquired infrared images a predetermined number of times, when the one or more computer-executable instructions are executed by the one or more processors, the one or more processors further perform the following authentication method:
if the infrared image does not have a human face, the movable module moves to be accommodated in the shell; or
If the face is not matched with the face template of the authorized user, the movable module moves to be accommodated in the shell; and/or
If the depth image does not match the depth template of the authorized user, the movable module moves to be accommodated in the shell.
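Claims 2, 7 and 14 place the face and depth matching inside a trusted execution environment. A hypothetical sketch of that split follows, with the TEE reduced to a plain class for illustration; a real implementation would go through an actual TEE API (for example a GlobalPlatform-style trusted application), and nothing below is from the patent.

```python
# Hypothetical sketch of the TEE split in claims 2, 7 and 14: templates
# and matching stay inside the enclave; only a verdict crosses out.

class TrustedEnclave:
    """Stand-in for a trusted application holding the biometric templates."""
    def __init__(self, face_template, depth_template):
        self._face = face_template    # never leaves the enclave
        self._depth = depth_template

    def verify(self, face, depth):
        # Matching runs entirely inside the enclave; only a boolean
        # crosses the boundary back to the normal world.
        return face == self._face and depth == self._depth

def normal_world_unlock(enclave, face, depth):
    # The rich OS forwards the captures and receives a pass/fail verdict;
    # it never sees the stored templates themselves.
    return enclave.verify(face, depth)
```

Keeping the templates and the comparison behind the enclave boundary is what makes the check meaningful even if the rich OS is compromised: the attacker can feed inputs but cannot read or replace the reference data.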
CN201810575116.6A 2018-06-06 2018-06-06 Authentication method, authentication apparatus, and computer-readable storage medium Expired - Fee Related CN108959880B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201810575116.6A CN108959880B (en) 2018-06-06 2018-06-06 Authentication method, authentication apparatus, and computer-readable storage medium
PCT/CN2019/083370 WO2019233199A1 (en) 2018-06-06 2019-04-19 Authentication method and device, and computer-readable storage medium
US16/424,426 US10942999B2 (en) 2018-06-06 2019-05-28 Verification method, verification device, electronic device and computer readable storage medium
EP19178415.6A EP3579131B1 (en) 2018-06-06 2019-06-05 Verification method, verification device, electronic device and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN108959880A CN108959880A (en) 2018-12-07
CN108959880B true CN108959880B (en) 2020-09-18

Family

ID=64493089


Country Status (1)

Country Link
CN (1) CN108959880B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019233199A1 (en) * 2018-06-06 2019-12-12 Guangdong Oppo Mobile Telecommunications Corp Ltd Authentication method and device, and computer-readable storage medium
CN109961062A (en) * 2019-04-16 2019-07-02 北京迈格威科技有限公司 Image-recognizing method, device, terminal and readable storage medium storing program for executing

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9408076B2 (en) * 2014-05-14 2016-08-02 The Regents Of The University Of California Sensor-assisted biometric authentication for smartphones
KR101688168B1 (en) * 2015-08-17 2016-12-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN105554196A (en) * 2016-01-26 2016-05-04 孔岳 Full-screen mobile phone
CN106991377B (en) * 2017-03-09 2020-06-05 Oppo广东移动通信有限公司 Face recognition method, face recognition device and electronic device combined with depth information
CN107729836B (en) * 2017-10-11 2020-03-24 Oppo广东移动通信有限公司 Face recognition method and related product
CN107766824A (en) * 2017-10-27 2018-03-06 广东欧珀移动通信有限公司 Face identification method, mobile terminal and computer-readable recording medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200918