WO2019228107A1 - Verification template generation method and generation system, terminal, and computer device - Google Patents

Verification template generation method and generation system, terminal, and computer device

Info

Publication number
WO2019228107A1
Authority
WO
WIPO (PCT)
Prior art keywords
template
execution environment
target object
infrared
laser
Prior art date
Application number
PCT/CN2019/084326
Other languages
English (en)
French (fr)
Inventor
张学勇
吕向楠
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Priority to EP19794870.6A (granted as EP3608814B1)
Priority to US16/613,371 (granted as US11210800B2)
Publication of WO2019228107A1

Classifications

    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 21/36: User authentication by graphic or iconic representation
    • G06F 21/57: Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G06F 2221/2133: Verifying human interaction, e.g. Captcha
    • G06F 2221/2141: Access rights, e.g. capability lists, access control lists, access tables, access matrices
    • G06T 2207/10024: Color image
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G06T 2207/10048: Infrared image

Definitions

  • the present application relates to the field of information security technology, and more specifically, to a method for generating a verification template, a system for generating a verification template, a terminal, a non-volatile computer-readable storage medium, and a computer device.
  • electronic devices usually verify whether a user has relevant usage rights by comparing the difference between a face image input by a user and a pre-stored face image template.
  • a face image template is easily tampered with or misappropriated, resulting in lower security of the information in the electronic device.
  • the embodiments of the present application provide a method for generating a verification template, a system for generating a verification template, a terminal, a non-volatile computer-readable storage medium, and a computer device.
  • the verification template according to the embodiment of the present application includes an infrared template and a depth template.
  • a method for generating the verification template includes: acquiring an infrared image of a target object and storing the infrared image in a trusted execution environment as the infrared template; controlling a laser projector to project laser light onto the target object; acquiring a laser pattern modulated by the target object; and processing the laser pattern to obtain a depth image, and storing the depth image in the trusted execution environment as the depth template.
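As a purely illustrative sketch (not the patent's implementation), the four steps of the generation method can be expressed as follows; `camera`, `projector`, `tee_store`, and `depth_from_pattern` are hypothetical stand-ins for the hardware and trusted-execution-environment facilities the method assumes:

```python
# Purely illustrative sketch of the generation method; every name below
# (capture_infrared, project, tee_store, depth_from_pattern) is a
# hypothetical stand-in, not an API from the patent.

def generate_verification_template(camera, projector, tee_store, depth_from_pattern):
    """Build the infrared and depth templates and keep both inside the TEE store."""
    # Step 01: acquire an infrared image and store it in the TEE as the infrared template.
    tee_store["infrared_template"] = camera.capture_infrared()

    # Step 02: control the laser projector to project laser light onto the target object.
    projector.project()

    # Step 03: acquire the laser pattern modulated by the target object.
    laser_pattern = camera.capture_infrared()

    # Step 04: process the pattern into a depth image and store it in the TEE
    # as the depth template.
    tee_store["depth_template"] = depth_from_pattern(laser_pattern)
    return tee_store
```

The point of the structure is that both templates are written only to the TEE-backed store, never to ordinary application memory.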
  • the system for generating a verification template includes a microprocessor and an application processor. The microprocessor is configured to: acquire an infrared image of a target object and store the infrared image in a trusted execution environment of the application processor as the infrared template; control a laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image, and store the depth image in the trusted execution environment as the depth template.
  • the terminal includes an infrared camera, a laser projector, and a verification template generation system.
  • the infrared camera is used to collect an infrared image of a target object;
  • the laser projector is used to project a laser onto the target object;
  • the generation system includes a microprocessor and an application processor. The microprocessor is configured to: acquire an infrared image of a target object and store the infrared image in a trusted execution environment of the application processor as the infrared template; control a laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image, and store the depth image as the depth template.
  • one or more non-transitory computer-readable storage media of the embodiments of the present application contain computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method for generating a verification template according to the foregoing embodiments.
  • the computer device includes a memory and a processor.
  • the memory stores computer-readable instructions.
  • when executed by the processor, the computer-readable instructions cause the processor to perform the method for generating a verification template according to the foregoing embodiments.
  • FIG. 1 is a schematic flowchart of a method for generating a verification template according to an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 3 is a schematic block diagram of a terminal according to an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a method for generating a verification template according to an embodiment of the present application
  • FIG. 5 is a schematic flowchart of a method for generating a verification template according to an embodiment of the present application
  • FIG. 6 is a schematic block diagram of a computer-readable storage medium and a processor according to an embodiment of the present application
  • FIG. 7 is a schematic block diagram of a computer device according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a laser projector according to an embodiment of the present application.
  • 9 to 11 are partial structural schematic diagrams of a laser projector according to an embodiment of the present application.
  • the verification template according to the embodiment of the present application includes an infrared template and a depth template.
  • a method for generating the verification template includes: acquiring an infrared image of a target object and storing the infrared image in a trusted execution environment as an infrared template; controlling a laser projector to project laser light onto the target object; acquiring a laser pattern modulated by the target object; and processing the laser pattern to obtain a depth image, and storing the depth image as a depth template.
  • the generating method further includes: acquiring a color image of the target object and storing the color image in an untrusted execution environment; and acquiring a color image from the untrusted execution environment and controlling the display screen to display the color image.
  • acquiring a laser pattern modulated by the target object includes acquiring multiple frames of laser patterns modulated by the target object, and processing the laser pattern to obtain a depth image includes: separately processing the multiple frames of laser patterns to obtain multiple frames of initial depth images; and synthesizing the multiple frames of initial depth images to obtain a final depth image, which is used as the depth image.
  • the system for generating a verification template includes a microprocessor and an application processor.
  • the microprocessor is configured to: acquire an infrared image of a target object and store the infrared image in a trusted execution environment of the application processor as an infrared template; control the laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image, and store the depth image in the trusted execution environment as a depth template.
  • the application processor is further configured to: acquire a color image of the target object and store the color image in an untrusted execution environment of the application processor; and acquire the color image from the untrusted execution environment and control the display screen to display the color image.
  • the microprocessor is configured to: acquire multiple frames of laser patterns modulated by the target object; process the multiple frames of laser patterns to obtain multiple frames of initial depth images; and synthesize the multiple frames of initial depth images to obtain a final depth image, which is used as the depth image.
  • the microprocessor is connected to the trusted execution environment through the mobile industry processor interface MIPI.
  • the terminal includes an infrared camera, a laser projector, and a verification template generation system according to any of the foregoing embodiments.
  • the infrared camera is used to collect an infrared image of a target object; the laser projector is used to project laser light onto the target object.
  • the one or more non-transitory computer-readable storage media of the embodiments of the present application contain computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method for generating a verification template according to any of the foregoing embodiments.
  • the computer device includes a memory and a processor.
  • the memory stores computer-readable instructions.
  • when executed by the processor, the computer-readable instructions cause the processor to perform the method for generating a verification template according to any one of the foregoing embodiments.
  • the method for generating a verification template, a system for generating a verification template, a terminal, a non-volatile computer-readable storage medium, and a computer device store the obtained infrared template and depth template in a trusted execution environment.
  • the verification template in the trusted execution environment is difficult to tamper with or misappropriate, so the security of the information in the terminal is high.
  • an embodiment of the present application provides a method for generating a verification template.
  • the verification template includes an infrared template and a depth template.
  • the method for generating a verification template includes the following steps: 01: acquiring an infrared image of a target object and storing it in a trusted execution environment 511 as an infrared template; 02: controlling a laser projector 20 to project laser light onto the target object; 03: acquiring a laser pattern modulated by the target object; and 04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as a depth template.
  • the terminal 100 includes an infrared camera 10, a laser projector 20, and a verification template generation system 50.
  • the infrared camera 10 can be used to collect an infrared image of a target object.
  • the laser projector 20 is used to project a laser light onto a target object.
  • the verification template generation system 50 can be used to implement a verification template generation method.
  • the verification template generation system 50 includes an application processor (AP) 51 and a microprocessor 52.
  • the application processor 51 is formed with a Trusted Execution Environment (TEE) 511.
  • the microprocessor 52 can be used to implement steps 01, 02, 03, and 04. That is, the microprocessor 52 can be used to acquire the infrared image of the target object, control the laser projector 20 to project laser light onto the target object, acquire the laser pattern modulated by the target object, and process the laser pattern to obtain the depth image.
  • the verification template refers to data entered into the terminal 100 by a user in advance and used as a basis of comparison for subsequently input verification elements. When the similarity between a subsequently input verification element and the verification template exceeds a predetermined value, the verification is judged to have passed; otherwise, it is judged to have failed.
  • the verification template includes an infrared template and a depth template.
  • the infrared template may be an infrared image of a user's face, and the infrared image of the face may be a flat image.
  • the depth template can be a user's face depth image, and the depth image can be obtained by means of structured light detection.
  • during verification, an infrared image of the scene in front of the terminal 100 can be acquired and compared with the infrared template to determine whether the scene contains a face matching the infrared image of the face in the infrared template. Further, after the infrared template verification passes, a depth image of the scene in front of the terminal 100 can be acquired and compared with the depth template to determine whether the scene contains a face matching the depth image of the face in the depth template. After the user passes the verification, the corresponding operation authority on the terminal 100 can be obtained, such as screen unlocking and payment operation authority.
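The two-stage comparison described above can be sketched as follows; the `similarity` function and the 0.9 threshold are illustrative assumptions, since the patent only specifies that the similarity must exceed a predetermined value:

```python
# Hedged sketch of the two-stage verification; similarity() and the 0.9
# threshold are illustrative assumptions rather than values from the patent.

def verify(infrared_input, depth_input, tee_store, similarity, threshold=0.9):
    """Pass only if the input matches the infrared template AND the depth template."""
    if similarity(infrared_input, tee_store["infrared_template"]) < threshold:
        return False  # infrared stage failed; the depth stage is never reached
    return similarity(depth_input, tee_store["depth_template"]) >= threshold
```

Running the depth comparison only after the infrared comparison passes mirrors the order described in the text.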
  • the terminal 100 may be a mobile phone, a tablet computer, a smart watch, a smart bracelet, a smart wearable device, and the like.
  • in the embodiments described herein, the terminal 100 is a mobile phone, but the specific form of the terminal 100 is not limited to mobile phones.
  • the infrared image of the target object can be collected by the infrared camera 10. The infrared camera 10 can be connected to the application processor 51, and the application processor 51 can be used to power the infrared camera 10 on and off, power down (pwdn) the infrared camera 10, or reset the infrared camera 10. At the same time, the infrared camera 10 can also be connected to the microprocessor 52; the microprocessor 52 and the infrared camera 10 can be connected through an Inter-Integrated Circuit (I2C) bus 70, and the microprocessor 52 can provide the infrared camera 10 with the clock information used for collecting infrared images.
  • the infrared images collected by the infrared camera 10 can be transmitted to the microprocessor 52 through a Mobile Industry Processor Interface (MIPI) 521.
  • the terminal 100 further includes an infrared fill light 40.
  • the infrared fill light 40 can be used to emit infrared light outward.
  • the infrared light is reflected by the user and received by the infrared camera 10.
  • the infrared fill light 40 and the application processor 51 can be connected through the integrated circuit bus 70.
  • the application processor 51 can be used to enable the infrared fill light 40.
  • the infrared fill light 40 can also be connected to the microprocessor 52. Specifically, the infrared fill light 40 can be connected to the pulse width modulation (PWM) interface 522 of the microprocessor 52.
  • the laser projector 20 of the terminal 100 can project a laser light onto a target object.
  • the laser projector 20 may be connected to an application processor 51.
  • the application processor 51 may be used to enable the laser projector 20, and the two may be connected through the integrated circuit bus 70.
  • the laser projector 20 may also be connected to the microprocessor 52. Specifically, the laser projector 20 may be connected to the pulse width modulation interface 522 of the microprocessor 52.
  • the microprocessor 52 may be a processing chip, and the microprocessor 52 is connected to the application processor 51. Specifically, the application processor 51 may be used to reset the microprocessor 52, wake the microprocessor 52, debug the microprocessor 52, and the like.
  • the microprocessor 52 can be connected to the application processor 51 through the mobile industry processor interface 521. Specifically, the microprocessor 52 is connected to the trusted execution environment 511 of the application processor 51 through the mobile industry processor interface 521, so that data in the microprocessor 52 can be transferred directly to the trusted execution environment 511 for storage. The code and memory area in the trusted execution environment 511 are controlled by an access control unit and cannot be accessed by programs in the untrusted execution environment (REE) 512.
  • each of the trusted execution environment 511 and the untrusted execution environment 512 may be formed in the application processor 51.
  • the microprocessor 52 can obtain the infrared image by receiving the infrared image collected by the infrared camera 10, and the microprocessor 52 can transmit the infrared image to the trusted execution environment 511 through the mobile industry processor interface 521. The infrared image output from the microprocessor 52 will not enter the untrusted execution environment 512 of the application processor 51, so the infrared image cannot be acquired by other programs, which improves the information security of the terminal 100.
  • the infrared image stored in the trusted execution environment 511 can be used as an infrared template.
  • when the microprocessor 52 controls the laser projector 20 to project laser light onto a target object, the microprocessor 52 can also control the infrared camera 10 to collect the laser pattern modulated by the target object.
  • the microprocessor 52 then obtains the laser pattern through the mobile industry processor interface 521.
  • the microprocessor 52 processes the laser pattern to obtain a depth image.
  • the microprocessor 52 may store calibration information of the laser light projected by the laser projector 20; by processing the laser pattern together with the calibration information, the microprocessor 52 obtains depth information at different positions of the target object and forms a depth image.
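For illustration only, the conversion from a laser-pattern disparity (the pixel shift of each speckle relative to a calibration pattern) to depth can be sketched with a standard structured-light triangulation relation; the focal length, baseline, and reference-plane distance below are made-up values, and the sign convention varies between devices:

```python
import numpy as np

# Illustrative structured-light triangulation: the disparity of each speckle
# relative to a calibration (reference) pattern is converted to depth. The
# focal length, baseline and reference-plane distance are made-up values,
# and the sign convention varies between devices.

def depth_from_disparity(disparity_px, focal_px=500.0, baseline_m=0.05, z_ref_m=0.5):
    """Depth via the common relation 1/z = 1/z_ref + d / (f * b)."""
    return 1.0 / (1.0 / z_ref_m + disparity_px / (focal_px * baseline_m))
```

With zero disparity the depth equals the reference-plane distance, and larger shifts move the estimate closer to (or farther from) the camera depending on the shift's sign.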
  • after the depth image is obtained, it is transmitted to the trusted execution environment 511 through the mobile industry processor interface 521.
  • the depth image stored in the trusted execution environment 511 can be used as a depth template.
  • both the obtained infrared template and the obtained depth template are stored in the trusted execution environment 511, where the verification template cannot easily be tampered with or stolen; the security of the information in the terminal 100 is therefore high.
  • the method for generating a verification template further includes the following steps: 05: acquiring a color image of the target object and storing it in an untrusted execution environment 512; and 06: acquiring the color image from the untrusted execution environment 512 and controlling the display screen 60 to display the color image.
  • the application processor 51 may be used to implement steps 05 and 06. That is, the application processor 51 may be used to acquire a color image of the target object and store it in the untrusted execution environment 512, and to acquire the color image from the untrusted execution environment 512 and control the display screen 60 to display the color image.
  • the terminal 100 further includes a visible light camera 30, which is connected to the application processor 51.
  • the visible light camera 30 may be connected to the application processor 51 through the integrated circuit bus 70 and the mobile industry processor interface 31.
  • the application processor 51 may be used to enable the visible light camera 30, turn off the visible light camera 30, or reset the visible light camera 30.
  • the visible light camera 30 can be used to collect color images.
  • the application processor 51 obtains a color image from the visible light camera 30 through the mobile industry processor interface 31, and stores the color image in the untrusted execution environment 512.
  • the data stored in the untrusted execution environment 512 may be retrieved by other programs.
  • the color image may be acquired by the application processor 51 and displayed on the display screen 60 of the terminal 100.
  • the visible light camera 30 and the infrared camera 10 can work at the same time.
  • the color image obtained by the application processor 51 can be synchronized with the laser pattern obtained by the microprocessor 52.
  • the user can adjust the orientation of his or her head by observing the color image displayed on the display screen 60, so that the infrared camera 10 acquires a more accurate face image or laser pattern.
  • step 03 includes step 031: obtaining a multi-frame laser pattern modulated by a target object.
  • step 04 includes steps 041 and 042.
  • steps 031, 041, and 042 can be implemented by the microprocessor 52. That is, the microprocessor 52 can be used to: acquire multiple frames of laser patterns modulated by the target object; separately process the multiple frames of laser patterns to obtain multiple frames of initial depth images; and synthesize the multiple frames of initial depth images to obtain a final depth image, which is used as the depth image.
  • the final depth image as the depth template may be obtained by synthesizing the initial depth images of the user's face obtained from multiple different angles.
  • the multiple initial depth images may be obtained by processing multiple frames of laser patterns, and the multiple frames of laser patterns may be obtained after the user turns his or her head to different angles. For example, under the guidance of the content shown on the display screen 60, the user can turn the head to the left, to the right, upward, and downward respectively.
  • during this process, the laser projector 20 can continuously project laser light onto the face, the infrared camera 10 collects multiple frames of modulated laser patterns, and the microprocessor 52 acquires the multiple frames of laser patterns and processes them to obtain multiple frames of initial depth images. The microprocessor 52 then synthesizes the multiple frames of initial depth images to obtain a final depth image, which includes depth information of the user's face from the front, left, right, upward, and downward angles. In this way, when verification is required, the user's face at different angles can be compared with the depth template, without requiring the user to align strictly with the infrared camera 10 at a particular angle, which shortens the verification time.
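One possible way to synthesize the per-angle initial depth images is sketched below; the merge rule (averaging each pixel over the frames in which it carries valid, non-zero depth) is an assumption, since the patent only states that the frames are synthesized:

```python
import numpy as np

# One possible merge rule for the multi-angle initial depth images: average
# each pixel over the frames in which it carries valid (non-zero) depth.
# This rule is an assumption; the patent only states the frames are synthesized.

def synthesize_depth(initial_frames):
    stack = np.stack(initial_frames).astype(float)  # shape (n_frames, H, W)
    valid = stack > 0                               # zero marks "no depth measured"
    counts = valid.sum(axis=0)
    summed = np.where(valid, stack, 0.0).sum(axis=0)
    # pixels never observed in any frame stay 0 in the final depth image
    return np.divide(summed, counts, out=np.zeros_like(summed), where=counts > 0)
```

The resulting single image then covers face regions that were visible only at some head angles.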
  • an embodiment of the present application further provides a computer-readable storage medium 200.
  • the one or more non-volatile computer-readable storage media 200 contain computer-executable instructions 202 that, when executed by one or more processors 300, cause the processor 300 to perform the method for generating a verification template according to any of the foregoing embodiments, for example: step 01: acquiring an infrared image of a target object and storing it in the trusted execution environment 511 as an infrared template; 02: controlling the laser projector 20 to project laser light onto the target object; 03: acquiring the laser pattern modulated by the target object; and 04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as a depth template.
  • an embodiment of the present application further provides a computer device 400.
  • the computer device 400 includes a memory 401 and a processor 402.
  • Computer-readable instructions are stored in the memory 401.
  • when the computer-readable instructions are executed by the processor 402, the processor 402 performs the method for generating a verification template according to any of the foregoing embodiments, for example: step 01: acquiring the infrared image of the target object and storing it in the trusted execution environment 511 as an infrared template; 02: controlling the laser projector 20 to project laser light onto the target object; 03: acquiring the laser pattern modulated by the target object; and 04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as a depth template.
  • the computer device 400 may further include electronic components such as an infrared camera 403, a visible light camera 404, and a display screen 405.
  • the infrared camera 403 may be used to collect an infrared image of a target object or a laser pattern modulated by the target object.
  • the visible light camera 404 may be used to collect a color image of the target object, and the display screen 405 may be used to display the infrared image, color image, laser pattern, and the like acquired by the processor.
  • the laser projector 20 includes a substrate assembly 21, a lens barrel 22, a light source 23, a collimating element 24, a diffractive optical element (DOE) 25, and a protective cover 26.
  • the substrate assembly 21 includes a substrate 211 and a circuit board 212.
  • the circuit board 212 is disposed on the substrate 211.
  • the circuit board 212 is used to connect the light source 23 and the main board of the terminal 100.
  • the circuit board 212 may be a rigid board, a flexible board, or a rigid-flex board. In the embodiment shown in FIG. 8, the circuit board 212 is provided with a through hole 2121.
  • the light source 23 is fixed on the substrate 211 and is electrically connected to the circuit board 212.
  • the substrate 211 may be provided with a heat dissipation hole 2111.
  • the heat generated by the light source 23 or the circuit board 212 may be dissipated through the heat dissipation hole 2111.
  • the heat dissipation glue may be filled in the heat dissipation hole 2111 to further improve the heat dissipation performance of the substrate assembly 21.
  • the lens barrel 22 is fixedly connected to the substrate assembly 21.
  • the lens barrel 22 is formed with a receiving cavity 221.
  • the lens barrel 22 includes a top wall 222 and an annular peripheral wall 224 extending from the top wall 222.
  • the peripheral wall 224 is disposed on the substrate assembly 21, and the top wall 222 defines a light-through hole 2212 communicating with the receiving cavity 221.
  • the peripheral wall 224 may be connected to the circuit board 212 by an adhesive.
  • a protective cover 26 is provided on the top wall 222.
  • the protective cover 26 includes a baffle 262 provided with a light emitting through hole 260 and an annular side wall 264 extending from the baffle 262.
  • the light source 23 and the collimating element 24 are both disposed in the receiving cavity 221.
  • the diffractive optical element 25 is mounted on the lens barrel 22.
  • the collimating element 24 and the diffractive optical element 25 are sequentially disposed on the light emitting light path of the light source 23.
  • the collimating element 24 collimates the laser light emitted from the light source 23.
  • the laser light passes through the collimating element 24 and then passes through the diffractive optical element 25 to form a laser pattern.
  • the light source 23 may be a vertical cavity surface emitting laser (VCSEL) or an edge-emitting laser (EEL). In the embodiment shown in FIG. 8, the light source 23 is an edge-emitting laser; specifically, the light source 23 may be a distributed feedback laser (DFB).
  • the light source 23 is configured to emit laser light into the receiving cavity 221. With reference to FIG. 9, the light source 23 is in a column shape as a whole. One end surface of the light source 23 away from the substrate assembly 21 forms a light emitting surface 231. The laser light is emitted from the light emitting surface 231 and the light emitting surface 231 faces the collimating element 24.
  • the light source 23 is fixed on the substrate assembly 21.
  • the light source 23 may be adhered to the substrate assembly 21 through a sealant 27.
  • for example, the side of the light source 23 opposite to the light emitting surface 231 may be adhered to the substrate assembly 21. Referring to FIG. 8 and FIG. 10, the side surfaces 232 of the light source 23 can also be adhered to the substrate assembly 21: the sealant 27 can surround all of the side surfaces 232, or only one of the side surfaces 232, or a certain number of the side surfaces 232, may be adhered to the substrate assembly 21.
  • the sealant 27 may be a thermally conductive adhesive to conduct heat generated from the operation of the light source 23 to the substrate assembly 21.
  • the diffractive optical element 25 is carried on the top wall 222 and is contained in the protective cover 26.
  • the opposite sides of the diffractive optical element 25 are in contact with the protective cover 26 and the top wall 222, respectively.
  • the baffle 262 includes an abutting surface 2622 near the light through hole 2212, and the diffractive optical element 25 is in abutment with the abutting surface 2622.
  • the diffractive optical element 25 includes a diffractive incidence surface 252 and a diffractive emission surface 254 opposite to each other.
  • the diffractive optical element 25 is carried on the top wall 222, the diffractive emission surface 254 is in contact with the surface (abutment surface 2622) of the baffle 262 near the light-through hole 2212, and the diffractive incidence surface 252 is in contact with the top wall 222.
  • the light-through hole 2212 is aligned with the receiving cavity 221, and the light exit hole 260 is aligned with the light-through hole 2212.
  • the top wall 222, the annular side wall 264, and the baffle 262 are in contact with the diffractive optical element 25, so as to prevent the diffractive optical element 25 from falling out of the protective cover 26 in the light emitting direction.
  • the protective cover 26 is adhered to the top wall 222 by glue.
  • the light source 23 of the above-mentioned laser projector 20 uses an edge-emitting laser.
  • the edge-emitting laser has a lower temperature drift than the VCSEL array.
  • the cost of the light source of the laser projector 20 is low.
  • the laser of the distributed feedback laser gains power through the feedback of its grating structure.
  • the sealant 27 can fix the side-emitting laser and prevent the side-emitting laser from being dropped, displaced, or shaken.
  • the light source 23 may also be fixed on the substrate assembly 21 in a fixing manner as shown in FIG. 11.
  • the laser projector 20 includes a plurality of support blocks 28.
  • the support blocks 28 may be fixed on the substrate assembly 21.
  • the plurality of support blocks 28 collectively surround the light source 23.
  • the light source 23 may be directly mounted between the plurality of support blocks 28 during installation.
  • the light source 23 is clamped by a plurality of support blocks 28 to further prevent the light source 23 from shaking.
  • the protective cover 26 may be omitted.
  • the diffractive optical element 25 may be disposed in the receiving cavity 221, the diffractive emission surface 254 of the diffractive optical element 25 may abut the top wall 222, and the laser light passes through the diffractive optical element 25 and then out through the light-through hole 2212. In this way, the diffractive optical element 25 does not easily fall off.
  • the substrate 211 may be omitted and the light source 23 may be directly fixed on the circuit board 212 to reduce the overall thickness of the laser projector 20.
  • the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" is at least two, for example, two, three, etc., unless specifically and explicitly defined otherwise.
  • Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing a particular logical function or step of a process
  • the scope of the preferred embodiments of the present application includes additional implementations, in which the functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application pertain.
  • a sequenced list of executable instructions that can be considered to implement a logical function can be embodied in any computer-readable medium,
  • the instructions may be used by an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device), or used in combination with such instruction execution systems, apparatuses, or devices.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • more specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner, and then stored in a computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments may be implemented by a program instructing related hardware.
  • the program may be stored in a computer-readable storage medium.
  • when executed, the program includes one of the steps of the method embodiments or a combination thereof.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Projection Apparatus (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

A method for generating a verification template, the verification template including an infrared template and a depth template, the method including: (01) acquiring an infrared image of a target object and storing it in a trusted execution environment (511) as the infrared template; (02) controlling a laser projector (20) to project laser light onto the target object; (03) acquiring a laser pattern modulated by the target object; and (04) processing the laser pattern to obtain a depth image and storing it in the trusted execution environment (511) as the depth template.

Description

Method and system for generating verification template, terminal, and computer device
Priority Information
This application claims priority to and the benefit of Chinese Patent Application No. 201810529884.8, filed with the China National Intellectual Property Administration on May 29, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of information security, and more particularly, to a method for generating a verification template, a system for generating a verification template, a terminal, a non-volatile computer-readable storage medium, and a computer device.
Background
In the related art, an electronic device typically verifies whether a user has the relevant usage permission by comparing a face image input by the user against a pre-stored face image template. However, the face image template is easily tampered with or stolen, so the security of the information in the electronic device is low.
Summary
Embodiments of this application provide a method for generating a verification template, a system for generating a verification template, a terminal, a non-volatile computer-readable storage medium, and a computer device.
In embodiments of this application, the verification template includes an infrared template and a depth template, and the method for generating the verification template includes: acquiring an infrared image of a target object and storing it in a trusted execution environment as the infrared template; controlling a laser projector to project laser light onto the target object; acquiring a laser pattern modulated by the target object; and processing the laser pattern to obtain a depth image and storing it in the trusted execution environment as the depth template.
The system for generating a verification template according to embodiments of this application includes a microprocessor and an application processor. The microprocessor is configured to: acquire an infrared image of a target object and store it in a trusted execution environment of the application processor as the infrared template; control a laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image and store it in the trusted execution environment as the depth template.
The terminal according to embodiments of this application includes an infrared camera, a laser projector, and a system for generating a verification template. The infrared camera is used to capture an infrared image of a target object; the laser projector is used to project laser light onto the target object. The generation system includes a microprocessor and an application processor, the microprocessor being configured to: acquire the infrared image of the target object and store it in a trusted execution environment of the application processor as the infrared template; control the laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image and store it in the trusted execution environment as the depth template.
One or more non-volatile computer-readable storage media according to embodiments of this application contain computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method for generating a verification template described in the above embodiments.
The computer device according to embodiments of this application includes a memory and a processor. The memory stores computer-readable instructions that, when executed by the processor, cause the processor to perform the method for generating a verification template described in the above embodiments.
Additional aspects and advantages of the embodiments of this application will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the embodiments of this application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of this application will become apparent and easy to understand from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of a method for generating a verification template according to an embodiment of this application;
FIG. 2 is a schematic structural diagram of a terminal according to an embodiment of this application;
FIG. 3 is a schematic block diagram of a terminal according to an embodiment of this application;
FIG. 4 is a schematic flowchart of a method for generating a verification template according to an embodiment of this application;
FIG. 5 is a schematic flowchart of a method for generating a verification template according to an embodiment of this application;
FIG. 6 is a schematic block diagram of a computer-readable storage medium and a processor according to an embodiment of this application;
FIG. 7 is a schematic block diagram of a computer device according to an embodiment of this application;
FIG. 8 is a schematic structural diagram of a laser projector according to an embodiment of this application;
FIG. 9 to FIG. 11 are schematic partial structural diagrams of a laser projector according to embodiments of this application.
Detailed Description
Embodiments of this application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended to explain this application, and should not be construed as limiting this application.
In embodiments of this application, the verification template includes an infrared template and a depth template, and the method for generating the verification template includes: acquiring an infrared image of a target object and storing it in a trusted execution environment as the infrared template; controlling a laser projector to project laser light onto the target object; acquiring a laser pattern modulated by the target object; and processing the laser pattern to obtain a depth image and storing it in the trusted execution environment as the depth template.
In some embodiments, the method further includes: acquiring a color image of the target object and storing it in an untrusted execution environment; and acquiring the color image from the untrusted execution environment and controlling a display screen to display the color image.
In some embodiments, acquiring the laser pattern modulated by the target object includes acquiring multiple frames of laser patterns modulated by the target object, and processing the laser pattern to obtain the depth image includes: processing the multiple frames of laser patterns respectively to obtain multiple initial depth images; and synthesizing the multiple initial depth images to obtain a final depth image as the depth image.
The system for generating a verification template according to embodiments of this application includes a microprocessor and an application processor. The microprocessor is configured to: acquire an infrared image of a target object and store it in a trusted execution environment of the application processor as the infrared template; control a laser projector to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image and store it in the trusted execution environment as the depth template.
In some embodiments, the application processor is further configured to: acquire a color image of the target object and store it in an untrusted execution environment of the application processor; and acquire the color image from the untrusted execution environment and control a display screen to display the color image.
In some embodiments, the microprocessor is configured to: acquire multiple frames of laser patterns modulated by the target object; process the multiple frames of laser patterns to obtain multiple initial depth images; and synthesize the multiple initial depth images to obtain a final depth image as the depth image.
In some embodiments, the microprocessor is connected to the trusted execution environment through a Mobile Industry Processor Interface (MIPI).
The terminal according to embodiments of this application includes an infrared camera, a laser projector, and the system for generating a verification template of any of the above embodiments; the infrared camera is used to capture an infrared image of the target object, and the laser projector is used to project laser light onto the target object.
One or more non-volatile computer-readable storage media according to embodiments of this application contain computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method for generating a verification template of any of the above embodiments.
The computer device according to embodiments of this application includes a memory and a processor; the memory stores computer-readable instructions that, when executed by the processor, cause the processor to perform the method for generating a verification template of any of the above embodiments.
In the method for generating a verification template, the system for generating a verification template, the terminal, the non-volatile computer-readable storage medium, and the computer device according to embodiments of this application, the acquired infrared template and depth template are both stored in the trusted execution environment. A verification template in the trusted execution environment is difficult to tamper with or steal, so the security of the information in the terminal is high.
Referring to FIG. 1 to FIG. 3, an embodiment of this application provides a method for generating a verification template. The verification template includes an infrared template and a depth template, and the method includes the following steps:
01: acquiring an infrared image of a target object and storing it in a trusted execution environment 511 as the infrared template;
02: controlling a laser projector 20 to project laser light onto the target object;
03: acquiring a laser pattern modulated by the target object; and
04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as the depth template.
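The four steps above can be sketched as a minimal Python flow. Everything here is an illustrative stand-in — the class `TrustedExecutionEnvironment` and the injected capture/projection callables are not APIs defined by this application — and it only shows the order of the steps and that both templates end up inside the TEE object.

```python
class TrustedExecutionEnvironment:
    """Stand-in for the TEE 511: templates are written in, and matching is
    done inside, so a stored template never leaves this object."""

    def __init__(self):
        self._store = {}

    def save(self, key, data):
        self._store[key] = data

    def compare(self, key, candidate, match_fn):
        # The template stays inside the TEE; only the boolean result comes out.
        return match_fn(self._store[key], candidate)


def generate_templates(tee, capture_infrared, project_laser,
                       capture_laser_pattern, pattern_to_depth):
    """Run steps 01-04 against a TEE instance using injected hardware stubs."""
    tee.save("infrared_template", capture_infrared())       # step 01
    project_laser()                                         # step 02
    pattern = capture_laser_pattern()                       # step 03
    tee.save("depth_template", pattern_to_depth(pattern))   # step 04
```

In use, the four callables would wrap the infrared camera, the laser projector, and the microprocessor's depth computation; here they can simply be stubbed out to exercise the flow.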
The terminal 100 of the embodiments of this application includes an infrared camera 10, a laser projector 20, and a system 50 for generating a verification template. The infrared camera 10 may be used to capture an infrared image of the target object. The laser projector 20 is used to project laser light onto the target object. The generation system 50 may be used to implement the method for generating a verification template. The generation system 50 includes an application processor (AP) 51 and a microprocessor 52. The application processor 51 is formed with a trusted execution environment (TEE) 511. The microprocessor 52 may be used to implement steps 01, 02, 03, and 04; that is, the microprocessor 52 may be used to acquire an infrared image of the target object and store it in the trusted execution environment 511 of the application processor 51 as the infrared template; control the laser projector 20 to project laser light onto the target object; acquire a laser pattern modulated by the target object; and process the laser pattern to obtain a depth image and store it in the trusted execution environment 511 as the depth template.
A verification template is entered into the terminal 100 by the user in advance and serves as the reference against which subsequently input verification elements are compared: when the similarity between a subsequently input verification element and the verification template exceeds a predetermined value, the verification is judged to pass; otherwise, it is judged to fail. In the embodiments of this application, the verification template includes an infrared template and a depth template. The infrared template may be an infrared image of the user's face, which may be a planar image. The depth template may be a depth image of the user's face, which may be obtained by structured-light detection. In an actual verification process, an infrared image of the scene in front of the terminal 100 may be acquired and compared with the infrared template to determine whether the infrared image contains a face image matching the infrared face image in the infrared template. Further, after the infrared template verification passes, a depth image of the scene in front of the terminal 100 may be acquired and compared with the depth template to determine whether the depth image contains a face image matching the face depth image in the depth template. After the user passes verification, the user can obtain corresponding operation permissions on the terminal 100, such as screen unlocking and payment.
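The two-stage comparison described above — infrared first, then depth, each against its stored template with a similarity threshold — can be sketched as follows. The function names, the toy similarity measure, and the threshold values are assumptions for illustration, not part of this application.

```python
def verify(candidate, template, similarity, threshold):
    """A single verification stage passes only if the similarity between the
    candidate and the stored template exceeds the predetermined threshold."""
    return similarity(candidate, template) > threshold


def two_stage_verify(ir_image, depth_image, ir_template, depth_template,
                     similarity, ir_threshold, depth_threshold):
    """Infrared is checked first; the depth stage runs only after the infrared
    stage passes, mirroring the order described in the text."""
    if not verify(ir_image, ir_template, similarity, ir_threshold):
        return False
    return verify(depth_image, depth_template, similarity, depth_threshold)
```

A toy similarity, e.g. the fraction of equal pixels between two flat pixel lists, is enough to exercise the gating behaviour.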
Referring to FIG. 2 and FIG. 3, the terminal 100 may be a mobile phone, a tablet computer, a smart watch, a smart band, a smart wearable device, or the like. In the embodiments of this application, the terminal 100 being a mobile phone is taken as an example for description; it can be understood that the specific form of the terminal 100 is not limited to a mobile phone. The infrared image of the target object may be captured by the infrared camera 10. The infrared camera 10 may be connected to the application processor 51, and the application processor 51 may be used to switch the power of the infrared camera 10 on and off, power down (pwdn) the infrared camera 10, or reset the infrared camera 10. Meanwhile, the infrared camera 10 may also be connected to the microprocessor 52; the microprocessor 52 and the infrared camera 10 may be connected through an Inter-Integrated Circuit (I2C) bus 70. The microprocessor 52 may provide the infrared camera 10 with clock information for capturing infrared images, and the infrared images captured by the infrared camera 10 may be transmitted to the microprocessor 52 through a Mobile Industry Processor Interface (MIPI) 521. In the embodiments of this application, the terminal 100 further includes an infrared fill light 40, which may be used to emit infrared light outward; the infrared light is reflected by the user and then received by the infrared camera 10. The infrared fill light 40 may be connected to the application processor 51 through the I2C bus, and the application processor 51 may be used to enable the infrared fill light 40. The infrared fill light 40 may also be connected to the microprocessor 52; specifically, the infrared fill light 40 may be connected to a Pulse Width Modulation (PWM) interface 522 of the microprocessor 52.
The laser projector 20 of the terminal 100 can project laser light onto the target object. The laser projector 20 may be connected to the application processor 51; the application processor 51 may be used to enable the laser projector 20 and is connected to it through the I2C bus 70. The laser projector 20 may also be connected to the microprocessor 52; specifically, the laser projector 20 may be connected to the PWM interface 522 of the microprocessor 52.
The microprocessor 52 may be a processing chip and is connected to the application processor 51. Specifically, the application processor 51 may be used to reset the microprocessor 52, wake the microprocessor 52, debug the microprocessor 52, and so on. The microprocessor 52 may be connected to the application processor 51 through the MIPI 521; specifically, the microprocessor 52 is connected to the trusted execution environment 511 of the application processor 51 through the MIPI 521, so that data in the microprocessor 52 is transmitted directly into the trusted execution environment 511 for storage. The code and memory regions in the trusted execution environment 511 are controlled by an access control unit and cannot be accessed by programs in the untrusted Rich Execution Environment (REE) 512; both the trusted execution environment 511 and the untrusted execution environment 512 may be formed in the application processor 51.
The microprocessor 52 may acquire an infrared image by receiving the infrared image captured by the infrared camera 10, and may transmit the infrared image to the trusted execution environment 511 through the MIPI 521. The infrared image output from the microprocessor 52 does not enter the untrusted execution environment 512 of the application processor 51, so the infrared image cannot be obtained by other programs, improving the information security of the terminal 100. The infrared image stored in the trusted execution environment 511 can serve as the infrared template.
After the microprocessor 52 controls the laser projector 20 to project laser light onto the target object, it may also control the infrared camera 10 to capture the laser pattern modulated by the target object, and the microprocessor 52 then acquires the laser pattern through the MIPI 521. The microprocessor 52 processes the laser pattern to obtain a depth image. Specifically, calibration information of the laser light projected by the laser projector 20 may be stored in the microprocessor 52, and the microprocessor 52 obtains depth information at different positions of the target object by processing the laser pattern together with the calibration information, forming a depth image. After the depth image is obtained, it is transmitted to the trusted execution environment 511 through the MIPI 521. The depth image stored in the trusted execution environment 511 can serve as the depth template.
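As a rough illustration of how a modulated laser pattern plus calibration information yields depth, the one-dimensional toy below matches each window of an observed pattern against a calibrated reference pattern, reads off the horizontal shift (disparity), and converts it to depth with depth = focal × baseline / disparity. Real structured-light decoding works on two-dimensional speckle images; this sketch and all of its names are assumptions, not the method actually stored in the microprocessor 52.

```python
def depth_from_pattern(observed, reference, focal_px, baseline_m, block=3):
    """Toy 1-D structured-light decoder.

    For each length-`block` window of `observed`, find the best matching
    position in the calibrated `reference` row (sum of squared differences),
    treat the positional shift as the disparity, and convert it to depth.
    """
    depths = []
    for i in range(len(observed) - block + 1):
        win = observed[i:i + block]
        # Best matching start position of this window in the reference row.
        best = min(range(len(reference) - block + 1),
                   key=lambda j: sum((reference[j + k] - win[k]) ** 2
                                     for k in range(block)))
        disparity = abs(best - i)
        # Zero disparity means the point is effectively at infinity.
        depths.append(focal_px * baseline_m / disparity
                      if disparity else float("inf"))
    return depths
```

With a reference row shifted uniformly by two samples, every window decodes to the same disparity and hence the same depth.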
In summary, in the method for generating a verification template and the terminal 100 of the embodiments of this application, the acquired infrared template and depth template are both stored in the trusted execution environment 511. A verification template in the trusted execution environment 511 is difficult to tamper with or steal, so the security of the information in the terminal 100 is high.
Referring to FIG. 2 to FIG. 4, in some embodiments, the method for generating a verification template further includes the steps:
05: acquiring a color image of the target object and storing it in the untrusted execution environment 512; and
06: acquiring the color image from the untrusted execution environment 512 and controlling the display screen 60 to display the color image.
In some embodiments, the application processor 51 may be used to implement steps 05 and 06; that is, the application processor 51 may be used to acquire a color image of the target object and store it in the untrusted execution environment 512, and to acquire the color image from the untrusted execution environment 512 and control the display screen 60 to display the color image.
Specifically, the terminal 100 further includes a visible light camera 30 connected to the application processor 51; specifically, the visible light camera 30 may be connected to the application processor 51 through the I2C bus 70 and a MIPI 31. The application processor 51 may be used to enable the visible light camera 30, power it down, or reset it. The visible light camera 30 may be used to capture color images; the application processor 51 acquires a color image from the visible light camera 30 through the MIPI 31 and stores the color image in the untrusted execution environment 512. Data stored in the untrusted execution environment 512 can be retrieved by other programs; in the embodiments of this application, the color image can be obtained by the application processor 51 and displayed on the display screen 60 of the terminal 100. The visible light camera 30 and the infrared camera 10 can work simultaneously, and the acquisition of the color image by the application processor 51 can proceed in synchronization with the acquisition of the laser pattern by the microprocessor 52. The user can observe the color image displayed on the display screen 60 and adjust the orientation of the head so that the infrared camera 10 can obtain a more accurate face image or laser pattern.
Referring to FIG. 2, FIG. 3, and FIG. 5, in some embodiments, step 03 includes step 031: acquiring multiple frames of laser patterns modulated by the target object. Step 04 includes the steps:
041: processing the multiple frames of laser patterns respectively to obtain multiple initial depth images; and
042: synthesizing the multiple initial depth images to obtain a final depth image as the depth image.
In some embodiments, steps 031, 041, and 042 may be implemented by the microprocessor 52; that is, the microprocessor 52 may be used to acquire multiple frames of laser patterns modulated by the target object, process the multiple frames of laser patterns respectively to obtain multiple initial depth images, and synthesize the multiple initial depth images to obtain a final depth image as the depth image.
Specifically, the final depth image serving as the depth template may be synthesized from initial depth images of the user's face acquired from multiple different angles. The multiple initial depth images may be obtained by processing multiple frames of laser patterns, and the multiple frames of laser patterns may be acquired after the user's head swings to different angles. For example, guided by the content displayed on the display screen 60, the user may swing the head to the left, right, up, and down. During the swinging, the laser projector 20 may continuously project laser light onto the face, the infrared camera 10 captures multiple frames of modulated laser patterns, the microprocessor 52 acquires the multiple frames of laser patterns and processes them to obtain multiple initial depth images, and the microprocessor 52 then processes the multiple initial depth images to obtain the final depth image, which includes depth information of the user's face from the front, left, right, lower, and other angles. In this way, when the user needs to be verified, the user's face can be acquired from different angles and compared with the depth template, without requiring the user to align strictly with the infrared camera 10 at a particular angle, shortening the verification time.
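A minimal sketch of the synthesis step: several initial depth maps captured at different head poses are fused pixel by pixel into one final map. The application does not specify the fusion rule, so the averaging of valid (non-zero) measurements below is an assumption for illustration, and the flat-list representation of a depth map is likewise a simplification.

```python
def merge_depth_frames(frames, invalid=0.0):
    """Merge several initial depth maps (front, left, right, ... poses) into
    one final map by averaging the valid measurements at each pixel.

    Pixels where a pose produced no measurement carry the `invalid` marker
    and are skipped, so each pose contributes only where it actually saw
    the face.
    """
    merged = []
    for pixel_values in zip(*frames):
        valid = [v for v in pixel_values if v != invalid]
        merged.append(sum(valid) / len(valid) if valid else invalid)
    return merged
```

A pixel seen by only one pose keeps that pose's value, while a pixel seen by several poses gets their average.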
Referring to FIG. 6, an embodiment of this application further provides a computer-readable storage medium 200. One or more non-volatile computer-readable storage media 200 contain computer-executable instructions 202 that, when executed by one or more processors 300, cause the processors 300 to perform the method for generating a verification template of any of the above embodiments, for example performing step 01: acquiring an infrared image of the target object and storing it in the trusted execution environment 511 as the infrared template; 02: controlling the laser projector 20 to project laser light onto the target object; 03: acquiring the laser pattern modulated by the target object; and 04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as the depth template.
Referring to FIG. 7, an embodiment of this application further provides a computer device 400. The computer device 400 includes a memory 401 and a processor 402. The memory 401 stores computer-readable instructions that, when executed by the processor 402, cause the processor 402 to perform the method for generating a verification template of any of the above embodiments, for example performing step 01: acquiring an infrared image of the target object and storing it in the trusted execution environment 511 as the infrared template; 02: controlling the laser projector 20 to project laser light onto the target object; 03: acquiring the laser pattern modulated by the target object; and 04: processing the laser pattern to obtain a depth image and storing it in the trusted execution environment 511 as the depth template. In addition, the computer device 400 may further include electronic components such as an infrared camera 403, a visible light camera 404, and a display screen 405, where the infrared camera 403 may be used to capture an infrared image of the target object or a laser pattern modulated by the target object, the visible light camera 404 may be used to capture a color image of the target object, and the display screen 405 may be used to display the infrared images, color images, laser patterns, and the like acquired by the processor.
Referring to FIG. 8, in some embodiments, the laser projector 20 includes a substrate assembly 21, a lens barrel 22, a light source 23, a collimating element 24, a diffractive optical element (DOE) 25, and a protective cover 26.
The substrate assembly 21 includes a substrate 211 and a circuit board 212. The circuit board 212 is disposed on the substrate 211 and is used to connect the light source 23 with the main board of the terminal 100; the circuit board 212 may be a rigid board, a flexible board, or a rigid-flex board. In the embodiment shown in FIG. 8, a through hole 2121 is formed in the circuit board 212, and the light source 23 is fixed on the substrate 211 and electrically connected to the circuit board 212. A heat dissipation hole 2111 may be formed in the substrate 211; heat generated by the operation of the light source 23 or the circuit board 212 can be dissipated through the heat dissipation hole 2111, and the heat dissipation hole 2111 may also be filled with a thermally conductive adhesive to further improve the heat dissipation performance of the substrate assembly 21.
The lens barrel 22 is fixedly connected to the substrate assembly 21 and forms a receiving cavity 221. The lens barrel 22 includes a top wall 222 and an annular peripheral wall 224 extending from the top wall 222; the peripheral wall 224 is disposed on the substrate assembly 21, and the top wall 222 is provided with a light-through hole 2212 communicating with the receiving cavity 221. The peripheral wall 224 may be connected to the circuit board 212 by adhesive.
The protective cover 26 is disposed on the top wall 222. The protective cover 26 includes a baffle 262 provided with a light exit hole 260 and an annular side wall 264 extending from the baffle 262.
The light source 23 and the collimating element 24 are both disposed in the receiving cavity 221. The diffractive optical element 25 is mounted on the lens barrel 22, and the collimating element 24 and the diffractive optical element 25 are arranged in sequence on the light emission path of the light source 23. The collimating element 24 collimates the laser light emitted by the light source 23, and the laser light passes through the collimating element 24 and then through the diffractive optical element 25 to form a laser pattern.
The light source 23 may be a Vertical Cavity Surface Emitting Laser (VCSEL) or an edge-emitting laser (EEL). In the embodiment shown in FIG. 8, the light source 23 is an edge-emitting laser; specifically, the light source 23 may be a Distributed Feedback Laser (DFB). The light source 23 is used to emit laser light into the receiving cavity 221. Referring to FIG. 9, the light source 23 is columnar as a whole; the end face of the light source 23 away from the substrate assembly 21 forms a light emitting surface 231, the laser light is emitted from the light emitting surface 231, and the light emitting surface 231 faces the collimating element 24. The light source 23 is fixed on the substrate assembly 21; specifically, the light source 23 may be adhered to the substrate assembly 21 by a sealant 27, for example with the side of the light source 23 opposite to the light emitting surface 231 adhered to the substrate assembly 21. Referring to FIG. 8 and FIG. 10, the side surfaces 232 of the light source 23 may also be adhered to the substrate assembly 21, with the sealant 27 surrounding the side surfaces 232; alternatively, only one of the side surfaces 232, or several of them, may be adhered to the substrate assembly 21. In this case, the sealant 27 may be a thermally conductive adhesive, so as to conduct heat generated by the operation of the light source 23 to the substrate assembly 21.
Referring to FIG. 8, the diffractive optical element 25 is carried on the top wall 222 and received in the protective cover 26. The opposite sides of the diffractive optical element 25 abut the protective cover 26 and the top wall 222, respectively; the baffle 262 includes an abutting surface 2622 near the light-through hole 2212, and the diffractive optical element 25 abuts the abutting surface 2622.
Specifically, the diffractive optical element 25 includes a diffractive incidence surface 252 and a diffractive emission surface 254 opposite to each other. The diffractive optical element 25 is carried on the top wall 222, with the diffractive emission surface 254 abutting the surface of the baffle 262 near the light-through hole 2212 (the abutting surface 2622) and the diffractive incidence surface 252 abutting the top wall 222. The light-through hole 2212 is aligned with the receiving cavity 221, and the light exit hole 260 is aligned with the light-through hole 2212. The top wall 222, the annular side wall 264, and the baffle 262 abut the diffractive optical element 25, thereby preventing the diffractive optical element 25 from falling out of the protective cover 26 along the light emission direction. In some embodiments, the protective cover 26 is adhered to the top wall 222 by glue.
The light source 23 of the above laser projector 20 uses an edge-emitting laser. On the one hand, an edge-emitting laser has a smaller temperature drift than a VCSEL array; on the other hand, since the edge-emitting laser is a single-point light emitting structure, there is no need to design an array structure, so manufacturing is simple and the light source cost of the laser projector 20 is low.
When the laser of a distributed feedback laser propagates, it gains power through the feedback of the grating structure. To increase the power of a distributed feedback laser, it is necessary to increase the injection current and/or increase the length of the distributed feedback laser. Since increasing the injection current increases the power consumption of the distributed feedback laser and causes serious heating problems, the length of the distributed feedback laser needs to be increased to ensure that it can work normally, so distributed feedback lasers generally have an elongated strip structure. When the light emitting surface 231 of the edge-emitting laser faces the collimating element 24, the edge-emitting laser is placed vertically; because of its elongated structure, the edge-emitting laser is prone to accidents such as dropping, displacement, or shaking. Therefore, providing the sealant 27 can fix the edge-emitting laser in place and prevent it from dropping, displacing, or shaking.
Referring to FIG. 8 and FIG. 11, in some embodiments, the light source 23 may also be fixed on the substrate assembly 21 in the manner shown in FIG. 11. Specifically, the laser projector 20 includes a plurality of support blocks 28; the support blocks 28 may be fixed on the substrate assembly 21, the plurality of support blocks 28 collectively surround the light source 23, and the light source 23 may be mounted directly between the plurality of support blocks 28 during installation. In one example, the plurality of support blocks 28 collectively clamp the light source 23 to further prevent the light source 23 from shaking.
In some embodiments, the protective cover 26 may be omitted. In this case, the diffractive optical element 25 may be disposed in the receiving cavity 221, with the diffractive emission surface 254 of the diffractive optical element 25 abutting the top wall 222, and the laser light passes through the diffractive optical element 25 and then out through the light-through hole 2212. In this way, the diffractive optical element 25 does not easily fall off.
In some embodiments, the substrate 211 may be omitted, and the light source 23 may be directly fixed on the circuit board 212 to reduce the overall thickness of the laser projector 20.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of the different embodiments or examples, provided they do not contradict each other.
In addition, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of this application, "a plurality of" means at least two, for example two or three, unless specifically and explicitly defined otherwise.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code including one or more executable instructions for implementing steps of a specific logical function or process, and the scope of the preferred embodiments of this application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of this application pertain.
The logic and/or steps represented in the flowcharts or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) with one or more wirings, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that each part of this application may be implemented by hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
A person of ordinary skill in the art can understand that all or part of the steps carried by the methods of the above embodiments may be completed by a program instructing the related hardware; the program may be stored in a computer-readable storage medium, and when executed, the program includes one of the steps of the method embodiments or a combination thereof.
In addition, each functional unit in each embodiment of this application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of this application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting this application; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of this application.

Claims (20)

  1. A method for generating a verification template, the verification template comprising an infrared template and a depth template, wherein the method comprises:
    acquiring an infrared image of a target object and storing it in a trusted execution environment as the infrared template;
    controlling a laser projector to project laser light onto the target object;
    acquiring a laser pattern modulated by the target object; and
    processing the laser pattern to obtain a depth image and storing it in the trusted execution environment as the depth template.
  2. The method for generating a verification template according to claim 1, wherein the method further comprises:
    acquiring a color image of the target object and storing it in an untrusted execution environment; and
    acquiring the color image from the untrusted execution environment and controlling a display screen to display the color image.
  3. The method for generating a verification template according to claim 1, wherein acquiring the laser pattern modulated by the target object comprises acquiring multiple frames of laser patterns modulated by the target object, and processing the laser pattern to obtain the depth image comprises:
    processing the multiple frames of laser patterns respectively to obtain multiple initial depth images; and
    synthesizing the multiple initial depth images to obtain a final depth image as the depth image.
  4. The method for generating a verification template according to claim 3, wherein the multiple initial depth images are acquired from multiple different angles, respectively.
  5. The method for generating a verification template according to claim 1, wherein the code and memory regions in the trusted execution environment are controlled by an access control unit and cannot be accessed by programs in an untrusted execution environment.
  6. A system for generating a verification template, the verification template comprising an infrared template and a depth template, wherein the system comprises a microprocessor and an application processor, the microprocessor being configured to:
    acquire an infrared image of a target object and store it in a trusted execution environment of the application processor as the infrared template;
    control a laser projector to project laser light onto the target object;
    acquire a laser pattern modulated by the target object; and
    process the laser pattern to obtain a depth image and store it in the trusted execution environment as the depth template.
  7. The system for generating a verification template according to claim 6, wherein the application processor is further configured to:
    acquire a color image of the target object and store it in an untrusted execution environment of the application processor; and
    acquire the color image from the untrusted execution environment and control a display screen to display the color image.
  8. The system for generating a verification template according to claim 6, wherein the microprocessor is configured to:
    acquire multiple frames of laser patterns modulated by the target object;
    process the multiple frames of laser patterns to obtain multiple initial depth images; and
    synthesize the multiple initial depth images to obtain a final depth image as the depth image.
  9. The system for generating a verification template according to claim 8, wherein the multiple initial depth images are acquired from multiple different angles, respectively.
  10. The system for generating a verification template according to claim 6, wherein the microprocessor is connected to the trusted execution environment through a Mobile Industry Processor Interface (MIPI).
  11. The system for generating a verification template according to claim 6, wherein the code and memory regions in the trusted execution environment are controlled by an access control unit and cannot be accessed by programs in an untrusted execution environment.
  12. A terminal, comprising:
    an infrared camera, configured to capture an infrared image of a target object;
    a laser projector, configured to project laser light onto the target object; and
    a system for generating a verification template, the verification template comprising an infrared template and a depth template, the system comprising a microprocessor and an application processor, the microprocessor being configured to:
    acquire the infrared image of the target object and store it in a trusted execution environment of the application processor as the infrared template;
    control the laser projector to project laser light onto the target object;
    acquire a laser pattern modulated by the target object; and
    process the laser pattern to obtain a depth image and store it in the trusted execution environment as the depth template.
  13. The terminal according to claim 12, wherein the application processor is further configured to:
    acquire a color image of the target object and store it in an untrusted execution environment of the application processor; and
    acquire the color image from the untrusted execution environment and control a display screen to display the color image.
  14. The terminal according to claim 12, wherein the microprocessor is configured to:
    acquire multiple frames of laser patterns modulated by the target object;
    process the multiple frames of laser patterns to obtain multiple initial depth images; and
    synthesize the multiple initial depth images to obtain a final depth image as the depth image.
  15. The terminal according to claim 14, wherein the multiple initial depth images are acquired from multiple different angles, respectively.
  16. The terminal according to claim 12, wherein the microprocessor is connected to the trusted execution environment through a Mobile Industry Processor Interface (MIPI).
  17. The terminal according to claim 12, wherein the code and memory regions in the trusted execution environment are controlled by an access control unit and cannot be accessed by programs in an untrusted execution environment.
  18. The terminal according to claim 12, wherein the microprocessor is connected to the infrared camera, and the microprocessor is connected to the laser projector.
  19. One or more non-volatile computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method for generating a verification template according to any one of claims 1 to 5.
  20. A computer device, comprising a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the method for generating a verification template according to any one of claims 1 to 5.
PCT/CN2019/084326 2018-05-29 2019-04-25 Method and system for generating verification template, terminal, and computer device WO2019228107A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19794870.6A EP3608814B1 (en) 2018-05-29 2019-04-25 Verification process in a terminal, corresponding terminal and corresponding computer program
US16/613,371 US11210800B2 (en) 2018-05-29 2019-04-25 Method, system and terminal for generating verification template

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810529884.8A CN108804900B (zh) 2018-05-29 2018-05-29 Method and system for generating verification template, terminal, and computer device
CN201810529884.8 2018-05-29

Publications (1)

Publication Number Publication Date
WO2019228107A1 true WO2019228107A1 (zh) 2019-12-05

Family

ID=64090943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/084326 WO2019228107A1 (zh) 2018-05-29 2019-04-25 Method and system for generating verification template, terminal, and computer device

Country Status (4)

Country Link
US (1) US11210800B2 (zh)
EP (1) EP3608814B1 (zh)
CN (2) CN108804900B (zh)
WO (1) WO2019228107A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108804900B (zh) 2018-05-29 2022-04-15 Method and system for generating verification template, terminal, and computer device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105513221A (zh) * 2015-12-30 2016-04-20 四川川大智胜软件股份有限公司 ATM anti-fraud device and system based on three-dimensional face recognition
CN106226977A (zh) * 2016-08-24 2016-12-14 深圳奥比中光科技有限公司 Laser projection module, image acquisition system, and control method and device thereof
CN107609383A (zh) * 2017-10-26 2018-01-19 深圳奥比中光科技有限公司 3D face identity authentication method and device
CN108804900A (zh) * 2018-05-29 2018-11-13 Oppo广东移动通信有限公司 Method and system for generating verification template, terminal, and computer device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4224449B2 (ja) * 2004-11-30 2009-02-12 本田技研工業株式会社 Image extraction device
KR102270674B1 (ko) * 2013-09-30 2021-07-01 삼성전자주식회사 Biometric camera
US20150339471A1 (en) 2014-05-23 2015-11-26 Texas Instruments Incorporated Device unlock with three dimensional (3d) captures
CN104700268B (zh) 2015-03-30 2018-10-16 中科创达软件股份有限公司 Mobile payment method and mobile device
JP6878826B2 (ja) * 2016-10-20 2021-06-02 富士通株式会社 Imaging device
CN106548152A (zh) * 2016-11-03 2017-03-29 厦门人脸信息技术有限公司 Near-infrared three-dimensional face unlocking device
CN107341473B (zh) 2017-07-04 2018-07-06 深圳市利众信息科技有限公司 Palm feature recognition method, palm feature recognition device, and storage medium
CN107292283A (zh) 2017-07-12 2017-10-24 深圳奥比中光科技有限公司 Hybrid face recognition method
CN107862266A (zh) * 2017-10-30 2018-03-30 广东欧珀移动通信有限公司 Image processing method and related products
CN108052878B (zh) 2017-11-29 2024-02-02 上海图漾信息科技有限公司 Face recognition device and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105513221A (zh) * 2015-12-30 2016-04-20 ATM anti-fraud device and system based on three-dimensional face recognition
CN106226977A (zh) * 2016-08-24 2016-12-14 Laser projection module, image acquisition system, and control method and device thereof
CN107609383A (zh) * 2017-10-26 2018-01-19 3D face identity authentication method and device
CN108804900A (zh) * 2018-05-29 2018-11-13 Method and system for generating verification template, terminal, and computer device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3608814A4 *

Also Published As

Publication number Publication date
US11210800B2 (en) 2021-12-28
US20210334990A1 (en) 2021-10-28
EP3608814B1 (en) 2023-06-07
CN114817895A (zh) 2022-07-29
EP3608814A1 (en) 2020-02-12
EP3608814A4 (en) 2020-07-15
CN108804900A (zh) 2018-11-13
CN108804900B (zh) 2022-04-15

Similar Documents

Publication Publication Date Title
US10051196B2 (en) Projecting light at angle corresponding to the field of view of a camera
WO2019228097A1 (zh) 验证系统、电子装置、验证方法、计算机可读存储介质及计算机设备
EP3771205A1 (en) Control method, microprocessor, computer-readable storage medium, and computer program
CN108376251B (zh) 控制方法、控制装置、终端、计算机设备和存储介质
US9330464B1 (en) Depth camera feedback
US20210084280A1 (en) Image-Acquisition Method and Image-Capturing Device
US11506963B2 (en) Systems for controlling laser projector and mobile terminals
WO2019227975A1 (zh) 激光投射器的控制系统、终端和激光投射器的控制方法
TW202001801A (zh) 影像處理方法、電腦設備和可讀儲存媒體
WO2020259385A1 (zh) 图像处理方法和装置、存储介质
US11204668B2 (en) Electronic device and method for acquiring biometric information using light of display
US11184513B2 (en) System, method and device for controlling electronic device and its target camera and laser projector
WO2019228107A1 (zh) 验证模板的生成方法和生成系统、终端和计算机设备
WO2019233168A1 (zh) 验证方法、验证装置、电子设备和计算机可读存储介质
WO2020019682A1 (zh) 激光投射模组、深度获取装置和电子设备
WO2019228020A1 (zh) 激光投射器的控制系统和移动终端
CN108931202B (zh) 检测方法和装置、电子装置、计算机设备和可读存储介质
CN108763903B (zh) 验证装置和电子设备
EP3462735A1 (en) Light field based projector calibration method and system
CN108736312B (zh) 结构光投射器的控制系统、结构光投射组件和电子装置
CN108848207B (zh) 光电投射模组的控制系统及控制方法和终端
CN108763902A (zh) 验证方法、验证系统、终端、计算机设备和可读存储介质
CN108833884A (zh) 深度校准方法及装置、终端、可读存储介质及计算机设备
US10666917B1 (en) System and method for image projection
US20150365652A1 (en) Depth camera system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019794870

Country of ref document: EP

Effective date: 20191111

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19794870

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE