WO2019174436A1 - Control method, control device, depth camera and electronic device - Google Patents

Control method, control device, depth camera and electronic device

Info

Publication number
WO2019174436A1
Authority
WO
WIPO (PCT)
Prior art keywords
array
current distance
distance
target number
user
Prior art date
Application number
PCT/CN2019/075390
Other languages
English (en)
French (fr)
Inventor
韦怡
张学勇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201810200433.XA external-priority patent/CN108509867B/zh
Priority claimed from CN201810200875.4A external-priority patent/CN108333860B/zh
Priority claimed from CN201810202149.6A external-priority patent/CN108594451B/zh
Priority claimed from CN201810201627.1A external-priority patent/CN108227361B/zh
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to EP19742274.4A priority Critical patent/EP3567427B1/en
Priority to US16/451,737 priority patent/US11441895B2/en
Publication of WO2019174436A1 publication Critical patent/WO2019174436A1/zh


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/2006Lamp housings characterised by the light source
    • G03B21/2013Plural light sources
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/2006Lamp housings characterised by the light source
    • G03B21/2033LED or laser light sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings

Definitions

  • the present invention relates to the field of imaging technologies, and in particular, to a method for controlling a laser projection module, a control device for a laser projection module, a depth camera, and an electronic device.
  • the laser projection module can project a laser with predetermined pattern information onto a target user located in the space, and an imaging device (such as an infrared camera) then captures an image of the laser reflected by the target user, to further obtain a depth image of the target user.
  • Embodiments of the present invention provide a laser projection module control method, a laser projection module control device, a depth camera, and an electronic device.
  • the laser projection module of the embodiment of the present invention includes a laser emitter. The laser emitter includes a plurality of point light sources, and the plurality of point light sources form a plurality of light-emitting arrays that are independently controlled. The control method includes: obtaining a current distance between the laser projection module and a user; determining a target number of light-emitting arrays according to the current distance; and turning on the point light sources of the target number of light-emitting arrays.
  • the laser projection module includes a laser emitter. The laser emitter includes a plurality of point light sources, the plurality of point light sources form a plurality of light-emitting arrays, and the light-emitting arrays are independently controlled. The control device includes an acquisition module, a determination module, and an opening module.
  • the acquiring module is configured to acquire a current distance between the laser projection module and a user.
  • the determining module is configured to determine a target number of light-emitting arrays according to the current distance.
  • the opening module is configured to turn on the point light sources of the target number of light-emitting arrays.
  • the depth camera of the embodiment of the invention includes an image collector and a laser projection module.
  • the laser projection module includes a laser emitter, the laser emitter includes a plurality of point light sources, and the plurality of point light sources form a plurality of light-emitting arrays that are independently controlled. The depth camera further includes a processor configured to: acquire a current distance between the laser projection module and a user; determine a target number of light-emitting arrays according to the current distance; and turn on the point light sources of the target number of light-emitting arrays.
  • An electronic device of an embodiment of the present invention includes a housing and the depth camera described above, the depth camera being disposed within the housing and exposed from the housing to acquire a depth image.
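  • To make the three-step control flow above concrete, the following is a minimal sketch in Python; the hook functions (estimate_distance, lookup_target_number, set_array_enabled) are illustrative assumptions rather than part of the disclosure.

```python
# Minimal sketch of the disclosed control flow: measure the current
# distance, map it to a target number of light-emitting arrays, and
# switch exactly that many arrays on. All hooks are hypothetical.
def control_laser_module(estimate_distance, lookup_target_number,
                         set_array_enabled, total_arrays: int) -> None:
    distance_cm = estimate_distance()        # step 1: current distance
    n = lookup_target_number(distance_cm)    # step 2: target number
    for i in range(total_arrays):            # step 3: turn arrays on/off
        set_array_enabled(i, i < n)
```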
  • FIG. 1 is a schematic structural view of an electronic device according to some embodiments of the present invention.
  • FIG. 2 is a schematic block diagram of a depth camera according to some embodiments of the present invention.
  • FIG. 3 is a schematic structural view of a laser projection module according to some embodiments of the present invention.
  • FIG. 4 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 5 is a schematic diagram of a laser emitter arranged in a plurality of fan arrays in a laser projection module according to some embodiments of the present invention.
  • FIG. 6 is a block diagram of a control device of a laser projection module according to some embodiments of the present invention.
  • FIGS. 7 and 8 are schematic flow diagrams of a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 9 is a schematic illustration of a laser emitter in a laser projection module according to some embodiments of the present invention, arranged in a circular sub-array and a plurality of annular sub-arrays.
  • FIG. 10 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 11 is a block diagram of a first acquisition module in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 12 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 13 is a block diagram of a determining unit in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 14 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 15 is a block diagram of a determining unit in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 16 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 17 is a block diagram of a determining unit in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 18 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 19 is a block diagram of a first acquisition module in a control device for a laser projection module according to some embodiments of the present invention.
  • FIGS. 20 to 24 are schematic diagrams showing the arrangement of laser emitters in a plurality of fan-shaped arrays in a laser projection module according to some embodiments of the present invention.
  • FIG. 25 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 26 is a schematic diagram of a laser emitter in a laser projection module according to some embodiments of the present invention, arranged in a square sub-array and a plurality of annular sub-arrays.
  • FIGS. 27 to 29 are schematic flow charts of a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 30 is a block diagram of a control device of a laser projection module according to some embodiments of the present invention.
  • FIG. 31 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 32 is a block diagram of a correction module in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 33 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 34 is a block diagram of a correction module in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 35 is a flow chart showing a method of controlling a laser projection module according to some embodiments of the present invention.
  • FIG. 36 is a block diagram of a correction module in a control device for a laser projection module according to some embodiments of the present invention.
  • FIG. 40 is a schematic diagram showing a laser emitter arranged in a "field" shape in a laser projection module according to some embodiments of the present invention.
  • first and second are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated.
  • features defining “first” or “second” may include one or more of the described features either explicitly or implicitly.
  • the meaning of "a plurality" is two or more, unless specifically defined otherwise.
  • the present invention provides an electronic device 3000.
  • the electronic device 3000 includes a housing 2000 and a depth camera 1000.
  • the depth camera 1000 is disposed within the housing 2000 and exposed from the housing 2000 to acquire a depth image.
  • the electronic device 3000 can be a mobile phone, a tablet computer, a notebook computer, a smart watch, a smart bracelet, smart glasses, a smart helmet, and the like.
  • the depth camera 1000 includes an image collector 200, a laser projection module 100, and a processor 300.
  • the image collector 200 can be used to acquire a laser pattern, and the image collector 200 can be an infrared camera.
  • the processor 300 can be used to process a laser pattern to obtain a depth image.
  • the laser projection module 100 includes a laser emitter 10, a collimating element 20, a diffractive element 30, a lens barrel 40, a substrate assembly 50, and a protective cover 60.
  • the laser emitter 10 is a Vertical-Cavity Surface-Emitting Laser (VCSEL). It includes a plurality of point light sources 101 (shown in FIG. 5) and is used to emit laser light.
  • the collimating element 20 is used to collimate the laser light emitted by the laser emitter 10, and the diffractive element 30 is used to diffract the collimated laser light of the collimating element 20 to form a laser pattern.
  • the lens barrel 40 is disposed on the substrate assembly 50.
  • the side wall 41 of the lens barrel 40 and the substrate assembly 50 enclose a receiving cavity 42.
  • the substrate assembly 50 includes a substrate 52 and a circuit board 51 carried on the substrate 52.
  • the circuit board 51 is provided with a through hole 511, and the laser emitter 10 is carried on the substrate 52 and housed in the through hole 511.
  • the collimating element 20 and the diffractive element 30 are sequentially arranged in the light emitting direction of the laser emitter 10.
  • the side wall 41 of the lens barrel 40 extends toward the center of the receiving cavity 42 to form a carrier 411, on which the diffractive element 30 is carried.
  • the protective cover 60 may be made of a light-transmissive material such as glass, Polymethyl Methacrylate (PMMA), Polycarbonate (PC), or Polyimide (PI). Since glass, PMMA, PC, and PI all have excellent light-transmitting properties, the protective cover 60 need not be provided with a light-transmissive aperture. In this manner, the protective cover 60 can prevent the diffractive element 30 from being exposed outside the lens barrel 40 and from falling off, thereby making the diffractive element 30 waterproof and dustproof.
  • alternatively, the protective cover 60 may be provided with a light-transmissive aperture aligned with the optically effective area of the diffractive element 30, to avoid blocking the optical path of the diffractive element 30.
  • the present invention also provides a control method for the laser projection module 100 described above.
  • the laser emitter 10 in the laser projection module 100 includes a plurality of point sources 101.
  • the plurality of point light sources 101 form a plurality of light emitting arrays 111, and the plurality of light emitting arrays 111 can be independently controlled.
  • The control method includes: 001: obtaining the current distance between the laser projection module 100 and the user; 002: determining the target number of light-emitting arrays 111 according to the current distance; and 003: turning on the point light sources 101 of the target number of light-emitting arrays 111.
  • the present invention also provides a control device 80 for the laser projection module 100 described above.
  • the laser emitter 10 in the laser projection module 100 includes a plurality of point sources 101.
  • the plurality of point light sources 101 form a plurality of light emitting arrays 111, and the plurality of light emitting arrays 111 can be independently controlled.
  • the control device 80 includes a first acquisition module 81, a determination module 82, and an opening module 83.
  • Step 001 can be implemented by the first obtaining module 81
  • step 002 can be implemented by the determining module 82
  • step 003 can be implemented by the opening module 83. That is to say, the first obtaining module 81 can be used to obtain the current distance between the laser projection module 100 and the user.
  • the determination module 82 can be configured to determine the target number of the illumination arrays 111 based on the current distance.
  • the opening module 83 can be used to turn on the point light sources 101 of the target number of light-emitting arrays 111.
  • step 001, step 002, and step 003 can also be implemented by the processor 300. That is to say, the processor 300 can also be used to obtain the current distance between the laser projection module 100 and the user, determine the target number of light-emitting arrays 111 according to the current distance, and turn on the point light sources 101 of the target number of light-emitting arrays 111.
  • typically, when the laser projection module 100 is turned on, all the point light sources 101 are turned on at once. If the user is too close to the laser projection module 100, the laser light emitted by the laser emitter 10 after all the point light sources 101 are turned on has a high energy, which may be harmful to the user's eyes.
  • the control method of the laser projection module 100, the control device 80 of the laser projection module 100, the depth camera 1000, and the electronic device 3000 of the embodiments of the present invention arrange the point light sources 101 in the laser projection module 100 into a plurality of independently controllable light-emitting arrays 111, and turn on only the point light sources 101 of the target number of light-emitting arrays 111 corresponding to the detected current distance. This avoids the problem that, with all the point light sources 101 turned on while the user is too close to the laser projection module 100, the energy emitted by the laser emitter 10 is too high and endangers the user's eyes.
  • the plurality of light-emitting arrays 111 are a plurality of fan-shaped arrays 111, and the plurality of fan-shaped arrays 111 enclose a circular array 11.
  • the plurality of sector arrays 111 are independently controlled.
  • in this case, step 001 of obtaining the current distance between the laser projection module 100 and the user is: 01: obtaining the current distance between the laser projection module 100 and the user.
  • step 002 of determining the target number of light-emitting arrays 111 according to the current distance is: 02: determining the target number of fan-shaped arrays 111 according to the current distance.
  • step 003 of turning on the point light sources 101 of the target number of light-emitting arrays 111 is: 03: turning on the point light sources 101 of the target number of fan-shaped arrays 111.
  • the first obtaining module 81 can be used to obtain the current distance between the laser projection module 100 and the user.
  • the determination module 82 can be configured to determine the target number of the fan arrays 111 based on the current distance.
  • the opening module 83 can be used to turn on the point light sources 101 of the target number of fan-shaped arrays 111.
  • the processor 300 can be used to obtain the current distance between the laser projection module 100 and the user, determine the target number of fan-shaped arrays 111 according to the current distance, and turn on the point light sources 101 of the target number of fan-shaped arrays 111.
  • the optically effective area of the collimating element 20 is generally circular.
  • if the plurality of point light sources 101 were arranged in a rectangle, the circular optically effective area would have to cover the rectangular arrangement; that is, its diameter would have to be larger than the diagonal of the rectangle formed by the point light sources 101, which wastes part of the space.
  • the plurality of light-emitting arrays 111 are a plurality of sub-arrays 111.
  • a plurality of sub-arrays 111 enclose a circular array 11.
  • the plurality of sub-arrays 111 include a circular sub-array 113 and annular sub-arrays 112.
  • the number of circular sub-arrays 113 is one, and the number of annular sub-arrays 112 is one or more.
  • the plurality of sub-arrays 111 can be independently controlled.
  • in this case, step 001 of obtaining the current distance between the laser projection module 100 and the user is: 01: obtaining the current distance between the laser projection module 100 and the user.
  • step 002 of determining the target number of light-emitting arrays 111 according to the current distance is: 02: determining the target number of sub-arrays 111 according to the current distance.
  • step 003 of turning on the point light sources 101 of the target number of light-emitting arrays 111 is: 03: turning on the point light sources 101 of the target number of sub-arrays 111.
  • the first obtaining module 81 can be used to obtain the current distance between the laser projection module 100 and the user.
  • the determination module 82 can be configured to determine the target number of sub-arrays 111 based on the current distance.
  • the opening module 83 can be used to turn on the point light sources 101 of the target number of sub-arrays 111.
  • the processor 300 can be configured to acquire the current distance between the laser projection module 100 and the user, determine the target number of sub-arrays 111 according to the current distance, and turn on the point light sources 101 of the target number of sub-arrays 111.
  • the optically effective area of the collimating element 20 is generally circular. In this case, if the plurality of point light sources 101 were arranged in a rectangle, the circular optically effective area would have to cover the rectangular arrangement, i.e., its diameter would have to be larger than the diagonal of the rectangle formed by the point light sources 101, which wastes part of the space.
  • in this embodiment, the plurality of point light sources 101 form a plurality of sub-arrays 111 arranged in a circular array 11, so that the shape of the laser emitter 10 corresponds to the circular optically effective area of the collimating element 20 and the space is fully utilized.
  • obtaining the current distance between the laser projection module 100 and the user in step 01 includes: 011: acquiring a face image of the user; 012: processing the face image to determine a first ratio of the user's face to the face image; and
  • 013: determining the current distance according to the first ratio.
  • the first acquisition module 81 includes an acquisition unit 811, a processing unit 812, and a determination unit 813.
  • Step 011 can be implemented by the obtaining unit 811
  • step 012 can be implemented by the processing unit 812
  • step 013 can be implemented by the determining unit 813. That is to say, the obtaining unit 811 can be used to acquire a face image of the user.
  • the processing unit 812 can be configured to process the face image to determine a first ratio of the user's face to the face image.
  • the determining unit 813 is operative to determine the current distance according to the first ratio.
  • steps 011, 012, and 013 can all be implemented by the processor 300. That is to say, the processor 300 can also be used to acquire a face image of the user, process the face image to determine a first ratio of the face of the user to the face image, and determine the current distance according to the first ratio.
  • the face image is captured by the image collector 200, the processor 300 is electrically connected to the image collector 200, and the face image is read from the image collector 200.
  • specifically, the face region and the background region in the face image may first be segmented according to the extraction and analysis of facial feature points, and the ratio of the number of pixels in the face region to the number of pixels in the whole face image is then calculated to obtain the first ratio. It can be understood that when the first ratio is large, the user is closer to the image collector 200, that is, closer to the laser projection module 100, and the current distance is smaller. At this time, the laser projection module 100 needs to turn on the point light sources 101 of a smaller target number of light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111), to prevent the projected laser from being too strong and burning the user.
  • conversely, when the first ratio is small, the user is farther away and the current distance is larger. The laser projection module 100 then needs to project laser light with higher power, that is, to turn on the point light sources 101 of a larger target number of light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111).
  • when the face image contains a plurality of faces, the face with the largest area is selected as the face region for calculating the first ratio, and the regions occupied by the other faces are treated as part of the background region.
  • the current distance and the first ratio can be calibrated in advance. Specifically, the user is instructed to capture a face image at a preset current distance, the calibration ratio corresponding to that face image is calculated, and the correspondence between the preset current distance and the calibration ratio is stored, so that in subsequent use the current distance can be calculated from the actually measured ratio.
  • for example, the user is directed to capture a face image at a current distance of 30 cm, and the calibration ratio corresponding to this face image is calculated to be 45%.
  • in actual measurement, if the first ratio is calculated as R, then by the properties of similar triangles D / 30 cm = 45% / R, i.e., D = 30 cm × 45% / R, where D is the actual current distance calculated from the actually measured first ratio R.
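  • As a sketch of this calibration relation (assuming the 30 cm / 45% example above), the distance follows directly from the stored calibration pair:

```python
# Similar-triangles relation D = D_cal * R_cal / R, using the 30 cm / 45%
# calibration example from the text.
CAL_DISTANCE_CM = 30.0
CAL_RATIO = 0.45  # face pixels / image pixels at calibration

def distance_from_first_ratio(first_ratio: float) -> float:
    return CAL_DISTANCE_CM * CAL_RATIO / first_ratio

# e.g. a measured first ratio of 0.45 gives 30 cm; 0.225 gives 60 cm.
```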
  • the current distance between the user and the laser projection module 100 can be more objectively reflected.
  • in some embodiments, step 013 of determining the current distance according to the first ratio includes:
  • 0131: calculating a second ratio of a preset feature region of the face to the face in the face image.
  • 0132: calculating the current distance according to the first ratio and the second ratio.
  • the determining unit 813 includes a first calculating sub-unit 8131 and a second calculating sub-unit 8132.
  • step 0131 can be implemented by the first calculating sub-unit 8131
  • step 0132 can be implemented by the second calculating sub-unit 8132.
  • the first calculation sub-unit 8131 can be used to calculate a second ratio of the preset feature area of the face in the face image to the face.
  • the second calculation sub-unit 8132 can be configured to calculate the current distance according to the first ratio and the second ratio.
  • steps 0131 and 0132 may also be implemented by the processor 300. That is to say, the processor 300 can also be used to calculate a second ratio of the preset feature region of the face to the face in the face image, and calculate the current distance according to the first ratio and the second ratio.
  • the second ratio is the proportion of a preset feature region of the face to the face. The preset feature region may be chosen as a feature with small variation between individuals, for example, the distance between the user's eyes.
  • when the second ratio is large, the user's face is small, and the current distance calculated from the first ratio alone is too large; when the second ratio is small, the user's face is large, and the current distance calculated from the first ratio alone is too small.
  • the first ratio, the second ratio, and the current distance may be calibrated in advance.
  • specifically, the user is instructed to first capture a face image at a preset current distance, the first calibration ratio and the second calibration ratio corresponding to that face image are calculated, and the correspondence between the preset current distance, the first calibration ratio, and the second calibration ratio is stored, so that in subsequent use the current distance can be calculated from the actually measured first and second ratios.
  • for example, the user is directed to capture a face image at a current distance of 25 cm, and the first calibration ratio corresponding to this face image is calculated to be 50% and the second calibration ratio to be 10%. In actual measurement, when the calculated first ratio is R1 and the second ratio is R2, by the properties of similar triangles D1 / 25 cm = 50% / R1, where D1 is the initial current distance calculated from the actually measured first ratio R1. The current distance can then be further corrected according to the relationship D2 = D1 × 10% / R2, where D2, calculated from the actually measured second ratio R2, is taken as the final current distance.
  • the current distance calculated according to the first ratio and the second ratio takes into account individual differences between different users, and a more objective current distance can be obtained.
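  • A sketch of the two-ratio estimate, using the 25 cm / 50% / 10% calibration example; the correction step is a plausible reading of the relation described above, not a verbatim formula from the disclosure:

```python
CAL_DISTANCE_CM = 25.0
CAL_R1 = 0.50  # first calibration ratio (face / image)
CAL_R2 = 0.10  # second calibration ratio (feature region / face)

def corrected_distance(r1: float, r2: float) -> float:
    d1 = CAL_DISTANCE_CM * CAL_R1 / r1  # initial estimate from R1 alone
    d2 = d1 * CAL_R2 / r2               # shrink when the face is small (R2 large)
    return d2
```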
  • in some embodiments, step 013 of determining the current distance according to the first ratio includes:
  • 0133: determining whether the user wears glasses according to the face image.
  • 0134: calculating the current distance according to the first ratio and a preset distance coefficient when the user wears glasses.
  • the determining unit 813 includes a first judging sub-unit 8133 and a third calculating sub-unit 8134.
  • step 0133 can be implemented by the first judging sub-unit 8133
  • step 0134 can be implemented by the third calculating sub-unit 8134. That is to say, the first judging sub-unit 8133 can be used to determine whether the user wears glasses according to the face image.
  • the third calculating sub-unit 8134 can be configured to calculate the current distance according to the first ratio and the preset distance coefficient when the user wears the glasses.
  • steps 0133 and 0134 may also be implemented by the processor 300. That is to say, the processor 300 can be used to determine whether the user wears glasses according to the face image, and to calculate the current distance according to the first ratio and the preset distance coefficient when the user wears glasses.
  • when the user wears glasses, turning on the point light sources 101 of a smaller target number of light-emitting arrays 111 causes the laser projection module 100 to project less energy, avoiding damage to the user's eyes.
  • the preset distance coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, or 0.95. For example, the initial current distance calculated from the first ratio, or the corrected current distance calculated from the first ratio and the second ratio, is multiplied by the distance coefficient to obtain the final current distance, and the target number is then determined according to this current distance. In this way, the power of the projected laser is prevented from being too large for a user suffering from eye disease or poor vision.
  • the distance coefficient may be unfixed.
  • the distance coefficient may be self-adjusting according to the intensity of visible light or infrared light in the environment.
  • the face image collected by the image collector 200 is an infrared image, and the average value of the infrared light intensity of all the pixels of the face image may be calculated first, and different average values correspond to different distance coefficients. The larger the average value, the smaller the distance coefficient is. The smaller the average, the larger the distance coefficient.
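  • A small sketch of this correction; the fixed coefficient and the infrared-intensity breakpoints are illustrative assumptions:

```python
def distance_coefficient(wears_glasses: bool, mean_ir: float | None = None) -> float:
    """Preset coefficient in (0, 1) applied when the user wears glasses;
    optionally chosen from the mean infrared intensity of the face image
    (larger mean -> smaller coefficient, per the text)."""
    if not wears_glasses:
        return 1.0
    if mean_ir is None:
        return 0.8                                  # fixed preset value
    return 0.6 if mean_ir > 180 else 0.8 if mean_ir > 90 else 0.95

def distance_with_glasses(distance_cm: float, wears_glasses: bool,
                          mean_ir: float | None = None) -> float:
    return distance_cm * distance_coefficient(wears_glasses, mean_ir)
```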
  • in some embodiments, step 013 of determining the current distance according to the first ratio includes: 0135: judging the age of the user according to the face image; and
  • 0136: adjusting the current distance according to the first ratio and the age.
  • the determining unit 813 further includes a second judging sub-unit 8135 and an adjusting sub-unit 8136.
  • step 0135 can be implemented by the second judging sub-unit 8135, and step 0136 can be implemented by the adjusting sub-unit 8136. That is to say, the second judging sub-unit 8135 can be used to judge the age of the user from the face image.
  • Adjustment sub-unit 8136 can be used to adjust the current distance based on the first ratio and age.
  • steps 0135 and 0136 may also be implemented by the processor 300. That is to say, the processor 300 can also be used to determine the age of the user according to the face image, and adjust the current distance according to the first ratio and the age.
  • specifically, the number, distribution, area, and the like of the feature points of facial wrinkles in the face image may be extracted to determine the age of the user; for example, the number of wrinkles at the corners of the eyes may be extracted to determine the user's age, optionally further combined with the amount of wrinkles on the user's forehead.
  • after determining the user's age, a proportional coefficient can be obtained according to the age; specifically, the correspondence between age and proportional coefficient can be found in a lookup table.
  • for example, when the age is below 15, the proportional coefficient is 0.6; when the age is 15 to 20, the coefficient is 0.8; when the age is 20 to 45, the coefficient is 1.0; when the age is 45 or older, the coefficient is 0.8.
  • subsequently, the initial current distance calculated from the first ratio, or the corrected current distance calculated from the first ratio and the second ratio, may be multiplied by the proportional coefficient to obtain the final current distance, and the target number of light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) is then determined based on that current distance. In this way, the power of the projected laser is prevented from being too large and harming a young or elderly user.
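  • A sketch of the age lookup table quoted above (the under-15 bracket is inferred from context):

```python
def age_coefficient(age: int) -> float:
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age < 45:
        return 1.0
    return 0.8

def age_adjusted_distance(distance_cm: float, age: int) -> float:
    return distance_cm * age_coefficient(age)
```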
  • in some embodiments, the plurality of light-emitting arrays 111 are a plurality of fan-shaped arrays 111, or the plurality of light-emitting arrays 111 are a plurality of sub-arrays 111 including the circular sub-array 113 and the annular sub-arrays 112.
  • in this case, obtaining the current distance between the laser projection module 100 and the user in step 01 includes: 014: transmitting a detection signal to the user; and 015: calculating the current distance based on the detection signal reflected back by the user.
  • the first acquisition module 81 includes a transmitting unit 814 and a first calculating unit 815.
  • step 014 can be implemented by the transmitting unit 814, and step 015 can be implemented by the first calculating unit 815. That is, the transmitting unit 814 can be used to transmit a detection signal to the user.
  • the first calculation unit 815 can be configured to calculate the current distance based on the detection signal reflected back by the user.
  • step 014 can be implemented by laser projection module 100
  • step 015 can be implemented by processor 300. That is to say, the laser projection module 100 can be used to transmit a detection signal to a user.
  • the processor 300 is operative to calculate a current distance based on the detection signal reflected back by the user.
  • when transmitting the detection signal, the laser projection module 100 only turns on the point light sources 101 in one of the light-emitting arrays 111 (a fan-shaped array 111 or a sub-array 111); that is, only the point light sources 101 in that one light-emitting array 111 emit laser light.
  • since only the point light sources 101 in one light-emitting array 111 (fan-shaped array 111 or sub-array 111) are turned on to detect the current distance, the laser light emitted by the laser projection module 100 has low energy and does not harm the user's eyes. After the current distance between the user and the laser projection module 100 is roughly measured, the target number of light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) is determined according to the current distance; the laser light then emitted by the laser projection module 100 can meet the accuracy requirements of depth image acquisition without harming the user's eyes.
  • in some embodiments, the plurality of light-emitting arrays 111 are a plurality of fan-shaped arrays 111, or a plurality of sub-arrays 111 including a circular sub-array 113 and annular sub-arrays 112.
  • when the current distance is in the first distance interval, the point light sources 101 of a first target number of light-emitting arrays 111 are turned on.
  • when the current distance is in the second distance interval, the point light sources 101 of a second target number of light-emitting arrays 111 are turned on.
  • when the current distance is in the third distance interval, the point light sources 101 of a third target number of light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) are turned on.
  • the second distance interval is located between the first distance interval and the third distance interval; that is, the maximum distance in the first distance interval is less than or equal to the minimum distance in the second distance interval, and the maximum distance in the second distance interval is less than the minimum distance in the third distance interval.
  • the second target number is greater than the first target number and less than the third target number.
  • for example, the point light sources 101 in the laser projection module 100 form six light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111), the first distance interval is [0 cm, 15 cm], the second distance interval is (15 cm, 40 cm], the third distance interval is (40 cm, ∞), the first target number is 2, the second target number is 4, and the third target number is 6.
  • when the detected current distance is in [0 cm, 15 cm], the point light sources 101 of two light-emitting arrays 111 are turned on; when the detected current distance is in (15 cm, 40 cm], the point light sources 101 of four light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) are turned on; when the detected current distance is in (40 cm, ∞), the point light sources 101 of six light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) are turned on.
  • the larger the target number, the greater the number of point light sources 101 of the opened light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111).
  • when the current distance between the user and the laser projection module 100 is small, the point light sources 101 of fewer light-emitting arrays 111 are turned on, preventing the laser energy emitted by the laser projection module 100 from being too large and endangering the user's eyes.
  • when the current distance between the user and the laser projection module 100 is large, the point light sources 101 of more light-emitting arrays 111 (fan-shaped arrays 111 or sub-arrays 111) are turned on, so that the image collector 200 receives laser light of sufficient energy and the depth image is acquired with higher accuracy.
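  • The interval-to-target-number mapping of this worked example can be written as a simple table lookup:

```python
import bisect

# Six arrays; intervals [0, 15], (15, 40], (40, inf) map to 2, 4, 6.
BOUNDS_CM = [15.0, 40.0]
TARGETS = [2, 4, 6]

def target_number(distance_cm: float) -> int:
    return TARGETS[bisect.bisect_left(BOUNDS_CM, distance_cm)]

# target_number(10) -> 2, target_number(40) -> 4, target_number(50) -> 6
```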
  • in some embodiments, when the target number of fan-shaped arrays 111 is turned on, the opened fan-shaped arrays 111 are distributed centrally symmetrically about the center of the laser emitter 10.
  • for example, the number of fan-shaped arrays 111 is four and the target number is two; since four is a multiple of two, the two opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • for another example, the number of fan-shaped arrays 111 is six and the target number is three; then between every two adjacent opened fan-shaped arrays 111 there is one unopened fan-shaped array 111, and the three opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • for example, as shown in FIG. 21, the number of fan-shaped arrays 111 is nine and the target number is three; then between every two adjacent opened fan-shaped arrays 111 there are two unopened fan-shaped arrays 111, and the three opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • since the opened fan-shaped arrays 111 are centrally symmetric, the laser light emitted by their point light sources 101, after passing through the collimating element 20 and the diffractive element 30, covers a larger field of view with uniform illumination, which is advantageous for improving the accuracy of the acquired depth image.
  • in other embodiments, the opened fan-shaped arrays 111 are likewise distributed centrally symmetrically about the center of the laser emitter 10, with some of the opened fan-shaped arrays 111 adjacent to one another.
  • for example, the number of fan-shaped arrays 111 is six and the target number is four; two of the four opened fan-shaped arrays 111 are adjacent to each other, the remaining two are adjacent to each other, and the four opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • for another example, the number of fan-shaped arrays 111 is ten and the target number is six; three of the six opened fan-shaped arrays 111 are adjacent to each other, the remaining three are adjacent to each other, and the six opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • for another example, the number of fan-shaped arrays 111 is twelve and the target number is nine; three of the nine opened fan-shaped arrays 111 are adjacent to each other, another three are adjacent to each other, the remaining three are adjacent to each other, and the nine opened fan-shaped arrays 111 are distributed symmetrically about the center of the laser emitter 10.
  • since the opened fan-shaped arrays 111 are centrally symmetric, the laser light emitted by their point light sources 101, after passing through the collimating element 20 and the diffractive element 30, covers a larger field of view with uniform illumination, which is advantageous for improving the accuracy of the acquired depth image.
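  • A sketch of choosing a centrally symmetric set of fan-shaped arrays for the evenly spaced arrangements above (total divisible by the target number, as in the 4/2, 6/3, and 9/3 examples); the adjacent-group arrangements just described would need a different pattern:

```python
def symmetric_fan_selection(total: int, target: int) -> list[int]:
    """Open every (total // target)-th fan-shaped array, e.g.
    (6, 3) -> [0, 2, 4] and (9, 3) -> [0, 3, 6]."""
    assert total % target == 0, "sketch covers the evenly spaced case only"
    step = total // target
    return list(range(0, total, step))
```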
  • in some embodiments, when the plurality of light-emitting arrays 111 are a plurality of sub-arrays 111 including a circular sub-array 113 and annular sub-arrays 112, and the point light sources 101 of the circular sub-array 113 and of at least one annular sub-array 112 are turned on simultaneously, the point light sources 101 of a sub-array 111 farther from the center of the circular array 11 have a higher power.
  • for example, the circular array 11 of the laser emitter 10 includes four sub-arrays 111: one circular sub-array 113 and three annular sub-arrays 112.
  • the three annular sub-arrays 112 are sequentially arranged in a direction away from the center of the circular array 11, and the three annular sub-arrays 112 sequentially arranged are numbered A, B, and C, respectively.
  • when the point light sources 101 of the circular sub-array 113 and of the annular sub-arrays numbered A and B are turned on simultaneously, the voltage U_circle applied to the point light sources 101 in the circular sub-array 113 is smaller than the voltage U_A applied to the point light sources 101 in the annular sub-array A, and U_A is smaller than the voltage U_B applied to the point light sources 101 in the annular sub-array B, that is, U_circle < U_A < U_B. Similarly, when the point light sources 101 of the circular sub-array 113 and of the annular sub-arrays A, B, and C are turned on simultaneously, the applied voltages satisfy U_circle < U_A < U_B < U_C.
  • the laser light emitted by the laser emitter 10 is more concentrated at the center of the circular array 11 when passing through the diffractive element 30. Since the diffraction ability of the diffractive element 30 is limited, part of the beam is not diffracted but exits directly; this directly emitted laser light does not undergo the diffraction attenuation of the diffractive element 30, so its energy is high and it is very likely to harm the user's eyes.
  • therefore, reducing the power of the sub-arrays 111 closer to the center of the circular array 11 avoids the problem that too much laser light concentrated at the center of the circular array 11 exits directly without being diffracted and endangers the user's eyes.
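  • A sketch of this centre-weighted drive scheme; the base voltage and per-ring step are illustrative assumptions:

```python
def drive_voltages(num_subarrays: int, base_v: float = 1.8,
                   step_v: float = 0.2) -> list[float]:
    """Index 0 is the central circular sub-array; higher indices are rings
    farther from the centre, driven at increasing voltage
    (U_circle < U_A < U_B < U_C)."""
    return [base_v + i * step_v for i in range(num_subarrays)]

# drive_voltages(4) -> [1.8, 2.0, 2.2, 2.4] for circle, A, B, C
```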
  • in some embodiments, when the plurality of light-emitting arrays 111 are a plurality of sub-arrays 111 including a circular sub-array 113 and an annular sub-array 112: when the current distance is in the first distance interval, the point light sources 101 of the annular sub-array 112 are turned on; when the current distance is in the second distance interval, the point light sources 101 of the circular sub-array 113 are turned on. The maximum value of the first distance interval is smaller than the minimum value of the second distance interval.
  • for example, the circular array 11 of the laser emitter 10 includes two sub-arrays 111: a circular sub-array 113 and an annular sub-array 112. The first distance interval is [0 cm, 15 cm], the second distance interval is (15 cm, 40 cm], the third distance interval is (40 cm, ∞), the first target number is 1, the second target number is 1, and the third target number is 2.
  • in other words, the sub-arrays 111 are turned on in order from the annular sub-array 112 toward the circular sub-array 113, that is, in a direction approaching the center of the circular array 11.
  • this avoids the problem that, when the current distance is small, turning on the circular sub-array 113 or an annular sub-array 112 near the center of the circular array 11 produces excessive laser energy that exits directly without being diffracted by the diffractive element 30 and endangers the user's eyes.
  • in some embodiments, obtaining the current distance between the laser projection module 100 and the user in step 001 includes:
  • 04: turning on a predetermined number of light-emitting arrays 111 to detect the current distance between the user and the laser projection module 100.
  • step 04 can be implemented by the first acquisition module 81. That is to say, the first obtaining module 81 can be used to turn on a predetermined number of the light emitting arrays 111 to detect the current distance between the user and the laser projection module 100.
  • step 04 can also be implemented by processor 300. That is to say, the processor 300 can also be used to turn on a predetermined number of the light-emitting arrays 111 to detect the current distance between the user and the laser projection module 100.
  • the laser light projected by the laser projection module 100 is an infrared laser, and the current distance between the user and the laser projection module 100 during operation of the laser projection module 100 is unknown. Therefore, if the energy of the infrared laser is improperly controlled, it may be too high and damage the user's eyes.
  • in this embodiment, the point light sources 101 in the laser emitter 10 are divided into a plurality of independently controllable light-emitting arrays 111. When the laser projection module 100 is in operation, a predetermined number of light-emitting arrays 111 are first turned on to detect the current distance between the user and the laser projection module 100, and the target number of light-emitting arrays 111 that need to be turned on is then determined according to that current distance. This prevents the number of opened light-emitting arrays 111 from being too small, which would make the brightness of the laser pattern collected by the image collector 200 too low and reduce the accuracy of depth image acquisition, and also prevents the number from being too large, which would make the emitted laser energy excessive and harmful to the user's eyes.
  • the predetermined number of light-emitting arrays 111 first opened during operation of the laser projection module 100 can be obtained from empirical data; it is chosen so that, on the one hand, turning on the predetermined number of light-emitting arrays 111 suffices to measure the current distance between the user and the laser projection module 100, and on the other hand, does not pose a hazard to the user's eyes.
  • the predetermined number of the light emitting arrays 111 varies depending on the type of the electronic device 3000 and the total number of the light emitting arrays 111.
  • for example, when the electronic device 3000 is a mobile phone, the laser projection module 100 is often used to assist in acquiring a 3D face image for face-recognition unlocking, and the current distance between the user and the laser projection module 100 is usually small. If the total number of light-emitting arrays 111 is six, the predetermined number may be two; if the total number is twelve, the predetermined number may be three. Thus, on the one hand, the current distance between the user and the laser projection module 100 can be roughly measured, and on the other hand, the problem of excessive laser energy can be avoided. For another example, when the electronic device 3000 is a somatosensory gaming device, the current distance between the user and the laser projection module 100 is generally large.
  • in this case, the predetermined number can be eight; again, the current distance between the user and the laser projection module 100 can be roughly measured on the one hand, and the problem of excessive laser energy avoided on the other.
  • in some embodiments, obtaining the current distance between the laser projection module 100 and the user in step 001 includes: 06: acquiring a first image and a second image of the user; and 07: calculating the current distance between the user and the laser projection module 100 according to the first image and the second image.
  • both step 06 and step 07 can be implemented by the first obtaining module 81. That is to say, the first obtaining module 81 can be configured to acquire the first image and the second image of the user, and calculate the current distance between the user and the laser projection module 100 according to the first image and the second image.
  • both step 06 and step 07 can be implemented by processor 300. That is to say, the processor 300 can also be configured to acquire the first image and the second image of the user, and calculate the current distance between the user and the laser projection module 100 according to the first image and the second image.
  • the laser projection module 100 projects the laser pattern onto the user in the space, and then the image collector 200 collects the laser pattern reflected by the user, and then uses the laser pattern and the reference laser pattern to acquire the depth image of the user.
  • the laser light projected by the laser projection module 100 is an infrared laser, and the current distance between the user and the laser projection module 100 during operation of the laser projection module 100 is unknown. Therefore, if the energy of the infrared laser is improperly controlled, it may be too high and damage the user's eyes.
  • in this embodiment, the point light sources 101 in the laser emitter 10 are divided into a plurality of independently controllable light-emitting arrays 111. When the laser projection module 100 is in operation, the first image and the second image of the user are first acquired to calculate the current distance between the user and the laser projection module 100, and the target number of light-emitting arrays 111 to be turned on is then determined according to that current distance. This prevents the number of opened light-emitting arrays 111 from being too small, which would make the brightness of the laser pattern collected by the image collector 200 too low and reduce the accuracy of depth image acquisition, and also prevents the number from being too large, which would make the emitted laser energy excessive and harmful to the user's eyes.
  • the first image may be an infrared image
  • the second image may be a visible light image (RGB image)
  • the first image may be a visible light image
  • the second image may be an infrared image
  • the visible light image may be captured by the visible light camera module 4000.
  • the infrared image can be captured by the image collector 200 in the depth camera 1000.
  • the first image and the second image may also be visible light images.
  • the electronic device 3000 includes two visible light imaging modules 4000. Taking the first image as an infrared image and the second image as a visible light image, when the laser projection module 100 is in operation, the processor 300 first turns on the image collector 200 and the visible light camera module 4000, and the image collector 200 captures the first image.
  • the visible light camera module 4000 captures a second image
  • the processor 300 reads the first image and the second image from the image collector 200 and the visible light camera module 4000.
  • the first image and the second image are used as a pair of image matching pairs, and the processor 300 calculates the current distance according to the pair of image matching pairs.
  • specifically, the processor 300 first performs binocular rectification on the first image and the second image: according to the monocular intrinsic parameters (focal length, principal point, distortion parameters) and the binocular relative pose (rotation matrix and translation vector) obtained by pre-calibrating the image collector 200 and the visible light camera module 4000, the first image and the second image are respectively undistorted and row-aligned so that they correspond strictly.
  • the processor 300 identifies the face in the first image and the second image, and determines the depth information corresponding to the face according to the matching relationship between the depth image and the first image or the second image, because the face usually occupies more The pixel points, therefore, take the median or mean of the plurality of depth information corresponding to the plurality of pixels as the final current distance.
  • of course, to further reduce the processing time of the processor 300, the user may be regarded as a point target, and the distance between the laser projection module 100 and that point target is the current distance; alternatively, a certain part of the user may be used as the point target.
  • for example, with the user's face as the point target, the distance between the laser projection module 100 and the user's face is the current distance.
  • in that case, specifically, the faces in the first image and the second image are identified first, pixel matching and depth calculation are performed only on the face portion of the first image and the face portion of the second image, and the current distance is then determined from the calculated depth information.
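The pipeline just described (rectification, row-wise matching, depth computation, face-median distance) can be sketched roughly in Python with OpenCV. This is a minimal illustration under stated assumptions, not the patent's implementation: the `StereoCalib` container, the SGBM matcher settings, and the Haar-cascade face detector stand in for the pre-calibrated parameters, the matching algorithm, and the face identification that the document does not specify.

```python
from dataclasses import dataclass

import cv2
import numpy as np

@dataclass
class StereoCalib:
    """Pre-calibrated parameters (assumed available from a prior calibration)."""
    K1: np.ndarray  # intrinsics of the image collector (focal length, principal point)
    d1: np.ndarray  # its distortion coefficients
    K2: np.ndarray  # intrinsics of the visible light camera module
    d2: np.ndarray  # its distortion coefficients
    R: np.ndarray   # rotation between the two cameras
    T: np.ndarray   # translation between the two cameras

def current_distance_from_pair(first: np.ndarray, second: np.ndarray,
                               calib: StereoCalib) -> float | None:
    h, w = first.shape[:2]
    # 1. Binocular rectification: undistort and row-align both images.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(
        calib.K1, calib.d1, calib.K2, calib.d2, (w, h), calib.R, calib.T)
    m1x, m1y = cv2.initUndistortRectifyMap(calib.K1, calib.d1, R1, P1, (w, h), cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(calib.K2, calib.d2, R2, P2, (w, h), cv2.CV_32FC1)
    left = cv2.remap(first, m1x, m1y, cv2.INTER_LINEAR)
    right = cv2.remap(second, m2x, m2y, cv2.INTER_LINEAR)
    if right.ndim == 3:  # the matcher expects single-channel 8-bit images
        right = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)

    # 2. Row-wise matching: strict row correspondence keeps the search 1-D.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=9)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    # 3. Disparity -> depth for each matched point.
    depth = cv2.reprojectImageTo3D(disparity, Q)[:, :, 2]

    # 4. Treat the face as the point target: median depth over the largest face.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(left)
    if len(faces) == 0:
        return None
    x, y, fw, fh = max(faces, key=lambda f: f[2] * f[3])
    face_depth = depth[y:y + fh, x:x + fw]
    valid = face_depth[np.isfinite(face_depth) & (face_depth > 0)]
    return float(np.median(valid)) if valid.size else None
```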
  • after obtaining the current distance in step 04, or in steps 06 and 07, the processor 300 determines the target number of light-emitting arrays 111 to be turned on according to the current distance, and then controls the laser projection module 100 to turn on that target number of light-emitting arrays 111 so as to obtain a more accurate depth image.
  • for example, when the electronic device 3000 is a mobile phone and the total number of light-emitting arrays 111 is six: if the measured current distance is relatively large, for example 15 to 20 cm, the target number may be determined to be 3 to 4 according to that distance, and the point light sources 101 of 3 to 4 light-emitting arrays 111 will be turned on; if the measured current distance is relatively small, for example 5 to 10 cm, the target number may be determined to be 1, and the point light sources 101 of 1 light-emitting array 111 will be turned on.
  • the control method further includes a step of correcting the current distance (i.e., step 05) after step 04 turns on the predetermined number of light-emitting arrays 111 to detect the current distance between the user and the laser projection module 100, or after step 07 calculates the current distance between the user and the laser projection module 100 according to the first image and the second image; specifically:
  • 051: acquiring a face image of the user;
  • 052: calculating a first ratio occupied by the face in the face image; and
  • 053: correcting the current distance according to the first ratio.
  • the control device 80 further includes a second acquisition module 851, a calculation module 852, and a correction module 853.
  • Step 051 can be implemented by the second acquisition module 851.
  • Step 052 can be implemented by the calculation module 852.
  • Step 053 can be implemented by the correction module 853. That is to say, the second obtaining module 851 can be used to acquire a face image of the user.
  • the calculation module 852 can be used to calculate the first ratio occupied by the face in the face image.
  • the correction module 853 can be used to correct the current distance according to the first ratio.
  • step 051, step 052, and step 053 can all be implemented by processor 300. That is to say, the processor 300 can also be used to acquire a face image of the user, calculate a first proportion of the face in the face image, and correct the current distance according to the first ratio.
  • specifically, the face region and the background region in the face image may first be segmented according to the extraction and analysis of facial feature points, and the ratio of the number of pixels in the face region to the number of pixels in the whole face image is then calculated to obtain the first ratio. It can be understood that when the first ratio is large, the user is close to the image collector 200, that is, close to the laser projection module 100, and the current distance is small; in this case the laser projection module 100 needs to turn on the point light sources 101 of a smaller target number of light-emitting arrays 111, so that the projected laser is not too strong and does not burn the user.
  • conversely, when the first ratio is small, the user is far from the image collector 200 and the current distance is large; the laser projection module 100 then needs to project the laser at a larger power so that the laser pattern still has suitable intensity after being projected onto the user and reflected, for forming the depth image. In this case the laser projection module 100 needs to turn on the point light sources 101 of a larger target number of light-emitting arrays 111.
  • in one example, when a single face image contains a plurality of faces, the face with the largest area among them is selected as the face region for calculating the first ratio, and the regions occupied by the other faces are treated as part of the background region.
  • the correspondence between the current distance and the first ratio may be calibrated in advance. Specifically, the user is instructed to capture a face image at a predetermined current distance, the calibration ratio corresponding to that face image is calculated, and the correspondence between the preset current distance and the calibration ratio is stored, so that the current distance can be calculated from the actually measured first ratio in subsequent use.
  • for example, the user is directed to capture a face image when the current distance is 30 cm, and the calibration ratio corresponding to that face image is calculated to be 45%. In actual measurement, when the first ratio is calculated to be R, then by the properties of similar triangles $\frac{D}{30\,\text{cm}}=\frac{45\%}{R}$, where D is the actual current distance calculated from the actually measured first ratio R.
  • in this way, the first ratio occupied by the face in the face image reflects the current distance between the user and the laser projection module 100 relatively objectively.
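As a toy illustration of this single-ratio calibration, the sketch below computes the first ratio from a binary face mask and applies the similar-triangle relation above; the constant names and the mask-based segmentation are assumptions, since the text does not fix an implementation.

```python
import numpy as np

CAL_DISTANCE_CM = 30.0  # preset current distance used at calibration time
CAL_RATIO = 0.45        # first ratio measured at that distance (45%)

def first_ratio(face_mask: np.ndarray) -> float:
    """First ratio: pixel count of the face region / pixel count of the image.

    `face_mask` marks the largest detected face; pixels of any other faces
    are treated as background, as described above.
    """
    return float(np.count_nonzero(face_mask)) / face_mask.size

def distance_from_first_ratio(r: float) -> float:
    """Similar triangles: D / 30 cm = 45% / R  =>  D = 30 cm * 0.45 / R."""
    return CAL_DISTANCE_CM * CAL_RATIO / r
```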
  • in some embodiments, step 053 of correcting the current distance according to the first ratio includes: 0531: calculating a second ratio of a preset feature region of the face to the face in the face image; and 0532: correcting the current distance according to the first ratio and the second ratio.
  • the correction module 853 includes a second calculation unit 8531 and a first correction unit 8532.
  • Step 0531 can be implemented by the second calculation unit 8531, and step 0532 can be implemented by the first modification unit 8532.
  • the second calculating unit 8531 is for calculating a second ratio of the preset feature area of the face in the face image to the face.
  • the first correcting unit 8532 is configured to correct the current distance according to the first ratio and the second ratio.
  • step 0531 and step 0532 can also be implemented by the processor 300. That is to say, the processor 300 can also be used to calculate a second ratio of the preset feature area of the face in the face image to the face, and correct the current distance according to the first ratio and the second ratio.
  • the second ratio is the proportion of a preset feature region of the face to the face; the preset feature region may be chosen as a feature whose variation between different individual users is small, for example the spacing between the user's eyes.
  • when the second ratio is large, the user's face is small, and the current distance calculated only from the first ratio is too large; when the second ratio is small, the user's face is large, and the current distance calculated only from the first ratio is too small.
  • in actual use, the first ratio, the second ratio, and the current distance may be calibrated in advance.
  • specifically, the user is instructed to first capture a face image at a predetermined current distance, the first calibration ratio and the second calibration ratio corresponding to that face image are calculated, and the correspondence between the preset current distance and the first and second calibration ratios is stored, so that the current distance can be calculated from the actually measured first and second ratios in subsequent use.
  • for example, the user is directed to capture a face image when the current distance is 25 cm, and the first calibration ratio corresponding to that face image is calculated to be 50% and the second calibration ratio to be 10%. In actual measurement, when the first ratio is calculated to be R1 and the second ratio to be R2, then by the properties of similar triangles $\frac{D_1}{25\,\text{cm}}=\frac{50\%}{R_1}$,
  • where D1 is the initial current distance calculated from the actually measured first ratio R1; the relation $\frac{D_2}{D_1}=\frac{10\%}{R_2}$ can then be used to obtain the calibrated current distance D2, further computed from the actually measured second ratio R2, which is taken as the final current distance.
  • the current distance calculated according to the first ratio and the second ratio takes individual differences between users into account, so a more objective current distance can be obtained.
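A short sketch of the two-ratio correction, using the 25 cm example above; note that the explicit form of the D2 relation is reconstructed from the similar-triangle reasoning, since the original equation image is not reproduced here.

```python
CAL_DISTANCE_CM = 25.0  # calibration distance in the example above
CAL_RATIO_1 = 0.50      # first calibration ratio (face area / image area)
CAL_RATIO_2 = 0.10      # second calibration ratio (feature region / face)

def corrected_distance(r1: float, r2: float) -> float:
    """D1 from the first ratio, then an individual-size correction from R2."""
    d1 = CAL_DISTANCE_CM * CAL_RATIO_1 / r1  # initial current distance
    d2 = d1 * CAL_RATIO_2 / r2               # calibrated current distance
    return d2
```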
  • in some embodiments, step 053 of correcting the current distance according to the first ratio includes: 0533: judging, according to the face image, whether the user wears glasses; and 0534: correcting the current distance according to the first ratio and a distance coefficient when the user wears glasses.
  • the correction module 853 includes a first determining unit 8533 and a second correcting unit 8534.
  • Step 0533 can be implemented by the first determining unit 8533.
  • Step 0534 can be implemented by the second correction unit 8534. That is to say, the first determining unit 8533 can be used to determine whether the user wears glasses according to the face image.
  • the second correcting unit 8534 can be configured to correct the current distance according to the first ratio and the distance coefficient when the user wears the glasses.
  • step 0533 and step 0534 can also be implemented by the processor 300. That is to say, the processor 300 can also be used to determine whether the user wears glasses according to the face image, and correct the current distance according to the first ratio and the distance coefficient when the user wears the glasses.
  • it can be understood that whether the user wears glasses may be used to characterize the health of the user's eyes: a user wearing glasses indicates that the eyes already suffer from an eye disease or poor vision, so when projecting laser to such a user, a smaller number of light-emitting arrays 111 should have their point light sources 101 turned on, making the energy of the laser projected by the laser projection module 100 smaller, to avoid harming the user's eyes.
  • the preset distance coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, or 0.95. For example, after the initial current distance is calculated according to the first ratio, or the calibrated current distance is calculated according to the first ratio and the second ratio, the initial or calibrated current distance is multiplied by the distance coefficient to obtain the final current distance, and the target number is determined according to that distance. In this way, the projected laser power is prevented from being too large for a user suffering from eye disease or poor vision.
  • further, the distance coefficient need not be fixed; for example, it may adjust itself according to the intensity of visible light or infrared light in the environment.
  • since the face image collected by the image collector 200 is an infrared image, the average infrared intensity over all pixels of the face image may first be calculated, with different averages corresponding to different distance coefficients: the larger the average, the smaller the distance coefficient, and the smaller the average, the larger the distance coefficient.
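One possible self-adjusting coefficient is sketched below: the mean infrared intensity of the face image is bucketed into coefficients, a larger mean mapping to a smaller coefficient. The thresholds, and the four coefficient values borrowed from the examples above, are assumptions rather than values fixed by the text.

```python
import numpy as np

def distance_coefficient(ir_face_image: np.ndarray) -> float:
    """Map mean infrared intensity to a coefficient in (0, 1]; illustrative."""
    mean_intensity = float(np.mean(ir_face_image))
    if mean_intensity > 200:
        return 0.6    # bright IR environment -> smaller coefficient
    if mean_intensity > 120:
        return 0.78
    if mean_intensity > 60:
        return 0.82
    return 0.95       # dim IR environment -> larger coefficient
```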
  • in some embodiments, step 053 of correcting the current distance according to the first ratio includes:
  • 0535: judging the age of the user according to the face image; and
  • 0536: correcting the current distance according to the first ratio and the age.
  • step 0535 can be implemented by the second judging unit 8535, and step 0536 can be implemented by the third correction unit 8536. That is to say, the second judging unit 8535 can be used to judge the age of the user from the face image.
  • the third correction unit 8536 can be used to correct the current distance based on the first ratio and age.
  • step 0535 and step 0536 can also be implemented by processor 300. That is to say, the processor 300 can also be used to determine the age of the user according to the face image, and correct the current distance according to the first ratio and age.
  • people of different ages tolerate infrared laser differently; for example, children and the elderly are more easily burned, and a laser intensity suitable for an adult may harm a child. In this embodiment, the number, distribution, and area of wrinkle feature points in the face image may be extracted to judge the age of the user; for example, the number of wrinkles at the corners of the eyes is extracted to judge the age, optionally further combined with the amount of wrinkles on the user's forehead.
  • after the user's age is judged, a proportional coefficient can be obtained from the age, specifically by looking up the correspondence between age and proportional coefficient in a lookup table.
  • for example, when the age is under 15, the proportional coefficient is 0.6; when the age is 15 to 20, it is 0.8; when the age is 20 to 45, it is 1.0; and when the age is over 45, it is 0.8.
  • after the proportional coefficient is obtained, the initial current distance calculated according to the first ratio, or the calibrated current distance calculated according to the first ratio and the second ratio, may be multiplied by the proportional coefficient to obtain the final current distance, and the target number of light-emitting arrays 111 is then determined according to that distance. In this way, the projected laser power is prevented from being too large and harming very young or older users.
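The age-based scaling reduces to a lookup table plus one multiplication; a sketch follows, with the boundary handling at exactly 15, 20, and 45 years being an assumption, since the text leaves the endpoints ambiguous.

```python
def age_coefficient(age: int) -> float:
    """Lookup-table sketch of the age-to-proportional-coefficient mapping."""
    if age < 15:
        return 0.6
    if age <= 20:
        return 0.8
    if age <= 45:
        return 1.0
    return 0.8

def final_distance(initial_distance_cm: float, age: int) -> float:
    # Multiply the initial (or calibrated) current distance by the coefficient.
    return initial_distance_cm * age_coefficient(age)
```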
  • in some embodiments, after the current distance is obtained according to step 04 or according to steps 06 and 07: when the current distance is in a first distance interval, the point light sources 101 of a first target number of light-emitting arrays 111 are turned on;
  • when the current distance is in a second distance interval, the point light sources 101 of a second target number of light-emitting arrays 111 are turned on;
  • when the current distance is in a third distance interval, the point light sources 101 of a third target number of light-emitting arrays 111 are turned on.
  • the second distance interval lies between the first distance interval and the third distance interval; that is, the maximum distance in the first distance interval is less than or equal to the minimum distance in the second distance interval, and the maximum distance in the second distance interval is less than the minimum distance in the third distance interval.
  • the second target number is greater than the first target number and less than the third target number.
  • specifically, for example, the point light sources 101 in the laser projection module 100 form six light-emitting arrays 111, the first distance interval is [0 cm, 15 cm], the second distance interval is (15 cm, 40 cm], the third distance interval is (40 cm, ∞), the first target number is 2, the second target number is 4, and the third target number is 6.
  • when the detected current distance is in [0 cm, 15 cm], the point light sources 101 of 2 light-emitting arrays 111 are turned on; when it is in (15 cm, 40 cm], the point light sources 101 of 4 light-emitting arrays 111 are turned on; when it is in (40 cm, ∞), the point light sources 101 of 6 light-emitting arrays 111 are turned on. That is to say, as the current distance increases, the target number increases and more point light sources 101 are turned on.
  • in this way, when the current distance between the user and the laser projection module 100 is small, fewer light-emitting arrays 111 are turned on, preventing the laser energy emitted by the laser projection module 100 from being too large and harming the user's eyes; when the current distance is large, more light-emitting arrays 111 are turned on, so that the image collector 200 receives laser light of sufficient energy and the depth image is acquired with higher accuracy.
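The interval-to-target-number mapping in this example is a simple threshold table; one way to express it is sketched below (the tuple layout and function name are illustrative).

```python
import math

# Interval boundaries and target numbers from the six-array example above.
DISTANCE_INTERVALS_CM = [
    (0.0, 15.0, 2),        # [0, 15]   -> 2 arrays
    (15.0, 40.0, 4),       # (15, 40]  -> 4 arrays
    (40.0, math.inf, 6),   # (40, inf) -> 6 arrays
]

def target_number(current_distance_cm: float) -> int:
    """Return how many light-emitting arrays to turn on for a distance.

    Intervals are checked in order, so a boundary value such as 15 cm or
    40 cm falls into the nearer (smaller-count) interval, matching the
    [0, 15] / (15, 40] / (40, inf) convention above.
    """
    for low, high, count in DISTANCE_INTERVALS_CM:
        if low <= current_distance_cm <= high:
            return count
    return DISTANCE_INTERVALS_CM[-1][2]
```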
  • in some embodiments, the plurality of light-emitting arrays 111 are arranged in a ring shape, where the current distance is obtained according to step 04 or according to steps 06 and 07.
  • the laser emitted by the point light sources 101 of ring-arranged light-emitting arrays 111 can cover a wider field of view, so that depth information of more objects in the space can be obtained.
  • the ring may be a square ring or a circular ring.
  • as the current distance increases, the light-emitting arrays 111 are turned on in such a manner that an array farther from the center of the laser emitter 10 is turned on first.
  • for example, the total number of light-emitting arrays 111 is six; the six light-emitting arrays 111 include five annular sub-arrays 114 and one square sub-array 115. In the direction approaching the center of the laser emitter 10, the five annular sub-arrays 114 are arranged in order and numbered A, B, C, D, and E.
  • when the target number is two, the point light sources 101 of the annular sub-arrays 114 numbered A and B are turned on; when the target number is four, the point light sources 101 of the annular sub-arrays 114 numbered A, B, C, and D are turned on; when the target number is six, the annular sub-arrays 114 numbered A, B, C, D, and E and the square sub-array 115 are all turned on.
  • it can be understood that the diffractive capability of the diffractive element 30 is limited; that is, part of the laser emitted by the laser emitter 10 is not diffracted but exits directly, and the directly exiting laser does not undergo the diffraction attenuation of the diffractive element 30, so its energy is relatively large and is highly likely to harm the user's eyes.
  • therefore, when the current distance is small, the annular sub-arrays 114 far from the center of the laser emitter 10 are turned on first, which avoids the problem that laser exiting directly, without the diffraction attenuation of the diffractive element 30, has too much energy and endangers the user's eyes.
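Since the sub-arrays are enabled from the outermost ring inward, the selection reduces to taking a prefix of a list ordered from the emitter's edge toward its center; a minimal sketch, with identifier names assumed:

```python
def arrays_to_turn_on(target_count: int,
                      ids_outer_to_inner: list[str]) -> list[str]:
    """Outermost-first turn-on order for ring-arranged arrays.

    `ids_outer_to_inner` lists sub-array identifiers from the edge of the
    laser emitter toward its center, e.g. ["A", "B", "C", "D", "E", "square"].
    Rings far from the center carry laser that is more fully diffracted, so
    they are enabled before rings (or the central square) near the center.
    """
    return ids_outer_to_inner[:target_count]

# Example: a target number of 2 turns on the rings numbered A and B.
assert arrays_to_turn_on(2, ["A", "B", "C", "D", "E", "square"]) == ["A", "B"]
```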
  • further, in some embodiments, when the square sub-array 115 and at least one annular sub-array 114 are turned on simultaneously, the farther a light-emitting array 111 is from the center of the laser emitter 10, the higher the power of its point light sources 101.
  • the total number of the light-emitting arrays 111 is four, and the four light-emitting arrays 111 include three annular sub-arrays 114 and one square sub-array 115. In a direction away from the center of the laser emitter 10, three annular sub-arrays 114 are sequentially arranged, and the three annular sub-arrays 114 sequentially arranged are numbered A, B, and C.
  • when the square sub-array 115 and the annular sub-array 114 numbered A are turned on simultaneously, the voltage U_square applied to the point light sources 101 in the square sub-array 115 is smaller than the voltage U_A applied to the point light sources 101 in the annular sub-array 114 numbered A, that is, U_square < U_A;
  • when the square sub-array 115 and the annular sub-arrays 114 numbered A and B are turned on simultaneously, U_square is smaller than U_A, and U_A is smaller than the voltage U_B applied to the point light sources 101 in the annular sub-array 114 numbered B, that is, U_square < U_A < U_B;
  • when the square sub-array 115 and the annular sub-arrays 114 numbered A, B, and C are turned on simultaneously, U_square < U_A, U_A < U_B, and U_B is smaller than the voltage U_C applied to the point light sources 101 in the annular sub-array 114 numbered C, that is, U_square < U_A < U_B < U_C.
  • in this way, the farther a light-emitting array 111 is from the center of the laser emitter 10, the higher its power, which ensures that the light exiting the diffractive element 30 is uniform.
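A sketch of the monotone voltage assignment: only the ordering U_square < U_A < U_B < U_C is fixed by the description, so the base voltage and step used below are placeholders.

```python
def drive_voltages(active_ids_outer_to_inner: list[str],
                   base_voltage: float, step: float) -> dict[str, float]:
    """Assign higher drive voltage to arrays farther from the emitter center.

    base_voltage and step are assumptions; only the monotone ordering is
    taken from the description above.
    """
    n = len(active_ids_outer_to_inner)
    # Outermost ring (index 0) gets the highest voltage; the innermost
    # (e.g. the central square sub-array) gets the lowest.
    return {
        array_id: base_voltage + (n - 1 - i) * step
        for i, array_id in enumerate(active_ids_outer_to_inner)
    }
```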
  • in some embodiments, the plurality of light-emitting arrays 111 are arranged in the shape of the Chinese character "田" (a two-by-two grid of squares), where the current distance is obtained according to step 04 or according to steps 06 and 07.
  • specifically, each light-emitting array 111 has a square structure, and the plurality of square light-emitting arrays 111 combine into the "田"-shaped structure.
  • light-emitting arrays 111 arranged in the "田" shape are simply a combination of a plurality of square light-emitting arrays 111, so the manufacturing process is relatively simple.
  • as shown in FIG. 38, when the light-emitting arrays 111 are arranged in the "田" shape, the sizes of the individual light-emitting arrays 111 may be equal, or, as shown in FIG. 39, the sizes of some light-emitting arrays 111 may be unequal.
  • of course, the plurality of light-emitting arrays 111 may also be arranged in other shapes, as shown in FIG. 40.
  • a "computer-readable medium” can be any apparatus that can contain, store, communicate, propagate, or transport a program for use in an instruction execution system, apparatus, or device, or in conjunction with the instruction execution system, apparatus, or device.
  • more specific examples (a non-exhaustive list) of computer readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM).
  • the computer readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner where necessary, and then stored in a computer memory.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • in the above embodiments, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, they may be implemented by any one of the following techniques well known in the art, or a combination thereof: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
  • the above mentioned storage medium may be a read only memory, a magnetic disk or an optical disk or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Signal Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract

A control method for a laser projection module (100), a control device (80) for the laser projection module (100), a depth camera (1000), and an electronic device (3000). The laser projection module comprises a laser emitter (10); the laser emitter (10) comprises a plurality of point light sources (101); the plurality of point light sources (101) form a plurality of light-emitting arrays (111); and the plurality of light-emitting arrays (111) are independently controlled. The control method comprises: acquiring the current distance between the laser projection module (100) and a user; determining a target number of light-emitting arrays (111) according to the current distance; and turning on the point light sources (101) of the target number of light-emitting arrays (111).

Description

控制方法、控制装置、深度相机和电子装置
优先权信息
本申请请求2018年3月12日向中国国家知识产权局提交的、专利申请号为201810200433.X、201810201627.1、201810202149.6及201810200875.4的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本发明涉及成像技术领域,特别涉及一种激光投射模组的控制方法、激光投射模组的控制装置、深度相机和电子装置。
背景技术
激光投射模组可投射带有预定图案信息的激光,并将激光投射到位于空间中的目标用户上,再通过成像装置(如红外摄像头等)获取由目标用户反射的激光图案,以进一步获得目标用户的深度图像,然而,激光投射器投射的激光控制不当容易对用户进行造成伤害。
发明内容
本发明实施方式提供一种激光投射模组的控制方法、激光投射模组的控制装置、深度相机和电子装置。
本发明实施方式的激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制,所述控制方法包括:获取所述激光投射模组与用户的当前距离;根据所述当前距离确定所述发光阵列的目标数量;和开启所述目标数量的所述发光阵列的点光源。
本发明实施方式的激光投射模组的控制装置,激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制;所述控制装置包括:获取模块、确定模块和开启模块。所述获取模块用于获取所述激光投射模组与用户的当前距离。所述确定模块用于根据所述当前距离确定所述发光阵列的目标数量。所述开启模块开启所述目标数量的所述发光阵列的点光源。
本发明实施方式的深度相机包括图像采集器和激光投射模组。所述激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制;所述深度相机还包括处理器,所述处理器用于:获取所述激光投射模组与用户的当前距离;根据所述当前距离确定所述发光阵列的目标数量;开启所述目标数量的所述发光阵列的点光源。
本发明实施方式的电子装置包括壳体和上述的深度相机,所述深度相机设置在所述壳体内并从所述壳体暴露以获取深度图像。
本发明实施方式的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本发明的实践了解到。
附图说明
本发明的上述和/或附加的方面和优点可以从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:
图1是本发明某些实施方式的电子装置的结构示意图。
图2是本发明某些实施方式的深度相机的结构示意图。
图3是本发明某些实施方式的激光投射模组的结构示意图。
图4是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图5是本发明某些实施方式的激光投射模组中激光发射器呈多个扇形阵列排布的示意图。
图6是本发明某些实施方式的激光投射模组的控制装置的模块示意图。
图7和图8是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图9是本发明某些实施方式的激光投射模组中激光发射器呈圆形子阵列及多个环形子阵列排布的示 意图。
图10是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图11是本发明某些实施方式的激光投射模组的控制装置中的第一获取模块的模块示意图。
图12是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图13是本发明某些实施方式的激光投射模组的控制装置中的确定单元的模块示意图。
图14是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图15是本发明某些实施方式的激光投射模组的控制装置中的确定单元的模块示意图。
图16是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图17是本发明某些实施方式的激光投射模组的控制装置中的确定单元的模块示意图。
图18是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图19是本发明某些实施方式的激光投射模组的控制装置中的第一获取模块的模块示意图。
图20至图24是本发明某些实施方式的激光投射模组中激光发射器呈多个扇形阵列排布的示意图。
图25是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图26是本发明某些实施方式的激光投射模组中激光发射器呈方形子阵列及多个环形子阵列排布的示意图。
图27至图29是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图30是本发明某些实施方式的激光投射模组的控制装置的模块示意图。
图31是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图32是本发明某些实施方式的激光投射模组的控制装置中的修正模块的模块示意图。
图33是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图34是本发明某些实施方式的激光投射模组的控制装置中的修正模块的模块示意图。
图35是本发明某些实施方式的激光投射模组的控制方法的流程示意图。
图36是本发明某些实施方式的激光投射模组的控制装置中的修正模块的模块示意图。
图37至图40是本发明某些实施方式的激光投射模组中激光发射器呈“田”字形排布的示意图。
具体实施方式
下面详细描述本发明的实施例,所述实施例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施例是示例性的,旨在用于解释本发明,而不能理解为对本发明的限制。
在本发明的描述中,需要理解的是,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个所述特征。在本发明的描述中,“多个”的含义是两个或两个以上,除非另有明确具体的限定。
请参阅图1,本发明提供一种电子装置3000。电子装置3000包括壳体2000和的深度相机1000。深度相机1000设置在壳体2000内并从壳体2000暴露以获取深度图像。其中,电子装置3000可以是手机、平板电脑、笔记本电脑、智能手表、智能手环、智能眼镜、智能头盔等等。
请参阅图2,深度相机1000包括图像采集器200、激光投射模组100和处理器300。图像采集器200可用于采集激光图案,图像采集器200可为红外摄像头。处理器300可用于处理激光图案以获取深度图像。
请结合图3,激光投射模组100包括激光发射器10、准直元件20、衍射元件30、镜筒40、基板组件50和保护罩60。激光发射器10为垂直腔面激光发射器(Vertical-Cavity Surface-Emitting Laser,VCSEL),垂直腔面激光发射器包括多个点光源101(图5所示),激光发射器10用于发射激光。准直元件20用于准直激光发射器10发射的激光,衍射元件30用于衍射准直元件20准直后的激光以形成激光图案。镜筒40设置在基板组件50上。镜筒40的侧壁41与基板组件50围成收容腔42。基板组件50包括基板52及承载在基板52上的电路板51。电路板51开设有通孔511,激光发射器10承载在基板52上并收容在通孔511内。准直元件20和衍射元件30沿激光发射器10的发光方向依次排列。镜筒40的侧壁41向收容腔42的中心延伸有承载台411,衍射元件30承载在承载台411上。
保护罩60可以由透光材料制成,例如玻璃、聚甲基丙烯酸甲酯(Polymethyl Methacrylate,PMMA)、聚碳酸酯(Polycarbonate,PC)、聚酰亚胺(Polyimide,PI)等。由于玻璃、PMMA、PC、及PI等透光材料均具有优异的透光性能,保护罩60可以不用开设透光孔。如此,保护罩60能够在防止衍射元件30脱落的同时,还能够避免衍射元件30裸露在镜筒40的外面,从而使衍射元件30防水防尘。当然,在其他实施方式中,保护罩60可以开设有透光孔,透光孔与衍射元件30的光学有效区相对以避免遮挡衍射元件30的光路。
请一并参阅图1至图5,本发明还提供一种用于上述的激光投射模组100的控制方法。激光投射模组100中的激光发射器10包括多个点光源101。多个点光源101形成多个发光阵列111,多个发光阵列111可被独立控制。控制方法包括:
001:获取激光投射模组100与用户的当前距离;
002:根据当前距离确定发光阵列111的目标数量;和
003:开启目标数量的发光阵列111的点光源101。
请参阅图6,本发明还提供一种用于上述的激光投射模组100的控制装置80。激光投射模组100中的激光发射器10包括多个点光源101。多个点光源101形成多个发光阵列111,多个发光阵列111可被独立控制。控制装置80包括第一获取模块81、确定模块82和开启模块83。步骤001离可以由第一获取模块81实现,步骤002可以由确定模块82实现,步骤003可以由开启模块83实现。也即是说,第一获取模块81可用于获取激光投射模组100与用户的当前距离。确定模块82可用于根据当前距离确定发光阵列111的目标数量。开启模块83可用于开启目标数量的发光阵列111的点光源101。
请再参阅图1,在某些实施方式中,步骤001、步骤002和步骤003还可以由处理器300实现。也即是说,处理器300还可用于获取激光投射模组100与用户的当前距离、根据当前距离确定发光阵列111的目标数量、以及开启目标数量的发光阵列111的点光源101。
可以理解,激光投射模组100开启时通常是开启全部的点光源101,若此时用户距离激光投射模组100的距离过近,则由于全部开启点光源101后激光发射器10发射的激光的能量较高,可能对用户的眼睛产生危害。本发明实施方式的激光投射模组100的控制方法、激光投射模组100的控制装置80、深度相机1000和电子装置3000将激光投射模组100中的点光源101排列成多个可独立控制的发光阵列111,如此,可以根据检测到的当前距离开启对应该当前距离的目标数量的发光阵列111的点光源101,避免开启全部的点光源101后,用户与激光投射模组100的距离过近,而激光发射器10发射的能量又过高,危害用户眼睛的问题。
请一并参阅图1、图5、图6和图7,在某些实施方式中,多个发光阵列111为多个扇形阵列111,多个扇形阵列111围成圆形阵列11。多个扇形阵列111独立控制。此时,步骤001获取激光投射模组100与用户的当前距离即为:01:获取激光投射模组100与用户的当前距离。步骤002根据当前距离确定发光阵列111的目标数量即为:02:根据当前距离确定扇形阵列111的目标数量。步骤003开启目标数量的发光阵列111的点光源101即为:03:开启目标数量的扇形阵列111的点光源。对应地,第一获取模块81可用于获取激光投射模组100与用户的当前距离。确定模块82可用于根据当前距离确定扇形阵列111的目标数量。开启模块83可用于开启目标数量的扇形阵列111的点光源101。对应地,处理器300可用于获取激光投射模组100与用户的当前距离、根据当前距离确定扇形阵列111的目标数量、以及开启目标数量的扇形阵列111点的光源101。可以理解,准直元件20的光学有效区通常为圆形,此时,如果多个点光源101排列成矩形,圆形的光学有效区要全部覆盖矩形排列的点光源101需要满足光学有效区的直径大于点光源101组成的矩形的对角线的长度,如此,会导致一部分空间的浪费。而将多个点光源101形成多个扇形阵列111,多个扇形阵列111排列成圆形阵列11,则可以使激光发射器10的形状与准直元件20的圆形光学有效区对应,充分利用空间。
请一并参阅图1、图6、图8和图9,在某些实施方式中,多个发光阵列111为多个子阵列111。多个子阵列111围成圆形阵列11。多个子阵列111包括圆形子阵列113和环形子阵列112。其中,圆形子阵列113的个数为一个,环形子阵列112的个数为一个或多个。多个子阵列111可被独立控制。此时,步骤001获取激光投射模组100与用户的当前距离即为:01:获取激光投射模组100与用户的当前距离。步骤002根据当前距离确定发光阵列111的目标数量即为:02:根据当前距离确定子阵列111的目标数量。步骤003开启目标数量的发光阵列111的点光源101即为:03:开启目标数量的子阵列111的点光 源101。对应地,第一获取模块81可用于获取激光投射模组100与用户的当前距离。确定模块82可用于根据当前距离确定子阵列111的目标数量。开启模块83可用于开启目标数量的子阵列111的点光源101。对应地,处理器300可用于获取激光投射模组100与用户的当前距离、根据当前距离确定子阵列111的目标数量、以及开启目标数量的子阵列111的点光源101。可以理解,准直元件20的光学有效区通常为圆形,此时,如果多个点光源101排列成矩形,圆形的光学有效区要全部覆盖矩形排列的点光源101需要满足光学有效区的直径大于点光源101组成的矩形的对角线的长度,如此,会导致一部分空间的浪费。而将多个点光源101形成多个子阵列111,多个子阵列111排列成圆形阵列11,可以使激光发射器10的形状与准直元件20的圆形光学有效区对应,充分利用空间。
请一并参阅图5、图9和图10,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤01获取激光投射模组100与用户的当前距离包括:
011:获取用户的人脸图像;
012:处理人脸图像以确定用户的人脸占人脸图像的第一比例;和
013:根据第一比例确定当前距离。
请一并参阅图5、图9和图11,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,第一获取模块81包括获取单元811、处理单元812和确定单元813。步骤011可以由获取单元811实现,步骤012可以由处理单元812实现,步骤013可以由确定单元813实现。也即是说,获取单元811可用于获取用户的人脸图像。处理单元812可用于处理人脸图像以确定用户的人脸占人脸图像的第一比例。确定单元813可用于根据第一比例确定当前距离。
请一并参阅图1、图5和图9,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤011、步骤012和步骤013均可以由处理器300实现。也即是说,处理器300还可用于获取用户的人脸图像,处理人脸图像以确定用户的人脸占人脸图像的第一比例,以及根据第一比例确定当前距离。其中,人脸图像由图像采集器200拍摄得到,处理器300与图像采集器200电连接,并从图像采集器200中读取人脸图像。
具体地,可以先依据对人脸的特征点的提取和分析划分人脸图像中的人脸区域和背景区域,然后计算人脸区域所在的像素个数与人脸图像的像素个数的比值以得到第一比例。可以理解,当第一比例较大时,说明用户较靠近图像采集器200,也就是较靠近激光投射模组100,当前距离较小,此时激光投射模组100需要开启较少目标数量的发光阵列111(扇形阵列111或子阵列111)的点光源101,以避免投射的激光太强而灼伤用户。同时,当第一比例较小时,说明用户与图像采集器200相距较远,也就是与激光投射模组100相距较远,当前距离较大,激光投射模组100需要以较大的功率投射激光,以使激光图案投射到用户上并被反射后依然有适当的强度,以用于形成深度图像,此时激光投射模组100需要开启较多目标数量的发光阵列111(扇形阵列111或子阵列111)的点光源101。在一个例子中,当同一张人脸图像中包含有多个人脸时,则选取多个人脸中面积最大的人脸作为人脸区域用以计算第一比例,其他人脸所占的区域均作为背景区域的一部分。
可以预先对当前距离与第一比例进行标定。具体地,指引用户以预定的当前距离拍摄人脸图像,并计算该人脸图像对应的标定比例,存储该预设的当前距离与标定比例的对应关系,以便在后续的使用中依据实际的第一比例计算当前距离。例如,指引用户在当前距离为30厘米时拍摄人脸图像,并计算得到该人脸图像对应的标定比例为45%,而在实际测量中,当计算得到第一比例为R时,则依据相似三角形的性质有
$$\frac{D}{30\,\text{cm}}=\frac{45\%}{R}$$
其中,D为依据实际测量的第一比例R计算的实际的当前距离。如此,依据人脸图像中人脸所占第一比例,可以较为客观地反应用户与激光投射模组100之间的当前距离。
请一并参阅图5、图9和图12,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤013根据第一比例确定当前距离包括:
0131:计算人脸图像中人脸的预设的特征区域占人脸的第二比例;和
0132:根据第一比例及第二比例计算当前距离。
请一并参阅图5、图9和图13,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,确定单元813包括第一计算子单元8131和第二计算子单元8132。步骤0131可以由第一计算子单元8131实现,步骤0132可以由第二计算子单元8132实现。也即是说,第一计算子单元8131可用于计算人脸图像中人脸的预设的特征区域占人脸的第二比例。第二计算子单元8132可用于根据第一比例及第二比例计算当前距离。
请一并参阅图1、图5和图9,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤0131和步骤0132还可以由处理器300实现。也即是说,处理器300还可用于计算人脸图像中人脸的预设的特征区域占人脸的第二比例,以及根据第一比例及第二比例计算当前距离。
可以理解,不同的用户的人脸的大小有差异,使得不同的用户处于同样的距离被采集到的人脸图像中,人脸所占的第一比例有差异。第二比例为人脸的预设的特征区域占人脸的比例,预设的特征区域可以选择不同用户个体的差异度较小的特征区域,例如预设的特征区域为用户的双眼间距。当第二比例较大时,说明该用户的人脸较小,仅依据第一比例计算得到的当前距离过大;当第二比例较小时,说明该用户的人脸较大,仅依据第一比例计算得到的当前距离过小。在实际使用中,可以预先对第一比例、第二比例与当前距离进行标定。具体地,指引用户以预定的当前距离先拍摄人脸图像,并计算该人脸图像对应的第一标定比例及第二标定比例,存储该预设的当前距离与第一标定比例、第二标定比例的对应关系,以便于在后续的使用中依据实际的第一比例和第二比例计算当前距离。例如,指引用户在当前距离为25厘米时拍摄人脸图像,并计算得到该人脸图像对应的第一标定比例为50%,第二标定比例为10%,而在实际测量中,当计算得到第一比例为R1,第二比例为R2时,则依据三角形相似的性质有
$$\frac{D_1}{25\,\text{cm}}=\frac{50\%}{R_1}$$
其中D1为依据实际测量的第一比例R1计算得到的初始的当前距离,可以再依据关系式
$$\frac{D_2}{D_1}=\frac{10\%}{R_2}$$
求得进一步依据实际测量的第二比例R2计算得到的校准的当前距离D2,D2作为最终的当前距离。如此,依据第一比例与第二比例计算得到的当前距离考虑了不同用户之间的个体差异,能够获得更加客观的当前距离。
请一并参阅图5、图9和图14,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤013根据第一比例确定当前距离包括:
0133:根据人脸图像判断用户是否佩戴眼镜;和
0134:在用户佩戴眼镜时根据第一比例及预设的距离系数计算当前距离。
请一并参阅图5、图9和图15,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,确定单元813包括第一判断子单元8133和第三计算子单元8134。步骤0133可以由第一判断子单元8133实现,步骤0134可以由第三计算子单元8134实现。也即是说,第一判断子单元8133可用于根据人脸图像判断用户是否佩戴眼镜。第三计算子单元8134可用于在用户佩戴眼镜时根据第一比例及预设的距离系数计算当前距离。
请一并参阅图1、图5和图9,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤0133和步骤0134还可以由处理器300实现。也即是说,处理器300可用于根据人脸图像判断用户是否佩戴眼镜,以及在用户佩戴眼镜时根据第一比例及预设的距离系数计算当前距离。
可以理解,用户是否佩戴眼镜可以用于表征用户眼睛的健康状况,具体为用户佩戴眼镜则表明用户的眼睛已经患有相关的眼疾或视力不佳,在对佩戴眼镜的用户投射激光时,需要开启较少数目的发光阵列111(扇形阵列111或子阵列111)的点光源101,使得激光投射模组100投射的激光的能量较小,以免对用户的眼睛造成伤害。预设的距离系数可以是介于0至1的系数,例如0.6、0.78、0.82、0.95等,例如在根据第一比例计算得到初始的当前距离,或者在依据第一比例和第二比例计算得到校准后的当前距离后,再将初始的当前距离或者校准的当前距离乘以距离系数,得到最终的当前距离,并根据该当前距离确定目标数量。如此,可以避免投射激光的功率过大伤害患有眼疾或视力不佳的用户。
进一步地,距离系数可以是不固定的,例如,距离系数可以是根据环境中可见光或者红外光的强度自行调节的。图像采集器200采集的人脸图像为红外图像,可以先计算人脸图像的所有像素的红外光强度的平均值,不同的平均值对应不同的距离系数,平均值越大,距离系数越小,平均值越小,距离系数越大。
请一并参阅图5、图9和图16,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤013根据第一比例确定当前距离包括:
0135:根据人脸图像判断用户的年龄;和
0136:根据第一比例及年龄调整当前距离。
请一并参阅图5、图9和图17,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,确定单元813还包括第二判断子单元8135和调整子单元8136。步骤0135可以由第二判断子单元8135实现,步骤0136可以由调整子单元8136实现。也即是说,第二判断子单元8135可用于根据人脸图像判断用户的年龄。调整子单元8136可用于根据第一比例及年龄调整当前距离。
请一并参阅图1、图5和图9,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤0135和步骤0136还可以由处理器300实现。也即是说,处理器300还可用于根据人脸图像判断用户的年龄,以及根据第一比例及年龄调整当前距离。
不同年龄段的人对红外激光的耐受能力不同,例如小孩和老人更容易被激光灼伤等,可能对于成年人而言是合适强度的激光会对小孩造成伤害。本实施方式中,可以提取人脸图像中,人脸皱纹的特征点的数量、分布和面积等来判断用户的年龄,例如,提取眼角处皱纹的数量来判断用户的年龄,或者进一步结合用户的额头处的皱纹多少来判断用户的年龄。在判断用户的年龄后,可以依据用户的年龄得到比例系数,具体可以是在查询表中查询得知年龄与比例系数的对应关系,例如,年龄在15岁以下时,比例系数为0.6,年龄在15岁至20岁时,比例系数为0.8;年龄在20岁至45岁时,比例系数为1.0;年龄在45岁以上时,比例系数为0.8。在得知比例系数后,可以将根据第一比例计算得到的初始的当前距离、或者根据第一比例及第二比例计算得到的校准的当前距离乘以比例系数,以得到最终的当前距离,再根据该当前距离确定发光阵列111(扇形阵列111或子阵列111)的目标数量。如此,可以避免投射激光的功率过大而伤害小年龄段或者年龄较大的用户。
请一并参阅图5、图9和图18,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤01获取激光投射模组100与用户的当前距离包括:
014:向用户发射检测信号;和
015:根据被用户反射回的检测信号计算当前距离。
请一并参阅图5、图9和图19,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,第一获取模块81包括发射单元814和第一计算单元815。步骤014可以由发射单元814实现,步骤015可以由第一计算单元815实现。也即是说,发射单元814可用于向用户发射检测信号。第一计算单元815可用于根据被用户反射回的检测信号计算当前距离。
请一并参阅图1、图5和图9,在某些实施方式中,当多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111时,步骤014可以由激光投射模组100实现,步骤015可以由处理器300实现。也即是说,激光投射模组100可以用于向用户发射检测信号。处理器300可用于根据被用户反射回的检测信号计算当前距离。
具体地,激光投射模组100仅开启一个发光阵列111(扇形阵列111或子阵列111)中的点光源101,即仅由该发光阵列111(扇形阵列111或子阵列111)中的点光源101发射激光。由深度相机1000中的图像采集器200接收反射回的激光以得到激光图案,再利用图像匹配算法计算出该激光图案中各像素点与预定图案中的对应各个像素点的偏离值,再根据偏离值进一步获得该激光图案对应的深度图像,从而粗略估算激光投射模组100与用户的当前距离。由于仅开启一个发光阵列111(扇形阵列111或子阵列 111)中的点光源101进行当前距离的检测,因此激光投射模组100发射的激光的能量较低,不会对用户的眼睛产生危害。而在粗测完用户与激光投射模组100的当前距离后,根据当前距离确定开启的发光阵列111(扇形阵列111或子阵列111)的目标数量,此时激光投射模组100的发射的激光既能满足深度图像测算的精准度需求,同时还不会对用户的眼睛产生危害。
在某些实施方式中,若多个发光阵列111为多个扇形阵列111,或者多个发光阵列111为多个包括圆形子阵列113和环形子阵列112的子阵列111,则当当前距离处于第一距离区间时,开启第一目标数量的发光阵列111(扇形阵列111或子阵列111)的点光源101。当当前距离处于第二距离区间时,开启第二目标数量的发光阵列111(扇形阵列111或子阵列111)的点光源101。当当前距离处于第三距离区间时,开启第三目标数量的发光阵列111(扇形阵列111或子阵列111)的点光源101。其中,第二距离区间位于第一距离区间与第二距离区间之间,也即是说,第一距离区间中距离的最大值小于或等于第二距离区间中距离的最小值,第二距离区间中距离的最大值小于第三距离区间中距离的最小值。第二目标数量大于第一目标数量且小于第三目标数量。
具体地,例如,激光投射模组100中的点光源101形成有6个发光阵列111(扇形阵列111或子阵列111),第一距离区间为[0cm,15cm],第二距离区间为(15cm,40cm],第三距离区间为(40cm,∞),第一目标数量为2个,第二目标数量为4个,第三目标数量为6个。则当检测到的当前距离处于[0cm,15cm]中时,开启2个发光阵列111(扇形阵列111或子阵列111)的点光源101;当检测到的当前距离处于(15cm,40cm]中时,开启4个发光阵列111(扇形阵列111或子阵列111)的点光源101;当检测到的当前距离处于(40cm,∞)中时,开启6个发光阵列111(扇形阵列111或子阵列111)的点光源101。也即是说,随着当前距离的增大,目标数量的值越大,开启的发光阵列111(扇形阵列111或子阵列111)的点光源101的数量越多。如此,在用户与激光投射模组100之间的当前距离较小时,开启较少的发光阵列111(扇形阵列111或子阵列111)的点光源101,避免激光投射模组100发射的激光能量过大而危害用户眼睛,在用户与激光投射模组100之间的当前距离较大时,开启较多的发光阵列111(扇形阵列111或子阵列111)的点光源101,可以使得图像采集器200接收到足够能量的激光,进一步使得深度图像的获取精度更高。
在某些实施方式中,若多个发光阵列111为多个扇形阵列111,则当扇形阵列111的数量及目标数量均为多个,且扇形阵列111的数量是目标数量的倍数时,开启的多个扇形阵列111环绕激光发射器10的中心呈中心对称分布。例如,如图5所示,扇形阵列111的数量为4个,目标数量为2个,4为2的倍数,则开启的2个扇形阵列111环绕激光发射器10的中心呈中心对称分布。再例如,如图20所示,扇形阵列111的数量为6个,目标数量为3个,则相邻2个开启的扇形阵列111之间均有一个未开启的扇形阵列111,开启的3个扇形阵列111环绕激光发射器10的中心呈中心对称分布。再例如,如图21所示,扇形阵列111的数量为9个,目标数量为3个,则相邻2个开启的扇形阵列111之间均有2个未开启的扇形阵列111,开启的3个扇形阵列111环绕激光发射器10的中心呈中心对称分布。如此,开启的扇形阵列111呈中心对称分布,点光源101发射的激光经由准直元件20和衍射元件30出射后可以覆盖较大的视场,且出光均匀,有利于提升深度图像的获取精度。
在某些实施方式中,若多个发光阵列111为多个扇形阵列111,则当扇形阵列111的数量及目标数量均为多个,且扇形阵列111的数量的个数为偶数时,开启的多个扇形阵列111环绕激光发射器10的中心呈中心对称分布。例如,如图22所示,扇形阵列111的数量为6个,目标数量为4个,开启的4个扇形阵列111中的2个相邻接,剩余的2个扇形阵列111相邻接,开启的4个扇形阵列111环绕激光发射器10的中心呈中心对称分布。再例如,如图23所示,扇形阵列111的数量为10个,目标数量为6个,开启的6个扇形阵列111中的3个相邻接,剩余的3个扇形阵列111相邻接,开启的6个扇形阵列111环绕激光发射器10的中心呈中心对称分布。再例如,如图24所示,扇形阵列111的数量为12个,目标数量为9个,开启的9个扇形阵列111中的3个扇形阵列111相邻接,另外的3个扇形阵列111相邻接,剩余的3个扇形阵列111相邻接,开启的9个扇形阵列111环绕激光发射器10的中心呈中心对称分布。如此,开启的扇形阵列111呈中心对称分布,点光源101发射的激光经由准直元件20和衍射元件30出射后可以覆盖较大的视场,且出光均匀,有利于提升深度图像的获取精度。
在某些实施方式中,若多个发光阵列111为多个子阵列111,多个子阵列111包括圆形子阵列113和环形子阵列112,则在同时开启圆形子阵列113的点光源101和至少一个环形子阵列112的点光源101 时,距离圆形阵列120中心越远的子阵列111的点光源101功率越高。
具体地,请结合图9,例如,激光发射器10的圆形阵列120包括4个子阵列111,分别为1个圆形子阵列113和3个环形子阵列112。沿远离圆形阵列11中心的方向,3个环形子阵列112依次排布,依次排布的3个环形子阵列112的编号分别为A、B、C。则当同时开启圆形子阵列113和编号为A的环形子阵列112中的点光源101时,施加在圆形子阵列113中的点光源101的电压(U )小于施加在编号为A的环形子阵列112中的点光源101的电压(U A),即U <U A;或者,当同时开启圆形子阵列113、编号为A和编号为B的环形子阵列112中的点光源101时,施加在圆形子阵列113中的点光源101的电压(U )小于施加在编号为A的环形子阵列112中的点光源101的电压(U A),且施加在编号为A的环形子阵列112中的电压(U A)小于施加在编号为B的环形子阵列112中的点光源101的电压(U B),即U <U A<U B;或者,当同时开启圆形子阵列113、编号为A、编号为B和编号为C的环形子阵列112中的点光源101时,施加在圆形子阵列113中的点光源101的电压(U )小于施加在编号为A的环形子阵列112中的点光源101的电压(U A),且施加在编号为A的环形子阵列112中的电压(U A)小于施加在编号为B的环形子阵列112中的点光源101的电压(U B),且施加在编号为B的环形子阵列112中的电压小于施加在编号为C的环形子阵列112中的点光源101的电压(U C),即U <U A<U B<U C。如此,距离圆形阵列11中心越远的子阵列111的功率越高,能够保证从衍射元件30中出射的光线出光均匀。
可以理解,若距离圆形阵列11中心越近的子阵列111的功率越高,则激光发射器10发射的聚集在圆形阵列11中心位置的激光较多,该部分激光经衍射元件30时,由于衍射元件30的衍射能力有限,即部分光束不会被衍射而是直接出射,直接出射的激光不经过衍射元件20的衍射衰减作用,因此直接出射的激光的能量较大,极有可能对用户的眼睛产生危害,因此,降低距离圆形阵列11中心较近的子阵列111的功率,可以避免在圆形阵列11中心聚集的激光过多,且不经衍射直接出射,从而危害用户眼睛的问题。
在某些实施方式中,若多个发光阵列111为多个子阵列111,多个子阵列111包括圆形子阵列113和环形子阵列112,则当当前距离处于第一距离区间时,开启环形子阵列112的点光源101。当当前距离处于第二距离区间时,开启圆形子阵列113的点光源101。其中,第一距离区间的最大值小于第二距离区间的最小值。
具体地,假设激光发射器的圆形阵列11包括2个子阵列111,分别为一个圆形子阵列113和一个环形子阵列112,第一距离区间为[0cm,15cm],第二距离区间为(15cm,40cm],第三距离区间为(40cm,∞),第一目标数量为1个,第二目标数量为1个,第三目标数量为2个。则当当前距离处于第一距离区间时,开启环形子阵列112的点光源101;当当前距离处于第二距离区间时,开启圆形子阵列113的点光源101;当当前距离处于第三距离区间时,开启环形子阵列112的点光源101和圆形子阵列113的点光源101。在开启环形子阵列112的点光源101或者开启圆形子阵列113的点光源101时,施加在环形子阵列112的点光源101的电压可与施加在圆形子阵列111的点光源101的电压相等。如此,随着当前距离增大,子阵列111的开启方式为:沿靠近圆形阵列11中心的方向,依次开启环形子阵列112和圆形子阵列113。如此,可以避免在当前距离较小时先开启靠近圆形阵列11中心的圆形子阵列113或环形子阵列112,导致不经衍射元件30的衍射作用衍射衰减而直接出射的激光能量过大而危害用户眼睛的问题。
请一并参阅图25和图26,在某些实施方式中,步骤001获取激光投射模组100与用户的当前距离包括:
04:开启预定数量的发光阵列111以检测用户与激光投射模组100的当前距离。
请一并参阅图6和图26,在某些实施方式中,步骤04可以由第一获取模块81实现。也即是说,第一获取模块81可用于开启预定数量的发光阵列111以检测用户与激光投射模组100的当前距离。
请一并参阅图1和图26,在某些实施方式中,步骤04还可以由处理器300实现。也即是说,处理器300还可以用于开启预定数量的发光阵列111以检测用户与激光投射模组100的当前距离。
激光投射模组100投射的激光为红外激光,而激光投射模组100工作时用户与激光投射模组100的当前距离是未知的,因此,若红外激光的能量控制不当,可能导致红外激光的能量过大,对用户的眼睛造成伤害。而将激光发射器10中的点光源101分为多个可独立控制的发光阵列111,在激光投射模组100工作时首先开启预定数量的发光阵列111以检测用户与激光投射模组100的当前距离,在确定当前距离后再根据当前距离确定需要开启的发光阵列111的目标数量,如此,可以避免开启的发光阵列111 的数量过少,导致图像采集器200采集的激光图案的亮度过低,影响深度图像获取的准确性;也可避免开启的发光阵列111的数量过多,导致出射的激光能量过大对用户的眼睛产生危害的问题。
其中,激光投射模组100工作时首先开启发光阵列111对应的预定数量可由经验数据得到,在使用该激光投射模组100前开启该预定数量的发光阵列111一方面可以大致测得用户与激光投射模组100之间的当前距离,另一方面不会对用户的眼睛产生危害。发光阵列111的预定数量随着电子装置3000的类型不同以及发光阵列111的总数的不同而变化。例如,当电子装置3000为手机时,激光投射模组100常用于辅助获取3D人脸图像以进行人脸识别解锁,此时用户与激光投射模组100的当前距离通常较小。假设此时发光阵列111的总数为6个,则预定数量可为2个,若此时发光阵列111的总数为12个,则预定数量可为3个,如此,一方面可以大致测得用户与激光投射模组100的当前距离,一方面可以避免激光能量过大的问题。再例如,当电子装置3000为体感游戏设备时,用户与激光投射模组100之间的当前距离通常较大。假设此时发光阵列111的总数为24个,则预定数量可为8个,如此,一方面可以大致测得用户与激光投射模组100的当前距离,一方面可以避免激光能量过大的问题。
请一并参阅图26和图27,在某些实施方式中,步骤0111获取激光投射模组100与用户的当前距离包括:
06:获取用户的第一图像和第二图像;
07:根据第一图像和第二图像计算用户与激光投射模组100的当前距离。
请一并参阅图5和图26,在某些实施方式中,步骤06和步骤07均可以由第一获取模块81实现。也即是说,第一获取模块81可以用于获取用户的第一图像和第二图像、以及根据第一图像和第二图像计算用户与激光投射模组100的当前距离。
请一并参阅图1和图26,在某些实施方式中,步骤06和步骤07均可以由处理器300实现。也即是说,处理器300还可以用于获取用户的第一图像和第二图像、根据第一图像和第二图像计算用户与激光投射模组100的当前距离。
激光投射模组100投射激光图案到空间中的用户上,再由图像采集器200采集由用户反射的激光图案,再利用该激光图案与参考的激光图案获取用户的深度图像。激光投射模组100投射的激光为红外激光,而激光投射模组100工作时用户与激光投射模组100的当前距离是未知的,因此,若红外激光的能量控制不当,可能导致红外激光的能量过大,对用户的眼睛造成伤害。而将激光发射器10中的点光源101分为多个可独立控制的发光阵列111,在激光投射模组100工作时首先获取用户的第一图像和第二图像以计算用户与激光投射模组100的当前距离,在确定当前距离后再根据当前距离确定需要开启的发光阵列111的目标数量,如此,可以避免开启的发光阵列111的数量过少,导致图像采集器200采集的激光图案的亮度过低,影响深度图像获取的准确性;也可避免开启的发光阵列111的数量过多,导致出射的激光能量过大对用户的眼睛产生危害的问题。
其中,第一图像可以是红外图像,第二图像可以是可见光图像(RGB图像),或者第一图像可以是可见光图像,第二图像可以是红外图像,可见光图像可以由可见光摄像模组4000拍摄得到,红外图像可以由深度相机1000中的图像采集器200拍摄得到。第一图像和第二图像也可均为可见光图像,此时电子装置3000包括两个可见光摄像模组4000。以第一图像为红外图像且第二图像为可见光图像为例,激光投射模组100工作时,处理器300首先开启图像采集器200和可见光摄像模组4000,图像采集器200拍摄第一图像,可见光摄像模组4000拍摄第二图像,处理器300从图像采集器200和可见光摄像模组4000中读取第一图像和第二图像。第一图像和第二图像作为一对图像匹配对,处理器300根据这一对图像匹配对计算当前距离,具体地,处理器300首先对第一图像和第二图像做双目图像校正,根据图像采集器200和可见光摄像模组4000事先标定获得的单目内参数据(焦距、成像原点、畸变参数)和双目相对位置关系(旋转矩阵和平移向量),分别对第一图像和第二图像进行消除畸变和行对准,使得第一图像和第二图像严格行对应。随后,对于第一图像上的每一个点,在第二图像中找到与该点匹配的对应点,由于第一图像和第二图像严格行对应,因此,对于第一图像上的每一个点,仅需在第二图像中的与第一图像上该点所在行对应的行的位置寻找与该点匹配的对应点,而无需在整幅第二图像中寻找对应点,因此,第一图像和第二图像之间的点的匹配计算较快。在匹配完毕第一图像和第二图像中的每个点后,即可根据每一对相匹配的点计算出对应位置处的深度信息,最终生成深度图像。最后,处理器300识别出第一图像和第二图像中的人脸,再根据深度图像与第一图像或第二图像的匹配关系,确定出人脸 对应的深度信息,由于人脸通常占据多个像素点,因此,取多个像素对应的多个深度信息的中值或均值作为最终的当前距离。
当然,为进一步减小处理器300的处理时间,可以将用户看作一个点目标,激光投射模组100到该点目标之间的距离即为当前距离;也可以以用户的某个部位作为点目标,激光投射模组100到该点目标之间的距离即为当前距离,例如,以用户的脸部为点目标,激光投射模组100到用户脸部之间的距离即为当前距离,此时,具体地,先识别出第一图像和第二图像中的人脸,再对第一图像中的人脸部分和第二图像中的人脸部分进行像素匹配和深度信息计算,随后根据计算得到的深度信息确定出当前距离。
处理器300执行步骤04获取到当前距离后或者执行步骤06和步骤07获取到当前距离后,再根据当前距离确定需要开启的发光阵列111的目标数量,再控制激光投射模组100开启该目标数量的发光阵列111以进一步获取较为准确的深度图像。例如,当电子装置3000为手机,发光阵列111的总数为6个,若测得当前距离较远,例如为15~20cm,则可以根据该当前距离确定目标数量为3~4个,则将开启3~4个发光阵列111的点光源101;若测得当前距离较近,例如为5~10cm,则可以根据该当前距离确定目标数量为1个,则将开启1个发光阵列111的点光源101。
请参阅图28和图29,在某些实施方式中,控制方法在步骤04开启预定数量的发光阵列111以检测用户与激光投射模组100的当前距离后或者在步骤07根据第一图像和第二图像计算用户与激光投射模组100的当前距离后还包括修正当前距离的步骤(即步骤05),具体为:
051:获取用户的人脸图像;
052:计算人脸图像中人脸所占的第一比例;和
053:根据第一比例修正当前距离。
请参阅图30,在某些实施方式中,控制装置80还包括第二获取模块851、计算模块852和修正模块853。步骤051可以由第二获取模块851实现。步骤052可以由计算模块852实现。步骤053可以由修正模块853实现。也即是说,第二获取模块851可用于获取用户的人脸图像。计算模块852可用于计算人脸图像中人脸所占的第一比例。修正模块853可用于根据第一比例修正当前距离。
请再参阅图1,在某些实施方式中,步骤051、步骤052和步骤053均可以由处理器300实现。也即是说,处理器300还可以用于获取用户的人脸图像、计算人脸图像中人脸所占的第一比例、以及根据第一比例修正当前距离。
具体地,可以先根据对人脸特征点的提取和分析划分人脸图像中的人脸区域和背景区域,然后计算人脸所在区域所在的像素个数与人脸图像的像素个数的比值以得到第一比例。可以理解,当第一比例较大时,说明用户较靠近图像采集器200,也就是较靠近激光投射模组100,当前距离较小,此时激光投射模组100需要开启较少目标数量的发光阵列111的点光源101,以避免投射的激光太强而灼伤用户。同时,当第一比例较小时,说明用户与图像采集器200相距较远,当前距离较大,激光投射模组100需要以较大的功率投射激光,以使激光图案投射到用户上并被反射后依然有适当的强度,以用于形成深度图像,此时激光投射模组100需要开启较多目标数量的发光阵列111的点光源101。在一个例子中,当同一张人脸图像中包含有多个人脸时,则选取多个人脸中面积最大的人脸作为人脸区域用以计算第一比例,其他人脸所占的区域均作为背景区域的一部分。
可以预先对当前距离与第一比例进行标定。具体地,指引用户以预定的当前距离拍摄人脸图像,并计算该人脸图像对应的标定比例,存储该预设的当前距离与标定比例的对应关系,以便在后续的使用中依据实际的第一比例计算当前距离。例如,指引用户在当前距离为30厘米时拍摄人脸图像,并计算得到该人脸图像对应的标定比例为45%,而在实际测量中,当计算得到第一比例为R时,则依据相似三角形的性质有
$$\frac{D}{30\,\text{cm}}=\frac{45\%}{R}$$
其中,D为依据实际测量的第一比例R计算的实际的当前距离。如此,依据人脸图像中人脸所占第一比例,可以较为客观地反应用户与激光投射模组100之间的当前距离。
请参阅图31,在某些实施方式中,步骤053根据第一比例修正当前距离包括:
0531:计算人脸图像中人脸的预设的特征区域占人脸的第二比例;和
0532:根据第一比例及第二比例修正当前距离。
请参阅图32,在某些实施方式中,修正模块853包括第二计算单元8531和第一修正单元8532。步骤0531可以由第二计算单元8531实现,步骤0532可以由第一修正单元8532实现。也即是说,第二计 算单元8531用于计算人脸图像中人脸的预设的特征区域占人脸的第二比例。第一修正单元8532用于根据第一比例及第二比例修正当前距离。
请再参阅图1,在某些实施方式中,步骤0531和步骤0532还可以由处理器300实现。也即是说,处理器300还可用于计算人脸图像中人脸的预设的特征区域占人脸的第二比例,以及根据第一比例及第二比例修正当前距离。
可以理解,不同的用户的人脸的大小有差异,使得不同的用户处于同样的距离被采集到的人脸图像中,人脸所占的第一比例有差异。第二比例为人脸的预设的特征区域占人脸的比例,预设的特征区域可以选择不同用户个体的差异度较小的特征区域,例如预设的特征区域为用户的双眼间距。当第二比例较大时,说明该用户的人脸较小,仅依据第一比例计算得到的当前距离过大;当第二比例较小时,说明该用户的人脸较大,仅依据第一比例计算得到的当前距离过小。在实际使用中,可以预先对第一比例、第二比例与当前距离进行标定。具体地,指引用户以预定的当前距离先拍摄人脸图像,并计算该人脸图像对应的第一标定比例及第二标定比例,存储该预设的当前距离与第一标定比例、第二标定比例的对应关系,以便于在后续的使用中依据实际的第一比例和第二比例计算当前距离。例如,指引用户在当前距离为25厘米时拍摄人脸图像,并计算得到该人脸图像对应的第一标定比例为50%,第二标定比例为10%,而在实际测量中,当计算得到第一比例为R1,第二比例为R2时,则依据三角形相似的性质有
$$\frac{D_1}{25\,\text{cm}}=\frac{50\%}{R_1}$$
其中D1为依据实际测量的第一比例R1计算得到的初始的当前距离,可以再依据关系式
$$\frac{D_2}{D_1}=\frac{10\%}{R_2}$$
求得进一步依据实际测量的第二比例R2计算得到的校准的当前距离D2,D2作为最终的当前距离。如此,依据第一比例与第二比例计算得到的当前距离考虑了不同用户之间的个体差异,能够获得更加客观的当前距离。
请参阅图33,在某些实施方式中,步骤053根据第一比例修正当前距离包括:
0533:根据人脸图像判断用户是否佩戴眼镜;及
0534:在用户佩戴眼镜时根据第一比例及距离系数修正当前距离。
请参阅图32,在某些实施方式中,修正模块853包括第一判断单元8533和第二修正单元8534。步骤0533可以由第一判断单元8533实现。步骤0534可以由第二修正单元8534实现。也即是说,第一判断单元8533可用于根据人脸图像判断用户是否佩戴眼镜。第二修正单元8534可用于在用户佩戴眼镜时根据第一比例及距离系数修正当前距离。
请再参阅图1,在某些实施方式中,步骤0533和步骤0534还可以由处理器300实现。也即是说,处理器300还可用于根据人脸图像判断用户是否佩戴眼镜,以及在用户佩戴眼镜时根据第一比例及距离系数修正当前距离。
可以理解,用户是否佩戴眼镜可以用于表征用户眼睛的健康状况,具体为用户佩戴眼镜则表明用户的眼睛已经患有相关的眼疾或视力不佳,在对佩戴眼镜的用户投射激光时,需要开启较少数目的发光阵列111的点光源101,使得激光投射模组100投射的激光的能量较小,以免对用户的眼睛造成伤害。预设的距离系数可以是介于0至1的系数,例如0.6、0.78、0.82、0.95等,例如在根据第一比例计算得到初始的当前距离,或者在依据第一比例和第二比例计算得到校准后的当前距离后,再将初始的当前距离或者校准的当前距离乘以距离系数,得到最终的当前距离,并根据该当前距离确定目标数量。如此,可以避免投射激光的功率过大伤害患有眼疾或视力不佳的用户。
进一步地,距离系数可以是不固定的,例如,距离系数可以是根据环境中可见光或者红外光的强度自行调节的。图像采集器200采集的人脸图像为红外图像,可以先计算人脸图像的所有像素的红外光强度的平均值,不同的平均值对应不同的距离系数,平均值越大,距离系数越小,平均值越小,距离系数越大。
请参阅图35,在某些实施方式中,步骤053根据第一比例修正当前距离包括:
0535:根据人脸图像判断用户的年龄;和
0536:根据第一比例及年龄修正当前距离。
请参阅图36,在某些实施方式中,步骤0535可以由第二判断单元8535实现。步骤0536可以由第三修正单元8536实现。也即是说,第二判断单元8535可用于根据人脸图像判断用户的年龄。第三修正 单元8536可用于根据第一比例及年龄修正当前距离。
请再参阅图1,在某些实施方式中,步骤0535和步骤0536还可以由处理器300实现。也即是说,处理器300还可用于根据人脸图像判断用户的年龄,以及根据第一比例及年龄修正当前距离。
不同年龄段的人对红外激光的耐受能力不同,例如小孩和老人更容易被激光灼伤等,可能对于成年人而言是合适强度的激光会对小孩造成伤害。本实施方式中,可以提取人脸图像中,人脸皱纹的特征点的数量、分布和面积等来判断用户的年龄,例如,提取眼角处皱纹的数量来判断用户的年龄,或者进一步结合用户的额头处的皱纹多少来判断用户的年龄。在判断用户的年龄后,可以依据用户的年龄得到比例系数,具体可以是在查询表中查询得知年龄与比例系数的对应关系,例如,年龄在15岁以下时,比例系数为0.6,年龄在15岁至20岁时,比例系数为0.8;年龄在20岁至45岁时,比例系数为1.0;年龄在45岁以上时,比例系数为0.8。在得知比例系数后,可以将根据第一比例计算得到的初始的当前距离、或者根据第一比例及第二比例计算得到的校准的当前距离乘以比例系数,以得到最终的当前距离,再根据该当前距离确定发光阵列111的目标数量。如此,可以避免投射激光的功率过大而伤害小年龄段或者年龄较大的用户。
在某些实施方式中,在根据步骤04获得当前距离后或者根据步骤06和步骤07获得当前距离后,当当前距离处于第一距离区间时,开启第一目标数量的发光阵列111的点光源101。当当前距离处于第二距离区间时,开启第二目标数量的发光阵列111的点光源101。当当前距离处于第三距离区间时,开启第三目标数量的发光阵列111的点光源101。其中,第二距离区间位于第一距离区间与第三距离区间之间,也即是说,第一距离区间中距离的最大值小于或等于第二距离区间中距离的最小值,第二距离区间中距离的最大值小于第三距离区间中距离的最小值。第二目标数量大于第一目标数量且小于第三目标数量。
具体地,例如,激光投射模组100中的点光源101形成有6个发光阵列111,第一距离区间为[0cm,15cm],第二距离区间为(15cm,40cm],第三距离区间为(40cm,∞),第一目标数量为2个,第二目标数量为4个,第三目标数量为6个。则当检测到的当前距离处于[0cm,15cm]中时,开启2个发光阵列111的点光源101;当检测到的当前距离处于(15cm,40cm]中时,开启4个发光阵列111的点光源101;当检测到的当前距离处于(40cm,∞)中时,开启6个发光阵列111的点光源101。也即是说,随着当前距离的增大,目标数量的值越大,开启的发光阵列111的点光源101的数量越多。如此,在用户与激光投射模组100之间的当前距离较小时,开启较少的发光阵列111的点光源101,避免激光投射模组100发射的激光能量过大而危害用户眼睛,在用户与激光投射模组100之间的当前距离较大时,开启较多的发光阵列111的点光源101,可以使得图像采集器200接收到足够能量的激光,进一步使得深度图像的获取精度更高。
请一并参阅图26和图37,在某些实施方式中,多个发光阵列111呈环形排布,其中,当前距离根据步骤04获得或者根据步骤06和步骤07获得。环形排布的发光阵列111的点光源101发出的激光可以覆盖更广的视场,如此,可以获得空间中更多物体的深度信息。其中,环形可为方环形或圆环形。
在某些实施方式中,随当前距离的增大,发光阵列111的开启方式为:距离激光发射器10的中心越远的发光阵列111越先开启。例如,请结合图25,发光阵列111的总数为6个,6个发光阵列111包括5个环形子阵列114和1个方形子阵列115,沿靠近激光发射器10的中心的方向,5个环形子阵列114依次排布,依次排布的5个环形子阵列114的编号为A、B、C、D、E。则当目标数量为2个时,开启编号为A和B的环形子阵列114的点光源101;当目标数量为4个时,开启编号为A、B、C和D的环形子阵列114的点光源101;当目标数量为6个时,开启编号为A、B、C、D和E的环形子阵列114和方形子阵列115。可以理解,衍射元件30的衍射能力是有限的,即激光发射器10发射的部分激光不会被衍射而是直接出射,直接出射的激光不经过衍射元件30的衍射衰减作用,因此直接出射的激光的能量及较大,极有可能对用户的眼睛产生危害,因此,在当前距离较小时先开启远离激光发射器10的中心的环形子阵列114,可以避免不经衍射元件30的衍射作用衍射衰减而直接出射的激光能量过大而危害用户眼睛的问题。
进一步地,在某些实施方式中,在同时开启方形子阵列115和至少一个环形子阵列114的点光源101时,距离激光发射器10的中心越远的发光阵列111的点光源101的功率越高。
具体地,请结合图37,例如,发光阵列111的总数为4个,4个发光阵列111包括3个环形子阵列 114和1个方形子阵列115。沿远离激光发射器10的中心的方向,3个环形子阵列114依次排布,依次排布的3个环形子阵列114的编号为A、B、C。则当同时开启方形子阵列115和编号为A的环形子阵列114的点光源101时,施加在方形子阵列115中的点光源101的电压(U )小于施加在编号为A的环形子阵列114中的点光源101的电压(U A),即U <U A;或者,当同时开启方形子阵列115、编号为A和编号为B的环形子阵列114中的点光源101时,施加在方形子阵列115中的点光源101的电压(U )小于施加在编号为A的环形子阵列114中的点光源101的电压(U A),且施加在编号为A的环形子阵列114中的点光源101的电压(U A)小于施加在编号为B的环形子阵列114中的点光源101的电压(U B),即U <U A<U B;或者,当同时开启方形子阵列115、编号为A、编号为B和编号为C的环形子阵列114中的点光源101时,施加在方形子阵列115中的点光源101的电压(U )小于施加在编号为A的环形子阵列114中的点光源101的电压(U A),且施加在编号为A的环形子阵列114中的点光源101的电压(U A)小于施加在编号为B的环形子阵列114中的点光源101的电压(U B),且施加在编号为B的环形子阵列114中的点光源101的电压(U B)小于施加在编号为C的环形子阵列114中的点光源101的电压(U C)即U <U A<U B<U C。如此,距离激光发射器10中心越远的发光阵列111的功率越高,能够保证从衍射元件30中出射的光线出光均匀。
请一并参阅图38至图40,在某些实施方式中,多个发光阵列111呈“田”字形排布,其中,当前距离根据步骤04获得或者根据步骤06和步骤07获得。具体地,每个发光阵列111为方形结构,多个方形结构的发光阵列111组合成“田”字形结构。“田”字形排布的发光阵列111仅为多个方形结构的发光阵列111的组合,因此,其制作工艺较为简单。其中,如图38所示,发光阵列111呈“田”字形排布时,各个发光阵列111的大小可以是相等的,或者,如图39所示,部分发光阵列111的大小可以是不等的。当然,多个发光阵列111还可以呈其他形状的排布,如图40所示。
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本发明的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本发明的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本发明的实施例所属技术领域的技术人员所理解。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。
应当理解,本发明的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集 成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本发明各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。
上述提到的存储介质可以是只读存储器,磁盘或光盘等。尽管上面已经示出和描述了本发明的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本发明的限制,本领域的普通技术人员在本发明的范围内可以对上述实施例进行变化、修改、替换和变型。

Claims (64)

  1. 一种激光投射模组的控制方法,其特征在于,所述激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制,所述控制方法包括:
    获取所述激光投射模组与用户的当前距离;
    根据所述当前距离确定所述发光阵列的目标数量;和
    开启所述目标数量的所述发光阵列的点光源。
  2. 根据权利要求1所述的控制方法,其特征在于,多个所述发光阵列为多个扇形阵列,多个所述扇形阵列围成圆形阵列,多个所述扇形阵列独立控制;所述根据所述当前距离确定所述发光阵列的目标数量包括:根据所述当前距离确定所述扇形阵列的目标数量;所述开启所述目标数量的所述发光阵列的点光源包括:开启所述目标数量的所述扇形阵列的点光源。
  3. 根据权利要求2所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离的步骤包括:
    获取所述用户的人脸图像;
    处理所述人脸图像以确定所述用户的人脸占所述人脸图像的第一比例;和
    根据所述第一比例确定所述当前距离。
  4. 根据权利要求3所述的控制方法,其特征在于,所述根据所述第一比例确定所述当前距离的步骤包括:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例计算所述当前距离。
  5. 根据权利要求2所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离的步骤包括:
    向所述用户发射检测信号;和
    根据被所述用户反射回的检测信号计算所述当前距离。
  6. 根据权利要求2所述的控制方法,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述扇形阵列的点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述扇形阵列的点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述扇形阵列的点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  7. 根据权利要求2所述的控制方法,其特征在于,当所述扇形阵列的数量及所述目标数量均为多个,且所述扇形阵列的数量是所述目标数量的倍数时,开启的多个所述扇形阵列环绕所述激光发射器的中心呈中心对称分布。
  8. 根据权利要求1所述的控制方法,其特征在于,当所述扇形阵列的数量与所述目标数量均为多个,且所述扇形阵列数量的个数为偶数时,开启的多个所述扇形阵列环绕所述激光发射器的中心呈中心对称分布。
  9. 根据权利要求1所述的控制方法，其特征在于，多个所述发光阵列围成圆形阵列，多个所述发光阵列为多个子阵列，多个所述子阵列包括一个圆形子阵列和至少一个环形子阵列；所述根据所述当前距离确定所述发光阵列的目标数量包括：根据所述当前距离确定所述子阵列的目标数量；所述开启所述目标数量的所述发光阵列的点光源包括：开启所述目标数量的所述子阵列的所述点光源。
  10. 根据权利要求9所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离的步骤包括:
    获取所述用户的人脸图像;
    处理所述人脸图像以确定所述用户的人脸占所述人脸图像的第一比例;和
    根据所述第一比例确定所述当前距离。
  11. 根据权利要求10所述的控制方法,其特征在于,所述根据所述第一比例确定所述当前距离的步骤包括:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例计算所述当前距离。
  12. 根据权利要求9所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离的步骤包括:
    向所述用户发射检测信号;和
    根据被所述用户反射回的检测信号计算所述当前距离。
  13. 根据权利要求9所述的控制方法,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述子阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述子阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述子阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  14. 根据权利要求9所述的控制方法,其特征在于,在同时开启所述圆形子阵列的所述点光源和至少一个所述环形子阵列的所述点光源时,距离所述圆形子阵列中心越远的所述子阵列的所述点光源的功率越高。
  15. 根据权利要求9所述的控制方法,其特征在于,当所述当前距离处于第一距离区间时,开启所述环形子阵列的所述点光源,当所述当前距离处于第二距离区间时,开启所述圆形子阵列的所述点光源;所述第一距离区间的最大值小于或等于所述第二距离区间的最小值。
  16. 根据权利要求1所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离,包括:
    开启预定数量的所述发光阵列以检测所述用户与所述激光投射模组的当前距离。
  17. 根据权利要求16所述的控制方法,其特征在于,所述控制方法在获取所述激光投射模组与用户的当前距离的步骤后还包括:
    获取所述用户的人脸图像;
    计算所述人脸图像中人脸所占的第一比例;和
    根据所述第一比例修正所述当前距离。
  18. 根据权利要求17所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例修正所述当前距离。
  19. 根据权利要求17所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    根据所述人脸图像判断所述用户是否佩戴眼镜;及
    在所述用户佩戴眼镜时根据所述第一比例及距离系数修正所述当前距离。
  20. 根据权利要求17所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    根据所述人脸图像判断所述用户的年龄;和
    根据所述第一比例及所述年龄修正所述当前距离。
  21. 根据权利要求16所述的控制方法,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述发光阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述发光阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述发光阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  22. 根据权利要求16所述的控制方法,其特征在于,多个所述发光阵列呈环形或“田”字形排布。
  23. 根据权利要求1所述的控制方法,其特征在于,所述获取所述激光投射模组与用户的当前距离,包括:
    获取所述用户的第一图像和第二图像;和
    根据所述第一图像和所述第二图像计算所述用户与所述激光投射模组的当前距离。
  24. 根据权利要求23所述的控制方法,其特征在于,所述控制方法在获取所述激光投射模组与用户的当前距离的步骤后还包括:
    获取所述用户的人脸图像;
    计算所述人脸图像中人脸所占的第一比例;和
    根据所述第一比例修正所述当前距离。
  25. 根据权利要求24所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例修正所述当前距离。
  26. 根据权利要求24所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    根据所述人脸图像判断所述用户是否佩戴眼镜;及
    在所述用户佩戴眼镜时根据所述第一比例及距离系数修正所述当前距离。
  27. 根据权利要求24所述的控制方法,其特征在于,所述根据所述第一比例修正所述当前距离的步骤包括:
    根据所述人脸图像判断所述用户的年龄;和
    根据所述第一比例及所述年龄修正所述当前距离。
  28. 根据权利要求23所述的控制方法,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述发光阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述发光阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述发光阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  29. 根据权利要求23所述的控制方法,其特征在于,多个所述发光阵列呈环形或“田”字形排布。
  30. 一种激光投射模组的控制装置,其特征在于,所述激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制;所述控制装置包括:第一获取模块,所述第一获取模块用于获取所述激光投射模组与用户的当前距离;
    确定模块,所述确定模块用于根据所述当前距离确定所述发光阵列的目标数量;和
    开启模块,所述开启模块开启所述目标数量的所述发光阵列的点光源。
  31. 根据权利要求30所述的控制装置,其特征在于,多个所述发光阵列为多个扇形阵列,多个所述扇形阵列围成圆形阵列,多个所述扇形阵列独立控制;所述确定模块用于根据所述当前距离确定所述扇形阵列的目标数量;所述开启模块用于开启所述目标数量的所述扇形阵列的点光源。
  32. 根据权利要求30所述的控制装置,其特征在于,多个所述发光阵列围成圆形阵列,多个所述发光阵列包括一个圆形子阵列和至少一个环形子阵列;所述确定模块用于根据所述当前距离确定所述子阵列的目标数量;所述开启模块用于开启所述目标数量的所述子阵列的点光源。
  33. 根据权利要求30所述的控制装置,其特征在于,所述第一获取模块还用于开启预定数量的所述发光阵列以检测所述用户与所述激光投射模组的当前距离。
  34. 根据权利要求30所述的控制装置,其特征在于,所述第一获取模块还用于:
    获取所述用户的第一图像和第二图像;
    根据所述第一图像和所述第二图像计算所述用户与所述激光投射模组的当前距离。
  35. 一种深度相机,包括图像采集器和激光投射模组,其特征在于,所述激光投射模组包括激光发射器,所述激光发射器包括多个点光源,多个所述点光源形成多个发光阵列,多个所述发光阵列独立控制;所述深度相机还包括处理器,所述处理器用于:
    获取所述激光投射模组与用户的当前距离;
    根据所述当前距离确定所述发光阵列的目标数量;和
    开启所述目标数量的所述发光阵列的点光源。
  36. 根据权利要求35所述的深度相机,其特征在于,多个所述发光阵列为多个扇形阵列,多个所述扇形阵列围成圆形阵列,多个所述扇形阵列独立控制;所述处理器还用于:根据所述当前距离确定所述 扇形阵列的目标数量;和
    开启所述目标数量的所述扇形阵列的点光源。
  37. 根据权利要求36所述的深度相机,其特征在于,所述处理器还用于:
    获取所述用户的人脸图像;
    处理所述人脸图像以确定所述用户的人脸占所述人脸图像的第一比例;和
    根据所述第一比例确定所述当前距离。
  38. 根据权利要求37所述的深度相机,其特征在于,所述处理器还用于:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例计算所述当前距离。
  39. 根据权利要求36所述的深度相机,其特征在于,所述激光投射模组用于:
    向所述用户发射检测信号;
    所述处理器还用于:
    根据被所述用户反射回的检测信号计算所述当前距离。
  40. 根据权利要求36所述的深度相机,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述扇形阵列;当所述当前距离处于第二距离区间时,开启第二目标数量的所述扇形阵列;当所述当前距离处于第三距离区间时,开启第三目标数量的所述扇形阵列;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  41. 根据权利要求36所述的深度相机,其特征在于,当所述扇形阵列的数量及所述目标数量均为多个,且所述扇形阵列的数量是所述目标数量的倍数时,开启的多个所述扇形阵列环绕所述激光发射器的中心呈中心对称分布。
  42. 根据权利要求36所述的深度相机,其特征在于,当所述扇形阵列的数量与所述目标数量均为多个,且所述扇形阵列的数量的个数为偶数时,开启的多个所述扇形阵列环绕所述激光发射器的中心呈中心对称分布。
  43. 根据权利要求35所述的深度相机,多个所述发光阵列围成圆形阵列,多个所述发光阵列包括一个圆形子阵列和至少一个环形子阵列;所述处理器还用于:根据所述当前距离确定所述子阵列的目标数量;和
    开启所述目标数量的所述子阵列的点光源。
  44. 根据权利要求43所述的深度相机,其特征在于,所述处理器还用于:
    获取所述用户的人脸图像;
    处理所述人脸图像以确定所述用户的人脸占所述人脸图像的第一比例;和
    根据所述第一比例确定所述当前距离。
  45. 根据权利要求44所述的深度相机,其特征在于,所述处理器还用于:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例计算所述当前距离。
  46. 根据权利要求43所述的深度相机,其特征在于,所述激光投射模组用于:
    向所述用户发射检测信号;
    所述处理器还用于:
    根据被所述用户反射回的检测信号计算所述当前距离。
  47. 根据权利要求43所述的深度相机,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述子阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述子阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述子阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  48. 根据权利要求43所述的深度相机,其特征在于,在同时开启所述圆形子阵列的所述点光源和至少一个所述环形子阵列的所述点光源时,距离所述圆形子阵列中心越远的所述子阵列的所述点光源的功率越高。
  49. 根据权利要求43所述的深度相机,其特征在于,当所述当前距离处于第一距离区间时,开启所述环形子阵列的所述点光源,当所述当前距离处于第二距离区间时,开启所述圆形子阵列的所述点光源;所述第一距离区间的最大值小于所述第二距离区间的最小值。
  50. 根据权利要求35所述的深度相机,其特征在于,所述处理器还用于:
    开启预定数量的所述发光阵列以检测用户与所述激光投射模组的当前距离。
  51. 根据权利要求50所述的深度相机,其特征在于,所述处理器还用于:
    获取所述用户的人脸图像;
    计算所述人脸图像中人脸所占的第一比例;和
    根据所述第一比例修正所述当前距离。
  52. 根据权利要求51所述的深度相机,其特征在于,所述处理器还用于:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例修正所述当前距离。
  53. 根据权利要求51所述的深度相机,其特征在于,所述处理器还用于:
    根据所述人脸图像判断所述用户是否佩戴眼镜;及
    在所述用户佩戴眼镜时根据所述第一比例及距离系数修正所述当前距离。
  54. 根据权利要求51所述的深度相机,其特征在于,所述处理器还用于:
    根据所述人脸图像判断所述用户的年龄;和
    根据所述第一比例及所述年龄修正所述当前距离。
  55. 根据权利要求50所述的深度相机,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述发光阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述发光阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述发光阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  56. 根据权利要求50所述的深度相机,其特征在于,多个所述发光阵列呈环形或“田”字形排布。
  57. 根据权利要求35所述的深度相机,其特征在于,所述处理器还用于:
    获取用户的第一图像和第二图像;和
    根据所述第一图像和所述第二图像计算所述用户与所述激光投射模组的当前距离。
  58. 根据权利要求57所述的深度相机,其特征在于,所述处理器还用于:
    获取所述用户的人脸图像;
    计算所述人脸图像中人脸所占的第一比例;和
    根据所述第一比例修正所述当前距离。
  59. 根据权利要求58所述的深度相机,其特征在于,所述处理器还用于:
    计算所述人脸图像中所述人脸的预设的特征区域占所述人脸的第二比例;和
    根据所述第一比例及所述第二比例修正所述当前距离。
  60. 根据权利要求58所述的深度相机,其特征在于,所述处理器还用于:
    根据所述人脸图像判断所述用户是否佩戴眼镜;及
    在所述用户佩戴眼镜时根据所述第一比例及距离系数修正所述当前距离。
  61. 根据权利要求58所述的深度相机,其特征在于,所述处理器还用于:
    根据所述人脸图像判断所述用户的年龄;和
    根据所述第一比例及所述年龄修正所述当前距离。
  62. 根据权利要求57所述的深度相机,其特征在于,当所述当前距离处于第一距离区间时,开启第一目标数量的所述发光阵列的所述点光源;当所述当前距离处于第二距离区间时,开启第二目标数量的所述发光阵列的所述点光源;当所述当前距离处于第三距离区间时,开启第三目标数量的所述发光阵列的所述点光源;所述第二距离区间位于所述第一距离区间与所述第三距离区间之间;所述第二目标数量大于所述第一目标数量且小于所述第三目标数量。
  63. 根据权利要求57所述的深度相机,其特征在于,多个所述发光阵列呈环形或“田”字形排布。
  64. 一种电子装置,其特征在于,所述电子装置包括:
    壳体;和
    权利要求35至63任意一项所述的深度相机,所述深度相机设置在所述壳体内并从所述壳体暴露以获取深度图像。
PCT/CN2019/075390 2018-03-12 2019-02-18 控制方法、控制装置、深度相机和电子装置 WO2019174436A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19742274.4A EP3567427B1 (en) 2018-03-12 2019-02-18 Control method and control device for a depth camera
US16/451,737 US11441895B2 (en) 2018-03-12 2019-06-25 Control method, depth camera and electronic device

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN201810200433.XA CN108509867B (zh) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device
CN201810202149.6 2018-03-12
CN201810200433.X 2018-03-12
CN201810200875.4A CN108333860B (zh) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device
CN201810200875.4 2018-03-12
CN201810202149.6A CN108594451B (zh) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device
CN201810201627.1A CN108227361B (zh) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device
CN201810201627.1 2018-03-12

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/451,737 Continuation US11441895B2 (en) 2018-03-12 2019-06-25 Control method, depth camera and electronic device

Publications (1)

Publication Number Publication Date
WO2019174436A1 (zh)

Family

ID=67908587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/075390 WO2019174436A1 (zh) 2018-03-12 2019-02-18 Control method, control device, depth camera and electronic device

Country Status (4)

Country Link
US (1) US11441895B2 (zh)
EP (1) EP3567427B1 (zh)
TW (1) TWI684026B (zh)
WO (1) WO2019174436A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108344376A (zh) * 2018-03-12 2018-07-31 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Laser projection module, depth camera and electronic device
EP3567427B1 (en) 2018-03-12 2023-12-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method and control device for a depth camera
FR3106888B1 (fr) * 2020-02-05 2022-02-25 Idemia Identity & Security France Terminal, in particular for biometric access control
CN117042265B (zh) * 2023-10-10 2024-02-23 深圳市千岩科技有限公司 Calibration method and device, electronic device, and computer-readable storage medium

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4159153B2 (ja) * 1998-12-03 2008-10-01 Topcon Corp. Rotary laser device and light-receiving device
WO2000046637A1 (fr) 1999-02-04 2000-08-10 Matsushita Electric Industrial Co., Ltd. Projector and display device comprising an optical element for diffraction and diffusion
JP2005257325A (ja) 2004-03-09 2005-09-22 Denso Corp Distance detection device
CN100388760C (zh) 2004-12-30 2008-05-14 Asia Optical Co., Ltd. Distance-measuring digital camera
JP2009122523A (ja) 2007-11-16 2009-06-04 Olympus Imaging Corp Strobe device
US8651381B2 (en) * 2010-11-18 2014-02-18 David Rudich Firearm sight having an ultra high definition video camera
EP2477240A1 (en) 2011-01-18 2012-07-18 Koninklijke Philips Electronics N.V. Illumination device
JP2013020569A (ja) * 2011-07-14 2013-01-31 Toshiba Corp Image processing device
CN102645828B (zh) 2011-12-01 2014-11-05 深圳市光峰光电技术有限公司 Projection device, light source system for display, and control method thereof
US9285566B2 (en) 2013-08-08 2016-03-15 Apple Inc. Mirror tilt actuation
CN104349072A (zh) * 2013-08-09 2015-02-11 Lenovo (Beijing) Co., Ltd. Control method, apparatus and electronic device
KR101569268B1 (ko) 2014-01-02 2015-11-13 IriTech, Inc. Apparatus and method for acquiring images for iris recognition using facial component distance
KR102226177B1 (ko) 2014-09-24 2021-03-10 Samsung Electronics Co., Ltd. User authentication method and electronic device therefor
KR102364084B1 (ко) * 2014-10-21 2022-02-17 LG Electronics Inc. Mobile terminal and control method therefor
CN204206551U (zh) 2014-10-23 2015-03-11 东莞市合明光电科技有限公司 Smart dual-brightness microwave-sensing LED lamp
CN104794506B (zh) 2015-04-14 2017-11-14 Tianjin 712 Communication & Broadcasting Co., Ltd. Internet-of-Things terminal that automatically adjusts transmit power using laser ranging
CN105407615A (zh) 2015-12-28 2016-03-16 Zhejiang University Flashlight that automatically adjusts brightness with illumination distance
CN106954017B (zh) 2016-01-06 2020-03-17 Hisense Mobile Communications Technology Co., Ltd. Laser focusing method, device and photographing apparatus
CN105791681B (zh) * 2016-02-29 2019-05-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device and electronic device
TWI588587B (zh) * 2016-03-21 2017-06-21 鈺立微電子股份有限公司 Image capture device and operation method thereof
CN105842956A (zh) 2016-05-26 2016-08-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Flash control method and device, and terminal equipment
CN107515509A (zh) 2016-06-15 2017-12-26 香港彩亿科技有限公司 Projector device and automatic brightness adjustment method
CN106203285A (zh) 2016-06-28 2016-12-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device and electronic device
CN106200979A (zh) 2016-07-20 2016-12-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method and control device
WO2018027530A1 (zh) 2016-08-09 2018-02-15 Shenzhen Realis Multimedia Technology Co., Ltd. Brightness adjustment method and apparatus for an infrared light source, and optical motion-capture camera
CN106972347B (zh) * 2017-05-04 2019-04-09 Shenzhen Orbbec Co., Ltd. Laser array for 3D imaging
CN107424188B (zh) * 2017-05-19 2020-06-30 Shenzhen Orbbec Co., Ltd. Structured-light projection module based on a VCSEL array light source
CN107229173B (zh) 2017-06-14 2023-10-31 Orbbec Inc. Projection module, manufacturing method thereof, and depth camera
CN206877030U (zh) 2017-07-07 2018-01-12 Shenzhen Orbbec Co., Ltd. Light-emitting device and laser projection module thereof
CN107330316B (zh) 2017-07-31 2020-01-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Unlocking processing method and related products
CN107490869B (zh) 2017-08-24 2020-08-28 Huatian Technology (Kunshan) Electronics Co., Ltd. Spatial structured-light emitting device
JP6958163B2 (ja) * 2017-09-20 2021-11-02 Aisin Corp. Display control device
CN107687841A (zh) 2017-09-27 2018-02-13 ThunderSoft Co., Ltd. Distance measurement method and device
CN107680128B (zh) 2017-10-31 2020-03-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and device, electronic device, and computer-readable storage medium
EP3567427B1 (en) 2018-03-12 2023-12-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method and control device for a depth camera

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104680113A (zh) * 2010-03-11 2015-06-03 Datalogic IP Tech S.r.l. Image capture device
CN103576428A (zh) * 2012-08-02 2014-02-12 Lite-On IT Corp. Laser projection system with a safety protection mechanism
CN103793105A (zh) * 2012-10-31 2014-05-14 Coretronic Corp. Touch module and operation method thereof
CN105373223A (zh) * 2015-10-10 2016-03-02 Huizhou TCL Mobile Communication Co., Ltd. Light-emitting device and method for automatically adjusting luminous intensity
CN108227361A (zh) * 2018-03-12 2018-06-29 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device, depth camera and electronic device
CN108333860A (zh) * 2018-03-12 2018-07-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device, depth camera and electronic device
CN108509867A (zh) * 2018-03-12 2018-09-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device, depth camera and electronic device
CN108594451A (zh) * 2018-03-12 2018-09-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control device, depth camera and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3567427A4 *

Also Published As

Publication number Publication date
EP3567427A4 (en) 2020-04-15
US20190320161A1 (en) 2019-10-17
TW201939109A (zh) 2019-10-01
EP3567427A1 (en) 2019-11-13
TWI684026B (zh) 2020-02-01
US11441895B2 (en) 2022-09-13
EP3567427B1 (en) 2023-12-13

Similar Documents

Publication Publication Date Title
WO2019174436A1 (zh) Control method, control device, depth camera and electronic device
CN108333860B (zh) Control method, control device, depth camera and electronic device
CN111474818B (zh) Control method, control device, depth camera and electronic device
CN108594451B (zh) Control method, control device, depth camera and electronic device
US10119806B2 (en) Information processing apparatus and information processing method
US10788892B2 (en) In-field illumination and imaging for eye tracking
CN108227361B (zh) Control method, control device, depth camera and electronic device
US11079839B2 (en) Eye tracking device and eye tracking method applied to video glasses and video glasses
JP2022515968A (ja) Head-mounted display calibration using a portable docking station with a calibration target
WO2020038062A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium
US9958383B2 (en) Range camera
WO2016050115A1 (en) Photography illumination compensation method, compensation apparatus, and user equipment
WO2020038064A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium
US20200259989A1 (en) High dynamic range camera assembly with augmented pixels
US10791286B2 (en) Differentiated imaging using camera assembly with augmented pixels
KR20170031185A (ko) Wide-field-of-view depth imaging
US11307654B1 (en) Ambient light eye illumination for eye-tracking in near-eye display
KR102251307B1 (ko) Thermal camera system with distance measurement function
KR20230107574A (ко) Depth measurement through a display
US11831858B2 (en) Passive three-dimensional image sensing based on referential image blurring
US11997396B2 (en) Processing apparatus, processing system, image pickup apparatus, processing method, and memory medium
US11543499B2 (en) Hybrid refractive gradient-index optics for time-of-fly sensors
WO2020114001A1 (zh) Detection system and detection method for detecting the optical power of a light-emitting module
US20230077073A1 (en) Augmented reality device and method for obtaining depth map by using depth sensor
CN114652267A (zh) Eye-tracking method and system, readable storage medium, and electronic device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019742274

Country of ref document: EP

Effective date: 20190731

NENP Non-entry into the national phase

Ref country code: DE