CN108227361B - Control method, control device, depth camera and electronic device - Google Patents


Info

Publication number
CN108227361B
CN108227361B (application CN201810201627.1A)
Authority
CN
China
Prior art keywords
sector
arrays
light sources
target number
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810201627.1A
Other languages
Chinese (zh)
Other versions
CN108227361A (en)
Inventor
韦怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810201627.1A priority Critical patent/CN108227361B/en
Publication of CN108227361A publication Critical patent/CN108227361A/en
Priority to PCT/CN2019/075390 priority patent/WO2019174436A1/en
Priority to EP19742274.4A priority patent/EP3567427B1/en
Priority to TW108108334A priority patent/TWI684026B/en
Priority to US16/451,737 priority patent/US11441895B2/en
Application granted granted Critical
Publication of CN108227361B publication Critical patent/CN108227361B/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/2006Lamp housings characterised by the light source
    • G03B21/2013Plural light sources
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/145Housing details, e.g. position adjustments thereof
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/2006Lamp housings characterised by the light source
    • G03B21/2033LED or laser light sources

Abstract

The invention discloses a control method and a control device of a laser projection module, a depth camera and an electronic device. The laser emitter in the laser projection module comprises a plurality of point light sources, the point light sources form a plurality of independently controllable fan-shaped arrays, and the fan-shaped arrays enclose a circular array. The control method comprises the following steps: acquiring the current distance between the laser projection module and a user; determining the target number of fan-shaped arrays according to the current distance; and turning on the point light sources of the target number of fan-shaped arrays. Because the point light sources are arranged into a circular array formed by a plurality of independently controllable fan-shaped arrays, the shape of the laser emitter can correspond to the circular optically effective area of the collimating element, making full use of the space. Because only the target number of fan-shaped arrays is turned on according to the distance, the problem that, with all point light sources on and the user too close to the laser projection module, the laser energy is so high that it harms the user's eyes is avoided.

Description

Control method, control device, depth camera and electronic device
Technical Field
The present invention relates to the field of imaging technologies, and in particular, to a method for controlling a laser projection module, a device for controlling a laser projection module, a depth camera, and an electronic device.
Background
The point light sources in the existing laser projection module are arranged in a partitioned rectangle. Since the optically effective area of the collimating element is circular, the circular optically effective area needs to completely cover the point light sources arranged in the rectangular structure, and the diameter of the optically effective area needs to be larger than the length of the diagonal line of the rectangle formed by the point light sources, so that a part of space is wasted. In addition, all the point light sources are usually turned on when the laser projection module is turned on, and if the distance from the user to the laser projection module is too short, the energy of laser emitted by all the turned-on point light sources is high, which may harm eyes of the user.
Disclosure of Invention
The embodiment of the invention provides a control method of a laser projection module, a control device of the laser projection module, a depth camera and an electronic device.
The invention provides a control method of a laser projection module, wherein the laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays enclose a circular array, and each sector array is independently controllable. The control method comprises the following steps:
acquiring the current distance between the laser projection module and a user;
determining the target number of sector arrays according to the current distance; and
turning on the point light sources of the target number of sector arrays.
The invention provides a control device of a laser projection module. The laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays enclose a circular array, and each sector array is independently controllable. The control device comprises an acquisition module, a determination module and a turn-on module. The acquisition module is used for acquiring the current distance between the laser projection module and a user. The determination module is used for determining the target number of sector arrays according to the current distance. The turn-on module is used for turning on the point light sources of the target number of sector arrays.
The invention provides a depth camera. The depth camera comprises an image collector, a laser projection module and a processor. The laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays enclose a circular array, and each sector array is independently controllable. The processor is used for acquiring the current distance between the laser projection module and a user, determining the target number of sector arrays according to the current distance, and turning on the point light sources of the target number of sector arrays.
The invention provides an electronic device. The electronic device comprises a housing and the depth camera. The depth camera is disposed within and exposed from the housing to acquire a depth image.
According to the control method of the laser projection module, the control device of the laser projection module, the depth camera and the electronic device, the point light sources in the laser projection module are arranged as a circular array formed by a plurality of independently controllable fan-shaped arrays. On one hand, the shape of the laser emitter can thus match the circular optically effective area of the collimating element, making full use of the space; on the other hand, the point light sources of a number of fan-shaped arrays corresponding to the detected distance can be turned on, avoiding the problem that, with all point light sources on and the user too close to the laser projection module, the energy emitted by the laser emitter is so high that it harms the user's eyes.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flow chart illustrating a method for controlling a laser projection module according to some embodiments of the present invention.
Fig. 2 is a schematic structural diagram of a laser projection module according to some embodiments of the invention.
Fig. 3 is a schematic diagram of a fan-shaped array of laser emitters in a laser projection module according to some embodiments of the present invention.
Fig. 4 is a block diagram of a control device of a laser projection module according to some embodiments of the present invention.
FIG. 5 is a schematic diagram of a depth camera in accordance with certain embodiments of the invention.
Fig. 6 is a schematic structural diagram of an electronic device according to some embodiments of the invention.
Fig. 7 is a flowchart illustrating a method for controlling a laser projection module according to some embodiments of the invention.
Fig. 8 is a block diagram of an acquisition module in a control device of a laser projection module according to some embodiments of the present invention.
Fig. 9 is a flowchart illustrating a method for controlling a laser projection module according to some embodiments of the invention.
Fig. 10 is a block diagram of a determination unit in a control device of a laser projection module according to some embodiments of the present invention.
Fig. 11 is a flowchart illustrating a method for controlling a laser projection module according to some embodiments of the invention.
Fig. 12 is a block diagram of a determination unit in a control device of a laser projection module according to some embodiments of the present invention.
Fig. 13 is a flowchart illustrating a method for controlling a laser projection module according to some embodiments of the invention.
Fig. 14 is a block diagram of a determination unit in a control device of a laser projection module according to some embodiments of the present invention.
Fig. 15 is a flowchart illustrating a method for controlling a laser projection module according to some embodiments of the invention.
Fig. 16 is a block diagram of an acquisition module in a control device of a laser projection module according to some embodiments of the present invention.
Fig. 17 to 21 are schematic diagrams illustrating the arrangement of the laser emitters in the laser projection module according to some embodiments of the present invention in a fan-shaped array.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Referring to fig. 1 to 3, the present invention provides a method for controlling a laser projection module 100. The laser projection module 100 includes a laser emitter 10, and the laser emitter 10 includes a plurality of point light sources 101. The plurality of point light sources 101 form a plurality of sector arrays 111, and the sector arrays 111 enclose a circular array 11. The sector arrays 111 may be independently controlled. The control method comprises the following steps:
01: acquiring the current distance between the laser projection module 100 and a user;
02: determining the target number of sector arrays 111 according to the current distance; and
03: turning on the point light sources 101 of the target number of sector arrays 111.
Referring to fig. 4, the present invention further provides a control device 80 of the laser projection module 100. The laser projection module 100 includes a laser emitter 10, the laser emitter 10 is a Vertical-Cavity Surface-Emitting Laser (VCSEL), and the VCSEL includes a plurality of point light sources 101. The plurality of point light sources 101 form a plurality of sector arrays 111, and the sector arrays 111 enclose a circular array 11. The sector arrays 111 may be independently controlled. The control device 80 comprises an obtaining module 81, a determination module 82 and a turn-on module 83. Step 01 may be implemented by the obtaining module 81, step 02 by the determination module 82, and step 03 by the turn-on module 83. That is, the obtaining module 81 can be used to obtain the current distance between the laser projection module 100 and the user. The determination module 82 may be configured to determine the target number of sector arrays 111 based on the current distance. The turn-on module 83 may be used to turn on the point light sources 101 of the target number of sector arrays 111.
Referring to fig. 2, the laser projection module 100 further includes a collimating element 20 and a diffractive element 30. The collimating element 20 is used for collimating the laser light emitted from the laser emitter 10, and the diffraction element 30 is used for diffracting the laser light collimated by the collimating element 20 to form a laser light pattern. The laser projection module 100 further includes a lens barrel 40 and a substrate assembly 50. The lens barrel 40 is disposed on the substrate assembly 50. The sidewall 41 of the lens barrel 40 and the substrate assembly 50 enclose a receiving cavity 42. The substrate assembly 50 includes a substrate 52 and a circuit board 51 carried on the substrate 52. The circuit board 51 is provided with a through hole 511, and the laser emitter 10 is carried on the substrate 52 and is accommodated in the through hole 511. The collimating element 20 and the diffractive element 30 are arranged in sequence along the light emitting direction of the laser emitter 10. A mount 411 extends from the side wall 41 of the lens barrel 40 toward the center of the housing cavity 42, and the diffraction element 30 is mounted on the mount 411.
The laser projection module 100 further includes a protective cover 60. The protective cover 60 may be made of a light-transmitting material, such as glass, polymethyl methacrylate (PMMA), polycarbonate (PC), polyimide (PI), or the like. Since transparent materials such as glass, PMMA, PC, and PI have excellent light transmittance, the protective cover 60 does not need to be provided with a light-transmitting hole. In this way, the protective cover 60 can prevent the diffraction element 30 from falling off and from being exposed outside the lens barrel 40, making the diffraction element 30 waterproof and dustproof. Of course, in other embodiments, the protective cover 60 may be provided with a light-transmitting hole opposite the optically effective area of the diffraction element 30 to avoid blocking the light path of the diffraction element 30.
Referring to fig. 5, the present invention further provides a depth camera 1000. The depth camera 1000 includes an image collector 200, the laser projection module 100 and a processor 300. Image collector 200 may be used to collect laser patterns and image collector 200 may be an infrared camera. The processor 300 may be used to process the laser pattern to acquire a depth image. Step 01, step 02 and step 03 may also be implemented by the processor 300. That is, the processor 300 may be further configured to obtain a current distance between the laser projection module 100 and the user, determine the target number of the sector arrays 111 according to the current distance, and turn on the point light sources 101 of the sector arrays 111 of the target number.
Referring to fig. 6, the present invention also provides an electronic device 3000. The electronic device 3000 includes a housing 2000 and the depth camera 1000 described above. The depth camera 1000 is disposed within the housing 2000 and exposed from the housing 2000 to acquire a depth image. The electronic device 3000 may be a mobile phone, a tablet computer, a notebook computer, a smart watch, a smart bracelet, smart glasses, a smart helmet, or the like.
It can be understood that the point light sources 101 in the conventional laser projection module 100 are arranged in a partitioned rectangular shape. The optically effective area of the collimating element 20 is usually circular, and the circular optically effective area is required to completely cover the rectangular arrangement of the point light sources 101, so that the diameter of the optically effective area is larger than the length of the diagonal line of the rectangle formed by the point light sources 101, and thus, a part of the space is wasted. In addition, all the point light sources 101 are usually turned on when the laser projection module 100 is turned on, and if the user is too close to the laser projection module 100, the energy of the laser emitted by the laser emitter 10 after all the point light sources 101 are turned on is high, which may cause damage to the eyes of the user.
According to the control method of the laser projection module 100, the control device 80 of the laser projection module 100, the depth camera 1000 and the electronic device 3000 of the embodiments of the invention, the point light sources 101 in the laser projection module 100 are arranged as a circular array 11 formed by a plurality of independently controllable sector arrays 111. On one hand, the shape of the laser emitter 10 can thus match the circular optically effective area of the collimating element 20, making full use of the space. On the other hand, the point light sources 101 of a target number of sector arrays 111 corresponding to the detected distance can be turned on, avoiding the problem that, with all point light sources 101 on and the user too close to the laser projection module 100, the energy emitted by the laser emitter 10 is so high that it harms the user's eyes.
Referring to fig. 7, in some embodiments, the step 01 of obtaining the current distance between the laser projection module 100 and the user includes:
011: acquiring a face image of a user;
012: processing the face image to determine a first proportion of a face of the user to the face image; and
013: the current distance is determined according to the first scale.
Referring to fig. 8, in some embodiments, the obtaining module 81 includes an obtaining unit 811, a processing unit 812, and a determining unit 813. Step 011 can be implemented by the obtaining unit 811, step 012 can be implemented by the processing unit 812, and step 013 can be implemented by the determining unit 813. That is, the acquisition unit 811 may be used to acquire a face image of the user. The processing unit 812 may be operative to process the facial image to determine a first proportion of the user's face to the facial image. The determination unit 813 may be configured to determine the current distance according to a first ratio.
Referring back to fig. 5, in some embodiments, step 011, step 012, and step 013 can all be implemented by processor 300. That is, the processor 300 may be further configured to obtain a face image of the user, process the face image to determine a first ratio of the face of the user to the face image, and determine the current distance according to the first ratio. The face image is captured by the image capturing device 200, and the processor 300 is electrically connected to the image capturing device 200 and reads the face image from the image capturing device 200.
Specifically, the face region and the background region in the face image may be segmented by extracting and analyzing facial feature points, and the first ratio is then obtained as the ratio of the number of pixels in the face region to the total number of pixels in the face image. It can be understood that a larger first ratio indicates that the user is closer to the image collector 200, and thus to the laser projection module 100, so the current distance is smaller; the laser projection module 100 then needs to turn on the point light sources 101 of a smaller target number of sector arrays 111, to avoid harming the user with overly strong projected laser. Conversely, a smaller first ratio indicates that the user is farther from the image collector 200, and thus from the laser projection module 100, so the current distance is larger; the laser projection module 100 then needs to project laser at higher power, so that the laser pattern still has appropriate intensity after being projected onto the user and reflected in order to form a depth image, and accordingly needs to turn on the point light sources 101 of a larger target number of sector arrays 111. In one example, when the same face image contains several faces, the face with the largest area is taken as the face region for calculating the first ratio, and the areas occupied by the other faces are treated as part of the background region.
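The first-ratio computation just described can be sketched as follows, assuming a face detector has already produced per-face pixel counts (the detector itself is outside the scope of this sketch):

```python
def first_ratio(face_pixel_counts, image_width, image_height):
    """First proportion: pixels of the largest detected face region
    over the total pixels of the face image. With several faces, only
    the largest one counts as the face region; the rest are treated
    as background, as described above."""
    if not face_pixel_counts:
        return 0.0  # no face detected
    return max(face_pixel_counts) / (image_width * image_height)
```

For a 400x300 image where the largest of two detected faces covers 30000 pixels, the first ratio would be 30000 / 120000 = 0.25.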
The current distance and the first ratio may be calibrated in advance. Specifically, the user is guided to shoot a face image at a preset current distance, a calibration ratio corresponding to the face image is calculated, and the correspondence between the preset current distance and the calibration ratio is stored, so that the current distance can be calculated from the actually measured first ratio in subsequent use. For example, the user is guided to shoot a face image at a current distance of 30 cm, and the calibration ratio corresponding to that image is calculated to be 45%. In actual measurement, when the calculated first ratio is R, then by the property of similar triangles

D / 30 = 45% / R, i.e. D = (45% / R) × 30,

where D is the actual current distance calculated from the actually measured first ratio R. Thus, the first proportion of the face in the face image can objectively reflect the current distance between the user and the laser projection module 100.
Referring to fig. 9, in some embodiments, the determining the current distance according to the first ratio in step 013 includes:
0131: calculating a second proportion of a preset characteristic region of the face in the face image to the face; and
0132: and calculating the current distance according to the first proportion and the second proportion.
Referring to fig. 10, in some embodiments, the determination unit 813 includes a first calculation subunit 8131 and a second calculation subunit 8132. Step 0131 may be implemented by a first calculation subunit 8131 and step 0132 may be implemented by a second calculation subunit 8132. That is, the first calculating subunit 8131 may be configured to calculate a second ratio of the preset feature region of the face in the face image to the face. The second calculating subunit 8132 may be configured to calculate the current distance according to the first ratio and the second ratio.
Referring back to fig. 5, in some embodiments, step 0131 and step 0132 may also be implemented by processor 300. That is, the processor 300 may be further configured to calculate a second ratio of the preset feature region of the face in the face image to the face, and calculate the current distance according to the first ratio and the second ratio.
It can be understood that the sizes of the faces of different users differ, so the first proportion occupied by the face in face images acquired from different users at the same distance also differs. The second ratio is the ratio of a preset feature region of the face to the face, where the preset feature region may be chosen as a feature with little variation between individuals, for example the distance between the user's eyes. When the second proportion is larger, the user's face is smaller, and the current distance calculated from the first proportion alone is too large; when the second proportion is smaller, the user's face is larger, and the current distance calculated from the first proportion alone is too small. In practical use, the first proportion, the second proportion and the current distance may be calibrated in advance. Specifically, the user is guided to shoot a face image at a preset current distance, a first calibration proportion and a second calibration proportion corresponding to the face image are calculated, and the correspondence between the preset current distance and the two calibration proportions is stored, so that the current distance can be calculated in subsequent use from the actually measured first and second proportions. For example, the user is guided to shoot a face image at a current distance of 25 cm, and the first calibration proportion is calculated to be 50% and the second calibration proportion to be 10%. In actual measurement, when the calculated first proportion is R1 and the second proportion is R2, then by the property of similar triangles

D1 / 25 = 50% / R1, i.e. D1 = (50% / R1) × 25,

where D1 is the initial current distance calculated from the actually measured first ratio R1, which can further be corrected according to the relation

D2 = D1 × (10% / R2),

where D2, the calibrated current distance calculated from the actually measured second ratio R2, is taken as the final current distance. Thus, the current distance calculated from the first and second proportions accounts for individual differences between users, yielding a more objective value.
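The two-stage calculation, with the 25 cm / 50% / 10% calibration values from the example, can be sketched as:

```python
CALIB_DISTANCE_CM = 25.0  # preset calibration distance (example value)
CALIB_R1 = 0.50           # first calibration proportion
CALIB_R2 = 0.10           # second calibration proportion

def corrected_distance(r1, r2):
    """Initial distance from the first proportion, then corrected by
    the second proportion to account for individual face size."""
    d1 = CALIB_DISTANCE_CM * CALIB_R1 / r1  # initial current distance
    d2 = d1 * CALIB_R2 / r2                 # calibrated current distance
    return d2
```

Note the direction of the correction: a second proportion above 10% (a smaller face) shrinks the distance, matching the observation above that the first proportion alone would overestimate it.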
Referring to fig. 11, in some embodiments, the determining the current distance according to the first ratio in step 013 includes:
0133: judging whether the user wears glasses or not according to the face image; and
0134: and when the user wears the glasses, calculating the current distance according to the first proportion and a preset distance coefficient.
Referring to fig. 12, in some embodiments, determination unit 813 includes a first judgment sub-unit 8133 and a third calculation sub-unit 8134. Step 0133 may be implemented by the first decision subunit 8133 and step 0134 may be implemented by the third calculation subunit 8134. That is, the first judging subunit 8133 may be configured to judge whether the user wears the glasses according to the face image. The third computing subunit 8134 is configured to compute the current distance according to the first ratio and a preset distance coefficient when the user wears the glasses.
Referring back to fig. 5, in some embodiments, step 0133 and step 0134 may also be implemented by processor 300. That is, the processor 300 may be configured to determine whether the user wears the glasses according to the face image, and calculate the current distance according to the first ratio and the preset distance coefficient when the user wears the glasses.
It can be understood that whether the user wears glasses can characterize the health of the user's eyes: a user wearing glasses may have an eye disease or poor eyesight. When projecting laser toward a user wearing glasses, a smaller number of sector arrays 111 of point light sources 101 should be turned on, so that the energy of the laser projected by the laser projection module 100 is lower and the user's eyes are not harmed. The preset distance coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82 or 0.95. For example, after the initial current distance is calculated from the first ratio, or after the calibrated current distance is calculated from the first and second ratios, that distance is multiplied by the distance coefficient to obtain the final current distance, and the target number is then determined from this current distance. In this way, users with eye diseases or poor eyesight are protected from harm caused by excessive projection power.
Further, the distance coefficient need not be fixed; for example, it may adjust itself according to the intensity of visible or infrared light in the environment. The face image collected by the image collector 200 is an infrared image, so the average infrared intensity over all pixels of the face image can first be calculated; different averages correspond to different distance coefficients, with a larger average giving a smaller coefficient and a smaller average giving a larger one.
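A sketch of applying the distance coefficient for a glasses-wearing user, including the adaptive variant driven by mean infrared intensity. The coefficient values come from the examples above, but the intensity brackets are illustrative assumptions:

```python
def apply_glasses_coefficient(distance_cm, wears_glasses,
                              mean_ir_intensity=None):
    """Scale the computed distance down for users wearing glasses,
    so a smaller target number of sectors is chosen. The coefficient
    shrinks as mean infrared intensity grows; bracket boundaries
    (100, 180 on an assumed 0-255 scale) are illustrative."""
    if not wears_glasses:
        return distance_cm
    if mean_ir_intensity is None:
        coeff = 0.82              # fixed-coefficient example
    elif mean_ir_intensity > 180:
        coeff = 0.6               # bright IR -> smaller coefficient
    elif mean_ir_intensity > 100:
        coeff = 0.78
    else:
        coeff = 0.95              # dim IR -> larger coefficient
    return distance_cm * coeff
```

Shrinking the distance fed into the target-number lookup is what reduces the number of sectors turned on, and hence the projected energy.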
Referring to fig. 13, in some embodiments, the determining the current distance according to the first ratio in step 013 includes:
0135: judging the age of the user according to the face image; and
0136: and adjusting the current distance according to the first proportion and the age.
Referring to fig. 14, in some embodiments, the determining unit 813 further includes a second determining subunit 8135 and an adjusting subunit 8136. Step 0135 may be implemented by the second determining subunit 8135 and step 0136 may be implemented by the adjusting subunit 8136. That is, the second determination subunit 8135 may be configured to determine the age of the user according to the face image. The adjusting subunit 8136 may be configured to adjust the current distance according to the first ratio and the age.
Referring back to fig. 5, in some embodiments, step 0135 and step 0136 may also be implemented by processor 300. That is, the processor 300 may be further configured to determine the age of the user according to the face image, and adjust the current distance according to the first ratio and the age.
Persons of different ages have different resistance to infrared laser light; for example, children and the elderly are more susceptible to laser burns, and laser light of an intensity appropriate for adults can injure children. In this embodiment, features such as the number, distribution, and area of facial wrinkles in the face image may be extracted to determine the age of the user; for example, the number of wrinkles at the corners of the eyes may be extracted, optionally combined with the number of wrinkles on the user's forehead. After the age of the user is determined, a scaling factor may be obtained according to the age, specifically by querying a lookup table storing the correspondence between age and scaling factor. For example, the scaling factor is 0.6 when the age is below 15 years; 0.8 when the age is between 15 and 20 years; 1.0 when the age is between 20 and 45 years; and 0.8 when the age is above 45 years. Once the scaling factor is known, the initial current distance calculated according to the first ratio, or the calibrated current distance calculated according to the first ratio and the second ratio, may be multiplied by the scaling factor to obtain the final current distance, and the target number of the sector arrays 111 is then determined according to the current distance. In this way, injury to young or elderly users due to excessive power of the projected laser can be avoided.
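The age-to-scaling-factor lookup table above can be sketched as follows. The factor values (0.6 / 0.8 / 1.0 / 0.8) come from the example in the text; the exact boundary handling (e.g. whether age exactly 20 falls in the 0.8 or the 1.0 band) is an assumption, since the text leaves it ambiguous.

```python
# The factor values are the example values from the text; boundary handling
# at ages 15, 20, and 45 is an assumption (the text is ambiguous there).

def age_scaling_factor(age: int) -> float:
    """Return the scaling factor applied to the current distance."""
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age <= 45:
        return 1.0
    return 0.8

def adjusted_distance(current_distance_cm: float, age: int) -> float:
    """Final current distance after the age-based correction."""
    return current_distance_cm * age_scaling_factor(age)

print(adjusted_distance(50.0, age=10))  # 30.0
print(adjusted_distance(50.0, age=30))  # 50.0
```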
Referring to fig. 15, in some embodiments, the step 01 of obtaining the current distance between the laser projection module 100 and the user includes:
014: transmitting a detection signal to a user; and
015: the current distance is calculated from the detection signal reflected back by the user.
Referring to fig. 16, in some embodiments, the obtaining module 81 includes a transmitting unit 814 and a calculating unit 815. Step 014 may be implemented by the transmitting unit 814 and step 015 may be implemented by the calculating unit 815. That is, the transmitting unit 814 may be used to transmit the detection signal to the user. The calculation unit 815 may be configured to calculate the current distance from the detection signal reflected back by the user.
Referring back to fig. 5, in some embodiments, step 014 may be implemented by the laser projection module 100 and step 015 may be implemented by the processor 300. That is, the laser projection module 100 may be used to emit a detection signal to a user. The processor 300 may be configured to calculate the current distance from the detection signal reflected back by the user.
Specifically, the laser projection module 100 first turns on the point light sources 101 in only one sector array 111, i.e., only the point light sources 101 in that sector array 111 emit laser light. The image collector 200 in the depth camera 1000 receives the reflected laser light to obtain a laser pattern, calculates a deviation value between each pixel point in the laser pattern and the corresponding pixel point in a predetermined pattern by using an image matching algorithm, and then obtains a depth image corresponding to the laser pattern according to the deviation values, thereby roughly estimating the current distance between the laser projection module 100 and the user. Since only the point light sources 101 of one sector array 111 are turned on to detect the current distance, the energy of the laser emitted by the laser projection module 100 is low and the user's eyes are not damaged. After the current distance between the user and the laser projection module 100 is roughly measured, the target number of sector arrays 111 to be turned on is determined according to the current distance; at this point, the laser emitted by the laser projection module 100 can meet the accuracy requirement of depth image measurement while still not harming the user's eyes.
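The deviation-to-depth step above can be sketched with the standard structured-light triangulation relation; the parameter values below (baseline, focal length in pixels) are illustrative assumptions, not figures from the patent.

```python
# Standard structured-light triangulation sketch; the baseline and focal
# length values used in the example call are assumptions for illustration.

def depth_from_disparity(baseline_mm: float, focal_px: float,
                         disparity_px: float) -> float:
    """Depth (mm) from the pixel deviation between the captured laser
    pattern and the predetermined reference pattern."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px

# e.g. a 30 mm emitter-to-collector baseline and a 500 px focal length:
print(depth_from_disparity(30.0, 500.0, 50.0))  # 300.0 (mm)
```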
In some embodiments, when the current distance is in a first distance interval, a first target number of sector arrays 111 of point light sources 101 are turned on. When the current distance is in a second distance interval, a second target number of sector arrays 111 are turned on. When the current distance is in a third distance interval, a third target number of sector arrays 111 are turned on. The second distance interval is located between the first distance interval and the third distance interval; that is, the maximum distance of the first distance interval is less than or equal to the minimum distance of the second distance interval, and the maximum distance of the second distance interval is less than the minimum distance of the third distance interval. The second target number is greater than the first target number and less than the third target number.
Specifically, suppose the point light sources 101 in the laser projection module 100 form 6 sector arrays 111, the first distance interval is [0cm, 15cm], the second distance interval is (15cm, 40cm], the third distance interval is (40cm, ∞), the first target number is 2, the second target number is 4, and the third target number is 6. Then, when the detected current distance is in [0cm, 15cm], the point light sources 101 of 2 sector arrays 111 are turned on; when the detected current distance is in (15cm, 40cm], the point light sources 101 of 4 sector arrays 111 are turned on; and when the detected current distance is in (40cm, ∞), the point light sources 101 of all 6 sector arrays 111 are turned on. That is, as the current distance increases, the target number becomes larger and more sector arrays 111 of point light sources 101 are turned on. In this way, when the current distance between the user and the laser projection module 100 is small, fewer sector arrays 111 of point light sources 101 are turned on, which avoids laser energy so large that it harms the user's eyes; when the current distance is large, more sector arrays 111 are turned on, so the image collector 200 can receive laser light with sufficient energy, and the acquisition accuracy of the depth image is higher.
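The interval-to-target-number mapping in the example above can be written out directly. The interval boundaries and counts are the example values from the text, not fixed by the method itself.

```python
# Example values from the text: intervals [0,15] cm, (15,40] cm, (40,inf) cm
# mapping to 2, 4, and 6 sector arrays respectively.

def target_sector_count(current_distance_cm: float) -> int:
    """Number of sector arrays 111 to turn on for a given current distance."""
    if current_distance_cm <= 15.0:
        return 2   # first distance interval
    if current_distance_cm <= 40.0:
        return 4   # second distance interval
    return 6       # third distance interval

print(target_sector_count(10.0))   # 2
print(target_sector_count(30.0))   # 4
print(target_sector_count(100.0))  # 6
```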
In some embodiments, when the number of the sector arrays 111 and the target number are both plural and the number of the sector arrays 111 is a multiple of the target number, the turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10.
For example, as shown in fig. 3, if the number of sector arrays 111 is 4 and the target number is 2 (4 being a multiple of 2), the 2 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10. For another example, as shown in fig. 17, if the number of sector arrays 111 is 6 and the target number is 3, there is one unopened sector array 111 between every two adjacent turned-on sector arrays 111, and the 3 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10. For another example, as shown in fig. 18, if the number of sector arrays 111 is 9 and the target number is 3, there are 2 unopened sector arrays 111 between every two adjacent turned-on sector arrays 111, and the 3 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10.
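One plausible way to realize the centrally symmetric selection in these examples is to turn on every (total // target)-th sector; the zero-based indexing around the circle is an assumption for illustration.

```python
# Assumed indexing: sectors numbered 0..total-1 counted around the circle.
# Opening every (total // target)-th sector reproduces all three examples.

def select_symmetric_sectors(total: int, target: int) -> list:
    """Indices of the sector arrays to turn on, evenly spaced around the
    circle, when total is a multiple of target."""
    if total % target != 0:
        raise ValueError("total must be a multiple of target")
    step = total // target
    return list(range(0, total, step))

print(select_symmetric_sectors(4, 2))  # [0, 2]
print(select_symmetric_sectors(6, 3))  # [0, 2, 4] - one closed sector between
print(select_symmetric_sectors(9, 3))  # [0, 3, 6] - two closed sectors between
```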
In this way, since the turned-on sector arrays 111 are distributed in central symmetry, the laser emitted by the point light sources 101 can cover a larger field of view after passing through the collimating element 20 and the diffractive element 30, and the light is emitted uniformly, which is beneficial to improving the acquisition accuracy of the depth image.
In some embodiments, when the number of the sector arrays 111 and the target number are both plural and the number of the sector arrays 111 is even, the turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10.
For example, as shown in fig. 19, the number of sector arrays 111 is 6 and the target number is 4; 2 of the 4 turned-on sector arrays 111 are adjacent to each other, the remaining 2 are adjacent to each other, and the 4 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10. For another example, as shown in fig. 20, the number of sector arrays 111 is 10 and the target number is 6; 3 of the 6 turned-on sector arrays 111 are adjacent to each other, the remaining 3 are adjacent to each other, and the 6 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10. For another example, as shown in fig. 21, the number of sector arrays 111 is 12 and the target number is 9; the 9 turned-on sector arrays 111 form three groups of 3 adjacent sector arrays each, and the 9 turned-on sector arrays 111 are distributed in central symmetry around the center of the laser emitter 10.
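A sketch reproducing the adjacent-group patterns in the three examples above: splitting the target into gcd(total, target) runs of adjacent sectors, with the runs spaced total // gcd apart, matches all three figures. This grouping rule is an inference from the examples, not stated explicitly in the text.

```python
# Inferred grouping rule: gcd(total, target) runs of adjacent sectors,
# evenly spaced around the circle (zero-based indexing assumed).

from math import gcd

def select_grouped_sectors(total: int, target: int) -> list:
    """Indices of the sector arrays to turn on, as runs of adjacent sectors
    placed in central symmetry around the circle."""
    groups = gcd(total, target)      # number of adjacent runs
    run = target // groups           # sectors per run
    spacing = total // groups        # start-to-start spacing of the runs
    return [start + i
            for start in range(0, total, spacing)
            for i in range(run)]

print(select_grouped_sectors(6, 4))   # [0, 1, 3, 4]
print(select_grouped_sectors(10, 6))  # [0, 1, 2, 5, 6, 7]
print(select_grouped_sectors(12, 9))  # [0, 1, 2, 4, 5, 6, 8, 9, 10]
```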
In this way, since the turned-on sector arrays 111 are distributed in central symmetry, the laser emitted by the point light sources 101 can cover a larger field of view after passing through the collimating element 20 and the diffractive element 30, and the light is emitted uniformly, which is beneficial to improving the acquisition accuracy of the depth image.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. The control method of the laser projection module is characterized in that the laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays surround to form a circular array, and the sector arrays are independently controlled; the control method comprises the following steps:
acquiring the current distance between the laser projection module and a user;
the obtaining of the current distance between the laser projection module and the user includes:
acquiring a face image of the user;
processing the face image to determine a first proportion of the face of the user to the face image;
calculating a second proportion of the inter-ocular distance of the face in the face image to the face; and
calculating the current distance according to the first proportion and the second proportion;
determining the target number of the sector array according to the current distance; and
turning on the point light sources of the target number of sector arrays.
2. The control method according to claim 1, wherein when the current distance is in a first distance interval, a first target number of the point light sources of the sector array is turned on; when the current distance is in a second distance interval, starting point light sources of the sector array with a second target quantity; when the current distance is in a third distance interval, starting point light sources of a third target number of the sector array; the second distance interval is located between the first distance interval and the third distance interval; the second target number is greater than the first target number and less than the third target number.
3. The control method according to claim 1, wherein when the number of the sector arrays and the target number are both plural and the number of the sector arrays is a multiple of the target number, the plurality of sector arrays that are turned on are distributed in a central symmetry manner around a center of the laser transmitter.
4. The control method according to claim 1, wherein when the number of the sector arrays and the target number are both multiple and the number of the sector arrays is even, the multiple sector arrays that are turned on are distributed in a central symmetry manner around the center of the laser transmitter.
5. The control device of the laser projection module is characterized in that the laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays surround to form a circular array, and the sector arrays are independently controlled; the control device includes:
the acquisition module is used for acquiring the current distance between the laser projection module and a user;
the acquisition module includes:
the acquisition unit is used for acquiring a face image of the user;
the processing unit is used for processing the face image to determine a first proportion of the face of the user in the face image;
a determination unit configured to:
calculating a second proportion of the inter-ocular distance of the face in the face image to the face; and
calculating the current distance according to the first proportion and the second proportion;
a determination module for determining a target number of the sector array according to the current distance; and
and the starting module is used for turning on the point light sources of the target number of sector arrays.
6. A depth camera comprises an image collector and a laser projection module, and is characterized in that the laser projection module comprises a laser emitter, the laser emitter comprises a plurality of point light sources, the point light sources form a plurality of sector arrays, the sector arrays surround to form a circular array, and the sector arrays are independently controlled; the depth camera further includes a processor to:
acquiring a face image of a user;
processing the face image to determine a first proportion of the face of the user to the face image;
calculating a second proportion of the inter-ocular distance of the face in the face image to the face;
calculating the current distance according to the first proportion and the second proportion;
determining the target number of the sector array according to the current distance; and
turning on the point light sources of the target number of sector arrays.
7. The depth camera of claim 6, wherein a first target number of the sector arrays are turned on when the current distance is in a first distance interval; when the current distance is in a second distance interval, a second target number of the sector arrays are turned on; when the current distance is in a third distance interval, a third target number of the sector arrays are turned on; the second distance interval is located between the first distance interval and the third distance interval; the second target number is greater than the first target number and less than the third target number.
8. The depth camera of claim 6, wherein when the number of the sector arrays and the target number are both plural and the number of the sector arrays is a multiple of the target number, the plurality of sector arrays that are turned on are distributed in central symmetry around the center of the laser emitter.
9. The depth camera of claim 6, wherein when the number of the sector arrays and the target number are both plural and the number of the sector arrays is even, the plurality of sector arrays that are turned on are distributed in central symmetry around the center of the laser emitter.
10. An electronic device, comprising:
a housing; and
the depth camera of any of claims 6 to 9, disposed within and exposed from the housing to acquire a depth image.
CN201810201627.1A 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device Active CN108227361B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201810201627.1A CN108227361B (en) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device
PCT/CN2019/075390 WO2019174436A1 (en) 2018-03-12 2019-02-18 Control method, control device, depth camera and electronic device
EP19742274.4A EP3567427B1 (en) 2018-03-12 2019-02-18 Control method and control device for a depth camera
TW108108334A TWI684026B (en) 2018-03-12 2019-03-12 Control method, control device, depth camera and electronic device
US16/451,737 US11441895B2 (en) 2018-03-12 2019-06-25 Control method, depth camera and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810201627.1A CN108227361B (en) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device

Publications (2)

Publication Number Publication Date
CN108227361A CN108227361A (en) 2018-06-29
CN108227361B true CN108227361B (en) 2020-05-26

Family

ID=62658379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810201627.1A Active CN108227361B (en) 2018-03-12 2018-03-12 Control method, control device, depth camera and electronic device

Country Status (1)

Country Link
CN (1) CN108227361B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019174436A1 (en) * 2018-03-12 2019-09-19 Oppo广东移动通信有限公司 Control method, control device, depth camera and electronic device
CN108881691A (en) * 2018-07-13 2018-11-23 Oppo广东移动通信有限公司 Control method, microprocessor, computer readable storage medium and computer equipment
CN108957914B (en) * 2018-07-25 2020-05-15 Oppo广东移动通信有限公司 Laser projection module, depth acquisition device and electronic equipment
CN109145811A (en) * 2018-08-17 2019-01-04 联想(北京)有限公司 Array light source control method, equipment and electronic equipment
CN109031252B (en) * 2018-08-22 2020-12-18 Oppo广东移动通信有限公司 Calibration method, calibration controller and calibration system
CN110213413B (en) * 2019-05-31 2021-05-14 Oppo广东移动通信有限公司 Control method of electronic device and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102645828B (en) * 2011-12-01 2014-11-05 深圳市光峰光电技术有限公司 Projecting device, light source system for displaying and control methods thereof
KR102364084B1 (en) * 2014-10-21 2022-02-17 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN107515509A (en) * 2016-06-15 2017-12-26 香港彩亿科技有限公司 Projector and method for automatic brightness adjustment
CN106203285A (en) * 2016-06-28 2016-12-07 广东欧珀移动通信有限公司 Control method, control device and electronic installation
CN106200979A (en) * 2016-07-20 2016-12-07 广东欧珀移动通信有限公司 control method and control device
CN106972347B (en) * 2017-05-04 2019-04-09 深圳奥比中光科技有限公司 Laser array for 3D imaging
CN107680128B (en) * 2017-10-31 2020-03-27 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN108227361A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108509867B (en) Control method, control device, depth camera and electronic device
CN108227361B (en) Control method, control device, depth camera and electronic device
CN108333860B (en) Control method, control device, depth camera and electronic device
CN108594451B (en) Control method, control device, depth camera and electronic device
CN109104583B (en) Control method and device, depth camera, electronic device and readable storage medium
TWI684026B (en) Control method, control device, depth camera and electronic device
CN108833889B (en) Control method and device, depth camera, electronic device and readable storage medium
CN109068036B (en) Control method and device, depth camera, electronic device and readable storage medium
US6091378A (en) Video processing methods and apparatus for gaze point tracking
CN108376251B (en) Control method, control device, terminal, computer device, and storage medium
JP6639422B2 (en) Apparatus for determining information associated with a reflective property of a surface
US20170135617A1 (en) Optoelectronic modules operable to distinguish between signals indicative of reflections from an object of interest and signals indicative of a spurious reflection
KR102028869B1 (en) Components, computer programs, systems and kits for judging corrective lenses
WO2019165956A1 (en) Control method, control apparatus, terminal, computer device, and storage medium
US20160148049A1 (en) Image collection with increased accuracy
KR20060105569A (en) Safe eye detection
CN106778641B (en) Sight estimation method and device
CN110226110B (en) Fresnel lens with dynamic draft for reducing optical artifacts
CN109031252B (en) Calibration method, calibration controller and calibration system
US20200204853A1 (en) Method for controlling a display parameter of a mobile device and computer program product
CN109115333A (en) The detection method and detection system of laser projecting apparatus
US11632496B2 (en) Image pickup apparatus for detecting line-of-sight position, control method therefor, and storage medium
US20230030103A1 (en) Electronic apparatus
US20230240523A1 (en) Systems, methods, and apparatus for tele-otoscopy
WO2023147346A1 (en) Systems, methods, and apparatus for tele-otoscopy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: 523860 No. 18, Wu Sha Beach Road, Changan Town, Dongguan, Guangdong

Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.

GR01 Patent grant