WO2020038064A1 - Control method and device, depth camera, electronic device, and readable storage medium - Google Patents

Control method and device, depth camera, electronic device, and readable storage medium Download PDF

Info

Publication number
WO2020038064A1
WO2020038064A1 PCT/CN2019/090078 CN2019090078W WO2020038064A1 WO 2020038064 A1 WO2020038064 A1 WO 2020038064A1 CN 2019090078 W CN2019090078 W CN 2019090078W WO 2020038064 A1 WO2020038064 A1 WO 2020038064A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
projection distance
captured image
target
ratio
Prior art date
Application number
PCT/CN2019/090078
Other languages
French (fr)
Chinese (zh)
Inventor
武隽
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020038064A1 publication Critical patent/WO2020038064A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Definitions

  • the present application relates to the field of three-dimensional imaging technology, and in particular, to a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
  • Time of flight (TOF) imaging system can calculate the depth information of the measured object by calculating the time difference between the moment when the optical transmitter emits the optical signal and the moment when the optical receiver receives the optical signal.
  • Light emitters typically include a light source and a diffuser. The light from the light source is diffused by the diffuser and then casts a uniform surface light into the scene.
  • Embodiments of the present application provide a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
  • a method for controlling a light transmitter includes: obtaining a projection distance between the light transmitter and a target subject in a scene; determining a target light emission frequency of the light transmitter according to the projection distance; The light emitter emits light at the target emission frequency.
  • the control device for an optical transmitter includes a first acquisition module, a determination module, and a control module.
  • the first obtaining module is configured to obtain a projection distance between the light emitter and a target subject in a scene.
  • the determining module is configured to determine a target light emitting frequency of the light transmitter according to the projection distance.
  • the control module is configured to control the light emitter to emit light at the target emission frequency.
  • the depth camera of the embodiment of the present application includes a light emitting emitter and a processor.
  • the processor is configured to obtain a projection distance between the light emitter and a target subject in a scene; determine a target light emission frequency of the light emitter according to the projection distance; and control the light emitter to use the target Luminous frequency glows.
  • An electronic device includes the above-mentioned depth camera, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be provided by the The one or more processors execute the program, and the program includes instructions for performing the foregoing control method.
  • the computer-readable storage medium of the embodiment of the present application includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
  • FIG. 1 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
  • FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 4 is a schematic diagram of the operation of a TOF depth camera according to some embodiments of the present application.
  • FIG. 5 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 6 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present application.
  • FIG. 7 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 8 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present application.
  • FIG. 9 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 10 is a schematic block diagram of a control device according to some embodiments of the present application.
  • FIG. 11 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 12 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
  • FIG. 13 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 14 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
  • 15 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 16 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
  • FIG. 17 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
  • FIG. 18 is a schematic diagram of a three-dimensional structure of a depth camera according to some embodiments of the present application.
  • FIG. 19 is a schematic plan view of a depth camera according to some embodiments of the present application.
  • FIG. 20 is a schematic cross-sectional view of the depth camera in FIG. 19 along the line XX-XX.
  • FIG. 21 is a schematic structural diagram of a light emitter according to some embodiments of the present application.
  • 22 and 23 are schematic structural diagrams of a light source of a light emitter according to some embodiments of the present application.
  • FIG. 24 is a schematic block diagram of an electronic device according to some embodiments of the present application.
  • 25 is a schematic diagram of a connection between a computer-readable storage medium and an electronic device according to some embodiments of the present application.
  • the present application provides a method for controlling the light transmitter 100.
  • the control method includes: obtaining a projection distance between the light emitter 100 and a target subject in the scene; determining a target light emitting frequency of the light emitter 100 according to the projection distance; and controlling the light emitter 100 to emit light at the target light emitting frequency.
  • acquiring a projection distance between the light emitter 100 and a target subject in a scene includes: acquiring a captured image of the scene; processing the captured image to determine whether a human face exists in the captured image; When a human face exists in the captured image, the first proportion of the human face in the captured image is calculated; and the projection distance is calculated according to the first proportion.
  • obtaining a projection distance between the light emitter 100 and a target subject in the scene includes: controlling the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene; The depth information calculates a projection distance between the light emitter 100 and the target subject.
  • the control method further includes: obtaining the ambient brightness of the scene; Target light emitting power of the light emitter 100; controlling the light emitter 100 to emit light at the target light emitting power.
  • calculating the projection distance according to the first scale includes: calculating a second ratio of a preset feature area of a human face in a captured image to the human face; and according to the first and second scales. Calculate the projection distance.
  • calculating the projection distance according to the first ratio includes: determining whether the target subject is wearing glasses according to the captured image; and calculating the projection distance according to the first ratio and the distance coefficient when the target subject wears glasses.
  • calculating the projection distance according to the first ratio includes: determining the age of the target subject according to the captured image; and calculating the projection distance according to the first ratio and age.
  • the present application further provides a control device 90 of the optical transmitter 100.
  • the control device 90 includes a first acquisition module 91, a determination module 92, and a control module 93.
  • the first acquisition module 91 is configured to acquire a projection distance between the light emitter 100 and a target subject in a scene.
  • the determining module 92 is configured to determine a target emission frequency of the light transmitter 100 according to the projection distance.
  • the control module 93 is configured to control the light emitter 100 to emit light at a target emission frequency.
  • the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914.
  • the first acquiring unit 911 is configured to acquire a captured image of a scene.
  • the processing unit 912 is configured to process the captured image to determine whether a human face exists in the captured image.
  • the first calculation unit 913 is configured to calculate a first proportion of a human face in a captured image when a human face exists in the captured image.
  • the second calculation unit 914 is configured to calculate a projection distance according to the first ratio.
  • the first obtaining module 91 includes a first control unit 915 and a third calculation unit 916.
  • the first control unit 915 is used to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene.
  • the third calculation unit 916 is configured to calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
  • the control device 90 further includes a second obtaining module 94 and a calculation module 95.
  • the second acquisition module 94 may be configured to acquire the ambient brightness of the scene.
  • the calculation module 95 is configured to calculate a target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance.
  • the control module 93 is further configured to control the light emitter 100 to emit light at a target light emission power.
  • the second calculation unit 914 includes a first calculation sub-unit 9141 and a second calculation sub-unit 9142.
  • the first calculation subunit 9141 is configured to calculate a second proportion of the preset feature area of the human face in the captured image.
  • the second calculation subunit 9142 is configured to calculate a projection distance according to the first scale and the second scale.
  • the second calculation unit 914 further includes a first determination sub-unit 9143 and a third calculation sub-unit 9144.
  • the first determining subunit 9143 is configured to determine whether the target subject is wearing glasses according to the captured image
  • the third calculating subunit 9144 is used to calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
  • the second calculation unit 914 further includes a second determination sub-unit 9145 and a fourth calculation sub-unit 9146.
  • the second judging subunit 9145 is configured to judge the age of the target subject according to the captured image.
  • the fourth calculation subunit 9146 is configured to calculate a projection distance according to the first ratio and age.
  • the present application further provides a depth camera 300.
  • the depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor 805.
  • the processor 805 may be configured to obtain a projection distance between the light emitter 100 and a target subject in the scene, determine a target light emission frequency of the light emitter 100 according to the projection distance, and control the light emitter 100 to emit light at the target light emission frequency.
  • the processor 805 is configured to obtain a captured image of a scene, process the captured image to determine whether a face exists in the captured image, and calculate a person in the captured image when a face exists in the captured image. The first proportion occupied by the face, and the projection distance is calculated based on the first proportion.
  • the processor 805 is further configured to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene, and calculate the light emitter 100 and the target subject according to the initial depth information. The projection distance between.
  • the processor 805 is configured to obtain the ambient brightness of the scene, calculate the target light emitting power of the light transmitter 100 according to the environment brightness and the projection distance, and control the light emitter 100 to use the target light emitting power Glow.
  • the processor 805 may be configured to calculate a second proportion of the preset feature area of the human face in the captured image to the human face, and calculate the projection distance according to the first proportion and the second proportion.
  • the processor 805 is further configured to determine whether the target subject is wearing glasses according to the captured image, and calculate the projection according to the first scale and distance coefficient when the target subject is wearing glasses. distance.
  • the processor 805 is further configured to determine the age of the target subject according to the captured image, and calculate the projection distance according to the first ratio and age.
  • the present application further provides an electronic device 800.
  • the electronic device 800 includes the depth camera 300, one or more processors 805, a memory 806, and one or more programs 807 according to any one of the foregoing embodiments.
  • One or more programs 807 are stored in the memory 806 and are configured to be executed by one or more processors 805.
  • the program 807 includes instructions for executing the control method of the optical transmitter 100 according to any one of the foregoing embodiments.
  • the present application further provides a computer-readable storage medium 901.
  • the computer-readable storage medium 901 includes a computer program 902 used in conjunction with the electronic device 800.
  • the computer program 902 can be executed by the processor 805 to complete the method for controlling the optical transmitter 100 according to any one of the foregoing embodiments.
  • Control methods include:
  • the light emitter 100 is controlled to emit light at a target emission frequency.
  • the present application further provides a control device 90 of the optical transmitter 100.
  • the control method of the light transmitter 100 according to the embodiment of the present application may be executed by the control device 90 of the light transmitter 100 according to the embodiment of the present application.
  • the control device 90 includes a first acquisition module 91, a determination module 92, and a control module 93.
  • Step 01 may be implemented by the first obtaining module 91.
  • Step 02 may be implemented by the determination module 92.
  • Step 03 may be implemented by the control module 93. That is, the first obtaining module 91 may be used to obtain a projection distance between the light emitter 100 and a target subject in the scene.
  • the determination module 92 may be configured to determine a target light emission frequency of the light transmitter 100 according to the projection distance.
  • the control module 93 may be configured to control the light emitter 100 to emit light at a target light emission frequency.
  • the depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor 805. Steps 01, 02, and 03 can be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a projection distance between the light emitter 100 and a target subject in the scene, determine a target light emission frequency of the light emitter 100 according to the projection distance, and control the light emitter 100 to emit light at the target light emission frequency. .
  • the depth camera 300 according to the embodiment of the present application can be applied to the electronic device 800.
  • the processor 805 in the depth camera 300 according to the embodiment of the present application and the processor 805 of the electronic device 800 may be the same processor 805 or two independent processors 805. In the specific embodiment of the present application, the processor 805 in the depth camera 300 and the processor 805 of the electronic device 800 are the same processor 805.
  • the electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, a smart bracelet, a smart glasses, a smart helmet), a drone, etc., and is not limited herein.
  • the depth camera 300 is a time of flight (TOF) depth camera.
  • a TOF depth camera generally includes a light transmitter 100 and a light receiver 200.
  • the light receiver 200 is configured to project a laser light into the scene, and the light receiver 200 receives the laser light reflected by a person or an object in the scene.
  • the TOF depth camera usually obtains depth information in two ways: direct acquisition and indirect acquisition.
  • the processor 805 can calculate the flight time of the laser in the scene according to the time point when the optical receiver 200 emits the laser light and the time point when the light receiver 200 receives the laser light, and calculate the scene's Depth information.
  • the light transmitter 100 emits laser light with a certain modulation frequency to the scene after pulse modulation, and the light receiver 200 collects the laser light in one or more complete pulse periods reflected back.
  • Each pixel of the light receiver 200 is composed of a light-sensitive device.
  • the light-sensitive device is connected to multiple high-frequency switches, which can direct current into different capacitors that can store electric charges.
  • the processor 805 controls the on-off of the high-frequency switch.
  • the received laser under one or more complete pulse periods is divided into two parts, and the distance between the object and the TOF depth camera can be calculated according to the current corresponding to the infrared light of these two parts. For example, as shown in Fig.
  • the charges accumulated by the two lasers are Q1 and Q2, and the duration of the laser in a pulse period is T, then the propagation time of the laser in the scene Corresponding distance
  • c is the speed of light.
  • the control method, control device 90 and depth camera 300 of the light transmitter 100 Before acquiring the depth information of the scene, the control method, control device 90 and depth camera 300 of the light transmitter 100 according to the embodiments of the present application first detect the projection distance between the target subject and the depth camera 300 in the scene, and then according to the projection distance To determine the target light emission frequency of the light emitter 100, and finally control the light emitter 100 to emit light at the target light emission frequency.
  • the projection distance has a mapping relationship with the target emission frequency.
  • the projection distance is a specific value
  • the target emission frequency is also a specific value
  • the projection distance corresponds to the target emission frequency one by one.
  • the projection distance is a range
  • the target The light emission frequency is a specific value
  • the projection distance corresponds to the target light emission frequency one to one.
  • the mapping relationship between the projection distance and the target luminous frequency may be determined based on calibration data of a large number of experiments before the depth camera 300 leaves the factory.
  • the mapping relationship between the projection distance and the target luminous frequency satisfies the law that the target luminous frequency decreases as the projection distance increases. For example, when the projection distance is 1.5 meters, the target light emitting frequency of the light transmitter 100 is 100 MHz; when the projection distance is 3 meters, the target light emitting frequency of the light transmitter 100 is 60 MHz; when the projection distance is 5 meters, the The target light emission frequency is 30 MHz, etc., so that when the projection distance is increased, the integration time of the laser light accumulated by the photosensitive device is increased by reducing the target light emission frequency to further improve the accuracy of obtaining depth information.
  • obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
  • 011 Get the captured image of the scene
  • 012 Process the captured image to determine whether a human face exists in the captured image
  • 014 Calculate the projection distance according to the first ratio.
  • the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914.
  • Step 011 may be implemented by the first obtaining unit 911.
  • Step 012 may be implemented by the processing unit 912.
  • Step 013 may be implemented by the first calculation unit 913.
  • Step 014 may be implemented by the second calculation unit 914. That is to say, the first acquisition unit 911 may be configured to acquire a captured image of a scene.
  • the processing unit 912 may be configured to process the captured image to determine whether a human face exists in the captured image.
  • the first calculation unit 913 may be configured to calculate a first proportion of a human face in the captured image when a human face exists in the captured image.
  • the second calculation unit 914 may be configured to calculate the projection distance according to the first ratio.
  • the first obtaining unit 911 may be an infrared camera (which may be the light receiver 200) or a visible light camera 400.
  • the captured image is an infrared image; when the first obtaining unit 911 is a visible light camera At 400, the captured image is a visible light image.
  • step 011, step 012, step 013, and step 014 may be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a captured image of a scene, process the captured image to determine whether a human face exists in the captured image, calculate a first proportion of the human face in the captured image when a human face exists in the captured image, And calculating the projection distance according to the first scale.
  • the processor 805 first recognizes whether a human face exists in the captured image based on a human face recognition algorithm. When a face exists in the captured image, the processor 805 extracts the face area and calculates the number of pixels occupied by the face area. Subsequently, the processor 805 divides the number of pixels in the face area by the total number of pixels in the captured image. Count to get the first proportion of the face in the captured image, and finally calculate the projection distance based on the first proportion. Generally, when the first ratio is larger, the target subject is closer to the depth camera 300, that is, the target subject is closer to the light transmitter 100, and the projection distance is smaller. When the first proportion is larger, the target subject and the depth camera are explained.
  • the distance of 300 is longer, that is, the target subject is farther from the light transmitter 100, and the projection distance is larger. Therefore, the relationship between the projection distance and the first ratio satisfies that the projection distance increases as the first ratio decreases.
  • the face with the largest area among the multiple faces may be selected as the face area to calculate the first proportion; or the area of multiple faces may also be selected To calculate the first proportion; or, the face of the holder of the electronic device 800 can be identified from multiple faces, and the first proportion can be calculated by using the face of the holder as the face area. Determining the target light emission frequency based on the distance between the holder and the depth camera 300 can improve the accuracy of obtaining the depth information corresponding to the holder and improve the user experience.
  • the first ratio has a mapping relationship with the projection distance.
  • the first ratio is a specific value and the projection distance is also a specific value.
  • the first ratio corresponds to the projection distance one by one.
  • the first ratio is a range and the projection distance is
  • the first ratio is a one-to-one correspondence with the projection distance; or, the first ratio is a range and the projection distance is also a range, and the first ratio corresponds to the projection distance one-to-one.
  • the mapping relationship between the first scale and the projection distance may be calibrated in advance.
  • the user is directed to stand at more than a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 sequentially captures captured images.
  • the processor 805 calculates the calibration ratio of the face to the captured image in each captured image, and then stores the corresponding relationship between the calibrated ratio in each captured image and the predetermined projection distance. Based on the actually measured first ratio in subsequent use Find the projection distance corresponding to the first ratio in the above mapping relationship.
  • the user is instructed to stand at a projection distance of 10 cm, 20 cm, 30 cm, 40 cm, an infrared camera or a visible light camera 400 sequentially captures captured images, and the processor 805 calculates a projection distance of 10 cm from the multiple captured images , 20 cm, 30 cm, and 40 cm respectively corresponding to the calibration ratio of 80%, 60%, 45%, 30%, and the mapping relationship between the calibration ratio and the predetermined projection distance 10cm-80%, 20cm-60%, 30cm-45 %, 40cm-30% are stored in the memory of the electronic device 800 (shown in FIG. 24) in the form of a mapping table. In subsequent use, directly find the projection distance corresponding to the first ratio in the mapping table.
  • the projection distance and the first ratio are calibrated in advance.
  • the user is directed to stand at a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects captured images.
  • the processor 805 calculates the calibration ratio of the human face in the captured image, and then stores the correspondence between the calibration ratio in the captured image and the predetermined projection distance. In subsequent use, based on the correspondence between the calibration ratio and the predetermined projection distance The relationship calculates the projection distance.
  • the processor 805 calculates that the proportion of the human face in the captured image is 45%, and in actual measurement, when When the first ratio is calculated as R, according to the properties of similar triangles, Among them, D is an actual projection distance calculated according to the actually measured first ratio R.
  • the projection distance between the target subject and the light emitter 100 can be reflected more objectively.
  • obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
  • the first obtaining module 91 includes a first control unit 915 and a third calculation unit 916.
  • Step 015 may be implemented by the first control unit 915.
  • Step 015 may be implemented by the third calculation unit 916. That is, the first control unit 915 may be used to control the light transmitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene.
  • the third calculation unit 916 may be configured to calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
  • step 015 and step 016 may both be implemented by the processor 805. That is, the processor 805 may also be used to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene, and calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
  • the processor 805 controls the light transmitter 100 to emit laser light at a predetermined light emission frequency
  • the light receiver 200 receives the laser light reflected by a person or an object in the scene
  • the processor 805 calculates an initial scene based on the reception result of the light receiver 200 Depth information.
  • the predetermined light emission frequency is less than a preset threshold, that is, when the initial depth information of the scene is obtained, the light emitter 100 emits light at a lower light emission frequency.
  • the lower light emission frequency can reduce the power consumption of the electronic device 800.
  • the projection distance between the target subject and the depth camera 300 is unknown, and it is also unknown whether the target subject is a user.
  • the light is directly emitted at a higher light frequency, if the target subject is the user and the target subject and the depth camera 300 If the distance is relatively short, the high-frequency emission of the laser light is likely to cause harm to the eyes of the user, and the light emission at a lower light emission frequency does not have the above-mentioned hidden dangers.
  • the target subject is further determined from the scene to further determine the initial depth information of the target subject.
  • the target subject is generally located in the central area of the field of view of the light receiver 200. Therefore, the central area of the field of view of the light receiver 200 can be used as the area where the target subject is located, so that the initial depth information of the pixels in the central area is used as The initial depth information of the target subject.
  • the processor 805 can calculate the average or median value of the multiple initial depth information, and use the average or median value as the projection between the light emitter 100 and the target subject. distance.
  • the projection distance between the target subject and the light emitter 100 is calculated, and the target light emitting frequency of the light emitter 100 is determined based on the projection distance, so that the light emitter 100 emits light according to the target light emitting frequency, and the depth of the obtained target subject is increased.
  • the accuracy of the information is calculated.
  • the processor 805 may further perform steps 015 and 016 to determine the target subject and The projection distance between the light emitters 100. In this way, when a human face does not exist in the captured image, the projection distance between the target subject and the light emitter 100 can also be determined.
  • the processor 805 may control the infrared camera (which may be the light receiver 200) or the visible light camera 400 to capture and shoot image. It is assumed that the captured image is collected by the visible light camera 400.
  • the visual field of the visible light camera 400 and the light receiver 200 in the electronic device 800 usually has a large overlap.
  • the manufacturer will also calibrate the relative position between the visible light camera 400 and the light receiver 200 and obtain multiple calibration parameters for matching the color information of the subsequent visible light image and the depth information of the depth image. . Therefore, after the captured image is obtained by the processor 805, the processor 805 can first identify whether a human face exists in the captured image, and when there is a human face, find it based on the matching relationship between the captured image and the initial depth image formed by the initial depth information. The initial depth information corresponding to the face, and the initial depth information corresponding to the face is used as the depth information of the target subject. If there is no human face in the captured image, the initial depth information of the pixels in the central area is used as the initial depth information of the target subject. As such, when there is a user in the scene, the projection distance between the user and the depth camera 300 can be measured more accurately.
  • control method after step 01 further includes:
  • the light emitter 100 is controlled to emit light at the target light emission power.
  • the control device 90 further includes a second obtaining module 94 and a calculation module 95.
  • Step 04 may be implemented by the second acquisition module 94.
  • Step 05 may be implemented by the calculation module 95.
  • Step 06 may be implemented by the control module 93. That is to say, the second acquisition module 94 can be used to acquire the ambient brightness of the scene.
  • the calculation module 95 may be configured to calculate a target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance.
  • the control module 93 can also be used to control the light emitter 100 to emit light at a target light emission power.
  • step 04, step 05, and step 06 can all be implemented by the processor 805. That is to say, the processor 805 can be used to obtain the ambient brightness of the scene, calculate the target light emitting power of the light transmitter 100 according to the environment brightness and the projection distance, and control the light transmitter 100 to emit light at the target light emitting power.
  • step 04, step 05, and step 02 may be performed synchronously, and steps 06 and 03 may be performed simultaneously.
  • the processor 805 controls the light transmitter 100 to emit light at the target light emission frequency, and also controls the light transmitter 100 Light is emitted at the target light emission power.
  • the ambient brightness can be detected by a light sensor.
  • the processor 805 reads the ambient brightness it detects from the light sensor.
  • the ambient brightness may also be detected by an infrared camera (which may be the light receiver 200) or the visible light camera 400.
  • the infrared camera or the visible light camera 400 captures an image of the current scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
  • the processor 805 calculates the target luminous power of the scene based on the two parameters of the ambient brightness and the projection distance. It can be understood that, first, when the ambient brightness is high, there are more infrared light components in the ambient light, and the infrared light in the ambient light and the infrared laser light emitted by the light transmitter 100 overlap with each other.
  • the optical receiver 200 When the optical receiver 200 receives both the infrared laser light emitted by the optical transmitter 100 and the infrared light in the ambient light, if the light emitting power of the infrared laser emitted by the optical transmitter 100 is low, the The ratio between the infrared laser light from the light transmitter 100 and the infrared light from the ambient light is not much different. This will cause the time point when the light receiver 200 receives the light is not accurate, or the values of Q1 and Q2 are not accurate enough. It will reduce the accuracy of acquiring depth information.
  • the transmission power of the infrared laser emitted by the optical transmitter 100 it is necessary to increase the transmission power of the infrared laser emitted by the optical transmitter 100 to reduce the influence of the infrared light in the environment on the optical receiver 200 receiving the infrared laser from the optical transmitter 100;
  • the brightness is low
  • the infrared light component contained in the ambient light is less.
  • the power of the electronic device 800 will be increased. Consuming.
  • the projection distance is long, the flight time of the laser is long, the flight distance is long, and the loss of the laser is large, which further causes the values of Q1 and Q2 to be smaller, which affects the accuracy of the depth information acquisition. Therefore, when the projection distance is large, the transmission power of the infrared laser emitted by the optical transmitter 100 can be appropriately increased.
  • the target light emitting power of the light transmitter 100 is greater than or equal to the first predetermined power P1.
  • the target light emitting power of the light transmitter 100 is less than or equal to the second predetermined power P2.
  • the first predetermined power P1 is greater than the second predetermined power P2.
  • the target light emitting power of the light transmitter 100 is between the second predetermined power P2 and the first predetermined power P1
  • the value range of the target luminous power of the optical transmitter 100 is (P2, P1).
  • jointly determining the target light emitting power of the light transmitter 100 based on the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand and improve the accuracy of obtaining the depth information of the scene on the other.
  • calculating the projection distance according to the first scale in step 014 includes:
  • 0141 Calculate the second proportion of the preset feature area of the human face in the captured image.
  • 0142 Calculate the projection distance according to the first scale and the second scale.
  • the second calculation unit 914 includes a first calculation sub-unit 9141 and a second calculation sub-unit 9142.
  • Step 0141 may be implemented by the first calculation subunit 9141
  • step 0142 may be implemented by the second calculation subunit 9142.
  • the first calculation subunit 9141 may be configured to calculate a second proportion of the preset feature area of the human face in the captured image to the human face.
  • the second calculation subunit 9142 may be configured to calculate the projection distance according to the first scale and the second scale.
  • step 0141 and step 0142 may both be implemented by the processor 805. That is to say, the processor 805 may be configured to calculate a second ratio of a preset feature area of a human face in the captured image to the human face, and calculate a projection distance according to the first ratio and the second ratio.
  • the second ratio is a ratio of the preset features of the human face to the human face.
  • the preset feature area may select a feature area with a small difference between different user individuals.
  • the preset feature trend area is the binocular distance of the user.
  • the user is directed to stand at a predetermined projection distance position, collect a captured image, and then calculate a first calibration ratio and a second calibration ratio corresponding to the captured image, and store the predetermined projection distance, the first calibration ratio, and the second
  • the corresponding relationship of the scales is calibrated, so as to calculate the projection distance according to the actual first scale and the second scale in subsequent use. For example, instruct the user to stand at a projection distance of 25 cm and collect the captured image, and then calculate the first calibration ratio corresponding to the captured image to be 50% and the second calibration ratio to be 10%.
  • D1 is the initial projection distance calculated according to the actually measured first ratio R1, which can be further based on the relationship
  • a calibrated projection distance D2 which is further calculated according to the actually measured second ratio R2, is obtained, and D2 is used as the final projection distance.
  • the projection distance calculated according to the first ratio and the second ratio takes into account the individual differences between different users, and can obtain a more objective projection distance. It can further determine a more accurate target light emission frequency based on a more accurate projection distance. And target luminous power.
  • calculating the projection distance according to the first proportion in step 014 includes:
  • 0143 judging whether the target subject is wearing glasses based on the captured image.
  • 0144 Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
  • the second calculation unit 914 further includes a first determination sub-unit 9143 and a third calculation sub-unit 9144.
  • Step 0143 may be implemented by the first judging subunit 9143.
  • Step 0144 may be implemented by the third calculation subunit 9144. That is to say, the first judging sub-unit 9143 may be used to judge whether the target subject is wearing glasses according to the captured image, and the third calculating sub-unit 9144 may be used to calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
  • step 0143 and step 0144 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine whether the target subject is wearing glasses according to the captured image, and calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
  • the optical transmitter 100 emits laser light to the user wearing the glasses. At this time, the light emitting power of the light transmitter 100 needs to be reduced so that the energy of the laser light emitted by the light transmitter 100 is small, so as not to cause damage to the eyes of the user.
  • the preset distance coefficient can be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, 0.95, etc., for example, the initial projection distance is calculated according to the first ratio, or the first distance and the second ratio are calculated.
  • the initial projection distance or the calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance, and the target luminous power is determined according to the projection distance and the ambient brightness. In this way, it is possible to avoid that the power of the emitted laser is too large to hurt the user suffering from eye disease or poor vision.
  • calculating the projection distance according to the first scale in step 014 includes:
  • 0145 judging the age of the target subject based on the captured image
  • 0146 Calculate the projection distance according to the first ratio and age.
  • the second calculation unit 914 further includes a second determination sub-unit 9145 and a fourth calculation sub-unit 9146.
  • Step 0145 may be implemented by the second judgment sub-unit 9145.
  • Step 0146 may be implemented by the fourth calculation subunit 9146. That is to say, the second judging subunit 9145 can be used to judge the age of the target subject based on the captured image.
  • the fourth calculation subunit 9146 may be configured to calculate the projection distance according to the first ratio and the age.
  • step 0145 and step 0146 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine the age of the target subject based on the captured image, and calculate the projection distance based on the first ratio and age.
  • the number, distribution, and area of feature points of facial wrinkles in the captured image can be extracted to determine the user's age, for example, the number of wrinkles at the corners of the eyes can be used to determine the user's age, or further combined with the user's forehead How many wrinkles are there to determine the user's age.
  • the proportion coefficient can be obtained according to the age of the user. Specifically, the correspondence between age and the proportion coefficient can be found in a query table.
  • the proportion coefficient is 0.6 and the age is between When the age is 15 to 20, the scale factor is 0.8; when the age is 20 to 45, the scale factor is 1.0; when the age is 45 or more, the scale factor is 0.8.
  • the initial projection distance calculated from the first scale or the calibrated projection distance calculated from the first and second scales can be multiplied by the scale factor to obtain the final projection distance. Determine the target luminous power according to the projection distance and the ambient brightness. In this way, excessive power of the emitted laser can be avoided to hurt young users or older users.
  • the electronic device 800 further includes a housing 801.
  • the housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800.
  • the housing 801 can provide protection for the functional elements from dust, drop, and water.
  • the functional elements can be a display screen 802, a visible light camera 400, a receiver, and the like.
  • the housing 801 includes a main body 803 and a movable bracket 804.
  • the movable bracket 804 can move relative to the main body 803 under the driving of a driving device.
  • the movable bracket 804 can slide relative to the main body 803 to slide.
  • FIG. 17 Some functional elements (such as the display 802) can be installed on the main body 803, and other functional elements (such as the depth camera 300, the visible light camera 400, and the receiver) can be installed on the movable bracket 804.
  • the movement of the movable bracket 804 can drive the other A part of the functional elements is retracted into or protruded from the main body 803.
  • FIG. 1 and FIG. 17 are merely examples of a specific form of the casing 801, and cannot be understood as a limitation on the casing 801 of the present application.
  • the depth camera 300 is mounted on a casing 801.
  • the housing 801 may be provided with an acquisition window, and the depth camera 300 is aligned with the acquisition window to enable the depth camera 300 to acquire depth information.
  • the depth camera 300 is mounted on a movable bracket 804.
  • the movable bracket 804 can be triggered to slide in The main body 803 is retracted into the main body by driving the depth camera 300.
  • the depth camera 300 further includes a first substrate assembly 71 and a spacer 72.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other.
  • the spacer 72 is disposed on the first substrate 711.
  • the light emitter 100 is used for projecting laser light outward, and the light emitter 100 is disposed on the cushion block 72.
  • the flexible circuit board 712 is bent and one end of the flexible circuit board 712 is connected to the first substrate 711 and the other end is connected to the light emitter 100.
  • the light receiver 200 is disposed on the first substrate 711.
  • the light receiver 200 is configured to receive laser light reflected by a person or an object in the target space.
  • the light receiver 200 includes a housing 741 and an optical element 742 provided on the housing 741.
  • the housing 741 is integrally connected with the pad 72.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712.
  • the first substrate 711 may be a printed wiring board or a flexible wiring board.
  • a control circuit of the depth camera 300 and the like may be laid on the first substrate 711.
  • One end of the flexible circuit board 712 may be connected to the first substrate 711, and the other end of the flexible circuit board 712 is connected to the circuit board 50 (shown in FIG. 20).
  • the flexible circuit board 712 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 712 can be selected.
  • the spacer 72 is disposed on the first substrate 711.
  • the spacer 72 is in contact with the first substrate 711 and is carried on the first substrate 711.
  • the spacer 72 may be combined with the first substrate 711 by means of adhesion or the like.
  • the material of the spacer 72 may be metal, plastic, or the like.
  • a surface on which the pad 72 is combined with the first substrate 711 may be a flat surface, and a surface on which the pad 72 is opposite to the combined surface may also be a flat surface, so that the light emitter 100 is disposed on the pad 72. It has better smoothness.
  • the light receiver 200 is disposed on the first substrate 711, and the contact surface between the light receiver 200 and the first substrate 711 is substantially flush with the contact surface between the pad 72 and the first substrate 711 (that is, the installation starting point of the two is at On the same plane).
  • the light receiver 200 includes a housing 741 and an optical element 742.
  • the casing 741 is disposed on the first substrate 711, and the optical element 742 is disposed on the casing 741.
  • the casing 741 may be a lens holder and a lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the casing 741.
  • the light receiver 200 further includes a photosensitive chip (not shown), and the laser light reflected by a person or an object in the target space passes through the optical element 742 and is irradiated into the photosensitive chip, and the photosensitive chip responds to the laser.
  • the housing 741 and the cushion block 72 are integrally connected.
  • the casing 741 and the cushion block 72 may be integrally formed; or the materials of the casing 741 and the cushion block 72 are different, and the two are integrally formed by two-color injection molding or the like.
  • the housing 741 and the spacer 72 may also be separately formed, and the two form a matching structure.
  • one of the housing 741 and the spacer 72 may be set on the first substrate 711, and then the other The first substrate 711 is integrally connected with each other.
  • the light transmitter 100 is disposed on the pad 72, which can increase the height of the light transmitter 100, thereby increasing the height of the surface on which the laser is emitted by the light transmitter 100.
  • the laser light emitted by the light transmitter 100 is not easily received by the light
  • the device 200 is blocked, so that the laser light can be completely irradiated on the measured object in the target space.
  • the side where the cushion block 72 is combined with the first substrate 711 is provided with a receiving cavity 723.
  • the depth camera 300 further includes an electronic component 77 provided on the first substrate 711.
  • the electronic component 77 is housed in the receiving cavity 723.
  • the electronic component 77 may be an element such as a capacitor, an inductor, a transistor, or a resistor.
  • the electronic component 77 may be electrically connected to a control line laid on the first substrate 711 and used for or controlling the operation of the light transmitter 100 or the light receiver 200.
  • the electronic component 77 is housed in the receiving cavity 723, and the space in the pad 72 is used reasonably.
  • the number of the receiving cavities 723 may be one or more, and the receiving cavities 723 may be spaced apart from each other. When mounting the pad 72, the receiving cavity 723 and the electronic component 77 may be aligned and the pad 72 may be disposed on the first substrate 711.
  • the cushion block 72 is provided with an avoiding through hole 724 connected to at least one receiving cavity 723, and at least one electronic component 77 extends into the avoiding through hole 724. It can be understood that when the electronic component 77 needs to be accommodated in the avoiding through hole, the height of the electronic component 77 is required to be not higher than the height of the receiving cavity 723. For electronic components having a height higher than the receiving cavity 723, an avoiding through hole 724 corresponding to the receiving cavity 723 may be provided, and the electronic component 77 may partially extend into the avoiding through hole 724 so as not to increase the height of the cushion 72. Arranges the electronic component 77.
  • the first substrate assembly 711 further includes a reinforcing plate 713, and the reinforcing plate 713 is coupled to a side of the first substrate 711 opposite to the pad 72.
  • the reinforcing plate 713 may cover one side of the first substrate 711, and the reinforcing plate 713 may be used to increase the strength of the first substrate 711 and prevent deformation of the first substrate 711.
  • the reinforcing plate 713 may be made of a conductive material, such as metal or alloy.
  • the reinforcing plate 713 may be electrically connected to the casing 801 to ground the reinforcing plate 713. And the interference of the static electricity of the external components on the depth camera 300 is effectively reduced.
  • the depth camera 300 further includes a connector 76 connected to the first substrate assembly 71 and used to electrically connect with electronic components external to the depth camera 300. connection.
  • the light receiver 100 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, a circuit board 50, and a driver 61.
  • the lens barrel 30 includes a ring-shaped lens barrel sidewall 33, and the ring-shaped lens barrel sidewall 33 surrounds a receiving cavity 62.
  • the side wall 33 of the lens barrel includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface.
  • the side wall 33 of the lens barrel includes a first surface 31 and a second surface 32 opposite to each other.
  • the receiving cavity 62 penetrates the first surface 31 and the second surface 32.
  • the first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the receiving cavity 62.
  • the bottom surface 35 of the mounting groove 34 is located on a side of the mounting groove 34 remote from the first surface 31.
  • the outer surface 332 of the side wall 33 of the lens barrel is circular at one end of the first surface 31, and the outer surface 332 of the side wall 33 of the lens barrel is formed with an external thread at one end of the first surface 31.
  • the circuit board 50 is disposed on the second surface 32 of the lens barrel 30 and closes one end of the receiving cavity 62.
  • the circuit board 50 may be a flexible circuit board or a printed circuit board.
  • the light source 10 is carried on the circuit board 50 and received in the receiving cavity 62.
  • the light source 10 is configured to emit laser light toward the first surface 31 (the mounting groove 34) side of the lens barrel 30.
  • the light source 10 may be a single-point light source or a multi-point light source.
  • the light source 10 may specifically be an edge-emitting laser, for example, a distributed feedback laser (Distributed Feedback Laser, DFB), etc .; when the light source 10 is a multi-point light source, the light source 10 may specifically be vertical A cavity-surface emitter (Vertical-Cavity Surface Laser, VCSEL), or the light source 10 is also a multi-point light source composed of multiple edge-emitting lasers.
  • DFB distributed Feedback Laser
  • VCSEL Vertical A cavity-surface emitter
  • VCSEL Vertical-Cavity Surface Laser
  • the vertical cavity surface emitting laser has a small height, and the use of the vertical cavity surface emitter as the light source 10 is beneficial to reducing the height of the light emitter 100 and facilitating the integration of the light emitter 100 into a mobile phone and other requirements on the thickness of the fuselage.
  • Electronic device 800 Compared with the vertical cavity surface emitter, the temperature drift of the side-emitting laser is smaller, and the influence of the temperature on the effect of the projected laser light from the light source 10 can be reduced.
  • the driver 61 is carried on the circuit board 50 and is electrically connected to the light source 10. Specifically, the driver 61 may receive the modulated input signal, and convert the input signal into a constant current source and transmit it to the light source 10, so that the light source 10 is directed toward the first side 31 of the lens barrel 30 under the action of the constant current source. Laser is emitted on one side.
  • the driver 61 of this embodiment is provided outside the lens barrel 30. In other embodiments, the driver 61 may be disposed in the lens barrel 30 and carried on the circuit board 50.
  • the diffuser 20 is mounted (supported) in the mounting groove 34 and abuts the mounting groove 34.
  • the diffuser 20 is used to diffuse the laser light passing through the diffuser 20. That is, when the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused or projected outside the lens barrel 30 by the diffuser 20.
  • the protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41.
  • a light through hole 401 is defined in the center of the top wall 41.
  • the protective side wall 42 is disposed around the top wall 41 and the light through hole 401.
  • the top wall 41 and the protection side wall 42 together form a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43.
  • the cross-section of the inner surface of the protective sidewall 42 is circular, and an inner thread is formed on the inner surface of the protective sidewall 42.
  • the internal thread of the protective sidewall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30.
  • the top wall 41 abuts the diffuser 20 so that the diffuser 40 is sandwiched between the top wall 41 and the bottom surface 35 of the mounting groove 34.
  • the opening 20 is installed in the lens barrel 30, and the diffuser 20 is installed in the installation groove 34, and the protective cover 40 is installed on the lens barrel 30 to clamp the diffuser 20 between the protective cover 40 and the installation groove. 34 between the bottom surfaces 35 so that the diffuser 20 can be fixed on the lens barrel 30.
  • glue which can prevent the glue from solidifying on the surface of the diffuser 20 and affecting the microstructure of the diffuser 20 after the glue is volatilized to a gaseous state.
  • the glue with the lens barrel 30 decreases due to aging, the diffuser 20 falls off from the lens barrel 30.
  • the structure of the vertical cavity surface emitter at this time may be:
  • the vertical cavity surface emitter includes a plurality of point light sources 101, which form a plurality of independently controllable fan-shaped arrays 11, and the plurality of fan-shaped arrays 11 surround a circle (as shown in FIG. 22) or a polygon ( (Not shown), at this time, the light emitting power of the light emitter 100 can be achieved by turning on the point light sources 101 of different numbers of the fan-shaped arrays 11, that is, the target light-emitting power corresponds to the target number of the turned-on fan-shaped arrays.
  • the part of the fan-shaped array that is turned on should be symmetrically distributed in the center, so that the laser light emitted by the light emitter 100 can be made more uniform.
  • the vertical cavity surface emitter includes a plurality of point light sources 101, and the plurality of point light sources 101 form a plurality of sub-arrays 12, and the plurality of sub-arrays 12 include at least one circular sub-array and at least one circular sub-array, and at least one circular sub-array. And at least one circular sub-array is enclosed in a circle (as shown in FIG. 23), or the multiple sub-arrays 12 include at least one polygonal sub-array and at least one circular sub-array Polygon (not shown).
  • the light emitting power of the light transmitter 100 can be adjusted by turning on the point light sources 101 of different numbers of the sub-arrays 12, that is, the target of the light-emitting power and the opened sub-arrays 12 Correspondence of quantity.
  • the present application further provides an electronic device 800.
  • the electronic device 800 includes the depth camera 300, one or more processors 805, a memory 806, and one or more programs 807 according to any one of the foregoing embodiments.
  • One or more programs 807 are stored in the memory 806 and are configured to be executed by one or more processors 805.
  • the program 807 includes instructions for executing the control method of the optical transmitter 100 according to any one of the foregoing embodiments.
  • the program 807 includes instructions for performing the following steps:
  • the light emitter 100 is controlled to emit light at a target emission frequency.
  • program 807 further includes instructions for performing the following steps:
  • 011 Get the captured image of the scene
  • 012 Process the captured image to determine whether a human face exists in the captured image
  • 014 Calculate the projection distance according to the first ratio.
  • program 807 further includes instructions for performing the following steps:
  • program 807 further includes instructions for performing the following steps:
  • the light emitter 100 is controlled to emit light at the target light emission power.
  • program 807 further includes instructions for performing the following steps:
  • 0141 Calculate the second proportion of the preset feature area of the human face in the captured image.
  • 0142 Calculate the projection distance according to the first scale and the second scale.
  • program 807 further includes instructions for performing the following steps:
  • 0143 judging whether the target subject is wearing glasses based on the captured image.
  • 0144 Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
  • program 807 further includes instructions for performing the following steps:
  • 0145 judging the age of the target subject based on the captured image
  • 0146 Calculate the projection distance according to the first ratio and age.
  • the present application further provides a computer-readable storage medium 901.
  • the computer-readable storage medium 901 includes a computer program 902 used in conjunction with the electronic device 800.
  • the computer program 902 can be executed by the processor 805 to complete the method for controlling the optical transmitter 100 according to any one of the foregoing embodiments.
  • the computer program 902 may be executed by the processor 805 to complete the following steps:
  • the light emitter 100 is controlled to emit light at a target emission frequency.
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • 011 Get the captured image of the scene
  • 012 Process the captured image to determine whether a human face exists in the captured image
  • 014 Calculate the projection distance according to the first ratio.
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • the light emitter 100 is controlled to emit light at the target light emission power.
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • 0141 Calculate the second proportion of the preset feature area of the human face in the captured image.
  • 0142 Calculate the projection distance according to the first scale and the second scale.
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • 0143 judging whether the target subject is wearing glasses based on the captured image.
  • 0144 Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
  • the computer program 902 may also be executed by the processor 805 to complete the following steps:
  • 0145 judging the age of the target subject based on the captured image
  • 0146 Calculate the projection distance according to the first ratio and age.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
  • Any process or method description in a flowchart or otherwise described herein can be understood as a module, fragment, or portion of code that includes one or more executable instructions for implementing a particular logical function or step of a process
  • the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including performing the functions in a substantially simultaneous manner or in the reverse order according to the functions involved, which should It is understood by those skilled in the art to which the embodiments of the present application pertain.

Abstract

A control method for a light emitter (100), a control device (90), a depth camera (300), an electronic device (800) and a computer readable storage medium (901). The control method comprises: acquiring a projection distance between a light emitter (100) and a target body in a scene (01); determining, according to the projection distance, a target light emission frequency of the light emitter (100) (02); and controlling the light emitter (100) to emit light at the target light emission frequency (03).

Description

控制方法及装置、深度相机、电子装置及可读存储介质Control method and device, depth camera, electronic device and readable storage medium
优先权信息Priority information
本申请请求2018年8月22日向中国国家知识产权局提交的、专利申请号为201810962843.8的专利申请的优先权和权益,并且通过参照将其全文并入此处。This application claims the priority and rights of the patent application filed with the State Intellectual Property Office of China on August 22, 2018, with a patent application number of 201810962843.8, and the entirety thereof is incorporated herein by reference.
技术领域Technical field
本申请涉及三维成像技术领域,特别涉及一种控制方法、控制装置、深度相机、电子装置和计算机可读存储介质。The present application relates to the field of three-dimensional imaging technology, and in particular, to a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
背景技术Background technique
飞行时间(Time of Flight,TOF)成像系统可通过计算光发射器发射光信号的时刻,与光接收器接收到光信号的时刻之间的时间差来计算被测物体的深度信息。光发射器通常包括光源和扩散器。光源发出的光经扩散器的扩散作用后向场景中投射均匀的面光。Time of flight (TOF) imaging system can calculate the depth information of the measured object by calculating the time difference between the moment when the optical transmitter emits the optical signal and the moment when the optical receiver receives the optical signal. Light emitters typically include a light source and a diffuser. The light from the light source is diffused by the diffuser and then casts a uniform surface light into the scene.
发明内容Summary of the Invention
本申请的实施例提供了一种控制方法、控制装置、深度相机、电子装置和计算机可读存储介质。Embodiments of the present application provide a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
本申请实施方式的光发射器的控制方法包括:获取所述光发射器与场景中的目标主体之间的投射距离;根据所述投射距离确定所述光发射器的目标发光频率;以及控制所述光发射器以所述目标发光频率发光。A method for controlling a light transmitter according to an embodiment of the present application includes: obtaining a projection distance between the light transmitter and a target subject in a scene; determining a target light emission frequency of the light transmitter according to the projection distance; The light emitter emits light at the target emission frequency.
本申请实施方式的光发射器的控制装置包括第一获取模块、确定模块、控制模块。所述第一获取模块用于获取所述光发射器与场景中的目标主体之间的投射距离。所述确定模块用于根据所述投射距离确定所述光发射器的目标发光频率。所述控制模块用于控制所述光发射器以所述目标发光频率发光。The control device for an optical transmitter according to the embodiment of the present application includes a first acquisition module, a determination module, and a control module. The first obtaining module is configured to obtain a projection distance between the light emitter and a target subject in a scene. The determining module is configured to determine a target light emitting frequency of the light transmitter according to the projection distance. The control module is configured to control the light emitter to emit light at the target emission frequency.
本申请实施方式的深度相机包括发光发射器和处理器。所述处理器用于获取所述光发射器与场景中的目标主体之间的投射距离;根据所述投射距离确定所述光发射器的目标发光频率;以及控制所述光发射器以所述目标发光频率发光。The depth camera of the embodiment of the present application includes a light emitting emitter and a processor. The processor is configured to obtain a projection distance between the light emitter and a target subject in a scene; determine a target light emission frequency of the light emitter according to the projection distance; and control the light emitter to use the target Luminous frequency glows.
本申请实施方式的电子装置包括上述的深度相机、一个或多个处理器、存储器和一个或多个程序,其中所述一个或多个程序被存储在所述存储器中,并且被配置成由所述一个或多个处理器执行,所述程序包括用于执行上述的控制方法的指令。An electronic device according to an embodiment of the present application includes the above-mentioned depth camera, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be provided by the The one or more processors execute the program, and the program includes instructions for performing the foregoing control method.
本申请实施方式的计算机可读存储介质包括与电子装置结合使用的计算机程序,所述计算机程序可被处理器执行以完成上述的控制方法。The computer-readable storage medium of the embodiment of the present application includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
本申请的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。Additional aspects and advantages of the present application will be given in part in the following description, part of which will become apparent from the following description, or be learned through practice of the present application.
附图说明BRIEF DESCRIPTION OF THE DRAWINGS
本申请上述的和/或附加的方面和优点从下面结合附图对实施例的描述中将变得明显和容易理解,其中:The above and / or additional aspects and advantages of this application will become apparent and easily understood from the following description of the embodiments in conjunction with the accompanying drawings, in which:
图1是本申请某些实施方式的电子装置的立体结构示意图。FIG. 1 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
图2是本申请某些实施方式的光发射器的控制方法的流程示意图。FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图3是本申请某些实施方式的光发射器的控制装置的模块示意图。FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
图4是本申请某些实施方式的TOF深度相机工作的原理示意图。FIG. 4 is a schematic diagram of the operation of a TOF depth camera according to some embodiments of the present application.
图5是本申请某些实施方式的光发射器的控制方法的流程示意图。FIG. 5 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图6是本申请某些实施方式的控制装置的第一获取模块的模块示意图。6 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present application.
图7是本申请某些实施方式的光发射器的控制方法的流程示意图。FIG. 7 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图8是本申请某些实施方式的控制装置的第一获取模块的模块示意图。8 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present application.
图9是本申请某些实施方式的光发射器的控制方法的流程示意图。FIG. 9 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图10是本申请某些实施方式的控制装置的模块示意图。FIG. 10 is a schematic block diagram of a control device according to some embodiments of the present application.
图11是本申请某些实施方式的光发射器的控制方法的流程示意图。11 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图12是本申请某些实施方式的控制装置的第二计算单元的模块示意图。FIG. 12 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
图13是本申请某些实施方式的光发射器的控制方法的流程示意图。13 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图14是本申请某些实施方式的控制装置的第二计算单元的模块示意图。14 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
图15是本申请某些实施方式的光发射器的控制方法的流程示意图。15 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
图16是本申请某些实施方式的控制装置的第二计算单元的模块示意图。FIG. 16 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present application.
图17是本申请某些实施方式的电子装置的立体结构示意图。FIG. 17 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
图18是本申请某些实施方式的深度相机的立体结构示意图。FIG. 18 is a schematic diagram of a three-dimensional structure of a depth camera according to some embodiments of the present application.
图19是本申请某些实施方式的深度相机的平面结构示意图。FIG. 19 is a schematic plan view of a depth camera according to some embodiments of the present application.
图20是图19中的深度相机沿XX-XX线的截面示意图。FIG. 20 is a schematic cross-sectional view of the depth camera in FIG. 19 along the line XX-XX.
图21是本申请某些实施方式的光发射器的结构示意图。FIG. 21 is a schematic structural diagram of a light emitter according to some embodiments of the present application.
图22和图23是本申请某些实施方式的光发射器的光源的结构示意图。22 and 23 are schematic structural diagrams of a light source of a light emitter according to some embodiments of the present application.
图24是本申请某些实施方式的电子装置的模块示意图。FIG. 24 is a schematic block diagram of an electronic device according to some embodiments of the present application.
图25是本申请某些实施方式的计算机可读存储介质与电子装置的连接示意图。25 is a schematic diagram of a connection between a computer-readable storage medium and an electronic device according to some embodiments of the present application.
具体实施方式detailed description
下面详细描述本申请的实施例,所述实施例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施例是示例性的,旨在用于解释本申请,而不能理解为对本申请的限制。Hereinafter, embodiments of the present application will be described in detail. Examples of the embodiments are shown in the accompanying drawings, wherein the same or similar reference numerals represent the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, and are intended to explain the present application, and should not be construed as limiting the present application.
请参阅图1。本申请提供一种光发射器100的控制方法。控制方法包括:获取光发射器100与场景中的目标主体之间的投射距离;根据投射距离确定光发射器100的目标发光频率;控制光发射器100以目标发光频率发光。See Figure 1. The present application provides a method for controlling the light transmitter 100. The control method includes: obtaining a projection distance between the light emitter 100 and a target subject in the scene; determining a target light emitting frequency of the light emitter 100 according to the projection distance; and controlling the light emitter 100 to emit light at the target light emitting frequency.
请参阅图5,在某些实施方式中,获取光发射器100与场景中的目标主体之间的投射距离包括:获取场景的拍摄图像;处理拍摄图像以判断拍摄图像中是否存在人脸;在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例;根据第一比例计算投射距离。Referring to FIG. 5, in some embodiments, acquiring a projection distance between the light emitter 100 and a target subject in a scene includes: acquiring a captured image of the scene; processing the captured image to determine whether a human face exists in the captured image; When a human face exists in the captured image, the first proportion of the human face in the captured image is calculated; and the projection distance is calculated according to the first proportion.
请参阅图7,在某些实施方式中,获取光发射器100与场景中的目标主体之间的投射距离包括:控制光发射器100以预定发光频率发光以检测场景的初始深度信息;根据初始深度信息计算光发射器100与目标主体之间的投射距离。Referring to FIG. 7, in some embodiments, obtaining a projection distance between the light emitter 100 and a target subject in the scene includes: controlling the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene; The depth information calculates a projection distance between the light emitter 100 and the target subject.
请参阅图9,在某些实施方式中,控制方法在获取光发射器100与场景中的目标主体之间的投射距离的步骤后还包括:获取场景的环境亮度;根据环境亮度及投射距离计算光发射器100的目标发光功率;控制光发射器100以目标发光功率发光。Please refer to FIG. 9. In some embodiments, after the step of obtaining the projection distance between the light emitter 100 and the target subject in the scene, the control method further includes: obtaining the ambient brightness of the scene; Target light emitting power of the light emitter 100; controlling the light emitter 100 to emit light at the target light emitting power.
请参阅图11,在某些实施方式中,根据所述第一比例计算投射距离包括:计算拍摄图像中人脸的预设特征区域占人脸的第二比例;根据第一比例及第二比例计算投射距离。Referring to FIG. 11, in some embodiments, calculating the projection distance according to the first scale includes: calculating a second ratio of a preset feature area of a human face in a captured image to the human face; and according to the first and second scales. Calculate the projection distance.
请参阅图13,在某些实施方式中,根据所述第一比例计算投射距离包括:根据拍摄图像判断目标主体是否佩戴眼镜;在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。Referring to FIG. 13, in some embodiments, calculating the projection distance according to the first ratio includes: determining whether the target subject is wearing glasses according to the captured image; and calculating the projection distance according to the first ratio and the distance coefficient when the target subject wears glasses.
请参阅图15,在某些实施方式中,根据所述第一比例计算投射距离包括:根据拍摄图像判断目标主体的年龄;根据第一比例及年龄计算投射距离。Referring to FIG. 15, in some embodiments, calculating the projection distance according to the first ratio includes: determining the age of the target subject according to the captured image; and calculating the projection distance according to the first ratio and age.
请一并参阅图2和图3,本申请还提供一种光发射器100的控制装置90。控制装置90包括第一获取模块91、确定模块92及控制模块93。第一获取模块91用于获取光发射器100与场景中的目标主体之间的投射距离。确定模块92用于根据投射距离确定光发射器100的目标发光频率。控制模块93用于控制光发射器100以目标发光频率发光。Please refer to FIG. 2 and FIG. 3 together. The present application further provides a control device 90 of the optical transmitter 100. The control device 90 includes a first acquisition module 91, a determination module 92, and a control module 93. The first acquisition module 91 is configured to acquire a projection distance between the light emitter 100 and a target subject in a scene. The determining module 92 is configured to determine a target emission frequency of the light transmitter 100 according to the projection distance. The control module 93 is configured to control the light emitter 100 to emit light at a target emission frequency.
请参阅图6,在某些实施方式中,第一获取模块91包括第一获取单元911、处理单元912、第一计算单元913和第二计算单元914。第一获取单元911用于获取场景的拍摄图像。处理单元912用于处理拍摄图像以判断拍摄图像中是否存在人脸。第一计算单元913用于在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例。第二计算单元914用于根据第一比例计算投射距离。Referring to FIG. 6, in some embodiments, the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914. The first acquiring unit 911 is configured to acquire a captured image of a scene. The processing unit 912 is configured to process the captured image to determine whether a human face exists in the captured image. The first calculation unit 913 is configured to calculate a first proportion of a human face in a captured image when a human face exists in the captured image. The second calculation unit 914 is configured to calculate a projection distance according to the first ratio.
请参阅图8,在某些实施方式中,第一获取模块91包括第一控制单元915和第三计算单元916。第 一控制单元915用于控制光发射器100以预定发光频率发光以检测场景的初始深度信息。第三计算单元916用于根据初始深度信息计算光发射器100与目标主体之间的投射距离。Referring to FIG. 8, in some embodiments, the first obtaining module 91 includes a first control unit 915 and a third calculation unit 916. The first control unit 915 is used to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene. The third calculation unit 916 is configured to calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
请参阅图10,在某些实施方式中,控制装置90还包括第二获取模块94、计算模块95。第二获取模块94可用于获取场景的环境亮度。计算模块95用于根据环境亮度及投射距离计算光发射器100的目标发光功率。控制模块93还用于控制光发射器100以目标发光功率发光。Referring to FIG. 10, in some embodiments, the control device 90 further includes a second obtaining module 94 and a calculation module 95. The second acquisition module 94 may be configured to acquire the ambient brightness of the scene. The calculation module 95 is configured to calculate a target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance. The control module 93 is further configured to control the light emitter 100 to emit light at a target light emission power.
请参阅图12,在某些实施方式中,第二计算单元914包括第一计算子单元9141和第二计算子单元9142。第一计算子单元9141用于计算拍摄图像中人脸的预设特征区域占人脸的第二比例。第二计算子单元9142用于根据第一比例及第二比例计算投射距离。Referring to FIG. 12, in some embodiments, the second calculation unit 914 includes a first calculation sub-unit 9141 and a second calculation sub-unit 9142. The first calculation subunit 9141 is configured to calculate a second proportion of the preset feature area of the human face in the captured image. The second calculation subunit 9142 is configured to calculate a projection distance according to the first scale and the second scale.
请参阅图14,在某些实施方式中,第二计算单元914还包括第一判断子单元9143和第三计算子单元9144。第一判断子单元9143用于根据拍摄图像判断目标主体是否佩戴眼镜,第三计算子单元9144用于在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。Referring to FIG. 14, in some embodiments, the second calculation unit 914 further includes a first determination sub-unit 9143 and a third calculation sub-unit 9144. The first determining subunit 9143 is configured to determine whether the target subject is wearing glasses according to the captured image, and the third calculating subunit 9144 is used to calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
请参阅图16,在某些实施方式中,第二计算单元914还包括第二判断子单元9145和第四计算子单元9146。第二判断子单元9145用于根据拍摄图像判断目标主体的年龄。第四计算子单元9146用于根据第一比例及年龄计算投射距离。Referring to FIG. 16, in some embodiments, the second calculation unit 914 further includes a second determination sub-unit 9145 and a fourth calculation sub-unit 9146. The second judging subunit 9145 is configured to judge the age of the target subject according to the captured image. The fourth calculation subunit 9146 is configured to calculate a projection distance according to the first ratio and age.
请参阅图2,本申请还提供一种深度相机300。深度相机300包括光发射器100、光接收器200和处理器805。处理器805可用于获取光发射器100与场景中的目标主体之间的投射距离、根据投射距离确定光发射器100的目标发光频率及控制光发射器100以目标发光频率发光。Referring to FIG. 2, the present application further provides a depth camera 300. The depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor 805. The processor 805 may be configured to obtain a projection distance between the light emitter 100 and a target subject in the scene, determine a target light emission frequency of the light emitter 100 according to the projection distance, and control the light emitter 100 to emit light at the target light emission frequency.
请再参阅图1,在某些实施方式中,处理器805用于获取场景的拍摄图像、处理拍摄图像以判断拍摄图像中是否存在人脸、在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例、以及根据第一比例计算投射距离。Please refer to FIG. 1 again. In some embodiments, the processor 805 is configured to obtain a captured image of a scene, process the captured image to determine whether a face exists in the captured image, and calculate a person in the captured image when a face exists in the captured image. The first proportion occupied by the face, and the projection distance is calculated based on the first proportion.
请再参阅图1,在某些实施方式中,处理器805还用于控制光发射器100以预定发光频率发光以检测场景的初始深度信息、及根据初始深度信息计算光发射器100与目标主体之间的投射距离。Please refer to FIG. 1 again. In some embodiments, the processor 805 is further configured to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene, and calculate the light emitter 100 and the target subject according to the initial depth information. The projection distance between.
请再参阅图1,在某些实施方式中,处理器805用于获取场景的环境亮度、根据环境亮度及投射距离计算光发射器100的目标发光功率、以及控制光发射器100以目标发光功率发光。Please refer to FIG. 1 again. In some embodiments, the processor 805 is configured to obtain the ambient brightness of the scene, calculate the target light emitting power of the light transmitter 100 according to the environment brightness and the projection distance, and control the light emitter 100 to use the target light emitting power Glow.
请再参阅图1,在某些实施方式中,处理器805可用于计算拍摄图像中人脸的预设特征区域占人脸的第二比例、及根据第一比例及第二比例计算投射距离。Please refer to FIG. 1 again. In some embodiments, the processor 805 may be configured to calculate a second proportion of the preset feature area of the human face in the captured image to the human face, and calculate the projection distance according to the first proportion and the second proportion.
请再参阅图1,在某些实施方式中,也即是说,处理器805还用于根据拍摄图像判断目标主体是否佩戴眼镜、以及在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。Please refer to FIG. 1 again. In some embodiments, that is, the processor 805 is further configured to determine whether the target subject is wearing glasses according to the captured image, and calculate the projection according to the first scale and distance coefficient when the target subject is wearing glasses. distance.
请再参阅图1,在某些实施方式中,处理器805还用于根据拍摄图像判断目标主体的年龄,以及根据第一比例及年龄计算投射距离。Please refer to FIG. 1 again. In some embodiments, the processor 805 is further configured to determine the age of the target subject according to the captured image, and calculate the projection distance according to the first ratio and age.
请参阅图24,本申请还提供一种电子装置800。电子装置800包括上述任意一实施方式所述的深度相机300、一个或多个处理器805、存储器806和一个或多个程序807。其中一个或多个程序807被存储在存储器806中,并且被配置成由一个或多个处理器805执行。程序807包括用于执行上述任意一项实施方式所述的光发射器100的控制方法的指令。Referring to FIG. 24, the present application further provides an electronic device 800. The electronic device 800 includes the depth camera 300, one or more processors 805, a memory 806, and one or more programs 807 according to any one of the foregoing embodiments. One or more programs 807 are stored in the memory 806 and are configured to be executed by one or more processors 805. The program 807 includes instructions for executing the control method of the optical transmitter 100 according to any one of the foregoing embodiments.
请参阅图25,本申请还提供一种计算机可读存储介质901。计算机可读存储介质901包括与电子装置800结合使用计算机程序902。计算机程序902可被处理器805执行以完成上述任意一项实施方式所述的光发射器100的控制方法。Referring to FIG. 25, the present application further provides a computer-readable storage medium 901. The computer-readable storage medium 901 includes a computer program 902 used in conjunction with the electronic device 800. The computer program 902 can be executed by the processor 805 to complete the method for controlling the optical transmitter 100 according to any one of the foregoing embodiments.
请一并参阅图1和图2,本申请提供一种光发射器100的控制方法。控制方法包括:Please refer to FIG. 1 and FIG. 2 together. The present application provides a method for controlling an optical transmitter 100. Control methods include:
01:获取光发射器100与场景中的目标主体之间的投射距离;01: Obtain the projection distance between the light emitter 100 and the target subject in the scene;
02:根据投射距离确定光发射器100的目标发光频率;和02: determining the target emission frequency of the light transmitter 100 according to the projection distance; and
03:控制光发射器100以目标发光频率发光。03: The light emitter 100 is controlled to emit light at a target emission frequency.
请一并参阅图2和图3,本申请还提供一种光发射器100的控制装置90。本申请实施方式的光发射器100的控制方法可以由本申请实施方式的光发射器100的控制装置90执行。具体地,控制装置90包括第一获取模块91、确定模块92及控制模块93。步骤01可以由第一获取模块91实现。步骤02可以由确定模块92实现。步骤03可以由控制模块93实现。也即是说,第一获取模块91可以用于获取光发射器100与场景中的目标主体之间的投射距离。确定模块92可用于根据投射距离确定光发射器100的目标发光频率。控制模块93可用于控制光发射器100以目标发光频率发光。Please refer to FIG. 2 and FIG. 3 together. The present application further provides a control device 90 of the optical transmitter 100. The control method of the light transmitter 100 according to the embodiment of the present application may be executed by the control device 90 of the light transmitter 100 according to the embodiment of the present application. Specifically, the control device 90 includes a first acquisition module 91, a determination module 92, and a control module 93. Step 01 may be implemented by the first obtaining module 91. Step 02 may be implemented by the determination module 92. Step 03 may be implemented by the control module 93. That is, the first obtaining module 91 may be used to obtain a projection distance between the light emitter 100 and a target subject in the scene. The determination module 92 may be configured to determine a target light emission frequency of the light transmitter 100 according to the projection distance. The control module 93 may be configured to control the light emitter 100 to emit light at a target light emission frequency.
请再参阅图2,本申请还提供一种深度相机300。深度相机300包括光发射器100、光接收器200和处理器805。步骤01、步骤02和步骤03可以由处理器805实现。也及是说,处理器805可用于获取光发射器100与场景中的目标主体之间的投射距离、根据投射距离确定光发射器100的目标发光频率及控制光发射器100以目标发光频率发光。Please refer to FIG. 2 again, this application further provides a depth camera 300. The depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor 805. Steps 01, 02, and 03 can be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a projection distance between the light emitter 100 and a target subject in the scene, determine a target light emission frequency of the light emitter 100 according to the projection distance, and control the light emitter 100 to emit light at the target light emission frequency. .
本申请实施方式的深度相机300可以应用于电子装置800中。本申请实施方式的深度相机300中的处理器805与电子装置800的处理器805可为同一个处理器805,也可为两个独立的处理器805。在本申请的具体实施例中,深度相机300中的处理器805与电子装置800的处理器805为同一个处理器805。电子装置800可以是手机、平板电脑、智能穿戴设备(智能手表、智能手环、智能眼镜、智能头盔)、无人机等,在此不作限制。The depth camera 300 according to the embodiment of the present application can be applied to the electronic device 800. The processor 805 in the depth camera 300 according to the embodiment of the present application and the processor 805 of the electronic device 800 may be the same processor 805 or two independent processors 805. In the specific embodiment of the present application, the processor 805 in the depth camera 300 and the processor 805 of the electronic device 800 are the same processor 805. The electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, a smart bracelet, a smart glasses, a smart helmet), a drone, etc., and is not limited herein.
具体地,本申请实施方式的深度相机300为飞行时间(Time ofFlight,TOF)深度相机。TOF深度相机通常包括一个光发射器100、一个光接收器200。光接收器200用于向场景中投射激光,光接收器200接收由场景中的人或物反射回的激光。TOF深度相机获取深度信息的方式通常包括直接获取和间接获取两种方式。直接获取方式下,处理器805可以根据光接收器200发射激光的时间点与光接收器200接收激光的时间点计算激光在场景中的飞行时间,并根据激光在场景中的飞行时间计算场景的深度信息。间接获取方式下,光发射器100向场景中发射脉冲调制后的发光频率一定的激光,光接收器200采集反射回的一个或多个完整的脉冲周期下的激光。光接收器200的每个像素都由一个感光器件组成,感光器件连接多个高频开关,可以把电流导入不同的可以存储电荷的电容里,如此,处理器805控制高频开关的开启和关闭,将接收到的一个或多个完整的脉冲周期下的激光分为两个部分,根据这两个部分的红外光对应的电流即可计算出物体与TOF深度相机的距离。例如,如图4所示,由两个部分的激光积累的电荷量分别为Q1和Q2,一个脉冲周期中激光的持续时间为T,则激光在场景中的传播时间
Figure PCTCN2019090078-appb-000001
则对应的距离
Figure PCTCN2019090078-appb-000002
其中,c为光速。当场景中人或物体距离TOF深度相机的距离较远时,若此时发光频率较高,则一方面一个脉冲周期中激光的持续时间T较短,感光器件积累激光的积分时间较短,另一方面距离较远,激光的飞行时间较长,损耗较多,如此会导致激光累积后的值Q1和Q2均较小,影响深度信息的获取精度。
Specifically, the depth camera 300 according to the embodiment of the present application is a time of flight (TOF) depth camera. A TOF depth camera generally includes a light transmitter 100 and a light receiver 200. The light receiver 200 is configured to project a laser light into the scene, and the light receiver 200 receives the laser light reflected by a person or an object in the scene. The TOF depth camera usually obtains depth information in two ways: direct acquisition and indirect acquisition. In the direct acquisition mode, the processor 805 can calculate the flight time of the laser in the scene according to the time point when the optical receiver 200 emits the laser light and the time point when the light receiver 200 receives the laser light, and calculate the scene's Depth information. In the indirect acquisition mode, the light transmitter 100 emits laser light with a certain modulation frequency to the scene after pulse modulation, and the light receiver 200 collects the laser light in one or more complete pulse periods reflected back. Each pixel of the light receiver 200 is composed of a light-sensitive device. The light-sensitive device is connected to multiple high-frequency switches, which can direct current into different capacitors that can store electric charges. In this way, the processor 805 controls the on-off of the high-frequency switch. , The received laser under one or more complete pulse periods is divided into two parts, and the distance between the object and the TOF depth camera can be calculated according to the current corresponding to the infrared light of these two parts. For example, as shown in Fig. 4, the charges accumulated by the two lasers are Q1 and Q2, and the duration of the laser in a pulse period is T, then the propagation time of the laser in the scene
Figure PCTCN2019090078-appb-000001
Corresponding distance
Figure PCTCN2019090078-appb-000002
Where c is the speed of light. When a person or object in the scene is far away from the TOF depth camera, if the light emission frequency is high at this time, on the one hand, the duration T of the laser light in a pulse period is short, and the integration time of the laser device to accumulate laser light is short. On the one hand, the distance is longer, the laser flight time is longer, and the loss is more. This will cause the values Q1 and Q2 of the laser accumulation to be small, which will affect the accuracy of depth information acquisition.
本申请实施方式的光发射器100的控制方法、控制装置90和深度相机300,在获取场景的深度信息前,首先检测场景中目标主体与深度相机300的之间的投射距离,再根据投射距离来确定光发射器100的目标发光频率,最后控制光发射器100按照目标发光频率发光。其中,投射距离与目标发光频率具有映射关系,例如,投射距离为一个具体的值,目标发光频率也为一个具体值,投射距离与目标发光频率一一对应;或者,投射距离为一个范围,目标发光频率为一个具体值,投射距离与目标发光频率一一对应。投射距离与目标发光频率之间映射关系可以是在深度相机300出厂前基于大量实验的标定数据确定得到的。投射距离与目标发光频率之间的映射关系满足目标发光频率随投射距离的增加而减小的规律。例如,投射距离为1.5米时,光发射器100的目标发光频率为100MHz;投射距离为3米时,光发射器100的目标发光频率为60MHz;投射距离为5米时,光发射器100的目标发光频率为30MHz等,从而在投射距离增加时,通过减小目标发光频率来增加感光器件积累激光的积分时间,进一步提升深度信息的获取精度。Before acquiring the depth information of the scene, the control method, control device 90 and depth camera 300 of the light transmitter 100 according to the embodiments of the present application first detect the projection distance between the target subject and the depth camera 300 in the scene, and then according to the projection distance To determine the target light emission frequency of the light emitter 100, and finally control the light emitter 100 to emit light at the target light emission frequency. The projection distance has a mapping relationship with the target emission frequency. For example, the projection distance is a specific value, the target emission frequency is also a specific value, and the projection distance corresponds to the target emission frequency one by one. Or, the projection distance is a range, and the target The light emission frequency is a specific value, and the projection distance corresponds to the target light emission frequency one to one. The mapping relationship between the projection distance and the target luminous frequency may be determined based on calibration data of a large number of experiments before the depth camera 300 leaves the factory. The mapping relationship between the projection distance and the target luminous frequency satisfies the law that the target luminous frequency decreases as the projection distance increases. For example, when the projection distance is 1.5 meters, the target light emitting frequency of the light transmitter 100 is 100 MHz; when the projection distance is 3 meters, the target light emitting frequency of the light transmitter 100 is 60 MHz; when the projection distance is 5 meters, the The target light emission frequency is 30 MHz, etc., so that when the projection distance is increased, the integration time of the laser light accumulated by the photosensitive device is increased by reducing the target light emission frequency to further improve the accuracy of obtaining depth information.
请参阅图5,在某些实施方式中,步骤01获取光发射器100与场景中的目标主体之间的投射距离包括:Referring to FIG. 5, in some embodiments, obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
011:获取场景的拍摄图像;011: Get the captured image of the scene;
012:处理拍摄图像以判断拍摄图像中是否存在人脸;012: Process the captured image to determine whether a human face exists in the captured image;
013:在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例;和013: Calculate the first proportion of the face in the captured image when a face is present in the captured image; and
014:根据第一比例计算投射距离。014: Calculate the projection distance according to the first ratio.
请参阅图6,在某些实施方式中,第一获取模块91包括第一获取单元911、处理单元912、第一计算单元913和第二计算单元914。步骤011可以由第一获取单元911实现。步骤012可以由处理单元912实现。步骤013可以由第一计算单元913实现。步骤014可以由第二计算单元914实现。也即是说,第一获取单元911可用于获取场景的拍摄图像。处理单元912可用于处理拍摄图像以判断拍摄图像中是否存在人脸。第一计算单元913可用于在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例。第 二计算单元914可用于根据第一比例计算投射距离。其中,第一获取单元911可以是红外摄像头(可以是光接收器200)或可见光摄像头400,当第一获取单元911为红外摄像头时,拍摄图像为红外图像;当第一获取单元911为可见光摄像头400时,拍摄图像为可见光图像。Referring to FIG. 6, in some embodiments, the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914. Step 011 may be implemented by the first obtaining unit 911. Step 012 may be implemented by the processing unit 912. Step 013 may be implemented by the first calculation unit 913. Step 014 may be implemented by the second calculation unit 914. That is to say, the first acquisition unit 911 may be configured to acquire a captured image of a scene. The processing unit 912 may be configured to process the captured image to determine whether a human face exists in the captured image. The first calculation unit 913 may be configured to calculate a first proportion of a human face in the captured image when a human face exists in the captured image. The second calculation unit 914 may be configured to calculate the projection distance according to the first ratio. The first obtaining unit 911 may be an infrared camera (which may be the light receiver 200) or a visible light camera 400. When the first obtaining unit 911 is an infrared camera, the captured image is an infrared image; when the first obtaining unit 911 is a visible light camera At 400, the captured image is a visible light image.
请再参阅图1,在某些实施方式中,步骤011、步骤012、步骤013和步骤014均可以由处理器805实现。也即是说,处理器805可用于获取场景的拍摄图像、处理拍摄图像以判断拍摄图像中是否存在人脸、在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例、以及根据第一比例计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 011, step 012, step 013, and step 014 may be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a captured image of a scene, process the captured image to determine whether a human face exists in the captured image, calculate a first proportion of the human face in the captured image when a human face exists in the captured image, And calculating the projection distance according to the first scale.
具体地,处理器805先基于人脸识别算法识别拍摄图像中是否存在人脸。在拍摄图像中存在人脸时,处理器805提取出人脸区域并计算人脸区域所占的像素个数,随后,处理器805将人脸区域的像素个数除以拍摄图像的总像素个数以得到拍摄图像中人脸所占的第一比例,最后基于第一比例计算投射距离。一般地,当第一比例较大时,说明目标主体比较靠近深度相机300,也就是目标主体比较靠近光发射器100,投射距离较小;当第一比例较大时,说明目标主体与深度相机300距离较远,也就是目标主体距离光发射器100较远,投射距离较大。因此,投射距离与第一比例之间的关系满足投射距离随第一比例的减小而增大。在一个例子中,当拍摄图像中包含多张人脸时,可以选取多张人脸中面积最大的人脸作为人脸区域用以计算第一比例;或者,也可以选取多张人脸的面积的平均值来计算第一比例;或者,可以从多张人脸中识别出电子装置800的持有者的人脸,将持有者的人脸作为人脸区域来计算第一比例,如此,基于持有者与深度相机300的距离来确定目标发光频率,可以提升持有者对应的深度信息的获取精度,提升用户使用体验。Specifically, the processor 805 first recognizes whether a human face exists in the captured image based on a human face recognition algorithm. When a face exists in the captured image, the processor 805 extracts the face area and calculates the number of pixels occupied by the face area. Subsequently, the processor 805 divides the number of pixels in the face area by the total number of pixels in the captured image. Count to get the first proportion of the face in the captured image, and finally calculate the projection distance based on the first proportion. Generally, when the first ratio is larger, the target subject is closer to the depth camera 300, that is, the target subject is closer to the light transmitter 100, and the projection distance is smaller. When the first proportion is larger, the target subject and the depth camera are explained. The distance of 300 is longer, that is, the target subject is farther from the light transmitter 100, and the projection distance is larger. Therefore, the relationship between the projection distance and the first ratio satisfies that the projection distance increases as the first ratio decreases. In one example, when the captured image contains multiple faces, the face with the largest area among the multiple faces may be selected as the face area to calculate the first proportion; or the area of multiple faces may also be selected To calculate the first proportion; or, the face of the holder of the electronic device 800 can be identified from multiple faces, and the first proportion can be calculated by using the face of the holder as the face area. Determining the target light emission frequency based on the distance between the holder and the depth camera 300 can improve the accuracy of obtaining the depth information corresponding to the holder and improve the user experience.
第一比例与投射距离具有映射关系,例如,第一比例为一个具体值,投射距离也为一个具体值,第一比例与投射距离一一对应;或者,第一比例为一个范围,投射距离为一个具体值,第一比例为投射距离一一对应;或者,第一比例为一个范围,投射距离也为一个范围,第一比例与投射距离一一对应。具体地,第一比例与投射距离之间的映射关系可以预先标定。在标定时,指引用户分别站在距离红外摄像头或可见光摄像头400多个预定投射距离处,红外摄像头或可见光摄像头400依次采集拍摄图像。处理器805计算每张拍摄图像中人脸占拍摄图像的标定比例,再存储每张拍摄图像中的标定比例与预定投射距离之间的对应关系,在后续使用时,基于实际测量的第一比例在上述映射关系中寻找与第一比例对应的投射距离。例如,指引用户在投射距离为10厘米、20厘米、30厘米、40厘米的位置处站立,红外摄像头或可见光摄像头400依次采集拍摄图像,处理器805根据多张拍摄图像计算出与投射距离10厘米、20厘米、30厘米、40厘米分别对应的标定比例80%、60%、45%、30%,并将标定比例与预定投射距离的映射关系10cm-80%、20cm-60%、30cm-45%、40cm-30%以映射表的形式存储在电子装置800的存储器(图24所示)中。在后续使用时,直接在映射表中寻找与第一比例对应的投射距离。The first ratio has a mapping relationship with the projection distance. For example, the first ratio is a specific value and the projection distance is also a specific value. The first ratio corresponds to the projection distance one by one. Or, the first ratio is a range and the projection distance is For a specific value, the first ratio is a one-to-one correspondence with the projection distance; or, the first ratio is a range and the projection distance is also a range, and the first ratio corresponds to the projection distance one-to-one. Specifically, the mapping relationship between the first scale and the projection distance may be calibrated in advance. At the time of calibration, the user is directed to stand at more than a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 sequentially captures captured images. The processor 805 calculates the calibration ratio of the face to the captured image in each captured image, and then stores the corresponding relationship between the calibrated ratio in each captured image and the predetermined projection distance. Based on the actually measured first ratio in subsequent use Find the projection distance corresponding to the first ratio in the above mapping relationship. For example, the user is instructed to stand at a projection distance of 10 cm, 20 cm, 30 cm, 40 cm, an infrared camera or a visible light camera 400 sequentially captures captured images, and the processor 805 calculates a projection distance of 10 cm from the multiple captured images , 20 cm, 30 cm, and 40 cm respectively corresponding to the calibration ratio of 80%, 60%, 45%, 30%, and the mapping relationship between the calibration ratio and the predetermined projection distance 10cm-80%, 20cm-60%, 30cm-45 %, 40cm-30% are stored in the memory of the electronic device 800 (shown in FIG. 24) in the form of a mapping table. In subsequent use, directly find the projection distance corresponding to the first ratio in the mapping table.
或者,预先对投射距离与第一比例进行标定。在标定时,指引用户站在距离红外摄像头或可见光摄像头400的某一个预定投射距离处,红外摄像头或可见光摄像头400采集拍摄图像。处理器805计算拍摄图像中人脸占拍摄图像的标定比例,再存储拍摄图像中的标定比例与预定投射距离之间的对应关系,在后续使用时,基于标定比例与预定投射距离之间的对应关系计算投射距离。例如,指引用户在投射距离为30厘米的位置处站立,红外摄像头或可见光摄像头400采集拍摄图像,处理器805计算到人脸在拍摄图像中的占比为45%,而在实际测量中,当计算得到第一比例为R时,则依据相似三角形的性质有
Figure PCTCN2019090078-appb-000003
其中,D依据实际测量的第一比例R计算的实际的投射距离。
Alternatively, the projection distance and the first ratio are calibrated in advance. During calibration, the user is directed to stand at a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects captured images. The processor 805 calculates the calibration ratio of the human face in the captured image, and then stores the correspondence between the calibration ratio in the captured image and the predetermined projection distance. In subsequent use, based on the correspondence between the calibration ratio and the predetermined projection distance The relationship calculates the projection distance. For example, to guide the user to stand at a position with a projection distance of 30 cm, the infrared camera or visible light camera 400 captures the captured image, the processor 805 calculates that the proportion of the human face in the captured image is 45%, and in actual measurement, when When the first ratio is calculated as R, according to the properties of similar triangles,
Figure PCTCN2019090078-appb-000003
Among them, D is an actual projection distance calculated according to the actually measured first ratio R.
如此,依据拍摄图像中人脸所占的第一比例,可以较为客观地反应目标主体与光发射器100之间的投射距离。In this way, according to the first proportion of the human face in the captured image, the projection distance between the target subject and the light emitter 100 can be reflected more objectively.
请参阅图7,在某些实施方式中,步骤01获取光发射器100与场景中的目标主体之间的投射距离包括:Referring to FIG. 7, in some embodiments, obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
015:控制光发射器100以预定发光频率发光以检测场景的初始深度信息;和015: controlling the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
016:根据初始深度信息计算光发射器100与目标主体之间的投射距离。016: Calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
请参阅图8,在某些实施方式中,第一获取模块91包括第一控制单元915和第三计算单元916。步骤015可以由第一控制单元915实现。步骤015可以由第三计算单元916实现。也即是说,第一控制单元915可用于控制光发射器100以预定发光频率发光以检测场景的初始深度信息。第三计算单元916可用于根据初始深度信息计算光发射器100与目标主体之间的投射距离。Referring to FIG. 8, in some embodiments, the first obtaining module 91 includes a first control unit 915 and a third calculation unit 916. Step 015 may be implemented by the first control unit 915. Step 015 may be implemented by the third calculation unit 916. That is, the first control unit 915 may be used to control the light transmitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene. The third calculation unit 916 may be configured to calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
请再参阅图1,在某些实施方式中,步骤015和步骤016均可以由处理器805实现。也即是说,处理器805还可以用于控制光发射器100以预定发光频率发光以检测场景的初始深度信息、及根据初始深度信息计算光发射器100与目标主体之间的投射距离。Please refer to FIG. 1 again. In some embodiments, step 015 and step 016 may both be implemented by the processor 805. That is, the processor 805 may also be used to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene, and calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
具体地,处理器805控制光发射器100以预定发光频率发射激光,光接收器200接收由场景中的人或物体反射回的激光,处理器805基于光接收器200的接收结果计算场景的初始深度信息。其中,预定发光频率小于预设阈值,也即是说,获取场景的初始深度信息时,光发射器100以较低的发光频率发光,发光频率较低一方面可以减小电子装置800的功耗,另一方面,此时目标主体与深度相机300之间的投射距离未知,目标主体是否为用户也未知,若直接以较高的发光频率发光,如果目标主体为用户且目标主体与深度相机300的距离较近,则激光的高频率出射容易对用户的眼睛产生危害,而以较低的发光频率发光则不会存在上述的安全隐患。Specifically, the processor 805 controls the light transmitter 100 to emit laser light at a predetermined light emission frequency, the light receiver 200 receives the laser light reflected by a person or an object in the scene, and the processor 805 calculates an initial scene based on the reception result of the light receiver 200 Depth information. Wherein, the predetermined light emission frequency is less than a preset threshold, that is, when the initial depth information of the scene is obtained, the light emitter 100 emits light at a lower light emission frequency. The lower light emission frequency can reduce the power consumption of the electronic device 800. On the other hand, at this time, the projection distance between the target subject and the depth camera 300 is unknown, and it is also unknown whether the target subject is a user. If the light is directly emitted at a higher light frequency, if the target subject is the user and the target subject and the depth camera 300 If the distance is relatively short, the high-frequency emission of the laser light is likely to cause harm to the eyes of the user, and the light emission at a lower light emission frequency does not have the above-mentioned hidden dangers.
处理器805计算出场景的初始深度信息后,进一步地从场景中确定出目标主体,以进一步确定目标主体的初始深度信息。具体地,目标主体一般处于光接收器200的视场的中央区域,因此,可以将光接收器200视场的中央区域作为目标主体所在区域,从而将中央区域的这部分像素的初始深度信息作为目标主体的初始深度信息。一般地,目标主体的初始深度信息的值有多个,处理器805可计算出多个初始深度信息的均值或中值,并将均值或中值作为光发射器100与目标主体之间的投射距离。如此,计算出目标主体与光发射器100之间的投射距离,再基于投射距离确定光发射器100的目标发光频率,从而使得光发射器100按照目标发光频率发光,提升获取的目标主体的深度信息的精度。After the processor 805 calculates the initial depth information of the scene, the target subject is further determined from the scene to further determine the initial depth information of the target subject. Specifically, the target subject is generally located in the central area of the field of view of the light receiver 200. Therefore, the central area of the field of view of the light receiver 200 can be used as the area where the target subject is located, so that the initial depth information of the pixels in the central area is used as The initial depth information of the target subject. Generally, there are multiple values of the initial depth information of the target subject. The processor 805 can calculate the average or median value of the multiple initial depth information, and use the average or median value as the projection between the light emitter 100 and the target subject. distance. In this way, the projection distance between the target subject and the light emitter 100 is calculated, and the target light emitting frequency of the light emitter 100 is determined based on the projection distance, so that the light emitter 100 emits light according to the target light emitting frequency, and the depth of the obtained target subject is increased. The accuracy of the information.
在某些实施方式中,在步骤012处理拍摄图像以判断拍摄图像中是否存在人脸后,若拍摄图像中不存在人脸,则处理器805可进一步执行步骤015和步骤016以确定目标主体与光发射器100之间的投射距离。如此,在拍摄图像中不存在人脸时,也能够确定出目标主体与光发射器100之间的投射距离。In some embodiments, after processing the captured image to determine whether a face exists in the captured image in step 012, if there is no face in the captured image, the processor 805 may further perform steps 015 and 016 to determine the target subject and The projection distance between the light emitters 100. In this way, when a human face does not exist in the captured image, the projection distance between the target subject and the light emitter 100 can also be determined.
在某些实施方式中,在步骤015控制光发射器100以预定发光频率发光以检测场景的初始深度信息后,处理器805可以控制红外摄像头(可以为光接收器200)或可见光摄像头400采集拍摄图像。假设拍摄图像由可见光摄像头400采集,一般地,为了拍摄人物的三维色彩图像或者是对场景做三维建模,电子装置800中可见光摄像头400与光接收器200的视场通常具有较大部分的重叠,在电子装置800出厂前,厂商也会对可见光摄像头400与光接收器200之间的相对位置做标定并得到多个标定参数以用于后续可见光图像的色彩信息和深度图像的深度信息的匹配。因此,处理器805获取到拍摄图像后,处理器805可以先识别拍摄图像中是否存在人脸,在存在人脸时,再根据拍摄图像与初始深度信息形成的初始深度图像二者的匹配关系找到人脸对应的初始深度信息,并将人脸对应的初始深度信息作为目标主体的深度信息。若拍摄图像中不存在人脸,再将中央区域的这部分像素的初始深度信息作为目标主体的初始深度信息。如此,在场景中存在用户时,可以更准确地测量到用户与深度相机300之间的投射距离。In some embodiments, after controlling the light transmitter 100 to emit light at a predetermined light emission frequency to detect the initial depth information of the scene in step 015, the processor 805 may control the infrared camera (which may be the light receiver 200) or the visible light camera 400 to capture and shoot image. It is assumed that the captured image is collected by the visible light camera 400. Generally, in order to capture a three-dimensional color image of a person or to three-dimensionally model a scene, the visual field of the visible light camera 400 and the light receiver 200 in the electronic device 800 usually has a large overlap. Before the electronic device 800 leaves the factory, the manufacturer will also calibrate the relative position between the visible light camera 400 and the light receiver 200 and obtain multiple calibration parameters for matching the color information of the subsequent visible light image and the depth information of the depth image. . Therefore, after the captured image is obtained by the processor 805, the processor 805 can first identify whether a human face exists in the captured image, and when there is a human face, find it based on the matching relationship between the captured image and the initial depth image formed by the initial depth information. The initial depth information corresponding to the face, and the initial depth information corresponding to the face is used as the depth information of the target subject. If there is no human face in the captured image, the initial depth information of the pixels in the central area is used as the initial depth information of the target subject. As such, when there is a user in the scene, the projection distance between the user and the depth camera 300 can be measured more accurately.
请参阅图9,在某些实施方式中,控制方法在步骤01后还包括:Referring to FIG. 9, in some embodiments, the control method after step 01 further includes:
04:获取场景的环境亮度;04: Get the ambient brightness of the scene;
05:根据环境亮度及投射距离计算光发射器100的目标发光功率;和05: Calculate the target luminous power of the light transmitter 100 based on the ambient brightness and the projection distance; and
06:控制光发射器100以目标发光功率发光。06: The light emitter 100 is controlled to emit light at the target light emission power.
请参阅图10,在某些实施方式中,控制装置90还包括第二获取模块94、计算模块95。步骤04可以由第二获取模块94实现。步骤05可以由计算模块95实现。步骤06可以由控制模块93实现。也即是说,第二获取模块94可用于获取场景的环境亮度。计算模块95可用于根据环境亮度及投射距离计算光发射器100的目标发光功率。控制模块93还可用于控制光发射器100以目标发光功率发光。Referring to FIG. 10, in some embodiments, the control device 90 further includes a second obtaining module 94 and a calculation module 95. Step 04 may be implemented by the second acquisition module 94. Step 05 may be implemented by the calculation module 95. Step 06 may be implemented by the control module 93. That is to say, the second acquisition module 94 can be used to acquire the ambient brightness of the scene. The calculation module 95 may be configured to calculate a target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance. The control module 93 can also be used to control the light emitter 100 to emit light at a target light emission power.
请再参阅图1,在某些实施方式中,步骤04、步骤05和步骤06均可以由处理器805实现。也即是说,处理器805可以用于获取场景的环境亮度、根据环境亮度及投射距离计算光发射器100的目标发光功率、以及控制光发射器100以目标发光功率发光。Please refer to FIG. 1 again. In some embodiments, step 04, step 05, and step 06 can all be implemented by the processor 805. That is to say, the processor 805 can be used to obtain the ambient brightness of the scene, calculate the target light emitting power of the light transmitter 100 according to the environment brightness and the projection distance, and control the light transmitter 100 to emit light at the target light emitting power.
其中,步骤04、步骤05与步骤02可以是同步执行的,步骤06与步骤03可以是同步执行的,此时处理器805除了控制光发射器100以目标发光频率发光,还控制光发射器100以目标发光功率发光。Among them, step 04, step 05, and step 02 may be performed synchronously, and steps 06 and 03 may be performed simultaneously. At this time, the processor 805 controls the light transmitter 100 to emit light at the target light emission frequency, and also controls the light transmitter 100 Light is emitted at the target light emission power.
具体地,环境亮度可以由光传感器检测。处理器805从光传感器中读取其检测到的环境亮度。或者,环境亮度也可以由红外摄像头(可以为光接收器200)或可见光摄像头400来检测,红外摄像头或可见光摄像头400拍摄当前场景的图像,处理器805计算图像的亮度值以作为环境亮度。Specifically, the ambient brightness can be detected by a light sensor. The processor 805 reads the ambient brightness it detects from the light sensor. Alternatively, the ambient brightness may also be detected by an infrared camera (which may be the light receiver 200) or the visible light camera 400. The infrared camera or the visible light camera 400 captures an image of the current scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
在确定出环境亮度和投射距离后,处理器805基于环境亮度和投射距离两个参数共同计算场景的目 标发光功率。可以理解的是,首先,在环境亮度较高时,环境光中包含的红外光成分较多,环境光中的红外光与光发射器100发射的红外激光的波段重合的部分也较多,此时,光接收器200同时会接收到光发射器100发射的红外激光以及环境光中的红外光,若光发射器100发射红外激光的发光功率较低,则光接收器200接收的光中的来自光发射器100的红外激光与来自环境光中的红外光二者的占比相差不大,如此会导致光接收器200接收光的时间点不准确,或者导致Q1和Q2的值不够准确,进一步会降低深度信息的获取精度,因此,需要提升光发射器100发射红外激光的发射功率,以减小环境中的红外光对光接收器200接收来自光发射器100的红外激光的影响;在环境亮度较低时,环境光线中包含的红外光成分较少,此时光发射器100若采用较高的发光功率发光,则会增加电子装置800的功耗。另外,在投射距离较远时,激光的飞行时间较长,飞行行程较远,激光的损耗较多,进一步地导致Q1和Q2的值较小,从而对深度信息的获取精度产生影响。因此,在投射距离较大时,可以适当提升光发射器100发射红外激光的发射功率。After determining the ambient brightness and the projection distance, the processor 805 calculates the target luminous power of the scene based on the two parameters of the ambient brightness and the projection distance. It can be understood that, first, when the ambient brightness is high, there are more infrared light components in the ambient light, and the infrared light in the ambient light and the infrared laser light emitted by the light transmitter 100 overlap with each other. When the optical receiver 200 receives both the infrared laser light emitted by the optical transmitter 100 and the infrared light in the ambient light, if the light emitting power of the infrared laser emitted by the optical transmitter 100 is low, the The ratio between the infrared laser light from the light transmitter 100 and the infrared light from the ambient light is not much different. This will cause the time point when the light receiver 200 receives the light is not accurate, or the values of Q1 and Q2 are not accurate enough. It will reduce the accuracy of acquiring depth information. Therefore, it is necessary to increase the transmission power of the infrared laser emitted by the optical transmitter 100 to reduce the influence of the infrared light in the environment on the optical receiver 200 receiving the infrared laser from the optical transmitter 100; When the brightness is low, the infrared light component contained in the ambient light is less. At this time, if the light emitter 100 emits light with a higher light emitting power, the power of the electronic device 800 will be increased. Consuming. In addition, when the projection distance is long, the flight time of the laser is long, the flight distance is long, and the loss of the laser is large, which further causes the values of Q1 and Q2 to be smaller, which affects the accuracy of the depth information acquisition. Therefore, when the projection distance is large, the transmission power of the infrared laser emitted by the optical transmitter 100 can be appropriately increased.
具体地,在环境亮度高于预设亮度时,且投射距离大于预定距离时,光发射器100的目标发光功率大于或等于第一预定功率P1。在环境亮度小于预设亮度,且投射距离小于预定距离时,光发射器100的目标发光功率小于或等于第二预定功率P2。其中,第一预定功率P1大于第二预定功率P2。在环境亮度大于预设亮度且投射距离小于预定距离,或者环境亮度小于预设亮度且投射距离大于预定距离时,光发射器100的目标发光功率位于第二预定功率P2与第一预订功率P1之间,即光发射器100的目标发光功率的取值范围为(P2,P1)。Specifically, when the ambient brightness is higher than the preset brightness and the projection distance is greater than a predetermined distance, the target light emitting power of the light transmitter 100 is greater than or equal to the first predetermined power P1. When the ambient brightness is less than the preset brightness and the projection distance is less than a predetermined distance, the target light emitting power of the light transmitter 100 is less than or equal to the second predetermined power P2. The first predetermined power P1 is greater than the second predetermined power P2. When the ambient brightness is greater than the preset brightness and the projection distance is less than a predetermined distance, or when the ambient brightness is less than the preset brightness and the projection distance is greater than a predetermined distance, the target light emitting power of the light transmitter 100 is between the second predetermined power P2 and the first predetermined power P1 In other words, the value range of the target luminous power of the optical transmitter 100 is (P2, P1).
如此,基于环境亮度及投射距离共同确定光发射器100的目标发光功率,一方面可以减小电子装置800的功耗,另一方面可以提升场景的深度信息的获取精度。In this way, jointly determining the target light emitting power of the light transmitter 100 based on the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand and improve the accuracy of obtaining the depth information of the scene on the other.
请参阅图11,在某些实施方式中,步骤014根据所述第一比例计算投射距离包括:Referring to FIG. 11, in some embodiments, calculating the projection distance according to the first scale in step 014 includes:
0141:计算拍摄图像中人脸的预设特征区域占人脸的第二比例;和0141: Calculate the second proportion of the preset feature area of the human face in the captured image; and
0142:根据第一比例及第二比例计算投射距离。0142: Calculate the projection distance according to the first scale and the second scale.
请参阅图12,在某些实施方式中,第二计算单元914包括第一计算子单元9141和第二计算子单元9142。步骤0141可以由第一计算子单元9141实现,步骤0142可以由第二计算子单元9142实现。也即是说,第一计算子单元9141可用于计算拍摄图像中人脸的预设特征区域占人脸的第二比例。第二计算子单元9142可用于根据第一比例及第二比例计算投射距离。Referring to FIG. 12, in some embodiments, the second calculation unit 914 includes a first calculation sub-unit 9141 and a second calculation sub-unit 9142. Step 0141 may be implemented by the first calculation subunit 9141, and step 0142 may be implemented by the second calculation subunit 9142. That is to say, the first calculation subunit 9141 may be configured to calculate a second proportion of the preset feature area of the human face in the captured image to the human face. The second calculation subunit 9142 may be configured to calculate the projection distance according to the first scale and the second scale.
请再参阅图1,在某些实施方式中,步骤0141和步骤0142均可以由处理器805实现。也即是说,处理器805可用于计算拍摄图像中人脸的预设特征区域占人脸的第二比例、及根据第一比例及第二比例计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0141 and step 0142 may both be implemented by the processor 805. That is to say, the processor 805 may be configured to calculate a second ratio of a preset feature area of a human face in the captured image to the human face, and calculate a projection distance according to the first ratio and the second ratio.
可以理解,不同的用户的人脸大小有差异,使得不同的用户处于同样的距离下时,采集到的拍摄图像中人脸所占的第一比例有差异。第二比例为人脸的预设的特征占人脸的比例,预设的特征区域可以选择不同用户个体的差异度较小的特征区域,例如预设的特征趋区域为用户的双眼间距。当第二比例较大时,说明该用户的人脸较小,仅依据第一比例计算得到的投射距离过大;当第二比例较小时,说明该用户的人脸较大,仅依据第一比例计算得到的投射距离过小。在实际使用中,可以预先对第一比例、第二比例与投射距离进行标定。具体地,指引用户站在预定的投射距离位置处,并采集拍摄图像,再计算该拍摄图像对应的第一标定比例及第二标定比例,存储该预定的投射距离与第一标定比例、第二标定比例的对应关系,以便于在后续使用中依据实际的第一比例和第二比例计算投射距离。例如,指引用户站在投射距离为25厘米处,并采集拍摄图像,再计算该拍摄图像对应的第一标定比例为50%,第二标定比例为10%,而在实际测量中,当计算得到的第一比例为R1,第二比例为R2时,则依据三角形相似的性质有
Figure PCTCN2019090078-appb-000004
其中,D1为依据实际测量的第一比例R1计算得到的初始的投射距离,可以再依据关系式
Figure PCTCN2019090078-appb-000005
求得进一步依据实际测量的第二比例R2计算得到的校准的投射距离D2,D2作为最终的投射距离。如此,依据第一比例和第二比例计算得到的投射距离考虑了不同用户之间的个体差异,能够获得更加客观的投射距离,进一步地可以基于较为准确的投射距离确定出较为准确的目标发光频率和目标发光功率。
It can be understood that the face sizes of different users are different, so that when different users are at the same distance, the first proportion of the faces in the captured images is different. The second ratio is a ratio of the preset features of the human face to the human face. The preset feature area may select a feature area with a small difference between different user individuals. For example, the preset feature trend area is the binocular distance of the user. When the second ratio is large, the user's face is small, and the projection distance calculated based on the first ratio is too large; when the second ratio is small, the user's face is large, only based on the first The calculated projection distance is too small. In actual use, the first ratio, the second ratio, and the projection distance can be calibrated in advance. Specifically, the user is directed to stand at a predetermined projection distance position, collect a captured image, and then calculate a first calibration ratio and a second calibration ratio corresponding to the captured image, and store the predetermined projection distance, the first calibration ratio, and the second The corresponding relationship of the scales is calibrated, so as to calculate the projection distance according to the actual first scale and the second scale in subsequent use. For example, instruct the user to stand at a projection distance of 25 cm and collect the captured image, and then calculate the first calibration ratio corresponding to the captured image to be 50% and the second calibration ratio to be 10%. In actual measurement, when calculated, When the first ratio is R1 and the second ratio is R2, according to the similar properties of the triangle,
Figure PCTCN2019090078-appb-000004
Among them, D1 is the initial projection distance calculated according to the actually measured first ratio R1, which can be further based on the relationship
Figure PCTCN2019090078-appb-000005
A calibrated projection distance D2, which is further calculated according to the actually measured second ratio R2, is obtained, and D2 is used as the final projection distance. In this way, the projection distance calculated according to the first ratio and the second ratio takes into account the individual differences between different users, and can obtain a more objective projection distance. It can further determine a more accurate target light emission frequency based on a more accurate projection distance. And target luminous power.
请参阅图13,在某些实施方式中,步骤014根据所述第一比例计算投射距离包括:Referring to FIG. 13, in some embodiments, calculating the projection distance according to the first proportion in step 014 includes:
0143:根据拍摄图像判断目标主体是否佩戴眼镜;和0143: judging whether the target subject is wearing glasses based on the captured image; and
0144:在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。0144: Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
请参阅图14,在某些实施方式中,第二计算单元914还包括第一判断子单元9143和第三计算子单元9144。步骤0143可以由第一判断子单元9143实现。步骤0144可以由第三计算子单元9144实现。也即是说,第一判断子单元9143可用于根据拍摄图像判断目标主体是否佩戴眼镜,第三计算子单元9144可用于在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。Referring to FIG. 14, in some embodiments, the second calculation unit 914 further includes a first determination sub-unit 9143 and a third calculation sub-unit 9144. Step 0143 may be implemented by the first judging subunit 9143. Step 0144 may be implemented by the third calculation subunit 9144. That is to say, the first judging sub-unit 9143 may be used to judge whether the target subject is wearing glasses according to the captured image, and the third calculating sub-unit 9144 may be used to calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
请再参阅图1,在某些实施方式中,步骤0143和步骤0144均可以由处理器805实现。也即是说,处理器805还可用于根据拍摄图像判断目标主体是否佩戴眼镜、以及在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0143 and step 0144 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine whether the target subject is wearing glasses according to the captured image, and calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
可以理解,用户是否佩戴眼镜可以用于表征用户眼睛的健康状况,具体为用户佩戴眼镜则表明用户的眼睛已经患有相关的眼疾或视力不佳,在光发射器100对佩戴眼镜的用户发射激光时,需要降低光发射器100的发光功率以使得光发射器100发射的激光的能量较小,以免对用户的眼睛造成伤害。预设的距离系数可以是介于0至1的系数,例如0.6、0.78、0.82、0.95等,例如在根据第一比例计算得到初始的投射距离,或者在依据第一比例和第二比例计算得到校准后的投射距离后,再将初始的投射距离或者校准的投射距离乘以距离系数,得到最终的投射距离,并根据该投射距离以及环境亮度确定目标发光功率。如此,可以避免发射激光的功率过大伤害患有眼疾或视力不佳的用户。It can be understood that whether the user wears glasses can be used to characterize the health status of the user's eyes. Specifically, if the user wears glasses, it indicates that the user's eyes have suffered from related eye diseases or poor eyesight. The optical transmitter 100 emits laser light to the user wearing the glasses. At this time, the light emitting power of the light transmitter 100 needs to be reduced so that the energy of the laser light emitted by the light transmitter 100 is small, so as not to cause damage to the eyes of the user. The preset distance coefficient can be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, 0.95, etc., for example, the initial projection distance is calculated according to the first ratio, or the first distance and the second ratio are calculated. After the calibrated projection distance, the initial projection distance or the calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance, and the target luminous power is determined according to the projection distance and the ambient brightness. In this way, it is possible to avoid that the power of the emitted laser is too large to hurt the user suffering from eye disease or poor vision.
请参阅图15,在某些实施方式中,步骤014根据所述第一比例计算投射距离包括:Referring to FIG. 15, in some embodiments, calculating the projection distance according to the first scale in step 014 includes:
0145:根据拍摄图像判断目标主体的年龄;和0145: judging the age of the target subject based on the captured image; and
0146:根据第一比例及年龄计算投射距离。0146: Calculate the projection distance according to the first ratio and age.
请参阅图16,在某些实施方式中,第二计算单元914还包括第二判断子单元9145和第四计算子单元9146。步骤0145可以由第二判断子单元9145实现。步骤0146可以由第四计算子单元9146实现。也即是说,第二判断子单元9145可用于根据拍摄图像判断目标主体的年龄。第四计算子单元9146可用于根据第一比例及年龄计算投射距离。Referring to FIG. 16, in some embodiments, the second calculation unit 914 further includes a second determination sub-unit 9145 and a fourth calculation sub-unit 9146. Step 0145 may be implemented by the second judgment sub-unit 9145. Step 0146 may be implemented by the fourth calculation subunit 9146. That is to say, the second judging subunit 9145 can be used to judge the age of the target subject based on the captured image. The fourth calculation subunit 9146 may be configured to calculate the projection distance according to the first ratio and the age.
请再参阅图1,在某些实施方式中,步骤0145和步骤0146均可以由处理器805实现。也即是说,处理器805还可用于根据拍摄图像判断目标主体的年龄,以及根据第一比例及年龄计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0145 and step 0146 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine the age of the target subject based on the captured image, and calculate the projection distance based on the first ratio and age.
不同年龄段的人对红外激光的耐受能力不同,例如小孩和老人更容易被激光灼伤等,可能对于成年人而言是合适强度的激光会对小孩造成伤害。本实施方式中,可以提取拍摄图像中,人脸皱纹的特征点的数量、分布和面积等来判断用户的年龄,例如,提取眼角处皱纹的数量来判断用户的年龄,或者进一步结合用户的额头处的皱纹多少来判断用户的年龄。在判断用户的年龄后,可以依据用户的年龄得到比例系数,具体可以是在查询表中查询得知年龄与比例系数的对应关系,例如,年龄在15岁以下时,比例系数为0.6,年龄在15岁至20岁时,比例系数为0.8;年龄在20岁至45岁时,比例系数为1.0;年龄在45岁以上时,比例系数为0.8。在得知比例系数后,可以将根据第一比例计算得到的初始的投射距离、或者根据第一比例及第二比例计算得到的校准的投射距离乘以比例系数,以得到最终的投射距离,再根据投射距离以及环境亮度确定目标发光功率。如此,可以避免发射激光的功率过大而伤害小年龄段或者年龄较大的用户。People of different ages have different tolerances to infrared lasers. For example, children and the elderly are more susceptible to laser burns. For adults, lasers of appropriate intensity may cause harm to children. In this embodiment, the number, distribution, and area of feature points of facial wrinkles in the captured image can be extracted to determine the user's age, for example, the number of wrinkles at the corners of the eyes can be used to determine the user's age, or further combined with the user's forehead How many wrinkles are there to determine the user's age. After judging the user's age, the proportion coefficient can be obtained according to the age of the user. Specifically, the correspondence between age and the proportion coefficient can be found in a query table. For example, when the age is under 15 years, the proportion coefficient is 0.6 and the age is between When the age is 15 to 20, the scale factor is 0.8; when the age is 20 to 45, the scale factor is 1.0; when the age is 45 or more, the scale factor is 0.8. After knowing the scale factor, the initial projection distance calculated from the first scale or the calibrated projection distance calculated from the first and second scales can be multiplied by the scale factor to obtain the final projection distance. Determine the target luminous power according to the projection distance and the ambient brightness. In this way, excessive power of the emitted laser can be avoided to hurt young users or older users.
请一并参阅图1和图17,在某些实施方式中,本申请实施方式的电子装置800还包括壳体801。壳体801可以作为电子装置800的功能元件的安装载体。壳体801可以为功能元件提供防尘、防摔、防水等保护,功能元件可以是显示屏802、可见光摄像头400、受话器等。在本申请实施例中,壳体801包括主体803及可动支架804,可动支架804在驱动装置的驱动下可以相对于主体803运动,例如可动支架804可以相对于主体803滑动,以滑入主体803(如图17所示)或从主体803滑出(如图1所示)。部分功能元件(例如显示屏802)可以安装在主体803上,另一部分功能元件(例如深度相机300、可见光摄像头400、受话器)可以安装在可动支架804上,可动支架804运动可带动该另一部分功能元件缩回主体803内或从主体803中伸出。当然,图1和图17所示仅仅是对壳体801的一种具体形式举例,不能理解为对本申请的壳体801的限制。Please refer to FIG. 1 and FIG. 17 together. In some embodiments, the electronic device 800 according to the embodiment of the present application further includes a housing 801. The housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800. The housing 801 can provide protection for the functional elements from dust, drop, and water. The functional elements can be a display screen 802, a visible light camera 400, a receiver, and the like. In the embodiment of the present application, the housing 801 includes a main body 803 and a movable bracket 804. The movable bracket 804 can move relative to the main body 803 under the driving of a driving device. For example, the movable bracket 804 can slide relative to the main body 803 to slide. Enter or slide out of the main body 803 (as shown in FIG. 17). Some functional elements (such as the display 802) can be installed on the main body 803, and other functional elements (such as the depth camera 300, the visible light camera 400, and the receiver) can be installed on the movable bracket 804. The movement of the movable bracket 804 can drive the other A part of the functional elements is retracted into or protruded from the main body 803. Of course, FIG. 1 and FIG. 17 are merely examples of a specific form of the casing 801, and cannot be understood as a limitation on the casing 801 of the present application.
深度相机300安装在壳体801上。具体地,壳体801上可以开设有采集窗口,深度相机300与采集窗口对准安装以使深度相机300采集深度信息。在本申请的具体实施例中,深度相机300安装在可动支架804上。用户在需要使用深度相机300时,可以触发可动支架804从主体803中滑出以带动深度相机 300从主体803中伸出;在不需要使用深度相机300时,可以触发可动支架804滑入主体803以带动深度相机300缩回主体中。The depth camera 300 is mounted on a casing 801. Specifically, the housing 801 may be provided with an acquisition window, and the depth camera 300 is aligned with the acquisition window to enable the depth camera 300 to acquire depth information. In a specific embodiment of the present application, the depth camera 300 is mounted on a movable bracket 804. When the user needs to use the depth camera 300, he can trigger the movable bracket 804 to slide out from the main body 803 to drive the depth camera 300 to protrude from the main body 803. When the depth camera 300 is not needed, the movable bracket 804 can be triggered to slide in The main body 803 is retracted into the main body by driving the depth camera 300.
请一并参阅图18至图20,在某些实施方式中,深度相机300除了包括光发射器100和光接收器200外,还包括第一基板组件71和垫块72。第一基板组件71包括互相连接的第一基板711及柔性电路板712。垫块72设置在第一基板711上。光发射器100用于向外投射激光,光发射器100设置在垫块72上。柔性电路板712弯折且柔性电路板712的一端连接第一基板711,另一端连接光发射器100。光接收器200设置在第一基板711上,光接收器200用于接收被目标空间中的人或物反射回的激光。光接收器200包括外壳741及设置在外壳741上的光学元件742。外壳741与垫块72连接成一体。Please refer to FIG. 18 to FIG. 20 together. In some embodiments, in addition to the light transmitter 100 and the light receiver 200, the depth camera 300 further includes a first substrate assembly 71 and a spacer 72. The first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other. The spacer 72 is disposed on the first substrate 711. The light emitter 100 is used for projecting laser light outward, and the light emitter 100 is disposed on the cushion block 72. The flexible circuit board 712 is bent and one end of the flexible circuit board 712 is connected to the first substrate 711 and the other end is connected to the light emitter 100. The light receiver 200 is disposed on the first substrate 711. The light receiver 200 is configured to receive laser light reflected by a person or an object in the target space. The light receiver 200 includes a housing 741 and an optical element 742 provided on the housing 741. The housing 741 is integrally connected with the pad 72.
具体地,第一基板组件71包括第一基板711及柔性电路板712。第一基板711可以是印刷线路板或柔性线路板。第一基板711上可以铺设有深度相机300的控制线路等。柔性电路板712的一端可以连接在第一基板711上,柔性电路板712的另一端连接在电路板50(图20所示)上。柔性电路板712可以发生一定角度的弯折,使得柔性电路板712的两端连接的器件的相对位置可以有较多选择。Specifically, the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712. The first substrate 711 may be a printed wiring board or a flexible wiring board. A control circuit of the depth camera 300 and the like may be laid on the first substrate 711. One end of the flexible circuit board 712 may be connected to the first substrate 711, and the other end of the flexible circuit board 712 is connected to the circuit board 50 (shown in FIG. 20). The flexible circuit board 712 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 712 can be selected.
垫块72设置在第一基板711上。在一个例子中,垫块72与第一基板711接触且承载在第一基板711上,具体地,垫块72可以通过胶粘等方式与第一基板711结合。垫块72的材料可以是金属、塑料等。在本申请的实施例中,垫块72与第一基板711结合的面可以是平面,垫块72与该结合的面相背的面也可以是平面,使得光发射器100设置在垫块72上时具有较好的平稳性。The spacer 72 is disposed on the first substrate 711. In one example, the spacer 72 is in contact with the first substrate 711 and is carried on the first substrate 711. Specifically, the spacer 72 may be combined with the first substrate 711 by means of adhesion or the like. The material of the spacer 72 may be metal, plastic, or the like. In the embodiment of the present application, a surface on which the pad 72 is combined with the first substrate 711 may be a flat surface, and a surface on which the pad 72 is opposite to the combined surface may also be a flat surface, so that the light emitter 100 is disposed on the pad 72. It has better smoothness.
光接收器200设置在第一基板711上,且光接收器200和第一基板711的接触面与垫块72和第一基板711的接触面基本齐平设置(即,二者的安装起点在同一平面上)。具体地,光接收器200包括外壳741及光学元件742。外壳741设置在第一基板711上,光学元件742设置在外壳741上,外壳741可以是光接收器200的镜座及镜筒,光学元件742可以是设置在外壳741内的透镜等元件。进一步地,光接收器200还包括感光芯片(图未示),由目标空间中的人或物反射回的激光通过光学元件742后照射到感光芯片中,感光芯片对该激光产生响应。在本申请的实施例中,外壳741与垫块72连接成一体。具体地,外壳741与垫块72可以是一体成型;或者外壳741与垫块72的材料不同,二者通过双色注塑等方式一体成型。外壳741与垫块72也可以是分别成型,二者形成配合结构,在组装深度相机300时,可以先将外壳741与垫块72中的一个设置在第一基板711上,再将另一个设置在第一基板711上且连接成一体。The light receiver 200 is disposed on the first substrate 711, and the contact surface between the light receiver 200 and the first substrate 711 is substantially flush with the contact surface between the pad 72 and the first substrate 711 (that is, the installation starting point of the two is at On the same plane). Specifically, the light receiver 200 includes a housing 741 and an optical element 742. The casing 741 is disposed on the first substrate 711, and the optical element 742 is disposed on the casing 741. The casing 741 may be a lens holder and a lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the casing 741. Further, the light receiver 200 further includes a photosensitive chip (not shown), and the laser light reflected by a person or an object in the target space passes through the optical element 742 and is irradiated into the photosensitive chip, and the photosensitive chip responds to the laser. In the embodiment of the present application, the housing 741 and the cushion block 72 are integrally connected. Specifically, the casing 741 and the cushion block 72 may be integrally formed; or the materials of the casing 741 and the cushion block 72 are different, and the two are integrally formed by two-color injection molding or the like. The housing 741 and the spacer 72 may also be separately formed, and the two form a matching structure. When assembling the depth camera 300, one of the housing 741 and the spacer 72 may be set on the first substrate 711, and then the other The first substrate 711 is integrally connected with each other.
如此,将光发射器100设置在垫块72上,垫块72可以垫高光发射器100的高度,进而提高光发射器100出射激光的面的高度,光发射器100发射的激光不易被光接收器200遮挡,使得激光能够完全照射到目标空间中的被测物体上。In this way, the light transmitter 100 is disposed on the pad 72, which can increase the height of the light transmitter 100, thereby increasing the height of the surface on which the laser is emitted by the light transmitter 100. The laser light emitted by the light transmitter 100 is not easily received by the light The device 200 is blocked, so that the laser light can be completely irradiated on the measured object in the target space.
请再一并参阅图18至图20,在某些实施方式中,垫块72与第一基板711结合的一侧开设有容纳腔723。深度相机300还包括设置在第一基板711上的电子元件77。电子元件77收容在容纳腔723内。电子元件77可以是电容、电感、晶体管、电阻等元件。电子元件77可以与铺设在第一基板711上的控制线路电连接,并用于或控制光发射器100或光接收器200工作。电子元件77收容在容纳腔723内,合理利用了垫块72内的空间,不需要增加第一基板711的宽度来设置电子元件77,有利于减小深度相机300的整体尺寸。容纳腔723的数量可以是一个或多个,容纳腔723可以是互相间隔的。在安装垫块72时,可以将容纳腔723与电子元件77的位置对准并将垫块72设置在第一基板711上。Please refer to FIG. 18 to FIG. 20 together. In some embodiments, the side where the cushion block 72 is combined with the first substrate 711 is provided with a receiving cavity 723. The depth camera 300 further includes an electronic component 77 provided on the first substrate 711. The electronic component 77 is housed in the receiving cavity 723. The electronic component 77 may be an element such as a capacitor, an inductor, a transistor, or a resistor. The electronic component 77 may be electrically connected to a control line laid on the first substrate 711 and used for or controlling the operation of the light transmitter 100 or the light receiver 200. The electronic component 77 is housed in the receiving cavity 723, and the space in the pad 72 is used reasonably. It is not necessary to increase the width of the first substrate 711 to set the electronic component 77, which is beneficial to reducing the overall size of the depth camera 300. The number of the receiving cavities 723 may be one or more, and the receiving cavities 723 may be spaced apart from each other. When mounting the pad 72, the receiving cavity 723 and the electronic component 77 may be aligned and the pad 72 may be disposed on the first substrate 711.
请继续一并参阅图18至图20,在某些实施方式中,垫块72开设有与至少一个容纳腔723连接的避让通孔724,至少一个电子元件77伸入避让通孔724内。可以理解,需要将电子元件77收容在避让通孔内时,要求电子元件77的高度不高于容纳腔723的高度。而对于高度高于容纳腔723的电子元件,可以开设与容纳腔723对应的避让通孔724,电子元件77可以部分伸入避让通孔724内,以在不提高垫块72的高度的前提下布置电子元件77。Please continue to refer to FIG. 18 to FIG. 20 together. In some embodiments, the cushion block 72 is provided with an avoiding through hole 724 connected to at least one receiving cavity 723, and at least one electronic component 77 extends into the avoiding through hole 724. It can be understood that when the electronic component 77 needs to be accommodated in the avoiding through hole, the height of the electronic component 77 is required to be not higher than the height of the receiving cavity 723. For electronic components having a height higher than the receiving cavity 723, an avoiding through hole 724 corresponding to the receiving cavity 723 may be provided, and the electronic component 77 may partially extend into the avoiding through hole 724 so as not to increase the height of the cushion 72. Arranges the electronic component 77.
请还一并参阅图18至图20,在某些实施方式中,第一基板组件711还包括加强板713,加强板713结合在第一基板711的与垫块72相背的一侧。加强板713可以覆盖第一基板711的一个侧面,加强板713可以用于增加第一基板711的强度,避免第一基板711发生形变。另外,加强板713可以由导电的材料制成,例如金属或合金等,当深度相机300安装在电子设备800上时,可以将加强板713与壳体801电连接,以使加强板713接地,并有效地减少外部元件的静电对深度相机300的干扰。Please refer to FIGS. 18 to 20 together. In some embodiments, the first substrate assembly 711 further includes a reinforcing plate 713, and the reinforcing plate 713 is coupled to a side of the first substrate 711 opposite to the pad 72. The reinforcing plate 713 may cover one side of the first substrate 711, and the reinforcing plate 713 may be used to increase the strength of the first substrate 711 and prevent deformation of the first substrate 711. In addition, the reinforcing plate 713 may be made of a conductive material, such as metal or alloy. When the depth camera 300 is mounted on the electronic device 800, the reinforcing plate 713 may be electrically connected to the casing 801 to ground the reinforcing plate 713. And the interference of the static electricity of the external components on the depth camera 300 is effectively reduced.
请再一并参阅图18至图20,在某些实施方式中,深度相机300还包括连接器76,连接器76连接 在第一基板组件71上并用于与深度相机300外部的电子元件电性连接。Please refer to FIG. 18 to FIG. 20 again. In some embodiments, the depth camera 300 further includes a connector 76 connected to the first substrate assembly 71 and used to electrically connect with electronic components external to the depth camera 300. connection.
请参阅图21,在某些实施方式中,光接收器100包括光源10、扩散器20、镜筒30、保护罩40、电路板50及驱动器61。Referring to FIG. 21, in some embodiments, the light receiver 100 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, a circuit board 50, and a driver 61.
其中,镜筒30包括呈环状的镜筒侧壁33,环状的镜筒侧壁33围成收容腔62。镜筒侧壁33包括位于收容腔62内的内表面331及与内表面相背的外表面332。镜筒侧壁33包括相背的第一面31及第二面32。收容腔62贯穿第一面31及第二面32。第一面31朝第二面32凹陷形成与收容腔62连通的安装槽34。安装槽34的底面35位于安装槽34的远离第一面31的一侧。镜筒侧壁33的外表面332在第一面31的一端的横截面呈圆形,镜筒侧壁33的外表面332在第一面31的一端形成有外螺纹。Wherein, the lens barrel 30 includes a ring-shaped lens barrel sidewall 33, and the ring-shaped lens barrel sidewall 33 surrounds a receiving cavity 62. The side wall 33 of the lens barrel includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface. The side wall 33 of the lens barrel includes a first surface 31 and a second surface 32 opposite to each other. The receiving cavity 62 penetrates the first surface 31 and the second surface 32. The first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the receiving cavity 62. The bottom surface 35 of the mounting groove 34 is located on a side of the mounting groove 34 remote from the first surface 31. The outer surface 332 of the side wall 33 of the lens barrel is circular at one end of the first surface 31, and the outer surface 332 of the side wall 33 of the lens barrel is formed with an external thread at one end of the first surface 31.
电路板50设置在镜筒30的第二面32上并封闭收容腔62的一端。电路板50可以为柔性电路板或印刷电路板。The circuit board 50 is disposed on the second surface 32 of the lens barrel 30 and closes one end of the receiving cavity 62. The circuit board 50 may be a flexible circuit board or a printed circuit board.
光源10承载在电路板50上并收容在收容腔62内。光源10用于朝镜筒30的第一面31(安装槽34)一侧发射激光。光源10可以是单点光源,也可是多点光源。在光源10为单点光源时,光源10具体可以为边发射型激光器,例如可以为分布反馈式激光器(Distributed Feedback Laser,DFB)等;在光源10为多点光源时,光源10具体可以为垂直腔面发射器(Vertical-Cavity Surface Laser,VCSEL),或者光源10也为由多个边发射型激光器组成的多点光源。垂直腔面发射激光器的高度较小,采用垂直腔面发射器作为光源10,有利于减小光发射器100的高度,便于将光发射器100集成到手机等对机身厚度有较高的要求的电子装置800中。与垂直腔面发射器相比,边发射型激光器的温漂较小,可以减小温度对光源10的投射激光的效果的影响。The light source 10 is carried on the circuit board 50 and received in the receiving cavity 62. The light source 10 is configured to emit laser light toward the first surface 31 (the mounting groove 34) side of the lens barrel 30. The light source 10 may be a single-point light source or a multi-point light source. When the light source 10 is a single-point light source, the light source 10 may specifically be an edge-emitting laser, for example, a distributed feedback laser (Distributed Feedback Laser, DFB), etc .; when the light source 10 is a multi-point light source, the light source 10 may specifically be vertical A cavity-surface emitter (Vertical-Cavity Surface Laser, VCSEL), or the light source 10 is also a multi-point light source composed of multiple edge-emitting lasers. The vertical cavity surface emitting laser has a small height, and the use of the vertical cavity surface emitter as the light source 10 is beneficial to reducing the height of the light emitter 100 and facilitating the integration of the light emitter 100 into a mobile phone and other requirements on the thickness of the fuselage. Electronic device 800. Compared with the vertical cavity surface emitter, the temperature drift of the side-emitting laser is smaller, and the influence of the temperature on the effect of the projected laser light from the light source 10 can be reduced.
驱动器61承载在电路板50上并与光源10电性连接。具体地,驱动器61可以接收经过调制的输入信号,并将输入信号转化为恒定的电流源后传输给光源10,以使光源10在恒定的电流源的作用下朝镜筒30的第一面31一侧发射激光。本实施方式的驱动器61设置在镜筒30外。在其他实施方式中,驱动器61可以设置在镜筒30内并承载在电路板50上。The driver 61 is carried on the circuit board 50 and is electrically connected to the light source 10. Specifically, the driver 61 may receive the modulated input signal, and convert the input signal into a constant current source and transmit it to the light source 10, so that the light source 10 is directed toward the first side 31 of the lens barrel 30 under the action of the constant current source. Laser is emitted on one side. The driver 61 of this embodiment is provided outside the lens barrel 30. In other embodiments, the driver 61 may be disposed in the lens barrel 30 and carried on the circuit board 50.
扩散器20安装(承载)在安装槽34内并与安装槽34相抵触。扩散器20用于扩散穿过扩散器20的激光。也即是,光源10朝镜筒30的第一面31一侧发射激光时,激光会经过扩散器20并被扩散器20扩散或投射到镜筒30外。The diffuser 20 is mounted (supported) in the mounting groove 34 and abuts the mounting groove 34. The diffuser 20 is used to diffuse the laser light passing through the diffuser 20. That is, when the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused or projected outside the lens barrel 30 by the diffuser 20.
保护罩40包括顶壁41及自顶壁41的一侧延伸形成的保护侧壁42。顶壁41的中心开设有通光孔401。保护侧壁42环绕顶壁41及通光孔401设置。顶壁41与保护侧壁42共同围成安装腔43,通光孔401与安装腔43连通。保护侧壁42的内表面的横截面呈圆形,保护侧壁42的内表面上形成有内螺纹。保护侧壁42的内螺纹与镜筒30的外螺纹螺合以将保护罩40安装在镜筒30上。顶壁41与扩散器20抵触使得扩散器40被夹持在顶壁41与安装槽34的底面35之间。The protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41. A light through hole 401 is defined in the center of the top wall 41. The protective side wall 42 is disposed around the top wall 41 and the light through hole 401. The top wall 41 and the protection side wall 42 together form a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43. The cross-section of the inner surface of the protective sidewall 42 is circular, and an inner thread is formed on the inner surface of the protective sidewall 42. The internal thread of the protective sidewall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30. The top wall 41 abuts the diffuser 20 so that the diffuser 40 is sandwiched between the top wall 41 and the bottom surface 35 of the mounting groove 34.
如此,通过在镜筒30上开设安装槽34,并将扩散器20安装在安装槽34内,以及通过保护罩40安装在镜筒30上以将扩散器20夹持在保护罩40与安装槽34的底面35之间,从而可以将扩散器20固定在镜筒30上。此种方式无需使用胶水将扩散器20固定在镜筒30上,能够避免胶水挥发成气态后,气态的胶水凝固在扩散器20的表面而影响扩散器20的微观结构,并能够避免扩散器20和镜筒30的胶水因老化而使粘着力下降时扩散器20从镜筒30脱落。In this way, the opening 20 is installed in the lens barrel 30, and the diffuser 20 is installed in the installation groove 34, and the protective cover 40 is installed on the lens barrel 30 to clamp the diffuser 20 between the protective cover 40 and the installation groove. 34 between the bottom surfaces 35 so that the diffuser 20 can be fixed on the lens barrel 30. In this way, it is not necessary to fix the diffuser 20 to the lens barrel 30 by using glue, which can prevent the glue from solidifying on the surface of the diffuser 20 and affecting the microstructure of the diffuser 20 after the glue is volatilized to a gaseous state. When the glue with the lens barrel 30 decreases due to aging, the diffuser 20 falls off from the lens barrel 30.
请一并参阅图22和图23,在某些实施方式中,在调节光发射器100的发光功率时,可以通过调节驱动光发射器100发光的驱动电流的来实现。另外地,如果光发射器100的光源10为垂直腔面发射器,则此时垂直腔面发射器的结构可为:Please refer to FIG. 22 and FIG. 23 together. In some embodiments, when adjusting the light emitting power of the light emitter 100, it can be achieved by adjusting the driving current that drives the light emitter 100 to emit light. In addition, if the light source 10 of the light emitter 100 is a vertical cavity surface emitter, the structure of the vertical cavity surface emitter at this time may be:
(1)垂直腔面发射器包括多个点光源101,多个点光源101形成多个可独立控制的扇形阵列11,多个扇形阵列11围成圆形(如图22所示)或多边形(图未示),此时,光发射器100的发光功率可以通过开启不同数目的扇形阵列11的点光源101来实现,也即是说,目标发光功率与开启的扇形阵列的目标数量的对应。当扇形阵列未全部开启时,开启的那部分扇形阵列应呈中心对称分布,如此,可以使得光发射器100发出的激光较为均匀。(1) The vertical cavity surface emitter includes a plurality of point light sources 101, which form a plurality of independently controllable fan-shaped arrays 11, and the plurality of fan-shaped arrays 11 surround a circle (as shown in FIG. 22) or a polygon ( (Not shown), at this time, the light emitting power of the light emitter 100 can be achieved by turning on the point light sources 101 of different numbers of the fan-shaped arrays 11, that is, the target light-emitting power corresponds to the target number of the turned-on fan-shaped arrays. When the fan-shaped array is not fully turned on, the part of the fan-shaped array that is turned on should be symmetrically distributed in the center, so that the laser light emitted by the light emitter 100 can be made more uniform.
(2)垂直腔面发射器包括多个点光源101,多个点光源101形成多个子阵列12,多个子阵列12包括至少一个圆形子阵列和至少一个环形子阵列,至少一个圆形子阵列和至少一个环形子阵列围成圆形(如图23所示),或者多个子阵列12包括至少一个多边形子阵列和至少一个环形子阵列,至少一个多边形子阵列和至少一个环形子阵列围成一个多边形(图未示),此时,光发射器100的发光功率的调节 可以通过开启不同数目的子阵列12的点光源101来实现,也即是说,发光功率与开启的子阵列12的目标数量的对应。(2) The vertical cavity surface emitter includes a plurality of point light sources 101, and the plurality of point light sources 101 form a plurality of sub-arrays 12, and the plurality of sub-arrays 12 include at least one circular sub-array and at least one circular sub-array, and at least one circular sub-array. And at least one circular sub-array is enclosed in a circle (as shown in FIG. 23), or the multiple sub-arrays 12 include at least one polygonal sub-array and at least one circular sub-array Polygon (not shown). At this time, the light emitting power of the light transmitter 100 can be adjusted by turning on the point light sources 101 of different numbers of the sub-arrays 12, that is, the target of the light-emitting power and the opened sub-arrays 12 Correspondence of quantity.
请参阅图24,本申请还提供一种电子装置800。电子装置800包括上述任意一实施方式所述的深度相机300、一个或多个处理器805、存储器806和一个或多个程序807。其中一个或多个程序807被存储在存储器806中,并且被配置成由一个或多个处理器805执行。程序807包括用于执行上述任意一项实施方式所述的光发射器100的控制方法的指令。Referring to FIG. 24, the present application further provides an electronic device 800. The electronic device 800 includes the depth camera 300, one or more processors 805, a memory 806, and one or more programs 807 according to any one of the foregoing embodiments. One or more programs 807 are stored in the memory 806 and are configured to be executed by one or more processors 805. The program 807 includes instructions for executing the control method of the optical transmitter 100 according to any one of the foregoing embodiments.
例如,请结合图1、图2及图24,程序807包括用于执行以下步骤的指令:For example, in conjunction with FIG. 1, FIG. 2, and FIG. 24, the program 807 includes instructions for performing the following steps:
01:获取光发射器100与场景中的目标主体之间的投射距离;01: Obtain the projection distance between the light emitter 100 and the target subject in the scene;
02:根据投射距离确定光发射器100的目标发光频率;和02: determining the target emission frequency of the light transmitter 100 according to the projection distance; and
03:控制光发射器100以目标发光频率发光。03: The light emitter 100 is controlled to emit light at a target emission frequency.
再例如,请结合图5和图24,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 5 and FIG. 24, the program 807 further includes instructions for performing the following steps:
011:获取场景的拍摄图像;011: Get the captured image of the scene;
012:处理拍摄图像以判断拍摄图像中是否存在人脸;012: Process the captured image to determine whether a human face exists in the captured image;
013:在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例;和013: Calculate the first proportion of the face in the captured image when a face is present in the captured image; and
014:根据第一比例计算投射距离。014: Calculate the projection distance according to the first ratio.
再例如,请结合图7,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 7, the program 807 further includes instructions for performing the following steps:
015:控制光发射器100以预定发光频率发光以检测场景的初始深度信息;和015: controlling the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
016:根据初始深度信息计算光发射器100与目标主体之间的投射距离。016: Calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
再例如,请结合图9,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 9, the program 807 further includes instructions for performing the following steps:
04:获取场景的环境亮度;04: Get the ambient brightness of the scene;
05:根据环境亮度及投射距离计算光发射器100的目标发光功率;和05: Calculate the target luminous power of the light transmitter 100 based on the ambient brightness and the projection distance; and
06:控制光发射器100以目标发光功率发光。06: The light emitter 100 is controlled to emit light at the target light emission power.
再例如,请结合图11,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 11, the program 807 further includes instructions for performing the following steps:
0141:计算拍摄图像中人脸的预设特征区域占人脸的第二比例;和0141: Calculate the second proportion of the preset feature area of the human face in the captured image; and
0142:根据第一比例及第二比例计算投射距离。0142: Calculate the projection distance according to the first scale and the second scale.
再例如,请结合图13,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 13, the program 807 further includes instructions for performing the following steps:
0143:根据拍摄图像判断目标主体是否佩戴眼镜;和0143: judging whether the target subject is wearing glasses based on the captured image; and
0144:在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。0144: Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
再例如,请结合图15,程序807还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 15, the program 807 further includes instructions for performing the following steps:
0145:根据拍摄图像判断目标主体的年龄;和0145: judging the age of the target subject based on the captured image; and
0146:根据第一比例及年龄计算投射距离。0146: Calculate the projection distance according to the first ratio and age.
请参阅图25,本申请还提供一种计算机可读存储介质901。计算机可读存储介质901包括与电子装置800结合使用计算机程序902。计算机程序902可被处理器805执行以完成上述任意一项实施方式所述的光发射器100的控制方法。Referring to FIG. 25, the present application further provides a computer-readable storage medium 901. The computer-readable storage medium 901 includes a computer program 902 used in conjunction with the electronic device 800. The computer program 902 can be executed by the processor 805 to complete the method for controlling the optical transmitter 100 according to any one of the foregoing embodiments.
例如,请结合图1、图2及图25,计算机程序902可被处理器805执行以完成以下步骤:For example, in conjunction with FIG. 1, FIG. 2, and FIG. 25, the computer program 902 may be executed by the processor 805 to complete the following steps:
01:获取光发射器100与场景中的目标主体之间的投射距离;01: Obtain the projection distance between the light emitter 100 and the target subject in the scene;
02:根据投射距离确定光发射器100的目标发光频率;和02: determining the target emission frequency of the light transmitter 100 according to the projection distance; and
03:控制光发射器100以目标发光频率发光。03: The light emitter 100 is controlled to emit light at a target emission frequency.
再例如,请结合图5及图25,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 5 and FIG. 25, the computer program 902 can also be executed by the processor 805 to complete the following steps:
011:获取场景的拍摄图像;011: Get the captured image of the scene;
012:处理拍摄图像以判断拍摄图像中是否存在人脸;012: Process the captured image to determine whether a human face exists in the captured image;
013:在拍摄图像中存在人脸时计算拍摄图像中人脸所占的第一比例;和013: Calculate the first proportion of the face in the captured image when a face is present in the captured image; and
014:根据第一比例计算投射距离。014: Calculate the projection distance according to the first ratio.
再例如,请结合图7,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 7, the computer program 902 can also be executed by the processor 805 to complete the following steps:
015:控制光发射器100以预定发光频率发光以检测场景的初始深度信息;和015: controlling the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
016:根据初始深度信息计算光发射器100与目标主体之间的投射距离。016: Calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
再例如,请结合图9,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 9, the computer program 902 can also be executed by the processor 805 to complete the following steps:
04:获取场景的环境亮度;04: Get the ambient brightness of the scene;
05:根据环境亮度及投射距离计算光发射器100的目标发光功率;和05: Calculate the target luminous power of the light transmitter 100 based on the ambient brightness and the projection distance; and
06:控制光发射器100以目标发光功率发光。06: The light emitter 100 is controlled to emit light at the target light emission power.
再例如,请结合图11,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 11, the computer program 902 can also be executed by the processor 805 to complete the following steps:
0141:计算拍摄图像中人脸的预设特征区域占人脸的第二比例;和0141: Calculate the second proportion of the preset feature area of the human face in the captured image; and
0142:根据第一比例及第二比例计算投射距离。0142: Calculate the projection distance according to the first scale and the second scale.
再例如,请结合图13,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 13, the computer program 902 can also be executed by the processor 805 to complete the following steps:
0143:根据拍摄图像判断目标主体是否佩戴眼镜;和0143: judging whether the target subject is wearing glasses based on the captured image; and
0144:在目标主体佩戴眼镜时根据第一比例及距离系数计算投射距离。0144: Calculate the projection distance according to the first scale and the distance coefficient when the target subject wears glasses.
再例如,请结合图15,计算机程序902还可被处理器805执行以完成以下步骤:For another example, in conjunction with FIG. 15, the computer program 902 may also be executed by the processor 805 to complete the following steps:
0145:根据拍摄图像判断目标主体的年龄;和0145: judging the age of the target subject based on the captured image; and
0146:根据第一比例及年龄计算投射距离。0146: Calculate the projection distance according to the first ratio and age.
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。In the description of this specification, the description with reference to the terms “one embodiment”, “some embodiments”, “examples”, “specific examples”, or “some examples” and the like means specific features described in conjunction with the embodiments or examples , Structure, material, or characteristic is included in at least one embodiment or example of the present application. In this specification, the schematic expressions of the above terms are not necessarily directed to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, without any contradiction, those skilled in the art may combine and combine different embodiments or examples and features of the different embodiments or examples described in this specification.
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本申请的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。In addition, the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as "first" and "second" may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality" is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。Any process or method description in a flowchart or otherwise described herein can be understood as a module, fragment, or portion of code that includes one or more executable instructions for implementing a particular logical function or step of a process And, the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including performing the functions in a substantially simultaneous manner or in the reverse order according to the functions involved, which should It is understood by those skilled in the art to which the embodiments of the present application pertain.
尽管上面已经示出和描述了本申请的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施例进行变化、修改、替换和变型。Although the embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limitations on the present application. Those skilled in the art may, within the scope of the present application, understand the above. Embodiments are subject to change, modification, substitution, and modification.

Claims (23)

  1. 一种光发射器的控制方法,其特征在于,所述控制方法包括:A control method of an optical transmitter, characterized in that the control method includes:
    获取所述光发射器与场景中的目标主体之间的投射距离;Obtaining a projection distance between the light emitter and a target subject in the scene;
    根据所述投射距离确定所述光发射器的目标发光频率;和Determining a target light emission frequency of the light emitter according to the projection distance; and
    控制所述光发射器以所述目标发光频率发光。Controlling the light emitter to emit light at the target emission frequency.
  2. 根据权利要求1所述的控制方法,其特征在于,所述获取所述光发射器与场景中的目标主体之间的投射距离的步骤包括:The control method according to claim 1, wherein the step of obtaining a projection distance between the light emitter and a target subject in a scene comprises:
    获取所述场景的拍摄图像;Acquiring a captured image of the scene;
    处理所述拍摄图像以判断所述拍摄图像中是否存在人脸;Processing the captured image to determine whether a human face exists in the captured image;
    在所述拍摄图像中存在所述人脸时计算所述拍摄图像中所述人脸所占的第一比例;和Calculating a first proportion of the human face in the captured image when the human face is present in the captured image; and
    根据所述第一比例计算所述投射距离。Calculating the projection distance according to the first ratio.
  3. 根据权利要求1所述的控制方法,其特征在于,所述获取所述光发射器与场景中的目标主体之间的投射距离的步骤包括:The control method according to claim 1, wherein the step of obtaining a projection distance between the light emitter and a target subject in a scene comprises:
    控制所述光发射器以预定发光频率发光以检测所述场景的初始深度信息;和Controlling the light emitter to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
    根据所述初始深度信息计算所述光发射器与所述目标主体之间的投射距离。A projection distance between the light emitter and the target subject is calculated according to the initial depth information.
  4. 根据权利要求2所述的控制方法,其特征在于,所述控制方法还包括:The control method according to claim 2, wherein the control method further comprises:
    获取所述场景的环境亮度;Obtaining the ambient brightness of the scene;
    根据所述环境亮度及所述投射距离计算所述光发射器的目标发光功率;和Calculating a target luminous power of the light transmitter according to the ambient brightness and the projection distance; and
    控制所述光发射器以所述目标发光功率发光。Controlling the light emitter to emit light at the target light emitting power.
  5. 根据权利要求4所述的控制方法,其特征在于,所述根据所述第一比例计算所述投射距离的步骤包括:The control method according to claim 4, wherein the step of calculating the projection distance according to the first ratio comprises:
    计算所述拍摄图像中所述人脸的预设特征区域占所述人脸的第二比例;和Calculating a second ratio of the preset feature area of the human face in the captured image to the human face; and
    根据所述第一比例及所述第二比例计算所述投射距离。Calculate the projection distance according to the first ratio and the second ratio.
  6. 根据权利要求4所述的控制方法,其特征在于,所述根据所述第一比例计算所述投射距离包括:The control method according to claim 4, wherein the calculating the projection distance according to the first ratio comprises:
    根据所述拍摄图像判断所述目标主体是否佩戴眼镜;和Determining whether the target subject is wearing glasses according to the captured image; and
    在所述目标主体佩戴眼镜时根据所述第一比例及距离系数计算所述投射距离。When the target subject wears glasses, calculate the projection distance according to the first scale and a distance coefficient.
  7. 根据权利要求4所述的控制方法,其特征在于,所述根据所述第一比例计算所述投射距离的步骤包括:The control method according to claim 4, wherein the step of calculating the projection distance according to the first ratio comprises:
    根据所述拍摄图像判断所述目标主体的年龄;和Judging the age of the target subject based on the captured image; and
    根据所述第一比例及所述年龄计算所述投射距离。Calculate the projection distance according to the first ratio and the age.
  8. 一种光发射器的控制装置,其特征在于,所述控制装置包括:A control device for a light transmitter, characterized in that the control device includes:
    第一获取模块,所述第一获取模块用于获取所述光发射器与场景中的目标主体之间的投射距离;A first acquisition module, configured to acquire a projection distance between the light emitter and a target subject in a scene;
    确定模块,所述确定模块用于根据所述投射距离确定所述光发射器的目标发光频率;和A determining module for determining a target light emitting frequency of the light transmitter according to the projection distance; and
    控制模块,所述控制模块用于控制所述光发射器以所述目标发光频率发光。A control module for controlling the light emitter to emit light at the target emission frequency.
  9. 根据权利要求8所述的控制装置,其特征在于,所述第一获取模块包括:The control device according to claim 8, wherein the first acquisition module comprises:
    第一获取单元,用于获取所述场景的拍摄图像;A first obtaining unit, configured to obtain a captured image of the scene;
    处理单元,用于处理所述拍摄图像以判断所述拍摄图像中是否存在人脸;A processing unit, configured to process the captured image to determine whether a human face exists in the captured image;
    第一计算单元,用于在所述拍摄图像中存在所述人脸时计算所述拍摄图像中所述人脸所占的第一比例;和A first calculation unit, configured to calculate a first proportion of the human face in the captured image when the human face is present in the captured image; and
    第二计算单元,用于根据所述第一比例计算所述投射距离。A second calculation unit is configured to calculate the projection distance according to the first ratio.
  10. 根据权利要求8所述的控制装置,其特征在于,所述获第一获取模块包括:The control device according to claim 8, wherein the first obtaining module comprises:
    第一控制单元,用于控制所述光发射器以预定发光频率发光以检测所述场景的初始深度信息;和A first control unit for controlling the light emitter to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
    第三计算单元,用于根据所述初始深度信息计算所述光发射器与所述目标主体之间的投射距离。A third calculation unit is configured to calculate a projection distance between the light emitter and the target subject according to the initial depth information.
  11. 根据权利要求9所述的控制装置,其特征在于,所述控制装置还包括:The control device according to claim 9, wherein the control device further comprises:
    第二获取模块,用于获取所述场景的环境亮度;和A second acquisition module, configured to acquire the ambient brightness of the scene; and
    计算模块,用于根据所述环境亮度及所述投射距离计算所述光发射器的目标发光功率;A calculation module, configured to calculate a target luminous power of the light transmitter according to the ambient brightness and the projection distance;
    所述控制模块还用于控制所述光发射器以所述目标发光功率发光。The control module is further configured to control the light emitter to emit light at the target light emitting power.
  12. 根据权利要求9所述的控制装置,其特征在于,所述第二计算单元包括:The control device according to claim 9, wherein the second calculation unit comprises:
    第一计算子单元,用于计算所述拍摄图像中所述人脸的预设特征区域占所述人脸的第二比例;和A first calculation subunit, configured to calculate a second proportion of the preset feature area of the face in the captured image to the face; and
    第二计算子单元,用于根据所述第二比例及所述第二比例计算所述投射距离。A second calculation subunit is configured to calculate the projection distance according to the second ratio and the second ratio.
  13. 根据权利要求9所述的控制装置,其特征在于,所述第二计算单元包括:The control device according to claim 9, wherein the second calculation unit comprises:
    第一判断子单元,用于根据所述拍摄图像判断所述目标主体是否佩戴眼镜;和A first determining subunit, configured to determine whether the target subject is wearing glasses according to the captured image; and
    第三计算子单元,用于在所述目标主体佩戴眼镜时根据所述第一比例及距离系数计算所述投射距离。A third calculation subunit is configured to calculate the projection distance according to the first scale and a distance coefficient when the target subject wears glasses.
  14. 根据权利要求9所述的控制装置,其特征在于,所述第二计算单元包括:The control device according to claim 9, wherein the second calculation unit comprises:
    第二判断子单元,用于根据所述拍摄图像判断所述目标主体的年龄;和A second determining subunit, configured to determine the age of the target subject according to the captured image; and
    第四计算子单元,用于根据所述第一比例及所述年龄计算所述投射距离。A fourth calculation subunit is configured to calculate the projection distance according to the first ratio and the age.
  15. 一种深度相机,其特征在于,所述深度相机包括光发射器和处理器;所述处理器用于:A depth camera, characterized in that the depth camera includes a light emitter and a processor; the processor is used for:
    获取所述光发射器与场景中的目标主体之间的投射距离;Obtaining a projection distance between the light emitter and a target subject in the scene;
    根据所述投射距离确定所述光发射器的目标发光频率;和Determining a target light emission frequency of the light emitter according to the projection distance; and
    控制所述光发射器以所述目标发光频率发光。Controlling the light emitter to emit light at the target emission frequency.
  16. 根据权利要求15所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 15, wherein the processor is further configured to:
    获取所述场景的拍摄图像;Acquiring a captured image of the scene;
    处理所述拍摄图像以判断所述拍摄图像中是否存在人脸;Processing the captured image to determine whether a human face exists in the captured image;
    在所述拍摄图像中存在所述人脸时计算所述拍摄图像中所述人脸所占的第一比例;和Calculating a first proportion of the human face in the captured image when the human face is present in the captured image; and
    根据所述第一比例计算所述投射距离。Calculating the projection distance according to the first ratio.
  17. 根据权利要求15所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 15, wherein the processor is further configured to:
    控制所述光发射器以预定发光频率发光以检测所述场景的初始深度信息;和Controlling the light emitter to emit light at a predetermined light emission frequency to detect initial depth information of the scene; and
    根据所述初始深度信息计算所述光发射器与所述目标主体之间的投射距离。A projection distance between the light emitter and the target subject is calculated according to the initial depth information.
  18. 根据权利要求16所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 16, wherein the processor is further configured to:
    获取所述场景的环境亮度;Obtaining the ambient brightness of the scene;
    根据所述环境亮度及所述投射距离计算所述光发射器的目标发光功率;和Calculating a target luminous power of the light transmitter according to the ambient brightness and the projection distance; and
    控制所述光发射器以所述目标发光功率发光。Controlling the light emitter to emit light at the target light emitting power.
  19. 根据权利要求18所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 18, wherein the processor is further configured to:
    计算所述拍摄图像中所述人脸的预设特征区域占所述人脸的第二比例;和Calculating a second ratio of the preset feature area of the human face in the captured image to the human face; and
    根据所述第一比例及所述第二比例计算所述投射距离。Calculate the projection distance according to the first ratio and the second ratio.
  20. 根据权利要求18所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 18, wherein the processor is further configured to:
    根据所述拍摄图像判断所述目标主体是否佩戴眼镜;和Determining whether the target subject is wearing glasses according to the captured image; and
    在所述目标主体佩戴眼镜时根据所述第一比例及距离系数计算所述投射距离。When the target subject wears glasses, calculate the projection distance according to the first scale and a distance coefficient.
  21. 根据权利要求18所述的深度相机,其特征在于,所述处理器还用于:The depth camera according to claim 18, wherein the processor is further configured to:
    根据所述拍摄图像判断所述目标主体的年龄;和Judging the age of the target subject based on the captured image; and
    根据所述第一比例及所述年龄计算所述投射距离。Calculate the projection distance according to the first ratio and the age.
  22. 一种电子装置,其特征在于,所述电子装置包括:An electronic device is characterized in that the electronic device includes:
    权利要求15-21任意一项所述的深度相机;The depth camera according to any one of claims 15-21;
    一个或多个处理器;One or more processors;
    存储器;和Memory; and
    一个或多个程序,其中所述一个或多个程序被存储在所述存储器中,并且被配置成由所述一个或多个处理器执行,所述程序包括用于执行权利要求1至7任意一项所述的控制方法的指令。One or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the programs include instructions for performing any of claims 1 to 7 An instruction for a control method as described.
  23. 一种计算机可读存储介质,其特征在于,包括与电子装置结合使用的计算机程序,所述计算机程序可被处理器执行以完成权利要求1至7任意一项所述的控制方法。A computer-readable storage medium, comprising a computer program used in combination with an electronic device, the computer program being executable by a processor to complete the control method according to any one of claims 1 to 7.
PCT/CN2019/090078 2018-08-22 2019-06-05 Control method and device, depth camera, electronic device, and readable storage medium WO2020038064A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810962843.8A CN108833889B (en) 2018-08-22 2018-08-22 Control method and device, depth camera, electronic device and readable storage medium
CN201810962843.8 2018-08-22

Publications (1)

Publication Number Publication Date
WO2020038064A1 true WO2020038064A1 (en) 2020-02-27

Family

ID=64150437

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090078 WO2020038064A1 (en) 2018-08-22 2019-06-05 Control method and device, depth camera, electronic device, and readable storage medium

Country Status (2)

Country Link
CN (1) CN108833889B (en)
WO (1) WO2020038064A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108833889B (en) * 2018-08-22 2020-06-23 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium
CN109819238B (en) * 2019-02-22 2021-06-22 北京旷视科技有限公司 Working frequency adjusting method and device of TOF image acquisition module and electronic system
CN110400342B (en) * 2019-07-11 2021-07-06 Oppo广东移动通信有限公司 Parameter adjusting method and device of depth sensor and electronic equipment
CN110418062A (en) * 2019-08-29 2019-11-05 上海云从汇临人工智能科技有限公司 A kind of image pickup method, device, equipment and machine readable media
CN110659617A (en) * 2019-09-26 2020-01-07 杭州艾芯智能科技有限公司 Living body detection method, living body detection device, computer equipment and storage medium
CN111309012A (en) * 2020-02-24 2020-06-19 深圳市优必选科技股份有限公司 Robot and movement control method and device thereof
CN111427049A (en) * 2020-04-06 2020-07-17 深圳蚂里奥技术有限公司 Laser safety device and control method
CN111487633A (en) * 2020-04-06 2020-08-04 深圳蚂里奥技术有限公司 Laser safety control device and method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100046802A1 (en) * 2008-08-19 2010-02-25 Tatsumi Watanabe Distance estimation apparatus, distance estimation method, storage medium storing program, integrated circuit, and camera
CN103473794A (en) * 2012-06-05 2013-12-25 三星电子株式会社 Depth image generating method and apparatus and depth image processing method and apparatus
CN106817794A (en) * 2015-11-30 2017-06-09 宁波舜宇光电信息有限公司 TOF circuit modules and its application
CN108333860A (en) * 2018-03-12 2018-07-27 广东欧珀移动通信有限公司 Control method, control device, depth camera and electronic device
CN108805025A (en) * 2018-04-28 2018-11-13 Oppo广东移动通信有限公司 Laser output control method and device, electronic equipment, storage medium
CN108833889A (en) * 2018-08-22 2018-11-16 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing
CN109068036A (en) * 2018-09-12 2018-12-21 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing
CN109104583A (en) * 2018-08-22 2018-12-28 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108281880A (en) * 2018-02-27 2018-07-13 广东欧珀移动通信有限公司 Control method, control device, terminal, computer equipment and storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100046802A1 (en) * 2008-08-19 2010-02-25 Tatsumi Watanabe Distance estimation apparatus, distance estimation method, storage medium storing program, integrated circuit, and camera
CN103473794A (en) * 2012-06-05 2013-12-25 三星电子株式会社 Depth image generating method and apparatus and depth image processing method and apparatus
CN106817794A (en) * 2015-11-30 2017-06-09 宁波舜宇光电信息有限公司 TOF circuit modules and its application
CN108333860A (en) * 2018-03-12 2018-07-27 广东欧珀移动通信有限公司 Control method, control device, depth camera and electronic device
CN108805025A (en) * 2018-04-28 2018-11-13 Oppo广东移动通信有限公司 Laser output control method and device, electronic equipment, storage medium
CN108833889A (en) * 2018-08-22 2018-11-16 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing
CN109104583A (en) * 2018-08-22 2018-12-28 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing
CN109068036A (en) * 2018-09-12 2018-12-21 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium storing program for executing

Also Published As

Publication number Publication date
CN108833889B (en) 2020-06-23
CN108833889A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
WO2020038064A1 (en) Control method and device, depth camera, electronic device, and readable storage medium
WO2020038062A1 (en) Control method and device, depth camera, electronic device, and readable storage medium
WO2020052284A1 (en) Control method and device, depth camera, electronic device, and readable storage medium
WO2020038060A1 (en) Laser projection module and control method therefor, and image acquisition device and electronic apparatus
CN108205374B (en) Eyeball tracking module and method of video glasses and video glasses
CN108333860B (en) Control method, control device, depth camera and electronic device
US11335028B2 (en) Control method based on facial image, related control device, terminal and computer device
CN108509867B (en) Control method, control device, depth camera and electronic device
CN108227361B (en) Control method, control device, depth camera and electronic device
JP2017097901A (en) Eye tracking device operating method, and eye tracking device for conducting active power management
WO2020038058A1 (en) Calibration method, calibration controller, and calibration system
WO2020052282A1 (en) Electronic device, control method for same, control device therefor, and computer-readable storage medium
CN108376251B (en) Control method, control device, terminal, computer device, and storage medium
WO2020062909A1 (en) Control method and apparatus, time-of-flight device, terminal, and computer readable storage medium
CN108594451B (en) Control method, control device, depth camera and electronic device
CN108281880A (en) Control method, control device, terminal, computer equipment and storage medium
WO2020038053A1 (en) Time of flight module and control method therefor, controller and electronic apparatus
TWI684026B (en) Control method, control device, depth camera and electronic device
CN110308458B (en) Adjusting method, adjusting device, terminal and computer readable storage medium
WO2020038061A1 (en) Flight time module and control method thereof, controller and electronic device
US10551500B2 (en) Infrared optical element for proximity sensor system
WO2020038063A1 (en) Electronic device and control method for electronic device
CN108279496B (en) Eyeball tracking module and method of video glasses and video glasses
KR20210006605A (en) Electronic device including sensor and method of operation thereof
US10952324B2 (en) Spacer for surface mountable electronic components

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852733

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852733

Country of ref document: EP

Kind code of ref document: A1