WO2020038062A1 - Control method and device, depth camera, electronic device, and readable storage medium - Google Patents

Control method and device, depth camera, electronic device, and readable storage medium

Info

Publication number
WO2020038062A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
projection distance
frequency
captured image
distance
Prior art date
Application number
PCT/CN2019/090076
Other languages
English (en)
French (fr)
Inventor
韦怡
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020038062A1 publication Critical patent/WO2020038062A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the invention relates to the field of three-dimensional imaging technology, and in particular, to a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
  • A time-of-flight (TOF) imaging system can calculate the depth information of a measured object from the time difference between the moment the optical transmitter emits an optical signal and the moment the optical receiver receives it.
  • A light emitter typically includes a light source and a diffuser: the light from the light source is diffused by the diffuser and then projected into the scene as uniform surface light.
  • Embodiments of the present invention provide a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
  • a method for controlling a light transmitter includes: obtaining a projection distance between the light transmitter and a target subject in a scene; and, when the projection distance is greater than a preset distance, controlling the light transmitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • the control device for an optical transmitter includes a first acquisition module and a control module.
  • the first obtaining module is configured to obtain a projection distance between the light emitter and a target subject in a scene.
  • the control module is configured to control the light transmitter to emit light at a first frequency and then at a second frequency when the projection distance is greater than a preset distance, the second frequency being different from the first frequency.
  • a depth camera includes a light emitter and a processor.
  • the processor is configured to obtain a projection distance between the light emitter and a target subject in the scene, and, when the projection distance is greater than a preset distance, control the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • An electronic device includes the above-mentioned depth camera, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the programs include instructions for performing the foregoing control method.
  • the computer-readable storage medium of the embodiment of the present invention includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
  • FIG. 1 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present invention.
  • FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present invention.
  • FIG. 4 is a schematic diagram illustrating the operation of a depth camera according to some embodiments of the present invention.
  • FIG. 5 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 6 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present invention.
  • FIG. 7 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 8 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present invention.
  • FIG. 9 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 10 is a schematic block diagram of a control device according to some embodiments of the present invention.
  • FIG. 11 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 12 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present invention.
  • FIG. 13 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 14 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present invention.
  • FIG. 15 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present invention.
  • FIG. 16 is a schematic block diagram of a second computing unit of a control device according to some embodiments of the present invention.
  • FIG. 17 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present invention.
  • FIG. 18 is a schematic diagram of a three-dimensional structure of a depth camera according to some embodiments of the present invention.
  • FIG. 19 is a schematic plan view of a depth camera according to some embodiments of the present invention.
  • FIG. 20 is a schematic cross-sectional view of the depth camera in FIG. 19 along the line XX-XX.
  • FIG. 21 is a schematic structural diagram of a light emitter according to some embodiments of the present invention.
  • FIGS. 22 and 23 are schematic structural diagrams of a light source of a light emitter according to some embodiments of the present invention.
  • FIG. 24 is a schematic block diagram of an electronic device according to some embodiments of the present invention.
  • FIG. 25 is a schematic diagram of a connection between a computer-readable storage medium and an electronic device according to some embodiments of the present invention.
  • The control method includes:
  • 01: Obtain a projection distance between the light transmitter 100 and a target subject in the scene.
  • 03: When the projection distance is greater than the preset distance, control the light transmitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • the preset distance may be preset in the light transmitter 100 or determined by a user input. In one embodiment, the preset distance is, for example, 2.5 m.
  • the present invention further provides a control device 90 of the light transmitter 100.
  • the control method of the light transmitter 100 according to the embodiment of the present invention may be performed by the control device 90 of the light transmitter 100 according to the embodiment of the present invention.
  • the control device 90 includes a first acquisition module 91 and a control module 93.
  • Step 01 may be implemented by the first obtaining module 91.
  • Step 03 may be implemented by the control module 93. That is, the first obtaining module 91 may be used to obtain a projection distance between the light emitter 100 and a target subject in the scene.
  • the control module 93 may be configured to control the light transmitter 100 to emit light at a first frequency and then at a second frequency when the projection distance is greater than a preset distance, where the first frequency is different from the second frequency.
  • the present invention further provides a depth camera 300.
  • the depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor 805. Steps 01 and 03 may be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a projection distance between the light emitter 100 and a target subject in the scene, and, when the projection distance is greater than a preset distance, control the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • the depth camera 300 according to the embodiment of the present invention can be applied to the electronic device 800.
  • the processor 805 in the depth camera 300 according to the embodiment of the present invention and the processor 805 of the electronic device 800 may be the same processor 805 or two independent processors 805. In a specific embodiment of the present invention, the processor 805 in the depth camera 300 and the processor 805 of the electronic device 800 are the same processor 805.
  • the electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, a smart bracelet, smart glasses, a smart helmet), a drone, etc., and is not limited herein.
  • the depth camera 300 is a Time of Flight (TOF) depth camera.
  • a TOF depth camera generally includes a light transmitter 100 and a light receiver 200.
  • the light transmitter 100 is configured to project laser light into the scene, and the light receiver 200 receives the laser light reflected by a person or an object in the scene.
  • the TOF depth camera usually obtains depth information in two ways: direct acquisition and indirect acquisition.
  • In the direct acquisition mode, the processor 805 can calculate the flight time of the laser in the scene from the time point when the light transmitter 100 emits the laser light and the time point when the light receiver 200 receives it, and then calculate the depth information of the scene.
  • In the indirect acquisition mode, the light transmitter 100 emits laser light into the scene, the light receiver 200 collects the reflected laser light to obtain a phase difference, and the depth information of the scene is calculated from the phase difference and the emission frequency of the laser. Specifically:
  • In the indirect mode the distance follows from the phase difference: d = (c · t) / 2 = (c / 2) · (Δφ / 360°) · T = (c · Δφ) / (360° · 2f), where d is the distance of the object, c is the speed of light, t is the flight time of the laser, T is the emission period of the laser, f is the emission frequency of the laser, and Δφ is the measured phase difference, whose value ranges from 0 to 360 degrees.
  • When Δφ = 0, the object distance is at its minimum, d = 0; when Δφ = 360 degrees, the maximum object distance d_max = c / (2f) is obtained. That is, the distance that the depth camera 300 can measure depends on the light emission frequency: the higher the light emission frequency, the shorter the distance the depth camera 300 can measure; the lower the light emission frequency, the longer the distance the depth camera 300 can measure.
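  • As a minimal sketch of the formulas above (illustrative only, not from the patent text; c is rounded to 3×10⁸ m/s to match the examples below):

```python
C = 3.0e8  # speed of light in m/s, rounded as in the patent's examples

def tof_distance(phase_deg: float, freq_hz: float) -> float:
    """Indirect-TOF distance: d = (c / 2) * (phase / 360) * T, with T = 1 / f."""
    period = 1.0 / freq_hz                      # emission period T of the laser
    flight_time = (phase_deg / 360.0) * period  # t recovered from the phase
    return C * flight_time / 2.0

def max_range(freq_hz: float) -> float:
    """Maximum measurable distance, reached at a 360-degree phase difference."""
    return C / (2.0 * freq_hz)

print(max_range(100e6), max_range(60e6), max_range(30e6))  # 1.5 2.5 5.0 (m)
print(tof_distance(120.0, 100e6))                          # 0.5 (m)
```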
  • the light emitting frequency of the light transmitter will affect the measurement accuracy of the depth information of people and objects in the scene.
  • current light emitters usually emit light at a fixed frequency.
  • the target light emitting frequency of the light emitter 100 may be determined according to the projection distance, and then the light emitter 100 is controlled to emit light according to the target light emitting frequency.
  • In some scenarios, however, the projection distance obtained in step 01 is not accurate enough; that is, the error between the obtained projection distance and the actual distance may be large.
  • The projection distance has a mapping relationship with the target light emission frequency. The projection distance may be a specific value with the target light emission frequency also a specific value, the projection distances corresponding to the target light emission frequencies one-to-one; or the projection distance may be a range, each range likewise corresponding one-to-one to a target light emission frequency.
  • the mapping relationship between the projection distance and the target luminous frequency may be determined based on calibration data of a large number of experiments before the depth camera 300 leaves the factory.
  • the mapping relationship between the projection distance and the target luminous frequency satisfies the law that the target luminous frequency decreases as the projection distance increases.
  • For example, when the projection distance is within 1.5 meters, the target light emission frequency of the light transmitter 100 is 100 MHz; when the projection distance is within 2.5 meters, the target light emission frequency is 60 MHz; when the projection distance is within 5 meters, the target light emission frequency is 30 MHz; and so on. Thus, as the projection distance increases, the measurement range of the depth camera 300 is extended by reducing the target light emission frequency, and since the depth information is measured at only one target light emission frequency, the depth information can be obtained more quickly.
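  • A sketch of such a distance-to-frequency lookup, assuming the example bands above (the band boundaries and the fallback are illustrative assumptions):

```python
# Hypothetical bands taken from the example above:
# (upper bound of projection distance in meters, target emission frequency in Hz)
FREQ_BANDS = [(1.5, 100e6), (2.5, 60e6), (5.0, 30e6)]

def target_frequency(projection_distance_m: float) -> float:
    """Pick the highest frequency whose unambiguous range covers the distance."""
    for upper_m, freq_hz in FREQ_BANDS:
        if projection_distance_m <= upper_m:
            return freq_hz
    return FREQ_BANDS[-1][1]  # beyond 5 m, fall back to the lowest frequency

print(target_frequency(2.0))  # 60000000.0
```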
  • When the projection distance is greater than the distance measurable at the corresponding light emission frequency, the depth camera 300 suffers measurement ambiguity; that is, the measured distance wraps around within the measurable range. For example, when the light emission frequency is 100 MHz, the depth camera 300 can measure 0-1.5 meters. When the projection distance is greater than 1.5 meters, for example 5 meters, the phase difference measured by the depth camera 300 is 120 degrees, which is the same as the phase difference at 0.5 meters; the depth camera 300 therefore mistakes the 5-meter projection distance for 0.5 meters.
  • Therefore, when the projection distance is greater than a preset distance, the light transmitter 100 is controlled to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • the optical receiver 200 is configured to receive the reflected laser light that the light transmitter 100 emitted at the first frequency to obtain a first phase difference, and to receive the reflected laser light that the light transmitter 100 emitted at the second frequency to obtain a second phase difference.
  • the processor 805 is further configured to: obtain a first distance according to the first frequency and the first phase difference; obtain a second distance according to the second frequency and the second phase difference; and obtain a calibrated distance according to the first distance and the second distance.
  • When the projection distance is greater than the preset distance, the projection distance may be measured using a first frequency and a second frequency that are both greater than a third frequency, where the third frequency is the maximum light emission frequency corresponding to the projection distance. Please refer to FIG.
  • For example, for a projection distance of 5 meters, the first frequency is, for example, 100 MHz, and the second frequency is, for example, 60 MHz.
  • When the light transmitter 100 emits at the first frequency (100 MHz), the first phase difference received by the optical receiver 200 is 120 degrees and the first distance obtained by the measurement is 0.5 m; when it emits at the second frequency (60 MHz), the second phase difference received by the optical receiver 200 is 360 degrees and the second distance obtained by the measurement is 2.5 m.
  • The actual projection distance must therefore satisfy both d = 1.5·k1 + 0.5 and d = 2.5·k2 + 2.5 (k1, k2 being non-negative integers); the smallest distance satisfying both, 5 meters, is taken as the calibrated distance.
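  • A minimal sketch of this dual-frequency disambiguation under the 100 MHz / 60 MHz example above: search for the smallest distance consistent with both wrapped single-frequency readings (the search loop and tolerance are implementation assumptions):

```python
def resolve_distance(d1: float, range1: float, d2: float, range2: float,
                     max_search_m: float = 20.0, tol: float = 1e-6) -> float:
    """Smallest d with d = k1 * range1 + d1 = k2 * range2 + d2, k1, k2 >= 0."""
    k1 = 0
    while True:
        candidate = k1 * range1 + d1            # 0.5, 2.0, 3.5, 5.0, ...
        if candidate > max_search_m:
            raise ValueError("no consistent distance within the search range")
        k2 = round((candidate - d2) / range2)   # nearest whole number of periods
        if k2 >= 0 and abs(k2 * range2 + d2 - candidate) < tol:
            return candidate
        k1 += 1

# 100 MHz: 1.5 m range, wrapped reading 0.5 m; 60 MHz: 2.5 m range, reading 2.5 m
print(resolve_distance(0.5, 1.5, 2.5, 2.5))  # 5.0
```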
  • When the projection distance is not greater than the preset distance, the light emission frequency of the light transmitter 100 may be a single frequency, such as 60 MHz or 100 MHz.
  • obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
  • 011: Obtain a captured image of the scene.
  • 012: Process the captured image to determine whether a human face exists in the captured image.
  • 013: When a human face exists in the captured image, calculate a first proportion of the human face in the captured image.
  • 014: Calculate the projection distance according to the first proportion.
  • the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914.
  • Step 011 may be implemented by the first obtaining unit 911.
  • Step 012 may be implemented by the processing unit 912.
  • Step 013 may be implemented by the first calculation unit 913.
  • Step 014 may be implemented by the second calculation unit 914. That is to say, the first acquisition unit 911 may be configured to acquire a captured image of a scene.
  • the processing unit 912 may be configured to process the captured image to determine whether a human face exists in the captured image.
  • the first calculation unit 913 may be configured to calculate a first proportion of a human face in the captured image when a human face exists in the captured image.
  • the second calculation unit 914 may be configured to calculate a projection distance according to the first ratio.
  • the first obtaining unit 911 may be an infrared camera (which may be the light receiver 200) or a visible light camera 400.
  • When the first obtaining unit 911 is an infrared camera, the captured image is an infrared image; when the first obtaining unit 911 is the visible light camera 400, the captured image is a visible light image.
  • step 011, step 012, step 013, and step 014 may be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a captured image of a scene, process the captured image to determine whether a human face exists in it, calculate a first proportion of the human face in the captured image when a human face exists, and calculate the projection distance according to the first proportion.
  • Specifically, the processor 805 first recognizes whether a human face exists in the captured image based on a face recognition algorithm. When a face exists in the captured image, the processor 805 extracts the face area and calculates the number of pixels it occupies, then divides that number by the total number of pixels in the captured image to obtain the first proportion of the face in the captured image, and finally calculates the projection distance based on the first proportion. Generally, when the first proportion is larger, the target subject is closer to the depth camera 300, that is, closer to the light transmitter 100, and the projection distance is smaller; when the first proportion is smaller, the target subject is farther from the depth camera 300, that is, farther from the light transmitter 100, and the projection distance is larger. Therefore, the relationship between the projection distance and the first proportion satisfies that the projection distance increases as the first proportion decreases.
  • When multiple human faces exist in the captured image, the face with the largest area may be selected as the face area to calculate the first proportion; or the total area of the multiple faces may be used to calculate the first proportion; or the face of the holder of the electronic device 800 may be identified from the multiple faces, and the first proportion calculated using the holder's face as the face area. Determining whether to use the first frequency and the second frequency based on the distance between the holder and the depth camera 300 can improve the accuracy of the depth information obtained for the holder and improve the user experience.
  • The first proportion has a mapping relationship with the projection distance. The first proportion may be a specific value with the projection distance also a specific value, the first proportions corresponding to the projection distances one-to-one; or the first proportion may be a range and the projection distance a specific value, again in one-to-one correspondence; or both the first proportion and the projection distance may be ranges, with the first proportions corresponding to the projection distances one-to-one.
  • In one example, the mapping relationship between the first proportion and the projection distance may be calibrated in advance.
  • During calibration, the user is directed to stand, in turn, at a plurality of predetermined projection distances from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 captures an image at each distance.
  • The processor 805 calculates the calibration proportion of the face in each captured image, and then stores the correspondence between the calibration proportion in each captured image and the predetermined projection distance. In subsequent use, the projection distance corresponding to the actually measured first proportion is found from this stored mapping relationship.
  • For example, the user is instructed to stand at projection distances of 10 cm, 20 cm, 30 cm, and 40 cm in turn, the infrared camera or visible light camera 400 captures an image at each distance, and the processor 805 calculates from the captured images the calibration proportions corresponding to 10 cm, 20 cm, 30 cm, and 40 cm as 80%, 60%, 45%, and 30%, respectively. The mapping between calibration proportion and predetermined projection distance, 10 cm-80%, 20 cm-60%, 30 cm-45%, 40 cm-30%, is stored in the memory of the electronic device 800 (shown in FIG. 24) in the form of a mapping table. In subsequent use, the projection distance corresponding to the first proportion is looked up directly in the mapping table.
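  • A sketch of that table lookup; the linear interpolation between calibrated points is an added assumption, the patent itself only describes looking the value up directly:

```python
# Calibrated (first proportion, projection distance in cm) pairs from the example.
CALIBRATION = [(0.80, 10.0), (0.60, 20.0), (0.45, 30.0), (0.30, 40.0)]

def projection_distance_cm(first_proportion: float) -> float:
    """Map a measured face proportion to a distance via the calibration table."""
    pts = sorted(CALIBRATION)                    # ascending by proportion
    if first_proportion <= pts[0][0]:
        return pts[0][1]                         # at or beyond the farthest point
    if first_proportion >= pts[-1][0]:
        return pts[-1][1]                        # at or beyond the nearest point
    for (r0, d0), (r1, d1) in zip(pts, pts[1:]):
        if r0 <= first_proportion <= r1:
            w = (first_proportion - r0) / (r1 - r0)
            return d0 + w * (d1 - d0)            # interpolate between neighbors

print(projection_distance_cm(0.50))  # ~26.7 cm, between the 45% and 60% points
```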
  • In another example, the correspondence between the projection distance and the first proportion is calibrated in advance at a single predetermined distance.
  • During calibration, the user is directed to stand at a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects a captured image.
  • The processor 805 calculates the calibration proportion of the human face in the captured image, and then stores the correspondence between this calibration proportion and the predetermined projection distance. In subsequent use, the projection distance is calculated based on this correspondence.
  • For example, with the user standing at the predetermined projection distance, the processor 805 calculates that the proportion of the human face in the captured image is 45%; in actual measurement, when the first proportion is calculated as R, the actual projection distance D is then calculated from R, the 45% calibration proportion, and the predetermined projection distance according to the properties of similar triangles.
  • In this way, the projection distance between the target subject and the light emitter 100 can be reflected more objectively.
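  • A sketch of such a single-point calculation, under the assumption that the first proportion is inversely proportional to the distance (one common reading of the similar-triangle relation; the patent's exact formula is not reproduced above):

```python
def distance_from_proportion(measured_r: float,
                             calib_distance: float,
                             calib_r: float) -> float:
    """Assumed relation R0 / R = D / D0, i.e. D = D0 * R0 / R."""
    return calib_distance * calib_r / measured_r

# Calibrated: the face occupies 45% of the image at 30 cm (values assumed);
# a measured 30% proportion then maps to 45 cm.
print(distance_from_proportion(0.30, 30.0, 0.45))  # 45.0
```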
  • In some embodiments, obtaining the projection distance between the light emitter 100 and the target subject in the scene in step 01 includes:
  • 015: Control the light transmitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene.
  • 016: Calculate the projection distance between the light transmitter 100 and the target subject according to the initial depth information.
  • the first obtaining module 91 includes a first control unit 915 and a third calculation unit 916.
  • Step 015 may be implemented by the first control unit 915.
  • Step 016 may be implemented by the third calculation unit 916. That is, the first control unit 915 may be used to control the light transmitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene.
  • the third calculation unit 916 may be configured to calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
  • step 015 and step 016 may both be implemented by the processor 805. That is, the processor 805 may also be used to control the light emitter 100 to emit light at a predetermined light emission frequency to detect initial depth information of the scene, and calculate a projection distance between the light emitter 100 and the target subject according to the initial depth information.
  • Specifically, the processor 805 controls the light transmitter 100 to emit laser light at the predetermined light emission frequency, the light receiver 200 receives the laser light reflected by a person or an object in the scene, and the processor 805 calculates the initial depth information of the scene based on the reception result of the light receiver 200.
  • the predetermined light emission frequency is less than a preset threshold, that is, when the initial depth information of the scene is acquired, the light emitter 100 emits light at a lower light emission frequency.
  • On the one hand, the lower light emission frequency can reduce the power consumption of the electronic device 800; on the other hand, at this point the projection distance between the target subject and the depth camera 300 is unknown, and it is also unknown whether the target subject is a user. If the light were emitted directly at a higher frequency while the target subject is a user relatively close to the depth camera 300, the high-frequency laser emission could harm the user's eyes, whereas emission at a lower frequency carries no such hidden danger.
  • After the initial depth information of the scene is obtained, the target subject is determined from the scene so as to determine the initial depth information of the target subject.
  • the target subject is generally located in the central area of the field of view of the light receiver 200. Therefore, the central area of the field of view of the light receiver 200 can be used as the area where the target subject is located, so that the initial depth information of the pixels in the central area is used as The initial depth information of the target subject.
  • When the central area contains multiple pieces of initial depth information, the processor 805 can calculate the average or median value of the multiple initial depth values, and use that average or median as the projection distance between the light emitter 100 and the target subject.
  • In this way, the projection distance between the target subject and the light emitter 100 is calculated, the light emission frequency of the light emitter 100 is then determined based on the projection distance, and the light emitter 100 emits light at that frequency, improving the precision of the obtained depth information of the target subject.
  • In some embodiments, the processor 805 may further perform steps 015 and 016 to determine the projection distance between the target subject and the light transmitter 100. In this way, even when a human face does not exist in the captured image, the projection distance between the target subject and the light emitter 100 can still be determined.
  • In some embodiments, the processor 805 may control the infrared camera (which may be the light receiver 200) or the visible light camera 400 to collect a captured image. The following assumes the captured image is collected by the visible light camera 400.
  • The fields of view of the visible light camera 400 and the light receiver 200 in the electronic device 800 usually overlap to a large extent.
  • Before the electronic device 800 leaves the factory, the manufacturer also calibrates the relative position between the visible light camera 400 and the light receiver 200 and obtains calibration parameters for matching the color information of a visible light image with the depth information of a depth image. Therefore, after the captured image is obtained, the processor 805 can first identify whether a human face exists in it; when a face exists, the initial depth information corresponding to the face is found based on the matching relationship between the captured image and the initial depth image formed by the initial depth information, and that initial depth information is used as the depth information of the target subject. If no human face exists in the captured image, the initial depth information of the pixels in the central area is used as the initial depth information of the target subject. As such, when a user is present in the scene, the projection distance between the user and the depth camera 300 can be measured more accurately.
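  • A sketch of that selection logic, assuming the initial depth information is held in a NumPy array and a face detector supplies a bounding box (function and parameter names are hypothetical):

```python
import numpy as np

def target_subject_distance(depth: np.ndarray, face_box=None) -> float:
    """Median initial depth of the face region when a face was found,
    otherwise of the central area of the light receiver's field of view."""
    if face_box is not None:
        x, y, w, h = face_box                       # from the face-recognition step
        region = depth[y:y + h, x:x + w]
    else:
        rows, cols = depth.shape
        region = depth[rows // 4: 3 * rows // 4,
                       cols // 4: 3 * cols // 4]    # central area as fallback
    return float(np.median(region))                 # median (or mean) as distance
```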
  • In some embodiments, after step 01 the control method further includes:
  • 04: Obtain the ambient brightness of the scene.
  • 05: Calculate the target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance.
  • 06: Control the light transmitter 100 to emit light at the target luminous power.
  • the control device 90 further includes a second obtaining module 94 and a calculation module 95.
  • Step 04 may be implemented by the second acquisition module 94.
  • Step 05 may be implemented by the calculation module 95.
  • Step 06 may be implemented by the control module 93. That is to say, the second acquisition module 94 can be used to acquire the ambient brightness of the scene.
  • the calculation module 95 may be configured to calculate a target luminous power of the light transmitter 100 according to the ambient brightness and the projection distance.
  • the control module 93 can also be used to control the light emitter 100 to emit light at a target light emission power.
  • step 04, step 05, and step 06 can all be implemented by the processor 805. That is to say, the processor 805 can be used to obtain the ambient brightness of the scene, calculate the target light emitting power of the light transmitter 100 according to the environment brightness and the projection distance, and control the light transmitter 100 to emit light at the target light emitting power.
  • step 06 and step 03 may be performed synchronously.
  • In addition to controlling the light emission frequency of the light transmitter 100, the processor 805 also controls the light transmitter 100 to emit light at the target luminous power.
  • the ambient brightness can be detected by a light sensor.
  • the processor 805 reads the ambient brightness it detects from the light sensor.
  • the ambient brightness may also be detected by an infrared camera (which may be the light receiver 200) or the visible light camera 400.
  • the infrared camera or the visible light camera 400 captures an image of the current scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
  • After determining the ambient brightness and the projection distance, the processor 805 calculates the target luminous power jointly from these two parameters. It can be understood that, first, when the ambient brightness is high, the ambient light contains more infrared components, and this ambient infrared light overlaps with the infrared laser emitted by the light transmitter 100.
  • The optical receiver 200 receives both the infrared laser emitted by the optical transmitter 100 and the infrared light in the ambient light. If the emission power of the infrared laser is low, the proportion of infrared laser from the light transmitter 100 does not differ much from that of the ambient infrared light, so the time at which the light receiver 200 receives the light, or the obtained phase difference, is not accurate enough, decreasing the accuracy of the obtained depth information.
  • Therefore, when the ambient brightness is high, the emission power of the infrared laser emitted by the optical transmitter 100 needs to be increased to reduce the influence of ambient infrared light on the optical receiver 200. When the ambient brightness is low, the ambient light contains fewer infrared components; if the light emitter 100 then emitted at a high luminous power, the power consumption of the electronic device 800 would be increased unnecessarily.
  • Second, when the projection distance is long, the flight time of the laser is longer, the flight distance is longer, and the laser loss is greater, which makes the obtained phase difference less accurate and affects the accuracy of the acquired depth information. Therefore, when the projection distance is large, the emission power of the infrared laser emitted by the optical transmitter 100 can be appropriately increased.
  • In one example, when the ambient brightness is high and the projection distance is large, the target luminous power of the light transmitter 100 is greater than or equal to a first predetermined power P1; when the ambient brightness is low and the projection distance is small, the target luminous power of the light transmitter 100 is less than or equal to a second predetermined power P2, where the first predetermined power P1 is greater than the second predetermined power P2; and when the ambient brightness is high but the projection distance is small, or the ambient brightness is low but the projection distance is large, the target luminous power of the light transmitter 100 lies between the second predetermined power P2 and the first predetermined power P1, that is, its value range is (P2, P1).
  • jointly determining the target light emitting power of the light transmitter 100 based on the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand and improve the accuracy of obtaining the depth information of the scene on the other.
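  • A minimal sketch of that joint rule; the brightness and distance thresholds and the mid-band value are assumptions, the text above only fixes P1 > P2 and the (P2, P1) band for the mixed cases:

```python
def target_power(ambient_brightness: float, projection_distance_m: float,
                 bright_thresh: float = 500.0,  # assumed brightness threshold
                 dist_thresh: float = 2.5,      # assumed distance threshold, m
                 p1: float = 1.0, p2: float = 0.4) -> float:
    """Bright and far -> at least P1; dim and near -> at most P2;
    mixed cases -> somewhere inside (P2, P1)."""
    bright = ambient_brightness > bright_thresh
    far = projection_distance_m > dist_thresh
    if bright and far:
        return p1
    if not bright and not far:
        return p2
    return (p1 + p2) / 2.0  # one simple choice inside the (P2, P1) band
```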
  • In some embodiments, calculating the projection distance according to the first proportion in step 014 includes:
  • 0141: Calculate a second proportion of the preset feature area of the human face in the captured image to the human face.
  • 0142: Calculate the projection distance according to the first proportion and the second proportion.
  • the second calculation unit 914 includes a first calculation sub-unit 9141 and a second calculation sub-unit 9142.
  • Step 0141 may be implemented by the first calculation subunit 9141
  • step 0142 may be implemented by the second calculation subunit 9142.
  • the first calculation subunit 9141 may be configured to calculate a second proportion of the preset feature area of the human face in the captured image to the human face.
  • the second calculation subunit 9142 may be configured to calculate the projection distance according to the first proportion and the second proportion.
  • step 0141 and step 0142 may both be implemented by the processor 805. That is to say, the processor 805 may be configured to calculate a second ratio of a preset feature area of a human face in the captured image to the human face, and calculate a projection distance according to the first ratio and the second ratio.
  • the second ratio is a ratio of the preset features of the human face to the human face.
  • the preset feature area may select a feature area with a small difference between different user individuals.
  • In one embodiment, the preset feature area is the binocular distance of the user.
  • During calibration, the user is directed to stand at a predetermined projection distance, a captured image is collected, and a first calibration proportion and a second calibration proportion corresponding to the captured image are calculated. The correspondence among the predetermined projection distance, the first calibration proportion, and the second calibration proportion is stored, so that in subsequent use the projection distance can be calculated from the actually measured first and second proportions.
  • For example, the user is instructed to stand at a projection distance of 25 cm and a captured image is collected; the first calibration proportion corresponding to the captured image is calculated to be 50% and the second calibration proportion to be 10%.
  • In actual measurement, an initial projection distance D1 is calculated according to the actually measured first proportion R1; a calibrated projection distance D2 is then further calculated according to the actually measured second proportion R2, and D2 is used as the final projection distance.
  • In this way, the projection distance calculated according to the first proportion and the second proportion takes into account individual differences between users, and a more objective projection distance can be obtained. Further, a more accurate light emission frequency and target luminous power can be determined based on the more accurate projection distance.
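  • A sketch of the two-step estimate, assuming both proportions scale inversely with distance from the 25 cm calibration point (50% and 10%); the inverse-proportional form is an assumption, the patent's exact relation is not quoted above:

```python
def calibrated_distance(r1: float, r2: float,
                        d_cal: float = 25.0,    # calibration distance, cm
                        r1_cal: float = 0.50,   # first calibration proportion
                        r2_cal: float = 0.10):  # second calibration proportion
    """D1 from the whole-face proportion, then D2 from the preset feature
    (e.g. binocular distance), whose size varies little between individuals."""
    d1 = d_cal * r1_cal / r1    # initial projection distance from the face
    d2 = d_cal * r2_cal / r2    # calibrated distance from the preset feature
    return d1, d2               # D2 is used as the final projection distance

print(calibrated_distance(0.25, 0.05))  # (50.0, 50.0): twice the calibration distance
```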
  • calculating the projection distance according to the first proportion in step 014 includes:
  • 0143: Judge whether the target subject is wearing glasses based on the captured image.
  • 0144: When the target subject wears glasses, calculate the projection distance according to the first proportion and a preset distance coefficient.
  • the second calculation unit 914 further includes a first determination sub-unit 9143 and a third calculation sub-unit 9144.
  • Step 0143 may be implemented by the first judging subunit 9143.
  • Step 0144 may be implemented by the third calculation subunit 9144. That is to say, the first judging sub-unit 9143 may be used to judge whether the target subject is wearing glasses according to the captured image, and the third calculating sub-unit 9144 may be used to calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
  • step 0143 and step 0144 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine whether the target subject is wearing glasses according to the captured image, and calculate the projection distance according to the first ratio and the distance coefficient when the target subject is wearing glasses.
  • When the optical transmitter 100 emits laser light toward a user wearing glasses, the luminous power of the light transmitter 100 needs to be reduced so that the energy of the emitted laser is small and does not damage the user's eyes.
  • The preset distance coefficient can be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, or 0.95. The initial projection distance calculated according to the first proportion, or the calibrated projection distance calculated according to the first proportion and the second proportion, is multiplied by the distance coefficient to obtain the final projection distance, and the target luminous power is then determined according to this projection distance and the ambient brightness. In this way, the power of the emitted laser is prevented from being so large as to hurt a user suffering from eye disease or poor vision.
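  • A sketch of that adjustment; the 0.8 default is one of the example coefficients listed above:

```python
def adjusted_distance(projection_distance: float,
                      wears_glasses: bool,
                      distance_coeff: float = 0.8) -> float:
    """Scale the distance estimate down by a coefficient in (0, 1) when glasses
    are detected, so the luminous power chosen from it stays eye-safe."""
    if wears_glasses:
        return projection_distance * distance_coeff
    return projection_distance
```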
  • In some embodiments, calculating the projection distance according to the first proportion in step 014 includes:
  • 0145: Judge the age of the target subject based on the captured image.
  • 0146: Calculate the projection distance according to the first proportion and the age.
  • the second calculation unit 914 further includes a second determination sub-unit 9145 and a fourth calculation sub-unit 9146.
  • Step 0145 may be implemented by the second judgment sub-unit 9145.
  • Step 0146 may be implemented by the fourth calculation subunit 9146. That is to say, the second judging subunit 9145 can be used to judge the age of the target subject based on the captured image.
  • the fourth calculation subunit 9146 may be configured to calculate the projection distance according to the first ratio and the age.
  • step 0145 and step 0146 may be implemented by the processor 805. That is, the processor 805 may be further configured to determine the age of the target subject based on the captured image, and calculate the projection distance based on the first ratio and age.
  • Specifically, the number, distribution, and area of facial-wrinkle feature points in the captured image can be extracted to determine the user's age; for example, the number of wrinkles at the corners of the eyes can be used to determine the user's age, optionally further combined with the number of wrinkles on the user's forehead.
  • The proportion coefficient can then be obtained according to the age of the user; specifically, the correspondence between age and proportion coefficient can be looked up in a query table.
  • For example, when the age is below 15, the proportion coefficient is 0.6; when the age is between 15 and 20, the proportion coefficient is 0.8; when the age is between 20 and 45, the proportion coefficient is 1.0; and when the age is 45 or above, the proportion coefficient is 0.8.
  • The initial projection distance calculated from the first proportion, or the calibrated projection distance calculated from the first and second proportions, can be multiplied by the proportion coefficient to obtain the final projection distance, and the target luminous power is then determined according to this projection distance and the ambient brightness. In this way, the emitted laser is prevented from being so powerful that it harms younger or older users.
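  • As a sketch, the age lookup and scaling described above could read:

```python
def age_coefficient(age: int) -> float:
    """Proportion coefficient per the query table above."""
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age < 45:
        return 1.0
    return 0.8  # 45 or older

def final_distance(initial_distance: float, age: int) -> float:
    """Scale the first-proportion distance by the age coefficient."""
    return initial_distance * age_coefficient(age)

print(final_distance(40.0, 60))  # 32.0
```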
  • the electronic device 800 further includes a housing 801.
  • the housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800.
  • the housing 801 can provide protection for the functional elements from dust, drop, and water.
  • the functional elements can be a display screen 802, a visible light camera 400, a receiver, and the like.
  • the housing 801 includes a main body 803 and a movable bracket 804.
  • the movable bracket 804 can be moved relative to the main body 803 under the driving of a driving device.
  • the movable bracket 804 can slide relative to the main body 803.
  • In the example of FIG. 17, some functional elements (such as the display screen 802) can be installed on the main body 803, and other functional elements (such as the depth camera 300, the visible light camera 400, and the receiver) can be installed on the movable bracket 804.
  • the movement of the movable bracket 804 can drive these other functional elements to retract into, or protrude from, the main body 803.
  • FIG. 1 and FIG. 17 are merely examples of a specific form of the casing 801, and cannot be understood as a limitation on the casing 801 of the present invention.
  • the depth camera 300 is mounted on a casing 801.
  • the housing 801 may be provided with an acquisition window, and the depth camera 300 is aligned with the acquisition window to enable the depth camera 300 to acquire depth information.
  • the depth camera 300 is mounted on a movable bracket 804.
  • the movable bracket 804 can be triggered to slide into the main body 803, driving the depth camera 300 to retract into the main body 803.
  • the depth camera 300 further includes a first substrate assembly 71 and a spacer 72.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other.
  • the spacer 72 is disposed on the first substrate 711.
  • the light emitter 100 is used for projecting laser light outward, and the light emitter 100 is disposed on the cushion block 72.
  • the flexible circuit board 712 is bent and one end of the flexible circuit board 712 is connected to the first substrate 711 and the other end is connected to the light emitter 100.
  • the light receiver 200 is disposed on the first substrate 711.
  • the light receiver 200 is configured to receive laser light reflected by a person or an object in the target space.
  • the light receiver 200 includes a housing 741 and an optical element 742 provided on the housing 741.
  • the housing 741 is integrally connected with the pad 72.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712.
  • the first substrate 711 may be a printed wiring board or a flexible wiring board. Control circuits and the like of the depth camera 300 may be laid on the first substrate 711.
  • One end of the flexible circuit board 712 may be connected to the first substrate 711, and the other end of the flexible circuit board 712 is connected to the circuit board 50 (shown in FIG. 20).
  • the flexible circuit board 712 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 712 can be selected.
  • the spacer 72 is disposed on the first substrate 711.
  • the spacer 72 is in contact with the first substrate 711 and is carried on the first substrate 711.
  • the spacer 72 may be combined with the first substrate 711 by means of adhesion or the like.
  • the material of the spacer 72 may be metal, plastic, or the like.
  • The surface of the spacer 72 that is combined with the first substrate 711 may be flat, and the surface of the spacer 72 opposite to that combined surface may also be flat, so that the light emitter 100 has better flatness when disposed on the spacer 72.
  • the light receiver 200 is disposed on the first substrate 711, and the contact surface between the light receiver 200 and the first substrate 711 is substantially flush with the contact surface between the pad 72 and the first substrate 711 (that is, the installation starting point of the two is at On the same plane).
  • the light receiver 200 includes a housing 741 and an optical element 742.
  • the casing 741 is disposed on the first substrate 711, and the optical element 742 is disposed on the casing 741.
  • the casing 741 may be a lens holder and a lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the casing 741.
  • the light receiver 200 further includes a photosensitive chip (not shown), and the laser light reflected by a person or an object in the target space passes through the optical element 742 and is irradiated into the photosensitive chip, and the photosensitive chip responds to the laser.
  • the housing 741 and the cushion block 72 are integrally connected.
  • the casing 741 and the cushion block 72 may be integrally formed; or the materials of the casing 741 and the cushion block 72 are different, and the two are integrally formed by two-color injection molding or the like.
  • the housing 741 and the spacer 72 may also be separately formed, and the two form a matching structure.
  • For example, one of the housing 741 and the spacer 72 may first be set on the first substrate 711, and the other is then mounted on the first substrate 711 so that the two are integrally connected.
  • the light transmitter 100 is disposed on the pad 72, which can increase the height of the light transmitter 100, thereby increasing the height of the surface on which the laser is emitted by the light transmitter 100.
  • In this way, the laser light emitted by the light transmitter 100 is not easily blocked by the light receiver 200, so that the laser light can completely irradiate the measured object in the target space.
  • the side where the cushion block 72 is combined with the first substrate 711 is provided with a receiving cavity 723.
  • the depth camera 300 further includes an electronic component 77 provided on the first substrate 711.
  • the electronic component 77 is housed in the receiving cavity 723.
  • the electronic component 77 may be an element such as a capacitor, an inductor, a transistor, or a resistor.
  • the electronic component 77 may be electrically connected to a control line laid on the first substrate 711 and used for controlling the operation of the light transmitter 100 or the light receiver 200.
  • the electronic component 77 is housed in the receiving cavity 723, and the space in the pad 72 is used reasonably.
  • the number of the receiving cavities 723 may be one or more, and the receiving cavities 723 may be spaced apart from each other. When mounting the pad 72, the receiving cavity 723 and the electronic component 77 may be aligned and the pad 72 may be disposed on the first substrate 711.
  • the spacer 72 is provided with an avoiding through hole 724 connected to at least one receiving cavity 723, and at least one electronic component 77 extends into the avoiding through hole 724. It can be understood that when the electronic component 77 is to be accommodated in the receiving cavity 723, its height must not exceed the height of the receiving cavity 723. For an electronic component taller than the receiving cavity 723, an avoiding through hole 724 corresponding to the receiving cavity 723 may be provided, and the electronic component 77 may partially extend into the avoiding through hole 724, so that the electronic component 77 can be arranged without increasing the height of the spacer 72.
  • the first substrate assembly 71 further includes a reinforcing plate 713, and the reinforcing plate 713 is coupled to the side of the first substrate 711 opposite to the spacer 72.
  • the reinforcing plate 713 may cover one side of the first substrate 711, and the reinforcing plate 713 may be used to increase the strength of the first substrate 711 and prevent deformation of the first substrate 711.
  • the reinforcing plate 713 may be made of a conductive material, such as metal or alloy.
  • the reinforcing plate 713 may be electrically connected to the casing 801 to ground the reinforcing plate 713, which effectively reduces the interference of static electricity from external components on the depth camera 300.
  • the depth camera 300 further includes a connector 76, which is connected to the first substrate assembly 71 and used for electrical connection with electronic components external to the depth camera 300.
  • the light transmitter 100 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, a circuit board 50, and a driver 61.
  • the lens barrel 30 includes a ring-shaped lens barrel sidewall 33, and the ring-shaped lens barrel sidewall 33 surrounds a receiving cavity 62.
  • the side wall 33 of the lens barrel includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface.
  • the side wall 33 of the lens barrel includes a first surface 31 and a second surface 32 opposite to each other.
  • the receiving cavity 62 penetrates the first surface 31 and the second surface 32.
  • the first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the receiving cavity 62.
  • the bottom surface 35 of the mounting groove 34 is located on a side of the mounting groove 34 remote from the first surface 31.
  • the outer surface 332 of the lens barrel side wall 33 is circular in cross-section at the end near the first surface 31, and an external thread is formed on the outer surface 332 at that end.
  • the circuit board 50 is disposed on the second surface 32 of the lens barrel 30 and closes one end of the receiving cavity 62.
  • the circuit board 50 may be a flexible circuit board or a printed circuit board.
  • the light source 10 is carried on the circuit board 50 and received in the receiving cavity 62.
  • the light source 10 is configured to emit laser light toward the first surface 31 (the mounting groove 34) side of the lens barrel 30.
  • the light source 10 may be a single-point light source or a multi-point light source.
  • When the light source 10 is a single-point light source, it may specifically be an edge-emitting laser, for example a distributed feedback laser (DFB); when the light source 10 is a multi-point light source, it may specifically be a vertical-cavity surface-emitting laser (VCSEL), or a multi-point light source composed of multiple edge-emitting lasers.
  • The vertical-cavity surface-emitting laser has a small height; using it as the light source 10 helps reduce the height of the light emitter 100 and facilitates integrating the light emitter 100 into electronic devices 800 such as mobile phones that place demands on body thickness. Compared with the vertical-cavity surface-emitting laser, the edge-emitting laser has a smaller temperature drift, which can reduce the influence of temperature on the laser light projected by the light source 10.
  • the driver 61 is carried on the circuit board 50 and is electrically connected to the light source 10. Specifically, the driver 61 may receive a modulated input signal, convert the input signal into a constant current source, and transmit it to the light source 10, so that the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30 under the action of the constant current source.
  • the driver 61 of this embodiment is provided outside the lens barrel 30. In other embodiments, the driver 61 may be disposed in the lens barrel 30 and carried on the circuit board 50.
  • the diffuser 20 is mounted (supported) in the mounting groove 34 and abuts the bottom surface 35 of the mounting groove 34.
  • the diffuser 20 is used to diffuse the laser light passing through the diffuser 20. That is, when the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused or projected outside the lens barrel 30 by the diffuser 20.
  • the protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41.
  • a light through hole 401 is opened in the center of the top wall 41.
  • the protective side wall 42 is disposed around the top wall 41 and the light through hole 401.
  • the top wall 41 and the protection side wall 42 together form a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43.
  • the cross-section of the inner surface of the protective sidewall 42 is circular, and an inner thread is formed on the inner surface of the protective sidewall 42.
  • the internal thread of the protective sidewall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30.
  • the top wall 41 abuts the diffuser 20 so that the diffuser 20 is sandwiched between the top wall 41 and the bottom surface 35 of the mounting groove 34.
  • In this way, the diffuser 20 is fixed on the lens barrel 30 by installing the diffuser 20 in the mounting groove 34 and mounting the protective cover 40 on the lens barrel 30 to clamp the diffuser 20 between the protective cover 40 and the bottom surface 35 of the mounting groove 34, without using glue.
  • This avoids the glue volatilizing into a gaseous state, solidifying on the surface of the diffuser 20, and affecting the microstructure of the diffuser 20.
  • It also avoids the diffuser 20 falling off the lens barrel 30 when the adhesion of the glue to the lens barrel 30 decreases due to aging.
  • in this case, the structure of the vertical-cavity surface-emitting laser may be one of the following:
  • the vertical-cavity surface-emitting laser includes a plurality of point light sources 101 that form a plurality of independently controllable fan-shaped arrays 11, and the fan-shaped arrays 11 enclose a circle (as shown in FIG. 22) or a polygon (not shown); the light-emitting power of the light emitter 100 can then be set by turning on the point light sources 101 of different numbers of fan-shaped arrays 11, that is, the target light-emitting power corresponds to a target number of turned-on fan-shaped arrays.
  • when not all fan-shaped arrays are turned on, the turned-on arrays should be distributed centrally symmetrically, so that the laser emitted by the light emitter 100 is more uniform.
  • the vertical-cavity surface-emitting laser includes a plurality of point light sources 101 that form a plurality of sub-arrays 12; the sub-arrays 12 include at least one circular sub-array and at least one annular sub-array that together enclose a circle (as shown in FIG. 23), or at least one polygonal sub-array and at least one annular sub-array that together enclose a polygon (not shown).
  • the light-emitting power of the light emitter 100 can be adjusted by turning on the point light sources 101 of different numbers of sub-arrays 12, that is, the target light-emitting power corresponds to a target number of turned-on sub-arrays 12.
  • the present invention further provides an electronic device 800.
  • the electronic device 800 includes the depth camera 300 according to any one of the foregoing embodiments, one or more processors 805, a memory 806, and one or more programs 807.
  • One or more programs 807 are stored in the memory 806 and are configured to be executed by one or more processors 805.
  • the program 807 includes instructions for executing the control method of the optical transmitter 100 according to any one of the foregoing embodiments.
  • the program 807 includes instructions for performing the following steps:
  • 01 Acquire the projection distance between the light emitter 100 and a target subject in the scene; and 03 when the projection distance is greater than a preset distance, control the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • program 807 further includes instructions for performing the following steps:
  • 011 Acquire a captured image of the scene
  • 012 Process the captured image to determine whether a human face exists in the captured image
  • 013 When a human face exists in the captured image, calculate a first ratio of the face to the captured image
  • 014 Calculate the projection distance according to the first ratio.
  • the present invention also provides a computer-readable storage medium 901.
  • the computer-readable storage medium 901 includes a computer program 902 used in conjunction with the electronic device 800.
  • the computer program 902 can be executed by the processor 805 to complete the method for controlling the optical transmitter 100 according to any one of the foregoing embodiments.
  • the computer program 902 may be executed by the processor 805 to complete the following steps:
  • 01 Acquire the projection distance between the light emitter 100 and a target subject in the scene; and 03 when the projection distance is greater than a preset distance, control the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  • the computer program 902 can also be executed by the processor 805 to complete the following steps:
  • 011 Acquire a captured image of the scene
  • 012 Process the captured image to determine whether a human face exists in the captured image
  • 013 When a human face exists in the captured image, calculate a first ratio of the face to the captured image
  • 014 Calculate the projection distance according to the first ratio.
  • the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" and "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, for example two or three, unless specifically defined otherwise.
  • any process or method description in a flowchart or otherwise described herein may be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process.
  • the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention pertain.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A control method for a light emitter (100), a control device (90), a depth camera (300), an electronic device (800), and a computer-readable storage medium (901). The control method includes: acquiring a projection distance between the light emitter (100) and a target subject in a scene (01); and, when the projection distance is greater than a preset distance, controlling the light emitter (100) to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency (03).

Description

Control method and device, depth camera, electronic device, and readable storage medium
Priority Information
This application claims priority to and the benefit of Chinese Patent Application No. 201810963382.6, filed with the China National Intellectual Property Administration on August 22, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of three-dimensional imaging technology, and in particular to a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
Background
A time-of-flight (TOF) imaging system can calculate the depth information of a measured object from the time difference between the moment the light emitter emits an optical signal and the moment the light receiver receives it. A light emitter typically includes a light source and a diffuser; light from the source is diffused by the diffuser and projected into the scene as uniform surface light.
Summary of the Invention
Embodiments of the present invention provide a control method, a control device, a depth camera, an electronic device, and a computer-readable storage medium.
A control method for a light emitter according to embodiments of the present invention includes: acquiring a projection distance between the light emitter and a target subject in a scene; and, when the projection distance is greater than a preset distance, controlling the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
A control device for a light emitter according to embodiments of the present invention includes a first acquisition module and a control module. The first acquisition module is configured to acquire the projection distance between the light emitter and a target subject in a scene. The control module is configured to, when the projection distance is greater than a preset distance, control the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
A depth camera according to embodiments of the present invention includes a light emitter and a processor. The processor is configured to acquire the projection distance between the light emitter and a target subject in a scene and, when the projection distance is greater than a preset distance, control the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
An electronic device according to embodiments of the present invention includes the above depth camera, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs including instructions for executing the above control method.
A computer-readable storage medium according to embodiments of the present invention includes a computer program used in combination with an electronic device, the computer program being executable by a processor to perform the above control method.
Additional aspects and advantages of the present invention will be given in part in the following description, and in part will become apparent from the following description or be learned through practice of the present invention.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and easy to understand from the following description of embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic perspective view of an electronic device according to some embodiments of the present invention.
FIG. 2 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 3 is a schematic block diagram of a control device for a light emitter according to some embodiments of the present invention.
FIG. 4 is a schematic diagram of the working principle of a depth camera according to some embodiments of the present invention.
FIG. 5 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 6 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present invention.
FIG. 7 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 8 is a schematic block diagram of a first acquisition module of a control device according to some embodiments of the present invention.
FIG. 9 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 10 is a schematic block diagram of a control device according to some embodiments of the present invention.
FIG. 11 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 12 is a schematic block diagram of a second calculation unit of a control device according to some embodiments of the present invention.
FIG. 13 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 14 is a schematic block diagram of a second calculation unit of a control device according to some embodiments of the present invention.
FIG. 15 is a schematic flowchart of a control method for a light emitter according to some embodiments of the present invention.
FIG. 16 is a schematic block diagram of a second calculation unit of a control device according to some embodiments of the present invention.
FIG. 17 is a schematic perspective view of an electronic device according to some embodiments of the present invention.
FIG. 18 is a schematic perspective view of a depth camera according to some embodiments of the present invention.
FIG. 19 is a schematic plan view of a depth camera according to some embodiments of the present invention.
FIG. 20 is a schematic cross-sectional view of the depth camera in FIG. 19 taken along line XX-XX.
FIG. 21 is a schematic structural view of a light emitter according to some embodiments of the present invention.
FIG. 22 and FIG. 23 are schematic structural views of light sources of light emitters according to some embodiments of the present invention.
FIG. 24 is a schematic block diagram of an electronic device according to some embodiments of the present invention.
FIG. 25 is a schematic diagram of the connection between a computer-readable storage medium and an electronic device according to some embodiments of the present invention.
Detailed Description
Embodiments of the present invention are further described below with reference to the accompanying drawings; examples of the embodiments are shown in the drawings, where the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions.
The embodiments described below with reference to the drawings are exemplary and intended to explain the present invention; they should not be construed as limiting the present invention.
Referring to FIG. 1 and FIG. 2 together, the present invention provides a control method for a light emitter 100. The control method includes:
01: acquiring a projection distance between the light emitter 100 and a target subject in a scene; and
03: when the projection distance is greater than a preset distance, controlling the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency. The preset distance may be preset in the light emitter 100 or determined by user input; in one embodiment, the preset distance is, for example, 2.5 m.
Referring to FIG. 2 and FIG. 3 together, the present invention further provides a control device 90 for the light emitter 100. The control method of the embodiments of the present invention can be executed by the control device 90 of the embodiments of the present invention. Specifically, the control device 90 includes a first acquisition module 91 and a control module 93. Step 01 may be implemented by the first acquisition module 91 and step 03 by the control module 93. That is, the first acquisition module 91 may be configured to acquire the projection distance between the light emitter 100 and a target subject in the scene, and the control module 93 may be configured to, when the projection distance is greater than the preset distance, control the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
Referring again to FIG. 2, the present invention further provides a depth camera 300. The depth camera 300 includes a light emitter 100, a light receiver 200, and a processor 805. Steps 01 and 03 may be implemented by the processor 805. That is, the processor 805 may be configured to acquire the projection distance between the light emitter 100 and a target subject in the scene and, when the projection distance is greater than the preset distance, control the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
The depth camera 300 of the embodiments of the present invention can be applied to an electronic device 800. The processor 805 in the depth camera 300 and the processor 805 of the electronic device 800 may be the same processor 805 or two independent processors 805; in specific embodiments of the present invention they are the same processor 805. The electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, smart wristband, smart glasses, or smart helmet), a drone, or the like, without limitation.
Specifically, the depth camera 300 of the embodiments of the present invention is a time-of-flight (TOF) depth camera. A TOF depth camera typically includes one light emitter 100 and one light receiver 200. The light emitter 100 projects laser into the scene, and the light receiver 200 receives the laser reflected back by people or objects in the scene. A TOF depth camera usually acquires depth information either directly or indirectly. In the direct method, the processor 805 calculates the flight time of the laser in the scene from the time point at which the light emitter 100 emits the laser and the time point at which the light receiver 200 receives it, and calculates the depth information of the scene from that flight time. In the indirect method, the light emitter 100 emits laser into the scene, the light receiver 200 collects the reflected laser to obtain a phase difference, and the depth information of the scene is calculated from that phase difference and the emission frequency of the laser. In one embodiment,

d = (1/2)·c·t = (c/2)·(φ/360°)·T = c·φ/(720°·f),

where d is the object distance, c is the speed of light, t is the flight time of the laser, T is the emission period of the laser, φ is the phase difference, and f is the emission frequency of the laser. Note that φ takes values from 0° to 360°: at φ = 0°, d = 0 and the object distance takes its minimum value; at φ = 360°, d = c/(2f) and the object distance takes its maximum value. That is, the distance the depth camera 300 can measure depends on the emission frequency: the higher the emission frequency, the shorter the measurable distance, and the lower the emission frequency, the longer the measurable distance.
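The indirect relation above can be checked numerically. The following Python sketch is not part of the patent text; the function names and the 720° form of the formula (phase given in degrees) are editorial assumptions consistent with the definitions above.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_deg: float, freq_hz: float) -> float:
    """Indirect ToF: d = c * phi / (720 * f), with phi in [0, 360) degrees."""
    return C * phase_deg / (720.0 * freq_hz)

def max_range(freq_hz: float) -> float:
    """Unambiguous measurement range c / (2f) at a given emission frequency:
    higher frequencies give finer precision but a shorter measurable range."""
    return C / (2.0 * freq_hz)

# At 100 MHz the range is ~1.5 m; a 120-degree phase difference reads ~0.5 m.
print(round(max_range(100e6), 2))            # ~1.5
print(round(tof_distance(120.0, 100e6), 2))  # ~0.5
```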
The emission frequency of the light emitter affects the measurement accuracy of the depth information of people and objects in the scene, but current light emitters usually emit light at a fixed emission frequency.
In some embodiments, a target emission frequency of the light emitter 100 may be determined from the projection distance, and the light emitter 100 is then controlled to emit light at that target emission frequency. The projection distance acquired in step 01 is not very accurate; that is, its error relative to the actual distance is generally large. The projection distance and the target emission frequency have a mapping relationship: for example, a specific projection distance value may map to a specific frequency value, or a projection distance range may map to a specific frequency value, with a one-to-one correspondence in each case. The mapping may be determined from calibration data based on extensive experiments before the depth camera 300 leaves the factory, and it satisfies the rule that the target emission frequency decreases as the projection distance increases. For example, within a projection distance of 1.5 m the target emission frequency of the light emitter 100 is 100 MHz; within 2.5 m it is 60 MHz; within 5 m it is 30 MHz; and so on. Thus, as the projection distance increases, the measurement range of the depth camera 300 is extended by reducing the target emission frequency, and because depth information is measured with only one target emission frequency, it can be acquired more quickly.
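As an illustration of the mapping just described, a minimal sketch assuming the example breakpoints from the text (a real device would use a factory-calibrated table):

```python
def target_frequency_mhz(projection_distance_m: float) -> float:
    """Map projection distance to a target emission frequency, following the
    example values 1.5 m -> 100 MHz, 2.5 m -> 60 MHz, 5 m -> 30 MHz."""
    if projection_distance_m <= 1.5:
        return 100.0
    if projection_distance_m <= 2.5:
        return 60.0
    return 30.0
```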
When the projection distance is greater than the distance measurable at the corresponding emission frequency, measurement aliasing occurs: the measurable range of the depth camera 300 repeats. For example, at an emission frequency of 100 MHz the measurable range of the depth camera 300 is 0-1.5 m; when the projection distance is greater than 1.5 m, say 5 m, the phase difference measured by the depth camera 300 is 120°, the same as for 0.5 m, so the depth camera 300 mistakes the 5 m projection distance for 0.5 m. In some embodiments, when the projection distance is greater than the preset distance, the light emitter 100 is controlled to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency. The light receiver 200 receives the reflected laser emitted at the first frequency to obtain a first phase difference, and the reflected laser emitted at the second frequency to obtain a second phase difference. The processor 805 is further configured to: calculate a first distance from the first frequency and the first phase difference; calculate a second distance from the second frequency and the second phase difference; and calculate a calibrated distance from the first distance and the second distance. Specifically, when the projection distance is greater than the preset distance, the projection distance can be measured with a first frequency and a second frequency that are both greater than a third frequency, where the third frequency is the maximum emission frequency corresponding to that projection distance. Referring to FIG. 4, taking a projection distance of 5 m as an example, the third frequency is 30 MHz, and the projection distance can be measured with a first frequency of, for example, 100 MHz and a second frequency of, for example, 60 MHz. At the first frequency (100 MHz) the first phase difference received by the light receiver 200 is 120°, giving a measured first distance of 0.5 m; at the second frequency (60 MHz) the second phase difference is 360°, giving a measured second distance of 2.5 m. The actual projection distance should be both 1.5·k1 + 0.5 and 2.5·k2 + 2.5; setting 1.5·k1 + 0.5 = 2.5·k2 + 2.5 gives 3·k1 = 5·k2 + 4, and solving for the smallest natural numbers k1 and k2 yields the actual projection distance. For k1 = 3 and k2 = 1, the actual projection distance is 1.5×3 + 0.5 = 2.5×1 + 2.5 = 5 m. In this way, the projection distance can be measured accurately with the first and second frequencies, and because both are greater than the third frequency, the measured projection distance is more precise. Note that the higher the emission frequency, the higher the precision of the distance measured by the depth camera 300, and the lower the emission frequency, the lower the precision.
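The search for the smallest natural numbers k1 and k2 can be written directly. A minimal sketch, not part of the patent text; the tolerance and iteration cap are editorial assumptions:

```python
def resolve_distance(d1_m: float, range1_m: float,
                     d2_m: float, range2_m: float,
                     k_max: int = 50, tol: float = 1e-6):
    """Find the smallest distance consistent with both wrapped measurements:
    d = d1 + k1 * range1 = d2 + k2 * range2 for non-negative integers k1, k2."""
    for k1 in range(k_max):
        d = d1_m + k1 * range1_m
        k2, rem = divmod(d - d2_m, range2_m)
        if k2 >= 0 and (rem < tol or range2_m - rem < tol):
            return d
    return None  # no consistent distance within k_max wraps

# Example from the text: 100 MHz wraps at 1.5 m (reads 0.5 m),
# 60 MHz wraps at 2.5 m (reads 2.5 m); the true distance is 5 m.
print(resolve_distance(0.5, 1.5, 2.5, 2.5))  # 5.0
```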
In some embodiments, when the projection distance is less than the preset distance, the emission frequency of the light emitter 100 may be a single frequency, for example 60 MHz or 100 MHz.
Referring to FIG. 5, in some embodiments, step 01 of acquiring the projection distance between the light emitter 100 and a target subject in the scene includes:
011: acquiring a captured image of the scene;
012: processing the captured image to determine whether a human face exists in the captured image;
013: when a human face exists in the captured image, calculating a first ratio of the face to the captured image; and
014: calculating the projection distance according to the first ratio.
Referring to FIG. 6, in some embodiments, the first acquisition module 91 includes a first acquisition unit 911, a processing unit 912, a first calculation unit 913, and a second calculation unit 914. Step 011 may be implemented by the first acquisition unit 911, step 012 by the processing unit 912, step 013 by the first calculation unit 913, and step 014 by the second calculation unit 914. That is, the first acquisition unit 911 may be configured to acquire a captured image of the scene; the processing unit 912 to process the captured image to determine whether a human face exists in it; the first calculation unit 913 to calculate, when a face exists, the first ratio of the face to the captured image; and the second calculation unit 914 to calculate the projection distance according to the first ratio. The first acquisition unit 911 may be an infrared camera (which may be the light receiver 200) or the visible-light camera 400; when the first acquisition unit 911 is an infrared camera, the captured image is an infrared image, and when it is the visible-light camera 400, the captured image is a visible-light image.
Referring again to FIG. 1, in some embodiments, steps 011, 012, 013, and 014 may all be implemented by the processor 805. That is, the processor 805 may be configured to acquire a captured image of the scene, process it to determine whether a human face exists, calculate the first ratio of the face to the captured image when a face exists, and calculate the projection distance according to the first ratio.
Specifically, the processor 805 first identifies whether a human face exists in the captured image based on a face recognition algorithm. When a face exists, the processor 805 extracts the face region and counts its pixels, then divides the pixel count of the face region by the total pixel count of the captured image to obtain the first ratio of the face in the captured image, and finally calculates the projection distance based on the first ratio. Generally, when the first ratio is large, the target subject is close to the depth camera 300, that is, close to the light emitter 100, and the projection distance is small; when the first ratio is small, the target subject is far from the depth camera 300, that is, far from the light emitter 100, and the projection distance is large. The relationship between the projection distance and the first ratio therefore satisfies the rule that the projection distance increases as the first ratio decreases. In one example, when the captured image contains multiple faces, the face with the largest area may be taken as the face region for calculating the first ratio; or the average of the areas of the multiple faces may be used to calculate the first ratio; or the face of the owner of the electronic device 800 may be recognized among the multiple faces and used as the face region. Determining whether to use the first and second frequencies based on the owner's distance from the depth camera 300 improves the accuracy of the depth information acquired for the owner and the user experience.
The first ratio and the projection distance have a mapping relationship: for example, a specific ratio value may map to a specific distance value; or a ratio range may map to a specific distance value; or a ratio range may map to a distance range, with a one-to-one correspondence in each case. Specifically, the mapping between the first ratio and the projection distance may be calibrated in advance. During calibration, the user is guided to stand at several predetermined projection distances from the infrared camera or visible-light camera 400, which captures images in turn. The processor 805 calculates, for each captured image, the calibration ratio of the face to the image, and stores the correspondence between each calibration ratio and its predetermined projection distance; in subsequent use, the projection distance corresponding to the actually measured first ratio is looked up in this mapping. For example, the user is guided to stand at projection distances of 10 cm, 20 cm, 30 cm, and 40 cm while the infrared camera or visible-light camera 400 captures images in turn; the processor 805 calculates the corresponding calibration ratios 80%, 60%, 45%, and 30% from the captured images, and stores the mapping 10 cm-80%, 20 cm-60%, 30 cm-45%, 40 cm-30% as a mapping table in the memory of the electronic device 800 (shown in FIG. 24). In subsequent use, the projection distance corresponding to the first ratio is looked up directly in the mapping table.
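A sketch of such a table lookup follows; the linear interpolation between calibrated entries is an editorial assumption (the text itself describes only direct lookup):

```python
# Calibration pairs from the example: (first ratio, projection distance in cm).
CALIBRATION = [(0.80, 10.0), (0.60, 20.0), (0.45, 30.0), (0.30, 40.0)]

def distance_from_ratio(first_ratio: float) -> float:
    """Look up the projection distance for a measured face ratio,
    interpolating linearly between calibrated entries."""
    pairs = sorted(CALIBRATION, reverse=True)  # ratios descend as distance grows
    if first_ratio >= pairs[0][0]:
        return pairs[0][1]
    if first_ratio <= pairs[-1][0]:
        return pairs[-1][1]
    for (r_hi, d_near), (r_lo, d_far) in zip(pairs, pairs[1:]):
        if r_lo <= first_ratio <= r_hi:
            t = (r_hi - first_ratio) / (r_hi - r_lo)
            return d_near + t * (d_far - d_near)
    return pairs[-1][1]

print(distance_from_ratio(0.52))  # between 20 cm and 30 cm
```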
Alternatively, the projection distance and the first ratio may be calibrated in advance at a single point. During calibration, the user is guided to stand at one predetermined projection distance from the infrared camera or visible-light camera 400, which captures an image. The processor 805 calculates the calibration ratio of the face to the captured image and stores the correspondence between this calibration ratio and the predetermined projection distance; in subsequent use, the projection distance is calculated from this correspondence. For example, the user is guided to stand at a projection distance of 30 cm, an image is captured, and the processor 805 calculates that the face occupies 45% of the image. In actual measurement, when the calculated first ratio is R, the properties of similar triangles give

D = (45% / R) × 30 cm,

where D is the actual projection distance calculated from the actually measured first ratio R.
In this way, the first ratio of the face in the captured image reflects the projection distance between the target subject and the light emitter 100 fairly objectively.
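A minimal sketch of the single-point similar-triangle calibration above, assuming the example values of 45% at 30 cm:

```python
def distance_from_single_calibration(first_ratio: float,
                                     calib_ratio: float = 0.45,
                                     calib_distance_cm: float = 30.0) -> float:
    """Similar triangles: ratio * distance is roughly constant, so
    D = calib_distance * calib_ratio / measured_ratio."""
    return calib_distance_cm * calib_ratio / first_ratio

print(distance_from_single_calibration(0.45))  # 30.0 cm
print(distance_from_single_calibration(0.90))  # 15.0 cm (face closer)
```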
Referring to FIG. 7, in some embodiments, step 01 of acquiring the projection distance between the light emitter 100 and a target subject in the scene includes:
015: controlling the light emitter 100 to emit light at a predetermined emission frequency to detect initial depth information of the scene; and
016: calculating the projection distance between the light emitter 100 and the target subject according to the initial depth information.
Referring to FIG. 8, in some embodiments, the first acquisition module 91 includes a first control unit 915 and a third calculation unit 916. Step 015 may be implemented by the first control unit 915 and step 016 by the third calculation unit 916. That is, the first control unit 915 may be configured to control the light emitter 100 to emit light at a predetermined emission frequency to detect initial depth information of the scene, and the third calculation unit 916 to calculate the projection distance between the light emitter 100 and the target subject from the initial depth information.
Referring again to FIG. 1, in some embodiments, steps 015 and 016 may both be implemented by the processor 805. That is, the processor 805 may further be configured to control the light emitter 100 to emit light at a predetermined emission frequency to detect initial depth information of the scene, and to calculate the projection distance between the light emitter 100 and the target subject from the initial depth information.
Specifically, the processor 805 controls the light emitter 100 to emit laser at the predetermined emission frequency, the light receiver 200 receives the laser reflected back by people or objects in the scene, and the processor 805 calculates the initial depth information of the scene from the reception result of the light receiver 200. The predetermined emission frequency is less than a preset threshold; that is, when acquiring the initial depth information of the scene, the light emitter 100 emits at a relatively low frequency. On the one hand, a lower emission frequency reduces the power consumption of the electronic device 800. On the other hand, at this point the projection distance between the target subject and the depth camera 300 is unknown, as is whether the target subject is a user; if the laser were emitted directly at a high frequency while the target subject is a user close to the depth camera 300, the high-frequency emission could harm the user's eyes, whereas emitting at a lower frequency avoids this safety risk.
After calculating the initial depth information of the scene, the processor 805 further determines the target subject in the scene so as to determine the target subject's initial depth information. Specifically, the target subject is generally located in the central region of the field of view of the light receiver 200, so the central region of the field of view can be taken as the region where the target subject is located, and the initial depth information of the pixels in the central region taken as the initial depth information of the target subject. Generally, the target subject has multiple initial depth values; the processor 805 may calculate their mean or median and use the mean or median as the projection distance between the light emitter 100 and the target subject. In this way, the projection distance between the target subject and the light emitter 100 is calculated, the emission frequency of the light emitter 100 is then determined from the projection distance, and the light emitter 100 emits at that frequency, improving the accuracy of the acquired depth information of the target subject.
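A sketch of taking the central region's depth as the projection distance; the region size (roi_fraction) and the choice of the median rather than the mean are editorial assumptions within what the text allows:

```python
import numpy as np

def projection_distance_from_depth(depth_map: np.ndarray,
                                   roi_fraction: float = 0.3) -> float:
    """Take the central region of the field of view as the target subject
    and use the median of its initial depth values as the projection distance."""
    h, w = depth_map.shape
    dh, dw = int(h * roi_fraction / 2), int(w * roi_fraction / 2)
    center = depth_map[h // 2 - dh : h // 2 + dh, w // 2 - dw : w // 2 + dw]
    valid = center[center > 0]  # drop pixels with no depth return
    return float(np.median(valid))
```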
In some embodiments, after step 012 of processing the captured image to determine whether a human face exists, if no face exists in the captured image, the processor 805 may further execute steps 015 and 016 to determine the projection distance between the target subject and the light emitter 100. In this way, the projection distance between the target subject and the light emitter 100 can be determined even when no face exists in the captured image.
In some embodiments, after step 015 of controlling the light emitter 100 to emit at the predetermined emission frequency to detect the initial depth information of the scene, the processor 805 may control the infrared camera (which may be the light receiver 200) or the visible-light camera 400 to capture an image. Assuming the image is captured by the visible-light camera 400: generally, to capture three-dimensional color images of people or to build three-dimensional models of scenes, the fields of view of the visible-light camera 400 and the light receiver 200 in the electronic device 800 overlap to a large extent, and before the electronic device 800 leaves the factory the manufacturer calibrates the relative position of the visible-light camera 400 and the light receiver 200 and obtains calibration parameters for subsequently matching the color information of visible-light images with the depth information of depth images. Therefore, after acquiring the captured image, the processor 805 may first identify whether a face exists in it; when a face exists, the processor 805 finds the initial depth information corresponding to the face from the matching relationship between the captured image and the initial depth image formed from the initial depth information, and uses the face's initial depth information as the depth information of the target subject. If no face exists in the captured image, the initial depth information of the pixels in the central region is used as the initial depth information of the target subject. In this way, when a user is present in the scene, the projection distance between the user and the depth camera 300 can be measured more accurately.
Referring to FIG. 9, in some embodiments, after step 01 the control method further includes:
04: acquiring the ambient brightness of the scene;
05: calculating a target emission power of the light emitter 100 according to the ambient brightness and the projection distance; and
06: controlling the light emitter 100 to emit light at the target emission power.
Referring to FIG. 10, in some embodiments, the control device 90 further includes a second acquisition module 94 and a calculation module 95. Step 04 may be implemented by the second acquisition module 94, step 05 by the calculation module 95, and step 06 by the control module 93. That is, the second acquisition module 94 may be configured to acquire the ambient brightness of the scene; the calculation module 95 to calculate the target emission power of the light emitter 100 from the ambient brightness and the projection distance; and the control module 93 to further control the light emitter 100 to emit at the target emission power.
Referring again to FIG. 1, in some embodiments, steps 04, 05, and 06 may all be implemented by the processor 805. That is, the processor 805 may be configured to acquire the ambient brightness of the scene, calculate the target emission power of the light emitter 100 from the ambient brightness and the projection distance, and control the light emitter 100 to emit at the target emission power.
Step 06 and step 03 may be executed synchronously; in this case, in addition to controlling the emission frequency of the light emitter 100, the processor 805 also controls the light emitter 100 to emit at the target emission power.
Specifically, the ambient brightness may be detected by a light sensor, from which the processor 805 reads the detected value. Alternatively, the ambient brightness may be detected by the infrared camera (which may be the light receiver 200) or the visible-light camera 400: the camera captures an image of the current scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
After determining the ambient brightness and the projection distance, the processor 805 calculates the target emission power from both parameters together. It can be understood that, first, when the ambient brightness is high, ambient light contains a larger infrared component, and more of it overlaps the band of the infrared laser emitted by the light emitter 100. The light receiver 200 then receives both the infrared laser from the light emitter 100 and the infrared component of ambient light. If the light emitter 100 emits the infrared laser at low power, the proportions of infrared laser from the light emitter 100 and infrared light from the ambient environment in the light received by the light receiver 200 are similar, which makes the reception time point, or the obtained phase difference, inaccurate and in turn reduces the accuracy of the acquired depth information. The emission power of the infrared laser therefore needs to be raised to reduce the influence of ambient infrared light on the reception of the infrared laser from the light emitter 100. When the ambient brightness is low, ambient light contains little infrared light, and emitting at high power would merely increase the power consumption of the electronic device 800. In addition, when the projection distance is large, the flight time and flight path of the laser are longer and its attenuation greater, further degrading the accuracy of the obtained phase difference and thus of the depth information. Therefore, when the projection distance is large, the emission power of the infrared laser can be raised appropriately.
Specifically, when the ambient brightness is higher than a preset brightness and the projection distance is greater than a predetermined distance, the target emission power of the light emitter 100 is greater than or equal to a first predetermined power P1. When the ambient brightness is lower than the preset brightness and the projection distance is smaller than the predetermined distance, the target emission power of the light emitter 100 is less than or equal to a second predetermined power P2, where P1 is greater than P2. When the ambient brightness is greater than the preset brightness and the projection distance is smaller than the predetermined distance, or the ambient brightness is smaller than the preset brightness and the projection distance is greater than the predetermined distance, the target emission power of the light emitter 100 lies between P2 and P1, that is, within the range (P2, P1).
In this way, determining the target emission power of the light emitter 100 from both the ambient brightness and the projection distance reduces the power consumption of the electronic device 800 on the one hand, and improves the accuracy of the acquired depth information of the scene on the other.
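A sketch of the power rule above. The patent fixes only the boundary behavior (at least P1, at most P2, or within (P2, P1)); the exact values returned, including the midpoint used in the mixed cases, are editorial assumptions:

```python
def target_emission_power(ambient: float, distance: float,
                          preset_brightness: float, preset_distance: float,
                          p1: float, p2: float) -> float:
    """Combine ambient brightness and projection distance into a target
    emission power: >= P1 when both are high, <= P2 when both are low,
    and between P2 and P1 in the mixed cases (p1 > p2)."""
    bright = ambient > preset_brightness
    far = distance > preset_distance
    if bright and far:
        return p1              # minimum compliant value of the ">= P1" case
    if not bright and not far:
        return p2              # maximum compliant value of the "<= P2" case
    return (p1 + p2) / 2.0     # mixed case: an assumed midpoint within (P2, P1)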
Referring to FIG. 11, in some embodiments, step 014 of calculating the projection distance according to the first ratio includes:
0141: calculating a second ratio of a preset feature region of the face to the face in the captured image; and
0142: calculating the projection distance according to the first ratio and the second ratio.
Referring to FIG. 12, in some embodiments, the second calculation unit 914 includes a first calculation subunit 9141 and a second calculation subunit 9142. Step 0141 may be implemented by the first calculation subunit 9141 and step 0142 by the second calculation subunit 9142. That is, the first calculation subunit 9141 may be configured to calculate the second ratio of the preset feature region of the face to the face in the captured image, and the second calculation subunit 9142 to calculate the projection distance from the first ratio and the second ratio.
Referring again to FIG. 1, in some embodiments, steps 0141 and 0142 may both be implemented by the processor 805. That is, the processor 805 may be configured to calculate the second ratio of the preset feature region of the face to the face in the captured image, and to calculate the projection distance from the first ratio and the second ratio.
It can be understood that face sizes differ between users, so when different users stand at the same distance, the first ratio of the face in the captured image differs. The second ratio is the ratio of a preset feature of the face to the face; the preset feature region may be chosen as one that varies little between individuals, for example the distance between the user's eyes. When the second ratio is large, the user's face is small and the projection distance calculated from the first ratio alone is too large; when the second ratio is small, the user's face is large and the projection distance calculated from the first ratio alone is too small. In actual use, the first ratio, the second ratio, and the projection distance may be calibrated in advance. Specifically, the user is guided to stand at a predetermined projection distance, an image is captured, the first calibration ratio and second calibration ratio of that image are calculated, and the correspondence between the predetermined projection distance and the two calibration ratios is stored, so that in subsequent use the projection distance can be calculated from the actual first and second ratios. For example, the user is guided to stand at a projection distance of 25 cm, an image is captured, and the first calibration ratio is calculated as 50% and the second calibration ratio as 10%. In actual measurement, when the calculated first ratio is R1 and the second ratio is R2, the properties of similar triangles give

D1 = (50% / R1) × 25 cm,

where D1 is the initial projection distance calculated from the actually measured first ratio R1. The calibrated projection distance D2, which further accounts for the actually measured second ratio R2, can then be obtained from the relation

D2 = D1 × (10% / R2),

and D2 is taken as the final projection distance. In this way, the projection distance calculated from the first and second ratios takes individual differences between users into account, giving a more objective projection distance, and a more accurate emission frequency and target emission power can then be determined from the more accurate projection distance.
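A sketch combining the two steps, using the example calibration values (first ratio 50% at 25 cm, second ratio 10%); the exact form of the correction factor is reconstructed from the surrounding description and should be read as an assumption:

```python
def calibrated_distance(r1: float, r2: float,
                        calib_r1: float = 0.50, calib_d_cm: float = 25.0,
                        calib_r2: float = 0.10) -> float:
    """Two-step estimate: D1 from the face ratio (similar triangles), then
    scaled by the feature ratio to correct for individual face size."""
    d1 = calib_d_cm * calib_r1 / r1  # initial distance from the first ratio
    return d1 * calib_r2 / r2        # a smaller-than-average face (large r2)
                                     # shrinks an overestimated d1

print(calibrated_distance(0.25, 0.20))  # a half-size face at 25 cm -> 25.0
```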
Referring to FIG. 13, in some embodiments, step 014 of calculating the projection distance according to the first ratio includes:
0143: determining from the captured image whether the target subject wears glasses; and
0144: when the target subject wears glasses, calculating the projection distance according to the first ratio and a distance coefficient.
Referring to FIG. 14, in some embodiments, the second calculation unit 914 further includes a first determination subunit 9143 and a third calculation subunit 9144. Step 0143 may be implemented by the first determination subunit 9143 and step 0144 by the third calculation subunit 9144. That is, the first determination subunit 9143 may be configured to determine from the captured image whether the target subject wears glasses, and the third calculation subunit 9144 to calculate the projection distance from the first ratio and the distance coefficient when the target subject wears glasses.
Referring again to FIG. 1, in some embodiments, steps 0143 and 0144 may both be implemented by the processor 805. That is, the processor 805 may further be configured to determine from the captured image whether the target subject wears glasses and, when the target subject wears glasses, calculate the projection distance from the first ratio and the distance coefficient.
It can be understood that whether a user wears glasses can characterize the health of the user's eyes: a user who wears glasses has an eye condition or poor eyesight, so when the light emitter 100 emits laser toward a user wearing glasses, its emission power needs to be lowered so that the energy of the emitted laser is small enough not to harm the user's eyes. The preset distance coefficient may be a coefficient between 0 and 1, for example 0.6, 0.78, 0.82, or 0.95. For example, after the initial projection distance is calculated from the first ratio, or the calibrated projection distance is calculated from the first and second ratios, the initial or calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance, and the target emission power is determined from this projection distance and the ambient brightness. In this way, users with eye conditions or poor eyesight are protected from laser emitted at excessive power.
Referring to FIG. 15, in some embodiments, step 014 of calculating the projection distance according to the first ratio includes:
0145: determining the age of the target subject from the captured image; and
0146: calculating the projection distance according to the first ratio and the age.
Referring to FIG. 16, in some embodiments, the second calculation unit 914 further includes a second determination subunit 9145 and a fourth calculation subunit 9146. Step 0145 may be implemented by the second determination subunit 9145 and step 0146 by the fourth calculation subunit 9146. That is, the second determination subunit 9145 may be configured to determine the age of the target subject from the captured image, and the fourth calculation subunit 9146 to calculate the projection distance from the first ratio and the age.
Referring again to FIG. 1, in some embodiments, steps 0145 and 0146 may both be implemented by the processor 805. That is, the processor 805 may further be configured to determine the age of the target subject from the captured image and to calculate the projection distance from the first ratio and the age.
People of different ages tolerate infrared laser differently; for example, children and the elderly are more easily burned by laser, and a laser intensity suitable for an adult may harm a child. In this embodiment, the number, distribution, and area of wrinkle feature points in the face in the captured image may be extracted to estimate the user's age, for example from the number of wrinkles at the corners of the eyes, possibly combined with the number of wrinkles on the forehead. After the user's age is determined, a scale coefficient can be obtained from it, specifically by querying a lookup table of age-to-coefficient correspondences: below 15 years the coefficient is 0.6; from 15 to 20 years, 0.8; from 20 to 45 years, 1.0; above 45 years, 0.8. Once the coefficient is known, the initial projection distance calculated from the first ratio, or the calibrated projection distance calculated from the first and second ratios, is multiplied by the coefficient to obtain the final projection distance, and the target emission power is then determined from this projection distance and the ambient brightness. In this way, young and elderly users are protected from laser emitted at excessive power.
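A sketch of applying the age and glasses coefficients to the estimated distance; the age boundaries follow the example values in the text, while the default glasses coefficient is an assumed value within the stated 0-1 range:

```python
# (upper age bound, coefficient) from the example lookup table.
AGE_COEFFICIENTS = [(15, 0.6), (20, 0.8), (45, 1.0), (200, 0.8)]

def scale_for_age(age: int) -> float:
    """Example mapping: <15 -> 0.6, 15-20 -> 0.8, 20-45 -> 1.0, >45 -> 0.8."""
    for upper, coeff in AGE_COEFFICIENTS:
        if age < upper:
            return coeff
    return AGE_COEFFICIENTS[-1][1]

def adjusted_distance(distance_cm: float, age: int,
                      wears_glasses: bool, glasses_coeff: float = 0.8) -> float:
    """Scale the estimated distance before choosing the emission power, so
    a lower power is chosen for children, older users, or glasses wearers."""
    d = distance_cm * scale_for_age(age)
    if wears_glasses:
        d *= glasses_coeff  # distance coefficient in (0, 1), e.g. 0.6-0.95
    return d
```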
Referring to FIG. 1 and FIG. 17 together, in some embodiments, the electronic device 800 of the embodiments of the present invention further includes a housing 801. The housing 801 can serve as a mounting carrier for the functional elements of the electronic device 800 and can provide protection against dust, drops, and water for functional elements such as the display screen 802, the visible-light camera 400, and the receiver. In embodiments of the present invention, the housing 801 includes a main body 803 and a movable bracket 804; driven by a driving device, the movable bracket 804 can move relative to the main body 803, for example sliding to retract into the main body 803 (as shown in FIG. 17) or to extend out of it (as shown in FIG. 1). Some functional elements (for example the display screen 802) may be mounted on the main body 803, while others (for example the depth camera 300, the visible-light camera 400, and the receiver) may be mounted on the movable bracket 804, whose movement retracts them into or extends them out of the main body 803. Of course, FIG. 1 and FIG. 17 merely illustrate one specific form of the housing 801 and should not be understood as limiting the housing 801 of the present invention.
The depth camera 300 is mounted on the housing 801. Specifically, the housing 801 may be provided with an acquisition window, and the depth camera 300 is mounted in alignment with the acquisition window so that it can acquire depth information. In specific embodiments of the present invention, the depth camera 300 is mounted on the movable bracket 804. When the user needs the depth camera 300, the movable bracket 804 can be triggered to slide out of the main body 803, carrying the depth camera 300 out with it; when the depth camera 300 is not needed, the movable bracket 804 can be triggered to slide into the main body 803, retracting the depth camera 300 into the main body.
Referring to FIG. 18 to FIG. 20 together, in some embodiments, in addition to the light emitter 100 and the light receiver 200, the depth camera 300 includes a first substrate assembly 71 and a spacer 72. The first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other. The spacer 72 is disposed on the first substrate 711. The light emitter 100, which projects laser outward, is disposed on the spacer 72. The flexible circuit board 712 is bent, with one end connected to the first substrate 711 and the other end connected to the light emitter 100. The light receiver 200 is disposed on the first substrate 711 and receives the laser reflected back by people or objects in the target space. The light receiver 200 includes a casing 741 and an optical element 742 disposed on the casing 741; the casing 741 is connected with the spacer 72 into one piece.
Specifically, the first substrate assembly 71 includes the first substrate 711 and the flexible circuit board 712. The first substrate 711 may be a printed circuit board or a flexible circuit board, and the control circuits of the depth camera 300 may be laid on it. One end of the flexible circuit board 712 may be connected to the first substrate 711 and the other end to the circuit board 50 (shown in FIG. 20). The flexible circuit board 712 can be bent at an angle, giving more freedom in the relative positions of the components connected at its two ends.
The spacer 72 is disposed on the first substrate 711. In one example, the spacer 72 contacts and is carried on the first substrate 711; specifically, the spacer 72 may be bonded to the first substrate 711 by glue or the like. The material of the spacer 72 may be metal, plastic, or the like. In embodiments of the present invention, the surface of the spacer 72 bonded to the first substrate 711 may be flat, as may the opposite surface, so that the light emitter 100 sits stably on the spacer 72.
The light receiver 200 is disposed on the first substrate 711, and the contact surface of the light receiver 200 with the first substrate 711 is substantially flush with that of the spacer 72 (that is, their mounting start points lie in the same plane). Specifically, the light receiver 200 includes a casing 741 and an optical element 742. The casing 741 is disposed on the first substrate 711 and the optical element 742 on the casing 741; the casing 741 may be the lens holder and lens barrel of the light receiver 200, and the optical element 742 may be a lens or similar element disposed in the casing 741. Further, the light receiver 200 includes a photosensitive chip (not shown); the laser reflected back by people or objects in the target space passes through the optical element 742 and strikes the photosensitive chip, which responds to the laser. In embodiments of the present invention, the casing 741 is connected with the spacer 72 into one piece. Specifically, the casing 741 and the spacer 72 may be integrally formed; or, if their materials differ, they may be integrally formed by two-color injection molding or the like. They may also be formed separately as a mating pair: when assembling the depth camera 300, one of the casing 741 and the spacer 72 may first be disposed on the first substrate 711, and the other then disposed on the first substrate 711 and joined into one piece.
In this way, disposing the light emitter 100 on the spacer 72 raises the height of the light emitter 100 and hence of its laser-emitting surface, so the laser emitted by the light emitter 100 is not easily blocked by the light receiver 200 and can fully illuminate the measured object in the target space.
Referring again to FIG. 18 to FIG. 20, in some embodiments, the side of the spacer 72 bonded to the first substrate 711 is provided with an accommodation cavity 723. The depth camera 300 further includes electronic components 77 disposed on the first substrate 711 and housed in the accommodation cavity 723. The electronic components 77 may be capacitors, inductors, transistors, resistors, and the like; they may be electrically connected to the control circuits laid on the first substrate 711 and used to drive or control the light emitter 100 or the light receiver 200. Housing the electronic components 77 in the accommodation cavity 723 makes reasonable use of the space inside the spacer 72: the width of the first substrate 711 need not be increased to accommodate the electronic components 77, which helps reduce the overall size of the depth camera 300. There may be one or more accommodation cavities 723, and they may be spaced apart from each other; when mounting the spacer 72, the cavities 723 can be aligned with the positions of the electronic components 77 before the spacer 72 is placed on the first substrate 711.
Referring still to FIG. 18 to FIG. 20, in some embodiments, the spacer 72 is provided with an avoidance through-hole 724 connected with at least one accommodation cavity 723, and at least one electronic component 77 extends into the avoidance through-hole 724. It can be understood that an electronic component 77 housed entirely within the cavity must be no taller than the accommodation cavity 723; for components taller than the cavity 723, an avoidance through-hole 724 corresponding to the cavity 723 can be provided, into which the component 77 can partly extend, so that the component is accommodated without raising the height of the spacer 72.
Referring also to FIG. 18 to FIG. 20, in some embodiments, the first substrate assembly 71 further includes a reinforcing plate 713 bonded to the side of the first substrate 711 opposite the spacer 72. The reinforcing plate 713 may cover one side of the first substrate 711 to increase its strength and prevent the first substrate 711 from deforming. In addition, the reinforcing plate 713 may be made of a conductive material such as metal or alloy; when the depth camera 300 is mounted on the electronic device 800, the reinforcing plate 713 may be electrically connected to the housing 801 so that it is grounded, effectively reducing interference from static electricity of external components on the depth camera 300.
Referring again to FIG. 18 to FIG. 20, in some embodiments, the depth camera 300 further includes a connector 76 connected to the first substrate assembly 71 and used for electrical connection with electronic components outside the depth camera 300.
Referring to FIG. 21, in some embodiments, the light emitter 100 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, a circuit board 50, and a driver 61.
The lens barrel 30 includes an annular barrel sidewall 33 that encloses an accommodation cavity 62. The barrel sidewall 33 has an inner surface 331 facing the accommodation cavity 62 and an outer surface 332 opposite the inner surface, and it includes opposite first and second surfaces 31 and 32; the accommodation cavity 62 passes through both the first surface 31 and the second surface 32. The first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the accommodation cavity 62, and the bottom surface 35 of the mounting groove 34 is located on the side of the groove away from the first surface 31. The outer surface 332 of the barrel sidewall 33 has a circular cross-section at the end near the first surface 31 and is provided there with an external thread.
The circuit board 50 is disposed on the second surface 32 of the lens barrel 30 and closes one end of the accommodation cavity 62. The circuit board 50 may be a flexible circuit board or a printed circuit board.
The light source 10 is carried on the circuit board 50 and housed in the accommodation cavity 62. The light source 10 is configured to emit laser toward the first surface 31 (mounting groove 34) side of the lens barrel 30. It may be a single-point light source or a multi-point light source. When the light source 10 is a single-point light source, it may specifically be an edge-emitting laser, for example a distributed feedback laser (Distributed Feedback Laser, DFB); when the light source 10 is a multi-point light source, it may specifically be a vertical-cavity surface-emitting laser (Vertical-Cavity Surface Emitting Laser, VCSEL), or a multi-point light source composed of multiple edge-emitting lasers. The VCSEL has a small height; using it as the light source 10 helps reduce the height of the light emitter 100 and makes it easier to integrate the light emitter 100 into mobile phones and other electronic devices 800 with strict requirements on body thickness. Compared with the VCSEL, the edge-emitting laser has a smaller temperature drift, which reduces the influence of temperature on the laser projected by the light source 10.
The driver 61 is carried on the circuit board 50 and electrically connected to the light source 10. Specifically, the driver 61 may receive a modulated input signal, convert it into a constant current source, and supply it to the light source 10, so that the light source 10 emits laser toward the first surface 31 side of the lens barrel 30 under the action of the constant current source. The driver 61 of this embodiment is disposed outside the lens barrel 30; in other embodiments, the driver 61 may be disposed inside the lens barrel 30 and carried on the circuit board 50.
The diffuser 20 is mounted (carried) in the mounting groove 34 and abuts against it. The diffuser 20 is used to diffuse the laser passing through it; that is, when the light source 10 emits laser toward the first surface 31 side of the lens barrel 30, the laser passes through the diffuser 20 and is diffused or projected out of the lens barrel 30 by the diffuser 20.
The protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41. A light-passing hole 401 is opened in the center of the top wall 41, and the protective sidewall 42 surrounds the top wall 41 and the light-passing hole 401. The top wall 41 and the protective sidewall 42 together enclose a mounting cavity 43, with which the light-passing hole 401 communicates. The inner surface of the protective sidewall 42 has a circular cross-section and is provided with an internal thread, which is screwed onto the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30. The top wall 41 abuts the diffuser 20 so that the diffuser 20 is clamped between the top wall 41 and the bottom surface 35 of the mounting groove 34.
In this way, by opening the mounting groove 34 in the lens barrel 30, installing the diffuser 20 in the mounting groove 34, and mounting the protective cover 40 on the lens barrel 30 so as to clamp the diffuser 20 between the protective cover 40 and the bottom surface 35 of the mounting groove 34, the diffuser 20 can be fixed to the lens barrel 30. This approach requires no glue to fix the diffuser 20 to the lens barrel 30; it avoids glue volatilizing into a gaseous state and solidifying on the surface of the diffuser 20, where it would affect the diffuser's microstructure, and it prevents the diffuser 20 from falling off the lens barrel 30 when the adhesion of aging glue between the diffuser 20 and the lens barrel 30 decreases.
Referring to FIG. 22 and FIG. 23 together, in some embodiments, the emission power of the light emitter 100 can be adjusted by adjusting the drive current that drives the light emitter 100 to emit light. In addition, if the light source 10 of the light emitter 100 is a vertical-cavity surface-emitting laser, its structure may be one of the following:
(1) The VCSEL includes a plurality of point light sources 101 that form a plurality of independently controllable fan-shaped arrays 11, and the fan-shaped arrays 11 enclose a circle (as shown in FIG. 22) or a polygon (not shown). The emission power of the light emitter 100 can then be set by turning on the point light sources 101 of different numbers of fan-shaped arrays 11; that is, the target emission power corresponds to a target number of turned-on fan-shaped arrays. When not all fan-shaped arrays are turned on, the turned-on arrays should be distributed centrally symmetrically, which makes the laser emitted by the light emitter 100 more uniform.
(2) The VCSEL includes a plurality of point light sources 101 that form a plurality of sub-arrays 12; the sub-arrays 12 include at least one circular sub-array and at least one annular sub-array that together enclose a circle (as shown in FIG. 23), or at least one polygonal sub-array and at least one annular sub-array that together enclose a polygon (not shown). The emission power of the light emitter 100 can then be adjusted by turning on the point light sources 101 of different numbers of sub-arrays 12; that is, the emission power corresponds to a target number of turned-on sub-arrays 12 (see the code sketch below).
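A sketch of selecting arrays for a target power under a proportional-power assumption (the patent states only the correspondence between the target power and the number of turned-on arrays); the pairwise selection keeps the pattern centrally symmetric, as required for the fan-shaped case:

```python
def pairs_to_enable(target_power: float, full_power: float, total: int) -> int:
    """Proportional model (an assumption): emitted power scales with the
    number of point light sources that are on. Arrays are chosen in
    diametrically opposite pairs, so total is assumed even."""
    n_pairs = round((total // 2) * target_power / full_power)
    return max(1, min(total // 2, n_pairs))

def symmetric_fan_indices(n_pairs: int, total: int) -> list[int]:
    """Turn on n_pairs diametrically opposite pairs of fan-shaped arrays so
    that the turned-on pattern stays centrally symmetric."""
    half = total // 2
    step = half / n_pairs
    picked = {int(i * step) % half for i in range(n_pairs)}
    return sorted(picked | {i + half for i in picked})

# Half power on 8 fan-shaped arrays: two opposite pairs are turned on.
print(symmetric_fan_indices(pairs_to_enable(0.5, 1.0, 8), 8))  # [0, 2, 4, 6]
```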
Referring to FIG. 24, the present invention further provides an electronic device 800. The electronic device 800 includes the depth camera 300 according to any one of the foregoing embodiments, one or more processors 805, a memory 806, and one or more programs 807, wherein the one or more programs 807 are stored in the memory 806 and configured to be executed by the one or more processors 805. The programs 807 include instructions for executing the control method of the light emitter 100 according to any one of the foregoing embodiments.
For example, in conjunction with FIG. 1, FIG. 2, and FIG. 24, the programs 807 include instructions for performing the following steps:
01: acquiring the projection distance between the light emitter 100 and a target subject in the scene; and
03: when the projection distance is greater than the preset distance, controlling the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
For another example, in conjunction with FIG. 5 and FIG. 24, the programs 807 further include instructions for performing the following steps:
011: acquiring a captured image of the scene;
012: processing the captured image to determine whether a human face exists in the captured image;
013: when a human face exists in the captured image, calculating the first ratio of the face to the captured image; and
014: calculating the projection distance according to the first ratio.
Referring to FIG. 25, the present invention further provides a computer-readable storage medium 901. The computer-readable storage medium 901 includes a computer program 902 used in combination with the electronic device 800. The computer program 902 can be executed by the processor 805 to perform the control method of the light emitter 100 according to any one of the foregoing embodiments.
For example, in conjunction with FIG. 1, FIG. 2, and FIG. 25, the computer program 902 can be executed by the processor 805 to perform the following steps:
01: acquiring the projection distance between the light emitter 100 and a target subject in the scene; and
03: when the projection distance is greater than the preset distance, controlling the light emitter 100 to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
For another example, in conjunction with FIG. 5 and FIG. 25, the computer program 902 can also be executed by the processor 805 to perform the following steps:
011: acquiring a captured image of the scene;
012: processing the captured image to determine whether a human face exists in the captured image;
013: when a human face exists in the captured image, calculating the first ratio of the face to the captured image; and
014: calculating the projection distance according to the first ratio.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic references to these terms do not necessarily refer to the same embodiment or example, and the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. Furthermore, where no contradiction arises, those skilled in the art may combine the different embodiments or examples described in this specification and the features of those embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" and "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, for example two or three, unless specifically defined otherwise.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention pertain.
Although embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting the present invention; within the scope of the present invention, a person of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments.

Claims (20)

  1. A control method for a light emitter, wherein the control method comprises:
    acquiring a projection distance between the light emitter and a target subject in a scene; and
    when the projection distance is greater than a preset distance, controlling the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  2. The control method according to claim 1, wherein the step of acquiring the projection distance between the light emitter and the target subject in the scene comprises:
    controlling the light emitter to emit light at a predetermined emission frequency to detect initial depth information of the scene; and
    calculating the projection distance between the light emitter and the target subject according to the initial depth information.
  3. The control method according to claim 1, wherein the step of acquiring the projection distance between the light emitter and the target subject in the scene comprises:
    acquiring a captured image of the scene;
    processing the captured image to determine whether a human face exists in the captured image;
    when the human face exists in the captured image, calculating a first ratio of the human face to the captured image; and
    calculating the projection distance according to the first ratio.
  4. The control method according to claim 3, wherein the control method further comprises:
    acquiring an ambient brightness of the scene;
    calculating a target emission power of the light emitter according to the ambient brightness and the projection distance; and
    controlling the light emitter to emit light at the target emission power.
  5. The control method according to claim 4, wherein the step of calculating the projection distance according to the first ratio comprises:
    calculating a second ratio of a preset feature region of the human face to the human face in the captured image; and
    calculating the projection distance according to the first ratio and the second ratio.
  6. The control method according to claim 4, wherein calculating the projection distance according to the first ratio comprises:
    determining, from the captured image, whether the target subject wears glasses; and
    when the target subject wears glasses, calculating the projection distance according to the first ratio and a distance coefficient.
  7. The control method according to claim 4, wherein the step of calculating the projection distance according to the first ratio comprises:
    determining an age of the target subject from the captured image; and
    calculating the projection distance according to the first ratio and the age.
  8. A control device for a light emitter, wherein the control device comprises:
    a first acquisition module configured to acquire a projection distance between the light emitter and a target subject in a scene; and
    a control module configured to, when the projection distance is greater than a preset distance, control the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  9. A depth camera, wherein the depth camera comprises a light emitter and a processor; the processor is configured to:
    acquire a projection distance between the light emitter and a target subject in a scene; and
    when the projection distance is greater than a preset distance, control the light emitter to emit light at a first frequency and then at a second frequency, the first frequency being different from the second frequency.
  10. The depth camera according to claim 9, wherein the processor is further configured to:
    control the light emitter to emit light at a predetermined emission frequency to detect initial depth information of the scene; and
    calculate the projection distance between the light emitter and the target subject according to the initial depth information.
  11. The depth camera according to claim 9, wherein the processor is further configured to:
    acquire a captured image of the scene;
    process the captured image to determine whether a human face exists in the captured image;
    when the human face exists in the captured image, calculate a first ratio of the human face to the captured image; and
    calculate the projection distance according to the first ratio.
  12. The depth camera according to claim 11, wherein the processor is further configured to:
    acquire an ambient brightness of the scene;
    calculate a target emission power of the light emitter according to the ambient brightness and the projection distance; and
    control the light emitter to emit light at the target emission power.
  13. The depth camera according to claim 12, wherein the processor is further configured to:
    calculate a second ratio of a preset feature region of the human face to the human face in the captured image; and
    calculate the projection distance according to the first ratio and the second ratio.
  14. The depth camera according to claim 12, wherein the processor is further configured to:
    determine, from the captured image, whether the target subject wears glasses; and
    when the target subject wears glasses, calculate the projection distance according to the first ratio and a distance coefficient.
  15. The depth camera according to claim 12, wherein the processor is further configured to:
    determine an age of the target subject from the captured image; and
    calculate the projection distance according to the first ratio and the age.
  16. The depth camera according to claim 9, wherein the depth camera further comprises a light receiver configured to receive reflected laser emitted by the light emitter at the first frequency to obtain a first phase difference, and to receive reflected laser emitted by the light emitter at the second frequency to obtain a second phase difference; the processor is further configured to:
    calculate a first distance from the first frequency and the first phase difference;
    calculate a second distance from the second frequency and the second phase difference; and
    calculate a calibrated distance from the first distance and the second distance.
  17. The depth camera according to claim 9, wherein the depth camera further comprises a first substrate assembly and a spacer; the first substrate assembly comprises a first substrate and a flexible circuit board connected to each other; the spacer is disposed on the first substrate; the light emitter is disposed on the spacer; the flexible circuit board is bent, with one end connected to the first substrate and the other end connected to the light emitter; the light receiver is disposed on the first substrate and comprises a casing and an optical element disposed on the casing; and the casing is connected with the spacer into one piece.
  18. The depth camera according to claim 17, wherein the spacer and the casing are integrally formed.
  19. An electronic device, wherein the electronic device comprises:
    the depth camera according to any one of claims 9-18;
    one or more processors;
    a memory; and
    one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for executing the control method according to any one of claims 1 to 7.
  20. A computer-readable storage medium, comprising a computer program used in combination with an electronic device, the computer program being executable by a processor to perform the control method according to any one of claims 1 to 7.
PCT/CN2019/090076 2018-08-22 2019-06-05 Control method and device, depth camera, electronic device, and readable storage medium WO2020038062A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810963382.6 2018-08-22
CN201810963382.6A CN109104583B (zh) Control method and device, depth camera, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
WO2020038062A1 true WO2020038062A1 (zh) 2020-02-27

Family

ID=64850746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090076 WO2020038062A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium

Country Status (2)

Country Link
CN (2) CN112702541B (zh)
WO (1) WO2020038062A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064139A (zh) * 2021-03-15 2021-07-02 深圳煜炜光学科技有限公司 Laser radar with high measurement accuracy and method of using the same
CN114833458A (zh) * 2022-04-29 2022-08-02 恒玄科技(上海)股份有限公司 Printing method, device, and printer for preventing a laser from burning a chip

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112702541B (zh) 2018-08-22 2023-04-18 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device, and readable storage medium
CN108833889B (zh) * 2018-08-22 2020-06-23 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device, and readable storage medium
CN110308458B (zh) * 2019-06-27 2021-03-23 Oppo广东移动通信有限公司 Adjustment method, adjustment device, terminal, and computer-readable storage medium
CN110365887B (zh) * 2019-07-30 2021-10-26 歌尔光学科技有限公司 Imaging method, apparatus, device, and computer-readable storage medium
CN110418062A (zh) * 2019-08-29 2019-11-05 上海云从汇临人工智能科技有限公司 Photographing method, apparatus, device, and machine-readable medium
CN110659617A (zh) * 2019-09-26 2020-01-07 杭州艾芯智能科技有限公司 Liveness detection method and apparatus, computer device, and storage medium
CN112526546B (zh) * 2021-02-09 2021-08-17 深圳市汇顶科技股份有限公司 Depth information determination method and apparatus
CN113296106A (zh) * 2021-05-17 2021-08-24 江西欧迈斯微电子有限公司 TOF ranging method and apparatus, electronic device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150204970A1 (en) * 2014-01-22 2015-07-23 Samsung Electronics Co., Ltd. Time of flight camera device and method of driving the same
CN105103006A (zh) * 2012-12-19 2015-11-25 微软技术许可有限责任公司 飞行去混叠的单个频率时间
CN106772414A (zh) * 2016-10-14 2017-05-31 北醒(北京)光子科技有限公司 一种提高tof相位法测距雷达测距精度的方法
CN106817794A (zh) * 2015-11-30 2017-06-09 宁波舜宇光电信息有限公司 Tof电路模块及其应用
CN108333860A (zh) * 2018-03-12 2018-07-27 广东欧珀移动通信有限公司 控制方法、控制装置、深度相机和电子装置
CN109104583A (zh) * 2018-08-22 2018-12-28 Oppo广东移动通信有限公司 控制方法及装置、深度相机、电子装置及可读存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2808136B2 (ja) * 1989-06-07 1998-10-08 キヤノン株式会社 Length measuring method and apparatus
KR20130093521A (ko) * 2010-07-12 2013-08-22 가부시기가이샤니레꼬 Distance measuring apparatus and distance measuring method
CN102184436B (zh) * 2011-05-16 2013-04-17 重庆大学 Object position sensing method for the Internet of Things
AU2014342114B2 (en) * 2013-11-01 2019-06-20 Irobot Corporation Scanning range finder
JP2015184200A (ja) * 2014-03-25 2015-10-22 横河電子機器株式会社 Radar apparatus
US10419703B2 (en) * 2014-06-20 2019-09-17 Qualcomm Incorporated Automatic multiple depth cameras synchronization using time sharing
CN105372668A (zh) * 2015-11-16 2016-03-02 中国电子科技集团公司第二十八研究所 Phase-based laser ranging method
CN105763803A (zh) * 2016-02-29 2016-07-13 广东欧珀移动通信有限公司 Control method, control device, and electronic device
US10802119B2 (en) * 2016-07-26 2020-10-13 Samsung Electronics Co., Ltd. Lidar device and method of measuring distance using the same
CN106597462B (zh) * 2016-12-26 2019-08-06 艾普柯微电子(上海)有限公司 Ranging method and ranging device
CN108072870B (zh) * 2017-10-25 2021-05-11 西南电子技术研究所(中国电子科技集团公司第十研究所) Method for improving ranging accuracy of burst communication using carrier phase
CN108333859B (zh) * 2018-02-08 2024-03-12 宁波舜宇光电信息有限公司 Structured light projection device, depth camera, and depth image imaging method based on the depth camera
CN108281880A (zh) * 2018-02-27 2018-07-13 广东欧珀移动通信有限公司 Control method, control device, terminal, computer device, and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105103006A (zh) * 2012-12-19 2015-11-25 微软技术许可有限责任公司 Single frequency time of flight de-aliasing
US20150204970A1 (en) * 2014-01-22 2015-07-23 Samsung Electronics Co., Ltd. Time of flight camera device and method of driving the same
CN106817794A (zh) * 2015-11-30 2017-06-09 宁波舜宇光电信息有限公司 TOF circuit module and application thereof
CN106772414A (zh) * 2016-10-14 2017-05-31 北醒(北京)光子科技有限公司 Method for improving ranging accuracy of TOF phase-method ranging radar
CN108333860A (zh) * 2018-03-12 2018-07-27 广东欧珀移动通信有限公司 Control method, control device, depth camera, and electronic device
CN109104583A (zh) * 2018-08-22 2018-12-28 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device, and readable storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064139A (zh) * 2021-03-15 2021-07-02 深圳煜炜光学科技有限公司 Laser radar with high measurement accuracy and method of using the same
CN113064139B (zh) * 2021-03-15 2024-02-06 深圳煜炜光学科技有限公司 Laser radar with high measurement accuracy and method of using the same
CN114833458A (zh) * 2022-04-29 2022-08-02 恒玄科技(上海)股份有限公司 Printing method, device, and printer for preventing a laser from burning a chip
CN114833458B (zh) * 2022-04-29 2023-09-08 恒玄科技(上海)股份有限公司 Printing method, device, and printer for preventing a laser from burning a chip

Also Published As

Publication number Publication date
CN112702541B (zh) 2023-04-18
CN109104583A (zh) 2018-12-28
CN109104583B (zh) 2021-01-15
CN112702541A (zh) 2021-04-23

Similar Documents

Publication Publication Date Title
WO2020038062A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium
WO2020038064A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium
WO2020052284A1 (zh) Control method and device, depth camera, electronic device, and readable storage medium
WO2020038060A1 (zh) Laser projection module and control method therefor, image acquisition device, and electronic device
CN108205374B (zh) Eyeball tracking module and method for video glasses, and video glasses
US11335028B2 (en) Control method based on facial image, related control device, terminal and computer device
CN108333860B (zh) Control method, control device, depth camera, and electronic device
WO2020062909A1 (zh) Control method and device, time-of-flight device, terminal, and computer-readable storage medium
WO2020139915A1 (en) Head mounted display calibration using portable docking station with calibration target
WO2020038058A1 (zh) Calibration method, calibration controller, and calibration system
CN108509867B (zh) Control method, control device, depth camera, and electronic device
CN108227361B (zh) Control method, control device, depth camera, and electronic device
US10503248B1 (en) Selective color sensing for motion tracking
WO2020052282A1 (zh) Electronic device, control method therefor, control device, and computer-readable storage medium
CN108594451B (zh) Control method, control device, depth camera, and electronic device
WO2020038053A1 (zh) Time-of-flight module, control method therefor, controller, and electronic device
TWI684026B (zh) Control method, control device, depth camera, and electronic device
CN108281880A (zh) Control method, control device, terminal, computer device, and storage medium
CN108279496B (zh) Eyeball tracking module and method for video glasses, and video glasses
WO2020038061A1 (zh) Time-of-flight module, control method therefor, controller, and electronic device
US10551500B2 (en) Infrared optical element for proximity sensor system
KR20210006605A (ko) 센서를 포함하는 전자 장치 및 그의 동작 방법
KR20200137830A (ko) 적어도 하나의 센서를 이용하여 측정된 전자 장치와 사용자 간 거리에 기반하여 생체 데이터를 보정하는 전자 장치 및 방법
US11550408B1 (en) Electronic device with optical sensor for sampling surfaces
KR20150072778A (ko) 저전력 전자 기기

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19851199

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19851199

Country of ref document: EP

Kind code of ref document: A1