CN111580282A - Light emitting module, depth camera, electronic equipment and control method

Light emitting module, depth camera, electronic equipment and control method

Info

Publication number
CN111580282A
Authority
CN
China
Prior art keywords
light
light emitting
laser
emitting
light source
Prior art date
Legal status
Granted
Application number
CN202010472389.5A
Other languages
Chinese (zh)
Other versions
CN111580282B (en)
Inventor
张学勇
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010472389.5A
Publication of CN111580282A
Application granted
Publication of CN111580282B
Legal status: Active (current)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0938Using specific optical elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0927Systems for changing the beam intensity distribution, e.g. Gaussian to top-hat
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/09Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
    • G02B27/0938Using specific optical elements
    • G02B27/095Refractive optical elements
    • G02B27/0955Lenses
    • G02B27/0961Lens arrays
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05Combinations of cameras with electronic flash apparatus; Electronic flash units

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application discloses a light emitting module, a depth camera, an electronic device and a control method. The light emitting module comprises a light source, a diaphragm, a lens array and a diffuser. The light source is used for emitting laser light and comprises a plurality of independently controlled light emitting areas. The diaphragm is arranged on the light emitting optical path of the light source and defines a plurality of through holes corresponding to the light emitting areas, the through holes allowing the laser light to pass through. The lens array is arranged on the light emitting optical path of the light source and comprises a plurality of lenses corresponding to the light emitting areas, each lens refracting the laser light to change its propagation direction. The diffuser is arranged on the light emitting optical path of the light source and is used for diffusing the laser light. By controlling the light source in zones and using the lens array to refract the laser light emitted by the light source, the light emitting module of the embodiments of the application can enlarge its ranging range and field of view and optimize its light emitting effect.

Description

Light emitting module, depth camera, electronic equipment and control method
Technical Field
The application relates to the technical field of depth ranging, in particular to a light emitting module, a depth camera, electronic equipment and a control method.
Background
Electronic devices such as cell phones may be equipped with time-of-flight depth cameras to enable measurement of depth information of a scene. Time-of-flight depth cameras typically include a light emitter for emitting light into a scene and a light receiver for receiving light reflected back from objects in the scene, and depth information for the scene can be calculated from the time difference between the time at which the light emitter emits light and the time at which the light receiver receives light. However, the range of existing time-of-flight depth cameras is small.
Disclosure of Invention
The embodiment of the application provides a light emitting module, a depth camera, electronic equipment and a control method.
The light emitting module of the embodiment of the application comprises a light source, a diaphragm, a lens array and a diffuser. The light source is used for emitting laser light and comprises a plurality of independently controlled light emitting areas. The diaphragm is arranged on the light emitting optical path of the light source and defines a plurality of through holes corresponding to the light emitting areas, the through holes being used for allowing the laser light to pass through. The lens array is arranged on the light emitting optical path of the light source and comprises a plurality of lenses corresponding to the light emitting areas, each lens being used for refracting the laser light to change the propagation direction of the laser light. The diffuser is arranged on the light emitting optical path of the light source and is used for diffusing the laser light.
The depth camera of the embodiment of the application comprises a light emitting module and a light receiving module. The light receiving module is used for receiving the laser light emitted by the light emitting module. The light emitting module comprises a light source, a diaphragm, a lens array and a diffuser. The light source is used for emitting laser light and comprises a plurality of independently controlled light emitting areas. The diaphragm is arranged on the light emitting optical path of the light source and defines a plurality of through holes corresponding to the light emitting areas, the through holes being used for allowing the laser light to pass through. The lens array is arranged on the light emitting optical path of the light source and comprises a plurality of lenses corresponding to the light emitting areas, each lens being used for refracting the laser light to change the propagation direction of the laser light. The diffuser is arranged on the light emitting optical path of the light source and is used for diffusing the laser light.
The electronic equipment of the embodiment of the application comprises a shell and a depth camera combined with the shell. The depth camera comprises a light emitting module and a light receiving module. The light receiving module is used for receiving the laser light emitted by the light emitting module. The light emitting module comprises a light source, a diaphragm, a lens array and a diffuser. The light source is used for emitting laser light and comprises a plurality of independently controlled light emitting areas. The diaphragm is arranged on the light emitting optical path of the light source and defines a plurality of through holes corresponding to the light emitting areas, the through holes being used for allowing the laser light to pass through. The lens array is arranged on the light emitting optical path of the light source and comprises a plurality of lenses corresponding to the light emitting areas, each lens being used for refracting the laser light to change the propagation direction of the laser light. The diffuser is arranged on the light emitting optical path of the light source and is used for diffusing the laser light.
The control method of the embodiment of the application is used for the light emitting module. The light emitting module comprises a light source, a diaphragm, a lens array and a diffuser. The light source is used for emitting laser light and comprises a plurality of independently controlled light emitting areas. The diaphragm is arranged on the light emitting optical path of the light source and defines a plurality of through holes corresponding to the light emitting areas, the through holes being used for allowing the laser light to pass through. The lens array is arranged on the light emitting optical path of the light source and comprises a plurality of lenses corresponding to the light emitting areas, each lens being used for refracting the laser light to change the propagation direction of the laser light. The diffuser is arranged on the light emitting optical path of the light source and is used for diffusing the laser light. The control method comprises the following steps: determining an application scene of the depth camera; and determining a turn-on strategy of the light emitting module according to the application scene, wherein the turn-on strategy comprises at least one of the light emitting power of the light emitting areas and the turn-on sequence of the light emitting areas.
The light emitting module, the depth camera, the electronic device and the control method of the embodiments of the application enable the light emitting module to have a large ranging range while avoiding serious heat generation, through zoned control of the light source. In addition, the light emitting module uses the lens array to refract the laser light emitted by the plurality of light emitting areas of the light source, which both separates the fields of view corresponding to the light emitting areas and enlarges the field of view of the light emitting module, optimizing its light emitting effect.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of a light emitting module according to some embodiments of the present disclosure;
FIG. 2 is a schematic block diagram of a depth camera according to some embodiments of the present application;
FIG. 3 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 4 is a schematic view of a laser pattern projected by a light emission module according to some embodiments of the present disclosure;
FIG. 5 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 6 is a schematic view of a laser pattern projected by a light emission module according to some embodiments of the present disclosure;
FIG. 7 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 8 is a schematic view of a laser pattern projected by a light emission module according to some embodiments of the present disclosure;
FIG. 9 is a schematic diagram of a light emitting module according to some embodiments of the present disclosure;
FIG. 10 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 11 is a schematic view of a laser pattern projected by a light emission module according to some embodiments of the present disclosure;
FIG. 12 is a schematic diagram of a light emitting module according to some embodiments of the present disclosure;
FIG. 13 is a schematic diagram of a light emitting module according to some embodiments of the present disclosure;
FIG. 14 is a schematic structural diagram of a light emitting module according to some embodiments of the present disclosure;
FIG. 15 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 16 is a schematic view of the arrangement of the light emitting regions of the light source according to some embodiments of the present disclosure;
FIG. 17 is a schematic view of the arrangement of light emitting regions of a light source according to some embodiments of the present disclosure;
FIG. 18 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 19 is a schematic flow chart diagram of a control method according to certain embodiments of the present application;
FIG. 20 is a flow chart illustrating a control method according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 and 3, the present embodiment provides a light emitting module 10 including a light source 11, a diaphragm 13, a lens array 15 and a diffuser 17. The light source 11 is used for emitting laser light and includes a plurality of light emitting regions 111 that are independently controlled. The diaphragm 13 is disposed on the light emitting path of the light source 11 and defines a plurality of through holes 131 corresponding to the plurality of light emitting regions 111, the through holes 131 being used for passing the laser light. The lens array 15 is disposed on the light emitting optical path of the light source 11 and includes a plurality of lenses 151 corresponding to the plurality of light emitting regions 111, each lens 151 being used for refracting the laser light to change its propagation direction. The diffuser 17 is disposed on the light emitting path of the light source 11 and is used to diffuse the laser light.
By controlling the light source 11 in zones, the light emitting module 10 of the embodiment of the present application can have a large ranging range without a serious heat generation problem. In addition, the light emitting module 10 uses the lens array 15 to refract the laser light emitted from the light emitting areas 111 of the light source 11, so that the fields of view corresponding to the light emitting areas 111 can be distinguished, the field of view of the light emitting module 10 can be enlarged, and the light emitting effect of the light emitting module 10 can be optimized.
Referring to fig. 1 and fig. 2, a depth camera 100 is further provided in the present embodiment. The depth camera 100 includes a light emitting module 10, a light receiving module 20, and a processor 30. The processor 30 is electrically connected to both the light emitting module 10 and the light receiving module 20. In one example, the depth camera 100 may be a time-of-flight depth camera.
The light emitting module 10 includes a light source 11, a diaphragm 13, a lens array 15 and a diffuser 17.
The light source 11 is used to emit laser light. The light source 11 includes a plurality of light emitting regions 111, and the plurality of light emitting regions 111 may be independently controlled, that is, each light emitting region 111 can emit light independently. Each light emitting region 111 may have a polygonal cross section. As shown in fig. 3, in one example, each light emitting region 111 has a square cross section; in other examples, the light emitting region 111 may have a triangular, pentagonal, hexagonal, octagonal or dodecagonal cross section, and the like, without limitation. The light source 11 may be divided into 2x2, 3x3, 4x4, etc. light emitting regions 111, and the number of light emitting regions 111 may be determined according to the ranging range of the depth camera 100, the field of view, and the heat dissipation requirement of the light source 11, and is not limited herein. The areas of any two light emitting regions 111 may be the same or similar, and the shapes of any two light emitting regions 111 may also be the same or similar, which is not limited herein. This both facilitates the design of the light source 11 and allows the light emitted by the light emitting regions 111 to have substantially the same projection area. Each light emitting region 111 includes one or more point light sources 113, and the difference between the numbers of point light sources 113 in any two light emitting regions 111 is less than a predetermined threshold. It can be understood that if the numbers of point light sources 113 in two light emitting regions 111 differ greatly, then under the same driving current the light emitting power of the region with more point light sources 113 is larger than that of the region with fewer point light sources 113, so that the brightness of the corresponding regions of the image is not uniform, which affects image quality. The plurality of light emitting regions 111 adopt a common-cathode, separate-anode design, and a driving chip (not shown) connected with the light source 11 can switch the anodes of the point light sources 113 in each light emitting region 111 on or off, thereby controlling the light emitting state (on or off) of the point light sources 113 in each light emitting region 111.
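For illustration only, the common-cathode, separate-anode zone control described above can be pictured with the following Python sketch. The class names, method names and numeric values are assumptions introduced here; the patent does not define any software interface.

```python
# Hypothetical sketch of zoned light-source control (common cathode, separate anodes).
from dataclasses import dataclass, field
from typing import List


@dataclass
class Zone:
    """One independently controlled light emitting area (cf. region 111)."""
    index: int
    num_point_sources: int          # point light sources (cf. 113) inside this zone
    anode_on: bool = False
    drive_current_ma: float = 0.0   # emitting power scales with drive current


@dataclass
class ZonedLightSource:
    zones: List[Zone] = field(default_factory=list)

    def turn_on(self, index: int, drive_current_ma: float) -> None:
        # Closing a zone's anode switch lights every point source in that zone.
        zone = self.zones[index]
        zone.anode_on = True
        zone.drive_current_ma = drive_current_ma

    def turn_off(self, index: int) -> None:
        zone = self.zones[index]
        zone.anode_on = False
        zone.drive_current_ma = 0.0


# Example: a 3x3 source (9 zones) with a similar number of point sources per zone,
# so all zones project with roughly equal power at the same drive current.
source = ZonedLightSource([Zone(i, num_point_sources=16) for i in range(9)])
source.turn_on(0, drive_current_ma=50.0)   # light only the first zone
```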
It can be understood that, in the related art, when a light emitting module emits light, the entire light source generally emits laser light. When this light emitting mode is applied to the measurement of a short-distance scene, depth information with high precision can be obtained. However, when measuring a long-distance scene, because the object to be measured is far away, the energy of the laser light emitted by the light source is greatly attenuated by the time it reaches the object, and depth information with high precision cannot be obtained. In that case, to obtain depth information of sufficient accuracy, a larger driving current must be supplied to the light source so that it can emit laser light of higher power. However, increasing the driving current to increase the emitting power greatly increases the heat generated by the light emitting module, which affects its working performance. In the light emitting module 10 of the present embodiment, each light emitting region 111 of the light source 11 can be independently controlled, so that the light emitting regions 111 can be turned on at different times or simultaneously. When measuring the depth information of a distant object, turning on the plurality of light emitting regions 111 in a time-shared manner while increasing the light emitting power of each light emitting region 111 ensures the ranging accuracy of the depth camera 100, reduces the heat generated by the light emitting module 10, and preserves the working performance of the light emitting module 10. Thus, the light emitting module 10 of the embodiment of the present application can measure the depth information of both nearby and distant objects, with a larger ranging range and higher ranging accuracy.
Referring to fig. 1 and 3, the diaphragm 13 is disposed on the light emitting path of the light source 11, the diaphragm 13 is provided with a plurality of through holes 131 corresponding to the plurality of light emitting areas 111, and the plurality of through holes 131 are used for passing the laser light. The diaphragm 13 is an element that limits the light beam in the optical system, and it restricts the projection field of view of the light passing through the through holes 131. In the embodiment of the present application, the correspondence between the light emitting regions 111 and the through holes 131 may be as follows: each light emitting region 111 corresponds to one through hole 131, and the laser light emitted from that light emitting region 111 exits after passing through that through hole 131. When one light emitting region 111 corresponds to one through hole 131, the amount of light passing through the through hole 131 can be adjusted by controlling the magnitude of the driving current of that light emitting region 111. In the embodiment of the present application, the diaphragm 13 is disposed on the light emitting path of the light source 11 and is provided with a plurality of through holes 131 corresponding to the different light emitting regions 111, so that the laser light emitted by each light emitting region 111 passes through its corresponding through hole 131. This confines the laser light emitted by each light emitting region 111 and prevents crosstalk between the laser light emitted by the light emitting regions 111 corresponding to different through holes 131.
Referring to fig. 1 and 3, the lens array 15 is disposed on the light emitting path of the light source 11, and the lens array 15 includes a plurality of lenses 151 corresponding to the light emitting regions 111, each lens 151 being used for refracting the laser light to change its propagation direction. Specifically, the diaphragm 13 is located between the light source 11 and the lens array 15, and each lens 151 refracts the laser light passing through the diaphragm 13 to change its propagation direction. The lens 151 includes a first surface 1511 and a second surface 1513 opposite to each other, the first surface 1511 being closer to the light source 11 than the second surface 1513. At least one of the first surface 1511 and the second surface 1513 is a free-form surface. As shown in fig. 1, in one example, the first surface 1511 and the second surface 1513 of the lens 151 may both be free-form surfaces. Because the first surface 1511 and the second surface 1513 are free-form surfaces, the lens 151 refracts the laser light more effectively, which helps improve the laser projection effect of the light emitting module 10. The lenses 151 of the lens array 15 correspond to the through holes 131 such that one lens 151 corresponds to one through hole 131; when one light emitting region 111 corresponds to one through hole 131, the correspondence is therefore one light emitting region 111 to one through hole 131 and one lens 151 (as shown in fig. 1). The shape of the projection of the lens 151 on the light source 11 may be the same as, or similar to, the shape of the cross section of the corresponding light emitting region 111, which is not limited herein. It can be understood that, when measuring a long-distance scene, the accuracy of the acquired depth information can be improved by increasing the light emitting power of the light source 11; since increasing the light emitting power of the light source 11 easily increases the power consumption of the light emitting module 10, the power consumption can be reduced by turning on only some of the point light sources 113 in the light source 11. However, this in turn reduces the projected field of view of the light emitting module 10. The light emitting module 10 of the embodiment of the present application is therefore provided with the lens array 15, whose lenses 151 match the plurality of light emitting areas 111; the plurality of lenses 151 separate the projection fields of view of the plurality of light emitting areas 111, so that these projection fields of view can be combined into one large projection field of view of the light emitting module 10. When measuring a long-distance scene, the light emitting regions 111 can be turned on in a time-shared manner with a larger driving current, so that high-precision depth information can be acquired with lower power consumption while the light emitting module 10 still has a large projection field of view.
Further, in the lens array 15 of the embodiment of the present application, the free-form surfaces of the lenses 151 located around the periphery of the lens array 15 protrude more than the free-form surfaces of the lenses 151 located at the center of the lens array 15, so that the laser light emitted from the light emitting regions 111 far from the center of the light source 11 can be refracted at a greater angle by the peripheral lenses 151, which helps enlarge the field angle of the light emitting module 10.
Referring to fig. 1 again, the diffuser 17 is disposed on the light emitting path of the light source 11 and is used for diffusing the laser light. The diffuser 17 works by softening and scattering the laser light emitted by the light source 11 through refraction and reflection by the diffusing material. In the embodiment of the present application, the diffuser 17 mainly homogenizes and diffuses the laser beam to change the field angle of the light emitting module 10 and the light intensity distribution of the laser light in the measured scene. The relative position of the lens array 15 and the diffuser 17 can vary. As shown in fig. 1, in one example, the diaphragm 13, the lens array 15 and the diffuser 17 are sequentially disposed along the light emitting direction of the light source 11, with the lens array 15 and the diffuser 17 spaced apart from each other. Each lens 151 refracts the laser light passing through its corresponding through hole 131 to change the propagation direction of that laser light, and the diffuser 17 diffuses the laser light passing through the lens array 15 to change the field angle of the light emitting module 10 and the light intensity distribution of the laser light in the measured scene.
Referring to fig. 2 again, the light receiving module 20 may be used for receiving the laser light emitted by the light emitting module 10. Specifically, the light receiving module 20 is configured to receive the laser light that is emitted by the light source 11, passes through the diaphragm 13, is refracted by the lens array 15 and diffused by the diffuser 17, and is then reflected back by an object in the measured scene. By calculating the time of flight of the laser light in the air, the distance of the object from the depth camera 100, i.e., the depth information, can be calculated.
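For reference, the time-of-flight relation implied here is the standard round-trip formula (general physics, not wording taken from the patent):

```latex
% c is the speed of light, \Delta t the delay between emission and reception;
% the factor 1/2 accounts for the out-and-back path of the laser light.
d = \frac{c \,\Delta t}{2}, \qquad c \approx 3 \times 10^{8}\ \mathrm{m/s}
```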
Referring to fig. 2 and 3, the processor 30 may be configured to determine an application scenario of the depth camera 100, and determine an opening policy of the light emitting module 10 according to the application scenario. The turn-on strategy includes at least one of the light emitting power of the plurality of light emitting regions 111 and the turn-on sequence of the plurality of light emitting regions 111. That is, the turn-on strategy may include only the light emitting power of the plurality of light emitting regions 111; alternatively, the turn-on strategy may include only the turn-on order of the plurality of light emitting areas 111; alternatively, the turn-on strategy may include both the light emitting power of the plurality of light emitting regions 111 and the turn-on sequence of the plurality of light emitting regions 111.
In particular, the processor 30 may determine the application scene of the depth camera 100 in a variety of ways. For example, the processor 30 may determine the application scene of the depth camera 100 according to the user's input. As another example, the processor 30 may determine the application scene according to the application that currently enables the depth camera 100: when the application enabling the depth camera 100 is related to gesture recognition, the processor 30 determines that the depth camera 100 is applied to gesture recognition and that the current measured scene is a short-distance scene; when the application enabling the depth camera 100 is related to indoor navigation, the processor 30 determines that the depth camera 100 is applied to indoor navigation and that the current measured scene is a long-distance scene. As yet another example, the processor 30 may determine the application scene from a coarse ranging pass of the depth camera 100: the light emitting module 10 may first be turned on with a predetermined current to roughly measure the depth information of the measured scene, and the processor 30 then determines from this depth information whether the current measured scene is a long-distance scene or a short-distance scene.
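The three ways of determining the application scene described above could be combined as in the following Python sketch. The function names, the application categories and the 1.5 m threshold are illustrative assumptions only, not values given in the patent.

```python
# Illustrative sketch of how a processor might classify the scene as near
# ("first scene") or far ("second scene"). All names and thresholds are assumed.

NEAR_SCENE = "first_scene"    # short-distance scene, e.g. gesture recognition
FAR_SCENE = "second_scene"    # long-distance scene, e.g. indoor navigation

NEAR_APPS = {"gesture_recognition"}
FAR_APPS = {"indoor_navigation"}


def determine_scene(user_choice=None, active_app=None, coarse_depth_m=None,
                    threshold_m=1.5):
    # 1. Explicit user input takes priority.
    if user_choice in (NEAR_SCENE, FAR_SCENE):
        return user_choice
    # 2. Otherwise infer the scene from the application that enabled the camera.
    if active_app in NEAR_APPS:
        return NEAR_SCENE
    if active_app in FAR_APPS:
        return FAR_SCENE
    # 3. Otherwise fall back to a coarse ranging pass made with a predetermined current.
    if coarse_depth_m is not None:
        return NEAR_SCENE if coarse_depth_m < threshold_m else FAR_SCENE
    return NEAR_SCENE  # default assumption


print(determine_scene(active_app="indoor_navigation"))  # -> second_scene
```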
Referring to fig. 1, fig. 2, fig. 3 and fig. 4, when the application scene is a first scene (i.e., a short-distance scene), the light emitting regions 111 emit light with a first light emitting power and are turned on simultaneously. Specifically, the light receiving module 20 sends a synchronization signal to the light emitting module 10. After receiving the synchronization signal, the light emitting module 10 simultaneously turns on the anode switches of the 9 light emitting regions 111; the point light sources 113 of the 9 light emitting regions 111 are lit at the same time and emit laser light with the first light emitting power (shown in fig. 3), and the laser light from each light emitting region 111 passes through the corresponding through hole 131 into the corresponding lens 151, is refracted by the lens 151 and diffused by the diffuser 17, and is then projected into the measured scene (shown in fig. 4). It can be understood that, since the first scene is a short-distance scene, the light emitting module 10 only needs to emit light with a lower light emitting power to obtain depth information of high accuracy; and because the light emitting power is low, turning on all the light emitting areas 111 of the light emitting module 10 simultaneously does not cause it to generate much heat. Therefore, when the depth camera 100 is applied to the first scene, the plurality of light emitting areas 111 can be controlled to emit light with the first light emitting power and be turned on simultaneously, which ensures a high acquisition accuracy of the depth information, avoids large power consumption of the light emitting module 10, and avoids the extra time that turning on the light emitting areas 111 in a time-shared manner would take.
Referring to fig. 1, fig. 2, and fig. 5 to fig. 8, when the application scene is a second scene (i.e., a long-distance scene), the light emitting regions 111 emit light with a second light emitting power greater than the first light emitting power, and the light emitting regions 111 are turned on in a time-shared manner. Specifically, the light receiving module 20 sends a synchronization signal to the light emitting module 10. After receiving the synchronization signal, the light emitting module 10 turns on the anode switch of the first light emitting region 111 (shown in fig. 5); the point light sources 113 of the first light emitting region 111 are lit and emit laser light with the second light emitting power, and the laser light passes through the through hole 131 corresponding to the first light emitting region 111 into the corresponding lens 151, is refracted by the lens 151 and diffused by the diffuser 17, and is projected onto a region of the measured scene that lies within the receiving field of view of the light receiving module 20 (shown in fig. 6). Then, the light emitting module 10 turns off the anode switch of the first light emitting region 111 and turns on the anode switch of the second light emitting region 111 (shown in fig. 7); the point light sources 113 of the second light emitting region 111 are lit and emit laser light with the second light emitting power, and the laser light passes through the through hole 131 corresponding to the second light emitting region 111 into the corresponding lens 151, is refracted by the lens 151 and diffused by the diffuser 17, and is projected onto another region of the measured scene, also located within the receiving field of view of the light receiving module 20 (shown in fig. 8). Subsequently, the light emitting module 10 turns on the third through ninth light emitting regions 111 in sequence. The processor 30 can obtain 9 frames of depth sub-images from the laser light returned from the 9 different regions received by the light receiving module 20, and the processor 30 synthesizes the 9 frames of depth sub-images into one frame of depth image, thereby obtaining a depth image with a larger field of view. It should be noted that the turn-on sequence of the light emitting regions 111 is not limited to the sequence shown in fig. 5 to fig. 8 and may be determined by the developer. It can be understood that, since the second scene is a long-distance scene, the object to be measured is far away and the energy of the laser light emitted by the light source 11 is greatly attenuated by the time it reaches the object, so the depth camera 100 cannot otherwise obtain depth information of high accuracy. Therefore, by turning on the plurality of light emitting regions 111 in a time-shared manner and causing each light emitting region 111 to emit light with a larger light emitting power, the depth camera 100 can obtain depth information of higher accuracy and give the light emitting module 10 a larger emission field of view without increasing the power consumption of the light emitting module 10.
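A minimal sketch of this long-distance capture flow, building on the hypothetical ZonedLightSource sketch given earlier, is shown below. The receiver object, its capture_depth_subimage method and the merge step are placeholders introduced here, not an API defined by the patent.

```python
# Sketch of the "second scene" flow: the 9 zones are lit one at a time at the higher
# power, one depth sub-image is acquired per zone, and the sub-images are merged
# into a single wide-field depth frame.

def capture_long_range_frame(source, receiver, num_zones=9, high_power_ma=150.0):
    sub_images = []
    for i in range(num_zones):          # time-shared turn-on, one zone per exposure
        source.turn_on(i, drive_current_ma=high_power_ma)
        sub_images.append(receiver.capture_depth_subimage())  # placeholder call
        source.turn_off(i)
    return merge_subimages(sub_images)  # stitch 9 sub-fields into one frame


def merge_subimages(sub_images):
    # Each sub-image covers a different part of the receiving field of view;
    # here we simply collect them to stand in for the composition step.
    return {"depth_frame": sub_images}
```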
In summary, the depth camera 100 and the light emitting module 10 of the embodiment of the present application control the light source 11 in zones, so that the light emitting module 10 can have a wide ranging range without a serious heat generation problem. In addition, the light emitting module 10 uses the lens array 15 to refract the laser light emitted from the light emitting areas 111 of the light source 11, so that the fields of view corresponding to the light emitting areas 111 can be distinguished, the field of view of the light emitting module 10 can be enlarged, and the light emitting effect of the light emitting module 10 can be optimized.
Referring to fig. 9, in some embodiments, the correspondence between the light emitting regions 111 and the through holes 131 and lenses 151 may be that a plurality of light emitting regions 111 correspond to one through hole 131 and one lens 151, the laser light emitted from those light emitting regions 111 exiting through the same through hole 131. For example, two light emitting regions 111 may correspond to one through hole 131 and one lens 151, or three light emitting regions 111 may correspond to one through hole 131 and one lens 151; the number of light emitting regions 111 corresponding to one through hole 131 and one lens 151 may be designed by the developer according to practical needs and is not limited herein. In one example, the light emitting regions 111 of the light source 11 may be divided into three groups, each group including three light emitting regions 111, and each group of three light emitting regions 111 corresponds to one through hole 131 and one lens 151.
Referring to fig. 1, fig. 2, fig. 3 and fig. 4, when the application scene is a first scene (i.e., a short-distance scene), the light emitting regions 111 emit light with a first light emitting power and are turned on simultaneously. Specifically, the light receiving module 20 sends a synchronization signal to the light emitting module 10. After receiving the synchronization signal, the light emitting module 10 simultaneously turns on the anode switches of the 9 light emitting regions 111; the point light sources 113 of the 9 light emitting regions 111 are lit at the same time and emit laser light with the first light emitting power (shown in fig. 3), and the laser light from each light emitting region 111 passes through the corresponding through hole 131 into the corresponding lens 151, is refracted by the lens 151 and diffused by the diffuser 17, and is then projected into the measured scene (shown in fig. 4).
Referring to fig. 2 and fig. 9 to fig. 11, when the application scene is the second scene (i.e., a long-distance scene), the light emitting regions 111 emit light with the second light emitting power greater than the first light emitting power, and the light emitting regions 111 are turned on in a time-shared manner. Specifically, the light receiving module 20 sends a synchronization signal to the light emitting module 10. After receiving the synchronization signal, the light emitting module 10 simultaneously turns on the anode switches of the three light emitting regions 111 in the first group (shown in fig. 10); the point light sources 113 of these three light emitting regions 111 are lit and emit laser light with the second light emitting power, and the laser light passes through the through hole 131 corresponding to these three light emitting regions 111 into the corresponding lens 151, is refracted by the lens 151 and diffused by the diffuser 17, and is projected onto a region of the measured scene located within the receiving field of view of the light receiving module 20 (shown in fig. 11). Then, the light emitting module 10 simultaneously turns off the anode switches of the three light emitting regions 111 in the first group and turns on the three light emitting regions 111 in the second group, so that the second group projects laser light. Subsequently, the light emitting module 10 simultaneously turns off the anode switches of the three light emitting regions 111 in the second group and turns on the three light emitting regions 111 in the third group, so that the third group projects laser light. The processor 30 can obtain 3 frames of depth sub-images from the laser light returned from the three different groups of regions received by the light receiving module 20, and synthesizes the 3 frames of depth sub-images into one frame of depth image, thereby obtaining a depth image with a larger field of view. It should be noted that the turn-on sequence of the three groups of light emitting regions 111 is not limited to the sequence shown in fig. 10 to fig. 11 and may be determined by the developer. In the embodiment of the present application, with a plurality of light emitting regions 111 corresponding to one through hole 131 and one lens 151, turning on several light emitting regions 111 at a time takes less time than turning on the light emitting regions 111 one by one as in the embodiments shown in fig. 5 to fig. 8, which helps increase the speed of obtaining a depth image and improves the working efficiency of the depth camera 100.
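Under this grouped arrangement the capture loop from the earlier sketch shrinks from nine exposures to three. A minimal variant is shown below; the grouping of zone indices and the reuse of the placeholder receiver and merge helpers are assumptions for illustration.

```python
# Sketch of grouped time-sharing: three assumed groups of three zones each.
GROUPS = [(0, 1, 2), (3, 4, 5), (6, 7, 8)]


def capture_grouped_frame(source, receiver, high_power_ma=150.0):
    sub_images = []
    for group in GROUPS:                        # only 3 exposures instead of 9
        for zone in group:
            source.turn_on(zone, drive_current_ma=high_power_ma)
        sub_images.append(receiver.capture_depth_subimage())  # placeholder call
        for zone in group:
            source.turn_off(zone)
    return merge_subimages(sub_images)          # reuse the earlier merge placeholder
```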
Referring to fig. 12, in some embodiments, only one of the first surface 1511 and the second surface 1513 of the lens 151 is a free-form surface. The first surface 1511 may be a free-form surface while the second surface 1513 is a plane (shown in fig. 12), or the first surface 1511 may be a plane while the second surface 1513 is a free-form surface (not shown), which is not limited herein. Compared with the lens 151 shown in fig. 1, in which both the first surface 1511 and the second surface 1513 are free-form surfaces, the lens 151 of this embodiment has only one free-form surface, so it is easier to manufacture by production processes such as injection molding, and the production efficiency is higher.
Referring to fig. 13, in some embodiments, the first surface 1511 and the second surface 1513 of the lens 151 may both be planar. Compared with the lens 151 in which both surfaces are free-form surfaces as shown in fig. 1 and the lens 151 in which one of the two surfaces is a free-form surface as shown in fig. 12, the lens 151 of this embodiment, with both the first surface 1511 and the second surface 1513 planar, has a simpler structure, is easier to manufacture by production processes such as injection molding, and offers higher production efficiency.
Referring to fig. 13, in some embodiments, when the first surface 1511 of the lens array 15 is planar, the diaphragm 13, the lens array 15 and the diffuser 17 may be arranged in sequence along the light emitting direction of the light source 11, with the diffuser 17 located on and connected to the first surface 1511 of the lens array 15. The diffuser 17 may be connected to the lens array 15 by gluing, welding, or the like, which is not limited herein. In the production of the light emitting module 10 shown in fig. 1, the optical axes of each light emitting region 111 of the light source 11, each lens 151 of the lens array 15 and the diffuser 17 need to be aligned, and the assembly process requires high precision. In the embodiment of the present application, the diffuser 17 is connected to the first surface 1511 of the lens 151, so only one optical-axis alignment between the light emitting regions 111 and the lens array 15 carrying the diffuser 17 is needed during production, which greatly simplifies the assembly and optical-axis alignment process of the light emitting module 10.
Referring to fig. 14, in some embodiments, when the second surface 1513 of the lens array 15 is planar, the diaphragm 13, the diffuser 17 and the lens array 15 may be sequentially disposed along the light emitting direction of the light source 11, with the diffuser 17 located on and connected to the second surface 1513 of the lens array 15. The diffuser 17 may be connected to the lens array 15 by gluing, welding, or the like, which is not limited herein. With the diffuser 17 connected to the second surface 1513 of the lens 151, only one optical-axis alignment between the light emitting regions 111 and the lens array 15 carrying the diffuser 17 is needed during production, which greatly simplifies the assembly and optical-axis alignment process of the light emitting module 10. Of course, in other embodiments, when the diaphragm 13, the diffuser 17 and the lens array 15 are sequentially disposed along the light emitting direction of the light source 11, the diffuser 17 may also be spaced apart from the lens array 15, which is not limited herein; in this case, the second surface 1513 of the lens array 15 may be a free-form surface, which is likewise not limited herein.
Referring to fig. 1 and 15, in some embodiments, the light emitting region 111 may have a hexagonal cross section. Setting the cross section of the light emitting region 111 to a hexagon makes the projected shape of the laser light from the light emitting region 111 closer to a circle, which makes it easier for the diffuser 17 to shape and homogenize the laser light emitted from the light emitting region 111. Moreover, compared with the light emitting region 111 with a square cross section in the embodiment shown in fig. 1, hexagonal light emitting regions 111 require fewer time-shared turn-ons (7, as shown in fig. 15) to traverse all the light emitting regions 111 during depth measurement and obtain a frame of depth image, which helps reduce the time consumed per frame and improves the working efficiency of the depth camera 100.
Referring to fig. 16, in some embodiments, the light emitting region 111 may have a fan-shaped cross section. At this time, the light source 11 may be divided into 2 light emitting regions 111, 4 light emitting regions 111, 8 light emitting regions 111, and the like, and the number of the light emitting regions 111 may be determined according to factors such as a range of the depth camera 100, a field angle, and a heat dissipation requirement of the light source 11, and is not limited herein. The areas of any two light emitting regions 111 may be the same or similar, and are not limited herein.
Referring to fig. 3 and 17, in some embodiments, the light emitting region 111 may have an annular (ring-shaped) cross section. The light source 11 may then be divided into 2, 3, 6, etc. light emitting regions 111, and the number of light emitting regions 111 may be determined according to the ranging range of the depth camera 100, the field angle, the heat dissipation requirement of the light source 11, and the like, which is not limited herein. The areas of any two light emitting regions 111 may be the same or similar, which is not limited herein. When the cross section of the light emitting region 111 is annular, the projected shape of the laser light from the light emitting region 111 is closer to a circle, which makes it easier for the diffuser 17 to shape and homogenize the laser light emitted from the light emitting region 111. It should be noted that when the cross sections of the light emitting regions 111 are annular, an additional light emitting region 111 with a circular cross section needs to be added, so that the annular light emitting regions 111 and the circular light emitting region 111 together form a light source 11 with a circular cross section.
Referring to fig. 1 and fig. 18, an electronic device 1000 is further provided in the present embodiment. The electronic device 1000 includes a housing 500 and the depth camera 100 described in any of the above embodiments. The depth camera 100 is coupled to the housing 500. For example, the housing 500 is formed with a receiving space (not shown) in which the depth camera 100 is received.
The electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart band, a smart watch, a smart helmet, or smart glasses), a virtual reality device, and the like, which is not limited herein. In the embodiment of the present application, the electronic device 1000 is a mobile phone.
The electronic device 1000 according to the embodiment of the present application is provided with the depth camera 100 (shown in fig. 1) including the light emitting module 10. The light emitting module 10 can have a large range through the zone control of the light source 11, and meanwhile, the light emitting module 10 does not have a serious heating problem. In addition, the light emitting module 10 utilizes the lens array 15 to refract the laser light emitted from the light emitting areas 111 in the light source 11, so that the fields of view corresponding to the light emitting areas 111 can be distinguished, the field of view of the light emitting module 10 can be enlarged, and the light emitting effect of the light emitting module 10 can be optimized.
Referring to fig. 1, 3 and 19, an embodiment of the present application further provides a control method. The control method may be applied to the light emitting module 10 according to any of the above embodiments. The control method comprises the following steps:
01: determining an application scene of the depth camera 100;
02: the turn-on strategy of the light emitting module 10 is determined according to the application scenario, and the turn-on strategy includes at least one of the light emitting power of the plurality of light emitting regions 111 and the turn-on sequence of the plurality of light emitting regions 111.
Referring to fig. 3 and 20, in some embodiments, the step 02 of determining the turn-on strategy of the light emitting module 10 according to the application scenario includes:
021: when the application scene is a first scene, the plurality of light-emitting areas 111 emit light with a first light-emitting power, and the plurality of light-emitting areas 111 are simultaneously turned on;
022: when the application scene is the second scene, the plurality of light-emitting areas 111 emit light with a second light-emitting power greater than the first light-emitting power, and the plurality of light-emitting areas 111 are turned on in a time-sharing manner.
The specific execution process of the control method according to the embodiment of the present application is the same as the specific execution process of the processor 30 for determining the application scenario and the start policy, and is not described herein again.
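Putting steps 01, 02, 021 and 022 together, a minimal sketch of the turn-on strategy selection might look as follows. The power values and the shape of the returned strategy are assumptions for illustration only; the patent specifies neither concrete powers nor a data format.

```python
# Illustrative turn-on strategy selection for the control method (steps 01/02).
def turn_on_strategy(scene, num_zones=9,
                     first_power_ma=50.0,     # assumed "first light emitting power"
                     second_power_ma=150.0):  # assumed "second light emitting power"
    if scene == "first_scene":
        # 021: all light emitting areas on simultaneously at the lower first power.
        return {"power_ma": first_power_ma,
                "sequence": [tuple(range(num_zones))]}
    # 022: light emitting areas on in a time-shared order at the higher second power.
    return {"power_ma": second_power_ma,
            "sequence": [(i,) for i in range(num_zones)]}


print(turn_on_strategy("second_scene")["sequence"][:2])  # -> [(0,), (1,)]
```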
Through the zoned control of the light source 11, the control method of the embodiment of the application enables the light emitting module 10 to have a large ranging range while avoiding a serious heat generation problem.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (13)

1. A light emitting module, comprising:
a light source for emitting laser light, the light source including a plurality of light emitting areas, the plurality of light emitting areas being independently controlled;
a diaphragm arranged on a light emitting path of the light source, the diaphragm being provided with a plurality of through holes corresponding to the plurality of light emitting areas, the plurality of through holes allowing the laser light to pass through;
a lens array disposed on the light emitting path of the light source, the lens array including a plurality of lenses corresponding to the plurality of light emitting areas, each of the lenses being configured to refract the laser light to change a propagation direction of the laser light; and
a diffuser disposed on the light emitting path of the light source, the diffuser being configured to diffuse the laser light.
2. The light emitting module of claim 1, wherein each of the light emitting areas comprises one or more point light sources, and the difference between the number of the point light sources in any two of the light emitting areas is less than a predetermined threshold.
3. The light emitting module of claim 1, wherein one of the light emitting areas corresponds to one of the through holes and one of the lenses; or
a plurality of the light emitting areas correspond to one of the through holes and one of the lenses.
4. The light emitting module of claim 1, wherein the lens comprises a first face and a second face opposite to each other, the first face being closer to the light source than the second face;
at least one of the first face and the second face is a free-form surface; or
the first face and the second face are both planar.
5. The light emitting module of claim 4, wherein the light emitting area has a cross-section of a sector, a ring or a polygon.
6. The light emitting module according to any one of claims 1 to 5, wherein the diaphragm, the lens array and the diffuser are sequentially disposed along a light emitting direction of the light source, each lens is configured to refract the laser light passing through the through hole to change a propagation direction of the laser light passing through the through hole, and the diffuser is configured to diffuse the laser light passing through the lens array.
7. The light emitting module according to any one of claims 1 to 5, wherein the diaphragm, the diffuser and the lens array are sequentially disposed along a light emitting direction of the light source, the diffuser is configured to diffuse the laser light passing through the through hole, and each lens is configured to refract the laser light passing through the diffuser to change a propagation direction of the laser light passing through the diffuser.
8. A depth camera, comprising:
the light emitting module of any one of claims 1 to 7; and
a light receiving module configured to receive the laser light emitted by the light emitting module.
9. The depth camera of claim 8, further comprising a processor configured to:
determine an application scene of the depth camera; and
determine a turn-on strategy of the light emitting module according to the application scene, wherein the turn-on strategy comprises at least one of the light emitting power of the light emitting areas and the turn-on sequence of the light emitting areas.
10. The depth camera of claim 9, wherein when the application scene is a first scene, the plurality of light-emitting areas emit light with a first light-emitting power, and the plurality of light-emitting areas are simultaneously turned on;
when the application scene is a second scene, the plurality of light-emitting areas emit light with a second light-emitting power greater than the first light-emitting power, and the plurality of light-emitting areas are turned on in a time-sharing manner.
11. An electronic device, comprising:
a housing; and
the depth camera of any of claims 8-10, in combination with the housing.
12. A control method for a light emitting module, wherein the light emitting module comprises a light source, a diaphragm, a lens array and a diffuser; the light source is configured to emit laser light and comprises a plurality of light emitting areas which are independently controlled; the diaphragm is arranged on a light emitting path of the light source and is provided with a plurality of through holes corresponding to the plurality of light emitting areas, the plurality of through holes allowing the laser light to pass through; the lens array is arranged on the light emitting path of the light source and comprises a plurality of lenses corresponding to the plurality of light emitting areas, each lens being configured to refract the laser light to change a propagation direction of the laser light; the diffuser is arranged on the light emitting path of the light source and is configured to diffuse the laser light; and the control method comprises the following steps:
determining an application scene of the depth camera;
and determining a turn-on strategy of the light emitting module according to the application scene, wherein the turn-on strategy comprises at least one of the light emitting power of the light emitting areas and the turn-on sequence of the light emitting areas.
13. The control method according to claim 12, wherein when the application scene is a first scene, the plurality of light-emitting areas emit light with a first light-emitting power, and the plurality of light-emitting areas are simultaneously turned on;
when the application scene is a second scene, the plurality of light-emitting areas emit light with a second light-emitting power greater than the first light-emitting power, and the plurality of light-emitting areas are turned on in a time-sharing manner.
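For readability, the module recited in claim 1 and reused in claim 12 can be summarized as a small data structure. The sketch below is purely illustrative: the field names, the four-zone layout, the point-source counts and the drive() helper are assumptions for illustration and are not taken from the claims.

# Illustrative data-structure sketch of the module of claims 1 and 12; all field
# names, the four-zone count and the point-source counts are assumed, not claimed.
from dataclasses import dataclass
from typing import List


@dataclass
class EmittingZone:
    zone_id: int
    point_sources: int       # e.g. VCSEL point light sources inside this zone
    enabled: bool = False
    power_mw: float = 0.0


@dataclass
class LightEmittingModule:
    zones: List[EmittingZone]   # independently controlled light emitting areas
    through_holes: int          # diaphragm through holes matching the zones
    lenses: int                 # lens array entries, one per zone (first option of claim 3)
    has_diffuser: bool = True   # diffuser on the light emitting path

    def drive(self, zone_ids, power_mw):
        # Independent control of a subset of zones, leaving the others off.
        for zone in self.zones:
            if zone.zone_id in zone_ids:
                zone.enabled, zone.power_mw = True, power_mw


module = LightEmittingModule(
    zones=[EmittingZone(zone_id=i, point_sources=120) for i in range(4)],
    through_holes=4,
    lenses=4,
)
module.drive(zone_ids=[0, 2], power_mw=80.0)   # light only two of the four zones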
CN202010472389.5A 2020-05-29 2020-05-29 Light emitting module, depth camera, electronic equipment and control method Active CN111580282B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010472389.5A CN111580282B (en) 2020-05-29 2020-05-29 Light emitting module, depth camera, electronic equipment and control method

Publications (2)

Publication Number Publication Date
CN111580282A 2020-08-25
CN111580282B CN111580282B (en) 2022-05-24

Family

ID=72112708


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102109734A (en) * 2009-12-25 2011-06-29 日本精密测器株式会社 Aperture device and manufacturing method, electronic apparatus and mobile body driving device thereof
CN108333860A (en) * 2018-03-12 2018-07-27 广东欧珀移动通信有限公司 Control method, control device, depth camera and electronic device
CN108490633A (en) * 2018-03-12 2018-09-04 广东欧珀移动通信有限公司 Structured light projector, depth camera and electronic equipment
CN208351151U (en) * 2018-06-13 2019-01-08 深圳奥比中光科技有限公司 Projective module group, depth camera and electronic equipment
CN109149355A (en) * 2018-09-12 2019-01-04 Oppo广东移动通信有限公司 Light emitting mould group and its control method, TOF depth camera and electronic equipment

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111856506A (en) * 2020-08-28 2020-10-30 宁波舜宇奥来技术有限公司 Dynamic scanning light source module and partition scanning dynamic lighting method
CN114624722A (en) * 2020-11-27 2022-06-14 宁波飞芯电子科技有限公司 Distance information acquisition system
CN114563799A (en) * 2020-11-27 2022-05-31 宁波飞芯电子科技有限公司 Distance information acquisition system
CN114563798A (en) * 2020-11-27 2022-05-31 宁波飞芯电子科技有限公司 Distance information acquisition system
WO2022111501A1 (en) * 2020-11-27 2022-06-02 宁波飞芯电子科技有限公司 Distance information acquisition system
CN112965073A (en) * 2021-02-05 2021-06-15 上海鲲游科技有限公司 Partition projection device and light source unit and application thereof
CN112946604A (en) * 2021-02-05 2021-06-11 上海鲲游科技有限公司 dTOF-based detection device and electronic device and application thereof
CN113791397A (en) * 2021-08-06 2021-12-14 Oppo广东移动通信有限公司 Light emission module, depth camera and terminal
WO2023011031A1 (en) * 2021-08-06 2023-02-09 Oppo广东移动通信有限公司 Light emitting module, depth camera, and terminal
CN113791397B (en) * 2021-08-06 2024-04-26 Oppo广东移动通信有限公司 Light emission module, depth camera and terminal
CN113687371A (en) * 2021-08-09 2021-11-23 Oppo广东移动通信有限公司 Light emission module, depth camera and terminal
WO2023016078A1 (en) * 2021-08-09 2023-02-16 Oppo广东移动通信有限公司 Light emission module, depth camera, and terminal
CN114002698A (en) * 2021-10-28 2022-02-01 Oppo广东移动通信有限公司 Depth camera, method for manufacturing light emitting module and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant