CN113534484A - Light emitting device and electronic equipment - Google Patents

Light emitting device and electronic equipment

Info

Publication number
CN113534484A
CN113534484A
Authority
CN
China
Prior art keywords
light
light source
collimating lens
light sources
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011511815.8A
Other languages
Chinese (zh)
Inventor
魏文雄
王帆
俞锋
余恺
邱晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to EP21780128.1A priority Critical patent/EP4119976A4/en
Priority to PCT/CN2021/079346 priority patent/WO2021196976A1/en
Publication of CN113534484A publication Critical patent/CN113534484A/en
Priority to US17/955,261 priority patent/US20230026858A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/42Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B27/4205Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Semiconductor Lasers (AREA)

Abstract

The embodiments of this application provide a light emitting device and an electronic apparatus, relate to the field of optoelectronic devices, and can reduce the required rotation angle of the scanning turning mirror in the light emitting device while increasing the utilization of the light beams the device emits. The light emitting device includes an array light source comprising M rows and N columns of light sources, where M and N are positive integers, and the included angle between any column of the N columns and any row of the M rows is a preset angle. The array light source is located on the first side of the collimating lens; the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between that plane and the center point of the collimating lens equals the focal length of the collimating lens. The scanning turning mirror is located on the second side of the collimating lens, and the center point of its reflecting surface lies on the optical axis of the collimating lens.

Description

Light emitting device and electronic equipment
This application claims priority to Chinese patent application No. 202010246351.6, entitled "A light emitting device and electronic apparatus", filed with the China National Intellectual Property Administration on March 31, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the field of optoelectronic devices, and in particular to a light emitting device and an electronic apparatus.
Background
A three-dimensional (3D) camera (also called a depth camera) photographs a target object and can obtain not only a two-dimensional image of the target object (such as a human face) but also its depth information. The depth information comprises the distances between features on the target object and the camera, and characterizes the object's three-dimensional shape. Using the two-dimensional image and depth information obtained by shooting, the 3D camera can implement functions such as face recognition and three-dimensional map reconstruction.
A 3D camera includes a transmitting end and a receiving end, and may obtain depth information of the target object as follows. The transmitting end emits light, which is projected onto the target object and reflected by it. The receiving end receives the light reflected by the target object. The 3D camera then calculates the depth information of the target object from the time difference between the moment the transmitting end emits light and the moment the receiving end receives the reflected light.
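The time-of-flight relation implied above (depth equals half the round-trip time multiplied by the speed of light) can be sketched as follows; the numeric values are illustrative and not taken from the patent:

```python
# Time-of-flight depth from the emit/receive time difference described above.
C = 299_792_458.0  # speed of light, m/s

def depth_from_time_of_flight(delta_t_s: float) -> float:
    """Light travels to the object and back, so depth = c * delta_t / 2."""
    return C * delta_t_s / 2.0

# A round-trip delay of about 6.67 ns corresponds to roughly 1 m of depth.
print(depth_from_time_of_flight(6.67e-9))
```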
Currently, the transmitting end of a 3D camera emits light in a point scanning manner. For example, fig. 1 is a schematic structural diagram of a transmitting end based on the point scanning technique. As shown in fig. 1, the transmitting end includes a light source 110 and a scanning turning mirror 120, and the scanning turning mirror 120 includes a reflecting mirror 121. The light source emits a light beam 1, which strikes the mirror 121 on the scanning turning mirror 120 and is reflected as a light beam 2 toward the target object. In the transmitting end shown in fig. 1, the propagation direction of beam 1 emitted by the light source 110 is fixed. The mirror 121 can rotate in both the horizontal and vertical directions to adjust the angle between the mirror 121 and beam 1 (i.e., the incident angle of beam 1 on the mirror 121). As this incident angle changes, the exit angle of beam 2 changes accordingly. That is, the transmitting end controls the exit angle of beam 2 by adjusting the mirror 121, and thereby emits light in a point-scanning manner.
With point scanning, the transmitting end can emit light at different angles only by adjusting the mirror 121 many times in both the horizontal and vertical directions before the receiving end can acquire an image of the target object. For example, when scanning a Video Graphics Array (VGA) image (an image with a default resolution of 640 × 480) with a 3D camera, at least nine million points per second must be acquired, i.e., the mirror 121 must be adjusted at least nine million times per second. Moreover, each adjustment must change the mirror angle in both the horizontal and vertical directions, so the demands on the rotation angle of the scanning turning mirror and on the modulation speed of the light source 110 are extremely high, making the point scanning mode difficult to implement. In addition, the light source projects onto only one point of the target object at a time, so the receiving end acquires only one pixel per projection; the transmitting end's beam must strike the target object 640 × 480 times before the receiving end can acquire a complete VGA image. Light source utilization in the point scanning mode is therefore low.
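The point-scan figures quoted above can be checked with a quick calculation; the 30 fps frame rate below is an assumed value, since the text only states the per-second total:

```python
# Quick check of the point-scanning numbers quoted above.
# The 30 fps frame rate is an assumption, not stated in the text.
width, height, fps = 640, 480, 30   # VGA resolution
points_per_frame = width * height   # one mirror adjustment per projected point
points_per_second = points_per_frame * fps

print(points_per_frame)    # 307200 adjustments for one full VGA image
print(points_per_second)   # 9216000, i.e. "at least nine million points per second"
```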
Disclosure of Invention
This application provides a light emitting device and an electronic apparatus that can reduce the required rotation angle of the scanning turning mirror in the light emitting device and increase the utilization of the light beams the device emits.
To achieve this, the present application adopts the following technical solutions:
in a first aspect, the present application provides a light emitting device, which may include an array light source comprising M × N light sources arranged in M rows and N columns, where M and N are positive integers. Among the N columns of light sources, the spacing between two adjacent columns is a first preset distance; among the M rows, the spacing between two adjacent rows is a second preset distance; and the included angle between any column of the N columns and any row of the M rows is a preset angle.
The array light source can emit K light beams, where K ≥ 1 and K is a positive integer. The array light source is located on the first side of the collimating lens; the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between that plane and the center point of the collimating lens equals the focal length of the collimating lens. The collimating lens converts the K light beams emitted by the array light source into K first collimated light beams.
The scanning turning mirror is located on the second side of the collimating lens and rotates in one dimension; the center point of its reflecting surface lies on the optical axis of the collimating lens. The reflecting surface reflects the K first collimated light beams into K second collimated light beams. As the scanning turning mirror rotates in one dimension, one beam array is projected at a time, and the scan is completed over multiple projections to reach the preset resolution.
The optical beam splitter receives the K second collimated light beams and splits them into i × K third collimated light beams, where i ≥ 2 and i is a positive integer.
In one aspect, the light source in the light emitting device is an array light source comprising M × N light sources, arranged either as M rows and N columns or as N rows and M columns, and emitting K light beams with K ≥ 1. Compared with point scanning, rotating the scanning turning mirror in only one direction suffices to form a two-dimensional beam array, which reduces the required rotation angle of the scanning turning mirror.
On the other hand, the optical beam splitter in the light emitting device can split 1 light beam into i light beams, increasing the number of beams the device projects. Each projected beam corresponds to one point on the target object; thus, after 1 beam is split into i beams, that 1 beam corresponds to i points on the target object. Compared with point scanning, the light emitting device therefore improves the utilization of the light source. In addition, since one light source emits 1 beam that is split into i beams, and each beam corresponds to one pixel, each light source can correspond to i pixels; the light sources therefore do not need to be closely packed, which reduces the processing and installation difficulty of the light source.
In summary, the array light source in the light emitting device provided by this application is easier to process and install, the device is easy to realize, and the device improves the utilization of the light source.
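The beam-count bookkeeping behind this utilization argument can be sketched as follows; M, N, and i are illustrative example values, not fixed by the patent:

```python
# Beam counts through the proposed emitter (illustrative values).
M, N = 8, 8            # array light source: M rows x N columns
K = M * N              # all sources on -> K emitted beams (K >= 1)
i = 4                  # beams produced per input beam by the splitter (i >= 2)

third_beams = i * K    # beams leaving the optical beam splitter
pixels_per_source = third_beams // K

print(K, third_beams)      # 64 256
print(pixels_per_source)   # 4: each source covers i pixels instead of 1
```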
In one possible embodiment, the optical beam splitter may include: at least one of a one-dimensional grating, a two-dimensional diffractive optical element, and a prism film. That is, the optical beam splitter may be a one-dimensional grating, a two-dimensional diffractive optical element, or a prism film, or may be composed of two or more elements. For example, the optical beam splitter may be a combination of a one-dimensional grating and a prism film.
In another possible embodiment, the preset angle between any column of the N columns of light sources and any row of the M rows of light sources is an acute angle.
In another possible embodiment, when the optical beam splitter is a prism film, the incident surface of the splitter is a plane and the exit surface has a prism film structure. The prism film structure comprises i beam-splitting surfaces that split 1 light beam into i light beams with different propagation directions.

It can be understood that the prism film structure is an optical film with microprisms on its surface. The microprisms split 1 light beam into i light beams, enabling the optical beam splitter to split 1 beam into i.
In another possible embodiment, when the optical beam splitter is a two-dimensional diffractive optical element, the element splits one light beam into a beam matrix comprising i light beams.
In another possible embodiment, when the optical beam splitter is a one-dimensional grating, the grating splits one beam into i beams; when the i beams are projected onto the same plane, their projection points lie on a straight line.
In another possible embodiment, each light source in the array light source may be an edge-emitting semiconductor laser, a vertical-cavity surface-emitting laser (VCSEL), a fiber laser, or a solid-state laser.
The light beams emitted by the light sources in the array light source may have any wavelength. The array light source may emit a beam from any single light source, or beams from the light sources in any column or any row of the array.
In another possible embodiment, the light emitting device further comprises a controller connected to the scanning turning mirror. The controller receives a control signal and transmits it to the scanning turning mirror; the control signal instructs the scanning turning mirror to adjust the angle of its reflecting surface. The scanning turning mirror receives the control signal and adjusts the angle of the reflecting surface accordingly, thereby adjusting the propagation direction of the K second collimated light beams.
In another possible embodiment, the scanning turning mirror includes a micro-electro-mechanical system (MEMS) mirror or a digital micromirror device (DMD).
In a second aspect, the present application also provides an electronic device comprising the light emitting apparatus of the first aspect and any of its possible embodiments, where the light beam emitted by the light emitting apparatus is projected onto the target object and reflected by it. The electronic device further comprises a receiving apparatus for receiving the light beam reflected by the target object.
In one possible embodiment, the receiving apparatus includes a receiving lens and an image sensor, where the optical axis of the receiving lens is perpendicular to the plane of the image sensor. The receiving lens receives the light beam reflected by the target object and refracts it into a refracted beam; the refracted beam strikes the image sensor, so that the target object is imaged on the image sensor.
In another possible embodiment, the distance between the image sensor and the receiving lens is less than twice the focal length of the receiving lens.
In another possible embodiment, the image sensor has E × F pixels, where E and F are both positive integers, and the image sensor comprises j detectors, where j < E × F and j is a positive integer.

Although the number of detectors in the image sensor is smaller than the number of pixels, an image with the corresponding resolution can still be formed on the image sensor; that is, the number of detectors can be reduced, lowering the cost.
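A back-of-the-envelope sketch of how j detectors can cover an E × F image over multiple scan positions; all numbers below are illustrative assumptions, not values from the patent:

```python
# Fewer detectors than pixels: reuse them across scan positions (illustrative).
E, F = 480, 640            # pixel resolution of the final image (E x F pixels)
j = 480                    # e.g. one column of detectors, j < E * F

assert j < E * F
scan_positions = (E * F) // j   # steps needed if each detector fills one pixel per step
print(scan_positions)           # 640 one-dimensional scan steps
```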
In another possible embodiment, the detector comprises at least one single photon detector.
In another possible embodiment, the receiving device further comprises an optical filter; the optical filter is arranged between the receiving lens and the image sensor and is parallel to the plane of the image sensor. The filter is used for filtering out ambient light in the refracted light beam.
In a third aspect, the present application further provides a method of emitting a light beam, applicable to the light emitting device of the first aspect and any of its possible embodiments, where the light emitting device includes an array light source, a collimating lens, a scanning turning mirror, an optical beam splitter, and a processor. The array light source comprises M × N light sources arranged in M rows and N columns, where M and N are positive integers; among the N columns of light sources, the spacing between two adjacent columns is a first preset distance; among the M rows, the spacing between two adjacent rows is a second preset distance; and the included angle between any column of the N columns and any row of the M rows is a preset angle. The method may include: the processor controls the array light source to emit K light beams, where K ≥ 1 and K is a positive integer. The array light source is located on the first side of the collimating lens; the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between that plane and the center point of the collimating lens equals the focal length of the collimating lens. The K light beams emitted by the array light source propagate to the collimating lens, which converts them into K first collimated light beams. The scanning turning mirror is located on the second side of the collimating lens, rotates in one dimension, and the optical axis of the collimating lens passes through its reflecting surface.
The K first collimated light beams propagate to the scanning turning mirror; the processor controls the scanning turning mirror to reflect them into K second collimated light beams. The optical beam splitter splits the K second collimated light beams into i × K third collimated light beams and emits them, where i ≥ 2 and i is a positive integer.
In a possible embodiment, the predetermined angle is an acute angle.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium including computer instructions. When the computer instructions are run on an electronic device as in the first aspect and any one of its possible implementations, the electronic device may implement the method of emitting a light beam as in the third aspect described above.
It is to be understood that the above-mentioned advantages achieved by the electronic device of the second aspect, the method for emitting a light beam of the third aspect, and the computer-readable storage medium of the fourth aspect can be referred to the advantages of the first aspect and any possible implementation thereof, and therefore, the detailed description thereof is omitted here.
Drawings
Fig. 1 is a schematic structural diagram of an emitting end in a point scanning manner according to the present application;
fig. 2 is a schematic structural diagram of a diffractive optical element according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an emitting end in a line scanning manner according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a light emitting device according to an embodiment of the present disclosure;
fig. 5A is a schematic structural diagram of an array light source according to an embodiment of the present disclosure;
fig. 5B is a schematic structural diagram of another array light source provided in the present embodiment;
fig. 6 is a schematic structural diagram of a collimating lens according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of an optical beam splitter according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram illustrating a light exit angle of a light emitting device according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a receiving apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic view of an acquisition scan of a receiving device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a detector provided in an embodiment of the present application;
FIG. 12 is a schematic diagram of a detector according to an embodiment of the present disclosure;
FIG. 13 is a schematic diagram of another detector scan acquisition result provided in an embodiment of the present application;
FIG. 14 is a schematic diagram of another detector scan acquisition result provided by an embodiment of the present application;
fig. 15A is a schematic structural diagram of another receiving apparatus according to an embodiment of the present disclosure;
fig. 15B is a schematic diagram of a light beam collected on an image sensor according to an embodiment of the present disclosure;
fig. 16 is a schematic diagram of a light spot received by a detector in a receiving apparatus according to an embodiment of the present application;
FIG. 17 is a flow chart of a method for emitting a light beam according to an embodiment of the present disclosure;
fig. 18 is a schematic structural diagram of a 3D camera according to an embodiment of the present disclosure;
fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
Terms that may appear in the embodiments of the present application are explained below.
Diffractive optical element (DOE): also known as a binary optical element, a DOE has a specially designed surface structure that diffracts light beams propagating to it.
Specifically, DOEs with different surface structures perform different functions. The DOE used in this application may have a one-dimensional lattice beam-splitting function or a two-dimensional lattice beam-splitting function. A DOE with a one-dimensional lattice beam-splitting function splits 1 light beam into a plurality of beams along one direction; for example, as shown in fig. 2(a), beam 1 is split by the DOE into 4 beams in the vertical direction. A DOE with a two-dimensional lattice beam-splitting function splits 1 light beam into a matrix of beams; for example, as shown in fig. 2(b), beam 1 is split by the DOE into an 8 × 8 array of beams.
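The two DOE splitting patterns of fig. 2 can be sketched as sets of output-beam indices; this is a toy model of the order layout, not the actual diffraction physics:

```python
# Toy model of the DOE beam-splitting patterns in fig. 2:
# indices of the output beams, not actual diffraction angles.
def split_1d(n):
    """One beam into n beams along one direction, as in fig. 2(a)."""
    return [(0, k) for k in range(n)]

def split_2d(rows, cols):
    """One beam into a rows x cols matrix of beams, as in fig. 2(b)."""
    return [(r, c) for r in range(rows) for c in range(cols)]

print(len(split_1d(4)))     # 4 beams in the vertical direction
print(len(split_2d(8, 8)))  # 64 beams in an 8 x 8 matrix
```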
Collimation: light rays are generally divergent, i.e., two adjacent rays move farther apart as they propagate. Collimation keeps the rays within a beam parallel to one another.
Single-photon detector (SPAD): once a photon reaches the SPAD surface, the detector is triggered and detects the optical signal. After being triggered, the SPAD returns to its initial (non-triggered) state after a certain time. A SPAD can only detect whether photons strike its surface, not how many.
In a 3D camera whose transmitting end emits light in a point scanning mode, the requirements on the rotation angle of the scanning turning mirror and on the modulation speed of the light source are both very high, making the scheme difficult to implement. Alternatively, the transmitting end can emit light in a line scanning mode: the transmitting end emits a plurality of light beams along one direction while its scanning turning mirror rotates in the other direction, so that the receiving end can acquire an image of the target object. For example, in the line scanning mode, the transmitting end may emit a plurality of beams in the vertical direction while the scanning turning mirror rotates in the horizontal direction. Emitting light in a line scanning mode thus reduces the requirement on the rotation angle of the scanning turning mirror.
For example, fig. 3 is a schematic structural diagram of a transmitting end based on the line scanning technique. As shown in fig. 3, the transmitting end includes a linear light source 210, a lens 220, and a scanning turning mirror 230. The linear light source 210 is a plurality of light sources arranged along one direction. In the line scanning mode shown in fig. 3, the linear light source 210 emits light beams that are projected outward through the lens 220 and the scanning turning mirror 230. Taking a VGA image as an example, the linear light source 210 emits 480 light beams in the vertical direction, and the scanning turning mirror rotates in the horizontal direction through 640 angle adjustments, allowing the receiving end to acquire a complete VGA image.
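Comparing the mirror-adjustment counts for point scanning and line scanning of one VGA image makes the reduction concrete:

```python
# Mirror adjustments needed to scan one VGA image (figures from the text).
width, height = 640, 480

point_scan_adjustments = width * height  # mirror re-aimed for every pixel
line_scan_adjustments = width            # 480 beams at once; rotate horizontally only

print(point_scan_adjustments)  # 307200
print(line_scan_adjustments)   # 640
```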
It can be understood that the line scanning mode lowers the requirement on the rotation angle of the scanning turning mirror and reduces the implementation difficulty of the transmitting end. However, in this line scanning mode, each beam emitted by the linear light source lets the receiving end collect one point of the target object, and each point corresponds to one pixel of the image. To meet the scanning accuracy requirement, adjacent light sources on the linear light source must be closely packed; this arrangement increases the processing and installation difficulty of the linear light source and is hard to realize.

In the point scanning mode, 1 light beam emitted by the light source is projected onto one point of the target object; the receiving end collects the light reflected from that point and forms one pixel of the image. That is, each emitted beam ultimately corresponds to one pixel. The line scanning mode does not solve this problem of low light source utilization either.
The embodiments of this application provide a light emitting device whose light source is an array light source, i.e., the light sources are arranged in multiple rows and multiple columns. The array light source does not require close packing, which reduces the processing and installation difficulty of the light source. Moreover, the light emitting device includes an optical beam splitter that can split 1 light beam into i light beams, where i is a positive integer and i ≥ 2, so 1 beam emitted by a light source can be projected onto i points of the target object, improving the utilization of the light source.
The light emitting device provided by the embodiments of this application can be used in a 3D camera, a 3D scanner, or other devices that use 3D sensing technology.
If the light emitting device in the embodiments of this application serves as the transmitting end of a 3D camera, the 3D camera may be disposed on an electronic device. The electronic device in the embodiments of this application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; the embodiments of this application do not particularly limit the specific form of the electronic device.
Please refer to fig. 4, which is a schematic structural diagram of a light emitting device according to an embodiment of the present disclosure. The light emitting device includes: an array light source 401, a collimating lens 402, a scanning turning mirror 403 and an optical beam splitter 404.
As shown in fig. 4, the array light source 401 is located on the first side S1 of the collimating lens 402, the plane of the array light source 401 is perpendicular to the optical axis of the collimating lens 402, and the distance between the plane of the array light source 401 and the center point of the collimating lens 402 is the focal length of the collimating lens 402. The scanning turning mirror 403 is located on the second side S2 of the collimator lens 402, and the center point of the reflecting surface of the scanning turning mirror 403 is on the optical axis of the collimator lens 402.
The array light source 401 includes M × N light sources. When K light sources in the array light source 401 are in a working state, the array light source 401 can emit K light beams, where K is greater than or equal to 1 and K is a positive integer. That is, the array light source 401 may emit a plurality of light beams. The K light beams may be emitted by K light sources of the M × N light sources. The plane of the array light source 401 is perpendicular to the optical axis of the collimating lens 402, that is, the array light source 401 faces the collimating lens 402. The K light beams emitted from the array light source 401 propagate along the optical axis to the collimating lens 402. The collimator lens 402 functions to collimate the light beam. Thus, the collimating lens 402 can convert the K beams into K collimated beams (i.e., first collimated beams). The central point of the reflecting surface of the scanning turning mirror 403 is on the optical axis of the collimating lens 402, so that the K first collimated light beams can propagate to the scanning turning mirror 403, and the reflecting surface of the scanning turning mirror 403 is used for reflecting the K first collimated light beams into K second collimated light beams.
For example, the arrangement of the M × N light sources may be M rows of light sources and N columns of light sources, or may be N rows of light sources and M columns of light sources. For example, the array light source 401 is a light source with M rows and N columns. The interval between two adjacent columns of light sources in the N columns of light sources is a first preset distance; the interval between two adjacent rows of light sources in the M rows of light sources is a second preset distance; an included angle between any one of the N columns of light sources and any one of the M rows of light sources is a preset angle.
For example, assume that M and N are both 8, as shown in fig. 5A, which is a schematic structural diagram of the array light source 401. In fig. 5A, the light sources on the line L1 form the first row, the light sources on L2 the second row, the light sources on L3 the first column, and the light sources on L4 the second column. The interval between two adjacent columns is the interval between adjacent light sources in the same row, and the interval between two adjacent rows is the interval between adjacent light sources in the same column. As in fig. 5A, the spacing between L1 and L2 is the distance X1 between adjacent light sources on L3, and the spacing between L3 and L4 is the distance X2 between adjacent light sources on L1. The included angle between the first row L1 and the first column L3 is θ. If the vertical offset between two adjacent light sources in the first row L1 is X3 — that is, the second column of light sources is shifted down by X3 with respect to the first column — then θ can be expressed as:

θ = arctan(X2 / X3)
In this embodiment, the included angle between any row of light sources and any column of light sources in the array light source 401 is a preset angle, and the preset angle may be an acute angle, that is, less than 90°. If the light sources in each column are arranged in sequence along the vertical direction, the light sources in each row are then not on the same horizontal line. As in fig. 5A, the individual light sources in each row are not arranged on the same horizontal line.
In some embodiments, the array light source 401 is 8 × 8 light sources, X1 is 40 μm (micrometers), X2 is 40 μm, and X3 is 5 μm, where each column of light sources is shifted down by 5 μm compared with the previous column, as shown in fig. 5A. Specifically, the light sources in the array light source 401 may be lasers.
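The preset angle for these example dimensions can be checked numerically. The sketch below assumes the relation tan θ = X2/X3 implied by the geometry described above (the formula image in the original is not legible, so this reading is an assumption); the function name and units are illustrative.

```python
import math

def preset_angle_deg(x2_um: float, x3_um: float) -> float:
    """Angle between a row line and a (vertical) column line, in degrees.

    Assumes each successive column is shifted down by x3 while columns are
    spaced x2 apart horizontally, so the row line deviates from horizontal
    by arctan(x3/x2) and meets the vertical column line at
    theta = 90 deg - arctan(x3/x2) = arctan(x2/x3).
    """
    return math.degrees(math.atan2(x2_um, x3_um))

# X2 = 40 um and X3 = 5 um from the example in the text.
theta = preset_angle_deg(40.0, 5.0)
```

With these values θ comes out just under 83°, an acute angle, consistent with the statement that the preset angle is less than 90°.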
As another example, the light sources in the array light source 401 may also be arranged as shown in fig. 5B. Taking the array light source 401 as 8 × 8 light sources as an example: as shown in fig. 5B (a), the included angle between any one of the N columns of light sources and any one of the M rows of light sources is 90°. As shown in fig. 5B (b), each subsequent row of light sources in the array light source 401 is moved rightward by a preset distance compared with the previous row, and the included angle between any one of the N columns and any one of the M rows is a preset angle between 0° and 90°. As shown in fig. 5B (c), in the first column of light sources, denoted by L in the array light source 401, the light sources in a column need not lie on the same vertical line.
It can be understood that, when the array light source emits light beams, any light source in the array light source may emit a light beam, and any row or any column of light sources in the array light source may emit light beams. Alternatively, only some of the light sources in the array light source emit light beams; for example, if the array light source includes 8 × 8 light sources, 4 × 4 of them may be controlled to emit light beams.
The light sources in the array light source may be at least one of an edge-emitting semiconductor laser, a vertical-cavity surface-emitting laser (VCSEL), a fiber laser, or a solid-state laser. The light beams emitted by the light sources in the array light source may be of any wavelength. If the light emitting device is used as the emitting end of a 3D camera, however, the wavelength of the light beams emitted by the array light source needs to be predetermined, so that the receiving end of the 3D camera can determine that a received light beam was emitted by the emitting end.
In the embodiment of the present application, the collimating lens 402 is used to convert the light beam emitted from the light source into a first collimated light beam.
In some embodiments, the collimating lens 402 in the present example may be a single lens, which refers to a lens formed from one piece of optical glass. That is, the collimating lens 402 may be a convex lens. In other embodiments, the collimating lens 402 can be formed by multiple optical elements, such as a collimating lens that is made up of multiple lenses. The collimating lens 402 may convert the light beam emitted by the light source into a collimated light beam.
The distance between the plane of the array light source 401 and the center point of the collimating lens 402 is the focal length of the collimating lens 402, that is, the plane of the array light source 401 includes the focal point of the collimating lens 402. Therefore, the light beam emitted from the array light source 401 passes through the collimator lens 402 and becomes a collimated light beam. However, since the position of each light source in the array light source 401 is different, the position of the light beam emitted by each light source entering the collimating lens 402 is also different, and after the collimating lens 402 converts the light beam emitted by the light source into the first collimated light beam, the exit angle of the first collimated light beam is also different.
For example, it is assumed that the collimating lens 402 is a single lens, i.e., the collimating lens 402 is a convex lens. The focal length of the convex lens may be 3.4mm (millimeters).
Fig. 6 is a schematic diagram of the collimating lens 402. The plurality of light beams represent light beams emitted by light sources at different positions on the array light source 401. A light beam emitted from a light source becomes a first collimated beam after passing through the collimating lens 402, and the first collimated beam passes through the focal point of the collimating lens 402 on the second side. As shown in fig. 6, light beam 1 is converted by the collimating lens 402 and propagates in the exit direction of the collimating lens 402. The exit angle of a light beam emitted by a light source, after passing through the collimating lens 402, is expressed as:
αi = arctan(Li / f)

where αi denotes the exit angle of the light beam emitted from the light source, Li denotes the distance between the light beam and the optical axis of the collimating lens 402, and f denotes the focal length of the collimating lens 402.
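The relation between the source offset and the exit angle can be checked numerically. This sketch assumes the exit angle follows αi = arctan(Li/f), consistent with the definitions of αi, Li, and f above; only the focal length of 3.4 mm comes from the text, and the offset values fed in are hypothetical.

```python
import math

def exit_angle_deg(l_i_mm: float, f_mm: float = 3.4) -> float:
    """Exit angle alpha_i = arctan(L_i / f), in degrees, for a light source
    offset L_i from the optical axis of a collimating lens of focal length f
    (f = 3.4 mm in the single-lens example)."""
    return math.degrees(math.atan(l_i_mm / f_mm))

# Sources farther from the optical axis leave the lens at larger angles
# (offsets in mm are illustrative, not values from the text).
angles = [exit_angle_deg(l) for l in (0.0, 0.04, 0.28)]
```

A source on the optical axis exits at 0°, and the angle grows monotonically with the offset, matching the statement that each light source produces a first collimated beam with a different exit angle.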
The collimating lens 402 converts the light beam emitted by the array light source 401 into a first collimated light beam, which propagates along the exit direction. The array light source 401 is located at a first side of the collimating lens 402, light beams emitted by the array light source 401 are incident to the collimating lens 402 from the first side, and after the collimating lens 402 converts the light beams into first collimated light beams, the first collimated light beams propagate at a second side of the collimating lens 402 along an emergent direction. The scanning turning mirror 403 is located on the second side of the collimating lens 402, and the center point of the reflecting surface of the scanning turning mirror 403 is on the optical axis of the collimating lens 402.
After the first collimated light beam propagates to the scanning rotating mirror 403, the first collimated light beam is reflected by the reflecting surface of the scanning rotating mirror 403 to be a second collimated light beam. That is, the scanning turn mirror 403 is used to change the propagation direction of the first collimated beam.
In some embodiments, a controller is coupled to the scanning turning mirror 403 and is configured to receive a control signal and transmit it to the scanning turning mirror 403, where the control signal instructs the scanning turning mirror 403 to adjust the angle of its reflecting surface. The scanning turning mirror 403 receives the control signal and adjusts the angle of the reflecting surface according to the control signal, so as to adjust the propagation direction of the second collimated light beam.
Illustratively, the scanning turning mirror 403 may be a micro-electro-mechanical system (MEMS) mirror, or the scanning turning mirror may be a digital micro-mirror device (DMD).
For example, the MEMS mirrors are driven electrostatically or electromagnetically. Specifically, the control signal may be an electrostatic signal, and the electrostatic drive is generated by the electrostatic signal to control the angle of the MEMS mirror. Alternatively, the control signal is a current signal, and the electromagnetic drive is generated by the change of the current to control the angle of the MEMS mirror. Since the MEMS mirror is small in size, light in weight, and short in response time, the light emission efficiency of the light emitting device can be improved.
As another example, a DMD is composed of a plurality of micromirrors and is driven by a digital signal. Specifically, the control signal may be a digital signal, and the angle of the DMD micromirrors is controlled by the digital signal to adjust the propagation direction of the second collimated light beam.
The scanning rotating mirror may also be a mirror driven by a motor. Moreover, the scanning rotating mirror can realize one-dimensional rotation (that is, rotation about one axis) or two-dimensional rotation (that is, rotation about two axes). When the scanning rotating mirror rotates in one dimension, one array can be projected at a time, and scanning is completed through multiple projections, so as to achieve the preset resolution. In addition, the scanning rotating mirror may be circular, rectangular, polygonal, or another shape.
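How a small tilting mirror steers the scan can be sketched with the standard plane-mirror relation: rotating the mirror by δ rotates the reflected beam by 2δ. This is a general optics fact rather than a formula given in the text, and the angles used below are illustrative.

```python
def reflected_direction_deg(incident_deg: float, mirror_tilt_deg: float) -> float:
    """Direction of the reflected (second collimated) beam for a plane mirror.

    Rotating a plane mirror by delta rotates the reflected beam by 2*delta,
    which is how a MEMS mirror or DMD steers the projected pattern without
    moving the light source. Angles are measured from a fixed reference axis,
    with tilt 0 defined as the position that reflects the beam to 0 degrees.
    """
    return incident_deg + 2.0 * mirror_tilt_deg

# One-dimensional scan: stepping the mirror tilt sweeps the projected columns.
scan = [reflected_direction_deg(0.0, t) for t in (0.0, 0.5, 1.0)]
```

Each 0.5° mirror step moves the beam by 1°, so a small, fast mirror covers a wide scan range with small mechanical motion.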
In the light emitting device, the optical beam splitter 404 is used to split 1 light beam into i light beams, where i is greater than or equal to 2, and i is a positive integer. The optical beam splitter 404 includes an incident surface from which 1 light beam enters the optical beam splitter 404 and an exit surface from which i light beams are emitted.
The optical beam splitter may be a one-dimensional grating, a two-dimensional diffractive optical element, or a prism film, among others. The one-dimensional grating can split one light beam into i light beams, and the number of the light beams split is related to the number of the gratings of the one-dimensional grating. The beam splitting effect of the one-dimensional grating can be referred to as a beam splitting pattern of the diffractive optical element shown in (a) in fig. 2. The two-dimensional diffractive optical element may split one light beam into a light beam matrix including i light beams. The beam splitting effect of the two-dimensional diffractive optical element can be referred to the beam splitting pattern of the diffractive optical element shown in (b) in fig. 2.
In some embodiments, when the optical beam splitter is a prism film, the exit face of the optical beam splitter 404 has a prism film structure, that is, a thin film with a prism-shaped surface. Taking triangular prisms as an example, as shown in fig. 7 (a), the incident surface 70 is a plane and the exit surface 71 includes 8 prisms; since the angles between the prisms and the horizontal plane differ, the propagation directions of the light beams emitted from the prisms also differ. As shown in fig. 7 (a), the angle between prism 1 and the horizontal plane is β1, and the angle between prism 2 and the horizontal plane is −β1; the angle between prism 3 and the horizontal plane is β2, and between prism 4 and the horizontal plane is −β2; the angle between prism 5 and the horizontal plane is β3, and between prism 6 and the horizontal plane is −β3; the angle between prism 7 and the horizontal plane is β4, and between prism 8 and the horizontal plane is −β4. As shown in fig. 7 (b), when 1 light beam propagates to the optical beam splitter 404, the optical beam splitter 404 can split the 1 incident light beam into 8 outgoing light beams. In fig. 7 (b), I1 represents the 1 incident light beam, and O1 through O8 represent outgoing light beams 1 through 8; the arrows in the figure represent the propagation directions of the rays, and the propagation directions of the outgoing light beams differ from one another.
For example, among the 8 prisms on the exit surface 71 shown in fig. 7 (a), assume β1 is 5.4°, β2 is 15.5°, β3 is 24.2°, and β4 is 31.5°. It will be appreciated that if the array light source 401 is 8 × 8 light sources and each light source emits a light beam, the light beams emitted by the array light source 401 propagate to the collimating lens 402, and the collimating lens 402 converts the 8 × 8 light beams into 8 × 8 first collimated light beams. The 8 × 8 first collimated light beams propagate to the reflecting surface of the scanning turning mirror 403, which reflects them into 8 × 8 second collimated light beams. The 8 × 8 second collimated light beams propagate in the reflection direction to the optical beam splitter 404, which can split them into 8 × 8 × 8 (512) light beams. The exit angles of the 512 light beams are shown in fig. 8: the light beam emitted by each light source in a column is split into 8 light beams (64 × 8 light beams in total), and the angle in the vertical direction represents the exit angle of a light beam in the longitudinal direction.
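The beam-count bookkeeping in this example can be expressed compactly. A minimal sketch, with the 8 × 8 × 8 figures taken from the text and the function name illustrative:

```python
def emitted_beam_count(m: int, n: int, i: int) -> int:
    """Total beams leaving the device: an M x N array of sources, with the
    optical beam splitter splitting each beam into i beams."""
    return m * n * i

# 8 x 8 sources, each beam split into 8: 512 beams projected at once.
total = emitted_beam_count(8, 8, 8)
```

The same function covers the later example of a 3 × 8 array with a 1-into-2 splitter (3 × 16 = 48 beams), showing how the splitter multiplies source utilization without adding light sources.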
When the optical beam splitter is a prism film, the optical beam splitter may be made of glass, polymethyl methacrylate (PMMA), Polycarbonate (PC), polyimide film (PI), or other optical plastics. In particular, it can be processed by etching, embossing, microreplication, etc.
In other embodiments, the optical beam splitter 404 may be an optical beam splitter 404 with a one-dimensional lattice beam splitting function, or may be an optical beam splitter 404 with a two-dimensional lattice beam splitting function.
Note that the array light source 401 includes M × N light sources. When the array light source 401 is in an operating state, all of the M × N light sources may be controlled to emit light, or some of the M × N light sources may be controlled to emit light. For example, a first column of light sources in the array light source 401 is controlled to emit light, a first row of light sources in the array light source 401 is controlled to emit light, or a certain light source in the array light source 401 is controlled to emit light, etc.
In the embodiment of the present application, the array light source 401 in the light emitting device can emit K light beams, where 2 ≤ K ≤ M × N and K is a positive integer. The K light beams propagate to the collimating lens 402, which converts them into K first collimated light beams. The K first collimated light beams propagate to the reflecting surface of the scanning turning mirror, which reflects them into K second collimated light beams. The K second collimated light beams propagate to the optical beam splitter 404, which splits them into i × K third collimated light beams. The i × K third collimated light beams propagate along the propagation direction to the target object and are reflected or refracted by it.
Generally, when a target object is photographed by a 3D camera, the transmitting end projects the emitted light beams onto the target object, the light beams projected onto the target object are reflected by it, and the receiving device receives the light reflected from the target object and generates an image of the target object according to the received light beams. Each light beam is projected onto one point of the target object, and for each light beam the receiving device generates a corresponding pixel point on the image of the target object according to the light reflected by the target object.
Referring to fig. 9, an embodiment of the present application further provides a receiving apparatus. As shown in fig. 9, the receiving apparatus includes: a receiving lens 901 and an image sensor 902. Wherein the optical axis of the receiving lens 901 is perpendicular to the plane of the image sensor 902. The receiving lens 901 is used for receiving the light beam reflected by the target object and refracting the light beam reflected by the target object into a refracted light beam; wherein the refracted light beam impinges on image sensor 902 such that the target object is imaged on image sensor 902.
The receiving device is used for receiving light reflected by the target object, and if light in the environment is irradiated onto the target object and reflected by the target object, the light can also enter the receiving device, so that the target object is imaged on the image sensor 902. Wherein light in the environment may affect the image of the target object acquired by the receiving device.
In some embodiments, the receiving device may further include an optical filter; the optical filter is disposed between the receiving lens 901 and the image sensor 902, and the optical filter is parallel to the plane of the image sensor 902. The filter is used for filtering out ambient light in the refracted light beam.
Specifically, the light emitting device is used for projecting a light beam to a target object, and the receiving device provided by the embodiment of the application is used for receiving the light beam reflected by the target object. If the light emitting device and the receiving device can work together, the image of the target object can be shot.
The image sensor 902 on the receiving device includes detectors for collecting the photons reflected onto them. The image sensor 902 determines the pixel at a location from the photons reflected onto a detector. It should be noted that the detector here is a detector in the functional sense, that is, the detector is used to identify whether photons arrive at a pixel. A detector may comprise one SPAD or a plurality of SPADs. If a detector includes one SPAD, the detector only identifies photons at that one pixel position. If a detector comprises a plurality of SPADs, each SPAD detects photons at one pixel. In addition, the detector may also comprise a silicon avalanche photodiode (SiAPD) or an avalanche photodiode (APD).
In a first implementation, one detector comprises one SPAD. Since one detector includes one SPAD, the number of detectors in the image sensor 902 in the receiving device corresponds to the pixels of the generated image. For example, if the number of detectors in the receiving device is 640 × 480, the resolution of the image generated by the receiving device is also 640 × 480.
For example, assume that the array light source 401 in the light emitting device is a 3 × 8 array light source and the optical beam splitter 404 splits 1 light beam into 2 light beams; the light emitting device can therefore emit 3 × 16 light beams. Specifically, one detector comprises one SPAD, and the photons collected by each detector correspond to one pixel point on the generated image. Fig. 10 is a schematic diagram after three acquisitions by the detectors; each square in fig. 10 (only a part of the detectors is shown) corresponds to one pixel on the image. When the scanning rotating mirror 403 is at the first angle, at time T0, the detectors acquire for the first time the positions of the pixels corresponding to the photons reflected by the target object (e.g., the first-acquired pixels in the first, second, and third columns in fig. 10); when the scanning rotating mirror 403 has rotated by a preset angle, at time T1, the detectors acquire for the second time the positions of the pixels corresponding to the photons reflected by the target object (e.g., the second-acquired pixels in the second, third, and fourth columns in fig. 10); when the scanning rotating mirror 403 has rotated again by the preset angle, at time T2, the detectors acquire for the third time the positions of the pixels corresponding to the photons reflected by the target object (e.g., the third-acquired pixels in the third, fourth, and fifth columns in fig. 10).
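The three acquisitions can be mimicked by shifting the set of illuminated pixel columns once per mirror step. This is a sketch assuming a one-column shift per step, with hypothetical column indices; the actual step size depends on the preset mirror angle.

```python
def acquired_columns(start_cols, steps):
    """Image columns filled at each scan step, assuming the scanning mirror
    shifts the projected pattern by one pixel column per step (an
    illustrative step size, not specified in the text)."""
    return [[c + s for c in start_cols] for s in range(steps)]

# Beams initially land on columns 0, 1, 2 (the "first, second, third
# columns"); each mirror rotation moves the pattern one column over.
frames = acquired_columns([0, 1, 2], 3)
```

After the three steps the union of acquired columns spans columns 0 through 4, matching the staircase pattern of fig. 10 where successive acquisitions overlap but shift by one column.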
In a second implementation, one detector includes a plurality of SPADs. Illustratively, pixels of image sensor 902 are E F, and E, F are all positive integers. The image sensor 902 includes j detectors, where j is greater than or equal to 2, j is smaller than E × F, and j is a positive integer.
Please refer to fig. 11, which is a schematic structural diagram of a detector according to an embodiment of the present disclosure. As shown in fig. 11, one detector includes 8 × 8 SPADs (not all of which are shown in the drawing).
It should be noted that a SPAD can only recognize that photons arrived, not the number of photons. When a SPAD recognizes a photon, the SPAD is triggered and needs a preset time to be restored to its initial state. When a detector includes multiple SPADs, the detector is in the triggered state if any SPAD is triggered. Therefore, in the embodiment of the application, the state of the detector can be controlled in a time-sharing working mode.
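The trigger behavior described here can be sketched as a small model: a detector is a group of SPADs, and the detector counts as triggered whenever any of its SPADs has fired. The class and method names are illustrative, not part of the text.

```python
class Detector:
    """A functional detector: a group of SPADs covering one pixel region.

    A SPAD reports only *whether* a photon arrived, not how many; once any
    SPAD in the group fires, the whole detector is considered triggered
    until it is restored to its initial state after the recovery time
    (hence the time-shared operation described in the text).
    """

    def __init__(self, n_spads: int):
        self.spads = [False] * n_spads  # True = this SPAD has fired

    def fire(self, index: int) -> None:
        """Record a photon arrival on one SPAD."""
        self.spads[index] = True

    @property
    def triggered(self) -> bool:
        return any(self.spads)

    def reset(self) -> None:
        """Restore the detector to its initial state after the preset time."""
        self.spads = [False] * len(self.spads)

d = Detector(64)  # 8 x 8 SPADs, as in the fig. 11 example
d.fire(10)        # one photon on one SPAD triggers the whole detector
```

A single photon anywhere in the 8 × 8 group puts the detector into the triggered state, which is why the detector must be time-shared rather than read continuously.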
Example 1: take the array light source 401 in the light emitting device as 8 × 8, the optical beam splitter 404 as splitting 1 beam into 8 beams, and, in the receiving device, one detector as shown in fig. 11 including 8 × 8 SPADs. Specifically, the array light source 401 emits 8 × 8 light beams, and the beam from each light source is split into 8 light beams; take the case in which 1 beam is split into 8 beams in the horizontal direction. Since each detector includes 8 × 8 SPADs, 60 × 80 detectors may be included on the image sensor 902 if a VGA image is to be acquired. Fig. 12 is a schematic diagram of imaging on the detectors of the receiving device; the black squares represent the collected pixel points (not all pixel points are shown in the figure).
Specifically, if 8 × 8 light sources in the array light source 401 are simultaneously in operation, 512 pixels can be collected by the detector at the same time. Since each column of light beams is distributed in different detectors, when the angle of the scanning turning mirror 403 in the light emitting device is changed, the position of the black square in fig. 12 is changed. In this case, each detector can collect 8 × 8 pixels, and two pixels cannot be collected simultaneously on one detector. If a receiving device of this configuration is used to acquire VGA images, the image sensor 902 may include 60 × 80 detectors.
For example, since the images acquired by the first 7 columns of detectors are incomplete, in practice the light emitting device needs to scan 696 columns (56 incomplete columns + 640 complete columns). As shown in fig. 13, the receiving apparatus forms a VGA image with 640 columns in the vertical direction and 480 rows in the horizontal direction. Since the images in the first 7 columns of pixels are incomplete, they are discarded; in actual use, the pixel points formed in the first 7 columns can be ignored (the detectors of the last 7 columns are not shown in the figure).
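The VGA accounting in this example can be checked numerically. A sketch using only figures from the text (60 × 80 detectors of 8 × 8 SPADs, 696 scan columns); note the grid is returned as (width, height), so the text's "60 × 80" appears here as (80, 60).

```python
def total_scan_columns(complete_cols: int = 640,
                       incomplete_detector_cols: int = 7,
                       spads_per_side: int = 8) -> int:
    """Scan columns needed for a full VGA image: the first 7 detector
    columns yield incomplete images, adding 7 * 8 = 56 discarded columns
    to the 640 complete ones."""
    return complete_cols + incomplete_detector_cols * spads_per_side

def detector_grid(image_w: int = 640, image_h: int = 480,
                  spads_per_side: int = 8) -> tuple:
    """(width, height) of the detector grid when each detector is an
    8 x 8 SPAD block covering an 8 x 8 pixel region."""
    return image_w // spads_per_side, image_h // spads_per_side

cols = total_scan_columns()  # 696 scan columns in total
grid = detector_grid()       # 80 x 60 detectors
```

The 4800 detectors, each covering 64 pixels, account for exactly the 640 × 480 = 307200 pixels of a VGA image, which is how the detector count stays far below the pixel count.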
Example 2: taking the example of the array light source 401 being 8 × 8 in the light emitting device, the optical beam splitter 404 may split one beam into 3 × 8 two-dimensional array beams, and one detector in the receiving device includes 8 × 8 SPADs.
The array light source 401 may emit 64 × 8 × 3 = 512 × 3 dot matrix light beams at the same time, and the projected pattern may be divided into 3 portions, each portion including 8 columns with 64 dots per column. The 3 portions may be evenly distributed over the detectors of the receiving device, each portion occupying 1/3 of the entire image pixels. As shown in fig. 14 (only a part of the detectors is shown), taking VGA image acquisition as an example, the light emitting device completes a 696-column scan; because the detectors are divided into 3 parts, each part only needs to cover 696/3 = 232 columns. Thus, if the scanning rotating mirror 403 in the light emitting device supports 232 columns per dot-matrix scan, the VGA image can be acquired by the receiving device. This can greatly reduce the number of detectors.
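The partitioning arithmetic of Example 2 can be verified in a few lines; a sketch using the figures from the text, with illustrative function names:

```python
def beams_example2(sources: int = 64, split_rows: int = 3,
                   split_cols: int = 8) -> int:
    """64 sources, each split into a 3 x 8 lattice by the beam splitter:
    64 * 24 = 1536 = 512 * 3 simultaneous dot-matrix beams."""
    return sources * split_rows * split_cols

def columns_per_partition(total_scan_cols: int = 696,
                          partitions: int = 3) -> int:
    """Each of the 3 pattern partitions covers 696 / 3 = 232 scan columns."""
    return total_scan_cols // partitions

beams = beams_example2()
cols_each = columns_per_partition()
```

Splitting the pattern into 3 vertically stacked portions cuts the mirror's scan range per frame by a factor of 3, which is the source of the detector-count saving claimed here.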
In this way, even though the number of detectors in the image sensor 902 is smaller than the number of pixels, an image with the corresponding resolution can still be formed on the image sensor 902; that is, the number of detectors of the image sensor 902 can be reduced, thereby reducing the cost.
In a possible embodiment, since the array light source 401 comprises 8 × 8 light sources, the light emitting device may generate 512 light beams at a time if every light source is in operation. Taking the case in which each light source in the array light source 401 is controlled individually, one light source's beam may be split into 8 light beams, and the 8 × 8 light sources are lit in turn, each working individually; that is, after one light source has completed its scanning task, the next light source is turned on. In this way, 8 × 10 detectors arranged on the image sensor can meet the requirement of generating VGA images.
It should be noted that, in the above embodiments, the number of detectors is smaller than the number of pixels, so the light spot formed on the imaging surface of the image sensor 902 by the light beam reflected from the target object becomes small. The distance between the image sensor 902 and the receiving lens 901 can therefore be made smaller than the focal length of the receiving lens 901, so that a larger light spot is obtained on the image sensor 902. Alternatively, the distance between the image sensor 902 and the receiving lens 901 is larger than the focal length of the receiving lens 901 and smaller than twice that focal length. As shown in fig. 15A, the distance between the image sensor 902 and the receiving lens 901 is larger than the focal length of the receiving lens 901, so that the spot formed on the image sensor 902 becomes large. Fig. 15B is a schematic diagram of the shape of the light spot on the image sensor 902; the spot is enlarged in order to form an image of the target object.
In the configuration shown in fig. 15A, since the detectors are closely arranged on the image sensor 902, when the theoretical imaging position of a detection point is located at the edge of a detector, the enlarged light spot is received by a plurality of detectors, causing crosstalk. To address this, a dynamically allocated SPAD array may be disposed on the image sensor 902; a dynamically allocated SPAD array means that the position of the detector can move as the position of the spot moves. Fig. 16 (a) shows the positions of the light spots and the detectors in state 1: the black circles indicate light spots, and the rectangular boxes around the black circles indicate the positions of the detectors. Fig. 16 (b) shows the positions of the detectors after the spot positions have shifted.
Wherein, an interface can be arranged on the detector, and a controller (such as an MCU) can control the position of the detector through the interface. For example, the controller may send a series of SPAD configuration commands to the detector via the interface, the detector completing the movement of the SPAD array according to the configuration commands.
It should be noted that the SPADs are not statically assigned to detectors in one-to-one fashion. For example, if each detector includes 16 SPADs, the first 16 SPADs (rows 1-4) on the image sensor 902 form one detector at the first light-spot time, and the SPADs in rows 2-5 on the image sensor 902 form a new detector at the second light-spot time.
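The sliding reassignment of SPAD rows to a detector can be sketched as follows. The 4-rows-per-detector window matches the 16-SPAD example above, and the row indices are illustrative.

```python
def detector_rows_for_spot(spot_row: int, rows_per_detector: int = 4):
    """Dynamically allocated SPAD array: the detector is whichever block of
    SPAD rows currently covers the spot, so the block slides with the spot
    instead of being fixed to one position on the sensor."""
    return list(range(spot_row, spot_row + rows_per_detector))

first = detector_rows_for_spot(1)   # first spot time: SPAD rows 1-4
second = detector_rows_for_spot(2)  # spot moved down: SPAD rows 2-5
```

Three of the four SPAD rows are shared between the two detector assignments; only the window boundary moves, which is what the configuration commands from the controller effect.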
The embodiments of the present application also provide a method for emitting a light beam, which can be applied to the light emitting device mentioned in the above embodiments. Alternatively, the method may also be applied to the electronic device described above. Fig. 17 is a schematic flow chart of an implementation of the method for emitting a light beam. The method may include steps 601-602.
The light emitting device can include an array light source, a collimating lens, a scanning rotating mirror, an optical beam splitter and a processor. The processor in the light emitting device may perform the above-described method of emitting a light beam.
Step 601: the processor controls the array light source to emit K light beams, wherein K is more than or equal to 1 and is a positive integer; the K light beams are converted into K first collimated light beams through the collimating lens.
The array light source is positioned on the first side of the collimating lens, the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between the plane of the array light source and the center point of the collimating lens is the focal length of the collimating lens. The K light beams emitted from the array light source can propagate to the collimating lens. The scanning rotating mirror is positioned on the second side of the collimating lens, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror. The K first collimated light beams converted by the collimating lens may propagate to the scanning rotating mirror.
Step 602: the processor controls the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams; the K second collimated light beams are split into i × K third collimated light beams by the optical beam splitter, and the i × K third collimated light beams are emitted.
It can be understood that when the scanning rotating mirror rotates, the propagation direction of the K second collimated light beams changes accordingly. Even while the scanning rotating mirror rotates, the K second collimated light beams still propagate to the optical beam splitter.
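Steps 601-602 can be summarized as a simple beam count (a toy trace with invented names, not an optical model): the lens and the mirror each preserve the number of beams, and only the splitter multiplies it.

```python
def emitted_beam_count(k, i):
    """Trace the beam count through the emitting device: the collimating
    lens turns K beams into K first collimated beams, the scanning
    rotating mirror reflects them into K second collimated beams (it
    changes direction, not count), and the optical beam splitter fans
    each one into i third collimated beams."""
    assert k >= 1 and i >= 2          # constraints stated in the claims
    first_collimated = k              # step 601: collimating lens output
    second_collimated = first_collimated   # step 602: mirror reflection
    return i * second_collimated      # splitter output: i x K beams
```

For example, a 4-beam source with a 3-way splitter emits 12 third collimated beams.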
The embodiment of the present application also provides a 3D camera. As shown in fig. 18, the 3D camera includes the light emitting device 1701 and a receiving device 1702.
The 3D camera may also link together various other circuits, such as peripheral devices, processors, and power management circuits. These circuits are well known in the art and are therefore not described further herein.
Fig. 19 is a schematic structural diagram of an electronic device 500 according to an embodiment of the present application. As shown in fig. 19, the electronic device 500 may include a processor 510, an external memory interface 520, an internal memory 521, a Universal Serial Bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, an antenna 1, an antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a sensor module 580, a camera 593, a display 594, a Subscriber Identification Module (SIM) card interface 595, and the like. Among them, the sensor module 580 may include a pressure sensor 580A, a gyro sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, and the like.
It is to be understood that the illustrated structure of the embodiment of the invention is not intended to limit the electronic device 500. In other embodiments of the present application, the electronic device 500 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 510 may include one or more processing units, such as: processor 510 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 500. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 510 for storing instructions and data. In some embodiments, the memory in processor 510 is a cache. The cache may hold instructions or data that processor 510 has just used or reused; if processor 510 needs the instruction or data again, it can be fetched directly from the cache. This avoids repeated accesses, reduces the waiting time of processor 510, and thereby improves system efficiency.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 500. In other embodiments of the present application, the electronic device 500 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 540 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 541 is used to connect the battery 542, the charging management module 540 and the processor 510. The power management module 541 receives input from the battery 542 and/or the charging management module 540, and provides power to the processor 510, the internal memory 521, the external memory, the display 594, the camera 593, the wireless communication module 560, and the like. The power management module 541 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc.
The wireless communication function of the electronic device 500 may be implemented by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 500 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 550 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 500. The mobile communication module 550 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like.
The wireless communication module 560 may provide a solution for wireless communication applied to the electronic device 500, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 560 may be one or more devices integrating at least one communication processing module. The wireless communication module 560 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 510. The wireless communication module 560 may also receive a signal to be transmitted from the processor 510, frequency-modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it.
The electronic device 500 implements display functions via the GPU, the display screen 594, and the application processor. The GPU is an image processing microprocessor connected to a display screen 594 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information.
The electronic device 500 may implement a capture function via the ISP, the camera 593, the video codec, the GPU, the display screen 594, and the application processor, etc.
The camera 593 may include a light emitting device 1701 and a receiving device 1702. The light emitting device 1701 is configured to emit a light beam, which is projected to and reflected by the target object, and the receiving device 1702 is configured to receive the light beam reflected by the target object and generate an image of the target object based on the collected light beam.
The ISP is used to process data fed back by the camera 593. For example, when a photo is taken, the shutter opens and light is transmitted through the lens to the camera's photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP, which processes it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 593.
The camera 593 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and transmits it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 500 may include 1 or N cameras 593, where N is a positive integer greater than 1.
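As one concrete instance of the RGB/YUV format conversion the DSP performs, full-range BT.601 RGB-to-YUV uses fixed weighting coefficients. This is a generic textbook formula, not the patent's specific implementation:

```python
def rgb_to_yuv(r, g, b):
    """ITU-R BT.601 full-range RGB -> YUV. Y is the luma (a weighted
    sum of R, G, B whose coefficients sum to 1), and U, V are the
    blue-difference and red-difference chroma components."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v
```

For a neutral gray input the chroma components are (near) zero, which is a quick sanity check on the coefficients.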
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 500 performs frequency-point selection, the digital signal processor may perform a Fourier transform or the like on the frequency-point energy.
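A minimal pure-Python sketch of "energy at a frequency bin" via the discrete Fourier transform (illustrative only; a real DSP would use a hardware FFT):

```python
import cmath
import math

def bin_energy(samples, k):
    """Squared magnitude of DFT bin k of a real sample sequence:
    |X_k|^2 where X_k = sum_n x[n] * exp(-2*pi*i*k*n/N)."""
    n = len(samples)
    x_k = sum(s * cmath.exp(-2j * math.pi * k * i / n)
              for i, s in enumerate(samples))
    return abs(x_k) ** 2
```

A pure cosine at bin k of an N-sample window concentrates energy (N/2)^2 in that bin and (near) zero elsewhere, which is the property frequency selection exploits.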
Video codecs are used to compress or decompress digital video. The electronic device 500 may support one or more video codecs. In this way, the electronic device 500 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 520 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 500. The external memory card communicates with the processor 510 through the external memory interface 520 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 521 may be used to store computer-executable program code, which includes instructions. The processor 510 executes various functional applications of the electronic device 500 and data processing by executing instructions stored in the internal memory 521. The internal memory 521 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like.
The electronic device 500 may implement audio functions through the audio module 570.
The pressure sensor 580A is used for sensing a pressure signal, which can be converted into an electrical signal. In some embodiments, pressure sensor 580A may be disposed on display screen 594.
The gyro sensor 580B may be used to determine the motion posture of the electronic device 500. In some embodiments, the angular velocities of the electronic device 500 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 580B. The gyro sensor 580B may also be used for image stabilization during photographing.
The air pressure sensor 580C is used to measure air pressure. In some embodiments, the electronic device 500 calculates altitude from the barometric pressure value measured by the air pressure sensor 580C, to assist positioning and navigation.
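One common way to convert a measured pressure into altitude is the international barometric formula. The standard-atmosphere constants below are generic, not device-specific values from the patent:

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Altitude in metres from the international barometric formula:
    h = 44330 * (1 - (P / P0)^(1/5.255)), with P0 the sea-level
    reference pressure in hPa (standard-atmosphere constants)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At the reference pressure the formula returns 0 m, and a reading of 900 hPa corresponds to roughly 1 km of altitude.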
The magnetic sensor 580D includes a hall sensor. The electronic device 500 may detect the opening and closing of the flip holster using the magnetic sensor 580D.
The acceleration sensor 580E may detect the magnitude of acceleration of the electronic device 500 in various directions (typically three axes).
A distance sensor 580F for measuring distance. The electronic device 500 may measure the distance by infrared or laser.
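Laser distance measurement of the kind mentioned above is typically time-of-flight: the light travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (function name is an illustrative assumption):

```python
def tof_distance_m(round_trip_s, c=299_792_458.0):
    """Time-of-flight ranging: distance = c * t / 2, because the
    measured interval t covers the outbound and return paths."""
    return 0.5 * round_trip_s * c
```

For example, a 2 ns round trip corresponds to a target about 30 cm away.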
The proximity light sensor 580G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 500 emits infrared light to the outside through the light emitting diode. The electronic device 500 detects infrared reflected light from nearby objects using photodiodes.
The ambient light sensor 580L is for sensing ambient light brightness.
The fingerprint sensor 580H is used to collect a fingerprint. The electronic device 500 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 580J is used to detect temperature. In some embodiments, electronic device 500 implements a temperature processing strategy using the temperature detected by temperature sensor 580J.
The touch sensor 580K is also referred to as a "touch panel". The touch sensor 580K may be disposed on the display screen 594; together, the touch sensor 580K and the display screen 594 form a touchscreen.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the above claims.

Claims (16)

1. A light emitting device, comprising: array light source, collimating lens, scanning rotating mirror and optical beam splitter, wherein:
the array light source comprises M × N light sources arranged in M rows and N columns, wherein M and N are positive integers; the interval between two adjacent columns of the N columns of light sources is a first preset distance, the interval between two adjacent rows of the M rows of light sources is a second preset distance, and the included angle between any one column of the N columns of light sources and any one row of the M rows of light sources is a preset angle;
the array light source is used for emitting K light beams, K is more than or equal to 1 and is a positive integer; wherein the array light source is located at a first side of the collimating lens; the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between the plane of the array light source and the center point of the collimating lens is the focal length of the collimating lens;
the collimating lens is used for converting the K light beams into K first collimated light beams;
the scanning rotating mirror is positioned at the second side of the collimating lens and used for realizing one-dimensional rotation, an optical axis of the collimating lens passes through a reflecting surface of the scanning rotating mirror, and the reflecting surface is used for reflecting the K first collimated light beams into K second collimated light beams;
the optical beam splitter is used for receiving the K second collimated light beams and splitting the K second collimated light beams into i multiplied by K third collimated light beams; wherein i is more than or equal to 2, and i is a positive integer.
2. The light-emitting device according to claim 1,
the optical beam splitter includes: at least one of a one-dimensional grating, a two-dimensional diffractive optical element, and a prism film.
3. A light-emitting device according to claim 1 or 2, wherein the predetermined angle is an acute angle.
4. The light emitting device according to claim 2, wherein the optical beam splitter is the one-dimensional grating;
the one-dimensional grating is used for splitting a light beam into i light beams, wherein the propagation directions of the i light beams are different.
5. The light-emitting device according to claim 2, wherein the optical beam splitter is the prism film; the incident surface of the optical beam splitter is a plane, and the emergent surface of the optical beam splitter is of a prism film structure;
the prism film structure comprises i beam splitting surfaces, the i beam splitting surfaces are used for splitting 1 light beam into i light beams, and the propagation directions of the i light beams are different.
6. The light emitting device according to claim 2, wherein the optical beam splitter is a two-dimensional diffractive optical element;
the two-dimensional diffractive optical element is configured to split a light beam into a matrix of light beams including i light beams.
7. The light-emitting device according to any one of claims 1-6, further comprising a controller; the controller is connected with the scanning rotating mirror;
the controller is used for receiving a control signal and transmitting the control signal to the scanning rotating mirror, and the control signal is used for indicating the scanning rotating mirror to adjust the angle of the reflecting surface;
and the scanning rotating mirror is used for receiving the control signal and adjusting the angle of the reflecting surface according to the control signal so as to adjust the propagation direction of the K second collimated light beams.
8. The light emitting device of claim 7, wherein the scanning rotating mirror comprises: a micro-electro-mechanical system (MEMS) mirror or a digital micromirror device (DMD).
9. An electronic device, comprising:
the light emitting device of any one of claims 1-8; wherein the light beam emitted by the light emitting device is irradiated on a target object and reflected by the target object;
and the receiving device is used for receiving the light beam reflected by the target object.
10. The electronic device of claim 9, wherein the receiving means comprises: a receiving lens and an image sensor; wherein the optical axis of the receiving lens is perpendicular to the plane of the image sensor;
the receiving lens is used for receiving the light beam reflected by the target object and refracting the light beam reflected by the target object into a refracted light beam;
wherein the refracted light beam impinges on the image sensor such that the target object is imaged on the image sensor.
11. The electronic device of claim 10, wherein a distance between the image sensor and the receive lens is less than twice a focal length of the receive lens.
12. The electronic device of claim 10 or 11, wherein the image sensor has E × F pixels, and E and F are both positive integers;
wherein the image sensor comprises j detectors; wherein j is less than E multiplied by F, and j is a positive integer.
13. The electronic device of claim 12, wherein the detector comprises at least one single photon detector.
14. The electronic device of any of claims 10-13, wherein the receiving means further comprises an optical filter;
the optical filter is arranged between the receiving lens and the image sensor and is parallel to the plane of the image sensor;
the filter is used for filtering out ambient light in the refracted light beam.
15. A method of emitting a light beam, applied to a light emitting device comprising an array light source, a collimating lens, a scanning turning mirror, an optical beam splitter and a processor;
the array light source comprises M × N light sources arranged in M rows and N columns, wherein M and N are positive integers; the interval between two adjacent columns of the N columns of light sources is a first preset distance, the interval between two adjacent rows of the M rows of light sources is a second preset distance, and the included angle between any one column of the N columns of light sources and any one row of the M rows of light sources is a preset angle;
the method comprises the following steps:
the processor controls the array light source to emit K light beams, wherein K is more than or equal to 1 and is a positive integer; wherein the array light source is located at a first side of the collimating lens; the plane of the array light source is perpendicular to the optical axis of the collimating lens, and the distance between the plane of the array light source and the center point of the collimating lens is the focal length of the collimating lens; the K light beams are converted into K first collimated light beams through the collimating lens, wherein the scanning rotating mirror is located on the second side of the collimating lens and used for achieving one-dimensional rotation, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror;
the processor controls the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams; the K second collimated light beams are split into i multiplied by K third collimated light beams through the optical beam splitter and emit the i multiplied by K third collimated light beams; wherein i is more than or equal to 2, and i is a positive integer.
16. The method for emitting a light beam of claim 15, wherein the predetermined angle is an acute angle.
CN202011511815.8A 2020-03-31 2020-12-18 Light emitting device and electronic equipment Pending CN113534484A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21780128.1A EP4119976A4 (en) 2020-03-31 2021-03-05 Light emitting apparatus and electronic device
PCT/CN2021/079346 WO2021196976A1 (en) 2020-03-31 2021-03-05 Light emitting apparatus and electronic device
US17/955,261 US20230026858A1 (en) 2020-03-31 2022-09-28 Optical transmitting apparatus and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010246351 2020-03-31
CN2020102463516 2020-03-31

Publications (1)

Publication Number Publication Date
CN113534484A true CN113534484A (en) 2021-10-22

Family

ID=78124226

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011511815.8A Pending CN113534484A (en) 2020-03-31 2020-12-18 Light emitting device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113534484A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115840295A (en) * 2023-02-23 2023-03-24 北京数字光芯集成电路设计有限公司 Linear array MicroLED scanning AR equipment
WO2023236580A1 (en) * 2022-06-08 2023-12-14 华为技术有限公司 Display apparatus and method, and vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108363267A (en) * 2018-02-14 2018-08-03 深圳奥比中光科技有限公司 The structured light projection module of regular array light source
CN109343034A (en) * 2018-09-19 2019-02-15 中国电子科技集团公司第三十八研究所 A kind of laser radar emission system based on MEMS galvanometer



Similar Documents

Publication Publication Date Title
US11265532B2 (en) Non-mechanical beam steering for depth sensing
CN110832443B (en) Compact optical system with MEMS scanner for image generation and object tracking
WO2021212915A1 (en) Laser distance measuring device and method
US20200192206A1 (en) Structured light projector, three-dimensional camera module and terminal device
CN108718406B (en) Variable-focus 3D depth camera and imaging method thereof
US20200075652A1 (en) Pixel cell with multiple photodiodes
US11399139B2 (en) High dynamic range camera assembly with augmented pixels
EP2873986A1 (en) Camera integrated with light source
US10679370B2 (en) Energy optimized imaging system with 360 degree field-of-view
JPWO2006077718A1 (en) Lens array and image sensor having lens array
US10677896B2 (en) Resolution enhancement for scanning LIDAR/LADAR
CN113534484A (en) Light emitting device and electronic equipment
US10791286B2 (en) Differentiated imaging using camera assembly with augmented pixels
US10481739B2 (en) Optical steering of component wavelengths of a multi-wavelength beam to enable interactivity
US20230026858A1 (en) Optical transmitting apparatus and electronic device
US20240127566A1 (en) Photography apparatus and method, electronic device, and storage medium
US10855896B1 (en) Depth determination using time-of-flight and camera assembly with augmented pixels
US11402473B1 (en) Photonic integrated circuit illuminator
US10178372B2 (en) Long focal length monocular 3D imager
WO2022002162A1 (en) Electronic device and depth image photographing method
CN116670713A (en) Method for depth sensing using candidate images selected based on epipolar lines
WO2020116036A1 (en) Object recognition device, object recognition method, and program
JP2004170305A (en) Method and apparatus for measuring three-dimensional shape
CN217883646U (en) Long-baseline depth camera module and electronic equipment
Voskerchyan et al. Flexible and highly scalable concept for an FMCW LiDAR PIC based on Tilted Grating Couplers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211022