WO2022267645A1 - Camera device, method, electronic device and storage medium - Google Patents
Camera device, method, electronic device and storage medium
- Publication number: WO2022267645A1 (application PCT/CN2022/087239)
- Authority: WIPO (PCT)
- Prior art keywords: light, infrared, path, reflected, receiver
- Prior art date
Classifications
- G06V10/145 — Illumination specially adapted for pattern recognition, e.g. using gratings
- G06V10/143 — Sensing or illuminating at different wavelengths
- G06V10/147 — Details of sensors, e.g. sensor lenses
- G06V20/64 — Scenes; scene-specific elements; three-dimensional objects
- G01S17/894 — 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- H04N23/11 — Cameras or camera modules comprising electronic image sensors, for generating image signals from visible and infrared light wavelengths
- H04N23/55 — Optical parts specially adapted for electronic image sensors; mounting thereof
- H04N23/56 — Cameras or camera modules provided with illuminating means
- H04N13/243 — Image signal generators using stereoscopic image cameras using three or more 2D image sensors
- H04N13/254 — Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
- G02B27/1013 — Beam splitting or combining systems for splitting or combining different wavelengths, for colour or multispectral image sensors
Definitions
- The embodiments of the present application relate to the technical field of imaging, and in particular to an imaging device, an imaging method, an electronic device, and a storage medium.
- Three-dimensional imaging technologies such as structured-light measurement, laser scanning, and ToF are becoming increasingly mature, and 3D recognition functions are gradually being built into mobile terminals, for example to realize face recognition with higher security.
- The terminal display screen needs to reserve space for the three-dimensional recognition device. Since that device includes at least three lenses, the non-display area of the screen must be at least as large as the projection area of those three lenses, which forms a "notch screen", spoils the appearance, and degrades the user experience.
- the 3D recognition device can also be installed under the terminal display screen.
- In that case, the terminal display screen must receive special light-transmitting treatment at the location of the 3D recognition device.
- The specially treated area displays differently from the normal display area; the more lenses there are, the larger and more conspicuous the treated area becomes, which degrades the user experience.
- In either arrangement, the abnormal display area of the terminal screen (the non-display area plus any area requiring light-transmitting treatment) is at least as large as the projection area of three lenses, which is relatively large and affects the user experience.
- An embodiment of the present application provides an imaging device, including: an optical emitting lens for emitting dot-matrix projection light and compensating infrared light toward a target object; and an optical receiving lens for receiving the first reflected infrared light and the visible light reflected by the target object, where the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object;
- the optical receiving lens includes a first infrared receiver, a visible light receiver and an optical filter;
- the optical filter is arranged in the light incident path of the optical receiving lens and is used to filter and separate the incident light into the first reflected infrared light, which propagates along the incident path, and the visible light, which propagates along a separation path perpendicular to the incident path;
- the first infrared receiver is used to receive the first reflected infrared light, and the visible light receiver is used to receive the visible light;
- the optical emitting lens includes a first flood illuminator, an infrared dot matrix projector, and a reflector.
- An embodiment of the present application provides another imaging device, including: an optical emitting lens for emitting infrared light toward a target object; and an optical receiving lens for receiving the first reflected infrared light and the visible light reflected by the target object, where the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object. The optical receiving lens includes a first infrared receiver, a visible light receiver, and an optical filter; the optical filter is arranged in the light incident path of the optical receiving lens and separates the incident light into the first reflected infrared light, propagating along the incident path, and the visible light, propagating along a separation path perpendicular to the incident path; the first infrared receiver receives the first reflected infrared light and the visible light receiver receives the visible light. The infrared light includes infrared continuously modulated pulsed light, and the optical emitting lens includes a second flood illuminator for emitting the infrared continuously modulated pulsed light.
- An embodiment of the present application also provides an imaging method, including: controlling the optical emitting lens to emit dot-matrix projection light and compensating infrared light toward the target object; and controlling the optical receiving lens to receive the first reflected infrared light and the visible light reflected by the target object, where the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object;
- the optical receiving lens includes a first infrared receiver, a visible light receiver, and a filter; the filter is arranged in the light incident path of the optical receiving lens and separates the incident light into the first reflected infrared light, propagating along the incident path, and the visible light, propagating along a separation path perpendicular to the incident path;
- the first infrared receiver is used to receive the first reflected infrared light, and the visible light receiver is used to receive the visible light;
- the optical emitting lens includes a first flood illuminator, an infrared dot matrix projector, and a reflector.
- An embodiment of the present application also provides an imaging method, including: controlling the optical emitting lens to emit infrared light toward the target object; and controlling the optical receiving lens to receive the first reflected infrared light and the visible light reflected by the target object, where the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object. The optical receiving lens includes a first infrared receiver, a visible light receiver, and an optical filter; the optical filter is arranged in the light incident path and separates the incident light into the first reflected infrared light, propagating along the incident path, and the visible light, propagating along a separation path perpendicular to it; the first infrared receiver receives the first reflected infrared light and the visible light receiver receives the visible light. The infrared light includes infrared continuously modulated pulsed light, and the optical emitting lens includes a second flood illuminator for emitting the infrared continuously modulated pulsed light.
- An embodiment of the present application also provides an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor. The memory stores instructions executable by the at least one processor, and when the instructions are executed by the at least one processor, the at least one processor can perform the above imaging method.
- An embodiment of the present application also provides a computer-readable storage medium storing a computer program that implements the above imaging method when executed by a processor.
- Fig. 1 is a schematic structural diagram of an imaging device implementing structured light according to an embodiment of the present application;
- Fig. 2 is a schematic diagram of the layout of a three-dimensional recognition device on a terminal in the related art;
- Figs. 3a to 3c are schematic diagrams of device arrangement schemes for three-dimensional recognition in the related art;
- Fig. 4 is a schematic structural diagram of an imaging device realizing monocular structured light according to another embodiment of the present application;
- Fig. 5 is a schematic structural diagram of an imaging device realizing binocular structured light according to an embodiment of the present application;
- Fig. 6 is a schematic diagram of the location of the photosensitive device according to an embodiment of the present application;
- Fig. 7 is a schematic structural diagram of an imaging device implementing TOF according to an embodiment of the present application;
- Fig. 8 is a flowchart of an imaging method according to an embodiment of the present application;
- Fig. 9 is a flowchart of an imaging method according to another embodiment of the present application;
- Fig. 10 is a schematic diagram of an electronic device according to an embodiment of the present application.
- The main purpose of the embodiments of the present application is to provide an imaging device, method, electronic device, and storage medium that can reduce the abnormally displayed area on the terminal display screen.
- Embodiments of the present application relate to a camera device which, as shown in Fig. 1, may include but is not limited to:
- an optical emitting lens 1100 for emitting dot-matrix projection light and compensating infrared light toward the target object;
- the optical receiving lens 1200 is used to receive the first reflected infrared light and visible light reflected by the target object;
- the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object: the first reflected infrared light is used to construct depth information of the target object, the visible light is used to construct a two-dimensional image of the target object, and the depth information and the two-dimensional image are used to construct a three-dimensional image of the target object;
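The fusion of depth information and a two-dimensional image described above can be sketched as a back-projection of a depth map and an aligned RGB image into a colored 3-D point cloud. This is a minimal illustration assuming a pinhole camera model; the function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative assumptions, not from the patent.

```python
import numpy as np

def depth_rgb_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map (metres) and an aligned RGB image into a
    coloured 3-D point cloud using a pinhole camera model (assumed)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # pinhole back-projection per pixel
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)
    valid = points[:, 2] > 0       # drop pixels with no measured depth
    return points[valid], colors[valid]
```

Each valid pixel yields one 3-D point (from the infrared depth channel) with a color (from the visible channel), which together form the three-dimensional image of the target object.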
- the optical receiving lens 1200 includes a first infrared receiver 1210, a visible light receiver 1220 and a filter 1230;
- the optical filter 1230 is arranged in the light incident path of the optical receiving lens 1200, and is used to filter and separate the incident light to obtain the first reflected infrared light and visible light propagating along the light incident path and along the separated path, and the separated path is perpendicular to the light incident path;
- the first infrared receiver 1210 is used to receive the first reflected infrared light
- the visible light receiver 1220 is used to receive visible light
- the optical emitting lens 1100 includes a first flood illuminator 1110, an infrared dot matrix projector 1120, and a reflector 1130; the infrared dot matrix projector 1120 is used to emit the dot-matrix projection light, and the first flood illuminator 1110 is used to emit the compensating infrared light;
- the reflector 1130 is arranged in the light exit path of the optical emitting lens 1100. The light exit path includes a merged path, the initial exit path of the dot-matrix projection light, and the initial exit path of the compensating infrared light; the two initial exit paths form an intersection angle, and after passing the reflector both the dot-matrix projection light and the compensating infrared light are emitted along the merged path.
- The imaging device of this embodiment can be applied to mobile terminals such as mobile phones, tablet computers, Customer Premise Equipment (CPE), and smart home devices, and serves various three-dimensional recognition scenarios, for example face recognition, AR, VR, 3D modeling, somatosensory games, holographic interaction, 3D beautification, remote videophone, and cloud real-time video.
- The optical receiving lens 1200 can serve not only as a device realizing the three-dimensional recognition function but also as a device realizing a front camera function.
- When realizing the three-dimensional recognition function, the visible light receiver 1220 in the optical receiving lens 1200 receives visible light from which the mobile terminal generates a two-dimensional image; when the optical receiving lens 1200 is used as an ordinary front camera, it likewise receives the visible light incident on the lens for the mobile terminal to generate a camera image.
- The mobile terminal can control the camera device to emit infrared light with the optical emitting lens 1100 toward the target object to be recognized, such as a human face, and to receive the first reflected infrared light and visible light reflected by the target object with the optical receiving lens 1200. The mobile terminal can then construct depth information of the target object from the received first reflected infrared light, construct a two-dimensional image from the received visible light, and construct a three-dimensional image from the depth information and the two-dimensional image.
- Front-facing 3D recognition on mobile terminals mainly comprises (a) the monocular structured-light solution, (b) the Time of Flight (TOF) solution, and (c) the binocular structured-light solution.
- In the monocular structured-light solution, the infrared dot matrix projector a301 projects speckle or coded structured light onto the target object, the flood illuminator a302 (a low-power flood illuminator) provides compensating illumination, the infrared camera a303 receives the reflected infrared light, and the RGB camera a304 receives the visible light reflected by the target object; the four devices are arranged in sequence along the baseline a305 in the three-dimensional recognition area on the terminal.
- The processor constructs depth information of the target object from the reflected infrared light received by the infrared camera a303, constructs a two-dimensional image from the visible light received by the RGB camera a304, and constructs a three-dimensional image from the depth information and the two-dimensional image.
- In the TOF solution, the flood illuminator b301 (a high-power flood illuminator) emits infrared continuously modulated pulsed light toward the target object;
- the infrared camera b302 receives the reflected pulsed light from the target object;
- the RGB camera b303 receives the visible light reflected by the target object;
- the three devices are arranged in sequence along the baseline b304 in the three-dimensional recognition area on the terminal.
- The processor constructs depth information of the target object from the reflected pulsed light received by the infrared camera b302, constructs a two-dimensional image from the visible light received by the RGB camera b303, and constructs a three-dimensional image from the depth information and the two-dimensional image.
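For continuously modulated light as in the TOF solution above, depth is commonly recovered from the phase shift between emitted and received signals. The sketch below shows that common phase-shift formulation, d = c·φ / (4π·f_mod); the function name and the modulation-frequency value in the test are illustrative assumptions, not from the patent.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance from the measured phase shift of continuously modulated
    infrared light: d = c * phi / (4 * pi * f_mod).
    The result is unambiguous only up to c / (2 * f_mod)."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)
```

For example, at a 100 MHz modulation frequency the unambiguous range is about 1.5 m, which is why phase-based TOF suits close-range front-facing recognition.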
- In the binocular structured-light solution, the infrared dot matrix projector c301 projects speckle or coded structured light onto the target object, the flood illuminator c302 (a low-power flood illuminator) provides compensating illumination, the infrared cameras c303 and c304 receive the reflected infrared light, and the RGB camera c305 receives the visible light reflected by the target object.
- The five devices are arranged in sequence along the baseline c306 in the three-dimensional recognition area on the terminal; the processor constructs depth information of the target object from the reflected infrared light received by the infrared cameras c303 and c304, constructs a two-dimensional image from the visible light received by the RGB camera c305, and constructs a three-dimensional image from the depth information and the two-dimensional image.
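In both structured-light solutions, depth follows from triangulation over the baseline: Z = f·B / d, where f is the focal length in pixels, B the baseline, and d the pixel disparity between the projected and observed pattern (or between the two infrared views). A minimal sketch, with an illustrative function name and parameter values:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulated depth for a structured-light / stereo pair:
    Z = f * B / d  (f in pixels, B in metres, d in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With f = 1000 px and B = 5 cm, a 10-pixel disparity corresponds to a depth of 0.5 m; halving the disparity doubles the estimated depth.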
- The projection area of the lenses of the above 3D recognition devices is relatively large and is usually arranged in the "notch" area at the top of the display screen.
- This area cannot be used as a screen display area, so the effective display area of the screen is reduced, which also affects the overall appearance to some extent.
- For under-screen use, the corresponding area of the terminal screen (mainly an OLED screen with some light transmittance) must be specially processed to enhance light transmittance, that is, by reducing or shrinking RGB pixels to increase the amount of light passing through; this display area therefore always shows a display effect that differs more or less from the rest of the screen.
- Applying 3D recognition under the screen would greatly enlarge the specially processed high-transmittance area, so reducing the size of that area helps reduce the visibly different display region.
- the most direct simplification is to reduce the number of 3D recognition devices installed under the screen.
- the camera device can be set as an under-screen camera.
- the under-screen camera means that the camera is set under the display screen of the mobile terminal.
- the light only enters the sensor through the light-transmitting area on the screen of the mobile terminal.
- Alternatively, the camera device can be set as an ordinary front camera, with the display screen of the mobile terminal reserving a non-display area for it, forming screen styles such as the "notch screen" and the "water-drop screen".
- The imaging device of this embodiment uses the optical emitting lens to emit infrared light toward the target object and the optical receiving lens to receive the reflected infrared light and visible light, thereby obtaining the depth information and two-dimensional image used to construct a three-dimensional image of the target object.
- The optical receiving lens receives the first reflected infrared light with its internal first infrared receiver and the visible light with its internal visible light receiver, and a filter placed in the light incident path separates the first reflected infrared light and the visible light from the incident light, so that one optical receiving lens receives both kinds of light simultaneously. The optical emitting lens is provided with the first flood illuminator, the infrared dot matrix projector, and the reflector; the reflector directs both the dot-matrix projection light and the compensating infrared light, which initially travel at an intersection angle, out along the merged path, so that one optical emitting lens emits both the dot-matrix projection light and the compensating infrared light.
- The structured-light solution for 3D recognition can thus be realized with fewer lenses: the lenses used for structured-light three-dimensional recognition behind the display screen are reduced from at least four to two. Only the area occupied by the optical emitting lens and the optical receiving lens then requires special processing, which reduces the abnormally displayed area of the terminal screen, enlarges the normally displayed area, and improves the user experience.
- The optical emitting lens 1100 and the optical receiving lens 1200 can be arranged under the terminal display screen 1010, where the terminal display screen 1010 can be composed of a touch panel (Touch Panel, "TP") 1011 and a liquid crystal display (Liquid Crystal Display, "LCD") 1012.
- The infrared light emitted by the optical emitting lens 1100 penetrates the emitting light-transmitting area of the terminal display 1010 toward the target object, and the optical receiving lens 1200 receives the incident light that penetrates the receiving light-transmitting area of the terminal display 1010 and separates it into the reflected infrared light and the visible light.
- The optical receiving lens 1200 includes the first infrared receiver 1210, the visible light receiver 1220, and the optical filter 1230, where the optical filter 1230 is arranged in the light incident path of the optical receiving lens 1200 at a 45-degree angle to the incident light. The optical filter 1230 can be a visible/infrared dichroic filter with a special filter film: at a 45-degree incident angle, its reflectance in the visible band from 0.3 μm to 0.6 μm is greater than 90% and its near-infrared transmittance from 0.75 μm to 2.5 μm is greater than 90%, so the reflected infrared light passes through the filter 1230 to the first infrared receiver 1210 below it, while the visible light is reflected into the separation path toward the visible light receiver 1220.
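The dichroic routing just described can be summarized as a simple band check: visible wavelengths are reflected into the separation path, near-infrared wavelengths are transmitted along the incident path. The band limits below are the ones stated in this document; the function itself is an illustrative sketch.

```python
def dichroic_route(wavelength_um):
    """Route a ray at the 45-degree visible/infrared dichroic filter:
    visible light (0.3-0.6 um) is reflected into the separation path,
    near infrared (0.75-2.5 um) is transmitted along the incident path."""
    if 0.3 <= wavelength_um <= 0.6:
        return "reflected: separation path (visible light receiver)"
    if 0.75 <= wavelength_um <= 2.5:
        return "transmitted: incident path (first infrared receiver)"
    return "outside specified bands"
```

For instance, a 0.55 μm green ray is reflected toward the visible light receiver, while a 0.94 μm projector ray is transmitted to the first infrared receiver.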
- The first infrared receiver 1210 includes an infrared photosensitive substrate 1211, an infrared low-pass filter 1212, and a lens group 1213. The lens group 1213 gathers the first reflected infrared light together with the small portion of visible light that leaks past the filter 1230, and the infrared low-pass filter 1212 filters out that small portion of visible light and lets only the first reflected infrared light pass, so that the infrared photosensitive substrate 1211 receives only the first reflected infrared light.
- The infrared photosensitive substrate 1211 carries a dedicated infrared CMOS image sensor (Infrared CMOS Image Sensor, "CIS") chip.
- The visible light receiver 1220 includes a visible light photosensitive substrate 1221, an infrared cut-off filter 1222, and a lens group 1223. The lens group 1223 gathers the visible light reflected by the optical filter 1230 together with the small portion of the first reflected infrared light that is also reflected, and the infrared cut-off filter 1222 filters out that small portion of infrared light so that only visible light passes; the visible light photosensitive substrate 1221 thus receives only visible light and likewise carries a dedicated CIS chip.
- The lens group 1213 and the lens group 1223 can be composed of wafer-level optics lenses (Wafer Level Optics, "WLO").
- The reflected and transmitted wavelength bands can also be swapped, reversing the propagation paths of the first reflected infrared light and the visible light, so that the visible light receiver 1220 and the first infrared receiver 1210 exchange positions.
- The infrared dot matrix projector and the first flood illuminator can emit infrared light of different bands, limited, however, to different bands within the near infrared: one of them operates in the near-infrared short wave (780-1100 nm) and the other in the near-infrared long wave (1100-2526 nm).
- The distance between the optical centers of the optical emitting lens and the optical receiving lens is the baseline, which can be a conventional baseline or a micro baseline.
- Monocular structured light uses active three-dimensional measurement: the spatial position of each pixel is estimated through the baseline, and the distance between the object and the lens is measured, yielding the depth information.
- The depth range that a binocular camera can measure is related to the baseline: the larger the baseline distance, the farther it can measure.
- the micro-baseline means that the distance between the two lenses is very short. Although the measurable distance is correspondingly short, this suits the close-range applications of structured light and is more conducive to the overall internal layout of an already compact mobile terminal.
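The baseline-versus-range trade-off above follows from the standard triangulation relation Z = f·B/d. The sketch below is a generic illustration of that relation, not code from this application, and all numeric values are hypothetical:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole triangulation: depth Z = f * B / d.

    A longer baseline B produces a larger disparity d for the same depth,
    so distant objects stay resolvable; a micro-baseline trades maximum
    range for compactness, which suits close-range structured light.
    """
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 1000 px focal length, 2 cm micro-baseline,
# 10 px measured disparity.
print(depth_from_disparity(1000.0, 0.02, 10.0))  # 2.0 (metres)
```

Halving the disparity at fixed focal length and baseline doubles the estimated depth, which is why disparity quantization limits the far range of a micro-baseline system.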
- the reflector directs the dot matrix projection light and the compensation infrared light to exit along the merged path, so that a single optical emission lens emits both the dot matrix projection light and the compensation infrared light, realizing a structured light solution for three-dimensional recognition.
- in the figures, the intersection angles formed by the dot matrix projection light and the compensation infrared light are all drawn as right angles.
- the camera device can be used to implement a monocular structured light solution in 3D recognition.
- the intersection angle formed by the initial exit paths of the dot matrix projection light and the compensation infrared light satisfies the total reflection condition of the reflector, and the reflector 1130 can be an infrared total reflection lens. That is, the optical emission lens 1100 can include a first flood illuminator 1110, an infrared dot matrix projector 1120 and an infrared total reflection lens; the infrared light comprises the dot matrix projection light emitted by the infrared dot matrix projector 1120 and the compensation infrared light emitted by the first flood illuminator 1110. The infrared total reflection lens is arranged in the light exit path of the optical emission lens 1100 and simultaneously reflects and transmits the dot matrix projection light and the compensation infrared light, which then exit along the merged path.
- the first flood illuminator 1110 may be disposed below the infrared total reflection lens, that is, the initial emission path of the first flood illuminator 1110 may be on the same straight line as the combining path.
- the infrared total reflection lens reflects and emits the dot matrix projection light emitted by the infrared dot matrix projector 1120 and makes the compensation infrared light emitted by the first flood illuminator 1110 penetrate and emit.
- the first flood illuminator 1110 may include a diffuser 1111 and a low-power vertical-cavity surface-emitting laser ("VCSEL") 1112, and the infrared dot matrix projector 1120 may include a diffractive optical element ("DOE") 1121, a high-power VCSEL 1122 and a WLO 1123.
- the infrared dot matrix projector 1120 and the visible light receiver 1220 located in the separate path of the optical receiving lens 1200 are both located below the baseline of the optical transmitting lens 1100 and the optical receiving lens 1200 .
- the baseline 1020 can be a conventional baseline length, or a micro-baseline length
- the optical transmitting lens 1100 and the optical receiving lens 1200, which are internally composite devices, allow the corresponding local region of the LCD display area to improve its light transmittance under the micro-baseline state
- the composite devices of the two lenses are closely arranged under the micro-baseline, which enables integration and standardization.
- Technical parameters such as dual-camera calibration can be adjusted to a basically stable state before integrated output, effectively reducing dispersion.
- Infrared dot matrix projection light can use speckle structured light or coded structured light, and the baseline length can be very small, which is a micro baseline.
- the imaging system's ability to image and accurately track long-distance targets is limited.
- Using micro-baseline laser active lighting and composite lighting to illuminate distant, small and dark targets or their local areas can reduce the influence of background radiation and improve the system's ability to accurately track and clearly image long-distance, small and dark targets.
- the working principle of the laser active lighting surveillance system is basically the same as that of the lidar. By adjusting the focus state (divergence angle) of the emitted laser beam, the entire target or key feature parts of the target are illuminated to meet the detection requirements of the receiving system and achieve the purpose of imaging and precise tracking of the target.
- when the first flood illuminator 1110 and the infrared dot matrix projector 1120 work simultaneously, they can each use a relatively narrow spectrum, and the first infrared receiver 1210 can use a relatively wide spectrum.
- under dark light at night and strong light during the day, the first flood illuminator 1110 can provide a supplementary light effect, while the first infrared receiver 1210 simultaneously receives the infrared light reflected by the target object.
- the structured light of the dot matrix projector and the unstructured light of the first flood illuminator, after fusion calculation, can enhance target recognition and effectively compensate for the shortcoming that the diffracted dot matrix projection light dissipates quickly with increasing distance.
- the infrared total reflection lens has both total reflection and transmission characteristics, can directly transmit the infrared unstructured light emitted by the first flood illuminator 1110 , and at the same time completely reflect the infrared structured light emitted by the infrared dot matrix projector 1120 .
- the satisfying condition of total reflection is that the light is from an optically denser medium to an optically rarer medium and the incident angle is greater than or equal to the critical angle.
- the condition is unified here as "greater than the critical angle", taking a refraction angle of 90 degrees as the critical state. Therefore, under normal circumstances, the initial exit path of the light before total reflection and the merged path after total reflection are not perpendicular but form an obtuse angle.
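The total reflection condition stated above (denser-to-rarer medium, incidence at or beyond the critical angle, sin θc = n_rare / n_dense) can be checked numerically. A minimal sketch, assuming a hypothetical refractive index for the lens material:

```python
import math

def critical_angle_deg(n_dense: float, n_rare: float = 1.0) -> float:
    """Critical angle for total internal reflection: sin(theta_c) = n_rare / n_dense."""
    return math.degrees(math.asin(n_rare / n_dense))

def totally_reflected(incident_deg: float, n_dense: float, n_rare: float = 1.0) -> bool:
    # Light travels from the optically denser medium toward the rarer one,
    # and the incident angle is at least the critical angle.
    return incident_deg >= critical_angle_deg(n_dense, n_rare)

# Hypothetical glass-to-air interface (n = 1.5): the critical angle is
# about 41.8 degrees, so any incidence beyond that is totally reflected.
print(round(critical_angle_deg(1.5), 1))  # 41.8
print(totally_reflected(50.0, 1.5))       # True
print(totally_reflected(30.0, 1.5))       # False
```

Because the usable incidence angles start above the critical angle rather than at exactly 45 degrees, the fold between the initial exit path and the merged path is generally not a right angle, consistent with the obtuse-angle remark above.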
- the infrared dot matrix projector in Fig. 1 usually cannot be placed horizontally, but is placed obliquely with the left side high and the right side low; the illustration in Fig. 1 therefore only indicates the approximate positional arrangement.
- the imaging device is provided with a first flood illuminator, an infrared dot matrix projector and an infrared total reflection lens on the optical emission lens, so that the intersection angle formed by the dot matrix projection light and the compensation infrared light meets the total reflection condition of the reflector.
- the infrared total reflection lens can emit the dot matrix projection light and the compensation infrared light along the merged path at the same time, so that a single optical emission lens emits both the dot matrix projection light and the compensation infrared light, realizing the structured light scheme for three-dimensional recognition.
- the reflector can also be a movable infrared reflector 1132; that is, the optical emission lens 1100 can include a first flood illuminator 1110, an infrared dot matrix projector 1120 and a movable infrared reflector 1132.
- the infrared light comprises the dot matrix projection light emitted by the infrared dot matrix projector 1120 and the compensation infrared light emitted by the first flood illuminator 1110.
- the movable infrared reflector 1132 is arranged in the light exit path of the optical emission lens 1100 and controls the dot matrix projection light and the compensation infrared light to exit along the merged path in a time-division manner.
- the initial exit paths of the dot matrix projection light and the compensation infrared light can be perpendicular to each other or form other angles, as long as both can exit along the merged path after passing through the infrared reflector 1132.
- the first flood illuminator 1110 may be arranged under the movable infrared reflector 1132 .
- since the movable infrared reflector 1132 is impenetrable to infrared light, the infrared dot matrix projector and the flood illuminator cannot work simultaneously; under the control of the system processor, the movable infrared reflector 1132 can only work in time division between two states, tilted and vertical.
- when the movable infrared reflector 1132 is in the tilted state, the compensation infrared light emitted by the first flood illuminator 1110 is blocked and the light projected by the infrared dot matrix projector 1120 is reflected; when it is in the vertical state, the optical path of the infrared dot matrix projector 1120 is blocked and the first flood illuminator 1110 emits infrared light unimpeded.
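The two-state time-division behaviour described above can be summarized in a small sketch; the state names and emitter identifiers are illustrative, not taken from the application:

```python
from enum import Enum

class MirrorState(Enum):
    TILTED = "tilted"      # blocks the flood illuminator, reflects the dot matrix light
    VERTICAL = "vertical"  # blocks the dot matrix projector, lets the flood light pass

def active_emitter(state: MirrorState) -> str:
    """Exactly one emitter's light exits along the merged path at a time,
    because the movable mirror is opaque to infrared light."""
    if state is MirrorState.TILTED:
        return "infrared_dot_matrix_projector"
    return "first_flood_illuminator"

# A processor-driven schedule: flood light first (pre-illumination of the
# target), then the dot matrix projection, alternating in time division.
schedule = [MirrorState.VERTICAL, MirrorState.TILTED]
print([active_emitter(s) for s in schedule])
```

The mutual exclusion is structural here: no mirror state maps to both emitters, mirroring the constraint that the projector and the flood illuminator cannot work simultaneously with a movable reflector.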
- a more general working mode can be adopted, in which the first flood illuminator 1110 under the optical emission lens 1100 starts in advance, mainly for predictive flood illumination of the target object, using a light source with a larger illumination angle.
- the infrared dot matrix projector 1120 emits multiple light points (for example, thousands to tens of thousands) onto the human face and cooperates with the optical receiving lens 1200, which receives the changes of the reflected light points, to calculate the virtual surface profile of the face and finely judge whether the detected face is the terminal user or another authenticated person. It can be understood that the positions of the infrared dot matrix projector 1120 and the first flood illuminator 1110 can be exchanged.
- the imaging device is provided with a first flood illuminator, an infrared dot matrix projector and a movable infrared reflector on the optical emission lens, and the mutually perpendicular dot matrix projection light and compensation infrared light are controlled by the reflector to exit along the merged path in time division, so that a single optical emission lens emits both the dot matrix projection light and the compensation infrared light, realizing a structured light solution for three-dimensional recognition.
- the camera device can be used to implement a binocular structured light solution in 3D recognition.
- the optical emission lens 1100 realizing the binocular structured light scheme in three-dimensional recognition further includes a second infrared receiver 1140 and a rotating device 1150. The second infrared receiver 1140 is used to receive the second reflected infrared light reflected by the target object; the second infrared receiver 1140 is arranged on one side of the rotating device 1150, and the infrared dot matrix projector 1120 or the first flood illuminator 1110 on the other side. After the infrared dot matrix projector 1120 or the first flood illuminator 1110 emits the dot matrix projection light or the compensation infrared light, the rotating device 1150 rotates to the other side so that the second infrared receiver 1140 can receive the second reflected infrared light.
- the infrared light is transmitted to the target object, and the second infrared receiver 1140 and the rotating device 1150 are then brought into operation;
- the processor controls the rotating device 1150 to rotate, turning the originally upward-facing first flood illuminator 1110 to the bottom and the downward-facing second infrared receiver 1140 to the top, so that the first infrared receiver 1210 and the second infrared receiver 1140 simultaneously receive the light reflected from the target object.
- based on the first reflected infrared light and the second reflected infrared light, the system processor calculates and fuses the target object data received by the two infrared receivers to realize three-dimensional recognition.
- the infrared rays emitted by the first flood illuminator 1110 and received by the second infrared receiver 1140 can pass through the infrared total reflection lens 1131; since the incident angles in both directions are far below the critical angle of total reflection, both present an ideal infrared penetration effect. Therefore, the binocular structured light solution can also be applied based on the imaging device shown in FIG. 1.
- Binocular structured light can use structured light to measure depth information in indoor environments, and switch to pure binocular mode when outdoor lighting causes structured light to fail, improving reliability and anti-interference capabilities.
- the imaging device is provided with a first flood illuminator, an infrared dot matrix projector and a rotating device on the optical emission lens. After the infrared dot matrix projector or the first flood illuminator emits the dot matrix projection light or the compensation infrared light, the rotating device rotates to the other side so that the second infrared receiver can receive the second reflected infrared light.
- both the optical transmitting lens and the optical receiving lens can thus receive infrared light reflected by the target object (the second reflected infrared light and the first reflected infrared light respectively), so a binocular structured light solution for three-dimensional recognition can be realized.
- the imaging device further includes a photosensitive device, located in the first non-display area 601 between the light-transmitting area of the optical emission lens and the edge of the terminal screen as shown in Figure 6, or in the second non-display area 602 between the light-transmitting area of the optical receiving lens and the edge of the terminal screen. The extremely narrow non-display area 600 at the top of the display screen can thus be used to arrange proximity light sensors, ambient light sensors, and distance sensors for structured light prediction, such as small one-dimensional TOF sensors.
- the camera device in this embodiment can further improve the accuracy of three-dimensional recognition and enhance user experience by disposing distance sensors for predicting targets in the first non-display area and the second non-display area.
- the embodiment of the present application relates to a camera device for realizing the TOF solution in three-dimensional recognition, as shown in Figure 7, including:
- the optical emission lens 7100 is used to emit infrared light to the target object
- the optical receiving lens 7200 is used to receive the first reflected infrared light and visible light reflected by the target object;
- the visible light and the first reflected infrared light are used for three-dimensional identification of the target object: the first reflected infrared light is used to construct the depth information of the target object, the visible light is used to construct a two-dimensional image of the target object, and the depth information and the two-dimensional image are used to construct a three-dimensional image of the target object;
- the optical receiving lens 7200 includes a first infrared receiver 7210, a visible light receiver 7220 and an optical filter 7230;
- the optical filter 7230 is arranged in the light incident path of the optical receiving lens 7200, and is used to filter and separate the incident light to obtain the first reflected infrared light and visible light propagating along the light incident path and along the separated path, and the separated path is perpendicular to the light incident path;
- the first infrared receiver 7210 is used to receive the first reflected infrared light
- the visible light receiver 7220 is used to receive visible light
- Infrared light can be infrared continuously modulated pulsed light
- the optical emission lens 7100 can include a second flood illuminator 7110 for emitting infrared continuously modulated pulsed light.
- the second flood illuminator 7110 is located below the terminal screen, and the first infrared receiver 7210 or the visible light receiver 7220 is located below the second flood illuminator 7110.
- the optical filter 7230 reflects visible light to the separation path
- the visible light receiver 7220 is located under the second flood illuminator 7110
- the first infrared receiver 7210 is located below the second flood illuminator 7110.
- the second flood illuminator 7110 includes a diffuser 7111 and a high-power VCSEL 7112.
- the high-power VCSEL 7112 is used to emit infrared continuously modulated pulsed light
- the diffuser 7111 is used to diffuse the infrared continuously modulated pulsed light emitted by the high-power VCSEL 7112 onto the target object.
- the interior of the first infrared receiver 7210 of the TOF solution is more complex and requires higher performance.
- TOF uses surface-emitted light and can form a three-dimensional image in one shot. There can be a zero baseline between the optical transmitting lens and the optical receiving lens, so the overall structure is more compact, which minimizes the specially treated area of the terminal display screen needed to improve light transmittance.
- TOF is divided into iTOF (indirect TOF) and dTOF (direct TOF).
- the former uses a VCSEL to emit infrared continuously modulated pulsed light, receives the infrared light reflected by the target, and performs homodyne demodulation to measure the phase shift of the reflected infrared light.
- the latter works on the lidar (Light Detection and Ranging) principle
- the core components include a VCSEL, a Single Photon Avalanche Diode ("SPAD") and a Time-to-Digital Converter ("TDC")
- VCSEL emits pulse waves into the scene
- SPAD receives pulse waves reflected from the target object
- the TDC records the time of flight of each received optical signal, that is, the time interval between transmitting a pulse and receiving its reflection.
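The two depth calculations implied above can be sketched with the standard TOF formulas: iTOF recovers distance from the demodulated phase shift, d = c·φ/(4π·f_mod), and dTOF from the TDC time interval, d = c·t/2. This is a generic sketch, not code from this application; the modulation frequency and timings are hypothetical:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect TOF: homodyne demodulation yields a phase shift phi,
    and distance d = c * phi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def dtof_distance(time_of_flight_s: float) -> float:
    """Direct TOF: the TDC records the emit-to-receive interval t,
    and distance d = c * t / 2 (round trip)."""
    return C * time_of_flight_s / 2.0

# Hypothetical examples: a pi-radian phase shift at 20 MHz modulation,
# and a 10 ns round-trip interval recorded by the TDC.
print(round(itof_distance(math.pi, 20e6), 3))  # ~3.747 m
print(round(dtof_distance(10e-9), 3))          # ~1.499 m
```

Note the iTOF range is bounded by phase wrap-around: at 20 MHz modulation the unambiguous range is c/(2·f_mod) ≈ 7.5 m, one reason lower modulation frequencies trade precision for range.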
- the long detection distance and high-precision imaging characteristics of dToF lidar can provide better night shooting, video and AR experience.
- the imaging device of this embodiment can emit infrared light to the target object with the optical transmitting lens and receive the reflected infrared light and visible light with the optical receiving lens, obtaining depth information and a two-dimensional image for constructing a three-dimensional image of the target object.
- the optical receiving lens receives the first reflected infrared light through the internal first infrared receiver and the visible light through the internal visible light receiver, and separates the first reflected infrared light and the visible light from the incident light by a filter set in the light incident path, so that one optical receiving lens receives both kinds of light simultaneously.
- the target object reflects the infrared continuously modulated pulsed light and other light rays; the first infrared receiver receives the reflected infrared light, and the visible light receiver receives the visible light reflected by the target object. The mobile terminal thus obtains both the reflected infrared light and the visible light, constructing a three-dimensional image of the target object with the TOF scheme while reducing the number of lenses required for three-dimensional recognition by multiplexing the optical receiving lens.
- the lenses used for TOF three-dimensional recognition on the display screen are reduced from at least three to two.
- the second flood illuminator is located below the terminal screen, and the first infrared receiver or visible light receiver is located below the second flood illuminator; that is, the second flood illuminator and the first infrared receiver or visible light receiver are arranged one above the other to increase the utilization of the space under the screen.
- the embodiment of the present application also relates to a camera method, as shown in FIG. 8 , including the following steps:
- Step 801 controlling the optical emission lens to emit dot matrix projection light and compensate infrared light to the target object;
- Step 802 controlling the optical receiving lens to receive the first reflected infrared light and visible light reflected by the target object
- the visible light and the first reflected infrared light are used for three-dimensional identification of the target object: the first reflected infrared light is used to construct the depth information of the target object, the visible light is used to construct a two-dimensional image of the target object, and the depth information and the two-dimensional image are used to construct a three-dimensional image of the target object;
- An optical receiving lens including a first infrared receiver, a visible light receiver and a filter;
- the optical filter is arranged in the light incident path of the optical receiving lens, and is used to filter and separate the incident light to obtain the first reflected infrared light and visible light propagating along the light incident path and along the separation path, and the separation path is perpendicular to the light incidence path;
- the first infrared receiver is used to receive the first reflected infrared light
- the visible light receiver is used to receive visible light
- the optical emission lens includes a first flood irradiator, an infrared dot matrix projector and a reflector; the first flood irradiator is used for emitting compensated infrared light, and the infrared dot matrix projector is used for emitting dot matrix projection light;
- the reflector is arranged in the light exit path of the optical emission lens, and the light exit path includes a merged path, an initial exit path of the dot matrix projection light and an initial exit path of the compensation infrared light; the initial exit paths of the dot matrix projection light and the compensation infrared light form an intersection angle, and the dot matrix projection light and the compensation infrared light exit along the merged path after passing through the reflector.
- the embodiment of the present application also relates to another imaging method, as shown in FIG. 9 , including the following steps:
- Step 901 controlling the optical emitting lens to emit infrared light to the target object
- Step 902 controlling the optical receiving lens to receive the first reflected infrared light and visible light reflected by the target object
- the visible light and the first reflected infrared light are used for three-dimensional identification of the target object: the first reflected infrared light is used to construct the depth information of the target object, the visible light is used to construct a two-dimensional image of the target object, and the depth information and the two-dimensional image are used to construct a three-dimensional image of the target object;
- An optical receiving lens including a first infrared receiver, a visible light receiver and a filter;
- the optical filter is arranged in the light incident path of the optical receiving lens, and is used to filter and separate the incident light to obtain the first reflected infrared light and visible light propagating along the light incident path and along the separation path, and the separation path is perpendicular to the light incidence path;
- the first infrared receiver is used to receive the first reflected infrared light
- the visible light receiver is used to receive visible light
- Infrared light includes infrared continuously modulated pulsed light
- an optical emission lens including a second flood illuminator for emission of infrared continuously modulated pulsed light
- the second floodlight illuminator is located under the terminal screen, and the first infrared receiver or visible light receiver is located under the second floodlight illuminator.
- the camera method is applied to three-dimensional recognition: the first infrared receiver adopts a relatively wide spectrum to simultaneously receive the two relatively narrow-spectrum lights of different bands sent by the flood illuminator and the dot matrix projector, realizing dual optical paths
- the received light is effectively decomposed and transformed into 3D and 2D point clouds with coordinate registration and finally fused to achieve 3D image enhancement; the specific process is as follows:
- in the first step, the processor separates the photoelectrically converted signal of the first infrared band light (emitted by the flood illuminator, reflected by the target object and received by the infrared receiver) from the photoelectrically converted signal of the second infrared band light (from the dot matrix projector).
- in the second step, an AI engine is loaded for cluster analysis, and the two types of separated photoelectric data are preprocessed respectively to realize data filtering and compression;
- in the third step, stereo registration is performed on the two-dimensional image data of the light reflected from the flood illuminator and the three-dimensional image data of the light reflected from the dot matrix projector (the pixel coordinates, image coordinates, camera coordinates and world coordinates of the two images are calibrated so that the superimposed two-dimensional and three-dimensional points in real three-dimensional space are mapped onto the two-dimensional imaging plane); key feature points are found, the two data sets are matched into the same coordinate system, and the spatial coordinate relationship between corresponding points of the two images is determined;
- in the fourth step, point cloud data that is easy to store and process is formed, used to carry the expanded high-dimensional feature information.
- in the fifth step, the AI engine is loaded again, and deep learning is used to classify and segment the 3D point cloud (a cross transformation is learned from the input points and then used to simultaneously weight the input features associated with the points and rearrange them into a potentially implied canonical order, achieved by applying product and sum operations on the elements);
- in the sixth step, 3D image recognition or reconstruction is realized.
- the two-dimensional image data from the flood illuminator can effectively compensate for the shortcomings of the three-dimensional image data of the dot matrix projection, enhancing the three-dimensional recognition effect and better supporting safe unlocking and payment under strong or dark light, as well as game modeling, virtual reality and augmented reality.
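The mapping used in the third step, from 3D points onto the 2D imaging plane so that the dot matrix depth data and the flood 2D image share one coordinate system, follows the standard pinhole camera model. A minimal sketch with hypothetical intrinsic parameters (focal length and principal point are illustrative, not from this application):

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point (X, Y, Z) to pixel
    coordinates: u = fx * X / Z + cx, v = fy * Y / Z + cy."""
    X, Y, Z = point_cam
    return (fx * X / Z + cx, fy * Y / Z + cy)

# Hypothetical intrinsics: 1000 px focal length, principal point (320, 240).
# A point on the optical axis lands on the principal point; an off-axis
# point shifts by fx * X / Z pixels. Projecting each registered 3D dot
# this way tells us which 2D flood-image pixel it overlays.
print(project_to_image((0.0, 0.0, 1.0), 1000, 1000, 320, 240))  # (320.0, 240.0)
print(project_to_image((1.0, 0.0, 2.0), 1000, 1000, 320, 240))  # (820.0, 240.0)
```

In a full registration pipeline this projection is preceded by a world-to-camera rigid transform (the extrinsic calibration); the sketch covers only the camera-to-pixel step.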
- the embodiment of the present application also relates to an electronic device, as shown in FIG. 10, including: at least one processor 1001; and a memory 1002 communicatively connected to the at least one processor. The memory stores instructions executable by the at least one processor 1001, and the instructions are executed by the at least one processor 1001 to perform the above-mentioned imaging method.
- the memory 1002 and the processor 1001 are connected by a bus, and the bus may include any number of interconnected buses and bridges, and the bus connects one or more processors 1001 and various circuits of the memory 1002 together.
- the bus may also connect together various other circuits such as peripherals, voltage regulators, and power management circuits, all of which are well known in the art and therefore will not be further described herein.
- the bus interface provides an interface between the bus and the transceivers.
- a transceiver may be a single element or multiple elements, such as multiple receivers and transmitters, providing means for communicating with various other devices over a transmission medium.
- the information processed by the processor 1001 is transmitted on the wireless medium through the antenna, and further, the antenna also receives the information and transmits the information to the processor 1001 .
- the processor 1001 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management and other control functions, while the memory 1002 may be used to store information used by the processor when performing operations.
- Embodiments of the present application also relate to a computer-readable storage medium storing a computer program.
- the above method embodiments are implemented when the computer program is executed by the processor.
- the program is stored in a storage medium and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods in the various embodiments of the present application.
- the aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Vascular Medicine (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
Claims (10)
- 一种摄像装置,包括:光学发射镜头,用于向目标物体发射点阵投影光和补偿红外光;光学接收镜头,用于接收所述目标物体反射的第一反射红外光和可见光;其中,所述可见光和所述第一反射红外光用于所述目标物体的三维识别;所述光学接收镜头,包括第一红外接收器、可见光接收器和滤光片;所述滤光片设置在所述光学接收镜头的光入射路径中,用于将入射光过滤分离得到沿所述光入射路径传播和沿分离路径传播的所述第一反射红外光和所述可见光,所述分离路径垂直于所述光入射路径;所述第一红外接收器用于接收所述第一反射红外光,所述可见光接收器用于接收所述可见光;所述光学发射镜头,包括第一泛光照射器、红外点阵投影器和反射镜;所述第一泛光照射器用于发出所述补偿红外光,所述红外点阵投影器用于发出所述点阵投影光;所述反射镜设置在所述光学发射镜头的光出射路径中,所述光出射路径包括合并路径、所述点阵投影光的初始出射路径和所述补偿红外光的初始出射路径,所述点阵投影光和所述补偿红外光的初始出射路径形成交角,所述点阵投影光和所述补偿红外光经过所述反射镜后沿所述合并路径射出。
- 根据权利要求1所述的摄像装置,其中,所述交角满足所述反射镜的全反射条件;所述反射镜包括红外全反射透镜,所述红外全反射透镜用于将所述点阵投影光和所述补偿红外光同时进行反射和透射,沿所述合并路径射出。
- 根据权利要求1或2所述的摄像装置,其中,所述反射镜包括可移动红外反光镜,所述可移动红外反光镜用于控制所述点阵投影光和所述补偿红外光分时沿所述合并路径射出。
- 根据权利要求1至3中任一项所述的摄像装置,其中,所述光学发射镜头还包括:第二红外接收器和旋转装置;所述第二红外接收器用于接收所述目标物体反射的第二反射红外光;所述旋转装置的一侧设置所述第二红外接收器,另一侧设置所述红外点阵 投影器或所述第一泛光照射器,所述旋转装置用于在所述红外点阵投影器或所述第一泛光照射器射出所述点阵投影光或所述补偿红外光后,旋转至另一侧供所述第二红外接收器接收所述第二反射红外光。
- 根据权利要求1至4中任一项所述的摄像装置,其中,所述摄像装置还包括:光感器件,所述光感器件位于所述光学发射镜头的透光区与终端屏面边缘之间的第一非显示区域,或所述光学接收镜头的透光区与终端屏面边缘之间的第二非显示区域中。
- 一种摄像装置,包括:光学发射镜头,用于向目标物体发射红外光;光学接收镜头,用于接收所述目标物体反射的第一反射红外光和可见光;其中,所述可见光和所述第一反射红外光用于所述目标物体的三维识别;所述光学接收镜头,包括第一红外接收器、可见光接收器和滤光片;所述滤光片设置在所述光学接收镜头的光入射路径中,用于将入射光过滤分离得到沿所述光入射路径传播和沿分离路径传播的所述第一反射红外光和所述可见光,所述分离路径垂直于所述光入射路径;所述第一红外接收器用于接收所述第一反射红外光,所述可见光接收器用于接收所述可见光;所述红外光包括红外连续调制脉冲光;所述光学发射镜头,包括用于发射所述红外连续调制脉冲光的第二泛光照射器;所述第二泛光照射器位于终端屏面下方,所述第一红外接收器或所述可见光接收器位于所述第二泛光照射器下方。
- 一种摄像方法,包括:控制光学发射镜头向目标物体发射点阵投影光和补偿红外光;控制光学接收镜头接收所述目标物体反射的第一反射红外光和可见光;其中,所述可见光和所述第一反射红外光用于所述目标物体的三维识别;所述光学接收镜头包括第一红外接收器、可见光接收器和滤光片;所述滤光片设置在所述光学接收镜头的光入射路径中,用于将入射光过滤分离得到沿所述光入射路径传播和沿分离路径传播的所述第一反射红外光和所 述可见光,所述分离路径垂直于所述光入射路径;所述第一红外接收器用于接收所述第一反射红外光,所述可见光接收器用于接收所述可见光;所述光学发射镜头,包括第一泛光照射器、红外点阵投影器和反射镜;所述第一泛光照射器用于发出所述补偿红外光,所述红外点阵投影器用于发出所述点阵投影光;所述反射镜设置在所述光学发射镜头的光出射路径中,所述光出射路径包括合并路径、所述点阵投影光的初始出射路径和所述补偿红外光的初始出射路径,所述点阵投影光和所述补偿红外光的初始出射路径形成交角,所述点阵投影光和所述补偿红外光经过所述反射镜后沿所述合并路径射出。
- A photography method, comprising: controlling an optical transmitting lens to emit infrared light toward a target object; and controlling an optical receiving lens to receive first reflected infrared light and visible light reflected by the target object; wherein the visible light and the first reflected infrared light are used for three-dimensional recognition of the target object; the optical receiving lens comprises a first infrared receiver, a visible-light receiver and an optical filter; the optical filter is disposed in a light incidence path of the optical receiving lens and is configured to filter and separate incident light to obtain the first reflected infrared light and the visible light propagating along the light incidence path and along a separation path, the separation path being perpendicular to the light incidence path; the first infrared receiver is configured to receive the first reflected infrared light, and the visible-light receiver is configured to receive the visible light; the infrared light comprises continuously modulated infrared pulse light; the optical transmitting lens comprises a second flood illuminator configured to emit the continuously modulated infrared pulse light; and the second flood illuminator is located below a terminal screen, and the first infrared receiver or the visible-light receiver is located below the second flood illuminator.
- An electronic device, comprising: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the photography method according to claim 7 or 8.
- A computer-readable storage medium storing a computer program which, when executed by a processor, implements the photography method according to claim 7 or 8.
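Claims 6 and 8 specify continuously modulated infrared pulse light whose reflection is used for three-dimensional recognition; this is the operating principle of indirect (phase-shift) time-of-flight depth sensing. The sketch below is illustrative only and is not taken from the application: it assumes the common four-phase demodulation scheme (correlation samples of the reflected signal at 0°, 90°, 180° and 270°), which the claims do not specify.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a1, a2, a3, f_mod):
    """Recover distance from four correlation samples of a continuously
    modulated infrared signal, taken at phase offsets 0, 90, 180, 270 deg.

    The phase lag of the reflected light encodes the round-trip time:
        distance = C * phase / (4 * pi * f_mod)
    Constant ambient offset cancels in the differences a1-a3 and a0-a2.
    """
    phase = math.atan2(a1 - a3, a0 - a2) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * f_mod)

# Simulated example: a target 1.5 m away, 20 MHz modulation.
f_mod = 20e6
true_phase = 4.0 * math.pi * f_mod * 1.5 / C
samples = [0.5 * math.cos(true_phase - k * math.pi / 2) + 1.0 for k in range(4)]
print(round(tof_depth(*samples, f_mod), 6))  # → 1.5 (metres)
```

Because the recovered phase wraps at 2π, the unambiguous range at a 20 MHz modulation frequency is C / (2 · f_mod) ≈ 7.49 m; real sensors resolve this with multiple modulation frequencies.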
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22827147.4A EP4344186A4 (en) | 2021-06-21 | 2022-04-15 | PHOTOGRAPHY APPARATUS AND METHOD, ELECTRONIC DEVICE AND STORAGE MEDIUM |
JP2023566723A JP2024524813A (ja) Imaging apparatus, method, electronic device, and computer program | 2021-06-21 | 2022-04-15 |
US18/393,437 US20240127566A1 (en) | 2021-06-21 | 2023-12-21 | Photography apparatus and method, electronic device, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110686374.3 | 2021-06-21 | ||
CN202110686374.3A CN115580766A (zh) Photography apparatus and method, electronic device, and storage medium | 2021-06-21 | 2021-06-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/393,437 Continuation US20240127566A1 (en) | 2021-06-21 | 2023-12-21 | Photography apparatus and method, electronic device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022267645A1 (zh) | 2022-12-29 |
Family
ID=84545208
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/087239 WO2022267645A1 (zh) Photography apparatus and method, electronic device, and storage medium | 2021-06-21 | 2022-04-15 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240127566A1 (zh) |
EP (1) | EP4344186A4 (zh) |
JP (1) | JP2024524813A (zh) |
CN (1) | CN115580766A (zh) |
WO (1) | WO2022267645A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117354409A (zh) * | 2023-12-04 | 2024-01-05 | 深圳市华维诺电子有限公司 | Under-screen camera assembly and mobile phone |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6556791B1 (en) * | 1999-12-21 | 2003-04-29 | Eastman Kodak Company | Dual channel optical imaging system |
JP2006301149A (ja) * | 2005-04-19 | 2006-11-02 | Nikon Corp | Single-lens reflex electronic camera |
US20100328780A1 (en) * | 2008-03-28 | 2010-12-30 | Contrast Optical Design And Engineering, Inc. | Whole Beam Image Splitting System |
CN104748721A (zh) * | 2015-03-22 | 2015-07-01 | 上海砺晟光电技术有限公司 | Monocular vision sensor with coaxial distance-measurement function |
CN108040243A (zh) * | 2017-12-04 | 2018-05-15 | 南京航空航天大学 | Multispectral stereoscopic-vision endoscope apparatus and image fusion method |
CN207751449U (zh) * | 2018-01-11 | 2018-08-21 | 苏州江奥光电科技有限公司 | Monocular depth camera based on field-of-view matching |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10204262B2 (en) * | 2017-01-11 | 2019-02-12 | Microsoft Technology Licensing, Llc | Infrared imaging recognition enhanced by 3D verification |
CN111083453B (zh) * | 2018-10-18 | 2023-01-31 | 中兴通讯股份有限公司 | Projection apparatus and method, and computer-readable storage medium |
CN111190323B (zh) * | 2018-11-15 | 2022-05-13 | 中兴通讯股份有限公司 | Projector and terminal |
- 2021
  - 2021-06-21 CN CN202110686374.3A patent/CN115580766A/zh active Pending
- 2022
  - 2022-04-15 EP EP22827147.4A patent/EP4344186A4/en active Pending
  - 2022-04-15 JP JP2023566723A patent/JP2024524813A/ja active Pending
  - 2022-04-15 WO PCT/CN2022/087239 patent/WO2022267645A1/zh active Application Filing
- 2023
  - 2023-12-21 US US18/393,437 patent/US20240127566A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6556791B1 (en) * | 1999-12-21 | 2003-04-29 | Eastman Kodak Company | Dual channel optical imaging system |
JP2006301149A (ja) * | 2005-04-19 | 2006-11-02 | Nikon Corp | Single-lens reflex electronic camera |
US20100328780A1 (en) * | 2008-03-28 | 2010-12-30 | Contrast Optical Design And Engineering, Inc. | Whole Beam Image Splitting System |
CN104748721A (zh) * | 2015-03-22 | 2015-07-01 | 上海砺晟光电技术有限公司 | Monocular vision sensor with coaxial distance-measurement function |
CN108040243A (zh) * | 2017-12-04 | 2018-05-15 | 南京航空航天大学 | Multispectral stereoscopic-vision endoscope apparatus and image fusion method |
CN207751449U (zh) * | 2018-01-11 | 2018-08-21 | 苏州江奥光电科技有限公司 | Monocular depth camera based on field-of-view matching |
Non-Patent Citations (1)
Title |
---|
See also references of EP4344186A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117354409A (zh) * | 2023-12-04 | 2024-01-05 | 深圳市华维诺电子有限公司 | Under-screen camera assembly and mobile phone |
CN117354409B (zh) * | 2023-12-04 | 2024-02-02 | 深圳市华维诺电子有限公司 | Under-screen camera assembly and mobile phone |
Also Published As
Publication number | Publication date |
---|---|
EP4344186A1 (en) | 2024-03-27 |
US20240127566A1 (en) | 2024-04-18 |
CN115580766A (zh) | 2023-01-06 |
EP4344186A4 (en) | 2024-08-21 |
JP2024524813A (ja) | 2024-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020057205A1 (zh) | Under-screen optical system, method for designing diffractive optical element, and electronic device | |
WO2020057208A1 (zh) | Electronic device | |
EP3660575B1 (en) | Eye tracking system and eye tracking method | |
US20200409163A1 (en) | Compensating display screen, under-screen optical system and electronic device | |
US10877281B2 (en) | Compact optical system with MEMS scanners for image generation and object tracking | |
US10983340B2 (en) | Holographic waveguide optical tracker | |
CN105629474B (zh) | Near-eye display system and head-mounted display device | |
WO2020057207A1 (zh) | Electronic device | |
US20080198459A1 (en) | Conjugate optics projection display system and method having improved resolution | |
CN111083453B (zh) | Projection apparatus and method, and computer-readable storage medium | |
US10832052B2 (en) | IR illumination module for MEMS-based eye tracking | |
KR20150086388A (ko) | 사람에 의해 트리거되는 홀로그래픽 리마인더 | |
US20240127566A1 (en) | Photography apparatus and method, electronic device, and storage medium | |
WO2020057206A1 (zh) | Under-screen optical system and electronic device | |
EP3935437B1 (en) | Ir illumination module for mems-based eye tracking | |
US10838489B2 (en) | IR illumination module for MEMS-based eye tracking | |
WO2021196976A1 (zh) | Light-emitting apparatus and electronic device | |
EP4443379A1 (en) | Three-dimensional recognition apparatus, terminal, image enhancement method and storage medium | |
CN111024626B (zh) | Light source module, imaging apparatus, and electronic device | |
CN215499049U (zh) | Display apparatus having a 3D camera module, and electronic device | |
CN215499363U (zh) | Display apparatus having a 3D camera module, and electronic device | |
EP4053588A1 (en) | Optical sensing system | |
EP4047387A1 (en) | Optical sensing system | |
KR20230079618A (ko) | Method and apparatus for three-dimensional modeling of a human body | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22827147 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023566723 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022827147 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022827147 Country of ref document: EP Effective date: 20231221 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |