WO2022257558A1 - Time-of-flight module, terminal and depth detection method - Google Patents



Publication number
WO2022257558A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
time
image sensor
lens
light source
Prior art date
Application number
PCT/CN2022/083585
Other languages
English (en)
Chinese (zh)
Inventor
戴阳 (DAI Yang)
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Publication of WO2022257558A1



Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves

Definitions

  • the present application relates to the field of ranging, and more specifically, relates to a time-of-flight module, a terminal and a depth detection method.
  • Time of flight is a ranging technology that calculates the distance between the target object and the sensor by measuring the time difference between the transmitted signal and the signal reflected by the target object.
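The ranging principle described above can be sketched numerically. This is an illustrative snippet, not part of the patent; the speed-of-light constant and the sample round-trip time are assumed values:

```python
# Direct time-of-flight ranging: the measured round-trip time of a light
# pulse is converted to a one-way distance. The factor 1/2 accounts for
# the pulse travelling to the target and back.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the target given the emit-to-receive time difference."""
    return C * round_trip_time_s / 2.0

# A reflection received about 6.67 ns after emission corresponds to roughly 1 m.
print(round(tof_distance(6.67e-9), 3))
```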
  • Embodiments of the present application provide a time-of-flight module, a terminal, and a depth detection method.
  • the time-of-flight module of the embodiment of the present application includes a light source, a lens and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the terminal in the embodiment of the present application includes a casing and a time-of-flight module, and the time-of-flight module is arranged on the casing.
  • the time-of-flight module includes a light source, a lens and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the depth detection method in the embodiment of the present application is applied to a time-of-flight module, and the time-of-flight module includes a light source, a lens, and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the depth detection method in the embodiment of the present application is applied to a time-of-flight module, the time-of-flight module includes a light source, a lens and an image sensor, and the lens is located on the side where the light source emits light. The depth detection method includes: controlling the light source to emit light; starting the timer of the image sensor when the detection pixel of the image sensor receives the light reflected by the lens; stopping the timer to generate a receiving time when the photosensitive pixel of the image sensor receives the light reflected by the target object; and calculating the depth information of the target object according to the receiving time.
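The steps of the method above can be sketched as a toy simulation. This is a hypothetical model for illustration (in the real module the timer is started in hardware by the detection pixel's trigger signal; the timestamps below are made up):

```python
C = 299_792_458.0  # speed of light, m/s

class DepthDetector:
    """Toy model of the claimed method: start timing on the lens-reflected
    trigger, stop on the target-reflected return, then compute depth."""
    def __init__(self):
        self.start_time = None
        self.receiving_time = None

    def on_trigger(self, t: float):
        # Detection pixel saw light reflected by the lens -> timer starts.
        self.start_time = t

    def on_return(self, t: float):
        # Photosensitive pixel saw light reflected by the target -> timer stops.
        self.receiving_time = t - self.start_time

    def depth(self) -> float:
        return C * self.receiving_time / 2.0

d = DepthDetector()
d.on_trigger(0.0)       # light leaves the module (trigger from the lens)
d.on_return(10.0e-9)    # return arrives 10 ns later
print(round(d.depth(), 3))  # ~1.499 m
```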
  • FIG. 1 is a schematic structural diagram of a time-of-flight module in some embodiments of the present application
  • Fig. 2 is a scene schematic diagram of a time-of-flight module in a prior-art solution
  • FIG. 3 is a schematic structural diagram of a terminal in some embodiments of the present application.
  • Figure 4 and Figure 5 are schematic diagrams of scenes of the time-of-flight module in some embodiments of the present application.
  • FIG. 6 is a schematic diagram of a lens of a time-of-flight module according to some embodiments of the present application.
  • FIG. 7 is a schematic structural diagram of an image sensor of a time-of-flight module according to some embodiments of the present application.
  • FIG. 10 is a schematic diagram of a scene of an image sensor in some embodiments of the present application.
  • 11 to 16 are schematic plan views of image sensors in some embodiments of the present application.
  • Fig. 17 is a schematic flowchart of a depth detection method in some embodiments of the present application.
  • the time-of-flight module of the embodiment of the present application includes a light source, a lens and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the detection pixels are used to receive the light reflected by the lens to generate a trigger signal, and the timer starts counting when receiving the trigger signal; the photosensitive pixels are used to receive the light reflected by the target object to generate a receiving signal, and the timer stops counting when receiving the receiving signal, so as to generate the receiving time.
  • the curvature of the lens is greater than a predetermined curvature.
  • the time-of-flight module further includes a casing, the casing includes a substrate, a top plate, a side plate, and a spacer plate, and the substrate, the top plate, and the side plate enclose an accommodation space,
  • the light source and the image sensor are disposed on the substrate
  • the spacer plate is disposed on the top plate and located in the accommodation space
  • the spacer plate separates the light source and the image sensor
  • a gap is formed between the spacer plate and the substrate, and the light reflected by the lens enters the detection pixel through the gap.
  • the detection pixels are located on the side of the image sensor close to the light source, and the width of the gap is determined according to the maximum height, at the gap, of the light reflected by the lens that enters the detection pixels.
  • the image sensor further includes a first reflective member, the first reflective member is arranged on the side of the detection pixel close to the top plate and away from the light source, and the first reflective member is used for reflecting the light reflected by the lens.
  • the image sensor further includes a second reflective member, the second reflective member is arranged on the side of the detection pixel close to the top plate and close to the light source, and the second reflective member is used for reflecting the light reflected by the first reflective member.
  • the image sensor further includes a third reflective element and a fourth reflective element.
  • the third reflective member and the fourth reflective member are arranged opposite to each other, and form a closed space with the first reflective member and the second reflective member.
  • the photosensitive pixels and the detection pixels are arranged in a matrix
  • the detection pixels include pixels in a predetermined column in the matrix close to the light source
  • the direction of the columns of the matrix is perpendicular to the arrangement direction of the image sensor and the light source.
  • both the photosensitive pixels and the detection pixels are single photon avalanche diodes.
  • the light source comprises a VCSEL.
  • the terminal in the embodiment of the present application includes a casing and a time-of-flight module, and the time-of-flight module is arranged on the casing.
  • the time-of-flight module includes a light source, a lens and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the depth detection method in the embodiment of the present application is applied to a time-of-flight module, and the time-of-flight module includes a light source, a lens, and an image sensor.
  • the light source is used for emitting light.
  • the lens is located on the side where the light source emits light.
  • the image sensor includes a photosensitive pixel, a detection pixel and a timer. When the detection pixel receives the light reflected by the lens, the timer starts counting; when the photosensitive pixel receives the light reflected by the target object, the timer stops counting to obtain the receiving time; the image sensor generates the depth information of the target object according to the receiving time.
  • the detection pixels are used to receive the light reflected by the lens to generate a trigger signal, and the timer starts counting when receiving the trigger signal; the photosensitive pixels are used to receive the light reflected by the target object to generate a receiving signal, and the timer stops counting when receiving the receiving signal, so as to generate the receiving time.
  • the curvature of the lens is greater than a predetermined curvature.
  • the time-of-flight module further includes a casing, the casing includes a substrate, a top plate, a side plate, and a spacer plate, and the substrate, the top plate, and the side plate enclose an accommodation space,
  • the light source and the image sensor are disposed on the substrate
  • the spacer plate is disposed on the top plate and located in the accommodation space
  • the spacer plate separates the light source and the image sensor
  • a gap is formed between the spacer plate and the substrate, and the light reflected by the lens enters the detection pixel through the gap.
  • the detection pixels are located on the side of the image sensor close to the light source, and the width of the gap is determined according to the maximum height, at the gap, of the light reflected by the lens that enters the detection pixels.
  • the image sensor further includes a first reflective member, the first reflective member is arranged on the side of the detection pixel close to the top plate and away from the light source, and the first reflective member is used for reflecting the light reflected by the lens.
  • the image sensor further includes a second reflective member, the second reflective member is arranged on the side of the detection pixel close to the top plate and close to the light source, and the second reflective member is used for reflecting the light reflected by the first reflective member.
  • the image sensor further includes a third reflective element and a fourth reflective element.
  • the third reflective member and the fourth reflective member are arranged opposite to each other, and form a closed space with the first reflective member and the second reflective member.
  • the photosensitive pixels and the detection pixels are arranged in a matrix
  • the detection pixels include pixels in a predetermined column in the matrix close to the light source
  • the direction of the columns of the matrix is perpendicular to the arrangement direction of the image sensor and the light source.
  • both the photosensitive pixels and the detection pixels are single photon avalanche diodes.
  • the light source comprises a VCSEL.
  • the depth detection method in the embodiment of the present application is applied to a time-of-flight module, the time-of-flight module includes a light source, a lens and an image sensor, and the lens is located on the side where the light source emits light. The depth detection method includes: controlling the light source to emit light; starting the timer of the image sensor when the detection pixel of the image sensor receives the light reflected by the lens; stopping the timer to generate a receiving time when the photosensitive pixel of the image sensor receives the light reflected by the target object; and calculating the depth information of the target object according to the receiving time.
  • in the time-of-flight module, terminal and depth detection method of the embodiments of the present application, the light source is controlled to emit light, and the lens reflects part of the light to the detection pixel, at which point the timer starts counting.
  • the time elapsed from when the timer starts counting until it stops counting upon the photosensitive pixel receiving the light reflected by the target object is the receiving time, and the image sensor can generate the depth information of the target object according to the receiving time.
  • since the timer starts counting upon the trigger signal generated by the detection pixel, the moment the timer receives the trigger signal is effectively the actual light-emitting moment of the light source (because the speed of light is extremely fast, the time for the lens to reflect light to the detection pixel can be ignored, or measured in advance and deducted). The receiving time is therefore independent of the temperature drift of the light source, which eliminates the inaccurate timing starting point caused by temperature drift, ensures the accuracy of timing, and thus ensures the accuracy of the depth information of the target object generated by the image sensor. Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
  • the embodiment of the present application provides a time-of-flight module 100 .
  • the TOF module 100 includes a light source 20 , a lens 30 and an image sensor 40 .
  • the light source 20 is used for emitting light.
  • the lens 30 is located on the side where the light source 20 emits light.
  • the image sensor 40 includes a photosensitive pixel 41 and a detection pixel 42.
  • the detection pixel 42 is used to receive light reflected by the lens 30 to generate a trigger signal, and to transmit the trigger signal to the photosensitive pixel 41; the photosensitive pixel 41 starts counting when receiving the trigger signal.
  • the image sensor 40 generates the depth information of the target object according to the receiving time of the photosensitive pixel 41 receiving the light reflected by the target object (shown in FIG. 2 ).
  • in the related art, a direct time-of-flight (dToF) module measures the time difference between the emitted light and the light reflected by the target object, and uses this time difference to calculate the distance between the target object and the image sensor 40.
  • the dToF module consists of three main components: drive control chip, photosensitive pixel and light emitting device.
  • from the moment the dToF module controls the drive control chip to send a light-emitting instruction to the light-emitting device until the photosensitive pixel receives the light reflected by the target object, four moments are involved, namely T0, T1, T2 and T3.
  • the time T0 is the moment when the dToF module controls the drive control chip to send the light-emitting command to the light-emitting device
  • the time T1 is the time when the drive control chip sends the light-emitting command
  • the time T2 is the time when the light-emitting device responds to the light-emitting command to emit light
  • the moment T3 is the moment when the photosensitive pixel receives the light reflected back by the target object.
  • since the dToF module calculates the time difference between the light emitted by the light-emitting device and the light reflected by the target object, theoretically the timer in the dToF module should start counting at T2 and end at T3 to obtain the most accurate flight time. However, during operation, it takes a certain amount of time for the light-emitting device to receive the control signal from the drive control chip and actually emit light in response to it, so the actual light-emitting moment T2 often lags behind T1.
  • during operation, the temperature of the light-emitting device changes, which changes its response speed from receiving the control signal to actually emitting light.
  • as the temperature drifts, the response of the light-emitting device becomes slower, making the moment T2 (the actual light-emitting moment of the light-emitting device) difficult to define. Therefore, the dToF module cannot use T2 as the starting moment to obtain the most accurate flight time.
  • the timer will often use the T0 time as the time to start counting.
  • since T2 is difficult to define and there is no fixed time difference between T0 and T2, the timing error of the light-emitting device responding to the light-emitting command cannot be eliminated when calculating the flight time. As a result, the dToF module cannot find an accurate timing starting point when measuring the flight time, the distance measurement is inaccurate, and the image sensor 40 generates inaccurate depth information of the target object.
  • the time-of-flight module 100 of the embodiment of the present application controls the light source 20 to emit light and reflects part of the light to the detection pixel 42 through the lens 30, at which point the timer 401 starts counting. The time elapsed from when the timer 401 starts counting until it stops counting upon the photosensitive pixel 41 receiving the light reflected by the target object is the receiving time, and the image sensor 40 can generate the depth information of the target object according to the receiving time.
  • since the timer 401 starts counting upon the trigger signal generated by the detection pixel 42, the moment the timer 401 receives the trigger signal is effectively the actual light-emitting moment of the light source 20 (because the speed of light is extremely fast, the time for the lens 30 to reflect light to the detection pixel can be ignored, or measured in advance and deducted). The receiving time is therefore independent of the temperature drift of the light source 20, which eliminates the inaccurate timing starting point caused by temperature drift, ensures the accuracy of timing, and thus ensures the accuracy of the depth information of the target object generated by the image sensor 40.
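The advantage described above can be illustrated numerically: if timing starts at T0, the temperature-dependent response delay of the light source inflates the measured distance, whereas starting at the trigger moment does not. The delay and distance values below are made-up for illustration only:

```python
C = 299_792_458.0                        # speed of light, m/s
true_distance = 1.0                      # metres
flight_time = 2 * true_distance / C      # genuine round-trip time
response_delay = 2.0e-9                  # T0 -> T2 lag; drifts with temperature

# Prior-art scheme: timer starts at T0, so the response delay is counted
# as part of the flight time and the distance is overestimated.
measured_prior = C * (response_delay + flight_time) / 2

# Claimed scheme: timer starts at the trigger signal (the actual emission
# moment), so only the genuine flight time is measured.
measured_trigger = C * flight_time / 2

print(round(measured_prior, 3), round(measured_trigger, 3))  # 1.3 1.0
```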
  • an embodiment of the present application provides a terminal 1000 .
  • the terminal 1000 includes a time-of-flight module 100 and a housing 200 .
  • the time-of-flight module 100 is disposed on the casing 200 .
  • the terminal 1000 can be, but is not limited to, VR glasses, AR glasses, a mobile phone, a tablet computer, a notebook computer, a smart watch, a game console, a head-mounted display device, a laser ruler, etc.; these electronic devices often carry a time-of-flight module 100 to realize the function of generating depth information of a target object.
  • the time-of-flight module 100 can be a dToF module, which can calculate the distance between the target object and the time-of-flight module 100, i.e. the depth information of the target object, according to the emission time of the light emitted by the light source 20 and the time when the light is reflected back to the image sensor 40 by the target object.
  • the casing 200 includes a side wall 201 and a back plate 202 at the bottom, and the side wall 201 and the back plate 202 form a receiving space 203 for accommodating components of the terminal 1000 .
  • the time-of-flight module 100 is set in the receiving space 203; when the time-of-flight module 100 needs to generate the depth information of the target object, it can emit light to the target object through the back plate 202, thereby obtaining the depth information of the target object.
  • the material of the casing 200 may be metal, glass, plastic, etc., or the material of the casing 200 may be a mixture of metal, glass, and plastic.
  • the material of the side wall 201 is metal
  • the material of the back plate 202 is glass.
  • the material of the side wall 201 and part of the back plate 202 is metal
  • the material of the other part of the back plate 202 is glass.
  • the time-of-flight module 100 includes a casing 10 , a light source 20 , a lens 30 and an image sensor 40 .
  • the light source 20 , the lens 30 and the image sensor 40 are disposed in the casing 10 .
  • the light source 20 is used to emit light
  • the lens 30 is used to reflect the light emitted by the light source 20
  • the image sensor 40 includes photosensitive pixels 41 and detection pixels 42 .
  • the housing 10 includes a substrate 11 , a top plate 12 , a side plate 13 and a partition plate 14 . Both ends of the side plate 13 are respectively connected to the substrate 11 and the top plate 12 , and one end of the spacer plate 14 is connected to the top plate 12 .
  • the substrate 11 is arranged in the casing 200; the substrate 11, the top plate 12 and the side plate 13 enclose an accommodation space 15, and the light source 20, the lens 30 and the image sensor 40 are accommodated in the accommodation space 15. The light source 20 and the image sensor 40 are directly disposed on the substrate 11, and the terminal 1000 can power the light source 20 and the image sensor 40 through the substrate 11 to ensure that they can work normally.
  • the spacer 14 is arranged in the accommodation space 15, and the spacer 14 is used for separating the light source 20 and the image sensor 40.
  • the spacer 14 is used for blocking part of the light, so as to prevent light inside the accommodation space from forming crosstalk light, thereby reducing the interference of the crosstalk light on the image sensor 40.
  • a gap 16 is formed between the spacer plate 14 and the substrate 11.
  • when the light source 20 emits light and the lens 30 reflects the light, the light reflected by the lens 30 is incident on the detection pixel 42 through the gap 16, so that the detection pixel 42 generates a trigger signal.
  • the width of the gap 16 is determined according to the maximum height, at the gap 16, of the light reflected by the lens 30 onto the detection pixel 42.
  • the detection pixel 42 is located on the side of the image sensor 40 close to the light source 20.
  • since the light emitted by the light source 20 is reflected toward the detection pixel 42 through the lens 30, without the spacer 14 the light reflected by the lens 30 would fall on both the photosensitive pixels 41 and the detection pixels 42. Because this light has not been reflected by the target object, it interferes with the image sensor 40 when it falls on the photosensitive pixels 41. Therefore, in order to ensure that the light reflected by the lens 30 reaches only the detection pixels 42 and not the photosensitive pixels 41, the spacer 14 is required, forming the gap 16 between the spacer 14 and the substrate 11.
  • the width of the gap 16 is related to the maximum height of the light reflected by the lens 30 at the gap 16 .
  • let ray L be the highest ray reflected by the lens 30 that should still pass through the gap 16 to the detection pixel 42, and let H be the distance between ray L and the substrate 11 at the position of the spacer plate 14.
  • rays above ray L would, in theory, be reflected onto the photosensitive pixels 41 (such as ray M and ray X), so the spacer 14 needs to block all light higher than ray L to ensure that this part of the light is not reflected to the photosensitive pixels 41. Therefore, the height of the gap 16 should be the distance H between ray L and the substrate 11 at the position of the spacer plate 14, ensuring that the light reflected by the lens 30 falls only on the detection pixel 42, so as to reduce the interference of crosstalk light on the image sensor 40.
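The gap-sizing rule above can be sketched geometrically: treating each reflected ray as a straight line, a ray passes the spacer only if its height at the spacer position does not exceed the gap height H. The positions, slopes, and units below are hypothetical, chosen only to illustrate the blocking criterion:

```python
def height_at(x_spacer, x0, y0, slope):
    """Height of a straight ray y = y0 + slope*(x - x0) at the spacer position."""
    return y0 + slope * (x_spacer - x0)

def passes_gap(x_spacer, gap_height, x0, y0, slope):
    """A ray reaches the detection pixel only if it fits under the spacer."""
    return height_at(x_spacer, x0, y0, slope) <= gap_height

# Ray L just grazes the gap and passes; ray M above it is blocked.
x_spacer, H = 2.0, 0.5                            # arbitrary units
print(passes_gap(x_spacer, H, 0.0, 1.5, -0.5))    # ray L: height 0.5 -> True
print(passes_gap(x_spacer, H, 0.0, 1.6, -0.5))    # ray M: height 0.6 -> False
```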
  • in the related art, a spacer 14 is often provided between the light source 20 and the image sensor 40, and the spacer 14 is directly connected to the substrate 11 so as to separate the light source 20 from the image sensor 40.
  • the side of the image sensor 40 close to the light source 20 needs to be connected to the circuit board of the substrate 11 with a connection line, and the connection line is located between the image sensor 40 and the light source 20.
  • in the embodiment of the present application, since the gap 16 is formed between the spacer plate 14 and the substrate 11, the connection line connecting the image sensor 40 to the circuit board of the substrate 11 can be arranged at the position of the gap 16. No extra space is needed to place the connection line, so the distance between the image sensor 40 and the light source 20 can be reduced, thereby reducing the required size of the time-of-flight module 100.
  • the light source 20 may be a vertical-cavity surface-emitting laser (VCSEL), an edge-emitting semiconductor laser (EEL), a light-emitting diode (LED), or another light source. It may be a point light source composed of a single laser or diode, or an array light source composed of multiple lasers or diodes. The light source 20 can emit laser light to the target object under the control of the time-of-flight module 100 for dToF ranging.
  • the light source 20 in the embodiment of the present application is a VCSEL, and it can be understood that the light source 20 is not limited to the VCSEL.
  • the lens 30 is located on the side where the light source 20 emits light.
  • the curvature of the lens 30 needs to be greater than a preset curvature.
  • the preset curvature is 45 degrees.
  • when the light source 20 emits light toward the lens 30, the light enters the lens 30 from the air, that is, the light passes from one medium into another; the light is therefore refracted by the lens 30 and also partially reflected at the surface of the lens 30.
  • the total energy of the light is fixed.
  • the energy of the light is divided into two parts, and the energy of the reflected light and the energy of the refracted light trade off against each other so that the total energy of the light does not change. For example, when the energy of the reflected ray increases, the energy of the refracted ray decreases by the same amount.
  • when the lens 30 is a high-curvature lens, the incident angle of the light is larger than with a low-curvature lens, resulting in stronger reflection of the light at the surface of the lens 30.
  • the greater the energy of the reflected light, the smaller the energy of the light refracted by the lens 30.
  • the energy of the refracted light is therefore smaller than that of light refracted by a low-curvature lens.
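The energy trade-off described here is consistent with the Fresnel equations: at a dielectric interface, reflectance grows with the incident angle, so a high-curvature surface (larger incident angles) reflects more and refracts less. A minimal sketch, assuming unpolarized light and refractive indices n1 = 1.0 for air and n2 = 1.5 for the lens (values not taken from the patent):

```python
import math

def fresnel_reflectance(theta_i_deg, n1=1.0, n2=1.5):
    """Unpolarized Fresnel reflectance at a dielectric interface."""
    ti = math.radians(theta_i_deg)
    # Snell's law gives the refraction angle.
    tt = math.asin(n1 / n2 * math.sin(ti))
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
          (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return (rs + rp) / 2

# Reflectance (reflected energy) rises with the incident angle,
# leaving correspondingly less energy in the refracted ray.
print(fresnel_reflectance(10) < fresnel_reflectance(60))  # True
```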
  • when the time-of-flight module 100 is installed in the terminal 1000 and the part of the casing 200 covering the time-of-flight module 100 is glass, the light refracted by the lens 30 and incident on the casing 200 forms reflections inside the casing 200. Because the refracted light is weak, the light reflected by the casing 200 back into the time-of-flight module 100 is also weak.
  • the energy of the crosstalk light Z thus formed is small, which reduces the impact of the crosstalk light Z caused by the reflection of the casing 200 on the time-of-flight module 100.
  • the curvature of the lens 30 needs to be greater than a predetermined curvature, wherein the predetermined curvature is 45 degrees.
  • the curvature of the lens 30 is specifically expressed as the included angle θ between the optical axis K of the lens 30 and the line connecting the center point O of the lens 30 to the edge point P of the curved edge of the lens 30.
  • the curvature being greater than the preset curvature means that the included angle θ must be greater than 45 degrees.
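The curvature criterion can be sketched as a simple geometric check: given the center point O and edge point P, θ is the angle between line OP and the optical-axis direction K, and the lens qualifies when θ exceeds 45 degrees. The coordinates below are hypothetical illustrations, not dimensions from the patent:

```python
import math

def curvature_angle_deg(O, P, axis=(0.0, 1.0)):
    """Angle between line OP and the optical-axis direction K (a unit vector)."""
    v = (P[0] - O[0], P[1] - O[1])
    dot = v[0] * axis[0] + v[1] * axis[1]
    norm = math.hypot(*v) * math.hypot(*axis)
    return math.degrees(math.acos(dot / norm))

def meets_preset_curvature(O, P, preset_deg=45.0):
    return curvature_angle_deg(O, P) > preset_deg

# Edge point farther from the axis -> larger angle -> high-curvature lens.
print(meets_preset_curvature((0, 0), (2, 1)))  # theta ~63.4 deg -> True
print(meets_preset_curvature((0, 0), (1, 2)))  # theta ~26.6 deg -> False
```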
  • the image sensor 40 includes photosensitive pixels 41 , detection pixels 42 and a timer 401 .
  • the timer 401 starts counting when the detection pixel 42 receives the light reflected by the lens 30 , and stops counting when the photosensitive pixel 41 receives the light reflected by the target object, so as to generate the receiving time.
  • the timer 401 is a timing circuit, for example, a time-to-digital converter (Time To Digital Converter, TDC) circuit.
  • the photosensitive pixel 41, the detection pixel 42 and the timer 401 are connected through a circuit. After the photosensitive pixel 41 or the detection pixel 42 receives light, it converts photons into electrons and amplifies them into a voltage change signal, which is transmitted through the circuit to the timer 401, thereby triggering the timer 401 to start or stop timing.
  • the timer 401 converts the time signals carried by the signals transmitted by the photosensitive pixel 41 and the detection pixel 42 into digital signals, so as to obtain the receiving time. For example, when the timer 401 starts counting, the time signal at the start moment is converted into a digital signal; when the timer 401 stops counting, the time signal at the stop moment is converted into a digital signal. The receiving time is the time difference between the start moment and the stop moment.
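A TDC measures the interval as a digital count: the receiving time is the difference between the stop and start codes multiplied by the converter's time resolution. A coarse sketch, where the 1 GHz reference clock is an assumed value, not a figure from the patent:

```python
class SimpleTDC:
    """Coarse time-to-digital converter model: timestamps are quantized
    to ticks of a reference clock and the interval is their difference."""
    def __init__(self, clock_hz=1.0e9):   # assumed 1 GHz -> 1 ns resolution
        self.resolution = 1.0 / clock_hz
        self.start_code = None

    def start(self, t_seconds):
        self.start_code = round(t_seconds / self.resolution)

    def stop(self, t_seconds):
        stop_code = round(t_seconds / self.resolution)
        # Receiving time = (stop code - start code) * resolution.
        return (stop_code - self.start_code) * self.resolution

tdc = SimpleTDC()
tdc.start(0.0)
print(tdc.stop(10.0e-9))  # -> 1e-08 (a 10 ns interval)
```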
  • Specifically, when the detection pixel 42 receives the light reflected by the lens 30 , it generates a trigger signal, and on receiving the trigger signal the timer 401 starts timing (counting from 0). When the photosensitive pixel 41 receives the light reflected by the target object, it generates a receiving signal, and on receiving the receiving signal the timer 401 stops timing. Since timing starts from 0, the value at the stop moment is the time taken from the start of timing to the stop of timing, that is, the receiving time, and the image sensor 40 can generate the depth information of the target object according to the receiving time.
  • The photosensitive pixel 41 and the detection pixel 42 may share one timer 401 , that is, the circuits of the photosensitive pixel 41 and the detection pixel 42 are connected to the same timer 401 . When the detection pixel 42 generates the trigger signal and the photosensitive pixel 41 generates the receiving signal, the timer 401 starts and stops timing respectively, so as to obtain the receiving time.
  • Alternatively, the photosensitive pixel 41 and the detection pixel 42 may each be connected to a timer 401 of its own, that is, the image sensor 40 includes two timers 401 acting on the photosensitive pixel 41 and the detection pixel 42 respectively. When the detection pixel 42 generates the trigger signal, the timer 401 acting on the detection pixel 42 starts timing and, through the circuit, the timer 401 acting on the photosensitive pixel 41 also starts timing; when the photosensitive pixel 41 generates the receiving signal, the timer 401 acting on it stops timing, so as to obtain the receiving time.
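  • The shared-timer behavior described above can be sketched as follows; the class and variable names, and the example target distance, are illustrative assumptions rather than the patent's implementation:

```python
C = 299_792_458.0  # speed of light, m/s

class SharedTimer:
    """Starts on the detection pixel's trigger signal (counting from 0)
    and stops on the photosensitive pixel's receiving signal."""
    def __init__(self):
        self.start_ns = None
        self.receive_time_ns = None

    def on_trigger(self, t_ns):
        # Detection pixel received the light reflected by the lens.
        self.start_ns = t_ns

    def on_receive(self, t_ns):
        # Photosensitive pixel received the light reflected by the target;
        # elapsed time since the trigger is the receiving time.
        self.receive_time_ns = t_ns - self.start_ns

timer = SharedTimer()
timer.on_trigger(0.0)                      # trigger signal: timing starts
round_trip_ns = 2 * 1.5 / C * 1e9          # hypothetical target 1.5 m away
timer.on_receive(round_trip_ns)            # receiving signal: timing stops
depth_m = C * timer.receive_time_ns * 1e-9 / 2
```

  • The receiving time doubles as the flight time of the light, so halving the distance it implies recovers the sensor-to-target depth.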
  • the photosensitive pixels 41 and the detection pixels 42 are arranged in a matrix, and the detection pixels 42 include a predetermined column of pixels close to the light source 20 in the matrix.
  • the direction of the columns of the matrix is perpendicular to the arrangement direction of the image sensor 40 and the light source 20 .
  • The pixels of the image sensor 40 are arranged in 5 rows and 6 columns; the column direction of the matrix is the A direction, the height direction of the image sensor 40 is the B direction, and the arrangement direction of the image sensor 40 and the light source 20 is the C direction.
  • the A direction is perpendicular to the B direction and the C direction
  • the B direction is perpendicular to the C direction.
  • The column of pixels of the image sensor 40 closest to the light source 20 , that is, the sixth column, consists of the detection pixels 42 .
  • The other columns are photosensitive pixels 41 . A microlens array 43 (Micro Lens Array, MLA) is provided on the photosensitive pixels 41 , and the microlens array 43 includes a plurality of microlenses 44 , ensuring that, after being refracted by the microlenses 44 , the light reflected by the target object can better enter the photosensitive pixels 41 , improving the photosensitive effect of the photosensitive pixels 41 .
  • The light emitted toward the edge of the lens 30 is reflected by the lens 30 to the detection pixel 42 , at which point the detection pixel 42 can generate a trigger signal.
  • The light emitted toward the middle of the lens 30 is refracted by the lens 30 to reach the target object and is reflected by the target object onto the photosensitive pixel 41 , which then generates a receiving signal; the timer 401 receives the trigger signal and the receiving signal to start and stop timing respectively, so as to determine the receiving time.
  • In this way, the moments at which the timer 401 starts and stops timing do not distort the receiving time; moreover, since the speed of light is extremely fast, the time for the light to be reflected by the lens 30 to the detection pixel 42 is negligible. This ensures that the image sensor 40 accurately obtains the moment at which the light source 20 emits light, and hence the accuracy of the depth information of the target object generated by the image sensor 40 .
  • The detection pixel 42 of the image sensor 40 may also be provided with a microlens 44 , the microlens 44 being offset toward the side of the detection pixel 42 close to the top plate 12 . The light reflected by the lens 30 toward the detection pixel 42 is then refracted by the microlens 44 on the detection pixel 42 , so that more light enters the detection pixel 42 , thereby improving the photosensitive intensity of the detection pixel 42 .
  • For the detection pixel 42 to generate a trigger signal, the light needs to penetrate into the interior of the detection pixel 42 ; for example, when the light penetrates 3 microns to 5 microns into the detection pixel 42 , the detection pixel 42 can generate the trigger signal.
  • When the detection pixel 42 is not provided with a microlens 44 , the light E would, in theory, be incident directly on the edge of the detection pixel 42 ; since the angle between the light E and the detection pixel 42 is relatively small, the light E cannot enter the detection pixel 42 well, resulting in a weak photosensitive intensity of the detection pixel 42 . When the detection pixel 42 is provided with a microlens 44 , the light E can be refracted by the microlens 44 to change the angle at which it enters the detection pixel 42 , thereby increasing the photosensitive intensity of the detection pixel 42 .
  • In the time-of-flight module 100 of the present application, the height and offset distance of the microlens 44 on the detection pixel 42 can be adjusted so that this microlens 44 refracts, toward the detection pixel 42 , the light reflected by the lens 30 that would otherwise have entered the photosensitive pixel 41 . On the one hand, less of this light enters the photosensitive pixel 41 , reducing the interference of crosstalk light and improving the accuracy of the depth information of the target object generated by the image sensor 40 ; on the other hand, refracting this light to the detection pixel 42 improves the photosensitive intensity of the detection pixel 42 , thereby improving the accuracy with which the trigger signal is generated.
  • The microlens 44 on the detection pixel 42 is used to refract the light reflected by the lens 30 toward the detection pixel 42 so that the light enters the detection pixel 42 at a larger angle and can better penetrate into its interior. The offset direction and offset distance of the microlens 44 on the detection pixel 42 therefore need to be determined according to the angle of the light reflected by the lens 30 toward the detection pixel 42 .
  • The incident angle of the light T1 reflected by the lens is θ1 (the angle between the light and the detection pixel 42 ). If the detection pixel 42 is not provided with a microlens 44 , the light T1 would be incident directly on the photosensitive pixel 41 ; when the detection pixel 42 is provided with a microlens 44 , the light T1 is refracted at point Y1 of that microlens 44 , but the refracted light T1 would still be incident on the photosensitive pixel 41 . In this case, the microlens 44 on the detection pixel 42 needs to be shifted toward the side close to the light source 20 to change the position at which the light T1 is refracted on the microlens 44 .
  • When the incident angles of light T1 and light T2 are θ1 and θ2 respectively, with θ1 greater than θ2, both rays would finally be incident on the photosensitive pixel 41 regardless of whether a microlens 44 is provided on the detection pixel 42 , but the incident position of light T2 is farther from the detection pixel 42 than that of light T1. The offset of the microlens 44 on the detection pixel 42 therefore needs to be larger than that of the microlens 44 in the preceding case (as shown in FIG. d)).
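  • The dependence of the required microlens offset on the incident angle can be illustrated with a small geometric sketch; the function, heights and angles below are hypothetical, not values from the patent:

```python
import math

def landing_offset_um(height_um, grazing_angle_deg):
    """Horizontal distance a ray entering at height_um above the pixel
    plane travels before landing, for a grazing angle measured between
    the ray and the pixel plane (as in the description above)."""
    return height_um / math.tan(math.radians(grazing_angle_deg))

# A shallower ray (theta2 < theta1) lands farther from the detection
# pixel, so its microlens must be offset farther toward the light source.
d1 = landing_offset_um(3.0, 30.0)  # steeper ray, like T1
d2 = landing_offset_um(3.0, 20.0)  # shallower ray, like T2
```

  • Here d2 exceeds d1, matching the observation that the smaller-angle ray T2 requires the larger microlens offset.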
  • the time-of-flight module 100 further includes a first reflective member 50 , and the first reflective member 50 is disposed on a side of the detection pixel 42 close to the top plate 12 and away from the light source 20 .
  • The first reflective member 50 is used to reflect the light reflected by the lens 30 , ensuring that this light does not enter the photosensitive pixel 41 and therefore that no crosstalk light forms inside the image sensor 40 , so as to ensure the accuracy with which the image sensor 40 generates the depth information of the target object.
  • The height of the first reflective member 50 is determined according to the maximum height, at the position of the first reflective member 50 , of the light reflected by the lens 30 . As shown in FIG. 11 , if the ray G is the highest ray reflected by the lens 30 into the image sensor 40 , then when the first reflective member 50 can reflect the ray G, it can reflect every ray reflected by the lens 30 into the image sensor 40 . The height of the first reflective member 50 therefore needs to be greater than the height of the ray G when it reaches the position of the first reflective member 50 , ensuring that the light reflected by the lens 30 does not enter the photosensitive pixel 41 , so as to ensure the accuracy of the depth information of the target object generated by the image sensor 40 .
  • The time-of-flight module 100 may also include both the first reflective member 50 and the second reflective member 60 .
  • the second reflector 60 is disposed on a side of the detection pixel 42 close to the top plate 12 and close to the light source 20 .
  • The second reflective member 60 is used to reflect the light reflected by the first reflective member 50 . When the light reflected by the lens 30 reaches the first reflective member 50 , it is reflected by the first reflective member 50 to the second reflective member 60 , and then reflected by the second reflective member 60 to the detection pixel 42 to generate a trigger signal. In this way, through the first reflective member 50 and the second reflective member 60 , the detection pixel 42 can receive light that would theoretically have been reflected by the lens 30 to the position of the photosensitive pixel 41 , increasing the photosensitive intensity of the detection pixel 42 and thus the accuracy with which the trigger signal is generated.
  • The height of the second reflective member 60 is determined according to the maximum height, at the position of the second reflective member 60 , of the light reflected by the first reflective member 50 : it needs to be greater than the height of the ray R when the ray R reaches the position of the second reflective member 60 , so that the light reflected by the first reflective member 50 is reflected to the detection pixel 42 , increasing the photosensitive intensity of the detection pixel 42 and the accuracy with which the trigger signal is generated. At the same time, the height of the second reflective member 60 needs to be further adjusted according to the minimum height, at the position of the second reflective member 60 , of the light reflected directly by the lens 30 : it must be less than this minimum height, so that the second reflective member 60 does not block the light reflected by the lens 30 from entering the detection pixel 42 directly.
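  • The two height constraints on the second reflective member 60 (above the highest ray bounced off the first reflective member 50 , below the lowest ray arriving directly from the lens 30 ) can be sketched as a feasibility window; the function name and numbers are illustrative, not from the patent:

```python
def second_reflector_height_window(reflected_ray_heights_um, direct_ray_heights_um):
    """Return (min_allowed, max_allowed) height for the second reflector,
    or None if no height satisfies both constraints."""
    lo = max(reflected_ray_heights_um)   # must exceed the highest bounced ray
    hi = min(direct_ray_heights_um)      # must stay below the lowest direct ray
    return (lo, hi) if lo < hi else None

# Hypothetical ray heights (micrometers) at the second reflector's position.
window = second_reflector_height_window([2.0, 3.5, 4.1], [6.0, 7.2])
```

  • With these sample rays the reflector height must lie between 4.1 µm and 6.0 µm; if the constraints conflict, no valid height exists.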
  • a third reflective member 70 and a fourth reflective member 80 may also be disposed on the detection pixel 42 .
  • the third reflective member 70 and the fourth reflective member 80 are arranged opposite to each other, and form a closed space 90 surrounded by the first reflective member 50 and the second reflective member 60 .
  • It cannot be guaranteed that all the light reflected by the lens 30 will strike the first reflective member 50 .
  • For example, the included angle θ between the light U and the detection pixel 42 is small; in theory, without the fourth reflective member 80 , the light U would ultimately not enter the detection pixel 42 . The fourth reflective member 80 can therefore be provided to block light traveling in this direction at a small angle to the detection pixel 42 , so that such light can still be incident into the detection pixel 42 .
  • Such light is reflected by the fourth reflective member 80 on the same principle as the first reflective member 50 and the second reflective member 60 , and the third reflective member 70 can be provided to ensure that the light reflected by the fourth reflective member 80 is blocked and reflected by the third reflective member 70 to the detection pixel 42 . In this way, incident light from any direction can, through the cooperation of the first reflective member 50 , the second reflective member 60 , the third reflective member 70 and the fourth reflective member 80 , be incident on the detection pixel 42 , increasing the photosensitive intensity of the detection pixel 42 and the accuracy with which the trigger signal is generated.
  • In summary, when the light source 20 emits light, the lens 30 reflects part of the light to the detection pixel 42 . When the detection pixel 42 receives this light, the timer 401 starts timing; when the photosensitive pixel 41 receives the light reflected by the target object, the timer 401 stops timing to obtain the receiving time. The receiving time is the flight time of the light, from which the image sensor 40 can generate the depth information of the target object.
  • An embodiment of the present application further provides a depth detection method, comprising the steps of: controlling the light source 20 to emit light, the lens 30 reflecting part of the light to the detection pixel 42 ; starting the timer 401 when the detection pixel 42 receives the light reflected by the lens 30 ; stopping the timer 401 when the photosensitive pixel 41 receives the light reflected by the target object, to generate the receiving time; and generating the depth information of the target object according to the receiving time.
  • The depth detection method of the embodiments of the present application can be applied to the time-of-flight module 100 described in the present application.
  • The circuits of the photosensitive pixel 41 and the detection pixel 42 are connected with the circuit of the timer 401 . When the detection pixel 42 receives the light reflected by the lens 30 , the timer 401 starts timing (counting from 0); when the photosensitive pixel 41 receives the light reflected by the target object, the timer 401 stops timing. The time taken from the start of timing to the stop of timing is the receiving time, that is, the flight time from when the light source emits light until the light reaches the target object and is reflected back to the photosensitive pixel 41 . Once the flight time is known, the distance between the image sensor 40 and the target object, i.e. the depth information of the target object, can be calculated according to the following formula.
  • d = (c × Δt) / 2, where d is the distance between the image sensor 40 and the target object, Δt is the time difference from the start of timing to the stop of timing (that is, the receiving time, i.e. the flight time of the light), and c is the speed of light.
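  • The formula above can be expressed as a minimal sketch, assuming the receiving time Δt is supplied in seconds; the function name is illustrative:

```python
C = 299_792_458.0  # speed of light c, m/s

def depth_from_receive_time(dt_s):
    """Distance d between the image sensor and the target, in meters.
    The light covers the sensor-target path twice (out and back),
    hence the division by 2."""
    return C * dt_s / 2

d = depth_from_receive_time(10e-9)  # a 10 ns receiving time -> ~1.5 m
```

  • A receiving time of 10 ns thus corresponds to a depth of roughly 1.499 m.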
  • The depth detection method of the embodiments of the present application controls the light source 20 to emit light and reflects part of the light to the detection pixel 42 through the lens 30 , at which point the timer 401 starts timing; the time from the start of timing until the photosensitive pixel 41 receives the light reflected by the target object, when timing stops, is the receiving time, and the image sensor 40 can generate the depth information of the target object according to the receiving time.
  • Because the timer 401 starts timing on the trigger signal generated by the detection pixel 42 , the moment at which the timer 401 receives the trigger signal (the time for the lens 30 to reflect light to the detection pixel being negligible given the speed of light, or deducted in advance after calculation) is the actual light-emitting moment of the light source 20 . The receiving time is therefore independent of the temperature drift of the light source 20 , eliminating the inaccuracy of the timing start point caused by temperature drift, ensuring the accuracy of the timing and hence of the depth information of the target object generated by the image sensor 40 .
  • The terms "first" and "second" are used for descriptive purposes only and cannot be interpreted as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Features defined as "first" or "second" may explicitly or implicitly include at least one such feature. Unless otherwise specifically defined, "plurality" means at least two, such as two or three.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a time-of-flight module (100), a terminal (1000) and a depth detection method. The time-of-flight module (100) comprises a light source (20), a lens (30) and an image sensor (40). When a detection pixel (42) of the image sensor (40) receives the light reflected by the lens (30), a timer (401) starts timing; when a photosensitive pixel (41) of the image sensor (40) receives the light reflected by a target object, the timer (401) stops timing so as to obtain the receiving time; and the image sensor (40) generates depth information of the target object according to the receiving time.
PCT/CN2022/083585 2021-06-10 2022-03-29 Module de temps de vol, terminal et procédé de détection de profondeur WO2022257558A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110646449.5 2021-06-10
CN202110646449.5A CN113419252A (zh) 2021-06-10 2021-06-10 飞行时间模组、终端及深度检测方法

Publications (1)

Publication Number Publication Date
WO2022257558A1 true WO2022257558A1 (fr) 2022-12-15

Family

ID=77788386

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/083585 WO2022257558A1 (fr) 2021-06-10 2022-03-29 Module de temps de vol, terminal et procédé de détection de profondeur

Country Status (2)

Country Link
CN (1) CN113419252A (fr)
WO (1) WO2022257558A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419252A (zh) * 2021-06-10 2021-09-21 Oppo广东移动通信有限公司 飞行时间模组、终端及深度检测方法

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07248374A (ja) * 1994-03-10 1995-09-26 Nikon Corp 距離測定装置
US20050094238A1 (en) * 2002-02-12 2005-05-05 Juha Kostamovaara Method and arrangement for performing triggering and timing of triggering
CN101825703A (zh) * 2010-01-25 2010-09-08 华北电力大学(保定) 改进的脉冲激光测距装置及使用该装置的激光测距方法
CN201749190U (zh) * 2010-01-25 2011-02-16 华北电力大学(保定) 使用连续激光源的脉冲激光测距装置
CN104483676A (zh) * 2014-12-04 2015-04-01 北京理工大学 一种3d/2d非扫描激光雷达复合成像装置
CN105423960A (zh) * 2015-12-31 2016-03-23 国网辽宁省电力有限公司沈阳供电公司 基于激光定位的导线风偏监测装置
CN205352325U (zh) * 2015-12-31 2016-06-29 国网辽宁省电力有限公司沈阳供电公司 基于激光定位的导线风偏监测装置
CN111602069A (zh) * 2018-01-30 2020-08-28 索尼半导体解决方案公司 用于检测距离的电子设备
CN113419252A (zh) * 2021-06-10 2021-09-21 Oppo广东移动通信有限公司 飞行时间模组、终端及深度检测方法
CN215728840U (zh) * 2021-08-20 2022-02-01 深圳市灵明光子科技有限公司 飞行时间测距传感模组及终端

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5959727A (en) * 1997-08-01 1999-09-28 Raytheon Company System and method for discriminating between direct and reflected electromagnetic energy
TWI509292B (zh) * 2011-09-07 2015-11-21 Hon Hai Prec Ind Co Ltd 鏡片及具有該鏡片的鏡頭模組
US10529763B2 (en) * 2018-04-19 2020-01-07 Semiconductor Components Industries, Llc Imaging pixels with microlenses
WO2020022206A1 (fr) * 2018-07-27 2020-01-30 株式会社小糸製作所 Dispositif de mesure de distance
CN109151271A (zh) * 2018-08-22 2019-01-04 Oppo广东移动通信有限公司 激光投射模组及其控制方法、图像获取设备和电子装置
CN109271916B (zh) * 2018-09-10 2020-09-18 Oppo广东移动通信有限公司 电子装置及其控制方法、控制装置和计算机可读存储介质
CN112235494B (zh) * 2020-10-15 2022-05-20 Oppo广东移动通信有限公司 图像传感器、控制方法、成像装置、终端及可读存储介质
CN112505713A (zh) * 2020-11-27 2021-03-16 Oppo(重庆)智能科技有限公司 距离测量装置及方法、计算机可读介质和电子设备

Also Published As

Publication number Publication date
CN113419252A (zh) 2021-09-21

Similar Documents

Publication Publication Date Title
KR102319494B1 (ko) 차량 센서들에 대한 가변 빔 간격, 타이밍, 및 전력
WO2021072802A1 (fr) Système et procédé de mesure de distance
CN109613515A (zh) 一种激光雷达系统
CN109597050A (zh) 一种激光雷达
US20210041534A1 (en) Distance measurement module, distance measurement method, and electronic apparatus
JP2023549774A (ja) 伝送光学パワーモニタを伴うLiDARシステム
EP3611533B1 (fr) Appareil pour fournir une pluralité de faisceaux de lumière
US11686819B2 (en) Dynamic beam splitter for direct time of flight distance measurements
CN111812663A (zh) 深度测量模组及系统
WO2022011974A1 (fr) Système et procédé de mesure de distance et support d'enregistrement lisible par ordinateur
KR20210059645A (ko) 레이저 출력 어레이 및 이를 이용한 라이다 장치
WO2022257558A1 (fr) Module de temps de vol, terminal et procédé de détection de profondeur
WO2021208582A1 (fr) Appareil d'étalonnage, système d'étalonnage, dispositif électronique et procédé d'étalonnage
CN111580120A (zh) 飞行时间tof装置和电子设备
CN111007523A (zh) 飞行时间发射器、飞行时间深度模组和电子装置
WO2022042078A1 (fr) Source de lumière laser, unité d'émission de lumière et lidar
WO2020223879A1 (fr) Appareil de mesure de distance et plateforme mobile
CN111983630B (zh) 一种单光子测距系统、方法、终端设备及存储介质
CN213210474U (zh) 飞行时间tof装置和电子设备
WO2024050902A1 (fr) Caméra itof, procédé d'étalonnage et dispositif associé
WO2022088914A1 (fr) Dispositif photosensible et système de télémétrie par temps de vol
WO2022226893A1 (fr) Dispositif de détection laser, procédé et dispositif de commande associés et terminal
CN210243829U (zh) 一种激光雷达系统及激光测距装置
JP2023507384A (ja) 物体までの距離を計算するためのlidar装置及び方法
CN219302660U (zh) 一种扫描式激光雷达

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819163

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22819163

Country of ref document: EP

Kind code of ref document: A1