US20200400796A1 - Time of flight device and time of flight method - Google Patents

Time of flight device and time of flight method

Info

Publication number
US20200400796A1
US20200400796A1 (application US16/898,405)
Authority
US
United States
Prior art keywords
time
digital converter
pulse signal
depth data
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/898,405
Inventor
Ping-Hung Yin
Jia-Shyang Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Tyrafos Semiconductor Technologies Co Ltd
Original Assignee
Guangzhou Tyrafos Semiconductor Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Tyrafos Semiconductor Technologies Co Ltd filed Critical Guangzhou Tyrafos Semiconductor Technologies Co Ltd
Priority to US16/898,405
Assigned to GUANGZHOU TYRAFOS SEMICONDUCTOR TECHNOLOGIES CO., LTD reassignment GUANGZHOU TYRAFOS SEMICONDUCTOR TECHNOLOGIES CO., LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WANG, JIA-SHYANG, YIN, PING-HUNG
Publication of US20200400796A1
Legal status: Abandoned


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/4808 Evaluating distance, position or velocity data
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G01S7/484 Transmitters
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S7/4865 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/14 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein a voltage or current pulse is initiated and terminated in accordance with the pulse transmission and echo reception respectively, e.g. using counters
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • FIG. 2 is a signal timing diagram of the various signals and the light pulse shown in FIG. 1 according to an embodiment of the invention.
  • the first time-to-digital converter 130 and the second time-to-digital converter 150 are simultaneously enabled at a first time T1 to start counting, and the counting result follows a characteristic conversion curve TDC_C shown in FIG. 2.
  • the driving circuit 110 may provide the pulse signal PL and the reference pulse signal RPL at a second time T2 simultaneously.
  • the driving circuit 110 outputs the pulse signal PL to the sensing light source 120 so that the sensing light source 120 emits the light pulse LP to the sensing object 200 based on the pulse signal PL.
  • Since a time difference between the pulse signal PL outputted by the driving circuit 110 and the light pulse LP emitted by the sensing light source 120 is extremely short, for convenience of explanation they are considered as being generated simultaneously in FIG. 2. Nonetheless, whether the pulse signal PL and the light pulse LP are generated simultaneously or not does not affect the operation of the invention.
  • the driving circuit 110 outputs the reference pulse signal RPL to the first time-to-digital converter 130 so that the first time-to-digital converter 130 determines the first depth data D1 based on the first time T1 and the second time T2 at which the reference pulse signal RPL is received.
  • the first depth data D1 corresponds to the depth information of the characteristic conversion curve TDC_C between the first time T1 and the second time T2, which may be distorted.
  • the sensing pixel 140 provides the reflected pulse signal RP to the second time-to-digital converter 150 .
  • Since a time difference between the reflected light pulse RLP received or sensed by the sensing pixel 140 and the reflected pulse signal RP outputted by the sensing pixel 140 is extremely short, they are considered as being generated simultaneously in FIG. 2. Nonetheless, whether the reflected light pulse RLP and the reflected pulse signal RP are generated simultaneously or not does not affect the operation of the invention.
  • the second time-to-digital converter 150 determines the second depth data D2 based on the first time T1 and the third time T3 at which the reflected pulse signal RP is received.
  • the second time T2 is between the first time T1 and the third time T3.
  • the second depth data D2 corresponds to depth information of the characteristic conversion curve TDC_C between the first time T1 and the third time T3.
  • the processing circuit 160 of the present embodiment may receive the first depth data D1 and the second depth data D2 provided by the first time-to-digital converter 130 and the second time-to-digital converter 150, and then the processing circuit 160 may subtract the first depth data D1 from the second depth data D2 to obtain real depth data D3.
  • the processing circuit 160 of the present embodiment deducts the part of the second depth data D2 that may contain distorted depth information, so that the depth information corresponding to the characteristic conversion curve TDC_C between the second time T2 and the third time T3 may be obtained.
  • the characteristic conversion curve TDC_C has a linear curve change between the second time T2 and the third time T3. Therefore, the time of flight device 100 of the present embodiment can accurately obtain depth information of the sensing object 200.
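To make the cancellation concrete, the following sketch models a TDC whose conversion curve is compressed for a short settling period after enable. The 4 ns settling time, 0.7 gain, and the timestamps are invented illustrative numbers, not values from the patent; only the structure of the computation follows the embodiment.

```python
def tdc_curve(t_s):
    """Toy TDC conversion curve: compressed (nonlinear) for the first
    4 ns after enable, then linear with unit slope.  The settling time
    and the 0.7 gain are hypothetical illustrative values."""
    settle = 4e-9
    if t_s < settle:
        return 0.7 * t_s                      # distorted early region
    return 0.7 * settle + (t_s - settle)      # linear region afterwards

t1, t2, t3 = 0.0, 5e-9, 15e-9  # enable, pulse out, echo received
d1 = tdc_curve(t2 - t1)        # first TDC: reference depth data D1
d2 = tdc_curve(t3 - t1)        # second TDC: sensed depth data D2
d3 = d2 - d1                   # both readings contain the same distorted
                               # early region, so it cancels out
assert abs(d3 - (t3 - t2)) < 1e-15
```

Because both converters start counting at the same first time T1, the distorted portion of the curve contributes equally to D1 and D2 and vanishes in the subtraction, leaving only the linear segment between T2 and T3.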
  • In other words, since an enabling time (the first time T1) of the time-to-digital converters 130 and 150 is not the same as the time (the second time T2) at which the driving circuit 110 outputs the pulse signal PL, the time length (T2-T1) between the enabling time and the time for outputting the pulse signal PL will be affected by jitter.
  • If the reference pulse signal RPL and the two-converter architecture described in this embodiment are not used, the result of time of flight will be different each time due to jitter.
  • Since the traditional time of flight architecture cannot obtain information regarding jitter, the impact of jitter cannot be deducted.
  • In contrast, the time of flight device 100 of the present embodiment can obtain the information regarding jitter through the reference pulse signal RPL and the output result (the first depth data D1) of the first time-to-digital converter 130.
  • the time of flight device 100 of the present embodiment can obtain jitter information of the pulse signal PL, and can deduct the impact of jitter to obtain real depth information. That is to say, for the time of flight device 100 of the present embodiment, in a fixed scene, the result of time of flight will be the same each time without being affected by jitter.
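The jitter cancellation can be sketched the same way. In this hypothetical simulation (the 5 ns nominal launch delay and the 2 ns jitter bound are made-up numbers), every cycle fires the pulse at a randomly jittered time, yet the subtraction returns the same time of flight each cycle:

```python
import random

C = 299_792_458.0  # speed of light (m/s)

def measure_once(distance_m, jitter_s=2e-9):
    """One sensing cycle of the two-TDC scheme.  Both TDCs start
    counting at t1; the driving circuit fires the pulse signal and the
    reference pulse signal at a jittered t2; the echo arrives at t3."""
    tof = 2.0 * distance_m / C                       # round-trip time
    t1 = 0.0                                         # TDCs enabled
    t2 = 5e-9 + random.uniform(-jitter_s, jitter_s)  # jittered launch
    t3 = t2 + tof                                    # echo received
    d1 = t2 - t1  # first TDC: reference data D1 (carries the jitter)
    d2 = t3 - t1  # second TDC: sensed data D2 (same jitter inside)
    return d2 - d1  # real depth data D3: the jitter cancels

# In a fixed scene the corrected result is stable across cycles.
samples = [measure_once(1.5) for _ in range(1000)]
spread = max(samples) - min(samples)
```

Because the first TDC records the same jittered launch instant that is buried in the second TDC's reading, the per-cycle variation is common to both readings and vanishes in the difference.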
  • FIG. 3 is a flowchart of a time of flight method according to an embodiment of the invention.
  • the time of flight method of the present embodiment is applicable to the time of flight device 100 in the embodiment of FIG. 1.
  • the driving circuit 110 simultaneously provides the reference pulse signal RPL to the first time-to-digital converter 130 and the pulse signal PL to the sensing light source 120.
  • the first time-to-digital converter 130 determines the first depth data D1 based on the reference pulse signal RPL.
  • the sensing light source 120 emits the light pulse LP to the sensing object 200 based on the pulse signal PL.
  • In step S340, the sensing pixel 140 receives the reflected light pulse RLP reflected by the sensing object 200 and outputs the reflected pulse signal RP to the second time-to-digital converter 150.
  • In step S350, the second time-to-digital converter 150 determines the second depth data D2 based on the reflected pulse signal RP.
  • In step S360, the processing circuit 160 subtracts the first depth data D1 from the second depth data D2 to obtain the real depth data. Therefore, the time of flight method of the present embodiment allows the time of flight device 100 to accurately obtain depth information of the sensing object 200.
  • In summary, the time of flight device and time of flight method of the invention can simultaneously enable the first time-to-digital converter and the second time-to-digital converter before the time of flight sensing begins, and provide the reference data (or the correcting data) between the enabling and sensing times through the first time-to-digital converter to correct the depth data generated by the second time-to-digital converter. Therefore, after the part that may contain distorted information in a previous stage of the depth data is deducted from the depth data generated by the second time-to-digital converter, the real depth data without distortion or with low distortion may be obtained.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A time of flight device and a time of flight method are provided. The time of flight device includes a first time-to-digital converter, a second time-to-digital converter, a driving circuit, a sensing light source, a sensing pixel and a processing circuit. The driving circuit provides a pulse signal and a reference pulse signal simultaneously. The first time-to-digital converter determines first depth data based on the reference pulse signal. The sensing light source emits a light pulse to a sensing object based on the pulse signal. The sensing pixel receives a reflected light pulse reflected by the sensing object and outputs a reflected pulse signal to the second time-to-digital converter so that the second time-to-digital converter determines second depth data based on the reflected pulse signal. The processing circuit subtracts the first depth data from the second depth data to obtain real depth data.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the priority benefit of U.S. provisional application no. 62/864,516, filed on Jun. 21, 2019. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
  • TECHNICAL FIELD
  • The invention relates to a distance measurement technology, and more particularly, to a time of flight device and a time of flight method.
  • BACKGROUND
  • For a general time-of-flight (ToF) circuit, the ToF circuit includes a time-to-digital converter (TDC), and the TDC is used for a time-to-depth data conversion. However, a conversion characteristic curve of the TDC is usually nonlinear in a period of time after being enabled. Because a sensing period for time of flight is often short, a time length measurement can be easily affected by jitter. In other words, since the general ToF circuit is often affected by the nonlinear conversion characteristic curve and jitter, a distorted conversion result of time to depth data may be obtained. Therefore, several solutions are provided in the following embodiments.
  • SUMMARY
  • The invention provides a time of flight device and a time of flight method that can provide a reliable distance measurement effect.
  • The time of flight device of the invention includes a first time-to-digital converter, a second time-to-digital converter, a driving circuit, a sensing light source, a sensing pixel and a processing circuit. The driving circuit is configured to provide a pulse signal and a reference pulse signal simultaneously. The driving circuit is coupled to the first time-to-digital converter. The reference pulse signal is provided to the first time-to-digital converter so that the first time-to-digital converter determines first depth data based on the reference pulse signal. The sensing light source is coupled to the driving circuit, and configured to emit a light pulse to a sensing object based on the pulse signal. The sensing pixel is coupled to the second time-to-digital converter, and configured to receive a reflected light pulse reflected by the sensing object and output a reflected pulse signal to the second time-to-digital converter so that the second time-to-digital converter determines second depth data based on the reflected pulse signal. The processing circuit is coupled to the first time-to-digital converter and the second time-to-digital converter, and configured to subtract the first depth data from the second depth data to obtain real depth data.
  • The time of flight method of the invention includes the following steps: simultaneously providing a reference pulse signal to a first time-to-digital converter and a pulse signal to a sensing light source; determining first depth data based on the reference pulse signal by the first time-to-digital converter; emitting a light pulse to a sensing object based on the pulse signal by the sensing light source; receiving a reflected light pulse reflected by the sensing object and outputting a reflected pulse signal to a second time-to-digital converter by a sensing pixel; determining second depth data based on the reflected pulse signal by the second time-to-digital converter; and subtracting the first depth data from the second depth data to obtain real depth data by a processing circuit.
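Under the simplifying assumption that each time-to-digital converter simply reports elapsed time since it was enabled, the steps above reduce to two subtractions. The final conversion from round-trip time to distance via the speed of light is standard time-of-flight practice rather than something recited in the claims, and the timestamps below are illustrative:

```python
C = 299_792_458.0  # speed of light in m/s

def time_of_flight_method(t_enable, t_pulse, t_echo):
    """Sketch of the claimed steps with timestamps in seconds.
    t_enable: both TDCs enabled; t_pulse: pulse signal and reference
    pulse signal provided simultaneously; t_echo: reflected pulse
    signal reaches the second TDC."""
    d1 = t_pulse - t_enable   # first TDC determines first depth data D1
    d2 = t_echo - t_enable    # second TDC determines second depth data D2
    d3 = d2 - d1              # processing circuit obtains real depth data
    return d3 * C / 2.0       # one-way distance in metres

# A 10 ns round trip corresponds to roughly 1.5 m.
distance_m = time_of_flight_method(0.0, 5e-9, 15e-9)
```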
  • Based on the above, the time of flight device and the time of flight method can correct the depth data by the two time-to-digital converters to effectively obtain the real depth data.
  • To make the aforementioned more comprehensible, several embodiments accompanied with drawings are described in detail as follows.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a time of flight device according to an embodiment of the invention.
  • FIG. 2 is a signal timing diagram of various signals and a light pulse according to an embodiment of the invention.
  • FIG. 3 is a flowchart of a time of flight method according to an embodiment of the invention.
  • DETAILED DESCRIPTION
  • In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
  • In order to make the content of the invention more comprehensible, embodiments are described below as examples to show that the invention can actually be implemented. Moreover, elements/components/steps with the same reference numerals represent the same or similar parts in the drawings and embodiments.
  • FIG. 1 is a block diagram of a time of flight device according to an embodiment of the invention. Referring to FIG. 1, a time of flight device 100 includes a driving circuit 110, a sensing light source 120, a first time-to-digital converter 130, a sensing pixel 140, a second time-to-digital converter 150 and a processing circuit 160. The driving circuit 110 is coupled to the sensing light source 120 and the first time-to-digital converter 130. The second time-to-digital converter 150 is coupled to the sensing pixel 140. The processing circuit 160 is coupled to the first time-to-digital converter 130 and the second time-to-digital converter 150. In this embodiment, the driving circuit 110 may simultaneously provide a pulse signal PL to the sensing light source 120 and a reference pulse signal RPL to the first time-to-digital converter 130. The first time-to-digital converter 130 may determine first depth data D1 based on the reference pulse signal RPL and provide the first depth data D1 to the processing circuit 160. The sensing light source 120 emits a light pulse LP to a sensing object 200 based on the pulse signal PL. The sensing pixel 140 receives a reflected light pulse RLP reflected by the sensing object 200 and outputs a reflected pulse signal RP to the second time-to-digital converter 150. The second time-to-digital converter 150 determines second depth data D2 based on the reflected pulse signal RP and provides the second depth data D2 to the processing circuit 160. In this embodiment, the processing circuit 160 may subtract the first depth data D1 from the second depth data D2 to obtain real depth data.
  • In this embodiment, the sensing light source 120 may be, for example, a pulse light emitter or a laser diode, and the sensing light source 120 may be configured to emit the light pulse LP of infrared radiation (IR) to the sensing object 200. In this embodiment, the sensing pixel 140 may be, for example, a complementary metal-oxide-semiconductor image sensor (CMOS Image Sensor; CIS), and the sensing pixel 140 may receive or sense the reflected light pulse RLP of IR reflected by the sensing object 200. Further, it should be noted that the first depth data D1 of the present embodiment is reference data (or known as correcting data) instead of a real sensing result. Furthermore, the second depth data D2 of the present embodiment refers to a sensing result of a distance between the time of flight device 100 and the sensing object 200 or surface depth information of the sensing object 200.
  • In the present embodiment, the driving circuit 110 may further include a timing circuit. The timing circuit may be configured to provide a timing to the first time-to-digital converter 130 and the second time-to-digital converter 150 so that the first time-to-digital converter 130 and the second time-to-digital converter 150 may be simultaneously enabled according to the timing. Further, the first time-to-digital converter 130 and the second time-to-digital converter 150 may be enabled before the driving circuit 110 simultaneously provides the pulse signal PL and the reference pulse signal RPL.
  • In addition, the time of flight device 100 of the present embodiment may also include a pixel array which includes the sensing pixel 140 and a dark pixel, and the dark pixel is coupled to the first time-to-digital converter 130. The dark pixel refers to a pixel element located in the pixel array that is not used for sensing. In other words, the time of flight device 100 of this embodiment may directly receive the reference pulse signal RPL of the driving circuit 110 through the time-to-digital converter provided in the area of one or more dark pixels in the pixel array, where the one or more dark pixels are not used for sensing, so as to obtain the first depth data D1. In one embodiment, a plurality of pixels in the pixel array may be classified into a plurality of pixel groups, and each of the pixel groups includes the sensing pixel 140 and the dark pixel. For example, the pixel group may be four pixels adjacent to each other in a two-by-two arrangement, where three pixels may be used for distance measurement to obtain three pieces of depth data and the remaining pixel may be used to obtain the first depth data D1. The first depth data D1 may then be used to correct said three pieces of depth data. Alternatively, multiple pixels in one entire row or one entire column of the pixel array may all be used as the dark pixels described above, and configured to correct the depth data obtained by the sensing pixels in each corresponding row or column.
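The per-group correction described above can be sketched as follows. The two-by-two layout, the dark pixel's position within the group, and the raw count values are all illustrative assumptions, not part of the original disclosure.

```python
def correct_pixel_group(counts, dark_index=3):
    """Given the raw TDC counts of one two-by-two pixel group, subtract
    the dark pixel's reference count (first depth data D1) from the
    three sensing pixels' counts (second depth data D2)."""
    d1 = counts[dark_index]
    return [d2 - d1 for i, d2 in enumerate(counts) if i != dark_index]

# Three sensing pixels plus one dark pixel that counted only the
# reference span between enabling and the reference pulse.
corrected = correct_pixel_group([120, 130, 125, 20])
```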
  • FIG. 2 is a signal timing diagram of various signals and a light pulse according to an embodiment of the invention. Referring to FIG. 1 and FIG. 2, FIG. 2 is a signal timing diagram of the signals and the light pulse shown in FIG. 1. First, with reference to an enabling timing TDC1_EN of the first time-to-digital converter 130 and an enabling timing TDC2_EN of the second time-to-digital converter 150 shown in FIG. 2, the first time-to-digital converter 130 and the second time-to-digital converter 150 are simultaneously enabled at a first time T1 to start counting, and the counting result is a characteristic conversion curve TDC_C shown in FIG. 2. It should be noted that the characteristic conversion curve TDC_C shown in FIG. 2 has a non-linear curve change at the initial stage of enabling. Next, the driving circuit 110 may simultaneously provide the pulse signal PL and the reference pulse signal RPL at a second time T2. In this regard, the driving circuit 110 outputs the pulse signal PL to the sensing light source 120 so that the sensing light source 120 emits the light pulse LP to the sensing object 200 based on the pulse signal PL. Moreover, since the time difference between the pulse signal PL outputted by the driving circuit 110 and the light pulse LP emitted by the sensing light source 120 is extremely short, for convenience of explanation, they are regarded as being generated simultaneously in FIG. 2. Nonetheless, whether the pulse signal PL and the light pulse LP are generated simultaneously or not does not affect the operation of the invention. Meanwhile, the driving circuit 110 outputs the reference pulse signal RPL to the first time-to-digital converter 130 so that the first time-to-digital converter 130 determines the first depth data D1 based on the first time T1 and the second time T2 at which the reference pulse signal RPL is received. As shown in FIG. 2, the first depth data D1 corresponds to depth information of the characteristic conversion curve TDC_C between the first time T1 and the second time T2, which may be distorted.
  • Then, after a period of time, the light pulse LP reflects off a surface of the sensing object 200 so that the sensing pixel 140 senses or receives the reflected light pulse RLP at a third time T3. Accordingly, the sensing pixel 140 provides the reflected pulse signal RP to the second time-to-digital converter 150. In this regard, since the time difference between the reflected light pulse RLP received or sensed by the sensing pixel 140 and the reflected pulse signal RP outputted by the sensing pixel 140 is extremely short, they are regarded as being generated simultaneously in FIG. 2. Nonetheless, whether the reflected light pulse RLP and the reflected pulse signal RP are generated simultaneously or not does not affect the operation of the invention. The second time-to-digital converter 150 determines the second depth data D2 based on the first time T1 and the third time T3 at which the reflected pulse signal RP is received. The second time T2 is between the first time T1 and the third time T3. As shown in FIG. 2, the second depth data D2 corresponds to depth information of the characteristic conversion curve TDC_C between the first time T1 and the third time T3. Lastly, the processing circuit 160 of the present embodiment may receive the first depth data D1 and the second depth data D2 provided by the first time-to-digital converter 130 and the second time-to-digital converter 150, and then the processing circuit 160 may subtract the first depth data D1 from the second depth data D2 to obtain real depth data D3. In other words, after the processing circuit 160 of the present embodiment deducts the part of the second depth data D2 that may have distorted depth information, the depth information corresponding to the characteristic conversion curve TDC_C between the second time T2 and the third time T3 may be obtained. In this regard, the characteristic conversion curve TDC_C has a linear curve change between the second time T2 and the third time T3.
Therefore, the time of flight device 100 of the present embodiment can accurately obtain depth information of the sensing object 200.
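The relationship among the three times can be checked with a small numeric example; the nanosecond values below are hypothetical and chosen only to make the arithmetic visible.

```python
# Hypothetical times, in nanoseconds, for one measurement cycle.
T1 = 0.0    # both time-to-digital converters enabled; counting starts
T2 = 5.0    # pulse signal PL and reference pulse signal RPL issued
T3 = 15.0   # reflected pulse signal RP received by the sensing pixel

d1 = T2 - T1   # first depth data: the possibly distorted initial span
d2 = T3 - T1   # second depth data: the full counted span
d3 = d2 - d1   # real depth data after the subtraction

# Only the linear region of the curve between T2 and T3 remains.
assert d3 == T3 - T2
```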
  • It should be noted here that, since the enabling time (the first time T1) of the time-to-digital converters is not the same as the time (the second time T2) at which the driving circuit 110 outputs the pulse signal PL, the time length (T2−T1) between the enabling time (the first time T1) and the time for outputting the pulse signal PL (the second time T2) will be affected by jitter. In other words, without the reference pulse signal RPL and the architecture design described in this embodiment, the result of time of flight would be different each time due to jitter. Moreover, since the traditional time of flight architecture cannot obtain information regarding jitter, the impact of jitter cannot be deducted. In this regard, since the present embodiment uses the architectural design of the reference pulse signal RPL and the first time-to-digital converter 130, the time of flight device 100 of the present embodiment can obtain the information regarding jitter through the reference pulse signal RPL and the output result (the first depth data D1) of the first time-to-digital converter 130. Moreover, since the pulse signal PL and the reference pulse signal RPL are generated simultaneously, the time of flight device 100 of the present embodiment can obtain jitter information of the pulse signal PL, and can deduct the impact of jitter to obtain real depth information. That is to say, for the time of flight device 100 of the present embodiment, in a fixed scene, the result of time of flight will be the same each time without being affected by jitter.
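The jitter cancellation can be illustrated with a short simulation: the span T2 − T1 varies from shot to shot, yet the subtraction recovers the same time of flight every time. All numbers (nominal times, jitter range, trial count) are hypothetical.

```python
import random

random.seed(0)
TRUE_TOF_NS = 10.0   # fixed scene: constant distance, so constant T3 - T2
T1 = 0.0             # enabling time of both time-to-digital converters

results = []
for _ in range(5):
    jitter_ns = random.uniform(-0.5, 0.5)  # shot-to-shot variation of T2 - T1
    t2 = 5.0 + jitter_ns                   # pulse PL / reference RPL issued
    t3 = t2 + TRUE_TOF_NS                  # reflected pulse arrives
    d1 = t2 - T1                           # first depth data (carries jitter info)
    d2 = t3 - T1                           # second depth data (jittered)
    results.append(d2 - d1)                # real depth data

# Every trial yields the same time of flight despite the jitter.
assert all(abs(r - TRUE_TOF_NS) < 1e-9 for r in results)
```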
  • FIG. 3 is a flowchart of a time of flight method according to an embodiment of the invention. Referring to FIG. 1 and FIG. 3, the time of flight method of the present embodiment is applicable to the time of flight device 100 in the embodiment of FIG. 1. In step S310, the driving circuit 110 simultaneously provides the reference pulse signal RPL to the first time-to-digital converter 130 and the pulse signal PL to the sensing light source 120. In step S320, the first time-to-digital converter 130 determines the first depth data D1 based on the reference pulse signal RPL. In step S330, the sensing light source 120 emits the light pulse LP to the sensing object 200 based on the pulse signal PL. In step S340, the sensing pixel 140 receives the reflected light pulse RLP reflected by the sensing object 200 and outputs the reflected pulse signal RP to the second time-to-digital converter 150. In step S350, the second time-to-digital converter 150 determines the second depth data D2 based on the reflected pulse signal RP. In step S360, the processing circuit 160 subtracts the first depth data D1 from the second depth data D2 to obtain the real depth data. Therefore, the time of flight method of the present embodiment can allow the time of flight device 100 to accurately obtain depth information of the sensing object 200.
  • Nevertheless, enough teaching, suggestion, and implementation regarding other device features and technical details of the time of flight device 100 of this embodiment may be obtained from the foregoing embodiments of FIG. 1 and FIG. 2, and thus related descriptions thereof are not repeated hereinafter.
  • In summary, the time of flight device and time of flight method of the invention can simultaneously enable the first time-to-digital converter and the second time-to-digital converter before the time of flight sensing begins, and provide the reference data (or the correcting data) between the enabling time and the sensing time through the first time-to-digital converter to correct the depth data generated by the second time-to-digital converter. Therefore, after the part that may carry distorted information in the initial stage is deducted from the depth data generated by the second time-to-digital converter, the real depth data without distortion or with low distortion may be obtained.
  • Although the invention has been described with reference to the above embodiments, it will be apparent to one of ordinary skill in the art that modifications to the described embodiments may be made without departing from the spirit of the invention. Accordingly, the scope of the invention will be defined by the attached claims and not by the above detailed descriptions.

Claims (10)

1. A time of flight device, comprising:
a first time-to-digital converter;
a second time-to-digital converter;
a driving circuit, coupled to the first time-to-digital converter, and configured to simultaneously provide a pulse signal and a reference pulse signal, wherein the reference pulse signal is provided to the first time-to-digital converter so that the first time-to-digital converter determines first depth data based on the reference pulse signal;
a sensing light source, coupled to the driving circuit, and configured to emit a light pulse to a sensing object based on the pulse signal;
a sensing pixel, coupled to the second time-to-digital converter, and configured to receive a reflected light pulse reflected by the sensing object and output a reflected pulse signal to the second time-to-digital converter so that the second time-to-digital converter determines second depth data based on the reflected pulse signal; and
a processing circuit, coupled to the first time-to-digital converter and the second time-to-digital converter, and configured to subtract the first depth data from the second depth data to obtain real depth data.
2. The time of flight device of claim 1, wherein the first time-to-digital converter is enabled at a first time, and determines the first depth data based on the first time and a second time for receiving the reference pulse signal.
3. The time of flight device of claim 2, wherein the second time-to-digital converter is enabled at the first time, and determines the second depth data based on the first time and a third time for receiving the reflected pulse signal.
4. The time of flight device of claim 3, wherein the second time is between the first time and the third time.
5. The time of flight device of claim 1, wherein the driving circuit comprises a timing circuit, and the driving circuit is further coupled to the second time-to-digital converter to separately provide a timing to the first time-to-digital converter and the second time-to-digital converter so that the first time-to-digital converter and the second time-to-digital converter are simultaneously enabled.
6. The time of flight device of claim 1, further comprising a pixel array, wherein the pixel array comprises the sensing pixel and a dark pixel, and the dark pixel is coupled to the first time-to-digital converter.
7. The time of flight device of claim 6, wherein the pixel array comprises a plurality of pixel groups, and each of the pixel groups comprises the sensing pixel and the dark pixel.
8. A time of flight method, comprising:
simultaneously providing a reference pulse signal to a first time-to-digital converter and a pulse signal to a sensing light source;
determining first depth data based on the reference pulse signal by the first time-to-digital converter;
emitting a light pulse to a sensing object based on the pulse signal by a sensing light source;
receiving a reflected light pulse reflected by the sensing object and outputting a reflected pulse signal to the second time-to-digital converter by a sensing pixel;
determining second depth data based on the reflected pulse signal by the second time-to-digital converter; and
subtracting the first depth data from the second depth data to obtain real depth data by a processing circuit.
9. The time of flight method of claim 8, wherein the first time-to-digital converter is enabled at a first time, and the first time-to-digital converter determines the first depth data based on the first time and a second time for receiving the reference pulse signal.
10. The time of flight method of claim 9, wherein the second time-to-digital converter is enabled at the first time, and the second time-to-digital converter determines the second depth data based on the first time and a third time for receiving the reflected pulse signal.