CN112198525B - Compensation data determining method and device, compensation method and device and electronic equipment - Google Patents


Info

Publication number: CN112198525B (application CN202011059810.6A)
Authority: CN (China)
Prior art keywords: receiving module, compensation, module, compensation data, parameters
Legal status: Active
Other versions: CN112198525A (Chinese, zh)
Inventor: 张学勇
Current and original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011059810.6A
Publication of CN112198525A, application granted, publication of CN112198525B


Classifications

    All classifications fall under G PHYSICS > G01 MEASURING; TESTING > G01S (systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems):
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/08: Systems determining position data of a target, for measuring distance only
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S7/4802: Analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4861: Circuits for detection, sampling, integration or read-out (pulse systems; receivers)
    • G01S7/4913: Circuits for detection, sampling, integration or read-out (non-pulse systems; receivers)

Abstract

The disclosure belongs to the technical field of electronic equipment, and in particular relates to a compensation data determining method and device, a compensation method and device, a time-of-flight image sensor assembly, electronic equipment and a storage medium. The compensation data determining method comprises the following steps: driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state isolates the transmitting module from the receiving module so as to prevent detection light emitted by the transmitting module from entering the receiving module; acquiring compensation data of each pixel of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module; and establishing a compensation mapping relation according to the receiving module parameters and the corresponding compensation data, wherein the compensation mapping relation comprises the mapping between the receiving module parameters and the compensation data. In this way, the compensation data of the time-of-flight image sensor can be determined.

Description

Compensation data determining method and device, compensation method and device and electronic equipment
Technical Field
The disclosure relates to the technical field of electronic equipment, in particular to a compensation data determining method and device, a compensation method and device, a time-of-flight image sensor assembly, electronic equipment and a storage medium.
Background
A time-of-flight (TOF) image sensor includes an emitter and a receiver: the emitter emits detection light, which is reflected upon encountering an object, and the receiver receives the reflected detection light. The sensor determines the distance of the object from the time difference or phase difference between emission and reception of the detection light. On receiving the reflected detection light, the receiver converts the optical signal into an electrical signal through a photodiode and outputs it through an output circuit; this electrical signal is used to generate a depth image. Because the devices in the output circuit are non-ideal, the output electrical signal contains noise, and this noise affects the ranging accuracy.
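As background context only (not part of the disclosure), the phase-difference ranging principle mentioned above can be sketched as follows; the function name and the example modulation frequency are illustrative.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_diff_rad: float, mod_freq_hz: float) -> float:
    """Distance implied by the phase difference between emitted and
    received light: the light travels to the object and back, so
    d = c * delta_phi / (4 * pi * f_mod)."""
    return C * phase_diff_rad / (4.0 * math.pi * mod_freq_hz)

# A phase delay of pi at a 10 MHz modulation frequency corresponds to
# half the unambiguous range, about 7.49 m.
d = tof_distance(math.pi, 10e6)
```

Noise superimposed on the electrical signal shifts the measured phase, which is why compensating the receiving module's dark signal matters for ranging accuracy.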
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure aims to provide a compensation data determining method and device, a compensation method and device, a time-of-flight image sensor assembly, an electronic device and a storage medium, so that the ranging accuracy of the time-of-flight image sensor is improved at least to a certain extent.
According to a first aspect of the present disclosure, there is provided a method of determining compensation data for a time-of-flight image sensor, the method comprising:
driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state is used for isolating the transmitting module and the receiving module so as to prevent detection light emitted by the transmitting module from entering the receiving module;
acquiring compensation data of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
and establishing a compensation mapping relation according to the receiving module parameters and the corresponding compensation data, wherein the compensation mapping relation comprises the mapping between the receiving module parameters and the compensation data.
According to a second aspect of the present disclosure, there is provided a compensation method for a time-of-flight image sensor, the compensation method comprising:
Acquiring receiving module parameters, wherein the receiving module parameters comprise one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
determining compensation data according to the receiving module parameters and the compensation mapping relation, wherein the compensation mapping relation is determined by the compensation data determining method described above;
and compensating the depth signal received by the receiving module by using the compensation data.
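The second-aspect flow above (acquire parameters, look up compensation, compensate the depth signal) can be sketched as below; the dict-based table and the (temperature, exposure-time) key are illustrative assumptions, not the patent's data structure.

```python
# Hypothetical compensation mapping: receiving-module parameters
# (here a (temperature, exposure-time) pair) -> per-pixel compensation.
compensation_map = {
    (40, 2): [3, 3, 2, 4],  # noise values for 4 pixels at 40 C, 2 ms exposure
}

def compensate(depth_signal, params, mapping):
    """Subtract the compensation data looked up for the current
    receiving-module parameters from the raw depth signal."""
    comp = mapping[params]
    return [d - c for d, c in zip(depth_signal, comp)]

corrected = compensate([103, 98, 102, 104], (40, 2), compensation_map)
# corrected == [100, 95, 100, 100]
```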
According to a third aspect of the present disclosure, there is provided a determination apparatus of compensation data for a time-of-flight image sensor, the apparatus comprising:
the driving module is used for driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state is used for isolating the transmitting module and the receiving module so as to prevent detection light emitted by the transmitting module from entering the receiving module;
the first acquisition module is used for acquiring compensation data of each pixel of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
the building module is used for building a compensation mapping relation according to the receiving module parameters and the corresponding compensation data, wherein the compensation mapping relation comprises the mapping between the receiving module parameters and the compensation data.
According to a fourth aspect of the present disclosure, there is provided a compensation device for a time-of-flight image sensor, the compensation device comprising:
the second acquisition module is used for acquiring parameters of the receiving module, wherein the parameters of the receiving module comprise one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
the determining module is used for determining compensation data according to the receiving module parameters and the compensation mapping relation, wherein the compensation mapping relation is determined according to the compensation data determining method;
and the compensation module is used for compensating the depth signal received by the receiving module by using the compensation data.
According to a fifth aspect of the present disclosure, there is provided a time-of-flight image sensor assembly, the image sensor assembly comprising:
the emission module is used for emitting detection light;
the receiving module is used for receiving the detection light and generating a depth signal;
the detection module is connected with the receiving module and used for detecting the parameters of the receiving module, wherein the parameters of the receiving module comprise one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
The control module is connected with the receiving module and the detecting module, the control module stores a compensation mapping relation, and the control module is used for determining compensation data according to the parameters of the receiving module and the compensation mapping relation and compensating the depth signal by utilizing the compensation data.
According to a sixth aspect of the present disclosure, there is provided an electronic device comprising:
A processor; and
a memory having stored thereon computer readable instructions which, when executed by the processor, implement a method according to any of the above.
According to a seventh aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method according to any of the above.
According to the method for determining the compensation data, the transmitting module and the receiving module are driven to work in the isolation state, the compensation data of each pixel of the receiving module under different receiving module parameters are obtained, and the compensation mapping relation is established according to the receiving module parameters and the corresponding compensation data. In this way, during operation of the image sensor, the compensation data corresponding to different receiving module parameters can be obtained through the compensation mapping relation and used to compensate the depth signal, so that the ranging precision of the time-of-flight image sensor can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
FIG. 1 is a schematic diagram of a time-of-flight image sensor provided by an exemplary embodiment of the present disclosure;
fig. 2 is a schematic diagram of a receiving module output circuit according to an exemplary embodiment of the disclosure;
fig. 3 is a control timing diagram of a receiving module output circuit according to an exemplary embodiment of the present disclosure;
FIG. 4 is a flowchart of a method for determining compensation data provided by an exemplary embodiment of the present disclosure;
FIG. 5 is a flowchart of another compensation data determination method provided by an exemplary embodiment of the present disclosure;
FIG. 6 is a test block diagram of a time-of-flight image sensor provided in an exemplary embodiment of the present disclosure;
FIG. 7 is a flow chart of a compensation method provided by an exemplary embodiment of the present disclosure;
FIG. 8 is a block diagram of a compensation data determination apparatus provided by an exemplary embodiment of the present disclosure;
FIG. 9 is a block diagram of a compensation device provided by an exemplary embodiment of the present disclosure;
FIG. 10 is a block diagram of a time-of-flight image sensor assembly provided by an exemplary embodiment of the present disclosure;
FIG. 11 is a schematic diagram of an electronic device provided in an exemplary embodiment of the present disclosure;
Fig. 12 is a schematic diagram of a computer-readable storage medium provided in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments can be embodied in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the disclosed aspects may be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, etc. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
As shown in fig. 1, the time-of-flight image sensor includes a transmitting module 01 and a receiving module 02, where the transmitting module 01 and the receiving module 02 can be integrated on a base 03, and the receiving module 02 is disposed on one side of the transmitting module 01. The transmitting module 01 is used for transmitting detection light (infrared light), the detection light is reflected when encountering an obstacle, and the receiving module 02 receives the reflected light. The receiving module 02 may include a photoelectric conversion unit and an output circuit, the output circuit being connected to the photoelectric conversion unit, the photoelectric conversion unit being configured to convert an optical signal into an electrical signal, and the output circuit processing and outputting the electrical signal.
The photoelectric conversion unit may include a photodiode PD, and a photodiode array may be provided in the reception module. As illustrated in fig. 2, the output circuit may include a first switching unit K1, a second switching unit K2, a third switching unit K3, a fourth switching unit K4, a fifth switching unit K5, a sixth switching unit K6, a first capacitor C1, and a second capacitor C2. The first end of the first switch unit K1 is connected with the photodiode, the second end of the first switch unit K1 is connected with the first node, the first capacitor C1 is connected to the first node, the first end of the second switch unit K2 receives the reset signal Vrst, the second end of the second switch unit K2 is connected with the first node, the first end of the third switch unit K3 is connected with the first node, and the second end of the third switch unit K3 is connected with the output end OUT1. The first end of the fourth switch unit K4 is connected with the photodiode, the second end of the fourth switch unit K4 is connected with the second node, the second capacitor C2 is connected to the second node, the first end of the fifth switch unit K5 receives the reset signal Vrst, the second end of the fifth switch unit K5 is connected with the second node, the first end of the sixth switch unit K6 is connected with the second node, and the second end of the sixth switch unit K6 is connected with the output end OUT2.
As shown in fig. 3, in the control timing of the output circuit, the transmitting signal is EM, the receiving signal is RE, the control signal of the first switch unit K1 is SK1, and the control signal of the fourth switch unit K4 is SK2. The modulation and demodulation process of the output circuit may be as follows:
in the reset stage, the second switch unit K2 and the fifth switch unit K5 are turned on, the first switch unit K1, the third switch unit K3, the fourth switch unit K4 and the sixth switch unit K6 are turned off, and reset voltages are written to the first capacitor C1 and the second capacitor C2.
In the integration phase, the second, third, fifth and sixth switching units K2, K3, K5 and K6 are turned off. The first switch unit K1 and the fourth switch unit K4 are turned on according to the phase of the detection signal transmitted by the transmitting module. For example, the first switching unit K1 is turned on to charge the first capacitor C1 during the 0 to 180 degree phase of the transmitted signal, and the fourth switching unit K4 is turned on to charge the second capacitor C2 during the 180 to 360 degree phase. Because the reflected light received by the receiving module and the detection light emitted by the emitting module differ in phase, the electric quantities written to the first capacitor C1 and the second capacitor C2 are different. In practical applications, the transmitting module transmits pulse signals over multiple periods in the integration stage, that is, the first capacitor C1 and the second capacitor C2 are charged multiple times, and the ratio of the charges on the first capacitor C1 and the second capacitor C2 is consistent with the phase difference between the detection signal and the reflection signal. The charge variation of the first capacitor C1 and the second capacitor C2 may be as shown in fig. 3.
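For illustration only (an assumption consistent with the K1/K4 windows just described, valid for a 50% duty square wave and a phase delay within 0 to 180 degrees), the phase can be recovered from the ratio of the two capacitor charges:

```python
import math

def phase_from_charges(q1: float, q2: float) -> float:
    """Phase delay estimated from the two tap charges: with K1 open
    during 0-180 deg and K4 during 180-360 deg of the emitted signal,
    the reflected pulse splits its charge between C1 and C2 in
    proportion to the delay."""
    return math.pi * q2 / (q1 + q2)

# Equal charges on C1 and C2 correspond to a 90-degree phase delay.
phi = phase_from_charges(1.0, 1.0)
```

Any dark-current charge added to C1 or C2 skews this ratio, which is the error the compensation data is meant to remove.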
In the reading stage, the first switch unit K1, the second switch unit K2, the fourth switch unit K4 and the fifth switch unit K5 are turned off, the third switch unit K3 and the sixth switch unit K6 are turned on, and the electric quantities of the first capacitor C1 and the second capacitor C2 are output.
In the dead-zone stage, the transmitting module is turned off and does not transmit the detection light.
The embodiment of the disclosure first provides a method for determining compensation data for a time-of-flight image sensor, as shown in fig. 4, the method comprising:
step S410, driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state is to isolate the transmitting module and the receiving module so as to prevent the detection light emitted by the transmitting module from entering the receiving module;
step S420, obtaining compensation data of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor and modulation frequency environment brightness of the receiving module;
step S430, according to the receiving module parameters and the corresponding compensation data, a compensation mapping relation is established, wherein the compensation mapping light comprises the mapping relation of the receiving module parameters and the compensation data.
The compensation data of the receiving module corresponds to the pixels in the receiving module, and when the compensation data is obtained, the compensation data corresponding to each pixel is obtained. The compensation mapping relationship may include a plurality of sub-compensation mapping relationships, where each sub-compensation mapping relationship corresponds to a pixel.
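One plausible layout of the per-pixel sub-mappings described above (an illustrative assumption; the keys and values are made up):

```python
# Each pixel (row, col) has its own sub-compensation mapping from
# receiving-module parameters, here a (temperature, exposure-time)
# pair, to that pixel's compensation value.
compensation_map = {
    (0, 0): {(40, 2): 3, (60, 2): 5},
    (0, 1): {(40, 2): 2, (60, 2): 4},
}

def pixel_compensation(pixel, params):
    """Look up the compensation value in the pixel's sub-mapping."""
    return compensation_map[pixel][params]

value = pixel_compensation((0, 1), (60, 2))  # -> 4
```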
According to the method for determining the compensation data, the transmitting module and the receiving module are driven to work in the isolation state, the compensation data of each pixel of the receiving module under different receiving module parameters are obtained, and the compensation mapping relation is established according to the receiving module parameters and the corresponding compensation data. In this way, during operation of the image sensor, the compensation data corresponding to different receiving module parameters can be obtained through the compensation mapping relation and used to compensate the depth signal, so that the ranging precision of the time-of-flight image sensor can be improved.
Further, as shown in fig. 5, the method for determining compensation data provided in the embodiment of the disclosure may further include:
step S510 isolates the transmitting module from the receiving module to block the probe light emitted by the transmitting module from entering the receiving module. Through isolating emission module and receiving module, can avoid the light that emission module transmitted to get into receiving module, and then lead to the fact the influence to the determination of compensation data.
The following describes in detail each step of the compensation data determining method provided in the embodiment of the present disclosure:
in step S510, the transmitting module and the receiving module may be isolated to block the detection light emitted by the transmitting module from entering the receiving module.
In the time-of-flight image sensor, the receiving module is arranged on one side of the transmitting module, and the light-emitting side of the transmitting module and the light-entering side of the receiving module are on the same side. When determining the compensation data, the receiving module must not receive the detection light of the transmitting module (including detection light entering the receiving module directly from the transmitting module and detection light reflected by an obstacle), so the transmitting module and the receiving module need to be isolated.
Isolation of the transmitting module and the receiving module can be realized in the following manner: a black light absorption layer absorbs the detection light emitted by the transmitting module, and the layer covers at least the emitting end of the transmitting module.
The light emitted by the transmitting module can be absorbed by the black light absorption layer, whose light absorption rate is more than 99.5%.
As shown in fig. 6, an emission module provided in this embodiment of the present disclosure may include a vertical-cavity surface-emitting laser array 11, a collimating lens 12 and a light homogenizing sheet 13. The laser array 11 is configured to receive a power signal and emit light; the collimating lens 12 is disposed on the light-emitting side of the laser array 11 and is configured to collimate and shape the light it emits; the light homogenizing sheet 13 is disposed on the side of the collimating lens 12 away from the laser array 11 and is configured to homogenize the light. The receiving module comprises a lens 22 and a sensor 21, wherein the lens 22 is arranged on the light-entering side of the sensor 21. Both the transmitting module and the receiving module may be disposed on the circuit board 31.
The black light absorption layer 40 may be disposed at the emitting end of the emitting module, that is, the black light absorption layer covers the light-emitting hole of the emitting module. A black light absorption layer may also be arranged between the transmitting module and the receiving module, so as to prevent the detection light of the transmitting module from entering the receiving module directly. When the compensation data need to be determined with the ambient light isolated as well, the black light absorption layer can also cover the light-entering end of the receiving module.
In step S410, the transmitting module and the receiving module may be driven to operate in the isolated state, where the transmitting module and the receiving module are isolated so that the detection light emitted by the transmitting module is blocked from entering the receiving module.
In the isolated state, a power signal is provided to the transmitting module, and the transmitting module emits detection light in response to the power signal. Because the detection light emitted by the transmitting module is absorbed by the black light absorption layer in the isolated state, the signals generated by the receiving module are noise signals.
In step S420, compensation data of the receiving module under different receiving module parameters may be obtained, where the receiving module parameters include one or more of the temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module.
The compensation data of each pixel of the receiving module under different receiving module parameters is obtained in the following way: acquiring image data output by the receiving module under different receiving module parameters, and determining the compensation data corresponding to each pixel of the receiving module according to the image data.
The compensation data of the receiving module is used for compensating the noise signal of the receiving module. Therefore, when the compensation data is acquired, the noise signal of the receiving module is acquired first; the noise signal is the signal output by the receiving module when it does not receive the detection light of the transmitting module (the isolated state). The signal received by the receiving module is reflected in the image output by the image sensor, so the noise signal can be determined from the image data and the compensation signal can be obtained from the noise signal. The compensation signal sets the black level of the receiving module to the ideal state, i.e. the output is 0 in the isolated state.
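A minimal sketch of how such per-pixel noise could be estimated (the frame-averaging step is an assumption; the disclosure only requires that the compensated isolated-state output be 0):

```python
def estimate_compensation(frames):
    """Average several isolated-state readouts per pixel; the result is
    the noise level to subtract so the compensated output becomes 0."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Three noise frames for a 2-pixel module.
comp = estimate_compensation([[4, 2], [6, 2], [5, 2]])
# comp == [5.0, 2.0]; subtracting it zeroes the average noise output.
```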
Since the physical devices in the receiving module may not be ideal, the photosensitive unit may generate charges due to impurities, heat and other causes even when no light irradiates it, and these charges produce a dark current. Ideally, the black level is the signal level at which the image data is defined as 0. Owing to the influence of dark current and the like, the actual raw data output by the receiving module when it does not receive the detection light emitted by the emitting module is not black-balanced (the data is not 0). Accordingly, to reduce the effect of dark current on the depth signal, embodiments of the present disclosure need to determine the output signal caused by dark current (i.e. the noise signal).
In practical applications, the depth signals acquired by the TOF image sensor are converted into original RAW-format data by a conversion circuit. Taking 8-bit data as an example, the effective value of a single photosensitive unit is 0-255, but an actual analog-to-digital conversion chip may be unable to convert some of the smaller voltage values, so a fixed offset is usually added before the depth signal is input into the analog-to-digital conversion chip. The offset is a known value and can therefore be compensated for in later processing to counteract its effect on the depth image.
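The fixed-offset step above can be sketched as follows. This is an illustrative example, not from the patent text: the offset value (16 DN) and the function name are assumptions, and the clamping to the 8-bit range is one reasonable design choice.

```python
# Sketch (hypothetical values): removing a known, fixed ADC offset from RAW data.
ADC_OFFSET = 16      # known fixed offset added before analog-to-digital conversion
ADC_MAX = 255        # effective range of a single photosensitive unit for 8-bit data

def remove_fixed_offset(raw_value: int) -> int:
    """Subtract the known offset from a RAW sample, clamping to the valid range."""
    return max(0, min(ADC_MAX, raw_value - ADC_OFFSET))

print(remove_fixed_offset(100))  # 84
print(remove_fixed_offset(5))    # 0 (clamped)
```

Because the offset is fixed and known, this subtraction can be applied at any later stage of the pipeline without measurement.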
The receiving module also generates a dark current when it does not receive the probe light, which causes the receiving module to output a signal even without the probe light. When the receiving module receives the probe light, the signal it generates in response to the probe light is superimposed on the dark-current signal, so the depth signal contains an error. The dark current generated by the receiving circuit is related to the brightness of the ambient light, the temperature of the receiving module, the exposure time of the receiving module, the amplification gain factor of the receiving module, and the modulation frequency of the receiving module; therefore, the receiving module parameters may include one or more of these. Of course, in practical applications the receiving module parameters may also include other parameters, which are not limited in this disclosure.
The amplification gain factor of the receiving module may be the amplification factor of an amplifying circuit in the output circuit of the receiving module. The exposure time of the receiving module may be the light-emitting time of the transmitting module, or the on-time of the first switching unit and the fourth switching unit in fig. 2. The modulation frequency of the receiving module may be the pulse frequency of the probe light emitted by the transmitting module.
In the embodiments of the present disclosure, one or more of the brightness of the ambient light, the temperature of the receiving module, the exposure time of the receiving module, the amplification gain factor of the receiving module, and the modulation frequency of the receiving module are used as variables; these variables are set to different values, and the depth image data output by the receiving module is obtained under each combination of values.
For example, in the embodiments of the present disclosure, if the brightness of the ambient light has only a small influence on the dark current, the brightness of the ambient light may be set to zero, for example by disposing a black light-absorbing layer at the light inlet end of the receiving module. In that case, the receiving module parameters may include one or more of the temperature of the receiving module, the exposure time of the receiving module, the amplification gain factor of the receiving module, and the modulation frequency of the receiving module.
When the receiving module parameters include the temperature of the receiving module, the temperature of the receiving module can be controlled, the image data output by the receiving module obtained at different temperatures, and the noise signals at those temperatures determined from the image data. For example, the noise signal may be detected once per degree Celsius; alternatively, the temperature may be segmented and the noise signal detected once in each temperature segment, which is not limited in the embodiments of the present disclosure.
When the receiving module parameters include the exposure time of the receiving module, the exposure time of the receiving module can be controlled, the image data output by the receiving module obtained under different exposure times, and the noise signals under those exposure times determined from the image data. The exposure time may be segmented and the noise signal detected once per exposure time segment, for example every 0.1 seconds, 1 second, 2 seconds, or the like, which is not limited in the embodiments of the present disclosure.
When the receiving module parameters include the amplification gain factor of the receiving module, the amplification gain factor of the receiving module can be controlled, the image data output by the receiving module obtained under different amplification gain factors, and the noise signals under those amplification gain factors determined from the image data. For example, the noise signal may be detected at amplification gain factors of 2, 3, 4, or 5, which is not limited in the embodiments of the present disclosure.
When the receiving module parameters include the modulation frequency of the receiving module, the modulation frequency of the receiving module can be controlled, the image data output by the receiving module obtained under different modulation frequencies, and the noise signals under those modulation frequencies determined from the image data. The modulation frequency may be segmented and the noise signal detected once in each modulation frequency segment, for example every 50 Hz, 100 Hz, 500 Hz, or 1000 Hz, which is not limited in the embodiments of the present disclosure.
In practical applications, the receiving module parameters may also be a combination of multiple parameters: any two, three, four, or all five of the brightness of the ambient light, the temperature of the receiving module, the exposure time of the receiving module, the amplification gain factor of the receiving module, and the modulation frequency of the receiving module. For example, the receiving module parameters may be a combination of the receiving module temperature and the exposure time; a combination of the receiving module temperature, the exposure time, and the amplification gain factor; or a combination of the receiving module temperature, the exposure time, the amplification gain factor, and the modulation frequency.
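The parameter sweeps described above can be sketched as a loop over parameter combinations. This is a hypothetical illustration: `capture_dark_frame` stands in for reading a RAW frame from the isolated receiving module, and the parameter values and fabricated frame contents are assumptions.

```python
# Hypothetical sketch: collect per-pixel noise data over combinations of
# receiving-module parameters (here, temperature and exposure time).
import itertools

def capture_dark_frame(temperature, exposure_time):
    # Placeholder for reading RAW image data from the isolated receiving
    # module; a 2x2 frame is fabricated here purely for illustration.
    base = 0.05 * temperature + 2.0 * exposure_time
    return [[base, base + 1], [base + 2, base + 3]]

temperatures = [20, 30, 40]          # degrees Celsius
exposure_times = [0.1, 1.0, 2.0]     # seconds

noise_table = {}
for temp, exp in itertools.product(temperatures, exposure_times):
    noise_table[(temp, exp)] = capture_dark_frame(temp, exp)

print(len(noise_table))  # 9 parameter combinations recorded
```

Each entry maps one combination of receiving module parameters to a per-pixel noise frame, which is the raw material for the compensation mapping relationship of step S430.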
When determining the compensation data corresponding to each pixel of the receiving module according to the image data, the depth signal corresponding to each gray level can be calculated from the gray levels of the image; after the depth signal (i.e., the noise signal) is obtained, the compensation signal is determined from the noise signal. The compensation signal is used to cancel the noise signal, that is, the compensation signal may be the opposite of the noise signal so that their superposition cancels it.
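The cancellation relationship can be sketched directly: the compensation signal of each pixel is the negative of that pixel's measured noise signal. The frame values below are hypothetical.

```python
# Sketch: per-pixel compensation signal as the negative of the noise signal,
# so that superposing the two cancels the noise.
def compensation_from_noise(noise_frame):
    return [[-v for v in row] for row in noise_frame]

noise = [[3.0, 4.0], [5.0, 6.0]]           # hypothetical noise frame
comp = compensation_from_noise(noise)
# Superposing noise and compensation yields zero for every pixel.
cancelled = [[n + c for n, c in zip(nr, cr)] for nr, cr in zip(noise, comp)]
print(cancelled)  # [[0.0, 0.0], [0.0, 0.0]]
```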
In step S430, a compensation mapping relationship may be established according to the receiving module parameters and the corresponding compensation data, where the compensation mapping relationship includes the mapping between the receiving module parameters and the compensation data.
The compensation mapping relationship may be established according to the compensation data obtained in step S420 and corresponding to the receiving module parameters. The compensation mapping relationship may be a table or a function, and establishing the compensation mapping relationship may be establishing the table or the function.
For example, when the receiving module parameters include the brightness of the ambient light, the temperature of the receiving module, the exposure time of the receiving module, the amplification gain factor of the receiving module, and the modulation frequency of the receiving module, the compensation mapping relationship may be a function of the compensation data with respect to all five parameters. Of course, in practical applications, when the receiving module parameters include only one or more of these, the compensation mapping relationship may be a function of the compensation data with respect to the corresponding parameters.
When the compensation mapping relationship is a function, it can be established by fitting the receiving module parameters to the corresponding compensation data. For example, the fitting may be performed by interpolation, and the compensation mapping relationship may be a piecewise function, a quadratic fit function, a cubic fit function, or the like.
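A minimal sketch of such a fitted mapping, assuming piecewise-linear interpolation over a single parameter (temperature) and hypothetical sample values:

```python
# Sketch: piecewise-linear compensation mapping built by interpolation between
# measured (temperature, compensation) pairs. Sample values are hypothetical.
from bisect import bisect_right

samples = [(20.0, -2.0), (30.0, -3.5), (40.0, -6.0)]  # (temperature, compensation)

def compensation_at(temp: float) -> float:
    xs = [t for t, _ in samples]
    ys = [c for _, c in samples]
    if temp <= xs[0]:
        return ys[0]          # clamp below the measured range
    if temp >= xs[-1]:
        return ys[-1]         # clamp above the measured range
    i = bisect_right(xs, temp) - 1
    frac = (temp - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + frac * (ys[i + 1] - ys[i])

print(compensation_at(25.0))  # -2.75, midway between -2.0 and -3.5
```

A quadratic or cubic fit, as the text mentions, would replace the interpolation with a least-squares polynomial over the same sample points.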
According to the method for determining the compensation data, the transmitting module and the receiving module are driven to work in the isolation state, the compensation data of each pixel of the receiving module under different receiving module parameters are obtained, and the compensation mapping relation is established according to the receiving module parameters and the corresponding compensation data, so that in the working process of the image sensor, the compensation data corresponding to different receiving module parameters can be obtained through the compensation mapping relation, the depth signal is compensated, and the ranging precision of the time-of-flight image sensor can be improved.
The exemplary embodiments of the present disclosure also provide a compensation method for a time-of-flight image sensor, as shown in fig. 7, which may include the steps of:
step S710, obtaining receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain, modulation frequency and ambient light brightness of the receiving module;
Step S720, determining compensation data according to the receiving module parameters and the compensation mapping relation, wherein the compensation mapping relation is determined according to the compensation data determining method;
in step S730, the compensation data is used to compensate the depth signal received by the receiving module.
According to the compensation method provided by the embodiment of the disclosure, the receiving module parameters are obtained, the compensation data corresponding to different receiving module parameters are obtained through the compensation mapping relation, and the depth signals are compensated, so that the signal-to-noise ratio of the time-of-flight image sensor can be improved, and further the ranging accuracy is improved.
The steps of the compensation method provided in the embodiments of the present disclosure will be described in detail below:
in step S710, receiving module parameters may be obtained, where the receiving module parameters include one or more of a temperature, an exposure time, an amplification factor, a modulation frequency, and an ambient light level of the receiving module.
The temperature of the receiving module may be detected by a temperature sensor disposed on the receiving module, and the brightness of the ambient light by a light sensor disposed on the receiving module. The exposure time, amplification gain factor, and modulation frequency of the receiving module may be obtained from the driving signal of the receiving module.
In step S720, the compensation data may be determined according to the receiving module parameters and the compensation mapping relationship, where the compensation mapping relationship is determined according to the compensation data determining method described above.
The compensation mapping relationship may be stored in a storage device of the electronic device, and the compensation mapping relationship is obtained from the storage device when compensation is performed. And determining compensation data by using the compensation mapping relation and the detected receiving module parameters.
For example, when the compensation mapping relationship is a table, the detected receiving module parameters may be used to look up the compensation data in the table. When the compensation mapping relationship is a function, the detected receiving module parameters can be substituted into the function to obtain the compensation data.
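The two forms of lookup can be sketched side by side. All names, keys, and coefficients below are illustrative assumptions, not values from the patent.

```python
# Sketch: compensation mapping stored as a table keyed by discrete parameter
# values, versus a fitted function evaluated at the detected parameters.
compensation_table = {(30, 1.0): -3.5, (40, 1.0): -6.0}  # (temp, exposure) -> data

def compensation_function(temp, exposure):
    # Hypothetical fitted function of temperature and exposure time.
    return -0.1 * temp - 0.5 * exposure

# Table lookup with detected receiving module parameters:
print(compensation_table[(30, 1.0)])   # -3.5
# Function evaluation with the same parameters:
print(compensation_function(30, 1.0))  # -3.5
```

A table is exact at the measured points but needs one entry per combination; a fitted function generalizes between measurements at the cost of fitting error.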
In step S730, the depth signal received by the receiving module may be compensated by using the compensation data.
The compensation of the depth signal received by the receiving module using the compensation data can be realized by determining the compensation signal of each pixel of the receiving module according to the compensation data, and superposing the compensation signal of each pixel with the depth signal acquired by the corresponding pixel.
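The per-pixel superposition of step S730 can be sketched as follows, with hypothetical frame values:

```python
# Sketch: compensate a depth frame by superposing each pixel's compensation
# signal with the depth signal acquired by the corresponding pixel.
def compensate_frame(depth_frame, compensation_frame):
    return [
        [d + c for d, c in zip(d_row, c_row)]
        for d_row, c_row in zip(depth_frame, compensation_frame)
    ]

depth = [[103.0, 104.0], [105.0, 106.0]]  # measured depth signals (with noise)
comp = [[-3.0, -4.0], [-5.0, -6.0]]       # per-pixel compensation signals
print(compensate_frame(depth, comp))      # [[100.0, 100.0], [100.0, 100.0]]
```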
It should be noted that although the steps of the methods of the present disclosure are illustrated in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order or that all of the illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
The exemplary embodiments of the present disclosure further include a determining apparatus of compensation data for a time-of-flight image sensor, as shown in fig. 8, the determining apparatus 800 of compensation data includes:
the driving module 810 is configured to drive the transmitting module and the receiving module to operate in an isolated state, where the isolating state isolates the transmitting module and the receiving module to block the probe light emitted by the transmitting module from entering the receiving module;
a first obtaining module 820, configured to obtain compensation data of each pixel of the receiving module under different receiving module parameters, where the receiving module parameters include one or more of the temperature, exposure time, amplification gain factor, modulation frequency, and ambient light brightness of the receiving module;
the establishing module 830 is configured to establish a compensation mapping relationship according to the receiving module parameters and the corresponding compensation data, where the compensation mapping relationship includes the mapping between the receiving module parameters and the compensation data.
According to the device for determining the compensation data, provided by the embodiment of the disclosure, the transmitting module and the receiving module are driven to work in the isolation state, the compensation data of each pixel of the receiving module under different receiving module parameters are obtained, and the compensation mapping relation is established according to the receiving module parameters and the corresponding compensation data, so that the compensation data corresponding to the different receiving module parameters can be obtained through the compensation mapping relation in the working process of the image sensor, the depth signal is compensated, and the ranging precision of the time-of-flight image sensor can be improved.
Further, the compensation data determining apparatus provided by the embodiment of the present disclosure may further include: the isolation module is used for isolating the transmitting module and the receiving module so as to prevent the detection light emitted by the transmitting module from entering the receiving module.
In a possible embodiment of the present disclosure, the first obtaining module may include:
the acquisition unit is used for acquiring the image data output by the receiving module under different receiving module parameters;
the first determining unit is used for determining compensation data corresponding to each pixel of the receiving module according to the image data.
In a possible embodiment of the present disclosure, the establishing module may include:
and the fitting unit is used for establishing a compensation mapping relation through fitting according to the receiving module parameters and the corresponding compensation data.
In a possible embodiment of the present disclosure, the isolation module may include:
and the absorption unit is used for absorbing the detection light emitted by the emission module by using the black light absorption layer, and the black light absorption layer at least covers the emission end of the emission module.
The exemplary embodiments of the present disclosure also provide a compensation apparatus for a time-of-flight image sensor, as shown in fig. 9, the compensation apparatus 900 includes:
a second obtaining module 910, configured to obtain receiving module parameters, where the receiving module parameters include one or more of the temperature, exposure time, amplification gain factor, modulation frequency, and ambient light brightness of the receiving module;
The determining module 920 is configured to determine compensation data according to the receiving module parameters and a compensation mapping relationship, where the compensation mapping relationship is determined according to the compensation data determining method;
the compensation module 930 is configured to compensate the depth signal received by the receiving module using the compensation data.
According to the compensation device provided by the embodiment of the disclosure, the receiving module parameters are obtained, the compensation data corresponding to different receiving module parameters are obtained through the compensation mapping relation, the depth signals are compensated, and the ranging precision of the time-of-flight image sensor can be improved.
In a possible embodiment of the present disclosure, the compensation module may include:
the second determining unit is used for determining a compensation signal of each pixel of the receiving module according to the compensation data;
and the superposition unit is used for superposing the compensation signal of each pixel and the depth signal acquired by the corresponding pixel.
The specific details of the modules of the above compensation data determining apparatus have been described in the corresponding compensation data determining method, and the specific details of the modules of the above compensation apparatus have been described in the corresponding compensation method; they will therefore not be repeated here.
It should be noted that although several modules or units of the compensation data determining means and the compensating means are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
The exemplary embodiments of the present disclosure also provide a time-of-flight image sensor assembly, as shown in fig. 10, including: a transmitting module 101, a receiving module 102, a detection module 103, and a control module 104, where the transmitting module 101 is used to emit probe light; the receiving module 102 is used to receive the probe light and generate a depth signal; the detection module 103 is connected to the receiving module 102 and is used to detect receiving module parameters, where the receiving module parameters include one or more of the temperature, exposure time, amplification gain factor, modulation frequency, and ambient light brightness of the receiving module; the control module 104 is connected to the receiving module 102 and the detection module 103, stores a compensation mapping relationship, and is used to determine compensation data according to the receiving module parameters and the compensation mapping relationship and to compensate the depth signal using the compensation data.
The detection module 103 may include a temperature sensor, a light sensor, a counter, a current sensor, and a voltage sensor, where the temperature sensor is used to detect the temperature of the receiving module, the light sensor is used to detect the brightness of the ambient light, the current sensor is used to detect the output current, and the voltage sensor is used to detect the output voltage. The exposure time, amplification gain factor, and modulation frequency of the receiving module can be determined from the driving signal of the receiving module, so the control module may also serve as part of the detection module. The control module 104 may be a microprocessor or a processor.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," a "module," or a "system."
An electronic device 1100 according to such an embodiment of the invention is described below with reference to fig. 11. The electronic device 1100 shown in fig. 11 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 11, the electronic device 1100 is embodied in the form of a general purpose computing device. For example, the electronic device may be a cell phone, tablet computer, electronic reader, or wearable device, etc. Components of electronic device 1100 may include, but are not limited to: the at least one processing unit 1110, the at least one memory unit 1120, a bus 1130 connecting the different system components (including the memory unit 1120 and the processing unit 1110), and a display unit 1140.
Wherein the storage unit stores program code that is executable by the processing unit 1110 such that the processing unit 1110 performs steps according to various exemplary embodiments of the present invention described in the above-described "exemplary methods" section of the present specification.
The storage unit 1120 may include a readable medium in the form of a volatile storage unit, such as a Random Access Memory (RAM) 11201 and/or a cache memory 11202, and may further include a Read Only Memory (ROM) 11203.
The storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The bus 1130 may be a local bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1170 (e.g., keyboard, pointing device, Bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 1100, and/or any device (e.g., router, modem, etc.) that enables the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1150. Also, the electronic device 1100 can communicate with one or more networks such as a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet, through the network adapter 1160. As shown, the network adapter 1160 communicates with other modules of the electronic device 1100 via the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1100, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this specification, when said program product is run on the terminal device.
Referring to fig. 12, a program product 1200 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. A method of determining compensation data for a time-of-flight image sensor, the method comprising:
Driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state is used for isolating the transmitting module and the receiving module so as to prevent detection light emitted by the transmitting module from entering the receiving module, and the detection light comprises detection light which directly enters the receiving module by the transmitting module and detection light reflected by an obstacle;
acquiring compensation data of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
and establishing a compensation mapping relation according to the receiving module parameters and the corresponding compensation data, wherein the compensation mapping relation comprises the mapping relation of the receiving module parameters and the compensation data.
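The three steps of claim 1 (drive the modules in isolation, sweep the receiving module parameters, record the mapping) can be sketched as a small calibration loop. This is only an illustrative sketch: the names (`build_compensation_map`, `capture_compensation`) and the dict-based parameter representation are assumptions, not anything the patent specifies.

```python
# Hedged sketch of the calibration flow in claim 1.  The names and the
# dict-based parameter representation are hypothetical; the patent only
# specifies the sequence: isolate, sweep parameters, record the mapping.

def build_compensation_map(parameter_settings, capture_compensation):
    """Build the compensation mapping relation of claim 1.

    parameter_settings: iterable of dicts such as
        {"temperature": 25.0, "exposure_us": 500, "gain": 2.0,
         "modulation_hz": 8.0e7, "ambient": 0.0}
    capture_compensation: callable that drives the isolated transmitting
        and receiving modules at the given settings and returns the
        measured compensation data for that setting.
    """
    mapping = {}
    for params in parameter_settings:
        # Use a sorted item tuple as a hashable key for the setting.
        key = tuple(sorted(params.items()))
        mapping[key] = capture_compensation(params)
    return mapping
```

A real implementation would drive the hardware inside `capture_compensation`; here it is an injected callable so that only the control flow of the claim is shown.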
2. The method for determining compensation data according to claim 1, wherein acquiring the compensation data of the receiving module under different receiving module parameters comprises:
acquiring image data output by the receiving module under different receiving module parameters;
and determining compensation data of the receiving module according to the image data.
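One plausible reading of claim 2: with the modules isolated, every pixel should ideally report zero, so the residual image data, averaged over several frames, yields the per-pixel compensation data. The averaging step is an assumption for illustration; the patent only says the compensation data is determined according to the image data.

```python
import numpy as np

def compensation_from_frames(frames):
    """Derive per-pixel compensation data from image data captured in the
    isolation state.  With the modules isolated each pixel should ideally
    read zero, so the mean residual per pixel is taken as its compensation
    data.  The averaging is an illustrative assumption, not a step the
    patent spells out."""
    stack = np.stack(frames)      # shape: (n_frames, height, width)
    return stack.mean(axis=0)     # per-pixel average offset
```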
3. The method for determining compensation data according to claim 1, wherein establishing a compensation mapping relation according to the receiving module parameters and the corresponding compensation data comprises:
establishing the compensation mapping relation by fitting, according to the receiving module parameters and the corresponding compensation data.
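Claim 3 states that the mapping is established through fitting without naming a model. A minimal sketch, assuming a single receiving module parameter (temperature) and a linear model; the sample values are invented for illustration:

```python
import numpy as np

# Illustrative fit of compensation data against one receiving module
# parameter (temperature).  The sample values and the linear model are
# assumptions; claim 3 only says the mapping is established through
# fitting without naming a model.
temps = np.array([10.0, 20.0, 30.0, 40.0])   # degrees Celsius
comp = np.array([0.5, 1.0, 1.5, 2.0])        # invented compensation values

coeffs = np.polyfit(temps, comp, deg=1)      # least-squares linear fit
predict = np.poly1d(coeffs)                  # compensation at any temperature
```

In practice the fit would be multivariate over all recorded parameters, and a lookup with interpolation is an equally valid way to realise the mapping relation.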
4. The method of determining compensation data of claim 1, wherein the method further comprises:
isolating the transmitting module and the receiving module to block the detection light emitted by the transmitting module from entering the receiving module.
5. The method of determining compensation data of claim 4, wherein isolating the transmitting module from the receiving module comprises:
covering at least an emission end of the transmitting module with a black light-absorbing layer, wherein the black light-absorbing layer is used for absorbing the detection light emitted by the transmitting module.
6. The method of claim 1, wherein, among the receiving module parameters, the ambient light brightness is zero.
7. A compensation method for a time-of-flight image sensor, the compensation method comprising:
acquiring receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of a receiving module;
determining compensation data according to the receiving module parameters and a compensation mapping relation, wherein the compensation mapping relation is determined by the method for determining compensation data according to any one of claims 1-6;
and compensating the depth signal received by the receiving module by using the compensation data.
8. The compensation method of claim 7, wherein compensating the depth signal received by the receiving module with the compensation data comprises:
determining a compensation signal of each pixel in the receiving module according to the compensation data;
and superposing the compensation signal of each pixel and the depth signal acquired by the corresponding pixel.
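The superposition in claim 8 can be read as an elementwise combination of each pixel's compensation signal with the depth signal that pixel acquired. A sketch assuming simple addition as the superposition; the patent does not fix a sign convention:

```python
import numpy as np

def compensate_depth(depth, compensation):
    """Superpose each pixel's compensation signal on the depth signal it
    acquired (claim 8).  Elementwise addition is an assumed reading of
    the superposition step; the patent does not fix a sign convention."""
    return depth + compensation
```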
9. A compensation data determining apparatus for a time-of-flight image sensor, the apparatus comprising:
the driving module is used for driving the transmitting module and the receiving module to work in an isolation state, wherein the isolation state is used for isolating the transmitting module and the receiving module so as to prevent detection light emitted by the transmitting module from entering the receiving module;
the first acquisition module is used for acquiring compensation data of the receiving module under different receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and environment light brightness of the receiving module;
the building module is used for building a compensation mapping relation according to the receiving module parameters and the corresponding compensation data, wherein the compensation mapping relation comprises the mapping relation of the receiving module parameters and the compensation data.
10. A compensation device for a time-of-flight image sensor, the compensation device comprising:
the second acquisition module is used for acquiring receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module;
the determining module is used for determining compensation data according to the receiving module parameters and a compensation mapping relation, wherein the compensation mapping relation is determined by the method for determining compensation data according to any one of claims 1-8;
and the compensation module is used for compensating the depth signal received by the receiving module by using the compensation data.
11. A time-of-flight image sensor assembly, the image sensor assembly comprising:
the emission module is used for emitting detection light;
the receiving module is used for receiving the detection light and generating a depth signal;
the detection module is connected with the receiving module and is used for detecting receiving module parameters, wherein the receiving module parameters comprise one or more of temperature, exposure time, amplification gain factor, modulation frequency and ambient light brightness of the receiving module; and
The control module is connected with the receiving module and the detecting module, the control module stores a compensation mapping relation, and the control module is used for determining compensation data according to the parameters of the receiving module and the compensation mapping relation and compensating the depth signal by utilizing the compensation data.
12. An electronic device, comprising:
A processor; and
a memory having stored thereon computer readable instructions which, when executed by the processor, implement the method according to any of claims 1 to 8.
13. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of claims 1 to 8.
CN202011059810.6A 2020-09-30 2020-09-30 Compensation data determining method and device, compensation method and device and electronic equipment Active CN112198525B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011059810.6A CN112198525B (en) 2020-09-30 2020-09-30 Compensation data determining method and device, compensation method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN112198525A CN112198525A (en) 2021-01-08
CN112198525B true CN112198525B (en) 2023-04-28

Family

ID=74007266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011059810.6A Active CN112198525B (en) 2020-09-30 2020-09-30 Compensation data determining method and device, compensation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112198525B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206178141U (en) * 2016-09-12 2017-05-17 深圳市金立通信设备有限公司 Laser rangefinder , camera device and terminal
US10389957B2 (en) * 2016-12-20 2019-08-20 Microsoft Technology Licensing, Llc Readout voltage uncertainty compensation in time-of-flight imaging pixels
CN110072065B (en) * 2018-01-23 2021-04-27 舜宇光学(浙江)研究院有限公司 Projector working time control method suitable for roller shutter exposure depth camera and application thereof
CN208314204U (en) * 2018-05-14 2019-01-01 孙向明 A kind of TOF 3D depth image sensor of environment resistant light interference
CN109544617B (en) * 2018-12-05 2024-04-16 光微信息科技(合肥)有限公司 Temperature compensation method and temperature compensation device applied to phase type TOF sensor
CN110109133A (en) * 2019-05-05 2019-08-09 武汉市聚芯微电子有限责任公司 Compensated distance method and distance measuring method based on the flight time
CN110579753A (en) * 2019-09-21 2019-12-17 北醒(北京)光子科技有限公司 depth sensor calibration system and method
CN110940963A (en) * 2019-12-25 2020-03-31 科沃斯机器人股份有限公司 Measurement module and autonomous mobile device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant