CN113687366A - Detection unit, detection device and method - Google Patents

Detection unit, detection device and method

Info

Publication number
CN113687366A
Authority
CN
China
Prior art keywords
exposure time
information
processing module
signal
exposure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010404733.7A
Other languages
Chinese (zh)
Inventor
雷述宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Abax Sensing Electronic Technology Co Ltd
Original Assignee
Ningbo Abax Sensing Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Abax Sensing Electronic Technology Co Ltd filed Critical Ningbo Abax Sensing Electronic Technology Co Ltd
Priority to CN202010404733.7A
Publication of CN113687366A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The application provides a detection pixel unit, a detection device, and a detection method using the detection device. The detection device includes a photosensitive module that can obtain N groups of different exposure information, where the N groups include a first exposure time and a second exposure time of different durations. A processing module receives different signals and processes the information obtained at the different exposure durations accordingly, and an information generation module outputs the final target information according to the different processing results of the processing module. With this arrangement, different operating modes are prefabricated inside the system, so that the detection system can accurately track multiple objects at different distances in the field of view while outputting results reliably and quickly, which enhances the user experience.

Description

Detection unit, detection device and method
Technical Field
The present application relates to the field of detection technologies, and in particular, to a detection unit, a detection device, and a detection method.
Background
More and more technologies are being released in the detection field. To ensure that target information is detected efficiently and rapidly in applications such as imaging or ranging, increasing attention is paid to the efficiency of acquiring detection information. Whether a detection system can process high-quality pictures efficiently and quickly when acquiring images directly affects the user experience. This is particularly true in the ranging field: for example, when there is a certain relative speed between the detection device and the detected object, it is very important to acquire and process distance data rapidly and accurately. Especially when the detection device is vehicle-mounted equipment, fast and accurate distance information greatly helps the user realize fully automatic driving at high speed while also ensuring the safety of automatic driving.
In earlier disclosed technology, when an image is acquired, a plurality of images of the same scene are taken under different exposure parameters. The images are partitioned into blocks, the information entropy of each block of a single frame in the image sequence is computed, and the block with the largest amount of information is stored and combined into a new scene image. This design provides an imaging solution that obtains a larger dynamic range. Briefly, a high dynamic range in image acquisition means that the image retains both bright and dark layers, so the picture does not easily collapse into pure white or pure black (i.e., bright spots and black spots). In the ranging field, accuracy likewise receives more and more attention; in particular, laser sources are increasingly arrayed, and both emission and reception can detect multiple targets in the field of view. Within a detection field of view, the target objects may be at quite different distances: the return light from a relatively distant object may be very weak, while the return light from a very close object may be very strong. If the same exposure is used for all of them, information in the field of view is lost, among other defects. The problem can be solved by synthesizing the final distance information of the targets in the field of view from information acquired with different exposure times; however, the images acquired multiple times require particularly complex storage and computation, which reduces the speed of acquiring multi-object distance information in ranging. This may pose a potential danger for vehicle-mounted equipment and, for image acquisition equipment, degrades the user experience.
For example, in current distance or depth detection, a Time of Flight (TOF) method is often used. Its principle is to continuously transmit light pulses to the object and then receive the light returned from the object with a sensor, obtaining the distance to the object from the round-trip flight time of the light. The technique that directly measures the flight time of the light is called DTOF (direct TOF); the technique that periodically modulates the emitted light signal, measures the phase delay of the reflected light signal relative to the emitted light signal, and calculates the flight time from that phase delay is called ITOF (indirect TOF). According to the modulation and demodulation type, it can be divided into continuous-wave (CW) modulation/demodulation and pulse-modulated (PM) modulation/demodulation. A distance detection scheme with high precision and high sensitivity can be obtained with the ITOF scheme, so the ITOF scheme is widely applied.
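As a minimal illustration of the two measurement principles just described (the function names, example values, and 50 MHz modulation frequency are assumptions for the example, not values from this application):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_distance(round_trip_time_s: float) -> float:
    """DTOF: the round-trip flight time of the light pulse is measured directly."""
    return C * round_trip_time_s / 2.0

def itof_distance(phase_delay_rad: float, mod_freq_hz: float) -> float:
    """ITOF: the phase delay of the reflected signal relative to the emitted
    signal is measured; distance follows from d = c * phi / (4 * pi * f)."""
    return C * phase_delay_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a 20 ns round trip, or a phase delay of pi/2 at 50 MHz modulation.
print(dtof_distance(20e-9))              # ~3.0 m
print(itof_distance(math.pi / 2, 50e6))  # ~0.75 m
```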
To obtain efficient measurement results and higher chip integration, distance measurement is usually implemented with two or more taps, and the distance information of the target object can be obtained with a phase ranging algorithm. The simplest is a two-phase method, and a three-phase, four-phase, or even five-phase scheme can further be used to obtain the distance information. Taking the four-phase algorithm as an example, at least two exposures (usually four exposures, to ensure measurement accuracy) are required to complete the acquisition of the four-phase data and output one frame of a depth image. At the same time, different exposure times for the different phases have to be arranged on top of the four-phase ranging, which makes a high frame rate even harder to obtain. Therefore, a method is urgently needed that gives the detection information, especially that of a detection device in the ranging process, a very high dynamic range characteristic while still ensuring efficient and fast result output of the whole ranging device.
Disclosure of Invention
An object of the present application is to address the above deficiencies in the prior art by providing a detection unit, so as to solve the technical problem that existing detection units cannot handle fast, high-precision detection of multiple targets.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
In a first aspect, an embodiment of the present application provides a detection pixel unit, including: a photosensitive module, used for respectively carrying out exposure processing on the pixels at N different exposure times and receiving N groups of exposures, wherein N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals; the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal, establish a corresponding relationship between the processing module and the photosensitive module, output a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and reset the second exposure time signal;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
Optionally, the processing module may further receive a second signal, establish a corresponding relationship between the processing module and the photosensitive module, and output a second exposure time signal processed by the processing module corresponding to the second exposure time, where the first exposure time signal is reset;
and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
Optionally, the detection pixel unit further comprises a judging module, the judging module generates a third signal, the processing module can also receive the third signal, and the processing module performs an operation according to the results of the first exposure time signal and the second exposure time signal;
the information generation module receives the operation result of the first exposure time signal and the second exposure time signal output by the processing module and generates final target information.
Optionally, the pixel unit is a distance acquisition pixel unit, and the target information is distance information.
Optionally, the first signal and the second signal are related to the distance of the measured object.
Optionally, the determining module outputs the third signal according to the charge storage threshold of the pixel and the current storage value of the charge of the pixel under the first or second exposure.
In another aspect, the present invention also provides a detection apparatus comprising: a light source operable to emit light to illuminate an object under detection;
a pixel array, on which exposure processing is respectively carried out at N different exposure times, the pixel array receiving N groups of exposures, wherein N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals; the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal, the processing module establishes a corresponding relation with the pixel array and outputs a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and the second exposure time signal is reset;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
Optionally, the processing module may further receive a second signal, establish a corresponding relationship between the processing module and the pixel array, and output a second exposure time signal processed by the processing module corresponding to the second exposure time, where the first exposure time signal is reset;
and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
Optionally, the detection apparatus further includes a judging module, the judging module generates a third signal, the processing module may further receive the third signal, the processing module performs an operation according to the results of the first exposure time signal and the second exposure time signal, and for the signals of a part of the pixel units of the pixel array, the result signals output by the processing module are used as the output signals of those pixel units;
the information generation module receives the output signals of all the pixel array pixel units output by the processing module and generates final target information.
Optionally, the N groups of exposures contain output information corresponding to reception control signals of four different phases, namely 0°, 90°, 180° and 270°.
Optionally, the first exposure time and/or the second exposure time in the N groups of exposures includes output information corresponding to the receiving control signals of four different phases.
Optionally, each pixel output information in the pixel array includes two subframes, the two subframes include the same number of first exposure times, and the first exposure time includes output information corresponding to four receiving control signals with different phase information.
Optionally, the two subframes include the same number of second exposure times, the first subframe includes at least one second exposure time, the second exposure time includes output information corresponding to the receiving control signals of two pieces of phase information with a phase difference of 180 °, the second subframe includes at least one second exposure time, the second exposure time includes output information corresponding to the receiving control signals of two pieces of phase information with a phase difference of 180 °, and the receiving control signals with a phase difference of 180 ° in the second exposure time included in the two subframes may constitute the output signals of the four receiving control signals with different phases.
Optionally, each pixel output information in the pixel array includes a plurality of sub-frames, two adjacent sub-frames in the plurality of sub-frames each include at least one output information corresponding to a reception control signal of two phase information with a phase difference of 180 °, and the reception control signals with a phase difference of 180 ° in the second exposure time included in the two adjacent sub-frames may constitute the output signals of the four different reception control signals with different phases; the processing module can also receive a fourth control signal and output a first exposure time signal and a second exposure time signal of the two adjacent subframes, and the information generating module receives different exposure time signals output by the processing module and generates final target information.
In a third aspect, an embodiment of the present application provides a detection method, which is applied to the detection apparatus described in the second aspect, and the detection method includes:
the light source is operable to emit light to illuminate the detected object;
the photosensitive module is used for respectively carrying out exposure processing on the pixels at N different exposure times and receiving N groups of exposures, wherein N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals; the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal control and output a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and the second exposure time signal is reset;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
Optionally, the processing module may further receive a second signal control, and output a second exposure time signal processed by the processing module corresponding to the second exposure time, where the first exposure time signal is reset; and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
Optionally, the detection apparatus further includes a judging module, the judging module generates a third signal, the processing module receives the third signal and controls the operation according to the results of the first exposure time signal and the second exposure time signal, and for the signals of a part of the pixel units of the pixel array, the result signals output by the processing module are used as the output signals of those pixel units;
the information generation module receives the output signals of all the pixel array pixel units output by the processing module and generates final target information.
Optionally, the N groups of exposures contain output information corresponding to reception control signals of four different phases, namely 0°, 90°, 180° and 270°.
Optionally, each pixel output information in the pixel array includes two subframes, the two subframes include the same number of first exposure times, and the first exposure time includes output information corresponding to four receiving control signals with different phase information.
Optionally, each pixel output information in the pixel array includes a plurality of sub-frames, two adjacent sub-frames in the plurality of sub-frames each include at least one output information corresponding to a reception control signal of two phase information with a phase difference of 180 °, and the reception control signals with a phase difference of 180 ° in the second exposure time included in the two adjacent sub-frames may constitute the output signals of the four different reception control signals with different phases; the processing module can also receive a fourth control signal and output a first exposure time signal and a second exposure time signal of the two adjacent subframes, and the information generating module receives different exposure time signals output by the processing module and generates final target information.
The beneficial effects of the present application are as follows:
The embodiments of the present application provide a detection unit, a detection device and a method. The detection device includes: a light source operable to emit light to illuminate the detected object; a pixel array on which exposure processing is respectively performed at N different exposure times, the pixel array receiving N groups of exposures, where N is an integer greater than or equal to 2; and a processing module capable of processing the N groups of exposures respectively to obtain N groups of exposure signals, where the N groups of exposures include at least two groups with different exposure times, namely a first exposure time and a second exposure time, the first exposure time being shorter than the second exposure time. The processing module can receive a first signal, establish a correspondence with the pixel array, and output a first exposure time signal corresponding to the first exposure time and processed by the processing module, while the second exposure time signal is reset. The detection device therefore has an intelligent selection function. In the first mode, for example when the distance between the detection device and the detected object in the field of view is very small and the distance needs to be obtained quickly, the first exposure time, i.e., the short-exposure information, is entirely sufficient for the distance calculation within this range, so the device can output the distance information quickly and operate efficiently and intelligently. Further, the device may receive a second signal; the processing module establishes a correspondence with the pixel array and outputs a second exposure time signal corresponding to the second exposure time and processed by the processing module, while the first exposure time signal is reset; the information generation module receives the second exposure time signal output by the processing module and generates the final target information. In this mode the device can detect a farther distance, obtaining the distance of the final target using only the long-exposure signal, which realizes a high-dynamic effect. Finally, the detection device further includes a judgment module that generates a third signal; the processing module can also receive the third signal and perform an operation according to the results of the first exposure time signal and the second exposure time signal, and for part of the pixel units of the pixel array, the result signals output by the processing module are used as the output signals of those pixel units. In this mode, when the state of the target objects in the field of view is more complex, the information of the first exposure time and the second exposure time can be used together for correction to obtain the most reliable final target information. When the target is acquired multiple times, the arrangement of multiple subframes makes the information of the preceding and following subframes complementary, so that it can be multiplexed; this achieves the effect of obtaining target information quickly without lowering the frame rate during image or distance acquisition.
Drawings
In order to more clearly explain the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments are briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic functional block diagram of a detection apparatus according to an embodiment of the present disclosure;
fig. 2 is a schematic functional block diagram of another detection apparatus according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an operating principle of a detection system according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an arrangement of long and short exposures according to an embodiment of the present application;
FIG. 5 is a schematic diagram of another long and short exposure arrangement provided in an embodiment of the present application;
fig. 6 is a schematic diagram of an arrangement scheme of long and short exposures in multiple subframes according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a probing sequence according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another probing sequence provided in the embodiments of the present application;
FIG. 9 is a schematic diagram of another exemplary probing sequence provided in the embodiments of the present application;
fig. 10 is a schematic diagram of a detection timing sequence in multiple subframes according to an embodiment of the present application;
fig. 11 is a schematic flowchart of a detection method according to an embodiment of the present application;
fig. 12 is a schematic flow chart of another detection method provided in the embodiment of the present application;
fig. 13 is a schematic flowchart of another detection method according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Fig. 1 is a schematic functional block diagram of a detection apparatus according to an embodiment of the present disclosure. As shown in Fig. 1, the detecting device includes the light source 110, the processing module 120, the photosensitive module 130, and the information generating unit 140. The light source 110 may be configured as a single unit or as an array light source system that emits continuous light, and may be a semiconductor laser, an LED, or another light source that can be pulse-modulated. When a semiconductor laser is used as the light source, a vertical-cavity surface-emitting laser (VCSEL) or an edge-emitting laser (EEL) may be used; this is only exemplary and not particularly limited here. The waveform of the light output by the light source 110 is also not limited and may be a square wave, a triangular wave, or a sine wave. The photosensitive module 130 includes a photoelectric conversion module, which has a photoelectric conversion function and can be implemented by a photodiode (PD), specifically a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor; the type of the photoelectric conversion module is not particularly limited here.
The processing module 120 may include a control module that can control the light source to emit light at different times. The processing module 120 may enable the photosensitive module 130 to obtain the light reflected back by the detected object 150 at phase delays of 0°, 180°, 90°, and 270° relative to the light emitted by the light source 110; the reflected light forms incident light at the photosensitive module 130 and is photoelectrically converted into different information by the photosensitive module. In some cases, information acquisition of the detected object is realized using a two-phase scheme of 0° and 180°; the literature also discloses three-phase acquisition of target information at 0°, 120°, and 240°, and even five-phase delay schemes, and the invention is not particularly limited in this respect. The acquired target information may be image information of a target, distance information of the target, contour information, and so on; the present invention is not particularly limited. The following takes a four-phase time-of-flight distance acquisition scheme as an example to describe in detail the specific technical problem and the solution.
On this basis, the light source 110 emits the emitted light, and the photosensitive module 130, controlled by the processing module 120, obtains the light reflected by the detected object 150 at predetermined delay phases relative to the emitted light, for example four different delay phases; the returned reflected light forms the incident light at the photosensitive module 130. This solution makes no special requirements for the light source: the light emitted by the light source each time carries no phase difference, which avoids errors caused by having to adjust the emission parameters during use. The device is therefore very simple to implement, and the reliability of the whole detection system is ensured. The phase delay in this solution is implemented in the receiving part and the controller, and the processing module and/or the information generating module 140 can be integrated in the photosensitive module 130, which keeps the system structure simple and efficient. In addition, the multi-phase delayed receiving scheme adopted at the receiving end avoids the need to emit light separately for each phase at the transmitting end; for example, in the four-phase scheme, the two phase delays of 0° and 180° can be obtained in one emission, so that the whole ranging system achieves efficient ranging. The light emitted by the light source 110 and reflected by the detected object 150 is converted into photo-generated electrons (or photo-generated charges) in the photoelectric conversion module of the photosensitive module 130. The N groups of exposures include a first exposure with a first exposure duration and a second exposure with a second exposure duration, where the first exposure duration is the short exposure time, and the groups of exposures are realized on the same pixel or pixel array. This ensures the adaptability of the whole receiving array to the field of view, avoids blind spots that would arise if the receiving array were divided into different units receiving different exposures, and makes the control easier to realize. The long and short exposures are realized on the same pixel by different timing sequences, and a reset control timing can be set between the sequences, which further ensures that the different exposure information does not interfere, without designing a complex isolation technique at the pixel level. The photo-generated electrons are output through modulation of a tap (the results for different phase information can be output by one same circuit or by several different circuits, which is not limited here), and then undergo a physical operation in the pixel (for example using a charge storage unit such as a capacitor) or a digital operation (for example integrating the sensor and an arithmetic unit into a single chip), or a physical or digital operation in a subsequent ADC or other circuit part; the specific implementation is not limited in the invention.
When the photosensitive module 130 receives the emitted light reflected by the detected object 150 at different delay phases and different exposure times, it generates different information. The information generating module 140 can receive different control signals and perform different operations on the different information acquired by the photosensitive module to complete the final information acquisition. The different control signals may include object distance signals; for example, the detection system can obtain approximate distance information of the target object in a preset manner and select the final information acquisition scheme on that basis. When the detected object in the field of view is very close, the detection device performs the distance calculation using only the short-exposure result; when no short-distance object exists in the field of view, the detection system performs the calculation using only the long-exposure result; and when far and near objects exist in the field of view simultaneously, the information obtained by both long and short exposures is used, and the results obtained at the two different exposure times are used to correct the results of part of the units in the array-type photosensitive module. This ensures the high dynamic range characteristic of the detection device. The pixel information of the photosensitive module in the detection device can be corrected using the expression shown in Equation 1.
f(x) = m·f_s(x) + n·f_l(x);  (1)
In Equation 1, f(x) denotes the corrected information, f_s(x) the result information obtained by the short exposure, and f_l(x) the result information obtained by the long exposure; m and n are correction coefficients, which may be fixed values or empirical coefficients obtained from experiments. When the third mode is used, that is, when both the long- and short-exposure information are utilized, the system may include a judgment module. The judgment module may implement a mode that the user selects, in which the detection system performs the information synthesis operation of long and short exposures to obtain the final information, or it may make an adaptive judgment automatically. For example, a module similar to a comparator compares the information value obtained by the pixel unit with thresholds (including at least a maximum threshold and a minimum threshold). When the short-exposure information value of some units is smaller than the minimum threshold (indicating that a distant object exists in the field of view and dark spots exist in the obtained information), or when the long-exposure information value of some units is larger than the maximum threshold (indicating that a near object exists in the field of view and white spots exist in the obtained information), the judgment module autonomously generates a control signal, under which the detection system uses the information of the two different exposure times to obtain the final information. In this way the detection system can, on the one hand, meet the user's requirement for high-dynamic-range acquisition and, on the other hand, obtain a high-dynamic-range effect autonomously, selecting the final information acquisition mode according to the scene. This greatly improves the user experience and ensures detection accuracy on the basis of high efficiency. In addition, by interleaving different information within the multi-frame information through timing control, the information of two adjacent subframes can correct each other, so the frame rate of the whole detection process is not reduced and the ranging efficiency is not affected. In the implementation of the control signals, the first control signal and the second control signal can be signals related to the distance of the detected object: the distance is acquired in advance according to historical data or in a preset manner, and the first and second control signals are then generated.
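A minimal sketch of the Equation 1 correction combined with the threshold-based judgment just described (the threshold values, coefficient values, default choice, and function names are assumptions for illustration only):

```python
import numpy as np

def fuse_exposures(short_img, long_img, m=0.5, n=0.5,
                   min_threshold=50, max_threshold=4000):
    """Per-pixel correction f(x) = m*f_s(x) + n*f_l(x) from Equation 1.

    Units whose short-exposure value falls below the minimum threshold
    (dark spots, distant objects) or whose long-exposure value exceeds the
    maximum threshold (white spots, near objects) are corrected using both
    exposure results; the others keep the long-exposure value (an assumed
    default for this sketch).
    """
    short_img = np.asarray(short_img, dtype=np.float64)
    long_img = np.asarray(long_img, dtype=np.float64)

    fused = long_img.copy()
    needs_both = (short_img < min_threshold) | (long_img > max_threshold)
    fused[needs_both] = m * short_img[needs_both] + n * long_img[needs_both]
    return fused
```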
Fig. 2 shows another embodiment of the present invention. The functions of each module are the same as in Fig. 1 and are not repeated here. Compared with Fig. 1, a judgment module 260 is shown. In practice, the judgment module may be combined into one module with the processing module or with the information generation module, and may be disposed inside the pixel array or physically separated from it; this is not limited here. The third control signal of the judgment module 260 may be a control signal provided by a user selection, or a control signal obtained by the detection apparatus through adaptive judgment; this is not limited here either. In the multi-subframe detection process, the fourth signal among the system control signals may be generated in the same manner as the third signal, that is, as a user selection signal or as an adaptive control signal, and the fourth signal may even be substantially the same electrical signal; this will not be described in detail here.
Fig. 3 is a schematic block diagram of an implementation of the present invention, operating in the same way as the system described with reference to Fig. 1. It should be noted that the detection system has different working modes, which provides a higher degree of automation and intelligence, ensures efficient and accurate distance measurement during operation, and guarantees the reliability of the whole system; the working principle is not repeated here.
Fig. 4 takes the four-phase method as an example. The processing module 120 controls the light source 110 to emit the emitted light; after the emitted light is reflected by the detected object 150, the processing module 120 controls the photosensitive module 130 to receive it with four phase delays, all four phases being received using the short exposure. In the subframe of Fig. 4, several groups of four-phase short exposures and a two-phase long exposure with a phase difference of 180° are arranged; a four-phase long exposure together with at least one group of four-phase short-exposure data can also be arranged, which is not limited here. The length of the frame is related to the frame rate of the detection system; for example, detection systems commonly use 15 FPS, 30 FPS, or 60 FPS. Through a reasonable arrangement, the present invention includes at least one group of four-phase short exposures and one group of two-phase long exposures in one subframe, where the exposure time of the long exposure can be four times the short exposure time or more. With this arrangement, the detection system contains information values of different phases and different exposure times, and can adapt to multi-target detection and accurate detection of multiple targets at different distances.
Fig. 5 is similar to the arrangement of Fig. 4. To compensate for the ranging accuracy of the long-exposure information contained in the detection system at a high frame rate, complementary phases are arranged in another subframe, forming four complementary phases together with the long exposure of the subframe of Fig. 4. Of course, the long-exposure information of all four phases may also be arranged in one subframe in Figs. 4 and 5; the interior of the detection system may be controlled autonomously according to different frame rates and different long- and short-exposure delays, and the delays of the short and long exposures may also be adjusted autonomously, which is not particularly limited here.
Fig. 6 illustrates the setting of different phase delays and different exposure durations over a plurality of subframes, for example at a frame rate of 15 FPS, 30 FPS, or 60 FPS, that is to say 15, 30, or 60 subframes per second. The four-phase information of the long and short exposures is obtained in the manner of Figs. 4 and 5 in the Nth and (N+1)th frames, and complementary subframes may be formed using a preceding subframe and a following subframe. Thus, in multi-subframe detection, the information of two adjacent subframes can realize different detection distances in a multi-target scene, and with this arrangement the distance information of the detected object can be obtained in the same way as with the four-phase algorithm, as sketched below. For example, the information of every two adjacent subframes can form complementary information, so the frame rate of the result output is not lowered even though the amount of information required to obtain the result is relatively large.
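An informal sketch of combining two adjacent subframes into a full four-phase set (the subframe contents and dictionary keys are illustrative assumptions, not structures defined by this application):

```python
def combine_adjacent_subframes(subframe_n, subframe_n1):
    """Merge the long-exposure phase data of subframe N and subframe N+1.

    Each subframe is assumed to carry long-exposure charges for two phases
    180 degrees apart (e.g. {0: Q0, 180: Q180} and {90: Q90, 270: Q270});
    together the two adjacent subframes provide the complete four-phase set,
    so the result can feed the same four-phase algorithm without lowering
    the output frame rate.
    """
    merged = {**subframe_n, **subframe_n1}
    missing = {0, 90, 180, 270} - merged.keys()
    if missing:
        raise ValueError(f"phases {sorted(missing)} not available yet")
    return merged
```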
In the above embodiments, the receiving phases of 0° and 180° have a phase difference of 180°. When the modulation signals corresponding to the first circuit and the second circuit are mutually inverse signals, that is, when reception is performed with a 0° phase delay in a first time period, the corresponding 180°-delayed reception on the pixel does not output an electric signal through either of the two circuits, and exactly the opposite operation is performed in the other time period. The same applies to the receiving phases with 90° and 270° phase delays, whose phase difference is also 180°. In this way a scheme is obtained in which the modulation signals of the circuits corresponding to receiving phases 180° apart are mutually inverse signals, achieving reliable signal acquisition and efficient system operation when multiple phases share a tap, a floating diffusion (FD) node, or other circuit elements.
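A minimal sketch of the complementary demodulation windows just described (the sampling grid and helper names are assumptions for illustration):

```python
import numpy as np

def demodulation_window(phase_deg: float, samples_per_period: int = 360):
    """Square demodulation window: open for half the modulation period,
    starting at the given phase delay."""
    t = np.arange(samples_per_period)
    start = int(phase_deg / 360.0 * samples_per_period)
    return ((t - start) % samples_per_period) < (samples_per_period // 2)

w0, w90, w180, w270 = (demodulation_window(p) for p in (0, 90, 180, 270))

# Windows 180 degrees apart are mutually inverse, so the 0/180 pair (and the
# 90/270 pair) can share one tap or floating-diffusion node without conflict.
assert np.array_equal(w180, ~w0)
assert np.array_equal(w270, ~w90)
```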
In the distance acquisition process, the round-trip phase difference of the optical signal between the laser imaging radar and the target can be calculated from the 4 groups of integrated charges. Taking sinusoidally modulated light as an example, the phase difference φ between the echo signal and the transmitted signal corresponding to the modulated light is:
φ = arctan[(Q270° − Q90°) / (Q0° − Q180°)];  (2)
In Equation 2, Q0°, Q90°, Q180°, and Q270° are the electric signals converted by the receiving circuits corresponding to the different phase delays. Combining this with the relationship between distance and phase difference, the final distance result is obtained:
d = c·φ / (4πf);  (3)
In Equation 3, c is the speed of light and f is the frequency of the laser emitted by the light source 110. The case where the light emitted by the light source 110 is a square wave can be divided into different cases, and the final distance information is obtained according to the corresponding one of the following calculations:
When Q0° > Q180° and Q90° > Q270°, the distance is given by Equation 4.
When Q0° < Q180° and Q90° > Q270°, the distance is given by Equation 5.
When Q0° < Q180° and Q90° < Q270°, the distance is given by Equation 6.
When Q0° > Q180° and Q90° < Q270°, the distance is given by Equation 7.
In Equations 4-7, where the square wave is used for the distance calculation, Q0°, Q90°, Q180°, and Q270° are the electric signals converted by the receiving circuits corresponding to the different phase delays, c is the speed of light, and f is the laser frequency; of course, in some special cases one can also approximate the square-wave distance directly using the sine-wave method.
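A minimal sketch of the sinusoidal four-phase calculation of Equations 2 and 3, assuming the common CW-ITOF sign convention inside the arctangent (the function name and example values are illustrative, not from this application):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def four_phase_distance(q0, q90, q180, q270, mod_freq_hz):
    """Distance from the four integrated charges of a CW-ITOF pixel.

    phi = arctan((Q270 - Q90) / (Q0 - Q180)), wrapped to [0, 2*pi),
    d   = c * phi / (4 * pi * f)   (Equations 2 and 3).
    """
    phi = math.atan2(q270 - q90, q0 - q180) % (2.0 * math.pi)
    return C * phi / (4.0 * math.pi * mod_freq_hz)

# Example: charges giving a phase delay of pi/4 -> roughly 0.94 m at 20 MHz.
print(four_phase_distance(q0=100, q90=0, q180=0, q270=100, mod_freq_hz=20e6))
```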
Fig. 7 is a timing chart illustrating a way of setting information of different phases and different exposure times when a pixel unit has only one output circuit or floating diffusion node. A plurality of different receptions are arranged in the Nth subframe, including at least one complete set of four-phase information, to ensure that one subframe of information can yield the distance information of an object in the field of view; the arrangement can be adjusted adaptively and includes two-phase long-exposure information, so that the target information of a distant detected object can be obtained by the two-phase method.
Fig. 8 is a schematic diagram of timing control in which delay information of different phases can be output through two different circuits; a pixel unit may output information through two circuits, which ensures high information efficiency. The timing of Fig. 8 may include short-exposure information for several four-phase delays, ensuring the reliability of the short-exposure detection result, and long-exposure information for two delay phases 180° apart. On the one hand, distance acquisition with the two-phase scheme can be performed from the long-exposure information of the two delay phases 180° apart; on the other hand, multiple subframes may be arranged in a complementary manner so that the target distance is obtained with the same four-phase delays, finally achieving the effect of not reducing the output frame rate of the result.
In the four-phase ranging process, different circuits (including the charge transfer channel inside the pixel and the physical circuit outside the pixel) output the results of different phase-delay signals. In actual use, however, because of effects such as the delay and offset of the column line and the comparator, the results obtained by the two circuits when processing the same received phase signal differ. For example, if the inherent deviations in electron number of Q0° and Q180° are ΔQ1 and ΔQ2, then the numbers of electrons actually obtained for Q0° and Q180° carry a certain deviation; for instance, the electric signals corresponding to the phase delays obtained by the first circuit and the second circuit are respectively:
Q0°,r1 = Q0° + ΔQ1;  Q180°,r2 = Q180° + ΔQ2;  (8)
In Equation 8, Q0°,r1 denotes the electric signal value converted by the first circuit at the 0° delay phase that is actually substituted into the distance formula, Q0° is the ideal electric signal value at the 0° delay phase, and ΔQ1 is the deviation electric signal produced when the 0° delay-phase signal is converted by the first circuit; the symbols in the expression for the 180° delay phase in Equation 8 have analogous meanings and are not repeated here. The deviation may follow a linear or higher-order functional relation and can be modeled according to the practical situation, but the deviation electric signal is very difficult to obtain in actual use. To solve this technical problem, in the solution of the invention each of the four different delay phases can be obtained by both the first circuit and the second circuit, giving two electric signal values, and an arithmetic-mean scheme (or a similar algorithm) is then used to obtain the electric signal value finally substituted into the expression, which can be expressed by the following formula:
Q0°,sum = Q0°,r1 + Q0°,r2;  Q90°,sum = Q90°,r1 + Q90°,r2;  Q180°,sum = Q180°,r1 + Q180°,r2;  Q270°,sum = Q270°,r1 + Q270°,r2;  (9)
That is, the signals obtained by the two circuits are summed for each phase; after the summation, the results obtained for the same phase at the different circuit outputs are superimposed, and the influence factors ΔQ1 and ΔQ2 are superimposed as well, so the difference between the outputs of the different circuits for the same phase is accounted for in the result. The superimposed results are then used in the subsequent distance calculation to obtain an accurate distance result, which is explained here for the case of Equation 4 of the square-wave detection:
When Q0° > Q180° and Q90° > Q270°, the distance is given by Equation 10, that is, the expression of Equation 4 evaluated with the summed electric signal values.
In Equation 10, the final accurate distance information is obtained by directly using the summed result, without averaging, in the final distance acquisition; the result of the physical accumulation of capacitor charge can also be obtained by digital operation in a subsequent arithmetic circuit. Because the calculation involves differences between different phases, the offset caused by the column line and comparator is eliminated; on the other hand, the transfer-function mismatch caused by non-ideal factors such as tap differences can be removed. The offset charge caused by transfer-function mismatch may follow a linear or nonlinear relation; the basic principle is similar to the offset-induced charge difference, and in image sensing applications a scheme similar to that of Equation 1 can be adopted to correct the values obtained by the two channels and obtain the most accurate value.
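A minimal sketch of the two-circuit summation idea behind Equations 8 and 9 (the dictionary layout and the offset values are assumptions for illustration):

```python
def combine_two_circuits(readings_circuit1, readings_circuit2):
    """Sum the electric signal values read by the two circuits for each of the
    four delay phases: the fixed deviations of the two readout paths are then
    superimposed identically onto every phase, so the phase differences used
    in the distance formula cancel them out.

    The readings are assumed to be dicts keyed by phase in degrees.
    """
    return {phase: readings_circuit1[phase] + readings_circuit2[phase]
            for phase in (0, 90, 180, 270)}

# Example with an illustrative per-circuit offset: the offsets disappear from
# the differences Q0-Q180 and Q90-Q270 that enter the distance calculation.
ideal = {0: 120, 90: 80, 180: 40, 270: 60}
dq1, dq2 = 7, -3
r1 = {p: v + dq1 for p, v in ideal.items()}
r2 = {p: v + dq2 for p, v in ideal.items()}
q = combine_two_circuits(r1, r2)
assert q[0] - q[180] == 2 * (ideal[0] - ideal[180])
assert q[90] - q[270] == 2 * (ideal[90] - ideal[270])
```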
Fig. 9 shows a timing arrangement in which at least one of the same phase-delay signals is output by different circuits. On the one hand, this can eliminate the offset and transfer-function mismatch caused by various factors; on the other hand, as described above, arranging complementary phases in different subframes achieves the effect of not reducing the frame rate of the result output. Of course, the long exposure in one subframe may include all four delay phases, with the four delay phases of the long exposure in the other subframe obtained by the different second circuit, so that a true complementary operation of two adjacent subframes is achieved; this improves the accuracy of the long- and short-exposure information while keeping the frame rate of the output result, and is not described in further detail here.
Fig. 10 is a timing chart illustrating different phase delays and different exposure durations in two adjacent subframes among a plurality of subframes. The timing is configured to take into account the differences in the information output by different circuits, thereby solving problems such as offset, and the complementary design of adjacent subframes also ensures that the frame rate of the result output is not lowered.
The above examples are based on the timing control of pixel units. In an actual detection system the detection units form a detection array, so different detection units in the array may correspond to different detected objects in the field of view, and the distances of different detected objects differ. Therefore, in actual operation the detection system may only need to correct the information obtained by part of the detection units. For example, the detection system may directly receive control signals (button signals, etc.), or an adaptive control scheme may be used, for example judging from the obtained information: when dark spots exist in the obtained information, the information at the dark spots is replaced by long-exposure information, and when white spots exist, the information at the white spots is replaced by short-exposure information. Of course, under the control of the button signal or the adaptive signal, the information obtained at at least one delay phase may also be corrected, using the two long- and short-exposure information values to ensure the accuracy of the detection information; this is not described in further detail here.
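A small sketch of the adaptive per-unit rule just described (the threshold values and array names are assumptions; the exact per-pixel replacement policy of this application may differ):

```python
import numpy as np

def correct_partial_units(base_frame, short_frame, long_frame,
                          dark_threshold=50, white_threshold=4000):
    """Dark spots in the obtained information are replaced by long-exposure
    information, white spots by short-exposure information, and all other
    units of the detection array keep their current values."""
    base = np.asarray(base_frame, dtype=np.float64).copy()
    short_frame = np.asarray(short_frame, dtype=np.float64)
    long_frame = np.asarray(long_frame, dtype=np.float64)

    dark_spots = base < dark_threshold     # weak returns, distant objects
    white_spots = base > white_threshold   # saturated returns, near objects
    base[dark_spots] = long_frame[dark_spots]
    base[white_spots] = short_frame[white_spots]
    return base
```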
Fig. 11 illustrates the steps of one implementation of the present invention. In S101 the processing module 120 controls the light source 110 to emit light, which may be a square wave, a triangular wave, or a sine wave, without specific limitation; the field of view is illuminated by the emitted light, and the detected object 150 reflects it, forming a reflected-light echo. In S102, while controlling the light source to emit the light, the processing module 120 controls the photosensitive module 130 to receive the echo of the reflected light with control signals having different phase delays relative to the light source 110. In S103 the photosensitive module 130 obtains N groups of exposure signals covering the field of view at different exposures, and the different exposure-time signals among the N groups are utilized according to the different control signals; the specific processing has been described in detail above and is not repeated here.
Fig. 12 illustrates the steps of another implementation of the present invention, similar to the steps shown in Fig. 11; Fig. 12 further defines a scheme for obtaining the target information when the long and short exposures among the N groups of exposure signals each contain four phase delays. The implementation of the corresponding steps may refer to the steps described for Fig. 11 and is not repeated here.
Fig. 13 illustrates the steps of another implementation of the present invention, similar to the schemes shown in Figs. 7 and 8; Fig. 13 further defines a scheme in which the target information is obtained with the four-phase scheme, and each phase of at least one of the long and short exposures with four delay phases is read out by two circuits to obtain the corresponding electric signals. The implementation of the corresponding steps may refer to the steps described for Fig. 11 and is not repeated here.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that like reference numbers and letters indicate like items in the figures, so that once an item is defined in one figure it need not be further defined or explained in subsequent figures.

Claims (20)

1. A detection pixel unit, characterized by comprising:
the photosensitive module is used for respectively carrying out exposure processing on the pixels at N different exposure times and receiving N groups of exposures, wherein N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals;
the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal, establish a corresponding relation between the processing module and the photosensitive module, output a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and reset the second exposure time signal;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
2. The pixel unit according to claim 1, wherein the processing module is further capable of receiving a second signal, the processing module establishes a corresponding relationship with the photosensitive module and outputs a second exposure time signal processed by the processing module corresponding to the second exposure time, and the first exposure time signal is reset;
and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
3. The pixel cell of claim 1, further comprising a determining module, wherein the determining module generates a third signal, and the processing module further receives the third signal, and the processing module performs an operation according to the results of the first exposure time signal and the second exposure time signal;
the information generation module receives the operation result of the first exposure time signal and the second exposure time signal output by the processing module and generates final target information.
4. The pixel cell of claim 1, wherein the pixel cell is a distance acquisition pixel cell and the target information is distance information.
5. The pixel cell of claim 2, wherein the first signal and the second signal are related to object distance.
6. The pixel cell of claim 3, wherein the determining module outputs the third signal based on a charge storage threshold of the pixel and a current stored value of charge of the pixel at the first or second exposure.
7. A detection device comprising an array of pixels according to claim 1 and a light source operable to emit light to illuminate an object under detection;
respectively carrying out exposure processing on the pixel array at N different exposure times, wherein the pixel array receives N groups of exposures, and N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals;
the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal, establish a corresponding relation with the pixel array, output a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and reset the second exposure time signal;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
8. A detection apparatus according to claim 7, wherein the processing module is further capable of receiving a second signal, the processing module establishes a correspondence relationship with the pixel array and outputs a second exposure time signal processed by the processing module corresponding to the second exposure time, the first exposure time signal is reset;
and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
9. A detecting device according to claim 7, further comprising a judging module, wherein the judging module generates a third signal, the processing module is further capable of receiving the third signal, the processing module performs an operation according to the results of the first exposure time signal and the second exposure time signal, and part of the pixel units of the pixel array use the result signals output by the processing module as their output signals;
the information generation module receives the output signals of all the pixel array pixel units output by the processing module and generates final target information.
10. The detection apparatus according to claim 7, wherein the N groups of exposures include output information corresponding to received control signals containing four different phase information, namely 0°, 90°, 180° and 270°.
11. The detection apparatus according to claim 10, wherein the first exposure time and/or the second exposure time of said N groups of exposures includes output information corresponding to the received control signals of the four different phase information.
12. A detection apparatus according to claim 10, wherein each pixel output information in the pixel array comprises two sub-frames, the two sub-frames comprise the same number of first exposure times, and the first exposure time comprises output information corresponding to four different phase information receiving control signals.
13. The apparatus according to claim 12, wherein the two sub-frames include a same number of second exposure times, the first sub-frame includes at least one second exposure time, and the second exposure time includes output information corresponding to the reception control signals of two phase information having a phase difference of 180 °, the second sub-frame includes at least one second exposure time, and the second exposure time includes output information corresponding to the reception control signals of two phase information having a phase difference of 180 °, and the reception control signals having a phase difference of 180 ° in the second exposure time included in the two sub-frames constitute the output signals of the four reception control signals having different phases.
14. A detecting device according to claim 10, wherein each pixel output information in the pixel array comprises a plurality of sub-frames, two adjacent sub-frames in the plurality of sub-frames each comprise at least one output information corresponding to the receiving control signal of two phase information having a phase difference of 180 °, and the receiving control signals having a phase difference of 180 ° in the second exposure time comprised in the two adjacent sub-frames can constitute the output signals of the four receiving control signals having different phases; the processing module can also receive a fourth control signal and output a first exposure time signal and a second exposure time signal of the two adjacent subframes, and the information generating module receives different exposure time signals output by the processing module and generates final target information.
15. A detection method applied to the detection apparatus according to any one of claims 1 to 10, the detection method comprising:
the light source is operable to emit light to illuminate the detected object;
the photosensitive module is used for respectively carrying out exposure processing on the pixels at N different exposure times and receiving N groups of exposures, wherein N is an integer greater than or equal to 2;
the processing module can respectively process the N groups of exposures to obtain N groups of exposure signals;
the N groups of exposures comprise at least two groups of exposures with different exposure times, namely a first exposure time and a second exposure time, wherein the first exposure time is shorter than the second exposure time;
the processing module can receive a first signal control and output a first exposure time signal which corresponds to the first exposure time and is processed by the processing module, and the second exposure time signal is reset;
and the information generation module is used for receiving the first exposure time signal output by the processing module and generating final target information.
16. A detection method according to claim 15, wherein the processing module is further capable of receiving a second signal control, outputting a second exposure time signal processed by the processing module corresponding to the second exposure time, and the first exposure time signal is reset; and the information generation module receives the second exposure time signal output by the processing module and generates final target information.
17. A detection method according to claim 15, further comprising a judging module, wherein the judging module generates a third signal, the processing module receives the third signal so as to control the operation according to the results of the first exposure time signal and the second exposure time signal, and part of the pixel units of the pixel array use the result signals output by the processing module as their output signals;
the information generation module receives the output signals of all the pixel array pixel units output by the processing module and generates final target information.
18. The detection method according to claim 15, wherein the N groups of exposures include output information corresponding to received control signals containing four different phase information, namely 0°, 90°, 180° and 270°.
19. The detection method according to claim 18, wherein each pixel output information in the pixel array comprises two sub-frames, the two sub-frames include the same number of first exposure times, and the first exposure time includes output information corresponding to the receiving control signals of four different phase information.
20. The detection method according to claim 18, wherein each pixel output information in the pixel array comprises a plurality of sub-frames, two adjacent sub-frames in the plurality of sub-frames each comprise at least one output information corresponding to the reception control signals of two phase information having a phase difference of 180 °, and the reception control signals having a phase difference of 180 ° in the second exposure time comprised in the two adjacent sub-frames can constitute the output signals of the four different phase reception control signals; the processing module can also receive a fourth control signal and output a first exposure time signal and a second exposure time signal of the two adjacent subframes, and the information generating module receives different exposure time signals output by the processing module and generates final target information.
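As a non-limiting illustration of the signal selection recited in claims 1 to 3 (the module and variable names below are assumptions for explanation, not claim elements), the processing module's behaviour can be sketched as selecting which exposure-time signal reaches the information generation module:

```python
from enum import Enum, auto

class Select(Enum):
    FIRST = auto()    # "first signal": output the first (short) exposure-time signal
    SECOND = auto()   # "second signal": output the second (long) exposure-time signal
    COMBINE = auto()  # "third signal": operate on both exposure-time signals

def process(select: Select, first_exp_signal: float, second_exp_signal: float) -> float:
    """Hypothetical sketch of the processing module behaviour described in claims 1-3."""
    if select is Select.FIRST:
        # first exposure-time signal is output, the second is reset (discarded)
        return first_exp_signal
    if select is Select.SECOND:
        # second exposure-time signal is output, the first is reset (discarded)
        return second_exp_signal
    # third signal: an operation on both results; an averaging operation is assumed here
    return 0.5 * (first_exp_signal + second_exp_signal)
```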
CN202010404733.7A 2020-05-14 2020-05-14 Detection unit, detection device and method Pending CN113687366A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010404733.7A CN113687366A (en) 2020-05-14 2020-05-14 Detection unit, detection device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010404733.7A CN113687366A (en) 2020-05-14 2020-05-14 Detection unit, detection device and method

Publications (1)

Publication Number Publication Date
CN113687366A true CN113687366A (en) 2021-11-23

Family

ID=78575265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010404733.7A Pending CN113687366A (en) 2020-05-14 2020-05-14 Detection unit, detection device and method

Country Status (1)

Country Link
CN (1) CN113687366A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101909157A (en) * 2009-06-05 2010-12-08 比亚迪股份有限公司 Method for acquiring highly dynamic images from image sensor, and image sensor
CN105872392A (en) * 2015-01-23 2016-08-17 原相科技股份有限公司 Optical distance measuring system enabling dynamic exposure time
JP2018077071A (en) * 2016-11-08 2018-05-17 株式会社リコー Distance measuring device, monitoring camera, three-dimensional measurement device, moving body, robot, method for setting condition of driving light source, and method for measuring distance
CN113740866A (en) * 2020-05-13 2021-12-03 宁波飞芯电子科技有限公司 Detection unit, detection device and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114428247A (en) * 2022-03-11 2022-05-03 深圳航天科技创新研究院 Single antenna ultra-wideband radar system for imaging applications
CN114428247B (en) * 2022-03-11 2022-09-27 深圳航天科技创新研究院 Single antenna ultra-wideband radar system for imaging applications
CN116915321A (en) * 2023-09-12 2023-10-20 威海威信光纤科技有限公司 Rapid test method and system for optical fiber bus
CN116915321B (en) * 2023-09-12 2023-12-01 威海威信光纤科技有限公司 Rapid test method and system for optical fiber bus

Similar Documents

Publication Publication Date Title
US10686994B2 (en) Imaging device, and solid-state imaging element used for same
JP5593479B2 (en) TOF region advantageous for suppression of background radiation
JP5679549B2 (en) Distance measuring method, distance measuring system and distance sensor
US10545239B2 (en) Distance-measuring imaging device and solid-state imaging device
EP2729826B1 (en) Improvements in or relating to the processing of time-of-flight signals
US9568607B2 (en) Depth sensor and method of operating the same
CN111045029B (en) Fused depth measuring device and measuring method
CN110361751B (en) Time flight depth camera and distance measuring method for reducing noise of single-frequency modulation and demodulation
US20220082698A1 (en) Depth camera and multi-frequency modulation and demodulation-based noise-reduction distance measurement method
CN110456370B (en) Flight time sensing system and distance measuring method thereof
CN113687366A (en) Detection unit, detection device and method
JP5180501B2 (en) Ranging device and ranging method
CN111025315A (en) Depth measurement system and method
CN110389351A (en) TOF range sensor, sensor array and the distance measuring method based on TOF range sensor
EP3835819A1 (en) Optical range calculation apparatus and method of range calculation
CN111123285B (en) Signal receiving system and method based on array type sensor and array type sensor
CN114200466A (en) Distortion determination apparatus and method of determining distortion
CN113740866A (en) Detection unit, detection device and method
US11624834B2 (en) Time of flight sensing system and image sensor used therein
JP2022064276A (en) Image sensor and analog-to-digital converter
EP3913400A1 (en) Distance image measurement device, distance image measurement system, and distance image measurement method
WO2021227202A1 (en) Detection apparatus and method
US20230009987A1 (en) Signal extraction circuit, signal extraction method, and distance measurement method and device
CN112799083A (en) Detection device for improving frame frequency
WO2022113637A1 (en) Signal processing device, distance measurement device, distance measurement method, and image sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination