USRE46930E1 - Distance detection method and system - Google Patents
Distance detection method and system
- Publication number
- USRE46930E1 (application US 14/984,704)
- Authority
- US
- United States
- Prior art keywords
- signal
- time
- pulse
- light source
- integration value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
Definitions
- The invention relates to methods and systems for improving the measurement of the transit time of light reflected by different types of objects, in detection and ranging methods and systems.
- Optical range-finding systems frequently rely on the time-of-flight principle and determine the distance between the apparatus and the object by measuring the time a short pulse of light emitted from the apparatus takes to reach an object and be reflected to a photo-detection circuit.
- Conventional optical rangefinders use a counter started at the emitted pulse and stopped when the receiver circuit detects a pulse echo above a specific threshold. This threshold can be set low to provide sensitivity, but the system will then generate false alarms from transient noise; it can be set high to avoid false alarms, but the system will then fail to detect objects that return a weak reflection. In bad weather conditions, such as rain or snow, several pulse echoes can be generated. Some techniques help to detect a certain number of echoes and may be used to reject some reflections, but they have their limitations.
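The threshold-and-counter scheme just described can be sketched in a few lines. This is an illustrative toy, not the patent's implementation; the trace values, sample period and threshold are invented for the example.

```python
# Illustrative toy of the conventional threshold rangefinder described
# above: a counter/sample index runs from the emitted pulse and stops at
# the first echo sample above a threshold. All values here are invented.

C = 299_792_458.0  # speed of light, m/s

def threshold_range(trace, sample_period_s, threshold):
    """Distance to the first echo above the threshold, or None if missed."""
    for i, v in enumerate(trace):
        if v > threshold:
            return C * (i * sample_period_s) / 2.0  # half the round trip
    return None

# 50 MSPS trace (20 ns/sample) with an echo starting 100 ns after emission.
trace = [0.0] * 5 + [0.8] * 3 + [0.0] * 8
print(threshold_range(trace, 20e-9, threshold=0.5))  # ≈ 15 m
```

Lowering the threshold admits noise spikes as false echoes; raising it above 0.8 here makes this weak return undetectable, which is the trade-off the text describes.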
- Other optical rangefinders use additional methods to be more robust against false alarms.
- One method is based on the use of an analog-to-digital converter (ADC) for the digitization of the waveform of the echoed-back signal. Once digitized, the waveform can be processed by digital signal processing circuits to improve the performance of the system.
- Averaging is an efficient way to improve the signal to noise ratio (SNR).
- Averaging, however, has an impact on response time and may render the system too slow for some applications.
- The resolution of the distance measurement can be enhanced by using a clock pulse delay circuit technique: by dividing the clock pulse signal into N with a delay circuit and rearranging the sample data of each echoed light pulse, this technique improves the resolution by a factor of N. It reduces, however, the number of averages available when the averaging technique is also used to improve the SNR.
- Digital correlation is another digital processing technique for increasing the resolution of the range measurement.
- the distance to the object can be estimated by using the peak value of the result of the correlation function.
- The present system improves the detection of the presence of objects and the measurement of their distance, while optimizing the performance (resolution, repetition rate, etc.) by adapting range-dependent processing as a function of the needs of different applications.
- the present system can be adapted for use with a lighting system for lighting purposes as well as for the detection and ranging purposes.
- the present system also improves the detection of rain, snow, fog, smoke and can provide information about current weather conditions.
- a method for detecting a distance to an object comprises providing a lighting system having at least one pulse width modulated visible-light source for illumination of a field of view; emitting an illumination signal for illuminating the field of view for a duration of time y using the visible-light source at a time t; integrating a reflection energy for a first time period from a time t−x to a time t+x; determining a first integration value for the first time period; integrating the reflection energy for a second time period from a time t+y−x to a time t+y+x; determining a second integration value for the second time period; calculating a difference value between the first integration value and the second integration value; determining a propagation delay value proportional to the difference value; determining the distance to the object from the propagation delay value.
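A numerical sketch of the claimed two-window integration method, under an assumed ideal rectangular echo; function names, the echo model and all parameter values are illustrative assumptions, not the patent's. The window around the rising edge loses exactly the echo energy that the window around the falling edge gains, so their difference is proportional to the round-trip delay.

```python
# Two-window integration: integrate the echo over [t-x, t+x] and over
# [t+y-x, t+y+x]; the difference of the integrals is proportional to the
# round-trip propagation delay. Assumed rectangular echo for illustration.

C = 299_792_458.0  # speed of light, m/s

def echo(u, t, y, delay, amplitude):
    """Rectangular echo: the pulse emitted over [t, t+y], shifted by delay."""
    return amplitude if t + delay <= u < t + y + delay else 0.0

def integrate(f, start, stop, steps=50_000):
    """Midpoint Riemann-sum integration of f over [start, stop]."""
    du = (stop - start) / steps
    return sum(f(start + (i + 0.5) * du) for i in range(steps)) * du

def measure_distance(t, y, x, delay, amplitude=1.0):
    sig = lambda u: echo(u, t, y, delay, amplitude)
    i1 = integrate(sig, t - x, t + x)          # first integration value
    i2 = integrate(sig, t + y - x, t + y + x)  # second integration value
    d = (i2 - i1) / (2.0 * amplitude)          # round-trip propagation delay
    return C * d / 2.0                         # one-way distance

# Target at 10 m -> round-trip delay of ~66.7 ns, recovered from the
# difference of the two window integrals.
print(measure_distance(t=0.0, y=1e-6, x=200e-9, delay=2 * 10.0 / C))  # ≈ 10.0
```

The first window overlaps the delayed echo for x−d seconds and the second for x+d seconds, so their difference is 2d wherever the edges fall, provided the delay d stays below the half-window x.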
- FIG. 1 is a block diagram of an embodiment of the lighting system
- FIG. 2 shows an example of a reflected signal with accumulation and phase shift techniques wherein FIG. 2a is a trace obtained with no accumulation and no phase shift, FIG. 2b has accumulation and phase shift improvements and FIG. 2c has a greater number of accumulations and phase shifts;
- FIG. 3 is a table of example setup parameters for the segmentation
- FIG. 4 shows an example of a reflected signal with adjusted parameters as a function of the distance
- FIG. 5 is a flow chart of an embodiment of the segmentation process
- FIG. 6 shows an example of the accumulation and phase shift technique for a 10 m range finder using the one-sample-per-optical-pulse technique
- FIG. 7 is a table of example setup configurations for the accumulation and phase shift technique using the one-sample-per-optical-pulse technique
- FIG. 8 is a block diagram of a lidar module using an embedded processor
- FIG. 9 shows a noisy signal fitted and filtered
- FIG. 10 presents a Gaussian pulse with a zero-crossing point of the first derivative
- FIG. 11 shows a typical PWM pattern with slope adjustment
- FIG. 12 shows a rising edge signal from a source and reflected signals
- FIG. 13 shows a 10% to 90% rising edge of an echo back noisy signal with linear regression
- FIG. 14 is a flow chart of an embodiment of the PWM edge technique for detection and ranging.
- FIG. 15 shows a rising edge with overshoot stabilizing after one cycle of the resonance frequency
- FIG. 16 shows a timing diagram of the method using an integration signal from the reflected signal and synchronized with rising edge and falling edge of the PWM lighting source
- FIG. 17 is a flow chart of the main steps of a method for acquiring a detected light optical signal and generating an accumulated digital trace
- FIG. 18 is a flow chart of the main steps of a method for detecting a distance to an object.
- FIG. 1 is a block diagram illustrating an embodiment of a lighting system equipped with the present system.
- the lighting system 100 has a visible-light source 112 .
- The visible-light source 112 has, as a first purpose, the emission of visible light for illumination or visual communication of information, like signaling, for human vision.
- This primary light emission is controlled according to specific criteria, such as optical power, field of view and light color, to meet requirements defined in a number of regulations.
- the visible-light source 112 has one or more solid-state lighting devices, LEDs or OLEDs for instance.
- the visible-light source 112 is connected to a source controller 114 , so as to be driven into producing visible light.
- the system 100 performs detection of objects and particles (vehicles, passengers, pedestrians, airborne particles, gases and liquids) when these objects are part of the environment/scene illuminated by the light source 112 .
- the source controller 114 drives the visible-light source 112 in a predetermined mode, such that the emitted light takes the form of a light signal, for instance by way of amplitude-modulated or pulsed light emission.
- These light signals are such that they can be used to provide the lighting illumination level required by the application, through data/signal processor 118 and source controller 114 , while producing a detectable signal. Accordingly, it is possible to obtain a light level equivalent to a continuous light source by modulating the light signal fast enough (e.g., frequency more than 100 Hz) to be generally imperceptible to the human eye and having an average light power equivalent to a continuous light source.
- the source controller 114 is designed to provide an illumination drive signal, such as a constant DC signal or a pulse-width modulated (PWM) signal, that is normally used in lighting systems to produce the required illumination and control its intensity.
- the illumination drive signal is produced by the illumination driver sub-module 114 A of the controller 114 .
- a modulated/pulsed driving signal supplies the fast modulation/pulse sequence required for remote object detection.
- This modulated/pulsed drive signal is produced by a modulation driver sub-module 114 B of the controller 114 .
- The amplitude of a short pulse (typically <50 ns) can be several times the nominal value while the duty cycle remains low (typically <0.1%).
- the modulator driver 114 B can also be used to send data for optical communication. Both driving signals can be produced independently or in combination. Sequencing of the drive signals is controlled by the data/signal processor 118 .
- the light source 112 can be monitored by the optical detector 116 and the resulting parameters sent to the data/signal processor 118 for optimization of data processing.
- The auxiliary light source (ALS) 122 can be a visible or non-visible source (e.g., UV or IR light, LEDs or laser) using the modulation driver 114B.
- the auxiliary light source 122 provides additional capabilities for detecting objects and particles.
- A UV light source (particularly around 250 nm) can be used with a UV detector to limit the impact of sunlight.
- IR light can be used to increase the performance and the range of the detection area. IR lights and other types of light can be used to detect several types of particles by selecting specific wavelengths.
- the auxiliary light source 122 can also be useful during the installation of the system by using it as a pointer and distance meter reference. It can also be used to determine the condition of the lens.
- the visible-light source 112 is preferably made up of LEDs. More specifically, LEDs are well suited to be used in the lighting system 100 since LED intensity can be efficiently modulated/pulsed at suitable speed. Using this feature, current lighting systems already installed and featuring LEDs for standard lighting applications can be used as the light source 112 for detection applications, such as presence detection for energy savings, distance and speed measurements, fog, rain, snow or smoke detection and spectroscopic measurements for gas emission or smog detection.
- the system 100 has at least one lens 130 through which light is emitted in an appropriate way for specific applications.
- At least one input lens section 130 a of at least one lens 130 is used for receiving the light signal, for instance reflected or diffused (i.e., backscattered) by the objects/particles 134 .
- This input lens section 130 a can be at a single location or distributed (multiple zone elements) over the lens 130 and have at least one field of view.
- Several types of lens 130 can be used, such as Fresnel lenses for example.
- a sub-section of the lens 130 can be used for infrared wavelength.
- a sub-section of the lens 130 can be used for optical data reception.
- a detector 116 is associated with the visible-light source 112 and/or auxiliary light source 122 and the lens 130 .
- the detector module 116 is an optical detector (or detectors) provided so as to collect light emitted by the light source 112 /ALS 122 and back-scattered (reflected) by the objects/particles 134 .
- Detector module 116 can also monitor the visible-light source 112 or auxiliary light source 122 .
- the light signal can also come from an object 134 being the direct source of this light (such as a remote control) in order to send information to the data/signal processor through the optical detector module 116 .
- the optical detector module 116 is, for example, composed of photodiodes, avalanche photodiodes (APD), photomultipliers (PMT), complementary metal-oxide semiconductor (CMOS) or charge-coupled device (CCD) array sensors.
- Filters are typically provided with the detector module 116 to control background ambient light emitted from sources other than the lighting system 100 . Filters can also be used for spectroscopic measurements and to enhance performance of the light source 112 .
- a front-end and analog-to-digital converter (ADC) 124 is connected to detector 116 and receives detected light data therefrom and controls the detector 116 .
- adjusting the Vbias of an APD detector can be one of the detector controls to optimize the gain of the receiver section for an Automatic Gain Control (AGC).
- Analog filters can be used for discriminating specific frequencies or to measure the DC level.
- a detection and ranging digital processing unit 126 is connected to the front-end 124 , and controls parameters such as gain of amplifier, synchronization and sample rate of the ADC.
- the detection and ranging digital processing unit 126 receives data from ADC and pre-processes the data.
- the data/signal processor 118 is connected to the detection and ranging processing module 126 and receives pre-processed data.
- the data/signal processor 118 is also connected to the source controller 114 , so as to receive driving data therefrom.
- the data/signal processor 118 has a processing unit (e.g., CPU) so as to interpret the pre-processed data from the detection module 126 , in comparison with the driving data of the source controller 114 , which provides information about the predetermined mode of emission of the light signals emitted by the visible-light source 112 .
- Information about the object (e.g., presence, distance, speed of displacement, composition, dimension, etc.) is calculated by the data/signal processor 118 as a function of the relationship (e.g., phase difference, relative intensity, spectral content, time of flight, etc.) between the driving data and the detected light data, the latter being optionally pre-processed by the front-end and ADC 124 and the detection and ranging processing unit 126.
- a database 120 may be provided in association with the data/signal processor 118 so as to provide historical data or tabulated data to accelerate the calculation of the object parameters.
- the data/signal processor 118 controls the source controller 114 and thus the light output of the visible-light source 112 .
- the visible-light source 112 may be required to increase or reduce its intensity, or change the parameters of its output. For example, changes in its output power can adapt the lighting level required in daytime conditions versus nighttime conditions or in bad visibility conditions such as fog, snow or rain.
- the system 100 can be provided with sensors 132 connected to the data/signal processor 118 .
- Sensors 132 can be an inclinometer, accelerometer, temperature sensor, day/night sensors, etc. Sensors 132 can be useful during the installation of the system and during operation of the system. For example, data from an inclinometer and accelerometer can be used to compensate for the impact on the field of view of an effect of the wind or any kind of vibration. Temperature sensors are useful to provide information about weather (internal, external or remote temperature with FIR lens).
- Information from sensors 132 and data/signal processor 118 and light from light source 112 and auxiliary light source 122 can be used during installation, in particular for adjusting the field of view of the optical receiver.
- The auxiliary light source 122 can be used as a pointer and distance meter.
- the system 100 has a power supply and interface 128 .
- The interface section is connected to the data/signal processor 118 and communicates with an external traffic management system (via wireless, power line, Ethernet, CAN bus, USB, etc.).
- At short range, the reflected signal is strong but the need for good resolution and a fast data refresh rate is usually high.
- At long range, the reflected signal is weak and noisy but the requirements on resolution and refresh rate are less demanding.
- Phase-shifting control techniques can improve accuracy using a digital acquisition system with a low sample rate, such as a relatively low-cost ADC (e.g., 50 MSPS).
- The detection and ranging digital processing unit 126 and the data/signal processor 118 allow control of the number of shift delays per period, the number of accumulations and the refresh rate, for each sampled data point or for several segments. For shorter distances, where the echoed-back signal is relatively strong, the number of shift delays and the refresh rate can be higher to improve the resolution and the response time. The number of accumulations (or other time-integration techniques) can then be lower yet sufficient at short distances (a trade-off between signal-to-noise ratio, resolution and number of results per second).
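Range-dependent processing of this kind can be pictured as a small lookup of parameters by distance segment. The segment boundaries and parameter values below are invented for this sketch; FIG. 3 gives the patent's own example table.

```python
# Hypothetical illustration of range-dependent processing: shift delays,
# accumulations and refresh rate are selected per distance segment.
# All boundaries and values are invented for the example.

SEGMENTS = [
    # (max range in m, shift delays P, accumulations M, refresh rate in Hz)
    (10.0, 256, 64, 100.0),   # short range: favour resolution and speed
    (30.0, 64, 256, 20.0),    # mid range: balanced trade-off
    (100.0, 8, 1024, 1.0),    # long range: favour signal-to-noise ratio
]

def parameters_for(distance_m):
    """Return the processing parameters for the segment covering distance_m."""
    for max_range, p, m, refresh in SEGMENTS:
        if distance_m <= max_range:
            return {"shift_delays": p, "accumulations": m, "refresh_hz": refresh}
    raise ValueError("distance outside the configured segments")

print(parameters_for(12.0))  # falls in the 10-30 m segment
```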
- the accumulation technique improves the signal-to-noise ratio of the detected light signal using multiple measurements.
- The technique uses M light pulses; for each light pulse, the signal detected by the optical detector is sampled by the ADC with a time resolution of 1/F second, thereby generating M lidar traces of j points (S1 to Sj) each. The points of the M lidar traces are added point per point to generate one accumulated digital lidar trace of j points.
- the phase shift technique is used to improve the time resolution of the trace acquired by the ADC and limited by its sample rate F Hz.
- the phase shift technique allows for the use of a low cost ADC having a low sample rate F by virtually increasing the effective sample rate.
- the effective sample rate is increased by a factor P by acquiring P sets corresponding to P light pulses while shifting the phase between the emitted light pulse and the ADC sampling rate.
- the phase shifting between each acquisition corresponds to 2 ⁇ /P.
- The P sets obtained are then combined into a single trace by interleaving them such that the resulting trace is equivalent to a single measurement with a temporal resolution of 1/(F·P) second.
- the detection and ranging digital processing unit 126 and the Data/signal Processor 118 creates one combined trace of the reflected light pulse.
- The length of the buffer is at least j×P elements, and the number of bits of each element is a function of the resolution of the ADC (number of bits, B) and the number of accumulations M: each element of the buffer should have at least B + log₂ M bits.
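A sketch of the accumulation and phase-shift interleaving described above, with an invented rectangular pulse and Gaussian noise (an illustration, not the patent's implementation): M noisy traces are summed point per point, and P sets acquired with sub-sample phase shifts are interleaved into one trace with P times the temporal resolution.

```python
import random

# Accumulation: add M traces point per point (signal grows as M, noise as
# sqrt(M)). Phase shift: acquire P sets offset by 1/P of a sample each and
# interleave them into one trace of j*P points. Pulse shape and noise
# level are invented for the example.

random.seed(0)

def acquire(pulse, j, phase, noise=0.5):
    """One ADC trace of j samples, offset by a sub-sample phase shift."""
    return [pulse(k + phase) + random.gauss(0.0, noise) for k in range(j)]

def accumulate(pulse, j, m, phase=0.0):
    """Add M traces point per point to one accumulated trace."""
    acc = [0.0] * j
    for _ in range(m):
        for k, s in enumerate(acquire(pulse, j, phase)):
            acc[k] += s
    return acc

def interleave(pulse, j, m, p):
    """Interleave P accumulated sets, each shifted by 1/P of a sample,
    giving an effective temporal resolution of 1/(F*P)."""
    sets = [accumulate(pulse, j, m, phase=q / p) for q in range(p)]
    return [sets[q][k] / m for k in range(j) for q in range(p)]

# Invented rectangular echo occupying ADC samples 10 to 13.
pulse = lambda t: 1.0 if 10 <= t < 14 else 0.0
trace = interleave(pulse, j=32, m=64, p=8)
print(len(trace))  # 256 points from a 32-sample-per-trace ADC
```

In a hardware buffer, each of the j×P accumulator elements would need at least B + log₂ M bits, per the buffer sizing above.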
- FIGS. 2a, 2b and 2c Example results of the accumulation and phase shift techniques are shown in FIGS. 2a, 2b and 2c .
- A target is approximately at a distance of 12 meters and the system uses an ADC at 50 MSPS.
- FIG. 2a shows a trace obtained with no accumulation and no phase shift. The signal is noisy with a lack of resolution and it is very difficult to identify the target.
- FIG. 2b shows an improvement in terms of signal to noise ratio by accumulating 64 sets with 8 shift delays.
- FIG. 2c shows how an accumulation of 1024 sets with 256 shift delays can improve the signal-to-noise ratio and resolution.
- Accumulation and shift control can be done by programmable logic, for example a Field Programmable Gate Array (FPGA).
- Phase shifting can be controlled by delaying the clock of the ADC by a fraction of a period or by delaying the driver of the optical source.
- FIG. 3 shows one example of setup configurations for this method using different parameters as a function of the distance. For different distances (for instance, over a range from 1 m to 100 m), one can optimize the temporal resolution, the number of accumulations and the refresh rate, and make tradeoffs in terms of sensitivity, accuracy and speed as a function of the distance to a target.
- FIG. 4 shows a reflected signal with a first echo from an object closer to the system and a second echo from another object further from the source.
- the amplitude of the first echo is higher and the system optimizes the temporal resolution.
- the amplitude of the second echo back pulse from the farther object is lower and the system optimizes the SNR by using more accumulation instead of optimizing the resolution.
- each parameter can be adaptive as a function of the echo back signal.
- The system can optimize the process by adjusting parameters as a function of the priority (resolution, refresh rate, SNR). For example, if the noise is lower than expected, the system can reduce the number of accumulations and increase the number of shift delays to improve the resolution.
- FIG. 5 shows a flow chart of a typical process for this method.
- Configuration 500 sets several parameters before the beginning of the process.
- Acquisition 502 starts the process by the synchronization of the emission of the optical pulses and the acquisition of samples by the ADC.
- Digital filtering and processing of the data 504 condition the signal for the extraction and storage in memory of a lidar trace 506.
- Detection and estimation of the distance 508 is made, typically using a reference signal and measuring the lapse of time between the emission and the reception of the signal.
- The results 510 (the detection and the estimated distance) are then transmitted to an external system.
- Noise analysis 512 is performed and an adjustment of the parameters as a function of the level of the noise 514 can be made to optimize the process.
- the ADC has to acquire samples at the frequency of the optical pulse emission.
- The ADC converts L samples per second with P shift delays of D ns each.
- FIG. 6 shows an example of that technique for a ten meter range finder.
- The ADC works at the same frequency as the optical pulse driver (e.g., 100 kHz). For each of the first twenty optical pulses, the system adds a shift delay of 5 ns between the optical pulse driver and the ADC. After 20 pulses, the system samples the reflected signal 95 ns after the pulse was emitted, just enough to detect the end of the reflected signal from a target at 10 meters. In this example, with the system working at 100 kHz, a complete 10-meter lidar trace is recorded after 200 µs. To improve the signal-to-noise ratio, one can accumulate up to 5000 traces and still obtain one complete lidar trace per second.
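The one-sample-per-optical-pulse acquisition of this example can be sketched as equivalent-time sampling: the ADC runs at the pulse repetition rate, so each pulse yields one sample, taken D ns later than for the previous pulse; after P pulses a trace spanning P·D ns is rebuilt. The echo model and target distance below are stand-ins.

```python
# One sample per optical pulse: the sampling instant is shifted by d_ns
# after every pulse, so p pulses rebuild a trace spanning p*d_ns. The
# rectangular echo model is an assumption for illustration.

def equivalent_time_trace(echo, p=20, d_ns=5.0):
    """One sample per pulse, each delayed d_ns more than the last."""
    return [echo(k * d_ns) for k in range(p)]

c_ns = 0.299792458              # speed of light in metres per nanosecond
target_m, width_ns = 10.0, 20.0
delay_ns = 2 * target_m / c_ns  # ~66.7 ns round trip for a 10 m target
echo = lambda t: 1.0 if delay_ns <= t < delay_ns + width_ns else 0.0

trace = equivalent_time_trace(echo)
print(trace)  # the 1.0 samples mark the echo, ~66.7 ns to ~86.7 ns
```

At 100 kHz one full 20-sample trace takes 200 µs, so up to 5000 traces can be accumulated per second, matching the text.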
- FIG. 7 is a table showing setup configurations for this method. For maximum ranges of 10 meters and 30 meters, the table shows tradeoffs between accuracy (temporal resolution), sensitivity (improvement of the signal-to-noise ratio by accumulation) and speed (refresh rate).
- FIG. 8 shows a block diagram of a lidar module 800 using an embedded processor optimizing the cost of the range finder system.
- the embedded processor 801 controls the timing for the driver 802 sourcing the light source 803 .
- a light signal is emitted in a direction determined by the optical lens 804 .
- a reflection signal from objects/particles 834 is received on the optical lens 804 and collected by the optical detector and amplifier 805 .
- The embedded processor 801 uses an embedded ADC to acquire the lidar trace, processes the data and sends the information to an external system 840.
- The system 800 can use several sources driven sequentially, with one sensor or several sensors. The acquisition frequency is then the frequency of one optical source multiplied by the number of optical sources.
- Moving-average techniques keep the last N samples available at all times to compute an average.
- Using a FIFO and, for each new sample, adding it to the running sum while subtracting the oldest accumulated sample is an example implementation of this technique.
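The FIFO moving average described above can be sketched as a constant-time update (an illustrative implementation, not the patent's):

```python
from collections import deque

# Keep the last N samples; add each new sample to a running sum and
# subtract the oldest once the window is full, so every update is O(1).

class MovingAverage:
    def __init__(self, n):
        self.n, self.buf, self.total = n, deque(), 0.0

    def update(self, sample):
        self.buf.append(sample)
        self.total += sample
        if len(self.buf) > self.n:
            self.total -= self.buf.popleft()  # subtract the first data accumulated
        return self.total / len(self.buf)

avg = MovingAverage(4)
print([avg.update(s) for s in [1, 1, 1, 1, 5, 5, 5, 5]])
# → [1.0, 1.0, 1.0, 1.0, 2.0, 3.0, 4.0, 5.0]
```

After four new samples the window holds only the new level, which is why, as the next bullet notes, a moving object sweeps through the average over N updates.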
- Averaging techniques can treat the signal from moving objects as noise and fail to discriminate it. Frequency-domain analysis can be useful in this kind of situation. The wavelet transform is very efficient for signal analysis in the time/frequency domain and is sensitive to transient signals. By separating the echoed-back signal into several segments and analyzing the spectral frequency, the system can detect the frequency of the source pulses in a specific segment. Averaging parameters can then be adjusted as a function of events detected by the spectral analysis process. For instance, the number of averages should be reduced when moving objects are detected sequentially in different segments.
- Low-pass filters can be used as a pre-processing step on each trace before averaging. Filters may be particularly efficient when more than one sample is available on an echo pulse. Information from noise analysis and from the waveform emitted by the source can also help to discriminate a signal and to adjust the parameters. Specific processing functions can be applied to each point of the trace or to each segment.
- The reference signal can be a pattern signal stored in memory or a reference reflection signal of an optical pulse detected by a reference optical detector.
- This reference optical detector acquires a reference zero value, and the reference signal is compared to the lidar trace. Detection and distance are based on the comparison between the two signals. The fit can be made by convolution.
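The comparison by convolution can be sketched as a sliding correlation of the reference pulse against the trace; the lag that maximizes the correlation gives the round-trip delay (a minimal Python sketch under the assumption of uniform sampling; the function name is illustrative):

```python
C = 3e8  # speed of light, m/s

def distance_by_correlation(trace, reference, sample_period_s):
    """Slide the reference pulse along the lidar trace and find the lag
    that maximizes the correlation (a discrete convolution-style fit).
    The best lag times the sample period is the round-trip delay."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(trace) - len(reference) + 1):
        score = sum(trace[lag + i] * reference[i]
                    for i in range(len(reference)))
        if score > best_score:
            best_lag, best_score = lag, score
    delay = best_lag * sample_period_s
    return C * delay / 2.0  # divide by 2 for the round trip
```

For example, a reference pulse found 10 samples into the trace at a 1 ns sample period corresponds to a 10 ns round trip, i.e. 1.5 m.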
- FIG. 9 shows a noisy signal that has been filtered and fitted to diminish the effects of the noise, illustrating the effect of signal filtering and curve fitting.
- The raw data curve is the noisy signal as received from the sensor.
- The filter curve is the raw data curve after filtering by correlation with an ideal (noise-free) pulse, which removes the high-frequency noise.
- The fit curve presents the optimal fit of an ideal pulse to the filtered signal. Fitting can improve distance stability, especially when the signal is weak and still too noisy even after filtering.
- FIG. 10 shows an example of a Gaussian pulse with data selected above a predefined threshold and the result of the derivative calculation on those selected data. The zero crossing in the derivative plot marks the peak of the pulse.
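The threshold-plus-derivative peak search can be sketched as follows (a minimal Python sketch; the function name and the linear interpolation of the zero crossing are illustrative choices, not mandated by the description):

```python
def pulse_peak_index(samples, threshold):
    """Select samples above a threshold, differentiate them, and locate
    the zero crossing of the derivative, which marks the pulse peak."""
    for i in range(len(samples) - 2):
        if samples[i] <= threshold:
            continue  # only samples above the threshold are considered
        d0 = samples[i + 1] - samples[i]      # derivative before the peak
        d1 = samples[i + 2] - samples[i + 1]  # derivative after the peak
        if d0 >= 0 and d1 < 0:
            # linear interpolation between the two derivative samples
            # gives sub-sample resolution on the zero crossing
            return (i + 0.5) + d0 / (d0 - d1)
    return None  # no peak found above the threshold
```

Applied to a sampled Gaussian pulse centered on index 10, the interpolated zero crossing lands on 10.0 even though no sample falls exactly on the peak.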
- Illumination Driver as a Source for Rangefinder with Edge Detection
- Switch-mode LED drivers are very useful notably for their efficiency compared to linear drivers.
- PWM drivers permit a very high dimming ratio without the wavelength drift usually produced by linear drivers. Their performance is particularly well suited to high-power LEDs.
- Switch-mode LED drivers are noisier, however, and their EMI can be an issue for some applications.
- One way to address this issue is to use a gate rising/falling slope adjustment circuit to slow the transitions. Slower transitions mean less EMI.
- FIG. 11 presents a typical PWM signal with slope adjustment.
- The PWM LED light source has a relatively constant slope during its rising/falling edge to reduce EMI (a rising/falling edge of 100 ns, for example).
- The optical signal from the source is sampled to determine the starting time of the pulse (T0).
- An electrical synchronization signal can also be used to indicate the starting point.
- The reflected signal is sampled with enough temporal resolution to obtain several points during the slope of the signal when an object in the field of view returns a perceptible echo.
- FIG. 12 shows an example of a rising edge from a source, an echo-back signal from an object 4.5 meters away from the source (~30 ns later) and another from an object at 7 meters from the source (~45 ns later). By calculating the slope through linear regression or other means, an evaluation of the origin of the signal is made and the elapsed time between the source signal and an echo-back signal can be determined. Based on that result, one can estimate the presence and the distance of the object reflecting the signal.
- FIG. 13 represents a 10% to 90% rising edge of a noisy echo-back signal from an object at 4.5 meters from the source.
- Using linear regression, one can calculate the intercept point and get a good estimate of the delay between the two signals. Samples close to the end of the slope have a better SNR.
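The intercept calculation can be sketched with an ordinary least-squares fit of the edge samples, extrapolated back to the baseline (a minimal Python sketch; function names are illustrative, and the baseline is assumed to be zero):

```python
C = 3e8  # speed of light, m/s

def edge_start_time(times, values):
    """Fit a line to samples taken on a rising edge (least squares) and
    return the time at which the fitted line crosses zero, i.e. the
    extrapolated origin of the edge."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
             / sum((t - mt) ** 2 for t in times))
    intercept = mv - slope * mt  # fitted value at t = 0
    return -intercept / slope    # time where the line crosses zero

def distance_from_edges(t0_source, times, values):
    """Distance from the delay between the source edge at t0_source and
    the extrapolated start of the echo edge (round trip divided by 2)."""
    delay = edge_start_time(times, values) - t0_source
    return C * delay / 2.0
```

For an edge whose samples extrapolate back to 30 ns after the source edge, the estimated distance is 4.5 m, matching the FIG. 12 example.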
- A threshold can be set to discriminate the presence or absence of an object.
- Averaging and filtering techniques can be used to diminish the level of noise, and shifting techniques can also be used to obtain more points on the slope. As shown in FIG. 9, this method can give good results even with a noisy signal.
- FIG. 14 shows a flow chart of the typical process for this method.
- The echo-back signal is filtered 1400, typically using a band-pass filter based on the frequency of the transition. Rising and falling edges are detected 1402 and samples are taken along the slope 1404 to memorize a digital waveform of the slope.
- The linear regression is then calculated 1406, which permits computing the intercept point 1408. Based on that information, calculating the time difference between the emitted and received signals 1410 allows the distance to the object to be estimated 1412.
- This method can be improved by using demodulation and spectral analysis techniques.
- The base frequency of the PWM can be demodulated, and the result of this demodulation gives an indication of the presence of an object.
- By selecting a frequency based on a harmonic coming from the slopes of the PWM signal, one can estimate the position of the object by spectral analysis of different segments. Knowing the approximate position, the acquisition of samples can then be adjusted to target the rising and falling edges.
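The per-segment spectral check can be sketched with the Goertzel algorithm, which measures the power of one frequency bin without a full FFT (a minimal Python sketch; the single-harmonic choice and function names are illustrative assumptions):

```python
import math

def tone_power(segment, freq_hz, sample_rate_hz):
    """Goertzel algorithm: power of a single frequency bin in a segment,
    cheaper than a full FFT when only one PWM harmonic is of interest."""
    k = round(len(segment) * freq_hz / sample_rate_hz)  # nearest DFT bin
    w = 2.0 * math.pi * k / len(segment)
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in segment:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def segment_with_echo(trace, segment_len, freq_hz, sample_rate_hz):
    """Split the trace into segments and return the index of the segment
    where the selected harmonic is strongest, i.e. the approximate
    position of the echo."""
    powers = [tone_power(trace[i:i + segment_len], freq_hz, sample_rate_hz)
              for i in range(0, len(trace) - segment_len + 1, segment_len)]
    return max(range(len(powers)), key=powers.__getitem__)
```

A trace whose middle segment contains a tone at the selected harmonic yields that segment's index, which can then steer the fine edge-sampling step.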
- By using the edge detection technique, one can use a standard LED driver both for lighting and for detection and ranging.
- The frequency of the PWM might range from a few kHz up to 1 MHz. High-frequency modulation can improve the SNR, notably through averaging techniques.
- This method permits using a PWM source for an LED lighting system that is completely electrically isolated from the receiver, so EMI is not an issue.
- The electronic driver can generate a fast rising and/or falling edge with some overshoot at a resonance frequency. This adds more power at a specific frequency and increases the signal that can be detected by the receiver.
- FIG. 15 shows a rising edge with overshoot stabilizing after one cycle of the resonance frequency.
- Objects of different shapes reflect modified waveforms of the original signal.
- The echo-back signal from a wall differs from the echo-back signal from an object with an irregular shape. Reflections from two objects separated by a short longitudinal distance also generate a distinct waveform.
- This data can be used to improve digital processing performance.
- Digital correlation can be performed to detect a predetermined pattern.
- Averaging techniques do not perform very well with moving objects. By tracking a moving object, one can anticipate its position and adapt to the situation. Averaging with a shift proportional to the estimated position is one way to improve the SNR even for moving objects. Tracking edges is another way to adjust the acquisition of the waveform with more points in the region of interest. Spectral analysis can also be used to lock onto and track an object.
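The shift-proportional averaging can be sketched as follows: each trace is shifted by the object's estimated motion (in samples) before being summed, so echoes line up instead of smearing (a minimal Python sketch; the per-trace shift estimates are assumed to come from a separate tracking step):

```python
def shifted_average(traces, shifts):
    """Average traces after shifting each one by the estimated motion of
    a tracked object (in samples), so echoes from a moving object line
    up instead of being smeared out by plain averaging."""
    length = len(traces[0])
    out = [0.0] * length
    for trace, shift in zip(traces, shifts):
        for i in range(length):
            j = i + shift  # sample in this trace corresponding to bin i
            if 0 <= j < length:
                out[i] += trace[j]
    return [v / len(traces) for v in out]
```

Three traces whose echo drifts from index 5 to 6 to 7, averaged with shifts 0, 1, 2, keep the full echo amplitude at index 5 instead of spreading it over three bins.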
- The system can be used as a road weather information system (RWIS) and thus provide information about temperature, visibility (fog, snow, rain, dust), road condition (ice) and pollution (smog). Pattern recognition based on low-frequency signals and spikes can be implemented to do so. Recognizing bad-weather patterns helps to discriminate noise from objects.
- The system can be used to adjust the light intensity depending on weather conditions. Monitoring the condition of the lens is also possible (dirt, accumulation of snow, etc.). This monitoring can be done by measuring the reflection on the lens of the source or of an auxiliary source.
- FIG. 16 shows a timing diagram of the method using an integration signal from the reflected signal and synchronized with the rising edge and the falling edge of the PWM lighting source.
- FIG. 16 shows a PWM signal (PWM curve 1601 ) with an adjustable duty cycle to control the intensity of light for illumination purposes.
- The sensor starts the integration (sensor integration curve 1603) of the reflected signal.
- The sensor then stops the integration.
- The same process is performed at the falling edge of the PWM.
- The light pulse from the source is delayed (delay curve 1602) proportionally to the travelled distance.
- The delta curve 1604 shows that the integration P1 for the rising edge is smaller than the integration P2 for the falling edge because of the travel delay of the light signal.
- With no propagation delay, the integration value from the rising edge will be approximately equal to the integration value from the falling edge.
- With a delayed reflection, the integration value of the rising edge will be less than the integration value of the falling edge.
- The difference between the values is proportional to the distance.
- The same technique can be used by switching the synchronization between the optical source signal and the sensor integration window.
- Values from the signal integration are memorized.
- Each “pixel” is memorized.
- Several integrations can be performed and an averaging process applied to improve the signal-to-noise ratio.
- The combined trace can be compared 1720 to a detected reference reflection signal of the pulse to determine 1722 a distance traveled by the pulse.
- A timer can be triggered to calculate the time elapsed 1724 between the emission of the pulse and the detection of the reflection signal, determining the distance traveled 1722 by the pulse based on the elapsed time.
- The method comprises: providing a lighting system 1800 having at least one pulse-width-modulated visible-light source for illumination of a field of view; emitting an illumination signal 1802 for illuminating the field of view for a duration of time y using the visible-light source at a time t; integrating a reflection energy for a first time period from a time t−x to a time t+x 1808; determining a first integration value for the first time period 1810; integrating the reflection energy for a second time period from a time t+y−x to a time t+y+x 1812; determining a second integration value for the second time period 1814; calculating a difference value between the first integration value and the second integration value 1816; determining a propagation delay value proportional to the difference value 1818; and determining the distance to the object from the propagation delay value 1820.
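The final steps can be sketched using the formula Distance = c × (INT/4) × (P2 − P1)/(P2 + P1) given in the description, with the optional background level B subtracted from both integration values (a minimal Python sketch; the function name and argument order are illustrative):

```python
C = 3e8  # speed of light, m/s

def distance_from_integrations(p1, p2, int_time_s, background=0.0):
    """Distance from the two integration values of the method above:
    P1 is integrated around the rising edge, P2 around the falling edge,
    and INT is the integration time. Implements the description's formula
    Distance = c * (INT/4) * (P2 - P1) / (P2 + P1); subtracting a
    background level B from both values reproduces the variant
    Distance = c * (INT/4) * ((P2 - B) - (P1 - B)) / (P2 + P1 - 2B)."""
    p1 -= background
    p2 -= background
    return C * (int_time_s / 4.0) * (p2 - p1) / (p2 + p1)
```

For example, with a 100 ns integration time and values P1 = 40 and P2 = 60 (arbitrary units), the difference term is 0.2 and the distance evaluates to 1.5 m; adding the same background offset to both values leaves the result unchanged.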
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
Distance = c × (INT/4) × (P2 − P1)/(P2 + P1),
where c represents the velocity of light, INT represents the integration time, P1 represents the integration value synchronized with the rising edge of the optical pulse and P2 represents the integration value synchronized with the falling edge of the optical pulse.
Distance = c × (INT/4) × ((P2 − B) − (P1 − B))/(P2 + P1 − 2B),
where B is the integration value of the optical background level when the optical source of the system is off.
Distance = c × (INT/4) × (P1 − P2)/(P2 + P1),
where c represents the velocity of light, INT represents the integration time, P1 represents the integration value when the optical pulse is synchronized with the rising edge of the integration and P2 represents the integration value when the optical pulse is synchronized with the falling edge of the integration.
Distance = c × (INT/4) × ((P1 − B) − (P2 − B))/(P2 + P1 − 2B).
Claims (3)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/984,704 USRE46930E1 (en) | 2007-12-21 | 2015-12-30 | Distance detection method and system |
US16/011,820 USRE49342E1 (en) | 2007-12-21 | 2018-06-19 | Distance detection method and system |
US17/984,975 USRE49950E1 (en) | 2007-12-21 | 2022-11-10 | Distance detection method and system |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US1573807P | 2007-12-21 | 2007-12-21 | |
PCT/CA2008/002268 WO2009079789A1 (en) | 2007-12-21 | 2008-12-19 | Detection and ranging methods and systems |
US12/809,235 US8310655B2 (en) | 2007-12-21 | 2008-12-19 | Detection and ranging methods and systems |
US13/632,191 US8619241B2 (en) | 2007-12-21 | 2012-10-01 | Distance detection method and system |
US14/984,704 USRE46930E1 (en) | 2007-12-21 | 2015-12-30 | Distance detection method and system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/632,191 Reissue US8619241B2 (en) | 2007-12-21 | 2012-10-01 | Distance detection method and system |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/632,191 Continuation US8619241B2 (en) | 2007-12-21 | 2012-10-01 | Distance detection method and system |
US13/632,191 Division US8619241B2 (en) | 2007-12-21 | 2012-10-01 | Distance detection method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE46930E1 true USRE46930E1 (en) | 2018-07-03 |
Family
ID=62683716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/984,704 Active USRE46930E1 (en) | 2007-12-21 | 2015-12-30 | Distance detection method and system |
Country Status (1)
Country | Link |
---|---|
US (1) | USRE46930E1 (en) |
Patent Citations (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3045231A (en) * | 1958-04-17 | 1962-07-17 | Thompson Ramo Wooldridge Inc | Signal analyzing method and system |
US3954335A (en) * | 1972-06-19 | 1976-05-04 | Siemens Ag | Method and apparatus for measuring range and speed of an object relative to a datum plane |
US4808997A (en) | 1987-05-21 | 1989-02-28 | Barkley George J | Photoelectric vehicle position indicating device for use in parking and otherwise positioning vehicles |
US4891624A (en) | 1987-06-12 | 1990-01-02 | Stanley Electric Co., Ltd. | Rearward vehicle obstruction detector using modulated light from the brake light elements |
JPH04145390A (en) | 1990-10-05 | 1992-05-19 | Mitsubishi Electric Corp | Distance measuring device |
JPH04145391A (en) | 1990-10-05 | 1992-05-19 | Mitsubishi Electric Corp | Distance measuring device |
US5298905A (en) * | 1992-06-12 | 1994-03-29 | Motorola, Inc. | Visible light detection and ranging apparatus and method |
US5587908A (en) * | 1992-12-22 | 1996-12-24 | Mitsubishi Denki Kabushiki Kaisha | Distance measurement device and vehicle velocity control device for maintaining inter-vehicular distance |
US5565870A (en) * | 1993-06-28 | 1996-10-15 | Nissan Motor Co., Ltd. | Radar apparatus with determination of presence of target reflections |
US5396510A (en) * | 1993-09-30 | 1995-03-07 | Honeywell Inc. | Laser sensor capable of measuring distance, velocity, and acceleration |
US5699151A (en) * | 1994-06-28 | 1997-12-16 | Mitsubishi Denki Kabushiki Kaisha | Distance measurement device |
JPH0912723A (en) * | 1995-06-23 | 1997-01-14 | Toshiba Silicone Co Ltd | Polyether-modified polyorganosiloxane |
JPH09178786A (en) | 1995-10-11 | 1997-07-11 | Fluke Corp | Pulse-base impedance-measuring instrument and pulse-base method for measuring complex impedance |
US5633801A (en) | 1995-10-11 | 1997-05-27 | Fluke Corporation | Pulse-based impedance measurement instrument |
US6115112A (en) * | 1996-03-07 | 2000-09-05 | Spectra Precision Ab | Electronic distance measuring instrument |
US5852491A (en) * | 1996-05-20 | 1998-12-22 | Olympus Optical Co., Ltd. | Distance measuring apparatus |
US5987395A (en) * | 1996-06-17 | 1999-11-16 | Bayerische Motoren Werke Aktiengesellschaft | Process for measuring the distance between a motor vehicle and an object |
US5812249A (en) | 1996-09-26 | 1998-09-22 | Envirotest Systems Corporation | Speed and acceleration monitoring device using visible laser beams |
US6100539A (en) * | 1997-01-20 | 2000-08-08 | Sick Ag | Light sensor with evaluation of the light transit time |
US6252655B1 (en) * | 1997-07-07 | 2001-06-26 | Nikon Corporation | Distance measuring apparatus |
US5933225A (en) * | 1997-08-12 | 1999-08-03 | Mitsubishi Denki Kabushiki Kaisha | Vehicular optical radar apparatus |
US6587185B1 (en) * | 1999-06-30 | 2003-07-01 | Minolta Co., Ltd. | Distance measuring apparatus |
US6850156B2 (en) | 1999-11-15 | 2005-02-01 | Donnelly Corporation | Anti-collision safety system for vehicle |
US20010024271A1 (en) * | 2000-03-17 | 2001-09-27 | Olympus Optical Co., Ltd | Distance measurement apparatus and distance measuring |
US6765495B1 (en) | 2000-06-07 | 2004-07-20 | Hrl Laboratories, Llc | Inter vehicle communication system |
US6502053B1 (en) | 2000-06-12 | 2002-12-31 | Larry Hardin | Combination passive and active speed detection system |
US6897465B2 (en) * | 2000-06-22 | 2005-05-24 | Ford Global Technologies, Llc | System and method for determining a distance of an object using emitted light pulses |
US6710859B2 (en) * | 2000-12-12 | 2004-03-23 | Denso Corporation | Distance measurement apparatus |
US6665057B2 (en) * | 2001-03-27 | 2003-12-16 | Hella Kg Hueck & Co. | Method for distance measurement for vehicles by measuring transit time of laser pulses |
US6650403B2 (en) * | 2001-04-06 | 2003-11-18 | Mitsubishi Denki Kabushiki Kaisha | Distance measuring device for a vehicle |
US6657704B2 (en) * | 2001-06-11 | 2003-12-02 | Denso Corporation | Distance measurement apparatus |
US6829043B2 (en) * | 2002-04-15 | 2004-12-07 | Toolz, Ltd. | Distance measurement device with short distance optics |
US6989781B2 (en) * | 2002-05-04 | 2006-01-24 | Robert Bosch Gmbh | Short-range radar system with variable pulse duration |
US20050269481A1 (en) * | 2002-08-05 | 2005-12-08 | Elbit Systems Ltd. | Vehicle mounted night vision imaging system and method |
US7023531B2 (en) * | 2002-08-09 | 2006-04-04 | Hilti Aktiengesellschaft | Laser distance measuring device with phase delay measurement |
US20040035620A1 (en) | 2002-08-26 | 2004-02-26 | Mckeefery James | Single wire automatically navigated vehicle systems and methods for toy applications |
US7221271B2 (en) | 2002-10-31 | 2007-05-22 | Gerd Reime | Device for controlling lighting for the interiors of automotive vehicles and method for controlling said device |
JP2006521536A (en) | 2002-11-26 | 2006-09-21 | ジェームス エフ. マンロ | High-precision distance measuring apparatus and method |
WO2005008271A2 (en) | 2002-11-26 | 2005-01-27 | Munro James F | An apparatus for high accuracy distance and velocity measurement and methods thereof |
US20040135992A1 (en) | 2002-11-26 | 2004-07-15 | Munro James F. | Apparatus for high accuracy distance and velocity measurement and methods thereof |
US7068214B2 (en) * | 2003-02-19 | 2006-06-27 | Fujitsu Ten Limited | Radar |
US20070091294A1 (en) | 2003-10-06 | 2007-04-26 | Triple-In Holding Ag | Distance measurement |
US20050117364A1 (en) | 2003-10-27 | 2005-06-02 | Mark Rennick | Method and apparatus for projecting a turn signal indication |
JP2005170184A (en) | 2003-12-10 | 2005-06-30 | Nissan Motor Co Ltd | Light emitting diode lamp device with radar function |
US7350945B2 (en) | 2004-01-09 | 2008-04-01 | Valeo Vision | System and method of detecting driving conditions for a motor vehicle |
US7177014B2 (en) * | 2004-06-15 | 2007-02-13 | Hokuyo Automatic Co., Ltd. | Light wave distance measuring apparatus |
US20060072099A1 (en) * | 2004-10-01 | 2006-04-06 | Denso Corporation | Vehicular radar system |
US20060147089A1 (en) | 2005-01-04 | 2006-07-06 | Deere & Company, A Delaware Corporation | Method and system for guiding a vehicle with vision-based adjustment |
US20060149472A1 (en) | 2005-01-04 | 2006-07-06 | Deere & Company, A Delaware Corporation. | Vision-aided system and method for guiding a vehicle |
US20070024841A1 (en) * | 2005-07-13 | 2007-02-01 | Mariusz Kloza | Device for precise distance measurement |
US20070097349A1 (en) * | 2005-10-28 | 2007-05-03 | Hideo Wada | Optical distance measuring apparatus |
JP2007121116A (en) | 2005-10-28 | 2007-05-17 | Sharp Corp | Optical distance measuring device |
US7417718B2 (en) | 2005-10-28 | 2008-08-26 | Sharp Kabushiki Kaisha | Optical distance measuring apparatus |
US20090102699A1 (en) | 2007-10-11 | 2009-04-23 | Andreas Behrens | Method for Detecting and Documenting Traffic Violations at a Traffic Light |
US7957900B2 (en) | 2008-02-08 | 2011-06-07 | Gaurav Chowdhary | Tracking vehicle locations in a parking lot for definitive display on a GUI |
US20090251680A1 (en) | 2008-04-02 | 2009-10-08 | Ali Farsaie | Three dimensional spatial imaging system and method |
US20110134249A1 (en) | 2009-12-04 | 2011-06-09 | Lockheed Martin Corporation | Optical Detection and Ranging Sensor System For Sense and Avoid, and Related Methods |
Non-Patent Citations (2)
Title |
---|
Akindinov et al., "Detection of Light Pulses Using an Avalanche-Photodiode Array with a Metal-Resistor-Semiconductor Structure", Instruments and Experimental Techniques, Nov. 2004, vol. 48, No. 3 205, pp. 355-363, Russia. |
Braun et al., "Nanosecond transient electroluminescence from polymer lightemitting diodes", Applied Physics Letters Dec. 1992, vol. 61, No. 26, pp. 3092-3094, California. |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11725956B2 (en) | 2015-04-01 | 2023-08-15 | Vayavision Sensing Ltd. | Apparatus for acquiring 3-dimensional maps of a scene |
US11668830B1 (en) | 2018-06-01 | 2023-06-06 | Vayavision Sensing Ltd. | System and method for performing active distance measurements |
US11175006B2 (en) | 2018-09-04 | 2021-11-16 | Udayan Kanade | Adaptive lighting system for even illumination |
US11740071B2 (en) | 2018-12-21 | 2023-08-29 | Apple Inc. | Optical interferometry proximity sensor with temperature variation compensation |
US11635374B2 (en) | 2019-05-09 | 2023-04-25 | Advantest Corporation | Optical testing apparatus |
US11156456B2 (en) * | 2019-05-21 | 2021-10-26 | Apple Inc. | Optical proximity sensor integrated into a camera module for an electronic device |
US20220003543A1 (en) * | 2019-05-21 | 2022-01-06 | Apple Inc. | Optical Proximity Sensor Integrated into a Camera Module for an Electronic Device |
US11846525B2 (en) * | 2019-05-21 | 2023-12-19 | Apple Inc. | Optical proximity sensor integrated into a camera module for an electronic device |
EP3964866A4 (en) * | 2019-05-24 | 2022-05-04 | Huawei Technologies Co., Ltd. | Echo signal processing method, apparatus and system, and storage medium |
US20220082659A1 (en) * | 2019-05-24 | 2022-03-17 | Huawei Technologies Co., Ltd | Echo Signal Processing Method and Apparatus, System, and Storage Medium |
US11906303B2 (en) | 2019-05-24 | 2024-02-20 | Apple Inc. | Wearable skin vibration or silent gesture detector |
US20220026539A1 (en) * | 2020-07-21 | 2022-01-27 | Leddartech Inc. | Beam-steering device particularly for lidar systems |
US11828853B2 (en) * | 2020-07-21 | 2023-11-28 | Leddartech Inc. | Beam-steering device particularly for LIDAR systems |
US11874110B2 (en) | 2020-09-25 | 2024-01-16 | Apple Inc. | Self-mixing interferometry device configured for non-reciprocal sensing |
Similar Documents
Publication | Title |
---|---|
USRE49950E1 (en) | Distance detection method and system |
USRE46930E1 (en) | Distance detection method and system |
CN106597471B (en) | Vehicle and method with transparent barriers object automatic detection function |
TWI432768B (en) | Procedure and device to determining a distance by means of an opto-electronic image sensor |
CA2691141C (en) | Lighting system with traffic management capabilities |
JP2018119986A (en) | Optical distance meter without multiple-view scanner under bright circumference background light |
US9378640B2 (en) | System and method for traffic side detection and characterization |
JP2011506979A5 (en) | |
US20080237445A1 (en) | Method and apparatus for distance measurement |
EP3540460B1 (en) | Light receiving apparatus, object detection apparatus, distance measurement apparatus, mobile object apparatus, noise measuring method, object detecting method, and distance measuring method |
CA3015002C (en) | Determination of an item of distance information for a vehicle |
CA3075721A1 (en) | Full waveform multi-pulse optical rangefinder instrument |
JP6265882B2 (en) | Object detection apparatus and object detection method |
JP2019074375A (en) | Distance measuring device, moving body, distance measuring method, and program |
WO2014038527A1 (en) | Vehicle radar device, and method of controlling detection range of same |
JP2015152428A (en) | Laser radar device and object detection method |
JP7316175B2 (en) | Rangefinder |
JPH04291190A (en) | Detecting device of inter-vehicle distance |
JPH0656360B2 (en) | Optical detection device |
US20230121398A1 (en) | Blockage detection of high-resolution lidar sensor |
US20240192375A1 (en) | Guided flash lidar |
FR2887342A1 (en) | Telemetry method for motor vehicle, involves producing signals representing transmitting and reflection waves, where signals are processed to obtain another set of signals that are compared with threshold to produce measuring signal |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: LEDDARTECH INC., CANADA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MIMEAULT, YVAN; REEL/FRAME: 039277/0948. Effective date: 20121205 |
AS | Assignment | Owner name: INVESTISSEMENT QUEBEC, CANADA. Free format text: SECURITY INTEREST; ASSIGNOR: LEDDARTECH INC.; REEL/FRAME: 051713/0622. Effective date: 20200130 |
AS | Assignment | Owner name: FEDERATION DES CAISSES DESJARDINS DU QUEBEC, QUEBEC. Free format text: SECURITY CONFIRMATION AGREEMENT; ASSIGNOR: LEDDARTECH INC.; REEL/FRAME: 053980/0942. Effective date: 20200123 |
AS | Assignment | Owner name: INVESTISSEMENT QUEBEC, CANADA. Free format text: SECURITY INTEREST; ASSIGNOR: LEDDARTECH INC.; REEL/FRAME: 055244/0208. Effective date: 20210205 |
CC | Certificate of correction | |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY. Year of fee payment: 8 |
AS | Assignment | Owner name: FEDERATION DES CAISSES DESJARDINS DU QUEBEC, CANADA. Free format text: SECURITY INTEREST; ASSIGNOR: LEDDARTECH INC.; REEL/FRAME: 063316/0716. Effective date: 20230405 |
AS | Assignment | Owner name: TSX TRUST COMPANY, CANADA. Free format text: SECURITY INTEREST; ASSIGNOR: LEDDARTECH INC.; REEL/FRAME: 063965/0233. Effective date: 20230609 |