CN112180388A - Three-dimensional distance measuring device

Publication number: CN112180388A
Application number: CN202010341819.XA
Authority: CN (China)
Prior art keywords: distance, frame, dimensional, mode, pattern
Legal status: Pending (an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 泉克彦
Current assignee: Hitachi LG Data Storage Inc
Original assignee: Hitachi LG Data Storage Inc
Application filed by Hitachi LG Data Storage Inc

Classifications

    • G01S 17/18 — Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
    • G01S 17/10 — Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S 17/89 — Lidar systems specially adapted for mapping or imaging
    • G01S 17/894 — 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 7/484 — Details of pulse systems; transmitters
    • G01S 7/4866 — Time delay measurement, e.g. time-of-flight measurement, by fitting a model or function to the received signal

Abstract

The present invention provides a three-dimensional distance measuring device comprising: a light emitting unit (10) that irradiates an object with light; a light receiving unit (13) that detects reflected light from the object; a distance calculation unit (14) that calculates the three-dimensional distance to the object based on the propagation time of the detected reflected light; an image processing unit (15) that generates a two-dimensional distance image of the object from the calculated distance data; and a distance mode selection processing unit (16) that selects a predetermined distance mode from a plurality of distance modes having different measurable distance ranges and sets the driving conditions of the light emitting unit. A first distance mode is selected in a first frame and a second distance mode in a second frame, and the distance data acquired in each frame are combined to generate the three-dimensional distance data of an output frame.

Description

Three-dimensional distance measuring device
Technical Field
The present invention relates to a three-dimensional distance measuring device that outputs the position of an object such as a person as a distance image.
Background
A technique is known in which the distance to an object is measured from the light propagation time (hereinafter, the TOF method: Time of Flight) and output as an image (distance image) representing the distance. To measure the distance to an object accurately, the coordinates of each object in three-dimensional space must be measured with high accuracy. In the TOF method, the distance measurement accuracy (distance resolution) and the measurable distance range depend on the frequency of the irradiated light (the length of the light emission period): the higher the frequency, the more accurate the measurement, but the narrower the range over which distance can be measured.
For example, the distance measurement imaging apparatus described in International Publication No. 2017/022152 combines a first distance measurement by the continuous-wave (CW) method with a second distance measurement by the pulse method in order to achieve both high measurement accuracy and a long measurement range.
Disclosure of Invention
In distance measurement by the TOF method, the light that is irradiated onto each object, reflected, and returned to the distance measuring device is weak, so measurement accuracy may be affected by the light environment around the object. Moreover, when the measurement distance is extended, the returning light becomes weaker still, lowering the distance measurement accuracy. The technique of International Publication No. 2017/022152 achieves both high measurement accuracy and a long measurement range, but it does not consider the inter-device interference that occurs when a plurality of distance measurement imaging devices are installed in the same area.
Inter-device interference is a phenomenon in which irradiated (or reflected) light from another device acts as interference light and exposes the device's own sensor, causing errors in the measured distance. As a countermeasure, a method is known in which the modulation frequency of the light emission pulse is varied from device to device to reduce the distance measurement error caused by interference. However, when this method is applied to the structure of International Publication No. 2017/022152, the pulse width must also be changed because the continuous-wave method and the pulse method are used together, making the method difficult to put into practical use.
An object of the present invention is to provide a three-dimensional distance measuring device that measures distance with high accuracy over a wide measurement range without compromising countermeasures against interference when a plurality of devices are installed.
The present invention provides a three-dimensional distance measuring device that outputs the position of an object as a distance image, including: a light emitting unit that irradiates the object with light; a light receiving unit that detects reflected light from the object; a distance calculation unit that calculates the three-dimensional distance to the object based on the propagation time of the reflected light detected by the light receiving unit; an image processing unit that generates a two-dimensional distance image of the object based on the distance data calculated by the distance calculation unit; and a distance mode selection processing unit that selects a predetermined distance mode from a plurality of distance modes having different measurable distance ranges and sets the driving conditions of the light emitting unit. The distance mode selection processing unit acquires three-dimensional distance data using a first distance mode in a first frame and using a second distance mode in a second frame, and the image processing unit combines the three-dimensional distance data acquired in the first frame with that acquired in the second frame to generate the three-dimensional distance data of an output frame.
According to the present invention, it is possible to provide a three-dimensional distance measuring device that measures a distance with high accuracy over a wide range of measurement distances. In this case, since conventional measures against interference between apparatuses can be applied, a problem does not arise when a plurality of apparatuses are installed.
Drawings
Fig. 1 is a configuration diagram of a three-dimensional distance measuring apparatus in example 1.
Fig. 2A is a diagram illustrating the principle of distance measurement by the TOF method.
Fig. 2B is a diagram illustrating the principle of distance measurement by the TOF method.
Fig. 3A is a diagram illustrating a distance mode (short distance) in example 1.
Fig. 3B is a diagram illustrating a distance mode (long distance) in example 1.
Fig. 4A is a diagram illustrating the distance modes and frame synthesis.
Fig. 4B is a diagram schematically showing the appearance of the object.
Fig. 5 is a flowchart showing the flow of the frame synthesis process.
Fig. 6 is a diagram illustrating the distance measurement method in example 2.
Fig. 7A is a diagram illustrating the distance modes and frame synthesis.
Fig. 7B is a diagram schematically showing the detection state of the object.
Fig. 8 is a flowchart showing the flow of the frame synthesis process.
Detailed Description
Hereinafter, an embodiment of a three-dimensional distance measuring device according to the present invention will be described with reference to the drawings.
[ example 1 ]
Fig. 1 is a configuration diagram of the three-dimensional distance measuring apparatus in example 1. The following example describes the case of detecting the position of a person as the object. The three-dimensional distance measuring device measures the distance to objects, including a person, by the TOF (Time of Flight) method, and outputs the measured distance to each part of the object as a distance image, for example by color display.
In the three-dimensional distance measuring device, the CPU17 controls a distance image generating unit 1 (hereinafter also referred to as a TOF camera or TOF) that generates a distance image by the TOF method. The principle of distance measurement by TOF will be described later.
The TOF camera 1 includes: a light emitting unit 10 including a laser diode (LD), a light emitting diode (LED), or the like that irradiates pulsed light onto the object 2; a light receiving unit 13 having a CCD sensor, a CMOS sensor, or the like that receives the pulsed light reflected from the object; a distance calculation unit 14 that calculates the distance to the object based on the detection signal of the light receiving unit 13; an image processing unit 15 that outputs a distance image of the object 2 based on the distance data output from the distance calculation unit 14; and a distance mode selection processing unit 16 that sets the driving conditions of the light emission pulse so as to select among distance modes having different measurable distance ranges. The TOF camera 1 is controlled by the CPU17 and can measure three-dimensional distance. Next, the structure and function of each part are described in detail.
In the TOF camera 1 shown in fig. 1, the light emitting unit 10 includes a light source unit 11 containing a laser light source, and a light emission control unit 12 that turns the laser light source on or off and adjusts its light emission amount. The light source unit 11 is provided with a laser light source 11a, and the laser light it emits irradiates the region indicated by 3a.
The light emission control unit 12 is constituted by a laser drive circuit 12a that drives the laser light source 11a. The distance mode selection processing unit 16 sets the driving conditions of the light emission pulses according to the distance mode selected in accordance with an instruction from the external CPU17, and then controls the light source unit 11 to turn the laser light source 11a on or off.
The CCD sensor 13a mounted in the light receiving unit 13 detects the light reflected from the object 2 and transmits the photoelectrically converted signal to the distance calculation unit 14. The distance calculation unit 14 calculates the distance to the object 2 and transmits the distance data to the image processing unit 15.
The image processing unit 15 stores the distance data transferred from the distance calculation unit 14 in an internal memory and synthesizes the distance data between frames. It then performs colorization processing, changing the color tone of the person or object image according to the distance data, and outputs the image to an external device or displays it on a display. The image processing may also change brightness, contrast, or the like. By viewing the colorized image, the user can easily grasp the position (distance) and shape (posture) of an object such as a person.
Next, the operation of each part will be described.
Fig. 2A and 2B are diagrams illustrating the principle of distance measurement by the TOF method. In the TOF method, the distance is calculated from the time difference between the emitted light signal and the received light signal.
Fig. 2A shows the relationship between the TOF camera (distance image generating unit) 1 and the object 2 (e.g., a person). The TOF camera 1 includes a light emitting unit 10 and a light receiving unit 13, and emits a light emission pulse 31 for distance measurement from the light emitting unit 10 toward the object 2. Infrared light is used as the emitted light. The light receiving unit 13 receives the reflected light 32 from the object 2 through an objective lens 33 on a two-dimensional sensor 34 such as a CCD. The object 2 is located at a distance L [m] from the light emitting unit 10 and the light receiving unit 13. With the speed of light c [m/s] and the time difference t [s] between the start of light emission by the light emitting unit 10 and the reception of the reflected light by the light receiving unit 13, the distance L [m] to the object 2 is obtained by the following equation (1):
L [m] = c [m/s] × t [s] / 2 ... (1)
Fig. 2B shows the measurement of the time difference t. The distance calculation unit 14 measures t from the timing of the light 31 emitted by the light emitting unit 10 and the timing at which the light receiving unit 13 receives the reflected light 32, and calculates the distance L to the object 2 according to equation (1). Furthermore, the difference in distance at each position on the object, that is, the uneven shape of the object, can be obtained from the variation in light receiving timing across the pixel positions of the two-dimensional sensor 34.
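Equation (1) can be sketched directly in code. This is a minimal illustration; the function name and the example timing value are not from the patent.

```python
C = 299_792_458.0  # speed of light c [m/s]

def tof_distance(t: float) -> float:
    """Distance L [m] to the object from the round-trip time
    difference t [s], per equation (1): L = c * t / 2."""
    return C * t / 2.0

# A reflected pulse arriving 40 ns after emission corresponds to about 6 m.
print(round(tof_distance(40e-9), 3))  # -> 5.996
```

The division by 2 reflects that the measured time covers the round trip from the light emitting unit 10 to the object 2 and back to the light receiving unit 13.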
Fig. 3A and 3B are diagrams illustrating the distance modes of the TOF camera in example 1. Fig. 3A shows a distance mode for measuring short distances, and fig. 3B a distance mode for measuring long distances. The drive interval times pa and pb of the light emission pulses 31 differ between the two (pa < pb). When the drive interval time p of the light emission pulse 31 is shortened, the maximum measurable time difference t is also shortened, and thus the measurable (limit) distance D is shortened. Regarding the distance measurement resolution of TOF, if the number of allocated bits is the same, the distance resolution becomes finer (i.e., the distance accuracy improves) as the measurable distance D becomes shorter. The short-distance side (minimum distance) of the measurable range is the same regardless of the distance mode, while the long-distance side (limit distance) differs depending on the distance mode setting. In the present embodiment, the CPU17 sets the distance mode so that the measurable (limit) distance D optimally covers the distance to the object 2.
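The trade-off between the drive interval time and resolution can be sketched as follows. The fixed-bit quantization of the measurable range and the numeric values are illustrative assumptions, not values from the patent.

```python
C = 299_792_458.0  # speed of light [m/s]

def max_distance(p: float) -> float:
    """Measurable limit distance D [m] for a light-emission pulse drive
    interval p [s]: the round trip must fit within one interval."""
    return C * p / 2.0

def distance_resolution(p: float, bits: int) -> float:
    """Distance resolution [m] when the measurable range is quantized
    with a fixed number of bits (an assumption for illustration)."""
    return max_distance(p) / (1 << bits)

# Shorter drive interval pa -> shorter range but finer resolution (fig. 3A),
# longer interval pb -> longer range but coarser resolution (fig. 3B).
pa, pb = 50e-9, 200e-9  # illustrative values with pa < pb
print(max_distance(pa) < max_distance(pb))                        # True
print(distance_resolution(pa, 10) < distance_resolution(pb, 10))  # True
```

With the same bit budget, halving the drive interval halves both the limit distance and the quantization step, which is why the short-distance mode is more accurate.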
Fig. 4A and 4B are diagrams illustrating the distance modes and frame synthesis. Fig. 4A illustrates the configuration in which frames with different distance modes are combined into an output frame, and fig. 4B schematically shows the resulting appearance of the objects as a distance image 60. In fig. 4A, a person 2a on the short-distance side and a person 2b on the long-distance side are the objects. In frame 1, the distance mode is set to the short-distance side (the mode of fig. 3A), and the measurable distance is L1. In the other frame 2, the distance mode is set to the long-distance side (the mode of fig. 3B), and the measurable distance is L2, where L1 < L2.
In this embodiment, the two distance modes are switched frame by frame, and the most suitable distance mode is selected for each pixel to perform frame synthesis. That is, by the frame synthesis described later, as shown in fig. 4B, the distance to the object 2a on the short-distance side is measured in the short-distance mode with high accuracy, while the distance to the object 2b on the long-distance side is measured in the long-distance mode with lower accuracy but a longer measurable distance. As a result, the measurable distance Lout extends as far as L2 while the decrease in measurement accuracy is minimized.
With the measurement method of the present embodiment, when a plurality of three-dimensional distance measuring devices are installed and operated in the same area, the method of changing the modulation frequency of the light emission pulse for each device can still be applied to avoid inter-device interference. This prevents the problem described above for International Publication No. 2017/022152.
Fig. 5 is a flowchart showing the flow of the frame synthesis process. The frame synthesis process described below is executed by the CPU17 of the three-dimensional distance measuring apparatus controlling the operation of each unit in fig. 1. The steps are described in order.
In S101, the CPU17 instructs TOF driving to start the TOF camera.
In S102, the distance mode selection processing unit 16 sets the distance mode of frame 1 to the short-distance mode.
In S103, the light emission control unit 12 turns on the light source unit 11 in the short-distance mode.
In S104, the light receiving unit 13 receives the reflected light from the object, and the distance calculation unit 14 acquires distance data.
In S105, the distance data of frame 1 is stored in the internal memory, and the processing of frame 1 ends.
In S106, the distance mode selection processing unit 16 sets the distance mode of frame 2 to the long-distance mode.
In S107, the light emission control unit 12 turns on the light source unit 11 in the long-distance mode.
In S108, the light receiving unit 13 receives the reflected light from the object, and the distance calculation unit 14 acquires distance data.
In S109, the distance data of frame 2 is stored in the internal memory, and the processing of frame 2 ends.
At this point, frame 1 holds three-dimensional distance data for the short-distance measurement range, and frame 2 holds three-dimensional distance data for the long-distance measurement range.
In S110, the image processing unit 15 determines which frame's distance data to use for each pixel: for pixels that can be measured in frame 1, the distance data of frame 1 is used; for pixels that cannot be measured in frame 1 because they lie outside the short-distance measurement range, the long-distance data of frame 2 is used.
In S111, the process branches on whether the distance data of frame 1 is used, as decided in S110: if so, the process proceeds to S112; if not, to S113.
In S112, the image processing unit 15 reads the distance data of frame 1 from the internal memory.
In S113, the image processing unit 15 reads the distance data of frame 2 from the internal memory.
In S114, the image processing unit 15 stores the read distance data in the output frame. After all pixels have been processed, the distance data of the output frame is output from the TOF camera.
Thereafter, the process returns to S102 and repeats.
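The per-pixel selection of S110 to S114 can be sketched as follows. Representing pixels as a flat list and marking "not measurable in frame 1" with a sentinel value are assumptions for illustration only.

```python
INVALID = None  # sentinel: pixel outside frame 1's short-distance range (assumption)

def synthesize_frames(frame1, frame2):
    """Build the output frame per S110-S114: use frame 1 (short-distance,
    high-accuracy) data where available, otherwise fall back to frame 2
    (long-distance) data."""
    return [d1 if d1 is not INVALID else d2 for d1, d2 in zip(frame1, frame2)]

# Near person measurable in frame 1; far person only measurable in frame 2.
frame1 = [1.2, 1.3, INVALID, INVALID]  # distances in metres, illustrative
frame2 = [1.4, 1.5, 7.8, 7.9]
print(synthesize_frames(frame1, frame2))  # -> [1.2, 1.3, 7.8, 7.9]
```

Each output pixel thus carries the most accurate measurement available for it, which is how the output frame keeps short-distance accuracy while reaching the long-distance range L2.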
According to example 1 described above, the short-distance three-dimensional distance data acquired in frame 1 and the long-distance three-dimensional distance data acquired in frame 2 are combined to generate the output frame, and the combined three-dimensional distance data of the output frame is output from the TOF camera. As a result, the measurement range extends to long distances while high accuracy is maintained on the short-distance side, minimizing the decrease in measurement accuracy.
[ example 2 ]
Next, the three-dimensional distance measuring apparatus in example 2 is described. The basic structure of the apparatus is the same as in example 1 (fig. 1), so its description is omitted here. Example 2 employs a TOF distance measurement method different from that of example 1.
Fig. 6 is a diagram illustrating the distance measurement method of the TOF camera in example 2. The light emission pulse 31a emitted from the light emitting unit 10 is reflected by the object and reaches the light receiving unit 13 as a light receiving pulse 32a. Gate pulses 18 are set at a plurality of different delay positions relative to the emission timing of the light emission pulse 31a, and it is determined within which gate pulse's open period the light receiving pulse 32a is detected. As a result of this determination, the object is found to be at the measurement distance corresponding to the time position of the detecting gate pulse. That is, the position of the gate pulse in example 2 corresponds to the distance mode in example 1.
The example of fig. 6 shows the gate pulses 18A and 18B for frame 1 (first and second distance modes) and the gate pulses 18C and 18D for frame 2 (third and fourth distance modes). The open period of each gate pulse is D, and the start positions of the gate pulses relative to the light emission pulse 31a are Da to Dd. Within a frame, gate pulses 18A and 18B are adjacent, as are gate pulses 18C and 18D. In other words, the limit distance on the far side of the range measurable in the first and third distance modes (gate pulses 18A, 18C) substantially matches the limit distance on the near side of the range measurable in the second and fourth distance modes (gate pulses 18B, 18D).
In addition, gate pulses 18A and 18C are offset from each other by half the open period, as are gate pulses 18B and 18D. In other words, the third distance mode is phase-shifted by half a period from the first distance mode, and the fourth distance mode by half a period from the second distance mode. The phase shift need not be exactly half a period; the effect of this embodiment is obtained as long as the gate pulse 18C (third distance mode) overlaps part of the gate pulse 18A (first distance mode) and part of the gate pulse 18B (second distance mode). In each frame, further adjacent gate pulses (not shown) follow gate pulse 18B and gate pulse 18D in the same manner.
In the example of fig. 6, the reflected light 32a is detected at gate pulses 18A and 18C (indicated by the circle marks). When the reflected light 32a is not detected at gate pulses 18A and 18C, it is checked whether it can be detected at the adjacent gate pulses 18B and 18D; if still not detected, the same operation is repeated with further adjacent gate pulses (not shown). The distance data obtained by this series of operations becomes the distance data of frame 1 and frame 2. If the reflected light falls within a given gate pulse, the distance data is treated as, for example, the distance corresponding to the center of the gate period D, regardless of the exact timing within the gate at which it is detected. Since this measurement method determines distance from the presence or absence of reflected light within the gate period D, it is well suited to long-distance measurement of weak reflected light when combined with a highly sensitive photodetector.
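The mapping from a detecting gate pulse to a distance value can be sketched as follows. The helper for locating the gate index and the numeric timing values are illustrative assumptions, not from the patent; only the center-of-gate convention comes from the text above.

```python
C = 299_792_458.0  # speed of light [m/s]

def gate_distance(delay: float, period: float) -> float:
    """Distance assigned when reflected light falls within a gate pulse
    that opens `delay` seconds after emission and stays open for `period`
    seconds: the detection is mapped to the center of the gate period."""
    return C * (delay + period / 2.0) / 2.0

def find_gate(t: float, first_delay: float, period: float) -> int:
    """Index of the adjacent gate pulse (18A, 18B, ...) whose open period
    contains the arrival time t (illustrative helper)."""
    return int((t - first_delay) // period)

# Reflected light arriving 130 ns after emission, with 100 ns gates
# starting at a 50 ns delay: it falls in gate 0 (e.g., gate pulse 18A).
k = find_gate(130e-9, 50e-9, 100e-9)
print(k, round(gate_distance(50e-9 + k * 100e-9, 100e-9), 2))  # -> 0 14.99
```

Because only the presence or absence of light within a gate matters, the reported distance is quantized to one value per gate, which is exactly the measurement distance unit S discussed below.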
In fig. 6, detecting the distance to the object as frame 1 using the set of gate pulses 18A and 18B works as in the TOF method described earlier. In example 2, another set of gate pulses 18C and 18D, whose open timings are shifted by half a period, is added, and the distance to the object is detected as frame 2 using these gate pulses. Frame 1 and frame 2 are repeated alternately, and the gate pulse at which reflected light is detected is checked in each frame, which improves the distance measurement accuracy as described later.
Fig. 7A and 7B are diagrams illustrating the distance modes and frame synthesis. Fig. 7A illustrates the configuration in which frames 1 and 2, whose gate pulse timings are shifted by half a period, are combined into an output frame. Fig. 7B schematically shows the detection state of the object 2. Frame 1 corresponds to gate pulses 18A and 18B in fig. 6, labeled distance modes 1 and 2. Frame 2 corresponds to gate pulses 18C and 18D in fig. 6, labeled distance modes A and B. The output frame is divided by the overlapping regions of the distance modes of frames 1 and 2, labeled distances 1A, 2A, and 2B.
In fig. 7A, the object 2 is detected in distance mode 2 in frame 1, giving the distance L1. In the other frame 2, the object 2 is detected in distance mode B, giving the distance L2. In example 2, the range common to the detection ranges of frame 1 and frame 2 is taken as the range in which the object is present. Thus, in the output frame, the object is located at the distance Lout corresponding to the period common to distance mode 2 and distance mode B (shown as distance 2B).
Fig. 7B schematically shows the detection state of the object 2 in each frame. As described above, the object 2 is detected in distance mode 2 in frame 1 and in distance mode B in frame 2. In a single frame, the object is detected in one of the distance modes, so the position accuracy is determined in principle by the interval between adjacent distance modes (measurement distance unit S). According to the present embodiment, by making the output frame a composite of two frames, the effective distance width (measurement distance unit) is halved to S/2. Thus, the distance measurement of the present embodiment achieves twice the resolution (distance measurement accuracy) compared with using frame 1 or frame 2 alone.
Although this embodiment shows an example of combining two frames, the number of frames can be increased to n (n ≥ 3) with the gate pulse timings shifted accordingly. In that case, the effective distance width (measurement distance unit) is further reduced to S/n, improving the distance measurement accuracy.
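The narrowing of the presence range by combining the two frames can be sketched as an interval intersection. The concrete interval values and the unit S = 10 m are illustrative assumptions.

```python
def intersect(a, b):
    """Overlap of two (lo, hi) intervals; the object must lie inside the
    detected gate range of both frames."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None

S = 10.0  # measurement distance unit per gate interval [m], illustrative
# Frame 1: object detected in distance mode 2  -> interval [S, 2S)
# Frame 2: gates shifted by half a period; mode B covers [1.5S, 2.5S)
common = intersect((S, 2 * S), (1.5 * S, 2.5 * S))
print(common, common[1] - common[0])  # -> (15.0, 20.0) 5.0, i.e. width S/2
```

With n frames whose gates are mutually shifted by S/n, the same intersection shrinks the common range to width S/n, matching the accuracy improvement stated above.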
Fig. 8 is a flowchart showing the flow of the frame composition process. The frame composition process below is executed by the CPU 17 of the three-dimensional distance measuring apparatus controlling the operation of each unit in Fig. 1. The steps are described in order.
In S201, the CPU 17 instructs the start of TOF driving.
In S202, the distance pattern selection processing unit 16 sets the gate pulse for frame 1, for example the gate pulse 18A for the closest distance (pattern 1).
In S203, the light emission control unit 12 turns on the light source unit 11.
In S204, the light receiving unit 13 receives the reflected light from the object and checks whether its timing coincides with the set gate pulse.
In S205, it is determined whether the timing of the reflected light coincides with the set gate pulse. If it does, the process proceeds to S207; if not, to S206.
In S206, the set gate pulse is shifted to the adjacent position, for example from the gate pulse 18A (pattern 1) to the gate pulse 18B (pattern 2), and the process returns to S203.
In S207, the distance calculation unit 14 stores the distance data determined by the set gate pulse in the internal memory, and the processing of frame 1 ends.
Next, the processing of frame 2 is started.
In S302, the distance pattern selection processing unit 16 sets the gate pulse for frame 2, for example the gate pulse 18C for the closest distance (pattern A).
In S303, the light emission control unit 12 turns on the light source unit 11.
In S304, the light receiving unit 13 receives the reflected light from the object and checks whether its timing coincides with the set gate pulse.
In S305, it is determined whether the timing of the reflected light coincides with the set gate pulse. If it does, the process proceeds to S307; if not, to S306.
In S306, the set gate pulse is shifted to the adjacent position, for example from the gate pulse 18C (pattern A) to the gate pulse 18D (pattern B), and the process returns to S303.
In S307, the distance calculation unit 14 stores the distance data determined by the set gate pulse in the internal memory, and the processing of frame 2 ends.
Next, frame 1 and frame 2 are combined.
In S308, the image processing unit 15 reads the distance data of frame 1 and frame 2 from the internal memory.
In S309, the image processing unit 15 outputs the average of the distance data of frame 1 and frame 2 from the TOF camera as the distance data of the output frame.
Thereafter, the process returns to S202, and the above-described process is repeated.
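Assuming the coincidence check can be modeled as a predicate on each gate window, the flowchart reduces to the control-flow sketch below. The `echo_hits` predicate and the window lists are hypothetical stand-ins for the hardware steps (light emission, gate coincidence); only the flow mirrors Fig. 8:

```python
def scan_frame(patterns, echo_hits):
    """Steps S202-S207 / S302-S307: try the gate windows in order until the
    reflected light coincides with the set gate, then return that pattern's
    distance estimate (taken here as the midpoint of its window)."""
    for near, far in patterns:
        if echo_hits(near, far):       # stand-in for the coincidence check
            return (near + far) / 2.0
    return None                        # no pattern matched

def output_frame(frame1_patterns, frame2_patterns, echo_hits):
    """Steps S308-S309: combine the two frames by averaging their
    distance data, as the image processing unit 15 does."""
    d1 = scan_frame(frame1_patterns, echo_hits)
    d2 = scan_frame(frame2_patterns, echo_hits)
    if d1 is None or d2 is None:
        return None
    return (d1 + d2) / 2.0

# Illustration: object at 1.8 (in units of S); frame 2 windows shifted by S/2.
obj = 1.8
hits = lambda near, far: near <= obj < far
out = output_frame([(0, 1), (1, 2)], [(0.5, 1.5), (1.5, 2.5)], hits)
# frame 1 -> pattern 2 midpoint 1.5; frame 2 -> pattern B midpoint 2.0; avg 1.75
```

Note that the averaged value 1.75 lies within S/4 of the true position, consistent with the halved measurement distance unit described above.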
According to embodiment 2 described above, an output frame combining the three-dimensional distance data acquired in frame 1 with that acquired in frame 2 is generated, and the combined three-dimensional distance data is output from the TOF camera. As a result, the resolution is improved compared with detection using a single frame, and distance measurement can be performed with little degradation of measurement accuracy even when the measurement distance range extends to long distances.

Claims (7)

1. A three-dimensional distance measuring device for outputting a position of an object as a distance image,
the three-dimensional distance measuring device is provided with:
a light emitting unit that irradiates the subject with light;
a light receiving unit that detects reflected light from the subject;
a distance calculation unit that calculates a three-dimensional distance to the object based on the propagation time of the reflected light detected by the light receiving unit;
an image processing unit that generates a two-dimensional distance image of the object based on the distance data calculated by the distance calculation unit; and
a distance pattern selection processing unit for selecting a predetermined distance pattern from a plurality of distance patterns having different measurable distance ranges and setting a driving condition of the light emitting unit,
the distance pattern selection processing unit acquires three-dimensional distance data from a first distance pattern in a first frame and three-dimensional distance data from a second distance pattern in a second frame,
the image processing unit combines the three-dimensional distance data acquired in the first frame and the second frame to generate three-dimensional distance data of the output frame.
2. The three-dimensional distance measuring apparatus according to claim 1,
the limit distance on the far side of the measurable distance range in the first distance pattern is smaller than the limit distance on the far side of the measurable distance range in the second distance pattern, and
all or part of the distance range measurable in the first distance pattern coincides with part of the distance range measurable in the second distance pattern.
3. The three-dimensional distance measuring apparatus according to claim 2,
the three-dimensional distance data of the output frame is selected from the three-dimensional distance data of the first frame within the range measurable in the first distance pattern, and from the three-dimensional distance data of the second frame within the range measurable only in the second distance pattern.
4. A three-dimensional distance measuring device for outputting a position of an object as a distance image,
the three-dimensional distance measuring device is provided with:
a light emitting unit that irradiates the subject with light;
a light receiving unit that detects reflected light from the subject;
a distance calculation unit that calculates a three-dimensional distance to the object based on the propagation time of the reflected light detected by the light receiving unit;
an image processing unit that generates a two-dimensional distance image of the object based on the distance data calculated by the distance calculation unit; and
a distance pattern selection processing unit for selecting a predetermined distance pattern from a plurality of distance patterns having different measurable distance ranges and setting a driving condition of the light emitting unit,
the distance pattern selection processing unit acquires three-dimensional distance data from at least a first distance pattern and a second distance pattern in a first frame, and acquires three-dimensional distance data from at least a third distance pattern and a fourth distance pattern in a second frame,
the image processing unit combines the three-dimensional distance data acquired in the first frame and the second frame to generate three-dimensional distance data of the output frame.
5. The three-dimensional distance measuring apparatus according to claim 4,
the limit distance on the far side of the measurable distance range in the first distance pattern substantially coincides with the limit distance on the near side of the measurable distance range in the second distance pattern,
the limit distance on the far side of the measurable distance range in the third distance pattern substantially coincides with the limit distance on the near side of the measurable distance range in the fourth distance pattern, and
the third distance pattern includes a part of the distance range measurable in the first distance pattern and a part of the distance range measurable in the second distance pattern.
6. The three-dimensional distance measuring apparatus according to claim 5,
the third distance pattern of the second frame is offset in phase by half a period from the first distance pattern of the first frame.
7. The three-dimensional distance measuring apparatus according to claim 6,
three-dimensional distance data of the output frame is generated by averaging the three-dimensional distance data of the first frame and the three-dimensional distance data of the second frame.
CN202010341819.XA 2019-07-05 2020-04-27 Three-dimensional distance measuring device Pending CN112180388A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-126149 2019-07-05
JP2019126149A JP7257275B2 (en) 2019-07-05 2019-07-05 3D distance measuring device

Publications (1)

Publication Number Publication Date
CN112180388A true CN112180388A (en) 2021-01-05

Family

ID=73919070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010341819.XA Pending CN112180388A (en) 2019-07-05 2020-04-27 Three-dimensional distance measuring device

Country Status (3)

Country Link
US (1) US11604278B2 (en)
JP (1) JP7257275B2 (en)
CN (1) CN112180388A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024004639A1 (en) * 2022-06-29 2024-01-04 Sony Semiconductor Solutions Corporation Light reception device, information processing device, ranging device, and information processing method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0382978A (en) * 1989-08-25 1991-04-08 Matsushita Electric Works Ltd Ultrasonic detector
US20050269481A1 (en) * 2002-08-05 2005-12-08 Elbit Systems Ltd. Vehicle mounted night vision imaging system and method
US20070058038A1 (en) * 2004-02-04 2007-03-15 Elbit Systems Ltd. Gated imaging
WO2017110413A1 (en) * 2015-12-21 2017-06-29 株式会社小糸製作所 Image acquisition device for vehicles, control device, vehicle provided with image acquisition device for vehicles and control device, and image acquisition method for vehicles
JP2017167120A (en) * 2016-03-10 2017-09-21 株式会社リコー Distance measurement device, moving body, robot, device and three-dimensional measurement method
JP2018185179A (en) * 2017-04-25 2018-11-22 株式会社リコー Distance measuring device, monitoring device, three-dimensional measuring device, mobile body, robot, and method for measuring distance
CN109917412A (en) * 2019-02-01 2019-06-21 深圳奥比中光科技有限公司 A kind of distance measurement method and depth camera

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001337166A (en) * 2000-05-26 2001-12-07 Minolta Co Ltd Method and device for three-dimensional input
CN103064087B (en) 2012-12-25 2015-02-25 符建 Three-dimensional imaging radar system and method based on multiple integral
JP6675061B2 (en) 2014-11-11 2020-04-01 パナソニックIpマネジメント株式会社 Distance detecting device and distance detecting method
CN107850669B (en) 2015-07-31 2021-10-12 新唐科技日本株式会社 Distance measurement imaging device and solid-state imaging device
EP3418772A4 (en) 2016-02-17 2019-03-13 Panasonic Intellectual Property Management Co., Ltd. Distance measuring device
WO2019012756A1 (en) * 2017-07-11 2019-01-17 ソニーセミコンダクタソリューションズ株式会社 Electronic device and method for controlling electronic device
CN110192118A (en) 2017-12-22 2019-08-30 索尼半导体解决方案公司 Impulse generator and signal generating apparatus
JP2020134464A (en) * 2019-02-25 2020-08-31 ソニーセミコンダクタソリューションズ株式会社 Distance measuring apparatus, distance measuring method and program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG YI; BAI LIANFA; CHEN QIAN; GU GUOHUA; ZHANG BAOMIN: "Cyclic stepped-delay range-gated underwater low-light-level three-dimensional imaging", Journal of Nanjing University of Science and Technology (Natural Science Edition), no. 06 *

Also Published As

Publication number Publication date
US20210003705A1 (en) 2021-01-07
JP7257275B2 (en) 2023-04-13
US11604278B2 (en) 2023-03-14
JP2021012091A (en) 2021-02-04

Similar Documents

Publication Publication Date Title
CN109959942B (en) Distance measuring apparatus, identifying apparatus, and distance measuring method
US10310084B2 (en) Range imaging apparatus and range imaging method
CN109425864B (en) 3D distance measuring device
US8120761B2 (en) Method and apparatus for position judgment
KR101891907B1 (en) Distance measuring device and parallax calculation system
US10578741B2 (en) Distance detection device and distance detection method
US11536814B2 (en) Distance measuring apparatus having distance correction function
CN110850426B (en) TOF depth camera
CN113406655B (en) Method for correcting measured value of distance measuring device and distance measuring device
KR100728482B1 (en) Displacement sensor providing auto setting means of measurement area
JP2020056698A (en) Distance measuring imaging device
US10514447B2 (en) Method for propagation time calibration of a LIDAR sensor
CN112180388A (en) Three-dimensional distance measuring device
CN113567952B (en) Laser radar control method and device, electronic equipment and storage medium
US7391505B2 (en) Range sensing system
US9562972B2 (en) Method for ascertaining a distance of an object from a motor vehicle using a PMD sensor
CN115176174A (en) Distance-measuring camera device
JPWO2020021590A1 (en) Endoscope device
JP7436428B2 (en) Distance measuring device, ranging system, and interference avoidance method
WO2024029486A1 (en) Distance measurement device
US11869206B2 (en) Controllable laser pattern for eye safety and reduced power consumption for image capture devices
WO2022209180A1 (en) Distance measurement device and distance measurement system
US20210382153A1 (en) Method and apparatus for characterizing a time-of-flight sensor and/or a cover covering the time-of-flight sensor
WO2021059638A1 (en) Distance measurement device
EP3715903B1 (en) Laser distance meter and method of measuring a distance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination