WO2019021887A1 - Optical radar device - Google Patents
Optical radar device
- Publication number
- WO2019021887A1 (PCT/JP2018/026748; application JP2018026748W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- radar device
- optical
- pulsed light
- pulsed
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
Definitions
- The present invention relates to an optical radar device, and more particularly to an optical radar device for acquiring a three-dimensional image composed mainly of a two-dimensional image of an object and distance information to the object.
- A three-dimensional image is a concept that includes, in addition to a two-dimensional image like an ordinary photograph, distance information to the objects in the field of view; in recent years, three-dimensional image sensors have become extremely important for surroundings recognition by automobiles and robots.
- As sensors for two-dimensional images, charge coupled device (CCD) and complementary metal oxide semiconductor (CMOS) imagers are in widespread use; light intensity is converted into an electrical signal by silicon photodiodes for imaging.
- Optical radar devices fall broadly into two types: a scanning type in which a laser beam focused into a point (see Non-Patent Document 1) or into a band (see Patent Document 1) is swept by a mirror or the like, and a batch irradiation type in which the laser beam is spread out and irradiated at once. Many scanning-type devices, in which a strong beam intensity on the object is easily obtained, have been developed.
- The scanning type, however, is expensive and bulky because it requires a mechanical mechanism to swing the beam.
- The batch irradiation type requires no mechanical scanning mechanism and is therefore easy to miniaturize, but the laser light intensity on the object is weaker than in the scanning type, so as the distance to the object increases, the signal strength decreases and the distance measurement accuracy deteriorates.
- In one measurement method, because the time measurement accuracy translates directly into the distance accuracy, the pulsed laser beam is emitted many times, the time from light emission to light reception is measured repeatedly, and a histogram (horizontal axis: time, vertical axis: frequency) is accumulated, from which the flight time is determined.
- TCSPC (Time-Correlated Single-Photon Counting)
- SPAD (Single-Photon Avalanche Diode: single-photon avalanche multiplication photodiode)
- Since this method has a large circuit scale per pixel, it is not used in imagers with a large two-dimensional array of pixels, and is mainly used in combination with the scanning type (Patent Document 2 and Non-Patent Document 1).
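- The histogram idea behind TCSPC can be sketched as follows. This is only an illustration of the principle, not the circuit of any cited document; `measure_once_ns` stands in for a single SPAD detection (time from emission to photon arrival in nanoseconds, or None when nothing is detected).

```python
import numpy as np

def tcspc_histogram(measure_once_ns, n_pulses=1000, n_bins=256, bin_width_ns=1.0):
    """Repeat pulsed emission, bin the emission-to-detection times, and take the
    histogram peak as the flight time (horizontal axis: time, vertical axis: frequency)."""
    hist = np.zeros(n_bins, dtype=int)
    for _ in range(n_pulses):
        t_ns = measure_once_ns()          # time from light emission to photon detection
        if t_ns is not None:
            idx = int(t_ns // bin_width_ns)
            if 0 <= idx < n_bins:
                hist[idx] += 1
    t_peak_ns = (np.argmax(hist) + 0.5) * bin_width_ns
    distance_m = 299_792_458.0 * t_peak_ns * 1e-9 / 2.0   # round trip, so divide by 2
    return hist, distance_m
```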
- In another method, the current of the photodiode is measured and the flight time is determined by comparison with a determination value. There are also cases in which the current is accumulated sequentially in capacitors arranged in time series and the determination is made from the accumulated amounts. With this mechanism a three-dimensional image is formed by a single laser irradiation, so, as when taking a photograph with a flash, simultaneity is ensured over the entire field of view; this differs greatly from the scanning type, in which each point in the field of view is measured at a different time, and such devices are described as "flash" type (see Patent Document 3 and Patent Document 4).
- An optical radar device is used attached to a moving object (main body, moving body) such as a vehicle, a robot, a drone, or a walking cane for the visually impaired.
- While the moving body travels, the inclination of the ground and the inclination of the main body may change, so the center (optical axis) of the field of view of the optical radar device may swing up and down.
- The field of view may also rotate around the optical axis. For example, when there is an up-and-down swing, an object that is actually at the front center may be recognized as being above or below its actual position, or a stationary object may be misrecognized as moving up and down.
- When the field of view rotates around the optical axis, a flat passage may be misrecognized as an inclined one, or inclined ground may be misrecognized as flat.
- One approach is to provide a separate horizontal sensor or a two-axis accelerometer on the main body, monitor how the main body is inclined with respect to the vertical line, and use the inclination information of the main body to correct the data.
- If the main body is a large vehicle equipped with a high-performance CPU, the above data processing can be done without a large time delay, but a smaller moving body is often not equipped with a powerful CPU, and the above data processing then becomes a major obstacle.
- Furthermore, the inclination from the vertical line may differ for each optical radar device, because the larger the body, the greater the amount of deformation, as long as the body is not completely rigid.
- One aspect of the present invention aims to realize an optical radar device that can output a three-dimensional image whose positional relationship with respect to the vertical line is constant, so that the position of an object can be recognized accurately with a small amount of data processing.
- An optical radar device according to one aspect of the present invention is an optical radar device that measures the distance to an object present in a predetermined target area by a time-of-flight method, and includes: a light source for emitting pulsed light; a pulsed light illumination system for irradiating at least a part of the target area with the pulsed light emitted from the light source; a light receiving system for detecting the reflected light of the pulsed light irradiated by the pulsed light illumination system from the object and measuring the distance to the object; and a horizontal sensor for detecting the inclination angle of the optical axis of the light receiving system with respect to the horizontal plane. The light receiving system corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor.
- With this configuration, the positional relationship of the 3D image output from the optical radar device with respect to the horizontal plane is constant, so a 3D image from which the position of the object can be recognized accurately with a small amount of data processing can be output.
- FIG. 3 is a schematic view showing the relationship between a target area, a measurable area, and a pulsed light irradiation area in the optical radar device shown in FIG. 1.
- FIG. 4 is a schematic view of the ToF sensor included in the optical radar device shown in FIG. 1.
- FIG. 5 is a schematic diagram showing the relationship between a target area, a measurable area, and a pulsed light irradiation area in the optical radar device according to Embodiment 2 of the present invention.
- FIG. 7 is a schematic view showing the relationship between a target area, a measurable area, and a pulsed light irradiation area in the optical radar device shown in FIG. 6.
- FIG. 8 is a schematic diagram showing the structure of the optical radar device according to Embodiment 4 of the present invention.
- FIG. 9 is a schematic view showing the relationship between a target area, a measurable area, and a pulsed light irradiation area in the optical radar device shown in FIG. 8;
- Embodiment 1: One embodiment of the present invention is described as follows. Hereinafter, for convenience of explanation, members having the same functions as members described in a specific embodiment are given the same reference numerals, and their description may be omitted.
- FIG. 1 is a schematic view of an optical radar device 100 according to the first embodiment.
- The optical radar device 100 includes a pulsed light illumination system 110 for irradiating a pulsed light irradiation area 10 with fan-shaped pulsed light 124 (hereinafter sometimes referred to simply as pulsed light), a light receiving system 150 for receiving light from at least a part of the pulsed light irradiation area 10, and a horizontal sensor 140.
- the pulsed light illumination system 110 includes a light source 122 that emits pulsed light, and a fan-shaped light irradiation system 123 that irradiates the pulsed light irradiation area 10 with the pulsed light emitted by the light source 122.
- The fan-shaped light irradiation system 123 converts the pulsed light from the light source 122 into band-shaped pulsed light that spreads in the horizontal direction with a narrow width, and irradiates the fan-shaped pulsed light 124 while scanning the band-shaped pulsed light one-dimensionally in the vertical direction.
- the light receiving system 150 includes a ToF sensor (including a three-dimensional image element) 153, an imaging optical system 151 that forms an image on the light receiving unit 154 of the ToF sensor 153, and an optical band pass filter 152.
- the ToF sensor 153 also includes functions such as control of the pulsed light illumination system 110 and communication with the external system 170.
- The horizontal sensor 140 constantly monitors the inclination angle of the optical axis of the light receiving system 150 with respect to the horizontal plane, and outputs the monitoring result to the ToF sensor 153.
- In the present embodiment, the case of correcting the inclination of the optical axis of the light receiving system 150 with respect to the horizontal plane will be described.
- the Z axis is the vertical direction
- the XY plane is the horizontal plane.
- the coordinate axes fixed to the environment are represented by X, Y, Z
- the coordinate axes fixed to the light radar device 100 are represented by Xr, Yr, Zr.
- When the optical radar device 100 is not inclined, the two coincide.
- the case where the Y-axis direction is the front of the optical radar device 100 and the Yr-axis is inclined with respect to the Y-axis is called forward inclination
- the case where the Xr-axis is inclined with respect to the X-axis is called lateral inclination.
- Unless otherwise specified, it is assumed that the optical radar device 100 is not inclined in any direction with respect to the horizontal plane, and the case where the Y-axis direction is the center of the pulsed light irradiation region 10 and the optical axis direction of the light receiving system 150 will be described. Further, the present embodiment deals with the case where the left-right inclination of the optical radar device 100 can be ignored.
- FIG. 2 is a schematic view showing a configuration of a fan-shaped light irradiation system 123 of the pulsed light illumination system 110 shown in FIG.
- FIG. 3 is a schematic view showing the relationship between a target area, a measurable area, and a pulsed light irradiation area in the optical radar device shown in FIG. 1.
- the fan-shaped pulsed light 124 irradiated by the fan-shaped light irradiation system 123 will be described.
- The fan-shaped pulsed light 124 spreads like a fan in the horizontal direction, and its spread angle is denoted by θh.
- The spread angle in the vertical direction is small, and the beam thickness is θw (half width).
- θh >> θw.
- By scanning the fan-shaped pulsed light 124 in the vertical direction within the vertical irradiation angle θv, the pulsed light irradiation region 10 with horizontal spread angle θh and vertical spread angle θv can be irradiated sequentially. Note that θv > θw.
- Ns is the total number of scans in the vertical direction.
- When θh > θv, many of the objects in the field of view can be detected by the first few scans, without completing all of the vertical scans.
- It is rare for an obstacle to be detected to be floating in the air; it usually stands vertically from the road surface or the ground, so if, for example, fan-shaped pulsed light parallel to the horizontal plane is irradiated from a low height, most obstacles can be detected. There is therefore the advantage that an obstacle can be detected quickly, before the end of one field period.
- Alternatively, θh ≤ θv may be used, and the relationship can be changed according to the application.
- It is desirable that the fan-shaped pulsed light 124 be uniform within the object field of view; however, since the detection sensitivity is higher where the light intensity is higher, a light intensity distribution in which the intensity in a particular vicinity is strengthened may also be used.
- The fan-shaped light irradiation system 123 includes a collimated light generator 130 for shaping the light of the light source 122 (FIG. 1) into substantially parallel spot light 133, a one-dimensional scanning device 131 for changing the traveling angle of the spot light at least in the vertical direction (Z direction, within the YZ plane), and a fan-shaped beam generator 132 for spreading, in a fan shape, the spot light whose traveling angle in the vertical direction has been changed by the one-dimensional scanning device 131.
- the one-dimensional scanning device 131 is configured of a MEMS mirror element or the like having a reflective surface that rotates about one axis (X axis) in a horizontal plane (XY plane).
- the fan beam generator 132 includes, for example, a Powell lens.
- For example, the collimated light generator 130 forms spot light 133 having a divergence angle of about 1.0 degree and a diameter of about 3 mm at the entrance of a Powell lens with an aperture of 8.9 mm, and the laser beam is swung ±20 degrees with respect to the horizontal plane by the one-dimensional scanning device 131.
- Although the light path of the fan-shaped light irradiation system 123 in FIG. 2 passes through the one-dimensional scanning device 131 and then through the fan-shaped beam generator 132, it may conversely pass through the fan-shaped beam generator 132 first and then through the one-dimensional scanning device 131.
- It is preferable that θv be larger than the measurement region θov in the vertical direction of the optical radar device 100.
- θiy denotes the forward inclination angle, with downward defined as negative.
- The actual horizontal plane is then shifted by −θiy with respect to the center of the pulsed light irradiation region 10 of the optical radar device 100.
- Although the case where the target area spreads centered on the horizontal plane has been described here, the same applies when the target area is centered elsewhere.
- The MEMS mirror element is, for example, of an electromagnetic type, in which the swing angle of the mirror is changed by controlling the amount of current. In electrostatic and piezoelectric types, the deflection angle of the mirror can be changed by controlling the applied voltage.
- the MEMS mirror element has the advantage that the device can be miniaturized.
- Control of the one-dimensional scanning device 131 is included in the ToF sensor 153. In order to detect the signal from the object irradiated with the fan-shaped pulsed light 124, it is necessary to synchronously control the deflection angle of the mirror and the light receiving system.
- the one-dimensional scanning device 131 may have another configuration, such as a polygon mirror or a liquid crystal waveguide method, in addition to the MEMS mirror element.
- The light source 122 constituting the pulsed light illumination system 110 is a light source capable of pulsed emission, such as a laser or an LED, and preferably emits infrared light having a wavelength of about 700 nm to about 1000 nm.
- The longer the wavelength, the higher the safety for the eyes of animals, because the light is invisible to the human eye.
- the longer the wavelength the lower the background light intensity.
- A wavelength around 940 nm to 950 nm is particularly preferable because the sunlight intensity is lowered there by absorption by moisture in the air.
- It is preferable that the emission wavelength band be narrow and the temperature fluctuation of the emission peak wavelength be small; an infrared laser is therefore preferable.
- a temperature control circuit for controlling the temperature of the light source 122 may be added in order to suppress the temperature fluctuation of the light emission peak wavelength.
- the light source 122 emits pulsed light in synchronization with the ToF sensor 153.
- the light emission intensity and the pulse width may be variable.
- the pulse width of the pulsed light is about 1 nsec to several hundreds nsec.
- the peak power of pulsed light is from several watts to hundreds of watts.
- The fan-shaped pulsed lights 124-1 to 124-40 are emitted in sequence.
- The time allotted to each fan-shaped pulsed light 124-k is 1/1200 seconds; during this time the angle of the reflective surface of the one-dimensional scanning device 131 is changed to the set value and the light source 122 emits pulses.
- When the pulse emission frequency is 190 kHz, each fan-shaped pulsed light 124-k irradiates the object with 158 pulses, and the measurement results of the individual pulses are integrated.
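- As a rough consistency check of these figures (assuming that the 40 fan-shaped beams together make up one frame, which the text does not state explicitly):

```python
# Timing figures from the description above
n_fan_beams = 40                          # fan-shaped pulsed lights 124-1..124-40
time_per_beam_s = 1.0 / 1200.0            # time allotted to each fan-shaped pulsed light
pulse_rate_hz = 190e3                     # pulse emission frequency
pulses_per_beam = pulse_rate_hz * time_per_beam_s
frame_period_s = n_fan_beams * time_per_beam_s
print(pulses_per_beam, frame_period_s)    # ~158.3 pulses per beam, 1/30 s per frame
```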
- It is preferable that θw be approximately equal to or larger than the vertical angular resolution corresponding to one pixel of the light receiving system 150.
- the object surface corresponding to the target pixel is irradiated with light substantially uniformly.
- θw should be as small as possible.
- Light receiving system 150: The light receiving system 150 of the optical radar device 100 will be described below.
- the image forming optical system 151 constituting the light receiving system 150 is generally a lens.
- the focal length and the F number can be selected according to the size of the light receiving unit 154 and the viewing angle (FOV).
- It is preferable that the transmittance be high and the aberration be small at the central wavelength of the optical band pass filter 152 described later.
- Although a lens is used here as the imaging optical system 151, a reflective optical system or other optics may be used instead. It is preferable to reduce out-of-band background light by arranging the optical band pass filter 152 in the light path from the front surface of the imaging optical system 151 to the light receiving unit 154.
- the optical band pass filter 152 has a transmission band in a fixed width band centered on the wavelength peak of pulsed light.
- the width of the transmission band (half width of wavelength distribution of transmittance) is several nm to several tens nm.
- the width of the transmission band is preferably about 10 nm to 20 nm. In general, when operating outdoors, the operating temperature range widens and the peak wavelength of the pulsed light changes with the temperature, so the pulsed light distribution needs to be within the transmission band at least in the operating temperature range.
- the temperature shift of the peak wavelength is about 0.07 nm / degree
- the half width of the emission peak is about 1 nm
- the temperature shift of the central wavelength of the transmission band of the optical bandpass filter is 0.025 nm / degree.
- The relative shift between the peak wavelength and the central wavelength of the transmission band is about 5.6 nm even when an operating temperature range starting from −40 °C is taken into consideration, so an optical band pass filter with a transmission band of about 10 nm can be used.
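- The wavelength budget above can be checked with simple arithmetic; the +85 °C upper limit used here is an assumption, since only the −40 °C end of the range is stated.

```python
laser_shift_nm_per_degC = 0.07      # temperature shift of the laser peak wavelength
filter_shift_nm_per_degC = 0.025    # temperature shift of the filter center wavelength
emission_half_width_nm = 1.0        # half width of the emission peak
temp_span_degC = 125.0              # assumed -40 C to +85 C operating range
relative_shift_nm = (laser_shift_nm_per_degC - filter_shift_nm_per_degC) * temp_span_degC
print(relative_shift_nm)            # 5.625 nm, i.e. the "about 5.6 nm" in the text
# Adding the ~1 nm emission width, the pulse still fits inside a ~10 nm transmission band.
```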
- the optical band pass filter 152 may be incorporated in the imaging optical system 151.
- the optical axis of the imaging optical system 151 is in the Y-axis direction, and the light receiving unit 154 of the ToF sensor 153 is parallel to the XZ plane.
- the measurable area 11 determined by the imaging optical system 151 and the light receiving unit 154 of the ToF sensor 153 be equal to or smaller than the pulsed light irradiation area 10.
- FIG. 4 is a schematic view of the ToF sensor constituting the optical radar device shown in FIG. 1.
- The ToF sensor 153 is a sensor that receives the light from the linear portion of the pulsed light irradiation area 10 illuminated by the fan-shaped pulsed light and obtains the distance between that linear portion and the optical radar device 100 by measuring the time of flight.
- The ToF sensor 153 includes a light receiving unit 154 in which pixels Px(i, j) are arranged in a two-dimensional matrix, pixel storage elements Mx(j) to which electric pulses are supplied from the pixels Px(i, j) forming each column, and a signal processing circuit DS that reads the data stored in the pixel storage elements Mx(j) and acquires, for each pixel Px(i, j), at least distance information indicating the distance to an object.
- The pixel storage element Mx(j) has a plurality of binary counters that integrate the numbers of the electric pulses at mutually different timings, and the signal processing circuit DS can read the data stored in the pixel storage element Mx(j) in parallel with the integration by the binary counters.
- The ToF sensor 153 has functions of receiving the output of the horizontal sensor 140, controlling the pulsed light illumination system 110 based on the output value, and selecting the pixels Px(k, j) to receive light; it does not necessarily have to be configured as one chip, and may be a combination of the three-dimensional image element and a device that performs the other control functions.
- The pixels Px(i, j) are arranged in a two-dimensional matrix of M rows and N columns, and the light signal is projected by the imaging optical system 151 onto this M-row, N-column two-dimensional matrix.
- the shape of the pixel Px (i, j) is a square.
- The FOV of the measurable region is N × δθ in the horizontal direction and M × δθ in the vertical direction, where δθ denotes the viewing angle corresponding to one pixel.
- the pixels Px (i, j) are not all activated at one time. Since the pulse light irradiated toward the pulse light irradiation area 10 is the fan-shaped pulse light 124, only the pixels in the k-th row corresponding to the fan-shaped pulse light 124-k are activated. That is, when the fan-shaped pulsed light 124-k is irradiated, Px (k, j) is activated.
- That Px(k, j) is activated means that at least the output signal of Px(k, j) is transmitted to the signal storage processing unit 155; in addition, the operation of the other pixels Px(i, j) may be stopped and power may be supplied only to Px(k, j).
- The fan-shaped pulsed lights 124 are numbered from 1 to M from bottom to top, and the corresponding rows i of Px(i, j) are numbered from 1 to M from top to bottom. Since the order of the two is inverted via the imaging optical system 151, they are associated in this way. Note that this correspondence can be changed depending on the nature of the imaging optical system 151.
- the pixel Px (i, j) includes a light receiving element that performs photoelectric conversion, a circuit that transmits the output of the light receiving element to the signal storage processing unit 155, and the like.
- As the light receiving element, a photodiode such as an avalanche photodiode is used.
- a row selection circuit 161 is provided in the light receiving unit 154 as a circuit for selecting the pixels Px (k, j) of the k rows corresponding to the fan-shaped pulse light 124-k.
- a row selection line R (i) for transmitting the signal of the row selection circuit 161 to each pixel Px (i, j) is provided.
- the row selection line R (i) is not limited to a single signal line, and may be a plurality of signal lines having different polarities and voltages.
- the row selection circuit 161 selects k rows to be activated in synchronization with the operation of the one-dimensional scanning device 131 of the fan-shaped light irradiation system 123.
- a signal for synchronization is issued from the control unit 160.
- The signal storage processing unit 155 includes at least one pixel storage element Mx(j) corresponding to each column j, and the pixel storage element Mx(j) is connected to each pixel Px(i, j) of that column by the signal line Lx(j).
- the electric signal emitted from the pixel Px (k, j) is received via the signal line Lx (j) by the pixel memory element Mx (j), and the signal amount is stored in time series. That is, an electric pulse is supplied to the pixel memory element Mx (j) from each pixel forming the column of pixels, and data is accumulated.
- the signal storage processing unit 155 further includes a buffer memory Bx (j), a column signal line C (j), and a signal processing circuit DS.
- the data stored in the pixel storage element Mx (j) is read at a predetermined timing and copied to the buffer memory Bx (j) via the column signal line C (j).
- the signal processing circuit DS calculates and outputs at least the distance information D (k, j) and the two-dimensional image information G1 (k, j) and G2 (k, j) based on the information in the buffer memory Bx (j).
- G1 (i, j) is two-dimensional image information by background light
- G2 (i, j) can be two-dimensional image information by reflected light of pulse light, but is not limited thereto.
- The signal storage processing unit 155 may have a memory selection circuit 163 for selecting a part of the pixel storage elements Mx(j), and memory selection lines Rm.
- The control unit 160 determines the measurement order in the target area 12 (FIG. 3), sets the angle of the one-dimensional scanning device 131 (FIG. 2) according to this order, determines the row k of the light receiving unit 154 to activate, copies the data stored in the pixel storage elements Mx(j) to the buffer memories Bx(j), causes the signal processing circuit DS to sequentially calculate the distance information D(k, j) and the two-dimensional image information G1(k, j) and G2(k, j), and performs a series of controls such as storing them in a memory in the ToF sensor 153 or outputting them to the external system 170 (FIG. 1).
- The control unit 160 synchronously drives the pulsed light illumination system 110 and the ToF sensor 153 to acquire a 3D image of the target area 12. Further, as described below, it has a function of specifying, based on the output value θiy of the horizontal sensor 140, the fan-shaped pulsed light 124-k corresponding to the target area 12 and the rows of pixels Px(k, j) to be measured.
- When the optical radar device 100 is tilted forward by θiy, the actual horizontal plane is shifted by −θiy with respect to the center of the pulsed light irradiation area 10. Since the target area 12, the area actually to be measured, is the range of the angle θov centered on the horizontal plane, the one-dimensional scanning device 131 of the pulsed light illumination system 110 shifts the light irradiation range by −θiy from the center, and in synchronization with this the ToF sensor 153 measures, in the row direction of the light receiving unit 154, the rows in the range of ±(θov/δθ)/2 centered on the row shifted by −θiy/δθ from the center.
- It is preferable that the number M of rows of pixels Px(i, j) constituting the light receiving unit 154 be θv/δθ or more. In this case the entire pulsed light irradiation area 10 can be measured, so the entire target area 12 can always be measured even if the optical radar device 100 is inclined, as long as the target area 12 remains within the pulsed light irradiation area 10.
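- A minimal sketch of this row-selection rule follows, assuming rows are numbered downward from the top of the light receiving unit 154 and writing the vertical viewing angle of one pixel as δθ; the exact sign convention and the clamping to the sensor edges are implementation details not fixed by the text.

```python
def active_row_range(theta_iy_deg, theta_ov_deg, dtheta_deg, n_rows):
    """Rows to activate: centered on the row shifted by -theta_iy/dtheta from the
    sensor center, spanning +/-(theta_ov/dtheta)/2 rows. dtheta_deg is the
    assumed vertical viewing angle of one pixel."""
    center_row = n_rows / 2.0 - theta_iy_deg / dtheta_deg
    half_span = (theta_ov_deg / dtheta_deg) / 2.0
    first = max(0, int(round(center_row - half_span)))
    last = min(n_rows - 1, int(round(center_row + half_span)))
    return first, last

# Example: 120 rows, 0.5 degree per pixel, 20 degree target area, 3 degree forward tilt
print(active_row_range(theta_iy_deg=3.0, theta_ov_deg=20.0, dtheta_deg=0.5, n_rows=120))
```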
- The control unit 160 can also stop the function of monitoring the predetermined target area 12 with reference to the horizontal plane; in that case, the measurable area 11, which changes in accordance with the orientation of the main body, is monitored.
- When monitoring on the basis of the horizontal plane on a mountain road or the like with a large inclination, there are cases where only the road surface can be seen when climbing and only the sky when descending, so that forward monitoring cannot be performed correctly.
- In an apparatus such as an aircraft or a drone, it may be better to be able to monitor a fixed field of view with reference to the main body. In this way, the usability of the optical radar device can be increased by switching the function according to the application and the situation.
- The signal storage processing unit 155 of the ToF sensor 153 can have various configurations. As to the signals emitted by the pixels, there are circuits that handle analog signals and circuits that handle digital signals. As to measurement methods of flight time, there are various methods such as the direct method, the indirect method, the phase shift method, and TCSPC; any of these methods can be adopted in the present invention. Also, instead of the two-dimensional array of the light receiving unit described above, the ToF elements may be arranged in a line corresponding to one row and mechanically scanned in synchronization with the one-dimensional scanning device 131.
- the horizontal sensor 140 of the optical radar device 100 will be described below.
- The optical radar device 100 of the present embodiment is assumed to be used on land or at sea, and the pulsed light irradiation region 10 extends in the vertical direction. Further, for the case where the optical axis of the light receiving system 150 lies in the horizontal plane, correction of the inclination of the optical axis with respect to the horizontal plane will be described.
- The horizontal sensor 140 measures the inclination angle θiy of the optical axis with respect to the horizontal plane, and outputs the result to the ToF sensor 153.
- For example, a three-axis accelerometer attached on a plane parallel to the XZ plane of the coordinate axes shown in FIG. 1 measures the gravitational acceleration, and from the result (Gx, Gy, Gz) (in coordinates fixed to the horizontal sensor 140) the inclination angle θiy can be obtained.
- Alternatively, an inclinometer may be provided on a plane parallel to the XY plane, and the inclination angle in the Y-axis direction may be taken as θiy.
- The horizontal sensor 140 need not necessarily output θiy itself, and may output sin(θiy), cos(θiy), or both.
- The horizontal sensor 140 is preferably installed near the ToF sensor 153, because the farther it is from the ToF sensor 153, the more the positional relationship between the ToF sensor 153 and the horizontal sensor 140 can be distorted by vibration of the main body or the like. Further, it is preferable that the horizontal sensor 140 be mounted on the same substrate as the ToF sensor 153. When mounted on a different substrate, the positional relationship between the two may also be distorted by vibration or the like, and the measured value of θiy may deviate from the actual inclination angle of the optical axis with respect to the horizontal plane.
- θiy is measured in each frame, and based on the measured value the one-dimensional scanning device 131 of the pulsed light illumination system 110 scans the range of θov around a line shifted by −θiy from the center of the light irradiation area. In synchronization, the ToF sensor 153 measures the rows in the range of ±(θov/δθ)/2 around the row shifted by −θiy/δθ from the center in the row direction of the light receiving unit 154. In this way, the target area 12, the range of θov centered on the actual horizontal plane, can always be measured.
- the target area 12 can be kept constant no matter how the light radar device 100 is inclined.
- the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device 100, so that the object can be recognized easily and accurately. For example, the inclination of the road and the passage in front of the main body can be correctly recognized without complicated correction on the main body side, and the height of the obstacle can also be measured accurately.
- The same function can also be realized by inclining the optical radar device 100 itself by −θiy with respect to the main body, according to the θiy measured by the horizontal sensor.
- However, in that case the entire optical radar device 100 must be mechanically moved with high speed and high accuracy for each frame, resulting in a device that is large, vulnerable to vibration, and very expensive.
- Embodiment 2: The present embodiment is an application example of the present invention for cases in which rotation around the optical axis is a problem, and includes measures against rotational fluctuation around the Y axis of the optical radar device 100, which were not dealt with in Embodiment 1.
- the components of the optical radar device 100a according to this embodiment are the same as those of the optical radar device 100 of the first embodiment shown in FIGS. 1 and 4, but the characteristics of the respective components are different as described below.
- The elements of the present embodiment are distinguished from those of Embodiment 1 by appending "a" to the end of the reference numerals. Only differences from Embodiment 1 are described below.
- The horizontal sensor 140 measures not only the forward inclination θiy of the optical radar device 100a of FIG. 1 but also the left-right inclination amount θix.
- θix corresponds to the amount of rotation about the Y axis.
- For example, when a three-axis accelerometer provided in a plane parallel to the XZ plane in FIG. 1 measures the gravitational acceleration (Gx, Gy, Gz), the inclination angles are obtained as θiy = Arctan(Gy/Gz) and θix = Arctan(Gx/Gz).
- Alternatively, an inclinometer may be provided on a plane parallel to the XY plane in FIG. 1, and the inclination angle in the Y-axis direction may be taken as θiy and the inclination angle in the X-axis direction as θix.
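- The accelerometer-based tilt estimate above can be sketched as follows; atan2 is used instead of a plain arctangent so that the sign is preserved, which is an implementation choice rather than something stated in the text.

```python
import math

def tilt_from_gravity(gx, gy, gz):
    """Tilt angles from a gravity reading (Gx, Gy, Gz) in the sensor frame:
    theta_iy = Arctan(Gy/Gz) (forward tilt), theta_ix = Arctan(Gx/Gz) (lateral tilt)."""
    theta_iy = math.degrees(math.atan2(gy, gz))
    theta_ix = math.degrees(math.atan2(gx, gz))
    return theta_iy, theta_ix

# Example: device pitched slightly forward, no lateral tilt
print(tilt_from_gravity(0.0, 0.17, 9.80))   # ~ (1.0, 0.0) degrees
```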
- FIG. 5 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the light radar device 100a.
- The fan-shaped light irradiation system 123a of the pulsed light illumination system 110a emits fan-shaped pulsed light 124a and, as shown in FIG. 5, the pulsed light irradiation area 10a is wider than the target area 12, including in the horizontal spread angle θh. Therefore, for the same target area 12, it is preferable that the horizontal spread angle of the fan-shaped beam generator 132a be larger. It is also preferable that not only the number of rows but also the number of columns of the light receiving unit 154a be increased, and accordingly that the number of columns of the signal storage processing unit 155a be increased. The signal storage processing unit 155a also has a memory that can store the distance information D(i, j) and the two-dimensional image information G1(i, j) and G2(i, j), which are the measurement results of the pixels corresponding to the measurement area 13a in FIG. 5.
- The control unit 160a determines the measurement area 13a based on θiy and θix measured in each frame.
- The measurement area 13a is the part of the measurable area 11a of the optical radar device 100a that covers the entire target area 12 to be measured, and corresponds to a part of the light receiving unit 154a.
- The target area 12 extends, centered on the origin, in directions parallel to the X and Z axes, whereas the measurement area 13a is shifted by θiy with respect to the Y axis in accordance with the inclination of the optical radar device 100a and is also inclined by θix.
- the control unit 160a irradiates a portion corresponding to the measurement area 13a with pulsed light, acquires a 3D image, and stores the 3D image in a memory in the signal storage processing unit 155a.
- The control unit 160a reads the measurement results corresponding to the target area 12 from the memory and outputs them to the external system 170. In this way, data for a constant target area can be obtained for the actual target regardless of the values of θiy and θix.
- In order to completely cover the target area 12, the measurement area 13a has a width θsh in the horizontal direction and a width θsv in the vertical direction, centered on the direction shifted upward by −θiy from the center of the measurable area 11a.
- θsh and θsv are given by the following equations (1) and (2).
- θsh = θoh × cos(θix) + θov × sin(θix) … (1)
- θsv = θoh × sin(θix) + θov × cos(θix) … (2)
- Therefore, assuming that the allowable maximum value of θiy is ±θmy and the allowable maximum value of θix is ±θmx, it is preferable that θh and θv satisfy the following expressions (3) and (4). Under this condition, as long as θiy and θix remain within their maximum allowable ranges, the entire target area 12 can be monitored.
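- Equations (1) and (2) can be evaluated directly; the following sketch simply restates them in code.

```python
import math

def measurement_area_deg(theta_oh_deg, theta_ov_deg, theta_ix_deg):
    """Equations (1) and (2): size of the measurement area 13a needed to fully
    cover the target area when the device is rotated by theta_ix about the optical axis."""
    t = math.radians(abs(theta_ix_deg))
    theta_sh = theta_oh_deg * math.cos(t) + theta_ov_deg * math.sin(t)
    theta_sv = theta_oh_deg * math.sin(t) + theta_ov_deg * math.cos(t)
    return theta_sh, theta_sv

# Example: 90 x 20 degree target area, 5 degree rotation about the optical axis
print(measurement_area_deg(90.0, 20.0, 5.0))   # ~ (91.4, 27.8) degrees
```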
- The values calculated by equations (5) and (6) are rounded to the nearest integers, and the data of the corresponding row number and column number are assigned.
- Alternatively, the data for the desired direction may be obtained by interpolation or extrapolation from a plurality of data items whose row and column numbers lie before and after the values calculated by equations (5) and (6).
- When the allowable range of the tilt angle of the optical radar device 100a is such that the Y axis stays within ±θmy and the X axis within ±θmx, θh and θv representing the measurable area 11a should be set so as to satisfy equations (3) and (4).
- The target area 12 can then be kept constant no matter how the optical radar device 100a tilts.
- the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device 100a, so that an object can be recognized easily and accurately. For example, the inclination of the road and the passage in front of the main body can be correctly recognized without complicated correction on the main body side, and the height of the obstacle can also be measured accurately.
- The optical radar device 100a itself may also be inclined with respect to the main body according to θiy and θix measured by the horizontal sensor 140 to realize the same function.
- However, in that case the entire optical radar device 100a must be mechanically moved with high speed and high accuracy for each frame, resulting in a device that is large, fragile against vibration, and extremely expensive.
- Embodiment 3: The present embodiment differs from Embodiments 1 and 2 in that pulsed light is irradiated onto the measurement region by a two-dimensional scanning device.
- the configuration of the optical radar device 100b according to the present embodiment will be described using FIGS. 6 and 7. Description of the same points as the first and second embodiments will be omitted.
- FIG. 6 is a schematic view showing the configuration of an optical radar device 100b according to Embodiment 3 of the present invention.
- The optical radar device 100b includes a pulsed light illumination system 110b for irradiating a pulsed light irradiation area 10b with spot pulsed light 124b (hereinafter sometimes referred to simply as pulsed light), a light receiving system 150b that receives light from at least a part of the pulsed light irradiation area 10b, and a horizontal sensor 140.
- The pulsed light illumination system 110b includes a light source 122 for emitting pulsed light, a collimated light generator 130, and a two-dimensional scanning device 123b that irradiates the entire pulsed light irradiation area 10b by two-dimensionally scanning the spot pulsed light.
- the light receiving system 150b includes a ToF sensor 153b, an imaging optical system 151b that forms an image on the light receiving unit 154b of the ToF sensor 153b, an optical band pass filter 152, and a control unit 160b.
- The control unit 160b also performs functions such as signal reception from the horizontal sensor 140, control of the pulsed light illumination system 110b, control of the ToF sensor 153b, and communication with the external system 170.
- the horizontal sensor 140 constantly monitors the inclination angle of the light radar device 100b with respect to the horizontal plane, and outputs it to the ToF sensor 153b.
- Pulsed light illumination system 110b: The pulsed light illumination system 110b of the optical radar device 100b will be described below.
- The pulsed light illumination system 110b includes the collimated light generator 130, which collimates the light from the light source 122 into substantially parallel spot light 133, and the two-dimensional scanning device 123b, which swings the spot light 133 in the X-axis and Z-axis directions.
- the two-dimensional scanning device 123b includes a scanning mirror 131b.
- The scanning mirror 131b rotates around the Z axis to control the swing of the spot pulsed light 124b in the X-axis direction, and rotates around one axis in the XY plane (inclined 45 degrees to the X axis) to control the swing in the Z-axis direction.
- The two-dimensional scanning device 123b includes, for example, a MEMS mirror element and its control device. Control of the two-dimensional scanning device 123b is included in the control unit 160b.
- the two-dimensional scanning device 123b may have another configuration such as a polygon mirror or a liquid crystal waveguide method, in addition to the MEMS mirror element.
- The spot light 133 is reflected by the scanning mirror 131b and irradiated onto the pulsed light irradiation area 10b as spot pulsed light 124b. Therefore, the spot light 133 and the spot pulsed light 124b have substantially the same divergence angle.
- The divergence angle may differ between the X-axis direction and the Y-axis direction, but the following describes the case of a circular beam in which both are substantially equal.
- The divergence angle θb limits the angular resolution of the optical radar device 100b; θb is about 0.05 degree to about 1 degree.
- Light receiving system 150b: The light receiving system 150b of the optical radar device 100b will be described below.
- The reflected light 125b from the object is reflected by the scanning mirror 131b and the mirror 156, and is condensed by the imaging optical system 151b onto the light receiving unit 154b of the ToF sensor 153b via the optical band pass filter 152.
- the mirror 156 is located in the optical path of the spot light 133 but has an opening 157 at the center, and the spot light 133 passes through the opening 157.
- FIG. 7 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device 100b.
- the ToF sensor 153b is a sensor for capturing the reflected light from the circular portion illuminated by the spot pulse light 124b by the light receiving unit 154b and determining the distance between the circular portion and the optical radar device 100b by time of flight measurement.
- the control unit 160b determines the measurement order in the target area 12, sets the angle of the two-dimensional scanning device 123b in accordance with this order, instructs the light source 122 to emit light, and simultaneously activates the ToF sensor 153b. Measure flight time.
- When setting the angle of the two-dimensional scanning device 123b, the control unit 160b determines the set angle based on the outputs θiy and θix of the horizontal sensor 140, so that the target area can always be measured regardless of the inclination of the optical radar device 100b. For example, as shown in FIG. 7, when the spot pulsed light 124b is to be irradiated in the direction (ηx, ηz) in the target area 12, the control unit instructs the two-dimensional scanning device 123b to set the scanning mirror 131b so that the spot pulsed light 124b is irradiated in the direction (Φx, Φz).
- Φx is expressed by the following equation (7), and Φz by the following equation (8).
- Φx = θiy × sin(θix) + ηx × cos(θix) − ηz × sin(θix) … (7)
- Φz = −θiy × cos(θix) + ηz × cos(θix) + ηx × sin(θix) … (8)
- When the optical radar device 100b is not inclined, scanning in the X-axis direction is performed by rotation around the Z axis, the angle in the Z-axis direction is then changed, and scanning in the X-axis direction is repeated; the two axes are not moved at the same time. However, if the X axis of the optical radar device 100b is inclined, the two axes must be changed simultaneously.
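- A sketch of the direction correction of equations (7) and (8) follows; the signs are those of the reconstruction above and should be checked against the original formulas.

```python
import math

def device_direction(eta_x_deg, eta_z_deg, theta_iy_deg, theta_ix_deg):
    """Map a target-area direction (eta_x, eta_z), defined with respect to the
    horizontal plane, to the direction (phi_x, phi_z) at which the scanning mirror
    131b must aim in device coordinates (equations (7) and (8) as reconstructed)."""
    t = math.radians(theta_ix_deg)
    phi_x = theta_iy_deg * math.sin(t) + eta_x_deg * math.cos(t) - eta_z_deg * math.sin(t)
    phi_z = -theta_iy_deg * math.cos(t) + eta_z_deg * math.cos(t) + eta_x_deg * math.sin(t)
    return phi_x, phi_z

# With no tilt the mapping is the identity:
print(device_direction(10.0, 5.0, 0.0, 0.0))   # (10.0, 5.0)
```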
- the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device 100b, so that the object can be recognized easily and accurately. For example, the inclination of the road and the passage in front of the main body can be correctly recognized without complicated correction on the main body side, and the height of the obstacle can also be measured accurately.
- Embodiment 4: The present embodiment differs from Embodiments 1 and 2 in that pulsed light is irradiated onto the measurement region at one time by a diffusion irradiation device.
- the configuration of the optical radar device 100c according to the present embodiment will be described with reference to FIGS. 8 and 9. Description of the same points as the first and second embodiments will be omitted.
- FIG. 8 is a schematic view showing the configuration of an optical radar device 100c according to Embodiment 4 of the present invention.
- The optical radar device 100c includes a pulsed light illumination system 110c for collectively irradiating a pulsed light irradiation area 10c with pulsed light 124c (hereinafter sometimes referred to simply as pulsed light), a light receiving system 150c that receives light from at least a part of the pulsed light irradiation area 10c, and a horizontal sensor 140.
- The light receiving system 150c includes a ToF sensor 153c, an imaging optical system 151 that forms an image on the light receiving unit 154c of the ToF sensor 153c, an optical band pass filter 152, and a control unit 160c that bears functions such as synchronous control of the pulsed light illumination system 110c and the ToF sensor 153c, readout of the output, and communication with the external system 170.
- the horizontal sensor 140 constantly monitors the inclination angle of the light radar device 100c with respect to the horizontal plane, and outputs it to the control unit 160c.
- Pulsed light illumination system 110c: The pulsed light illumination system 110c of the optical radar device 100c will be described below.
- the pulsed light illumination system 110c includes a light source 122 that emits pulsed light, and a diffusion optical system 123c that spreads the pulsed light in two dimensions and irradiates it.
- the diffusion optical system 123c is an optical device that disperses light in a predetermined range and irradiates the light, and is configured of a diffusion plate, a diffraction grating, a cylindrical lens, and the like.
- The pulsed light 124c can be irradiated collectively onto the pulsed light irradiation area 10c having the horizontal irradiation angle θh and the vertical irradiation angle θv.
- the light receiving system 150c of the light radar device 100c will be described below.
- The light receiving system 150c is the same as in Embodiment 2 except for the ToF sensor 153c. It is preferable that the measurable area 11c determined by the imaging optical system 151 and the light receiving unit 154c be equal to or smaller than the pulsed light irradiation area 10c; here the case where both are equal is described. Also, the measurable area 11c is larger than the target area 12 for which measurement results are actually output, and for the horizontal spread angle θoh and the vertical spread angle θov of the target area 12, it is preferable that θh > θoh and θv > θov.
- FIG. 9 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device 100c.
- the light receiving unit 154c of the ToF sensor 153c includes a group of pixels arranged in a two-dimensional array, and can cover the measurable area 11c.
- While the ToF sensors 153 and 153a perform time-of-flight measurement row by row, the ToF sensor 153c performs time-of-flight measurement of all pixels in parallel.
- Such ToF sensors are known and will not be described in detail.
- The control unit 160c sends a signal to the pulsed light illumination system 110c to emit the pulsed light 124c, and simultaneously starts the time-of-flight measurement of the ToF sensor 153c.
- the measurement results of all the pixels are temporarily stored in the memory (not shown) of the ToF sensor 153c. That is, as shown in FIG. 9, all data in the measurable area 11c is stored in the memory.
- The control unit 160c selects only the part corresponding to the target area 12 from these results and outputs it to the external system 170.
- The measurement results corresponding to the direction (ηx, ηz) in the target area 12 are the data at the direction (Φx, Φz) of the measurable area 11c given by equations (7) and (8) described in Embodiment 3.
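- This selection step can be sketched as a nearest-pixel lookup; the pixel indexing (rows counted downward from the top, a per-pixel angle δθ) is an assumption made only for illustration.

```python
import math
import numpy as np

def read_target_direction(data, eta_x_deg, eta_z_deg, theta_iy_deg, theta_ix_deg, dtheta_deg):
    """data: (M, N) per-pixel results covering the measurable area 11c.
    Convert the target-area direction (eta_x, eta_z) with equations (7)/(8) and
    return the value of the nearest pixel."""
    t = math.radians(theta_ix_deg)
    phi_x = theta_iy_deg * math.sin(t) + eta_x_deg * math.cos(t) - eta_z_deg * math.sin(t)
    phi_z = -theta_iy_deg * math.cos(t) + eta_z_deg * math.cos(t) + eta_x_deg * math.sin(t)
    m, n = data.shape
    row = int(round(m / 2.0 - phi_z / dtheta_deg))   # rows assumed to grow downward
    col = int(round(n / 2.0 + phi_x / dtheta_deg))
    return data[row, col]

# Example lookup on dummy data (120 x 160 pixels, 0.5 degree per pixel)
dummy = np.arange(120 * 160, dtype=float).reshape(120, 160)
print(read_target_direction(dummy, 10.0, 0.0, 2.0, 1.0, dtheta_deg=0.5))
```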
- the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device 100c, so that the object can be recognized easily and accurately. For example, the inclination of the road and the passage in front of the main body can be correctly recognized without complicated correction on the main body side, and the height of the obstacle can also be measured accurately.
- In the present embodiment, the pulsed light illumination system 110c has no movable part, and after measurement is performed independently of the inclination, only the target area is selected from the stored measurement results, so the system is very simple and the cost can be reduced.
- An optical radar device according to Aspect 1 of the present invention is an optical radar device (100, 100a, 100b, 100c) that measures the distance to an object present in a predetermined target area 12 by a time-of-flight method.
- The optical radar device includes: a light source 122 for emitting pulsed light; a pulsed light illumination system (110, 110a) for irradiating at least a part of the target area 12 with the pulsed light 124 emitted from the light source 122; a light receiving system (150, 150a) for detecting the reflected light of the pulsed light 124 irradiated by the pulsed light illumination system (110, 110a) from the object and measuring the distance to the object; and a horizontal sensor 140 for detecting the inclination angle of the optical axis of the light receiving system (150, 150a) with respect to the horizontal plane. The light receiving system (150, 150a) corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle θiy detected by the horizontal sensor 140.
- According to the above configuration, the light receiving system corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor, so the distance to the object can be measured taking the actual horizontal plane into account. For this reason, the target area in which the object exists is kept constant even if the optical radar device itself is inclined. That is, the positional relationship of the optical radar device with respect to the vertical line is constant.
- the distance to the object can be accurately measured by a simple method of correcting the inclination of the optical axis of the light receiving system with respect to the horizontal plane.
- the distance to the object can always be measured in a fixed direction by a simple method.
- the external device can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device, so that the object can be recognized easily and accurately.
- the light radar device may correct the light irradiation range of the pulsed light illumination system (110, 110a) by the inclination angle detected by the horizontal sensor 140.
- According to the above configuration, the signal intensity can be enhanced to improve the measurement accuracy, or unnecessary light irradiation can be reduced to lower the power consumption.
- The horizontal sensor 140 detects the inclination angle θiy when the optical axis (Y axis) is inclined forward with respect to the horizontal surface.
- According to the above configuration, the horizontal sensor detects the tilt angle when the optical axis tilts forward with respect to the horizontal plane, and the light receiving system corrects the forward tilt of its optical axis with respect to the horizontal plane by the tilt angle detected by the horizontal sensor. That is, the target area in which the object is present is kept constant even if the optical radar device itself is inclined forward with respect to the horizontal plane.
- The optical radar device according to another aspect of the present invention is the optical radar device according to any one of Aspects 1 to 3, wherein the horizontal sensor 140 may detect the inclination angle θix when the optical axis (Y axis) is inclined in the lateral direction with respect to the horizontal surface.
- According to the above configuration, the horizontal sensor detects the inclination angle when the optical axis is inclined in the left-right direction with respect to the horizontal plane, and the light receiving system can correct the lateral inclination of the optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor. That is, the target area in which the object is present is kept constant even if the optical radar device itself is inclined in the lateral direction with respect to the horizontal plane.
- In the optical radar device according to another aspect, the pulsed light irradiation area 10 of the pulsed light illumination system (110, 110a) may have a size including at least the entire target area 12.
- According to the above configuration, since the pulsed light irradiation area of the pulsed light illumination system has a size including at least the entire target area, the entire target area can always be measured even if the optical radar device is inclined.
- In the optical radar device according to another aspect, the measurement area θov of the light receiving system (150, 150a) may have a size including at least the entire target area 12.
- According to the above configuration, since the measurement region of the light receiving system has a size including at least the entire target region, the entire target region can always be measured even if the optical radar device is inclined.
- In the optical radar device according to another aspect, the pulsed light illumination system (110, 110a) may include a one-dimensional scanning device 131 that converts the pulsed light 124 emitted from the light source 122 into spot light and irradiates the target area 12 with the spot light.
- the one-dimensional scanning device 131 preferably irradiates the spot light in the vertical direction with respect to the target area 12.
- the pulse light emitted from the light source can be irradiated to the target area for scanning with a simple configuration of a one-dimensional scanning device.
- the light receiving system (150, 150a) may include a two-dimensional pixel array.
- the pulsed light illumination system (110c) may include a two-dimensional scanning device that irradiates the target area 12 with the pulsed light 124 emitted from the light source 122.
- the target area can be scanned at one time.
- the pulsed light 124 may be infrared light.
- An optical radar device is the optical radar device according to any one of the above aspects 1 to 9, wherein the three-dimensional image element includes: a light receiving unit 154 in which the pixels Px(i,j) are arranged in a two-dimensional matrix; a pixel storage element Mx(j) to which electric pulses are supplied from each pixel constituting a column of the pixels Px(i,j); and a signal processing circuit DS which reads out the data stored in the pixel storage element Mx(j) and acquires, for each pixel Px(i,j), at least distance information indicating the distance to an object. The pixel storage element Mx(j) includes a plurality of binary counters that integrate the numbers of the electric pulses at mutually different timings, and the signal processing circuit DS executes reading of the data stored in the pixel storage element Mx(j) in parallel with the integration by the binary counters.
- the present invention is not limited to the above-described embodiments, and various modifications can be made within the scope of the claims.
- embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention.
- new technical features can be formed by combining the technical means disclosed in each embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The present invention enables the position of an object to be accurately recognized with a small amount of data processing. This optical radar device (100) includes: a pulsed-light illumination system (110) that projects pulsed light onto an object; a light reception system (150) that detects the light reflected from the object and thereby measures the distance to the object; and a horizontal sensor (140) that detects an inclination angle with respect to a horizontal plane of the light axis of the light reception system (150). The light reception system (150) corrects the inclination, with respect to the horizontal plane, of the light axis of the light reception system (150) by the inclination angle amount detected by the horizontal sensor (140).
Description
The present invention relates to an optical radar device, and more particularly to an optical radar device for acquiring a three-dimensional image mainly composed of a two-dimensional image of an object and distance information on the object.
The three-dimensional image is a concept that includes distance information to the object in the field of view in addition to two-dimensional images like ordinary photographs, and in recent years, three-dimensional image sensors have been used for peripheral recognition of automobiles and robots. It is extremely important. As sensors for two-dimensional images, charge coupled devices (CCDs) and complementary metal oxide semiconductors (CMOSs) imagers are in widespread use, and light intensity is converted into an electrical signal by a photodiode made of silicon for imaging. As for the measurement of distance information with high accuracy, a method of irradiating a laser beam and measuring a time-of-flight until the laser beam is reflected and returned from an object is spreading.
As a method of irradiating the entire field of view with laser light, a scanning type in which a laser beam focused in a point shape (see Non-Patent Document 1) or a band (see Patent Document 1) is scanned by a mirror etc. There is a batch irradiation type in which a laser beam is spread and irradiated, and many scan types in which a strong beam intensity can be easily obtained on an object have been developed. The scan type is expensive and bulky because it requires a mechanical mechanism to swing the beam. On the other hand, the batch irradiation type does not require a mechanical mechanism to scan, so it is easy to miniaturize, but since the laser light intensity on the object becomes weaker than the scan type, when the distance to the object increases, The signal strength decreases and the distance measurement accuracy decreases.
As for time-of-flight measurement, since the time measurement accuracy directly determines the distance accuracy, there is a method in which pulsed laser light is emitted a plurality of times, the time from light emission to light reception is measured repeatedly, a histogram (horizontal axis: time, vertical axis: frequency) is constructed, and the flight time is determined from it. This technique is called TCSPC (Time-Correlated Single-Photon Counting). A SPAD (Single-Photon Avalanche Diode) is used as the light receiving element. Because the circuit scale per pixel is large, this method is not used in imagers in which pixels are two-dimensionally arrayed on a large scale, and is mainly used in combination with the scan type (see Patent Document 2 and Non-Patent Document 1).
On the other hand, in the batch irradiation type, the current of the photodiode is measured and compared with a decision value to determine the flight time. In some cases, the current is sequentially accumulated in capacitors arranged in time series, and the determination is made from the accumulated amounts. With this scheme, a three-dimensional image is formed by a single laser irradiation, so that, like a photograph taken with a flash, simultaneity is ensured over the entire field of view. This is very different from the scan type, in which each point in the field of view is measured at a different time, and is therefore described as a "flash" type (see Patent Document 3 and Patent Document 4).
However, the above-described conventional techniques have the following problems.
Since an optical radar device is used attached to a moving object (main body, moving body) such as a vehicle, a robot, a drone, or a cane for the visually impaired, the direction of the target field of view often changes from moment to moment. For example, as the main body moves, the inclination of the ground or of the main body changes, and the center (optical axis) of the target field of view of the optical radar device may swing up and down. The field of view may also rotate around the optical axis. When there is an up-and-down swing, for example, an object that is actually at the front center may be recognized as being above or below its real position, or a stationary object may be recognized as moving up and down. When the field of view rotates around the optical axis, a flat passage may be misrecognized as an inclined one, or inclined ground may be misrecognized as flat. To avoid such misrecognition, a separate horizontal sensor or two-axis accelerometer must be provided on the main body to monitor how the main body is inclined with respect to the vertical line, and the 3D image information output by the optical radar device must be reinterpreted on the basis of that inclination information.
When the main body is a large vehicle equipped with a high-performance CPU, such data processing can be done without a large time delay, but for a smaller moving body that does not have a powerful CPU, this data processing becomes a major obstacle. In addition, when a large vehicle is equipped with a plurality of optical radar devices, the inclination from the vertical line may differ for each device; the larger the main body, the larger the amount of deformation, as long as the body is not perfectly rigid.
An object of one aspect of the present invention is to realize an optical radar device whose positional relationship with respect to the vertical line is constant and which can output a 3D image from which the position of an object can be accurately recognized with a small amount of data processing.
In order to solve the above problems, an optical radar device according to one aspect of the present invention is an optical radar device that measures the distance to an object present in a predetermined target area by a time-of-flight method, and includes: a light source that emits pulsed light; a pulsed light illumination system that irradiates at least a part of the target area with the pulsed light emitted by the light source; a light receiving system that detects the reflected light of the pulsed light from the object and measures the distance to the object; and a horizontal sensor that detects the inclination angle of the optical axis of the light receiving system with respect to the horizontal plane, wherein the light receiving system corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor.
According to one aspect of the present invention, the positional relationship of the 3D image output by the optical radar device with respect to the horizontal plane is constant, so that a 3D image from which the position of an object can be accurately recognized with a small amount of data processing can be output.
Embodiment 1
An embodiment of the present invention is described below. Hereinafter, for convenience of explanation, configurations having the same functions as those described in a specific embodiment are denoted by the same reference numerals, and their description may be omitted.
(Outline of the optical radar device)
FIG. 1 is a schematic view of the optical radar device 100 according to Embodiment 1.
As shown in FIG. 1, the optical radar device 100 includes a pulsed light illumination system 110 that irradiates a pulsed light irradiation area 10 with fan-shaped pulsed light 124 (hereinafter sometimes referred to simply as pulsed light), a light receiving system 150 that receives light from at least a part of the pulsed light irradiation area 10, and a horizontal sensor 140.
The pulsed light illumination system 110 includes a light source 122 that emits pulsed light, and a fan-shaped light irradiation system 123 that irradiates the pulsed light irradiation area 10 with the pulsed light emitted by the light source 122.
The fan-shaped light irradiation system 123 converts the pulsed light from the light source 122 into band-shaped pulsed light of narrow width spread in the horizontal direction, and irradiates the entire pulsed light irradiation area 10 with pulsed light (fan-shaped pulsed light 124) by scanning this band-shaped pulsed light one-dimensionally in the vertical direction.
The light receiving system 150 includes a ToF sensor (including a three-dimensional image element) 153, an imaging optical system 151 that forms an image on the light receiving unit 154 of the ToF sensor 153, and an optical band pass filter 152. The ToF sensor 153 also includes functions such as control of the pulsed light illumination system 110 and communication with the external system 170.
The horizontal sensor 140 constantly monitors the inclination angle of the light axis of the light receiving system 150 with respect to the horizontal plane, and outputs the monitoring result to the ToF sensor 153. In this embodiment, the case of correcting the inclination of the light axis of the light receiving system 150 with respect to the horizontal plane will be described.
In the present embodiment, in the coordinate setting shown in FIG. 1 and FIG. 2, the Z axis is the vertical direction and the X-Y plane is the horizontal plane. The coordinate axes fixed to the environment are denoted by X, Y, Z, and the coordinate axes fixed to the optical radar device 100 are denoted by Xr, Yr, Zr. When the optical radar device 100 is not tilted, the two coincide. The Y-axis direction is the front of the optical radar device 100; the case where the Yr axis is inclined with respect to the Y axis is called forward inclination, and the case where the Xr axis is inclined with respect to the X axis is called lateral (left-right) inclination. In the following description of each component, unless otherwise noted, it is assumed that the optical radar device 100 is not inclined in any direction with respect to the horizontal plane, that the Y-axis direction is the center of the pulsed light irradiation area 10, and that the Y-axis direction is the optical axis direction of the light receiving system 150. Furthermore, the present embodiment deals with the case where the lateral inclination of the optical radar device 100 can be ignored.
(Pulsed light illumination system 110)
The pulsed light illumination system 110 of the optical radar device 100 is described below with reference to FIGS. 2 and 3. FIG. 2 is a schematic view showing the configuration of the fan-shaped light irradiation system 123 of the pulsed light illumination system 110 shown in FIG. 1. FIG. 3 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device shown in FIG. 1.
First, the fan-shaped pulsed light 124 emitted by the fan-shaped light irradiation system 123 is described. The fan-shaped pulsed light 124 spreads like a fan in the horizontal direction, and its spread angle is denoted by θh. The spread angle in the vertical direction is small; this beam thickness (half width) is denoted by θw, where θh >> θw. By scanning the fan-shaped pulsed light 124 in the vertical direction within a vertical irradiation angle θv, the pulsed light irradiation area 10 with horizontal spread angle θh and vertical spread angle θv can be sequentially irradiated, where θv > θw. When it is necessary to distinguish the fan-shaped pulsed lights 124 irradiated at different vertical angles, they are denoted 124-1 to 124-Ns, where Ns is the total number of scans in the vertical direction.
When θh > θv, many of the objects in the field of view can be detected by the first few scans, without completing all the vertical scans. On land, it is rare for an obstacle to be detected to float in the air; obstacles usually stand vertically from the road surface or the ground, so if, for example, fan-shaped pulsed light is irradiated from a low height exactly parallel to the horizontal plane, most obstacles can be detected. The advantage is therefore that obstacles can be detected quickly, before the end of one field period. On the other hand, in applications where the road surface is to be monitored far ahead, θh ≤ θv may be used; the relation can be changed according to the application.
The fan-shaped pulsed light 124 is preferably uniform within the target field of view. However, since the detection sensitivity is higher where the light intensity is stronger, if there is a place within the target field of view that requires particular attention, the light intensity distribution may be shaped so that the intensity in that vicinity is strengthened.
Next, the configuration of the fan-shaped light irradiation system 123 will be described.
As shown in FIG. 2, the fan-shaped light irradiation system 123 includes at least a collimated light generator 130 that shapes the light of the light source 122 (FIG. 1) into substantially parallel spot light 133 (in the Y-Z plane), a one-dimensional scanning device 131 that swings the spot light 133 in the vertical direction (Z direction), and a fan-shaped beam generator 132 that spreads, into a fan shape, the spot light whose vertical traveling angle has been changed by the one-dimensional scanning device 131.
The one-dimensional scanning device 131 is configured of a MEMS mirror element or the like having a reflective surface that rotates about one axis (X axis) in a horizontal plane (XY plane).
The fan-shaped beam generator 132 includes, for example, a Powell lens. For example, the collimated light generator 130 forms spot light 133 with a divergence angle of about 1.0 degree and a diameter of about 3 mm at the entrance of a Powell lens with an aperture of 8.9 mm, and the one-dimensional scanning device 131, consisting of a MEMS mirror element, swings the laser light by ±20 degrees with respect to the horizontal plane. Since the Powell lens radiates the laser light at θh = 90 degrees and θw = 1 degree, this fan-shaped pulsed light 124 can irradiate a range of θh = 90 degrees and θv = 40 degrees.
In the light path of the fan-shaped light irradiation system 123 in FIG. 2, the light passes through the one-dimensional scanning device 131 and then through the fan-shaped beam generator 132; conversely, it may pass through the fan-shaped beam generator 132 first and then through the one-dimensional scanning device 131. Here, θv is preferably larger than the measurement region θov of the optical radar device 100 in the vertical direction. For example, as shown in FIG. 3, when the optical radar device 100 is tilted downward by θiy (forward inclination angle, with downward defined as negative), the actual horizontal plane is shifted by -θiy with respect to the center of the pulsed light irradiation area 10 of the optical radar device 100. Since the target area 12, which is the area to be actually measured, is the range of angle θov centered on the horizontal plane, the one-dimensional scanning device 131 of the pulsed light illumination system 110 preferably scans the range θov with the light irradiation area shifted from the center by -θiy. Therefore, if the allowable maximum value of θiy is ±θmy, it is preferable that θv ≥ θov + 2·θmy. For example, if θov = 20 degrees and θmy = 10 degrees, θv is preferably 40 degrees or more. Although the case where the target area 12 spreads around the horizontal plane is described here, the same applies when the target area 12 faces downward or upward with respect to the horizontal plane; that is, the Y axis may be set off the horizontal plane so as to point at the center of the target area 12. The same applies hereinafter.
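As a minimal numerical sketch of the above relationship (the values θov = 20 degrees and θmy = 10 degrees are the example values from the text, while the tilt reading θiy = -5 degrees is an illustrative assumption):

```python
# Sketch: required vertical irradiation angle and tilt-shifted scan range.
# Sign convention follows the text: a downward forward tilt is negative (theta_iy < 0).

def required_theta_v(theta_ov_deg: float, theta_my_deg: float) -> float:
    """Preferred vertical irradiation angle: theta_v >= theta_ov + 2 * theta_my."""
    return theta_ov_deg + 2.0 * theta_my_deg

def scan_range(theta_iy_deg: float, theta_ov_deg: float) -> tuple[float, float]:
    """Vertical scan range, centred at -theta_iy so the target area stays on the real horizontal plane."""
    center = -theta_iy_deg
    return (center - theta_ov_deg / 2.0, center + theta_ov_deg / 2.0)

print(required_theta_v(20.0, 10.0))  # 40.0 degrees, matching the example in the text
print(scan_range(-5.0, 20.0))        # device tilted 5 degrees down -> scan from -5 to +15 degrees
```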
The MEMS mirror element is, for example, an electromagnetic type, and changes the swing angle of the mirror by controlling the amount of current flowed. In the electrostatic type and the piezoelectric type, the deflection angle of the mirror can be changed by controlling the applied voltage. The MEMS mirror element has the advantage that the device can be miniaturized. Control of the one-dimensional scanning device 131 is included in the ToF sensor 153. In order to detect the signal from the object irradiated with the fan-shaped pulsed light 124, it is necessary to synchronously control the deflection angle of the mirror and the light receiving system. The one-dimensional scanning device 131 may have another configuration, such as a polygon mirror or a liquid crystal waveguide method, in addition to the MEMS mirror element.
Although other configurations are possible for the fan-shaped beam generator 132, using a Powell lens has the advantage that there are no moving parts and the pulsed light can be spread by a relatively small optical component, contributing to miniaturization and robustness of the device.
Here, the light source 122 constituting the pulsed light illumination system 110 is a light source capable of pulse emission, such as a laser or LED, and is preferably an infrared ray having a wavelength of about 700 nm to about 1000 nm. In addition to being unobtrusive, it has the advantage that the longer the wavelength, the higher the safety for the eyes of animals because it is invisible to the human eye. Furthermore, the longer the wavelength, the lower the background light intensity. In particular, the wavelength around 940 nm to 950 nm is preferable because the intensity is lowered due to the absorption of sunlight by moisture in the air. Further, it is preferable that the emission wavelength band is narrow and the temperature fluctuation of the emission peak wavelength is small, and an infrared laser is preferable. In particular, it is preferable to use a VCSEL (Vertical Cavity Surface Emitting LASER: vertical cavity surface emitting laser) which has a narrow emission wavelength band and a small temperature fluctuation of the emission peak wavelength. Although not shown in FIG. 1, a temperature control circuit for controlling the temperature of the light source 122 may be added in order to suppress the temperature fluctuation of the light emission peak wavelength.
The light source 122 emits pulsed light in synchronization with the ToF sensor 153. The light emission intensity and the pulse width (half width of light emission time) may be variable. Here, the pulse width of the pulsed light is about 1 nsec to several hundreds nsec. The peak power of pulsed light is from several watts to hundreds of watts.
When the optical radar device 100 acquires data at 30 frames per second and the pixel resolution Δθ of each frame is 0.5 degrees with θov = 20 degrees, for example, one frame consists of 40 fan-shaped pulsed lights 124-1 to 124-40 with different traveling angles in the vertical direction. The time allotted to the irradiation of each fan-shaped pulsed light 124-k is 1/1200 second, during which the angle of the reflective surface of the one-dimensional scanning device 131 is changed to its set value and the light source 122 is pulsed. When the pulse emission frequency is 190 kHz, each fan-shaped pulsed light 124-k irradiates the object with 158 pulses, and the measurement results of the individual pulses are integrated.
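The numbers in the preceding paragraph can be reproduced with a short calculation; the following sketch simply restates the example values from the text (30 frames per second, 40 lines per frame, 190 kHz pulse rate) and is not an implementation of the device:

```python
# Sketch: per-line timing budget for the example in the text.
frame_rate_hz = 30        # frames per second
lines_per_frame = 40      # theta_ov / delta_theta = 20 deg / 0.5 deg
pulse_rate_hz = 190_000   # pulse emission frequency

time_per_line_s = 1.0 / (frame_rate_hz * lines_per_frame)  # 1/1200 s per fan-shaped pulsed light
pulses_per_line = pulse_rate_hz * time_per_line_s           # about 158 pulses integrated per line

print(time_per_line_s)  # 0.000833... s
print(pulses_per_line)  # 158.33...
```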
If the angle setting accuracy of the reflective surface of the one-dimensional scanning device 131 is not high, θw is preferably approximately equal to or larger than the vertical angular resolution corresponding to one pixel of the light receiving system 150. On the other hand, if the angle setting accuracy of the reflective surface of the one-dimensional scanning device 131 is high, θw is preferably approximately equal to or smaller than the vertical angular resolution of the pixel. For example, when Δθ = 0.5 degrees and the angle setting accuracy of the reflective surface is ±0.2 degrees, θw must satisfy θw ≥ 0.9 degrees in order to reliably irradiate the target pixel with pulsed light. When θw = 1 degree, there are cases in which only about 50% of the pulsed light is irradiated onto the object surface projected onto the target pixel. When the angle setting accuracy of the reflective surface is ±0.02 degrees and Δθ = 0.5 degrees, 90% or more of the fan-shaped pulsed light 124 can be irradiated onto the object surface of the target pixel.
The above description assumes that the object surface corresponding to the target pixel is irradiated almost uniformly. However, if non-uniformity is acceptable and priority is given to irradiating as much light as possible, θw should be as small as possible. For example, when the angle setting accuracy of the reflective surface is ±0.2 degrees, with θw = 0.5 degrees the minimum irradiated fraction is 60% of the pulsed light, which can be made larger than in the case of uniform irradiation with θw = 1 degree. With θw = 0.05 degrees, nearly 100% of the pulsed light can be irradiated. The value of θw therefore depends on the characteristics of the one-dimensional scanning device 131 and on how the target surface is to be irradiated, but it is determined with reference to the vertical angular resolution corresponding to one pixel of the light receiving system.
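The 0.9-degree figure above is consistent with requiring the beam thickness to cover one pixel's angular height plus the worst-case pointing error on both sides; the following sketch assumes that relation (θw ≥ Δθ + 2·ε), which is not stated explicitly in the text:

```python
# Sketch: minimum beam thickness for guaranteed full-pixel coverage (assumed relation).
def min_theta_w(delta_theta_deg: float, pointing_error_deg: float) -> float:
    # Beam must span the pixel height plus the pointing error on each side.
    return delta_theta_deg + 2.0 * pointing_error_deg

print(min_theta_w(0.5, 0.2))  # 0.9 degrees, as in the example in the text
```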
(Light receiving system 150)
The light receiving system 150 of the optical radar device 100 is described below.
The image forming optical system 151 constituting the light receiving system 150 is generally a lens. The focal length and the F number can be selected according to the size of the light receiving unit 154 and the viewing angle (FOV). Preferably, the transmittance is high and the aberration is small at the central wavelength of the optical band pass filter 152 described later. In the light radar device 100 shown in FIG. 1, an example in which a lens is used as the imaging optical system 151 is shown, but a reflective optical system other than a lens may be used. It is preferable to reduce the out-of-band background light by arranging the optical band pass filter 152 in the light path from the front surface of the imaging optical system 151 to the light receiving unit 154.
The optical band pass filter 152 has a transmission band of fixed width centered on the wavelength peak of the pulsed light. The width of the transmission band (half width of the wavelength distribution of the transmittance) is several nm to several tens of nm, and is preferably about 10 nm to 20 nm. In general, when the device is operated outdoors, the operating temperature range is wide and the peak wavelength of the pulsed light changes with temperature, so the pulsed light distribution must stay within the transmission band at least over the operating temperature range. In the case of a VCSEL, the temperature shift of the peak wavelength is about 0.07 nm/degree, the half width of the emission peak is about 1 nm, and the temperature shift of the transmission band center wavelength of the optical band pass filter is 0.025 nm/degree; therefore, even considering a temperature range from 85°C to -40°C, the relative wavelength shift between the peak wavelength and the transmission band center wavelength is about 5.6 nm, and an optical band pass filter with a transmission band of about 10 nm can be used. The optical band pass filter 152 may be incorporated inside the imaging optical system 151.
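The 5.6 nm figure follows from the difference between the two temperature coefficients over the stated 125-degree range; a minimal sketch of that arithmetic, using only the values quoted above:

```python
# Sketch: relative wavelength shift between the VCSEL peak and the filter passband center.
laser_shift_nm_per_deg = 0.07     # VCSEL peak wavelength temperature coefficient
filter_shift_nm_per_deg = 0.025   # filter center wavelength temperature coefficient
temperature_span_deg = 85 - (-40) # operating range, 125 degrees

relative_shift_nm = (laser_shift_nm_per_deg - filter_shift_nm_per_deg) * temperature_span_deg
print(relative_shift_nm)  # 5.625 nm, i.e. about 5.6 nm as stated in the text
```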
When the optical radar device 100 is not tilted with respect to the horizontal plane, the optical axis of the imaging optical system 151 is in the Y-axis direction, and the light receiving unit 154 of the ToF sensor 153 is parallel to the XZ plane.
It is preferable that the measurable area 11 determined by the imaging optical system 151 and the light receiving unit 154 of the ToF sensor 153 be equal to or smaller than the pulsed light irradiation area 10.
(ToF sensor 153)
The ToF sensor 153 constituting the light receiving system 150 is described below with reference to FIG. 4. FIG. 4 is a schematic view of the ToF sensor constituting the optical radar device shown in FIG. 1. The ToF sensor 153 referred to here is a sensor that receives light from the line-shaped portion of the pulsed light irradiation area 10 illuminated by the fan-shaped pulsed light and obtains the distance between that line-shaped portion and the optical radar device 100 by time-of-flight measurement.
As one example, as shown in FIG. 4, the ToF sensor 153 includes a three-dimensional image element comprising: a light receiving unit 154 in which pixels Px(i,j) are arranged in a two-dimensional matrix; a pixel storage element Mx(j) to which electric pulses are supplied from each pixel constituting a column of the pixels Px(i,j); and a signal processing circuit DS which reads out the data stored in the pixel storage element Mx(j) and acquires, for each pixel Px(i,j), at least distance information indicating the distance to an object. The pixel storage element Mx(j) has a plurality of binary counters that integrate the numbers of the electric pulses at mutually different timings, and the signal processing circuit DS can read out the data stored in the pixel storage element Mx(j) in parallel with the integration by the binary counters.
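As a rough, purely illustrative software model of this structure (the bin count, the class and method names, and the histogram-style read-out are assumptions, not details taken from the specification):

```python
# Sketch: a per-column pixel storage element with time-binned counters.
class PixelStorageElement:
    """Mx(j): counts electric pulses from the active pixel of column j, one counter per timing window."""

    def __init__(self, num_bins: int):
        self.counters = [0] * num_bins  # "plurality of binary counters"

    def record_pulse(self, time_bin: int) -> None:
        # Each counter integrates pulses arriving in its own timing window.
        self.counters[time_bin] += 1

    def snapshot(self) -> list[int]:
        # Read-out (e.g. a copy toward buffer memory Bx(j)) can be taken while counting continues.
        return list(self.counters)

mx = PixelStorageElement(num_bins=16)
mx.record_pulse(3)
mx.record_pulse(3)
buffer_bx = mx.snapshot()  # [0, 0, 0, 2, 0, ...]
```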
The ToF sensor 153 has functions of receiving the output of the horizontal sensor 140, controlling the pulsed light illumination system 110 based on that output value, and selecting the pixels Px(k,j) to receive light; it does not necessarily have to be configured as a single chip, and may be a combination of the three-dimensional image element and a device that executes the other control functions.
Specifically, as shown in FIG. 4, in the light receiving unit 154 of the ToF sensor 153, the pixels Px(i,j) are arranged in a two-dimensional matrix of M rows and N columns, and the light signal from the pulsed light irradiation area 10 is projected onto this M-row, N-column matrix by the imaging optical system 151. Here, the shape of the pixel Px(i,j) is assumed to be square. In this case, when the angular resolution of the light receiving system 150 is Δθ, the FOV of the measurable area is N·Δθ in the horizontal direction and M·Δθ in the vertical direction.
The pixels Px(i,j) are not all activated at the same time. Since the pulsed light irradiated toward the pulsed light irradiation area 10 is the fan-shaped pulsed light 124, only the pixels in the row k corresponding to the fan-shaped pulsed light 124-k are activated. That is, when the fan-shaped pulsed light 124-k is irradiated, the pixels Px(k,j) are activated. Here, activating Px(k,j) means that at least the output signals of Px(k,j) are transmitted to the signal storage processing unit 155; in addition, the power supply to the other Px(i,j) may be stopped so that power is supplied only to Px(k,j). For convenience of explanation, the fan-shaped pulsed lights 124 are numbered 1 to M from bottom to top, while the index i of the corresponding Px(i,j) is numbered 1 to M from top to bottom. They are associated in this way because the order of the two is reversed through the imaging optical system 151; this correspondence can be changed depending on the properties of the imaging optical system 151.
The pixel Px (i, j) includes a light receiving element that performs photoelectric conversion, a circuit that transmits the output of the light receiving element to the signal storage processing unit 155, and the like. As a light receiving element, a photodiode such as an avalanche photodiode is used.
A row selection circuit 161 is provided in the light receiving unit 154 as a circuit for selecting the pixels Px (k, j) of the k rows corresponding to the fan-shaped pulse light 124-k. In addition, a row selection line R (i) for transmitting the signal of the row selection circuit 161 to each pixel Px (i, j) is provided. The row selection line R (i) is not limited to a single signal line, and may be a plurality of signal lines having different polarities and voltages.
The row selection circuit 161 selects the row k to be activated in synchronization with the operation of the one-dimensional scanning device 131 of the fan-shaped light irradiation system 123. The synchronization signal is issued by the control unit 160. The row selection circuit 161 may, for example, control the pixels Px(i,j) so that only the outputs of the pixels Px(k,j) (j = 1 to N) are connected to the signal lines Lx(j), or may control switches (not shown) so that power (one or more supplies) is provided only to the pixels Px(k,j) (j = 1 to N), or may do both.
The signal storage processing unit 155 includes at least one pixel storage element Mx(j) corresponding to each column j, and the pixel storage element Mx(j) is connected to each pixel Px(i,j) by the signal line Lx(j). The electric signal that a pixel Px(k,j) emits on receiving light is received by the pixel storage element Mx(j) via the signal line Lx(j), and the signal amount is stored in time series. That is, electric pulses are supplied to the pixel storage element Mx(j) from each pixel constituting the column of pixels, and the data are accumulated.
The signal storage processing unit 155 further includes a buffer memory Bx (j), a column signal line C (j), and a signal processing circuit DS. Here, the data stored in the pixel storage element Mx (j) is read at a predetermined timing and copied to the buffer memory Bx (j) via the column signal line C (j).
The signal processing circuit DS calculates and outputs at least the distance information D (k, j) and the two-dimensional image information G1 (k, j) and G2 (k, j) based on the information in the buffer memory Bx (j). G1 (i, j) is two-dimensional image information by background light, and G2 (i, j) can be two-dimensional image information by reflected light of pulse light, but is not limited thereto.
The signal storage processing unit 155 may have a memory selection circuit 163 and memory selection lines Rm(α) for selecting a part of the pixel storage elements Mx(j). When the pixel storage element Mx(j) outputs signals to the column signal line C(j), outputting all of them in parallel would require a large amount of wiring; the number of wires can therefore be reduced by selectively reading out the per-time signal amounts that constitute the pixel storage element Mx(j).
The control unit 160 performs a series of controls: it determines the measurement order within the target area 12 (FIG. 3), sets the angle of the one-dimensional scanning device 131 (FIG. 2) according to this order, determines the row k of the light receiving unit 154 to be activated, drives the row selection circuit 161 to activate the pixels Px(k,j) (j = 1 to N), instructs the light source 122 (FIG. 1) to emit light, simultaneously starts the measurement by the pixels Px(k,j), and starts storing the signal amounts in the pixel storage elements Mx(j). When the repeated emission of the light source 122 is finished, the data accumulated in the pixel storage elements Mx(j) are copied to the buffer memories Bx(j), and the signal processing circuit DS is made to calculate, in order, the distance information D(k,j) and the two-dimensional image information G1(k,j) and G2(k,j), which are stored in the memory in the ToF sensor 153 or output to the external system 170 (FIG. 1).
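A highly simplified, purely illustrative rendering of this control sequence as pseudocode; the objects passed in (scanner, row_select, and so on) and their method names are placeholders, since the actual control is hardware-level and synchronous:

```python
# Sketch: per-frame control sequence of the control unit 160 (illustrative only).
def measure_one_frame(rows_to_measure, scanner, row_select, light_source, storage, dsp, output):
    for k in rows_to_measure:                  # measurement order within the target area 12
        scanner.set_angle_for_row(k)           # angle of the 1D scanning device 131 for line 124-k
        row_select.activate_row(k)             # activate pixels Px(k, j), j = 1..N
        storage.reset()                        # clear the pixel storage elements Mx(j)
        light_source.emit_pulses()             # repeated pulse emission, integrated in Mx(j)
        buffer = storage.copy_to_buffer()      # Mx(j) -> Bx(j)
        d, g1, g2 = dsp.process(buffer)        # distance D(k,j), background image G1, reflected image G2
        output.send(k, d, g1, g2)              # store internally or send to the external system 170
```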
In this way, the control unit 160 acquires a 3D image of the target area 12 by driving the pulsed light illumination system 110 and the ToF sensor 153 in synchronization. As described below, it also has the function of specifying, based on the output value θiy of the horizontal sensor 140, the fan-shaped pulsed light 124-k corresponding to the target area 12 and the pixels Px(k,j) that measure that line.
When the optical radar device 100 is inclined as shown in FIG. 3, the actual horizontal plane is shifted by -θiy with respect to the center of the pulsed light irradiation area 10 of the optical radar device 100. Since the target area 12, which is the area to be actually measured, is the range of angle θov centered on the horizontal plane, the one-dimensional scanning device 131 of the pulsed light illumination system 110 scans the range θov with the light irradiation area shifted from the center by -θiy, and, in synchronization with this, the ToF sensor 153 measures, in the row direction of the light receiving unit 154, the rows within the range (θov/Δθ/2) above and below the row shifted from the center by (-θiy/Δθ). The number of rows M of the pixels Px(i,j) constituting the light receiving unit 154 is therefore preferably θv/Δθ or more. In this case, the entire pulsed light irradiation area 10 can be measured, so that, as long as |θiy| ≤ θmy, the entire target area 12 can always be measured even if the optical radar device 100 is inclined.
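A minimal sketch of this angle-to-row mapping; the rounding and indexing conventions are assumptions, and only the shift by -θiy/Δθ and the half-width of θov/Δθ/2 rows come from the text:

```python
# Sketch: which rows of the light receiving unit 154 to activate for a given forward tilt.
def rows_to_measure(theta_iy_deg: float, delta_theta_deg: float,
                    theta_ov_deg: float, num_rows: int) -> range:
    center_row = num_rows // 2 + round(-theta_iy_deg / delta_theta_deg)  # shift by -theta_iy / dtheta
    half_span = round(theta_ov_deg / delta_theta_deg / 2)                 # theta_ov / dtheta / 2 rows each side
    first = max(0, center_row - half_span)
    last = min(num_rows - 1, center_row + half_span)
    return range(first, last + 1)

# Example with the values used in the text (theta_ov = 20 deg, dtheta = 0.5 deg, M = 80 rows):
rows = rows_to_measure(theta_iy_deg=-5.0, delta_theta_deg=0.5, theta_ov_deg=20.0, num_rows=80)
print(rows.start, rows.stop - 1)  # 30 70: the measured band shifts upward when the device tilts down
```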
It is preferable that the control unit 160 can stop, by an instruction from the external system 170, the function of always monitoring the predetermined target area 12 regardless of the inclination of the main body. When this function is stopped, the measurable area 11, which changes according to the orientation of the main body, is monitored instead. On a mountain road or the like with a large slope, if monitoring is based on the horizontal plane, only the road surface may be visible when climbing and only the sky when descending, so forward monitoring may not be performed correctly. Also, in devices such as aircraft and drones, it may be better to monitor a fixed field of view relative to the main body. Being able to switch the function according to the application and situation in this way increases the usefulness of the optical radar device.
The signal storage processing unit 155 of the ToF sensor 153 can have various configurations. There are various methods, such as a circuit that handles analog signals and a circuit that handles digital signals, as to signals that pixels emit. Further, with regard to measurement methods of flight time, there are various methods such as direct method, indirect method, phase shift method, TCSPC, etc. In the present invention, any method can be adopted. Also, the ToF elements may be arranged in a line corresponding to one row instead of the above-described two-dimensional array of light receiving units, and mechanically scanned in synchronization with the one-dimensional scanning device 131.
(Horizontal sensor 140)
The horizontal sensor 140 of the optical radar device 100 is described below.
The optical radar device 100 of the present embodiment is assumed to be used on land or in the sea, and the pulsed light irradiation region 10 extends in the vertical direction. Further, in the case where the optical axis of the light receiving system 150 is in the horizontal plane, the case of correcting the inclination of the optical axis with respect to the horizontal plane will be described.
The horizontal sensor 140 measures the inclination angle θiy of the optical axis with respect to the horizontal plane and outputs the result to the ToF sensor 153. For example, a three-dimensional accelerometer may be mounted on a plane parallel to the X-Z plane of the coordinate axes shown in FIG. 1 and used to measure the gravitational acceleration; if the result is (Gx, Gy, Gz) (a three-dimensional vector expressed in the coordinate system fixed to the horizontal sensor 140, with Gz < 0), θiy can be obtained from θiy = Arctan(Gy/Gz). Alternatively, an inclinometer may be provided on a plane parallel to the X-Y plane and the inclination angle in the Y-axis direction taken as θiy. The horizontal sensor 140 does not necessarily have to output θiy; it may output sin(θiy), cos(θiy), or both.
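A minimal sketch of this computation; the accelerometer reading is an arbitrary illustrative number, and the sign convention with Gz < 0 follows the text:

```python
# Sketch: forward tilt angle from a 3-axis accelerometer reading, as in theta_iy = Arctan(Gy / Gz).
import math

def forward_tilt_deg(gy: float, gz: float) -> float:
    """theta_iy in degrees; gz is expected to be negative when the device is roughly upright."""
    return math.degrees(math.atan(gy / gz))

# Example: a reading of (Gx, Gy, Gz) = (0.0, 0.85, -9.77) m/s^2 gives roughly -5 degrees.
print(forward_tilt_deg(0.85, -9.77))
```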
The horizontal sensor 140 is preferably installed near the ToF sensor 153. This is because distortion of the positional relationship between the ToF sensor 153 and the horizontal sensor 140 becomes larger as the main body vibrates or the like as the distance from the ToF sensor 153 increases. Further, it is preferable that the horizontal sensor 140 be mounted on the same substrate as the ToF sensor 153. When mounted on a different substrate, there is also a possibility that the positional relationship between the two is distorted due to vibration or the like, and the measured value of θiy may deviate from the inclination angle of the actual optical axis with respect to the horizontal plane.
(Description of the effect)
According to the optical radar device 100 configured as above, θiy is measured in each frame, and based on the measured value of θiy the one-dimensional scanning device 131 of the pulsed light illumination system 110 scans the range θov centered on the line shifted by -θiy from the center of the light irradiation area. In synchronization with this, the ToF sensor 153 measures, in the row direction of the light receiving unit 154, the rows within the range (θov/Δθ/2) above and below the row shifted from the center by (-θiy/Δθ). In this way, the target area 12, which is the range θov centered on the actual horizontal plane, can always be measured.
Then, as long as the inclination angle of the optical radar device 100 is within the range ±θmy, the target area 12 can be kept constant no matter how the optical radar device 100 is inclined.
Therefore, the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the light radar device 100, so that the object can be recognized easily and accurately. For example, the inclination of the road and the passage in front of the main body can be correctly recognized without complicated correction on the main body side, and the height of the obstacle can also be measured accurately.
With respect to the present embodiment, the same function can be realized by inclining the light radar device 100 itself by -θiy with respect to the main body according to θiy measured by the horizontal sensor. However, in this method, the entire optical radar device 100 must be mechanically moved with high speed and high accuracy for each frame, resulting in a device that is large and vulnerable to vibration. In addition, it has to be a very expensive device.
In this embodiment, even if there are some cost increase factors such as cost increase due to the field of view expansion of the light receiving system 150 and cost increase due to the scan range increase of the one-dimensional scanning device 131, it can be realized relatively easily.
If even such a cost increase is not desired, it is also possible to set θov = θv and, when |θiy| ≠ 0, to measure and output only the part of the target area 12 that can actually be measured. In this case, data for a number of rows corresponding to |θiy|/Δθ is missing from the target area 12, but measurement results that do not depend on the tilt of the optical radar device 100 can still be obtained. If |θiy| ≤ θv/2, at least half of the data of the measurable area 11 can be obtained. Objects near the horizontal plane can therefore be captured without fail, so the device still functions adequately as an optical radar device.
[Embodiment 2]
Another embodiment of the present invention is described below.
The present embodiment is an application of the present invention to uses in which rotation around the optical axis is a problem, and it incorporates countermeasures against rotational fluctuation of the optical radar device 100 about the Y axis, which was not dealt with in Embodiment 1.
The components of the optical radar device 100a according to this embodiment are the same as those of the optical radar device 100 of Embodiment 1 shown in FIGS. 1 and 4, but the characteristics of the respective components differ as described below; the reference numerals are therefore suffixed with "a" to distinguish them from Embodiment 1. Only the differences from Embodiment 1 are described below.
The horizontal sensor 140 measures not only the forward tilt θiy of the optical radar device 100a of FIG. 1 but also the left-right tilt θix. θix corresponds to the amount of rotation about the Y axis. For example, a three-dimensional accelerometer may be mounted on a plane parallel to the X-Z plane of FIG. 1 and the gravitational acceleration measured in this coordinate system; if the result is (Gx, Gy, Gz) (a three-dimensional vector expressed in the coordinate system fixed to the horizontal sensor 140, with Gz < 0), θiy can be obtained from θiy = Arctan(Gy/Gz) and θix from θix = Arctan(Gx/Gz). Alternatively, an inclinometer may be provided on a plane parallel to the X-Y plane of FIG. 1, with the inclination angle in the Y-axis direction taken as θiy and that in the X-axis direction as θix.
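For illustration, the tilt computation described above can be sketched as follows (the function and variable names are assumptions made for the example; only the two arctangent relations come from the description).

```python
import math

def tilt_from_gravity(gx, gy, gz):
    """Forward tilt θiy and left-right tilt θix, in degrees, from a measured
    gravity vector (Gx, Gy, Gz) with Gz < 0, using θiy = arctan(Gy/Gz) and
    θix = arctan(Gx/Gz) as in the description. Valid for tilts well below 90°."""
    theta_iy = math.degrees(math.atan(gy / gz))
    theta_ix = math.degrees(math.atan(gx / gz))
    return theta_iy, theta_ix

# Example: device tilted slightly forward and to the right
# print(tilt_from_gravity(0.17, 0.34, -9.80))
```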
FIG. 5 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device 100a.
The fan-shaped light irradiation system 123a of the pulsed light illumination system 110a emits the fan-shaped pulsed light 124a and, as shown in FIG. 5, the pulsed light irradiation area 10a is wider than the target area 12, including in the horizontal direction θh. Therefore, for the same target area 12, it is preferable that the horizontal spread angle of the fan-shaped beam generator 132a be large. It is also preferable that the light receiving unit 154a have an increased number of columns as well as rows, and accordingly that the signal storage processing unit 155a have an increased number of columns. The signal storage processing unit 155a also has a memory that can store the distance information D(i,j) and the two-dimensional image information G1(i,j) and G2(i,j), which are the measurement results of the pixels corresponding to the measurement area 13a of FIG. 5.
The control unit 160a determines the measurement area 13a based on θiy and θix measured in each frame. The measurement area 13a is the part of the measurable area 11a of the optical radar device 100a that covers the entire target area 12 to be measured, and corresponds to a part of the light receiving unit 154a. The target area 12 extends around the origin in directions parallel to the X and Y axes, whereas the measurement area 13a, depending on the tilt of the optical radar device 100a, is inclined by θiy with respect to the Y axis and also by θix with respect to the X axis.
The control unit 160a irradiates the portion corresponding to the measurement area 13a with pulsed light, acquires a 3D image, and stores it in the memory in the signal storage processing unit 155a. When the measurement of the measurement area 13a is completed, the control unit 160a reads the measurement results corresponding to the target area 12 from the memory and outputs them to the external system 170. In this way, data for a fixed target area relative to the actual scene can be obtained regardless of the values of θiy and θix.
The measurement area 13a is the range that, in order to completely cover the target area 12, is centered on the direction shifted upward by -θiy from the center of the measurable area 11a and has a width θsh in the horizontal direction and a width θsv in the vertical direction. θsh and θsv are given by the following equations (1) and (2).

θsh = θoh·cos(θix) + θov·sin(θix)   (1)
θsv = θoh·sin(θix) + θov·cos(θix)   (2)

Therefore, if the maximum allowable value of θiy is ±θmy and the maximum allowable value of θix is ±θmx, it is preferable that θh and θv satisfy the following conditions (3) and (4). Under these conditions, the entire target area 12 can be monitored as long as θiy and θix stay within their maximum allowable ranges.

θh ≧ θoh·cos(θmx) + θov·sin(θmx)   (3)
θv ≧ 2·θmy + θoh·sin(θmx) + θov·cos(θmx)   (4)

The measurement data corresponding to the target area 12 must be extracted from the measurement results of the measurement area 13a stored in the memory and delivered to the external system 170. For example, as the data for the direction (ηx, ηz) in the target area 12, the data of the pixel given below in the measurement area 13a may be adopted. The row number (equation (5)) and the column number (equation (6)) of the measurement area 13a are referenced to the upper left as the origin, as shown in FIG. 5.

{θsh/2 + ηx·cos(θix) − ηz·sin(θix)}/Δθ   (5)
{θsv/2 − ηz·cos(θix) − ηx·sin(θix)}/Δθ   (6)

In the simplest case, the values of equations (5) and (6) are rounded to the nearest integer, and the data of the corresponding row number and column number are assigned. When more accuracy is required, the data for (ηx, ηz) can also be obtained by interpolation or extrapolation from a plurality of data items having row numbers and column numbers adjacent to the values calculated by equations (5) and (6).
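For illustration, equations (1), (2), (5) and (6) can be collected into the following sketch (the function names and the rounding fallback are assumptions made for the example; the row/column labelling simply follows the description, with equation (5) giving the row number and equation (6) the column number).

```python
import math

def measurement_window(theta_oh, theta_ov, theta_ix):
    """Widths θsh and θsv of the measurement area 13a (equations (1) and (2)).
    All angles in degrees."""
    c = math.cos(math.radians(theta_ix))
    s = math.sin(math.radians(theta_ix))
    theta_sh = theta_oh * c + theta_ov * s
    theta_sv = theta_oh * s + theta_ov * c
    return theta_sh, theta_sv

def pixel_for_direction(eta_x, eta_z, theta_ix, theta_sh, theta_sv, delta_theta):
    """Indices of the pixel in the measurement area 13a (upper-left origin)
    assigned to the target direction (ηx, ηz): equation (5) for the row number
    and equation (6) for the column number, as labelled in the description.
    Simple rounding is used here; interpolation can replace it if needed."""
    c = math.cos(math.radians(theta_ix))
    s = math.sin(math.radians(theta_ix))
    row = (theta_sh / 2 + eta_x * c - eta_z * s) / delta_theta   # equation (5)
    col = (theta_sv / 2 - eta_z * c - eta_x * s) / delta_theta   # equation (6)
    return int(round(row)), int(round(col))
```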
(Description of the effect)
According to the above configuration, θiy and θix are measured in each frame, and based on the measured value of θiy, the one-dimensional scanning device 131a of the pulsed light illumination system 110a scans the range of θsv around a line shifted by -θiy from the center of the light irradiation area. In synchronization with this, the ToF sensor 153a measures, in the row direction of the light receiving unit 154a, the rows in the range of (θsv/Δθ/2) above and below a row shifted by (-θiy/Δθ) from the center, and in the column direction the range of ±(θsh/Δθ/2). In this way the target area 12, which spans θov around the actual horizontal plane and θoh in the horizontal direction, can always be measured.
If the allowable tilt range of the optical radar device 100a is ±θmy for the Y-axis tilt and ±θmx for the X-axis tilt, the target area 12 can be kept constant no matter how the optical radar device 100a tilts, provided that θh and θv, which define the measurable area 11a, satisfy equations (3) and (4).
Therefore, the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the optical radar device 100a, so the object can be recognized easily and accurately. For example, the inclination of the road or passage ahead of the main body can be recognized correctly without complicated correction on the main body side, and the height of an obstacle can also be measured accurately.
As an alternative to the present embodiment, a similar function could be realized by tilting the optical radar device 100a itself with respect to the main body according to θiy and θix measured by the horizontal sensor 140. With this method, however, the entire optical radar device 100a would have to be moved mechanically, very quickly and with high precision, for every frame, resulting in a device that is large, vulnerable to vibration, and inevitably very expensive.
The present embodiment, by contrast, can be realized relatively easily, even though there are some cost-increasing factors such as the enlarged field of view of the light receiving system 150a and the enlarged scan range of the one-dimensional scanning device 131a.
If even such a cost increase is not desired, it is also possible to set θov = θv and θoh = θh and, when |θiy| ≠ 0 or |θix| ≠ 0, to measure and output only the part of the target area 12 that can actually be measured. In this case the target area 12 is reduced, but measurement results that do not depend on the tilt of the optical radar device can still be obtained. If |θiy| ≤ θv/2, at least half of the data of the measurable area 11 can be obtained. Objects near the horizontal plane can therefore be captured without fail, so the device still functions adequately as an optical radar device.
[Embodiment 3]
Still another embodiment of the present invention is described below.
The present embodiment differs from Embodiments 1 and 2 in that the measurement region is irradiated with pulsed light by a two-dimensional scanning device. The configuration of the optical radar device 100b according to the present embodiment is described with reference to FIGS. 6 and 7. Description of the points common to Embodiments 1 and 2 is omitted.
FIG. 6 is a schematic view showing the configuration of the optical radar device 100b according to Embodiment 3 of the present invention.
As shown in FIG. 6, the optical radar device 100b comprises a pulsed light illumination system 110b that irradiates the pulsed light irradiation area 10b with spot pulsed light 124b (hereinafter also referred to simply as pulsed light), a light receiving system 150b that receives light from at least a part of the pulsed light irradiation area 10b, and the horizontal sensor 140.
The pulsed light illumination system 110b has a two-dimensional scanning device 123b for irradiating the entire pulsed light irradiation area 10b by two-dimensionally scanning the spot pulsed light, and includes the light source 122 that emits the pulsed light and a collimated light generator 130.
The light receiving system 150b includes a ToF sensor 153b, an imaging optical system 151b that forms an image on the light receiving unit 154b of the ToF sensor 153b, the optical band-pass filter 152, and a control unit 160b. The control unit 160b also handles functions such as receiving signals from the horizontal sensor 140, controlling the pulsed light illumination system 110b, controlling the ToF sensor 153b, and communicating with the external system 170.
The horizontal sensor 140 constantly monitors the inclination angle of the optical radar device 100b with respect to the horizontal plane and outputs it to the ToF sensor 153b.
(Pulsed light illumination system 110b)
The pulsed light illumination system 110b of the optical radar device 100b is described below.
As shown in FIG. 6, the pulsed light illumination system 110b includes the collimated light generator 130, which shapes the light from the light source 122 into a substantially parallel spot light 133, and the two-dimensional scanning device 123b, which deflects the spot light 133 in the X-axis and Z-axis directions.
The two-dimensional scanning device 123b includes a scanning mirror 131b. The scanning mirror 131b controls the deflection of the spot pulsed light 124b in the X-axis direction by rotating about the Z axis, and controls the deflection in the Z-axis direction by rotating about an axis in the X-Y plane inclined 45 degrees to the X axis.
The two-dimensional scanning device 123b consists, for example, of a MEMS mirror element and its control device. The control of the two-dimensional scanning device 123b is included in the control unit 160b. Instead of a MEMS mirror element, the two-dimensional scanning device 123b may have another configuration, such as a polygon mirror or a liquid crystal waveguide system.
The spot light 133 is reflected by the scanning mirror 131b and irradiated onto the pulsed light irradiation area 10b as the spot pulsed light 124b. The spot light 133 and the spot pulsed light 124b therefore have substantially the same divergence angle. The divergence angle may differ between the X-axis and Y-axis directions, but the following description assumes a circular beam in which the two are substantially equal. The divergence angle θb limits the angular resolution of the optical radar device 100b; θb is about 0.05 to 1 degree.
(Light receiving system 150b)
The light receiving system 150b of the optical radar device 100b is described below.
The reflected light 125b from the object is reflected by the scanning mirror 131b and the mirror 156 and is then focused by the imaging optical system 151b, through the optical band-pass filter 152, onto the light receiving unit 154b of the ToF sensor 153b. The mirror 156 lies in the optical path of the spot light 133 but has an opening 157 at its center, through which the spot light 133 passes.
(ToF sensor 153b and control unit 160b)
The ToF sensor 153b and the control unit 160b of the optical radar device 100b are described below. FIG. 7 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device 100b.
The ToF sensor 153b is a sensor that captures, with the light receiving unit 154b, the reflected light from the circular region illuminated by the spot pulsed light 124b and determines the distance between that circular region and the optical radar device 100b by time-of-flight measurement.
The control unit 160b determines the measurement order within the target area 12, sets the angle of the two-dimensional scanning device 123b according to this order, instructs the light source 122 to emit light, simultaneously activates the ToF sensor 153b, and measures the time of flight for the object concerned.
When setting the angle of the two-dimensional scanning device 123b, the control unit 160b determines the set angle based on the outputs θiy and θix of the horizontal sensor 140, so that the target area can always be measured regardless of the tilt of the optical radar device 100b. For example, as shown in FIG. 7, to irradiate the spot pulsed light 124b in the direction (ηx, ηz) in the target area 12, the control unit instructs the two-dimensional scanning device 123b to set the scanning mirror 131b so that the spot pulsed light 124b is irradiated in the direction (Φx, Φz). Here, Φx is given by equation (7) and Φz by equation (8).

Φx = θiy·sin(θix) + ηx·cos(θix) − ηz·sin(θix)   (7)
Φz = −θiy·cos(θix) + ηz·cos(θix) + ηx·sin(θix)   (8)

When the X axis of the optical radar device 100b is not tilted, the two axes are not moved simultaneously: as shown in FIG. 6, a scan in the X-axis direction is performed by rotation about the Z axis, then the angle in the Z-axis direction is changed and another X-axis scan is performed. When the X axis of the optical radar device 100b is tilted, however, the two axes must be changed simultaneously.
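For illustration, the direction mapping of equations (7) and (8) can be sketched as follows (the function name and the use of degrees are assumptions made for the example).

```python
import math

def scan_angles(eta_x, eta_z, theta_iy, theta_ix):
    """Scanning-mirror set angles (Φx, Φz) for a target direction (ηx, ηz),
    given the measured tilts θiy and θix, following equations (7) and (8).
    All angles in degrees."""
    c = math.cos(math.radians(theta_ix))
    s = math.sin(math.radians(theta_ix))
    phi_x = theta_iy * s + eta_x * c - eta_z * s    # equation (7)
    phi_z = -theta_iy * c + eta_z * c + eta_x * s   # equation (8)
    return phi_x, phi_z
```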
(Description of the effect)
According to the above configuration, θiy and θix are measured in each frame, and based on the measured values of θiy and θix, the two-dimensional scanning device 123b of the pulsed light illumination system 110b irradiates the spot pulsed light 124b in the direction (Φx, Φz) that satisfies equations (7) and (8). Light can thereby be directed to the direction (ηx, ηz) in the target area 12, and only the target area 12 is always measured regardless of the tilt of the optical radar device 100b.
Therefore, the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the optical radar device 100b, so the object can be recognized easily and accurately. For example, the inclination of the road or passage ahead of the main body can be recognized correctly without complicated correction on the main body side, and the height of an obstacle can also be measured accurately.
In this method, there is no need to measure an extra area in order to cover the entire target area 12: the pulsed light irradiation and the ToF sensor perform no unnecessary measurement, and there is no need to first store the measurement results in a memory and then select only the target area from them, so power consumption can be reduced.
[Embodiment 4]
Still another embodiment of the present invention is described below.
The present embodiment differs from Embodiments 1 and 2 in that the measurement region is irradiated with pulsed light all at once by a diffusion irradiation device. The configuration of the optical radar device 100c according to the present embodiment is described with reference to FIGS. 8 and 9. Description of the points common to Embodiments 1 and 2 is omitted.
FIG. 8 is a schematic view showing the configuration of the optical radar device 100c according to Embodiment 4 of the present invention.
As shown in FIG. 8, the optical radar device 100c includes a pulsed light illumination system 110c that irradiates the pulsed light irradiation area 10c all at once with pulsed light 124c (hereinafter also referred to simply as pulsed light), a light receiving system 150c that receives light from at least a part of the pulsed light irradiation area 10c, and the horizontal sensor 140.
The light receiving system 150c consists of a ToF sensor 153c, an imaging optical system 151 that forms an image on the light receiving unit 154c of the ToF sensor 153c, the optical band-pass filter 152, and a control unit 160c that handles functions such as synchronizing the pulsed light illumination system 110c with the ToF sensor 153c, processing the feedback from the horizontal sensor 140, and communicating with the external system 170.
The horizontal sensor 140 constantly monitors the inclination angle of the optical radar device 100c with respect to the horizontal plane and outputs it to the control unit 160c.
(Pulsed light illumination system 110c)
The pulsed light illumination system 110c of the optical radar device 100c is described below.
As shown in FIG. 8, the pulsed light illumination system 110c includes the light source 122, which emits pulsed light, and a diffusion optical system 123c, which spreads the pulsed light two-dimensionally and irradiates it.
The diffusion optical system 123c is an optical device that disperses light over a fixed range and irradiates it, and is composed of a diffusion plate, a diffraction grating, a cylindrical lens, or the like. It can irradiate the pulsed light 124c all at once onto the pulsed light irradiation area 10c, which has a horizontal irradiation angle θh and a vertical irradiation angle θv.
(Light receiving system 150c)
The light receiving system 150c of the optical radar device 100c is described below.
The light receiving system 150c is the same as in Embodiment 2 except for the ToF sensor 153c. The measurable area 11c, determined by the imaging optical system 151 and the light receiving unit 154c, is preferably equal to or smaller than the pulsed light irradiation area 10c; the case where the two are equal is described here. The measurable area 11c is also larger than the target area 12 for which measurement results are actually output, and with respect to the horizontal spread angle θoh and the vertical spread angle θov of the target area 12, it is preferable that θh > θoh and θv > θov.
(ToF sensor 153c and control unit 160c)
The ToF sensor 153c and the control unit 160c of the optical radar device 100c are described below. FIG. 9 is a schematic view showing the relationship between the target area, the measurable area, and the pulsed light irradiation area in the optical radar device 100c.
Like the ToF sensors 153 and 153a shown in FIG. 4, the light receiving unit 154c of the ToF sensor 153c includes a group of pixels arranged in a two-dimensional array and can cover the measurable area 11c. Whereas the ToF sensors 153 and 153a perform the time-of-flight measurement row by row, the ToF sensor 153c performs the time-of-flight measurement of all pixels in parallel. Since such ToF sensors are publicly known, they are not described in detail here.
The control unit 160c sends a signal to the pulsed light illumination system 110c to emit the pulsed light 124c and at the same time starts the time-of-flight measurement of the ToF sensor 153c. The measurement results of all pixels are first stored in a memory (not shown) of the ToF sensor 153c; that is, as shown in FIG. 9, all data of the measurable area 11c is stored in the memory. Based on the output of the horizontal sensor 140, the control unit 160c selects from this data only the portion corresponding to the target area 12 and outputs it to the external system 170. The measurement result corresponding to the direction (ηx, ηz) in the target area 12 is the data for the direction (Φx, Φz) of the measurable area 11c given by equations (7) and (8) described in Embodiment 3.
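For illustration, this selection step can be sketched as follows (the frame indexing, with row 0 at the top and the optical axis at the center pixel, and the sampling of the target area on a Δθ grid are assumptions made for the example; only equations (7) and (8) come from the description).

```python
import math

def extract_target_area(frame, theta_iy, theta_ix, theta_oh, theta_ov, delta_theta):
    """Pick, out of a stored full-frame measurement frame[row][col] covering the
    measurable area 11c (optical axis at the center pixel, row 0 at the top),
    the values corresponding to the level target area 12, one sample per Δθ
    step in (ηx, ηz), using the direction mapping of equations (7) and (8)."""
    n_rows, n_cols = len(frame), len(frame[0])
    cr, cc = (n_rows - 1) / 2.0, (n_cols - 1) / 2.0
    c = math.cos(math.radians(theta_ix))
    s = math.sin(math.radians(theta_ix))
    rows_out = int(round(theta_ov / delta_theta)) + 1
    cols_out = int(round(theta_oh / delta_theta)) + 1
    out = []
    for i in range(rows_out):
        eta_z = theta_ov / 2.0 - i * delta_theta               # sample from the top down
        row_vals = []
        for j in range(cols_out):
            eta_x = -theta_oh / 2.0 + j * delta_theta
            phi_x = theta_iy * s + eta_x * c - eta_z * s       # equation (7)
            phi_z = -theta_iy * c + eta_z * c + eta_x * s      # equation (8)
            r = int(round(cr - phi_z / delta_theta))           # +Φz assumed to point upward
            q = int(round(cc + phi_x / delta_theta))
            inside = 0 <= r < n_rows and 0 <= q < n_cols
            row_vals.append(frame[r][q] if inside else None)
        out.append(row_vals)
    return out
```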
(Description of the effect)
According to the above configuration, in each frame the time of flight is measured over the entire measurable area 11c while θiy and θix are measured; from the measured values of θiy and θix, the measurement data corresponding to the direction (ηx, ηz) in the target area 12 can be obtained by referring to the data for the direction (Φx, Φz) that satisfies equations (7) and (8). The target area 12 can thus always be measured regardless of the tilt of the optical radar device 100c.
Therefore, the external system 170 can always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the optical radar device 100c, so the object can be recognized easily and accurately. For example, the inclination of the road or passage ahead of the main body can be recognized correctly without complicated correction on the main body side, and the height of an obstacle can also be measured accurately.
In this method, the pulsed light illumination system 110c contains no moving parts at all, and after the measurement is performed independently of the tilt, only the target area is selected from the stored measurement results. The system can therefore be greatly simplified and its cost reduced.
The embodiments disclosed herein should be considered illustrative in all respects and not restrictive. The scope of the present invention is indicated not by the above description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
[Summary]
An optical radar device according to Aspect 1 of the present invention is an optical radar device (100, 100a, 100b, 100c) that measures the distance to an object present in a predetermined target area 12 by a time-of-flight method, and includes: a light source 122 that emits pulsed light; a pulsed light illumination system (110, 110a) that irradiates at least a part of the target area 12 with the pulsed light 124 emitted by the light source 122; a light receiving system (150, 150a) that detects the light of the pulsed light 124 irradiated by the pulsed light illumination system (110, 110a) and reflected from the object, and measures the distance to the object; and a horizontal sensor 140 that detects the inclination angle θiy of the optical axis (Y axis) of the light receiving system (150, 150a) with respect to a horizontal plane, wherein the light receiving system (150, 150a) corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle θiy detected by the horizontal sensor 140.
According to the above configuration, the light receiving system corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor, so that the distance to the object can be measured with respect to the actual horizontal plane even if the optical radar device itself is tilted. The target area in which the object is present is therefore kept constant even if the optical radar device itself is tilted; in other words, the positional relationship of the optical radar device to the vertical is kept fixed.
Moreover, the distance to the object can be measured accurately by the simple method of correcting the inclination of the optical axis of the light receiving system with respect to the horizontal plane.
Therefore, regardless of the amount of tilt of the optical radar device, the distance to the object can always be measured in a fixed direction by a simple method. An external device can thus always acquire a three-dimensional image in a fixed direction regardless of the amount of tilt of the optical radar device, and can therefore recognize the object easily and accurately.
In an optical radar device according to Aspect 2 of the present invention, the light irradiation range of the pulsed light illumination system (110, 110a) may be corrected by the inclination angle detected by the horizontal sensor 140.
According to the above configuration, the light irradiation area can be concentrated on the necessary region, so the measurement accuracy can be improved by increasing the signal intensity, or the power consumption can be reduced by reducing unnecessary light irradiation.
In an optical radar device according to Aspect 3 of the present invention, in Aspect 1 or 2, the horizontal sensor 140 may detect the inclination angle θiy when the optical axis (Y axis) is inclined forward with respect to the horizontal plane.
According to the above configuration, the horizontal sensor detects the inclination angle when the optical axis is inclined forward with respect to the horizontal plane, so the light receiving system can correct the forward inclination of its optical axis with respect to the horizontal plane by the detected inclination angle. That is, the target area in which the object is present is kept constant even if the optical radar device itself is tilted forward with respect to the horizontal plane.
In an optical radar device according to Aspect 4 of the present invention, in any one of Aspects 1 to 3, the horizontal sensor 140 may detect the inclination angle when the optical axis (Y axis) is inclined in the left-right direction with respect to the horizontal plane.
According to the above configuration, the horizontal sensor detects the inclination angle when the optical axis is inclined in the left-right direction with respect to the horizontal plane, so the light receiving system can correct the left-right inclination of its optical axis with respect to the horizontal plane by the detected inclination angle. That is, the target area in which the object is present is kept constant even if the optical radar device itself is tilted in the left-right direction with respect to the horizontal plane.
In an optical radar device according to Aspect 5 of the present invention, in any one of Aspects 1 to 4, the pulsed light irradiation area 10 of the pulsed light illumination system (110, 110a) may be large enough to include at least the entire target area 12.
According to the above configuration, since the pulsed light irradiation area of the pulsed light illumination system is large enough to include at least the entire target area, the entire target area can always be measured even if the optical radar device is tilted.
In an optical radar device according to Aspect 6 of the present invention, in any one of Aspects 1 to 5, the measurement region θov of the light receiving system (150, 150a) may be large enough to include at least the entire target area 12.
According to the above configuration, since the measurement region of the light receiving system is large enough to include at least the entire target area, the entire target area can always be measured even if the optical radar device is tilted.
In an optical radar device according to Aspect 7 of the present invention, in any one of Aspects 1 to 6, the pulsed light illumination system (110, 110a) may include a one-dimensional scanning device 131 that irradiates the target area 12 with spot light obtained by converting the pulsed light 124 emitted by the light source 122.
In an optical radar device according to Aspect 8 of the present invention, in Aspect 7, the one-dimensional scanning device 131 preferably irradiates the spot light in the vertical direction with respect to the target area 12.
According to the above configuration, the pulsed light emitted by the light source can be directed onto the target area and scanned with the simple configuration of a one-dimensional scanning device.
In an optical radar device according to Aspect 9 of the present invention, in any one of Aspects 1 to 8, the light receiving system (150, 150a) may include a two-dimensional pixel array.
In an optical radar device according to Aspect 10 of the present invention, in any one of Aspects 1 to 6, the pulsed light illumination system (110c) may include a two-dimensional scanning device that irradiates the target area 12 with the pulsed light 124 emitted by the light source 122.
According to the above configuration, the target area can be scanned collectively.
In an optical radar device according to Aspect 11 of the present invention, in any one of Aspects 1 to 10, the pulsed light 124 may be infrared light.
According to the above configuration, an optical radar device that takes advantage of the various properties of infrared light can be realized.
An optical radar device according to Aspect 12 of the present invention is, in any one of Aspects 1 to 9, characterized in that its light receiving unit has a three-dimensional image element comprising: a light receiving unit 154 in which pixels Px(i,j) are arranged in a two-dimensional matrix; pixel storage elements Mx(j), each supplied with electric pulses from the pixels constituting one column of the pixels Px(i,j); and a signal processing circuit DS that reads out the data stored in the pixel storage elements Mx(j) and acquires, for each pixel Px(i,j), at least distance information indicating the distance to the object, wherein each pixel storage element Mx(j) has a plurality of binary counters that integrate the number of the electric pulses at mutually different timings, and the signal processing circuit DS executes the readout of the data stored in the pixel storage elements Mx(j) in parallel with the integration by the binary counters.
The above configuration provides the same effect as Aspect 1.
The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims. Embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.
10, 10a, 10b, 10c  Pulsed light irradiation area
11, 11a, 11c  Measurable area
12  Target area
13a  Measurement area
100, 100a, 100b, 100c  Optical radar device
110, 110a, 110b, 110c  Pulsed light illumination system
122  Light source
123  Fan-shaped light irradiation system
123a  Fan-shaped light irradiation system
123b  Two-dimensional scanning device
123c  Diffusion optical system
124  Fan-shaped pulsed light
124a  Fan-shaped pulsed light
124b  Spot pulsed light
124c  Pulsed light
125b  Reflected light
130  Collimated light generator
131, 131a  One-dimensional scanning device
131b  Scanning mirror
132, 132a  Fan-shaped beam generator
133  Spot light
140  Horizontal sensor
150, 150a, 150b, 150c  Light receiving system
151, 151b  Imaging optical system
152  Optical band-pass filter
153, 153a, 153b, 153c  ToF sensor
154, 154a, 154b, 154c  Light receiving unit
155, 155a  Signal storage processing unit
156  Mirror
157  Opening
160, 160a, 160b, 160c  Control unit
161  Row selection circuit
163  Memory selection circuit
170  External system
Claims (12)
1. An optical radar device that measures the distance to an object present in a predetermined target area by a time-of-flight method, comprising:
a light source that emits pulsed light;
a pulsed light illumination system that irradiates at least a part of the target area with the pulsed light emitted by the light source;
a light receiving system that detects light of the pulsed light irradiated by the pulsed light illumination system and reflected from the object, and measures the distance to the object; and
a horizontal sensor that detects an inclination angle of the optical axis of the light receiving system with respect to a horizontal plane,
wherein the light receiving system corrects the inclination of its optical axis with respect to the horizontal plane by the inclination angle detected by the horizontal sensor.
2. The optical radar device according to claim 1, wherein the light irradiation range of the pulsed light illumination system is corrected by the inclination angle detected by the horizontal sensor.
3. The optical radar device according to claim 1 or 2, wherein the horizontal sensor detects the inclination angle when the optical axis is inclined forward with respect to the horizontal plane.
4. The optical radar device according to any one of claims 1 to 3, wherein the horizontal sensor detects the inclination angle when the optical axis is inclined in the left-right direction with respect to the horizontal plane.
5. The optical radar device according to any one of claims 1 to 4, wherein the pulsed light irradiation area of the pulsed light illumination system is large enough to include at least the entire target area.
6. The optical radar device according to any one of claims 1 to 5, wherein the measurement area of the light receiving system is large enough to include at least the entire target area.
7. The optical radar device according to any one of claims 1 to 6, wherein the pulsed light illumination system comprises a one-dimensional scanning device that irradiates the target area with linearly spread light obtained by converting the pulsed light emitted by the light source.
8. The optical radar device according to claim 7, wherein the one-dimensional scanning device irradiates the linearly spread light in the vertical direction with respect to the target area.
9. The optical radar device according to any one of claims 1 to 8, wherein the light receiving system includes a two-dimensional pixel array.
10. The optical radar device according to any one of claims 1 to 6, wherein the pulsed light illumination system comprises a two-dimensional scanning device that irradiates the target area with the pulsed light emitted by the light source.
11. The optical radar device according to any one of claims 1 to 10, wherein the pulsed light is infrared light.
12. The optical radar device according to any one of claims 1 to 9, wherein the light receiving unit comprises a three-dimensional image element including: a light receiving unit in which pixels are arranged in a two-dimensional matrix; pixel storage elements each supplied with electric pulses from the pixels constituting one column of the pixels; and a signal processing circuit that reads out data stored in the pixel storage elements and acquires, for each pixel, at least distance information indicating the distance to the object, wherein each pixel storage element has a plurality of binary counters that integrate the number of the electric pulses at mutually different timings, and the signal processing circuit executes the readout of the data stored in the pixel storage elements in parallel with the integration by the binary counters.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-145389 | 2017-07-27 | |
JP2017145389 | 2017-07-27 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019021887A1 (en) | 2019-01-31 |
Family
ID=65040616
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/026748 WO2019021887A1 (en) | 2017-07-27 | 2018-07-17 | Optical radar device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019021887A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111722239A (en) * | 2019-03-19 | 2020-09-29 | 株式会社东芝 | Light receiving device and distance measuring device |
WO2021049151A1 (en) * | 2019-09-13 | 2021-03-18 | ソニーセミコンダクタソリューションズ株式会社 | Distance measurement device and distance measurement mechanism deviation adjustment method for same |
CN113614604A (en) * | 2019-03-14 | 2021-11-05 | 株式会社理光 | Light source device, detection device, and electronic apparatus |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003269897A (en) * | 2002-03-18 | 2003-09-25 | Mitsubishi Heavy Ind Ltd | Infrared image correcting device, missile guiding device provided therewith and infrared image correcting method |
JP2003344045A (en) * | 2002-05-29 | 2003-12-03 | Yaskawa Electric Corp | Image processing apparatus |
JP2004523769A (en) * | 2001-04-04 | 2004-08-05 | インストロ プレシジョン リミテッド | Surface shape measurement |
JP2004317134A (en) * | 2003-04-11 | 2004-11-11 | Daihatsu Motor Co Ltd | Object recognition device of vehicle and object recognition method |
JP2011203122A (en) * | 2010-03-25 | 2011-10-13 | Nippon Soken Inc | Optical radar apparatus |
JP2013117475A (en) * | 2011-12-05 | 2013-06-13 | Toyota Motor Corp | Obstacle detector |
JP2015075382A (en) * | 2013-10-08 | 2015-04-20 | 株式会社デンソー | Object detection device |
JP2016125999A (en) * | 2014-12-26 | 2016-07-11 | 株式会社デンソー | Pitching determination device |
- 2018
  - 2018-07-17 WO PCT/JP2018/026748 patent/WO2019021887A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004523769A (en) * | 2001-04-04 | 2004-08-05 | インストロ プレシジョン リミテッド | Surface shape measurement |
JP2003269897A (en) * | 2002-03-18 | 2003-09-25 | Mitsubishi Heavy Ind Ltd | Infrared image correcting device, missile guiding device provided therewith and infrared image correcting method |
JP2003344045A (en) * | 2002-05-29 | 2003-12-03 | Yaskawa Electric Corp | Image processing apparatus |
JP2004317134A (en) * | 2003-04-11 | 2004-11-11 | Daihatsu Motor Co Ltd | Object recognition device of vehicle and object recognition method |
JP2011203122A (en) * | 2010-03-25 | 2011-10-13 | Nippon Soken Inc | Optical radar apparatus |
JP2013117475A (en) * | 2011-12-05 | 2013-06-13 | Toyota Motor Corp | Obstacle detector |
JP2015075382A (en) * | 2013-10-08 | 2015-04-20 | 株式会社デンソー | Object detection device |
JP2016125999A (en) * | 2014-12-26 | 2016-07-11 | 株式会社デンソー | Pitching determination device |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113614604A (en) * | 2019-03-14 | 2021-11-05 | 株式会社理光 | Light source device, detection device, and electronic apparatus |
CN113614604B (en) * | 2019-03-14 | 2024-05-14 | 株式会社理光 | Light source device, detection device, and electronic apparatus |
CN111722239A (en) * | 2019-03-19 | 2020-09-29 | 株式会社东芝 | Light receiving device and distance measuring device |
CN111722239B (en) * | 2019-03-19 | 2024-04-23 | 株式会社东芝 | Light receiving device and distance measuring device |
WO2021049151A1 (en) * | 2019-09-13 | 2021-03-18 | ソニーセミコンダクタソリューションズ株式会社 | Distance measurement device and distance measurement mechanism deviation adjustment method for same |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
JP7146004B2 (en) | Synchronous spinning LIDAR and rolling shutter camera system | |
CN111352091B (en) | Real-time gating and signal routing in laser and detector arrays for LIDAR applications | |
CN109642952B (en) | Hybrid scanning LIDAR system | |
US20210311171A1 (en) | Improved 3d sensing | |
US8675181B2 (en) | Color LiDAR scanner | |
CN110325879B (en) | System and method for compressed three-dimensional depth sensing | |
JP6784295B2 (en) | Distance measurement system, distance measurement method and program | |
US9285477B1 (en) | 3D depth point cloud from timing flight of 2D scanned light beam pulses | |
US9621876B2 (en) | Scanning 3D imager | |
US9435891B2 (en) | Time of flight camera with stripe illumination | |
US11592530B2 (en) | Detector designs for improved resolution in lidar systems | |
US10571574B1 (en) | Hybrid LADAR with co-planar scanning and imaging field-of-view | |
US11555926B2 (en) | Optical device, measurement device, robot, electronic apparatus, mobile object, and shaping device | |
JP2017534868A (en) | 3D lidar sensor based on 2D scanning of 1D optical emitter | |
JP2007279017A (en) | Radar system | |
WO2021212916A1 (en) | Tof depth measurement apparatus and method, and electronic device | |
US11675064B2 (en) | Optical radar apparatus | |
WO2019021887A1 (en) | Optical radar device | |
US20200284882A1 (en) | Lidar sensors and methods for the same | |
JP2017520755A (en) | 3D coarse laser scanner | |
WO2018230203A1 (en) | Imaging device | |
US11156716B1 (en) | Hybrid LADAR with co-planar scanning and imaging field-of-view | |
US11828881B2 (en) | Steered LIDAR system with arrayed receiver | |
US20200004127A1 (en) | Light source, projection device, measurement device, robot, electronic device, mobile object, and shaping apparatus | |
US11796643B2 (en) | Adaptive LIDAR scanning methods |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18838968; Country of ref document: EP; Kind code of ref document: A1 |
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 18838968; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: JP |