WO2018100648A1 - Optical scanning device and optical rangefinder - Google Patents

Optical scanning device and optical rangefinder

Info

Publication number
WO2018100648A1
Authority
WO
WIPO (PCT)
Prior art keywords: unit, light, vibration, optical, image
Prior art date
Application number: PCT/JP2016/085427
Other languages: French (fr), Japanese (ja)
Inventor: 山田 健一郎, 愼介 尾上, 誠 保坂, 瀬尾 欣穂
Original Assignee: マクセル株式会社 (Maxell, Ltd.)
Priority date
Filing date
Publication date
Application filed by マクセル株式会社
Priority to PCT/JP2016/085427
Publication of WO2018100648A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00: Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08: Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10: Scanning systems

Definitions

  • the present invention relates to optical scanning technology and optical ranging technology.
  • related technologies include in-vehicle sensors, such as an in-vehicle camera or a scanning sensor that processes reflected light and obtains surrounding information in a time-division manner, and ADB (Adaptive Driving Beam) headlights, which control the light distribution of automobile headlamps so as not to dazzle the drivers of oncoming and preceding vehicles.
  • as a prior art for addressing the influence of vibration on an in-vehicle sensor, there is the technique disclosed in Patent Document 1.
  • in that technique, a moving component that moves up and down in the image captured by the camera in accordance with the vibration of the vehicle is identified, and the intensity of each vibration frequency of the moving component is determined based on the movement history of the identified moving component.
  • the frequency range having the highest vibration intensity is extracted, an equilibrium image captured at the vibration equilibrium point within the extracted frequency range is extracted, and, based on the extracted equilibrium image, the vanishing point, which is the center position of the displacement in the image displaced by vibration, is detected. The detected vanishing point is then monitored to calculate the amount of image displacement due to vehicle vibration in each captured image (summary excerpt).
  • in a scanning sensor, however, the influence of vehicle vibration is observed as distortion in the detection result of a single frame.
  • a different solution is therefore required to correct the influence of external vibration in the obtained image and obtain a detection result without distortion.
  • similarly, in an optically scanned headlight, the influence of vehicle vibration appears as distortion of the irradiated area of the headlight.
  • such an optically scanned headlight therefore also requires a means of correcting the influence of external vibration so as to obtain an irradiation result without distortion.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to efficiently reduce the influence of external vibration when external vibration is applied to a scanning apparatus.
  • to achieve this object, the present invention provides an optical scanning device comprising: a light deflection unit that deflects measurement light, which is the light to be projected, in a light projecting direction determined by a swing direction and a translation direction intersecting the swing direction; a light projecting unit that projects the measurement light toward the light deflection unit; a light receiving unit that receives reflected light of the measurement light from an object; a propagation direction control unit that drives and controls the light deflection unit so that the swing direction coincides with the main vibration direction, which is the direction of the main vibration; and an output unit that outputs information obtained from the reflected light as output information.
  • the present invention also provides an optical distance measuring device comprising: the optical scanning device; a distance image generation unit that generates a distance image from the output information; and a correction unit that corrects distortion of the distance image caused by the main vibration using the displacement amount of the optical scanning device and outputs the corrected distance image.
  • (a) is a flowchart of the swing direction control process of the first embodiment, and (b) is a flowchart of the distance measurement process of the first embodiment.
  • (a) and (b) are explanatory diagrams of the time history of the displacement observed by the vibration detection device under external vibration in the first embodiment, and (c) is an explanatory diagram of the manner in which the optical distance measuring device vibrates.
  • (a) and (b) are explanatory diagrams of the relationship between the light emitting region, the non-light emitting region, and the visual field range, without and with external vibration, respectively.
  • (a)-(c) are explanatory diagrams of the influence on the detection result when external vibration is applied to the optical distance measuring device.
  • (a) and (b) are the simulation result of the trajectory drawn by the propagation direction without external vibration and a partially enlarged view of it.
  • (a) and (b) are the simulation result of the trajectory drawn by the propagation direction when external vibration is applied and a partially enlarged view of it.
  • (a)-(e) are explanatory diagrams of the influence of vibration and the effect of the correction processing.
  • (a) and (b) are explanatory diagrams of the distance image of the second embodiment.
  • (a)-(c) are explanatory diagrams of the relationship between the detection result of the second embodiment and the actual information.
  • (a) is the detection region in the second embodiment, (b) is an example of waveform distortion caused by external vibration, and (c) is an example of waveform distortion when the shape of the object itself is distorted.
  • a flowchart of the displacement amount calculation process of the second embodiment.
  • (a)-(d) are explanatory diagrams of the displacement amount calculation method of the second embodiment.
  • (a) and (b) are explanatory diagrams of the displacement amount calculation method of the second embodiment.
  • a configuration diagram of the optical distance measuring device of the third embodiment.
  • (a) illustrates the change in the visual field when external vibration is applied to a scanning sensor, and (b) illustrates the change in the visual field when external vibration is applied to a non-scanning sensor.
  • a flowchart of the displacement amount calculation process of the third embodiment.
  • (a)-(c) are explanatory diagrams of the process of calculating the pixel shift amount in the third embodiment.
  • (a)-(c) are explanatory diagrams of the process of calculating the pixel shift amount in the third embodiment.
  • in the present embodiment, a vibration detection sensor is provided, and its output is used to determine the swing direction of the scanning optical distance measuring device. Furthermore, the distortion of the distance image caused by the vibration of the moving body, such as a vehicle, on which the optical distance measuring device is mounted is corrected using the detection result of the vibration detection sensor.
  • in the following description, the moving body is an automobile (vehicle).
  • the optical distance measuring device 100 of the present embodiment comprises the optical scanning device 200, a distance image generation unit 110 that generates a distance image from the output information of the optical scanning device 200, and a correction unit 120 that corrects the distance image generated by the distance image generation unit 110.
  • the optical distance measuring device 100 is connected to the ECU 300, and the distance image corrected by the correction unit 120 is output to the ECU 300.
  • the ECU 300 is, for example, an engine control unit that controls the automobile engine.
  • the optical scanning device 200 is a scanning sensor that acquires surrounding information in a time-division manner by scanning a light spot with a driven scanner.
  • the scanner is swung in a predetermined swing direction 402 with a predetermined swing amplitude while being translated in a translation direction 403 that intersects the swing direction.
  • light is projected during this motion, so that a predetermined distance measurement area (visual field range) 401 is scanned.
  • a black arrow indicates the trajectory 404 drawn by the propagation direction of the scanner.
  • the two orthogonal axes of the visual field are set so as to be substantially parallel to the X axis and the Y axis, which are the two detection axes of the vibration detection device 240 described later.
  • hereinafter, the horizontal axis and the vertical axis of the visual field of the optical scanning device 200 (optical distance measuring device 100) are also referred to as the X axis and the Y axis.
  • the optical scanning device 200 includes a light projecting unit 211, a scanner 212, a scanning angle detection unit 215, a propagation direction control unit 216, a light receiving unit 221, an I/V conversion unit 222, an amplifier circuit 223, an ADC 224, a signal processing unit 225, and a control unit 230.
  • the light projecting unit 211 projects laser light (light spot) toward the scanner 212.
  • the light projection is performed at a predetermined light projection timing. Further, the laser light is modulated into a pulse shape or a sine wave shape, and projected as modulated light.
  • the light projecting unit 211 includes, for example, a laser diode, a collimating lens that shapes the beam to a desired diameter, and the like.
  • the scanner 212 includes an optical deflection unit 213 and a drive unit 214.
  • the light deflection unit 213 has a mirror surface that deflects light.
  • the light deflection unit 213 has at least one drive shaft.
  • the light deflection unit 213 may have, for example, two swing axes for two-dimensional scanning: a long axis that is a high-speed swing axis and a short axis that is a low-speed swing axis.
  • the drive unit 214 supports the light deflection unit 213 and drives the drive shaft according to the drive signal.
  • the drive unit 214 drives each drive axis in accordance with a drive signal from the propagation direction control unit 216 described later, so that the light deflection unit 213 is swung in a desired direction with a desired amplitude while being translated in a desired direction.
  • the light projection destination of the light projecting unit 211 is, for example, approximately the swing center of the light deflection unit 213.
  • the modulated light projected by the light projecting unit 211 is deflected by the light deflecting unit 213 and is irradiated to the distance measuring range as measurement light.
  • the measurement light is reflected by objects around the vehicle on which the optical distance measuring device 100 is mounted.
  • the scanning angle detection unit 215 detects the angle of the light deflection unit 213. Then, the detection result is output to the propagation direction control unit 216.
  • the scanning angle detection unit 215 may be any device that can detect the angle of the light deflection unit 213.
  • for example, a capacitance-type rotary encoder or an optical encoder is used.
  • the rotary encoder is attached to the actuator unit. An optical encoder instead uses, for example, reflection from the back surface of the light deflection unit 213, that is, the side opposite to the light projecting unit 211. The optical encoder is therefore placed at a position where this back-surface reflection can be received.
  • the propagation direction control unit 216 controls the swing of the scanner 212 (light deflecting unit 213) in accordance with an instruction from the control unit 230.
  • the control is performed using the drive signal received from the control unit 230 and the output of the scanning angle detection unit 215 in accordance with the control timing signal received from the control unit 230.
  • a drive signal is output for each drive axis of the scanner 212 (light deflector 213).
  • the propagation direction control unit 216 controls the swing direction of the scanner 212 (light deflection unit 213) so as to be substantially parallel to the main vibration direction, and controls the swing amplitude and the translation amount in the translation direction so as to cover the visual field range 401.
  • the light receiving unit 221 receives the reflected light of the measurement light and performs photoelectric conversion.
  • the light receiving unit 221 includes, for example, a photodiode that performs photoelectric conversion and an optical system that includes a condensing mirror that guides reflected light to the photodiode.
  • the output unit 226 includes, for example, an I / V conversion unit (current / voltage conversion unit) 222, an amplifier circuit 223, an ADC (analog / digital converter) 224, and a signal processing unit 225.
  • the reflected light information is converted into voltage information by an I / V conversion unit (current / voltage conversion unit) 222. Then, the converted information is subjected to analog noise removal and amplification or reduction to an appropriate voltage value by the amplification circuit 223, and then converted from an analog voltage amount to a digital value by the ADC 224.
  • the signal processing unit 225 instructs the light projecting unit 211 to project the modulated light, processes the reflected light received by the light receiving unit 221, and calculates a delay amount from light projection to light reception.
  • the light projection instruction is performed by outputting a light projection timing signal to the light projecting unit 211.
  • the light projection timing signal is generated according to a control timing signal output from a pulse signal generation unit 235 described later. In accordance with the control timing signal, the propagation direction of the light deflecting unit 213 is determined, and the timing for causing the light projecting unit 211 to emit light in order to project the modulated light is controlled. That is, the signal processing unit controls the timing of light projection so that the light projection range by the light deflection unit 213 becomes a predetermined visual field range.
  • digital filter processing and binarization processing are performed on information converted into digital values by the ADC 224.
  • the binarization process is performed by, for example, a digital comparator.
  • the calculated delay amount is output to the distance image generation unit 110 described later.
  • the control unit 230 controls the scanner 212 (light deflection unit 213) so that its swing direction coincides with the main vibration direction of the optical distance measuring device 100 caused by the main vibration of the vehicle (apparatus) on which the device is mounted, and also controls the operation of the entire optical distance measuring device 100.
  • control unit 230 of the present embodiment includes a distance calculation unit 231, an intensity calculation unit 232, a vibration analysis unit 233, an output processing unit 234, and a pulse signal generation unit 235.
  • the vibration detection device 240 is mounted on the vehicle carrying the optical distance measuring device 100, separately from the optical distance measuring device 100.
  • the vibration detection device 240 is a vibration detection sensor that obtains a displacement amount due to external vibration of a vehicle on which the optical distance measuring device 100 is mounted.
  • the external vibration is, for example, a vibration amount of the optical distance measuring device 100 due to road surface vibration.
  • the displacement from the steady state without external vibration is detected as the vibration amount.
  • the displacement from the steady state is referred to as a displacement amount.
  • the amount of displacement is, for example, a value (Ax, Ay) in each axial direction of two predetermined orthogonal axes (X axis and Y axis).
  • the vibration detection device 240 is realized by, for example, a displacement sensor having a plurality of detection axes. Note that the vibration detection device 240 is not limited to a displacement sensor having a plurality of detection axes; any device that can estimate the displacement of the optical distance measuring device 100 due to external vibration, such as an acceleration sensor having a plurality of detection axes, may be used.
  • the obtained displacement amount is output to the control unit 230 and the correction unit 120.
  • the vibration analysis unit 233 obtains the angle θM of the main vibration direction, measured from a predetermined direction, from the displacement of the optical distance measuring device 100 detected by the vibration detection device 240.
  • the acquired main vibration angle θM is converted into a drive signal for each drive axis via the pulse signal generation unit 235 and output to the propagation direction control unit 216.
  • the main vibration angle θM is calculated as follows.
  • the displacements in the X-axis and Y-axis directions detected by the vibration detection device 240 at the same time are denoted Ax and Ay, respectively.
  • the combined displacement of the X axis and the Y axis (the amplitude due to external vibration) Acom is expressed by equation (1).
  • the main vibration direction θM is calculated by equation (2) as the direction of the combined displacement with respect to the X axis.
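  • equations (1) and (2) appear only as images in the publication; assuming the usual reconstruction, Acom = sqrt(Ax^2 + Ay^2) for equation (1) and θM = arctan(Ay/Ax) for equation (2), the calculation of the vibration analysis unit 233 can be sketched in Python as follows:

    import math

    def main_vibration_direction(ax: float, ay: float):
        """Combine the X/Y displacements detected at the same instant and return
        the combined amplitude (assumed eq. (1)) and the main vibration angle (assumed eq. (2))."""
        a_com = math.hypot(ax, ay)       # eq. (1): combined displacement amplitude
        theta_m = math.atan2(ay, ax)     # eq. (2): angle of the combined displacement from the X axis
        return a_com, theta_m

    # example: displacement mostly along Y (vertical road-surface vibration)
    a_com, theta_m = main_vibration_direction(0.5, 3.0)
    print(a_com, math.degrees(theta_m))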
  • the pulse signal generation unit 235 generates various control signals and outputs them to each unit.
  • a control timing signal for synchronizing various processes, the above-described drive signal, and the like are generated and output.
  • the control timing signal is output to, for example, the propagation direction control unit 216, the distance calculation unit 231, the correction unit 120, and the like.
  • the drive signal is output to the propagation direction control unit 216.
  • the control timing signal may also be output to the intensity information memory 132, the signal processing unit 225, and the like.
  • the control timing signal is generated based on the periodic signal transmitted from the distance calculation unit 231, and the drive signal is generated based on the main vibration direction θM calculated by the vibration analysis unit 233. In this embodiment, the drive signal is generated so that the scanner 212 (light deflection unit 213) swings in a direction 402 substantially parallel to the main vibration direction θM while translating so as to cover the visual field range 401.
  • the distance image generation unit 110 includes a distance calculation unit 231, an intensity calculation unit 232, and a distance information memory 131 and an intensity information memory 132 that store respective calculation results.
  • the distance calculation unit 231 calculates the distance to the object (object) using the delay amount received from the signal processing unit 225.
  • the obtained distance value is converted into a value and data size suitable for storage in the memory, and is then stored in the distance information memory 131, for example in units of frames.
  • Each distance value is stored in a memory having an address associated in advance according to the driving amount of the scanner 212 at the time of acquiring the delay amount according to the control timing signal.
  • a plurality of distance values stored in the distance information memory 131 in units of frames are collectively referred to as a distance image.
  • the intensity calculation unit 232 calculates the intensity value using the delay amount received from the signal processing unit 225.
  • the obtained intensity value is converted into a value and data size suitable for storage in the memory, and is then stored in the intensity information memory 132, for example in units of frames.
  • Each intensity value is also stored in a memory having an address associated in advance according to the driving amount of the scanner 212 when the delay amount is acquired.
  • a plurality of intensity values stored in the intensity information memory 132 in units of frames are collectively referred to as an intensity image.
  • the correction unit 120 corrects the influence of external vibration on the distance value stored in the distance information memory 131 and the intensity value stored in the intensity information memory 132.
  • the correction is performed using the displacement amounts Ax and Ay acquired from the vibration detection device 240.
  • as the displacement amount, the value obtained at the timing when the delay amount used to calculate the distance value and the intensity value was acquired is used.
  • the correction unit 120 performs correction by converting the address of the pixel. Then, the converted results are stored again in the distance information memory 131 and the intensity information memory 132, respectively.
  • the correction performed by the correction unit 120 of the present embodiment corrects distortion within one frame, as will be described later. For this reason, this correction is called distortion correction processing.
  • the X-axis and Y-axis pixel addresses are Px and Py, respectively, and the X-axis and Y-axis displacements corresponding to the pixel coordinates are the displacement amounts Ax and Ay, respectively.
  • the correction unit 120 converts the address of the pixel into Pcx and Pcy according to the following equation (3).
  • Bx and By in Expression (3) represent conversion coefficients of pixel movement amounts with respect to displacement amounts detected by the X-axis and Y-axis vibration detection devices 240, respectively.
  • ROUND is a function that rounds off the value in parentheses for integerization.
  • the function for integerization in Formula (3) is not necessarily limited to the rounding-off process. It may be a function that rounds up or down.
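  • equation (3) is likewise shown only as an image; a plausible reconstruction, in which the corrected address is obtained by subtracting the rounded pixel shift Bx·Ax (or By·Ay) from the original address, is sketched below. The sign convention is an assumption:

    def correct_address(px: int, py: int, ax: float, ay: float,
                        bx: float, by: float) -> tuple[int, int]:
        """Distortion correction of one pixel address (assumed form of eq. (3)).

        px, py : pixel address in the distance/intensity memory
        ax, ay : displacement detected by the vibration detection device 240
        bx, by : conversion coefficients from displacement to pixel shift
        """
        pcx = px - round(ax * bx)   # ROUND() may also be a floor or ceiling, as the text notes
        pcy = py - round(ay * by)
        return pcx, pcy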
  • the output processing unit 234 performs various processes on the intensity information stored in the intensity information memory 132, for example, and outputs the processed intensity information.
  • as the processing, for example, image processing and conversion processing to a digital broadcast signal are performed.
  • the image processing performed here is, for example, gamma correction.
  • the conversion process to digital broadcasting is, for example, ATSC, DVB-T, ISDB-T, or the like.
  • the output destination is the ECU 300 or the like.
  • the functions of the control unit 230, that is, the vibration analysis unit 233, the pulse signal generation unit 235, the distance calculation unit 231, the intensity calculation unit 232, the output processing unit 234, and the correction unit 120, are realized by an information processing device that includes, for example, a CPU, a memory, and a storage device.
  • the CPU realizes each function by loading a program stored in advance in the storage device into the memory and executing it. All or some of the functions may instead be realized by hardware such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the distance information memory 131 and the intensity information memory 132 are constructed in a storage device.
  • the vibration analysis unit 233 calculates the main vibration direction θM using the displacement received from the vibration detection device 240 (step S1101). At this time, the scanning angle detection unit 215 detects the current angle θc of the light deflection unit 213. The pulse signal generation unit 235 generates a drive signal from the main vibration direction θM (step S1103) and transmits it to the propagation direction control unit 216.
  • the propagation direction control unit 216 controls the drive unit 214 according to the drive signal and the current angle θc so that the swing direction θF of the light deflection unit 213 coincides with the main vibration direction θM and the visual field range 401 is scanned (step S1104).
  • the optical scanning device 200 continues the above process according to the control timing signal, returning to step S1101, until an end instruction is received. In this way, in the present embodiment, the swing direction of the scanner is kept substantially parallel to the main vibration direction.
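  • the flow of FIG. 4(a) can be summarized by the following illustrative Python sketch; the callables passed in are placeholders and are not interfaces disclosed in the publication:

    import math

    def swing_direction_control_step(read_displacement, detect_current_angle,
                                     make_drive_signal, drive_scanner):
        """One iteration of the swing-direction control process of FIG. 4(a).
        All four arguments are caller-supplied callables (placeholders)."""
        ax, ay = read_displacement()               # output of the vibration detection device 240
        theta_m = math.atan2(ay, ax)               # step S1101: main vibration direction
        theta_c = detect_current_angle()           # scanning angle detection unit 215
        drive_signal = make_drive_signal(theta_m)  # step S1103: pulse signal generation unit 235
        drive_scanner(drive_signal, theta_c)       # step S1104: swing parallel to theta_m, cover the field of view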
  • FIG. 4B is a processing flow of the distance measuring process of the present embodiment.
  • the processing of the present embodiment is started when the optical distance measuring device 100 is activated.
  • the signal processing unit 225 transmits a light projection timing signal to the light projecting unit 211.
  • the light projecting unit 211 projects the modulated light to the approximate swing center of the light deflecting unit 213 (step S1202).
  • the modulated laser light is irradiated as a measurement light in a predetermined direction by the light deflecting unit 213 and diffusely reflected by objects around the vehicle.
  • the light receiving unit 221 detects light reflected by diffuse reflection (step S1203). Then, the obtained reflected light information is processed by the output unit 226 and input to the signal processing unit 225.
  • the signal processing unit 225 calculates a delay amount from the emission of the measurement light to the reception of the reflected light from the converted reflected light information, and performs signal processing for calculating the intensity information of the reflected light (step S1204). Then, the calculated delay amount and intensity information are output to the distance calculator 231 and the intensity calculator 232, respectively.
  • the distance calculation unit 231 calculates the distance to the object using the received delay amount and stores it in the distance information memory 131.
  • the intensity calculator 232 stores the intensity information in the intensity information memory 132 (step S1206).
  • when data for one frame has been stored in each of the distance information memory 131 and the intensity information memory 132, that is, when a distance image and an intensity image have been generated, the correction unit 120 performs the distortion correction process (step S1206).
  • the correction unit 120 performs the distortion correction process on the distance image and the intensity image by the above method using the displacement received from the vibration detection device 240 according to the control timing signal. Then, the corrected distance image is output to ECU 300. Further, the corrected intensity image is output to ECU 300 via output processing unit 234.
  • the optical distance measuring device 100 continues the above process according to the control timing signal until an end instruction is received (step S1201).
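  • the publication states only that the distance is calculated from the delay amount between light projection and light reception; assuming the standard time-of-flight relation d = c·τ/2, the calculation performed by the distance calculation unit 231 can be sketched as:

    C_LIGHT = 299_792_458.0  # speed of light [m/s]

    def distance_from_delay(delay_s: float) -> float:
        """Convert the projection-to-reception delay into a one-way distance.
        Assumes the standard time-of-flight relation d = c * t / 2."""
        return C_LIGHT * delay_s / 2.0

    # example: a 200 ns round-trip delay corresponds to roughly 30 m
    print(distance_from_delay(200e-9))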
  • vibration from the road surface is applied as external vibration depending on the traveling environment of the vehicle on which the optical distance measuring device 100 is mounted.
  • FIG. 5A and FIG. 5B illustrate the time history of the displacement (the change along the time axis) observed by the vibration detection device 240 in the Y-axis direction and the X-axis direction, respectively.
  • black dots 411 and 412 in FIGS. 5A and 5B represent actual sampled measurement points.
  • as shown in FIGS. 5A and 5B, the displacement waveforms of the X axis and the Y axis caused by external vibration applied to the optical distance measuring device 100 depend on the road surface condition.
  • in some cases, the frequency and phase are substantially the same and only the amplitudes differ;
  • in other cases, the frequencies are substantially the same while the phases differ by 180 degrees and the amplitudes differ.
  • in either case, the mounted vehicle continues to vibrate in the direction of the angle θM, so the main vibration angle θM of the optical distance measuring device 100 is almost constant regardless of the passage of time.
  • FIG. 6A shows the relationship between the light emitting region 424, the non-light emitting region 425, and the visual field range 422 when there is no external vibration.
  • the light deflection unit 213 is swung in a direction substantially parallel to the angle θM, and is translated so as to cover the visual field range.
  • a white arrow indicates a trajectory 421 drawn by the propagation direction of the scanner 212
  • a rectangular area drawn by a broken line indicates a visual field range 422
  • a plurality of rectangular band areas drawn by solid lines indicate column units at the time of scanning.
  • a dark color region in the rectangular band represents a light emitting region 424
  • a light color region represents a non-light emitting region 425.
  • the light emitting area 424 is an area in which the distance is measured by actually causing the light projecting unit 211 to emit light.
  • the non-light emitting area 425 is an area in which the distance measurement is not performed without causing the light projecting unit 211 to emit light.
  • the light emitting region 424 substantially coincides with the rectangular region (visual field range 422) drawn with a broken line. For this reason, distance information for the entire visual field can be obtained by this drive control alone.
  • FIG. 6B shows the relationship between the light emitting region 434, the non-light emitting region 435, and the visual field range 422 when external vibration occurs in the θM direction.
  • the drive control of the light deflection unit 213 is the same as that in FIG.
  • a rectangular area drawn with a broken line indicates a visual field range 422 when there is no external vibration
  • a plurality of rectangular band regions drawn with solid lines indicate column units 433 during scanning.
  • the dark color region represents the light emitting region 434
  • the light color region represents the non-light emitting region 435.
  • the light emitting region 434 vibrates in the angle θM direction due to this vibration. For this reason, the light emitting region 434 is displaced from the original rectangle (the original visual field range 422). Regardless of the presence or absence of external vibration, the distance information detection result is given array addresses on the premise of the original visual field range 422. As a result, the detection result of the optical distance measuring device 100 is affected.
  • FIGS. 7A to 7C are used to explain the influence on the detection result when external vibration is applied to the optical distance measuring device 100.
  • here, a white thick line 441 on the X axis in the visual field range 422 shown in the figure is taken as the detection target.
  • when external vibration is applied, the thick line 441 that is the detection target is detected in a distorted shape, like the white curve 443. That is, a visual field different from the original visual field range 422 is observed because of the external vibration, and a detection result 442 different from the actual shape of the detection target 441 is therefore erroneously obtained.
  • the correction unit 120 performs the distortion correction process on the detection result 442 according to equation (3) above, so that a detection result 441a having the shape of the original detection target 441, indicated by a white bold line, can be obtained.
  • the visual field range 422a obtained by performing the distortion correction process is smaller than the visual field range 422 before correction.
  • the signal processing unit 225 may control the light projection timing signal so that light is emitted only within the corrected visual field range 422a.
  • the corrected visual field range 422a is obtained by the correction unit 120 performing correction processing.
  • the corrected visual field range 422a can also be calculated from the amount of displacement detected by the vibration detection device 240. In this case, the displacement detected by the vibration detection device 240 is also output to the signal processing unit 225, and the signal processing unit 225 controls light projection accordingly.
  • FIGS. 8A and 9A show simulation results of the trajectory drawn by the propagation direction when the swinging direction of the scanner 212 is the X-axis direction.
  • FIG. 8A shows an ideal state where there is no external vibration
  • FIG. 9A shows a simulation result when external vibration occurs in the Y-axis direction.
  • the swing direction is set to be perpendicular to the external vibration direction. Table 1 shows other simulation conditions at this time.
  • FIG. 8 (b) and FIG. 9 (b) are partially enlarged views of FIG. 8 (a) and FIG. 9 (a), respectively.
  • the horizontal axis and the vertical axis in each of these figures represent the pixel (column) number in the X-axis direction and the pixel (row) number in the Y-axis direction, respectively, in the visual field at the time of detection.
  • as shown in FIG. 9A, the trajectory in the propagation direction is repeatedly drawn so that regions where the sinusoidal trajectory is sparse and regions where it is dense alternate in the Y-axis direction. As can be seen from the enlarged view of FIG. 9B, in the sparse regions the trajectory hardly passes, and detection is not possible.
  • in other words, when the swing direction is perpendicular to the external vibration, the detection density becomes uneven, alternating between sparse and dense.
  • in the sparse regions, detection itself is hardly performed, so detection data is missing.
  • in the dense regions, the detection information is excessive, and after distortion correction multiple detection results overlap on a single pixel, which is undesirable from the viewpoint of data utilization efficiency.
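  • the sparse and dense regions can be reproduced with a few lines of NumPy: the scanner swings sinusoidally along X while translating along Y, and an external sinusoidal vibration is added along Y. The numerical values below are illustrative only and are not the conditions of Table 1:

    import numpy as np

    t = np.linspace(0.0, 1.0, 20000)                  # one frame, normalized time
    swing_freq, vib_freq = 200.0, 30.0                # swing and vibration cycles per frame (illustrative)
    x = 100.0 * np.sin(2 * np.pi * swing_freq * t)    # swing along X (pixels)
    y = 100.0 * t + 10.0 * np.sin(2 * np.pi * vib_freq * t)  # translation along Y plus external vibration

    # count how many trajectory samples land on each Y row of the field of view
    counts, _ = np.histogram(y, bins=100)
    print(counts.min(), counts.max())   # sparse rows vs. dense rows when vibration is perpendicular to the swing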
  • FIG. 10A to FIG. 10E show the influence of the vibration and the effect of the correction processing when the scanner 212 is swung in the same direction as the external vibration as in this embodiment.
  • FIG. 10A shows a visual field range 422 and a detection target 441 when there is no vibration.
  • FIG. 10B shows the detection result when external vibration is applied in the Y-axis direction.
  • the detection visual field 422b vibrates with external vibration.
  • the detection result 441c is therefore distorted.
  • however, the detection density remains uniform, without sparse and dense regions, and the detection area is not partially lost.
  • after the distortion correction process, the detected visual field range 422a is smaller than the original visual field range 422, but, as shown in FIG. 10E, the detection result 441a correctly reproduces the shape of the detection target 441.
  • the optical scanning device 200 of the present embodiment thus includes the light deflection unit 213 that deflects the measurement light, which is the light to be projected, in a light projecting direction determined by the swing direction and a translation direction intersecting the swing direction; the light projecting unit 211 that projects the measurement light toward the light deflection unit 213; the light receiving unit 221 that receives the reflected light of the measurement light from an object; and the propagation direction control unit 216 that drives and controls the light deflection unit 213 so that the swing direction coincides with the main vibration direction, which is the direction of the main vibration.
  • a vibration analysis unit 233 that acquires the main vibration direction is further provided.
  • the vibration analysis unit 233 obtains the main vibration direction using the output from the vibration detection device 240 that is a displacement sensor that detects displacement in response to external vibration.
  • the swing direction of the optical scanning device 200 is controlled so as to remain substantially parallel to the vibration direction. As described above, the detection density in the scanning region therefore does not become uneven, and the data can be used efficiently. That is, even when external vibration is applied, uniform detection without sparse regions can be realized.
  • the optical distance measuring device 100 of the present embodiment further includes the distance image generation unit 110, which generates a distance image from the output information of the optical scanning device 200, and the correction unit 120, which corrects distortion of the distance image caused by vibration using the output of the vibration detection device 240.
  • according to the present embodiment, it is possible to obtain a detection result in which the in-frame distortion is corrected while efficiently using the data acquired by the scanning sensor. That is, in a scanning apparatus, distortion of the detection target due to external vibration can be efficiently reduced.
  • in the above embodiment, the main vibration direction θM of the external vibration is detected, and the swing direction is controlled to be substantially parallel to θM.
  • however, when the main vibration direction is known in advance, the device may simply be controlled so that it always swings in that direction.
  • in this case, the light deflection unit 213 may have a drive axis only in that direction, that is, it may be a uniaxial drive, and the vibration detection device 240 and the vibration analysis unit 233 need not be provided.
  • for example, the external vibration received by the optical distance measuring device 100 during traveling is mainly vibration from the road surface, and this vibration is dominant in the direction of gravity. In such a case, the swing direction may be set substantially parallel to the direction of gravity.
  • when, for pixels along a line substantially parallel to the main vibration direction, the shift amounts AxBx and AyBy have substantially the same value, the shift amount need not be calculated for each pixel. This is the case, for example, when the value of θM obtained from equation (2), or the value of a trigonometric function such as tan based on θM, is substantially the same for those pixels.
  • in such a case, the above-described distortion correction processing may be performed as an offset, using the shift amounts AxBx and AyBy calculated for one pixel. That is, the same value may be used for an address column that is substantially parallel to the main vibration direction in the memory that stores the pixel values of the distance image.
  • the control that makes the swing direction substantially parallel to the main vibration direction θM of the external vibration, which is a feature of the present embodiment, is not limited to the optical distance measuring device 100; it can be applied to various optical scanning devices that require scanning of a light spot, such as an automobile headlight or a projector for video projection.
  • the distance information is temporarily stored in the distance information memory 131, a distance image is obtained, and distortion associated with external vibration applied to the optical distance measuring device 100 is corrected for the distance image.
  • it is not limited to this method.
  • a displacement sensor having a high sampling frequency may be used as the vibration detection device 240.
  • the vibration detection device 240 external vibration applied to the optical distance measuring device 100 can be detected at high speed.
  • the correction unit 120 can perform distortion correction in real time.
  • in this case, the propagation direction control unit 216 controls the control amplitudes AMPcx and AMPcy used during swinging, while the scanner 212 swings in a direction substantially parallel to the vibration direction θM and translates so as to cover the visual field range, according to equation (6).
  • AMPx and AMPy in equation (6) are the X-axis and Y-axis components of the control amplitude for swinging with respect to the vibration direction θM.
  • Ax and Ay are the X-axis and Y-axis displacement amounts detected by the vibration detection device 240.
  • Fx and Fy are conversion coefficients of control amplitude with respect to detection values by the X-axis and Y-axis vibration detection devices 240.
  • the distortion is corrected by feeding back the vibration amplitude of the external vibration detected in real time to the amplitude of the swing control.
  • the feedforward correction may be performed by estimating the frequency of the external vibration from the time history of the displacement amount of the external vibration detected by the vibration detection device 240.
  • with this method, the corrected visual field range does not become smaller. Therefore, it is possible to correct the in-frame distortion while maintaining the original wide visual field range.
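  • equation (6) is shown only as an image in the publication; assuming it simply adds the detected displacement, scaled by the conversion coefficients, to the nominal swing amplitudes (the sign and exact form are assumptions), the control amplitude update can be sketched as:

    def feedback_control_amplitude(amp_x: float, amp_y: float,
                                   ax: float, ay: float,
                                   fx: float, fy: float) -> tuple[float, float]:
        """Assumed form of eq. (6): nominal swing amplitudes AMPx/AMPy plus the
        detected displacements Ax/Ay scaled by the conversion coefficients Fx/Fy."""
        amp_cx = amp_x + fx * ax
        amp_cy = amp_y + fy * ay
        return amp_cx, amp_cy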
  • Second Embodiment A second embodiment of the present invention will be described.
  • in the present embodiment, the temporal change of the distance image and the intensity image acquired by the optical distance measuring device is used to correct the influence of external vibration on those images.
  • the output of an external vibration detection sensor is not used.
  • the main vibration direction during traveling is substantially vertical.
  • the main vibration direction is determined in advance, and the swinging direction of the scanner (light deflection unit) is set to this main vibration direction.
  • a case where the mounting destination is a vehicle and the main vibration direction is the Y-axis direction will be described below as an example.
  • FIG. 11 is a configuration diagram of the optical distance measuring device 100a of the present embodiment.
  • the optical distance measuring device 100a of this embodiment basically has the same configuration as that of the first embodiment. However, in the present embodiment, unlike the first embodiment, the output of the external vibration detection sensor (vibration detection device 240) is not used. Therefore, the vibration analysis unit 233 that processes the output of the vibration detection device 240 is not provided.
  • the correction unit 120a does not use the output of the vibration detection device 240 at the time of correction.
  • the present embodiment will be described focusing on distortion correction processing by the correction unit 120a different from the first embodiment.
  • the distance calculator 231 and the intensity calculator 232 temporarily store the calculated distance value and the intensity value of the reflected light in the distance information memory 131 and the intensity information memory 132, respectively. Then, the correction unit 120a corrects the influence of external vibration by converting these addresses using the correction value in units of one frame. At this time, in the present embodiment, the correction value is calculated using a distance image of a plurality of frames.
  • the optical distance measuring device 100 further includes an information matching unit 140 that detects distortion of the distance image and the intensity image due to vibration and calculates a correction amount.
  • when the optical distance measuring device 100 receives external vibration, a visual field 451 different from the original visual field range 422 (the dotted rectangle in the drawing) is detected, as shown in FIG. 12A. Regardless of the presence or absence of external vibration, the distance information detection result is given rectangular array addresses, as shown in FIG. 12B, on the premise of the original visual field range 422.
  • as a result, the distance information 452 stored in the distance information memory 131 has distortion.
  • in the present embodiment, the external vibration detection device 240 is not provided. For this reason, when a detection result with distortion in the Y-axis direction is obtained as shown in FIG. 13A, it cannot be determined from that result alone whether the actual shape of the detection target is itself distorted, as in FIG. 13B, or whether the actual shape is not distorted and the visual field is instead distorted by external vibration, as in FIG. 13C.
  • the information matching unit 140 determines whether the detection result with distortion is due to distortion of the shape of the detection target or due to external vibration. If it is determined that the vibration is due to external vibration, Ax and Ay are calculated and output as the displacement amounts of vibration in the X-axis direction and Y-axis direction. As described above, in the present embodiment, since the main vibration direction is the Y-axis direction, Ay is calculated as the displacement amount.
  • the information matching unit 140 of the present embodiment includes a distortion detection unit 141 and a distortion amplitude detection unit 142.
  • the distortion detection unit 141 calculates the distortion of the same object in the first area and the distortion in the second area, which is closer to the host vehicle than the first area, and determines the cause of the distortion.
  • the distortion detection unit 141 calculates distance information (first distance information) in the first region and distance information (second distance information) in the second region.
  • the first area and the second area are set based on an optical vanishing point 521 in the visual field range of the optical distance measuring device 100a.
  • the first region 522 is set at a position close to the optical vanishing point (far distance), as shown in FIG. 14A.
  • the second region 523 is set to a region (medium distance) obtained by projecting the first region from the vanishing point in an enlarged manner.
  • a case where each is set to a rectangular area will be described as an example.
  • the distance information memory 131 of this embodiment can store distance information for at least two frames.
  • the distortion amplitude detection unit 142 calculates the width of the distortion, that is, the displacement amount Ay. As the displacement amount, the result calculated by the distortion detection unit 141 at the time of the determination is used.
  • distance information of the first area 522 is calculated (step S2101).
  • FIG. 16A shows the distance detection result of the first region 522 when there is no external vibration; it is a distance image in which distance is expressed by differences in shading.
  • the distance value of the same distance from the optical distance measuring device 100a is substantially constant.
  • the distance values of pixel rows having the same coordinate in the Y-axis direction are substantially the same. Further, the distance value increases as the value in the Y-axis direction increases. In terms of the distance image, in the pixel row having the same X-axis coordinate, the distance value increases toward the back side.
  • when external vibration is applied, the distance information also exhibits a distortion synchronized with the vibration (referred to as waveform distortion).
  • the distortion detection unit 141 of the present embodiment calculates the average value D for each pixel column as described above for the first region 522, and determines the presence or absence of waveform distortion. For example, it is determined whether or not each calculated average value D is within a predetermined threshold.
  • the average value D for each pixel column is calculated, and the average value (average value of all data) Dave is calculated.
  • the distortion detection unit 141 determines whether or not there is distortion depending on whether or not the average value D of each pixel column is within the range of Dave ± Dth (step S2102). Note that Dth is determined in advance.
  • the distortion detection unit 141 determines that there is distortion when it does not fit. If it is determined that there is distortion, the waveform distortion amplitude Ad is further calculated.
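  • assuming the first region 522 is held as a two-dimensional array of distance values (rows by columns), the presence check of steps S2101 and S2102 can be sketched in Python as follows (the array layout and variable names are assumptions):

    import numpy as np

    def has_waveform_distortion(region: np.ndarray, d_th: float) -> bool:
        """region: distance values of the first region 522 (rows x columns).
        Returns True if any per-column average D deviates from the overall average Dave by more than Dth."""
        d_per_column = region.mean(axis=0)   # average value D for each pixel column
        d_ave = d_per_column.mean()          # average value of all data, Dave
        return bool(np.any(np.abs(d_per_column - d_ave) > d_th))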
  • the distortion detector 141 further calculates the waveform distortion of the second region 523 after a predetermined time (step S2103).
  • the distortion detection unit 141 calculates the average value Ad2 of the waveform distortion by calculating the average value for each pixel column of the second area 523 by the same method as that of the first area 522.
  • at this time, the movement distance between frames is estimated from the time history of the speed information of the host vehicle received from the engine control unit 300 and the elapsed time from the acquisition of the frame used to calculate the distance information of the first region 522 to the acquisition of the frame used to calculate that of the second region 523. Based on this movement distance, the magnification used when projecting the waveform distortion is calculated.
  • the distortion detection unit 141 determines the cause of the distortion (step S2104).
  • the cause of the distortion is determined by whether the amplitude of the waveform distortion calculated in the first region 522 and that calculated in the second region 523 are substantially the same. If they are, the distortion is determined to be due to external vibration.
  • FIG. 14B and FIG. 14C show the difference in generation between waveform distortion caused by external vibration and waveform distortion caused when the detection target itself is distorted.
  • the amplitude and frequency of the waveform distortion detected due to external vibration are substantially the same in any distance region as shown in FIG. 14B.
  • on the other hand, when the shape of the detection target itself is distorted, the amplitude of the waveform distortion increases substantially linearly with the distance from the optical vanishing point 521, as shown in FIG. 14C.
  • here, let the amplitude of the distance information calculation result in the first region 522 be Ad.
  • in the case of external vibration, the amplitude of the waveform distortion is Ad at all distances, as shown in FIG. 14B.
  • assuming that the distances to the first region 522, the second region 523, and the third region 524 are equally spaced, amplitudes of 2Ad and 3Ad are obtained in the second region and the third region, respectively, as shown in FIG. 14C, when the detection target itself is distorted.
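  • as an illustrative sketch of the factor determination in step S2104 (the tolerance value and the exact comparison are assumptions, not part of the disclosure), a roughly constant amplitude across regions points to external vibration, while an amplitude that grows with the projection magnification points to the shape of the object itself:

    def distortion_cause(ad_first: float, ad_second: float,
                         magnification: float, tolerance: float = 0.1) -> str:
        """ad_first / ad_second: waveform-distortion amplitudes in the first and second regions.
        magnification: projection magnification of the second region relative to the first."""
        if abs(ad_second - ad_first) <= tolerance * ad_first:
            return "external vibration"   # amplitude is constant across distances
        if abs(ad_second - magnification * ad_first) <= tolerance * ad_first:
            return "object shape"         # amplitude scales with the projection magnification
        return "undetermined"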
  • the distortion amplitude detector 142 calculates a displacement amount used for correction by the correction unit 120 (step S2106).
  • the distortion amplitude detection unit 142 calculates the average value Dave of all data in the second region 523 by the same method as that of the first region 522. Then, as shown in FIG. 17B, a difference Dy between the average value D in each column direction and the average value Dave of all data is calculated.
  • the correction unit 120 corrects the address of each pixel value of the distance image and the intensity image using the difference Dy as the displacement amount Ay (step S2107), and ends the process.
  • Px and Py are the addresses of the X-axis and Y-axis pixels in the distance information memory 131 or the intensity information memory 132.
  • Pcx and Pcy are the addresses after the conversion.
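  • steps S2106 and S2107 can be sketched as follows; the conversion coefficient By from the distance difference Dy to a pixel shift is an assumed placeholder, and the address conversion is assumed to follow the same form as equation (3):

    import numpy as np

    def column_displacements(region: np.ndarray) -> np.ndarray:
        """Step S2106: per-column displacement Dy = D (column average) - Dave (overall average)."""
        d_per_column = region.mean(axis=0)
        return d_per_column - d_per_column.mean()

    def corrected_y_addresses(n_rows: int, dy: np.ndarray, by: float) -> np.ndarray:
        """Step S2107: shift the Y address of every pixel in each column by that column's
        displacement Dy converted to pixels (assumed eq. (3)-style correction)."""
        py = np.arange(n_rows)[:, None]                 # original Y addresses as a column vector
        shift = np.round(dy * by).astype(int)[None, :]  # per-column pixel shift
        return py - shift                               # shape (n_rows, n_columns): corrected addresses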
  • if it is determined in step S2102 that there is no distortion, or if it is determined in step S2105 that the cause of the distortion is the shape of the object, the process returns to step S2101 and is repeated.
  • the determination of the cause in step S2104 is not necessarily performed based on the amplitude. Any criterion that can determine whether the detected distortion reflects the actual shape of the detection target or is due to external vibration may be used; for example, a frequency obtained as a result of image analysis may be used.
  • in the above description, the correction amount Ay is calculated from the distance information (waveform distortion) of the second region 523, but the calculation is not limited to this.
  • for example, distance information may be calculated at three positions set at equal intervals from the optical vanishing point 521.
  • in this case, the distance information memory 131 stores distance information for at least as many frames as the number of positions used to calculate the distance information.
  • the third area 524 is preferably an area that covers data of all columns of the visual field.
  • the shape of the area for calculating the distance information is not necessarily limited to a rectangle as shown in FIG. 14A, and any shape can be used as long as the projection from the vanishing point 521 is possible.
  • distortion due to external vibration can be corrected using the distance image itself acquired by the optical distance measuring device 100a.
  • in the present embodiment, a vibration detection sensor separate from the optical distance measuring device 100a is not used. Therefore, even when external vibration is applied to the scanning sensor, a detection result in which the in-frame distortion is corrected can be obtained with the optical distance measuring device 100a alone. That is, it is possible to obtain a high-quality image free from the influence of distortion due to vibration with a simpler configuration.
  • the acquired data can be used efficiently as in the first embodiment by setting the swing direction to the vibration direction.
  • the main vibration direction of external vibration is specified using an image acquired by an image acquisition device separate from the optical distance measuring device.
  • the image is used to correct distortion due to external vibration of the distance image and the intensity image.
  • FIG. 18 is a configuration diagram of the optical distance measuring device 100b of the present embodiment.
  • the optical distance measuring device 100b of this embodiment basically has the same configuration as the optical distance measuring device 100 of the first embodiment.
  • the present embodiment will be described focusing on the configuration different from the first embodiment.
  • in the present embodiment, the displacement amounts Ax and Ay and the main vibration direction θM of the optical distance measuring device 100b are calculated using information from the image acquisition device 600 outside the optical distance measuring device 100b. Using these, the propagation direction control unit 216 controls the swing direction of the scanner 212 (light deflection unit 213), and the correction unit 120 corrects the influence of external vibration.
  • the optical distance measuring device 100b of the present embodiment includes a camera image information memory 133 that stores data acquired by the image acquisition device 600 (hereinafter referred to as a camera image). Further, an information collating unit 140b that calculates correction amounts (displacement amounts) Ax and Ay is provided.
  • the information matching unit 140b includes a distortion amplitude detection unit 142b, a viewpoint conversion unit 143, and an image processing unit 144.
  • the vibration analysis unit 233 of the present embodiment obtains the displacement amounts Ax and Ay from the distortion amplitude detection unit 142b and specifies the main vibration angle θM of the main vibration direction.
  • the propagation direction control unit 216 controls the swing direction of the scanner 212 so as to be substantially parallel to the main vibration direction θM, as in the first embodiment.
  • the image acquisition device 600 is a non-scanning sensor that can acquire pixel values for one frame at a time.
  • the image acquisition apparatus 600 is a camera having a non-scanning image sensor such as a CCD or a CMOS, for example.
  • the image acquisition device 600 has the same or higher frame rate as the optical distance measuring device 100b, and can capture frames at the same timing as the optical distance measuring device 100b by performing synchronization control.
  • the reason why the distortion (displacement amount) due to vibration of the distance image and the intensity image can be calculated will be described.
  • FIG. 19A is a schematic diagram showing a state of the visual field 461 when external vibration is applied to the scanning sensor.
  • the field of view 461 is distorted in synchronism with the vibration as shown in FIG.
  • FIG. 19B is a schematic diagram showing the state of the visual field 471 when external vibration is applied to the non-scanning sensor.
  • visual field distortion can be sufficiently ignored if the sensor is an ideal sensor with a short exposure time.
  • as a detection result, a result in which the entire visual field is shifted by the vibration amplitude at the timing of exposure is obtained.
  • the cause of the distortion of the shape of the detection target obtained by the scanning sensor can be specified by collating the acquisition result of the non-scanning sensor and the information of the scanning sensor acquisition result. That is, it is possible to distinguish whether the detection target itself has a distorted shape or whether the field of view is distorted due to intra-frame distortion due to external vibration.
  • the viewpoint conversion unit 143 performs viewpoint conversion for converting the intensity information (intensity image) stored in the intensity information memory 132 into the viewpoint direction of the camera image acquired by the image acquisition apparatus 600 (step S2101).
  • for the viewpoint conversion, the distance information (distance image) stored in the distance information memory 131 is used.
  • the viewpoint conversion is performed by a known method using information such as the installation position and field of view of the optical distance measuring device 100b and the installation position and field of view of the image acquisition device 600.
  • the viewpoint conversion unit 143 may not be provided.
  • the image processing unit 144 performs various types of image processing on the intensity image and the camera image after the viewpoint conversion (step S3102).
  • as the image processing, for example, lens distortion correction, light amount correction, gamma correction, range conversion by bit number conversion, image size conversion, and the like are performed.
  • this image processing makes comparison between the intensity image and the camera image possible.
  • the distortion amplitude detection unit 142b compares the processed intensity image with the camera image, calculates the pixel shift amount, and calculates the displacement amount (step S3103).
  • the calculation result is output to the vibration analysis unit 233 and the correction unit 120.
  • the shift amount is calculated by matching processing using a template in the image. Details of the shift amount calculation will be described with reference to FIGS. 21 (a) to 22 (c).
  • the pixel shift amount is calculated by matching processing using the template 611 in the camera image 610.
  • the pixel shift amount is calculated for each column of the intensity image 620 in the swing direction.
  • the template 611 shown in FIG. is extracted from the camera image 610. As the region used as the template 611, only a predetermined number M of pixels in one column, for example near the center of the camera image 610, is used.
  • the reason for not using all the pixels in the swinging direction is that the pixel value at the end of the camera image 610 may be missing in the intensity image 620 that is a comparison target due to vibration.
  • the region used for the template 611 is not limited to pixels near the center of the column; pixels at an edge portion may also be used.
  • the distortion amplitude detection unit 142b uses the extracted template 611 to search for a matching part in the intensity image 620.
  • a raster scan search is performed in the swing direction. That is, the matching process is performed by raster scanning on the region of the intensity image 620 having the same number of pixels M as the template 611.
  • the matching uses, for example, the SSD (Sum of Squared Differences) method; the region where the dissimilarity R_SSD given by formula (6) is smallest is taken as the matching region 612 (a code sketch of this matching is given after this list).
  • FIG. 21 (c) shows an example in which the matching region 612 is determined to be a position shifted by Ax in the X axis direction and Ay in the Y axis direction with respect to the camera image 610 as a result of matching.
  • the distortion amplitude detection unit 142b sets the deviation Ay as a displacement.
  • in this case, Ax = 0.
  • the template 611 is extracted along the swing direction θF at that time.
  • as before, the region used for the template 611 is not all the pixels of the column along the swing direction θF in the camera image 610; only a predetermined number M of pixels, for example near the center of that column of the camera image 610, is used.
  • the distortion amplitude detection unit 142b uses the extracted template 611 to search for a matching portion in the intensity image 620 by raster scanning.
  • a region where the dissimilarity R_SSD expressed by the above formula (6) is the smallest is set as a matching region 612.
  • FIG. 22C shows a state in which the value of the difference R_SSD is the smallest at a position shifted by Ax in the horizontal direction and Ay in the vertical direction with respect to the camera image 610 as a result of matching.
  • the distortion amplitude detector 142b calculates the pixel shift amounts Ax and Ay obtained as a result of the matching as displacement amounts.
  • the correction unit 120 that has received the displacement amounts Ax and Ay from the distortion amplitude detection unit 142b performs correction by the same method as in the first embodiment (step S3104).
  • the vibration analysis unit 233 also calculates the main vibration direction θM and the combined displacement (amplitude) Acom by the same method as in the first embodiment.
  • the matching process with the template in this embodiment is not limited to the SSD method; the SAD (Sum of Absolute Differences) method, the NCC (Normalized Cross-Correlation) method, or the ZNCC (Zero-mean Normalized Cross-Correlation) method may be used.
  • the swing direction of the optical scanning device 200 is controlled to be always substantially parallel to the main vibration direction. For this reason, as in the first embodiment, the detection density in the scanning region does not vary, and data can be used efficiently. That is, even when external vibration is applied, it is possible to realize uniform detection without causing sparseness in the detection density.
  • distortion caused by vibration of the distance image is corrected using the output of the image acquisition device 600. Therefore, as in the first embodiment, in the scanning sensor, the distortion of the detection target can be corrected by external vibration while efficiently using the acquired data.
  • although the optical ranging apparatus 100 in each of the above-described embodiments has been described taking as an example the case where it is mounted on an automobile or the like, the mounting target is not limited to an automobile.
  • for example, it may also be mounted on a self-propelled conveyance vehicle such as an automatic guided vehicle, or on a robot, to acquire surrounding information.
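A minimal sketch of the SSD-based template matching referred to in the list above, written in Python with NumPy. It assumes that the camera image and the intensity image have already been viewpoint-converted and preprocessed (steps S2101/S3102) so that their pixel values are comparable; the function name, array layout, and the single-column template are illustrative choices, not taken from the patent.

    import numpy as np

    def estimate_pixel_shift(camera_img, intensity_img, m=32):
        # Template 611: m pixels of the centre column of the camera image 610.
        h, w = camera_img.shape
        col = w // 2
        top = (h - m) // 2
        template = camera_img[top:top + m, col].astype(np.float64)

        best_shift = (0, 0)
        best_ssd = np.inf
        # Raster-scan the intensity image 620 with a window of the same size (m pixels).
        for y in range(h - m + 1):
            for x in range(w):
                window = intensity_img[y:y + m, x].astype(np.float64)
                r_ssd = np.sum((window - template) ** 2)  # dissimilarity R_SSD
                if r_ssd < best_ssd:
                    best_ssd = r_ssd
                    best_shift = (x - col, y - top)       # (Ax, Ay) in pixels
        return best_shift

Replacing the squared difference with an absolute difference or a (zero-mean) normalized cross-correlation gives the SAD, NCC, or ZNCC variants mentioned above; in practice the search would usually be restricted to a small window around zero shift rather than the full raster scan shown here.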

Abstract

In the present invention, when external vibration is applied to a scanning device, the influence of the external vibration is efficiently reduced. Provided is an optical scanning device 200 provided with a light deflector 213 for deflecting measurement light that has been projected in a projection direction determined by a pivoting direction and a translation direction intersecting the pivoting direction, a light projector 211 for projecting the measurement light toward the light deflector, a light receiver 221 for receiving the measurement light reflected by an object, a propagation direction controller 216 for controlling the driving of the light deflector 213 so that the pivoting direction becomes the primary vibration direction that is the primary direction of the vibration of the optical scanning device 200, and an output unit 266 for outputting information obtained from the reflected light as output information.

Description

光走査装置および光測距装置Optical scanning device and optical distance measuring device
 本発明は、光走査技術および光測距技術に関する。 The present invention relates to optical scanning technology and optical ranging technology.
 自動車等の移動体に搭載されて、駆動するスキャナを用いて光スポットを走査する技術がある。例えば、その反射光を処理して時分割的に周囲の情報を取得する車載カメラ等の車載センサ、対向車や前方車の運転者への眩惑を防止するために自動車のヘッドライトの配光を制御するADB(Adaptive Driving Beam)、等に用いられる。これらは、移動体に搭載されるため、外的振動を受ける。これにより、車載センサは、得られる画像に、視野の誤差が発生する。また、ADBでは、配光領域が意図した領域からずれる。 There is a technology for scanning a light spot using a scanner mounted on a moving body such as an automobile. For example, in-vehicle sensors such as an in-vehicle camera that processes the reflected light and obtains surrounding information in a time-sharing manner, and the light distribution of automobile headlights to prevent dazzling the driver of oncoming vehicles and forward vehicles. Used for ADB (Adaptive Driving Beam) to be controlled. Since these are mounted on the moving body, they are subject to external vibration. As a result, the in-vehicle sensor generates a visual field error in the obtained image. Further, in ADB, the light distribution area deviates from the intended area.
 車載センサにおける振動による影響を解決する先行技術として、例えば、特許文献1に開示の技術がある。本技術では、「カメラで撮像された撮像画像内で車両の振動に伴って画像内で上下に移動する移動成分を特定し、特定した移動成分の移動履歴に基づいて移動成分の振動周波数の強度を算出し、振動周波数の強度が最も高い振動周波数範囲を抽出し、抽出した所定の周波数範囲内における振動の平衡点で撮像された平衡画像を抽出し、抽出した平衡画像に基づいて、車両の振動によって変位する画像内の変位の中心位置である消失点を検出し、検出した消失点を監視して、各撮像画像における車両の振動に起因する画像の変位量を算出する(要約抜粋)」。 For example, there is a technique disclosed in Patent Document 1 as a prior art for solving the influence of vibration in an in-vehicle sensor. According to the present technology, “the moving component that moves up and down in the image in accordance with the vibration of the vehicle in the captured image captured by the camera is identified, and the intensity of the vibration frequency of the moving component is determined based on the movement history of the identified moving component. The vibration frequency range having the highest vibration frequency intensity is extracted, the equilibrium image captured at the vibration equilibrium point within the extracted predetermined frequency range is extracted, and based on the extracted equilibrium image, the vehicle The vanishing point, which is the center position of the displacement in the image displaced by vibration, is detected, and the detected vanishing point is monitored to calculate the amount of image displacement due to vehicle vibration in each captured image (summary extract). " .
特開2006-170961号公報JP 2006-170961 A
 特許文献1に開示の技術では、カメラのような、CCDやCMOSといった二次元の撮像素子を用いて一括して周囲の情報を検出する、非走査型センサを用いることを想定している。非走査型センサでは、外的振動の影響は、各フレーム間のずれとして現れる。 In the technique disclosed in Patent Document 1, it is assumed that a non-scanning sensor that detects surrounding information at once using a two-dimensional image sensor such as a camera such as a CCD or CMOS is used. In the non-scanning sensor, the influence of external vibration appears as a shift between the frames.
 しかし、上述のような走査型センサでは車両の振動による影響は単一フレームの検出結果の中に歪みとなって観測される。このため、走査型センサでは、得られる画像において外的振動の影響を補正し、歪みの無い検出結果を得るためには、異なる解決手段が必要となる。また、このとき、取得したデータを効率的に利用したい。ADBにおいても、車両の振動による影響は、ヘッドライトの照射領域の歪みとなって現れる。このため、光走査型のヘッドライトでは外的振動の影響を補正し、歪みの無い検出結果を得る解決手段が必要となる。 However, in the scanning sensor as described above, the influence of vehicle vibration is observed as distortion in the detection result of a single frame. For this reason, in the scanning sensor, different solution means are required to correct the influence of external vibration in the obtained image and obtain a detection result without distortion. At this time, we want to use the acquired data efficiently. Also in ADB, the influence of the vibration of the vehicle appears as distortion of the irradiation area of the headlight. For this reason, the optical scanning headlight requires a solution means for correcting the influence of external vibration and obtaining a detection result without distortion.
 本発明は、上記事情に鑑みてなされたもので、走査型の装置において外的な振動が加わった場合、外的振動の影響を効率的に低減することを目的とする。 The present invention has been made in view of the above circumstances, and an object of the present invention is to efficiently reduce the influence of external vibration when external vibration is applied to a scanning apparatus.
 本発明は、揺動方向と、前記揺動方向と交差する並進方向と、で定まる投光方向に、投光する光である測定光を偏向する光偏向部と、前記測定光を前記光偏向部に向けて投光する投光部と、前記測定光の、対象物による反射光を受光する受光部と、前記揺動方向が、自身の主たる振動方向である主振動方向になるよう前記光偏向部を駆動制御する伝搬方向制御部と、前記反射光から得た情報を出力情報として出力する出力部と、を備えることを特徴とする光走査装置を提供する。 The present invention provides an optical scanning device comprising: a light deflection unit that deflects measurement light, which is the light to be projected, in a projection direction determined by a swing direction and a translation direction intersecting the swing direction; a light projecting unit that projects the measurement light toward the light deflection unit; a light receiving unit that receives the measurement light reflected by an object; a propagation direction control unit that drives and controls the light deflection unit so that the swing direction becomes a main vibration direction, which is the main direction of vibration of the device itself; and an output unit that outputs information obtained from the reflected light as output information.
 また、上記光走査装置と、前記出力情報から距離画像を生成する距離画像生成部と、前記光走査装置の変位量を用いて、前記主たる振動による前記距離画像の歪みを補正し、補正後の距離画像を出力する補正部と、を備えることを特徴とする光測距装置を提供する。 The present invention also provides an optical distance measuring device comprising: the above optical scanning device; a distance image generation unit that generates a distance image from the output information; and a correction unit that corrects distortion of the distance image caused by the main vibration using a displacement amount of the optical scanning device and outputs the corrected distance image.
 本発明によれば、走査型のセンサにおいて外的な振動がセンサに加わった場合においてもフレーム内に歪みの無い検出を実現することができる。上記した以外の課題、構成及び効果は、以下の実施形態の説明により明らかにされる。 According to the present invention, even when external vibration is applied to the scanning type sensor, detection without distortion in the frame can be realized. Problems, configurations, and effects other than those described above will be clarified by the following description of embodiments.
A schematic configuration diagram of the optical distance measuring device of the first embodiment.
A configuration diagram of the optical distance measuring device of the first embodiment.
An explanatory diagram for explaining the scanning method of the optical scanning device of the first embodiment.
(a) is a flowchart of the swing direction control process of the first embodiment, and (b) is a flowchart of the distance measurement process of the first embodiment.
(a) and (b) are explanatory diagrams for explaining the time history of the displacement observed by the vibration detection device under external vibration in the first embodiment, and (c) is an explanatory diagram for explaining how the optical distance measuring device vibrates.
(a) is an explanatory diagram for explaining the relationship between the light emitting region, the non-light emitting region, and the visual field when there is no external vibration, and (b) when there is external vibration.
(a) to (c) are explanatory diagrams for explaining the influence on the detection result when external vibration is applied to the optical distance measuring device.
(a) and (b) are simulation results of the locus drawn by the propagation direction when the swing direction is the X-axis direction and there is no external vibration.
(a) and (b) are simulation results of the locus drawn by the propagation direction when the swing direction of the scanner is the X-axis direction and there is external vibration in the Y-axis direction.
(a) to (e) are explanatory diagrams for explaining the influence of vibration and the effect of correction when the swing direction of the scanner is set substantially parallel to the external vibration direction.
A configuration diagram of the optical distance measuring device of the second embodiment.
(a) and (b) are explanatory diagrams for explaining the distance image of the second embodiment.
(a) to (c) are explanatory diagrams for explaining the relationship between the detection result of the second embodiment and the actual information.
(a) is an explanatory diagram of the detection region in the second embodiment, (b) of an example of waveform distortion caused by external vibration, and (c) of an example of waveform distortion when the shape of the object itself is distorted.
A flowchart of the displacement amount calculation process of the second embodiment.
(a) to (d) are explanatory diagrams for explaining the displacement amount calculation method of the second embodiment.
(a) and (b) are explanatory diagrams for explaining the displacement amount calculation method of the second embodiment.
A configuration diagram of the optical distance measuring device of the third embodiment.
(a) illustrates the change in the visual field when external vibration is applied to a scanning sensor, and (b) illustrates the change in the visual field when external vibration is applied to a non-scanning sensor.
A flowchart of the displacement amount calculation process of the third embodiment.
(a) to (c) are explanatory diagrams for explaining the process of calculating the pixel shift amount when the vibration direction is the Y-axis direction.
(a) to (c) are explanatory diagrams for explaining the process of calculating the pixel shift amount when the vibration direction is the θF direction with respect to the X axis.
 以下、本発明の実施形態について図面を用いて説明する。以下、本明細書において、同一機能を有するものは、特に断らない限り同一の符号を付し、繰り返しの説明は省略する。 Hereinafter, embodiments of the present invention will be described with reference to the drawings. Hereinafter, in this specification, those having the same function are denoted by the same reference numerals unless otherwise specified, and repeated description is omitted.
 <<第一の実施形態>>
 本発明の第一の実施形態を説明する。本実施形態では、振動検出センサを備え、その出力を用い、走査型の光測距装置の揺動方向を決定する。また、光測距装置が搭載された車両等の移動体の振動による距離画像の歪みを、振動検出センサによる検出結果を用いて補正する。以下、本実施形態では、移動体を自動車、車両とする場合を例にあげて説明する。
<< First Embodiment >>
A first embodiment of the present invention will be described. In this embodiment, a vibration detection sensor is provided, and its output is used to determine the swinging direction of the scanning optical distance measuring device. Further, the distortion of the distance image due to the vibration of a moving body such as a vehicle on which the optical distance measuring device is mounted is corrected using the detection result by the vibration detection sensor. Hereinafter, in the present embodiment, a case where the moving body is an automobile or a vehicle will be described as an example.
 図1および図2は、本実施形態の光測距装置100の構成図である。本実施形態の光測距装置100は、光走査装置200と、光走査装置200の出力情報から距離画像を生成する距離画像生成部110と、距離画像生成部110が生成した距離画像を補正する補正部120と、を備える。光測距装置100は、例えば、ECU300と接続され、補正部120により補正された距離画像は、ECU300に出力される。なお、ECU300は、自動車のエンジンを制御するエンジンコントロールユニットである。 FIGS. 1 and 2 are configuration diagrams of the optical distance measuring device 100 of the present embodiment. The optical distance measuring device 100 of the present embodiment includes the optical scanning device 200, a distance image generation unit 110 that generates a distance image from the output information of the optical scanning device 200, and a correction unit 120 that corrects the distance image generated by the distance image generation unit 110. The optical distance measuring device 100 is connected to, for example, the ECU 300, and the distance image corrected by the correction unit 120 is output to the ECU 300. The ECU 300 is an engine control unit that controls the engine of the automobile.
 [光走査装置]
 光走査装置200は、駆動するスキャナを用いて光スポットを走査することで時分割的に周囲の情報を取得する走査型のセンサである。例えば、図3に示すように、スキャナを、予め定めた揺動方向402に、予め定めた揺動振幅で揺動させながら、揺動方向とは交差する並進方向403に並進させ、光スポットを投光し、予め定めた測距領域(視野範囲)401内を走査する。図中、黒色の矢印は、スキャナの伝搬方向が描く軌跡404を示す。
[Optical scanning device]
The optical scanning device 200 is a scanning sensor that acquires surrounding information in a time-division manner by scanning a light spot using a driving scanner. For example, as shown in FIG. 3, the scanner is oscillated in a predetermined oscillating direction 402 with a predetermined oscillating amplitude, while being translated in a translation direction 403 that intersects the oscillating direction. The light is projected, and a predetermined distance measurement area (field-of-view range) 401 is scanned. In the figure, a black arrow indicates a locus 404 drawn by the propagation direction of the scanner.
 本実施形態の光走査装置200(光測距装置100)では、視野における直交する二次元の軸を、後述する振動検出装置240の二次元の軸であるX軸、Y軸と略平行となるように設置する。以降、説明を簡単にする為、光走査装置200(光測距装置100)の視野における横軸(水平軸)および縦軸(垂直軸)についてもX軸およびY軸と表記する。 In the optical scanning device 200 (optical distance measuring device 100) of the present embodiment, the orthogonal two-dimensional axes in the field of view are substantially parallel to the X axis and the Y axis, which are two-dimensional axes of the vibration detection device 240 described later. Install as follows. Hereinafter, in order to simplify the description, the horizontal axis (horizontal axis) and the vertical axis (vertical axis) in the field of view of the optical scanning device 200 (optical ranging device 100) are also expressed as the X axis and the Y axis.
 これを実現するため、光走査装置200は、投光部211と、スキャナ212と、走査角度検出部215と、伝搬方向制御部216と、受光部221と、I/V変換部222と、増幅回路223と、ADC224と、信号処理部225と、制御部230と、を備える。 In order to realize this, the optical scanning device 200 includes a light projecting unit 211, a scanner 212, a scanning angle detection unit 215, a propagation direction control unit 216, a light receiving unit 221, an I/V conversion unit 222, an amplification circuit 223, an ADC 224, a signal processing unit 225, and a control unit 230.
 投光部211は、レーザ光(光スポット)をスキャナ212に向けて投光する。投光は、予め定めた投光タイミングにてなされる。また、レーザ光は、パルス形状あるいは正弦波形状等に変調され、変調光として投光される。投光部211は、例えば、レーザダイオード、レーザ径を所望のレーザ径に成形するコリメートレンズ等で構成される。 The light projecting unit 211 projects laser light (light spot) toward the scanner 212. The light projection is performed at a predetermined light projection timing. Further, the laser light is modulated into a pulse shape or a sine wave shape, and projected as modulated light. The light projecting unit 211 includes, for example, a laser diode, a collimating lens that shapes the laser diameter to a desired laser diameter, and the like.
 スキャナ212は、光偏向部213と、駆動部214とを備える。 The scanner 212 includes an optical deflection unit 213 and a drive unit 214.
 光偏向部213は、光を偏向するミラー面を有する。また、光偏向部213は、少なくとも1軸以上の駆動軸を有する。光偏向部213は、例えば、長軸かつ高速な揺動軸と短軸かつ低速な揺動軸の二次元の揺動軸を有してもよい。 The light deflection unit 213 has a mirror surface that deflects light. The light deflection unit 213 has at least one drive shaft. The light deflecting unit 213 may have, for example, a two-dimensional swing axis that is a long axis and a high speed swing axis and a short axis and a low speed swing axis.
 駆動部214は、光偏向部213を支持するとともに、駆動信号に従って、駆動軸を駆動させる。本実施形態では、駆動部214は、後述する伝搬方向制御部216からの駆動信号に従って、各駆動軸を駆動させることにより、光偏向部213を、所望の方向に、所望の振幅で、揺動駆動させるとともに、所望の方向に、並進駆動させる。 The drive unit 214 supports the light deflection unit 213 and drives the drive shaft according to the drive signal. In the present embodiment, the drive unit 214 swings the light deflection unit 213 in a desired direction with a desired amplitude by driving each drive axis in accordance with a drive signal from a propagation direction control unit 216 described later. While being driven, it is translationally driven in a desired direction.
 なお、投光部211の投光先は、例えば、光偏向部213の略揺動中心とする。投光部211が投光した変調光は、光偏向部213で偏向され、測定光として測距範囲に照射される。測定光は、光測距装置100が搭載される車両の周囲の物体で反射される。 Note that the light projection destination of the light projecting unit 211 is, for example, the substantially swing center of the light deflecting unit 213. The modulated light projected by the light projecting unit 211 is deflected by the light deflecting unit 213 and is irradiated to the distance measuring range as measurement light. The measurement light is reflected by objects around the vehicle on which the optical distance measuring device 100 is mounted.
 走査角度検出部215は、光偏向部213の角度を検出する。そして、検出結果を伝搬方向制御部216に出力する。なお、走査角度検出部215は、光偏向部213の角度を検出可能な装置であればよい。例えば、静電容量式のロータリーエンコーダ、あるいは、光学式エンコーダ等を用いる。ロータリーエンコーダは、アクチュエータ部に取り付けられる。なお、光学式エンコーダは、光偏向部213を挟んで、投光部211の側とは逆の裏面反射等を利用する。このため、光学式エンコーダは、裏面反射を受光可能な位置に設けられる。 The scanning angle detection unit 215 detects the angle of the light deflection unit 213. Then, the detection result is output to the propagation direction control unit 216. The scanning angle detection unit 215 may be any device that can detect the angle of the light deflection unit 213. For example, a capacitance type rotary encoder or an optical encoder is used. The rotary encoder is attached to the actuator unit. Note that the optical encoder uses back surface reflection or the like opposite to the light projecting unit 211 side with the light deflecting unit 213 interposed therebetween. For this reason, the optical encoder is provided at a position where the back surface reflection can be received.
 伝搬方向制御部216は、制御部230の指示に従って、スキャナ212(光偏向部213)の揺動を制御する。制御は、制御部230から受信する制御タイミング信号に従って、同じく制御部230から受信する駆動信号と、走査角度検出部215の出力とを用いて行う。なお、駆動信号は、スキャナ212(光偏向部213)の駆動軸毎に出力される。なお、伝搬方向制御部216は、スキャナ212(光偏向部213)の揺動方向を、主振動方向と略平行に制御するとともに、その振幅と並進方向への並進量が視野範囲401をカバーするよう制御する。 The propagation direction control unit 216 controls the swing of the scanner 212 (light deflecting unit 213) in accordance with an instruction from the control unit 230. The control is performed using the drive signal received from the control unit 230 and the output of the scanning angle detection unit 215 in accordance with the control timing signal received from the control unit 230. A drive signal is output for each drive axis of the scanner 212 (light deflector 213). The propagation direction control unit 216 controls the swinging direction of the scanner 212 (light deflecting unit 213) substantially parallel to the main vibration direction, and the amplitude and the translation amount in the translation direction cover the visual field range 401. Control as follows.
 受光部221は、測定光の反射光を受光し、光電変換を行う。受光部221は、例えば、光電変換を行うフォトダイオードと、反射光をフォトダイオードへ導く集光ミラーを含む光学系等とを備える。 The light receiving unit 221 receives the reflected light of the measurement light and performs photoelectric conversion. The light receiving unit 221 includes, for example, a photodiode that performs photoelectric conversion and an optical system that includes a condensing mirror that guides reflected light to the photodiode.
 なお、受光部221で光電変換された反射光の情報(電流値)は、出力部226にて変換、信号処理され、距離画像生成部110に出力される。出力部226は、例えば、I/V変換部(電流電圧変換部)222と、増幅回路223と、ADC(アナログデジタルコンバータ)224と、信号処理部225とを備える。 Note that the information (current value) of the reflected light photoelectrically converted by the light receiving unit 221 is converted and signal processed by the output unit 226 and output to the distance image generation unit 110. The output unit 226 includes, for example, an I / V conversion unit (current / voltage conversion unit) 222, an amplifier circuit 223, an ADC (analog / digital converter) 224, and a signal processing unit 225.
 反射光の情報は、I/V変換部(電流電圧変換部)222にて、電圧の情報に変換される。そして、変換後の情報は、増幅回路223にてアナログ的なノイズ除去および適切な電圧値へと増幅あるいは低減された後、ADC224にてアナログな電圧量からデジタル値へと変換される。 The reflected light information is converted into voltage information by an I / V conversion unit (current / voltage conversion unit) 222. Then, the converted information is subjected to analog noise removal and amplification or reduction to an appropriate voltage value by the amplification circuit 223, and then converted from an analog voltage amount to a digital value by the ADC 224.
 信号処理部225は、投光部211に変調光を投光するタイミングを指示するとともに、受光部221で受光した反射光を処理し、投光から受光までの遅延量を算出する。投光の指示は、投光タイミング信号を投光部211に出力することにより行う。投光タイミング信号は、後述するパルス信号生成部235が出力する制御タイミング信号に応じて生成する。制御タイミング信号に応じて、光偏向部213の伝搬方向を判別し、変調光を投光するために投光部211を発光させるタイミングを制御する。すなわち、信号処理部は、光偏向部213による投光範囲が、予め定めた視野範囲となるよう、投光するタイミングを制御する。 The signal processing unit 225 instructs the light projecting unit 211 to project the modulated light, processes the reflected light received by the light receiving unit 221, and calculates a delay amount from light projection to light reception. The light projection instruction is performed by outputting a light projection timing signal to the light projecting unit 211. The light projection timing signal is generated according to a control timing signal output from a pulse signal generation unit 235 described later. In accordance with the control timing signal, the propagation direction of the light deflecting unit 213 is determined, and the timing for causing the light projecting unit 211 to emit light in order to project the modulated light is controlled. That is, the signal processing unit controls the timing of light projection so that the light projection range by the light deflection unit 213 becomes a predetermined visual field range.
 また、反射光の処理として、ADC224にてデジタル値へと変換された情報に対し、デジタルフィルタ処理および二値化処理を行う。二値化処理は、例えば、デジタルコンパレータで行う。算出した遅延量は、後述の距離画像生成部110に出力される。 In addition, as processing of reflected light, digital filter processing and binarization processing are performed on information converted into digital values by the ADC 224. The binarization process is performed by, for example, a digital comparator. The calculated delay amount is output to the distance image generation unit 110 described later.
 制御部230は、スキャナ212(光偏向部213)の揺動方向が、光測距装置100が搭載される車両(装置)の主な振動による光測距装置100の振動方向(主振動方向)と略平行になるように制御するとともに、光測距装置100全体の動作を制御する。 The control unit 230 is configured such that the swing direction of the scanner 212 (light deflecting unit 213) is the vibration direction (main vibration direction) of the optical distance measuring device 100 due to the main vibration of the vehicle (device) on which the optical distance measuring device 100 is mounted. And the operation of the entire optical distance measuring device 100 is controlled.
 これを実現するため、本実施形態の制御部230は、距離演算部231と、強度演算部232と、振動解析部233と、出力処理部234と、パルス信号生成部235とを備える。 In order to realize this, the control unit 230 of the present embodiment includes a distance calculation unit 231, an intensity calculation unit 232, a vibration analysis unit 233, an output processing unit 234, and a pulse signal generation unit 235.
 なお、本実施形態では、光測距装置100が搭載される車両には、光測距装置100とは別に、振動検出装置240が搭載される。この振動検出装置240は、光測距装置100が搭載される車両の外的振動による変位量を得る振動検出センサである。外的振動は、たとえば、路面振動による光測距装置100の振動量である。本実施形態では、振動量として、外的な振動のない定常状態からの変位を検出する。以下、定常状態からの変位を、変位量と呼ぶ。変位量は、例えば、予め定めた直交二軸(X軸およびY軸)の、各軸方向の値(Ax、Ay)とする。 In the present embodiment, the vibration detection device 240 is mounted on the vehicle on which the optical distance measuring device 100 is mounted, separately from the optical distance measuring device 100. The vibration detection device 240 is a vibration detection sensor that obtains a displacement amount due to external vibration of the vehicle on which the optical distance measuring device 100 is mounted. The external vibration is, for example, the vibration of the optical distance measuring device 100 due to road surface vibration. In the present embodiment, the displacement from a steady state without external vibration is detected as the vibration amount. Hereinafter, the displacement from the steady state is referred to as a displacement amount. The displacement amount is, for example, a value (Ax, Ay) in each axial direction of two predetermined orthogonal axes (the X axis and the Y axis).
 振動検出装置240は、例えば、複数の検出軸を有した変位センサで実現される。なお、振動検出装置240は、複数の検出軸を有した変位センサに限らず、例えば、複数の検出軸を有した加速度センサ等のように光測距装置100の外的振動に伴う変位量を推定出来るものであればよい。得られた変位量は、制御部230および補正部120に出力される。 The vibration detection device 240 is realized by, for example, a displacement sensor having a plurality of detection axes. Note that the vibration detection device 240 is not limited to a displacement sensor having a plurality of detection axes, and for example, an amount of displacement due to external vibration of the optical distance measuring device 100 such as an acceleration sensor having a plurality of detection axes. Anything that can be estimated is acceptable. The obtained displacement amount is output to the control unit 230 and the correction unit 120.
 振動解析部233は、振動検出装置240が検出した光測距装置100の変位量から、振動方向(主振動方向)の、予め定めた方向からの角度(主振動角)θMを取得する。取得した主振動角θMは、パルス信号生成部235を介して、駆動軸毎の駆動信号として、伝搬方向制御部216に出力される。 The vibration analysis unit 233 acquires the angle (main vibration angle) θM of the vibration direction (main vibration direction) from a predetermined direction from the amount of displacement of the optical distance measuring device 100 detected by the vibration detection device 240. The acquired main vibration angle θM is output to the propagation direction control unit 216 as a drive signal for each drive axis via the pulse signal generation unit 235.
 なお、主振動角θMは、以下のように算出される。例えば、振動検出装置240が、同時刻に検出した、X軸およびY軸方向の変位を、それぞれAxおよびAyとする。この場合、X軸およびY軸の、合成変位(振幅;外的振動による振幅)Acomは、以下の式(1)で表される。
Acom = √(Ax² + Ay²)  …(1)
 また、主振動方向θMは、X軸を基準とした合成変位の振動方向として、以下の式(2)で算出される。
θM = tan⁻¹(Ay / Ax)  …(2)
The main vibration angle θM is calculated as follows. For example, the displacements in the X-axis and Y-axis directions detected by the vibration detection device 240 at the same time are Ax and Ay, respectively. In this case, the combined displacement (amplitude; amplitude due to external vibration) Acom of the X axis and the Y axis is expressed by the following equation (1).
Acom = √(Ax² + Ay²)  …(1)
The main vibration direction θM is calculated by the following equation (2) as the vibration direction of the composite displacement with respect to the X axis.
θM = tan⁻¹(Ay / Ax)  …(2)
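As a quick illustration of equations (1) and (2) as reconstructed above, the following short Python sketch computes the combined displacement and the main vibration angle from the two axis displacements; using atan2 to keep the correct quadrant is an implementation choice, not something stated in the patent.

    import math

    def main_vibration(ax: float, ay: float):
        # Combined displacement Acom = sqrt(Ax^2 + Ay^2)        ... equation (1)
        a_com = math.hypot(ax, ay)
        # Main vibration angle thetaM measured from the X axis  ... equation (2)
        theta_m = math.degrees(math.atan2(ay, ax))
        return a_com, theta_m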
 パルス信号生成部235は、各種の制御信号を生成し、各部に出力する。本実施形態では、例えば、各種の処理を同期させるための制御タイミング信号、上述の駆動信号等を生成し、出力する。制御タイミング信号は、例えば、伝搬方向制御部216、距離演算部231、補正部120等に出力される。駆動信号は、伝搬方向制御部216に出力される。なお、制御タイミング信号は、さらに、強度情報メモリ132、信号処理部225等にも出力されてもよい。 The pulse signal generation unit 235 generates various control signals and outputs them to each unit. In the present embodiment, for example, a control timing signal for synchronizing various processes, the above-described drive signal, and the like are generated and output. The control timing signal is output to, for example, the propagation direction control unit 216, the distance calculation unit 231, the correction unit 120, and the like. The drive signal is output to the propagation direction control unit 216. The control timing signal may also be output to the intensity information memory 132, the signal processing unit 225, and the like.
 制御タイミング信号は、距離演算部231から送信される周期信号に基づいて生成される。また、駆動信号は、振動解析部233が算出した主振動方向θMに基づいて生成される。本実施形態では、駆動信号は、スキャナ212(光偏向部213)の揺動方向402が、主振動方向θMに対して略平行となり、かつ、視野範囲401がカバーされるように揺動および並進移動するよう生成される。 The control timing signal is generated based on the periodic signal transmitted from the distance calculation unit 231. Further, the drive signal is generated based on the main vibration direction θM calculated by the vibration analysis unit 233. In this embodiment, the drive signal is swung and translated so that the swinging direction 402 of the scanner 212 (light deflecting unit 213) is substantially parallel to the main vibration direction θM and the visual field range 401 is covered. Generated to move.
 [距離画像生成部]
 次に、距離画像生成部110を説明する。距離画像生成部110は、距離演算部231と、強度演算部232と、それぞれの演算結果を保存する、距離情報メモリ131と、強度情報メモリ132と、を備える。
[Distance image generator]
Next, the distance image generation unit 110 will be described. The distance image generation unit 110 includes a distance calculation unit 231, an intensity calculation unit 232, and a distance information memory 131 and an intensity information memory 132 that store respective calculation results.
 距離演算部231は、信号処理部225から受信した遅延量を用い、対象物(物体)までの距離を演算する。得られた距離の値(距離値)は、メモリ格納時を考慮した値およびサイズに変換された後に、例えば、フレーム単位で距離情報メモリ131に格納される。各距離値は、制御タイミング信号に従って、遅延量取得時のスキャナ212の駆動量に応じて予め対応づけられたアドレスを有するメモリに格納される。なお、以下、フレーム単位で距離情報メモリ131に格納された複数の距離値を、まとめて、距離画像とも呼ぶ。 The distance calculation unit 231 calculates the distance to the object (object) using the delay amount received from the signal processing unit 225. The obtained distance value (distance value) is converted into a value and size considering the memory storage time, and then stored in the distance information memory 131 in units of frames, for example. Each distance value is stored in a memory having an address associated in advance according to the driving amount of the scanner 212 at the time of acquiring the delay amount according to the control timing signal. Hereinafter, a plurality of distance values stored in the distance information memory 131 in units of frames are collectively referred to as a distance image.
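The text does not write out the distance formula itself, but for a pulsed time-of-flight measurement the delay amount converts to distance by the usual round-trip relation; the snippet below is only an illustration under that assumption, not the patent's own equation.

    C = 299_792_458.0  # speed of light [m/s]

    def delay_to_distance(delay_s: float) -> float:
        # One-way distance from a round-trip delay (pulsed time-of-flight assumption).
        return C * delay_s / 2.0

    # Example: a delay of 200 ns corresponds to roughly 30 m.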
 強度演算部232は、同じく信号処理部225から受信した遅延量を用い、強度値を算出する。得られた強度値は、メモリ格納時を考慮した値およびサイズに変換された後に、例えば、フレーム単位で、強度情報メモリ132に格納される。各強度値も、遅延量取得時のスキャナ212の駆動量に応じて予め対応づけられたアドレスを有するメモリに格納される。なお、以下、フレーム単位で強度情報メモリ132に格納された複数の強度値を、まとめて、強度画像とも呼ぶ。 The intensity calculation unit 232 calculates the intensity value using the delay amount received from the signal processing unit 225. The obtained intensity value is converted into a value and a size considering the memory storage time, and then stored in the intensity information memory 132 in units of frames, for example. Each intensity value is also stored in a memory having an address associated in advance according to the driving amount of the scanner 212 when the delay amount is acquired. Hereinafter, a plurality of intensity values stored in the intensity information memory 132 in units of frames are collectively referred to as intensity images.
 [補正部]
 補正部120は、距離情報メモリ131に保存された距離値および強度情報メモリ132に保存された強度値の、外的な振動による影響を補正する。補正は、振動検出装置240から取得した変位量Ax、Ayを用いて行う。なお、変位量は、距離値および強度値を算出する遅延量を取得したタイミングに得た値を用いる。
[Correction section]
The correction unit 120 corrects the influence of external vibration on the distance value stored in the distance information memory 131 and the intensity value stored in the intensity information memory 132. The correction is performed using the displacement amounts Ax and Ay acquired from the vibration detection device 240. As the displacement amount, a value obtained at the timing when the delay amount for calculating the distance value and the intensity value is acquired is used.
 補正部120は、ピクセルのアドレスを変換することにより、補正を行う。そして、変換後の結果を、それぞれ、距離情報メモリ131および強度情報メモリ132に再格納する。本実施形態の補正部120が行う補正は、後述するように、1フレーム内の歪みを補正するものである。このため、本補正を歪み補正処理と呼ぶ。 The correction unit 120 performs correction by converting the address of the pixel. Then, the converted results are stored again in the distance information memory 131 and the intensity information memory 132, respectively. The correction performed by the correction unit 120 of the present embodiment corrects distortion within one frame, as will be described later. For this reason, this correction is called distortion correction processing.
 例えば、距離情報メモリ131あるいは強度情報メモリ132の、X軸およびY軸のピクセルのアドレスがそれぞれPxおよびPy、そのピクセル座標に対応したX軸およびY軸の、その時点での変位量がそれぞれAx、Ayとする。このとき、補正部120は、そのピクセルのアドレスを、以下の式(3)に従ってPcxおよびPcyに変換する。
Pcx = Px - ROUND(Bx × Ax),  Pcy = Py - ROUND(By × Ay)  …(3)
 ここで式(3)におけるBxおよびByは、それぞれ、X軸およびY軸の振動検出装置240が検出した変位量に対するピクセル移動量の変換係数を表す。また、ROUNDは、整数化の為に括弧内の値を四捨五入する関数である。なお、式(3)における整数化の為の関数は、必ずしも四捨五入処理に限定されない。繰り上げあるいは切り捨てする関数であってもよい。
For example, in the distance information memory 131 or the intensity information memory 132, the X-axis and Y-axis pixel addresses are Px and Py, respectively, and the X-axis and Y-axis displacements corresponding to the pixel coordinates are respectively the displacement amounts Ax , Ay. At this time, the correction unit 120 converts the address of the pixel into Pcx and Pcy according to the following equation (3).
Pcx = Px - ROUND(Bx × Ax),  Pcy = Py - ROUND(By × Ay)  …(3)
Here, Bx and By in Expression (3) represent conversion coefficients of pixel movement amounts with respect to displacement amounts detected by the X-axis and Y-axis vibration detection devices 240, respectively. ROUND is a function that rounds off the value in parentheses for integerization. In addition, the function for integerization in Formula (3) is not necessarily limited to the rounding-off process. It may be a function that rounds up or down.
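A minimal sketch of the address conversion of equation (3) as reconstructed above. Whether the rounded pixel shift is subtracted from or added to the original address depends on the sign convention of the measured displacement, which the text does not fix, so the subtraction used here is an assumption.

    def correct_address(px: int, py: int, ax: float, ay: float,
                        bx: float, by: float) -> tuple[int, int]:
        # Bx, By convert the measured displacement (Ax, Ay) into a pixel shift;
        # ROUND is realized here by Python's round(). The sign is an assumed convention.
        pcx = px - round(bx * ax)
        pcy = py - round(by * ay)
        return pcx, pcy

Applying this conversion to every pixel address of a frame, using the displacement measured when that pixel was acquired, and re-storing the values at the converted addresses gives the distortion-corrected distance and intensity images described above.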
 出力処理部234は、例えば、強度情報メモリ132に格納される強度情報に対して各種の処理を行い、処理後の強度情報を出力する。各種の処理として、例えば、画像処理、デジタル放送信号への変換処理等が行われる。ここで行う画像処理は、例えば、ガンマ補正等である。また、デジタル放送への変換処理は、例えば、ATSC、DVB-TあるいはISDB-T等である。出力先は、ECU300等である。 The output processing unit 234 performs various processes on the intensity information stored in the intensity information memory 132, for example, and outputs the processed intensity information. As various types of processing, for example, image processing, conversion processing to a digital broadcast signal, and the like are performed. The image processing performed here is, for example, gamma correction. Also, the conversion process to digital broadcasting is, for example, ATSC, DVB-T, ISDB-T, or the like. The output destination is the ECU 300 or the like.
 制御部230に含まれる各機能、すなわち、振動解析部233、パルス信号生成部235、距離演算部231、強度演算部232、および出力処理部234と、補正部120とは、例えば、CPUとメモリと記憶装置とを備える情報処理装置により実現される。ここでは、CPUが、予め記憶装置に格納したプログラムをメモリにロードして実行することにより実現される。なお、全部または一部の機能は、ASIC(Application Specific Integrated Circuit)、FPGA(field-programmable gate array)などのハードウェアによって実現されてもよい。 Each function included in the control unit 230, that is, the vibration analysis unit 233, the pulse signal generation unit 235, the distance calculation unit 231, the intensity calculation unit 232, and the output processing unit 234, as well as the correction unit 120, is realized by, for example, an information processing device including a CPU, a memory, and a storage device. Here, each function is realized by the CPU loading a program stored in advance in the storage device into the memory and executing it. All or some of the functions may be realized by hardware such as an ASIC (Application Specific Integrated Circuit) or an FPGA (field-programmable gate array).
 また、処理に必要な各種のデータおよび処理により得られた各種のデータは、記憶装置に格納される。本実施形態では、距離情報メモリ131および強度情報メモリ132は、記憶装置に構築される。 Also, various data necessary for the process and various data obtained by the process are stored in the storage device. In the present embodiment, the distance information memory 131 and the intensity information memory 132 are constructed in a storage device.
 上記の各部による、スキャナ212(光偏向部213)の揺動方向制御処理の流れを図4(a)に従って説明する。本処理は、光測距装置100が起動されたことを契機に開始される。 The flow of the swinging direction control process of the scanner 212 (light deflecting unit 213) by the above-described units will be described with reference to FIG. This process is started when the optical distance measuring device 100 is activated.
 光測距装置100が起動されると、振動解析部233は、振動検出装置240から受信した変位量を用い、主振動方向θMを算出する(ステップS1101)。このとき、走査角度検出部215は、光偏向部213の現在の角度θcを検出する。パルス信号生成部235は、主振動方向θMから駆動信号を生成し(ステップS1103)、伝搬方向制御部216に送信する。 When the optical distance measuring device 100 is activated, the vibration analysis unit 233 calculates the main vibration direction θM using the displacement received from the vibration detection device 240 (step S1101). At this time, the scanning angle detector 215 detects the current angle θc of the light deflector 213. The pulse signal generation unit 235 generates a drive signal from the main vibration direction θM (step S1103) and transmits it to the propagation direction control unit 216.
 伝搬方向制御部216は、駆動信号と、現在の角度θcとに従って、光偏向部213の揺動方向θFが主振動方向θMとなり、かつ、視野範囲401が走査されるよう、駆動部214を制御する(ステップS1104)。光走査装置200は、以上の処理を、終了の指示を受け付けるまで、制御タイミング信号に従って、継続する(ステップS1101)。これにより、本実施形態では、リアルタイムで揺動方向が変化する。すなわち、常に、揺動方向を、主振動方向と略平行に保つことができる。 The propagation direction control unit 216 controls the drive unit 214 in accordance with the drive signal and the current angle θc so that the swing direction θF of the light deflection unit 213 becomes the main vibration direction θM and the visual field range 401 is scanned (step S1104). The optical scanning device 200 continues the above processing according to the control timing signal until an end instruction is received (step S1101). As a result, in the present embodiment, the swing direction changes in real time. That is, the swing direction can always be kept substantially parallel to the main vibration direction.
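The control flow of steps S1101 to S1104 can be summarized as a simple loop; the sketch below assumes hypothetical interfaces (vibration_sensor.read(), scanner.current_angle(), scanner.drive(), stop_flag) purely for illustration and is not an implementation taken from the patent.

    import math
    import time

    def swing_direction_control_loop(vibration_sensor, scanner, stop_flag,
                                     period_s: float = 0.001):
        # Keep the scanner swing direction substantially parallel to the main
        # vibration direction, re-evaluated every control period.
        while not stop_flag.is_set():
            ax, ay = vibration_sensor.read()            # displacement (Ax, Ay)
            theta_m = math.degrees(math.atan2(ay, ax))  # main vibration direction (S1101)
            theta_c = scanner.current_angle()           # current deflector angle
            scanner.drive(theta_m, theta_c)             # drive so that thetaF tracks thetaM (S1103, S1104)
            time.sleep(period_s)                        # paced by the control timing signal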
 次に、本実施形態の光測距装置100による測距処理の流れを説明する。図4(b)は、本実施形態の測距処理の処理フローである。本実施形態処理は、光測距装置100が起動されたことを契機に開始される。 Next, the flow of distance measurement processing by the optical distance measuring device 100 of the present embodiment will be described. FIG. 4B is a processing flow of the distance measuring process of the present embodiment. The processing of the present embodiment is started when the optical distance measuring device 100 is activated.
 信号処理部225は、投光部211に、投光タイミング信号を送信する。投光タイミング信号を受信した投光部211は、変調光を光偏向部213の略揺動中心へと投光する(ステップS1202)。変調されたレーザ光は、測定光として光偏向部213により所定方向に照射され、車両の周囲の物体で拡散反射される。 The signal processing unit 225 transmits a light projection timing signal to the light projecting unit 211. Receiving the light projection timing signal, the light projecting unit 211 projects the modulated light to the approximate swing center of the light deflecting unit 213 (step S1202). The modulated laser light is irradiated as a measurement light in a predetermined direction by the light deflecting unit 213 and diffusely reflected by objects around the vehicle.
 受光部221は、拡散反射による反射光を検出する(ステップS1203)。そして、得られた反射光の情報は、出力部226で処理され、信号処理部225に入力される。 The light receiving unit 221 detects light reflected by diffuse reflection (step S1203). Then, the obtained reflected light information is processed by the output unit 226 and input to the signal processing unit 225.
 信号処理部225は、変換後の反射光の情報から、測定光の発光から反射光の受光までの遅延量を算出するとともに、反射光の強度情報を算出する信号処理を行う(ステップS1204)。そして、算出された遅延量および強度情報は、それぞれ、距離演算部231および強度演算部232に出力される。 The signal processing unit 225 calculates a delay amount from the emission of the measurement light to the reception of the reflected light from the converted reflected light information, and performs signal processing for calculating the intensity information of the reflected light (step S1204). Then, the calculated delay amount and intensity information are output to the distance calculator 231 and the intensity calculator 232, respectively.
 距離演算部231は、受信した遅延量を用い、物体までの距離を演算し、距離情報メモリ131に格納する。また、強度演算部232は、強度情報を、強度情報メモリ132に格納する(ステップS1206)。 The distance calculation unit 231 calculates the distance to the object using the received delay amount and stores it in the distance information memory 131. In addition, the intensity calculator 232 stores the intensity information in the intensity information memory 132 (step S1206).
 距離情報メモリ131および強度情報メモリ132それぞれに、1フレーム分のデータが格納されると、すなわち、距離画像および強度画像が生成されると、補正部120は、歪み補正処理を行う(ステップS1206)。ここでは、補正部120は、制御タイミング信号に従って、振動検出装置240から受信した変位量を用いて、距離画像および強度画像に対し、上記手法で歪み補正処理を行う。そして、補正後の距離画像を、ECU300へ出力する。また、補正後の強度画像は、出力処理部234を介してECU300へ出力する。 When data for one frame is stored in each of the distance information memory 131 and the intensity information memory 132, that is, when a distance image and an intensity image are generated, the correction unit 120 performs a distortion correction process (step S1206). . Here, the correction unit 120 performs the distortion correction process on the distance image and the intensity image by the above method using the displacement received from the vibration detection device 240 according to the control timing signal. Then, the corrected distance image is output to ECU 300. Further, the corrected intensity image is output to ECU 300 via output processing unit 234.
 光測距装置100は、以上の処理を、終了の指示を受け付けるまで、制御タイミング信号に従って、継続する(ステップS1201)。 The optical distance measuring device 100 continues the above process according to the control timing signal until an end instruction is received (step S1201).
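Putting the measurement flow together, one frame of the processing of steps S1202 to S1206 can be sketched as below; the scanner, lidar, and correct_frame interfaces and the image-sized lists are hypothetical stand-ins for the patent's hardware blocks and memories, and the pulsed time-of-flight conversion is the same assumption as above.

    C = 299_792_458.0  # speed of light [m/s]

    def measure_one_frame(scanner, lidar, frame_h, frame_w,
                          vibration_sensor, correct_frame):
        distance_img = [[0.0] * frame_w for _ in range(frame_h)]
        intensity_img = [[0.0] * frame_w for _ in range(frame_h)]
        for px, py in scanner.scan_positions():          # swing + translation scan
            delay, intensity = lidar.emit_and_receive()  # projection, reception, signal processing (S1202-S1204)
            distance_img[py][px] = C * delay / 2.0       # distance value stored per pixel
            intensity_img[py][px] = intensity            # intensity value stored per pixel
        ax, ay = vibration_sensor.read()                 # displacement (in the text, taken at each pixel's acquisition timing)
        return correct_frame(distance_img, intensity_img, ax, ay)  # distortion correction (S1206)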
 次に、補正部120による歪み補正処理の必要性について説明する。本実施形態の光測距装置100は、搭載される車両の走行環境によっては、路面からの振動が外的振動として加わる。 Next, the necessity for the distortion correction processing by the correction unit 120 will be described. In the optical distance measuring device 100 of the present embodiment, vibration from the road surface is applied as external vibration depending on the traveling environment of the vehicle on which the optical distance measuring device 100 is mounted.
 図5(a)および図5(b)は、それぞれ、Y軸方向およびX軸方向の、振動検出装置240によって観測される変位の時刻歴(時間軸方向の変化の様子)を例示したものである。ここで、図5(a)および図5(b)における黒点411、412は、実際のサンプリングされた測定点を表す。 FIG. 5A and FIG. 5B exemplify the time history of displacement (the change along the time axis) observed by the vibration detection device 240 in the Y-axis direction and the X-axis direction, respectively. Here, black dots 411 and 412 in FIGS. 5A and 5B represent actual sampled measurement points.
 通常、路面の状態により光測距装置100に加わる外的な振動による変位の変化波形は、その要因が同一であれば、図5(a)および図5(b)に示すように、X軸およびY軸に関し、周波数と位相は略同一で振幅のみが異なる。あるいは、周波数は略同一、位相が180度異なり、振幅は異なる。このとき、搭載された車両は、図5(c)に示すように、角度θMの方向に対して振動し続けるため、光測距装置100の主振動角θMは、時間の経過によらず、略一定である。 Normally, as long as its cause is the same, the displacement waveform due to external vibration applied to the optical distance measuring device 100 by the road surface condition has, as shown in FIGS. 5A and 5B, substantially the same frequency and phase for the X axis and the Y axis, and only the amplitude differs. Alternatively, the frequencies are substantially the same, the phases differ by 180 degrees, and the amplitudes differ. At this time, as shown in FIG. 5C, the mounted vehicle continues to vibrate in the direction of the angle θM, so the main vibration angle θM of the optical distance measuring device 100 is substantially constant regardless of the passage of time.
 このような外的振動が光測距装置100の検出結果に及ぼす影響について説明する。 The effect of such external vibration on the detection result of the optical distance measuring device 100 will be described.
 図6(a)に、外的振動が無い場合の、発光領域424と、非発光領域425と、視野範囲422との関係を示す。ここでは、光偏向部213を、角度θMに対して略平行な方向に揺動させ、視野範囲をカバーするように並進移動させる。 FIG. 6A shows the relationship between the light emitting region 424, the non-light emitting region 425, and the visual field range 422 when there is no external vibration. Here, the light deflection unit 213 is swung in a direction substantially parallel to the angle θM, and is translated so as to cover the field of view range.
 本図において、白色の矢印はスキャナ212の伝搬方向が描く軌跡421を、破線で描かれた長方形の領域は視野範囲422を、実線で描かれた複数の長方形の帯領域はスキャン時の列単位423を、さらに、長方形の帯の内で濃い色の領域は、発光領域424を、同薄い色の領域は、非発光領域425をそれぞれ表す。なお、発光領域424は、投光部211を実際に発光させて距離測定を行う領域である。また、非発光領域425は、投光部211を発光させず距離測定を行わない領域である。 In this figure, a white arrow indicates a trajectory 421 drawn by the propagation direction of the scanner 212, a rectangular area drawn by a broken line indicates a visual field range 422, and a plurality of rectangular band areas drawn by solid lines indicate column units at the time of scanning. Further, 423, a dark color region in the rectangular band represents a light emitting region 424, and a light color region represents a non-light emitting region 425. Note that the light emitting area 424 is an area in which the distance is measured by actually causing the light projecting unit 211 to emit light. Further, the non-light emitting area 425 is an area in which the distance measurement is not performed without causing the light projecting unit 211 to emit light.
 外的振動がない場合、発光領域424は、破線で描かれた長方形の領域(視野範囲422)と略等しく長方形となる。このため、駆動制御のみで全視野の距離情報を得ることができる。 When there is no external vibration, the light emitting region 424 is substantially rectangular with the rectangular region (view range 422) drawn with a broken line. For this reason, distance information of the entire visual field can be obtained only by drive control.
 しかし、実際には、角度θM方向には外的な振動が生じているため、視野範囲422は、長方形を保つことが出来ない。図6(b)に、θM方向に外的振動が生じている場合の発光領域424と、非発光領域425と、視野範囲422との関係を示す。ここでは、光偏向部213の駆動制御は上記図6(a)と同様とする。 However, actually, since the external vibration is generated in the angle θM direction, the visual field range 422 cannot be kept rectangular. FIG. 6B shows a relationship between the light emitting region 424, the non-light emitting region 425, and the visual field range 422 when external vibration is generated in the θM direction. Here, the drive control of the light deflection unit 213 is the same as that in FIG.
 本図において、破線で描かれた長方形の領域が、外的振動がない場合の視野範囲422を、実線で描かれた複数の長方形の帯領域が、スキャン時の列単位433を、列単位433のうち、濃い色の領域が発光領域434を、同薄い色の領域が非発光領域435を、それぞれ表す。 In this figure, a rectangular area drawn with a broken line indicates a visual field range 422 when there is no external vibration, and a plurality of rectangular band areas drawn with a solid line show a column unit 433 during scanning and a column unit 433. Among these, the dark color region represents the light emitting region 434, and the light color region represents the non-light emitting region 435.
 外的振動がある場合、発光領域434が、この振動により、角度θM方向に振動する。このため、発光領域434が、元の長方形(元の視野範囲422)から、変位する。外的な振動の有無にかかわらず、距離情報の検出結果は、元の視野範囲422を前提として配列アドレスが付与される。この結果、光測距装置100の検出結果に影響が生じる。 When there is external vibration, the light emitting region 434 vibrates in the angle θM direction due to this vibration. For this reason, the light emitting region 434 is displaced from the original rectangle (original visual field range 422). Regardless of the presence or absence of external vibration, the distance information detection result is given an array address on the premise of the original visual field range 422. As a result, the detection result of the optical distance measuring device 100 is affected.
 図7(a)~図7(c)の模式図を用いて光測距装置100に外的振動が加わった際の検出結果への影響を説明する。ここでは、図7(a)に示す視野範囲422内のX軸上の白色の太線441を検出対象441とする。 7A to 7C are used to explain the influence on the detection result when external vibration is applied to the optical distance measuring device 100. FIG. Here, a white thick line 441 on the X axis in the visual field range 422 shown in FIG.
 ここでは、図7(b)に示すように、外的振動が、X軸から角度θ(本図では、θ=45度とする)傾いた方向に発生したものとする。なお、この場合の、外的振動の、時間方向の変化の様子を、太曲線444で示す。 Here, as shown in FIG. 7B, it is assumed that external vibration has occurred in a direction inclined from the X axis by an angle θ (in this figure, θ = 45 degrees). In this case, a change in the time direction of the external vibration is indicated by a thick curve 444.
 本実施形態のような走査型の光走査装置200を用いる場合、このような外的振動が加わると、検出対象441である太線441は、白曲線443のように歪んだ形状で検出される。すなわち、外的振動により、元の視野範囲422と異なる視野が検出され、これにより、実際の検出対象441の形状とは異なる検出結果442が誤検出される。 In the case of using the scanning optical scanning device 200 as in the present embodiment, when such external vibration is applied, the thick line 441 that is the detection target 441 is detected in a distorted shape like a white curve 443. That is, a visual field different from the original visual field range 422 is detected by the external vibration, and thereby a detection result 442 different from the actual shape of the detection target 441 is erroneously detected.
 補正部120が、この検出結果442に対して、上記式(3)に従って、歪み補正処理を行うことにより、図7(c)に白太線441aで示すように、本来の検出対象441の形状どおりの検出結果441aを得ることができる。ただし、歪み補正処理を行うことにより、得られる視野範囲422aは、補正前の視野範囲422より小さくなる。 The correction unit 120 performs distortion correction processing on the detection result 442 according to the above equation (3), so that the shape of the original detection target 441 is obtained as indicated by a white bold line 441a in FIG. Detection result 441a can be obtained. However, the visual field range 422a obtained by performing the distortion correction process is smaller than the visual field range 422 before correction.
 なお、補正後の視野範囲422aが判明した後、補正後の視野範囲422a内でのみ発光するように信号処理部225において投光タイミング信号を制御してもよい。補正後の視野範囲422aは、補正部120が補正処理を行うことにより得られる。また、補正後の視野範囲422aは、振動検出装置240が検出した変位量からも算出できる。この場合、振動検出装置240が検出した変位量を信号処理部225にも出力し、信号処理部225は、それに従って、投光を制御する。 Note that, after the corrected visual field range 422a is found, the signal processing unit 225 may control the light projection timing signal so that light is emitted only within the corrected visual field range 422a. The corrected visual field range 422a is obtained by the correction unit 120 performing correction processing. The corrected visual field range 422a can also be calculated from the amount of displacement detected by the vibration detection device 240. In this case, the displacement detected by the vibration detection device 240 is also output to the signal processing unit 225, and the signal processing unit 225 controls light projection accordingly.
Next, the effect of oscillating substantially parallel to the direction of the external vibration (the main vibration direction) will be described with reference to FIGS. 8(a) to 11(e).
FIGS. 8(a) and 9(a) show simulation results of the trajectory traced by the propagation direction when the oscillation direction of the scanner 212 is the X-axis direction. FIG. 8(a) shows the ideal state with no external vibration, and FIG. 9(a) shows the simulation result when external vibration occurs in the Y-axis direction. In FIG. 9(a), unlike the control of the present embodiment, the oscillation direction is set perpendicular to the external vibration direction. The other simulation conditions are listed in Table 1.
[Table 1: simulation conditions (table not reproduced)]
FIGS. 8(b) and 9(b) show partially enlarged views of FIGS. 8(a) and 9(a), respectively. The horizontal and vertical axes of these figures represent the pixel (column) number in the X-axis direction and the pixel (row) number in the Y-axis direction in the field of view at the time of detection.
When there is no external vibration, the sinusoidal trajectory is drawn along the X-axis direction without distortion, as shown in FIG. 8(a). As shown in FIG. 8(b), a uniform trajectory without bias is obtained in each of the rows numbered 0 to 99.
On the other hand, when there is external vibration in the Y-axis direction, the trajectory of the propagation direction alternates, along the Y-axis direction, between regions where the sinusoid is sparse and regions where it is dense, as shown in FIG. 9(a). As can be seen from the enlarged view of FIG. 9(b), the trajectory hardly passes through the sparse regions, so detection is not possible there.
When the detection density becomes uneven in this way, even if the distortion correction processing restores the correct shape of the detection target, detection information is partially missing in the sparse regions because almost no detection takes place there. Conversely, in the dense regions the detection information is excessive, so multiple detections overlap on a single pixel after distortion correction. This is therefore also undesirable from the viewpoint of data utilization efficiency.
Next, FIGS. 10(a) to 10(e) show the influence of vibration and the effect of the correction processing when the scanner 212 is oscillated in the same direction as the external vibration, as in the present embodiment. Here, the case where the main vibration direction and the oscillation direction are both the Y-axis direction is described as an example. FIG. 10(a) shows the field-of-view range 422 and the detection target 441 when there is no vibration.
FIG. 10(b) shows the detection result when external vibration is applied in the Y-axis direction. As shown in the figure, the detection field of view 422b oscillates together with the external vibration. Consequently, if the detection result is stored in the distance information memory 131 as-is, the detection result 441c is distorted as shown in FIG. 10(c). However, unlike FIGS. 9(a) and 9(b), the detection is uniform, with no unevenness in detection density and no partial loss of the detection region. As a result, when the distortion correction processing is performed, the detected field-of-view range 422a becomes smaller than the original field-of-view range 422, as shown in FIG. 10(d), but the shape of the detection target 441 is detected correctly, as shown by the detection result 441a in FIG. 10(e).
In this way, by controlling the oscillation direction to be substantially parallel to the external vibration angle θ as in the present embodiment, the distortion correction processing can be performed without missing detection regions and while maintaining high data utilization efficiency.
As described above, the optical scanning device 200 of the present embodiment includes: the light deflection unit 213, which deflects the measurement light (the projected light) in a projection direction determined by the oscillation direction and a translation direction intersecting the oscillation direction; the light projection unit 211, which projects the measurement light toward the light deflection unit 213; the light receiving unit 221, which receives the measurement light reflected by an object; and the propagation direction control unit 216, which drives and controls the light deflection unit 213 so that the oscillation direction coincides with the main vibration direction, that is, the device's own principal vibration direction. It further includes the vibration analysis unit 233, which obtains the main vibration direction. The vibration analysis unit 233 obtains the main vibration direction using the output of the vibration detection device 240, a displacement sensor that detects displacement caused by external vibration.
Therefore, according to the present embodiment, the oscillation direction of the optical scanning device 200 is controlled so as to always be substantially parallel to the vibration direction. As described above, the detection density within the scanning region therefore does not become uneven, and the data can be used efficiently. That is, even when external vibration is applied, uniform detection without unevenness in detection density can be achieved.
Furthermore, the optical rangefinder 100 of the present embodiment includes the distance image generation unit 110, which generates a distance image from the output information of the optical scanning device 200, and the correction unit 120, which corrects vibration-induced distortion of the distance image using the output of the vibration detection device 240.
Therefore, according to the present embodiment, a scanning sensor can obtain detection results in which the intra-frame distortion has been corrected while efficiently using the acquired data. That is, in a scanning device, distortion of the detection target caused by external vibration can be reduced efficiently.
In the above embodiment, the main vibration direction θM due to external vibration is detected and the oscillation direction is controlled to be substantially parallel to it. However, the invention is not limited to this method. For example, when the main vibration direction is known in advance, the device may simply be controlled to always oscillate in that direction. In that case, the light deflection unit 213 may have a drive axis only in that direction, that is, it may be single-axis driven, and the vibration detection device 240 and the vibration analysis unit 233 need not be provided.
For example, when the optical rangefinder 100 is mounted on an automobile, the external vibration received while driving is vibration caused by the road surface, and this vibration is dominated by the direction of gravity. In such a case, the oscillation direction may be set substantially parallel to the direction of gravity.
Furthermore, when the external vibration direction is only the direction of gravity (the Y-axis direction), pixels in the distance information memory 131 and the intensity information memory 132 that share the same X-axis address value Pcx have substantially the same shift amounts AxBx and AyBy. In such a case, the shift amount need not be calculated for every pixel; the shift amounts AxBx and AyBy calculated for one pixel may be applied as an offset in the distortion correction processing described above.
Similarly, pixels whose addresses have a geometric relationship substantially parallel to the external vibration direction have substantially the same shift amounts AxBx and AyBy. This is the case, for example, when the value of θM obtained from equation (2), or the value of a trigonometric function such as tan based on θM, is substantially the same. In this case as well, the shift amounts AxBx and AyBy calculated for one pixel may be used to perform the distortion correction processing as an offset. That is, in the memory storing the pixel values of the distance image, the same value may be used for address columns substantially parallel to the main vibration direction.
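As an illustration, the sketch below applies this offset-style correction for a purely vertical (Y-axis) main vibration: one vertical shift is computed per scan column and shared by every pixel of that column. It is a minimal sketch under that assumption; the names and the sign convention are not from the specification.

```python
import numpy as np

def correct_column_offsets(distance_image, column_shift_px):
    """distance_image: 2-D array (rows x cols) read out of the distance memory;
    column_shift_px: one vertical shift per column, in pixels (positive values
    mean the content of that column was pushed downward by the vibration)."""
    rows, cols = distance_image.shape
    corrected = np.full((rows, cols), np.nan)   # cells with no source data stay NaN
    for x in range(cols):
        dy = int(round(column_shift_px[x]))     # single shift shared by the column
        dy = max(-rows, min(rows, dy))          # keep the slice bounds valid
        src = distance_image[:, x]
        if dy >= 0:
            corrected[:rows - dy, x] = src[dy:]
        else:
            corrected[-dy:, x] = src[:rows + dy]
    return corrected
```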
The control that makes the oscillation direction substantially parallel to the main vibration direction θM of the external vibration, which is a feature of the present embodiment, is applicable not only to the optical rangefinder 100 but also to various other optical scanning devices that require scanning of a light spot, for example automobile headlights and video projectors.
According to the present embodiment, such optical scanning devices can also achieve a uniform field of view or projected image in which the projection density of the light spot does not become uneven even when external vibration is applied. That is, the acquired data can be used efficiently to illuminate a desired region without distortion.
In the present embodiment, the distance information is temporarily stored in the distance information memory 131 to obtain a distance image, and the distortion of the distance image caused by the external vibration applied to the optical rangefinder 100 is then corrected. However, the invention is not limited to this method.
For example, a displacement sensor with a high sampling frequency may be used as the vibration detection device 240. In this case, the external vibration applied to the optical rangefinder 100 can be detected at high speed, so the correction unit 120 can perform the distortion correction in real time.
The propagation direction control unit 216 derives in real time, in accordance with equation (4) below, the oscillation control amplitudes AMPcx and AMPcy used when the scanner 212 oscillates in a direction substantially parallel to the vibration direction θM while translating so as to cover the field-of-view range.
[Equation (4): equation image not reproduced]
Here, AMPx and AMPy are the X-axis and Y-axis components of the control amplitude when oscillating in the vibration direction θM, Ax and Ay are the X-axis and Y-axis displacements detected by the vibration detection device 240, and Fx and Fy are the conversion coefficients from the detected values of the X-axis and Y-axis vibration detection device 240 to the control amplitude.
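Because the image of equation (4) is not reproduced in this text, the sketch below shows only one plausible form of the real-time derivation, in which each detected displacement, scaled by its conversion coefficient, is combined with the nominal amplitude component; it is an assumption for illustration and should not be read as the published equation (4).

```python
# Hedged sketch: one plausible realization of the real-time amplitude feedback.
# The actual equation (4) is not reproduced here; the form below (nominal
# component plus displacement scaled by a conversion coefficient) is assumed.

def control_amplitudes(amp_x, amp_y, a_x, a_y, f_x, f_y):
    """amp_x, amp_y: nominal oscillation-amplitude components along X and Y;
    a_x, a_y: displacements reported by the vibration detection device;
    f_x, f_y: conversion coefficients from detector values to amplitude."""
    amp_cx = amp_x + f_x * a_x   # assumed form, not the published formula
    amp_cy = amp_y + f_y * a_y
    return amp_cx, amp_cy
```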
That is, when such a displacement sensor is used, distortion is corrected by feeding the vibration amplitude of the external vibration, detected in real time, back into the amplitude of the oscillation control. However, the invention is not limited to this method. For example, when the roadway is paved with periodic cobblestones, the external vibration the optical rangefinder 100 receives from the road surface can be regarded as periodic. In such a case, feedforward correction may be performed by estimating the frequency of the external vibration from the time history of the displacement detected by the vibration detection device 240.
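Such a feedforward scheme needs an estimate of the dominant vibration frequency from the displacement time history. The sketch below shows one straightforward way to obtain it with an FFT; the sampling rate and the variable names are assumptions and are not part of the specification.

```python
import numpy as np

def dominant_vibration_frequency(displacement, sample_rate_hz):
    """Estimate the dominant vibration frequency (Hz) from a 1-D array of
    displacement samples taken at sample_rate_hz."""
    x = np.asarray(displacement, dtype=float)
    x = x - x.mean()                              # remove DC so it does not dominate
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum[1:]) + 1]     # skip the zero-frequency bin
```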
When such a displacement sensor is used and the correction described above is performed, the corrected field-of-view range does not become smaller, unlike the example shown in FIG. 6(c). Therefore, the intra-frame distortion can be corrected while maintaining the original wide field-of-view range.
<<Second Embodiment>>
A second embodiment of the present invention will now be described. In this embodiment, the temporal change of the distance image and the intensity image acquired by the optical rangefinder is used to correct the influence of external vibration on those images.
In this embodiment, the output of an external vibration detection sensor is not used. For example, when the device is mounted on a vehicle or the like, the main vibration direction while driving is substantially vertical. The main vibration direction is therefore determined in advance, and the oscillation direction of the scanner (light deflection unit) is set to this main vibration direction. In the following description, the mounting destination is a vehicle and the main vibration direction is the Y-axis direction.
FIG. 11 is a configuration diagram of the optical rangefinder 100a of this embodiment. The optical rangefinder 100a basically has the same configuration as that of the first embodiment. However, unlike the first embodiment, this embodiment does not use the output of an external vibration detection sensor (vibration detection device 240), and therefore does not include the vibration analysis unit 233 that processes the output of the vibration detection device 240.
The correction unit 120a likewise does not use the output of the vibration detection device 240 during correction. The present embodiment is described below with emphasis on the distortion correction processing by the correction unit 120a, which differs from that of the first embodiment.
In this embodiment as well, the distance calculation unit 231 and the intensity calculation unit 232 temporarily store the calculated distance values and reflected-light intensity values in the distance information memory 131 and the intensity information memory 132, respectively. The correction unit 120a then corrects the influence of external vibration frame by frame by converting these addresses using a correction value. In this embodiment, the correction value is calculated using the distance images of a plurality of frames.
To achieve this, the optical rangefinder 100 of this embodiment further includes an information matching unit 140 that detects vibration-induced distortion of the distance image and the intensity image and calculates the correction amount.
As in the first embodiment, when the optical rangefinder 100 receives external vibration, a field of view 451 that differs from the original field-of-view range 422 (the dotted rectangle in the drawing) is detected, as shown in FIG. 12(a). Regardless of whether external vibration is present, the distance-information detection results are assigned rectangular array addresses on the assumption of the original field-of-view range 422, as shown in FIG. 12(b). The distance information 452 stored in the distance information memory 131 is therefore distorted.
This embodiment does not include the external vibration detection device 240. Therefore, when a detection result distorted in the Y-axis direction is obtained as in FIG. 13(a), it cannot be determined whether the actual shape of the detection target itself is distorted, as in FIG. 13(b), or whether the actual shape is not distorted and the field of view has been distorted by external vibration, as in FIG. 13(c).
The information matching unit 140 of this embodiment determines whether a distorted detection result is caused by distortion of the shape of the detection target or by external vibration. When it determines that the distortion is caused by external vibration, it calculates and outputs the vibration displacements Ax and Ay in the X-axis and Y-axis directions. As described above, the main vibration direction in this embodiment is the Y-axis direction, so Ay is calculated as the displacement.
To achieve this, the information matching unit 140 of this embodiment includes a distortion detection unit 141 and a distortion amplitude detection unit 142.
The distortion detection unit 141 calculates the distortion of the same object in a first region and in a second region closer to the host vehicle than the first region, and determines the cause of the distortion. The distortion detection unit 141 calculates distance information for the first region (first distance information) and distance information for the second region (second distance information). The speed of the host vehicle is taken into account when the second distance information is calculated; the speed information of the host vehicle is acquired from the ECU 300.
As shown in FIG. 14(a), the first region and the second region are set with reference to the optical vanishing point 521 in the field-of-view range of the optical rangefinder 100a. For example, the first region 522 is set at a position close to the optical vanishing point (a far distance), and the second region 523 is set to a region (a middle distance) obtained by enlarging and projecting the first region from the vanishing point. Here, the case where each region is rectangular is described as an example.
The distance information used is that acquired at the timing at which the same object appears at each of these positions. The frames used to calculate the distance information at the first position and at the second position therefore differ. For this reason, the distance information memory 131 of this embodiment is able to hold distance information for at least two frames.
When the cause of the distortion is determined to be external vibration, the distortion amplitude detection unit 142 calculates the distortion width, that is, the displacement Ay. The displacement uses the result calculated by the distortion detection unit 141 at the time of the determination.
The displacement calculation processing performed by the information matching unit 140 of this embodiment is described below in accordance with the processing flow of FIG. 15.
First, the distance information of the first region 522 is calculated (step S2101).
The calculation of the distance information within the first region 522 is described here with reference to FIGS. 16(a) to 17(b).
FIGS. 16(a) and 16(c) are distance images that express the distance detection results of the first region 522 as differences in color density, for the case without external vibration and the case with external vibration, respectively.
When there is no external vibration, the road surface on which the host vehicle travels is generally flat, so the distance values at equal distances from the optical rangefinder 100a are substantially constant. In terms of the distance image, as shown in FIG. 16(a), the distance values of pixel rows with the same Y-axis coordinate are substantially the same, and the distance value increases as the Y-axis value increases; in a pixel column with the same X-axis coordinate, the distance value increases toward the far side.
In this situation, if the average value D of the distance information is calculated for each pixel column (each group of pixels with the same X-axis coordinate) of the region used for the distance-information calculation, the values are substantially the same for all columns in the region, as shown in FIG. 16(b).
On the other hand, when there is external vibration, the field of view also vibrates in the Y-axis direction, so the distance information also exhibits distortion synchronized with the vibration (referred to as waveform distortion), as shown in FIG. 16(c). If the average value D is calculated for each pixel column in this situation, the values oscillate, as shown in FIG. 16(d).
The distortion detection unit 141 of this embodiment first calculates the per-column average values D for the first region 522 as described above and determines whether waveform distortion is present, for example by checking whether each calculated average value D falls within a predetermined threshold range.
Here, as shown in FIGS. 16(b) and 16(d), the average value D is calculated for each pixel column, and then the average of those values over all the data, Dave, is calculated.
The distortion detection unit 141 then determines whether distortion is present according to whether the average value D of each pixel column falls within the range Dave ± Dth (step S2102). Dth is determined in advance.
If any value falls outside this range, the distortion detection unit 141 determines that distortion is present, and in that case it further calculates the amplitude Ad of the waveform distortion.
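The per-column check of steps S2101 and S2102 can be summarized in a short sketch: compute the column averages D, compare them with the overall average Dave using the preset threshold Dth, and report an amplitude when distortion is present. This is a minimal sketch; defining the amplitude Ad as half the peak-to-peak spread of the column averages is an assumption.

```python
import numpy as np

def detect_waveform_distortion(region, d_th):
    """region: 2-D array of distance values (rows x cols) for the analysis region;
    d_th: preset threshold Dth. Returns (distorted, Ad, D per column, Dave)."""
    d_per_column = np.nanmean(region, axis=0)      # average D of each pixel column
    d_ave = float(np.nanmean(d_per_column))        # average over all columns (Dave)
    distorted = bool(np.any(np.abs(d_per_column - d_ave) > d_th))
    a_d = 0.0
    if distorted:                                  # assumed definition of Ad
        a_d = float(np.nanmax(d_per_column) - np.nanmin(d_per_column)) / 2.0
    return distorted, a_d, d_per_column, d_ave
```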
When distortion is determined to be present, the distortion detection unit 141 further calculates, after a predetermined time, the waveform distortion of the second region 523 (step S2103). First, the distortion detection unit 141 calculates the per-column average values of the second region 523 by the same method as for the first region 522 and obtains the waveform-distortion amplitude Ad2.
Furthermore, using the time history of the host-vehicle speed information received from the engine control unit 300 and the elapsed time from the acquisition of the frame used for the distance-information calculation of the first region 522 to the acquisition of the frame used for the calculation of the second region 523, the travel distance between the frames is estimated. Based on this travel distance, the magnification of the waveform distortion at the time of projection is calculated.
The distortion detection unit 141 then determines the cause of the distortion (step S2104). The cause is determined by whether the waveform-distortion amplitude calculated in the first region 522 and that calculated in the second region 523 are constant; if they are constant, the distortion is determined to be caused by external vibration.
FIGS. 14(b) and 14(c) show the difference between waveform distortion caused by external vibration and waveform distortion that occurs when the detection target itself is distorted.
As these figures show, the amplitude and frequency of waveform distortion caused by external vibration are substantially the same in every distance region, as in FIG. 14(b). In contrast, when the detection target itself is distorted, the amplitude of the waveform distortion increases substantially linearly with distance from the optical vanishing point 521, as in FIG. 14(c).
For example, let Ad be the amplitude of the distance-information calculation result in the first region 522. When the distortion is caused by external vibration, the waveform-distortion amplitude is Ad at every distance, as shown in FIG. 14(b). On the other hand, when the shape of the detection target itself is distorted and the distances to the first region 522, the second region 523, and the third region 524 (short distance) are assumed to be equally spaced, the amplitudes in the second and third regions become 2Ad and 3Ad, as shown in FIG. 14(c).
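The discrimination rule of step S2104 described above (a constant amplitude across distance regions indicates external vibration, while an amplitude growing with distance from the vanishing point indicates a distorted object) can be sketched as follows; the tolerance used to judge "constant" is an assumption.

```python
def distortion_cause(amplitudes_by_region, rel_tolerance=0.2):
    """amplitudes_by_region: waveform-distortion amplitudes ordered from the
    far region (near the vanishing point) to the near region."""
    reference = amplitudes_by_region[0]
    constant = all(abs(a - reference) <= rel_tolerance * reference
                   for a in amplitudes_by_region[1:])
    return "external_vibration" if constant else "object_shape"
```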
When the distortion detection unit 141 determines that the cause of the distortion is external vibration (step S2105), the distortion amplitude detection unit 142 calculates the displacement used by the correction unit 120 for the correction (step S2106).
Here, the distortion amplitude detection unit 142 calculates, for example, the average value Dave of all the data in the second region 523 by the same method as for the first region 522, as shown in FIG. 17(a). Then, as shown in FIG. 17(b), it calculates the difference Dy between the average value D of each column and the average value Dave of all the data.
The correction unit 120 corrects the address of each pixel value of the distance image and the intensity image using the difference Dy as the displacement Ay (step S2107), and the processing ends.
The correction is performed in accordance with equation (5) below.
[Equation (5): equation image not reproduced]
Here, Px and Py are the X-axis and Y-axis pixel addresses in the distance information memory 131 or the intensity information memory 132, and Pcx and Pcy are the addresses after conversion.
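Since the image of equation (5) is not reproduced, the sketch below only illustrates the address conversion it describes: each pixel is relocated by the displacement Ay obtained for its column, so that the addresses correspond to the vibration-free field of view. The sign convention and the names are assumptions.

```python
import numpy as np

def remap_by_displacement(image, ay_per_column):
    """image: 2-D array (rows x cols) from the distance or intensity memory;
    ay_per_column: vertical displacement Ay for each column, in pixels.
    Pixels shifted outside the frame are left empty (NaN)."""
    rows, cols = image.shape
    out = np.full((rows, cols), np.nan)
    for px in range(cols):
        dy = int(round(ay_per_column[px]))
        for py in range(rows):
            pcy = py - dy                      # converted Y address (assumed sign)
            if 0 <= pcy < rows:
                out[pcy, px] = image[py, px]
    return out
```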
If it is determined in step S2102 that there is no distortion, or if it is determined in step S2105 that the distortion is caused by the shape of the object, the processing returns to step S2101 and is repeated.
The determination of the cause in step S2104 does not necessarily have to be based on the amplitude. Any criterion that can distinguish whether the detected distortion reflects the actual shape of the detection target or is caused by external vibration may be used; for example, a frequency obtained as a result of image analysis may be used.
In the above example, the correction amount Ay is calculated from the distance information (waveform distortion) of the second region 523, but the invention is not limited to this. For example, as shown in FIG. 14(a), three or more positions may be used: a far-distance position closer to the vanishing point 521, a middle-distance position farther from the vanishing point, and a near-distance position still farther from the vanishing point. In that case, the three positions may, for example, be set at equal intervals from the optical vanishing point 521. The distance information memory 131 is then able to hold distance information for at least as many frames as the number of positions at which the distance information is calculated. The third region 524 is preferably a region that covers the data of all columns of the field of view.
The shape of the region used for the distance-information calculation is not necessarily limited to a rectangle as in FIG. 14(a); any shape that can be projected from the vanishing point 521 may be used.
As described above, according to this embodiment, distortion caused by external vibration can be corrected using the distance image itself acquired by the optical rangefinder 100a. Unlike the first embodiment, no vibration detection sensor separate from the optical rangefinder is used. Therefore, even when external vibration is applied, the optical rangefinder 100a alone can obtain scanning-sensor detection results in which the intra-frame distortion has been corrected. That is, a high-quality image free from the influence of vibration-induced distortion can be obtained with a simpler configuration.
When the direction of the external vibration is known in advance, setting the oscillation direction to that vibration direction allows the acquired data to be used efficiently, as in the first embodiment.
<<Third Embodiment>>
A third embodiment of the present invention will now be described. In this embodiment, an image acquired by an image acquisition device separate from the optical rangefinder is used to identify the main vibration direction of the external vibration, and the same image is used to correct the distortion of the distance image and the intensity image caused by the external vibration.
FIG. 18 is a configuration diagram of the optical rangefinder 100b of this embodiment. As shown in the figure, the optical rangefinder 100b basically has the same configuration as the optical rangefinder 100 of the first embodiment. The present embodiment is described below with emphasis on the configuration that differs from the first embodiment.
In this embodiment, the displacements Ax and Ay and the main vibration direction θM of the optical rangefinder 100b are calculated using information from an image acquisition device 600 external to the optical rangefinder 100b. Using these values, the propagation direction control unit 216 controls the oscillation direction of the scanner 212 (light deflection unit 213), and the correction unit 120 corrects the influence of the external vibration.
To achieve this, the optical rangefinder 100b of this embodiment includes a camera image information memory 133, which stores the data acquired by the image acquisition device 600 (hereinafter referred to as camera images), and an information matching unit 140b, which calculates the correction amounts (displacements) Ax and Ay. The information matching unit 140b includes a distortion amplitude detection unit 142b, a viewpoint conversion unit 143, and an image processing unit 144.
The vibration analysis unit 233 of this embodiment obtains the displacements Ax and Ay from the distortion amplitude detection unit 142 and identifies the main vibration angle θM of the main vibration direction.
As in the first embodiment, the propagation direction control unit 216 controls the oscillation direction of the scanner 212 so as to be substantially parallel to the main vibration direction θM.
The image acquisition device 600 is a non-scanning sensor capable of acquiring the pixel values of one frame at once, for example a camera having a non-scanning image sensor such as a CCD or CMOS sensor. The image acquisition device 600 has a frame rate equal to or higher than that of the optical rangefinder 100b and, through synchronization control, can capture frames at the same timing as the optical rangefinder 100b.
Before each part is described in detail, the reason why the vibration-induced distortion (displacement) of the distance image and the intensity image can be calculated using a camera image acquired by a non-scanning sensor and an intensity image acquired by a scanning sensor such as the optical rangefinder 100b of this embodiment is explained.
FIG. 19(a) is a schematic diagram showing the field of view 461 when external vibration is applied to a scanning sensor. In a scanning sensor, the field of view 461 is distorted in synchronization with the vibration, as shown in the figure, so the detection result is also distorted.
FIG. 19(b) is a schematic diagram showing the field of view 471 when external vibration is applied to a non-scanning sensor. For an ideal non-scanning sensor with a short exposure time, the distortion of the field of view can be sufficiently ignored; the detection result is simply the entire field of view shifted by the vibration amplitude at the moment of exposure.
Therefore, by matching the acquisition result of the non-scanning sensor against that of the scanning sensor, the cause of the distortion of the detection-target shape obtained by the scanning sensor can be identified. That is, it is possible to distinguish whether the detection target itself has a distorted shape or whether the field of view merely appears distorted because of intra-frame distortion due to external vibration.
Using this, the details of the displacement calculation processing by the information matching unit 140b of this embodiment are described with reference to FIG. 20.
The viewpoint conversion unit 143 performs viewpoint conversion that converts the intensity information (intensity image) stored in the intensity information memory 132 into the viewpoint direction of the camera image acquired by the image acquisition device 600 (step S3101). The distance information (distance image) stored in the distance information memory 131 is used for the viewpoint conversion, which is performed by a known method using information such as the installation position and field of view of the optical rangefinder 100b and the installation position and field of view of the image acquisition device 600.
Because the optical rangefinder 100b and the image acquisition device 600 are separate devices, their fields of view differ. Therefore, before the information is matched, the distance image is converted to the viewpoint of the camera image. However, if the optical rangefinder 100b and the image acquisition device 600 are installed so close together that the difference between their fields of view is negligible, the viewpoint conversion need not be performed, and in that case the viewpoint conversion unit 143 may be omitted.
The image processing unit 144 performs various kinds of image processing on the viewpoint-converted intensity image and the camera image (step S3102), for example lens distortion correction, light-amount correction, gamma correction, range conversion by bit-depth conversion, and image-size conversion. This image processing makes the intensity image and the camera image comparable.
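As an illustration of this normalization step, the sketch below applies gamma correction, range (bit-depth) conversion, and a simple size conversion so that the two images become directly comparable; the parameter values are illustrative assumptions, and lens-distortion and light-amount correction are omitted.

```python
import numpy as np

def normalize_for_matching(img, gamma=1.0, out_bits=8, out_shape=None):
    """Bring an intensity image or a camera image to a common range, gamma and size."""
    x = img.astype(np.float64)
    x = (x - x.min()) / max(x.max() - x.min(), 1e-12)    # range conversion to [0, 1]
    x = np.power(x, gamma)                                # gamma correction
    if out_shape is not None:                             # nearest-neighbour resize
        r = np.linspace(0, x.shape[0] - 1, out_shape[0]).round().astype(int)
        c = np.linspace(0, x.shape[1] - 1, out_shape[1]).round().astype(int)
        x = x[np.ix_(r, c)]
    scale = 2 ** out_bits - 1                              # bit-depth conversion
    return (x * scale).astype(np.uint16 if out_bits > 8 else np.uint8)
```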
The distortion amplitude detection unit 142b compares the processed intensity image with the camera image, calculates the amount of pixel shift, and outputs it as the displacement (step S3103). The calculation result is output to the vibration analysis unit 233 and the correction unit 120.
The shift amount is calculated by matching processing using a template taken from the image. Details of the shift-amount calculation are described with reference to FIGS. 21(a) to 22(c).
First, the process of calculating the pixel shift between the processed intensity image 620 and the camera image 610 when the oscillation direction is the Y-axis direction is described with reference to FIGS. 21(a) to 21(c). The pixel shift is calculated by matching processing using a template 611 taken from the camera image 610, and the calculation is performed for each column of the intensity image 620 along the oscillation direction.
For example, assume that the template 611 shown in FIG. 21(a) is used. The region used as the template 611 consists of only M pixels of a given column of the camera image 610, for example those near the center of the column. The reason for not using all the pixels along the oscillation direction is that, because of the vibration, pixel values at the ends of the camera image 610 may be missing from the intensity image 620 being compared.
The region used for the template 611 is not limited to pixels near the center of the column; pixels at the ends may also be used.
The distortion amplitude detection unit 142b uses the extracted template 611 to search for the matching portion within the intensity image 620. Here, for example, a raster-scan search is performed along the oscillation direction, as shown in FIG. 21(b); that is, matching processing is performed by raster scan over regions of the intensity image 620 having the same number of pixels M as the template 611.
The matching is evaluated by the SSD (Sum of Squared Differences) method. That is, the distortion amplitude detection unit 142b takes as the matching region 512 the region in which the dissimilarity R_SSD expressed by equation (6) below is smallest.
  R_SSD = Σ_{i=0}^{M-1} { I(i) − T(i) }²   ... (6)
Here, i (i = 0 to M−1) is the index of each pixel in the template 611, T(i) is the luminance value of pixel i of the template 611, and I(i) is the luminance value of the corresponding pixel i in the region of the intensity image 620 being compared.
FIG. 21(c) shows an example in which, as a result of the matching, the position shifted from the camera image 610 by Ax in the X-axis direction and Ay in the Y-axis direction is determined to be the matching region 612. In this case, the distortion amplitude detection unit 142b takes this shift Ay as the displacement. In the example of FIG. 21(c), the oscillation direction is the Y-axis direction, so Ax = 0.
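The raster-scan SSD matching described above can be sketched as follows: a one-dimensional template of M pixels taken from one column of the camera image is slid over the intensity image, and the offset with the smallest R_SSD is returned. This is a minimal sketch; the brute-force search and the coordinate conventions are assumptions.

```python
import numpy as np

def ssd(template, candidate):
    """Dissimilarity R_SSD between a template and a candidate column segment."""
    d = candidate.astype(np.float64) - template.astype(np.float64)
    return float(np.sum(d * d))

def find_matching_offset(intensity_image, template, template_col, template_row0):
    """Return (Ax, Ay): column and row shift of the best match relative to the
    template's original position (template_col, template_row0) in the camera image."""
    m = len(template)
    rows, cols = intensity_image.shape
    best_offset, best_score = None, np.inf
    for x in range(cols):                          # raster scan over columns
        for y in range(rows - m + 1):              # and over row positions
            score = ssd(template, intensity_image[y:y + m, x])
            if score < best_score:
                best_offset = (x - template_col, y - template_row0)
                best_score = score
    return best_offset
```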
Next, an example of the shift-amount calculation when the oscillation direction is the direction θF with respect to the X axis is described with reference to FIGS. 22(a) to 22(c). In this case, the template 611 is extracted along the oscillation direction θF at that time. As in FIG. 21(a), however, the region used for the template 611 is not the entire column of the camera image 610 along the direction θF but, for example, only the M pixels near the center of that column.
In this case as well, the distortion amplitude detection unit 142b uses the extracted template 611 to search for the matching portion within the intensity image 620 by raster scan, and takes as the matching region 612 the region in which the dissimilarity R_SSD expressed by equation (6) above is smallest.
FIG. 22(c) shows that, as a result of the matching, the value of the dissimilarity R_SSD is smallest at the position shifted from the camera image 610 by Ax in the horizontal direction and Ay in the vertical direction.
The distortion amplitude detection unit 142b outputs the pixel shifts Ax and Ay obtained as a result of the matching as the displacements.
Having received the displacements Ax and Ay from the distortion amplitude detection unit 142b, the correction unit 120 performs the correction by the same method as in the first embodiment (step S3104).
The vibration analysis unit 233 also calculates the main vibration direction θM and the combined displacement (amplitude) Acom by the same method as in the first embodiment.
The matching processing with the template in this embodiment is not limited to the SSD method; the SAD (Sum of Absolute Differences) method, the NCC (Normalized Cross-Correlation) method, or the ZNCC (Zero-mean Normalized Cross-Correlation) method may also be used.
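For reference, minimal sketches of two of the alternative measures follow. SAD, like SSD, is a dissimilarity (smaller is better), whereas ZNCC is a similarity score (larger is better); these are generic textbook formulations, not taken from the specification.

```python
import numpy as np

def sad(template, candidate):
    """Sum of Absolute Differences; smaller means a better match."""
    return float(np.sum(np.abs(candidate.astype(float) - template.astype(float))))

def zncc(template, candidate):
    """Zero-mean Normalized Cross-Correlation in [-1, 1]; larger means a better match."""
    t = template.astype(float) - template.mean()
    c = candidate.astype(float) - candidate.mean()
    denom = np.sqrt(np.sum(t * t) * np.sum(c * c))
    return float(np.sum(t * c) / denom) if denom > 0 else 0.0
```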
As described above, according to this embodiment, the oscillation direction of the optical scanning device 200 is controlled so as to always be substantially parallel to the vibration direction. Therefore, as in the first embodiment, the detection density within the scanning region does not become uneven and the data can be used efficiently. That is, even when external vibration is applied, uniform detection without unevenness in detection density can be achieved.
In addition, the output of the image acquisition device 600 is used to correct the vibration-induced distortion of the distance image. Therefore, as in the first embodiment, a scanning sensor can correct the distortion of the detection target caused by external vibration while efficiently using the acquired data.
Although the optical rangefinder 100 in each of the above embodiments has been described, for purposes of explanation, as being mounted on an automobile or the like, the mounting target is not limited to automobiles. It may be any platform that needs to acquire information about its surroundings, such as an autonomously traveling carrier like an automated guided vehicle, or a robot.
100: optical rangefinder, 100a: optical rangefinder, 100b: optical rangefinder, 110: distance image generation unit, 120: correction unit, 120a: correction unit, 131: distance information memory, 132: intensity information memory, 133: camera image information memory, 140: information matching unit, 140b: information matching unit, 141: distortion detection unit, 142: distortion amplitude detection unit, 142b: distortion amplitude detection unit, 143: viewpoint conversion unit, 144: image processing unit,
200: optical scanning device, 211: light projection unit, 212: scanner, 213: light deflection unit, 214: drive unit, 215: scanning angle detection unit, 216: propagation direction control unit, 221: light receiving unit, 222: I/V conversion unit, 223: amplification circuit, 225: signal processing unit, 226: output unit, 230: control unit, 231: distance calculation unit, 232: intensity calculation unit, 233: vibration analysis unit, 234: output processing unit, 235: pulse signal generation unit, 240: vibration detection device,
300: ECU,
401: field-of-view range, 402: oscillation direction, 403: translation direction, 404: trajectory, 411: black dot, 412: black dot, 421: trajectory, 422: field-of-view range, 422a: field-of-view range, 422b: detection field of view, 423: column unit, 424: light-emitting region, 425: non-light-emitting region, 433: column unit, 434: light-emitting region, 435: non-light-emitting region, 441: detection target, 441a: detection result, 441c: detection result, 442: detection result, 443: detection result, 444: vibration, 451: field of view, 452: distance information, 461: field of view, 471: field of view,
512: matching region, 521: vanishing point, 522: first region, 523: second region, 524: third region,
600: image acquisition device, 610: camera image, 611: template, 612: matching region, 620: intensity image

Claims (12)

  1.  揺動方向と、前記揺動方向と交差する並進方向と、で定まる投光方向に、投光する光である測定光を偏向する光偏向部と、
     前記測定光を前記光偏向部に向けて投光する投光部と、
     前記測定光の、対象物による反射光を受光する受光部と、
     前記揺動方向が、自身の主たる振動方向である主振動方向になるよう前記光偏向部を駆動制御する伝搬方向制御部と、
     前記反射光から得た情報を出力情報として出力する出力部と、を備えること
     を特徴とする光走査装置。
    A light deflector that deflects measurement light that is light to be projected in a light projecting direction determined by a rocking direction and a translation direction that intersects the rocking direction;
    A light projecting unit that projects the measurement light toward the light deflecting unit;
    A light receiving unit for receiving the reflected light of the measurement light from the object;
    A propagation direction control unit that drives and controls the light deflection unit so that the oscillation direction is a main oscillation direction that is the main oscillation direction of the device;
    And an output unit that outputs information obtained from the reflected light as output information.
  2.  請求項1記載の光走査装置であって、
     前記主振動方向を算出する振動解析部をさらに備え、
     前記振動解析部は、前記主たる振動による当該光走査装置の変位量を取得し、取得した、前記変位量から前記主振動方向を算出すること
     を特徴とする光走査装置。
    The optical scanning device according to claim 1,
    A vibration analysis unit for calculating the main vibration direction;
    The vibration analysis unit acquires a displacement amount of the optical scanning device due to the main vibration, and calculates the main vibration direction from the acquired displacement amount.
  3.  請求項2記載の光走査装置であって、
     前記振動解析部は、外的振動に伴う変位を検出する変位センサから前記変位量を取得すること
     を特徴とする光走査装置。
    The optical scanning device according to claim 2,
    The vibration analysis unit acquires the amount of displacement from a displacement sensor that detects displacement caused by external vibration.
  4.  請求項2記載の光走査装置であって、
     前記振動解析部は、非走査型の画像取得装置が取得した画像を用い、前記変位量を取得すること
     を特徴とする光走査装置。
    The optical scanning device according to claim 2,
    The vibration analysis unit acquires the displacement amount using an image acquired by a non-scanning image acquisition device.
  5.  The optical scanning device according to claim 2,
      wherein the vibration analysis unit further calculates an amplitude of the main vibration from the displacement amount, and
      the propagation direction control unit uses the calculated amplitude to control the amplitude in the swinging direction so that it falls within a predetermined field of view range.
  6.  The optical scanning device according to claim 5, further comprising
      a signal processing unit that instructs the timing at which the measurement light is projected,
      wherein the signal processing unit controls the projection timing so that the range projected by the light deflecting unit falls within a predetermined field of view range.
  7.  An optical rangefinder comprising:
      the optical scanning device according to claim 1;
      a distance image generation unit that generates a distance image from the output information; and
      a correction unit that uses a displacement amount of the optical scanning device caused by the main vibration to correct distortion of the distance image caused by the main vibration, and outputs the corrected distance image.
  8.  The optical rangefinder according to claim 7,
      wherein the displacement amount is acquired from a displacement sensor that detects displacement caused by external vibration.
  9.  The optical rangefinder according to claim 7,
      wherein the displacement amount is calculated using distance images acquired at different times.
  10.  The optical rangefinder according to claim 7,
      wherein the displacement amount is acquired using an image acquired by a non-scanning image acquisition device.
  11.  The optical rangefinder according to claim 7,
      wherein the displacement amount is acquired for each memory in which each pixel value of the distance image is stored, and
      the correction unit corrects the distortion by shifting the address of the memory by the displacement amount.
  12.  The optical rangefinder according to claim 11,
      wherein, among the memories in which the pixel values of the distance image are stored, the same displacement amount is used for address columns substantially parallel to the main vibration direction.
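The mechanisms recited above lend themselves to a short illustration. Claims 2 to 4 describe estimating the main vibration direction from sampled displacements of the scanner, and claims 7, 11 and 12 describe correcting the resulting distortion of the distance image by displacing memory addresses rather than recomputing pixel values. The Python sketch below shows one plausible reading of those two steps; it is a minimal sketch under stated assumptions, not the patent's implementation, and every function name, array shape and sample value in it is hypothetical.

# Illustrative sketch only -- names, shapes and sample values are assumptions
# made for this example and do not come from the patent text.
import numpy as np


def estimate_main_vibration_direction(displacements: np.ndarray) -> np.ndarray:
    """Return a unit vector along the dominant axis of vibration.

    displacements: (N, 2) array of (x, y) scanner displacements sampled over
    time, e.g. from a displacement sensor (claim 3) or from images of a
    non-scanning camera (claim 4).
    """
    centered = displacements - displacements.mean(axis=0)
    # The first right singular vector of the centered sample cloud is the
    # principal axis, taken here as the main vibration direction.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]


def correct_distance_image(distance_image: np.ndarray,
                           row_shifts: np.ndarray) -> np.ndarray:
    """Correct vibration-induced distortion by displacing memory addresses.

    row_shifts holds one integer pixel shift per image row; rows that run
    roughly parallel to the main vibration direction share the same value,
    mirroring claim 12.
    """
    corrected = np.empty_like(distance_image)
    for r, shift in enumerate(row_shifts):
        # Shift the read address by `shift` instead of resampling values.
        corrected[r] = np.roll(distance_image[r], -int(shift))
    return corrected


# Toy usage: a 4 x 8 distance image whose rows drifted by 0..3 pixels.
image = np.tile(np.arange(8.0), (4, 1))
fixed = correct_distance_image(image, np.array([0, 1, 2, 3]))
direction = estimate_main_vibration_direction(
    np.array([[0.0, 0.1], [0.2, 1.1], [-0.1, -0.9], [0.1, 0.8]]))

Shifting a read address row by row is cheap compared with full image resampling, which is presumably why the claims tie the correction to memory addressing and allow a single shared shift for rows parallel to the main vibration direction.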
PCT/JP2016/085427 2016-11-29 2016-11-29 Optical scanning device and optical rangefinder WO2018100648A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/085427 WO2018100648A1 (en) 2016-11-29 2016-11-29 Optical scanning device and optical rangefinder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/085427 WO2018100648A1 (en) 2016-11-29 2016-11-29 Optical scanning device and optical rangefinder

Publications (1)

Publication Number Publication Date
WO2018100648A1 true WO2018100648A1 (en) 2018-06-07

Family

ID=62241310

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/085427 WO2018100648A1 (en) 2016-11-29 2016-11-29 Optical scanning device and optical rangefinder

Country Status (1)

Country Link
WO (1) WO2018100648A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020026512A1 (en) * 2018-07-30 2020-02-06 コニカミノルタ株式会社 Laser radar device and frame data correction system
CN111537979A (en) * 2020-04-30 2020-08-14 上海禾赛光电科技有限公司 Laser radar and control method thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05100025A (en) * 1991-10-03 1993-04-23 Hamamatsu Photonics Kk Car-to-car distance measuring device
JPH07332966A (en) * 1994-06-09 1995-12-22 Hitachi Ltd Distance-measuring apparatus for vehicle
JPH10170630A (en) * 1996-12-07 1998-06-26 Robert Bosch Gmbh Detecting method and device of vertical directional positioning error or positioning offset of distance sensor
JP2011089874A (en) * 2009-10-22 2011-05-06 Toyota Central R&D Labs Inc Distance image data acquisition device
US20160327635A1 (en) * 2015-05-07 2016-11-10 GM Global Technology Operations LLC Spatio-temporal scanning patterns for array lidar systems

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020026512A1 (en) * 2018-07-30 2020-02-06 コニカミノルタ株式会社 Laser radar device and frame data correction system
JPWO2020026512A1 (en) * 2018-07-30 2021-08-02 コニカミノルタ株式会社 Laser radar device and frame data correction system
JP7264167B2 (en) 2018-07-30 2023-04-25 コニカミノルタ株式会社 Laser radar device and frame data correction system
CN111537979A (en) * 2020-04-30 2020-08-14 上海禾赛光电科技有限公司 Laser radar and control method thereof

Similar Documents

Publication Publication Date Title
KR101097119B1 (en) Method of inspecting tunnel inner part damage by vision sensor system
JP6528447B2 (en) Disparity calculation system and distance measuring device
US9329035B2 (en) Method to compensate for errors in time-of-flight range cameras caused by multiple reflections
JP2021507268A (en) Multi-pulse LIDAR system for multidimensional capture of objects
JP3187495B2 (en) Optical space transmission equipment
JP2015017992A (en) Method and device for optically scanning and measuring environment
JPH07332966A (en) Distance-measuring apparatus for vehicle
US11067689B2 (en) Information processing device, information processing method and program
US20160295186A1 (en) Wearable projecting device and focusing method, projection method thereof
US9864173B2 (en) Systems and methods for run-time alignment of a spot scanning wafer inspection system
WO2020075525A1 (en) Sensor fusion system, synchronization control device, and synchronization control method
WO2018100648A1 (en) Optical scanning device and optical rangefinder
JP2008059260A (en) Movement detection image creating device
EP3349441A3 (en) Image projection apparatus and compensation method
JP2019049484A (en) Object detection system and object detection program
JP4905074B2 (en) Detection center axis deviation amount detection method
JP2013068859A (en) Image display device
KR20200114860A (en) Wide-angle high resolution distance measuring device
EP1605680A3 (en) Apparatus for and method of forming image using oscillation mirror
KR20170116635A (en) Scanning device and operating method thereof
JP2014070936A (en) Error pixel detecting apparatus, error pixel detecting method, and error pixel detecting program
JP6379646B2 (en) Information processing apparatus, measurement method, and program
JP5659109B2 (en) Moving object tracking device and reference point tracking method
CN111788495B (en) Light detection device, light detection method, and laser radar device
JPH0749942B2 (en) Road surface unevenness measuring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16922662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP