WO2016021606A1 - Object recognition device using a plurality of object detection means - Google Patents
Object recognition device using a plurality of object detection means
- Publication number: WO2016021606A1 (application PCT/JP2015/072111)
- Authority: WIPO (PCT)
- Prior art keywords
- learning
- axis
- object detection
- axis deviation
- deviation
- Prior art date
Classifications
- G01S7/40—Means for monitoring or calibrating (radar)
- G01S7/4026—Antenna boresight
- G01S7/417—Target characterisation involving the use of neural networks
- G01S7/497—Means for monitoring or calibrating (lidar)
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
- G01S13/93—Radar or analogous systems specially adapted for anti-collision purposes
- G01S13/931—Radar anti-collision systems for land vehicles
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G01S17/58—Velocity or trajectory determination systems; sense-of-movement determination systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/93—Lidar systems specially adapted for anti-collision purposes
- G01S17/931—Lidar anti-collision systems for land vehicles
- G06F18/214—Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06T7/13—Edge detection
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/30242—Counting objects in image
- G06T2207/30252—Vehicle exterior; vicinity of vehicle
- G06T2207/30256—Lane; road marking
- G06V20/588—Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present disclosure relates to an object recognition device, and more particularly to an object recognition device mounted on a moving body.
- an object detection sensor such as a millimeter wave radar or a laser radar is mounted on a vehicle, and an object existing around the vehicle such as a preceding vehicle, a pedestrian, or an obstacle is detected.
- the object recognition device performs various controls for improving the traveling safety of the vehicle based on the object detection result.
- In this object recognition device, when a reference axis shift (optical axis shift) occurs in an object detection sensor attached to the vehicle, the detection area of that sensor shifts from the area that should be detected, and it is conceivable that the detection accuracy of the sensor is reduced. In such a case, the accuracy of the various controls for improving the traveling safety of the vehicle may also be reduced.
- When detecting an axis deviation of the radar apparatus using an image captured by the imaging apparatus, if the detection process is performed in a situation where the reference axis (imaging axis) of the imaging apparatus itself is deviated, the axis deviation of the radar apparatus cannot be detected with high accuracy, and there is a concern that an erroneous determination may result.
- The present disclosure has been made in view of the above circumstances, and its object is to provide an object recognition device that can suppress erroneous determination in the process of determining whether a reference axis shift has occurred in an object detection unit mounted on a moving body.
- This disclosure is configured as follows.
- The present disclosure relates to an object recognition apparatus applied to a moving body (50) on which a first object detection means (11) and a second object detection means (12) are mounted as object detection means for detecting an object existing in a predetermined detectable region.
- The apparatus includes an axis deviation determination means for performing an axis misalignment determination process that determines, based on the object detection results of the first object detection means and the second object detection means, whether an axis misalignment has occurred in the reference axis of the second object detection means.
- Depending on the learning result of the axis deviation amount of the first object detection means, the determination accuracy of the axis deviation determination for the second object detection means changes. For example, if the learning result of the axis deviation amount of the first object detection means indicates that the learning accuracy is not very high, there is a risk of causing an erroneous determination in the axis deviation determination of the second object detection means.
- FIG. 1 is a block diagram showing the schematic configuration of the object recognition device.
- FIG. 2 is a diagram showing the arrangement of the imaging device and the radar device and their detection areas on the vehicle.
- FIG. 3 is a schematic diagram showing the detection-distance shift caused by an axis deviation of the imaging axis.
- FIG. 4 is a time chart showing a specific aspect of the radar axis deviation determination process.
- FIG. 5 is a flowchart showing the processing sequence of the radar axis deviation determination process.
- FIG. 6 is a flowchart showing the processing sequence of the reset determination process.
- FIG. 7 is a time chart showing the counter reset timing according to the learning state of the vanishing point.
- FIG. 8 is a time chart showing a specific aspect of another embodiment.
- the object recognition apparatus 10 is mounted on a vehicle as a moving body.
- the imaging device 11 and the radar device 12 recognize an object that exists in a detectable region including the front of the vehicle.
- a schematic configuration of the object recognition apparatus 10 according to the present embodiment will be described with reference to FIGS. 1 and 2.
- The imaging device 11 is an in-vehicle camera such as a CCD camera, a CMOS image sensor, or a near-infrared camera.
- the imaging device 11 captures the surrounding environment including the traveling road of the host vehicle 50, generates image data representing the captured image, and sequentially outputs the image data to the object recognition device 10.
- the imaging device 11 according to the present embodiment is installed near the upper end of the windshield of the host vehicle 50, for example, and captures an area 61 that extends in a range of a predetermined angle ⁇ 1 toward the front of the vehicle about the imaging axis X1 (see FIG. 2).
- the imaging device 11 may be a monocular camera or a stereo camera.
- the radar apparatus 12 is an apparatus that detects an object by transmitting an electromagnetic wave (that is, a radar wave) as a transmission wave and receiving the reflected wave.
- The radar apparatus 12 is attached to the front portion of the host vehicle 50, and scans with radar waves an area 62 that extends over a range of a predetermined angle θ2 (θ2 < θ1) centered on the optical axis X2 toward the front of the vehicle.
- distance measurement data is created based on the time from when the electromagnetic wave is transmitted toward the front of the vehicle until the reflected wave is received, and the created distance measurement data is sequentially output to the object recognition device 10.
- the distance measurement data includes information on the direction in which the object exists, the distance to the object, and the relative speed.
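The ranging principle described above follows from the round-trip time of the radar wave. The sketch below (illustrative only, not part of the patent text; the patent does not specify a modulation scheme) assumes a simple time-of-flight model:

```python
# Speed of light in vacuum, m/s
C = 299_792_458.0

def radar_range_m(round_trip_s: float) -> float:
    """Range from the round-trip time between transmitting the radar
    wave and receiving its reflection: d = c * t / 2, because the wave
    travels the distance out and back."""
    return C * round_trip_s / 2.0

# A reflection arriving ~0.33 microseconds later corresponds to ~50 m.
print(radar_range_m(0.33e-6))
```

The direction and relative speed mentioned above would come from the antenna geometry and the Doppler shift respectively, which this sketch does not model.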
- The imaging device 11 and the radar device 12 are attached to the host vehicle 50 such that the imaging axis X1, which is the reference axis of the imaging device 11, and the optical axis X2, which is the reference axis of the radar device 12, are both horizontal with respect to the road surface (traveling road surface) on which the host vehicle 50 travels.
- at least a part of the detectable area 61 of the imaging device 11 and the detectable area 62 of the radar device 12 overlap each other.
- the imaging device 11 corresponds to “first object detection means”
- the radar device 12 corresponds to “second object detection means”.
- The object recognition device 10 is a computer including a CPU, RAM, ROM, I/O, and the like.
- The object recognition device 10 includes a white line recognition unit 13, a flow calculation unit 14, a vanishing point calculation unit 20, and a radar axis deviation detection unit 30; the CPU realizes each of these functions by executing programs installed in the ROM.
- the white line recognition unit 13 inputs an image photographed by the imaging device 11 and recognizes a white line as a road lane marking included in the image.
- the white line recognition unit 13 extracts edge points as white line candidates from the captured image data based on, for example, the luminance change rate in the horizontal direction of the image, and sequentially stores the extracted edge points for each frame.
- the white line is recognized based on the stored history of white line edge points.
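The edge-point extraction described above can be sketched as a threshold on the horizontal luminance change rate. The array values and the threshold below are illustrative assumptions; the patent does not specify them:

```python
import numpy as np

def extract_edge_points(row: np.ndarray, grad_threshold: float = 50.0) -> np.ndarray:
    """Return column indices whose horizontal luminance change rate
    exceeds a threshold -- candidate white-line edge points for one
    image row. (Illustrative sketch of the extraction step only.)"""
    grad = np.abs(np.diff(row.astype(float)))  # horizontal luminance change
    return np.nonzero(grad > grad_threshold)[0]

# A dark road row with a bright white-line stripe in columns 4-6:
row = np.array([20, 20, 20, 20, 200, 200, 200, 20, 20])
print(extract_edge_points(row).tolist())  # → [3, 6]
```

In the device, such edge points would be accumulated frame by frame, and the white line recognized from their stored history, as the text above describes.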
- the flow calculation unit 14 inputs an image photographed by the imaging device 11, and calculates an optical flow as a motion vector at each point in the image using the input image data (flow calculation means). For example, the flow calculation unit 14 calculates a motion vector for each pixel based on a change in spatial luminance distribution.
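The patent does not name a specific optical-flow algorithm; as one common realization of a motion-vector calculation from the spatial and temporal brightness distribution, a Lucas-Kanade-style least-squares estimate for a single patch might look like this (sketch under that assumption):

```python
import numpy as np

def lucas_kanade_flow(prev: np.ndarray, curr: np.ndarray):
    """Estimate one motion vector (u, v) for an image patch by solving
    Ix*u + Iy*v = -It in the least-squares sense over all pixels
    (Lucas-Kanade). prev/curr are two consecutive grayscale frames."""
    Iy, Ix = np.gradient(prev.astype(float))        # spatial gradients
    It = curr.astype(float) - prev.astype(float)    # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# A horizontal brightness ramp shifted right by one pixel yields u ≈ 1.
prev = np.tile(np.arange(10, dtype=float), (10, 1))
u, v = lucas_kanade_flow(prev, prev - 1.0)
```

Applying this per pixel neighborhood gives the per-point motion vectors that the flow calculation unit 14 is described as producing.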
- the vanishing point calculation unit 20 includes a reference value estimation unit 21, a vanishing point learning unit 22, and a learning value storage unit 25.
- The reference value estimation unit 21 inputs white line information (information on the position of the white line) from the white line recognition unit 13 and flow information (information on the optical flow) from the flow calculation unit 14, and executes various processes for obtaining a vanishing point using these input data.
- the reference value estimation unit 21 calculates the vanishing point based on the image data captured by the imaging device 11. Specifically, the reference value estimation unit 21 calculates the vanishing point using the white line information input from the white line recognition unit 13 and the flow information input from the flow calculation unit 14. For example, when white line information is used, an intersection of a pair of white lines existing around the vehicle is estimated as a vanishing point, and the value (reference vanishing point) is stored in the ROM. When the vehicle is shipped, an initial value is stored in advance in the ROM as a vanishing point.
- The initial value is set based on, for example, parameters indicating the mounting state of the imaging device 11 (e.g., mounting height and depression angle of the imaging axis) and parameters relating to the imaging function of the imaging device 11 (e.g., number of pixels and focal length).
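A minimal sketch of how such an initial vanishing point could be derived from the mounting parameters, assuming a pinhole-camera model (the parameter values below are illustrative, not taken from the patent): with the imaging axis depressed by an angle θ below horizontal, the straight-road vanishing point projects f·tan(θ) pixels above the principal point.

```python
import math

def initial_vanishing_point(cx: float, cy: float, f_px: float,
                            depression_deg: float):
    """Initial vanishing point from camera mounting parameters.
    cx, cy: principal point (px); f_px: focal length (px);
    depression_deg: tilt of the imaging axis below horizontal.
    Image v grows downward, so the horizon sits ABOVE the centre."""
    v = cy - f_px * math.tan(math.radians(depression_deg))
    return cx, v

# e.g. principal point (640, 480), f = 1000 px, 2 deg depression
print(initial_vanishing_point(640.0, 480.0, 1000.0, 2.0))
```

This mirrors the text above: the stored initial value depends only on mounting geometry and imaging parameters, before any learning has run.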
- The vanishing point learning unit 22 performs vanishing point learning for calculating the steady deviation amount of the vanishing point (the axis deviation amount of the imaging axis X1) that accompanies changes in the mounting height and axial direction of the imaging device 11. Specifically, it comprises a first learning unit 23 that executes learning on the vanishing point calculated from the white line information, and a second learning unit 24 that executes learning on the vanishing point calculated from the flow information. The learning values (vanishing point learning values) of the first learning unit 23 and the second learning unit 24 are stored and updated in the learning value storage unit 25 each time learning is executed.
- The vanishing point learning unit 22 starts vanishing point learning when a start switch (for example, an ignition switch) of the host vehicle 50 is turned on. Further, in this embodiment, considering that the mounting height and axial direction of the imaging device 11 change with the loading state and traveling state of the vehicle, and that the position of the vanishing point changes accordingly, vanishing point learning is performed sequentially even after it has once been completed following turn-on of the start switch.
- the learning value storage unit 25 is configured by, for example, a nonvolatile memory (EEPROM or the like) that can electrically rewrite data.
- the object recognition device 10 analyzes the image data using the vanishing point as an index, for example, estimates the traveling state of the host vehicle 50 with respect to the traveling road, the positional relationship with the preceding vehicle, and recognizes a pedestrian.
- the radar axis deviation detection unit 30 includes a camera target detection unit 31, a radar target detection unit 32, an axis deviation information calculation unit 33, an axis deviation determination unit 34, and a reset determination unit 35.
- the camera target detection unit 31 inputs an image taken by the imaging device 11 and detects a target (camera target) included in the image.
- the radar target detection unit 32 inputs distance measurement data from the radar apparatus 12 and detects a target (radar target) included in the input data.
- The axis deviation information calculation unit 33 determines whether an optical axis deviation has occurred in the radar device 12 based on the detection result of the camera target detection unit 31 and the detection result of the radar target detection unit 32.
- the axis misalignment information calculation unit 33 inputs the camera target detected by the camera target detection unit 31 and information on the radar target detected by the radar target detection unit 32, respectively.
- The axis deviation information calculation unit 33 calculates the camera target detection counter CA and the axis deviation determination counter CJ. Specifically, when an object (for example, a preceding vehicle) exists in front of the host vehicle 50 and a target corresponding to that object is detected at least by the camera target detection unit 31, the camera target detection counter CA is incremented at a predetermined cycle. When the radar target detection unit 32 detects a target that can be regarded as the same object as the target detected by the camera target detection unit 31, the axis deviation determination counter CJ is incremented.
- the camera target detection counter CA and the axis deviation determination counter CJ correspond to “information about optical axis deviation”.
- The axis deviation determination unit 34 determines whether an optical axis deviation of the radar apparatus 12 has occurred based on the camera target detection counter CA and the axis deviation determination counter CJ input from the axis deviation information calculation unit 33. Specifically, the ratio of the axis deviation determination counter CJ to the camera target detection counter CA is calculated, and the ratio CJ/CA is compared with the determination value THA2; when CJ/CA < THA2, it is determined that there is an optical axis deviation. The reset determination unit 35 determines whether to pass the camera target detection counter CA and the axis deviation determination counter CJ on to the axis deviation determination unit 34 or to invalidate them.
- the reset determination unit 35 inputs the vanishing point learning value from the learning value storage unit 25, and determines whether to reset the camera target detection counter CA and the axis deviation determination counter CJ based on the input vanishing point learning value. decide.
- The reset determination unit 35 sets the past FOE trust flag Ftr to 0 or 1 based on the vanishing point learning value stored in the learning value storage unit 25, and outputs the set flag Ftr, as axis deviation information, to the axis deviation information calculation unit 33.
- This past FOE trust flag Ftr is information indicating whether to reset the camera target detection counter CA and the axis deviation determination counter CJ. If the input past FOE trust flag Ftr is 1, the axis deviation information calculation unit 33 holds the measured values of the camera target detection counter CA and the axis deviation determination counter CJ as they are.
- If the input past FOE trust flag Ftr is 0, the camera target detection counter CA and the axis deviation determination counter CJ are reset to zero, and counter measurement is restarted from the beginning.
- the axis deviation information calculation unit 33 and the reset determination unit 35 constitute an “invalidating unit”.
- FIG. 3 schematically shows the detection-distance deviation of the camera target caused by an axis deviation of the imaging axis X1. In FIG. 3, it is assumed that no axis deviation occurs in the optical axis X2 of the radar device 12.
- Comparing the detection distance from the host vehicle 50 to the same object (the preceding vehicle 55 in FIG. 3) between the imaging device 11 and the radar device 12: in a state where no axis deviation occurs in the imaging axis X1, the difference in detection distance is relatively small (Δd1); when the imaging axis X1 is shifted upward by a predetermined angle θ3 from the horizontal with respect to the traveling road surface, the difference in detection distance becomes relatively large (Δd2).
- As a result, the two targets may fail to be recognized as the same object in the image processing.
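The growth of the distance difference with the axis shift can be illustrated with a flat-ground monocular ranging model; the camera height, object distance, and shift angle below are illustrative assumptions, not values from the patent:

```python
import math

def camera_distance_error(true_dist_m: float, cam_height_m: float,
                          pitch_up_deg: float):
    """Flat-ground monocular ranging: d = h / tan(alpha), where alpha is
    the depression angle of the object's road-contact point. An unnoticed
    upward shift of the imaging axis by pitch_up_deg makes the object
    appear lower in the image, so the estimated distance shrinks.
    Returns (estimated distance, error versus the radar's true distance)."""
    alpha = math.atan2(cam_height_m, true_dist_m)   # true depression angle
    d_est = cam_height_m / math.tan(alpha + math.radians(pitch_up_deg))
    return d_est, true_dist_m - d_est

# Object 40 m ahead, camera 1.2 m high, 1 deg upward axis shift:
d_est, err = camera_distance_error(40.0, 1.2, 1.0)
```

Even a one-degree shift produces an error of several metres at this range, which is consistent with the patent's point that the camera/radar distance difference grows from Δd1 to Δd2 under an imaging-axis shift.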
- The learning state of the vanishing point shifts from a state where learning has been completed for neither the optical flow nor white line recognition, to a state where learning by the optical flow is completed, and then to a state where learning by white line recognition is completed.
- The reliability of the vanishing point learning result (that is, the amount of deviation of the learned vanishing point from its true value) changes after the start of the radar axis deviation determination process.
- The reliability of the learning result differs between the state where vanishing point learning by white line recognition has not yet been completed and the state where it has been completed: in the former, the deviation from the true value is larger than in the latter, and the reliability of the learning result tends to be low.
- the camera target detection counter CA and the axis deviation determination counter CJ are invalidated according to the learning result of vanishing point learning. Specifically, if the reliability of the vanishing point learning result does not change after the start of the radar axis misalignment determination process, the camera target detection counter CA and the axis misalignment determination counter CJ are incremented as they are. On the other hand, if the reliability of the vanishing point learning result changes after the start of the radar axis deviation determination process, the camera target detection counter CA and the axis deviation determination counter CJ are reset, and the radar axis deviation determination process is performed. Start over from the beginning.
- In FIG. 4, (a) shows the on/off transition of the ignition switch (IG switch), (b) shows the transition of the values of the camera target detection counter CA and the axis deviation determination counter CJ, and (c) shows the transition of the vanishing point learning state.
- In FIG. 4, it is assumed that a preceding vehicle exists when the IG switch is turned on.
- the camera target detection counter CA starts counting up.
- The axis deviation determination counter CJ is incremented by an amount corresponding to the number of detections (detection frequency). In FIG. 4, since an optical axis shift of the radar apparatus 12 has occurred, CJ < CA.
- When a predetermined time T1 has elapsed after the IG switch is turned on, a command for starting vanishing point learning based on the optical flow is output.
- the predetermined time T1 is set to a time (for example, tens of seconds) for acquiring image data necessary for the optical flow calculation.
- When vanishing point learning by the optical flow is completed, it is estimated that the reliability of the vanishing point learning result has changed since the camera target detection counter CA started counting up. Therefore, the camera target detection counter CA and the axis deviation determination counter CJ are reset to zero, and the count-up for radar axis deviation determination is performed again from the beginning.
- Specifically, the absolute value ΔFOE(C-A) of the deviation between the previous learning value FOE_A and the vanishing point FOE_C obtained by the optical flow is calculated, and based on the calculated ΔFOE(C-A), it is determined whether the reliability of the vanishing point learning result has changed.
- The same processing is performed when vanishing point learning by white line recognition is completed at time t13, when a predetermined time T2 (for example, several minutes) has further elapsed from the vanishing point learning start command by the optical flow. Specifically, it is determined from the absolute value ΔFOE(D-C) of the deviation between the vanishing point FOE_C by the optical flow and the vanishing point FOE_D by the white line recognition whether the reliability of the vanishing point learning result has changed. Then, the camera target detection counter CA and the axis deviation determination counter CJ are reset to zero, and the count-up for radar axis deviation determination is restarted from the beginning.
- In step S103, an image captured by the imaging device 11 is input, and it is determined whether the input image includes a target (camera target). If no camera target is detected, this routine is terminated as it is. If a camera target is detected, the process proceeds to step S104, and the camera target detection counter CA is incremented.
- In step S105, it is determined whether the radar apparatus 12 has detected a target regarded as the same object as the camera target. If such a target is detected, the process proceeds to step S106, and the axis deviation determination counter CJ is incremented. Otherwise, the process proceeds to step S107 without executing step S106.
- In step S107, it is determined whether the camera target detection counter CA is larger than a determination value THA1 (for example, several thousand). If CA > THA1, the process proceeds to step S108, and it is determined whether the ratio CJ/CA of the axis deviation determination counter CJ to the camera target detection counter CA is smaller than the determination value THA2. If the ratio CJ/CA is greater than or equal to THA2, the process proceeds to step S110, and the camera target detection counter CA and the axis deviation determination counter CJ are reset to zero.
- If the ratio CJ/CA is smaller than THA2, the process proceeds to step S109, where it is determined that an optical axis deviation of the radar apparatus 12 has occurred, and the determination result is stored. Thereafter, the process proceeds to step S110, the camera target detection counter CA and the axis deviation determination counter CJ are reset to zero, and this routine ends.
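The counter logic of steps S103 to S110 above can be sketched as follows. This is a minimal illustration only; the class name, the threshold values assigned to THA1 and THA2, and the boolean inputs are assumptions for the sketch, not values taken from the disclosure.

```python
# Sketch of the radar axis deviation determination routine (steps S103-S110).
# The thresholds and detection inputs are illustrative assumptions.

class AxisDeviationJudge:
    def __init__(self, tha1=3000, tha2=0.5):
        self.ca = 0          # camera target detection counter CA
        self.cj = 0          # axis deviation determination counter CJ
        self.tha1 = tha1     # minimum number of camera detections (S107)
        self.tha2 = tha2     # ratio threshold for CJ/CA (S108)
        self.deviation_detected = False

    def update(self, camera_target_seen: bool, radar_same_object_seen: bool):
        # S103/S104: count frames in which the camera detects a target.
        if not camera_target_seen:
            return
        self.ca += 1
        # S105/S106: count frames in which the radar sees the same object.
        if radar_same_object_seen:
            self.cj += 1
        # S107: wait until enough camera detections have accumulated.
        if self.ca > self.tha1:
            # S108/S109: a low coincidence ratio suggests the radar keeps
            # missing camera targets, i.e. its optical axis is deviated.
            if self.cj / self.ca < self.tha2:
                self.deviation_detected = True
            # S110: reset the counters and restart counting either way.
            self.ca = 0
            self.cj = 0
```

A low CJ/CA ratio after several thousand camera detections marks the deviation; the counters restart regardless of the outcome, matching step S110.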
- In step S201, it is determined whether or not vanishing point learning by white line recognition is completed. If not yet completed, the process proceeds to step S202, where it is determined whether or not vanishing point learning by the optical flow is completed, that is, whether the vanishing point calculated by the optical flow shows a stable value. When the vanishing point is determined to be stable, the vanishing point learning value FOE_C is calculated based on the optical flow.
- Whether or not the vanishing point calculated by the optical flow shows a stable value is determined based on the variance of the vanishing point in the vertical plane; an affirmative determination is made when the variance is smaller than a predetermined value.
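The stability test just described can be illustrated as follows. The variance threshold and the sample window are assumptions for the sketch; the disclosure only states that the variance is compared with a predetermined value.

```python
# Sketch: decide whether optical-flow vanishing point estimates are stable
# by checking the variance of their vertical coordinate against a
# threshold. The threshold value is an illustrative assumption.

def vanishing_point_stable(vertical_coords, var_threshold=4.0):
    """vertical_coords: recent vanishing-point vertical positions (pixels).
    Returns True when the variance falls below the assumed threshold."""
    n = len(vertical_coords)
    if n < 2:
        return False
    mean = sum(vertical_coords) / n
    var = sum((y - mean) ** 2 for y in vertical_coords) / n
    return var < var_threshold
```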
- If vanishing point learning by the optical flow is completed, the process proceeds to step S203, and it is determined whether or not the absolute value ΔFOE(CA) of the deviation between the vanishing point given by the initial value or the previous learning value FOE_A and the optical-flow learning value FOE_C is smaller than a determination value THF1 (reliability determination means).
- If ΔFOE(CA) < THF1, the process proceeds to step S204, and the past FOE credit flag Ftr is set to 1. On the other hand, if ΔFOE(CA) ≥ THF1, the process proceeds to step S205, and the past FOE credit flag Ftr is set to 0.
- If vanishing point learning by the optical flow has not been completed, a negative determination is made in step S202 and the process proceeds to step S204. While the past FOE credit flag Ftr is set to 1, the counters are not reset.
- Thereafter, when vanishing point learning by white line recognition is completed, an affirmative determination is made in step S201, and the process proceeds to step S206.
- In step S206, it is determined whether or not the absolute value ΔFOE(DC) of the deviation between the vanishing point learning value FOE_C by the optical flow and the vanishing point learning value FOE_D by white line recognition is smaller than a determination value THF2 (reliability determination means). If ΔFOE(DC) < THF2, the process proceeds to step S207, and the past FOE credit flag Ftr is set to 1. On the other hand, if ΔFOE(DC) ≥ THF2, the process proceeds to step S208, and the past FOE credit flag Ftr is set to 0.
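The reset determination of steps S201 to S208 above can be sketched as a single decision function. The function name and the values given to THF1 and THF2 are illustrative assumptions; only the branch structure follows the steps described in the text.

```python
# Sketch of the reset determination (steps S201-S208): set the past-FOE
# credit flag Ftr from the deviation between learned vanishing points.
# THF1/THF2 default values are illustrative assumptions.

def reset_determination(foe_a, foe_c, foe_d, white_line_done,
                        flow_done, thf1=10.0, thf2=10.0):
    """foe_a: initial/previous learning value; foe_c: optical-flow value;
    foe_d: white-line-recognition value. Returns Ftr (1 = past FOE still
    credible, counters are kept; 0 = reliability changed, counters reset)."""
    if white_line_done:
        # S206-S208: compare optical-flow and white-line learning values.
        return 1 if abs(foe_d - foe_c) < thf2 else 0
    if flow_done:
        # S203-S205: compare the previous value with the optical-flow value.
        return 1 if abs(foe_c - foe_a) < thf1 else 0
    # S202 negative: flow learning not finished, keep trusting the past FOE.
    return 1
```

While the returned Ftr is 1 the counters CA and CJ are left untouched; when it becomes 0 they are reset, as described above.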
- FIG. 7 is a time chart showing the reset timing according to the vanishing point learning state.
- (a) shows the case where the initial value (or the previous learning value) is correct with respect to the true value and the vanishing point based on the optical flow deviates from the true value; (b) shows the case where the initial value (or the previous learning value) deviates from the true value and the vanishing point based on the optical flow is correct with respect to the true value; (c) and (d) show cases where both the initial value (or the previous learning value) and the vanishing point based on the optical flow deviate from the true value, the deviation of the optical-flow vanishing point from the true value being smaller in (c) than in (d).
- A is the ON timing of the IG switch, B is the start timing of vanishing point learning based on the optical flow, C is the timing at which the vanishing point based on the optical flow stabilizes, and D is the completion timing of vanishing point learning based on white line recognition; these correspond to A to D in FIG.
- The reset of the camera target detection counter CA and the axis deviation determination counter CJ is performed at at least one of timing C, at which the vanishing point calculated by the optical flow stabilizes, and timing D, at which vanishing point learning by white line recognition is completed.
- In the case where the absolute value ΔFOE(CA) of the deviation between the previous learning value FOE_A and the vanishing point learning value FOE_C by the optical flow is large, and the absolute value ΔFOE(DC) of the deviation between FOE_C and the vanishing point learning value FOE_D by white line recognition is also large, the counters CA and CJ are reset at both timing C and timing D. In other cases, the counters CA and CJ are reset only at timing C, or only at timing D.
- Thereby, the information regarding the deviation of the optical axis X2 of the radar device 12 acquired so far is invalidated.
- In this configuration, the accuracy of the optical axis deviation determination of the radar apparatus 12 changes according to the learning result of the axis deviation amount of the imaging apparatus 11. In this regard, according to the above configuration, erroneous determination can be suppressed in the process of determining whether or not an optical axis deviation of the radar apparatus 12 has occurred.
- That is, the counters CA and CJ, which constitute the information about the optical axis deviation, are invalidated. If a change in the reliability of the learning result of vanishing point learning occurs after the start of the radar axis deviation determination process, and image processing is performed based on a vanishing point with a large deviation from the true value, the accuracy of the radar axis deviation determination becomes low, and an erroneous determination may occur.
- Vanishing point learning by white line recognition yields a highly reliable learning result, but requires a certain amount of time (for example, several minutes to several tens of minutes) to complete. Therefore, until vanishing point learning by white line recognition is completed, image processing must be performed using the learning value from the previous vehicle operation, or using a learning value obtained by another learning means less reliable than white line recognition, for example the vanishing point learning value based on the optical flow. Meanwhile, the radar axis deviation determination process of the present embodiment uses image data captured by the imaging device 11; therefore, an erroneous determination may occur when the learned vanishing point deviates from the true value.
- In this regard, in the above configuration, when a change in the reliability of the vanishing point learning result is detected, the information regarding the optical axis deviation is invalidated. Thus, the counters CA and CJ can be invalidated at an appropriate timing, and erroneous determination caused by insufficient learning accuracy of vanishing point learning can be suitably suppressed.
- With the above configuration, the radar axis deviation determination process can be performed even before vanishing point learning by white line recognition is completed.
- Vanishing point learning by optical flow can be completed soon after the vehicle starts driving, but its learning accuracy is lower than that of vanishing point learning by white line recognition, and the learned value may deviate from the true value.
- In the above configuration, even when the radar axis deviation determination process is started before vanishing point learning by white line recognition is completed, erroneous determination of the radar axis deviation can be suppressed.
- The axis deviation determination of the radar device 12 is performed based on the object detection results of the imaging device 11 and the radar device 12.
- Since the imaging device 11 and the radar device 12 have different object detection ranges and detection accuracies, mounting both devices and performing object detection allows each to compensate for the other's weaknesses. Further, by combining the imaging device 11 and the radar device 12, the optical axis deviation of the radar device 12 can be detected using the image data of the imaging device 11.
- FIG. 8 is a time chart showing a specific mode of the reset determination process with the above configuration.
- (a) to (c) are the same as FIG.
- Here, a case is considered where the absolute value ΔFOE(CA) of the deviation between the previous learning value FOE_A and the vanishing point learning value FOE_C by the optical flow is large at time t22, at which the vanishing point calculated by the optical flow stabilizes.
- At this time, the current values of the camera target detection counter CA and the axis deviation determination counter CJ are stored (information holding means). Subsequently, the camera target detection counter CA and the axis deviation determination counter CJ are reset (invalidating means).
- Thereafter, when the absolute value ΔFOE(DC) of the deviation between the vanishing point learning value FOE_C by the optical flow and the vanishing point learning value FOE_D by white line recognition is larger than the determination value THF2, and the absolute value ΔFOE(DA) of the deviation between the previous learning value FOE_A and the vanishing point learning value FOE_D by white line recognition is smaller than a determination value THF3, it is assumed that the reliability of the learning result of the vanishing point learning before time t22 was secured to at least a predetermined degree.
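The hold-and-restore behavior around times t22 and t23 described above can be sketched as follows. The class and method names and the values given to THF2 and THF3 are illustrative assumptions; the sketch only mirrors the described sequence of storing, resetting, and conditionally restoring the counters.

```python
# Sketch: hold the counters before invalidation (information holding
# means), then restore them if the reliability of the pre-t22 learning
# result turns out to have been secured. Names/thresholds are assumed.

class CounterBank:
    def __init__(self):
        self.ca = 0       # camera target detection counter CA
        self.cj = 0       # axis deviation determination counter CJ
        self._held = None

    def invalidate(self):
        # At t22: store the current counters, then reset them.
        self._held = (self.ca, self.cj)
        self.ca = 0
        self.cj = 0

    def restore_if_reliable(self, d_foe_dc, d_foe_da, thf2=10.0, thf3=10.0):
        # At t23: if FOE_C disagrees with FOE_D (dFOE(DC) > THF2) but the
        # previous value FOE_A agrees with FOE_D (dFOE(DA) < THF3), the
        # pre-t22 learning result was reliable, so resume counting from
        # the held counters instead of starting over.
        if self._held and d_foe_dc > thf2 and d_foe_da < thf3:
            self.ca, self.cj = self._held
        self._held = None
```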
- In the above, the radar axis deviation determination process performed in the period from when the ignition switch is turned on until vanishing point learning by white line recognition is completed has been described.
- However, the present disclosure may also be applied to the radar axis deviation determination process performed in the period after vanishing point learning by white line recognition is completed. Vanishing point learning by white line recognition has high accuracy, and once learning is completed, the change in the learning value thereafter is relatively small.
- Even so, the axial direction of the imaging axis X1 may change due to changes in the loading state or traveling state of the vehicle, causing the position of the vanishing point to change.
- In the above, vanishing point learning has been described as applied to a configuration in which both vanishing point learning by optical flow and vanishing point learning by white line recognition are performed; in this configuration, the point (D) at which vanishing point learning by white line recognition is completed serves as the counter reset timing.
- The axis deviation determination means for determining the axis deviation of the second object detection means is not limited to the above configuration, as long as the determination is performed based on the object detection results of the first object detection means and the second object detection means. For example, a configuration may be adopted in which the axis deviation determination of the radar device 12 is performed by detecting an error between the signal transmission direction and the straight traveling direction of the vehicle, based on the vanishing point detected from images captured while the vehicle is running and the transmission direction of the signal sent from the radar device 12.
- In the above embodiment, a configuration that detects a change in the level of the vanishing point learning value is adopted as the reliability determination means for detecting that a change in the reliability of the learning result of vanishing point learning has occurred after the start of the radar axis deviation determination process. Specifically, this is done by comparing, with determination values, the absolute value ΔFOE(CA) of the deviation between the previous learning value FOE_A and the vanishing point learning value FOE_C by optical flow, and the absolute value ΔFOE(DC) of the deviation between FOE_C and the vanishing point learning value FOE_D by white line recognition.
- However, the configuration of the reliability determination means is not limited to this; the change may be detected based on, for example, the rate of change (differential value) of the vanishing point learning value.
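As one illustration of this alternative, a reliability change could be detected from the frame-to-frame change (differential) of the learning value rather than from its level. The function name and the threshold are assumptions for the sketch; the disclosure only names the differential value as a possible basis.

```python
# Sketch: detect a reliability change from the step-to-step change
# (differential) of the vanishing-point learning value. The threshold
# is an illustrative assumption.

def reliability_changed(foe_history, diff_threshold=5.0):
    """foe_history: successive vanishing-point learning values.
    Returns True when the latest step change exceeds the threshold."""
    if len(foe_history) < 2:
        return False
    return abs(foe_history[-1] - foe_history[-2]) > diff_threshold
```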
- A configuration may also be adopted in which the axis deviation determination result based on the ratio CJ/CA, that is, the determination that an optical axis deviation exists, is invalidated. In this case, the axis deviation determination result based on the ratio CJ/CA corresponds to the "information regarding the axis deviation of the second object detection means".
- The present disclosure may be applied to a system including a plurality of radar devices (a first radar device and a second radar device) as the first object detection means and the second object detection means.
- In that case, based on the learning result of the axis deviation amount of the optical axis of the first radar device, the information regarding the axis deviation of the second radar device is invalidated.
- In the above, the case where the imaging device 11 has an axis deviation in the vertical plane has been described; however, the present disclosure is not limited to an axis deviation in the vertical plane and can also be applied when an axis deviation occurs in the horizontal plane.
- The detectable areas of the first object detection means and the second object detection means are not limited to the front of the vehicle and may be, for example, the rear or sides of the vehicle. Further, the attachment positions of the first object detection means and the second object detection means are not particularly limited.
- In the above embodiment, an imaging device and a radar device are used as the object detection means, but the present disclosure is not limited to these.
- a sonar that detects an object using an ultrasonic wave as a transmission wave may be employed.
- In the above, an object recognition device mounted on a vehicle has been described as an example, but the device can also be mounted on other moving bodies such as railway vehicles, ships, and aircraft.
- SYMBOLS: 10 ... object recognition apparatus; 11 ... imaging device (first object detection means); 12 ... radar apparatus (second object detection means); 13 ... white line recognition unit; 14 ... flow calculation unit; 20 ... vanishing point calculation unit; 22 ... vanishing point learning unit; 23 ... first learning unit; 24 ... second learning unit; 25 ... learning value storage unit; 30 ... radar axis deviation detection unit; 31 ... camera target detection unit; 32 ... radar target detection unit; 33 ... axis deviation information calculation unit; 34 ... axis deviation determination unit; 35 ... reset determination unit; 50 ... own vehicle; X1 ... imaging axis (reference axis); X2 ... optical axis (reference axis).
Abstract
Description
The present disclosure is not limited to the above embodiment and may be implemented, for example, as follows.
Claims (6)
- An object recognition apparatus applied to a moving body (50) on which a first object detection means (11) and a second object detection means (12) are mounted as object detection means for detecting an object present within a predetermined detectable area, each detectable area including a reference axis (X1, X2), the apparatus comprising: an axis deviation learning means for learning an axis deviation amount of the reference axis of the first object detection means; an axis deviation determination means for performing an axis deviation determination process of determining, based on object detection results of the first object detection means and the second object detection means, whether or not an axis deviation has occurred in the reference axis of the second object detection means; and an invalidating means for invalidating, based on a learning result of the axis deviation amount by the axis deviation learning means, information regarding the axis deviation of the second object detection means acquired by the axis deviation determination process.
- The object recognition apparatus according to claim 1, further comprising a reliability determination means for determining, based on the learning result of the axis deviation amount by the axis deviation learning means, whether or not a change in the reliability of the learning result with respect to a true value has occurred after the start of the axis deviation determination process, wherein the invalidating means invalidates the information regarding the deviation of the reference axis of the second object detection means when the reliability determination means determines that a change in the reliability of the learning result has occurred.
- The object recognition apparatus according to claim 2, further comprising an information holding means for holding the information regarding the deviation of the reference axis of the second object detection means before the information is invalidated by the invalidating means, wherein, after the information regarding the deviation of the reference axis of the second object detection means is invalidated by the invalidating means, the axis deviation determination process is performed using the information held in the information holding means when it is detected that the reliability of the learning result with respect to the true value at the time the invalidated information was acquired was secured to at least a predetermined degree.
- The object recognition apparatus according to any one of claims 1 to 3, wherein the first object detection means is an imaging device that photographs a surrounding environment including a road; the apparatus comprises a lane marking recognition means for recognizing a lane marking of the road based on an image captured by the first object detection means; the axis deviation learning means includes a means for learning the axis deviation amount of the reference axis of the first object detection means based on lane marking information, which is information regarding the lane marking recognized by the lane marking recognition means; and the invalidating means invalidates, based on the learning result, the information regarding the deviation of the reference axis of the second object detection means at the time when learning of the axis deviation amount based on the lane marking information by the axis deviation learning means is completed after the start of operation of the moving body.
- The object recognition apparatus according to claim 4, further comprising a flow calculation means for calculating an optical flow based on an image captured by the first object detection means, wherein the axis deviation learning means includes a first learning means for learning the axis deviation amount of the reference axis of the first object detection means based on the lane marking information, and a second learning means for learning the axis deviation amount of the reference axis of the first object detection means based on the optical flow calculated by the flow calculation means; and the invalidating means invalidates, based on the learning result, the information regarding the deviation of the reference axis of the second object detection means at the time when learning of the axis deviation amount based on the lane marking information is completed after the start of operation of the moving body, and also in a period before the learning of the axis deviation amount based on the lane marking information is completed.
- The object recognition apparatus according to any one of claims 1 to 5, wherein the first object detection means is an imaging device that photographs a surrounding environment including a road, and the second object detection means is a detection device that detects the object by transmitting a transmission wave and receiving the transmitted wave.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112015003605.9T DE112015003605B4 (de) | 2014-08-06 | 2015-08-04 | Mehrere Objekterfassungseinrichtungen verwendende Objekterkennungsvorrichtung |
KR1020177005660A KR101961571B1 (ko) | 2014-08-06 | 2015-08-04 | 복수의 물체 검출 수단을 사용한 물체 인식 장치 |
CN201580041368.5A CN106574961B (zh) | 2014-08-06 | 2015-08-04 | 使用多个物体检测单元的物体识别装置 |
US15/501,786 US10422871B2 (en) | 2014-08-06 | 2015-08-04 | Object recognition apparatus using a plurality of object detecting means |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014160680A JP6396714B2 (ja) | 2014-08-06 | 2014-08-06 | 物体認識装置 |
JP2014-160680 | 2014-08-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016021606A1 true WO2016021606A1 (ja) | 2016-02-11 |
Family
ID=55263868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/072111 WO2016021606A1 (ja) | 2014-08-06 | 2015-08-04 | 複数の物体検出手段を用いた物体認識装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US10422871B2 (ja) |
JP (1) | JP6396714B2 (ja) |
KR (1) | KR101961571B1 (ja) |
CN (1) | CN106574961B (ja) |
DE (1) | DE112015003605B4 (ja) |
WO (1) | WO2016021606A1 (ja) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6390459B2 (ja) * | 2015-02-10 | 2018-09-19 | 株式会社デンソー | 光軸ずれ検出装置 |
US10852418B2 (en) * | 2016-08-24 | 2020-12-01 | Magna Electronics Inc. | Vehicle sensor with integrated radar and image sensors |
US9996752B2 (en) * | 2016-08-30 | 2018-06-12 | Canon Kabushiki Kaisha | Method, system and apparatus for processing an image |
JP6637472B2 (ja) * | 2017-07-06 | 2020-01-29 | 本田技研工業株式会社 | 情報処理方法及び情報処理装置 |
DE102017118083B4 (de) | 2017-08-09 | 2022-11-24 | Sick Ag | Sensor zur Erfassung eines Objekts und Verfahren zum Einstellen eines Schaltpunktes |
JP6986962B2 (ja) * | 2017-12-28 | 2021-12-22 | 株式会社デンソーテン | カメラずれ検出装置およびカメラずれ検出方法 |
JP7127356B2 (ja) * | 2018-05-14 | 2022-08-30 | 富士通株式会社 | データ収集方法、データ収集プログラムおよび情報処理装置 |
JP6973302B2 (ja) * | 2018-06-06 | 2021-11-24 | トヨタ自動車株式会社 | 物標認識装置 |
KR20200040391A (ko) * | 2018-10-10 | 2020-04-20 | 주식회사 만도 | 차량용 레이더의 보완 장치 및 방법 |
US11557061B2 (en) * | 2019-06-28 | 2023-01-17 | GM Cruise Holdings LLC. | Extrinsic calibration of multiple vehicle sensors using combined target detectable by multiple vehicle sensors |
JP7412254B2 (ja) * | 2020-04-02 | 2024-01-12 | 三菱電機株式会社 | 物体認識装置および物体認識方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6714156B1 (en) * | 2002-11-22 | 2004-03-30 | Visteon Global Technologies, Inc. | Method for correcting radar misalignment |
JP2004198159A (ja) * | 2002-12-17 | 2004-07-15 | Nissan Motor Co Ltd | 車載センサの軸ずれ計測装置 |
JP2004205398A (ja) * | 2002-12-26 | 2004-07-22 | Nissan Motor Co Ltd | 車両用レーダ装置およびレーダの光軸調整方法 |
JP2008232887A (ja) * | 2007-03-22 | 2008-10-02 | Omron Corp | 物体検知装置、および照射軸調整方法 |
JP2010249613A (ja) * | 2009-04-14 | 2010-11-04 | Toyota Motor Corp | 障害物認識装置及び車両制御装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09142236A (ja) * | 1995-11-17 | 1997-06-03 | Mitsubishi Electric Corp | 車両の周辺監視方法と周辺監視装置及び周辺監視装置の故障判定方法と周辺監視装置の故障判定装置 |
JP3331882B2 (ja) * | 1995-12-27 | 2002-10-07 | 株式会社デンソー | 車両用障害物検出装置の中心軸偏向量算出装置,中心軸偏向量補正装置,および車間制御装置 |
JPH11142520A (ja) * | 1997-11-06 | 1999-05-28 | Omron Corp | 測距装置の軸調整方法及び軸ずれ検出方法並びに測距装置 |
JP3922194B2 (ja) * | 2003-03-11 | 2007-05-30 | 日産自動車株式会社 | 車線逸脱警報装置 |
JP2004317507A (ja) * | 2003-04-04 | 2004-11-11 | Omron Corp | 監視装置の軸調整方法 |
EP1584946A3 (en) * | 2004-04-02 | 2006-03-22 | Omron Corporation | Method of adjusting monitor axis of optical sensors |
US7706978B2 (en) * | 2005-09-02 | 2010-04-27 | Delphi Technologies, Inc. | Method for estimating unknown parameters for a vehicle object detection system |
US7786898B2 (en) * | 2006-05-31 | 2010-08-31 | Mobileye Technologies Ltd. | Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications |
JP5146716B2 (ja) * | 2007-03-01 | 2013-02-20 | マツダ株式会社 | 車両用障害物検知装置 |
JP4453775B2 (ja) * | 2008-06-27 | 2010-04-21 | トヨタ自動車株式会社 | 物体検出装置 |
JP5328491B2 (ja) | 2009-06-02 | 2013-10-30 | 三菱電機株式会社 | レーダ画像処理装置 |
JP5551892B2 (ja) * | 2009-06-19 | 2014-07-16 | 富士通テン株式会社 | 信号処理装置、及びレーダ装置 |
JP5416026B2 (ja) * | 2010-04-23 | 2014-02-12 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP2014228943A (ja) * | 2013-05-20 | 2014-12-08 | 日本電産エレシス株式会社 | 車両用外界センシング装置、その軸ズレ補正プログラム及びその軸ズレ補正方法 |
US9898670B2 (en) * | 2013-12-13 | 2018-02-20 | Fts Computertechnik Gmbh | Method and device for observing the environment of a vehicle |
JP6428270B2 (ja) | 2014-02-10 | 2018-11-28 | 株式会社デンソー | 軸ずれ検出装置 |
US9286679B2 (en) * | 2014-07-22 | 2016-03-15 | GM Global Technology Operations LLC | Misalignment correction and state of health estimation for lane management fusion function |
-
2014
- 2014-08-06 JP JP2014160680A patent/JP6396714B2/ja active Active
-
2015
- 2015-08-04 WO PCT/JP2015/072111 patent/WO2016021606A1/ja active Application Filing
- 2015-08-04 DE DE112015003605.9T patent/DE112015003605B4/de active Active
- 2015-08-04 US US15/501,786 patent/US10422871B2/en active Active
- 2015-08-04 KR KR1020177005660A patent/KR101961571B1/ko active IP Right Grant
- 2015-08-04 CN CN201580041368.5A patent/CN106574961B/zh active Active
Also Published As
Publication number | Publication date |
---|---|
US10422871B2 (en) | 2019-09-24 |
DE112015003605T5 (de) | 2017-04-27 |
JP6396714B2 (ja) | 2018-09-26 |
CN106574961B (zh) | 2019-09-13 |
DE112015003605B4 (de) | 2023-02-09 |
DE112015003605T8 (de) | 2017-05-18 |
CN106574961A (zh) | 2017-04-19 |
JP2016038261A (ja) | 2016-03-22 |
KR20170041775A (ko) | 2017-04-17 |
KR101961571B1 (ko) | 2019-03-22 |
US20170227634A1 (en) | 2017-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6396714B2 (ja) | 物体認識装置 | |
JP6361366B2 (ja) | 物体認識装置 | |
US10922561B2 (en) | Object recognition device and vehicle travel control system | |
KR101787996B1 (ko) | 차선 추정 장치 및 그 방법 | |
US9053554B2 (en) | Object detection device using an image captured with an imaging unit carried on a movable body | |
US10457283B2 (en) | Vehicle driving assist apparatus | |
US11014566B2 (en) | Object detection apparatus | |
CN107305632B (zh) | 基于单目计算机视觉技术的目标对象距离测量方法与系统 | |
US10471961B2 (en) | Cruise control device and cruise control method for vehicles | |
US11119210B2 (en) | Vehicle control device and vehicle control method | |
JP6468136B2 (ja) | 走行支援装置及び走行支援方法 | |
JP6458651B2 (ja) | 路面標示検出装置及び路面標示検出方法 | |
JP2014228943A (ja) | 車両用外界センシング装置、その軸ズレ補正プログラム及びその軸ズレ補正方法 | |
JP4753765B2 (ja) | 障害物認識方法 | |
US11346922B2 (en) | Object recognition apparatus and object recognition method | |
JP3656056B2 (ja) | 割り込み車両検出装置及びその方法 | |
JP3823782B2 (ja) | 先行車両認識装置 | |
JP2005329779A (ja) | 障害物認識方法及び障害物認識装置 | |
JP3925285B2 (ja) | 走行路環境検出装置 | |
JP2012014520A (ja) | 障害物検出装置 | |
JP2020076580A (ja) | 軸ずれ推定装置 | |
JP2020047059A (ja) | 車両の走行環境検出装置及び走行制御システム | |
US20230243923A1 (en) | Method for detecting intensity peaks of a specularly reflected light beam | |
JP2005332120A (ja) | 障害物認識方法及び障害物認識装置 | |
JP2018018215A (ja) | 物体特徴点検知装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15829147 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015003605 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 20177005660 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15829147 Country of ref document: EP Kind code of ref document: A1 |