WO2015194371A1 - Object recognition device and vehicle travel control device using the same - Google Patents
- Publication number
- WO2015194371A1 (PCT/JP2015/065967)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane
- host vehicle
- vehicle
- unit
- behind
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/021—Determination of steering angle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9315—Monitoring blind spots
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93272—Sensor installation details in the back of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93274—Sensor installation details on the side of the vehicles
Definitions
- The present invention relates to an object recognition device and a vehicle travel control device using the same, and more particularly to a vehicle travel control device that recognizes the position of an object existing behind the host vehicle, including obliquely behind it, and controls the travel of the host vehicle.
- Conventionally, the technique disclosed in Patent Document 1 is known as a technique for detecting the position of an object existing behind the host vehicle.
- The object position detection device disclosed in Patent Document 1 includes storage means that stores the travel position of a host vehicle traveling on a road, detection means that detects the position of a rear object existing behind the host vehicle, and estimation means that estimates the lane in which the rear object is located based on the relative positional relationship between the past travel position of the host vehicle stored in the storage means and the position of the rear object detected by the detection means.
- The estimation means estimates the lane in which the rear object is located based on the distance between a travel locus obtained from the past travel positions of the host vehicle and the position of the rear object.
- The driver of a vehicle traveling in a certain lane may change lanes, for example in preparation for overtaking a vehicle ahead or for turning left or right; when the host vehicle changes lanes in this way, its travel position is stored across lanes.
- Because the object position detection device disclosed in Patent Document 1 estimates the lane in which a rear object is located from the distance between the travel locus obtained from past travel positions of the host vehicle and the position of the rear object, when the host vehicle changes lanes and its past travel positions are stored across lanes as shown in FIG. 20, the estimated lane may be wrong and, for example, a false alarm may be issued to the driver of the host vehicle. Further, with the object position detection device disclosed in Patent Document 1, when the host vehicle travels in a curved lane, the lane in which the host vehicle travels or the lane in which the rear object is located may not be accurately determined, and again a false alarm may be issued to the driver of the host vehicle.
- In Patent Document 1, a millimeter-wave radar is used as the detection means for detecting the position of a rear object existing behind the host vehicle.
- When traveling on a road along which guardrails are laid continuously, the signal output from a radio-wave radar such as a millimeter-wave radar is reflected by the guardrail, so the position of the rear object, and hence the lane in which the rear object is located, sometimes cannot be accurately estimated.
- The present invention has been made in view of the above problems, and its object is to provide an object recognition device capable of precisely recognizing the position of an object existing behind the host vehicle, including obliquely behind it, and a vehicle travel control device using the same.
- An object recognition device according to the present invention recognizes the position of an object existing behind the host vehicle, and includes: an imaging unit that images the environment ahead of or behind the host vehicle; a lane detection unit that detects a lane ahead of or behind the host vehicle based on the image captured by the imaging unit; a lane position estimation unit that estimates the position of the lane behind the host vehicle based on the lane detected by the lane detection unit and the travel history of the host vehicle; a rear object detection unit that detects an object existing behind the host vehicle; and a relative position calculation unit that calculates the relative position, with respect to the lane position estimated by the lane position estimation unit, of the object detected by the rear object detection unit.
- Another object recognition device according to the present invention recognizes the position of an object existing behind the host vehicle, and includes: an imaging unit that images the environment ahead of or behind the host vehicle; a three-dimensional object detection unit that detects a stationary three-dimensional object ahead of or behind the host vehicle based on the image captured by the imaging unit; a three-dimensional object position estimation unit that estimates the position of the stationary three-dimensional object behind the host vehicle based on the stationary three-dimensional object detected by the three-dimensional object detection unit and the travel history of the host vehicle; a rear object detection unit that detects an object existing behind the host vehicle; and a relative position calculation unit that calculates the relative position, with respect to the position of the stationary three-dimensional object estimated by the three-dimensional object position estimation unit, of the object detected by the rear object detection unit.
- A vehicle travel control device according to the present invention controls the travel of the host vehicle based on the position of the object recognized by the object recognition device.
- According to the present invention, a lane or a stationary three-dimensional object ahead of the host vehicle is detected based on an image captured by an imaging unit that images the environment ahead of the host vehicle, and the position of an object behind the host vehicle is recognized using the detected lane or stationary three-dimensional object and the travel history of the host vehicle.
- A block diagram showing the configuration of Embodiment 1 of the vehicle travel control device using the object recognition device according to the present invention.
- A diagram showing an example of an image captured by the camera shown in FIG. 1. A diagram showing an example of the positions of the lane and a rear-side vehicle with respect to the host vehicle.
- A flowchart explaining the processing flow of the vehicle travel control device shown in FIG. 1.
- A block diagram showing the configuration of Embodiment 2 of the vehicle travel control device using the object recognition device according to the present invention.
- A diagram showing an example of an image captured by the stereo camera shown in FIG. 9. A diagram showing an example of the position of a stationary three-dimensional object with respect to the host vehicle and a rear-side vehicle.
- A diagram showing another example of the positional relationship of a stationary three-dimensional object and a rear-side vehicle with respect to the host vehicle. A flowchart explaining the processing flow of the vehicle travel control device shown in FIG. 9.
- FIG. 1 is a configuration diagram showing a configuration of a vehicle travel control apparatus using an object recognition apparatus according to a first embodiment of the present invention.
- The vehicle travel control device 50 mainly includes an object recognition device 20 that recognizes the position of an object behind the host vehicle, a steering angle sensor 21, a yaw rate sensor 22, a wheel speed sensor 23, a navigation system 24, and a control unit 25 that generates control signals for controlling the travel of the host vehicle.
- the object recognition device 20 includes a rear object detection device 1, a lane detection device 2, and a travel history calculation device 3.
- The rear object detection device 1 detects an object (for example, a moving, stopped, or parked vehicle (car, motorcycle, bicycle, etc.) or a person) existing behind the host vehicle, including obliquely behind it (hereinafter referred to as the rear side), and includes a plurality of radio-wave radars (rear object detection units) 7 provided on the left and right rear sides of the host vehicle.
- Each radio-wave radar 7 transmits radio waves to a predetermined range behind the host vehicle and receives reflected waves from objects existing in that range, so that the relative position (distance and direction) of an object behind the host vehicle and its relative speed with respect to the host vehicle can be detected.
- The host vehicle VS is equipped with radio-wave radars 7a and 7b on the left and right rear sides; the radio-wave radar 7a attached to the left rear of the host vehicle VS takes the area Aa on the left rear side of the host vehicle VS as its detection area, and the radio-wave radar 7b attached to the right rear of the host vehicle VS takes the area Ab on the right rear side as its detection area.
- The radio-wave radars 7a and 7b detect the position (P, Q) of a target vehicle VT in the coordinate system X-Y whose origin is the center of the host vehicle VS, and its relative speed with respect to the host vehicle VS.
- the lane detection device 2 is for detecting a lane in front of the host vehicle (lane in which the vehicle travels).
- The lane detection device 2 includes a camera (front camera; imaging unit) 8 that is arranged at the upper center of the windshield of the host vehicle and images the environment ahead of the host vehicle, and a lane detection unit 9 that detects the lane ahead of the host vehicle based on the image captured by the camera 8.
- The camera 8 is composed of, for example, a CMOS camera, and is attached to the host vehicle with its optical axis directed forward and obliquely downward. As shown in the figure, the camera 8 images the surrounding environment including the road ahead, and the captured image is transmitted to the lane detection unit 9.
- The lane detection unit 9 performs, for example, binarization and feature-point extraction on the image captured by the camera 8 to select pixels (road lane-marking candidate points) presumed to correspond to road lane markings (white lines, yellow lines, broken lines, Botts' dots, etc.) on the road, recognizes continuous runs of the selected candidate points as the road lane markings that constitute the lane, obtains their positions, and transmits information on these positions to the lane position estimation unit 4 of the travel history calculation device 3. In the figure, the positions of the right road lane marking in the image captured by the camera 8 are denoted R1 to R3 from the near side, and the positions of the left road lane marking are denoted L1 to L3 from the near side.
- In the coordinate system X-Y viewed from above with the center of the vehicle as the origin, these positions are expressed as R1: (xr_1, yr_1), R2: (xr_2, yr_2), R3: (xr_3, yr_3).
- FIGS. 2 and 3 show an example in which three positions are obtained for each of the right and left road lane markings, but the same applies when two or fewer, or four or more, points are obtained, or when the markings are approximated by a straight line or a curve.
- the lane detection device 2 may be shared with various devices for lane detection used in, for example, a lane maintenance assist device (also referred to as lane keep assist), a lane departure warning device (also referred to as lane departure warning), and the like.
- The travel history calculation device 3 calculates the position of an object existing behind the host vehicle based on the information transmitted from the rear object detection device 1 and the lane detection device 2, and outputs the information necessary for the travel control of the host vehicle to the control unit 25. It is mainly composed of a travel history calculation unit 11, a lane position estimation unit 4, a lane position information storage unit 10, a relative position calculation unit 5, and a determination unit 6.
- The travel history calculation unit 11 calculates the travel history of the host vehicle based on information obtained from the steering angle sensor 21, the yaw rate sensor 22, the wheel speed sensor 23 serving as a vehicle speed sensor, the navigation system 24, and the like that constitute the vehicle travel control device 50, and transmits the calculation result to the lane position estimation unit 4.
- Let X(n)-Y(n) be the coordinate system whose origin is the center of the host vehicle at time t(n), and X(n+1)-Y(n+1) the coordinate system whose origin is the center of the host vehicle at time t(n+1). The amount of change (Δx, Δy) in the position of the host vehicle VS during Δt = t(n+1) - t(n) is expressed by equation (1), and the angle θn formed between X(n)-Y(n) and X(n+1)-Y(n+1) is expressed by equation (3).
- The speed Vn of the host vehicle VS is obtained from moment to moment by the wheel speed sensor 23, the navigation system 24, and the like, and the traveling direction θn and the rotational angular velocity ωn of the host vehicle VS are calculated from the steering angle sensor 21 and the yaw rate sensor 22.
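- Since equations (1) and (3) are not reproduced in this text, the following is a minimal dead-reckoning sketch of what they describe, assuming a short interval Δt, a lateral X axis and a forward Y axis, and a straight-line approximation over each interval (all conventions assumed, not taken from the patent):

```python
import math

def ego_motion_delta(v_n, theta_n, omega_n, dt):
    """Sketch of equations (1) and (3): the displacement (dx, dy) of the
    host vehicle VS during dt, expressed in the frame X(n)-Y(n), and the
    rotation between the consecutive vehicle frames.

    v_n: speed Vn from the wheel speed sensor [m/s]
    theta_n: traveling direction in the current vehicle frame [rad]
    omega_n: yaw rate from the yaw-rate sensor [rad/s]
    """
    # Equation (1), sketched: straight-line motion over the short interval
    # dt; X is lateral, Y points in the direction of travel.
    dx = v_n * dt * math.sin(theta_n)
    dy = v_n * dt * math.cos(theta_n)
    # Equation (3), sketched: the angle between frame X(n)-Y(n) and frame
    # X(n+1)-Y(n+1), accumulated from the yaw rate over dt.
    dtheta = omega_n * dt
    return dx, dy, dtheta
```

For example, driving straight ahead (theta_n = 0, omega_n = 0) at 10 m/s for 0.1 s gives a pure forward displacement of 1 m.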
- Based on the information obtained from the travel history calculation unit 11, the lane position estimation unit 4 converts the lane position information (the road lane-marking positions) output from the lane detection device 2 into the coordinate system whose origin is the vehicle center at the current time, and stores the conversion result in the lane position information storage unit 10. For this conversion, consider a point P fixed to the ground, expressed first in the coordinate system X(n)-Y(n) whose origin is the vehicle center at time t(n) and then in the coordinate system at time t(n+1). If the coordinates of P at time t(n) are (x(t(n)), y(t(n))) and the coordinates of P at time t(n+1) are (x(t(n+1)), y(t(n+1))), the relationship between these coordinates is expressed by equation (4).
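- Equation (4) itself is not reproduced in this text; the following sketch shows the standard form such a frame-to-frame transform takes, assuming the vehicle moved by (dx, dy) and rotated by dtheta between the two frames (sign conventions are assumptions, not quoted from the patent):

```python
import math

def transform_point(x, y, dx, dy, dtheta):
    """Sketch of equation (4): re-express a ground-fixed point P, known as
    (x, y) in the vehicle frame at time t(n), in the vehicle frame at time
    t(n+1), given the host-vehicle displacement (dx, dy) and the rotation
    dtheta between the two frames.
    """
    # Shift the origin to the new vehicle position...
    xs, ys = x - dx, y - dy
    # ...then rotate by -dtheta so the axes align with the new heading.
    xn = xs * math.cos(dtheta) + ys * math.sin(dtheta)
    yn = -xs * math.sin(dtheta) + ys * math.cos(dtheta)
    return xn, yn
```

For example, if the vehicle advances 1 m straight ahead, a ground point that was 5 m ahead is afterwards 4 m ahead in the new frame.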
- In other words, based on the information obtained from the travel history calculation unit 11, the lane position estimation unit 4 converts lane positions detected in the past into the coordinate system whose origin is the vehicle center at the current time, and stores the conversion result in the lane position information storage unit 10.
- the lane position estimation unit 4 acquires the current lane position information from the lane detection device 2 and adds and stores it in the lane position information storage unit 10.
- Specifically, at time t(n), the lane position estimation unit 4 first stores the lane position information output from the lane detection device 2 in the lane position information storage unit 10. The positions R1 to R3 of the right road lane marking (that is, lane information) obtained from the image captured by the camera 8 are stored as the coordinate information (xr_1(t(n)), yr_1(t(n))), (xr_2(t(n)), yr_2(t(n))), (xr_3(t(n)), yr_3(t(n))) in the coordinate system whose origin is the vehicle center at time t(n).
- Similarly, the positions L1 to L3 of the left road lane marking (that is, lane information) obtained from the image captured by the camera 8 are stored as the coordinate information (xl_1(t(n)), yl_1(t(n))), (xl_2(t(n)), yl_2(t(n))), (xl_3(t(n)), yl_3(t(n))) in the coordinate system X-Y whose origin is the vehicle center.
- Next, at time t(n+1), the lane position estimation unit 4 converts the right road lane-marking positions R1 to R3 and the left road lane-marking positions L1 to L3 into position information in the coordinate system X(n+1)-Y(n+1), whose origin is the vehicle center at time t(n+1), using equation (4), and stores the result in the lane position information storage unit 10.
- Such conversion is performed sequentially, so that at time t(n+m) the right road lane-marking positions R1 to R3 detected at time t(n) are combined with the right road lane-marking positions Rm1 to Rm3 detected at time t(n+m), and the coordinate information is stored in the lane position information storage unit 10 as lane position information (see FIG. 5). In this way, the position information of the road lane markings detected by the lane detection device 2 from time t(n) to time t(n+m) is stored in the lane position information storage unit 10 as lane position information.
- The lane position information storage unit 10 accumulates past lane position information; to prevent the storage capacity from overflowing, it is practically effective to sequentially delete from the lane position information storage unit 10 lane position information for which a predetermined time has elapsed or whose distance from the host vehicle has become a predetermined value or more.
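- The storage policy just described can be sketched as a simple pruning pass; the thresholds below (5 s, 80 m) are illustrative values, not taken from the patent:

```python
import math

def prune_lane_history(points, now, max_age=5.0, max_dist=80.0):
    """Sketch of the deletion policy of the lane position information
    storage unit 10: drop stored lane-position points older than max_age
    seconds, or farther than max_dist metres from the host vehicle.

    Each point is (x, y, timestamp) in the current vehicle frame, whose
    origin is the vehicle centre (representation assumed for the sketch).
    """
    return [
        (x, y, t) for (x, y, t) in points
        if now - t <= max_age and math.hypot(x, y) <= max_dist
    ]
```

For example, a point recorded 10 s ago or 200 m away is removed, while a recent nearby point is kept.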
- By recognizing the lane position information from moment to moment and using past lane position information in this way, the lane position behind the host vehicle can be accurately estimated.
- the relative position calculation unit 5 calculates the relative position of the object detected by the rear object detection device 1 with respect to the position of the lane behind the host vehicle stored in the lane position information storage unit 10.
- When the rear object detection device 1 detects a target vehicle VT as shown in FIGS. 6 and 7, obtaining the position (P, Q) of the target vehicle VT in the coordinate system X-Y whose origin is the center of the host vehicle and its relative speed with respect to the host vehicle VS, the relative position calculation unit 5 selects, from the lane position information stored in the lane position information storage unit 10, the points closest to the target vehicle in the traveling direction (Y-axis direction) of the host vehicle VS (two points on each of the left and right road lane markings). In FIGS. 6 and 7, the right road lane-marking positions Rn1 and Rn2 and the left road lane-marking positions Ln1 and Ln2 are the closest points in the Y-axis direction.
- The relative position calculation unit 5 then obtains the straight line connecting the two points Rn1 and Rn2 of the right road lane-marking position, calculates the distance xr_np in the X direction at the location corresponding to the position of the target vehicle VT in the Y-axis direction, calculates the magnitude relationship between the value of the distance xr_np and the X-axis value P of the target vehicle VT, and transmits the calculation result to the determination unit 6.
- Based on the calculation result of the relative position calculation unit 5, the determination unit 6 determines whether or not the object detected by the rear object detection device 1 is in a predetermined lane (for example, a lane for which a warning should be issued).
- FIG. 6 shows an example in which the X-axis value P of the target vehicle VT is larger than the distance xr_np to the right road lane marking, that is, an example in which the target vehicle VT exists in the right adjacent lane; from the above calculation it can be determined that the target vehicle VT exists in the lane adjacent on the right to the lane in which the host vehicle VS travels.
- FIG. 7 shows an example in which the X-axis value P of the target vehicle VT is smaller than the distance xr_np to the right road lane marking. In this case, by performing the same calculation on the left road lane marking as well, it is possible to determine whether or not the target vehicle VT exists behind in the same lane as the lane in which the host vehicle VS travels.
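- The relative-position calculation and lane determination described above can be sketched as follows, assuming linear interpolation between the two stored marking points nearest the target in the Y direction and a pure threshold comparison (the patent's exact decision logic is not reproduced; function names are illustrative):

```python
def lateral_offset_at(q, p1, p2):
    """Interpolate the lane-marking X position at the target's Y value q
    from the two stored marking points p1, p2, each (x, y); this plays the
    role of the distance xr_np above."""
    (x1, y1), (x2, y2) = p1, p2
    if y2 == y1:
        return x1
    return x1 + (x2 - x1) * (q - y1) / (y2 - y1)

def classify_target(p, q, right_marking, left_marking):
    """Sketch of the determination unit's lane assignment: (p, q) is the
    target position in the vehicle-centred frame; each marking argument is
    the pair of stored points closest to the target in the Y direction."""
    xr = lateral_offset_at(q, *right_marking)
    xl = lateral_offset_at(q, *left_marking)
    if p > xr:
        return "right adjacent lane"
    if p < xl:
        return "left adjacent lane"
    return "same lane"
```

With straight markings at x = ±1.8 m, a target at (3.0, -10.0) is classified into the right adjacent lane, matching the FIG. 6 case, and a target at (0.0, -10.0) into the same lane, matching the FIG. 7 case.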
- the control unit 25 generates various control signals for controlling the travel of the host vehicle based on the information transmitted from the object recognition device 20 (the determination unit 6 of the travel history calculation device 3).
- For example, when the determination unit 6 determines that an object detected by the rear object detection device 1 exists in a predetermined lane (for example, a lane for which a warning should be issued), the control unit 25 generates a control signal for controlling, for example, the steering, and transmits the signal to an appropriate on-vehicle device.
- In step S101, the lane detection device 2 of the object recognition device 20 obtains, by the method described with reference to FIGS. 2 and 3, the lane position in the coordinate system X-Y viewed from above with the host vehicle as the origin.
- In step S102, the lane position estimation unit 4 of the travel history calculation device 3 stores the lane position obtained in step S101 in the lane position information storage unit 10, in the coordinate system with the vehicle as the origin.
- In step S103, based on information obtained from the steering angle sensor 21, the yaw rate sensor 22, the wheel speed sensor 23, the navigation system 24, and the like constituting the vehicle travel control device 50, the travel history calculation unit 11 estimates, using equations (1) to (3) described above, the motion of the host vehicle over Δt seconds with respect to the coordinate system X(n)-Y(n) whose origin is the host vehicle at time t(n).
- In step S104, using equation (4) described above, the lane position estimation unit 4 converts the lane position information previously detected by the lane detection device 2 into the coordinate system whose origin is the vehicle at the current time, thereby estimating the lane position behind the host vehicle, and stores it in the lane position information storage unit 10.
- In step S105, the rear object detection device 1 detects a rear object (a target vehicle or the like), including the rear side, and obtains the position coordinates (P, Q) and the like of the target vehicle.
- In step S106, the relative position calculation unit 5 obtains the relative position between the lane position information stored in step S104 and the target vehicle detected in step S105. More specifically, as described above, when the rear object detection device 1 detects the target vehicle and its position (P, Q) in the coordinate system X-Y with the own vehicle as the origin, the two closest pieces of lane position information in the traveling direction (Y-axis direction) of the host vehicle are selected from the lane position information stored in the lane position information storage unit 10. By obtaining the distance in the X direction between the target vehicle and the two closest lane positions in the Y-axis direction, the relative position of the target vehicle with respect to the lane can be estimated.
- In step S107, based on the information on the relative position of the target vehicle with respect to the lane obtained in step S106, the determination unit 6 determines in which lane the target vehicle exists (the lane in which the host vehicle is traveling, the right adjacent lane, the left adjacent lane, or a lane two or more lanes away). Specifically, it is determined whether or not the target vehicle exists in an area (lane) for which an alarm should be issued; when it is determined that the target vehicle is in such an area (lane), it is then determined in step S108, from information on the position and speed of the target vehicle, whether or not an alarm should actually be output (for example, whether the target vehicle is approaching the host vehicle).
- When it is determined in step S108 that an alarm should actually be output, the control unit 25 generates a control signal in step S109 to issue the alarm.
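- The flow of steps S101 to S109 can be sketched as a single processing cycle. All the callables here (detect_lane, ego_motion, and so on) are hypothetical stand-ins for the devices of FIG. 1, injected as arguments so the sketch stays self-contained:

```python
def control_cycle(detect_lane, ego_motion, transform, detect_rear,
                  relative_position, should_alarm, history):
    """One pass of steps S101-S109, sketched over injected callables.
    Returns the list of rear targets for which an alarm should be issued."""
    history.append(detect_lane())                  # S101-S102: detect and store lane
    dx, dy, dtheta = ego_motion()                  # S103: sensors -> ego motion
    history[:] = [[transform(p, dx, dy, dtheta) for p in frame]
                  for frame in history]            # S104: re-express past lane points
    alarms = []
    for target in detect_rear():                   # S105: radio-wave radar targets
        rel = relative_position(history, target)   # S106: target vs. lane position
        if should_alarm(rel, target):              # S107-S108: lane check + approach check
            alarms.append(target)                  # S109: alarm on this target
    return alarms
```

With trivial stubs (one lane point per frame, the vehicle advancing 1 m per cycle), the stored lane point shifts 1 m rearward and a target in the alarm lane is returned.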
- As described above, in the first embodiment, in a vehicle that detects objects existing behind the host vehicle using a radio-wave radar 7 such as a millimeter-wave radar, the lane ahead of the host vehicle is detected based on the image captured by the camera 8 that images the environment ahead of the host vehicle, the position of the lane behind the host vehicle is estimated based on the detected lane and the travel history of the host vehicle, and the relative position of an object behind the host vehicle with respect to the estimated lane position is calculated. In this way, the position of an object existing behind the host vehicle, including obliquely behind it, and in particular the lane in which the object is located, can be recognized precisely even when traveling on a curve or changing lanes.
- FIG. 9 is a configuration diagram showing a configuration of a second embodiment of the vehicle travel control device using the object recognition device according to the present invention.
- the vehicle travel control device 50A according to the second embodiment shown in FIG. 9 detects a stationary solid object in front of the host vehicle (hereinafter simply referred to as a three-dimensional object) with respect to the vehicle travel control device 50 according to the first embodiment described above.
- The rest of the configuration is the same as that of the vehicle travel control device 50 of the first embodiment. Therefore, only the configuration that differs from the vehicle travel control device 50 of the first embodiment will be described in detail below; configurations identical to those of the first embodiment are given the same reference numerals, and their detailed description is omitted.
- the object recognition device 20A of the vehicle travel control device 50A includes a rear object detection device 1A, a three-dimensional object detection device 2A, and a travel history calculation device 3A.
- The three-dimensional object detection device 2A is for detecting a three-dimensional object (for example, a guard rail or a wall laid along the lane) in front of the host vehicle.
- It is composed of a stereo camera 8A, made up of a plurality of cameras (front cameras) that capture the environment ahead of the host vehicle and disposed, for example, at the upper center of the windshield of the host vehicle, and a three-dimensional object detection unit 9A that detects a three-dimensional object in front of the host vehicle based on the plurality of images captured by the stereo camera 8A.
- Each camera constituting the stereo camera 8A captures the surrounding environment including the preceding vehicle VP traveling ahead and the guard rail GR, which is a roadside three-dimensional object, and transmits the captured image to the three-dimensional object detection unit 9A.
- The three-dimensional object detection unit 9A obtains the parallax between the left and right images based on the left and right image information captured by the stereo camera 8A, detects the presence of the three-dimensional object, and calculates the distance from the host vehicle to the three-dimensional object. The size (height) of the three-dimensional object is also obtained.
- In FIG. 10, representative points of the three-dimensional object (here, the guard rail GR) detected from the image captured by the stereo camera 8A are shown as G1 to G3 from the near side, and FIG. 11 shows the positions of these representative points in the three-dimensional coordinate system X-Y-Z with the host vehicle as the origin as G1: (xg_1, yg_1, zg_1), G2: (xg_2, yg_2, zg_2), and G3: (xg_3, yg_3, zg_3), respectively.
- FIGS. 10 and 11 show an example in which three representative points of the three-dimensional object are detected, but the same applies when two or fewer, or four or more, points are detected. Further, when a three-dimensional object exists continuously along the lane, it may be approximated, for example, as a straight line or a curve in the X-Y plane of the coordinate system.
- The travel history calculation unit 11A of the travel history calculation device 3A calculates, by the same calculation as described with reference to FIG. 4 in the first embodiment, the coordinate system X(n+1)-Y(n+1) with the center of the host vehicle at time t(n+1) as the origin from the coordinate system X(n)-Y(n) with the center of the host vehicle at time t(n) as the origin, and calculates the amount of change (Δx, Δy) in the position of the host vehicle VS during Δt = (t(n+1) - t(n)). The change in the direction of the host vehicle VS is also obtained by the same procedure as described for equation (2) in the first embodiment, and these calculation results are transmitted to the three-dimensional object position estimation unit 4A.
- The three-dimensional object position estimation unit 4A converts the three-dimensional object position information output from the three-dimensional object detection device 2A into a coordinate system with the center of the host vehicle as the origin, based on the amount of change in the position of the host vehicle obtained by the travel history calculation unit 11A, and stores the conversion result in the three-dimensional object position information storage unit 10A. The transformation of the X-Y coordinate system at that time is the same as the transformation expressed by equation (4) in the first embodiment; however, in the second embodiment, since changes in the position of the host vehicle consist of movement within the X-Y plane (a horizontal plane in the illustrated example) and rotation about the Z axis (the vertical axis), the Z-direction information (height) of the three-dimensional object detected by the three-dimensional object detection device 2A keeps the same value before and after the coordinate conversion.
- That is, based on the information obtained from the travel history calculation unit 11A, the three-dimensional object position estimation unit 4A converts the three-dimensional object positions detected in the past into a coordinate system whose origin is the center of the host vehicle at the current time, and stores the conversion result in the three-dimensional object position information storage unit 10A. At the same time, the three-dimensional object position estimation unit 4A acquires the current three-dimensional object position information from the three-dimensional object detection device 2A and adds it to the three-dimensional object position information storage unit 10A.
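Although this excerpt does not reproduce equation (4) itself, the conversion of stored object positions into the current vehicle frame can be sketched as below. A standard two-dimensional rigid-body transform is assumed (translation (Δx, Δy) plus rotation Δθ about the Z axis, with the Z value preserved as stated above); all names and values are illustrative, not from the patent.

```python
import math

def to_current_frame(points, dx, dy, dtheta):
    """Convert 3-D object points from the vehicle frame at time t(n)
    to the frame at t(n+1), given the vehicle's motion (dx, dy) and
    heading change dtheta during dt. Z (height) is kept unchanged,
    since only in-plane translation and rotation about Z occur."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    out = []
    for (x, y, z) in points:
        # shift to the new origin, then rotate into the new axes
        xs, ys = x - dx, y - dy
        out.append((c * xs + s * ys, -s * xs + c * ys, z))
    return out

# Past representative points G1..G3 of a guard rail (illustrative values)
G = [(3.0, 10.0, 0.8), (3.1, 20.0, 0.8), (3.2, 30.0, 0.8)]
# Vehicle moved 5 m forward with no heading change during dt
print(to_current_frame(G, 0.0, 5.0, 0.0))
```

The sign convention for Δθ depends on how the axes are defined in the figures, so a real implementation would need to match the patent's equation (4) exactly.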
- First, at time t(n), the three-dimensional object position estimation unit 4A stores the three-dimensional object position information output from the three-dimensional object detection device 2A in the three-dimensional object position information storage unit 10A.
- For example, as shown in FIGS. 10 and 11, the position information of the three-dimensional object obtained from the image captured by the stereo camera 8A is stored as the positions G1 to G3 in the coordinate system X-Y-Z with the center of the host vehicle at time t(n) as the origin.
- Next, the three-dimensional object position estimation unit 4A uses equation (4) described above to convert the three-dimensional object position information G1 to G3 at time t(n) into position information in the coordinate system X(n+1)-Y(n+1)-Z(n+1) with the center of the host vehicle at time t(n+1) as the origin, and stores it in the three-dimensional object position information storage unit 10A.
- Such conversion is performed sequentially, and at time t(n+m) the three-dimensional object position information G1 to G3 detected at time t(n) is combined with the three-dimensional object position information Gm1 to Gm3 detected at time t(n+m), and the combined three-dimensional coordinate information is stored in the three-dimensional object position information storage unit 10A.
- The three-dimensional object position information storage unit 10A accumulates and stores past three-dimensional object position information. However, to prevent the storage capacity from overflowing, it is practically effective to sequentially delete from the three-dimensional object position information storage unit 10A any three-dimensional object position information for which a predetermined time has elapsed or for which the distance from the host vehicle has become equal to or greater than a predetermined value.
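The deletion policy described in this paragraph can be sketched as follows; the thresholds and record layout are illustrative assumptions, since the patent only states "a predetermined time" and "a predetermined value".

```python
def prune(records, now, max_age=5.0, max_dist=60.0):
    """Drop stored object records that are older than max_age seconds
    or farther from the host vehicle than max_dist metres.
    Each record is (timestamp, x, y, z) in the current vehicle frame."""
    kept = []
    for (t, x, y, z) in records:
        age = now - t
        dist = (x * x + y * y) ** 0.5
        if age <= max_age and dist <= max_dist:
            kept.append((t, x, y, z))
    return kept

records = [(0.0, 3.0, -80.0, 0.8),   # too old and too far behind -> dropped
           (9.0, 3.1, -20.0, 0.8)]   # recent and close -> kept
print(prune(records, now=10.0))
```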
- the position of the three-dimensional object behind the host vehicle can be accurately estimated by recognizing the three-dimensional object position information every moment and using the past three-dimensional object position information.
- the relative position calculation unit 5A calculates the relative position of the object detected by the rear object detection device 1A with respect to the position of the three-dimensional object behind the host vehicle stored in the three-dimensional object position information storage unit 10A.
- When the rear object detection device 1A detects the target vehicle VT at the left rear as shown in FIGS. 13 and 14, and obtains the position (P2, Q2) of the target vehicle VT in the X-Y plane of the coordinate system with the center of the host vehicle as the origin together with its relative speed with respect to the host vehicle VS, the relative position calculation unit 5A selects, from the three-dimensional object position information stored in the three-dimensional object position information storage unit 10A, the two points closest in the traveling direction (Y-axis direction) of the host vehicle VS. In FIG. 13, the three-dimensional object position information Gn1 and Gn2 are the closest points in the Y-axis direction.
- The relative position calculation unit 5A then obtains a straight line connecting the positions Gn1 and Gn2 of the two representative points of the three-dimensional object, obtains the distance xg_np in the X direction at the location on this straight line corresponding to the position of the target vehicle VT in the Y-axis direction, calculates the magnitude relationship between the distance xg_np and the value P2 of the target vehicle VT in the X-axis direction, and transmits the calculation result to the determination unit 6A.
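The comparison performed by the relative position calculation unit 5A amounts to interpolating the solid object's X position at the target's Y coordinate and comparing it with the target's X value. A minimal sketch (names and values are illustrative):

```python
def lateral_gap_at(gn1, gn2, qt):
    """Interpolate the solid object's X position at the target's
    Y coordinate qt, given two representative points (x, y)."""
    (x1, y1), (x2, y2) = gn1, gn2
    t = (qt - y1) / (y2 - y1)
    return x1 + t * (x2 - x1)

# Representative guard-rail points nearest to the target in Y (illustrative)
Gn1, Gn2 = (3.0, -10.0), (3.4, -20.0)
P2, Q2 = 4.0, -15.0          # target vehicle position from the rear radar
xg_np = lateral_gap_at(Gn1, Gn2, Q2)
# If the target lies beyond the solid object, the radar return is suspect
print(xg_np, P2 > xg_np)
```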
- Based on the magnitude relationship transmitted from the relative position calculation unit 5A, the determination unit 6A determines whether or not the object detected by the rear object detection device 1A exists in a predetermined region (for example, a region where the reliability of the detection result by the rear object detection device 1A is judged to be low, in other words, a region where the detected object is judged to be an object erroneously detected due to the influence of the three-dimensional object).
- Specifically, as illustrated in FIG. 13, the determination unit 6A determines whether the target vehicle VT exists on the inner side of the three-dimensional object (the host vehicle side) or on the outer side (the side opposite to the host vehicle), determines the influence of the three-dimensional object on the detection result of the target vehicle VT by the radio wave radar 7A, and transmits the determination result to the control unit 25A.
- When the heights included in Gn1 and Gn2 are higher than the mounting position of the radio wave radar 7A, the radio waves emitted from the radio wave radar 7A are considered not to reach beyond the straight line connecting the two representative points of the three-dimensional object. FIG. 13 shows an example in which the value P2 of the target vehicle VT in the X-axis direction is larger than the distance xg_np from the Y axis to the three-dimensional object; in such a case, the reliability of the detection result of the target vehicle VT by the rear object detection device 1A is judged to be low.
- FIG. 14 shows an example in which the target vehicle VT' is detected by the radio wave radar 7aA attached to the left rear of the host vehicle VS, the target vehicle VT is detected by the radio wave radar 7bA attached to the right rear of the host vehicle VS, and the position information Gn1, Gn2, and Gn3 of the three representative points of the three-dimensional object is obtained by the three-dimensional object position estimation unit 4A.
- In such a case, the radio waves radiated from the radio wave radar 7A are expected to be reflected by the three-dimensional object having a continuous shape; the target vehicle VT detected by the radio wave radar 7bA is true information, but the target vehicle VT' detected by the radio wave radar 7aA is highly likely to be information in which radio waves were reflected by the three-dimensional object and an erroneous target vehicle (a ghost of the target vehicle VT) was detected. Therefore, in such a case as well, the reliability of the detection result of the target vehicle VT by the rear object detection device 1A is judged to be low.
- In FIGS. 13 and 14, the position information of the three-dimensional object is shown as a series of points. However, when the three-dimensional object exists continuously along the road, it may, for example, be approximated as a straight line or a curve, and that information may be stored as the position information of the three-dimensional object in the three-dimensional coordinate system X-Y-Z.
- the control unit 25A generates various control signals for controlling the traveling of the host vehicle based on the information transmitted from the object recognition device 20A (the determination unit 6A of the travel history calculation device 3A).
- Specifically, when the determination unit 6A determines that the object detected by the rear object detection device 1A exists in a predetermined region (for example, a region where the reliability of the detection result by the rear object detection device 1A is judged to be low), the necessity of steering control (regulation), vehicle speed control of the host vehicle, and warnings for the driver (an alarm or a warning display) is low, so the control unit 25A generates, for example, a control signal that cancels or suppresses them.
- In step S201, the three-dimensional object detection device 2A of the object recognition device 20A obtains the position of the three-dimensional object in the three-dimensional coordinate system X-Y-Z with the host vehicle as the origin, using the method described based on FIGS. 10 and 11.
- In step S202, the three-dimensional object position estimation unit 4A of the travel history calculation device 3A stores the three-dimensional object position obtained in step S201 in the three-dimensional object position information storage unit 10A, in a coordinate system with the host vehicle as the origin.
- In step S203, the travel history calculation unit 11A estimates the amount of movement of the host vehicle during Δt seconds based on information obtained from the steering angle sensor 21A, the yaw rate sensor 22A, the wheel speed sensor 23A, the navigation system 24A, and the like that constitute the vehicle travel control device 50A.
- the processing in step S203 is the same as the processing in step S103 in FIG.
- In step S204, the three-dimensional object position estimation unit 4A uses equation (4) described above to convert the three-dimensional object position information previously detected by the three-dimensional object detection device 2A into a coordinate system with the host vehicle at the current time as the origin, thereby estimating the position of the three-dimensional object behind the host vehicle, and stores the result in the three-dimensional object position information storage unit 10A.
- In step S205, the rear object detection device 1A detects a rear object (such as a target vehicle), including objects diagonally behind, and obtains the position coordinates (P, Q) and the like of the target vehicle.
- the processing in step S205 is the same as the processing in step S105 in FIG.
- In step S206, the relative position calculation unit 5A obtains the relative position between the three-dimensional object position information stored in step S204 and the target vehicle detected in step S205. More specifically, as described above, when the rear object detection device 1A detects the target vehicle and obtains its position (P, Q) in the X-Y plane of the three-dimensional coordinate system with the host vehicle as the origin, the two points of three-dimensional object position information closest in the traveling direction (Y-axis direction) of the host vehicle are selected from the three-dimensional object position information stored in the three-dimensional object position information storage unit 10A. The relative position of the target vehicle with respect to the three-dimensional object can then be estimated by obtaining the distance in the X direction, at the target vehicle's position in the Y-axis direction, between the two closest representative points of the three-dimensional object and the target vehicle.
- In step S207, it is determined whether or not the target vehicle behind the host vehicle is in an area (lane) that should be subject to an alarm. Usually, this determination is made based on where the target vehicle is located with respect to the host vehicle; however, when lane position information as described in the first embodiment has been acquired, it is possible to accurately determine in which lane the target vehicle is traveling (the lane in which the host vehicle travels, the lane to the right, the lane to the left, or a lane two or more lanes away).
- In step S208, based on the information on the relative position of the target vehicle with respect to the three-dimensional object obtained in step S206, it is determined by the method described based on FIGS. 13 and 14 whether or not the vehicle detected by the rear object detection device 1A exists in an area where the reliability of the detection result by the rear object detection device 1A is judged to be low, in other words, whether or not the vehicle detected by the rear object detection device 1A is a vehicle erroneously detected due to the influence of a three-dimensional object.
- When it is determined in step S208 that the vehicle detected by the rear object detection device 1A is not a vehicle erroneously detected due to the influence of a three-dimensional object, in step S209 it is determined from the position and speed information of the target vehicle (for example, whether the target vehicle is approaching the host vehicle) whether or not an alarm should actually be output.
- When it is determined in step S209 that an alarm should actually be output, a control signal for generating an alarm is generated by the control unit 25A in step S210.
- the processes in steps S209 and S210 are the same as the processes in steps S108 and S109 in FIG.
- As described above, in the second embodiment, a three-dimensional object in front of the host vehicle is detected based on the plurality of images captured by the stereo camera 8A, which captures the environment ahead of the host vehicle, and the position of the three-dimensional object behind the host vehicle is estimated based on the detected three-dimensional object and the travel history of the host vehicle. By calculating the relative position, with respect to the estimated three-dimensional object position, of an object detected behind the host vehicle, an object erroneously detected due to the influence of the three-dimensional object can be identified and the reliability of the detection result can be judged appropriately.
- FIG. 16 is a configuration diagram showing the configuration of a third embodiment of the vehicle travel control device using the object recognition device according to the present invention.
- The vehicle travel control device 50B according to the third embodiment shown in FIG. 16 differs from the vehicle travel control device 50 according to the first embodiment described above in that it detects the lane behind the host vehicle using a camera that captures the environment behind the host vehicle.
- The rest of the configuration is the same as that of the vehicle travel control device 50 of the first embodiment. Therefore, only the configuration that differs from the vehicle travel control device 50 of the first embodiment will be described in detail below; configurations identical to those of the first embodiment are given the same reference numerals, and their detailed description is omitted.
- the object recognition device 20B of the vehicle travel control device 50B includes a rear object detection device 1B, a lane detection device 2B, and a travel history calculation device 3B, as with the vehicle travel control device 50 of the first embodiment.
- a camera (imaging unit) 8B in the lane detection device 2B is attached toward the rear of the host vehicle (see FIG. 18).
- The lane detection device 2B is for detecting the lane (the lane in which vehicles travel) behind the host vehicle.
- It is composed of a camera (rear camera; imaging unit) 8B that is disposed at the rear of the host vehicle and captures the environment behind the host vehicle, and a lane detection unit 9B that detects the lane behind the host vehicle based on the image captured by the camera 8B.
- The camera 8B is composed of, for example, a CMOS camera and is attached to the host vehicle with its optical axis directed rearward and obliquely downward. As shown in FIG. 17, the camera 8B images the surrounding environment, including the road, within a range of about 10 m behind the host vehicle, and transmits the captured image to the lane detection unit 9B.
- Here, the detection area (imaging area) Ac of the camera 8B is smaller than the detection areas Aa and Ab of the radio wave radar 7B (7aB, 7bB) constituting the rear object detection device 1B; more specifically, the detection areas Aa and Ab of the radio wave radar 7B behind the host vehicle are larger than the detection area (imaging area) Ac of the camera 8B, so that the radio wave radar 7B (7aB, 7bB) can detect objects existing farther behind the host vehicle (in particular, at the rear sides) than the camera 8B can (see FIG. 18).
- The lane detection unit 9B performs, for example, binarization processing and feature point extraction processing on the image captured by the camera 8B, selects from the image the pixels (road lane marking candidate points) presumed to correspond to road lane markings on the road surface (white lines, yellow lines, broken lines, Botts' dots, etc.), recognizes a continuous series of the selected candidate points as the road lane markings constituting the lane, obtains their positions, and transmits information on those positions to the lane position estimation unit 4B of the travel history calculation device 3B.
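The binarization step can be illustrated with a toy pass over a single grayscale image row; a real lane detector would of course operate on full camera frames, and the threshold here is an arbitrary assumption:

```python
def lane_candidates(row, threshold=200):
    """Return the column indices in one grayscale image row whose
    brightness suggests a painted road lane marking (white or yellow
    paint is much brighter than asphalt)."""
    return [i for i, v in enumerate(row) if v >= threshold]

# One synthetic row: dark asphalt with a bright marking at columns 4-6
row = [30, 32, 35, 40, 220, 235, 228, 45, 33, 31]
print(lane_candidates(row))   # -> [4, 5, 6]
```

Connecting such candidate points across consecutive rows gives the continuous series that is recognized as a road lane marking.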
- The lane detection device 2B may be shared with various lane detection devices used in, for example, a lane keeping assist device (also referred to as lane keep assist) or a lane departure warning device (also referred to as lane departure warning).
- The travel history calculation unit 11B of the travel history calculation device 3B calculates, by the same calculation as described with reference to FIG. 4 in the first embodiment, the coordinate system X(n+1)-Y(n+1) with the center of the host vehicle at time t(n+1) as the origin from the coordinate system X(n)-Y(n) with the center of the host vehicle at time t(n) as the origin, and calculates the amount of change (Δx, Δy) in the position of the host vehicle VS during Δt = (t(n+1) - t(n)). The change in the direction of the host vehicle VS is also obtained by the same procedure as described for equation (2) in the first embodiment, and these calculation results are transmitted to the lane position estimation unit 4B.
- Based on the information obtained from the travel history calculation unit 11B, the lane position estimation unit 4B converts the lane positions detected in the past into a coordinate system with the center of the host vehicle at the current time as the origin, by the same calculation as described with reference to FIG. 4 in the first embodiment, and stores the conversion result in the lane position information storage unit 10B.
- Specifically, at time t(n), the lane position estimation unit 4B first stores the lane position information output from the lane detection device 2B in the lane position information storage unit 10B. For example, the position of the right road lane marking (that is, lane information) obtained from the image captured by the camera 8B is stored as the coordinate information (xcr_1(t(n)), ycr_1(t(n))) and (xcr_2(t(n)), ycr_2(t(n))) of the positions Rc1 and Rc2 in the coordinate system X-Y with the center of the host vehicle at time t(n) as the origin.
- Similarly, although not shown in FIG. 19, the position of the left road lane marking (that is, lane information) obtained from the image captured by the camera 8B is stored as the coordinate information (xcl_1(t(n)), ycl_1(t(n))) and (xcl_2(t(n)), ycl_2(t(n))) of the positions Lc1 and Lc2 in the coordinate system X-Y with the center of the host vehicle as the origin.
- Next, the lane position estimation unit 4B uses equation (4) described above to convert the positions Rc1 and Rc2 of the right road lane marking and the positions Lc1 and Lc2 of the left road lane marking into position information in the coordinate system X(n+1)-Y(n+1) with the center of the host vehicle at time t(n+1) as the origin, and stores it in the lane position information storage unit 10B.
- Such conversion is performed sequentially, and at time t(n+m) the positions Rc1 and Rc2 of the right road lane marking detected at time t(n) are combined with the positions Rcm1 and Rcm2 of the right road lane marking detected at time t(n+m), and the combined coordinate information is stored in the lane position information storage unit 10B as lane position information (see FIG. 19).
- In this way, the position information of the road lane markings detected by the lane detection device 2B from time t(n) to time t(n+m) is stored in the lane position information storage unit 10B as lane position information.
- The lane position information storage unit 10B accumulates and stores past lane position information. However, to prevent the storage capacity from overflowing, it is practically effective to sequentially delete from the lane position information storage unit 10B any lane position information for which a predetermined time has elapsed or for which the distance from the host vehicle has become equal to or greater than a predetermined value.
- In this way, by recognizing the lane position information at every moment and using the past lane position information, the lane position behind the host vehicle (the lane position behind the detection area (imaging area) Ac of the camera 8B) can be accurately estimated.
- The relative position calculation unit 5B and the determination unit 6B perform the same calculations as described with reference to FIGS. 4 to 7 in the first embodiment and determine, for example, whether the target vehicle VT exists behind in the same lane as the lane in which the host vehicle VS travels (see FIG. 7).
- The control unit 25B can generate various control signals for controlling the travel of the host vehicle and the like, by the same calculations as described in the first embodiment, based on the information transmitted from the object recognition device 20B (the determination unit 6B of the travel history calculation device 3B).
- As described above, in the third embodiment, an object existing behind the host vehicle, including diagonally behind it, is detected using the radio wave radar 7B, such as a millimeter wave radar, which has a larger detection range behind the host vehicle than the camera 8B.
- The lane behind the host vehicle is detected based on the image captured by the camera 8B, which captures the environment behind the host vehicle; the position of the lane behind the host vehicle (in particular, behind the detection area (imaging area) Ac of the camera 8B) is estimated based on the detected lane and the travel history of the host vehicle; and the relative position of an object existing behind the host vehicle with respect to the estimated lane position is calculated. Thus, even while traveling on a curve or changing lanes, the position of an object existing behind the host vehicle, including diagonally behind it, and in particular the lane in which the object is located, can be recognized accurately and quickly.
- Although the first, second, and third embodiments have been described separately above, they may be combined: lanes and three-dimensional objects ahead of or behind the host vehicle are detected based on images captured by a camera that captures the environment ahead of or behind the host vehicle; the positions of the lanes and three-dimensional objects behind the host vehicle are estimated based on the detected lanes and three-dimensional objects and the travel history of the host vehicle; and the relative position of an object existing behind the host vehicle with respect to the estimated lane and three-dimensional object positions is calculated. In this way, the position of an object existing behind the host vehicle, including diagonally behind it, and in particular the lane in which the object is located, can be recognized even more precisely.
- In the embodiments described above, a mode in which the speed information of the host vehicle is acquired using the wheel speed sensor has been described; however, the speed information of the host vehicle may of course be acquired by means or devices other than the wheel speed sensor.
- a three-dimensional object may be detected using a monocular camera.
- In the third embodiment, a mode was described in which a monocular camera is attached facing rearward to detect the lane behind the host vehicle.
- However, a stereo camera composed of a plurality of cameras may instead be attached facing rearward at the rear of the host vehicle to detect a lane or a three-dimensional object behind the host vehicle, and the position of the lane or three-dimensional object behind the host vehicle (in particular, behind the detection area (imaging area) of the stereo camera) may be estimated based on the detection result.
- the present invention is not limited to Embodiments 1 to 3 described above, and includes various modifications.
- the first to third embodiments described above are described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.
- a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
- Control lines and information lines show those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be connected to one another.
Abstract
Description
..., and an estimating means for estimating the lane in which the rear object is located based on the relative positional relationship between the past travel positions of the host vehicle stored in the storage means and the position of the rear object detected by the detecting means, wherein the estimating means estimates the lane in which the rear object is located based on the distance between the travel trajectory obtained from the past travel positions of the host vehicle and the position of the rear object.
FIG. 1 is a configuration diagram showing the configuration of the first embodiment of the vehicle travel control device using the object recognition device according to the present invention.
...and a lane detection unit 9 that detects the lane ahead of the host vehicle based on the captured image.
In FIG. 2, the positions of the right road lane marking in the image captured by the camera 8 are shown as R1 to R3 from the near side, and the positions of the left road lane marking as L1 to L3 from the near side. FIG. 3 shows the positions of these road lane markings, viewed from above in the coordinate system X-Y with the center of the host vehicle as the origin, as R1: (xr_1, yr_1), R2: (xr_2, yr_2), R3: (xr_3, yr_3), L1: (xl_1, yl_1), L2: (xl_2, yl_2), and L3: (xl_3, yl_3), respectively. Although FIGS. 2 and 3 show an example in which three points are obtained for each of the right and left road lane markings, the same applies when two or fewer or four or more points are obtained, or when the markings are approximated by straight lines or curves.
...Let X(n)-Y(n) be the coordinate system with the center of the host vehicle at time t(n) as the origin, and X(n+1)-Y(n+1) the coordinate system with the center of the host vehicle at time t(n+1) as the origin. When the speed of the host vehicle VS at time t(n) is Vn and its traveling direction is θn, the amount of change (Δx, Δy) in the position of the host vehicle VS during Δt = (t(n+1) - t(n)) is expressed by the following equation (1).
When the rotational angular velocity of the host vehicle VS at time t(n), estimated by the steering angle sensor 21 and the yaw rate sensor 22, is ωn, the traveling direction θn+1 of the host vehicle VS at time t(n+1) is estimated by the following equation (2). The angle Δθn formed with the coordinate system X(n+1)-Y(n+1), whose origin is the center of the host vehicle at time t(n+1), is expressed by the following equation (3).
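The images of equations (1) to (3) are not reproduced in this excerpt. Assuming the usual dead-reckoning form implied by the text (Δx = Vn·Δt·cos θn and Δy = Vn·Δt·sin θn for equation (1), θn+1 = θn + ωn·Δt for equation (2), and Δθn = ωn·Δt for equation (3)), one update step can be sketched as:

```python
import math

def dead_reckon(v_n, theta_n, omega_n, dt):
    """One dead-reckoning step: position change (dx, dy) from speed and
    heading (assumed eq. (1)), new heading from the yaw rate (assumed
    eq. (2)), and the frame rotation d_theta (assumed eq. (3))."""
    dx = v_n * dt * math.cos(theta_n)
    dy = v_n * dt * math.sin(theta_n)
    theta_next = theta_n + omega_n * dt
    d_theta = theta_next - theta_n        # = omega_n * dt
    return dx, dy, theta_next, d_theta

# 20 m/s straight ahead along +Y (theta = pi/2), no yaw, 0.1 s step
dx, dy, th, dth = dead_reckon(20.0, math.pi / 2, 0.0, 0.1)
print(dx, dy, th, dth)
```

Whether θ is measured from the X axis or the Y axis depends on the figures, so the exact form of equation (1) should be taken from the published drawings.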
...the position information of the road lane markings detected by the lane detection device 2 up to time t(n+m) is stored in the lane position information storage unit 10 as lane position information.
...From the lane position information stored in the lane position information storage unit 10, the two closest points in the traveling direction (Y-axis direction) of the host vehicle VS (two points each for the left and right road lane markings) are selected. In FIGS. 6 and 7, the lane position information of the positions Rn1 and Rn2 of the right road lane marking and the positions Ln1 and Ln2 of the left road lane marking are the closest points in the Y-axis direction. The relative position calculation unit 5 then obtains a straight line connecting the two points Rn1 and Rn2 of the right road lane marking, obtains the distance xr_np in the X direction at the location on this straight line corresponding to the position of the target vehicle VT in the Y-axis direction, calculates the magnitude relationship between the distance xr_np and the value P of the target vehicle VT in the X-axis direction, and transmits the calculation result to the determination unit 6.
...By subsequently performing the same processing for the left road lane marking as well, it can be determined whether or not the target vehicle VT exists behind in the same lane as the lane in which the host vehicle VS travels.
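Carrying out the comparison for both the right and left road lane markings yields a same-lane test: the target is in the host vehicle's lane when its lateral position lies between the two interpolated marking positions. A minimal sketch (all values illustrative):

```python
def in_same_lane(p, xl_np, xr_np):
    """True if the target's lateral position p lies between the
    interpolated left and right lane-marking positions xl_np and
    xr_np at the target's Y coordinate (the host lane boundaries)."""
    return xl_np < p < xr_np

# Left marking at -1.8 m, right marking at +1.7 m at the target's Y;
# a target directly behind at p = 0.3 m is inside the host's lane
print(in_same_lane(0.3, -1.8, 1.7))    # -> True
print(in_same_lane(3.5, -1.8, 1.7))    # -> False
```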
...for example, a control signal for controlling (regulating) the steering, a control signal for controlling the vehicle speed of the host vehicle, and a control signal for issuing a warning to the driver (an alarm or a warning display on a control panel or the like) are generated, and such control signals are transmitted to the appropriate in-vehicle devices.
...the lane position obtained in step S101 is stored in the lane position information storage unit 10 in a coordinate system with the host vehicle as the origin.
Using equations (1) to (3), the coordinate system X(n)-Y(n) with the host vehicle at time t(n) as the origin and the coordinate system X(n+1)-Y(n+1) with the host vehicle at time t(n+1) as the origin are obtained.
FIG. 9 is a configuration diagram showing the configuration of the second embodiment of the vehicle travel control device using the object recognition device according to the present invention.
...may be approximated as a straight line or a curve.
...The coordinate system X(n+1)-Y(n+1) with the host vehicle at time t(n+1) as the origin is calculated from the coordinate system at time t(n), and the amount of change (Δx, Δy) in the position of the host vehicle VS during Δt = (t(n+1) - t(n)) is calculated. The change in the direction of the host vehicle VS is also obtained by the same calculation procedure as described for equation (2) and the like in the first embodiment, and those calculation results are transmitted to the three-dimensional object position estimation unit 4A.
...is the same as the transformation shown by equation (4) in the first embodiment. However, in the second embodiment, changes in the position and the like of the host vehicle are movements in the axial directions within the X-Y plane (a horizontal plane in the illustrated example) and rotation about the Z axis (the vertical axis), so the Z-direction information (height) of the three-dimensional object detected by the three-dimensional object detection device 2A is assumed to keep the same value before and after the coordinate conversion.
...the three-dimensional object position information output from the three-dimensional object detection device 2A is stored in the three-dimensional object position information storage unit 10A. For example, as shown in FIGS. 10 and 11, the position information of the three-dimensional object obtained from the image captured by the stereo camera 8A is stored as the position and height information G1 to G3 in the coordinate system X-Y-Z with the center of the host vehicle at time t(n) as the origin, that is, as the coordinate information (xg_1(t(n)), yg_1(t(n)), zg_1(t(n))), (xg_2(t(n)), yg_2(t(n)), zg_2(t(n))), and (xg_3(t(n)), yg_3(t(n)), zg_3(t(n))).
...When the position (P2, Q2) of the target vehicle VT and its relative speed with respect to the host vehicle VS are detected, the relative position calculation unit 5A selects, from the three-dimensional object position information stored in the three-dimensional object position information storage unit 10A, the two points closest in the traveling direction (Y-axis direction) of the host vehicle VS. In FIG. 13, the three-dimensional object position information Gn1 and Gn2 are the closest points in the Y-axis direction. The relative position calculation unit 5A then obtains a straight line connecting the positions Gn1 and Gn2 of the two representative points of the three-dimensional object, obtains the distance xg_np in the X direction at the location on this straight line corresponding to the position of the target vehicle VT in the Y-axis direction, calculates the magnitude relationship between the distance xg_np and the value P2 of the target vehicle VT in the X-axis direction, and transmits the calculation result to the determination unit 6A.
...When the heights included in Gn1 and Gn2 are higher than the mounting position of the radio wave radar 7A, the radio waves output from the radio wave radar 7A are considered not to reach beyond the line connecting the two representative points of the three-dimensional object. FIG. 13 shows an example in which the value P2 of the target vehicle VT in the X-axis direction is larger than the distance xg_np from the Y axis to the three-dimensional object; in such a case, the reliability of the detection result of the target vehicle VT by the rear object detection device 1A is judged to be low.
...shows an example in which the target vehicle VT is detected by the radio wave radar 7bA and the position information Gn1, Gn2, and Gn3 of the three representative points of the three-dimensional object is obtained by the three-dimensional object position estimation unit 4A. In such a case, the radio waves radiated from the radio wave radar 7A are expected to be reflected by the three-dimensional object having a continuous shape; the target vehicle VT detected by the radio wave radar 7bA is true information, but the target vehicle VT' detected by the radio wave radar 7aA is highly likely to be information in which radio waves were reflected by the three-dimensional object and an erroneous target vehicle (a ghost of the target vehicle VT) was detected. Therefore, in such a case as well, the reliability of the detection result of the target vehicle VT by the rear object detection device 1A is judged to be low.
...First, in step S201, the three-dimensional object detection device 2A of the object recognition device 20A obtains the three-dimensional object position in the three-dimensional coordinate system X-Y-Z with the host vehicle as the origin, using the method described based on FIGS. 10 and 11.
...is estimated. The processing in this step S203 is the same as the processing in step S103 in FIG. 8.
...The processing in step S205 is the same as the processing in step S105 in FIG. 8.
...it is determined whether or not the target vehicle exists in that area (lane). Usually, the determination is made based on where the target vehicle is located with respect to the host vehicle; however, when lane position information as described in the first embodiment has been acquired, it is possible to accurately determine in which lane the target vehicle is traveling (the lane in which the host vehicle travels, the lane to the right, the lane to the left, or a lane two or more lanes away).
FIG. 16 is a configuration diagram showing the configuration of the third embodiment of the vehicle travel control device using the object recognition device according to the present invention.
...a (rear) camera (imaging unit) 8B, and a lane detection unit 9B that detects the lane behind the host vehicle based on the image captured by the camera 8B.
Here, the detection area (imaging area) Ac of the camera 8B is smaller than the detection areas Aa and Ab of the radio wave radar 7B (7aB, 7bB) constituting the rear object detection device 1B. More specifically, the detection areas Aa and Ab of the radio wave radar 7B (7aB, 7bB) behind the host vehicle are larger than the detection area (imaging area) Ac of the camera 8B behind the host vehicle, so that the radio wave radar 7B (7aB, 7bB) can detect objects existing farther behind the host vehicle (in particular, at the rear sides) than the camera 8B can (see FIG. 18).
...the positions are shown as Rc1: (xcr_1, ycr_1), Rc2: (xcr_2, ycr_2), Lc1: (xcl_1, ycl_1), and Lc2: (xcl_2, ycl_2), respectively. Although FIGS. 17 and 18 show an example in which two points of the road lane marking positions are obtained for each of the right and left sides, the same applies when one point or three or more points are obtained, or when the markings are approximated by straight lines or curves.
From time t(n), the coordinate system X(n+1)-Y(n+1) whose origin is the host vehicle at time t(n+1) is computed, and the change (Δx, Δy) in the position of the host vehicle VS during Δt = t(n+1) - t(n) is computed. The change in the heading of the host vehicle VS is likewise obtained by the same calculation procedure as described for Equation (2) and the like in Embodiment 1, and these results are transmitted to the lane position estimation unit 4B.
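One common way to obtain (Δx, Δy) and the heading change from wheel-speed and yaw-rate measurements is constant-velocity, constant-yaw-rate dead reckoning (a sketch under our own frame convention, Y forward and X to the right; the patent's Equation (2) is not reproduced here):

```python
import math

def pose_increment(speed, yaw_rate, dt):
    """Return (dx, dy, dtheta): the host vehicle's displacement and
    heading change over dt, expressed in the vehicle frame at the
    start of the interval, assuming constant speed and yaw rate."""
    dtheta = yaw_rate * dt
    if abs(yaw_rate) < 1e-9:           # straight-line limit
        return 0.0, speed * dt, dtheta
    r = speed / yaw_rate               # turn radius of the arc
    dy = r * math.sin(dtheta)          # forward component
    dx = r * (1.0 - math.cos(dtheta))  # lateral component
    return dx, dy, dtheta

# Driving straight at 10 m/s for 0.1 s advances the car 1 m along Y.
dx, dy, dth = pose_increment(10.0, 0.0, 0.1)
```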
Although not shown in FIG. 19, the position of the left road lane marking obtained from the image captured by the camera 8B (that is, the lane information) is stored as the coordinates (xcl_1(t(n)), ycl_1(t(n))) and (xcl_2(t(n)), ycl_2(t(n))) of the positions Lc1 and Lc2 in the coordinate system X-Y whose origin is the center of the host vehicle. This is converted into position information in the current coordinate system and stored in the lane position information storage unit 10B.
It is judged whether the target vehicle is present behind in the same lane as the lane in which the host vehicle VS is traveling (see FIG. 7).
Part of the configuration of an embodiment may have another configuration added to it, deleted from it, or substituted for it.
2, 2B … lane detection device
2A … three-dimensional object detection device
3, 3A, 3B … travel history computation device
4, 4B … lane position estimation unit
4A … three-dimensional object position estimation unit
5, 5A, 5B … relative position computation unit
6, 6A, 6B … determination unit
7, 7A, 7B … radio-wave radar (rear object detection unit)
8 … front camera (imaging unit)
8A … stereo camera (imaging unit)
8B … rear camera (imaging unit)
9, 9B … lane detection unit
9A … three-dimensional object detection unit
10, 10B … lane position information storage unit
10A … three-dimensional object position information storage unit
11, 11A, 11B … travel history computation unit
20, 20A, 20B … object recognition device
21, 21A, 21B … steering angle sensor
22, 22A, 22B … yaw rate sensor
23, 23A, 23B … wheel speed sensor
24, 24A, 24B … navigation system
25, 25A, 25B … control unit
GR … guardrail
VP … preceding vehicle
VS … host vehicle
VT … target vehicle
Claims (15)
- An object recognition device that recognizes the position of an object present behind a host vehicle, comprising:
an imaging unit that images an environment ahead of or behind the host vehicle;
a lane detection unit that detects a lane ahead of or behind the host vehicle based on an image captured by the imaging unit;
a lane position estimation unit that estimates the position of a lane behind the host vehicle based on the lane detected by the lane detection unit and a travel history of the host vehicle;
a rear object detection unit that detects an object present behind the host vehicle; and
a relative position computation unit that computes the relative position of the object detected by the rear object detection unit with respect to the lane position estimated by the lane position estimation unit.
- The object recognition device according to claim 1, wherein the lane detection unit detects the lane ahead of or behind the host vehicle by extracting a road lane marking ahead of or behind the host vehicle from the image captured by the imaging unit.
- The object recognition device according to claim 1, wherein the imaging unit comprises a monocular camera.
- The object recognition device according to claim 1, further comprising a travel history computation unit that computes the travel history of the host vehicle based on information obtained from at least one selected from a vehicle speed sensor, a yaw rate sensor, a steering angle sensor, and a navigation system.
- The object recognition device according to claim 1, wherein the rear object detection unit comprises a radio-wave radar.
- The object recognition device according to claim 1, further comprising a determination unit that determines, based on the relative position of the object computed by the relative position computation unit, whether or not the object is present in a predetermined lane.
- A vehicle travel control device that controls travel of the host vehicle based on the position of the object recognized by the object recognition device according to claim 6.
- The vehicle travel control device according to claim 7, wherein a warning is issued when the determination unit determines that the object is present in the predetermined lane.
- An object recognition device that recognizes the position of an object present behind a host vehicle, comprising:
an imaging unit that images an environment ahead of or behind the host vehicle;
a three-dimensional object detection unit that detects a stationary three-dimensional object ahead of or behind the host vehicle based on an image captured by the imaging unit;
a three-dimensional object position estimation unit that estimates the position of the stationary three-dimensional object behind the host vehicle based on the stationary three-dimensional object detected by the three-dimensional object detection unit and a travel history of the host vehicle;
a rear object detection unit that detects an object present behind the host vehicle; and
a relative position computation unit that computes the relative position of the object detected by the rear object detection unit with respect to the position of the stationary three-dimensional object estimated by the three-dimensional object position estimation unit.
- The object recognition device according to claim 9, wherein the imaging unit comprises a plurality of cameras.
- The object recognition device according to claim 9, further comprising a travel history computation unit that computes the travel history of the host vehicle based on information obtained from at least one selected from a vehicle speed sensor, a yaw rate sensor, a steering angle sensor, and a navigation system.
- The object recognition device according to claim 9, wherein the rear object detection unit comprises a radio-wave radar.
- The object recognition device according to claim 9, further comprising a determination unit that determines, based on the relative position of the object computed by the relative position computation unit, whether or not the object is present in a predetermined region.
- A vehicle travel control device that controls travel of the host vehicle based on the position of the object recognized by the object recognition device according to claim 13.
- The vehicle travel control device according to claim 14, wherein a warning is issued when the determination unit determines that the object is present in the predetermined region.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016529224A JP6404922B2 (ja) | 2014-06-19 | 2015-06-03 | 物体認識装置及びそれを用いた車両走行制御装置 |
US15/307,394 US9934690B2 (en) | 2014-06-19 | 2015-06-03 | Object recognition apparatus and vehicle travel controller using same |
CN201580028678.3A CN106463064B (zh) | 2014-06-19 | 2015-06-03 | 物体识别装置和使用该物体识别装置的车辆行驶控制装置 |
EP15809777.4A EP3159866B1 (en) | 2014-06-19 | 2015-06-03 | Object recognition apparatus and vehicle travel controller using same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014126634 | 2014-06-19 | ||
JP2014-126634 | 2014-06-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015194371A1 true WO2015194371A1 (ja) | 2015-12-23 |
Family
ID=54935361
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/065967 WO2015194371A1 (ja) | 2014-06-19 | 2015-06-03 | 物体認識装置及びそれを用いた車両走行制御装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9934690B2 (ja) |
EP (1) | EP3159866B1 (ja) |
JP (1) | JP6404922B2 (ja) |
CN (1) | CN106463064B (ja) |
WO (1) | WO2015194371A1 (ja) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102015209186A1 (de) * | 2015-05-20 | 2016-12-08 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zur Ermittlung einer Beschreibung eines Fahrstreifens |
JP6602683B2 (ja) * | 2016-02-05 | 2019-11-06 | 株式会社東芝 | 充電装置および位置ずれ検出方法 |
JP6620881B2 (ja) * | 2016-03-17 | 2019-12-25 | 日本電気株式会社 | 乗車人数計測装置、システム、方法およびプログラムならびに車両移動量算出装置、方法およびプログラム |
US20170309181A1 (en) * | 2016-04-26 | 2017-10-26 | Hyundai Motor Company | Apparatus for recognizing following vehicle and method thereof |
CN109476307B (zh) * | 2016-07-12 | 2021-12-07 | 日产自动车株式会社 | 行驶控制方法及行驶控制装置 |
CN108528432B (zh) * | 2017-03-02 | 2020-11-06 | 比亚迪股份有限公司 | 车辆行驶自动控制方法和装置 |
CN108528450B (zh) * | 2017-03-02 | 2020-06-19 | 比亚迪股份有限公司 | 车辆行驶自动控制方法和装置 |
CN108528449B (zh) * | 2017-03-02 | 2020-02-21 | 比亚迪股份有限公司 | 车辆行驶自动控制方法和装置 |
CN108528433B (zh) * | 2017-03-02 | 2020-08-25 | 比亚迪股份有限公司 | 车辆行驶自动控制方法和装置 |
CN106991389B (zh) * | 2017-03-29 | 2021-04-27 | 蔚来(安徽)控股有限公司 | 确定道路边沿的装置和方法 |
CN110612561A (zh) * | 2017-05-17 | 2019-12-24 | 三菱电机株式会社 | 物体辨识装置、路侧装置以及物体辨识方法 |
US11403865B2 (en) * | 2017-07-25 | 2022-08-02 | Nec Corporation | Number-of-occupants detection system, number-of-occupants detection method, and program |
KR20190012370A (ko) * | 2017-07-27 | 2019-02-11 | 삼성에스디에스 주식회사 | 차선 변경 지원 방법 및 장치 |
US11520340B2 (en) * | 2017-08-10 | 2022-12-06 | Nissan Motor Co., Ltd. | Traffic lane information management method, running control method, and traffic lane information management device |
JP6592852B2 (ja) * | 2017-09-07 | 2019-10-23 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
JP6930394B2 (ja) * | 2017-11-24 | 2021-09-01 | トヨタ自動車株式会社 | 物体認識装置 |
CN111742236B (zh) * | 2018-02-27 | 2023-07-28 | 日立安斯泰莫株式会社 | 噪声消除学习装置以及具备该噪声消除学习装置的车辆 |
JP2019159380A (ja) * | 2018-03-07 | 2019-09-19 | 株式会社デンソー | 物体検知装置、物体検知方法およびプログラム |
WO2020026633A1 (ja) * | 2018-08-02 | 2020-02-06 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
CN109177876A (zh) * | 2018-08-13 | 2019-01-11 | 吉利汽车研究院(宁波)有限公司 | 一种移动物体检测报警系统及方法 |
JP7136035B2 (ja) * | 2018-08-31 | 2022-09-13 | 株式会社デンソー | 地図生成装置及び地図生成方法 |
JP7020353B2 (ja) * | 2018-09-21 | 2022-02-16 | トヨタ自動車株式会社 | 物体検出装置 |
JP7234354B2 (ja) * | 2018-09-30 | 2023-03-07 | グレート ウォール モーター カンパニー リミテッド | 走行座標系の構築方法及びその使用 |
DE102018220803A1 (de) * | 2018-12-03 | 2020-06-04 | Robert Bosch Gmbh | Spurgenaue Lokalisierung von Fahrzeugen |
US10703365B1 (en) | 2018-12-26 | 2020-07-07 | Automotive Research & Testing Center | Lane tracking method and lane tracking system for an autonomous vehicle |
KR20200113915A (ko) * | 2019-03-27 | 2020-10-07 | 주식회사 만도 | 차량 제어 장치 및 방법 |
US11514594B2 (en) | 2019-10-30 | 2022-11-29 | Vergence Automation, Inc. | Composite imaging systems using a focal plane array with in-pixel analog storage elements |
KR20210054944A (ko) * | 2019-11-06 | 2021-05-14 | 현대자동차주식회사 | 차량용 레이더센서의 오차 보정 장치 및 그 방법 |
JP7277349B2 (ja) * | 2019-12-12 | 2023-05-18 | 日立Astemo株式会社 | 運転支援装置、および、運転支援システム |
CN113359147B (zh) * | 2020-03-06 | 2023-08-18 | 宇通客车股份有限公司 | 一种车辆及目标物运动状态的判断方法和装置 |
US11679768B2 (en) | 2020-10-19 | 2023-06-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for vehicle lane estimation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001010433A (ja) * | 1999-05-08 | 2001-01-16 | Daimlerchrysler Ag | 車輌用車線変更誘導装置および方法 |
JP2001357497A (ja) * | 2000-06-13 | 2001-12-26 | Mitsubishi Motors Corp | 後方車両監視装置 |
JP2010127908A (ja) * | 2008-12-01 | 2010-06-10 | Toyota Motor Corp | 物体位置検出装置 |
WO2013098996A1 (ja) * | 2011-12-28 | 2013-07-04 | トヨタ自動車株式会社 | 車両の運転支援装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6123985A (ja) * | 1984-07-13 | 1986-02-01 | Nissan Motor Co Ltd | 車間距離検出装置 |
US6618672B2 (en) * | 1998-10-21 | 2003-09-09 | Yazaki Corporation | Vehicle-applied rear-and-side monitoring system |
JP3885716B2 (ja) * | 2002-11-21 | 2007-02-28 | 日産自動車株式会社 | 車両用推奨操作量生成装置 |
JP4578795B2 (ja) * | 2003-03-26 | 2010-11-10 | 富士通テン株式会社 | 車両制御装置、車両制御方法および車両制御プログラム |
JP4661602B2 (ja) | 2006-01-13 | 2011-03-30 | トヨタ自動車株式会社 | 後方車両解析装置及び衝突予測装置 |
KR101075615B1 (ko) * | 2006-07-06 | 2011-10-21 | 포항공과대학교 산학협력단 | 주행 차량의 운전자 보조 정보 생성 장치 및 방법 |
JP4420011B2 (ja) * | 2006-11-16 | 2010-02-24 | 株式会社日立製作所 | 物体検知装置 |
JP2009143433A (ja) * | 2007-12-14 | 2009-07-02 | Toyota Motor Corp | 車両挙動制御装置 |
KR101071732B1 (ko) * | 2007-12-17 | 2011-10-11 | 현대자동차주식회사 | 차량 주행속도 제어 장치 및 그 방법 |
JP5359516B2 (ja) * | 2008-07-29 | 2013-12-04 | 日産自動車株式会社 | 車両運転支援装置及び車両運転支援方法 |
JP4992959B2 (ja) * | 2009-11-30 | 2012-08-08 | 株式会社デンソー | 衝突回避支援装置、および衝突回避支援プログラム |
JP4990424B2 (ja) * | 2010-04-15 | 2012-08-01 | 三菱電機株式会社 | 走行支援装置 |
JP2012226392A (ja) * | 2011-04-14 | 2012-11-15 | Honda Elesys Co Ltd | 運転支援システム |
WO2012147187A1 (ja) * | 2011-04-27 | 2012-11-01 | トヨタ自動車株式会社 | 周辺車両検出装置 |
JP5743286B2 (ja) * | 2012-12-28 | 2015-07-01 | 富士重工業株式会社 | 車両の運転支援装置 |
US9026356B2 (en) * | 2013-02-22 | 2015-05-05 | Nissan North America, Inc. | Vehicle navigation system and method |
JP6188471B2 (ja) * | 2013-07-26 | 2017-08-30 | アルパイン株式会社 | 車両後側方警報装置、車両後側方警報方法および立体物検出装置 |
-
2015
- 2015-06-03 WO PCT/JP2015/065967 patent/WO2015194371A1/ja active Application Filing
- 2015-06-03 JP JP2016529224A patent/JP6404922B2/ja active Active
- 2015-06-03 CN CN201580028678.3A patent/CN106463064B/zh active Active
- 2015-06-03 EP EP15809777.4A patent/EP3159866B1/en active Active
- 2015-06-03 US US15/307,394 patent/US9934690B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3159866A4 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017134519A (ja) * | 2016-01-26 | 2017-08-03 | トヨタ自動車株式会社 | 車両用衝突回避支援システム |
JP2017198566A (ja) * | 2016-04-28 | 2017-11-02 | 日立オートモティブシステムズ株式会社 | レーンマーカ認識装置、自車両位置推定装置 |
CN110663072B (zh) * | 2017-05-22 | 2022-03-29 | 三菱电机株式会社 | 位置估计装置、位置估计方法以及计算机能读取的存储介质 |
CN110663072A (zh) * | 2017-05-22 | 2020-01-07 | 三菱电机株式会社 | 位置估计装置、位置估计方法以及位置估计程序 |
JP2019046150A (ja) * | 2017-09-01 | 2019-03-22 | 株式会社Subaru | 走行支援装置 |
US10795370B2 (en) | 2017-09-01 | 2020-10-06 | Subaru Corporation | Travel assist apparatus |
JP2020038496A (ja) * | 2018-09-04 | 2020-03-12 | いすゞ自動車株式会社 | 車線特定装置及び車線特定方法 |
JP7230373B2 (ja) | 2018-09-04 | 2023-03-01 | いすゞ自動車株式会社 | 車線特定装置及び車線特定方法 |
JP2020067822A (ja) * | 2018-10-24 | 2020-04-30 | マツダ株式会社 | 障害物認識装置 |
JP7087911B2 (ja) | 2018-10-24 | 2022-06-21 | マツダ株式会社 | 障害物認識装置 |
JP7169873B2 (ja) | 2018-12-27 | 2022-11-11 | 株式会社デンソー | 運転支援装置 |
JP2020107115A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社デンソー | 運転支援装置 |
JP2020203546A (ja) * | 2019-06-14 | 2020-12-24 | 株式会社シマノ | 検出装置、検出方法、生成方法、コンピュータプログラム、および記憶媒体 |
JP2022011941A (ja) * | 2020-06-30 | 2022-01-17 | ダイハツ工業株式会社 | 運転支援装置 |
WO2022254788A1 (ja) * | 2021-06-04 | 2022-12-08 | 日立Astemo株式会社 | 後方白線推定装置、物標認識装置並びに方法 |
WO2023276616A1 (ja) * | 2021-07-02 | 2023-01-05 | 株式会社デンソー | 運転支援装置 |
JP7298063B1 (ja) * | 2022-11-15 | 2023-06-27 | Pciソリューションズ株式会社 | 機械学習認識システム |
Also Published As
Publication number | Publication date |
---|---|
US9934690B2 (en) | 2018-04-03 |
US20170053533A1 (en) | 2017-02-23 |
EP3159866A4 (en) | 2018-02-21 |
EP3159866A1 (en) | 2017-04-26 |
EP3159866B1 (en) | 2022-04-13 |
JP6404922B2 (ja) | 2018-10-17 |
CN106463064A (zh) | 2017-02-22 |
JPWO2015194371A1 (ja) | 2017-04-20 |
CN106463064B (zh) | 2020-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6404922B2 (ja) | 物体認識装置及びそれを用いた車両走行制御装置 | |
EP3361721B1 (en) | Display assistance device and display assistance method | |
JP4420011B2 (ja) | 物体検知装置 | |
JP5389002B2 (ja) | 走行環境認識装置 | |
JP6787157B2 (ja) | 車両制御装置 | |
US11024051B2 (en) | Object detection device | |
EP2775423A2 (en) | Object detection apparatus, vehicle-mounted device control system and carrier medium of program of object detection | |
TW201704067A (zh) | 防撞方法、實現該防撞方法之電腦程式產品及防撞系統 | |
JP2007300181A (ja) | 周辺認識装置、周辺認識方法、プログラム | |
JPH1139596A (ja) | 車外監視装置 | |
JP6354659B2 (ja) | 走行支援装置 | |
JP2007280132A (ja) | 走行誘導障害物検出装置および車両用制御装置 | |
JP2007309799A (ja) | 車載測距装置 | |
JP2019067116A (ja) | 立体物接地判定装置 | |
JP2001195698A (ja) | 歩行者検知装置 | |
JP6204782B2 (ja) | オフロードダンプトラック | |
JP6465919B2 (ja) | 障害物検知システム | |
JP5832850B2 (ja) | 車線監視システム及び車線監視方法 | |
JP2007199932A (ja) | 画像処理装置及びその方法 | |
JP4231883B2 (ja) | 画像処理装置及びその方法 | |
JP6295868B2 (ja) | 車両用表示装置 | |
JP7083768B2 (ja) | 認識装置、車両制御装置、認識方法、およびプログラム | |
JP6812701B2 (ja) | 画像処理装置、移動体機器制御システム、画像処理方法、及びプログラム | |
JP6604052B2 (ja) | 走路境界推定装置及び走路境界推定方法 | |
JP2015179337A (ja) | 画像判定装置、画像処理装置、画像判定プログラム、画像判定方法、移動体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15809777 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016529224 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15307394 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2015809777 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015809777 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |