WO2018198163A1 - Surrounding situation prediction method and device - Google Patents

Surrounding situation prediction method and device

Info

Publication number
WO2018198163A1
WO2018198163A1 (application PCT/JP2017/016185, JP2017016185W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
behavior
prediction
vehicles
likelihood
Prior art date
Application number
PCT/JP2017/016185
Other languages
English (en)
Japanese (ja)
Inventor
芳 方
卓也 南里
翔一 武井
Original Assignee
日産自動車株式会社
Priority date
Filing date
Publication date
Application filed by 日産自動車株式会社
Priority to PCT/JP2017/016185
Publication of WO2018198163A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • Conventionally, a technique is known for calculating the trajectory of another vehicle around the host vehicle and reflecting it in the driving support control of a host vehicle having a driving support function (Patent Document 1).
  • The invention described in Patent Document 1 detects, as a surrounding situation, the trajectories of a plurality of other vehicles around the host vehicle at a place where the host vehicle cannot travel (for example, a construction site), calculates a representative trajectory from the detected trajectories, and controls the host vehicle based on that representative trajectory.
  • However, Patent Document 1 gives no consideration to predicting the road surface condition as part of the surrounding situation of the host vehicle, and such a road surface condition is therefore difficult to predict.
  • The present invention has been made in view of the above problem, and an object thereof is to provide a surrounding situation prediction method and a surrounding situation prediction device capable of predicting the road surface condition as the surrounding situation of the host vehicle.
  • FIG. 1 is a configuration diagram of an ambient situation prediction apparatus according to this embodiment of the present invention.
  • FIG. 2 is a diagram for explaining the prediction intention based on the road structure according to this embodiment of the present invention.
  • FIG. 3A is a diagram for explaining an example when there is a difference between the actual behavior and the prediction intention according to this embodiment of the present invention.
  • FIG. 3B is a diagram for explaining an example when there is a difference between the actual behavior and the prediction intention according to the embodiment of the present invention.
  • FIG. 4A is a diagram illustrating another example in the case where there is a difference between the actual behavior and the prediction intention according to this embodiment of the present invention.
  • FIG. 4B is a diagram illustrating another example in the case where there is a difference between the actual behavior and the prediction intention according to this embodiment of the present invention.
  • FIG. 5A is a diagram illustrating another example in the case where there is a difference between the actual behavior and the prediction intention according to this embodiment of the present invention.
  • FIG. 5B is a diagram for explaining another example in the case where there is a difference between the actual behavior and the prediction intention according to the embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of similarity in behavior of other vehicles according to the present embodiment of the present invention.
  • FIG. 7 is a table for explaining an example of similarity in behavior of other vehicles according to the present embodiment of the present invention.
  • FIG. 8 is a diagram illustrating another example of the similarity in behavior of other vehicles according to the present embodiment of the present invention.
  • FIG. 9 is a diagram illustrating another example of the similarity in behavior of other vehicles according to the present embodiment of the invention.
  • FIG. 10 is a table for explaining another example of the similarity in behavior of other vehicles according to this embodiment of the present invention.
  • FIG. 11 is a table for explaining another example of the similarity in behavior of other vehicles according to this embodiment of the present invention.
  • FIG. 12 is a diagram illustrating another example of similarity in behavior of other vehicles according to this embodiment of the present invention.
  • FIG. 13 is a table for explaining another example of the similarity in behavior of other vehicles according to this embodiment of the present invention.
  • FIG. 14 is a flowchart for explaining an operation example of the ambient situation prediction apparatus according to the present embodiment.
  • FIG. 15 is a flowchart for explaining an operation example of the ambient situation prediction apparatus according to the present embodiment.
  • FIG. 16 is a flowchart for explaining an operation example of the ambient situation prediction apparatus according to the present embodiment.
  • FIG. 17 is a flowchart for explaining an operation example of the ambient situation prediction apparatus according to the present embodiment.
  • FIG. 18 is a diagram for explaining an operation example at the intersection of the surrounding state prediction apparatus according to the present embodiment.
  • FIG. 19 is a flowchart for explaining an operation example at the intersection of the surrounding state prediction apparatus according to the present embodiment.
  • FIG. 20 is a diagram for explaining another operation example of the ambient situation prediction apparatus according to the present embodiment.
  • FIG. 21 is a diagram for explaining another operation example of the ambient situation prediction apparatus according to the present embodiment.
  • The surrounding situation prediction method according to this embodiment detects the behaviors of a plurality of other vehicles around the host vehicle, predicts the surrounding situation of the host vehicle based on those behaviors, and is mainly used in a driving support device that supports driving of the host vehicle.
  • the surrounding situation prediction device includes an object detection device 1, a vehicle position estimation device 2, a map acquisition device 3, and a controller 100.
  • a surrounding situation prediction apparatus is an apparatus mainly used for an automatic driving vehicle having an automatic driving function.
  • the object detection device 1 includes a plurality of different types of object detection sensors that detect objects around the host vehicle such as a laser radar, a millimeter wave radar, and a camera mounted on the host vehicle.
  • the object detection device 1 detects an object around the host vehicle using a plurality of object detection sensors.
  • the object detection device 1 detects other vehicles, motorcycles, bicycles, moving objects including pedestrians, and stationary objects including parked vehicles. For example, the object detection device 1 detects the position, posture (yaw angle), size, speed, acceleration, jerk, deceleration, and yaw rate of a moving object and a stationary object with respect to the host vehicle.
  • the own vehicle position estimation device 2 includes a position detection sensor that measures the absolute position of the own vehicle such as GPS (Global Positioning System) and odometry mounted on the own vehicle.
  • the own vehicle position estimation device 2 uses the position detection sensor to measure the absolute position of the own vehicle, that is, the position, posture, and speed of the own vehicle with respect to a predetermined reference point.
  • the map acquisition device 3 acquires map information indicating the structure of the road on which the host vehicle is traveling.
  • the map information acquired by the map acquisition device 3 includes road structure information such as absolute lane positions, lane connection relationships, and relative position relationships.
  • The map acquisition device 3 may have its own map database storing the map information, or may acquire the map information from an external map data server via cloud computing.
  • the map acquisition apparatus 3 may acquire map information using vehicle-to-vehicle communication and road-to-vehicle communication.
  • The controller 100 predicts the road surface condition as the surrounding situation of the host vehicle based on the detection results from the object detection device 1 and the host vehicle position estimation device 2 and on the information acquired by the map acquisition device 3.
  • the controller 100 is a general-purpose microcomputer including a CPU (Central Processing Unit), a memory, and an input / output unit.
  • a computer program for causing the microcomputer to function as an ambient condition predicting device is installed.
  • the microcomputer functions as a plurality of information processing circuits included in the ambient situation prediction apparatus.
  • Although this embodiment shows an example in which the plurality of information processing circuits included in the surrounding situation prediction device are realized by software, it is also possible to prepare dedicated hardware for executing each item of information processing described below and to construct the information processing circuits from that hardware.
  • a plurality of information processing circuits may be configured by individual hardware.
  • the controller 100 includes a detection integration unit 4, an object tracking unit 5, an in-map position calculation unit 6, a behavior prediction unit 10, an automatic route generation unit 21, and a vehicle control unit 22 as a plurality of information processing circuits.
  • The behavior prediction unit 10 includes a lane determination unit 11, an intention prediction unit 12, a track prediction unit 13, a likelihood calculation unit 14, a behavior storage unit 15, a behavior comparison unit 16, a road surface condition prediction unit 17, and a behavior prediction correction unit 18.
  • The detection integration unit 4 integrates the detection results obtained from each of the object detection sensors of the object detection device 1 and outputs one detection result per object. Specifically, taking into account the error characteristics of each object detection sensor, it calculates the most rational behavior of the object, that is, the behavior with the smallest error, from the behaviors obtained from the individual sensors. By using a known sensor fusion technique, the detection results obtained by the plurality of sensor types are comprehensively evaluated to obtain a more accurate detection result.
  • The object tracking unit 5 tracks each object detected by the detection integration unit 4. Specifically, it verifies the identity of objects across different times (association) from the behaviors of the objects output at those times, and tracks each object based on the association. The behavior of an object output at each time is stored in a memory in the controller 100 and used for the trajectory prediction described later.
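  • A minimal sketch of such an association step is given below: a greedy nearest-neighbor matching between the previous frame's tracks and the current detections. The 3 m gate and the greedy strategy are assumptions made for this sketch; the publication only states that object identity is verified between different times.

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def associate_tracks(previous: Dict[int, Point], current: List[Point],
                     gate_m: float = 3.0) -> Dict[int, Point]:
    """Greedy nearest-neighbor association between last frame's tracks and current detections.

    The 3 m gate and the greedy strategy are illustrative assumptions.
    """
    updated: Dict[int, Point] = {}
    unused = list(current)
    next_id = max(previous, default=0) + 1
    for track_id, (px, py) in previous.items():
        if not unused:
            break
        best = min(unused, key=lambda p: (p[0] - px) ** 2 + (p[1] - py) ** 2)
        if ((best[0] - px) ** 2 + (best[1] - py) ** 2) ** 0.5 <= gate_m:
            updated[track_id] = best
            unused.remove(best)
    for det in unused:                 # detections with no match start new tracks
        updated[next_id] = det
        next_id += 1
    return updated


prev = {1: (0.0, 0.0), 2: (10.0, 3.5)}
now = [(1.2, 0.1), (10.5, 3.4), (30.0, 0.0)]
print(associate_tracks(prev, now))  # {1: (1.2, 0.1), 2: (10.5, 3.4), 3: (30.0, 0.0)}
```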
  • the in-map position calculation unit 6 estimates the position and orientation of the host vehicle on the map from the absolute position of the host vehicle obtained by the host vehicle position estimation device 2 and the map data acquired by the map acquisition device 3.
  • the lane determination unit 11 identifies the host vehicle and the traveling lane of the object on the map using the object information acquired from the object tracking unit 5 and the self-position estimated by the in-map position calculation unit 6.
  • The intention prediction unit 12 predicts all candidate lanes in which an object may travel, based on the information on the traveling lane acquired from the lane determination unit 11 and on the road structure. For example, when the lane in which the object is traveling is a one-lane road, there is only one candidate lane. When it is a two-lane road, there are two candidate lanes: continuing straight in the current lane and moving to the adjacent lane. The intention prediction unit 12 outputs the predicted candidate lanes to the track prediction unit 13.
  • the track prediction unit 13 uses the candidate lane predicted by the intention prediction unit 12 to predict the traveling track when the object has advanced to the candidate lane.
  • the trajectory prediction unit 13 outputs the predicted traveling trajectory to the likelihood calculation unit 14.
  • the lane predicted by the intention prediction unit 12 and the track predicted by the track prediction unit 13 may be referred to as a prediction intention below.
  • the likelihood calculation unit 14 calculates the possibility (probability) that the object travels along the travel track using the travel track predicted by the track prediction unit 13. In the present embodiment, the possibility that an object will travel on the predicted travel path is called likelihood.
  • the likelihood may be expressed by a number, or may be expressed using a relative expression such as high or low.
  • the likelihood calculating unit 14 also calculates the amount of change in likelihood during a predetermined time.
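  • As a concrete illustration of the above, the following Python sketch shows one way the candidate travel tracks, their likelihoods, and the change in likelihood over a predetermined time could be represented. All class, field, and method names are assumptions made for this sketch and do not come from the publication.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CandidateTrajectory:
    """One travel track an observed vehicle may follow (e.g. straight, lane change)."""
    name: str
    likelihood: float  # probability-like score in [0, 1]


@dataclass
class LikelihoodTracker:
    """Keeps per-trajectory likelihoods and their change over a fixed number of updates."""
    candidates: Dict[str, CandidateTrajectory]
    history: Dict[str, List[float]] = field(default_factory=dict)

    def update(self, name: str, new_likelihood: float) -> None:
        # Remember the previous value before overwriting it.
        self.history.setdefault(name, []).append(self.candidates[name].likelihood)
        self.candidates[name].likelihood = new_likelihood

    def change_over(self, name: str, steps: int) -> float:
        """Change in likelihood during the last `steps` updates (0.0 if not enough data)."""
        past = self.history.get(name, [])
        if len(past) < steps:
            return 0.0
        return self.candidates[name].likelihood - past[-steps]


# Example values taken from the FIG. 2 scene (other vehicle M1 approaching an intersection).
tracker = LikelihoodTracker(candidates={
    "straight":    CandidateTrajectory("straight", 0.8),
    "lane_change": CandidateTrajectory("lane_change", 0.5),
    "turn_right":  CandidateTrajectory("turn_right", 0.3),
    "turn_left":   CandidateTrajectory("turn_left", 0.3),
})
tracker.update("lane_change", 0.7)                      # M1 starts drifting toward the left lane
print(round(tracker.change_over("lane_change", 1), 2))  # 0.2
```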
  • the behavior storage unit 15 stores the behavior of the object on the map using the behavior of the object obtained by the detection integration unit 4.
  • the behavior comparison unit 16 determines whether or not the object has moved as predicted using the behavior of the object stored in the behavior storage unit 15 and the likelihood calculated by the likelihood calculation unit 14.
  • the road surface state prediction unit 17 predicts the road surface state around the vehicle based on the result determined by the behavior comparison unit 16.
  • the behavior prediction correction unit 18 corrects the behavior prediction (likelihood) of the object following the rear based on the road surface state predicted by the road surface state prediction unit 17.
  • Since the road surface condition prediction unit 17 can predict the road surface condition in advance, the host vehicle can avoid sudden changes in behavior such as hard braking and abrupt steering. This suppresses the discomfort given to occupants of the host vehicle and of other vehicles, and contributes to a smooth traffic flow.
  • the traveling scene shown in FIG. 2 is a scene in which the host vehicle M0 is traveling behind the other vehicle M1 approaching the intersection.
  • the intention prediction unit 12 predicts all candidate lanes that the other vehicle M1 may travel.
  • As shown in FIG. 2, there are four candidate lanes in which the other vehicle M1 may travel: going straight, changing to the left lane, turning right at the intersection, and turning left at the intersection.
  • the track prediction unit 13 uses the candidate lane predicted in this way to predict the travel tracks 30 to 33 when the other vehicle M1 travels to the candidate lane.
  • the traveling track 30 is a traveling track on which the other vehicle M1 goes straight.
  • the travel track 31 is a travel track in which the other vehicle M1 changes lanes to the left lane.
  • the traveling track 32 is a traveling track in which the other vehicle M1 turns right at the intersection.
  • the travel track 33 is a travel track in which the other vehicle M1 turns left at the intersection.
  • The likelihood calculation unit 14 calculates, using the travel tracks 30 to 33 predicted by the track prediction unit 13, the likelihood that the other vehicle M1 travels along each of them. As shown in FIG. 2, the likelihood of traveling along the travel track 30 is 0.8, along the travel track 31 is 0.5, along the travel track 32 is 0.3, and along the travel track 33 is 0.3. A larger likelihood value means the corresponding behavior is more probable. In FIG. 2, therefore, the likelihood calculation unit 14 determines that the other vehicle M1 is most likely to continue straight ahead.
  • The likelihood calculation unit 14 calculates the likelihood based on the vehicle speed of the other vehicle M1, its position with respect to the center line, its yaw angle, the state of its turn signals, and the road structure. The likelihood calculation unit 14 may also take into account the behavior of other vehicles besides the other vehicle M1, the presence or absence of pedestrians, and so on.
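  • The following sketch shows one hedged way such cues could be combined into a lane-change score. The individual weights and the scoring rule are assumptions; the publication only names the cues (speed, position relative to the center line, yaw angle, turn signal state, road structure).

```python
import math


def lane_change_likelihood(speed_mps: float,
                           offset_from_center_m: float,
                           yaw_angle_deg: float,
                           blinker_on: bool,
                           adjacent_lane_exists: bool) -> float:
    """Heuristic score in [0, 1] that the observed vehicle will change lanes.

    The cue weights below are illustrative assumptions, not values from the publication.
    """
    if not adjacent_lane_exists:
        return 0.0
    score = 0.0
    score += 0.4 if blinker_on else 0.0                           # turn signal is the strongest cue
    score += min(abs(offset_from_center_m) / 1.5, 1.0) * 0.3      # drifting toward the lane edge
    score += min(abs(yaw_angle_deg) / 10.0, 1.0) * 0.2            # heading away from the lane axis
    score += 0.1 * (1.0 / (1.0 + math.exp(speed_mps - 20.0)))     # slower vehicles change more easily
    return min(score, 1.0)


print(lane_change_likelihood(12.0, 0.9, 4.0, True, True))   # high score (about 0.76)
print(lane_change_likelihood(25.0, 0.1, 0.5, False, True))  # low score (about 0.03)
```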
  • the behavior comparison unit 16 determines that there is a difference between the actual behavior of the other vehicle M1 and the prediction intention.
  • the difference between the actual behavior and the prediction intention means a case where an actual other vehicle travels along a travel path having a low likelihood.
  • the other vehicle M2 will be described as a vehicle that travels behind the other vehicle M1.
  • the other vehicle M3 will be described as a vehicle that travels behind the other vehicle M2.
  • the other vehicle M9 will be described as a vehicle that travels in front of the host vehicle M0.
  • the difference between the actual behavior and the prediction intention is described as a case where an actual other vehicle travels along a travel path having a low likelihood, but is not limited thereto.
  • the trajectory prediction unit 13 predicts that the pedestrian 40 moves along the trajectory 41.
  • the behavior comparison unit 16 determines that there is a difference between the actual behavior and the prediction intention. That is, even when there is a difference between the actual behavior and the predicted trajectory, it can be expressed that there is a difference between the actual behavior and the prediction intention.
  • Although FIGS. 4A and 4B describe the trajectory of the pedestrian 40, the same applies to the trajectory of a vehicle. That is, when there is a difference between the travel track predicted by the track prediction unit 13 and the actual travel track of the vehicle, the behavior comparison unit 16 determines that there is a difference between the actual behavior and the prediction intention.
  • the likelihood calculating unit 14 predicts that the other vehicle M2 travels at the same speed as the other vehicle M1.
  • the bump 50 is installed on the road, but it is assumed that the host vehicle M0 has not detected the bump 50.
  • the bump 50 is a deceleration zone.
  • As shown in FIG. 5B, when the other vehicle M2 decelerates to pass over the bump 50, the actual speed of the other vehicle M2 differs from the predicted speed. Thus, even when there is a difference between the actual speed and the predicted speed, it can be said that there is a difference between the actual behavior and the prediction intention.
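  • The three kinds of difference described above (travel along a low-likelihood track, a large trajectory deviation, a large speed deviation) could be checked as in the sketch below; the tolerance values are assumptions made for this sketch.

```python
from typing import List, Tuple

Point = Tuple[float, float]


def max_deviation(predicted: List[Point], actual: List[Point]) -> float:
    """Largest point-wise distance between predicted and actual paths (same sampling assumed)."""
    return max(((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
               for (px, py), (ax, ay) in zip(predicted, actual))


def differs_from_prediction(likelihood_of_observed_track: float,
                            predicted_path: List[Point], actual_path: List[Point],
                            predicted_speed: float, actual_speed: float,
                            low_likelihood: float = 0.3,
                            path_tol_m: float = 1.0,
                            speed_tol_mps: float = 3.0) -> bool:
    """True when the observed behavior counts as 'different from the prediction intention'.

    The three tolerances are assumptions; the publication only names the three kinds of difference.
    """
    if likelihood_of_observed_track <= low_likelihood:
        return True
    if max_deviation(predicted_path, actual_path) > path_tol_m:
        return True
    return abs(predicted_speed - actual_speed) > speed_tol_mps


# Example: a following vehicle drifts and slows hard over an undetected bump (cf. FIG. 5B).
pred = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
act = [(0.0, 0.0), (9.0, 0.2), (17.0, 0.3)]
print(differs_from_prediction(0.8, pred, act, predicted_speed=14.0, actual_speed=6.0))  # True
```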
  • The likelihood calculation unit 14 calculates the likelihood that the other vehicle M1 travels along the travel track 30 as 0.8, and the likelihood that the other vehicle M1 travels along the travel track 31 as 0.2.
  • the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M2 travels along the travel track 30 as 0.8, and the likelihood that the other vehicle M2 travels along the travel track 31 is 0.2. calculate.
  • At time T + 2, the road surface condition prediction unit 17 determines that there is a possibility that a fallen object 60 is present in front of the other vehicle M3, and evaluates this possibility as, for example, 40%.
  • Based on this possibility, the behavior prediction correction unit 18 corrects the likelihood that the other vehicle M3 travels along the travel track 30 and the likelihood that it travels along the travel track 31. The reason is that, if the fallen object 60 is present, the other vehicle M3 will also travel so as to avoid it, which increases the possibility of traveling along the travel track 31. Specifically, the behavior prediction correction unit 18 adds ΔL1 (for example, 0.3) to the likelihood that the other vehicle M3 travels along the travel track 31, correcting the likelihood for the travel track 30 from 0.8 to 0.5 and the likelihood for the travel track 31 from 0.2 to 0.5.
  • At time T + 3, the road surface condition prediction unit 17 determines that there is a high possibility that the fallen object 60 is present in front of the other vehicle M4, and evaluates this possibility as, for example, 80%. This is because three vehicles have exhibited similar behavior. In this way, the road surface condition prediction unit 17 increases the possibility (likelihood) of the fallen object 60 as the number of other vehicles exhibiting the predetermined similarity increases.
  • Based on this possibility, the behavior prediction correction unit 18 corrects the likelihood that the other vehicle M4 travels along the travel track 30 and the likelihood that it travels along the travel track 31.
  • Specifically, the behavior prediction correction unit 18 corrects the likelihood that the other vehicle M4 travels along the travel track 30 from 0.5 to 0.1, and the likelihood that it travels along the travel track 31 from 0.5 to 0.9.
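  • The numeric corrections described above can be reproduced with a small helper like the one below. The 0.3 shift after two similar vehicles and the 0.1/0.9 split after three follow the examples in the description; the clamping to [0, 1] and the mapping beyond three vehicles are assumptions.

```python
def correct_pair(straight: float, lane_change: float, delta: float) -> tuple:
    """Shift `delta` of likelihood from the straight travel track to the lane-change track."""
    return max(straight - delta, 0.0), min(lane_change + delta, 1.0)


def delta_for_similar_count(n_similar: int) -> float:
    """Illustrative mapping from the number of similarly behaving vehicles to the shift.

    Delta = 0.3 after two similar vehicles and an overall 0.1/0.9 split after three
    follow the description; everything beyond that is an assumption.
    """
    return {0: 0.0, 1: 0.0, 2: 0.3, 3: 0.7}.get(n_similar, 0.8)


straight, lane_change = 0.8, 0.2
s2, c2 = correct_pair(straight, lane_change, delta_for_similar_count(2))
print(round(s2, 2), round(c2, 2))  # 0.5 0.5  (two similar vehicles, cf. time T + 2)
s3, c3 = correct_pair(straight, lane_change, delta_for_similar_count(3))
print(round(s3, 2), round(c3, 2))  # 0.1 0.9  (three similar vehicles, cf. time T + 3)
```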
  • In this way, the behavior comparison unit 16 detects the similarity of the behaviors of the plurality of other vehicles M1 to M3 (in the example shown in FIG. 6), and based on this similarity the road surface condition prediction unit 17 can predict the road surface condition (the presence or absence of the fallen object 60) as the surrounding situation of the host vehicle M0.
  • the automatic route generation unit 21 can generate a route in consideration of the road surface condition, and the vehicle control unit 22 can perform driving support suitable for the road surface condition.
  • The behavior prediction correction unit 18 corrects the predicted behavior of the following vehicle based on the road surface condition predicted by the road surface condition prediction unit 17, and the road surface condition prediction unit 17 then predicts the road surface condition again based on the similarity of the corrected behaviors. The behavior prediction correction unit 18 and the road surface condition prediction unit 17 repeat this cycle, which allows the road surface condition prediction unit 17 to estimate the road surface condition accurately.
  • the object detection device 1 detects the speeds of the other vehicles M1 to M4.
  • the likelihood calculating unit 14 predicts that the other vehicles M1 to M4 pass at the speed detected by the object detection device 1.
  • the speed detected by the object detection device 1 at time T1 is called, for example, a first speed.
  • the behavior prediction correction unit 18 predicts that the other vehicle M2 passes at the first speed.
  • the road surface condition prediction unit 17 determines that there is a possibility that there is a bump 50 in front of the other vehicle M3. Note that at the time T + 2, the road surface condition prediction unit 17 determines that the possibility is, for example, 40%. Based on this possibility, the behavior prediction correction unit 18 corrects the speed at which the other vehicle M3 passes to a speed lower than the first speed. The reason is that when there is the bump 50, the possibility that the other vehicle M3 decelerates becomes high.
  • At time T + 3, the road surface condition prediction unit 17 determines that there is a high possibility that a bump 50 is present in front of the other vehicle M4, and evaluates this possibility as, for example, 80%. In this way, based on the similarity of the decelerations of the other vehicles M1 to M3, the road surface condition prediction unit 17 can predict the presence or absence of the bump 50 as the road surface condition around the host vehicle M0 even when the road surface condition is difficult to confirm.
  • The road surface condition prediction unit 17 may also predict the road surface condition based on the similarity of the accelerations of other vehicles.
  • the road surface state prediction unit 17 can predict the road surface state even when it is difficult to confirm the road surface state.
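  • One possible way to turn the deceleration similarity into a bump probability is sketched below. The deceleration threshold and the intermediate values are assumptions; the 40% and 80% figures for two and three similar vehicles follow the examples in the description.

```python
from typing import Sequence


def bump_probability(speed_drops_mps: Sequence[float], drop_threshold: float = 3.0) -> float:
    """Probability-like score that a bump (deceleration zone) exists ahead.

    `speed_drops_mps` holds, per passing vehicle, how much it slowed at the same spot.
    The threshold and the values other than 40% / 80% are illustrative assumptions.
    """
    n_similar = sum(1 for drop in speed_drops_mps if drop >= drop_threshold)
    table = {0: 0.0, 1: 0.1, 2: 0.4, 3: 0.8}
    return table.get(n_similar, 0.9)


# Vehicles M1 to M3 each shed about 5 m/s at the same point; the speed predicted for M4
# would then be corrected to a value lower than the "first speed" measured for M1.
print(bump_probability([5.2, 4.8, 5.0]))  # 0.8
```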
  • the driving scenes at time T, time T + 1, and time T + 2 are the same as those in FIG.
  • the behavior comparison unit 16 compares the vehicle type of the other vehicle M3 with the vehicle types of the other vehicle M1 and the other vehicle M2.
  • If the other vehicle M3 is a truck whose chassis is higher than those of the other vehicle M1 and the other vehicle M2, the other vehicle M3 may be able to pass without avoiding the fallen object 60.
  • the road surface condition prediction unit 17 determines that the possibility that there is a falling object 60 in front of the other vehicle M4 is 60%.
  • The behavior prediction correction unit 18 adds ΔL2 to the likelihood that the other vehicle M4 travels along the travel track 31. ΔL2 is a value smaller than ΔL1, for example 0.2.
  • The reason why the behavior prediction correction unit 18 adds ΔL2, which is smaller than ΔL1, is that although the possibility of the fallen object 60 is high, the case where there is no fallen object 60 must also be considered.
  • The behavior prediction correction unit 18 corrects the likelihood that the other vehicle M4 travels along the travel track 30 from 0.5 to 0.3, and the likelihood that it travels along the travel track 31 from 0.5 to 0.7.
  • At time T + 4, the road surface condition prediction unit 17 determines that there is a high possibility that the fallen object 60 is present in front of the other vehicle M5, and evaluates this possibility as, for example, 80%.
  • The behavior prediction correction unit 18 corrects the likelihood that the other vehicle M5 travels along the travel track 30 from 0.3 to 0.1, and the likelihood that it travels along the travel track 31 from 0.7 to 0.9.
  • In this way, based on the similarity of the travel tracks of the plurality of other vehicles M1, M2, and M4, the road surface condition prediction unit 17 can predict the presence or absence of the fallen object 60 as the road surface condition around the host vehicle even when the road surface condition is difficult to confirm.
  • The behavior prediction correction unit 18 may also subtract ΔL1 from the likelihood of the fourth vehicle. This is because, in the example shown in FIG. 11, the third and fourth vehicles continue to go straight, so the possibility of a fallen object in front of the fifth vehicle is reduced.
  • As shown in FIG. 12, at time T a pedestrian 40 is near the other vehicle M1. However, since there is some distance between the pedestrian 40 and the other vehicle M1, the likelihood calculation unit 14 calculates the likelihood that the other vehicle M1 travels along the travel track 30 as 0.8 and the likelihood that it travels along the travel track 31 as 0.2.
  • The likelihood calculation unit 14 calculates the likelihood that the other vehicle M1 travels along the travel track 30 as 0.8, and the likelihood that the other vehicle M1 travels along the travel track 31 as 0.2.
  • the road surface condition prediction unit 17 determines that there is a possibility that there is a puddle 70 near the other vehicle M3.
  • the road surface condition prediction unit 17 determines the possibility as 40%, for example.
  • Based on this possibility, the behavior prediction correction unit 18 adds ΔL1 to the likelihood that the other vehicle M3 travels along the travel track 31.
  • the behavior prediction correction unit 18 corrects the likelihood that the other vehicle M3 travels along the travel track 30 from 0.8 to 0.5, and the likelihood that the other vehicle M3 travels along the travel track 31 is Correct from 0.2 to 0.5.
  • The behavior comparison unit 16 determines whether the other vehicles M1 to M3 decelerated when passing and whether the pedestrian 40 was near the other vehicles M1 to M3. As shown in FIG. 13, the other vehicles M1 and M2 did not decelerate when passing, whereas the other vehicle M3 decelerated. In addition, the pedestrian 40 was near the other vehicles M1 and M2, but not near the other vehicle M3. Therefore, the reason why the other vehicles M1 and M2 changed lanes is considered to be to avoid the pedestrian 40.
  • Therefore, the behavior prediction correction unit 18 subtracts ΔL3 from the likelihood that the other vehicle M4 travels along the travel track 31.
  • ΔL3 is a value smaller than ΔL1, for example 0.1.
  • The behavior prediction correction unit 18 thus corrects the likelihood that the other vehicle M4 travels along the travel track 30 from 0.5 to 0.6, and the likelihood that it travels along the travel track 31 from 0.5 to 0.4.
  • At time T + 4, the road surface condition prediction unit 17 determines that there is a high possibility that the puddle 70 is present near the other vehicle M5, and evaluates this possibility as, for example, 80%. Moreover, since the pedestrian 40 is near the other vehicle M5 at time T + 4, the other vehicle M5 may change lanes to avoid the pedestrian 40, as the other vehicles M1 and M2 did.
  • The behavior prediction correction unit 18 therefore corrects the likelihood that the other vehicle M5 travels along the travel track 30 from 0.6 to 0.2, and the likelihood that it travels along the travel track 31 from 0.4 to 0.8.
  • In this way, based on the similarity of the travel tracks of the other vehicles M1 and M2 and the similarity of the travel tracks and decelerations of the other vehicles M3 and M4, the road surface condition prediction unit 17 can predict the presence or absence of the puddle 70 as the road surface condition around the host vehicle even when the road surface condition is difficult to confirm.
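  • A simplified sketch of this disambiguation is given below. It attributes a lane change or deceleration to the puddle only when no pedestrian was nearby; this attribution rule and the probability scale are assumptions that deliberately simplify the staged 40% / 80% updates in the description.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class PassObservation:
    changed_lane: bool
    decelerated: bool
    pedestrian_nearby: bool


def puddle_evidence(observations: List[PassObservation]) -> float:
    """Count only behavior that a pedestrian cannot explain as evidence of the puddle.

    A lane change with a pedestrian nearby is treated as pedestrian avoidance; a lane
    change or deceleration with no pedestrian nearby is treated as puddle evidence.
    The scale (40% / 80%) follows the description; the rule itself is an assumption.
    """
    unexplained = sum(
        1 for o in observations
        if (o.decelerated or o.changed_lane) and not o.pedestrian_nearby
    )
    return {0: 0.0, 1: 0.4, 2: 0.8}.get(unexplained, 0.9)


history = [
    PassObservation(changed_lane=True, decelerated=False, pedestrian_nearby=True),   # M1
    PassObservation(changed_lane=True, decelerated=False, pedestrian_nearby=True),   # M2
    PassObservation(changed_lane=False, decelerated=True, pedestrian_nearby=False),  # M3
    PassObservation(changed_lane=False, decelerated=True, pedestrian_nearby=False),  # M4
]
print(puddle_evidence(history))  # 0.8
```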
  • In step S101, the object detection device 1 detects objects (for example, other vehicles) around the host vehicle using the plurality of object detection sensors.
  • The process then proceeds to step S102, where the detection integration unit 4 integrates the detection results obtained from each of the object detection sensors and outputs one detection result per other vehicle.
  • The object tracking unit 5 tracks each detected and integrated other vehicle.
  • In step S103, the host vehicle position estimation device 2 measures the absolute position of the host vehicle using the position detection sensor.
  • In step S104, the map acquisition device 3 acquires map information indicating the structure of the road on which the host vehicle travels.
  • In step S105, the in-map position calculation unit 6 estimates the position and orientation of the host vehicle on the map from the absolute position measured in step S103 and the map data acquired in step S104.
  • In step S106, the behavior prediction unit 10 predicts the behavior of the other vehicles; the details of this processing are described below.
  • In step S107, the road surface condition prediction unit 17 predicts the road surface condition around the host vehicle; the details of this processing are described below.
  • In step S108, the automatic route generation unit 21 regenerates the route to the destination entered in advance by the occupant, based on the road surface condition around the host vehicle.
  • In step S109, the vehicle control unit 22 controls the various actuators of the host vehicle (steering actuator, accelerator pedal actuator, brake actuator, and so on), using information from the various sensors, so that the host vehicle automatically travels along the route regenerated by the automatic route generation unit 21.
  • In step S110, the surrounding situation prediction device determines whether or not the ignition switch is off. If the ignition switch is on (No in step S110), the process returns to step S101; if it is off (Yes in step S110), the surrounding situation prediction device ends the series of processes.
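  • The overall flow of steps S101 to S110 could be sketched as the following loop. Every object and method name here is a placeholder standing in for the corresponding device or unit described above, not an API defined by the publication.

```python
import time


def run_surrounding_prediction_loop(sensors, ego_locator, map_source, controller,
                                    cycle_s: float = 0.1) -> None:
    """Structural sketch of the S101-S110 loop; all collaborators are placeholders."""
    while not controller.ignition_off():                                  # S110
        objects = sensors.detect_and_integrate()                          # S101-S102
        ego_pose = ego_locator.estimate()                                 # S103
        road_map = map_source.acquire(ego_pose)                           # S104
        ego_on_map = map_source.localize(ego_pose, road_map)              # S105
        behaviors = controller.predict_behaviors(objects, ego_on_map, road_map)  # S106
        road_state = controller.predict_road_surface(behaviors)           # S107
        route = controller.regenerate_route(road_state)                   # S108
        controller.drive_along(route)                                     # S109
        time.sleep(cycle_s)                                               # assumed fixed cycle
```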
  • step S201 the lane determination unit 11 determines the travel lane of the other vehicle on the map using the other vehicle information acquired from the object tracking unit 5.
  • In step S202, the lane determination unit 11 determines whether there are lanes to the left and right of the other vehicle. If there are lanes to the left and right of the other vehicle (Yes in step S202), the process proceeds to step S203. If there are no lanes to the left or right of the other vehicle (No in step S202), the process proceeds to step S208.
  • step S203 the intention prediction unit 12 predicts an intention that the other vehicle may change lanes as one of the prediction intentions of the other vehicle.
  • the intention prediction unit 12 predicts a lane adjacent to the travel lane as a candidate lane on which another vehicle may travel.
  • the track prediction unit 13 generates a track when another vehicle changes lanes based on the intention generated by the intention prediction unit 12.
  • The map acquisition device 3 extracts lane information ahead of the other vehicle.
  • If there are a plurality of lanes ahead of the other vehicle (Yes in step S206), the process proceeds to step S207.
  • Here, a plurality of lanes ahead of the other vehicle means a plurality of lanes that intersect the lane in which the other vehicle is currently traveling.
  • In step S207, the object tracking unit 5 calculates the angle between the lane in which the other vehicle was traveling a certain distance before and the lane in which it is currently traveling. If the two lanes are the same, the angle is almost 0 degrees; if they are different, the angle depends on that certain distance: when the certain distance is large the angle is small, and when it is small the angle is large. On the other hand, when there are not a plurality of lanes ahead (No in step S206), the process proceeds to step S208.
  • If the angle calculated in step S207 is larger than a threshold (Yes in step S209), the process proceeds to step S211.
  • The case where the angle calculated in step S207 is larger than the threshold is the case where the certain distance described above is small. That is, not much time has passed since the other vehicle changed lanes, and there is a lane ahead that intersects the lane in which the other vehicle is currently traveling, so the other vehicle may turn right or left. Therefore, in steps S211 and S212, the intention prediction unit 12 predicts, as one of the prediction intentions of the other vehicle, an intention that the other vehicle may turn right or left; in other words, it predicts a right-turn or left-turn lane as a candidate lane in which the other vehicle may travel.
  • step S208 the intention prediction unit 12 predicts an intention that the other vehicle may go straight as one of the prediction intentions of the other vehicle.
  • the intention prediction unit 12 predicts a lane that travels straight on the traveling lane as a candidate lane on which another vehicle may travel.
  • the object tracking unit 5 calculates an offset amount.
  • the offset amount is a shift in the position of the other vehicle with respect to the center of the traveling lane.
  • the process proceeds to step S214, and the trajectory prediction unit 13 uses the candidate lane predicted by the intention prediction unit 12 to generate a travel trajectory when another vehicle travels to the candidate lane.
  • the process proceeds to step S215, and the likelihood calculating unit 14 calculates the likelihood that the other vehicle travels on the traveling track predicted in step S214.
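  • The candidate-lane enumeration of steps S201 to S212 could be sketched as follows. The Lane fields are assumptions; the description only states that the current lane, adjacent lanes, and lanes branching ahead are considered.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Lane:
    lane_id: str
    left_neighbor: str = ""
    right_neighbor: str = ""
    branches_ahead: List[str] = field(default_factory=list)  # e.g. turn lanes at a junction


def candidate_lanes(current: Lane) -> List[str]:
    """Enumerate lanes the observed vehicle may take (cf. steps S203, S208, S211-S212)."""
    candidates = [current.lane_id]                 # going straight is always a candidate
    if current.left_neighbor:
        candidates.append(current.left_neighbor)   # possible lane change to the left
    if current.right_neighbor:
        candidates.append(current.right_neighbor)  # possible lane change to the right
    candidates.extend(current.branches_ahead)      # possible right or left turn ahead
    return candidates


lane = Lane("main_1", left_neighbor="main_2", branches_ahead=["turn_right", "turn_left"])
print(candidate_lanes(lane))  # ['main_1', 'main_2', 'turn_right', 'turn_left'] (cf. FIG. 2)
```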
  • In step S301 shown in FIG. 16, the object tracking unit 5 acquires the actual behavior of the other vehicle and outputs the acquired behavior to the behavior storage unit 15.
  • the behavior comparison unit 16 compares the actual behavior of the other vehicle stored in the behavior storage unit 15 with the behavior of the other vehicle predicted by the likelihood calculation unit 14.
  • the process proceeds to step S302, and when the other vehicle behaves with a low likelihood (Yes in step S302), the process proceeds to step S401.
  • Here, a behavior with a low likelihood may be the behavior with the lowest likelihood among those calculated by the likelihood calculation unit 14, or a behavior whose likelihood is lower than a predetermined value.
  • Alternatively, a behavior with a low likelihood may be any behavior other than the one with the highest likelihood among those calculated by the likelihood calculation unit 14.
  • Otherwise (No in step S302), the process proceeds to step S303.
  • In step S303, when the difference between the predicted trajectory and the actual trajectory is large (Yes in step S303), the process proceeds to step S401.
  • When the difference between the predicted trajectory and the actual trajectory is small (No in step S303), the process proceeds to step S304. If the difference between the predicted speed and the actual speed is large (Yes in step S304), the process proceeds to step S401.
  • If the difference between the predicted speed and the actual speed is small (No in step S304), the road surface condition prediction process ends.
  • step S401 shown in FIG. 17 the behavior comparison unit 16 determines whether there are two other vehicles that behave the same. If there are two other vehicles that behave the same (Yes in step S401), the process proceeds to step S402. On the other hand, if there is no other vehicle that behaves the same (No in step S401), the process waits.
  • step S402 the behavior prediction correction unit 18 corrects the prediction result of the behavior of the third other vehicle.
  • step S403 the behavior comparison unit 16 determines whether the third other vehicle has the same behavior as the previous two other vehicles.
  • If the third other vehicle has behaved in the same way (Yes in step S403), the process proceeds to step S404, and the behavior prediction correction unit 18 corrects the prediction result of the behavior of the fourth other vehicle.
  • If the third other vehicle has not behaved in the same way (No in step S403), the process proceeds to step S405, and the behavior comparison unit 16 compares the vehicle type of the third other vehicle with the vehicle types of the previous two other vehicles. If the vehicle type of the third other vehicle differs from that of the previous two other vehicles (Yes in step S405), the process proceeds to step S406, and the behavior comparison unit 16 compares the vehicle type of the fourth other vehicle with the vehicle types of the previous three other vehicles. The process then proceeds to step S407, and the behavior prediction correction unit 18 corrects the prediction result of the behavior of the fourth other vehicle based on the result of step S406.
  • step S408 the behavior comparison unit 16 compares the vehicle type of the fifth other vehicle with the vehicle type of the previous four other vehicles.
  • the process proceeds to step S409, and the behavior prediction correction unit 18 corrects the prediction result of the behavior of the fifth other vehicle based on the result of step S408. Thereafter, the process proceeds to step S415.
  • If the vehicle types do not differ (No in step S405), the process proceeds to step S410, and the behavior comparison unit 16 determines whether there was a pedestrian on the sidewalk when the previous two vehicles passed.
  • the behavior comparison part 16 should just refer the detection result of the object detection apparatus 1 about the presence or absence of a pedestrian.
  • If there was a pedestrian on the sidewalk when the previous two vehicles passed (Yes in step S410), the process proceeds to step S411, and the behavior comparison unit 16 determines whether the pedestrian had left by the time the third other vehicle passed. If the pedestrian had left when the third other vehicle passed (Yes in step S411), the process proceeds to step S412, and the behavior comparison unit 16 determines whether the third other vehicle decelerated. If the third other vehicle decelerated (Yes in step S412), the process proceeds to step S413, and the behavior comparison unit 16 determines whether it is raining.
  • Following the determination in step S413, the process proceeds to step S414, and the behavior prediction correction unit 18 corrects the behavior prediction results of the third and subsequent other vehicles according to the presence or absence of a pedestrian. Thereafter, the process proceeds to step S415.
  • step S415 the road surface state prediction unit 17 predicts the road surface state based on the similarity in behavior of other vehicles.
  • the rightmost lane is a right turn lane.
  • the likelihood calculating unit 14 calculates the likelihood that the other vehicle M1 travels along the traveling track 30 as 0.6.
  • the likelihood calculating unit 14 calculates the likelihood that the other vehicle M1 travels along the traveling track 31 as 0.2, and the likelihood that the other vehicle M1 travels along the traveling track 32 is 0. Calculate as 2.
  • the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M2 travels along the travel track 30 as 0.6. In addition, the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M2 travels along the travel track 31 as 0.2, and the likelihood that the other vehicle M2 travels along the travel track 32 is 0. Calculate as 2.
  • the behavior prediction correction unit 18 does not correct the likelihood of the other vehicle M3.
  • the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M3 travels along the travel track 30 as 0.6. Further, the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M3 travels along the travel track 31 as 0.2, and the likelihood that the other vehicle M3 travels along the travel track 32 is 0. Calculate as 2.
  • the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M4 travels along the travel track 30 as 0.6. Further, the behavior prediction correction unit 18 calculates the likelihood that the other vehicle M4 travels along the travel track 31 as 0.2, and the likelihood that the other vehicle M4 travels along the travel track 32 is 0. Calculate as 2.
  • the behavior prediction correction unit 18 may not correct the likelihood in the rear vehicle according to the similarity between the structure of the intersection (the presence or absence of a right turn dedicated lane) and the behavior of the other vehicle. On the other hand, the behavior prediction correction unit 18 may correct the likelihood in the rear vehicle according to the similarity between the structure of the intersection and the behavior of the other vehicle. This point will be described with reference to the flowchart of FIG.
  • step S501 the in-map position calculation unit 6 determines whether or not the self position is near an intersection. Specifically, the in-map position calculation unit 6 determines that the self position is near the intersection when the distance from the self position to the intersection is within a predetermined distance.
  • The predetermined distance is not particularly limited, but is, for example, 50 m. If the host vehicle's position is near the intersection (Yes in step S501), the process proceeds to step S502, and the in-map position calculation unit 6 refers to the map information acquired by the map acquisition device 3 and determines whether the intersection has a dedicated right-turn or left-turn lane.
  • If there is a dedicated right-turn or left-turn lane (Yes in step S502), the process proceeds to step S503, and the behavior comparison unit 16 determines whether the other vehicle has changed lanes into the dedicated right-turn or left-turn lane.
  • If the other vehicle has changed lanes into the dedicated lane (Yes in step S503), the process proceeds to step S505, and the behavior prediction correction unit 18 does not correct the prediction results for the following vehicles. This is because the other vehicle will turn right or left without returning to its original lane.
  • If the other vehicle has not changed lanes into the dedicated lane (No in step S503), the process proceeds to step S506, and the behavior comparison unit 16 determines whether the other vehicle has exited from the right-turn or left-turn dedicated lane.
  • step S507 the behavior comparison unit 16 determines whether or not the other vehicle travels straight.
  • If the other vehicle goes straight ahead (Yes in step S507), the process proceeds to step S505. If there is no dedicated right-turn or left-turn lane (No in step S502), the process proceeds to step S504, and the behavior comparison unit 16 determines whether the other vehicle turned right or left immediately after its lane change. If the other vehicle turned right or left immediately after the lane change (Yes in step S504), the process proceeds to step S505. If the other vehicle did not turn right or left immediately after the lane change (No in step S504), if the other vehicle has not exited the right-turn or left-turn dedicated lane (No in step S506), or if the other vehicle is not traveling straight (No in step S507), the process proceeds to step S508.
  • step S508 the behavior comparison unit 16 determines whether the other vehicle has returned to the original lane after the lane change.
  • If the other vehicle has returned to its original lane after the lane change (Yes in step S508), the process proceeds to step S509, where the road surface condition prediction unit 17 predicts the road surface condition based on the similarity of the behaviors of the other vehicles, and the behavior prediction correction unit 18 corrects the prediction results for the following vehicles.
  • If the host vehicle's position is not near the intersection (No in step S501), or if the other vehicle does not return to its original lane after the lane change (No in step S508), the process returns to step S501.
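  • The decision made by the flow of FIG. 19, namely suppressing the correction when a lane change is explained by the road structure, could be condensed as in the following Python sketch. The boolean inputs stand for the individual determinations of steps S501 to S508; the branch order is a simplification of the flowchart.

```python
def should_correct_following_vehicles(near_intersection: bool,
                                      has_turn_lane: bool,
                                      changed_into_turn_lane: bool,
                                      turned_immediately_after_change: bool,
                                      returned_to_original_lane: bool) -> bool:
    """Decide whether a lane change should raise road-surface suspicion for following vehicles.

    The inputs stand for the determinations of steps S501-S508; the exact branch
    order here is a simplification of the flowchart.
    """
    if not near_intersection:
        return False
    if has_turn_lane and changed_into_turn_lane:
        return False          # explained by the road structure (cf. S505)
    if turned_immediately_after_change:
        return False          # the vehicle simply turned, nothing on the road surface
    return returned_to_original_lane  # avoid-and-return pattern suggests an obstacle (cf. S508-S509)


# A vehicle swerved out of its lane and back again, unrelated to the turn lane:
print(should_correct_following_vehicles(True, True, False, False, True))  # True
```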
  • As described above, the surrounding situation prediction device detects a plurality of other vehicles around the host vehicle and detects the similarity of their behaviors. Based on this similarity, the surrounding situation prediction device predicts the road surface condition as the surrounding situation of the host vehicle. The surrounding situation prediction device can thereby predict the road surface condition (the presence or absence of a fallen object 60, a bump 50, a puddle 70, a construction site, and so on) as the surrounding situation of the host vehicle even when the road surface condition is difficult to confirm. In addition, the surrounding situation prediction device can generate a route that takes the road surface condition into account, and can perform driving support suited to the road surface condition.
  • the surrounding situation prediction device detects acceleration / deceleration of a plurality of other vehicles as the behavior of the plurality of other vehicles.
  • the surrounding situation prediction device predicts the road surface situation as the surrounding situation of the host vehicle based on the acceleration / deceleration similarity.
  • the surrounding situation prediction apparatus can predict the presence or absence of the bump 50 as the surrounding road surface condition of the host vehicle even when it is difficult to check the road surface condition by using the similarity of deceleration of the other vehicle.
  • the surrounding situation prediction apparatus may predict a road surface situation based on the similarity of acceleration of other vehicles.
  • the surrounding situation prediction device can predict the road condition even when it is difficult to confirm the road condition.
  • the surrounding situation prediction device detects the traveling locus of a plurality of other vehicles as the behavior of the plurality of other vehicles.
  • the surrounding situation prediction apparatus predicts the road surface situation as the surrounding situation of the host vehicle based on the similarity of the traveling tracks.
  • the surroundings state prediction device can predict the presence or absence of the fallen object 60 as the surrounding road surface situation of the host vehicle even when it is difficult to check the road surface state by using the similarity of the traveling tracks.
  • The surrounding situation prediction device calculates the likelihood of the road surface condition based on the similarity of the behaviors of a plurality of other vehicles. For example, as shown in FIG. 6, when two other vehicles, the other vehicle M1 and the other vehicle M2, behave similarly, the surrounding situation prediction device calculates the possibility that a fallen object 60 is present in front of the other vehicle M3 as 40%. The surrounding situation prediction device can thereby perform driving support better suited to the road surface condition.
  • the likelihood may be an evaluation that it is simply possible, or may be a specific numerical value.
  • The surrounding situation prediction device increases the likelihood as the number of other vehicles exhibiting a predetermined similarity increases. For example, as shown in FIG. 6, when three other vehicles, the other vehicle M1, the other vehicle M2, and the other vehicle M3, behave similarly, the surrounding situation prediction device calculates the possibility that a fallen object 60 is present in front of the other vehicle M4 as 80%. In this way, the surrounding situation prediction device can perform driving support suited to the road surface condition and can suppress the discomfort given to the occupant.
  • the ambient situation prediction device detects the vehicle type of another vehicle that shows a predetermined similarity, and calculates the likelihood based on the vehicle type of the other vehicle. As described above, by using the vehicle type of the other vehicle exhibiting the predetermined similarity, the surrounding situation prediction apparatus can calculate a more accurate likelihood.
  • the surrounding situation prediction device detects a road structure around the other vehicle showing a predetermined similarity, and calculates a likelihood based on the road structure. As described above, by using the road structure around the other vehicle showing the predetermined similarity, the surrounding situation prediction apparatus can calculate a more accurate likelihood.
  • the road surface condition prediction unit 17 can predict that there is a fallen object in the right lane based on the similarity in behavior of the other vehicles M1 to M3.
  • the vehicle control unit 22 can change the lane while avoiding a place where a fallen object is predicted.
  • The road surface condition prediction unit 17 predicts, based on the similarity of the behaviors of the other vehicles M1 to M3, that, for example, construction is being carried out in the right lane and that the right lane is impassable. Since the host vehicle M0 cannot change lanes to the right lane, the automatic route generation unit 21 regenerates the route to the destination, and the vehicle control unit 22 can thereby drive the host vehicle smoothly to the destination.
  • Although the present embodiment has described the case where the host vehicle is an automatically driven vehicle, the host vehicle may be a manually driven vehicle.
  • In that case, the surrounding situation prediction device only needs to include a speaker, a display, and a controller that controls these user interfaces, so as to guide the driver's steering, accelerator, and brake operations by voice or images.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A surrounding situation prediction method for a driving support device that detects the behavior of a plurality of other vehicles around a host vehicle, predicts the surrounding situation of the host vehicle from the behavior of the plurality of other vehicles, and supports the driving of the host vehicle based on the prediction results. The surrounding situation prediction method detects the similarity of the behavior of the plurality of other vehicles and, based on this similarity, predicts the road surface condition as the surrounding situation of the host vehicle.
PCT/JP2017/016185 2017-04-24 2017-04-24 Surrounding situation prediction method and device WO2018198163A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016185 WO2018198163A1 (fr) 2017-04-24 2017-04-24 Surrounding situation prediction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016185 WO2018198163A1 (fr) 2017-04-24 2017-04-24 Surrounding situation prediction method and device

Publications (1)

Publication Number Publication Date
WO2018198163A1 true WO2018198163A1 (fr) 2018-11-01

Family

ID=63918174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/016185 WO2018198163A1 (fr) 2017-04-24 2017-04-24 Surrounding situation prediction method and device

Country Status (1)

Country Link
WO (1) WO2018198163A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004078333A (ja) * 2002-08-12 2004-03-11 Nissan Motor Co Ltd 走行経路生成装置
JP2005242552A (ja) * 2004-02-25 2005-09-08 Denso Corp 車載受信装置、車載送信装置、およびサーバ
JP2006313519A (ja) * 2005-04-04 2006-11-16 Sumitomo Electric Ind Ltd 障害物検出センター装置、障害物検出システム及び障害物検出方法
JP2009157419A (ja) * 2007-12-25 2009-07-16 Sumitomo Electric Ind Ltd 運転支援システム、路上通信装置、および、車載機

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020132146A (ja) * 2019-02-19 2020-08-31 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド 自動運転車両走行計画のリアルタイム学習方法、装置、サーバ、システム、デバイス、記憶媒体、及びプログラム
US11780463B2 (en) 2019-02-19 2023-10-10 Baidu Online Network Technology (Beijing) Co., Ltd. Method, apparatus and server for real-time learning of travelling strategy of driverless vehicle

Similar Documents

Publication Publication Date Title
US10928820B1 (en) Confidence levels along the same predicted trajectory of an obstacle
CN112498365B (zh) 基于置信度水平和距离、响应于障碍物的自动驾驶车辆的延迟决策
US11442450B2 (en) Method for determining passable area in planning a path of autonomous driving vehicles
CN110352450B (zh) 驾驶辅助方法及驾驶辅助装置
JP6798611B2 (ja) 走行支援方法及び走行支援装置
US11740628B2 (en) Scenario based control of autonomous driving vehicle
JP7182376B2 (ja) 運転支援方法及び運転支援装置
CN113552870B (zh) 基于感知结果的动态速度限制调整系统及方法
CN110622226A (zh) 行驶辅助装置的动作预测方法以及动作预测装置
CN111830959A (zh) 用于操作自动驾驶车辆的方法、系统和机器可读介质
KR102657973B1 (ko) 차량 거동 예측 방법 및 차량 거동 예측 장치
CN111819609B (zh) 车辆行为预测方法及车辆行为预测装置
JP7206048B2 (ja) 運転特性推定方法及び運転特性推定装置
CN114516329A (zh) 车辆自适应巡航控制系统、方法和计算机可读介质
CN113060140A (zh) 基于中心线移位的变道前路径规划
CN113002534A (zh) 碰撞后减损制动系统
WO2018198163A1 (fr) Procédé et dispositif de prédiction d'état périphérique
CN116009539A (zh) 操作自主驾驶车辆的方法和系统
US20230053243A1 (en) Hybrid Performance Critic for Planning Module's Parameter Tuning in Autonomous Driving Vehicles
WO2018198186A1 (fr) Procédé et dispositif d'aide au déplacement
US11325529B2 (en) Early brake light warning system for autonomous driving vehicle
JP7143893B2 (ja) 車両挙動予測方法及び車両挙動予測装置
US11577644B2 (en) L3-level auto-emergency light system for ego vehicle harsh brake
US11807274B2 (en) L4 auto-emergency light system for future harsh brake
JP7398236B2 (ja) 車両制御方法及び車両制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17907340

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17907340

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP