KR102005203B1 - Driving planning device, driving support device, driving planning method - Google Patents

Driving planning device, driving support device, driving planning method

Info

Publication number
KR102005203B1
Authority
KR
South Korea
Prior art keywords
vehicle
stop
route
path
intersection
Prior art date
Application number
KR1020187001530A
Other languages
Korean (ko)
Other versions
KR20180018789A (en)
Inventor
Susumu Fujita
Motonobu Aoki
Original Assignee
Nissan Motor Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd.
Priority to PCT/JP2015/070748 (WO2017013750A1)
Publication of KR20180018789A
Application granted
Publication of KR102005203B1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/18 Propelling the vehicle
    • B60W30/18009 Propelling the vehicle related to particular drive situations
    • B60W30/18109 Braking
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/04 Vehicle stop
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/80 Spatial relation or speed relative to objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088 Control characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D2201/00 Application
    • G05D2201/02 Control of position of land vehicles
    • G05D2201/0213 Road vehicle, e.g. car or truck
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Abstract

A driving plan processor (21) plans the driving of the subject vehicle (V1) traveling on a route. Using the evaluation results of the relationships between a plurality of events encountered on the route and the vehicle V1, the driving plan processor (21) sets, for each event, one or a plurality of stop position candidates at which the vehicle V1 may stop, and then, using the evaluation results of the relationships between the plurality of events encountered at the stop position candidates and the vehicle V1, prepares a driving plan for the scene encountered by the vehicle V1.

Description

Driving planning device, driving support device, driving planning method

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a driving planning device, a driving support device, and a driving planning method for planning the driving of a vehicle.

BACKGROUND ART [0002] With respect to this kind of apparatus, there is known a technique of detecting an intersection at which the vehicle should temporarily stop, and braking the vehicle on behalf of the driver when the vehicle cannot stop before the scheduled stop position (Patent Document 1).

Japanese Patent Application Laid-Open No. 2006-224754

However, the conventional technique has a problem in that it cannot handle a stop inside the intersection, such as when the vehicle must wait near the center of the intersection for oncoming straight-ahead vehicles to pass while turning across traffic.

A problem to be solved by the present invention is to provide a driving planning device that quickly determines a driving plan according to the scene encountered by a vehicle traveling on a route.

The present invention solves the above problem by setting one or a plurality of stop position candidates for stopping the subject vehicle using the evaluation results of the relationship between each event encountered by the vehicle traveling on a first route and the subject vehicle, and then preparing a driving plan for the scene encountered by the vehicle using the evaluation results of the relationships between the plurality of events encountered at the stop position candidates and the vehicle.

According to the present invention, stop position candidates are set according to the relationship between each event and the vehicle, and the driving plan is prepared in consideration of the relationships between the plurality of events encountered at the stop position candidates and the vehicle V1, making it possible to realize a driving plan suited to the scene.

Fig. 1 is a block diagram of the travel support system according to the present embodiment.
Fig. 2A is a first diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2B is a second diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2C is a third diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2D is a fourth diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2E is a fifth diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2F is a sixth diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 2G is a seventh diagram for explaining a method of evaluating a scene encountered by the vehicle.
Fig. 3 is a diagram for explaining a method of determining an event using a traffic rule.
Fig. 4 is an example of display information showing events over time.
Fig. 5A is a first diagram for explaining a method of determining the driving behavior in an event.
Fig. 5B is a second diagram for explaining a method of determining the driving behavior in an event.
Fig. 6 is an example of display information showing a result of determination of the driving behavior.
Fig. 7A is a first diagram for explaining an event extraction process.
Fig. 7B is a second diagram for explaining an event extraction process.
Fig. 8 is an example of display information showing a result of determination of the driving behavior.
Fig. 9A is a first diagram for explaining an event extraction process and a driving behavior determination process.
Fig. 9B is a second diagram for explaining an event extraction process and a driving behavior determination process.
Fig. 10 is an example of display information showing events over time.
Fig. 11A is a first diagram for explaining an event extraction process including a parked vehicle.
Fig. 11B is a second diagram for explaining an event extraction process including a parked vehicle.
Fig. 12 is an example of display information showing events over time.
Fig. 13A is a first diagram for explaining a method of setting stop position candidates.
Fig. 13B is a second diagram for explaining a method of setting stop position candidates.
Fig. 13C is a third diagram for explaining a method of setting stop position candidates.
Fig. 13D is a fourth diagram for explaining a method of setting stop position candidates.
Fig. 14A is a first diagram for explaining a method of setting stop position candidates at a T-junction.
Fig. 14B is a second diagram for explaining a method of setting stop position candidates at a T-junction.
Fig. 15 is a third diagram for explaining a method of setting stop position candidates at a T-junction.
Fig. 16 is a flowchart showing the control procedure of the travel support system of the present embodiment.
Fig. 17 is a flowchart showing a subroutine of step S15 of the control procedure shown in Fig. 16.
Fig. 18 is a flowchart showing a subroutine of step S23 of the control procedure shown in Fig. 16.
Fig. 19 is a diagram showing a scene for explaining a method of extracting events.
Fig. 20 is a flowchart showing the control procedure of a first method of extracting events.
Fig. 21 is a first diagram for explaining the first method of extracting events.
Fig. 22 is a second diagram for explaining the first method of extracting events.
Fig. 23 is a third diagram for explaining the first method of extracting events.
Fig. 24A is a fourth diagram for explaining the first method of extracting events.
Fig. 24B is a fifth diagram for explaining the first method of extracting events.
Fig. 25 is a flowchart showing the control procedure of a second method of extracting events.
Fig. 26A is a first diagram for explaining the second method of extracting events.
Fig. 26B is a second diagram for explaining the second method of extracting events.
Fig. 26C is a third diagram for explaining the second method of extracting events.
Fig. 27A is a diagram showing a scene for explaining a method of merging events.
Fig. 27B is a first diagram for explaining the method of merging events.
Fig. 27C is a second diagram for explaining the method of merging events.
Fig. 27D is a third diagram for explaining the method of merging events.
Fig. 28 is a diagram for explaining another example of the method of merging events.
Fig. 29 is a diagram for explaining the effect of the present invention.
Fig. 30 is an example of display information showing the events of the scene shown in Fig. 29 over time.

DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the present embodiment, a case in which the running scene evaluation apparatus according to the present invention is applied to a running support system mounted on a vehicle will be described as an example.

Fig. 1 is a block diagram of the driving support system 1. The driving support system 1 of this embodiment includes a driving support device 100 and an in-vehicle device 200. The form in which the driving support device 100 of the present invention is embodied is not limited: it may be mounted on the vehicle, or it may be applied to a portable terminal capable of exchanging information with the in-vehicle device 200. The terminal device includes devices such as smartphones and PDAs. Each of the driving support system 1, the driving support device 100, the in-vehicle device 200, and their constituent devices is a computer having an arithmetic processing unit such as a CPU and executing arithmetic processing.

First, the in-vehicle apparatus 200 will be described.

The vehicle mounted device 200 of the present embodiment includes a vehicle controller 210, a navigation device 220, an object detection device 230, a lane departure prevention device 240, and an output device 250. The respective apparatuses constituting the in-vehicle apparatus 200 are connected by a CAN (Controller Area Network) or other in-vehicle LAN in order to exchange information with each other. The in-vehicle device 200 can exchange information with the driving support device 100 via the in-vehicle LAN. The vehicle controller 210 of the present embodiment operates in cooperation with the detecting device 260, the driving device 270, and the steering device 280.

The vehicle controller 210 of the present embodiment includes a detection device 260. The detection device 260 has a steering angle sensor 261, a vehicle speed sensor 262, and an attitude sensor 263. The steering angle sensor 261 detects information such as the steering angle, steering speed, and steering acceleration, and outputs it to the vehicle controller 210. The vehicle speed sensor 262 detects the speed and/or acceleration of the vehicle and outputs it to the vehicle controller 210. The attitude sensor 263 detects the position of the vehicle and its pitch, yaw, and roll angles, and outputs them to the vehicle controller 210. The attitude sensor 263 of the present embodiment includes a gyro sensor.

The vehicle controller 210 of the present embodiment is an in-vehicle computer, such as an engine control unit or electronic control unit (ECU), that electronically controls the operating state of the vehicle. Examples of the vehicle of the present embodiment include an electric vehicle having an electric motor as a drive source, an engine vehicle having an internal combustion engine as a drive source, and a hybrid vehicle having both an electric motor and an internal combustion engine as drive sources. Electric vehicles and hybrid vehicles that use an electric motor as a drive source include types that use a secondary battery as the power source for the electric motor and types that use a fuel cell as the power source for the electric motor.

The driving device 270 of the present embodiment includes the drive mechanisms of the vehicle V1. The drive mechanisms include the above-described electric motor and/or internal combustion engine as the drive source, a power transmission device including a drive shaft and an automatic transmission that transmit the output of the drive source to the drive wheels, and a braking device 271 for braking the wheels. The driving device 270 generates control signals for these drive mechanisms based on input signals from the accelerator and brake operations and on control signals acquired from the vehicle controller 210 or the driving support device 100, and executes travel control of the vehicle. By sending control information to the driving device 270, travel control including acceleration and deceleration of the vehicle can be performed automatically. In the case of a hybrid vehicle, the torque distribution between the electric motor and the internal combustion engine according to the running state of the vehicle is also sent to the driving device 270.

The steering device 280 of the present embodiment includes a steering actuator. The steering actuator includes a motor or the like provided on the steering column shaft. The steering device 280 changes the traveling direction of the vehicle based on a control signal acquired from the vehicle controller 210 or on an input signal from the driver's steering operation. The vehicle controller 210 executes this change of traveling direction by sending control information including the steering amount to the steering device 280. The driving support device 100 may also change the traveling direction of the vehicle by controlling the braking amount of each wheel; in this case, the vehicle controller 210 sends control information including the braking amount of each wheel to the braking device 271. The control of the driving device 270 and of the steering device 280 may be performed fully automatically, or in a manner that supports the driver's own driving operation, and either control can be interrupted or stopped by the intervention of the driver. The vehicle controller 210 controls the driving of the subject vehicle according to the driving plan of the driving planning device 20.

The in-vehicle device 200 of the present embodiment includes a navigation device 220. The navigation device 220 of the present embodiment calculates the route from the current position of the subject vehicle to the destination. As the route calculation method, a method known at the time of filing and based on graph search theory, such as the Dijkstra method or A*, can be used. The calculated route is sent to the vehicle controller 210 for use in supporting the driving of the subject vehicle, and is output as route guidance information through the output device 250 described later.
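The graph-search route calculation mentioned above can be sketched with a small Dijkstra implementation. The road network, node names, and edge costs below are hypothetical stand-ins for the node/link data of the road information 223:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest route on a road graph given as {node: [(neighbor, cost), ...]}."""
    queue = [(0, start, [start])]  # (accumulated cost, node, path so far)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return None, None  # no route to the goal

# Hypothetical road network: nodes are intersections, costs are distances in metres.
roads = {
    "A": [("B", 100), ("C", 300)],
    "B": [("C", 100), ("D", 400)],
    "C": [("D", 150)],
}
cost, route = dijkstra(roads, "A", "D")  # expect the route via B and C
```

A* differs only in adding a heuristic (e.g. straight-line distance to the goal) to the priority key, which narrows the search on large maps.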

The navigation device 220 is provided with a position detection device 221. The position detection device 221 of the present embodiment includes a Global Positioning System (GPS) receiver and detects the traveling position (latitude and longitude) of the vehicle while it runs.

The navigation device 220 has access to map information 222, road information 223, and traffic rule information 224. It is sufficient that the navigation device 220 can read these; they may be physically separate from the navigation device 220 and stored on a server readable through communication.

The map information 222 of the present embodiment is a so-called electronic map, in which map elements are associated with latitude and longitude. The map information 222 has road information 223 associated with each point.

The road information 223 of this embodiment is defined by nodes and links connecting the nodes. The road information 223 includes information specifying each road by its position and area, the road type of each road, the road width of each road, and road shape information. For each identification number of each road link, the road information 223 of the present embodiment stores the position of each intersection, the entry direction of the intersection, the type of the intersection, and other intersection-related information, together with the road type, road width, road shape, and other road-related information.
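For illustration only, the per-link records described above might be sketched as follows; every field name and link ID here is a hypothetical stand-in, not the actual layout of the road information 223:

```python
from dataclasses import dataclass, field

@dataclass
class RoadLink:
    """One road link record (illustrative fields only)."""
    link_id: str
    start_node: tuple              # (latitude, longitude) of the start node
    end_node: tuple                # (latitude, longitude) of the end node
    road_type: str                 # e.g. "ordinary road", "expressway"
    width_m: float                 # road width in metres
    shape: str = "straight"        # simplified road shape information
    intersections: list = field(default_factory=list)  # intersection IDs on this link

# A two-link toy network keyed by link ID, mirroring the node/link description.
road_info = {
    "L001": RoadLink("L001", (35.68, 139.76), (35.69, 139.77), "ordinary road", 6.0,
                     intersections=["X01"]),
    "L002": RoadLink("L002", (35.69, 139.77), (35.70, 139.78), "ordinary road", 6.0),
}
# Looking up the intersections registered for the link the vehicle is on:
entry = road_info["L001"]
```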

The navigation device 220 specifies a first route on which the subject vehicle travels, based on the current position of the subject vehicle detected by the position detection device 221. The first route may be specified for each road, for each lane with an identified direction of travel, or for each single lane in which the vehicle is actually traveling. The navigation device 220 of the present embodiment refers to the road information 223 described later and specifies the first route as a series of road links on which the vehicle runs. The first route of the present embodiment includes specific information (coordinate information) of one or a plurality of points through which the subject vehicle V1 will pass in the future, and includes at least one point indicating the next travel position of the vehicle. The first route (target route) may be represented by a continuous line or by discrete points.

The traffic rule information 224 of the present embodiment consists of traffic rules that the vehicle must obey while traveling, such as temporary stops on the route, no-parking/no-stopping zones, slow-down zones, and speed limits. Each rule is defined per point (latitude, longitude) and per link. The traffic rule information 224 may include traffic signal information acquired from devices installed on the road side.
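Likewise, per-point and per-link traffic rules can be pictured as a simple lookup table; the rule keys and link IDs below are invented for illustration:

```python
# Hypothetical per-link traffic rules, keyed the same way as the road links.
traffic_rules = {
    "L001": {"temporary_stop": True, "no_parking": True, "speed_limit_kmh": 40},
    "L002": {"temporary_stop": False, "no_parking": False, "speed_limit_kmh": 60},
}

def must_pause(link_id):
    """True when the link carries a temporary-stop rule; unknown links default to none."""
    return traffic_rules.get(link_id, {}).get("temporary_stop", False)
```

A planner can then query the rules for each link of the first route to decide where a stop is mandated.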

The in-vehicle device 200 of the present embodiment includes an object detection device 230. The object detection device 230 of the present embodiment detects the situation around the subject vehicle, including the presence and position of objects and obstacles around it. The object detection device 230 of the present embodiment includes, although not limited to, a camera 231. The camera 231 of the present embodiment is an imaging device including an image sensor such as a CCD, and may be an infrared camera or a stereo camera. The camera 231 is installed at a predetermined position on the vehicle and images objects around the vehicle; the surroundings include the front, the rear, and the front and rear sides of the vehicle. The objects imaged by the camera 231 include stationary objects such as signboards, moving objects such as pedestrians, two-wheeled vehicles, and four-wheeled vehicles, and road structures such as guard rails, median strips, and curbstones.

The object detection device 230 may analyze the image data and identify the type of each object based on the analysis result. Using a pattern matching technique or the like, the object detection device 230 identifies whether an object included in the image data is a vehicle, a pedestrian, or a signboard. The object detection device 230 also processes the acquired image data to obtain the distance from the subject vehicle to the object based on the object's position and, in particular, obtains the positional relationship between the object and the subject vehicle.

Further, the object detection device 230 of the present embodiment may use a radar device 232. As the radar device 232, a device known at the time of filing, such as a millimeter-wave radar, a laser radar, or an ultrasonic radar, can be used. The object detection device 230 detects the presence or absence of an object, the position of the object, and the distance to the object based on the received signal of the radar device 232, or based on the clustering result of the point cloud information acquired by the laser radar.
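The clustering of laser-radar point clouds mentioned above can be illustrated with a naive Euclidean (distance-threshold) clustering; a production system would use an established implementation, and the threshold below is arbitrary:

```python
def cluster_points(points, threshold=1.0):
    """Group 2-D points by a naive O(n^2) distance-threshold rule.

    A point joins the first existing cluster containing any point within
    `threshold`; otherwise it starts a new cluster.
    """
    clusters = []
    for x, y in points:
        for cluster in clusters:
            if any((x - cx) ** 2 + (y - cy) ** 2 <= threshold ** 2 for cx, cy in cluster):
                cluster.append((x, y))
                break
        else:
            clusters.append([(x, y)])
    return clusters

# Two bunches of radar returns: one object ahead-left, one ahead-right.
scan = [(5.0, 2.0), (5.2, 2.1), (5.1, 1.9), (8.0, -3.0), (8.1, -2.9)]
clusters = cluster_points(scan)  # each cluster is then treated as one object
```

Each resulting cluster stands in for one detected object, from which a centroid position and distance can be derived.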

If the other vehicle and the subject vehicle can communicate with each other, the object detection device 230 may acquire, as object information indicating that another vehicle exists, the vehicle speed and acceleration detected by the other vehicle's own vehicle speed sensor. Of course, the object detection device 230 may also acquire object information including the position, speed, and acceleration of other vehicles from external devices of an intelligent transport system.

The in-vehicle device 200 of the present embodiment is provided with a lane departure prevention device 240. The lane departure prevention device 240 includes a camera 241 and road information 242. The camera 241 may be shared with the camera 231 of the object detection device, and the road information 242 may be shared with the road information 223 of the navigation device. The lane departure prevention device 240 detects the lane of the first route on which the subject vehicle travels from the image captured by the camera 241, and has a lane keeping support function that recognizes the lane in which the vehicle is traveling and controls the movement of the vehicle so that the positions of the lane markers and the position of the vehicle maintain a predetermined relationship. The driving support device 100 of the present embodiment controls the movement of the subject vehicle so that it runs at the center of the lane, or so that the distance along the lane width direction from the lane marker to the vehicle stays within a predetermined range of values. The lane markers of the present embodiment are not limited as long as they have the function of defining a lane: they may be lines drawn on the road surface, plantings between lanes, or road structures such as guard rails, curbstones, sidewalks, and motorcycle lanes existing on the shoulder side of the lane. They may also be fixed objects such as signboards, signs, stores, and street trees existing on the shoulder side of the lane.
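The lane keeping behaviour described, holding the lateral distance from the lane marker within a predetermined range, amounts to feedback control on the lateral offset. A minimal proportional-control sketch, with a made-up gain and steering limit (the patent does not specify the control law):

```python
def lane_keep_steering(lateral_offset_m, gain=0.5, max_steer_rad=0.3):
    """Steering command that drives the lateral offset from the lane centre to zero.

    lateral_offset_m > 0 means the vehicle sits right of the lane centre,
    so a negative (leftward) command is returned, clipped to the actuator limit.
    """
    command = -gain * lateral_offset_m
    return max(-max_steer_rad, min(max_steer_rad, command))

print(lane_keep_steering(0.4))  # drifted right: a leftward (negative) command
```

Real systems add derivative/preview terms and vehicle dynamics, but the sign convention and saturation shown here carry over.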

The evaluation processor 11, described later, stores the objects detected by the object detection device 230 in association with routes. That is, the evaluation processor 11 holds information on which route each object exists.

The in-vehicle device 200 of the present embodiment includes an output device 250. The output device 250 is provided with a display 251 and a speaker 252. The output device 250 of the present embodiment outputs various kinds of driving support information to the occupants of the vehicle or to the occupants of surrounding vehicles. In the present embodiment, the output device 250 outputs information on the prepared driving behavior plan and on the driving control based on that plan. As information corresponding to the control information for causing the vehicle to travel on the first route (target route), the occupants of the vehicle are informed in advance, through the display 251 and the speaker 252, that steering or acceleration/deceleration will be executed. This driving support information may also be conveyed to the driver of the subject vehicle or to the occupants of other vehicles through the exterior and interior lamps of the vehicle. Further, the output device 250 of the present embodiment may output various kinds of driving support information to external devices such as an intelligent transport system (ITS) through a communication device.

Next, the driving support apparatus 100 will be described.

The driving support device 100 of the present embodiment includes a scene evaluation device 10, a driving planning device 20, and an output device 30. The output device 30 has the same function as the output device 250 of the in-vehicle device 200 described above, and the display 251 and the speaker 252 may be used as its configuration. The scene evaluation device 10, the driving planning device 20, and the output device 30 can exchange information with one another through wired or wireless communication lines.

First, the scene evaluation apparatus 10 will be described.

The scene evaluation device 10 includes an evaluation processor 11 that functions as its control device. The evaluation processor 11 is a computing device that evaluates the scene encountered by the vehicle traveling on a route when determining the driving behavior of the vehicle. Specifically, the evaluation processor 11 is a computer having a ROM (Read Only Memory) storing a program for executing the scene evaluation processing, a CPU (Central Processing Unit) as an operation circuit that functions as the scene evaluation device 10 by executing that program, and a RAM (Random Access Memory) functioning as an accessible storage device. The evaluation processor 11 thus has a storage medium storing a program for executing the processing of evaluating the scene encountered by the vehicle.

The evaluation processor 11 of the scene evaluation apparatus 10 according to the present embodiment executes the following processing.

(1) a process of extracting a second route having an intersection with the first route on which the subject vehicle travels (route extraction process)

(2) a process of extracting a plurality of events encountered by the subject vehicle running on the first route, based on the relationship between the first route and each second route (event extraction process)

(3) a process of evaluating the scene using the relationship between each extracted event and the subject vehicle (evaluation process).

The evaluation processor 11 of the present embodiment has a first block for realizing the route extraction function, a second block for realizing the event extraction function, and a third block for realizing the scene evaluation function. The first block executes the route extraction processing, the second block executes the event extraction processing, and the third block executes the evaluation processing. The evaluation processor 11 of the present embodiment realizes each function through the cooperation of software for executing each process and the hardware described above.

Hereinafter, each process executed by the evaluation processor 11 according to the present embodiment will be described with reference to Figs. 2A to 2G.

First, the path extraction processing of the evaluation processor 11 will be described.

The evaluation processor 11 of the present embodiment calculates a first route on which the subject vehicle is traveling or is scheduled to travel. To calculate the first route, the evaluation processor 11 acquires the subject-vehicle information: it obtains the current position of the subject vehicle from the position detection device 221, refers to the map information 222, and calculates the first route using the acquired current position and traveling direction. The evaluation processor 11 may instead acquire, as the first route, the scheduled travel route of the subject vehicle obtained by the navigation device 220, or the guidance route from the current position to the destination obtained by the navigation device 220.

The evaluation processor 11 of the present embodiment extracts a second route having an intersection with the first route on which the subject vehicle is traveling or is scheduled to travel. The second route in the present embodiment is a route having an intersection with the first route. Routes having an intersection with the first route include a route that meets the first route, a route that flows into the first route, a route that branches from the first route, and a route that crosses the first route.

The evaluation processor 11 determines whether or not the subject vehicle V1 encounters a scene to be evaluated.

Specifically, the evaluation processor 11 determines whether or not the subject vehicle V1 has encountered a scene in which the first route on which it travels intersects with another, second route.

The scene shown in Fig. 2A will be described as an example. In this scene, the subject vehicle V1 is traveling on the first route M1L. The evaluation processor 11 refers to the link information of the map information 222 or the road information 223 and extracts the first route M1L as the route including the link to which the current position acquired from the position detection device 221 belongs. The first route M1L is the route to which the current position of the subject vehicle V1 belongs, and is identified by the link ID defined in the map information 222 or the road information 223. The first route M1L including the current position of the subject vehicle V1 is specified as the first route on which the subject vehicle V1 travels.

The evaluation processor 11 then calculates the first route that the subject vehicle V1 is to travel.

Fig. 2B shows determination of the first route BV1L that the subject vehicle V1 is to travel. The evaluation processor 11 may determine the first route BV1L from the route information to the destination calculated by the navigation device 220, or from the turn signal (winker signal) indicating a left turn of the subject vehicle V1. The winker signal is acquired through the vehicle controller 210.

Fig. 2C is a diagram showing an example of the intersection points at which the first route BV1L crosses other routes. In Fig. 2C, all the routes that the vehicles traveling in the respective lanes are likely to take are shown superimposed. As shown in Fig. 2C, each of the other vehicles V2, V3, and V4 has three travelable routes: going straight, turning right, and turning left. When the subject vehicle V1 travels along the first route BV1L, it is required to judge information about all the routes shown in Fig. 2C.

The evaluation processor 11 of the present embodiment extracts a second route having an intersection with the first route BV1L. The second route in the present embodiment is a route that crosses the first route, meets the first route (for example, at a T-junction), joins the first route, or passes through an area leading to the first route.

The evaluation processor 11 of the present embodiment extracts the second route as follows.

First, the evaluation processor 11 of the present embodiment specifies the area of the "scene encountered by the subject vehicle" to be evaluated when determining the driving behavior of the subject vehicle. The evaluation processor 11 predicts that the subject vehicle V1 will encounter a scene to be evaluated when there is a second route to which a link having an intersection with the first route M1L belongs. As one example, when the current position of the subject vehicle V1 belongs to the region R1 within a predetermined distance from the intersection of the first route M1L and the second route (for example, the intersection center RO in Fig. 2A), the evaluation processor 11 specifies that scene as a scene to be evaluated.
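The region check described above can be written as a minimal Python sketch (illustrative only, not part of the embodiment; the function name, the 2-D coordinate representation, and the numeric values are assumptions):

```python
import math

def encounters_evaluation_scene(current_pos, intersection_center, region_radius):
    """Return True when the subject vehicle's current position lies inside
    the region R1, i.e. within a predetermined distance of the
    intersection center RO, so the scene should be evaluated."""
    dx = current_pos[0] - intersection_center[0]
    dy = current_pos[1] - intersection_center[1]
    return math.hypot(dx, dy) <= region_radius

# Subject vehicle 20 m from the intersection center, 30 m region radius:
print(encounters_evaluation_scene((0.0, 0.0), (20.0, 0.0), 30.0))  # True
print(encounters_evaluation_scene((0.0, 0.0), (45.0, 0.0), 30.0))  # False
```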

The evaluation processor 11 executes the extraction of the second route when the subject vehicle V1 encounters such an evaluation scene. The evaluation processor 11 extracts the second routes existing in the region (R1 in Fig. 2A) corresponding to the scene to be evaluated that the subject vehicle V1 encounters. By extracting the second routes and evaluating the scene for each scene to be evaluated in this way, it is possible to evaluate the state of the scene encountered by the subject vehicle without increasing the processing load.

Hereinafter, a method of extracting the second routes in the scene shown in Fig. 2B will be described with reference to Figs. 2D to 2G. First, as shown in Fig. 2D, the evaluation processor 11 of the present embodiment determines the routes that the other vehicle V2 may travel. Using the map information 222, the road information 223, and the traffic rule information 224, the evaluation processor 11 calculates each route that the other vehicle V2 (and likewise V3 and V4) is likely to travel.

As shown in Fig. 2D, the other vehicle V2 is likely to proceed to the second path BV2S going straight, the second path BV2L making a left turn, and the second path BV2R making a right turn. As shown in Fig. 2E, the other vehicle V3 is likely to proceed to the second path BV3S going straight, the second path BV3L making a left turn, and the second path BV3R making a right turn. As shown in Fig. 2F, the other vehicle V4 is likely to proceed to the second path BV4S going straight, the second path BV4L making a left turn, and the second path BV4R making a right turn. That is, there are three routes for each of the other vehicles.

The evaluation processor 11 of the present embodiment narrows the entire set of routes (all the routes that the other vehicles can travel) down to the routes that may cross the first route BV1L of the subject vehicle V1. As shown in Fig. 2G, the evaluation processor 11 extracts the intersections QV12 and QV13 between the first route BV1L that the subject vehicle V1 is to travel and the other routes: the second route BV2S sharing the intersection QV12 with the first route BV1L, and the second route BV3R sharing the intersection QV13 with the first route BV1L, are extracted. By this processing, the evaluation processor 11 obtains, from among the nine routes existing in the scene encountered by the subject vehicle V1 (the intersection-passing scene), the two second routes BV2S and BV3R having an intersection with the first route BV1L. The extracted second routes BV2S and BV3R have an intersection with the first route BV1L and are likely to constitute the scene encountered by the subject vehicle V1. In this way, the many routes related to the first route on which the subject vehicle V1 travels can be narrowed down, and only the second routes to be considered in the operation plan of the subject vehicle V1 are extracted.
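Representing each route by the set of points it passes through, this narrowing-down step can be sketched as follows (illustrative only; the route IDs and point names follow Fig. 2G, but the data representation is an assumption):

```python
def extract_second_paths(first_path_points, candidate_paths):
    """Keep only the candidate routes that share at least one point (an
    intersection such as QV12 or QV13) with the first route.
    candidate_paths maps a route ID to the set of points it passes."""
    first = set(first_path_points)
    return {pid: pts & first for pid, pts in candidate_paths.items() if pts & first}

# Nine candidate routes reduced to the two that cross the first route BV1L:
candidates = {
    "BV2S": {"QV12", "P1"}, "BV2L": {"P2"}, "BV2R": {"P3"},
    "BV3S": {"P4"}, "BV3L": {"P5"}, "BV3R": {"QV13", "P6"},
    "BV4S": {"P7"}, "BV4L": {"P8"}, "BV4R": {"P9"},
}
print(extract_second_paths({"QV12", "QV13", "P0"}, candidates))
# {'BV2S': {'QV12'}, 'BV3R': {'QV13'}}
```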

The evaluation processor 11 of the present embodiment extracts a plurality of events encountered by the subject vehicle V1 traveling on the first route BV1L, based on the relationship between the first route BV1L and the second routes BV2S and BV3R. An event encountered by the subject vehicle V1 is something that happens to the subject vehicle V1: for example, the subject vehicle V1 passes the intersection of the first route and a second route, enters a second route from the first route, approaches the other vehicles V2, V3, and V4, or passes by the other vehicles V2, V3, and V4. An event can be expressed as the place where the subject vehicle encounters it. For this reason, in the present specification, an event is also described by specifying position information such as a point, an intersection point, and the like.

The evaluation processor 11 of the present embodiment judges the intersections QV12 and QV13 between the first route BV1L that the subject vehicle V1 is to travel and the second routes BV2S and BV3R to be points at which the subject vehicle V1 traveling on the first route BV1L encounters events. The evaluation processor 11 assigns the positions of the intersections QV12 and QV13 of the first route BV1L and the second routes BV2S and BV3R as events encountered by the subject vehicle V1. At the intersections QV12 and QV13, the subject vehicle V1 encounters the event of entering (merging into) the second routes BV2S and BV3R, and the event of approaching the other vehicles V2, V3, and V4. Since the locations where events are encountered are extracted from the relationship between the first route and the second routes in this manner, only the events that influence the operation plan of the subject vehicle V1 need be considered.

The evaluation processor 11 of the present embodiment refers to the traffic rule information 224 and extracts, from the relationship between the traffic rule of the first route and the traffic rule of each second route, the events encountered by the subject vehicle V1. The traffic rule information 224 is information in which items such as a stop position, entry prohibition, and one-way traffic are associated with a link (route) or position information. In this process, the map information 222 and the road information 223 may also be referred to.

The evaluation processor 11 recognizes a stop traffic rule as an event, and extracts the position where the stop is defined as a position where the subject vehicle V1 encounters an event. The position of the extracted event is associated with the route (including the link). Likewise, the evaluation processor 11 recognizes an entry prohibition traffic rule as an event, and extracts the position on the upstream side (in the traveling direction) of the position where the entry prohibition is defined as a position where the subject vehicle V1 encounters an event. The position of the extracted event is associated with the route (including the link). The evaluation processor 11 also extracts, as an event position, the position on the upstream side (in the traveling direction) of a region where stopping or entry is prohibited, such as the center portion of an intersection (region R2 in Fig. 2A). The position of the extracted event is associated with the route (including the link).

The evaluation processor 11 of the present embodiment calculates the priority of the first route relative to each second route from the traffic rule information 224 of the first route and the second route, and extracts the events encountered by the subject vehicle V1 traveling on the first route.

In the scene shown in Fig. 3, the evaluation processor 11 of the present embodiment refers to the traffic rule information 224 and extracts the stop line ST1 of the first route BV1L. Similarly, the evaluation processor 11 extracts the stop line ST2 of the second route BV2S and the stop line ST3 of the second route BV3R. The evaluation processor 11 compares the stop rule of the stop line ST2 of the second route BV2S and the stop rule of the stop line ST1 of the first route BV1L, which are predicted to cross at the intersection QV12, and determines the relationship between them. When traveling on one route is given priority and traveling on the other is prohibited, the stop line of the route whose traveling is given priority is excluded from the event candidates. Under traffic rules in which traveling on the second route is prohibited and traveling on the first route is preferentially permitted, the other vehicle V2 traveling on the second route does not affect the travel of the subject vehicle V1 on the first route. In the example shown in Fig. 3, both the stop rule of the stop line ST1 of the first route BV1L and the stop rule of the stop line ST2 of the second route BV2S require a full stop. The evaluation processor 11 therefore cannot determine the priority between the first route BV1L and the second route BV2S, and as a result the intersection QV12 is not excluded from the event candidates.

The evaluation processor 11 of the present embodiment also calculates the priority of the first route relative to each second route from the signal information included in the traffic rule information 224 of the first route and the second route, and extracts the events encountered by the subject vehicle V1 traveling on the first route. Since the signal information changes over time, it may be recognized from the images captured by the cameras 231 and 241, or acquired through the ITS system. In this embodiment, the signal information is acquired through the storage device of the navigation device 220, but the evaluation processor 11 may acquire the signal information directly.

In the scene shown in Fig. 3, the evaluation processor 11 of the present embodiment refers to the traffic rule information 224 and extracts the state indicated by the signal SG1 provided on the first route BV1L. Similarly, the evaluation processor 11 extracts the state indicated by the signal SG2 provided on the second route BV2S.

The evaluation processor 11 of the present embodiment sets the priority of a route permitted to pass (green signal) to be relatively higher than the priority of a route instructed to stop (passage prohibited: red signal). When both routes having an intersection point are permitted to pass, no priority can be determined, so none is set. Incidentally, the green signal means permission to proceed and may be displayed in a different color.

When the signal SG1 provided at the intersection QV12 of the first route BV1L indicates green and the signal SG2 provided on the second route BV2S indicates red, the evaluation processor 11 determines that travel on the first route BV1L has priority over the second route BV2S. The intersection QV12 with the second route BV2S, whose travel is prohibited, is excluded from the event candidates.

When the signal SG1 provided on the first route BV1L indicates green and the signal SG3 provided on the second route BV3R also indicates green, both travel on the first route BV1L and travel on the second route BV3R are permitted. The evaluation processor 11 cannot determine the priority between the first route BV1L and the second route BV3R, and as a result the intersection QV13 is not excluded from the event candidates.
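The signal-based priority determination above admits a simple sketch (illustrative only; the state names "green" and "red" and the return values are assumptions):

```python
def signal_priority(first_signal, second_signal):
    """Compare the signal states governing the first and second routes.
    Returns 'first', 'second', or None when no priority can be
    determined (for example, when both signals are green)."""
    if first_signal == "green" and second_signal == "red":
        return "first"
    if first_signal == "red" and second_signal == "green":
        return "second"
    return None

# SG1 green / SG2 red: the first route BV1L has priority, so the
# intersection QV12 with BV2S can be excluded from the event candidates.
print(signal_priority("green", "red"))    # first
# SG1 green / SG3 green: no priority; QV13 remains an event candidate.
print(signal_priority("green", "green"))  # None
```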

The evaluation processor 11 of the present embodiment also calculates the priority of the first route relative to each second route from the road information 223 of the first route and the second route, and extracts the events encountered by the subject vehicle V1 traveling on the first route. The road information 223 is used to identify priority routes (lanes) and non-priority routes (lanes). At a T-junction, the road information 223 defines one route as the priority route and the other as the non-priority route. For routes having an intersection point, the road information 223 defines the route with the wider lane as the priority route and the route with the narrower lane as the non-priority route; of course, depending on the actual traffic conditions, a route with a narrow lane may instead be the priority route. For merging routes, the road information 223 defines the main route as the priority route and the joining route as the non-priority route. The road information 223 further defines a route having a relatively large radius of curvature as the priority route and a route having a relatively small radius of curvature as the non-priority route; again, a route having a relatively small radius of curvature may be the priority route depending on actual traffic conditions.

The evaluation processor 11 of the present embodiment uses the detection information of objects existing around the subject vehicle V1 to extract the events encountered by the subject vehicle V1 traveling on the first route. The evaluation processor 11 recognizes the existence of an object detected by the object detection device 230 (an object including a pedestrian, another vehicle, a road structure, and the like) as an event encountered by the subject vehicle V1. The evaluation processor 11 may extract the existence of the object as an event when the distance between the subject vehicle V1 and the detected object is less than a predetermined value, or when the predicted time until the detected object comes into contact with the subject vehicle V1 is less than a predetermined value.
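The distance and time-to-contact thresholds just described might be sketched as follows (illustrative; the threshold values of 50 m and 5 s are assumptions, not values from the embodiment):

```python
def object_is_event(distance_m, closing_speed_mps,
                    distance_threshold_m=50.0, time_threshold_s=5.0):
    """Extract a detected object as an event when it is closer than a
    predetermined distance, or when the predicted time until contact
    (distance / closing speed) is below a predetermined value."""
    if distance_m < distance_threshold_m:
        return True
    if closing_speed_mps > 0 and distance_m / closing_speed_mps < time_threshold_s:
        return True
    return False

print(object_is_event(30.0, 0.0))   # True  (within 50 m)
print(object_is_event(80.0, 20.0))  # True  (4 s to contact)
print(object_is_event(80.0, 5.0))   # False
```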

The evaluation processor 11 of the present embodiment uses the position information of objects to extract the events encountered by the subject vehicle V1 traveling on the first route. Such objects include those related to temporary traffic regulations, such as construction sites, trouble spots, and avoidance areas. Information on the positions where objects exist may be included in the road information 223, or may be received from a roadside information providing apparatus such as the ITS.

The evaluation processor 11 stores the objects detected by the object detection device 230 in association with routes, in an accessible state. The evaluation processor 11 thus holds information as to which route each object exists on. Further, the evaluation processor 11 can judge whether or not an object exists on an extracted second route, the positional relationship between the object and the subject vehicle on the second route, and the possibility of contact between the object and the subject vehicle on the second route.

The evaluation processor 11 of the present embodiment associates the positions of the plurality of extracted events with the respective routes, and rearranges the extracted events in the order in which the subject vehicle V1 will encounter them. The evaluation processor 11 of the present embodiment obtains the order in which the events are encountered from the transition of the position of the subject vehicle V1 traveling on the first route and the position of each event.

The evaluation processor 11 of the present embodiment likewise extracts the objects to be encountered and associates each encounter position with a route. The evaluation processor 11 rearranges the objects detected by the object detection device 230 in the order in which the subject vehicle V1 will encounter them, obtaining that order from the transition of the position of the subject vehicle V1 traveling on the first route and the position of each object.
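The rearrangement into encounter order can be sketched as follows (illustrative only; events are assumed to be given as (name, arc-length position) pairs along the first route, a representation not specified in the embodiment):

```python
def sort_events_by_encounter(vehicle_pos_on_route, events):
    """Rearrange events in the order the subject vehicle V1 will
    encounter them, using each event's arc-length position along the
    first route. Events already behind the vehicle are dropped."""
    ahead = [e for e in events if e[1] >= vehicle_pos_on_route]
    return sorted(ahead, key=lambda e: e[1])

# Event positions (metres along the first route), vehicle at 10 m:
events = [("QV13", 95.0), ("stop line ST1", 40.0), ("QV12", 80.0), ("passed", 5.0)]
print(sort_events_by_encounter(10.0, events))
# [('stop line ST1', 40.0), ('QV12', 80.0), ('QV13', 95.0)]
```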

Next, the output device 30 will be described.

The output device 30 is provided with an output control processor 31. The output control processor 31 displays information using the display 251 as the output device 30. The output control processor 31 arranges and displays the information representing the events extracted by the evaluation processor 11 in the order in which the subject vehicle encounters them.

The output control processor 31 includes a ROM (Read Only Memory) storing a program for executing the processing for displaying information representing events, a CPU (Central Processing Unit) functioning as an operation circuit that causes the output device 30 to function by executing the program, and a RAM (Random Access Memory) functioning as an accessible storage device. In other words, the output control processor 31 has a storage medium storing a program for executing the processing for displaying information representing the events.

Fig. 4 is an example of the display information VW showing the events over time. In the display example shown in Fig. 4, the first route of the subject vehicle V1 is indicated by an arrow T; the direction of the arrow is the time axis of the subject vehicle V1. Arrows curving at 90 degrees from the intersections QV12 and QV13 extracted as events are superimposed on the arrow T. As events encountered by the subject vehicle V1, the signal SG1 or the stop line ST1 and the intersections QV12 and QV13 with the second routes may be indicated. The information indicating an event (its position and timing) may be a symbol or an abstract mark, and its coloring, size, and the like can be determined arbitrarily.

The output control processor 31 displays information such as symbols and marks representing the extracted events at positions corresponding to the ratio of the actual distances from the subject vehicle V1 to the respective events. In Fig. 4, the output control processor 31 sets the length of the arrow T representing the first route to a predetermined distance, and determines the positions of the intersections QV12 and QV13 on the arrow T so that the ratio of the actual distance between the subject vehicle V1 and the intersection QV12 to the actual distance between the subject vehicle V1 and the intersection QV13 is expressed in the display information VW. Alternatively, taking the speed of the subject vehicle V1 into consideration, the output control processor 31 may set the length of the arrow T to a predetermined distance and determine the positions of the intersections QV12 and QV13 on the arrow T so that the ratio of the time for the subject vehicle V1 to reach the intersection QV12 to the time for it to reach the intersection QV13 is expressed in the display information VW.
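The distance-ratio placement on the arrow T might be sketched as follows (illustrative only; the pixel units and the normalization that places the farthest event at the arrow tip are assumptions):

```python
def display_positions(arrow_length_px, vehicle_to_event_distances_m):
    """Place each event marker along the arrow T so that the on-screen
    offsets keep the ratio of the actual distances from the subject
    vehicle V1 to the events; the farthest event sits at the arrow tip."""
    farthest = max(vehicle_to_event_distances_m.values())
    return {name: arrow_length_px * d / farthest
            for name, d in vehicle_to_event_distances_m.items()}

# A 300 px arrow with QV12 at 80 m and QV13 at 120 m from the vehicle:
print(display_positions(300.0, {"QV12": 80.0, "QV13": 120.0}))
# {'QV12': 200.0, 'QV13': 300.0}
```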

Further, when an event to be encountered is an object, the output control processor 31 obtains the positional relationship with the subject vehicle V1 in consideration of the position of the object and its relative speed. In this example, an object behind the subject vehicle V1, such as another vehicle approaching from the rear, can also be recognized as an event encountered by the subject vehicle V1; the positional relationship with the subject vehicle V1 is likewise obtained in consideration of the position and relative speed of the other vehicle approaching from the rear.

Whether an event involves a stationary object, such as a stop position defined by a traffic rule or a road structure, or a moving object, such as a pedestrian or another vehicle (including another vehicle approaching from the rear), the output control processor 31 rearranges the stationary objects and moving objects along a common time axis: the order in which the subject vehicle encounters them.

Thus, by arranging and displaying the events encountered by the subject vehicle V1 traveling on the first route in the order in which they will be encountered, the driver of the subject vehicle V1 can visually recognize which events will be encountered and in what order.

The output control processor 31 of the present embodiment displays information output from the operation planning device 20 to be described later. Specific display examples will be described later.

Here, an evaluation process of a scene using the traffic rule information 224 will be described. A scene of this example is shown in Fig. 5A. As shown in Fig. 5A, the subject vehicle V1 traveling on the first route BV1 turns left through the intersection where the signal SG1 is installed. The evaluation processor 11 extracts the routes having an intersection with the first route BV1. As in the example described above, the second route BV2S and the second route BV3R shown in Fig. 5B are extracted. The evaluation processor 11 refers to the traffic rule information 224 associated with the position information and looks up the traffic rules defined on the first route BV1L. The evaluation processor 11 extracts the stop line ST1 on the first route BV1L, and the position of the stop line ST1 is stored in association with the first route BV1L. The evaluation processor 11 also refers to the traffic rule information 224 for the second route BV2S and the second route BV3R, extracts the traffic rules related to the operation of the other vehicles, and stores them. The evaluation processor 11 determines an event position of the subject vehicle V1 according to the position to which a traffic rule stored in the traffic rule information 224 applies (the position of the stop line). In this example, the evaluation processor 11 determines the position of the intersection QV1S between the stop line ST1 stored in the traffic rule information 224 and the first route BV1L as an event position.

The evaluation processor 11 examines the priority among the routes. In the example shown in Fig. 5B, the first route BV1L has a green signal (proceed instruction) and the second route BV3R has a green signal, while the second route BV2S has a red signal (stop instruction). In this case, the evaluation processor 11 determines that the priority of the first route BV1L is higher than that of the second route BV2S. Since the priority of the first route BV1L is higher, the evaluation processor 11 may exclude the intersection QV12 from the event candidates; alternatively, the intersection may be stored as an event and judged as "proceed" in the operation planning processing described later. The evaluation processor 11 does not determine priorities between the first route BV1L and the second route BV3R, both of which have green signals.

The evaluation processor 11 rearranges the events into the time series in which the subject vehicle V1 will encounter them. The sequence information of the arranged events is sent to the operation planning processor 21. In addition, the extracted events are arranged in time series and presented to the user through the output device 30, so that the user can visually confirm what kind of events the subject vehicle V1 will encounter in the future.

Fig. 6 is an example of the display information VW showing the events over time. In the display example shown in Fig. 6, the traveling direction of the first route of the subject vehicle V1 is indicated by a bold arrow T; the direction of the arrow is the time axis of the subject vehicle V1. The intersection point QV1S with the stop line in front of the signal, extracted as an event, is displayed on the arrow T as an icon of the signal, and arrows curving at 90 degrees toward the intersection points QV12 and QV13 are superimposed. Further, the signal SG1 or the stop line ST1 and the intersections QV12 and QV13 with the second routes may be indicated as events (objects in this example) encountered by the subject vehicle V1. The information indicating an event (its position and timing) may be a symbol or an abstract mark, and its coloring, size, and the like can be determined arbitrarily.

Next, the operation planning apparatus 20 will be described. The operation planning apparatus 20 has an operation planning processor 21, which plans the driving behavior of the subject vehicle traveling on the route. The operation planning processor 21 acquires from the evaluation processor 11 the evaluation results of the relationships between the subject vehicle and the plurality of events encountered over time as the vehicle runs on the first route, and uses these evaluation results to build a driving plan for the subject vehicle V1 traveling on the first route. When building the driving plan, the operation planning processor 21 takes into account the existence of the objects detected by the object detection device 230, and establishes an operation plan that avoids contact with objects existing around the subject vehicle V1.

The operation planning processor 21 includes a ROM (Read Only Memory) storing a program for executing the processing for planning driving behavior, including the running and stopping of the subject vehicle, a CPU (Central Processing Unit) functioning as an operation circuit that causes the operation planning apparatus 20 to function by executing the program, and a RAM (Random Access Memory) functioning as an accessible storage device. In other words, the operation planning processor 21 has a storage medium storing a program for executing the processing for planning driving behavior, including the running and stopping of the subject vehicle.

The operation planning processor 21 of the present embodiment determines one action for each of the plurality of events extracted by the evaluation processor 11. The action to be determined is an action related to driving, and includes a proceed action and a stop action: the operation planning processor 21 determines either a proceed action or a stop action for each event. The operation planning processor 21 then considers the contents of the actions determined for these plural events comprehensively, and builds a series of operation plans for the scene encountered by the subject vehicle V1. It is thus possible to build an operation plan that clearly shows where stopping is required in one scene, from the start of passage to the end of passage. This simplifies the process up to the final operation plan and thereby reduces the calculation load.
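Combining the per-event proceed/stop decisions into one series plan for the scene can be sketched as follows (illustrative only; the plan representation and the rule of stopping at the first "stop" event are assumptions, not the embodiment's exact procedure):

```python
def plan_scene(event_actions):
    """Combine the per-event 'proceed'/'stop' decisions into one plan
    for the scene: the vehicle proceeds through consecutive 'proceed'
    events, and the plan records the first event requiring a stop."""
    for name, action in event_actions:          # events in encounter order
        if action == "stop":
            return {"stop_at": name, "stop": True}
    return {"stop_at": None, "stop": False}

# A stop is required at the second event (intersection QV13):
print(plan_scene([("QV12", "proceed"), ("QV13", "stop")]))
# {'stop_at': 'QV13', 'stop': True}
print(plan_scene([("QV12", "proceed"), ("QV13", "proceed")]))
# {'stop_at': None, 'stop': False}
```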

Hereinafter, a method by which the operation planning processor 21 determines the driving behavior will be described with reference to Figs. 7A and 7B, taking as an example the two events shown in Fig. 2G (the intersection QV12 and the intersection QV13).

Fig. 7A is a diagram for explaining the method of determining the driving behavior at the event (intersection QV12) shown in Fig. 2G. The operation planning processor 21 determines the driving behavior to be taken with respect to the event in which the subject vehicle V1 passes the point where the first route BV1L crosses the second route BV2S. The operation planning processor 21 calculates the positional relationship between the subject vehicle and the other vehicle V2 corresponding to the second route BV2S, and the change in that positional relationship (the degree of approach). Based on the time until the subject vehicle V1 and the other vehicle V2 would come into contact, the operation planning processor 21 judges whether or not the subject vehicle V1 can pass through the intersection QV12 of the first route and the second route without contacting the other vehicle V2.

Consider the intersection QV12, at which the subject vehicle V1 is likely to encounter the event.

As shown in Fig. 7A, the operation planning processor 21 calculates the estimated times until the subject vehicle V1 and the other vehicle V2 reach the intersection QV12, and judges whether the subject vehicle V1 can pass through the intersection QV12 with a sufficient margin. For example, assume that the speed of the subject vehicle V1 is VV1, the distance from the subject vehicle V1 to the intersection QV12 is L1, the speed of the other vehicle V2 is VV2, and the distance from the other vehicle V2 to the intersection QV12 is L2.

When the following equation (1) is satisfied, it is determined that the subject vehicle V1 is likely to come into contact with the other vehicle V2 at the intersection QV12, and the driving behavior for the event encountered at the intersection QV12 is determined as "stop".

|L1 / VV1 − L2 / VV2| &lt; T_threshold … (1)

On the other hand, when the following equation (2) is satisfied, it is determined that the possibility of the subject vehicle V1 coming into contact with the other vehicle V2 at the intersection QV12 is low, and the driving behavior in this case is determined as "proceed".

|L1 / VV1 − L2 / VV2| ≥ T_threshold … (2)

Here, T_threshold is a time margin that takes safety into account with regard to the mutual passage of the vehicles.
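As a rough sketch, the arrival-time comparison described above can be expressed in code. The function name and the threshold value are illustrative assumptions, and the contact test is assumed to compare the difference of the arrival times L1/VV1 and L2/VV2 against the margin T_threshold:

```python
# Sketch of the pass/stop decision for an intersection such as QV12;
# names and values are illustrative assumptions, not from the patent.
def decide_behavior(l1, vv1, l2, vv2, t_threshold):
    """Return 'stop' if the two arrival times at the intersection differ
    by less than the safety margin t_threshold, otherwise 'proceed'."""
    t1 = l1 / vv1  # estimated time for the subject vehicle V1 to arrive
    t2 = l2 / vv2  # estimated time for the other vehicle V2 to arrive
    if abs(t1 - t2) < t_threshold:  # contact is likely
        return "stop"
    return "proceed"                # contact is unlikely

if __name__ == "__main__":
    # V1 needs 4.0 s, V2 needs 4.5 s; with a 2 s margin the gap is too small.
    print(decide_behavior(40, 10, 45, 10, 2.0))  # -> stop
    print(decide_behavior(40, 10, 90, 10, 2.0))  # -> proceed
```

A larger T_threshold makes the plan more conservative, trading throughput for a wider safety margin.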

Fig. 7B is a diagram for explaining the method of determining the driving behavior for the event (intersection QV13) shown in Fig. 2G. The operation planning processor 21 determines the driving behavior to be taken for the event in which the subject vehicle V1 passes through the point where the first path BV1L intersects the second path BV3R. The operation planning processor 21 calculates the positional relationship between the subject vehicle V1 and the other vehicle V3 corresponding to the second route BV3R, and the change in that positional relationship (the degree of approach). Based on the time until the subject vehicle V1 and the other vehicle V3 would come into contact, the operation planning processor 21 judges whether the subject vehicle V1 can pass through the intersection QV13 of the first route and the second route without contacting the other vehicle V3. In other words, the operation planning processor 21 judges whether the subject vehicle V1 can pass through the intersection QV13 without encountering the event of contact with the other vehicle V3.

As shown in Fig. 7B, the operation planning processor 21 calculates the estimated times until the subject vehicle V1 and the other vehicle V3 reach the intersection QV13, and judges whether the subject vehicle V1 can be allowed to pass through the intersection QV13. In other words, the operation planning processor 21 judges whether the possibility that the subject vehicle V1 contacts the other vehicle V3 at the intersection QV13 is low. For example, suppose that the speed of the subject vehicle V1 is VV1, the distance from the subject vehicle V1 to the intersection is L1, the speed of the other vehicle V3 is VV3, and the distance from the other vehicle V3 to the intersection QV13 is L3. L3 may be calculated with reference to the curvature or radius of curvature stored in the road information 223, or with reference to the distance between the nodes stored in the road information 223.

When the following equation (3) is satisfied, it is determined that there is a high possibility that the subject vehicle V1 comes into contact with the other vehicle V3 at the intersection QV13, and the driving behavior in this case is judged as "stop".

|L1 / VV1 − L3 / VV3| &lt; T_threshold … (3)

On the other hand, when the following equation (4) is satisfied, it is determined that the possibility of the subject vehicle V1 coming into contact with the other vehicle V3 at the intersection QV13 is low, and the driving behavior for this event is judged as "proceed".

|L1 / VV1 − L3 / VV3| ≥ T_threshold … (4)

As before, T_threshold is a safety time margin for the mutual passage of the vehicles.

The output control processor 31 described above may display the result of the driving behavior determination for each event on the display 251. Fig. 8 is a display example of the driving behavior determination results. As shown in Fig. 8, the output control processor 31 arranges the plurality of events in the order in which the subject vehicle V1 encounters them, and displays the driving behavior judged for each event with text information or symbols.

The operation planning processor 21 of this embodiment forms a series of operation plans for the scene encountered by the vehicle by using the evaluated relationships between the subject vehicle V1 and the plurality of events encountered over time. Although not particularly limited, the operation planning processor 21 forms an integrated driving plan for the driving operations to be performed by the subject vehicle V1 in the scene it encounters. The operation plan associates a stop or proceed command with each of the events extracted on the first route, covering the time from when the subject vehicle V1 enters the scene (region R1) until it leaves the scene (region R1).

When a stop behavior or an undeterminable judgment is made for at least one of the events extracted by the evaluation processor 11, the operation planning processor 21 of the present embodiment formulates a driving plan that stops the subject vehicle V1 in the scene.

When a stop behavior or an undeterminable judgment is made for at least one of the events extracted by the evaluation processor 11, the operation planning processor 21 of the present embodiment plans to stop the subject vehicle V1 at the event nearest to its current position. If there is a point requiring a stop in the area R1 corresponding to the scene, the subject vehicle V1 is stopped promptly, thereby avoiding the risk.
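The scene-level rule just described can be sketched as follows; the data layout (distance/judgment pairs) is an illustrative assumption:

```python
# Sketch: if any extracted event is judged "stop" or "undeterminable",
# plan a stop at the blocking event nearest to the subject vehicle V1.
def plan_scene(events):
    """events: list of (distance_from_vehicle, judgment) tuples,
    judgment in {'proceed', 'stop', 'undeterminable'}."""
    blocking = [e for e in events if e[1] in ("stop", "undeterminable")]
    if not blocking:
        return ("proceed", None)
    # stop at the blocking event closest to the current position
    nearest = min(blocking, key=lambda e: e[0])
    return ("stop", nearest[0])

if __name__ == "__main__":
    events = [(12.0, "proceed"), (30.0, "stop"), (55.0, "proceed")]
    print(plan_scene(events))  # -> ('stop', 30.0)
```

Stopping at the nearest blocking event, rather than the farthest, keeps the vehicle out of the conflict area entirely.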

Incidentally, cases in which the operation planning processor 21 makes an undeterminable judgment include: when the ratio of the blind-spot area included in the image of the camera 231 is equal to or larger than a predetermined value, when the detection accuracy of the object detection apparatus 230 is less than a predetermined value, when processing by the lane departure prevention device 240 is stopped, or when there is an intervention from the driver. In the undeterminable case, execution of an operation plan based on inaccurate information can be suppressed by stopping the vehicle quickly.

When the event encountered next after an event for which a proceed behavior was determined is judged by the evaluation processor 11 as a stop behavior or as undeterminable, the operation planning processor 21 of the present embodiment formulates a driving plan that stops the subject vehicle V1 at the point of the event for which the proceed behavior was determined. Even if a proceed behavior has once been determined, the subject vehicle V1 can be stopped at that point if the next event it encounters requires a stop or cannot be judged. Since a point where a proceed behavior has been determined is a point where the presence of the subject vehicle V1 is permitted, the subject vehicle V1 can be stopped there safely.

When an event judged as a stop behavior or as undeterminable among the events extracted by the evaluation processor 11 belongs to the second route, the operation planning processor 21 of the present embodiment stops the subject vehicle V1 at a stoppable position upstream of the second route. Even when a stop behavior or an undeterminable judgment is made for an event, if the stop position for that event lies on the second route, stopping there could obstruct other vehicles traveling on the second route and is therefore not appropriate. According to the present embodiment, the stop position can instead be set at a stoppable position upstream of the second route.

When the stop position of an event judged by the evaluation processor 11 as a stop behavior or as undeterminable approaches or overlaps that of another event so that both events are within a predetermined distance, the operation planning processor 21 of the present embodiment stops the subject vehicle V1 at a stoppable position upstream of the events. Even when a stop behavior or an undeterminable judgment is made for an event, if its stop position approaches or overlaps the stop position of another event, consistency with the judgment on the other event would have to be considered, so it is not suitable as a stop position. According to the present embodiment, the stop position can be set at a stoppable position on the upstream side. This reduces the cases in which judgment is impossible. It also reduces the load of the judgment processing and makes it possible to travel smoothly within the area R1 of the scene without repeating stop-and-go.

When a proceed behavior is determined for one of the events extracted by the evaluation processor 11 and a stop behavior or an undeterminable judgment is made for another event encountered after it, the operation planning processor 21 of the present embodiment formulates an operation plan that advances the subject vehicle V1 through the first event if the degree of separation between the two events is equal to or greater than a predetermined value. When proceeding is permitted for one event but a later event is judged as a stop behavior or as undeterminable, stopping the subject vehicle V1 at the upstream event would require judging again whether it can proceed, and could also obstruct the traffic flow of other vehicles on the other second route. Thus, when the judgments differ, such as "proceed" on the upstream side and "stop" on the downstream side, complicated processing can be prevented by advancing the subject vehicle V1 on the upstream side.
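The separation rule can be sketched as below. The function name and the interpretation that an insufficient separation forces a stop already at the first event (consistent with the merging rule described earlier) are assumptions:

```python
# Sketch: proceed through an event judged "proceed" when the following
# stop/undeterminable event is at least min_separation farther away;
# otherwise stop already at the first event.
def action_at_first_event(first_dist, next_dist, next_judgment, min_separation):
    if next_judgment in ("stop", "undeterminable"):
        if next_dist - first_dist >= min_separation:
            return "proceed"  # enough room to stop between the two events
        return "stop"         # too close: treat the events as one stop
    return "proceed"

if __name__ == "__main__":
    print(action_at_first_event(10.0, 40.0, "stop", 20.0))  # -> proceed
    print(action_at_first_event(10.0, 15.0, "stop", 20.0))  # -> stop
```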

The operation planning processor 21 of the present embodiment may set the stop position of the subject vehicle V1 at the position within the boundary R1 corresponding to the scene that is nearest to the current position of the subject vehicle V1. The stop position of the subject vehicle V1 may also be set on the near side of the boundary R1 corresponding to the scene. Alternatively, the stop position of the subject vehicle V1 may be the most upstream event position, in the approach direction of the subject vehicle V1, among the events within the boundary R1 corresponding to the scene.

The stop position setting process can be selected according to the traffic volume at the intersection, the road type, and the road width.

The operation planning processor 21 of the present embodiment also controls the vehicle speed.

When a proceed behavior is determined for one of the events extracted by the evaluation processor 11 and a stop behavior or an undeterminable judgment is made for another event encountered after it, the operation planning processor 21 of the present embodiment formulates a driving plan that reduces the speed of the proceed behavior at the first event.

Here, a specific scene will be described as an example of the scene evaluation processing and the drafting of an operation plan. The scene of this example is shown in Fig. 9A. As shown in Fig. 9A, the subject vehicle V1 traveling on the first route BV1 turns left at an intersection that is provided with the signal SG1 and at which the crosswalk CR1 is installed. The evaluation processor 11 extracts the paths having an intersection with the first path BV1L. Here, a crosswalk is treated as one of the paths along which pedestrians travel.

In this example, as second routes, the crosswalk CR1, the second route BV2S, the second route BV3R, and the crosswalk CR4 are extracted as shown in Fig. 9B. The evaluation processor 11 refers to the traffic rule information 224 associated with the position information and looks up the traffic rules defined on the first path BV1L. The evaluation processor 11 extracts the stop line ST1, which is on the first path BV1L and upstream of the crosswalk CR1. The position of the stop line ST1 is stored in association with the first path BV1L. The evaluation processor 11 also refers to the traffic rule information 224 for the second route BV2S and the second route BV3R, extracts the traffic rules related to the operation of the other vehicles, and stores them. The evaluation processor 11 determines an event position for the subject vehicle V1 according to the position (the position of the stop line) to which the traffic rule stored in the traffic rule information 224 applies.

In this example, the evaluation processor 11 determines the position of the intersection QVC1 between the stop line ST1 and the first path BV1L, stored in the traffic rule information 224, as an event position. The stop positions corresponding to the respective event positions are stored for each route. In this example, the event at the intersection QV1S is associated with the crosswalk CR1 as its stop position. The event at the intersection QV12 is associated with the second path BV2S as its stop position. The event at the intersection QV13 is associated with the second path BV3R as its stop position. The event at the intersection QVC4 is associated with the crosswalk CR4 as its stop position.

The evaluation processor 11 examines the priority of the first path and the second paths. In the example shown in Fig. 9B, the first path BV1L has a green signal (proceed instruction), and the second path BV3R also has a green signal. On the other hand, the second path BV2S has a red signal (stop instruction). In this case, the evaluation processor 11 determines that the priority of the first path BV1L is higher than the priority of the second path BV2S. Since the priority of the first path BV1L is higher than that of the second path BV2S, the evaluation processor 11 may exclude the intersection QV12 of the first path BV1L from the event candidates. Of course, it may instead be stored as an event and judged as "proceed" in the operation planning processing described later. The evaluation processor 11 does not rank the priorities of the first path BV1L and the second path BV3R, which both have green signals.

In addition, the evaluation processor 11 examines the priority between the first route and a crosswalk treated as a second route.

In the example shown in Fig. 9B, since the signal of the crosswalk CR1 is red (crossing prohibited) and the first path BV1L has a green signal (proceed instruction), the priority of the first path BV1L is judged to be higher than the priority of the crosswalk CR1. Since the priority of the first path BV1L is higher than the priority of the crosswalk CR1, the evaluation processor 11 may exclude the intersection QVC1 of the first path BV1L from the event candidates. Of course, it may instead be stored as an event and judged as "proceed" in the operation planning processing described later.

In the example shown in Fig. 9B, the signal of the crosswalk CR4, which the first path BV1L intersects, is green (crossing permitted). Although the first route BV1L also has a green signal (proceed instruction), the priority of the first route BV1L is judged to be lower than the priority of the crosswalk CR4 in accordance with the traffic rule giving priority to pedestrians on a crosswalk. Since the priority of the first route BV1L is lower than the priority of the crosswalk CR4, the evaluation processor 11 stores the intersection QVC4 of the first route BV1L as an event.

The evaluation processor 11 judges as events the intersection QV1S of the first path BV1L with the stop line ST1, the intersection QV12 of the first path BV1L with the second path BV2S, the intersection QV13 of the first path BV1L with the second path BV3R, and the intersection QVC4 of the first path BV1L with the crosswalk CR4.

The evaluation processor 11 extracts objects, such as pedestrians and two-wheeled vehicles, present on the first path BV1L, the second path BV2S, the second path BV3R, and the crosswalks CR1 and CR4 treated as second paths. In the example shown in Fig. 9B, the evaluation processor 11 extracts as events the other vehicle V2 traveling on the second route BV2S, the other vehicle V3 traveling on the second route BV3R, the pedestrian H1 crossing the crosswalk CR1, and the pedestrian H4 crossing the crosswalk CR4. Each object is stored in association with its path and position.

The evaluation processor 11 sorts the events into the time series in which the subject vehicle V1 will encounter them. The sequence information of the arranged events is sent to the operation planning processor 21. In addition, the extracted events are arranged in time series and presented to the user through the output device 30. The user can visually confirm what events the subject vehicle V1 will encounter from now on.
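The rearrangement step amounts to sorting by the distance at which each event will be encountered along the first route. The event structure below is an illustrative assumption:

```python
# Sketch of the rearrangement: sort events by encounter distance along
# the first route (distance is a stand-in for encounter time).
def arrange_events(events):
    """events: list of dicts with 'name' and 'dist' (m along the route)."""
    return sorted(events, key=lambda e: e["dist"])

if __name__ == "__main__":
    events = [{"name": "QV13", "dist": 42.0},
              {"name": "QV1S", "dist": 15.0},
              {"name": "QV12", "dist": 30.0},
              {"name": "QVC4", "dist": 55.0}]
    print([e["name"] for e in arrange_events(events)])
    # -> ['QV1S', 'QV12', 'QV13', 'QVC4']
```

The resulting order is exactly what the display of Fig. 10 presents along the arrow T.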

Fig. 10 is an example of the display information VW showing the events along a time axis. In the display example shown in Fig. 10, the traveling direction along the first path of the subject vehicle V1 is indicated by a bold arrow T. The direction of the arrow is the time axis of the subject vehicle V1. On the arrow T, the intersection QV1S with the stop line in front of the signal, extracted as an event, is shown with an icon; the intersections QV12 and QV13 are shown with arrows curving 90 degrees; and the intersection with the crosswalk CR4 is overlaid with a crosswalk icon. Further, the crosswalk CR1, the intersections QV12 and QV13, and the crosswalk CR4 of the second paths may be indicated as objects that could come into contact with the subject vehicle V1. In addition, objects existing on each second path may be indicated. In this display example, the pedestrian H1 on the crosswalk CR1 and the pedestrian H4 on the crosswalk CR4 are displayed. The information indicating an event point (the object position, the event timing, and so on) and the object may be represented by symbols or abstract marks. Coloring, size, and the like can be determined arbitrarily.

The operation planning processor 21 of the present embodiment determines the driving behavior as follows for each event, or for the second path associated with each event.

(1) The signal SG1 on the first path BV1L indicates green (proceed), and the pedestrian signal SGH1 on the crosswalk CR1 indicates red (stop). Since the priority of the first route BV1L is higher than the priority of the crosswalk CR1, the operation planning processor 21 determines that the driving behavior for the event at the intersection QV1S is "proceed".

(2) The signal SG2 on the second path BV2S indicates red (stop). Since the priority of the first route BV1L is higher than the priority of the second route BV2S, the operation planning processor 21 determines that the driving behavior for the event at the intersection QV12 is "proceed".

(3) The signal SG1 on the first path BV1L indicates green (proceed), and the signal SG3 on the second path BV3R also indicates green (proceed). The operation planning processor 21 does not rank the priority of the first path BV1L against the second path BV3R. The operation planning processor 21 determines the driving behavior at the intersection QV13 based on the time until the subject vehicle V1 and the other vehicle V3 traveling on the second route BV3R would come into contact.

(4) The signal SG1 on the first route BV1L indicates green (proceed), and the pedestrian signal SGH4 on the crosswalk CR4 indicates green (crossing permitted). In the traffic rule information 224, the priority of a crosswalk is defined to be higher than that of a road for vehicles. Although both signals are green, the operation planning processor 21 judges, in accordance with the traffic rule information 224, that the driving behavior for the event at the intersection QVC4 (see Fig. 9B) is "stop".
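The four decisions above reduce to a priority comparison with a time-margin fallback for the equal-priority case. The sketch below is an illustrative reading, not the patent's implementation; the numeric priority encoding and the `margin_ok` flag are assumptions:

```python
# Sketch: decide the behavior from route priorities, falling back to the
# time-margin result when the priorities are equal (case (3) above).
def decide(own_priority, other_priority, margin_ok=None):
    if own_priority > other_priority:
        return "proceed"                  # cases (1) and (2)
    if own_priority < other_priority:
        return "stop"                     # case (4): crosswalk has priority
    # case (3): both green, priorities unranked -> use the margin check
    return "proceed" if margin_ok else "stop"

if __name__ == "__main__":
    print(decide(2, 1))                   # (1)/(2) -> proceed
    print(decide(1, 2))                   # (4)     -> stop
    print(decide(1, 1, margin_ok=False))  # (3)     -> stop
```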

The operation planning processor 21 of the present embodiment plans an operation plan for each scene in succession. The operation planning processor 21 judges whether, among the plurality of events extracted within the region R1 set for the scene, there is an event determined as "stop". When at least one event determined as "stop" is included in the extracted events, the operation planning processor 21 judges that the driving behavior for the scene as a whole is "stop". Further, the operation planning processor 21 determines a specific stop position.

In this example, when "stop" is determined for the event at the intersection QV13 on the second route BV3R, for instance, the routes determined as "stop" are the second route BV3R and the crosswalk CR4.

The operation planning processor 21 determines "stop" for the event closest to the subject vehicle V1 traveling on the first route. The operation planning processor 21 sets the stop position based on the position of this event. The stop position is located upstream of the event position with respect to the traveling direction of the subject vehicle V1 on the first route, and within a predetermined distance from the event. In this example, "stop" is determined not for the intersection QVC4 with the crosswalk CR4 but for the event at the intersection QV13 on the second path BV3R.

In the above case, the stop position related to the intersection QV13 (event) is associated with the second path BV3R, but the intersection QV13 also lies on the second path BV2S. For this reason, the operation planning processor 21 does not set the intersection QV13 or its vicinity as the stop position, but instead sets the stop position in the vicinity of the intersection QV12 associated with the second path BV2S.

When "proceed" is determined for the event at the intersection QV13 on the second route BV3R, the intersection QVC4 of the crosswalk CR4 is set as the stop position. However, the intersection QVC4 lies on the crosswalk CR4 and on the second path BV3R. As a result, the intersection QV13 on the second path BV3R, upstream in the traveling direction of the subject vehicle V1, is set as the stop position. Further, the intersection QV13 lies on the second path BV3R and on the second path BV2S. As a result, the intersection QV12 on the second path BV2S, further upstream in the traveling direction of the subject vehicle V1, is set as the stop position.
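This upstream relocation can be sketched as a loop: while a tentative stop position lies on some second path, move one step upstream along the first route. The mapping structures below are illustrative assumptions:

```python
# Sketch of the upstream relocation of a stop position.
def relocate_stop(stop_event, on_second_path, upstream_of):
    """on_second_path: set of events whose position lies on a second path;
    upstream_of: maps an event to the next event upstream on the first route."""
    while stop_event in on_second_path and stop_event in upstream_of:
        stop_event = upstream_of[stop_event]
    return stop_event

if __name__ == "__main__":
    upstream = {"QVC4": "QV13", "QV13": "QV12"}
    # QVC4 and QV13 lie on second paths, so the stop cascades back to QV12.
    print(relocate_stop("QVC4", {"QVC4", "QV13"}, upstream))  # -> QV12
```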

The above processing is repeated until it is determined that processing has been performed for all the events up to the destination along the first route.

A modified example of the above processing will be described below. The scene of this example is shown in Fig. 11.

This example is a scene in which the subject vehicle travels on the first path BV1, a road with one lane in each direction.

The evaluation processor 11 extracts the crosswalk CR as a second path having an intersection with the first path BV1. Further, the evaluation processor 11 detects the other vehicle V5 as an event and stores it in association with the first route BV1. In the above example, the first path and the second path intersect at an angle, but in this example, the first path and the second path share a common section.

The extracted events are arranged in the order in which they will be encountered. The relative distance from the subject vehicle V1 to the other vehicle V5 and the relative distance from the subject vehicle V1 to the crosswalk CR are obtained with reference to the first path BV1 of the subject vehicle V1. The output control processor 31 displays the arranged events on the display 251 based on the obtained relative distances. A display example is shown in Fig. 12. As shown in Fig. 12, the subject vehicle V1, the other vehicle V5, and the crosswalk CR are presented in this order.

When the subject vehicle V1 goes straight ahead in the state of Fig. 11A, it cannot pass the event of the other vehicle V5 because it would come into contact with the other vehicle V5. In addition, the operation planning processor 21 judges whether a pedestrian exists on the crosswalk CR. When the crosswalk CR is hidden behind the other vehicle V5, as in this example, the event is judged as undeterminable on the grounds that a blind spot has occurred.

The operation planning processor 21 determines the possibility that the subject vehicle V1 and the other vehicle V5 come into contact. The evaluation processor 11 searches for a route avoiding the other vehicle V5, as shown in Fig. 11B, and when an avoidance route is obtained, it can judge the event of the other vehicle V5 as "proceed". An avoidance route may be determined when a route that clears the other vehicle V5 by more than the vehicle width of the subject vehicle V1 can be taken without any other vehicle traveling in the opposite lane.

The operation planning processor 21 determines the behavior through the scene as a whole. In this example, since the other vehicle V5 can be avoided, the event is judged as "proceed". The presence of a pedestrian on the crosswalk CR could not be confirmed (undeterminable), so a temporary stop before the crosswalk is required and that event is judged as "stop". An event judged "proceed" thus precedes an event judged "stop". In this case, the operation planning processor 21 sets the speed for traveling along the route avoiding the other vehicle V5, for which "proceed" was determined, lower than the previously set speed; that is, it decelerates. The subject vehicle V1 decelerates, avoids the avoidable other vehicle V5, and then approaches and passes the crosswalk CR (event) that cannot be judged because of the blind spot created by the parked other vehicle V5.

If it is then confirmed that a pedestrian exists on the crosswalk CR and the crosswalk CR is judged as "stop", the vehicle stops in front of the crosswalk CR. Further, in the case of a road with two lanes in each direction, it is also possible to judge whether a lane change is possible while considering the possibility of collision with another vehicle traveling in the adjacent lane. In this way, it is possible to cope with events that are difficult to judge in advance, such as the occurrence of a blind spot.

Hereinafter, a method of setting the stop position candidate in determining the stop position in the operation plan will be described.

The operation planning processor 21 of this embodiment uses the evaluation results of the relationships between the subject vehicle V1 and the plurality of events encountered over time as the subject vehicle V1 travels on the first route to set, for each event, one or more stop position candidates at which the subject vehicle V1 may stop. The operation planning processor 21 then uses the evaluation results of the relationships between the subject vehicle V1 and the plurality of events encountered at the set stop position candidates to form an operation plan for the scene encountered by the subject vehicle V1.

In this way, in traffic where the first route and a second route intersect, the driving plan is formulated in consideration of the relationships between the subject vehicle V1 and the plurality of events encountered at the stop position candidates, so that driving operations that are neither excessive nor insufficient can be realized.

The operation planning processor 21 of this embodiment determines the stop position candidate nearest to the subject vehicle V1, among the plurality of stop position candidates, as the stop position for stopping the subject vehicle V1 in the scene it encounters. Since the subject vehicle V1 is stopped at the stop position candidate closest to its current position, the influence on the traffic flow can be suppressed.
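Selecting the nearest candidate can be sketched as follows; representing candidates as distances along the first route is an illustrative assumption:

```python
# Sketch: among the stop position candidates ahead of the subject vehicle,
# pick the one nearest to its current position.
def choose_stop_position(current_pos, candidates):
    """candidates: stop positions (m along the first route)."""
    ahead = [c for c in candidates if c >= current_pos]
    return min(ahead) if ahead else None

if __name__ == "__main__":
    print(choose_stop_position(5.0, [30.0, 18.0, 44.0]))  # -> 18.0
```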

The operation planning processor 21 of the present embodiment sets a stop position candidate at a position a predetermined distance upstream of the stop position at which the subject vehicle V1 is required to stop. The subject vehicle V1 is stopped at a position nearer its current position than the stop position defined in the actual traffic rule information 224, so the influence on the traffic flow can be suppressed.

The operation planning processor 21 of the present embodiment sets a stop position candidate outside an area where stopping of the subject vehicle V1 is prohibited, at a position a predetermined distance upstream of the outer frame of that area. The subject vehicle V1 is stopped at a position nearer its current position than the stop position defined in the actual traffic rule information 224, so the influence on the traffic flow can be suppressed.

The operation planning processor 21 of the present embodiment sets a stop position candidate outside the travelable area of the second route that intersects the first route. The subject vehicle V1 is stopped at a position nearer its current position than the lane of the second route or the outer edge of its travelable area, so the influence on the traffic flow can be suppressed.

The operation planning processor 21 of the present embodiment stops the subject vehicle V1 at a stop position candidate when the vehicle body of the subject vehicle V1 would protrude from the first path as the subject vehicle V1 passes an event. When the subject vehicle V1 would protrude from the first path, that is, when the vehicle body of the subject vehicle V1 could enter the lane or travelable area of another route, the subject vehicle V1 is stopped at the stop position candidate, so the influence on the traffic flow can be suppressed. Although not particularly limited, the subject vehicle V1 may likewise be stopped at a stop position candidate when at least part of its vehicle body would enter the second path as it passes an event. In the same way, the influence on the traffic flow can be suppressed.

The operation planning processor 21 of the present embodiment can refrain from setting a stop position candidate in an area where, according to the traffic signal of the first route or the traffic rules of the first route, no event encountered by the subject vehicle V1 occurs. When passage of the subject vehicle V1 on the first route is secured by a green signal, or when the first route is defined as a priority route by the traffic rules and passage of the subject vehicle V1 on that priority route is ensured, it is possible not to set a stop position candidate. This avoids stopping in scenes where stopping is unnecessary and enables smooth travel.

When the speed of another vehicle flowing from a second route that intersects the first route into the position of a stop position candidate on the first route is equal to or lower than a specified speed, the operation planning processor 21 determines stopping at another stop position candidate upstream of that candidate. If the speed of the other vehicle flowing into the position of the stop position candidate on the first route is equal to or lower than the specified speed, a traffic condition such as congestion may have occurred. In such a case, the vehicle might be unable to stop at an appropriate position, which could affect other vehicles or pedestrians. Therefore, when the speed of the other vehicle entering the position of the stop position candidate on the first route is equal to or lower than the specified speed, stopping at a stop position candidate close to the current position of the subject vehicle makes it possible to take a driving behavior that does not disturb the flow of other vehicles or pedestrians.
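The congestion fallback can be sketched as a simple threshold check; the function name and two-candidate interface are illustrative assumptions:

```python
# Sketch: when traffic flowing into the default stop candidate from the
# crossing route is at or below the specified speed (congestion suspected),
# fall back to a candidate farther upstream, closer to the subject vehicle.
def select_stop(default_pos, upstream_pos, inflow_speed, specified_speed):
    if inflow_speed <= specified_speed:
        return upstream_pos  # stop upstream, out of the congested area
    return default_pos

if __name__ == "__main__":
    print(select_stop(30.0, 18.0, inflow_speed=2.0, specified_speed=5.0))
    # -> 18.0
```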

Hereinafter, the first setting method of the stop position in the operation plan will be described.

The scene shown in Fig. 13A will be described as an example. The scene shown in Fig. 13A is a scene in which the vehicle V1 turns right at the intersection. The evaluation processor 11 extracts the first path BV1, the crosswalk CR1, and the crosswalk CR2.

Here, a second path having an intersection with the first path is extracted. By using the link information and node information included in the map information 222 for the extraction of the second path, the processing can be performed efficiently.

As shown in Fig. 13B, the intersection is represented in the map database by a plurality of nodes ND and links LK. Each node ND is indicated by a circle mark, and each link LK is indicated by an arrow on the route. A link LK connects a node ND serving as its start point to a node ND serving as its end point. At some nodes, links radiate from one node to a plurality of nodes; at others, links from a plurality of nodes converge on one node. By paying attention to nodes on which links from a plurality of nodes converge, it is possible to extract links that may flow into and interfere with the first path of the subject vehicle. As a result, lanes that are likely to flow into or interfere with the first path of the subject vehicle can be extracted.
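The convergence test described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the map is assumed to be a directed graph of nodes and links, as in Fig. 13B, and the function name and data layout are chosen for the example only.

```python
def extract_inflow_links(links, first_path_nodes):
    """links: list of (start_node, end_node) pairs from the map database.
    first_path_nodes: nodes traversed by the subject vehicle's first path.
    Returns links that end on a first-path node but do not start on one,
    i.e. candidate second paths flowing into the first path."""
    on_path = set(first_path_nodes)
    inflow = []
    for start, end in links:
        if end in on_path and start not in on_path:
            inflow.append((start, end))
    return inflow

# First path runs A -> B -> C; links from X and Y converge on node B.
links = [("A", "B"), ("B", "C"), ("X", "B"), ("Y", "B"), ("C", "D")]
print(extract_inflow_links(links, ["A", "B", "C"]))  # [('X', 'B'), ('Y', 'B')]
```

Only the links converging onto the first path from outside it are returned; links along the first path itself and links leaving it are ignored.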

The operation planning processor 21 sets stop position candidates. The stop position candidates are set for events selected from the encounter events extracted by the evaluation processor 11. The operation planning processor 21 determines, for each event extracted by the evaluation processor 11, whether or not it is to be a stop position candidate.

Depending on the signal state of the first path of the subject vehicle V1, a second path that would otherwise flow into and interfere with the first path of the subject vehicle V1 may not need to be considered. In the example shown in Fig. 13C, the signal SG1 at the intersection shows a green signal. In this case, the signal SG2 of the second path BV2S orthogonal to the first path BV1R of the subject vehicle V1 shows a red signal (stop). As a result, the other vehicle V2 on the second route does not affect the subject vehicle V1. The evaluation processor 11 determines that the first path BV1R has higher priority than the second path BV2S. Then, in consideration of the fact that the signal SG2 of the second path BV2S shows a red signal, the evaluation processor 11 does not treat the intersection with the second path BV2S as an encounter event.

The relationship between the subject vehicle V1 and the crosswalk CR1 will be examined. If the pedestrian signal SGH1 of the crosswalk CR1 is a red signal, pedestrians on the crosswalk CR1 do not affect the subject vehicle V1. For this reason, the evaluation processor 11 determines that the crosswalk CR1 is not an event encountered by the subject vehicle V1. If the pedestrian signal SGH1 of the crosswalk CR1 is a green signal, pedestrians on the crosswalk CR1 may affect the subject vehicle V1. For this reason, the evaluation processor 11 determines that the crosswalk CR1 is an event encountered by the subject vehicle V1. There may be a case where no signal is present at the crosswalk CR1 or the content of the signal cannot be detected. In such a case, the state of the pedestrian signal SGH1 on the crossing can be estimated from the vehicle signal SG1, and whether or not it is an event can be judged using the above method.

Next, the relationship between the subject vehicle V1 and the second route BV3S is examined. The other vehicle V3 goes straight on the second path BV3S, and the signal SG3 of the second path BV3S, which controls the traveling of the other vehicle V3, is a green signal. Under the traffic rules, the second route BV3S is a lane whose traveling is prioritized over the first route BV1R on which the subject vehicle V1 turns right. For this reason, the evaluation processor 11 determines that the intersection QV13 between the second path BV3S and the first path BV1R is an event.

In this example, the evaluation processor 11 extracts as encounter events the three points of the stop line ST1 on the first path BV1R, the intersection QV13 with the second path BV3S, and the stop position QVJC before the crosswalk CR2.

As shown in Fig. 13D, the evaluation processor 11 arranges the events in the order in which the subject vehicle V1 encounters them, based on the relative distances from the subject vehicle V1 to the respective events on the first route BV1R (the stop line ST1, the intersection QV13 with the second path BV3S, and the stop position QVJC in front of the crosswalk CR2).

The operation planning processor 21 makes a determination of "go" or "stop" for each event. For example, the state of the signal SG1 is examined for the stop line ST1, and the presence or absence of a crossing pedestrian is examined for the crosswalk CR2. That is, with respect to the stop line ST1, a go determination is made if the signal SG1 is a green signal, and a stop determination is made if the signal SG1 is a red signal. As for the crosswalk CR1, a stop determination is made if there is a pedestrian who is crossing or about to start crossing; if there is no such pedestrian, a go determination is made. In addition, for a second path that flows into and interferes with the first path of the subject vehicle V1, go or stop is judged from the presence or absence of another vehicle running on the second path and from its degree of approach. The method for determining the degree of approach is as described above.

The operation planning processor 21 sets the stop position candidates according to the positions of the respective events. In the example shown in Fig. 13D, the operation planning processor 21 sets the stop position candidate SP1 in the vicinity of the stop line ST1, the stop position candidate SP2 in the vicinity of the center R0 of the intersection, and the stop position candidate SP3. The stop position candidate SP1 is set at a position a predetermined distance upstream (on the subject vehicle V1 side) of the stop line ST1. The stop position candidate SP2 is set at a position a predetermined distance upstream of the intersection QV13 of the second path BV3S and the first path BV1R. The stop position candidate SP3 is set at a position a specified distance before the crosswalk. Because the signal SG1 of the first path BV1R shows a green signal, the subject vehicle can stop at any of these three stop position candidates without interfering with the traffic flow of other paths.
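Placing each candidate a fixed distance upstream of its event can be sketched as follows. This is an illustrative assumption only: the patent does not give a concrete offset value, and the 3 m margin and the one-dimensional distance-along-path model are chosen for the example.

```python
OFFSET_M = 3.0  # assumed upstream margin in metres (illustrative)

def set_stop_candidates(event_distances, offset=OFFSET_M):
    """event_distances: distance along the first path from the subject
    vehicle to each encounter event [m].  Each stop position candidate is
    placed the given offset upstream of its event, clamped at the
    vehicle's own position (0 m)."""
    return [max(d - offset, 0.0) for d in event_distances]

# Events at the stop line, the intersection with BV3S, and the crosswalk.
print(set_stop_candidates([10.0, 25.0, 40.0]))  # [7.0, 22.0, 37.0]
```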

The operation planning processor 21 determines an optimal stop position candidate from among the plurality of stop position candidates. The stop position candidate determined as "stop" that is closest to the subject vehicle V1, here the candidate at the center of the intersection, is determined as the stop position. In this manner, since the subject vehicle V1 is stopped by selecting an appropriate event from a plurality of stop position candidates, a stop position suitable for the encountered scene can be determined.
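The selection rule above, picking the nearest candidate whose event was judged "stop", can be sketched as follows. The tuple representation of candidates is an assumption made for the example.

```python
def decide_stop_position(candidates):
    """candidates: list of (distance_from_vehicle, decision) tuples,
    where decision is 'go' or 'stop'.  Returns the distance of the
    nearest 'stop' candidate, or None when every event is passable."""
    stops = [d for d, decision in candidates if decision == "stop"]
    return min(stops) if stops else None

# SP1 (stop line, green -> go), SP2 (oncoming traffic -> stop), SP3 (go).
print(decide_stop_position([(7.0, "go"), (22.0, "stop"), (37.0, "go")]))  # 22.0
```

When no candidate is judged "stop", the scene can simply be passed through, which matches the branch to step S16 in the flowchart described later.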

Next, a second setting method of the stop position in the operation plan will be described.

The scene shown in Fig. 14A will be described as an example. It is a scene in which the subject vehicle V1 turns right at a T-junction. The evaluation processor 11 extracts the first path BV1, the crosswalk CR1, and the crosswalk CR2. The first path BV1 has an intersection with the second path BV2S. At the T-junction, the second route BV2S has priority over the first route BV1, and its traffic is given the right of way.

The evaluation processor 11 refers to the traffic rule information 224 and extracts stop points on the first path BV1R on which the subject vehicle V1 runs. A stop point under the traffic rules is a point at which the subject vehicle V1 encounters a situation in which stopping is compulsory. Further, the evaluation processor 11 extracts points at which the subject vehicle V1 is likely to encounter an event. Specifically, the evaluation processor 11 extracts a second path CR1 (crosswalk), a second path BV2S, a second path BV4S, and a second path CR2 (crosswalk), each having an intersection with the first path BV1R. Then, the evaluation processor 11 extracts the intersections of the first path BV1R with these second paths. As shown in Fig. 14A, the intersection points extracted in this example are the point Q1 before the stop line ST1, the point Q2 before the crosswalk CR1, the point Q3 before the second path BV2S, the point Q4 before the second path BV4S, and the point Q5 before the crosswalk CR2.

Based on the relative distances from the subject vehicle V1 to the respective points on the first path, the operation planning processor 21 arranges the events in the order in which the subject vehicle V1 encounters them: Q1 → Q2 → Q3 → Q4 → Q5. The result is presented on the display 251 as needed.
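Ordering the events by relative distance can be sketched as follows. A one-dimensional position along the route is assumed for each event, and the names are illustrative.

```python
def order_events(vehicle_pos, events):
    """events: dict of event label -> position along the route [m].
    Returns the labels of events ahead of the vehicle, nearest first,
    i.e. in the order the subject vehicle will encounter them."""
    ahead = {k: p for k, p in events.items() if p >= vehicle_pos}
    return sorted(ahead, key=ahead.get)

events = {"Q3": 30.0, "Q1": 10.0, "Q5": 50.0, "Q2": 20.0, "Q4": 40.0}
print(order_events(0.0, events))   # ['Q1', 'Q2', 'Q3', 'Q4', 'Q5']
print(order_events(25.0, events))  # ['Q3', 'Q4', 'Q5']
```

Events already passed drop out of the ordering, which corresponds to re-evaluating the encounter scene as the vehicle progresses.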

The evaluation processor 11 refers to the map information 222, the road information 223 and the traffic rule information 224 to determine whether or not each point is an event that is the object of a stop position candidate. The evaluation processor 11 treats matters that may affect the subject vehicle V1 as encounter events and does not treat matters that do not affect the subject vehicle V1 as encounter events. The scene of this example is a T-junction without signals, and the subject vehicle V1 runs on the non-priority lane. As a result, all five of the extracted points are extracted as events encountered by the subject vehicle.

The operation planning processor 21 sets a stop position candidate according to the position of each event, as shown in Fig. 14B. The operation planning processor 21 sets each stop position candidate at a position shifted upstream by a predetermined distance from the intersections Q1 to Q5. In this example, the operation planning processor 21 sets as stop position candidates the stop position candidate SP1 corresponding to the stop line ST1, the stop position candidate SP2 corresponding to the crosswalk CR1, the stop position candidate SP3 corresponding to the intersection with the second route BV2S, the stop position candidate SP4 corresponding to the intersection with the second path BV4S, and the stop position candidate SP5 corresponding to the crosswalk CR2.

The operation planning processor 21 determines an appropriate stop position from the plurality of stop position candidates SP1 to SP5 included in one scene. Although not particularly limited, the operation planning processor 21 determines the stop position candidate SP1, which is the stop position candidate closest to the subject vehicle V1, as the stop position.

When a pedestrian is found on the crosswalk CR1 after the subject vehicle V1 has once stopped at the stop position candidate SP1, the operation planning processor 21 then determines the stop position candidate SP2 corresponding to the crosswalk CR1 as the next stop position.

The operation planning processor 21 integrates a plurality of stop position candidates when they are close to one another (within a predetermined distance). The processing load can thereby be reduced.
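The integration of nearby candidates can be sketched as follows. Keeping the upstream member of each close group and the 5 m gap are assumptions for the example; the patent only specifies "within a predetermined distance".

```python
def merge_candidates(positions, min_gap=5.0):
    """Integrate stop position candidates (distances along the path, [m])
    that lie within min_gap of the previously kept candidate, keeping
    the upstream (nearer to the vehicle) one of each group."""
    merged = []
    for p in sorted(positions):
        if not merged or p - merged[-1] > min_gap:
            merged.append(p)
    return merged

print(merge_candidates([7.0, 9.0, 22.0, 24.5, 37.0]))  # [7.0, 22.0, 37.0]
```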

When no pedestrian exists on the crosswalk CR1, the operation planning processor 21 advances the subject vehicle V1 to the stop position candidate SP3 and stops it there. When a pedestrian is found only on the crosswalk CR2, the stop position candidate SP5 corresponding to the crosswalk CR2 is determined as the stop position.

If another vehicle V4 is traveling on the second route BV4S and may affect the running of the subject vehicle V1, the operation planning processor 21 performs the following processing. The operation planning processor 21 determines, for the plurality of stop position candidates that have been set, whether or not any candidate lies on a path whose traveling direction differs from that of the subject vehicle V1. In the example shown in Fig. 14B, the stop position candidate SP4 lies in the lane area of the second path BV2S, whose direction differs from the first path BV1R. For this reason, the operation planning processor 21 predicts that stopping the subject vehicle V1 there would affect the other vehicle V2. The operation planning processor 21 therefore takes the stop position candidate SP3 in front of (on the upstream side of) the stop position candidate SP4 as the stop position. If the stop position candidate SP3 is not within the travelable area of a second route, it is determined as the stop position. According to this example, when the subject vehicle V1 turns right from the non-priority road into the priority road at the T-junction, the stop position can be determined in consideration of the other vehicle V2, which travels from right to left on the priority second path BV2S, and the other vehicle V4, which travels from left to right on the second route BV4S. The subject vehicle can thus stop at a natural stop position that does not affect the other vehicle V4.

Hereinafter, a third method of setting the stop position in the operation plan will be described.

The scene shown in Fig. 15 will be described as an example. It is a scene in which the road after the right turn is congested when the subject vehicle V1 makes a right turn at an intersection. The operation planning processor 21 determines that the first path of the subject vehicle V1 is congested when the speed of another vehicle entering the position of a stop position candidate on the first path is equal to or lower than the specified speed. In this example, an appropriate stop position is determined in consideration of the stop position of the other vehicle even in a congested scene. The evaluation processor 11 extracts the first path BV1, the crosswalk CR1, and the crosswalk CR2.

In this example, a go determination is made for the stop line ST1 since the signal device shows a green signal. In the present embodiment, the intersection with the vehicle V3 traveling in the opposite direction is also judged as "go" because the possibility of approaching the subject vehicle V1 is low. However, since the other vehicle V43 exists before the crosswalk CR2, there is no area in which the subject vehicle V1 can stop in front of the crosswalk CR2.

The operation planning processor 21 therefore makes a go/stop judgment for the intersection Q4 with the second path BV3S, which lies one event before the intersection Q5. In this example, it is assumed that the pedestrian signal of the crosswalk CR1 is a red signal and that the crosswalks CR1 and CR2 are free of pedestrians, so a go determination is made for the crosswalks CR1 and CR2.

The operation planning processor 21 sets the stop position candidates SP1 to SP5 based on the positions of the intersections Q1 to Q5. The operation planning processor 21 provisionally takes the stop position candidate SP4, corresponding to the intersection Q4 one before the intersection Q5, as the stop position. The operation planning processor 21 then determines whether the stop position candidate SP4 is included in the lane of the second route BV2S, whose traveling direction differs from that of the first route BV1R. Since the stop position candidate SP4 is included in the lane of the second route BV2S, stopping there is likely to interfere with the running of the other vehicle V2 on the second route BV2S. In particular, when the congestion continues and the signal changes from green to red, the stop position of the subject vehicle V1 affects the traveling of the other vehicle V2.

Therefore, the stop position candidate SP3 ahead of (upstream of) the stop position candidate SP4 is provisionally taken as the stop position. The operation planning processor 21 makes the same judgment for the stop position candidate SP3. Since the stop position candidate SP3 also interferes with the second route BV2S, the stop position candidate SP2 is considered next. The stop position candidate SP2 in turn interferes with the crosswalk CR1, which is also a second route, so the stop line ST1 is finally determined as the stop position.

In this way, when the subject vehicle V1 makes a right turn at the intersection on a green signal, an appropriate stop position is determined from the stop position candidates in consideration of the stop position of the other vehicle V43, so that the subject vehicle V1 can be stopped at a stop position that does not affect other vehicles.

The operation planning processor 21 can determine whether or not a lane change is possible when planning the operation plan. When an object is detected in front of the subject vehicle V1, the distance from the subject vehicle V1 to the object is calculated. The time of arrival from the subject vehicle V1 to the object may also be calculated in consideration of the speed. To determine whether a lane change of the subject vehicle V1 is possible, the operation planning processor 21 determines whether the relative distance X between the subject vehicle and the preceding vehicle is sufficiently secured. As an example, the operation planning processor 21 defines a threshold distance X_MIN at which a lane change is judged possible, and determines whether X > X_MIN for the distance X from the subject vehicle to the preceding vehicle. If X > X_MIN, the operation planning processor 21 determines that a lane change of the subject vehicle V1 is possible; otherwise, it determines that a lane change is impossible. The threshold X_MIN is the distance required for the subject vehicle V1 to overtake the preceding vehicle, and is a margin distance to be taken into account when the subject vehicle V1 travels through the running scene.

Of course, the distance may instead be evaluated as an arrival time in consideration of the vehicle speed. As an example, the operation planning processor 21 defines a threshold time T_MIN at which a lane change is judged possible, and determines whether T > T_MIN for the time T from the subject vehicle to the preceding vehicle. If T > T_MIN, the operation planning processor 21 determines that a lane change of the subject vehicle V1 is possible; otherwise, it determines that a lane change is impossible. The threshold T_MIN is the time required for the subject vehicle V1 to overtake the preceding vehicle, and is a margin time to be taken into account when the subject vehicle V1 travels through the running scene.
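The two threshold tests above, X > X_MIN on the gap and T > T_MIN on the arrival time, can be sketched as follows. The numeric threshold values are illustrative assumptions; the patent does not specify them.

```python
X_MIN = 30.0  # assumed required gap to the preceding vehicle [m]
T_MIN = 3.0   # assumed required time margin [s]

def lane_change_possible(gap_m, closing_speed_mps=None):
    """Distance test X > X_MIN; when the closing speed is known, the
    time-of-arrival test T > T_MIN is used instead (T = gap / speed)."""
    if closing_speed_mps:
        return gap_m / closing_speed_mps > T_MIN
    return gap_m > X_MIN

print(lane_change_possible(45.0))        # True: 45 m > X_MIN
print(lane_change_possible(45.0, 20.0))  # False: 2.25 s <= T_MIN
```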

In this embodiment, based on the relative distances between the events and the subject vehicle V1, whether a lane change is possible is determined after arranging the events in the order in which the subject vehicle V1 encounters them. This makes it possible to handle overtaking of another vehicle in consideration of vehicles adjacent to the first route of the subject vehicle or traveling ahead of it.

Subsequently, the processing procedure of the travel support system 1 of the present embodiment will be described based on the flowchart of Fig. 16. The outline of the processing in each step is as described above. Here, the processing flow will be mainly described, and specific processing examples will be described later.

First, in step S1, the evaluation processor 11 acquires the subject vehicle information of the subject vehicle V1. The subject vehicle information includes the position of the subject vehicle V1, the speed and acceleration of the subject vehicle V1, and the traveling direction of the subject vehicle V1.

In step S2, the evaluation processor 11 acquires the object information. The object information includes the presence or absence of an object around the subject vehicle V1, the attribute of the object (stationary object or moving object), the position of the object, the speed and acceleration of the object, and the traveling direction of the object. The object information can be acquired from the object detection device 230 and the navigation device 220.

In step S3, the evaluation processor 11 judges whether there is a change in the nearest encounter scene that the subject vehicle V1 will encounter ahead. An encounter scene is a crossing scene to be passed through. The evaluation processor 11 judges whether the traveling route has changed and whether the encounter scene that was the subject of evaluation immediately before has been passed. This is to judge whether or not a new encounter scene needs to be set. The evaluation processor 11 determines that there is no change in the traveling route when the current position of the subject vehicle V1 belongs to the already calculated route. If the current position of the subject vehicle V1 does not belong to the already calculated route, the evaluation processor 11 determines that the traveling route has been changed.

If the current position of the subject vehicle V1 does not belong to the area set as the immediately preceding encounter scene, the evaluation processor 11 determines that the encounter scene has been passed. If the current position of the subject vehicle V1 belongs to the area set as the immediately preceding encounter scene, the evaluation processor 11 determines that the encounter scene has not been passed.

The evaluation processor 11 judges that there is a change in the encounter scene when the travel route has been changed or the encounter scene has been passed, and executes the processes of steps S4 to S9. If the travel route has not been changed and the encounter scene has not been passed, it determines that there is no change in the encounter scene, and the process proceeds to step S11.

In step S4, the evaluation processor 11 calculates the first path on which the subject vehicle V1 runs. The first route may be the one calculated by the navigation device 220. The first path is specified by a road identifier, a lane identifier, and a link identifier. These road identifiers, lane identifiers, and link identifiers are defined in the map information 222 and the road information 223.

In step S5, the evaluation processor 11 sets a scene encountered by the subject vehicle V1 running on the first route. The encounter scene is an area including a point where an intersection of the first path with another path exists. The form of the intersection with the first path is not limited, and may be any of merging, branching, crossing, a T-junction, or adjacency. The encounter scene is also an area including a point where a stop is requested on the first route in accordance with the traffic rule information 224. The evaluation processor 11 refers to the map information 222, the road information 223 and the traffic rule information 224, and sets an area where the subject vehicle V1 is likely to encounter an event as the area R1. Examples of scenes encountered by the subject vehicle V1 include an area near an intersection, an area near a merging point of lanes, an area near a crosswalk, an area near a stop line, and an area near a railroad crossing.

In step S6, the evaluation processor 11 extracts a second path having an intersection with the first path. The evaluation processor 11 refers to the map information 222 and the road information 223 and extracts a second path having an intersection with the first path. The evaluation processor 11 refers to the link information (node information) defined in the map information 222. At a place where a plurality of paths intersect, the link information (node information) connects a plurality of mutually different links. The evaluation processor 11 extracts a second path that intersects with the first path from the connection state of the link information (node information).

In step S7, the evaluation processor 11 extracts the events encountered by the subject vehicle V1 in the set encounter scene. The evaluation processor 11 extracts an intersection between the first path and the second path as an event. Incidentally, at a merging point of paths, a plurality of links are connected to one link. At an intersection, the vicinity of the entrance corresponds to a branching point of lanes, and the vicinity of the exit corresponds to a merging point of lanes. As described above, a point where one link is connected to a plurality of links can be extracted as an event at which the first path and the second path intersect on the exit side of the intersection. That is, the second path at the exit of the intersection can be detected by detecting the presence of a point where one link is connected to a plurality of links. Further, link information is defined for pedestrian crossings, and a crosswalk crossing the first path can be detected as a second path by performing a crossing determination between the link of the first path and the link of the crosswalk. The evaluation processor 11 also extracts, as an event, a point at which a stop is requested on the first route in accordance with the traffic rule information 224.
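The crossing determination between the first-path link and a crosswalk link can be sketched with a standard 2-D segment intersection (orientation) test. This is a minimal geometric illustration under the assumption that each link is a straight segment in plan-view coordinates; the patent does not prescribe the geometry routine.

```python
def _orient(a, b, c):
    """Signed area test: >0 if a->b->c turns left, <0 if right, 0 if collinear."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def links_cross(p1, p2, q1, q2):
    """True when segment p1-p2 strictly crosses segment q1-q2, i.e. the
    endpoints of each segment lie on opposite sides of the other."""
    d1, d2 = _orient(q1, q2, p1), _orient(q1, q2, p2)
    d3, d4 = _orient(p1, p2, q1), _orient(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

# First-path link through the intersection vs. a crosswalk link.
print(links_cross((0, 0), (10, 10), (0, 10), (10, 0)))  # True
print(links_cross((0, 0), (10, 10), (20, 0), (30, 0)))  # False
```

A crosswalk whose link crosses the first-path link would then be registered as a second path and its crossing point as an encounter event.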

The position of each extracted event is stored in association with the route. The position of the extracted event may also be stored in correspondence with the map information 222 and the road information 223. In the planning of the operation plan performed later, the driving behavior is determined for each position of the extracted events.

In step S8, the evaluation processor 11 rearranges the extracted plurality of events in the order in which the subject vehicle V1 encounters them.

In step S9, the output control processor 31 displays the plurality of rearranged events on the display 251. The output control processor 31 may also output the plurality of rearranged events using the speaker 252.

In step S11, the operation planning processor 21 extracts the objects to be encountered by the subject vehicle traveling on the first route. The operation planning processor 21 extracts, from the object information obtained in step S2, the information of objects existing on a second path. In the example shown in Fig. 2G, the other vehicle V2 running on the second route BV2S and the other vehicle V3 running on the second route BV3R are extracted.

In step S12, the operation planning processor 21 associates each object with an event or a route. If the path can be specified, the intersection with the first path can be narrowed down, so the object information may be associated with the path identifier. The object information of the other vehicle V2 is associated with the identifier of the second path BV2S or the identifier (position information) of the intersection QV12. The object information of the other vehicle V3 is associated with the identifier of the second path BV3R or the intersection QV12.

In step S13, the operation planning processor 21 determines the driving behavior for each event. The driving behavior is determined based on the possibility of contact between the subject vehicle V1 and the object, as described above. The possibility of contact is judged on the basis of the distance between the subject vehicle V1 and the object, or the time until the two come into contact.
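The go/stop judgment from the degree of approach can be sketched as follows. Both the gap-based and the time-to-contact criteria described above are combined; the threshold values are illustrative assumptions, not values from the patent.

```python
GAP_MIN = 15.0  # assumed minimum safe gap to the object [m]
TTC_MIN = 4.0   # assumed minimum time until contact [s]

def driving_action(gap_m, closing_speed_mps):
    """Return 'stop' when the object is too close in distance or in
    time-to-contact (gap / closing speed); otherwise 'go'."""
    if closing_speed_mps <= 0:          # object not approaching
        return "go"
    ttc = gap_m / closing_speed_mps
    return "stop" if gap_m < GAP_MIN or ttc < TTC_MIN else "go"

print(driving_action(30.0, 10.0))  # stop: TTC = 3 s < TTC_MIN
print(driving_action(60.0, 10.0))  # go:   TTC = 6 s, gap ample
```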

In step S14, the operation planning processor 21 determines whether or not any event determined as "stop" exists among the plurality of events belonging to the area R1 set as the scene.

If it is determined in step S14 that there is no event determined as "stop", the process goes to step S16 to draw up an operation plan for passing through the area R1 set as the scene. On the other hand, if at least one event is determined as "stop" in step S14, the flow advances to step S15 to draw up an operation plan for the area R1 set as the scene. Specifically, the operation content of go or stop is determined for each extracted event, and the stop position is set according to the event position.

In the subsequent step S17, the operation control is executed based on the drawn-up operation plan. Via the vehicle controller 210, the subject vehicle V1 is stopped at the position of each event for which stopping was determined, and advanced at the position of each event for which going was determined.

Fig. 17 is a flowchart showing a subroutine of the drafting process of the operation plan shown in Fig. 16. Fig.

As shown in Fig. 17, in step S21, the operation planning processor 21 sets a stop position candidate according to the position of each event. In step S22, the operation planning processor 21 integrates a plurality of stop position candidates when they are close to each other, within a predetermined distance. In step S23, the operation planning processor 21 determines the validity of each stop position candidate. Specifically, the operation planning processor 21 determines whether the position of the stop position candidate lies within the area of a second path or within a stopping-prohibited area.

In step S24, when a plurality of stop position candidates remain after the narrowing-down, the operation planning processor 21 proceeds to step S25 and selects the stop position candidate that the subject vehicle V1 encounters first. In step S26, the operation planning processor 21 determines the stop position.

Fig. 18 is a flowchart showing a subroutine of the stop position candidate narrowing-down processing shown in Fig. 17.

In step S31, the operation planning processor 21 determines whether or not the stop position candidate is within a stoppable area. If the stop position candidate is not within a stoppable area, stopping there is not preferable, so the process goes to step S34 and the stop position candidate is deleted from the candidate data. Likewise, if the stop position candidate is within a stopping-prohibited area, the stop position candidate is deleted from the candidate data. On the other hand, if the stop position candidate is within a stoppable area, the process proceeds to step S32.

In step S32, the operation planning processor 21 determines whether or not the stop position candidate is within the area of the first route. If the stop position candidate is not within the area of the first route, it may have an undesirable influence on another vehicle running on another route or on a pedestrian walking on a pedestrian crossing that constitutes another route. In that case, the process advances to step S35 to shift the position of the stop position candidate to the upstream side (the subject vehicle side). On the other hand, if the stop position candidate is within the area of the first path, the process proceeds to step S33.

In step S33, the operation planning processor 21 determines whether the vehicle speed of another vehicle merging into the first route is less than a predetermined value. The predetermined value is a speed threshold for judging the occurrence of congestion. If the speed of the other vehicle is low, congestion may occur after the merge.

If the vehicle speed of the other vehicle is less than the predetermined value, the process proceeds to step S36.

In step S36, in consideration of the possibility that the subject vehicle V1 cannot stop at an appropriate position due to the influence of congestion occurring on the route after the merge, the operation planning processor 21 shifts the stop position to the upstream side (the subject vehicle V1 side). If the vehicle speed of the other vehicle merging into the first route is not less than the predetermined value, the process proceeds to step S24. Steps S24 to S26 are common to the processes described for Fig. 17.
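The narrowing-down subroutine of Fig. 18 (steps S31 to S36) can be sketched as follows. The candidate record layout, the predicate names, and the 5 m shift distance are all assumptions made for the illustration; only the branch structure (delete if unstoppable, shift upstream if off the first route or if congestion is detected) follows the description above.

```python
SHIFT_M = 5.0  # assumed upstream shift distance [m]

def narrow_candidates(cands, congestion):
    """cands: list of dicts with keys 'pos' (distance along the path, m),
    'stoppable' (within a stoppable, non-prohibited area), and
    'on_first_route'.  Returns the surviving candidate positions."""
    kept = []
    for c in cands:
        if not c["stoppable"]:          # S31 -> S34: delete the candidate
            continue
        pos = c["pos"]
        if not c["on_first_route"]:     # S32 -> S35: shift upstream
            pos -= SHIFT_M
        if congestion:                  # S33 -> S36: shift upstream again
            pos -= SHIFT_M
        kept.append(max(pos, 0.0))
    return kept

cands = [
    {"pos": 20.0, "stoppable": True,  "on_first_route": True},
    {"pos": 35.0, "stoppable": False, "on_first_route": True},
    {"pos": 50.0, "stoppable": True,  "on_first_route": False},
]
print(narrow_candidates(cands, congestion=False))  # [20.0, 45.0]
```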

Hereinafter, another form of the event extraction process will be described.

The scene evaluation apparatus 10 of the present embodiment includes an evaluation processor 11. Other configurations of the in-vehicle apparatus 200 and the like are as described above.

The evaluation processor 11 of the present embodiment extracts a first path on which the subject vehicle runs and a second path having an intersection with the first path.

The evaluation processor 11 of the present embodiment calculates a movement prediction line over time for the subject vehicle V1 running on the first route. The movement prediction line includes an element on the time axis: it is information on the position of the moving subject vehicle V1, namely a set of position information that changes with the lapse of time. The positional change of the subject vehicle V1 is predicted based on the current position and the destination information input to the navigation device 220. The evaluation processor 11 may have the navigation device 220 execute the calculation of the movement prediction line and obtain the result. The movement prediction line may be a continuous line segment or a discrete line segment.
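A movement prediction line of the kind described above can be sketched as a set of time-stamped position samples. The constant speed and straight heading below are assumptions for illustration only; the actual prediction is derived from the route to the destination.

```python
# Minimal sketch of a movement prediction line: (time, position) samples
# predicted from the current position. Constant speed along a straight
# heading is an assumed simplification.

def predict_line(start_xy, heading_xy, speed, horizon=5.0, dt=1.0):
    """Return [(t, (x, y)), ...] samples of the subject vehicle's
    predicted position over time (a discrete line segment)."""
    samples = []
    t = 0.0
    while t <= horizon:
        x = start_xy[0] + heading_xy[0] * speed * t
        y = start_xy[1] + heading_xy[1] * speed * t
        samples.append((t, (x, y)))
        t += dt
    return samples
```

Because each sample carries its time, the set directly encodes the "element on the time axis" that distinguishes the prediction line from a purely geometric path.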

Further, the evaluation processor 11 calculates a movement prediction band over time for the subject vehicle V1 running on the first route. The movement prediction band also includes an element of the time axis: it is information on the position of the moving subject vehicle V1, a set of position information that changes with the lapse of time. The movement prediction band differs from the movement prediction line in that the position of the subject vehicle V1 is represented by a plane, but the content of the information is common to both. The movement prediction band may be obtained by enlarging the width of the movement prediction line into a plane of predetermined width. The evaluation processor 11 may have the navigation device 220 execute the calculation of the movement prediction band and obtain the result.

The evaluation processor 11 of the present embodiment calculates the movement prediction line of the subject vehicle V1 running on the first path, and extracts the events encountered by the subject vehicle based on the positions of the intersection points between the movement prediction line and the second path having an intersection with the first path. By extracting events from the intersection positions of a movement prediction line that takes time into account, appropriate events can be extracted.

The evaluation processor 11 of the present embodiment extracts an event based on the position of the intersection located on the most upstream side, along the traveling direction of another vehicle running on the second route, among the intersections of the movement prediction line and the second route. That is, the event is extracted based on the position of the intersection at which the movement prediction line and the second path cross at the earliest timing. By considering the moving direction of the other vehicle traveling on the second route, an appropriate event that takes time into account can be extracted.
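The selection of the most upstream intersection can be sketched as a projection onto the other vehicle's travel direction. This is a hypothetical helper, not the patented computation; the point and direction representations are assumptions.

```python
# Hypothetical sketch: among the intersection points of the movement
# prediction line and a second path, pick the one located most upstream
# along the travel direction of the other vehicle on that path.

def most_upstream_point(intersections, travel_dir):
    """intersections: list of (x, y) points; travel_dir: unit vector of
    the other vehicle's motion. The most upstream point has the smallest
    projection onto the travel direction."""
    return min(intersections,
               key=lambda p: p[0] * travel_dir[0] + p[1] * travel_dir[1])
```

For a vehicle approaching from the right (travel direction pointing left), the rightmost intersection point is selected, matching the T-junction example described later.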

The evaluation processor 11 of the present embodiment calculates a temporal movement prediction band of the subject vehicle traveling on the first path, and extracts the event encountered by the subject vehicle based on the position of the intersection obtained at the earliest timing among the intersections of the second path and the movement prediction band, that is, the intersection located on the most upstream side along the traveling direction of the second path. The intersection of the second path and the movement prediction band is a line segment; by taking the point on the most upstream side along the traveling direction of the second path, a single event point that takes time into account can be extracted.

The evaluation processor 11 of the present embodiment generates the movement prediction line or the movement prediction band with reference to the map information 222 when no lane is defined in an area of the first path. Thereby, an event can be extracted even in an area, such as an intersection, where no lane is defined.

The evaluation processor 11 of the present embodiment refers to the map information 222 in which nodes and links are defined, extracts a first node associated with the first path on which the subject vehicle runs, and selects, as the second path, a path to which another node having a connection relationship with the first node belongs. Since the second path can be searched using the connection form of the nodes, it can be found with a low processing load.

The evaluation processor 11 of the present embodiment integrates a plurality of events into one event when the distance between a plurality of events encountered by the subject vehicle V1 is less than a predetermined value. By merging events located close to one another, repeated progress/stop behavior can be prevented, enabling smooth running.
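The merging rule above can be sketched as follows. This is a minimal illustration under the assumption that events are represented by their distance along the first route; the threshold value is hypothetical.

```python
# Sketch of the event-merging rule: events closer together than a
# threshold are integrated into one event (the nearest one is kept).
# The 5 m threshold is an assumed value, not taken from the patent.

def merge_events(distances, threshold=5.0):
    """distances: distances (m) from the subject vehicle to each
    extracted event. Events within `threshold` of the previously kept
    event are merged into it."""
    merged = []
    for d in sorted(distances):
        if merged and d - merged[-1] < threshold:
            continue  # integrate into the previous event
        merged.append(d)
    return merged
```

Two interference points 2 m apart thus collapse into a single event with a common stop position, as in the T-junction right-turn example of Fig. 28.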

The operation planning processor 21 of the present embodiment determines either a progress action or a stop action for each of the plurality of events extracted by the evaluation processor 11. Thereby, behavior decisions can be made without temporal inconsistency among the events.

A specific example of processing in the scene shown in Fig. 19 will be described. Fig. 19 shows a scene in which the subject vehicle V1 makes a left turn from a non-priority road onto a priority road at a T-junction, along the first route BV1L.

Fig. 20 is a flowchart showing the control procedure of this process. The event extraction process will be explained along with the control procedure.

In step S101, the evaluation processor 11 acquires the current position of the subject vehicle from the position detection device 221. In step S102, the evaluation processor 11 acquires the first path of the subject vehicle V1 obtained from the current position and the destination. The first route is specified as a lane, including not only road identification but also direction information; the same is true for the second route. The evaluation processor 11 may cause the navigation device 220 to calculate the route. In step S103, the evaluation processor 11 acquires the position information of the boundaries (lane marks, curbs, guardrails, etc.) of the first path on which the subject vehicle travels. The evaluation processor 11 may acquire the boundary information of the first route from the map information 222 or the road information 223.

In an area such as an intersection or a merging point, lane boundary information does not exist on the road. For this reason, the evaluation processor 11 creates virtual boundary information for an area having no boundary information. The evaluation processor 11 generates a virtual lane boundary from the boundary information (lane marks, curbs, guardrails, etc.) of the lanes before and after the intersection and the destination information of the lane. As shown in Fig. 21, the boundary LN0 of the virtual lane inside the intersection is generated using the boundary information LN1 and LN2 of the lanes before and after the intersection, the curb boundary information SL1 and SL2, and the left-turn information of the first path BV1L. As shown in Fig. 21, the evaluation processor 11 can connect the lane information of lane LN1 of the first route before the left turn and that of lane LN2 of the first route after the left turn to generate a virtual lane.
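The idea of connecting the boundaries before and after the turn can be sketched with simple interpolation. This is an illustrative simplification: a real system would fit a curve (e.g. an arc or clothoid) through the intersection rather than a straight line, and the endpoint coordinates below are assumed.

```python
# Minimal sketch of generating a virtual lane boundary LN0 inside an
# intersection by interpolating between the end of the boundary before
# the turn (LN1) and the start of the boundary after it (LN2).
# Linear interpolation is an assumed stand-in for a proper curve fit.

def virtual_boundary(ln1_end, ln2_start, n_points=5):
    """Return n_points (x, y) samples of a virtual boundary connecting
    the two real boundaries' endpoints."""
    (x0, y0), (x1, y1) = ln1_end, ln2_start
    return [(x0 + (x1 - x0) * i / (n_points - 1),
             y0 + (y1 - y0) * i / (n_points - 1))
            for i in range(n_points)]
```

The generated samples fill the gap where no painted lane mark exists, so the movement prediction line of step S104 can be computed continuously through the intersection.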

In step S104, the evaluation processor 11 calculates the movement prediction line or movement prediction band, which is information on the positions the subject vehicle V1 running on the first route will occupy at future times; each predicted position of the vehicle is associated with the time axis. The movement prediction line or band calculated here need not be a path defined by a continuous function and may be a path defined by a discontinuous function. The evaluation processor 11 may calculate the movement prediction line or band BV1L using the boundary lines of the virtual lane shown in Fig. 21. As shown in Fig. 22, the movement prediction line BV1L may be calculated using the midpoint in the width direction of the boundary lines of the virtual lane, or the movement prediction band may be calculated by designating an area of predetermined width centered on the virtual lane.

In step S105, the evaluation processor 11 calculates a second path having an intersection with the first path. The evaluation processor 11 calculates the second route using the node and link information associated with the first route in the map information 222. For example, as shown in Fig. 23, a plurality of nodes and links expressing the T-junction are read. In the map information 222, the connections of each node express the connections at the intersection. A node at an intersection may be one from which links branch out to a plurality of nodes, or one at which links from a plurality of nodes are aggregated. In this process, attention is paid to nodes at which links from a plurality of nodes are aggregated. At a node where the destinations of a plurality of links gather, the links likely to connect with the first route can be extracted. As a result, the second paths that may interfere with the first path of the subject vehicle V1 can be extracted.
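The node-based search of step S105 can be sketched as follows. The link table format and identifiers below are hypothetical; the point illustrated is only the rule of looking at links that terminate at the same node as the first path's link.

```python
# Hypothetical sketch of step S105: in a node/link map, find the node
# where the first path's link ends, then treat every other link that
# terminates at that same node as a candidate second path.

def candidate_second_paths(links, first_link):
    """links: {link_id: (from_node, to_node)}. Links whose destination
    node equals the first link's destination converge on the same point
    and may interfere with the first path."""
    _, merge_node = links[first_link]
    return [lid for lid, (_, to) in links.items()
            if lid != first_link and to == merge_node]
```

Because only the node connectivity is inspected, no geometry needs to be computed at this stage, which is what keeps the processing load of the second-path search low.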

In step S106, the evaluation processor 11 obtains the intersection line segment of the movement prediction line and the second path. As shown in Fig. 24A, the movement prediction line Z is a line segment of finite length. In the driving support apparatus 100, information about several hundred meters ahead is considered so that the subject vehicle V1 can respond to the events it encounters; in this case, the length of the movement prediction line Z is several hundred meters. The second path, on the other hand, is an area that continues as long as the lane continues. The intersection line segment required here is the intersection of the movement prediction line Z and the second path. In the example shown in Fig. 24A, the segment from point A to point B is the intersection line segment. When the length of the movement prediction line Z is determined based on the length of a link, the movement prediction line Z has a finite length from one end of the link to the other. With reference to the link length shown in Fig. 23, the length of the movement prediction line Z runs from point A to point B' as shown in Fig. 24B.

In step S107, the evaluation processor 11 determines the traveling direction of the second path BV2S. The traveling direction may be determined from the link direction in the map information 222. The second route runs in the direction of the arrow, from the right side to the left side in the figure as viewed from the subject vehicle V1.

In step S108, the evaluation processor 11 selects, from the intersection line segments of the obtained second route and the movement prediction line, the point on the side from which the other vehicle traveling on the second route BV2S approaches the subject vehicle (the right side in the figure).

Of the segment AB shown in Fig. 24A, the point closest to the upstream side of the second path BV2S in its running direction (the right side in the figure) is selected; this point is located at the far right of the segment AB. In this example the subject vehicle V1 makes a left turn; for a right turn, the relationship is reversed.

The point A selected as an event is positioned on the upstream side of the crosswalk CR4. Incidentally, when time is not taken into consideration, both A and B' shown in Fig. 24B could be extracted as events. If the point B' were extracted as an event, it would be judged that the event occurs after passing the crosswalk. This causes a problem: the assumed order would be opposite to the order in which the events are actually encountered, so a behavior decision based on the order of the events encountered by the vehicle could not be made. In the present embodiment, by contrast, since the order of the events encountered by the subject vehicle V1 is considered, the point A in front of the crosswalk CR4 can be extracted as an event.

According to this process, when the subject vehicle makes a left turn from the non-priority road onto the priority road at the T-junction, the event can be set at the appropriate location by considering the traveling direction of the other vehicle running from right to left. Events can thereby be obtained that are consistent with the order in which the vehicle actually encounters them.

Subsequently, a process for calculating a movement prediction band and extracting events using the movement prediction band will be described.

First, as an example, a case is described in which the subject vehicle enters the priority route from the non-priority route at a T-junction and makes a left turn, while the other vehicle travels from right to left on the priority route of the T-junction.

Fig. 25 is a flowchart showing the control procedure of this process. Steps S101, S102, and S105 are the same as those described above with reference to Fig. 20.

In the succeeding step S201, the evaluation processor 11 determines the area where the movement prediction band BV1L and the second path overlap, as shown in Fig. 26A. The movement prediction band is a region of finite length (and width); depending on the range to be predicted, the length of the movement prediction band BV1L is several hundred meters. As shown in Fig. 26B, when the movement prediction band BV1L is defined by a link, the region is the finite-length region of that link.

The evaluation processor 11 then determines the approach direction of the other vehicle traveling on the second route. The approach direction of the other vehicle can be determined from the running direction of the second route.

In step S202, the evaluation processor 11 extracts, as an event, the point on the second route BV2S closest to the approach direction of the other vehicle. More specifically, as shown in Fig. 26C, the point C on the boundary line a between the second path and the movement prediction band, on the side nearer to the subject vehicle V1 and on the upstream side in the traveling direction of the second path BV2S, is extracted. It can be seen that the point C is located beyond the stop line and the near-side crosswalk within the movement prediction band of the vehicle, and in front of the crosswalk after the intersection.

Next, the evaluation processor 11 executes the event merging process. This process is executed after the process described above.

The event merging process will be described using the example shown in Fig. 27A. Fig. 27A shows a case in which the subject vehicle V1 runs along the first path BV1L to turn left at the intersection, another vehicle V3 turns right from the opposite lane, and another vehicle V2 goes straight along the second path BV2S crossing the first path BV1L.

As shown in Fig. 27B, the node ND1 is set immediately after the left turn through the intersection, and it aggregates the link LK1 on which the subject vehicle V1 runs, the link LK3 on which the other vehicle V3 making a right turn from the opposite lane runs, and the link LK2 on which the other vehicle V2 traveling straight from right to left runs. Thus, by looking at a node at which links from a plurality of nodes are concentrated, the links having an intersection with the first path on which the subject vehicle runs can be extracted. As a result, the second paths can be easily extracted.

As shown in Fig. 27C, the intersection line segments of the movement prediction line (or movement prediction band) and the second paths are obtained. In the present embodiment, there are the line segment DE, which is the intersection with the second path BV3R, and the line segment FG, which is the intersection with the second path BV2S.

The evaluation processor 11 determines the traveling directions of the second route BV3R and the second route BV2S, and judges from which direction each other vehicle approaches the subject vehicle V1. In this example, as viewed from the subject vehicle V1, the other vehicle V3 moves from top to bottom in the figure, and the other vehicle V2 moves from right to left.

The evaluation processor 11 extracts as events the points on the most upstream side in the direction from which the other vehicles approach, out of the intersection line segments. In this example, the points D and F are extracted.

As shown in Fig. 27D, the evaluation processor 11 integrates the events when there are a plurality of them within a predetermined area. For example, when a plurality of events exist in the same lane, they are integrated into one event. Thereby, the stop positions can be made common.

As shown in Fig. 28, when making a right turn at the T-junction, there are two interference points, A and B. The first point is the intersection of the first path BV1R and the second path BV2S, and the second point is the intersection of the first path BV1R and the second path BV4S. Regarding stopping at the first and second points, although the interference points are not on the same lane, the driver's behavior is to stop immediately before entering the nearer lane. As described above, when a plurality of events exist within a predetermined range, or when the stop positions for a plurality of interference points coincide, the stop positions can be considered simply by integrating them as one interference point.

Fig. 29 shows an example of a scene encountered at an intersection. The events that affect the running of the subject vehicle V1 are encounters with five objects, including the existence of another vehicle V2B, the existence of another vehicle V2C, the existence of another vehicle V2D running in the adjacent lane, and pedestrians crossing the pedestrian crossings.

In a general process, the presence or absence of each of the five objects is acquired, the positional relationships between the objects are calculated, and then the progress/stop determination is made. Even considering only the presence or absence of each object, 2⁵ (= 32) judgment processes must be executed.

On the other hand, in the present embodiment, the progress/stop determination is narrowed down to only the events at the intersections of the first path and the second paths. In the example shown in Fig. 29, the driving support apparatus 100 performs only four kinds of judgment processes: (1) progress/stop at the stop line, (2) progress/stop at the near-side crosswalk, (3) progress/stop at the intersection with the second path, and (4) progress/stop at the far-side crosswalk.

In this manner, by narrowing down the information and arranging it in time series, a simple determination can be made and the processing load can be reduced.
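The narrowed, time-ordered judgment can be sketched as a sequential pass over the events sorted by distance along the first path. The go/stop rule and event representation below are stand-ins for illustration; the point is that the decisions are made event by event in encounter order, not over all 2⁵ combinations.

```python
# Sketch of the narrowed judgment of Fig. 29: events on the first path
# are sorted by distance and judged one by one in encounter order.
# The boolean `clear` flag is an assumed stand-in for the real per-event
# progress/stop criterion.

def plan_actions(events):
    """events: [(distance_m, clear)], where `clear` means the crossing
    object does not interfere. Returns ordered go/stop decisions,
    stopping the plan at the first 'stop'."""
    plan = []
    for dist, clear in sorted(events):
        plan.append((dist, "go" if clear else "stop"))
        if not clear:
            break  # the vehicle stops here; later events are judged later
    return plan
```

With four events this requires at most four judgments, in contrast to the 32 combinations of the general process.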

Fig. 30 shows a display example of the determination results for Fig. 29. As shown in Fig. 30, the judgment for each event can be expressed simply. Because each event is associated with the time axis along which the subject vehicle progresses, it is easy for the user to grasp. The user can thus share the judgment of the subject vehicle, that is, the contents of the driving plan, with the device. Particularly when the driving support process performs some or all of the driving control, promptly informing the user of the vehicle's driving plan can increase the user's confidence in the automatic (or partly automatic) driving control.

The driving support apparatus 100 according to the embodiment of the present invention is configured and operates as described above, and therefore exerts the following effects.

[1-1] The scene evaluation apparatus 10 according to the present embodiment extracts, based on the relationship between a first route on which the subject vehicle travels and a second route having an intersection with the first route, a plurality of events encountered by the vehicle traveling on the first route, and evaluates the scene using the relationship between each extracted event and the subject vehicle. By using the scene evaluation apparatus 10 of the present embodiment, the scene encountered by the subject vehicle can be evaluated while taking into account only the events deemed important from the viewpoint of making behavior decisions for the subject vehicle, so that the processing load can be reduced. In driving support including automatic driving, processing delay is undesirable; by reducing the processing load, the processing time can be shortened and delays prevented.

[1-2] The scene evaluation apparatus 10 of the present embodiment extracts events using a relationship derived from the traffic rules of the first route and the traffic rules of each second route. By using the scene evaluation apparatus 10 of the present embodiment, the events that need to be reviewed from the viewpoint of traffic rules can be extracted.

[1-3] The scene evaluation apparatus 10 of the present embodiment extracts the events encountered by the vehicle running on the first route, using detection information of objects existing around the vehicle. By using the scene evaluation apparatus 10 of the present embodiment, an object that affects the operation of the subject vehicle can be extracted as an event.

[1-4] The scene evaluation apparatus 10 of the present embodiment extracts the events encountered by the vehicle traveling on the first route, using the position information of the objects. By using the scene evaluation apparatus 10 of the present embodiment, events can be extracted in consideration of the positions of the objects.

[1-5] The scene evaluation apparatus 10 of the present embodiment obtains the priorities concerning passage of the first route and the second route using the traffic rules of the first route and the traffic rules of the second route, and uses those priorities to extract the events encountered by the vehicle. By using the scene evaluation apparatus 10 of the present embodiment, the events that need to be examined first from the viewpoint of route and signal rules can be extracted.

[1-6] The scene evaluation apparatus 10 of the present embodiment obtains the priority for passage of the first route and the second route using the signal information of the first route and the signal information of the second route, and extracts the events encountered by the vehicle traveling on the first route accordingly. By using the scene evaluation apparatus 10 of the present embodiment, the events that need to be examined from the viewpoint of the actual traffic signal indications can be extracted.

[1-7] The scene evaluation apparatus 10 of the present embodiment obtains the priority concerning passage of the first route and the second route using the road information of the first route and the road information of the second route, and extracts the events encountered by the vehicle traveling on the first route accordingly. Using the scene evaluation apparatus 10 of the present embodiment, the events that need to be examined from the viewpoints of road width, traffic volume, road shape, and the like can be extracted.

[1-8] The driving support apparatus 100 of the present embodiment uses the relationship between the events and the subject vehicle evaluated by the evaluation processor to form an operation plan for travel on the first route. By using the driving support apparatus 100 of the present embodiment, an operation plan can be drafted based on the evaluation of only the necessary events. Since the events to be reviewed are narrowed down, the processing time can be shortened while an appropriate operation plan is drafted. As a result, driving support with a short delay time can be implemented.

[1-9] The driving support apparatus 100 of the present embodiment uses the position information of the objects to plan the operation for the events encountered by the vehicle running on the first route. By using the driving support apparatus 100 of the present embodiment, an operation plan can be drafted that examines only the objects affecting the operation of the subject vehicle.

[1-10] The driving support apparatus 100 of the present embodiment obtains the priority regarding passage of the first route and the second route using the traffic rules of the first route and of each second route, and uses the priority to plan the operation for the events encountered by the vehicle traveling on the first route. With the driving support apparatus 100 of the present embodiment, an operation plan can be drafted that considers route and signal rules first.

[1-11] The driving support apparatus 100 of the present embodiment uses the signal information of the first route and of each second route to obtain the priority regarding passage of the first route and the second route, and uses the priority to plan the operation for the events encountered by the vehicle traveling on the first route. By using the driving support apparatus 100 of the present embodiment, an operation plan can be drafted that considers the actual traffic signal indications.

[1-12] The driving support apparatus 100 of the present embodiment uses the road information of the first route and of each second route to obtain the priority concerning passage of the first route and the second route, and uses the priority to plan the operation for the events encountered by the vehicle traveling on the first route. The driving support apparatus 100 of the present embodiment can thereby form an operation plan that considers the road width, traffic volume, road shape, and the like.

[1-13] The scene evaluation method of the present embodiment extracts a second path having an intersection with the first path on which the subject vehicle travels, and, based on the relationship between the first path and the second path, extracts a plurality of events encountered by the vehicle traveling on the first path. When the scene evaluation method of the present embodiment is used, the effects described in 1-1 can be obtained.

[2-1] The scene evaluation apparatus 10 of the present embodiment extracts a plurality of events encountered when traveling on the first route and rearranges the extracted events in the order in which the vehicle encounters them. By using the scene evaluation apparatus 10 of the present embodiment, the narrowed-down events are rearranged in the subject vehicle's encounter order, so that the events can be evaluated in consideration of that order without generating further computation load.

[2-2] The scene evaluation apparatus 10 of the present embodiment extracts events including stationary objects and moving objects, and rearranges the stationary objects and moving objects included in the extracted plurality of events. By using the scene evaluation apparatus 10 of the present embodiment, the encounter order of objects, including stationary and moving objects, can be evaluated on the same time axis without generating further calculation load.

[2-3] The scene evaluation apparatus 10 according to the present embodiment extracts, based on the relationship between a first route on which the subject vehicle travels and a second route having an intersection with the first route, a plurality of events encountered by the vehicle running on the first route, and evaluates the scene using the relationship between each extracted event and the subject vehicle. By using the scene evaluation apparatus 10 of the present embodiment, the scene encountered by the subject vehicle can be evaluated while considering only the events deemed important from the viewpoint of making behavior decisions, so that the processing load can be reduced.

[2-4] The driving support apparatus 100 of the present embodiment arranges and displays the information representing the extracted events in the order in which the vehicle encounters them. By using the driving support apparatus 100 of the present embodiment, the driver of the subject vehicle V1 can visually recognize which events will be encountered and in what order.

[2-5] The driving support apparatus 100 of the present embodiment displays the information representing the extracted events at positions corresponding to the ratios of the actual distances from the vehicle to the respective events. By using the driving support apparatus 100 of the present embodiment, the driver of the subject vehicle V1 can visually recognize which events will be encountered, in what order, and roughly when.

[2-6] The driving support apparatus 100 of the present embodiment uses the relationship between the events and the subject vehicle evaluated by the evaluation processor to form a driving plan for when the vehicle runs on the first route. By using the driving support apparatus 100 of the present embodiment, an operation plan can be drafted based on the evaluation of only the necessary events. Since the events to be reviewed are narrowed down, the processing time can be shortened while an appropriate operation plan is drafted. As a result, driving support with a short delay time can be implemented.

[2-7] The driving support apparatus 100 of the present embodiment extracts, from among the plurality of events encountered when the subject vehicle travels on the first path, one event with respect to which the subject vehicle must stop, and formulates a driving plan that sets the point where that extracted event occurs as the stop point of the vehicle. The use of the driving support apparatus 100 of the present embodiment allows the subject vehicle V1 to stop at the event nearest to its present position, thereby restraining the influence on the traffic flow.

[2-8] The scene evaluation method of the present embodiment uses an evaluation processor that evaluates the scene encountered by a vehicle running on a route: it extracts a plurality of events encountered when the vehicle runs on the first route, and rearranges and outputs the extracted events in the order in which the vehicle encounters them. By using the scene evaluation method of the present embodiment, the effects described in 2-1 can be obtained.

[3-1] The scene evaluation apparatus 10 of the present embodiment calculates a temporal movement prediction line of the vehicle traveling on the first route, extracts events based on the intersections between the movement prediction line and the second path having an intersection with the first path, and evaluates the scene using the relationship between the extracted events and the subject vehicle. From information such as a camera image, in which a plurality of objects are captured at the same time as seen from the current position of the subject vehicle, it is difficult to accurately determine the order in which the plurality of events in the image will be encountered. If the encounter order cannot be derived accurately, the reliability of a driving plan drafted in time series is impaired. By using the scene evaluation apparatus 10 of the present embodiment, the calculation load can be reduced while an accurate operation plan is drafted in consideration of the chronological encounter order.

[3-2] The scene evaluation apparatus 10 of the present embodiment extracts events based on the position of the intersection located on the most upstream side, along the traveling direction of the second path, among the intersections of the movement prediction line and the second path. By using the scene evaluation apparatus 10 of the present embodiment, the event (point) to be noticed first in the scene can be appropriately extracted.

[3-3] The scene evaluation apparatus 10 of the present embodiment refers to the map information and generates a movement prediction line when an area of the first route is not defined as a lane. By using the scene evaluation apparatus 10 of this embodiment, an appropriate event (point) can be extracted even in a place, such as an intersection or a merge point, where no lane is defined.

[3-4] The scene evaluation apparatus 10 of the present embodiment calculates a temporal movement prediction range of the subject vehicle traveling on the first route, extracts the events the subject vehicle will encounter based on the position of the intersection located on the most upstream side along the traveling direction of each second route intersecting the first route, and evaluates the scene using the relationship between the extracted events and the subject vehicle. By using the scene evaluation apparatus 10 of the present embodiment, the process up to drafting the final driving plan can be simplified, and the calculation load can be reduced while a highly accurate driving plan is drafted in consideration of the necessary events.

[3-5] When the subject vehicle travels in an area of the first route in which no lane is defined, the scene evaluation apparatus 10 of the present embodiment generates the movement prediction range with reference to the map information. By using the scene evaluation apparatus 10 of this embodiment, an appropriate event (point) can be extracted even in a place, such as an intersection or a merge point, where no lane is defined.

[3-6] The scene evaluation apparatus 10 of the present embodiment refers to map information in which nodes and links are defined, extracts a first node that is associated with the first route on which the subject vehicle travels and for which a connection relationship with another route is defined, and selects, as the second route, the route to which another node having a connection relationship with the first node belongs. When the scene evaluation apparatus 10 of the present embodiment is used, the second route can be obtained with a low calculation load by using the node and link information.
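The node-and-link lookup can be sketched as a simple adjacency walk; the node IDs, route IDs, and the two lookup tables below are hypothetical data invented for this sketch, standing in for the map information the embodiment refers to:

```python
# Hypothetical node/link map: links lists the nodes connected to each node,
# route_of_node records which route each node belongs to.
links = {
    "N1": ["N2", "N7"],  # N1 connects to N2 (same route) and N7 (side road)
    "N2": ["N1", "N3"],
    "N7": ["N1", "N8"],
}
route_of_node = {"N1": "R1", "N2": "R1", "N3": "R1", "N7": "R2", "N8": "R2"}

def second_routes(first_route_nodes, first_route_id):
    """Routes reachable through links from the first route's nodes,
    i.e. candidate second routes having a connection with the first route."""
    found = set()
    for n in first_route_nodes:
        for m in links.get(n, []):
            r = route_of_node.get(m)
            if r is not None and r != first_route_id:
                found.add(r)
    return found

print(second_routes(["N1", "N2"], "R1"))  # → {'R2'}
```

Because only the immediate neighbors of the first route's nodes are inspected, the cost grows with the local connectivity rather than the whole map, which matches the low-calculation-load claim.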

[3-7] The scene evaluation apparatus 10 of the present embodiment integrates a plurality of events into one event when the distance between the extracted events encountered by the subject vehicle is less than a predetermined value. By using the scene evaluation apparatus 10 of the present embodiment, a plurality of nearby events can be integrated, so the vehicle can travel smoothly in the scene region R1 without repeating stop-and-go.
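A minimal sketch of this integration, assuming events are reduced to offsets along the route and each cluster is represented by its most upstream member (the threshold value of 5 m is an arbitrary example, not a figure from the patent):

```python
def merge_nearby(offsets, min_gap_m=5.0):
    """Merge event positions (offsets along the route, in meters) that lie closer
    together than min_gap_m, keeping the most upstream position of each cluster."""
    merged = []
    for o in sorted(offsets):
        if merged and o - merged[-1] < min_gap_m:
            continue  # fold this event into the previous (upstream) one
        merged.append(o)
    return merged

print(merge_nearby([10.0, 12.0, 30.0, 33.5, 50.0]))  # → [10.0, 30.0, 50.0]
```

Keeping the upstream member of each cluster is one reasonable choice here: stopping for the merged event then automatically covers its nearby companions, avoiding repeated stop-and-go.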

[3-8] The driving support apparatus 100 of the present embodiment determines either a go action or a stop action for each of the plurality of extracted events. By using the driving support apparatus 100 according to the present embodiment, an accurate and concise driving plan can be drafted by determining, for each event for which a judgment is highly necessary, whether stop or go is appropriate.

[3-9] The scene evaluation method of the present embodiment calculates a temporal movement prediction line of the subject vehicle traveling on the first route, extracts, as events the subject vehicle will encounter, the intersections between that prediction line and each second route intersecting the first route, and evaluates the scene using the relationship between the extracted events and the subject vehicle. By using the scene evaluation method of the present embodiment, the operational effects described in 3-1 can be obtained.

[4-1] The operation planning device 20 of the present embodiment determines one action for each of a plurality of events encountered by the subject vehicle, and uses the content of each action determined for each event to draft a series of driving plans for the scene the subject vehicle encounters. By using the operation planning device 20 of the present embodiment, the process up to drafting the final driving plan can be simplified, and the calculation load can be reduced while a highly accurate driving plan is drafted in consideration of the necessary events. In addition, a driving plan can be drafted that clearly indicates where in a scene a stop is required, from the start of passage to the end of passage.

[4-2] The operation planning device 20 of the present embodiment determines either a go action or a stop action for each of the plurality of events encountered by the subject vehicle. By using the operation planning device 20 according to the present embodiment, an accurate and concise driving plan can be drafted by determining, for each event for which a judgment is highly necessary, whether stop or go is appropriate.

[4-3] When a stop action is determined, or a judgment cannot be made, for any of the plurality of events encountered by the subject vehicle, the operation planning device 20 of the present embodiment stops the subject vehicle. By using the operation planning device 20 of the present embodiment, when there is a point at which a stop is required in the area R1 corresponding to the scene, the vehicle V1 is stopped immediately, thereby avoiding risk.

[4-4] When, among the plurality of events encountered by the subject vehicle, a stop action or an impossible judgment is determined for an event downstream of an event for which a go action has been determined, the operation planning device 20 of the present embodiment stops the vehicle at the encounter point of the event for which the go action was determined. By using the operation planning device 20 of the present embodiment, even when a go action has once been determined, the vehicle can be stopped there if the next event encountered by the subject vehicle V1 results in a stop action or an impossible judgment. Since a place for which a go action has been determined is a place where the presence of the subject vehicle V1 is permitted, the subject vehicle V1 can be stopped there safely.
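The go/stop logic of 4-3 and 4-4 can be sketched as a single pass over the events in encounter order; the action labels and the `before_scene` sentinel are illustrative conventions for this sketch, not terms from the patent:

```python
def stop_point(decisions):
    """decisions: list of (event_name, action) in encounter order,
    with action one of 'go', 'stop', 'unknown'.
    Returns the event at which to stop: the last 'go' event before the first
    non-'go' event (the vehicle's presence is already permitted there),
    'before_scene' if the very first event already requires a stop,
    or None if the vehicle may pass through the whole scene."""
    last_go = None
    for name, action in decisions:
        if action == "go":
            last_go = name
        else:  # 'stop' or 'unknown': halt at the last permitted point
            return last_go if last_go is not None else "before_scene"
    return None

print(stop_point([("crosswalk", "go"), ("merge", "go"), ("signal", "unknown")]))
# → merge
```

Treating an impossible judgment the same as a stop action, as the text describes, keeps the plan conservative: the vehicle never advances past a point whose safety has not been established.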

[4-5] When an event for which a stop action or an impossible judgment has been determined, among the plurality of events encountered by the subject vehicle, belongs to the second route, the operation planning device 20 of the present embodiment stops the vehicle at a stoppable position on the upstream side. Even when a stop action or an impossible judgment is made for a certain event, if the stop position corresponding to that event lies on the second route, stopping there may obstruct other vehicles traveling on the second route, and it is therefore not appropriate. When the operation planning device 20 of the present embodiment is used, the stop position can be set not on the second route but at a stoppable position on its upstream side.

[4-6] When an event for which a stop action or an impossible judgment has been determined, among the plurality of events encountered by the subject vehicle, is within a predetermined distance of another event, the operation planning device 20 of the present embodiment stops the vehicle at a stoppable position on the upstream side. Even when a stop action or an impossible judgment is made for a certain event, if the stop position for that event approaches or overlaps the stop position for another event, consistency with the judgment on the other event must be considered, so it is not suitable as a stop position. By using the operation planning device 20 of the present embodiment, the stop position can be set at a stoppable position on the upstream side instead. This reduces the cases in which judgment is impossible, lightens the load of the judgment processing, and allows the vehicle to travel smoothly within the scene area R1 without repeating stop-and-go.

[4-7] When a go action is determined for one of the plurality of events encountered by the subject vehicle and a stop action or an impossible judgment is determined for another event on the downstream side, the operation planning device 20 of the present embodiment drafts a driving plan that advances the subject vehicle to the one event if the separation between the one event and the other event is equal to or greater than a predetermined value. When different judgments are mixed, such as "go" on the upstream side and "stop" on the downstream side, the vehicle V1 can be advanced on the upstream side to avoid complicated processing.

[4-8] When a go action is determined for one of the plurality of events encountered by the subject vehicle and a stop action or an impossible judgment is determined for another event, the operation planning device 20 of the present embodiment drafts a driving plan that reduces the speed at which the vehicle passes the one event. For example, a parked vehicle can be avoided; when the parked vehicle creates a blind spot and objects cannot be sufficiently detected, however, the vehicle is permitted to proceed while the speed at which it passes the parked vehicle is lowered. This prevents the traffic flow from being disturbed while safety is taken into account.

[4-9] The driving support apparatus 100 according to the present embodiment extracts a plurality of events encountered by the subject vehicle traveling on the first route based on the relationship between the first route on which the subject vehicle travels and each second route having an intersection with the first route, rearranges the extracted events in the order in which the subject vehicle encounters them, evaluates the relationship between the subject vehicle and the plurality of events encountered over time, and drafts a series of driving plans for the scene the subject vehicle encounters. By using the driving support apparatus 100 of the present embodiment, a driving plan can be drafted based on the evaluation of only the necessary events. Since the events to be examined are narrowed down, the processing time can be shortened while an appropriate driving plan is drafted. As a result, driving support with a short delay time can be implemented.

[4-10] The operation planning method of the present embodiment drafts a driving plan for the scene the subject vehicle encounters by using the evaluation results of the relationship between the subject vehicle and the plurality of events encountered over time when the subject vehicle travels on the first route. By using the operation planning method of the present embodiment, the operational effects described in 4-1 can be obtained.

[5-1] The operation planning device 20 of the present embodiment uses the evaluation results of the relationship between the subject vehicle and the plurality of events encountered over time when the subject vehicle travels on the first route, sets one or more stop position candidates for each event, and drafts a driving plan for the scene encountered by the subject vehicle using the evaluation results of the relationship between the subject vehicle and the plurality of events encountered at the stop position candidates. By using the operation planning device 20 of the present embodiment, in traffic having an intersection of the first route and the second route, driving that does not affect other vehicles or pedestrians can be realized in consideration of the relationship between the vehicle V1 and the plurality of events encountered at the stop position candidates.

[5-2] The operation planning device 20 of the present embodiment determines, as the stop position for stopping the subject vehicle, the stop position candidate closest to the subject vehicle among the plurality of stop position candidates in the scene encountered by the subject vehicle. According to the operation planning device 20 of the present embodiment, since the subject vehicle V1 is stopped at the candidate closest to its current position, the influence on the traffic flow can be suppressed.
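Selecting the nearest candidate reduces to taking the minimum over the candidates still ahead of the vehicle; representing positions as offsets along the first route is an assumption of this sketch:

```python
def choose_stop_position(current_offset_m, candidates_m):
    """Pick the stop position candidate closest to the vehicle's current position,
    among candidates that lie at or ahead of it along the first route.
    Returns None if no candidate remains ahead."""
    ahead = [c for c in candidates_m if c >= current_offset_m]
    return min(ahead, default=None)

print(choose_stop_position(20.0, [15.0, 35.0, 60.0]))  # → 35.0
```

Candidates already passed (here 15.0) are filtered out first, since the vehicle cannot stop behind itself.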

[5-3] The operation planning device 20 of the present embodiment sets a stop position candidate at a position a predetermined distance upstream of a stop position at which the vehicle is required to stop. The use of the operation planning device 20 of the present embodiment makes it possible to stop the vehicle V1 at a position closer to its current position than the stop position defined in the traffic rule information 224, thereby suppressing the influence on the traffic flow.

[5-4] The operation planning device 20 of the present embodiment sets a stop position candidate at a position a predetermined distance upstream of the outer frame of an area in which stopping or parking of the subject vehicle is prohibited. The vehicle V1 is stopped at a position nearer to its current position than the stop position defined in the actual traffic rule information 224, so the influence on the traffic flow can be suppressed.

[5-5] The operation planning device 20 of the present embodiment sets a stop position candidate outside the travelable area of the second route that intersects the first route. The vehicle V1 is stopped short of the lane of the second route, at a position nearer to the current position of the vehicle V1 than the outer edge of the travelable area, so the influence on the traffic flow can be suppressed.

[5-6] When the vehicle body of the subject vehicle would protrude from the first route while passing one event, the operation planning device 20 of the present embodiment drafts a driving plan that stops the vehicle at the stop position candidate upstream of that event. When the subject vehicle V1 would protrude from the first route, that is, when the vehicle body of the subject vehicle V1 is likely to enter the lane or travelable area of another route, the vehicle V1 is stopped at the upstream candidate position, so the influence on the traffic flow can be suppressed.

[5-7] In accordance with the traffic signal of the first route or the traffic rules of the first route, the operation planning device 20 of the present embodiment does not set a stop position candidate in an area where no event encountered by the subject vehicle occurs. According to the operation planning device 20 of the present embodiment, a stop in a scene in which no stop is necessary can be avoided, and smooth travel can be achieved.

[5-8] When the speed of another vehicle flowing in from the second route, which has an intersection with the first route, toward the position of a stop position candidate on the first route is equal to or lower than a specified speed, the operation planning device 20 of the present embodiment determines a stop for another stop position candidate on the upstream side of that candidate. By stopping at a stop position candidate closer to the current position of the subject vehicle, evasive action can be taken even when another vehicle is flowing into the original candidate position at or below the specified speed.
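This fallback can be sketched as a conditional shift to the next candidate upstream; the threshold of 1.5 m/s is an arbitrary placeholder for the "specified speed", not a value from the patent:

```python
def select_candidate(candidates_m, inflow_speed_mps, specified_speed_mps=1.5):
    """candidates_m: stop position candidates ordered upstream → downstream
    along the first route. If a vehicle from the second route is drifting into
    the downstream candidate at or below the specified speed, fall back to the
    next candidate on the upstream side (closer to the subject vehicle)."""
    if len(candidates_m) >= 2 and inflow_speed_mps <= specified_speed_mps:
        return candidates_m[-2]  # one candidate upstream of the occupied one
    return candidates_m[-1]

print(select_candidate([10.0, 25.0], inflow_speed_mps=1.0))  # → 10.0
print(select_candidate([10.0, 25.0], inflow_speed_mps=5.0))  # → 25.0
```

A slow inflow speed suggests the other vehicle may come to rest on the candidate position, so yielding the downstream candidate and stopping one position upstream keeps the subject vehicle clear.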

[5-9] The driving support apparatus 100 of the present embodiment extracts a plurality of events encountered by the subject vehicle traveling on the first route based on the relationship between the first route on which the subject vehicle travels and the second route having an intersection with the first route, sets, for each event, one or more stop position candidates for stopping the vehicle based on the evaluated relationship between the extracted events and the vehicle, determines the stop position for stopping the vehicle in accordance with the evaluation results of the relationship between the vehicle and the plurality of events encountered at the stop position candidates, and stops the vehicle at that stop position. By using the driving support apparatus 100 of the present embodiment, a driving plan can be drafted based on the evaluation of only the necessary events. Since the events to be examined are narrowed down, the processing time can be shortened while an appropriate driving plan is drafted. As a result, driving support with a short delay time can be implemented.

[5-10] The operation planning method of the present embodiment uses the evaluation results of the relationship between the subject vehicle and the plurality of events encountered over time when the subject vehicle travels on the first route, sets one or a plurality of stop position candidates for each event, and drafts a driving plan for the scene encountered by the vehicle using the evaluation results of the relationship between the vehicle and the plurality of events encountered at the stop position candidates. By using the operation planning method of the present embodiment, the operational effects described in 5-1 can be obtained.

The embodiments described above are presented to facilitate understanding of the present invention and are not intended to limit it. Therefore, each element disclosed in the above embodiments is intended to include all design modifications and equivalents falling within the technical scope of the present invention.

That is, although the present specification describes, as one form of the driving support apparatus according to the present invention, the driving support apparatus 100 having the scene evaluation apparatus 10, the operation planning apparatus 20, and the output apparatus 30, the present invention is not limited to this.

In this specification, the scene evaluation apparatus 10 having the evaluation processor 11 is described as an example of the scene evaluation apparatus according to the present invention, but the present invention is not limited thereto. In this specification, the operation planning apparatus 20 having the operation planning processor 21 is described as an example of the operation planning apparatus according to the present invention, but the present invention is not limited thereto. In this specification, the output apparatus 30 having the output control processor 31 is described as an example of the output apparatus according to the present invention, but the present invention is not limited thereto. The evaluation processor 11, the operation planning processor 21, and the output control processor 31 may be configured as one processor or as a plurality of processors.

In this specification, the onboard apparatus 200 having the vehicle controller 210, the navigation device 220, the object detection device 230, the lane departure prevention device 240, the output device 250, the detection device 260, the drive device 270, and the steering device 280 is described as an example, but the present invention is not limited thereto. The scene evaluation apparatus, the operation planning apparatus, the output apparatus, and the driving support apparatus according to the present invention may each be combined with any of the various devices applicable to vehicles at the time of filing.

1: Driving support system
100: Driving support device
10: Scene evaluation device
11: Evaluation processor
20: Operation planning device
21: Operation planning processor
30: Output device
31: Output Control Processor
210: vehicle controller
220: Navigation device
221: Position detecting device
222: Map information
223: Road information
224: Traffic rules information
230: object detecting device
231: Camera
232: Radar device
240: Lane departure prevention device
241: Camera
242: Road information
250: Output device
251: Display
252: Speaker
260: Detector
261: Angle sensor
262: vehicle speed sensor
263: Posture sensor
270: Drive device
271: Braking device
280: Steering gear

Claims (9)

  1. An operation planning apparatus comprising an operation planning processor for drafting a driving plan of a vehicle traveling on a route,
    wherein the operation planning processor:
    extracts an intersection between a first route on which the subject vehicle travels and a second route, among routes on which a moving object moves, that has an intersection with the first route,
    sets, for each of the intersections, one or a plurality of stop position candidates for stopping the subject vehicle by using evaluation results of a relationship between the subject vehicle and the intersections encountered over time when the subject vehicle travels on the first route, and,
    when drafting a driving plan for a scene encountered by the subject vehicle by using evaluation results of a relationship between the subject vehicle and a plurality of intersections encountered at the stop position candidates,
    determines a stop for a second stop position candidate on the upstream side of a first stop position candidate when the speed of the moving object flowing in from the second route having the intersection with the first route toward the position of the first stop position candidate on the first route for stopping the subject vehicle is equal to or lower than a specified speed.
  2. The operation planning apparatus according to claim 1,
    wherein the operation planning processor
    determines, as the stop position for stopping the subject vehicle, the stop position candidate closest to the subject vehicle among the plurality of stop position candidates in the scene encountered by the subject vehicle.
  3. The operation planning apparatus according to claim 1,
    wherein the operation planning processor
    sets the stop position candidate at a position a predetermined distance upstream of a stop position at which the subject vehicle is required to stop.
  4. The operation planning apparatus according to claim 1,
    wherein the operation planning processor
    sets the stop position candidate at a position a predetermined distance upstream of the outer frame of an area in which stopping or parking of the subject vehicle is prohibited.
  5. The operation planning apparatus according to claim 1,
    wherein the operation planning processor
    sets the stop position candidate outside the travelable area of the second route that intersects the first route.
  6. delete
  7. The operation planning apparatus according to claim 1,
    wherein the operation planning processor
    does not set the stop position candidate in an area in which no intersection encountered by the subject vehicle occurs, in accordance with the traffic signal of the first route or the traffic rules of the first route.
  8. A driving support apparatus comprising the operation planning apparatus according to any one of claims 1 to 5 and 7,
    and an evaluation processor for evaluating a scene encountered by the subject vehicle,
    wherein the evaluation processor
    extracts a plurality of intersections encountered by the subject vehicle traveling on the first route based on a relationship between the first route on which the subject vehicle travels and the second route having an intersection with the first route,
    and the operation planning processor
    sets one or a plurality of stop position candidates for stopping the subject vehicle based on the evaluation results of the relationship between the subject vehicle and the intersections, and
    determines a stop position for stopping the subject vehicle according to evaluation results of a relationship between the subject vehicle and the plurality of intersections encountered at the stop position candidates, and drafts a driving plan for stopping the subject vehicle at the stop position.
  9. An operation planning method using an operation planning processor for drafting a driving plan of a vehicle traveling on a route, the method comprising:
    extracting an intersection between a first route on which the subject vehicle travels and a second route, among routes on which a moving object moves, that has an intersection with the first route;
    setting, for each of the intersections, one or a plurality of stop position candidates for stopping the subject vehicle by using evaluation results of a relationship between the subject vehicle and the intersections encountered over time when the subject vehicle travels on the first route; and,
    when drafting a driving plan for a scene encountered by the subject vehicle by using evaluation results of a relationship between the subject vehicle and a plurality of intersections encountered at the stop position candidates,
    determining a stop for a second stop position candidate on the upstream side of a first stop position candidate when the speed of the moving object flowing in from the second route having the intersection with the first route toward the position of the first stop position candidate on the first route for stopping the subject vehicle is equal to or lower than a specified speed.
KR1020187001530A 2015-07-21 2015-07-21 Driving planning device, driving support device, driving planning method KR102005203B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/070748 WO2017013750A1 (en) 2015-07-21 2015-07-21 Driving plan device, travel support device, and driving plan method

Publications (2)

Publication Number Publication Date
KR20180018789A KR20180018789A (en) 2018-02-21
KR102005203B1 true KR102005203B1 (en) 2019-07-29

Family

ID=57834281

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020187001530A KR102005203B1 (en) 2015-07-21 2015-07-21 Driving planning device, driving support device, driving planning method

Country Status (10)

Country Link
US (1) US10112614B2 (en)
EP (1) EP3333820A4 (en)
JP (1) JP6451848B2 (en)
KR (1) KR102005203B1 (en)
CN (1) CN107851375A (en)
BR (1) BR112018001073A2 (en)
CA (1) CA2993151A1 (en)
MX (1) MX368086B (en)
RU (1) RU2682092C1 (en)
WO (1) WO2017013750A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016147368A1 (en) * 2015-03-19 2016-09-22 三菱電機株式会社 Driving control device and driving control method
CN107851372A (en) * 2015-07-21 2018-03-27 日产自动车株式会社 Driving plan device, driving assist system, driving plan method
US10416675B2 (en) 2016-04-12 2019-09-17 Agjunction Llc Line acquisition path generation using curvature profiles
US10202118B2 (en) * 2016-10-14 2019-02-12 Waymo Llc Planning stopping locations for autonomous vehicles
JP2018203210A (en) * 2017-06-09 2018-12-27 アイシン精機株式会社 Travel support system
WO2020039224A1 (en) * 2018-08-21 2020-02-27 日産自動車株式会社 Driving plan display method and driving plan display device
WO2020139388A1 (en) * 2018-12-28 2020-07-02 Didi Research America, Llc Vehicle-provided virtual stop and yield line clustering
WO2020139394A1 (en) * 2018-12-28 2020-07-02 Didi Research America, Llc On-board vehicle stop cause determination system
US10746557B1 (en) * 2019-06-21 2020-08-18 Lyft, Inc. Systems and methods for navigation using bounding areas

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009157589A (en) * 2007-12-26 2009-07-16 Denso Corp Stoppage guide device
WO2011055823A1 (en) 2009-11-09 2011-05-12 株式会社小松製作所 Apparatus and method for controlling travel of vehicles

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1349131B1 (en) * 2000-11-24 2006-01-18 Aisin Seiki Kabushiki Kaisha Vehicle collision preventing apparatus
JP4525915B2 (en) 2005-02-16 2010-08-18 株式会社デンソー Driving assistance device
JP4857134B2 (en) * 2007-01-30 2012-01-18 クラリオン株式会社 In-vehicle map display device, vehicle map display system
RU2384887C2 (en) * 2007-06-14 2010-03-20 Сергей Николаевич Кононов System for monitoring compliance with road traffic rules
JP5050735B2 (en) * 2007-08-27 2012-10-17 マツダ株式会社 Vehicle driving support device
JP5083066B2 (en) * 2008-06-24 2012-11-28 トヨタ自動車株式会社 Driving assistance device
JP4614005B2 (en) * 2009-02-27 2011-01-19 トヨタ自動車株式会社 Moving locus generator
JP2010271844A (en) 2009-05-20 2010-12-02 Toyota Central R&D Labs Inc Driving support controller and program
JP5407764B2 (en) * 2009-10-30 2014-02-05 トヨタ自動車株式会社 Driving assistance device
KR101798053B1 (en) * 2010-09-06 2017-11-15 현대모비스 주식회사 System and Method for Vehicle Control for Collision Avoidance on the basis of Vehicular communication systems
CN103298673B (en) * 2011-01-12 2016-01-27 丰田自动车株式会社 Vehicular information disposal system
WO2012105030A1 (en) 2011-02-03 2012-08-09 トヨタ自動車株式会社 Vehicle control apparatus
CN102679993A (en) * 2011-03-18 2012-09-19 阿尔派株式会社 Navigation device and driving guide method thereof
WO2013008299A1 (en) * 2011-07-11 2013-01-17 トヨタ自動車株式会社 Vehicle emergency withdrawal device
JP2013050803A (en) * 2011-08-30 2013-03-14 Sanyo Electric Co Ltd Mobile communication device and method of supporting rear-end collision prevention
EP2827320B1 (en) * 2012-03-16 2020-01-08 Nissan Motor Co., Ltd Device for determining sensitivity to prediction of unexpected situations
US8793046B2 (en) * 2012-06-01 2014-07-29 Google Inc. Inferring state of traffic signal and other aspects of a vehicle's environment based on surrogate data
EP2878507B1 (en) * 2012-07-24 2017-08-30 Toyota Jidosha Kabushiki Kaisha Drive assist device
CN103177596B (en) * 2013-02-25 2016-01-06 中国科学院自动化研究所 A kind of intersection independent control system
CN105246755B (en) * 2013-05-31 2017-11-21 日立汽车系统株式会社 Controller of vehicle
JP6049551B2 (en) * 2013-06-21 2016-12-21 三菱電機株式会社 Navigation device
JP6230620B2 (en) * 2013-12-10 2017-11-15 三菱電機株式会社 Travel control device
US9159032B1 (en) * 2014-03-19 2015-10-13 Xerox Corporation Predicting arrival times of vehicles based upon observed schedule adherence
CN107851372A (en) * 2015-07-21 2018-03-27 日产自动车株式会社 Driving plan device, driving assist system, driving plan method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009157589A (en) * 2007-12-26 2009-07-16 Denso Corp Stoppage guide device
WO2011055823A1 (en) 2009-11-09 2011-05-12 株式会社小松製作所 Apparatus and method for controlling travel of vehicles

Also Published As

Publication number Publication date
MX368086B (en) 2019-09-19
EP3333820A1 (en) 2018-06-13
MX2018000498A (en) 2018-04-13
EP3333820A4 (en) 2019-01-23
JP6451848B2 (en) 2019-01-16
BR112018001073A2 (en) 2018-09-11
KR20180018789A (en) 2018-02-21
WO2017013750A1 (en) 2017-01-26
US20180208199A1 (en) 2018-07-26
US10112614B2 (en) 2018-10-30
CN107851375A (en) 2018-03-27
RU2682092C1 (en) 2019-03-14
JPWO2017013750A1 (en) 2018-05-24
CA2993151A1 (en) 2017-01-26

Similar Documents

Publication Publication Date Title
US10591929B2 (en) Prioritized constraints for a navigational system
US10507807B2 (en) Systems and methods for causing a vehicle response based on traffic light detection
JP6296162B2 (en) Vehicle travel control apparatus and method
KR20190030199A (en) Supervision of vehicles
US10214206B2 (en) Parking assist system for vehicle
JP6619436B2 (en) Autonomous vehicle that detects and responds to concession scenarios
EP3121085B1 (en) Vehicle operation device
JP6252235B2 (en) Automatic driving support system, automatic driving support method, and computer program
JP6015329B2 (en) Convoy travel system and convoy travel device
US20170072962A1 (en) Traffic light anticipation
WO2016159170A1 (en) Automatic driving assistance system, automatic driving assistance method, and computer program
DE102015114464A9 (en) Uniform motion planner for an autonomous vehicle while avoiding a moving obstacle
US20150253772A1 (en) Apparatus and method for continuously establishing a boundary for autonomous driving availability and an automotive vehicle comprising such an apparatus
JP6380274B2 (en) Navigation device for autonomous vehicles
WO2015156146A1 (en) Travel control device, onboard display device, and travel control system
JP6250180B2 (en) Vehicle irradiation control system and image irradiation control method
US8385600B2 (en) Vehicle driving assistance apparatus
JP5886185B2 (en) Method for automatically recognizing driving maneuvers of a motor vehicle and driver assistance system including this method
US8754782B2 (en) Vehicle wrong-way travel detection device
JP5316698B2 (en) Driving assistance device
JP4614005B2 (en) Moving locus generator
JP6365672B2 (en) Travel control device and travel control method
JP4792866B2 (en) Navigation system
RU2660158C1 (en) Device and method of traffic control
JP4955628B2 (en) Packet traffic control system

Legal Events

Date Code Title Description
AMND Amendment
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant