CN109426263B - Vehicle control device, vehicle control method, and storage medium


Info

Publication number
CN109426263B
CN109426263B (application CN201810978033.1A)
Authority
CN
China
Prior art keywords
vehicle
situation
object target
road
traveling
Prior art date
Legal status
Active
Application number
CN201810978033.1A
Other languages
Chinese (zh)
Other versions
CN109426263A (en)
Inventor
加藤大智
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN109426263A
Application granted
Publication of CN109426263B

Classifications

    • G05D1/0251: Control of position or course in two dimensions specially adapted to land vehicles, using optical position detecting means (a video camera combined with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision)
    • G05D1/0214: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0257: Control of position or course in two dimensions specially adapted to land vehicles, using a radar
    • G05D1/027: Control of position or course in two dimensions specially adapted to land vehicles, using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0274: Control of position or course in two dimensions specially adapted to land vehicles, using mapping information stored in a memory device
    • G05D1/0278: Control of position or course in two dimensions specially adapted to land vehicles, using satellite positioning signals, e.g. GPS
    • G05D1/0891: Control of attitude, i.e. control of roll, pitch, or yaw, specially adapted for land vehicles
    • B60W30/16: Control of distance between vehicles, e.g. keeping a distance to a preceding vehicle
    • B60W30/18145: Propelling the vehicle related to particular drive situations; cornering
    • B60W50/0097: Predicting future conditions
    • B60W2552/53: Input parameters relating to infrastructure; road markings, e.g. lane marker or crosswalk
    • B60W2554/00: Input parameters relating to objects
    • G06V20/56: Context or environment of the image exterior to a vehicle, by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/588: Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • G08G1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Abstract

The invention provides a vehicle control device, a vehicle control method, and a storage medium capable of generating a more appropriate travel trajectory according to the traveling environment. A vehicle control device according to an embodiment includes: a recognition unit that recognizes vehicles in the vicinity of the host vehicle; and a situation setting unit that sets, for a plurality of divided areas obtained by dividing a road area, an object target situation derived from the nearby vehicles recognized by the recognition unit, and that makes the object target situation differ depending on whether or not the host vehicle or a nearby vehicle is traveling in a predetermined traveling environment.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
The following technique is known: whether or not a blind spot exists is determined from an image captured in the traveling direction of the host vehicle, and if a blind spot is determined to exist, a future target position and target posture of the host vehicle are set based on the blind spot (for example, Japanese Patent Application Laid-Open No. 2013-186722).
However, this technique does not take into account that the degree of influence of nearby vehicles on the host vehicle varies with the host vehicle's traveling environment, so an appropriate travel trajectory may not be generated.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can generate a more appropriate travel trajectory according to the traveling environment.
The vehicle control device, the vehicle control method, and the storage medium according to the present invention have the following configurations.
(1): A vehicle control device according to an aspect of the present invention includes: a recognition unit that recognizes vehicles in the vicinity of the host vehicle; and a situation setting unit that sets, for a plurality of divided areas obtained by dividing a road area, an object target situation derived from the nearby vehicles recognized by the recognition unit, and that makes the object target situation differ depending on whether or not the host vehicle or a nearby vehicle is traveling in a predetermined traveling environment.
(2): In the aspect of (1) above, the predetermined traveling environment is a curved road, and the situation setting unit may make the object target situation differ between a case where the host vehicle or the nearby vehicle is traveling on a curved road and a case where it is not.
(3): In the aspect of (2) above, the situation setting unit may change the degree of difference in the object target situation based on the curvature of the curved road.
(4): In the aspect of (3) above, the predetermined traveling environment is a curved road having at least a predetermined curvature, and the situation setting unit may make the object target situation differ between a case where the host vehicle or the nearby vehicle is traveling on such a curved road and a case where it is not.
(5): In the aspect of (1) above, the situation setting unit may vary the degree of change in the object target situation based on the position and behavior of the nearby vehicle.
(6): In the aspect of (1) above, the recognition unit recognizes a road dividing line between the position of the host vehicle and the position of the nearby vehicle, and the situation setting unit makes the object target situation differ between a case where the road dividing line is recognized by the recognition unit and a case where it is not.
(7): In the aspect of (1) above, the situation setting unit may make the object target situation differ between a case where the possibility of a behavior change of the nearby vehicle is predicted to be high based on the traveling environment of the nearby vehicle and a case where it is not.
(8): In the aspect of (1) above, the vehicle control device further includes: a guidance situation setting unit that sets, for the plurality of divided regions obtained by dividing the road region, a guidance situation derived from the road region; an evaluation unit that derives an index value for evaluating the situation of a focused divided region based on the object target situation and the guidance situation set for that focused divided region among the plurality of divided regions, and on prediction information generated for peripheral divided regions selected from the periphery of the focused divided region; a selection unit that selects one or more divided regions along the traveling direction of the host vehicle from the plurality of divided regions based on the index value derived by the evaluation unit; and a trajectory generation unit that generates a future travel trajectory of the host vehicle based on the one or more divided regions selected by the selection unit.
(9): A vehicle control method according to an aspect of the present invention is executed by a computer mounted on a vehicle and includes: recognizing vehicles in the vicinity of the host vehicle; setting, for a plurality of divided areas obtained by dividing a road area, an object target situation derived from the recognized nearby vehicles; and making the object target situation differ depending on whether or not the host vehicle or a nearby vehicle is traveling in a predetermined traveling environment.
(10): A storage medium according to an aspect of the present invention stores a program that causes a computer to: recognize vehicles in the vicinity of the host vehicle; set, for a plurality of divided areas obtained by dividing a road area, an object target situation derived from the recognized nearby vehicles; and make the object target situation differ depending on whether or not the host vehicle or a nearby vehicle is traveling in a predetermined traveling environment.
According to the aspects (1) to (10) described above, a more appropriate travel trajectory can be generated in accordance with the traveling environment.
Drawings
Fig. 1 is a block diagram of a vehicle system including an automatic driving control unit.
Fig. 2 is a diagram showing a case where the vehicle position recognition unit recognizes the relative position and posture of the vehicle M with respect to the traveling lane.
Fig. 3 is a diagram showing how a target trajectory is generated based on a recommended lane.
Fig. 4 is a functional configuration diagram of the action plan generating unit.
Fig. 5 is a diagram showing an example of the grid set in the road surface area.
Fig. 6 is a diagram showing an example of a guidance situation.
Fig. 7 is a diagram for explaining a method of setting the object target situation.
Fig. 8 is a diagram showing a situation in which the object target situation region set in another vehicle is viewed from behind the other vehicle.
Fig. 9 is a diagram showing a state in which an object target situation region set in another vehicle is viewed from a lateral direction.
Fig. 10 is a diagram schematically showing an object target situation region set in the scene of fig. 7 in three dimensions.
Fig. 11 is a diagram for explaining the relationship between the host vehicle M traveling on a curved road and another vehicle.
Fig. 12 is a diagram of the object target situation region set for another vehicle on a curved road, as viewed from the front of the other vehicle.
Fig. 13 is a diagram for explaining the setting of the object target situation region in the case where another vehicle is traveling on the lane along the road dividing line.
Fig. 14 is a diagram for explaining the setting of the object target situation region in the case where an obstacle is detected in the traveling direction of another vehicle.
Fig. 15 is a diagram for explaining the setting of the object target situation region in the case where the road dividing line between the traveling lane and the opposite lane cannot be recognized.
Fig. 16 is a diagram for explaining derivation of the index value.
Fig. 17 is a diagram showing another example of the peripheral grid.
Fig. 18 is a diagram for explaining the selection grid selected by the selection unit and the target trajectory generated based on the selection grid.
Fig. 19 is a flowchart showing an example of the processing executed by the action plan generating unit according to the embodiment.
Fig. 20 is a diagram showing an example of the hardware configuration of the automatic driving control unit according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention will be described below with reference to the accompanying drawings. In the following embodiments, the vehicle control device is applied to an autonomous vehicle. In automated driving, for example, at least one of the speed and the steering of the host vehicle M is controlled automatically to make the host vehicle travel.
[Overall Configuration]
Fig. 1 is a configuration diagram of the vehicle system 1 including the automatic driving control unit 100. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its drive source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine, or the discharge power of a secondary battery or fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a probe 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, an ETC (Electronic Toll Collection) in-vehicle device 40, a navigation device 50, an MPU (Micro-Processing Unit) 60, a vehicle sensor 70, a driving operation element 80, an automatic driving control unit 100, a running driving force output device 200, a brake device 210, and a steering device 220. These apparatuses and devices are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in fig. 1 is merely an example; a part of it may be omitted, and other components may be added. The automatic driving control unit 100 is an example of a "vehicle control device".
The camera 10 is a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. One or more cameras 10 are mounted at arbitrary positions on the vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted. For imaging the area ahead, the camera 10 is attached to the upper part of the front windshield, the back surface of the rearview mirror, or the like. The camera 10, for example, periodically and repeatedly images the periphery of the host vehicle M. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and direction) of the object. One or more radar devices 12 are mounted at arbitrary positions on the host vehicle M. The radar device 12 may detect the position and velocity of an object by an FM-CW (Frequency Modulated Continuous Wave) method.
The probe 14 is a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging) sensor that measures scattered light with respect to irradiated light and detects the distance to a target. One or more probes 14 are mounted at arbitrary positions on the host vehicle M.
The object recognition device 16 performs a sensor fusion process on the detection results detected by some or all of the camera 10, the radar device 12, and the probe 14 to recognize the position, the type, the speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control unit 100.
The communication device 20 communicates with other vehicles (nearby vehicles) present in the vicinity of the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or DSRC (Dedicated Short Range Communication), or communicates with various server devices via a wireless base station.
The HMI 30 presents various information to occupants of the host vehicle M and accepts input operations from them. The HMI 30 includes, for example, a touch panel, switches, and the like (not shown). The touch panel may be configured by combining a display device such as an LCD (Liquid Crystal Display) or organic EL (Electroluminescence) display with a touch-sensing mechanism.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53, and holds the first map information 54 in a storage device such as an HDD (Hard Disk Drive) or flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may also be determined or supplemented by an INS (Inertial Navigation System) that uses the output of the vehicle sensor 70. The navigation HMI 52 includes a display device, a speaker, a touch panel, keys, and the like, and may be shared in part or in whole with the aforementioned HMI 30. The route determination unit 53 determines, with reference to the first map information 54, a route from the position of the host vehicle M (or an arbitrary input position) specified by the GNSS receiver 51 to a destination input by an occupant using the navigation HMI 52. The first map information 54 is, for example, information in which road shapes are expressed by links representing roads and nodes connected by the links; it may also include road curvature, POI (Point Of Interest) information, and the like. The route determined by the route determination unit 53 is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the determined route. The navigation device 50 may be realized by a function of a terminal device such as a smartphone or tablet carried by the user, and may transmit the current position and destination to a navigation server via the communication device 20 and acquire the route returned from the navigation server.
The MPU 60 functions as, for example, the recommended lane determining unit 61, and holds the second map information 62 in a storage device such as an HDD or flash memory. The recommended lane determining unit 61 divides the route provided from the navigation device 50 into a plurality of blocks (for example, every 100 [m] in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determining unit 61 determines, for example, to travel in the second lane from the left. When a branch point, a merge point, or the like exists on the route, the recommended lane determining unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding to the branch destination.
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on the boundary of a lane, and the like. The second map information 62 may include road information, traffic regulation information, address information (address, zip code), facility information, telephone number information, and the like. The road information includes information indicating the type of road, such as an expressway, a toll road, a national road, and a prefecture road, the number of lanes on the road, the width of each lane, the gradient of the road, the position of the road (including three-dimensional coordinates of longitude, latitude, and height), the curvature of a turn of the lane, the positions of a junction point and a branch point of the lane, and a sign provided on the road. The second map information 62 can be updated at any time by using the communication device 20 to access other devices.
The vehicle sensors 70 include a vehicle speed sensor that detects the speed of the own vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity about a vertical axis, an orientation sensor that detects the orientation of the own vehicle M, and the like. The vehicle sensor 70 may also include an outside air temperature sensor that detects the outside air temperature.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, and other operation elements. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the automatic driving control unit 100, or to one or more of the running driving force output device 200, the brake device 210, and the steering device 220.
The automatic driving control unit 100 includes, for example, a first control unit 120, a second control unit 140, and a storage unit 160. The first control unit 120 and the second control unit 140 are each realized by a processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of their functional units may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration) circuit, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. The program may be stored in the storage unit 160 in advance, or may be stored in a removable storage medium such as a DVD or CD-ROM and installed in the storage unit 160 by mounting the storage medium in a drive device. The combination of the external environment recognition unit 121 and the vehicle position recognition unit 122 is an example of the "recognition unit".
The first control unit 120 includes, for example, an external environment recognition unit 121, a vehicle position recognition unit 122, and an action plan generation unit 123.
The external environment recognition unit 121 recognizes the states of nearby vehicles, such as position, speed, and acceleration, based on information input from the camera 10, the radar device 12, and the probe 14 via the object recognition device 16. The position of a nearby vehicle may be represented by a representative point such as its center of gravity or a corner, or by a region expressed by its outline. The "state" of a nearby vehicle may also include its acceleration or jerk, or its "behavior state" (e.g., whether it is making or about to make a lane change). The external environment recognition unit 121 may also recognize the positions of guardrails, utility poles, parked vehicles, pedestrians, and other objects in addition to nearby vehicles.
The vehicle position recognition unit 122 recognizes, for example, the lane in which the host vehicle M is traveling (the travel lane), and the relative position and posture of the host vehicle M with respect to the travel lane. The vehicle position recognition unit 122 recognizes the travel lane by, for example, comparing the pattern of road dividing lines (for example, the arrangement of solid and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the host vehicle M recognized from images captured by the camera 10. In this recognition, the position of the host vehicle M acquired from the navigation device 50 and the result of INS processing may also be taken into account.
The vehicle position recognition unit 122 recognizes, for example, the position and posture of the host vehicle M with respect to the travel lane. Fig. 2 shows how the vehicle position recognition unit 122 recognizes the relative position and posture of the host vehicle M with respect to the travel lane L1. The vehicle position recognition unit 122 recognizes, as the relative position and posture of the host vehicle M with respect to the travel lane L1, for example, the deviation OS of a reference point of the host vehicle M (for example, its center of gravity) from the travel lane center CP, and the angle θ between the traveling direction of the host vehicle M and a line along the travel lane center CP. Alternatively, the vehicle position recognition unit 122 may recognize, as the relative position of the host vehicle M with respect to the travel lane, the position of the reference point of the host vehicle M with respect to either side edge of the travel lane L1. The relative position of the host vehicle M recognized by the vehicle position recognition unit 122 is supplied to the recommended lane determining unit 61 and the action plan generating unit 123.
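As a rough illustration of this recognition step, the following sketch derives the deviation OS and the angle θ from a lane-center polyline and the vehicle pose. It is a minimal sketch under assumed inputs: the function name, the polyline representation of the center CP, and the nearest-segment search are illustrative choices, not details given in the patent.

import math

def relative_pose(lane_center, vehicle_xy, vehicle_heading):
    # Estimate the lateral deviation OS and heading error theta of the host
    # vehicle M relative to the travel lane. lane_center is a polyline of
    # (x, y) points along the lane center CP.
    px, py = vehicle_xy
    best = min(range(len(lane_center) - 1),
               key=lambda i: (lane_center[i][0] - px) ** 2 + (lane_center[i][1] - py) ** 2)
    (x0, y0), (x1, y1) = lane_center[best], lane_center[best + 1]
    lane_heading = math.atan2(y1 - y0, x1 - x0)
    # Signed lateral offset: 2D cross product of the segment direction with
    # the vector to the vehicle reference point, divided by segment length.
    os_signed = ((x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)) / math.hypot(x1 - x0, y1 - y0)
    # Heading error, wrapped to (-pi, pi].
    theta = (vehicle_heading - lane_heading + math.pi) % (2 * math.pi) - math.pi
    return os_signed, theta

print(relative_pose([(0.0, 0.0), (10.0, 0.0)], (5.0, 0.4), 0.1))  # approximately (0.4, 0.1)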
The action plan generating unit 123 determines events to be executed sequentially during automated driving so that the host vehicle M travels in the recommended lane determined by the recommended lane determining unit 61 while coping with its surroundings. Examples of events include a constant-speed travel event in which the vehicle travels in the same lane at a constant speed, a follow-up travel event in which the vehicle follows the preceding vehicle, a lane change event, a merge event, a branch event, an emergency stop event, and a handover event for ending automated driving and switching to manual driving. During execution of these events, avoidance actions may be planned based on the surroundings of the host vehicle M (the presence of nearby vehicles and pedestrians, lane narrowing due to road construction, and the like).
The action plan generating unit 123 generates a target trajectory along which the host vehicle M will travel in the future. The target trajectory includes, for example, a speed element. For example, a plurality of future reference times are set at predetermined sampling intervals (e.g., tenths of a second), and the target trajectory is generated as a set of target points (track points) to be reached at those reference times. A wide interval between track points therefore indicates high-speed travel in that section.
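The track-point idea can be made concrete with a small sketch: sampling a target speed profile at a fixed interval and accumulating distance makes wider gaps between track points encode higher speed. The 0.1 s interval and the speed profile are illustrative assumptions, not values from the patent.

def track_points(speed_profile, dt=0.1):
    # Convert a target speed profile [m/s], one sample per dt seconds, into
    # cumulative longitudinal positions of the track points.
    s, points = 0.0, []
    for v in speed_profile:
        s += v * dt  # distance covered during one sampling period
        points.append(round(s, 3))
    return points

# Accelerating from 10 m/s to 14 m/s: note the widening gaps between points.
print(track_points([10.0, 11.0, 12.0, 13.0, 14.0]))  # [1.0, 2.1, 3.3, 4.6, 6.0]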
Fig. 3 shows how a target trajectory is generated based on the recommended lane. As shown, the recommended lane is set so as to be convenient for traveling along the route to the destination. When the host vehicle comes within a predetermined distance of a recommended-lane switching point (the distance may be determined according to the type of event), the action plan generating unit 123 activates a lane change event, a branch event, a merge event, or the like. When an obstacle must be avoided during execution of an event, an avoidance trajectory is generated as shown in the figure.
The action plan generating unit 123 generates a plurality of target trajectory candidates, for example, and selects an optimal target trajectory at that time point from the viewpoint of efficiency.
An event here is, for example, one that occurs based on conditions outside the host vehicle M, such as an event determined by the action plan generating unit 123 based on the recognition result of the external environment recognition unit 121, or an event triggered by receiving a request signal described later. Executing an action means, for example, producing a predetermined, anticipated behavior by controlling the steering or acceleration/deceleration of the host vehicle M. More specifically, the action executed when a request signal is received is, for example, letting another vehicle move into line ahead of the host vehicle M.
The action plan generating unit 123 generates a future travel trajectory of the host vehicle M based on the positions of other vehicles recognized by the external environment recognition unit 121, the position of the host vehicle M recognized by the vehicle position recognition unit 122, the traveling environment of the host vehicle M or the other vehicles, and the like. The functions of the action plan generating unit 123 are described in detail later.
The second control unit 140 includes a travel control unit 141. The travel control unit 141 controls the travel driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes through the target trajectory generated by the action plan generation unit 123 at a predetermined timing.
The storage unit 160 is implemented by, for example, a nonvolatile storage device such as a ROM (Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), or HDD (Hard Disk Drive), and a volatile storage device such as a RAM (Random Access Memory) or registers. The storage unit 160 stores various information for executing the vehicle control of the embodiment, execution results, and the like.
The running driving force output device 200 outputs, to the drive wheels, the running driving force (torque) for making the vehicle travel. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU that controls them. The ECU controls this configuration in accordance with information input from the travel control unit 141 or from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the travel control unit 141 or information input from the driving operation element 80, and outputs a braking torque corresponding to a braking operation to each wheel. The brake device 210 may be provided with a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the driving operation element 80 to the hydraulic cylinder via the master cylinder as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder by controlling the actuator in accordance with information input from the travel control unit 141.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the orientation of the steered wheels by, for example, applying force to a rack-and-pinion mechanism. The steering ECU drives the electric motor in accordance with information input from the travel control unit 141 or from the driving operation element 80 to change the orientation of the steered wheels.
[Details of the Action Plan Generating Unit]
Next, the function of the action plan generating unit 123 will be described in detail. Fig. 4 is a functional configuration diagram of the action plan generating unit 123. The action plan generating unit 123 includes, for example, a speed generating unit 130, a grid processing unit 131, a guidance situation setting unit 132, an object target situation setting unit 134, an evaluating unit 136, a selecting unit 137, and a trajectory generating unit 138. The object target situation setting unit 134 is an example of the "situation setting unit".
The speed generation unit 130 generates a current or future speed (target speed) of the host vehicle M based on, for example, the recognition result of the external environment recognition unit 121. The target speed is set arbitrarily within a range not exceeding the legal speed limit.
The grid processing unit 131 maps the road surface region ahead of the host vehicle M onto a coordinate system (i, j) whose axes run along the longitudinal direction of the road (traveling direction) and the width direction of the road (lateral direction), and virtually sets a grid G obtained by dividing that region at constant widths in both directions.
Fig. 5 shows an example of the grid G set on a road surface region. The example shows a road with two lanes in each direction, drawn as a straight road to simplify the description, though the same processing can be applied to a curved road via an appropriate conversion. The partition width of the grid G may be equal in the traveling direction and the lateral direction, or may differ between them. The grid G need not be a lattice; it may take other forms such as a honeycomb, and its cells are not limited to quadrangles but may be circles or other polygons.
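A minimal sketch of how the grid G might be held in memory follows; the cell widths and the NumPy array layout are assumptions made for illustration only.

import numpy as np

def make_grid(road_length, road_width, di=2.0, dj=0.5):
    # Virtually divide the road surface ahead of the host vehicle M into a
    # grid G indexed by (i, j): i along the traveling direction, j along the
    # road width. Each cell will later hold an index value.
    ni = int(road_length / di)
    nj = int(road_width / dj)
    return np.zeros((ni, nj))

grid = make_grid(road_length=100.0, road_width=7.0)
print(grid.shape)  # (50, 14)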
The grid processing unit 131 also sets an effective area to be evaluated by the evaluation unit 136. Here, the effective area is set within one lane: as illustrated, the effective area EF is set on the travel lane L1 excluding the region AR1 near the road dividing line CL and the region AR2 near the left-side road dividing line LL. The effective area EF is sized such that, if the representative point of the host vehicle M (for example, its center of gravity) falls within the effective area EF, no part of the host vehicle M overlaps the road dividing line CL or the road dividing line LL.
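Continuing the sketch, the effective area EF can be expressed as the grid columns whose cells keep the vehicle body off both dividing lines; the helper name and the margin computation are assumed for illustration.

def effective_j_range(lane_width, cell_w, vehicle_half_width):
    # Columns of the grid whose cells keep the host vehicle M's body clear of
    # the road dividing lines CL and LL (the effective area EF in one lane).
    margin_cells = int(round(vehicle_half_width / cell_w))
    n_cols = int(lane_width / cell_w)
    return range(margin_cells, n_cols - margin_cells)

# A 3.5 m lane in 0.5 m cells, 1.0 m half-width: only columns 2..4 are usable.
print(list(effective_j_range(lane_width=3.5, cell_w=0.5, vehicle_half_width=1.0)))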
The guidance situation setting unit 132 sets the guidance situation Pi based on the road area. The guidance situation Pi is an index value set for a road region based on, for example, the presence of road demarcations (road dividing lines, guardrails, a median strip, shoulder lines, or the like).
Fig. 6 shows an example of the guidance situation Pi. The vertical axis represents the guidance situation Pi, and the horizontal axis represents the lateral position on the lane. The guidance situation Pi is an index value whose lower values guide the host vehicle M more strongly. As shown by the solid line in fig. 6, the value of the guidance situation Pi is set higher with increasing distance from the lane center CP.
When the road being traveled is a curved road, the guidance situation setting unit 132 may offset the guidance situation Pi by a predetermined distance in the direction away from the opposite lane (toward the road dividing line LL). In this case, as shown by the broken line in fig. 6, the value of the guidance situation Pi is lowest at a position CP' offset from the travel lane center CP by the predetermined distance, and becomes higher with increasing distance from CP'. In particular, since an oncoming vehicle traveling in the opposite lane has a high relative speed, when the travel road is a curve the guidance situation Pi is offset away from the opposite lane even before the object recognition device 16 recognizes an oncoming vehicle. Thus, when traveling on a curve, the traveling position of the host vehicle M can be shifted away from the opposite lane, making contact with an oncoming vehicle easier to avoid.
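One plausible reading of fig. 6 is a convex function of lateral position that is lowest at the lane center CP, or at the offset position CP' on a curve. The quadratic shape and the coefficient below are assumptions; the patent only specifies that Pi grows with distance from CP (or CP').

def guidance_situation(lateral_pos, lane_center, curve_offset=0.0, k=1.0):
    # Guidance situation Pi: lowest at the (possibly offset) lane center and
    # rising away from it; lower values guide the host vehicle M more.
    cp_prime = lane_center + curve_offset  # CP' on a curve, CP otherwise
    return k * (lateral_pos - cp_prime) ** 2

print(guidance_situation(1.75, lane_center=1.75))                     # 0.0 at CP
print(guidance_situation(1.75, lane_center=1.75, curve_offset=-0.5))  # 0.25 at CP'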
The object target situation setting unit 134 sets the object target situation Po based on the predicted future surroundings of the host vehicle M. The object target situation Po is, for example, an index value set for each nearby object recognized by the external environment recognition unit 121. Like the guidance situation, it is an index value whose lower values guide the host vehicle M more strongly; it is set lower, for example, the farther away the object (e.g., another vehicle) is.
Fig. 7 is a diagram for explaining a method of setting the object target situation Po. In the illustrated example, the object target situation setting unit 134 sets the object target situation Po at time t0: another vehicle m1 is traveling ahead of the host vehicle M in the host vehicle's travel lane L1, and another vehicle m2 is traveling in the adjacent lane L2 at a position overlapping the host vehicle M in the traveling direction.
The object target situation setting unit 134 predicts the positions of the host vehicle M from time t0 to t4 based on the speed generated by the speed generation unit 130. In the example of fig. 7, the host vehicle M is at position 0 (zero) at time t0 and is expected to reach position x1 at time t1, position x2 at time t2, position x3 at time t3, and position x4 at time t4. The object target situation setting unit 134 predicts the positions of the other vehicles m1 and m2 at times t1 to t4 based on their behavior at and before time t0. The vehicle positions at times t1 to t4 in fig. 7 are the predicted positions of the host vehicle M, the other vehicle m1, and the other vehicle m2.
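The patent does not specify the prediction model; a constant-velocity extrapolation such as the following is one minimal possibility, shown only to make the t0 to t4 positions concrete.

def predict_positions(x0, v, times):
    # Predict longitudinal positions at future reference times, assuming the
    # vehicle holds its current speed v from its current position x0.
    return {t: x0 + v * t for t in times}

# Host vehicle M at position 0 with an assumed speed of 15 m/s.
print(predict_positions(0.0, 15.0, [1, 2, 3, 4]))  # {1: 15.0, 2: 30.0, 3: 45.0, 4: 60.0}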
First, the object target situation Po relating to the other vehicle m1 will be described. For example, the host vehicle M is predicted to catch up with the other vehicle m1 at time t4. The object target situation setting unit 134 therefore sets, as the object target situation region PoA, a region centered on the region PoA1 occupied by the other vehicle m1 at the time the host vehicle M catches up with it (or slightly before), together with the surrounding region PoA2. The object target situation region PoA is a region representing the distribution of the object target situation Po.
Fig. 8 shows the object target situation region PoA set for the other vehicle m1, viewed from behind the other vehicle m1. The height at each coordinate of the object target situation region PoA in fig. 8 indicates the magnitude of the object target situation Po. For example, the object target situation Po is set such that it is largest in the region PoA1 corresponding to the position of the other vehicle m1 and becomes gradually lower with distance from the region PoA1. In this case, the object target situation setting unit 134 sets the magnitude of the object target situation Po as a function of the distance from the region PoA1, using a predetermined function or the like; it may instead set the magnitude in steps according to that distance.
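The text only says that Po is largest at PoA1 and falls off gradually; a Gaussian fall-off, as sketched below, is one possible realization of that description, with the peak value and width chosen arbitrarily.

import math

def object_target_situation(cell_xy, poa1_xy, peak=1.0, sigma=2.0):
    # Object target situation Po for one grid cell: largest over the region
    # PoA1 occupied by the other vehicle, decaying smoothly with distance.
    dx = cell_xy[0] - poa1_xy[0]
    dy = cell_xy[1] - poa1_xy[1]
    return peak * math.exp(-(dx * dx + dy * dy) / (2.0 * sigma ** 2))

print(object_target_situation((0.0, 0.0), (0.0, 0.0)))  # 1.0 at PoA1
print(object_target_situation((4.0, 0.0), (0.0, 0.0)))  # ~0.135 farther away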
Next, the object target situation Po for the other vehicle m2 will be described. The host vehicle M is predicted to at least partially overlap the other vehicle m2 in the traveling direction at each time from t0 to t2. The object target situation setting unit 134 sets the object target situation Po around the region occupied by the position history of the other vehicle m2 during the period (or a slightly extended period) in which the position of the host vehicle M in the traveling direction overlaps that of the other vehicle m2.
Fig. 9 shows the object target situation region PoA set for the other vehicle m2, viewed from the side. The height at each coordinate of the object target situation region PoA in fig. 9 indicates the magnitude of the object target situation Po. For example, the object target situation Po is set such that it is largest in the regions PoA1 corresponding to the positions of the other vehicle m2 from time t0 to t2 and becomes gradually lower with distance from those regions. Fig. 10 schematically shows, in three dimensions, the object target situation regions PoA set in the scene of fig. 7; the height direction in the figure indicates the magnitude of the object target situation Po.
The object target situation setting unit 134 may vary the object target situation region PoA set on the straight road described above depending on whether the host vehicle M is traveling in a predetermined traveling environment. The predetermined traveling environment concerns, for example, the shape of the lane in which the host vehicle or another vehicle is traveling, the traveling state of the other vehicle, the presence or absence of an obstacle on the lane, or whether a road dividing line can be recognized. The shape of the lane refers to, for example, a curved road, an S-shaped road, a gradient along the traveling direction or a lateral inclination relative to it, or a change in road width. The traveling state of the other vehicle is, for example, a state in which the other vehicle is traveling close to a road dividing line.
Fig. 11 illustrates the relationship between the host vehicle M traveling on a curved road and another vehicle m3. In the examples of figs. 11 to 14, the travel lane L1 is a single lane and the lane L3 is the opposite lane. When the travel road of the host vehicle M or the other vehicle m3 is a curved road, the object target situation setting unit 134 sets an object target situation region PoAc that differs from the object target situation region PoA set on a straight road, based on the curvature radius R of the curved road. In fig. 11, the object target situation region PoAc extends to the area just ahead of the other vehicle m3 because, given the relationship between the traveling directions of the host vehicle M and the other vehicle m3, contact is most likely to occur ahead of the other vehicle m3. The curvature radius R may be defined with reference to, for example, the road dividing line between the lanes, or the center of a lane.
Fig. 12 shows the object target situation region PoAc set for the other vehicle m3 on the curved road, viewed from the front of the other vehicle m3. The height at each coordinate of the object target situation region PoAc in fig. 12 indicates the magnitude of the object target situation Po. The object target situation setting unit 134 sets the object target situation Po such that it is largest in the region PoA1 corresponding to the position of the other vehicle m3 and becomes gradually lower with distance from the region PoA1. Here, the steering angle θ of the other vehicle m3 is predicted to change toward the host vehicle M side in accordance with the curvature radius R of the lane L3. The object target situation setting unit 134 therefore adjusts the region so that the object target situation Po on the side approaching the lane L1, relative to the region PoA1, is larger than that on the side away from the host vehicle M. That is, the object target situation region PoA2b on the lane L1 side of the other vehicle m3 shown in fig. 12 is larger than the object target situation region PoA2a on the opposite side, which would be set when traveling on a straight road.
The object target situation setting unit 134 may adjust the object target situation region PoAc with respect to the curvature radius R using a predetermined function or the like, or may adjust it to increase stepwise based on the distance from the region PoA1. The object target situation setting unit 134 may also make the degree of change of the object target situation region PoAc larger as the curvature radius R becomes smaller (i.e., the sharper the curve).
The object target situation setting unit 134 may change the degree of difference in the object target situation Po based on the curvature of the curved road. In this case, the object target situation setting unit 134 makes the object target situation region PoAc for the case where the host vehicle M or the other vehicle m3 is traveling on a curved road having at least a predetermined curvature (in other words, a curvature radius R at or below a predetermined value) larger than the object target situation region PoA for the case where it is not.
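As a toy model of this curvature dependence, the sketch below enlarges the region on the host-vehicle side as the curvature radius R shrinks; the gain, threshold, and linear form are assumptions, since the patent only requires that a sharper curve produce a larger change.

def lateral_scale_toward_host(radius_r, base=1.0, gain=50.0, r_threshold=200.0):
    # Scale factor for the lane-L1 side of the object target situation region
    # PoAc: the smaller the curvature radius R (the sharper the curve), the
    # larger the enlargement toward the host vehicle M.
    if radius_r >= r_threshold:  # gentle curve: treat like a straight road
        return base
    return base * (1.0 + gain / radius_r)

print(lateral_scale_toward_host(500.0))  # 1.0, no enlargement
print(lateral_scale_toward_host(100.0))  # 1.5, sharper curve, larger region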
By adjusting the object target situation region in this way based on the curvature radius R of the travel lane L3 of the other vehicle m3, the region PoA is adjusted when the other vehicle is predicted to be likely to cross the road dividing line CL, so a more appropriate travel trajectory can be generated and followed.
Fig. 13 illustrates the setting of the object target situation region PoAd when the other vehicle m3 is traveling in the lane L3 along the road dividing line CL. For example, when the other vehicle m3 is traveling close to the road dividing line CL that separates the travel lane L1 of the host vehicle M from the travel lane L3 of the other vehicle m3, the object target situation setting unit 134 sets an object target situation region PoAd that enlarges the whole of the object target situation region PoAc. In this case, the object target situation setting unit 134 may make the degree of change of the region PoAd relative to the region PoAc larger according to the magnitude of the deviation OS of the other vehicle m3 from the center CP of the lane L3.
Fig. 14 illustrates the setting of the object target situation region PoAe when an obstacle Ob is detected in the traveling direction of the other vehicle m3. The obstacle Ob is, for example, a traffic cone, a utility pole, a guardrail, a vehicle parked on the road, a pedestrian, or another object.
For example, when an obstacle Ob exists in the traveling direction of the other vehicle m3, there is a high possibility that the other vehicle m3 will move toward the lane L1 side to avoid the obstacle Ob. Therefore, when the external environment recognition unit 121 recognizes an obstacle Ob in the traveling direction of the other vehicle m3, the object target situation setting unit 134 predicts that a behavior change of the other vehicle m3 is likely and sets an object target situation region PoAe that differs from the region PoAc used when no such change is likely. In this case, the object target situation setting unit 134 may deform the shape of the region PoAe based on the speed and position of the host vehicle M or the other vehicle m3, or the position and size of the obstacle Ob. The object target situation region PoAe can thus be set according to the position and behavior of the other vehicle m3.
Fig. 15 illustrates the setting of the object target situation region PoAf when the road dividing line between the travel lane and the opposite lane cannot be recognized, for example from the image captured by the camera 10. Cases where the road dividing line cannot be recognized from the image include a dividing line that should exist having disappeared, a white line that has faded, and roads such as narrow alleys that have no dividing line to begin with. The example of fig. 15 shows a lane L4 that originally has no road dividing line.
For example, when the host vehicle M travels on the lane L4 so as to face the other vehicle m3, the other vehicle m3 is predicted to travel toward the host vehicle M. Therefore, when the road dividing line between the traveling lane and the opposite lane cannot be recognized, the object target situation setting unit 134 sets an object target situation region PoAf larger than the object target situation region PoAc. The larger region may be, for example, a region enlarged by a predetermined ratio with respect to the object target situation region PoAc, or a region extended with respect to PoAc in the traveling direction of the other vehicle m3 or in the lateral direction relative to that traveling direction. Thus, an appropriate object target situation region PoA can be set according to the state of the travel lane.
The object target situation setting unit 134 may also change the object target situation region PoA when the road surface of the travel lane is rough, slippery, or the like. The road surface state may be obtained from, for example, weather information acquired from an external device via the communication device 20, the outside air temperature detected by the vehicle sensor 70, and the like. For example, when the weather is rain or snow and the outside air temperature is equal to or lower than a predetermined temperature, the object target situation setting unit 134 predicts that the road surface is frozen and likely to be slippery, and therefore enlarges the object target situation region PoA.
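The weather-and-temperature check described here can be sketched as follows; the freezing threshold and the enlargement factor are assumptions for illustration.

```python
# Sketch of the road-surface check above: if it is raining or snowing and
# the outside air temperature is at or below a threshold, the road surface
# is predicted to be frozen/slippery and the region PoA is enlarged.
# FREEZE_TEMP_C and the factor are illustrative assumptions.

FREEZE_TEMP_C = 0.0  # assumed threshold [deg C]

def region_scale_for_surface(weather: str, outside_temp_c: float) -> float:
    if weather in ("rain", "snow") and outside_temp_c <= FREEZE_TEMP_C:
        return 1.5   # enlarged PoA on a surface predicted to be slippery
    return 1.0       # baseline PoA

print(region_scale_for_surface("snow", -3.0))   # -> 1.5
print(region_scale_for_surface("clear", -3.0))  # -> 1.0
```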
The evaluation unit 136 derives an index value for evaluating the situation of a focused grid (focused divided region) among the plurality of grids (divided regions), based on the guidance situation Pi and the object target situation Po set for the focused grid and on prediction information generated for peripheral grids (peripheral divided regions) selected from the periphery of the focused grid. The prediction information is, for example, information generated from the future situation of the host vehicle M and its surroundings as predicted from the current situation of the host vehicle M, and is set based on the guidance situation Pi and the object target situation Po set for the peripheral grids. The peripheral grids (the grids for which the prediction information is generated) are, for example, a predetermined number of grids extending from the focused grid along one or both of the traveling direction and the width direction of the host vehicle M. For example, the peripheral grids include grids extending along the traveling direction of the host vehicle M to a position ahead of the focused grid in that direction. The peripheral grids may also be determined based on the traveling state of the host vehicle M; for example, they may cover the distance obtained by multiplying the traveling speed of the host vehicle M by a predetermined time (for example, several seconds).
Next, the evaluation unit 136 derives an index value for each grid G of the effective area EF. Fig. 16 is a diagram for explaining the derivation of the index value. For example, consider grid G1. The evaluation unit 136 derives the integrated situation based on, for example, the guidance situation Pi and the object target situation Po set for the grid G1. The integrated situation may be an index obtained by adding, weighted-summing, or multiplying the guidance situation Pi and the object target situation Po, or an index value derived by inputting the guidance situation Pi and the object target situation Po to a predetermined function or the like.
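A minimal sketch of the weighted-sum variant of the integrated situation; the weights are illustrative assumptions, since the embodiment allows addition, weighting, multiplication, or an arbitrary function.

```python
# Sketch of deriving the integrated situation of one grid from the guidance
# situation Pi and the object target situation Po (weighted-sum variant).
# The default weights are illustrative assumptions.

def integrated_situation(pi: float, po: float,
                         w_pi: float = 1.0, w_po: float = 1.0) -> float:
    return w_pi * pi + w_po * po

print(integrated_situation(0.25, 0.5))            # -> 0.75
print(integrated_situation(0.25, 0.5, w_po=2.0))  # -> 1.25
```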
Next, the evaluation unit 136 derives a comprehensive index value of the grid G1 based on the integrated situation of the grid G1 and the prediction information. In the example of Fig. 16, the prediction information is obtained by integrating the guidance situation Pi and the object target situation Po set for each of the grids G2 to Gn, that is, an integrated situation derived for each of the grids G2 to Gn in the same manner as for the grid G1. Hereinafter, this is referred to as the prediction index value.
When deriving the comprehensive index value, the evaluation unit 136 may multiply each prediction index value by a weight that decreases as the distance from the focused grid G1 increases. For example, the comprehensive index value is derived based on the following formula (1).
Q(i, j) = Σ_{m=i}^{i+α} Σ_{n=j−β}^{j+β} w(m−i, n−j) · P(m, n) … (1)
Here, "Q" represents a comprehensive index value of the focused grid G, "i" represents a coordinate in the traveling direction of the focused grid, "j" represents a coordinate in the width direction of the focused grid, "α" represents a range in the traveling direction that is the target of the prediction information, and "β" represents a range in the width direction that is the target of the prediction information. In the equation (1), although the range in the width direction to be the target of the prediction information can be arbitrarily set, in the example of fig. 16, β is set to a value having a value of 1 grid.
In this case, more peripheral grids may be selected along the traveling direction (i direction) of the host vehicle M than along the width direction (j direction) of the host vehicle M.
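Putting formula (1) together with this directional bias, the sketch below computes a comprehensive index value over a peripheral range that is longer in the traveling direction (alpha) than in the width direction (beta); the inverse-distance weight is an illustrative assumption.

```python
import numpy as np

# Sketch of the comprehensive index value Q of a focused grid (i, j) per
# formula (1) above: the integrated situations P of the focused grid and of
# peripheral grids within alpha (travel direction) and beta (width
# direction) are summed with weights that shrink with distance. The
# inverse-distance weight is an illustrative assumption.

def comprehensive_index(P: np.ndarray, i: int, j: int,
                        alpha: int = 5, beta: int = 1) -> float:
    rows, cols = P.shape
    q = 0.0
    for m in range(i, min(i + alpha + 1, rows)):                 # i direction
        for n in range(max(j - beta, 0), min(j + beta + 1, cols)):  # j direction
            dist = abs(m - i) + abs(n - j)
            q += P[m, n] / (1.0 + dist)      # weight decays with distance
    return q

P = np.random.default_rng(0).random((20, 5))  # integrated situation per grid
print(round(comprehensive_index(P, i=3, j=2), 3))
```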
The guidance situation Pi, the object target situation Po, the integrated situation, and the like may also be obtained for grids G outside the effective area EF. This makes it possible to accurately derive the index value for the grids G at the end of the effective area EF on the traveling-direction side. Alternatively, when deriving the index value for a grid G at that end portion, a preset value may be used.
The peripheral grids are not limited to the rectangular range described above; they may form a circular or elliptical region, or, as described below, a polygonal region other than a square or rectangle. Fig. 17 is a diagram showing another example of the peripheral grids. The peripheral grids may be a range PR1 including, for the focused grid, "n" (an arbitrary natural number) grids G in the traveling direction and "k" (an arbitrary natural number) grids G in the width direction. For example, when attention is paid to grid G1, the peripheral grids may be a range PR1 including the grids G2 to Gn, whose number in the traveling direction with respect to the focused grid G1 is "n", the grids G adjacent to grid G1 in the width direction, and further two grids G extending in the traveling direction from each of those adjacent grids G.
Returning to Fig. 4, the selection unit 137 selects one or more grids along the traveling direction of the host vehicle M from the plurality of grids in the effective area EF based on the comprehensive index value derived by the evaluation unit 136.
The trajectory generation unit 138 generates a future target trajectory of the host vehicle M based on one or more grids along the traveling direction of the host vehicle M selected by the selection unit 137.
Fig. 18 is a diagram for explaining the selection grids SG selected by the selection unit 137 and the target trajectory TL generated based on the selection grids SG. For example, from the plurality of grids G arranged in the lateral direction relative to the traveling direction of the host vehicle M, the selection unit 137 selects, as the selection grid SG, the grid G having the lowest of the comprehensive index values derived for the respective grids G.
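The per-row minimum selection can be sketched as follows, assuming the comprehensive index values are held in a 2D array whose rows follow the traveling direction and whose columns follow the lateral direction.

```python
import numpy as np

# Sketch of the selection step: for each row of grids along the traveling
# direction, pick the grid with the lowest comprehensive index value as the
# selection grid SG.

def select_grids(Q: np.ndarray) -> list:
    """Q[i, j]: comprehensive index value of grid (i, j)."""
    return [(i, int(np.argmin(Q[i]))) for i in range(Q.shape[0])]

Q = np.array([[0.9, 0.2, 0.5],
              [0.4, 0.6, 0.3],
              [0.8, 0.1, 0.7]])
print(select_grids(Q))  # -> [(0, 1), (1, 2), (2, 1)]
```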
The trajectory generation unit 138 generates a smooth curve, expressed by a spline function (or a Hermite function) or the like, that passes as close as possible to the selection grids SG selected by the selection unit 137, and generates the target trajectory TL on the curve.
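Below, a cubic spline through the centers of the selection grids stands in for the spline (or Hermite) interpolation named above; the grid size is an illustrative assumption.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Sketch of generating a smooth target trajectory TL through the centers of
# the selection grids SG. A cubic spline stands in for the spline/Hermite
# interpolation; GRID_SIZE_M is an illustrative assumption.

GRID_SIZE_M = 2.0  # hypothetical edge length of one grid [m]

sg = [(0, 1), (1, 2), (2, 1), (3, 1)]                     # selected (row, col)
x = np.array([GRID_SIZE_M * (i + 0.5) for i, _ in sg])    # longitudinal centers
y = np.array([GRID_SIZE_M * (j + 0.5) for _, j in sg])    # lateral centers

tl = CubicSpline(x, y)                 # target trajectory TL as y(x)
xs = np.linspace(x[0], x[-1], 7)
print(np.round(tl(xs), 2))             # sampled trajectory points
```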
[ flow chart ]
Fig. 19 is a flowchart showing an example of the processing executed by the action plan generating unit 123 according to the embodiment. The processing of this flowchart may be executed repeatedly at a predetermined cycle or at predetermined timings, for example. First, the speed generation unit 130 generates a current or future speed (target speed) of the host vehicle M based on the recognition result of the external world recognition unit 121 (step S100). Next, the grid processing unit 131 sets the grids G in the longitudinal and lateral directions of the traveling lane of the host vehicle M (step S102).
Next, the guidance situation setting unit 132 sets the guidance situation Pi (step S104). Next, the object target situation setting unit 134 sets the object target situation Po based on the objects (for example, other vehicles or obstacles) recognized by the external world recognition unit 121 (step S106). Next, the object target situation setting unit 134 determines whether the other vehicle or the host vehicle M is traveling in a predetermined traveling environment (for example, on a curved road) (step S108). When it determines that the other vehicle or the host vehicle M is traveling in the predetermined traveling environment, the object target situation setting unit 134 adjusts the object target situation Po based on that traveling environment (step S110). That is, the object target situation setting unit 134 makes the object target situation Po set when the vehicle is determined to be traveling in the predetermined traveling environment different from the object target situation Po set when it is not.
Next, the evaluation unit 136 derives the comprehensive index value of each focused grid based on the guidance situation Pi, the object target situation Po, and the prediction information (step S112). Next, for each row of grids in the traveling direction, the selection unit 137 selects the grid having the smallest comprehensive index value among the plurality of lateral grids (step S114). Next, the trajectory generation unit 138 generates a future target trajectory of the host vehicle M based on the selected grids (step S116). The processing of one routine of this flowchart then ends.
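The flowchart can be condensed into one runnable toy pass, with every unit replaced by a stand-in; none of the stand-ins reproduce the actual processing of units 130 to 138.

```python
import numpy as np

# Condensed, runnable sketch of one flowchart pass (S100-S116). Every step
# is a toy stand-in for the corresponding unit; all values are assumptions.

def one_planning_cycle(rng: np.random.Generator):
    target_speed = 10.0                            # S100: target speed [m/s]
    pi = rng.random((10, 3))                       # S104: guidance situation grid
    po = rng.random((10, 3))                       # S106: object target situation
    on_curve = True                                # S108: predetermined environment?
    if on_curve:
        po = po * 1.5                              # S110: adjust Po for the curve
    q = pi + po                                    # S112: comprehensive index (toy)
    selected = [int(np.argmin(row)) for row in q]  # S114: min per lateral row
    return target_speed, selected                  # S116: basis for the trajectory

print(one_planning_cycle(np.random.default_rng(1)))
```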
According to the embodiment described above, a more appropriate travel track can be generated in accordance with the travel environment. For example, the degree of influence that another vehicle traveling on a curved road or the like exerts on the host vehicle can be set higher than when it travels on a straight road, so a travel track with a low possibility of contact with the other vehicle can be generated.
In the above-described embodiment, the vehicle control device is applied to an automatically driven vehicle, but the present invention may also be applied to a vehicle equipped with a driving support device that, based on the integrated situation or the magnitude of the comprehensive index value, notifies an occupant of a possibility of contact with another vehicle or avoids the contact.
[ hardware configuration ]
The automatic driving control unit 100 according to the above-described embodiment is realized by a hardware configuration as shown in fig. 20, for example. Fig. 20 is a diagram showing an example of the hardware configuration of the automatic driving control unit 100 according to the embodiment.
The automatic driving control unit 100 is configured such that a communication controller 100-1, a CPU100-2, a RAM100-3, a ROM100-4, a secondary storage device 100-5 such as a flash memory or an HDD, and a drive device 100-6 are connected to one another via an internal bus or a dedicated communication line. A removable storage medium such as an optical disk is mounted in the drive device 100-6. A program 100-5a stored in the secondary storage device 100-5 is loaded into the RAM100-3 by a DMA controller (not shown) or the like and executed by the CPU100-2, thereby realizing the first control unit 120 and the second control unit 140. The program referred to by the CPU100-2 may be stored in the removable storage medium mounted on the drive device 100-6, or may be downloaded from another device via the network NW.
The above embodiment can be expressed in the following manner.
A vehicle control device is provided with:
a storage device that stores information; and
a hardware processor, which executes a program,
the program for causing the hardware processor to execute an identification process and a situation setting process is stored in the storage device,
in the identification process, a nearby vehicle of the own vehicle is identified,
in the situation setting process, an object target situation obtained based on the nearby vehicle identified by the identification process is set for a plurality of divided areas obtained by dividing a road area, and the object target situation is made different depending on whether or not the host vehicle or the nearby vehicle is traveling in a predetermined traveling environment.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (10)

1. A control apparatus for a vehicle, wherein,
the vehicle control device includes:
an identification unit that identifies a nearby vehicle of the host vehicle; and
a situation setting unit that sets, for a plurality of divided areas obtained by dividing a road area, a guidance situation obtained based on the road area and an object target situation region obtained based on the nearby vehicle identified by the identification unit, and makes the guidance situation and the object target situation region different depending on whether or not the host vehicle or the nearby vehicle is traveling in a predetermined traveling environment,
the situation setting unit, when the predetermined traveling environment is a curved road, biases the position of the guidance situation in a direction away from the opposing lane before the identification unit recognizes an opposing vehicle, as compared with a case where the road is not a curved road, and
makes the object target situation region on the host-vehicle side of the opposing vehicle larger, based on the curvature of the curved road, than in a case where the road is not a curved road.
2. The vehicle control apparatus according to claim 1,
the situation setting unit sets the object target situation region in a case where the host vehicle or the nearby vehicle is traveling on the curved road to be different from that in a case where the host vehicle or the nearby vehicle is not traveling on the curved road.
3. The vehicle control apparatus according to claim 2,
the situation setting unit changes the degree of difference between the object target situation regions based on the curvature of the curved road.
4. The vehicle control apparatus according to claim 3,
the predetermined traveling environment is a curved road having a predetermined curvature or more,
the situation setting unit sets the object target situation region to be different between a case where the host vehicle or the nearby vehicle is traveling on a curved road having a predetermined curvature or more and a case where the host vehicle or the nearby vehicle is not traveling on a curved road having a predetermined curvature or more.
5. The vehicle control apparatus according to claim 1,
the situation setting unit varies the degree of change of the object target situation region based on the position and behavior of the nearby vehicle.
6. The vehicle control apparatus according to claim 1,
the identification unit identifies a road dividing line between the position of the host vehicle and the position of the nearby vehicle, and
the situation setting unit makes the object target situation region different between a case where the road dividing line is identified by the identification unit and a case where it is not identified.
7. The vehicle control apparatus according to claim 1,
the situation setting unit makes the object target situation region different between a case where the possibility of a behavior change of the nearby vehicle is predicted to be high based on the traveling environment of the nearby vehicle and a case where it is not predicted to be high.
8. The vehicle control apparatus according to claim 1,
the vehicle control device further includes:
an evaluation unit that derives an index value for evaluating a situation of a focused divided region among the plurality of divided regions, based on the object target situation region and the guidance situation set for the focused divided region and on prediction information generated for peripheral divided regions selected from the periphery of the focused divided region;
a selection unit that selects one or more divided regions along a traveling direction of the host vehicle from the plurality of divided regions based on the index value derived by the evaluation unit; and
and a trajectory generation unit that generates a future travel trajectory of the host vehicle based on the one or more divided regions along the traveling direction of the host vehicle selected by the selection unit.
9. A vehicle control method executed by a computer mounted on a vehicle, wherein,
the vehicle control method includes the processing of:
identifying a nearby vehicle of the own vehicle;
setting, for a plurality of divided areas obtained by dividing a road area, a guidance situation obtained based on the road area and an object target situation region obtained based on the identified nearby vehicle;
differentiating the guidance situation and the object target situation region according to whether or not the host vehicle or the nearby vehicle is traveling in a predetermined traveling environment;
when the predetermined traveling environment is a curved road, before an opposing vehicle is recognized, offsetting the position of the guidance situation in a direction away from the opposing lane, as compared with a case where the road is not a curved road; and
making the object target situation region on the host-vehicle side of the opposing vehicle larger, based on the curvature of the curved road, than in a case where the road is not a curved road.
10. A storage medium, wherein,
the storage medium stores a program that causes a computer to perform:
identifying a nearby vehicle of the own vehicle;
setting, for a plurality of divided areas obtained by dividing a road area, a guidance situation obtained based on the road area and an object target situation region obtained based on the identified nearby vehicle;
differentiating the guidance situation and the object target situation region according to whether or not the host vehicle or the nearby vehicle is traveling in a predetermined traveling environment;
when the predetermined traveling environment is a curved road, before an opposing vehicle is recognized, offsetting the position of the guidance situation in a direction away from the opposing lane, as compared with a case where the road is not a curved road; and
making the object target situation region on the host-vehicle side of the opposing vehicle larger, based on the curvature of the curved road, than in a case where the road is not a curved road.
CN201810978033.1A 2017-09-01 2018-08-24 Vehicle control device, vehicle control method, and storage medium Active CN109426263B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-168710 2017-09-01
JP2017168710A JP6651486B2 (en) 2017-09-01 2017-09-01 Vehicle control device, vehicle control method, and program

Publications (2)

Publication Number Publication Date
CN109426263A CN109426263A (en) 2019-03-05
CN109426263B true CN109426263B (en) 2022-03-18

Family

ID=65514581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810978033.1A Active CN109426263B (en) 2017-09-01 2018-08-24 Vehicle control device, vehicle control method, and storage medium

Country Status (3)

Country Link
US (1) US10795371B2 (en)
JP (1) JP6651486B2 (en)
CN (1) CN109426263B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11077756B2 (en) * 2017-11-23 2021-08-03 Intel Corporation Area occupancy determining device
DE102019105547A1 (en) * 2019-03-05 2020-09-10 Bayerische Motoren Werke Aktiengesellschaft Method and control unit for recognizing a vehicle entering or exiting
JP7210336B2 (en) * 2019-03-12 2023-01-23 本田技研工業株式会社 Vehicle control system, vehicle control method, and program
JP7261635B2 (en) * 2019-03-28 2023-04-20 本田技研工業株式会社 vehicle controller
JP7159137B2 (en) * 2019-09-25 2022-10-24 本田技研工業株式会社 VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM
CN111142539B (en) * 2020-01-13 2020-10-27 中智行科技有限公司 Unmanned vehicle control method and device and unmanned vehicle
JP7465705B2 (en) * 2020-03-31 2024-04-11 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
JP7369078B2 (en) 2020-03-31 2023-10-25 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
CN111645682B (en) * 2020-04-20 2021-12-28 长城汽车股份有限公司 Cruise control method and system and vehicle
CN111367299B (en) * 2020-05-26 2020-09-29 弗徕威智能机器人科技(上海)有限公司 Traveling avoidance method, mobile robot and storage medium
CN113968216B (en) * 2020-07-25 2023-11-17 华为技术有限公司 Vehicle collision detection method and device and computer readable storage medium
CN114792416A (en) * 2021-01-08 2022-07-26 华为技术有限公司 Target detection method and device
CN114132317B (en) * 2021-11-30 2023-05-23 重庆长安新能源汽车科技有限公司 Intelligent curve side driving control method, system, vehicle and storage medium
CN114202625B (en) * 2021-12-10 2023-03-14 北京百度网讯科技有限公司 Method and device for extracting road shoulder line and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010023721A (en) * 2008-07-22 2010-02-04 Hitachi Ltd Traveling support device
JP2011221667A (en) * 2010-04-06 2011-11-04 Toyota Motor Corp Object risk prediction device
CN103109313A (en) * 2010-09-08 2013-05-15 丰田自动车株式会社 Degree-of-danger calculation apparatus
WO2016080156A1 (en) * 2014-11-19 2016-05-26 三菱電機株式会社 Radar device
CN105745131A (en) * 2013-10-30 2016-07-06 株式会社电装 Travel controller, server, and in-vehicle device
CN107004361A (en) * 2014-12-09 2017-08-01 三菱电机株式会社 Risk of collision computing device, risk of collision display device and car body control device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4254844B2 (en) * 2006-11-01 2009-04-15 トヨタ自動車株式会社 Travel control plan evaluation device
JP2010018062A (en) * 2008-07-08 2010-01-28 Fuji Heavy Ind Ltd Vehicle driving support device
JP4853525B2 (en) * 2009-02-09 2012-01-11 トヨタ自動車株式会社 Moving region prediction device
JP4905571B2 (en) * 2010-03-10 2012-03-28 トヨタ自動車株式会社 Vehicle parking assistance device and vehicle equipped with the same
JP5673597B2 (en) * 2011-11-18 2015-02-18 株式会社デンソー Vehicle behavior control device
US8930128B2 (en) * 2012-02-27 2015-01-06 Lit Motors Corporation Vehicle collision mitigation system
JP2013186722A (en) 2012-03-08 2013-09-19 Nissan Motor Co Ltd Travel control apparatus and travel control method
JP6791616B2 (en) * 2015-04-27 2020-11-25 トヨタ自動車株式会社 Self-driving vehicle system
JP6532786B2 (en) * 2015-08-07 2019-06-19 株式会社日立製作所 Vehicle travel control device and speed control method
JP6327719B2 (en) * 2016-02-04 2018-05-23 株式会社Subaru Vehicle travel control device
US20190016339A1 (en) * 2016-02-16 2019-01-17 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and vehicle control program

Also Published As

Publication number Publication date
CN109426263A (en) 2019-03-05
JP2019046161A (en) 2019-03-22
US10795371B2 (en) 2020-10-06
JP6651486B2 (en) 2020-02-19
US20190072971A1 (en) 2019-03-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant