US20190294174A1 - Vehicle control system, vehicle control method, and storage medium - Google Patents

Vehicle control system, vehicle control method, and storage medium

Info

Publication number
US20190294174A1
Authority
US
United States
Prior art keywords
determination result
vehicle
determination
matter
traveling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/351,549
Other languages
English (en)
Inventor
Susumu Iwamoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Assigned to HONDA MOTOR CO., LTD. reassignment HONDA MOTOR CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IWAMOTO, SUSUMU
Publication of US20190294174A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0088 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D2201/00 Application
    • G05D2201/02 Control of position of land vehicles
    • G05D2201/0213 Road vehicle, e.g. car or truck
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 Conjoint control of vehicle sub-units of different type or different function
    • B60W10/02 Conjoint control of vehicle sub-units of different type or different function including control of driveline clutches
    • B60W10/04 Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
    • B60W10/06 Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
    • B60W10/10 Conjoint control of vehicle sub-units of different type or different function including control of change-speed gearings
    • B60W10/18 Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • B60W10/20 Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/02 Control of vehicle driving stability
    • B60W30/045 Improving turning performance
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0953 Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W30/18 Propelling the vehicle
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W2554/00 Input parameters relating to objects
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/50 External transmission of data to or from the vehicle for navigation systems
    • G06K9/00791
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the present invention relates to a vehicle control system, a vehicle control method, and a storage medium.
  • An aspect of the present invention has been made in consideration of such a circumstance, and an object of the aspect of the present invention is to provide a vehicle control system, a vehicle control method, and a storage medium capable of performing more suitable control determination with respect to a vehicle.
  • a vehicle control system, a vehicle control method, and a storage medium according to the present invention adopt the following constitutions.
  • a vehicle control system includes a determination result generation unit that generates a first determination result made by an occupant or a device mounted on a subject vehicle in relation to a first matter that is related to traveling of the subject vehicle and generates a second determination result by performing determination based on learning data in relation to a second matter that is related to the first determination result, and a determination decision unit that decides a determination related to the first matter and the second matter based on the first determination result, the second determination result, and external information, in a case where the first determination result and the second determination result are different from each other.
  • the determination decision unit decides the determination related to the first matter and the second matter with reference to a table capable of acquiring determination decision rule data for deciding the determination related to the first matter and the second matter, based on a combination of a determination pattern of the first determination result and the second determination result and the external information.
  • the determination decision unit feeds back the determination result decided based on the first determination result, the second determination result, and the external information to the learning data.
  • the determination result generation unit sets a determination result related to a control amount of first traveling control of the subject vehicle as the first determination result and sets a determination result related to a control amount of second traveling control of the subject vehicle as the second determination result
  • the external information includes a traveling state or a surrounding situation of the subject vehicle
  • the determination decision unit adjusts one or both of the control amount of the first traveling control and the control amount of the second traveling control based on the external information in a case where contradiction occurs in vehicle control by reflecting the control amount of the first traveling control and the control amount of the second traveling control on the subject vehicle.
  • the determination result generation unit sets a determination result related to traveling control of automated driving control of a first vehicle as the first determination result and sets a determination result related to traveling control of automated driving control of a second vehicle as the second determination result
  • the external information includes traveling states or surrounding situations of the first vehicle and the second vehicle, and in a case where there is a possibility that the first vehicle and the second vehicle come in contact with each other, the determination decision unit decelerates or stops one or both of the first vehicle or the second vehicle based on the first determination result, the second determination result, and the external information.
  • the determination decision unit decides a vehicle for which deceleration is to be stopped or which is to be preferentially started after decelerating or stopping the first vehicle and the second vehicle.
  • the determination decision unit decelerates or stops one or both of the first vehicle or the second vehicle based on a first determination result based on a determination result of an obstacle based on the traveling state or the surrounding situation of the first vehicle and a determination result of a possibility of contact with the obstacle, a second determination result based on a determination result of an obstacle based on the traveling state or a surrounding situation of the second vehicle and a determination result of a possibility of contact with the obstacle, and the external information.
  • the determination decision unit is provided in an external device capable of communicating with the first vehicle and the second vehicle.
  • a vehicle control system includes an operation receiving unit that receives an operation of a user, a determination result generation unit that generates a determination result based on learning data in relation to a first matter, and a determination decision unit that decides a determination related to the first matter based on the determination result of the determination result generation unit, the operation received by the operation receiving unit, and external information, in a case where an operation different from the determination result determined by the determination result generation unit is performed based on the operation received by the operation receiving unit.
  • a vehicle control method is a vehicle control method that causes a vehicle control system to generate a first determination result made by an occupant or a device in relation to a first matter that is related to traveling of a subject vehicle, generate a second determination result by performing determination based on learning data in relation to a second matter that is related to the first determination result, and decide determination related to the first matter and the second matter based on the first determination result, the second determination result, and external information, in a case where the first determination result and the second determination result are different from each other.
  • a storage medium is a computer-readable non-transitory storage medium storing a program that causes a vehicle control system to generate a first determination result made by an occupant or a device in relation to a first matter that is related to traveling of a subject vehicle, generate a second determination result by performing determination based on learning data in relation to a second matter that is related to the first determination result, and decide determination related to the first matter and the second matter based on the first determination result, the second determination result, and external information, in a case where the first determination result and the second determination result are different from each other.
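  • To make the relationships among the first determination result, the second determination result, and the external information concrete, the following is a minimal sketch in Python; the class and function names (DeterminationResult, ExternalInformation, decide_determination) and the fallback rule are illustrative assumptions and are not defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class DeterminationResult:
    """A determination about a matter related to traveling of the subject vehicle."""
    matter: str            # e.g. "steering" or "speed"
    action: str            # e.g. "turn_left", "turn_right", "decelerate"
    control_amount: float

@dataclass
class ExternalInformation:
    """Traveling state or surrounding situation of the subject vehicle."""
    road_shape: str        # e.g. "left_curve"
    road_surface: str      # e.g. "wet"
    obstacle_ahead: bool

def decide_determination(first: DeterminationResult,
                         second: DeterminationResult,
                         external: ExternalInformation) -> DeterminationResult:
    """If the two results agree, adopt them; otherwise fall back to a simple rule."""
    if first.action == second.action:
        return second
    # Stand-in for the rule-table lookup described later in the specification.
    return second if external.obstacle_ahead else first

print(decide_determination(
    DeterminationResult("steering", "turn_left", 0.2),
    DeterminationResult("steering", "turn_right", -0.2),
    ExternalInformation("intersection", "dry", True),
).action)  # turn_right
```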
  • FIG. 1 is a constitution diagram of a vehicle system using a vehicle control system according to a first embodiment.
  • FIG. 2 is a functional constitution diagram of a first control unit, a second control unit, a third control unit, and a storage unit according to the first embodiment.
  • FIG. 3 is a diagram for explaining a process of a determination result generation unit.
  • FIG. 4 is a diagram for explaining a process of the determination result generation unit in a situation in which a subject vehicle travels at an intersection.
  • FIG. 5 is a diagram showing an example of a content of a determination decision rule table.
  • FIG. 6 is a flowchart showing a flow of a process executed by a driving support control device of the first embodiment.
  • FIG. 7 is a constitution diagram of a vehicle system of a second embodiment.
  • FIG. 8 is a functional constitution diagram of a first control unit, a second control unit, a third control unit, and a storage unit according to the second embodiment.
  • FIG. 9 is a diagram for explaining a process of a driving support control device at an intersection.
  • FIG. 10 is a sequence diagram showing a flow of a process executed by the driving support control device of the second embodiment.
  • FIG. 11 is a diagram showing an example of a hardware constitution of the driving support control device according to an embodiment.
  • a vehicle control device of the embodiment is applied to an automated driving vehicle.
  • automated driving is executing driving control by controlling one or both of steering or acceleration and deceleration of a vehicle.
  • the automated driving may include driving control by a driving support device such as adaptive cruise control (ACC), lane keeping assist (LKAS), auto lane changing (ALC), or the like.
  • the driving support device includes a steering control device that mainly controls steering of a vehicle and a distribution control device that mainly controls distribution of left and right braking and driving force of the vehicle.
  • FIG. 1 is a constitution diagram of a vehicle system 1 using a vehicle control system according to the first embodiment.
  • a vehicle in which the vehicle system 1 is mounted is, for example, a vehicle such as a two-wheeled vehicle, a three-wheeled vehicle, or a four-wheeled vehicle, and a driving source of the vehicle is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof.
  • the electric motor operates using electric power generated by a generator connected to the internal combustion engine or electric power discharged by a secondary battery or a fuel cell.
  • the vehicle system 1 includes a camera 10 , a radar device 12 , a finder 14 , an object recognition device 16 , a communication device 20 , a human machine interface (HMI) 30 , a vehicle sensor 40 , a navigation device 50 , a map positioning unit (MPU) 60 , a driving operation element 80 , a driving support control device 100 , a traveling driving force output device 200 , a brake device 210 , and a steering device 220 .
  • Such devices and instruments are connected to each other by a multiple communication line such as a controller area network (CAN) communication line, a serial communication line, a wireless communication network, or the like.
  • the driving support control device 100 is an example of the “vehicle control system”.
  • a combination of the HMI 30 and the driving operation element 80 is an example of an “operation receiving unit”.
  • the camera 10 is a digital camera using a solid imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
  • the camera 10 is attached to an arbitrary place on the vehicle (hereinafter, a subject vehicle M) in which the vehicle system 1 is mounted.
  • the camera 10 is attached to an upper portion of a front windshield, a rear surface of a rearview mirror, or the like.
  • the camera 10 periodically repeats imaging of the surroundings of the subject vehicle M.
  • the camera 10 may be a stereo camera.
  • the radar device 12 radiates radio waves such as millimeter waves or the like to the surroundings of the subject vehicle M and detects at least the position (distance and direction) of an object by detecting radio waves (reflected waves) reflected by the object.
  • the radar device 12 is attached to an arbitrary place on the subject vehicle M.
  • the radar device 12 may detect the position and the speed of the object by a frequency modulated continuous wave (FM-CW) method.
  • the finder 14 is a light detection and ranging (LIDAR).
  • the finder 14 irradiates light around the subject vehicle M and measures scattered light.
  • the finder 14 detects the distance to the object on the basis of a time from light emission to light reception.
  • the irradiated light is laser light of a pulse shape.
  • the finder 14 is attached to an arbitrary place on the subject vehicle M.
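  • As a simple illustration of the time-of-flight principle stated above, the distance can be recovered from the round-trip time of the emitted pulse; this is a sketch only, and the function name is an assumption.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_distance_m(time_emission_s: float, time_reception_s: float) -> float:
    """Distance to the object: half the round-trip path traveled at the speed of light."""
    round_trip_time_s = time_reception_s - time_emission_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a pulse returning after 1 microsecond corresponds to roughly 150 m.
print(lidar_distance_m(0.0, 1e-6))  # ~149.9 m
```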
  • the object recognition device 16 performs a sensor fusion process on a detection result by a part or all of the camera 10 , the radar device 12 , and the finder 14 to recognize a position, a type, a speed, and the like of the object.
  • the object recognition device 16 outputs a recognition result to the driving support control device 100 .
  • the object recognition device 16 may output the detection result of the camera 10 , the radar device 12 , and the finder 14 as they are to the driving support control device 100 .
  • the object recognition device 16 may be omitted from the vehicle system 1 .
  • the camera 10 includes an infrared camera that captures an image of a change in the surface temperature of an object, in addition to capturing a normal image. Switching between normal imaging and infrared imaging may be performed by a function of the camera 10 .
  • the communication device 20 communicates with another vehicle that is present around the subject vehicle M using a cellular network, a Wi-Fi network, Bluetooth (registered trademark), dedicated short range communication (DSRC), or the like, or communicates with various server devices through a wireless base station.
  • the HMI 30 presents various types of information to an occupant of the subject vehicle M and receives an input operation by the occupant.
  • the HMI 30 includes various display devices, speakers, buzzers, touch panels, switches, keys, light emitting devices provided in a vehicle interior, and the like.
  • the vehicle sensor 40 includes a vehicle speed sensor that detects a speed of the subject vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity around a vertical axis, a direction sensor that detects a direction of the subject vehicle M, and the like.
  • the acceleration sensor may include a sensor that detects longitudinal acceleration or lateral acceleration.
  • the longitudinal acceleration is acceleration with respect to a progress direction of the subject vehicle M.
  • the lateral acceleration is acceleration received in a vehicle width direction of the subject vehicle M with respect to the progress direction of the subject vehicle M.
  • the vehicle sensor 40 may include a contact detection sensor that detects the presence or absence of contact from the outside and the strength of the contact at an arbitrary position of a body portion of the subject vehicle M.
  • the vehicle sensor 40 may include a vibration sensor that detects a vibration of the subject vehicle M and a sound detection sensor that detects a sound generated from the subject vehicle M or in the vicinity of the subject vehicle M.
  • the navigation device 50 includes a global navigation satellite system (GNSS) receiver 51 , a navigation HMI 52 , and a route determination unit 53 .
  • the navigation device 50 holds first map information 54 in a storage device such as a hard disk drive (HDD) or a flash memory.
  • the GNSS receiver 51 specifies the position of the subject vehicle M on the basis of a signal received from a GNSS satellite.
  • the position of the subject vehicle M may be specified or supplemented by an inertial navigation system (INS) using an output of the vehicle sensor 40 .
  • the navigation HMI 52 includes a display device, a speaker, a touch panel, a key, and the like. A part or all of the navigation HMI 52 may be shared with the HMI 30 described above.
  • the route determination unit 53 determines a route (hereinafter referred to as a route on a map) from the position of the subject vehicle M specified by the GNSS receiver 51 (or an input arbitrary position) to a destination input by the occupant using the navigation HMI 52 in a manned state or a destination transmitted from an external communication terminal and received by the communication device 20 in an unmanned state, by referring to the first map information 54 .
  • the first map information 54 is information in which a road shape is expressed by a link indicating a road and nodes connected by the link.
  • the first map information 54 may include information related to a road sign for the link.
  • the first map information 54 may include a curvature of the road, point of interest (POI) information, or the like.
  • the route on the map is output to the MPU 60 .
  • the navigation device 50 may perform route guidance using the navigation HMI 52 on the basis of the route on the map.
  • the navigation device 50 may be realized by a function of a terminal device such as a smartphone or a tablet terminal possessed by the occupant.
  • the navigation device 50 may transmit a current position and a destination to a navigation server through the communication device 20 and acquire the same route as the route on the map from the navigation server.
  • the MPU 60 includes a recommended lane determination unit 61 and holds second map information 62 in the storage device such as an HDD or a flash memory.
  • the recommended lane determination unit 61 divides the route on the map provided from the navigation device 50 into a plurality of blocks (for example, divides the route into intervals of 100 [m] in a vehicle progress direction), and determines a recommended lane for each block by referring to the second map information 62 .
  • the recommended lane determination unit 61 determines which lane, counted from the left, the vehicle travels in. In a case where a branching position is present in the route on the map, the recommended lane determination unit 61 determines the recommended lane so that the subject vehicle M is able to travel on a reasonable travel route for progressing to the branch destination.
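  • The block division described above can be sketched as follows; the 100 m interval follows the example in the text, while the function name and the (start, end) tuple layout are assumptions for illustration.

```python
def divide_route_into_blocks(route_length_m: float, block_length_m: float = 100.0):
    """Split a route on the map into fixed-length blocks along the vehicle progress direction."""
    blocks = []
    start = 0.0
    while start < route_length_m:
        end = min(start + block_length_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks

# Each block would then be assigned a recommended lane by referring to the second map information.
print(divide_route_into_blocks(350.0))  # [(0.0, 100.0), (100.0, 200.0), (200.0, 300.0), (300.0, 350.0)]
```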
  • the second map information 62 is map information with accuracy higher than that of the first map information 54 .
  • the second map information 62 may include information on the center of a lane, information on a boundary of a lane, or the like.
  • the second map information 62 may include road information, traffic regulation information, address information (an address and a postal code), facility information, telephone number information, and the like.
  • the second map information 62 may be updated at any time by the communication device 20 communicating with another device.
  • the driving operation element 80 includes, for example, an acceleration pedal, a brake pedal, a shift lever, a steering wheel, a modified steering wheel, a joystick, and other operation elements.
  • a sensor that detects an operation amount or the presence or absence of an operation is attached to the driving operation element 80 , and a detection result of the sensor is output to a part or all of the driving support control device 100 , the traveling driving force output device 200 , the brake device 210 , and the steering device 220 .
  • a grip sensor that detects whether or not the occupant grips the steering wheel may be attached to the steering wheel.
  • the driving support control device 100 includes a first control unit 120 , a second control unit 150 , a third control unit 160 , and a storage unit 180 .
  • each of such constitution elements except for the storage unit 180 is realized by a hardware processor such as a central processing unit (CPU) executing a program (software).
  • Some or all of such constitution elements may be realized by hardware (a circuit unit including a circuitry) such as a large scale integration (LSI), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a graphics processing unit (GPU), or may be realized by software and hardware in cooperation.
  • the program may be stored in a storage device such as an HDD or a flash memory of the driving support control device 100 in advance.
  • the program may be stored in a detachable storage medium such as a DVD or a CD-ROM and may be installed in the HDD or the flash memory of the driving support control device 100 by the storage medium attached to a drive device.
  • FIG. 2 is a functional constitution diagram of the first control unit 120 , the second control unit 150 , the third control unit 160 , and the storage unit 180 according to the first embodiment.
  • the first control unit 120 includes a recognition unit 130 and an action plan generation unit 140 .
  • the first control unit 120 realizes a function of artificial intelligence (AI) and a function of a previously given model in parallel.
  • a function of “recognizing an intersection” may be realized by executing, in parallel, recognition of an intersection by deep learning or the like and recognition based on previously given conditions (such as a signal suitable for pattern matching, a road sign, or the like), giving scores to both results, and comprehensively evaluating the scores. Therefore, the reliability of automated driving is ensured.
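  • A minimal sketch of the score-based combination described above follows; the weighting scheme, threshold, and function name are assumptions, since the patent only states that both recognitions are scored and comprehensively evaluated.

```python
def recognize_intersection(dl_score: float, rule_score: float,
                           dl_weight: float = 0.5, threshold: float = 0.5) -> bool:
    """Combine a deep-learning recognition score with a rule-based recognition score.

    dl_score   -- confidence from deep learning that an intersection is present (0..1)
    rule_score -- confidence from previously given conditions such as signals or signs (0..1)
    """
    combined = dl_weight * dl_score + (1.0 - dl_weight) * rule_score
    return combined >= threshold

print(recognize_intersection(dl_score=0.8, rule_score=0.4))  # True with the default weights
```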
  • the recognition unit 130 recognizes states such as the position, the orientation, the speed and the acceleration of the object around the subject vehicle M, on the basis of information input from the camera 10 , the radar device 12 , and the finder 14 through the object recognition device 16 .
  • the object includes, for example, a person such as a pedestrian, a moving object such as another vehicle, an obstacle on the road such as a construction site, a package dropped from a loaded vehicle, and the like.
  • the object may include a curbstone, a median strip, a side groove, a guardrail, a wall, and the like.
  • the position of the object is recognized as a position in absolute coordinates using a representative point (a center of gravity, a drive shaft center, or the like) of the subject vehicle M as an origin and is used in control.
  • the position of the object may be represented by the representative point such as the center of gravity or a corner of the object, or may be represented by an expressed region.
  • a “state” of the object may include an acceleration, a jerk, or an “action state” (for example, whether or not the object is changing a lane or trying to change a lane).
  • the recognition unit 130 recognizes a lane (traveling lane) on which the subject vehicle M is traveling. For example, the recognition unit 130 recognizes the traveling lane by comparing a pattern of a road lane marking (for example, an arrangement of a solid line and a broken line) obtained from the second map information 62 with a pattern of a road lane marking around the subject vehicle M recognized from the image captured by the camera 10 .
  • the recognition unit 130 may recognize the traveling lane by recognizing a traveling road boundary (a road boundary) including a road lane marking, a road shoulder, a road side band, a curb stone, a median strip, a guard rail, and the like, without limiting to recognizing a road lane marking.
  • the recognition unit 130 may recognize a width, a height, and a shape of the object, a type (for example, a vehicle type of the other vehicle), or the like on the basis of the image captured by the camera 10 .
  • the recognition unit 130 recognizes a road sign, a red light, a toll gate, a road structure, and other road events.
  • When recognizing the traveling lane, the recognition unit 130 recognizes the position and posture of the subject vehicle M with respect to the traveling lane. For example, the recognition unit 130 may recognize, as the relative position and posture of the subject vehicle M with respect to the traveling lane, a deviation of a reference point (for example, the center of gravity) of the subject vehicle M from the center of the lane and an angle formed with respect to a line connecting the centers of the lane in the progress direction of the subject vehicle M. Instead of this, the recognition unit 130 may recognize a position of the reference point of the subject vehicle M with respect to one of the side end portions (the road lane marking or the road boundary) of the traveling lane, or the like, as the relative position of the subject vehicle M with respect to the traveling lane. The recognition unit 130 may recognize a structure (for example, a utility pole, a median strip, or the like) on the road on the basis of the first map information 54 or the second map information 62 .
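  • The deviation and angle described above can be computed, for example, as follows. This is a sketch under the assumption that the lane center is locally approximated by a straight line with a known heading; the function name and sign conventions are illustrative.

```python
import math

def lane_relative_pose(vehicle_x: float, vehicle_y: float, vehicle_yaw_rad: float,
                       lane_center_x: float, lane_center_y: float, lane_heading_rad: float):
    """Return (lateral deviation from the lane center, heading angle relative to the lane).

    Positive deviation means the reference point lies to the left of the lane center line.
    """
    dx = vehicle_x - lane_center_x
    dy = vehicle_y - lane_center_y
    # Project the offset onto the lane's left-pointing normal vector.
    deviation = -dx * math.sin(lane_heading_rad) + dy * math.cos(lane_heading_rad)
    # Wrap the heading difference into (-pi, pi].
    angle = (vehicle_yaw_rad - lane_heading_rad + math.pi) % (2.0 * math.pi) - math.pi
    return deviation, angle

print(lane_relative_pose(1.0, 0.5, 0.1, 0.0, 0.0, 0.0))  # (0.5, 0.1)
```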
  • the action plan generation unit 140 generates a target trajectory along which the subject vehicle M automatically travels in the future so that the subject vehicle M travels on the recommended lane determined by the recommended lane determination unit 61 and furthermore the subject vehicle M is able to cope with the surrounding situation of the subject vehicle M.
  • the target trajectory includes, for example, a speed element.
  • the target trajectory is expressed as a sequence of points (trajectory points) where the reference point (for example, the center of gravity G) of the subject vehicle M reaches.
  • the trajectory point is a point that the subject vehicle M is to reach for each predetermined traveling distance (for example, about every several [m]) in terms of road distance, and, separately from that, a target speed and a target acceleration for each predetermined sampling time (for example, about a few tenths of a second) are generated as part of the target trajectory.
  • the target speed for each sampling time is determined on the basis of a high rank target speed determined for each passing road.
  • the high rank target speed may be determined on the basis of a limit speed or a legal speed, or may be arbitrarily set by the occupant or within a predetermined range from the limit speed or the legal speed.
  • the target speed in the claims corresponds to the high rank target speed.
  • the trajectory point may be a position that the subject vehicle M is to reach at each predetermined sampling time. In this case, the information on the target speed and the target acceleration is expressed by the interval between the trajectory points.
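  • The target-trajectory representation described above, a sequence of trajectory points carrying a speed element, might be modeled as follows; the class and field names are assumptions for illustration and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrajectoryPoint:
    x_m: float               # position the reference point of the vehicle should reach
    y_m: float
    target_speed_mps: float  # speed element generated per sampling time
    target_accel_mps2: float

@dataclass
class TargetTrajectory:
    sampling_time_s: float   # e.g. a few tenths of a second
    points: List[TrajectoryPoint]

# Example: a short straight trajectory at a constant 10 m/s.
trajectory = TargetTrajectory(
    sampling_time_s=0.2,
    points=[TrajectoryPoint(x_m=2.0 * i, y_m=0.0, target_speed_mps=10.0, target_accel_mps2=0.0)
            for i in range(5)],
)
print(len(trajectory.points))
```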
  • the action plan generation unit 140 may set an event of the automated driving.
  • the event of the automated driving includes a constant speed traveling event, a low speed following traveling event, a lane change event, a lane keeping traveling event, a branch event, a merge event, a takeover event, an avoidance event, and the like. Such events also include an event that supports a part of the driving operation by the driver.
  • the action plan generation unit 140 generates a target trajectory on the basis of the determination decision rule data of the traveling control of the subject vehicle M from the second control unit 150 .
  • the second control unit 150 includes, for example, a determination result generation unit 152 and a determination decision unit 154 .
  • the determination result generation unit 152 generates a determination result made by the operation of the occupant or the device in relation to a matter that is related to the traveling of the subject vehicle M or a determination result derived by an AI function based on the learning data stored by the storage unit 180 .
  • the matter is information related to a behavior of the subject vehicle M, and includes a progress direction of the subject vehicle M, a phenomenon occurring in the subject vehicle M, a phenomenon expected to occur in the future, or the like.
  • the phenomenon includes a slip of the subject vehicle M, or contact avoidance traveling against the approach of an object such as another vehicle.
  • the determination result is a determination result for future traveling control of the subject vehicle M.
  • the AI function includes a method of finding regularity or relevance from a plurality of pieces of data and determining things using machine learning that performs determination or prediction, and a method of determining things using deep learning for learning setting of a feature amount, combination contents, and the like, by using a multilayered structure algorithm (for example, a neural network). Details of a function of the determination result generation unit 152 will be described later.
  • the determination decision unit 154 acquires determination decision rule data related to the future traveling control of the subject vehicle M on the basis of a plurality of determination results generated by the determination result generation unit 152 , and outputs the acquired determination decision rule data to the action plan generation unit 140 . Details of a function of the determination decision unit 154 will be described later.
  • the third control unit 160 controls the traveling driving force output device 200 , the brake device 210 , and the steering device 220 so that the subject vehicle M passes through the target trajectory generated by the action plan generation unit 140 at a scheduled time.
  • the third control unit 160 includes an acquisition unit 162 , a speed control unit 164 , and a steering control unit 166 .
  • the acquisition unit 162 acquires information on the target trajectory (a trajectory point) generated by the action plan generation unit 140 and stores the information in a memory (not shown).
  • the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 on the basis of a speed element accompanying the target trajectory stored in the memory.
  • the steering control unit 166 controls the steering device 220 according to a degree of curvature of the target trajectory stored in the memory.
  • a process of the speed control unit 164 and the steering control unit 166 is realized by a combination of a feed-forward control and a feedback control.
  • for example, the process of the steering control unit 166 is executed as a combination of a feed-forward control according to the curvature of the road ahead of the subject vehicle M and a feedback control based on the deviation from the target trajectory.
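  • A minimal sketch of the combined control described above follows; the gains, the bicycle-model feed-forward term, and the error conventions are assumptions, since the patent does not specify the control law.

```python
import math

def steering_command(road_curvature_1pm: float, wheelbase_m: float,
                     lateral_error_m: float, heading_error_rad: float,
                     k_lat: float = 0.5, k_head: float = 1.0) -> float:
    """Steering angle [rad] = feed-forward from road curvature + feedback on trajectory deviation."""
    feed_forward = math.atan(wheelbase_m * road_curvature_1pm)   # curvature-tracking term
    feedback = -k_lat * lateral_error_m - k_head * heading_error_rad
    return feed_forward + feedback

# Example: gentle left curve, vehicle slightly right of and misaligned with the target trajectory.
print(steering_command(road_curvature_1pm=0.01, wheelbase_m=2.7,
                       lateral_error_m=-0.2, heading_error_rad=-0.05))
```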
  • the storage unit 180 is constituted with, for example, an HDD, a flash memory, an electrically erasable programmable read only memory (EEPROM), a read only memory (ROM), a random access memory (RAM) or the like.
  • the storage unit 180 stores learning data 182 , a determination decision rule table 184 , and other various pieces of information.
  • in the learning data 182 , a determination result (a second determination result which will be described later) learned by the AI function or the like is associated with information obtained from a part of the functions used by the subject vehicle M to perform traveling control.
  • a part of functions is a sensing function of each of a plurality of external sensing devices (for example, the camera 10 , the radar device 12 , and the finder 14 ) mounted on the subject vehicle M, and a function of the driving support device that executes an LKAS or the like.
  • in the determination decision rule table 184 , determination decision rule data is associated with a determination result (a first determination result which will be described later and the second determination result) and external information.
  • the learning data 182 may be acquired from an external device through the communication device 20 , or the determination result generation unit 152 may generate and update the learning data 182 at a predetermined timing. Details of the determination decision rule table 184 will be described later.
  • the traveling driving force output device 200 outputs, to driving wheels, traveling driving force (torque) for enabling the vehicle to travel.
  • the traveling driving force output device 200 includes a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an electronic control unit (ECU) that controls the internal combustion engine, the electric motor, the transmission, and the like.
  • the ECU controls the above-described constitutions according to the information input from the third control unit 160 or the information input from the driving operation element 80 .
  • the brake device 210 includes a brake caliper, a cylinder that transfers oil pressure to the brake caliper, an electric motor that generates the oil pressure in the cylinder, and a brake ECU.
  • the brake ECU controls the electric motor according to the information input from the third control unit 160 or the information input from the driving operation element 80 , so that a brake torque according to a control operation is output to each wheel.
  • the brake device 210 may include a mechanism for transferring the oil pressure generated by an operation of a brake pedal included in the driving operation element 80 to the cylinder through a master cylinder as a backup.
  • the brake device 210 is not limited to the constitution described above, and may be an electronic control method oil pressure brake device that controls an actuator according to the information input from the third control unit 160 to transfer the oil pressure of the master cylinder to the cylinder.
  • the steering device 220 includes a steering ECU and an electric motor.
  • the electric motor changes a direction of steerable wheels by applying a force to a rack and pinion mechanism.
  • the steering ECU changes the direction of the steerable wheels by driving the electric motor according to the information input from the third control unit 160 or the information input from the driving operation element 80 .
  • the determination result generation unit 152 generates the first determination result made by the operation of the occupant or the device in relation to a first matter that is related to the traveling of the subject vehicle M and generates the second determination result by performing determination based on the learning data 182 in relation to a second matter that is related to the first determination result.
  • FIG. 3 is a diagram for explaining a process of the determination result generation unit 152 .
  • the example of FIG. 3 shows, as the first matter, a state in which the subject vehicle M traveling on a left curved road R 1 slips.
  • FIG. 3 schematically shows left and right front wheels WFL and WFR of the subject vehicle M and left and right rear wheels WRL and WRR of the subject vehicle M, and it is assumed that steering angles of the wheels WFL and WFR with respect to a steering amount of the steering wheel are the same.
  • the wheel WFL is used in the description of steering angle control of the wheel.
  • the determination result generation unit 152 generates the first determination result related to the traveling control of the subject vehicle on the basis of an operation amount of the occupant (an example of the user) received from the steering wheel that is an example of the driving operation element 80 .
  • the determination result generation unit 152 generates steering control to direct the steering angle of the wheel of the subject vehicle M to a left side (a side of an arrow a 1 in the figure) as compared with a current state as the first determination result, on the basis of the steering amount of the steering wheel.
  • in association with the occupant performing the operation for turning the subject vehicle M left (an example of the second matter), the determination result generation unit 152 generates, as the second determination result, steering control to direct the steering angle of the wheel WFL to a right side (a side of an arrow a 2 in the figure) as compared with the current state, using the learning data 182 , by the driving support device that executes the LKAS.
  • the determination result generation unit 152 may acquire the second determination result corresponding to the second matter by referring to the learning data 182 stored in the storage unit 180 in advance on the basis of the second matter that is related to the first determination result.
  • the first determination result may be set as a determination result for a control content for the device mounted on the subject vehicle M.
  • FIG. 4 is a diagram for explaining a process of the determination result generation unit 152 in a situation in which the subject vehicle M travels through an intersection. In the example of FIG. 4 , a situation is shown in which the subject vehicle M traveling on the road R 2 turns left at an intersection CR 1 connected to each of roads R 2 to R 5 . In this case, the determination result generation unit 152 generates a control content as the first determination result so as to turn left on the basis of the AI function mounted on the steering control device.
  • on the basis of the AI function installed in the distribution control device of the left and right braking and driving force, the determination result generation unit 152 generates, as the second determination result, a control content for generating force in a backward direction (the b 2 direction in the figure) with respect to the vehicle by driving the rear left wheel WRL in a forward direction (the b 1 direction in the figure) and braking the rear right wheel WRR, in consideration of grip traveling and turning characteristics (ease of changing the orientation of the subject vehicle M) so that the subject vehicle M exhibits a behavior of not skidding sideways when turning left.
  • the determination decision unit 154 determines whether or not the first determination result and the second determination result generated by the determination result generation unit 152 are different from each other.
  • the fact that the first determination result and the second determination result are different from each other means that contradiction occurs in the vehicle control due to reflection of a control amount of first traveling control due to the first determination result and a control amount of second traveling control due to the second determination result on the subject vehicle M.
  • the fact that the contradiction occurs in the vehicle control means that directionality of the behavior of the subject vehicle M by execution of the traveling control by the first determination result and directionality of the behavior of the subject vehicle M by execution of the traveling control by the second determination result are contrary to each other.
  • the fact that the directionalities are contrary to each other means that the second determination result instructs turning right while the first determination result instructs turning left.
  • the fact that the directionalities are contrary to each other may also mean that the second determination result instructs deceleration while the first determination result instructs acceleration.
  • the fact that the contradiction occurs in the vehicle control may include a case in which a range of one or both of traveling controls is exceeded in a case where traveling control based on one of the first determination result and the second determination result and traveling control based on the other determination result are performed.
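  • The contradiction check described above could be sketched as follows; the direction encoding, the set of opposite actions, and the notion of a combined control range are illustrative assumptions.

```python
OPPOSITE_ACTIONS = {
    ("turn_left", "turn_right"), ("turn_right", "turn_left"),
    ("accelerate", "decelerate"), ("decelerate", "accelerate"),
}

def contradicts(first_action: str, second_action: str,
                first_amount: float, second_amount: float,
                max_combined_amount: float) -> bool:
    """True if the two traveling controls conflict when both are reflected on the vehicle."""
    if (first_action, second_action) in OPPOSITE_ACTIONS:
        return True
    # Also treat as a contradiction the case where applying both controls
    # exceeds the allowable range of the traveling control.
    return abs(first_amount) + abs(second_amount) > max_combined_amount

print(contradicts("turn_left", "turn_right", 0.1, 0.1, 1.0))  # True
print(contradicts("turn_left", "turn_left", 0.4, 0.4, 1.0))   # False
```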
  • the determination decision unit 154 determines a final determination result on the basis of the first determination result, the second determination result, and external information.
  • the external information is a traveling state or a surrounding situation of the subject vehicle M recognized by the recognition unit 130 .
  • the external information includes information on the behavior of the subject vehicle M, a road shape, a road surface situation, the weather, a surrounding object (obstacle) such as another vehicle, or the like.
  • the external information includes information that the subject vehicle M is traveling on the left curved road R 1 and that slipping is occurring.
  • the external information includes information that the subject vehicle M travels on the road R 2 and turns left to the road R 4 at the intersection CR 1 .
  • the determination decision unit 154 collates the first determination result, the second determination result, and the external information with the determination decision rule table 184 stored in the storage unit 180 , and acquires corresponding determination decision rule data.
  • FIG. 5 is a diagram showing an example of a content of the determination decision rule table 184 .
  • determination decision rule data is associated with determination patterns of the first determination result and the second determination result, and the external information.
  • the determination decision rule data is a determination result related to final traveling control for causing the action plan generation unit 140 to generate the target trajectory.
  • the determination decision rule data may include an adjustment content related to a priority or weights of the first determination result and the second determination result.
  • the determination decision rule data includes, for example, decision contents such as prioritizing the second determination result over the first determination result, using a determination result obtained by adding the first determination result weighted by w 1 to the second determination result weighted by w 2 , or using only one of the determination results.
  • The determination decision rule table 184 is acquired from an external device such as a server through the communication device 20.
  • The determination decision rule table 184 may also be derived by statistically processing patterns of past determination decision results made by the determination decision unit 154.
  • By using the external information, the determination decision unit 154 is able to obtain a determination result that better suits the surrounding situation, and by referring to the determination decision rule table 184 it is possible to perform accurate determination at an early stage. A minimal sketch of this lookup and weighting follows.
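A minimal sketch of the rule-table lookup and weighted combination, assuming the determination decision rule table 184 is held as a simple in-memory dictionary. The keys, policies, weights, and fallback behavior are illustrative assumptions only.

```python
# Hypothetical rule table: keyed by (first result, second result, external condition);
# each entry states how the final determination is decided.
RULE_TABLE = {
    ("turn_left", "turn_right", "intersection"): {"policy": "prioritize_second"},
    ("accelerate", "decelerate", "slipping"):    {"policy": "weighted", "w1": 0.2, "w2": 0.8},
}


def decide(first, second, condition, first_value=0.0, second_value=0.0):
    """Collate the two determination results and the external information with the
    rule table and return the decision content (final determination)."""
    rule = RULE_TABLE.get((first, second, condition))
    if rule is None:
        return second                                   # fall back to using one result only
    if rule["policy"] == "prioritize_second":
        return second
    if rule["policy"] == "weighted":
        # Combine numerical control amounts with the weights w1 and w2.
        return rule["w1"] * first_value + rule["w2"] * second_value
    return first


# Example: slipping is detected while the first result accelerates and the second decelerates.
print(decide("accelerate", "decelerate", "slipping", first_value=1.0, second_value=-2.0))
```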
  • The determination decision unit 154 may execute traveling control that reflects both the first determination result and the second determination result. For example, in a case where both the first determination result and the second determination result are instructions for turning right, and the range of other traveling control is not exceeded even though traveling control based on both instructions is performed, the determination decision unit 154 adopts both results as the determination decision result and performs traveling control based on that result.
  • the determination decision unit 154 may feed back the determination decision result (determination decision rule data) to the learning data 182 to update the data.
  • The determination decision unit 154 may perform this update each time a determination decision result is obtained, at a timing at which a certain amount of determination decision results has been accumulated, or at a timing at which a predetermined time has elapsed.
  • Because the determination decision result on which traveling control was finally performed is fed back to the learning data 182, it is possible to suppress the occurrence of a difference between the first determination result and the second determination result when an identical or similar matter is determined from the next time onward, and therefore to perform more accurate and prompt determination. A sketch of such a feedback buffer follows.
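The feedback of determination decision results to the learning data 182, together with the choice of update timing, can be sketched as a small buffer. The batch size and time interval below are hypothetical parameters, not values given by the embodiment.

```python
import time


class LearningDataFeedback:
    """Accumulates determination decision results and feeds them back to the learning
    data either immediately, after a certain number of results has been obtained, or
    after a predetermined time has elapsed."""

    def __init__(self, learning_data, batch_size=50, interval_s=600.0, immediate=False):
        self.learning_data = learning_data      # e.g. a list standing in for learning data 182
        self.batch_size = batch_size
        self.interval_s = interval_s
        self.immediate = immediate
        self.buffer = []
        self.last_update = time.monotonic()

    def feed_back(self, decision_result):
        self.buffer.append(decision_result)
        due = (self.immediate
               or len(self.buffer) >= self.batch_size
               or time.monotonic() - self.last_update >= self.interval_s)
        if due:
            self.learning_data.extend(self.buffer)  # reflect the results in the stored data
            self.buffer.clear()
            self.last_update = time.monotonic()
```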
  • FIG. 6 is a flowchart showing a flow of a process executed by the driving support control device 100 of the first embodiment.
  • the process of the present flowchart may be repeatedly executed at a predetermined period or at a predetermined timing.
  • the target trajectory is generated by the action plan generation unit 140 and the automated driving is executed by the third control unit 160 on the basis of the generated target trajectory.
  • The determination result generation unit 152 acquires the first determination result made by the occupant or the device in relation to the first matter (step S100).
  • The determination result generation unit 152 generates the second determination result by performing the determination based on the learning data 182 in relation to the second matter that is related to the first determination result (step S102).
  • The determination decision unit 154 determines whether or not the first determination result and the second determination result are different from each other (step S104). In a case where the first determination result and the second determination result are different from each other, the determination decision unit 154 acquires the external information (step S106) and decides the determination decision rule data with reference to the determination decision rule table 184, on the basis of the first determination result, the second determination result, and the external information (step S108).
  • In step S104, in a case where the first determination result and the second determination result are not different from each other, the determination decision unit 154 decides the determination decision rule data on the basis of the first determination result and the second determination result (step S110).
  • The determination decision unit 154 outputs the determination decision result obtained in step S108 or S110 to the action plan generation unit 140 and causes the traveling control to be executed on the basis of the determination decision result (step S112).
  • After step S112, the determination decision unit 154 feeds back the determination decision result to the learning data 182 (step S114). The process of the present flowchart then ends. The overall flow is summarized in the sketch that follows.
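The flow of FIG. 6 (steps S100 to S114) can be summarized in plain Python as follows. Every object and method name is a hypothetical stand-in for the units described above, not an actual interface of the driving support control device 100.

```python
def driving_support_cycle(generator, decider, action_planner, learning_data):
    """One cycle of the process in FIG. 6; the callables are placeholders."""
    first = generator.acquire_first_determination()           # S100: occupant or device
    second = generator.generate_second_determination(first)   # S102: based on learning data

    if first != second:                                       # S104: results differ
        external = decider.acquire_external_information()     # S106
        decision = decider.decide_with_rule_table(first, second, external)  # S108
    else:
        decision = decider.decide(first, second)              # S110

    action_planner.execute_traveling_control(decision)        # S112
    learning_data.append(decision)                            # S114: feedback
    return decision
```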
  • According to the driving support control device 100 of the first embodiment, it is possible to perform more suitable control determination with respect to the vehicle.
  • Since it is sufficient for the determination result generation unit 152 to generate a determination result that concentrates on each individual determination matter, using the AI function or the like mounted on each of the devices that execute a part of the traveling control functions of the subject vehicle M, it is possible to reduce the processing load and improve the individual learning accuracy at an early stage.
  • the determination decision unit 154 determines the final determination also based on the external information, and thus it is possible to perform accurate determination at an early stage.
  • FIG. 7 is a constitution diagram of the vehicle system 2 of the second embodiment.
  • the vehicle system 2 includes, for example, one or more vehicles 300 and a vehicle support server 400 .
  • Such constitution elements are able to communicate with each other through the network NW.
  • the network NW includes the Internet, a wide area network (WAN), a local area network (LAN), a public line, a provider device, a dedicated line, a wireless base station, and the like.
  • a vehicle 300 includes, for example, the subject vehicle M and the other vehicle of the first embodiment.
  • The vehicle 300 differs from the constitution of the vehicle system 1 of the first embodiment in that it includes a driving support control device 100A instead of the driving support control device 100. Therefore, hereinafter, the function of the driving support control device 100A will be mainly described.
  • FIG. 8 is a functional constitution diagram of the first control unit 120, a second control unit 150A, the third control unit 160, and a storage unit 180A according to the second embodiment.
  • The driving support control device 100A shown in FIG. 8 differs from the driving support control device 100 of the first embodiment in that it includes a determination decision inquiry unit 156 instead of the determination decision unit 154 and the storage unit 180A instead of the storage unit 180. Therefore, hereinafter, the determination decision inquiry unit 156 and the storage unit 180A will be mainly described.
  • the determination decision inquiry unit 156 determines whether or not the first determination result and the second determination result generated by the determination result generation unit 152 are different from each other. In a case where the first determination result and the second determination result are different from each other, for example, the determination decision inquiry unit 156 transmits the first determination result, the second determination result, and inquiry information of the final determination decision rule data together with the external information and the like recognized by the recognition unit 130 to the vehicle support server 400 through the communication device 20 .
  • the determination decision inquiry unit 156 acquires the determination decision rule data from the vehicle support server 400 .
  • the determination decision inquiry unit 156 outputs the determination decision rule data to the action plan generation unit 140 .
  • The determination decision inquiry unit 156 may feed back the determination decision rule data to the learning data 182 in the storage unit 180A to update the data. The inquiry sent to the server can be pictured as the message sketched below.
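The inquiry transmitted to the vehicle support server 400 can be pictured as a small serialized message. The JSON schema and field names below are assumptions made only for illustration; the embodiment does not specify a message format.

```python
import json


def build_inquiry(vehicle_id, first_result, second_result, external_info):
    """Pack the two determination results and the external information into an
    inquiry payload to be sent through the communication device 20."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "first_determination": first_result,
        "second_determination": second_result,
        "external_information": external_info,
        "request": "determination_decision_rule_data",
    })


payload = build_inquiry("M", "go_straight", "stop",
                        {"location": "CR1", "obstacle": "m1"})
```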
  • the vehicle support server 400 includes, for example, a communication unit 410 , an inquiry receiving unit 420 , a determination decision unit 430 , a learning unit 440 , and a storage unit 450 .
  • each of the inquiry receiving unit 420 , the determination decision unit 430 , and the learning unit 440 is realized by a hardware processor such as a CPU executing a program (software).
  • a part or all of such constitution elements may be realized by hardware (including a circuit unit; circuitry) such as an LSI, an ASIC, an FPGA, a GPU, or the like, or may be realized by cooperation of software and hardware.
  • the program may be stored in a storage device such as an HDD or a flash memory of the driving support control device 100 A in advance.
  • the program may be stored in a detachable storage medium such as a DVD or a CD-ROM and may be installed in the HDD or the flash memory of the driving support control device 100 A by attachment of the storage medium to a drive device.
  • the storage unit 450 is constituted with, for example, an HDD, a flash memory, an electrically erasable programmable read only memory (EEPROM), a read only memory (ROM), a random access memory (RAM), or the like.
  • The storage unit 450 stores a determination decision rule table 452 and other various pieces of information. For example, content similar to that of the determination decision rule table 184 is stored in the determination decision rule table 452.
  • the communication unit 410 is, for example, a network card connected to the network NW.
  • The communication unit 410 communicates with the vehicle 300 and other external devices through the network NW.
  • The inquiry receiving unit 420 receives the first determination result, the second determination result, and an inquiry for the determination decision rule data from the vehicle 300.
  • the determination decision unit 430 acquires determination decision rule data corresponding to the first determination result, the second determination result, and the external information by referring to the determination decision rule table 452 stored in the storage unit 450 , on the basis of the information received by the inquiry receiving unit 420 .
  • The learning unit 440 stores information related to the first determination result, the second determination result, and the external information, together with the determination decision rule data decided by the determination decision unit 430, in the storage unit 450, performs learning such as a statistical process after a predetermined time has elapsed or at a timing at which a predetermined amount of data has accumulated, and updates the determination decision rule table 452 on the basis of a result of the learning. Because the learning unit 440 updates the determination decision rule table 452 using data obtained from the plurality of vehicles 300, it is possible to determine suitable traveling control according to various situations.
  • Since the determination decision rule data can be acquired from the vehicle support server 400 side, it is possible to reduce the processing load on the vehicle 300 side. According to the second embodiment, since the vehicle support server 400 is able to perform learning using information from the plurality of vehicles 300, it is possible to acquire more suitable determination decision rule data according to various situations. One simple picture of this server-side learning is sketched below.
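One way to picture the learning unit 440 is as a counter that accumulates the decided rule data per situation and, once enough samples are stored, rewrites the rule table with the most frequent decision. This simple majority rule is only an illustrative stand-in for the statistical process mentioned above.

```python
from collections import Counter, defaultdict


class RuleTableLearner:
    """Accumulates decided rule data per situation key and periodically updates the
    rule table with the most frequently decided rule (an illustrative simplification
    of the statistical process performed by the learning unit 440)."""

    def __init__(self, rule_table, min_samples=100):
        self.rule_table = rule_table   # stands in for determination decision rule table 452
        self.min_samples = min_samples
        self.samples = defaultdict(Counter)

    def store(self, situation_key, decided_rule):
        """Store one (situation, decided rule) pair received from a vehicle 300."""
        self.samples[situation_key][decided_rule] += 1

    def update_rule_table(self):
        """Rewrite each entry for which enough samples have accumulated."""
        for key, counter in self.samples.items():
            if sum(counter.values()) >= self.min_samples:
                self.rule_table[key], _ = counter.most_common(1)[0]
```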
  • FIG. 9 is a diagram for explaining a process of the driving support control device 100A at an intersection.
  • In FIG. 9, the subject vehicle M is an example of a first vehicle, and the other vehicle m1 is an example of a second vehicle.
  • The intersection CR1 is an intersection where no traffic light is installed. It is assumed that the subject vehicle M goes straight ahead from the road R2 to the road R3 and the other vehicle m1 turns right from the road R4 onto the road R2.
  • The subject vehicle M stops because the other vehicle m1 is present in its direction of progress ahead of the intersection CR1.
  • The other vehicle m1 also stops because the subject vehicle M is present in its direction of progress ahead of the intersection CR1.
  • The determination decision inquiry unit 156 of each of the subject vehicle M and the other vehicle m1 transmits its determination result and its external information (for example, a traveling state and a surrounding situation) to the vehicle support server 400 and inquires about the determination decision rule data.
  • a determination result of an obstacle (the position of the other vehicle) and a determination result of a possibility of contact with the obstacle may be included as the external information.
  • The external information may include information obtained from a camera, a radar device, a finder, or the like installed in the vicinity of the intersection CR1.
  • The inquiry receiving unit 420 of the vehicle support server 400 receives the inquiry information of the subject vehicle M and the other vehicle m1.
  • The determination decision unit 430 acquires the determination result related to the traveling control of the automated driving of the subject vehicle M as the first determination result and the determination result related to the traveling control of the automated driving of the other vehicle m1 as the second determination result.
  • The determination decision unit 430 refers to the determination decision rule table 452 on the basis of the external information of the subject vehicle M and the other vehicle m1, and acquires the corresponding determination decision rule data.
  • The determination decision rule data is, for example, data for decelerating or stopping one or both of the subject vehicle M and the other vehicle m1.
  • The determination decision rule data also includes rule data indicating which vehicle is given priority for deceleration release or departure by the determination decision unit 430 after one or both of the subject vehicle M and the other vehicle m1 have decelerated or stopped.
  • The determination decision unit 430 transmits the determination decision rule data to the subject vehicle M and the other vehicle m1. Therefore, the subject vehicle M and the other vehicle m1 are able to travel smoothly at the intersection CR1. A sketch of such a tie-break appears below.
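The deadlock resolution at the intersection CR1 can be sketched as a server-side tie-break: every vehicle is first instructed to stop, and one vehicle is then released first. The tie-break criterion used below (the vehicle going straight departs before the vehicle turning across its path) is an assumption for illustration; the embodiment only states that rule data indicating the prioritized vehicle is included.

```python
def resolve_intersection(vehicles):
    """Given per-vehicle intentions at an intersection without a traffic light, return
    rule data for each vehicle: all vehicles stop, then the highest-priority vehicle
    departs first and the others are released after it."""
    # Lower number = higher priority (hypothetical ordering).
    priority = {"go_straight": 0, "turn_left": 1, "turn_right": 2}
    ordered = sorted(vehicles, key=lambda v: priority.get(v["intention"], 3))
    rules = {v["id"]: {"action": "stop"} for v in vehicles}
    rules[ordered[0]["id"]] = {"action": "depart_first"}
    for later in ordered[1:]:
        rules[later["id"]]["release_after"] = ordered[0]["id"]
    return rules


print(resolve_intersection([{"id": "M", "intention": "go_straight"},
                            {"id": "m1", "intention": "turn_right"}]))
# {'M': {'action': 'depart_first'}, 'm1': {'action': 'stop', 'release_after': 'M'}}
```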
  • FIG. 10 is a sequence diagram showing a flow of a process executed by the driving support control device 100 A of the second embodiment.
  • the process of the present sequence may be repeatedly executed at a predetermined period or a predetermined timing.
  • The sequence diagram shown in FIG. 10 involves the subject vehicle M, the other vehicle m1, and the vehicle support server 400, and shows the flow of a process in the situation shown in FIG. 9.
  • It is assumed that the target trajectory is generated by the action plan generation unit 140 of each of the subject vehicle M and the other vehicle m1, and that the automated driving is executed by the third control unit 160 on the basis of the generated target trajectory.
  • The subject vehicle M generates a determination result for going straight ahead at the intersection CR1 (step S200).
  • The other vehicle m1 generates a determination result for turning right at the intersection CR1 (step S202).
  • The subject vehicle M and the other vehicle m1 transmit the inquiry information for the determination decision rule data, together with their respective determination results and external information, to the vehicle support server 400 (steps S204 and S206).
  • The vehicle support server 400 refers to the determination decision rule table 452 on the basis of the respective determination results of the subject vehicle M and the other vehicle m1 and the external information, and acquires the corresponding determination decision rule data (step S208).
  • The vehicle support server 400 transmits the determination decision rule data for the subject vehicle M and the other vehicle m1 to the respective vehicles (steps S210 and S212).
  • The subject vehicle M generates the target trajectory on the basis of the determination decision rule data received from the vehicle support server 400, and travels along the generated target trajectory (step S214).
  • The subject vehicle M may update the learning data on the basis of the determination decision rule data received from the vehicle support server 400 (step S216).
  • The other vehicle m1 generates a target trajectory on the basis of the determination decision rule data received from the vehicle support server 400, and travels along the generated target trajectory (step S218).
  • The other vehicle m1 may likewise update the learning data on the basis of the determination decision rule data received from the vehicle support server 400 (step S220).
  • The vehicle support server 400 stores the determination decision rule data obtained by the process of step S208 in the storage unit 450, and performs a statistical process or the like at a predetermined timing to update the determination decision rule table 452 (step S222). The process of the present sequence then ends. The order of these interactions is traced in the sketch below.
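Putting the pieces together, the order of interactions in FIG. 10 (steps S200 to S222) can be traced with hypothetical stand-ins as below. The snippet only shows the message flow; none of the objects or methods are defined by the embodiment.

```python
def run_intersection_sequence(vehicle_m, vehicle_m1, server):
    """Trace of the sequence in FIG. 10 using placeholder objects."""
    result_m = vehicle_m.generate_determination()        # S200: go straight at CR1
    result_m1 = vehicle_m1.generate_determination()      # S202: turn right at CR1

    server.receive_inquiry("M", result_m, vehicle_m.external_information())     # S204
    server.receive_inquiry("m1", result_m1, vehicle_m1.external_information())  # S206

    rules = server.decide_rule_data()                    # S208: refer to rule table 452
    vehicle_m.apply(rules["M"])                          # S210, S214: trajectory and travel
    vehicle_m1.apply(rules["m1"])                        # S212, S218
    vehicle_m.update_learning_data(rules["M"])           # S216 (optional)
    vehicle_m1.update_learning_data(rules["m1"])         # S220 (optional)
    server.update_rule_table()                           # S222: statistical update
```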
  • According to the second embodiment described above, because the determination decision is not made in each vehicle individually, it is possible to suppress the risk that the determination decision units of the respective vehicles would reach different determination results.
  • Even in a case where a plurality of external sensing devices are mounted on the vehicle and the results based on the learning data of such sensing functions differ from each other, it is possible to perform accurate determination promptly by acquiring the traveling state and the surrounding situation.
  • According to the second embodiment, for example, by taking into consideration the possibility of mutual contact at an intersection or the like where no traffic light is installed, it is possible to resolve a state in which both vehicles are stopped and neither is able to start automatically, and thus to realize smooth traffic.
  • Each of the determination result generation unit and the determination decision unit in the first and second embodiments may also use machine learning techniques such as deep learning or a support vector machine. A hedged example of such a learned determination follows.
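As one hedged example of such a learning technique, the snippet below trains a scikit-learn support vector machine to map an assumed feature vector of the traveling state onto a determination label. The features, labels, and the choice of scikit-learn are all illustrative assumptions and not part of the embodiment.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: [speed (m/s), curvature (1/m), slip flag] -> label,
# where 0 means "keep speed" and 1 means "decelerate".
X = np.array([[20.0, 0.00, 0],
              [18.0, 0.02, 0],
              [17.0, 0.03, 1],
              [15.0, 0.05, 1]])
y = np.array([0, 0, 1, 1])

model = SVC(kernel="rbf", gamma="scale")
model.fit(X, y)

# Second determination result for a new traveling state (curve with slipping).
print(model.predict(np.array([[16.0, 0.04, 1]])))  # likely [1], i.e. decelerate
```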
  • FIG. 11 is a diagram showing an example of a hardware constitution of the driving support control device 100 or 100A according to an embodiment.
  • The driving support control device 100 or 100A has a constitution in which a communication controller 100-1, a CPU 100-2, a RAM 100-3 used as a working memory, a ROM 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD, a drive device 100-6, and the like are mutually connected by an internal bus or a dedicated communication line.
  • The communication controller 100-1 communicates with components other than the driving support control device 100 or 100A.
  • A portable storage medium (for example, a computer-readable non-transitory storage medium) such as an optical disk is attached to the drive device 100-6.
  • A program 100-5a executed by the CPU 100-2 is stored in the storage device 100-5.
  • This program is developed in the RAM 100-3 by a direct memory access (DMA) controller (not shown) or the like and executed by the CPU 100-2.
  • The program 100-5a referred to by the CPU 100-2 may be stored in the portable storage medium attached to the drive device 100-6 or may be downloaded from another device through the network.
  • The hardware constitution shown in FIG. 11 is also applicable to the vehicle support server 400; in that case, for example, a part or all of the inquiry receiving unit 420, the determination decision unit 430, and the learning unit 440 of the vehicle support server 400 are realized by this constitution.
  • The embodiments described above can also be expressed as a vehicle control device including a storage device and a hardware processor, in which the hardware processor executes a program stored in the storage device to: acquire a first determination result determined by an occupant or a device with respect to a first matter related to traveling control of a vehicle; generate a second determination result by performing determination based on learning data with respect to a second matter related to the first determination result; and decide a final determination result related to the traveling control on the basis of the first determination result and the second determination result.
