CN116890838A - Vehicle control device, vehicle control method, and storage medium - Google Patents

Vehicle control device, vehicle control method, and storage medium

Info

Publication number
CN116890838A
Authority
CN
China
Prior art keywords
vehicle
driving mode
driving
map information
mode
Prior art date
Legal status
Pending
Application number
CN202310309522.9A
Other languages
Chinese (zh)
Inventor
井上大地
田村祥
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Application filed by Honda Motor Co Ltd
Publication of CN116890838A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18: Propelling the vehicle
    • B60W30/182: Selecting between different operative modes, e.g. comfort and performance modes
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of such parameters related to ambient conditions
    • B60W40/06: Road conditions
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001: Planning or execution of driving tasks

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention provides a vehicle control device, a vehicle control method, and a storage medium capable of appropriately changing the driving control of a vehicle even when the road dividing line recognized by a camera differs from the content of the map information mounted on the vehicle. The vehicle control device includes: an acquisition unit that acquires a camera image capturing the surrounding situation of the vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; a mode determination unit that determines the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode; and a determination unit that determines whether there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, and whether the vehicle is at a branch point shown in the map information.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, techniques are known for controlling the travel of a host vehicle based on a road dividing line recognized by a camera mounted on the vehicle. For example, Japanese Patent Application Laid-Open No. 2020-050086 describes a technique in which the vehicle travels based on the recognized road dividing line and, when the degree of recognition of the road dividing line does not satisfy a predetermined criterion, travels based on the track of a preceding vehicle.
The technique described in Japanese Patent Application Laid-Open No. 2020-050086 controls the travel of the host vehicle based on the road dividing line recognized by the camera and the map information mounted on the host vehicle. However, with this conventional technique, when the road dividing line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle may not be changed appropriately.
Disclosure of Invention
The present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can appropriately change the driving control of a vehicle even when the road division line recognized by a camera is different from the content of map information mounted on the vehicle.
The vehicle control device, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): A vehicle control device according to an aspect of the present invention includes: an acquisition unit that acquires a camera image capturing the surrounding situation of a vehicle; a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; a mode determination unit that determines the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being controlled by the driving control unit, and the mode determination unit changing the driving mode of the vehicle to a driving mode with a heavier task when the task associated with the determined driving mode is not executed by the driver; and a determination unit that determines whether there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, and whether the vehicle is at a branch point shown in the map information, wherein the mode determination unit determines the driving mode of the vehicle based on the branching direction of the branch point and the direction of the road dividing line shown in the camera image when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and that the vehicle is at the branch point shown in the map information.
(2): In the aspect of (1) above, when the degree of deviation between the branching direction of the branch point and the direction of the road dividing line shown in the camera image is equal to or greater than a threshold value, the mode determination unit changes the second driving mode to the first driving mode and continues the first driving mode using the road dividing line shown in the camera image.
(3): In the aspect of (1) above, when the degree of deviation between the branching direction of the branch point and the direction of the road dividing line shown in the camera image is smaller than the threshold value, the mode determination unit continues the second driving mode using the road dividing line shown in the map information.
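As an illustration, the decision described in aspects (1) to (3) can be sketched as follows. The function name, the representation of directions as angles, and the 5-degree threshold are assumptions made for the sketch, not values taken from the embodiment.

```python
import math

# Assumed threshold for the "degree of deviation" between the branching
# direction and the camera-recognized dividing-line direction.
DEVIATION_THRESHOLD_RAD = math.radians(5.0)

def determine_driving_mode(current_mode: str,
                           lines_deviate: bool,
                           at_branch_point: bool,
                           branch_direction_rad: float,
                           camera_line_direction_rad: float) -> tuple[str, str]:
    """Return (next_mode, dividing-line source) per aspects (1)-(3)."""
    if not (lines_deviate and at_branch_point):
        # No conflict between camera and map: keep the current mode,
        # continue using the map's dividing lines.
        return current_mode, "map"
    gap = abs(branch_direction_rad - camera_line_direction_rad)
    if gap >= DEVIATION_THRESHOLD_RAD:
        # Aspect (2): change to the heavier first driving mode and
        # continue control on the camera-recognized dividing line.
        return "first", "camera"
    # Aspect (3): the gap is small, so continue the second driving
    # mode using the dividing line shown in the map information.
    return "second", "map"
```

A large gap suggests the camera is tracking the branch lane while the map expects the main lane (or vice versa), which is why control falls back to the camera under the heavier mode.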
(4): In any one of the aspects (1) to (3) above, the second driving mode is a driving mode in which the driver is not assigned the task of holding an operation element that accepts steering operations of the vehicle, and the first driving mode is a driving mode in which the driver is assigned at least the task of holding the operation element.
(5): A vehicle control method according to another aspect of the present invention causes a computer to perform the following: acquiring a camera image capturing the surrounding situation of a vehicle; controlling steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; determining the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of operations by the driver, and changing the driving mode of the vehicle to a driving mode with a heavier task when the task associated with the determined driving mode is not executed by the driver; determining whether there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, and whether the vehicle is at a branch point shown in the map information; and, when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and that the vehicle is at the branch point shown in the map information, determining the driving mode of the vehicle based on the branching direction of the branch point and the direction of the road dividing line shown in the camera image.
(6): A storage medium according to another aspect of the present invention stores a program for causing a computer to perform the following: acquiring a camera image capturing the surrounding situation of a vehicle; controlling steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of operations by the driver of the vehicle; determining the driving mode of the vehicle as one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of operations by the driver, and changing the driving mode of the vehicle to a driving mode with a heavier task when the task associated with the determined driving mode is not executed by the driver; determining whether there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information, and whether the vehicle is at a branch point shown in the map information; and, when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and that the vehicle is at the branch point shown in the map information, determining the driving mode of the vehicle based on the branching direction of the branch point and the direction of the road dividing line shown in the camera image.
According to (1) to (6), even when the road dividing line recognized by the camera is different from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be appropriately changed.
Drawings
Fig. 1 is a block diagram of a vehicle system using a vehicle control device according to an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit and the second control unit.
Fig. 3 is a diagram showing an example of the correspondence relationship between the driving mode and the control state and the task of the host vehicle M.
Fig. 4 is a diagram showing an example of a scenario in which the operation of the vehicle control device according to the embodiment is performed.
Fig. 5 is a diagram showing another example of a scenario in which the operation of the vehicle control device according to the embodiment is performed.
Fig. 6 is a flowchart showing an example of a flow of operations executed by the vehicle control device according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention are described below with reference to the drawings.
[Overall Structure]
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device according to an embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its driving source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator coupled to the internal combustion engine, or using the discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a LIDAR 14 (Light Detection and Ranging), an object recognition device 16, a communication device 20, an HMI 30 (Human Machine Interface), a vehicle sensor 40, a navigation device 50, an MPU 60 (Map Positioning Unit), a driver monitoring camera 70, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a braking device 210, and a steering device 220. These devices and apparatuses are connected to one another via a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in Fig. 1 is merely an example; a part of the configuration may be omitted, or another configuration may be added.
The camera 10 is, for example, a digital camera using a solid-state imaging device such as CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor). The camera 10 is mounted on an arbitrary portion of a vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted. When photographing the front, the camera 10 is mounted on the upper part of the front windshield, the rear view mirror of the vehicle interior, or the like. The camera 10, for example, periodically and repeatedly photographs the periphery of the host vehicle M. The camera 10 may also be a stereoscopic camera.
The radar device 12 emits radio waves such as millimeter waves to the periphery of the host vehicle M, and detects at least the position (distance and azimuth) of the object by detecting the radio waves (reflected waves) reflected by the object. The radar device 12 is mounted on an arbitrary portion of the host vehicle M. The radar device 12 may also detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
The LIDAR 14 irradiates light (or electromagnetic waves having wavelengths close to light) around the host vehicle M and measures the scattered light. The LIDAR 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The LIDAR 14 is mounted on an arbitrary portion of the host vehicle M.
The object recognition device 16 performs sensor fusion processing on detection results detected by some or all of the camera 10, the radar device 12, and the LIDAR14, and recognizes the position, type, speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may directly output the detection results of the camera 10, the radar device 12, and the LIDAR14 to the automated driving control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The communication device 20 communicates with other vehicles existing in the vicinity of the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like, or communicates with various server devices via a wireless base station.
The HMI 30 presents various information to the occupants of the host vehicle M and accepts input operations by the occupants. The HMI 30 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
The vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an azimuth sensor that detects the direction of the host vehicle M, and the like.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be determined or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40. The navigation HMI 52 includes a display device, speakers, a touch panel, keys, and the like, and may be partially or entirely shared with the HMI 30 described above. The route determination unit 53 determines a route (hereinafter referred to as a route on the map) from the position of the host vehicle M specified by the GNSS receiver 51 (or an arbitrary input position) to a destination input by an occupant using the navigation HMI 52, with reference to the first map information 54. The first map information 54 is, for example, information in which road shapes are represented by links representing roads and nodes connected by the links. The first map information 54 may include road curvature, POI (Point Of Interest) information, and the like. The route on the map is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the route on the map. The navigation device 50 may be realized by the functions of a terminal device such as a smartphone or tablet held by an occupant, and may transmit the current position and destination to a navigation server via the communication device 20 to acquire a route equivalent to the route on the map from the navigation server.
The MPU 60 includes, for example, a recommended lane determination unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determination unit 61 divides the route on the map supplied from the navigation device 50 into a plurality of blocks (for example, every 100 m in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determination unit 61 determines in which lane, counted from the left, the vehicle should travel. When a branch point exists on the route on the map, the recommended lane determination unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route toward the branch destination.
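The per-block recommended-lane determination can be sketched as follows. The two-block lead time before a branch, the lane indexing, and the function name are assumptions for illustration; the embodiment only states that the route is divided (for example, every 100 m) and that a reasonable lane toward the branch destination is chosen.

```python
import math

def recommended_lane_per_block(route_m: float,
                               branch_at_m: float,
                               branch_lane: int,
                               default_lane: int = 0,
                               block_m: float = 100.0) -> list[int]:
    """Divide the route into fixed-length blocks (100 m by default) and
    pick a recommended lane index for each block (0 = leftmost). When a
    branch point lies on the route, move to the lane leading to the
    branch destination a couple of blocks in advance (assumed policy)."""
    n_blocks = max(1, math.ceil(route_m / block_m))
    lanes = [default_lane] * n_blocks
    first = max(0, int(branch_at_m // block_m) - 2)  # start changing 2 blocks early
    for i in range(first, n_blocks):
        lanes[i] = branch_lane
    return lanes
```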
The second map information 62 is map information having higher accuracy than the first map information 54. The second map information 62 includes, for example, information of the center of a lane or information of the boundary of a lane. The second map information 62 may include road information, traffic restriction information, residence information (residence, zip code), facility information, telephone number information, information of a prohibition region where the mode a or the mode B to be described later is prohibited, and the like. The second map information 62 may be updated at any time by the communication device 20 communicating with other devices. In the present embodiment, the second map information 62 includes, as source information, position information of a branching point of a lane in particular.
The driver monitoring camera 70 is, for example, a digital camera using a solid-state imaging device such as a CCD or CMOS. The driver monitoring camera 70 is mounted on an arbitrary portion of the host vehicle M, in a position and orientation from which the head of the occupant seated in the driver's seat (hereinafter referred to as the driver) can be imaged from the front (in an orientation capable of imaging the face). For example, the driver monitoring camera 70 is mounted on the upper portion of a display device provided in the center of the instrument panel of the host vehicle M.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, and other operation elements in addition to the steering wheel 82. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the automatic driving control device 100, or to some or all of the running driving force output device 200, the braking device 210, and the steering device 220. The steering wheel 82 is an example of an "operation element that accepts steering operations by the driver". The operation element need not be annular; it may take the form of an irregularly shaped steering member, a joystick, a button, or the like. A steering wheel grip sensor 84 is attached to the steering wheel 82. The steering wheel grip sensor 84 is implemented by a capacitance sensor or the like, and outputs to the automatic driving control device 100 a signal from which it can be detected whether the driver is gripping (that is, touching with force applied) the steering wheel 82.
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or flash memory of the automatic driving control device 100, or may be stored in a removable storage medium such as a DVD or CD-ROM and installed in the HDD or flash memory of the automatic driving control device 100 by mounting the storage medium (non-transitory storage medium) in a drive device. The automatic driving control device 100 is an example of the "vehicle control device", and the combination of the action plan generation unit 140 and the second control unit 160 is an example of the "driving control unit".
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130, an action plan generation unit 140, and a mode determination unit 150. The first control unit 120 realizes, for example, functions based on AI (Artificial Intelligence) and functions based on predefined models in parallel. For example, the function of "recognizing an intersection" may be realized by executing in parallel recognition of the intersection by deep learning or the like and recognition based on predefined conditions (the presence of pattern-matchable signals, road signs, and the like), scoring both, and evaluating them comprehensively. This ensures the reliability of automated driving.
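The "score both and evaluate comprehensively" step can be reduced to a weighted combination, sketched below. The equal weights and the function name are assumptions; the embodiment does not specify how the two scores are fused.

```python
def evaluate_intersection(dl_score: float, rule_score: float,
                          w_dl: float = 0.5, w_rule: float = 0.5) -> float:
    """Comprehensive evaluation combining the confidence of the
    deep-learning-based recognition with that of the rule-based
    recognition (signals, road signs via pattern matching).
    Weights are assumed, not taken from the embodiment."""
    return w_dl * dl_score + w_rule * rule_score
```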
The recognition unit 130 recognizes the position, speed, acceleration, and other states of objects in the vicinity of the host vehicle M based on information input from the camera 10, the radar device 12, and the LIDAR 14 via the object recognition device 16. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point of the host vehicle M (center of gravity, drive-shaft center, etc.), and is used for control. The position of an object may be represented by a representative point such as its center of gravity or a corner, or by a region. The "state" of an object may include its acceleration or jerk, or its "behavior state" (for example, whether it is making, or about to make, a lane change).
The recognition unit 130 recognizes, for example, the lane (traveling lane) in which the host vehicle M is traveling. For example, the recognition unit 130 recognizes the traveling lane by comparing the pattern of road dividing lines obtained from the second map information 62 (for example, the arrangement of solid and broken lines) with the pattern of road dividing lines around the host vehicle M recognized from the image captured by the camera 10. The recognition unit 130 is not limited to road dividing lines; it may recognize the traveling lane by recognizing running road boundaries (road boundaries) including road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the host vehicle M acquired from the navigation device 50 and the processing result of the INS may be taken into account. The recognition unit 130 also recognizes temporary stop lines, obstacles, red lights, toll booths, and other road phenomena.
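The pattern comparison described above can be sketched as a simple matching score over (left, right) line styles. This is a hypothetical illustration; the actual recognition would also use line geometry, the GNSS/INS position, and road boundaries.

```python
def identify_driving_lane(map_line_patterns: list[tuple[str, str]],
                          camera_line_pattern: tuple[str, str]) -> int:
    """Identify the traveling lane by comparing each candidate lane's
    (left, right) dividing-line pattern from the map information
    (e.g. 'solid' / 'dashed') with the pattern recognized in the camera
    image; returns the index of the best-matching lane."""
    def score(map_pat: tuple[str, str]) -> int:
        # Count how many of the two line styles agree.
        return sum(m == c for m, c in zip(map_pat, camera_line_pattern))
    return max(range(len(map_line_patterns)),
               key=lambda i: score(map_line_patterns[i]))
```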
When recognizing the traveling lane, the recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the traveling lane. The recognition unit 130 may recognize, for example, the deviation of the reference point of the host vehicle M from the lane center and the angle formed by the traveling direction of the host vehicle M with respect to a line connecting the lane centers, as the relative position and posture of the host vehicle M with respect to the traveling lane. Alternatively, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to either side end of the traveling lane (road dividing line or road boundary) as the relative position of the host vehicle M with respect to the traveling lane.
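The lateral deviation and heading-angle computation just described is standard planar geometry; a minimal sketch follows, with assumed names and a straight-segment lane-center model.

```python
import math

def relative_pose(vehicle_xy: tuple[float, float],
                  vehicle_heading_rad: float,
                  center_p0: tuple[float, float],
                  center_p1: tuple[float, float]) -> tuple[float, float]:
    """Signed lateral deviation of the vehicle's reference point from the
    lane-center segment p0 -> p1, and the angle between the traveling
    direction and the lane direction. Illustrative geometry only."""
    dx, dy = center_p1[0] - center_p0[0], center_p1[1] - center_p0[1]
    lane_heading = math.atan2(dy, dx)
    vx, vy = vehicle_xy[0] - center_p0[0], vehicle_xy[1] - center_p0[1]
    seg_len = math.hypot(dx, dy)
    lateral = (dx * vy - dy * vx) / seg_len  # cross product / segment length
    # Wrap the heading error into (-pi, pi].
    yaw_err = (vehicle_heading_rad - lane_heading + math.pi) % (2 * math.pi) - math.pi
    return lateral, yaw_err
```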
The action plan generation unit 140 generates a target track along which the host vehicle M will automatically travel in the future (independently of the driver's operations), in principle traveling in the recommended lane determined by the recommended lane determination unit 61 while coping with the surrounding situation of the host vehicle M. The target track includes, for example, a speed element. For example, the target track is represented as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, every several meters) along the road; separately from this, a target speed and target acceleration for every predetermined sampling time (for example, every several tenths of a second) are generated as part of the target track. A track point may instead be the position that the host vehicle M should reach at each sampling time. In that case, the information on the target speed and target acceleration is expressed by the interval between track points.
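The last variant, where point spacing encodes speed, can be sketched as follows; names and the sampling interval are assumptions for illustration.

```python
def track_points_by_time(path_length_m: float,
                         target_speed_mps: float,
                         dt_s: float = 0.1) -> list[float]:
    """Arc-length positions the vehicle should reach at each sampling
    time. At a constant target speed the spacing between consecutive
    points encodes that speed (spacing = speed * dt); acceleration would
    appear as a growing spacing. Sketch only."""
    pts, s = [], 0.0
    while s <= path_length_m + 1e-9:  # epsilon so the endpoint is kept
        pts.append(s)
        s += target_speed_mps * dt_s
    return pts
```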
The action plan generation unit 140 may set an event of automatic driving when generating the target trajectory. The event of automatic driving includes a constant speed driving event, a low speed following driving event, a lane change event, a branching event, a converging event, a take over event, and the like. The action plan generation unit 140 generates a target track corresponding to the started event.
The mode determination unit 150 determines the driving mode of the host vehicle M as one of a plurality of driving modes that differ in the task placed on the driver. The mode determination unit 150 includes, for example, a determination unit 152. The function of the determination unit 152 is described later.
Fig. 3 is a diagram showing an example of the correspondence between the driving mode, the control state of the host vehicle M, and the task. The driving modes of the host vehicle M include, for example, five modes, mode A to mode E. Regarding the control state, that is, the degree of automation of the driving control of the host vehicle M, mode A is the highest; mode B, mode C, and mode D are successively lower; and mode E is the lowest. Conversely, regarding the task placed on the driver, mode A is the lightest; mode B, mode C, and mode D are successively heavier; and mode E is the heaviest. Since modes D and E are control states other than automated driving, the automatic driving control device 100 is responsible for ending control related to automated driving and transitioning to driving support or manual driving. The content of each driving mode is exemplified below.
In mode A, the vehicle is in an automatic driving state, and neither front monitoring nor gripping of the steering wheel 82 ("steering wheel" in the drawing) is imposed on the driver. However, even in mode A, the driver is required to maintain a body posture allowing a quick shift to manual driving in response to a request from the system centered on the automatic driving control device 100. Here, "automatic driving" means that both steering and acceleration/deceleration are controlled independently of the driver's operation. "Front" refers to the space in the traveling direction of the host vehicle M visually recognized through the front windshield. Mode A is a driving mode executable when, for example, the host vehicle M is traveling at a predetermined speed or less (for example, about 50 km/h) on a motor-vehicle-only road such as an expressway and a condition such as the presence of a preceding vehicle to follow is satisfied; it is sometimes referred to as TJP (Traffic Jam Pilot). When this condition is no longer satisfied, the mode determination unit 150 changes the driving mode of the host vehicle M to mode B.
In mode B, the task of monitoring the area ahead of the host vehicle M (hereinafter, front monitoring) is placed on the driver, but the task of gripping the steering wheel 82 is not. In mode C, a driving support state is set, and both the task of front monitoring and the task of gripping the steering wheel 82 are placed on the driver. Mode D is a driving mode that requires a certain degree of driving operation by the driver with respect to at least one of steering and acceleration/deceleration of the host vehicle M. For example, in mode D, driving support such as ACC (Adaptive Cruise Control) and LKAS (Lane Keeping Assist System) is performed. In mode E, a manual driving state is set in which both steering and acceleration/deceleration require driving operation by the driver. In both modes D and E, naturally, the task of monitoring the area ahead of the host vehicle M is placed on the driver.
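The mode-to-task correspondence of Fig. 3 can be sketched as a small table. The mode names come from the text; the task labels and the subset structure are assumptions made for this illustration.

```python
# Illustrative mapping of Fig. 3. Modes run from the highest degree of
# automation (A) to manual driving (E); the driver's task grows heavier
# in the same order. Task labels are assumptions, not patent terminology.
MODES = ["A", "B", "C", "D", "E"]

TASKS = {
    "A": set(),                                   # no front monitoring, hands off
    "B": {"monitor_front"},                       # monitoring only
    "C": {"monitor_front", "grip_wheel"},         # driving support state
    "D": {"monitor_front", "grip_wheel", "partial_operation"},  # e.g. ACC/LKAS
    "E": {"monitor_front", "grip_wheel", "full_operation"},     # manual driving
}

def heavier_mode(mode):
    """Next mode with a heavier driver task, saturating at mode E."""
    i = MODES.index(mode)
    return MODES[min(i + 1, len(MODES) - 1)]
```

Because each mode's task set strictly contains the previous one, "heavier task" here is simply the next mode in the ordering.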
The automatic driving control device 100 (and a driving support device, not shown) executes automatic lane changes according to the driving mode. Automatic lane changes include an automatic lane change (1) based on a system request and an automatic lane change (2) based on a driver request. The automatic lane change (1) includes an automatic lane change for overtaking, performed when the speed of the preceding vehicle is lower than the speed of the host vehicle by a reference amount or more, and an automatic lane change for traveling toward the destination (an automatic lane change performed because the recommended lane changes). The automatic lane change (2) causes the host vehicle M to change lanes in the direction of operation when the driver operates the direction indicator and conditions relating to speed, the positional relationship with surrounding vehicles, and the like are satisfied.
The automatic driving control device 100 executes neither automatic lane change (1) nor (2) in mode A. It executes both automatic lane changes (1) and (2) in modes B and C. In mode D, the driving support device (not shown) executes the automatic lane change (2) but not the automatic lane change (1). In mode E, neither automatic lane change (1) nor (2) is executed.
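The per-mode permissions just listed can be captured in a lookup table. A minimal sketch (structure and names are illustrative; in mode D the driving support device, not the automatic driving control device 100, is the executor):

```python
# Which automatic lane changes each mode permits, per the text:
# "system" = (1) system-requested, "driver" = (2) driver-requested.
AUTO_LANE_CHANGE = {
    "A": {"system": False, "driver": False},
    "B": {"system": True,  "driver": True},
    "C": {"system": True,  "driver": True},
    "D": {"system": False, "driver": True},   # executed by the driving support device
    "E": {"system": False, "driver": False},
}

def may_lane_change(mode, requested_by):
    """Return whether an automatic lane change of the given kind is permitted."""
    return AUTO_LANE_CHANGE[mode][requested_by]
```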
When the driver does not execute the task related to the determined driving mode (hereinafter, the current driving mode), the mode determination unit 150 changes the driving mode of the host vehicle M to a driving mode with a heavier task.
For example, when the driver in mode A is in a body posture from which a shift to manual driving in response to a request from the system is impossible (for example, when the driver continues to look outside a permissible area, or when a sign that driving has become difficult is detected), the mode determination unit 150 uses the HMI 30 to prompt the driver to shift to manual driving and, if the driver does not respond, performs control to pull the host vehicle M over to the road shoulder, gradually stop it, and stop automatic driving. After automatic driving is stopped, the host vehicle is in mode D or E, and the host vehicle M can be started by a manual operation of the driver. The same applies hereinafter to "stop automatic driving". When the driver is not monitoring the front in mode B, the mode determination unit 150 uses the HMI 30 to prompt the driver to monitor the front and, if the driver does not respond, performs control to pull the host vehicle M over to the road shoulder, gradually stop it, and stop automatic driving. When the driver is not monitoring the front or is not gripping the steering wheel 82 in mode C, the mode determination unit 150 uses the HMI 30 to prompt the driver to monitor the front and/or grip the steering wheel 82 and, if the driver does not respond, performs control to pull the host vehicle M over to the road shoulder, gradually stop it, and stop automatic driving.
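The prompt-then-stop fallback common to modes A through C can be sketched as a single sequence. The action names are illustrative assumptions; the patent describes behavior, not an API.

```python
# Hedged sketch of the fallback sequence: prompt the driver via the HMI,
# and if there is no response, pull over to the shoulder, stop gradually,
# and stop automatic driving (ending in mode D or E).
def handle_neglected_task(driver_responds):
    actions = ["prompt_via_hmi"]
    if driver_responds:
        actions.append("continue_current_mode")
    else:
        actions += ["pull_over_to_shoulder",
                    "stop_automatic_driving",
                    "enter_mode_D_or_E"]
    return actions
```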
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target track generated by the action plan generation unit 140 at the scheduled times.
Returning to Fig. 2, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information on the target track (track points) generated by the action plan generation unit 140 and stores it in a memory (not shown). The speed control unit 164 controls the running driving force output device 200 or the brake device 210 based on the speed element attached to the target track stored in the memory. The steering control unit 166 controls the steering device 220 in accordance with the degree of curvature of the target track stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. As an example, the steering control unit 166 executes a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target track.
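The feedforward-plus-feedback steering law can be illustrated with a kinematic sketch. The wheelbase, gain, and bicycle-model feedforward term are assumptions for this example, not values from the patent.

```python
# Illustrative steering law: a feedforward term from the curvature of the
# road ahead plus proportional feedback on the lateral deviation from the
# target track. All constants are assumed values.
WHEELBASE_M = 2.7   # assumed vehicle wheelbase
K_FB = 0.5          # assumed feedback gain [rad per m of lateral error]

def steering_angle(road_curvature_1pm, lateral_error_m):
    feedforward = WHEELBASE_M * road_curvature_1pm  # kinematic bicycle model
    feedback = -K_FB * lateral_error_m              # steer back toward the track
    return feedforward + feedback
```

On a straight road the feedforward term vanishes and only the deviation feedback acts; in a curve the feedforward term supplies most of the angle so the feedback only corrects residual error.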
The running driving force output device 200 outputs, to the driving wheels, a running driving force (torque) for the vehicle to travel. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 so that a brake torque corresponding to the braking operation is output to each wheel. The brake device 210 may include a mechanism that transmits hydraulic pressure generated by operation of the brake pedal included in the driving operation element 80 to the cylinder via a master cylinder. The brake device 210 is not limited to the configuration described above and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 to transmit the hydraulic pressure of the master cylinder to the cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor, for example, applies force to a rack-and-pinion mechanism to change the direction of the steered wheels. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
[Operation of the vehicle control device]
Next, the operation of the vehicle control device according to the embodiment will be described. In the following description, it is assumed that the host vehicle M is traveling in the driving mode of mode B. Fig. 4 is a diagram showing an example of a scenario in which the operation of the vehicle control device according to the embodiment is performed. Fig. 4 shows a situation in which the host vehicle M is traveling in lane L1 ahead of a branch point, and the driving mode of mode B is set so that the vehicle travels straight through the branch point.
While the host vehicle M is traveling in lane L1, the recognition unit 130 recognizes the surrounding situation of the host vehicle M, in particular the road dividing lines on both sides of the host vehicle M, based on the image captured by the camera 10. Hereinafter, a road dividing line recognized from the image captured by the camera 10 is referred to as the camera road dividing line CL, and a road dividing line recognized from the second map information 62 is referred to as the map road dividing line ML. Fig. 4 shows a situation in which the map road dividing line ML deviates from the actual road dividing line AL due to, for example, construction or redrawing of the road dividing line.
The determination unit 152 determines whether there is a deviation (mismatch) between the camera road dividing line CL and the map road dividing line ML while the host vehicle M is traveling. Here, a deviation means, for example, that the distance between the camera road dividing line CL and the map road dividing line ML is equal to or greater than a predetermined value. The determination unit 152 also determines whether the host vehicle M is at a branch point based on the position information of the branch point obtained from the second map information 62. More specifically, for example, the determination unit 152 determines whether the host vehicle M is at the branch point by determining whether the host vehicle M is within a predetermined distance of the position information (for example, GPS coordinates) of the branch point.
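The two checks described above can be sketched as follows; both thresholds are illustrative assumptions, since the patent leaves the "predetermined value" and "predetermined distance" unspecified.

```python
import math

# Sketch of the determination unit's two checks. The threshold values
# are assumptions, not figures from the patent.
DEVIATION_THRESHOLD_M = 0.5   # "predetermined value" for the CL-vs-ML distance
BRANCH_RADIUS_M = 100.0       # "predetermined distance" to the branch point

def lines_deviate(cl_lateral_m, ml_lateral_m):
    """Deviation (mismatch) between camera line CL and map line ML."""
    return abs(cl_lateral_m - ml_lateral_m) >= DEVIATION_THRESHOLD_M

def at_branch_point(vehicle_xy, branch_xy):
    """Whether the vehicle is within the predetermined distance of the branch."""
    return math.dist(vehicle_xy, branch_xy) <= BRANCH_RADIUS_M
```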
When it is determined that there is a deviation between the camera road dividing line CL and the map road dividing line ML and that the host vehicle M is at a branch point, the mode determination unit 150 determines the driving mode of the host vehicle M based on the branching direction of the branch point obtained from the second map information 62 and the direction of the camera road dividing line CL.
More specifically, for example, as shown in Fig. 4, the mode determination unit 150 calculates the degree of deviation between the branching direction of the branch point obtained from the second map information 62 (the extending direction of the map road dividing line ML) and the direction of the camera road dividing line CL, and determines that the reliability of the camera road dividing line CL is higher than that of the map road dividing line ML when the calculated degree of deviation is equal to or greater than a threshold value. This is because, in general, when the host vehicle M passes a branch point, the camera road dividing line CL is frequently misrecognized as extending in a direction closer to the branch-side dividing line than to the main line, for example due to unusual paint conditions at the branch point. That is, if the recognized camera road dividing line CL is not directed toward the branch, the reliability of the camera road dividing line CL is high.
Therefore, when the degree of deviation between the branching direction of the branch point obtained from the second map information 62 and the direction of the camera road dividing line CL is equal to or greater than the threshold value, the mode determination unit 150 changes the driving mode from mode B to mode C and continues traveling in the driving mode of mode C using the camera road dividing line CL as the reference line. Alternatively, the mode determination unit 150 may keep the driving mode at mode B and continue traveling in mode B using the camera road dividing line CL as the reference line. Thus, even when there is a mismatch between the camera road dividing line CL and the map road dividing line ML at a branch point, the travel of the host vehicle M can be controlled using the road dividing line with the higher reliability.
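The decision rule of Figs. 4 and 5 reduces to a single angular comparison. In this sketch the 5-degree threshold is an assumed value, and headings stand in for the "degree of deviation" between the branching direction and the camera line's direction.

```python
import math

# Sketch of the branch-point decision: if the camera line CL diverges from
# the map's branching direction by the threshold or more, CL is treated as
# the more reliable line and the mode drops from B to C; otherwise the map
# line ML is kept and mode B continues. Threshold value is an assumption.
ANGLE_THRESHOLD_RAD = math.radians(5.0)

def decide_at_branch(cl_heading_rad, branch_heading_rad):
    deviation = abs(cl_heading_rad - branch_heading_rad)
    if deviation >= ANGLE_THRESHOLD_RAD:
        return ("C", "camera")   # CL does not head toward the branch: trust CL
    return ("B", "map")          # CL may be misrecognized toward the branch
```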
Fig. 5 is a diagram showing another example of a scenario in which the operation of the vehicle control device according to the embodiment is performed. The scenario of Fig. 5 represents a situation in which the camera road dividing line CL is recognized as extending in a direction different from that of the lane L1 in which the host vehicle M has been traveling. In this case, the determination unit 152 determines that the degree of deviation between the branching direction of the branch point obtained from the second map information 62 and the direction of the camera road dividing line CL is smaller than the threshold value, and the mode determination unit 150 continues the driving mode of mode B using the map road dividing line ML as the reference line. That is, when the degree of deviation between the branching direction of the branch point obtained from the second map information 62 and the direction of the camera road dividing line CL is smaller than the threshold value, the camera road dividing line CL may have been erroneously recognized as heading toward the branch, and therefore the driving mode of mode B is continued using the map road dividing line ML. As a result, the host vehicle M is not drawn toward the camera road dividing line CL heading in the branch direction and can travel straight along the map road dividing line ML in lane L1.
Alternatively, when the degree of deviation between the branching direction of the branch point and the direction of the camera road dividing line CL is smaller than the threshold value, the mode determination unit 150 may report to the occupant of the host vehicle M that the driving mode will be changed from mode B to mode C. For example, in that case, the mode determination unit 150 may report information indicating the possibility of erroneous recognition at the branch point to the occupant of the host vehicle M and then change the driving mode to mode C, or may change it to mode C after a certain period has elapsed since the deviation was determined.
Next, a flow of operations performed by the vehicle control device according to the embodiment will be described with reference to fig. 6. Fig. 6 is a flowchart showing an example of a flow of operations executed by the vehicle control device according to the embodiment. The processing according to the present flowchart is executed in a predetermined cycle while the host vehicle M is traveling in the driving mode of the mode B using the camera road dividing line CL.
First, the mode determination unit 150 acquires the camera road dividing line CL and the map road dividing line ML via the recognition unit 130 (step S100). Next, the determination unit 152 determines whether there is a deviation between the acquired camera road dividing line CL and map road dividing line ML (step S102).
When it is determined that there is a deviation between the acquired camera road dividing line CL and map road dividing line ML, the determination unit 152 next determines whether the host vehicle M is at a branch point based on the second map information 62 (step S104). When it is determined that there is no deviation between them, or when it is determined that the host vehicle M is not at a branch point, the mode determination unit 150 returns the processing to step S100.
On the other hand, when it is determined that the host vehicle M is at a branch point, the determination unit 152 next determines whether the degree of deviation between the direction of the camera road dividing line CL and the branching direction of the branch point is equal to or greater than the threshold value (step S106). When the degree of deviation is determined to be smaller than the threshold value, the mode determination unit 150 determines to continue the driving mode of mode B using the map road dividing line ML as the reference line (step S108). On the other hand, when the degree of deviation is determined to be equal to or greater than the threshold value, the mode determination unit 150 changes the driving mode from mode B to mode C (step S110). At this time, the mode determination unit 150 temporarily discards the map road dividing line ML and determines to continue the driving mode of mode C using the camera road dividing line CL as the reference line. The processing of this flowchart then ends.
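The flow of steps S100 through S110 can be condensed into one decision function. This is a sketch of the control flow only (the patent specifies no implementation); the step predicates are passed in as booleans so the function stays self-contained.

```python
# End-to-end sketch of the flowchart of Fig. 6. Returns a pair of
# (driving mode, reference line); names are illustrative.
def drive_mode_step(deviates, at_branch, deviation_ge_threshold):
    # S100/S102: acquire CL and ML, check for a CL-vs-ML deviation
    if not deviates:
        return ("B", "camera")       # no mismatch: keep going, re-check next cycle
    # S104: act on the mismatch only at a branch point
    if not at_branch:
        return ("B", "camera")
    # S106: compare the CL direction with the branching direction
    if deviation_ge_threshold:
        return ("C", "camera")       # S110: discard ML, continue in mode C on CL
    return ("B", "map")              # S108: continue mode B on ML
```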
According to the present embodiment described above, when the camera road dividing line deviates from the map road dividing line and the vehicle is at a branch point, the driving mode of the vehicle is determined based on the direction of the camera road dividing line and the branching direction of the branch point. Thus, even when the road dividing line recognized by the camera differs from the content of the map information mounted on the host vehicle, the driving control of the vehicle can be changed appropriately.
The embodiments described above can be expressed as follows.
A vehicle control device is provided with:
a storage device in which a program is stored; and
a hardware processor,
wherein the hardware processor performs the following processing by executing computer-readable instructions stored in the storage device:
Acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of an operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute the task related to the determined driving mode;
Determining whether there is a deviation between a road division line shown in the camera image and a road division line shown in the map information, and determining whether the vehicle is at a branching point shown in the map information;
when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and it is determined that the vehicle is at the branch point shown in the map information, a driving mode of the vehicle is determined based on a branching direction of the branch point and a direction of the road dividing line shown in the camera image.
The specific embodiments of the present invention have been described above using the embodiments, but the present invention is not limited to such embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (6)

1. A vehicle control apparatus, wherein,
the vehicle control device includes:
an acquisition unit that acquires a camera image obtained by capturing a surrounding situation of a vehicle;
a driving control unit that controls steering and acceleration/deceleration of the vehicle based on the camera image and map information, independently of an operation of a driver of the vehicle;
a mode determination unit that determines a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed under control by the driving control unit, the mode determination unit changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute the task related to the determined driving mode; and
a determination section that determines whether there is a deviation between a road division line shown in the camera image and a road division line shown in the map information, and determines whether the vehicle is at a branching point shown in the map information,
the mode determination unit determines a driving mode of the vehicle based on a branching direction of the branching point and a direction of a road dividing line indicated by the camera image when it is determined that there is a deviation between the road dividing line indicated by the camera image and the road dividing line indicated by the map information and it is determined that the vehicle is at the branching point indicated by the map information.
2. The vehicle control apparatus according to claim 1, wherein,
the mode determination unit changes the second driving mode to the first driving mode when a degree of deviation between a branching direction of the branching point and a direction of a road dividing line shown in the camera image is equal to or greater than a threshold value, and continues the first driving mode using the road dividing line shown in the camera image.
3. The vehicle control apparatus according to claim 1, wherein,
the mode determination unit continues the second driving mode using the road division line indicated by the map information when a degree of deviation between a branching direction of the branching point and a direction of the road division line indicated by the camera image is smaller than a threshold value.
4. The vehicle control apparatus according to any one of claims 1 to 3, wherein,
the second driving mode is a driving mode in which a task of holding an operation member that accepts a steering operation of the vehicle is not disposed for the driver,
the first driving mode is a driving mode in which at least a task of holding the operation element is arranged for the driver.
5. A vehicle control method, wherein,
The vehicle control method causes a computer to perform the following processing:
acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of an operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute the task related to the determined driving mode;
determining whether there is a deviation between a road division line shown in the camera image and a road division line shown in the map information, and determining whether the vehicle is at a branching point shown in the map information;
When it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and it is determined that the vehicle is at the branch point shown in the map information, a driving mode of the vehicle is determined based on a branching direction of the branch point and a direction of the road dividing line shown in the camera image.
6. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
acquiring a camera image obtained by capturing a surrounding situation of a vehicle;
controlling steering and acceleration and deceleration of the vehicle independently of an operation of a driver of the vehicle based on the camera image and map information;
determining a driving mode of the vehicle as any one of a plurality of driving modes including a first driving mode and a second driving mode, the second driving mode being a driving mode in which the task placed on the driver is lighter than in the first driving mode, at least some of the plurality of driving modes, including the second driving mode, being executed by controlling steering and acceleration/deceleration of the vehicle independently of an operation of the driver of the vehicle, and changing the driving mode of the vehicle to a driving mode with a heavier task when the driver does not execute the task related to the determined driving mode;
Determining whether there is a deviation between a road division line shown in the camera image and a road division line shown in the map information, and determining whether the vehicle is at a branching point shown in the map information;
when it is determined that there is a deviation between the road dividing line shown in the camera image and the road dividing line shown in the map information and it is determined that the vehicle is at the branch point shown in the map information, a driving mode of the vehicle is determined based on a branching direction of the branch point and a direction of the road dividing line shown in the camera image.
CN202310309522.9A 2022-03-31 2023-03-27 Vehicle control device, vehicle control method, and storage medium Pending CN116890838A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022059656A JP2023150513A (en) 2022-03-31 2022-03-31 Vehicle control device, vehicle control method and program
JP2022-059656 2022-03-31

Publications (1)

Publication Number Publication Date
CN116890838A 2023-10-17


Also Published As

Publication number Publication date
JP2023150513A (en) 2023-10-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination