CN111532269B - Vehicle control device - Google Patents


Publication number
CN111532269B
Authority
CN
China
Prior art keywords
information
intervention
steering
vehicle
driver
Prior art date
Legal status
Active
Application number
CN202010073377.5A
Other languages
Chinese (zh)
Other versions
CN111532269A (en
Inventor
塚田竹美
石坂贤太郎
渡边崇
八代胜也
幸加木徹
广濑峰史
池田雅也
松田寿志
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN111532269A
Application granted
Publication of CN111532269B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18: Propelling the vehicle
    • B60W30/18009: Propelling the vehicle related to particular drive situations
    • B60W30/18163: Lane change; Overtaking manoeuvres
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vehicle control device that appropriately combines automated driving with manual driving in which the driver intervenes. The vehicle control device according to the present invention includes: a trajectory generation unit that sets a target position based on an output of a periphery monitoring mechanism provided in the vehicle and generates trajectory information directed toward the target position; and a travel control means that controls travel based on the trajectory information. The driver can perform an intervention operation with respect to the travel controlled by the travel control means, and when an intervention operation occurs, the trajectory generation unit further changes the trajectory information based on the trajectory corrected by the intervention operation.

Description

Vehicle control device
Technical Field
The present invention relates to a vehicle control device that performs, for example, automated driving and driving assistance of an automobile.
Background
In automated driving or driving assistance of a vehicle such as a four-wheeled vehicle, sensors monitor a specific direction or all directions around the vehicle, as well as the driver's state and the vehicle's traveling state, and based on the monitoring results the system drives the vehicle automatically along an appropriate route at an appropriate speed, or assists driving by the driver. Even in a vehicle with such an automated driving function, situations may arise in which the driver is required to participate actively in driving. In such cases, the driver can intervene in driving manually even during automated driving.
Techniques for achieving both such automated driving and manual driving by the driver have been proposed, for example in Patent Document 1. Patent Document 1 describes a technique that learns the steering characteristic of a driver, sets a cutoff frequency based on the learned characteristic, filters the steering input at the set cutoff frequency, and thereby determines whether or not an intervention (override) by the driver has occurred.
Documents of the prior art
Patent document
Patent document 1: international publication No. 2013/128638
Disclosure of Invention
Problems to be solved by the invention
Beyond such steering characteristics, drivers also have individual preferences and habits in driving. For example, a driver may travel not in the center of the lane but offset to the left or right, or may keep to the center of the lane through a curve rather than cutting toward the inside.
However, such habits and preferences of the driver are not reflected in the planned travel trajectory determined during automated driving.
The present invention has been made in view of the above conventional example, and an object thereof is to provide a vehicle control device that appropriately combines automated driving with manual driving in which the driver intervenes.
Means for solving the problems
In order to achieve the above object, the present invention has the following configurations.
That is, according to one aspect of the present invention, there is provided a vehicle control device for performing driving assistance or automated driving of a host vehicle,
the vehicle control device including:
a trajectory generation means for setting a target position based on an output of a periphery monitoring means provided in the vehicle and generating trajectory information directed toward the target position;
a travel control means for controlling travel based on the trajectory information; and
an operation means with which the driver can perform an intervention operation with respect to the travel controlled by the travel control means,
wherein, when there is an intervention operation via the operation means, the trajectory generation means further changes the trajectory information, for a predetermined period, based on the trajectory corrected by the intervention operation.
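To make the claim concrete, the following sketch illustrates one way a trajectory generation unit could change its trajectory information, for a predetermined period, based on a driver-corrected trajectory. This is a hypothetical illustration only: the class, the time-based period, and the offset-blending scheme are assumptions, not the patented method.

```python
import time

class TrajectoryGenerator:
    """Hypothetical sketch: blend a driver's lateral correction into the
    generated trajectory for a limited (predetermined) period, then revert."""

    def __init__(self, hold_period_s=60.0, blend=0.5):
        self.hold_period_s = hold_period_s   # predetermined period (assumed value)
        self.blend = blend                   # weight given to the driver's correction
        self._intervention = None            # (timestamp, lateral offset in meters)

    def record_intervention(self, lateral_offset_m):
        """Called when a steering intervention corrects the trajectory."""
        self._intervention = (time.monotonic(), lateral_offset_m)

    def generate(self, target_positions):
        """target_positions: list of (x, y) points toward the target position."""
        traj = list(target_positions)
        if self._intervention is not None:
            t0, offset = self._intervention
            if time.monotonic() - t0 < self.hold_period_s:
                # Shift the planned trajectory toward the driver-corrected one.
                traj = [(x, y + self.blend * offset) for x, y in traj]
            else:
                self._intervention = None    # period elapsed: revert to nominal
        return traj
```

With `blend=0.5`, an intervention offset of 0.4 m shifts every planned point by 0.2 m until the hold period elapses.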
Effects of the invention
According to the present invention, automated driving and manual driving in which the driver intervenes can be appropriately combined.
Drawings
Fig. 1 is a diagram showing a configuration of a vehicle system of an autonomous vehicle according to an embodiment.
Fig. 2 is a functional block diagram of a vehicle control system (control unit).
Fig. 3 is a block diagram of the steering apparatus.
Fig. 4 (A) is a flowchart showing a trajectory information generation procedure, and Fig. 4 (B) is a flowchart showing a procedure for processing intervention information when an obstacle disappears from the running environment.
Fig. 5 is a flowchart showing a procedure of setting intervention information.
Description of the reference numerals
2: control unit; 31: steering wheel; 21: steering ECU; 146: trajectory generation unit; 185: intervention information.
Detailed Description
[ first embodiment ]
Outline of automatic driving and driving assistance
First, an outline of an example of automated driving will be described. In automated driving, the driver generally sets a destination before traveling using a navigation system mounted on the vehicle, and a route to the destination is determined in advance by a server or by the navigation system. When the vehicle starts, a vehicle control device (or driving control device) constituted by ECUs and the like included in the vehicle drives the vehicle to the destination along the route. During this time, the device determines appropriate actions in accordance with the external environment, such as the route and road conditions, the state of the driver, and so on, and executes them through, for example, drive control, steering control, and brake control. These controls are sometimes collectively referred to as travel control.
In automated driving there are several control states (also called levels of the automated driving control state, or simply states) according to the degree of automation (or the amount of tasks required of the driver). Generally, the higher the level of the automated driving control state, and thus the higher the degree of automation, the lighter the task (i.e., load) required of the driver. For example, in the highest-level control state in this example (the third control state), the driver may attend to things other than driving. This third control state is applied in less complicated environments, such as following a preceding vehicle in congestion on an expressway. In the second control state, one level lower, the driver need not hold the steering wheel but must pay attention to the surrounding situation. For example, the second control state may be applied when cruising on a highway or the like with few obstacles. Whether the driver is paying attention to the surroundings can be detected by the driver state detection camera 41a (see Fig. 1), and whether the driver is holding the steering wheel can be detected by the steering wheel grip sensor. The driver state detection camera 41a may, for example, recognize the driver's pupils to determine the direction of gaze, or may simply recognize the face and estimate the direction in which the face is oriented as the driver's direction of observation.
In the first control state, one level lower still, the driver need not perform steering or accelerator operations, but must grip the steering wheel and pay attention to the driving environment in preparation for a handover (takeover) of driving control from the vehicle to the driver. The zeroth control state, lower still, is manual driving, but it includes automated driving assistance. The first control state differs from the zeroth in that the first control state is one of the control states of automated driving, and under the control of the vehicle 1 the system can transition between the first, second, and third control states depending on the external environment, the traveling state, the driver state, and so on.
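The hierarchy of control states described above can be summarized in code. The encoding below is a hypothetical illustration of the driver tasks remaining at each level, not part of the patent:

```python
from enum import IntEnum

class ControlState(IntEnum):
    """Hypothetical encoding of the control states described above
    (higher value = higher automation, lighter driver task)."""
    ZEROTH = 0   # manual driving (with driving assistance)
    FIRST  = 1   # hands on wheel, watch the driving environment
    SECOND = 2   # hands off allowed, must monitor surroundings
    THIRD  = 3   # may attend to things other than driving

def required_tasks(state):
    """Return the set of tasks still required of the driver in each state."""
    tasks = set()
    if state <= ControlState.SECOND:
        tasks.add("monitor surroundings")
    if state <= ControlState.FIRST:
        tasks.add("hold steering wheel")
    if state == ControlState.ZEROTH:
        tasks.add("perform driving operations")
    return tasks
```

In the third control state the set is empty, matching the description that the driver may notice something other than driving.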
The driving assistance in the zeroth control state described above is a function that assists the driving operations performed by the driver, who remains the driving subject, through periphery monitoring and partial automation. Examples include LKAS (lane keeping assist system) and ACC (adaptive cruise control). There are also an automatic braking function that monitors only the front and applies the brakes when an obstacle is detected, a rear monitoring function that detects a vehicle diagonally behind and alerts the driver, a parking function that steers the vehicle into a parking space, and the like. These functions may also be available in the first control state of automated driving. Note that LKAS is a function that keeps the lane by recognizing white lines and the like on the road, and ACC is a function that follows a preceding vehicle in accordance with the preceding vehicle's speed.
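As a toy illustration of an ACC-style following law of the kind mentioned above (the structure, gains, and limits are hypothetical and not taken from the patent), a commanded acceleration might combine the gap error with the relative speed of the preceding vehicle:

```python
def acc_target_accel(gap_m, rel_speed_mps, desired_gap_m=30.0,
                     k_gap=0.05, k_rel=0.3, limit_mps2=2.0):
    """Hypothetical ACC sketch: drive the gap toward the desired gap while
    matching the preceding vehicle's speed; clamp the commanded acceleration.

    gap_m:         current distance to the preceding vehicle [m]
    rel_speed_mps: preceding vehicle speed minus own speed [m/s]
    """
    accel = k_gap * (gap_m - desired_gap_m) + k_rel * rel_speed_mps
    return max(-limit_mps2, min(limit_mps2, accel))
```

At the desired gap with matched speeds the command is zero; a shrinking gap or a slower preceding vehicle produces deceleration, clamped to the comfort limit.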
In addition, during automated driving there may be intervention or corrective action by the driver, called an override. For example, if the driver performs steering or accelerator operations during automated driving, the driver's operation may be given priority. In this case, the automated driving function continues to operate so that automated driving can be resumed from that point even if the driver stops operating. Therefore, the automated driving control state may change even during an override. When the driver performs a brake operation, automated driving may be cancelled and control may shift to manual driving (the zeroth control state).
When the automated driving control state (or automation level) is switched, the driver is notified by sound, display, vibration, or the like. For example, when automated driving switches from the first control state to the second control state, the driver is notified that the steering wheel may be released. In the opposite case, the driver is told to hold the steering wheel, and this notification is repeated until the driver's grip on the steering wheel is detected by a steering wheel grip sensor (e.g., sensor 210I of Fig. 3). If the steering wheel is not gripped within a time limit, or by the point at which the control state must switch, an action such as stopping the vehicle at a safe place can be taken. Likewise, when switching from the second control state to the third control state, the driver is relieved of the obligation to monitor the surroundings, so the driver is notified of this. In the opposite case, the driver is told to monitor the surroundings, and this notification is repeated until the driver state detection camera 41a detects that the driver is monitoring the surroundings. Automated driving proceeds essentially as described above; the configuration and control for realizing it are described below.
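The repeated notification until the grip sensor responds can be sketched as follows; the function and its parameters are hypothetical illustrations of the logic described above, not the patent's implementation:

```python
def request_takeover(notify, grip_detected, attempts=5):
    """Hypothetical sketch: keep notifying the driver to hold the steering
    wheel until the grip sensor responds. If the attempt limit is reached,
    return False so the caller can trigger a fallback such as stopping the
    vehicle at a safe place.

    notify:        callable taking a message string (sound/display/vibration)
    grip_detected: callable returning True once the driver grips the wheel
    """
    for _ in range(attempts):
        if grip_detected():
            return True           # driver holds the wheel: switch states
        notify("Please hold the steering wheel")
    return False                  # limit reached: e.g. stop at a safe place
```

The same loop shape would apply to the surroundings-monitoring notification, with the driver state detection camera as the condition.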
Constitution of vehicle control device
Fig. 1 is a block diagram of a vehicle control device according to an embodiment of the present invention, which controls a vehicle 1. Fig. 1 shows an outline of a vehicle 1 in a plan view and a side view. As an example, the vehicle 1 is a sedan-type four-wheeled passenger vehicle.
The control device of fig. 1 comprises a control unit 2. The control unit 2 includes a plurality of ECUs 20 to 29 connected to be communicable via an in-vehicle network. Each ECU includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like. The storage device stores a program executed by the processor, data used by the processor for processing, and the like. Each ECU may be provided with a plurality of processors, storage devices, interfaces, and the like.
The functions and responsibilities assigned to the ECUs 20 to 29 are described below. The number of ECUs and the functions assigned to them may be designed as appropriate for the vehicle 1, and may be subdivided further or integrated relative to the present embodiment.
The ECU20 executes control related to automated driving of the vehicle 1. In the automatic driving, at least one of steering, acceleration, and deceleration of the vehicle 1 is automatically controlled. In the control example described later, both steering and acceleration/deceleration are automatically controlled.
The ECU21 is a steering ECU that controls the steering device 3. The steering device 3 includes a mechanism that steers the front wheels in accordance with the driver's driving operation (steering operation) of the steering wheel 31. The steering device 3 is an electric power steering device and includes a motor that generates a driving force to assist the steering operation or to steer the front wheels automatically, a sensor that detects the steering angle, and the like. When the driving state of the vehicle 1 is automated driving, the ECU21 automatically controls the steering device 3 in accordance with instructions from the ECU20 and controls the traveling direction of the vehicle 1.
The ECU22 and ECU23 control the detection units 41 to 43, which detect the surrounding situation of the vehicle, and process their detection results. The surrounding situation is also referred to as the ambient condition, external environment, or the like, and the information obtained by detecting it is referred to as surrounding situation information, external environment information, or the like. The detection units for the surrounding situation and the ECUs that control them are also collectively referred to as a periphery monitoring device or periphery monitoring unit. The detection unit 41 is a camera (hereinafter sometimes referred to as the camera 41) that captures images ahead of the vehicle 1; in the present embodiment, two such cameras are provided in the cabin of the vehicle 1. By analyzing the images captured by the camera 41, the outline of a target object and lane markings (white lines, etc.) on the road can be extracted. The detection unit 41a is a camera for detecting the driver's state (hereinafter sometimes referred to as the driver state detection camera 41a); it is installed so as to be able to capture the driver's facial expression and is connected to an ECU (not shown) that processes its image data. A steering wheel grip sensor (not shown) is also provided as a sensor for detecting the driver's state; this makes it possible to detect whether the driver is holding the steering wheel. The driver state detection camera 41a and the steering wheel grip sensor 210I are collectively referred to as a driver state detection unit.
The detection unit 42 is an optical radar (hereinafter sometimes referred to as the optical radar 42) that detects a target object around the vehicle 1 and measures the distance to it. In the present embodiment, five optical radars 42 are provided: one at each corner of the front portion of the vehicle 1, one at the center of the rear portion, and one on each side of the rear portion. The detection unit 43 is a millimeter wave radar (hereinafter sometimes referred to as the radar 43) that detects a target object around the vehicle 1 and measures the distance to it. In the present embodiment, five radars 43 are provided: one at the center of the front portion of the vehicle 1, one at each corner of the front portion, and one at each corner of the rear portion.
The ECU22 controls one of the cameras 41 and each optical radar 42 and processes their detection results. The ECU23 controls the other camera 41 and each radar 43 and processes their detection results. Providing two sets of devices for detecting the vehicle's surroundings improves the reliability of the detection results, and providing different types of detection units, such as cameras, optical radars, and radars, allows the vehicle's surrounding environment (also called the surrounding situation) to be analyzed in multiple ways.
The ECU24 performs control of the gyro sensor 5, the GPS sensor 24b, and the communication device 24c and information processing of the detection result or the communication result. The gyro sensor 5 detects a rotational motion of the vehicle 1. The travel path of the vehicle 1 can be determined based on the detection result of the gyro sensor 5, the wheel speed, and the like. The GPS sensor 24b detects the current position of the vehicle 1. The communication device 24c wirelessly communicates with a server that provides map information and traffic information, and acquires these pieces of information. The ECU24 can access the database 24a of map information constructed in the storage device, and the ECU24 performs route search from the current position to the destination, and the like.
The ECU25 includes a communication device 25a for vehicle-to-vehicle communication. The communication device 25a performs wireless communication with other vehicles in the vicinity, and performs information exchange between the vehicles.
The ECU26 controls the power plant (i.e., the running driving force output device) 6. The power plant 6 is a mechanism that outputs the driving force that rotates the driving wheels of the vehicle 1 and includes, for example, an engine and a transmission. The ECU26 controls the output of the engine in accordance with the driver's driving operation (accelerator operation) detected by an operation detection sensor (i.e., accelerator opening sensor) 7A provided on the accelerator pedal 7A, and switches the transmission gear based on information such as the vehicle speed detected by the vehicle speed sensor 7c. When the driving state of the vehicle 1 is automated driving, the ECU26 automatically controls the power plant 6 in accordance with instructions from the ECU20 and controls the acceleration and deceleration of the vehicle 1. The acceleration in each direction and the angular velocity detected by the gyro sensor 5, the vehicle speed detected by the vehicle speed sensor 7c, and the like are information indicating the running state of the vehicle, and these sensors are collectively referred to as a running state monitoring unit. The operation detection sensor 7A of the accelerator pedal 7A and the operation detection sensor (i.e., brake depression amount sensor) 7B of the brake pedal 7B, described later, could also be included in the running state monitoring unit, but in this example they are referred to, together with detection units (not shown) that detect the operation states of other devices, as an operation state detection unit.
The ECU27 controls lighting devices (headlamps, tail lamps, etc.) including the direction indicator 8. In the case of the example of fig. 1, the direction indicator 8 is provided at the front, the door mirror, and the rear of the vehicle 1.
The ECU28 controls the input/output device 9. The input/output device 9 outputs information to the driver and accepts input from the driver. The sound output device 91 reports information to the driver by sound, and the display device 92 does so by displaying images. The display device 92 is disposed, for example, in front of the driver's seat and constitutes an instrument panel or the like. Although sound and display are given as examples here, information may also be reported by vibration or light, and several of sound, display, vibration, and light may be combined. The combination, or the reporting method, may differ depending on the level (e.g., degree of urgency) of the information to be reported. The input device 93 is a group of switches disposed within the driver's reach for giving instructions to the vehicle 1, and may also include a voice input device. The input device 93 further includes a cancel switch for manually lowering the level of the automated driving control state, and an automated driving changeover switch for switching from manual driving to automated driving. A driver who wants to lower the level of the automated driving control state can do so by operating the cancel switch; in the present embodiment, the level can be lowered with the same cancel switch regardless of the current level of the automated driving control state.
The ECU29 controls the brake device 10 and a parking brake (not shown). The brake device 10 is, for example, a disc brake device, is provided to each wheel of the vehicle 1, and decelerates or stops the vehicle 1 by applying resistance to rotation of the wheel. The ECU29 controls the operation of the brake device 10, for example, in accordance with a driving operation (braking operation) of the driver detected by an operation detection sensor 7B provided on the brake pedal 7B. When the driving state of the vehicle 1 is the automatic driving, the ECU29 automatically controls the brake device 10 in accordance with an instruction from the ECU20, and controls the deceleration and stop of the vehicle 1. The brake device 10 and the parking brake can also be operated to maintain the stopped state of the vehicle 1. In addition, when the transmission of the power unit 6 includes the parking lock mechanism, the parking lock mechanism may be operated to maintain the stopped state of the vehicle 1.
Vehicle control System
Fig. 2 shows a functional configuration of the control unit 2 in the present embodiment. The control unit 2 is also referred to as a vehicle control system, and each function block shown in fig. 2 is realized by each ECU including the ECU20 executing a program or the like. In fig. 2, a vehicle 1 is mounted with a detection device DD including a camera 41, an optical radar 42, a radar 43, and the like, a navigation device 50, communication devices 24B, 24c, and 25a, a vehicle sensor 60 including a gyro sensor 5, a steering wheel grip sensor, a driver state detection camera 41a, and the like, an accelerator pedal 7A, an accelerator opening sensor 7A, a brake pedal 7B, a brake depression amount sensor 7B, a display device 92, a speaker 91, a switch 93 including an automatic driving changeover switch, a vehicle control system 2, a driving force output device 6, a steering device 3, and a brake device 220. These apparatuses and devices are connected to each other by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication Network, or the like.
The navigation device 50 includes a GNSS (Global Navigation Satellite System) receiver, map information (a navigation map), a touch-panel display device functioning as a user interface, a speaker, a microphone, and the like. The navigation device 50 determines the position of the host vehicle 1 with the GNSS receiver and derives a route from that position to a destination designated by the user. The route derived by the navigation device 50 is provided to the target lane determining unit 110 of the vehicle control system 2. The configuration for determining the position of the host vehicle 1 may also be provided independently of the navigation device 50.
The communication devices 24b, 24c, and 25a perform wireless communication using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communications), or the like.
The vehicle sensor 60 includes a vehicle speed sensor that detects the vehicle speed, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an orientation sensor that detects the orientation of the host vehicle 1, and the like. All or some of these are realized by the gyro sensor 5. The steering wheel grip sensor and the driver state detection camera 41a, not shown, may also be included in the vehicle sensor 60.
The accelerator pedal 7A is an operation member that receives an acceleration instruction from the driver (or a deceleration instruction through a return operation). The accelerator opening sensor 7A detects the depression amount of the accelerator pedal 7A and outputs an accelerator opening signal indicating that amount to the vehicle control system 2. Instead of being output to the vehicle control system 2, the signal may be output directly to the running driving force output device 6, the steering device 3, or the brake device 220. The same applies to the other driving operation system components described below.
The brake pedal 7B is an operation member for receiving a deceleration instruction made by the driver. The brake depression amount sensor 7B detects a depression amount (or a depression force) of the brake pedal 7B, and outputs a brake signal indicating a detection result to the vehicle control system 2.
The display device 92 is, for example, an LCD (Liquid Crystal Display) or organic EL (Electroluminescence) display attached to a part of the instrument panel or to any location facing the front passenger seat or rear seats. The display device 92 may also be a HUD (Head Up Display) that projects an image onto the front windshield or another window. The speaker 91 outputs sound.
The running drive force output device 6 outputs a running drive force (torque) for running the vehicle to the drive wheels. The travel driving force output device 6 includes, for example, an engine, a transmission, and an engine ECU (Electronic Control Unit) that controls the engine. The travel driving force output device 6 may be an electric motor or a hybrid mechanism combining an internal combustion engine and an electric motor.
The brake device 220 is, for example, an electric servo brake device including a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake control unit. The brake control unit of the electric servo brake device controls the electric motor based on the information input from the travel control unit 160, and outputs a brake torque corresponding to a brake operation to each wheel. Further, the brake device 220 may include a regenerative brake based on a travel motor that can be included in the travel driving force output device 6.
Steering device
Next, the steering device 3 will be explained. The steering device 3 includes, for example, the steering ECU21 and an electric motor. The electric motor changes the direction of the steered wheels by, for example, applying force to a rack-and-pinion mechanism. The steering ECU21 drives the electric motor based on information input from the vehicle control system 2, or based on the input steering angle or steering torque, and changes the direction of the steered wheels.
Fig. 3 is a diagram showing an example of the configuration of the steering device 3 according to the present embodiment. The steering device 3 includes, but is not limited to, a steering wheel 31, a steering shaft 210B, a steering angle sensor 210C, a steering torque sensor 210D, a reaction force motor 210E, an assist motor 210F, a steering mechanism 210G, a steering angle sensor 210H, a steering wheel grip sensor 210I, steered wheels 210J, and the steering ECU21. The steering ECU21 includes a steering reaction force setting unit 210M and a storage unit 210N.
The steering wheel 31 is an example of an operation device that accepts steering instructions from the driver. A steering input applied to the steering wheel 31, that is, a steering operation, is transmitted to the steering shaft 210B. The steering angle sensor 210C and the steering torque sensor 210D are mounted on the steering shaft 210B. The steering angle sensor 210C detects the angle through which the steering wheel 31 is operated and outputs it to the steering ECU21. The steering torque sensor 210D detects the torque acting on the steering shaft 210B (the steering torque), i.e., the torque the driver applies to the steering shaft 210B by turning the steering wheel 31, and outputs it to the steering ECU21. Under the control of the steering ECU21, the reaction force motor 210E applies a steering reaction force to the steering wheel 31 by outputting torque to the steering shaft 210B. That is, in each automated driving control state, the reaction force motor 210E applies to the steering shaft 210B a predetermined steering reaction force for maintaining the steering commanded during automated driving (also called the system steering). The steering reaction force acts as a torque that resists the driver's steering operation. Therefore, for the driver to override the system's steering, a torque exceeding the steering reaction force generated in response to the steering input must be applied to the steering shaft 210B.
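The override condition described above, that the driver's torque must exceed the steering reaction force, can be sketched as a simple threshold check. The function name, the margin, and its value are hypothetical illustrations, not from the patent:

```python
def is_override(driver_torque_nm, reaction_torque_nm, margin_nm=0.5):
    """Hypothetical sketch: treat the driver's input as an override only
    when the torque applied to the steering shaft exceeds the steering
    reaction force plus a small margin (margin value is an assumption)."""
    return abs(driver_torque_nm) > abs(reaction_torque_nm) + margin_nm
```

A margin keeps incidental contact with the wheel from being misread as an intentional override.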
The assist motor 210F assists the steering by outputting torque to the steering mechanism 210G under the control of the steering ECU21. The assist motor 210F not only assists the driver's operation during manual driving but also performs steering without driver operation under the control of the travel control unit 160 during automated driving. The steering mechanism 210G is, for example, a rack and pinion mechanism. The steering angle sensor 210H detects an amount (for example, a rack stroke) indicating the angle (steering angle) at which the steered wheel 210J is driven by the steering mechanism 210G, and outputs the detected amount to the steering ECU21. The steering shaft 210B and the steering mechanism 210G may be fixedly coupled, may be decoupled, or may be coupled via a clutch mechanism or the like.
The steering wheel grip sensor 210I may be a pressure sensor that is provided at a predetermined position of the rim portion of the steering wheel 31 and measures a pressure (hereinafter, also referred to as a grip force) applied to the rim by the grip of the driver when the driver grips the rim of the steering wheel 31. The steering wheel grip sensor 210I outputs the measured gripping force to the steering ECU21. The steering ECU21 performs the various controls described above in cooperation with the vehicle control system 2.
The steering reaction force setting unit 210M refers to the reaction force distribution information 210P in the storage unit 210N in the steering ECU21, using as an index value of the steering input in the automated driving control state the difference between the steering angle (emergency control steering angle) detected by the steering angle sensor 210C and the system steering angle (for example, the steering angle determined by the travel control unit 160) acquired from the vehicle control system 2. The reaction force distribution information 210P is configured, for example, as a reaction force table indicating the correspondence between the steering reaction force and the steering angle difference between the emergency control steering angle and the system steering angle. The steering reaction force setting unit 210M reads the steering reaction force corresponding to the steering angle difference from the reaction force table of the reaction force distribution information 210P in the storage unit 210N. The steering ECU21 then drives the reaction force motor 210E so that the steering reaction force of the value read from the storage unit 210N by the steering reaction force setting unit 210M is applied to the steering shaft 210B. In the manual driving control state, reaction force distribution information predetermined for manual driving is prepared, and the reaction force is applied based on that information. As in this example, when the steering shaft 210B is connected to the steering mechanism 210G, the mechanical reaction force from the steered wheel 210J is transmitted to the steering wheel 31, so the reaction force need not necessarily be applied by the motor.
However, in a steer-by-wire configuration in which the steering shaft 210B is not mechanically connected to the steering mechanism 210G, a reaction force may be generated from a reaction force distribution that simulates a mechanical reaction force in order to give a steering feeling to the driver. In this example, the reaction force is applied so as to have a characteristic corresponding to the automated driving control state. The setting of the reaction force will be described again with reference to figs. 3 to 5. The steering angle, torque, steering speed, and the like are collectively referred to as a steering amount, and the steering amount determined by the travel control unit 160 is sometimes referred to as a system steering amount.
According to the above configuration, the steering reaction force applied to the steering wheel 31 is determined in accordance with both the automated driving control state and the difference between the system steering angle and the steering angle produced by the driver's emergency control operation on the steering wheel 31 in the automated driving control state. In this case, the higher the level of the automated driving control state, the greater the reaction force. As a result, emergency control can be made difficult to perform when the level of the automated driving control state is high, and easy to perform when the level is low.
In the automated driving control state, the steering reaction force setting unit 210M refers to the reaction force distribution information 210P in the storage unit 210N each time the steering ECU21 reads the system steering angle and the emergency control steering angle. The steering reaction force setting unit 210M reads the steering reaction force corresponding to the difference between the system steering angle and the emergency control steering angle thus read and to the level of the automated driving control state, and outputs a control signal for applying that steering reaction force to the reaction force motor 210E.
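The table lookup performed by the steering reaction force setting unit 210M can be sketched as follows. The table values, the per-level structure, and the function names are illustrative assumptions: the patent specifies only that the reaction force is determined from the steering angle difference and the level of the automated driving control state, with higher levels producing a larger force.

```python
# Hypothetical model of the reaction force distribution information 210P:
# one table per control-state level, mapping steering-angle-difference
# thresholds (degrees) to reaction torques (arbitrary units). All numbers
# are illustrative, not taken from the patent.
REACTION_TABLE = {
    1: [(5, 0.5), (10, 1.0), (20, 2.0)],   # first control state (lowest automation)
    2: [(5, 1.0), (10, 2.0), (20, 4.0)],   # second control state
    3: [(5, 2.0), (10, 4.0), (20, 8.0)],   # third control state (highest automation)
}

def steering_reaction_force(level, system_angle, override_angle):
    """Return the reaction torque for the given control-state level and the
    difference between the system steering angle and the steering angle
    produced by the driver's intervention (emergency control) operation."""
    diff = abs(override_angle - system_angle)
    for threshold, torque in REACTION_TABLE[level]:
        if diff <= threshold:
            return torque
    # Beyond the table range, clamp to the largest entry for that level.
    return REACTION_TABLE[level][-1][1]
```

With this shape, the same steering angle difference yields a larger resisting torque at a higher automation level, mirroring the behavior described above.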
Further, when an emergency control operation of the steering (that is, an intervention operation by the driver) is performed, information indicating the emergency control is transferred from the steering device 3 to the vehicle control system 2. In the simplest case, the emergency control information includes information indicating that emergency control was executed and its direction. It may further include the steering angle difference between the system steering angle and the steering angle generated by the intervention operation; the steering angle difference can indicate the steering direction by its sign, for example. The emergency control information may also include the steering torque. The emergency control information is reflected in the track information generated by the track generation unit 146.
Vehicle control system (continued)
Returning to fig. 2, the vehicle control system 2 includes, for example, a target lane determining unit 110, an automated driving control unit 120, a travel control unit 160, an HMI (human machine interface) control unit 170, and a storage unit 180. The automated driving control unit 120 includes, for example, an automated driving state control unit 130, a vehicle position recognition unit 140, an external environment recognition unit 142, an action plan generating unit 144, a trajectory generation unit 146, and a switching control unit 150. Some or all of the target lane determining unit 110, the automated driving control unit 120, the travel control unit 160, and the HMI control unit 170 are realized by a processor executing a program (software). Some or all of these elements may be realized by hardware such as an LSI (Large Scale Integration) or an ASIC (Application Specific Integrated Circuit), or by a combination of software and hardware.
The storage unit 180 stores information such as high-precision map information 182 including information on the centers and boundaries of lanes, target lane information 184, intervention information 185, and action plan information 186. The target lane determining unit 110 divides the route provided by the navigation device 50 into a plurality of sections (for example, every 100 m in the vehicle traveling direction), and determines the target lane for each section with reference to the high-precision map information 182. The target lane determining unit 110 determines, for example, to travel in the first lane from the left. For example, when there is a branch point, a merge point, or the like on the route, the target lane determining unit 110 determines the target lane so that the host vehicle 1 can travel on a reasonable travel route for traveling beyond the branch point. The target lane determined by the target lane determining unit 110 is stored in the storage unit 180 as target lane information 184.
The automated driving state control unit 130 determines the automated driving control level (also referred to as an automation level, in view of the automation rate of each state) of the automated driving performed by the automated driving control unit 120. The automated driving control state in the present embodiment includes the following control states. The following is merely an example, and the number of control states of automated driving can be determined arbitrarily.
Transitions between automated driving control states
In the present embodiment, there are the zeroth to third control states as the automated driving control states, and the automation rate becomes higher in this order. The zeroth control state is the control state of manual driving: a control state in which manual driving by the driver is required, with no driving assistance at all. When the driver explicitly instructs automated driving in the zeroth control state by, for example, a switch operation, the automated driving control state transitions to the first control state or the second control state according to the conditions at that time, for example, the external environment, vehicle information, and the like. The vehicle control system 2 refers to external environment information, travel state information, and the like to determine which control state to transition to.
The first control state is the lowest level of automated driving control state (the lowest automation rate). When automated driving is instructed, for example when the current position cannot be recognized, or in an environment where the second control state cannot be applied even if the current position can be recognized (for example, a general road), automated driving is started in the first control state. The automated functions implemented in the first control state include LKAS, ACC, and the like. When transitioning to the first control state, the driver state detection unit may detect whether the driver is monitoring the outside, particularly the front, and also whether the driver is gripping the steering wheel; in this case, the transition is performed if these conditions are satisfied. The monitoring of the driver may also be continued while the vehicle remains in the first control state. In addition, when the automated driving control state shifts from a lower level to a higher level, the tasks requested of the driver are unchanged or reduced, so the driver's state need not be used as a condition for the shift. Note that the difference between the zeroth control state and the first control state is not limited to the above; for example, only one of LKAS and ACC may be usable in the zeroth control state and both in the first control state, or the operating scenes of LKAS and ACC may be wider in the first control state than in the zeroth control state.
The second control state is the automated driving control state one level above the first control state. For example, when an instruction for automated driving is received in the zeroth control state and the external environment at that time is a predetermined environment (for example, traveling on an expressway), the control state transitions to the second control state. Alternatively, if the external environment is detected to be the above-described predetermined environment during automated driving in the first control state, the vehicle automatically transitions to the second control state. The external environment can be determined by referring to the current position and map information in addition to the monitoring result of the periphery monitoring unit, which includes, for example, a camera. In the second control state, in addition to lane keeping, a function of performing a lane change or the like in accordance with a target object such as a surrounding vehicle is provided. If the conditions for maintaining the second control state are lost, the automation level of the vehicle 1 is changed to the first control state by the vehicle control system 2. In the second control state, the driver need not hold the steering wheel (called "hands off"), and only monitoring of the surroundings is required of the driver. Therefore, in the second control state, the driver state detection camera 41a monitors whether the driver is monitoring the outside and, for example, outputs a warning if the driver is not.
The third control state is the automated driving control state one level above the second control state. The transition to the third control state can be made from the second control state, and can also be made from the zeroth or first control state without passing through the second control state. The transition to the third control state is not triggered by an instruction from the driver, but is performed by automatic control by the vehicle control system 2 when it is determined that certain conditions are satisfied. For example, if congestion occurs while the vehicle is automatically driven in the second control state and the vehicle is following the preceding vehicle at low speed, the control state switches from the second control state to the third control state. This determination is made based on the output of the periphery monitoring unit such as a camera, the vehicle speed, and the like. While the conditions of the second control state are satisfied, for example while the vehicle is traveling on an expressway, the automated driving control state shifts between the second control state and the third control state. In the third control state, the driver need neither hold the steering wheel nor monitor the surroundings. However, a situation in which the driver must take over driving may occur at any time and under any control conditions. Therefore, in order to determine whether the driver can take over driving normally, it is constantly monitored during automated driving, for example, whether the driver's line of sight is within a predetermined range (for example, the display portion of the meter or the navigation device). The state of the driver can also be monitored during manual driving.
The automated driving state control unit 130 determines a control state of automated driving based on the driver's operation of each component of the driving operation system, the event determined by the action plan generating unit 144, the travel pattern determined by the trajectory generation unit 146, and the like, and transitions to the determined control state. The HMI control unit 170 is notified of the automated driving control state. In any control state, it is possible to override the automated driving by manual driving (emergency control) through operations on the components of the driving operation system. In the above description, the steering reaction force setting unit 210M determines the reaction force based on the steering angle difference and the automated driving control state, but the automated driving state control unit 130 may instead set a reaction force table corresponding to each change of the control state. In that case, the steering reaction force setting unit 210M can determine the steering reaction force without considering the automated driving control state.
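The driver tasks required in each control state described above can be summarized as sets, which also makes explicit why the driver's state need not be re-checked when shifting to a higher level. The task names and the subset check below are assumptions made for illustration; the patent defines the states only in prose.

```python
# Illustrative mapping from automated driving control state to the tasks
# requested of the driver. Task names are hypothetical labels for the
# requirements described in the text (hands on wheel, surroundings
# monitoring, full manual operation).
DRIVER_TASKS = {
    0: {"steer", "accelerate", "brake", "monitor_surroundings", "grip_wheel"},  # manual driving
    1: {"monitor_surroundings", "grip_wheel"},   # first control state (LKAS/ACC)
    2: {"monitor_surroundings"},                 # second control state ("hands off")
    3: set(),                                    # third control state (takeover may still be requested)
}

def transition_skips_driver_check(src, dst):
    """Shifting to a higher level never adds driver tasks, so the driver's
    state need not be used as a transition condition in that direction."""
    return DRIVER_TASKS[dst] <= DRIVER_TASKS[src]  # set inclusion
```

For example, a shift from the first to the second control state drops the grip requirement, so the check passes; the reverse shift adds a task and would require confirming the driver's state.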
The vehicle position recognition unit 140 of the automated driving control unit 120 recognizes the lane (traveling lane) in which the host vehicle 1 is traveling and the relative position of the host vehicle 1 with respect to the traveling lane, based on the high-accuracy map information 182 stored in the storage unit 180 and the information input from the optical radar 42, the radar 43, the camera 41, the navigation device 50, or the vehicle sensor 60.
The vehicle position recognition unit 140 recognizes the traveling lane by comparing the pattern of road dividing lines (for example, the arrangement of solid lines and broken lines) recognized from the high-accuracy map information 182 with the pattern of road dividing lines around the vehicle 1 recognized from the image captured by the camera 41, for example. This recognition may also be based on the position of the host vehicle 1 acquired from the navigation device 50 and on the result of processing by an inertial navigation system, if one is present. The travel control unit 160 controls the travel driving force output device 6, the steering device 3, and the brake device 220 so that the host vehicle 1 passes through the track generated by the track generation unit 146 at the predetermined timing. The HMI control unit 170 causes the display device 92 to display video and images, and causes the speaker 91 to output audio. The travel control unit 160 determines a steering angle (system steering angle) for performing automated driving in accordance with, for example, the action plan information 186, and inputs that steering angle to the steering device 3 to perform steering control.
The external environment recognition unit 142 recognizes the position, speed, acceleration, and other states of target objects such as neighboring vehicles based on information input from the camera 41, the optical radar 42, the radar 43, and the like. The external environment recognition unit 142 may also recognize the positions of other objects such as guardrails, utility poles, parked vehicles, and pedestrians. The surroundings of the vehicle thus recognized by the external environment recognition unit 142 are collectively referred to as the traveling environment, and this information is referred to as traveling environment information.
The action plan generating unit 144 sets a start point of the automated driving and/or a destination of the automated driving. The starting point of the automated driving may be the current position of the host vehicle 1 or a point where an operation for instructing the automated driving is performed. The action plan generating unit 144 generates an action plan in a section between the start point and the destination of the automated driving. Further, the action plan generating unit 144 may generate an action plan for an arbitrary section.
The action plan is composed, for example, of a plurality of events that are executed in sequence. Examples of the events include a deceleration event for decelerating the host vehicle 1, an acceleration event for accelerating the host vehicle 1, a lane keeping event for causing the host vehicle 1 to travel without deviating from the travel lane, a lane change event for changing the travel lane, a passing event for causing the host vehicle 1 to pass a preceding vehicle, a branch event for moving the host vehicle 1 to a desired lane at a branch point or for causing it to travel without deviating from the current travel lane, an entry event for accelerating or decelerating the host vehicle 1 in an entry lane for merging into a main line and changing the travel lane, and a handover (hand over) event for shifting from the automated driving control state to the manual driving control state at a predetermined point where automated driving ends. The action plan generating unit 144 sets a lane change event, a branch event, or an entry event at the position where the target lane determined by the target lane determining unit 110 is switched. Information indicating the action plan generated by the action plan generating unit 144 is stored in the storage unit 180 as action plan information 186.
The switching control unit 150 switches between the automated driving control state and the manual driving control state based on a signal input from the automated driving changeover switch 93. Further, the switching control unit 150 switches from automated driving (the third to first control states) to manual driving (the zeroth control state) based on operation of the brake pedal 7B. In this example, when a brake operation is performed, the switching control unit 150 switches from the automated driving control state to the manual driving control state after a suspension time and a warning corresponding to the current automated control state. For the steering operation and the accelerator operation, on the other hand, emergency control is performed by the manual operation while automated driving is maintained. In this emergency control, for example, when the steering operation amount exceeds a predetermined emergency control threshold value, travel control is performed as if the system had switched to manual driving.
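The threshold decision just described can be sketched as follows. The threshold value and function names are assumptions for illustration; the patent states only that a predetermined emergency control threshold on the operation amount decides whether travel control follows the manual input.

```python
# Minimal sketch of the emergency control (override) decision for steering:
# automated driving is maintained, but once the driver's operation amount
# exceeds a predetermined threshold, travel control behaves as if switched
# to manual driving. The threshold value is an illustrative assumption.
STEERING_OVERRIDE_THRESHOLD = 5.0  # degrees of driver-added steering

def control_source(system_angle, driver_angle):
    """Return which input governs the steering during automated driving."""
    if abs(driver_angle - system_angle) > STEERING_OVERRIDE_THRESHOLD:
        return "manual"   # driver override: treat as manual driving
    return "system"       # automated driving continues on the system steering angle
```

A small steering input below the threshold is absorbed by the system, while a larger one takes effect as emergency control.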
Track information generation process
Next, a part of the procedure of the track information generation processing performed by the track generation unit 146 will be described. As described above, during automated travel, the travel control unit 160 controls the vehicle so that it travels along the track information generated by the track generation unit 146. This procedure is executed by, for example, the ECU20. Fig. 4 (a) is a flowchart showing an outline of the track information generation process. First, track information is generated and stored based on the traveling environment information acquired by the external environment recognition unit 142, the vehicle position acquired by the vehicle position recognition unit 140, and the like (S401). Next, the intervention information of the current user is acquired with reference to the intervention information 185 (S403). The intervention information includes, for example, an offset that the driver manually generated during automated driving; the offset is a deviation from the center of the lane in the width (or lateral) direction of the lane. The intervention information may be set for each driver: to provide individual settings, setting information for a plurality of persons may be stored, and any one of them selected and applied. When intervention information is stored per user, the intervention information of the user selected in advance is referred to, as described above. Of course, it is also possible to store only one person's intervention information, in which case no selection by the user is required. Although the intervention information here includes an offset, it is not limited to an offset: when the trajectory determined by the automated driving function is changed by the driver's emergency control operation, any information indicating the amount of change may serve as intervention information.
In this example, three pieces of information are included: an offset for straight travel, an offset for a curve, and an obstacle flag. The offset for straight travel is the difference between the position (or track) in the road width direction determined by the track generation unit 146 when the vehicle travels straight under automated driving and the position (track) actually traveled after the change by the steering emergency control, that is, the driver's intervention operation. If there is no obstacle or the like, the track generation unit 146 normally selects the center of the lane as the default travel track. The offset for a curve is the difference between the position (or track) in the road width direction determined by the track generation unit 146 when the vehicle travels through a curve under automated driving and the position (track) actually traveled after the change by the driver's intervention operation. The reason for treating the curve separately from straight travel is that, while the offset can be kept substantially constant during straight travel, on a curved road the deviation from the entrance to the exit of the curve is not constant because of the curvature of the track. Therefore, in the present example, the difference between the track generated by the track generation unit 146 and the track after the intervention operation, taken at the position closest to the inner boundary of the lane (this position is referred to as the curve center), is used as the offset for the curve. The obstacle flag is a flag indicating that the intervention operation is estimated to have been performed in order to move away from an obstacle.
If the intervention information is acquired, the track information generated in step S401 is corrected based on the acquired intervention information (S405). Here, the track information may be regenerated in consideration of the intervention information. Then, the corrected track information is delivered to the travel control unit 160 (S407). Of course, the corrected track information may be stored in the storage unit 180, and the travel control unit 160 may refer to the track information. The travel control unit 160 controls the steering device 3 and the like so that the vehicle 1 travels along the acquired track information. As a result, the vehicle travels on the track changed by the intervention of the driver.
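The correction of steps S401 to S407 above can be sketched as a lateral shift of the generated track. The data shapes and field names below are illustrative assumptions; the patent specifies only that the offset is a deviation from the lane center in the road width direction.

```python
# A minimal sketch of steps S401-S407: correct the generated track with the
# stored per-driver intervention offset before handing it to travel control.
def correct_track(track, intervention):
    """track: list of (longitudinal, lateral) points along the lane center.
    intervention: dict possibly holding a 'straight_offset' (S403).
    Returns the track shifted laterally by the offset (S405)."""
    offset = intervention.get("straight_offset", 0.0)
    return [(x, y + offset) for (x, y) in track]

# S401: default track along the lane center (lateral = 0).
track = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
# S403: the current driver previously drove 0.3 m left of center.
corrected = correct_track(track, {"straight_offset": 0.3})
```

The corrected track is what would be delivered to the travel control unit 160 in S407; with no stored intervention information, the track passes through unchanged.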
Fig. 5 shows the processing steps for storing the intervention information 185. These steps may be performed by the automated driving control unit 120, particularly the trajectory generation unit 146, or by a dedicated module. The steps of fig. 5 are performed upon receiving emergency control information from the steering device 3. First, it is determined whether avoidance control is in operation (S500). The avoidance control includes, for example, at least one of a driving attention warning, collision avoidance control, and road departure prevention. Here, the determination covers not only the case where the avoidance control is currently operating but also the case where it operated in the very recent past (within a predetermined time, for example, within five minutes). For this purpose, when avoidance control operates, that fact and the time of the determination may be stored in advance and referred to in step S500. Next, it is determined whether an obstacle is present on the side opposite to the driver's steering direction (S501). In automated driving, a trajectory avoiding an obstacle is generated, but the driver may perform an emergency control operation to move still further away from the obstacle; step S501 makes this determination. The obstacle is recognized by the external environment recognition unit 142. If it is determined in step S501 that an obstacle is present on the side opposite to the steering direction, it is determined that the driver intends to move away from the obstacle, and the obstacle flag is set (S502).
Next, it is determined whether the road on which the vehicle is traveling is approaching a curve, based on, for example, the generated track information (S503). If it is determined that the road is not a curve, the current offset value is overwritten into the intervention information 185 as the intervention information for a straight path (S505). The offset value may be the distance between the track stored in step S401 and the track traveled as a result of the intervention operation. The track on which the vehicle actually travels (or its position in the width direction) can be determined based on, for example, a lane boundary included in an image captured by a camera. Overwriting means that the information at the time the intervention operation ends is stored as the latest information. If it is determined in step S503 that the road is a curve, the process proceeds to step S507.
In step S507, it is determined whether a scheduled time for reaching the curve center, which is the reference position for determining the offset in a curve, is stored. If the scheduled time is stored, it is determined whether that time has arrived (or passed), that is, whether the curve center has been reached (S509). If so, the offset at that time is stored as the intervention information for the curve (S511). Note that, although overwritten in the same way, the offset for straight travel and the offset for a curve are stored separately. If it is determined in step S507 that no scheduled time for reaching the curve center is stored, the scheduled time for reaching the curve center is estimated based on the map information, the current speed, and the like (S513). If the curve center has already been passed, the process ends without further action (YES in S515); if not, the estimated time is stored as the scheduled time for reaching the curve center (S517).
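The branch structure of steps S500 to S511 described above can be sketched as follows. All field names and the flat argument list are illustrative assumptions; the timing and estimation details of S507 to S517 are deliberately simplified out.

```python
# Hedged sketch of the storing procedure of fig. 5 (S500-S511): decide,
# for one received piece of emergency control information, what to record
# in the intervention information.
def store_intervention(info, *, avoidance_active, obstacle_opposite_steering,
                       in_curve, at_curve_center, offset):
    """Update the intervention information dict in place, mirroring the
    branch structure described in the text, and return it."""
    if avoidance_active:            # S500: avoidance control operated - do not learn
        return info
    if obstacle_opposite_steering:  # S501/S502: driver moved away from an obstacle
        info["obstacle_flag"] = True
    if not in_curve:                # S503/S505: overwrite the straight-path offset
        info["straight_offset"] = offset
    elif at_curve_center:           # S507-S511: store the offset at the curve center
        info["curve_offset"] = offset
    return info
```

For instance, an intervention during straight travel with avoidance control recently active records nothing, while the same intervention without avoidance control overwrites the straight-path offset.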
The intervention information can be set by the above procedure. Since the intervention information set in this way is an offset, the track generation unit 146 may change the track information in step S405 so as to shift the track by the offset. Note that, although nothing is done to the track information when it is determined in step S500 that the avoidance control has operated, control may instead be performed so as to increase the steering reaction force. That is, in this case the steering device 3 is instructed to change the reaction force characteristic to one in which a large reaction force is generated even for a small steering angle difference.
Fig. 4 (B) is executed when the obstacle that was the object of the determination in step S501 disappears. The executing entity may be the automated driving control unit 120, for example the track generation unit 146. The disappearance of the obstacle can be determined based on, for example, a change in the traveling environment information recognized by the external environment recognition unit 142. First, the obstacle flag is checked (S411). If the obstacle flag is on, that is, if it can be estimated that the driver performed the intervention operation in order to move away from the obstacle, the intervention information of the current user is deleted and the obstacle flag is reset (S413). This eliminates the offset caused by the intervention operation, and the track returns to the track that would have been taken without the intervention. In this way, an intervention made to move away from an obstacle is not reflected in the determination of the trajectory in other situations.
The intervention information may also be deleted after a predetermined period, not only in the case of an offset due to an obstacle. The predetermined period may be a predetermined time or distance, or a period during which some other predetermined condition holds. When the predetermined period is a time, it is measured as follows. For example, when the steering torque sensor 210D detects that the driver steered manually during automated driving, the system waits for the manual steering to end and the steering torque to return to zero. When the steering torque becomes zero, a timer corresponding to the predetermined period is started at that timing, and when the timer counts up, the driver's intervention information is deleted. If the predetermined period is a distance, measurement of the distance is started when the steering torque becomes zero, and the driver's intervention information is deleted once the vehicle has traveled the corresponding distance. When the predetermined period is a period during which another condition holds, the system waits from the timing at which the steering torque returns to zero until the condition is no longer satisfied, and then deletes the driver's intervention information. This limits the period during which the emergency control operation is reflected in the travel track under automated driving.
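The time-based variant described above can be sketched as follows. The period value, class name, and interface are assumptions for illustration; the patent specifies only that a timer starts when the steering torque returns to zero and that the intervention information is deleted when the timer counts up.

```python
# Sketch of time-based expiry of intervention information: a timer starts
# when the steering torque returns to zero after a manual intervention,
# and the stored information is deleted when the timer counts up.
RETENTION_PERIOD_S = 60.0  # predetermined period; illustrative value

class InterventionStore:
    def __init__(self):
        self.info = {}            # the driver's stored intervention information
        self._timer_start = None  # timestamp at which steering torque became zero

    def on_steering_torque(self, torque, now):
        if torque == 0.0 and self._timer_start is None:
            self._timer_start = now   # manual steering ended: start the timer
        elif torque != 0.0:
            self._timer_start = None  # driver is still steering: no timer runs

    def tick(self, now):
        # Delete the intervention information once the timer counts up.
        if self._timer_start is not None and now - self._timer_start >= RETENTION_PERIOD_S:
            self.info.clear()
            self._timer_start = None
```

The distance-based variant would have the same shape, with traveled distance in place of elapsed time.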
[Modification of the Embodiment]
Further, a mechanism may be provided for resetting the intervention information of each user at will. This allows the intervention information to be cleared, for example, when the owner of the vehicle changes. In the above-described embodiment, the intervention information is updated to the latest value each time an intervention operation is performed. Alternatively, the intervention information may be stored in association with running environment information, such as a specific target object, position, or scene recognized by the external environment recognition unit 142, and when matching running environment information is detected, the trajectory information may be changed by referring to the corresponding intervention information. The running environment information includes, for example, the weather (sunny, rainy, and so on), the lane width, the road width, and the position of a guardrail.
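A sketch of this environment-keyed storage follows; the key fields, units, and class names are hypothetical choices for illustration:

```python
# Hypothetical store: intervention information keyed by running-environment
# features so that a later match re-applies the corresponding correction.
from dataclasses import dataclass

@dataclass(frozen=True)
class EnvKey:
    weather: str        # e.g. "sunny", "rainy"
    lane_width_m: float
    scene: str          # e.g. "guardrail_right"

class InterventionStore:
    def __init__(self):
        self._by_env = {}

    def record(self, env: EnvKey, offset_m: float):
        self._by_env[env] = offset_m   # latest intervention wins

    def lookup(self, env: EnvKey):
        return self._by_env.get(env)   # None if no matching environment

    def reset(self):
        self._by_env.clear()           # e.g. when the vehicle owner changes
```

Freezing the key dataclass makes it hashable, so identical running-environment conditions map to the same stored correction.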
Summary of the embodiments
The above-described embodiment is summarized as follows.
(1) According to a first aspect of the present invention, there is provided a vehicle control device for performing driving assistance or automatic driving of a host vehicle,
the vehicle control device includes:
a trajectory generation means for setting a target position based on an output of a periphery monitoring means provided in a vehicle and generating trajectory information directed to the target position;
a travel control means that controls travel based on the trajectory information; and
an operation means capable of performing an intervention operation by a driver with respect to the travel controlled by the travel control means,
when there is an intervention operation by the operation means, the trajectory generation means further changes the trajectory information, for a predetermined period, based on the trajectory corrected by the intervention operation.
According to this configuration, the vehicle can travel under automatic driving on a trajectory that reflects the driver's intervention operation, realizing control that alleviates the user's sense of unease.
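A minimal sketch of this first aspect, under the assumption that the correction is stored as a lateral offset applied to each trajectory point while the predetermined period is active:

```python
def apply_intervention(trajectory, offset_m, period_active):
    """Shift each (x, y) trajectory point laterally by the stored offset while
    the predetermined period is active; otherwise return the trajectory as-is."""
    if not period_active:
        return list(trajectory)
    return [(x, y + offset_m) for (x, y) in trajectory]
```

Once the period expires, the same generator output is returned unchanged, matching the behavior of deleting the intervention information.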
(2) According to a second aspect of the present invention, there is provided a vehicle control device in the vehicle control device of (1),
the trajectory generation means changes the trajectory information based on the trajectory corrected by the intervention operation as long as the running environment acquired by the periphery monitoring means at the time of operation by the operation means continues.
With this configuration, the change is maintained as long as the detected running environment remains unchanged.
(3) According to a third aspect of the present invention, there is provided the vehicle control device of (1),
the trajectory generation means cancels the change of the trajectory information based on the trajectory corrected by the intervention operation when the running environment that was acquired by the periphery monitoring means at the time of operation by the operation means changes.
According to this configuration, the influence of the intervention operation by the driver can be eliminated in accordance with the change in the running environment.
(4) According to a fourth aspect of the present invention, there is provided the vehicle control device as recited in any one of (1) to (3),
the trajectory generation means does not change the trajectory information based on the trajectory corrected by the intervention operation when any one of a driver attention call, collision avoidance control, and off-road departure prevention is in operation after the intervention operation by the operation means is performed.
According to this configuration, since the driver's attention may be reduced in a situation where such an alarm is issued around the time of the intervention operation, canceling the effect of the intervention operation can prompt the driver to be more attentive.
(5) According to a fifth aspect of the present invention, there is provided the vehicle control device of (4),
the operation means is capable of performing an intervention operation by steering,
and when an intervention operation is performed by steering, if any one of the driver attention call, the collision avoidance control, and the off-road departure prevention is in operation, a steering reaction force against the intervention operation performed by the steering is further increased.
With this configuration, the vehicle more easily follows the trajectory determined by automatic driving and is less affected by an intervention operation from a driver whose attention may be reduced.
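The fourth and fifth aspects could be sketched together: a flag for any active safety assist (attention call, collision avoidance, or departure prevention) suppresses the trajectory change and boosts the steering reaction force. The gain value and function names are assumptions:

```python
def should_apply_intervention(safety_assist_active):
    """Fourth aspect: do not reflect the intervention in the trajectory while
    an attention call, collision-avoidance control, or off-road departure
    prevention is active."""
    return not safety_assist_active

def steering_reaction(base_nm, safety_assist_active, boost=1.5):
    """Fifth aspect: increase the reaction torque opposing a steering
    intervention while a safety assist is active (boost gain is hypothetical)."""
    return base_nm * boost if safety_assist_active else base_nm
```

Raising the reaction torque makes the steering heavier against the driver's input, so the vehicle tends to stay on the automatically determined trajectory.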
(6) According to a sixth aspect of the present invention, there is provided a vehicle control device, in addition to the vehicle control device of any one of (1) to (5),
the trajectory generation means stores information relating to the intervention operation in association with the running environment information acquired by the periphery monitoring means when the intervention operation is performed by the operation means, and, when running environment information corresponding to the stored running environment information is acquired by the periphery monitoring means, changes the trajectory information based on the intervention-operation information stored in association with that running environment information.
With this configuration, a trajectory reflecting the driver's intervention operation can be determined under automatic driving in accordance with the surrounding environment.

Claims (7)

1. A vehicle control apparatus, characterized in that,
the vehicle control device includes:
a trajectory generation means for setting a target position based on an output of a periphery monitoring means provided in a vehicle and generating trajectory information directed to the target position;
a travel control means that controls travel based on the trajectory information; and
an operation means capable of performing an intervention operation by a driver with respect to the travel controlled by the travel control means,
the trajectory generation means further generates intervention information that is a difference between a trajectory based on the trajectory information and a trajectory corrected by the intervention operation, stores the intervention information for each driver, and changes the trajectory information based on the intervention information of the selected driver, when the intervention operation by the operation means is present,
the trajectory generation means erases the saved intervention information when the intervention operation is determined to be an operation in a direction away from an obstacle identified by the periphery monitoring means.
2. The vehicle control device according to claim 1, wherein the trajectory generation means changes the trajectory information based on the intervention information when a running environment acquired by the periphery monitoring means at a time of operation by the operation means continues.
3. The vehicle control device according to claim 1, wherein the trajectory generation means eliminates the change of the trajectory information based on the intervention information when a change occurs in the running environment acquired by the periphery monitoring means at a time point of operation by the operation means.
4. The vehicle control device according to any one of claims 1 to 3, wherein the trajectory generation means does not change the trajectory information based on the intervention information when any one of a driver attention call, collision avoidance control, and off-road departure prevention is in operation after the intervention operation by the operation means is performed.
5. The vehicle control apparatus according to claim 4,
the operation means is capable of performing an intervention operation by steering,
and when an intervention operation is performed by steering, if any one of the driver attention call, the collision avoidance control, and the off-road departure prevention is in operation, a steering reaction force against the intervention operation performed by the steering is further increased.
6. The vehicle control apparatus according to any one of claims 1 to 3,
the trajectory generation means stores the intervention information in association with the running environment information acquired by the periphery monitoring means,
and when running environment information corresponding to the stored running environment information is acquired by the periphery monitoring means, the trajectory generation means changes the trajectory information based on the intervention information stored in association with that running environment information.
7. The vehicle control apparatus according to claim 1,
the intervention information is deleted when the vehicle has traveled for a certain period or a certain distance.
CN202010073377.5A 2019-02-05 2020-01-22 Vehicle control device Active CN111532269B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019018822A JP2020125026A (en) 2019-02-05 2019-02-05 Vehicle control device
JP2019-018822 2019-02-05

Publications (2)

Publication Number Publication Date
CN111532269A CN111532269A (en) 2020-08-14
CN111532269B true CN111532269B (en) 2023-04-07

Family

ID=71971067

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010073377.5A Active CN111532269B (en) 2019-02-05 2020-01-22 Vehicle control device

Country Status (2)

Country Link
JP (1) JP2020125026A (en)
CN (1) CN111532269B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112319239B (en) * 2020-11-17 2022-04-26 睿驰电装(大连)电动系统有限公司 Torque control method and device based on navigation information and electric vehicle
CN112744232A (en) * 2021-01-18 2021-05-04 陈潇潇 Intelligent vehicle driving control method and device for monitoring automatic driving by human driver
JP7433354B2 (en) 2022-03-23 2024-02-19 本田技研工業株式会社 Control device, control method, and program

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4767930B2 (en) * 2007-09-12 2011-09-07 本田技研工業株式会社 Vehicle travel safety device
JP5330063B2 (en) * 2009-04-08 2013-10-30 本田技研工業株式会社 Vehicle collision avoidance device
EP2871107B1 (en) * 2012-07-06 2023-10-25 Toyota Jidosha Kabushiki Kaisha Traveling control device for vehicle
JP6142979B2 (en) * 2012-08-01 2017-06-07 マツダ株式会社 Lane maintaining control method and lane maintaining control apparatus
JP6278019B2 (en) * 2015-09-25 2018-02-14 トヨタ自動車株式会社 Vehicle driving support device
JP6607826B2 (en) * 2016-06-07 2019-11-20 本田技研工業株式会社 Travel control device
JP2018063476A (en) * 2016-10-11 2018-04-19 株式会社デンソーアイティーラボラトリ Apparatus, method and computer program for driving support
JP6649512B2 (en) * 2016-12-28 2020-02-19 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
JP6763344B2 (en) * 2017-04-12 2020-09-30 トヨタ自動車株式会社 Lane change support device

Also Published As

Publication number Publication date
CN111532269A (en) 2020-08-14
JP2020125026A (en) 2020-08-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant