CN111717280A - Vehicle control device, vehicle control method, and storage medium

Vehicle control device, vehicle control method, and storage medium

Info

Publication number
CN111717280A
Authority
CN
China
Prior art keywords
vehicle
host vehicle
wind
region
steering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010145241.0A
Other languages
Chinese (zh)
Other versions
CN111717280B (en)
Inventor
熊野孝保
芝内翼
新冈琢也
柳原秀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN111717280A publication Critical patent/CN111717280A/en
Application granted granted Critical
Publication of CN111717280B publication Critical patent/CN111717280B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D6/00: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/04: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits responsive only to forces disturbing the intended course of the vehicle, e.g. forces acting transversely to the direction of vehicle travel

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

The invention provides a vehicle control device, a vehicle control method, and a storage medium, which can perform more appropriate vehicle control based on the surrounding environment of the vehicle. A vehicle control device according to an embodiment includes: an identification unit that identifies the surrounding environment of the vehicle; and a driving control unit that controls at least steering of the host vehicle so that the host vehicle travels at a predetermined position on a road based on a recognition result of the recognition unit, wherein the recognition unit recognizes a change region in which a wind condition around the host vehicle changes, and the driving control unit adjusts the steering of the host vehicle based on a degree of change in the wind condition when it is predicted that the host vehicle will reach the change region recognized by the recognition unit.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, a vehicle steering device has been disclosed that performs automatic steering control in response to a lateral disturbance factor acting on a traveling vehicle (see, for example, Patent Document 1).
[Prior Art Documents]
Patent Document 1: Japanese Patent Laid-Open No. 2001-97234
Problems to be solved by the invention
However, the above-described conventional technology does not consider vehicle control that responds to changes in disturbance factors caused by the surrounding environment of the vehicle. As a result, appropriate vehicle control for the surrounding environment of the vehicle may not be performed.
Disclosure of Invention
An aspect of the present invention has been made in consideration of such a situation, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can perform more appropriate vehicle control based on the surrounding environment of the host vehicle.
Means for solving the problems
The vehicle control device, the vehicle control method, and the storage medium according to the present invention have the following configurations.
(1): a vehicle control device according to an aspect of the present invention includes: an identification unit that identifies the surrounding environment of the vehicle; and a driving control unit that controls at least steering of the host vehicle so that the host vehicle travels at a predetermined position on a road based on a recognition result of the recognition unit, wherein the recognition unit recognizes a change region in which a wind condition around the host vehicle changes, and the driving control unit adjusts the steering of the host vehicle based on a degree of change in the wind condition when it is predicted that the host vehicle will reach the change region recognized by the recognition unit.
(2): in the aspect (1) described above, the recognition unit recognizes another vehicle traveling in the vicinity of the host vehicle, and, when it is predicted that the recognized other vehicle will pass by or run in parallel with the host vehicle, recognizes a region on the traveling lane of the host vehicle including one or both of a region in front of the other vehicle and a region behind the other vehicle as the change region.
(3): in the aspect (2) above, the other vehicle is a vehicle larger than the own vehicle.
(4): in the aspect of the above (2), when it is recognized that the host vehicle is subjected to a cross wind before passing by or running in parallel with the other vehicle, the recognition unit recognizes, as the change region, a region on the running lane of the host vehicle including one or both of a front region and a rear region of the other vehicle.
(5): in the aspect of the above (4), the recognition unit may recognize, as the change region, a rear region including a part of a side region of the other vehicle on a traveling lane of the host vehicle when the speed of the host vehicle is higher than the speed of the other vehicle and it is predicted that the host vehicle will travel in parallel with the other vehicle, and the recognition unit may recognize, as the change region, a front region of the other vehicle on the traveling lane of the host vehicle when the speed of the host vehicle is higher than the speed of the other vehicle and the host vehicle is in a parallel traveling state with the other vehicle.
(6): in the aspect of the above (4), the recognition unit may recognize, as the change region, a front region including a part of a side region of the other vehicle on a traveling lane of the host vehicle when the speed of the host vehicle is slower than the speed of the other vehicle and it is predicted that the host vehicle will travel in parallel with the other vehicle, and the recognition unit may recognize, as the change region, a rear region of the other vehicle on the traveling lane of the host vehicle when the speed of the host vehicle is slower than the speed of the other vehicle and the host vehicle is in a parallel traveling state with the other vehicle.
(7): in the aspect (1) described above, the recognition unit recognizes the yaw rate of the host vehicle and the steering angle of the steering device mounted on the host vehicle, recognizes that the factors causing the error include a disturbance factor when the error between the recognized yaw rate and the steering angle is equal to or greater than a predetermined value, and recognizes that the host vehicle is subjected to a crosswind when the disturbance factor is recognized, and the driving control unit adjusts the steering of the host vehicle based on the wind conditions of the crosswind.
(8): in the aspect of (7) above, the recognition unit recognizes the predetermined object target based on an image of the periphery of the host vehicle captured by the imaging unit, and recognizes that a crosswind is blowing with respect to the traveling direction of the host vehicle when recognizing that the movement of the recognized object target with the passage of time includes at least a movement in a direction orthogonal to the traveling direction of the host vehicle.
(9): in the aspect of (1) above, when the host vehicle is not affected by the wind because of a road member that shields the wind, and a crosswind with respect to the traveling direction of the host vehicle is identified at a point beyond the area shielded by the road member, the identification unit identifies that point as the change region.
(10): in the aspect (1) described above, the driving control unit may increase the steering force of the host vehicle toward the other vehicle when the recognition unit recognizes a crosswind blowing from the side of the traveling lane on which the recognized other vehicle is present.
(11): a vehicle control method according to an aspect of the present invention causes an on-vehicle computer to perform: identifying the surrounding environment of the vehicle; controlling, based on the recognized result, at least steering of the own vehicle so that the own vehicle travels at a prescribed position on the road; identifying a change region of a change in wind conditions in the periphery of the host vehicle; and adjusting steering of the host vehicle based on a degree of change in the wind condition in a case where it is predicted that the host vehicle will reach the identified change region.
(12): a storage medium according to an aspect of the present invention stores a program that causes a vehicle-mounted computer to perform: identifying the surrounding environment of the vehicle; controlling, based on the recognized result, at least steering of the own vehicle so that the own vehicle travels at a prescribed position on the road; identifying a change region of a change in wind conditions in the periphery of the host vehicle; and adjusting steering of the host vehicle based on a degree of change in the wind condition in a case where it is predicted that the host vehicle will reach the identified change region.
Effects of the invention
According to the aspects (1) to (12) described above, more appropriate vehicle control can be performed based on the surrounding environment of the host vehicle.
Drawings
Fig. 1 is a configuration diagram of a vehicle system using a vehicle control device according to an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit and the second control unit.
Fig. 3 is a diagram for explaining the recognition unit and the action plan generation unit.
Fig. 4 is a diagram for explaining processing of the steering adjustment unit.
Fig. 5 is a diagram showing a positional relationship between the host vehicle and another vehicle.
Fig. 6 is a diagram for explaining the assist torque for steering in the host vehicle M(t3).
Fig. 7 is a diagram for explaining the assist torque for steering in the host vehicle M(t1) and the host vehicle M(t5).
Fig. 8 is a diagram for explaining the assist torque for steering in the host vehicle M(t2) and the host vehicle M(t4).
Fig. 9 is a diagram for explaining control of the assist torque amount that differs for the left and right lateral positions.
Fig. 10 is a diagram for explaining a change region for the opposing vehicle.
Fig. 11 is a diagram for explaining a change region based on the wind conditions of the road member.
Fig. 12 is a flowchart showing an example of the flow of processing executed by the automatic driving control apparatus.
Fig. 13 is a diagram showing an example of the hardware configuration of the automatic driving control device according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention will be described below with reference to the accompanying drawings. The vehicle control device of the embodiment is applied to, for example, an autonomous vehicle. Automated driving is, for example, driving control performed by controlling one or both of steering and acceleration/deceleration of the vehicle. The driving control includes driving support control such as LKAS (Lane Keeping Assistance System), ACC (Adaptive Cruise Control), and CMBS (Collision Mitigation Brake System). Although the case where a right-hand traffic rule applies is described below, left and right may be read interchangeably where a left-hand traffic rule applies. Hereinafter, one horizontal direction is denoted as X, the other horizontal direction as Y, and the vertical direction orthogonal to the X-Y plane as Z.
[Overall Configuration]
Fig. 1 is a configuration diagram of a vehicle system 1 using a vehicle control device according to an embodiment. The vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted is, for example, a two-wheel, three-wheel, four-wheel or the like vehicle, and the drive source thereof is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof. The electric motor operates using generated power generated by a generator connected to the internal combustion engine or discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera (an example of an imaging unit) 10, a radar device 12, a probe 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a brake device 210, and a steering device 220. These devices and apparatuses are connected to each other by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in fig. 1 is merely an example; a part of the configuration may be omitted, or another configuration may be added. The automatic driving control device 100 is an example of a "vehicle control device".
The camera 10 is a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). The camera 10 is mounted on an arbitrary portion of the host vehicle M. For example, when the front of the host vehicle M is photographed, the camera 10 is attached to the upper portion of the front windshield, the rear surface of the interior mirror, or the like. When the rear of the host vehicle M is photographed, the camera 10 is attached to an upper portion of the rear windshield, for example. When the right side or the left side of the host vehicle M is photographed, the camera 10 is attached to the right side surface or the left side surface of the vehicle body or to a door mirror. The camera 10 repeatedly and periodically photographs the periphery of the host vehicle M, for example. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves to the periphery of the host vehicle M, and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and direction) of the object. The radar device 12 is mounted on an arbitrary portion of the host vehicle M. The radar device 12 may detect the position and velocity of the object by an FM-CW (Frequency Modulated Continuous Wave) method.
The probe 14 is a LIDAR (Light Detection and Ranging). The probe 14 irradiates light to the periphery of the host vehicle M and measures the scattered light. The probe 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The probe 14 is attached to an arbitrary portion of the host vehicle M.
The object recognition device 16 performs a sensor fusion process on the detection results detected by some or all of the camera 10, the radar device 12, and the probe 14, and recognizes the position, the type, the speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may output the detection results of the camera 10, the radar device 12, and the detector 14 directly to the automatic driving control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The communication device 20 communicates with another vehicle present in the vicinity of the host vehicle M by using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communications), or the like, or communicates with various server devices via a wireless base station.
The HMI30 presents various information to an occupant (including the driver) of the host vehicle M, and accepts input operations by the occupant. The HMI30 includes, for example, various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
The vehicle sensor 40 detects information related to the behavior of the host vehicle M and the state of on-vehicle equipment. The vehicle sensors 40 include, for example, a yaw rate sensor 42, a steering angle sensor 44, and a torque sensor 46. The yaw rate sensor 42 detects, for example, the yaw rate (rotational angular velocity) around the vertical axis of the host vehicle M. The steering angle sensor 44 detects, for example, the direction and magnitude of the steering angle (hereinafter referred to as steering angle information) of the steering device 220 described later (an example of a steering device). The torque sensor 46 detects, for example, the steering force used for steering control of the host vehicle M by a steering control unit of the automatic driving control device 100, which will be described later. The steering force is, for example, the torque amount of the assist torque for steering in control such as LKAS. The torque sensor 46 may also detect a steering torque input by the driver.
The vehicle sensor 40 may include a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, an orientation sensor that detects the orientation of the host vehicle M, and the like. The vehicle sensor 40 may also include a wind condition sensor (for example, an anemometer or a wind vane) that detects information related to wind conditions around the host vehicle M. A wind condition is information indicating how the wind blows at a specific location. The wind conditions include, for example, factors such as wind speed (including average wind speed and maximum instantaneous wind speed), wind direction, and the amount of change in each factor with time. The wind direction may be a direction of wind based on the azimuth of the map information, or may be a direction of wind with respect to the frontal direction of the host vehicle M detected by the vehicle sensor 40.
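As a rough, non-authoritative sketch of how the wind-condition factors listed above might be represented in software (the class name, field names, units, and crosswind test are assumptions for illustration, not part of the disclosure):

```python
from dataclasses import dataclass


@dataclass
class WindCondition:
    """Wind conditions at a specific location (illustrative structure, assumed names)."""
    average_speed_mps: float    # average wind speed [m/s]
    max_gust_mps: float         # maximum instantaneous wind speed [m/s]
    direction_deg: float        # wind direction relative to the host vehicle's heading; 0 = head-on
    speed_change_mps2: float    # change of wind speed over time [m/s per second]

    def is_crosswind(self, tolerance_deg: float = 30.0) -> bool:
        # Treat the wind as a crosswind when its direction is within the tolerance
        # of perpendicular (90 or 270 degrees) to the traveling direction.
        return abs((self.direction_deg % 180.0) - 90.0) <= tolerance_deg
```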
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be determined or supplemented by an INS (Inertial Navigation System) that uses the output of the vehicle sensors 40. The navigation HMI 52 includes a display device, a speaker, a touch panel, keys, and the like. A part or all of the navigation HMI 52 may be shared with the aforementioned HMI 30. The route determination unit 53 determines a route (hereinafter referred to as an on-map route) from the position of the host vehicle M (or an arbitrary input position) specified by the GNSS receiver 51 to the destination input by the occupant using the navigation HMI 52, for example, with reference to the first map information 54. The first map information 54 is, for example, information in which a road shape is expressed by links representing roads and nodes connected by the links. The first map information 54 may include the curvature of roads, POI (Point Of Interest) information, and the like. The on-map route is output to the MPU 60. The navigation device 50 may also perform route guidance using the navigation HMI 52 based on the on-map route. The navigation device 50 may be realized by a function of a terminal device such as a smartphone or a tablet terminal held by the occupant. The navigation device 50 may transmit the current position and the destination to a navigation server via the communication device 20 and acquire a route equivalent to the on-map route from the navigation server.
The MPU 60 includes, for example, a recommended lane determining unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determining unit 61 divides the on-map route provided from the navigation device 50 into a plurality of sections (for example, every 100 [m] in the vehicle traveling direction), and determines a recommended lane for each section with reference to the second map information 62. The recommended lane determining unit 61 determines, for example, in which lane from the left the vehicle should travel. When there is a branch point on the on-map route, the recommended lane determining unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding to the branch destination.
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on the boundary of a lane, and the like. The second map information 62 may include road information, traffic regulation information, address information (address, zip code), facility information, telephone number information, and the like. The second map information 62 can be updated at any time by the communication device 20 communicating with other devices.
The driving operation members 80 include, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a joystick, and other operation members. A sensor for detecting the operation amount or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to some or all of the automatic driving control device 100, the running driving force output device 200, the brake device 210, and the steering device 220.
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation between software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or flash memory of the automatic driving control device 100, or may be stored in a removable storage medium such as a DVD or a CD-ROM and installed in the HDD or flash memory of the automatic driving control device 100 by mounting the storage medium (non-transitory storage medium) in a drive device.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The combination of the action plan generating unit 140 and the second control unit 160 is an example of a "driving control unit". The first control unit 120 realizes, for example, an AI (Artificial Intelligence) function and a pre-assigned model function in parallel. For example, the "intersection identification" function can be realized by executing intersection identification by deep learning or the like and identification by a condition given in advance (presence of a signal, a road sign, or the like that enables pattern matching) in parallel, and scoring both sides to comprehensively evaluate them. Thereby, the reliability of automatic driving is ensured.
The recognition unit 130 recognizes the environment around the host vehicle M. For example, the recognition unit 130 recognizes the position, speed, acceleration, traveling direction, and other states of objects (for example, nearby vehicles or object targets) in the vicinity of the host vehicle M based on information input from the camera 10, the radar device 12, and the probe 14 via the object recognition device 16. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point (center of gravity, center of the drive shaft, or the like) of the host vehicle M, and is used for control. The position of an object may be represented by a representative point such as the center of gravity, the center, or a corner of the object, or may be represented by a region. When the object is a vehicle, the "state" of the object may include its acceleration or jerk, or its "behavior state" (for example, whether it is making or about to make a lane change).
The recognition unit 130 recognizes, for example, a lane (traveling lane) on which the host vehicle M travels. For example, the recognition unit 130 recognizes the traveling lane by comparing the pattern of road dividing lines (for example, the arrangement of solid lines and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the host vehicle M recognized from the image captured by the camera 10. The recognition unit 130 may recognize the lane by recognizing a traveling road boundary (road boundary) including a road dividing line, a shoulder, a curb, a center barrier, a guardrail, and the like, instead of the road dividing line. In this recognition, the position of the own vehicle M acquired from the navigation device 50 and the processing result by the INS processing may be added. The recognition unit 130 recognizes a stop line, an obstacle, a red light, a toll booth, and other road items.
The recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the travel lane when recognizing the travel lane. The recognition unit 130 may recognize, for example, a deviation of the reference point of the host vehicle M from the center of the lane and an angle of the traveling direction of the host vehicle M with respect to a line connecting the centers of the lanes as the relative position and posture of the host vehicle M with respect to the traveling lane. Instead, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to an arbitrary side end portion (road partition line or road boundary) of the traveling lane, as the relative position of the host vehicle M with respect to the traveling lane.
The recognition unit 130 recognizes information relating to the position of the nearby vehicle based on the nearby vehicle of the host vehicle M recognized from the image captured by the camera 10, the congestion information of the periphery of the host vehicle M acquired by the navigation device 50, or the position information obtained from the second map information 62.
The recognition unit 130 may acquire various information received from vehicles or the like traveling around the host vehicle M by inter-vehicle communication via the communication device 20, and recognize the periphery of the host vehicle M based on the information. The recognition unit 130 includes, for example, a wind condition recognition unit 132, an object recognition unit 134, and another vehicle recognition unit 136. The details of their functions will be described later.
The action plan generating unit 140 basically causes the host vehicle M to travel in the recommended lane determined by the recommended lane determining unit 61, and generates a target trajectory along which the host vehicle M will automatically travel in the future (without depending on the driver's operation) so as to cope with the surrounding situation of the host vehicle M. The target trajectory includes, for example, a speed element. For example, the target trajectory is expressed as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, on the order of several [m]) along the route, and, separately from this, a target speed and a target acceleration for every predetermined sampling time (for example, on the order of a few tenths of a [sec]) are generated as part of the target trajectory. A track point may also be a position that the host vehicle M should reach at each predetermined sampling time. In this case, the target speed and target acceleration information are expressed by the intervals between the track points.
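A minimal sketch of a target trajectory expressed as a sequence of track points carrying speed elements, as described above; the type names, field names, and the helper that recovers speed from point spacing are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TrackPoint:
    x_m: float                 # longitudinal position of the point [m]
    y_m: float                 # lateral position of the point [m]
    target_speed_mps: float    # target speed at this point [m/s]
    target_accel_mps2: float   # target acceleration at this point [m/s^2]


# The target trajectory is an ordered sequence of track points the host vehicle M should reach.
TargetTrajectory = List[TrackPoint]


def implied_speeds(points: TargetTrajectory, sampling_time_s: float) -> List[float]:
    # When track points are generated at a fixed sampling time, the spacing between
    # consecutive points implicitly expresses the target speed, as noted in the text.
    return [((b.x_m - a.x_m) ** 2 + (b.y_m - a.y_m) ** 2) ** 0.5 / sampling_time_s
            for a, b in zip(points, points[1:])]
```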
The action plan generating unit 140 may set an event of the autonomous driving when the target trajectory is generated. The event of the automatic driving includes a constant speed driving event, a low speed follow-up driving event, a lane change event, a branch event, a merge event, a take-over event, and the like. The action plan generating unit 140 generates a target trajectory corresponding to the started event.
The action plan generating unit 140 includes, for example, a steering adjusting unit 142. The function of the steering adjustment unit 142 will be described in detail later.
The second control unit 160 controls the traveling driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes through the target trajectory generated by the action plan generation unit 140 at a predetermined timing.
The second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information of the target track (track point) generated by the action plan generation unit 140 and stores the information in a memory (not shown). The speed control unit 164 controls the running drive force output device 200 or the brake device 210 based on the speed element associated with the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the curve of the target track stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. As an example, the steering control unit 166 performs a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on deviation from the target trajectory. The steering control unit 166 performs steering control based on the adjustment amount adjusted by the steering adjustment unit 142.
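The combination of feedforward control according to the road curvature ahead and feedback control on the deviation from the target trajectory could look roughly like the following sketch; the bicycle-model relation, the gains, and the function signature are assumptions, not the disclosed implementation.

```python
import math


def steering_angle_command(curvature_1pm: float, lateral_error_m: float,
                           heading_error_rad: float, wheelbase_m: float = 2.7,
                           k_lat: float = 0.5, k_head: float = 1.2) -> float:
    """Steering angle [rad] = feedforward term from road curvature plus a feedback term
    on the lateral and heading deviation from the target trajectory (assumed gains)."""
    feedforward = math.atan(wheelbase_m * curvature_1pm)             # follow the curve ahead
    feedback = k_lat * lateral_error_m + k_head * heading_error_rad  # pull back onto the trajectory
    return feedforward + feedback
```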
The running driving force output device 200 outputs a running driving force (torque) for causing the vehicle to run to the drive wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, a motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor so that the braking torque corresponding to the braking operation is output to each wheel, in accordance with the information input from the second control unit 160 or the information input from the driving operation element 80. The brake device 210 may be provided with a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the driving operation element 80 to the hydraulic cylinder via the master cylinder as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The steering device 220 has a function such as Electric Power Steering (EPS). The electric motor changes the orientation of the steered wheels by, for example, applying a force to a rack-and-pinion mechanism. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the orientation of the steered wheels.
Here, when vehicle control by LKAS (hereinafter referred to as LKAS control) is performed, the automatic driving control device 100 recognizes the traveling lane of the host vehicle M based on, for example, information input via the object recognition device 16, and calculates an optimal assist torque for assisting the steering of the host vehicle M so that a reference point (for example, the center or the center of gravity) of the host vehicle M passes through the center (the center in the width direction) of the recognized lane. The automatic driving control device 100 then outputs the assist torque to the steering device 220, thereby suppressing departure of the host vehicle M from the traveling lane (lane keeping). That is, in the LKAS control, the automatic driving control device 100 performs at least steering control of the host vehicle M. When performing vehicle control by ACC, the automatic driving control device 100 controls the running driving force output device 200 and the brake device 210 so that the host vehicle M travels while keeping the inter-vehicle distance to the preceding vehicle constant, for example, based on information input via the object recognition device 16. That is, the automatic driving control device 100 performs acceleration/deceleration control (speed control) based on the inter-vehicle distance to the preceding vehicle. When performing vehicle control by CMBS, the automatic driving control device 100 controls the running driving force output device 200, the brake device 210, and the steering device 220 so as to avoid contact with an obstacle approaching the host vehicle M, for example. That is, the automatic driving control device 100 performs acceleration/deceleration control and steering control for avoiding contact with an object.
The functions of the recognition unit 130 and the action plan generation unit 140 according to the embodiment will be described in detail below. In the following description, LKAS control will mainly be described. Fig. 3 is a diagram for explaining the recognition unit 130 and the action plan generation unit 140. In the example of fig. 3, two lanes L1 and L2 in which vehicles can travel in the same direction are shown. The lane L1 is divided by a dividing line LL and a dividing line CL, and the lane L2 is divided by the dividing line CL and a dividing line LR. In the example of fig. 3, the host vehicle M is traveling at a speed VM in the lane L1, and the other vehicle m1 is traveling at a speed Vm1 in the lane L2. In the example of fig. 3, the direction of the wind WND with respect to the lane L1 and the lane L2 is schematically shown.
[Function of Wind Condition Recognition Unit]
The wind condition recognition unit 132 recognizes the wind condition around the host vehicle M. The wind condition recognition unit 132 accesses a management server or the like that manages wind conditions for each point and region via the communication device 20 based on, for example, the position and the traveling direction of the host vehicle M detected by the GNSS receiver 51 and the vehicle sensor 40, and the on-map route to the destination set in the navigation device 50, and acquires the position of the host vehicle M and the wind conditions of the on-map route. The wind condition recognition unit 132 may recognize the wind condition from a wind condition sensor included in the vehicle sensor 40. The wind condition recognition unit 132 continuously recognizes the wind condition at predetermined intervals.
Further, the wind condition recognition unit 132 may recognize the wind condition based on the recognition result recognized by the object target recognition unit 134.
[Function of Object Target Recognition Unit]
The object target recognition unit 134 recognizes a specific object target present in the periphery of the host vehicle M, for example, based on the analysis result of an image (hereinafter referred to as a camera image) captured by the camera 10 or the like. A specific object target means an object that moves or changes in shape according to the wind conditions. The specific object target is, for example, an object having a size equal to or smaller than a first predetermined value, or an object estimated from the image to have a weight equal to or smaller than a second predetermined value. Specific object targets include, for example, a flag or a streamer placed near a road, and paper, cloth, plastic bags, balloons, fallen leaves, empty cans, plastic bottles, and other objects flying in the air with the wind or rolling on the road. The streamer is, for example, a device in which a cloth tube is suspended from a high position so that the wind direction and the wind speed can be checked visually. In addition, the specific object target may include a wind vane and an anemometer. The specific object target may also include trees present in the periphery of the road and clouds in the sky above the road.
In the example of fig. 3, the object target recognition unit 134 recognizes the streamer as a specific object target TG1 and a sheet of paper flying ahead of the other vehicle m1 as a specific object target TG2. The object target recognition unit 134 recognizes the amount of change or the amount of movement of each recognized specific object target within a predetermined time. For example, the object target recognition unit 134 recognizes the direction in which the cloth tube of the specific object target TG1 flutters and the angle of the cloth tube from the vertical direction. The object target recognition unit 134 also recognizes the movement direction and the movement amount of the specific object target TG2 within a predetermined time.
The wind condition recognition unit 132 recognizes, for example, the wind direction based on the direction in which the cloth tube of the specific object target TG1 recognized by the object target recognition unit 134 flutters, and recognizes the wind speed based on the angle of the cloth tube from the vertical direction. Further, the wind condition recognition unit 132 recognizes, for example, the wind direction based on the movement direction of the specific object target TG2 within a predetermined time recognized by the object target recognition unit 134, and recognizes the wind speed based on the movement amount within the predetermined time.
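One possible mapping from the recognized streamer angle and object movement to wind speed and direction is sketched below; the linear angle-to-speed mapping and the saturation value are assumptions chosen for illustration, not values from the disclosure.

```python
import math


def wind_speed_from_streamer(angle_from_vertical_deg: float,
                             saturation_speed_mps: float = 15.0) -> float:
    # A limp streamer (0 deg) is taken as no wind; a horizontal streamer (90 deg)
    # is taken as the saturation speed. The linear mapping is an assumption.
    angle = max(0.0, min(90.0, angle_from_vertical_deg))
    return saturation_speed_mps * angle / 90.0


def wind_direction_from_motion(dx_m: float, dy_m: float) -> float:
    # Direction of a windblown object's displacement over a fixed interval, expressed
    # as an angle [deg] relative to the host vehicle's traveling direction (X axis).
    return math.degrees(math.atan2(dy_m, dx_m))
```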
The wind condition recognition unit 132 may refer to the map information (the first map information 54 and the second map information 62) based on the position of the host vehicle M, acquire geographical information and regional information from the referenced map information, and estimate the wind direction and the wind speed with respect to the current position and traveling direction. For example, when the road on which the host vehicle M travels runs along a coast, strong wind is presumed to blow from the sea side; therefore, the wind direction is estimated based on the current position of the host vehicle M, the traveling direction, and the position of the sea, and wind with a high wind speed is estimated to be blowing. The wind condition recognition unit 132 may also estimate the wind conditions based on the season and the time, or may estimate the wind conditions based on a weather forecast obtained from an external server via the communication device 20.
The wind condition recognition unit 132 may determine whether the wind WND is a cross wind with respect to the traveling direction of the host vehicle M based on the recognized wind direction and traveling direction. For example, the wind situation recognition unit 132 recognizes the specific object, and recognizes that the crosswind blows against the traveling direction of the host vehicle M when recognizing that the movement of the recognized specific object with the passage of time includes at least the movement in the direction orthogonal to the traveling direction of the host vehicle M.
The wind condition recognition unit 132 recognizes information on the yaw rate of the host vehicle M obtained from the yaw rate sensor 42 and the steering angle information of the steering device 220 obtained from the steering angle sensor 44, and recognizes that the factors causing the error include a disturbance factor when the error between the angle obtained from the recognized yaw rate (yaw rate angle) and the steering angle is equal to or greater than a predetermined value. Further, the wind condition recognition unit 132 may recognize that the host vehicle M is subjected to a crosswind when it recognizes that the factors causing the error include a disturbance factor. In this way, whether or not the host vehicle M is subjected to a crosswind can be determined based on the detection results of the yaw rate sensor 42 and the steering angle sensor 44.
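The disturbance check described above, comparing the measured yaw rate with the yaw rate expected from the current steering angle, might be sketched as follows; the kinematic bicycle-model relation and the threshold value are assumptions, not the disclosed criterion.

```python
import math


def expected_yaw_rate(steering_angle_rad: float, speed_mps: float,
                      wheelbase_m: float = 2.7) -> float:
    # Kinematic bicycle-model estimate of the yaw rate produced by the steering angle.
    return speed_mps * math.tan(steering_angle_rad) / wheelbase_m


def disturbance_suspected(measured_yaw_rate_radps: float, steering_angle_rad: float,
                          speed_mps: float, threshold_radps: float = 0.05) -> bool:
    # If the measured yaw rate deviates from the model prediction by at least the
    # threshold, assume a disturbance factor (e.g. a crosswind) is acting on the vehicle.
    error = abs(measured_yaw_rate_radps
                - expected_yaw_rate(steering_angle_rad, speed_mps))
    return error >= threshold_radps
```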
The wind condition recognition unit 132 may also recognize the region where the wind condition changes and the degree of change based on the recognition result recognized by the other-vehicle recognition unit 136.
[Function of Other Vehicle Recognition Unit]
The other-vehicle recognition unit 136 recognizes the relative position and the relative speed of other vehicles mn (n is an integer equal to or greater than 1) present in the periphery of the host vehicle M, for example, based on the information input via the object recognition device 16. Hereinafter, when the other vehicles mn are not distinguished from one another, they are collectively referred to as the other vehicle m. The other vehicle m present in the periphery of the host vehicle M is, for example, another vehicle present within a predetermined distance of the position of the host vehicle M. In addition to the above condition, the other vehicle m present in the periphery of the host vehicle M may be another vehicle traveling in a lane in which travel in the same direction as the host vehicle M is possible (in the example of fig. 3, another vehicle traveling in the lane L1 or the lane L2), or another vehicle traveling in an opposite lane facing the traveling lane of the host vehicle M. In addition to the above conditions, the other vehicle m present in the periphery of the host vehicle M may be another vehicle present in a direction in which it can block the wind WND recognized by the wind condition recognition unit 132. The other vehicle present in a direction in which it can block the wind WND is, for example, another vehicle traveling in a lane on the right side of the traveling lane of the host vehicle M when the wind WND blows from the right side with respect to the traveling direction of the host vehicle.
The other-vehicle recognition unit 136 recognizes the size of the other vehicle m based on the information input via the object recognition device 16. The size of the other vehicle m may be, for example, the size of the entire other vehicle m, or may be at least one of the elements indicating size, such as vehicle length, vehicle height, and vehicle width. The other-vehicle recognition unit 136 recognizes the size of the other vehicle m based on, for example, the contour shape of the other vehicle m obtained from the analysis result of the camera image captured by the camera 10 or the like. The other-vehicle recognition unit 136 may also recognize a license plate (number plate) attached to the other vehicle m from the camera image and estimate the size of the other vehicle m based on the recognized license plate. Since the size of the license plate differs depending on the size of the vehicle, the approximate size of the other vehicle m can be estimated by acquiring the size of the license plate from the camera image. The other-vehicle recognition unit 136 may also estimate the size of the other vehicle m based on information (for example, a classification number) described on the license plate.
The other-vehicle recognition unit 136 may also recognize the size of the other vehicle m relative to the size of the host vehicle M. In this case, the other-vehicle recognition unit 136 determines whether the other vehicle m is large relative to the host vehicle M based on, for example, whether its vehicle length is longer, its vehicle height is higher, and its vehicle width is wider than those of the host vehicle M. In the example of fig. 3, the other-vehicle recognition unit 136 recognizes from the contour shape or the like, based on the analysis result of the camera image captured by the camera 10, that the other vehicle m1 is a large truck and that the other vehicle m1 is larger than the host vehicle M.
When it is predicted, based on the relative position and relative speed of the other vehicle m recognized by the other-vehicle recognition unit 136, that the other vehicle m will pass or run in parallel with the host vehicle M, the wind condition recognition unit 132 recognizes a region on the traveling lane of the host vehicle M including one or both of the region in front of the other vehicle m and the region behind it as a change region of the wind conditions. A case in which the other vehicle m is predicted to pass the host vehicle M is, for example, a case in which the host vehicle M is traveling in the same traveling direction as the other vehicle m and it is predicted that the host vehicle M will overtake the other vehicle m, or be overtaken by it, within a predetermined time. Another such case is, for example, a case in which another vehicle m traveling in the opposite lane facing the traveling lane of the host vehicle M approaches the host vehicle M. A case in which the other vehicle m is predicted to run in parallel with the host vehicle M is a case in which the host vehicle M is traveling in the same direction as the other vehicle m, the relative distance between the host vehicle M and the other vehicle m is equal to or less than a predetermined value, and the relative speed between the host vehicle M and the other vehicle m is equal to or less than a predetermined value. The change region is, for example, a region in which the wind force and the wind direction that the host vehicle M receives from the wind WND are estimated to change.
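A simplified classification of whether the other vehicle m is expected to pass or run in parallel with the host vehicle M, along the lines of the conditions above; the thresholds and the gap/closing-speed formulation are illustrative assumptions.

```python
def predicted_interaction(gap_m: float, closing_speed_mps: float,
                          horizon_s: float = 10.0,
                          parallel_gap_m: float = 30.0,
                          parallel_speed_mps: float = 1.0) -> str:
    """Classify the predicted interaction between the host vehicle M and another vehicle m.

    gap_m: current longitudinal gap between the two vehicles (>= 0).
    closing_speed_mps: rate at which the gap shrinks; positive means the vehicles approach.
    """
    if gap_m <= parallel_gap_m and abs(closing_speed_mps) <= parallel_speed_mps:
        return "parallel"   # close together at nearly the same speed
    if closing_speed_mps > parallel_speed_mps and gap_m / closing_speed_mps <= horizon_s:
        return "pass"       # overtaking, being overtaken, or meeting an oncoming vehicle
    return "none"
```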
In the example of fig. 3, the speed VM of the host vehicle M is assumed to be higher than the speed Vm1 of the other vehicle m1 by a predetermined value or more, and the wind condition recognition unit 132 recognizes that the host vehicle M will overtake the other vehicle m1 within a predetermined time (including a temporary parallel traveling state). In this case, the wind condition recognition unit 132 recognizes the regions of the lane L1 in front of and behind the other vehicle m1 as the change region CA1 and the change region CA2, respectively.
For example, when the speed VM of the host vehicle M is higher than the speed Vm1 of the other vehicle m1 present ahead and it is predicted that the host vehicle M will travel in parallel with the other vehicle m1, the wind condition recognition unit 132 may recognize a rear region including a part of the side region of the other vehicle m1 on the traveling lane L1 of the host vehicle M as the change region CA2. In this case, the part of the side region refers to, for example, a region connected to the rear region. By including a part of the side region, the change region can be set in consideration of the influence of the wind WND that wraps around into the side region from just behind the other vehicle m1. When the speed VM of the host vehicle M is higher than the speed Vm1 of the other vehicle m1 and the host vehicle M is in a parallel traveling state with the other vehicle m1, the wind condition recognition unit 132 recognizes the region in front of the other vehicle m1 on the traveling lane L1 of the host vehicle M as the change region CA1.
In the above description, the case where the host vehicle M overtakes the other vehicle m1 present ahead has been described, but the change region CA1 and the change region CA2 may be identified with reference to the other vehicle in the same way when the host vehicle M is overtaken by another vehicle approaching from behind. For example, when the speed VM of the host vehicle M is slower than the speed Vm1 of the other vehicle m1 present behind and it is predicted that the host vehicle M will travel in parallel with the other vehicle m1, the wind condition recognition unit 132 recognizes a front region including a part of the side region of the other vehicle m1 on the traveling lane L1 of the host vehicle M as the change region CA1. In this case, the part of the side region refers to, for example, a region connected to the front region. By including a part of the side region, the change region can be set in consideration of the influence of the wind WND that wraps around into the side region from just ahead of the other vehicle m1. When the speed VM of the host vehicle M is slower than the speed Vm1 of the other vehicle m1 and the host vehicle M is in a parallel traveling state with the other vehicle m1, the wind condition recognition unit 132 recognizes the region behind the other vehicle m1 on the traveling lane L1 of the host vehicle M as the change region CA2. The extent of the part of the side region may be changed according to the wind conditions.
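The selection of the front or rear change region depending on the relative speed and the parallel-traveling state, as described in the last two paragraphs, can be summarized in the following sketch; the region labels and function shape are assumptions, not part of the disclosure.

```python
def change_region_around_other_vehicle(host_speed_mps: float, other_speed_mps: float,
                                       in_parallel: bool) -> str:
    """Which region around the other vehicle m becomes the wind-change region on the
    host vehicle's traveling lane (labels are illustrative)."""
    if host_speed_mps > other_speed_mps:
        # Host overtakes: first the rear region including part of the side region,
        # then, once running in parallel, the region in front of the other vehicle.
        return "front" if in_parallel else "rear_plus_part_of_side"
    if host_speed_mps < other_speed_mps:
        # Host is overtaken: first the front region including part of the side region,
        # then the region behind the other vehicle.
        return "rear" if in_parallel else "front_plus_part_of_side"
    return "none"  # equal speeds: steady shielding, no change region assumed here
```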
Instead of (or in addition to) the above conditions, the wind condition recognition unit 132 may set the change region CA1 and the change region CA2 when, based on the recognition result of the other-vehicle recognition unit 136, the other vehicle blocking the wind WND is larger than the host vehicle M. When the other vehicle m is larger than the host vehicle M, the range over which the wind WND is shielded by the other vehicle becomes large, so the difference between being affected by the wind WND and not being affected by it becomes large. On the other hand, when the other vehicle m is equal to or smaller than the host vehicle M in size, the range over which the wind WND is shielded by the other vehicle is small, so the difference in the influence of the wind WND is small. Therefore, by recognizing the change region CA1 and the change region CA2 when the other vehicle m is larger than the host vehicle M and adjusting the steering with respect to the recognized change regions, more appropriate vehicle control can be performed.
In addition, the wind condition recognition unit 132 may recognize, as the change region, a region including one or both of the front region and the rear region of the other vehicle when it is recognized that the host vehicle M is subjected to a crosswind before passing or traveling in parallel with the other vehicle m.
Further, the wind condition recognition unit 132 identifies the degree of change for the change regions CA1 and CA2. The degree of change is, for example, the degree to which the wind conditions change between the change regions CA1 and CA2 and the regions in front of and behind them. The degree of change is set based on, for example, the wind direction and wind speed with respect to the traveling direction of the host vehicle M and the size of the other vehicle m blocking the wind WND. For example, the wind condition recognition unit 132 sets the degree of change larger as the wind speed becomes higher. The wind condition recognition unit 132 sets the degree of change larger as the wind direction becomes closer to transverse to the frontal direction of the host vehicle M, and sets the degree of change larger as the other vehicle m becomes larger.
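A heuristic sketch of how a scalar degree of change might be composed from the factors named above (wind speed, how close the wind direction is to a pure crosswind, and the relative size of the shielding vehicle); the weighting is an assumption, not the disclosed method.

```python
import math


def degree_of_change(wind_speed_mps: float, wind_direction_deg: float,
                     other_vehicle_size: float, host_vehicle_size: float) -> float:
    """Larger wind speed, a direction closer to a pure crosswind (90 deg to the travel
    direction), and a larger shielding vehicle all increase the degree of change."""
    crosswind_factor = abs(math.sin(math.radians(wind_direction_deg)))  # 1.0 = pure crosswind
    size_factor = max(1.0, other_vehicle_size / host_vehicle_size)      # > 1 if the other vehicle is larger
    return wind_speed_mps * crosswind_factor * size_factor
```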
[Function of Steering Adjustment Unit]
Next, the function of the steering adjustment unit 142 will be described in detail. The steering adjustment unit 142 adjusts the steering for the target trajectory of the host vehicle M planned by the action plan generation unit 140, based on the wind conditions recognized by the wind condition recognition unit 132. Fig. 4 is a diagram for explaining the processing of the steering adjustment unit 142. In the example of fig. 4, for convenience of explanation, a case is described in which the lane L3 on which the host vehicle M travels is a curved road and the wind WND blows from the right side with respect to the traveling direction of the host vehicle M, but the same processing is performed for other road shapes such as a straight road.
In the scenario of fig. 4, the action plan generating unit 140 detects the yaw rate around the vertical axis of the host vehicle M from the yaw rate sensor 42, and estimates a predicted trajectory K1 along which the host vehicle M is predicted to travel, based on the detected yaw rate. Here, in the LKAS control, the control is generally performed such that the reference point (for example, the center or the center of gravity) of the host vehicle M travels along the center of the traveling lane. Therefore, the action plan generating unit 140 generates a target trajectory K2 in which the host vehicle M is located at the center of the lane L3. The action plan generating unit 140 also derives a yaw rate offset YO1 for reaching, from the point P1 on the predicted trajectory K1, the point P2 on the target trajectory K2 at a position separated from the host vehicle M by a predetermined distance D1 in the traveling direction. The predetermined distance D1 is, for example, the longest distance at which the object recognition device 16 can recognize an object based on information obtained from the camera 10, the radar device 12, the probe 14, and the like. The predetermined distance D1 may be a fixed distance or a variable distance set based on the speed of the host vehicle M and the road shape.
The steering adjustment unit 142 adjusts the yaw rate offset YO1 based on the wind conditions of the wind WND recognized by the wind condition recognition unit 132. For example, the steering adjustment unit 142 performs the adjustment in accordance with the change in behavior caused by the influence of the wind WND received by the host vehicle M. In the example of fig. 4, the steering adjustment unit 142 sets a point P3 shifted to the right of the point P2 on the target trajectory K2 by a distance D2, in consideration of the influence of the crosswind received from the right side. The distance D2 may be a fixed distance, or may be a variable distance set based on, for example, the wind direction and the wind speed. Further, the steering adjustment unit 142 derives a yaw rate offset YO2 based on the distance from the point P1 to the point P3.
That is, the action plan generating unit 140 performs steering control including the yaw-rate offset YO1 when the wind conditions are not recognized by the wind condition recognition unit 132, or when the wind speed is low and the host vehicle M is determined not to be affected by the wind WND, and performs steering control including the yaw-rate offset YO2 when the host vehicle M is determined to be affected by the wind WND. The steering control controls, for example, the steering-related assist torque and the like so that a target steering angle θ derived based on the speed VM of the host vehicle M and the yaw-rate offset is achieved. As a result, because the host vehicle is affected by the wind WND while traveling toward the point P3, it arrives at the point P2 after traveling the predetermined distance D1. This enables more appropriate driving control even when the host vehicle M is affected by wind.
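To make the relationship between the yaw-rate offset and the target steering angle θ concrete, the following sketch uses a kinematic bicycle model; the wheelbase value, the model choice, and the offset-selection logic are assumptions for illustration rather than the embodiment's actual computation.

```python
import math

def target_steering_angle(speed_mps: float,
                          yaw_rate_offset: float,
                          current_yaw_rate: float,
                          wheelbase_m: float = 2.7) -> float:
    """Derive a target steering angle theta (rad) from the host speed and a
    yaw-rate offset, using a kinematic bicycle model:
    yaw_rate = v * tan(theta) / L. Wheelbase and model are assumed."""
    if speed_mps <= 0.1:
        return 0.0
    target_yaw_rate = current_yaw_rate + yaw_rate_offset
    return math.atan(wheelbase_m * target_yaw_rate / speed_mps)


def select_offset(wind_recognized: bool, affects_vehicle: bool,
                  yo1: float, yo2: float) -> float:
    """Use YO1 when no wind influence is expected, YO2 otherwise."""
    return yo2 if (wind_recognized and affects_vehicle) else yo1


yo = select_offset(wind_recognized=True, affects_vehicle=True, yo1=0.01, yo2=0.016)
print(target_steering_angle(20.0, yo, current_yaw_rate=0.02))
```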
Further, when it is predicted, based on the recognition result of the wind condition recognition unit 132, that the host vehicle M will reach the change regions CA1 and CA2, the steering adjustment unit 142 adjusts the steering of the host vehicle M based on the degree of change in the wind conditions between the change regions CA1 and CA2 and the regions in front of and behind them.
The adjustment of the assist torque by the steering adjustment unit 142 based on the positional relationship between the host vehicle M and the other vehicle m1 will be specifically described below. Fig. 5 is a diagram showing the positional relationship between the host vehicle M and the other vehicle m1. In the example of fig. 5, the relationship "t1 < t2 < t3 < t4 < t5" holds for times t1 to t5, and the position of the host vehicle at each time is denoted as M(t). Note that the other vehicle m1 is also traveling at the speed Vm1 from time t1 to time t5, but is shown at a fixed position in order to clearly show the relative position between the host vehicle M and the other vehicle m1. The processing of the steering adjustment unit 142 at each time is described below.
< case at time t3 >
At time t3 shown in fig. 5, the wind WND is blocked by the other vehicle m1. Therefore, the host vehicle M is not affected by the wind WND, or is only slightly affected by it. Accordingly, at time t3, the steering adjustment unit 142 assists steering in the same manner as in the windless state, for example. Fig. 6 is a diagram for explaining the steering assist torque of the host vehicle M(t3). In the example of fig. 6, the horizontal axis represents the lateral position (lateral offset amount in the left-right direction) from the center of the road (lane L1), and the vertical axis represents the steering assist torque. The same applies to the vertical and horizontal axes of the related figures that follow.
For example, when the host vehicle M is traveling near the center of the lane, the steering control of the vehicle is performed mainly by the occupant's operation of the steering wheel, the assist torque amount applied by the steering control unit 166 to steer the host vehicle M is low, and the torque amount at the center of the lane is zero (0). The steering control unit 166 executes departure-prevention torque control in which the assist torque amount for steering the host vehicle M back toward the center of the lane L1 by the LKAS control increases with the distance by which the host vehicle M is separated from the center of the lane L1. This makes it possible to apply a reaction force to the steering wheel operated by the driver, to notify the driver that the host vehicle M is about to depart from the traveling lane, and to return the host vehicle M toward the center of the lane L1. When the assist torque amount exceeds an upper limit value, the automatic driving control device 100 may notify the occupant that the host vehicle M is about to leave or is leaving the lane by outputting an alarm or the like using the HMI 30. In the example of fig. 6, the torque amount increases linearly with the distance from the center of the lane, but the present invention is not limited thereto, and the increase may be curved over part or all of the range. The same applies to the related figures that follow.
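The departure-prevention behavior described above can be pictured with a simple torque profile; the gain, upper limit, and sign convention below are assumed values for the sketch, not the parameters of the steering control unit 166.

```python
def departure_prevention_torque(lateral_offset_m: float,
                                gain_nm_per_m: float = 2.0,
                                upper_limit_nm: float = 3.0):
    """Return (assist_torque_nm, warn) for a given lateral offset from the lane
    center. The torque is zero at the center, grows with the offset, and its
    sign points back toward the center; the warning flag is raised when the
    torque demand exceeds the upper limit (e.g. to trigger an HMI alert).
    Gains and limits are illustrative values.
    """
    demand = gain_nm_per_m * abs(lateral_offset_m)
    torque = min(demand, upper_limit_nm)
    # Torque acts opposite to the offset so that the vehicle returns to center.
    signed = -torque if lateral_offset_m > 0 else torque
    warn = demand > upper_limit_nm
    return signed, warn


for offset in (0.0, 0.5, 1.2, 2.0):
    print(offset, departure_prevention_torque(offset))
```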
< cases at times t1 and t5 >
At times t1 and t5 shown in fig. 5, the host vehicle M receives the wind WND from the lateral direction. In this case, the host vehicle M may move from the center of the lane L1 toward the dividing line LL due to the influence of the crosswind. Therefore, the steering adjustment unit 142 makes the control of the assist torque for preventing departure from the lane L1 different from the assist-torque control at time t3 (windless state).
Fig. 7 is a diagram for explaining the steering assist torque of the host vehicle M(t1) and the host vehicle M(t5). At times t1 and t5, since the host vehicle M is affected by the wind WND, the steering adjustment unit 142 adjusts the steering assist torque amount so that it is larger than the torque amount at time t3 (windless state). The steering adjustment unit 142 may set the adjustment amount to be variable based on the wind direction and wind speed of the wind WND recognized by the wind condition recognition unit 132. The steering adjustment unit 142 may also make the adjustment amount of the assist torque at time t1 different from that at time t5.
< cases at times t2 and t4 >
At times t2 and t4 shown in fig. 5, the host vehicle M receives the wind WND from the lateral direction and is assumed to be strongly affected by the wind WND due to the change in airflow caused by the other vehicle m1. At time t3, the host vehicle travels in parallel with the other vehicle m1 and is not affected by the wind WND from the other vehicle m1 side, so the degree of change in the wind WND received by the host vehicle M is large at times t2 and t4. Therefore, at the times at which the host vehicle travels through the change regions CA1 and CA2 (times t2 and t4), the steering adjustment unit 142 adjusts the assist torque amount so that it is larger than the torque amounts at the other times t1, t3, and t5.
Fig. 8 is a diagram for explaining the steering assist torque of the host vehicle M(t2) and the host vehicle M(t4). Since the host vehicle is strongly affected by the wind WND at times t2 and t4, the steering adjustment unit 142 adjusts the steering assist torque so that it becomes larger than the torque at times t1 and t5. The steering adjustment unit 142 sets the adjustment amount to be variable based on the degree of change in the change region of the wind WND recognized by the wind condition recognition unit 132. The steering adjustment unit 142 may make the adjustment amount of the assist torque at time t2 different from that at time t4.
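One way to picture the ordering of torque amounts across times t1 to t5 is as a multiplicative gain applied to the base departure-prevention torque; the coefficients below are illustrative assumptions, not values taken from the embodiment.

```python
def assist_torque_gain(in_change_region: bool,
                       crosswind: bool,
                       degree_of_change: float = 0.0) -> float:
    """Hypothetical multiplicative gain on the base departure-prevention torque.

    Windless state (t3) -> 1.0, steady crosswind (t1, t5) -> moderately larger,
    change regions CA1/CA2 (t2, t4) -> largest, scaled by the degree of change.
    The 1.2 and 0.1 coefficients are assumptions for illustration.
    """
    if in_change_region:
        return 1.2 + 0.1 * degree_of_change
    if crosswind:
        return 1.2
    return 1.0


# Ordering expected from the description: gain(t3) < gain(t1, t5) < gain(t2, t4).
print(assist_torque_gain(False, False))                       # t3, windless
print(assist_torque_gain(False, True))                        # t1 / t5, crosswind
print(assist_torque_gain(True, True, degree_of_change=3.0))   # t2 / t4, change region
```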
In the examples of fig. 6 to 8, the amount of change in the assist torque with respect to an increase in the lateral offset amount is represented non-linearly (in a curve), but the present invention is not limited thereto, and a part or all of the amount of change may be linear.
Note that, instead of adjusting the assist torque equally for both the left and right steering directions as shown in fig. 6 to 8, the steering adjustment unit 142 may apply different torque amounts in the left and right directions based on the wind direction. Fig. 9 is a diagram for explaining control in which the assist torque amount differs between the left and right lateral positions. The example of fig. 9 shows the assist torque with respect to the lateral position of the host vehicle M(t2) at time t2. At time t2 shown in fig. 5, the host vehicle M receives the lateral wind WND from the right side (the traveling lane L2 side of the other vehicle m1) with respect to the traveling direction, and therefore receives a force that moves it to the left of the lane center. Therefore, as shown in fig. 9, when the host vehicle is offset to the left of the lane center, the steering adjustment unit 142 makes the assist torque amount toward the other vehicle m1 side larger than the assist torque amount used when the host vehicle is offset to the right, so that the host vehicle promptly returns to the lane center. This can suppress the host vehicle M from departing from the lane due to the influence of the wind WND, and enables more appropriate vehicle control.
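The left/right asymmetry of fig. 9 can be sketched as a gain that depends on whether the offset is on the downwind side; the gain values and the sign convention are assumptions for illustration.

```python
def asymmetric_assist_torque(lateral_offset_m: float,
                             wind_from_right: bool,
                             base_gain: float = 2.0,
                             downwind_gain: float = 3.5) -> float:
    """Return an assist torque whose gain differs between the left and right
    sides of the lane center. When a crosswind blows from the right, the
    vehicle tends to drift left, so a larger gain is used for leftward offsets
    (pushing the vehicle back toward the other-vehicle side); the mirrored
    logic applies for wind from the left. Gains are illustrative.

    Sign convention: negative offset = left of center; positive returned torque
    steers the vehicle back toward the lane center.
    """
    drifting_with_wind = (wind_from_right and lateral_offset_m < 0) or \
                         (not wind_from_right and lateral_offset_m > 0)
    gain = downwind_gain if drifting_with_wind else base_gain
    return -gain * lateral_offset_m


# Wind from the right (fig. 9): a 0.5 m leftward offset gets the larger gain.
print(asymmetric_assist_torque(-0.5, wind_from_right=True))
print(asymmetric_assist_torque(+0.5, wind_from_right=True))
```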
[ modified examples ]
Hereinafter, modifications of the embodiment will be described. The above description has mainly dealt with the case where the host vehicle M travels in the same direction as the other vehicle m, but the steering adjustment unit 142 may also adjust the steering when the host vehicle is predicted to pass an other vehicle m traveling in the oncoming lane. Here, when the other vehicle m is an oncoming vehicle, the vehicles pass each other in a short time, and therefore, if the steering control is performed in stages as in the parallel-traveling case described above, the driver's driving (steering operation) may be adversely affected. Therefore, the wind condition recognition unit 132 recognizes, as a single change region, a region including the areas in front of and behind the other vehicle m (the oncoming vehicle) and the area to its side.
Fig. 10 is a diagram for explaining the change region for an oncoming vehicle. In the example of fig. 10, the host vehicle M is traveling in the lane L4, and the other vehicle m2 is traveling in the oncoming lane L5 of the lane L4 at the speed Vm2. In this case, the wind condition recognition unit 132 recognizes the wind condition of the wind WND, and recognizes, as the change region CA3, a region including the areas in front of and behind the oncoming vehicle m2 recognized by the other vehicle recognition unit 136 and the area to its side (the side of the traveling lane L4 of the host vehicle M). The steering adjustment unit 142 adjusts the steering assist torque so that it is larger when traveling in the change region CA3 than when traveling outside the change region CA3. The steering adjustment unit 142 may set the adjustment amount of the assist torque based on the degree of change. This enables the host vehicle M to travel stably even when the wind conditions change in a short time as the host vehicle passes the oncoming vehicle. The setting of the change region described above can also be applied, for example, to a case where the relative speed between the host vehicle M and the other vehicle m exceeds an upper limit speed on lanes in which the vehicles can travel in the same direction.
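A geometric sketch of grouping the front, rear, and side areas of an oncoming vehicle into a single change region might look as follows; the lane-coordinate representation, the margins, and the lateral reach are assumed values that would in practice depend on the relative speed and the wind conditions.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned region in lane coordinates (longitudinal s, lateral d), metres."""
    s_min: float
    s_max: float
    d_min: float
    d_max: float

def oncoming_change_region(oncoming_s: float, oncoming_d: float,
                           vehicle_length: float = 5.0,
                           margin_front: float = 15.0,
                           margin_rear: float = 15.0,
                           lateral_reach: float = 3.5) -> Region:
    """Group the areas in front of, behind, and to the host-lane side of an
    oncoming vehicle into one change region (CA3 in fig. 10)."""
    return Region(
        s_min=oncoming_s - vehicle_length / 2.0 - margin_rear,
        s_max=oncoming_s + vehicle_length / 2.0 + margin_front,
        d_min=oncoming_d - lateral_reach,   # extends toward the host vehicle's lane
        d_max=oncoming_d + 1.0,
    )

# Oncoming vehicle 60 m ahead in a lane offset 3.5 m from the host lane center.
print(oncoming_change_region(oncoming_s=60.0, oncoming_d=3.5))
```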
Further, another modification will be described. In the above-described embodiment, a change region of the wind conditions caused by the other vehicle m is recognized; in addition to (or instead of) this, it may be determined whether a change region of the wind conditions exists due to a road member that blocks the wind, such as a tunnel or a windbreak wall, and if such a change region exists, the steering may be adjusted. Fig. 11 is a diagram for explaining a change region of the wind conditions caused by a road member. Fig. 11 shows an example of the vicinity of the exit of a tunnel TNL (an example of a road member) on a road having two lanes L6 and L7 in which vehicles can travel in the same direction. In the example of fig. 11, the host vehicle M is traveling in the lane L6 at the speed VM, and the other vehicle m3 traveling ahead of the host vehicle M is traveling in the lane L6 at the speed Vm3.
Here, the inside of the tunnel TNL is a wind-shielded area that is shielded from the outside. Therefore, the host vehicle M is not affected by the wind WND while traveling in the tunnel TNL. In this case, the wind condition recognition unit 132 recognizes the wind condition of the wind WND beyond the exit of the tunnel TNL based on the movement direction and movement amount of the specific object target TG1 or the specific object target TG2 recognized by the object target recognition unit 134. The wind condition recognition unit 132 may also access, via the communication device 20, an information providing terminal TA1 that provides information on the wind conditions in the vicinity of the tunnel TNL through which the host vehicle M travels, and may acquire information on the wind conditions near the exit of the tunnel TNL from the information providing terminal TA1. The wind condition recognition unit 132 may also perform inter-vehicle communication with the other vehicle m3 traveling ahead of the host vehicle M to acquire the wind conditions near the exit of the tunnel TNL acquired by the other vehicle m3. The wind condition recognition unit 132 then recognizes the change region CA4 from the recognized wind conditions near the tunnel exit, and recognizes the degree of change based on the wind conditions, the size of the tunnel TNL, and the like. The steering adjustment unit 142 adjusts the steering of the host vehicle M when it passes through the change region CA4.
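The estimation of wind conditions from the motion of a tracked object target can be sketched as below; the assumption that the target's displacement between frames is dominated by the wind, and the coordinate convention, are simplifications for illustration only.

```python
import math

def wind_from_object_motion(dx_m: float, dy_m: float, dt_s: float):
    """Estimate wind speed (m/s) and direction (deg, measured from the host
    vehicle's traveling direction) from the displacement of a tracked,
    wind-blown object target between two camera frames.

    dx_m: displacement along the traveling direction, dy_m: lateral displacement.
    """
    if dt_s <= 0.0:
        return 0.0, 0.0
    speed = math.hypot(dx_m, dy_m) / dt_s
    direction_deg = math.degrees(math.atan2(dy_m, dx_m))
    return speed, direction_deg


# Example: a tracked target near the tunnel exit moves 0.8 m laterally in 0.1 s.
print(wind_from_object_motion(dx_m=0.0, dy_m=0.8, dt_s=0.1))
```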
The steering adjustment unit 142 may also acquire the adjustment amount of the steering assist torque used when the other vehicle m3 passed through the change region CA4, and adjust the steering assist torque amount of the host vehicle M based on the acquired adjustment amount. In this case, the steering adjustment unit 142 may correct the adjustment amount based on differences in size, vehicle type, and other performance between the other vehicle m3 and the host vehicle M.
Further, the steering adjustment unit 142 need not adjust the assist torque for the above-described change region of the wind conditions when the wind speed recognized by the wind condition recognition unit 132 is equal to or lower than a predetermined value. The action plan generating unit 140 may also, instead of (or in addition to) the adjustment of steering, adjust the speed control during automated driving based on the wind conditions recognized by the wind condition recognition unit 132. In this case, for example, when the wind speed is equal to or higher than a predetermined value and the wind direction includes a headwind or tailwind component with respect to the traveling direction of the host vehicle M, the action plan generating unit 140 adjusts the speed control based on the wind direction.
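The text only states that the speed control is adjusted based on the wind direction when the wind speed is at or above a threshold; the sketch below assumes a conservative speed reduction proportional to the head/tail component, which is one possible reading rather than the embodiment's actual rule.

```python
import math

def speed_adjustment(wind_speed_mps: float, wind_dir_rel_deg: float,
                     threshold_mps: float = 5.0, gain: float = 0.3) -> float:
    """Return a speed correction (m/s, negative = reduce the target speed).

    When the wind speed is at or above the threshold, reduce the target speed
    in proportion to the magnitude of the longitudinal (head/tail) component;
    a pure crosswind (wind_dir_rel_deg = 90) leaves the speed unchanged here.
    Threshold and gain are assumed values.
    """
    longitudinal = abs(wind_speed_mps * math.cos(math.radians(wind_dir_rel_deg)))
    if wind_speed_mps < threshold_mps:
        return 0.0
    return -gain * longitudinal


print(speed_adjustment(8.0, 180.0))   # strong headwind component
print(speed_adjustment(8.0, 90.0))    # pure crosswind, no longitudinal component
print(speed_adjustment(3.0, 0.0))     # below threshold, no adjustment
```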
In addition, some or all of the plurality of modifications described above may be combined with the embodiment. For example, when the host vehicle M is traveling toward the exit of the tunnel TNL and the other vehicle m is traveling in an adjacent lane, the wind condition recognition unit 132 recognizes the change region and the degree of change by combining the change region near the exit of the tunnel TNL with the change region including one or both of the areas in front of and behind the other vehicle m.
[ processing flow ]
Fig. 12 is a flowchart showing an example of the flow of processing executed by the automatic driving control apparatus 100. The processing of this flowchart may be repeated at a predetermined cycle or timing. First, the recognition unit 130 recognizes the surrounding environment of the host vehicle M (step S100). Next, based on the surrounding environment recognized by the recognition unit 130, the action plan generating unit 140 performs driving support including steering control by LKAS control or the like so that the host vehicle M travels in the center of the lane (step S102).
Next, the wind condition recognition unit 132 determines whether or not a change region of the wind conditions is recognized (step S104). When it is determined that a change region is recognized, the wind condition recognition unit 132 determines whether or not the host vehicle M is predicted to reach the change region of the wind conditions (step S106). When it is determined that the host vehicle M is predicted to reach the change region, the wind condition recognition unit 132 recognizes the degree of change in the wind conditions (step S108). Next, the steering adjustment unit 142 adjusts at least the steering of the host vehicle M based on the degree of change in the wind conditions recognized by the wind condition recognition unit 132 (step S110).
After the process of step S110, or if it is determined in step S104 that no change region of the wind conditions is recognized, or if it is determined in step S106 that the host vehicle M is not predicted to reach the change region, the automatic driving control apparatus 100 determines whether or not to end the processing by automated driving (step S112). The case where no change region is recognized or the host vehicle is not predicted to reach one corresponds, for example, to a case where the speed VM of the host vehicle M is equal or similar to the speed of the other vehicle m and no passing occurs.
If it is determined in step S112 that the processing is not to be ended, the processing returns to step S100. The processing of this flowchart is ended when it is determined that the processing is to be ended, for example, when an operation to end the automated driving control is received from an occupant such as the driver, or when the automated driving is ended upon arrival at the destination.
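The flow of fig. 12 can be summarized as the following per-cycle sketch over plain values; the step labels follow the flowchart, while the function signature and the returned structure are hypothetical stand-ins for the actual units.

```python
from typing import Optional

def control_cycle(change_region_degree: Optional[float],
                  will_reach_region: bool,
                  end_requested: bool) -> dict:
    """One cycle of the flow in fig. 12, sketched over plain values rather than
    the real recognition/planning units (whose interfaces are not given in the
    text). Returns which steps would run in this cycle.
    """
    steps = {"S100_recognize": True, "S102_lkas": True,
             "S108_degree": None, "S110_adjust": False, "S112_end": end_requested}
    # S104/S106: a change region was recognized and the host vehicle is
    # predicted to reach it.
    if change_region_degree is not None and will_reach_region:
        steps["S108_degree"] = change_region_degree   # degree of change (S108)
        steps["S110_adjust"] = True                   # steering adjusted (S110)
    return steps


print(control_cycle(change_region_degree=2.5, will_reach_region=True, end_requested=False))
print(control_cycle(change_region_degree=None, will_reach_region=False, end_requested=False))
```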
According to the embodiment described above, the automatic driving control device 100 includes: a recognition unit 130 that recognizes the surrounding environment of the host vehicle M; and a driving control unit (the action plan generating unit 140 and the second control unit 160) that controls at least the steering of the host vehicle M so that the host vehicle M travels at a predetermined position on the road based on the recognition result of the recognition unit 130. The recognition unit 130 recognizes a change region of the wind conditions around the host vehicle M, and the driving control unit adjusts the steering of the host vehicle M based on the degree of change of the wind conditions when it is predicted that the host vehicle will reach the change region recognized by the recognition unit 130, thereby enabling more appropriate vehicle control based on the environment around the host vehicle.
Further, according to the embodiment, for example, the steering assist torque amounts before the host vehicle travels in parallel with the other vehicle m (before passing), while traveling in parallel, and after traveling in parallel (after passing) are adjusted based on the wind conditions and the degree of change in the wind conditions, so that the holding force of the LKAS control can be increased and more stable traveling can be realized. Further, according to the embodiment, for example, when the host vehicle M travels in parallel with a large vehicle such as a truck or a bus, steering control is executed in which no reaction force against wind from the large-vehicle side is applied, or in which the reaction force is weaker than when the host vehicle is not traveling in parallel, whereby more stable traveling can be achieved.
[ hardware configuration ]
Fig. 13 is a diagram showing an example of the hardware configuration of the automatic driving control apparatus 100 according to the embodiment. As shown in the figure, the automatic driving control apparatus 100 has a configuration in which a communication controller 100-1, a CPU 100-2, a RAM (Random Access Memory) 100-3 used as a working memory, a ROM (Read Only Memory) 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD, a drive device 100-6, and the like are connected to one another via an internal bus or a dedicated communication line. The communication controller 100-1 communicates with components other than the automatic driving control apparatus 100. A program 100-5a executed by the CPU 100-2 is stored in the storage device 100-5. This program is loaded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2. This realizes a part or all of the first control unit 120 and the second control unit 160.
The above-described embodiments can be expressed as follows.
A vehicle control device, wherein
the vehicle control device includes:
a storage device storing a program; and
a hardware processor,
the hardware processor performs the following processing by executing a program stored in the storage device:
identifying the surrounding environment of the vehicle;
controlling, based on the recognized result, at least steering of the own vehicle so that the own vehicle travels at a prescribed position on the road;
identifying a change region of a change in wind conditions in the periphery of the host vehicle; and
adjusting steering of the host vehicle based on a degree of change in the wind condition in a case where the host vehicle is predicted to reach the identified change region.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (12)

1. A control apparatus for a vehicle, wherein,
the vehicle control device includes:
an identification unit that identifies the surrounding environment of the vehicle; and
a driving control unit that controls at least steering of the host vehicle so that the host vehicle travels at a predetermined position on a road based on a recognition result of the recognition unit,
the identification unit identifies a change region in which the wind condition around the host vehicle changes,
the driving control unit adjusts the steering of the host vehicle based on the degree of change in the wind condition when it is predicted that the host vehicle will reach the change region identified by the identification unit.
2. The vehicle control apparatus according to claim 1,
the recognition unit recognizes another vehicle traveling in the vicinity of the host vehicle, and, when it is predicted that the recognized other vehicle will pass by or travel in parallel with the host vehicle, recognizes, as the change region, a region on the traveling lane of the host vehicle that includes one or both of a region in front of and a region behind the other vehicle.
3. The vehicle control apparatus according to claim 2,
the other vehicle is a vehicle larger than the own vehicle.
4. The vehicle control apparatus according to claim 2,
the recognition unit recognizes, when it is recognized that the host vehicle is subjected to a crosswind before the host vehicle passes by or travels in parallel with the other vehicle, a region on the traveling lane of the host vehicle that includes one or both of a region in front of and a region behind the other vehicle as the change region.
5. The vehicle control apparatus according to claim 4,
the recognition unit recognizes, as the change region, a rear region including a part of a side region of the other vehicle on a traveling lane of the host vehicle when the speed of the host vehicle is higher than the speed of the other vehicle and it is predicted that the host vehicle will travel in parallel with the other vehicle,
the recognition unit recognizes, as the change region, a region ahead of the other vehicle on a traveling lane of the host vehicle when the host vehicle is at a speed higher than the speed of the other vehicle and the host vehicle is in a parallel traveling state with the other vehicle.
6. The vehicle control apparatus according to claim 4,
the recognition unit recognizes a front region including a part of a side region of the other vehicle on a driving lane of the host vehicle as the change region when the speed of the host vehicle is slower than the speed of the other vehicle and it is predicted that the host vehicle will run in parallel with the other vehicle,
the recognition unit recognizes, as the change region, a region behind the other vehicle on a traveling lane of the host vehicle when the host vehicle is slower than the other vehicle and the host vehicle is in a parallel traveling state with the other vehicle.
7. The vehicle control apparatus according to claim 1,
the recognition unit recognizes that a disturbance factor is among the factors causing an error between a yaw rate and a steering angle when the error is equal to or greater than a predetermined value, and recognizes that the host vehicle is subjected to a crosswind when the disturbance factor is recognized,
the driving control unit adjusts steering of the host vehicle based on a wind condition of the crosswind.
8. The vehicle control apparatus according to claim 7,
the recognition unit recognizes a predetermined object target based on the image of the periphery of the host vehicle captured by the imaging unit, and recognizes that a crosswind is blowing with respect to the traveling direction of the host vehicle when it recognizes that the movement of the recognized object target over time includes at least a movement in a direction orthogonal to the traveling direction of the host vehicle.
9. The vehicle control apparatus according to claim 1,
the recognition unit recognizes, as the change region, a point at which the host vehicle crosses out of a wind-shielded area formed by a road member that blocks wind, in a case where the host vehicle is not affected by the wind because of the road member and a crosswind with respect to the traveling direction of the host vehicle is recognized at the point at which the host vehicle crosses out of the wind-shielded area.
10. The vehicle control apparatus according to claim 1,
the driving control unit increases a steering force of the host vehicle toward the other vehicle when the recognition unit recognizes a crosswind blowing from the traveling-lane side of the recognized other vehicle.
11. A control method for a vehicle, wherein,
the vehicle control method causes an on-board computer to perform:
identifying the surrounding environment of the vehicle;
controlling, based on the recognized result, at least steering of the own vehicle so that the own vehicle travels at a prescribed position on the road;
identifying a change region of a change in wind conditions in the periphery of the host vehicle; and
adjusting steering of the host vehicle based on a degree of change in the wind condition in a case where the host vehicle is predicted to reach the identified change region.
12. A storage medium storing a program, wherein,
the program causes the vehicle-mounted computer to perform the following processing:
identifying the surrounding environment of the vehicle;
controlling, based on the recognized result, at least steering of the own vehicle so that the own vehicle travels at a prescribed position on the road;
identifying a change region of a change in wind conditions in the periphery of the host vehicle; and
adjusting steering of the host vehicle based on a degree of change in the wind condition in a case where the host vehicle is predicted to reach the identified change region.
CN202010145241.0A 2019-03-20 2020-03-04 Vehicle control device, vehicle control method, and storage medium Active CN111717280B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019052476A JP2020152222A (en) 2019-03-20 2019-03-20 Vehicle control device, vehicle control method and program
JP2019-052476 2019-03-20

Publications (2)

Publication Number Publication Date
CN111717280A true CN111717280A (en) 2020-09-29
CN111717280B CN111717280B (en) 2022-09-02

Family

ID=72557421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010145241.0A Active CN111717280B (en) 2019-03-20 2020-03-04 Vehicle control device, vehicle control method, and storage medium

Country Status (2)

Country Link
JP (1) JP2020152222A (en)
CN (1) CN111717280B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7304334B2 (en) * 2020-12-03 2023-07-06 本田技研工業株式会社 VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008254487A (en) * 2007-04-02 2008-10-23 Matsushita Electric Ind Co Ltd Side wind warning device, automobile equipped with side wind warning device, and side wind warning method
KR20130039046A (en) * 2011-10-11 2013-04-19 현대모비스 주식회사 Method for compensating side-wind based on camera sensor of lkas in motordriven power steering
CN105163994A (en) * 2013-05-01 2015-12-16 丰田自动车株式会社 Driving support apparatus and driving support method
CN204895460U (en) * 2015-07-14 2015-12-23 内蒙古麦酷智能车技术有限公司 Automatic adjusting device of a remotely piloted vehicle windage
CN107458379A (en) * 2016-06-02 2017-12-12 福特全球技术公司 A kind of system of data of method, motor vehicles and the processing for operating motor vehicles on acting on crosswind load on the rolling stock
CN107685730A (en) * 2016-08-03 2018-02-13 德尔福技术有限公司 Use the Lane Keeping System for being used to have the autonomous vehicle in the case of wind of vehicle roll
KR20180039841A (en) * 2016-10-11 2018-04-19 주식회사 만도 Steering Control Device for Compensating Cross Wind and Control Method thereof
JP2018159587A (en) * 2017-03-22 2018-10-11 株式会社豊田中央研究所 Mobile entity motion controller, mobile entity motion control method, and mobile entity motion control program
CN109318981A (en) * 2017-07-31 2019-02-12 株式会社斯巴鲁 The drive-control system of vehicle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4844103B2 (en) * 2005-11-30 2011-12-28 日産自動車株式会社 Potential risk level warning device and potential risk level warning method
JP2011204125A (en) * 2010-03-26 2011-10-13 Toyota Motor Corp Situation predicting device and route generation device
JP6677134B2 (en) * 2016-09-13 2020-04-08 スズキ株式会社 Driving support device
JP6905367B2 (en) * 2017-03-21 2021-07-21 株式会社Subaru Vehicle travel control device
JP6882958B2 (en) * 2017-08-25 2021-06-02 株式会社Subaru Vehicle driving support device

Also Published As

Publication number Publication date
JP2020152222A (en) 2020-09-24
CN111717280B (en) 2022-09-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant