CN112172809A - Vehicle control device, vehicle control method, and storage medium - Google Patents

Vehicle control device, vehicle control method, and storage medium

Info

Publication number
CN112172809A
CN112172809A (application CN202010616306.5A)
Authority
CN
China
Prior art keywords
vehicle
road
factor
narrowing factor
road narrowing
Prior art date
Legal status
Pending
Application number
CN202010616306.5A
Other languages
Chinese (zh)
Inventor
小室美纱
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN112172809A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095: Predicting travel path or likelihood of collision
    • B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W30/18: Propelling the vehicle
    • B60W30/18009: Propelling the vehicle related to particular drive situations
    • B60W30/18163: Lane change; Overtaking manoeuvres
    • B60W2554/00: Input parameters relating to objects
    • B60W2554/40: Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402: Type
    • B60W2554/4029: Pedestrians

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Provided are a vehicle control device, a vehicle control method, and a storage medium capable of predicting interference between a road narrowing factor and a vehicle. The vehicle control device includes: a periphery recognition unit that recognizes the peripheral condition of the vehicle; and a driving control unit that controls acceleration, deceleration, and steering of the vehicle based on the peripheral condition recognized by the periphery recognition unit. The periphery recognition unit determines whether a road narrowing factor is present on the road on which the vehicle is traveling and, if it is determined that one is present, identifies the length of the road narrowing factor in the traveling direction of the vehicle. The driving control unit generates an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
Conventionally, there is known a technique of estimating whether a pedestrian will step out into a lane based on the relationship between the pedestrian's base of support and the position of the body's center of gravity, and of detecting that the pedestrian is performing an avoidance action (for example, Patent Document 1: Japanese Patent Application Laid-Open No. 2017-210118).
Disclosure of Invention
Problems to be solved by the invention
However, the conventional techniques have been insufficient for predicting interference between the vehicle and a road narrowing factor (traffic participants together with stationary objects) based on the relationship between the vehicle and a stationary object that traffic participants such as pedestrians are trying to avoid.
An object of the present invention is to provide a vehicle control device, a vehicle control method, and a storage medium that can predict interference between a road narrowing factor and a vehicle.
Means for solving the problems
The vehicle control device, the vehicle control method, and the storage medium according to the present invention have the following configurations.
(1): A vehicle control device according to an aspect of the present invention includes: a periphery recognition unit that recognizes a peripheral condition of the vehicle; and a driving control unit that controls acceleration, deceleration, and steering of the vehicle based on the peripheral condition recognized by the periphery recognition unit. The periphery recognition unit determines whether a road narrowing factor is present on the road on which the vehicle is traveling and, if it is determined that the road narrowing factor is present, identifies the length of the road narrowing factor in the traveling direction of the vehicle, and the driving control unit generates an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
(2): In the aspect of (1) above, the road narrowing factor includes a stationary shoulder object and one or more traffic participants predicted to interfere with the scheduled travel track of the vehicle by moving away from the stationary shoulder object.
(3): In the aspect of (2) above, the vehicle control device further includes a prediction unit that predicts the time-series position of the traffic participant based on the result of the periphery recognition unit recognizing the length of the road narrowing factor in the traveling direction of the vehicle.
(4): in the aspect of the above (3), the prediction unit may predict the period during which the traffic participant interferes with the scheduled travel track of the vehicle, based on the length of the road narrowing factor recognized by the periphery recognition unit.
(5): In the aspect of (4) above, when the passable width available for the vehicle's scheduled travel is less than a predetermined width, the driving control unit slows down or stops the vehicle short of the road narrowing factor and waits until the period during which the road narrowing factor is determined to interfere with the scheduled travel track of the vehicle has elapsed.
(6): in the above-described aspect (4) or (5), the driving control unit may change the lateral avoidance control based on a period during which the road narrowing factor interferes with the planned travel track of the vehicle, when a passable width for which the vehicle is planned to travel is equal to or greater than a predetermined width.
(7): in any one of the above (3) to (6), the driving control unit changes the lateral avoidance control in accordance with a time-series position of the road narrowing factor on the traveling path on which the vehicle is scheduled to travel.
(8): A vehicle control method according to an aspect of the present invention causes a computer to execute: recognizing a peripheral condition of the vehicle; controlling acceleration, deceleration, and steering of the vehicle based on the peripheral condition; determining whether a road narrowing factor is present on the road on which the vehicle travels and, if it is determined that the road narrowing factor is present, identifying the length of the road narrowing factor in the traveling direction of the vehicle; and generating an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
(9): A storage medium according to an aspect of the present invention stores a program that causes a computer to execute: recognizing a peripheral condition of the vehicle; controlling acceleration, deceleration, and steering of the vehicle based on the peripheral condition; determining whether a road narrowing factor is present on the road on which the vehicle travels and, if it is determined that the road narrowing factor is present, identifying the length of the road narrowing factor in the traveling direction of the vehicle; and generating an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
Effects of the invention
According to (1) to (9), interference between the road narrowing factor and the vehicle can be predicted.
Drawings
Fig. 1 is a configuration diagram of a vehicle system 1 using a vehicle control device 100 according to a first embodiment.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160.
Fig. 3 is a diagram schematically showing a road narrowing factor on the running path LR on which the host vehicle M runs.
Fig. 4 is a diagram for explaining the time-series positions of the pedestrian P.
Fig. 5 is a plan view for explaining a case where the pedestrian P moves while avoiding another vehicle mA.
Fig. 6 is a diagram for explaining an example of the vehicle speed rule applied to the host vehicle M when the second control unit 160 causes it to travel along the avoidance trajectory.
Fig. 7 is a diagram illustrating an example of the avoidance trajectory generated by the avoidance trajectory generation unit 142.
Fig. 8 is a diagram illustrating another example of the avoidance trajectory generated by the avoidance trajectory generation unit 142.
Fig. 9 is a diagram for explaining a scene in which a stationary shoulder object on the travel path LR is the large vehicle mB.
Fig. 10 is a diagram for explaining a scene in which a stationary shoulder object on the travel path LR is the large vehicle mB.
Fig. 11 is a flowchart illustrating an example of the flow of the road narrowing factor avoidance process of the vehicle system 1.
Fig. 12 is a flowchart illustrating an example of a flow of the avoidance trajectory generation process by the avoidance trajectory generation unit 142 based on the prediction result of the prediction unit 136.
Fig. 13 is a diagram showing an example of a hardware configuration of various control devices according to the embodiment.
Description of reference numerals:
1 … vehicle system, 10 … camera, 12 … radar device, 14 … detector, 16 … object recognition device, 20 … communication device, 40 … vehicle sensor, 50 … navigation device, 51 … GNSS receiver, 53 … route determination portion, 61 … recommended lane determination portion, 80 … driving operation device, 100 … vehicle control device, 120 … first control portion, 130 … recognition portion, 132 … periphery recognition portion, 134 … road narrowing factor recognition portion, 136 … prediction portion, 140 … action plan generation portion, 142 … avoidance trajectory generation portion, 160 … second control portion, 162 … acquisition portion, 164 … speed control portion, 166 … steering control portion, 200 … running driving force output device, 210 … brake device, 220 … steering device, M … host vehicle, mA … other vehicle, mB … large vehicle, P … pedestrian.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention will be described below with reference to the accompanying drawings.
[ integral Structure ]
Fig. 1 is a configuration diagram of a vehicle system 1 using a vehicle control device 100 according to a first embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its drive source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine, or using discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a detector 14, an object recognition device 16, a driving operation element 80, a vehicle control device 100, a travel driving force output device 200, a brake device 210, and a steering device 220. These devices and apparatuses are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in fig. 1 is merely an example; a part of the configuration may be omitted, and other components may be added.
The camera 10 is a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor. The camera 10 is mounted at an arbitrary location on the vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted. When imaging the area ahead, the camera 10 is attached to the upper part of the front windshield, the back surface of the rearview mirror, or the like. The camera 10, for example, periodically and repeatedly captures images of the periphery of the host vehicle M. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M and detects the radio waves (reflected waves) reflected by an object, thereby detecting at least the position (distance and direction) of the object. The radar device 12 is mounted at an arbitrary location on the host vehicle M. The radar device 12 may detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
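As an illustration of the FM-CW principle mentioned above (a hedged sketch, not part of the patent): for a linear chirp of bandwidth B swept over time T, the beat frequency between the transmitted and received signals is proportional to range. The function name and parameter values below are hypothetical.

```python
# Hypothetical sketch of FM-CW ranging; not taken from the patent.
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_hz: float, sweep_bandwidth_hz: float, sweep_time_s: float) -> float:
    """Range [m] for a linear chirp: R = c * T * f_beat / (2 * B)."""
    return C * sweep_time_s * beat_hz / (2.0 * sweep_bandwidth_hz)
```

With a 200 MHz sweep over 1 ms, a 100 kHz beat frequency corresponds to roughly 75 m.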
The detector 14 is a LIDAR (Light Detection and Ranging) sensor. The detector 14 irradiates light around the host vehicle M and measures the scattered light, detecting the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The detector 14 is attached to an arbitrary location on the host vehicle M.
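The time-of-flight relation the detector 14 relies on (distance from the interval between emission and reception) reduces to a one-line formula; this sketch is illustrative, with hypothetical names.

```python
# Hypothetical sketch of LIDAR time-of-flight ranging; not from the patent.
C = 299_792_458.0  # speed of light [m/s]

def lidar_range(round_trip_s: float) -> float:
    """Distance [m] to the reflecting object: d = c * t / 2 (out-and-back path)."""
    return C * round_trip_s / 2.0
```

A 400 ns round trip, for example, corresponds to just under 60 m.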
The object recognition device 16 performs sensor fusion processing on the detection results of some or all of the camera 10, the radar device 12, and the detector 14 to recognize the position, type, speed, and the like of objects. The object recognition device 16 outputs the recognition results to the vehicle control device 100. The object recognition device 16 may also output the detection results of the camera 10, the radar device 12, and the detector 14 directly to the vehicle control device 100, and may be omitted from the vehicle system 1.
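The patent does not disclose how the sensor fusion processing associates detections; one common approach, assumed here purely for illustration, is nearest-neighbor gating between camera and radar detections followed by averaging of the matched positions.

```python
# Hypothetical nearest-neighbor fusion sketch; the patent does not disclose
# the actual sensor fusion algorithm.
from math import hypot

def fuse(camera_dets, radar_dets, gate=2.0):
    """Associate camera and radar detections closer than `gate` [m] and average
    their (x, y) positions; unmatched detections pass through unchanged."""
    fused, used = [], set()
    for cx, cy in camera_dets:
        best, best_d = None, gate
        for i, (rx, ry) in enumerate(radar_dets):
            d = hypot(cx - rx, cy - ry)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is None:
            fused.append((cx, cy))
        else:
            used.add(best)
            rx, ry = radar_dets[best]
            fused.append(((cx + rx) / 2, (cy + ry) / 2))
    fused += [r for i, r in enumerate(radar_dets) if i not in used]
    return fused
```

For example, `fuse([(10.0, 0.0)], [(10.4, 0.2), (30.0, 5.0)])` pairs the two nearby detections and passes the distant radar-only one through.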
The communication device 20 communicates with other vehicles present in the vicinity of the autonomous vehicle, or with various server devices via a wireless base station, using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communications), or the like.
The HMI30 presents various information to an occupant of the autonomous vehicle, and accepts input operations by the occupant. The HMI30 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
The vehicle sensors 40 include a vehicle speed sensor that detects the speed of the autonomous vehicle, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity about a vertical axis, an orientation sensor that detects the orientation of the autonomous vehicle, and the like.
The navigation device 50 includes, for example, a GNSS receiver 51, a navigation HMI 52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD or flash memory. The GNSS receiver 51 determines the position of the autonomous vehicle based on signals received from GNSS satellites. The position of the autonomous vehicle may also be determined or supplemented by an INS (Inertial Navigation System) that uses the output of the vehicle sensors 40. The navigation HMI 52 includes a display device, a speaker, a touch panel, keys, and the like, and may be partially or wholly shared with the aforementioned HMI 30. The route determination unit 53 determines, with reference to the first map information 54, a route (hereinafter referred to as an on-map route) from the position of the autonomous vehicle specified by the GNSS receiver 51 (or an arbitrary input position) to the destination input by the occupant using the navigation HMI 52. The first map information 54 is information in which road shapes are expressed by links representing roads and nodes connected by the links. The first map information 54 may also include the curvature of roads, POI (Point Of Interest) information, and the like. The on-map route is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the on-map route. The navigation device 50 may be realized by the functions of a terminal device such as a smartphone or tablet held by the occupant, and may transmit the current position and destination to a navigation server via the communication device 20 to acquire a route equivalent to the on-map route from the server.
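Since the first map information 54 expresses roads as links and nodes, route determination can be sketched as a shortest-path search over that graph. The Dijkstra implementation below is an assumption for illustration; the patent does not disclose the route determination unit 53 at this level of detail.

```python
# Hypothetical shortest-path sketch over a link/node road graph; the actual
# route determination unit 53 is not disclosed at this level of detail.
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra: graph[node] = [(neighbor, link_length_m), ...]; returns the
    node sequence of the minimum-length route from start to goal."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, n = heapq.heappop(pq)
        if n == goal:
            break
        if d > dist.get(n, float("inf")):
            continue  # stale queue entry
        for m, w in graph.get(n, []):
            nd = d + w
            if nd < dist.get(m, float("inf")):
                dist[m], prev[m] = nd, n
                heapq.heappush(pq, (nd, m))
    path, n = [goal], goal
    while n != start:
        n = prev[n]  # raises KeyError if the goal is unreachable
        path.append(n)
    return path[::-1]
```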
The MPU 60 includes, for example, a recommended lane determining unit 61, and holds second map information 62 in a storage device such as an HDD or flash memory. The recommended lane determining unit 61 divides the on-map route provided by the navigation device 50 into a plurality of blocks (for example, every 100 [m] in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determining unit 61 determines which lane from the left to travel in. When a branch point exists on the on-map route, the recommended lane determining unit 61 determines the recommended lane so that the autonomous vehicle can travel on a reasonable route toward the branch destination.
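The division of the on-map route into fixed-length blocks (the 100 m figure above) can be sketched as follows; the function names are hypothetical.

```python
# Hypothetical sketch of the block division; names are assumptions.
def split_into_blocks(route_length_m: float, block_m: float = 100.0):
    """Divide an on-map route into (start, end) blocks of block_m meters
    along the vehicle traveling direction; the final block may be shorter."""
    blocks, start = [], 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks
```

A 250 m route, for example, yields blocks of 100 m, 100 m, and 50 m, each of which would then receive its own recommended lane.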
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on the boundary of a lane, and the like. The second map information 62 may include road information, traffic regulation information, address information (address/zip code), facility information, telephone number information, and the like. The second map information 62 can be updated at any time by the communication device 20 communicating with other devices.
The driving operation members 80 include, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a joystick, and other operation members. A sensor for detecting the operation amount or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the vehicle control device 100 or a part or all of the running driving force output device 200, the brake device 210, and the steering device 220.
The vehicle control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (circuitry) such as an LSI, an ASIC, an FPGA, or a GPU, or by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or flash memory of the vehicle control device 100, or may be stored in a removable storage medium (non-transitory storage medium) such as a DVD or CD-ROM and installed in the HDD or flash memory of the vehicle control device 100 by mounting the storage medium in a drive device.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The first control unit 120 realizes, for example, an AI (Artificial Intelligence) function and a function based on a predetermined model in parallel. For example, the function of "recognizing an intersection" may be realized by performing, in parallel, intersection recognition by deep learning or the like and recognition based on predetermined conditions (such as the presence of signals and road signs that allow pattern matching), scoring both results, and evaluating them comprehensively. This ensures the reliability of automated driving.
The recognition unit 130 recognizes the periphery of the host vehicle M and estimates the behavior of the recognized object. The recognition unit 130 includes, for example, a periphery recognition unit 132, a road narrowing factor recognition unit 134, and a prediction unit 136.
The periphery recognition unit 132 recognizes the state of the object (including a preceding vehicle and an opposing vehicle described later) in the periphery of the autonomous vehicle, such as the position, speed, and acceleration, based on information input from the camera 10, the radar device 12, and the probe 14 via the object recognition device 16. The position of the object is recognized as a position on absolute coordinates with a representative point (center of gravity, center of a drive shaft, etc.) of the autonomous vehicle as an origin, for example, and used for control. The position of the object may be represented by a representative point such as the center of gravity, a corner, or the like of the object, or may be represented by a region represented by the representative point. The "state" of the object may also include acceleration, jerk, or "state of action" of the object (e.g., whether a lane change is being made or is about to be made).
The periphery recognition unit 132 recognizes, for example, the lane in which the autonomous vehicle is traveling (the traveling lane). For example, the periphery recognition unit 132 recognizes the traveling lane by comparing the pattern of road dividing lines (e.g., the arrangement of solid and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the autonomous vehicle recognized from images captured by the camera 10. The periphery recognition unit 132 is not limited to road dividing lines, and may recognize the traveling lane by recognizing traveling road boundaries (road boundaries) that include road dividing lines, road shoulders, curbs, center dividers, guardrails, and the like. In this recognition, the position of the autonomous vehicle acquired from the navigation device 50 and the processing result of the INS may be taken into account. The periphery recognition unit 132 also recognizes temporary stop lines, obstacles, red lights, toll booths, and other road phenomena.
The periphery recognition unit 132 recognizes the position and posture of the autonomous vehicle with respect to the travel lane when recognizing the travel lane. The periphery recognition unit 132 can recognize, for example, the deviation of the reference point of the autonomous vehicle from the center of the lane and the angle formed by the traveling direction of the autonomous vehicle with respect to a line connecting the centers of the lanes as the relative position and posture of the autonomous vehicle with respect to the traveling lane. Instead, the periphery recognition unit 132 may recognize the position of the reference point of the autonomous vehicle with respect to an arbitrary side end portion (road dividing line or road boundary) of the traveling lane as the relative position of the autonomous vehicle with respect to the traveling lane.
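The lateral deviation from the lane center and the heading angle relative to the line connecting the lane centers, described above, can be computed geometrically. This sketch assumes a straight center-line segment and planar coordinates (x forward, y left); it is an illustration, not the patent's implementation.

```python
# Hypothetical geometric sketch; not the patent's implementation.
import math

def lane_relative_pose(vehicle_xy, vehicle_heading_rad, center_a, center_b):
    """Signed lateral deviation [m] from the lane-center segment a->b and the
    vehicle heading error [rad] relative to that segment (x forward, y left)."""
    ax, ay = center_a
    bx, by = center_b
    vx, vy = vehicle_xy
    seg_len = math.hypot(bx - ax, by - ay)
    # Cross product of the segment direction with the vector to the vehicle
    # gives the signed lateral offset (positive when left of the center line).
    offset = ((bx - ax) * (vy - ay) - (by - ay) * (vx - ax)) / seg_len
    heading_error = vehicle_heading_rad - math.atan2(by - ay, bx - ax)
    return offset, heading_error
```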
The periphery recognition unit 132 recognizes information about the lane in which the host vehicle M will travel based on the peripheral vehicles of the host vehicle M recognized from images captured by the camera 10, congestion information around the host vehicle M acquired by the navigation device 50, or position information obtained from the second map information 62. The information related to the lane to be traveled includes, for example, the lane width of the lane in which the host vehicle M will travel. The periphery recognition unit 132 outputs the recognition result to the road narrowing factor recognition unit 134.
The road narrowing factor recognition unit 134 recognizes road narrowing factors using the recognition result of the periphery recognition unit 132. Road narrowing factors include, for example, stationary shoulder objects that temporarily narrow the road, such as another vehicle parked on the shoulder or a safety fence (guard fence) installed for construction, and traffic participants who move so as to avoid those stationary shoulder objects. The traffic participants are, for example, pedestrians, bicycles, motorcycles, and other vehicles. When it recognizes that a road narrowing factor is present, the road narrowing factor recognition unit 134 acquires the result of further recognizing the road narrowing factor as three-dimensional information (hereinafter abbreviated as "three-dimensional information").
The road narrowing factor identifying unit 134 outputs the road narrowing factor to the predicting unit 136 as a result of the three-dimensional information identification, for example.
The prediction unit 136 predicts the time-series position of the road narrowing factor, particularly the time-series position of the traffic participant, based on the recognition results of the periphery recognition unit 132 and the road narrowing factor recognition unit 134. The prediction unit 136 sets arbitrary points on the contour of the road narrowing factor as candidate points and predicts the time-series position of the traffic participant by predicting the time-series movement of those candidate points.
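The patent does not specify the motion model applied to the candidate points; a constant-velocity model is assumed below purely as an illustration of predicting a candidate point's time-series position.

```python
# Constant-velocity assumption for illustration; the patent does not specify
# the motion model applied to the candidate points.
def predict_positions(point_xy, velocity_xy, dt: float, steps: int):
    """Time-series positions of a candidate point on the road narrowing
    factor's contour, sampled every dt seconds for `steps` steps."""
    x, y = point_xy
    vx, vy = velocity_xy
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]
```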
The action plan generating unit 140 generates a target trajectory along which the host vehicle M will travel in the future so that, in principle, it travels in the recommended lane determined by the recommended lane determining unit 61 and executes automated driving according to the surrounding conditions of the host vehicle M. The target trajectory includes, for example, a speed element. For example, the target trajectory is expressed as a sequence of points (track points) that the host vehicle M should reach. The track points are points that the host vehicle M should reach at intervals of a predetermined travel distance (for example, every few meters) along the route; separately, a target speed and a target acceleration for each predetermined sampling period (for example, a few tenths of a second) are generated as part of the target trajectory.
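The target trajectory described above (track points spaced a few meters apart, each carrying a speed element) can be represented as a simple data structure; the class and field names are hypothetical.

```python
# Hypothetical representation of the target trajectory; names are assumptions.
from dataclasses import dataclass

@dataclass
class TrackPoint:
    x: float         # longitudinal position along the route [m]
    y: float         # lateral position [m]
    v_target: float  # target speed at this point [m/s]

def build_target_track(length_m: float, spacing_m: float, v_target: float):
    """Target trajectory as equally spaced track points, each carrying the
    speed element described above (spacing of a few meters)."""
    n = int(length_m // spacing_m) + 1
    return [TrackPoint(i * spacing_m, 0.0, v_target) for i in range(n)]
```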
The action plan generating unit 140 includes, for example, an avoidance trajectory generation unit 142. The avoidance trajectory generation unit 142 generates a target trajectory along which the host vehicle M moves so as to avoid the road narrowing factor, based on the prediction result of the prediction unit 136. The avoidance trajectory generation unit 142 generates an avoidance trajectory corresponding to the road narrowing factor based on the length, in the traveling direction of the vehicle, of the road narrowing factor recognized by the road narrowing factor recognition unit 134.
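One way to make the avoidance trajectory depend on the recognized length of the road narrowing factor is a trapezoidal lateral-offset profile whose hold section spans that length plus a margin. The shape and parameters below are assumptions for illustration, not the disclosed method.

```python
# Hypothetical trapezoidal lateral-offset profile; not the disclosed method.
def avoidance_offsets(xs, factor_start, factor_len, margin, ramp, max_offset):
    """Lateral offset [m] at each longitudinal sample x [m]: zero far away,
    ramping out over `ramp` meters, holding `max_offset` alongside the road
    narrowing factor (its length in the traveling direction plus `margin`
    on both ends), then ramping back in."""
    hold_start = factor_start - margin
    hold_end = factor_start + factor_len + margin
    out = []
    for x in xs:
        if x <= hold_start - ramp or x >= hold_end + ramp:
            out.append(0.0)
        elif x < hold_start:
            out.append(max_offset * (x - (hold_start - ramp)) / ramp)
        elif x <= hold_end:
            out.append(max_offset)
        else:
            out.append(max_offset * (hold_end + ramp - x) / ramp)
    return out
```

A longer recognized factor directly lengthens the hold section, so the host vehicle M stays offset for the whole extent of the factor.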
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the autonomous vehicle passes through the target trajectory generated by the action plan generating unit 140 at a predetermined timing. The configuration in which the action plan generating unit 140 and the second control unit 160 are combined is an example of the "driving control unit".
Returning to fig. 1, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information on the target trajectory (track points) generated by the action plan generation unit 140 and stores it in a memory (not shown). The speed control unit 164 controls the running driving force output device 200 or the brake device 210 based on the speed element accompanying the target trajectory stored in the memory. The steering control unit 166 controls the steering device 220 according to the degree of curvature of the target trajectory stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. For example, the steering control unit 166 combines feedforward control according to the curvature of the road ahead of the autonomous vehicle with feedback control based on the deviation from the target trajectory.
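The combination of curvature feedforward and deviation feedback described above can be sketched with a kinematic-bicycle feedforward term; the gains and the proportional feedback law are assumptions for illustration.

```python
# Hypothetical sketch of feedforward + feedback steering; gains are assumed.
import math

def steering_command(curvature, wheelbase_m, lateral_error_m, heading_error_rad,
                     k_y=0.5, k_psi=1.0):
    """Steering angle [rad]: kinematic-bicycle feedforward from path curvature
    (delta_ff = atan(L * kappa)) plus proportional feedback that drives the
    lateral and heading deviations from the target trajectory to zero."""
    ff = math.atan(wheelbase_m * curvature)
    fb = -k_y * lateral_error_m - k_psi * heading_error_rad
    return ff + fb
```

With zero tracking error, the command reduces to the pure feedforward term for the road curvature ahead.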
The running drive force output device 200 outputs running drive force (torque) for running of the vehicle to the drive wheels. The traveling driving force output device 200 includes, for example, a combination of an internal combustion engine, a motor, a transmission, and the like, and an ECU that controls the combination. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a brake caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control unit 160 or from the driving operation element 80 so that braking torque corresponding to the braking operation is output to each wheel. The brake device 210 may include, as a backup, a mechanism for transmitting the hydraulic pressure generated by operation of the brake pedal included in the driving operation element 80 to the hydraulic cylinder via a master cylinder. The brake device 210 is not limited to the configuration described above, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the orientation of the steered wheels by, for example, applying a force to a rack-and-pinion mechanism. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the orientation of the steered wheels.
Fig. 3 is a diagram schematically showing a road narrowing factor on the travel path LR on which the host vehicle M travels. The road width of the travel path LR is WR. The travel path LR may be a single-lane road or may include other adjacent lanes not shown. The X axis in the figure is the longitudinal direction of the travel path LR and the axis along the planned traveling direction of the host vehicle M. The Y axis in the figure is the axis in the width direction of the travel path LR relative to the traveling direction of the host vehicle M. The Z axis in the figure is the axis in the height direction of the host vehicle M. The road narrowing factor recognition unit 134 recognizes, as three-dimensional information, the length of the stationary shoulder object in the X-axis direction (the traveling direction of the host vehicle M), the length (width) of the stationary shoulder object in the Y-axis direction, the height of the stationary shoulder object in the Z-axis direction, and information on the traffic participants.
In the following description, it is assumed that the stationary shoulder object constituting the road narrowing factor is another vehicle mA and the traffic participant is a pedestrian P.
The road narrowing factor recognition unit 134 recognizes the other vehicle mA parked ahead of the host vehicle M in the traveling direction (X-axis direction) and the pedestrian P farther ahead of the other vehicle mA in the traveling direction.
The prediction unit 136 predicts the time-series position of the pedestrian P based on the recognition result recognized by the periphery recognition unit 132 and the recognition result recognized by the road narrowing factor recognition unit 134.
The prediction unit 136 outputs the prediction result using one or more indices. For example, for each of a plurality of candidate points on the traveling direction side of the host vehicle M, the prediction unit 136 derives a first index R (risk potential) whose value becomes smaller as the distance from the road narrowing factor recognized by the road narrowing factor recognition unit 134 increases, and associates the first index R with each candidate point. Here, "associating" means, for example, storing the pieces of information in a memory as information corresponding to each other. In the present embodiment, the first index R is treated as a "negative" element whose value is preferable as it approaches zero, whereas the score described later is treated as a "positive" element whose value is preferable as it increases, but this relationship may be reversed. In other words, the prediction unit 136 derives a first index R having a larger value for candidate points closer to the road narrowing factor recognized by the road narrowing factor recognition unit 134. For example, taking the representative point of the other vehicle mA, which is the stationary shoulder object constituting a road narrowing factor, as the center, the prediction unit 136 derives a first index RmA for each candidate point such that RmA becomes larger as the candidate point is closer to the representative point and smaller as it is farther away.
For example, the distribution of the first index RmA of the other vehicle mA is derived such that contour lines of equal value form an ellipse elongated in the X-axis direction. The ratio of the major axis to the minor axis of the ellipse varies depending on, for example, the longitudinal length of the road narrowing factor. The prediction unit 136 sets the major axis of the ellipse representing the distribution of the first index RmA based on the vehicle length LmA of the other vehicle mA, and sets the minor axis of the ellipse based on the vehicle width WmA of the other vehicle mA. In fig. 3, the outer edge line of the ellipse, on which the first index RmA of the other vehicle mA is zero, is shown by a broken line.
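As an illustration of the elliptical index distribution described above, the sketch below computes a first index whose zero contour is an ellipse scaled by the object's length and width. The linear decay, the use of the vehicle length and width directly as semi-axes, and the peak value `r_max` are assumptions for the sketch; the patent does not specify the functional form.

```python
import math

def risk_potential(px, py, cx, cy, length, width, r_max=1.0):
    """Illustrative first index R for a shoulder object centered at (cx, cy).

    The zero-contour ellipse has its semi-major axis along the travel
    direction (X), scaled by the object's length, and its semi-minor
    axis across the road (Y), scaled by the object's width. R peaks at
    the representative point and falls linearly to zero on the ellipse.
    """
    a = length  # semi-major axis (assumed proportional to vehicle length LmA)
    b = width   # semi-minor axis (assumed proportional to vehicle width WmA)
    # Normalized elliptical distance: 1.0 exactly on the zero-contour ellipse.
    d = math.sqrt(((px - cx) / a) ** 2 + ((py - cy) / b) ** 2)
    return max(0.0, r_max * (1.0 - d))

# A parked vehicle mA, 4.5 m long and 1.8 m wide, centered at the origin:
print(risk_potential(0.0, 0.0, 0.0, 0.0, 4.5, 1.8))  # 1.0 at the center
print(risk_potential(4.5, 0.0, 0.0, 0.0, 4.5, 1.8))  # 0.0 on the ellipse
```

Summing such a function for the other vehicle mA and the pedestrian P over the candidate points would give the combined first index R described later.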
Similarly, the prediction unit 136 derives the first index RP for each candidate point, with the representative point of the pedestrian P, who is part of the road narrowing factor, as the center, such that the first index RP becomes larger as the candidate point is closer to the representative point and smaller as it is farther away. In fig. 3, the outer edge line of the ellipse, on which the first index RP of the pedestrian P is zero, is shown by a broken line.
The prediction unit 136 predicts the passable width D through which the host vehicle M can pass by predicting the time-series position of the pedestrian P, and outputs the predicted passable width D to the avoidance trajectory generation unit 142.
Fig. 4 is a diagram for explaining the time-series positions of the pedestrian P. Fig. 4 shows the following scenario: after a predetermined time has elapsed from the state shown in fig. 3, the pedestrian P moves while avoiding another vehicle mA, and interferes with the intended travel track of the host vehicle M. While the pedestrian P moves while avoiding another vehicle mA, the road width through which the host vehicle M can pass is minimized (D in the figure).
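The minimum road width that remains while the pedestrian detours around the parked vehicle can be sketched as follows; the sample protrusion values and the assumption that the road narrowing factor juts in from a single road edge are illustrative, not taken from the figures.

```python
def passable_width(road_width, protrusions):
    """Minimum width left for the host vehicle M over a time series.

    `protrusions` lists, per predicted time step, how far the combined
    road narrowing factor (parked vehicle plus detouring pedestrian)
    protrudes from the road edge in meters. The minimum remaining
    width over the series is the critical passable width D.
    """
    return min(road_width - p for p in protrusions)

# Road WR = 6.0 m; the pedestrian swings out past the 1.8 m wide parked
# vehicle, peaking at 3.1 m of protrusion mid-detour:
D = passable_width(6.0, [1.8, 2.6, 3.1, 2.6, 1.8])
print(round(D, 2))  # 2.9
```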
The prediction unit 136 derives a first index R of the road narrowing factor obtained by combining the first index RmA of the other vehicle mA, which is the stationary shoulder object, and the first index RP of the pedestrian P, who is the traffic participant. For example, when the pedestrian P approaches the other vehicle mA, the prediction unit 136 may derive the first index R by treating the other vehicle mA and the pedestrian P as a single road narrowing factor, or may derive the first index RmA of the other vehicle mA and the first index RP of the pedestrian P separately.
The prediction unit 136 predicts the period during which the host vehicle M interferes with the road narrowing factor, based on the vehicle length LmA of the other vehicle mA. The interference period is the period during which, in terms of lateral position, the first index RP of the pedestrian P avoiding the other vehicle mA interferes with the host vehicle M, which is itself avoiding the first index RmA of the other vehicle mA.
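A rough estimate of this interference period can be derived from the vehicle length, as sketched below; the walking speed and the margin before and after the vehicle are assumed values, not figures from the patent.

```python
def interference_period(vehicle_length, walk_speed, margin=1.0):
    """Approximate time the pedestrian P spends beside the parked
    vehicle mA, i.e., the period during which the pedestrian's detour
    and the host vehicle's lateral avoidance would overlap.

    The pedestrian must cover the vehicle length plus an assumed
    margin at each end at a roughly constant walking speed.
    """
    return (vehicle_length + 2 * margin) / walk_speed

# A 4.5 m vehicle at a 1.3 m/s walking pace: about 5 s of interference.
print(round(interference_period(4.5, 1.3), 1))  # 5.0
```

A longer stationary shoulder object, such as the large vehicle mB discussed later, would extend this period proportionally.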
In addition, the prediction unit 136 may determine that the host vehicle M does not interfere with the road narrowing factor when it predicts that the pedestrian P will wait, in the state shown in fig. 3, for the host vehicle M to pass the other vehicle mA.
The avoidance trajectory generation unit 142 generates the avoidance trajectory by changing the lateral avoidance control based on the time-series position of the pedestrian P predicted by the prediction unit 136, the relationship between the passable width D and the vehicle width WM of the host vehicle, and the period during which the host vehicle M interferes with the road narrowing factor. The avoidance trajectory generation unit 142 generates the avoidance behavior of the host vehicle M in consideration of, in particular, the time-series position of the pedestrian P at which the passable width D is smallest. In the avoidance behavior, the host vehicle M is temporarily stopped in front of the other vehicle mA or travels slowly in front of the other vehicle mA. The second control unit 160 may change the speed at which the host vehicle M travels on the avoidance trajectory generated by the avoidance trajectory generation unit 142, depending on whether or not the pedestrian P has noticed the host vehicle M. When the pedestrian P has not noticed the host vehicle M, the second control unit 160 controls the speed of the host vehicle M so that it travels more slowly than when the pedestrian P has noticed the host vehicle M.
The estimation of whether or not the pedestrian P has noticed the host vehicle M may be performed by the prediction unit 136 based on, for example, the orientation of the pedestrian P's face captured by the camera 10 or a change in the pedestrian P's moving speed, or may be performed by another image analysis method.
Fig. 5 is a plan view for explaining how the pedestrian P moves while avoiding the other vehicle mA. The upper left diagram of fig. 5 illustrates the position prediction performed by the prediction unit 136 when the pedestrian P starts avoiding the other vehicle mA at time T. The prediction unit 136 predicts that the pedestrian P starts avoiding the other vehicle mA at time T, moves to the vicinity of the front end of the other vehicle mA at time T+ta after a predetermined time ta elapses (upper right diagram of fig. 5), moves to the vicinity of the rear end of the other vehicle mA at time T+2ta after a further predetermined time ta elapses (lower left diagram of fig. 5), and finishes avoiding the other vehicle mA at time T+3ta (lower right diagram of fig. 5). The prediction unit 136 derives the first index R of the road narrowing factor based on the time-series positions from time T to time T+3ta over which the pedestrian P is predicted to move while avoiding the other vehicle mA, as shown in fig. 5. In addition, the prediction unit 136 predicts the passable width D at the portion where the road narrowing factor protrudes farthest in the Y-axis direction at each time shown in fig. 5.
The prediction unit 136 determines whether or not the avoidance period during which the host vehicle M avoids the other vehicle mA overlaps with the avoidance period during which the pedestrian P avoids the other vehicle mA. When it determines that the periods do not overlap, the prediction unit 136 outputs an instruction to the action plan generation unit 140 to execute control for avoiding only the other vehicle mA, because the pedestrian P is then unlikely to affect the traveling of the host vehicle M.
When the passable width D is sufficiently larger than the vehicle width WM of the host vehicle M (for example, when the passable width D is equal to or larger than a threshold Th1), the prediction unit 136 determines that, even while the pedestrian P is passing beside the other vehicle mA, the host vehicle M can travel while avoiding the other vehicle mA and the pedestrian P, which constitute the road narrowing factor. Here, the threshold Th1 is, for example, a value defined by the sum of the vehicle width WM and a predetermined margin (approximately 50 to 80 cm). When the passable width D is smaller than the threshold Th1 and equal to or larger than a threshold Th2, the prediction unit 136 determines that the host vehicle M should travel slowly while avoiding the road narrowing factor, that is, the other vehicle mA and the pedestrian P, while the pedestrian P is passing beside the other vehicle mA. Here, the threshold Th2 defines a stricter condition than the threshold Th1 (it is harder to determine that the host vehicle M can pass), and is, for example, a value defined by the sum of the vehicle width WM and a predetermined margin (approximately 20 to 50 cm). When the passable width D is smaller than the threshold Th2, the prediction unit 136 determines that the host vehicle M should be temporarily stopped in front of the other vehicle mA while the pedestrian P is passing beside it, and should start traveling again after the period determined as interference, during which the pedestrian P passes, has elapsed. The threshold Th1 and the threshold Th2 are examples of the "predetermined width".
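The three-way decision on the thresholds Th1 and Th2 can be sketched as follows. The margins of 0.65 m and 0.35 m are illustrative midpoints of the ranges the text cites (roughly 50 to 80 cm and 20 to 50 cm); they are assumptions, not values fixed by the patent.

```python
def avoidance_mode(D, vehicle_width, margin_pass=0.65, margin_slow=0.35):
    """Decide how the host vehicle M negotiates the road narrowing factor.

    Th1 = vehicle width WM + a comfortable margin; Th2 = WM + a tighter
    margin (Th2 < Th1, the stricter condition).
    """
    th1 = vehicle_width + margin_pass
    th2 = vehicle_width + margin_slow
    if D >= th1:
        return "pass"           # avoid laterally while the pedestrian passes
    if D >= th2:
        return "travel_slowly"  # creep past while the pedestrian is beside mA
    return "stop_and_wait"      # stop until the interference period elapses

print(avoidance_mode(3.0, 1.8))  # pass
print(avoidance_mode(2.3, 1.8))  # travel_slowly
print(avoidance_mode(1.9, 1.8))  # stop_and_wait
```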
The avoidance trajectory generation unit 142 generates the avoidance trajectory based on the prediction result of the prediction unit 136 shown in fig. 5.
When the prediction result that the pedestrian P will avoid the other vehicle mA is output from the prediction unit 136, the avoidance trajectory generation unit 142 may set the avoidance trajectory of the host vehicle M in consideration of the change over time of the region of the first index R corresponding to the pedestrian P, as shown in fig. 5. More specifically, at time T the pedestrian P is at a position far from the host vehicle M, so the avoidance trajectory may be set so as to pass laterally close to the other vehicle mA; around time T+3ta, the trajectory may be set so as to pass laterally close to the other vehicle mA after passing beside the pedestrian P.
That is, based on all the prediction results from time T to T+3ta, the avoidance trajectory generation unit 142 may generate an avoidance trajectory that realizes the minimum avoidance operation according to the first index R, without performing a large avoidance operation that assumes the pedestrian P is always present.
Fig. 6 is a diagram for explaining an example of the vehicle speed rule for the host vehicle M when the second control unit 160 causes it to travel on the avoidance trajectory. The second control unit 160 controls the vehicle speed while the host vehicle M is avoiding the road narrowing factor, using the legal speed VL as a reference, based on the passable width D and the estimation result of whether or not the pedestrian P has noticed the host vehicle M.
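Since Fig. 6 itself is not reproduced here, the speed rule can only be sketched under assumptions: the scaling factors applied to the legal speed VL and the extra reduction when the pedestrian has not noticed the host vehicle are illustrative values.

```python
def target_speed(legal_speed, D, th1, th2, pedestrian_noticed):
    """Illustrative vehicle speed while traveling the avoidance trajectory.

    Below Th2 the vehicle stops; between Th2 and Th1 it creeps; at or
    above Th1 it passes at a reduced but steady speed. All scaling
    factors are assumed, since the patent's Fig. 6 defines the rule.
    """
    if D < th2:
        return 0.0                    # temporary stop in front of mA
    base = 0.3 * legal_speed if D < th1 else 0.6 * legal_speed
    # Travel more slowly when the pedestrian has not noticed the vehicle.
    return base if pedestrian_noticed else 0.5 * base

print(target_speed(40.0, 1.9, 2.45, 2.15, True))   # 0.0 (stop and wait)
print(target_speed(40.0, 2.3, 2.45, 2.15, False))  # 6.0 (half of creep speed)
```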
Fig. 7 is a diagram illustrating an example of the avoidance trajectory generated by the avoidance trajectory generation unit 142. The avoidance trajectory generation unit 142 generates the avoidance trajectory based on the vehicle length LmA of the other vehicle mA recognized by the road narrowing factor recognition unit 134. For example, when the passable width D is smaller than the threshold Th2, the avoidance trajectory generation unit 142 generates an avoidance trajectory K1 that temporarily stops the host vehicle M in front of the other vehicle mA and waits until the pedestrian P finishes moving beside the other vehicle mA (left diagram of fig. 7). Further, after the pedestrian P finishes moving beside the other vehicle mA, the avoidance trajectory generation unit 142 may generate an avoidance trajectory K2 that causes the host vehicle M to travel while avoiding the other vehicle mA (right diagram of fig. 7). The second control unit 160 causes the host vehicle M to travel in accordance with the vehicle speed rule shown in fig. 6.
Fig. 8 is a diagram illustrating another example of the avoidance trajectory generated by the avoidance trajectory generation unit 142. For example, when the passable width D is larger than the threshold value Th1, the avoidance trajectory generation unit 142 generates the avoidance trajectory K3 that allows the host vehicle M to travel beside the pedestrian P even if the pedestrian P is moving beside the other vehicle mA. The second control unit 160 causes the host vehicle M to travel in accordance with the vehicle speed rule shown in fig. 6.
Figs. 9 and 10 are diagrams for explaining a scene in which the stationary shoulder object on the travel path LR is a large vehicle mB. The large vehicle mB is a vehicle whose length is greater than that of the other vehicle mA. The prediction unit 136 predicts the period during which the large vehicle mB and the pedestrian P, which constitute the road narrowing factor, interfere with the host vehicle M, based on the recognition result of the vehicle width WmB and the vehicle length LmB of the large vehicle mB by the periphery recognition unit 132. The avoidance trajectory generation unit 142 generates the avoidance trajectory K4 shown in fig. 10 based on the prediction result of the prediction unit 136.
[ treatment procedure ]
Fig. 11 is a flowchart illustrating an example of the flow of the road narrowing factor avoidance process of the vehicle system 1.
First, the periphery recognition unit 132 recognizes the surrounding situation of the host vehicle M (step S100). Next, the road narrowing factor recognition unit 134 determines whether or not a stationary shoulder object and a traffic participant constituting a road narrowing factor are present in the vicinity of the host vehicle M (step S102). When it determines that no road narrowing factor is present, the road narrowing factor recognition unit 134 ends the processing of this flowchart. When it determines that a road narrowing factor is present, the road narrowing factor recognition unit 134 recognizes the length of the stationary shoulder object (step S104). Next, the road narrowing factor recognition unit 134 recognizes the moving speed of the traffic participant (step S106).
Next, the prediction unit 136 predicts the positions of the traffic participant based on the recognition result of the road narrowing factor recognition unit 134 (step S108), and determines whether or not the traffic participant will move onto the travel path on which the host vehicle M is scheduled to travel (step S110). When it determines that the traffic participant will not move onto the travel path, the prediction unit 136 ends the processing of this flowchart. When it determines that the traffic participant will move onto the travel path, the prediction unit 136 predicts the period during which the traffic participant stays on the travel path (step S112). Next, the avoidance trajectory generation unit 142 generates the avoidance trajectory for the host vehicle M based on the prediction result of the prediction unit 136 (step S114), and the processing of this flowchart ends.
Fig. 12 is a flowchart illustrating an example of a flow of the avoidance trajectory generation process by the avoidance trajectory generation unit 142 based on the prediction result of the prediction unit 136. The flowchart shown in fig. 12 explains in more detail the processing in step S108 to step S114 of the flowchart of fig. 11.
First, the prediction unit 136 sets an index relating to the road narrowing factor based on the recognition result of the length of the stationary shoulder object by the road narrowing factor recognition unit 134 (step S200). Next, the prediction unit 136 predicts the time-series position of the traffic participant on the travel path and predicts the period during which the traffic participant interferes (step S202). Next, the prediction unit 136 determines whether or not the avoidance period during which the host vehicle M avoids the other vehicle mA, which is a road narrowing factor, overlaps with the avoidance period during which the traffic participant avoids the other vehicle mA (step S204). When it determines that the periods do not overlap, the prediction unit 136 outputs to the action plan generation unit 140 an avoidance trajectory in which the host vehicle M avoids the other vehicle mA, taking only the other vehicle mA into consideration as the road narrowing factor (step S206). When it determines that the periods overlap, the prediction unit 136 treats both the other vehicle mA and the traffic participant as the road narrowing factor, estimates whether or not the traffic participant has noticed the host vehicle, and determines the degree of deceleration of the host vehicle M when avoiding the road narrowing factor based on the estimation result (step S208). Next, the prediction unit 136 determines whether or not the passable width D has become smaller than a predetermined value due to the road narrowing factor (step S210). When the passable width D is equal to or larger than the predetermined value, the avoidance trajectory generation unit 142 generates an avoidance trajectory in which the lateral avoidance control is changed based on the recognition result of the road narrowing factor recognition unit 134 (step S212).
When the passable width D is smaller than the predetermined value, the avoidance trajectory generation unit 142 generates a trajectory on which the host vehicle M travels slowly or stops and waits (step S214). After the processing of step S212 or step S214, the second control unit 160 controls acceleration/deceleration and steering along the trajectory generated in step S212 or step S214 (step S216). The processing of this flowchart then ends.
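The combined control flow of the two flowcharts can be condensed into the decision skeleton below. The boolean and numeric inputs stand in for the recognition and prediction results of the units described above, and the return labels are informal names for the patent's steps (shown as comments); this framing is an assumption for the sketch.

```python
def avoidance_decision(factor_present, participant_enters_path,
                       avoidance_periods_overlap, D, predetermined_width):
    """Decision skeleton of Figs. 11-12, labeled with the step numbers."""
    if not factor_present:                    # S102: no road narrowing factor
        return "no_action"
    if not participant_enters_path:           # S110: participant stays clear
        return "no_action"
    if not avoidance_periods_overlap:         # S204: periods do not overlap
        return "avoid_shoulder_object_only"   # S206
    if D < predetermined_width:               # S210: passable width too small
        return "travel_slowly_or_stop"        # S214
    return "change_lateral_avoidance"         # S212

print(avoidance_decision(True, True, True, 2.0, 2.45))   # travel_slowly_or_stop
print(avoidance_decision(True, True, False, 2.0, 2.45))  # avoid_shoulder_object_only
```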
As described above, according to the present embodiment, the periphery recognition unit 132 recognizes the surroundings of the host vehicle M; the road narrowing factor recognition unit 134 determines whether or not a road narrowing factor is present on the road on which the host vehicle M travels and, when it determines that one is present, recognizes the length of the road narrowing factor in the traveling direction of the host vehicle M; and the avoidance trajectory generation unit 142 generates an avoidance trajectory corresponding to the recognition result of the road narrowing factor recognition unit 134. This makes it possible to predict interference between the host vehicle M and the stationary shoulder object and traffic participants that constitute the road narrowing factor.
[ hardware configuration ]
Fig. 13 is a diagram showing an example of a hardware configuration of the various control devices according to the embodiment. As shown in the figure, each control device has a configuration in which a communication controller 100-1, a CPU 100-2, a RAM 100-3 used as a working memory, a ROM 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD, a drive device 100-6, and the like are connected to one another by an internal bus or a dedicated communication line. The communication controller 100-1 communicates with components other than the vehicle control device 100. The storage device 100-5 stores a program 100-5a executed by the CPU 100-2. The program is loaded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2. This realizes a part or all of the first control unit 120 and the second control unit 160.
The above-described embodiments can be described as follows.
A vehicle control device is provided with:
a storage device storing a program; and
a hardware processor for executing a program of a program,
the hardware processor is configured to execute the following processing by executing a program stored in the storage device:
the surrounding situation of the vehicle is identified,
controlling acceleration and deceleration and steering of the vehicle based on the surrounding conditions,
the method includes determining whether a road narrowing factor is present on a road on which the vehicle is traveling, identifying a length of the road narrowing factor in a traveling direction of the vehicle if it is determined that the road narrowing factor is present, and generating an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (10)

1. A control apparatus for a vehicle, wherein,
the vehicle control device includes:
a periphery recognition unit that recognizes a peripheral condition of the vehicle; and
a driving control unit that controls acceleration/deceleration and steering of the vehicle based on the surrounding situation recognized by the surrounding recognition unit,
the periphery recognizing unit determines whether or not a road narrowing factor is present on a road on which the vehicle is traveling, and recognizes a length of the road narrowing factor in a traveling direction of the vehicle when it is determined that the road narrowing factor is present,
the driving control unit generates an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
2. The vehicle control apparatus according to claim 1,
the road narrowing factor includes a stationary shoulder object and one or more traffic participants predicted to interfere with a scheduled travel track of the vehicle by moving to avoid the stationary shoulder object.
3. The vehicle control apparatus according to claim 2,
the vehicle control device further includes a prediction unit that predicts a time-series position of the traffic participant based on a recognition result of the surrounding recognition unit recognizing the length of the road narrowing factor in the traveling direction of the vehicle.
4. The vehicle control apparatus according to claim 3,
the prediction portion predicts a period during which the traffic participant interferes with a predetermined travel track of the vehicle based on the length of the road narrowing factor recognized by the periphery recognition portion.
5. The vehicle control apparatus according to claim 4,
the driving control unit causes the vehicle to travel slowly or stop in front of the road narrowing factor when a passable width in which the vehicle is scheduled to travel is smaller than a predetermined width, and waits until the period during which the road narrowing factor is determined to interfere with the scheduled travel track of the vehicle elapses.
6. The vehicle control apparatus according to claim 4,
the driving control unit changes the lateral avoidance control based on a period during which the road narrowing factor interferes with the planned travel track of the vehicle when a passable width for which the vehicle is planned to travel is equal to or greater than a predetermined width.
7. The vehicle control apparatus according to claim 5,
the driving control unit changes the lateral avoidance control based on a period during which the road narrowing factor interferes with the planned travel track of the vehicle when a passable width for which the vehicle is planned to travel is equal to or greater than a predetermined width.
8. The vehicle control apparatus according to any one of claims 3 to 7,
the driving control unit changes the lateral avoidance control in accordance with a time-series position of the road narrowing factor on a traveling path on which the vehicle is scheduled to travel.
9. A control method for a vehicle, wherein,
the vehicle control method causes a computer to execute:
identifying a surrounding condition of the vehicle;
controlling acceleration and deceleration and steering of the vehicle based on the surrounding condition;
determining whether a road narrowing factor exists on a road on which the vehicle travels, and in a case where it is determined that the road narrowing factor exists, identifying a length of the road narrowing factor in a traveling direction of the vehicle; and
generating an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
10. A storage medium storing a program, wherein,
the program causes a computer to execute:
identifying a surrounding condition of the vehicle;
controlling acceleration and deceleration and steering of the vehicle based on the surrounding condition;
determining whether a road narrowing factor exists on a road on which the vehicle travels, and in a case where it is determined that the road narrowing factor exists, identifying a length of the road narrowing factor in a traveling direction of the vehicle; and
generating an avoidance trajectory corresponding to the road narrowing factor based on the identified length of the road narrowing factor in the traveling direction of the vehicle.
CN202010616306.5A 2019-07-03 2020-06-30 Vehicle control device, vehicle control method, and storage medium Pending CN112172809A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-124392 2019-07-03
JP2019124392A JP2021009653A (en) 2019-07-03 2019-07-03 Vehicle control device, vehicle control method, and program

Publications (1)

Publication Number Publication Date
CN112172809A true CN112172809A (en) 2021-01-05

Family

ID=73919405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010616306.5A Pending CN112172809A (en) 2019-07-03 2020-06-30 Vehicle control device, vehicle control method, and storage medium

Country Status (2)

Country Link
JP (1) JP2021009653A (en)
CN (1) CN112172809A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010020371A (en) * 2008-07-08 2010-01-28 Yazaki Corp Vehicle control system
US20160224027A1 (en) * 2013-11-05 2016-08-04 Hitachi, Ltd. Autonomous Mobile System
JP2016143137A (en) * 2015-01-30 2016-08-08 富士重工業株式会社 Driving support device of vehicle
CN107458373A (en) * 2016-06-03 2017-12-12 本田技研工业株式会社 Travel controlling system
CN107472248A (en) * 2016-06-07 2017-12-15 株式会社斯巴鲁 The travel controlling system of vehicle
CN108202746A (en) * 2016-12-20 2018-06-26 本田技研工业株式会社 Vehicle control system, control method for vehicle and the medium for storing vehicle control program
JP2019098965A (en) * 2017-12-04 2019-06-24 スズキ株式会社 Travel support device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4614005B2 (en) * 2009-02-27 2011-01-19 トヨタ自動車株式会社 Moving locus generator
JP6375770B2 (en) * 2014-08-11 2018-08-22 日産自動車株式会社 Travel control device and travel control method
WO2016024314A1 (en) * 2014-08-11 2016-02-18 日産自動車株式会社 Travel control device and method for vehicle
JP6532786B2 (en) * 2015-08-07 2019-06-19 株式会社日立製作所 Vehicle travel control device and speed control method
BR112019022005A2 (en) * 2017-04-19 2020-05-12 Nissan Motor Co. Ltd. TRAVEL ASSISTANCE METHOD AND TRAVEL ASSISTANCE DEVICE
JP6525401B2 (en) * 2017-08-30 2019-06-05 マツダ株式会社 Vehicle control device


Also Published As

Publication number Publication date
JP2021009653A (en) 2021-01-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination