CN111824127A - Vehicle control system - Google Patents
- Publication number: CN111824127A
- Application number: CN202010227355.XA
- Authority: CN (China)
- Prior art keywords: vehicle, shoulder, state, parking, control device
- Legal status: Granted
Classifications
- B: Performing operations; transporting
- B60: Vehicles in general
- B60W: Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
- B60W30/06: Automatic manoeuvring for parking
- B60W30/18009: Propelling the vehicle, related to particular drive situations
- B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2540/26: Input parameters relating to occupants; incapacity
- B60W2720/10: Output or target parameters relating to overall vehicle dynamics; longitudinal speed
Abstract
The invention provides a vehicle control system for automatic driving that generates an appropriate target trajectory to a parking position when the vehicle is parked. The vehicle control system (1) comprises: a control device (15) that automatically performs at least one of speed control and steering control of the vehicle; and an external recognition device (6) capable of detecting the situation around the vehicle. When a predetermined condition indicating that it is difficult for the control device or the driver to continue driving is satisfied while the vehicle is traveling, the control device executes a parking process for stopping the vehicle at a predetermined parking position. During the parking process, the control device determines the parking position based on the detection result of the external recognition device, acquires the state of the road shoulder, and, based on the shoulder state, changes at least one of the speed at which the vehicle enters the shoulder and the distance between the shoulder entry position and the parking position.
Description
Technical Field
The present invention relates to a vehicle control system for performing automatic driving.
Background
A vehicle stopping device is known that forcibly stops the vehicle at a place where it does not obstruct the passage of other vehicles when a decline in the driver's consciousness is detected (for example, Patent Document 1). The vehicle stopping device detects the road edge or a lane marker such as a white line ahead using a stereo camera or a radar sensor, and determines a parking position for the vehicle within the road shoulder based on this information.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2007-331652
Disclosure of Invention
Problems to be solved by the invention
However, when the vehicle is stopped at the determined parking position by automatic driving, the target trajectory (travel path) of the vehicle up to the parking position matters. For example, when the shoulder is wide and its road surface is in good condition, decelerating the vehicle on the shoulder reduces the influence on the passage of following vehicles. On the other hand, when the road edge cannot be detected or the shoulder's road surface is rough, it is undesirable for the vehicle to travel on the shoulder.
In view of the above background, an object of the present invention is to provide a vehicle control system for automatic driving that generates an appropriate target trajectory to the parking position when the vehicle is parked.
Means for solving the problems
In order to solve the above problem, one aspect of the present invention is a vehicle control system (1) comprising: a control device (15) that automatically performs at least one of speed control and steering control of the vehicle; and an external recognition device (6) capable of detecting the situation around the vehicle. When a predetermined condition indicating that it is difficult for the control device or the driver to continue driving is satisfied while the vehicle is traveling, the control device executes a parking process for stopping the vehicle at a predetermined parking position. During the parking process, the control device determines the parking position based on the detection result of the external recognition device, acquires the state of the road shoulder, and changes, based on the shoulder state, at least one of the speed at which the vehicle enters the shoulder and the distance between the shoulder entry position and the parking position.
According to this aspect, an appropriate target trajectory up to the parking position can be set when the vehicle is parked. For example, when the shoulder is in good condition, the vehicle can enter the shoulder at a relatively early stage and decelerate toward the parking position while traveling on the shoulder, which prevents it from obstructing the passage of following vehicles. Conversely, when the shoulder is in poor condition, the vehicle can be parked more safely by entering the shoulder only after approaching the parking position.
In the above aspect, during the parking process, the control device may set a determination value according to the state of the shoulder, execute a 1st deceleration process when the determination value is equal to or greater than a travel reference value at which the vehicle can travel safely on the shoulder, and execute a 2nd deceleration process when the determination value is smaller than the travel reference value, the speed at which the vehicle enters the shoulder being lower in the 2nd deceleration process than in the 1st deceleration process.
According to this aspect, when the shoulder is in poor condition, the vehicle enters the shoulder at a further reduced speed and can therefore travel on the shoulder more safely.
In the above aspect, in the 2nd deceleration process, the control device may shorten the distance between the shoulder entry position and the parking position compared to the 1st deceleration process.
According to this aspect, when the shoulder is in poor condition, the distance the vehicle travels on the shoulder is shortened, and the vehicle can travel more safely.
In the above aspect, in the 2nd deceleration process, the control device may increase the amount of deceleration before entering the shoulder compared to the 1st deceleration process.
According to this aspect, the vehicle can be decelerated safely in the lane, which is better suited to traveling than the shoulder.
In the above aspect, in the 1st deceleration process, the control device may cause the vehicle to enter the shoulder after its speed falls to or below a 1st reference speed, and in the 2nd deceleration process, the control device may cause the vehicle to enter the shoulder after its speed falls to or below a 2nd reference speed that is lower than the 1st reference speed.
According to this aspect, when the shoulder is in poor condition, the vehicle enters the shoulder at a further reduced speed and can therefore travel on the shoulder more safely.
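As a sketch of the two-reference-speed entry condition described above (the concrete speed values are assumptions for illustration; the patent defines only their ordering, not their values):

```python
# Hypothetical reference speeds; the patent specifies only that the
# 2nd reference speed is lower than the 1st.
REFERENCE_SPEED_1_KMH = 40.0  # assumed 1st reference speed
REFERENCE_SPEED_2_KMH = 20.0  # assumed 2nd reference speed

def may_enter_shoulder(speed_kmh: float, process: str) -> bool:
    """Allow shoulder entry only once the vehicle is slow enough.

    process is "1st" or "2nd", matching the deceleration process in use.
    """
    limit = REFERENCE_SPEED_1_KMH if process == "1st" else REFERENCE_SPEED_2_KMH
    return speed_kmh <= limit
```

In the 2nd process the same vehicle speed that permits entry under the 1st process is rejected, which is exactly the behaviour the aspect describes.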
In the above aspect, the control device may increase the amount of deceleration on the shoulder in the 1st deceleration process compared to the 2nd deceleration process.
According to this aspect, when the shoulder is in good condition, decelerating mainly on the shoulder reduces the influence on following vehicles.
In the above aspect, when the determination value is smaller than a parking reference value below which the shoulder is unsuitable for parking the vehicle, the control device may set the parking position in the driving lane and execute a 3rd deceleration process of decelerating in the driving lane.
According to this aspect, when the shoulder is in such poor condition that it is unsuitable even for parking, parking on the shoulder can be avoided and the vehicle can be stopped safely in the driving lane.
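The selection among the 1st, 2nd, and 3rd deceleration processes described in the aspects above can be sketched as follows; the numeric reference values and the 0 to 100 determination-value scale are assumptions, since the patent gives no concrete numbers:

```python
# Hypothetical thresholds on an assumed 0-100 determination-value scale.
TRAVEL_REFERENCE = 60.0   # at or above: the shoulder can be travelled safely
PARKING_REFERENCE = 30.0  # below: the shoulder is unsuitable even for parking

def select_deceleration_process(determination_value: float) -> str:
    """Choose a deceleration process from the shoulder determination value.

    1st: enter the shoulder early and decelerate mainly on it.
    2nd: decelerate mainly in the lane, then enter the shoulder late
         and at a lower speed.
    3rd: do not use the shoulder; decelerate and stop in the driving lane.
    """
    if determination_value < PARKING_REFERENCE:
        return "3rd"
    if determination_value >= TRAVEL_REFERENCE:
        return "1st"
    return "2nd"
```

The 3rd branch is checked first so that a shoulder unfit for parking is never entered at all, matching the aspect above.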
In the above aspect, the shoulder state may include at least one of the degree of recognition of the boundary line between the shoulder and the lane, the degree of recognition of the road edge, the road surface state of the shoulder, and the width of the shoulder.
According to this aspect, the state of the shoulder can be recognized appropriately.
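For illustration, the determination value the control device sets from the shoulder state could combine the four factors above as a weighted score; the weights, the 0 to 1 factor scores, and the 3 m width normalization are all assumptions, not taken from the patent:

```python
def shoulder_determination_value(boundary_recognition: float,
                                 road_edge_recognition: float,
                                 surface_quality: float,
                                 width_m: float) -> float:
    """Combine shoulder-state factors into one determination value (0-100).

    The first three arguments are recognition/quality scores in [0, 1];
    width_m is the shoulder width in metres. Weights are illustrative only.
    """
    width_score = min(width_m / 3.0, 1.0)  # assume 3 m is a comfortable width
    return (30.0 * boundary_recognition
            + 25.0 * road_edge_recognition
            + 25.0 * surface_quality
            + 20.0 * width_score)
```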
Advantageous Effects of Invention
According to the above configuration, a vehicle control system that performs automatic driving can set an appropriate travel trajectory up to the parking position when the vehicle is parked.
Drawings
Fig. 1 is a functional configuration diagram of a vehicle equipped with a vehicle control system.
Fig. 2 is a flowchart of the parking process.
Fig. 3 is a flowchart of the parking execution process.
Fig. 4 is a flowchart of the travel suitability determination process.
Fig. 5 is a flowchart of the parking suitability determination process.
Fig. 6 (A) is an explanatory diagram showing the target trajectory in the 1st deceleration process, and Fig. 6 (B) is an explanatory diagram showing the target trajectory in the 2nd deceleration process.
Fig. 7 is an explanatory diagram showing the target trajectory in the 3rd deceleration process.
Description of the reference symbols
1: vehicle control system
2: vehicle system
6: external recognition device
17: radar apparatus
18: laser radar
19: outer camera of car
35: automatic driving control part
38: running control unit
40: external recognition unit
42: action planning part
101: outer side line of roadway
102: lane
103: road shoulder
104: road end
106: vehicle with a steering wheel
107: parking position
108: target track
109: track point
Detailed Description
Hereinafter, an embodiment of the vehicle control system according to the present invention will be described with reference to the drawings. The following describes an example in which the vehicle control system of the present invention is applied to a vehicle traveling in a country or region with left-hand traffic.
As shown in Fig. 1, the vehicle control system 1 is included in a vehicle system 2 mounted on the vehicle. The vehicle system 2 includes a propulsion device 3, a brake device 4, a steering device 5, an external recognition device 6, a vehicle sensor 7, a communication device 8, a navigation device 9 (map device), a driving operation device 10, an occupant monitoring device 11, an HMI 12 (Human Machine Interface), an automatic driving level changeover switch 13, a vehicle exterior notification device 14, and a control device 15. The components of the vehicle system 2 are connected to one another by a communication means such as a CAN 16 (Controller Area Network) so as to be able to exchange signals.
The propulsion device 3 is a device that applies driving force to a vehicle, and includes, for example, a power source and a transmission. The power source includes at least one of an internal combustion engine such as a gasoline engine or a diesel engine and an electric motor. The brake device 4 is a device for applying a braking force to a vehicle, and includes, for example, a caliper for pressing a pad against a brake rotor, and an electric cylinder for supplying hydraulic pressure to the caliper. The brake device 4 may include a parking brake device that restricts rotation of the wheel by a cable. The steering device 5 is a device for changing the steering angle of the wheels, and includes, for example, a rack and pinion mechanism for steering the wheels and an electric motor for driving the rack and pinion mechanism. The propulsion device 3, the braking device 4 and the steering device 5 are controlled by a control device 15.
The external recognition device 6 detects objects and other features outside the vehicle. The external recognition device 6 includes sensors that detect objects outside the vehicle by capturing electromagnetic waves or light from the vehicle's surroundings, for example a radar 17, a laser radar 18 (LIDAR), and a vehicle exterior camera 19. The external recognition device 6 may also be a device that receives signals from outside the vehicle to detect external objects. The external recognition device 6 outputs its detection results to the control device 15.
The radar 17 emits radio waves such as millimeter waves around the vehicle and detects the position (distance and direction) of an object by capturing the reflected waves. At least one radar 17 is mounted at an arbitrary position on the vehicle. The radar 17 preferably includes at least a front radar that emits radio waves toward the front of the vehicle, a rear radar that emits radio waves toward the rear, and a pair of left and right side radars that emit radio waves toward the sides.
The laser radar 18 emits light such as infrared light around the vehicle and detects the position (distance and direction) of an object by capturing the reflected light. At least one laser radar 18 is provided at an arbitrary position on the vehicle.
The vehicle exterior camera 19 captures images of the vehicle's surroundings, including objects around the vehicle (for example, surrounding vehicles and pedestrians), guardrails, curbs, walls, median strips, the shape of the road, and road markings drawn on the road. The vehicle exterior camera 19 may be, for example, a digital camera using a solid-state imaging device such as a CCD or CMOS sensor. At least one vehicle exterior camera 19 is provided at an arbitrary position on the vehicle. The vehicle exterior camera 19 includes at least a front camera that captures images ahead of the vehicle, and may further include a rear camera that captures images behind the vehicle and a pair of side cameras that capture the left and right sides. The vehicle exterior camera 19 may be, for example, a stereo camera.
The vehicle sensor 7 includes a vehicle speed sensor that detects a speed of the vehicle, an acceleration sensor that detects an acceleration, a yaw rate sensor that detects an angular velocity around a vertical axis, an orientation sensor that detects an orientation of the vehicle, and the like. The yaw rate sensor is, for example, a gyro sensor.
The communication device 8 mediates communication between the control device 15 or the navigation device 9 and surrounding vehicles or servers outside the vehicle. The control device 15 can communicate wirelessly with surrounding vehicles via the communication device 8. The control device 15 can also communicate, via the communication device 8, with a server that provides traffic control information, with a portable terminal held by a person outside the vehicle, and with an emergency notification center that receives emergency notifications from the vehicle.
The navigation device 9 acquires the current position of the vehicle and provides route guidance to a destination; it includes a GNSS reception unit 21, a map storage unit 22, a navigation interface 23, and a route determination unit 24. The GNSS reception unit 21 determines the position (latitude and longitude) of the vehicle from signals received from positioning satellites. The map storage unit 22 is composed of a known storage device such as flash memory or a hard disk and stores map information. The navigation interface 23 receives input of a destination or the like from the occupant and presents various information to the occupant by display or voice; it may include a touch panel display, a speaker, and the like. In another embodiment, the GNSS reception unit 21 may be configured as part of the communication device 8. The map storage unit 22 may be configured as part of the control device 15, or as part of a server device that can communicate via the communication device 8.
The map information includes road information such as: the type of road (expressway, toll road, national road, prefectural road, and so on); the number of lanes; the center position of each lane (three-dimensional coordinates including longitude, latitude, and height); the shape of road markings such as lane dividing lines and lane boundaries; the presence or absence of steps, curbs, ditches, and the like; the positions of intersections; the positions of lane merging and branching points; the areas of emergency stop zones; the width of each lane; and signs provided along the road. The map information may also include traffic control information, address information (address/zip code), facility information, telephone number information, and the like.
The route determination unit 24 determines the route to the destination based on the position of the vehicle determined by the GNSS reception unit 21, the destination input via the navigation interface 23, and the map information. In determining the route, the route determination unit 24 may refer to the positions of lane merging and branching points in the map information and include in the route a target lane, i.e. the lane in which the vehicle should travel.
The driving operation device 10 receives an input operation performed by a driver to control the vehicle. The driving operation device 10 includes, for example, a steering wheel, an accelerator pedal, and a brake pedal. The driving operation device 10 may include a shift lever, a parking brake lever, and the like. Each driving operation device 10 is provided with a sensor for detecting an operation amount. The driving operation device 10 outputs a signal indicating the operation amount to the control device 15.
The occupant monitoring device 11 monitors the state of an occupant in the vehicle compartment. The occupant monitoring device 11 includes, for example, an indoor camera 26 that captures an image of an occupant seated in a seat in a vehicle interior, and a grip sensor 27 provided in a steering wheel. The indoor camera 26 is a digital camera using a solid-state imaging device such as a CCD or a CMOS. The grip sensor 27 is a sensor that detects whether the driver is gripping the steering wheel and outputs a detection signal indicating the presence or absence of gripping. The grip sensor 27 may be formed by, for example, a capacitance sensor or a piezoelectric element provided in the steering wheel. The occupant monitoring device 11 may include a heart rate sensor provided in a steering wheel or a seat, and a seating sensor provided in the seat. In addition, the occupant monitoring device 11 may be a wearable device that is worn by the occupant and that can detect vital sign information including at least one of the heart rate and the blood pressure of the wearing occupant. In this case, the occupant monitoring device 11 may be configured to be able to communicate with the control device 15 by a known communication means based on a wireless method. The occupant monitoring device 11 outputs the captured image and the detection signal to the control device 15.
The vehicle exterior notification device 14 notifies people outside the vehicle by sound or light, and includes, for example, a warning lamp and a horn. Headlights, tail lights, brake lights, hazard lights, and interior lights may also function as warning lights.
The HMI 12 notifies the occupant of various information by display and voice, and receives the occupant's input operations. The HMI 12 includes, for example, at least one of: a display device 31 such as a touch panel or indicator lamps using liquid crystal or organic EL elements; a sound generating device 32 such as a buzzer or speaker; and an input interface 33 such as GUI switches on a touch panel or mechanical switches. The navigation interface 23 may also function as the HMI 12.
The automatic driving level changeover switch 13 is a switch for receiving an instruction to start automatic driving from the occupant. The automatic driving level switching switch 13 may be a mechanical switch or a GUI switch displayed on a touch panel, and is disposed at an appropriate position in the vehicle interior. The automatic driving level changeover switch 13 may be constituted by the input interface 33 of the HMI12, or may be constituted by the navigation interface 23.
The control device 15 is an electronic control unit (ECU) comprising a CPU, ROM, RAM, and the like. The control device 15 executes various vehicle controls by having the CPU perform arithmetic processing according to programs. The control device 15 may be configured as a single piece of hardware or as a unit composed of multiple pieces of hardware. At least some of the functional units of the control device 15 may be realized by hardware such as an LSI, ASIC, or FPGA, or by a combination of software and hardware.
The control device 15 performs automatic driving control (hereinafter, automatic driving) of at least levels 0 to 3 by combining various vehicle controls. The levels are based on the definition of SAE J3016 and are determined according to the degree of machine intervention in the driver's driving operations and in the monitoring of the vehicle's surroundings.
In level 0 automatic driving, the control device 15 does not control the vehicle, and the driver performs all driving operations. That is, level 0 automatic driving corresponds to so-called manual driving.
In level 1 automatic driving, the control device 15 performs part of the driving operations, and the driver performs the rest. For example, level 1 automatic driving includes adaptive cruise control (ACC: constant-speed travel with inter-vehicle distance control) and lane keeping assist control (LKAS: Lane Keeping Assistance System). Level 1 automatic driving is performed when there is no abnormality in the various devices (for example, the external recognition device 6 and the vehicle sensor 7) required to perform it.
In level 2 automatic driving, the control device 15 performs all driving operations. Level 2 automatic driving is performed when the driver monitors the vehicle's surroundings, the vehicle is within a predetermined area, and there is no abnormality in the various devices required to perform it.
In level 3 automatic driving, the control device 15 performs all driving operations. Level 3 automatic driving is performed when the driver is in a posture that allows monitoring of the vehicle's surroundings when needed, the vehicle is within a predetermined area, and there is no abnormality in the various devices required to perform it. A condition for executing level 3 automatic driving is, for example, that the vehicle is traveling on a congested road. Whether the vehicle is traveling on a congested road may be determined from traffic control information provided by a server outside the vehicle, or from whether the vehicle speed acquired by the vehicle speed sensor remains at or below a predetermined slow-travel determination value (for example, 30 km/h) for a predetermined time.
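The speed-based congestion check described above can be sketched as follows; only the 30 km/h example threshold comes from the text, while the sample-window length is an assumption standing in for the "predetermined time":

```python
from collections import deque

SLOW_TRAVEL_KMH = 30.0   # slow-travel determination value (example from the text)
WINDOW_SAMPLES = 100     # assumed: number of recent speed samples to examine

class CongestionDetector:
    """Judges 'congested road' from recent vehicle-speed samples."""

    def __init__(self, threshold_kmh=SLOW_TRAVEL_KMH, window=WINDOW_SAMPLES):
        self.threshold_kmh = threshold_kmh
        self.samples = deque(maxlen=window)

    def add_speed(self, speed_kmh):
        """Record one vehicle-speed sample (km/h)."""
        self.samples.append(speed_kmh)

    def is_congested(self):
        # Congested only if the window is full and every sample is at or
        # below the slow-travel determination value.
        if len(self.samples) < self.samples.maxlen:
            return False
        return all(v <= self.threshold_kmh for v in self.samples)
```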
In this way, in automatic driving at levels 1 to 3, the control device 15 executes at least one of steering, acceleration, deceleration, and surroundings monitoring. When the automatic driving mode is set, the control device 15 executes automatic driving at one of levels 1 to 3. Hereinafter, steering, acceleration, and deceleration are collectively referred to as driving operations, and driving operations together with surroundings monitoring are referred to as driving, as needed.
In the present embodiment, when the control device 15 receives an instruction to execute automatic driving via the automatic driving level changeover switch 13, it selects a level of automatic driving suited to the environment in which the vehicle is traveling, based on the detection result of the external recognition device 6 and the vehicle position acquired by the navigation device 9, and changes the level accordingly. However, the control device 15 may instead change the level according to input to the automatic driving level changeover switch 13.
As shown in fig. 1, the control device 15 includes an automatic driving control unit 35, an abnormal state determination unit 36, a state management unit 37, a travel control unit 38, and a storage unit 39.
The automatic driving control unit 35 includes an external environment recognition unit 40, a vehicle position recognition unit 41, and an action planning unit 42. The external environment recognition unit 40 recognizes obstacles around the vehicle, the shape of the road, the presence or absence of sidewalks, and road markings based on the detection result of the external recognition device 6. Obstacles include, for example, guardrails, utility poles, surrounding vehicles, and pedestrians. The external environment recognition unit 40 can acquire the state of each surrounding vehicle, such as its position, speed, and acceleration, from the detection result of the external recognition device 6. The position of a surrounding vehicle may be recognized as a representative point such as its center of gravity or a corner position, or as a region represented by its outline.
The vehicle position recognition unit 41 recognizes a driving lane, which is a lane in which the vehicle is driving, and a relative position and an angle of the vehicle with respect to the driving lane. The vehicle position recognition unit 41 recognizes the traveling lane, for example, based on the map information held by the map storage unit 22 and the position of the vehicle acquired by the GNSS reception unit 21. Further, the relative position and angle of the vehicle with respect to the traveling lane may be recognized by extracting the dividing line of the periphery of the vehicle drawn on the road surface from the map information and comparing the shape of the dividing line with the shape of the dividing line captured by the vehicle exterior camera 19.
The action planning unit 42 sequentially generates action plans for causing the vehicle to travel along the route. More specifically, the action planning unit 42 first determines an event for causing the vehicle to travel in the target lane determined by the route determination unit 24 without contacting an obstacle. The events include: a constant speed travel event of traveling in the same travel lane at a fixed speed; a follow-up event of following a preceding vehicle traveling in the same travel lane, at a speed equal to or lower than a set speed set by an occupant or a speed determined according to the travel environment of the vehicle; a lane change event of changing the travel lane of the vehicle; an overtaking event of overtaking a preceding vehicle; a merging event of merging the vehicle at a merging point of a road; a branch event of causing the vehicle to travel toward the destination at a branch point of a road; an automatic driving end event of ending automatic driving and setting manual driving; and a parking event of stopping the vehicle when a predetermined condition is satisfied while the vehicle is traveling, the condition indicating that it is difficult for the control device 15 or the driver to continue driving.
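The event types above can be summarized in code. The following Python enumeration is an illustrative sketch only; the names are assumptions and not part of the claimed system:

```python
from enum import Enum, auto

class DrivingEvent(Enum):
    """Events the action planning unit may schedule (names are illustrative)."""
    CONSTANT_SPEED = auto()   # travel in the same lane at a fixed speed
    FOLLOW = auto()           # follow a preceding vehicle at or below a set speed
    LANE_CHANGE = auto()      # change the travel lane
    OVERTAKE = auto()         # overtake a preceding vehicle
    MERGE = auto()            # merge at a merging point of a road
    BRANCH = auto()           # take the branch toward the destination
    END_AUTONOMY = auto()     # end automatic driving, hand over to manual driving
    PARK = auto()             # stop the vehicle when driving cannot continue
```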
The conditions under which the action planning unit 42 determines a parking event include a case where, during travel in the autonomous driving mode, no driver input responding to an intervention request (takeover request) is detected via the indoor camera 26, the grip sensor 27, or the automatic driving level changeover switch 13. The intervention request is a warning that notifies the driver that part of the driving authority is being transferred, and requests the driver to perform at least one of the driving operation and the vehicle surroundings monitoring corresponding to the transferred driving authority. The conditions for determining a parking event may also include a case where, while the vehicle is traveling, the action planning unit 42 determines that the driver is not performing the driving operation or the vehicle surroundings monitoring corresponding to the driving authority the driver is responsible for. The conditions may further include a case where, while the vehicle is traveling, the action planning unit 42 determines, based on a signal from the heart rate sensor or the indoor camera 26, for example, that the driver is in an abnormal state in which the driving operation cannot be performed, such as a state of cardiac arrest.
The action planning unit 42 may determine avoidance events for avoiding an obstacle or the like in accordance with the surrounding situation of the vehicle (presence of a surrounding vehicle or pedestrian, narrowing of a lane due to road construction, or the like) during execution of these events.
The action planning unit 42 further generates a target trajectory on which the vehicle should travel in the future based on the determined event. The target track is obtained by sequentially arranging track points, which are points to which the vehicle should arrive at each time. The action planning unit 42 may generate the target trajectory based on the target velocity and the target acceleration set for each event. At this time, the information of the target velocity and the target acceleration is expressed by the interval of the track points.
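The idea that target velocity and acceleration are expressed by the spacing of track points can be sketched as follows. This is a minimal illustration under assumed names and a one-dimensional track, not the patented implementation: points are placed at fixed time steps, so a higher speed or acceleration simply widens the gaps between consecutive points.

```python
from dataclasses import dataclass

@dataclass
class TrackPoint:
    x: float  # longitudinal position [m]
    t: float  # time at which the vehicle should reach this point [s]

def generate_target_track(v0, a, dt, n):
    """Place n track points at fixed time steps dt; target speed v0 and
    acceleration a are then implied by the point spacing (hypothetical sketch)."""
    points, x, v = [], 0.0, v0
    for i in range(n):
        points.append(TrackPoint(x=x, t=i * dt))
        x += v * dt + 0.5 * a * dt * dt   # advance by the distance covered in dt
        v += a * dt
    return points
```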
The travel control unit 38 controls the propulsion device 3, the braking device 4, and the steering device 5 so that the vehicle passes through the target track generated by the action planning unit 42 at a predetermined timing.
The storage unit 39 is configured by ROM, RAM, and the like, and stores information necessary for processing by the automatic driving control unit 35, the abnormal state determination unit 36, the state management unit 37, and the travel control unit 38.
The abnormal state determination unit 36 includes a vehicle state determination unit 51 and an occupant state determination unit 52. The vehicle state determination unit 51 analyzes signals of various devices (for example, the environment recognition device 6 and the vehicle sensor 7) that affect the automatic driving of the level being executed, and determines whether or not an abnormality that makes it difficult to maintain the automatic driving being executed has occurred in the various devices.
The occupant state determination unit 52 determines whether or not the driver is in an abnormal state based on a signal from the occupant monitoring device 11. In automatic driving at level 1 or below, in which the driver has a steering obligation, the abnormal state includes a state in which it is difficult for the driver to steer. The state in which it is difficult for the driver to steer specifically includes a state in which the driver is asleep, a state in which the driver is immobilized or unconscious due to illness or injury, and a state in which the driver's heartbeat has stopped. The occupant state determination unit 52 may determine that the driver is in the abnormal state when no input is made to the grip sensor 27 by the occupant during automatic driving at level 1 or below. Further, the occupant state determination unit 52 determines the open/closed state of the driver's eyelids from the extracted face image. When the state in which the driver's eyelids are closed continues for a predetermined time, or when the number of eyelid closures per unit time is equal to or greater than a predetermined threshold value, the occupant state determination unit 52 determines that the driver is asleep, very drowsy, unconscious, or in a state of cardiac arrest, that the driver is therefore in a state in which driving operation is difficult, and that the driver is in an abnormal state. The occupant state determination unit 52 may further acquire the posture of the driver from the captured image; when the posture is not suitable for the driving operation and does not change within a predetermined time range, it may determine that the driver is immobilized due to illness or injury and is therefore in an abnormal state.
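The eyelid-based judgment above can be sketched as a simple check over per-frame closure flags. The thresholds and signal representation are assumptions; the patent leaves the concrete values open:

```python
def driver_abnormal(eyelid_closed_flags, dt, closed_time_threshold, blink_rate_threshold):
    """Judge an abnormal driver state from per-frame eyelid-closure flags
    (illustrative sketch; thresholds are hypothetical)."""
    longest, run = 0.0, 0.0   # longest continuous closed interval [s]
    blinks, prev = 0, False
    for closed in eyelid_closed_flags:
        if closed:
            run += dt
            longest = max(longest, run)
            if not prev:
                blinks += 1   # a new closure counts as one eyelid closure
        else:
            run = 0.0
        prev = closed
    duration = len(eyelid_closed_flags) * dt
    closures_per_second = blinks / duration if duration > 0 else 0.0
    # abnormal if eyes stay closed too long, or closures per unit time exceed the threshold
    return longest >= closed_time_threshold or closures_per_second >= blink_rate_threshold
```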
In addition, in automatic driving at a level with a surroundings-monitoring obligation, that is, at level 2 or below, the abnormal state includes a state in which the driver is not fulfilling the obligation to monitor the surroundings of the vehicle. This state includes a state in which the driver is not gripping the steering wheel and a state in which the driver's line of sight is not directed toward the front of the vehicle. The occupant state determination unit 52 detects whether the driver is gripping the steering wheel based on, for example, a signal from the grip sensor 27, and determines that the driver is in an abnormal state of not fulfilling the surroundings-monitoring obligation when the driver is not gripping the steering wheel. The occupant state determination unit 52 also determines whether the driver is in an abnormal state based on the image captured by the indoor camera 26. For example, the occupant state determination unit 52 extracts the driver's face region from the captured image by known image analysis means, and further extracts the inner corners of the eyes, the outer corners of the eyes, and the iris portions including the pupils (hereinafter referred to as the dark parts of the eyes) from the extracted face region. The occupant state determination unit 52 acquires the driver's line-of-sight direction from the extracted positions of the inner corners, the outer corners, and the dark parts of the eyes, the contour shapes of the dark parts, and the like, and determines that the driver is not fulfilling the surroundings-monitoring obligation when the driver's line of sight is not directed toward the front of the vehicle.
In addition, in automatic driving at a level without a surroundings-monitoring obligation, that is, at level 3, the abnormal state means a state in which the driver cannot promptly take over driving when a driving takeover request is made. The state in which the takeover is not possible includes a state in which the driver cannot monitor the system, that is, a state in which the driver cannot watch a screen on which a warning is displayed, for example because the driver is asleep or looking rearward. In the present embodiment, in level 3 automatic driving, the abnormal state also includes a state in which, when the driver is notified to monitor the surroundings of the vehicle, the driver cannot fulfill the surroundings-monitoring obligation. In the present embodiment, the occupant state determination unit 52 displays a predetermined screen on the display device 31 of the HMI 12 and instructs the driver to look at the display device 31. The occupant state determination unit 52 then detects the driver's line of sight with the indoor camera 26, and determines that the driver cannot fulfill the surroundings-monitoring obligation when the driver's line of sight is not directed toward the display device 31 of the HMI 12.
The state management unit 37 determines the level of the automatic driving based on at least one of the vehicle position, the operation of the automatic driving level changeover switch 13, and the determination result of the abnormal state determination unit 36. Further, the state management unit 37 controls the action planning unit 42 according to the determined level, and performs automatic driving according to each level. For example, when the level 1 autonomous driving is performed and the constant speed travel control is executed, the state management unit 37 limits the event determined by the action planning unit 42 to only the constant speed travel event.
The state management unit 37 performs automatic driving according to the set level, and also raises and lowers the level.
More specifically, the state management unit 37 raises the level when the condition for performing automated driving at the post-transition level is satisfied and an input instructing a raise of the automated driving level is made on the automatic driving level changeover switch 13.
The state management unit 37 performs an intervention request process when the condition for performing automatic driving at the current level is no longer satisfied, or when an input instructing a lowering of the level is made on the automatic driving level changeover switch 13. In the intervention request process, the state management unit 37 first notifies the driver of a handover request. The notification to the driver is performed by displaying a message or an image on the display device 31, or by generating a voice or a warning sound from the sound generation device 32. The notification to the driver may be continued for a predetermined time after the start of the intervention request process, or may be continued until the occupant monitoring device 11 detects an input.
The cases where the condition for performing automatic driving at the current level is not satisfied include: when the vehicle moves into an area where only automatic driving at a level lower than the current level can be executed; and when the abnormal state determination unit 36 determines that an abnormality making it difficult to continue automatic driving has occurred in the driver or the vehicle.
After notifying the driver, the state management unit 37 detects whether or not there is an input indicating the driver's intervention in driving, using the indoor camera 26 or the grip sensor 27. The method of detecting the input depends on the level after the transition. When shifting to level 2, the state management unit 37 extracts the direction of the driver's line of sight from the image acquired by the indoor camera 26, and determines that there is an input indicating intervention in driving when the driver's line of sight is directed toward the front of the vehicle. When shifting to level 1 or level 0, the state management unit 37 determines that there is such an input when it detects, using the grip sensor 27, that the driver is gripping the steering wheel. That is, the indoor camera 26 and the grip sensor 27 function as an intervention detection device that detects the driver's intervention in driving. The state management unit 37 may also detect whether or not there is an input indicating intervention in driving based on an input to the automatic driving level changeover switch 13.
The state management unit 37 lowers the level when detecting an input indicating intervention in driving within a predetermined time from the start of the intervention request processing. In this case, the level of the lowered automated driving may be level 0 or the highest level in the executable range.
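The per-level intervention check described above can be condensed into a small decision function. This is a sketch; the signal names are assumptions and the real system would read them from the indoor camera, grip sensor, and level switch:

```python
def intervention_detected(target_level, gaze_forward, gripping_wheel, switch_input):
    """Check for a driver-intervention input according to the level after the
    transition (illustrative; signal names are hypothetical)."""
    if switch_input:                  # explicit input on the level changeover switch
        return True
    if target_level == 2:             # level 2: line of sight toward the vehicle front
        return gaze_forward
    if target_level in (0, 1):        # level 0/1: steering wheel must be gripped
        return gripping_wheel
    return False
```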
When no input corresponding to driver intervention in driving is detected within a predetermined time from the start of the intervention request process, the state management unit 37 causes the action planning unit 42 to generate a parking event. A parking event is an event in which vehicle control degenerates and the vehicle is parked at a safe position (for example, an emergency parking bay, a roadside strip, a shoulder, or a parking area). The series of steps performed in this parking event is referred to herein as the MRM (Minimum Risk Maneuver).
After the parking event is generated, the control device 15 shifts from the automatic driving mode to the automatic parking mode, and the action planning unit 42 executes the parking process. Hereinafter, the outline of the parking process will be described with reference to fig. 2.
In the parking process, a notification process is first executed (ST 1). In the notification process, the action planning unit 42 operates the vehicle exterior notification device 14 to perform notification to the outside of the vehicle. For example, the action planning unit 42 operates a horn included in the vehicle exterior notification device 14 to periodically generate a warning sound. The notification process continues until the parking process ends. After the notification processing is completed, the action planning unit 42 may operate the speaker according to the situation to continuously generate the warning sound.
Then, a fallback process is executed (ST 2). The fallback process limits the events that the action planning unit 42 can generate; for example, it prohibits the generation of a lane change event into a passing lane, an overtaking event, a merging event, and the like. The fallback process may also limit the upper-limit speed and upper-limit acceleration of the vehicle in the various events, compared with the case where the parking process is not executed.
Next, a parking area determination process is performed (ST 3). In the parking area determination process, the map information is referred to based on the vehicle position, and a plurality of parking areas suitable for parking, such as shoulders or refuge spaces in the traveling direction of the vehicle, are extracted. Then, one parking area is selected from the plurality of parking areas according to the size of the parking area, the distance between the parking area and the vehicle position, and the like.
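The selection among candidate areas can be sketched as a simple scoring function over size and distance. The weights and the linear score are assumptions; the patent only says the selection depends on size, distance, "and the like":

```python
def select_parking_area(areas, w_size=1.0, w_dist=1.0):
    """Pick one parking area from several candidates; larger and nearer
    areas score higher (weights are hypothetical)."""
    def score(area):
        size_m2, distance_m = area   # each candidate: (size [m^2], distance [m])
        return w_size * size_m2 - w_dist * distance_m
    return max(areas, key=score)
```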
Subsequently, a movement process is executed (ST 4). In the movement process, a route for reaching the parking area is determined, the various events for traveling on the route are generated, and a target track is determined. The travel control unit 38 controls the propulsion device 3, the brake device 4, and the steering device 5 based on the target track determined by the action planning unit 42, whereby the vehicle travels along the route and reaches the parking area.
Subsequently, the parking position determination process is executed (ST 5). In the parking position determination process, the parking position is determined based on the obstacles, road markings, and the like located in the periphery of the vehicle and recognized by the external environment recognition unit 40. In some cases, the parking position cannot be determined within the parking area due to the presence of surrounding vehicles or obstacles. When the parking position cannot be determined in the parking position determination process (no at ST 6), the parking area determination process (ST3), the movement process (ST4), and the parking position determination process (ST5) are repeated in sequence.
If the parking position can be determined in the parking position determination process (yes at ST 6), a parking execution process is executed (ST 7). The action planning unit 42 generates a target trajectory in the parking execution process based on the current position and the parking position of the vehicle. The travel control unit 38 controls the propulsion device 3, the braking device 4, and the steering device 5 according to the target trajectory determined by the action planning unit 42. Thereby, the vehicle moves to the parking position and stops at the parking position.
After the parking execution process, the parking maintaining process is executed (ST 8). In the parking maintaining process, the travel control unit 38 drives the parking brake device in accordance with a command from the action planning unit 42 to hold the vehicle at the parking position. The action planning unit 42 may also transmit an emergency report to an emergency report center via the communication device 8. After the parking maintaining process is completed, the parking process ends.
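The sequence ST1 to ST8 can be sketched as a driver function over the subsystems described above. The callables and the attempt limit are assumptions; this is an illustrative control-flow skeleton, not the patented implementation:

```python
def parking_process(find_parking_area, move_to_area, determine_parking_position,
                    execute_parking, hold_parking, notify_outside, fall_back,
                    max_attempts=5):
    """Sequence ST1-ST8 of the parking process (sketch; the callables stand in
    for the notification, fallback, planning, and control subsystems)."""
    notify_outside()                 # ST1: warn traffic outside the vehicle (horn, etc.)
    fall_back()                      # ST2: restrict events, limit speed/acceleration
    for _ in range(max_attempts):
        area = find_parking_area()   # ST3: pick a shoulder or refuge space from the map
        move_to_area(area)           # ST4: travel along a route to the area
        position = determine_parking_position(area)   # ST5
        if position is not None:     # ST6: was a parking position found?
            execute_parking(position)   # ST7: move to and stop at the position
            hold_parking()              # ST8: parking brake, emergency report
            return position
    return None                      # no position found within the attempt limit
```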
The vehicle control system 1 of the present embodiment includes the external environment recognition device 6, which can detect at least the surrounding situation of the vehicle, the external environment recognition unit 40, which recognizes the environment based on a signal from the external environment recognition device 6, and the control device 15, and it changes the target track of the vehicle to the parking position according to the state of the shoulder during the parking process. The parking position determination process executed by the vehicle control system 1 according to the present embodiment will be described in detail below with reference to fig. 3.
In the parking position determination process, the action planning unit 42 first acquires the shoulder information from the external world recognition unit 40 (ST 11). The shoulder information includes at least one of a degree of recognition of the lane outer line, a degree of recognition of the road end, a width of the shoulder, and a road surface state of the shoulder.
The lane outer line 101 is a marking line drawn on the road surface, and is a boundary line that divides the lane 102 from the shoulder 103 (see fig. 6). The external environment recognition unit 40 obtains the degree of recognition of the lane outer line by using, for example, pattern matching. The degree of recognition of the lane outer line is, for example, a numerical value expressed as a percentage. The external environment recognition unit 40 extracts the lane outer line 101 from the image captured by the vehicle exterior camera 19, calculates the degree of coincidence between the extracted lane outer line 101 and a model image of the lane outer line stored in advance, and sets the degree of recognition of the lane outer line based on the degree of coincidence. The higher the degree of coincidence between the extracted lane outer line 101 and the model image, the larger the value of the degree of recognition of the lane outer line.
The road end 104 is the end of the road, and is demarcated by a curb, a wall, a vehicle guardrail such as a guard rail or a guard cable, or a sidewalk. The degree of recognition of the road end is, for example, a numerical value expressed as a percentage. The external environment recognition unit 40 may acquire the degree of recognition of the road end by using, for example, pattern matching, in the same manner as the degree of recognition of the lane outer line. The external environment recognition unit 40 extracts the road end 104 from the image captured by the vehicle exterior camera 19, calculates the degree of coincidence between the extracted road end 104 and a model image of the road end 104 stored in advance, and sets the degree of recognition of the road end based on the degree of coincidence. The higher the degree of coincidence between the extracted road end 104 and the model image, the larger the value of the degree of recognition of the road end.
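The "degree of coincidence" between an extracted patch and a stored model image can be illustrated with a normalized correlation measure, one common form of pattern matching. The patent does not specify the matching method, so this is a sketch under that assumption:

```python
import numpy as np

def recognition_degree(extracted, model):
    """Degree of coincidence between an extracted image patch (lane outer line
    or road end) and a stored model image, as a percentage. Normalized
    cross-correlation is one plausible measure; the patent only says
    'pattern matching'."""
    a = extracted.astype(float).ravel()
    b = model.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0
    # clip negative correlation to zero: anti-correlated patches do not match at all
    return max(0.0, float(a @ b) / denom) * 100.0
```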
The width of the shoulder 103 is the distance between the lane outer line 101 and the road end 104. The external environment recognition unit 40 extracts the lane outer line 101 and the road end 104 from the image of the shoulder 103 captured by the vehicle exterior camera 19, and obtains the distance between them by measurement.
The road surface state of the shoulder 103 includes at least one of the unevenness of the road surface, the paving state, the presence or absence of deposits such as crushed stone, snow, or water, and whether the road surface is frozen. The external environment recognition unit 40 acquires the road surface state of the shoulder 103 by image-processing the image detected by the vehicle exterior camera 19 of the external environment recognition device 6, and sets a road surface score according to the road surface state. The road surface score decreases as the unevenness of the road surface increases, as the paving state worsens, as more crushed stone, snow, or water lies on the road surface, and when the road surface is frozen. That is, the better the road surface state, the higher the road surface score.
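One way to realize such a score is to subtract a penalty for each adverse factor from a perfect score. The 0-100 scale, the weights, and the factor normalization below are assumptions; the patent only fixes the monotonic relationships:

```python
def road_surface_score(unevenness, paving_defect, gravel, snow, water, frozen):
    """Score the shoulder road surface: each adverse factor lowers the score
    from a perfect 100 (weights and scale are hypothetical).
    Continuous factors are normalized to 0.0 (good) .. 1.0 (bad)."""
    score = 100.0
    score -= 20.0 * unevenness
    score -= 20.0 * paving_defect
    score -= 15.0 * gravel
    score -= 15.0 * snow
    score -= 10.0 * water
    if frozen:
        score -= 20.0
    return max(score, 0.0)
```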
The action planning unit 42 determines, based on the acquired shoulder information, whether or not the state of the shoulder is suitable for safe travel of the vehicle (ST 12). This determination is performed by setting determination values based on the shoulder information and checking whether each determination value is equal to or greater than a travel reference value, which serves as the criterion for judging that the vehicle can travel safely given the state of the shoulder. A determination value may be set for at least one of the plural pieces of shoulder information indicating the state of the shoulder, and a travel reference value may be set for each determination value. The shoulder information includes at least one of the degree of recognition of the lane outer line, the degree of recognition of the road end, the width of the shoulder, and the road surface state of the shoulder. The determination at step ST12 may be performed, for example, by the travel qualification determination process for the shoulder shown in fig. 4.
In the travel qualification determination process shown in fig. 4, the action planning unit 42 first determines whether or not the degree of recognition of the lane outer line (the 1st determination value) included in the shoulder information is equal to or greater than the travel reference value set for the degree of recognition of the lane outer line (ST 21). When the lane outer line 101 is clearly drawn on the road surface and is not covered with deposits such as snow or rainwater, the degree of recognition of the lane outer line is equal to or greater than the travel reference value, and it is determined at step ST21 that the state of the shoulder is suitable for safe travel of the vehicle. On the other hand, when part of the lane outer line 101 has disappeared due to deterioration of the road surface, or when the lane outer line 101 is covered with deposits such as snow or rainwater, the degree of recognition of the lane outer line 101 is smaller than the travel reference value, and it is determined that the state of the shoulder is not suitable for safe travel of the vehicle.
When the degree of recognition of the lane outer line is equal to or greater than the travel reference value (yes at ST21), the action planning unit 42 determines whether or not the degree of recognition of the road end (the 2nd determination value) included in the shoulder information is equal to or greater than the travel reference value set for the degree of recognition of the road end (ST 22). When the road end 104 is clearly demarcated by a curb or a vehicle guardrail, the degree of recognition of the road end is equal to or greater than the travel reference value, and it is determined at step ST22 that the state of the shoulder is suitable for safe travel of the vehicle. On the other hand, when no structure demarcating the road end 104, such as a curb or a vehicle guardrail, is provided, or when such a structure is damaged, the degree of recognition of the road end is smaller than the travel reference value, and it is determined that the state of the shoulder is not suitable for safe travel of the vehicle.
When the degree of recognition of the road end is equal to or greater than the travel reference value (yes at ST22), the action planning unit 42 determines whether or not the width of the shoulder 103 (the 3rd determination value) included in the shoulder information is equal to or greater than the travel reference value set for the width of the shoulder 103 (ST 23). The travel reference value used at step ST23 is set to the width required for safe travel of the vehicle, for example 2 m to 3 m. The determination at step ST23 thus judges whether or not the shoulder 103 is wide enough for the vehicle to travel safely.
When the width of the shoulder 103 is equal to or greater than the travel reference value (yes at ST23), the action planning unit 42 determines, based on the shoulder information, whether or not the road surface state of the shoulder 103 (the 4th determination value) is equal to or greater than the travel reference value set for the road surface state of the shoulder (ST 24). The road surface state of the shoulder may be the road surface score described above. When the road surface score is equal to or greater than the travel reference value, for example when the road surface of the shoulder has little unevenness, a good paving state, and few deposits, it is determined that the road surface state of the shoulder is suitable for safe travel of the vehicle.
When the road surface state of the shoulder 103 is equal to or greater than the travel reference value (yes at ST24), the action planning unit 42 determines that the state of the shoulder is suitable for travel of the vehicle (ST 25). On the other hand, the action planning unit 42 determines that the state of the shoulder is not suitable for travel of the vehicle (ST 26) when any of the following is smaller than its travel reference value: the degree of recognition of the lane outer line (no at ST21), the degree of recognition of the road end (no at ST22), the width of the shoulder 103 (no at ST23), or the road surface state (road surface score) (no at ST24).
The above-described travel qualification determination process is an example and can be modified as appropriate. For example, the processes of steps ST21 to ST24 are not all essential, and some of them may be omitted; that is, the travel qualification determination process may include at least one of the processes of steps ST21 to ST24.
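The four checks ST21 to ST24 amount to comparing each available determination value against its travel reference value. The following sketch uses illustrative numbers (the patent only fixes 2-3 m for the shoulder width reference; the other values and the key names are assumptions):

```python
# Travel reference values per determination value (numbers are illustrative).
TRAVEL_REFERENCE = {
    "lane_line_recognition": 80.0,   # % (1st determination value, ST21)
    "road_end_recognition": 80.0,    # % (2nd determination value, ST22)
    "shoulder_width": 2.0,           # m (3rd determination value, ST23; patent: 2-3 m)
    "road_surface_score": 70.0,      # 4th determination value, ST24
}

def shoulder_suitable_for_travel(shoulder_info, reference=TRAVEL_REFERENCE):
    """ST21-ST24: the shoulder suits safe travel only if every available
    determination value meets its travel reference value (sketch)."""
    return all(shoulder_info[key] >= limit
               for key, limit in reference.items()
               if key in shoulder_info)   # omitted checks are skipped, as the text allows
```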
If it is determined that the state of the shoulder is not suitable for travel of the vehicle (no at ST12), the action planning unit 42 determines whether or not the state of the shoulder is suitable for parking of the vehicle (ST 14). This determination is performed by checking whether or not the determination value for each piece of shoulder information is equal to or greater than a parking reference value, which serves as the criterion for judging that the vehicle can park safely given the state of the shoulder. A parking reference value may be set for each determination value set for each piece of shoulder information, and each parking reference value is set lower than the corresponding travel reference value. The determination at step ST14 may be performed, for example, by the parking qualification determination process for the shoulder shown in fig. 5.
In the parking qualification determination process shown in fig. 5, the action planning unit 42 first determines whether the degree of recognition of the lane outer line (the 1st determination value) included in the shoulder information is equal to or greater than the parking reference value set for the degree of recognition of the lane outer line (ST31). This parking reference value is set lower than the travel reference value set for the degree of recognition of the lane outer line.
When the degree of recognition of the lane outer line is equal to or greater than the parking reference value (yes in ST31), the action planning unit 42 determines whether the degree of recognition of the road end (the 2nd determination value) included in the shoulder information is equal to or greater than the parking reference value set for the degree of recognition of the road end (ST32). This parking reference value is set lower than the travel reference value set for the degree of recognition of the road end.
When the degree of recognition of the road end is equal to or greater than the parking reference value (yes in ST32), the action planning unit 42 determines whether the width of the shoulder 103 (the 3rd determination value) included in the shoulder information is equal to or greater than the parking reference value set for the width of the shoulder 103 (ST33). This parking reference value is set lower than the travel reference value set for the width of the shoulder 103.
When the width of the shoulder 103 is equal to or greater than the parking reference value (yes in ST33), the action planning unit 42 determines, based on the shoulder information, whether the road surface state of the shoulder 103 (the 4th determination value) is equal to or greater than the parking reference value set for the road surface state of the shoulder (ST34). The road surface state of the shoulder may be the road surface score described above. This parking reference value is set lower than the travel reference value set for the road surface state of the shoulder.
When the road surface state of the shoulder 103 is equal to or greater than the parking reference value (yes in ST34), the action planning unit 42 determines that the shoulder state is suitable for parking of the vehicle (ST35). On the other hand, when any one of the following holds, the action planning unit 42 determines that the shoulder state is not suitable for parking of the vehicle (ST36): the degree of recognition of the lane outer line is smaller than the parking reference value (no in ST31), the degree of recognition of the road end is smaller than the parking reference value (no in ST32), the width of the shoulder 103 is smaller than the parking reference value (no in ST33), or the road surface state (road surface score) is smaller than the parking reference value (no in ST34).
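The parking qualification determination of steps ST31 to ST34 mirrors the travel qualification determination but uses lower thresholds. The sketch below uses illustrative key names and values (not from the patent) and encodes the stated invariant that each parking reference value is lower than the corresponding travel reference value.

```python
# Hypothetical travel and parking reference values; keys are illustrative.
TRAVEL_REF = {"lane_line_recognition": 0.8, "road_end_recognition": 0.8,
              "shoulder_width_m": 2.5, "road_surface_score": 0.7}
PARKING_REF = {"lane_line_recognition": 0.5, "road_end_recognition": 0.5,
               "shoulder_width_m": 1.8, "road_surface_score": 0.4}

# Invariant from the text: every parking reference value is set lower
# than the corresponding travel reference value.
assert all(PARKING_REF[k] < TRAVEL_REF[k] for k in TRAVEL_REF)

def shoulder_suitable_for_parking(shoulder_info):
    """ST35 if every determination value meets its parking reference value, else ST36."""
    return all(shoulder_info[key] >= ref for key, ref in PARKING_REF.items())
```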
When determining that the shoulder state is suitable for travel of the vehicle (yes in ST12), the action planning unit 42 generates the target track 108 for stopping the vehicle 106 at the parking position 107 by the 1st deceleration process (ST13). When it is determined that the shoulder state is not suitable for travel of the vehicle (no in ST12) but is suitable for parking of the vehicle (yes in ST14), the action planning unit 42 generates the target track 108 for stopping the vehicle 106 at the parking position 107 by the 2nd deceleration process (ST15). When it is determined that the shoulder state is not suitable for parking of the vehicle (no in ST14), the action planning unit 42 generates the target track 108 for stopping the vehicle 106 in the driving lane by the 3rd deceleration process (ST16).
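The branching of steps ST12 to ST16 can be sketched as a simple dispatch over the two determinations. Function and return labels are illustrative, not from the patent.

```python
def select_deceleration_process(travel_ok, parking_ok):
    """Choose which deceleration process generates the target track (ST12-ST16)."""
    if travel_ok:      # ST12 yes: stop at the parking position on the shoulder (ST13)
        return "1st"
    if parking_ok:     # ST12 no, ST14 yes: enter the shoulder later, more slowly (ST15)
        return "2nd"
    return "3rd"       # ST14 no: stop within the driving lane (ST16)
```

Note that `parking_ok` is only consulted when the shoulder is unsuitable for travel, matching the order of the flowchart.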
The travel control unit 38 controls the propulsion device 3, the braking device 4, and the steering device 5 so that the vehicle 106 follows the target track 108 generated by the action planning unit 42 at the predetermined timings. Fig. 6(A) shows the target track 108 generated by the 1st deceleration process, fig. 6(B) shows the target track 108 generated by the 2nd deceleration process, and fig. 7 shows the target track 108 generated by the 3rd deceleration process. The points shown in fig. 6(A), fig. 6(B), and fig. 7 are track points 109, that is, points that the vehicle 106 should reach at each time, and the target track 108 is obtained by connecting the track points. The shorter the interval between track points 109, the lower the vehicle speed.
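Since the track points are positions to be reached at successive fixed times, the spacing between consecutive points directly encodes the commanded speed. A minimal sketch, with assumed names and a fixed time step:

```python
import math

def speeds_from_track_points(points, dt):
    """Average speed over each segment of equally time-spaced track points.

    points: list of (x, y) positions reached at intervals of dt seconds.
    A shorter segment between consecutive points means a lower speed.
    """
    return [math.dist(a, b) / dt for a, b in zip(points, points[1:])]
```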
The target track 108 of the 1st deceleration process enters the shoulder 103 from the lane 102 at a relatively early timing and decelerates toward the parking position 107 within the shoulder 103. In contrast, the target track 108 of the 2nd deceleration process first decelerates within the lane 102 and then enters the shoulder 103 from the lane 102 at a timing later than in the 1st deceleration process.
In the 2nd deceleration process, the distance between the position at which the vehicle enters the shoulder 103 and the parking position 107 is shorter than in the 1st deceleration process. That is, the target track 108 of the 2nd deceleration process enters the shoulder 103 later than the target track 108 of the 1st deceleration process, at a position closer to the parking position 107.
In the 2nd deceleration process, the amount of deceleration before entering the shoulder 103 is larger than in the 1st deceleration process. That is, in the 2nd deceleration process, the amount of deceleration within the lane 102 is larger than in the 1st deceleration process. As a result, the speed (vehicle speed) at which the vehicle enters the shoulder 103 is lower in the 2nd deceleration process than in the 1st deceleration process. In the 1st deceleration process, the vehicle 106 enters the shoulder 103 after its speed falls to or below a 1st reference speed, whereas in the 2nd deceleration process, the vehicle 106 enters the shoulder 103 after its speed falls to or below a 2nd reference speed lower than the 1st reference speed.
In the target track 108 of the 2nd deceleration process, the distance traveled by the vehicle 106 on the shoulder 103 is shorter, and the distance traveled in the lane 102 is longer, than in the target track 108 of the 1st deceleration process.
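The entry-speed rule above can be sketched as a gate on shoulder entry: the vehicle may enter only once its speed is at or below the reference speed of the active process, and the 2nd reference speed is lower than the 1st. The numeric speeds are illustrative assumptions.

```python
REF_SPEED_1_KMH = 40.0  # hypothetical 1st reference speed
REF_SPEED_2_KMH = 20.0  # hypothetical 2nd reference speed (set lower than the 1st)

def may_enter_shoulder(vehicle_speed_kmh, process):
    """True once the vehicle has slowed enough to enter the shoulder."""
    limit = REF_SPEED_1_KMH if process == "1st" else REF_SPEED_2_KMH
    return vehicle_speed_kmh <= limit
```

At the same current speed, the 2nd process may thus require further deceleration in the lane before the lane change is permitted, which is why its in-lane travel distance is longer.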
In the 3rd deceleration process, the parking position 107 previously set on the shoulder 103 is shifted laterally, and a new parking position 110 is set in the travel lane 102. The target track 108 of the 3rd deceleration process travels straight in the travel lane 102 and decelerates toward the parking position 110.
With the above configuration, the action planning unit 42 can stop the vehicle 106 on the shoulder 103 either safely or without obstructing the passage of following vehicles, depending on the state of the shoulder 103. When the state of the shoulder 103 is good, the vehicle 106 enters the shoulder 103 at a relatively early stage and travels while decelerating within the shoulder 103 toward the parking position 107. This prevents the vehicle 106 from obstructing the passage of following vehicles. When the state of the shoulder 103 is poor, the vehicle 106 enters the shoulder 103 only after approaching the parking position 107, so that the vehicle 106 can be parked safely.
When the state of the shoulder 103 is not suitable for safe travel of the vehicle, the speed at which the vehicle 106 enters the shoulder 103 from the lane 102 is lower than when the state of the shoulder 103 is suitable for safe travel. The vehicle 106 can therefore travel on the shoulder 103 more safely. In addition, when the state of the shoulder 103 is poor, the distance traveled by the vehicle 106 on the shoulder 103 is shorter than when the state of the shoulder 103 is good. The vehicle 106 can therefore reach the parking position 107 more safely.
Further, when the state of the shoulder is so poor that it is not suitable even for parking of the vehicle, parking on the shoulder can be avoided and the vehicle can instead be stopped safely in the driving lane.
This completes the description of the specific embodiments, but the present invention is not limited to the above embodiments and can be widely modified. For example, the shoulder information may include weather information or temperature information. Based on the weather information, the action planning unit 42 may determine that the road surface state is not good when the weather is, for example, snow or rain. Based on the temperature information, the action planning unit 42 may determine that the road surface state is not good when the temperature is, for example, below the freezing point, because the shoulder 103 may freeze.
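The weather and temperature modification described above can be sketched as overrides on the road surface judgment. The function name, input encoding, and score threshold are illustrative assumptions.

```python
FREEZING_POINT_C = 0.0

def road_surface_good(weather, temperature_c, road_surface_score, score_ref=0.7):
    """Judge the shoulder road surface, with weather/temperature overrides."""
    if weather in ("snow", "rain"):
        return False           # surface judged not good in snow or rain
    if temperature_c <= FREEZING_POINT_C:
        return False           # below freezing: the shoulder may be frozen
    return road_surface_score >= score_ref
```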
Claims (8)
1. A vehicle control system, wherein,
the vehicle control system includes:
a control device that automatically performs at least one of speed control and steering control of a vehicle; and
an environment recognition device capable of detecting a condition around the vehicle,
the control device executes a parking process for stopping the vehicle at a predetermined parking position when, during travel of the vehicle, a predetermined condition is satisfied under which it is difficult for the control device or the driver to continue travel of the vehicle,
during the parking process, the control device determines the parking position based on a detection result of the environment recognition device and acquires a state of a shoulder, and changes at least one of a speed at which the vehicle enters the shoulder and a distance between a position at which the vehicle enters the shoulder and the parking position in accordance with the state of the shoulder.
2. The vehicle control system according to claim 1,
during the parking process, the control device sets a determination value in accordance with the state of the shoulder, executes a 1st deceleration process when the determination value is equal to or greater than a travel reference value at which the vehicle can travel safely given the state of the shoulder, and executes a 2nd deceleration process when the determination value is smaller than the travel reference value, the speed at which the vehicle enters the shoulder being lower in the 2nd deceleration process than in the 1st deceleration process.
3. The vehicle control system according to claim 2,
in the 2nd deceleration process, the control device makes the distance between the position at which the vehicle enters the shoulder and the parking position shorter than in the 1st deceleration process.
4. The vehicle control system according to claim 2 or 3,
in the 2nd deceleration process, the control device makes the amount of deceleration before entering the shoulder larger than in the 1st deceleration process.
5. The vehicle control system according to claim 2,
in the 1st deceleration process, the control device causes the vehicle to enter the shoulder after the speed falls to or below a 1st reference speed, and in the 2nd deceleration process, the control device causes the vehicle to enter the shoulder after the speed falls to or below a 2nd reference speed lower than the 1st reference speed.
6. The vehicle control system according to claim 2,
in the 1st deceleration process, the control device makes the amount of deceleration within the shoulder larger than in the 2nd deceleration process.
7. The vehicle control system according to claim 2,
the control device sets the parking position in a driving lane and executes a 3rd deceleration process of decelerating within the driving lane when the determination value is smaller than a parking reference value at which the vehicle can be parked safely given the state of the shoulder.
8. The vehicle control system according to claim 1,
the state of the shoulder includes at least one of a degree of recognition of a boundary line between the shoulder and a lane, a degree of recognition of a road end, a road surface state of the shoulder, and a width of the shoulder.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019065722A JP6917406B2 (en) | 2019-03-29 | 2019-03-29 | Vehicle control system |
JP2019-065722 | 2019-03-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111824127A true CN111824127A (en) | 2020-10-27 |
CN111824127B CN111824127B (en) | 2024-04-05 |
Family
ID=72715624
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010227355.XA Active CN111824127B (en) | 2019-03-29 | 2020-03-27 | Vehicle control system |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6917406B2 (en) |
CN (1) | CN111824127B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115309169A (en) * | 2022-10-11 | 2022-11-08 | 天地科技股份有限公司 | Underground unmanned vehicle control method and device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7508036B2 (en) | 2020-11-16 | 2024-07-01 | マツダ株式会社 | Vehicle Control Systems |
JP7506851B2 (en) | 2020-11-16 | 2024-06-27 | マツダ株式会社 | Vehicle Control Systems |
JP7515782B2 (en) | 2020-11-16 | 2024-07-16 | マツダ株式会社 | Vehicle Control Systems |
JP7512857B2 (en) | 2020-11-16 | 2024-07-09 | マツダ株式会社 | Vehicle Control Systems |
WO2022181714A1 (en) * | 2021-02-24 | 2022-09-01 | アナウト株式会社 | Surgery details evaluation system, surgery details evaluation method, and computer program |
JP7533336B2 (en) | 2021-04-22 | 2024-08-14 | トヨタ自動車株式会社 | Vehicle Systems |
JPWO2023067879A1 (en) * | 2021-10-20 | 2023-04-27 | ||
JP7241847B1 (en) | 2021-11-05 | 2023-03-17 | 三菱電機株式会社 | Evacuation driving support device, evacuation driving support method, evacuation driving support program, and recording medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1942335A (en) * | 2004-04-21 | 2007-04-04 | 西门子公司 | Assistance system for motor vehicles |
JP2007331652A (en) * | 2006-06-16 | 2007-12-27 | Toyota Motor Corp | Vehicle stopping device |
US20150006012A1 (en) * | 2011-11-14 | 2015-01-01 | Robert Bosch Gmbh | Method for Safely Parking a Vehicle in an Emergency Situation
CN105416284A (en) * | 2014-09-12 | 2016-03-23 | 爱信精机株式会社 | Parking Assist System |
CN107804318A (en) * | 2016-09-08 | 2018-03-16 | 福特全球技术公司 | trapezoidal parking |
2019
- 2019-03-29 JP JP2019065722A patent/JP6917406B2/en active Active
2020
- 2020-03-27 CN CN202010227355.XA patent/CN111824127B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN111824127B (en) | 2024-04-05 |
JP2020163984A (en) | 2020-10-08 |
JP6917406B2 (en) | 2021-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111824127B (en) | Vehicle control system | |
CN111746511B (en) | Vehicle control system | |
CN111762189B (en) | Vehicle control system | |
CN111824126B (en) | Vehicle control system | |
US20200307573A1 (en) | Vehicle control system | |
CN111824128B (en) | Vehicle control system | |
US11760380B2 (en) | Vehicle control system | |
JP7075908B2 (en) | Vehicle control system | |
US11307582B2 (en) | Vehicle control device, vehicle control method and storage medium | |
US20200307631A1 (en) | Vehicle control system | |
US20200307638A1 (en) | Vehicle control system | |
JP2020163986A (en) | Vehicle control system | |
CN111762025B (en) | Vehicle control system | |
US20190095724A1 (en) | Surroundings monitoring device, surroundings monitoring method, and storage medium | |
JP7184694B2 (en) | vehicle control system | |
JP2022152869A (en) | Vehicle control device, vehicle control method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||