CN111619571B - Vehicle control system, vehicle control method, and storage medium - Google Patents


Info

Publication number
CN111619571B
Authority
CN
China
Prior art keywords
vehicle
parking
occupant
candidate
unit
Prior art date
Legal status
Active
Application number
CN202010114244.8A
Other languages
Chinese (zh)
Other versions
CN111619571A (en)
Inventor
照田八州志
野口顺平
原悠记
田口龙马
高田雄太
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN111619571A
Application granted
Publication of CN111619571B
Status: Active
Anticipated expiration


Classifications

    • B60W30/18009: Propelling the vehicle related to particular drive situations
    • B60W30/181: Preparing for stopping
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions

Abstract

A vehicle control system, a vehicle control method, and a storage medium capable of stopping a vehicle based on a predicted behavior of an occupant. The vehicle control system includes: a recognition unit that recognizes the surrounding environment of a vehicle; and a driving control unit that performs speed control and steering control of the vehicle based on a result of recognition by the recognition unit. When the vehicle is moved to and stopped at a riding position of an occupant candidate, the driving control unit controls the vehicle based on the position of the occupant candidate recognized by the recognition unit and the state of equipment provided at the riding position: it performs first parking control for parking the vehicle at the position of the occupant candidate when the occupant candidate is present at the riding position and the equipment is not provided, and performs second parking control for parking the vehicle based on the position of the equipment when the occupant candidate is present at the riding position and the equipment is provided.

Description

Vehicle control system, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control system, a vehicle control method and a storage medium.
Background
In recent years, research on automatic vehicle control has been advancing. In this connection, a technique is known in which, when an occupant of a vehicle is holding luggage, the vehicle is stopped at a position where the luggage can easily be loaded (for example, Japanese Patent Application Laid-Open No. 2017-185954).
Disclosure of Invention
Even when an occupant is holding luggage, depending on the action the occupant will take next, it may not be preferable to stop the vehicle in the immediate vicinity of the occupant. In the conventional technique, however, it is sometimes difficult to determine the parking position of the vehicle according to the state of the occupant.
The present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control system, a vehicle control method, and a storage medium capable of stopping a vehicle in accordance with a predicted behavior of an occupant.
The vehicle control system, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): A vehicle control system according to an aspect of the present invention includes: a recognition unit that recognizes the surrounding environment of a vehicle; and a driving control unit that performs speed control and steering control of the vehicle based on a result of recognition by the recognition unit, wherein, when the vehicle is moved to and stopped at a riding position of an occupant candidate, the driving control unit controls the vehicle based on the position of the occupant candidate recognized by the recognition unit and the state of equipment provided at the riding position, performs first parking control for parking the vehicle at the position of the occupant candidate when the occupant candidate is present at the riding position and the equipment is not provided, and performs second parking control for parking the vehicle based on the position of the equipment when the occupant candidate is present at the riding position and the equipment is provided.
(2): in the aspect of (1) above, when the recognition unit recognizes that the occupant candidate is holding a cart, the driving control unit, in the second parking control, stops the vehicle at a position closer to the installation position of the equipment recognized by the recognition unit than to the position of the occupant candidate.
(3): in the aspect of (2) above, when the installation position of the equipment is present in the traveling direction of the vehicle, the driving control unit causes the vehicle to follow the movement of the occupant candidate to the installation position of the equipment in the second parking control.
(4): in the aspect of (3) above, when a part of the occupants have already been seated in the vehicle and another part of the occupants are present at the riding position as occupant candidates, the driving control unit causes the vehicle to follow the movement of the occupant candidates in the second parking control when the occupants already seated in the vehicle are in a state that is safe even if traveling is started.
(5): in the above-described aspects (3) to (4), the driving control unit locks the doors of the vehicle and causes the vehicle to follow the movement of the occupant candidate when the occupant is not seated in the vehicle.
(6): in the above-described aspects (3) to (5), the driving control unit causes the vehicle to follow the movement of the occupant candidate in response to an instruction to lock the door of the vehicle when the occupant is not seated in the vehicle.
(7): in addition to the above-described aspects (2) to (6), the driving control unit acquires device position information indicating the installation position of the device at the riding position from a management device of the facility at the riding position, and determines the parking position of the vehicle based on the acquired device position information.
(8): in the above-described aspects (2) to (6), the driving control unit determines the parking position of the vehicle based on the device position information indicating the installation position of the device, which is recognized by the recognition unit when the vehicle has arrived at the riding position.
(9): in addition to any one of the above (1) to (8), the vehicle control system further includes a snow recognition unit that recognizes the degree of snow deposited on the vehicle, and in the second parking control the driving control unit stops the vehicle at a position close to the installation position of a snow removing tool when the snow recognition unit recognizes that the degree is equal to or greater than a threshold value beyond which the vehicle cannot continue traveling and the recognition unit recognizes the installation position of the snow removing tool.
(10): in the aspect of (9) above, the vehicle control system further includes a notification unit that notifies the occupant candidate that snow removal is required when the snow recognition unit recognizes that the degree is equal to or greater than the threshold value beyond which the vehicle cannot continue traveling.
(11): in addition to the aspects (9) to (10), the vehicle control system further includes a notification unit that notifies, toward the outside of the vehicle, that traveling is possible when the snow recognition unit recognizes that the degree has become smaller than the threshold value and the vehicle is able to travel.
(12): A vehicle control method according to an aspect of the present invention causes a computer to execute: recognizing the surrounding environment of a vehicle; performing speed control and steering control of the vehicle based on the recognition result; when the vehicle is moved to and stopped at a riding position of an occupant candidate, controlling the vehicle based on the recognized position of the occupant candidate and the state of equipment provided at the riding position; performing first parking control for parking the vehicle at the position of the occupant candidate when the occupant candidate is present at the riding position and the equipment is not provided; and performing second parking control for parking the vehicle based on the position of the equipment when the occupant candidate is present at the riding position and the equipment is provided.
(13): A storage medium according to an aspect of the present invention stores a program that causes a computer to execute: recognizing the surrounding environment of a vehicle; performing speed control and steering control of the vehicle based on the recognition result; when the vehicle is moved to and stopped at a riding position of an occupant candidate, controlling the vehicle based on the recognized position of the occupant candidate and the state of equipment provided at the riding position; performing first parking control for parking the vehicle at the position of the occupant candidate when the occupant candidate is present at the riding position and the equipment is not provided; and performing second parking control for parking the vehicle based on the position of the equipment when the occupant candidate is present at the riding position and the equipment is provided.
According to (1) to (13), the vehicle can be stopped based on the predicted behavior of the occupant.
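For illustration only, the branching between the first parking control and the second parking control described in aspects (1), (12), and (13) can be sketched as follows; the class, field, and function names are placeholders and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Position = Tuple[float, float]

@dataclass
class RecognitionResult:
    """Simplified recognition output (placeholder fields)."""
    occupant_candidate_position: Optional[Position]  # None if nobody is waiting
    equipment_position: Optional[Position]           # None if no equipment at the riding position

def choose_parking_target(rec: RecognitionResult) -> Optional[Position]:
    """Return the position at which the vehicle should be stopped.

    First parking control : stop at the occupant candidate's position
                            (occupant candidate present, no equipment).
    Second parking control: stop based on the equipment's position
                            (occupant candidate present, equipment provided).
    """
    if rec.occupant_candidate_position is None:
        return None  # nobody to pick up; keep waiting or re-plan
    if rec.equipment_position is None:
        return rec.occupant_candidate_position  # first parking control
    return rec.equipment_position               # second parking control

# Example: a cart-return corral exists near the riding position, so the
# vehicle is parked with reference to the corral rather than the person.
print(choose_parking_target(RecognitionResult((2.0, 0.0), (10.0, 1.5))))  # (10.0, 1.5)
```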
Drawings
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device according to an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160.
Fig. 3 is a diagram schematically illustrating a scenario in which an automatic parking event is performed.
Fig. 4 is a diagram showing an example of the structure of parking lot management device 400.
Fig. 5 is a diagram showing an example of a scene in which the vehicle M is stopped in the vicinity of the occupant candidate P.
Fig. 6 is a diagram showing an example of the content of the correspondence information 182.
Fig. 7 is a diagram showing an example of the parking position obtained based on the parking rule in the case where the object of the passenger candidate P is the cart CT.
Fig. 8 is a diagram showing an example of a scene in which the movement direction of the occupant candidate P matches the traveling direction of the host vehicle M.
Fig. 9 is a diagram showing an example of a parking position obtained based on a parking rule in the case where the vehicle M has snow.
Fig. 10 is a diagram showing an example of a parking position obtained based on a parking rule in a case where the own vehicle M has snow and the own vehicle M is not equipped with the snow removing tool RR.
Fig. 11 is a view showing an example of the parking position obtained based on the parking rule in the case where the passenger candidate P holds the baggage.
Fig. 12 is a flowchart showing an example of a series of operations of the automatic drive control device 100 according to the present embodiment.
Fig. 13 is a diagram showing an example of a hardware configuration of the automatic drive control device 100 according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention are described below with reference to the drawings. Hereinafter, a case in which left-hand traffic regulations apply will be described; when right-hand traffic regulations apply, left and right in the description may be read as reversed.
[ integral Structure ]
Fig. 1 is a block diagram of a vehicle system 1 using a vehicle control device according to an embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, four-wheeled or the like vehicle, and the drive source thereof is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof. The motor operates using generated power generated by a generator connected to the internal combustion engine or discharge power of the secondary battery or the fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a detector 14, an object recognition device 16, a snow camera 18, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a braking device 210, and a steering device 220. These devices and apparatuses are connected to each other by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in fig. 1 is merely an example; a part of the configuration may be omitted, and other configurations may be added.
The camera 10 is, for example, a digital camera using a solid-state imaging device such as CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor). The camera 10 is mounted on an arbitrary portion of a vehicle (hereinafter referred to as the host vehicle M) on which the vehicle system 1 is mounted. The camera 10, for example, periodically and repeatedly photographs the periphery of the host vehicle M. The camera 10 may also be a stereoscopic video camera.
The radar device 12 emits radio waves such as millimeter waves to the periphery of the host vehicle M, and detects at least the position (distance and azimuth) of the object by detecting the radio waves (reflected waves) reflected by the object. The radar device 12 is mounted on an arbitrary portion of the host vehicle M. The radar device 12 may also detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
The detector 14 is LIDAR (Light Detection and Ranging). The detector 14 irradiates light to the periphery of the vehicle M, and measures the scattered light. The detector 14 detects the distance to the object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The detector 14 is mounted on an arbitrary portion of the host vehicle M.
The object recognition device 16 performs sensor fusion processing on detection results detected by some or all of the camera 10, the radar device 12, and the detector 14, thereby recognizing the position, the type, the speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may directly output the detection results of the camera 10, the radar device 12, and the detector 14 to the autopilot control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The snow camera 18 is, for example, a digital camera using a solid-state imaging device such as a CCD or CMOS. The snow camera 18 is mounted at an arbitrary position from which snow deposited on the host vehicle M, for example on the front window or the hood (or on the roof, in the case of a vehicle shaped so that snow does not accumulate on the front window or the hood), can be imaged. For example, the snow camera 18 is attached to the back side of the rear-view mirror inside the vehicle cabin, at a position where snow deposited on the hood can be captured through the front windshield above the instrument panel. The snow camera 18 photographs the hood and roof of the host vehicle M periodically or at predetermined timings, for example.
The communication device 20 communicates with other vehicles, parking lot management devices (described later), or various server devices existing in the vicinity of the host vehicle M, for example, using a cellular network, a Wi-Fi network, bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like.
The HMI30 presents various information to the occupant of the own vehicle M, and accepts an input operation by the occupant. HMI30 includes various display devices, speakers, buzzers, touch panels, switches, keys, etc.
The vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects the angular velocity about the vertical axis, an azimuth sensor that detects the direction of the host vehicle M, and the like.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be determined or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40. The navigation HMI52 includes a display device, speakers, a touch panel, keys, and the like. Part or all of the navigation HMI52 may be shared with the HMI30 described above. The route determination unit 53 refers to the first map information 54, for example, and determines a route (hereinafter referred to as an on-map route) from the position of the host vehicle M specified by the GNSS receiver 51 (or any input position) to the destination input by the occupant using the navigation HMI52. The first map information 54 is, for example, information in which road shapes are expressed by links representing roads and nodes connected by the links. The first map information 54 may include road curvature, POI (Point Of Interest) information, and the like.
The route on the map is output to the MPU 60. The navigation device 50 may also perform route guidance using the navigation HMI52 based on the route on the map. The navigation device 50 may be realized by the function of a terminal device (hereinafter, referred to as a terminal device TM) such as a smart phone or a tablet terminal held by an occupant, for example. The navigation device 50 may transmit the current position and the destination to the navigation server via the communication device 20, and acquire a route equivalent to the route on the map from the navigation server.
The MPU60 includes, for example, a recommended lane determining unit 61, and holds the second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determining unit 61 divides the route on the map supplied from the navigation device 50 into a plurality of sections (for example, every 100 m in the vehicle traveling direction), and determines a recommended lane for each section by referring to the second map information 62. The recommended lane determining unit 61 determines, for example, which lane from the left to travel in. When there is a branching point in the route on the map, the recommended lane determining unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding to the branch destination.
The second map information 62 is map information having higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on a lane boundary, and the like. The second map information 62 may include road information, traffic restriction information, residence information (residence-postal code), facility information, telephone number information, and the like. The second map information 62 may be updated at any time by the communication device 20 communicating with other devices.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, an irregularly shaped steering member, a joystick, and other operation elements. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to the automatic driving control device 100, or to some or all of the running driving force output device 200, the brake device 210, and the steering device 220.
The automatic driving control device 100 includes, for example, a first control unit 120, a second control unit 160, and a storage unit 180. The first control unit 120 and the second control unit 160 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software), for example. Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory of the automatic driving control device 100, or may be stored in a removable storage medium such as a DVD or a CD-ROM and installed in the HDD or flash memory of the automatic driving control device 100 by mounting the storage medium (non-transitory storage medium) on a drive device. Correspondence information 182 is stored in the storage unit 180. Details of the correspondence information 182 are described later.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The first control unit 120 realizes a function based on AI (Artificial Intelligence; artificial intelligence) and a function based on a model given in advance in parallel, for example. For example, the function of "identifying an intersection" can be realized by performing, in parallel, identification of an intersection by deep learning or the like and identification based on a condition (presence of a signal, a road sign, or the like that enables pattern matching) given in advance, and scoring both sides to comprehensively evaluate them. Thereby, reliability of automatic driving is ensured.
The recognition unit 130 recognizes the position, speed, acceleration, and other states of the object located in the vicinity of the host vehicle M based on the information input from the camera 10, the radar device 12, and the detector 14 via the object recognition device 16. The position of the object is identified as a position on absolute coordinates with the representative point (center of gravity, drive shaft center, etc.) of the host vehicle M as an origin, for example, and is used in control. The position of the object may be represented by a representative point such as the center of gravity or the corner of the object, or by a region to be represented. The "state" of the object may also include acceleration, jerk, or "behavior" of the object (e.g., whether a lane change is being made or is about to be made).
The identifying unit 130 identifies, for example, a lane (driving lane) in which the host vehicle M is driving. For example, the identifying unit 130 identifies the driving lane by comparing the pattern of the road dividing line (for example, the arrangement of the solid line and the broken line) obtained from the second map information 62 with the pattern of the road dividing line around the host vehicle M identified from the image captured by the camera 10. The identification unit 130 is not limited to the road dividing line, and may identify the driving lane by identifying a driving path boundary (road boundary) including a road dividing line, a road shoulder, a curb, a center isolation belt, a guardrail, and the like. In this identification, the position of the host vehicle M acquired from the navigation device 50 and the processing result by the INS may be taken into consideration. The identification section 130 identifies temporary stop lines, obstacles, red lights, tollgates, and other road phenomena.
The recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the driving lane when recognizing the driving lane. The recognition unit 130 may recognize, for example, the deviation of a reference point of the host vehicle M from the lane center and the angle formed between the traveling direction of the host vehicle M and a line extending along the lane center, as the relative position and posture of the host vehicle M with respect to the driving lane. Instead of this, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to either side end of the driving lane (a road dividing line or a road boundary), or the like, as the relative position of the host vehicle M with respect to the driving lane.
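As a rough illustration of the relative position and posture described above, the lateral deviation from the lane center and the heading angle relative to the lane direction might be computed as in the sketch below; the geometry, sign conventions, and names are assumptions, not the patent's implementation.

```python
import math

def lane_relative_pose(vehicle_xy, vehicle_heading_rad, center_p0, center_p1):
    """Compute (lateral_offset, heading_error) of the vehicle reference point
    relative to a locally straight lane-center segment p0 -> p1.

    lateral_offset > 0 means the reference point lies to the left of the
    lane-center direction; heading_error is the signed angle between the
    vehicle's traveling direction and the lane-center direction.
    """
    dx, dy = center_p1[0] - center_p0[0], center_p1[1] - center_p0[1]
    lane_heading = math.atan2(dy, dx)

    # Vector from the lane-center point to the vehicle reference point.
    rx, ry = vehicle_xy[0] - center_p0[0], vehicle_xy[1] - center_p0[1]

    # Signed lateral offset: cross product of lane direction and offset vector.
    norm = math.hypot(dx, dy)
    lateral_offset = (dx * ry - dy * rx) / norm

    # Wrap the heading error to (-pi, pi].
    heading_error = (vehicle_heading_rad - lane_heading + math.pi) % (2 * math.pi) - math.pi
    return lateral_offset, heading_error

# Vehicle 0.4 m left of a lane running along +x, heading 5 degrees off the lane.
print(lane_relative_pose((3.0, 0.4), math.radians(5.0), (0.0, 0.0), (10.0, 0.0)))
```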
The identification unit 130 includes a snow identification unit 131 and a parking space identification unit 132 that is activated in an automatic parking event described later. The details of the functions of the parking space recognition portion 132 will be described later.
The snow recognizing unit 131 recognizes the degree of snow deposited on the host vehicle M (hereinafter referred to as the snow level) based on, for example, an image captured by the snow camera 18. For example, the snow recognizing unit 131 analyzes the image and recognizes the degree of snow deposited on the hood by comparing the luminance values in the image with a luminance value corresponding to snow and a luminance value of the hood of the host vehicle M stored in advance in the storage device. The snow recognizing unit 131 may also analyze the image and recognize the degree of snow deposited on the vehicle based on luminance values, stored in advance in the storage device, that correspond to the amount (thickness) of deposited snow, the luminance values in the image, and the illuminance around the host vehicle M. In this case, the illuminance is obtained by an illuminance sensor provided in the vehicle. The snow camera 18 may be omitted, and the image captured by the camera 10 may be used for the processing of the snow recognizing unit 131. The degree of snow deposited on the vehicle may also be obtained from another device; for example, it may be provided by a monitoring device that monitors snow accumulated in the parking lot or by an information providing device that provides information on the amount of snowfall around the parking lot. An image obtained by capturing the snow deposited on the vehicle may also be obtained from another device.
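A minimal sketch of the luminance comparison described above; the region of interest, the illuminance normalization, and the resulting coverage ratio are illustrative assumptions rather than values or steps taken from the patent.

```python
def estimate_snow_level(hood_pixels, snow_luminance_ref, hood_luminance_ref,
                        ambient_illuminance_lux=None):
    """Estimate the fraction of the hood region covered by snow from
    grayscale pixel values (0-255).

    A pixel is counted as 'snow' if it is closer to the stored snow
    luminance than to the stored (snow-free) hood luminance.  An optional
    illuminance reading can be used to normalise the image brightness.
    """
    if ambient_illuminance_lux is not None and ambient_illuminance_lux > 0:
        # Crude normalisation toward a nominal daylight level (assumption).
        gain = min(2.0, 10000.0 / ambient_illuminance_lux)
        hood_pixels = [min(255.0, p * gain) for p in hood_pixels]

    snow_count = sum(
        1 for p in hood_pixels
        if abs(p - snow_luminance_ref) < abs(p - hood_luminance_ref)
    )
    return snow_count / max(1, len(hood_pixels))

# Bright pixels (snow) mixed with dark hood pixels.
pixels = [240, 235, 60, 250, 55, 245, 238, 62]
level = estimate_snow_level(pixels, snow_luminance_ref=245, hood_luminance_ref=60)
print(level, level >= 0.5)  # 0.625, True -> treat as "cannot continue traveling"
```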
The action plan generation unit 140 generates a target track along which the host vehicle M will automatically travel in the future (without depending on the driver's operation) so that, in principle, the host vehicle M travels in the recommended lane determined by the recommended lane determining unit 61 and, furthermore, can cope with the surrounding situation of the host vehicle M. The target track contains, for example, a speed element. For example, the target track is expressed as a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach every predetermined travel distance (for example, about several [m]) along the road; separately from this, a target speed and a target acceleration for every predetermined sampling time (for example, about several tenths of a [sec]) are generated as part of the target track. A track point may also be a position that the host vehicle M should reach at each sampling time. In this case, the information on the target speed and the target acceleration is expressed by the interval between the track points.
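As an illustration of how the target speed can be implied by the interval between track points, a small sketch follows (names and units assumed).

```python
import math

def implied_speeds(track_points, sampling_time_s=0.1):
    """Given track points the vehicle should reach at successive sampling
    times, the target speed between consecutive points is the distance
    divided by the sampling interval."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(track_points, track_points[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / sampling_time_s)
    return speeds

# Points spaced 1.0 m apart every 0.1 s correspond to a target speed of 10 m/s.
pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
print(implied_speeds(pts))  # [10.0, 10.0, 10.0]
```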
The action plan generation unit 140 may set an event of automated driving when generating the target track. Examples of automated driving events include a constant-speed traveling event, a low-speed following traveling event, a lane change event, a branching event, a merging event, a takeover event, and an automatic parking event in which the vehicle travels unmanned and parks in valet parking or the like. The action plan generation unit 140 generates a target track corresponding to the started event. The action plan generation unit 140 includes an automatic parking control unit 142 that is activated when an automatic parking event is executed. Details of the functions of the automatic parking control unit 142 will be described later.
The second control unit 160 controls the running driving force output device 200, the braking device 210, and the steering device 220 so that the host vehicle M passes along the target track generated by the action plan generation unit 140 at the scheduled times.
Returning to fig. 2, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information of the target track (track point) generated by the action plan generation unit 140 and stores the information in a memory (not shown). The speed control portion 164 controls the running driving force output device 200 or the braking device 210 based on the speed element attached to the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the curve of the target track stored in the memory. The processing by the speed control unit 164 and the steering control unit 166 is realized by a combination of feedforward control and feedback control, for example. As an example, the steering control unit 166 combines and executes a feedforward control according to the curvature of the road ahead of the host vehicle M and a feedback control based on the deviation from the target track. The combination of the action plan generation unit 140 and the second control unit 160 is an example of a "driving control unit".
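A schematic sketch of the combined feedforward/feedback steering control mentioned above; the kinematic-bicycle feedforward term and the proportional feedback gains are assumptions for illustration, not the patent's controller.

```python
import math

def steering_command(road_curvature, wheelbase_m,
                     lateral_error_m, heading_error_rad,
                     k_lat=0.5, k_head=1.0):
    """Steering angle [rad] = feedforward from road curvature
    + feedback on the deviation from the target track.

    Feedforward: kinematic bicycle model, delta_ff = atan(L * kappa).
    Feedback:    proportional terms on lateral and heading error (assumed gains).
    """
    delta_ff = math.atan(wheelbase_m * road_curvature)
    delta_fb = -(k_lat * lateral_error_m + k_head * heading_error_rad)
    return delta_ff + delta_fb

# Gentle left curve (radius 200 m), vehicle 0.2 m right of the track,
# heading aligned with it.
print(steering_command(road_curvature=1.0 / 200.0, wheelbase_m=2.7,
                       lateral_error_m=-0.2, heading_error_rad=0.0))
```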
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the driving wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and ECU (Electronic Control Unit) for controlling these. The ECU controls the above configuration in accordance with information input from the second control portion 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control portion 160 or information input from the driving operation member 80, and outputs a braking torque corresponding to a braking operation to each wheel. The brake device 210 may be provided with a mechanism for transmitting, to the hydraulic cylinder via the master cylinder, hydraulic pressure generated by operation of a brake pedal included in the drive operation element 80, as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor.
The electric motor changes the direction of the steered wheel, for example, by applying a force to a rack-and-pinion mechanism. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80, and changes the direction of the steered wheels.
[ automatic parking event-warehouse entry time ]
The automatic parking control unit 142, for example, parks the host vehicle M in a parking space based on information acquired from the parking lot management device 400 via the communication device 20. Fig. 3 is a diagram schematically illustrating a scenario in which an automatic parking event is performed. A gate 300-in and a gate 300-out are provided on the route from the road Rd to the facility to be visited. The host vehicle M passes through the gate 300-in and travels to the stop area 310 by manual driving or automatic driving. The stop area 310 faces a boarding area 320 connected to the facility to be visited. The boarding area 320 is provided with eaves for sheltering from rain and snow.
After the occupant gets off in the stop area 310, the host vehicle M starts an automatic parking event in which it performs unmanned automatic driving and moves to a parking space PS in the parking lot PA. Details of the start trigger conditions of the automatic parking event related to parking will be described later. When an automatic parking event is started, the automatic parking control unit 142 controls the communication device 20 to transmit a parking request to the parking lot management device 400. The host vehicle M then moves from the stop area 310 to the parking lot PA in accordance with guidance from the parking lot management device 400, or moves from the stop area 310 to the parking lot PA while sensing its surroundings on its own.
Fig. 4 is a diagram showing an example of the structure of parking lot management device 400. The parking lot management device 400 includes, for example, a communication unit 410, a control unit 420, and a storage unit 430. The storage unit 430 stores information such as parking lot map information 432 and a parking space state table 434.
The communication unit 410 communicates with the host vehicle M and other vehicles wirelessly. The control unit 420 guides the vehicle to a parking space PS based on the information acquired by the communication unit 410 and the information stored in the storage unit 430. The parking lot map information 432 is information geometrically representing the structure of the parking lot PA. The parking lot map information 432 contains the coordinates of each parking space PS. The parking space state table 434 associates, for example, a parking space ID, which is identification information of a parking space PS, with a state indicating whether the space is empty or full (occupied) and, in the case of the full state, with the vehicle ID, which is identification information of the parked vehicle.
When the communication unit 410 receives a parking request from a vehicle, the control unit 420 refers to the parking space state table 434, extracts a parking space PS in the empty state, acquires the position of the extracted parking space PS from the parking lot map information 432, and transmits a suitable route to the acquired position of the parking space PS to the vehicle using the communication unit 410. The control unit 420 also instructs specific vehicles to stop, slow down, and so on as necessary, based on the positional relationship of the plurality of vehicles, so that vehicles do not travel to the same position at the same time.
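For illustration, the lookup against the parking space state table might look like the sketch below; the table layout, identifiers, and function names are placeholders rather than the parking lot management device 400's actual implementation.

```python
# Parking space state table: space ID -> (state, vehicle ID when full).
parking_space_state = {
    "PS-001": ("full", "VEH-123"),
    "PS-002": ("empty", None),
    "PS-003": ("empty", None),
}

# Parking lot map information: space ID -> coordinates in the lot.
parking_lot_map = {
    "PS-001": (5.0, 2.5),
    "PS-002": (5.0, 5.0),
    "PS-003": (5.0, 7.5),
}

def handle_parking_request(vehicle_id):
    """Pick an empty space, mark it as occupied, and return its position
    (to which a suitable route would then be transmitted to the vehicle)."""
    for space_id, (state, _) in parking_space_state.items():
        if state == "empty":
            parking_space_state[space_id] = ("full", vehicle_id)
            return space_id, parking_lot_map[space_id]
    return None, None  # parking lot is full

print(handle_parking_request("VEH-456"))  # ('PS-002', (5.0, 5.0))
```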
In a vehicle that has received the route (hereinafter referred to as the host vehicle M), the automatic parking control unit 142 generates a target track based on the route. When the vehicle approaches the target parking space PS, the parking space recognition unit 132 recognizes the parking frame lines or the like that demarcate the parking space PS, recognizes the detailed position of the parking space PS, and supplies it to the automatic parking control unit 142. The automatic parking control unit 142 receives the detailed position of the parking space PS, corrects the target track, and parks the host vehicle M in the parking space PS.
Not limited to the above description, the automatic parking control unit 142 may autonomously find an empty parking space based on the detection result detected by the camera 10, the radar device 12, the detector 14, or the object recognition device 16, and park the vehicle M in the found parking space.
[ automatic parking event-time of leaving warehouse ]
The automatic parking control unit 142 and the communication device 20 remain in an operating state while the host vehicle M is parked. For example, when the communication device 20 receives a pick-up request from the occupant's terminal device TM, or when a preset delivery time arrives, the automatic parking control unit 142 starts the system of the host vehicle M and moves the host vehicle M to the stop area 310. At this time, the automatic parking control unit 142 controls the communication device 20 to transmit a start request to the parking lot management device 400. As in the case of warehouse entry, the control unit 420 of the parking lot management device 400 instructs specific vehicles to stop, slow down, and so on as necessary, based on the positional relationship of the plurality of vehicles, so that vehicles do not travel to the same position at the same time. When the host vehicle M has been moved to the stop area 310 and the occupant has boarded, the automatic parking control unit 142 stops operating, and thereafter manual driving or automatic driving by other functional units is started.
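A small sketch of the start condition for moving the vehicle back to the stop area described above; the flag and time fields are assumed names.

```python
from datetime import datetime

def should_start_pickup(pickup_request_received: bool,
                        preset_delivery_time: datetime,
                        now: datetime) -> bool:
    """The exit-side automatic parking event starts either when a pick-up
    request arrives from the occupant's terminal device TM or when the
    preset delivery (exit) time is reached."""
    return pickup_request_received or now >= preset_delivery_time

print(should_start_pickup(False, datetime(2020, 2, 25, 18, 0), datetime(2020, 2, 25, 18, 5)))  # True
```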
[ determination of parking position in automatic parking event ]
In the automatic parking event related to the delivery, the automatic parking control unit 142 controls the host vehicle M so as to stop the host vehicle M in the stop area 310 in the vicinity of a candidate for an occupant of the host vehicle M (hereinafter referred to as occupant candidate P). Fig. 5 is a diagram showing an example of a scene in which the vehicle M is stopped in the vicinity of the occupant candidate P. Hereinafter, the host vehicle M (t) represents the host vehicle M at the timing of moving to the stop area 310 under the control of the automatic parking control unit 142, and the host vehicle M (t+1) represents the host vehicle M at the timing of stopping in the stop area 310 by the automatic parking control unit 142.
First, the automatic parking control unit 142 moves the vehicle M to the stop area 310, for example, by an automatic parking event related to a delivery. When the vehicle M is moved to the stop area 310, the recognition unit 130 recognizes a person present in the boarding area 320, and recognizes the person as the occupant candidate P when the recognized person is a single person. The automatic parking control unit 142 controls the vehicle M so as to stop the vehicle M at a position near the occupant candidate P recognized by the recognition unit 130. The position near the occupant candidate P is, for example, a position where a door of a seat predetermined by the occupant candidate P approaches the occupant candidate P. Thus, the automatic parking control unit 142 can park the vehicle based on the predicted behavior of the occupant (in this case, riding).
The recognition unit 130 may authenticate the occupant candidate P when recognizing the occupant candidate P, and may recognize the occupant candidate P based on the result of the authentication. For example, the storage unit 180 may store occupant information such as face data and physical feature data of the occupant candidate P, and the recognition unit 130 may recognize a person captured by the camera 10 as the occupant candidate P when that person matches the occupant information. The recognition unit 130 may also receive, via the communication device 20, an image of the occupant candidate P captured by a monitoring camera provided around the boarding area 320, and recognize the person shown in the received image as the occupant candidate P when that person matches the occupant information. The recognition unit 130 may also recognize the occupant candidate P by transmitting and receiving information used to authenticate the occupant candidate P through communication between a device owned by the occupant candidate P (for example, a key fob, a communication terminal device, or the like) and the communication device 20.
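A toy sketch of matching a detected person against the stored occupant information; the feature vectors and the similarity threshold are placeholders, and an actual system would rely on a dedicated recognition pipeline.

```python
def matches_occupant(candidate_features, stored_features, threshold=0.9):
    """Treat the detected person as the occupant candidate P when the
    feature vector extracted from the camera image is sufficiently close
    (cosine similarity) to the occupant information stored in the storage
    unit 180."""
    dot = sum(a * b for a, b in zip(candidate_features, stored_features))
    norm_a = sum(a * a for a in candidate_features) ** 0.5
    norm_b = sum(b * b for b in stored_features) ** 0.5
    if norm_a == 0 or norm_b == 0:
        return False
    return dot / (norm_a * norm_b) >= threshold

stored = [0.2, 0.8, 0.1, 0.5]
detected = [0.21, 0.79, 0.12, 0.48]
print(matches_occupant(detected, stored))  # True -> treat as occupant candidate P
```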
[ determination of parking position based on parking rules ]
Here, depending on the state of the occupant candidate P, the automatic parking control unit 142 may enable the occupant candidate P to perform a subsequent action more smoothly by stopping the host vehicle M at a place other than the immediate vicinity of the occupant candidate P. The automatic parking control unit 142 basically parks the host vehicle M at a position near the occupant candidate P, but when, based on the recognition result of the recognition unit 130 and the correspondence information 182, the recognition result satisfies a parking rule, it determines a position other than the vicinity of the occupant candidate P as the parking position of the host vehicle M and parks the host vehicle M there. Fig. 6 is a diagram showing an example of the content of the correspondence information 182. In the correspondence information 182, parking rules are described that are generated based on predictions of what action the occupant candidate P will take next when the recognition result of the recognition unit 130 shows that the occupant candidate P is holding a specific object, that the host vehicle M is in a specific state, or that the occupant candidate P is in a specific state. The content of the correspondence information 182 will be described in detail in order below. The correspondence information 182 may be stored in the storage unit 180 in the form of a table, or may be stored in the storage unit 180 in the form of program code.
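The correspondence information 182 can be pictured as a small rule table that is consulted against the recognition result; the sketch below is illustrative only, and the condition fields, the snow threshold, and the rule wording are assumptions.

```python
# Each entry: (condition on the recognition result, resulting parking rule).
CORRESPONDENCE_INFO = [
    (lambda rec: rec.get("candidate_holds_cart"),
     "park near the cart placement site CY"),
    (lambda rec: rec.get("snow_level", 0.0) >= 0.5 and rec.get("snow_tool_onboard"),
     "park so the tool's stowed location (rear) is near the occupant candidate"),
    (lambda rec: rec.get("snow_level", 0.0) >= 0.5 and not rec.get("snow_tool_onboard"),
     "park near the snow removing tool's installation position PI"),
    (lambda rec: rec.get("candidate_holds_baggage_one_hand"),
     "park so a door is near the occupant candidate's free hand"),
]

def decide_parking_rule(recognition_result):
    """Basic behaviour is to park near the occupant candidate; a rule from
    the correspondence information overrides it when its condition holds."""
    for condition, rule in CORRESPONDENCE_INFO:
        if condition(recognition_result):
            return rule
    return "park near the occupant candidate P"

print(decide_parking_rule({"candidate_holds_cart": True}))
print(decide_parking_rule({"snow_level": 0.7, "snow_tool_onboard": False}))
```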
[ (1): parking rule in the case where the object held by the occupant candidate P is a cart CT ]
Fig. 7 is a diagram showing an example of the parking position obtained based on a parking rule in the case where the object held by the occupant candidate P is a cart CT. The correspondence information 182 includes information that associates the recognition result of the recognition unit 130 indicating that the occupant candidate P is holding a cart CT with a parking rule for parking the host vehicle M at a position close to a cart placement site CY. This is because, when the occupant candidate P is holding the cart CT, it is predicted that the occupant candidate P will return the cart CT to the cart placement site CY.
As shown in fig. 7, the automatic parking control unit 142 causes the host vehicle M to travel to the position of the host vehicle M (t) in accordance with an automatic parking event related to the delivery. The recognition unit 130 recognizes that the occupant candidate P is holding the cart CT based on the image captured by the camera 10 at the position of the host vehicle M (t). When it is recognized, based on the correspondence information 182 and the recognition result of the recognition unit 130, that the occupant candidate P is holding the cart CT, the automatic parking control unit 142 parks the host vehicle M at a position close to the cart placement site CY (the position of the host vehicle M (t+1) shown in the drawing). At this time, the recognition unit 130 determines the position of the cart placement site CY based on the image of the surrounding environment of the host vehicle M captured by the camera 10, and the automatic parking control unit 142 moves the host vehicle M to a position close to the cart placement site CY determined by the recognition unit 130 and stops it there. In this way, the automatic parking control unit 142 makes it easier for the occupant candidate P to board the host vehicle M after returning the cart CT to the cart placement site CY.
Fig. 8 is a diagram showing an example of a scene in which the movement direction of the occupant candidate P matches the traveling direction of the host vehicle M. Here, when the direction dr1 from the position of the occupant candidate P holding the cart CT, as recognized by the recognition unit 130, toward the position of the cart placement site CY matches the traveling direction dr2 of the host vehicle M, the automatic parking control unit 142 may cause the host vehicle M to travel toward the cart placement site CY so as to follow the occupant candidate P, in accordance with the movement of the occupant candidate P going to return the cart CT to the cart placement site CY. Following the occupant candidate P means, for example, traveling on the right side of the occupant candidate P or obliquely rearward of the occupant candidate P, at a speed equivalent to that of the occupant candidate P. In this way, the automatic parking control unit 142 can move the host vehicle M while letting the occupant candidate P recognize that the host vehicle M has recognized the cart placement site CY. Controlling the host vehicle M so that it travels toward the cart placement site CY following the movement of the occupant candidate P returning the cart CT is an example of the "second parking control".
When an occupant other than the occupant candidate P is already seated in the host vehicle M, the automatic parking control unit 142 may cause the host vehicle M to travel so as to follow the occupant candidate P if the seated occupant is in a state that is safe even when the host vehicle M starts traveling. For example, the automatic parking control unit 142 may cause the host vehicle M to travel so as to follow the occupant candidate P when the recognition unit 130 recognizes, based on an image captured by an in-vehicle camera (not shown), that the occupant is properly seated in a seat and is in a safe state. The automatic parking control unit 142 may also cause the host vehicle M to travel so as to follow the occupant candidate P when the recognition unit 130 recognizes that the occupant is wearing a seat belt. In this way, even when an occupant other than the occupant candidate P is riding in the host vehicle M, the automatic parking control unit 142 can cause the host vehicle M to safely follow the occupant candidate P.
When no occupant is in the host vehicle M and the host vehicle M is caused to travel so as to follow the occupant candidate P, the automatic parking control unit 142 may control a mechanism for locking the host vehicle M and cause the host vehicle M to travel so as to follow the occupant candidate P after locking the doors of the host vehicle M, or may cause the host vehicle M to travel so as to follow the occupant candidate P on condition that an instruction to lock the doors of the host vehicle M has been received from a terminal device or a smart key carried by the occupant candidate P (that is, using the instruction as a trigger condition). In this way, the automatic parking control unit 142 can prevent (prohibit) a person other than the occupant candidate P from getting into the host vehicle M while the host vehicle M is traveling following the occupant candidate P.
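A sketch of the conditions under which the host vehicle might be allowed to follow the occupant candidate, per the description above; the flag names are placeholders.

```python
def may_follow_candidate(directions_match: bool,
                         occupants_onboard: int,
                         all_seated_with_seatbelts: bool,
                         doors_locked: bool) -> bool:
    """Allow following the occupant candidate toward the cart placement
    site only when the movement direction matches the traveling direction
    and either (a) every onboard occupant is safely seated and belted, or
    (b) nobody is onboard and the doors have been locked."""
    if not directions_match:
        return False
    if occupants_onboard > 0:
        return all_seated_with_seatbelts
    return doors_locked

# No one onboard yet: wait for the lock instruction before following.
print(may_follow_candidate(True, 0, False, doors_locked=False))  # False
print(may_follow_candidate(True, 0, False, doors_locked=True))   # True
```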
In the above description, the recognition unit 130 specifies the position of the cart placement site CY based on the image of the surrounding environment of the host vehicle M captured by the camera 10, and the automatic parking control unit 142 moves the host vehicle M to a position close to the cart placement site CY specified by the recognition unit 130 and stops it there; however, the present invention is not limited to this. For example, the automatic parking control unit 142 may acquire information on the boarding area 320 from a management device or the like of the facility to be visited via the communication device 20, and cause the host vehicle M to travel based on that information. In this case, the information on the boarding area 320 is, for example, cart position information indicating the position of the cart placement site CY. In this way, even when another vehicle is present in front of the host vehicle M or in bad weather such as heavy fog, so that the recognition unit 130 cannot appropriately recognize the cart placement site CY, the automatic parking control unit 142 can stop the host vehicle M in the vicinity of the cart placement site CY or cause the host vehicle M to travel so as to follow the occupant candidate P to the cart placement site CY.
Instead of acquiring the cart position information from the management device or the like of the facility to be visited, the automatic parking control unit 142 may generate cart position information indicating the position of the cart placement site CY recognized by the recognition unit 130 when the host vehicle M previously arrived at the boarding area 320, and cause the host vehicle M to travel based on the generated cart position information. In this case, the storage unit 180 stores information in which the cart position information is associated with information that can identify the boarding area 320 (for example, the name of the facility to be visited), and the recognition unit 130 generates (updates) the cart position information each time the host vehicle M arrives at the boarding area 320.
[ (2): parking rules in the case of snow on the own vehicle M ]
Fig. 9 is a diagram showing an example of a parking position obtained based on a parking rule in the case where the host vehicle M has snow on it. The correspondence information 182 includes information that associates the recognition result indicating that the snow level recognized by the snow recognizing unit 131 is equal to or greater than a predetermined threshold value with a parking rule for stopping the host vehicle M at a position where the stowed location of the snow removing tool RR mounted in the host vehicle M is close to the occupant candidate P. This is because, when snow equal to or greater than the predetermined threshold value has accumulated on the host vehicle M, it is predicted that the occupant candidate P will remove the snow from the host vehicle M, and in that case it is predicted that the occupant candidate P will take the snow removing tool RR mounted in the host vehicle M out of the host vehicle M. The snow recognizing unit 131 recognizes the snow level when the automatic parking event related to the delivery starts. The recognition unit 130 recognizes the presence or absence of the snow removing tool RR in the host vehicle M based on an image of the vehicle interior captured by a camera (not shown) that photographs the interior of the host vehicle M. In the example shown in fig. 9, the snow removing tool RR is stowed in the luggage compartment of the host vehicle M (i.e., in the rear portion of the host vehicle M).
As shown in fig. 9, the automatic parking control unit 142 causes the host vehicle M to travel to the position of the host vehicle M (t) by an automatic parking event related to the delivery. When, based on the correspondence information 182 and the recognition results of the recognition unit 130 (and the snow recognizing unit 131), the snow level recognized by the snow recognizing unit 131 is equal to or higher than the predetermined threshold value and the recognition unit 130 recognizes that the snow removing tool RR is mounted in the host vehicle M, the automatic parking control unit 142 stops the host vehicle M at a position (the illustrated position of the host vehicle M (t+1)) where the stowed location of the snow removing tool RR (i.e., the rear portion of the host vehicle M) is close to the occupant candidate P. At this time, the recognition unit 130 determines the position of the occupant candidate P based on the image of the surrounding environment of the host vehicle M captured by the camera 10, and the automatic parking control unit 142 moves the host vehicle M to a position where the rear door of the host vehicle M is close to the occupant candidate P determined by the recognition unit 130, and stops it there. In this way, after the host vehicle M is parked in the vicinity of the occupant candidate P, the automatic parking control unit 142 makes it easy for the occupant candidate P to take out the snow removing tool RR and to remove the snow.
Fig. 10 is a diagram showing an example of a parking position obtained based on a parking rule in the case where the host vehicle M has snow on it and the snow removing tool RR is not mounted in the host vehicle M. There are also cases where, even though the snow recognizing unit 131 recognizes that the snow level is equal to or higher than the predetermined threshold value, the snow removing tool RR is not mounted in the host vehicle M but a snow removing tool RR is provided in the boarding area 320. For such cases, the correspondence information 182 includes information that associates the recognition result of the snow recognizing unit 131 indicating that the snow level is equal to or higher than the predetermined threshold value with a parking rule for stopping the host vehicle M at a position close to the installation position PI of the snow removing tool RR. This is because, when snow equal to or greater than the predetermined threshold value has accumulated on the host vehicle M, it is predicted that the occupant candidate P will remove the snow from the host vehicle M, and in that case it is predicted that the occupant candidate P will remove the snow using the snow removing tool RR provided at the installation position PI.
As shown in fig. 10, the automatic parking control unit 142 causes the host vehicle M to travel to the position of the host vehicle M (t) by an automatic parking event related to the delivery. Based on the correspondence information 182 and the recognition results of the recognition unit 130 (and the snow recognizing unit 131), when the degree of snow recognized by the snow recognizing unit 131 is equal to or greater than the predetermined threshold value and the recognition unit 130 recognizes that the snow removing tool RR is not mounted on the host vehicle M, the automatic parking control unit 142 stops the host vehicle M at a position close to the installation position PI (the illustrated position of the host vehicle M (t+1)). At this time, the recognition unit 130 determines the installation position PI based on the image showing the surrounding environment of the host vehicle M captured by the camera 10, and the automatic parking control unit 142 moves the host vehicle M to a position close to the installation position PI determined by the recognition unit 130 and stops it there. In this way, the automatic parking control unit 142 can reduce the time and effort required for the occupant candidate P to carry the snow removing tool RR from the installation position PI to the position of the host vehicle M, making it easy to remove the snow.
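A minimal geometric sketch of how a target stop point for the snow cases of figs. 9 and 10 could be chosen is given below. It assumes simple 2-D coordinates and a fixed rear-door offset; the function name snow_case_stop_target and the offset value are assumptions made for this illustration, not part of the embodiment.

```python
# Hypothetical sketch of the target selection for the snow cases (figs. 9 and 10).
# Positions are 2-D points in an arbitrary ground frame; none of these helpers
# appear in the embodiment.
from typing import Tuple

Point = Tuple[float, float]

def snow_case_stop_target(snow_over_threshold: bool,
                          tool_mounted: bool,
                          occupant_pos: Point,
                          installation_pos: Point,
                          rear_offset: float = 2.0) -> Point:
    """Pick the point the host vehicle M should stop next to."""
    if not snow_over_threshold:
        # No snow rule applies; stop beside the occupant candidate P as usual.
        return occupant_pos
    if tool_mounted:
        # Fig. 9: bring the removal position (rear of the vehicle) close to the
        # occupant candidate P, so shift the vehicle reference point forward by
        # roughly the rear overhang along the assumed travel axis (x).
        return (occupant_pos[0] - rear_offset, occupant_pos[1])
    # Fig. 10: no tool on board, so stop close to the installation position PI.
    return installation_pos

# Example: snow above the threshold and no tool mounted -> aim for PI.
print(snow_case_stop_target(True, False, (10.0, 0.0), (25.0, 1.5)))
```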
The automatic driving control device 100 may include a notification unit that controls an off-vehicle notification device (e.g., a speaker, headlights, etc.). When the degree of snow recognized by the snow recognizing unit 131 is equal to or greater than the predetermined threshold value and the automatic parking control unit 142 causes the host vehicle M to travel to the boarding and disembarking area 320, the notification unit may issue a notification that prompts the occupant candidate P to remove the snow. The notification unit notifies the occupant candidate P that snow removal is required by, for example, playing a sound that urges the occupant candidate P to remove the snow (for example, a sound such as "Please remove the snow because a large amount of snow has accumulated"), flashing the headlights (passing), or lighting the headlights in a predetermined lighting pattern.
The notification unit may notify the occupant candidate P that the host vehicle M is able to travel when the occupant candidate P has removed the snow from the host vehicle M and the snow recognizing unit 131 recognizes that the snow on the host vehicle M has become less than the predetermined threshold value. The notification unit notifies the occupant candidate P that the host vehicle M is able to travel by, for example, playing a sound that urges the occupant candidate P to board the host vehicle M (for example, a sound such as "The vehicle is able to travel; please board"), flashing the headlights, or lighting the headlights in a predetermined lighting pattern.
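As an illustration of the notification behavior just described, the sketch below drives a speaker and the headlights from the recognized degree of snow. The NotificationUnit class, its callbacks, and the on_snow_recognition dispatcher are assumptions made for this example, not the embodiment's interfaces.

```python
# Hypothetical sketch of the notification unit's two messages; the callbacks
# stand in for the off-vehicle speaker and headlight control.
from typing import Callable

class NotificationUnit:
    def __init__(self, speak: Callable[[str], None], headlights: Callable[[str], None]):
        self.speak = speak
        self.headlights = headlights

    def prompt_snow_removal(self) -> None:
        # Urge the occupant candidate P to remove snow (sound, flashing, or a
        # predetermined lighting pattern may be used alone or in combination).
        self.speak("Please remove the snow; a large amount has accumulated.")
        self.headlights("flash")

    def announce_travel_ready(self) -> None:
        # Notify that the host vehicle M has become able to travel.
        self.speak("The vehicle is able to travel. Please board.")
        self.headlights("pattern")

def on_snow_recognition(degree: float, threshold: float,
                        previously_over: bool, unit: NotificationUnit) -> None:
    """Issue the snow-removal prompt, or the ready notice once snow drops below the threshold."""
    if degree >= threshold:
        unit.prompt_snow_removal()
    elif previously_over:
        unit.announce_travel_ready()

# Minimal usage with print stand-ins for the speaker and headlights.
unit = NotificationUnit(print, lambda mode: print(f"headlights: {mode}"))
on_snow_recognition(0.8, 0.5, previously_over=False, unit=unit)
on_snow_recognition(0.2, 0.5, previously_over=True, unit=unit)
```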
[ (3): Parking rule in the case where the occupant candidate P holds baggage with one hand ]
Fig. 11 is a diagram showing an example of a parking position obtained based on a parking rule in a case where the occupant candidate P holds baggage with one hand. Here, the stop area 310 may be shaped so that vehicles park side by side, as shown in fig. 11, instead of being shaped so that vehicles park in tandem as described above. Hereinafter, the stop area 310 includes one or more parking areas 315, and the parking areas 315 are arranged in the boarding and disembarking area 320. The correspondence information 182 includes information that associates the recognition result in which the recognition unit 130 recognizes that the occupant candidate P holds baggage with one hand with a parking rule that stops the host vehicle M at a position where a door of the host vehicle M is close to the hand of the occupant candidate P that is not holding the baggage. This is because, when the baggage is held with one hand, it is predicted that the occupant candidate P will open the door of the host vehicle M with the hand that is not holding the baggage (i.e., the free hand). The door of the host vehicle M brought close to the hand of the occupant candidate P is, for example, the door of the seat on which the occupant candidate P places the baggage.
As shown in fig. 11, the automatic parking control unit 142 causes the host vehicle M to travel to the position of the host vehicle M (t) by an automatic parking event related to the delivery. The recognition unit 130 recognizes that the occupant candidate P holds the baggage bg1 with one hand based on the image captured by the camera 10 at the position of the host vehicle M (t). Based on the correspondence information 182 and the recognition result of the recognition unit 130, when it is recognized that the occupant candidate P holds the baggage bg1 with one hand, the automatic parking control unit 142 stops the host vehicle M at a position where a door of the host vehicle M is close to the hand of the occupant candidate P that is not holding the baggage bg1 (the illustrated position of the host vehicle M (t+1)). At this time, the recognition unit 130 determines the position of the free hand of the occupant candidate P based on the image of the surrounding environment of the host vehicle M captured by the camera 10, and the automatic parking control unit 142 moves the host vehicle M to a position where the door of the host vehicle M is close to the position of the hand determined by the recognition unit 130 and stops it there. In addition to bringing the door of the host vehicle M close to the hand of the occupant candidate P, the automatic parking control unit 142 stops the host vehicle M so that the traveling direction of the host vehicle M (illustrated direction dr3) is parallel or substantially parallel to the width direction of the body of the occupant candidate P (illustrated direction dr4), and stops the host vehicle M closer to the occupant candidate P than in a case where the occupant candidate P is not holding any baggage. In this way, by parking the host vehicle M in the vicinity of the occupant candidate P, the automatic parking control unit 142 makes it easy for the occupant candidate P to open the door and to load the baggage. In addition, the automatic parking control unit 142 can smoothly start traveling without performing a direction change or the like after the occupant candidate P boards the host vehicle M. The control in which the automatic parking control unit 142 stops the host vehicle M at a position where the door of the host vehicle M is close to the hand of the occupant candidate P that is not holding the baggage is an example of "first parking control".
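The door-to-free-hand placement of fig. 11 can be pictured with a small pose calculation like the one below. The lateral gap, the heading convention, and the function free_hand_stop_pose are all assumptions for illustration and do not appear in the embodiment.

```python
# Hypothetical pose sketch for the one-hand-baggage rule of fig. 11: align the
# vehicle heading (dr3) with the body width direction (dr4) and place a door a
# small lateral gap away from the free hand.
import math
from typing import Tuple

Point = Tuple[float, float]

def free_hand_stop_pose(free_hand_pos: Point,
                        body_width_dir_deg: float,
                        lateral_gap: float = 0.5) -> Tuple[Point, float]:
    """Return (target point for the door, vehicle heading in degrees)."""
    heading = body_width_dir_deg  # dr3 parallel to dr4
    # Offset the door target perpendicular to the heading; the gap is assumed to
    # be smaller than the one used when the occupant candidate holds no baggage.
    perp = math.radians(heading + 90.0)
    door_target = (free_hand_pos[0] + lateral_gap * math.cos(perp),
                   free_hand_pos[1] + lateral_gap * math.sin(perp))
    return door_target, heading

print(free_hand_stop_pose((3.0, 1.0), body_width_dir_deg=0.0))
```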
Here, when the occupant candidate P holds baggage with both hands, the occupant candidate P may be unable to open the door of the host vehicle M with one hand. The baggage held in this case is, for example, baggage heavier than a reference weight or larger than a reference size. In this case, since the recognition result in which the recognition unit 130 recognizes that the occupant candidate P holds the baggage bg2 with both hands does not match the recognition result of the recognition unit 130 described in the correspondence information 182, the automatic parking control unit 142 does not apply the above parking rule.
[ Action flow ]
Fig. 12 is a flowchart showing an example of a series of operations of the automatic driving control device 100 according to the present embodiment. This flowchart is executed, for example, when an automatic parking event related to the delivery starts. First, the snow recognizing unit 131 recognizes the degree of snow on the host vehicle M (step S100). The automatic parking control unit 142 determines whether or not the degree of snow recognized by the snow recognizing unit 131 is equal to or greater than the predetermined threshold value (step S102). When determining that the degree of snow is equal to or greater than the predetermined threshold value, the automatic parking control unit 142 determines whether or not the recognition unit 130 recognizes the snow removing tool RR in the host vehicle M (that is, whether or not the snow removing tool RR is mounted on the host vehicle M) (step S104). When determining that the snow removing tool RR is mounted on the host vehicle M, the automatic parking control unit 142 parks the host vehicle M at a position in the stop area 310 where the removal position of the snow removing tool RR is close to the occupant candidate P (step S106).
When the degree of snow is equal to or greater than the predetermined threshold value and the snow removing tool RR is not mounted on the host vehicle M, the automatic parking control unit 142 determines whether or not the recognition unit 130 recognizes the installation position PI in the boarding and disembarking area 320 (step S108). When the recognition unit 130 recognizes the installation position PI in the boarding and disembarking area 320, the automatic parking control unit 142 parks the host vehicle M at a position close to the installation position PI (step S110). The control of the automatic parking control unit 142 in step S110 is an example of "second parking control".
The automatic parking control unit 142 determines whether or not the recognition unit 130 recognizes that the occupant candidate P holds the cart CT (step S112). When the recognition unit 130 recognizes that the occupant candidate P holds the cart CT, the automatic parking control unit 142 determines whether or not the recognized occupant candidate P is present at the location of the cart placement site CY (step S113). When the recognized occupant candidate P is not present at the cart placement site CY, the automatic parking control unit 142 determines whether or not the position of the occupant candidate P holding the cart CT recognized by the recognition unit 130 is on the near side of the cart placement site CY, and whether or not the traveling direction of the host vehicle M matches the direction from the host vehicle M toward the cart placement site CY (step S114).
When the occupant candidate P is present at the cart placement site CY, when the occupant candidate P is located beyond the cart placement site CY, or when the traveling direction of the host vehicle M does not match the direction from the host vehicle M toward the cart placement site CY, the automatic parking control unit 142 advances the process to step S118. When the position of the occupant candidate P is on the near side of the cart placement site CY and the traveling direction of the host vehicle M matches the direction from the host vehicle M toward the cart placement site CY, the automatic parking control unit 142 causes the host vehicle M to travel as follows: the host vehicle M temporarily stops beside the occupant candidate P, then starts and follows the occupant candidate P, with the position of the cart placement site CY as the upper limit of the following travel (step S116). The automatic parking control unit 142 then parks the host vehicle M in the vicinity of the cart placement site CY (step S118). The control of the automatic parking control unit 142 in step S116 and step S118 is an example of "second parking control".
The automatic parking control unit 142 determines whether or not the recognition unit 130 recognizes that the occupant candidate P holds baggage with one hand (step S120). When the recognition unit 130 recognizes that the occupant candidate P holds baggage with one hand, the automatic parking control unit 142 obtains the position, recognized by the recognition unit 130, of the hand of the occupant candidate P that is not holding the baggage (step S122). The automatic parking control unit 142 then parks the host vehicle M at a position where the door of the host vehicle M is close to the hand of the occupant candidate P that is not holding the baggage (step S124).
When none of the above conditions is satisfied, the automatic parking control unit 142 stops the host vehicle M in the vicinity of the occupant candidate P without using a parking rule (step S126). The control of the automatic parking control unit 142 in step S126 is an example of "first parking control".
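The branching of fig. 12 (steps S100 to S126) can be summarized as in the sketch below. The rec and ctl objects are placeholders for the recognition units and the automatic parking control unit; only the ordering of the checks follows the description above, and all method names are invented for this illustration.

```python
# Condensed, hypothetical rendering of the flow of fig. 12 (steps S100-S126).
# rec and ctl are duck-typed placeholder objects; the step comments map the
# branches back to the flowchart.
def automatic_parking_flow(rec, ctl, snow_threshold: float):
    snow = rec.snow_degree()                                   # S100
    if snow >= snow_threshold:                                 # S102
        if rec.snow_tool_mounted():                            # S104
            return ctl.park_near_tool_removal_position()       # S106
        if rec.installation_position_recognized():             # S108
            return ctl.park_near_installation_position()       # S110
    if rec.occupant_holds_cart():                              # S112
        if not rec.occupant_at_cart_site():                    # S113
            if rec.occupant_before_cart_site() and rec.heading_toward_cart_site():  # S114
                ctl.stop_beside_occupant_then_follow_to_cart_site()                 # S116
        return ctl.park_near_cart_site()                       # S118
    if rec.occupant_holds_baggage_one_hand():                  # S120
        hand = rec.free_hand_position()                        # S122
        return ctl.park_door_next_to(hand)                     # S124
    return ctl.park_near_occupant()                            # S126
```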
In the above description, the flowchart shown in fig. 12 executes, in this order, (2): the processing related to the parking rule in the case where snow has accumulated on the host vehicle M, (1): the processing related to the parking rule in the case where the object of the occupant candidate P is the cart CT, and (3): the processing related to the parking rule in the case where the occupant candidate P holds baggage with one hand; however, the order is not limited to this. Any of these processes may be executed preferentially, and any of them may be executed in parallel.
[ Hardware structure ]
Fig. 13 is a diagram showing an example of a hardware configuration of the automatic driving control device 100 according to the embodiment. As shown in the figure, the automatic driving control device 100 is configured by interconnecting, via an internal bus or a dedicated communication line, a communication controller 100-1, a CPU 100-2, a RAM (Random Access Memory) 100-3 used as a working memory, a ROM (Read Only Memory) 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD (Hard Disk Drive), a drive device 100-6, and the like. The communication controller 100-1 communicates with components other than the automatic driving control device 100. The storage device 100-5 stores a program 100-5a executed by the CPU 100-2. This program is loaded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2. In this way, a part or all of the recognition unit 130, the action plan generation unit 140, and the automatic parking control unit 142 are realized.
The embodiments described above can be expressed as follows.
An automatic driving control device, wherein,
the automatic driving control device is configured to include:
a storage device storing a program; and
a hardware processor,
the hardware processor executes the following processing by executing a program stored in the storage device:
identifying a surrounding environment of the vehicle;
performing speed control and steering control of the vehicle based on the recognition result;
when the vehicle is moved to a riding position where an occupant boards the vehicle and is stopped there, a parking position of the vehicle is determined based on the recognized state of equipment provided at the riding position and on a parking rule obtained by referring to correspondence information that describes a correspondence relationship between recognition results and parking rules.
While embodiments for carrying out the present invention have been described above, the present invention is not limited to these embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (13)

1. A vehicle control system, wherein,
the vehicle control system includes:
an identification unit that identifies the surrounding environment of the vehicle; and
a driving control unit that performs speed control and steering control of the vehicle based on a recognition result of the recognition unit,
when the vehicle is moved to and stopped at a riding position where an occupant candidate boards the vehicle, the driving control unit controls the vehicle based on the position of the occupant candidate recognized by the recognition unit and a state of equipment provided at the riding position, performs first parking control for parking the vehicle at the position of the occupant candidate when the occupant candidate is present at the riding position and the equipment is not provided, and performs second parking control for parking the vehicle based on the position of the equipment when the occupant candidate is present at the riding position and the equipment is provided,
the equipment is equipment that is predicted, based on the recognition result, to be used in an action of the occupant candidate.
2. The vehicle control system according to claim 1, wherein,
when the recognition unit recognizes that the occupant candidate holds a cart, the driving control unit, in the second parking control, stops the vehicle at a position closer to a cart placement location, which is the installation position of the equipment recognized by the recognition unit, than to the occupant candidate.
3. The vehicle control system according to claim 2, wherein,
in the second parking control, the driving control unit causes the vehicle to follow the movement of the occupant candidate to the installation position of the equipment when the installation position of the equipment exists in the traveling direction of the vehicle.
4. The vehicle control system according to claim 3, wherein,
when some occupants are riding in the vehicle and other occupants are present at the riding position as the occupant candidates, the driving control unit, in the second parking control, causes the vehicle to follow the movement of the occupant candidates when the state of the occupants riding in the vehicle is a state that is safe even if traveling is started.
5. The vehicle control system according to claim 3 or 4, wherein,
the driving control unit locks a door of the vehicle and causes the vehicle to follow the movement of the occupant candidate when the occupant is not seated in the vehicle.
6. The vehicle control system according to claim 3 or 4, wherein,
the driving control unit causes the vehicle to follow the movement of the occupant candidate in response to an instruction to lock the door of the vehicle when the occupant is not seated in the vehicle.
7. The vehicle control system according to any one of claims 2 to 4, wherein,
the driving control unit acquires equipment position information indicating the location of the equipment at the riding position from a facility management device at the riding position, and determines a parking position of the vehicle based on the acquired equipment position information.
8. The vehicle control system according to any one of claims 2 to 4, wherein,
the driving control unit determines a parking position of the vehicle based on equipment position information that indicates the installation position of the equipment and that is recognized by the recognition unit when the vehicle arrives at the riding position.
9. The vehicle control system according to any one of claims 1 to 4, wherein,
the vehicle control system further includes a snow recognizing unit that recognizes a degree of snow deposited on the vehicle,
in the second parking control, the driving control unit stops the vehicle at a position close to the installation position when the snow recognition unit recognizes that the degree is equal to or greater than a threshold at which the vehicle cannot continue traveling and the recognition unit recognizes the installation position of a snow removing tool as the equipment.
10. The vehicle control system according to claim 9, wherein,
the vehicle control system further includes a notification unit that notifies the occupant candidate that snow removal is required when the snow recognition unit recognizes that the degree is equal to or greater than a threshold at which the vehicle cannot continue traveling.
11. The vehicle control system according to claim 9, wherein,
the vehicle control system further includes a notification unit that notifies the vehicle exterior of the possibility of traveling when the snow recognition unit recognizes that the degree is smaller than a threshold value at which traveling can be continued.
12. A vehicle control method, wherein,
the vehicle control method causes a computer to execute:
identifying a surrounding environment of the vehicle;
performing speed control and steering control of the vehicle based on the recognition result;
when the vehicle is moved to and stopped at a riding position where an occupant boards the vehicle, controlling the vehicle based on the recognized position of an occupant candidate and a state of equipment provided at the riding position;
when the occupant candidate is present at the riding position and the equipment is not provided, performing first parking control for parking the vehicle at the position of the occupant candidate; and
when the occupant candidate is present at the riding position and the equipment is provided, performing second parking control for parking the vehicle based on the position of the equipment,
the equipment being equipment that is predicted, based on the recognition result, to be used in an action of the occupant candidate.
13. A storage medium, wherein,
the storage medium stores a program that causes a computer to execute:
identifying a surrounding environment of the vehicle;
performing speed control and steering control of the vehicle based on the recognition result;
when the vehicle is moved to and stopped at a riding position where an occupant boards the vehicle, controlling the vehicle based on the recognized position of an occupant candidate and a state of equipment provided at the riding position;
when the occupant candidate is present at the riding position and the equipment is not provided, performing first parking control for parking the vehicle at the position of the occupant candidate; and
when the occupant candidate is present at the riding position and the equipment is provided, performing second parking control for parking the vehicle based on the position of the equipment,
the equipment being equipment that is predicted, based on the recognition result, to be used in an action of the occupant candidate.
CN202010114244.8A 2019-02-27 2020-02-24 Vehicle control system, vehicle control method, and storage medium Active CN111619571B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019034302A JP2020140348A (en) 2019-02-27 2019-02-27 Vehicle control system, vehicle control method, and program
JP2019-034302 2019-02-27

Publications (2)

Publication Number Publication Date
CN111619571A (en) 2020-09-04
CN111619571B (en) 2023-11-14

Family

ID=72267969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010114244.8A Active CN111619571B (en) 2019-02-27 2020-02-24 Vehicle control system, vehicle control method, and storage medium

Country Status (2)

Country Link
JP (1) JP2020140348A (en)
CN (1) CN111619571B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008009913A (en) * 2006-06-30 2008-01-17 Toyota Motor Corp Automatic vehicle driving system
CN102598080A (en) * 2010-04-12 2012-07-18 丰田自动车株式会社 Vehicle remote operation system and on-board device
JP2017185954A (en) * 2016-04-07 2017-10-12 トヨタ自動車株式会社 Automatic drive vehicle
CN108349504A (en) * 2015-11-04 2018-07-31 日产自动车株式会社 Automatic driving vehicle operating device and automatic driving vehicle operating method
JP2019028663A (en) * 2017-07-28 2019-02-21 株式会社デンソー Vehicle allocation system and on-vehicle unit

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180196417A1 (en) * 2017-01-09 2018-07-12 nuTonomy Inc. Location Signaling with Respect to an Autonomous Vehicle and a Rider
US10780879B2 (en) * 2017-02-14 2020-09-22 Denso Ten Limited Parking controller, parking control system, and parking control method

Also Published As

Publication number Publication date
JP2020140348A (en) 2020-09-03
CN111619571A (en) 2020-09-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant