CN111746513A - Vehicle control device, vehicle control method, and storage medium - Google Patents
Vehicle control device, vehicle control method, and storage medium
- Publication number
- CN111746513A (Application CN202010215429.8A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- control unit
- occupant
- parking
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0025—Planning or execution of driving tasks specially adapted for specific operations
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/20—Conjoint control of vehicle sub-units of different type or different function including control of steering systems
- B60W2530/00—Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
- B60W2540/00—Input parameters relating to occupants
- B60W2720/00—Output or target parameters relating to overall vehicle dynamics
- B60W2720/10—Longitudinal speed
- B60W2756/00—Output or target parameters relating to data
- B60W2756/10—Involving external transmission of data to or from the vehicle
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Abstract
Provided are a vehicle control device, a vehicle control method, and a storage medium, which are capable of executing more appropriate driving control. A vehicle control device according to an embodiment includes: an identification unit that identifies the surrounding environment of a vehicle and identifies the locked state of the vehicle; and a driving control unit that performs driving control of one or both of a speed and a steering of the vehicle based on a recognition result of the recognition unit, wherein the driving control unit starts the vehicle when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
Description
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
In recent years, research on automatic vehicle control has been progressing. In connection with this, a technique is known in which, when a vehicle to be parked is stopped in an alighting space and the driver has gotten off the vehicle, the vehicle is parked in a parking space upon transmission to the vehicle of a remote lock command that remotely instructs a locking operation (see, for example, Japanese Patent Application Laid-Open No. 2018-197444).
Disclosure of Invention
However, in the conventional technology, it is sometimes impossible to determine whether a remote lock command is intended merely to lock the doors or to instruct the vehicle to park. Therefore, appropriate driving control may not be performed.
The present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a storage medium that can execute more appropriate driving control.
The vehicle control device, the vehicle control method, and the storage medium according to the present invention have the following configurations.
(1): a vehicle control device according to an aspect of the present invention includes: an identification unit that identifies the surrounding environment of a vehicle and identifies the locked state of the vehicle; and a driving control unit that performs driving control of one or both of a speed and a steering of the vehicle based on a recognition result of the recognition unit, wherein the driving control unit starts the vehicle when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
(2): in the aspect of (1) above, the vehicle control device further includes a receiving unit that receives an operation by the occupant, and the driving control unit causes the vehicle to park in a predetermined parking area when the receiving unit has received a setting of a mode for parking the vehicle in the predetermined parking area before the occupant gets off the vehicle and the door of the vehicle is locked after the occupant gets off the vehicle in the predetermined area.
(3): in the aspect of the above (2), the driving control unit does not perform the driving control for parking the vehicle in the parking area when the occupant of the vehicle gets off the vehicle in the predetermined area and the door of the vehicle is locked in a state where the setting of the mode has not been received by the receiving unit before the occupant gets off the vehicle.
(4): in the aspect of any one of the above (1) to (3), the driving control unit starts the vehicle when the recognition unit recognizes that the vehicle has entered an area where traveling by the driving control is possible, the occupant of the vehicle gets off the vehicle in the predetermined area, and the door of the vehicle is locked.
(5): in the aspect of any one of the above (2) to (4), the vehicle control device further includes a communication unit that communicates with the terminal device of the occupant, the predetermined area includes a first area and a second area, and the driving control unit makes the condition for starting the vehicle when the occupant alights in the first area different from the condition for starting the vehicle when the occupant alights in the second area.
(6): in the aspect (5) above, the driving control unit performs: starting the vehicle when the occupant gets off the vehicle in the first area and the door of the vehicle is locked; and starting the vehicle when the passenger gets off the vehicle in the second area, the door of the vehicle is locked, and a start instruction from the terminal device is received by the communication unit.
(7): in the aspect of the above (6), the first area is an area where an occupant of a vehicle for which a reservation to park in a predetermined parking area has been completed gets on or off the vehicle, the second area is an area where an occupant of a vehicle for which the reservation has been completed or has not been completed gets on or off the vehicle, and the driving control unit starts the vehicle when, in a state where a vehicle for which the reservation has not been completed is stopped at a position in the second area where the occupant gets off, the reservation is completed and the door of the vehicle is locked.
(8): in addition to any one of the above (1) to (7), the vehicle control device further includes: a communication unit that communicates with the terminal device of the occupant; and a notification control unit that notifies the terminal device that the predetermined condition is not satisfied, wherein the notification control unit notifies the terminal device via the communication unit when the occupant who has gotten off the vehicle is separated from the vehicle by a predetermined distance or more.
(9): in the aspect (8) described above, the notification control unit receives a simple operation from the terminal device and causes the driving control unit to start the vehicle, depending on the content of the unsatisfied condition among the predetermined conditions.
(10): in the aspect of any one of the above (1) to (9), the driving control unit starts the vehicle when the occupant is not present in the interior of the vehicle and the recognition unit recognizes that the door of the vehicle is locked.
(11): in the aspect of any one of the above (1) to (10), the driving control unit may start the vehicle when the terminal device of the occupant who gets off the vehicle is not present in the interior of the vehicle and the recognition unit recognizes that the door of the vehicle is locked.
(12): in the aspect of (11) above, when the terminal device of the passenger who gets off the vehicle is present in the interior of the vehicle, the driving control unit does not start the vehicle even when the recognition unit recognizes that the door of the vehicle is locked.
(13): a vehicle control method according to an aspect of the present invention causes a computer to perform: identifying a surrounding environment of a vehicle and identifying a locked state of the vehicle; performing driving control of one or both of a speed and a steering of the vehicle based on the recognition result; and starting the vehicle when a door of the vehicle is locked after a passenger of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
(14): a storage medium according to an aspect of the present invention stores a program that causes a computer to perform: identifying a surrounding environment of a vehicle and identifying a locked state of the vehicle; performing driving control of one or both of a speed and a steering of the vehicle based on the recognition result; and starting the vehicle when a door of the vehicle is locked after a passenger of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
According to the aspects (1) to (14) described above, more appropriate driving control can be executed.
Drawings
Fig. 1 is a configuration diagram of a vehicle system including a vehicle control device of an embodiment.
Fig. 2 is a functional configuration diagram of the first control unit and the second control unit.
Fig. 3 is a diagram showing an example of a functional configuration of the terminal device.
Fig. 4 is a diagram schematically showing a scenario in which an automatic parking event is executed in the embodiment.
Fig. 5 is a diagram showing an example of the configuration of the parking lot management device.
Fig. 6 is a diagram showing an example of an automatic entry setting image displayed on the display device of the HMI.
Fig. 7 is a diagram for explaining a reserved boarding/alighting area and a free boarding/alighting area included in the boarding/alighting area.
Fig. 8 is a diagram showing an example of a warehousing confirmation image notified to the terminal device after the locking operation.
Fig. 9 is a diagram showing an example of an image related to a notification indicating that the automatic parking event cannot be executed.
Fig. 10 is a flowchart showing an example of the flow of processing executed by the automatic driving control apparatus according to the embodiment.
Fig. 11 is a flowchart showing another example of the flow of processing executed by the automatic driving control apparatus according to the embodiment.
Fig. 12 is a diagram showing an example of the hardware configuration of the automatic driving control device according to the embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a storage medium according to the present invention will be described below with reference to the accompanying drawings. In the following, an embodiment in which the vehicle control device is applied to an autonomous vehicle will be described as an example. The automated driving is, for example, driving control performed by automatically controlling one or both of the steering and the acceleration/deceleration of the vehicle. In addition, driving of the autonomous vehicle may also be controlled by manual operation of the occupant. In the following, the case where a left-hand traffic rule is applied will be described; when a right-hand traffic rule is applied, the left and right may be read as reversed.
[ integral Structure ]
Fig. 1 is a configuration diagram of a vehicle system 1 including a vehicle control device of the embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its drive source is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof. The electric motor operates using power generated by a generator connected to the internal combustion engine or discharge power of a battery such as a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a probe 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a door lock device 42, a door switch 44, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operation element 80, a vehicle interior camera 90, an automatic driving control device 100, a running driving force output device 200, a brake device 210, and a steering device 220. These devices and apparatuses are connected to each other by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in fig. 1 is merely an example; a part of the configuration may be omitted or another configuration may be added. The combination of the communication device 20, the HMI30, the door lock device 42, the door switch 44, and the automatic driving control device 100 is an example of the "vehicle control device". The combination of the vehicle control device and the terminal device 300 is an example of a "vehicle control system". The communication device 20 is an example of a "communication unit". The HMI30 is an example of the "receiving unit". The HMI control unit 180 is an example of a "notification control unit".
The camera 10 is a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). The camera 10 is mounted on an arbitrary portion of a vehicle (hereinafter referred to as a vehicle M) on which the vehicle system 1 is mounted. When photographing forward, the camera 10 is attached to the upper part of the front windshield, the rear surface of the vehicle interior mirror, or the like. The camera 10 repeatedly shoots the periphery of the vehicle M periodically, for example. The camera 10 may also be a stereo camera.
The radar device 12 radiates a radio wave such as a millimeter wave to the periphery of the vehicle M and detects a radio wave reflected by an object (reflected wave), thereby detecting at least the position (distance and direction) of the object. The radar device 12 is attached to an arbitrary portion of the vehicle M. The radar device 12 may detect the position and velocity of the object by an FM-CW (Frequency Modulated Continuous Wave) method.
The probe 14 is a LIDAR (Light Detection and Ranging) sensor. The probe 14 irradiates light to the periphery of the vehicle M and measures the scattered light. The probe 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. The probe 14 is attached to an arbitrary portion of the vehicle M.
The object recognition device 16 performs a sensor fusion process on the detection results detected by some or all of the camera 10, the radar device 12, and the probe 14, and recognizes the position, the type, the speed, and the like of the object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may directly output the detection results of the camera 10, the radar device 12, and the detector 14 to the automatic driving control device 100. The object recognition device 16 may also be omitted from the vehicle system 1.
The communication device 20 communicates with, for example, a terminal device 300 of a user U using the vehicle M, another vehicle present in the vicinity of the vehicle M, a parking lot management device (described later), or various server devices, using a network such as a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communications), a LAN (Local Area Network), a WAN (Wide Area Network), or the Internet. The user U may be, for example, the owner of the vehicle M, or may be a user who merely uses the vehicle M through a private-car rental service, a car sharing service, or the like. The terminal device 300 is a mobile terminal such as a smartphone or a tablet terminal that the user U can carry. Hereinafter, the occupant of the vehicle M is assumed to include the user U.
The HMI30 presents various information to the occupant of the vehicle M and accepts input operations by the occupant. The HMI30 includes a display device, a vehicle interior speaker, a buzzer, a touch panel, switches, keys, and the like. The display device includes, for example, an instrument display provided in a portion of the instrument panel facing the driver, a center display provided in the center of the instrument panel, a HUD (Head-Up Display), and the like. The HUD is, for example, a device that allows an image to be visually recognized superimposed on the scenery; it allows an occupant to visually recognize a virtual image by projecting light including the image onto the front windshield or a combiner of the vehicle M.
The vehicle sensors 40 include a vehicle speed sensor that detects the speed of the vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity about a vertical axis, an orientation sensor that detects the orientation of the vehicle M, and the like. The vehicle sensor 40 may also include, for example, a door opening/closing sensor that detects whether a door is open or closed, and a load sensor that detects whether a load is applied to each seat in the vehicle compartment. The results detected by the vehicle sensor 40 are output to the automatic driving control device 100.
The door lock device 42 locks or unlocks the doors provided in the vehicle M. The doors provided in the vehicle M include a driver's seat door provided on the driver's seat side of the vehicle M, a passenger seat door provided on the passenger seat side, and a rear door provided on the rear seat side. For example, the door lock device 42 locks or unlocks all the doors, or the instructed doors, based on a switch operation of the door switch 44 or a lock or unlock instruction from the terminal device 300. The door lock device 42 may be provided with a door lock cylinder for each door. In this case, the door lock device 42 locks or unlocks a door when a key of the vehicle M is inserted into the door lock cylinder and rotated in a predetermined direction. The door lock device 42 may operate a buzzer or an emergency blinking indicator lamp provided in the vehicle M when locking or unlocking.
The door switch 44 is a switch attached to each door of the vehicle M and used to lock or unlock the door by a switch operation performed by an occupant from outside or inside the vehicle. The content of the switch operation is output to the door lock device 42. Pressing a switch locks or unlocks the door to which the pressed switch is attached, or all of the doors.
The door switch 44 may be configured to accept an operation of locking or unlocking the doors in conjunction with a keyless entry system (smart entry system) or the like that locks or unlocks the doors without using a mechanical key. In this case, the door switch 44 wirelessly communicates with a key unit (smart key) carried by the occupant to acquire identification information of the key unit, and determines whether or not it is a key unit that is permitted to lock or unlock the doors. When the door switch 44 is within a predetermined distance of a key unit that is permitted to lock or unlock the doors and an operation to lock or unlock any door is made, the door switch 44 outputs the locking or unlocking operation to the door lock device 42.
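The following is a minimal Python sketch of such an interlocked lock/unlock check; it is illustrative only and not part of the patented configuration, and the names (DoorSwitch, KeyUnitResponse, KEY_RANGE_M) and the distance threshold are assumptions.

```python
# Minimal sketch of the interlocked lock/unlock check; names and threshold are assumptions.
from dataclasses import dataclass

KEY_RANGE_M = 1.5  # assumed "predetermined distance" for the smart-key proximity check


@dataclass
class KeyUnitResponse:
    key_id: str        # identification information returned by the key unit (smart key)
    distance_m: float  # estimated distance between the door switch and the key unit


class DoorSwitch:
    """Forwards a lock/unlock operation only when an authorized key unit is nearby."""

    def __init__(self, authorized_key_ids, door_lock_device):
        self.authorized_key_ids = set(authorized_key_ids)
        self.door_lock_device = door_lock_device  # stands in for the door lock device 42

    def on_operation(self, door_id: str, response: KeyUnitResponse, lock: bool) -> bool:
        if response.key_id not in self.authorized_key_ids:
            return False  # not a key unit permitted to lock or unlock the doors
        if response.distance_m > KEY_RANGE_M:
            return False  # key unit is outside the predetermined distance
        self.door_lock_device.set_locked(door_id, lock)  # hand the operation to the lock device
        return True
```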
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI52, and a route determination unit 53. The navigation device 50 holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the vehicle M based on signals received from GNSS satellites. The position of the vehicle M may also be determined or supplemented by an INS (Inertial Navigation System) that utilizes the output of the vehicle sensors 40. The navigation HMI52 includes a display device, a speaker, a touch panel, keys, and the like. The navigation HMI52 may also be partially or wholly shared with the aforementioned HMI30. The route determination unit 53 determines, for example, a route (hereinafter referred to as an on-map route) from the position of the vehicle M (or an arbitrary input position) specified by the GNSS receiver 51 to a destination input by the occupant using the navigation HMI52, with reference to the first map information 54. The first map information 54 is, for example, information that represents road shapes by links representing roads and nodes connected by the links. The first map information 54 may also include the curvature of roads, POI (Point Of Interest) information, and the like. The on-map route is output to the MPU 60. The navigation device 50 may also perform route guidance using the navigation HMI52 based on the on-map route. The navigation device 50 may be realized, for example, by a function of the terminal device 300. The navigation device 50 may transmit the current position and the destination to a navigation server via the communication device 20 and acquire a route equivalent to the on-map route from the navigation server. The navigation device 50 outputs the determined on-map route to the MPU 60.
The MPU60 includes, for example, a recommended lane determining unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determining unit 61 divides the on-map route provided from the navigation device 50 into a plurality of blocks (for example, every 100 [m] in the vehicle traveling direction), and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determining unit 61 determines, for example, in which lane from the left the vehicle should travel. When there is a branch point on the on-map route, the recommended lane determining unit 61 determines the recommended lane so that the vehicle M can travel on a reasonable route for proceeding to the branch destination.
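The block-wise lane determination can be pictured with the minimal Python sketch below; the 100 m block length follows the text, while the function names, the lane indexing (0 = leftmost), and the branch_points mapping are illustrative assumptions.

```python
# Minimal sketch of block-wise recommended lane determination; names are assumptions.
BLOCK_LENGTH_M = 100.0     # division interval along the travel direction (as in the text)


def divide_route_into_blocks(route_length_m: float) -> list:
    """Split an on-map route into consecutive blocks of roughly 100 m."""
    blocks, start = [], 0.0
    while start < route_length_m:
        end = min(start + BLOCK_LENGTH_M, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks


def recommend_lanes(route_length_m: float, branch_points: dict,
                    default_lane_from_left: int = 0) -> list:
    """Pick a recommended lane index (0 = leftmost) for every block.

    branch_points maps a block index to the lane that leads to the desired
    branch destination; the other blocks keep the default lane.
    """
    blocks = divide_route_into_blocks(route_length_m)
    return [branch_points.get(i, default_lane_from_left) for i in range(len(blocks))]


# Example: a 450 m route where block 3 must be taken in the second lane from the left.
print(recommend_lanes(450.0, branch_points={3: 1}))   # -> [0, 0, 0, 1, 0]
```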
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the centers of lanes, information on the boundaries of lanes, and the like. The second map information 62 may also include road information, traffic restriction information, address information (address and zip code), facility information, parking lot information, telephone number information, and the like. The parking lot information includes, for example, the position and shape of a parking lot, the number of available parking spaces, the availability of automated driving, the boarding/alighting area, and the parking area. The second map information 62 can be updated at any time by the communication device 20 communicating with another device.
The driving operation member 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a joystick, and other operation members. A sensor for detecting the operation amount or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to some or all of the automatic driving control device 100, the running driving force output device 200, the brake device 210, and the steering device 220.
The vehicle interior camera 90 is a digital camera using a solid-state imaging device such as a CCD or a CMOS, for example. The vehicle interior camera 90 may also be a stereo camera. The vehicle interior camera 90 is mounted on an arbitrary portion in the interior of the vehicle M. The vehicle interior camera 90 photographs an area including each seat present in the vehicle interior. This makes it possible to determine whether or not the occupant or the terminal device 300 is present in the vehicle interior from the image captured by the vehicle interior camera 90. The vehicle interior camera 90 may periodically repeat imaging of the above-described area, or may image the above-described area at a predetermined timing. The predetermined timing is timing to start the automated driving, timing to lock the door of the vehicle M, or the like. The image captured by the vehicle interior camera 90 is output to the automatic driving control device 100.
The automatic driving control device 100 includes, for example, a first control unit 120, a second control unit 160, an HMI control unit 180, and a storage unit 190. The first control unit 120, the second control unit 160, and the HMI control unit 180 are each realized by, for example, a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuit units) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation between software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium, for example, the storage unit 190) such as an HDD or a flash memory of the automatic driving control device 100, or may be stored in a removable storage medium such as a DVD, a CD-ROM, or a memory card and installed in the storage device of the automatic driving control device 100 by mounting the storage medium (non-transitory storage medium) in a drive device, a card slot, or the like.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The combination of the action plan generating unit 140 and the second control unit 160 is an example of a "driving control unit". The first control unit 120 realizes, for example, an AI (Artificial Intelligence) function and a model-based function in parallel. For example, the function of "recognizing an intersection" may be realized by executing recognition of an intersection by deep learning or the like and recognition based on predetermined conditions (the presence of a signal, a road sign, or the like that allows pattern matching) in parallel, scoring both, and comprehensively evaluating them. This ensures the reliability of the automated driving. The first control unit 120 executes control related to automated driving of the vehicle M based on, for example, an instruction from the MPU60, the HMI control unit 180, or the like, or an instruction from the terminal device 300.
The recognition unit 130 recognizes the surrounding environment of the vehicle M based on information input from the camera 10, the radar device 12, and the probe 14 via the object recognition device 16. For example, the recognition unit 130 recognizes the state of objects in the vicinity of the vehicle M, such as position, velocity, and acceleration, based on the input information. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point (center of gravity, center of the drive shaft, or the like) of the vehicle M, and is used for control. The position of an object may be represented by a representative point such as the center of gravity or a corner of the object, or may be represented by a region having a spatial extent. In the case where the object is a moving body such as another vehicle, the "state" of the object may include acceleration, jerk, or a "behavior state" (for example, whether a lane change is being made or is about to be made).
The recognition unit 130 recognizes, for example, the lane in which the vehicle M is traveling (traveling lane). For example, the recognition unit 130 recognizes the traveling lane by comparing the pattern of road division lines (for example, the arrangement of solid lines and broken lines) obtained from the second map information 62 with the pattern of road division lines around the vehicle M recognized from the image captured by the camera 10. The recognition unit 130 may recognize the traveling lane by recognizing not only road division lines but also traveling lane boundaries (road boundaries) including road division lines, road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the vehicle M acquired from the navigation device 50 and the processing result of the INS may be taken into account. The recognition unit 130 also recognizes temporary stop lines, obstacles, red lights, toll stations, entrance gates of parking lots, stop areas, entrance/exit areas, and other road events.
The recognition unit 130 recognizes the position and posture of the vehicle M with respect to the travel lane when recognizing the travel lane. The recognition unit 130 may recognize, for example, a deviation of a reference point of the vehicle M from the center of the lane and an angle formed by the traveling direction of the vehicle M with respect to a line connecting the centers of the lanes as the relative position and posture of the vehicle M with respect to the traveling lane. Instead, the recognition unit 130 may recognize the position of the reference point of the vehicle M with respect to an arbitrary side end portion (road dividing line or road boundary) of the traveling lane as the relative position of the vehicle M with respect to the traveling lane.
The recognition unit 130 includes, for example, a parking space recognition unit 131, a door lock recognition unit 132, a parking setting recognition unit 133, an occupant recognition unit 134, and a terminal recognition unit 135. The functions of the parking space recognition unit 131, the door lock recognition unit 132, the parking setting recognition unit 133, the occupant recognition unit 134, and the terminal recognition unit 135 will be described in detail later.
The action plan generating unit 140 generates an action plan for causing the vehicle M to travel by automated driving. For example, based on the recognition result of the recognition unit 130 and the like, the action plan generating unit 140 generates a target trajectory along which the vehicle M will automatically travel in the future (without depending on the driver's operation) so that, in principle, the vehicle travels in the recommended lane determined by the recommended lane determining unit 61 and can cope with the surrounding situation of the vehicle M. The target trajectory contains, for example, a speed element. For example, the target trajectory is expressed as a sequence of points (track points) that the vehicle M should reach. A track point is a point that the vehicle M should reach every predetermined travel distance (for example, about every several [m]) along the route; separately from this, a target speed and a target acceleration for every predetermined sampling time (for example, about every few tenths of a second) are generated as part of the target trajectory. A track point may also be a position that the vehicle M should reach at each predetermined sampling time. In this case, the information on the target speed and the target acceleration is expressed by the intervals between the track points.
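A minimal sketch of such a target trajectory might look like the following; the names (TrackPoint, TargetTrajectory) and the sampling interval are assumptions, and the fields simply pair each track point with the speed element described above.

```python
# Minimal sketch of a target trajectory as an ordered list of track points; names are assumptions.
from dataclasses import dataclass
from typing import List

SAMPLING_TIME_S = 0.1  # assumed sampling interval (a few tenths of a second)


@dataclass
class TrackPoint:
    x_m: float                # position the vehicle M should reach
    y_m: float
    target_speed_mps: float   # speed element attached to the target trajectory
    target_accel_mps2: float  # target acceleration for the corresponding sampling time


@dataclass
class TargetTrajectory:
    points: List[TrackPoint]  # track points in the order the vehicle M should reach them

    def point_at(self, t_s: float) -> TrackPoint:
        """Return the track point associated with elapsed time t_s."""
        index = min(int(t_s / SAMPLING_TIME_S), len(self.points) - 1)
        return self.points[index]


# Example: a short straight trajectory sampled every SAMPLING_TIME_S seconds.
trajectory = TargetTrajectory([TrackPoint(i * 0.5, 0.0, 5.0, 0.0) for i in range(10)])
print(trajectory.point_at(0.35))  # -> the track point at x = 1.5 m
```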
The action plan generating unit 140 may set events of automated driving when generating the target trajectory. Examples of automated driving events include a constant speed traveling event, a low speed follow-up traveling event, a lane change event, a branch event, a merge event, a take-over event, and an automatic parking event in which the vehicle is parked by automated driving in valet parking or the like. The action plan generating unit 140 generates a target trajectory corresponding to the started event. The action plan generating unit 140 includes, for example, an automatic parking control unit 142 that is activated when the automatic parking event is executed. The function of the automatic parking control unit 142 will be described in detail later.
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the vehicle M passes through the target trajectory generated by the action plan generation unit 140 at a predetermined timing.
The second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information of the target track (track point) generated by the action plan generation unit 140, and stores the information in a memory (not shown). The speed control unit 164 controls the running drive force output device 200 or the brake device 210 based on the speed element associated with the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the curve condition of the target track stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. For example, the steering control unit 166 performs a combination of feedforward control according to the curvature of the road ahead of the vehicle M and feedback control based on deviation from the target trajectory.
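As a rough illustration of combining feedforward control based on the road curvature ahead with feedback control based on deviation from the target trajectory, the following Python sketch may help; the gains, the wheelbase value, and the curvature-to-steering mapping are assumptions, not values from the disclosure.

```python
# Minimal sketch of steering control combining feedforward and feedback; gains are assumptions.
import math

WHEELBASE_M = 2.7  # assumed wheelbase used for the kinematic feedforward term
K_LATERAL = 0.8    # assumed feedback gain on lateral deviation from the target trajectory
K_HEADING = 1.2    # assumed feedback gain on heading error


def steering_command(road_curvature_1pm: float,
                     lateral_error_m: float,
                     heading_error_rad: float) -> float:
    """Steering angle [rad]: feedforward from the curve ahead plus feedback on deviation."""
    feedforward = math.atan(WHEELBASE_M * road_curvature_1pm)  # steer needed for the curve ahead
    feedback = -(K_LATERAL * lateral_error_m + K_HEADING * heading_error_rad)
    return feedforward + feedback


# Example: gentle left curve (curvature 0.01 1/m), vehicle 0.2 m to the right of the target track.
print(steering_command(road_curvature_1pm=0.01, lateral_error_m=-0.2, heading_error_rad=0.0))
```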
Returning to fig. 1, the HMI control unit 180 notifies the occupant of predetermined information through the HMI30. The predetermined information includes information related to the traveling of the vehicle M, such as information related to the state of the vehicle M and information related to driving control. The information related to the state of the vehicle M includes, for example, the speed of the vehicle M, the engine speed, the shift position, and the like. The information related to the driving control includes, for example, the presence or absence of execution of automated driving, information related to presettings for starting automated driving, information inquiring whether to start automated driving, information related to the degree of driving support by automated driving, and the like. The predetermined information may also include information not related to the traveling of the vehicle M, such as a television program or content (for example, a movie) stored in a storage medium such as a DVD. The predetermined information may also include, for example, information relating to the communication state between the vehicle M and the terminal device 300, the current position or destination during automated driving, and the remaining fuel amount of the vehicle M. The HMI control unit 180 may output the information received by the HMI30 to the communication device 20, the navigation device 50, the first control unit 120, and the like.
The HMI control unit 180 may communicate with the terminal device 300 stored in the terminal information 192 via the communication device 20 and transmit predetermined information to the terminal device 300. The HMI control unit 180 may cause the HMI30 to output information acquired from the terminal device 300.
The HMI control unit 180 may also perform the following control, for example: a registration screen for registering the terminal device 300 communicating with the vehicle M is displayed on the display device of the HMI30, and information relating to the terminal device 300 input via the registration screen is stored in the storage unit 190 as terminal information 192. The registration of the terminal information 192 described above is performed at a predetermined timing before the start of the automatic driving such as the driving of the vehicle or the automatic parking event by the user U, for example. The registration of the terminal information 192 described above may be performed by an application (vehicle cooperation application described later) installed in the terminal device 300.
The HMI control unit 180 may transmit the information obtained from the HMI30 and the HMI control unit 180 to the terminal device 300 or another external device via the communication device 20.
The storage unit 190 is implemented by, for example, an HDD, a flash memory, an EEPROM, a ROM (Read Only Memory), a RAM (Random Access Memory), or the like. The storage unit 190 stores, for example, terminal information 192, programs, and other information.
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the drive wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU (Electronic Control Unit) that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor so that a braking torque corresponding to a braking operation is output to each wheel, in accordance with information input from the second control unit 160 or information input from the driving operation element 80. The brake device 210 may be provided with a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the driving operation tool 80 to the hydraulic cylinder via the master cylinder as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder by controlling the actuator in accordance with information input from the second control unit 160.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the orientation of the steering wheel by applying a force to a rack-and-pinion mechanism, for example. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
[ terminal device ]
Fig. 3 is a diagram showing an example of a functional configuration of the terminal device 300. The terminal device 300 includes, for example, a communication unit 310, an input unit 320, a display 330, a speaker 340, an application execution unit 350, an output control unit 360, and a storage unit 370. The communication unit 310, the input unit 320, the application execution unit 350, and the output control unit 360 are realized by, for example, a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (including circuit units) such as LSIs, ASICs, FPGAs, GPUs, and the like, or may be realized by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium, for example, the storage unit 370) such as an HDD or a flash memory of the terminal device 300, or may be stored in a removable storage medium such as a DVD, a CD-ROM, or a memory card and installed in a storage device of the terminal device 300 by mounting the storage medium (non-transitory storage medium) in a drive device, a card slot, or the like.
The communication unit 310 communicates with the vehicle M and other external devices using a network such as a cellular network, a Wi-Fi network, Bluetooth, DSRC, LAN, WAN, or the internet.
The input unit 320 receives user input through operations of various keys, buttons, and the like, for example. The display 330 is, for example, an LCD (Liquid Crystal Display) or the like. The input unit 320 may be configured integrally with the display 330 as a touch panel. The display 330 displays information related to automated driving in the embodiment and other information necessary for using the terminal device 300 under the control of the output control unit 360. The speaker 340 outputs a predetermined sound under the control of the output control unit 360, for example.
The application execution unit 350 is realized by executing the vehicle cooperation application 372 stored in the storage unit 370. The vehicle cooperation application 372 is, for example, an application program (application) that communicates with the vehicle M via a network and transmits a warehousing instruction and a delivery instruction under automated driving to the vehicle M. The vehicle cooperation application 372 may also have a function (e.g., an electronic key function) that transmits an instruction to the vehicle M to lock or unlock the doors of the vehicle M or to start or stop the engine.
The vehicle cooperation application 372 acquires information transmitted by the vehicle M, and causes the output control unit 360 to execute a predetermined report based on the acquired information. For example, when the information related to the vehicle identification is received from the vehicle M by the communication unit 310 after the delivery instruction is given, the application execution unit 350 executes control to display an image on the display 330 or to output a sound from the speaker 340 based on the report content or the like associated with the information related to the vehicle identification.
The vehicle cooperation application 372 may transmit the position information of the terminal device 300 acquired by a GPS (Global Positioning System) device (not shown) built into the terminal device 300 to the vehicle M, register the terminal information, the report content, and the like, or perform other processes related to the vehicle cooperation.
The output control unit 360 controls the content and display mode of an image to be displayed on the display 330, and the content and output mode of a sound to be output from the speaker 340. For example, the output control unit 360 may display information related to driving control (automatic entry and automatic exit), information indicating the locked state and unlocked state of the doors (hereinafter referred to as the lock state), information inquiring about driving control and lock state instructions, and the like from the vehicle M on the display 330, or may output a sound corresponding to the above information from the speaker 340. The output control unit 360 may acquire an image or a sound corresponding to the notification content from an external device or generate the image or the sound in the terminal device 300, and output the image or the sound from the display 330 and the speaker 340. The output control unit 360 may output information instructed by the vehicle cooperation application 372 and various information necessary for using the terminal device 300 from the display 330 and the speaker 340.
The storage unit 370 is implemented by, for example, an HDD, a flash memory, an EEPROM, a ROM, a RAM, or the like. The storage unit 370 stores, for example, a vehicle cooperation application 372, a program, and other information.
[ Driving control ]
Next, driving control by automated driving according to the embodiment will be specifically described. Hereinafter, as an example of driving control by automated driving of the vehicle M, a scene will be described in which the vehicle M is automatically parked, under automated driving, in a valet parking lot of a facility to be visited. In the following, "unmanned traveling," in which the vehicle travels with no occupant on board, is used as an example of "traveling by automated driving"; however, the automated driving in the present embodiment may also be performed in a state where an occupant is present in the vehicle.
Fig. 4 is a diagram schematically showing a scenario in which an automatic parking event is executed in the embodiment. In the example of fig. 4, a parking lot PA (for example, a valet parking lot) attached to a facility to be visited is shown. In the parking lot PA, a gate 400-in, a gate 400-out, a stop area 410, and a boarding/alighting area 420 are provided on a route leading from the road Rd to the facility to be visited. The boarding/alighting area 420 is an example of a "predetermined area". The boarding/alighting area 420 may be divided into a boarding area and an alighting area. The boarding/alighting area may include, for example, a reserved boarding/alighting area (an example of a first area) in which an occupant of a vehicle for which a reservation to park in a predetermined parking area of the parking lot PA has been completed can get on or off, and a free boarding/alighting area (an example of a second area) in which an occupant of a vehicle can get on or off regardless of whether or not the reservation has been completed. In the example of fig. 4, a parking lot management device 500 is provided that manages the parking status of the parking lot PA and the reservation status of entry and exit, and transmits the vacancy status and the like to vehicles.
Here, the processing at the time of automatic entry and automatic exit of the vehicle M based on the automatic parking event will be described. The process at the time of warehousing and the time of delivery is executed by, for example, receiving a warehousing instruction and a delivery instruction from the terminal device 300, or by the elapse of a predetermined time or by the satisfaction of other execution conditions.
[ automatic parking event-automatic warehousing time ]
The automatic parking control unit 142 parks the vehicle M in a parking space of the parking lot PA, for example, based on information acquired from the parking lot management device 500 by the communication device 20. In this case, the vehicle M travels to the stop area 410 through the gate 400-in under manual driving or automated driving. The stop area 410 faces the boarding/alighting area 420 connected to the facility to be visited. Eaves for sheltering from rain and snow are provided over the boarding/alighting area 420.
After the occupant (hereinafter referred to as the user U) gets off the vehicle in the stop area 410, the vehicle M starts an automatic parking event in which the vehicle M moves, by unmanned automated driving, to a parking space PS in the parking lot PA. The start trigger condition of the automatic parking event is, for example, that the doors of the vehicle M are locked after the user U has gotten off in the boarding/alighting area 420, in a state where a predetermined condition is satisfied. The start trigger condition may also be an operation by the user U (for example, an entry instruction from the terminal device 300) or the wireless reception of a predetermined signal from the parking lot management device 500. Specific control contents in the scene where the automatic parking event is started will be described later.
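One possible way to picture the start trigger check is the sketch below; the particular set of predetermined conditions (mode set in advance, no occupant or terminal device left inside) is assembled from aspects (2), (10), (11), and (12), and the field names are illustrative assumptions.

```python
# Minimal sketch of the start trigger check; the condition set and field names are assumptions.
from dataclasses import dataclass


@dataclass
class VehicleState:
    in_alighting_area: bool   # vehicle is stopped facing the boarding/alighting area 420
    occupant_got_off: bool    # the user U has gotten off the vehicle
    doors_locked: bool        # locked state recognized by the recognition unit
    entry_mode_set: bool      # mode for parking in the parking area was set before alighting (aspect (2))
    occupant_inside: bool     # an occupant is still detected in the vehicle interior (aspect (10))
    terminal_inside: bool     # the occupant's terminal device 300 is still inside (aspects (11), (12))


def should_start_auto_parking(s: VehicleState) -> bool:
    """True when the start trigger of the automatic parking event is satisfied."""
    predetermined_conditions = (s.entry_mode_set
                                and not s.occupant_inside
                                and not s.terminal_inside)
    return (s.in_alighting_area and s.occupant_got_off
            and s.doors_locked and predetermined_conditions)


# Example: all conditions satisfied, so the vehicle M may start toward the parking space PS.
print(should_start_auto_parking(
    VehicleState(in_alighting_area=True, occupant_got_off=True, doors_locked=True,
                 entry_mode_set=True, occupant_inside=False, terminal_inside=False)))
```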
When starting the automatic parking event, the automatic parking control unit 142 controls the communication device 20 to transmit a parking request to the parking lot management device 500. Then, the vehicle M moves from the stop area 410 to the parking lot PA, guided by the parking lot management device 500, or moves while being sensed by its own force.
Fig. 5 is a diagram showing an example of the configuration of the parking lot management device 500. The parking lot management device 500 includes, for example, a communication unit 510, a control unit 520, and a storage unit 530. The storage unit 530 stores information such as parking lot map information 532, parking space state table 534, and reservation management information 536.
The parking lot map information 532 is information geometrically representing the structure of the parking lot PA. The parking lot map information 532 includes the coordinates of each parking space PS. The parking space state table 534 associates, for example, a parking space ID, which is identification information of a parking space PS, with a state indicating whether that parking space is in an empty state or a full (parked) state, and, in the case of the full state, with a vehicle ID, which is identification information of the parked vehicle. The reservation management information 536 is information in which a vehicle ID, a scheduled warehousing time, a scheduled ex-warehousing time, and a parking space ID are associated with each other. The reservation management information 536 may include position information of the stop area 410 corresponding to the boarding/alighting area 420 where the user U gets on or off the vehicle.
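The two tables might be pictured, purely for illustration, as simple mappings like the following; the field names and example values are assumptions, not the actual schema of the parking lot management device 500.

```python
# Minimal sketch of the two tables as simple data structures; field names are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ParkingSpaceState:
    occupied: bool                    # empty state vs. full (in-parking) state
    vehicle_id: Optional[str] = None  # identification of the parked vehicle when occupied


@dataclass
class Reservation:
    vehicle_id: str
    scheduled_entry: str              # scheduled warehousing time, e.g. "10:00"
    scheduled_exit: str               # scheduled ex-warehousing time, e.g. "18:00"
    parking_space_id: str


# parking space state table 534: parking space ID -> state
parking_space_state = {
    "PS-001": ParkingSpaceState(occupied=True, vehicle_id="M-123"),
    "PS-002": ParkingSpaceState(occupied=False),
}

# reservation management information 536: vehicle ID -> reservation
reservations = {
    "M-123": Reservation("M-123", "10:00", "18:00", "PS-001"),
}
```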
For example, when the HMI30 of the vehicle M or the terminal device 300 receives utilization reservation information (for example, one or both of the time of entering the parking lot PA and the time of leaving the parking lot PA) from the user U, the vehicle ID and the utilization reservation information are transmitted to the parking lot management device 500. When the vehicle ID and the utilization reservation information are received via the communication unit 510, the control unit 520 of the parking lot management device 500 refers to the parking space state table 534, the reservation management information 536, and the like, extracts a parking space ID that is vacant at the scheduled warehousing time, and stores the utilization reservation information (vehicle ID, scheduled warehousing time, scheduled ex-warehousing time) in association with that parking space ID in the reservation management information 536. This completes the reservation prior to parking. When the reservation is completed, the control unit 520 transmits the parking space ID, via the communication unit 510, to the vehicle M or the terminal device 300 that transmitted the vehicle ID and the utilization reservation information. If there is no parking space that can be reserved, the control unit 520 transmits information indicating that fact to the vehicle M or the terminal device 300 that transmitted the vehicle ID and the utilization reservation information.
When the communication unit 510 receives a parking request from a vehicle, the control unit 520 refers to the reservation management information 536 and determines, based on the vehicle ID transmitted together with the request, whether or not a parking space has been reserved. When a parking space has been reserved, the position of that parking space (for example, the parking space PS) is acquired from the parking lot map information 532, and a suitable route to the acquired position of the parking space PS is transmitted to the vehicle using the communication unit 510. When no parking space has been reserved (when a reservation was not completed in advance), the control unit 520 refers to the parking space state table 534 to extract a parking space PS in the empty state, acquires the position of the extracted parking space PS from the parking lot map information 532, and transmits a suitable route to the acquired position of the parking space PS to the vehicle using the communication unit 510. Based on the positional relationship of a plurality of vehicles, the control unit 520 instructs a specific vehicle to stop, creep, or the like as necessary so that the vehicles do not travel to the same position at the same time.
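The route-assignment behaviour of the control unit 520 described in the last two paragraphs can be sketched roughly as follows. This is a minimal illustration, not the device's actual firmware; the data shapes, the placeholder route, and the helper names are assumptions.

```python
# Hypothetical sketch of how control unit 520 might answer a parking request.
def handle_parking_request(vehicle_id, reservations, space_states, lot_map):
    """Return (parking_space_id, route) for the requesting vehicle, or None if no vacancy."""
    # 1. Prefer a space reserved in advance for this vehicle (reservation management info 536).
    reserved = next((r for r in reservations if r["vehicle_id"] == vehicle_id), None)
    if reserved is not None:
        space_id = reserved["parking_space_id"]
    else:
        # 2. Otherwise pick any space currently in the empty state (parking space state table 534).
        space_id = next((s["parking_space_id"] for s in space_states if not s["occupied"]), None)
        if space_id is None:
            return None  # no empty space; the vehicle would be informed of that fact
    # 3. Look up the space coordinates in the parking lot map information 532 and send a route.
    position = lot_map[space_id]
    route = ["stop_area_410", position]  # placeholder; actual route planning is omitted
    return space_id, route
```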
In the vehicle (hereinafter referred to as a vehicle M) that has received the route, the automatic parking control unit 142 generates a target trajectory based on the route. When the vehicle approaches the parking space PS, the parking space recognition unit 131 recognizes a parking frame line or the like that delimits the parking space PS, recognizes the detailed position of the parking space PS, and provides it to the automatic parking control unit 142. The automatic parking control unit 142 receives the detailed position, corrects the target trajectory, and parks the vehicle M in the parking space PS.
The above is not limiting; the automatic parking control unit 142 may, without relying on communication, automatically find a parking space in the empty state based on detection results from the camera 10, the radar device 12, the probe 14, or the object recognition device 16, and park the vehicle M in the found parking space.
[ automatic parking event - automatic exit time ]
The automatic parking control unit 142 and the communication device 20 remain in an operating state even while the vehicle M is parked. For example, when the communication device 20 receives a vehicle pickup request (an example of an exit instruction) from the terminal device 300, or when the scheduled exit time is reached, the automatic parking control unit 142 activates the system of the vehicle M and moves the vehicle M to the stop area 410. At that time, the automatic parking control unit 142 controls the communication device 20 to transmit a start request to the parking lot management device 500. As at the time of parking, the control unit 520 of the parking lot management device 500 instructs a specific vehicle to stop, creep, or the like as necessary, based on the positional relationship of a plurality of vehicles, so that the vehicles do not travel to the same position at the same time.
The automatic parking control unit 142 determines whether or not the user U is present in the boarding/alighting area 420 (in the boarding area, when the boarding area and the alighting area are separate), and if it determines that the user U is present, stops the vehicle M in an empty space of the stop area 410 within a predetermined distance from the position of the user U. In this case, the automatic parking control unit 142 acquires position information from the terminal device 300 and determines that the user U is present in the boarding/alighting area 420 when the acquired position is within the boarding/alighting area 420. The automatic parking control unit 142 may also determine whether the user U is present in the boarding/alighting area 420 based on detection results from the camera 10, the radar device 12, the probe 14, or the object recognition device 16. In this case, the automatic parking control unit 142 acquires feature information of the user U in advance, before entry of the vehicle M, from the detection results of the camera 10, the radar device 12, the probe 14, or the object recognition device 16. The automatic parking control unit 142 then compares the feature information of a person obtained from the detection results at the time of exit with the feature information of the user U, and determines that the user U is present when the degree of similarity is equal to or greater than a predetermined value.
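A rough sketch of the two presence checks just described (terminal position and feature similarity) is given below; the circular area model, the cosine-similarity measure, and the threshold are illustrative assumptions, not the patent's method.

```python
import math

def user_present_in_area(terminal_pos, area_center, area_radius,
                         detected_features, stored_features, threshold=0.8):
    """Decide whether user U is waiting in the boarding/alighting area 420 (positions in metres)."""
    # (a) Position check: the position reported by terminal device 300 lies inside area 420,
    #     here modelled as a circle for simplicity.
    if terminal_pos is not None and math.dist(terminal_pos, area_center) <= area_radius:
        return True

    # (b) Similarity check: a person detected at exit time matches the feature information of
    #     user U acquired before entry, with similarity at or above a predetermined value.
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    return any(cosine(f, stored_features) >= threshold for f in detected_features)
```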
After the vehicle M stops and the user U gets in, the automatic parking control unit 142 stops its operation, and manual driving or automatic driving by another functional unit is started thereafter. In this case, the automatic parking control unit 142 determines that the user U has boarded when, for example, it detects that a door of the vehicle M has been opened and closed, when it receives an operation of the driving operation element 80 or the HMI30, or when it detects that the load applied to a seat in the vehicle interior is equal to or greater than a predetermined value.
If it is determined that the user U is not present in the boarding/alighting area 420, the automatic parking control unit 142 may perform control to slow the vehicle M and delay its arrival at the stop area 410. This can shorten the waiting time in the stop area 410 and alleviate congestion in the stop area 410.
When the elapsed time since stopping in the stop area 410 is equal to or longer than a predetermined time and boarding by the user U is not detected, the automatic parking control unit 142 may generate a loop route that passes through the stop area 410 again and execute automatic driving along that loop. When the above conditions are satisfied, the automatic parking control unit 142 may instead execute automatic driving for parking the vehicle in the parking lot PA again. This can suppress congestion in the stop area 410.
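The waiting behaviour of the last two paragraphs amounts to a small decision rule; the sketch below is an assumption-laden illustration (the 120-second timeout and the action labels are invented, not taken from the disclosure).

```python
def next_action_after_waiting(elapsed_s, boarding_detected, max_wait_s=120.0,
                              prefer_reparking=False):
    """Pick the next action once the vehicle has stopped in the stop area 410."""
    if boarding_detected:
        return "hand_over_to_manual_or_other_driving_function"
    if elapsed_s < max_wait_s:
        return "keep_waiting"
    # User U has not boarded within the allowed time: either loop back through the
    # stop area 410 or park in the parking lot PA again to relieve congestion.
    return "repark_in_lot_PA" if prefer_reparking else "loop_through_stop_area_410"
```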
[ specific control contents in the scenario of starting the automatic parking event ]
Next, specific control contents in the scenario where the automatic parking event is started during entry into the parking lot will be described. Each of several control modes will be described below.
< first control mode >
The door lock recognition unit 132 recognizes the locked state of the doors while the vehicle M is stopped in the stop area 410. For example, the door lock recognition unit 132 recognizes the locked state of the doors of the vehicle M based on the state of the door lock device 42. The occupant recognition unit 134 analyzes the image captured by the vehicle interior camera 90 and recognizes whether or not the user U seated in a seat has gotten off the vehicle. The occupant recognition unit 134 may recognize in which seat in the vehicle interior the user U is present from the magnitude or amount of change of the seat load obtained by the vehicle sensor 40, in addition to (or instead of) the image captured by the vehicle interior camera 90. The occupant recognition unit 134 may recognize, using the vehicle interior camera 90 and the vehicle sensor 40, whether or not the user U seated in the driver's seat has gotten off, or whether or not all occupants including occupants other than the user U have gotten off. The occupant recognition unit 134 may also recognize the occupants riding in the vehicle M and the seats in which they are seated, using the vehicle interior camera 90 and the vehicle sensor 40.
When the occupant recognition unit 134 recognizes that the user U of the vehicle M has gotten off in the predetermined area and the door lock recognition unit 132 recognizes that the doors of the vehicle M are locked in a state where the predetermined condition is satisfied, the automatic parking control unit 142 generates an action plan for starting the vehicle M so as to park it in a predetermined parking area by the automatic parking event.
Here, the predetermined condition in the first control mode is that a setting of a mode for automatically parking the vehicle M in a predetermined parking space PS (hereinafter referred to as the automatic parking mode) has been received before the user U gets off the vehicle. For example, the parking setting recognition unit 133 determines whether or not the setting of the automatic parking mode has been accepted, based on an operation performed by the user U on a setting screen displayed on the HMI30 or the terminal device 300.
Fig. 6 is a diagram showing an example of an automatic entry setting image displayed on the display device of the HMI30. The image IM1 shown in fig. 6 is generated by the HMI control unit 180 when displayed on the HMI30, for example, and is generated by the output control unit 360 when displayed on the display 330 of the terminal device 300. The layout and display contents of the image shown in fig. 6 are not limited to those described below; the same applies to the examples of the images in fig. 7 and later. The image IM1 shown in fig. 6 may be displayed in response to a display instruction from the user U, or may be displayed when the position of the vehicle M comes within a predetermined distance of the facility to be visited.
The parking setting recognition unit 133 causes the HMI control unit 180 to generate an image IM1 for allowing the user U to set driving control such as the automatic parking mode, and causes the generated image IM1 to be output to the display device of the HMI30. In the example of fig. 6, the image IM1 includes, as a driving control setting screen, a text information display area A1 and a selection item display area A2. In the text information display area A1, for example, text information asking whether the automatic parking event may be executed in accordance with a door lock operation of the vehicle M after alighting is displayed. In the example of fig. 6, the text information display area A1 shows text such as "Execute automatic entry when the doors are locked after alighting?".
The selection item display area A2 includes a GUI (Graphical User Interface) icon IC1 (a "Yes" button) for permitting the content displayed in the text information display area A1 and a GUI icon IC2 (a "No" button) for rejecting it. Upon receiving an operation of the GUI icon IC1 by the user U, the parking setting recognition unit 133 determines that the automatic parking mode, which permits automatic entry when the doors of the vehicle M are locked, has been set. Upon receiving an operation of the GUI icon IC2 by the user U, the parking setting recognition unit 133 determines that a setting not to perform automatic entry when the doors of the vehicle M are locked (in other words, that the automatic parking mode is not set) has been received.
The parking setting recognition unit 133 may also receive the mode setting from the user U by causing the HMI control unit 180 to generate speech with the same content as the text information display area A1 of the image IM1 shown in fig. 6, causing the speaker of the HMI30 to output it, and acquiring the spoken answer through a microphone.
By making the above setting before the user U gets off, for example, the automatic parking control unit 142, having received the setting of the automatic parking mode, executes an automatic parking event for parking the vehicle M in the parking area when the user U gets off in the boarding/alighting area 420 and all the doors of the vehicle M are locked. In a state where the setting of the automatic parking mode has not been accepted, the automatic parking control unit 142 does not execute an automatic parking event for parking the vehicle M in the parking area even when the user U gets off in the boarding/alighting area 420 and all the doors of the vehicle M are locked. This suppresses automatic entry against the intention of the user U when a door lock instruction is received.
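The first control mode thus reduces to a conjunction of three conditions. The following minimal sketch (flag names are illustrative) shows that the door-lock trigger is ignored unless the automatic parking mode was set on the screen of fig. 6 before alighting.

```python
def should_start_auto_entry(auto_parking_mode_set, alighted_in_area_420, all_doors_locked):
    """First control mode: start the automatic parking event only when every condition holds."""
    return auto_parking_mode_set and alighted_in_area_420 and all_doors_locked

# Mode declined on the fig. 6 screen: locking the doors does not start automatic entry.
assert should_start_auto_entry(False, True, True) is False
# Mode accepted, user alighted in area 420, all doors locked: automatic entry starts.
assert should_start_auto_entry(True, True, True) is True
```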
< second control mode >
Next, the second control mode will be described. Hereinafter, differences from the first control mode will mainly be described, and descriptions of identical control contents will be omitted; the same applies to the descriptions of the subsequent control modes. In the second control mode, the predetermined condition of the first control mode is replaced by the condition that the vehicle has entered a facility to be visited (that is, an area where travel by the driving control is permitted) having a parking lot PA capable of automatic entry and exit. The entry may be, for example, passing through the gate 400-in, entering the stop area 410, or traveling at a position within a predetermined distance from the stop area 410.
The automatic parking control unit 142 generates an action plan for starting the vehicle M when it is recognized that the vehicle has entered a facility to be visited that includes a parking lot PA capable of automatic entry and exit, and the doors of the vehicle M are locked after the user U gets off in the boarding/alighting area 420. A vehicle entering the parking lot in valet parking can thus be entered or retrieved smoothly by a simple operation.
< third control mode >
Next, the third control mode will be described. In the third control mode, the predetermined condition for automatically parking the vehicle M differs depending on whether the vehicle M is stopped in a reserved boarding/alighting area (an example of the first area) or in a free boarding/alighting area (an example of the second area). Fig. 7 is a diagram for explaining the reserved boarding/alighting area 420R and the free boarding/alighting area 420F included in the boarding/alighting area 420. In the example of fig. 7, the free boarding/alighting area 420F is provided on the entrance side of the stop area 410 and the reserved boarding/alighting area 420R is provided on the far side (the far side in the X-axis direction), but the arrangement is not limited to this. For example, the reserved boarding/alighting area 420R may be provided on the entrance side of the stop area 410 and the free boarding/alighting area 420F on the far side, or the areas may be provided as separate boarding/alighting lanes. In the example of fig. 7, the vehicle stop area in which the user U gets on and off in the free boarding/alighting area 420F is the free stop area 410F, and the vehicle stop area in which the user U gets on and off in the reserved boarding/alighting area 420R is the reserved stop area 410R. When stopping the vehicle by manual driving, the user U visually distinguishes the reserved boarding/alighting area 420R from the free boarding/alighting area 420F and stops the vehicle accordingly; when parking is performed by automatic driving based on the reservation management information 536 and the like, the automatic driving control device 100 stops the vehicle in the reserved stop area 410R if the reservation was completed before entry, and stops it in the free stop area 410F if the reservation was not completed.
For example, after the user U gets off in the reserved boarding/alighting area 420R, the automatic parking control unit 142 executes an automatic parking event and starts the vehicle M when all the doors of the vehicle M are locked. This allows the entry driving control desired by the occupant to be performed smoothly with a simple operation.
As shown in fig. 7, when the vehicle is stopped in the free stop area 410F and the doors of the vehicle M are locked after the user U gets off in the free boarding/alighting area 420F, the automatic parking control unit 142 causes the HMI control unit 180 to inquire of the terminal device 300 whether or not to perform entry into the parking lot.
Fig. 8 is a diagram showing an example of an entry confirmation image sent to the terminal device 300 after the locking operation. The image IM2 shown in fig. 8 is generated by the output control unit 360 of the terminal device 300 in accordance with an instruction from the HMI control unit 180 and is displayed on the display 330. In the example of fig. 8, the image IM2 includes, as an automatic entry confirmation screen, a text information display area A3 and a selection item display area A4. In the text information display area A3, for example, after the door lock operation of the vehicle M is received and the doors are locked after alighting, text information asking the user U whether or not the automatic parking event may be executed is displayed. In the example of fig. 8, the text information display area A3 shows text such as "Execute automatic entry?".
The selection item display area A4 includes a GUI icon IC3 (a "Yes" button) for permitting the content displayed in the text information display area A3 and a GUI icon IC4 (a "No" button) for rejecting it. When an operation of the GUI icon IC3 or the GUI icon IC4 by the user U is accepted, the input unit 320 or the display 330 serving as a touch panel transmits the accepted information to the vehicle M via the communication unit 310. When the automatic parking control unit 142 of the vehicle M receives an instruction to execute automatic entry (an example of a start instruction for starting the vehicle M) based on the instruction information transmitted from the terminal device 300, it generates an action plan for starting the vehicle M so as to park it in a predetermined parking space by the automatic parking event. When it receives an instruction not to execute automatic entry, the automatic parking control unit 142 keeps the vehicle stopped in the free stop area 410F shown in fig. 7. Thus, for example, when the vehicle is to be stopped only for a short time, it can remain temporarily parked, and the user can board again immediately without waiting for the vehicle to enter the parking space PS and later leave it.
In the free boarding/alighting area 420F, vehicles that are only temporarily stopped and vehicles that are to be automatically parked immediately after alighting coexist; by making an inquiry to the user U when a door locking operation is received while the vehicle is stopped in the free stop area 410F, the automatic parking control unit 142 can suppress execution of automatic entry against the intention of the user U.
When the reservation of the parking area is completed while the vehicle M is stopped in the free stop area 410F, the automatic parking control unit 142 may start the vehicle M by the automatic parking event at the stage when the door locking operation is received, without displaying the image shown in fig. 8 on the terminal device 300 and requesting an instruction from the user U. Thus, even when the vehicle M is stopped in the free stop area 410F, the automatic parking event can be executed immediately without an automatic entry confirmation by the user U.
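Taken together, the third control mode's reaction to a door-lock operation can be summarised as follows; this is a simplified sketch with invented action labels, not the actual decision logic of the automatic parking control unit 142.

```python
def on_door_locked(stopped_in_reserved_area, reservation_completed):
    """Third control mode: decide the reaction to a door-lock operation after alighting."""
    if stopped_in_reserved_area:
        return "start_auto_entry"        # reserved stop area 410R: start immediately
    if reservation_completed:
        return "start_auto_entry"        # free stop area 410F, but a reservation exists
    return "ask_terminal_device_300"     # otherwise show the confirmation image of fig. 8
```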
< fourth control mode >
Next, the fourth control mode will be described. In the fourth control mode, the automatic parking control unit 142 may determine whether or not an occupant, including the user U, is present in the vehicle interior and execute the automatic parking event to start the vehicle only when no occupant is present and it is recognized that the doors of the vehicle M are locked. This can prevent the vehicle from being put into the parking lot with an occupant still in the vehicle interior, against the occupant's intention.
< fifth control mode >
Next, the fifth control mode will be described. In the fifth control mode, when it is recognized that the user U who got off the vehicle M has moved a predetermined distance or more away from the vehicle M in a state where a predetermined condition for starting the vehicle M by the automatic parking event is not satisfied, the terminal device 300 is notified. The conditions for starting the vehicle M include, for example, that the user U has gotten off in the boarding/alighting area 420 (hereinafter, the first condition), that the doors of the vehicle M are locked (hereinafter, the second condition), and that no occupant is present in the vehicle interior (hereinafter, the third condition). These conditions are examples; other conditions may be added or substituted.
The automatic parking control unit 142 treats the position of the terminal device 300 held by the user U as the position information of the user U, calculates the distance between the vehicle M and the user U from the position information of the vehicle M and that of the terminal device 300, and notifies the terminal device 300 through the HMI control unit 180 when the calculated distance is equal to or greater than a predetermined distance and the first to third conditions are not satisfied. In this case, the HMI control unit 180 may include in the notification information on which of the first to third conditions is not satisfied.
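A compact sketch of this distance-based notification is shown below; the 10 m threshold, the condition names, and the callback are assumptions used only to make the rule concrete.

```python
import math

def maybe_notify(vehicle_pos, terminal_pos, conditions, notify, threshold_m=10.0):
    """Fifth control mode: notify terminal device 300 when user U walks away with conditions unmet.

    `conditions` maps names such as 'alighted_in_area_420', 'doors_locked', 'cabin_empty' to bools.
    """
    unmet = [name for name, ok in conditions.items() if not ok]
    if unmet and math.dist(vehicle_pos, terminal_pos) >= threshold_m:
        notify("Automatic entry cannot start; unmet conditions: " + ", ".join(unmet))
        return True
    return False
```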
Fig. 9 is a diagram showing an example of an image related to a notification that the automatic parking event cannot be executed. The image IM3 shown in fig. 9 is generated by the output control unit 360 of the terminal device 300 in accordance with an instruction from the HMI control unit 180 and is displayed on the display 330. In the example of fig. 9, the image IM3 includes, as a notification screen, a text information display area A5 and a selection item display area A6. In the text information display area A5, for example, text information indicating that the automatic parking event cannot be executed is displayed. In the example of fig. 9, to notify that the third condition is not satisfied, the text information display area A5 shows text such as "A passenger is still in the vehicle, so automatic entry cannot be performed.".
The selection item display area A6 includes a GUI icon IC5 (an "OK" button) for acknowledging the content displayed in the text information display area A5. When an operation of the GUI icon IC5 by the user U is accepted from the input unit 320 or the display 330 serving as a touch panel, the output control unit 360 ends the display of the image IM3. This allows the user U to understand more clearly why the automatic parking event has not started. By viewing the image IM3, making adjustments so that all the conditions for executing the automatic parking event are satisfied (in the example of fig. 9, having all passengers in the vehicle interior get off), and performing the door locking operation again, the user U can have the automatic parking event executed.
When not all of the above-described conditions for starting the vehicle M are satisfied, the HMI control unit 180 may cause the terminal device 300 to display the image IM2 corresponding to the automatic entry confirmation screen shown in fig. 8 instead of the image IM3 corresponding to the notification screen shown in fig. 9. Thus, for example, even when the conditions for starting the vehicle M in order to park it by the automatic parking event are not completely satisfied, the automatic parking event can be executed by an operation simpler than the normal one, depending on the content of the unsatisfied condition among the predetermined conditions. The simple operation includes, for example, an operation with fewer items to select or an operation in which part of the procedure is omitted. Therefore, when an occupant is intentionally present in the vehicle interior (for example, an occupant is deliberately left in the vehicle), automatic entry can be performed by a simple operation using the image IM2 shown in fig. 8, without going through the display of the notification screen shown in fig. 9.
For example, the automatic parking control unit 142 may assign a priority to each of the predetermined conditions for starting the vehicle M in advance, display the image IM3 shown in fig. 9 on the terminal device 300 when an unsatisfied condition has a high priority, and display the image IM2 (the simple-operation image) shown in fig. 8 on the terminal device 300 when the unsatisfied condition has a low priority. The priority may be set, for example, for each layout of the facility to be visited, the boarding/alighting area, and the parking area, or according to the degree of congestion of the boarding/alighting area, the time period, and the like. The automatic parking control unit 142 may also display the image IM3 shown in fig. 9 on the terminal device 300 when the number of unsatisfied conditions among the plurality of predetermined conditions is equal to or greater than a predetermined number, and display the image IM2 shown in fig. 8 when it is less than the predetermined number. This makes it possible to give the user U a more appropriate notification based on the priority and number of unsatisfied conditions.
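The choice between the notification screen (image IM3) and the simple-operation screen (image IM2) can be expressed as a small rule; the priority scale and thresholds below are assumptions for illustration.

```python
def choose_screen(unmet_conditions, priorities, high_priority=2, max_minor=1):
    """Pick which image to push to terminal device 300 based on the unsatisfied conditions.

    `priorities` maps a condition name to an integer priority (larger = more important).
    """
    if any(priorities.get(c, 0) >= high_priority for c in unmet_conditions):
        return "IM3_notification"            # an important condition is unmet: explain why
    if len(unmet_conditions) > max_minor:
        return "IM3_notification"            # too many unmet conditions: explain why
    return "IM2_simple_confirmation"         # minor shortfall: allow entry by a simple operation
```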
< sixth control mode >
Next, the sixth control mode will be described. In the sixth control mode, the automatic parking event is executed to start the vehicle M when the terminal device 300 is not present in the vehicle interior and it is recognized that the doors of the vehicle M are locked. Specifically, the terminal recognition unit 135 analyzes the image captured by the vehicle interior camera 90 to obtain feature information of objects, performs pattern matching between the obtained feature information and predetermined feature information of the terminal device 300, and determines whether or not the image captured by the vehicle interior camera 90 includes the terminal device 300. When it is recognized, based on the recognition result of the terminal recognition unit 135, that the terminal device 300 is not present in the vehicle interior, the automatic parking control unit 142 generates an action plan for starting the vehicle M and executes the automatic parking event, provided that the other conditions for starting the vehicle M (for example, the first to third conditions) are satisfied. Conversely, when it is recognized, based on the recognition result of the terminal recognition unit 135, that the terminal device 300 is present in the vehicle interior, the automatic parking control unit 142 does not execute the automatic parking event (keeps the vehicle stopped) even if the other conditions for starting the vehicle M (for example, the first to third conditions) are satisfied. Thus, when the terminal device 300 that issues the exit instruction is in the vehicle interior, automatic entry is not performed even when a door locking operation is received, which prevents the situation where no instruction can be issued from the terminal device 300 at the time of exit. When the doors are locked while the terminal device 300 is in the vehicle interior, the HMI control unit 180 may operate a buzzer or hazard (emergency blinking) lamps provided in the vehicle M to notify the user U that the terminal device 300 is in the vehicle interior. This makes it easy for the user U to notice that the terminal device 300 has been left behind in the vehicle interior.
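In outline, the sixth control mode gates the door-lock trigger on the terminal check, as the assumed sketch below shows (the warning callback and return labels are illustrative).

```python
def on_lock_with_terminal_check(terminal_in_cabin, other_conditions_met, warn_occupant):
    """Sixth control mode: do not start automatic entry while terminal device 300 is in the cabin."""
    if terminal_in_cabin:
        warn_occupant()          # e.g. buzzer or hazard lamps: the terminal was left behind
        return "stay_parked"
    return "start_auto_entry" if other_conditions_met else "stay_parked"
```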
Each of the first to sixth control modes may be combined with part or all of the other control modes. Which of the first to sixth control modes is used may be set by the user U, or may be determined according to the parking lot PA, the form of the boarding/alighting area 420, and the control state of the vehicle M.
[ processing procedure ]
Fig. 10 is a flowchart showing an example of the flow of processing executed by the automatic driving control apparatus 100 according to the embodiment. The entry processing in the automatic parking event will mainly be described below. First, the recognition unit 130 determines whether or not the vehicle M is stopped in the boarding/alighting area based on the position information of the vehicle M and the map information of the facility to be visited (step S100). When it is determined that the vehicle M is stopped in the boarding/alighting area, the occupant recognition unit 134 determines whether or not the occupant (user U) has gotten off (step S102). When it is determined that the occupant has gotten off, the door lock recognition unit 132 determines whether or not an instruction to lock the doors of the vehicle M has been received (step S104). When it is determined that the lock instruction for the doors of the vehicle M has been received, the door lock device 42 locks the doors (step S106). After the doors are locked, the parking setting recognition unit 133 determines whether or not automatic entry has been set (step S108).
If it is determined that automatic entry has not been set, the automatic parking control unit 142 determines whether or not the vehicle M has entered a parking lot in which automatic entry is possible (step S110). If it is determined in step S108 that automatic entry has been set, or if it is determined in step S110 that the vehicle M has entered a parking lot in which automatic entry is possible, the automatic parking control unit 142 starts the vehicle M and executes the automatic entry (step S112). The processing of this flowchart then ends. If it is determined in step S100 that the vehicle is not stopped in the boarding/alighting area, if it is determined in step S102 that the occupant has not gotten off, if it is determined in step S104 that no door lock instruction has been received, or if it is determined in step S110 that the vehicle M has not entered a parking lot in which automatic entry is possible, the automatic entry is not performed and the processing of this flowchart ends. When the automatic entry is not executed, an image corresponding to a notification screen indicating the reason may be generated and displayed on the terminal device 300.
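For reference, the fig. 10 flow (steps S100 to S112) condenses to the following sketch; the `vehicle` interface is an assumption standing in for the recognition units and the door lock device 42.

```python
def entry_flow(vehicle):
    """Compressed sketch of the fig. 10 entry processing (steps S100-S112)."""
    if not vehicle.stopped_in_boarding_area():        # S100
        return "no_auto_entry"
    if not vehicle.occupant_alighted():               # S102
        return "no_auto_entry"
    if not vehicle.door_lock_instruction_received():  # S104
        return "no_auto_entry"
    vehicle.lock_doors()                              # S106
    if vehicle.auto_entry_mode_set():                 # S108
        vehicle.start_auto_entry()                    # S112
        return "auto_entry_started"
    if vehicle.inside_auto_parking_facility():        # S110
        vehicle.start_auto_entry()                    # S112
        return "auto_entry_started"
    return "no_auto_entry"
```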
In the embodiment, in addition to the automatic entry processing shown in fig. 10, the automatic entry processing may also be executed, for example, in combination with a determination as to whether the boarding/alighting area is the reserved boarding/alighting area and whether the terminal device has been left behind. Fig. 11 is a flowchart showing another example of the flow of processing executed by the automatic driving control apparatus 100 according to the embodiment. Compared with the processing shown in fig. 10, the processing of fig. 11 adds steps S120 to S126. Therefore, the description below focuses on steps S120 to S126, and the other steps are not described again. In the example of fig. 11, the boarding/alighting area 420 includes the reserved boarding/alighting area 420R and the free boarding/alighting area 420F.
When it is determined in step S108 shown in fig. 11 that automatic entry has been set, or when it is determined in step S110 that the vehicle M has entered a parking lot in which automatic entry is possible, the automatic parking control unit 142 determines whether or not the terminal device 300 is present in the vehicle interior of the vehicle M based on the recognition result of the terminal recognition unit 135 (step S120). If it is determined that the terminal device 300 is not present in the vehicle interior, the automatic parking control unit 142 determines whether or not the boarding/alighting area 420 is the reserved boarding/alighting area 420R (step S122). When it is determined that the boarding/alighting area 420 is not the reserved boarding/alighting area 420R, the HMI control unit 180 causes the terminal device 300 to ask whether or not to execute the automatic parking event (step S124). Next, the HMI control unit 180 determines whether or not an instruction to execute the automatic entry has been received from the terminal device 300 (step S126).
When it is determined in step S122 that the boarding/alighting area is the reserved boarding/alighting area, or when it is determined in step S126 that an instruction to execute the automatic entry has been received from the terminal device 300, the automatic parking control unit 142 starts the vehicle M and executes the automatic entry (step S112). The processing of this flowchart then ends. If it is determined in step S120 that the terminal device 300 is present in the vehicle interior, or if it is determined in step S126 that no instruction to execute the automatic entry has been received from the terminal device 300, the automatic entry is not executed and the processing of this flowchart ends. If it is determined in step S120 that the terminal device 300 is present in the vehicle interior, a buzzer or hazard lamps provided in the vehicle M may be operated to notify the occupant.
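The additional checks of fig. 11 slot in just before step S112; the assumed sketch below shows only those extra steps (S120 to S126), with the same caveat that the method names are illustrative.

```python
def extra_checks_before_start(vehicle):
    """Sketch of the fig. 11 additions: run after S108/S110 and before starting the vehicle (S112)."""
    if vehicle.terminal_device_in_cabin():            # S120
        vehicle.warn_occupant()                       # buzzer or hazard lamps
        return False                                  # do not start automatic entry
    if vehicle.in_reserved_boarding_area():           # S122
        return True                                   # reserved area: start immediately
    vehicle.ask_terminal_whether_to_enter()           # S124
    return vehicle.entry_instruction_received()       # S126
```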
According to the embodiment described above, the automatic driving control device 100 includes: the recognition unit 130, which recognizes the surrounding environment of the vehicle M and recognizes the locked state of the vehicle M; and the driving control unit (the action plan generating unit 140 and the second control unit 160), which performs driving control of one or both of the speed and the steering of the vehicle M based on the recognition result of the recognition unit 130. The driving control unit starts the vehicle M when the doors of the vehicle M are locked after the occupant of the vehicle M gets off in the predetermined area in a state where the predetermined condition is satisfied, so that more appropriate driving control can be performed. Specifically, according to the above embodiment, the automatic driving control device 100 suppresses execution of automatic entry against the user's intention and can execute automatic traveling (parking under automatic driving) more reliably with a simple operation.
In the above-described embodiment, the driving control unit performs control to start the vehicle M when the doors of the vehicle M are locked after the occupant of the vehicle M gets off in the predetermined area in a state where the predetermined condition is satisfied, and it also performs driving control so that the vehicle M does not come into contact with the occupant who has gotten off. For example, the driving control unit causes the vehicle to start traveling when the occupant who has gotten off is separated from the vehicle M by a predetermined distance or more, or after a predetermined time has elapsed since the doors were locked.
[ hardware configuration ]
Fig. 12 is a diagram showing an example of the hardware configuration of the automatic driving control apparatus 100 according to the embodiment. As shown in the figure, the computer of the automatic driving control apparatus 100 is configured such that a communication controller 100-1, a CPU 100-2, a RAM 100-3 used as a working memory, a ROM 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD, a drive device 100-6, and the like are connected to one another via an internal bus or a dedicated communication line. The communication controller 100-1 communicates with components other than the automatic driving control apparatus 100. A removable storage medium (for example, a computer-readable non-transitory storage medium) such as an optical disc can be mounted in the drive device 100-6. The storage device 100-5 stores a program 100-5a to be executed by the CPU 100-2. The program is loaded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2. The program 100-5a referred to by the CPU 100-2 may be stored in the removable storage medium mounted in the drive device 100-6, or may be downloaded from another device via a network. In this way, some or all of the components of the automatic driving control apparatus 100 are realized.
The above-described embodiments can be expressed as follows.
The vehicle control device is configured to include:
a storage device storing a program; and
a hardware processor,
the hardware processor performs the following processing by executing a program stored in the storage device:
identifying a surrounding environment of a vehicle and identifying a locked state of the vehicle;
performing driving control of one or both of a speed and a steering of the vehicle based on the recognition result; and
the vehicle is started when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
Claims (14)
1. A control apparatus for a vehicle, wherein,
the vehicle control device includes:
an identification unit that identifies the surrounding environment of a vehicle and identifies the locked state of the vehicle; and
a driving control unit that performs driving control of one or both of a speed and a steering of the vehicle based on a recognition result of the recognition unit,
the driving control unit starts the vehicle when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
2. The vehicle control apparatus according to claim 1,
the vehicle control device further includes a receiving unit that receives an operation of the occupant,
the driving control unit parks the vehicle in a predetermined parking area when the receiving unit has received, before the occupant gets off the vehicle, a setting of a mode for parking the vehicle in the parking area, and the occupant of the vehicle gets off the vehicle in the predetermined area and the door of the vehicle is locked.
3. The vehicle control apparatus according to claim 2,
the driving control unit does not perform driving control for parking the vehicle in the parking area when the occupant of the vehicle gets off the vehicle in the predetermined area and the door of the vehicle is locked in a state where the setting of the mode has not been accepted by the receiving unit before the occupant gets off the vehicle.
4. The vehicle control apparatus according to any one of claims 1 to 3,
the driving control unit starts the vehicle when the recognition unit recognizes that the vehicle has entered an area in which travel of the vehicle by the driving control is permitted, and the occupant of the vehicle gets off the vehicle in the predetermined area and the door of the vehicle is locked.
5. The vehicle control apparatus according to any one of claims 2 to 4,
the vehicle control device further includes a communication unit that communicates with a terminal device of the occupant,
the predetermined area includes a first area and a second area,
the driving control unit makes a condition for starting the vehicle in a case where the occupant gets off the vehicle in the first area different from a condition for starting the vehicle in a case where the occupant gets off the vehicle in the second area.
6. The vehicle control apparatus according to claim 5,
the driving control unit performs the following processing:
starting the vehicle when the occupant gets off the vehicle in the first area and the door of the vehicle is locked; and
when the occupant gets off the vehicle in the second area, the door of the vehicle is locked, and a start instruction from the terminal device is received by the communication unit, the vehicle is started.
7. The vehicle control apparatus according to claim 6,
the first region is a region where an occupant of a vehicle having completed a reservation to park the vehicle in a predetermined parking region gets on or off the vehicle, the second region is a region where the occupant of the vehicle having completed the reservation or the vehicle having not completed the reservation gets on or off the vehicle,
the driving control unit starts the vehicle when the reservation is completed and the door of the vehicle is locked in a state where the vehicle for which the reservation has not been completed is stopped at a position where the occupant gets off the vehicle in the second area.
8. The vehicle control apparatus according to any one of claims 1 to 7,
the vehicle control device further includes:
a communication unit that communicates with a terminal device of the occupant; and
a notification control unit that notifies the terminal device,
the notification control unit notifies the terminal device via the communication unit when an occupant who has gotten off the vehicle is separated from the vehicle by a predetermined distance or more in a state where the predetermined condition is not satisfied.
9. The vehicle control apparatus according to claim 8,
the notification control unit receives a simple operation from the terminal device based on the content of the condition that is not satisfied among the predetermined conditions, and causes the driving control unit to start the vehicle.
10. The vehicle control apparatus according to any one of claims 1 to 9,
the driving control unit starts the vehicle when the occupant is not present in the vehicle interior and the recognition unit recognizes that the door of the vehicle is locked.
11. The vehicle control apparatus according to any one of claims 1 to 10,
the driving control unit starts the vehicle when the terminal device of the occupant who gets off the vehicle is not present in the interior of the vehicle and the recognition unit recognizes that the door of the vehicle is locked.
12. The vehicle control apparatus according to claim 11,
the driving control unit does not start the vehicle even when the recognition unit recognizes that the door of the vehicle is locked, in a case where the terminal device of the occupant who gets off the vehicle is present in the interior of the vehicle.
13. A control method for a vehicle, wherein,
the vehicle control method causes a computer to perform:
identifying a surrounding environment of a vehicle and identifying a locked state of the vehicle;
performing driving control of one or both of a speed and a steering of the vehicle based on the recognition result; and
the vehicle is started when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
14. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
identifying a surrounding environment of a vehicle and identifying a locked state of the vehicle;
performing driving control of one or both of a speed and a steering of the vehicle based on the recognition result; and
the vehicle is started when a door of the vehicle is locked after an occupant of the vehicle gets off the vehicle in a predetermined area in a state where a predetermined condition is satisfied.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019064036A JP7110149B2 (en) | 2019-03-28 | 2019-03-28 | VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM |
JP2019-064036 | 2019-03-28 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111746513A true CN111746513A (en) | 2020-10-09 |
CN111746513B CN111746513B (en) | 2023-06-30 |
Family
ID=72607215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010215429.8A Active CN111746513B (en) | 2019-03-28 | 2020-03-24 | Vehicle control device, vehicle control method, and storage medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US11400921B2 (en) |
JP (1) | JP7110149B2 (en) |
CN (1) | CN111746513B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113525352A (en) * | 2021-06-21 | 2021-10-22 | 上汽通用五菱汽车股份有限公司 | Parking method of vehicle, vehicle and computer readable storage medium |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7405067B2 (en) * | 2020-12-11 | 2023-12-26 | トヨタ自動車株式会社 | Vehicle control device and vehicle control method |
CN112750335A (en) * | 2020-12-25 | 2021-05-04 | 中国第一汽车股份有限公司 | Vehicle auxiliary control method and device, vehicle and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017061320A (en) * | 2016-12-08 | 2017-03-30 | みこらった株式会社 | Automatic driving vehicle and program for automatic driving vehicle |
US20170329346A1 (en) * | 2016-05-12 | 2017-11-16 | Magna Electronics Inc. | Vehicle autonomous parking system |
CN107640148A (en) * | 2016-07-20 | 2018-01-30 | 现代自动车株式会社 | The method that remote auto parking support system guides car-parking model |
CN107878362A (en) * | 2016-09-30 | 2018-04-06 | Lg电子株式会社 | Autonomous driving vehicle |
US20180105167A1 (en) * | 2015-11-10 | 2018-04-19 | Hyundai Motor Company | Automatic parking system and automatic parking method |
JP2018065488A (en) * | 2016-10-20 | 2018-04-26 | 三菱自動車工業株式会社 | Automatic driving system |
JP2019026067A (en) * | 2017-07-31 | 2019-02-21 | 日立オートモティブシステムズ株式会社 | Autonomous driving control device, autonomous moving car, and autonomous moving car control system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5471462B2 (en) | 2010-01-11 | 2014-04-16 | 株式会社デンソーアイティーラボラトリ | Automatic parking equipment |
JP6279351B2 (en) | 2014-03-04 | 2018-02-14 | アンリツインフィビス株式会社 | Weight measuring device |
WO2015166721A1 (en) | 2014-05-02 | 2015-11-05 | エイディシーテクノロジー株式会社 | Vehicle controller |
JP6769860B2 (en) | 2016-12-19 | 2020-10-14 | クラリオン株式会社 | Terminals and terminal control methods |
JP7093602B2 (en) | 2017-05-24 | 2022-06-30 | Ihi運搬機械株式会社 | How to operate vehicles and parking lots |
KR102102651B1 (en) * | 2017-10-12 | 2020-05-29 | 엘지전자 주식회사 | Autonomous vehicle and method for controlling the same |
KR102694198B1 (en) | 2018-12-31 | 2024-08-13 | 현대자동차주식회사 | Automated Valet Parking System and method, infrastructure and vehicle thereof |
- 2019-03-28 JP JP2019064036A patent/JP7110149B2/en active Active
- 2020-03-18 US US16/822,084 patent/US11400921B2/en active Active
- 2020-03-24 CN CN202010215429.8A patent/CN111746513B/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180105167A1 (en) * | 2015-11-10 | 2018-04-19 | Hyundai Motor Company | Automatic parking system and automatic parking method |
US20170329346A1 (en) * | 2016-05-12 | 2017-11-16 | Magna Electronics Inc. | Vehicle autonomous parking system |
CN107640148A (en) * | 2016-07-20 | 2018-01-30 | 现代自动车株式会社 | The method that remote auto parking support system guides car-parking model |
CN107878362A (en) * | 2016-09-30 | 2018-04-06 | Lg电子株式会社 | Autonomous driving vehicle |
JP2018065488A (en) * | 2016-10-20 | 2018-04-26 | 三菱自動車工業株式会社 | Automatic driving system |
JP2017061320A (en) * | 2016-12-08 | 2017-03-30 | みこらった株式会社 | Automatic driving vehicle and program for automatic driving vehicle |
JP2019026067A (en) * | 2017-07-31 | 2019-02-21 | 日立オートモティブシステムズ株式会社 | Autonomous driving control device, autonomous moving car, and autonomous moving car control system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113525352A (en) * | 2021-06-21 | 2021-10-22 | 上汽通用五菱汽车股份有限公司 | Parking method of vehicle, vehicle and computer readable storage medium |
CN113525352B (en) * | 2021-06-21 | 2022-12-02 | 上汽通用五菱汽车股份有限公司 | Parking method of vehicle, vehicle and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
US20200307556A1 (en) | 2020-10-01 |
JP7110149B2 (en) | 2022-08-01 |
US11400921B2 (en) | 2022-08-02 |
CN111746513B (en) | 2023-06-30 |
JP2020166357A (en) | 2020-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111942369B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN111376853B (en) | Vehicle control system, vehicle control method, and storage medium | |
CN111932928B (en) | Parking lot management system, parking lot management device, parking lot management method, and storage medium | |
CN111986505B (en) | Control device, boarding/alighting facility, control method, and storage medium | |
CN111762174B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN111391826A (en) | Vehicle control system, vehicle control method, and storage medium | |
CN111746513B (en) | Vehicle control device, vehicle control method, and storage medium | |
CN111667709B (en) | Vehicle control device, information providing system, vehicle control method, information providing method, and storage medium | |
CN111951566A (en) | Vehicle control system, vehicle control method, and storage medium | |
CN111833644A (en) | Parking management device, control method for parking management device, and storage medium | |
CN111619569A (en) | Vehicle control system, vehicle control method, and storage medium | |
CN111932927B (en) | Management device, management method, and storage medium | |
CN111619550A (en) | Vehicle control device, vehicle control system, vehicle control method, and storage medium | |
CN111731293A (en) | Vehicle control system, vehicle control method, and storage medium | |
CN111951599B (en) | Parking lot management device, parking lot management method, and storage medium | |
CN111688708B (en) | Vehicle control system, vehicle control method, and storage medium | |
CN112037561B (en) | Information processing apparatus, information processing method, and storage medium | |
CN111796591B (en) | Vehicle control device, monitoring system, vehicle control method, and storage medium | |
US11377098B2 (en) | Vehicle control device, vehicle control method, and storage medium | |
CN113470417A (en) | Housing area management device | |
CN111661038A (en) | Vehicle control system, vehicle control method, and storage medium | |
CN112009478B (en) | Vehicle control system, vehicle control method, and storage medium | |
US20200311621A1 (en) | Management device, management method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||