WO2018189952A1 - Autonomous driving vehicle, method for stopping an autonomous driving vehicle, and program - Google Patents
Autonomous driving vehicle, method for stopping an autonomous driving vehicle, and program
- Publication number
- WO2018189952A1 (PCT/JP2017/044809)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- gesture
- person
- road
- vehicle body
- Prior art date
Classifications
- G06Q10/02 — Reservations, e.g. for tickets, services or events
- B60Q1/503, B60Q1/5035 — Signalling to other traffic using luminous text or symbol displays in or on the vehicle, e.g. electronic displays
- B60Q1/507 — Signalling of intentions or conditions to other traffic, specific to autonomous vehicles
- B60W30/181 — Propelling the vehicle; preparing for stopping
- B60W60/0025, B60W60/00253 — Planning or execution of driving tasks specially adapted for specific operations, including taxi operations
- G05D1/0016 — Remote control arrangements characterised by the operator's input device
- G05D1/0088 — Control characterized by the autonomous decision-making process, e.g. artificial intelligence, predefined behaviours
- G05D1/0212 — Control of position or course in two dimensions with means for defining a desired trajectory
- G06Q50/40 — Business processes related to the transportation industry
- G06V20/56 — Context or environment of the image exterior to a vehicle, using sensors mounted on the vehicle
- G06V40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
- G08G1/202 — Dispatching vehicles on the basis of a location, e.g. taxi dispatching
- B60W2540/01, B60W2540/041, B60W2540/049 — Input parameters relating to occupants other than the driver, potential occupants, and the number of occupants
- B60W2552/00, B60W2554/00 — Input parameters relating to infrastructure and to objects
- B60W2555/60 — Input parameters relating to traffic rules, e.g. speed limits or right of way
- B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit
- G05D1/0231, G05D1/0257 — Position or course control specially adapted to land vehicles, using optical position-detecting means or radar
- G05D1/027, G05D1/0278 — Position or course control specially adapted to land vehicles, using inertial navigation means or satellite positioning signals, e.g. GPS
Definitions
- The present disclosure relates to an autonomous driving vehicle, a method for stopping the autonomous driving vehicle, and a program.
- The present disclosure aims to efficiently board a prospective customer who is trying to hail a taxi onto an autonomous driving vehicle.
- An autonomous driving vehicle includes a vehicle body; a gesture detection unit that detects a gesture made by a person near the road on which the vehicle body is traveling; a determination unit that determines whether any passengers are present in the vehicle body; and a control unit that drives the vehicle body autonomously. When the determination unit determines that no passengers are present and the gesture detection unit detects a person making the gesture, the control unit stops the vehicle body near that person.
- In this way, a prospective customer who is trying to hail a taxi can be boarded onto the autonomous driving vehicle efficiently.
- FIG. 1 is a schematic diagram showing a schematic configuration of a vehicle allocation system according to the present embodiment.
- FIG. 2 is a schematic diagram showing an example of an autonomous driving vehicle in the present embodiment.
- FIG. 3 is a block diagram illustrating an example of a functional configuration of the autonomous driving vehicle according to the present embodiment.
- FIG. 4 is a flowchart showing a flow of a method for stopping an autonomous driving vehicle.
- FIG. 5 is a flowchart showing a flow of an acceptance mode according to the embodiment.
- FIG. 6 is a flowchart showing the flow of the reject mode according to the embodiment.
- FIG. 7 is a schematic diagram illustrating an example of an image captured by the first camera when the autonomous driving vehicle according to the embodiment travels.
- FIG. 8 is a schematic diagram illustrating an example of an image captured by the first camera when the autonomous driving vehicle according to the embodiment travels.
- The present inventor examined adopting an autonomous driving vehicle as a taxi.
- Since a Level 4 self-driving vehicle, as defined by the Japanese government or by the National Highway Traffic Safety Administration of the U.S. Department of Transportation, drives fully automatically, passengers are not involved in the driving operation at all.
- For this reason, the self-driving vehicle itself must determine whether a prospective customer is present on the road, and, if one is found, the self-driving vehicle needs to stop.
- However, if the self-driving vehicle stops every time a prospective customer is discovered, this is inefficient when passengers are already aboard. Therefore, a technique capable of solving this problem is described below.
- An autonomous driving vehicle according to an aspect of the present disclosure includes a vehicle body; a gesture detection unit that detects a gesture, made by a person near the road on which the vehicle body is traveling, for stopping the vehicle body; a determination unit that determines whether any passengers are present in the vehicle body; and a control unit that drives the vehicle body autonomously.
- When the determination unit determines that no passengers are present and the gesture detection unit detects a person making the gesture, the vehicle body is stopped near that person.
- A method for stopping an autonomous driving vehicle according to an aspect of the present disclosure stops the vehicle body near a person when it is determined that no passengers are present in the vehicle body and a gesture for stopping the vehicle body, made by a person near the road on which the vehicle body is traveling, is detected.
- A program according to an aspect of the present disclosure causes a computer to execute the above-described method for stopping an autonomous driving vehicle.
- According to these aspects, when it is determined that no passengers are present in the vehicle body of the autonomous driving vehicle and a gesture for stopping the vehicle body is detected, the vehicle body is stopped near the person who made the gesture. When passengers are present in the vehicle body, the autonomous driving vehicle continues to travel without stopping. In other words, the autonomous driving vehicle is prevented from stopping for another prospective customer even though passengers are already aboard. As a result, prospective customers can be boarded onto autonomous driving vehicles efficiently.
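The decision described above can be sketched as a tiny control-cycle function. This is an illustrative sketch, not the patent's actual implementation; the function and value names are assumed.

```python
# Minimal sketch of the stop decision: the vehicle stops for a hailing
# gesture only when no passengers are aboard.

def decide_action(has_passengers: bool, gesture_detected: bool) -> str:
    """Return the driving action for one control cycle."""
    if gesture_detected and not has_passengers:
        return "stop_near_person"   # pull over near the person who gestured
    return "continue_driving"       # occupied or no gesture: keep going
```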
- The gesture detection unit may detect gestures only of persons on one of the right side and the left side of the road.
- Because the gesture detection unit can set only one side of the road as its detection range, the detection process can be sped up. For example, in a country that uses left-hand traffic, most prospective customers stand on the left side of the road, so detection can be sped up by limiting the gesture detection unit's detection range to the left side.
- The autonomous driving vehicle may include a lane detection unit that detects the number of lanes in the same traveling direction on the road. When the lane detection unit detects one lane, the gesture detection unit detects gestures of persons on both the right side and the left side of the road; when the lane detection unit detects two or more lanes, the gesture detection unit detects gestures of persons on only one of the right side and the left side of the road.
- Thereby, the detection range of the gesture detection unit can be adjusted according to the number of lanes.
- When there is one lane, the width of the entire road is relatively narrow; more specifically, the road is a one-way road or a single-lane road. In this case the autonomous driving vehicle can serve prospective customers on both the left and right sides of the road from its own lane, and it can also make a U-turn to reach prospective customers on the opposite-lane side.
- When there are two or more lanes, the width of the entire road is relatively wide. In that case it is undesirable for one autonomous driving vehicle to serve prospective customers on both sides of the road, because moving to another lane may hinder the traffic of other vehicles.
- Since the detection range of the gesture detection unit can be adjusted according to the number of lanes, the detection range can be made appropriate for the situation.
- When the lane detection unit detects two or more lanes and the determination unit determines that no passengers are present in the vehicle body, the control unit may drive the vehicle body in the lane closest to the sidewalk among the two or more lanes.
- Prospective customers are usually on the sidewalk. Therefore, when it is determined that no passengers are present, driving the vehicle body in the lane closest to the sidewalk among the two or more lanes makes it possible to approach a prospective customer smoothly.
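The curb-lane preference can be sketched like this. Names are illustrative, and lane 0 is assumed to be the lane adjacent to the sidewalk.

```python
# Sketch of the curb-lane rule: with no passengers aboard, drive in the
# lane nearest the sidewalk so prospective customers can be approached.

def choose_lane(num_lanes: int, has_passengers: bool, current_lane: int) -> int:
    if num_lanes >= 2 and not has_passengers:
        return 0                    # lane closest to the sidewalk
    return current_lane             # otherwise keep the current lane
```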
- The self-driving vehicle may include a U-turn prohibition detection unit that detects whether U-turns are prohibited on the road. When the U-turn prohibition detection unit detects that U-turns are prohibited on the road, the gesture detection unit may detect gestures of persons on only one of the right side and the left side of the road.
- Thereby, the detection range of the gesture detection unit can be adjusted depending on whether U-turns are prohibited. Specifically, the self-driving vehicle cannot move to the opposite lane at a place where U-turns are prohibited; even if a prospective customer is on the opposite-lane side, the autonomous driving vehicle cannot respond. In this case the detection range of the gesture detection unit can be limited to one side of the road.
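The U-turn rule can be sketched separately; the function and side names below are assumed for illustration.

```python
# Sketch of narrowing the detection range where U-turns are prohibited:
# only the vehicle's own side of the road is scanned.

def scan_sides(u_turn_prohibited: bool, own_side: str = "left") -> set:
    """Return the sides of the road to scan for hailing gestures."""
    if u_turn_prohibited:
        return {own_side}           # opposite lane is unreachable
    return {"left", "right"}        # both sides can be served
```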
- The gesture detection unit may also detect a second gesture by which the person cancels the stopping of the vehicle body. If, after detecting the person making the gesture, the gesture detection unit detects the second gesture from that person, the control unit may keep the vehicle body traveling.
- Thereby, when the second gesture is detected after the first, the vehicle body continues to travel, so the vehicle body can keep traveling in accordance with the prospective customer's intention to cancel.
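The hail-then-cancel sequence can be sketched as follows; the event names are assumptions, not the patent's terminology.

```python
# Sketch of resolving a time-ordered sequence of gestures from one person:
# a stop request stands only if no cancel gesture follows it.

def resolve_gestures(events: list) -> str:
    """events: ordered 'hail' / 'cancel' gestures; return the action."""
    action = "continue_driving"
    for event in events:
        if event == "hail":
            action = "stop_near_person"
        elif event == "cancel" and action == "stop_near_person":
            action = "continue_driving"   # second gesture cancels the stop
    return action
```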
- The autonomous driving vehicle may include a notification unit that issues notifications toward the outside of the vehicle body. When the determination unit determines that no passengers are present in the vehicle body and the gesture detection unit detects a person making the gesture, the control unit may cause the notification unit to notify that the person has been detected.
- Thereby, since the notification unit indicates that the person who made the gesture has been detected, the prospective customer can see that the gesture has reached the autonomous driving vehicle.
- The autonomous driving vehicle may include a communication unit that communicates with a dispatch system that dispatches another autonomous driving vehicle to a predetermined place. When the determination unit determines that passengers are present in the vehicle body and the gesture detection unit detects a person making the gesture, the control unit may output a dispatch request for the person's location from the communication unit to the dispatch system.
- Thereby, even when the vehicle is occupied, a dispatch request for that location is output from the communication unit to the dispatch system, so another autonomous driving vehicle can be dispatched to the prospective customer.
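The occupied-vehicle behavior can be sketched as below; the dispatch queue and all names are illustrative stand-ins for the communication with the dispatch system.

```python
# Sketch of the occupied-vehicle path: instead of stopping, the hailing
# person's location is forwarded to a dispatch system.

def handle_gesture(has_passengers: bool, person_location: tuple,
                   dispatch_queue: list) -> str:
    if has_passengers:
        # Occupied: ask the dispatch system to send another vehicle.
        dispatch_queue.append({"pickup": person_location})
        return "continue_driving"
    return "stop_near_person"
```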
- FIG. 1 is a schematic diagram showing a schematic configuration of a vehicle allocation system 100 according to the present embodiment.
- The vehicle allocation system 100 is a system that dispatches a plurality of autonomous driving vehicles 1, used as taxis, to predetermined places.
- the vehicle allocation system 100 includes an information terminal 101 such as a personal computer or a tablet terminal.
- The information terminal 101 can communicate with each autonomous driving vehicle 1 via the network N. A dispatch instruction for a predetermined place is output from the information terminal 101 to each autonomous driving vehicle 1.
- Each self-driving vehicle 1 travels automatically toward a predetermined place based on a dispatch instruction.
- The vehicle allocation system 100 may also include an automobile that requires a driver's operation. In this case, the driver drives the vehicle to the predetermined location based on the dispatch instruction received by the vehicle.
- FIG. 2 is a schematic diagram showing an example of the autonomous driving vehicle 1 in the present embodiment.
- the autonomous driving vehicle 1 includes a vehicle main body 10 and an automatic driving system 20 for automatically driving the vehicle main body 10.
- the automatic driving system 20 is mounted on the vehicle main body 10.
- the vehicle body 10 is provided with a seat 11 on which a passenger sits, and an operation unit 12 that is operated by the passenger is installed on a dashboard in front of the seat 11.
- the operation unit 12 is, for example, a touch panel, and a passenger can specify a destination.
- The roof of the vehicle body 10 is provided with a notification unit 13 on which various information is displayed.
- Through this notification unit 13, various information can be conveyed to people outside the autonomous driving vehicle 1.
- FIG. 3 is a block diagram showing an example of the functional configuration of the autonomous driving vehicle 1 in the present embodiment.
- the vehicle main body 10 of the autonomous driving vehicle 1 is provided with a propulsion system 110, a sensor system 120, a control system 130, and peripheral devices 140.
- the propulsion system 110 is a system that provides power movement to the vehicle body 10. Specifically, the propulsion system 110 includes a drive source 111 and a transmission 112.
- the drive source 111 is an internal combustion engine, an electric motor, a steam engine, a Stirling engine, or the like, and these may be a single body or a combination thereof.
- When the autonomous driving vehicle 1 is a gasoline-electric hybrid vehicle, the drive source 111 is a combination of a gasoline engine and an electric motor.
- The transmission 112 is configured to transmit mechanical power from the drive source 111 to the wheels, and includes a gearbox, clutch, differential, drive shaft, and/or other elements.
- the sensor system 120 detects information related to the environment where the vehicle body 10 is placed.
- the sensor system 120 is composed of several sensors so as to detect information related to the environment.
- the sensor system 120 includes a GPS module 121, an inertia measurement unit 122, a characteristic unit 123, a first camera 124, and a second camera 125.
- the GPS module 121 is a module that estimates the latitude and longitude of the vehicle body 10 by GPS (Global Positioning System). Specifically, the GPS module 121 estimates the position of the vehicle body 10 with respect to the earth based on satellite-based positioning data. In one example, the automatic driving system 20 estimates the position of the lane boundary of the road on which the vehicle body 10 is traveling by using the GPS module 121 together with the map data.
- the inertial measurement unit 122 is an IMU (Inertial Measurement Unit), and is a sensor group that detects changes in the position and orientation of the vehicle body 10 based on inertial acceleration.
- the inertial measurement unit 122 may include, for example, an accelerometer and a gyroscope.
- the characteristic unit 123 determines the characteristics of the object such as the distance, altitude, direction, or speed of the object around the vehicle body 10.
- the characteristic unit 123 includes a radar unit that determines the distance, altitude, direction, or speed of an object using radio waves.
- Other systems similar to radar may also be included in the characteristic unit 123.
- Such other systems include a lidar system that uses light to detect objects in the environment where the vehicle body 10 is placed. If both scanning and non-scanning lidar systems are used, three-dimensional (3D) imaging can be performed.
- “3D gated viewing laser radar” is an example of a non-scanning laser ranging system that applies a pulsed laser and a high-speed gate camera.
- the first camera 124 is an arbitrary camera (for example, a still camera or a video camera) configured to take an image of an environment where the vehicle body 10 is placed. Specifically, the first camera 124 is attached to the front portion of the vehicle body 10 so as to photograph the front of the vehicle body 10 (see FIG. 2).
- The second camera 125 is an arbitrary camera (for example, a still camera or a video camera) configured to capture images of the interior of the vehicle body 10. Specifically, the second camera 125 is attached to the front part of the vehicle body 10 so as to capture the interior of the vehicle body 10.
- the control system 130 controls the operation of the vehicle body 10 and its components.
- the control system 130 may include a steering unit 131, a throttle unit 132, a brake unit 133, a navigation unit 134, and an obstacle avoidance system 135.
- the steering unit 131 is a mechanism configured to adjust the azimuth or direction of the vehicle body 10.
- the throttle unit 132 is a mechanism configured to control the operation speed and acceleration of the drive source 111 and, as a result, control the speed and acceleration of the vehicle body 10.
- the brake unit 133 is a mechanism configured to decelerate the vehicle body 10.
- the brake unit 133 may use friction to slow down the wheel.
- the brake unit 133 may be configured in a regenerative manner and convert the kinetic energy of the wheel into an electric current.
- the navigation unit 134 is configured to determine a driving route for the vehicle body 10.
- the navigation unit 134 may be configured to dynamically update the driving route during operation of the vehicle body 10.
- the navigation unit 134 may be configured to determine the driving route of the vehicle body 10 in combination with the GPS module 121 and one or more predetermined map data.
- the obstacle avoidance system 135 is configured to identify and evaluate obstacles in the environment in which the vehicle body 10 is placed, and to avoid or otherwise negotiate those obstacles.
- control system 130 may include a sensor fusion algorithm, a computer vision system, and the like.
- the sensor fusion algorithm is an algorithm that can be executed by the processor 22 of the automatic driving system 20, for example.
- the sensor fusion algorithm is configured to receive data from the sensor system 120 as input.
- the sensor fusion algorithm provides, for example, evaluations of individual objects and/or features in the environment in which the vehicle body 10 is placed, evaluations of particular situations, and/or evaluations of possible collisions based on those situations.
- these assessments are provided based on data from the sensor system 120.
- the computer vision system is configured to process and analyze images taken by the first camera 124 in order to identify objects and/or features in the environment in which the vehicle body 10 is placed, including, for example, lane information, traffic signals, and obstacles.
- the computer vision system may use object recognition algorithms, structure-from-motion (SFM) algorithms, video tracking, or other computer vision techniques.
- Peripheral device 140 is configured to allow vehicle body 10 to interact with external devices, other autonomous driving vehicles and / or passengers.
- the peripheral device 140 includes, for example, an operation unit 12, a notification unit 13, and a communication unit 141.
- the operation unit 12 is a system that receives various instructions such as a destination by being operated by a passenger.
- the operation unit 12 is, for example, a touch panel, but may be a voice input device.
- the notification unit 13 is a device that performs notification toward the outside of the vehicle body 10.
- the notification unit 13 comprises an electronic bulletin board, a liquid crystal monitor, or the like, and provides visual information to people outside the autonomous driving vehicle 1 by displaying various information.
- the notification unit 13 may instead be a device that provides various information by audio.
- the communication unit 141 is a system that can wirelessly communicate with other autonomous driving vehicles and the information terminal 101 via the network N.
- the communication unit 141 includes an antenna and a chip set for communicating with the network N.
- the chipset may communicate according to one or more types of generally available wireless communication (e.g., protocols), for example Bluetooth (R), communication protocols described in IEEE 802.11 (including any IEEE 802.11 revision), cellular technologies (GSM (registered trademark), CDMA, UMTS, EV-DO, WiMAX, LTE, etc.), Zigbee, dedicated short-range communication (DSRC), and radio frequency identification (RFID) communication.
- the automatic driving system 20 is a control unit that automatically drives the vehicle main body 10 by controlling the operation of each component of the vehicle main body 10.
- the automatic driving system 20 includes a memory 21 and a processor 22.
- the memory 21 may comprise one or more volatile storage components and / or one or more non-volatile storage components such as optical, magnetic and / or organic storage.
- the memory 21 may be incorporated in the processor 22 in whole or in part.
- the memory 21 may include a program that can be executed by the processor 22 for executing the automatic driving function according to the present embodiment.
- the memory 21 stores map data.
- the map data includes road map information, lane information indicating the number of lanes on each road, intersection information indicating the types of intersections, speed limit information indicating speed limits, and regulatory information indicating other traffic regulations. The map data may be stored in the memory 21 in advance. Alternatively, the latest map data acquired from the network N via the communication unit 141 may be stored in the memory 21 as an update each time.
- the memory 21 stores reference gesture information that is a criterion for determining a gesture for stopping a taxi (stop request gesture).
- the reference gesture information is data indicating a standard stop request gesture generally used in the country or region when stopping a taxi. This data includes the coordinate position of each feature point of the stop request gesture, an image of a characteristic posture of the stop request gesture, a moving image of the stop request gesture, and the like.
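The reference gesture information described above (feature-point coordinates keyed by country or region) can be modeled as a simple data structure. The following is an illustrative sketch only; the class name, field names, and example coordinates are assumptions and do not appear in the specification:

```python
from dataclasses import dataclass

@dataclass
class ReferenceGesture:
    """Reference data for one stop-request gesture (illustrative)."""
    name: str              # e.g. "raise one hand"
    feature_points: list   # coordinate positions of gesture feature points
    region: str = "JP"     # country/region this gesture applies to

# Hypothetical lookup table stored in the memory 21, customized per
# shipping country/region as described in the text.
REFERENCE_GESTURES = {
    "JP": ReferenceGesture("raise one hand", [(0.5, 0.9)], "JP"),
    "US": ReferenceGesture("wave with hand open", [(0.5, 0.85), (0.6, 0.9)], "US"),
}

def lookup_reference(region: str) -> ReferenceGesture:
    # Fall back to the Japanese gesture when the region is unknown.
    return REFERENCE_GESTURES.get(region, REFERENCE_GESTURES["JP"])
```

The fallback mirrors the per-region customization of the memory 21 described above, but the default choice is purely illustrative.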
- gestures such as “raise one hand” and “wave the raised hand” correspond to stop request gestures.
- stop request gestures differ in other countries and regions; for example, “wave with the hand open”, “hold up a clenched hand”, “wave with the index and middle fingers raised”, “raise only the thumb”, and other gestures can be used as the stop request gesture.
- the reference gesture information may be customized and stored in the memory 21 so as to correspond to the country and region when the shipping country and the shipping region of the autonomous driving vehicle 1 are determined.
- the memory 21 stores second reference gesture information that is a determination criterion for a cancel gesture (second gesture).
- the second reference gesture information is data indicating a reference cancel gesture for canceling acceptance when a gesture for stopping a taxi is accepted by the autonomous driving vehicle 1. This data includes the coordinate position of each feature point of the cancel gesture, an image of a characteristic posture of the cancel gesture, a moving image of the cancel gesture, and the like.
- the cancel gesture may be any gesture different from the stop request gesture; examples include “make an × with both arms”, “shake the head left and right”, “turn one's back”, and “move away from the vehicle”. Further, remaining motionless for a certain period of time may be treated as a cancel gesture, so that a person depicted on a street signboard is not recognized as an actual person.
- the processor 22 may comprise one or more general purpose processors and / or one or more dedicated processors (eg, image processor, digital signal processor, etc.). Where the processor 22 includes multiple processors, such multiple processors may work independently or in concert.
- the processor 22 automatically drives the vehicle body 10 by reading and executing the program stored in the memory 21. Specifically, based on the various detection results input from the sensor system 120, the various information input from the peripheral device 140, and the various information stored in the memory 21, the processor 22 controls the control system 130 so that the vehicle body 10 is automatically driven to the destination while observing traffic regulations and avoiding other objects (other vehicles, buildings, people, animals, etc.). Well-known automatic driving control can be used for the automatic driving control by the processor 22.
- the processor 22 also executes a stopping method of the automatic driving vehicle 1 when the automatic driving vehicle 1 is automatically operated.
- in step S1, the processor 22 of the automatic driving system 20 determines whether there are passengers in the vehicle body 10 during automatic driving. Specifically, the processor 22 determines whether there is a passenger in the vehicle body 10 based on the image of the vehicle interior captured by the second camera 125. That is, the processor 22 and the second camera 125 constitute a determination unit that determines whether there are passengers in the vehicle body 10.
- when it is determined that there are no passengers in the vehicle body 10 (step S1; NO), the processor 22 shifts to the acceptance mode of step S2, and when it is determined that there are passengers in the vehicle body 10 (step S1; YES), the processor 22 shifts to the rejection mode of step S3.
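The branch in step S1 can be sketched as a small mode selector, assuming a boolean passenger check as input (the function name is illustrative, not from the specification):

```python
def select_mode(has_passengers: bool) -> str:
    """Step S1: choose the acceptance mode (step S2) when the cabin is
    empty, and the rejection mode (step S3) when passengers are already
    on board. Illustrative sketch only."""
    return "rejection" if has_passengers else "acceptance"
```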
- FIG. 5 is a flowchart showing the flow of the acceptance mode according to the embodiment.
- the acceptance mode is a mode in which a passenger can be accepted in the vehicle main body 10.
- in step S20 in the acceptance mode, the processor 22 determines whether the number of lanes in the same traveling direction as the lane in which the autonomous driving vehicle 1 is currently traveling is two or more. Specifically, the processor 22 detects the number of lanes from an image of the area in front of the vehicle body 10 captured by the first camera 124. That is, the processor 22 and the first camera 124 constitute a lane detection unit that detects the number of lanes in the same traveling direction on the road. In addition, the processor 22 can also detect the number of lanes based on the current position detected by the GPS module 121 and the map data.
- when the processor 22 determines in step S20 that the number of lanes is two or more (step S20; YES), the processor 22 proceeds to step S21.
- in step S21, the processor 22 causes the vehicle body 10 to travel in the lane closest to the sidewalk among the two or more lanes.
- the sidewalk is an area adjacent to the roadway where pedestrians can be present, and may include roadside strips, road shoulders, and the like. This allows the vehicle to stop smoothly when a prospective customer G (see FIG. 7 etc.) is found.
- FIG. 7 is a schematic diagram illustrating an example of an image captured by the first camera 124 when the autonomous driving vehicle 1 according to the embodiment travels.
- FIG. 7 shows a case where the number of lanes in the same traveling direction is two.
- the autonomous driving vehicle 1 is traveling on the lane L1 closest to the sidewalk W among the two lanes L1 and L2, that is, the left lane L1.
- in step S22, the processor 22 sets the detection range S for detecting gestures within the shooting range R of the first camera 124 as a range including the left side of the road. Specifically, as illustrated in FIG. 7, the processor 22 sets as the detection range S a range that does not include the right lane L2 and includes at least the sidewalk W to the left of the lane L1. This makes it possible to detect a stop request gesture of a person on the left side of the road.
- when the processor 22 determines in step S20 that the number of lanes is not two or more (step S20; NO), the processor 22 proceeds to step S23.
- FIG. 8 is a schematic diagram illustrating an example of an image captured by the first camera 124 when the autonomous driving vehicle 1 according to the embodiment travels.
- FIG. 8 shows a case where the number of lanes in the same traveling direction is 1. Specifically, in FIG. 8, the autonomous driving vehicle 1 is traveling on a one-way, one-lane road.
- in step S23, the processor 22 sets the detection range S within the shooting range R of the first camera 124 as a range including both the right side and the left side of the road. Specifically, the processor 22 sets as the detection range S a range including at least the sidewalks W1 and W2 on the left and right sides of the road, as shown in FIG. 8. This makes it possible to detect stop request gestures of people present on both sides of the road.
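Steps S20 to S23 adjust the detection range S according to the lane count. A minimal sketch, assuming left-hand traffic and image coordinates normalized so that the left edge of the shooting range R is 0.0 and the right edge is 1.0 (all names and the 0.5 boundary are illustrative assumptions, not values from the specification):

```python
def set_detection_range(lane_count: int, image_width: float = 1.0):
    """Return (left, right) horizontal bounds of the detection range S
    within the shooting range R of the first camera. Illustrative only."""
    if lane_count >= 2:
        # Two or more lanes: watch only the sidewalk side (the left, for
        # left-hand traffic), excluding the right lane (steps S21-S22).
        return (0.0, image_width * 0.5)
    # Single lane: watch both sides of the road (step S23).
    return (0.0, image_width)
```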
- in step S24, the processor 22 determines whether a person making the stop request gesture is detected within the detection range S. Specifically, the processor 22 detects a person making the stop request gesture by comparing the reference gesture information stored in the memory 21 with the movements of people present in the detection range S. That is, the first camera 124, the memory 21, and the processor 22 constitute a gesture detection unit that detects the stop request gesture. Any matching method that can determine whether the motion of a person imaged by the first camera 124 is a stop request gesture may be adopted for this comparison; pattern matching is one example. If the processor 22 detects a person making the stop request gesture (step S24; YES), the processor 22 proceeds to step S25; if not (step S24; NO), the processor 22 continues monitoring.
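The specification does not fix the matching algorithm for step S24; one simple possibility is a feature-point distance comparison against the reference gesture information, sketched below (the function name and tolerance value are illustrative assumptions):

```python
import math

def matches_reference(observed, reference, tolerance=0.1):
    """Compare observed feature-point coordinates with reference gesture
    data; a match within `tolerance` counts as a stop-request gesture
    (step S24). Illustrative sketch only."""
    if not observed or len(observed) != len(reference):
        return False
    # Worst-case Euclidean distance over corresponding feature points.
    error = max(math.dist(o, r) for o, r in zip(observed, reference))
    return error <= tolerance
```

A production system would more likely match over a time series of poses; this sketch shows only the single-frame comparison.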
- in step S25, the processor 22 recognizes the person determined to be making the stop request gesture as a prospective customer G and tracks that person in the images captured by the first camera 124. During tracking, the attitude of the first camera 124 may be controlled so that the prospective customer G always stays within the shooting range.
- in step S26, the processor 22 controls the notification unit 13 so that the notification unit 13 performs notification indicating that the prospective customer G has been detected. This shows the prospective customer G that the stop request gesture has been conveyed to the autonomous driving vehicle 1.
- in step S27, the processor 22 detects a cancel gesture from the prospective customer G by comparing the movements of the prospective customer G tracked in the captured images with the second reference gesture information stored in the memory 21. Any matching method that can determine whether the motion of the prospective customer G captured by the first camera 124 is a cancel gesture may be adopted for this comparison; pattern matching is one example. If no cancel gesture from the prospective customer G is detected (step S27; NO), the processor 22 proceeds to step S28; if a cancel gesture from the prospective customer G is detected (step S27; YES), the processor 22 proceeds to step S29.
- in step S28, the processor 22 controls the operation of each component of the vehicle body 10 to automatically drive the vehicle body 10 and stop it near the prospective customer G.
- the prospective customer G can then board the autonomous driving vehicle 1.
- thereafter, the processor 22 ends the acceptance mode and resumes automatic driving of the vehicle body 10.
- when the prospective customer G is on the opposite side of the road, the processor 22 causes the vehicle body 10 to make a U-turn and, after moving to the opposite lane L4, stops the vehicle body 10 near the prospective customer G.
- in step S29, the processor 22 cancels the recognition of the person as the prospective customer G, ends the acceptance mode, and resumes automatic driving of the vehicle body 10.
- the notification timing is not limited to the above; for example, the notification may be performed after the autonomous driving vehicle 1 stops near the prospective customer G.
- above, the case where the autonomous driving vehicle 1 stops near the prospective customer G after determining the presence or absence of the cancel gesture is illustrated; however, the timing for determining the presence or absence of the cancel gesture is not limited to this. For example, the presence or absence of a cancel gesture may be determined after the autonomous driving vehicle 1 stops near the prospective customer G.
- FIG. 6 is a flowchart showing the flow of the rejection mode according to the embodiment.
- the rejection mode is a mode in which acceptance of the prospective customer G is rejected because there are already passengers in the vehicle body 10.
- in step S30 of the rejection mode, the processor 22 determines whether the number of lanes in the same traveling direction as the lane in which the autonomous driving vehicle 1 is currently traveling is two or more. Specifically, the processor 22 detects the number of lanes from an image of the area in front of the vehicle body 10 captured by the first camera 124.
- when the processor 22 determines in step S30 that the number of lanes is two or more (step S30; YES), the processor 22 proceeds to step S31.
- in step S31, the processor 22 sets the range including the left side of the road within the shooting range R of the first camera 124 as the detection range S, as in step S22.
- when the processor 22 determines in step S30 that the number of lanes is not two or more (step S30; NO), the processor 22 proceeds to step S32.
- in step S32, the processor 22 sets the range including both the right side and the left side of the road within the shooting range R of the first camera 124 as the detection range S, as in step S23.
- in step S33, the processor 22 determines whether a person making the stop request gesture is detected within the detection range S. If such a person is detected (step S33; YES), the processor 22 proceeds to step S34; if not (step S33; NO), the processor 22 continues monitoring.
- in step S34, the processor 22 controls the communication unit 141 to output a vehicle allocation request for the place where the prospective customer G exists.
- the vehicle allocation request includes the time when the prospective customer G is detected and the position data of the place where the prospective customer G exists.
- when the information terminal 101 of the vehicle allocation system 100 receives this vehicle allocation request via the network N, the information terminal 101 outputs a vehicle allocation instruction for the place where the prospective customer G exists to another autonomous driving vehicle 1.
- the processor 22 may control the notification unit 13 to cause the notification unit 13 to perform notification indicating that the vehicle allocation request has been output. Thereby, it is possible to indicate to the prospective customer G that the stop request gesture has been transmitted to the other autonomous driving vehicle 1.
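The vehicle allocation request of step S34 carries the time at which the prospective customer G was detected and the position of the place where the prospective customer G exists. A sketch of such a payload follows; the field names and JSON encoding are hypothetical, since the specification does not define a message format:

```python
import json
import time

def build_dispatch_request(lat: float, lon: float) -> str:
    """Assemble the vehicle allocation request sent from the
    communication unit 141 to the dispatch system 100 (step S34).
    Field names are illustrative assumptions."""
    payload = {
        "type": "dispatch_request",
        "detected_at": time.time(),  # time the prospective customer was detected
        "position": {"lat": lat, "lon": lon},
    }
    return json.dumps(payload)
```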
- the processor 22 ends the refusal mode and resumes automatic operation of the vehicle body 10.
- as described above, the autonomous driving vehicle 1 according to the present embodiment includes the vehicle body 10 and a gesture detection unit that detects a gesture (stop request gesture) by which a person present around the road on which the vehicle body 10 is traveling attempts to stop the vehicle body 10.
- the gesture detection unit comprises the first camera 124, the memory 21, and the processor 22.
- the determination unit, which determines whether there are passengers in the vehicle body 10, comprises the processor 22 and the second camera 125.
- the control unit that performs control for automatically driving the vehicle body 10 is the automatic driving system 20.
- when the determination unit determines that there are no passengers in the vehicle body 10 and the gesture detection unit detects a person making the gesture, the control unit stops the vehicle body 10 near that person.
- in the method for stopping the autonomous driving vehicle 1 according to the present embodiment, when it is determined that there are no passengers in the vehicle body 10 of the autonomous driving vehicle 1 and a gesture for stopping the vehicle body 10 (stop request gesture) made by a person present around the road on which the vehicle body 10 is traveling is detected, the vehicle body 10 is stopped near that person.
- the program according to the present embodiment causes a computer to execute this method for stopping the autonomous driving vehicle 1.
- with the autonomous driving vehicle 1, when it is determined that there are no passengers in the vehicle body 10 and a stop request gesture for stopping the vehicle body 10 is detected, the vehicle body 10 is stopped near the prospective customer G who made the stop request gesture; when there are passengers in the vehicle body 10, the autonomous driving vehicle 1 continues traveling without stopping. That is, the autonomous driving vehicle 1 is prevented from stopping to pick up another prospective customer G even though passengers are already on board. Prospective customers G can thereby board the autonomous driving vehicle 1 efficiently.
- the gesture detection unit may detect the stop request gesture of a person present on only one of the right side and the left side of the road.
- since the gesture detection unit can set only one of the right side and the left side of the road as the detection range S, the detection process can be sped up.
- in the case of left-hand traffic, for example, the detection process can be sped up by setting only the left side of the road as the detection range S of the gesture detection unit.
- the autonomous driving vehicle 1 may also include a lane detection unit (processor 22 and first camera 124) that detects the number of lanes in the same traveling direction on the road. When the lane detection unit detects that the number of lanes is one, the gesture detection unit detects stop request gestures of people present on both the right side and the left side of the road; when the lane detection unit detects that the number of lanes is two or more, the gesture detection unit detects the stop request gesture of a person present on one of the right side and the left side of the road.
- the detection range S of the gesture detection unit can be adjusted according to the number of lanes.
- when the number of lanes is one, the width of the entire road is relatively narrow; more specifically, the road is one-way or has one lane in each direction.
- in this case, the autonomous driving vehicle 1 can serve prospective customers G present on both the left and right of the road from its own lane. Further, when the road has one lane in each direction, the autonomous driving vehicle 1 can make a U-turn to serve a prospective customer G on the opposite-lane side.
- when the number of lanes is two or more, the width of the entire road is relatively wide. In that case, it is not desirable for the autonomous driving vehicle 1 to serve prospective customers G present on both the right side and the left side of the road, because moving to another lane may hinder the traffic of other vehicles.
- since the detection range of the gesture detection unit can be adjusted according to the number of lanes, the detection range can be made appropriate for the situation.
- when the lane detection unit detects that the number of lanes is two or more and the determination unit determines that there are no passengers in the vehicle body, the control unit may cause the vehicle body to travel in the lane L1 closest to the sidewalk W among the two or more lanes.
- prospective customers G are usually present on the sidewalk W. Therefore, as described above, when it is determined that there are no passengers in the vehicle body 10, having the vehicle body 10 travel in the lane L1 closest to the sidewalk W among the two or more lanes L1 and L2 keeps the vehicle body 10 in the lane nearest to prospective customers G. Access to a prospective customer G can thereby be performed smoothly.
- the gesture detection unit may further detect a second gesture (cancel gesture) by which the person (prospective customer G) attempts to cancel the stopping of the vehicle body 10. When the gesture detection unit detects a cancel gesture from a person after detecting that person making the stop request gesture, the control unit causes the vehicle body 10 to continue traveling.
- thus, even after a prospective customer G has been detected, if a cancel gesture from the prospective customer G is detected, the vehicle body 10 can continue traveling without stopping.
- the autonomous driving vehicle 1 may be provided with the notification unit 13, which performs notification toward the outside of the vehicle body 10.
- when a person making the stop request gesture is detected, the notification unit 13 may notify that the prospective customer G has been detected.
- in the present embodiment, the notification unit 13 is illustrated as an electronic bulletin board, a liquid crystal monitor, or the like attached to the roof of the vehicle body 10.
- a hazard lamp provided on the vehicle body 10 can also be used as the notification unit; in this case, the prospective customer G can be notified by flashing the hazard lamp.
- since the notification unit 13 performs notification indicating that the prospective customer has been detected, it is possible to show the prospective customer G that the stop request gesture has been conveyed to the autonomous driving vehicle 1.
- the autonomous driving vehicle 1 may include a communication unit 141 that communicates with the vehicle allocation system 100, which dispatches another autonomous driving vehicle 1 to a predetermined place. When the determination unit determines that there are passengers in the vehicle body 10 and the gesture detection unit detects a person making the stop request gesture, the control unit may output a vehicle allocation request for the location of that person from the communication unit 141 to the vehicle allocation system 100.
- that is, even in a state where the prospective customer G cannot be accepted because the determination unit determines that there are passengers in the vehicle body 10, when the gesture detection unit detects the prospective customer G, a vehicle allocation request for the location of the prospective customer G is output from the communication unit 141 to the vehicle allocation system 100.
- another autonomous driving vehicle 1 can be allocated to the prospective customer G by the vehicle allocation system 100.
- in the present embodiment, the second camera 125 constitutes a part of the determination unit that determines whether there are passengers in the vehicle body 10; however, a sensor other than a camera can be used as a part of the determination unit as long as the presence or absence of passengers in the vehicle body 10 can be determined.
- examples of such sensors include a weight sensor and a human sensor.
- the weight sensor may be disposed within the seating surface of the seat 11; based on its detection result, the processor 22 can determine the presence or absence of a passenger.
- the human sensor may be provided in the vehicle main body 10 so that the vehicle interior space of the vehicle main body 10 is a detection range. Based on the detection result of the human sensor, the processor 22 can determine the presence or absence of a passenger.
- examples of the human sensor include a temperature sensor, an ultrasonic sensor, and an infrared sensor.
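Passenger determination from a seat weight sensor, as an alternative determination unit, could be a simple threshold test; the threshold value below is an assumption for illustration, not a value from the specification:

```python
def passengers_present(seat_weights_kg, threshold_kg=20.0):
    """Determine the presence of passengers from per-seat weight
    readings (alternative determination unit). A reading at or above
    the threshold is treated as an occupied seat. Illustrative only."""
    return any(w >= threshold_kg for w in seat_weights_kg)
```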
- in the present embodiment, the case where the first camera 124 constitutes a part of the gesture detection unit that detects a person's gesture is illustrated; however, the radar unit or the lidar system may also form part of the gesture detection unit.
- the processor 22 may detect a U-turn prohibition sign from an image of the area in front of the vehicle body 10 captured by the first camera 124. That is, the processor 22 and the first camera 124 can constitute a U-turn prohibition detection unit that detects whether U-turns are prohibited on the road.
- when a U-turn prohibition is detected, the processor 22 may detect the stop request gesture only of a person present on one of the right side and the left side of the road.
- in this way, the detection range S can be adjusted depending on whether U-turns are prohibited. Specifically, the autonomous driving vehicle 1 cannot move to the opposite lane in a place where U-turns are prohibited; in other words, even if a prospective customer G is present on the opposite-lane side, the autonomous driving vehicle 1 cannot serve that customer. In such a case, the detection range S can be set only on the side of the lane in which the autonomous driving vehicle 1 is traveling.
- in the present embodiment, the case where the presence or absence of the cancel gesture is determined before the autonomous driving vehicle 1 stops is illustrated.
- however, the presence or absence of a cancel gesture may be determined after the autonomous driving vehicle 1 stops. In this case, the prospective customer G not boarding the autonomous driving vehicle 1 even after a predetermined time has elapsed since stopping can be treated as one form of cancel gesture.
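Treating failure to board within a predetermined time as an implicit cancel gesture can be expressed as a timeout check; the function name and the 60-second default are illustrative assumptions:

```python
def boarding_cancelled(stop_time: float, now: float,
                       boarded: bool, timeout_s: float = 60.0) -> bool:
    """After the vehicle stops at `stop_time`, treat the prospective
    customer's failure to board within `timeout_s` seconds as an
    implicit cancel gesture. Illustrative sketch only."""
    return (not boarded) and (now - stop_time) >= timeout_s
```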
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- the software that realizes the autonomous driving vehicle 1 and the like of each of the above embodiments is the following program.
- this program causes a computer to execute a method for stopping an autonomous driving vehicle in which, when it is determined that there are no passengers in the vehicle body of the autonomous driving vehicle and a gesture by which a person present around the road on which the vehicle body is traveling attempts to stop the vehicle body is detected, the vehicle body is stopped near the person who made the gesture.
- the autonomous driving vehicle 1 and the method for stopping the autonomous driving vehicle 1 according to one or more aspects of the present invention have been described above based on the embodiment; however, the present invention is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to the present embodiment, and forms constructed by combining constituent elements of different embodiments, may also be included within the scope of one or more aspects of the present invention, as long as they do not depart from the gist of the present invention.
- This disclosure is useful as an autonomous driving vehicle that can be used as a taxi.
Description
本発明者は、自動運転車両をタクシーとして採用することについて検討した。ここで、日本政府や米国運輸省道路交通安全局が定義したレベル4の自動運転車両は、完全自動運転であるために、乗客は運転操作に全く関与しない。つまり、レベル4の自動運転車両をタクシーに採用した場合には、自動運転車両自体で、道路上に見込み客が存在するか否かを判断し、存在した場合にはその見込み客の近くで自動運転車両が停車する必要がある。ところが、見込み客を発見するたびに自動運転車両が停車してしまうと、すでに自動運転車両内に乗客がいる場合には非効率である。そこで、以降においては、この問題を解消することができる技術について説明する。
図1は、本実施の形態に係る配車システム100の概略構成を示す模式図である。配車システム100は、タクシーとして使用する複数の自動運転車両1を、所定の場所に配車するシステムである。具体的には、配車システム100は、例えばパーソナルコンピュータ、タブレット端末などの情報端末101を備えている。この情報端末101は、各自動運転車両1とネットワークNを介して通信可能となっている。情報端末101からは、各自動運転車両1に対して、所定の場所への配車指示が出力される。各自動運転車両1は、配車指示に基づいて、所定の場所に向けて自動で走行する。なお、配車システム100には、運転者の運転操作を必要とする自動車が含まれていてもよい。この場合、自動車が受信した配車指示に基づいて、運転者が自動車を運転し所定の場所へ向かうものとする。
Next, the method by which the autonomous driving system 20 stops the autonomous driving vehicle 1 is described. FIGS. 4, 5, and 6 are flowcharts showing the flow of the stopping method. Here, the autonomous driving vehicle 1 is assumed, as an example, to be used in Japan: traffic keeps to the left, and the stop-request gesture is "raising one hand."
The autonomous driving vehicle 1 according to the present embodiment includes: a vehicle body 10; a gesture detection unit (first camera 124, memory 21, and processor 22) that detects a gesture (stop-request gesture) by which a person near the road on which the vehicle body 10 is traveling tries to stop the vehicle body 10; a determination unit (processor 22 and second camera 125) that determines whether a passenger is in the vehicle body 10; and a control unit (autonomous driving system 20) that autonomously drives the vehicle body 10. When the determination unit determines that no passenger is in the vehicle body 10 and the gesture detection unit detects a person making the gesture, the control unit stops the vehicle body 10 near that person.
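One way to read the component split above (gesture detection from an exterior camera, occupancy determination from an interior camera, and a controller that combines both) is as the following wiring sketch. The camera inputs are stubbed as dictionaries, and all class and method names are assumptions, not the patent's own API.

```python
class GestureDetector:
    """Stand-in for the first camera + processor recognizing a raised hand."""

    def detect_hailer(self, roadside_frame):
        return roadside_frame.get("raised_hand", False)


class OccupancyJudge:
    """Stand-in for the second camera + processor checking the cabin."""

    def has_passenger(self, cabin_frame):
        return cabin_frame.get("person_on_seat", False)


class DrivingController:
    """Combines both signals: stop near the hailer only when the cabin is empty."""

    def __init__(self, detector, judge):
        self.detector = detector
        self.judge = judge
        self.state = "driving"

    def step(self, roadside_frame, cabin_frame):
        if (not self.judge.has_passenger(cabin_frame)
                and self.detector.detect_hailer(roadside_frame)):
            self.state = "stopping"  # pull over near the detected person
        return self.state
```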
The above embodiment has been described as an illustration of the technology disclosed in the present application. The technology of the present embodiment, however, is not limited to this, and is also applicable to embodiments in which modifications, substitutions, additions, omissions, and the like are made as appropriate. The constituent elements described in the above embodiments may also be combined to form new embodiments.
10 Vehicle body
11 Seat
12 Operation unit
13 Notification unit
20 Autonomous driving system
21 Memory
22 Processor
100 Dispatch system
101 Information terminal
110 Propulsion system
111 Drive source
112 Transmission
120 Sensor system
121 GPS module
122 Inertial measurement unit
123 Characteristics unit
124 First camera
125 Second camera
130 Control system
131 Steering unit
132 Throttle unit
133 Brake unit
134 Navigation unit
135 Obstacle avoidance system
140 Peripheral devices
141 Communication unit
G Prospective customer
L1, L2, L3, L4 Lanes
N Network
R Imaging range
S Detection range
W, W1, W2 Sidewalks
Claims (10)
- An autonomous driving vehicle comprising:
a vehicle body;
a gesture detection unit that detects a gesture, made by a person present near a road on which the vehicle body is traveling, to stop the vehicle body;
a determination unit that determines whether a passenger is in the vehicle body; and
a control unit that autonomously drives the vehicle body,
wherein, when the determination unit determines that no passenger is in the vehicle body and the gesture detection unit detects a person making the gesture, the control unit stops the vehicle body near that person.
- The autonomous driving vehicle according to claim 1, wherein the gesture detection unit detects the gesture of a person present on one of the right side and the left side of the road.
- The autonomous driving vehicle according to claim 1, further comprising a lane detection unit that detects the number of lanes in the same travel direction on the road,
wherein the gesture detection unit
detects the gesture of persons present on both the right side and the left side of the road when the lane detection unit detects one lane, and
detects the gesture of a person present on one of the right side and the left side of the road when the lane detection unit detects two or more lanes.
- The autonomous driving vehicle according to claim 3, wherein, when the lane detection unit detects two or more lanes and the determination unit determines that no passenger is in the vehicle body, the control unit causes the vehicle body to travel in the lane closest to a sidewalk among the two or more lanes.
- The autonomous driving vehicle according to claim 1, further comprising a U-turn prohibition detection unit that detects whether U-turns are prohibited on the road,
wherein the gesture detection unit detects the gesture of a person present on one of the right side and the left side of the road when the U-turn prohibition detection unit detects that U-turns are prohibited on the road.
- The autonomous driving vehicle according to any one of claims 1 to 5, wherein the gesture detection unit detects a second gesture by which the person cancels the stopping of the vehicle body, and
the control unit keeps the vehicle body traveling when, after detecting a person making the gesture, the gesture detection unit detects the second gesture from that person.
- The autonomous driving vehicle according to any one of claims 1 to 6, further comprising a notification unit that issues notifications toward the outside of the vehicle body,
wherein, when the determination unit determines that no passenger is in the vehicle body and the gesture detection unit detects a person making the gesture, the control unit causes the notification unit to issue a notification indicating that the person has been detected.
- The autonomous driving vehicle according to any one of claims 1 to 7, further comprising a communication unit that communicates with a dispatch system that dispatches another autonomous driving vehicle to a designated location,
wherein, when the determination unit determines that a passenger is in the vehicle body and the gesture detection unit detects a person making the gesture, the control unit causes the communication unit to output, to the dispatch system, a dispatch request for the location of that person.
- A method of stopping an autonomous driving vehicle, the method comprising:
when it is determined that no passenger is in a vehicle body of the autonomous driving vehicle, and a gesture made by a person present near a road on which the vehicle body is traveling to stop the vehicle body is detected, stopping the vehicle body near the person who made the gesture.
- A program causing a computer to execute the method of stopping an autonomous driving vehicle according to claim 9.
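The claims on lane count and U-turn prohibition describe which side(s) of the road the gesture detection unit watches: both sides when there is a single lane per direction, one side only when there are two or more lanes or U-turns are prohibited. A minimal sketch of that side-selection rule follows; the function and side names are my own, and left-hand traffic is assumed as in the Japan example, so the single watched side is the curb (left) side.

```python
def detection_sides(lane_count: int, u_turn_prohibited: bool) -> set:
    """Which road sides to watch for hailing gestures (left-hand traffic assumed)."""
    if u_turn_prohibited or lane_count >= 2:
        # The vehicle cannot easily reach a hailer across the road,
        # so it watches only the curb side.
        return {"left"}
    return {"left", "right"}
```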
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780051522.6A CN109643494B (zh) | 2017-04-14 | 2017-12-14 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and recording medium |
EP17905521.5A EP3611712A4 (en) | 2017-04-14 | 2017-12-14 | AUTONOMOUS VEHICLE, METHOD FOR STOPPING AUTONOMOUS VEHICLE AND PROGRAM |
AU2017408956A AU2017408956A1 (en) | 2017-04-14 | 2017-12-14 | Autonomous vehicle, autonomous vehicle stopping method, and program |
CA3037470A CA3037470A1 (en) | 2017-04-14 | 2017-12-14 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and program |
US16/299,847 US11067986B2 (en) | 2017-04-14 | 2019-03-12 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017080590A JP6862257B6 (ja) | 2017-04-14 | 2017-04-14 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and program |
JP2017-080590 | 2017-04-14 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/299,847 Continuation US11067986B2 (en) | 2017-04-14 | 2019-03-12 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018189952A1 true WO2018189952A1 (ja) | 2018-10-18 |
Family
ID=63792353
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/044809 WO2018189952A1 (ja) | 2017-04-14 | 2017-12-14 | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and program |
Country Status (7)
Country | Link |
---|---|
US (1) | US11067986B2 (ja) |
EP (1) | EP3611712A4 (ja) |
JP (1) | JP6862257B6 (ja) |
CN (1) | CN109643494B (ja) |
AU (1) | AU2017408956A1 (ja) |
CA (1) | CA3037470A1 (ja) |
WO (1) | WO2018189952A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10395457B2 (en) * | 2017-08-10 | 2019-08-27 | GM Global Technology Operations LLC | User recognition system and methods for autonomous vehicles |
WO2020157531A1 (ja) * | 2019-01-29 | 2020-08-06 | Nissan Motor Co., Ltd. | Boarding permission determination device and boarding permission determination method |
US11320830B2 (en) | 2019-10-28 | 2022-05-03 | Deere & Company | Probabilistic decision support for obstacle detection and classification in a working area |
WO2022269303A1 (ja) | 2021-06-23 | 2022-12-29 | Nissan Motor Co., Ltd. | Vehicle control device, vehicle control method, vehicle control program, and vehicle control system |
JP2023034260A (ja) * | 2021-08-30 | 2023-03-13 | Toshiba Lifestyle Products & Services Corp. | Air conditioner |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11300963B1 (en) | 2017-08-18 | 2022-04-12 | Amazon Technologies, Inc. | Robot movement constraint system |
US11422565B1 (en) * | 2017-08-18 | 2022-08-23 | Amazon Technologies, Inc. | System for robot movement that is informed by cultural conventions |
JP6617126B2 (ja) * | 2017-09-15 | 2019-12-11 | Honda Motor Co., Ltd. | Travel control system and vehicle control method |
DE102017220116A1 (de) * | 2017-11-13 | 2019-05-16 | Ford Global Technologies, Llc | Method and device for enabling a quick stop of an autonomously driving vehicle |
US11880800B2 (en) | 2018-06-30 | 2024-01-23 | Chian Chiu Li | Systems and methods for implementing hailing request and shipping request |
US11194399B2 (en) * | 2018-06-30 | 2021-12-07 | Chian Chiu Li | Systems and methods for implementing hailing request |
CN109733403A (zh) * | 2018-12-29 | 2019-05-10 | Baidu Online Network Technology (Beijing) Co., Ltd. | Parking method and apparatus for an autonomous vehicle, server, and computer-readable medium |
US11094197B2 (en) * | 2019-02-26 | 2021-08-17 | Toyota Research Institute, Inc. | System and method for switching from a curbside lane to a lane-of-interest |
JP7077255B2 (ja) * | 2019-03-14 | 2022-05-30 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program |
JP7237684B2 (ja) * | 2019-03-27 | 2023-03-13 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program |
CN112660148B (zh) * | 2019-09-30 | 2022-09-02 | Apollo Intelligent Technology (Beijing) Co., Ltd. | Method, apparatus, device, and medium for determining a U-turn path for a vehicle |
US11310269B2 (en) * | 2019-10-15 | 2022-04-19 | Baidu Usa Llc | Methods to detect spoofing attacks on automated driving systems |
CN110750159B (zh) * | 2019-10-22 | 2023-09-08 | Shenzhen SenseTime Technology Co., Ltd. | Gesture control method and apparatus |
CN111147825A (zh) * | 2020-01-15 | 2020-05-12 | HIT Robot (Yueyang) Military-Civilian Integration Research Institute | Unmanned retail vehicle and hail-to-stop control method therefor |
US11873000B2 (en) * | 2020-02-18 | 2024-01-16 | Toyota Motor North America, Inc. | Gesture detection for transport control |
JP2022047081A (ja) * | 2020-09-11 | 2022-03-24 | Toyota Motor Corporation | Information processing device, information processing system, and information processing method |
KR102449330B1 (ko) * | 2020-11-24 | 2022-10-05 | Ulsan National Institute of Science and Technology | Method and apparatus for controlling the driving mode of an autonomous mobility vehicle |
CN112477886B (zh) * | 2020-12-03 | 2022-03-01 | Nanjing Lingxing Technology Co., Ltd. | Control method and apparatus for an unmanned vehicle, electronic device, and storage medium |
JP7355050B2 (ja) | 2021-03-04 | 2023-10-03 | Toyota Motor Corporation | Vehicle control device, vehicle control method, and program |
US20220309521A1 (en) * | 2021-03-24 | 2022-09-29 | Here Global B.V. | Computing a vehicle interest index |
US20230081186A1 (en) * | 2021-09-14 | 2023-03-16 | Gm Cruise Holdings Llc | Autonomous vehicle supervised stops |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012014482A (ja) * | 2010-07-01 | 2012-01-19 | Panasonic Corp | Determination device for taxis |
JP2015191641A (ja) * | 2014-03-31 | 2015-11-02 | NEC Embedded Products, Ltd. | Monitoring device, monitoring system, monitoring method, and program |
JP2015191264A (ja) * | 2014-03-27 | 2015-11-02 | Nikon Corp | Autonomous traveling vehicle |
JP2016115364A (ja) * | 2014-04-01 | 2016-06-23 | みこらった株式会社 | Vehicle dispatch management system and vehicle dispatch management server |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103182983A (zh) * | 2011-12-27 | 2013-07-03 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Passenger hailing prompt system and method |
JP6150258B2 (ja) | 2014-01-15 | 2017-06-21 | みこらった株式会社 | Self-driving car |
EP2982562A1 (en) * | 2014-08-08 | 2016-02-10 | Nokia Technologies OY | Vehicle control |
TWI549069B (zh) * | 2014-12-15 | 2016-09-11 | Sheng Hui Meng | Method and device for passenger barge |
US20160377448A1 (en) * | 2015-06-29 | 2016-12-29 | Globalfoundries Inc. | Predicting and alerting user to navigation options and predicting user intentions |
US10150448B2 (en) * | 2015-09-18 | 2018-12-11 | Ford Global Technologies. Llc | Autonomous vehicle unauthorized passenger or object detection |
WO2017155740A1 (en) * | 2016-03-08 | 2017-09-14 | Pcms Holdings, Inc. | System and method for automated recognition of a transportation customer |
US10762358B2 (en) * | 2016-07-20 | 2020-09-01 | Ford Global Technologies, Llc | Rear camera lane detection |
US10095315B2 (en) * | 2016-08-19 | 2018-10-09 | Otis Elevator Company | System and method for distant gesture-based control using a network of sensors across the building |
DE102016217770A1 (de) * | 2016-09-16 | 2018-03-22 | Audi Ag | Method for operating a motor vehicle |
-
2017
- 2017-04-14 JP JP2017080590A patent/JP6862257B6/ja active Active
- 2017-12-14 CA CA3037470A patent/CA3037470A1/en not_active Abandoned
- 2017-12-14 CN CN201780051522.6A patent/CN109643494B/zh active Active
- 2017-12-14 AU AU2017408956A patent/AU2017408956A1/en not_active Abandoned
- 2017-12-14 EP EP17905521.5A patent/EP3611712A4/en not_active Withdrawn
- 2017-12-14 WO PCT/JP2017/044809 patent/WO2018189952A1/ja active Application Filing
-
2019
- 2019-03-12 US US16/299,847 patent/US11067986B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012014482A (ja) * | 2010-07-01 | 2012-01-19 | Panasonic Corp | Determination device for taxis |
JP2015191264A (ja) * | 2014-03-27 | 2015-11-02 | Nikon Corp | Autonomous traveling vehicle |
JP2015191641A (ja) * | 2014-03-31 | 2015-11-02 | NEC Embedded Products, Ltd. | Monitoring device, monitoring system, monitoring method, and program |
JP2016115364A (ja) * | 2014-04-01 | 2016-06-23 | みこらった株式会社 | Vehicle dispatch management system and vehicle dispatch management server |
Non-Patent Citations (1)
Title |
---|
See also references of EP3611712A4 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10395457B2 (en) * | 2017-08-10 | 2019-08-27 | GM Global Technology Operations LLC | User recognition system and methods for autonomous vehicles |
WO2020157531A1 (ja) * | 2019-01-29 | 2020-08-06 | Nissan Motor Co., Ltd. | Boarding permission determination device and boarding permission determination method |
CN113383368A (zh) | 2019-01-29 | 2021-09-10 | Nissan Motor Co., Ltd. | Boarding permission determination device and boarding permission determination method |
JPWO2020157531A1 (ja) | 2019-01-29 | 2021-11-25 | Nissan Motor Co., Ltd. | Boarding permission determination device and boarding permission determination method |
JP7095757B2 (ja) | 2019-01-29 | 2022-07-05 | Nissan Motor Co., Ltd. | Boarding permission determination device and boarding permission determination method |
US11320830B2 (en) | 2019-10-28 | 2022-05-03 | Deere & Company | Probabilistic decision support for obstacle detection and classification in a working area |
WO2022269303A1 (ja) | 2021-06-23 | 2022-12-29 | Nissan Motor Co., Ltd. | Vehicle control device, vehicle control method, vehicle control program, and vehicle control system |
JP2023034260A (ja) * | 2021-08-30 | 2023-03-13 | Toshiba Lifestyle Products & Services Corp. | Air conditioner |
JP2023096034A (ja) | 2021-08-30 | 2023-07-06 | Toshiba Lifestyle Products & Services Corp. | Air conditioner |
JP7348242B2 (ja) | 2021-08-30 | 2023-09-20 | Toshiba Lifestyle Products & Services Corp. | Air conditioner |
Also Published As
Publication number | Publication date |
---|---|
JP6862257B6 (ja) | 2021-06-23 |
EP3611712A1 (en) | 2020-02-19 |
CN109643494B (zh) | 2022-08-02 |
JP2018180987A (ja) | 2018-11-15 |
CA3037470A1 (en) | 2018-10-18 |
US20190212738A1 (en) | 2019-07-11 |
AU2017408956A1 (en) | 2019-04-04 |
US11067986B2 (en) | 2021-07-20 |
JP6862257B2 (ja) | 2021-04-21 |
EP3611712A4 (en) | 2020-05-13 |
CN109643494A (zh) | 2019-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018189952A1 (ja) | Autonomous driving vehicle, method of stopping autonomous driving vehicle, and program | |
AU2021203701B2 (en) | Recognizing assigned passengers for autonomous vehicles | |
US10551207B2 (en) | Autonomous vehicle sensor data and map integration | |
US10497264B2 (en) | Methods and systems for providing warnings of obstacle objects | |
US11685371B2 (en) | Extension to safety protocols for autonomous vehicle operation | |
JP2018081080A (ja) | Emergency handling system for autonomous driving vehicles (ADV) | |
CA3073318A1 (en) | Estimating time to pick up and drop off passengers for improved stopping analysis in autonomous vehicles | |
JP2020111223A (ja) | Vehicle control device and vehicle control method | |
CN109923018A (zh) | Vehicle control system, vehicle control method, and vehicle control program | |
CN110733496A (zh) | Information processing device, information processing method, and recording medium | |
Hasan et al. | Intelligent car control for a smart car | |
JP2020021452A (ja) | Information processing device, information processing method, and information processing program | |
CN114207685B (zh) | Autonomous vehicle interaction system | |
US11333523B2 (en) | Vehicle control device, output device, and input and output device | |
JP7236897B2 (ja) | Driving assistance method and driving assistance device | |
Gupta et al. | Guidelines to convert Luxury Cars into Smart Cars |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17905521 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 3037470 Country of ref document: CA |
|
ENP | Entry into the national phase |
Ref document number: 2017408956 Country of ref document: AU Date of ref document: 20171214 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017905521 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017905521 Country of ref document: EP Effective date: 20191114 |