US20240029450A1 - Automated driving management system and automated driving management method - Google Patents
Automated driving management system and automated driving management method
- Publication number
- US20240029450A1 (Application No. US 18/222,520)
- Authority
- US
- United States
- Prior art keywords
- automated driving
- sensor
- information
- perception
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B60W2420/408—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/42—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/52—Radar, Lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/20—Static objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/40—High definition maps
Definitions
- the present disclosure relates to a technique for managing automated driving of a vehicle.
- the present disclosure relates to a technique for determining whether or not an automated driving condition is satisfied.
- Patent Literature 1 discloses an autonomous traveling vehicle provided with a plurality of sensors.
- the autonomous traveling vehicle evaluates a state of dirt or failure of the sensors.
- when sensor performance is degraded due to the dirt or failure, the autonomous traveling vehicle operates in a degenerate mode in which a speed and a steering angle are limited.
- Patent Literature 2 discloses an electronic control device installed on a vehicle.
- the electronic control device determines a sensor detectable region based on detection information of a sensor installed on the vehicle.
- the electronic control device generates travel control information of the vehicle based on information detected by the sensor and the sensor detectable region.
- An automated driving condition is a condition under which the automated driving of the vehicle is permitted, and is also referred to as an operational design domain (ODD).
- An automated driving system is designed to be operated under a predetermined automated driving condition (ODD). Therefore, when performing the automated driving, it is important to determine whether or not the automated driving condition is satisfied.
- An object of the present disclosure is to provide a technique capable of more accurately determining whether or not an automated driving condition is satisfied.
- a first aspect relates to an automated driving management system.
- the automated driving management system is applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation.
- the automated driving management system includes one or more processors and one or more memory devices that store the reference information.
- the one or more processors acquire, based on the reference information, the expected sensor perception information associated with a determination target position.
- the one or more processors determine whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
- a second aspect relates to an automated driving management method.
- the automated driving management method is applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation.
- First sensor perception information indicates a result of perception by the perception sensor.
- Reference information indicates a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied.
- the automated driving management method includes:
- the reference information indicating the correspondence relationship between the vehicle position and the expected sensor perception information is prepared.
- the expected sensor perception information is the first sensor perception information (i.e., the result of perception by the perception sensor) expected when the automated driving condition is satisfied. Therefore, comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position makes it possible to accurately determine whether or not the automated driving condition is satisfied at the determination target position.
- FIG. 1 is a conceptual diagram for explaining an overview of a vehicle and a vehicle control system according to an embodiment of the present disclosure
- FIG. 2 is a conceptual diagram for explaining an overview of an ODD suitability determination process according to an embodiment of the present disclosure
- FIG. 3 is a diagram for explaining an example of an ODD suitability determination process according to an embodiment of the present disclosure
- FIG. 4 is a conceptual diagram for explaining a management server according to an embodiment of the present disclosure.
- FIG. 5 is a block diagram showing a configuration example of a vehicle control system according to an embodiment of the present disclosure
- FIG. 6 is a block diagram showing an example of driving environment information according to an embodiment of the present disclosure.
- FIG. 7 is a conceptual diagram for explaining a localization process according to an embodiment of the present disclosure.
- FIG. 8 is a block diagram showing a configuration example of an automated driving management system according to an embodiment of the present disclosure.
- FIG. 9 is a flowchart showing an example of processing by an automated driving management system according to an embodiment of the present disclosure.
- FIG. 10 is a conceptual diagram for explaining an example of point cloud information according to an embodiment of the present disclosure.
- FIG. 11 is a conceptual diagram for explaining an example of reference information according to an embodiment of the present disclosure.
- FIG. 12 is a conceptual diagram for explaining an example of an ODD suitability determination process according to an embodiment of the present disclosure
- FIG. 13 is a conceptual diagram for explaining another example of an ODD suitability determination process according to an embodiment of the present disclosure.
- FIG. 14 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure.
- FIG. 15 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure.
- FIG. 16 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure.
- FIG. 1 is a conceptual diagram for explaining an overview of a vehicle 1 and a vehicle control system 10 according to the present embodiment.
- the vehicle control system 10 controls the vehicle 1 .
- the vehicle control system 10 is installed on the vehicle 1 .
- at least a part of the vehicle control system 10 may be included in a remote system outside the vehicle 1 to remotely control the vehicle 1 .
- the vehicle 1 is capable of automated driving, and the vehicle control system 10 is configured to control the automated driving of the vehicle 1 .
- the automated driving supposed here is one in which a driver is not necessarily required to fully concentrate on driving (e.g., so-called Level 3 or higher automated driving).
- the automated driving level may be Level 4 or higher that does not need a driver.
- a perception sensor 30 mounted on the vehicle 1 is used.
- the perception sensor 30 is a sensor for perceiving a situation around the vehicle 1 .
- Examples of the perception sensor 30 include a laser imaging detection and ranging (LIDAR), a camera, a radar, and the like.
- the LIDAR emits beams and detects a reflected beam reflected at a reflection point to measure a relative position of the reflection point.
- the vehicle control system 10 uses the perception sensor 30 to perceive a situation around the vehicle 1 .
- the vehicle control system 10 uses the perception sensor 30 to perceive a stationary object and a moving object around the vehicle 1 .
- Examples of the stationary object include the road surface 2 , the road structure 3 (e.g., a wall, a guardrail, a curb), a white line, and the like.
- Examples of the moving object include the surrounding vehicle 4 , the pedestrian 5 , and the like. Then, the vehicle control system 10 executes automated driving control regarding the vehicle 1 based on a result of the perception processing using the perception sensor 30 .
- An automated driving condition is a condition under which the automated driving of the vehicle 1 is permitted.
- the automated driving condition is also referred to as an operational design domain (ODD) or an operation design domain.
- the automated driving condition is defined by a maximum vehicle speed, a traveling area, a weather condition, a sunshine condition, and the like.
- in bad weather such as rainy weather, accuracy of the perception processing using the perception sensor 30 may decrease, and thus accuracy of the automated driving control may decrease. Therefore, conventionally, "a rainfall amount per unit time is less than a predetermined value (for example, 5 mm/h)" has been used as one of the automated driving conditions related to the weather.
- the vehicle control system 10 is designed to perform the automated driving under a predetermined automated driving condition (ODD). Therefore, when performing the automated driving, it is important to determine whether or not the automated driving condition is satisfied.
- A process of determining whether or not the automated driving condition is satisfied is hereinafter referred to as an "ODD suitability determination process." The inventor of the present application has recognized the following problem regarding the ODD suitability determination process.
- the above-mentioned automated driving condition related to the weather “a rainfall amount per unit time is less than a predetermined value” is considered.
- the rainfall amount varies widely even in a relatively narrow area, and local torrential rain has increased in recent years. Therefore, it is not easy to accurately pinpoint the rainfall amount at a current position of the vehicle 1 .
- to accurately grasp the local rainfall amount, a large-scale infrastructure such as deployment of a large number of rainfall amount sensors is required. This is not desirable from a viewpoint of costs.
- when the sun shines under a wet road surface condition after rain, reflection of light from the wet road surface increases. In this case, accuracy of perception of the road surface and a fallen object by the perception sensor 30 may decrease.
- the present disclosure proposes a new technique capable of improving the accuracy of the ODD suitability determination process.
- a human driver does not decide whether it is easy or difficult to drive by looking at a specific parameter such as the rainfall amount.
- the human driver decides whether it is easy or difficult to drive based on information perceived by the human driver's own vision. For example, when the sun shines under a wet road surface condition after rain, reflection of light from the wet road surface is so bright that the road surface cannot be seen as usual, and thus the human driver decides that it is difficult to drive. That is, the human driver decides that it is difficult to drive, when the information perceived by the human driver's own vision is different from usual one.
- the ODD suitability determination process according to the present embodiment is performed in a manner analogous to this sense of the human driver.
- An “eye” for the vehicle control system 10 that performs the automated driving control is the perception sensor 30 . Therefore, according to the present embodiment, the ODD suitability determination process is performed based on the result of perception by the perception sensor 30 . That is, the ODD suitability determination process is performed based on whether “appearance” viewed from the perception sensor 30 of the vehicle 1 is as usual or not.
- FIG. 2 is a conceptual diagram for explaining an overview of the ODD suitability determination process according to the present embodiment.
- “First sensor perception information SEN 1 ” indicates the result of perception by the perception sensor 30 mounted on the vehicle 1 . That is, the first sensor perception information SEN 1 corresponds to the “appearance” viewed from the perception sensor 30 .
- when the perception sensor 30 includes the LIDAR, the first sensor perception information SEN 1 includes information of a point cloud (beam reflection points) measured by the LIDAR.
- the first sensor perception information SEN 1 includes the number of beam reflection points on the road surface 2 measured during one frame.
- the first sensor perception information SEN 1 includes information on a stationary object (e.g., the road surface 2 , the road structure 3 ) perceived by the perception sensor 30 .
- on the other hand, information on a moving object (e.g., the surrounding vehicle 4 , the pedestrian 5 ) perceived by the perception sensor 30 is hereinafter referred to as "second sensor perception information."
- the second sensor perception information is necessary for the automated driving control by the vehicle control system 10 , but may not necessarily be included in the first sensor perception information SEN 1 .
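As a concrete illustration of the first sensor perception information SEN 1 described above, the following sketch counts the LIDAR beam reflection points that fall on the road surface during one frame. The point format, the height threshold, and the intensity filter are illustrative assumptions and are not specified in this disclosure.

```python
# Hypothetical sketch of deriving the parameter of SEN 1 discussed later:
# the number of LIDAR beam reflection points on the road surface per frame.
# The point layout (x, y, z, intensity) and both thresholds are assumptions.
def count_road_surface_points(points, max_road_height=0.2, min_intensity=0.05):
    """points: iterable of (x, y, z, intensity) tuples in the vehicle frame."""
    count = 0
    for _x, _y, z, intensity in points:
        # Treat low-lying returns with sufficient intensity as road-surface
        # hits; on a wet road many beams reflect away and this count drops.
        if z < max_road_height and intensity >= min_intensity:
            count += 1
    return count
```

On a dry road most downward beams return to the sensor, so the count stays near its usual value; a conspicuous drop is the signal exploited by the determination process described below.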
- “Expected sensor perception information ESEN” is the first sensor perception information SEN 1 expected when the automated driving condition is satisfied.
- the automated driving condition is determined in advance in consideration of various factors that affect the accuracy of the automated driving control.
- the expected sensor perception information ESEN corresponds to “appearance” viewed from the perception sensor 30 in the case where the automated driving is permissible.
- Reference information REF indicates a correspondence relationship between a vehicle position PV and the expected sensor perception information ESEN. That is, the reference information REF indicates the expected sensor perception information ESEN as a function of the vehicle position PV. It can also be said that the reference information REF indicates the expected sensor perception information ESEN when the vehicle 1 is present at the vehicle position PV.
- the vehicle position PV may be set along a general vehicle travel trajectory in a road.
- the vehicle position PV may be assumed to be located at a lane center.
- the vehicle position PV may be a concept including both a position and a direction of the vehicle 1 .
- the direction of the vehicle 1 may be assumed to be parallel to an extending direction of a lane (white line).
- the expected sensor perception information ESEN and the reference information REF are generated and updated based on information acquired when the automated driving condition is satisfied.
- the expected sensor perception information ESEN and the reference information REF are generated and updated based on past automated driving records of one or more vehicles 1 .
- the expected sensor perception information ESEN and the reference information REF represent “past successful experiences.”
- the reference information REF may be generated and updated through a simulation based on configuration information of an automated driving area and design information of the perception sensor 30 .
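The simulation-based generation mentioned above can be illustrated as follows: given design information of the perception sensor (mount height, beam elevation angles, maximum range), an expected number of road-surface reflection points is computed against a flat-road model. The flat-road assumption and all numerical values are hypothetical, not taken from this disclosure.

```python
import math

# Hedged sketch: expected road-surface point count from sensor design
# information alone, assuming a flat road. Mount height, beam angles, range,
# and points per ring are illustrative assumptions.
def simulate_road_points(mount_height=2.0, max_range=80.0,
                         elevation_angles_deg=None, points_per_ring=360):
    if elevation_angles_deg is None:
        # Downward-looking beams of a hypothetical multi-layer LIDAR.
        elevation_angles_deg = [-15, -11, -8, -5, -3, -2, -1]
    total = 0
    for angle in elevation_angles_deg:
        if angle >= 0:
            continue  # a horizontal or upward beam never hits the road
        # A beam at elevation angle a hits the flat road at horizontal
        # distance mount_height / tan(-a) from the sensor.
        hit_distance = mount_height / math.tan(math.radians(-angle))
        if hit_distance <= max_range:
            total += points_per_ring  # the whole ring lands on the road
    return total
```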
- the reference information REF that represents the expected sensor perception information ESEN as a function of the vehicle position PV is a kind of map information.
- the general map information indicates an arrangement of objects in an absolute coordinate system. That is, the general map information indicates a correspondence relationship between an absolute position and an object present at the absolute position.
- the reference information REF indicates a correspondence relationship between the vehicle position PV and the first sensor perception information SEN 1 (i.e., the result of perception) as viewed from the vehicle position PV when the automated driving condition is satisfied.
- the reference information REF does not indicate an object present at the vehicle position PV.
- An automated driving management system 100 is applied to the vehicle 1 and manages the automated driving of the vehicle 1 .
- the automated driving management system 100 holds the above-described reference information REF, and performs the ODD suitability determination process regarding the vehicle 1 based on the reference information REF.
- the automated driving management system 100 determines whether or not the automated driving condition is satisfied for the vehicle 1 present at a determination target position PT.
- the determination target position PT is the current position of the vehicle 1 .
- the determination target position PT may be a past position of the vehicle 1 .
- the automated driving management system 100 acquires the first sensor perception information SEN 1 acquired by the vehicle 1 (i.e., the vehicle control system 10 ) at the determination target position PT.
- the automated driving management system 100 acquires, based on the reference information REF, the expected sensor perception information ESEN associated with the determination target position PT.
- the expected sensor perception information ESEN associated with the determination target position PT is the first sensor perception information SEN 1 expected at the determination target position PT when the automated driving condition is satisfied. Therefore, the automated driving management system 100 is able to determine whether or not the automated driving condition is satisfied at the determination target position PT by comparing the first sensor perception information SEN 1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT.
- the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT.
- FIG. 3 is a diagram for explaining an example of the ODD suitability determination process according to the present embodiment.
- a horizontal axis represents the vehicle position PV and a vertical axis represents a parameter X.
- the parameter X is a parameter indicating the result of perception by the perception sensor 30 and is included in the first sensor perception information SEN 1 .
- the parameter X is the number of beam reflection points on the road surface 2 measured by the LIDAR.
- the expected sensor perception information ESEN includes an expected value Xe of the parameter X expected when the automated driving condition is satisfied.
- the expected value Xe is an average value of a large number of parameters X acquired when the automated driving condition is satisfied.
- the reference information REF indicates a correspondence relationship between the expected value Xe of the parameter X and the vehicle position PV. That is, the reference information REF indicates the expected value Xe of the parameter X as a function of the vehicle position PV.
- An allowable range RNG is a range of the parameter X in which the automated driving is allowed.
- the allowable range RNG includes at least the expected value Xe.
- a width of the allowable range RNG is predetermined. The width of the allowable range RNG may be set based on a standard deviation (σ) of the large number of parameters X acquired when the automated driving condition is satisfied.
- a set of the expected value Xe and the allowable range RNG may be registered in the reference information REF.
- the automated driving management system 100 acquires the first sensor perception information SEN 1 acquired by the vehicle 1 (i.e., the vehicle control system 10 ) at the determination target position PT.
- the first sensor perception information SEN 1 includes an actual value Xa of the parameter X acquired at the determination target position PT.
- the automated driving management system 100 acquires the expected value Xe associated with the determination target position PT. Then, the automated driving management system 100 determines whether or not the automated driving condition is satisfied at the determination target position PT by comparing the actual value Xa of the parameter X acquired at the determination target position PT with the allowable range RNG including the expected value Xe associated with the determination target position PT.
- the automated driving management system 100 determines that the automated driving condition is satisfied at the determination target position PT.
- the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT.
- the parameter X is the number of beam reflection points on the road surface 2 measured by the LIDAR.
- the expected value Xe is an expected value of the number of beam reflection points on the road surface 2 when the automated driving condition is satisfied. In rainy weather, the number of beam reflection points on the road surface 2 decreases conspicuously.
- the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT. That is, when the “appearance” of the road surface 2 viewed from the perception sensor of the vehicle 1 is different from usual, it is determined that the automated driving condition is not satisfied.
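The comparison described above can be sketched as follows: the actual value Xa acquired at the determination target position PT is checked against the allowable range RNG centered on the expected value Xe. The reference-table layout and the width k·σ are illustrative assumptions.

```python
# Illustrative reference information REF: vehicle position PV -> (expected
# value Xe, standard deviation sigma) of the parameter X (here, the
# road-surface reflection point count). Keys and values are hypothetical.
REFERENCE_INFO = {
    "pt_001": (1200.0, 60.0),
    "pt_002": (950.0, 55.0),
}

def odd_condition_satisfied(position_id, actual_x, k=3.0):
    """Return True if the actual value Xa lies within the allowable range
    [Xe - k*sigma, Xe + k*sigma] associated with the position."""
    expected_x, sigma = REFERENCE_INFO[position_id]
    return abs(actual_x - expected_x) <= k * sigma
```

In rainy weather the measured road-surface point count drops conspicuously below Xe, the check fails, and the vehicle would be decelerated or stopped.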
- the automated driving management system 100 decelerates or stops the vehicle 1 .
- the automated driving management system 100 instructs the vehicle control system 10 to decelerate or stop the vehicle 1 .
- the automated driving management system 100 may be included in the vehicle control system 10 of the vehicle 1 or may be provided separately from the vehicle control system 10 .
- the automated driving management system 100 may be a management server that communicates with the vehicle 1 (the vehicle control system 10 ).
- the automated driving management system 100 and the vehicle control system 10 may be partially common.
- FIG. 4 is a conceptual diagram for explaining a management server 1000 that manages the automated driving.
- the management server 1000 may be configured by a plurality of servers that execute distributed processing.
- the management server 1000 is communicably connected to a large number of vehicles 1 that perform the automated driving.
- the management server 1000 collects the vehicle positions PV and the first sensor perception information SEN 1 from the large number of vehicles 1 .
- the management server 1000 collects the vehicle positions PV and the first sensor perception information SEN 1 in the case where the automated driving is possible from the large number of vehicles 1 .
- the management server 1000 generates and updates the above-described reference information REF based on the information collected from the large number of vehicles 1 .
- the automated driving management system 100 is included in the management server 1000 .
- the management server 1000 communicates with the vehicle 1 being a determination target and acquires information of the determination target position PT and the first sensor perception information SEN 1 acquired at the determination target position PT. Then, the management server 1000 performs the above-described ODD suitability determination process based on the information acquired from the vehicle 1 being the determination target and the reference information REF.
- the management server 1000 instructs the vehicle control system 10 of the vehicle 1 being the determination target to decelerate or stop.
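One way the management server 1000 might build the reference information REF from the fleet reports described above is to maintain, per vehicle position PV, a running mean (serving as the expected value Xe) and standard deviation of the reported parameter X, e.g., with Welford's online algorithm. The report format and position keying are assumptions for illustration.

```python
import math

# Hedged sketch of server-side aggregation: each (PV, parameter X) report,
# taken while automated driving was possible, updates a running mean and
# variance for that position (Welford's online algorithm).
class ReferenceEntry:
    def __init__(self):
        self.n = 0
        self.mean = 0.0   # becomes the expected value Xe
        self.m2 = 0.0     # sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def stddev(self):
        # Population standard deviation of the collected samples.
        return math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0

reference_info = {}  # vehicle position PV -> ReferenceEntry

def collect_report(position_id, parameter_x):
    """Fold one fleet report into the reference information."""
    reference_info.setdefault(position_id, ReferenceEntry()).update(parameter_x)
```

A set of (mean, stddev) per position can then be distributed to vehicles, or used server-side, as the expected value Xe and the basis of the allowable range RNG.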
- the automated driving management system 100 may be included in the vehicle control system 10 .
- the vehicle control system 10 communicates with the management server 1000 and acquires the reference information REF from the management server 1000 .
- the vehicle control system 10 acquires the first sensor perception information SEN 1 at the determination target position PT.
- the vehicle control system 10 performs the above-described ODD suitability determination process based on the first sensor perception information SEN 1 and the reference information REF.
- the vehicle control system 10 decelerates or stops the vehicle 1 .
- the automated driving management system 100 includes one or more processors and one or more memory devices.
- the one or more processors may be included in the vehicle control system 10 , may be included in the management server 1000 , or may be distributed to the vehicle control system 10 and the management server 1000 .
- the one or more memory devices may be included in the vehicle control system 10 , may be included in the management server 1000 , or may be distributed to the vehicle control system 10 and the management server 1000 .
- the one or more memory devices store the reference information REF.
- the one or more processors acquire the first sensor perception information SEN 1 and perform the ODD suitability determination process based on the first sensor perception information SEN 1 and the reference information REF.
- the reference information REF indicating the correspondence relationship between the expected sensor perception information ESEN and the vehicle position PV is prepared.
- the expected sensor perception information ESEN is the first sensor perception information SEN 1 (i.e., the result of perception by the perception sensor 30 ) expected when the automated driving condition is satisfied. Therefore, comparing the first sensor perception information SEN 1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT makes it possible to accurately determine whether or not the automated driving condition is satisfied at the determination target position PT. For example, as compared with a case where a parameter specifically defining the weather itself such as the rainfall amount is used, it is possible to more accurately determine whether or not the automated driving condition is satisfied.
- the expected sensor perception information ESEN is the first sensor perception information SEN 1 (i.e., the result of perception by the perception sensor 30 ) expected when the automated driving condition is satisfied. Therefore, the various factors that affect the accuracy of the automated driving control are integrally reflected in the expected sensor perception information ESEN. Using such expected sensor perception information ESEN and the reference information REF makes it possible to perform the ODD suitability determination process easily and with high accuracy.
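- for illustration, the reference information REF can be pictured as a lookup table from the vehicle position PV to the expected sensor perception information ESEN. The following minimal sketch assumes a discretized position grid and made-up expected values of the number of first reflection points; none of the concrete values come from the embodiment:

```python
# Hypothetical sketch of the reference information REF: a mapping from
# a (discretized) vehicle position PV to the expected value Xe of a
# perception parameter. The 10 m grid and the counts are made up.

REF = {
    # position along the route (m) -> expected number of first
    # reflection points n_t at that position
    0:  1200,
    10: 1150,
    20: 1080,
}

def expected_at(position_m: float) -> int:
    """Look up the expected value associated with the determination
    target position PT by snapping to the nearest registered grid point."""
    grid = min(REF, key=lambda p: abs(p - position_m))
    return REF[grid]

print(expected_at(12.0))  # 1150
```

The actual value Xa acquired at the determination target position PT would then be compared against the value this lookup returns.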
- FIG. 5 is a block diagram showing a configuration example of the vehicle control system 10 according to the present embodiment.
- the vehicle control system 10 includes a vehicle state sensor 20 , a perception sensor 30 , a position sensor 40 , a travel device 50 , a communication device 60 , and a control device 70 .
- the vehicle state sensor 20 detects a state of the vehicle 1 .
- the vehicle state sensor 20 includes a speed sensor, an acceleration sensor, a yaw rate sensor, a steering angle sensor, and the like.
- the perception sensor 30 perceives (detects) a situation around the vehicle 1 .
- the perception sensor 30 includes a LIDAR 31 , a camera 32 , a radar, and the like.
- the LIDAR 31 emits beams and detects a reflected beam reflected at a reflection point to measure a relative position of the reflection point.
- the camera 32 images a situation around the vehicle 1 to acquire an image.
- the position sensor 40 detects a position and an orientation of the vehicle 1 .
- Examples of the position sensor 40 include an inertial measurement unit (IMU), a global navigation satellite system (GNSS) sensor, and the like.
- the travel device 50 includes a steering device, a driving device, and a braking device.
- the steering device steers wheels.
- the steering device includes an electric power steering (EPS) device.
- the driving device is a power source that generates a driving force. Examples of the driving device include an engine, an electric motor, and an in-wheel motor.
- the braking device generates a braking force.
- the communication device 60 communicates with the outside of the vehicle 1 .
- the communication device 60 communicates with the management server 1000 (see FIG. 4 ).
- the control device (controller) 70 is a computer that controls the vehicle 1 .
- the control device 70 includes one or more processors 71 (hereinafter simply referred to as a processor 71 ) and one or more memory devices 72 (hereinafter simply referred to as a memory device 72 ).
- the processor 71 executes a variety of processing.
- the processor 71 includes a central processing unit (CPU).
- the memory device 72 stores a variety of information. Examples of the memory device 72 include a volatile memory, a nonvolatile memory, a hard disk drive (HDD), a solid state drive (SSD), and the like.
- the control device 70 may include one or more electronic control units (ECUs). A part of the control device 70 may be an information processing device outside the vehicle 1 . In this case, a part of the control device 70 communicates with the vehicle 1 and remotely controls the vehicle 1 .
- a vehicle control program 80 is a computer program for controlling the vehicle 1 .
- the variety of processing by the control device 70 may be implemented by the processor 71 executing the vehicle control program 80 .
- the vehicle control program is stored in the memory device 72 .
- the vehicle control program 80 may be recorded on a non-transitory computer-readable recording medium.
- the control device 70 acquires driving environment information 200 indicating a driving environment for the vehicle 1 .
- the driving environment information 200 is stored in the memory device 72 .
- FIG. 6 is a block diagram showing an example of the driving environment information 200 .
- the driving environment information 200 includes map information 210 , vehicle state information 220 , surrounding situation information 230 , and vehicle position information 240 .
- the map information 210 includes a general navigation map.
- the map information 210 may indicate a lane configuration and a road shape.
- the map information 210 may include position information of landmarks, traffic signals, signs, and so forth.
- the control device 70 acquires the map information 210 of a necessary area from a map database.
- the map database may be stored in the memory device 72 or may be managed by the management server 1000 . In the latter case, the control device 70 communicates with the management server 1000 via the communication device 60 to acquire the necessary map information 210 .
- the map information 210 may include stationary object map information 215 indicating an absolute position where a stationary object is present.
- examples of the stationary object include a road surface 2 , a road structure 3 , and the like.
- examples of the road structure 3 include a wall, a guardrail, a curb, a fence, a plant, and the like.
- the stationary object map information 215 may include terrain map information indicating an absolute position (latitude, longitude, and altitude) where the road surface 2 is present.
- the terrain map information may include an evaluation value set for each absolute position. The evaluation value indicates “certainty (likelihood)” that the road surface 2 is present at the absolute position.
- the stationary object map information 215 may include road structure map information indicating an absolute position where the road structure 3 is present.
- the road structure map information may include an evaluation value set for each absolute position. The evaluation value indicates “certainty (likelihood)” that the road structure 3 is present at the absolute position.
- the vehicle state information 220 is information indicating the state of the vehicle 1 and includes a vehicle speed, an acceleration, a yaw rate, a steering angle, and the like.
- the control device 70 acquires the vehicle state information 220 from the vehicle state sensor 20 .
- the vehicle state information 220 may indicate a driving state (automated driving or manual driving) of the vehicle 1 .
- the surrounding situation information 230 is information indicating the situation around the vehicle 1 .
- the control device 70 perceives (recognizes) the situation around the vehicle 1 by using the perception sensor 30 to acquire the surrounding situation information 230 .
- the surrounding situation information 230 includes point cloud information 231 indicating a result of measurement by the LIDAR 31 . More specifically, the point cloud information 231 indicates a relative position (an azimuth and a distance) of each beam reflection point viewed from the LIDAR 31 .
- the surrounding situation information 230 may include image information 232 captured by the camera 32 .
- the surrounding situation information 230 further includes object information 233 regarding an object around the vehicle 1 .
- examples of the object include a white line, the road structure 3 , a surrounding vehicle 4 (e.g., a preceding vehicle, a parked vehicle, and the like), a pedestrian 5 , a traffic signal, a landmark, a fallen object, and the like.
- the object information 233 indicates a relative position and a relative speed of the object with respect to the vehicle 1 .
- analyzing the image information 232 captured by the camera 32 makes it possible to identify an object and calculate the relative position of the object.
- the control device 70 identifies an object in the image information 232 by using image perception AI acquired by machine learning. It is also possible to identify an object and acquire the relative position and the relative speed of the object based on the point cloud information 231 acquired by the LIDAR 31 .
- the control device 70 may utilize the above-described stationary object map information 215 .
- the stationary object map information 215 indicates the position of the stationary object (e.g., the road surface 2 , the road structure 3 ) and thus makes it possible to distinguish the stationary object from other objects.
- the control device 70 grasps the position of the stationary object existing around the vehicle 1 based on the stationary object map information 215 and the vehicle position information 240 .
- the control device 70 removes (thins out) the stationary object from the objects perceived using the perception sensor 30 . It is thus possible to distinguish the stationary object from the other objects (e.g., a surrounding vehicle 4 , a pedestrian 5 , a fallen object, and the like).
- the control device 70 is able to detect the surrounding vehicle 4 , the pedestrian 5 , the fallen object, and the like on the road surface 2 by removing the road surface 2 indicated by the terrain map information from the point cloud information 231 .
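- the stationary-object removal described above can be sketched as follows. The 2-D point layout and the 0.5 m matching margin are assumptions for illustration, not part of the embodiment:

```python
# A minimal sketch of removing map-registered stationary objects from a
# LIDAR point cloud. Points within `margin` of a mapped stationary
# position are discarded; what remains are candidate moving objects or
# fallen objects. The 2-D coordinates and margin are made-up values.

def remove_stationary(points, stationary_map, margin=0.5):
    """Keep only points farther than `margin` from every mapped
    stationary-object position."""
    def near_stationary(p):
        return any(((p[0] - s[0])**2 + (p[1] - s[1])**2) ** 0.5 <= margin
                   for s in stationary_map)
    return [p for p in points if not near_stationary(p)]

guardrail = [(0.0, 5.0), (1.0, 5.0), (2.0, 5.0)]   # mapped road structure
cloud = [(0.1, 5.0), (1.0, 5.1), (3.0, 1.0)]        # measured points
print(remove_stationary(cloud, guardrail))          # [(3.0, 1.0)]
```

The surviving point would then be examined as a possible surrounding vehicle 4 , pedestrian 5 , or fallen object.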
- the vehicle position information 240 is information indicating the position and the orientation of the vehicle 1 .
- the control device 70 acquires the vehicle position information 240 from a result of detection by the position sensor 40 .
- the control device 70 may acquire highly accurate vehicle position information 240 by a known self-position estimation process (localization) using the object information 233 and the map information 210 .
- FIG. 7 is a conceptual diagram for explaining the self-position estimation process (localization).
- various landmarks (characteristic objects) are present around the vehicle 1 .
- the control device 70 uses the perception sensor 30 to perceive the landmark around the vehicle 1 .
- the object information 233 indicates the relative position of the perceived landmark.
- the absolute position of the landmark is registered on the map information 210 .
- the control device 70 corrects the vehicle position information 240 such that the relative position of the landmark indicated by the object information 233 and the absolute position of the landmark acquired from the map information 210 are consistent with each other. As a result, highly accurate vehicle position information 240 can be acquired.
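- the landmark-based correction can be sketched in two dimensions as follows, assuming a single landmark and a pure translation correction (these simplifications are not part of the embodiment; real localization also corrects the orientation and fuses many landmarks):

```python
# A minimal 2-D sketch of correcting the vehicle position so that the
# perceived relative position of a landmark is consistent with its
# mapped absolute position (landmark_abs = vehicle_pos + landmark_rel).

def correct_position(estimated_pos, landmark_abs, landmark_rel):
    """Shift the estimated vehicle position by the residual between the
    mapped landmark position and the position implied by the current
    estimate plus the perceived offset."""
    ex, ey = estimated_pos
    ax, ay = landmark_abs
    rx, ry = landmark_rel
    dx, dy = ax - (ex + rx), ay - (ey + ry)  # consistency residual
    return (ex + dx, ey + dy)

# Dead-reckoned position is off by (0.4, -0.3); the landmark fixes it.
cx, cy = correct_position((10.0, 20.0), (15.0, 25.0), (4.6, 5.3))
print((round(cx, 1), round(cy, 1)))  # (10.4, 19.7)
```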
- the control device 70 executes vehicle travel control that controls travel of the vehicle 1 .
- the vehicle travel control includes steering control, acceleration control, and deceleration control.
- the control device 70 executes the vehicle travel control by controlling the travel device 50 . More specifically, the control device 70 executes the steering control by controlling the steering device. In addition, the control device 70 executes the acceleration control by controlling the driving device. Further, the control device 70 executes the deceleration control by controlling the braking device.
- the control device 70 executes the automated driving control based on the driving environment information 200 . More specifically, the control device 70 generates a travel plan of the vehicle 1 based on the driving environment information 200 . Examples of the travel plan include keeping a current travel lane, making a lane change, making a right or left turn, avoiding an obstacle, and the like. Further, based on the driving environment information 200 , the control device 70 generates a target trajectory necessary for the vehicle 1 to travel in accordance with the travel plan. The target trajectory includes a target position and a target velocity. Then, the control device 70 executes the vehicle travel control such that the vehicle 1 follows the target trajectory.
- when the automated driving management system 100 determines that the automated driving condition is not satisfied, the control device 70 generates an emergency plan for decelerating or stopping the vehicle 1 . Then, the control device 70 executes the vehicle travel control in accordance with the emergency plan to make the vehicle 1 decelerate or stop.
- FIG. 8 is a block diagram showing a configuration example of the automated driving management system 100 according to the present embodiment.
- the automated driving management system 100 includes a communication device 110 , one or more processors 120 (hereinafter simply referred to as a processor 120 ), and one or more memory devices 130 (hereinafter simply referred to as a memory device 130 ).
- the communication device 110 communicates with the outside of the automated driving management system 100 .
- the communication device 110 communicates with the management server 1000 (see FIG. 4 ).
- the communication device 110 communicates with the vehicle control system 10 .
- the processor 120 executes a variety of processing.
- the processor 120 includes a CPU.
- the memory device 130 stores a variety of information. Examples of the memory device 130 include a volatile memory, a nonvolatile memory, an HDD, an SSD, and the like.
- when the automated driving management system 100 is included in the vehicle control system 10 , the processor 120 may be the same as the processor 71 of the vehicle control system 10 , and the memory device 130 may be the same as the memory device 72 of the vehicle control system 10 .
- An automated driving management program 140 is a computer program for managing the automated driving.
- the variety of processing by the processor 120 may be implemented by the processor 120 executing the automated driving management program 140 .
- the automated driving management program 140 is stored in the memory device 130 .
- the automated driving management program 140 may be recorded on a non-transitory computer-readable recording medium.
- the reference information REF indicates a correspondence relationship between the expected sensor perception information ESEN and the vehicle position PV.
- the management server 1000 generates and updates the reference information REF.
- the reference information REF is generated and updated based on past automated driving records of one or more vehicles 1 .
- the reference information REF may be generated and updated through a simulation based on configuration information of an automated driving area and design information of the perception sensor 30 .
- the reference information REF is stored in the memory device 130 .
- the processor 120 communicates with the management server 1000 via the communication device 110 to acquire the reference information REF.
- the vehicle position information 240 and the first sensor perception information SEN 1 are acquired by the vehicle control system 10 .
- the first sensor perception information SEN 1 indicates the result of perception by the perception sensor 30 of the vehicle 1 .
- the first sensor perception information SEN 1 includes information on the stationary object (e.g., the road surface 2 , the road structure 3 ) perceived by the perception sensor 30 .
- the processor 120 communicates with the vehicle control system 10 via the communication device 110 to acquire the vehicle position information 240 and the first sensor perception information SEN 1 .
- the vehicle position information 240 and the first sensor perception information SEN 1 are stored in the memory device 130 .
- FIG. 9 is a flowchart showing an example of processing performed by the automated driving management system 100 (the processor 120 ) according to the present embodiment.
- Step S 100 the processor 120 acquires the vehicle position information 240 and the first sensor perception information SEN 1 .
- the vehicle position information 240 includes information on the determination target position PT.
- the determination target position PT is a current position of the vehicle 1 .
- the determination target position PT may be a past position of the vehicle 1 .
- the first sensor perception information SEN 1 indicates the result of perception by the perception sensor 30 mounted on the vehicle 1 .
- the first sensor perception information SEN 1 includes the actual value Xa of the parameter X perceived by the perception sensor 30 .
- the processor 120 acquires the first sensor perception information SEN 1 acquired at the determination target position PT.
- Step S 110 the processor 120 acquires, based on the reference information REF, the expected sensor perception information ESEN associated with the determination target position PT.
- the expected sensor perception information ESEN includes the expected value Xe of the parameter X expected when the automated driving condition is satisfied.
- Step S 120 the processor 120 compares the first sensor perception information SEN 1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT.
- Step S 130 When the first sensor perception information SEN 1 acquired at the determination target position PT does not deviate from the expected sensor perception information ESEN (Step S 130 ; No), the processing proceeds to Step S 140 .
- Step S 140 the processor 120 determines that the automated driving condition is satisfied at the determination target position PT. In this case, the processor 120 makes the automated driving of the vehicle 1 continue (Step S 150 ).
- Step S 130 when the first sensor perception information SEN 1 acquired at the determination target position PT deviates from the expected sensor perception information ESEN (Step S 130 ; Yes), the processing proceeds to Step S 160 .
- Step S 160 the processor 120 determines that the automated driving condition is not satisfied at the determination target position PT. In this case, the processor 120 makes the vehicle 1 decelerate or stop (Step S 170 ).
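- the flow of Steps S 120 to S 170 can be sketched as follows; the tolerance-based deviation test stands in for the comparison of Steps S 120 /S 130 and is an assumption for illustration:

```python
# A sketch of the determination flow of FIG. 9. The tolerance-based
# deviation test and the concrete values are illustrative assumptions.

def odd_suitability_step(actual_xa, expected_xe, tolerance):
    """Return the action chosen by the flow: 'continue' when the
    automated driving condition is satisfied (Steps S140/S150),
    'decelerate_or_stop' otherwise (Steps S160/S170)."""
    deviates = abs(actual_xa - expected_xe) > tolerance  # Steps S120/S130
    if not deviates:
        return "continue"            # Steps S140/S150
    return "decelerate_or_stop"      # Steps S160/S170

print(odd_suitability_step(980, 1000, 50))  # continue
print(odd_suitability_step(800, 1000, 50))  # decelerate_or_stop
```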
- the perception sensor 30 includes the LIDAR 31
- the first sensor perception information SEN 1 includes the point cloud information 231 indicating the result of measurement by the LIDAR 31 .
- the point cloud information 231 indicates the relative position (the azimuth and the distance) of each beam reflection point viewed from the LIDAR 31 .
- FIG. 10 is a conceptual diagram for explaining an example of the point cloud information 231 .
- a first reflection point R 1 is the reflection point on the stationary object (e.g., the road surface 2 , the road structure 3 ).
- a second reflection point R 2 is the reflection point on the moving object (e.g., the surrounding vehicle 4 , the pedestrian 5 ).
- a noise reflection point R 3 is the reflection point caused by raindrops or dust in air.
- the first reflection point R 1 is detected spatially continuously over a certain range.
- the second reflection point R 2 is also detected spatially continuously over a certain range. That is, a spatially continuous point cloud is constituted by the first reflection points R 1 or the second reflection points R 2 .
- the above-described stationary object map information 215 indicates the absolute position where the stationary object is present. Combining the stationary object map information 215 and the vehicle position information 240 makes it possible to grasp the position at which the stationary object is assumed to be present around the vehicle 1 . Therefore, the processor 120 is able to classify the spatially continuous point cloud into the first reflection point R 1 and the second reflection point R 2 based on the stationary object map information 215 and the vehicle position information 240 .
- the processor 120 is able to distinguish between the first reflection point R 1 regarding the stationary object and the second reflection point R 2 regarding the moving object.
- the stationary object map information 215 includes at least one of the terrain map information and the road structure map information, it is also possible to distinguish between the first reflection point R 1 regarding the road surface 2 and the first reflection point R 1 regarding the road structure 3 .
- the noise reflection points R 3 are not spatially continuous. Typically, each noise reflection point R 3 exists alone. Therefore, the processor 120 is able to recognize the noise reflection point R 3 based on continuity of the point cloud. For example, assume a case where distances to a plurality of reflection points detected in a certain area are 19.8 m, 20.0 m, 5.5 m, 20.2 m, and 20.1 m. In this case, the reflection point whose distance is 5.5 m is the noise reflection point R 3 . For example, the processor 120 classifies a discontinuous reflection point whose distance difference from a reflection point for an adjacent beam is equal to or greater than a predetermined value as the noise reflection point R 3 .
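- the continuity-based classification can be sketched as follows, applied to the example distances above. Requiring a discontinuity against both neighboring beams and the 2 m threshold are assumptions for illustration:

```python
# A sketch of recognizing noise reflection points R3 by continuity of
# the point cloud: a point whose distance differs from its neighboring
# beams by at least `threshold` is classified as noise. The 2 m
# threshold is a made-up value.

def classify_noise(distances, threshold=2.0):
    """Return the distances classified as noise reflection points."""
    noise = []
    for i, d in enumerate(distances):
        neighbors = [distances[j] for j in (i - 1, i + 1)
                     if 0 <= j < len(distances)]
        if all(abs(d - n) >= threshold for n in neighbors):
            noise.append(d)
    return noise

# The example from the text: the 5.5 m point stands alone.
print(classify_noise([19.8, 20.0, 5.5, 20.2, 20.1]))  # [5.5]
```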
- a number n_t is the number of the first reflection points R 1 that are measured during one frame.
- a number n_s is the number of the second reflection points R 2 that are measured during one frame.
- a number n_n is the number of noise reflection points R 3 that are measured during one frame.
- a number of no-reflection points m is the number of beams for which the reflected beam is not detected during one frame. In this case, a relationship represented by the following Equation (1) is satisfied.
- N = n_t + n_s + n_n + m . . . Equation (1), where N is the total number of beams emitted during one frame.
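- Equation (1) is a simple bookkeeping identity over one frame, as the following sketch with made-up counts shows:

```python
# Bookkeeping check of Equation (1): every beam emitted during one
# frame is accounted for as a first reflection point, a second
# reflection point, a noise reflection point, or a no-reflection beam.
# All counts here are made-up illustrative values.

N   = 2000   # total beams emitted in one frame
n_t = 1500   # first reflection points (stationary objects)
n_s = 300    # second reflection points (moving objects)
n_n = 50     # noise reflection points (raindrops, dust)
m   = N - (n_t + n_s + n_n)   # no-reflection beams

print(m)                          # 150
print(N == n_t + n_s + n_n + m)   # True
```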
- the first sensor perception information SEN 1 includes at least one of the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m.
- the first sensor perception information SEN 1 may include at least the number of first reflection points n_t.
- the first sensor perception information SEN 1 may include all of the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m.
- the first sensor perception information SEN 1 may not include the number of second reflection points n_s regarding the moving object.
- second sensor perception information is information on the moving object perceived by the perception sensor 30 .
- the second sensor perception information is necessary for the automated driving control by the vehicle control system 10 , but may not necessarily be included in the first sensor perception information SEN 1 .
- FIG. 11 is a conceptual diagram for explaining an example of the reference information REF.
- the reference information REF indicates a correspondence relationship between the expected value Xe of the parameter X and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the parameter X as a function of the vehicle position PV.
- the parameter X includes the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m.
- the ODD suitability determination process in rainy weather will be described.
- a reflected beam on a diffusely-reflecting surface is highly likely to return to the LIDAR 31 , but a reflected beam on a totally-reflecting surface is not. In rainy weather, the wet road surface 2 becomes close to a totally-reflecting surface. Therefore, in rainy weather, the first reflection point R 1 on the road surface 2 decreases.
- in addition, beams reflected by raindrops in the air increase, and thus the noise reflection point R 3 increases. That is to say, in rainy weather, the number of first reflection points n_t conspicuously decreases, while the number of noise reflection points n_n conspicuously increases. Moreover, the number of no-reflection points m increases as the number of first reflection points n_t decreases.
- FIG. 12 is a conceptual diagram for explaining the ODD suitability determination process using the number of first reflection points n_t.
- a vertical axis represents the number of first reflection points n_t, and a horizontal axis represents the vehicle position PV.
- the reference information REF indicates a correspondence relationship between the expected value Xe of the number of first reflection points n_t and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of first reflection points n_t as a function of the vehicle position PV.
- a first threshold value TH 1 defines a lower limit value of the number of first reflection points n_t at which the automated driving (e.g., LV 4 automated driving) can be continued without deceleration.
- the first threshold value TH 1 is set to be lower than the expected value Xe.
- a second threshold value TH 2 defines a lower limit value of the number of first reflection points n_t at which the automated driving can be continued if the vehicle 1 decelerates.
- the second threshold value TH 2 is set to be still lower than the first threshold value TH 1 .
- the first threshold value TH 1 and the second threshold value TH 2 may be registered on the reference information REF together with the expected value Xe.
- the processor 120 acquires the first sensor perception information SEN 1 acquired at the determination target position PT.
- the first sensor perception information SEN 1 includes the actual value Xa of the number of first reflection points n_t acquired at the determination target position PT.
- the processor 120 acquires the expected value Xe associated with the determination target position PT.
- when the actual value Xa is equal to or greater than the first threshold value TH 1 , the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible.
- when the actual value Xa is less than the first threshold value TH 1 and equal to or greater than the second threshold value TH 2 , the processor 120 determines that the automated driving is possible if decelerated.
- when the actual value Xa is less than the second threshold value TH 2 , the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible.
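- the three-way determination of FIG. 12 can be sketched as follows; the concrete threshold values are assumptions for illustration:

```python
# A sketch of the determination of FIG. 12, where TH1 and TH2 are
# lower limits on the number of first reflection points n_t
# (TH2 < TH1 <= expected value Xe). The numeric values are made up.

def determine_lower_limit(actual_xa, th1, th2):
    """Lower actual values mean worse perception conditions."""
    if actual_xa >= th1:
        return "automated driving possible"
    if actual_xa >= th2:
        return "possible if decelerated"
    return "not possible"

print(determine_lower_limit(1100, th1=1000, th2=800))  # automated driving possible
print(determine_lower_limit(900,  th1=1000, th2=800))  # possible if decelerated
print(determine_lower_limit(700,  th1=1000, th2=800))  # not possible
```

The determination of FIG. 13 has the same shape with the inequalities reversed, since TH 1 and TH 2 are upper limits there.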
- FIG. 13 is a conceptual diagram for explaining the ODD suitability determination process using the number of noise reflection points n_n.
- a vertical axis represents the number of noise reflection points n_n, and a horizontal axis represents the vehicle position PV.
- the reference information REF indicates a correspondence relationship between the expected value Xe of the number of noise reflection points n_n and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of noise reflection points n_n as a function of the vehicle position PV.
- a first threshold value TH 1 defines an upper limit value of the number of noise reflection points n_n at which the automated driving (e.g., LV 4 automated driving) can be continued without deceleration.
- the first threshold value TH 1 is set to be higher than the expected value Xe.
- a second threshold value TH 2 defines an upper limit value of the number of noise reflection points n_n at which the automated driving can be continued if the vehicle 1 decelerates.
- the second threshold value TH 2 is set to be still higher than the first threshold value TH 1 .
- the first threshold value TH 1 and the second threshold value TH 2 may be registered on the reference information REF together with the expected value Xe.
- the processor 120 acquires the first sensor perception information SEN 1 acquired at the determination target position PT.
- the first sensor perception information SEN 1 includes the actual value Xa of the number of noise reflection points n_n acquired at the determination target position PT.
- the processor 120 acquires the expected value Xe associated with the determination target position PT.
- when the actual value Xa is equal to or less than the first threshold value TH 1 , the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible.
- when the actual value Xa exceeds the first threshold value TH 1 and is equal to or less than the second threshold value TH 2 , the processor 120 determines that the automated driving is possible if decelerated.
- when the actual value Xa exceeds the second threshold value TH 2 , the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible.
- when the first sensor perception information SEN 1 includes both the number of first reflection points n_t and the number of noise reflection points n_n, the processor 120 performs both the ODD suitability determination process shown in FIG. 12 and the ODD suitability determination process shown in FIG. 13 . Then, the processor 120 adopts the determination result in which the traveling of the vehicle 1 is more restricted, that is, the one resulting in the lower vehicle speed.
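- adopting the more restrictive of the two determination results can be sketched as follows; the explicit restrictiveness ranking is an assumption for illustration:

```python
# A sketch of combining the two determinations (FIG. 12 using n_t and
# FIG. 13 using n_n) by adopting whichever result restricts the
# traveling of the vehicle more. The ranking is a made-up encoding.

RESTRICTIVENESS = {                  # higher = travel more restricted
    "automated driving possible": 0,
    "possible if decelerated": 1,
    "not possible": 2,
}

def combine(result_nt: str, result_nn: str) -> str:
    """Adopt the determination result with the higher restrictiveness."""
    return max(result_nt, result_nn, key=RESTRICTIVENESS.__getitem__)

print(combine("automated driving possible", "possible if decelerated"))
# possible if decelerated
```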
- fog is considered.
- in fog, the number of water droplets in the air greatly increases. Therefore, the number of noise reflection points n_n greatly increases.
- the number of no-reflection points m decreases as the number of noise reflection points n_n increases.
- the number of first reflection points n_t does not decrease so much. Therefore, using at least one of the number of noise reflection points n_n and the number of no-reflection points m makes it possible to appropriately perform the ODD suitability determination process.
- the ODD suitability determination process is the same as that in the first example described above.
- a case where an output of the LIDAR 31 is reduced is considered.
- the reduction in the output of the LIDAR 31 occurs due to aging, failure, heat, or the like.
- the number of reflection points decreases as a whole.
- the number of the first reflection points R 1 on a distant road surface 2 is greatly reduced. Therefore, in the third example, the first reflection points R 1 on the road surface 2 are further classified from a viewpoint of the distance from the vehicle 1 .
- the road surface 2 is divided into three types: a short-distance road surface 2 a , a medium-distance road surface 2 b , and a long-distance road surface 2 c .
- the processor 120 classifies the first reflection points R 1 on the road surface 2 into first reflection points R 1 a on the road surface 2 a , first reflection points R 1 b on the road surface 2 b , and first reflection points R 1 c on the road surface 2 c based on the distances to the measured reflection points.
- Numbers of first reflection points n_ta, n_tb, and n_tc are the numbers of first reflection points R 1 a , R 1 b , and R 1 c , respectively.
- the first sensor perception information SEN 1 includes the numbers of first reflection points n_ta, n_tb, and n_tc.
- the reference information REF indicates respective expected values Xe of the numbers of first reflection points n_ta, n_tb, and n_tc.
- the processor 120 performs the ODD suitability determination process by using the numbers of first reflection points n_ta, n_tb, and n_tc. As a result, even when the output of the LIDAR 31 is reduced, the ODD suitability determination process can be appropriately performed. Furthermore, it is also possible to estimate that the cause of the fact that the automated driving condition is not satisfied is the reduction in the output of the LIDAR 31 .
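- the distance classification of the third example can be sketched as follows; the 20 m and 50 m bin boundaries are assumptions for illustration:

```python
# A sketch of classifying road-surface first reflection points R1 into
# short-, medium-, and long-distance groups (R1a, R1b, R1c) and
# counting them (n_ta, n_tb, n_tc). The bin boundaries are made up.

def classify_by_distance(distances, near=20.0, far=50.0):
    """Return (n_ta, n_tb, n_tc): counts of short-, medium-, and
    long-distance road-surface reflection points."""
    n_ta = sum(1 for d in distances if d < near)
    n_tb = sum(1 for d in distances if near <= d < far)
    n_tc = sum(1 for d in distances if d >= far)
    return n_ta, n_tb, n_tc

print(classify_by_distance([5.0, 12.0, 30.0, 55.0, 80.0]))  # (2, 1, 2)
```

A drop concentrated in n_tc while n_ta stays near its expected value would point to a reduced LIDAR output rather than, say, rain.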
- a case where calibration of the LIDAR 31 is deteriorated is considered.
- the first reflection point R 1 on the road structure 3 can be identified by using the road structure map information included in the stationary object map information 215 .
- when the calibration of the LIDAR 31 is deteriorated, the accuracy of the identification of the first reflection point R 1 using the road structure map information decreases.
- the number of first reflection points R 1 on the road structure 3 decreases. Therefore, it is possible to appropriately perform the ODD suitability determination process by using the number of first reflection points n_t regarding the road structure 3 .
- the ODD suitability determination process is the same as that in the first example described above.
- the number of landmarks perceived in the localization is considered.
- examples of the landmark include a white line, a curb, a sign, a pole, and the like.
- the result of perception of the landmark using the perception sensor 30 is acquired from the object information 233 .
- the number of detected landmarks is reduced.
- the white line, which is detected based on a luminance value, becomes hard to detect when the road surface 2 is wet.
- the detection counts of the white line and the curb are significantly reduced.
- the ODD suitability determination process is performed by using the number of landmarks perceived by the perception sensor 30 .
- the first sensor perception information SEN 1 includes the number of landmarks perceived by using the perception sensor 30 .
- the reference information REF indicates an expected value Xe of the number of each landmark.
- the processor 120 performs the ODD suitability determination process by using the number of each landmark.
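- As a non-limiting illustration (not part of the original specification), the per-landmark comparison can be sketched as follows; the landmark names and the allowable fraction are assumptions:

```python
# Illustrative sketch: compare the number of each perceived landmark class
# with its expected value Xe from the reference information REF, and report
# the classes that deviate. The 0.5 allowable fraction is an assumption.
def deviating_landmarks(perceived: dict, expected: dict, ratio: float = 0.5) -> list:
    """Return landmark classes whose perceived count falls below the
    assumed allowable fraction of the expected value Xe."""
    return [name for name, xe in expected.items()
            if perceived.get(name, 0) < ratio * xe]
```

On a wet road surface, the white line and curb counts drop sharply while signs and poles are less affected, so such a sketch would flag only the luminance-dependent classes.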
- FIG. 15 is a conceptual diagram for explaining the ODD suitability determination process using the number of white lines n_wl.
- a vertical axis represents the number of white lines n_wl, and a horizontal axis represents the vehicle position PV.
- the reference information REF indicates a correspondence relationship between the expected value Xe of the number of white lines n_wl and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of white lines n_wl as a function of the vehicle position PV.
- a first threshold value TH 1 defines a lower limit value of the number of white lines n_wl that allows the automated driving (e.g., LV 4 automated driving) to continue without deceleration.
- the first threshold value TH 1 is set to be lower than the expected value Xe.
- a second threshold value TH 2 defines a lower limit value of the number of white lines n_wl that allows the automated driving to continue if the vehicle is decelerated.
- the second threshold value TH 2 is set to be lower than the first threshold value TH 1 .
- the first threshold value TH 1 and the second threshold value TH 2 may be registered in the reference information REF together with the expected value Xe.
- the processor 120 acquires the first sensor perception information SEN 1 acquired at the determination target position PT.
- the first sensor perception information SEN 1 includes the actual value Xa of the number of white lines n_wl acquired at the determination target position PT.
- the processor 120 acquires the expected value Xe associated with the determination target position PT.
- when the actual value Xa is equal to or larger than the first threshold value TH 1 , the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible.
- when the actual value Xa is less than the first threshold value TH 1 but equal to or larger than the second threshold value TH 2 , the processor 120 determines that the automated driving is possible if decelerated.
- when the actual value Xa is less than the second threshold value TH 2 , the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible.
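- As a non-limiting illustration (not part of the original specification), the two-threshold decision described above can be sketched as follows; deriving TH 1 and TH 2 as fixed fractions of the expected value Xe is an assumption (the description only requires TH 2 < TH 1 < Xe):

```python
# Illustrative sketch of the two-threshold ODD suitability decision.
# The threshold ratios (0.7 and 0.4 of the expected value Xe) are assumptions.
def odd_decision(xa: float, xe: float, th1_ratio: float = 0.7, th2_ratio: float = 0.4) -> str:
    """Classify the actual white-line count Xa against thresholds TH1 and TH2."""
    th1 = th1_ratio * xe  # lower limit for automated driving without deceleration
    th2 = th2_ratio * xe  # lower limit for automated driving with deceleration
    if xa >= th1:
        return "continue"              # automated driving condition satisfied
    if xa >= th2:
        return "continue_decelerated"  # automated driving possible if decelerated
    return "not_possible"              # automated driving condition not satisfied
```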
- FIG. 16 is a conceptual diagram for explaining a sixth example.
- the image (image information 232 ) captured by the camera 32 is considered.
- the processor 120 can extract the road surface 2 in the image by analyzing the image captured by the camera 32 .
- the processor 120 can extract the road surface 2 in the image by applying semantic segmentation to the image.
- the segmentation (region division) is a technique of grouping regions having similar feature amounts (color, texture, or the like) in the image to divide the image into a plurality of regions.
- Quality (visibility) of the image captured by the camera greatly varies depending on an imaging condition. For example, in rainy weather, the image quality is deteriorated. As another example, if a lens of the camera 32 is dirty, the image quality is deteriorated. As still another example, at night, the image quality is deteriorated due to insufficient light intensity. When the image quality is deteriorated, the object detection performance based on the image is degraded, and thus the accuracy of the automated driving control is degraded. Therefore, it is desirable to perform the ODD suitability determination process with high accuracy.
- the processor 120 extracts the road surface 2 in the image by analyzing the image captured by the camera 32 .
- the ODD suitability determination process is performed by using an area ratio of the road surface 2 in the image.
- the first sensor perception information SEN 1 includes the area ratio of the road surface 2 in the image.
- the reference information REF indicates the expected value Xe of the area ratio of the road surface 2 in the image.
- the processor 120 performs the ODD suitability determination process by using the area ratio of the road surface 2 in the image.
- the ODD suitability determination process is the same as that in the fifth example described above.
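- As a non-limiting illustration (not part of the original specification), computing the area ratio of the road surface from a semantic segmentation result and comparing it with the expected value Xe can be sketched as follows; the segmentation itself is assumed to have already produced a per-pixel label mask, and the label value and tolerance are assumptions:

```python
# Illustrative sketch: area ratio of the road surface in a segmented image.
ROAD_LABEL = 1  # assumed class id assigned to road-surface pixels

def road_area_ratio(seg_mask: list) -> float:
    """seg_mask is a 2-D list of per-pixel class labels."""
    total = sum(len(row) for row in seg_mask)
    road = sum(row.count(ROAD_LABEL) for row in seg_mask)
    return road / total if total else 0.0

def condition_satisfied(ratio: float, expected: float, tolerance: float = 0.15) -> bool:
    """Satisfied when the actual ratio does not fall below Xe by more than the
    assumed tolerance (rain, lens dirt, or darkness shrinks the extracted road)."""
    return ratio >= expected - tolerance
```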
Abstract
An automated driving management system is applied to a vehicle that performs automated driving by using a perception sensor. The automated driving management system acquires first sensor perception information indicating a result of perception by the perception sensor. Reference information indicates a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied. Based on the reference information, the automated driving management system acquires the expected sensor perception information associated with a determination target position. The automated driving management system determines whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
Description
- This application claims priority to Japanese Patent Application No. 2022-118084 filed on Jul. 25, 2022, the entire contents of which are incorporated by reference herein.
- The present disclosure relates to a technique for managing automated driving of a vehicle. In particular, the present disclosure relates to a technique for determining whether or not an automated driving condition is satisfied.
-
Patent Literature 1 discloses an autonomous traveling vehicle provided with a plurality of sensors. The autonomous traveling vehicle evaluates a state of dirt or failure of the sensors. When sensor performance is degraded due to dirt or failure, the autonomous traveling vehicle operates in a degenerate mode in which a speed and a steering angle are limited. -
Patent Literature 2 discloses an electronic control device installed on a vehicle. The electronic control device determines a sensor detectable region based on detection information of a sensor installed on the vehicle. The electronic control device generates travel control information of the vehicle based on information detected by the sensor and the sensor detectable region. -
-
- Patent Literature 1: International Publication No. WO2015/068249
- Patent Literature 2: Japanese Laid-Open Patent Application No. JP-2021-187324
- Automated driving of a vehicle is considered. An automated driving condition is a condition under which the automated driving of the vehicle is permitted, and is also referred to as an operational design domain (ODD). An automated driving system is designed to be operated under a predetermined automated driving condition (ODD). Therefore, when performing the automated driving, it is important to determine whether or not the automated driving condition is satisfied.
- An object of the present disclosure is to provide a technique capable of more accurately determining whether or not an automated driving condition is satisfied.
- A first aspect relates to an automated driving management system.
- The automated driving management system is applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation.
- The automated driving management system includes:
-
- one or more processors configured to acquire first sensor perception information indicating a result of perception by the perception sensor; and
- one or more memory devices configured to store reference information indicating a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied.
- The one or more processors acquire, based on the reference information, the expected sensor perception information associated with a determination target position.
- The one or more processors determine whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
- A second aspect relates to an automated driving management method.
- The automated driving management method is applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation.
- First sensor perception information indicates a result of perception by the perception sensor.
- Reference information indicates a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied.
- The automated driving management method includes:
-
- acquiring, based on the reference information, the expected sensor perception information associated with a determination target position; and
- determining whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
- According to the present disclosure, the reference information indicating the correspondence relationship between the vehicle position and the expected sensor perception information is prepared. The expected sensor perception information is the first sensor perception information (i.e., the result of perception by the perception sensor) expected when the automated driving condition is satisfied. Therefore, comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position makes it possible to accurately determine whether or not the automated driving condition is satisfied at the determination target position.
-
FIG. 1 is a conceptual diagram for explaining an overview of a vehicle and a vehicle control system according to an embodiment of the present disclosure; -
FIG. 2 is a conceptual diagram for explaining an overview of an ODD suitability determination process according to an embodiment of the present disclosure; -
FIG. 3 is a diagram for explaining an example of an ODD suitability determination process according to an embodiment of the present disclosure; -
FIG. 4 is a conceptual diagram for explaining a management server according to an embodiment of the present disclosure; -
FIG. 5 is a block diagram showing a configuration example of a vehicle control system according to an embodiment of the present disclosure; -
FIG. 6 is a block diagram showing an example of driving environment information according to an embodiment of the present disclosure; -
FIG. 7 is a conceptual diagram for explaining a localization process according to an embodiment of the present disclosure; -
FIG. 8 is a block diagram showing a configuration example of an automated driving management system according to an embodiment of the present disclosure; -
FIG. 9 is a flowchart showing an example of processing by an automated driving management system according to an embodiment of the present disclosure; -
FIG. 10 is a conceptual diagram for explaining an example of point cloud information according to an embodiment of the present disclosure; -
FIG. 11 is a conceptual diagram for explaining an example of reference information according to an embodiment of the present disclosure; -
FIG. 12 is a conceptual diagram for explaining an example of an ODD suitability determination process according to an embodiment of the present disclosure; -
FIG. 13 is a conceptual diagram for explaining another example of an ODD suitability determination process according to an embodiment of the present disclosure; -
FIG. 14 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure; -
FIG. 15 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure; and -
FIG. 16 is a conceptual diagram for explaining still another example of an ODD suitability determination process according to an embodiment of the present disclosure.
- Embodiments of the present disclosure will be described with reference to the accompanying drawings.
-
FIG. 1 is a conceptual diagram for explaining an overview of a vehicle 1 and a vehicle control system 10 according to the present embodiment. The vehicle control system 10 controls the vehicle 1. Typically, the vehicle control system 10 is installed on the vehicle 1. Alternatively, at least a part of the vehicle control system 10 may be included in a remote system outside the vehicle 1 to remotely control the vehicle 1. - The
vehicle 1 is capable of automated driving, and the vehicle control system 10 is configured to control the automated driving of the vehicle 1. The automated driving supposed here is one in which the driver is not necessarily required to fully concentrate on driving (e.g., so-called Level 3 or higher automated driving). The automated driving level may be Level 4 or higher, which does not need a driver. - In the automated driving of the
vehicle 1, a perception sensor 30 mounted on the vehicle 1 is used. The perception sensor 30 is a sensor for perceiving a situation around the vehicle 1. Examples of the perception sensor 30 include a laser imaging detection and ranging (LIDAR) sensor, a camera, a radar, and the like. The LIDAR emits beams and detects a beam reflected at a reflection point to measure a relative position of the reflection point. - The
vehicle control system 10 uses the perception sensor 30 to perceive a situation around the vehicle 1. For example, the vehicle control system 10 uses the perception sensor 30 to perceive a stationary object and a moving object around the vehicle 1. Examples of the stationary object include a road surface 2, a road structure 3 (e.g., a wall, a guardrail, a curb), a white line, and the like. Examples of the moving object include a surrounding vehicle 4, a pedestrian 5, and the like. Then, the vehicle control system 10 executes automated driving control regarding the vehicle 1 based on a result of the perception processing using the perception sensor 30. - An automated driving condition is a condition under which the automated driving of the
vehicle 1 is permitted. The automated driving condition is also referred to as an operational design domain (ODD) or an operation design domain. Generally, the automated driving condition is defined by a maximum vehicle speed, a traveling area, a weather condition, a sunshine condition, and the like. For example, in rainy weather, the accuracy of the perception processing using the perception sensor 30 may decrease, and thus the accuracy of the automated driving control may decrease. Therefore, conventionally, "a rainfall amount per unit time is less than a predetermined value (for example, 5 mm/h)" has been used as one of the automated driving conditions related to the weather. - The
vehicle control system 10 is designed to perform the automated driving under a predetermined automated driving condition (ODD). Therefore, when performing the automated driving, it is important to determine whether or not the automated driving condition is satisfied. A process of determining whether or not the automated driving condition is satisfied is hereinafter referred to as an “ODD suitability determination process.” The inventor of the present application has recognized the following problem regarding the ODD suitability determination process. - As an example, the above-mentioned automated driving condition related to the weather, “a rainfall amount per unit time is less than a predetermined value” is considered. The rainfall amount varies widely even in a relatively narrow area, and local torrential rain has increased in recent years. Therefore, it is not easy to accurately pinpoint the rainfall amount at a current position of the
vehicle 1. In order to increase accuracy of measurement of the local rainfall amount, a large-scale infrastructure such as deployment of a large number of rainfall amount sensors is required. This is not desirable from a viewpoint of costs. In addition, when the sun shines under a wet road surface condition after rain, reflection of light from the wet road surface increases. In this case, accuracy of perception of the road surface and a fallen object by theperception sensor 30 may decrease. That is, even when it is not raining, there is a possibility that an environment around thevehicle 1 is not desirable for the automated driving. Therefore, performing the ODD suitability determination process based on a simple comparison between the rainfall amount and a threshold value is not necessarily appropriate from a viewpoint of the accuracy of the automated driving control. - In a case of a weather that is difficult to measure such as fog, difficulty of the ODD suitability determination process further increases.
- In addition, not only the natural environment such as the weather but also aging or performance deterioration of the
perception sensor 30 itself affects the accuracy of the automated driving control. It is necessary to perform the ODD suitability determination process in consideration of various factors that affect the accuracy of the automated driving control. - In view of the above, the present disclosure proposes a new technique capable of improving the accuracy of the ODD suitability determination process.
- First, a technical concept of the new ODD suitability determination process according to the present embodiment will be described.
- As described above, it is not necessarily appropriate to perform the ODD suitability determination process using a parameter such as the rainfall amount that specifically defines the weather itself. A human driver does not decide whether it is easy or difficult to drive by looking at the specific parameter such as the rainfall amount. The human driver decides whether it is easy or difficult to drive based on information perceived by the human driver's own vision. For example, when the sun shines under a wet road surface condition after rain, reflection of light from the wet road surface is so bright that the road surface cannot be seen as usual, and thus the human driver decides that it is difficult to drive. That is, the human driver decides that it is difficult to drive, when the information perceived by the human driver's own vision is different from usual one.
- The ODD suitability determination process according to the present embodiment also is performed in the same manner as a sense of the human driver. An “eye” for the
vehicle control system 10 that performs the automated driving control is theperception sensor 30. Therefore, according to the present embodiment, the ODD suitability determination process is performed based on the result of perception by theperception sensor 30. That is, the ODD suitability determination process is performed based on whether “appearance” viewed from theperception sensor 30 of thevehicle 1 is as usual or not. -
FIG. 2 is a conceptual diagram for explaining an overview of the ODD suitability determination process according to the present embodiment. “First sensor perception information SEN1” indicates the result of perception by theperception sensor 30 mounted on thevehicle 1. That is, the first sensor perception information SEN1 corresponds to the “appearance” viewed from theperception sensor 30. For example, when theperception sensor 30 includes the LIDAR, the first sensor perception information SEN1 includes information of a point cloud (beam reflection points) measured by the LIDAR. For example, the first sensor perception information SEN1 includes the number of beam reflection points on theroad surface 2 measured during one frame. - Typically, the first sensor perception information SEN1 includes information on a stationary object (e.g., the
road surface 2, the road structure 3) perceived by theperception sensor 30. On the other hand, information on a moving object (e.g., the surroundingvehicle 4, the pedestrian 5) perceived by theperception sensor 30 may not necessarily be included in the first sensor perception information SEN1. For convenience sake, the information on the moving object perceived by theperception sensor 30 is hereinafter referred to as “second sensor perception information.” The second sensor perception information is necessary for the automated driving control by thevehicle control system 10, but may not necessarily be included in the first sensor perception information SEN1. - “Expected sensor perception information ESEN” is the first sensor perception information SEN1 expected when the automated driving condition is satisfied. The automated driving condition is determined in advance in consideration of various factors that affect the accuracy of the automated driving control. The expected sensor perception information ESEN corresponds to “appearance” viewed from the
perception sensor 30 in the case where the automated driving is permissible. - “Reference information REF” indicates a correspondence relationship between a vehicle position PV and the expected sensor perception information ESEN. That is, the reference information REF indicates the expected sensor perception information ESEN as a function of the vehicle position PV. It can also be said that the reference information REF indicates the expected sensor perception information ESEN when the
vehicle 1 is present at the vehicle position PV. - The vehicle position PV may be set along a general vehicle travel trajectory in a road. The vehicle position PV may be assumed to be located at a lane center. The vehicle position PV may be a concept including both a position and a direction of the
vehicle 1. The direction of thevehicle 1 may be assumed to be parallel to an extending direction of a lane (white line). - The expected sensor perception information ESEN and the reference information REF are generated and updated based on information acquired when the automated driving condition is satisfied. For example, the expected sensor perception information ESEN and the reference information REF are generated and updated based on past automated driving records of one or
more vehicles 1. In this case, it can be said that the expected sensor perception information ESEN and the reference information REF represent “past successful experiences.” As another example, the reference information REF may be generated and updated through a simulation based on configuration information of an automated driving area and design information of theperception sensor 30. - It can be said that the reference information REF that represents the expected sensor perception information ESEN as a function of the vehicle position PV is a kind of map information. However, it should be noted that the reference information REF is a concept that is totally different from general map information. The general map information indicates an arrangement of objects in an absolute coordinate system. That is, the general map information indicates a correspondence relationship between an absolute position and an object present at the absolute position. On the other hand, the reference information REF indicates a correspondence relationship between the vehicle position PV and the first sensor perception information SEN1 (i.e., the result of perception) as viewed from the vehicle position PV when the automated driving condition is satisfied. The reference information REF does not indicate an object present at the vehicle position PV.
- An automated
driving management system 100 is applied to the vehicle 1 and manages the automated driving of the vehicle 1. The automated driving management system 100 holds the above-described reference information REF, and performs the ODD suitability determination process regarding the vehicle 1 based on the reference information REF. In particular, the automated driving management system 100 determines whether or not the automated driving condition is satisfied for the vehicle 1 present at a determination target position PT. Typically, the determination target position PT is the current position of the vehicle 1. As another example, the determination target position PT may be a past position of the vehicle 1. - More specifically, the automated
driving management system 100 acquires the first sensor perception information SEN1 acquired by the vehicle 1 (i.e., the vehicle control system 10) at the determination target position PT. In addition, the automated driving management system 100 acquires, based on the reference information REF, the expected sensor perception information ESEN associated with the determination target position PT. The expected sensor perception information ESEN associated with the determination target position PT is the first sensor perception information SEN1 expected at the determination target position PT when the automated driving condition is satisfied. Therefore, the automated driving management system 100 is able to determine whether or not the automated driving condition is satisfied at the determination target position PT by comparing the first sensor perception information SEN1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT. When the first sensor perception information SEN1 acquired at the determination target position PT is significantly different from the expected sensor perception information ESEN associated with the determination target position PT, the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT. -
FIG. 3 is a diagram for explaining an example of the ODD suitability determination process according to the present embodiment. In FIG. 3, a horizontal axis represents the vehicle position PV and a vertical axis represents a parameter X. The parameter X is a parameter indicating the result of perception by the perception sensor 30 and is included in the first sensor perception information SEN1. For example, the parameter X is the number of beam reflection points on the road surface 2 measured by the LIDAR.
- An allowable range RNG is a range of the parameter X in which the automated driving is allowed. The allowable range RNG includes at least the expected value Xe. A width of the allowable range RNG is predetermined. The width of the allowable range RNG may be set based on a standard deviation (a) of the large number of parameters X acquired when the automated driving condition is satisfied. A set of the expected value Xe and the allowable range RNG may be registered in the reference information REF.
- The automated
driving management system 100 acquires the first sensor perception information SEN1 acquired by the vehicle 1 (i.e., the vehicle control system 10) at the determination target position PT. The first sensor perception information SEN1 includes an actual value Xa of the parameter X acquired at the determination target position PT. In addition, based on the reference information REF, the automated driving management system 100 acquires the expected value Xe associated with the determination target position PT. Then, the automated driving management system 100 determines whether or not the automated driving condition is satisfied at the determination target position PT by comparing the actual value Xa of the parameter X acquired at the determination target position PT with the allowable range RNG including the expected value Xe associated with the determination target position PT. More specifically, when the actual value Xa acquired at the determination target position PT is within the allowable range RNG, the automated driving management system 100 determines that the automated driving condition is satisfied at the determination target position PT. On the other hand, when the actual value Xa acquired at the determination target position PT deviates from the allowable range RNG, the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT. - For example, the parameter X is the number of beam reflection points on the
road surface 2 measured by the LIDAR. The expected value Xe is an expected value of the number of beam reflection points on the road surface 2 when the automated driving condition is satisfied. In rainy weather, the number of beam reflection points on the road surface 2 decreases conspicuously. When the number of beam reflection points on the road surface 2 falls below the allowable range RNG including the expected value Xe, the automated driving management system 100 determines that the automated driving condition is not satisfied at the determination target position PT. That is, when the "appearance" of the road surface 2 viewed from the perception sensor of the vehicle 1 is different from usual, it is determined that the automated driving condition is not satisfied. - When the determination target position PT is the current position of the
vehicle 1 and it is determined that the automated driving condition is not satisfied at the determination target position PT, the automated driving management system 100 decelerates or stops the vehicle 1. For example, the automated driving management system 100 instructs the vehicle control system 10 to decelerate or stop the vehicle 1. - The automated
driving management system 100 may be included in the vehicle control system 10 of the vehicle 1 or may be provided separately from the vehicle control system 10. The automated driving management system 100 may be a management server that communicates with the vehicle 1 (the vehicle control system 10). The automated driving management system 100 and the vehicle control system 10 may be partially common. -
FIG. 4 is a conceptual diagram for explaining a management server 1000 that manages the automated driving. The management server 1000 may be configured by a plurality of servers that execute distributed processing. The management server 1000 is communicably connected to a large number of vehicles 1 that perform the automated driving. The management server 1000 collects the vehicle positions PV and the first sensor perception information SEN1 from the large number of vehicles 1. In particular, the management server 1000 collects the vehicle positions PV and the first sensor perception information SEN1 in the case where the automated driving is possible from the large number of vehicles 1. Then, the management server 1000 generates and updates the above-described reference information REF based on the information collected from the large number of vehicles 1. - For example, the automated
driving management system 100 is included in the management server 1000. In this case, the management server 1000 communicates with the vehicle 1 being a determination target and acquires information of the determination target position PT and the first sensor perception information SEN1 acquired at the determination target position PT. Then, the management server 1000 performs the above-described ODD suitability determination process based on the information acquired from the vehicle 1 being the determination target and the reference information REF. When it is determined that the automated driving condition is not satisfied at the determination target position PT, the management server 1000 instructs the vehicle control system 10 of the vehicle 1 being the determination target to decelerate or stop. - As another example, the automated
driving management system 100 may be included in the vehicle control system 10. In this case, the vehicle control system 10 communicates with the management server 1000 and acquires the reference information REF from the management server 1000. In addition, the vehicle control system 10 acquires the first sensor perception information SEN1 at the determination target position PT. Then, the vehicle control system 10 performs the above-described ODD suitability determination process based on the first sensor perception information SEN1 and the reference information REF. When it is determined that the automated driving condition is not satisfied at the determination target position PT, the vehicle control system 10 decelerates or stops the vehicle 1. - Generalization is as follows. The automated
driving management system 100 includes one or more processors and one or more memory devices. The one or more processors may be included in the vehicle control system 10, may be included in the management server 1000, or may be distributed to the vehicle control system 10 and the management server 1000. The one or more memory devices may be included in the vehicle control system 10, may be included in the management server 1000, or may be distributed to the vehicle control system 10 and the management server 1000. The one or more memory devices store the reference information REF. The one or more processors acquire the first sensor perception information SEN1 and perform the ODD suitability determination process based on the first sensor perception information SEN1 and the reference information REF. - As described above, according to the present embodiment, the reference information REF indicating the correspondence relationship between the expected sensor perception information ESEN and the vehicle position PV is prepared. The expected sensor perception information ESEN is the first sensor perception information SEN1 (i.e., the result of perception by the perception sensor 30) expected when the automated driving condition is satisfied. Therefore, comparing the first sensor perception information SEN1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT makes it possible to accurately determine whether or not the automated driving condition is satisfied at the determination target position PT. For example, as compared with a case where a parameter specifically defining the weather itself such as the rainfall amount is used, it is possible to more accurately determine whether or not the automated driving condition is satisfied.
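As a rough illustration of the correspondence relationship described above, the reference information REF can be modeled as a table of expected values keyed by vehicle position. The table contents, the one-dimensional position key, and the nearest-neighbor lookup below are assumptions for illustration only, not the implementation described in this disclosure.

```python
import bisect

# Hypothetical reference information REF: expected value Xe of a parameter X
# (e.g., a reflection-point count) keyed by position along a route in meters.
REF = [(0.0, 820), (50.0, 790), (100.0, 860), (150.0, 840)]

def expected_value(ref, position_pt):
    """Return the expected value Xe associated with the determination target
    position PT by nearest-position lookup (interpolation would also work)."""
    keys = [pos for pos, _ in ref]
    i = bisect.bisect_left(keys, position_pt)
    # Clamp to the table and pick the closer of the two neighboring entries.
    candidates = ref[max(i - 1, 0):i + 1] or [ref[-1]]
    return min(candidates, key=lambda kv: abs(kv[0] - position_pt))[1]

print(expected_value(REF, 60.0))  # 790 (the 50.0 m entry is the nearest)
```

The lookup only supplies the expected value Xe; the comparison against the actual value Xa and the allowable range RNG happens in the ODD suitability determination process itself.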
- In addition, not only the natural environment such as the weather but also aging or the performance deterioration of the
perception sensor 30 itself affects the accuracy of the automated driving control. It is necessary to perform the ODD suitability determination process in consideration of various factors that affect the accuracy of the automated driving control. The expected sensor perception information ESEN according to the present embodiment is the first sensor perception information SEN1 (i.e., the result of perception by the perception sensor 30) expected when the automated driving condition is satisfied. Therefore, the various factors that affect the accuracy of the automated driving control are integrally reflected in the expected sensor perception information ESEN. Using such expected sensor perception information ESEN and the reference information REF makes it possible to perform the ODD suitability determination process easily and with high accuracy. - Hereinafter, specific examples of the
vehicle control system 10, the automated driving management system 100, and the ODD suitability determination process according to the present embodiment will be described. -
FIG. 5 is a block diagram showing a configuration example of the vehicle control system 10 according to the present embodiment. The vehicle control system 10 includes a vehicle state sensor 20, a perception sensor 30, a position sensor 40, a travel device 50, a communication device 60, and a control device 70. - The
vehicle state sensor 20 detects a state of the vehicle 1. For example, the vehicle state sensor 20 includes a speed sensor, an acceleration sensor, a yaw rate sensor, a steering angle sensor, and the like. - The
perception sensor 30 perceives (detects) a situation around the vehicle 1. The perception sensor 30 includes a LIDAR 31, a camera 32, a radar, and the like. The LIDAR 31 emits beams and detects a reflected beam reflected at a reflection point to measure a relative position of the reflection point. The camera 32 images a situation around the vehicle 1 to acquire an image. - The
position sensor 40 detects a position and an orientation of the vehicle 1. Examples of the position sensor 40 include an inertial measurement unit (IMU), a global navigation satellite system (GNSS) sensor, and the like. - The
travel device 50 includes a steering device, a driving device, and a braking device. The steering device steers wheels. For example, the steering device includes an electric power steering (EPS) device. The driving device is a power source that generates a driving force. Examples of the driving device include an engine, an electric motor, and an in-wheel motor. The braking device generates a braking force. - The
communication device 60 communicates with the outside of the vehicle 1. For example, the communication device 60 communicates with the management server 1000 (see FIG. 4). - The control device (controller) 70 is a computer that controls the
vehicle 1. The control device 70 includes one or more processors 71 (hereinafter simply referred to as a processor 71) and one or more memory devices 72 (hereinafter simply referred to as a memory device 72). The processor 71 executes a variety of processing. For example, the processor 71 includes a central processing unit (CPU). The memory device 72 stores a variety of information. Examples of the memory device 72 include a volatile memory, a nonvolatile memory, a hard disk drive (HDD), a solid state drive (SSD), and the like. The control device 70 may include one or more electronic control units (ECUs). A part of the control device 70 may be an information processing device outside the vehicle 1. In this case, a part of the control device 70 communicates with the vehicle 1 and remotely controls the vehicle 1. - A
vehicle control program 80 is a computer program for controlling the vehicle 1. The variety of processing by the control device 70 may be implemented by the processor 71 executing the vehicle control program 80. The vehicle control program 80 is stored in the memory device 72. The vehicle control program 80 may be recorded on a non-transitory computer-readable recording medium. - The
control device 70 acquires driving environment information 200 indicating a driving environment for the vehicle 1. The driving environment information 200 is stored in the memory device 72. - FIG. 6 is a block diagram showing an example of the driving environment information 200. The driving environment information 200 includes map information 210, vehicle state information 220, surrounding situation information 230, and vehicle position information 240. - The
map information 210 includes a general navigation map. The map information 210 may indicate a lane configuration and a road shape. The map information 210 may include position information of landmarks, traffic signals, signs, and so forth. The control device 70 acquires the map information 210 of a necessary area from a map database. The map database may be stored in the memory device 72 or may be managed by the management server 1000. In the latter case, the control device 70 communicates with the management server 1000 via the communication device 60 to acquire the necessary map information 210. - The
map information 210 may include stationary object map information 215 indicating an absolute position where a stationary object is present. Examples of the stationary object include a road surface 2, a road structure 3, and the like. Examples of the road structure 3 include a wall, a guardrail, a curb, a fence, a plant, and the like. - The stationary
object map information 215 may include terrain map information indicating an absolute position (latitude, longitude, and altitude) where the road surface 2 is present. The terrain map information may include an evaluation value set for each absolute position. The evaluation value indicates “certainty (likelihood)” that the road surface 2 is present at the absolute position. - The stationary
object map information 215 may include road structure map information indicating an absolute position where the road structure 3 is present. The road structure map information may include an evaluation value set for each absolute position. The evaluation value indicates “certainty (likelihood)” that the road structure 3 is present at the absolute position. - The
vehicle state information 220 is information indicating the state of the vehicle 1 and includes a vehicle speed, an acceleration, a yaw rate, a steering angle, and the like. The control device 70 acquires the vehicle state information 220 from the vehicle state sensor 20. The vehicle state information 220 may indicate a driving state (automated driving or manual driving) of the vehicle 1. - The surrounding
situation information 230 is information indicating the situation around the vehicle 1. The control device 70 perceives (recognizes) the situation around the vehicle 1 by using the perception sensor 30 to acquire the surrounding situation information 230. - For example, the surrounding
situation information 230 includes point cloud information 231 indicating a result of measurement by the LIDAR 31. More specifically, the point cloud information 231 indicates a relative position (an azimuth and a distance) of each beam reflection point viewed from the LIDAR 31. - The surrounding
situation information 230 may include image information 232 captured by the camera 32. - The surrounding
situation information 230 further includes object information 233 regarding an object around the vehicle 1. Examples of the object include a white line, the road structure 3, a surrounding vehicle 4 (e.g., a preceding vehicle, a parked vehicle, and the like), a pedestrian 5, a traffic signal, a landmark, a fallen object, and the like. The object information 233 indicates a relative position and a relative speed of the object with respect to the vehicle 1. For example, analyzing the image information 232 captured by the camera 32 makes it possible to identify an object and calculate the relative position of the object. For example, the control device 70 identifies an object in the image information 232 by using image perception AI acquired by machine learning. It is also possible to identify an object and acquire the relative position and the relative speed of the object based on the point cloud information 231 acquired by the LIDAR 31. - In the object perception, the
control device 70 may utilize the above-described stationary object map information 215. The position of the stationary object (e.g., the road surface 2, the road structure 3) is registered on the stationary object map information 215. Therefore, using the stationary object map information 215 makes it possible to distinguish the stationary object from other objects. More specifically, the control device 70 grasps the position of the stationary object existing around the vehicle 1 based on the stationary object map information 215 and the vehicle position information 240. Then, the control device 70 removes (thins out) the stationary object from the objects perceived using the perception sensor 30. It is thus possible to distinguish the stationary object from the other objects (e.g., a surrounding vehicle 4, a pedestrian 5, a fallen object, and the like). For example, the control device 70 is able to detect the surrounding vehicle 4, the pedestrian 5, the fallen object, and the like on the road surface 2 by removing the road surface 2 indicated by the terrain map information from the point cloud information 231. - The
vehicle position information 240 is information indicating the position and the orientation of the vehicle 1. The control device 70 acquires the vehicle position information 240 from a result of detection by the position sensor 40. The control device 70 may acquire highly accurate vehicle position information 240 by a known self-position estimation process (localization) using the object information 233 and the map information 210. -
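The thinning-out of mapped stationary objects described in the object perception above can be sketched as follows. The grid-cell representation of the stationary object map information 215, the cell size, and the function names are assumptions for illustration; the disclosure does not prescribe a particular data structure.

```python
import math

def remove_stationary_points(points, stationary_cells, cell_size=0.5):
    """Drop point-cloud returns that fall on mapped stationary objects.

    points           -- list of (x, y) absolute positions of reflection points,
                        already transformed using the vehicle position PV
    stationary_cells -- set of grid cells assumed (from the stationary object
                        map information 215) to contain the road surface 2 or
                        a road structure 3
    Returns only the points attributable to other objects (e.g., a surrounding
    vehicle 4, a pedestrian 5, or a fallen object).
    """
    def cell_of(p):
        return (math.floor(p[0] / cell_size), math.floor(p[1] / cell_size))
    return [p for p in points if cell_of(p) not in stationary_cells]

# Example: two mapped road-surface cells; the point at (5.2, 0.1) survives
# because no stationary object is registered there.
cloud = [(0.2, 0.3), (1.1, 0.4), (5.2, 0.1)]
known = {(0, 0), (2, 0)}
print(remove_stationary_points(cloud, known))  # [(5.2, 0.1)]
```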
FIG. 7 is a conceptual diagram for explaining the self-position estimation process (localization). Various landmarks (characteristic objects) are present around the vehicle 1. Examples of the landmark include a white line, a curb, a sign, and a pole. The control device 70 uses the perception sensor 30 to perceive the landmark around the vehicle 1. The object information 233 indicates the relative position of the perceived landmark. Meanwhile, the absolute position of the landmark is registered on the map information 210. The control device 70 corrects the vehicle position information 240 such that the relative position of the landmark indicated by the object information 233 and the absolute position of the landmark acquired from the map information 210 are consistent with each other. As a result, highly accurate vehicle position information 240 can be acquired. - The
control device 70 executes vehicle travel control that controls travel of the vehicle 1. The vehicle travel control includes steering control, acceleration control, and deceleration control. The control device 70 executes the vehicle travel control by controlling the travel device 50. More specifically, the control device 70 executes the steering control by controlling the steering device. In addition, the control device 70 executes the acceleration control by controlling the driving device. Further, the control device 70 executes the deceleration control by controlling the braking device. - The
control device 70 executes the automated driving control based on the driving environment information 200. More specifically, the control device 70 generates a travel plan of the vehicle 1 based on the driving environment information 200. Examples of the travel plan include keeping a current travel lane, making a lane change, making a right or left turn, avoiding an obstacle, and the like. Further, based on the driving environment information 200, the control device 70 generates a target trajectory necessary for the vehicle 1 to travel in accordance with the travel plan. The target trajectory includes a target position and a target velocity. Then, the control device 70 executes the vehicle travel control such that the vehicle 1 follows the target trajectory. - It should be noted that when the automated
driving management system 100 determines that the automated driving condition is not satisfied, the control device 70 generates an emergency plan for decelerating or stopping the vehicle 1. Then, the control device 70 executes the vehicle travel control in accordance with the emergency plan to make the vehicle 1 decelerate or stop. -
FIG. 8 is a block diagram showing a configuration example of the automated driving management system 100 according to the present embodiment. The automated driving management system 100 includes a communication device 110, one or more processors 120 (hereinafter simply referred to as a processor 120), and one or more memory devices 130 (hereinafter simply referred to as a memory device 130). - The
communication device 110 communicates with the outside of the automated driving management system 100. For example, when the automated driving management system 100 is included in the vehicle control system 10, the communication device 110 communicates with the management server 1000 (see FIG. 4). As another example, when the automated driving management system 100 is included in the management server 1000, the communication device 110 communicates with the vehicle control system 10. - The
processor 120 executes a variety of processing. For example, the processor 120 includes a CPU. The memory device 130 stores a variety of information. Examples of the memory device 130 include a volatile memory, a nonvolatile memory, an HDD, an SSD, and the like. When the automated driving management system 100 is included in the vehicle control system 10, the processor 120 is the same as the processor 71 of the vehicle control system 10, and the memory device 130 is the same as the memory device 72 of the vehicle control system 10. - An automated
driving management program 140 is a computer program for managing the automated driving. The variety of processing by the processor 120 may be implemented by the processor 120 executing the automated driving management program 140. The automated driving management program 140 is stored in the memory device 130. The automated driving management program 140 may be recorded on a non-transitory computer-readable recording medium. - The reference information REF indicates a correspondence relationship between the expected sensor perception information ESEN and the vehicle position PV. The
management server 1000 generates and updates the reference information REF. For example, the reference information REF is generated and updated based on past automated driving records of one or more vehicles 1. As another example, the reference information REF may be generated and updated through a simulation based on configuration information of an automated driving area and design information of the perception sensor 30. The reference information REF is stored in the memory device 130. When the automated driving management system 100 is included in the vehicle control system 10, the processor 120 communicates with the management server 1000 via the communication device 110 to acquire the reference information REF. - The
vehicle position information 240 and the first sensor perception information SEN1 are acquired by the vehicle control system 10. The first sensor perception information SEN1 indicates the result of perception by the perception sensor 30 of the vehicle 1. For example, the first sensor perception information SEN1 includes information on the stationary object (e.g., the road surface 2, the road structure 3) perceived by the perception sensor 30. When the automated driving management system 100 is included in the management server 1000, the processor 120 communicates with the vehicle control system 10 via the communication device 110 to acquire the vehicle position information 240 and the first sensor perception information SEN1. The vehicle position information 240 and the first sensor perception information SEN1 are stored in the memory device 130. -
FIG. 9 is a flowchart showing an example of processing performed by the automated driving management system 100 (the processor 120) according to the present embodiment. - In Step S100, the
processor 120 acquires the vehicle position information 240 and the first sensor perception information SEN1. The vehicle position information 240 includes information on the determination target position PT. Typically, the determination target position PT is a current position of the vehicle 1. As another example, the determination target position PT may be a past position of the vehicle 1. The first sensor perception information SEN1 indicates the result of perception by the perception sensor 30 mounted on the vehicle 1. For example, the first sensor perception information SEN1 includes the actual value Xa of the parameter X perceived by the perception sensor 30. The processor 120 acquires the first sensor perception information SEN1 acquired at the determination target position PT. - In Step S110, the
processor 120 acquires, based on the reference information REF, the expected sensor perception information ESEN associated with the determination target position PT. For example, the expected sensor perception information ESEN includes the expected value Xe of the parameter X expected when the automated driving condition is satisfied. - In Step S120, the
processor 120 compares the first sensor perception information SEN1 acquired at the determination target position PT with the expected sensor perception information ESEN associated with the determination target position PT. - When the first sensor perception information SEN1 acquired at the determination target position PT does not deviate from the expected sensor perception information ESEN (Step S130; No), the processing proceeds to Step S140. For example, when the actual value Xa of the parameter X acquired at the determination target position PT is within the allowable range RNG including the expected value Xe (Step S130; No), the processing proceeds to Step S140.
- In Step S140, the
processor 120 determines that the automated driving condition is satisfied at the determination target position PT. In this case, the processor 120 makes the automated driving of the vehicle 1 continue (Step S150). - On the other hand, when the first sensor perception information SEN1 acquired at the determination target position PT deviates from the expected sensor perception information ESEN (Step S130; Yes), the processing proceeds to Step S160. For example, when the actual value Xa of the parameter X acquired at the determination target position PT deviates from the allowable range RNG including the expected value Xe (Step S130; Yes), the processing proceeds to Step S160.
- In Step S160, the
processor 120 determines that the automated driving condition is not satisfied at the determination target position PT. In this case, the processor 120 makes the vehicle 1 decelerate or stop (Step S170). - Hereinafter, various examples of the ODD suitability determination process according to the present embodiment will be described.
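The determination flow of FIG. 9 described above can be condensed into a few lines. The symmetric allowable range RNG around the expected value Xe is an assumption for this sketch; the range need not be symmetric, and the function and label names are illustrative only.

```python
def odd_suitability(actual_xa, expected_xe, tolerance):
    """Steps S120-S170 condensed: compare the actual value Xa against the
    allowable range RNG = [Xe - tolerance, Xe + tolerance] around the
    expected value Xe taken from the reference information REF."""
    if abs(actual_xa - expected_xe) <= tolerance:  # Step S130: no deviation
        return "continue_automated_driving"        # Steps S140/S150
    return "decelerate_or_stop"                    # Steps S160/S170

print(odd_suitability(actual_xa=95, expected_xe=100, tolerance=10))  # continue_automated_driving
print(odd_suitability(actual_xa=60, expected_xe=100, tolerance=10))  # decelerate_or_stop
```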
- In a first example, the
perception sensor 30 includes the LIDAR 31, and the first sensor perception information SEN1 includes the point cloud information 231 indicating the result of measurement by the LIDAR 31. The point cloud information 231 indicates the relative position (the azimuth and the distance) of each beam reflection point viewed from the LIDAR 31. -
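Each entry of the point cloud information 231 (an azimuth and a distance relative to the LIDAR 31) maps to a relative position as sketched below. The flat two-dimensional geometry and the axis convention are simplifications for illustration; a real LIDAR return also carries an elevation angle.

```python
import math

def reflection_point_xy(azimuth_deg, distance_m):
    """Convert one beam reflection point from (azimuth, distance) as seen
    from the LIDAR 31 into a relative x-y position (x forward, y left)."""
    a = math.radians(azimuth_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))

x, y = reflection_point_xy(azimuth_deg=30.0, distance_m=20.0)
print(round(x, 2), round(y, 2))  # 17.32 10.0
```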
FIG. 10 is a conceptual diagram for explaining an example of the point cloud information 231. A first reflection point R1 is the reflection point on the stationary object (e.g., the road surface 2, the road structure 3). A second reflection point R2 is the reflection point on the moving object (e.g., the surrounding vehicle 4, the pedestrian 5). A noise reflection point R3 is the reflection point caused by raindrops or dust in air. - The first reflection point R1 is detected spatially continuously over a certain range. The second reflection point R2 is also detected spatially continuously over a certain range. That is, the spatially continuous point cloud is constituted by the first reflection point R1 or the second reflection point R2. Here, the above-described stationary
object map information 215 indicates the absolute position where the stationary object is present. Combining the stationary object map information 215 and the vehicle position information 240 makes it possible to grasp the position at which the stationary object is assumed to be present around the vehicle 1. Therefore, the processor 120 is able to classify the spatially continuous point cloud into the first reflection point R1 and the second reflection point R2 based on the stationary object map information 215 and the vehicle position information 240. In other words, the processor 120 is able to distinguish between the first reflection point R1 regarding the stationary object and the second reflection point R2 regarding the moving object. In addition, when the stationary object map information 215 includes at least one of the terrain map information and the road structure map information, it is also possible to distinguish between the first reflection point R1 regarding the road surface 2 and the first reflection point R1 regarding the road structure 3. - On the other hand, the noise reflection points R3 are not spatially continuous. Typically, each noise reflection point R3 exists alone. Therefore, the
processor 120 is able to recognize the noise reflection point R3 based on continuity of the point cloud. For example, assume a case where distances to a plurality of reflection points detected in a certain area are 19.8 m, 20.0 m, 5.5 m, 20.2 m, and 20.1 m. In this case, the reflection point whose distance is 5.5 m is the noise reflection point R3. For example, the processor 120 classifies a discontinuous reflection point whose distance difference from a reflection point for an adjacent beam is equal to or greater than a predetermined value as the noise reflection point R3. - It is assumed that the number of beams emitted from the
LIDAR 31 during one frame is “N.” A number n_t is the number of the first reflection points R1 that are measured during one frame. A number n_s is the number of the second reflection points R2 that are measured during one frame. A number n_n is the number of noise reflection points R3 that are measured during one frame. A number of no-reflection points m is the number of beams for which the reflected beam is not detected during one frame. In this case, a relationship represented by the following Equation (1) is satisfied. -
N=n_t+n_s+n_n+m Equation (1): - The first sensor perception information SEN1 includes at least one of the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m. The first sensor perception information SEN1 may include at least the number of first reflection points n_t. The first sensor perception information SEN1 may include all of the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m.
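The discontinuity test for noise reflection points and the bookkeeping of Equation (1) can be sketched together as follows. The jump threshold, the requirement that both neighbors disagree, and the example scan values are assumptions for illustration (the disclosure only requires a distance difference from an adjacent beam at or above a predetermined value).

```python
def classify_noise(distances, jump=2.0):
    """Flag beams whose measured distance differs from both neighbors by at
    least `jump` meters as noise reflection points R3 (raindrops, dust)."""
    noise = []
    for i, d in enumerate(distances):
        if d is None:
            continue  # no reflected beam was detected for this emission
        neighbors = [n for n in (distances[i - 1] if i > 0 else None,
                                 distances[i + 1] if i + 1 < len(distances) else None)
                     if n is not None]
        if neighbors and all(abs(d - n) >= jump for n in neighbors):
            noise.append(i)
    return noise

# The example above: the isolated 5.5 m return among ~20 m returns.
scan = [19.8, 20.0, 5.5, 20.2, 20.1, None]  # None = no-reflection beam
r3 = classify_noise(scan)
print(r3)  # [2]

# Equation (1): N = n_t + n_s + n_n + m -- every beam lands in one bucket.
m = sum(1 for d in scan if d is None)          # no-reflection points
n_n = len(r3)                                  # noise reflection points
n_t_plus_n_s = len(scan) - m - n_n             # remaining returns (R1 + R2)
assert n_t_plus_n_s + n_n + m == len(scan)
```

Splitting the remaining returns into n_t and n_s would additionally require the stationary object map information 215, as described above.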
- On the other hand, the first sensor perception information SEN1 may not include the number of second reflection points n_s regarding the moving object. To generalize, second sensor perception information is information on the moving object perceived by the
perception sensor 30. The second sensor perception information is necessary for the automated driving control by the vehicle control system 10, but may not necessarily be included in the first sensor perception information SEN1. -
FIG. 11 is a conceptual diagram for explaining an example of the reference information REF. The reference information REF indicates a correspondence relationship between the expected value Xe of the parameter X and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the parameter X as a function of the vehicle position PV. In the example shown in FIG. 11, the parameter X includes the number of first reflection points n_t, the number of noise reflection points n_n, and the number of no-reflection points m. - As an example, the ODD suitability determination process in rainy weather will be described. A reflected beam on a diffusely-reflecting surface is highly likely to return to the
LIDAR 31, but a reflected beam on a totally-reflecting surface is not likely to return to the LIDAR 31. Therefore, in rainy weather, the number of first reflection points R1 on the road surface 2 decreases. On the other hand, since raindrops in the air increase, the number of noise reflection points R3 increases. That is to say, in rainy weather, the number of first reflection points n_t conspicuously decreases, while the number of noise reflection points n_n conspicuously increases. In addition, the number of no-reflection points m increases as the number of first reflection points n_t decreases. - When the number of first reflection points n_t on the
road surface 2 decreases, it becomes difficult to detect a fallen object on the road surface 2. Further, when the number of noise reflection points n_n increases, it becomes difficult to detect a distant object. That is, in rainy weather, object detection performance deteriorates and thus the accuracy of the automated driving control deteriorates. Therefore, it is desirable to perform the ODD suitability determination process with high accuracy. -
FIG. 12 is a conceptual diagram for explaining the ODD suitability determination process using the number of first reflection points n_t. A vertical axis represents the number of first reflection points n_t, and a horizontal axis represents the vehicle position PV. The reference information REF indicates a correspondence relationship between the expected value Xe of the number of first reflection points n_t and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of first reflection points n_t as a function of the vehicle position PV. - In the example shown in
FIG. 12 , a first threshold value TH1 defines a lower limit value of the number of first reflection points n_t that allows to continue the automated driving (e.g., LV4 automated driving) without deceleration. The first threshold value TH1 is set to be lower than the expected value Xe. A second threshold value TH2 defines a lower limit value of the number of first reflection points n_t that allows to continue the automated driving if decelerated. The second threshold value TH2 is set to be further lower than the first threshold value TH1. The first threshold value TH1 and the second threshold value TH2 may be registered on the reference information REF together with the expected value Xe. - The
processor 120 acquires the first sensor perception information SEN1 acquired at the determination target position PT. The first sensor perception information SEN1 includes the actual value Xa of the number of first reflection points n_t acquired at the determination target position PT. Based on the reference information REF, the processor 120 acquires the expected value Xe associated with the determination target position PT. When the actual value Xa of the number of first reflection points n_t is equal to or greater than the first threshold value TH1, the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible. When the actual value Xa of the number of first reflection points n_t is less than the first threshold value TH1 and equal to or greater than the second threshold value TH2, the processor 120 determines that the automated driving is possible if decelerated. When the actual value Xa of the number of first reflection points n_t is less than the second threshold value TH2, the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible. -
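The three-way decision described above can be sketched as follows. The concrete threshold and count values are illustrative only, not taken from this disclosure.

```python
def determine(n_t_actual, th1, th2):
    """FIG. 12's decision on the number of first reflection points n_t:
    TH1 is the lower limit for continuing automated driving without
    deceleration, TH2 (< TH1) the lower limit for continuing if decelerated.
    (FIG. 13 mirrors this with upper limits on noise reflection points.)"""
    if n_t_actual >= th1:
        return "automated driving possible"
    if n_t_actual >= th2:
        return "possible if decelerated"
    return "not possible"

print(determine(750, th1=700, th2=500))  # automated driving possible
print(determine(620, th1=700, th2=500))  # possible if decelerated
print(determine(420, th1=700, th2=500))  # not possible
```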
FIG. 13 is a conceptual diagram for explaining the ODD suitability determination process using the number of noise reflection points n_n. A vertical axis represents the number of noise reflection points n_n, and a horizontal axis represents the vehicle position PV. The reference information REF indicates a correspondence relationship between the expected value Xe of the number of noise reflection points n_n and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of noise reflection points n_n as a function of the vehicle position PV. - In the example shown in
FIG. 13, a first threshold value TH1 defines an upper limit value of the number of noise reflection points n_n that allows the automated driving (e.g., LV4 automated driving) to be continued without deceleration. The first threshold value TH1 is set higher than the expected value Xe. A second threshold value TH2 defines an upper limit value of the number of noise reflection points n_n that allows the automated driving to be continued if decelerated. The second threshold value TH2 is set still higher than the first threshold value TH1. The first threshold value TH1 and the second threshold value TH2 may be registered in the reference information REF together with the expected value Xe. - The
processor 120 acquires the first sensor perception information SEN1 acquired at the determination target position PT. The first sensor perception information SEN1 includes the actual value Xa of the number of noise reflection points n_n acquired at the determination target position PT. Based on the reference information REF, the processor 120 acquires the expected value Xe associated with the determination target position PT. When the actual value Xa of the number of noise reflection points n_n is equal to or less than the first threshold value TH1, the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible. When the actual value Xa of the number of noise reflection points n_n exceeds the first threshold value TH1 and is equal to or less than the second threshold value TH2, the processor 120 determines that the automated driving is possible if decelerated. When the actual value Xa of the number of noise reflection points n_n exceeds the second threshold value TH2, the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible. - When the first sensor perception information SEN1 includes the number of first reflection points n_t and the number of noise reflection points n_n, the
processor 120 performs both the ODD suitability determination process shown in FIG. 12 and the ODD suitability determination process shown in FIG. 13. Then, the processor 120 adopts the determination result that restricts the traveling of the vehicle 1 more, that is, the one that yields the lower vehicle speed. - In a second example, fog is considered. In the case of fog, the number of water droplets in the air increases greatly. Therefore, the number of noise reflection points n_n is greatly increased. In addition, the number of no-reflection points m decreases as the number of noise reflection points n_n increases. On the other hand, the number of first reflection points n_t does not decrease significantly. Therefore, using at least one of the number of noise reflection points n_n and the number of no-reflection points m makes it possible to appropriately perform the ODD suitability determination process. The ODD suitability determination process is the same as that in the first example described above.
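The rule of adopting the more restrictive of two determination results can be sketched as follows. The severity ordering, function names, and numeric values are illustrative assumptions, not part of the disclosure:

```python
# Restrictiveness order: "not possible" > "possible if decelerated" > "possible".
SEVERITY = {"possible": 0, "possible if decelerated": 1, "not possible": 2}

def combine_determinations(results):
    """Adopt the determination result that restricts the vehicle's travel
    the most, i.e., the one that yields the lowest vehicle speed."""
    return max(results, key=lambda r: SEVERITY[r])

def check_lower(actual, th1, th2):
    # Lower-limit parameter, e.g. the number of first reflection points n_t.
    if actual >= th1:
        return "possible"
    return "possible if decelerated" if actual >= th2 else "not possible"

def check_upper(actual, th1, th2):
    # Upper-limit parameter, e.g. the number of noise reflection points n_n.
    if actual <= th1:
        return "possible"
    return "possible if decelerated" if actual <= th2 else "not possible"

# Hypothetical counts and thresholds: n_t passes its check, but the noise
# count n_n only allows decelerated driving, so the combined result does too.
result = combine_determinations([check_lower(110, 100, 60), check_upper(35, 30, 50)])
print(result)  # possible if decelerated
```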
- It should be noted that the tendency of variation in the number of reflection points differs between the case of rain and the case of fog. Therefore, it is also possible to estimate the cause of the automated driving condition not being satisfied, based on this tendency of variation in the number of reflection points.
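A minimal sketch of such cause estimation, assuming the tendencies described (rain greatly reduces the number of first reflection points n_t, while fog greatly increases the number of noise reflection points n_n with n_t roughly unchanged); the 50% change ratio, dictionary keys, and decision rules are hypothetical:

```python
def estimate_cause(actual, expected, ratio=0.5):
    """Rough cause estimation from the tendency of variation in the point
    counts. `actual` and `expected` are dicts with keys 'n_t' (first
    reflection points), 'n_n' (noise reflection points), 'm' (no-reflection
    points). The ratio and rules below are illustrative assumptions only."""
    n_t_drop = actual["n_t"] < expected["n_t"] * ratio          # n_t greatly reduced
    n_n_rise = actual["n_n"] > expected["n_n"] * (1 + ratio)    # n_n greatly increased
    if n_n_rise and not n_t_drop:
        return "fog suspected"   # noise points up, target points roughly unchanged
    if n_t_drop:
        return "rain suspected"  # target reflection points greatly reduced
    return "unknown"

expected = {"n_t": 100, "n_n": 10, "m": 50}
print(estimate_cause({"n_t": 95, "n_n": 40, "m": 20}, expected))  # fog suspected
print(estimate_cause({"n_t": 30, "n_n": 12, "m": 55}, expected))  # rain suspected
```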
- In a third example, a case where an output of the
LIDAR 31 is reduced is considered. The reduction in the output of the LIDAR 31 occurs due to aging, failure, heat, or the like. When the output of the LIDAR 31 is reduced, the number of reflection points decreases as a whole. In particular, the number of first reflection points R1 on a distant road surface 2 is greatly reduced. Therefore, in the third example, the first reflection points R1 on the road surface 2 are further classified from the viewpoint of the distance from the vehicle 1. - For example, as shown in
FIG. 14, the road surface 2 is divided into three types: a short-distance road surface 2 a, a medium-distance road surface 2 b, and a long-distance road surface 2 c. The processor 120 classifies the first reflection points R1 on the road surface 2 into first reflection points R1 a on the road surface 2 a, first reflection points R1 b on the road surface 2 b, and first reflection points R1 c on the road surface 2 c based on the distances to the measured reflection points. The numbers of first reflection points n_ta, n_tb, and n_tc are the numbers of the first reflection points R1 a, R1 b, and R1 c, respectively. - The first sensor perception information SEN1 includes the numbers of first reflection points n_ta, n_tb, and n_tc. The reference information REF indicates respective expected values Xe of the numbers of first reflection points n_ta, n_tb, and n_tc. The
processor 120 performs the ODD suitability determination process by using the numbers of first reflection points n_ta, n_tb, and n_tc. As a result, even when the output of the LIDAR 31 is reduced, the ODD suitability determination process can be appropriately performed. Furthermore, it is also possible to estimate that the cause of the automated driving condition not being satisfied is the reduction in the output of the LIDAR 31. - In a fourth example, a case where calibration of the
LIDAR 31 has deteriorated is considered. As described above, the first reflection point R1 on the road structure 3 can be identified by using the road structure map information included in the stationary object map information 215. However, when the calibration of the LIDAR 31 has deteriorated, the accuracy of identifying the first reflection point R1 using the road structure map information decreases. As a result, the number of first reflection points R1 on the road structure 3 decreases. Therefore, it is possible to appropriately perform the ODD suitability determination process by using the number of first reflection points n_t regarding the road structure 3. The ODD suitability determination process is the same as that in the first example described above. - In a fifth example, the number of landmarks perceived in the localization (see
FIG. 7) is considered. Examples of the landmark include a white line, a curb, a sign, a pole, and the like. The result of perception of the landmark using the perception sensor 30 is acquired from the object information 233. For example, in rainy weather, the number of detected landmarks decreases. In particular, the white line, which is detected based on luminance values, becomes hard to detect when the road surface 2 is wet. As another example, in the case of snow, the numbers of detected white lines and curbs decrease significantly. - Therefore, in the fifth example, the ODD suitability determination process is performed by using the number of landmarks perceived by the
perception sensor 30. The first sensor perception information SEN1 includes the number of landmarks perceived by using the perception sensor 30. The reference information REF indicates an expected value Xe of the number of each landmark. The processor 120 performs the ODD suitability determination process by using the number of each landmark. -
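A minimal sketch of building the per-type landmark counts from perceived object information; the input format and type names are illustrative assumptions, not the format of the object information 233:

```python
def count_landmarks(perceived_objects):
    """Count perceived landmarks per type (white line, curb, sign, pole, ...).
    `perceived_objects` is assumed to be a flat list of type labels taken
    from the perception result."""
    counts = {}
    for obj_type in perceived_objects:
        counts[obj_type] = counts.get(obj_type, 0) + 1
    return counts

# Hypothetical perception result at the determination target position PT.
objects = ["white_line", "white_line", "curb", "sign", "pole", "white_line"]
counts = count_landmarks(objects)
print(counts["white_line"])  # 3, i.e. the actual value Xa of n_wl at PT
```

Each per-type count is then compared with its own expected value Xe, as in the threshold-based determination of the first example.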
FIG. 15 is a conceptual diagram for explaining the ODD suitability determination process using the number of white lines n_wl. A vertical axis represents the number of white lines n_wl, and a horizontal axis represents the vehicle position PV. The reference information REF indicates a correspondence relationship between the expected value Xe of the number of white lines n_wl and the vehicle position PV. That is, the reference information REF expresses the expected value Xe of the number of white lines n_wl as a function of the vehicle position PV. - In the example shown in
FIG. 15, a first threshold value TH1 defines a lower limit value of the number of white lines n_wl that allows the automated driving (e.g., LV4 automated driving) to be continued without deceleration. The first threshold value TH1 is set lower than the expected value Xe. A second threshold value TH2 defines a lower limit value of the number of white lines n_wl that allows the automated driving to be continued if decelerated. The second threshold value TH2 is set still lower than the first threshold value TH1. The first threshold value TH1 and the second threshold value TH2 may be registered in the reference information REF together with the expected value Xe. - The
processor 120 acquires the first sensor perception information SEN1 acquired at the determination target position PT. The first sensor perception information SEN1 includes the actual value Xa of the number of white lines n_wl acquired at the determination target position PT. Based on the reference information REF, the processor 120 acquires the expected value Xe associated with the determination target position PT. When the actual value Xa of the number of white lines n_wl is equal to or greater than the first threshold value TH1, the processor 120 determines that the automated driving condition is satisfied and the automated driving is possible. When the actual value Xa of the number of white lines n_wl is less than the first threshold value TH1 and equal to or greater than the second threshold value TH2, the processor 120 determines that the automated driving is possible if decelerated. When the actual value Xa of the number of white lines n_wl is less than the second threshold value TH2, the processor 120 determines that the automated driving condition is not satisfied and the automated driving is not possible. -
FIG. 16 is a conceptual diagram for explaining a sixth example. In the sixth example, the image (image information 232) captured by the camera 32 is considered. The processor 120 can extract the road surface 2 in the image by analyzing the image captured by the camera 32. For example, the processor 120 can extract the road surface 2 in the image by applying semantic segmentation to the image. Segmentation (region division) is a technique of grouping regions having similar feature amounts (color, texture, or the like) in the image to divide the image into a plurality of regions. - Quality (visibility) of the image captured by the camera varies greatly depending on the imaging condition. For example, in rainy weather, the image quality deteriorates. As another example, if a lens of the
camera 32 is dirty, the image quality deteriorates. As still another example, at night, the image quality deteriorates due to insufficient light intensity. When the image quality deteriorates, object detection performance based on the image degrades, and thus the accuracy of the automated driving control decreases. Therefore, it is desirable to perform the ODD suitability determination process with high accuracy. - As described above, the
processor 120 extracts the road surface 2 in the image by analyzing the image captured by the camera 32. However, when the image quality deteriorates, the area of the road surface 2 that can be extracted decreases. Therefore, in the sixth example, the ODD suitability determination process is performed by using an area ratio of the road surface 2 in the image. The first sensor perception information SEN1 includes the area ratio of the road surface 2 in the image. The reference information REF indicates the expected value Xe of the area ratio of the road surface 2 in the image. The processor 120 performs the ODD suitability determination process by using the area ratio of the road surface 2 in the image. The ODD suitability determination process is the same as that in the fifth example described above.
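A minimal sketch of computing the area ratio from a semantic segmentation result, assuming the result is a 2-D grid of per-pixel class labels; the label names are hypothetical:

```python
def road_surface_area_ratio(segmentation_mask, road_label="road"):
    """Compute the area ratio of the road surface in a semantically
    segmented image, where `segmentation_mask` is a 2-D grid (list of
    rows) of per-pixel class labels."""
    total = 0
    road = 0
    for row in segmentation_mask:
        for label in row:
            total += 1
            if label == road_label:
                road += 1
    return road / total

# Tiny hypothetical 3x4 segmentation result.
mask = [
    ["sky",  "sky",  "sky",  "sky"],
    ["road", "road", "car",  "road"],
    ["road", "road", "road", "road"],
]
print(road_surface_area_ratio(mask))  # 7 of 12 pixels labeled road, about 0.583
```

This actual ratio would then be compared against the expected value Xe (and thresholds) registered for the determination target position.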
Claims (10)
1. An automated driving management system applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation,
the automated driving management system comprising:
one or more processors configured to acquire first sensor perception information indicating a result of perception by the perception sensor; and
one or more memory devices configured to store reference information indicating a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied, wherein
the one or more processors are further configured to:
acquire, based on the reference information, the expected sensor perception information associated with a determination target position; and
determine whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
2. The automated driving management system according to claim 1 , wherein
the first sensor perception information includes a parameter indicating the result of perception by the perception sensor,
the expected sensor perception information includes an expected value of the parameter expected when the automated driving condition is satisfied,
the reference information indicates a correspondence relationship between the expected value of the parameter and the vehicle position, and
the one or more processors are further configured to:
acquire, based on the reference information, the expected value associated with the determination target position;
compare an actual value of the parameter acquired at the determination target position with an allowable range including the expected value associated with the determination target position; and
when the actual value of the parameter acquired at the determination target position deviates from the allowable range, determine that the automated driving condition is not satisfied at the determination target position.
3. The automated driving management system according to claim 1 , wherein
the first sensor perception information includes information on a stationary object perceived by the perception sensor.
4. The automated driving management system according to claim 3 , wherein
second sensor perception information is information on a moving object perceived by the perception sensor, and
the second sensor perception information is not included in the first sensor perception information.
5. The automated driving management system according to claim 1 , wherein
the perception sensor includes a laser imaging detection and ranging, and
the first sensor perception information includes a result of measurement by the laser imaging detection and ranging.
6. The automated driving management system according to claim 5 , wherein
the laser imaging detection and ranging emits beams and detects a reflected beam reflected at a reflection point to measure a relative position of the reflection point,
a number of first reflection points is a number of reflection points on a stationary object that are measured during one frame,
a number of noise reflection points is a number of reflection points in air that are measured during the one frame,
a number of no-reflection points is a number of beams for which the reflected beam is not detected during the one frame, and
the first sensor perception information includes at least one of the number of first reflection points, the number of noise reflection points, and the number of no-reflection points.
7. The automated driving management system according to claim 6 , wherein
a parameter includes the at least one of the number of first reflection points, the number of noise reflection points, and the number of no-reflection points included in the first sensor perception information,
the expected sensor perception information includes an expected value of the parameter expected when the automated driving condition is satisfied,
the reference information indicates a correspondence relationship between the expected value of the parameter and the vehicle position, and
the one or more processors are further configured to:
acquire, based on the reference information, the expected value associated with the determination target position;
compare an actual value of the parameter acquired at the determination target position with an allowable range including the expected value associated with the determination target position; and
when the actual value of the parameter acquired at the determination target position deviates from the allowable range, determine that the automated driving condition is not satisfied at the determination target position.
8. The automated driving management system according to claim 1 , wherein
the one or more processors are further configured to:
perceive a landmark around the vehicle by using the perception sensor; and
estimate a position of the vehicle based on a result of perception of the landmark and map information indicating a position of the landmark, and
the first sensor perception information includes a number of landmarks perceived by using the perception sensor.
9. The automated driving management system according to claim 1 , wherein
the perception sensor includes a camera that images a situation around the vehicle,
the one or more processors are further configured to analyze an image captured by the camera to extract a road surface in the image; and
the first sensor perception information includes an area ratio of the road surface in the image.
10. An automated driving management method applied to a vehicle that performs automated driving by using a perception sensor for perceiving a surrounding situation, wherein first sensor perception information indicates a result of perception by the perception sensor, and reference information indicates a correspondence relationship between a vehicle position and expected sensor perception information that is the first sensor perception information expected when an automated driving condition is satisfied,
the automated driving management method comprising:
acquiring, based on the reference information, the expected sensor perception information associated with a determination target position; and
determining whether or not the automated driving condition is satisfied at the determination target position by comparing the first sensor perception information acquired at the determination target position with the expected sensor perception information associated with the determination target position.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-118084 | 2022-07-25 | ||
JP2022118084A JP2024015785A (en) | 2022-07-25 | 2022-07-25 | Automatic driving management system and automatic driving management method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240029450A1 true US20240029450A1 (en) | 2024-01-25 |
Family
ID=89576744
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/222,520 Pending US20240029450A1 (en) | 2022-07-25 | 2023-07-17 | Automated driving management system and automated driving management method |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240029450A1 (en) |
JP (1) | JP2024015785A (en) |
- 2022-07-25: JP application JP2022118084A filed, published as JP2024015785A, status active/pending
- 2023-07-17: US application US18/222,520 filed, published as US20240029450A1, status active/pending
Also Published As
Publication number | Publication date |
---|---|
JP2024015785A (en) | 2024-02-06 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KAWANAI, TAICHI; HAYASHI, YUSUKE; HOTTA, DAICHI; SIGNING DATES FROM 20230419 TO 20230421; REEL/FRAME: 064295/0526
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION