WO2019073526A1 - Driving control method and driving control device - Google Patents

Driving control method and driving control device

Info

Publication number
WO2019073526A1
WO2019073526A1 (PCT/JP2017/036697)
Authority
WO
WIPO (PCT)
Prior art keywords
event
vehicle
detection
detection condition
driving
Prior art date
Application number
PCT/JP2017/036697
Other languages
English (en)
Japanese (ja)
Inventor
吉郎 高松
Original Assignee
日産自動車株式会社 (Nissan Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日産自動車株式会社 (Nissan Motor Co., Ltd.)
Priority to EP17928354.4A (EP3696789B1)
Priority to RU2020115456A (RU2743683C1)
Priority to CN201780095672.7A (CN111448596A)
Priority to JP2019547818A (JP6779590B2)
Priority to US16/754,971 (US11584388B2)
Priority to PCT/JP2017/036697 (WO2019073526A1)
Publication of WO2019073526A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
      • B60 - VEHICLES IN GENERAL
        • B60W - Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
          • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
            • B60W60/001 Planning or execution of driving tasks
          • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
            • B60W2420/40 Photo or light sensitive means, e.g. infrared sensors
              • B60W2420/403 Image sensing, e.g. optical camera
              • B60W2420/408
          • B60W2554/00 Input parameters relating to objects
            • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
              • B60W2554/404 Characteristics
                • B60W2554/4042 Longitudinal speed
                • B60W2554/4049 Relationship among other objects, e.g. converging dynamic objects
          • B60W2556/00 Input parameters relating to data
            • B60W2556/45 External transmission of data to or from the vehicle
              • B60W2556/65 Data transmitted between vehicles
    • G - PHYSICS
      • G01 - MEASURING; TESTING
        • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
          • G01C21/00 Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00
            • G01C21/26 Navigation specially adapted for navigation in a road network
              • G01C21/34 Route searching; route guidance
                • G01C21/3407 Route searching or route guidance specially adapted for specific applications
                  • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents
                • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
                  • G01C21/3492 Special cost functions employing speed data or traffic data, e.g. real-time or historical
        • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems
            • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
              • G01S13/867 Combination of radar systems with cameras
            • G01S13/93 Radar or analogous systems specially adapted for anti-collision purposes
              • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
                • G01S2013/9316 Combined with communication equipment with other vehicles or with base stations
                • G01S2013/932 Using own vehicle data, e.g. ground speed, steering wheel direction
                • G01S2013/9323 Alternative operation using light waves
                • G01S2013/9324 Alternative operation using ultrasonic waves
          • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
          • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V20/00 Scenes; scene-specific elements
            • G06V20/50 Context or environment of the image
              • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
                • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
                  • G06V20/582 Recognition of traffic signs
                • G06V20/588 Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
      • G08 - SIGNALLING
        • G08G - TRAFFIC CONTROL SYSTEMS
          • G08G1/00 Traffic control systems for road vehicles
            • G08G1/16 Anti-collision systems
              • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • The present invention relates to a driving control method and a driving control device.
  • A technique is known that uses multi-scale recognition to calculate groups of trajectories (flow lines) for the vehicle and for obstacles and, for each trajectory, calculates a risk based on the probability that both will exist at the point where the routes of the vehicle and the obstacle intersect.
  • Techniques that predict this risk and select a driving behavior accordingly are known (Patent Document 1).
  • The problem to be solved by the present invention is to set detection conditions according to the driving behavior defined for each event that a vehicle traveling along a route sequentially encounters.
  • The present invention is a method of extracting events that a vehicle will encounter based on detection information acquired according to detection conditions, and of formulating a driving plan in which a driving action is defined for each extracted event.
  • The above problem is solved by determining the detection condition based on the content of the driving action defined for the event.
  • According to the present invention, since the content and the amount of the detection information are controlled according to the driving behavior, the necessary information can be acquired while the processing load is reduced, and a highly accurate driving plan based on real-time judgment can be formulated.
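As a minimal sketch of the idea above, a detection condition can be derived from the content of the driving action, so that, for example, a turn watches crossing pedestrians at close range while straight travel looks farther ahead at fewer target types. The function name, action labels, and all numeric values below are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical mapping from a driving action to a detection condition.
# Labels and values are assumptions made for this sketch only.

def detection_condition_for(action: str) -> dict:
    """Return a detection condition (range, target types) for a driving action."""
    if action == "turn_right":
        # A turn needs oncoming traffic, crossing pedestrians, and signals
        # watched closely, so the target set is wide but the range is short.
        return {"range_m": 70, "targets": ["vehicle", "pedestrian", "signal"]}
    if action == "go_straight":
        # Going straight can use a narrower target set and a longer range.
        return {"range_m": 120, "targets": ["vehicle", "signal"]}
    # Default: a conservative, wide condition for unknown actions.
    return {"range_m": 100, "targets": ["vehicle", "pedestrian", "bicycle", "signal"]}
```

Controlling the target set and range per action is one way the amount of detection information, and hence the processing load, could be adjusted.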
  • FIG. 1 is a block diagram of the operation control system 1.
  • the operation control system 1 of the present embodiment includes an operation control device 100 and an in-vehicle device 200.
  • The embodiment of the operation control device 100 of the present invention is not limited: it may be mounted on a vehicle, or it may be applied to a portable terminal device capable of exchanging information with the on-vehicle device 200.
  • the terminal device includes devices such as a smartphone and a PDA.
  • The operation control system 1, the operation control device 100, the in-vehicle device 200, and the devices they comprise are computers that include an arithmetic processing unit such as a CPU and execute arithmetic processing.
  • the on-vehicle apparatus 200 of the present embodiment includes a vehicle controller 210, a navigation apparatus 220, a detection apparatus 230, a lane keeping apparatus 240, and an output apparatus 250.
  • the respective devices constituting the in-vehicle device 200 are connected by a controller area network (CAN) or another in-vehicle LAN to exchange information with each other.
  • the in-vehicle device 200 can exchange information with the operation control device 100 via the in-vehicle LAN.
  • the vehicle controller 210 of the present embodiment controls the operation of the vehicle according to the operation plan formulated by the processor 11.
  • the vehicle controller 210 operates the sensor 260, the drive device 270, and the steering device 280.
  • Vehicle controller 210 acquires vehicle information from sensor 260.
  • the sensor 260 includes a steering angle sensor 261, a vehicle speed sensor 262, and an attitude sensor 263.
  • the steering angle sensor 261 detects information such as a steering amount, a steering speed, and a steering acceleration, and outputs the information to the vehicle controller 210.
  • the vehicle speed sensor 262 detects the speed and / or acceleration of the vehicle and outputs it to the vehicle controller 210.
  • The posture sensor 263 detects the position of the vehicle as well as its yaw angle, pitch angle, and roll angle, and outputs them to the vehicle controller 210.
  • the attitude sensor 263 includes a gyro sensor.
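The vehicle information listed above (steering amount and speed from the steering angle sensor 261, speed and acceleration from the vehicle speed sensor 262, attitude angles from the posture sensor 263) can be pictured as one record. This is only an illustrative container; the field names and units are assumptions, not from the disclosure.

```python
from dataclasses import dataclass

# Illustrative container for the sensor outputs described above.
@dataclass
class VehicleState:
    steering_angle_deg: float  # steering amount (steering angle sensor 261)
    steering_speed: float      # steering speed
    speed_mps: float           # vehicle speed (vehicle speed sensor 262)
    accel_mps2: float          # acceleration
    yaw_deg: float             # attitude: yaw angle (posture sensor 263)
    pitch_deg: float           # attitude: pitch angle
    roll_deg: float            # attitude: roll angle
```

A controller such as the vehicle controller 210 would receive one such record per sensing cycle.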
  • The vehicle controller 210 of the present embodiment is an on-board computer such as an electronic control unit (ECU), and electronically controls the driving and operation of the vehicle.
  • Examples of the vehicle include an electric vehicle equipped with an electric motor as a traveling drive source, an engine vehicle equipped with an internal combustion engine as a traveling drive source, and a hybrid vehicle equipped with both an electric motor and an internal combustion engine as a traveling drive source.
  • the electric vehicles and hybrid vehicles that use an electric motor as a traveling drive source include those that use a secondary battery as a power supply for the electric motor and those that use a fuel cell as a power supply for the electric motor.
  • the drive device 270 of the present embodiment includes a drive mechanism of a vehicle.
  • The drive mechanism includes the electric motor and/or the internal combustion engine serving as the traveling drive source described above, a power transmission device including a drive shaft that transmits the output of the traveling drive source to the drive wheels and an automatic transmission, a braking device 271 that brakes the wheels, and the like.
  • The drive device 270 generates control signals for these drive mechanisms based on input signals produced by accelerator and brake operation and on control signals acquired from the vehicle controller 210 or the driving control device 100, and executes drive control including acceleration and deceleration of the vehicle. By transmitting control information to the drive device 270, driving control including acceleration and deceleration of the vehicle can be performed automatically.
  • the torque distribution to be output to each of the electric motor and the internal combustion engine according to the traveling state of the vehicle is also sent to the drive device 270.
  • the steering device 280 of the present embodiment includes a steering actuator.
  • the steering actuator includes a motor or the like attached to a column shaft of the steering.
  • the steering device 280 executes change control of the traveling direction of the vehicle based on a control signal acquired from the vehicle controller 210 or an input signal by a steering operation.
  • the vehicle controller 210 executes change control of the traveling direction by transmitting control information including a steering amount to the steering device 280.
  • The control of the drive device 270 and the control of the steering device 280 may be performed completely automatically, or may be performed in a mode that assists the driver's driving operation.
  • the control of the drive device 270 and the control of the steering device 280 can be interrupted / stopped by the intervention operation of the driver.
  • the in-vehicle device 200 of the present embodiment includes a navigation device 220.
  • the navigation device 220 calculates the route from the current position of the vehicle to the destination using a method known at the time of filing.
  • the calculated route is sent to the vehicle controller 210 in order to use it for driving control of the vehicle.
  • the calculated route is output as route guidance information via an output device 250 described later.
  • the navigation device 220 includes a position detection device 221.
  • The position detection device 221 includes a Global Positioning System (GPS) receiver, and detects the traveling position (latitude/longitude) of the moving vehicle.
  • the navigation device 220 comprises accessible map information 222, road information 223 and traffic regulation information 224.
  • The map information 222, the road information 223, and the traffic rule information 224 need only be readable by the navigation device 220; they may be configured physically separately from the navigation device 220, for example stored on a server readable via the communication device 30 (or via a communication device provided in the in-vehicle device 200).
  • The map information 222 is a so-called electronic map: information in which latitude and longitude are associated with the map data.
  • the map information 222 has road information 223 associated with each point.
  • the road information 223 is defined by nodes and links connecting the nodes.
  • the road information 223 includes information specifying the road by the position / area of the road, the road type for each road, the road width for each road, and the shape information of the road.
  • The road information 223 stores, in association with the identification information of each road link, the position of the intersection, the approach directions of the intersection, the type of the intersection, and other intersection-related information.
  • The intersections include merge points and branch points.
  • The road information 223 also stores, in association with the identification information of each road link, the road type, the road width, the road shape, whether straight travel is permitted, whether entering the adjacent lane for passing is permitted, and other road-related information.
  • the navigation device 220 specifies a first route on which the vehicle travels based on the current position of the vehicle detected by the position detection device 221.
  • The first route may be a route to a destination designated by the user, or a route to a destination estimated from the travel history of the vehicle or the user.
  • The first route on which the vehicle travels may be identified for each road, for each link together with its travel direction (up or down), or for each lane in which the vehicle actually travels.
  • The navigation device 220 specifies the first route on which the vehicle travels in terms of links and lanes with reference to the road information 223.
  • the first route includes specific information (coordinate information) of one or more points through which the vehicle passes in the future.
  • the first route includes at least a point indicating a traveling position where the vehicle will exist in the future.
  • the first path may be composed of continuous lines or may be composed of discrete points.
  • The first route is specified by a road identifier, a lane identifier, and a link identifier.
  • These identifiers are defined in the map information 222 and the road information 223.
  • The traffic rule information 224 consists of the traffic rules that the vehicle must comply with while traveling, such as temporary stops on the route, prohibition of parking or stopping, slow driving, and speed limits. Each rule is defined for each point (latitude, longitude) and for each link.
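A per-point and per-link rule store of the kind described above might look like the following sketch. The keys, rule names, and values are hypothetical and only illustrate that rules can be attached both to links and to (latitude, longitude) points.

```python
# Hypothetical traffic rule store, keyed by link id or by (lat, lon) point.
traffic_rules = {
    "link_0012": {"speed_limit_kmh": 40, "slow_zone": False},
    (35.6812, 139.7671): {"stop_required": True, "parking_prohibited": True},
}

def rules_at(key):
    """Return the traffic rules defined for a link id or a (lat, lon) point."""
    return traffic_rules.get(key, {})
```

A route follower would query this store for each upcoming link and point on the first route.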
  • the traffic regulation information 224 may include information on traffic signals acquired from devices provided on the road side.
  • the in-vehicle device 200 includes a detection device 230.
  • the detection device 230 acquires detection information of the surroundings of a vehicle traveling on a route.
  • The detection device 230 detects the presence and the position of objects, including obstacles, existing around the vehicle.
  • the detection device 230 includes a camera 231.
  • the camera 231 is an imaging device provided with an imaging element such as a CCD, for example.
  • the camera 231 may be an infrared camera or a stereo camera.
  • the camera 231 is installed at a predetermined position of the vehicle and captures an object around the vehicle.
  • the periphery of the vehicle includes the front, the rear, the front side, and the rear side of the vehicle.
  • the objects include two-dimensional signs such as stop lines marked on the road surface.
  • Objects include three-dimensional objects.
  • Objects include stationary objects such as road signs.
  • the objects include moving objects such as pedestrians, two-wheeled vehicles, and four-wheeled vehicles (other vehicles).
  • Objects include road structures such as guardrails, medians, curbs and the like.
  • the detection device 230 may analyze the image data and identify the type of the object based on the analysis result.
  • the detection device 230 uses pattern matching technology or the like to identify whether the object included in the image data is a vehicle, a pedestrian, or a sign.
  • the detection device 230 processes the acquired image data and acquires the distance from the vehicle to the object based on the position of the object present around the vehicle.
  • The detection device 230 acquires the time until the vehicle reaches an object based on the position of the object around the vehicle and the time of its observation.
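Under the simplifying assumption of a constant closing speed, the arrival time mentioned above can be estimated from two timed distance observations. This is a sketch of the idea, not the method of the disclosure; names and numbers are illustrative.

```python
# Estimate the time until the vehicle reaches an object from two timed
# distance observations, assuming a constant closing speed.

def time_to_reach(d1_m: float, t1_s: float, d2_m: float, t2_s: float) -> float:
    """Arrival-time estimate from distances d1, d2 observed at times t1 < t2."""
    closing_speed = (d1_m - d2_m) / (t2_s - t1_s)  # m/s toward the object
    if closing_speed <= 0:
        return float("inf")  # not approaching the object
    return d2_m / closing_speed
```

For example, closing from 50 m to 40 m in one second gives a 10 m/s closing speed, hence 4 s until the object is reached.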
  • the detection device 230 may use a radar device 232.
  • As the radar device 232, a system known at the time of filing can be used, such as a millimeter-wave radar, a laser radar, an ultrasonic radar, or a laser range finder.
  • the detection device 230 detects the presence or absence of an object, the position of the object, and the distance to the object based on the reception signal of the radar device 232.
  • the detection device 230 detects the presence or absence of the target, the position of the target, and the distance to the target based on the clustering result of the point cloud information acquired by the laser radar.
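The clustering step above (grouping laser-radar point-cloud returns into object candidates) can be illustrated with a naive single-linkage grouping of 2-D points. A production system would use a dedicated clustering algorithm; the distance threshold here is an assumption for the sketch.

```python
from math import hypot

# Naive single-linkage clustering of (x, y) returns: points closer than
# max_gap to any member of a cluster join that cluster; clusters bridged
# by a point are merged. Each resulting cluster is one object candidate.

def cluster_points(points, max_gap=1.0):
    """Group (x, y) points into clusters by single-linkage distance."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(hypot(p[0] - q[0], p[1] - q[1]) <= max_gap for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:  # p links two clusters: merge them into one
                    merged.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]
        if merged is None:
            clusters.append([p])
    return clusters
```

Two tight groups of returns, far apart, yield two clusters and hence two detected object candidates.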
  • The detection device 230 may acquire detection information from an external device via the communication device 233. If the communication device 233 can communicate with another vehicle, the detection device 230 may acquire, as object information, the fact that the other vehicle exists together with the speed and acceleration of that vehicle as detected by its own vehicle speed sensor. The detection device 230 can, of course, also acquire object information including the position, speed, and acceleration of another vehicle from an external Intelligent Transport Systems (ITS) device through the communication device 233. The detection device 230 may acquire information about the vicinity of the vehicle with the on-vehicle device 200, and may acquire information about areas a predetermined distance or more away from the vehicle from an external roadside device via the communication device 233.
  • the detection device 230 sequentially outputs the detection result to the processor 11.
  • the detection condition is a condition that can be set for each point.
  • the processor 11 calculates a detection condition and instructs the detection device 230 to make a setting.
  • the detection device 230 can set detection conditions for each point.
  • the point may be a point (position information) defined in the event, or may be any point on the first route.
  • the processor 11 includes point information in the detection condition to be set.
  • the on-vehicle apparatus 200 of the present embodiment includes a lane keeping apparatus 240.
  • the lane keeping device 240 includes a camera 241 and road information 242.
  • the camera 241 may share the camera 231 of the detection device.
  • the road information 242 may share road information 223 of the navigation device.
  • the lane keeping device 240 detects the lane on the first route on which the vehicle travels from the image captured by the camera 241.
  • the lane keeping device 240 has a lane departure prevention function (lane keeping support function) that controls the movement of the vehicle such that the position of the lane marker of the lane and the position of the vehicle maintain a predetermined relationship.
  • the driving control device 100 controls the movement of the vehicle such that the vehicle travels in the center of the lane.
  • The lane marker is not limited as long as it has the function of defining a lane: it may be a line drawn on the road surface, vegetation existing between lanes, or a road structure existing on the shoulder side of the lane, such as a guardrail, a curb, a sidewalk, or a motorcycle lane. The lane marker may also be an immobile object such as a signboard, a sign, a store, or a roadside tree present on the roadside of the lane.
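The lane keeping behavior described above (maintaining a predetermined relationship between the lane marker position and the vehicle position) can be caricatured as a proportional controller on the lateral offset from the lane center. The control law, gain, and limit are illustrative assumptions, not the disclosed implementation.

```python
# Toy proportional lane-keeping law: steer against the lateral offset from
# the lane center, clamped to a maximum steering angle. All values assumed.

def lane_keep_steering(offset_m: float, gain: float = 2.0,
                       limit_deg: float = 10.0) -> float:
    """Return a steering angle (deg) that pushes the lateral offset to zero."""
    angle = -gain * offset_m  # steer opposite to the offset
    return max(-limit_deg, min(limit_deg, angle))
```

A positive offset (vehicle right of center) produces a negative (leftward) steering command, saturating at the limit for large offsets.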
  • the processor 11 described later stores an object detected by the detection device 230 in association with an event and / or a route.
  • The processor 11 stores an object that exists within a predetermined distance of an event, and that the vehicle may encounter in that event, in association with the event.
  • The processor 11 stores an object encountered in an event in association with the route.
  • The processor 11 thereby grasps at which position on which route each object exists. This makes it possible to quickly determine the objects that the vehicle will encounter in an event.
  • One or more objects may be associated with one event. Objects determined to meet the vehicle in an event identified by its location are mapped to that common event. For example, when a plurality of pedestrians are on a pedestrian crossing defined as one event, each pedestrian is associated with that pedestrian-crossing event. Each pedestrian may be associated as an independent object, or pedestrians whose positions and velocities agree within a predetermined range may be associated as one group of objects.
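The grouping just described (treating objects whose positions and velocities agree within a predetermined range as one group associated with an event) can be sketched as follows. The data layout and tolerance values are assumptions for illustration.

```python
# Group detected objects (here: dicts with scalar 'pos' and 'vel') whose
# position and velocity agree within tolerances; each group can then be
# associated with one event as a single object group.

def group_objects(objects, pos_tol=2.0, vel_tol=0.5):
    """Group objects whose 'pos' and 'vel' match the group's first member."""
    groups = []
    for obj in objects:
        for g in groups:
            ref = g[0]  # compare against the group's representative
            if (abs(obj["pos"] - ref["pos"]) <= pos_tol
                    and abs(obj["vel"] - ref["vel"]) <= vel_tol):
                g.append(obj)
                break
        else:
            groups.append([obj])  # no match: start a new group
    return groups
```

Two pedestrians walking together form one group, while a distant pedestrian remains a separate object, reducing the number of items the driving plan must consider per event.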
  • the in-vehicle device 200 includes an output device 250.
  • the output device 250 includes a display 251 and a speaker 252.
  • the output device 250 outputs various types of information regarding drive control to the user or the occupants of surrounding vehicles.
  • the output device 250 outputs information relating to the prepared driving action plan and the driving control based on the driving action plan.
  • the information regarding the operation control may be notified in advance to the occupant of the vehicle or the occupant of the other vehicle via the exterior lamp and the interior lamp.
  • the output device 250 may output various types of information related to operation control to an external device such as an intelligent transportation system via the communication device.
  • the operation control device 100 includes a control device 10, an output device 20, and a communication device 30.
  • the output device 20 has the same function as the output device 250 of the on-vehicle apparatus 200 described above.
  • the display 251 and the speaker 252 may be used as the configuration of the output device 20.
  • the control device 10 and the output device 20 can exchange information with each other via a wired or wireless communication line.
  • the communication device 30 exchanges information with the in-vehicle device 200, exchanges information within the operation control device 100, and exchanges information with an external device and the operation control system 1.
  • the control device 10 includes a processor 11.
• the processor 11 is an arithmetic device that performs the operation control processing, including the planning of the operation plan of the vehicle. Specifically, the control device 10 is a computer comprising a ROM (Read Only Memory) that stores a program for executing the operation control processing including the planning of the operation plan, a CPU (Central Processing Unit) serving as an operation circuit that executes the program stored in the ROM to function as the control device 10, and a RAM (Random Access Memory) that functions as an accessible storage device.
• the processor 11 executes processing according to the following method: (1) acquiring detection information around the vehicle; (2) sequentially extracting events encountered by the vehicle based on the detection information; (3) determining driving behavior based on the detection information acquired for the event; (4) formulating a driving plan in which a driving behavior is defined for each event; and (5) having the vehicle execute drive control commands according to the driving plan. Furthermore, the processor 11 executes (6) a process of determining the detection condition based on the content of the driving action defined for the event.
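The six steps can be read as one planning cycle. The sketch below is a minimal stand-in for that data flow; all function names, the event fields, and the "wide"/"normal" detection conditions are illustrative assumptions:

```python
def extract_events(detection):
    # (2) extract events, ordered here by distance as a stand-in for encounter order
    return sorted(detection["events"], key=lambda e: e["distance"])

def decide_action(event):
    # (3) judge driving behavior from the event content
    return "stop" if event["requires_stop"] else "progress"

def plan_cycle(detection):
    events = extract_events(detection)                       # (2)
    plan = [(e["id"], decide_action(e)) for e in events]     # (3)+(4) per-event plan
    commands = [f"{act}@{eid}" for eid, act in plan]         # (5) drive control commands
    conditions = {eid: ("wide" if act == "stop" else "normal")
                  for eid, act in plan}                      # (6) detection condition
    return plan, commands, conditions                        #     per driving action

detection = {"events": [
    {"id": "SG1", "distance": 80.0, "requires_stop": True},
    {"id": "ST1", "distance": 50.0, "requires_stop": True},
    {"id": "CR4", "distance": 200.0, "requires_stop": False},
]}
plan, commands, conditions = plan_cycle(detection)
# plan is ordered by encounter: ST1, then SG1, then CR4
```

In step (6) the detection condition is keyed to the decided action, mirroring the document's idea that detection conditions follow the content of each driving behavior.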
• the processor 11 includes a first block that realizes the planning function of the operation plan, a second block that realizes the execution function of the operation plan, and a third block that realizes a function of setting the detection condition according to the content of the driving behavior of the operation plan being executed, or a change thereof.
• the processor 11 executes each function through the cooperation of software for realizing the above functions or executing each process and the hardware described above.
  • the operation planning process is a basic process that the operation control system 1 executes.
• the operation planning process includes calculation of a first route on which the vehicle travels, extraction of events encountered when traveling the first route, determination of a driving action for each event, and association of each event with a driving action.
  • the processor 11 calculates a route (sometimes referred to as a first route) on which the vehicle is traveling or scheduled to travel.
  • the processor 11 acquires vehicle information in order to calculate the first route.
  • the processor 11 acquires the current position of the vehicle from the position detection device 221.
  • the processor 11 refers to the map information 222 and calculates a first route using the acquired current position and traveling direction.
  • the processor 11 may acquire the planned traveling route of the vehicle determined by the navigation device 220 as the first route.
  • the processor 11 may acquire, as a first route, the guide route from the current position to the destination, which the navigation device 220 has determined.
• a method known at the time of filing of this application can be suitably used for the route calculation process of the vehicle.
  • the processor 11 acquires (detects / extracts) an event encountered by a vehicle traveling on the first route.
• an event in the present embodiment is a thing (the presence of a circumstance or object) that triggers the judgment processing of the operation control.
• the driving control to be performed includes acceleration and deceleration of the vehicle and steering of the vehicle. That is, an event triggers acceleration, deceleration, and steering of the vehicle.
  • the event is an intersection on the first route, a stop line on the first route, a pedestrian crossing on the first route, and an object around a vehicle traveling on the first route.
  • the objects include flat / three-dimensional traffic signs, moving objects such as pedestrians, two-wheeled vehicles, and four-wheeled vehicles, and road structures such as guard rails, median dividers, and curbs.
  • the processor 11 locates an event.
  • the processor 11 refers to the map information 222 and extracts another route having a point of intersection with the first route where the vehicle is traveling or scheduled to travel.
• the routes having a point of intersection with the first route include a route that crosses the first route, a route that flows into the first route, and a route that flows out of the first route. If another route is detected, the point of intersection with the other route is an intersection of the first route and is acquired as an event.
  • the processor 11 refers to the traffic rule information 224 to obtain the presence and position of the traffic sign on the first route.
  • the traffic regulation information 224 is information in which information such as a temporary stop position, entry prohibition, one-way traffic, etc. is associated with links (routes) and position information.
  • the processor 11 recognizes the traffic rule of the stop as an event.
  • the processor 11 extracts the position where the stop is defined as the position where the vehicle encounters an event.
  • the position of the extracted event is mapped to a path (including a link).
  • processor 11 recognizes a no-entry traffic rule as an event.
  • the processor 11 extracts a position (upstream side in the traveling direction) upstream of the position where the entry prohibition is defined as the position where the vehicle encounters an event.
  • the position of the extracted event is mapped to a path (including a link).
  • the traffic rule information 224 includes a traffic signal indicated by a traffic light. At this time, the map information 222 and the road information 223 may be referred to.
• the processor 11 extracts, based on the detection information, dynamic events that the vehicle traveling on the first route encounters.
  • the information of the event detected based on the detection information may include the presence and the position of the object on the first route.
  • the processor 11 recognizes the presence of an object (an object including a pedestrian, another vehicle, a road structure, etc., a so-called obstacle) detected by the detection device 230 as an event encountered by the vehicle.
  • the processor 11 may extract the presence of the object as an event when the distance between the vehicle and the detected object is less than a predetermined value.
• the processor 11 may extract the presence of the object as an event when the predicted time until the vehicle and the detected object come into contact is less than a predetermined value.
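The two conditions above (distance below a threshold, or predicted time-to-contact below a threshold) could be checked as in the sketch below; the threshold values are illustrative, not from the source:

```python
def is_event(gap_m, closing_speed_mps, dist_threshold=50.0, ttc_threshold=5.0):
    """Treat a detected object as an event if it is close, or will be reached soon."""
    if gap_m < dist_threshold:
        return True                                   # distance condition
    if closing_speed_mps > 0 and gap_m / closing_speed_mps < ttc_threshold:
        return True                                   # time-to-contact condition
    return False

print(is_event(gap_m=40.0, closing_speed_mps=0.0))    # close enough: True
print(is_event(gap_m=60.0, closing_speed_mps=15.0))   # TTC = 4 s: True
print(is_event(gap_m=60.0, closing_speed_mps=5.0))    # TTC = 12 s: False
```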
  • the processor 11 uses the position information of the object to extract an event that a vehicle traveling on the first route encounters.
• the objects include those for which traffic is temporarily restricted, such as a construction site, a broken-down car, or an avoidance area. Information on this type of object may be included in the road information 223.
  • the information on the position where the object is present may be received from a roadside information provision device such as ITS.
  • the processor 11 obtains the presence and the position of the object including the obstacle on the first path based on the output result of the detection device 230.
  • the processor 11 refers to the road information 223 to acquire the presence and the position of the road structure on the first route.
  • the map information 222 and the road information 223 may be referred to.
  • the processor 11 creates a first driving plan for traveling on the first route based on the acquired information (presence and position) of the event and the relationship with the vehicle.
  • the planning of the first driving plan may be performed at a predetermined cycle, or may be performed at a timing when the distance between the vehicle and the intersection (event) is less than the predetermined distance.
• the processor 11 associates the encounter positions of the plurality of extracted events with the route of the vehicle.
  • the processor 11 rearranges the plurality of extracted events in the order in which the vehicle encounters.
  • the processor 11 determines the order of the encountered events from the transition of the position of the vehicle traveling on the first route and the position of the events, and rearranges the events in the order in which the vehicles are encountered. Information arranged in chronological order of encountering this event may be presented to the user via an output device 20 described later.
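Rearranging the extracted events into encounter order reduces to sorting by remaining distance along the first route. A minimal sketch, with the `route_pos` field as an illustrative along-route coordinate:

```python
events = [
    {"id": "CR4", "route_pos": 240.0},
    {"id": "ST1", "route_pos": 50.0},
    {"id": "MX12", "route_pos": 180.0},
    {"id": "CR1", "route_pos": 120.0},
]

def encounter_order(events, vehicle_pos=0.0):
    """Sort events still ahead of the vehicle by remaining distance along the route."""
    ahead = [e for e in events if e["route_pos"] >= vehicle_pos]
    return sorted(ahead, key=lambda e: e["route_pos"] - vehicle_pos)

ordered = [e["id"] for e in encounter_order(events)]
# ordered == ["ST1", "CR1", "MX12", "CR4"]
```

The resulting ordering is what the output device could present to the user as the chronological list of events.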
  • the processor 11 plans the driving behavior of the vehicle traveling on the route.
• the processor 11 draws up a driving plan for when the vehicle travels the first route, using the relationship (evaluation result) between the vehicle and the plurality of events encountered over time as the vehicle travels the first route.
  • the processor 11 creates an operation plan in consideration of the presence of the object detected by the detection device 230.
  • the processor 11 rearranges a plurality of events in the order in which the vehicle encounters, and creates control commands in accordance with a series of driving plans in which the driving behavior is defined for each event.
  • the driving behavior of the upstream or downstream event affects the driving behavior of the downstream or upstream event.
• a change in driving behavior at one event changes the vehicle's arrival time at that event, and thus affects the timing of arrival at each of the chronologically arranged events.
• When the timing of reaching an event changes, the movement amount of each object also changes, and so does the situation at each event. For example, if the driving behavior for the upstream event is a stop, the vehicle decelerates regardless of the driving behavior for the downstream event. Conversely, if the driving behavior for the downstream event is a stop, the vehicle is required at least to decelerate even if the driving behavior for the upstream event is to proceed. Because the vehicle speed becomes low and the extended arrival time caused by deceleration changes the situation at each encountered event, the driving behavior on the upstream side is also affected. A driving plan with such technical features poses the unique problem of having to respond to the contents of the driving actions arranged in time series and to changes in them. In the present embodiment, therefore, the detection conditions are optimized in accordance with the content of each driving action arranged in time series.
• the processor 11 evaluates the type of each event (intersection, traffic rule, object), the position of the event and changes in position (distance, time to contact, approach speed, distance after a predetermined time), and the content of the event (content of traffic rules, attributes of objects).
  • the processor 11 uses the vehicle speed of the vehicle acquired from the vehicle speed sensor 262 to determine the distance to the event and the change in distance.
• the processor 11 refers to one or more of the traffic rule information 224, the map information 222, the road information 223, and the detection result of the detection device 230, and reads the type, position, change in position, and content of the traffic rule. If the event is a traffic light, the processor 11 recognizes, based on the recognition result of the signal recognition function of the detection device 230, whether the traffic rule indicated by the traffic light is proceed, caution, or stop. The processor 11 may instead recognize the traffic rule indicated by the traffic light based on signal information transmitted by an external ITS and acquired via the communication device 30.
• the processor 11 refers to the traffic rule information 224, the road information 223, and the map information 222, and recognizes the position and content of traffic signs detected by the detection device 230. If the event is an object such as a pedestrian, another vehicle, or a road structure, the processor 11 obtains the type, position, change in position, and content of the object based on the position and movement speed of the object detected by the detection device 230.
  • the processor 11 determines one driving action for each of the plurality of extracted events.
• the actions to be determined include a progress behavior (Go) and a stop behavior (No Go).
  • the processor 11 determines, for each event, either a progress action or a stop action. If the event is a traffic rule and the traffic rule calls for a stop, the processor 11 determines the driving action for the event as "stop”. On the other hand, if the traffic rule permits passage, the processor 11 determines the driving behavior for the event as "progress”. If the event is an object, the distance to the object is less than a predetermined value, the change in distance is greater than or equal to a predetermined value, and the time to contact is less than a predetermined value, the processor 11 responds to the event. Determine driving behavior as "stop".
  • the processor 11 "progresses" the driving action for the event. Decide.
  • the processor 11 makes a series of operation plans based on the content of each action determined for the plurality of events.
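The per-event Go/No-Go rule described above can be sketched as one function per event; the event fields and the distance/TTC thresholds are illustrative assumptions:

```python
def decide(event, dist_max=30.0, ttc_max=4.0):
    """Return 'stop' or 'progress' for one event, following the rules in the text."""
    if event["kind"] == "rule":
        # traffic-rule event: the rule itself dictates the action
        return "stop" if event["requires_stop"] else "progress"
    # object event: stop when it is near, the distance is closing, and contact is imminent
    near = event["distance"] < dist_max
    closing = event["closing_speed"] > 0
    imminent = closing and event["distance"] / event["closing_speed"] < ttc_max
    return "stop" if (near and closing and imminent) else "progress"

red_light = {"kind": "rule", "requires_stop": True}
near_ped = {"kind": "object", "distance": 10.0, "closing_speed": 4.0}   # TTC 2.5 s
far_car = {"kind": "object", "distance": 80.0, "closing_speed": 5.0}    # far away
plan = [decide(red_light), decide(near_ped), decide(far_car)]
# plan == ["stop", "stop", "progress"]
```

A series of such per-event decisions, taken in encounter order, is what the text calls the series of operation plans.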
  • the processor 11 determines a driving action to be taken for an event encountered when the vehicle V1 travels the first route RT1.
  • the processor 11 calculates a route along which the vehicle travels in consideration of the destination of the vehicle V1.
  • the calculated route is the first route RT1 in the present embodiment.
  • the planning of the operation plan when traveling the first route RT1 will be described.
  • the vehicle V1 travels in the direction indicated by the arrow F, passes the stop line ST1, the signal SG1, the pedestrian crossing CR1, and turns right at the intersection P.
  • the events encountered by the vehicle V1 when traveling the first route RT1 are the stop line ST1, the signal SG1, the pedestrian crossing CR1, the other vehicle V2 approaching when entering the right turn lane, and the pedestrian crossing CR4.
  • the processor 11 extracts an event at one detection timing.
  • the events that the vehicle V1 encounters change from moment to moment, so if the timing is different, the position, movement (speed, etc.) of the object also changes.
  • the processor 11 calculates an operation plan every moment in accordance with an event which changes every moment at a predetermined cycle.
  • the processor 11 may calculate the driving plan when the vehicle V1 approaches the intersection on the first route (the intersection with another route) within a predetermined distance.
  • the processor 11 detects the type of each extracted event (intersection, traffic rule, object), the position of the event and the change of the position (distance, time to contact, approach speed, distance after a predetermined time), Determine the content (content of traffic rules, attributes of objects).
  • the processor 11 recognizes an event closest to the vehicle V1 (stop line ST1).
  • the processor 11 determines that the stop line ST1 is a traffic rule, the distance from the vehicle V1 is D1, the arrival time is S1, and the stop line ST1 is an event requiring a pause.
• the processor 11 recognizes the event second closest to the vehicle V1 (signal SG1), which corresponds to the stop line ST1.
• the processor 11 determines that the signal SG1 is a traffic rule, that the distance from the vehicle V1 is D2, that the arrival time is S2, and that it is an event prohibiting progress (red/yellow light).
• the stop line ST1 is an event indicating the position at which the vehicle stops upstream of the signal SG1 when the signal SG1 instructs a stop as the vehicle V1 enters the intersection.
  • Signals SG1 and stop lines ST1 recognized as separate events are associated in the traffic rule information 224.
• the content of the stop line ST1 is "stop" when the signal SG1 indicates a stop (red/yellow signal), and "progress" when the signal SG1 indicates progress (blue/green signal).
• the processor 11 determines that the driving action for the event (stop line ST1) associated with the event (signal SG1) is "stop", based on the fact that prohibition of progress is indicated by the event (signal SG1).
  • the stop line ST1 and the signal SG1 can also be regarded as a common event in the operation control of this embodiment.
  • the processor 11 recognizes the third closest event (crosswalk CR1) from the vehicle V1.
• the processor 11 determines that the pedestrian crossing CR1 is a traffic rule, that the distance from the vehicle V1 is D2, that the arrival time is S2, and that it is an event for which progress is permitted (blue/green light).
• the traffic rule for a pedestrian crossing is "stop" if the signal indicates no entry and "progress" if the signal indicates entry permission.
• the traffic rule for a pedestrian crossing is also "stop" when a pedestrian is present at the crossing and "progress" when no pedestrian is present. Since prohibition of progress is indicated by the event (signal SG1), the processor 11 determines the event (crosswalk CR1) as "stop".
  • the detection device 230 detects a pedestrian H1. Based on the detection result of the detection device 230 (presence of the pedestrian H1), the processor 11 determines that the driving action for the event (crosswalk CR1) is “stop”.
• When making a right turn at the intersection P, the processor 11 extracts points (intersections) where the first route intersects other roads as events.
  • the processor recognizes the third closest event (intersection MX12) from the vehicle V1.
  • the processor determines that the intersection MX12 is an intersection, the distance from the vehicle V1 is D3, and the arrival time is S3.
  • Detection device 230 detects other vehicle V2 approaching intersection MX12.
• the detection device 230 recognizes, as an object, anything whose time to collision (TTC) with respect to the vehicle V1 is within a predetermined time. Based on the detection result of the detection device 230 (the presence of the other vehicle V2), the processor 11 determines that the driving action for the event (intersection MX12) is "stop".
  • the processor 11 extracts, as an event, a pedestrian crossing CR4 entering after the right turn in the intersection P.
• the processor 11 recognizes the fourth closest event from the vehicle V1 (crosswalk CR4).
  • the processor 11 determines that the pedestrian crossing CR4 is a traffic rule, the distance from the vehicle V1 is D4, and the arrival time is S4.
  • the processor 11 monitors the detection result of the detection device 230 and confirms that there is no object in the surroundings.
  • the processor 11 determines that the driving behavior for the event (crosswalk CR4) is "progress", provided that the detection device 230 does not detect the object at the timing before entering the event (crosswalk CR4).
• the processor 11 determines, for each event, either a progress behavior or a stop behavior based on the relationship between the vehicle V1 and the plurality of events that the vehicle V1 encounters over time, and creates a series of driving plans using the contents of the action determined for each event.
• the processor 11 creates a series of driving plans for the events using the plurality of events encountered over time as the vehicle V1 travels the first route and their relationship with the vehicle V1. This can simplify the process of making the final driving plan.
• the computational load can be reduced while drafting a highly accurate driving plan that takes the necessary events into consideration.
  • the processor 11 draws up an operation plan of the vehicle V1 traveling on the first route continuously (at a predetermined cycle) using the relationship between each event and the vehicle V1.
  • the operation control apparatus 100 presents the prepared operation plan to the user.
  • the output device 20 displays the events extracted by the processor 11 and arranged in the order they are encountered.
  • the output device 20 uses the display 251 to display information regarding the operation plan.
  • the output device 20 may output a voice of a plurality of rearranged events using the speaker 252.
  • FIG. 2B is a display example showing events over time.
  • Arrow T indicates the traveling direction of the vehicle V1 on the first route.
  • the output device 20 displays the extracted events, that is, the stop line ST1 and the signal SG1, the pedestrian crossing CR1, the intersection MX12, and the pedestrian crossing CR4 in the order in which the vehicle V1 encounters along the time axis arrow T.
  • the information indicating an event may be a symbol, text information, or an abstract mark. Coloring, size, etc. can be determined arbitrarily.
  • the output device 20 displays the driving behavior of each event determined by the processor 11 in association with each event.
  • the driving behavior of the event is displayed under each event so that the position along the arrow T is common to each event.
  • the information indicating the driving behavior may be a symbol, text information, or an abstract mark. Coloring, size, etc. can be determined arbitrarily.
• even when the extracted events include intersections of routes, stop positions under traffic rules, stationary objects such as road structures, and moving objects such as pedestrians and other vehicles, the output device 20 rearranges the plurality of events along a common time axis in the order in which the vehicle V1 encounters them. Other vehicles include those approaching from the rear.
• the driver of the vehicle V1 can thus visually recognize which events the vehicle will encounter, in what order, and what driving behavior will be taken for each.
• In step S1, the processor 11 acquires vehicle information of the vehicle to be controlled.
  • vehicle information includes information related to the driving of the vehicle such as the current position, traveling direction, speed, acceleration, braking amount, steering amount, steering speed, steering acceleration, etc., vehicle specification information, and vehicle performance information.
  • Vehicle information is acquired from the in-vehicle device 200.
• In step S2, the processor 11 acquires detection information.
  • Detection information includes the presence or absence of an event and the position of the event.
• the position of the event is the position of the thing, such as an intersection or an object, that triggers the judgment processing of the operation control.
  • the detection information includes the presence or absence of an object around the vehicle, the attribute of the object (stationary or moving object), the position of the object, the velocity / acceleration of the object, and the traveling direction of the object.
  • the detection information can be acquired from the in-vehicle device 200 including the detection device 230 and the navigation device 220.
• In step S3, the processor 11 determines whether there is a change in the latest scene that the vehicle V1 will encounter next.
• a scene is, for example, an intersection that the vehicle is about to pass.
• a scene includes the events from entry into the intersection, through the crossings with other routes, to the exit of the intersection.
  • a scene contains multiple events.
  • a scene to be encountered can also be defined as a unit to which a group of control instructions corresponding to an event is applied.
• the processor 11 determines whether the travel route has changed and whether the scene targeted by the driving control has been passed. This is to determine whether a new (next) scene needs to be set.
  • the processor 11 determines that there is no change in the travel route if the current position of the vehicle belongs to the route already calculated.
• Otherwise, the processor 11 determines that the travel route has been changed. The processor 11 determines that the scene has been passed when the current position of the vehicle no longer belongs to the area set as the immediately preceding scene, and that the scene has not been passed when the current position of the vehicle V1 still belongs to that area. When the scene is passed, the operation planning for the next scene or event is repeated.
• In step S3, the processor 11 determines that there is a change in the scene when the travel route has been changed or the scene has been passed, and executes the processing of S4 to S9. If the travel route has not been changed and the scene has not been passed, the processor 11 determines that there is no change in the scene and proceeds to S11.
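The branch in step S3 can be sketched as a pure function of the current position, the already-calculated route, and the scene region; representing positions and regions as sets of point IDs is an illustrative simplification:

```python
def scene_changed(position, route_points, scene_region):
    """Scene must be re-set when the route changed or the previous scene was passed."""
    route_changed = position not in route_points   # current position off the planned route?
    scene_passed = position not in scene_region    # left the area set as the last scene?
    return route_changed or scene_passed

route = {"p1", "p2", "p3", "p4"}
scene = {"p1", "p2"}
print(scene_changed("p1", route, scene))  # still inside the scene: False -> go to S11
print(scene_changed("p3", route, scene))  # scene passed: True -> run S4..S9
print(scene_changed("p9", route, scene))  # route changed: True -> run S4..S9
```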
• In step S4, the processor 11 calculates the first route on which the vehicle V1 travels.
• the route calculated by the navigation device 220 may be used as the first route.
• the first route is identified by, for example, a road identifier, a lane identifier, and a link identifier.
• the road identifier, the lane identifier, and the link identifier are defined in the map information 222 and the road information 223.
• In step S5, the processor 11 sets a scene encountered by the vehicle V1 traveling on the first route.
  • a scene is an area including a point at which an intersection of a first path and another path exists.
  • the mode of the intersection with the first path is not limited, and may be any of merging, branching, crossing, T-crossing, and adjacent.
  • the scene is an area including a point where a stop is required on the first route according to the traffic rule information 224.
  • the processor 11 refers to the map information 222, the road information 223, and the traffic rule information 224, and sets a scene in which the vehicle V1 is likely to encounter an event as the region R1 (see FIG. 2).
  • the scene encountered by the vehicle V1 is, for example, the area near the intersection, the area near the junction of the lanes, the area near the pedestrian crossing, the area near the stop line, the area near the crossing, the area near the construction site.
• In step S6, the processor 11 extracts a second route having an intersection with the first route.
  • the processor 11 refers to the map information 222 and the road information 223 to extract a second route having an intersection with the first route.
  • the processor 11 refers to the link information (node information) defined in the map information 222. Where multiple routes intersect, link information (node information) is connected to other multiple links.
  • the processor 11 extracts a second route intersecting the first route from the connection status of the link information (node information).
• In step S7, the processor 11 extracts the events encountered by the vehicle V1 in the set scene.
  • the processor 11 extracts an intersection of the first path and the second path as an event.
  • a plurality of links are connected to one link at the junction point of the route.
  • the vicinity of the entrance to the intersection corresponds to the branch point of the lane, and the vicinity of the exit of the intersection corresponds to the junction of the lane.
  • the point where one link is connected to a plurality of links can be extracted as an event where the first route and the second route cross on the exit side of the intersection. That is, the second route at the exit of the intersection can be detected by detecting the presence of points where one link is connected to a plurality of links.
  • link information is defined also for the pedestrian crossing, and the pedestrian crossing that intersects the first route can be detected as the second route by performing intersection determination of the link of the first route and the link of the pedestrian crossing.
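Finding crossings from link connectivity, as described above, amounts to looking for nodes shared between the first route's links and those of any other route. A minimal graph sketch with illustrative route, node, and link IDs:

```python
# Each link is (route_id, from_node, to_node); nodes shared across routes are crossings.
links = [
    ("RT1", "n1", "n2"), ("RT1", "n2", "n3"),   # first route
    ("RT2", "n4", "n2"), ("RT2", "n2", "n5"),   # second route crossing at n2
    ("RT3", "n6", "n7"),                         # unrelated route, no shared node
]

def crossing_nodes(links, first="RT1"):
    """Nodes where the first route's links meet links of another route."""
    first_nodes = {n for r, a, b in links if r == first for n in (a, b)}
    other_nodes = {n for r, a, b in links if r != first for n in (a, b)}
    return first_nodes & other_nodes

print(crossing_nodes(links))  # node n2 is a candidate event position
```

The same intersection test would apply to pedestrian-crossing links, since the text notes that link information is also defined for crossings.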
  • the processor 11 extracts, as an event, an object for which a pause is required on the first route in accordance with the traffic rule information 224.
  • the position of the extracted event is stored in association with the route.
  • the position of the extracted event may be stored in association with the map information 222 and the road information 223.
  • the driving behavior is determined for each extracted event (the position of the event).
• In step S8, the processor 11 determines the driving behavior for each event.
  • Driving behavior includes "stop” and "progress".
  • the driving behavior of the event is "stop”.
  • the driving behavior of the event is also "stopped”.
  • the driving behavior of the event is "progression”.
  • the driving behavior of the event is also "progress”. The possibility of contact is determined based on the time difference between the time the vehicle reaches the event and the time the object reaches the event.
• In step S9, the processor 11 rearranges the plurality of extracted events in the order in which the vehicle V1 encounters them.
  • the output device 20 displays the rearranged plurality of events on the display 251 (see FIG. 2B).
  • the output device 20 may output a voice of a plurality of rearranged events using the speaker 252. This display may be performed after operation planning.
• In step S11, the processor 11 verifies newly acquired detection information.
  • the situation around the traveling vehicle changes from moment to moment.
• the surrounding conditions, such as the movement of the vehicle itself, changes in the positional relationship with other vehicles, changes in the positions of pedestrians, the appearance of new objects, and changes in detection accuracy due to changes in position, are never constant. That is, the presence and position of the events extracted based on the detection information acquired in step S2 must be reviewed sequentially.
  • the detection information verified in step S11 is detection information acquired at a timing later than the detection information acquired in step S2.
  • the processor 11 extracts an object which a vehicle traveling on the first route encounters based on the new detection information.
  • the processor 11 extracts information of an object present in a second route intersecting the first route among the object information obtained in step S2.
  • In step S12, the processor 11 associates each newly detected object with an event.
  • For example, a pedestrian crossing serving as an event may be associated with a pedestrian as the object, or the intersection point with the second route may serve as the event and be associated with another vehicle traveling on the second route.
  • An object present on the first route may itself be defined as an event.
  • In step S13, the processor 11 determines the driving behavior for each event, taking the newly detected events into consideration.
  • As described above, the driving behavior is determined based on the traffic rules at the event and the possibility of contact between the vehicle and the object.
  • The possibility of contact is determined based on the distance between the vehicle and the object, or on the time until the two would come into contact.
  • The time at which the two would come into contact is calculated using the speeds of the vehicle and the object.
  • The possibility of contact is calculated based on the times at which the vehicle and the object reach the event, and the driving behavior at the event is determined based on this possibility of contact.
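The arrival-time comparison described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function names and the 3-second window are assumptions:

```python
# Hedged sketch: the possibility of contact at an event is judged from the
# difference between the vehicle's and the object's arrival times there.

def arrival_time(distance_m, speed_mps):
    return float("inf") if speed_mps <= 0 else distance_m / speed_mps

def contact_possible(veh_dist_m, veh_speed, obj_dist_m, obj_speed,
                     window_s=3.0):
    """Contact is deemed possible when the two arrival times differ by
    less than `window_s` seconds (an assumed predetermined time)."""
    t_vehicle = arrival_time(veh_dist_m, veh_speed)
    t_object = arrival_time(obj_dist_m, obj_speed)
    return abs(t_vehicle - t_object) < window_s

# Vehicle 40 m from the event at 10 m/s (4 s); pedestrian 6 m away at 1.5 m/s (4 s).
print(contact_possible(40.0, 10.0, 6.0, 1.5))   # both arrive around t = 4 s
print(contact_possible(40.0, 10.0, 30.0, 1.5))  # pedestrian arrives much later
```

The first call returns a possible contact (both arrive at roughly the same moment); the second does not.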
  • Steps S11 to S13 are preferably repeated at a predetermined cycle. Depending on the conditions, it is also possible to skip from step S8 to step S14.
  • When the possibility of contact is high, the driving behavior "stop" is defined for the event. If the event is an object, the possibility of contact is determined based on the positional relationship between the vehicle and the object.
  • The processor 11 determines the movement of an object based on the detection information or its change over time. In this determination, the processor 11 predicts the positional relationship between the object and the vehicle or the event, assuming a moving direction and a speed for the object.
  • The degree of freedom (variance) of an object's moving direction differs according to the attribute of the object, such as a vehicle traveling on a road, a two-wheeled vehicle, or a pedestrian.
  • The processor 11 predicts the moving direction of the object based on its attribute, analyzed from a captured image or the like, and calculates the probability that the predicted moving direction coincides with the direction in which the vehicle is present.
  • The processor 11 calculates the time for the object to reach the event over the range of its predicted speeds, compares it with the time for the vehicle to reach the event calculated from the vehicle information, and calculates the probability that the difference between the two arrival times falls within a predetermined time.
  • The processor 11 predicts the traveling direction and speed of the object based on its attribute, analyzed from a captured image or the like, and calculates the probability of contact with the vehicle for the object moving at the predicted speed.
  • The probability of contact with the vehicle is calculated by multiplying the behavior of the object (traveling direction, speed, and the like) by a coefficient.
  • When the probability is equal to or greater than a predetermined probability serving as a threshold, the processor 11 treats the object as an attention object that may come into contact with the vehicle; when the probability is below the threshold, it treats the object as one that will not contact the vehicle.
  • The processor 11 changes the detection condition by adjusting the probability that the object moves toward the vehicle, the range (variance) of the object's predicted speed, and the threshold used to evaluate the object. A detection condition under which the object is more readily judged to contact the vehicle can be set by raising the probability that the object moves toward the vehicle, widening the range of its predicted speeds, or lowering the threshold.
  • The processor 11 calculates the probability that an object will come into contact with the vehicle based on the behavior of the objects around the vehicle, and determines the "threshold" as a detection condition so that the object is recognized as an attention object when this probability exceeds the predetermined threshold.
  • Alternatively, the processor 11 determines, as the detection condition, the method of calculating the probability by which the object is recognized as an attention object when the probability of contact exceeds the predetermined threshold.
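The coefficient-and-threshold evaluation described above can be sketched as follows; all names and numeric values are illustrative assumptions, and the probability model is deliberately simplified:

```python
# Hedged sketch: a base contact probability derived from the object's
# predicted direction and arrival-time overlap is scaled by a coefficient
# and compared to a threshold to decide "attention object" status.

def is_attention_object(p_toward_vehicle, p_arrival_overlap,
                        coefficient=1.0, threshold=0.5):
    """Treat the object as an attention object when the scaled contact
    probability reaches the threshold."""
    p_contact = min(1.0, p_toward_vehicle * p_arrival_overlap * coefficient)
    return p_contact >= threshold

# The same object, evaluated under two different detection conditions:
p_dir, p_time = 0.8, 0.7                 # base probabilities (assumed)
print(is_attention_object(p_dir, p_time, coefficient=0.8, threshold=0.6))
print(is_attention_object(p_dir, p_time, coefficient=1.2, threshold=0.4))
```

With a low coefficient and high threshold the object is ignored; raising the coefficient and lowering the threshold flags the same object, which mirrors how the text says the detection condition is tightened or relaxed.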
  • The processor 11 of the present embodiment sets the probability of contact between the object and the vehicle, and its threshold, in accordance with the content of the driving behavior at the event.
  • The processor 11 adjusts (corrects) the moving direction or moving speed of the object, determined based on the temporal change of the detection information, the attribute of the object, and the like, according to the content of the driving behavior. The specific method will be described later.
  • In step S14, the processor 11 draws up a driving plan in which a driving behavior is defined for each event.
  • The processor 11 draws up a driving plan in which a driving behavior is associated with each of the plurality of events belonging to the area Q1 set as a scene.
  • The driving plan in this example is a series of instructions obtained by arranging the plurality of extracted events in the order in which the vehicle encounters them and defining a driving behavior for each event.
  • The unit of the driving plan is not particularly limited. In this example, a scene in which an intersection is encountered is the target of the driving plan, but a driving plan covering the route to the destination may be drawn up provisionally, or a driving plan may be formulated for a predetermined number of events.
  • In step S15, the processor 11 sets a detection condition based on the content of the driving behavior at the event.
  • The processor 11 causes the on-vehicle apparatus to execute detection processing according to the determined detection condition.
  • The detection condition in the present embodiment can be set for each point; since it can be set for each point, a detection condition can be set for each event whose point is specified.
  • FIG. 4 is a conceptual diagram illustrating that a detection condition is set for each point. As shown in FIG. 4, detection conditions R1 to R6 can be set for the points P1 to P6 set on the first route RT1 on which the vehicle V1 to be controlled travels.
  • The detection conditions R1 to R6 may be common to one another, or may be different conditions.
  • The setting command including the detection condition may include the detection range (the distance along the traveling direction, the distance along the vehicle-width direction, the height, the scan range, the imaging angle of view, and the focal length), the event to which the condition applies (the application position), or the application timing (the passing point or time).
  • The processor 11 sets the detection condition at an event based on the content of the driving behavior at that event.
  • The processor 11 may define a section OVA to which the detection condition for the event PA is applied, and apply the same detection condition at every position belonging to the section OVA.
  • The section OVA to which the detection condition is applied can be defined, relative to the event, as the section between a first point at a first predetermined distance on the upstream side (the vehicle side) and a second point at a second predetermined distance. The first point may be the position of the event itself.
  • The processor 11 sets the detection condition at each point belonging to the section OVA upstream of the event PA based on the driving behavior determined for the event PA.
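Membership in the section OVA can be sketched as a simple interval test along the route. The concrete distances used here are assumptions (the text only calls them the first and second predetermined distances):

```python
# Hedged sketch: the detection condition for event PA applies between a
# first point (first predetermined distance upstream of the event) and a
# second point; positions are measured along the first route in metres.

def in_section_ova(position_m, event_pos_m,
                   first_dist_m=50.0, second_dist_m=0.0):
    """True when the position lies in [event - first_dist, event - second_dist],
    i.e. within the upstream section OVA for the event."""
    return (event_pos_m - first_dist_m) <= position_m <= (event_pos_m - second_dist_m)

event_pa = 120.0                       # assumed position of event PA (m)
print(in_section_ova(80.0, event_pa))  # inside the 50 m upstream section
print(in_section_ova(60.0, event_pa))  # still upstream of the section start
```

With `second_dist_m=0.0` the section ends at the event itself, matching the note that the first point may be the position of the event.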
  • The processor 11 determines the detection condition based on the driving behavior at the second event, which the vehicle encounters after the first event it encounters first.
  • The processor 11 causes the on-vehicle apparatus 200 to execute detection processing according to the determined detection condition.
  • After the first event PA that the vehicle encounters, the vehicle encounters the second event PB; the detection condition is obtained based on the content of the driving behavior at this second event PB, and the new detection condition is set in the on-vehicle apparatus 200.
  • The processor 11 can thus set the detection condition for the first event PA, upstream of the second event PB, based on the content of the driving behavior at the second event PB encountered second.
  • The processor 11 rearranges the plurality of events in the order in which the vehicle encounters them, and creates control commands in accordance with a series of driving plans in which a driving behavior is defined for each event.
  • The driving behavior at an upstream or downstream event affects the driving behavior at the other. For example, when the driving behavior at a downstream event is a stop, even if the driving behavior at the upstream event is to proceed, the situation changes because deceleration extends the arrival time. When the driving behavior at the upstream event is a stop, the arrival time at the downstream event is delayed because the vehicle decelerates regardless of the downstream driving behavior, and the vehicle then travels at low speed.
  • The processor 11 therefore pre-reads the driving behavior at the relatively downstream event (on the traveling-direction side) and sets the detection condition for the nearest event accordingly. Since the change in the situation is predicted from the driving behavior at the next event and the detection condition for the current event is set on that basis, an appropriate detection condition reflecting the driving behavior at the next event can be set. Because the second encountered event affects the series of driving plans, attention is focused on the second driving behavior to find the appropriate detection condition to apply. Since the driving behavior and the driving plan can be made using detection information obtained under such detection conditions, the success rate (completion rate) of the driving plan can be increased.
  • The processor 11 determines the detection condition based on the driving behavior at the second event, which the vehicle encounters after the first event; after the vehicle passes the first event, the processor switches to the determined detection condition and causes the on-vehicle apparatus 200 to execute detection processing according to the switched condition.
  • After the first event PA that the vehicle encounters, the vehicle encounters the second event PB; the detection condition is obtained based on the content of the driving behavior at the second event PB, and after the vehicle passes the first event, the new detection condition is set in the on-vehicle apparatus 200.
  • The processor 11 can set the detection condition for the second event PB based on the content of the driving behavior at the second event PB encountered second.
  • The processor 11 prefetches the driving behavior at the relatively downstream second event (on the traveling-direction side), determines the detection condition early, and can apply the new detection condition immediately after the first event is passed. A detection condition suited to the driving behavior at the next event can thus be set at an appropriate timing. Since a driving behavior and a driving plan suited to the situation can be made using detection information obtained under appropriate detection conditions, the success rate (completion rate) of the driving plan can be increased.
  • Next, the factors of the detection condition will be described.
  • The factors of the detection condition include (1) the detection range, (2) the method of calculating the contact probability with an object, and (3) the narrowing-down of the objects to be extracted.
  • The processor 11 specifies a factor of the detection condition to change that factor in the detection device 230. Each factor is described below.
  • (1) The "detection range" is the width (area) of the detection region specified by the detection condition, its length (the length along the traveling direction), its width (the length along the vehicle-width direction), its height (the length roughly perpendicular to the traveling surface), the scan range (angle), the angle of view, or the focal length.
  • (2) The "method of calculating the contact probability with an object" comprises the setting conditions for the moving direction of the object, the speed of the object, and the moving range of the object used when determining, under the detection condition, whether an object will come into contact with the vehicle.
  • The moving direction, the speed, and the moving range of an object vary with the attribute of the object.
  • For a pedestrian, whose moving direction has a high degree of freedom, the conclusion as to whether it is an attention object differs depending on whether it moves toward or away from the vehicle.
  • The processor 11 calculates the contact probability between the object and the vehicle using, as specified values set according to their certainty, the moving direction of the object, the speed of the object, and the moving range of the object, together with a coefficient.
  • If the driving behavior at the event is to proceed, the coefficient is set lower than in the case of stopping; conversely, if the driving behavior at the event is to stop, the coefficient is set higher than in the case of proceeding.
  • The processor 11 of the present embodiment changes the specified values, namely the moving direction of the object, the speed of the object, and the moving range of the object, so as to increase or decrease the probability that the object contacts the vehicle.
  • When the driving behavior at the event is to proceed, the threshold is set higher than in the case of stopping; conversely, when it is to stop, the threshold is set lower than in the case of proceeding. This makes it possible to adjust whether an object that may contact the vehicle is treated as an attention object.
  • (3) The "narrowing-down of objects to be extracted" is a detection condition that extracts, from among all the objects included in the detection information, only those likely to come into contact with the vehicle, such as oncoming vehicles and crossing vehicles. For example, before the vehicle reaches an event, objects that have already passed the event are not extracted as attention objects.
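The three factors above can be grouped into a single detection-condition setting, sketched here as a Python dataclass; the field names and values are assumptions for illustration, not identifiers from the patent:

```python
# Hedged sketch: one setting object bundling the three factors of the
# detection condition described in the text.

from dataclasses import dataclass

@dataclass
class DetectionCondition:
    # (1) detection range
    range_length_m: float       # along the traveling direction
    range_width_m: float        # along the vehicle-width direction
    scan_angle_deg: float
    # (2) contact-probability calculation
    contact_coefficient: float
    contact_threshold: float
    # (3) narrowing-down of the objects to extract
    extract_classes: tuple = ("oncoming_vehicle", "crossing_vehicle")

cond = DetectionCondition(range_length_m=40.0, range_width_m=10.0,
                          scan_angle_deg=90.0,
                          contact_coefficient=0.8, contact_threshold=0.6)
print(cond.extract_classes)
```

A setting command sent to the on-vehicle apparatus could carry such an object per point or per event, alongside the application position and timing mentioned earlier.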
  • FIG. 5 shows a subroutine of step S15 (change of the detection condition).
  • In step S21, the necessity of the detection-condition change processing is examined.
  • The processor 11 determines whether to switch the detection condition based on the amount of change in the movement of the object, obtained from the change in the detection information over time.
  • The processor 11 executes the detection-condition change processing of step S15 if the amount of change in the movement of the object is equal to or greater than a predetermined value, and otherwise does not execute it. If there is no change in the movement of the object, it is determined that the detection condition does not need to be changed.
  • If the movement of the object has not changed, the determination of the driving behavior is unlikely to change, and if the determination of the driving behavior is not changed, the setting of the detection condition is likely to be maintained. As a result, the decision to change or maintain the detection condition can be made appropriately, so driving control suited to the actual situation can be executed. By not changing the detection condition more often than necessary, occupants can also be kept from feeling discomfort.
  • In step S22, the processor 11 determines whether the driving behavior is "proceed". If so, the process proceeds to step S23.
  • In step S23, the processor 11 sets the following first detection condition in accordance with the driving behavior "proceed": (1) a detection condition capable of detecting objects whose arrival time at the target event is relatively short; (2) a detection condition whose detection range for detecting objects is relatively narrow; (3) a detection condition under which the probability that an object is judged to contact the vehicle is set relatively low.
  • If the driving behavior is "stop" (step S24), the process proceeds to step S25.
  • In step S25, the processor 11 sets the following second detection condition in accordance with the driving behavior "stop": (1) a detection condition capable of detecting objects whose arrival time at the target event is relatively long; (2) a detection condition whose detection range for detecting objects is relatively wide; (3) a detection condition under which the probability that an object is judged to contact the vehicle is set relatively high.
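The subroutine of FIG. 5 (steps S21 to S25) can be sketched as follows. Only the branching structure follows the text; all numeric values are illustrative assumptions:

```python
# Hedged sketch of the FIG. 5 subroutine: the change is skipped when the
# object's movement has hardly changed (step S21); otherwise a narrow
# "first" condition is set for "proceed" (S22-S23) and a wide "second"
# condition for "stop" (S24-S25).

def select_detection_condition(movement_change, driving_behavior,
                               current_condition, change_threshold=0.5):
    # Step S21: keep the current condition if movement barely changed.
    if movement_change < change_threshold:
        return current_condition
    if driving_behavior == "proceed":            # steps S22-S23
        return {"max_arrival_time_s": 3.0,       # short arrival times only
                "range_length_m": 30.0,          # relatively narrow range
                "contact_threshold": 0.7}        # contact judged less readily
    else:                                        # steps S24-S25 ("stop")
        return {"max_arrival_time_s": 8.0,       # long arrival times included
                "range_length_m": 80.0,          # relatively wide range
                "contact_threshold": 0.3}        # contact judged more readily

kept = select_detection_condition(0.1, "proceed", {"range_length_m": 80.0})
print(kept)                                      # unchanged: gated at S21
print(select_detection_condition(1.0, "stop", kept)["range_length_m"])
```

The gate at S21 is what keeps the condition stable when the scene is static, matching the note that unnecessary switching would disturb the occupants.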
  • When the setting of the detection condition is completed, the process proceeds to step S16.
  • In step S16, the processor 11 remakes the driving plan: the processes of steps S11 to S13 are performed based on the detection information acquired under the changed detection condition, and a new driving plan is drawn up.
  • The verification of the detection information performed in step S16, that is, the extraction of new objects, their association with events, and the determination of a driving behavior for each event, may be the same as, or different from, the processing performed previously in steps S4 to S8 and steps S11 to S13.
  • An appropriate detection condition can be set by determining the detection condition based on the content of the driving behavior.
  • During driving control, the movement of the vehicle to be controlled changes, and the surrounding situation changes with it.
  • By using a detection condition that accords with the driving behavior, changes in the objects to be detected and in their conditions during driving control can be grasped accurately.
  • Since objects corresponding to the driving behavior can be detected, appropriate driving control can be performed in accordance with the actual situation.
  • An experienced driver likewise changes the viewpoint or field of view according to the situation and adjusts judgments accordingly.
  • Since the detection information is acquired under a detection condition that accords with the driving behavior specified in the driving plan, and the driving plan is set or revised on that basis, an occupant who perceives the change in the situation can be kept from feeling that the control is unnatural.
  • With this method of setting the detection condition in accordance with the previously defined driving behavior, a driving plan responsive to changes in the situation can be made, so complex scenes such as intersections can be passed through smoothly.
  • Appropriate detection information is acquired, and object detection processing is performed on an appropriate amount of information; in other words, the acquisition of excessive detection information is suppressed, which reduces the load of the object detection processing and the like.
  • A first setting example will be described with reference to FIGS. 6A and 6B.
  • The events PA, PB, and PC shown in FIGS. 6A and 6B are common to both figures.
  • FIG. 6A shows the detection range at timing T0, and FIG. 6B shows the detection range R1 at timing T1 after timing T0.
  • The detection ranges R1 and R2 shown here may be detection ranges of the sensor 260, or may be ranges in which the processor 11 detects the object OB.
  • The area Q1 of the intersection P is described as the scene in which driving control is performed.
  • The vehicle V1 travels in the section OVA, within a predetermined distance upstream of the event PA.
  • The processor 11 calculates the driving behavior for the common event PA and sets the detection condition at the event PA.
  • The vehicle V1 to be controlled travels on the route RT1 passing through the intersection Q1.
  • The vehicle V1 passes the event PA defined at the point PA (given the same reference sign as the point).
  • At timing T0, a pedestrian M1, which is the object OBA, is present in the vicinity of the pedestrian crossing, which is the event PA encountered first.
  • The processor 11 determines that the distance between the pedestrian M1 and the pedestrian crossing is equal to or greater than a predetermined distance, and determines that the driving behavior at the event PA is "proceed".
  • The vehicle V1 to be controlled travels along the route RT1.
  • The processor 11 monitors the movement of the object over time.
  • The pedestrian M1, who had not been evaluated as an attention object, keeps moving, and at timing T1 enters the pedestrian crossing.
  • The processor 11 then determines that the distance between the pedestrian M1 and the pedestrian crossing is less than the predetermined distance, and determines that the driving behavior at the event PA is "stop".
  • When the first detection condition, applied when the driving behavior at the event PA is "proceed", is compared with the second detection condition, applied when the driving behavior is "stop", the first detection condition is a detection condition for detecting objects whose time to reach the event PA is relatively short.
  • When the driving behavior of the vehicle V1 is "proceed", the time for the vehicle to reach the event PA is short. In such a case, it suffices for the vehicle V1 to recognize the objects OB that arrive at the event PA within a short time. That is, when the driving behavior of the vehicle V1 is "proceed", detection processing is performed excluding objects OB at positions from which they cannot reach the event PA within the predetermined time.
  • That the arrival time of an object OB is short can be determined from the factors that the distance between the event PA and the object OB is short, or that the speed of the object OB is high.
  • When the first detection condition for the driving behavior "proceed" at the event PA is compared with the second detection condition for the driving behavior "stop", the detection range for detecting the object OB is relatively narrow.
  • The width of the detection range may be evaluated by the area of the detection region, by the length along the traveling direction of the vehicle V1, or by the length along the vehicle-width direction of the vehicle V1.
  • Compared with the second detection condition for "stop", the first detection condition for "proceed" at the event PA may also be a detection condition that extracts objects OB approaching the event PA at a relatively high speed.
  • The processor 11 adjusts the travel trajectory on the route of the vehicle V1 in order to set a detection condition in which the detection range for detecting the object OB becomes relatively narrow.
  • When the driving behavior at the event is to proceed, the processor 11 changes the travel trajectory of the vehicle so that the detection range for detecting objects is relatively narrow compared with the second detection condition for the case where the driving behavior at the event is a stop.
  • The detection range during traveling may include a blind spot (occlusion). Since an object in a blind spot cannot be detected, the blind spot is not practically part of the detection range, and the substantial detection range is accordingly narrow.
  • When the driving behavior at the event is to proceed, the processor 11 accepts this state even if the detection range is narrowed by included occlusion; as a result, the travel trajectory of the vehicle is set so that the detection range for detecting objects is relatively narrow. That is, the processor 11 calculates the optimal reference movement trajectory (route) in each lane included in the route to the destination regardless of the presence of occlusion, and applies that travel trajectory to the automatic driving control.
  • The reference movement trajectory (route) in the automatic driving processing includes any of a trajectory located substantially at the center of the traveling lane, a trajectory whose curvature is less than a predetermined value, or a trajectory whose variation in curvature is less than a predetermined value.
  • In this case, it is appropriate to monitor intensively the objects OB present in the near range, where the distance to the event PA is less than a first predetermined value or the arrival time at the event PA is less than a first predetermined value.
  • A second setting example will be described with reference to FIGS. 7A and 7B.
  • The events PA, PB, and PC shown in FIGS. 7A and 7B are common to both figures.
  • FIG. 7A shows the detection range at timing T0, and FIG. 7B shows the detection range R1 at timing T1 after timing T0.
  • The detection ranges R1 and R2 shown here may be detection ranges of the sensor 260, or may be ranges in which the processor 11 detects the object OB.
  • The area Q1 of the intersection P is described as the scene in which driving control is performed.
  • The vehicle V1 travels in the section OVA, within a predetermined distance upstream of the event PA.
  • The processor 11 calculates the driving behavior for the common event PA and sets the detection condition at the event PA.
  • The vehicle V1 to be controlled travels on the route RT1 passing through the intersection Q1.
  • The vehicle V1 passes the event PA defined at the point PA (given the same reference sign as the point).
  • At timing T0, the pedestrian M1, which is the object OBA, is on the pedestrian crossing, which is the event PA encountered first.
  • The processor 11 determines that the distance between the pedestrian M1 and the pedestrian crossing is less than a predetermined distance, and determines that the driving behavior at the event PA is "stop".
  • The vehicle V1 to be controlled travels along the route RT1.
  • The processor 11 monitors the movement of the object over time.
  • The pedestrian M1, having crossed the pedestrian crossing, has moved away from it by timing T1.
  • The processor 11 then determines that the distance between the pedestrian M1 and the pedestrian crossing is equal to or greater than the predetermined distance, and determines that the driving behavior at the event PA is "proceed".
  • When the second detection condition, applied when the driving behavior at the event PA is "stop", is compared with the first detection condition for "proceed", the second detection condition is a detection condition for detecting objects whose time to reach the event PA is relatively long.
  • The vehicle V1 monitors not only the objects OB arriving at the event PA within a short time, but also distant objects OB.
  • For example, it must be considered that the pedestrian OBD shown in FIG. 7B may enter the pedestrian crossing before the vehicle V1 turns right at the intersection.
  • Whereas the first detection condition detects objects OB whose arrival time is within a predetermined time T1, the second detection condition detects objects OB whose arrival time is within a predetermined time T2 (> T1).
  • Detection conditions can thus be set in line with the need to observe far ahead.
  • That the arrival time of an object OB is long can be determined from the factors that the distance between the event PA and the object OB is long, or that the speed of the object OB is low.
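The contrast between the two arrival-time limits T1 and T2 can be sketched as a filter over candidate objects; the times, speeds, and object list are illustrative assumptions:

```python
# Hedged sketch: the first detection condition keeps objects that can reach
# event PA within T1; the second keeps those that can reach it within T2 (> T1).

def filter_objects(objects, max_arrival_time_s):
    return [o for o in objects
            if o["dist_to_event_m"] / max(o["speed_mps"], 0.1) <= max_arrival_time_s]

objects = [
    {"name": "near_cyclist", "dist_to_event_m": 10.0, "speed_mps": 5.0},      # 2 s
    {"name": "far_pedestrian_OBD", "dist_to_event_m": 9.0, "speed_mps": 1.5}, # 6 s
]

T1, T2 = 3.0, 8.0     # assumed predetermined times, with T2 > T1
print([o["name"] for o in filter_objects(objects, T1)])  # first condition
print([o["name"] for o in filter_objects(objects, T2)])  # second condition
```

Under T1 only the nearby cyclist is extracted; under T2 the distant pedestrian OBD is also extracted, mirroring the wider monitoring used when the driving behavior is "stop".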
  • When the second detection condition for "stop" at the event PA is compared with the first detection condition for "proceed", the detection range for detecting the object OB is set as a relatively wide detection condition. The definition of the detection range is as described above. When the distance of the detection range in the traveling direction of the vehicle is increased, it may exceed the detection range of the sensor 260; in that case, the detection information of the object may be acquired from a roadside detection device via the communication device 30.
  • Compared with the first detection condition for "proceed", the second detection condition for "stop" at the event PA may also be a detection condition that extracts objects OB approaching the event PA at a relatively low speed.
  • The processor 11 adjusts the travel trajectory on the route traveled by the vehicle V1 in order to set a detection condition in which the detection range for detecting the object OB is relatively wide.
  • When the driving behavior at the event is a stop, the processor 11 sets a detection condition in which the travel trajectory of the vehicle is changed so that the detection range for detecting objects becomes relatively wide compared with the first detection condition for the case where the driving behavior at the event is to proceed.
  • The processor 11 changes the travel trajectory, including the position within the lane (the position in the road-width direction and the position in the traveling direction), while keeping the first route to the destination in common, and calculates the changed travel trajectory.
  • The adjustment of the travel trajectory includes the timing at which points on the route are passed; by adjusting the passing timing of those points, the distance to an object ahead can be adjusted.
  • The curvature of each adjusted route is kept less than a predetermined curvature. This avoids driving control involving a large steering amount.
  • the processor 11 calculates an optimal reference movement trajectory (route) in each lane included in the route to the destination in the automatic driving process of the vehicle.
  • the reference movement locus (path) is a locus located substantially at the center of a traveling lane, a locus whose included curvature is equal to or less than a predetermined value, and / or a locus whose variation of included curvature is equal to or less than a predetermined value.
  • the processor 11 applies the reference movement trajectory in an ideal situation to move the vehicle, and applies a trajectory in which the reference movement trajectory is changed according to the detected actual situation to move the vehicle.
  • the detection range in the detection process may include a blind spot (occlusion). Since the blind spot is not detected, it can not be said that it is practically a detection range. Here, the area
  • the processor 11 runs the vehicle V1 so as to reduce the dead zone area. Adjust the route to As a first example, in a curved portion of the route, since the preceding vehicle blocks the front detection range, the processor 11 includes a traveling locus including a position shifted to the left and right so as to turn the preceding vehicle along the width direction. Is calculated, and the vehicle travels on the change travel locus.
  • the processor 11 calculates a modified traveling locus including a position separated by a predetermined distance from the preceding vehicle, and travels the vehicle along the traveling locus.
  • the processor 11 calculates a traveling locus including a position at which the back of the preceding vehicle can be recognized, and causes the vehicle to travel on the changed traveling locus. .
  • a detection range with reduced occlusion can be set.
  • the viewpoint for judging occlusion is the sensor position or camera position.
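The idea of choosing a lateral position that shrinks the occluded sector behind a preceding vehicle can be sketched as follows. This is a simplified 2-D model under stated assumptions: the obstacle is represented only by the two endpoints of its rear edge, it lies in the forward field of view (no angle wraparound handling), and all names and offsets are illustrative.

```python
import math

def occluded_angle(sensor_xy, obstacle_left, obstacle_right):
    """Angular width of the sector hidden behind an obstacle, as seen
    from the sensor (viewpoint) position. Assumes the obstacle lies
    ahead of the sensor, so no angle wraparound occurs."""
    a1 = math.atan2(obstacle_left[1] - sensor_xy[1],
                    obstacle_left[0] - sensor_xy[0])
    a2 = math.atan2(obstacle_right[1] - sensor_xy[1],
                    obstacle_right[0] - sensor_xy[0])
    return abs(a1 - a2)

def best_lateral_offset(offsets, obstacle_left, obstacle_right, y=0.0):
    """Pick the candidate lateral position that minimizes the occluded
    sector, i.e. the travel position that best sees past the obstacle."""
    return min(offsets,
               key=lambda x: occluded_angle((x, y),
                                            obstacle_left, obstacle_right))
```

For a 2 m wide preceding vehicle 10 m ahead, any sideways shift of the viewpoint reduces the occluded angle, so the minimizer lands at one of the extreme candidate offsets, mirroring the lateral-shift strategy in the text.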
  • when the driving behavior in the event PA is changed from “stop” to “progress”, the processor 11 applies the first detection condition. Compared with the second detection condition applied when the driving behavior in the event is “stop”, the first detection condition changes the traveling locus of the vehicle so that the detection range for detecting objects becomes relatively narrow.
  • when the driving action in the event PA is determined to be “stop”, the changed traveling locus is returned to its original state. For example, a travel locus whose lateral position had been shifted to narrow the blind area is returned to the original lateral position (the lane center).
  • when the driving behavior is “stop”, the arrival time of the vehicle V1 at the event PA is relatively longer than when the driving behavior is “progress”.
  • because the state of the object OB is likely to change before the vehicle V1 arrives at the event PA, it is appropriate to widely monitor objects OB present in a near range in which the distance to the event PA is less than a second predetermined value (greater than the first predetermined value) or the arrival time to the event PA is less than a second predetermined value (greater than the first predetermined value).
  • objects OB in a relatively distant range, in which the distance to the event PA is at least the first predetermined value and less than the second predetermined value and the arrival time to the event PA is between the first predetermined value and the second predetermined value, can also be included in the determination.
  • the processor 11 takes into consideration that the time taken for the vehicle V1 to arrive at the event PA is relatively long, and monitors a wide range including the object OB which may move within that time. Under the second detection condition, the object OB in the range excluded in the first detection condition is widely monitored. As a result, it is possible to obtain detection information of a range necessary for a scene where the driving action is “stop”.
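The behavior-dependent choice of monitoring range can be sketched as a simple filter. The threshold distances `d1` and `d2` (the first and second predetermined values), the field names, and the function itself are illustrative assumptions, not the patent's implementation.

```python
def select_objects(objects, action, d1=30.0, d2=80.0):
    """Filter detected objects by distance to the event according to
    the driving action: 'progress' monitors only the near range
    (< d1, the first predetermined value), while 'stop' widens
    monitoring up to d2 (the second predetermined value, > d1)."""
    limit = d1 if action == "progress" else d2
    return [ob for ob in objects if ob["dist_to_event"] < limit]
```

Under “progress” a distant object is dropped from consideration, while under “stop” the same object is kept, reflecting the wider monitoring described above.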
  • FIG. 8A is a diagram showing the driving behavior at timing T0
  • FIG. 8B is a diagram showing the driving behavior at timing T1 after timing T0.
  • the vehicle V1 makes a right turn at the intersection.
  • a driving action “progress” is defined because no object that may come into contact is detected.
  • the driving action “stop” is defined in the event PC because the other vehicle OBC approaches within a predetermined distance.
  • the pedestrian OBA enters a pedestrian crossing.
  • the processor 11 changes the driving behavior of the event PA to “stop”. Since the vehicle V1 decelerates when the driving action of the event PA becomes “stop”, the time to reach the events PA, PB, and PC is extended. As a result, the possibility of contact with the object OBB at the event PB is increased, so the driving behavior of the event PB becomes “stop”. Further, since the object OBC passes the event PC earlier than the vehicle V1, the driving action of the event PC becomes “progress”. Thus, when the driving behavior of the event PA changes, the driving behavior of the events following the event PA may also change.
  • compared with the first detection condition applied when the driving behavior in the event is “progress”, the second detection condition applied when the driving behavior is “stop” is a detection condition under which the object OB has a higher probability of being judged to contact the vehicle V1. Depending on the attribute of the object OB (vehicle, two-wheeled vehicle, pedestrian, etc.), the degree of freedom of the movement direction and the variation (threshold value) of the movement speed differ.
  • when the driving action is “stop”, the processor 11 sets the detection condition so that the probability that the moving direction of the object OB is toward the vehicle V1 is assumed to be high, with the result that the probability that the object OB contacts the vehicle V1 is calculated to be high.
  • when the driving action is “stop”, the processor 11 sets the detection condition so that the probability that the moving speed of the object OB falls within the speed range in which it would contact the vehicle V1 at the event is assumed to be high, with the result that the probability that the object OB contacts the vehicle V1 is calculated to be high.
  • compared with the first detection condition applied when the driving behavior in the event is “progress”, the processor 11 lowers the threshold applied as the detection condition for detecting the object OB under the second detection condition applied when the driving behavior is “stop”.
  • A fourth setting example will be described based on FIGS. 9A and 9B.
  • the events PA, PB and PC shown in FIGS. 9A and 9B are common to each other.
  • FIG. 9A is a diagram showing the driving behavior at timing T0
  • FIG. 9B is a diagram showing the driving behavior at timing T1 after timing T0.
  • the vehicle V1 makes a right turn at the intersection.
  • a driving action “stop” is defined because an object that may come into contact is detected. Since the other vehicle OBC passes the event PC earlier than the vehicle V1 in the event PC before leaving the intersection after the right turn, the driving action “progress” is defined in the event PC.
  • the pedestrian OBA leaves the pedestrian crossing.
  • the processor 11 changes the driving behavior of the event PA to “progress”. Since the vehicle V1 maintains its current speed or accelerates when the driving behavior of the event PA becomes “progress”, the time to reach the events PA, PB, and PC is shortened. As a result, since the object OBB has not reached the event PB when the vehicle V1 reaches the event PB, the possibility that the object OBB and the vehicle V1 contact at the event PB is low, and the driving behavior of the event PB becomes “progress”.
  • since the time to reach the event PC also becomes earlier, the vehicle V1 reaches the event PC at the same time as the object OBC, so the object OBC and the vehicle V1 are likely to contact at the event PC, and the driving action of the event PC becomes “stop”. Thus, when the driving behavior of the event PA changes, the driving behavior of the events following the event PA may also change.
  • compared with the second detection condition applied when the driving behavior in the event is “stop”, the first detection condition applied when the driving behavior is “progress” is a detection condition under which the object OB has a lower probability of being judged to contact the vehicle V1. Depending on the attribute of the object OB (vehicle, two-wheeled vehicle, pedestrian, etc.), the degree of freedom of the movement direction and the variation (threshold value) of the movement speed differ.
  • when the driving action is “progress”, the processor 11 sets the detection condition so that the probability that the moving direction of the object OB is toward the vehicle V1 is assumed to be low, with the result that the probability that the object OB contacts the vehicle V1 is calculated to be low.
  • when the driving behavior is “progress”, the processor 11 sets the detection condition so that the probability that the moving speed of the object OB falls within the speed range in which it would contact the vehicle V1 at the event is assumed to be low, with the result that the probability that the object OB contacts the vehicle V1 is calculated to be low.
  • compared with the second detection condition applied when the driving behavior in the event is “stop”, the processor 11 raises the threshold applied as the detection condition for detecting the object OB under the first detection condition applied when the driving behavior is “progress”.
  • an object OB that is unlikely to approach the vehicle V1 at the event can thereby be excluded from the objects to be considered.
  • An object to be watched can be appropriately detected according to the content of driving behavior in the event.
  • FIG. 10A is a diagram showing the range RV1 in which the object OB is detected when the vehicle speed of the vehicle V1 to be controlled is a relatively low speed VS1 (<VS2), and FIG. 10B is a diagram showing the range RV2 in which the object OB is detected when the vehicle speed is VS2 (>VS1).
  • the ranges RV1 and RV2 shown here are not the detection range of the sensor 260 but the range in which the processor 11 detects the object OB.
  • the processor 11 sets a detection condition for detecting an object whose arrival time to an event is shorter as the vehicle speed of the vehicle V1 is higher.
  • when the vehicle speed is relatively low, the object OB is detected in a wide range as shown in FIG. 10A in order to detect objects whose arrival time to the event is relatively long. When the vehicle speed is relatively high, the object OB is detected in a narrow range as shown in FIG. 10B in order to detect objects whose arrival time to the event is relatively short.
  • when the vehicle speed is low, the prediction range of the movement speed of the object is set wide (the dispersion of the movement speed is set wide / the degree of dispersion is set large), and the possibility of contact with the object is determined accordingly. As a result, the probability of contact with the object is calculated to be high. In this way, at low speeds, the possibility of contact can be taken into consideration even for objects located far from the event.
  • when the vehicle speed is high, the prediction range of the movement speed of the object is set narrow (the dispersion of the movement speed is set narrow / the degree of dispersion is set small), and the possibility of contact with the object is determined accordingly. As a result, the probability of contact with the object is calculated to be low.
  • the speed of the vehicle V1 differs between FIG. 10A and FIG. 10B.
  • the position and speed of the object are common.
  • the range in which the presence of the object OBC is predicted is indicated by the presence area OBC-R1, the range in which the presence of the object OBB is predicted is indicated by the presence area OBB-R1, and the range in which the presence of the object OBD is predicted is indicated by the presence area OBD-R1.
  • in FIG. 10A, since the presence areas of the objects OBC, OBB, and OBD belong to the range RV1, the objects OBC, OBB, and OBD are processed as objects to be considered.
  • the range in which the presence of the object OBC is predicted is indicated by the presence area OBC-R2, the range in which the presence of the object OBB is predicted is indicated by the presence area OBB-R2, and the range in which the presence of the object OBD is predicted is indicated by the presence area OBD-R2.
  • the presence areas shown in FIG. 10B, in which the vehicle speed of the vehicle V1 is high, are narrower than the presence areas shown in FIG. 10A, in which the vehicle speed is low. Since only the presence area of the object OBC belongs to the range RV2, only the object OBC is processed as an object to be considered, and the objects OBB and OBD are not considered.
  • as the vehicle speed becomes higher, it is possible to selectively detect objects whose arrival time to the event is short, because the higher the vehicle speed, the shorter the time to reach the event. In other words, objects OB present in a distant range, whose arrival time to the event is equal to or more than a predetermined value, can be excluded from the determination. Acquisition of unnecessary detection information can be prevented while maintaining the detection accuracy of the object OB. As a result, the computational load can be reduced and system resources can be used effectively.
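The speed-dependent narrowing of the detection range, as in FIGS. 10A/10B, can be sketched as a time-horizon filter. The horizon values, the speed threshold, and the field names are illustrative assumptions; the patent describes the principle, not these numbers.

```python
def objects_to_consider(objects, ego_speed,
                        t_max_low=8.0, t_max_high=4.0, v_threshold=40.0):
    """Keep only objects whose predicted arrival time at the event is
    short enough to matter. At higher ego speed, the time horizon
    (and hence the detection range) is narrowed, so distant objects
    with long arrival times are excluded from the determination."""
    t_max = t_max_high if ego_speed > v_threshold else t_max_low
    return [ob for ob in objects if ob["arrival_time"] < t_max]
```

At low speed both near and far objects are processed; at high speed only the object with the short arrival time remains, which is the exclusion the text describes for reducing computational load.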
  • when the driving behavior for the events extracted again after the change of the detection condition has been determined, the process proceeds to step S16 and the driving plan is redrafted. A driving plan suited to the changing circumstances is thus drawn up.
  • in step S17, driving control is executed based on the prepared driving plan.
  • the processor 11 causes the vehicle to execute the driving plan via the vehicle controller 210.
  • based on the actual X coordinate value of the host vehicle V1 (the X axis being the vehicle width direction), the target X coordinate value corresponding to the current position, and a feedback gain, the processor 11 calculates target control values for the steering angle, the steering angular velocity, and so on, required to move the vehicle V1 onto the target X coordinate value.
  • the processor 11 outputs the target control value to the on-vehicle apparatus 200.
  • the vehicle V1 travels on a target route defined by the target lateral position.
  • the processor 11 calculates a target Y coordinate value (Y axis is the traveling direction of the vehicle) along the route.
  • based on the current Y coordinate value of the vehicle V1, the vehicle speed and acceleration/deceleration at the current position, the target Y coordinate value corresponding to the current Y coordinate value, and the vehicle speed and acceleration/deceleration at that target Y coordinate value, the processor 11 calculates a feedback gain related to the Y coordinate value. The processor 11 then calculates a target control value related to the Y coordinate value based on the vehicle speed and acceleration/deceleration corresponding to the target Y coordinate value and on the feedback gain for the Y coordinate value. The target control value in the Y-axis direction includes control values for the operation of the drive mechanism that achieves the acceleration/deceleration and vehicle speed corresponding to the target Y coordinate value (the operation of the internal combustion engine in an engine vehicle, the operation of the electric motor in an electric vehicle, and, in a hybrid vehicle, also the torque distribution between the internal combustion engine and the electric motor) as well as the control value for the braking operation.
  • the control function calculates the target intake air amount (the target opening degree of the throttle valve) and the target fuel injection amount based on the current and target acceleration/deceleration and vehicle speed values, and sends these to the drive unit 270.
  • alternatively, the control function may calculate the acceleration/deceleration and the vehicle speed and send them to the vehicle controller 210, and the vehicle controller 210 may calculate the control values for the operation of the drive mechanism that realizes the acceleration/deceleration and vehicle speed (the operation of the internal combustion engine in an engine vehicle, the operation of the electric motor in an electric vehicle, and, in a hybrid vehicle, also the torque distribution between the internal combustion engine and the electric motor).
  • the processor 11 outputs the calculated target control value in the Y-axis direction to the on-vehicle apparatus 200.
  • the vehicle controller 210 executes steering control and drive control, and causes the vehicle to travel on a target route defined by the target X coordinate value and the target Y coordinate value. The process is repeated each time target Y coordinate values are acquired, and control values for each of the acquired target X coordinate values are output to the on-vehicle apparatus 200.
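The lateral (X-direction) feedback step described above can be sketched as a minimal proportional controller. The gain, the saturation limit, and the function name are illustrative assumptions; the patent speaks only of a feedback gain relating actual and target X coordinates to steering commands.

```python
def steering_command(x_actual, x_target, k_p=0.5, max_steer=0.6):
    """Proportional lateral feedback: steer toward the target X
    coordinate (vehicle-width direction), saturated at an assumed
    actuator limit. Gain and limit values are illustrative."""
    err = x_target - x_actual
    steer = k_p * err
    # Clamp to the steering actuator's physical range.
    return max(-max_steer, min(max_steer, steer))
```

Called once per control cycle with the latest target X coordinate, this drives the lateral error toward zero, which is the role the feedback gain plays in the text; the Y-direction speed control would follow the same pattern with speed and acceleration targets.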
  • in step S18, the vehicle controller 210 executes driving control according to the instructions of the processor 11 until the destination is reached.
  • the operation control apparatus 100 is configured and operates as described above, and therefore, the following effects can be obtained.
  • the driving control method of the present embodiment determines driving behavior based on detection information acquired at an event according to the detection conditions, drafts a driving plan in which driving behavior is defined for each event, causes the vehicle to execute driving control commands according to the driving plan, and determines the detection conditions based on the content of the driving behavior defined for the event.
  • An appropriate detection condition can be set by determining the detection condition based on the content of the driving action.
  • during driving control, the movement of the vehicle to be controlled changes, and the surrounding situation changes accordingly.
  • an object corresponding to the driving behavior can be detected, so appropriate driving control can be performed according to the actual situation.
  • with the method of setting the detection condition in accordance with the previously defined driving behavior, a driving plan that responds to changes in the situation can be made, so that complex scenes such as intersections can be passed through smoothly. Appropriate detection information is acquired, and detection processing of objects can be executed based on an appropriate amount of information. In other words, acquisition of an excessive amount of detection information can be suppressed, detection processing of objects can be executed based on the minimum necessary amount of information, and efficient use of system resources can be realized.
  • the driving control method rearranges the plurality of extracted events in the order in which the vehicle encounters them, drafts a series of driving plans in which driving behavior is defined for each event, and determines the detection condition based on the content of the driving action in the second event, which is encountered next after the first event that the vehicle encounters first.
  • in a series of driving plans determined once, when any driving behavior changes due to the appearance of a new object (another vehicle), the other driving behaviors are also affected.
  • a change in driving behavior at one event changes the vehicle's arrival time and thereby affects the timing of arriving at each of the chronologically arranged events. When the timing of reaching an event changes, the amount of movement of each object also changes, and the situation at each event changes as well.
  • for example, the vehicle is required to at least decelerate even if the driving behavior of an upstream event is “progress”. Because extending the time to reach an event results in a change of the situation at that event, the upstream driving behavior is also affected.
  • the processor 11 reads ahead the driving behavior in relatively downstream (traveling-direction) events and sets the detection condition of the nearest event. As a result, the driving behavior of the next event can be read ahead and the detection condition of the event that the vehicle first encounters can be set, so that a detection condition suited to the driving behavior of the next event can be set. Using detection information obtained under appropriate detection conditions, driving actions and a driving plan suited to the situation can be made, so that the success rate (completion rate) of the driving plan can be increased.
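The look-ahead scheme of ordering events by encounter and choosing the nearest event's detection condition from the *next* event's planned behavior can be sketched as follows. The condition labels, field names, and sort key are illustrative assumptions.

```python
def condition_for_first_event(events):
    """Sort events by distance along the route (encounter order) and
    choose the detection condition for the nearest event by reading
    ahead the driving behavior planned for the following event:
    a planned 'stop' widens monitoring, a planned 'progress' narrows it."""
    ordered = sorted(events, key=lambda e: e["dist"])
    if len(ordered) < 2:
        return "default"
    next_action = ordered[1]["action"]
    return "wide" if next_action == "stop" else "narrow"
```

The detection condition returned here would then be applied as soon as the vehicle passes the first event, so the new condition is in force from the earliest possible moment, matching the early-switching behavior described above.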
  • the driving control method rearranges the plurality of extracted events in the order in which the vehicle encounters them, drafts a series of driving plans in which driving behavior is defined for each event, determines the detection condition based on the content of the driving action in the second event, which is encountered next after the first event that the vehicle encounters first, and switches to the determined detection condition after the vehicle passes the first event.
  • the processor 11 determines the detection condition based on the driving behavior in the second event encountered next after the first event that the vehicle encounters first, switches to the determined detection condition after the vehicle passes the first event, and causes the in-vehicle apparatus 200 to execute detection processing according to the switched detection condition. The processor 11 reads ahead the driving behavior in the second event on the relatively downstream (traveling-direction) side, determines the detection condition early, and can apply the new detection condition immediately after passing the first event. Detection conditions suited to the driving behavior in the next event can thus be set at the appropriate timing. Using detection information obtained under appropriate detection conditions, driving actions and a driving plan suited to the situation can be made, so that the success rate (completion rate) of the driving plan can be increased.
  • compared with the second detection condition applied when the driving behavior in the event is “stop”, the first detection condition applied when the driving behavior is “progress” is a detection condition for detecting objects whose arrival time to the event is relatively short. It is appropriate to intensively monitor objects OB present in a near range in which the distance to the event PA is less than the first predetermined value or the arrival time to the event PA is less than the first predetermined value.
  • By switching the detection condition it is possible to exclude the object OB having a distance to the event PA equal to or more than the first predetermined value and an arrival time to the event PA in a distant range equal to or more than the first predetermined value. Unnecessary detection information can be prevented from being acquired while maintaining the detection accuracy of the object OB. As a result, computational load can be reduced and system resources can be used effectively.
  • compared with the first detection condition applied when the driving behavior in the event is “progress”, the second detection condition applied when the driving behavior is “stop” is a detection condition for detecting objects whose arrival time to the event is relatively long.
  • because the state of the object OB is likely to change before the vehicle V1 arrives at the event PA, it is appropriate to widely monitor objects OB present in a near range in which the distance to the event PA is less than a second predetermined value (greater than the first predetermined value) or the arrival time to the event PA is less than a second predetermined value (greater than the first predetermined value).
  • objects OB in a relatively distant range, in which the distance to the event PA is at least the first predetermined value and less than the second predetermined value and the arrival time to the event PA is between the first predetermined value and the second predetermined value, can also be included in the determination.
  • the processor 11 takes into consideration that the time taken for the vehicle V1 to arrive at the event PA is relatively long, and monitors a wide range including the object OB which may move within that time. Under the second detection condition, the object OB in the range excluded in the first detection condition is widely monitored. As a result, it is possible to obtain detection information of a range necessary for a scene where the driving action is “stop”.
  • compared with the first detection condition applied when the driving behavior in the event is “progress”, the second detection condition applied when the driving behavior is “stop” changes the traveling locus of the vehicle so as to relatively widen the detection range for detecting objects.
  • because the state of the object OB is likely to change before the vehicle V1 arrives at the event PA, it is appropriate to widely monitor objects OB present in a near range in which the distance to the event PA is less than a second predetermined value (greater than the first predetermined value) or the arrival time to the event PA is less than a second predetermined value (greater than the first predetermined value). Objects OB in a relatively distant range, in which the distance to the event PA is at least the first predetermined value and less than the second predetermined value and the arrival time to the event PA is between the first predetermined value and the second predetermined value, can also be included in the determination.
  • the processor 11 takes into consideration that the time taken for the vehicle V1 to arrive at the event PA is relatively long, and monitors a wide range including the object OB which may move within that time. Under the second detection condition, the object OB in the range excluded in the first detection condition is widely monitored. As a result, it is possible to obtain detection information of a range necessary for a scene where the driving action is “stop”.
  • compared with the second detection condition applied when the driving behavior in the event is “stop”, the first detection condition applied when the driving behavior is “progress” changes the traveling locus of the vehicle so as to relatively narrow the detection range for detecting objects. It is appropriate to intensively monitor objects OB present in a near range in which the distance to the event PA is less than the first predetermined value or the arrival time to the event PA is less than the first predetermined value.
  • By switching the detection condition it is possible to exclude the object OB having a distance to the event PA equal to or more than the first predetermined value and an arrival time to the event PA in a distant range equal to or more than the first predetermined value. Unnecessary detection information can be prevented from being acquired while maintaining the detection accuracy of the object OB. As a result, computational load can be reduced and system resources can be used effectively.
  • compared with the second detection condition applied when the driving behavior in the event is “stop”, the first detection condition applied when the driving behavior is “progress” is a detection condition under which the probability that an object is judged to contact the vehicle is low. That is, even if the condition of the object is the same, the first detection condition leads to a detection result in which the probability that the object contacts the vehicle is relatively lower than under the second detection condition. If the driving behavior in the event is “progress”, the range of objects OB considered as possibly approaching the vehicle V1 at the event is narrowed. Objects to be watched can thus be appropriately detected according to the content of the driving behavior in the event.
  • compared with the first detection condition applied when the driving action in the event is “progress”, the second detection condition applied when the driving action is “stop” is a detection condition under which the probability that an object is judged to contact the vehicle is high. That is, even if the condition of the object is the same, the second detection condition leads to a detection result in which the probability that the object contacts the vehicle is relatively higher than under the first detection condition.
  • if the driving behavior in the event is “stop”, the possibility of objects approaching the vehicle V1 at the event is assumed to be higher, and objects OB present in a wide range are considered. Objects to be watched can thus be appropriately detected according to the content of the driving behavior in the event.
  • in the driving control method of the present embodiment, the higher the vehicle speed, the more the detection condition is set so as to detect objects whose arrival time to the event is shorter. The higher the vehicle speed, the shorter the time to reach the event, so objects whose arrival time to the event is short can be selectively detected according to the vehicle speed.
  • in the driving control method of the present embodiment, whether to switch the detection condition is determined based on the amount of change in the movement of the object obtained from the change in detection information over time. If the amount of change in the movement of the object is less than a predetermined value, the determination of the driving behavior is unlikely to change; and if the determination of the driving behavior does not change, the setting of the detection condition is likely to be maintained. As a result, the decision to change or maintain the detection condition can be made appropriately, so driving control suited to the actual situation can be executed. By avoiding frequent changes of the detection condition, discomfort to the occupant can be suppressed.
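The switch/maintain decision based on the amount of object movement between detection cycles can be sketched as follows. The threshold value and the point-list representation are illustrative assumptions; the patent only states that the condition is re-evaluated when the change amount exceeds a predetermined value.

```python
import math

def should_switch(prev_positions, curr_positions, change_threshold=1.0):
    """Decide whether to re-evaluate (switch) the detection condition:
    only when some tracked object moved more than the threshold
    between two detection cycles. Keeping the condition otherwise
    avoids frequent switching and the occupant discomfort it causes."""
    moved = (math.dist(p, c) for p, c in zip(prev_positions, curr_positions))
    return any(d >= change_threshold for d in moved)
```

Small jitter in the tracked positions leaves the current detection condition in place; only a substantial movement of some object triggers re-evaluation, which is the hysteresis the text describes.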
  • the operation control apparatus 100 of the present embodiment exhibits the same operation and effect as the above-described operation control method.

Abstract

The invention relates to a driving control method in which a processor (11) for controlling the driving of a vehicle: acquires detection information around a vehicle V1 according to a detection condition that can be set for each point; extracts events encountered by the vehicle V1 on the basis of the detection information; drafts a driving plan in which a driving behavior is defined for each of the events on the basis of the detection information acquired at the event; executes commands for carrying out the driving control of the vehicle according to the driving plan; and determines a detection condition according to the content of the driving behavior defined for an event.
PCT/JP2017/036697 2017-10-10 2017-10-10 Procédé et appareil de commande de conduite WO2019073526A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP17928354.4A EP3696789B1 (fr) 2017-10-10 2017-10-10 Procédé et appareil de commande de conduite
RU2020115456A RU2743683C1 (ru) 2017-10-10 2017-10-10 Способ управления вождением и оборудование управления вождением
CN201780095672.7A CN111448596A (zh) 2017-10-10 2017-10-10 驾驶控制方法以及驾驶控制装置
JP2019547818A JP6779590B2 (ja) 2017-10-10 2017-10-10 運転制御方法及び運転制御装置
US16/754,971 US11584388B2 (en) 2017-10-10 2017-10-10 Driving control method and driving control apparatus
PCT/JP2017/036697 WO2019073526A1 (fr) 2017-10-10 2017-10-10 Procédé et appareil de commande de conduite

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/036697 WO2019073526A1 (fr) 2017-10-10 2017-10-10 Procédé et appareil de commande de conduite

Publications (1)

Publication Number Publication Date
WO2019073526A1 true WO2019073526A1 (fr) 2019-04-18

Family

ID=66100473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036697 WO2019073526A1 (fr) 2017-10-10 2017-10-10 Procédé et appareil de commande de conduite

Country Status (6)

Country Link
US (1) US11584388B2 (fr)
EP (1) EP3696789B1 (fr)
JP (1) JP6779590B2 (fr)
CN (1) CN111448596A (fr)
RU (1) RU2743683C1 (fr)
WO (1) WO2019073526A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018205199B4 * 2018-04-06 2021-03-18 Volkswagen Aktiengesellschaft Determination and use of stopping points for motor vehicles
WO2019202627A1 * 2018-04-16 2019-10-24 Mitsubishi Electric Corporation Obstacle detection device, automatic braking device using obstacle detection device, obstacle detection method, and automatic braking method using obstacle detection method
US20210269040A1 (en) * 2018-07-16 2021-09-02 Nissan Motor Co., Ltd. Driving assist method and driving assist device
US10915762B1 (en) * 2018-12-07 2021-02-09 Waymo Llc Sidewalk detection for pedestrian behavior modeling
JP7121714B2 * 2019-09-17 2022-08-18 Honda Motor Co., Ltd. Vehicle control system
CN114572240B * 2020-11-30 2023-11-14 Beijing Baidu Netcom Science and Technology Co., Ltd. Vehicle travel control method and apparatus, vehicle, electronic device, and storage medium
JP2022139065A * 2021-03-11 2022-09-26 Honda Motor Co., Ltd. Driving support device, driving support method, and program
CN114822058B * 2022-05-11 2023-03-03 Shenzhen Smart Vehicle Connectivity Technology Co., Ltd. Driving standard prompt and monitoring method and system based on signalized intersections, on-board terminal, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009245120A * 2008-03-31 2009-10-22 Toyota Motor Corp Intersection visibility detection device
JP2010096584A * 2008-10-15 2010-04-30 Kajima Corp Obstacle detection device and obstacle detection method
JP2011096105A 2009-10-30 2011-05-12 Toyota Motor Corp Driving support device
WO2017010264A1 * 2015-07-10 2017-01-19 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and vehicle control program

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3324421B2 1996-12-09 2002-09-17 Mitsubishi Motors Corporation Vehicle rear-side warning device
JP2008181419A * 2007-01-25 2008-08-07 Toyota Motor Corp Collision prediction device and collision prediction method
JP4434224B2 * 2007-03-27 2010-03-17 Denso Corporation In-vehicle device for driving support
JP5408237B2 * 2010-12-28 2014-02-05 Denso Corporation In-vehicle obstacle information notification device
EP2801962B1 * 2012-09-12 2016-09-21 Omron Corporation Data flow control instruction generation device and sensor management device
WO2014192370A1 * 2013-05-31 2014-12-04 Hitachi Automotive Systems, Ltd. Vehicle control device
JP6180968B2 * 2014-03-10 2017-08-16 Hitachi Automotive Systems, Ltd. Vehicle control device
WO2017013749A1 * 2015-07-21 2017-01-26 Nissan Motor Co., Ltd. Drive planning device, travel support device, and drive planning method
CN107226088B * 2016-03-25 2022-03-08 Panasonic Intellectual Property Corporation of America Controller, driving control method, and program
US9535423B1 * 2016-03-29 2017-01-03 Adasworks Kft. Autonomous vehicle with improved visual detection ability
JP7062898B2 * 2017-09-07 2022-05-09 Denso Corporation Collision avoidance device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3696789A4

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020075476A1 * 2018-10-11 2020-04-16 Hitachi Automotive Systems, Ltd. On-board system
JPWO2020075476A1 * 2018-10-11 2021-10-07 Hitachi Astemo, Ltd. In-vehicle system
JP2020187551A * 2019-05-14 2020-11-19 Toyota Industries Corporation Autonomous vehicle
US11543506B2 (en) 2019-11-06 2023-01-03 Yandex Self Driving Group Llc Method and computer device for calibrating LIDAR system
RU2792946C1 * 2019-11-06 2023-03-28 Yandex Self-Driving Technologies LLC Method and computer device for calibrating a LIDAR system
RU2750118C1 * 2019-12-25 2021-06-22 Yandex Self-Driving Technologies LLC Methods and processors for controlling the operation of a self-driving car
US11433893B2 (en) 2019-12-25 2022-09-06 Yandex Self Driving Group Llc Methods and processors for controlling operation of self-driving car

Also Published As

Publication number Publication date
US11584388B2 (en) 2023-02-21
CN111448596A (zh) 2020-07-24
EP3696789B1 (fr) 2022-08-03
EP3696789A4 (fr) 2020-10-21
RU2743683C1 (ru) 2021-02-24
EP3696789A1 (fr) 2020-08-19
JPWO2019073526A1 (ja) 2020-10-22
JP6779590B2 (ja) 2020-11-04
US20200298877A1 (en) 2020-09-24

Similar Documents

Publication Publication Date Title
JP6779590B2 (ja) Driving control method and driving control device
JP6854357B2 (ja) Driving control method and driving control device
JP6819177B2 (ja) Driving support method and driving support device
JP6308233B2 (ja) Vehicle control device and vehicle control method
JP6443550B2 (ja) Scene evaluation device, travel support device, and scene evaluation method
JP6451847B2 (ja) Drive planning device, travel support device, and drive planning method
JP6451848B2 (ja) Drive planning device, travel support device, and drive planning method
JP6443552B2 (ja) Scene evaluation device, travel support device, and scene evaluation method
US20190035278A1 (en) Driving Assistance Method and Device
JP6451211B2 (ja) Travel control device
JP6809087B2 (ja) Driving support method and driving support device
JP7154914B2 (ja) Driving control method and driving control device
JP6443551B2 (ja) Scene evaluation device, travel support device, and scene evaluation method
WO2021005392A1 (fr) Driving control method and driving control device
WO2018033949A1 (fr) Driving assistance method and driving assistance apparatus
JP2022129234A (ja) Remote support system and remote support method
JP2020052673A (ja) Driving control method and driving control device
JP2020199787A (ja) Vehicle travel control method and travel control device
JP7258677B2 (ja) Driving control method and driving control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17928354

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019547818

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017928354

Country of ref document: EP

Effective date: 20200511