WO2022244604A1 - Processing method, processing system, and processing program - Google Patents

Processing method, processing system, and processing program

Info

Publication number
WO2022244604A1
Authority
WO
WIPO (PCT)
Prior art keywords
host
target
lane
mobile
trajectory
Prior art date
Application number
PCT/JP2022/018843
Other languages
English (en)
Japanese (ja)
Inventor
世航 莫
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Priority to JP2023522587A priority Critical patent/JP7487844B2/ja
Publication of WO2022244604A1 publication Critical patent/WO2022244604A1/fr

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04Traffic conditions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • The present disclosure relates to processing technology for performing processing related to the driving of a mobile body.
  • Patent Literature 1 plans operation control related to the navigation operation of the host vehicle according to sensed information regarding the internal and external environments of the host vehicle. There, when it is determined that there is potential accident liability based on a safety model according to the driving policy and the detected information, the driving control is restricted.
  • An object of the present disclosure is to provide a processing method that promotes handling suitable for merging scenes. Another object of the present disclosure is to provide a processing system that promotes handling suitable for merging scenes. Yet another object of the present disclosure is to provide a processing program that promotes handling suitable for merging scenes.
  • A first aspect of the present disclosure is a processing method executed by a processor to perform processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge, the method including: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; predicting, for each of the host mobile body and the target mobile body, a minimum arrival time to reach the merging point; and setting a priority such that, of the host mobile body and the target mobile body, the mobile body with the shorter minimum arrival time has priority over the other mobile body in traveling at the merging point.
  • A second aspect of the present disclosure is a processing system that has a processor and performs processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge.
  • The processor is configured to execute: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; predicting, for each of the host mobile body and the target mobile body, a minimum arrival time to reach the merging point; and setting a priority such that, of the host mobile body and the target mobile body, the mobile body with the shorter minimum arrival time has priority over the other mobile body in traveling at the merging point.
  • A third aspect of the present disclosure is a processing program that is stored in a storage medium and comprises instructions executed by a processor to perform processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge.
  • The instructions include: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; predicting, for each of the host mobile body and the target mobile body, a minimum arrival time to reach the merging point; and setting a priority such that the mobile body with the shorter minimum arrival time, of the host mobile body and the target mobile body, has priority over the other mobile body in traveling at the merging point.
  • According to these aspects, the merging point between the host trajectory of the host mobile body in the host lane and the target trajectory of the target mobile body in the target lane is estimated. The minimum arrival time to reach the merging point is predicted for each of the host mobile body and the target mobile body, and a priority is set so that the mobile body whose minimum arrival time is shorter than that of the other is prioritized in traveling at the merging point. In this way, the mobile body that reaches the merging point earlier can be clearly prioritized, which promotes a response suitable for the merging scene.
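  • The following is a minimal sketch, not taken from the disclosure, of the priority rule described in the above aspects. The function and variable names are illustrative assumptions; the tie-breaking in favor of the host mirrors the first embodiment described later, in which the host keeps priority when its arrival time is equal to or less than the target's.

```python
def set_priority(host_arrival_time: float, target_arrival_time: float) -> str:
    """Return which mobile body is given priority at the merging point.

    The body with the shorter minimum arrival time is prioritized; a tie is
    resolved in favor of the host, matching the comparison used later
    (host priority when th <= tt).
    """
    if host_arrival_time <= target_arrival_time:
        return "host"
    return "target"
```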
  • A fourth aspect of the present disclosure is a processing method executed by a processor to perform processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge, the method including: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; setting a priority at the merging point between the host mobile body and the target mobile body; and, when the priority is set to the target mobile body, giving the host mobile body driving control based on a safety envelope set according to the driving policy between the target mobile body and the host mobile body before reaching the merging point.
  • A fifth aspect of the present disclosure is a processing system that has a processor and performs processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge.
  • The processor is configured to execute: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; and setting a priority at the merging point between the host mobile body and the target mobile body.
  • The processor is further configured to execute, when the priority is set to the target mobile body, giving the host mobile body driving control based on a safety envelope set according to the driving policy between the target mobile body and the host mobile body before reaching the merging point.
  • A sixth aspect of the present disclosure is a processing program that is stored in a storage medium and comprises instructions executed by a processor to perform processing related to the driving of a host mobile body in a merging scene in which the host mobile body traveling in a host lane and a target mobile body traveling in a target lane merge.
  • The instructions include: estimating a merging point between a host trajectory, which is a future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is a future travel trajectory of the target mobile body in the target lane; setting a priority at the merging point between the host mobile body and the target mobile body; and, when the priority is set to the target mobile body, causing the host mobile body to be given driving control based on a safety envelope set according to the driving policy between the target mobile body and the host mobile body before reaching the merging point.
  • According to these aspects, the merging point between the host trajectory of the host mobile body in the host lane and the target trajectory of the target mobile body in the target lane is estimated. When the priority at the merging point is set to the target mobile body, driving control based on the safety envelope set according to the driving policy between the target mobile body and the host mobile body is given to the host mobile body before it reaches the merging point. In this way, the driving of the host mobile body can be properly controlled before the traveling of the target mobile body is prioritized at the merging point, which promotes a response suitable for the merging scene.
  • FIG. 2 is a schematic diagram showing a running environment of a host vehicle to which the first embodiment is applied, and FIG. 3 is a block diagram showing the functional configuration of the processing system according to the first embodiment.
  • FIG. 4 and subsequent figures are schematic diagrams showing merging scenes to which the first embodiment is applied.
  • A further figure is a flowchart showing the processing method according to the first embodiment.
  • FIG. 13 is a block diagram showing the functional configuration of the processing system according to the second embodiment, and a correspondence table is provided for explaining the processing method according to the second embodiment.
  • The processing system 1 of the first embodiment shown in FIG. 1 performs processing related to the driving of a host mobile body (hereinafter referred to as driving-related processing).
  • The host vehicle 2 and the target vehicle 3 shown in FIG. 2 are the mobile bodies for which the processing system 1 performs the driving-related processing. From a viewpoint centered on the host vehicle 2, the host vehicle 2 can be said to be the ego-vehicle, and the target vehicle 3 can be said to be another road user.
  • Automated driving is executed in the host vehicle 2. Automated driving is classified into levels according to the degree of manual intervention by the driver in the dynamic driving task (hereinafter referred to as DDT). Automated driving may be realized by autonomous cruise control, such as conditional driving automation, advanced driving automation, or full driving automation, in which the system performs all of the DDT when activated. Automated driving may also be realized by advanced driving assistance control, such as driving assistance or partial driving automation, in which the driver as an occupant performs some or all of the DDT. Automated driving may be realized by either one of, a combination of, or switching between autonomous cruise control and advanced driving assistance control.
  • The host vehicle 2 is equipped with a sensor system 5, a communication system 6, a map DB (Data Base) 7, and an information presentation system 4.
  • The sensor system 5 obtains sensor data usable by the processing system 1 by detecting the external and internal worlds of the host vehicle 2. For this purpose, the sensor system 5 includes an external sensor 50 and an internal sensor 52.
  • The external sensor 50 detects targets existing in the external world of the host vehicle 2.
  • The target-detection type external sensor 50 is, for example, at least one of a camera, a LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), a laser radar, a millimeter-wave radar, an ultrasonic sonar, and the like.
  • The internal sensor 52 detects, in the internal world of the host vehicle 2, specific physical quantities related to vehicle motion (hereinafter referred to as motion physical quantities) and the states of the occupants.
  • The internal sensor 52 is, for example, at least one of a speed sensor, an acceleration sensor, a gyro sensor, an actuator sensor, a driver status monitor (registered trademark), a biosensor, a seating sensor, an in-vehicle equipment sensor, and the like.
  • The communication system 6 acquires communication data usable by the processing system 1 by wireless communication.
  • The communication system 6 may receive positioning signals from artificial satellites of a GNSS (Global Navigation Satellite System) existing in the external world of the host vehicle 2.
  • The positioning type communication system 6 is, for example, a GNSS receiver or the like.
  • The communication system 6 may transmit and receive communication signals to and from a V2X system existing in the external world of the host vehicle 2.
  • The V2X type communication system 6 is, for example, at least one of a DSRC (Dedicated Short Range Communications) communication device, a cellular V2X (C-V2X) communication device, and the like.
  • The communication system 6 may transmit and receive communication signals to and from terminals existing in the internal world of the host vehicle 2.
  • The terminal communication type communication system 6 is, for example, at least one of a Bluetooth (registered trademark) device, a Wi-Fi (registered trademark) device, an infrared communication device, and the like.
  • The map DB 7 stores map data usable by the processing system 1.
  • The map DB 7 includes at least one type of non-transitory tangible storage medium, such as semiconductor memory, a magnetic medium, or an optical medium.
  • The map DB 7 may be a DB of a locator that estimates the self-state quantities of the host vehicle 2, including its own position.
  • The map DB 7 may be a DB of a navigation unit that navigates the travel route of the host vehicle 2.
  • The map DB 7 may be constructed by combining such databases.
  • The map DB 7 acquires and stores the latest map data through communication with an external center via the V2X type communication system 6, for example.
  • The map data is two-dimensional or three-dimensional data representing the driving environment of the host vehicle 2.
  • Digital data of a high-precision map may be adopted as the three-dimensional map data.
  • The map data may include, for example, road data representing at least one of the positional coordinates, shape, road surface condition, and the like of road structures.
  • The map data may include, for example, marking data representing at least one of the positional coordinates, shape, and the like of road signs attached to roads, road markings, and lane markings.
  • The map data may include, for example, structure data representing at least one of the positional coordinates, shape, and the like of buildings and traffic lights facing roads.
  • The information presentation system 4 presents notification information to occupants including the driver of the host vehicle 2.
  • The information presentation system 4 includes a visual presentation unit, an auditory presentation unit, and a skin-sensation presentation unit.
  • The visual presentation unit presents notification information by stimulating the visual sense of the occupant.
  • The visual presentation unit is, for example, at least one of a HUD (Head-up Display), an MFD (Multi Function Display), a combination meter, a navigation unit, a light emitting unit, and the like.
  • The auditory presentation unit presents notification information by stimulating the auditory sense of the occupant.
  • The auditory presentation unit is, for example, at least one of a speaker, a buzzer, a vibration unit, and the like.
  • The skin-sensation presentation unit presents notification information by stimulating the skin sensation of the occupant.
  • The skin-sensation presentation unit is, for example, at least one of a steering wheel vibration unit, a driver's seat vibration unit, a steering wheel reaction force unit, an accelerator pedal reaction force unit, a brake pedal reaction force unit, and the like.
  • A lane structure LS with separated lanes is assumed.
  • The lane structure LS restricts the movement of the host vehicle 2 and the target vehicle 3 with the direction in which the lanes extend as the longitudinal direction.
  • The lane structure LS also regulates the movement of the host vehicle 2 and the target vehicle 3 with the lateral direction of the lanes, that is, the width direction or the direction in which the lanes are aligned.
  • The lane structure LS assumed in this embodiment is a merging structure LSm in which the host lane Lh on which the host vehicle 2 travels and the target lane Lt on which the target vehicle 3 travels merge.
  • In the driving-related processing, a safety model described in accordance with a driving policy and its safety is used.
  • The driving policy is defined based on a vehicle-level safety strategy that guarantees the safety of the intended functionality (Safety Of The Intended Functionality: hereinafter referred to as SOTIF).
  • Driving policies between the host vehicle 2 and the target vehicle 3 in the lane structure LS are defined, for example, by the following (A) to (E).
  • The forward direction with respect to the host vehicle 2 is, for example, the traveling direction on a turning circle at the current steering angle of the host vehicle 2, the traveling direction of a straight line passing through the center of gravity of the host vehicle 2 perpendicularly to the axle of the host vehicle 2, or the traveling direction on the axis of the FOE (Focus of Expansion) of the front camera module of the host vehicle 2.
  • A vehicle shall not rear-end, from behind, another vehicle traveling in front of it.
  • A vehicle shall not forcibly cut in between other vehicles.
  • A safety model is described by following a driving policy that implements a vehicle-level safety strategy and by modeling the SOTIF.
  • The safety model may be at least one of a safety-related model itself and a model forming part of a safety-related model.
  • The safety model assumes behavior of road users that does not lead to unreasonable situations, that is, proper and rational actions to be taken as a proper response.
  • The unreasonable situations between the host vehicle 2 and the target vehicle 3 in the lane structure LS are a head-on collision, a rear-end collision, and a side collision. Rational actions against a head-on collision include, for example, braking by a vehicle traveling in the wrong direction.
  • Rational actions against a rear-end collision include, for example, avoidance of sudden braking by the leading vehicle and avoidance of a rear-end collision by the following vehicle.
  • Rational actions against a side collision include, for example, vehicles running side by side steering away from each other.
  • The safety model may be trained by a machine learning algorithm that back-propagates the results of driving control to the safety model.
  • As the safety model to be trained, it is preferable to use at least one type of learning model among deep learning using a neural network such as a DNN (Deep Neural Network), reinforcement learning, and the like.
  • A safety model may be designed in accordance with accident liability rules, in which a moving body that does not take rational actions bears responsibility for an accident.
  • The safety model, which is used to monitor the risk between the host vehicle 2 and the target vehicle 3 under the accident liability rules according to the driving policy on the lane structure LS, should be designed so that potential accident liability due to unreasonable risk or misuse by road users is avoided by acting rationally.
  • Such a safety model includes, for example, the responsibility sensitive safety model (RSS (Responsibility Sensitive Safety) model) disclosed in Patent Document 1, and the like.
  • A safety envelope that guarantees SOTIF in the host vehicle 2, for example based on the vehicle-level safety strategy, is set according to the driving policy.
  • For the safety envelope, a safety distance is assumed from a profile relating to at least one type of kinematic quantity, based on the safety model between the host vehicle 2 and the target vehicle 3 on the assumption that they follow the driving policy.
  • The safety distance in the lane structure LS defines a physics-based marginal boundary around the host vehicle 2 with respect to the expected motion of the target vehicle 3.
  • As the safety distance, a safety distance for avoiding the risk of rear-end and head-on collisions in the longitudinal direction of the host vehicle 2 and a safety distance for avoiding the risk of side collisions in the lateral direction of the host vehicle 2 should be assumed.
  • The definition of the safety envelope here may be a common concept that can be used to address all principles that a driving policy may adhere to. According to this concept, a motor vehicle has one or more boundaries around itself, and violation of one or more of these boundaries causes different responses by the motor vehicle.
  • A safety envelope may define a physics-based margin around a motor vehicle. Usually combined with assumptions defined as the reasonably foreseeable worst-case behavior of others, such margins form the basic building blocks of safety-related models.
  • A safety envelope may be a fundamental building block for understanding whether a motor vehicle is in a high-risk scenario.
  • A safety envelope may define a boundary, margin, or buffer zone not only around the own vehicle but also around other vehicles, pedestrians, or stationary objects.
  • In the driving-related processing, the actual distance between the host vehicle 2 and the target vehicle 3 is compared with the safety distance based on the safety model for each driving scene to determine whether or not the safety envelope is violated. When it is determined that the safety envelope has been violated, it is preferable to place restrictions on the driving control of the host vehicle 2 so that rational actions are taken for each state transition between the vehicles 2 and 3 based on reasonably foreseeable assumptions.
  • The processing system 1 is connected to the sensor system 5, the communication system 6, the map DB 7, and the information presentation system 4 via at least one of a LAN (Local Area Network), a wire harness, an internal bus, a wireless communication line, and the like.
  • The processing system 1 includes at least one dedicated computer.
  • The dedicated computer constituting the processing system 1 may be an integrated ECU (Electronic Control Unit) that integrates the driving control of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be a judgment ECU that judges the DDT in the driving control of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be a monitoring ECU that monitors the driving control of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be an evaluation ECU that evaluates the driving control of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be a navigation ECU that navigates the travel route of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be a locator ECU that estimates the self-state quantities of the host vehicle 2, including its own position.
  • The dedicated computer constituting the processing system 1 may be an actuator ECU that controls the motion actuators of the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be an HCU (HMI (Human Machine Interface) Control Unit) that controls information presentation in the host vehicle 2.
  • The dedicated computer constituting the processing system 1 may be at least one external computer that constitutes, for example, an external center or a mobile terminal capable of communicating via the communication system 6.
  • The dedicated computer constituting the processing system 1 has at least one memory 10 and at least one processor 12.
  • The memory 10 is, for example, at least one type of non-transitory tangible storage medium among semiconductor memory, magnetic media, optical media, and the like, and non-temporarily stores computer-readable programs and data.
  • The processor 12 includes, as a core, at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a RISC (Reduced Instruction Set Computer)-CPU, and the like.
  • The processor 12 executes a plurality of instructions contained in a processing program stored in the memory 10 as software. Thereby, the processing system 1 constructs a plurality of functional blocks for executing the operation control processing of the host vehicle 2. In this way, in the processing system 1, the plurality of functional blocks are constructed by causing the processor 12 to execute the plurality of instructions of the processing program stored in the memory 10 in order to perform the operation control processing of the host vehicle 2.
  • The plurality of functional blocks constructed by the processing system 1 of the first embodiment include a merging estimation block 100, an arrival prediction block 110, a priority setting block 120, and an operation control block 130, as shown in FIG. 3.
  • The merging estimation block 100 monitors the presence or absence of the target vehicle 3 traveling in the target lane Lt while the host vehicle 2 is traveling in the host lane Lh, based on data acquired by at least one of the sensor system 5, the communication system 6, and the map DB 7. That is, the merging estimation block 100 monitors whether there is a transition to a merging scene in which the host vehicle 2 traveling in the host lane Lh of the merging structure LSm and the target vehicle 3 traveling in the target lane Lt of the same structure LSm merge. This can also be said to be processing in which the merging estimation block 100 selects a merging scenario in which the host vehicle 2 traveling in the host lane Lh and the target vehicle 3 traveling in the target lane Lt merge.
  • When the merging estimation block 100 detects (that is, recognizes) the traveling of the target vehicle 3 in the target lane Lt, it estimates the merging point Pm between the host vehicle 2 and the target vehicle 3. Specifically, the merging estimation block 100 estimates the merging point Pm between the host trajectory TRh, which is the future travel trajectory of the host vehicle 2 in the host lane Lh, and the target trajectory TRt, which is the future travel trajectory of the target vehicle 3 in the target lane Lt.
  • The point Pt at which the target vehicle 3 is detected in the target lane Lt and the point Ph at which the host vehicle 2 is traveling in the host lane Lh at the detection timing are referred to as detection points.
  • As the detection points Pt and Ph, the points corresponding to the first, or to a second or later, predetermined timing at which the target vehicle 3 is detected in monitoring the merging scene (merging scenario) are used in the subsequent blocks 110, 120, and 130.
  • The host trajectory TRh is the future travel trajectory planned for the host vehicle 2 in the host lane Lh.
  • The target trajectory TRt is a trajectory estimated by detecting the target vehicle 3 and the target lane Lt based on data acquired by at least one of the sensor system 5, the communication system 6, and the map DB 7.
  • The merging point Pm is presumptively defined at the point where the target trajectory TRt, extending from the target lane Lt as the merging lane toward the host lane Lh, intersects the host trajectory TRh in the host lane Lh as the main line. If an estimation error in the lateral direction of the target lane Lt is expected in the target trajectory TRt, the merging point Pm may be defined with the estimation error taken into consideration within the range Rm (see FIG. 4) where the projection of the target lane Lt intersects the host trajectory TRh.
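  • The disclosure does not prescribe how the intersection of TRh and TRt is computed. As an illustrative assumption only, the sketch below estimates the merging point Pm by a pairwise segment-intersection test over the two predicted trajectories treated as 2D polylines; all names are hypothetical.

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def _segment_intersection(p1: Point, p2: Point, q1: Point, q2: Point) -> Optional[Point]:
    """Return the intersection point of segments p1-p2 and q1-q2, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, q1, q2
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if abs(denom) < 1e-12:  # parallel or degenerate segments
        return None
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def estimate_merging_point(host_traj: List[Point], target_traj: List[Point]) -> Optional[Point]:
    """Scan both polylines and return the first intersection as Pm, if any."""
    for i in range(len(host_traj) - 1):
        for j in range(len(target_traj) - 1):
            pm = _segment_intersection(host_traj[i], host_traj[i + 1],
                                       target_traj[j], target_traj[j + 1])
            if pm is not None:
                return pm
    return None
```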
  • The arrival prediction block 110 shown in FIG. 3 predicts, for each of the host vehicle 2 and the target vehicle 3, the minimum arrival time to reach the merging point Pm estimated by the merging estimation block 100.
  • The arrival prediction block 110 obtains the host arrival time th, which is the minimum arrival time of the host vehicle 2, and the target arrival time tt, which is the minimum arrival time of the target vehicle 3, by prediction calculations based on Equation 1 and Equation 2, respectively.
  • Equation 1 assumes that the speed of the host vehicle 2 up to the merging point Pm is controlled to be substantially constant at the speed vh at the detection point Ph so that the host vehicle 2 does not become a cause of a collision or rear-end collision with the target vehicle 3.
  • In Equation 1, dh is the distance from the detection point Ph to the merging point Pm.
  • Equation 2 assumes that the speed of the target vehicle 3 up to the merging point Pm changes from the speed vt at the detection point Pt in the target lane Lt with the acceleration/deceleration at at the detection point Pt.
  • In Equation 2, dt is the distance from the detection point Pt to the merging point Pm.
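  • The concrete forms of Equations 1 and 2 are not reproduced in this text. The sketch below assumes, consistently with the stated premises, that Equation 1 is uniform motion of the host at vh and Equation 2 is uniformly accelerated motion of the target at (vt, at); the function names are illustrative.

```python
import math

def host_arrival_time(dh: float, vh: float) -> float:
    """Minimum arrival time th of the host, assumed to hold vh constant (Eq. 1)."""
    return dh / vh if vh > 0.0 else math.inf

def target_arrival_time(dt: float, vt: float, at: float) -> float:
    """Minimum arrival time tt of the target under constant acceleration at (Eq. 2).

    Solves dt = vt * t + 0.5 * at * t**2 for the first non-negative t; falls back
    to uniform motion when at is (near) zero, and returns infinity when the
    target would stop before reaching the merging point.
    """
    if abs(at) < 1e-9:
        return dt / vt if vt > 0.0 else math.inf
    disc = vt * vt + 2.0 * at * dt
    if disc < 0.0:  # decelerates to a stop before reaching Pm
        return math.inf
    t = (-vt + math.sqrt(disc)) / at
    return t if t >= 0.0 else math.inf
```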
  • The priority setting block 120 shown in FIG. 3 compares the host arrival time th, predicted as the minimum arrival time by the arrival prediction block 110, with the target arrival time tt. When the host arrival time th is equal to or less than the target arrival time tt, the priority setting block 120 sets a priority such that the traveling of the host vehicle 2, which has the shorter minimum arrival time, has priority over the other vehicle, the target vehicle 3, at the merging point Pm. On the other hand, when the host arrival time th exceeds the target arrival time tt, the priority setting block 120 sets a priority such that the traveling of the target vehicle 3, which has the shorter minimum arrival time, has priority over the other vehicle, the host vehicle 2, at the merging point Pm. Setting the priority here can also be said to define the right of way.
  • The priority setting block 120 may transmit information about the set priority from the communication system 6 of the host vehicle 2 to the target vehicle 3. At this time, the transmission between the vehicles 2 and 3 may be realized directly by a V2V type communication system or the like, or may be realized indirectly via a remote center such as a cloud server or through a mesh network configured among a plurality of vehicles including the vehicles 2 and 3.
  • The priority setting block 120 may accumulate information about the set priority by storing it in the memory 10 as evidence information.
  • The evidence information may be stored in an unencrypted state, or may be stored in an encrypted or hashed state.
  • The memory 10 in which the evidence information is accumulated may be installed in the host vehicle 2 depending on the type of dedicated computer constituting the processing system 1, or may be installed, for example, in an external center outside the host vehicle 2. The accumulated evidence information may be used to verify automated driving systems, driving policies, and safety models.
  • The priority setting block 120 may notify the occupants of the host vehicle 2 of information regarding the set priority through the information presentation system 4 (see FIG. 3). Further, as described above, when the priority information can be transmitted to the target vehicle 3, the priority setting block 120 may instruct the target vehicle 3 to notify its occupants of the priority information.
  • The operation control block 130 shown in FIG. 3 controls the driving of the host vehicle 2 according to the priority at the merging point Pm set by the priority setting block 120. Specifically, when the host vehicle 2 is given priority over the target vehicle 3 at the merging point Pm, the operation control block 130 causes the host vehicle 2 to maintain the driving control at the detection point Ph. As a result, the speed of the host vehicle 2 is maintained substantially constant at the speed vh at the detection point Ph. Maintenance of such driving control may be continued until the host vehicle 2 reaches the merging point Pm, and may be continued after passing the merging point Pm until the target vehicle 3 reaches the merging point Pm.
  • On the other hand, when the target vehicle 3 is given priority over the host vehicle 2 at the merging point Pm, the operation control block 130 gives the host vehicle 2 driving control based on the safety envelope between the target vehicle 3 and the host vehicle 2 from before reaching the merging point Pm.
  • Specifically, the operation control block 130 projects a projected image 3a of the target vehicle 3 from the detection point Pt in the target lane Lt onto the host lane Lh before reaching the merging point Pm.
  • The projected image 3a is a rotated image obtained by rotating the target vehicle 3 at the detection point Pt around a specific point, such as its center point, so that the lateral direction of the target vehicle 3 is aligned with the lateral direction of the host lane Lh.
  • The projected image 3a may be assumed by projecting the rotated image onto the host lane Lh along the lateral direction.
  • The projected image 3a may instead be assumed by rotationally projecting the target vehicle 3 at the detection point Pt onto the host lane Lh in a rotational direction about the merging point Pm.
  • The projected image 3a is assumed based on data acquired by at least one of the sensor system 5, the communication system 6, and the map DB 7.
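  • As an illustrative assumption not stated in the text, if the projection preserves the target's remaining along-lane distance dt to the merging point Pm, the determination target distance dj from the host vehicle 2 (at distance dh from Pm) to the projected image 3a can be approximated as their difference.

```python
def determination_target_distance(dh: float, dt: float) -> float:
    """Longitudinal distance dj from the host vehicle to the projected image 3a.

    Assumes the projection keeps the target's remaining distance dt to Pm.
    A non-positive value means the projected image lies at or behind the host,
    so the envelope check degenerates to an immediate violation.
    """
    return dh - dt
```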
  • The driving control block 130 sets the safety envelope according to the driving policy by using the safety model that follows the driving policy between the projected image 3a of the target vehicle 3 and the host vehicle 2.
  • Specifically, the operation control block 130 assumes the safety distance ds against a rear-end collision of the host vehicle 2 with respect to the projected image 3a assumed ahead in the host lane Lh.
  • The safety distance ds is assumed by Equations 3 to 6 on the premise that the speed of the host vehicle 2 up to the merging point Pm is controlled to be substantially constant at the speed vh at the detection point Ph, so that the host vehicle 2 does not become a cause of a rear-end collision.
  • Equations 3 to 6 assume that the projected image 3a, running at the speed vt corresponding to the target vehicle 3 at the detection point Pt, suddenly decelerates at the maximum deceleration at_brmax.
  • These assumptions may be reasonably foreseeable assumptions. Under them, Equations 3 to 6 assume that the host vehicle 2, traveling at the speed vh, runs freely with zero acceleration from the detection point Ph for the reaction time ρ and then brakes at the maximum deceleration ah_brmax that can occur.
  • dt_br is the braking distance of the projected image 3a
  • dh_fr is the free running distance of the host vehicle 2 at the reaction time ⁇
  • dh_br is the braking distance of the host vehicle 2.
  • Instead of the maximum deceleration assumed for the projected image 3a, its minimum deceleration may be considered.
  • Alternatively, a deceleration of the projected image 3a that keeps the jerk constant may be considered.
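  • The concrete forms of Equations 3 to 6 are not reproduced in this text. The sketch below assumes an RSS-style longitudinal decomposition that is consistent with the quantities named above: a free-running term during the reaction time ρ, a braking term for the host, and the braking distance of the projected image 3a subtracted out. The formula and names are assumptions, not the disclosed equations.

```python
def safety_distance(vh: float, vt: float, rho: float,
                    ah_brmax: float, at_brmax: float) -> float:
    """Safe distance ds of the host behind the projected image 3a (assumed form).

    vh, vt   : host speed at Ph and target speed at Pt [m/s]
    rho      : host reaction time [s]; zero acceleration assumed during rho
    ah_brmax : maximum braking deceleration of the host [m/s^2, positive]
    at_brmax : maximum braking deceleration of the projected image [m/s^2, positive]
    """
    dh_fr = vh * rho                       # free-running distance of the host
    dh_br = vh * vh / (2.0 * ah_brmax)     # braking distance of the host
    dt_br = vt * vt / (2.0 * at_brmax)     # braking distance of the projected image
    return max(0.0, dh_fr + dh_br - dt_br)
```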
  • The operation control block 130 compares the determination target distance dj, which is the distance from the host vehicle 2 at the detection point Ph to the projected image 3a, with the safety distance ds of the host vehicle 2 with respect to the projected image 3a. When the determination target distance dj exceeds the safety distance ds as shown in FIG. 9, the operation control block 130 determines that the safety envelope is normally ensured. Based on the determination of normality, the operation control block 130 gives the host vehicle 2 driving control for transitioning the host vehicle 2 to an acceleration state at the allowable maximum acceleration ah_acmax based on the safety model, as shown in FIG. 11.
  • Here, the allowable maximum acceleration ah_acmax is set as a constraint (that is, a guard) according to Equations 7 to 9. Driving control toward such an acceleration state may be continued until the host vehicle 2 reaches the merging point Pm, and may be continued even after passing the merging point Pm as long as a normal safety envelope is ensured.
  • On the other hand, when the determination target distance dj is equal to or less than the safety distance ds, the operation control block 130 determines that a violation of the safety envelope is assumed. Upon determining a violation of the safety envelope, the driving control block 130 places driving control restrictions on the host vehicle 2 in order to transition the host vehicle 2 to a minimal risk continued driving state based on the safety model. Here, the minimal risk continued driving state is a driving state in which an unreasonable risk of rear-end collision is avoided by, for example, decelerating or changing lanes. Such driving control restrictions may be continued until the host vehicle 2 reaches the merging point Pm, and may be continued even after passing the merging point Pm until the violation of the safety envelope is resolved.
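  • A hedged sketch of this control decision follows. The constraint of Equations 7 to 9 on ah_acmax is not reproduced here, so ah_acmax is treated as a given guard value; the function and return-value names are illustrative.

```python
def decide_operation_control(dj: float, ds: float, ah_acmax: float) -> dict:
    """Compare dj with ds and pick the driving control given to the host vehicle.

    dj > ds : the safety envelope is judged normal and the host may transition
              to an acceleration state bounded by ah_acmax (corresponds to S109).
    dj <= ds: a violation of the envelope is assumed and the host transitions to
              a minimal-risk continued driving state, e.g. deceleration or a
              lane change (corresponds to S110).
    """
    if dj > ds:
        return {"state": "accelerate", "max_acceleration": ah_acmax}
    return {"state": "minimal_risk", "action": "decelerate_or_change_lane"}
```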
  • The blocks 100, 110, 120, and 130 jointly execute the flow of the processing method for performing the driving-related processing, in accordance with the flowchart.
  • This flow is executed at the timing corresponding to the detection points Pt and Ph, in response to detection of a merging scene (merging scenario) by the merging estimation block 100.
  • Each "S" in this flow means one of a plurality of steps executed by the plurality of instructions contained in the processing program stored in the memory 10.
  • In S101, the merging estimation block 100 estimates the merging point Pm between the host trajectory TRh of the host vehicle 2 in the host lane Lh and the target trajectory TRt of the target vehicle 3 in the target lane Lt.
  • In S102, the arrival prediction block 110 predicts the host arrival time th of the host vehicle 2 and the target arrival time tt of the target vehicle 3 as the minimum arrival times until reaching the merging point Pm estimated in S101.
  • In S103, the priority setting block 120 compares the magnitudes of the host arrival time th and the target arrival time tt predicted as the minimum arrival times in S102.
  • When the host arrival time th is equal to or less than the target arrival time tt, in S104 the priority setting block 120 sets a priority such that the host vehicle 2, which has the shorter minimum arrival time, has priority over the target vehicle 3 in traveling at the merging point Pm.
  • In S105, the operation control block 130 causes the host vehicle 2 to maintain the driving control at the detection point Ph.
  • On the other hand, when the host arrival time th exceeds the target arrival time tt, in S106 the priority setting block 120 sets a priority such that the target vehicle 3, which has the shorter minimum arrival time, has priority over the host vehicle 2 in traveling at the merging point Pm.
  • In S107, the driving control block 130 sets the safety envelope according to the driving policy by assuming the safety distance ds of the host vehicle 2 with respect to the projected image 3a of the target vehicle 3 projected onto the host lane Lh before reaching the merging point Pm. Further, in S108, the operation control block 130 compares the determination target distance dj from the host vehicle 2 at the detection point Ph to the projected image 3a with the safety distance ds of the host vehicle 2 with respect to the projected image 3a.
  • When the determination target distance dj exceeds the safety distance ds in S108 and it is determined that the safety envelope is normally ensured, the flow proceeds to S109.
  • In S109, the operation control block 130 transitions the host vehicle 2 to an acceleration state at the allowable maximum acceleration ah_acmax based on the safety model.
  • When the determination target distance dj is equal to or less than the safety distance ds in S108 and it is determined that the safety envelope is violated, the flow proceeds to S110.
  • In S110, the driving control block 130 transitions the host vehicle 2 to the minimal risk continued driving state based on the safety model. This flow ends after execution of any of S109, S110, and S105 described above is completed.
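  • The following end-to-end sketch ties together the flow of S101 to S110 by reusing the illustrative helper functions introduced above. All names and the simplified geometry are assumptions made for readability, not definitions from the disclosure.

```python
def merging_scene_flow(host_traj, target_traj,
                       dh, vh, dt_dist, vt, at,
                       rho, ah_brmax, at_brmax, ah_acmax):
    pm = estimate_merging_point(host_traj, target_traj)       # S101
    if pm is None:
        return {"state": "no_merging_point"}
    th = host_arrival_time(dh, vh)                            # S102
    tt = target_arrival_time(dt_dist, vt, at)                 # S102
    if set_priority(th, tt) == "host":                        # S103, S104
        return {"state": "maintain", "priority": "host"}      # S105
    # S106: priority is set to the target vehicle
    ds = safety_distance(vh, vt, rho, ah_brmax, at_brmax)     # S107
    dj = determination_target_distance(dh, dt_dist)           # S108
    return decide_operation_control(dj, ds, ah_acmax)         # S109 / S110
```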
  • According to the first embodiment described above, the merging point Pm between the host trajectory TRh of the host vehicle 2 in the host lane Lh and the target trajectory TRt of the target vehicle 3 in the target lane Lt is estimated. Of the host vehicle 2 and the target vehicle 3, the vehicle whose predicted minimum arrival time to reach the merging point Pm is shorter than that of the other is given priority in traveling at the merging point Pm. In this way, the vehicle that reaches the merging point Pm earlier can be clearly prioritized, so that a response suitable for the merging scene (merging scenario) can be promoted.
  • Also according to the first embodiment, when the priority at the merging point Pm is set to the target vehicle 3, driving control based on the safety envelope set according to the driving policy between the target vehicle 3 and the host vehicle 2 is given to the host vehicle 2 before reaching the merging point Pm. In this way, the driving of the host vehicle 2 can be properly controlled before the traveling of the target vehicle 3 is prioritized at the merging point Pm, so that a response suitable for the merging scene (merging scenario) can be promoted.
  • The second embodiment, shown in FIG. 13, is a modification of the first embodiment.
  • The plurality of functional blocks constructed by the processing system 1 of the second embodiment include a detection block 200, a planning block 220, a risk monitoring block 240, and a control block 260, as shown in FIG. 13.
  • The detection block 200 detects the internal and external environments of the host vehicle 2 by fusing, as inputs, the data obtained from the sensor system 5, the communication system 6, and the map DB 7.
  • The planning block 220 plans the driving control of the host vehicle 2 based on the detection information obtained from the detection block 200.
  • The planning block 220 plans a route for the host vehicle 2 to travel in the future under the driving control, and an appropriate trajectory for the host vehicle 2 to follow that route, based on the detection information.
  • Based on the detection information acquired from the detection block 200, the risk monitoring block 240 monitors, for each scene, the risk between the host vehicle 2 and other road users including the target vehicle 3. For this purpose, the risk monitoring block 240 sets a safety envelope based on the scene-by-scene detection information.
  • The control block 260 executes the driving control of the host vehicle 2 planned by the planning block 220. At this time, if the safety envelope set by the risk monitoring block 240 is violated, the control block 260 imposes constraints according to the driving policy on the planned driving control of the host vehicle 2.
  • In the second embodiment, the flow of the processing method for performing the driving-related processing described in the first embodiment is executed jointly by the blocks 200, 220, 240, and 260, as shown in the correspondence table.
  • For example, S101 to S104 and S106 may be executed by the detection block 200 or the planning block 220, S107 and S108 by the risk monitoring block 240, and S105, S109, and S110 by the control block 260.
  • Alternatively, S101 to S103 may be executed by the detection block 200 or the planning block 220, S104 and S106 to S108 by the risk monitoring block 240, and S105, S109, and S110 by the control block 260.
  • Alternatively, S101 and S102 may be executed by the detection block 200 or the planning block 220, S103, S104, and S106 to S108 by the risk monitoring block 240, and S105, S109, and S110 by the control block 260.
  • Alternatively, S101 to S104 and S106 to S108 may be executed by the risk monitoring block 240, and S105, S109, and S110 by the control block 260.
  • In a modification, the dedicated computer constituting the processing system 1 may include, as a processor, at least one of a digital circuit and an analog circuit.
  • The digital circuit here is, for example, at least one of an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), an SOC (System on a Chip), a PGA (Programmable Gate Array), a CPLD (Complex Programmable Logic Device), and the like. Such a digital circuit may also have a memory that stores the program.
  • In a modification, the merging point Pm between the host trajectory TRh in the host lane Lh as the merging lane and the target trajectory TRt in the target lane Lt as the main line may be estimated.
  • In a modification, the host arrival time th as the minimum arrival time may be predicted on the assumption that the host vehicle 2 decelerates from the detection point Ph.
  • In a modification, the safety distance ds may be assumed on the premise that the host vehicle 2 decelerates from the detection point Ph.
  • In a modification, the distances dh and dt may be obtained by the blocks 110, 200, 220, or 240 in S102, and the magnitudes of the distances dh and dt may be compared by the blocks 110, 200, 220, or 240 in S103.
  • In this case, a priority may be set such that the host vehicle 2, corresponding to the smaller distance dh of the distances dh and dt, has priority over the target vehicle 3 in traveling at the merging point Pm.
  • Conversely, a priority may be set such that the target vehicle 3, corresponding to the smaller distance dt of the distances dh and dt, has priority over the host vehicle 2 in traveling at the merging point Pm.
  • In a modification, S105 by the blocks 130 and 260 may be omitted.
  • In a modification, S107 and S108 by the blocks 130 and 240 may be omitted.
  • In a modification, S109 by the blocks 130 and 260 may be omitted.
  • In such a modification, processing according to S105 by the blocks 130 and 260 may be performed instead.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

In order to execute processing related to the driving of a host mobile body traveling in a host lane in a merging scenario in which the host mobile body and a target mobile body traveling in a target lane merge, a processing method executed by a processor comprises: estimating a merging point of a host trajectory, which is the future travel trajectory of the host mobile body in the host lane, and a target trajectory, which is the future travel trajectory of the target mobile body in the target lane; predicting a shortest arrival time required to reach the merging point for each of the host mobile body and the target mobile body; and setting a right of way, by which traveling at the merging point is prioritized over that of the other mobile body, for the mobile body whose shortest arrival time is the shorter of the host mobile body and the target mobile body.
PCT/JP2022/018843 2021-05-21 2022-04-26 Procédé de traitement, système de traitement et programme de traitement WO2022244604A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023522587A JP7487844B2 (ja) 2021-05-21 2022-04-26 処理方法、処理システム、処理プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-086369 2021-05-21
JP2021086369 2021-05-21

Publications (1)

Publication Number Publication Date
WO2022244604A1 true WO2022244604A1 (fr) 2022-11-24

Family

ID=84140560

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/018843 WO2022244604A1 (fr) 2021-05-21 2022-04-26 Procédé de traitement, système de traitement et programme de traitement

Country Status (2)

Country Link
JP (1) JP7487844B2 (fr)
WO (1) WO2022244604A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002163771A (ja) * 2000-11-27 2002-06-07 Natl Inst For Land & Infrastructure Management Mlit 車両誘導制御システム
JP2009230377A (ja) * 2008-03-21 2009-10-08 Mazda Motor Corp 車両用運転支援装置
JP2015102893A (ja) * 2013-11-21 2015-06-04 日産自動車株式会社 合流支援システム


Also Published As

Publication number Publication date
JPWO2022244604A1 (fr) 2022-11-24
JP7487844B2 (ja) 2024-05-21

Similar Documents

Publication Publication Date Title
JP2019185211A (ja) 車両制御装置、車両制御方法、及びプログラム
JP7163729B2 (ja) 車両制御装置
JP2021020580A (ja) 車両制御装置、車両制御方法、およびプログラム
JP7207256B2 (ja) 車両制御システム
CN114270422A (zh) 车道变更路径指示装置及车道变更路径指示系统
CN111824137A (zh) 机动车和用于避免碰撞的方法
JP2018055321A (ja) 運転支援方法及び運転支援装置
US20230406355A1 (en) Processing method, processing system, and storage medium storing processing program
WO2022244604A1 (fr) Procédé de traitement, système de traitement et programme de traitement
US20210024060A1 (en) Driving assistance device
JP7435787B2 (ja) 経路確認装置および経路確認方法
JP7226238B2 (ja) 車両制御システム
JP2021160708A (ja) 提示制御装置、提示制御プログラム、自動走行制御システムおよび自動走行制御プログラム
JP7364111B2 (ja) 処理方法、処理システム、処理プログラム
JP7428272B2 (ja) 処理方法、処理システム、処理プログラム、処理装置
WO2022244605A1 (fr) Procédé de traitement, système de traitement et programme de traitement
WO2022202002A1 (fr) Procédé de traitement, système de traitement et programme de traitement
WO2022202001A1 (fr) Procédé de traitement, système de traitement et programme de traitement
JP7428273B2 (ja) 処理方法、処理システム、処理プログラム、記憶媒体、処理装置
WO2023189680A1 (fr) Procédé de traitement, système d'exploitation, dispositif de traitement et programme de traitement
KR102668692B1 (ko) 차량 제어 시스템 및 차량 제어 방법
US20230019934A1 (en) Presentation control apparatus
US20240038069A1 (en) Processing device, processing method, processing system, and storage medium
WO2021199964A1 (fr) Dispositif de commande de présentation, programme de commande de présentation, système de commande de conduite automatisée et procédé de commande de conduite automatisée
US20240036575A1 (en) Processing device, processing method, processing system, storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22804518

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023522587

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22804518

Country of ref document: EP

Kind code of ref document: A1