JP2015123899A - Vehicle surrounding-situation estimation apparatus - Google Patents

Vehicle surrounding-situation estimation apparatus

Info

Publication number
JP2015123899A
Authority
JP
Japan
Prior art keywords
collision
vehicle
situation
surrounding
sensor
Prior art date
Legal status
Granted
Application number
JP2013270337A
Other languages
Japanese (ja)
Other versions
JP6299208B2 (en)
Inventor
Van Quy Hung Nguyen (ヴァン クイ フン グェン)
Current Assignee
Toyota Motor Corp
Original Assignee
Toyota Motor Corp
Priority date
Filing date
Publication date
Application filed by Toyota Motor Corp filed Critical Toyota Motor Corp
Priority to JP2013270337A (granted as JP6299208B2)
Priority to PCT/IB2014/002743 (published as WO2015097511A1)
Priority to US15/107,012 (granted as US10479353B2)
Priority to DE112014006071.2T (published as DE112014006071T5)
Priority to CN201480070946.3A (granted as CN105848980B)
Publication of JP2015123899A
Application granted
Publication of JP6299208B2
Current legal status: Active


Classifications

    • B60W30/09: Taking automatic action to avoid collision, e.g. braking and steering
    • B60W50/0225: Failure correction strategy
    • B60W50/029: Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts
    • G01S13/87: Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/40: Means for monitoring or calibrating
    • B60W2030/082: Vehicle operation after collision
    • B60W2050/0215: Sensor drifts or sensor failures
    • B60W2520/14: Yaw
    • G01S2013/932: Anti-collision radar for land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • G01S2013/93271: Sensor installation details in the front of the vehicles
    • G01S2013/93272: Sensor installation details in the back of the vehicles
    • G01S2013/93274: Sensor installation details on the side of the vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Regulating Braking Force (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a vehicle surrounding-situation estimation apparatus capable of appropriately estimating the situation around a vehicle even when a collision causes a sensor abnormality that disables detection of the surrounding situation.

SOLUTION: A vehicle surrounding-situation estimation apparatus includes: collision detection means for detecting that the host vehicle has collided with an object outside the vehicle; surrounding-situation detection means for detecting the situation of a detection zone around the host vehicle; surrounding-situation estimation means for estimating, on the basis of the detection results of the surrounding-situation detection means obtained before the collision detection means detects a collision, the situation of a prediction zone around the host vehicle at the time of the collision; surrounding-situation record means for recording the situation of the prediction zone estimated by the surrounding-situation estimation means; and surrounding-situation prediction means for predicting, after the collision detection means detects the collision, the situation of a vehicle-surrounding zone around the host vehicle on the basis of the situation of the prediction zone recorded by the surrounding-situation record means before the collision was detected.

Description

The present invention relates to a vehicle surrounding-situation estimation apparatus.

Conventionally, techniques have been reported for detecting a primary accident (primary collision) between a vehicle and an obstacle outside the vehicle and then reducing the damage from a secondary accident (secondary collision) that may follow. For example, Patent Document 1 discloses a technique in which a sensor detects and analyzes the surrounding environment of the vehicle after a collision, and traveling control such as braking or steering is performed according to the result of comparing the surrounding environment with the behavior of the vehicle.

Patent Document 1: JP 2007-516127 A (published Japanese translation of a PCT application)

In the conventional technique (Patent Document 1 and the like), however, if an abnormality such as sensor damage occurs due to the primary accident, the surrounding environment of the vehicle can no longer be detected. As a result, it may become difficult in the conventional technique to estimate the situation around the vehicle from the detection results of the surrounding environment and to control the vehicle's traveling accordingly.

The present invention has been made in view of the above circumstances, and its object is to provide a vehicle surrounding-situation estimation apparatus that can suitably estimate the situation around a vehicle even when a collision causes a sensor abnormality that prevents the surrounding situation from being detected.

A vehicle surrounding-situation estimation apparatus according to the present invention includes: collision detection means for detecting that the host vehicle has collided with an object outside the vehicle; surrounding-situation detection means for detecting the situation of a detection region around the host vehicle; surrounding-situation estimation means for estimating, on the basis of the detection results of the surrounding-situation detection means obtained before the collision detection means detects a collision, the situation of a prediction region around the host vehicle at the time of the collision; surrounding-situation recording means for recording the situation of the prediction region estimated by the surrounding-situation estimation means; and surrounding-situation prediction means for predicting, after the collision detection means detects the collision, the situation of a vehicle surrounding region around the host vehicle on the basis of the situation of the prediction region recorded in the surrounding-situation recording means before the collision was detected.

Preferably, the vehicle surrounding-situation estimation apparatus further includes collision avoidance determination means for determining whether a collision between the host vehicle and an object outside the vehicle detected by the surrounding-situation detection means can be avoided. In this case, the surrounding-situation estimation means estimates the situation of the prediction region after the collision avoidance determination means determines that the collision cannot be avoided, and the surrounding-situation recording means records the situations of the prediction region estimated by the surrounding-situation estimation means during the period from the determination that the collision cannot be avoided until the collision detection means detects the collision.

Preferably, the surrounding-situation estimation apparatus further includes sensor abnormality determination means for determining, after the collision detection means detects a collision, whether any abnormality has occurred in the plurality of surrounding-situation detection means mounted on the host vehicle. In this case, after the collision is detected, the surrounding-situation prediction means preferably predicts the situation around the host vehicle as follows: for an abnormality recognition region within the vehicle surrounding region, corresponding to the pre-collision detection region of a surrounding-situation detection means judged abnormal by the sensor abnormality determination means, the situation is predicted on the basis of the situation of the prediction region recorded in the surrounding-situation recording means before the collision was detected; for a normality recognition region within the vehicle surrounding region, corresponding to the detection region of a surrounding-situation detection means judged normal, the situation is predicted on the basis of the detection results of that surrounding-situation detection means.

Preferably, the vehicle surrounding-situation estimation apparatus further includes traveling control means for performing, after the collision detection means detects a collision, traveling control that controls the behavior of the host vehicle on the basis of the situation of the vehicle surrounding region predicted by the surrounding-situation prediction means.

The vehicle surrounding-situation estimation apparatus according to the present invention achieves the effect of being able to suitably estimate the situation around the vehicle even when a collision causes a sensor abnormality that prevents the surrounding situation from being detected.

FIG. 1 is a diagram showing the configuration of a vehicle surrounding-situation estimation apparatus according to the present invention.
FIG. 2 is a diagram showing an example of the detection regions of a plurality of surrounding-environment recognition sensors mounted on a vehicle.
FIG. 3 is a diagram showing an example of predicting the situation of the vehicle surrounding region around the host vehicle on the basis of the detection results of the surrounding-environment recognition sensors.
FIG. 4 is a diagram showing an example of a primary collision scene.
FIG. 5 is a diagram showing an example of estimating, on the basis of surrounding-environment information acquired before a collision is detected, the situation of a prediction region where a secondary collision around the host vehicle is expected at the time of the collision.
FIG. 6 is a diagram showing an example of a situation in which the degree of coincidence of surrounding-environment information is checked in an overlap region between sensors.
FIG. 7 is a diagram showing an example of a situation in which the degree of coincidence of surrounding-environment information is checked in an overlap region between sensors immediately before a collision.
FIG. 8 is a diagram showing an example of a situation judged normal in the sensor abnormality determination performed immediately after a collision.
FIG. 9 is a diagram showing an example of a situation judged abnormal in the sensor abnormality determination performed immediately after a collision.
FIG. 10 is a diagram explaining the outline of the process of predicting a secondary-collision occurrence region in the present embodiment.
FIG. 11 is a flowchart showing an example of the basic processing of the vehicle surrounding-situation estimation apparatus according to the present invention.
FIG. 12 is a flowchart showing an example of the surrounding-situation estimation process immediately before a collision.
FIG. 13 is a flowchart showing an example of the coincidence-degree recording process immediately before a collision.
FIG. 14 is a flowchart showing an example of the sensor abnormality determination process immediately after a collision.
FIG. 15 is a flowchart showing an example of the surrounding-situation prediction process according to the sensor states.
FIG. 16 is a diagram showing an example of a scene in which a secondary collision position with a moving object on the road is predicted.
FIG. 17 is a diagram showing an example of a scene in which a secondary collision position with a stationary object on the road is predicted.

DESCRIPTION OF EMBODIMENTS
An embodiment of the vehicle surrounding-situation estimation apparatus according to the present invention will be described below in detail with reference to the drawings. The present invention is not limited to this embodiment. The constituent elements of the embodiment below include elements that those skilled in the art could easily conceive of and elements that are substantially identical to them.

[Embodiment]
The configuration of the vehicle surrounding-situation estimation apparatus according to the present invention will be described with reference to FIGS. 1 to 10, which are as listed in the brief description of the drawings above.

In the present embodiment, the ECU 1 functions as a vehicle surrounding-situation estimation apparatus that estimates the situation around the vehicle on the basis of the detection results of surrounding-environment recognition sensors 3 mounted on the vehicle. The ECU 1 also functions as a sensor abnormality detection device that detects abnormalities of the surrounding-environment recognition sensors 3, and as a vehicle control device that performs driving support control for controlling the behavior of the vehicle. The ECU 1 is electrically connected to a vehicle motion quantity detection sensor 2, the surrounding-environment recognition sensors 3, and an actuator 4, and performs arithmetic processing on the basis of the various signals input from the vehicle motion quantity detection sensor 2 and the surrounding-environment recognition sensors 3. For example, the ECU 1 determines from these signals whether a collision has occurred and, taking into account abnormalities of the surrounding-environment recognition sensors 3 that the collision may have caused, estimates the situation around the vehicle immediately after the collision by using the situation around the vehicle estimated immediately before the collision. The ECU 1 likewise determines from the signals whether a collision has occurred and whether any abnormality has arisen in the surrounding-environment recognition sensors 3 as a result. Furthermore, the ECU 1 outputs control signals based on these processing results to the actuator 4 and performs driving support control that controls the behavior of the vehicle by operating the actuator 4.

The vehicle motion quantity detection sensor 2 is a device that detects various kinds of information indicating the motion state of the vehicle. In the present embodiment, the vehicle motion quantity detection sensor 2 comprises an acceleration sensor 2a, a yaw rate sensor 2b, and vehicle speed sensors 2c.

The acceleration sensor 2a is an acceleration detection device that detects the acceleration acting on the vehicle body and outputs an acceleration signal indicating the detected acceleration to the ECU 1.

The yaw rate sensor 2b is a yaw rate detection device that detects the yaw rate of the vehicle and outputs a yaw rate signal indicating the detected yaw rate to the ECU 1.

A vehicle speed sensor 2c is provided for each wheel and detects the rotational speed of that wheel. Each vehicle speed sensor 2c outputs a wheel speed signal indicating the detected wheel speed to the ECU 1. The ECU 1 calculates the vehicle speed, that is, the traveling speed of the vehicle, from the wheel speeds input from the vehicle speed sensors 2c; it may also calculate the vehicle speed from the wheel speed input from at least one of them.
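As a minimal sketch of the vehicle speed calculation just described, the following assumes the ECU simply averages the available wheel-speed signals; the function name and the averaging rule are illustrative assumptions, since the patent only states that the vehicle speed is computed from one or more wheel speeds.

```python
from typing import Sequence

def estimate_vehicle_speed(wheel_speeds_mps: Sequence[float]) -> float:
    """Estimate the vehicle speed [m/s] from per-wheel speeds [m/s].

    Averaging all available wheels is one straightforward choice; the ECU
    may equally use the speed of a single wheel, as the text notes.
    """
    if not wheel_speeds_mps:
        raise ValueError("at least one wheel-speed signal is required")
    return sum(wheel_speeds_mps) / len(wheel_speeds_mps)

# Example: four wheel-speed sensors reporting slightly different values.
print(estimate_vehicle_speed([16.6, 16.7, 16.5, 16.6]))  # ~16.6 m/s
```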

In this way, the vehicle motion quantity detection sensor 2 detects the acceleration from the acceleration sensor 2a, the yaw rate from the yaw rate sensor 2b, and the wheel speeds from the vehicle speed sensors 2c as the various kinds of information indicating the vehicle's motion, and outputs this information to the ECU 1.

The surrounding-environment recognition sensors 3 are devices that recognize the situation around the vehicle, such as moving objects and stationary obstacles; that is, they function as surrounding-situation detection means that detect the situation of detection regions around the host vehicle. Each surrounding-environment recognition sensor 3 is constituted by a radar, a camera, or the like. As surrounding-environment information, a sensor acquires, for example, the relative positions of guardrails, white lines, and other features on the road, the relative positions of stationary targets in the vicinity, and the relative positions, relative velocities, and relative accelerations of moving targets in the vicinity (for example, moving targets ahead of, behind, or beside the vehicle), and outputs this information to the ECU 1. In addition to the relative positions and relative velocities of recognition targets, a sensor may also acquire and output, as surrounding-environment information, attributes of surrounding obstacles such as the intensity (hardness), brightness, and color of the recognition target. For example, when a surrounding-environment recognition sensor 3 is constituted by a radar, the wavelength pattern of the reflected radar wave differs depending on whether the recognized object is hard or soft, and the sensor detects the intensity of the recognition target from this difference. The brightness and color of the recognition target are detected from differences in the wavelength pattern of the reflected radar wave when the sensor is a radar, and from differences in image contrast when the sensor is a camera.
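The surrounding-environment information described above can be pictured as one small record per recognized object. The sketch below is illustrative only; the patent does not define a data format, so all field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SurroundingEnvironmentInfo:
    """One recognized object, as a surrounding-environment recognition
    sensor might report it to the ECU (illustrative layout)."""
    rel_position: tuple       # (x, y) relative to the host vehicle [m]
    rel_velocity: tuple       # relative velocity [m/s]
    rel_acceleration: tuple   # relative acceleration [m/s^2]
    intensity: float          # hardness/softness inferred from the radar return
    brightness: float         # from radar wavelength pattern or camera contrast
    color: str                # available when the sensor is a camera
```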

In the present embodiment, a plurality of surrounding-environment recognition sensors 3 are mounted on the vehicle. For example, as shown in FIG. 1, the surrounding-environment recognition sensors 3 consist of sensor 1 as a first sensor, sensor 2 as a second sensor, and sensor 3 as a third sensor. The number of surrounding-environment recognition sensors mounted on the vehicle is not limited to three as in the example of FIG. 1; three or more sensors may be mounted on the vehicle.

Sensors 1 to 3 detect the situations of mutually different detection regions. For example, sensor 1 functions as a first sensor that detects the situation of a first region around the host vehicle. Sensor 2 functions as a second sensor that detects the situation of a second region around the host vehicle, which differs from the first region but overlaps part of it. Sensor 3 functions as a third sensor that detects the situation of a third region around the host vehicle, which differs from the first and second regions but overlaps part of the first region.

As an example, consider the case shown in FIG. 2, in which sensors 1 to 3 are attached to the front of the vehicle 10. In FIG. 2, sensor 1 detects the situation of a detection region covering the traveling direction of the vehicle 10 (the first region in FIG. 2). Sensor 2 detects the situation of a detection region covering from the right front to the right side of the vehicle (the second region), and sensor 3 a detection region covering from the left front to the left side (the third region). The first region detected by sensor 1 and the second region detected by sensor 2 partially overlap; this straddled area of sensors 1 and 2 is referred to as the first overlap region. Likewise, the first region detected by sensor 1 and the third region detected by sensor 3 partially overlap; this straddled area of the detection regions of sensors 1 and 3 is referred to as the second overlap region. The sensor mounting positions are not limited to the front as in the example of FIG. 2; sensors may also be mounted on the right side, the left side, the rear, and so on.

Returning to FIG. 1, the description of the configuration of the vehicle surrounding-situation estimation apparatus according to the present invention is continued. The ECU 1 includes at least a surrounding-environment information acquisition unit 1a, a surrounding-situation prediction unit 1b, a surrounding-situation estimation unit 1c, a surrounding-situation recording unit 1d, a coincidence-degree recording unit 1e, a collision avoidance determination unit 1f, a collision detection unit 1g, a sensor abnormality determination unit 1h, and a traveling control unit 1i.

Within the ECU 1, the surrounding-environment information acquisition unit 1a is surrounding-environment information acquisition means that receives and acquires the surrounding-environment information transmitted from the surrounding-environment recognition sensors 3, which indicates the situation around the vehicle such as moving objects and stationary obstacles. That is, the surrounding-environment information acquisition unit 1a acquires, as surrounding-environment information, the detection results of the surrounding-environment recognition sensors 3 serving as the surrounding-situation detection means. In the present embodiment, the surrounding-environment information acquisition unit 1a receives and acquires, for example, surrounding-environment information indicating the situations of the first, second, and third regions transmitted from the sensors 3a, 3b, and 3c mounted as the surrounding-environment recognition sensors 3.

The surrounding-situation prediction unit 1b is surrounding-situation prediction means that predicts the situation around the host vehicle on the basis of the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a, that is, the detection results of the surrounding-environment recognition sensors 3 serving as the surrounding-situation detection means. For example, the surrounding-situation prediction unit 1b predicts the situation of a vehicle surrounding region around the host vehicle. The vehicle surrounding region is a region around the current position of the host vehicle that differs from the detection regions of the surrounding-environment recognition sensors 3 and is set to a range that includes the detection regions of the plurality of sensors. For example, using surrounding-environment information indicating the relative positions, relative velocities, attributes, and so on of objects outside the vehicle detected within the detection regions of the plurality of surrounding-environment recognition sensors 3, the surrounding-situation prediction unit 1b predicts the situation of obstacles within the vehicle surrounding region that are at risk of colliding with the vehicle.

As an example, consider the case shown in FIG. 3, in which sensors 1 to 3 are attached to the front of the vehicle 10, sensor 4 to its right side, sensor 5 to its left side, and sensor 6 to its rear. In FIG. 3, sensors 1 to 6 detect the situations of mutually different detection regions: sensor 1 detects a first region on the traveling-direction side, sensor 2 a second region from the right front to the right side, sensor 3 a third region from the left front to the left side, sensor 4 a fourth region from the right rear to the right side, sensor 5 a fifth region from the left rear to the left side, and sensor 6 a sixth region on the side opposite the traveling direction. The surrounding-situation prediction unit 1b then predicts the situation of the vehicle surrounding region, set to a predetermined range around the current position of the vehicle 10, using the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a that indicates the situations within the first to sixth regions of sensors 1 to 6.

In the present embodiment, the surrounding-situation prediction unit 1b predicts the situation of the vehicle surrounding region by specifying the current positions of objects outside the vehicle on the basis of the surrounding-environment information transmitted from the surrounding-environment information acquisition unit 1a. For example, the surrounding-situation prediction unit 1b specifies the current positions of obstacles around the host vehicle from the detection results of the plurality of surrounding-environment recognition sensors 3; specifically, it does so from the relative position, relative velocity, relative acceleration, and so on between each obstacle and the host vehicle included in the surrounding-environment information.

Here, in addition to the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a, the surrounding-situation prediction unit 1b may further use the various kinds of information indicating the vehicle's motion transmitted from the vehicle motion quantity detection sensor 2 to specify the position and attitude of the host vehicle before predicting the situation around it.

Returning to FIG. 1, the description of the configuration is continued. Within the ECU 1, the surrounding-situation estimation unit 1c is surrounding-situation estimation means that estimates the situation of a prediction region around the host vehicle at the time of a collision, on the basis of the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a before the collision detection unit 1g detects the collision.

For example, as shown in FIG. 4, a scene in which the host vehicle 10 overtakes a preceding vehicle 30 in its own lane and suffers a primary collision with a vehicle 20 traveling in the opposite lane represents a situation in which a secondary collision between the vehicle 10 and the vehicle 30 is subsequently expected. In such a situation, as shown in the lower part of FIG. 5, after the collision avoidance determination unit 1f determines that a collision with the vehicle 20 cannot be avoided, the surrounding-situation estimation unit 1c estimates, as shown in the upper part of FIG. 5, the situation of the prediction region around the host vehicle at the moment of collision with the vehicle 20, on the basis of the detection results of the surrounding-environment recognition sensors 3 obtained before the collision detection unit 1g detects the collision (in FIG. 5, surrounding-environment information such as the relative positions and relative velocities of the vehicles 20 and 30). In the present embodiment, the prediction region is a region set to a predetermined range around the position where the vehicle 10 is predicted to have its primary collision with the object outside the vehicle (the vehicle 20 in FIG. 5) after a predetermined time has elapsed. The range of the prediction region may be the same as the range of the vehicle surrounding region, or may differ from it.

In the present embodiment, the surrounding-situation estimation unit 1c estimates the situation of the prediction region by predicting the positions of objects outside the vehicle after the predetermined time has elapsed, on the basis of the surrounding-environment information transmitted from the surrounding-environment information acquisition unit 1a. The predetermined time is set on the basis of the time to collision calculated by the collision avoidance determination unit 1f. For example, the surrounding-situation estimation unit 1c predicts the positions of obstacles around the host vehicle after the predetermined time from the detection results of the plurality of surrounding-environment recognition sensors 3: from the relative position, relative velocity, relative acceleration, and so on between each obstacle and the host vehicle included in the surrounding-environment information, it predicts the position to which the obstacle will have moved after the predetermined time. The predetermined time is set, for example, to the time it takes for the vehicle 10 to collide with the obstacle.
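As a worked sketch of this position prediction, the following extrapolates an obstacle's relative position over the predetermined time under a constant-relative-acceleration assumption; the function name and the two-dimensional layout are illustrative, not taken from the patent.

```python
def predict_relative_position(pos, vel, acc, t):
    """Predict an obstacle's relative position after t seconds,
    e.g. t = the time to collision computed by the avoidance unit.

    Applies p(t) = p0 + v*t + 0.5*a*t^2 per axis, with pos/vel/acc
    given as (x, y) tuples in [m], [m/s], and [m/s^2].
    """
    return tuple(p + v * t + 0.5 * a * t * t
                 for p, v, a in zip(pos, vel, acc))

# Example: an obstacle 30 m ahead, closing at 10 m/s, predicted 2 s ahead.
print(predict_relative_position((30.0, 0.0), (-10.0, 0.0), (0.0, 0.0), 2.0))
# -> (10.0, 0.0)
```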

Here, in addition to the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a, the surrounding-situation estimation unit 1c may further use the various kinds of information indicating the vehicle's motion transmitted from the vehicle motion quantity detection sensor 2 to predict the position and attitude of the host vehicle at the time of the collision, and then estimate the situation of the prediction region around the host vehicle where a secondary collision is expected.

Returning to FIG. 1, the description of the configuration is continued. Within the ECU 1, the surrounding-situation recording unit 1d is surrounding-situation recording means that records the situation of the prediction region around the host vehicle at the time of the collision, as estimated by the surrounding-situation estimation unit 1c. In the present embodiment, the surrounding-situation recording unit 1d records the situations of the prediction region estimated by the surrounding-situation estimation unit 1c from the time the collision avoidance determination unit 1f determines that the collision cannot be avoided until the collision detection unit 1g detects the collision. For example, the surrounding-situation recording unit 1d transfers each estimated situation of the prediction region, associated with its estimation time, into the memory of the ECU 1.
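A minimal sketch of this recording scheme, assuming a simple in-memory store keyed by estimation time; the class and method names are illustrative, and a plain dict stands in for the ECU memory.

```python
import time
from typing import Any, Dict, Optional

class SurroundingSituationRecorder:
    """Illustrative stand-in for the surrounding-situation recording unit 1d:
    predicted-region situations are kept in memory keyed by estimation time."""

    def __init__(self) -> None:
        self._records: Dict[float, Any] = {}

    def record(self, predicted_situation: Any,
               timestamp: Optional[float] = None) -> None:
        # Associate each estimated prediction-region situation with its time.
        key = time.time() if timestamp is None else timestamp
        self._records[key] = predicted_situation

    def latest_before(self, t: float) -> Optional[Any]:
        # After a collision detected at time t, retrieve the most recent
        # pre-collision record for the surrounding-situation prediction.
        keys = [k for k in self._records if k <= t]
        return self._records[max(keys)] if keys else None

# Example: record during the pre-collision window, look up after the collision.
rec = SurroundingSituationRecorder()
rec.record({"obstacle": "vehicle 30", "rel_pos": (10.0, 0.0)}, timestamp=100.0)
print(rec.latest_before(101.0))  # the last pre-collision prediction
```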

The coincidence-degree recording unit 1e is coincidence-degree recording means that calculates and records the degree of coincidence of the surrounding-environment information in the overlap regions of the surrounding-environment recognition sensors 3, on the basis of the surrounding-environment information acquired by the surrounding-environment information acquisition unit 1a.

As an example, as shown in FIG. 6, the coincidence-degree recording unit 1e calculates and records the degree of coincidence of the surrounding-environment information in the straddled areas between sensors. In FIG. 6, the coincidence-degree recording unit 1e calculates and records the degree of coincidence for the surrounding-environment information about a wall on the right of the vehicle 10 (at position (i) in FIG. 6) within the overlap region of sensors 1 and 2 (the first overlap region in FIG. 6).

In such a case, for example, the coincidence-degree recording unit 1e receives from the surrounding-environment information acquisition unit 1a, as the surrounding-environment information of sensor 1, information detected in the first overlap region within the first region, including the relative position of the wall to be recognized, the intensity indicating the hardness or softness of the wall itself, the brightness of the wall, the color of the wall, and so on. It likewise receives, as the surrounding-environment information of sensor 2, the relative position, intensity, brightness, color, and so on of the wall detected in the first overlap region within the second region. The coincidence-degree recording unit 1e then compares the surrounding-environment information of sensor 1 with that of sensor 2 for each parameter (in FIG. 6, the relative position of the wall and its intensity, brightness, and color). If a compared parameter is identical between sensors 1 and 2, or differs between them but within a predetermined threshold range, the unit determines that the degree of coincidence is high. For example, when comparing the relative position of the wall detected by sensor 1 with that detected by sensor 2, the coincidence-degree recording unit 1e sets the mounting position of either sensor, or a predetermined position on the vehicle, as a reference position, computes the position of the wall relative to that reference position for each sensor, and compares the computed relative positions to judge the coincidence. For information on the attributes of the wall (for example, its intensity, brightness, and color), the coincidence is judged by comparing the conditions detected by sensors 1 and 2.

Conversely, if a compared parameter (for example, the relative position of the wall detected by sensor 1 versus that detected by sensor 2) differs between the two sensors beyond the predetermined threshold range, the coincidence-degree recording unit 1e determines that the degree of coincidence is low. It also determines that the degree of coincidence of the surrounding-environment information is low when no overlap region exists between sensors 1 and 2. The unit performs this high/low judgment between sensors 1 and 2 for each compared parameter (for example, for the intensity, brightness, and color of the wall), and calculates the degree of coincidence of the surrounding-environment information from the per-parameter judgments. The degree of coincidence may be obtained, for example, by converting the per-parameter judgments into points and totaling them.
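A minimal sketch of this per-parameter scoring, assuming one point per parameter whose two readings agree within a per-parameter tolerance; the tolerance values and the scoring rule are illustrative assumptions.

```python
def coincidence_degree(obs_a: dict, obs_b: dict, tolerances: dict) -> int:
    """Total the per-parameter agreement between two sensors' observations
    of the same object in their overlap region."""
    score = 0
    for param, tol in tolerances.items():
        if param in obs_a and param in obs_b:
            if abs(obs_a[param] - obs_b[param]) <= tol:
                score += 1  # readings agree: high coincidence for this parameter
    return score

# Example: sensor 1 and sensor 2 both observe a wall in the first overlap region.
sensor1_wall = {"rel_pos": 4.9, "intensity": 0.82, "brightness": 0.55}
sensor2_wall = {"rel_pos": 5.1, "intensity": 0.80, "brightness": 0.70}
tolerances = {"rel_pos": 0.5, "intensity": 0.05, "brightness": 0.10}
print(coincidence_degree(sensor1_wall, sensor2_wall, tolerances))  # 2 of 3 agree
```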

In the example of FIG. 6, for convenience of explanation, only the calculation of the degree of coincidence for the wall detected in the first overlap region, the straddled area of sensors 1 and 2, has been described. In the present embodiment, however, the coincidence-degree recording unit 1e calculates the degree of coincidence of the surrounding-environment information for every pair of sensors that share an overlap region. For example, in addition to the calculation for the wall detected in the first overlap region, it compares, parameter by parameter, the surrounding-environment information about the recognition target detected by sensor 1 in the second overlap region within the first region with that detected by sensor 3 in the second overlap region within the third region, and calculates the degree of coincidence. The coincidence-degree recording unit 1e then stores each calculated degree of coincidence, associated with its calculation time, in the memory of the ECU 1.

In the present embodiment, the coincidence degree recording unit 1e calculates and records the degree of coincidence of the surrounding environment information at predetermined timings. For example, the coincidence degree recording unit 1e calculates and records the degree of coincidence of the surrounding environment information in the overlapping regions of the surrounding environment recognition sensors 3 at the timing immediately before a collision (that is, when the collision avoidance determination unit 1f determines that the collision cannot be avoided) and at the timing immediately after a collision (that is, when the collision detection unit 1g detects the collision).

Returning to FIG. 1, the description of the configuration of the vehicle surrounding-situation estimation apparatus according to the present invention is continued. In the ECU 1, the collision avoidance determination unit 1f is collision avoidance determination means that determines whether a collision between the vehicle 10 and an object outside the vehicle can be avoided, based on the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2 and the surrounding environment information transmitted from the surrounding environment information acquisition unit 1a. For example, the collision avoidance determination unit 1f calculates the time until the vehicle 10 collides with an object outside the vehicle (the so-called predicted time to collision, Time-To-Collision (TTC)) based on the relative position and relative speed between the object and the vehicle 10 indicated by the surrounding environment information, and on the vehicle speed, acceleration, and the like of the vehicle 10 included in the vehicle momentum information. The collision avoidance determination unit 1f then determines that the collision can be avoided if the calculated TTC is equal to or greater than a predetermined threshold, and that it cannot be avoided if the calculated TTC is less than the predetermined threshold.
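As a rough illustration of this determination, a TTC check under a constant closing speed might look like the following sketch; the closing-speed model and the threshold value are assumptions, since the embodiment does not fix a specific formula.

```python
# Hypothetical sketch of the TTC-based collision avoidance determination.
# A constant closing speed is assumed; the threshold value is illustrative.

TTC_THRESHOLD_S = 1.5  # [s] assumed decision threshold

def time_to_collision(relative_distance_m, closing_speed_mps):
    """Predicted time to collision under a constant closing speed."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # not closing in on the object
    return relative_distance_m / closing_speed_mps

def collision_avoidable(relative_distance_m, closing_speed_mps):
    """Avoidable if TTC >= threshold, unavoidable otherwise."""
    return time_to_collision(relative_distance_m, closing_speed_mps) >= TTC_THRESHOLD_S

print(collision_avoidable(30.0, 10.0))  # TTC = 3.0 s -> True (avoidable)
print(collision_avoidable(10.0, 10.0))  # TTC = 1.0 s -> False (unavoidable)
```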

The collision detection unit 1g is collision detection means that detects that the host vehicle has collided with an object outside the vehicle, based on the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2 and the surrounding environment information transmitted from the surrounding environment information acquisition unit 1a. For example, the collision detection unit 1g detects a collision between the collision target and the vehicle 10 based on the relative position between the collision target indicated by the surrounding environment information and the vehicle 10, and on changes in the acceleration, yaw rate, and the like of the vehicle 10 included in the vehicle momentum information.

The sensor abnormality determination unit 1h is sensor abnormality determination means that determines, after the collision detection unit 1g detects a collision, whether any of the plural surrounding environment recognition sensors 3 mounted on the host vehicle is abnormal. In the present embodiment, the sensor abnormality determination unit 1h determines the presence or absence of an abnormality in a first sensor, which detects the situation of a first region around the host vehicle, and a second sensor, which detects the situation of a second region around the host vehicle that differs from the first region but partially overlaps it. After the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that the first and second sensors are normal in a situation where the overlapping region in which the first and second regions partially overlap still exists, and determines that at least one of the first and second sensors is abnormal in a situation where the first and second regions no longer overlap in the overlapping region. Specifically, after the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that at least one of the first and second sensors is abnormal when the first and second sensors do not detect the same situation in the overlapping region.

As an example, a process in which the sensor abnormality determination unit 1h determines the presence or absence of a sensor abnormality will be described with reference to FIGS. 7 to 9, taking sensor 1 as the first sensor, sensor 2 as the second sensor, and the first overlapping region as the overlapping region in which the first and second regions partially overlap.

FIG. 7 shows a situation in which the vehicle 10, as the host vehicle, cannot avoid a collision with another vehicle 20, a moving object around the vehicle. In the situation shown in FIG. 7, the collision avoidance determination unit 1f of the ECU 1 first calculates the time to collision (TTC) between the vehicle 20 and the vehicle 10 based on the relative position and relative speed between the vehicle 20 and the vehicle 10 indicated by the surrounding environment information, and on the vehicle speed, acceleration, and the like of the vehicle 10 included in the vehicle momentum information. Because the calculated TTC is less than the predetermined threshold, the collision avoidance determination unit 1f determines that the collision cannot be avoided. Subsequently, when the collision avoidance determination unit 1f determines that the collision cannot be avoided (that is, at the timing immediately before the collision), the coincidence degree recording unit 1e of the ECU 1 calculates and records the degree of coincidence of the surrounding environment information in the overlapping region of the surrounding environment recognition sensors 3. Specifically, in the example of FIG. 7, the coincidence degree recording unit 1e receives from the surrounding environment information acquisition unit 1a, as the surrounding environment information of sensor 1, information including the relative position of the vehicle 20 detected in the first overlapping region within the first region, as well as the strength, brightness, color, and the like of the vehicle 20. It likewise receives, as the surrounding environment information of sensor 2, information including the relative position, strength, brightness, color, and the like of the vehicle 20 detected in the first overlapping region within the second region. The coincidence degree recording unit 1e then compares the surrounding environment information of sensor 1 with that of sensor 2 for each parameter (in FIG. 7, the relative position to the vehicle 20, strength, brightness, and color), calculates the degree of coincidence, and transmits the calculated degree of coincidence to the memory of the ECU 1, recording it in association with the calculation time. In the example of FIG. 7, the relative position, strength, brightness, and color of the vehicle 20 detected by sensor 1 in the first overlapping region are approximately the same as those detected by sensor 2 in the first overlapping region, so the coincidence degree recording unit 1e records that the degree of coincidence of the surrounding environment information on the vehicle 20 detected in the first overlapping region, the straddling region between sensor 1 and sensor 2, is high.

FIG. 8 shows a situation immediately after the vehicle 10, as the host vehicle, has collided with another vehicle 20, a moving object around the vehicle, in which the sensors still operate normally despite the collision. In the situation shown in FIG. 8, the collision detection unit 1g of the ECU 1 first detects the collision between the vehicle 20 and the vehicle 10 based on the relative position between the vehicle 20 and the vehicle 10 indicated by the surrounding environment information and on changes in the acceleration, yaw rate, and the like of the vehicle 10 included in the vehicle momentum information. Then, when the collision detection unit 1g detects the collision (that is, at the timing immediately after the collision), the coincidence degree recording unit 1e of the ECU 1 calculates and records the degree of coincidence of the surrounding environment information in the overlapping region of the surrounding environment recognition sensors 3. Specifically, in the example of FIG. 8, the coincidence degree recording unit 1e calculates and records the degree of coincidence between the surrounding environment information on the vehicle 20 acquired in the first overlapping region within the first region of sensor 1 and that acquired in the first overlapping region within the second region of sensor 2. In the example of FIG. 8, the relative position, strength, brightness, and color of the vehicle 20 detected by sensor 1 in the first overlapping region are approximately the same as those detected by sensor 2 in the first overlapping region, so the coincidence degree recording unit 1e records that the degree of coincidence of the surrounding environment information on the vehicle 20 detected in the first overlapping region, the straddling region between sensor 1 and sensor 2, is high.

Subsequently, the sensor abnormality determination unit 1h of the ECU 1 loads from the memory of the ECU 1 the degree of coincidence of the surrounding environment information recorded by the coincidence degree recording unit 1e at the timing immediately before the collision, as shown in the example of FIG. 7. The sensor abnormality determination unit 1h then compares the loaded degree of coincidence of the surrounding environment information on the vehicle 20 at the timing immediately before the collision with the degree of coincidence recorded by the coincidence degree recording unit 1e at the timing immediately after the collision, as shown in the example of FIG. 8. When the comparison shows that the degree of coincidence of the surrounding environment information is high at the timing immediately before the collision (FIG. 7) and also high at the timing immediately after it (FIG. 8), the two degrees of coincidence are comparable, so the sensor abnormality determination unit 1h determines that no abnormality occurred in sensor 1 or sensor 2 across the collision. This is because the collision is considered to have caused no abnormality such as an axis misalignment in either sensor, and neither the first region covered by sensor 1 nor the second region covered by sensor 2 changed as a result of the collision. In this way, after the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that the first and second sensors are normal in a situation where the overlapping region in which the first and second regions partially overlap still exists. Specifically, after the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that the first and second sensors are normal when the first and second sensors detect the same situation in the overlapping region.

FIG. 9 shows a situation immediately after the vehicle 10, as the host vehicle, has collided with another vehicle 20, a moving object around the vehicle, in which a sensor has become abnormal because of the collision. In the situation shown in FIG. 9, the collision detection unit 1g of the ECU 1 first detects the collision between the vehicle 20 and the vehicle 10 based on the relative position between the vehicle 20 and the vehicle 10 indicated by the surrounding environment information and on changes in the acceleration, yaw rate, and the like of the vehicle 10 included in the vehicle momentum information. Then, when the collision detection unit 1g detects the collision (that is, at the timing immediately after the collision), the coincidence degree recording unit 1e of the ECU 1 performs the process of calculating and recording the degree of coincidence of the surrounding environment information in the overlapping region of the surrounding environment recognition sensors 3. In the example of FIG. 9, however, the collision with the vehicle 20 has caused an abnormality such as an axis misalignment in sensor 2 mounted on the vehicle 10 and changed the second region, so the first overlapping region in which the first region covered by sensor 1 and the second region covered by sensor 2 partially overlap no longer exists. Therefore, in calculating the degree of coincidence of the surrounding environment information in the overlapping region of the surrounding environment recognition sensors 3, the coincidence degree recording unit 1e records that the degree of coincidence is low because no overlapping region exists.

Subsequently, the sensor abnormality determination unit 1h of the ECU 1 loads from the memory of the ECU 1 the degree of coincidence of the surrounding environment information recorded by the coincidence degree recording unit 1e at the timing immediately before the collision, as shown in the example of FIG. 7. The sensor abnormality determination unit 1h then compares the loaded degree of coincidence of the surrounding environment information on the vehicle 20 at the timing immediately before the collision with the degree of coincidence recorded by the coincidence degree recording unit 1e at the timing immediately after the collision, as shown in the example of FIG. 9. When the comparison shows that the degree of coincidence of the surrounding environment information is high at the timing immediately before the collision (FIG. 7) but low at the timing immediately after it (FIG. 9), the two degrees of coincidence are not comparable, so the sensor abnormality determination unit 1h determines that an abnormality occurred in at least one of sensor 1 and sensor 2 across the collision. This is because the collision is considered to have caused an abnormality such as an axis misalignment in at least one of the sensors, changing either the first region covered by sensor 1 or the second region covered by sensor 2. In this way, after the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that at least one of the first and second sensors is abnormal in a situation where the first and second regions no longer overlap in the overlapping region. Specifically, after the collision detection unit 1g detects a collision, the sensor abnormality determination unit 1h determines that at least one of the first and second sensors is abnormal when the first and second sensors do not detect the same situation in the overlapping region.
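The pre/post-collision comparison described above reduces to a small decision rule. The sketch below assumes the recorded coincidence values have already been reduced to a boolean "high" flag, which is an illustrative simplification of the recorded scores.

```python
# Hypothetical sketch of the pre/post-collision abnormality determination.
# The coincidence records are simplified to booleans ("high" coincidence or not).

def sensors_abnormal(coincidence_high_before, coincidence_high_after):
    """Return True if at least one sensor of the pair is judged abnormal.

    Normal: coincidence was high immediately before AND immediately after the
    collision (the overlapping region survived and both sensors agree).
    Abnormal: the two degrees are not comparable, i.e. high before, low after.
    """
    return coincidence_high_before and not coincidence_high_after

print(sensors_abnormal(True, True))   # FIG. 8 case -> False (both sensors normal)
print(sensors_abnormal(True, False))  # FIG. 9 case -> True (at least one abnormal)
```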

Here, the sensor abnormality determination unit 1h may further compare the degree of coincidence of the surrounding environment information before and after the collision for the second overlapping region as well, in which the first region covered by sensor 1 and the third region covered by sensor 3 partially overlap. This allows the sensor abnormality determination unit 1h to determine which of sensor 1 and sensor 2 is abnormal, based on the comparison result for the second overlapping region. In the example of FIG. 9, the sensor abnormality determination unit 1h determines that at least one of the first and second sensors is abnormal because the first and second regions no longer overlap in the overlapping region, that is, because the first and second sensors do not detect the same situation in the overlapping region. At this point, it is unknown whether the abnormality lies in the first sensor or in the second sensor. Therefore, if the comparison of the degree of coincidence of the surrounding environment information in the second overlapping region before and after the collision shows comparable values, the sensor abnormality determination unit 1h determines that the first sensor is normal and the abnormality lies in the second sensor. Conversely, if the comparison does not show comparable values, it determines that either the second sensor is normal and the abnormality lies in the first sensor, or that both the first and second sensors are abnormal.
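A minimal sketch of this isolation logic, again assuming boolean "comparable" flags for the two overlapping regions, is shown below.

```python
# Hypothetical sketch of isolating the faulty sensor using a second overlap.
# pair_1_2_comparable: pre/post coincidence comparable in the sensor 1 / sensor 2 overlap
# pair_1_3_comparable: pre/post coincidence comparable in the sensor 1 / sensor 3 overlap

def diagnose(pair_1_2_comparable, pair_1_3_comparable):
    if pair_1_2_comparable:
        return "sensor 1 and sensor 2 normal"
    if pair_1_3_comparable:
        # Sensor 1 still agrees with sensor 3, so the fault must lie in sensor 2.
        return "sensor 2 abnormal"
    # Sensor 1 disagrees with both neighbours: sensor 1 abnormal,
    # possibly together with sensor 2.
    return "sensor 1 abnormal, or both sensor 1 and sensor 2 abnormal"

print(diagnose(False, True))   # -> "sensor 2 abnormal"
print(diagnose(False, False))  # -> "sensor 1 abnormal, or both ..."
```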

Returning to FIG. 1, the description of the configuration of the vehicle surrounding-situation estimation apparatus according to the present invention is continued. In the ECU 1, the travel control unit 1i is travel control means that performs travel control of the behavior of the vehicle 10 based on the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2 and on the situation around the host vehicle predicted by the surrounding situation prediction unit 1b. For example, the travel control unit 1i calculates a travel trajectory, travel speed, and the like with which the vehicle 10 can avoid an obstacle, based on the vehicle speed and acceleration of the vehicle 10 included in the vehicle momentum information, on the various information indicating the region in which the vehicle 10 can travel given the predicted situation around the host vehicle (for example, the situation within the vehicle surrounding region), and on the position of the obstacle to be avoided. The travel control unit 1i then outputs a control signal based on this calculation result to the actuator 4 and executes the travel control by operating the actuator 4. For example, the travel control unit 1i controls the steering angle of the steered wheels of the vehicle 10 via an actuator 4 such as the EPS, thereby performing steering assistance so that the vehicle 10 avoids the obstacle. The travel control unit 1i may combine brake assistance with the steering assistance so that the obstacle can be avoided more reliably. In this way, the travel control unit 1i functions as travel control means that prevents the vehicle 10 from moving to the position of the obstacle.

Here, when the collision avoidance determination unit 1f determines that a collision with the obstacle can be avoided, the collision can be avoided by the travel control unit 1i performing the travel control described above. However, when the collision avoidance determination unit 1f determines that a collision with the obstacle cannot be avoided, the primary collision may be unavoidable even with the processing of the travel control unit 1i. Even in such a case, it is desirable for safety to control the vehicle 10 immediately after the primary collision and move it to a safe location so as to minimize the impact of a secondary collision that may occur next. The travel control unit 1i can also execute driving support control for avoiding a secondary collision by continuing the travel control after the primary collision; in this case, however, the possibility that the primary collision has caused an abnormality in a surrounding environment recognition sensor 3 must also be taken into account.

Therefore, in the present embodiment, the situation around the vehicle immediately after the collision is predicted based on the situation, estimated immediately before the collision, of the region in which a secondary collision may occur, and travel control for avoiding the secondary collision is executed. Furthermore, in the present embodiment, the travel control for avoiding the secondary collision is performed after predicting the situation around the vehicle immediately after the collision in accordance with the sensor states (normal or abnormal) of the surrounding environment recognition sensors 3.

Specifically, in the present embodiment, after the collision detection unit 1g detects a collision, the surrounding situation prediction unit 1b predicts the situation around the host vehicle based on the situation of the prediction region recorded in the surrounding situation recording unit 1d before the collision detection unit 1g detected the collision. After the collision detection unit 1g detects a collision, the surrounding situation prediction unit 1b predicts the situation of an abnormality recognition region, that is, the part of the vehicle surrounding region corresponding to the pre-collision detection region of a surrounding environment recognition sensor 3 (surrounding situation detection means) determined to be abnormal by the sensor abnormality determination unit 1h, based on the situation of the prediction region recorded by the surrounding situation recording unit 1d before the collision was detected. For a normal recognition region, that is, the part of the vehicle surrounding region corresponding to the detection region of a surrounding environment recognition sensor 3 determined to be normal by the sensor abnormality determination unit 1h, the surrounding situation prediction unit 1b predicts the situation based on the detection results of that surrounding environment recognition sensor 3. Then, after the collision detection unit 1g detects the collision, the travel control unit 1i performs travel control for avoiding the occurrence of a secondary collision around the host vehicle based on the situation of the vehicle surrounding region predicted by the surrounding situation prediction unit 1b.

The outline of the process of predicting the secondary collision occurrence region in the present embodiment will be described with reference to FIG. 10. In FIG. 10, as the primary collision timing, a situation is assumed in which, as shown in FIG. 4 described above, the vehicle 10 as the host vehicle overtakes another vehicle 30 preceding it in the same travel lane and collides (primary collision) with another vehicle 20 traveling in the opposite lane.

In FIG. 10, at the time point M seconds immediately before the collision (that is, when the collision avoidance determination unit 1f determines that the primary collision with the vehicle 20 cannot be avoided), the surrounding situation estimation unit 1c estimates the region in which surrounding obstacles (in FIG. 10, the vehicle 30 with which a secondary collision may occur) will exist N seconds later. Specifically, as shown in FIG. 5 described above, after the collision avoidance determination unit 1f determines that the collision with the vehicle 20 cannot be avoided and before the collision detection unit 1g detects the collision, the surrounding situation estimation unit 1c estimates, based on the detection results of the surrounding environment recognition sensors 3, the situation of the prediction region in which a secondary collision with the vehicle 30 around the host vehicle is predicted to occur upon the collision with the vehicle 20. At this timing, the surrounding situation recording unit 1d transmits the situation of the prediction region estimated by the surrounding situation estimation unit 1c to the memory of the ECU 1 and records it.

In FIG. 10, at the time point N seconds immediately after the collision (that is, when the collision detection unit 1g detects the primary collision with the vehicle 20), the surrounding situation prediction unit 1b predicts the situation of the vehicle surrounding region around the host vehicle. Specifically, immediately after the collision, the surrounding situation prediction unit 1b identifies the position and inclination of the host vehicle by using, in addition to the surrounding environment information acquired by the surrounding environment information acquisition unit 1a, the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2. The surrounding situation prediction unit 1b then determines the collision position by applying the identified position and inclination of the host vehicle to the prediction region recorded in the memory of the ECU 1 at the time point M seconds immediately before the collision (primary prediction). In this way, when it is determined that the primary collision cannot be avoided, the primary prediction processing of the surrounding situation prediction unit 1b can predict the region in which a secondary collision will occur by utilizing the ECU memory information recorded immediately before the collision.
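One plausible reading of this primary prediction is a rigid-body transform that re-expresses the recorded prediction region in the post-collision vehicle frame using the identified position and inclination. The sketch below is an assumption about the geometry, since the embodiment does not give explicit equations.

```python
# Hypothetical sketch of the primary prediction: re-expressing the prediction
# region recorded before the collision in the post-collision vehicle frame.
# The rigid 2-D transform is an assumed model, not taken from the embodiment.
import math

def to_post_collision_frame(points, dx, dy, dyaw):
    """Transform obstacle points recorded in the pre-collision vehicle frame.

    points: [(x, y), ...] recorded prediction region (pre-collision frame)
    dx, dy: translation of the host vehicle during the collision [m]
    dyaw:   change in heading during the collision [rad]
    """
    cos_y, sin_y = math.cos(-dyaw), math.sin(-dyaw)
    transformed = []
    for x, y in points:
        # Shift into the new origin, then rotate by the heading change.
        px, py = x - dx, y - dy
        transformed.append((px * cos_y - py * sin_y, px * sin_y + py * cos_y))
    return transformed

# Example: the recorded region is re-evaluated after the vehicle was pushed
# 1 m sideways and rotated 10 degrees by the primary collision.
region = [(20.0, -2.0), (22.0, -2.0), (22.0, 2.0), (20.0, 2.0)]
print(to_post_collision_frame(region, dx=0.0, dy=1.0, dyaw=math.radians(10.0)))
```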

Furthermore, at the time point N seconds immediately after the collision, the sensor abnormality determination unit 1h determines, for each of the surrounding environment recognition sensors 3 mounted on the vehicle 10, whether an abnormality has occurred because of the primary collision. Based on the determination results of the sensor abnormality determination unit 1h, the surrounding situation prediction unit 1b predicts that a surrounding environment recognition sensor 3 determined to be abnormal (in FIG. 10, sensor 2, which covers the detection region on the right front of the vehicle) cannot correctly recognize the region near the collision position at the time point N seconds immediately after the collision. Therefore, after the collision detection unit 1g detects the collision, the surrounding situation prediction unit 1b predicts the situation of the abnormality recognition region, that is, the part of the vehicle surrounding region corresponding to the pre-collision detection region of the surrounding environment recognition sensor 3 determined to be abnormal by the sensor abnormality determination unit 1h (sensor 2 in FIG. 10), based on the situation of the prediction region recorded by the surrounding situation recording unit 1d at the time point M seconds immediately before the collision. That is, in FIG. 10, the situation of the abnormality recognition region is predicted based on the situation of the prediction region recorded in the ECU memory that corresponds to the range of the abnormality recognition region (in FIG. 10, the part restored from the ECU memory).

Based on the determination results of the sensor abnormality determination unit 1h, the surrounding situation prediction unit 1b also predicts that the surrounding environment recognition sensors 3 determined to be normal (in FIG. 10, sensors 1 and 3 to 6, which cover the areas other than the detection region on the right front of the vehicle) can still recognize their detection regions correctly at the time point N seconds immediately after the collision. Therefore, for the normal recognition regions, that is, the parts of the vehicle surrounding region corresponding to the detection regions of the surrounding environment recognition sensors 3 determined to be normal by the sensor abnormality determination unit 1h (sensors 1 and 3 to 6 in FIG. 10), the surrounding situation prediction unit 1b predicts the situation based on the detection results of those sensors. The surrounding situation prediction unit 1b then combines the situation of the abnormality recognition region predicted from the ECU memory information recorded immediately before the collision with the situation of the normal recognition regions predicted from the surrounding environment information detected immediately after the collision, and uses the result as the predicted situation of the vehicle surrounding region at the time point N seconds immediately after the collision.

In this way, through the above processing as the secondary prediction of the surrounding situation prediction unit 1b, the situation around the host vehicle can be predicted based on the determination results indicating the presence or absence of sensor abnormalities: for the regions covered by sensors in which an abnormality occurred in the primary collision, the ECU memory information recorded immediately before the collision is utilized, while for the regions covered by sensors that remain normal after the primary collision, the detection results actually obtained at the timing immediately after the collision are used. The prediction accuracy for the region in which a secondary collision may occur can thereby be further improved.

Next, various processes executed by the vehicle surrounding-situation estimation apparatus described above will be described with reference to FIGS. 11 to 15. FIG. 11 is a flowchart showing an example of the basic processing of the vehicle surrounding-situation estimation apparatus according to the present invention. FIG. 12 is a flowchart showing an example of the surrounding situation estimation process immediately before a collision. FIG. 13 is a flowchart showing an example of the coincidence degree recording process immediately before a collision. FIG. 14 is a flowchart showing an example of the sensor abnormality determination process immediately after a collision. FIG. 15 is a flowchart showing an example of the surrounding situation prediction process according to the sensor states.

As shown in FIG. 11, the surrounding environment information acquisition unit 1a receives and acquires the surrounding environment information transmitted from the surrounding environment recognition sensors 3, which indicates the situation around the vehicle, such as moving objects and stationary obstacles around the vehicle (step S10).

The collision avoidance determination unit 1f determines whether a collision between the vehicle 10 and an object outside the vehicle can be avoided, based on the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2 and the surrounding environment information acquired by the surrounding environment information acquisition unit 1a in step S10 (step S20). In the present embodiment, the collision avoidance determination unit 1f calculates, for example, the time until the vehicle 10 collides with the object outside the vehicle (the so-called predicted time to collision, Time-To-Collision (TTC)) based on the relative position and relative speed between the object and the vehicle 10 indicated by the surrounding environment information, and on the vehicle speed, acceleration, and the like of the vehicle 10 included in the vehicle momentum information. The collision avoidance determination unit 1f then determines that the collision can be avoided if the calculated TTC is equal to or greater than the predetermined threshold, and that it cannot be avoided if the calculated TTC is less than the predetermined threshold.

When the collision avoidance determination unit 1f determines in step S20 that the collision cannot be avoided (step S20: Yes), the process proceeds to the next step S30; when it determines that the collision can be avoided (step S20: No), the process returns to step S10.

When the collision avoidance determination unit 1f determines in step S20 that the collision cannot be avoided (step S20: Yes), the surrounding situation estimation unit 1c and the surrounding situation recording unit 1d estimate and record the surrounding situation at the time of the collision (step S30). The surrounding situation estimation process immediately before the collision performed in step S30 of FIG. 11 will be described with reference to FIG. 12.

As shown in FIG. 12, the surrounding situation estimation unit 1c estimates the region in which obstacles on the road will exist N seconds later, the timing immediately after the collision (step S31). In step S31, as shown in FIGS. 5 and 10 described above, the surrounding situation estimation unit 1c estimates, based on the surrounding environment information acquired by the surrounding environment information acquisition unit 1a before the collision detection unit 1g detects the collision, the situation of the prediction region in which a secondary collision around the host vehicle is predicted to occur upon the collision. The surrounding situation recording unit 1d then transmits the situation of the prediction region estimated in step S31 to the memory of the ECU 1 and records it in association with the estimation time (step S32). Thereafter, the process proceeds to step S40 of FIG. 11.

Returning to FIG. 11, the coincidence degree recording unit 1e performs the process of calculating and recording the degree of coincidence of the surrounding environment information in the straddling regions between the sensors at the timing immediately before the collision (step S40). Here, the case in which the collision avoidance determination unit 1f determines that the collision cannot be avoided (step S20: Yes) is the case in which the time to collision TTC is determined to be less than the predetermined threshold, and therefore corresponds to the timing immediately before the collision. The coincidence degree recording process performed in step S40 of FIG. 11 will be described with reference to FIG. 13.

As shown in FIG. 13, the coincidence degree recording unit 1e calculates the degree of coincidence of the surrounding environment information, including the strength, brightness, color, relative position, and the like of the recognition target (for example, the obstacle with which the collision was determined to be unavoidable), in the straddling regions between the sensors (step S41). For example, as shown in FIG. 7 described above, when the collision avoidance determination unit 1f determines that the collision cannot be avoided (that is, at the timing immediately before the collision), the coincidence degree recording unit 1e calculates the degree of coincidence of the surrounding environment information in the overlapping regions of the surrounding environment recognition sensors 3. The coincidence degree recording unit 1e then transmits the degree of coincidence calculated in step S41 to the memory of the ECU 1 and records it in association with the calculation time (step S42). Thereafter, the process proceeds to step S50 of FIG. 11.

Returning to FIG. 11, in step S50, when the collision detection unit 1g detects a primary collision between the vehicle 10 and the obstacle (step S50: Yes), the process proceeds to the next step S60; when it does not detect a primary collision (step S50: No), the processing from step S30 is repeated until a primary collision is detected. By repeating the processing from step S30, the surrounding situation recording unit 1d records the situation of the prediction region estimated by the surrounding situation estimation unit 1c over the period from the time the collision avoidance determination unit 1f determines that the collision cannot be avoided until the time the collision detection unit 1g detects the collision.

When the collision detection unit 1g detects the primary collision in step S50 (step S50: Yes), the surrounding situation prediction unit 1b predicts the situation of the vehicle surrounding region around the host vehicle based on the situation of the prediction region recorded by the surrounding situation recording unit 1d before the collision detection unit 1g detected the primary collision (step S60). For example, as shown in FIG. 10 described above, as the primary prediction processing, the surrounding situation prediction unit 1b predicts the situation of the vehicle surrounding region around the host vehicle at the time point N seconds immediately after the collision (that is, when the collision detection unit 1g detects the primary collision with the vehicle 20). Specifically, immediately after the collision, the surrounding situation prediction unit 1b identifies the position and inclination of the host vehicle by using, in addition to the surrounding environment information acquired by the surrounding environment information acquisition unit 1a, the various information indicating the vehicle momentum transmitted from the vehicle momentum detection sensor 2. The surrounding situation prediction unit 1b then determines the collision position by applying the identified position and inclination of the host vehicle to the prediction region recorded in the memory of the ECU 1 at the time point M seconds immediately before the collision. Thereafter, the process proceeds to step S70.

After the collision detection unit 1g detects the collision in step S50, the sensor abnormality determination unit 1h determines whether any of the plural surrounding environment recognition sensors 3 mounted on the host vehicle is abnormal (step S70). In step S70, the sensor abnormality determination unit 1h determines, for example, the presence or absence of an abnormality in the first sensor, which detects the situation of the first region around the vehicle 10, and the second sensor, which detects the situation of the second region around the vehicle 10 that differs from the first region but partially overlaps it. The sensor abnormality determination process performed in step S70 of FIG. 11 will be described with reference to FIG. 14.

As shown in FIG. 14, when the collision detection unit 1g detects the collision in step S50 of FIG. 11 (step S50: Yes), the sensor abnormality determination unit 1h sets the recognition state of all the surrounding environment recognition sensors 3 mounted on the vehicle 10 to "abnormal" (step S71). Here, the case in which the collision detection unit 1g detects the collision (step S50: Yes) corresponds to the timing immediately after the collision. Therefore, in step S71 of FIG. 14, the coincidence degree recording unit 1e also performs the process of calculating and recording the degree of coincidence of the surrounding environment information in the straddling regions between the sensors at the timing immediately after the collision, as described above. Subsequently, the sensor abnormality determination unit 1h loads from the memory of the ECU 1 the degree of coincidence of the surrounding environment information recorded by the coincidence degree recording unit 1e at the timing immediately before the collision in step S42 of FIG. 13 (step S72). The sensor abnormality determination unit 1h then compares the loaded degree of coincidence of the surrounding environment information on the vehicle 20 at the timing immediately before the collision with the degree of coincidence recorded by the coincidence degree recording unit 1e at the timing immediately after the collision in step S71 (step S73).

Based on the comparison result of the processing in step S73, the sensor abnormality determination unit 1h determines whether the degree of coincidence in the overlapping region between the sensors immediately before the collision and that immediately after the collision are comparable (step S74). The determination processing in step S74 is performed for each pair of sensors having a straddling region.

When the sensor abnormality determination unit 1h determines in step S74 that the compared degrees of coincidence are comparable (step S74: Yes), it updates the recognition state of the corresponding surrounding environment recognition sensors 3 from "abnormal" to "normal". Thereafter, the process proceeds to step S80 of FIG. 11. On the other hand, when it determines in step S74 that the compared degrees of coincidence are not comparable (step S74: No), it leaves the recognition state of the corresponding surrounding environment recognition sensors 3 as "abnormal" without updating it to "normal". Thereafter, the process proceeds to step S80 of FIG. 11.

Returning to FIG. 11, the surrounding situation prediction unit 1b predicts the situation of the vehicle surrounding region around the host vehicle by correcting the primary prediction result of step S60 in accordance with the determination results of the sensor abnormality determination unit 1h (step S80). In step S80, after the collision detection unit 1g detects the collision, the surrounding situation prediction unit 1b predicts the situation of an abnormality recognition region, that is, the part of the vehicle surrounding region corresponding to the pre-collision detection region of a surrounding environment recognition sensor 3 (surrounding situation detection means) determined to be abnormal by the sensor abnormality determination unit 1h, based on the situation of the prediction region recorded by the surrounding situation recording unit 1d before the collision was detected. For a normal recognition region, that is, the part of the vehicle surrounding region corresponding to the detection region of a surrounding environment recognition sensor 3 determined to be normal by the sensor abnormality determination unit 1h, the surrounding situation prediction unit 1b predicts the situation based on the detection results of that sensor. For example, as shown in FIG. 10 described above, as the secondary prediction processing at the time point N seconds immediately after the collision, the surrounding situation prediction unit 1b combines, based on the determination results of the sensor abnormality determination unit 1h, the situation of the abnormality recognition region predicted from the ECU memory information recorded immediately before the collision with the situation of the normal recognition regions predicted from the surrounding environment information detected immediately after the collision, and uses the result as the predicted situation of the vehicle surrounding region at the time point N seconds immediately after the collision. The surrounding situation prediction process according to the sensor states performed in step S80 of FIG. 11 will be described with reference to FIG. 15.

As shown in FIG. 15, the surrounding situation prediction unit 1b determines, based on the determination results of the sensor abnormality determination unit 1h, whether the recognition state of each of the plural surrounding environment recognition sensors 3 mounted on the vehicle 10 is "normal" (step S81).

When the surrounding situation prediction unit 1b determines in step S81 that the recognition state of the target surrounding environment recognition sensor 3 is "normal" (step S81: Yes), it sets the normal recognition region, that is, the part of the vehicle surrounding region corresponding to the detection region of the sensor determined to be "normal", to use the detection results obtained by that surrounding environment recognition sensor 3 at the timing immediately after the collision (step S82). Specifically, in step S82, the surrounding situation prediction unit 1b sets the value of the weight coefficient K1 for using sensor information to 1 and the value of the weight coefficient K2 for using ECU memory information to 0. Thereafter, the process proceeds to step S84.

On the other hand, when the surrounding situation prediction unit 1b determines in step S81 that the recognition state of the target surrounding environment recognition sensor 3 is "abnormal" (step S81: No), it sets the abnormality recognition area, that is, the part of the vehicle surrounding area corresponding to the detection area of the sensor determined to be "abnormal", to use the information on the corresponding area stored in the memory of the ECU 1 (step S83). Specifically, in step S83 the surrounding situation prediction unit 1b sets the sensor-information weighting factor K1 to 0 and the ECU-memory-information weighting factor K2 to 1. The process then proceeds to step S84.

The surrounding situation prediction unit 1b then substitutes the values of the sensor-information weighting factor K1 and the ECU-memory-information weighting factor K2 set in steps S82 and S83 into the expression "situation of the vehicle surrounding area = K1 × sensor information + K2 × ECU memory information". In this way, the surrounding situation prediction unit 1b combines the situation of the abnormality recognition area predicted from the ECU memory information recorded immediately before the collision with the situation of the normal recognition area predicted from the surrounding environment information detected immediately after the collision, and predicts the result as the situation of the vehicle surrounding area immediately after the collision (step S84). The process then returns to step S80 of FIG. 11.
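For reference, the source selection of steps S81 to S84 reduces to a per-region weighted sum. The Python sketch below is a minimal illustration, not the claimed implementation; the names (SensorRegion, predict_region, predict_surroundings) are introduced here for clarity, and a single scalar stands in for whatever obstacle map or distance data each area actually holds.

    from dataclasses import dataclass

    @dataclass
    class SensorRegion:
        sensor_id: int
        is_normal: bool         # result of the abnormality determination (step S81)
        sensor_info: float      # situation detected immediately after the collision
        ecu_memory_info: float  # situation recorded in ECU memory before the collision

    def predict_region(region):
        """Evaluate 'situation = K1 * sensor info + K2 * ECU memory info' for one region."""
        if region.is_normal:
            k1, k2 = 1.0, 0.0   # step S82: use the live sensor
        else:
            k1, k2 = 0.0, 1.0   # step S83: fall back to pre-collision ECU memory
        return k1 * region.sensor_info + k2 * region.ecu_memory_info

    def predict_surroundings(regions):
        """Step S84: combine all regions into the post-collision surrounding picture."""
        return {r.sensor_id: predict_region(r) for r in regions}

    # Sensor 2 was damaged in the collision, sensor 5 still works:
    regions = [
        SensorRegion(sensor_id=2, is_normal=False, sensor_info=0.0, ecu_memory_info=12.5),
        SensorRegion(sensor_id=5, is_normal=True, sensor_info=8.3, ecu_memory_info=9.0),
    ]
    print(predict_surroundings(regions))  # {2: 12.5, 5: 8.3}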

Returning to FIG. 11, the travel control unit 1i performs travel control for avoiding the occurrence of a secondary collision around the host vehicle, based on the situation of the vehicle surrounding area predicted in step S80 (step S90).

As described above, the vehicle surrounding situation estimation apparatus according to the present embodiment can suitably estimate the surrounding situation of the vehicle even when, as shown in FIGS. 16 and 17, for example, a collision causes an abnormality in a sensor and the surrounding situation can no longer be detected by that sensor. Here, FIG. 16 shows an example of a scene in which the secondary collision position with respect to a moving object on the road is predicted, and FIG. 17 shows an example of a scene in which the secondary collision position with respect to a stationary object on the road is predicted.

The left part of FIG. 16 shows a scene in which the vehicle 10, as the host vehicle, overtakes another vehicle 30 traveling ahead in the same lane and has a primary collision with another vehicle 20 traveling in the opposite lane. In FIG. 16, it is assumed that when the vehicle 10 overtook the vehicle 30, the right-front collision with the vehicle 20 caused an abnormality in the surrounding environment recognition sensor 3 mounted at the right front (for example, sensor 2). In such a situation, in the present embodiment, the sensor abnormality determination unit 1h determines that the recognition state of the surrounding environment recognition sensor 3 mounted at the right front (for example, sensor 2) is abnormal, and that the recognition states of the surrounding environment recognition sensors 3 mounted at positions other than the right front (for example, sensors 1 and 3 to 6) are normal. Subsequently, as shown in the right part of FIG. 16, as the primary prediction of the secondary collision position, the surrounding situation prediction unit 1b predicts the post-collision position and heading of the host vehicle 10 from information such as the yaw rate and acceleration, and predicts the approach position of the following vehicle 30, the potential secondary collision target, from the ECU memory information recorded immediately before the collision. Further, as the secondary prediction of the secondary collision position, the surrounding situation prediction unit 1b switches the sensor covering the left rear (for example, sensor 5), which was determined to be "normal", to its actual sensing information, and thereby determines the actual distance to the vehicle 30.

The left part of FIG. 17 shows a scene in which, while the vehicle 10 as the host vehicle is traveling in a lane with a flat wall on the left side and an uneven wall on the right side, a vehicle 40 has a primary collision with the left front of the vehicle 10. In FIG. 17, it is assumed that the collision with the vehicle 40 caused an abnormality in the surrounding environment recognition sensor 3 mounted at the left front (for example, sensor 3). In such a situation, in the present embodiment, the sensor abnormality determination unit 1h determines that the recognition state of the surrounding environment recognition sensor 3 mounted at the left front (for example, sensor 3) is abnormal, and that the recognition states of the surrounding environment recognition sensors 3 mounted at positions other than the left front (for example, sensors 1, 2 and 4 to 6) are normal. Subsequently, as shown in the right part of FIG. 17, as the primary prediction of the secondary collision position, the surrounding situation prediction unit 1b predicts the post-collision position and heading of the host vehicle 10 from information such as the yaw rate and acceleration, and predicts the distance to the uneven wall on the right side, the potential secondary collision target, from the ECU memory information, including the road width, recorded immediately before the collision. Further, as the secondary prediction of the secondary collision position, the surrounding situation prediction unit 1b switches sensor 2, which covers the right front and was determined to be "normal", to its actual sensing information, and thereby determines the actual distance to the uneven wall on the right side (d_right in FIG. 17). The actual distance to the flat wall on the left side (d_left in FIG. 17) is also determined using the road width information.
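The description only states that the post-collision position and heading are predicted "from information such as the yaw rate and acceleration". As one plausible reading, a simple planar dead-reckoning integration of the kind sketched below could produce that prediction; the constant-rate model, step size and example values are assumptions made here for illustration.

    import math

    def predict_pose(x, y, heading, speed, yaw_rate, accel, dt=0.05, steps=20):
        """Propagate position [m], heading [rad] and speed [m/s] forward in time."""
        for _ in range(steps):
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
            heading += yaw_rate * dt              # measured yaw rate [rad/s]
            speed = max(0.0, speed + accel * dt)  # a decelerating car does not reverse
        return x, y, heading, speed

    # Example: 20 m/s at impact, spinning at 0.8 rad/s, decelerating at 6 m/s^2;
    # estimate the pose of the host vehicle one second after the collision.
    print(predict_pose(0.0, 0.0, 0.0, 20.0, 0.8, -6.0))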

Comparing the primary prediction result of the secondary collision position with the secondary prediction result in FIGS. 16 and 17, among the regions (i) to (iv) where a collision is possible, the region (i) indicated by the primary prediction result and the region (iii) indicated by the secondary prediction result differ in position. This shows that, for an area covered by a sensor determined to be normal after the collision, the situation of the vehicle surrounding area is predicted more suitably by switching to the actual sensing information than by relying on the ECU memory information recorded immediately before the collision. Thus, according to the present embodiment, the surrounding situation of the vehicle can be suitably estimated even when an abnormality occurs in a sensor due to the collision and the surrounding situation can no longer be detected by that sensor.

In the above-described embodiment, an example has been described in which the coincidence degree recording unit 1e calculates and records the degree of coincidence of the surrounding environment information in the overlapping regions of the surrounding environment recognition sensors 3 at the timing immediately before a collision (that is, the timing at which the collision avoidance determination unit 1f determines that the collision cannot be avoided) and at the timing immediately after the collision (that is, the timing at which the collision detection unit 1g detects the collision). However, the coincidence degree recording unit 1e may also calculate and record the degree of coincidence at a predetermined timing before any collision, for example when the engine is started or when the steering assist switch is turned on. This allows the sensor abnormality determination unit 1h to confirm that the sensors mounted on the vehicle 10 are free of abnormality before it performs the process of comparing the degrees of coincidence of the overlapping regions between the sensors immediately before and immediately after the collision.
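The description leaves the coincidence metric itself unspecified. One plausible realization, shown below as an assumption rather than the claimed method, is a nearest-neighbour match rate between the object detections that two sensors report in their overlapping region; a rate that is high before the collision but collapses afterwards points to the sensor whose detections disappeared.

    def coincidence(points_a, points_b, tol=0.5):
        """Fraction of sensor A's detections confirmed by sensor B within tol metres."""
        if not points_a:
            return 1.0  # nothing to confirm
        matched = sum(
            1
            for (xa, ya) in points_a
            if any((xa - xb) ** 2 + (ya - yb) ** 2 <= tol ** 2 for (xb, yb) in points_b)
        )
        return matched / len(points_a)

    # Both sensors agree on two obstacles before the collision; afterwards the
    # damaged sensor reports nothing, so the coincidence collapses and the sensor
    # whose detections disappeared can be flagged as abnormal.
    print(coincidence([(10.0, 1.0), (15.0, -0.5)], [(10.2, 1.1), (14.8, -0.4)]))  # 1.0
    print(coincidence([(10.0, 1.0), (15.0, -0.5)], []))                           # 0.0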

In the above-described embodiment, an example has been described in which the travel control unit 1i performs travel control (for example, steering control or brake control) for avoiding the occurrence of a secondary collision around the host vehicle, but the processing of the travel control unit 1i is not limited to this. For example, in the present embodiment, the processing of the surrounding situation estimation unit 1c can also estimate the positions of white lines as part of the situation of the prediction area around the host vehicle at the time of the collision. The surrounding situation prediction unit 1b can therefore, even after a collision is detected, predict the drivable area defined by the white lines around the host vehicle based on the situation of the prediction area, including the white line positions, recorded before the collision was detected. As a result, the travel control unit 1i can also perform, even after the collision is detected, travel control that controls the behavior of the host vehicle so as to follow the white lines (so-called lane keeping assist, LKA). Thus, since the vehicle surrounding situation estimation apparatus of the present embodiment can suitably predict the drivable area defined by the white lines even after the primary collision, LKA control can be executed even when an abnormality occurs in a sensor due to the collision and the surrounding situation can no longer be detected. Executing LKA control after the primary collision reduces the situations in which the host vehicle drifts out of its lane after the collision, and therefore also reduces the occurrence of secondary collisions with other vehicles traveling in other lanes or with obstacles installed at the side of the lane.
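As a concrete illustration of this fallback, the sketch below derives a steering command from white-line distances taken out of the recorded prediction area; the proportional control law and its gains are generic assumptions introduced here, not the control scheme of this embodiment.

    def lka_steering(d_left, d_right, heading_error, kp_offset=0.1, kp_heading=0.5):
        """Steering command [rad]; positive steers left.

        d_left, d_right -- distances to the left/right white lines [m], taken from
                           the prediction area recorded before the collision when
                           the lane-facing sensors are abnormal
        heading_error   -- vehicle heading minus lane direction [rad]
        """
        lateral_offset = (d_right - d_left) / 2.0  # > 0: host vehicle left of centre
        # Steer back toward the lane centre and counter residual yaw from the impact.
        return -kp_offset * lateral_offset - kp_heading * heading_error

    # Host vehicle 0.6 m left of centre and yawed 0.05 rad to the left after impact:
    print(lka_steering(d_left=1.1, d_right=2.3, heading_error=0.05))  # -0.085, steer right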

1 ECU (vehicle surrounding situation estimation apparatus)
1a Surrounding environment information acquisition unit
1b Surrounding situation prediction unit
1c Surrounding situation estimation unit
1d Surrounding situation recording unit
1e Coincidence degree recording unit
1f Collision avoidance determination unit
1g Collision detection unit
1h Sensor abnormality determination unit
1i Travel control unit
2 Vehicle momentum detection sensor
2a Acceleration sensor
2b Yaw rate sensor
2c Vehicle speed sensor
3 Surrounding environment recognition sensor
3a Sensor 1 (first sensor)
3b Sensor 2 (second sensor)
3c Sensor 3 (third sensor)
4 Actuator

Claims (4)

1. A vehicle surrounding situation estimation apparatus comprising:
collision detection means for detecting that a host vehicle has collided with an object outside the vehicle;
surrounding situation detection means for detecting the situation of a detection area around the host vehicle;
surrounding situation estimation means for estimating, based on the detection result of the surrounding situation detection means before the collision detection means detects a collision, the situation of a prediction area around the host vehicle at the time of the collision;
surrounding situation recording means for recording the situation of the prediction area estimated by the surrounding situation estimation means; and
surrounding situation prediction means for predicting, after the collision detection means detects the collision, the situation of a vehicle surrounding area around the host vehicle based on the situation of the prediction area recorded by the surrounding situation recording means before the collision was detected.

2. The vehicle surrounding situation estimation apparatus according to claim 1, further comprising:
collision avoidance determination means for determining whether a collision between the host vehicle and the object outside the vehicle detected by the surrounding situation detection means can be avoided, wherein
the surrounding situation estimation means estimates the situation of the prediction area after the collision avoidance determination means determines that the collision cannot be avoided, and
the surrounding situation recording means records the situation of the prediction area estimated by the surrounding situation estimation means from the time at which the collision avoidance determination means determines that the collision cannot be avoided until the time at which the collision detection means detects the collision.

3. The vehicle surrounding situation estimation apparatus according to claim 1 or 2, further comprising:
sensor abnormality determination means for determining, after the collision detection means detects the collision, whether any of a plurality of the surrounding situation detection means mounted on the host vehicle has an abnormality, wherein
after the collision detection means detects the collision, the surrounding situation prediction means predicts the situation of an abnormality recognition area, namely the part of the vehicle surrounding area corresponding to the detection area set before the collision for a surrounding situation detection means determined to be abnormal by the sensor abnormality determination means, based on the situation of the prediction area recorded by the surrounding situation recording means before the collision was detected, and predicts the situation of a normal recognition area, namely the part of the vehicle surrounding area corresponding to the detection area of a surrounding situation detection means determined to be normal, based on the detection result of that surrounding situation detection means, thereby predicting the situation around the host vehicle.

4. The vehicle surrounding situation estimation apparatus according to any one of claims 1 to 3, further comprising:
travel control means for performing, after the collision detection means detects the collision, travel control that controls the behavior of the host vehicle based on the situation of the vehicle surrounding area predicted by the surrounding situation prediction means.
JP2013270337A 2013-12-26 2013-12-26 Vehicle surrounding situation estimation device Active JP6299208B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2013270337A JP6299208B2 (en) 2013-12-26 2013-12-26 Vehicle surrounding situation estimation device
PCT/IB2014/002743 WO2015097511A1 (en) 2013-12-26 2014-12-12 Vehicle surrounding situation estimation device
US15/107,012 US10479353B2 (en) 2013-12-26 2014-12-12 Vehicle surrounding situation estimation device
DE112014006071.2T DE112014006071T5 (en) 2013-12-26 2014-12-12 Vehicle surrounding situation estimating
CN201480070946.3A CN105848980B (en) 2013-12-26 2014-12-12 Peripheral situation of vehicle estimating device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013270337A JP6299208B2 (en) 2013-12-26 2013-12-26 Vehicle surrounding situation estimation device

Publications (2)

Publication Number Publication Date
JP2015123899A true JP2015123899A (en) 2015-07-06
JP6299208B2 JP6299208B2 (en) 2018-03-28

Family

ID=52355016

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013270337A Active JP6299208B2 (en) 2013-12-26 2013-12-26 Vehicle surrounding situation estimation device

Country Status (5)

Country Link
US (1) US10479353B2 (en)
JP (1) JP6299208B2 (en)
CN (1) CN105848980B (en)
DE (1) DE112014006071T5 (en)
WO (1) WO2015097511A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016088180A (en) * 2014-10-31 2016-05-23 富士重工業株式会社 Travel control unit of vehicle
JP2017027292A (en) * 2015-07-21 2017-02-02 トヨタ自動車株式会社 Vehicle control device
JP2017202802A (en) * 2016-05-13 2017-11-16 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
CN108983247A (en) * 2017-05-31 2018-12-11 本田技研工业株式会社 Object target identifying system, object target recognition methods and storage medium
JP2019513617A * 2016-04-07 2019-05-30 Robert Bosch Gmbh How to drive a vehicle
WO2019138964A1 (en) * 2018-01-09 2019-07-18 パイオニア株式会社 Control device, scanning device, control method, program, and storage medium
KR20190111321A (en) * 2018-03-22 2019-10-02 현대자동차주식회사 Chassis Integration System Method for Preventing Secondary Collision and Vehicle thereof
JP2019209910A (en) * 2018-06-07 2019-12-12 本田技研工業株式会社 Vehicle control system
JP2019209909A (en) * 2018-06-07 2019-12-12 本田技研工業株式会社 Vehicle control system
KR102107466B1 (en) * 2018-12-14 2020-05-07 국민대학교산학협력단 Driving control apparatus and method based on autonomous vehicle
JP2021112981A (en) * 2020-01-20 2021-08-05 トヨタ自動車株式会社 Driving support apparatus
WO2021166718A1 (en) * 2020-02-17 2021-08-26 株式会社デンソー In-vehicle measurement device unit and integrated data generation method in in-vehicle measurement device unit

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015174093A1 * 2014-05-16 2015-11-19 Panasonic Intellectual Property Management Co., Ltd. In-vehicle display device, in-vehicle display device control method, and program
US11242051B1 (en) 2016-01-22 2022-02-08 State Farm Mutual Automobile Insurance Company Autonomous vehicle action communications
US10747234B1 (en) 2016-01-22 2020-08-18 State Farm Mutual Automobile Insurance Company Method and system for enhancing the functionality of a vehicle
US11719545B2 (en) 2016-01-22 2023-08-08 Hyundai Motor Company Autonomous vehicle component damage and salvage assessment
US11441916B1 (en) 2016-01-22 2022-09-13 State Farm Mutual Automobile Insurance Company Autonomous vehicle trip routing
DE102016216738A1 (en) * 2016-09-05 2018-03-08 Robert Bosch Gmbh Method and device for controlling a vehicle
US10081299B2 (en) * 2016-09-07 2018-09-25 Thunder Power New Energy Vehicle Development Company Limited Front end sensor for pedestrians
DE102017216083B4 (en) 2016-09-13 2023-08-17 Hl Klemove Corp. Impact absorbing device and method for a vehicle
KR102039487B1 (en) 2016-11-11 2019-11-26 엘지전자 주식회사 Vehicle driving control apparatus and method
US10663974B2 (en) * 2016-11-23 2020-05-26 Electronics And Telecommunications Research Institute Object recognition device, autonomous driving system including the same, and object recognition method using the object recognition device
KR101911703B1 (en) * 2016-12-09 2018-10-25 엘지전자 주식회사 Driving control apparatus for vehicle and vehicle
KR20180071663A (en) * 2016-12-20 2018-06-28 현대자동차주식회사 Vehicle and method for controlling thereof
JPWO2018155142A1 (en) * 2017-02-21 2019-07-11 日立オートモティブシステムズ株式会社 Vehicle control device
US10329985B2 (en) * 2017-06-27 2019-06-25 Tenneco Automotive Operating Company Inc. Impingement mixer for exhaust treatment
CN107792076A (en) * 2017-09-25 2018-03-13 南京律智诚专利技术开发有限公司 A kind of method of work of the vehicle automatic running system of achievable identification thing analysis
JP6791106B2 (en) * 2017-12-06 2020-11-25 株式会社デンソー Peripheral recognition device and peripheral recognition method
JP6922852B2 (en) * 2018-06-12 2021-08-18 株式会社デンソー Electronic control device and electronic control system
DE102018216704A1 (en) * 2018-09-28 2020-04-02 Ibeo Automotive Systems GmbH Environment detection system, vehicle and method for an environment detection system
JP7265862B2 (en) * 2018-12-25 2023-04-27 株式会社デンソー Driving support device
EP3699632A1 (en) * 2019-02-20 2020-08-26 Veoneer Sweden AB A vehicle radar system for detecting preceding objects
KR20200144175A (en) * 2019-06-17 2020-12-29 현대자동차주식회사 Vehicle and control method thereof
DE102019209572A1 (en) * 2019-06-29 2020-12-31 Robert Bosch Gmbh Sensor unit for a vehicle and a vehicle with such a sensor unit
US11410545B2 (en) * 2019-07-19 2022-08-09 Ford Global Technologies, Llc Dynamic vehicle perimeter definition and reporting
KR20210071616A (en) * 2019-12-06 2021-06-16 현대자동차주식회사 Apparatus and method for controlling airbag
KR20210153998A (en) * 2020-06-11 2021-12-20 현대자동차주식회사 Vehicle and method for controlling thereof
CN112389392B (en) * 2020-12-01 2022-02-25 安徽江淮汽车集团股份有限公司 Vehicle active braking method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08235484A (en) * 1995-02-28 1996-09-13 Fujitsu Ten Ltd Data recorder in accident
JP2008213535A (en) * 2007-02-28 2008-09-18 Toyota Motor Corp Collision predicting device
JP2008221906A (en) * 2007-03-09 2008-09-25 Alpine Electronics Inc Damage part informing system for vehicle
US20090212993A1 (en) * 2008-02-22 2009-08-27 Toyota Jidosha Kabushiki Kaisha Collision detection apparatus, vehicle having same apparatus, and collision detection method
JP2009231937A (en) * 2008-03-19 2009-10-08 Mazda Motor Corp Surroundings monitoring device for vehicle
JP2010108182A (en) * 2008-10-29 2010-05-13 Honda Motor Co Ltd Vehicle driving support apparatus
JP2012138080A (en) * 2010-12-06 2012-07-19 Denso Corp Collision detecting device, avoidance support device and alarm system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101081777B1 2003-11-14 2011-11-09 Continental Teves AG & Co. oHG Method and device for reducing damage caused by an accident
JP4428208B2 (en) * 2004-11-16 2010-03-10 株式会社デンソー Vehicle object recognition device
JP2008121583A (en) * 2006-11-13 2008-05-29 Toyota Motor Corp Vehicle control device
JP5067091B2 (en) * 2007-09-18 2012-11-07 トヨタ自動車株式会社 Collision determination device
JP4982353B2 (en) * 2007-12-27 2012-07-25 日立オートモティブシステムズ株式会社 External recognition device
DE102011115223A1 (en) 2011-09-24 2013-03-28 Audi Ag Method for operating a safety system of a motor vehicle and motor vehicle

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08235484A (en) * 1995-02-28 1996-09-13 Fujitsu Ten Ltd Data recorder in accident
JP2008213535A (en) * 2007-02-28 2008-09-18 Toyota Motor Corp Collision predicting device
US20100042323A1 (en) * 2007-02-28 2010-02-18 Toyota Jidosha Kabushiki Kaisha Collision prediction device
JP2008221906A (en) * 2007-03-09 2008-09-25 Alpine Electronics Inc Damage part informing system for vehicle
US20090212993A1 (en) * 2008-02-22 2009-08-27 Toyota Jidosha Kabushiki Kaisha Collision detection apparatus, vehicle having same apparatus, and collision detection method
JP2009198402A (en) * 2008-02-22 2009-09-03 Toyota Motor Corp Impact detection device
JP2009231937A (en) * 2008-03-19 2009-10-08 Mazda Motor Corp Surroundings monitoring device for vehicle
JP2010108182A (en) * 2008-10-29 2010-05-13 Honda Motor Co Ltd Vehicle driving support apparatus
JP2012138080A (en) * 2010-12-06 2012-07-19 Denso Corp Collision detecting device, avoidance support device and alarm system

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9764640B2 (en) 2014-10-31 2017-09-19 Subaru Corporation Travel control apparatus for vehicle
JP2016088180A (en) * 2014-10-31 2016-05-23 富士重工業株式会社 Travel control unit of vehicle
JP2017027292A (en) * 2015-07-21 2017-02-02 トヨタ自動車株式会社 Vehicle control device
US10787171B2 (en) 2016-04-07 2020-09-29 Robert Bosch Gmbh Method for operating a vehicle
JP2019513617A * 2016-04-07 2019-05-30 Robert Bosch Gmbh How to drive a vehicle
JP2017202802A (en) * 2016-05-13 2017-11-16 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
CN108983247A (en) * 2017-05-31 2018-12-11 本田技研工业株式会社 Object target identifying system, object target recognition methods and storage medium
JP2018205878A (en) * 2017-05-31 2018-12-27 本田技研工業株式会社 Target recognition system, target recognition method, and program
CN108983247B (en) * 2017-05-31 2022-08-23 本田技研工业株式会社 Object target recognition system, object target recognition method, and storage medium
WO2019138964A1 (en) * 2018-01-09 2019-07-18 パイオニア株式会社 Control device, scanning device, control method, program, and storage medium
KR20190111321A (en) * 2018-03-22 2019-10-02 현대자동차주식회사 Chassis Integration System Method for Preventing Secondary Collision and Vehicle thereof
KR102429069B1 (en) * 2018-03-22 2022-08-04 현대자동차주식회사 Chassis Integration System Method for Preventing Secondary Collision and Vehicle thereof
JP2019209909A (en) * 2018-06-07 2019-12-12 本田技研工業株式会社 Vehicle control system
CN110641460A (en) * 2018-06-07 2020-01-03 本田技研工业株式会社 Vehicle control system
CN110576853A (en) * 2018-06-07 2019-12-17 本田技研工业株式会社 Vehicle control system
JP2019209910A (en) * 2018-06-07 2019-12-12 本田技研工業株式会社 Vehicle control system
KR102107466B1 (en) * 2018-12-14 2020-05-07 국민대학교산학협력단 Driving control apparatus and method based on autonomous vehicle
JP2021112981A (en) * 2020-01-20 2021-08-05 トヨタ自動車株式会社 Driving support apparatus
JP7268612B2 (en) 2020-01-20 2023-05-08 トヨタ自動車株式会社 Driving support device
WO2021166718A1 (en) * 2020-02-17 2021-08-26 株式会社デンソー In-vehicle measurement device unit and integrated data generation method in in-vehicle measurement device unit
JP2021128116A (en) * 2020-02-17 2021-09-02 株式会社デンソー Automotive measurement device unit and integrated data generation method in automotive measurement device unit
JP7283413B2 (en) 2020-02-17 2023-05-30 株式会社デンソー IN-VEHICLE MEASURING DEVICE UNIT AND INTEGRATED DATA GENERATION METHOD IN IN-VEHICLE MEASURING DEVICE UNIT

Also Published As

Publication number Publication date
DE112014006071T5 (en) 2016-09-08
CN105848980A (en) 2016-08-10
JP6299208B2 (en) 2018-03-28
US20170001637A1 (en) 2017-01-05
WO2015097511A1 (en) 2015-07-02
CN105848980B (en) 2019-05-03
US10479353B2 (en) 2019-11-19

Similar Documents

Publication Publication Date Title
JP6299208B2 (en) Vehicle surrounding situation estimation device
JP6032195B2 (en) Sensor abnormality detection device
US11738744B2 (en) Driving support apparatus
JP6470403B2 (en) Automatic operation control device
JP6507862B2 (en) Peripheral monitoring device and driving support device
JP6507839B2 (en) Vehicle travel control device
JP6331637B2 (en) Driving support device
US10350999B2 (en) Vehicle cruise control apparatus and vehicle cruise control method
WO2018074287A1 (en) Vehicle control device
US10967857B2 (en) Driving support device and driving support method
JP2016199262A (en) Avoidance of collision based on front wheel locus deviation during retreat travel
WO2017111135A1 (en) Travel assistance device and travel assistance method
CN107004367B (en) Vehicle travel control device and travel control method
JP5312217B2 (en) Vehicle collision possibility determination device
JP6589840B2 (en) Driving assistance device
JP4442520B2 (en) Course estimation device for vehicle
US20160167661A1 (en) Method for operating a driver assistance system of a motor vehicle and driver assistance system for a motor vehicle
US20170217432A1 (en) Driving assistance apparatus
US20190061750A1 (en) Collision mitigation control device
US20220113397A1 (en) Target tracking device
JP2017151726A (en) Collision predicting device
JP6627093B2 (en) VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND VEHICLE CONTROL PROGRAM
US10053092B2 (en) Road environment recognition device, vehicle control device, and vehicle control method
JP5928321B2 (en) Parking assistance device
JP6315070B1 (en) Obstacle detection device for vehicles

Legal Events

Date        Code  Title                                                                          Description
2016-01-18  A621  Written request for application examination                                    Free format text: JAPANESE INTERMEDIATE CODE: A621
2016-10-19  A977  Report on retrieval                                                            Free format text: JAPANESE INTERMEDIATE CODE: A971007
2016-10-25  A131  Notification of reasons for refusal                                            Free format text: JAPANESE INTERMEDIATE CODE: A131
2016-12-14  A521  Request for written amendment filed                                            Free format text: JAPANESE INTERMEDIATE CODE: A523
2017-04-25  A131  Notification of reasons for refusal                                            Free format text: JAPANESE INTERMEDIATE CODE: A131
2017-06-21  A601  Written request for extension of time                                          Free format text: JAPANESE INTERMEDIATE CODE: A601
2017-08-18  A521  Request for written amendment filed                                            Free format text: JAPANESE INTERMEDIATE CODE: A523
            TRDD  Decision of grant or rejection written
2018-01-30  A01   Written decision to grant a patent or to grant a registration (utility model)  Free format text: JAPANESE INTERMEDIATE CODE: A01
2018-02-12  A61   First payment of annual fees (during grant procedure)                          Free format text: JAPANESE INTERMEDIATE CODE: A61
            R151  Written notification of patent or utility model registration                   Ref document number: 6299208; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R151