GB2506479A - Collision detection system with a plausibility module - Google Patents
- Publication number
- GB2506479A GB1312853.3A GB201312853A
- Authority
- GB
- United Kingdom
- Prior art keywords
- host vehicle
- collision
- vehicle
- rate
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/18—Conjoint control of vehicle sub-units of different type or different function including control of braking systems
- B60W10/184—Conjoint control of vehicle sub-units of different type or different function including control of braking systems with wheel brakes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/20—Conjoint control of vehicle sub-units of different type or different function including control of steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/30—Conjoint control of vehicle sub-units of different type or different function including control of auxiliary equipment, e.g. air-conditioning compressors or oil pumps
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/085—Taking automatic action to adjust vehicle attitude in preparation for collision, e.g. braking for nose dropping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/04—Monitoring the functioning of the control system
- B60W50/045—Monitoring control system parameters
- B60W2050/046—Monitoring control system parameters involving external transmission of data to or from the vehicle, e.g. via telemetry, satellite, Global Positioning System [GPS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/14—Yaw
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/18—Braking system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/30—Auxiliary equipments
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Regulating Braking Force (AREA)
Abstract
A collision detection system, for a vehicle, includes a camera 16 and a sensor 24 which measures a first data set of an object (30, fig 1) relative to the vehicle. The camera 16 measures a second data set of the object (30) relative to the vehicle and separately measures an image-based time-to-collision with the object (30) based on scalable differences of captured images. A fusion module 54 matches data from the sensor 24 and the camera 16 and estimates a collision threat based on the matched data. A plausibility module 58 generates a signal if the measured image-based time-to-collision is less than a calculated steering-based time-to-collision and a braking-based time-to-collision with the object (30). A countermeasure module 60 actuates a countermeasure device, such as an autonomous braking system 22, if the collision threat exceeds an actuation threshold and the signal from the plausibility module is received, thereby statistically reducing a rate of false actuations of the countermeasure device 22. Reference is also made to a method of actuating autonomous braking controller 62.
Description
COLLISION DETECTION SYSTEM WITH A PLAUSIBILITY MODULE
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority under 35 U.S.C. § 119(e) to, and the benefit of, U.S. Provisional Patent Application No. 61/677,274, entitled "COLLISION DETECTION SYSTEM WITH A PLAUSIBILITY MODULE," filed on July 30, 2012, the entire disclosure of which is hereby incorporated by reference.
FIELD OF THE INVENTION
[0002] The present invention generally relates to a collision detection system for a vehicle that actuates a countermeasure device to mitigate or avoid a collision with an object. More specifically, the invention relates to a collision detection system having at least a camera to measure data of an object relative to a host vehicle and, based on the measured data and collision estimates, actuating an autonomous braking system of the vehicle.
BACKGROUND OF THE INVENTION
[0003] Automotive vehicles are increasingly being equipped with collision detection systems to identify objects in a host vehicle's path of travel, including pedestrians and other vehicles. To mitigate or avoid collisions, these systems are used in conjunction with countermeasure devices, such as autonomous braking, adaptive cruise control, emergency steering assistance, and warning systems. For instance, collision mitigation by braking (CMbB) is capable of performing autonomous braking up to full anti-lock brake system levels, which must be validated to ensure an exceptionally low rate of false brake actuation. Increased collision detection reliability without a prolonged and expensive validation process is desirable.
SUMMARY OF THE INVENTION
[0004] According to one aspect of the present invention, a collision detection system for a host vehicle includes a sensor for detecting an object in a field of view and measuring a first set of target data of the object relative to the host vehicle. The system also includes a camera for capturing a plurality of images from the field of view and processing the plurality of images to measure a second set of target data of the object relative to the host vehicle and to measure an image-based time-to-collision (TTCIMAGE) of the host vehicle with the object based on scalable differences of the plurality of images. A fusion module determines a matched set of target data of the object relative to the host vehicle based on the first and second sets of target data received from the sensor and the camera, respectively. The fusion module estimates a threat of collision of the host vehicle with the object based on the matched set of target data. A plausibility module calculates a steering-based time-to-collision (TTCSTEERING) and a braking-based time-to-collision (TTCBRAKING) of the host vehicle with the object based on the second set of target data received from the camera and an additional set of data received from a vehicle dynamics detector. The plausibility module generates an actuation signal if the measured TTCIMAGE is less than the calculated TTCSTEERING and the TTCBRAKING. A countermeasure module actuates a countermeasure device if the threat of collision received from the fusion module exceeds an actuation threshold and the actuation signal from the plausibility module is generated and received, thereby statistically reducing the rate of falsely actuating the countermeasure device.
[0005] According to another aspect of the present invention, a collision detection system for a vehicle includes a sensor and a camera. The sensor measures data of an object relative to the vehicle. The camera also measures data of the object relative to the vehicle and measures an image-based time-to-collision (TTCIMAGE) with the object based on scalable differences of captured images. A fusion module matches data from the sensor and the camera and estimates a collision threat based on the matched data. A plausibility module generates a signal if the measured TTCIMAGE is less than a calculated steering-based time-to-collision (TTCSTEERING) and a braking-based time-to-collision (TTCBRAKING) with the object. A countermeasure module actuates a countermeasure device if the collision threat exceeds an actuation threshold and the signal from the plausibility module is generated.
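The fusion module's matching of sensor and camera data is not spelled out in detail in this specification. A minimal gated nearest-neighbor association is one common way such matching is done; the sketch below is an assumption in that spirit — the field names, gate thresholds, and the choice to keep radar range with camera angle are all illustrative, not taken from the patent:

```python
import math

def match_targets(radar_targets, camera_targets,
                  range_gate=2.0, angle_gate=0.05):
    """Greedy gated nearest-neighbor association of radar and camera tracks.

    Each target is a dict with 'range' (m) and 'angle' (rad) keys.
    Gate values are illustrative, not taken from the patent.
    """
    matches = []
    used = set()
    for r in radar_targets:
        best, best_cost = None, float("inf")
        for i, c in enumerate(camera_targets):
            if i in used:
                continue
            dr = abs(r["range"] - c["range"])
            da = abs(r["angle"] - c["angle"])
            if dr > range_gate or da > angle_gate:
                continue  # outside the association gate: not the same object
            cost = math.hypot(dr / range_gate, da / angle_gate)
            if cost < best_cost:
                best, best_cost = i, cost
        if best is not None:
            used.add(best)
            # Matched set: e.g. trust radar for range, camera for angle
            matches.append({"range": r["range"],
                            "angle": camera_targets[best]["angle"]})
    return matches
```

The matched set would then feed the threat-of-collision estimate; a real implementation would carry range rate, width, and classification through as well.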
[0006] According to yet another aspect of the present invention, a vehicle collision detection system comprises a sensor and a camera. A fusion module estimates a collision threat with an object using data of the object relative to the vehicle from the sensor and the camera. A plausibility module generates a signal if an image-based time-to-collision is less than a steering-based time-to-collision and a braking-based time-to-collision. A countermeasure actuates if the collision threat exceeds a threshold and the signal is received.
[0007] According to another aspect of the present invention, a method is provided for actuating an autonomous braking controller for a brake system of a host vehicle. The method comprises the step of sensing an object in a field of view by an object detection sensor on the host vehicle.
A first data set of the object is measured with the object detection sensor, including a first range and range rate of the object relative to the host vehicle, a first angle and angle rate of the object relative to the host vehicle, and a relative movement determination of the object. The method also includes the step of capturing a plurality of images based on light waves from the field of view by a camera on the host vehicle at known time intervals between instances when the images of the plurality of images are captured. The captured images are processed to measure a second data set of the object, including a second range and range rate of the object relative to the host vehicle, a second angle and angle rate of the object relative to the host vehicle, a width of the object, and an image-based time-to-collision (TTCIMAGE) of the host vehicle with the object based on scalable differences of the object derived from the plurality of images. An additional data set is measured with a vehicle dynamics detector, including a yaw-rate sensor for measuring a yaw rate of the host vehicle and a speed sensor for measuring the longitudinal velocity of the host vehicle. A controller is provided that receives the first and second data sets, the TTCIMAGE, and the additional data set. The method further includes the step of estimating a threat of collision of the host vehicle with the object based on a combination of the first and second data sets. A steering-based time-to-collision (TTCSTEERING) of the host vehicle with the object is calculated as a function of the second data set, the longitudinal velocity of the host vehicle, and the yaw rate of the host vehicle. A braking-based time-to-collision (TTCBRAKING) of the host vehicle with the object is calculated as a function of the longitudinal velocity of the host vehicle and a maximum rate of deceleration of the host vehicle.
The method also includes the step of generating an actuation signal if the measured TTCIMAGE is less than the calculated TTCSTEERING and the TTCBRAKING. The autonomous braking controller for the brake system of the host vehicle is actuated based on the threat of collision and the actuation signal.
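The specification names the inputs to the two calculated thresholds but not the formulas themselves. One plausible kinematic reading, sketched below under assumed constant-deceleration and constant-lateral-acceleration models, is that each threshold is the last-point-to-maneuver TTC: below TTCBRAKING even maximum braking cannot stop in time, and below TTCSTEERING a maximum-lateral-acceleration swerve cannot build the needed clearance. The deceleration and lateral-acceleration limits, and the derivation of the lateral clearance from the second data set and yaw rate, are assumptions, not values from the patent:

```python
import math

def ttc_braking(v_closing, a_max=9.0):
    """Last-point-to-brake threshold (s): with closing speed v and maximum
    deceleration a_max, braking avoids impact only while the range R = v*TTC
    still exceeds the stopping distance v**2 / (2*a_max), i.e. while
    TTC >= v / (2*a_max).  a_max ~9 m/s^2 assumes full ABS-level braking."""
    return v_closing / (2.0 * a_max)

def ttc_steering(lateral_clearance, a_lat_max=5.0):
    """Last-point-to-steer threshold (s): time a constant-lateral-acceleration
    swerve needs to build up the required lateral clearance (m).  In the
    patent's terms, the clearance would come from the camera's second data
    set (object width, angle) and the yaw-rate-predicted host path."""
    return math.sqrt(2.0 * lateral_clearance / a_lat_max)

def plausible(ttc_image, v_closing, lateral_clearance):
    """Actuation-signal condition: the measured image-based TTC must be
    below BOTH avoidance thresholds (collision no longer avoidable)."""
    return (ttc_image < ttc_steering(lateral_clearance)
            and ttc_image < ttc_braking(v_closing))
```

For example, at 18 m/s closing speed and 2.5 m of required clearance, both thresholds come out to 1.0 s under these assumed limits, so a measured TTCIMAGE of 0.5 s would raise the signal while 1.5 s would not.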
[0008] According to yet another aspect of the present invention, a collision detection system includes a camera and a sensor to measure data of an object relative to a host vehicle, such that a threat of collision is estimated from combined data of the camera and the sensor. The independent plausibility module receives an image-based time-to-collision measured directly and independently by the camera based on a measured rate of expansion of the object. The independent plausibility module generates an actuation signal if the image-based time-to-collision is less than both a steering-based time-to-collision and a braking-based time-to-collision, which are calculated as a function of measurements received from the camera relative to a general horizon plane. An autonomous braking controller for a brake system of the vehicle is actuated if the threat of collision is greater than a threshold and the independent plausibility module generates the actuation signal. The check against the signal from the independent plausibility module statistically increases reliability of the overall collision detection system and reduces the expense and extent of a validation process for implementing the system, without adding additional sensors to the vehicle.
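The statistical argument above is a two-channel AND gate: braking fires only when the fused threat estimate and the independent plausibility signal agree, so both channels must produce a false alarm simultaneously for a false actuation. A minimal sketch of that decision (the threshold value is hypothetical):

```python
def actuate_countermeasure(collision_threat, plausibility_signal,
                           actuation_threshold=0.9):
    """Actuate braking only when the fused threat estimate exceeds its
    threshold AND the independent plausibility module has raised its signal.

    If the two channels have roughly independent false-alarm probabilities
    p1 and p2, requiring both drives the combined false-actuation rate
    toward p1 * p2 -- the statistical reduction the patent relies on.
    """
    return collision_threat > actuation_threshold and plausibility_signal
```

The independence of the two channels rests on TTCIMAGE being measured from expansion rate while the fused estimate uses horizon-based range, as paragraph [0022] explains.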
[0009] These and other aspects, objects, and features of the present invention will be understood and appreciated by those skilled in the art upon studying the following specification, claims, and appended drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] In the drawings:
[0011] FIG. 1 is a plan view illustrating a collision detection system on a host vehicle with an object in a field of view of the host vehicle and having a substantially similar direction of travel;
[0012] FIG. 2 is a schematic diagram of the collision detection system including a sensor, a camera, and a vehicle dynamics detector in communication with a collision threat controller, which is in communication with a countermeasure;
[0013] FIG. 3 is a flow chart illustrating a method for actuating a countermeasure, such as an autonomous braking controller for a brake system of a host vehicle, using a collision threat controller;
[0014] FIG. 4 is a logic diagram illustrating a routine for generating an actuation signal for a countermeasure module;
[0015] FIG. 5 is a logic diagram illustrating a routine for estimating and calculating a steering-based time-to-collision; and
[0016] FIG. 6 is a logic diagram illustrating a routine for estimating and calculating a braking-based time-to-collision.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0017] For purposes of description herein, the terms "upper," "lower," "right," "left," "rear," "front," "vertical," "horizontal," and derivatives thereof shall relate to the vehicle and its collision detection system as oriented in FIG. 1. However, it is to be understood that the invention may assume various alternative orientations, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings and described in the following specification are simply exemplary embodiments of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the embodiments disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise.
[0018] Referring to FIG. 1, reference numeral 10 generally designates a host vehicle having a collision detection system 12 with an object detection sensor (R) 14 and a camera (C) 16. A field of view for the sensor 14 and the camera 16 is generally indicated with reference numeral 18 and is further defined by boundaries 18A and 18B. The host vehicle 10 shown and described herein is a passenger car (automotive vehicle) having wheels 20 for engaging a road and a brake system (B) 22 for engaging the wheels 20. Upon engaging the wheels 20, the brake system 22 is configured to reduce rotation of the wheels 20, thereby causing a longitudinal velocity VH of the host vehicle 10 relative to the road to reduce, such that the host vehicle 10 has a negative longitudinal acceleration, or a deceleration. The host vehicle 10 includes a front side, two lateral sides, and a rear side, with the sensor 14 and the camera 16 positioned generally on the front side for detecting objects in the field of view 18 forward of the host vehicle 10. However, it is contemplated that the sensor 14 and camera 16 could be positioned at a different location on the host vehicle 10 for detecting objects in an alternative field of view. The sensor 14 and the camera 16 on the host vehicle 10 are generally connected to a controller (C) 24, which is connected to the brake system (B) 22. The controller 24 also receives data from an onboard vehicle dynamics detector (D) 26.
[0019] As illustrated in FIG. 1, an object 30 is located forward of the host vehicle 10 in the field of view 18. The object 30, as shown, is a lead vehicle oriented in a substantially similar direction of travel as the host vehicle 10. It is further contemplated that the object 30 may alternatively include, among other things, a pedestrian, a bicycle, or other mobile or fixed structure. The host vehicle 10 and the object 30, illustrated as the lead vehicle, have longitudinal velocities relative to the underlying road respectively denoted as VH and VL and illustrated as vectors to show the respective general direction of travel.
[0020] The object detection sensor 14 monitors the field of view 18, and when the sensor 14 detects the object 30 in the field of view 18, the sensor 14 measures a first set of target data of the object 30 relative to the host vehicle 10, based on a position of the object relative to the host vehicle. The first set of target data of the object 30 relative to the host vehicle 10 includes a first range R1 (radial distance) measurement between the object 30 and the host vehicle 10, a first range rate Ṙ1 (time rate of change of radial distance) of the object 30 relative to the host vehicle 10, a first angle θ1 (azimuth) measurement of the direction to the object 30 relative to the host vehicle 10, a first angle rate θ̇1 (time rate of change of azimuth) of the direction to the object 30 relative to the host vehicle 10, and a relative movement determination of the object 30 relative to the road. As shown in FIG. 2, the object detection sensor 14 comprises a radar system 32. It is contemplated that the first set of target data includes more or fewer data measurements of the object 30 or the host vehicle 10.
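The first set of target data enumerated above maps naturally onto a simple record type. The sketch below uses hypothetical field names (the patent does not define a data layout) to show what the sensor channel delivers to the controller:

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    """First set of target data measured by the object detection sensor 14.

    Field names are illustrative, not from the patent."""
    range_m: float      # first range R1: radial distance to the object (m)
    range_rate: float   # first range rate: d(R1)/dt (m/s, negative = closing)
    angle_rad: float    # first angle theta1: azimuth to the object (rad)
    angle_rate: float   # first angle rate: d(theta1)/dt (rad/s)
    is_moving: bool     # relative movement determination vs. the road
```

The camera's second set of target data (paragraph [0022]) would add width, classification, and confidence fields to an analogous record.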
[0021] The camera 16 also monitors the field of view 18 for detecting one or more objects, such as the object 30. The camera 16 captures a plurality of images based on light waves from the field of view 18 at known time intervals between instances when the images of the plurality of images are captured. The camera 16 processes the plurality of images to measure a second set of target data of the object 30 relative to the host vehicle 10 and to measure an image-based time-to-collision (TTCIMAGE) of the host vehicle 10 with the object 30 based on scalable differences of the plurality of images. More specifically, the image-based time-to-collision (TTCIMAGE) is independently based on measuring various aspects of the object 30 in the plurality of images to determine the rate of expansion of the object 30 from the perspective of the camera on the host vehicle 10.
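The rate-of-expansion measurement has a standard pinhole-camera derivation: the apparent width of the object is inversely proportional to its range, so the frame-to-frame scale ratio s = w_curr/w_prev equals R_prev/R_curr, and at constant closing speed TTC ≈ Δt/(s − 1). A sketch of that calculation (the patent does not give this formula explicitly, but it is the usual reading of "scalable differences"):

```python
def ttc_from_expansion(width_prev_px, width_curr_px, dt):
    """Image-based TTC from the rate of expansion of the object.

    Apparent width w is proportional to 1/range, so with constant
    closing speed v:
        s   = w_curr / w_prev = R_prev / R_curr
        TTC = R_curr / v = dt / (s - 1)
    Note that no absolute range is needed -- only the scale ratio,
    which is why this measurement is independent of the horizon-based
    range estimate.  Returns infinity when the object is not expanding.
    """
    s = width_curr_px / width_prev_px
    if s <= 1.0:
        return float("inf")  # receding or constant range: no collision
    return dt / (s - 1.0)
```

For instance, an object growing from 100 px to 110 px over 0.1 s yields a TTC of about 1.0 s.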
[0022] The second set of target data of the object 30 relative to the host vehicle 10 includes a second range measurement R2 between the object 30 and the host vehicle 10, a second range rate Ṙ2 of the object 30 relative to the host vehicle 10, a second angle θ2 of the direction to the object 30 relative to the host vehicle 10, a second angle rate θ̇2 of the direction to the object 30 relative to the host vehicle 10, a width measurement of the object WLEAD, an object classification 34 of the object 30, and a confidence value 36 of the object 30. The object classification 34 value is based upon common characteristics of known objects, such as height and width, to identify the object 30, for example, as a passenger vehicle, a pedestrian, a bicycle, or a stationary structure.
The confidence value 36 of the object 30 is essentially a measurement of whether the individual parts of the object 30 in the field of view 18 are moving together consistently enough to constitute a singular object 30. For example, if side rearview mirrors 38 (FIG. 1) of the object 30 move at substantially the identical range rate as a rear bumper 40 of the object 30, the confidence value 36 of the object 30 will be high. Again, with regard to the TTCIMAGE measurement, the camera 16 measures the TTCIMAGE directly and independently based on the measured rate of expansion of the object 30 from the plurality of images, whereas the camera 16 measures the second range and second range rate of the second set of target data based on the general position of the object relative to the horizon, as is generally known in the art. Accordingly, the TTCIMAGE measurement is statistically independent from the measurements in the second set of target data.
[0023] Referring now to FIG. 2, the object detection sensor 14 for monitoring the field of view 18 includes the radar system 32. The radar 32 measures the first set of target data of the object 30 relative to the host vehicle 10. However, it is contemplated that the sensor 14 may also or alternatively comprise a lidar, an ultrasonic, an active infrared, a passive infrared, a telematic, an additional camera, or any other sensor known in the art.
[0024] As illustrated in FIG. 2, the camera 16 generally comprises an imager 42 for capturing the plurality of images from the field of view 18 based on light waves received from the field of view 18 at known time intervals between times when images of the plurality of images are captured. The camera 16 also comprises an image processor 44 for processing the captured plurality of images to measure the second set of target data of the object 30 relative to the host vehicle 10 and to measure the image-based time-to-collision (TTCIMAGE) of the host vehicle 10 with the object 30, based on the rate of expansion and the position of the object 30 relative to the host vehicle 10. The camera 16 may be comprised of one or more cameras, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) image sensor device. The camera 16 generally implements instrumentation known in the art for capturing images, such that the imager 42 may comprise a visible light camera, a far infrared camera, and/or a near infrared camera. Further, the image processor 44 of the camera 16 is typically capable of buffering and processing the plurality of images in real time. It is also contemplated that the image processor 44 may be integrated in another processor or controller separate from the camera 16.
[0025] As further shown in FIG. 2, the vehicle dynamics detector 26 comprises a yaw-rate sensor 46, a speed sensor 48, and a global positioning system (GPS) 50 to measure an additional data set indicative of the kinematics of the host vehicle 10. It is contemplated that the vehicle dynamics detector 26 may include other sensors, such as a steering wheel angle sensor and an acceleration sensor, to detect other kinematic-related data of the host vehicle 10. The yaw-rate sensor 46 determines the yaw rate ω of the host vehicle 10 about a center of gravity of the host vehicle 10, measuring the rotational tendency of the host vehicle 10 about an axis perpendicular to the road surface. The speed sensor 48 measures the velocity VH of the host vehicle 10 in the direction of travel. As illustrated in dashed lines, the GPS 50 is optionally included as a component of the vehicle dynamics detector 26, such that the GPS 50 may be utilized to measure various kinematic properties and relative positioning data of the host vehicle 10.
[0026] The collision threat controller 24, as shown in FIG. 2, receives inputs from the sensor 14, the camera 16, and the vehicle dynamics detector 26. The collision threat controller 24 may include a microprocessor 52 and memory 54 according to one embodiment, and may be configured as part of a shared controller used for other purposes or configured with multiple microprocessors and memory units integrated in various locations and components as parts of or separate from the host vehicle 10. The memory 54 may include random access memory (RAM),
read-only memory (ROM), and electrically erasable programmable read-only memory (EEPROM). The controller 24 receives the first data set from the sensor 14, the second data set and the TTCIMAGE measurement from the camera 16, and the additional data set from the vehicle dynamics detector 26. The controller 24 processes the inputs received with a fusion module routine 56 and a plausibility module routine 58 to determine whether a countermeasure 60 should be actuated to avoid or mitigate a potential collision of the host vehicle 10 with the object 30. It is contemplated that the fusion module routine 56 and the plausibility module routine 58 may be combined or incorporated with other routines to perform the general collision detection and avoidance functions and plausibility checks as described herein.
[0027] Still referring to FIG. 2, the countermeasure 60 includes an autonomous braking controller 62 for activating the brake system 22 of the host vehicle 10. As illustrated, the countermeasure 60 may also include a driver warning system 64, an occupant seat belt pretension controller 66, an emergency steering controller 68, and an adaptive cruise controller 70. It is contemplated that additional countermeasures may be incorporated to avoid a collision of the host vehicle 10 with the object 30 or to mitigate damage to the host vehicle 10, any occupants, or the object 30 upon a collision of the host vehicle 10 with the object 30. The countermeasure module 60 actuates a countermeasure device if a threat of collision received from the fusion module 56 exceeds an actuation threshold and the actuation signal from the plausibility module 58 is generated by the plausibility module 58 and received by the countermeasure module 60, thereby statistically reducing the rate of falsely actuating the countermeasure device, such as the brake system 22, and improving reliability of the collision detection system 12. Ultimately, the plausibility module 58 generates an actuation signal if the measured TTCIMAGE is less than a calculated steering-based time-to-collision (TTCSTEERING) and a calculated braking-based time-to-collision (TTCBRAKING). Optionally, the plausibility module 58 may also perform additional plausibility checks to improve reliability, such as estimating whether the object 30 is in the path of the host vehicle 10 before generating the actuation signal.
[0028] Referring now to FIG. 3, a method for actuating the autonomous braking controller 62 (FIG. 2) for the brake system 22 (FIG. 2) of the host vehicle 10 is shown. At step 92, the object in the field of view 18 is sensed by the object detection sensor 14 on the host vehicle 10. The first data set of the object 30 is measured at step 93 with the object detection sensor 14, including the first range R1 and first range rate Ṙ1 of the object 30 relative to the host vehicle 10, the first angle θ1 and first angle rate θ̇1 of the object 30 relative to the host vehicle 10, and the relative movement determination of the object 30. The sensor 14 used to sense the object 30 and make these measurements is contemplated to be a radar system 32, as shown in FIG. 2; however, it may comprise other sensors known in the art.
[0029] The method further includes the step 94 of capturing the plurality of images based on light waves from the field of view 18 by the camera 16 on the host vehicle 10 at known time intervals between instances when the images of the plurality of images are captured. The captured images are processed at step 95, illustrated utilizing the image processor 44.
Thereafter, at step 96, the processed images are used to measure the second data set of the object 30, including the second range R2 and second range rate Ṙ2 of the object 30 relative to the host vehicle 10, the second angle θ2 and second angle rate θ̇2 of the object 30 relative to the host vehicle 10, the width WLEAD of the object 30, and the confidence value 36 of the object 30. The captured images are also processed at step 97 to independently measure the TTCIMAGE of the host vehicle 10 with the object 30 based solely on scalable differences of the object 30 derived from the plurality of images.
[0030] The vehicle dynamics detector 26 at step 98 senses the kinematics of the host vehicle 10.
At step 99, the additional data set is measured with the kinematic values from the vehicle dynamics detector 26, including the yaw-rate sensor 46 for measuring the yaw rate ω of the host vehicle 10 and the speed sensor 48 for measuring the longitudinal velocity VH of the host vehicle 10. As previously mentioned, it is contemplated that the GPS 50 or other sensors could be used to measure components of this additional data set.
[0031] The method further includes step 100 of fusing the first and second data sets to obtain the matched data set. The fusion module 56 (FIG. 2) determines the matched set of target data of the object 30 relative to the host vehicle 10 based on the first set and second set of target data received from the sensor 14 and the camera 16, respectively. Using the fused set of target data, also referred to as the matched data set, the fusion module 56 estimates a threat of collision value of the host vehicle 10 with the object 30. The threat of collision value has an increased reliability from utilizing measurements from both the first and second sets of data, as the matched set of data is derived by comparing the data sets and utilizing the more consistent value, utilizing the most accurate measurements based upon the type of sensor and camera, and/or utilizing a value between the measurements, such as an average of the first and second sets. Accordingly, the matched set of target data is a relatively optimized value based on the first and second sets of data. At step 102, the threat of collision of the host vehicle 10 with the object is then estimated based on the matched data set and the additional data set from the vehicle dynamics detector, measured at step 99.
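As a rough illustration of how a matched data set might be formed from overlapping radar and camera measurements, the following Python sketch blends each shared field with a weighted average. This is only one of the strategies named above (a "value between the measurements"); the function names, field names, and fixed weight are assumptions for illustration, not the patented fusion routine.

```python
def fuse(radar_value: float, camera_value: float,
         radar_weight: float = 0.5) -> float:
    # Weighted blend of one radar and one camera measurement; in practice the
    # weight would reflect which sensor is more accurate for that quantity.
    return radar_weight * radar_value + (1.0 - radar_weight) * camera_value

def matched_target(radar: dict, camera: dict) -> dict:
    # Fuse each measurement the two target data sets have in common.
    return {key: fuse(radar[key], camera[key])
            for key in ("range", "range_rate", "angle", "angle_rate")}
```

With a radar range of 20.0 m and a camera range of 22.0 m, for instance, the matched range comes out at 21.0 m.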
[0032] Still referring to FIG. 3, the TTCSTEERING of the host vehicle 10 with the object 30 is calculated at step 104 as a function of the second data set and the additional data set, including the longitudinal velocity VH of the host vehicle 10 and the yaw rate ω of the host vehicle 10.
The plausibility module 58 (FIG. 2) calculates the TTCSTEERING value to estimate the maximum time to avoid a collision of the host vehicle 10 with the object 30 by steering the host vehicle 10. Although this value can be derived or estimated with various functions, TTCSTEERING herein is calculated based on the second set of target data received from the camera 16 and an additional set of data received from a vehicle dynamics detector 26.
[0033] The TTCBRAKING of the host vehicle 10 with the object 30 is calculated at step 106 as a function of the additional data set, namely the longitudinal velocity VH of the host vehicle 10.
The plausibility module 58 (FIG. 2) calculates the TTCBRAKING value to estimate the maximum time to avoid a collision of the host vehicle 10 with the object 30 by braking with the brake system 22 (FIG. 1) of the host vehicle 10. This value can be derived or estimated in various ways, such as utilizing additional measurements of vehicle weight, road conditions, and braking capabilities. However, TTCBRAKING, as illustrated and described herein, is calculated based on the additional set of data received from a vehicle dynamics detector 26 and other values selected as constants, described in more detail below.
[0034] The method includes a determination step 108 of generating an actuation signal if the measured TTCIMAGE is less than the calculated TTCSTEERING and the TTCBRAKING. Step 108 is contemplated as a function of the plausibility module 58 (FIG. 2), allowing a countermeasure to be actuated only if the signal is present. Another determination step 110 includes comparing the threat of collision with a threshold value. This step is contemplated as a function of the fusion module routine 56 (FIG. 2); however, it may be performed by the brake controller 62 or another controller in the host vehicle 10. As shown at step 112, the countermeasure is only activated if the threat of collision exceeds the threshold and the actuation signal is generated, and otherwise the determination directs the system to the start of the method. If both the threat of collision exceeds the threshold and the actuation signal is generated, ultimately the autonomous braking controller is activated at step 114. Accordingly, upon activating the countermeasure, such as the brake controller 62, the countermeasure functions to prevent or mitigate a collision of the host vehicle 10 with the object 30. For instance, as illustrated, the brake controller 62 may actuate the brake system 22 of the host vehicle 10 at step 116. It is contemplated that the brake controller 62 is integrated with the brake system 22, as shown in FIG. 2.
[0035] Referring now to FIG. 4, a logic flow diagram of the plausibility module 58 is illustrated, where it can be seen that the TTCSTEERING is calculated at step 118 as a function of the second range R2 from the camera 16, the second angle θ2 from the camera 16, the measured object classification 34 from the camera 16, the velocity VH from the speed sensor 48 of the vehicle dynamics detector 26, and the yaw rate ω from the yaw-rate sensor 46 of the vehicle dynamics detector 26. In addition to calculating an output of the TTCSTEERING, the TTCSTEERING function also optionally determines whether the object 30 is in the path of the host vehicle 10, shown as an IN_PATH value 120.
[0036] More specifically, TTCSTEERING can be expressed as the following algorithm:

TTCSTEERING = sqrt( 2 · ( (WLEAD/2 + WHOST/2) − | R·θ + (ω·R²)/(2·V) | ) / (KLAT_MAX · ALAT_DRIVER_MAX) )

[0037] In the above expression, TTCSTEERING represents the maximum calculated time to avoid collision by steering the host vehicle 10. The TTCSTEERING logic is a simplified equation assuming no relative lateral velocity of the host vehicle 10 or the object 30. A more complex strategy could be defined using measured lateral velocity, among other things. WLEAD represents the width of the object 30, or lead vehicle, such as the width of a car, motorcycle, or pedestrian.
WLEAD may either be a constant or measured by the camera 16 or other sensor. WHOST, in turn, represents the width of the host vehicle 10. R equates to R2 and represents the range from the host vehicle 10 to the object 30, as measured by the camera 16. The ω variable represents the measured yaw rate of the host vehicle 10, which can be measured by the yaw-rate sensor 46, the GPS 50, the camera 16, or an inertial sensor. V equates to VH and represents the measured longitudinal velocity of the host vehicle 10, which can be measured by the speed sensor 48, the GPS 50, wheel speed sensors, the camera 16, or an inertial sensor. θ equates to θ2 and represents the relative angle from the host vehicle 10 to the object 30, as measured by the camera 16.
ALAT_DRIVER_MAX represents the maximum achievable lateral acceleration of the host vehicle 10 by steering. ALAT_DRIVER_MAX can be estimated as a constant or derived as a function of other information, such as road friction, speed, vehicle mass, brake system capabilities, driver preferences, or driver capabilities. KLAT_MAX simply represents a scaling factor which is typically less than one (1.0) and can be calibrated to achieve desirable calculations.
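The steering-avoidance logic described above (combined half-widths, reduced by the object's lateral offset, under a constant-lateral-acceleration manoeuvre) can be sketched in Python. This is an illustrative reading, not the patented implementation: the default values for a_lat_driver_max, k_lat_max, and the low-speed guard are invented placeholders, and the lateral-offset term r·θ + ω·r²/(2·v) follows the simplified no-lateral-velocity assumption stated in the text.

```python
import math

def ttc_steering(r, theta, v, omega, w_lead, w_host,
                 a_lat_driver_max=7.0, k_lat_max=0.8, min_speed=1.0):
    """Maximum time to avoid a collision by steering (illustrative sketch).

    Required lateral displacement = combined half-widths minus the object's
    predicted lateral offset (bearing term r*theta plus a yaw-rate path
    correction omega*r^2/(2*v)); constant lateral acceleration a then gives
    t = sqrt(2*d / a).
    """
    v = max(v, min_speed)  # guard the curvature term at very low speed
    lateral_offset = abs(r * theta + (omega * r * r) / (2.0 * v))
    lateral_needed = (w_lead / 2.0 + w_host / 2.0) - lateral_offset
    if lateral_needed <= 0.0:
        return 0.0  # object already outside the projected path corridor
    return math.sqrt(2.0 * lateral_needed / (k_lat_max * a_lat_driver_max))
```

For a lead vehicle dead ahead (θ = 0, ω = 0) with both widths at 1.8 m, the driver must displace the host 1.8 m laterally, which yields roughly 0.8 s with the placeholder constants.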
[0038] As illustrated in FIG. 5, the function used to calculate the TTCSTEERING is shown as a logic flow diagram, detailing the steps of the function. The diagram uses the same variables as outlined above to express the calculation steps. As illustrated, the measured object classification 34 from the camera 16 is compared with the VALID_OBJECT field to ensure that the classification is valid and to determine the corresponding WLEAD value. It is also contemplated that the WLEAD value may be measured. At switch 136, if the object classification is determined to be valid, half of the width of the object WLEAD is output, represented as HALF_WLEAD. This output is summed at step 138 with half of the width of the host vehicle WHOST, such that a driver of the host vehicle 10 would, at a maximum, need to move the host vehicle 10 laterally a distance equal to the combined halves of the WLEAD and the WHOST to avoid a collision, assuming the host vehicle 10 can freely move to either side of the object 30. This output is then reduced at step 140 by the absolute value of the output of the step referenced as 142, which, in addition to other steps leading to 142, provides a comparison of VH, or SPEED, with MIN_SPEED, a threshold speed for the host vehicle to exceed before the TTCSTEERING output value is reasonably accurate. In general, the remainder of the steps in FIG. 5 provides the mathematical steps to calculate TTCSTEERING, as recited in the algorithm above.
Again referencing FIG. 4, TTCBRAKING is shown as being calculated at step 122 as a function of speed or longitudinal velocity, which is generally equal to the velocity VH measurement from the speed sensor 48 of the vehicle dynamics detector 26. FIG. 6 illustrates a logic flow diagram detailing the steps of the function used to calculate TTCBRAKING. Essentially, the velocity V is divided by two times an estimated maximum longitudinal acceleration value, ALONG_DRIVER_MAX, to calculate the TTCBRAKING. The maximum longitudinal acceleration, ALONG_DRIVER_MAX, or deceleration, shown as a constant value, is selected for the host vehicle 10 based upon the type, weight, and brake configuration of the host vehicle 10. It is conceived that the maximum longitudinal acceleration, ALONG_DRIVER_MAX, could be a value derived from measuring road friction, vehicle mass, brake system conditions and capabilities, driver preferences, and/or driver capabilities, among other things.
More specifically, TTCBRAKING can be expressed as the following algorithm:

TTCBRAKING = (KLONG_MAX · V) / (2 · ALONG_DRIVER_MAX)

[0041] In the above expression, TTCBRAKING represents the maximum calculated time to avoid collision impact by braking the host vehicle 10. Again, a more complex strategy could be defined using measured lateral velocity, among other things. V equates to VH and represents the measured longitudinal velocity of host vehicle 10. ALONG_DRIVER_MAX represents the maximum achievable longitudinal acceleration of the host vehicle 10 by braking, which is selected as a constant here, although it is conceivable that it could be derived as a function of other sensor or selected information, such as road friction, speed, vehicle mass, brake system capabilities, driver preferences, and/or driver capabilities. For purposes of simplicity, ALONG_DRIVER_MAX herein is a selected constant rate of deceleration of the host vehicle. KLONG_MAX simply represents a scaling factor which is typically less than one (1.0) and can be calibrated to achieve desirable calculations.
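The braking relation (velocity divided by two times the assumed maximum deceleration, with a scaling factor) is a one-liner in Python. The default constants here are invented placeholders, not values from the patent.

```python
def ttc_braking(v, a_long_driver_max=9.0, k_long_max=0.9):
    """Maximum time to avoid impact by braking (illustrative sketch).

    Below this time-to-collision the host, decelerating at the assumed
    constant rate, can no longer stop short of the object; the scaling
    factor k_long_max < 1 keeps the estimate conservative.
    """
    return k_long_max * v / (2.0 * a_long_driver_max)
```

At VH = 18 m/s, for instance, the placeholder constants give 0.9 · 18 / (2 · 9) = 0.9 s.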
[0042] Referring again to FIG. 4, the TTCSTEERING and the TTCBRAKING values generated at steps 118 and 122, respectively, are compared at step 124, which is denoted as MIN, to output the lesser of the two values. The output from step 124 is summed with a constant, KUNCERTAINTY, to provide a value to be compared with the measured TTCIMAGE at step 126. If the value is greater than or equal to the measured TTCIMAGE at step 126, the plausibility module begins to generate the actuation signal. Aside from KUNCERTAINTY, a constant value used to calibrate the plausibility module 58, the plausibility module 58 begins to generate the actuation signal if the measured TTCIMAGE is determined to be less than the calculated TTCSTEERING and the TTCBRAKING. As shown in FIG. 4, additional plausibility checks may be performed at step 128 before the plausibility module generates the actuation signal. One plausibility check is ensuring that the IN_PATH value 120 estimated by the function indicates that the object 30 is in the path of travel of the host vehicle 10.
[0043] Specifically, the IN_PATH value function or pseudocode logic determination can be expressed as follows:

IF ( | R·θ + (ω·R²)/(2·V) | < (WLEAD/2 + WHOST/2) ) THEN IN_PATH = TRUE ELSE IN_PATH = FALSE

[0044] In the above expression, the input variables represent the same values as measured or calculated in the TTCSTEERING expression. Accordingly, it is conceivable that a more complex strategy could be defined using measured lateral velocity, among other things.
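One plausible reading of the IN_PATH determination, consistent with the corridor logic of FIG. 5, is sketched below in Python. The exact condition is not fully legible in this copy of the specification, so both the offset expression and the low-speed guard are assumptions for illustration.

```python
def in_path(r, theta, v, omega, w_lead, w_host, min_speed=1.0):
    # Treat the object as in the host path when its predicted lateral offset
    # (bearing term plus yaw-rate path correction) lies inside the corridor
    # spanned by the combined half-widths of the lead and host vehicles.
    lateral_offset = abs(r * theta + (omega * r * r) / (2.0 * max(v, min_speed)))
    return lateral_offset < (w_lead / 2.0 + w_host / 2.0)
```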
[0045] Still referring to FIG. 4, the plausibility module 58 may also perform a plausibility check at step 130 that the confidence value 36 of the object 30 measured by the camera 16 exceeds a confidence threshold 80. The confidence value 36 essentially is a measurement of whether the individual parts of the object 30 are moving together consistently; for example, the side rearview mirrors 38 of the object 30 move at a substantially identical range rate as the rear bumper 40 of the object 30. Step 128 only allows the actuation signal to be generated if all the plausibility checks are passed and the TTCIMAGE is determined to be less than the calculated TTCSTEERING and the TTCBRAKING. Therefore, the countermeasure 60 (FIG. 2) may actuate based additionally on whether the object 30 is in the path of travel of the vehicle 10 and whether the confidence value 36 of the object exceeds the confidence threshold 80.
[0046] If all the plausibility checks have been passed, the plausibility module 58 may optionally include a time delay at step 132 to continue to generate the actuation signal for a set constant period of time, KOFF_TIME_DELAY, such as 0.5 seconds, after the actuation signal has generated for another set constant period of time, KON_TIME_DELAY, such as 0.5 seconds, to ensure that the countermeasure does not lose the actuation signal due to conditions created by the countermeasure or a momentary failure of one of the plausibility checks. The delay at step 132 can be expressed in pseudocode logic as follows:
IF
actuation signal from step 128 is TRUE continuously for the past KON_TIME_DELAY seconds
THEN
actuation signal shall remain TRUE for KOFF_TIME_DELAY seconds following the above conditions transitioning to FALSE.
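The on/off delay of step 132 can be sketched as a small state machine in Python. The class name and the timestamp-based interface are assumptions; only the KON/KOFF hold behavior follows the pseudocode above.

```python
class ActuationDelay:
    """Debounce-and-hold filter for the actuation signal: the raw signal
    must be TRUE continuously for k_on seconds before the output turns on,
    and the output is then held TRUE for k_off seconds after the raw
    signal last satisfied that condition."""

    def __init__(self, k_on: float = 0.5, k_off: float = 0.5):
        self.k_on, self.k_off = k_on, k_off
        self.true_since = None   # time the raw signal last became TRUE
        self.hold_until = -1.0   # time until which the output stays TRUE

    def update(self, raw: bool, now: float) -> bool:
        if raw:
            if self.true_since is None:
                self.true_since = now
            if now - self.true_since >= self.k_on:
                self.hold_until = now + self.k_off  # refresh the hold window
        else:
            self.true_since = None  # continuity broken; restart the on-delay
        return now <= self.hold_until
```

With the 0.5 s values from the text, a raw signal held TRUE from t = 0 enables the output at t = 0.5 s, and the output then persists until 0.5 s after the raw signal drops.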
[0047] As shown at step 134, the time delay at step 132 may alternatively be used to generate the actuation signal when it is not received directly from step 128. It is also contemplated that other time delays may be included at several other locations in the plausibility module, such as in concert with the plausibility checks 120 and 130.
[0048] In a simplified expression of the plausibility module 58, utilizing the optional IN_PATH value check, the actuation signal is enabled, or generated, when a CMbB_PLAUSIBLE variable is true. This function or pseudocode logic determination, as also partially illustrated in FIG. 4,
can be expressed as follows:
IF
( MIN(TTCSTEERING, TTCBRAKING) + KUNCERTAINTY > TTCMEASURED ) AND (IN_PATH = TRUE)
THEN
CMbB_PLAUSIBLE = TRUE -> Countermeasure Actuation Signal is Enabled
ELSE
CMbB_PLAUSIBLE = FALSE -> Countermeasure Actuation Signal is Disabled [0049] In the above expression, or logical determination, TTCMEASURED equates to TTCIMAGE and represents the time to collision between the host vehicle 10 and the object 30, as measured by the camera 16. In addition, KUNCERTAINTY again simply represents a constant that can be calibrated to achieve desirable outcomes.
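The simplified gate of [0048] translates directly to Python. The function name and the sample KUNCERTAINTY default are illustrative assumptions only.

```python
def cmbb_plausible(ttc_measured: float, ttc_steering: float,
                   ttc_braking: float, in_path: bool,
                   k_uncertainty: float = 0.1) -> bool:
    # Enable the countermeasure actuation signal only when the camera-measured
    # TTC undercuts both avoidance times (padded by a calibration constant)
    # and the object is estimated to be in the host path.
    return (min(ttc_steering, ttc_braking) + k_uncertainty
            > ttc_measured) and in_path
```

With a measured TTC of 0.7 s against avoidance times of 0.8 s and 0.9 s for an in-path object, the signal is enabled; raising the measured TTC to 1.5 s (plenty of time left to steer or brake) disables it.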
[0050] Referring again to FIGS. 2 and 3, the countermeasure module 60 actuates a countermeasure device if the threat of collision received from the fusion module 56 exceeds an actuation threshold and the actuation signal from the plausibility module 58 is generated by the plausibility module 58 and received by the countermeasure module 60. Upon this occurrence, the autonomous braking controller 62 for the brake system 22 of the host vehicle 10 is actuated.
The actuation signal generated by the plausibility module 58 statistically reduces the rate of falsely actuating the countermeasure device, specifically the autonomous braking controller 62,
and improves reliability of the overall collision detection system 12. Accordingly, the improved reliability reduces the expense and extent of validation needed for implementing an autonomous braking system.
[0051] It will be understood by one having ordinary skill in the art that construction of the described invention and other components is not limited to any specific material. Other exemplary embodiments of the invention disclosed herein may be formed from a wide variety of materials, unless described otherwise herein.
[0052] For purposes of this disclosure, the term "coupled" (in all of its forms: couple, coupling, coupled, etc.) generally means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature.
Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components. Such joining may be permanent in nature or may be removable or releasable in nature unless otherwise stated.
[0053] It is also important to note that the construction and arrangement of the elements of the invention as shown in the exemplary embodiments is illustrative only. Although only a few embodiments of the present innovations have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts or elements shown as multiple parts may be integrally formed, the operation of the interfaces may be reversed or otherwise varied, the length or width of the structures and/or members or connector or other elements of the system may be varied, and the nature or number of adjustment positions provided between the elements may be varied. It should be noted that the elements and/or assemblies of the system may be constructed from any of a wide variety of materials that provide sufficient strength or durability, in any of a wide variety of colors, textures, and combinations. Accordingly, all such modifications are intended to be included within the scope of the present innovations. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the desired and other exemplary embodiments without departing from the spirit of the present innovations.
[0054] It will be understood that any described processes or steps within described processes may be combined with other disclosed processes or steps to form structures within the scope of the present invention. The exemplary structures and processes disclosed herein are for illustrative purposes and are not to be construed as limiting.
[0055] It is also to be understood that variations and modifications can be made on the aforementioned structure without departing from the concepts of the present invention, and further it is to be understood that such concepts are intended to be covered by the following claims unless these claims by their language expressly state otherwise.
Claims (20)
- What is claimed is: 1. A collision detection system for a host vehicle comprising: a sensor for detecting an object in a field of view and measuring a first set of target data based on a position of the object relative to the host vehicle; a camera for capturing a plurality of images from the field of view and processing the plurality of images to measure a second set of target data based on the position of the object relative to the host vehicle and to measure an image-based time-to-collision (TTCIMAGE) of the host vehicle with the object based on scalable differences of the plurality of images; a fusion module for determining a matched set of target data based on the first and second sets of target data received from the sensor and the camera, wherein the fusion module estimates a threat of collision of the host vehicle with the object based on the matched set of target data; a plausibility module for calculating a steering-based time-to-collision (TTCSTEERING) and a braking-based time-to-collision (TTCBRAKING) of the host vehicle with the object based on the second set of target data received from the camera and an additional set of data received from a vehicle dynamics detector, wherein the plausibility module generates an actuation signal if the TTCIMAGE is less than the TTCSTEERING and the TTCBRAKING; and a countermeasure module for actuating a countermeasure device if the threat of collision received from the fusion module exceeds an actuation threshold and the actuation signal from the plausibility module is generated and received.
- 2. The collision detection system of claim 1, wherein the first set of target data measured by the sensor includes a first range and a first range rate of the object relative to the host vehicle, a first angle and a first angle rate of the object relative to the host vehicle, and a relative movement determination of the object, and wherein the sensor comprises a radar system.
- 3. The collision detection system of claim 1, wherein the camera includes an imager for capturing the plurality of images from the field of view based on light waves received from the field of view at known time intervals between times when images of the plurality of images are captured.
- 4. The collision detection system of claim 1, wherein the second set of target data measured by the camera includes a second range and second range rate of the object relative to the host vehicle, a second angle and second angle rate of the object relative to the host vehicle, a width measurement of the object, and an object classification of the object.
- 5. The collision detection system of claim 4, wherein the vehicle dynamics detector includes a yaw-rate sensor and a speed sensor on the host vehicle, and wherein the additional set of data received from the vehicle dynamics detector includes a yaw rate of the host vehicle measured by the yaw-rate sensor and a velocity of the host vehicle measured by the speed sensor.
- 6. The collision detection system of claim 5, wherein the TTCSTEERING is calculated as a function of the second range and the second range rate, the second angle and the second angle rate, the object classification, the yaw rate, and the velocity, and wherein the plausibility module estimates whether the object is in a path of travel of the host vehicle.
- 7. The collision detection system of claim 5, wherein the TTCBRAKING is calculated as a function of the velocity measured by the speed sensor and a selected constant rate of deceleration of the host vehicle.
- 8. The collision detection system of claim 1, wherein the countermeasure device comprises an autonomous braking controller for activating a brake system of the host vehicle.
- 9. A collision detection system for a vehicle comprising: a sensor; a camera; a fusion module for estimating a collision threat with an object using data of the object relative to the vehicle from the sensor and the camera; a plausibility module for generating a signal if a measured image-based time-to-collision is less than a calculated steering-based time-to-collision and a calculated braking-based time-to-collision; and a countermeasure that actuates based on the collision threat and the signal.
- 10. The collision detection system of claim 9, wherein the measured image-based time-to-collision with the object is based on scalable differences of images captured by the camera.
- 11. The collision detection system of claim 9, wherein the camera includes an imager for capturing a plurality of images from light waves received from a field of view at known time intervals between times when images of the plurality of images are captured, and wherein the camera includes an image processor for processing the plurality of images for measuring the data and the image-based time-to-collision.
- 12. The collision detection system of claim 9, wherein the data used by the fusion module comprises: a first data set measured by the sensor having a first range and a first range rate of the object relative to the vehicle, a first angle and a first angle rate of the object relative to the vehicle, and a relative movement determination of the object; and a second data set measured by the camera having a second range and a second range rate of the object relative to the host vehicle, a second angle and a second angle rate of the object relative to the host vehicle, a width measurement of the object, a classification of the object, and a confidence value of the object.
- 13. The collision detection system of claim 12, wherein the steering-based time-to-collision is calculated as a function of the second range and the second range rate, the second angle and the second angle rate, the classification of the object, a measured yaw rate of the vehicle, and a measured velocity of the vehicle.
- 14. The collision detection system of claim 12, wherein the braking-based time-to-collision is calculated as a function of a measured velocity of the vehicle and an estimated rate of deceleration of the vehicle.
- 15. The collision detection system of claim 12, wherein the plausibility module estimates whether the object is in a path of travel of the vehicle, and wherein the countermeasure actuates based additionally on whether the object is in the path of travel of the vehicle and whether the confidence value of the object exceeds a confidence threshold.
- 16. The collision detection system of claim 9, further comprising: a vehicle dynamics detector having a yaw-rate sensor and a speed sensor, wherein an additional set of data measured by the vehicle dynamics detector includes a yaw rate from the yaw-rate sensor and a velocity of the vehicle from the speed sensor.
- 17. The collision detection system of claim 9, wherein the countermeasure comprises an autonomous braking controller for activating a brake system of the vehicle, and wherein the sensor comprises a radar system.
- 18. The collision detection system of claim 9, wherein the countermeasure actuates if the collision threat exceeds a threshold and the signal is generated by the plausibility module.
- 19. A method for actuating an autonomous braking controller for a brake system of a host vehicle, the method comprising the steps of: sensing an object in a field of view by an object detection sensor on the host vehicle; measuring a first data set of the object with the object detection sensor, including a first range and a first range rate of the object relative to the host vehicle, a first angle and a first angle rate of the object relative to the host vehicle, and a relative movement determination of the object; capturing a plurality of images based on light waves from the field of view by a camera on the host vehicle at known time intervals between instances when the images of the plurality of images are captured; processing the plurality of images to measure a second data set of the object, including a second range and a second range rate of the object relative to the host vehicle, a second angle and a second angle rate of the object relative to the host vehicle, a width of the object, and an image-based time-to-collision (TTCIMAGE) of the host vehicle with the object based on scalable differences of the object derived from the plurality of images; measuring an additional data set with a vehicle dynamics detector, including a yaw-rate sensor for measuring a yaw rate of the host vehicle and a speed sensor for measuring a longitudinal velocity of the host vehicle; processing the first and second data sets, the TTCIMAGE, and the additional data set; estimating a threat of collision of the host vehicle with the object based on a combination of the first and second data sets; determining a steering-based time-to-collision (TTCSTEERING) of the host vehicle with the object as a function of the second data set, the longitudinal velocity of the host vehicle, and the yaw rate of the host vehicle; determining a braking-based time-to-collision (TTCBRAKING) of the host vehicle with the object as a function of the longitudinal velocity of the host vehicle and a selected constant rate of deceleration of the host vehicle; generating an actuation signal if the TTCIMAGE is less than the TTCSTEERING and the TTCBRAKING; and actuating the autonomous braking controller for the brake system of the host vehicle based on the threat of collision and the actuation signal.
- 20. The method for actuating an autonomous braking controller of claim 19, wherein the step of actuating the autonomous braking controller for the brake system of the host vehicle is performed if the actuation signal is generated and the threat of collision of the host vehicle with the object exceeds an actuation threshold.
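Claim 11 measures an image-based time-to-collision from images captured at known time intervals, and claim 19 ties it to "scalable differences" of the object across those images. A minimal sketch of one way such a TTC can be derived, from the growth of the object's apparent width between two frames (the function and the numbers are illustrative, not taken from the patent):

```python
def ttc_from_image_widths(w1, w2, dt):
    """Estimate time-to-collision from the apparent width of an object
    in two images captured dt seconds apart.

    For an object closing at constant speed, image width is inversely
    proportional to distance, so the scale ratio w2/w1 encodes the
    remaining time: TTC = dt / (w2/w1 - 1) = dt * w1 / (w2 - w1).
    """
    if w2 <= w1:
        return float("inf")  # object not growing in the image: not closing
    return dt * w1 / (w2 - w1)

# Object 20 m away closing at 10 m/s: over dt = 0.1 s the range drops to
# 19 m, so the width grows by a factor of 20/19 and the TTC is 1.9 s.
print(ttc_from_image_widths(1.0, 20.0 / 19.0, 0.1))
```

Note that this estimate needs no calibrated range measurement, only the relative scale change between frames, which is why a camera can supply it independently of the radar track.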
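Claim 14 states only that the braking-based TTC is a function of the host vehicle's velocity and an estimated (per claim 19, selected constant) rate of deceleration. One common formulation of a last-point-to-brake threshold, shown here as an assumption rather than the patent's actual formula, divides the stopping distance v²/(2a) by the closing speed v:

```python
def ttc_braking(v, a_decel):
    """Latest time-to-collision at which constant-deceleration braking
    can still avoid impact with a stationary object (an assumed common
    formulation; the claim only says the value is a function of
    velocity and deceleration).

    Stopping distance is v**2 / (2 * a_decel); dividing by the closing
    speed v gives the threshold v / (2 * a_decel).
    """
    return v / (2.0 * a_decel)

print(ttc_braking(20.0, 8.0))  # 20 m/s at 8 m/s^2 of braking → 1.25 s
```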
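The decision logic of claims 19 and 20, where the plausibility signal is generated only when the image-based TTC undercuts both the steering-based and braking-based TTCs, and braking then actuates only if the fused collision threat also exceeds a threshold, can be sketched as follows (function names and threshold values are illustrative assumptions):

```python
def actuate_braking(ttc_image, ttc_steering, ttc_braking,
                    threat, threat_threshold):
    """Illustrative sketch of the claims 19-20 decision logic.

    The plausibility signal fires only when the camera's image-based
    TTC is below both avoidance TTCs, i.e. the collision can no longer
    be avoided by steering or by braking alone.
    """
    signal = ttc_image < ttc_steering and ttc_image < ttc_braking
    # Claim 20: brake only if the signal is generated AND the fused
    # threat estimate exceeds the actuation threshold.
    return signal and threat > threat_threshold

# Imminent collision: image TTC below both avoidance TTCs, high threat.
print(actuate_braking(0.8, 1.0, 1.2, threat=0.9, threat_threshold=0.7))  # → True
# Steering could still avoid it: no autonomous braking.
print(actuate_braking(1.1, 1.0, 1.2, threat=0.9, threat_threshold=0.7))  # → False
```

Gating the radar/camera fusion threat with an independently derived camera-only signal is what makes this a plausibility check: a single-sensor fault is less likely to trigger a false autonomous-braking event.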
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261677274P | 2012-07-30 | 2012-07-30 | |
US13/598,166 US9187091B2 (en) | 2012-08-29 | 2012-08-29 | Collision detection system with a plausibility module |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201312853D0 GB201312853D0 (en) | 2013-09-04 |
GB2506479A true GB2506479A (en) | 2014-04-02 |
Family
ID=49118917
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1312853.3A Withdrawn GB2506479A (en) | 2012-07-30 | 2013-07-18 | Collision detection system with a plausibility module |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102013108000A1 (en) |
GB (1) | GB2506479A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110723142A (en) * | 2019-09-20 | 2020-01-24 | 江苏大学 | Intelligent automobile emergency collision avoidance control method |
DE102015214748B4 (en) | 2015-08-03 | 2023-07-20 | Bayerische Motoren Werke Aktiengesellschaft | Brake assistant for controlling an automatic deceleration of a motor vehicle |
US12109992B2 (en) | 2020-06-05 | 2024-10-08 | Zf Friedrichshafen Ag | Braking system for a vehicle and vehicle with a braking system of this kind |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9499141B1 (en) | 2015-09-29 | 2016-11-22 | Faraday & Future Inc. | Sensor-triggering of friction and regenerative braking |
DE102015226116A1 (en) | 2015-12-18 | 2017-06-22 | Robert Bosch Gmbh | A method of assessing a hazard situation detected by at least one sensor of a vehicle, a method of controlling a replay of a hazard alert, and a method of displaying a hazard alert |
DE102016001308A1 (en) | 2016-02-05 | 2017-08-10 | Audi Ag | Method for operating a vehicle and vehicle for carrying out the method |
DE102018211240A1 (en) * | 2018-07-07 | 2020-01-09 | Robert Bosch Gmbh | Method for classifying an object's relevance |
US20210197805A1 (en) * | 2019-12-27 | 2021-07-01 | Motional Ad Llc | Safety system for vehicle |
DE102020132431A1 (en) | 2020-12-07 | 2022-06-09 | Valeo Schalter Und Sensoren Gmbh | Determination of an inattentive state of a driver of a target vehicle in an environment of an ego vehicle |
CN113335311B (en) * | 2021-07-22 | 2022-09-23 | 中国第一汽车股份有限公司 | Vehicle collision detection method and device, vehicle and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008004077A2 (en) * | 2006-06-30 | 2008-01-10 | Toyota Jidosha Kabushiki Kaisha | Automotive drive assist system with sensor fusion of radar and camera and probability estimation of object existence for varying a threshold in the radar |
EP1898232A1 (en) * | 2006-09-08 | 2008-03-12 | Ford Global Technologies, LLC | Method and system for collision avoidance |
- 2013
- 2013-07-18 GB GB1312853.3A patent/GB2506479A/en not_active Withdrawn
- 2013-07-26 DE DE102013108000.6A patent/DE102013108000A1/en active Pending
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102015214748B4 (en) | 2015-08-03 | 2023-07-20 | Bayerische Motoren Werke Aktiengesellschaft | Brake assistant for controlling an automatic deceleration of a motor vehicle |
CN110723142A (en) * | 2019-09-20 | 2020-01-24 | 江苏大学 | Intelligent automobile emergency collision avoidance control method |
CN110723142B (en) * | 2019-09-20 | 2020-12-18 | 江苏大学 | Intelligent automobile emergency collision avoidance control method |
US12109992B2 (en) | 2020-06-05 | 2024-10-08 | Zf Friedrichshafen Ag | Braking system for a vehicle and vehicle with a braking system of this kind |
Also Published As
Publication number | Publication date |
---|---|
DE102013108000A1 (en) | 2014-01-30 |
GB201312853D0 (en) | 2013-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9187091B2 (en) | Collision detection system with a plausibility module | |
GB2506479A (en) | Collision detection system with a plausibility module | |
US7447592B2 (en) | Path estimation and confidence level determination system for a vehicle | |
US11673545B2 (en) | Method for automated prevention of a collision | |
EP3007932B1 (en) | Door protection system | |
EP3039450B1 (en) | In-vehicle control device | |
US9981622B2 (en) | Occupant protection control system, storage medium storing program, and vehicle | |
US7389171B2 (en) | Single vision sensor object detection system | |
US7729858B2 (en) | Travel safety apparatus for vehicle | |
US7480570B2 (en) | Feature target selection for countermeasure performance within a vehicle | |
US11431958B2 (en) | Vision system and method for a motor vehicle | |
CN106585631B (en) | Vehicle collision system and method of using same | |
EP1566657A2 (en) | Collision detection system and method of estimating target crossing location | |
JP2007310741A (en) | Solid object recognition device | |
KR101604805B1 (en) | Weight measuring device for vehicle based on a camera and method thereof | |
US11338801B2 (en) | Collision avoidance device | |
JP2014008957A (en) | Collision prevention system and method | |
US20160075373A1 (en) | Control system for vehicle | |
US20130124053A1 (en) | Vehicle safety system and method with split active/passive processing | |
US20170263127A1 (en) | Vehicle collision system and method of using the same | |
US10977506B2 (en) | Apparatus for determining visual confirmation target | |
CN107499272B (en) | Method and control unit for controlling passenger protection means of a vehicle | |
CN107533133B (en) | Method and device for monitoring an area in front of a vehicle | |
US20060100760A1 (en) | Device for determining the actual vehicle speed | |
JP2009208670A (en) | Vehicular warning device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |