US20090299633A1 - Vehicle Pre-Impact Sensing System Having Terrain Normalization - Google Patents


Info

Publication number
US20090299633A1
US20090299633A1
Authority
US
United States
Prior art keywords
vehicle
zones
signal
zone
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/468,881
Inventor
Kevin J. Hawes
Ronald M. Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Delphi Technologies Inc
Original Assignee
Delphi Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Delphi Technologies Inc filed Critical Delphi Technologies Inc
Priority to US12/468,881 priority Critical patent/US20090299633A1/en
Assigned to DELPHI TECHNOLOGIES, INC. reassignment DELPHI TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HAWES, KEVIN J.
Assigned to DELPHI TECHNOLOGIES, INC. reassignment DELPHI TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HAWES, KEVIN J., TAYLOR, RONALD M.
Publication of US20090299633A1 publication Critical patent/US20090299633A1/en
Abandoned legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 21/00 — Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R 21/01 — Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R 21/013 — Electrical circuits for triggering passive safety arrangements including means for detecting collisions, impending collisions or roll-over
    • B60R 21/0134 — Electrical circuits for triggering passive safety arrangements responsive to imminent contact with an obstacle, e.g. using radar systems
    • B60R 21/0132 — Electrical circuits for triggering passive safety arrangements responsive to vehicle motion parameters, e.g. to vehicle longitudinal or transversal deceleration or speed value
    • B60R 2021/0002 — Type of accident
    • B60R 2021/0006 — Lateral collision
    • B60R 2021/01013 — Means for detecting collision, impending collision or roll-over
    • B60R 2021/01027 — Safing sensors
    • B60R 2021/01327 — Angular velocity or angular acceleration
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 — Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/88 — Lidar systems specially adapted for specific applications
    • G01S 17/93 — Lidar systems specially adapted for anti-collision purposes
    • G01S 17/931 — Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 — Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 — Radar or analogous systems specially adapted for anti-collision purposes
    • G01S 13/931 — Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 2013/9327 — Sensor installation details
    • G01S 2013/93274 — Sensor installation details on the side of the vehicles
    • G01S 7/00 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 — Details of systems according to group G01S17/00
    • G01S 7/4802 — Details of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section

Definitions

  • the present application generally relates to vehicle crash sensing and, more particularly, relates to a system and method of sensing an imminent collision of an object with a vehicle prior to impact.
  • Automotive vehicles are commonly equipped with passenger restraint and crash mitigation devices such as seat belts, front air bags, side air bags and side curtains. These and other devices may be deployed in the event of a collision with the host vehicle to mitigate adverse effects to the vehicle and the occupants in the vehicle. With respect to activated devices, such as air bags and side curtain bags, these devices generally must be deployed quickly and in a timely fashion. Typically, these types of devices are deployed when sensors (e.g., accelerometers) mounted on the vehicle sense a severe impact with the vehicle.
  • vision systems employing cameras may be used to monitor the surrounding environment around the vehicle and the video images may be processed to determine if an object appears to be on a collision path with the vehicle.
  • vision systems, however, are generally very expensive and suffer from a number of drawbacks.
  • a vehicle pre-impact sensing system includes an array of energy signal transmitters mounted on a vehicle for transmitting signals within multiple transmit zones spaced from the vehicle.
  • the system further includes an array of receiver elements mounted on the vehicle for receiving the signals reflected from an object located in one or more multiple receive zones indicative of the object being in certain one or more receive zones.
  • the system also includes a processor for processing the received reflected signals and determining range, location, speed and direction of the object.
  • the processor performs a normalization to remove stationary targets by propagating a leading signal from one zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal.
  • the processor further determines whether the object is expected to impact the vehicle as a function of the range, location, speed, direction of the object and the normalized signal, and generates an output indicative of a sensed pre-impact event.
  • a method of detecting an expected impact of an object with a vehicle includes the steps of transmitting signals within multiple transmit zones spaced from the vehicle, one zone at a time, receiving signals reflected from an object located in the one or more multiple zones indicative of the object being in certain one or more receive zones, and processing the received reflected signal. The method also determines location of the object, determines speed of the object and determines direction of the object.
  • the method further performs a normalization to remove stationary targets by propagating a leading signal from one zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal, determines whether the object is expected to impact the vehicle as a function of the determined range, location, speed and direction of the object and the normalized signal and generates an output signal indicative of a sensed pre-impact event.
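The normalization step above can be sketched as follows. This is an illustrative Python model, not the patent's implementation: the zone spacing, scan period, and all names are assumptions. The idea is that a stationary roadside target seen in a leading zone should reappear in the adjacent following zone after a delay set by the host vehicle's speed, so subtracting the suitably delayed leading-zone signal leaves roughly zero for stationary terrain.

```python
# Hypothetical sketch of terrain normalization: propagate the leading
# zone's signal rearward by a vehicle-speed-dependent delay and subtract
# it from the following zone. Constants are illustrative assumptions.

ZONE_PITCH_M = 0.3      # assumed fore-aft spacing between zone centers
SCAN_PERIOD_S = 0.003   # assumed time per full scan cycle

def normalize_stationary(leading_history, following_signal, vehicle_speed_mps):
    """Subtract the leading zone's signal, delayed by the time a stationary
    target needs to drift rearward one zone at the given vehicle speed."""
    if vehicle_speed_mps <= 0:
        return following_signal
    # Scan cycles for a stationary target to move one zone rearward.
    delay_scans = round(ZONE_PITCH_M / (vehicle_speed_mps * SCAN_PERIOD_S))
    if delay_scans >= len(leading_history):
        return following_signal            # not enough history yet
    predicted = leading_history[-1 - delay_scans]  # delayed leading signal
    return following_signal - predicted    # ~0 for stationary terrain
```

For example, at 10 m/s a stationary return observed in the leading zone reappears about ten scans later in the following zone, so the normalized signal cancels; a laterally approaching object does not follow this rearward-drift pattern and survives normalization.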
  • FIG. 1 is a side perspective view of a vehicle employing a pre-impact crash sensing system illustrating an array of infrared (IR) transmit zones, according to one embodiment
  • FIG. 2 is a side perspective view of the vehicle employing the pre-impact crash sensing system showing an array of receiver photodetection zones, according to one embodiment
  • FIG. 3 is a top view of the vehicle further illustrating the IR transmit zones employed in the crash sensing system of FIG. 1 ;
  • FIG. 4 is a top view of the vehicle further illustrating the IR photoreceiver zones employed in the crash sensing system shown in FIG. 2 ;
  • FIG. 5 is an enlarged view of an integrated IR transmitter/receiver employed in the crash sensing system, according to one embodiment
  • FIG. 6 is a block diagram illustrating the pre-impact crash sensing system, according to one embodiment
  • FIG. 7 is a flow diagram illustrating a routine for sensing a pre-impact collision of an object with the vehicle, according to one embodiment
  • FIG. 8 is a rear view of the vehicle showing the IR transmit zones including an additional elevated horizontal IR transmit zone, according to one embodiment
  • FIGS. 9A and 9B are a flow diagram illustrating a routine for sensing a vehicle feature to update a range estimate and a side impact crash, according to one embodiment
  • FIG. 10 is a calibration chart illustrating reflected ripple signals as a function of distance for various identified objects, according to one example
  • FIG. 11 illustrates a U-shaped IR transmit beam superimposed onto the front of an oncoming vehicle for use in detecting a vehicle fascia feature(s), according to one embodiment
  • FIG. 12 illustrates an array of the U-shaped IR transmit beams shown in FIG. 11 superimposed onto an oncoming vehicle for detecting the vehicle fascia feature(s);
  • FIG. 13 illustrates receiver output signals and ripple signal in frequency (Hz) as a function of time as the host vehicle drives by another vehicle, according to one example
  • FIG. 14 illustrates the host vehicle traveling relative to a stationary barrier, lateral displaced vehicles and a lateral approaching vehicle, to illustrate terrain normalization, according to one embodiment
  • FIGS. 15A-15E illustrate the passing of a stationary object and terrain normalization to detect the object as stationary
  • FIGS. 15A-A through 15E-E are timing diagrams that illustrate normalization of a detected object as it passes through detection zones A1-A3 shown in FIGS. 15A-15E, respectively;
  • FIGS. 16A-16D illustrate the passing of an angled stationary object and the terrain normalization for detecting the stationary target
  • FIGS. 16A-A through 16D-D are timing diagrams illustrating terrain normalization as the angled object passes through the detection zones shown in FIGS. 16A-16D, respectively;
  • FIGS. 17A-17D illustrate terrain normalization on an object moving laterally with respect to the host vehicle, according to one example.
  • FIGS. 18A-18C illustrate a routine for providing terrain normalization, according to one embodiment.
  • a vehicle pre-impact crash sensing system 20 is generally illustrated as employed on a host vehicle 10 , according to one embodiment.
  • the crash sensing system 20 is shown and described herein configured to detect a pre-impact collision of an object (e.g., another vehicle) with the vehicle 10 , particularly on one or both lateral sides of the vehicle 10 .
  • the crash sensing system 20 may be employed to detect a pre-impact event on any side of the vehicle 10 , including one or both lateral sides, the front side and the rear side.
  • the host vehicle 10 is generally shown as an automotive wheeled vehicle having opposite lateral sides and exterior side view mirror housings 12 on opposite lateral sides.
  • the crash sensing system 20 generally includes an integrated infrared (IR) transmitter/receiver 25 shown mounted generally in one of the mirror housings 12 of the vehicle 10 , at a position sufficient to detect objects located adjacent to the corresponding lateral side of the vehicle 10 . While lateral crash sensing is shown and described herein for sensing a collision on one side of the host vehicle 10 , it should be appreciated that the crash sensing may also be employed on the opposite lateral side of the vehicle.
  • the integrated transmitter/receiver array 25 may be located at other locations on the vehicle 10 and positioned to detect one or more objects in the desired vicinity of the vehicle 10 .
  • the IR transmitter/receiver 25 includes a plurality of IR transmitters ( 22 A- 22 I) and a plurality of IR receivers 24 A- 24 I, as shown in FIG. 5 .
  • the IR transmitters 22A-22I transmit infrared (IR) energy signals within designated transmit beam patterns 32A-32I in a three-by-three (3×3) array spaced from the lateral side of the host vehicle 10, according to one embodiment.
  • the infrared transmitter array 25 has a plurality (e.g., nine) of infrared transmitters 22 A- 22 I for transmitting infrared radiation signals within designated corresponding transmit zones 32 A- 32 I.
  • the transmitter array is activated to sequentially transmit infrared radiation signals in one zone at a time, such as zone 32 A, and then switches sequentially to the next zone, such as zone 32 B, and then to zone 32 C, and continues through the entire array to zone 32 I, and cyclically repeats the process at a high rate of speed, e.g., less than three milliseconds per zone. Alternately, multiple transmit zones could be illuminated simultaneously.
  • the IR transmit zones 32A-32I are oriented in a three-by-three (3×3) array having three rows and three columns, each zone having a generally conical shape extending from the transmitter/receiver 25 shown located in the mirror housing 12 and oriented toward the roadway to the lateral side of the host vehicle 10 such that the row of zones 32A-32C is spaced further away from the host vehicle 10 as compared to the row of zones 32D-32F and row of zones 32G-32I.
  • Each IR transmit zone 32 A- 32 I has a generally cone shape beam with a circular cross section which appears more as an elliptical shape as it impinges at an angle on the ground on the adjacent roadside.
  • Each IR transmit zone 32 A- 32 I has a size sufficient to cover the intended detection zone at the lateral side of the host vehicle 10 and may be spaced from the adjacent zones by a predetermined angle, according to one embodiment. According to another embodiment, the IR transmit zones 32 A- 32 I may overlap each other, thereby offering intermediary zones that can be further processed.
  • the crash sensing system 20 also includes a receiver array having a plurality of photosensitive receiver elements 24 A- 24 I as shown in FIG. 5 .
  • the receiver elements 24A-24I receive and detect light intensity including reflected infrared radiation signals within corresponding IR receive zones 34A-34I as shown in FIGS. 2 and 4.
  • the IR receivers 24A-24I essentially receive light including infrared radiation signals reflected from one or more objects within the corresponding IR receive zones 34A-34I and convert the detected light intensity to a frequency output.
  • the receive zones 34A-34I are arranged in a three-by-three (3×3) array having three columns and three rows that are located such as to substantially align with the corresponding three-by-three (3×3) array of IR transmit zones 32A-32I.
  • the vehicle 10 is shown in FIG. 1 having an additional or tenth IR transmitter 26 shown illuminating a horizontal beam at an elevation where an oncoming vehicle bumper or grille would be expected to be located.
  • the additional IR transmitter 26 transmits a substantially horizontal IR calibration beam outward from the lateral side direction of the host vehicle 10 as shown in FIG. 8 .
  • IR transmitter 26 is located in a passenger door of the host vehicle 10 at a height similar to or the same as the elevation of the beam 28 relative to the ground. It should be appreciated that the transmitter 26 may be located elsewhere on host vehicle 10 such as in the front quarter panel or the front or rear bumpers of host vehicle 10 .
  • the additional IR transmit beam 28 provides a horizontal calibration IR beam which is illuminated by itself in the timing sequence of illuminating the nine IR transmitters 22 A- 22 I.
  • the IR receivers 24 A- 24 I may detect light intensity including infrared radiation reflected from objects in the corresponding zones 34 A- 34 I, particularly for object features located at the elevation of beam 28 such as a vehicle bumper, a vehicle grille or fascia or other features expected to be detected at that elevation.
  • triangulation of the calibration beam with the other nine scanned IR beams 32 A- 32 I allows ranging and a measure of the reflection coefficient of the surface of a target object. As such, enhanced range information may be acquired, according to one embodiment.
  • the host vehicle 10 is shown in FIG. 2 having an optional additional IR photoreceiver 27 shown located at the same or in close proximity to IR transmitter 26 for receiving reflected signals within a horizontal beam at an elevation where an oncoming vehicle bumper or grille would be expected to be located.
  • the IR receiver 27 receives light signals including reflected IR signals within receive beam 29 and generates a frequency output as a function of the detected light amplitude. It should be appreciated that with the additional IR transmitter 26 turned on, either the additional IR photoreceiver 27 or the individual IR photoreceivers 24A-24I may be employed to detect the presence of an object within the horizontal beams at the elevation where an oncoming vehicle bumper or grille is expected, which enables enhanced range estimation based on triangulation of the received signals.
  • the array of IR transmitters 22 A- 22 I transmits infrared radiation signals within the corresponding IR transmit zones 32 A- 32 I, one zone at a time, according to one embodiment, resulting in the transmission of sequential IR signals to the transmit zones, while the receiver array 24 A- 24 I receives light energy including reflected infrared radiation signals from objects located within the corresponding IR receive zones 34 A- 34 I.
  • the detected light signals are output as frequency signals which are then processed by a processor.
  • the progression of the object through multiple zones can be monitored to determine speed and direction of the object, such that a processor may determine whether a pre-impact event of the object with the host vehicle 10 is detected.
  • the system 20 may also activate the additional IR transmitter 26 as part of the sequence to detect objects within the illuminated horizontal IR calibration beam 28 .
  • the sequence of illumination may include successive activation of the nine IR transmitters 22 A- 22 I, the activation of the tenth IR transmitter 26 , then all IR transmitters turned off, and then repeat the cycle.
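The illumination sequence just described can be sketched as a simple round-robin schedule. The zone labels and the explicit all-off step come from the text above; the helper name and the list representation are assumptions for illustration.

```python
from itertools import cycle

# Nine scanned transmitters, then the tenth calibration transmitter 26,
# then all transmitters off, repeating cyclically (per the text above).
SEQUENCE = [f"22{c}" for c in "ABCDEFGHI"] + ["26", "OFF"]

def activation_cycle(n):
    """Return the first n steps of the repeating illumination sequence."""
    seq = cycle(SEQUENCE)
    return [next(seq) for _ in range(n)]
```

With a dwell of under three milliseconds per zone, one full eleven-step cycle completes in well under 35 ms, which is what allows the system to track an object's progression across zones at collision-relevant closing speeds.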
  • the integrated IR transmitter/receiver 25 is illustrated, according to one embodiment.
  • the IR transmitter/receiver 25 is shown generally having a housing 60 containing an array of nine IR LEDs 22 A- 22 I mounted onto the top side of a circuit board 62 .
  • the IR LEDs 22 A- 22 I may be disposed behind respective beam-forming optics, which may include reflecting (e.g., parabolic reflector) and/or refracting optical elements or an aperture for defining the conical-shaped beam pattern.
  • the IR transmitter array 22 A- 22 I may employ any of a number of signal transmitting elements for illuminating multiple transmit zones, and may be configured in any of a number of shaped and sized beam patterns.
  • the IR LEDs 22A-22I may employ a central wavelength of about 850 nanometers.
  • One example of a commercially available IR LED is available from OSRAM Opto Semiconductors Inc., sold under the brand name Golden Dragon.
  • the IR transmitter/receiver 25 is shown employing nine photodetectors 24 A- 24 I which serve as photosensitive receivers and are shown mounted on the bottom side of the circuit board 62 .
  • Photodetectors 24 A- 24 I may generally be placed behind corresponding receiving lenses and/or receiving reflectors (e.g., parabolic reflectors).
  • the receiving lenses may include reflecting and/or refracting optical elements that focus the reflected infrared radiation received from the corresponding IR receive zones 34 A- 34 I onto the photodetectors 24 A- 24 I, respectively.
  • the receiver array may employ any number of receiver elements for receiving reflected IR signals from objects within the corresponding number of receive zones 34A-34I, and each may be configured in a cone shape or other shapes and sizes.
  • one example of a photodetector is a light-to-frequency converter commercially available from Texas Advanced Optoelectronic Solutions (TAOS).
  • the light-to-frequency converter provides a frequency output (Hz) as a function of the amplitude of the detected light radiation.
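As a rough illustrative model of such a converter (the scale factor and dark frequency below are assumptions, not specifications of any actual TAOS part), recovering relative irradiance from the frequency output might look like:

```python
# Illustrative linear light-to-frequency model; both constants are
# hypothetical calibration values, not datasheet figures.
HZ_PER_UNIT_IRRADIANCE = 1000.0
DARK_HZ = 50.0  # assumed output frequency with no incident light

def irradiance_from_frequency(freq_hz):
    """Invert the converter's (assumed linear) response to recover
    relative irradiance from the measured output frequency."""
    return max(freq_hz - DARK_HZ, 0.0) / HZ_PER_UNIT_IRRADIANCE
```

The processor would apply a conversion of this kind per receive zone to turn each "beam IR signature" frequency back into a comparable light-amplitude value.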
  • the crash sensing system 20 is further illustrated employing a microprocessor 50 having various inputs and outputs.
  • the microprocessor 50 is shown providing outputs to the nine IR transmitters 22 A- 22 I and the tenth IR transmitter 26 .
  • the microprocessor 50 outputs LED strobe signals to the IR LEDs 22 A- 22 I and 26 to activate the IR LEDs in a cyclical pattern.
  • Signals indicating light and reflected infrared radiation received by each of the receiver elements 24 A- 24 I are input to the microprocessor 50 .
  • the receiver elements 24 A- 24 I provide frequency signals input to the microprocessor 50 in the form of a beam IR signature.
  • the beam IR signature includes frequency (in hertz) representing the amplitude of the photo or light energy detected by its irradiance on the receiver from within a given detection zone.
  • the microprocessor 50 receives an input from a passive thermal far IR receiver 46 .
  • the passive thermal far IR receiver 46 detects emitted radiation within a relatively large area and serves as a safing input that may be logically ANDed with a processor generated output signal to provide risk mitigation for high target certainty.
  • the crash sensing system 20 may employ radar or an ultrasonic transducer as the safing input.
  • Further safing inputs may include a steering wheel angular rate, yaw, external lateral slip, lateral acceleration and lateral velocity signals, amongst other possible safing inputs.
  • the crash sensing system 20 further includes memory 52 , including volatile and/or non-volatile memory, which may be in the form of random access memory (RAM), electrically erasable programmable read-only memory (EEPROM) or other memory storage medium.
  • stored within the memory 52 is a sensing routine 100 for processing the sensed data and determining a pre-impact event as described herein.
  • also stored in memory 52 and executed by microprocessor 50 are a routine 200 for detecting vehicle features and determining a pre-impact event and a routine 300 for performing terrain normalization and determining a pre-impact event.
  • the microprocessor 50 provides a resettable countermeasure deploy output signal 54 and a non-resettable countermeasure deploy output signal 56 .
  • the countermeasure deploy output signals 54 and 56 may be employed to mitigate the effects of an anticipated collision. Examples of countermeasure deploy activity may include deploying a pretensioner for one or more seat belts, deploying one or more air bags and/or side air bag curtains, controlling an active suspension or other vehicle dynamics adjustment, or further may activate other countermeasures on board the host vehicle 10 . These and other deployments may be initiated early on, even prior to an actual impact.
  • the microprocessor 50 receives vehicle speed 58 which may be a measured vehicle speed or a vehicle speed estimate. Vehicle speed 58 may be employed to determine whether or not a lateral impact with an object is expected and is further employed for terrain normalization to determine whether or not an object is stationary despite its shape and orientation.
  • the sensing routine 100 is illustrated in FIG. 7 for sensing an anticipated near impact event of an object with the host vehicle.
  • Routine 100 begins at step 102 to sequentially transmit IR beams, and then proceeds to step 104 to monitor received photosensitive beams within the array of coverage zones. This occurs by sequentially applying IR radiation within each of the transmit beams and receiving light energy including reflected IR signals from the receive zones.
  • routine 100 performs noise rejection on the beam data that is received.
  • routine 100 determines if the temporal gating has been met and, if not, returns to step 102 .
  • the temporal gating bracketing requirements may take into consideration the path, trajectory and rate of the object, the number of pixels of area, the inferred mass/volume/area, the illumination consistency, and angular beam spacing consistency.
  • temporal gating requirements are determined based on comparison of an object's perceived motion (detection from one contiguous coverage zone to another) across the coverage zones to the expected relative speed of potential collision objects of interest (e.g., an automotive vehicle moving at a closing speed of 10 to 65 kilometers per hour (kph) or 6 to 40 miles per hour (mph) to a host vehicle's lateral side).
  • the “range rate” of distance traveled per unit time of a potential collision object can be determined by the detection assessment of contiguous coverage zones for range rates consistent with an expected subject vehicle's closing speed (i.e., if an object is detected passing through the coverage zones at a rate of 1 observation zone per 70 milliseconds and each coverage zone is 0.3 meters in diameter perpendicular to the host vehicle's lateral side, then the closing speed or range rate is approximately 4 meters per second, and is equivalent to approximately 15 kph or 10 mph). Objects moving at range rates slower or faster than the expected range rate boundary through the coverage zones would not pass the temporal gating requirement.
  • Additional assessment can be made based on the quality of the received signal of a potential object as it passes through the coverage zones. If the amplitude of the detected signal varies substantially from one contiguous coverage zone to another (even if all signals are above a threshold value), it could indicate an off-axis collision trajectory or perhaps an object with a mass not consistent with a vehicle. The signal fidelity and consistency through the contiguous coverage zones can be used to verify a potential vehicle collision.
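The worked range-rate example above (one 0.3 m coverage zone per 70 ms observation, giving roughly 4 m/s, about 15 kph or 10 mph) can be checked with a short sketch of the temporal gate. The 10-65 kph window comes from the earlier passage; the function names and the exact gate bounds as defaults are assumptions.

```python
ZONE_DIAMETER_M = 0.3  # zone diameter from the example above

def range_rate_mps(zones_per_observation, observation_period_s):
    """Closing speed implied by an object advancing through the
    coverage zones at the observed rate."""
    return zones_per_observation * ZONE_DIAMETER_M / observation_period_s

def passes_temporal_gate(speed_mps, lo_kph=10.0, hi_kph=65.0):
    """True if the implied closing speed falls inside the expected
    window for a laterally approaching vehicle."""
    kph = speed_mps * 3.6
    return lo_kph <= kph <= hi_kph
```

Objects drifting through the zones too slowly (roadside clutter) or flickering through implausibly fast (noise) fail the gate and do not trigger deployment.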
  • routine 100 then proceeds to decision step 110 to determine if the far IR safing has been enabled and, if not, returns to step 102 . If the safing has been enabled, routine 100 proceeds to deploy an output signal indicative of a sensed pre-impact event in step 112 . The output signal may be employed to activate deployment of one or more countermeasures.
  • the crash sensing system 20 creates a three-dimensional space extending from the lateral side of the host vehicle 10 by way of an array of high speed sequentially illuminated and scanned infrared light signals provided in dedicated coverage zones directed to the lateral side of the host vehicle 10 .
  • Objects which appear within the coverage zones are scanned, and their location, range, speed, and direction are determined.
  • the size of the object may be calculated.
  • the shape of the object and one or more features such as reflectivity present on the object may further be determined. It should be appreciated that feature identification, such as may be achieved by monitoring reflectivity, such as that due to color, and other variations, may be detected and an enhanced range may be determined.
  • the processor 50 processes the information including location, range, speed and direction of the object in addition to the host vehicle speed, and determines whether or not a detected object is expected to impact the side of the host vehicle 10 .
  • the processor 50 processes the location of the detected object, range to the detected object, speed of the detected object, and direction of the detected object in relation to the host vehicle 10 and the speed of the host vehicle 10 . Additionally, the processor 50 may further process the size and shape of the object in order to determine whether the object will likely collide with the host vehicle 10 and whether the object is of a sufficient size to be a concern upon impact with the host vehicle 10 . If the object is determined to be sufficiently small in size or moving at a sufficiently slow rate, the object may be disregarded as a potential crash threat, whereas a large object moving at a sufficiently high rate of speed toward the host vehicle 10 may be considered a crash threat.
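A minimal sketch of this screening logic follows; the thresholds are illustrative assumptions, not values from the specification.

```python
# Hypothetical screening thresholds: minimum object extent (in zones)
# and minimum closing speed for an object to count as a crash threat.
MIN_SIZE_ZONES = 2
MIN_CLOSING_MPS = 2.8  # roughly 10 kph

def is_crash_threat(size_zones, closing_speed_mps, on_collision_path):
    """Disregard small or slow objects; flag large, fast objects that
    are on a collision path with the host vehicle."""
    if size_zones < MIN_SIZE_ZONES or closing_speed_mps < MIN_CLOSING_MPS:
        return False
    return on_collision_path
```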
  • crash sensing system 20 is described herein in connection with an integrated IR transmitter/receiver having nine IR transmitters and nine photosensitive receivers each arranged in a three-by-three (3×3) array, together with an additional IR transmitter 26 and an optional photosensitive receiver 27; however, it should be appreciated that other infrared transmit and receive configurations may be employed without departing from the spirit of the present invention. It should further be appreciated that other shapes and sizes of coverage zones for transmitting IR radiation and receiving reflected light energy may be employed and that the transmitters and/or receivers may be located at various locations on board the host vehicle 10.
  • a complete field image encompassing all the coverage zones may be generated with each scan of the entire array over the covered volume.
  • the additional IR illuminator 26 and optional receiver 27 may be employed along with triangulation.
  • with triangulation, the presence of an object in the designated zones is compared against the additional IR transmit zone 28 such that the range (distance) can be determined.
  • the reflection power of the signal received can be used to enhance the range estimate and thereby enhance the detection of a pre-crash event.
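The triangulation ranging just described can be sketched as follows. The geometry used (a known baseline between the main sensor array and the additional transmitter 26, with an angle to the object measured at each end) is an assumed setup for illustration, not one specified in the disclosure:

```python
import math

def triangulate_range(baseline_m, angle_main_deg, angle_cal_deg):
    """Estimate range to an object seen from two beam origins separated
    by a known baseline. Each angle is measured at one end of the
    baseline toward the object. (Values are illustrative only.)"""
    a1 = math.radians(angle_main_deg)
    a2 = math.radians(angle_cal_deg)
    # The object and the two beam origins form a triangle; the angle at
    # the object is whatever remains of 180 degrees.
    third = math.pi - a1 - a2
    # Law of sines: distance from the first origin to the object.
    return baseline_m * math.sin(a2) / math.sin(third)
```

With a one-unit baseline and 60° angles at both ends (an equilateral triangle), the computed range equals the baseline.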
  • the vehicle pre-crash impact sensing system 20 employs a feature detection scheme that identifies certain features of an object vehicle, particularly an object vehicle moving laterally toward the host vehicle 10 , to provide enhanced vehicle discrimination.
  • the sensing system 20 employs the horizontally illuminated IR calibration beam 28 in conjunction with the IR transmit beams 32A-32I in an attempt to identify a known feature, such as the bumper and/or grille, of a laterally oncoming vehicle.
  • the horizontal calibration IR beam transmitter 26 is multiplexed between the main IR beams 32 A- 32 I to allow enhanced range calibration.
  • by combining the additional IR illumination calibration beam 28 with the standard nine IR transmit zones 32A-32I, the range of the reflected object can be better estimated.
  • the reflection coefficient of the surface of the detected object may be used for an increased-accuracy range estimate, and thus improved risk assessment for proper deployment of a side air bag or other countermeasure.
  • the optional tenth photosensitive receiver 27 may be employed to provide received photosensitive signals within zone 29 to further enhance the process.
  • enhanced oncoming laterally moving vehicle discrimination can be achieved by employing one or more scanned beams in a generally U-shaped configuration that encompasses the shape of a common vehicle front, particularly the fascia.
  • Multiple U-shaped patterns, extending from a small distant focus to larger nearby patterns, may be created with the physical structure of the beam hardware (e.g., via the optical design).
  • the beam pattern can be created in software if the number of beams fully covers the oncoming laterally moving vehicle's trajectory path from far to nearby.
  • the U-shaped beam forms may have ends of about three feet by three feet which focus on the oncoming laterally moving vehicle's headlamps/signal markers, and a center connecting line of about a two-foot height which receives the oncoming laterally moving vehicle's grille chrome. Accordingly, the pre-impact sensing system applies vehicle fascia detection with vehicle front grille shaped optical regions for improved detection of approaching vehicles. Nine overlapping regions may allow target tracking and relative ranging, and the geometry can apply to either the IR illumination shape or the light receiver shape, or possibly both the transmit and receive shapes.
  • enhanced oncoming laterally moving vehicle discrimination may be achieved by detecting the differential spectral return of highly reflective vehicle surfaces, such as signal markers, headlamps, fog lamps, license plates, chrome and other features typical of vehicle front ends. Additionally, background light illumination levels may also be used to measure the highly reflective vehicle elements. The system 20 may further be used to detect the headlamp-on status of the oncoming vehicle, thereby allowing discrimination of its presence as well as any possible pulse width modulation (PWM). LED headlamps, which are also pulsed, may be sensed and used as an additional discrimination element. The geometry of the spectral objects on an inbound oncoming laterally moving vehicle may also aid in the discrimination risk assessment.
  • routine 200 for sensing a pre-crash event of a vehicle to deploy one or more devices and includes steps of implementing the various aforementioned embodiments of the vehicle feature identification to advantageously provide updated object range estimates.
  • Routine 200 begins at step 202 and proceeds to step 204 to illuminate IR beam N, wherein N indexes the IR transmit beams, which are sequentially activated.
  • routine 200 stores the IR transmitter that is turned on, and receives the IR amplitude data for the N th receive zone.
  • the transmit beams are turned on one beam at a time, according to one embodiment, and that the light energy including IR energy reflected from objects within each zone is detected by receivers covering the corresponding receive zones.
  • beam N is then turned off as part of this process and, at step 210, the amplitude data of received light energy for beam N is stored with the IR off.
  • N is incremented to the next successive zone of coverage.
  • routine 200 determines if N has reached a value of ten, the number of IR transmitters, and, if not, repeats at step 202. Once a complete cycle of all ten zones has been completed with N equal to ten (10), routine 200 proceeds to step 216 to indicate that a frame is complete, such that the received energy data is mapped out for the coverage zones.
  • routine 200 proceeds to step 218 to perform a sequential IR modulation signal calculation for each of the IR sensors, shown as sensors 1 - 9 indicative of the outputs of the respective photodetectors 24 A- 24 I.
  • Step 218 essentially performs a sequential IR modulation signal calculation by taking the difference of the sensed photo signal while the infrared transmitter is turned on and while it is turned off, for each coverage zone. As such, for zone 1, a signal (X1) is determined as the difference of the receive signal with the IR on and with the IR off; signal (X2) is likewise the receive signal with the IR on minus the IR off for zone 2, etc.
  • the emitted IR beams are essentially modulated at the switching or modulation frequency.
  • the difference between the received IR energy signals when the IR transmitter is turned on and when the same IR transmitter is turned off produces a ripple signal as described herein.
  • the sequential IR modulation signal calculation is performed for each of the nine zones 1-9 to generate corresponding ripple signals and thereby remove the overall background noise.
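The on-minus-off differencing that produces the per-zone ripple signals can be sketched as below; the function name and the example amplitudes are illustrative only:

```python
def sequential_modulation_signals(on_amplitudes, off_amplitudes):
    """Step-218-style calculation (sketch): for each coverage zone,
    subtract the amplitude sensed with the IR transmitter off from the
    amplitude sensed with it on, cancelling the background light that
    is present in both measurements."""
    return [on - off for on, off in zip(on_amplitudes, off_amplitudes)]
```

For example, a uniform ambient level of 500 in every zone cancels out, leaving only the 40-unit reflection in the zone containing an object.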
  • routine 200 performs step 220, which processes the output of the tenth IR receiver 27 to acquire an enhanced signal-to-noise ratio.
  • Step 220 is an optional step that performs a reference IR modulation signal calculation for the tenth receiving sensor, also referred to as receiver 27 . In doing so, signal (X 10 ) is generated as a function of the IR on minus the IR off for the tenth coverage zone.
  • Routine 200 then proceeds to step 222 to perform a differential signal calculation (Y) for each of sensors 1 - 9 to acquire an enhanced differential signal by eliminating or removing background noise.
  • the differential signal calculation involves calculating the difference between signal X 1 and signal X 10 , to the extent a tenth transmitter is employed.
  • the signal Y 2 is acquired by taking the difference between signal X 2 and signal X 10 .
  • each of zones 3 - 9 involves subtracting the signal X 10 from the corresponding received signal for that zone to provide respective ripple signals for each zone.
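A minimal sketch of the step-222 differential calculation, assuming the nine zone signals X1-X9 and the reference signal X10 are available as plain numbers:

```python
def differential_signals(x, x10):
    """Step-222-style sketch: subtract the reference zone's modulation
    signal X10 from each of the nine zone signals X1..X9 to strip any
    residual common-mode noise, yielding the enhanced Y signals."""
    return [xi - x10 for xi in x]
```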
  • routine 200 proceeds to step 224 to use an object calibration for vehicle feature identification.
  • routine 200 uses a standard object calibration for a bumper detection to detect the bumper or similar feature(s) on a lateral oncoming vehicle.
  • routine 200 employs a highly reflective object calibration for fascia and headlamp detection. It should be appreciated that the object calibration for bumper detection and reflective object calibration for fascia/headlamp detection may be achieved by use of one or more calibration charts or look-up tables, such as the exemplary calibration chart shown in FIG. 10 .
  • the calibration chart shown in FIG. 10 essentially maps a plurality of different sample objects having various features such as shapes, colors, materials (textures) and reflectivity as a function of the ripple signal in hertz versus range in feet.
  • the IR receiver photodetectors each provide an output frequency dependent upon the photosensitive detection.
  • the ripple signals shown are the reduced noise Y signals generated at step 222 .
  • The sample objects mapped in FIG. 10 include arbitrarily selected materials such as white paper, black cotton twill, brick, tree and fog, black paper, black board, black cotton gauze, asphalt, and a typical sky scenario. It should be appreciated that these and other targets or selected materials may be mapped out in a given calibration table for use with routine 200.
  • routine 200 compares the differential signal calculations Y to determine the highest correlation for a detected geometry. In doing so, routine 200 uses a selected calibration map to determine the optimum range estimate based on a common range from the nine received ripple signals. In step 228 , routine 200 updates the object range estimate based upon the identified feature(s). Accordingly, it should be appreciated that by identifying an anticipated feature for the front end of a lateral approaching vehicle, such as the vehicle bumper, fascia and/or headlamp, enhanced object range may be estimated for use in the pre-crash sensing system 20 .
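A hedged sketch of how a FIG. 10-style look-up and common-range vote might be implemented; the `CAL` table values and material names are invented stand-ins, not the patent's calibration data:

```python
# Hypothetical calibration data in the spirit of FIG. 10: for each
# sample material, ripple frequency (Hz) observed at a set of ranges
# (feet). The numbers are illustrative only.
CAL = {
    "white_paper": {2: 9000, 4: 6500, 6: 5000, 8: 3600},
    "asphalt":     {2: 400,  4: 220,  6: 100,  8: 60},
}

def best_range(ripple_hz, material):
    """Pick the calibrated range whose ripple frequency is closest to
    the measured one (a stand-in for the FIG. 10 look-up)."""
    curve = CAL[material]
    return min(curve, key=lambda rng: abs(curve[rng] - ripple_hz))

def common_range(zone_ripples, material):
    """Step-226-style sketch: look up a range per zone, then take the
    most frequently agreed-upon value as the enhanced estimate."""
    ranges = [best_range(r, material) for r in zone_ripples]
    return max(set(ranges), key=ranges.count)
```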
  • routine 200 proceeds to decision step 230 to determine if the temporal gating has been met, and if not, returns to step 202 .
  • the temporal gating step 230 may be the same or similar temporal gating step described in connection with step 108 of routine 100 . If the temporal gating has been met, routine 200 proceeds to decision step 232 to see if the thermal IR safing has been enabled and, if so, deploys one or more devices in step 234 . If the thermal IR safing is not enabled, routine 200 returns to step 202 .
  • the thermal safing step 232 may be the same or similar to the safing logic described in connection with step 110 of routine 100 .
  • routine 200 advantageously provides for an enhanced object range estimate based on the detected type of feature of a vehicle that is oncoming in the lateral direction.
  • routine 200 advantageously looks up the calibration data and provides an updated range estimate which advantageously enhances the determination of whether a laterally approaching vehicle, such as an automobile, is expected to collide with the host vehicle 10 .
  • the U-shaped geometry for a single transmit beam 32 is shown superimposed onto the front fascia portion of a vehicle, which represents a potential oncoming laterally moving vehicle.
  • the U-shaped transmit beam 32 may be broadcast as an infrared beam by a single IR transmitter, according to one embodiment.
  • multiple U-shaped IR transmit beams 32A-32I are transmitted as shown in FIG. 12, each having a generally U-shape and covering separate zones which overlap each other.
  • the multiple U-shaped beams may extend from a small distant focus to larger nearby patterns and could be created with the physical structure of the beam hardware (e.g., via the optical design).
  • the U-shaped beam form may have ends of about three feet by three feet at a distance of about six feet, which focus on the oncoming laterally moving vehicle's headlamp/signal markers, and a center connecting line of about a two-foot height which receives the oncoming laterally moving vehicle's grille chrome. While an array of U-shaped transmit beams is shown and described herein, it should be appreciated that the geometry of the light receivers may be shaped in a U-shape instead of or in addition to the transmit beams.
  • the trajectory and range of the oncoming vehicle object may be better determined.
  • the U-shaped beams may be shaped in an oval shape for simplicity, or another shape that picks up the fascia and similar features of the front end of the oncoming vehicle. It should be appreciated that as the IR transmit beams cross horizontally the front fascia of an oncoming front end of a vehicle, a resultant ripple signal is generated. The ripple signal is the difference in detected energy signal when the IR is turned on and when the IR is turned off. It should be appreciated that for a high reflectance feature, such as the headlamps and signal markers, a higher ripple signal is achieved having a higher frequency.
  • the center grille area of a front end of an oncoming vehicle is mainly paint which has a lower reflective coefficient versus the reflected components of the headlamps and the signal markers.
  • the ripple signal signature can be processed to determine the presence of a likely vehicle fascia or portion thereof including the headlamps and signal markers of the front end of an oncoming vehicle.
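One way the fascia signature (high-reflectance headlamps/signal markers flanking a lower-reflectance painted grille) might be tested is sketched below; the frequency thresholds are assumed values, not taken from the disclosure:

```python
def looks_like_fascia(zone_ripples_hz, hi=3000, lo=1500):
    """Sketch of the signature test suggested by the text: a vehicle
    front scanned horizontally yields high ripple frequencies at the
    reflective headlamps/signal markers on the ends and a lower (but
    still present) ripple over the painted grille in between, i.e. a
    high-low-high pattern. Thresholds `hi` and `lo` are assumptions."""
    if len(zone_ripples_hz) < 3:
        return False
    left, *middle, right = zone_ripples_hz
    return (left > hi and right > hi
            and all(lo < m < hi for m in middle))
```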
  • the pre-crash sensing system 20 further employs a modulation technique to nullify background ambient light conditions and better enhance the estimated target range.
  • the extreme lighting variation from darkness to full sunlight presents many challenges to an object detection system.
  • These and other deficiencies can be overcome by turning the IR transmit array on and off at a high frequency of three hundred (300) hertz, for example, or by otherwise providing amplitude modulation of the IR light source with a square wave or a sine wave, so as to nullify the background ambient light conditions and better enhance the estimate of target range.
  • a modulation technique which measures a scene with ambient lighting and also with an artificial IR illumination source. The difference between the two measurements provides an inferred target range within the scene.
  • This modulation method provides a very cost-effective method of target ranging, yet does not require the extreme power levels needed to overcome typical solar exposure, which can be cost prohibitive.
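The ambient-nullifying effect of the on/off modulation can be demonstrated with a toy simulation; the light levels and sample counts are arbitrary:

```python
def ripple(ambient, reflected, cycles=3):
    """Simulate a 50% duty-cycle square-wave IR source (e.g. 300 Hz):
    the receiver sees the ambient light at all times, plus the target
    reflection only while the transmitter is on. Differencing the on
    and off samples isolates the reflection from the ambient level."""
    samples_on = [ambient + reflected for _ in range(cycles)]
    samples_off = [ambient for _ in range(cycles)]
    diffs = [on - off for on, off in zip(samples_on, samples_off)]
    return sum(diffs) / len(diffs)
```

The same 40-unit reflection is recovered whether the ambient level represents full sunlight or deep shadow.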
  • the modulation technique may be implemented with a carrier signal as disclosed in U.S. Pat. No. 6,298,311, entitled “INFRARED OCCUPANT POSITION AND RECOGNITION SYSTEM FOR A MOTOR VEHICLE,” the entire disclosure of which is hereby incorporated herein by reference.
  • the modulation technique is illustrated in connection with an example of sensed IR and ripple signal both when a parked vehicle is located in the lateral side detection zone of the host vehicle 10 and when the parked vehicle is removed from the detection zone.
  • the sensed raw irradiance, or received light energy level, is plotted as a function of time versus frequency as the host vehicle drives by the parked car, from a time of about eighteen (18) seconds to a time of about thirty-three (33) seconds. Initially there is no car on the lateral side of the host vehicle; the car then passes through the detection zones beginning at eighteen (18) seconds and departs the detection zones at thirty-three (33) seconds.
  • the received raw IR irradiance, or light energy, indicated by area 92 between lines 94 and 96, alternates between max and min values, reaching the max border 94 when the IR transmitter is turned on and the min border 96 when the IR transmitter is turned off.
  • the difference between the IR on and IR off signals is represented by the ripple signal 98, which goes from approximately zero (0) frequency to a frequency of about five thousand (5,000) hertz when the laterally located vehicle passes through the detection zone, and returns to a value of near zero (0) when the laterally located parked vehicle departs the detection zone.
  • the modulation technique advantageously allows for detection of an object due to the reflected signal, independent of shadow or sunlight.
  • the data shown in FIG. 13 illustrates frequency data where the object was generally in the center of the images and the IR illumination alternates from on to off.
  • This data yielded a ripple signal 98 of around one hundred (100) hertz of variation in the raw data when looking at the asphalt, and about five thousand (5,000) hertz with a white target at about six (6) feet, according to one example.
  • the ripple signal 98 did not change even though the ambient lighting moved from full sun exposure during the time period of about zero to ten (0 to 10) seconds to the shadow of the vehicle at about seventeen (17) or eighteen (18) seconds.
  • the data shown in FIG. 13 was taken within a 950 nanometer band which requires rather expensive optical bandpass filters to eliminate sunlight.
  • the modulation technique allows for operation with a much less costly high wavelength filter, such as greater than 700 nanometers.
  • the modulation technique enables the range estimation to be enhanced with acquisition from the calibration table shown in FIG. 10 .
  • a white object's range can be fairly well predicted as shown by the curve 70 A representing white paper. Dark objects are also predictable as shown by the curve 70 H representing asphalt. Textured fabrics may exhibit a self-shadowing effect.
  • a side impact risk estimate can be made and, if deemed of sufficient risk, the side air bags can be deployed.
  • the ranges as shown by lines 72 A, 72 B and 72 C may be established, for example.
  • an enhanced range estimate may be established.
  • ranges 72 A- 72 C have a common range value of about six feet. Since the frequency output of the IR photodetectors may vary based upon color or reflectivity in combination with a range, an enhanced range estimate may be provided by looking for common range values within the zones.
  • the pre-crash sensing system 20 further employs a terrain normalization method to normalize out stationary objects that pass through the detection zones to the lateral side of the host vehicle 10.
  • In FIG. 14, an example driving scenario is shown in which a host vehicle 10 employing the sensing system 20 is traveling along a roadway relative to several objects, including an angled barrier 80, a lateral oncoming vehicle 84, a passing lateral vehicle 82, and a laterally projecting vehicle 86 expected to collide with the host vehicle 10.
  • the terrain normalization method is able to normalize out stationary objects, such as the angled barrier 80 which, due to its shape, may appear to be moving toward the host vehicle 10, such that the system 20 may ignore the stationary object 80 in determining whether or not an impending lateral collision will occur.
  • using range data, which is inherent in the power level of the reflected signal, a three-dimensional volume can be estimated with the array of coverage zones 34A-34I.
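Since reflected power falls off roughly as one over the distance squared, as noted later in the text, range can be inferred from received power against a calibrated reference. A sketch under that assumption, with illustrative reference values:

```python
import math

def inferred_range(received_power, ref_power, ref_range):
    """Infer distance from received signal power under the
    inverse-square assumption P ~ 1/d^2. `ref_power` is the power
    measured from a comparable surface at the known `ref_range`.
    (All numbers here are illustrative, not calibration data.)"""
    return ref_range * math.sqrt(ref_power / received_power)
```

A quarter of the reference power implies twice the reference distance.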
  • This terrain information, including inferred distance, is propagated to the successive rearward beams and is used to normalize out the existence of stationary objects, which are characterized as clutter to be ignored.
  • Objects which have a lateral velocity component are recognized as potential oncoming targets and their characteristics are further evaluated for assessed threat of a lateral collision to the host vehicle 10 .
  • an angled barrier (e.g., a guardrail) or an angled road line may have a shape resembling an oncoming vehicle bumper and can produce similar or identical IR pattern signatures, which could cause undesirable deployment of an air bag or other device if not properly detected.
  • the terrain normalization method attempts to detect and eliminate such false triggers.
  • a matrix of infrared beams and receiving beams illuminate the side of the host vehicle 10 to provide a sensed volume by the three-by-three (3 ⁇ 3) array.
  • using range data, which is inherent in the power level of the reflected signal, a three-dimensional volume can be estimated. As the host vehicle 10 is driven forward, the frontmost beams or zones will see the terrain pass by first.
  • This terrain information, including distance, is propagated in progression to the beams which follow behind and is used to normalize out stationary objects.
  • Objects which have a lateral velocity component are recognized as potential oncoming targets to the host vehicle 10 , and their characteristics are further evaluated for assessed threat to the host vehicle 10 .
  • Each scan of the matrix yields an object light level for each spot of the zones detected. From this, the range (distance) of an object can be inferred according to a distance look-up table. Generally, objects which are at ground level are quite low in reflected energy, as power is related to one divided by the distance squared. As seen in FIG. 14, barriers and oncoming cars are illuminated by the leading spot zones, and a normalization technique propagates this information rearward to the trailing spots. The data may be processed by averaging, normalization, and time differencing, amongst other embodiments. Additionally, road speed and steering angle could be used to propagate the optical distance of the measured lead objects rearward, thus eliminating them from causing false deployments.
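The rearward propagation can be sketched for a single fore-aft column of zones; the frame timing, zone spacing, and rounding of the delay to whole frames are modeling assumptions for illustration:

```python
def normalize_column(readings, speed_mps, zone_spacing_m, dt_s):
    """Terrain-normalization sketch for one column of zones, ordered
    leading zone first. Each trailing zone's raw reading has the
    leading zone's reading from `delay` frames earlier subtracted, so
    stationary terrain cancels while laterally moving objects do not.
    readings[t][i] is the signal for zone i at frame t."""
    # Frames needed for terrain to travel between adjacent zones at
    # the host vehicle's speed.
    delay = round(zone_spacing_m / (speed_mps * dt_s))
    out = []
    for t, frame in enumerate(readings):
        row = [frame[0]]  # leading zone passes through unnormalized
        for i in range(1, len(frame)):
            prior = readings[t - delay][i - 1] if t >= delay else 0
            row.append(frame[i] - prior)
        out.append(row)
    return out
```

A stationary object marching rearward one zone per frame normalizes to zero after the first frame; an object with lateral motion would leave a nonzero residue.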
  • objects entering the rearmost zones first are processed in reverse, such as when a car passes the host vehicle 10.
  • the resultant matrix of processed data which has been normalized to remove oncoming and passing objects is used to evaluate the event propagation of a lateral object, such as an oncoming laterally moving vehicle 86 .
  • FIGS. 15A-15D the progression of a stationary object is illustrated as a host vehicle 10 passes by stationary object 84 .
  • the stationary object 84 is in front of the nine detection zones.
  • the stationary object 84 is shown first being detected by zone A 1 in FIG. 15B , and then detected by both zones A 1 and A 2 in FIG. 15C , and then detected by zones A 1 -A 3 in FIG. 15D .
  • the object 84 is shown in FIG. 15E departing zone A 1 and is still detected in zones A 2 and A 3 .
  • the system 20 employs a normalization routine to subtract out the propagated signal detected in a forward located zone from the raw data in an attempt to detect whether or not a lateral motion of the object is occurring.
  • the object 84 is first detected in zone A1 and, as it approaches zone A2, the propagated signal of zone A1 is subtracted from the raw data of zone A2 to provide a normalized result indicative of a stationary object, which should be rejected.
  • FIGS. 16A-16D a scenario is shown in which a stationary object 80 in the form of an angled barrier, for example, is passed by the host vehicle 10 and the system 20 employs terrain normalization to reject the stationary object 80 , despite its potentially deceiving shape due to its angle towards the vehicle 10 .
  • the stationary object 80 first enters zone A 1 in FIG. 16A , and then proceeds to enter zone A 2 in FIG. 16B .
  • the stationary object 80 is detected by zones A 1 , A 2 and B 1
  • FIG. 16D the stationary object 80 is detected primarily in zones A 3 , B 2 and C 1 .
  • the terrain normalization subtracts the propagated signal of the leading zone from the following zone.
  • the normalized result is that the propagated signal of zone A 1 is subtracted from the raw data of zone A 2 to determine that the stationary object 80 is detected and should be rejected.
  • the propagated signal is subtracted based on a delay time which is determined based on the speed of host vehicle 10 , such that the signal data of the preceding zone captures an area of space that is also captured in the following zone.
  • the terrain normalization thereby takes into consideration the detected signal information from the preceding zone, along with the time delay and the speed of the host vehicle 10.
  • FIGS. 17A-17D a further example of a laterally oncoming vehicle 86 is illustrated in which the laterally oncoming vehicle 86 first enters zone A 1 and A 2 in FIG. 17B and proceeds into zones B 1 , B 2 , A 1 , A 2 and C 1 in FIG. 17C , and finally proceeds into zones A 1 , A 2 , B 1 , B 2 and C 1 -C 3 in FIG. 17D .
  • the terrain normalization effectively removes stationary objects so that the system 20 can detect that the laterally oncoming vehicle 86 is not stationary, but instead is moving with a lateral velocity component toward the host vehicle 10 , such that an oncoming collision may occur.
  • the terrain normalization routine 300 begins at step 302 and proceeds to step 304 to illuminate the outermost row of IR transmit beams in zones A 1 , A 2 and A 3 .
  • routine 300 receives the IR amplitude data for receive zones A 1 , A 2 and A 3 while the IR is turned on.
  • the IR transmit beams for zones A 1 , A 2 and A 3 are turned off.
  • routine 300 receives the IR amplitude data for zones A 1 , A 2 and A 3 while the IR transmit beams are turned off and stores the received amplitude data while the IR transmit beams for zones A 1 , A 2 and A 3 are turned off.
  • routine 300 turns on the IR transmit beams for the next or middle row of zones B 1 , B 2 and B 3 .
  • routine 300 receives the IR amplitude data for zones B 1 , B 2 and B 3 while the IR transmit beams are turned on and stores the received IR amplitude data in memory.
  • routine 300 turns off the IR transmit beams for zones B 1 , B 2 and B 3 . With the IR transmit beams turned off, routine 300 proceeds to step 318 to receive the IR amplitude data for zones B 1 , B 2 and B 3 and stores the received IR amplitude data in memory.
  • Routine 300 proceeds to step 320 to turn on the IR transmit beams for the third or closest row of zones C 1 , C 2 and C 3 .
  • routine 300 receives the IR amplitude data for zones C 1 , C 2 and C 3 and stores the received IR amplitude data in memory.
  • routine 300 turns off the IR transmit beams for zones C 1 , C 2 and C 3 . With the IR transmit beams turned off, routine 300 proceeds to step 326 to receive the IR amplitude data for zones C 1 , C 2 and C 3 and stores the received IR amplitude data in memory.
  • routine 300 turns on the IR transmit beam for the tenth or additional transmit beam at step 328 .
  • routine 300 proceeds to step 330 to receive the IR amplitude data for zones A 1 , A 2 , A 3 , B 1 , B 2 , B 3 , C 1 , C 2 , C 3 , and the tenth receiver while the tenth or additional IR transmit beam is turned on. Finally, the initial frame is complete at step 332 .
  • routine 300 proceeds to step 334 to perform a sequential IR modulation signal calculation X for each of sensors one through nine, shown labeled A 1 -C 3 .
  • routine 300 stores a history of the IR modulation signal calculation (X) for each of sensors one through nine for zones A 1 -C 3 . This involves storing the time average of the IR signals for each of the zones A 1 -C 3 .
  • routine 300 looks at the forward vehicle motion and cancels out the stationary signals from the following sensor locations. This includes normalizing the signal for a given zone by subtracting out from the following zone the history of the previous zone. For example, the history of zone A 1 is subtracted from zone A 2 when an object passes from zone A 1 to zone A 2 .
  • the continued signal normalization applies to zone A3, in which the history from zones A1 and A2 is subtracted from zone A3 at the appropriate time, using a time delay based on the vehicle speed so that the zones cover the same area of space.
  • Signal normalization also occurs in rows B and C by subtracting out the signal from the prior zone.
  • the same signal normalization applies in the reverse direction for a vehicle or object passing laterally from the rear of the host vehicle 10 toward the front of the host vehicle 10 , except the signal normalization is reversed such that the propagating signal in zone A 3 is subtracted from A 2 , etc.
  • for zone A2, the current X2 data is taken and the history of zone A3 is subtracted, such that a sliding window is essentially provided.
  • the terrain normalization essentially eliminates the fore and aft movement parallel to the host vehicle 10 in the detection zones. By doing so, stationary objects or clutter are rejected.
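The reverse-direction sliding window, for an object entering the rearmost zone first, can be sketched as a mirror image of the forward normalization; the front-to-rear array ordering and the zero fill at the rearmost zone are illustrative assumptions:

```python
def normalize_reverse(current, history):
    """Sketch of the reverse-direction case: for an object entering the
    rearmost zone first (e.g. a passing car), each zone's current
    signal has the stored history of the zone BEHIND it subtracted,
    mirroring the forward normalization. `current` and `history` are
    per-zone values ordered front (A1) to rear (A3)."""
    n = len(current)
    return [current[i] - (history[i + 1] if i + 1 < n else 0)
            for i in range(n)]
```

A passing car that was in zone A3 last frame and is in zone A2 now cancels to zero, while an object that appears without a rearward history survives.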
  • Routine 300 then proceeds to step 340 to determine the lateral component of the moving object.
  • the lateral component of an object is based on the lateral movement toward or away from the lateral side of the host vehicle 10 .
  • routine 300 determines if there is a lateral velocity component greater than eighteen miles per hour (18 mph) and if the object is large enough and, if so, proceeds to step 346 to update the object range estimate, and then proceeds to decision step 348 to check the temporal gating. If the object is not large enough or if the lateral velocity component is not greater than eighteen miles per hour (18 mph), routine 300 returns to step 302.
  • temporal gating is compared to determine whether or not an object is likely to collide with the host vehicle 10 and, if so, routine 300 proceeds to decision step 350 to determine if thermal IR safing is enabled and, if so, deploys an output at step 352 . If the thermal IR safing is not enabled, routine 300 returns to step 302 . It should be appreciated that the temporal gating of step 348 and the thermal IR safing of step 350 may be the same or similar to those steps provided in routine 100 as discussed above.
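The step-344 gate can be expressed compactly; the minimum-size floor used here is an assumed value, since the text states only the 18 mph lateral-velocity threshold together with an unquantified size check:

```python
def crash_gate(lateral_speed_mph, object_size_m2, min_size_m2=0.5):
    """Step-344-style gate (sketch): only objects that are both large
    enough and moving laterally faster than 18 mph advance toward
    deployment; everything else is rejected as a non-threat.
    The 0.5 m^2 size floor is an assumed value, not from the patent."""
    return lateral_speed_mph > 18 and object_size_m2 >= min_size_m2
```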
  • thermal far IR safing receiver 46 is shown and described herein for providing thermal IR safing, it should be appreciated that other safing techniques may be employed to eliminate false triggers.
  • a matrix of IR beams illuminates the side of host vehicle 10 to provide a sensed volume.
  • a single IR beam may be provided in a matrix of beams.
  • range data which is inherent to the power level of the reflected signal, a three-dimensional volume can be estimated.
  • the addition of a separate technology to “safe” the primary deploy signal is required to ensure against false air bag deployment.
  • “Safing” is defined as a complementary measure to verify that the detected object is an oncoming laterally moving vehicle where the measured speed of the oncoming laterally moving vehicle matches the speed measured by the primary detection. Moreover, the measured speed may be approximately fifteen (15) to fifty (50) miles per hour, according to one example. Additionally, this concept of lateral velocity verification can be used to enable sub-fifteen mile per hour air bag deployment.
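The speed-match safing verification might look like the following sketch; the 3 mph tolerance is an assumed value, while the roughly 15-50 mph window comes from the text:

```python
def safing_ok(primary_speed_mph, safing_speed_mph, tolerance_mph=3.0):
    """Safing sketch: the complementary sensor's speed measurement must
    match the primary detection within a tolerance (the 3 mph value is
    an assumption) and fall within the roughly 15-50 mph window noted
    in the text before a deploy is allowed."""
    return (abs(primary_speed_mph - safing_speed_mph) <= tolerance_mph
            and 15 <= safing_speed_mph <= 50)
```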
  • Discrimination technology considered for the safing technique can include active near IR (NIR), radar, or camera.
  • One or more safing technologies and one or more deploy technologies may be utilized in the design.
  • Individual safing or deploy technologies can include active near IR, far IR (FIR), ultrasonic acoustics, laser time-of-flight sensor (3D scanner), 3D camera, or stereo camera.
  • heat may be detected from an oncoming vehicle.
  • ultrasonic sensors perform the safing function.
  • the safing function is employed to ensure the event progression is due to an incoming vehicle and not due to other situations that would not require a side air bag deployment. Safing may prevent deployment of a side air bag when the host vehicle 10 strikes a stationary object, such as a tree, pole or parked vehicle. These objects are not likely to have the thermal signature of a moving vehicle, and in extreme yaw conditions, may not return a steady ultrasonic return, which is especially true for trees and poles. Therefore, there is a need to provide a means to disable or reduce safing requirements in yaw conditions.
  • Situationally dependent safing is a method to modify side air bag pre-impact deployment safing based upon the vehicle stability.
  • an active IR sensing system is employed to determine when a side impact is imminent.
  • Supplemental information from either an ultrasonic or passive IR system is used for safing. If the host vehicle 10 path is tangent to its fore-aft direction or the target follows a linear path into the side of the host vehicle 10 , incoming targets will follow a normal progression and safing techniques will provide information necessary to make a reliable decision. A relatively linear progression will allow sufficient path information of the active IR system to generate a mature path.
  • the path of the host vehicle 10 may not allow the development of a mature track for impacts.
  • the supplemental safing sensors are less likely to provide adequate information to supplement the deployment condition. If an ultrasonic sensor is employed, the host vehicle 10 may spin into the target too quickly to provide an adequate return. If a passive IR system is employed, the target may not generate the thermal signature necessary to allow deployment. Therefore, a decision tree may allow for safing levels to be reduced in cases where the host vehicle 10 is not following a consistent path due to the high yaw.
  • the decision tree may include logically ORing the following requirements: Far IR (FIR) is greater than threshold, steering wheel angular rate is greater than N degrees per second, yaw is greater than N degrees per second, external lateral slip divided by yaw, and lateral acceleration greater than 0.5 g.
  • the output of the OR logic is then logically ANDed with the discrimination output to determine whether or not to deploy one or more devices.
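The OR/AND decision tree described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the function name, parameter names, and the numeric rate limit are assumptions (only the 0.5 g lateral-acceleration figure and the FIR threshold comparison come from the text).

```python
def reduced_safing_deploy(discrimination_ok, fir_signal, fir_threshold,
                          steer_rate_dps, yaw_rate_dps, lateral_accel_g,
                          rate_limit_dps=60.0):
    """Sketch of the described decision tree: logically OR the
    instability indicators (FIR above threshold, steering wheel rate,
    yaw rate, lateral acceleration above 0.5 g), then logically AND
    the result with the primary discrimination output to decide
    whether to deploy.  rate_limit_dps stands in for the unspecified
    'N degrees per second' thresholds."""
    unstable = (fir_signal > fir_threshold
                or steer_rate_dps > rate_limit_dps
                or yaw_rate_dps > rate_limit_dps
                or lateral_accel_g > 0.5)
    return discrimination_ok and unstable
```

Note that under this logic a deployment requires both the primary discrimination output and at least one instability indicator, mirroring the AND of the OR output with the discrimination output.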
  • vehicle travel direction can be inferred by the ground terrain monitoring of the active IR scanning system.
  • the vehicle velocity and direction including lateral sliding can be detected and approach countermeasures initiated.
  • Under inclement conditions, such as blowing snow, rain or sand, side slip monitoring is fail-safed by a lateral yaw rate sensor.
  • Lateral sliding information can be used for side air bag threshold lowering, stability control, or potentially rollover detection.
  • the lateral slip sensor may use a left or right sensor to monitor road pattern directions. With the transmit/receive beams, an optical flow through the beam matrix determines ground travel direction, vehicle rotation and potential vehicle roll.
  • the array of transmit and receive IR beams may be arranged in an overlapping configuration. Tailoring of three-dimensional volume to the side of the host vehicle 10 can pose a challenge to ensure an oncoming laterally moving vehicle is detected and a side air bag is deployed, yet allow the numerous no-deploy objects which pass harmlessly by the host vehicle 10 to not cause false deploys.
  • the beam overlap may allow increased spatial resolution with a minimum number of discrete beams.
  • each of the nine beam spots may be enlarged to allow a twenty percent (20%) beam spot overlap which provides twenty-one multiplexed zones, in contrast to the above disclosed nine non-overlap zones, which allows increased distance resolution of the incoming object.
  • the geometry can apply to IR illumination or light receiver shape or possibly both transmitter and receiver shapes for more resolution. Accordingly, the beam overlap method may consist of overlapping scanned areas or regions to allow increased target tracking resolution.
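The zone count stated above can be reproduced with simple geometry: nine beam spots in a three-by-three array, plus one extra multiplexed zone for each horizontally or vertically adjacent overlapping pair. This sketch (function and parameter names are illustrative, and diagonal overlaps are ignored) generalizes the count to any grid size.

```python
def multiplexed_zone_count(rows, cols):
    """Count distinct detection zones for a rows x cols array of
    overlapping beam spots: each spot is one zone, and each
    horizontally or vertically adjacent pair of spots contributes
    one additional overlap zone."""
    spots = rows * cols
    horizontal_overlaps = rows * (cols - 1)  # pairs within each row
    vertical_overlaps = cols * (rows - 1)    # pairs within each column
    return spots + horizontal_overlaps + vertical_overlaps
```

For the three-by-three array, `multiplexed_zone_count(3, 3)` gives 9 + 6 + 6 = 21 zones, matching the twenty-one multiplexed zones described above.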
  • the pre-crash sensing system 20 of the present invention advantageously detects an impending collision of an object with the host vehicle 10 , prior to the actual collision.
  • the crash sensing system 20 is cost-effective and able to determine whether an object is approaching the host vehicle 10 and may collide with the vehicle, sufficient to enable a determination of the impending collision prior to actual impact. Further, the pre-crash sensing system 20 may determine whether the object is of sufficient size and speed to deploy certain countermeasures.


Abstract

A vehicle pre-impact sensing system is provided that includes an array of energy signal transmitters mounted on a vehicle for transmitting signals within multiple transmit zones spaced from the vehicle and an array of receiver elements mounted on the vehicle for receiving signals reflected from an object located in one or more multiple receive zones indicative of the object being in certain one or more zones. A processor processes the received reflected signals and determines range, location, speed and direction of the object, determines whether the object is expected to impact the vehicle as a function of the determined range, location, speed and direction of the object, and generates an output signal indicative of a pre-impact event. The system may detect one or more features of a target object, such as a front end of a vehicle. Additionally, the system may modulate the transmit beams. Further, the system may perform a terrain normalization to remove stationary items.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 61/130,236, filed on May 29, 2008, the entire disclosure of which is hereby incorporated herein by reference.
  • TECHNICAL FIELD
  • The present application generally relates to vehicle crash sensing and, more particularly, relates to a system and method of sensing an imminent collision of an object with a vehicle prior to impact.
  • BACKGROUND OF THE INVENTION
  • Automotive vehicles are commonly equipped with passenger restraint and crash mitigation devices such as seat belts, front air bags, side air bags and side curtains. These and other devices may be deployed in the event of a collision with the host vehicle to mitigate adverse effects to the vehicle and the occupants in the vehicle. With respect to activated devices, such as air bags and side curtain bags, these devices generally must be deployed quickly and in a timely fashion. Typically, these types of devices are deployed when sensors (e.g., accelerometers) mounted on the vehicle sense a severe impact with the vehicle.
  • In some vehicle driving situations, it is desirable to determine the onset of a collision, prior to impact of an object with the host vehicle. For example, vision systems employing cameras may be used to monitor the surrounding environment around the vehicle and the video images may be processed to determine if an object appears to be on a collision path with the vehicle. However, vision systems are generally very expensive and suffer a number of drawbacks.
  • An alternative approach is disclosed in U.S. Patent Application Publication No. 2009/0099736, assigned to the assignee of the present application. The approach set forth in the aforementioned patent application discloses a vehicle pre-impact sensing system that transmits a plurality of infrared (IR) beams and receives a plurality of beams within a plurality of curtains incrementally spaced from the host vehicle for sensing objects that may impact the side of the host vehicle. The aforementioned published patent application is hereby incorporated herein by reference.
  • It would be desirable to provide for an enhanced cost-effective system that senses a collision prior to impact with the host vehicle, particularly for use to detect side impact events.
  • SUMMARY OF THE INVENTION
  • According to one aspect of the present invention, a vehicle pre-impact sensing system is provided that includes an array of energy signal transmitters mounted on a vehicle for transmitting signals within multiple transmit zones spaced from the vehicle. The system further includes an array of receiver elements mounted on the vehicle for receiving the signals reflected from an object located in one or more multiple receive zones indicative of the object being in certain one or more receive zones. The system also includes a processor for processing the received reflected signals and determining range, location, speed and direction of the object. The processor performs a normalization to remove stationary targets by propagating a leading signal from one zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal. The processor further determines whether the object is expected to impact the vehicle as a function of the range, location, speed, direction of the object and the normalized signal, and generates an output indicative of a sensed pre-impact event.
  • According to another aspect of the present invention, a method of detecting an expected impact of an object with a vehicle is provided. The method includes the steps of transmitting signals within multiple transmit zones spaced from the vehicle, one zone at a time, receiving signals reflected from an object located in the one or more multiple zones indicative of the object being in certain one or more receive zones and processing the received reflected signal. The method also determines location of the object, determines speed of the object and determines direction of the object. The method further performs a normalization to remove stationary targets by propagating a leading signal from one zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal, determines whether the object is expected to impact the vehicle as a function of the determined range, location, speed and direction of the object and the normalized signal and generates an output signal indicative of a sensed pre-impact event.
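The terrain normalization step described above can be sketched as a delayed subtraction: a stationary roadside object seen in a leading zone should reappear in the adjacent following zone after the time the host vehicle takes to travel one zone spacing, so subtracting the appropriately delayed leading-zone signal cancels stationary returns while moving objects survive. All names, units, and the sampling model here are assumptions for illustration; the patent does not specify an implementation.

```python
def normalize_zone(follow_signal, leading_history, vehicle_speed_mps,
                   zone_spacing_m, sample_period_s):
    """Normalization sketch: subtract from the following zone's current
    sample the leading zone's sample from delay_samples ago, where the
    delay is the zone spacing divided by the host vehicle speed.
    leading_history holds past leading-zone samples, oldest first."""
    delay_samples = int(round(zone_spacing_m /
                              (vehicle_speed_mps * sample_period_s)))
    if delay_samples >= len(leading_history):
        return follow_signal  # not enough history yet to normalize
    expected_stationary = leading_history[-1 - delay_samples]
    return follow_signal - expected_stationary
```

With the host vehicle at 10 m/s, 1 m zone spacing, and 0.1 s sampling, a stationary object's return in the following zone is cancelled (normalized signal of zero), while a laterally approaching object produces a non-zero residual.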
  • These and other features, advantages and objects of the present invention will be further understood and appreciated by those skilled in the art by reference to the following specification, claims and appended drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
  • FIG. 1 is a side perspective view of a vehicle employing a pre-impact crash sensing system illustrating an array of infrared (IR) transmit zones, according to one embodiment;
  • FIG. 2 is a side perspective view of the vehicle employing the pre-impact crash sensing system showing an array of receiver photodetection zones, according to one embodiment;
  • FIG. 3 is a top view of the vehicle further illustrating the IR transmit zones employed in the crash sensing system of FIG. 1;
  • FIG. 4 is a top view of the vehicle further illustrating the IR photoreceiver zones employed in the crash sensing system shown in FIG. 2;
  • FIG. 5 is an enlarged view of an integrated IR transmitter/receiver employed in the crash sensing system, according to one embodiment;
  • FIG. 6 is a block diagram illustrating the pre-impact crash sensing system, according to one embodiment;
  • FIG. 7 is a flow diagram illustrating a routine for sensing a pre-impact collision of an object with the vehicle, according to one embodiment;
  • FIG. 8 is a rear view of the vehicle showing the IR transmit zones including an additional elevated horizontal IR transmit zone, according to one embodiment;
  • FIGS. 9A and 9B are a flow diagram illustrating a routine for sensing a vehicle feature to update a range estimate and a side impact crash, according to one embodiment;
  • FIG. 10 is a calibration chart illustrating reflected ripple signals as a function of distance for various identified objects, according to one example;
  • FIG. 11 illustrates a U-shaped IR transmit beam superimposed onto the front of an oncoming vehicle for use in detecting a vehicle fascia feature(s), according to one embodiment;
  • FIG. 12 illustrates an array of the U-shaped IR transmit beams shown in FIG. 11 superimposed onto an oncoming vehicle for detecting the vehicle fascia feature(s);
  • FIG. 13 illustrates receiver output signals and ripple signal in frequency (Hz) as a function of time as the host vehicle drives by another vehicle, according to one example;
  • FIG. 14 illustrates the host vehicle traveling relative to a stationary barrier, laterally displaced vehicles and a laterally approaching vehicle, to illustrate terrain normalization, according to one embodiment;
  • FIGS. 15A-15E illustrate the passing of a stationary object and terrain normalization to detect the object as stationary;
  • FIGS. 15A-A-15E-E are timing diagrams that illustrate normalization of a detected object as it passes through detection zones A1-A3 shown in FIGS. 15A-15E, respectively;
  • FIGS. 16A-16D illustrate the passing of an angled stationary object and the terrain normalization for detecting the stationary target;
  • FIGS. 16A-A-16D-D are timing diagrams illustrating terrain normalization as the angled object passes through the detection zones shown in FIGS. 16A-16D, respectively;
  • FIGS. 17A-17D illustrate terrain normalization on an object moving laterally with respect to the host vehicle, according to one example; and
  • FIGS. 18A-18C illustrate a routine for providing terrain normalization, according to one embodiment.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Referring to FIGS. 1-5, a vehicle pre-impact crash sensing system 20 is generally illustrated employed on a host vehicle 10, according to one embodiment. The crash sensing system 20 is shown and described herein configured to detect a pre-impact collision of an object (e.g., another vehicle) with the vehicle 10, particularly on one or both lateral sides of the vehicle 10. However, it should be appreciated that the crash sensing system 20 may be employed to detect a pre-impact event on any side of the vehicle 10, including one or both lateral sides, the front side and the rear side.
  • The host vehicle 10 is generally shown as an automotive wheeled vehicle having opposite lateral sides and exterior side view mirror housings 12 on opposite lateral sides. In the embodiment shown and described herein, the crash sensing system 20 generally includes an integrated infrared (IR) transmitter/receiver 25 shown mounted generally in one of the mirror housings 12 of the vehicle 10, at a position sufficient to detect objects located adjacent to the corresponding lateral side of the vehicle 10. While lateral crash sensing is shown and described herein for sensing a collision on one side of the host vehicle 10, it should be appreciated that the crash sensing may also be employed on the opposite lateral side of the vehicle. Further, while the transmitter/receiver 25 is shown mounted in the mirror housing 12, it should be appreciated that the integrated transmitter/receiver array 25 may be located at other locations on the vehicle 10 and positioned to detect one or more objects in the desired vicinity of the vehicle 10.
  • The IR transmitter/receiver 25 includes a plurality of IR transmitters (22A-22I) and a plurality of IR receivers 24A-24I, as shown in FIG. 5. As seen in FIGS. 1 and 3, the IR transmitters 22A-22I transmit infrared (IR) energy signals within designated transmit beam patterns 32A-32I in a three-by-three (3×3) array spaced from the lateral side of the host vehicle 10, according to one embodiment. The infrared transmitter array 25 has a plurality (e.g., nine) of infrared transmitters 22A-22I for transmitting infrared radiation signals within designated corresponding transmit zones 32A-32I. The transmitter array is activated to sequentially transmit infrared radiation signals in one zone at a time, such as zone 32A, and then switches sequentially to the next zone, such as zone 32B, and then to zone 32C, and continues through the entire array to zone 32I, and cyclically repeats the process at a high rate of speed, e.g., less than three milliseconds per zone. Alternately, multiple transmit zones could be illuminated simultaneously. In the embodiment shown, the IR transmit zones 32A-32I are oriented in a three-by-three (3×3) array having three rows and three columns, each zone having a generally conical shape extending from the transmitter/receiver 25 shown located in the mirror housing 12 and oriented toward the roadway to the lateral side of the host vehicle 10 such that the row of zones 32A-32C is spaced further away from the host vehicle 10 as compared to the row of zones 32D-32F and row of zones 32G-32I. Each IR transmit zone 32A-32I has a generally cone shape beam with a circular cross section which appears more as an elliptical shape as it impinges at an angle on the ground on the adjacent roadside. Each IR transmit zone 32A-32I has a size sufficient to cover the intended detection zone at the lateral side of the host vehicle 10 and may be spaced from the adjacent zones by a predetermined angle, according to one embodiment. 
According to another embodiment, the IR transmit zones 32A-32I may overlap each other, thereby offering intermediary zones that can be further processed.
  • The crash sensing system 20 also includes a receiver array having a plurality of photosensitive receiver elements 24A-24I as shown in FIG. 5. In one embodiment, the receiver elements 24A-24I receive and detect light intensity including reflected infrared radiation signals within corresponding IR receive zones 34A-34I as shown in FIGS. 2 and 4. The IR receivers 24A-24I essentially receive light including infrared radiation signals reflected from one or more objects within the corresponding IR receive zones 34A-34I and convert the detected light intensity to a frequency output. In one embodiment, the receive zones 34A-34I are arranged in a three-by-three (3×3) array having three columns and three rows that are located such as to substantially align with the corresponding three-by-three (3×3) array of IR transmit zones 32A-32I.
  • In addition, the vehicle 10 is shown in FIG. 1 having an additional or tenth IR transmitter 26 shown illuminating a horizontal beam at an elevation where an oncoming vehicle bumper or grille would be expected to be located. The additional IR transmitter 26 transmits a substantially horizontal IR calibration beam outward from the lateral side direction of the host vehicle 10 as shown in FIG. 8. In the embodiment shown, IR transmitter 26 is located in a passenger door of the host vehicle 10 at a height similar to or the same as the elevation of the beam 28 relative to the ground. It should be appreciated that the transmitter 26 may be located elsewhere on host vehicle 10 such as in the front quarter panel or the front or rear bumpers of host vehicle 10. The additional IR transmit beam 28 provides a horizontal calibration IR beam which is illuminated by itself in the timing sequence of illuminating the nine IR transmitters 22A-22I. By illuminating the additional IR transmitter 26, the IR receivers 24A-24I may detect light intensity including infrared radiation reflected from objects in the corresponding zones 34A-34I, particularly for object features located at the elevation of beam 28 such as a vehicle bumper, a vehicle grille or fascia or other features expected to be detected at that elevation. By providing the extra horizontal IR transmit beam 28, triangulation of the calibration beam with the other nine scanned IR beams 32A-32I allows ranging and a measure of the reflection coefficient of the surface of a target object. As such, enhanced range information may be acquired, according to one embodiment.
  • Further, the host vehicle 10 is shown in FIG. 2 having an optional additional IR photoreceiver 27 shown located at the same location as, or in close proximity to, IR transmitter 26 for receiving reflected signals within a horizontal beam at an elevation where an oncoming vehicle bumper or grille would be expected to be located. The IR receiver 27 receives light signals including reflected IR signals within receive beam 29 and generates a frequency output as a function of the detected light amplitude. It should be appreciated that with the additional IR transmitter 26 turned on, either the additional IR photoreceiver 27 or the individual IR photoreceivers 24A-24I may be employed to detect the presence of an object within the horizontal beams at the elevation where an oncoming vehicle bumper or grille is expected, which enables enhanced range estimation based on triangulation of the received signals.
  • In operation, the array of IR transmitters 22A-22I transmits infrared radiation signals within the corresponding IR transmit zones 32A-32I, one zone at a time, according to one embodiment, resulting in the transmission of sequential IR signals to the transmit zones, while the receiver array 24A-24I receives light energy including reflected infrared radiation signals from objects located within the corresponding IR receive zones 34A-34I. The detected light signals are output as frequency signals which are then processed by a processor. By knowing which one of the IR transmit zones 32A-32I is illuminated with infrared radiation at a given point in time, the location and range of the detected object can be determined. As an object moves, the progression of the object through multiple zones can be monitored to determine speed and direction of the object, such that a processor may determine whether a pre-impact event of the object with the host vehicle 10 is detected. In addition to the sequential illumination of IR transmitters 22A-22I, the system 20 may also activate the additional IR transmitter 26 as part of the sequence to detect objects within the illuminated horizontal IR calibration beam 28. The sequence of illumination may include successive activation of the nine IR transmitters 22A-22I, the activation of the tenth IR transmitter 26, then all IR transmitters turned off, and then repeat the cycle.
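The illumination sequence described above (nine zone transmitters fired one at a time, then the tenth calibration transmitter, then all transmitters off, then repeat) can be sketched as a simple schedule generator. The function and label names are illustrative, not from the patent.

```python
def illumination_sequence(n_zones=9, include_calibration=True):
    """Return one scan cycle as an ordered list of transmitter
    activations: each of the n_zones zone transmitters in turn, the
    horizontal calibration transmitter by itself, and finally an
    all-off state before the cycle repeats."""
    cycle = ["zone_%d" % i for i in range(n_zones)]
    if include_calibration:
        cycle.append("calibration")
    cycle.append("all_off")
    return cycle
```

Repeating this cycle at under three milliseconds per zone, as the text describes, yields a complete field image covering all zones on each pass.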
  • With particular reference to FIG. 5, the integrated IR transmitter/receiver 25 is illustrated, according to one embodiment. The IR transmitter/receiver 25 is shown generally having a housing 60 containing an array of nine IR LEDs 22A-22I mounted onto the top side of a circuit board 62. The IR LEDs 22A-22I may be disposed behind respective beam-forming optics, which may include reflecting (e.g., parabolic reflector) and/or refracting optical elements or an aperture for defining the conical-shaped beam pattern. It should be appreciated that the IR transmitter array 22A-22I may employ any of a number of signal transmitting elements for illuminating multiple transmit zones, and may be configured in any of a number of beam pattern shapes and sizes. According to one example, the IR LEDs 22A-22I may employ a central wavelength of about 850 nanometers. One example of a commercially available IR LED is available from OSRAM Opto Semiconductors Inc., sold under the brand name Golden Dragon.
  • The IR transmitter/receiver 25 is shown employing nine photodetectors 24A-24I which serve as photosensitive receivers and are shown mounted on the bottom side of the circuit board 62. Photodetectors 24A-24I may generally be placed behind corresponding receiving lenses and/or receiving reflectors (e.g., parabolic reflectors). The receiving lenses may include reflecting and/or refracting optical elements that focus the reflected infrared radiation received from the corresponding IR receive zones 34A-34I onto the photodetectors 24A-24I, respectively. The receiver array may employ any number of receiver elements for receiving reflected IR signals from objects within the corresponding receive zones 34A-34I and may each be configured in a cone shape or other shapes and sizes. One example of a photodetector is a light-to-frequency converter commercially available from Texas Advanced Optoelectronic Solutions (TAOS). The light-to-frequency converter provides a frequency output (Hz) as a function of the amplitude of the detected light radiation.
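The light-to-frequency conversion above implies that the processor recovers light amplitude by counting output pulses over a known gate time. The following sketch shows that recovery step; the function name, the gate-counting model, and the scale-factor units are assumptions, not taken from a specific TAOS datasheet.

```python
def irradiance_from_frequency(pulse_count, gate_time_s, hz_per_unit_irradiance):
    """Recover detected light amplitude from a light-to-frequency
    photodetector: counting output pulses over a known gate time gives
    the output frequency in Hz, and dividing by the device's (assumed
    linear) frequency-per-irradiance scale factor gives irradiance."""
    frequency_hz = pulse_count / gate_time_s
    return frequency_hz / hz_per_unit_irradiance
```

For example, 1000 pulses counted over a 10 ms gate corresponds to a 100 kHz output, which maps back to irradiance through the device's scale factor.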
  • Referring to FIG. 6, the crash sensing system 20 is further illustrated employing a microprocessor 50 having various inputs and outputs. The microprocessor 50 is shown providing outputs to the nine IR transmitters 22A-22I and the tenth IR transmitter 26. The microprocessor 50 outputs LED strobe signals to the IR LEDs 22A-22I and 26 to activate the IR LEDs in a cyclical pattern. Signals indicating light and reflected infrared radiation received by each of the receiver elements 24A-24I are input to the microprocessor 50. In the embodiment disclosed, the receiver elements 24A-24I provide frequency signals input to the microprocessor 50 in the form of a beam IR signature. The beam IR signature includes frequency (in hertz) representing the amplitude of the photo or light energy detected by its irradiance on the receiver from within a given detection zone.
  • In addition, the microprocessor 50 receives an input from a passive thermal far IR receiver 46. The passive thermal far IR receiver 46 detects emitted radiation within a relatively large area and serves as a safing input that may be logically ANDed with a processor generated output signal to provide risk mitigation for high target certainty. Alternately, the crash sensing system 20 may employ radar or an ultrasonic transducer as the safing input. Further safing inputs may include a steering wheel angular rate, yaw, external lateral slip, lateral acceleration and lateral velocity signals, amongst other possible safing inputs.
  • The crash sensing system 20 further includes memory 52, including volatile and/or non-volatile memory, which may be in the form of random access memory (RAM), electrically erasable programmable read-only memory (EEPROM) or other memory storage medium. Stored within the memory 52 is a sensing routine 100 for processing the sensed data and determining a pre-impact event as described herein. Also stored in memory 52 and executed by microprocessor 50 are a routine 200 for detecting vehicle features and determining a pre-impact event and a routine 300 for performing terrain normalization and determining a pre-impact event.
  • Additionally, the microprocessor 50 provides a resettable countermeasure deploy output signal 54 and a non-resettable countermeasure deploy output signal 56. The countermeasure deploy output signals 54 and 56 may be employed to mitigate the effects of an anticipated collision. Examples of countermeasure deploy activity may include deploying a pretensioner for one or more seat belts, deploying one or more air bags and/or side air bag curtains, controlling an active suspension or other vehicle dynamics adjustment, or further may activate other countermeasures on board the host vehicle 10. These and other deployments may be initiated early on, even prior to an actual impact. Further, the microprocessor 50 receives vehicle speed 58 which may be a measured vehicle speed or a vehicle speed estimate. Vehicle speed 58 may be employed to determine whether or not a lateral impact with an object is expected and is further employed for terrain normalization to determine whether or not an object is stationary despite its shape and orientation.
  • The sensing routine 100 is illustrated in FIG. 7 for sensing an anticipated near impact event of an object with the host vehicle. Routine 100 begins at step 102 to sequentially transmit IR beams, and then proceeds to step 104 to monitor received photosensitive beams within the array of coverage zones. This occurs by sequentially applying IR radiation within each of the transmit beams and receiving light energy including reflected IR signals from the receive zones. Next, routine 100 performs noise rejection on the beam data that is received. In decision step 108, routine 100 determines if the temporal gating has been met and, if not, returns to step 102. The temporal gating bracketing requirements may take into consideration the path, trajectory and rate of the object, the number of pixels of area, the inferred mass/volume/area, the illumination consistency, and angular beam spacing consistency.
  • According to one embodiment, temporal gating requirements are determined based on comparison of an object's perceived motion (detection from one contiguous coverage zone to another) across the coverage zones to the expected relative speed of potential collision objects of interest (e.g., an automotive vehicle moving at a closing speed of 10 to 65 kilometers per hour (kph) or 6 to 40 miles per hour (mph) to a host vehicle's lateral side). The “range rate” of distance traveled per unit time of a potential collision object can be determined by the detection assessment of contiguous coverage zones for range rates consistent with an expected subject vehicle's closing speed (i.e., if an object is detected passing through the coverage zones at a rate of 1 observation zone per 70 milliseconds and each coverage zone is 0.3 meters in diameter perpendicular to the host vehicle's lateral side, then the closing speed or range rate is approximately 4 meters per second, and is equivalent to approximately 15 kph or 10 mph). Objects moving at range rates slower or faster than the expected range rate boundary through the coverage zones would not pass the temporal gating requirement.
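The worked range-rate example above reduces to a short calculation: zone diameter divided by the time to cross one zone, converted to km/h and compared against the expected closing-speed window. This sketch follows that arithmetic; the function and parameter names, and the default window of 10 to 65 kph, follow the figures in the text but are otherwise illustrative.

```python
def passes_temporal_gate(zone_diameter_m, zone_dwell_s,
                         min_speed_kph=10.0, max_speed_kph=65.0):
    """Temporal-gating sketch: compute the object's range rate from
    the coverage-zone diameter and the observed dwell time per zone,
    then require it to fall within the expected closing-speed window
    for a vehicle-sized threat."""
    range_rate_kph = (zone_diameter_m / zone_dwell_s) * 3.6
    return min_speed_kph <= range_rate_kph <= max_speed_kph
```

Using the text's figures, a 0.3 m zone crossed in 70 ms gives roughly 4.3 m/s, about 15 kph, which passes the gate; an object dwelling 500 ms per zone (about 2 kph) does not.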
  • Additional assessment can be made based on the quality of the received signal of a potential object as it passes through the coverage zones. If the amplitude of the detected signal varies substantially from one contiguous coverage zone to another (even if all signals are above a threshold value), it could indicate an off-axis collision trajectory or perhaps an object with a mass not consistent with a vehicle. The signal fidelity and consistency through the contiguous coverage zones can be used to verify a potential vehicle collision.
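The signal-fidelity assessment above can be sketched as a relative-spread check across contiguous zones. The function name, the spread metric, and the 0.5 limit are assumptions for illustration; the patent describes the concept without specifying a metric.

```python
def signal_consistency_ok(zone_amplitudes, max_rel_spread=0.5):
    """Sketch of the fidelity check: even when every zone amplitude is
    above threshold, a large amplitude spread (here, peak-to-peak
    range relative to the mean) from one contiguous zone to the next
    can indicate an off-axis trajectory or a non-vehicle mass."""
    mean = sum(zone_amplitudes) / len(zone_amplitudes)
    if mean == 0:
        return False
    spread = (max(zone_amplitudes) - min(zone_amplitudes)) / mean
    return spread <= max_rel_spread
```

A sequence of similar amplitudes passes, while one that swings widely between zones is flagged as inconsistent with a vehicle collision trajectory.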
  • If the temporal gating has been met, routine 100 then proceeds to decision step 110 to determine if the far IR safing has been enabled and, if not, returns to step 102. If the safing has been enabled, routine 100 proceeds to deploy an output signal indicative of a sensed pre-impact event in step 112. The output signal may be employed to activate deployment of one or more countermeasures.
  • The crash sensing system 20 creates a three-dimensional space extending from the lateral side of the host vehicle 10 by way of an array of high speed sequentially illuminated and scanned infrared light signals provided in dedicated coverage zones directed to the lateral side of the host vehicle 10. Objects which appear within the coverage zones are scanned, and their location, range, speed, and direction are determined. In addition, the size of the object may be calculated. Further, the shape of the object and one or more features present on the object, such as reflectivity, may be determined. It should be appreciated that feature identification, such as may be achieved by monitoring reflectivity variations (e.g., due to color), may be performed and an enhanced range may be determined. The processor 50 processes the information including location, range, speed and direction of the object in addition to the host vehicle speed, and determines whether or not a detected object is expected to impact the side of the host vehicle 10. The processor 50 processes the location of the detected object, range to the detected object, speed of the detected object, and direction of the detected object in relation to the host vehicle 10 and the speed of the host vehicle 10. Additionally, the processor 50 may further process the size and shape of the object in order to determine whether the object will likely collide with the host vehicle 10 and whether the object is of a sufficient size to be a concern upon impact with the host vehicle 10. If the object is determined to be sufficiently small in size or moving at a sufficiently slow rate, the object may be disregarded as a potential crash threat, whereas a large object moving at a sufficiently high rate of speed toward the host vehicle 10 may be considered a crash threat.
  • While the crash sensing system 20 is described herein in connection with an integrated IR transmitter/receiver having nine IR transmitters and nine photosensitive receivers each arranged in an array of three-by-three (3×3), and the addition of an additional IR transmitter 26 and an optional photosensitive receiver 27, it should be appreciated that other infrared transmit and receive configurations may be employed without departing from the spirit of the present invention. It should further be appreciated that other shapes and sizes of coverage zones for transmitting IR radiation and receiving photosensitive energy radiation may be employed and that the transmitters and/or receivers may be located at various locations on board the host vehicle 10. U.S. Patent Application Publication No. 2009/0099736, entitled “VEHICLE PRE-IMPACT SENSING SYSTEM AND METHOD” discloses various configurations of IR transmitter and receiver arrays for detecting objects to a lateral side of a vehicle 10. The aforementioned U.S. Patent Application Publication is hereby incorporated herein by reference. It should further be appreciated that variations in segmented lens or reflector designs may be utilized to provide design flexibility for customized coverage zones. One example of a segmented lens is disclosed in U.S. Patent Application Publication No. 2008/0245952, filed on Apr. 3, 2007 and entitled “SYNCHRONOUS IMAGING USING SEGMENTED ILLUMINATION,” the entire disclosure of which is hereby incorporated herein by reference.
  • It should be appreciated that a complete field image encompassing all the coverage zones may be generated with each scan of the entire array over the covered volume. By comparing successively acquired photosensed images, the size, shape, location, range and trajectory of an incoming object can be determined. To aid in the estimation of the range (distance) of the object from the system 20, and hence the host vehicle 10, the additional IR illuminator 26 and optional receiver 27 may be employed along with triangulation. By employing triangulation, the presence of an object in the designated zones is compared to the additional IR transmit zone 28 such that the range (distance) can be determined. Additionally, the reflection power of the signal received can be used to enhance the range estimate and thereby enhance the detection of a pre-crash event.
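The triangulation mentioned above can be illustrated with a textbook law-of-sines calculation; this is a generic sketch, not the patented geometry, and the baseline and angle conventions are assumptions:

```python
import math

def triangulate_range(baseline_ft, angle_main_rad, angle_aux_rad):
    """Estimate range to an object seen from two vantage points separated
    by a known baseline (e.g., the main array and the additional
    illuminator), given the bearing angle measured at each end of the
    baseline toward the object."""
    gamma = math.pi - angle_main_rad - angle_aux_rad  # angle at the object
    if gamma <= 0.0:
        raise ValueError("rays do not converge on a target")
    # Law of sines: range from the main sensor is opposite the aux angle.
    return baseline_ft * math.sin(angle_aux_rad) / math.sin(gamma)
```

With a one-foot baseline and both bearings at 45 degrees, the object sits at the apex of a right isosceles triangle, about 0.707 ft from either end.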
  • The vehicle pre-crash impact sensing system 20 employs a feature detection scheme that identifies certain features of an object vehicle, particularly an object vehicle moving laterally toward the host vehicle 10, to provide enhanced vehicle discrimination. According to a first embodiment of the feature identification scheme, the sensing system 20 employs the horizontally illuminated IR calibration beam 28 in conjunction with the IR transmit beams 32A-32I in an attempt to identify a known feature such as a bumper and/or grille of a laterally oncoming vehicle. In doing so, the horizontal calibration IR beam transmitter 26 is multiplexed between the main IR beams 32A-32I to allow enhanced range calibration. By multiplexing the additional IR illumination calibration beam 28 with the standard nine IR transmit zones 32A-32I, the range of the reflected object can be better estimated. Additionally, by employing a calibration chart, the reflection coefficient of the surface of the detected object may be used for an increased-accuracy range estimate, and thus improved risk assessment for proper side air bag deployment or other countermeasure. The optional tenth photosensitive receiver 27 may be employed to provide received photosensitive signals within zone 29 to further enhance the process.
  • According to another embodiment, enhanced discrimination of an oncoming laterally moving vehicle can be achieved by employing one or more scanned beams in a generally U-shaped configuration which is generally configured to encompass the shape of a common vehicle front, particularly the fascia. Multiple U-shaped patterns, smaller at a distant focus and growing larger nearby, may be created with the physical structure of the beam hardware (e.g., via the optical design). Alternately, the beam pattern can be created in software if the number of beams fully covers the oncoming laterally moving vehicle's trajectory path from far to nearby. The U-shaped beam forms may have ends of about three feet by three feet which focus on the oncoming laterally moving vehicle's headlamps/signal markers, and a center connecting line of about a two-foot height which receives the oncoming laterally moving vehicle's grille chrome. Accordingly, the pre-impact sensing system applies vehicle fascia detection with vehicle front grille shaped optical regions for improved detection of approaching vehicles. Nine overlapping regions may allow target tracking and relative ranging, and the geometry can apply to either the IR illumination shape or the light receiver shape, or possibly both the transmit and receive shapes.
  • According to a further embodiment, enhanced discrimination of an oncoming laterally moving vehicle may be achieved by detecting the differential spectral return of highly reflective vehicle surfaces, such as signal markers, headlamps, fog lamps, license plates, chrome and other features typical of vehicle front ends. Additionally, background light illumination levels may also be used to measure the highly reflective vehicle elements. Additionally, the system 20 may be used to detect the headlamp-on status of the oncoming vehicle, thereby allowing discrimination of headlamp presence as well as any pulse width modulation (PWM) of the lamps. LED headlamps, which are also pulsed, may be sensed and used as an additional discrimination element. The geometry of the spectral objects on an inbound oncoming laterally moving vehicle may also aid in the discrimination risk assessment.
  • Referring now to FIG. 9, a routine 200 is illustrated for sensing a pre-crash event of a vehicle to deploy one or more devices and includes steps of implementing the various aforementioned embodiments of the vehicle feature identification to advantageously provide updated object range estimates. Routine 200 begins at step 202 and proceeds to step 204 to illuminate the IR beam N, wherein N indexes the IR transmit beams, which are sequentially activated. Next, in step 206, routine 200 stores which IR transmitter is turned on, and receives the IR amplitude data for the Nth receive zone. It should be appreciated that the transmit beams are turned on one beam at a time, according to one embodiment, and that the light energy including IR energy reflected from objects within each zone is detected by receivers covering the corresponding receive zones. In step 208, beam N is turned off, and at step 210, the amplitude data of received light energy for beam N is stored with the IR off. Next, N is incremented to the next successive zone of coverage. At decision step 214, routine 200 determines if N has reached a value of ten, which is indicative of the number of IR transmitters, and, if not, repeats at step 202. Once a complete cycle of all ten zones has been completed with N equal to ten (10), routine 200 proceeds to step 216 to indicate that a frame is complete, such that the received energy data is mapped out for the coverage zones.
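The sequential on/off acquisition loop of steps 204-216 can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the hardware hooks `set_beam` and `read_zone` are hypothetical placeholders for the transmitter drive and photodetector read-out:

```python
def acquire_frame(set_beam, read_zone, n_beams=10):
    """Sequentially pulse each IR transmit beam and record the received
    amplitude for its zone with the beam on and again with it off,
    mirroring steps 204-214 of routine 200."""
    frame = []
    for n in range(n_beams):
        set_beam(n, True)                # illuminate beam N (step 204)
        on_amp = read_zone(n)            # reflected IR plus ambient (step 206)
        set_beam(n, False)               # beam N off (step 208)
        off_amp = read_zone(n)           # ambient light only (step 210)
        frame.append((on_amp, off_amp))  # increment N and repeat
    return frame                         # frame complete (step 216)
```

Each `(on, off)` pair later yields one per-zone ripple value, since the difference cancels the ambient term common to both reads.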
  • Once a frame is complete, routine 200 proceeds to step 218 to perform a sequential IR modulation signal calculation for each of the IR sensors, shown as sensors 1-9 indicative of the outputs of the respective photodetectors 24A-24I. Step 218 essentially performs a sequential IR modulation signal calculation by taking the difference of the sensed photo signal while the infrared transmitter is turned on and while the infrared transmitter is turned off for each coverage zone. As such, for zone 1, a signal (X1) is determined based on the difference of the receive signal with the IR on and the IR off, signal (X2) is similarly the zone 2 receive signal with the IR on minus the IR off, etc. By turning the IR transmitter array on and off at a frequency such as three hundred (300) hertz, the emitted IR beams are essentially modulated at the switching or modulation frequency. The difference between the received IR energy signals when the IR transmitter is turned on and when the same IR transmitter is turned off produces a ripple signal as described herein. The sequential IR modulation signal calculation is performed for each of the nine zones 1-9 to generate corresponding ripple signals to remove the overall background noise.
  • For an embodiment that employs the tenth IR receiver 27, routine 200 performs step 220 which processes the output of the tenth IR receiver 27 to acquire enhanced signal to noise ratio. Step 220 is an optional step that performs a reference IR modulation signal calculation for the tenth receiving sensor, also referred to as receiver 27. In doing so, signal (X10) is generated as a function of the IR on minus the IR off for the tenth coverage zone.
  • Routine 200 then proceeds to step 222 to perform a differential signal calculation (Y) for each of sensors 1-9 to acquire an enhanced differential signal by removing background noise. The differential signal calculation acquires signal Y1 as the difference between signal X1 and signal X10, to the extent the tenth receiver is employed. Similarly, signal Y2 is acquired by taking the difference between signal X2 and signal X10. Likewise, for each of zones 3-9, the signal X10 is subtracted from the corresponding received signal for that zone to provide a respective ripple signal for each zone.
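Steps 218 and 222 reduce to two per-zone subtractions. The sketch below is illustrative only; the function names are assumptions, and `frame` is the list of `(on, off)` amplitude pairs gathered during a scan:

```python
def modulation_signals(frame):
    """Step 218 sketch: per-zone ripple X = (receive signal with IR on)
    minus (receive signal with IR off), cancelling the steady background
    light in each coverage zone."""
    return [on_amp - off_amp for on_amp, off_amp in frame]

def differential_signals(x_zones, x_ref):
    """Step 222 sketch: Y = X - X10, subtracting the reference-zone
    ripple from each zone's ripple to further suppress common noise."""
    return [x - x_ref for x in x_zones]
```

A zone with no reflecting object produces nearly identical on/off reads, so its X (and hence Y) ripple stays near zero.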
  • Following step 222, routine 200 proceeds to step 224 to use an object calibration for vehicle feature identification. According to one embodiment, routine 200 uses a standard object calibration for a bumper detection to detect the bumper or similar feature(s) on a lateral oncoming vehicle. According to another embodiment, routine 200 employs a highly reflective object calibration for fascia and headlamp detection. It should be appreciated that the object calibration for bumper detection and reflective object calibration for fascia/headlamp detection may be achieved by use of one or more calibration charts or look-up tables, such as the exemplary calibration chart shown in FIG. 10.
  • The calibration chart shown in FIG. 10 essentially maps a plurality of different sample objects having various features such as shapes, colors, materials (textures) and reflectivity as a function of the ripple signal in hertz versus range in feet. It should be appreciated that the IR receiver photodetectors each provide an output frequency dependent upon the photosensitive detection. The frequency is generally equal to the amplitude (A) of the ripple signal divided by the distance (d) squared (frequency = A/d²). The ripple signals shown are the reduced-noise Y signals generated at step 222. The various targets that are shown by lines or curves 70A-70H in the example calibration table of FIG. 10 include arbitrarily selected materials such as white paper, black cotton twill, brick, tree and fog, black paper, black board, black cotton gauze, asphalt, and a typical sky scenario. It should be appreciated that these and other targets or selected materials may be mapped out in a given calibration table for use with routine 200.
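Since the detector output frequency follows f = A/d², a calibration curve can be inverted to infer range. The following is a hypothetical sketch; the per-material coefficient `amplitude_coeff` stands in for a curve read from a chart like FIG. 10 and is not a value from the patent:

```python
import math

def inferred_range_ft(ripple_freq_hz, amplitude_coeff):
    """Invert f = A / d**2 to get d = sqrt(A / f). The coefficient A
    encodes the target's reflectivity (one value per calibration curve,
    e.g. white paper vs. asphalt in FIG. 10)."""
    if ripple_freq_hz <= 0.0:
        return math.inf  # no ripple means no detectable IR return
    return math.sqrt(amplitude_coeff / ripple_freq_hz)
```

With an assumed white-target coefficient of A = 180,000, a 5,000 Hz ripple inverts to six feet, matching the white-target-at-six-feet example given later in the text.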
  • Next, in step 226, routine 200 compares the differential signal calculations Y to determine the highest correlation for a detected geometry. In doing so, routine 200 uses a selected calibration map to determine the optimum range estimate based on a common range from the nine received ripple signals. In step 228, routine 200 updates the object range estimate based upon the identified feature(s). Accordingly, it should be appreciated that by identifying an anticipated feature for the front end of a lateral approaching vehicle, such as the vehicle bumper, fascia and/or headlamp, enhanced object range may be estimated for use in the pre-crash sensing system 20.
  • Following the step of updating the object range estimate, routine 200 proceeds to decision step 230 to determine if the temporal gating has been met, and if not, returns to step 202. The temporal gating step 230 may be the same or similar temporal gating step described in connection with step 108 of routine 100. If the temporal gating has been met, routine 200 proceeds to decision step 232 to see if the thermal IR safing has been enabled and, if so, deploys one or more devices in step 234. If the thermal IR safing is not enabled, routine 200 returns to step 202. The thermal safing step 232 may be the same or similar to the safing logic described in connection with step 110 of routine 100.
  • Accordingly, routine 200 advantageously provides for an enhanced object range estimate based on the detected type of feature of a vehicle that is oncoming in the lateral direction. By detecting one or more features of the detected object, routine 200 advantageously looks up the calibration data and provides an updated range estimate which advantageously enhances the determination of whether a laterally approaching vehicle, such as an automobile, is expected to collide with the host vehicle 10.
  • Referring to FIG. 11, the U-shaped geometry for a single transmit beam 32 is shown superimposed onto the fascia portion of the front side of a vehicle, which represents a potential oncoming laterally moving vehicle. The U-shaped transmit beam 32 may be broadcast as an infrared beam by a single IR transmitter, according to one embodiment. According to another embodiment, multiple U-shaped IR transmit beams 32A-32I are transmitted as shown in FIG. 12, each having a generally U-shape and covering separate zones which overlap each other. The multiple U-shaped beams may extend from a small, distant focus to larger and nearby, and could be created with the physical structure of the beam hardware (e.g., via the optical design). Alternately, such a beam pattern could be created in software if the number of beams fully covers the oncoming laterally moving vehicle's trajectory path from a far distance to a nearby distance. According to one example, the U-shaped beam form may have ends of about three feet by three feet at a distance of about six feet, which focus on the oncoming laterally moving vehicle's headlamps/signal markers, and a center connecting line of about a two-foot height which receives the oncoming laterally moving vehicle's grille chrome. While an array of U-shaped transmit beams is shown and described herein, it should be appreciated that the geometry of the light receivers may be shaped in a U-shape instead of or in addition to the transmit beams.
  • By processing the U-shaped beam and the IR signals received therefrom, the trajectory and range of the oncoming vehicle object may be better determined. In an alternate embodiment, the U-shaped beams may be shaped as ovals for simplicity, or another shape that picks up the fascia and similar features of the front end of the oncoming vehicle. It should be appreciated that as the IR transmit beams cross the front fascia of an oncoming vehicle front end horizontally, a resultant ripple signal is generated. The ripple signal is the difference in detected energy signal when the IR is turned on and when the IR is turned off. It should be appreciated that for a high-reflectance feature, such as the headlamps and signal markers, a higher ripple signal having a higher frequency is achieved. Typically, the center grille area of the front end of an oncoming vehicle is mainly paint, which has a lower reflection coefficient than the reflective components of the headlamps and the signal markers. The ripple signal signature can be processed to determine the presence of a likely vehicle fascia or portion thereof, including the headlamps and signal markers of the front end of an oncoming vehicle.
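The bright-end/dim-middle signature described above can be tested with a simple classifier. This is purely an illustrative sketch under assumed thresholds; the function name, thresholds, and profile representation are not from the patent:

```python
def looks_like_fascia(ripple_profile, lamp_thresh_hz, grille_thresh_hz):
    """Classify a horizontal sweep of per-spot ripple values as a likely
    vehicle front end: high-reflectance returns (headlamps / signal
    markers) at both ends, with a lower-reflectance painted grille area
    in between."""
    if len(ripple_profile) < 3:
        return False  # need at least two ends plus a middle
    ends_bright = (ripple_profile[0] >= lamp_thresh_hz
                   and ripple_profile[-1] >= lamp_thresh_hz)
    grille_dim = all(v <= grille_thresh_hz for v in ripple_profile[1:-1])
    return ends_bright and grille_dim
```

A uniformly bright profile (such as a flat reflective barrier) fails the middle test, which is exactly the discrimination the signature is meant to provide.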
  • The pre-crash sensing system 20 further employs a modulation technique to nullify background ambient light conditions and better enhance the estimated target range. The extreme lighting variation from darkness to full sunlight presents many challenges to an object detection system. These and other deficiencies can be overcome by turning the IR transmit array on and off at a high frequency of three hundred (300) hertz, for example, or by otherwise providing amplitude modulation of the IR light source with a square wave or a sine wave, so as to nullify the background ambient light conditions and better enhance the estimate of target range.
  • Extreme lighting variations and other deficiencies may be overcome by a modulation technique which measures a scene with ambient lighting and also with an artificial IR illumination source. The difference between the two measurements provides an inferred target range within the scene. This modulation method provides a very cost-effective means of target ranging, yet does not require the extreme power levels needed to overpower typical solar exposure, which can be cost prohibitive. According to another embodiment, the modulation technique may be implemented with a carrier signal as disclosed in U.S. Pat. No. 6,298,311, entitled “INFRARED OCCUPANT POSITION AND RECOGNITION SYSTEM FOR A MOTOR VEHICLE,” the entire disclosure of which is hereby incorporated herein by reference.
  • Referring to FIG. 13, the modulation technique is illustrated in connection with an example of the sensed IR and ripple signal both when a parked vehicle is located in the lateral side detection zone of the host vehicle 10 and when the parked vehicle is removed from the detection zone. As shown, the sensed raw irradiance or received light energy level is plotted as frequency versus time as the host vehicle drives by the parked car shown at top, from a time of about eighteen (18) seconds to a time of about thirty-three (33) seconds, such that initially there is no car on the lateral side of the host vehicle up until time equals eighteen (18) seconds, and the car then passes through the detection zones and departs the detection zones at time equals thirty-three (33) seconds. Initially, sunlight reflected from asphalt is detected at line 90 and a shadow is detected as indicated. Once the parked vehicle enters the detection zone, the received raw IR irradiance or light energy indicated by area 92 between lines 94 and 96 swings between max and min values as the IR transmitter is turned on at max border 94 and turned off at min border 96. The difference between the IR turned on and off signals is represented by the ripple signal 98, which goes from approximately zero (0) frequency to a frequency of about five thousand (5,000) hertz when the laterally located parked vehicle passes through the detection zone, and returns to a value of near zero (0) when the laterally located parked vehicle departs the detection zone. The modulation technique advantageously allows for detection of an object due to reflected signal independent of shadow or sunlight.
  • The data shown in FIG. 13 illustrates frequency data where the object was generally in the center of the images and the IR illumination alternates from on to off. This data yielded a ripple signal 98 of around one hundred (100) hertz of variation in the raw data when looking at the asphalt, and about five thousand (5,000) hertz with a white target at about six (6) feet, according to one example. The ripple signal 98 did not change even though the ambient lighting moved from full sun exposure during the time period of about zero to ten (0 to 10) seconds to the shadow of the vehicle at about seventeen (17) or eighteen (18) seconds. The data shown in FIG. 13 was taken within a 950 nanometer band, which requires rather expensive optical bandpass filters to eliminate sunlight. The modulation technique allows for operation with a much less costly long-wavelength filter, such as one passing wavelengths greater than 700 nanometers.
  • Given the ripple signal 98 generated in FIG. 13, the modulation technique enables the range estimation to be enhanced with acquisition from the calibration table shown in FIG. 10. As seen in FIG. 10, a white object's range can be fairly well predicted, as shown by the curve 70A representing white paper. Dark objects are also predictable, as shown by the curve 70H representing asphalt. Textured fabrics may exhibit a self-shadowing effect. By use of this inferred range information and the event progression of the multiple spot data, a side impact risk estimate can be made and, if deemed of sufficient risk, the side air bags can be deployed. By monitoring the ripple signal 98 shown in FIG. 13 for each coverage zone, the ranges shown by lines 72A, 72B and 72C may be established, for example. By looking at common points within the set of ranges 72A-72C, an enhanced range estimate may be established. In the given example, ranges 72A-72C have a common range value of about six feet. Since the frequency output of the IR photodetectors may vary based upon color or reflectivity in combination with range, an enhanced range estimate may be provided by looking for common range values within the zones.
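The common-range idea above can be sketched as a consensus search: each zone's ripple, being ambiguous in color/reflectivity, yields several candidate ranges (one per calibration curve), and the estimate is the value shared by every zone. This is an illustrative sketch with an assumed tolerance, not the patented algorithm:

```python
def common_range_ft(zone_candidates, tol_ft=0.5):
    """Each inner list holds the candidate ranges for one zone (one per
    calibration curve in a chart like FIG. 10). Return the first
    candidate from the first zone that appears, within tolerance, in
    every other zone's candidate list; None if there is no consensus."""
    first, *rest = zone_candidates
    for candidate in first:
        if all(any(abs(candidate - c) <= tol_ft for c in cands)
               for cands in rest):
            return candidate
    return None
```

In the six-foot example of the text, three zones whose candidate lists all contain a value near six feet would converge on that common range.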
  • The pre-crash sensing system 20 further employs a terrain normalization method to normalize out stationary objects that pass through the detection zones to the lateral side of the vehicle 10. Referring to FIG. 14, an example driving scenario is shown in which a host vehicle 10 employing the sensing system 20 travels along a roadway relative to several objects including an angled barrier 80, a lateral oncoming vehicle 84, a passing lateral vehicle 82, and a laterally projecting vehicle 86 expected to collide with the host vehicle 10. The terrain normalization method is able to normalize out stationary objects, such as the angled barrier 80 which, due to its shape, may appear to be moving toward the host vehicle 10, such that the system 20 may ignore such a stationary object in determining whether or not an impending lateral collision will occur. By use of range data, which is inherent to the power level of the reflected signal, a three-dimensional volume can be estimated with the array of coverage zones 34A-34I. As the host vehicle 10 is driven forward, the frontmost beams will detect a terrain pass-by of an object. This terrain information, including inferred distance, is propagated to the successive rearward beams which follow and is used to normalize out the existence of stationary objects, which are characterized as clutter to be ignored. Objects which have a lateral velocity component are recognized as potential oncoming targets and their characteristics are further evaluated for assessed threat of a lateral collision to the host vehicle 10.
  • In particular, physical objects such as an angled barrier (e.g., a guardrail) or an angled road line may have a shape resembling an oncoming vehicle bumper and can produce similar or identical IR pattern signatures, which could possibly cause undesirable deployment of an air bag or other device when not properly detected. The terrain normalization method attempts to detect and eliminate such false triggers. With the transmitter and receiver arrangement shown, a matrix of infrared transmit beams and receive beams illuminates the side of the host vehicle 10 to provide a volume sensed by the three-by-three (3×3) array. Using range data, which is inherent to the power level of the reflected signal, a three-dimensional volume can be estimated. As the host vehicle 10 is driven forward, the frontmost beams or zones will see the terrain pass by first. This terrain information, including distance, is propagated to the following beams in progression and is used to normalize out stationary objects. Objects which have a lateral velocity component are recognized as potential oncoming targets to the host vehicle 10, and their characteristics are further evaluated for assessed threat to the host vehicle 10.
  • Each scan of the matrix yields an object light level for each spot of the zones detected. From this, the range (distance) of an object can be inferred according to a distance look-up table. Generally, objects which are at ground level are quite low in reflected energy, as power is related to one divided by the distance squared. As seen in FIG. 14, barriers and oncoming cars are illuminated by the leading spot zones, and a normalization technique propagates this information rearward to the trailing spots. The data may be processed by averaging, normalization, and time differencing, among other techniques. Additionally, road speed and steering angle could be used to propagate the optical distance of the measured lead objects rearward, thus eliminating them from causing false deployments. In a similar manner, the detection of objects entering the rearmost zones first is processed in reverse, such as when a car passes the host vehicle 10. The resultant matrix of processed data, which has been normalized to remove oncoming and passing objects, is used to evaluate the event propagation of a lateral object, such as an oncoming laterally moving vehicle 86.
  • Referring to FIGS. 15A-15E, the progression of a stationary object is illustrated as a host vehicle 10 passes by stationary object 84. As seen in FIG. 15A, the stationary object 84 is in front of the nine detection zones. As the host vehicle 10 moves forward to the lateral side of the stationary object 84, the stationary object 84 is shown first being detected by zone A1 in FIG. 15B, then detected by both zones A1 and A2 in FIG. 15C, and then detected by zones A1-A3 in FIG. 15D. Finally, as the object 84 departs at least part of the detection zones, the object 84 is shown in FIG. 15E departing zone A1 while still being detected in zones A2 and A3. In this scenario, the system 20 employs a normalization routine to subtract out the propagated signal detected in a forward located zone from the raw data in an attempt to detect whether or not a lateral motion of the object is occurring. As seen in FIGS. 15A-15E, the object 84 is detected in zone A1 and, as it approaches zone A2, the propagated signal of zone A1 is subtracted from the raw data of zone A2 to provide a normalized result indicative of a stationary object which should be rejected.
  • Referring to FIGS. 16A-16D, a scenario is shown in which a stationary object 80 in the form of an angled barrier, for example, is passed by the host vehicle 10 and the system 20 employs terrain normalization to reject the stationary object 80, despite its potentially deceiving shape due to its angle towards the vehicle 10. In this example, the stationary object 80 first enters zone A1 in FIG. 16A, and then proceeds to enter zone A2 in FIG. 16B. In FIG. 16C, the stationary object 80 is detected by zones A1, A2 and B1, and then in FIG. 16D, the stationary object 80 is detected primarily in zones A3, B2 and C1. As the stationary object passes from zone A1 through and into zones A2, A3 and B1, the terrain normalization subtracts the propagated signal of the leading zone from the following zone. For example, in zone A2, the normalized result is that the propagated signal of zone A1 is subtracted from the raw data of zone A2 to determine that the stationary object 80 is detected and should be rejected. The propagated signal is subtracted based on a delay time which is determined based on the speed of host vehicle 10, such that the signal data of the preceding zone captures an area of space that is also captured in the following zone. The terrain normalization thereby takes into consideration the detected signal information from the preceding zone taking into consideration the time delay and the speed of the host vehicle 10.
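The delayed-subtraction step described above can be sketched as follows. This is an illustrative sketch only: the function name, the sample-based delay model, and the parameters are assumptions, with the delay chosen (as the text describes) so that the leading-zone history covers the same patch of ground the following zone sees later:

```python
def normalize_following_zone(raw_following, history_leading,
                             speed_mps, zone_spacing_m, sample_dt_s):
    """Subtract the leading zone's signal history, delayed by the time
    the host vehicle takes to travel one zone spacing, from the
    following zone's raw samples. A stationary object then cancels to
    roughly zero, while a laterally moving object survives."""
    # Samples of delay between a patch of terrain appearing in the
    # leading zone and the same patch appearing in the following zone.
    delay = int(round(zone_spacing_m / (speed_mps * sample_dt_s)))
    normalized = []
    for k, raw in enumerate(raw_following):
        propagated = history_leading[k - delay] if k >= delay else 0.0
        normalized.append(raw - propagated)
    return normalized
```

In the test below, a stationary object's signature appears in the following zone exactly one sample after the leading zone, so the normalized output is all zeros, i.e., the clutter is rejected.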
  • Referring to FIGS. 17A-17D, a further example of a laterally oncoming vehicle 86 is illustrated in which the laterally oncoming vehicle 86 first enters zones A1 and A2 in FIG. 17B, proceeds into zones A1, A2, B1, B2 and C1 in FIG. 17C, and finally proceeds into zones A1, A2, B1, B2 and C1-C3 in FIG. 17D. The terrain normalization effectively removes stationary objects so that the system 20 can detect that the laterally oncoming vehicle 86 is not stationary, but instead is moving with a lateral velocity component toward the host vehicle 10, such that an oncoming collision may occur.
  • Referring to FIGS. 18A-18C, the terrain normalization routine 300 is illustrated, according to one embodiment. The terrain normalization routine 300 begins at step 302 and proceeds to step 304 to illuminate the outermost row of IR transmit beams in zones A1, A2 and A3. Next, in step 306, routine 300 receives the IR amplitude data for receive zones A1, A2 and A3 while the IR is turned on. Next, the IR transmit beams for zones A1, A2 and A3 are turned off. Proceeding to step 310, routine 300 receives and stores the IR amplitude data for zones A1, A2 and A3 while the IR transmit beams are turned off. At step 312, routine 300 turns on the IR transmit beams for the next or middle row of zones B1, B2 and B3. In step 314, routine 300 receives the IR amplitude data for zones B1, B2 and B3 while the IR transmit beams are turned on and stores the received IR amplitude data in memory. At step 316, routine 300 turns off the IR transmit beams for zones B1, B2 and B3. With the IR transmit beams turned off, routine 300 proceeds to step 318 to receive the IR amplitude data for zones B1, B2 and B3 and stores the received IR amplitude data in memory.
  • Routine 300 proceeds to step 320 to turn on the IR transmit beams for the third or closest row of zones C1, C2 and C3. Next, at step 322, routine 300 receives the IR amplitude data for zones C1, C2 and C3 and stores the received IR amplitude data in memory. In step 324, routine 300 turns off the IR transmit beams for zones C1, C2 and C3. With the IR transmit beams turned off, routine 300 proceeds to step 326 to receive the IR amplitude data for zones C1, C2 and C3 and stores the received IR amplitude data in memory. Next, routine 300 turns on the IR transmit beam for the tenth or additional transmit beam at step 328. With the tenth or additional IR transmit beam turned on, routine 300 proceeds to step 330 to receive the IR amplitude data for zones A1, A2, A3, B1, B2, B3, C1, C2, C3, and the tenth receiver while the tenth or additional IR transmit beam is turned on. Finally, the initial frame is complete at step 332.
  • Once the frame is complete, routine 300 proceeds to step 334 to perform a sequential IR modulation signal calculation X for each of sensors one through nine, shown labeled A1-C3. This includes calculating the difference in signals when the IR transmit beam is turned on and when the IR transmit beam is turned off for each of zones A1-C3 to provide a raw ripple signal for each of zones A1-C3.
  • Proceeding on to step 336, routine 300 stores a history of the IR modulation signal calculation (X) for each of sensors one through nine for zones A1-C3. This involves storing the time average of the IR signals for each of the zones A1-C3.
  • At step 338, routine 300 looks at the forward vehicle motion and cancels out the stationary signals from the following sensor locations. This includes normalizing the signal for a given zone by subtracting the history of the previous zone from the following zone. For example, the history of zone A1 is subtracted from zone A2 when an object passes from zone A1 to zone A2. The continued signal normalization applies to zone A3, in which the history from zones A1 and A2 is subtracted from zone A3 at the appropriate time based upon a time delay determined by the vehicle speed. The time delay is based on vehicle speed so that the zones cover the same area of space. Signal normalization also occurs in rows B and C by subtracting out the signal from the prior zone. It should be appreciated that the same signal normalization applies in the reverse direction for a vehicle or object passing laterally from the rear of the host vehicle 10 toward the front of the host vehicle 10, except that the signal normalization is reversed such that the propagating signal in zone A3 is subtracted from zone A2, etc. For example, for zone A2, the current X2 data is taken and the history of zone A3 is subtracted, such that a sliding window is essentially provided. The terrain normalization essentially eliminates the fore and aft movement parallel to the host vehicle 10 in the detection zones. By doing so, stationary objects or clutter are rejected.
  • Routine 300 then proceeds to step 340 to determine the lateral component of the moving object, which is based on the lateral movement toward or away from the lateral side of the host vehicle 10. Next, in decision step 342, routine 300 determines whether there is a lateral velocity component greater than eighteen miles per hour (18 mph) and whether the object is large enough. If so, routine 300 proceeds to step 346 to update the object range estimate and then to decision step 348 to check the temporal gating; if the object is not large enough or the lateral velocity component is not greater than eighteen miles per hour (18 mph), routine 300 returns to step 302. At decision step 348, temporal gating is evaluated to determine whether the object is likely to collide with the host vehicle 10 and, if so, routine 300 proceeds to decision step 350 to determine whether thermal IR safing is enabled and, if so, deploys an output at step 352. If thermal IR safing is not enabled, routine 300 returns to step 302. It should be appreciated that the temporal gating of step 348 and the thermal IR safing of step 350 may be the same as or similar to the corresponding steps of routine 100 discussed above.
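The gating of steps 342 through 352 reduces to a conjunction of conditions, sketched below (illustrative Python; only the 18 mph threshold comes from the text, everything else is a naming assumption):

```python
# Sketch of the deploy gate in steps 342-352: deployment requires a lateral
# velocity above 18 mph, a sufficiently large object, a passed temporal
# gate, and enabled thermal IR safing. Parameter names are illustrative.

MIN_LATERAL_MPH = 18.0

def deploy_decision(lateral_speed_mph, object_large_enough,
                    temporal_gate_ok, thermal_safing_enabled):
    """Return True only if every deployment condition holds."""
    return (lateral_speed_mph > MIN_LATERAL_MPH
            and object_large_enough
            and temporal_gate_ok
            and thermal_safing_enabled)
```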
  • While a thermal far IR safing receiver 46 is shown and described herein for providing thermal IR safing, it should be appreciated that other safing techniques may be employed to eliminate false triggers. As described, a matrix of IR beams illuminates the side of the host vehicle 10 to provide a sensed volume; each beam in the matrix may be a single IR beam. Using range data, which is inherent in the power level of the reflected signal, a three-dimensional volume can be estimated. A separate technology to “safe” the primary deploy signal is required to guard against false air bag deployment. “Safing” is defined as a complementary measure verifying that the detected object is an oncoming laterally moving vehicle whose measured speed matches the speed measured by the primary detection. The measured speed may be approximately fifteen (15) to fifty (50) miles per hour, according to one example. Additionally, this concept of lateral velocity verification can be used to enable air bag deployment below fifteen miles per hour.
  • Discrimination sensors may be used to calculate the lateral velocity of the oncoming laterally moving vehicle for comparison with the safing lateral velocity. If the two are similar, it can be assumed with a high degree of confidence that the oncoming object is indeed a sufficient threat to the occupants of the host vehicle 10. Objects with a significant lateral velocity component, such as greater than eighteen (18) miles per hour, may be recognized as potentially dangerous targets, and their characteristics may be evaluated to assess the threat to the host vehicle 10. Each scan of the matrix yields an object light level for each spot or zone, and an analysis of the light levels from all the spots can infer the distance, velocity and trajectory of an object relative to the host vehicle 10.
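The cross-check between the primary and safing velocity estimates can be sketched as follows (illustrative Python; the agreement tolerance is an assumed parameter, while the 15 to 50 mph band is taken from the example above):

```python
# Sketch of the safing cross-check: the primary and safing lateral-speed
# estimates must agree within a tolerance (assumed value) and fall in the
# approximately 15-50 mph band mentioned in the text.

def safing_velocity_match(primary_mph, safing_mph, tolerance_mph=3.0):
    """Return True when the two estimates corroborate a deployable threat."""
    agree = abs(primary_mph - safing_mph) <= tolerance_mph
    in_band = 15.0 <= primary_mph <= 50.0
    return agree and in_band
```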
  • Discrimination technology considered for the safing technique can include active near IR (NIR), radar, or a camera. One or more safing technologies and one or more deploy technologies may be utilized in the design. Individual safing or deploy technologies can include active near IR, far IR (FIR), ultrasonic acoustics, a laser time-of-flight sensor (3D scanner), a 3D camera, or a stereo camera.
  • Employing the thermal IR safing technique, heat may be detected from an oncoming vehicle. According to an ultrasonic safing technique, ultrasonic sensors perform the safing function. The safing function ensures that the event progression is due to an incoming vehicle and not due to other situations that would not require a side air bag deployment. Safing may prevent deployment of a side air bag when the host vehicle 10 strikes a stationary object, such as a tree, pole or parked vehicle. These objects are not likely to have the thermal signature of a moving vehicle and, in extreme yaw conditions, may not return a steady ultrasonic return, which is especially true for trees and poles. Therefore, there is a need to provide a means to disable or reduce safing requirements in yaw conditions.
  • Situationally dependent safing is a method of modifying side air bag pre-impact deployment safing based upon vehicle stability. During normal vehicle conditions, an active IR sensing system is employed to determine when a side impact is imminent, and supplemental information from either an ultrasonic or a passive IR system is used for safing. If the host vehicle 10 path is tangent to its fore-aft direction or the target follows a linear path into the side of the host vehicle 10, incoming targets will follow a normal progression and safing techniques will provide the information necessary to make a reliable decision. A relatively linear progression allows the active IR system sufficient path information to generate a mature path. In extreme yaw conditions, however, the path of the host vehicle 10 may not allow the development of a mature track for impacts, and the supplemental safing sensors are less likely to provide adequate information to supplement the deployment condition. If an ultrasonic sensor is employed, the host vehicle 10 may spin into the target too quickly to provide an adequate return. If a passive IR system is employed, the target may not generate the thermal signature necessary to allow deployment. Therefore, a decision tree may allow safing levels to be reduced in cases where the host vehicle 10 is not following a consistent path due to high yaw. The decision tree may include logically ORing the following conditions: far IR (FIR) greater than a threshold, steering wheel angular rate greater than N degrees per second, yaw greater than N degrees per second, lateral slip divided by yaw exceeding a threshold, and lateral acceleration greater than 0.5 g. The output of the OR logic is then logically ANDed with the discrimination output to determine whether or not to deploy one or more devices.
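The OR/AND decision tree just described can be sketched directly (illustrative Python; only the 0.5 g figure appears in the text, and the shared N degrees-per-second rate threshold is an assumed placeholder):

```python
# Sketch of the situationally dependent safing decision tree: the high-yaw
# indicators are ORed together, and the result is ANDed with the primary
# discrimination output to decide deployment. Thresholds other than 0.5 g
# are illustrative placeholders.

def situational_safing_deploy(fir_above_threshold,
                              steering_rate_dps, yaw_rate_dps,
                              slip_per_yaw_high, lateral_accel_g,
                              discrimination_deploy,
                              rate_threshold_dps=60.0):
    high_yaw = (fir_above_threshold
                or steering_rate_dps > rate_threshold_dps
                or yaw_rate_dps > rate_threshold_dps
                or slip_per_yaw_high
                or lateral_accel_g > 0.5)
    return high_yaw and discrimination_deploy
```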
  • Additionally, the vehicle travel direction can be inferred from the ground terrain monitoring of the active IR scanning system. By using both left and right pre-crash side sensors to monitor the optical flow of asphalt pattern directions, the vehicle velocity and direction, including lateral sliding, can be detected and appropriate countermeasures initiated. Under inclement conditions, such as blowing snow, rain or sand, side slip monitoring is fail-safed by a lateral yaw rate sensor. Lateral sliding information can be used for side air bag threshold lowering, stability control, or potentially rollover detection. The lateral slip sensor may use a left or right sensor to monitor road pattern directions. With the transmit/receive beams, optical flow through the beam matrix determines ground travel direction, vehicle rotation and potential vehicle roll.
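One crude way to read ground-flow direction from a single row of zone signals is to compare shifted copies of the previous frame against the current frame (an interpretive sketch in Python; the patent does not specify this matching method):

```python
# Sketch of per-row optical flow over a three-zone row (e.g. A1, A2, A3):
# decide whether the ground pattern moved one zone forward, one zone
# rearward, or stayed put, by least-squares matching of shifted copies of
# the previous frame against the current one. The method is an assumption.

def flow_direction(prev_row, curr_row):
    def err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    forward = err(prev_row[:-1], curr_row[1:])    # pattern advanced A1->A2->A3
    rearward = err(prev_row[1:], curr_row[:-1])   # pattern moved A3->A2->A1
    same = err(prev_row, curr_row)
    best = min(forward, rearward, same)
    if best == forward:
        return "forward"
    if best == rearward:
        return "rearward"
    return "none"
```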
  • As mentioned herein, the array of transmit and receive IR beams may be arranged in an overlapping configuration. Tailoring a three-dimensional volume to the side of the host vehicle 10 can pose a challenge: an oncoming laterally moving vehicle must be detected and a side air bag deployed, yet the numerous no-deploy objects that pass harmlessly by the host vehicle 10 must not cause false deploys. Beam overlap allows increased spatial resolution with a minimum number of discrete beams. To increase the effective spots or zones of a pre-crash side impact sensor without adding more channels, each of the nine beam spots may be enlarged to allow a twenty percent (20%) beam spot overlap, which provides twenty-one multiplexed zones, in contrast to the nine non-overlapping zones disclosed above, and thereby allows increased distance resolution of the incoming object. The geometry can apply to the IR illumination shape, the light receiver shape, or both transmitter and receiver shapes for more resolution. Accordingly, the beam overlap method may consist of overlapping scanned areas or regions to allow increased target tracking resolution.
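The nine-to-twenty-one zone figure is consistent with counting each spot plus one extra multiplexed zone per horizontally or vertically adjacent spot pair (this pairing is an assumed reading of the text, sketched in Python):

```python
# Assumed zone accounting for a rows x cols grid of overlapping beam spots:
# each spot is a zone, and each horizontally or vertically adjacent pair of
# spots contributes one additional overlap zone. For 3 x 3 spots this gives
# 9 + 6 + 6 = 21 multiplexed zones, matching the figure in the text.

def multiplexed_zone_count(rows, cols):
    spots = rows * cols
    h_overlaps = rows * (cols - 1)
    v_overlaps = (rows - 1) * cols
    return spots + h_overlaps + v_overlaps
```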
  • Accordingly, the pre-crash sensing system 20 of the present invention advantageously detects an impending collision of an object with the host vehicle 10 prior to the actual collision. The pre-crash sensing system 20 is cost-effective in determining whether an object is approaching the host vehicle 10 and may collide with it, sufficient to enable a determination of the impending collision prior to actual impact. Further, the pre-crash sensing system 20 may determine whether the object is of sufficient size and speed to deploy certain countermeasures.
  • It will be understood by those who practice the invention and those skilled in the art, that various modifications and improvements may be made to the invention without departing from the spirit of the disclosed concept. The scope of protection afforded is to be determined by the claims and by the breadth of interpretation allowed by law.

Claims (12)

1. A vehicle pre-impact sensing system comprising:
an array of energy signal transmitters mounted on a vehicle for transmitting signals within multiple transmit zones spaced from the vehicle;
an array of receiver elements mounted on the vehicle for receiving the signals reflected from an object located in one or more multiple receive zones indicative of the object being in certain one or more receive zones; and
a processor for processing the received reflected signals and determining range, location, speed and direction of the object, wherein the processor performs a normalization to remove stationary targets by propagating a leading signal from one zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal, wherein the processor further determines whether the object is expected to impact the vehicle as a function of the range, location, speed, direction of the object and the normalized signal, and generates an output indicative of a sensed pre-impact event.
2. The sensing system as defined in claim 1, wherein the processor determines a time delay for an object to pass from a leading detection zone to an adjacent rearward detection zone and subtracts the leading zone signal from the following zone signal based on the time delay.
3. The sensing system as defined in claim 1, wherein the processor ignores stationary targets.
4. The sensing system as defined in claim 1, wherein the array of energy signal transmitters comprises an array of light transmitters.
5. The sensing system as defined in claim 4, wherein the array of light transmitters comprises an array of infrared transmitters configured to emit infrared radiation in designated multiple transmit zones.
6. The sensing system as defined in claim 5, wherein the array of receiver elements comprises an array of photodetectors for receiving reflected infrared radiation, wherein the photodetectors receive light signals including reflected signals from designated receive zones.
7. The sensing system as defined in claim 1, wherein the system senses an object on a lateral side of the vehicle.
8. The sensing system as defined in claim 7, wherein the array of transmitters and receivers are mounted on an exterior mirror housing.
9. A method of detecting an expected impact of an object with a vehicle, said method comprising the steps of:
transmitting signals within multiple transmit zones spaced from the vehicle;
receiving signals reflected from an object located in the one or more multiple receive zones indicative of the object being in certain one or more receive zones;
processing the received reflected signals;
determining location of the object;
determining speed of the object;
determining direction of the object;
performing a normalization to remove stationary targets by propagating a leading signal from a leading zone rearward to an adjacent following zone based on vehicle speed to provide a normalized signal;
determining whether the object is expected to impact the vehicle as a function of the determined range, location, speed and direction of the object and the normalized signal; and
generating an output signal indicative of a sensed pre-impact event.
10. The method as defined in claim 9, wherein the step of performing the normalization further comprises determining a time delay for an object to pass from the leading detection zone to the adjacent detection zone and subtracting the leading zone signal from the following zone signal based on the time delay.
11. The method as defined in claim 10, wherein the time delay is based on vehicle speed.
12. The method as defined in claim 9, wherein the method senses an object on a lateral side of the vehicle.
US12/468,881 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Terrain Normalization Abandoned US20090299633A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/468,881 US20090299633A1 (en) 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Terrain Normalization

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13023608P 2008-05-29 2008-05-29
US12/468,881 US20090299633A1 (en) 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Terrain Normalization

Publications (1)

Publication Number Publication Date
US20090299633A1 true US20090299633A1 (en) 2009-12-03

Family

ID=41021330

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/468,879 Abandoned US20090299631A1 (en) 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Object Feature Detection
US12/468,881 Abandoned US20090299633A1 (en) 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Terrain Normalization
US12/468,880 Active 2030-12-17 US8249798B2 (en) 2008-05-29 2009-05-20 Vehicle pre-impact sensing system having signal modulation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/468,879 Abandoned US20090299631A1 (en) 2008-05-29 2009-05-20 Vehicle Pre-Impact Sensing System Having Object Feature Detection

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/468,880 Active 2030-12-17 US8249798B2 (en) 2008-05-29 2009-05-20 Vehicle pre-impact sensing system having signal modulation

Country Status (3)

Country Link
US (3) US20090299631A1 (en)
EP (3) EP2138359A1 (en)
AT (1) ATE549214T1 (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100164479A1 (en) * 2008-12-29 2010-07-01 Motorola, Inc. Portable Electronic Device Having Self-Calibrating Proximity Sensors
US20100271331A1 (en) * 2009-04-22 2010-10-28 Rachid Alameh Touch-Screen and Method for an Electronic Device
US20100294938A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Sensing Assembly for Mobile Device
US20100299642A1 (en) * 2009-05-22 2010-11-25 Thomas Merrell Electronic Device with Sensing Assembly and Method for Detecting Basic Gestures
US20110006190A1 (en) * 2009-07-10 2011-01-13 Motorola, Inc. Devices and Methods for Adjusting Proximity Detectors
US20120163671A1 (en) * 2010-12-23 2012-06-28 Electronics And Telecommunications Research Institute Context-aware method and apparatus based on fusion of data of image sensor and distance sensor
US8269175B2 (en) 2009-05-22 2012-09-18 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting gestures of geometric shapes
US8275412B2 (en) 2008-12-31 2012-09-25 Motorola Mobility Llc Portable electronic device having directional proximity sensors based on device orientation
US8294105B2 (en) 2009-05-22 2012-10-23 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting offset gestures
US8391719B2 (en) 2009-05-22 2013-03-05 Motorola Mobility Llc Method and system for conducting communication between mobile devices
US8542186B2 (en) 2009-05-22 2013-09-24 Motorola Mobility Llc Mobile device with user interaction capability and method of operating same
US8619029B2 (en) 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
US8665227B2 (en) 2009-11-19 2014-03-04 Motorola Mobility Llc Method and apparatus for replicating physical key function with soft keys in an electronic device
US8751056B2 (en) 2010-05-25 2014-06-10 Motorola Mobility Llc User computer device with temperature sensing capabilities and method of operating same
US8788676B2 (en) 2009-05-22 2014-07-22 Motorola Mobility Llc Method and system for controlling data transmission to or from a mobile device
US8963845B2 (en) 2010-05-05 2015-02-24 Google Technology Holdings LLC Mobile device with temperature sensing capability and method of operating same
US8963885B2 (en) 2011-11-30 2015-02-24 Google Technology Holdings LLC Mobile device for interacting with an active stylus
US9063591B2 (en) 2011-11-30 2015-06-23 Google Technology Holdings LLC Active styluses for interacting with a mobile device
US9103732B2 (en) 2010-05-25 2015-08-11 Google Technology Holdings LLC User computer device with temperature sensing capabilities and method of operating same
US9421929B2 (en) 2014-09-19 2016-08-23 Joseph Y. Yoon Airbag deployment control apparatus and method
US9592791B2 (en) 2014-09-19 2017-03-14 Joseph Y. Yoon Advanced seatbelt apparatus
CN106524036A (en) * 2012-02-07 2017-03-22 两树光子学有限公司 Lighting device for headlights with a phase modulator
JP2017150902A (en) * 2016-02-23 2017-08-31 株式会社Ihiエアロスペース On-vehicle laser radar device
US20180067495A1 (en) * 2016-09-08 2018-03-08 Mentor Graphics Corporation Event-driven region of interest management
US20180322788A1 (en) * 2017-05-04 2018-11-08 Allbridge Rogers Vehicle warning system
US10429506B2 (en) * 2014-10-22 2019-10-01 Denso Corporation Lateral distance sensor diagnosis apparatus
US10520904B2 (en) 2016-09-08 2019-12-31 Mentor Graphics Corporation Event classification and object tracking
US10730528B2 (en) * 2016-04-12 2020-08-04 Robert Bosch Gmbh Method and device for determining a safety-critical yawing motion of a vehicle
US11550036B2 (en) 2016-01-31 2023-01-10 Velodyne Lidar Usa, Inc. Multiple pulse, LIDAR based 3-D imaging
US11550056B2 (en) 2016-06-01 2023-01-10 Velodyne Lidar Usa, Inc. Multiple pixel scanning lidar
US11703569B2 (en) 2017-05-08 2023-07-18 Velodyne Lidar Usa, Inc. LIDAR data acquisition and control
US11796648B2 (en) 2018-09-18 2023-10-24 Velodyne Lidar Usa, Inc. Multi-channel lidar illumination driver
US11808891B2 (en) 2017-03-31 2023-11-07 Velodyne Lidar Usa, Inc. Integrated LIDAR illumination power control
US11885958B2 (en) 2019-01-07 2024-01-30 Velodyne Lidar Usa, Inc. Systems and methods for a dual axis resonant scanning mirror

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008154737A1 (en) 2007-06-18 2008-12-24 Leddartech Inc. Lighting system with traffic management capabilities
US8242476B2 (en) 2005-12-19 2012-08-14 Leddartech Inc. LED object detection system and method combining complete reflection traces from individual narrow field-of-view channels
JP2010529932A (en) * 2007-06-18 2010-09-02 レッダーテック インコーポレイテッド Lighting system with driver assistance function
EP2232462B1 (en) 2007-12-21 2015-12-16 Leddartech Inc. Parking management system and method using lighting system
JP5671345B2 (en) 2007-12-21 2015-02-18 レッダーテック インコーポレイテッド Detection and ranging method
WO2011077400A2 (en) * 2009-12-22 2011-06-30 Leddartech Inc. Active 3d monitoring system for traffic detection
US8908159B2 (en) 2011-05-11 2014-12-09 Leddartech Inc. Multiple-field-of-view scannerless optical rangefinder in high ambient background light
US9378640B2 (en) * 2011-06-17 2016-06-28 Leddartech Inc. System and method for traffic side detection and characterization
JP5853838B2 (en) * 2011-07-12 2016-02-09 株式会社デンソー In-vehicle laser radar system
DE102011112715A1 (en) * 2011-09-07 2013-03-07 Audi Ag Method for detecting an object in an environment of a motor vehicle
CA2865733C (en) * 2012-03-02 2023-09-26 Leddartech Inc. System and method for multipurpose traffic detection and characterization
EP3027469A4 (en) 2013-07-31 2017-04-05 SenseDriver Technologies, LLC Vehicle use portable heads-up display
US10203399B2 (en) 2013-11-12 2019-02-12 Big Sky Financial Corporation Methods and apparatus for array based LiDAR systems with reduced interference
WO2015095849A1 (en) 2013-12-20 2015-06-25 Amaru Michael Method and apparatus for in-vehicular communications
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
WO2016038536A1 (en) 2014-09-09 2016-03-17 Leddartech Inc. Discretization of detection zone
US10402143B2 (en) 2015-01-27 2019-09-03 Sensedriver Technologies, Llc Image projection medium and display projection system using same
US10036801B2 (en) 2015-03-05 2018-07-31 Big Sky Financial Corporation Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array
AU2016293384A1 (en) * 2015-07-14 2018-01-18 Technological Resources Pty. Limited Impact detection system
US10548683B2 (en) 2016-02-18 2020-02-04 Kic Ventures, Llc Surgical procedure handheld electronic display device and method of using same
US9866816B2 (en) 2016-03-03 2018-01-09 4D Intellectual Properties, Llc Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis
US11016499B2 (en) * 2016-07-29 2021-05-25 Pioneer Corporation Measurement device, measurement method and program
TWI678121B (en) * 2016-10-20 2019-11-21 宏達國際電子股份有限公司 Auxiliary apparatus for a lighthouse positioning system
JP6787766B2 (en) * 2016-12-09 2020-11-18 株式会社Subaru Vehicle rollover detection device
DE102017114565A1 (en) * 2017-06-29 2019-01-03 Osram Opto Semiconductors Gmbh An optical distance measuring device and method for operating an optical distance measuring device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5463384A (en) * 1991-02-11 1995-10-31 Auto-Sense, Ltd. Collision avoidance system for vehicles
US5818383A (en) * 1981-11-27 1998-10-06 Northrop Grumman Corporation Interferometric moving vehicle imaging apparatus and method
US6298311B1 (en) * 1999-03-01 2001-10-02 Delphi Technologies, Inc. Infrared occupant position detection system and method for a motor vehicle
US6697010B1 (en) * 2002-04-23 2004-02-24 Lockheed Martin Corporation System and method for moving target detection
US7359782B2 (en) * 1994-05-23 2008-04-15 Automotive Technologies International, Inc. Vehicular impact reactive system and method
US20080245952A1 (en) * 2007-04-03 2008-10-09 Troxell John R Synchronous imaging using segmented illumination
US20090099736A1 (en) * 2007-10-10 2009-04-16 Hawes Kevin J Vehicle pre-impact sensing system and method

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4134148A1 (en) * 1991-10-16 1993-04-22 Micro Epsilon Messtechnik Detecting positions of light reflecting objects, e.g. for testing vehicle shock absorbers - measuring intensity of light reflected from object, e.g., bar=code, and detecting steps in intensity distribution
US7831358B2 (en) * 1992-05-05 2010-11-09 Automotive Technologies International, Inc. Arrangement and method for obtaining information using phase difference of modulated illumination
US5635920A (en) * 1994-11-29 1997-06-03 J. B. Pogue Remote traffic signal indicator
DE19507957C1 (en) * 1995-03-07 1996-09-12 Daimler Benz Ag Vehicle with optical scanning device for a side lane area
DE19546715A1 (en) * 1995-12-14 1997-06-19 Daimler Benz Aerospace Ag Airbag releasing sensing system for motor vehicle
US5682225A (en) * 1996-06-07 1997-10-28 Loral Vought Systems Corp. Ladar intensity image correction for laser output variations
JP3766145B2 (en) * 1996-10-16 2006-04-12 株式会社日本自動車部品総合研究所 Vehicle interior condition detection device
US7592944B2 (en) * 1999-06-14 2009-09-22 Time Domain Corporation System and method for intrusion detection using a time domain radar array
US6400447B1 (en) * 2000-03-11 2002-06-04 Trimble Navigation, Ltd Differentiation of targets in optical station based on the strength of the reflected signal
US7146139B2 (en) * 2001-09-28 2006-12-05 Siemens Communications, Inc. System and method for reducing SAR values
US7961094B2 (en) * 2002-06-11 2011-06-14 Intelligent Technologies International, Inc. Perimeter monitoring techniques
US6839027B2 (en) * 2002-11-15 2005-01-04 Microsoft Corporation Location measurement process for radio-frequency badges employing path constraints
US7136753B2 (en) * 2002-12-05 2006-11-14 Denso Corporation Object recognition apparatus for vehicle, inter-vehicle control apparatus, and distance measurement apparatus
DE102004021561A1 (en) * 2004-05-03 2005-12-08 Daimlerchrysler Ag Object recognition system for a motor vehicle
DE102004032118B4 (en) * 2004-07-01 2006-09-21 Daimlerchrysler Ag Object recognition method for vehicles
DE102005056976A1 (en) * 2005-11-30 2007-06-06 GM Global Technology Operations, Inc., Detroit Motor vehicle`s e.g. parking vehicle, surrounding detecting device, has laser diode and receiver arranged for detection of information about part of surrounding located laterally next to vehicle with respect to driving direction
JP4657956B2 (en) * 2006-03-14 2011-03-23 三菱電機株式会社 Differential absorption lidar device
US7746450B2 (en) * 2007-08-28 2010-06-29 Science Applications International Corporation Full-field light detection and ranging imaging system


Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8030914B2 (en) 2008-12-29 2011-10-04 Motorola Mobility, Inc. Portable electronic device having self-calibrating proximity sensors
US20100164479A1 (en) * 2008-12-29 2010-07-01 Motorola, Inc. Portable Electronic Device Having Self-Calibrating Proximity Sensors
US8346302B2 (en) 2008-12-31 2013-01-01 Motorola Mobility Llc Portable electronic device having directional proximity sensors based on device orientation
US8275412B2 (en) 2008-12-31 2012-09-25 Motorola Mobility Llc Portable electronic device having directional proximity sensors based on device orientation
US20100271331A1 (en) * 2009-04-22 2010-10-28 Rachid Alameh Touch-Screen and Method for an Electronic Device
US8294105B2 (en) 2009-05-22 2012-10-23 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting offset gestures
US8970486B2 (en) 2009-05-22 2015-03-03 Google Technology Holdings LLC Mobile device with user interaction capability and method of operating same
US8269175B2 (en) 2009-05-22 2012-09-18 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting gestures of geometric shapes
US20100294938A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Sensing Assembly for Mobile Device
US8542186B2 (en) 2009-05-22 2013-09-24 Motorola Mobility Llc Mobile device with user interaction capability and method of operating same
US8304733B2 (en) * 2009-05-22 2012-11-06 Motorola Mobility Llc Sensing assembly for mobile device
US20100299642A1 (en) * 2009-05-22 2010-11-25 Thomas Merrell Electronic Device with Sensing Assembly and Method for Detecting Basic Gestures
US8788676B2 (en) 2009-05-22 2014-07-22 Motorola Mobility Llc Method and system for controlling data transmission to or from a mobile device
US8344325B2 (en) * 2009-05-22 2013-01-01 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting basic gestures
US8391719B2 (en) 2009-05-22 2013-03-05 Motorola Mobility Llc Method and system for conducting communication between mobile devices
US8619029B2 (en) 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
US8319170B2 (en) 2009-07-10 2012-11-27 Motorola Mobility Llc Method for adapting a pulse power mode of a proximity sensor
US8519322B2 (en) 2009-07-10 2013-08-27 Motorola Mobility Llc Method for adapting a pulse frequency mode of a proximity sensor
US20110006190A1 (en) * 2009-07-10 2011-01-13 Motorola, Inc. Devices and Methods for Adjusting Proximity Detectors
US8665227B2 (en) 2009-11-19 2014-03-04 Motorola Mobility Llc Method and apparatus for replicating physical key function with soft keys in an electronic device
US8963845B2 (en) 2010-05-05 2015-02-24 Google Technology Holdings LLC Mobile device with temperature sensing capability and method of operating same
US8751056B2 (en) 2010-05-25 2014-06-10 Motorola Mobility Llc User computer device with temperature sensing capabilities and method of operating same
US9103732B2 (en) 2010-05-25 2015-08-11 Google Technology Holdings LLC User computer device with temperature sensing capabilities and method of operating same
US20120163671A1 (en) * 2010-12-23 2012-06-28 Electronics And Telecommunications Research Institute Context-aware method and apparatus based on fusion of data of image sensor and distance sensor
US9063591B2 (en) 2011-11-30 2015-06-23 Google Technology Holdings LLC Active styluses for interacting with a mobile device
US8963885B2 (en) 2011-11-30 2015-02-24 Google Technology Holdings LLC Mobile device for interacting with an active stylus
CN106524036A (en) * 2012-02-07 2017-03-22 两树光子学有限公司 Lighting device for headlights with a phase modulator
US9421929B2 (en) 2014-09-19 2016-08-23 Joseph Y. Yoon Airbag deployment control apparatus and method
US9592791B2 (en) 2014-09-19 2017-03-14 Joseph Y. Yoon Advanced seatbelt apparatus
US10429506B2 (en) * 2014-10-22 2019-10-01 Denso Corporation Lateral distance sensor diagnosis apparatus
US11698443B2 (en) 2016-01-31 2023-07-11 Velodyne Lidar Usa, Inc. Multiple pulse, lidar based 3-D imaging
US11550036B2 (en) 2016-01-31 2023-01-10 Velodyne Lidar Usa, Inc. Multiple pulse, LIDAR based 3-D imaging
JP2017150902A (en) * 2016-02-23 2017-08-31 株式会社Ihiエアロスペース On-vehicle laser radar device
US10730528B2 (en) * 2016-04-12 2020-08-04 Robert Bosch Gmbh Method and device for determining a safety-critical yawing motion of a vehicle
US11874377B2 (en) 2016-06-01 2024-01-16 Velodyne Lidar Usa, Inc. Multiple pixel scanning LIDAR
US11808854B2 (en) * 2016-06-01 2023-11-07 Velodyne Lidar Usa, Inc. Multiple pixel scanning LIDAR
US11561305B2 (en) 2016-06-01 2023-01-24 Velodyne Lidar Usa, Inc. Multiple pixel scanning LIDAR
US11550056B2 (en) 2016-06-01 2023-01-10 Velodyne Lidar Usa, Inc. Multiple pixel scanning lidar
US20180067495A1 (en) * 2016-09-08 2018-03-08 Mentor Graphics Corporation Event-driven region of interest management
US11067996B2 (en) * 2016-09-08 2021-07-20 Siemens Industry Software Inc. Event-driven region of interest management
US10585409B2 (en) 2016-09-08 2020-03-10 Mentor Graphics Corporation Vehicle localization with map-matched sensor measurements
US10558185B2 (en) 2016-09-08 2020-02-11 Mentor Graphics Corporation Map building with sensor measurements
US10520904B2 (en) 2016-09-08 2019-12-31 Mentor Graphics Corporation Event classification and object tracking
US11808891B2 (en) 2017-03-31 2023-11-07 Velodyne Lidar Usa, Inc. Integrated LIDAR illumination power control
US20180322788A1 (en) * 2017-05-04 2018-11-08 Allbridge Rogers Vehicle warning system
US11703569B2 (en) 2017-05-08 2023-07-18 Velodyne Lidar Usa, Inc. LIDAR data acquisition and control
US11796648B2 (en) 2018-09-18 2023-10-24 Velodyne Lidar Usa, Inc. Multi-channel lidar illumination driver
US11885958B2 (en) 2019-01-07 2024-01-30 Velodyne Lidar Usa, Inc. Systems and methods for a dual axis resonant scanning mirror

Also Published As

Publication number Publication date
US20090299631A1 (en) 2009-12-03
US20090299632A1 (en) 2009-12-03
EP2138359A1 (en) 2009-12-30
US8249798B2 (en) 2012-08-21
EP2138358A1 (en) 2009-12-30
EP2138358B1 (en) 2012-03-14
ATE549214T1 (en) 2012-03-15
EP2138357A1 (en) 2009-12-30

Similar Documents

Publication Publication Date Title
US8249798B2 (en) Vehicle pre-impact sensing system having signal modulation
US8880296B2 (en) Techniques for improving safe operation of a vehicle
JP6488327B2 (en) Lighting system with driver assistance function
US7278505B2 (en) Control device for starting motion of mobile body
US7783403B2 (en) System and method for preventing vehicular accidents
US7872764B2 (en) Machine vision for predictive suspension
US7630806B2 (en) System and method for detecting and protecting pedestrians
US8164432B2 (en) Apparatus, method for detecting critical areas and pedestrian detection apparatus using the same
US8285447B2 (en) Look ahead vehicle suspension system
US7049945B2 (en) Vehicular blind spot identification and monitoring system
JP4914361B2 (en) Spatial area monitoring equipment
JP5204963B2 (en) Solid-state image sensor
WO2017110418A1 (en) Image acquisition device to be used by vehicle, control device, vehicle equipped with control device or image acquisition device to be used by vehicle, and image acquisition method to be used by vehicle
EP2999207A1 (en) Vehicle optical sensor system
KR20220056169A (en) Apparatus for controlling headlamp of vehicle
Broggi et al. Localization and analysis of critical areas in urban scenarios
US20080164983A1 (en) System for the Detection of Objects Located in an External Front-End Zone of a Vehicle, Which Is Suitable for Industrial Vehicles
US20090099736A1 (en) Vehicle pre-impact sensing system and method
JP2004125636A (en) On-vehicle laser radar device
CN110953550A (en) Illumination detection lamp group and vehicle
EP3428686B1 (en) A vision system and method for a vehicle
JP2022538462A (en) Optical measuring device for determining object information of an object in at least one monitored zone
CN211399635U (en) Illumination detection lamp group and vehicle

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION