WO2003001474A2 - Procede et appareil de transfert d'informations entre vehicules - Google Patents

Procede et appareil de transfert d'informations entre vehicules

Info

Publication number
WO2003001474A2
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
local
kinematic state
data
inter
Prior art date
Application number
PCT/US2002/020403
Other languages
English (en)
Other versions
WO2003001474A3 (fr)
Inventor
Robert Pierce Lutter
Dan Alan Preston
Original Assignee
Medius, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/892,293 (US20020140548A1)
Priority claimed from US09/892,333 (US6615137B2)
Application filed by Medius, Inc.
Priority to AU2002349794A1
Publication of WO2003001474A2
Publication of WO2003001474A3

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/03Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for safety, indicating or signalling devices
    • B60R16/0315Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for safety, indicating or signalling devices using multiplexing techniques
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0965Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • G08G1/162Decentralised systems, e.g. inter-vehicle communication event-triggered
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/164Centralised systems, e.g. external to vehicles

Definitions

  • Vehicle collisions are often caused when a driver cannot see or is unaware of an oncoming object.
  • a tree may obstruct a driver's view of oncoming traffic at an intersection. The driver has to enter the intersection with no knowledge of whether another vehicle may be entering the same intersection. After entering the intersection, it is often too late for the driver to avoid an oncoming car that has failed to properly yield.
  • Sensor data is generated for areas around a vehicle. Any objects detected in the sensor data are identified and a kinematic state for each object is determined. The kinematic states for the detected objects are compared with the kinematic state of the vehicle. If a collision between the detected objects and the local vehicle is likely, a warning is automatically generated to notify the vehicle operator of the impending collision. The sensor data and kinematic state of the vehicle can be transmitted to other vehicles so that the other vehicles are also notified of possible collision conditions. Assorted vehicle subsystems, sensors, communication devices, and other electronic devices are connected to a processing unit by a plurality of wireless links.
  • FIG. 1 is a diagram of an inter-vehicle communication system.
  • FIG. 2 is a block diagram showing how the inter-vehicle communication system of FIG. 1 operates.
  • FIG. 3 is a diagram showing how sensor data can be exchanged between different vehicles.
  • FIG. 4 is a diagram showing how Graphical User Interfaces (GUIs) are used for different vehicles that share sensor data.
  • FIG. 5 is a diagram showing how collision information can be exchanged between different vehicles.
  • FIGS. 6 and 7 are diagrams showing how kinematic state information for multiple vehicles can be used to identify road direction.
  • FIGS. 8 and 9 are diagrams showing how the inter-vehicle communication system is used to help avoid collisions.
  • FIG. 10 is a diagram showing how an emergency signal is broadcast to multiple vehicles from a police vehicle.
  • FIGS. 11 and 12 are diagrams showing how sensors are used to indicate proximity of a local vehicle to other objects.
  • FIGS. 13 and 14 show different sensor and communication envelopes that are used by the inter-vehicle communication system.
  • FIG. 15 is a block diagram showing the different data inputs and outputs that are coupled to an inter-vehicle communication processor.
  • FIG. 16 is a block diagram showing how the processor in FIG. 15 operates.
  • FIG. 17 is a block diagram illustrating a first embodiment of the present invention.
  • FIG. 18 is a block diagram illustrating a second embodiment of the present invention.
  • FIG. 19 is a block diagram of a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17.
  • FIG. 20 is block diagram of a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18.
  • FIG. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20.
  • FIG. 1 shows a multi-vehicle communication system 12 that allows different vehicles to exchange kinematic state data.
  • Each vehicle 14 may include one or more sensors 18 that gather sensor information around the associated vehicle 14.
  • a transmitter/receiver (transceiver) in the vehicle 14 transmits to other vehicles kinematic state data 19 for objects detected by the sensors 18 and kinematic state data 17 for the vehicle itself.
  • a Central Processing Unit (CPU) 20 in the vehicle 14 is coupled between the sensors 18 and transceivers 16.
  • the CPUs 20 display the sensor information acquired from the local sensors 18 in the same vehicle and also displays, if appropriate, the kinematic state data 17 and 19 received from the other vehicles 14.
  • the CPU 20 for one of the vehicles may identify an object 22 that is detected by the sensor 18A.
  • the CPU 20A identifies how far the object 22 is away from the vehicle 14A.
  • the CPU 20A may also generate a warning signal if the object 22 comes within a specific distance of the vehicle 14A.
  • the CPU 20A then transmits the kinematic state data for object 22 to the other vehicles 14B and 14C that are within some range of vehicle 14A.
  • the CPU 20B from vehicle 14B establishes communication with the transmitting vehicle 14A in box 24.
  • a navigation grid is established in box 26 that determines where the vehicle 14A is in relationship to vehicle 14B.
  • vehicle 14A sends its kinematic state data 17, such as location, speed, acceleration, and direction, to vehicle 14B (a sketch of such a broadcast message appears at the end of this list).
  • the vehicle 14B receives the kinematic state data for object 22 from vehicle 14A in box 28.
  • the CPU 20B determines the position of object 22 relative to vehicle 14B.
  • the CPU 20B displays the object on a digital map in vehicle 14B in box 32.
  • vehicle 14B receives the position of vehicle 14A and the information regarding object 22 through an intermediary vehicle 14C.
  • the transceiver 16A in vehicle 14A transmits the kinematic state of vehicle 14A and the information regarding object 22 to vehicle 14C.
  • the transceiver 16C in vehicle 14C then relays its own kinematic state data along with the kinematic state data of vehicle 14A and object 22 to vehicle 14B.
  • the CPU 20B determines, from the kinematic state of vehicle 14A and the kinematic state of object 22, the position of object 22 in relation to vehicle 14B (a sketch of this relative-position calculation appears at the end of this list). If the position of object 22 is within some range of vehicle 14B, the object 22 is displayed on a Graphical User Interface (GUI) inside of vehicle 14B (not shown).
  • FIG. 3 shows an example of how the inter-vehicle communication system 12 shown in FIG. 1 can be used to identify different objects that may not be detectable from a local vehicle. There are five vehicles shown in FIG. 3. Vehicle D is in an intersection 40.
  • a vehicle A is heading into the intersection 40 from the east and another vehicle B is heading into the intersection 40 coming from the west.
  • Vehicle E or vehicle F may not be able to see either vehicle A or vehicle B.
  • a building 44 obstructs easterly views by vehicles E and F, and a tree 46 obstructs westerly views by vehicles E and F.
  • Vehicle A or vehicle B may be entering the intersection 40 at a speed and distance that make a collision with vehicle E or vehicle F likely. Vehicle E or vehicle F could avoid the potential collision if notified in sufficient time. However, the tree 46 and building 44 prevent vehicles E and F from seeing either vehicle A or vehicle B until they have already entered the intersection 40.
  • Vehicle D includes multiple sensors 42 that sense objects in front, such as vehicle C, in the rear, such as vehicle E, or on the sides, such as vehicles A and B.
  • a processor in vehicle D (not shown) processes the sensor data and identifies the speed, direction and position of vehicles A and B.
  • a transceiver 48 in vehicle D transmits the data identifying vehicles A and B to vehicle E.
  • a transceiver 48 in vehicle E then relays the sensor data to vehicle F.
  • both vehicles E and F are notified about oncoming vehicles A and B even when vehicles A and B cannot be seen visually by the operators of vehicles E and F or detected electronically by sensors on vehicles E and F.
  • the sensing ranges for vehicles E and F are extended by receiving the sensing information from vehicle D.
  • FIG. 4 shows three different screens 50, 52, and 54 that are displayed by vehicles D, E, and F, respectively.
  • Each of screens 50, 52, and 54 are Graphical User Interfaces or other display systems that display sensor data and vehicle information from one or more different vehicles.
  • the screen 50 in vehicle D shows different motion vectors that represent objects detected by sensors 42 (FIG. 3).
  • a motion vector 56 shows vehicle B approaching from the west
  • a motion vector 58 shows vehicle C moving in front of vehicle D in a northern direction
  • a motion vector 60 shows vehicle A approaching from the east
  • a motion vector 62 shows vehicle E approaching the back of vehicle D from a southern direction.
  • Screen 52 shows objects displayed by the GUI in vehicle E.
  • Motion vector 64 shows vehicle D moving in front of vehicle E and motion vectors 60 and 56 show vehicles A and B coming toward vehicle D from the east and the west, respectively. Even if vehicles A and B cannot be detected by sensors in vehicle E, they are detected by sensors in vehicle D and their kinematic state data is then transmitted to vehicle E.
  • Screen 54 shows the motion vectors displayed to an operator of vehicle F.
  • the motion vectors 64 and 66 show vehicles D and E traveling north in front of vehicle F.
  • the vehicles A and B are shown approaching vehicle D from the east and west, respectively.
  • the inter-vehicle communication system allows vehicles to effectively see around corners and other obstructions by sharing sensor information between different vehicles. This allows any of the vehicles to anticipate and avoid potential accidents.
  • the operator of vehicle E can see by the displayed motion vector 60 that vehicle A is traveling at 40 MPH. This provides the operator of vehicle E a warning that vehicle A may not be stopping at intersection 40 (FIG. 3). Even if vehicle E has the right of way, vehicle E can avoid a collision by slowing down or stopping while vehicle A passes through intersection 40.
  • the motion vector 56 for vehicle B indicates deceleration and a current velocity of only 5 MPH. Deceleration may be indicated by a shorter motion vector 56 or by an alphanumeric display around the motion vector 56.
  • the motion vector 56 indicates that vehicle B is slowing down or stopping at intersection 40. Thus, if vehicle B were the only other vehicle entering intersection 40, the operator of vehicle E is more confident about entering intersection 40 without colliding into another vehicle.
  • vehicle F may not be close enough to intersection 40 to worry about colliding with vehicle A.
  • screen 54 shows that vehicle E may be on a collision track with vehicle A. If vehicle E were following too close to vehicle D, then vehicle E could possibly run into the pileup that may occur between vehicle D and vehicle A.
  • the operator of vehicle F seeing the possible collision between vehicles D and A in screen 54 can anticipate and avoid the accident by slowing down or stopping before entering the intersection 40.
  • the operator of vehicle F may also try to prevent the collision by honking a horn.
  • FIG. 5 shows another example of how sensor data and other vehicle kinematic state data can be transmitted between different vehicles.
  • Vehicles 70, 72, and 74 are all involved in an accident. At least one of the vehicles, in this case vehicle 70, broadcasts a collision indication message 76.
  • the accident indication message 76 can be triggered by any one of multiple detected events. For example, the collision indication message 76 may be generated whenever an airbag is deployed in vehicle 70. Alternatively, sensors 78 in the vehicle 70 detect the collision. The detected collision causes a processor in vehicle 70 to broadcast the collision indication message 76.
  • the collision indication message 76 is received by a vehicle 80 that is traveling in the opposite traffic lane.
  • the vehicle 80 includes a transceiver 81 that in this example relays the collision indication message 76 to another vehicle 84 that is traveling in the same direction.
  • Vehicle 84 relays the message to other vehicles 82 and 86 that are traveling in the direction of the oncoming collision (a sketch of this message relaying appears at the end of this list).
  • Processors 83 and 87 in the vehicles 82 and 86 receive the collision indication message 76 and generate a warning message that may either be annunciated or displayed to drivers of vehicles 82 and 86.
  • the collision indication message 76 is received by vehicle 82 directly from vehicle 70.
  • the processor 83 in vehicle 82 generates a warning indication and also relays the collision indication message 76 to vehicle 86.
  • the collision indication message 76 and other sensor data and messages can be relayed by any vehicle traveling in any direction.
  • FIGS. 6 and 7 show an example of how the inter-vehicle communication system can be utilized to identify road direction.
  • FIG. 6 shows three vehicles A, B, and C traveling along the same stretch of highway 88.
  • Each vehicle includes a Global Positioning System (GPS) that periodically identifies a current longitude and latitude.
  • GPS Global Positioning System
  • Each vehicle A, B, and C generates kinematic state data 92 that includes position, velocity, acceleration or deceleration, and/or direction.
  • the kinematic state data 92 for each vehicle A, B, and C is broadcast to the other vehicles in the same vicinity.
  • the vehicles A, B, and C receive the kinematic state data from the other vehicles and display the information to the vehicle driver.
  • FIG. 7 shows a GUI 94 in vehicle A (FIG. 6).
  • the GUI 94 shows any combination of the position, driving direction, speed, distance, and acceleration for the other vehicles B and C.
  • Vectors 96 and 98 can visually represent this kinematic state data.
  • the position of vector 98 represents the longitude and latitude of vehicle B and the direction of vector 98 represents the direction that vehicle B is traveling.
  • the length of vector 98 represents the current speed and acceleration of vehicle B. Displaying the kinematic state of other vehicles B and C allows the driver of vehicle A to anticipate curves and other turns in highway 88 (FIG. 6) regardless of the weather conditions.
  • the kinematic state data 92 for the vehicles A, B and C does not have to always be relayed by other vehicles.
  • the kinematic state data 92 can be relayed by a repeater located on a stationary tower 90. This may be desirable for roads with little traffic where there are generally long distances between vehicles on the same highway 88.
  • the transmitters 91 may also send along with the location data 93 some indication that the data is being transmitted from a stationary reference post.
  • the transmitters 91 can also include temperature sensors that detect different road conditions, such as ice. An ice warning is then generated along with the location data.
  • the processors in the vehicles A, B and C then display the transmitters 91 as nonmoving objects 100 along with any road condition information in the GUI 94.
  • FIGS. 8 and 9 show in more detail how collision information is exchanged and used by different vehicles.
  • vehicle A has collided with a tree 102. Upon impact with tree 102, the vehicle A deploys one or more airbags.
  • a processor 104 in vehicle A detects the airbag deployment and automatically sends out an air bag deployment message 106 over a cellular telephone network to an emergency vehicle service such as AAA.
  • the processor 104 broadcasts the kinematic state data 108 of vehicle A.
  • the kinematic state data 108 indicates a rapid deceleration of vehicle A.
  • the processor 104 may send a warning indication.
  • Another vehicle B receives GPS location data 112 from one or more GPS satellites 110.
  • Onboard sensor data 114 is also monitored by processor 116 to determine the speed, direction, etc. of vehicle B.
  • the onboard sensor data 114 may also include data from one or more sensors that are detecting objects within the vicinity of vehicle B.
  • the processor 116 in vehicle B determines a current location of vehicle B based on the GPS data 112 and the onboard sensor data 114. The processor 116 then determines if a danger condition exists by comparing the kinematic state of vehicle A with the kinematic state of vehicle B. For example, if vehicle A is within 50 feet of vehicle B, and vehicle B is traveling at 60 MPH, then processor 116 may determine that vehicle B is in danger of colliding with vehicle A. In this situation, a warning signal may be generated by processor 116. Alternatively, if vehicle A is 100 feet in front of vehicle B, and vehicle B is only traveling at 5 MPH, processor 116 may determine that no danger condition currently exists for vehicle B and no warning signal is generated.
  • FIG. 9 shows one example of how a GUI 105 in vehicle B displays information received from vehicle A and from local sensors.
  • the processor 116 displays vehicle A directly in front of vehicle B. Either from sensor data transmitted from vehicle A or from local sensors, the processor 116 generates a motion vector 113 that identifies another vehicle C approaching from the left.
  • the local sensors in vehicle B also detect another object 107 off to the left of vehicle B.
  • the processor 116 receives all of this sensor data information and generates a steering cue 109 that indicates the best path for avoiding vehicle A, vehicle C, and object 107. In this example, it is determined that vehicle B should move in a northeasterly direction to avoid colliding with all of the detected objects.
  • the processor 116 can also calculate a time to impact 111 with the closest detected object by comparing the kinematic state of the vehicle B with the kinematic states of the detected objects.
  • FIG. 10 shows another example of how vehicle information may be exchanged between different vehicles.
  • a police vehicle 120 is in pursuit of a chase vehicle 126.
  • police vehicle 120 may be entering an intersection 128.
  • the police vehicle 120 broadcasts an emergency warning signal 124.
  • the emergency warning signal 124 notifies all of the vehicles 122 that an emergency vehicle 120 is nearby and that the vehicles 122 should slow down or stop.
  • Processors 130 in the vehicles 122 can generate an audible signal to the vehicle operator, display a warning icon on a GUI, and/or show the location of police vehicle 120 on the GUI.
  • the processor 130 in each vehicle 122 receives the kinematic state of police vehicle 120 and determines a relative position of the local vehicle 122 in relation to the police vehicle 120. If the police vehicle 120 is within a particular range, the processor 130 generates a warning signal and may also automatically slow or stop the vehicle 122.
  • the police vehicle 120 sends a disable signal 132 to a processor (not shown) in the chase vehicle 126.
  • the disable signal 132 causes the processor in chase vehicle 126 to automatically slow down the chase vehicle 126 and then eventually stop the chase vehicle 126.
  • FIGS. 11 and 12 show another application for the sensors 136 that are located around vehicle A. Vehicles A and B are parked in parking slots 138 and 140, respectively. Vehicle A has pulled out of parking slot 138 and is attempting to negotiate around vehicle B. The operator of vehicle A cannot see how far vehicle A is from vehicle B.
  • the sensors 136 detect objects that come within a certain distance of vehicle A.
  • sensors 136 may be activated only when the vehicle A is traveling below a certain speed, or may be activated at any speed, or may be manually activated by the vehicle operator. In any case, the sensors 136 detect vehicle B and display vehicle B on a GUI 144 shown in FIG. 12. The processor in vehicle A may also determine the closest distance between vehicle A and vehicle B and also identify the distance to impact and the particular area of impact 145 on vehicle A.
  • FIG. 13 shows an example of sensor and communication envelopes that are generated by sensors and transceivers in vehicle A.
  • a first local sensor envelope 150 is created around the vehicle A by multiple local sensors 158.
  • the sensor data from the local sensor envelope 150 is used by a processor to detect objects located anywhere around vehicle A.
  • Transceivers 156 are used to generate communication envelopes 152.
  • the transceivers 156 allow communications between vehicles that are located generally in front of and behind vehicle A. However, it should be understood that any variety of communication and sensor envelopes can be generated by transceivers and sensors in vehicle A.
  • FIG. 14 shows another example of different sensor envelopes that can be generated around vehicle A.
  • a first type of sensor, such as an infrared sensor, may be located around vehicle A to generate close proximity sensor envelopes 160 and 162.
  • a second type of sensor and antenna configuration, such as radar antennas, may be used to generate larger sensor envelopes 164, 166, and 168.
  • the local sensor envelopes 160 and 162 may be used to detect objects in close proximity to vehicle A, such as parked cars, pedestrians, etc.
  • the larger radar envelopes 164, 166 and 168 may be used for detecting objects that are further away from vehicle A.
  • envelopes 164, 166, and 168 may be used for detecting other vehicles that are longer distances from vehicle A.
  • the different sensor envelopes may dynamically change according to how fast vehicle A is moving. For example, envelope 164 may be used when vehicle A is moving at a relatively low speed. When vehicle A accelerates to a higher speed, object detection will be needed for longer distances. Thus, the sensors may dynamically change to larger sensor envelopes 166 and 168 when vehicle A is moving at higher speeds. Any combination of local sensor envelopes 160 and 162 and larger envelopes 164, 166, and 168 may be used (a sketch of this envelope selection appears at the end of this list).
  • FIG. 15 is a detailed diagram of the components in one of the vehicles used for gathering local sensor data and receiving external sensor data from other vehicles.
  • a processor 170 receives sensor data from one or more local object detection sensors 172.
  • the sensors may be infrared sensors, radar sensors, or any other type of sensing device that can detect objects.
  • Communication transceivers 174 exchange sensor data, kinematic state data, and other notification messages with other vehicles. Any wireless communication device can be used for communicating information between the different vehicles including microwave, cellular, Citizen Band, two-way radio, etc.
  • a GPS receiver 176 periodically reads location data from GPS satellites.
  • Vehicle sensors 178 include any of the sensors or monitoring devices in the vehicle that detect vehicle direction, speed, temperature, collision conditions, braking state, airbag deployment, etc.
  • Operator inputs 180 include any monitoring or selection parameter that may be input by the vehicle operator. For example, the operator may wish to view all objects within a 100 foot radius. In another situation, the operator may wish to view all objects within a one mile radius.
  • the processor displays the objects within the range selected by the operator on GUI 182.
  • the speed of the vehicle identified by vehicle sensors 178 may determine what data from sensors 172 or from transceivers 174 is displayed on the GUI 182. For example, at higher speeds, the processor may want to display objects that are further distances from the local vehicle.
  • the processor receives sensor data from sensors on the local vehicle.
  • the processor performs image recognition algorithms on the sensor data in block 192. If an object is detected in block 194, kinematic state data for the object is determined in block 200. If the detected object is within a specified range in block 196, then the object is displayed on the GUI in block 198. For example, the current display range for the vehicle may only be for objects detected within 200 feet. If the detected object is outside of 200 feet, it will not be displayed on the GUI.
  • the processor receives kinematic state data for other vehicles and object detection data from the other vehicles in block 202. Voice data from the other vehicles can also be transmitted along with the kinematic state data. In a similar manner as blocks 196 and 198, if any object detected by another vehicle is within a current display range in block 206, then the other object is displayed on the GUI in block 208. At the same time, the processor determines the current kinematic state of its own local vehicle in block 205.
  • the processor in block 210 compares the kinematic state information of the local vehicle with all of the other objects and vehicles that are detected. If a collision condition is imminent based on the comparison, then the processor generates a collision warning in block 212.
  • a collision condition is determined in one example by comparing the current kinematic state of the local vehicle with the kinematic state of the detected objects. If the velocity vector (current speed and direction) of the local vehicle is about to intersect with the velocity vector for another detected object, then a collision condition is indicated and a warning signal generated. Collision conditions are determined by analyzing the bearing rate of change of the detected object with respect to the local vehicle (a sketch of this check appears at the end of this list).
  • the processor identifies a possible collision condition.
  • a first warning signal is generated.
  • a second collision signal is generated.
  • Fig. 17 depicts a plurality of sensor devices 1011 located aboard an automobile. Though each of the sensors 1011 is annotated with the same number, this is merely to indicate that the sensors perform similar functions, and not to suggest that each of the sensor devices 1011 is exactly the same. Rather, each of the devices 1011 could be an IR sensor, a radar sensor, or another variety of sensor placed to monitor any condition within the automobile or exterior to the automobile that may be of use when implementing collision avoidance, situational awareness, navigation, or system diagnostic functions.
  • Each of the sensor devices 1011 is linked to a processing unit 1041 located within the automobile by a plurality of wireless links 1051.
  • the wireless links 1051 are uni-directional in nature, because the sensor devices 1011 typically transmit only raw data to processing unit 1041.
  • the similar numbering of the devices 1021 merely indicates that they are a class of devices that both transmit data to and receive data from the processing unit, and does not imply that each of the devices 1021 is exactly the same.
  • the class of devices 1021 might include a security system, an environmental control system, a number of audio and video entertainment devices, a cellular phone, a GPS receiver and antenna, or personal digital assistants (PDA).
  • the devices 1021 will be located within the automobile. However, some of the devices may be located outside the automobile, as in the case of a cellular phone or PDA.
  • GUI 1031 located in the automobile and linked to the processing unit 1041 by a bi-directional wireless or hardwired link 1061.
  • the GUI is the means by which the driver of the automobile can input commands to control a variety of the devices 1021.
  • the driver also receives system status data at GUI 1031 from the processing unit 1041.
  • there are several forms that GUI 1031 may take, including a touch-screen display or a heads-up display similar to those typically found in military aircraft.
  • Processing unit 1041 may transmit data directly to GUI 1031 from sensor devices 1011 or may first perform a sensor fusion operation when multiple sensors are monitoring the same condition.
  • Processing unit 1041 may also transmit data received from one or more of the devices 1021 to GUI 1031.
  • the uni-directional wireless links 1051 and the bi-directional wireless links 1061 may be one of several types, depending upon the specific sensor or system that is wirelessly linked to the processing unit.
  • one of the sensor devices 1011 might require an IEEE 802.11 protocol, while one of the devices 1021 utilizes a Motorola Bluetooth link.
  • the processing unit 1041 has the capability of interfacing with sensor devices 1011 or devices 1021 using an analog cellular link, a Cellular Digital Packet Data (CDPD) link, a Satcom link, or a hardwired link.
  • the number of sensor devices 1011 and 1021 or the pattern in which they are depicted in Fig. 17 should not be considered a limitation.
  • the number of devices 1021 and 1031 and the physical location of 1011, 1021, 1031, and 1041 within the automobile will vary depending on the specific design.
  • Fig. 18 is an illustration of a second embodiment of the present invention. Like the first embodiment depicted in Fig. 17, there are a plurality of sensor devices 1011, a plurality of devices 1021, a GUI 1031, and a plurality of uni-directional wireless links 1051 and bi-directional links 1061.
  • the dashed lines divide the interior of an automobile into separate zones, with the engine, passenger, and trunk compartments represented by zones 2021, 2041, and 2061, respectively.
  • Zone 2081 represents the area outside the automobile.
  • the number of devices and wireless links located in each zone is arbitrary; there may be more or fewer depending on the specific design.
  • Each of the devices 1011, 1021, and 1031 is wirelessly linked with a signal interface unit 2031 that is located in the same zone.
  • the signal interface units 2031 are coupled to a bus 2051 that is installed to run throughout all zones of the automobile.
  • the processing unit 1041 is also coupled to the bus 2051. Once signals are received by the signal interface units 2031, they may be placed on bus 2051 and transmitted to the processing unit 1041. Similarly, signals are transmitted from processing unit 1041 to devices 1021 and GUI 1031 via the bus/signal interface route.
  • Device 1021 is located outside of the automobile in zone 2081 to indicate that there may be devices such as PDAs or cellular phones that receive data from or transmit data to the processing unit 1041 via a bi-directional wireless link.
  • This zone bus structure takes advantage of the natural shielding offered by the different structural compartments of an automobile.
  • Each zone contains a single signal interface unit that serves as the point where wireless signals are received and transmitted in each zone.
  • the number of zones may vary depending on the type of automobile that the invention is installed in. For example, a sport-utility vehicle would require only two signal interface units 2031 because it effectively has only two zones, the engine compartment and the passenger/cargo compartment.
  • the processing unit 1041 is shown located in zone 2041, but it might be moved to any zone depending on the space requirements of specific designs.
  • Both embodiments of the present invention described above will facilitate detection of people within the automobile, and based upon that detection various functions may be implemented by processing unit 1041. For example, if a subset of the sensor devices 1011 happened to be IR sensors installed in the passenger compartment of an automobile, the sensors can indicate when a person is within the vehicle. Based upon this occupancy data, the processing unit could operate the lighting system more efficiently by turning off the dome light when the vehicle is parked and the last occupant leaves the vehicle, rather than the usual automatic shut off. As another example, typically keys must be in the ignition to operate the car radio and environmental controls. These systems could be enabled merely by a person's presence in the vehicle. The invention could also prevent airbags from being deployed in an accident for passenger seats where no passenger is sitting. An alarm system could be configured to disable the ignition when an unauthorized occupant is detected or to call 911 with the current location of the vehicle taken from the GPS system (a sketch of such occupancy-based policies appears at the end of this list).
  • Fig. 19 is a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17.
  • the dashed lines in Fig. 19 indicate an engine compartment region 300, a passenger compartment region 310, a trunk compartment region 320, and a region 330 that represents the area external to the automobile.
  • Engine compartment 300 contains two IR sensors 302 that face forward to pick up heat signatures emanating from other automobiles.
  • Sensor 304 is an RF transmitter, receiver, and antenna that detects other automobiles.
  • Sensor 306 is a thermal sensor to monitor engine temperature.
  • Each of the sensors 302, 304, and 306 wirelessly transmits data to the processing unit 318 located in the passenger compartment 310 of the automobile with an IEEE 802.11 wireless link 340.
  • Passenger compartment 310 contains a touch screen display 312 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands.
  • Car audio components 314 are also located within the passenger compartment. Touch-screen display 312 and car audio components 314 are linked to the processing unit 318 by bi-directional wireless Bluetooth links 350.
  • two IR sensors 316 are installed to monitor the occupancy state of the automobile. The two sensors 316 are linked to processing unit 318 by wireless IEEE 802.11 links 340.
  • Trunk compartment 320 contains GPS receiver and antenna 322 and multiband cellular receiver/transmitter/antenna 324.
  • the GPS subsystem 322 and cellular subsystem 324 are linked to processing unit 318 in the passenger compartment via bidirectional wireless Bluetooth links 350.
  • a mobile PDA unit 332 is located outside of the automobile in region 330, transmitting data to and receiving data from processing unit 318 via bi-directional Bluetooth link 350.
  • Fig. 20 is a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18. The dashed lines in Fig. 20 indicate an engine compartment region 400, a passenger compartment region 410, a trunk compartment region 420, and a region 430 that represents the area external to the automobile.
  • Engine compartment 400 contains two IR sensors 402 that face forward to pick up heat signatures emanating from other automobiles.
  • Sensor 404 is an RF transmitter, receiver, and antenna that detects other automobiles.
  • Sensor 406 is a thermal sensor to monitor engine temperature.
  • Each of the sensors 402, 404, and 406 wirelessly transmits data to the signal interface unit 440 located in the engine compartment 400 with IEEE 802.11 wireless links 460.
  • Passenger compartment 410 contains a touch screen display 412 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands.
  • Car audio components 414 are also located within the passenger compartment.
  • Touch-screen display 412 and car audio components 414 are linked to a second signal interface unit 440 by bi-directional wireless Bluetooth links 470.
  • two IR sensors 416 are installed to monitor the occupancy state of the automobile. The two sensors 416 are linked to the second signal interface unit 440 by wireless IEEE 802.11 links 460.
  • Trunk compartment 420 contains GPS receiver and antenna 424 and multiband cellular receiver/transmitter/antenna 426.
  • the GPS subsystem 424 and cellular subsystem 426 are linked to a third signal interface unit 440 located in the trunk compartment via bi-directional wireless Bluetooth links 470.
  • a mobile PDA unit 432 is located outside of the automobile in region 430, transmitting data to and receiving data from the second signal interface unit 440 via bi-directional Bluetooth link 470.
  • the mobile PDA unit 432 can link to any of the signal interface units 440 within the automobile; it is merely shown connected to the second unit in the passenger compartment by way of example.
  • Each of the signal interface units 440 is coupled to a fiber-optic bus 450 installed to extend into all zones 400, 410, and 420 of the automobile.
  • the processing unit 422 is also located in the trunk compartment 420 and is coupled to fiber-optic bus 450. However, processing unit 422 could be coupled to the fiber-optic bus at any location in any region 400, 410, or 420 depending on space requirements.
  • Fig. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20.
  • three zones 500, 510, and 520 represent the engine compartment, passenger compartment, and trunk compartment, respectively, of the automobile.
  • the signal interface units 540 are installed underneath the hood in the engine compartment 500, underneath the dome in the passenger compartment 510, and underneath the trunk lid in the trunk compartment 520.
  • the signal interface unit 540 in the passenger compartment 510 may even share a physical location with the dome light of the automobile.
  • the fiberoptic bus 550 runs from the engine compartment 500 to the trunk compartment 520 and the signal interface units 540 are coupled to it.
  • the processing unit 522 is installed on the floor of the trunk section 520 and is also coupled to the fiber-optic bus 550.
  • the sensor devices and other automobile system devices that are linked to the signal interfaces by wireless connections are not shown, but their physical locations would be optimized in the various zones of the automobile depending upon their functionality and purpose.
  • the system described above can use dedicated processor systems, microcontrollers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software and other operations may be implemented in hardware.
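
The sketches below are illustrative only; they are not taken from the patent, and all field names, encodings, thresholds, and helper functions are assumptions. This first sketch shows one possible shape for the kinematic state data 17/92 that a vehicle broadcasts and relays on behalf of other vehicles (FIGS. 1-3 and 6), assuming a JSON encoding that the description does not specify.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class KinematicState:
    vehicle_id: str
    latitude: float       # degrees, from the GPS receiver 176
    longitude: float      # degrees
    speed_mps: float      # speed over ground, metres per second
    heading_deg: float    # course over ground, 0 = north (assumed convention)
    accel_mps2: float     # positive = accelerating, negative = decelerating
    timestamp: float      # seconds since epoch

def build_broadcast(own_state: KinematicState, relayed_states=()) -> str:
    """Package the local vehicle's kinematic state together with any states it
    is relaying for other vehicles (e.g. vehicle 14C forwarding the state of
    vehicle 14A to vehicle 14B)."""
    return json.dumps({
        "origin": asdict(own_state),
        "relayed": [asdict(s) for s in relayed_states],
    })

if __name__ == "__main__":
    a = KinematicState("14A", 47.6101, -122.2015, 17.9, 90.0, 0.0, time.time())
    c = KinematicState("14C", 47.6098, -122.2031, 15.6, 88.0, -0.4, time.time())
    print(build_broadcast(c, [a]))   # vehicle 14C relays 14A's state with its own
```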
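A sketch of the "navigation grid" step (box 26) and the relative-position determination used before display (boxes 28-32, blocks 196 and 206): another vehicle's reported GPS position is converted into an east/north offset from the local vehicle. The flat-earth approximation and the Earth-radius constant are assumptions that are only reasonable over the short ranges involved here.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def relative_position_m(own_lat, own_lon, other_lat, other_lon):
    """Return the (east, north) offset in metres of a reported object or
    vehicle relative to the local vehicle."""
    lat0 = math.radians(own_lat)
    d_north = math.radians(other_lat - own_lat) * EARTH_RADIUS_M
    d_east = math.radians(other_lon - own_lon) * EARTH_RADIUS_M * math.cos(lat0)
    return d_east, d_north

def within_display_range(own_lat, own_lon, other_lat, other_lon, range_m):
    """Decide whether a reported object should be drawn on the local GUI,
    mirroring the range checks in blocks 196 and 206."""
    east, north = relative_position_m(own_lat, own_lon, other_lat, other_lon)
    return math.hypot(east, north) <= range_m
```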
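A sketch of the collision check in blocks 210-212, combining the two ideas in the description: a closing relative velocity (intersecting velocity vectors) and a nearly constant bearing with decreasing range. Positions are (east, north) offsets in metres from the local vehicle, velocities are (east, north) in m/s, and the warning thresholds are illustrative assumptions rather than values from the patent.

```python
import math

def collision_warning(rel_east, rel_north, own_vel, other_vel,
                      warn_time_s=5.0, bearing_rate_thresh=0.02):
    """Return True when the detected object and the local vehicle appear to be
    on a collision course."""
    rvx = other_vel[0] - own_vel[0]          # relative velocity of the object
    rvy = other_vel[1] - own_vel[1]
    range_m = math.hypot(rel_east, rel_north)
    closing_speed = -(rel_east * rvx + rel_north * rvy) / max(range_m, 1e-6)
    if closing_speed <= 0.0:
        return False                         # range is opening: no collision condition
    time_to_impact = range_m / closing_speed # compare with the time to impact 111
    # bearing rate of change: a near-zero bearing rate combined with a
    # decreasing range indicates an intercept course
    bearing_rate = abs(rel_east * rvy - rel_north * rvx) / max(range_m ** 2, 1e-6)
    return time_to_impact < warn_time_s and bearing_rate < bearing_rate_thresh

if __name__ == "__main__":
    # object 60 m north and 30 m east, on a constant-bearing intercept course
    print(collision_warning(30.0, 60.0, own_vel=(0.0, 15.0), other_vel=(-10.0, -5.0)))
```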
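A sketch of the speed-dependent behaviour described for FIG. 14 and for the GUI display range (operator inputs 180, blocks 196 and 206). The envelope names, speed breakpoints, and scaling rule are all illustrative assumptions.

```python
def select_sensor_envelopes(speed_mps):
    """Choose which sensor envelopes to use at the current speed, always keeping
    the close-proximity envelopes 160 and 162 and widening the radar envelope
    (164 -> 166 -> 168) as speed increases."""
    envelopes = ["close_proximity_160", "close_proximity_162"]
    if speed_mps < 10.0:
        envelopes.append("radar_164")
    elif speed_mps < 25.0:
        envelopes.append("radar_166")
    else:
        envelopes.append("radar_168")
    return envelopes

def display_range_m(speed_mps, operator_range_m=None):
    """Use the operator-selected radius when one is given; otherwise grow the
    display radius with speed so faster travel shows more distant objects."""
    if operator_range_m is not None:
        return operator_range_m
    return max(60.0, speed_mps * 8.0)   # illustrative scaling only

if __name__ == "__main__":
    print(select_sensor_envelopes(30.0), display_range_m(30.0))
```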
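A sketch of receiving and relaying a collision indication message 76 or emergency warning signal 124 (FIGS. 5 and 10). The message fields (id, position, hops), the duplicate-suppression set, the warning radius, and the hop limit are assumptions added only to make the relay behaviour concrete.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def flat_earth_distance_m(pos_a, pos_b):
    """Approximate ground distance between two (latitude, longitude) pairs."""
    lat0 = math.radians(pos_a[0])
    d_north = math.radians(pos_b[0] - pos_a[0]) * EARTH_RADIUS_M
    d_east = math.radians(pos_b[1] - pos_a[1]) * EARTH_RADIUS_M * math.cos(lat0)
    return math.hypot(d_east, d_north)

def handle_collision_message(msg, seen_ids, own_position, warn, relay,
                             warn_radius_m=500.0, max_hops=3):
    """Warn the local driver when the reported collision is nearby, then forward
    the message so vehicles further back (e.g. vehicles 82 and 86) also hear it."""
    if msg["id"] in seen_ids:
        return                                   # already handled: avoid re-broadcast loops
    seen_ids.add(msg["id"])
    if flat_earth_distance_m(own_position, msg["position"]) <= warn_radius_m:
        warn(msg)                                # annunciate or display (processors 83, 87)
    if msg["hops"] < max_hops:
        relay({**msg, "hops": msg["hops"] + 1})  # pass it along to other vehicles

if __name__ == "__main__":
    msg = {"id": "collision-76", "position": (47.6105, -122.2001), "hops": 0}
    handle_collision_message(msg, set(), (47.6120, -122.2005),
                             warn=lambda m: print("WARN", m["id"]),
                             relay=lambda m: print("RELAY", m))
```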
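A sketch of the occupancy-driven behaviour enabled by the IR sensors 316/416 in the passenger compartment. The policy names and the per-seat granularity are assumptions; the description only gives the dome-light, convenience-system, airbag, and alarm examples.

```python
def apply_occupancy_policies(seat_occupied, vehicle_parked, authorized_occupant):
    """Derive simple subsystem settings from per-seat IR occupancy data."""
    anyone_present = any(seat_occupied.values())
    return {
        # convenience systems follow presence rather than the ignition key
        "radio_enabled": anyone_present,
        "climate_enabled": anyone_present,
        # dome light stays on only while a parked vehicle is still occupied
        "dome_light": vehicle_parked and anyone_present,
        # suppress airbags for seats with no occupant
        "airbag_enabled": {seat: occ for seat, occ in seat_occupied.items()},
        # alarm: disable the ignition when an unauthorized occupant is detected
        "ignition_disabled": anyone_present and not authorized_occupant,
    }

if __name__ == "__main__":
    print(apply_occupancy_policies({"driver": True, "front_passenger": False},
                                   vehicle_parked=False, authorized_occupant=True))
```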

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to the invention, sensor data is generated for areas around a vehicle. Any objects detected in the sensor data are identified, and a kinematic state for each detected object is determined. The kinematic state of the detected objects is compared with the kinematic state of the vehicle. If a collision between the detected objects and the local vehicle is likely, a warning is automatically generated to notify the vehicle operator of the impending collision. The sensor data and the kinematic state of the vehicle can be transmitted to other vehicles so that those vehicles are also notified of possible collision conditions. Assorted vehicle subsystems, sensors, communication devices, and other electronic devices are connected to a processing unit by means of a plurality of wireless links.
PCT/US2002/020403 2001-06-26 2002-06-26 Procede et appareil de transfert d'informations entre vehicules WO2003001474A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002349794A AU2002349794A1 (en) 2001-06-26 2002-06-26 Method and apparatus for detecting possible collisions and transferring information between vehicles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/892,293 2001-06-26
US09/892,293 US20020140548A1 (en) 2001-03-30 2001-06-26 Method and apparatus for a vehicular wireless network
US09/892,333 2001-06-26
US09/892,333 US6615137B2 (en) 2001-06-26 2001-06-26 Method and apparatus for transferring information between vehicles

Publications (2)

Publication Number Publication Date
WO2003001474A2 true WO2003001474A2 (fr) 2003-01-03
WO2003001474A3 WO2003001474A3 (fr) 2008-01-03

Family

ID=27129011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/020403 WO2003001474A2 (fr) 2001-06-26 2002-06-26 Procede et appareil de transfert d'informations entre vehicules

Country Status (2)

Country Link
AU (1) AU2002349794A1 (fr)
WO (1) WO2003001474A2 (fr)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1441321A2 (fr) * 2003-01-21 2004-07-28 Robert Bosch Gmbh Procédé de transmission d'information entre les stations mobiles
DE10326648A1 (de) * 2003-06-11 2005-01-13 Daimlerchrysler Ag Kooperatives Radarsystem für Fahrzeuge
FR2896594A1 (fr) * 2006-01-24 2007-07-27 Renault Sas Procede de perception, par un vehicule, de son environnement
WO2008022817A1 (fr) * 2006-08-21 2008-02-28 Continental Automotive Gmbh Système d'assistance au conducteur pour évaluer et prévoir localement et temporellement la dynamique de conduite d'un véhicule
WO2008061890A1 (fr) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Procédé de communication hertzienne entre des véhicules
WO2008084280A1 (fr) * 2007-01-08 2008-07-17 Sony Ericsson Mobile Communications Ab Système et procédé de radiodiffusion interactive
WO2008110926A2 (fr) 2007-03-12 2008-09-18 Toyota Jidosha Kabushiki Kaisha Système de détection d'état de route
WO2009121738A2 (fr) * 2008-04-03 2009-10-08 Siemens Aktiengesellschaft Procédé et dispositif de détection d'un risque de collision sur des unités mobiles à l'intérieur d'un espace industriel
DE102008041749A1 (de) 2008-09-01 2010-03-04 Robert Bosch Gmbh Verfahren zum Bereitstellen einer Kommunikation zwischen Fahrzeugen
WO2011090417A1 (fr) * 2010-01-19 2011-07-28 Volvo Technology Corporation Dispositif d'avertissement d'angle mort et système d'avertissement d'angle mort
WO2011130861A1 (fr) * 2010-04-19 2011-10-27 Safemine Ag Système et procédé d'alerte de proximité d'objet
US8056857B2 (en) 2006-02-15 2011-11-15 Be Aerospace, Inc. Aircraft seat with upright seat back position indicator
WO2011161176A1 (fr) * 2010-06-23 2011-12-29 Continental Teves Ag & Co. Ohg Procédé et système d'identification d'objet accélérée et/ou d'identification d'attribut d'objet accélérée et utilisation du procédé
US8595037B1 (en) 2012-05-08 2013-11-26 Elwha Llc Systems and methods for insurance based on monitored characteristics of an autonomous drive mode selection system
WO2014011545A1 (fr) * 2012-07-09 2014-01-16 Elwha Llc Systèmes et procédés pour détection de collision coopérative
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
CN104137164A (zh) * 2012-02-25 2014-11-05 奥迪股份公司 用于在车对车通信时识别车辆的方法
US8886394B2 (en) 2009-12-17 2014-11-11 Bae Systems Plc Producing data describing states of a plurality of targets
EP2846172A1 (fr) * 2013-09-09 2015-03-11 Nxp B.V. Système et procédé d'avertissement
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same
US9000903B2 (en) 2012-07-09 2015-04-07 Elwha Llc Systems and methods for vehicle monitoring
CN104637344A (zh) * 2013-11-11 2015-05-20 纬创资通股份有限公司 车辆预警系统及车辆预警方法
ITVR20130267A1 (it) * 2013-12-03 2015-06-04 Emanuele Donatelli Sistema per la prevenzione del traffico e il controllo degli incidenti
US9165469B2 (en) 2012-07-09 2015-10-20 Elwha Llc Systems and methods for coordinating sensor operation for collision detection
CN105210129A (zh) * 2013-04-19 2015-12-30 大陆-特韦斯贸易合伙股份公司及两合公司 用于避免跟行车辆撞上紧前方车辆的方法和系统及该系统的应用
US9230442B2 (en) 2013-07-31 2016-01-05 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9269268B2 (en) 2013-07-31 2016-02-23 Elwha Llc Systems and methods for adaptive vehicle sensing systems
WO2016142603A1 (fr) * 2015-03-09 2016-09-15 Peugeot Citroen Automobiles Sa Procédé et dispositif d'aide au dépassement d'un véhicule en présence d'un autre véhicule invisible et circulant à contresens
US9558667B2 (en) 2012-07-09 2017-01-31 Elwha Llc Systems and methods for cooperative collision detection
US9776632B2 (en) 2013-07-31 2017-10-03 Elwha Llc Systems and methods for adaptive vehicle sensing systems
CN107769897A (zh) * 2016-08-23 2018-03-06 瑞萨电子株式会社 通信设备和重传控制方法
US10295662B2 (en) 2014-03-17 2019-05-21 Bae Systems Plc Producing data describing target measurements
WO2020086127A1 (fr) * 2018-10-22 2020-04-30 Ebay Inc. Communication et notification entre véhicules
WO2020229077A1 (fr) * 2019-05-13 2020-11-19 Volkswagen Aktiengesellschaft Mise en garde contre une situation dangereuse dans la circulation routière

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4237987A1 (de) * 1992-11-11 1994-05-19 Opel Adam Ag Elektronische Einrichtung
US5572201A (en) * 1994-08-05 1996-11-05 Federal Signal Corporation Alerting device and system for abnormal situations
EP0841648A2 (fr) * 1992-09-30 1998-05-13 Hitachi, Ltd. Mécanisme de support d'entraínement pour véhicule et véhicule avec un tel mécanisme
US5907293A (en) * 1996-05-30 1999-05-25 Sun Microsystems, Inc. System for displaying the characteristics, position, velocity and acceleration of nearby vehicles on a moving-map
US5983161A (en) * 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
DE19922608A1 (de) * 1999-05-17 2000-11-23 Media Praesent Ursula Nitzsche Verfahren und Vorrichtung zur drahtlosen Notsignalübertragung, insbesondere zu oder zwischen Fahrzeugen, mit einem Hochfrequenz-Sender sowie einer, auch separaten Empfängeranordnung

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0841648A2 (fr) * 1992-09-30 1998-05-13 Hitachi, Ltd. Mécanisme de support d'entraínement pour véhicule et véhicule avec un tel mécanisme
DE4237987A1 (de) * 1992-11-11 1994-05-19 Opel Adam Ag Elektronische Einrichtung
US5983161A (en) * 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
US5572201A (en) * 1994-08-05 1996-11-05 Federal Signal Corporation Alerting device and system for abnormal situations
US5907293A (en) * 1996-05-30 1999-05-25 Sun Microsystems, Inc. System for displaying the characteristics, position, velocity and acceleration of nearby vehicles on a moving-map
DE19922608A1 (de) * 1999-05-17 2000-11-23 Media Praesent Ursula Nitzsche Verfahren und Vorrichtung zur drahtlosen Notsignalübertragung, insbesondere zu oder zwischen Fahrzeugen, mit einem Hochfrequenz-Sender sowie einer, auch separaten Empfängeranordnung

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
NUSSER R ET AL: "Bluetooth-based wireless connectivity in an automotive environment" VEHICULAR TECHNOLOGY CONFERENCE FALL 2000. IEEE VTS FALL VTC2000. 52ND VEHICULAR TECHNOLOGY CONFERENCE (CAT. NO.00CH37152), VEHICULAR TECHNOLOGY CONFERENCE FALL 2000. IEEE VTS FALL VTC2000. 52ND VEHICULAR TECHNOLOGY CONFERENCE, BOSTON, MA, USA, 24-28, pages 1935-1942 vol.4, XP010524360 2000, Piscataway, NJ, USA, IEEE, USA ISBN: 0-7803-6507-0 *
STIRLING A: "Mobile multimedia platforms" VEHICULAR TECHNOLOGY CONFERENCE FALL 2000. IEEE VTS FALL VTC2000. 52ND VEHICULAR TECHNOLOGY CONFERENCE (CAT. NO.00CH37152), VEHICULAR TECHNOLOGY CONFERENCE FALL 2000. IEEE VTS FALL VTC2000. 52ND VEHICULAR TECHNOLOGY CONFERENCE, BOSTON, MA, USA, 24-28, pages 2541-2548 vol.6, XP010525052 2000, Piscataway, NJ, USA, IEEE, USA ISBN: 0-7803-6507-0 *

Cited By (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1441321A3 (fr) * 2003-01-21 2009-01-14 Robert Bosch Gmbh Procédé de transmission d'information entre les stations mobiles
EP1441321A2 (fr) * 2003-01-21 2004-07-28 Robert Bosch Gmbh Procédé de transmission d'information entre les stations mobiles
DE10326648A1 (de) * 2003-06-11 2005-01-13 Daimlerchrysler Ag Kooperatives Radarsystem für Fahrzeuge
FR2896594A1 (fr) * 2006-01-24 2007-07-27 Renault Sas Procede de perception, par un vehicule, de son environnement
US8056857B2 (en) 2006-02-15 2011-11-15 Be Aerospace, Inc. Aircraft seat with upright seat back position indicator
WO2008022817A1 (fr) * 2006-08-21 2008-02-28 Continental Automotive Gmbh Système d'assistance au conducteur pour évaluer et prévoir localement et temporellement la dynamique de conduite d'un véhicule
US8886386B2 (en) 2006-11-23 2014-11-11 Continental Automotive Gmbh Method for wireless communication between vehicles
WO2008061890A1 (fr) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Procédé de communication hertzienne entre des véhicules
WO2008084280A1 (fr) * 2007-01-08 2008-07-17 Sony Ericsson Mobile Communications Ab Système et procédé de radiodiffusion interactive
US7826789B2 (en) 2007-01-08 2010-11-02 Sony Ericsson Mobile Communications Ab System and method for interactive broadcasting
WO2008110926A2 (fr) 2007-03-12 2008-09-18 Toyota Jidosha Kabushiki Kaisha Système de détection d'état de route
JP2008225786A (ja) * 2007-03-12 2008-09-25 Toyota Motor Corp 道路状況検出システム
WO2008110926A3 (fr) * 2007-03-12 2008-11-27 Toyota Motor Co Ltd Système de détection d'état de route
US8362889B2 (en) 2007-03-12 2013-01-29 Toyota Jidosha Kabushiki Kaisha Road condition detecting system
WO2009121738A2 (fr) * 2008-04-03 2009-10-08 Siemens Aktiengesellschaft Procédé et dispositif de détection d'un risque de collision sur des unités mobiles à l'intérieur d'un espace industriel
WO2009121738A3 (fr) * 2008-04-03 2010-02-25 Siemens Aktiengesellschaft Procédé et dispositif de détection d'un risque de collision sur des unités mobiles à l'intérieur d'un espace industriel
DE102008041749A1 (de) 2008-09-01 2010-03-04 Robert Bosch Gmbh Verfahren zum Bereitstellen einer Kommunikation zwischen Fahrzeugen
US9129509B2 (en) 2009-06-12 2015-09-08 Safemine Ag Movable object proximity warning system
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same
US8886394B2 (en) 2009-12-17 2014-11-11 Bae Systems Plc Producing data describing states of a plurality of targets
WO2011090417A1 (fr) * 2010-01-19 2011-07-28 Volvo Technology Corporation Dispositif d'avertissement d'angle mort et système d'avertissement d'angle mort
AU2010351500B2 (en) * 2010-04-19 2014-09-11 Safemine Ag Object proximity warning system and method
WO2011130861A1 (fr) * 2010-04-19 2011-10-27 Safemine Ag Système et procédé d'alerte de proximité d'objet
CN102947870A (zh) * 2010-06-23 2013-02-27 大陆-特韦斯贸易合伙股份公司及两合公司 用于验证信息的方法和系统
CN103080953A (zh) * 2010-06-23 2013-05-01 大陆-特韦斯贸易合伙股份公司及两合公司 用于加速的物体检测和/或加速的物体属性检测的方法和系统及所述方法的用途
KR101942109B1 (ko) * 2010-06-23 2019-04-11 콘티넨탈 테베스 아게 운트 코. 오하게 정보를 유효화하는 방법 및 시스템
WO2011161177A1 (fr) * 2010-06-23 2011-12-29 Continental Teves Ag & Co. Ohg Procédé et système de validation d'informations
US9393958B2 (en) 2010-06-23 2016-07-19 Continental Teves Ag & Co. Ohg Method and system for validating information
WO2011161176A1 (fr) * 2010-06-23 2011-12-29 Continental Teves Ag & Co. Ohg Procédé et système d'identification d'objet accélérée et/ou d'identification d'attribut d'objet accélérée et utilisation du procédé
KR20130121816A (ko) * 2010-06-23 2013-11-06 콘티넨탈 테베스 아게 운트 코. 오하게 정보를 유효화하는 방법 및 시스템
US9096228B2 (en) 2010-06-23 2015-08-04 Continental Teves Ag & Co. Ohg Method and system for accelerated object recognition and/or accelerated object attribute recognition and use of said method
CN104137164A (zh) * 2012-02-25 2014-11-05 奥迪股份公司 用于在车对车通信时识别车辆的方法
CN104137164B (zh) * 2012-02-25 2016-03-02 奥迪股份公司 用于在车对车通信时识别车辆的方法
US9165198B2 (en) 2012-02-25 2015-10-20 Audi Ag Method for identifying a vehicle during vehicle-to-vehicle communication
US8595037B1 (en) 2012-05-08 2013-11-26 Elwha Llc Systems and methods for insurance based on monitored characteristics of an autonomous drive mode selection system
US9000903B2 (en) 2012-07-09 2015-04-07 Elwha Llc Systems and methods for vehicle monitoring
US9558667B2 (en) 2012-07-09 2017-01-31 Elwha Llc Systems and methods for cooperative collision detection
US9165469B2 (en) 2012-07-09 2015-10-20 Elwha Llc Systems and methods for coordinating sensor operation for collision detection
WO2014011545A1 (fr) * 2012-07-09 2014-01-16 Elwha Llc Systèmes et procédés pour détection de collision coopérative
CN105210129A (zh) * 2013-04-19 2015-12-30 大陆-特韦斯贸易合伙股份公司及两合公司 用于避免跟行车辆撞上紧前方车辆的方法和系统及该系统的应用
US9230442B2 (en) 2013-07-31 2016-01-05 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9269268B2 (en) 2013-07-31 2016-02-23 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9776632B2 (en) 2013-07-31 2017-10-03 Elwha Llc Systems and methods for adaptive vehicle sensing systems
EP2846172A1 (fr) * 2013-09-09 2015-03-11 Nxp B.V. Système et procédé d'avertissement
CN104637344A (zh) * 2013-11-11 2015-05-20 纬创资通股份有限公司 车辆预警系统及车辆预警方法
ITVR20130267A1 (it) * 2013-12-03 2015-06-04 Emanuele Donatelli Sistema per la prevenzione del traffico e il controllo degli incidenti
US10295662B2 (en) 2014-03-17 2019-05-21 Bae Systems Plc Producing data describing target measurements
FR3033539A1 (fr) * 2015-03-09 2016-09-16 Peugeot Citroen Automobiles Sa Procede et dispositif d'aide au depassement d'un vehicule en presence d'un autre vehicule invisible et circulant a contresens
WO2016142603A1 (fr) * 2015-03-09 2016-09-15 Peugeot Citroen Automobiles Sa Procédé et dispositif d'aide au dépassement d'un véhicule en présence d'un autre véhicule invisible et circulant à contresens
CN107769897A (zh) * 2016-08-23 2018-03-06 瑞萨电子株式会社 通信设备和重传控制方法
CN107769897B (zh) * 2016-08-23 2022-02-08 瑞萨电子株式会社 通信设备和重传控制方法
US10723366B2 (en) 2018-10-22 2020-07-28 Ebay Inc. Intervehicle communication and notification
US10703386B2 (en) 2018-10-22 2020-07-07 Ebay Inc. Intervehicle communication and notification
KR20210047333A (ko) * 2018-10-22 2021-04-29 이베이 인크. 차량 간 통신 및 통지
CN112955353A (zh) * 2018-10-22 2021-06-11 电子湾有限公司 车辆间通信和通知
JP2022502792A (ja) * 2018-10-22 2022-01-11 イーベイ インク.Ebay Inc. 車両間通信および通知
WO2020086127A1 (fr) * 2018-10-22 2020-04-30 Ebay Inc. Communication et notification entre véhicules
JP7326438B2 (ja) 2018-10-22 2023-08-15 イーベイ インク. 車両間通信および通知
KR102621430B1 (ko) * 2018-10-22 2024-01-09 이베이 인크. 차량 간 통신 및 통지
WO2020229077A1 (fr) * 2019-05-13 2020-11-19 Volkswagen Aktiengesellschaft Mise en garde contre une situation dangereuse dans la circulation routière
CN113785339A (zh) * 2019-05-13 2021-12-10 大众汽车股份公司 对在道路交通中的危险情况的警报
US11790782B2 (en) 2019-05-13 2023-10-17 Volkswagen Aktiengesellschaft Warning about a hazardous situation in road traffic

Also Published As

Publication number Publication date
AU2002349794A8 (en) 2008-02-28
AU2002349794A1 (en) 2003-01-08
WO2003001474A3 (fr) 2008-01-03

Similar Documents

Publication Publication Date Title
WO2003001474A2 (fr) Procede et appareil de transfert d'informations entre vehicules
US6615137B2 (en) Method and apparatus for transferring information between vehicles
US11315424B2 (en) Automotive driver assistance
US9478130B2 (en) Systems and methods for traffic guidance nodes and traffic navigating entities
US6791471B2 (en) Communicating position information between vehicles
US11375351B2 (en) Method and system for communicating vehicle position information to an intelligent transportation system
US20150042491A1 (en) Hazard warning system for vehicles
US11518394B2 (en) Automotive driver assistance
US11414073B2 (en) Automotive driver assistance
WO2004047047A1 (fr) Procédé et système de prévention des collisions de la route
US20130229289A1 (en) Hazard warning system for vehicles
CN108297880A (zh) 分心驾驶员通知系统
EP3472680A1 (fr) Systèmes de commande de véhicule
US10462225B2 (en) Method and system for autonomously interfacing a vehicle electrical system of a legacy vehicle to an intelligent transportation system and vehicle diagnostic resources
CN103318086A (zh) 汽车防追尾及安全行车信息交流警示控制系统
JP2022544533A (ja) 危険車両および道路状態を通信するためのシステム
JP2002123896A (ja) 車両用衝突警報装置
JPH1173595A (ja) 交通情報を形成する方法および車両用テレマティーク装置
CN111161551B (zh) 用于检测、警报和响应紧急车辆的设备、系统和方法
JP4478330B2 (ja) 交通安全性を高める装置
US7407028B2 (en) Navigation-based safety restraint system and method
JP2006259861A (ja) ハザードウォーニングシステム
KR20040037423A (ko) 디 에스 알 씨를 이용한 지능형 서비스 장치 및 방법
US20090105901A1 (en) System for utilizing vehicle data and method of utilizing vehicle data
JP4748121B2 (ja) 交通支援システム、車載器、携帯機及び基地局

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP