EP3703032B1 - Collaborative safety for occluded objects - Google Patents
Collaborative safety for occluded objects
- Publication number
- EP3703032B1 (application EP19159377.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- target vehicle
- vehicle
- occluded
- occluded area
- potential hazard
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/162—Decentralised systems, e.g. inter-vehicle communication event-triggered
Definitions
- the present disclosure relates to a combination of sensor systems and communication systems for use in a vehicle or traffic infrastructure. There are disclosed methods and devices for alleviating problems related to potentially hazardous occluded objects in a traffic environment.
- V2V and V2I systems have been proposed as a means to alleviate the problems with occluded objects.
- Current solutions involve objects broadcasting their positions and motion vectors using active communication devices.
- Other solutions involve monitoring devices installed throughout the traffic infrastructure that identify and broadcast details about potentially hazardous objects to some road users.
- US20180319280 describes an object-detection system which uses both sensors and V2X communication to detect both visible and hidden objects.
- DE102016214316 discusses determining one or more driver vision occlusion areas that cannot be seen depending on a detected eye position of a driver and on information about the vehicle geometry.
- US 2017/0327035 A1 discloses methods and systems for beyond-the-horizon threat indication for vehicles.
- sensor systems are used to provide early warning to subscribing vehicles about possible hazards in the vicinity of the subscribing vehicle or in some pre-determined area of interest, such as an intersection.
- US 9,910,442 B2 discloses methods for determining occluded areas based on the spatial extension and orientation of occluding objects.
- US 2018/0319280 A1 discloses methods for detecting visually obstructed objects in automotive environments.
- Even if manufacturers are now starting to equip vehicles with V2X capability, it will take time before market penetration makes these systems truly effective. Also, some types of road users, such as pedestrians, cyclists and powered two-wheelers, might never be connected to a V2X system.
- the occluding object data comprises information related to an orientation and/or spatial extension of the occluding object. This way a more refined estimate of the occluded area can be generated based on, e.g., ray-tracing techniques.
- the target vehicle data comprises information related to at least one field of view of the target vehicle.
- the field of view can be that of one or more sensors and can also comprise that of a driver gaze.
- an ego vehicle comprising the vehicle signal processing system constitutes the occluding object.
- the ego vehicle may warn other road users about potential hazardous situations due to occlusion.
- the information signal is arranged to trigger a warning system and/or a control maneuver in the target vehicle. This way efficient and automatic accident prevention can be realized.
- the method comprises determining a threat level associated with the potential hazard with respect to the target vehicle and triggering transmission in case the threat level meets a severity criterion.
- Figure 1 shows a vehicle 110 comprising at least one sensor 111.
- the sensor 111 is associated with a field of view (FoV) 125.
- the sensor may, e.g., be a radar or lidar sensor which detects objects in the sensor field of view.
- a vehicle may comprise a plurality of on-board sensors of different types and having different fields of view. These sensors provide information about a surrounding environment of the ego vehicle.
- the techniques and methods disclosed herein are applicable to a wide variety of on-board sensor data types, including radar sensor data and lidar sensor data, but also, e.g., vision-related sensors such as camera and IR sensors, as well as ultrasound sensors.
- Gaze tracking systems can be used to monitor where the driver is looking, and which areas are hidden from the driver's field of view.
- a field of view is to be interpreted as that of one or more sensors, and/or that of a driver or passenger.
- velocities, locations and areas may be determined with reference to a global coordinate system, such as WGS-84, or they may be relative quantities determined with respect to some local coordinate system, such as a local coordinate system defined based on an ego vehicle location and heading.
- An object detected by a sensor is herein denoted a sensor detection 160.
- a sensor detection may comprise different types of information depending on the type of sensor that is used. For instance, a radar sensor provides sensor data comprising distances to detected objects, and often also a relative velocity of the object with respect to the radar transceiver. Radar sensors providing Doppler information are particularly suitable for obtaining joint range and relative velocity sensor data. Some radar sensors also provide angle information, e.g., relative to a bore-sight direction of the sensor transceiver.
- the raw sensor data is generated at a high rate and often also at a high resolution, implying that large quantities of data are generated.
- the vehicle 110 also comprises a signal processing unit 700. This unit will be discussed in more detail below in connection to Figure 7 .
- the vehicle 110 is equipped for V2X communications 115.
- the vehicle comprises a transceiver 112 arranged for communicating with other transceivers in the traffic environment via V2X.
- These other transceivers may be comprised in other vehicles, or attached to devices in the traffic infrastructure, and also radio base stations deployed throughout the environment.
- the methods disclosed herein limit the amount of information communicated via V2X by first determining occluded areas which are hidden from the viewpoint of some target vehicle. If a potential hazard is detected in the occluded area, an information transmission is triggered. This way only a limited amount of information is transmitted via V2X, since hazards outside of the occluded area do not trigger transmissions. The transmissions that are actually triggered are more likely to comprise relevant information not already detectable by, e.g., on-board sensors at the target vehicle or by eyesight from the target vehicle driver.
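The trigger logic described in this paragraph can be sketched as follows. This is a minimal, hypothetical sketch: the function names, the dictionary-based data shapes, and the callback interfaces are assumptions for illustration, not taken from the patent.

```python
def maybe_trigger_v2x_warning(target, occluder, hazards,
                              compute_occluded_area, send_v2x):
    """Trigger a V2X transmission only for hazards hidden from the target.

    target, occluder: dicts with at least an 'xy' position (hypothetical shape).
    hazards: list of detected objects, each a dict with an 'xy' position.
    compute_occluded_area: returns a predicate xy -> bool ("is this hidden?").
    send_v2x: callback performing the actual transmission.
    """
    is_occluded = compute_occluded_area(target, occluder)
    sent = []
    for hazard in hazards:
        if is_occluded(hazard["xy"]):   # hidden from the target vehicle
            send_v2x(target, hazard)    # only these trigger transmissions
            sent.append(hazard)
    return sent                         # hazards outside the area are dropped
```

The point of the gating is bandwidth: only hazards the target plausibly cannot see generate V2X traffic.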
- Figure 2 illustrates a scenario 200 where a vehicle 120 is about to turn left at, e.g., an intersection. There is another vehicle 110 waiting to turn left which is obstructing the view of the vehicle 120.
- Drivers look for a clear path in one direction at a time and in many cases miss the occluded vehicle 150, which therefore constitutes a potential hazard.
- the hazard in this case is the potential collision which will occur if both vehicles continue on the same headings and at the same velocities 121, 151.
- AEB: automatic emergency braking
- the proposed method here first determines an occluded area 140 based on data obtained from the oncoming vehicle 120 and the location of the occluding object 130.
- the method detects the potential hazard, i.e., the other vehicle 150, in the occluded area, and therefore triggers transmission of a warning signal informing the vehicle 120 about the oncoming other vehicle 150. This way one or both vehicles 120, 150 are warned, and the accident can be avoided.
- Figure 3 illustrates a scenario 300 where a pedestrian 150' is crossing a street 310.
- the pedestrian is hidden from a vehicle 120 approaching the crossing but is detected by another vehicle 110 which at the same time is an occluding object 130 occluding the view from the vehicle 120.
- the position and movement of the pedestrian 150' can in this case be communicated to the incoming vehicle 120 to avoid collision. This could, e.g., be done by warning the driver, signaling to the pedestrian (by both vehicles) or triggering an AEB system at the vehicle 120. Consequently, in this scenario 300 the method again determines an extent of an occluded area 140, detects the potential hazard (the pedestrian) in the occluded area 140, and therefore triggers transmission of an information signal or a warning signal via V2X.
- Figure 4 illustrates an example of how an occluded area 140 can be determined based on trigonometry.
- the occluded area 140 is determined as a sector 400 having a center 410 at the target vehicle 120.
- An arc 420 and an orientation 430 of the sector 400 is determined based on the location of the occluding object 130 in relation to the target vehicle 120.
- the part of the sector 400 behind the occluding object constitutes the occluded area.
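The sector construction of Figure 4 can be sketched as follows. The helper names are hypothetical, and this is a deliberately naive version: it ignores bearing wraparound at ±π and treats the occluding object as opaque between its extremal corners.

```python
import math

def occluded_sector(target_xy, corners):
    """Approximate the occluded area 140 as a sector behind an occluder.

    target_xy: (x, y) of the target vehicle (the sector center 410).
    corners:   (x, y) corners of the occluding object 130.
    Returns (theta_min, theta_max, min_range): a point is considered hidden
    if its bearing from the target lies inside [theta_min, theta_max] and
    it is farther away than the nearest occluder corner.
    """
    tx, ty = target_xy
    angles = [math.atan2(cy - ty, cx - tx) for cx, cy in corners]
    ranges = [math.hypot(cx - tx, cy - ty) for cx, cy in corners]
    return min(angles), max(angles), min(ranges)

def is_hidden(point_xy, target_xy, sector):
    """True if point_xy lies in the part of the sector behind the occluder."""
    theta_min, theta_max, min_range = sector
    dx, dy = point_xy[0] - target_xy[0], point_xy[1] - target_xy[1]
    theta, dist = math.atan2(dy, dx), math.hypot(dx, dy)
    return theta_min <= theta <= theta_max and dist > min_range
```

A production version would handle the ±π discontinuity and, per the description below, inflate the sector by a safety margin to absorb measurement errors.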
- the occluded area can be refined. For instance, all lines of sight comprised in a field of view (either eyesight or another sensor) define a visible area. The occluded area 140 is then the part of the environment not comprised in the visible area. According to other aspects, a total field of view is first determined. The occluded area 140 is then determined as the part of the environment not comprised in the field of view.
- Figure 5 shows an example where several objects occlude a sensor field of view.
- the occluded area 140a, 140b, 140c is here determined as a polygon based on a plurality of occluding objects, in relation to the target vehicle 120.
- the occluded area 140 can be determined in a number of different ways with varying complexity.
- the occluded area can be roughly estimated or be determined in a more refined manner.
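One way to realize the polygon variant is to build a "shadow" quadrilateral behind each occluding object and take the union over all occluders. This is a hedged sketch: the function name, the fixed sensing range, and the two-corner simplification are illustrative assumptions.

```python
import math

def shadow_polygon(target_xy, corners, max_range=200.0):
    """Build one occluded polygon (cf. 140a/140b/140c) behind one occluder.

    The two occluder corners with extremal bearings from the target are
    extended radially out to max_range; the occluded area is then the
    quadrilateral [corner_a, corner_b, far_b, far_a].
    """
    tx, ty = target_xy

    def bearing(c):
        return math.atan2(c[1] - ty, c[0] - tx)

    corner_a = min(corners, key=bearing)
    corner_b = max(corners, key=bearing)

    def extend(c):
        th = bearing(c)
        return (tx + max_range * math.cos(th), ty + max_range * math.sin(th))

    return [corner_a, corner_b, extend(corner_b), extend(corner_a)]

# One polygon per occluding object; their union approximates the total
# occluded area for the target vehicle 120.
occluded = [shadow_polygon((0.0, 0.0), obj)
            for obj in ([(10, -1), (10, 1)], [(5, 8), (7, 8)])]
```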
- the proposed methods may be advantageously combined with eye tracking or gaze tracking functions. Such functions may monitor where, e.g., a driver has directed his or her gaze during a current time window. A field of view can then be defined as the area covered by the driver's gaze in a recent time period.
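A sliding-window gaze field of view along these lines might be sketched as follows. The class, the 15-degree gaze cone, and the 3-second window are illustrative assumptions, not values from the patent.

```python
import math
from collections import deque

class GazeFieldOfView:
    """Approximate a driver field of view as the bearings covered by
    recent gaze samples (timestamp, bearing in radians); directions not
    recently looked at count toward the occluded area."""

    def __init__(self, window_s=3.0, half_cone=math.radians(15)):
        self.window_s = window_s
        self.half_cone = half_cone  # gaze covers a cone, not a single ray
        self.samples = deque()

    def add(self, t, bearing):
        """Record a gaze sample and drop samples older than the window."""
        self.samples.append((t, bearing))
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def sees(self, t, bearing):
        """True if some recent gaze sample covered this bearing."""
        return any(abs(bearing - b) <= self.half_cone
                   for ts, b in self.samples if t - ts <= self.window_s)
```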
- Figure 6 is a flow chart illustrating the proposed methods.
- a method in a vehicle signal processing system 700 for triggering transmission of an information signal 115 to a target vehicle 120.
- the method comprises obtaining S1 target vehicle data comprising a location of the target vehicle 120.
- the target vehicle data may, according to aspects, also comprise additional data such as heading, velocity, and the like.
- the vehicle data furthermore comprises information related to a field of view of the target vehicle.
- the target vehicle data may be obtained from on-board sensors, or via V2X transmission, or from a combination of different sources including from storage 730.
- the method also comprises obtaining S2 occluding object data comprising a location of an occluding object 130.
- the occluding object data may also be obtained from on-board sensors or via V2X transmission, or from a combination of V2X transmission and on-board sensors, including from storage 730.
- the occluding object may be an ego vehicle 110, or it may be some other object, like a building.
- the proposed method determines S3 an occluded area 140 based on the target vehicle data and on the occluding object data.
- the occluded area 140 here represents an area that is hidden from the target vehicle 120 due to the occluding object 130.
- the potential hazard 150 is detected S41 by one or more on-board sensors comprised in the ego vehicle 110. Potential hazards may of course also be detected by external sensors, by eyesight, or by other vehicles, whose data is then communicated to the signal processing system 700.
- Examples of potential hazards may comprise any of:
  - A potential collision with an oncoming vehicle, determined based on extrapolations 121, 151 of the motion tracks of the oncoming vehicle and the target vehicle 120.
  - A potential collision with a pedestrian or animal, based on an extrapolation 121 of a motion track of the target vehicle 120.
  - A potential collision with a fixed object, based on an extrapolation 121 of a motion track of the target vehicle 120.
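The motion-track extrapolation behind these hazard examples can be illustrated with a constant-velocity closest-approach computation. This is a standard textbook construction, not the patent's own algorithm; the names and the radius/horizon values are illustrative.

```python
def closest_approach(p1, v1, p2, v2):
    """Extrapolate two motion tracks (cf. 121, 151) with constant velocity
    and return (t_min, d_min): time and distance of closest approach."""
    dp = (p2[0] - p1[0], p2[1] - p1[1])   # relative position
    dv = (v2[0] - v1[0], v2[1] - v1[1])   # relative velocity
    dv2 = dv[0] ** 2 + dv[1] ** 2
    if dv2 == 0.0:                        # identical velocities: gap is constant
        t = 0.0
    else:                                 # minimize |dp + dv*t|, clamp to future
        t = max(0.0, -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2)
    dx, dy = dp[0] + dv[0] * t, dp[1] + dv[1] * t
    return t, (dx ** 2 + dy ** 2) ** 0.5

def potential_collision(p1, v1, p2, v2, radius=2.0, horizon=10.0):
    """Flag a hazard if the extrapolated tracks come within `radius` meters
    of each other inside the prediction horizon (seconds)."""
    t, d = closest_approach(p1, v1, p2, v2)
    return t <= horizon and d <= radius
```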
- the occluding object data comprises information related to an orientation and/or spatial extension of the occluding object. This allows for a more accurate determination of the extent of the occluded area.
- the determination of the occluded area can be based on geometrical relationships, such as straight lines, angles, and the like.
- the occluded area can be determined as all lines of view which cross the occluding object at some point, for instance as illustrated in Figures 2 and 3.
- the target vehicle data comprises information related to at least one field of view 125 of the target vehicle 120.
- the occluded area 140 can now be determined as comprising any line of view not comprised in the field of view.
- gaze tracking functions are combined with the proposed methods to determine fields of view and occluded areas.
- a vehicle with a V2X transceiver may then enquire of some other target vehicle whether a potential hazard has been detected by the driver, or whether a warning signal should be issued.
- the method may also comprise enquiring via V2X to check if a potential hazard has been detected by the target vehicle or not. In case the hazard has been detected already, then the information signal need not be triggered. This way redundant information signal transmissions can be avoided, which is an advantage.
- the occluded area 140 is determined as a sector 400 having a center 410 at the target vehicle 120, wherein an arc 420 and an orientation 430 of the sector 400 is determined based on the location of the occluding object 130 in relation to the target vehicle 120.
- a sector 400 like this was discussed above in connection to Figure 4 .
- Lines are drawn out from, e.g., a sensor location which pass corners of the occluding object 130.
- the area between the two lines behind the occluding object is then considered hidden from view.
- the spatial extension and orientation of the occluding object can, if not known, be assumed equal to some pre-configured values.
- a safety margin can be applied, i.e., the occluded area may be enlarged somewhat in order to account for any measurement errors and other uncertainties.
- the occluded area 140 is determined as a polygon 400 based on the location of the occluding object 130, or based on a plurality of occluding objects, in relation to the target vehicle 120.
- One such example was discussed above in connection to Figure 5 .
- the determination of the occluded area is of course improved if additional information becomes available, such as the field of view of the target vehicle, and/or the spatial configuration of the occluding object.
- the occluded area 140 can be determined as the part of the traffic environment not comprised in the field of view of the target vehicle 120.
- the field of view 125 is first determined, from which the occluded area 140 follows.
- the occluded area 140 is a pre-determined area configured in dependence of a current scenario.
- the occluded area can be pre-configured manually based on scenario. For instance, a given intersection may be associated with occluded areas which have been surveyed in advance.
- the determining then comprises detecting a set of prerequisites, i.e., oncoming vehicle locations, and then mapping the prerequisites to the pre-configured areas. For example, in case a vehicle enters this pre-defined region at the same time as another vehicle enters another pre-defined region, then the two vehicles are assumed hidden from each other's view.
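The mapping from detected prerequisites (vehicle locations) to pre-configured occluded regions could look like the following sketch. The region geometry, names, and the mutually occluded pairing are invented for illustration; a real deployment would use surveyed polygons for the given intersection.

```python
# Hypothetical pre-surveyed intersection: region names map to boxes
# ((min_x, min_y), (max_x, max_y)), and pairs of regions that are known
# in advance to be hidden from each other's view.
REGIONS = {
    "approach_north": ((-2.0, 10.0), (2.0, 40.0)),
    "approach_west":  ((-40.0, -2.0), (-10.0, 2.0)),
}
MUTUALLY_OCCLUDED = {frozenset(("approach_north", "approach_west"))}

def region_of(xy):
    """Map a detected vehicle location to a pre-configured region, if any."""
    for name, ((x0, y0), (x1, y1)) in REGIONS.items():
        if x0 <= xy[0] <= x1 and y0 <= xy[1] <= y1:
            return name
    return None

def assumed_hidden(xy_a, xy_b):
    """Two vehicles are assumed hidden from each other when they occupy
    a pre-configured pair of mutually occluded regions."""
    ra, rb = region_of(xy_a), region_of(xy_b)
    return (ra is not None and rb is not None
            and frozenset((ra, rb)) in MUTUALLY_OCCLUDED)
```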
- the method also comprises generating and transmitting S6 the information signal 115.
- the transmission may be over, e.g., 802.11p, DSRC, cellular communications, or the like.
- the information signal 115 is arranged to trigger a warning system and/or a control maneuver in the target vehicle 120, such as an AEB system or the like.
- the method also comprises determining S42 a threat level associated with the potential hazard 150 with respect to the target vehicle 120 and triggering transmission S51 in case the threat level meets a severity criterion.
- communicated information is further limited to only comprise information relevant to more severe scenarios.
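The severity gating of steps S42/S51 can be sketched as below. The threat-level formula (closing speed over time to hazard) and the threshold value are illustrative assumptions, not taken from the patent.

```python
def threat_level(time_to_hazard_s, closing_speed_mps):
    """Hypothetical scalar threat level (cf. S42): higher when the hazard
    is imminent and the closing speed is high."""
    if time_to_hazard_s <= 0.0:
        return float("inf")
    return closing_speed_mps / time_to_hazard_s

SEVERITY_THRESHOLD = 5.0  # tuning parameter, invented for illustration

def should_transmit(time_to_hazard_s, closing_speed_mps):
    """Trigger transmission (cf. S51) only if the severity criterion is met,
    so that only the more severe scenarios generate V2X traffic."""
    return threat_level(time_to_hazard_s, closing_speed_mps) >= SEVERITY_THRESHOLD
```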
- Figure 7 schematically illustrates, in terms of a number of functional units, the components of a sensor signal processing system 700 according to an embodiment.
- Processing circuitry 710 is provided using any combination of one or more of a suitable central processing unit CPU, multiprocessor, microcontroller, digital signal processor DSP, etc., capable of executing software instructions stored in a computer program product, e.g. in the form of a storage medium 730.
- the processing circuitry 710 may further be provided as at least one application specific integrated circuit ASIC, or field programmable gate array FPGA.
- the processing circuitry thus comprises a plurality of digital logic components.
- the processing circuitry 710 is configured to cause the system 700 to perform a set of operations, or steps.
- the storage medium 730 may store the set of operations.
- the processing circuitry 710 may be configured to retrieve the set of operations from the storage medium 730 to cause the system 700 to perform the set of operations.
- the set of operations may be provided as a set of executable instructions.
- the processing circuitry 710 is thereby arranged to execute methods as herein disclosed.
- the storage medium 730 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
- the sensor signal processing system 700 further comprises an interface 720 for communications with at least one external device, such as a vehicle sensor 111, and a V2X transceiver 112.
- the interface 720 may comprise one or more transmitters and receivers, comprising analogue and digital components and a suitable number of ports for wireline communication.
- the V2X transceiver 112 and the vehicle sensor 111 may be integrated into a single unit, possibly also comprising the interface 720.
- the processing circuitry 710 controls the general operation of the system 700, e.g. by sending data and control signals to the interface 720 and the storage medium 730, by receiving data and reports from the interface 720, and by retrieving data and instructions from the storage medium 730.
- the sensor signal processing system 700 is, as discussed above, arranged to trigger transmission of an information signal 115 to a target vehicle 120. Towards this end, the processing circuitry is arranged as claimed in claim 11.
- Figure 8 shows a computer program product 800 comprising computer executable instructions 810 to execute any of the methods disclosed herein.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
Claims (11)
- A method for triggering transmission of an information signal (115) to a target vehicle (120) by means of a vehicle signal processing system (700) comprised in an ego vehicle, the method comprising: obtaining (S1) target vehicle data comprising a location of the target vehicle (120) and at least one field of view (125) of the target vehicle (120), wherein the at least one field of view is that of one or more sensors of the target vehicle (120) and/or that of a driver or passenger of the target vehicle; obtaining (S2) occluding object data comprising a location, an orientation and/or a spatial extension of an occluding object (130), wherein the ego vehicle (110) comprising the vehicle signal processing system (700) constitutes the occluding object (130); determining (S3) an occluded area (140) based on the target vehicle data and the occluding object data, wherein the occluded area (140) represents an area that is not visible to the target vehicle (120) due to the occluding object (130); and, only if a potential hazard (150, 150') is detected (S4) in the occluded area (140), triggering (S5) transmission of the potential-hazard information signal (115) to the target vehicle (120), wherein the potential-hazard information signal comprises information related to the potential hazard (150), and wherein potential hazards detected outside of the occluded area do not trigger transmissions.
- The method according to claim 1, wherein the occluded area (140) is determined as a sector (400) having a center (410) at the target vehicle (120), wherein an arc (420) and an orientation (430) of the sector (400) are determined based on the location of the occluding object (130) in relation to the target vehicle (120).
- The method according to any of claims 1-2, wherein the occluded area (140) is determined as a polygon (400) based on the location of the occluding object (130), or based on a plurality of occluding objects, in relation to the target vehicle (120).
- The method according to any preceding claim, wherein the potential hazard (150) is detected (S41) by one or more on-board sensors comprised in the ego vehicle (110).
- The method according to any preceding claim, comprising generating and transmitting (S6) the information signal (115).
- The method according to any preceding claim, wherein the information signal (115) is arranged to trigger a warning system and/or a control maneuver in the target vehicle (120).
- The method according to any preceding claim, comprising determining (S42) a threat level associated with the potential hazard (150) with respect to the target vehicle (120), and triggering transmission (S51) in case the threat level meets a severity criterion.
- The method according to any preceding claim, wherein the potential hazard (150) comprises a potential collision with an oncoming vehicle, determined based on extrapolations (121, 151) of motion tracks of the oncoming vehicle and the target vehicle (120).
- The method according to any of claims 1-7, wherein the potential hazard (150) comprises a potential collision with a pedestrian or animal, based on an extrapolation (121) of a motion track of the target vehicle (120).
- The method according to any of claims 1-7, wherein the potential hazard (150) comprises a potential collision with a fixed object, based on an extrapolation (121) of a motion track of the target vehicle (120).
- A sensor signal processing system (700), comprised in an ego vehicle, for triggering transmission of an information signal (115) to a target vehicle (120), the sensor signal processing system comprising processing circuitry arranged to: obtain target vehicle data comprising a location of the target vehicle (120) and at least one field of view (125) of the target vehicle (120), wherein the at least one field of view is that of one or more sensors of the target vehicle (120) and/or that of a driver or passenger of the target vehicle; obtain occluding object data comprising a location, an orientation and/or a spatial extension of an occluding object (130), wherein the ego vehicle (110) comprising the vehicle signal processing system (700) constitutes the occluding object (130); determine an occluded area (140) based on the target vehicle data and the occluding object data, wherein the occluded area (140) represents an area that is not visible to the target vehicle (120) due to the occluding object (130); and, only if a potential hazard (150, 150') is detected (S4) in the occluded area (140), trigger (S5) transmission of the information signal (115) to the target vehicle (120), wherein the information signal comprises information related to the potential hazard (150), and wherein potential hazards detected outside of the occluded area do not trigger transmissions.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP19159377.1A EP3703032B1 (de) | 2019-02-26 | 2019-02-26 | Collaborative safety for occluded objects |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP19159377.1A EP3703032B1 (de) | 2019-02-26 | 2019-02-26 | Collaborative safety for occluded objects |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| EP3703032A1 (de) | 2020-09-02 |
| EP3703032C0 (de) | 2025-08-06 |
| EP3703032B1 (de) | 2025-08-06 |
Family
ID=65598512
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP19159377.1A Active EP3703032B1 (de) | Collaborative safety for occluded objects | 2019-02-26 | 2019-02-26 |
Country Status (1)
| Country | Link |
|---|---|
| EP (1) | EP3703032B1 (de) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114274979B (zh) * | 2022-01-07 | 2024-06-14 | 中国第一汽车股份有限公司 | Method, apparatus and storage medium for determining a target attention level in automated driving |
| CN115027490A (zh) * | 2022-05-31 | 2022-09-09 | 中国第一汽车股份有限公司 | Early-warning method, apparatus, vehicle and storage medium for a hidden target |
| DE102024000486A1 (de) * | 2024-02-15 | 2025-08-21 | Mercedes-Benz Group AG | Use of the vehicle sensors of parked vehicles for detecting occluded road users |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170327035A1 (en) * | 2016-05-10 | 2017-11-16 | Ford Global Technologies, Llc | Methods and systems for beyond-the-horizon threat indication for vehicles |
| US9910442B2 (en) * | 2016-06-28 | 2018-03-06 | Toyota Motor Engineering & Manufacturing North America, Inc. | Occluded area detection with static obstacle maps |
| DE102016214316A1 (de) | 2016-08-03 | 2018-02-08 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for detecting an object at least partially occluded by a vehicle part in the vehicle environment of a motor vehicle |
| US11214143B2 (en) | 2017-05-02 | 2022-01-04 | Motional Ad Llc | Visually obstructed object detection for automated vehicle using V2V/V2I communications |
- 2019-02-26: EP application EP19159377.1A filed; granted as patent EP3703032B1 (de), status Active
Also Published As
| Publication number | Publication date |
|---|---|
| EP3703032C0 (de) | 2025-08-06 |
| EP3703032A1 (de) | 2020-09-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11772489B2 (en) | Visually obstructed object detection for automated vehicle using V2V/V2I communications | |
| US7994902B2 (en) | Cooperative sensor-sharing vehicle traffic safety system | |
| EP3208165B1 (de) | Fahrzeugsicherheitsassistenzsystem | |
| KR102797059B1 (ko) | 차량 및 그 제어 방법 | |
| EP2302412B1 (de) | System und Verfahren zur Beurteilung einer Frontalzusammenstoßdrohung eines Automobils | |
| US9858817B1 (en) | Method and system to allow drivers or driverless vehicles to see what is on the other side of an obstruction that they are driving near, using direct vehicle-to-vehicle sharing of environment data | |
| US9260059B2 (en) | False warning reduction using location data | |
| US6442484B1 (en) | Method and apparatus for pre-crash threat assessment using spheroidal partitioning | |
| US8229663B2 (en) | Combined vehicle-to-vehicle communication and object detection sensing | |
| EP3044772B1 (de) | Erfassung eines objekts durch verwendung einer 3d-kamera und eines radars | |
| US9020728B2 (en) | Vehicle turn monitoring system and method | |
| US11091173B2 (en) | Driving safety enhancing system and method for making or enabling highly accurate judgment and providing advance early warning | |
| US20070150196A1 (en) | Method for detecting or predicting vehicle cut-ins | |
| KR20190101909A (ko) | 위험물 감지를 위한 차량용 레이더 시스템 | |
| CN110505631A (zh) | 利用到达角进行恶意无线安全消息的检测 | |
| CN108340866A (zh) | 防撞系统和方法 | |
| EP3703032B1 (de) | Collaborative safety for occluded objects | |
| CN107009959B (zh) | 用于向驾驶员提供警告的方法和警告系统 | |
| US12060065B2 (en) | Vehicle control system | |
| US11001200B2 (en) | Vehicle occupant warning system | |
| WO2016126318A1 (en) | Method of automatically controlling an autonomous vehicle based on cellular telephone location information | |
| CN108831189A (zh) | 一种基于毫米波雷达防撞的智能预警方法 | |
| US11724692B2 (en) | Detection, warning and preparative action for vehicle contact mitigation | |
| CN115376363B (zh) | 用于控制运载工具的系统、方法 | |
| CN210617998U (zh) | 一种用于货运和客运车辆的盲区检测设备 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the application has been published |
| | AK | Designated contracting states | Kind code: A1; designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the European patent | Extension states: BA ME |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: request for examination was made |
| 2021-02-26 | 17P | Request for examination filed | |
| | RBV | Designated contracting states (corrected) | Designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ARRIVER SOFTWARE AB |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: examination is in progress |
| 2024-08-13 | 17Q | First examination report despatched | |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: QUALCOMM AUTO LTD. |
| | GRAP | Despatch of communication of intention to grant a patent | Original code: EPIDOSNIGR1 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: grant of patent is intended |
| 2025-02-28 | INTG | Intention to grant announced | |
| | GRAS | Grant fee paid | Original code: EPIDOSNIGR3 |
| | GRAA | (Expected) grant | Original code: 0009210 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the patent has been granted |
| | AK | Designated contracting states | Kind code: B1; designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | REG | Reference to a national code | GB: FG4D |
| | REG | Reference to a national code | CH: EP |
| | REG | Reference to a national code | IE: FG4D |
| | REG | Reference to a national code | DE: R096; document number: 602019073565 |
| 2025-08-21 | U01 | Request for unitary effect filed | |
| 2025-08-27 | U07 | Unitary effect registered | Designated states: AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT RO SE SI |
| 2025-12-06 | PG25 | Lapsed in a contracting state | IS: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-11-06 | PG25 | Lapsed in a contracting state | NO: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-08-06 | PG25 | Lapsed in a contracting state | HR: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-11-07 | PG25 | Lapsed in a contracting state | GR: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-08-06 | PG25 | Lapsed in a contracting state | PL: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-11-06 | PG25 | Lapsed in a contracting state | RS: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2025-08-06 | PG25 | Lapsed in a contracting state | ES: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |