EP3857530A1 - Dsrc-datenansicht der erweiterten realität - Google Patents

Dsrc-datenansicht der erweiterten realität (Augmented reality DSRC data view)

Info

Publication number
EP3857530A1
Authority
EP
European Patent Office
Prior art keywords
intersection
processor
data
vehicle
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP18786172.9A
Other languages
English (en)
French (fr)
Inventor
Jesse Aaron HACKER
Bastian Zydek
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive Systems Inc
Original Assignee
Continental Automotive Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/140,076, published as US20190244515A1
Application filed by Continental Automotive Systems Inc filed Critical Continental Automotive Systems Inc
Publication of EP3857530A1
Legal status: Pending

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/164: Centralised systems, e.g. external to vehicles
    • G08G1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Definitions

  • the invention relates generally to a system for warning a driver of a vehicle and, more particularly, to warning the driver that there may be a potential danger hidden from the driver's line of sight.
  • One general aspect includes a method for displaying an augmented image including: recording data with at least one sensor at an intersection.
  • the method also includes transmitting the data via a dedicated short range communication (DSRC) device to a second DSRC device in a vehicle proximate to the intersection; analyzing the data with a processor to determine the location of an object proximate to the intersection; and augmenting an image by bounding a portion of an image of the intersection, where the coordinates of the bounded portion correspond to a location of the object in the intersection.
  • the method also includes displaying the augmented image on a display in view of a vehicle operator when at least one vulnerable road user is crossing the intersection.
  • One general aspect includes an augmented visualization system for a vehicle including a communication device in a vehicle proximate to an intersection configured to receive data from at least one sensor at an intersection.
  • a processor configured with instructions for: analyzing the data to determine the location of an object proximate to the intersection; and augmenting an image by bounding a portion of an image of the intersection, where the coordinates of the bounded portion correspond to a location of the object in the intersection.
  • the augmented visualization system also includes a display in view of a vehicle operator, where the augmented image is shown.
  • Another general aspect includes an intersection monitoring system including: at least one sensor at an intersection; an intersection processor configured with instructions for: analyzing the data from the at least one sensor to determine the location of an object proximate to the intersection and determining coordinates of the location of the object in the intersection; and a first communication device configured to broadcast data from the processor to at least one second communication device proximate to the intersection such that an image for a display may be augmented by bounding a portion of the image which corresponds to the coordinates of the location of the object in the intersection.
  • FIG. 1 is a perspective view of a traffic intersection having a warning system being part of an infrastructure component, according to embodiments of the present invention
  • FIG 2A is a schematic illustration of a vehicle having a first embodiment of an augmented visualization system, according to embodiments of the present invention.
  • FIG 2B is a schematic illustration of an exemplary display screen of the first embodiment of the augmented visualization system, according to embodiments of the present invention.
  • FIG 2C is a perspective view of an object detected by a remote sensor and displayed by the augmented visualization system, according to embodiments of the present invention.
  • FIG 4 is a schematic illustration of a vehicle having a second embodiment of an augmented visualization system, according to embodiments of the present invention.
  • FIG 5 is a flow diagram of an exemplary arrangement of operations for operating the augmented visualization system.
  • an intersection monitoring system 10 may provide intelligent intersections 20 which may be enabled with a communication device 24, such as a dedicated short range communication (DSRC) device.
  • the intersection monitoring system 10 may be for detecting objects 14 including vehicles and vulnerable road users proximate to the intersection 20, and broadcasting information about them as a basic safety message (BSM) to another communication device 26.
  • the first communication device 24 may be part of the intersection monitoring system 10 or may be part of another vehicle or smart device proximate to the intersection 20.
  • the information broadcast by the first communication device 24 may be received by a second communication device 26, possibly another DSRC device, in communication enabled vehicles 14a, allowing the communication enabled vehicles 14a to warn their drivers of various situations which may be potentially dangerous.
  • an augmented vehicle 14b uses an augmented visualization system 12 equipped to visually alert a driver to a potential danger.
  • Because emergency brake assist (EBA) systems have sensors that may accurately determine the location, speed, and direction of objects (pedestrians, cyclists, etc.) and may be equipped with V2X technologies to communicate with smart city infrastructure, key information may be shared to allow for localized warnings. Additionally, this information may be used by the augmented visualization system 12 to determine when to alert a driver to a potential danger.
  • FIG. 1 illustrates an intersection monitoring system 10.
  • the intersection monitoring system 10 is associated with an intersection 20 which includes some type of infrastructure component 21, which in this embodiment is a post, having at least one sensor 22 and at least one first communication device 24.
  • the intersection monitoring system 10 may also have a warning device. While in this embodiment the infrastructure component 21 is the post, it is within the scope of the invention that the intersection monitoring system 10 and warning system may include any other type of infrastructure component 21, such as a building, bridge, parking structure, support structure, or the like.
  • the communication device 24 is enabled with dedicated short range communication (DSRC) to share information sensed by the at least one sensor 22 by broadcasting it to vehicles 14, 14a, 14b proximate to the intersection or to other devices capable of receiving such a communication, such as a smart phone.
  • the term "proximate" may be interpreted according to known dictionary definitions, according to other definitions known by those skilled in the art, as being within a distance to receive the communication from the first communication device 24, or as being within a physical distance of the intersection 20 predetermined for the intersection monitoring system 10.
  • While the sensor 22 and communication device 24 are integrated into a single component in this embodiment, the sensor 22 and communication device 24 may be separate components in different locations, or multiple types of sensors 22 may be linked to one communication device 24.
  • the sensor 22 in this embodiment is able to detect objects in a detection area, shown generally at 22A.
  • the sensor 22 is a long-range radar sensor 22, but it is within the scope of the invention that other types of sensors may be used, such as, but not limited to, long-range radar, short-range radar, LIDAR (Light Imaging, Detection, and Ranging), LADAR (Laser Imaging, Detection, and Ranging), other types of radar, a camera, ultrasound, or sonar.
  • the sensor 22 is able to detect the location, as well as the speed and direction, of each object 14, including the location, speed, and direction of vehicles and pedestrians 14. While there are two objects/pedestrians 14 which are walking in the example shown in Figure 1, it is within the scope of the invention that the sensor 22 is able to detect whether each is walking or traveling by bicycle, scooter, skateboard, rollerblades, or the like, and may be able to detect many more objects and vehicles 14.
  • the first communication device 24 broadcasts the information to any communication enabled objects/vehicles 14a having a second communication device 26, such as a common DSRC device or otherwise able to receive the information.
  • the augmented visualization system 12 also includes a visualization system processor 18.
  • the visualization system processor 18 may include at least one of a microprocessor, a microcontroller, an application specific integrated circuit ("ASIC"), a digital signal processor, etc., as is readily appreciated by those skilled in the art.
  • the visualization system processor 18 is capable of performing calculations, executing instructions (i.e., running a program), and otherwise manipulating data as is also appreciated by those skilled in the art.
  • the intersection monitoring system 10 also has a processor 16.
  • the processor 16 is in communication with the at least one sensor 22. As such, the processor 16 may receive data from the various sensors 22.
  • the processor 16 is configured to determine various characteristics of the object 14 based on the data provided by the sensors 22. These characteristics include, but are not limited to, type of object 14 (e.g., motorcycle, truck, pedestrian, car, etc.), size of each object 14, position of each object 14, weight of each object 14, travel speed of each object 14, acceleration of each object 14, and heading for each object 14.
  • the processor 16 is also configured to estimate the trajectory for each object 14. This estimation is calculated based on at least one of the speed, acceleration, and heading for each object 14. That is, the processor 16 is configured to estimate potential future locations of the object 14 based on current and past location, speed, and/or acceleration (a simple constant-acceleration extrapolation of this kind is sketched after this definitions list).
  • the communication device 24 associated with the sensor 22 and processor 16 then broadcasts the information to the area proximate to the intersection 20, where it may be received by all vehicles having a second DSRC/communication device 26 or another communication device able to receive the information (one possible layout of such a broadcast payload is sketched after this definitions list).
  • the intersection processor 16 and/or visualization system processor 18 are configured to predict a possibility that the object 14 is not seen by the driver of the vehicle 14a, 14b and, thus, that there is a potential danger of collision or accident present. This probability is based, at least in part, on the estimated trajectory for each object 14 that was received from the intersection monitoring system 10. The probability may be a number corresponding to a likelihood of collision based on various factors including the potential future locations of the object 14 (a minimal scoring example appears after this definitions list).
  • the processor 16 may have access to information regarding traffic signals (not shown) at the intersection 20.
  • the communications may be achieved, for example, by vehicle-to-vehicle ("V2V") techniques and/or vehicle-to-X ("V2X") techniques.
  • the processor 16 may be in communication with a signal controller (not shown) to determine the state of the various traffic signals (e.g., "green light north and southbound, red light east and westbound", etc.).
  • the processor 16 may determine the state of the traffic signals based on data provided by the sensors 22. This information can be included in the broadcast from the DSRC 24 to vehicles 14a in the vicinity of the intersection 20.
  • the vehicle processor 18 may then utilize the information regarding traffic signals in predicting the probability for a collision between objects 14.
  • the images and other data from the intersection monitoring system 10 are sent from the DSRC 24 to the vehicle/second DSRC 26.
  • the intersection processor 16 and/or visualization system processor 18 uses the data to determine there is at least one object 14 that possibly cannot be seen, or can be seen but is a potential danger to which the driver's attention should be directed.
  • the vehicle 14b has a user interface 30 for the augmented visualization system 12, including at least one type of display 32.
  • the augmented visualization system 12 displays an image 34 on the display 32.
  • the user interface 30 and display 32 may include a screen, a touch screen, a heads-up display, a helmet visor, a phone display, a windshield, etc.
  • the image 34 may be one captured by an on-vehicle camera 28, as shown in Fig 2B, or may be from a camera 22 that is acting as a sensor for the intersection monitoring system, as shown in Fig 2C.
  • the augmented visualization system 12 provides a graphic overlay 36 to highlight and direct the driver's attention to the location of the detected object 14, such as a bounded area 36 of the image 34 around the portion which corresponds to the obstructed object 14 (a pinhole-projection sketch of mapping object coordinates to such a bounded area appears after this definitions list). That way the driver of the augmented vehicle 14b is alerted to a potential danger and can take action to minimize the risk of collision or accident.
  • the object 14 posing the potential danger can be a pedestrian about to use the cross-walk, as shown in the Figures, or another type of potential danger. For example, other situations may involve approaching or turning vehicles that are blocked from view by other vehicles or buildings, etc.
  • One skilled in the art would be able to determine possible situations when a driver may be unable to view, or may have difficulty viewing, objects that may be sensed by sensors 22 that are remote from the vehicle but in the area of an intersection 20.
  • the value of the DSRC data is increased by effectively informing the driver of a vehicle via the human machine interface 30.
  • the driver will see an overlay 36 of his field of view with object data from DSRC objects.
  • HUDs 32 integrated into vehicles 14b would show a similar visualization.
  • the augmented visualization system 12 could also be implemented into a bicyclist helmet or motorcycle helmet, as well as in smart glass windscreens 14b as shown in Fig 3, where the overlay of the bounded area 36 is added on the windscreen 32 through which the driver of vehicle 14b is looking.
  • the augmented visualization system 12 allows for far greater spatial perception by the driver and awareness of the data.
  • the augmented visualization system 12 scales with the real-life view and leaves far less about a driving scenario open to interpretation.
  • a sensor 22 at an intersection monitoring system 10 records data and/or images, at block 202.
  • An intersection communication device 24 sends data/images to a second communication device 26 (such as a DSRC device in a vehicle 14a), step 204.
  • An intersection processor 16 and/or augmented visualization system processor 18 for the augmented visualization system 12 analyzes the data and identifies an object 14 as a potential danger, either before or after the information is sent, at 206.
  • the intersection processor 16 and/or visualization system processor 18 determines the coordinates of the object's location within an image 34, at 208. The location coordinates of the object are bounded 36 in the image 34 to highlight the location of the potential danger, at block 210.
  • the augmented image 34 is displayed on a display 32 within view of the driver of the vehicle 14b, step 212 (the overall flow of blocks 202-212 is condensed into a short pipeline sketch after this definitions list).
  • the intersection processor 16 and/or visualization system processor 18 identifies detected objects 14 in the image 34 which create areas of obstructed view.
  • the intersection processor 16 and/or visualization system processor 18 can then identify objects 14 that are behind other objects 14 based on the data from the sensors 22 (a simple line-of-sight test of this kind is sketched after this definitions list).
  • a truck is parked in the road.
  • the bounded area 36 obstructed by the obstacle 14 is shown in shading for illustrative purposes in Figures 2B, 2C and 3, but would not be displayed on the display 32.
  • the processor 16 determines if the object 14 can be seen; an object that can be seen may be illustrated to the driver by a first bounding color 36a, e.g. green, as shown in Fig 2C.
  • an object that cannot be seen may be illustrated to the driver by a second bounding color 36b, e.g. red, as shown in Fig 2B.
  • a different pattern of bounding can also be displayed, as shown, e.g. cross-hatching vs. solid highlighting (a small helper choosing the overlay colour and pattern is sketched after this definitions list).
  • While the location and trajectory information is disclosed as being processed by the intersection monitoring system 10, and the potential danger probability and image processing is described as being completed by the intersection processor 16 and/or visualization system processor 18, other processors may perform the described method in its entirety or in a different combination of processing than illustrated in the example. One skilled in the art would be able to determine which steps should be assigned to which processor 16, 18.
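
To make the data flow from the intersection monitoring system 10 to nearby vehicles more concrete, the following Python sketch shows one possible per-object record and broadcast frame that the intersection processor 16 could hand to the first communication device 24. It is a minimal illustration, not the SAE J2735 BSM wire format; every field name, unit, and the JSON serialisation are assumptions made here rather than details taken from the description.

```python
# Minimal sketch of the kind of per-object record the intersection processor 16
# could assemble from sensor 22 data and hand to the first communication device
# 24 for broadcast. NOT the SAE J2735 BSM format; all fields are illustrative.
from dataclasses import dataclass, asdict, field
import json
import time

@dataclass
class DetectedObject:
    object_id: int
    object_type: str       # e.g. "pedestrian", "bicycle", "car", "truck"
    x_m: float             # position in an intersection-local frame, metres
    y_m: float
    speed_mps: float
    heading_deg: float     # 0 = north, clockwise positive
    size_m: float          # rough footprint radius, useful for occlusion checks

@dataclass
class IntersectionBroadcast:
    intersection_id: str
    timestamp_s: float
    signal_state: str      # e.g. "NS_green_EW_red"
    objects: list = field(default_factory=list)

def build_broadcast(intersection_id: str, signal_state: str, objects) -> str:
    """Serialise one broadcast frame as JSON for the DSRC radio 24 to send."""
    frame = IntersectionBroadcast(
        intersection_id=intersection_id,
        timestamp_s=time.time(),
        signal_state=signal_state,
        objects=[asdict(o) for o in objects],
    )
    return json.dumps(asdict(frame))

if __name__ == "__main__":
    ped = DetectedObject(1, "pedestrian", x_m=3.2, y_m=-1.5,
                         speed_mps=1.4, heading_deg=90.0, size_m=0.5)
    print(build_broadcast("intersection-20", "NS_green_EW_red", [ped]))
```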
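
The trajectory estimation described for the processor 16 can be illustrated with a simple constant-acceleration extrapolation. This is only one plausible reading of estimating "potential future locations ... based on current and past location, speed, and/or acceleration"; the prediction horizon, time step, and heading convention below are assumptions.

```python
# A minimal constant-acceleration extrapolation of the object's future
# positions; horizon, step and heading convention are illustrative assumptions.
import math

def estimate_trajectory(x_m, y_m, speed_mps, heading_deg,
                        accel_mps2=0.0, horizon_s=3.0, step_s=0.5):
    """Return a list of (t, x, y) points predicting where the object may be."""
    heading_rad = math.radians(heading_deg)
    ux, uy = math.sin(heading_rad), math.cos(heading_rad)  # unit direction, 0 deg = north
    points = []
    t = step_s
    while t <= horizon_s + 1e-9:
        d = speed_mps * t + 0.5 * accel_mps2 * t * t  # distance travelled by time t
        points.append((t, x_m + ux * d, y_m + uy * d))
        t += step_s
    return points

if __name__ == "__main__":
    # Pedestrian at (3.2, -1.5) walking east at 1.4 m/s
    for t, x, y in estimate_trajectory(3.2, -1.5, 1.4, 90.0):
        print(f"t={t:.1f}s -> ({x:.2f}, {y:.2f})")
```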
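
The description does not give a formula for the "probability ... corresponding to a likelihood of collision", so the sketch below simply converts the minimum predicted separation between the vehicle's and the object's estimated trajectories into a score between 0 and 1. The near_m and far_m thresholds are invented for illustration; a real system could additionally weight the score by the traffic signal state received in the broadcast.

```python
# One possible way (an assumption, not the patent's formula) to turn two
# estimated trajectories into a rough "potential danger" score in [0, 1].
def danger_probability(vehicle_traj, object_traj, near_m=2.0, far_m=15.0):
    """vehicle_traj / object_traj: lists of (t, x, y) on the same time grid."""
    min_sep = min(
        ((vx - ox) ** 2 + (vy - oy) ** 2) ** 0.5
        for (_, vx, vy), (_, ox, oy) in zip(vehicle_traj, object_traj)
    )
    if min_sep <= near_m:
        return 1.0
    if min_sep >= far_m:
        return 0.0
    return (far_m - min_sep) / (far_m - near_m)  # linear ramp between thresholds

if __name__ == "__main__":
    # Vehicle driving north toward the crossing, pedestrian walking west across it.
    vehicle = [(k * 0.5, 0.0, -20.0 + 8.0 * k * 0.5) for k in range(1, 7)]
    walker  = [(k * 0.5, 3.2 - 1.4 * k * 0.5, 0.0) for k in range(1, 7)]
    print(f"danger score: {danger_probability(vehicle, walker):.2f}")
```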
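
The description notes that the processors can identify objects 14 that are behind other objects 14 (for example, behind the parked truck). One simple way to do that, assumed here rather than taken from the patent, is a 2-D line-of-sight test: the target counts as occluded if the segment from the driver's viewpoint to the target passes within an obstacle's approximate radius.

```python
# A simple 2-D line-of-sight test for deciding whether an object is hidden from
# the driver's viewpoint behind another detected object such as a parked truck.
import math

def is_occluded(viewer, target, obstacle_center, obstacle_radius_m):
    """All positions are (x, y) in the same local frame, in metres."""
    vx, vy = target[0] - viewer[0], target[1] - viewer[1]
    ox, oy = obstacle_center[0] - viewer[0], obstacle_center[1] - viewer[1]
    seg_len = math.hypot(vx, vy)
    if seg_len == 0.0:
        return False
    # Projection of the obstacle centre onto the viewer -> target segment.
    t = (ox * vx + oy * vy) / (seg_len * seg_len)
    if not 0.0 < t < 1.0:
        return False          # obstacle is not between viewer and target
    closest_x, closest_y = t * vx, t * vy
    dist = math.hypot(ox - closest_x, oy - closest_y)
    return dist <= obstacle_radius_m

if __name__ == "__main__":
    driver = (0.0, -20.0)
    pedestrian = (3.0, 2.0)
    parked_truck = (2.0, -8.0)
    print(is_occluded(driver, pedestrian, parked_truck, obstacle_radius_m=1.5))
```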
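
Bounding "a portion of the image which corresponds to the coordinates of the object" requires mapping a position in the camera frame to pixel coordinates. The sketch below uses an ideal pinhole camera with assumed intrinsics (fx, fy, cx, cy for a 1280x720 image); the real calibration, whether of the on-vehicle camera 28 or the intersection camera 22, is not specified in the description.

```python
# Minimal pinhole-camera sketch (camera parameters are assumptions) mapping an
# object's position in the camera frame to the bounded region of the image 34.
def bound_object(x_m, y_m, depth_m, width_m, height_m,
                 fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Camera frame: x right, y down, depth forward.
    Returns (left, top, right, bottom) in pixels for the bounded area 36."""
    if depth_m <= 0:
        raise ValueError("object must be in front of the camera")
    u = cx + fx * x_m / depth_m          # horizontal pixel of the object's base
    v = cy + fy * y_m / depth_m          # vertical pixel of the object's base
    half_w_px = 0.5 * fx * width_m / depth_m
    h_px = fy * height_m / depth_m
    return (u - half_w_px, v - h_px, u + half_w_px, v)

if __name__ == "__main__":
    # Pedestrian 1.7 m tall, 0.5 m wide, 2 m to the right, 15 m ahead,
    # feet roughly 1.2 m below the camera's optical axis.
    print(bound_object(x_m=2.0, y_m=1.2, depth_m=15.0, width_m=0.5, height_m=1.7))
```

The returned rectangle is what an image library or the HUD renderer would then outline as the bounded area 36 on the display 32.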
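
The colour and pattern choice for the overlay 36 reduces to a small lookup. Green and red follow the description's examples for the first and second bounding colours 36a/36b; the hex values and pattern names below are assumptions.

```python
# Tiny helper choosing how the bounded area 36 is rendered; colour examples
# follow the description, the hex values and pattern names are assumptions.
def overlay_style(visible_to_driver: bool, potential_danger: bool):
    """Return (colour, fill_pattern) for the bounded area 36, or None."""
    if not potential_danger:
        return None                      # nothing to highlight
    if visible_to_driver:
        return ("#00ff00", "solid")      # first bounding colour 36a, e.g. green
    return ("#ff0000", "cross-hatch")    # second bounding colour 36b, e.g. red

if __name__ == "__main__":
    print(overlay_style(visible_to_driver=False, potential_danger=True))
```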
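
Finally, the flow of operations at blocks 202-212 can be read as a short pipeline. The helper bodies below are trivial stand-ins (the single pedestrian record and the pixel box are hard-coded for the example); the sketch only shows how the recording, broadcasting, analysis, bounding, and display steps chain together.

```python
# Illustrative condensation of blocks 202-212; helper bodies are stand-ins.
def record_sensor_data():                     # block 202: sensor 22 records data
    return [{"id": 1, "type": "pedestrian", "x_m": 3.2, "y_m": -1.5,
             "speed_mps": 1.4, "heading_deg": 90.0}]

def broadcast(data):                          # block 204: DSRC 24 -> DSRC 26
    return data                               # stand-in for the radio link

def identify_potential_dangers(objects):      # block 206: analyse / classify
    return [o for o in objects if o["type"] == "pedestrian"]

def locate_in_image(obj):                     # block 208: object -> image coordinates
    return (733, 333, 760, 424)               # assumed pixel box for the example

def bound_in_image(image, box):               # block 210: add the bounded area 36
    image["overlays"].append(box)
    return image

def show(image):                              # block 212: display 32 in driver's view
    print("displaying image with overlays:", image["overlays"])

if __name__ == "__main__":
    received = broadcast(record_sensor_data())
    image = {"overlays": []}
    for obj in identify_potential_dangers(received):
        image = bound_in_image(image, locate_in_image(obj))
    show(image)
```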

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
EP18786172.9A 2018-09-24 2018-12-10 Dsrc-datenansicht der erweiterten realität Pending EP3857530A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/140,076 US20190244515A1 (en) 2017-09-25 2018-09-24 Augmented reality dsrc data visualization
PCT/US2018/052649 WO2019060891A1 (en) 2017-09-25 2018-12-10 AUGMENTED REALITY DATA VISUALIZATION

Publications (1)

Publication Number Publication Date
EP3857530A1 true EP3857530A1 (de) 2021-08-04

Family

ID=71197860

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18786172.9A Pending EP3857530A1 (de) 2018-09-24 2018-12-10 Dsrc-datenansicht der erweiterten realität

Country Status (3)

Country Link
EP (1) EP3857530A1 (de)
JP (1) JP2021535519A (de)
CN (1) CN111357039A (de)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3214122B2 (ja) * 1993-01-19 2001-10-02 Mitsubishi Electric Corp. Dangerous situation warning device
DE102014219567A1 (de) * 2013-09-30 2015-04-02 Honda Motor Co., Ltd. Three-dimensional (3-D) navigation
US9588340B2 (en) * 2015-03-03 2017-03-07 Honda Motor Co., Ltd. Pedestrian intersection alert system and method thereof
JP6563798B2 (ja) * 2015-12-17 2019-08-21 National Institutes of Natural Sciences Visual cognition support system and system for detecting visually recognized objects
JP2017220030A (ja) * 2016-06-07 2017-12-14 Aisin AW Co., Ltd. Travel information providing system and travel information providing program
CN108538084B (zh) * 2017-03-01 2021-01-08 Audi AG Visual reminder device, vehicle, and visual reminder method

Also Published As

Publication number Publication date
JP2021535519A (ja) 2021-12-16
CN111357039A (zh) 2020-06-30

Similar Documents

Publication Publication Date Title
US20190244515A1 (en) Augmented reality dsrc data visualization
US10906456B2 (en) Communicating the intention of a vehicle to another road user
JP6635428B2 (ja) Vehicle surroundings information display system
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
JP4967015B2 (ja) Safe driving support device
JP6149846B2 (ja) Attention-calling device
JP5278292B2 (ja) Information presentation device
JP6084598B2 (ja) Sign information display system and sign information display method
US8350686B2 (en) Vehicle information display system
CN109733283B (zh) AR-based recognition and early-warning system and method for occluded obstacles
CN112771592B (zh) Method for warning the driver of a motor vehicle, control device, and motor vehicle
JP4311426B2 (ja) Display system for displaying a moving object, in-vehicle apparatus, and display method
JP2007323556A (ja) Vehicle surroundings information notification device
WO2014185042A1 (ja) Driving support device
CN112758013A (zh) Display device and display method for a vehicle
JP2010146459A (ja) Driving support device
KR20200142571A (ko) Method and device for operating a camera monitor system for a motor vehicle
CN111601279A (zh) Method and in-vehicle system for displaying a dynamic traffic situation on an in-vehicle display
JP2005242526A (ja) Vehicle hazard information presentation system and display device therefor
JP6136564B2 (ja) Vehicle display device
CN114312771A (zh) Detection, warning, and preparatory actions for vehicle contact mitigation
EP2797027A1 (de) Vehicle driver alarm arrangement, vehicle, and method for warning a vehicle driver
CN116935695A (zh) Collision warning system for a motor vehicle with an augmented reality head-up display
JP5354193B2 (ja) Vehicle driving support device
CN114523905A (zh) System and method for displaying object detection and trajectory prediction around a vehicle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210426

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS