CN114523905A - System and method for displaying detection and track prediction of targets around vehicle - Google Patents

System and method for displaying detection and track prediction of targets around vehicle Download PDF

Info

Publication number
CN114523905A
CN114523905A (Application No. CN202210258687.3A)
Authority
CN
China
Prior art keywords
vehicle
prediction
module
driver
detection
Prior art date
Legal status
Pending
Application number
CN202210258687.3A
Other languages
Chinese (zh)
Inventor
支蓉
王宝锋
Current Assignee
Mercedes Benz Group AG
Original Assignee
Mercedes Benz Group AG
Priority date
Filing date
Publication date
Application filed by Mercedes Benz Group AG filed Critical Mercedes Benz Group AG
Priority to CN202210258687.3A priority Critical patent/CN114523905A/en
Publication of CN114523905A publication Critical patent/CN114523905A/en
Priority to DE102023000838.9A priority patent/DE102023000838A1/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/22
    • B60K35/29
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • B60K2360/166
    • B60K2360/171
    • B60K2360/176
    • B60K2360/179
    • B60K2360/188
    • B60K2360/334
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display

Abstract

A system for displaying object detection and trajectory prediction around a vehicle, the system comprising at least: a detection module having at least one sensor for detecting and locating objects around the vehicle; a prediction module capable of making predictions about the objects around the vehicle; and a display module capable of displaying data relating to the objects around the vehicle together with the prediction results of the prediction module, wherein the information presented by the display module can be selected autonomously by the driver. Also disclosed is a method for displaying objects around a vehicle, the method comprising at least the steps of: detecting and locating objects around the vehicle; making predictions about the objects based on their detection and location; and displaying the prediction results to the driver, wherein the displayed information can be selected autonomously by the driver.

Description

System and method for displaying detection and track prediction of targets around vehicle
Technical Field
The invention provides a system and a method for displaying the detection results for objects around a vehicle and giving a trajectory prediction for those objects, in particular on a HUD (Head-Up Display). The system not only provides the driver with the type, bearing and distance of objects around the vehicle, but also indicates the objects' future trajectories to the driver. The system and method present processed information to the driver and omit irrelevant or unimportant environmental information, so as to better assist the driver's judgment in different driving scenarios.
Background
In recent years, with the development of vehicle and traffic intelligence, demands on driving safety have risen, and ever more vehicles are equipped with intelligent driver-assistance systems, such as forward-collision warning, automatic emergency braking, blind-spot detection, surround-view parking assistance, and surround-view dashcams.
However, the prior art has shortcomings in displaying objects around a vehicle. Some approaches rely on V2X information and therefore require assistance from surrounding vehicles or road infrastructure. In addition, it is common in the art to display only the driving blind spot rather than comprehensive environmental information around the vehicle. Some prior art displays comprehensive environmental information but cannot predict the trajectories of moving objects; other prior art predicts those trajectories but displays information so cluttered that it greatly distracts the driver.
Disclosure of Invention
To address these problems, the invention provides a system and a method for displaying object detection and trajectory prediction around a vehicle. The method according to the invention not only provides the driver with the type, bearing and distance of objects around the vehicle, but also indicates the future trajectories of those objects to the driver. The system and method present processed information to the driver and omit irrelevant or unimportant environmental information, so as to better assist the driver's judgment in different driving scenarios.
According to a first aspect of the present invention, a system for displaying object detection and trajectory prediction around a vehicle is provided, the system comprising at least:
a detection module having at least one sensor for detecting and locating objects around a vehicle;
a prediction module capable of predicting based on an object around a vehicle;
a display module capable of displaying data related to objects around a vehicle and a prediction result of the prediction module, wherein information presented by the display module is capable of being autonomously selected by a driver.
Within the framework of the invention, the objects around the vehicle comprise at least other vehicles, obstacles, pedestrians, non-motor vehicles and the like, wherein the obstacles include traffic facilities, green belts and urban infrastructure on or beside the road.
Within the framework of the invention, the prediction module may be integrated in a vehicle control system, which refers to a control device or a suitably configured set of multiple control devices, for example comprising an application-specific integrated circuit, one or more processors, and a non-transitory memory storing instructions. The vehicle control system is, for example, an Electronic Control Unit (ECU), also commonly referred to as a "trip computer" or "on-board computer". It is also conceivable for the prediction module to be a separate module dedicated to the prediction computation.
Within the framework of the invention, "autonomously selected by the driver" means that the driver can choose the information presented on the display module according to his own needs. For example, the driver can choose to highlight objects approaching at a relative speed exceeding a certain threshold. This covers at least two cases: first, the object itself approaches at high speed; second, the object is stationary but the vehicle equipped with the system according to the invention is driving toward it at high speed. The driver can also choose not to display certain objects, for example green belts, garbage cans and lane lines on both sides of the road can be hidden as the driver requires. This selection should be made before driving the vehicle.
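As a minimal sketch (not part of the patent; the class name, field names and threshold value are illustrative assumptions), the driver's selection could be modeled as a filter over the detected objects, where a single closing speed covers both cases described above:

```python
from dataclasses import dataclass

@dataclass
class Target:
    kind: str             # e.g. "pedestrian", "vehicle", "greenbelt"
    closing_speed: float  # m/s; positive when the gap to the ego vehicle is shrinking

def select_highlights(targets, threshold, hidden_kinds=frozenset()):
    # Highlight targets closing faster than the driver-chosen threshold,
    # excluding kinds the driver chose not to display. A positive
    # closing_speed arises both when the target approaches the ego
    # vehicle and when the ego vehicle drives toward a stationary target.
    return [t for t in targets if t.kind not in hidden_kinds
            and t.closing_speed > threshold]

scene = [Target("pedestrian", 1.2), Target("vehicle", 9.0), Target("greenbelt", 8.0)]
picked = select_highlights(scene, threshold=5.0, hidden_kinds={"greenbelt"})
```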
Through the driver's autonomous selection, the presented information can be personalized so that the displayed content is not overly cluttered and safety-critical information is not overlooked; the driver can thus judge and react quickly, which improves driving safety.
According to a preferred embodiment of the invention, said at least one sensor comprises at least one of the following: a vehicle-mounted camera, a millimeter-wave radar, an ultrasonic radar, a lidar. Within the framework of the invention, a plurality of sensors are preferably arranged at the front and rear of the vehicle body and on both sides of it, for example one millimeter-wave radar and one camera each at the front and rear, and two millimeter-wave radars and one camera on each side. Naturally, other kinds and numbers of sensors are also conceivable, depending on the design requirements of the vehicle.
According to a preferred embodiment of the present invention, the detection module is capable of obtaining at least the type, size and movement speed of an object from the detection results of the at least one sensor. For example, after receiving the sensors' detection signals, the detection module analyzes them and classifies the objects, in particular according to their contours and/or movement speed, for example into passenger cars, commercial vehicles, traffic facilities, pedestrians, non-motor vehicles, urban infrastructure, and so on. Within the framework of the invention, the detection module is furthermore preferably also able to acquire vehicle driving data via a steering-wheel angle sensor and wheel-speed sensors.
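The classification step might be sketched as a toy rule-based classifier. This is an illustration only: the contour is reduced to a single bounding-box length, and all threshold values are invented assumptions, not taken from the patent:

```python
def classify(length_m, speed_mps):
    # Toy classifier over contour size (bounding-box length) and movement
    # speed; thresholds are illustrative, not from the patent.
    if speed_mps < 0.1 and length_m < 1.5:
        return "traffic facility"   # small and stationary
    if length_m < 1.0:
        return "pedestrian"
    if length_m < 2.5:
        return "non-motor vehicle"
    if length_m < 6.0:
        return "passenger car"
    return "commercial vehicle"
```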
According to a preferred embodiment of the invention, the prediction result comprises at least a motion trajectory and/or an expected speed of the object. Within the framework of the invention, the display module can display the relevant data and prediction results for objects around the vehicle, and the driver can select the display mode as needed, for example whether to display the predicted trajectory, the duration and color of the predicted trajectory, the number of environment-perception results, the types of objects, and so on. The driver can make this selection via a control panel provided on the display module, which may be, for example, a button on the vehicle's central console or a control interface on its touch screen. It is also conceivable for the control panel to be provided on a mobile device connected to the vehicle, in particular in an application on the mobile device. The connection between the mobile device and the vehicle can be wired or wireless, for example via cable, Bluetooth, ZigBee or WiFi, as familiar to those skilled in the art. The mobile device may be a tablet computer, a mobile phone, a notebook computer, and so on. The mobile device can also be connected to the vehicle indirectly, for example via a remote server, whereby the display settings can be modified remotely.
According to the invention, the display module can also be provided with a sound module which, in an emergency, sounds an alarm for the driver in addition to the visual alert.
Within the framework of the invention, the driver can, for example, choose to render objects approaching at a speed exceeding a first relative speed in red, objects approaching at a speed exceeding a second relative speed but below the first in yellow, and objects that are stationary or relatively stationary with respect to the vehicle in green. The first and second relative speeds can be fixed, but it is also conceivable for them to be variable, in particular dependent on the vehicle speed. Different settings for different classes of objects are also conceivable, for example objects that are themselves stationary (e.g. guardrails, green belts) always appear green, while moving objects (e.g. pedestrians, non-motor vehicles, other vehicles) appear in different colors depending on their relative speed. Settings may also depend on the object's position: for example, an object behind and far from the driver is always displayed in green, an object directly in front of and near the driver in red, and an object ahead of or to the side of the driver in yellow.
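The two-threshold color scheme could be sketched as follows (a hedged illustration; the function name is an assumption, and the thresholds could equally be made dependent on the vehicle speed as the text suggests):

```python
def target_color(closing_speed, v1, v2):
    # v1: first relative-speed threshold, v2: second threshold (v1 > v2).
    # Red above v1, yellow between v2 and v1, green when stationary or
    # relatively stationary with respect to the ego vehicle.
    if closing_speed > v1:
        return "red"
    if closing_speed > v2:
        return "yellow"
    return "green"
```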
According to a preferred embodiment of the present invention, the system further has a fusion module which fuses the sensors' detection results using a fusion algorithm, so that the object information is merged with the road information. The fusion module can obtain the required data, such as the detected object types and speeds, from the detection module and then fuse them with map data or road data to compute the environmental situation around the driver's position. The map data can be stored in the ECU or in the fusion module's own memory; the fusion module can be integrated in the ECU or be a separately provided module dedicated to the fusion operation.
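One much-simplified stand-in for the fusion step (the patent does not specify the fusion algorithm; the lane-map structure and nearest-offset matching rule here are pure assumptions) is to attach map attributes to each detection by lateral offset:

```python
def fuse_with_map(detections, lane_map):
    # Assign each detection the nearest lane from the map data by
    # comparing the detection's lateral position with lane centerlines.
    fused = []
    for det in detections:
        lane = min(lane_map, key=lambda ln: abs(ln["center_y"] - det["y"]))
        fused.append({**det, "lane": lane["id"]})
    return fused

lanes = [{"id": "left",  "center_y": -3.5},
         {"id": "ego",   "center_y": 0.0},
         {"id": "right", "center_y": 3.5}]
result = fuse_with_map([{"kind": "passenger car", "y": 3.2}], lanes)
```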
According to a preferred embodiment of the present invention, the prediction module makes its prediction based on the data-fusion result of the fusion module, and sends both the fusion result and the prediction result to the display module. In this way, the environmental situation around the driver's position can be presented more accurately.
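Since the patent leaves the prediction model open, a constant-velocity rollout can serve as a minimal placeholder predictor (function name and parameters are illustrative assumptions):

```python
def predict_trajectory(x, y, vx, vy, horizon_s=3.0, dt=0.5):
    # Roll the current position forward under constant velocity,
    # producing the polyline of future positions shown to the driver.
    steps = int(horizon_s / dt)
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

path = predict_trajectory(0.0, 0.0, 2.0, 0.0, horizon_s=1.0, dt=0.5)
```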
Within the framework of the invention, the display module is preferably a HUD (Head-Up Display) device. HUD devices, also known as head-up displays, project information into the driver's field of view; they are typically designed to present the vehicle's driving information on the front window (windshield) so that the driver can see it while looking straight ahead. In the prior art this driving information is usually presented on the dashboard or on a central display. With a HUD device, the driver can keep his attention on the road ahead and need not divert it to the instrument panel while driving.
According to a preferred embodiment of the invention, the display module, in particular the HUD device, is able to present the environmental information around the vehicle and the data relating to the objects in a bird's-eye view. The bird's-eye view makes the environmental information around the vehicle more intuitive and helps the driver judge quickly and react accordingly.
According to a second aspect of the invention, a method for displaying objects around a vehicle is proposed, the method comprising at least the following steps:
detecting targets around a vehicle and positioning the targets;
predicting the target based on the detection and location of the target;
the prediction result is displayed to the driver, wherein the displayed information can be autonomously selected by the driver.
According to the method of the present invention, the detection step is preferably followed by a data-fusion step in which the object information is fused with road or map information; the prediction in the prediction step is then based on the fusion result, and both the fusion result and the prediction result are shown in the display step.
The method according to the invention can be carried out by means of the system according to the invention.
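The method steps above (detect and locate, fuse, predict, display with driver selection) can be sketched as one display cycle; all callables here are stand-ins, not interfaces defined by the patent:

```python
def run_display_cycle(detect, fuse, predict, wants_shown, display):
    # One cycle of the claimed method: detect/locate the targets, fuse
    # them with road information, predict each, then display only the
    # targets the driver has chosen to see.
    targets = detect()
    fused = fuse(targets)
    shown = [(t, predict(t)) for t in fused if wants_shown(t)]
    display(shown)
    return shown

frames = []
shown = run_display_cycle(
    detect=lambda: [{"kind": "pedestrian"}, {"kind": "greenbelt"}],
    fuse=lambda ts: ts,                  # identity stand-in for fusion
    predict=lambda t: "trajectory",
    wants_shown=lambda t: t["kind"] != "greenbelt",
    display=frames.append,
)
```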
Within the framework of the invention, the arrangement and function of the individual modules can also be adapted to specific design requirements, as long as the system ultimately yields object detection and trajectory prediction results for the vehicle's surroundings.
Within the framework of the present invention, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Drawings
The following schematically illustrates a system and method for displaying object detection and trajectory prediction in the vicinity of a vehicle according to the present invention by way of example.
FIG. 1 illustrates a system for displaying object detection and trajectory prediction around a vehicle in accordance with a preferred embodiment of the present invention;
Fig. 2 shows the environmental information around the vehicle and the corresponding predicted trajectories, presented on the HUD device in particular in the form of a bird's-eye view.
Detailed Description
Fig. 1 shows a system for displaying object detection and trajectory prediction around a vehicle according to a preferred embodiment of the present invention. In this preferred embodiment, the system according to the invention has a detection module, a fusion module, a prediction module and a display module.
In particular, the detection module is provided with a plurality of on-board cameras and/or millimeter-wave radars and/or ultrasonic radars and/or lidars, for example one millimeter-wave radar and one camera each at the front and rear of the vehicle body, and two millimeter-wave radars and one camera on each side. The millimeter-wave radars on the vehicle body can measure the speed of objects around the vehicle, and the cameras can capture image data of those objects; the detection module can thereby obtain information on other vehicles, obstacles, pedestrians, non-motor vehicles and the like around the vehicle, and from this information derive at least the type, size and movement speed of each object. For example, after receiving a sensor's detection signal, the detection module analyzes it and classifies the object according to its contour and/or movement speed. Within the framework of the invention, the detection module is preferably also able to acquire vehicle driving data via a steering-wheel angle sensor and wheel-speed sensors.
Next, the fusion module fuses the sensors' detection results using a fusion algorithm so that the object information is merged with the road information. The fusion module can obtain information such as object types and speeds from the detection module and then fuse it with map or road information to compute the environmental situation around the driver's position.
After the fusion operation, the prediction module makes its prediction based on the fusion module's data-fusion result and sends both the fusion result and the prediction result to the display module. In this way, the environmental situation around the driver's position can be presented more accurately.
The display module can display the relevant data and prediction results for objects around the vehicle, and the driver can select the display mode as needed, for example whether to display the predicted trajectory, the duration and color of the predicted trajectory, the number of environment-perception results, the types of objects, and so on. The driver can make this selection via a control panel provided on the display module. The control panel may also be provided elsewhere, for example on the vehicle's central console or on a mobile device connected to the vehicle.
In the system shown in Fig. 1, the driver can choose to present different objects in different colors according to his needs, for example an object posing a high danger in red, an object posing some danger in yellow, and an object that is stationary or relatively stationary with respect to the own vehicle in green. "Danger" here refers in particular to the possibility of a collision with the vehicle.
It is also conceivable for the driver to configure different classes of objects differently, for example objects that are themselves stationary (e.g. guardrails, green belts) are set to always appear green, while moving objects (e.g. pedestrians, non-motor vehicles, other vehicles) are set to appear in different colors depending on their relative speed.
Fig. 2 shows the environmental information around the vehicle, presented on the HUD device in particular in the form of a bird's-eye view, together with the corresponding predicted trajectories.
In Fig. 2, the "target vehicle" in the middle lane represents the vehicle in which the driver is located. As can be seen from Fig. 2, around the driver's position there is a large truck on the left, pedestrians about to cross the road ahead on the left, a non-motor vehicle traveling from left to right directly ahead, a passenger vehicle ahead on the right, further pedestrians about to cross the road on the right, and a non-motor vehicle moving away from the driver behind on the right. In Fig. 2, each object has its predicted trajectory next to it (shown as a series of polylines). It is conceivable to display the predicted trajectories in different colors to indicate different movement speeds, or to show the movement speed numerically next to (e.g. above or below) the predicted trajectory. Here, "movement speed" may mean either the absolute movement speed, i.e. the speed of the object relative to the ground, or the relative movement speed, i.e. the speed of the object relative to the target vehicle. When different movement speeds are represented by colors, receding can for example be shown in green and approaching in red, with the rate of receding or approaching additionally indicated by the shade of the color; when the movement speed is shown numerically, approaching can for example be indicated by a "+" before the number and receding by a "-".
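The numeric speed label described for Fig. 2, with "+" for approaching and "-" for receding, could be formatted like this (a sketch; the unit and one-decimal precision are assumptions):

```python
def speed_label(closing_speed_mps):
    # '+' marks an approaching target, '-' a receding one, per the
    # numeric display convention described for Fig. 2.
    sign = "+" if closing_speed_mps >= 0 else "-"
    return f"{sign}{abs(closing_speed_mps):.1f} m/s"
```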
In the example shown in Fig. 2, the non-motor vehicle behind on the right, which is moving away from the driver, can for example be displayed in green, or even be omitted for clarity, since it is evidently less dangerous. The pedestrians on the right and ahead on the left of the driver can for example be displayed in yellow, and the non-motor vehicle directly in front of the driver in red, because the former pose a higher danger and the latter poses the highest danger.
In the example shown in Fig. 2, the motor vehicles on the driver's left and right can be displayed differently depending on their relative speeds. The passenger vehicle ahead on the right, for example, is displayed in yellow when its speed is lower than that of the target vehicle (it is relatively approaching) and in green when its speed is higher (it is relatively receding). The display may also take the target vehicle's driving behavior into account: the passenger vehicle ahead on the right is displayed in red if it is close to the target vehicle and the target vehicle switches on its right turn indicator. A similar arrangement can be used for the large truck on the left, which is displayed in red when it is close to the target vehicle and the target vehicle switches on its left turn indicator; alternatively, if the truck remains relatively stationary with respect to the target vehicle at a distance of less than 10 meters, it can be displayed in red when the target vehicle switches on its left turn indicator and in yellow while the target vehicle keeps going straight.
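The turn-indicator-dependent rule given for the left-hand truck could be sketched as follows (the 10-meter figure is from the text; the function name and the green fallback are assumptions):

```python
def left_truck_color(distance_m, relatively_stationary, left_indicator_on):
    # Red when the ego vehicle signals a left turn while the truck stays
    # close and relatively stationary; yellow while driving straight in
    # that situation; green otherwise.
    if relatively_stationary and distance_m < 10.0:
        return "red" if left_indicator_on else "yellow"
    return "green"
```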
It is of course also conceivable to assign colors by object type, for example: pedestrians are presented in light green, green or dark green according to their degree of danger; non-motor vehicles in light yellow, yellow or dark yellow; and motor vehicles in pink, red or deep red. The chosen colors can also be changed individually according to the driver's needs.
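The type-dependent shading could be captured in a small lookup table (a sketch; the three-level risk scale is an assumption based on the three shades named in the text):

```python
# Shade deepens with danger within each object type, per the example above.
PALETTE = {
    "pedestrian":        ["light green",  "green",  "dark green"],
    "non-motor vehicle": ["light yellow", "yellow", "dark yellow"],
    "motor vehicle":     ["pink",         "red",    "deep red"],
}

def shade(kind, risk_level):
    # risk_level: 0 = low, 1 = medium, 2 = high.
    return PALETTE[kind][risk_level]
```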
The selection should be made before driving the vehicle, and the selected parameters are stored by the display module, so that repeated selection is unnecessary. Within the framework of the invention, a person skilled in the art can, following the inventive concept and without inventive effort, make the required selection of the displayed information so that objects posing a higher danger are displayed with higher priority and objects posing a lower danger with lower priority or not at all.
It should be understood that the above-described embodiments are intended as illustrative, not limiting, examples of implementations of the system for displaying object detection and trajectory prediction around a vehicle according to the present invention. Beyond the embodiments described above, a large number of variants obvious to a person skilled in the art, produced by combining the individual features of the invention, are likewise possible.

Claims (10)

1. A system for displaying object detection and trajectory prediction in the vicinity of a vehicle, said system comprising at least:
a detection module having at least one sensor for detecting and locating objects around a vehicle;
a prediction module capable of predicting based on an object around a vehicle;
a display module capable of displaying data related to objects around a vehicle and a prediction result of the prediction module, wherein information presented by the display module is capable of being autonomously selected by a driver.
2. The system of claim 1, wherein the at least one sensor comprises at least one of: vehicle-mounted camera, millimeter wave radar, ultrasonic radar, laser radar.
3. The system of any one of the preceding claims, wherein the detection module is capable of deriving at least the type, size and speed of movement of the object based on the detection of the at least one sensor.
4. The system of any one of the preceding claims, wherein the prediction result comprises at least a motion trajectory and/or an expected speed of the object.
5. The system according to any one of the preceding claims, wherein the system further has a fusion module for data fusion of the detection results of the sensors by a fusion algorithm so as to fuse the information of the target with the road information.
6. The system of claim 5, wherein the prediction module predicts based on a data fusion result of the fusion module and sends the fusion result and the prediction result to the display module.
7. The system according to any of the preceding claims, wherein the display module is a HUD device.
8. The system of any one of the preceding claims, wherein the display module is capable of presenting environmental information around the vehicle and data relating to the target in a bird's eye view.
9. A method for displaying objects around a vehicle, the method comprising at least the steps of:
detecting objects around the vehicle and locating the objects;
making a prediction for the objects based on their detection and localization;
displaying the prediction result to the driver, wherein the displayed information can be autonomously selected by the driver.
10. The method according to claim 9, wherein a data fusion step is further provided after the detection step, in which the information on the object is fused with the road information; in the prediction step, the prediction is made based on the fusion result, and in the display step, the fusion result and the prediction result are displayed.
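The pipeline of claims 9 and 10 (detect and locate → fuse with road information → predict → display driver-selected information) can be sketched in a minimal, illustrative form. All class and function names below are hypothetical, and the sketch assumes a simple constant-velocity trajectory prediction and a trivial fusion of object and road data; the patent does not specify these algorithms.

```python
from dataclasses import dataclass

@dataclass
class Target:
    kind: str    # detected object type, e.g. "car" or "pedestrian"
    x: float     # position relative to the ego vehicle [m]
    y: float
    vx: float    # velocity components [m/s]
    vy: float

def fuse(targets, road_info):
    """Fusion step: attach road context (here, a lane label) to each target."""
    return [{"target": t, "lane": road_info.get("lane", "unknown")} for t in targets]

def predict_trajectory(target, horizon_s=3.0, dt=1.0):
    """Prediction step: constant-velocity extrapolation of future positions."""
    steps = int(horizon_s / dt)
    return [(target.x + target.vx * dt * k, target.y + target.vy * dt * k)
            for k in range(1, steps + 1)]

def build_display(fused, selected_kinds):
    """Display step: keep only the object types the driver chose to see."""
    view = []
    for entry in fused:
        t = entry["target"]
        if t.kind in selected_kinds:
            view.append({"kind": t.kind, "lane": entry["lane"],
                         "trajectory": predict_trajectory(t)})
    return view

# Example: two detected targets; the driver has selected only "car" objects.
targets = [Target("car", 10.0, 0.0, 2.0, 0.0),
           Target("pedestrian", 5.0, 3.0, 0.0, -1.0)]
view = build_display(fuse(targets, {"lane": "ego"}), selected_kinds={"car"})
```

With the driver's filter set to cars only, `view` contains a single entry whose predicted trajectory advances 2 m per second along x, mirroring the claimed chain of detection, fusion, prediction, and selective display.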
CN202210258687.3A 2022-03-16 2022-03-16 System and method for displaying detection and track prediction of targets around vehicle Pending CN114523905A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210258687.3A CN114523905A (en) 2022-03-16 2022-03-16 System and method for displaying detection and track prediction of targets around vehicle
DE102023000838.9A DE102023000838A1 (en) 2022-03-16 2023-03-07 System and method for displaying the detection of objects in the vehicle environment and a trajectory forecast

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210258687.3A CN114523905A (en) 2022-03-16 2022-03-16 System and method for displaying detection and track prediction of targets around vehicle

Publications (1)

Publication Number Publication Date
CN114523905A true CN114523905A (en) 2022-05-24

Family

ID=81626427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210258687.3A Pending CN114523905A (en) 2022-03-16 2022-03-16 System and method for displaying detection and track prediction of targets around vehicle

Country Status (2)

Country Link
CN (1) CN114523905A (en)
DE (1) DE102023000838A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023179086A1 (en) * 2022-08-26 2023-09-28 东莞理工学院 Lidar driving environment cognitive system based on visual area guidance


Also Published As

Publication number Publication date
DE102023000838A1 (en) 2023-09-21

Similar Documents

Publication Publication Date Title
CN107176165B (en) Vehicle control device
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
EP3313696B1 (en) Augmented reality system for vehicle blind spot prevention
US11554667B2 (en) Display device provided in vehicle and control method of display device
US10595176B1 (en) Virtual lane lines for connected vehicles
US9589464B2 (en) Vehicular headlight warning system
US11634150B2 (en) Display device
CN109716415B (en) Vehicle control device, vehicle control method, and moving object
CN112771592B (en) Method for warning a driver of a motor vehicle, control device and motor vehicle
US20060152351A1 (en) Device and method for the active monitoring of the safety perimenter of a motor vehicle
CN109204305B (en) Method for enriching the field of view, device for use in an observer vehicle and object, and motor vehicle
US10759334B2 (en) System for exchanging information between vehicles and control method thereof
CN109715467B (en) Vehicle control device, vehicle control method, and movable body
US20170043720A1 (en) Camera system for displaying an area exterior to a vehicle
US20200020235A1 (en) Method, System, and Device for Forward Vehicular Vision
US20230399004A1 (en) Ar display device for vehicle and method for operating same
CN114179726A (en) Driving assistance information display method, device, equipment, medium and program product
CN112298016A (en) High beam warning system, method, vehicle and computer storage medium
CN113448096B (en) Display device for vehicle
CN114523905A (en) System and method for displaying detection and track prediction of targets around vehicle
US20230400321A1 (en) Ar display device for vehicle and method for operating same
CN116935695A (en) Collision warning system for a motor vehicle with an augmented reality head-up display
US20230322215A1 (en) System and method of predicting and displaying a side blind zone entry alert
EP3544293B1 (en) Image processing device, imaging device, and display system
CN112298020A (en) Vehicle information display system, method, vehicle and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication