CN116080399B - Display method, display system and storage medium - Google Patents

Display method, display system and storage medium

Info

Publication number
CN116080399B
CN116080399B (Application CN202310157913.3A)
Authority
CN
China
Prior art keywords
range
display
vehicle
display element
puddle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310157913.3A
Other languages
Chinese (zh)
Other versions
CN116080399A (en)
Inventor
张涛
韩雨青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202310157913.3A
Publication of CN116080399A
Application granted
Publication of CN116080399B
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23Head-up displays [HUD]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/80Arrangements for controlling instruments
    • B60K35/81Arrangements for controlling instruments for controlling displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1407General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/178Warnings

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Instrument Panels (AREA)

Abstract

The invention discloses a display method, a display system and a storage medium. The method comprises the following steps: when a puddle exists in a first preset area, predicting a first range of water splash when the vehicle passes through the puddle, wherein the first preset area corresponds to a projectable area of a head-up display (HUD) of the vehicle; generating a first display element according to the first range; and projecting the first display element onto the windshield of the vehicle through the HUD. The scheme provided by the invention can intuitively display on the windshield the range of water splashed when the vehicle passes through the puddle, so as to remind the driver to drive considerately and ensure driving safety.

Description

Display method, display system and storage medium
Technical Field
The present invention relates to the field of automotive technologies, and in particular, to a display method, a display system, and a storage medium.
Background
A head-up display (HUD) is a display device that can project important information onto the windshield, so that the driver can obtain vehicle information safely while driving. It is a development trend of displays in the future vehicle field.
At present, rainy weather often leaves standing water on the road surface. When a vehicle drives through a waterlogged section, water splashes, and if pedestrians are passing nearby, the splashed water may land on them, causing unnecessary trouble for both parties. Therefore, how to use the HUD to display driving information in rainy weather is a problem that urgently needs to be solved.
Disclosure of Invention
The invention provides a display method, a display system and a storage medium, which can intuitively display on the windshield the range of water splashed when a vehicle passes through a puddle, so as to remind the driver to drive considerately and ensure driving safety.
According to an aspect of the present invention, there is provided a display method including:
when a puddle exists in a first preset area, predicting a first range of water splash when the vehicle passes through the puddle, wherein the first preset area corresponds to a projectable area of a head-up display (HUD) of the vehicle; generating a first display element according to the first range; and projecting the first display element onto the windshield of the vehicle through the HUD.
Optionally, predicting the first range of water splash when the vehicle passes through the puddle includes:
acquiring the predicted running track of the front wheels of the vehicle and the position information of the puddle, wherein the position information of the puddle comprises the puddle coordinates, the puddle area and the puddle depth; determining whether the predicted running track coincides with the puddle coordinates; if the predicted running track at least partially coincides with the puddle coordinates, determining the farthest distance of the water splash when the vehicle passes through the puddle according to vehicle information, the puddle area and the puddle depth, wherein the vehicle information at least comprises the mass and the running speed of the vehicle; and determining the first range based on the farthest distance.
Optionally, generating the first display element according to the first range includes:
determining a second range according to the first range and the first preset area, wherein the second range is a range where the first range and the first preset area coincide; according to the second range, a first display element is generated.
Optionally, the first display element includes a plurality of element points, and each element point corresponds to at least one geographic coordinate within the second range; projecting, by the HUD, the first display element onto the windshield of the vehicle comprises:
converting all geographic coordinates in the second range into camera coordinates in a camera coordinate system; converting the camera coordinates into first display coordinates of the element points on the windshield; the element point of the first display element is projected at the first display coordinates by the HUD.
Optionally, after projecting the first display element to the windshield of the vehicle through the HUD, further comprising:
when the preset object exists in the second preset area, acquiring the position information of the preset object; generating a second display element or a third display element according to the position information of the preset object and the first range; the second display element or the third display element is projected by the HUD to a windshield of the vehicle.
Optionally, generating the second display element according to the position information of the preset object and the first range includes:
determining the position relation between the position information of the preset object and a second range, wherein the second range is a range where the first range and the first preset area coincide; and if the position information of the preset object is in the second range, generating a second display element.
Optionally, generating the third display element according to the position information of the preset object and the first range includes:
determining the position relation between the position information of the preset object and the first range and the second range, wherein the second range is a range where the first range and the first preset area coincide; if the position information of the preset object is located in the first range and not located in the second range, generating a third display element.
Optionally, projecting the second display element to a windshield of the vehicle through the HUD includes:
determining coordinate information of the preset object on the windshield according to the position information of the preset object; determining a second display coordinate of the second display element on the windshield according to the coordinate information, wherein the coordinate information is adjacent to or at least partially overlapped with the second display coordinate; the second display element is projected at the second display coordinates by the HUD.
Optionally, projecting the third display element to a windshield of the vehicle through the HUD includes:
the third display element is projected at the first preset position by the HUD.
Optionally, the first range is divided into at least two pre-warning ranges, and one pre-warning range corresponds to one priority information; further comprises:
determining the target early warning range in which the position information of the preset object is located; generating a fourth display element according to the priority information corresponding to the target early warning range, wherein the fourth display element is used for prompting steering to avoid the puddle or prompting deceleration; and projecting the fourth display element at the second preset position by the HUD.
According to another aspect of the present invention, there is provided a display system comprising a head up display HUD and an electronic device; an electronic device includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the display method of any one of the embodiments of the invention.
According to another aspect of the present invention there is provided a vehicle comprising a display system according to any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a display method according to any one of the embodiments of the present invention.
According to the technical scheme, the first range of water splash when the vehicle passes through the puddle is predicted, the first display element is generated based on the first range, and the first display element is projected onto the windshield of the vehicle. In this way, the splash range when the vehicle passes through the puddle can be displayed intuitively on the windshield; further, whether a preset object is present within the first range can be monitored, and when one is present, a corresponding prompt is given on the windshield to remind the driver to drive considerately, thereby ensuring driving safety.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flow chart of a display method according to a first embodiment of the invention;
FIG. 2 is a schematic top view of a vehicle according to a first embodiment of the present invention;
fig. 3 is a schematic view of a HUD according to a first embodiment of the present invention;
FIG. 4 is a schematic view of a first range provided by a first embodiment of the present invention;
fig. 5 is a flow chart of a display method according to a second embodiment of the invention;
FIG. 6 is a schematic diagram illustrating a first range division according to a second embodiment of the present invention;
fig. 7 is a flow chart of a display method according to a third embodiment of the present invention;
FIG. 8 is an AR display diagram with a target early warning range W1 according to a third embodiment of the present invention;
FIG. 9 is an AR display diagram with a target early warning range W2 according to a third embodiment of the present invention;
fig. 10 is a non-AR display diagram with a target early warning range W1 according to a third embodiment of the present invention;
FIG. 11 is a non-AR display diagram with a target warning range W2 according to a third embodiment of the present invention;
fig. 12 is a schematic structural diagram of a display device according to a fourth embodiment of the present invention;
fig. 13 is a schematic structural diagram of a display system according to a fifth embodiment of the present invention;
fig. 14 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," "target," and the like in the description and claims of the present invention and in the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a schematic flow chart of a display method according to an embodiment of the present invention, where the method may be performed by a display device, and the display device may be implemented in hardware and/or software, and the display device may be configured in a vehicle, such as a display system of the vehicle. As shown in fig. 1, the method includes:
s110, when a puddle exists in a first preset area, predicting a first range of splash when the vehicle passes through the puddle, wherein the first preset area corresponds to a projectable area of a head-up display (HUD) of the vehicle.
In the present invention, the vehicle refers to the vehicle that the driver is currently driving. Fig. 2 is a schematic top view of a vehicle according to a first embodiment of the present invention; as shown in fig. 2, cameras and radars are installed around the vehicle. The camera may be a conventional optical camera or a depth camera (also known as a TOF camera); the radar may include a laser radar, a millimeter-wave radar and an ultrasonic radar. Specifically, the vehicle may acquire various information outside the vehicle through the camera and/or the radar, such as whether a puddle exists in the first preset area and the position information of the puddle.
Fig. 3 is a schematic diagram of a first preset area according to a first embodiment of the present invention. As shown in fig. 3, P is the eye point (i.e., the position of the human eye), o is the origin of the coordinate system (i.e., the projection of the eye point P onto the horizontal plane), the x-direction of the coordinate system is the running direction of the vehicle, the y-direction points to the driver's left, and the z-direction points from the ground to the sky. The first preset area corresponds to the projectable area of the HUD of the vehicle, that is, the projectable area of the HUD in the real world, as shown by the shaded portion in fig. 3; in the actual display, the shaded portion in fig. 3 is projected onto the windshield of the vehicle for the driver to observe.
Let the field-of-view (FOV) angle of the HUD in the horizontal direction be α, the FOV angle in the vertical direction be β, and the driver's look-down angle (LDA, not shown in fig. 3) be γ. Due to the limitation of the HUD FOV, the HUD can only cover the road surface in the range from L1 to L2 along the x-axis direction (the driving direction), for example the road surface from 20 meters to 100 meters in front of the vehicle.
Here L1 = (W/2) × cot(α/2), where W is the minimum width detectable by the HUD in the y-axis direction; and L2 = k × cot(γ − β/2), where k is an empirical constant, optionally k = 1.3.
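As an illustration only, the two bounds can be computed directly from these formulas. The sketch below is not part of the patent; all parameter values (FOV angles, look-down angle, width, k) are assumed placeholders.

```python
import math

# Illustrative sketch (not from the patent): compute the near/far bounds L1 and L2
# of the road area the HUD can cover, using the formulas above.

def detectable_range(fov_h_deg: float, fov_v_deg: float, lda_deg: float,
                     min_width_m: float, k: float = 1.3) -> tuple:
    """Return (L1, L2) in metres.

    L1 = (W/2) * cot(alpha/2): near bound, limited by the horizontal FOV.
    L2 = k * cot(gamma - beta/2): far bound, limited by the vertical FOV and
    the driver's look-down angle.
    """
    alpha = math.radians(fov_h_deg)
    beta = math.radians(fov_v_deg)
    gamma = math.radians(lda_deg)
    l1 = (min_width_m / 2.0) / math.tan(alpha / 2.0)   # cot(x) = 1 / tan(x)
    l2 = k / math.tan(gamma - beta / 2.0)
    return l1, l2

if __name__ == "__main__":
    # Assumed example values: 10 deg x 4 deg FOV, 5 deg look-down angle, 3.5 m width.
    print(detectable_range(10.0, 4.0, 5.0, 3.5))
```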
When the display method provided by the invention is executed, firstly, whether a puddle exists in a first preset area shown in fig. 3 is determined through equipment such as a camera and a radar arranged on a vehicle, and if the puddle exists, a first range of splash when the vehicle passes through the puddle is predicted.
Specifically, a method of predicting a first range of splash as a vehicle passes through a puddle may include the following five steps.
Step a1: and acquiring the predicted running track of the front wheel of the vehicle and the position information of the water pit, wherein the position information of the water pit comprises water pit coordinates, water pit area and water pit depth.
Although a puddle exists in the first preset area, the vehicle will not necessarily roll over it when travelling to the puddle position. Therefore, when predicting the first range of water splash, it is first judged whether the vehicle will roll over the puddle. Specifically, the judgement may be made based on the predicted running track of the front wheels of the vehicle and the position information of the puddle.
The predicted travel locus includes the locus of the left and right front wheels, and since the tire of the vehicle has a certain width, the predicted travel locus can be understood as two ruts. In general, the predicted travel track may be acquired by car navigation.
The position information of the puddle can be obtained by image recognition based on images acquired by equipment such as a vehicle camera and a radar. The position information of the puddle includes puddle coordinates, puddle area and puddle depth.
Step a2: it is determined whether the predicted travel path coincides with the puddle coordinates.
Since a puddle is an area with a certain area and depth, the puddle coordinates are a set comprising a plurality of coordinate points. Thus, determining whether the predicted running track coincides with the puddle coordinates can be done by checking whether the predicted running track coincides with any coordinate point in the set. If the predicted running track does not coincide with the puddle coordinates, the vehicle will not roll through the puddle when travelling to the puddle position; if the predicted running track at least partially coincides with the puddle coordinates, the vehicle will roll through the puddle when travelling to the puddle position.
Step a3: and if the predicted running track is not coincident with the puddle coordinates, determining that the first range is empty.
If the predicted running track does not coincide with the puddle coordinates, the vehicle will not roll through the puddle when travelling to the puddle position, i.e., no water will be splashed when the vehicle passes the puddle, and the first range is empty.
Step a4: if the predicted running track and the puddle coordinates are at least partially overlapped, determining the farthest distance of splash water when the vehicle passes through the puddle according to vehicle information, puddle area and puddle depth, wherein the vehicle information at least comprises the mass and running speed of the vehicle.
If the predicted running track at least partially coincides with the puddle coordinates, the vehicle will roll through the puddle when travelling to the puddle position, i.e., water will be splashed when the vehicle passes the puddle, and the farthest distance of the water splash is then determined.
The case in which the predicted running track at least partially coincides with the puddle coordinates includes any one of the following three situations:
1. The track of the left front wheel at least partially coincides with the puddle coordinates. In this case the left front wheel of the vehicle rolls through the puddle, and the first range is the splash range formed with the left front wheel as the reference point.
2. The track of the right front wheel at least partially coincides with the puddle coordinates. In this case the right front wheel of the vehicle rolls through the puddle, and the first range is the splash range formed with the right front wheel as the reference point.
3. The tracks of both the left and right front wheels coincide with the puddle coordinates. In this case both front wheels roll through the puddle, and the first range is the union of the splash ranges formed with the left front wheel and the right front wheel as reference points.
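A minimal sketch of this overlap check follows. It is not taken from the patent: the ruts and the puddle are assumed to be discretised into sets of ground-grid cells, and the function names are hypothetical.

```python
# Illustrative sketch of step a2: do the predicted ruts intersect the puddle,
# and which of the three cases above applies?

def rut_hits_puddle(rut_cells, puddle_cells):
    """Return True if the rut shares at least one grid cell with the puddle."""
    return not set(rut_cells).isdisjoint(puddle_cells)

def overlap_case(left_rut, right_rut, puddle):
    """Return which front wheels are predicted to roll through the puddle."""
    left = rut_hits_puddle(left_rut, puddle)
    right = rut_hits_puddle(right_rut, puddle)
    if left and right:
        return "both"    # case 3: first range is the union of two splash sectors
    if left:
        return "left"    # case 1: sector referenced to the left front wheel
    if right:
        return "right"   # case 2: sector referenced to the right front wheel
    return "none"        # no overlap: the first range is empty (step a3)
```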
The vehicle provided by the invention may comprise a memory in which an experimental model is preset. The experimental model is built from common vehicle driving scenarios and is used to test the farthest distance of water splash when the left or right front wheel of the vehicle rolls through a puddle. In the experimental model, multiple groups of tests are carried out on how several factors (the independent variables) influence the dependent variable, so as to obtain a one-to-one mapping relation. Table 1 is a mapping table from the variables to the farthest distance of the splashed water.
TABLE 1
Mass of vehicle m | Running speed of vehicle v | Puddle area S | Puddle depth H | Farthest distance D
…… | …… | …… | …… | ……
m1 | v1 | S1 | H1 | D1
m2 | v2 | S2 | H2 | D2
m3 | v3 | S3 | H3 | D3
…… | …… | …… | …… | ……
From the vehicle information, the puddle area and the puddle depth, the farthest distance of the water splash when the vehicle passes the puddle can be looked up in Table 1.
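A minimal sketch of such a table lookup is shown below. The numeric entries are placeholders, not measured values; a real table would be filled from the experiments described above, and nearest-neighbour matching is only one possible lookup strategy.

```python
# Illustrative sketch of looking up the farthest splash distance from a
# pre-measured table such as Table 1.

FARTHEST_DISTANCE_TABLE = [
    # (mass_kg, speed_kmh, puddle_area_m2, puddle_depth_m, farthest_distance_m)
    (1500.0, 30.0, 0.5, 0.02, 1.2),
    (1500.0, 60.0, 0.5, 0.02, 2.8),
    (2000.0, 60.0, 1.0, 0.05, 3.5),
]

def farthest_splash_distance(mass, speed, area, depth):
    """Return the distance of the closest table entry (simple nearest-neighbour)."""
    def gap(row):
        m, v, s, h, _ = row
        return abs(m - mass) + abs(v - speed) + abs(s - area) + abs(h - depth)
    return min(FARTHEST_DISTANCE_TABLE, key=gap)[4]
```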
Step a5: the first range is determined based on the furthest distance.
In one embodiment, the first range may be a sector centred on the wheel that rolls through the puddle, with a radius equal to the farthest distance and a sector angle of less than or equal to 180°. Fig. 4 is a schematic diagram of the first range according to the first embodiment of the present invention. Fig. 4 (a)-(c) correspond to the three cases above in which the predicted running track at least partially coincides with the puddle coordinates.
It should be noted that the method provided by the present invention is described for a single puddle. If two or more puddles exist in the first preset area and the vehicle will splash water when passing through them, the corresponding first range must be calculated for each puddle. In addition, even if the left and right front wheels of the vehicle roll through two different puddles at the same time, the puddle area S and the puddle depth H may differ, so the farthest splash distances of the two wheels may differ and the resulting first ranges are not the same; the calculation therefore also has to be performed for each puddle.
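A minimal sketch of constructing such a sector is given below. The opening direction and the boundary sampling are assumptions for illustration; the figures of the patent define the actual shape.

```python
import math

# Illustrative sketch of building the first range as a sector centred on the
# wheel that rolls through the puddle, sampled as ground-plane boundary points.

def first_range_points(wheel_xy, radius_m, toward_left, angle_deg=180.0, step_deg=5.0):
    """Sample the splash sector as (x, y) ground-plane points around the wheel."""
    cx, cy = wheel_xy
    centre_bearing = 90.0 if toward_left else -90.0   # +y is the driver's left
    half = min(angle_deg, 180.0) / 2.0                # sector angle capped at 180 deg
    points = [(cx, cy)]
    a = centre_bearing - half
    while a <= centre_bearing + half:
        rad = math.radians(a)
        points.append((cx + radius_m * math.cos(rad), cy + radius_m * math.sin(rad)))
        a += step_deg
    return points

# With both front wheels wetted (case 3), or with several puddles, the first range
# is simply the union of the individual sectors computed this way.
```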
S120, generating a first display element according to the first range.
In an embodiment, a method of generating a first display element according to a first scope may include the following two steps.
Step b1: and determining a second range according to the first range and the first preset area, wherein the second range is a range where the first range and the first preset area coincide.
As can be seen from fig. 3, the projectable area of the HUD corresponds to a trapezoid-like area of limited length and width in the real world. Therefore, when the farthest distance of the water splash is too large, the first range may exceed the display range of the HUD, and only the second range can actually be displayed on the windshield.
Step b2: according to the second range, a first display element is generated.
The first display element may be an element used to represent a splash effect, for example with a specific colour or display effect. The first display element includes a plurality of element points, and each element point corresponds to at least one geographic coordinate within the second range. From the driver's viewpoint, the first display coordinates of the element points on the windshield coincide with the geographic coordinates of the second range, i.e., the virtual image of the first display element seen by the driver coincides with the predicted real-world splash position.
And S130, projecting the first display element to a windshield of the vehicle through the HUD.
In particular, the method of projecting a first display element onto a windscreen of a vehicle by means of a HUD may comprise the following three steps.
Step c1: all geographic coordinates within the second range are converted to camera coordinates in the camera coordinate system.
Step c2: the camera coordinates are converted into first display coordinates of the element points on the windscreen.
Taking a geographic coordinate (X, Y, Z) within the second range as an example, where X, Y and Z give the position of the point relative to the vehicle head, the converted camera coordinates are (Xc, Yc, Zc)ᵀ = R·(X, Y, Z)ᵀ + t, where R is a rotation matrix and t is a translation vector.
The first display coordinates are obtained by s·(u, v, 1)ᵀ = K·(Xc, Yc, Zc)ᵀ, where K is the camera intrinsic matrix and s is a scale factor.
Step c3: the element point of the first display element is projected at the first display coordinates by the HUD.
By predicting the first range of water splash when the vehicle passes over the puddle, generating the first display element based on the first range, and projecting it onto the windshield, the splash range when the vehicle passes through the puddle can be displayed intuitively on the windshield, so that the driver can clearly observe this information, which facilitates subsequent driving.
Example two
Fig. 5 is a flow chart of a display method according to a second embodiment of the present invention, where an early warning function for a preset object is introduced based on the above embodiment. As shown in fig. 5, after the step S130 is performed, the method further includes:
and S140, when a preset object exists in the second preset area, acquiring the position information of the preset object, wherein the preset object comprises at least one of pedestrians, animals and non-motor vehicle drivers.
On the basis of the above embodiment, in order to prevent the water splashed by the vehicle when passing through the puddle from landing on a preset object, the embodiment of the invention can, in addition to intuitively displaying the splash range on the windshield, also monitor whether a preset object is present within the first range.
The second preset area is a certain range centered on the vehicle, and may be the range that can be detected by the radar, the camera and the like, for example a range of 10, 20 or 30 meters around the vehicle.
The position information of the preset object comprises the coordinates of the preset object, or the coordinates of the preset object together with its travelling state. For example, when the preset object is stationary (for example, waiting at a red light at the roadside), its position is unchanged before and after the vehicle passes through the puddle; when the preset object is moving (for example, walking along the roadside), its travelling state (such as travelling direction and travelling speed) helps to determine its actual position at the moment the vehicle finally passes through the puddle.
S150, generating a second display element or a third display element according to the position information of the preset object and the first range.
In an embodiment, assuming that the HUD is an AR HUD, the method of generating the second display element according to the position information of the preset object and the first range may include the following steps.
Step d1: and determining the position relation between the position information of the preset object and a second range, wherein the second range is a range where the first range and the first preset area coincide.
Step d2: and if the position information of the preset object is in the second range, generating a second display element.
If the position information of the preset object is located within the second range, the preset object is located within the shaded area in fig. 3, and the second display element can be displayed in AR mode.
In an embodiment, the method for generating the third display element according to the position information of the preset object and the first range may include the following steps.
Step d'1: and determining the position relation between the position information of the preset object and the first range and the second range, wherein the second range is a range where the first range and the first preset area coincide.
Step d'2: if the position information of the preset object is located in the first range and not located in the second range, generating a third display element.
If the position information of the preset object is located within the first range but not within the second range, the preset object is located outside the shaded area in fig. 3, and the third display element can be displayed in a non-AR mode.
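The choice between the two elements can be summarised as a small decision rule. The sketch below is illustrative only: the membership tests for the first and second ranges are passed in as assumed callables rather than derived here.

```python
# Illustrative sketch of choosing between the second display element (AR) and the
# third display element (non-AR) for a preset object.

def choose_warning_element(obj_xy, in_first_range, in_second_range):
    """Return which display element should be generated for the preset object."""
    if in_second_range(obj_xy):
        return "second_display_element"   # AR-anchored next to the object (steps d1-d2)
    if in_first_range(obj_xy):
        return "third_display_element"    # non-AR, drawn at the first preset position
    return None                           # object is not threatened by the splash
```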
And S160, projecting the second display element or the third display element to the windshield of the vehicle through the HUD.
Specifically, when the second display element is projected onto the windshield of the vehicle through the HUD, it is displayed in AR mode. First, the coordinate information of the preset object on the windshield is determined according to the position information of the preset object; second, the second display coordinates of the second display element on the windshield are determined according to that coordinate information, the coordinate information being adjacent to or at least partially coinciding with the second display coordinates; finally, the second display element is projected at the second display coordinates by the HUD.
Specifically, when the third display element is projected onto the windshield of the vehicle by the HUD, the third display element is displayed by a non-AR display method, and therefore the third display element may be projected directly at the first preset position by the HUD.
Optionally, on the basis of the above embodiment, the display method provided by the present invention may further provide driving guidance for the driver. Specifically, after the step S150 is performed, steps S170 to S190 may be further included.
S170, determining a target early warning range where the position information of the preset object is located.
The first range is divided into at least two early warning ranges, and each early warning range corresponds to one piece of priority information. Fig. 6 is a schematic diagram illustrating the division of the first range according to a second embodiment of the present invention. As shown in fig. 6, the first range is divided into three early warning ranges, denoted W3, W2 and W1 in the direction away from the vehicle, and each early warning range corresponds to one piece of priority information. In general, the closer an early warning range is to the vehicle, the higher its priority: the priority of W3 is higher than that of W2, and the priority of W2 is higher than that of W1.
It should be noted that, the size of each early warning range may be designed according to the actual situation, which is not particularly limited by the present invention.
And S180, generating a fourth display element according to priority information corresponding to the target early warning range, wherein the fourth display element is used for reminding a driver to turn to avoid a puddle or reminding the driver to slow down.
And S190, projecting the fourth display element to a windshield of the vehicle through the HUD.
And generating a fourth display element according to priority information corresponding to a target early warning range where the position information of the preset object is located, and projecting the fourth display element to a windshield glass of the vehicle through the HUD. Thereby providing driving guidance for the driver.
In general, the fourth display elements corresponding to different priority information differ. For example, the fourth display element corresponding to a high-priority early warning range preferentially prompts the driver to decelerate, so as to prevent the splash from affecting the preset object, while the fourth display element corresponding to a low-priority early warning range preferentially prompts the driver to steer around the puddle, so as to keep the vehicle running normally.
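A minimal sketch of this mapping from early warning range to prompt is shown below. The range boundaries and the prompt wording are assumptions for illustration; only the ordering of priorities (closer range, higher priority) comes from the text above.

```python
# Illustrative sketch of mapping the target early warning range to the fourth
# display element.

WARNING_RANGES = [
    # (name, inner_m, outer_m, priority, prompt) -- boundary values are assumptions
    ("W3", 0.0, 1.0, 3, "decelerate"),
    ("W2", 1.0, 2.0, 2, "decelerate"),
    ("W1", 2.0, 3.5, 1, "steer to avoid the puddle"),
]

def fourth_display_element(distance_from_wheel_m):
    """Return the prompt for the early warning range containing the preset object."""
    for name, inner, outer, priority, prompt in WARNING_RANGES:
        if inner <= distance_from_wheel_m < outer:
            return {"range": name, "priority": priority, "prompt": prompt}
    return None   # outside the first range: no fourth display element
```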
Example III
Fig. 7 is a flow chart of a display method according to a third embodiment of the present invention. This embodiment provides a specific implementation of the display method. In this embodiment it is assumed that the first range is divided into two early warning ranges, denoted W2 and W1 in the direction away from the vehicle, with the priority of W2 higher than that of W1. As shown in fig. 7, the method includes:
s201, determining whether a puddle exists in the first preset area. If yes, go to step S202; if not, the process returns to step S201.
Wherein the first preset area corresponds to a projectable area of the HUD of the vehicle.
S202, obtaining the predicted running track of the front wheels of the vehicle and the position information of the puddle.
The position information of the puddle comprises puddle coordinates, puddle area and puddle depth. In general, the predicted travel track may be acquired by car navigation. The position information of the puddle can be obtained by image recognition based on images acquired by equipment such as a vehicle camera and a radar.
S203, determining whether the predicted running track coincides with the water pit coordinates. If the two images do not overlap, step S204 is executed; if at least partially overlap, step S205 is performed.
If the predicted running track is not coincident with the puddle coordinate, the vehicle is not rolled through the puddle when running to the puddle position; if the predicted running track and the puddle coordinates are at least partially overlapped, the predicted running track and the puddle coordinates indicate that the vehicle rolls through the puddle when running to the puddle position.
S204, determining that the first range is empty, and returning to execute the step S201.
S205, determining the furthest distance of splash when the vehicle passes through the puddle according to the vehicle information, the puddle area and the puddle depth.
Wherein the vehicle information includes at least a mass and a traveling speed of the vehicle.
S206, determining a first range according to the farthest distance.
The first range may be a sector centred on the wheel that rolls through the puddle, with a radius equal to the farthest distance and a sector angle of 180° or less.
S207, determining a second range according to the first range and the first preset area.
The second range is a range where the first range and the first preset area overlap.
S208, generating a first display element according to the second range.
The first display element includes a plurality of element points, and each element point corresponds to at least one geographic coordinate within the second range.
And S209, converting all geographic coordinates in the second range into camera coordinates in a camera coordinate system.
Taking a geographic coordinate (X, Y, Z) within the second range as an example, where X, Y and Z give the position of the point relative to the vehicle head, the converted camera coordinates are (Xc, Yc, Zc)ᵀ = R·(X, Y, Z)ᵀ + t, where R is a rotation matrix and t is a translation vector.
S210, converting the camera coordinates into first display coordinates of the element points on the windshield.
The first display coordinates are obtained by s·(u, v, 1)ᵀ = K·(Xc, Yc, Zc)ᵀ, where K is the camera intrinsic matrix and s is a scale factor.
S211, projecting element points of the first display element at the first display coordinates through the HUD.
S212, determining whether a preset object exists in the second preset area. If not, returning to the execution step S212; if so, step S213 is performed.
Wherein the preset object comprises at least one of a pedestrian, an animal, and a non-motor vehicle driver.
If no preset object exists in the second preset area, only the first display element is displayed on the windshield.
S213, acquiring position information of the preset object.
The position information of the preset object comprises preset object coordinates or comprises preset object coordinates and preset object traveling conditions.
S214, determining the position relation between the position information of the preset object and the first range and the second range.
Alternatively, the minimum distance between the preset object and the front wheel of the vehicle may be determined first, and then the positional relationship between the minimum distance and the first and second ranges may be determined.
For example, assume that the position information of the preset object is (Xi, Yi), the coordinates of the left front wheel of the vehicle are (X_lf, Y_lf), and the coordinates of the right front wheel are (X_rf, Y_rf).
If the preset object is located on the left side of the vehicle, the minimum distance Di between the preset object and the front wheel satisfies Di² = (Xi − X_lf)² + (Yi − Y_lf)²; if the preset object is located on the right side of the vehicle, Di² = (Xi − X_rf)² + (Yi − Y_rf)².
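A minimal sketch of this Di computation is given below; coordinates are assumed to be expressed in the same ground frame as the predicted wheel positions.

```python
import math

# Illustrative sketch of the D_i computation above (S214).

def min_distance_to_front_wheel(obj_xy, left_front_xy, right_front_xy, object_on_left):
    """Return D_i, the distance from the preset object to the relevant front wheel."""
    x_w, y_w = left_front_xy if object_on_left else right_front_xy
    return math.hypot(obj_xy[0] - x_w, obj_xy[1] - y_w)

# D_i is then compared against the first and second ranges to decide whether the
# second display element (S215) or the third display element (S221) is generated.
```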
S215, if the position information of the preset object is located in the second range, generating a second display element.
S216, determining coordinate information of the preset object on the windshield according to the position information of the preset object.
S217, determining second display coordinates of the second display element on the windshield according to the coordinate information, wherein the coordinate information is adjacent to or at least partially overlapped with the second display coordinates.
S218, determining a target early warning range where the position information of the preset object is located.
S219, generating a fourth display element according to priority information corresponding to the target early warning range, wherein the fourth display element is used for reminding a driver to turn to avoid a puddle or reminding the driver to slow down.
S220, projecting a second display element at a second display coordinate through the HUD, and projecting a fourth display element at a second preset position.
Specifically, the method for projecting the second display element at the second display coordinate is similar to the method for projecting the first display element, which is not described herein for brevity. From the perspective of the driver's view, the virtual image of the second display element as seen by the driver is adjacent or at least partially coincident with the real world preset object.
Fig. 8 is an AR display diagram with a target early warning range W1 according to a third embodiment of the present invention; fig. 9 is an AR display diagram with a target early warning range W2 according to a third embodiment of the present invention. As shown in fig. 8 and 9, the position information of the preset object is located in the second range, which means that the preset object is located in the hatched range in fig. 3, and the second display element is displayed by means of AR display.
When the target early warning range is W1, the fourth display element is used for reminding a driver of steering so as to avoid the puddle; when the target early warning range is W2, the fourth display element is used for reminding the driver of decelerating.
S221, if the position information of the preset object is located in the first range and not located in the second range, generating a third display element.
S222, determining a target early warning range where the position information of the preset object is located.
And S223, generating a fourth display element according to the priority information corresponding to the target early warning range, wherein the fourth display element is used for reminding a driver to turn to avoid a puddle or reminding the driver to slow down.
S224, projecting a third display element at a first preset position and projecting a fourth display element at a second preset position through the HUD.
Fig. 10 is a non-AR display diagram with a target early warning range W1 according to a third embodiment of the present invention; fig. 11 is a non-AR display diagram with a target early warning range W2 according to a third embodiment of the present invention. As shown in fig. 10 and 11, the position information of the preset object is located in the first range and not located in the second range, which means that the preset object is located outside the shadow range in fig. 3, and the third display element is displayed by a non-AR display manner.
When the target early warning range is W1, the fourth display element is used for reminding a driver of steering so as to avoid the puddle; when the target early warning range is W2, the fourth display element is used for reminding the driver of decelerating.
It should also be added that, for the cases shown in fig. 8 and fig. 10, when the driver slightly turns the steering wheel according to the HUD prompt, the time T1 at which the steering prompt is first given, the time T2 at which the steering wheel has been operated and the heading of the vehicle has deviated, and the heading angle offset θ can also be obtained, and an offset value Δd = v·(T2 − T1)·sin θ is calculated, where v is the running speed of the vehicle. The judgement is then made again based on this offset value (the judging process is as described in the above embodiments); if the predicted running track of the vehicle after steering no longer coincides with the puddle coordinates, the steering prompt is no longer given.
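A minimal sketch of this re-judgement is shown below. It reuses the grid-cell rut/puddle representation assumed in the earlier sketch and assumes Δd has already been converted into a whole number of grid cells; both are illustrative assumptions.

```python
import math

# Illustrative sketch of re-checking the puddle overlap after the driver steers.

def lateral_offset(speed_mps, t1_s, t2_s, heading_offset_deg):
    """delta_d = v * (T2 - T1) * sin(theta), the lateral shift of the predicted rut."""
    return speed_mps * (t2_s - t1_s) * math.sin(math.radians(heading_offset_deg))

def still_hits_puddle(rut_cells, puddle_cells, offset_cells):
    """Shift the predicted rut laterally by offset_cells and re-run the overlap check."""
    shifted = {(x, y + offset_cells) for (x, y) in rut_cells}
    return not shifted.isdisjoint(set(puddle_cells))

# If the shifted rut no longer intersects the puddle, the steering prompt is withdrawn.
```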
The embodiment of the invention provides a display method, comprising: when a puddle exists in a first preset area, predicting a first range of water splash when the vehicle passes through the puddle, wherein the first preset area corresponds to a projectable area of a head-up display (HUD) of the vehicle; generating a first display element according to the first range; and projecting the first display element onto the windshield of the vehicle through the HUD. By predicting the first range of water splash when the vehicle passes over the puddle and generating the first display element based on the first range, the first display element is projected onto the windshield of the vehicle. In this way, the splash range when the vehicle passes through the puddle can be displayed intuitively on the windshield; further, whether a preset object is present within the first range can be monitored, and when one is present, a corresponding prompt is given on the windshield to remind the driver to drive considerately, thereby ensuring driving safety.
Example IV
Fig. 12 is a schematic structural diagram of a display device according to a fourth embodiment of the present invention. As shown in fig. 12, the apparatus includes: a prediction module 301, a generation module 302 and a display module 303.
The prediction module 301 is configured to predict a first range of splash when the vehicle passes through the puddle when the puddle exists in a first preset area, where the first preset area corresponds to a projectable area of a head up display HUD of the vehicle;
a generating module 302, configured to generate a first display element according to the first range;
a display module 303 for projecting the first display element to a windscreen of the vehicle via the HUD.
Optionally, the prediction module 301 is specifically configured to acquire the predicted running track of the front wheels of the vehicle and the position information of the puddle, wherein the position information of the puddle comprises the puddle coordinates, the puddle area and the puddle depth; determine whether the predicted running track coincides with the puddle coordinates; if the predicted running track at least partially coincides with the puddle coordinates, determine the farthest distance of the water splash when the vehicle passes through the puddle according to vehicle information, the puddle area and the puddle depth, wherein the vehicle information at least comprises the mass and the running speed of the vehicle; and determine the first range based on the farthest distance.
Optionally, the generating module 302 is specifically configured to determine a second range according to the first range and the first preset area, where the second range is a range where the first range and the first preset area overlap; according to the second range, a first display element is generated.
Optionally, the first display element includes a plurality of element points, and each element point corresponds to at least one geographic coordinate within the second range;
the display module 303 is specifically configured to convert all geographic coordinates in the second range into camera coordinates in a camera coordinate system; converting the camera coordinates into first display coordinates of the element points on the windshield; the element point of the first display element is projected at the first display coordinates by the HUD.
Optionally, the prediction module 301 is further configured to obtain location information of the preset object when the preset object exists in the second preset area;
the generating module 302 is further configured to generate a second display element or a third display element according to the position information and the first range of the preset object;
the display module 303 is also used for projecting the second display element or the third display element to the windscreen of the vehicle via the HUD.
Optionally, the generating module 302 is specifically configured to determine a positional relationship between the positional information of the preset object and a second range, where the second range is a range where the first range and the first preset area overlap; and if the position information of the preset object is in the second range, generating a second display element.
Optionally, the generating module 302 is specifically configured to determine a positional relationship between the positional information of the preset object and the first range and the second range, where the second range is a range where the first range and the first preset area overlap; if the position information of the preset object is located in the first range and not located in the second range, generating a third display element.
Optionally, the display module 303 is specifically configured to determine coordinate information of the preset object on the windshield according to the position information of the preset object; determining a second display coordinate of the second display element on the windshield according to the coordinate information, wherein the coordinate information is adjacent to or at least partially overlapped with the second display coordinate; the second display element is projected at the second display coordinates by the HUD.
Optionally, the display module 303 is specifically configured to project, through the HUD, the third display element at the first preset position.
Optionally, the first range is divided into at least two pre-warning ranges, and one pre-warning range corresponds to one priority information;
the prediction module 301 is further configured to determine a target early warning range in which the position information of the preset object is located;
the generating module 302 is further configured to generate a fourth display element according to the priority information corresponding to the target early warning range, where the fourth display element is used for prompting steering to avoid the puddle, or prompting deceleration;
The display module 303 is further configured to project, via the HUD, a fourth display element at a second preset position.
The display device provided by the embodiment of the invention can execute the display method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example five
Fig. 13 is a schematic structural diagram of a display system according to a fifth embodiment of the present invention. As shown in fig. 13, the display system includes an electronic device 10 and a heads-up display HUD 20.
Fig. 14 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the present invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 14, the electronic device 10 includes at least one processor 11, and a memory such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the display method.
In some embodiments, the display method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the display method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the display method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of high management difficulty and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
Example six
The embodiment of the invention also provides a vehicle. The vehicle includes the display system described in any of the embodiments above.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A display method, comprising:
when a puddle exists in a first preset area, predicting a first range of splashed water when a vehicle passes through the puddle, wherein the first preset area corresponds to a projectable area of a head-up display (HUD) of the vehicle;
Generating a first display element according to the first range;
projecting the first display element through the HUD to a windshield of the vehicle;
when a preset object exists in a second preset area, acquiring the position information of the preset object;
determining the position relation between the position information of the preset object and the first range and the second range, wherein the second range is a range where the first range and the first preset area coincide;
if the position information of the preset object is located in the second range, generating a second display element;
and projecting the second display element to a windshield of the vehicle through the HUD, wherein the second display element is displayed in an AR display mode.
2. The display method of claim 1, wherein predicting a first range of splash as the vehicle passes the puddle comprises:
acquiring a predicted running track of a front wheel of a vehicle and position information of the puddle, wherein the position information of the puddle comprises puddle coordinates, puddle area and puddle depth;
determining whether the predicted running track coincides with the puddle coordinates;
if the predicted running track at least partially coincides with the puddle coordinates, determining the farthest distance of splashed water when the vehicle passes through the puddle according to vehicle information, the puddle area and the puddle depth, wherein the vehicle information at least comprises the mass and the running speed of the vehicle;
and determining the first range according to the farthest distance.
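For illustration (not part of the claims): claim 2 leaves the splash model unspecified, so the sketch below uses an assumed monotonic relation (the coefficient k, the square-root coupling of puddle area and depth, and the mass normalisation are all invented for this example) to show how the farthest distance and the first range could be derived from the listed inputs.

```python
import math

def farthest_splash_distance(mass_kg: float, speed_mps: float,
                             puddle_area_m2: float, puddle_depth_m: float,
                             k: float = 2.0) -> float:
    """Assumed model: faster, heavier vehicles and larger, deeper puddles splash
    farther. The coefficient k and the functional form are illustrative only."""
    return (k * speed_mps * math.sqrt(puddle_area_m2 * puddle_depth_m)
            * (1.0 + mass_kg / 2000.0))

def first_range_from_distance(puddle_xy, farthest_m):
    """Represent the first range as a circle centred on the puddle coordinates."""
    return {"centre": puddle_xy, "radius": farthest_m}

# Example: a 1.8 t vehicle at 50 km/h over a 0.5 m^2, 3 cm deep puddle
d = farthest_splash_distance(1800.0, 50 / 3.6, 0.5, 0.03)   # roughly 6.5 m here
rng = first_range_from_distance((12.0, 0.5), d)
```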
3. The display method of claim 1, wherein generating a first display element from the first range comprises:
determining a second range according to the first range and the first preset area, wherein the second range is a range where the first range and the first preset area coincide;
and generating a first display element according to the second range.
4. A display method according to claim 3, wherein the first display element comprises a plurality of element points, one element point corresponding to at least one geographic coordinate within the second range;
said projecting said first display element through said HUD to a windshield of said vehicle, comprising:
converting all geographic coordinates in the second range into camera coordinates in a camera coordinate system;
converting the camera coordinates to first display coordinates of the element point on the windshield;
and projecting an element point of the first display element at the first display coordinate through the HUD.
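For illustration (not part of the claims): the two conversions in claim 4 correspond to a standard extrinsic and projective mapping. The rotation R, translation t, and HUD projection matrix K below are assumed calibration parameters that the patent does not specify.

```python
import numpy as np

def geographic_to_camera(points_world: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Geographic coordinates (N x 3) -> camera coordinates, using assumed extrinsics R, t."""
    return points_world @ R.T + t

def camera_to_display(points_cam: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Camera coordinates (N x 3) -> first display coordinates on the windshield,
    via an assumed pinhole-style projection matrix K for the HUD image plane."""
    uv = points_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]   # perspective divide

# Example with identity extrinsics and a toy projection matrix
R, t = np.eye(3), np.zeros(3)
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
element_points = np.array([[1.0, 0.2, 10.0], [1.5, 0.2, 10.0]])  # points within the second range
display_coords = camera_to_display(geographic_to_camera(element_points, R, t), K)
```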
5. The display method according to claim 1, characterized by further comprising:
if the position information of the preset object is located in the first range and not located in the second range, generating a third display element;
and projecting the third display element to a windshield of the vehicle through the HUD, wherein the third display element is displayed in a non-AR display mode.
6. The display method according to claim 1, wherein the projecting the second display element to a windshield of the vehicle through the HUD includes:
determining coordinate information of the preset object on the windshield according to the position information of the preset object;
determining a second display coordinate of the second display element on the windshield according to the coordinate information, wherein the coordinate information is adjacent to or at least partially overlapped with the second display coordinate;
and projecting, through the HUD, the second display element at the second display coordinate.
7. The display method according to claim 5, wherein the projecting the third display element to the windshield of the vehicle through the HUD includes:
and projecting the third display element at a first preset position through the HUD.
8. The display method according to claim 1, wherein the first range is divided into at least two pre-warning ranges, one pre-warning range corresponding to one piece of priority information; the method further comprising:
determining a target early warning range in which the position information of the preset object is located;
generating a fourth display element according to the priority information corresponding to the target early warning range, wherein the fourth display element is used for prompting steering to avoid the puddle or prompting deceleration;
and projecting the fourth display element at a second preset position through the HUD.
9. A display system, comprising a head-up display (HUD) and an electronic device; the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the display method of any one of claims 1-8.
10. A computer readable storage medium storing computer instructions for causing a processor to implement the display method of any one of claims 1-8 when executed.
CN202310157913.3A 2023-02-23 2023-02-23 Display method, display system and storage medium Active CN116080399B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310157913.3A CN116080399B (en) 2023-02-23 2023-02-23 Display method, display system and storage medium

Publications (2)

Publication Number Publication Date
CN116080399A CN116080399A (en) 2023-05-09
CN116080399B true CN116080399B (en) 2023-10-20

Family

ID=86186920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310157913.3A Active CN116080399B (en) 2023-02-23 2023-02-23 Display method, display system and storage medium

Country Status (1)

Country Link
CN (1) CN116080399B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116909024B * 2023-07-26 2024-02-09 Jiangsu Zejing Automobile Electronic Co., Ltd. Image display method, device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110025577A (en) * 2009-09-04 2011-03-10 현대자동차일본기술연구소 Vehicle surrounding information service system
DE102016005618A1 (en) * 2016-05-06 2017-02-09 Daimler Ag Method for operating a vehicle
CN112218786A (en) * 2019-03-26 2021-01-12 深圳大学 Driving control method and device under severe weather, vehicle and driving control system
CN112550308A (en) * 2020-12-17 2021-03-26 东风汽车有限公司 Method for preventing accumulated water on road surface from sputtering pedestrians, vehicle-mounted terminal and system
CN115128806A (en) * 2021-03-10 2022-09-30 矢崎总业株式会社 Display device for vehicle

Also Published As

Publication number Publication date
CN116080399A (en) 2023-05-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant