Disclosure of Invention
The application aims to provide a display method, a display device, a storage medium, and a vehicle, so as to solve the technical problem in the prior art that a HUD display device offers only a single augmented-display function and cannot provide auxiliary prompts about the surrounding environment of a vehicle.
To solve the above technical problem, the application adopts the following technical solutions.
In a first aspect, the present application provides a display method, including:
at least a first area projected by the HUD display device is displayed so as to fit onto a vehicle head cover;
the first area is configured to display a virtual contour line reflecting the outline of the vehicle head cover and a first image reflecting the real scene occluded by the vehicle head cover; the first image is displayed in registration with the virtual contour line according to the relative positional relationship between the occluded real scene and the vehicle head cover; the virtual contour line forms a perspective structure of the vehicle head cover within the first area; and the first image is the content captured by a vehicle-integrated camera corresponding to the first area.
In an optional implementation manner of the first aspect, the color of the virtual contour line is consistent with the color of the vehicle body.
According to the above description, this optional implementation can intuitively display, on the vehicle head cover, the real scene occluded in front of the vehicle; through the perspective structure, the driver can judge the approximate distance between the occluded scene and the vehicle head, which helps the driver make accurate driving decisions.
In an optional implementation manner of the first aspect, the displaying the first image in registration with the virtual contour line according to the relative positional relationship between the occluded real scene and the vehicle head cover includes:
displaying, in the first image, a first distance mark between the vehicle head cover represented by the virtual contour line and the occluded real scene, wherein the first distance mark is used for prompting the actual distance value between the vehicle head and the surrounding environment.
In an optional implementation manner of the first aspect, the actual distance value is determined by an in-vehicle distance sensor.
In an optional implementation manner of the first aspect, the actual distance value is determined by analyzing the first image.
According to the above description, on the basis of intuitively displaying the image of the occluded real scene, this optional embodiment further assists the driver's judgment with a more intuitive numerical display; specifically, the value of the first distance mark is continuously updated as the vehicle moves.
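Purely as an illustrative sketch, and not part of the claimed subject matter, the logic of selecting and rendering the value of the first distance mark might look as follows; the function names and the sensor-first fallback order are hypothetical assumptions, since the disclosure only states that the value may come from a distance sensor or from analyzing the first image.

```python
def distance_mark_value(sensor_distance_m=None, image_estimate_m=None):
    """Prefer the in-vehicle distance sensor; fall back to a value
    estimated by analyzing the first image (assumption: sensor first)."""
    if sensor_distance_m is not None:
        return round(sensor_distance_m, 1)
    if image_estimate_m is not None:
        return round(image_estimate_m, 1)
    return None  # no data available: the mark is hidden


def format_distance_mark(value_m):
    """Render the label text projected between the virtual contour line
    and the occluded real scene; updated each frame as the vehicle moves."""
    if value_m is None:
        return ""
    return f"{value_m:.1f} m"
```

In use, the mark would be re-evaluated every display frame, so the shown value continuously refreshes while the vehicle moves, e.g. `format_distance_mark(distance_mark_value(sensor_distance_m=1.23))`.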
In an optional implementation manner of the first aspect, the first area being configured to display a virtual contour line reflecting the outline of the vehicle head cover and a first image reflecting the real scene occluded by the vehicle head cover includes:
in response to the vehicle triggering a turn, projecting and displaying the virtual contour line and the first image in the first area.
In an optional implementation manner of the first aspect, the first area being configured to display a virtual contour line reflecting the outline of the vehicle head cover and a first image reflecting the real scene occluded by the vehicle head cover includes:
in response to the driver raising the line of sight so that the eye position reaches a predetermined height, projecting and displaying the virtual contour line and the first image in the first area.
According to the above description, this optional implementation calls up the virtual contour line and the first image to assist the driver only when the vehicle requires a complex decision, thereby reducing both the information burden that the projected display places on the driver and the processing load on the HUD.
In an optional implementation manner of the first aspect, the first area being configured to display a virtual contour line reflecting the outline of the vehicle head cover and a first image reflecting the real scene occluded by the vehicle head cover includes:
in response to the vehicle being in a normal driving state, hiding the virtual contour line and the first image, wherein the first area is configured to display vehicle driving state information.
According to the above description, this optional embodiment maximizes the utilization of the projection area: more information can be displayed by switching between different states, and the vehicle can display conventional information in the normal state of driving straight.
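As an illustrative sketch only (not part of the disclosure), the mode switching described in the preceding implementations can be summarized as a small decision function; the eye-height threshold and the input names are hypothetical assumptions, since the disclosure only states that a turn trigger or a raised line of sight calls up the augmented display, and that normal driving hides it.

```python
from enum import Enum, auto


class HudMode(Enum):
    NORMAL = auto()     # first area shows vehicle driving state information
    AUGMENTED = auto()  # first area shows virtual contour line + first image


def select_mode(turn_signal_on, eye_height_m, eye_height_threshold_m=1.25):
    """Decide the first-area display mode; threshold is an assumed value."""
    if turn_signal_on or eye_height_m >= eye_height_threshold_m:
        return HudMode.AUGMENTED
    return HudMode.NORMAL
```

Switching back to `HudMode.NORMAL` when neither trigger is active corresponds to the hidden state in which conventional driving information is displayed.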
In an optional implementation manner of the first aspect, the display method further includes:
the HUD display device projects a second area, the second area being adjacent to the first area and located at the outer edge of the vehicle head cover; the second area is configured to display a second image reflecting the environment beside the real scene occluded by the vehicle head cover; the second image is the content captured by the vehicle-integrated camera corresponding to the second area; and the first image and the second image form a continuous stitch.
According to the above description, this optional embodiment uses the second image to realize the transition between the virtual image and the real scene; the second image can essentially coincide with the real environment within the second area, which improves the smoothness of the combined virtual-real display and the viewing experience of the user.
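For illustration only, and under the simplifying assumption (not stated in the disclosure) that the first and second images are cut from one camera frame along the hood boundary, the continuous stitch can be sketched as a row-based split whose re-join reproduces the original frame:

```python
def split_frame(frame, boundary_row):
    """frame: list of pixel rows. Rows from boundary_row down are the part
    occluded by the vehicle head cover (first image); rows above it are
    the adjacent visible scene (second image). Purely illustrative."""
    second_image = frame[:boundary_row]
    first_image = frame[boundary_row:]
    return first_image, second_image


def is_continuous(first_image, second_image, frame):
    """The stitch is continuous if re-joining the two reproduces the frame."""
    return second_image + first_image == frame
```

A real system would additionally warp each part to its projection geometry; this sketch only shows why a shared cut line yields a seamless boundary.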
In an optional implementation manner of the first aspect, the virtual contour line forming a perspective structure of the vehicle head cover within the first area includes:
the virtual contour lines comprising first virtual contour lines that coincide with the upper surface of the vehicle head cover and second virtual contour lines that reflect the outline of the front surface of the vehicle head cover.
According to the above description, this alternative embodiment can blend the virtual contour lines with the real outline of the vehicle head cover and present, through the perspective structure, the otherwise hidden front surface of the vehicle head cover, thereby giving the user the realistic feeling of seeing through the vehicle head cover.
In an optional implementation manner of the first aspect, the first image being the content captured by the vehicle-integrated camera corresponding to the first area includes:
acquiring, through the vehicle-integrated camera, a captured image of the scene in front of the vehicle head cover;
and converting the image of the scene in front of the vehicle head cover into the first image according to the position of the first area.
According to the above description, this optional embodiment may acquire, with a front camera or the like, the live view in front of the vehicle that is occluded by the vehicle head cover, and then perform conversion processing on the acquired image so that it matches the view as framed from the first area.
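As a hedged sketch of the conversion step: a real system would typically apply a calibrated perspective (homography) warp from the camera view to the driver's viewpoint, but the idea can be illustrated with a much simpler axis-aligned crop-and-resample of the occluded region onto the first-area resolution. All parameter names and the nearest-neighbor sampling are illustrative assumptions, not the disclosed method.

```python
def crop_and_resample(frame, roi, out_w, out_h):
    """frame: 2D list of pixels; roi: (x0, y0, x1, y1) camera-frame region
    occluded by the vehicle head cover. Returns an out_h x out_w image
    sized for the first area (nearest-neighbor sampling)."""
    x0, y0, x1, y1 = roi
    out = []
    for j in range(out_h):
        src_y = y0 + (y1 - y0) * j // out_h
        row = []
        for i in range(out_w):
            src_x = x0 + (x1 - x0) * i // out_w
            row.append(frame[src_y][src_x])
        out.append(row)
    return out
```

In practice the rectangle would be replaced by a quadrilateral-to-quadrilateral warp so that the converted image lines up with the virtual contour lines.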
In an optional implementation manner of the first aspect, the displaying the first image in registration with the virtual contour line according to the relative positional relationship between the occluded real scene and the vehicle head cover includes:
adjusting the registration between the first image and the virtual contour line according to the driver's viewpoint position.
According to the above description, this optional embodiment adapts to the parallax at different eye positions by adjusting the positional registration between the two, providing the user with a more realistic viewing effect and making it easier to judge the distance ahead from the dynamic parallax.
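The parallax compensation can be sketched with similar triangles, under the assumption (illustrative, not stated in the disclosure) that the virtual image plane lies at the vehicle head cover and the occluded scene lies farther away: when the eye moves laterally by e, the image must shift so that the same scene point stays under the same image pixel.

```python
def parallax_shift(eye_offset, d_image, d_scene):
    """Lateral shift to apply to the first image when the eye moves by
    eye_offset, so a scene point stays registered under the same pixel.

    d_image: eye-to-virtual-image-plane distance (at the head cover);
    d_scene: eye-to-occluded-scene distance (d_scene > d_image).
    Derivation: the sight line from the shifted eye to a fixed scene point
    crosses the image plane shifted by eye_offset * (d_scene - d_image) / d_scene.
    """
    return eye_offset * (d_scene - d_image) / d_scene
```

Re-evaluating this shift as the in-cabin camera tracks the eye position yields the dynamic registration described above.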
In an optional implementation manner of the first aspect, the display method further includes:
the HUD display device projects a third area located directly above a front wheel of the vehicle;
The third region is configured to display a state of the front wheels of the vehicle.
In an alternative embodiment of the first aspect, the front wheel of the vehicle is the left front wheel of the vehicle.
In an optional implementation manner of the first aspect, the third area being configured to display the state of the front wheel of the vehicle includes:
in response to the vehicle triggering a turn, projecting and displaying the state of the front wheel of the vehicle in the third area.
According to the above description, this optional implementation assists the driver in acquiring the actual state of the front wheel of the vehicle, enriches the information prompts, and supports on-demand display.
In an optional implementation manner of the first aspect, the third area being configured to display the state of the front wheel of the vehicle includes:
displaying a steering schematic model of the front wheel of the vehicle and a third image of the real scene beside the front wheel, wherein the third image is the environment adjacent to the front wheel captured by the vehicle-integrated camera.
In an optional implementation manner of the first aspect, the displaying the steering schematic model of the front wheel of the vehicle and the third image of the real scene beside the front wheel includes:
displaying, in the third image, a second distance mark between the steering schematic model and the real scene, wherein the second distance mark is used for prompting an actual distance value.
In an optional implementation manner of the first aspect, the displaying the steering schematic model of the front wheel of the vehicle and the third image of the real scene beside the front wheel includes:
displaying the third image in registration with the steering schematic model according to the relative positional relationship between the real scene beside the front wheel and the front wheel of the vehicle.
According to the above description, these optional embodiments allow the driver to know more intuitively the relative distance between the front wheel of the vehicle and surrounding objects, so as to prevent the wheels from scraping against them.
In an optional implementation manner of the first aspect, the display method further includes:
simulating the driving track of the vehicle according to the steering angle of the front wheels of the vehicle;
the HUD display device projects the driving track onto the road ahead.
In an optional implementation manner of the first aspect, the projecting, by the HUD display device, the driving track onto the road ahead includes:
the HUD display device projecting a fourth area having a fitting relationship with the road ahead, the driving track being configured in the fourth area.
In an optional implementation manner of the first aspect, the projecting, by the HUD display device, the driving track onto the road ahead includes:
marking, on the driving track and according to the driving speed of the vehicle, the position of the front wheels of the vehicle after a preset time.
According to the above description, this alternative embodiment prompts the driver with the path the vehicle will pass through by projecting the driving track fitted onto the road, and further marks the position the front wheels of the vehicle may roll over.
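A common way to simulate a driving track from the front-wheel steering angle is a kinematic bicycle model; the sketch below (illustrative only, with an assumed wheelbase and integration step, none of which are specified in the disclosure) produces the track and the marked front-wheel position after a preset time at the current speed.

```python
import math


def simulate_track(steer_rad, speed_mps, duration_s, wheelbase_m=2.7, dt=0.1):
    """Kinematic bicycle model: return (x, y, heading) rear-axle poses
    sampled every dt seconds. Wheelbase and dt are assumed values."""
    x = y = heading = 0.0
    poses = [(x, y, heading)]
    for _ in range(round(duration_s / dt)):
        x += speed_mps * math.cos(heading) * dt
        y += speed_mps * math.sin(heading) * dt
        heading += speed_mps * math.tan(steer_rad) / wheelbase_m * dt
        poses.append((x, y, heading))
    return poses


def front_wheel_after(steer_rad, speed_mps, preset_s, wheelbase_m=2.7):
    """Front-axle position after preset_s: the point marked on the track."""
    x, y, heading = simulate_track(steer_rad, speed_mps, preset_s,
                                   wheelbase_m)[-1]
    return (x + wheelbase_m * math.cos(heading),
            y + wheelbase_m * math.sin(heading))
```

The HUD would then project `simulate_track(...)` fitted onto the road in the fourth area and highlight `front_wheel_after(...)` as the marked position.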
In an optional implementation manner of the first aspect, the display method further includes:
the HUD display device projects a fifth area configured to display a driving-assist reference point marker at a specific location of the vehicle head cover.
In an optional implementation manner of the first aspect, the fifth area being configured to display a reference point mark for assisting driving at a specific position of the vehicle head cover includes:
the reference point mark is highlighted according to the traveling intention of the vehicle.
In an alternative embodiment of the first aspect, the first region is the same as the fifth region.
In an optional implementation manner of the first aspect, the distance between the vehicle head and a specific object is displayed in the second area.
According to the above description, this optional embodiment may project, directly on the vehicle head cover, point positions that assist in judging the relationship between the vehicle and the road, guiding the driver's eyes to those points to observe the situation of the vehicle and the road.
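As a purely illustrative sketch of highlighting the reference point mark according to the travel intention: the point names and the intent-to-point mapping below are hypothetical assumptions, since the disclosure only states that highlighting follows the vehicle's travel intention.

```python
# Assumed reference point marks projected at specific head-cover positions.
REFERENCE_POINTS = ("left_corner", "center_line", "right_corner")


def highlighted_points(intent):
    """intent: 'left_turn', 'right_turn' or 'straight' (assumed labels).
    Returns the reference point marks to highlight for that intent."""
    mapping = {
        "left_turn": ("left_corner",),
        "right_turn": ("right_corner",),
        "straight": ("center_line",),
    }
    return mapping.get(intent, ())
```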
In a second aspect, the present application provides a display device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the display method of the first aspect when executing the computer program.
In a third aspect, the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the display method of the first aspect.
In a fourth aspect, the present application provides a vehicle comprising the display device of the second aspect or the computer readable storage medium of the third aspect.
Compared with the prior art, the projection area of the HUD display device of the application at least partially overlaps the vehicle head cover to realize auxiliary augmented display. By projecting, onto the physical vehicle head cover, a virtual contour line reflecting its outline together with a front image captured by the vehicle-integrated camera, the environment ahead that is occluded by the vehicle head cover is displayed intuitively, giving the driver the experience of viewing the surroundings of the vehicle through the vehicle head cover. The application can thus enrich the auxiliary augmented-display functions of the HUD display device, help drivers accurately judge the surrounding environment of the vehicle, and avoid unnecessary scraping of the vehicle.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings; however, the descriptions are only examples of the present application and do not limit it, and variations in structure, method, function, and the like made according to these examples are included in the protection scope of the present application.
It should be noted that in different examples, the same reference numerals or labels may be used, but these do not represent absolute relationships in terms of structure or function. Also, the references to "first," "second," etc. in the examples are for descriptive convenience only and do not represent absolute distinguishing relationships between structures or functions, nor should they be construed as indicating or implying a relative importance or number of corresponding objects. Unless specifically stated otherwise, reference to "at least one" in the description may refer to one or more than one, and "a plurality" refers to two or more than two.
In addition, when expressing a feature, the character "/" may represent an "or" relationship between the associated objects before and after it; for example, "head-up display/heads-up display" may be expressed as a head-up display or a heads-up display. When expressing an operation, the character "/" may indicate a division relationship between the associated objects before and after it; for example, the magnification M = L/P may be expressed as L (virtual image size) divided by P (image source size). Likewise, "and/or" in different examples merely describes the association relationship of the associated objects before and after it, and such a relationship may include three cases; for example, "a concave mirror and/or a convex mirror" may be expressed as a concave mirror alone, a convex mirror alone, or both a concave mirror and a convex mirror.
HUD projection display mainly uses the principle of optical reflection to reflect the imaging light to be displayed, via a transparent surface, into the viewer's eyes; the eyes then view the virtual image information along the reverse direction of the light. Correspondingly, the transparent surface may be the windshield of a vehicle, with the windshield serving as a display screen to present the vehicle's navigation guidance, driving speed, and the like. As shown in fig. 1, the HUD display device may at least include an optical engine 1, a first mirror 2, a second mirror 3, and the like. The optical engine 1 includes a backlight source and an image source (not shown). The backlight source provides illumination light and adjusts its brightness under control; for example, the backlight source may be an LED (Light Emitting Diode), a laser, or the like. Under the illumination light provided by the backlight source, the image source adjusts the corresponding display content under control and projects display light from its surface; for example, the image source may be an LCD (Liquid Crystal Display), a DMD (Digital Micromirror Device), a MEMS (Micro-Electro-Mechanical System) micromirror, an LCOS (Liquid Crystal on Silicon), or the like. The first mirror 2 and the second mirror 3 project the display light emitted by the optical engine 1 onto the windshield 4, so that the optical path can be customized within a smaller space to meet different projection display requirements; the first mirror 2 and the second mirror 3 may be configured as concave mirrors, convex mirrors, concave lenses, convex lenses, and the like according to the optical design, and the surface of a lens may adopt a freeform surface.
Optionally, at least one of the first mirror 2 and the second mirror 3 may further be adjustable in angle to a certain extent, so as to change the projection position of the display light on the windshield 4 and thereby accommodate viewers of different heights. The display light of the optical engine 1 is finally reflected by the windshield 4 of the vehicle to form a virtual image 5; when observing the virtual image 5 through the windshield 4, the human eye 6 can perceive a certain sense of depth, as if viewing a real object at a specific distance beyond the windshield, and the virtual image 5 may be navigation guidance content, the driving speed of the vehicle, and the like. It should be added that the HUD display device may also be provided with a diffuser adapted to the characteristics of different image sources, and in some examples the HUD display device may further include Fresnel lenses, waveguide optics, diffractive optical elements, holographic optical elements, tapered fibers, and the like.
As described above, the HUD display device may be integrated in a vehicle to display virtual image information at a position ahead of and beyond the windshield. As shown in fig. 2 and 3, the HUD display device 11 may be integrated in the center console 10 of the vehicle, opposite the driving position. The HUD display device 11 projects display light through its projection window onto the inner surface of the windshield 4 for reflection, correspondingly forming, beyond the windshield 4, a projection area 50 in which the virtual image appears; the information to be displayed is projected within the range of the projection area 50, and the position of the projection area 50 may also be adjusted as required so as to have a fitting relationship with different positions beyond the windshield. Meanwhile, as the design value of the FOV increases, the projection area 50 can cover more of the region in front of the windshield, and its freedom for augmented display is correspondingly enhanced. As shown in fig. 2, a W-HUD (Windshield HUD) mainly displays basic vehicle driving state information in the projection area 50, such as vehicle speed and gear; this display function is consistent with that of the dashboard, so conventional vehicle driving state information is presented directly in front of the driver's field of view, reducing how often the driver looks down for information. As shown in fig. 3, for an AR-HUD (Augmented Reality HUD), in addition to the vehicle driving state information, the display content in the projection area 50 may be combined with the real scene of the road ahead; for example, a specific indication mark is displayed fitted onto a specific road, making the navigation information easier to understand than conventional map navigation.
However, the display content in the projection area 50 in the above examples emphasizes the manner and position of information presentation; that is, the projected content is usually derived from the cabin equipment itself, no connection between the interior and exterior of the vehicle is established, and one-way output of information is emphasized, so it is difficult to assist the driver in obtaining more effective information from outside the vehicle. Taking as an example a vehicle making a right-angle turn in a narrow space or parking beside a road shoulder, the driver's line of sight is easily blocked by the vehicle head cover, and the relative positions of the vehicle body, the tires, and the like with respect to objects outside the vehicle cannot be judged accurately, so the vehicle is easily scraped. Correspondingly, auxiliary information needs to be projected at the corresponding outside positions to help the driver judge; such auxiliary information must be established based on the actual condition of the external environment, rather than presenting prepared content directly at designated positions.
In some examples, the projection capability of the HUD display device may be utilized to provide auxiliary augmented display of the external environment, chiefly of content that the driver cannot directly determine by judging the external environment alone. Further, based on the projection mechanism, the projection area of the HUD display device may be set at a specific position where auxiliary augmentation is required; in this example, auxiliary augmented display is applied to the vehicle head cover and its periphery. As shown in fig. 4, the vehicle head cover 41 is the hood at the front of the vehicle body, which for a fuel vehicle may also be called the engine hood. The front wheels of the vehicle are generally below the vehicle head cover and form a blind spot in the cockpit: the driver is completely unable to see the front wheels, and the rear wheels likewise may not be observable through the rear-view mirrors. At the same time, the vehicle head cover 41 also occludes part of the environment around the vehicle head, so that when the vehicle travels forward, the relative distance between the vehicle head and an object in front cannot be judged intuitively. Correspondingly, the relationship between the line of sight of the driver's eye 6 and the vehicle head cover determines that the majority of the visible range is the real scene in front of the vehicle, while the visible region 42 of the vehicle head cover can be viewed by the eye 6 through the lower portion of the windshield. As shown in fig. 5, the visible region 42 of the vehicle head cover is the part visible from the cabin looking outward; other parts of the vehicle head cover are normally invisible, yet the vehicle head cover occludes exactly the surrounding environment nearest the vehicle head, so the front wheels of the vehicle cannot be observed.
It should be noted that the size of the visible region 42 of the vehicle head cover is also related to the height of the human eye: when the driver raises the viewing height, the visible region 42 correspondingly increases. When the HUD display device is integrated in a vehicle, its projection area 50 can be adjusted toward the front lower part of the windshield and can therefore at least partially overlap the visible region 42 of the vehicle head cover. The visible region 42 can then serve as a background carrier for the projection display, and the information displayed in the overlapping part of the projection area 50 can be registered with the visible region 42, achieving the effect of fitting onto the vehicle head cover; correspondingly, the overlapping part of the projection area 50 can be adjusted according to the height of the human eye. Optionally, if the projection area 50 supports FOV expansion or adjustment of the projection position, auxiliary augmentation information may also be fitted to display positions at the outer edge of the visible region 42 of the vehicle head cover, increasing the richness of the real-scene augmented display, as described in detail below.
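The dependence of the visible region on eye height can be sketched with a single line-of-sight calculation; the geometry below is an illustrative simplification (a sight ray grazing the lower edge of the windshield opening and meeting the hood plane), with all dimensions as assumed placeholders.

```python
def nearest_visible_x(eye_h, edge_x, edge_h, hood_h):
    """Horizontal position where the sight ray from the eye (at height
    eye_h, x = 0) over the windshield's lower edge (edge_x, edge_h)
    meets the hood plane at height hood_h. A smaller result means the
    visible region 42 extends closer to the driver, i.e. is larger.
    Requires eye_h > edge_h > hood_h; purely illustrative geometry."""
    return edge_x * (eye_h - hood_h) / (eye_h - edge_h)
```

Raising the eye height shrinks this nearest-visible distance, which is why the overlapping part of the projection area 50 must be adjusted with the eye height.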
In some examples, as shown in fig. 6, the projection area 50 is divided into a first area 501 and a second area 502. The display range of the projection area 50 is generally rectangular according to the FOV setting, and there is no explicit dividing line between the first area 501 and the second area 502; the division is determined by the edge contour of the vehicle head cover superimposed beneath it, so the boundary between the first area 501 and the second area 502 is an arc. Correspondingly, the division between the first area 501 and the second area 502 is determined according to the projection position of the projection area 50: the first area 501 is disposed opposite to and overlapping the vehicle head cover, that is, the intersection of the visible region 42 and the projection area 50 in fig. 4 and 5, while the second area 502 is disposed opposite the real scene beyond the vehicle head cover, that is, the intersection of the road in front of the vehicle and the projection area 50. Thus, the information displayed in the first area 501 achieves the effect of being displayed fitted onto the vehicle head cover, while the information displayed in the second area 502 achieves the effect of being displayed fitted onto the road ahead. In some examples, the projection area 50 may display the first area 501 and the second area 502 as a whole; for example, conventional vehicle driving state information displayed in this mode may be shown uniformly in the middle or at the bottom of the projection area 50, which is particularly suitable for the normal driving state.
When auxiliary augmented display is required, the first area 501 and the second area 502 can be displayed with divided functions, which on the one hand improves the utilization of the projection area 50 and on the other hand allows the first area 501, displayed fitted onto the vehicle head cover, to present information that assists the driver in judging the environment around the vehicle body. Alternatively, the display distribution of the first area 501 and the second area 502 may be divided by obtaining the relative position of the vehicle head cover with a camera. In some examples, the projection display may be switched between the two states to meet different scene requirements, further enhancing the richness of the information display in the projection area 50. In some examples, the vehicle being in a steering maneuver may be determined specifically from sensing information such as the steering wheel; when the vehicle turns there is a higher risk of collision, especially when the distance to a vehicle ahead is very small in a narrow parking environment, so the projection area 50 is triggered to display the first area 501 and the second area 502 as independent distributions, with auxiliary augmented display information shown in the first area 501. Further, when the vehicle switches from the steering maneuver back to normal straight driving, the projection area 50 may switch back to the mode of unified display, that is, without distinguishing the first area 501 and the second area 502.
In some examples, image recognition is performed by a front camera; if analysis shows that an object ahead poses a collision risk, the auxiliary augmented display function is called up, that is, specific environment-augmentation information is displayed in the first area 501, which not only helps the driver make an accurate driving decision but also warns and prompts the driver. Further, when the camera recognizes that there is no obstacle nearby that may cause a collision, the projection area 50 switches back to the mode of unified display. In some examples, the driver's behavior may be analyzed and judged by an in-cabin camera: when the driver raises the viewing height of the eyes to a predetermined height, for example higher than the average height during normal driving, this indicates an intention to observe the situation ahead more clearly, and at this time auxiliary information showing the environment around the vehicle head cover is displayed in the first area 501, which can improve the driving experience and guide the driver to avoid obstacles correctly. Optionally, after the intention requiring auxiliary augmentation is determined from the driver's line-of-sight height, the corresponding auxiliary augmented display information is triggered in the first area 501 and kept for a preset duration, which reduces frequent switching of display modes in the projection area 50 while the driver's eye height keeps changing; if the intention requiring auxiliary augmentation is confirmed again within the preset duration, the timing is restarted, so that the driver can always see the specific content in the first area 501 whenever the auxiliary augmented display information is needed.
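The hold-for-a-preset-duration behavior is essentially a restartable timer; the following sketch is illustrative only (the hold duration and the class/field names are assumptions, and time is passed in as plain seconds to keep the logic testable).

```python
PRESET_HOLD_S = 5.0  # assumed preset duration


class AuxDisplayHold:
    """Keep the auxiliary augmented display visible for a preset duration
    after each trigger; re-triggering within the window restarts the timer."""

    def __init__(self, hold_s=PRESET_HOLD_S):
        self.hold_s = hold_s
        self.expires_at = None

    def trigger(self, now_s):
        """Intent detected: show (or keep showing) and restart the timer."""
        self.expires_at = now_s + self.hold_s

    def is_visible(self, now_s):
        return self.expires_at is not None and now_s < self.expires_at
```

Because each confirmation restarts the timer, small fluctuations in the driver's eye height no longer cause the first area 501 to flicker between display modes.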
As described above, when the projection area 50 is divided into the first area 501 and the second area 502 with independent display distributions, auxiliary augmented display for the vehicle head cover is performed in the first area 501. Alternatively, the projection area 50 may project entirely onto the visible region 42 of the vehicle head cover (refer to fig. 4 and 5), that is, the whole of the projection area 50 is the first area 501, which enlarges the space available for auxiliary augmented display on the vehicle head cover. The specific augmented auxiliary display information includes a virtual contour line reflecting the outline of the vehicle head cover and a first image reflecting the real scene occluded by the vehicle head cover. As shown in fig. 7, the virtual contour line is displayed fitted to designated positions on the vehicle head cover, its position being controlled according to the outline of the actual vehicle head cover. The virtual contour lines specifically include first virtual contour lines 511, second virtual contour lines 521, and third virtual contour lines 531, which together form a three-dimensional structure of at least part of the vehicle head cover. The first virtual contour lines 511 coincide with the upper surface of the vehicle head cover and outline the shape of its outwardly visible surface, so that the first virtual contour lines 511 blend smoothly with the real vehicle head cover; in this example there are three first virtual contour lines 511.
The second virtual contour lines 521 reflect the outline of the front surface of the hood, i.e., the forward-facing surface of the vehicle front. The actual front surface is not visible to a viewer inside the vehicle because the line of sight is blocked, and the second virtual contour lines 521 reproduce its outer shape, thereby forming a see-through structure; in this example there are also three second virtual contour lines 521. The third virtual contour lines 531 represent the structure of the inner space of the hood, which is blocked by the upper surface of the hood and cannot be seen during actual viewing; the third virtual contour lines 531 therefore give the viewer a three-dimensional impression of that inner space and, together with the other contour lines, form the see-through structure of the hood. In this example there are five third virtual contour lines 531. Optionally, the structure formed by the first virtual contour lines 511, the second virtual contour lines 521 and the third virtual contour lines 531 need not be a cuboid; it may be built into other shapes according to the viewer's viewing experience, and the shape may further be designed in combination with the size of the first area 501.
In this way, the cooperation of the first virtual contour lines 511, the second virtual contour lines 521 and the third virtual contour lines 531 converts the opaque hood into the visual effect of a see-through structure, so that the viewer appears to see the whole inner space through the upper surface of the hood; that is, the content displayed in the first area 501 achieves a virtual-real fitting effect with the visible area of the hood. In a preferred example, the virtual image distance of the virtual images (such as the virtual contour lines) displayed in the visible area of the hood is kept consistent with the position of the hood, reducing the sense of incongruity during viewing, so that the displayed see-through structure appears to be part of the hood itself. Because the virtual contour lines must correctly reflect the appearance of the hood, projection may be performed according to the structural data of the actual hood; for example, during installation and calibration the HUD display device may be configured with the model of the vehicle and the structural data of that model, and the virtual contour lines may be generated from the structural data and the projection position and projected at the designated positions of the visible area of the hood, so that they join up with the actual shape of the hood. Optionally, to further improve the realism of the fusion, the virtual contour lines adopt a color consistent with the hood or a brighter color; in particular, the corresponding color may be configured in advance according to the vehicle model.
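Generating contour lines "from the structural data and the projection position" amounts to projecting known 3-D hood points onto the virtual image plane as seen from the viewer's eye. A minimal sketch, assuming a simple pinhole model (real HUD calibration must additionally account for the windshield's curved-mirror optics):

```python
import numpy as np

def project_contour(points_hood, eye_pos, plane_z):
    """Project 3-D hood contour points onto a virtual image plane at depth
    plane_z, as seen from the viewer's eye position. Hypothetical model:
    a plain pinhole projection with the optical axis along +z."""
    pts = np.asarray(points_hood, dtype=float) - eye_pos  # eye-relative coords
    scale = plane_z / pts[:, 2]                           # similar triangles
    return pts[:, :2] * scale[:, None]                    # 2-D plane coords
```

Running this over the vertices of each stored hood edge yields the polyline to draw for that contour; keeping `plane_z` equal to the hood distance is what keeps the virtual image distance consistent with the hood, as the text requires.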
The fitted display of the virtual contour lines provides the viewer with accurate hood boundary information, so that the relative positional relationship between the vehicle front and the surrounding environment can be judged through the see-through structure. Accordingly, in fig. 7, a first virtual surface 541 and a second virtual surface 551, each bounded by virtual contour lines, simulate the hood being transparent and seen through to the surroundings: the first virtual surface 541 represents the invisible front surface of the hood, and the second virtual surface 551 represents the invisible bottom surface of the hood's inner space, these surfaces being closest to the surroundings. In this example, a real see-through effect is simulated by projecting onto the first virtual surface 541 and the second virtual surface 551 the images of the surroundings that would be visible from the front and bottom surfaces of the hood. The images displayed on the first virtual surface 541 and the second virtual surface 551 may be derived from the real-time capture of the vehicle's front camera; specifically, the front camera captures a forward image that includes the environment blocked by the hood, and the required first image is then determined according to the display requirements of the first virtual surface 541 and the second virtual surface 551.
Optionally, the first image obtained through image conversion is not only cropped to the specific range but also processed for lighting, viewing angle and the like; the converted first image is displayed on the first virtual surface 541 and the second virtual surface 551, reproducing the real scene blocked by the hood as it would appear seen through the hood from the viewer's perspective. As the vehicle moves, the front camera continuously acquires new footage, and the first image displayed on the first virtual surface 541 and the second virtual surface 551 is updated in real time, so that the viewer sees the real, current surroundings. In some examples, to give a real sense of depth between the first image and the virtual contour lines, their relative position is dynamically adjusted according to the dynamic parallax that follows changes in the line of sight; that is, the overlap position between the first image and the virtual contour lines shifts as the viewer's viewpoint moves. To further help the viewer judge the front-rear distance intuitively, a distance mark may be placed between the blocked real scene in the first image and the hood represented by the virtual contour lines; the distance mark may be a connecting line between the two, with the actual distance value displayed accurately on the line, so that the driver, as the viewer, can slowly control the vehicle's movement according to the precise value, and the distance value on the line is continuously updated in real time from the corresponding sensors, providing the driver with a live data reference.
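The dynamic-parallax adjustment has a simple geometric core: to make a point rendered on the virtual image plane appear fixed at a deeper world depth, its on-plane position must shift as the eye moves. A sketch of that shift, derived from similar triangles (the function name and units are illustrative):

```python
def parallax_shift(eye_offset_m, d_image_m, d_object_m):
    """Lateral shift on the virtual image plane needed to keep a point that
    should appear at depth d_object_m fixed in the world when the viewer's
    eye moves sideways by eye_offset_m. The ray eye -> object crosses the
    image plane at eye + (object_x - eye) * d_image / d_object, so the
    rendered point must move by eye_offset * (1 - d_image / d_object)."""
    return eye_offset_m * (1.0 - d_image_m / d_object_m)
```

Content meant to lie exactly on the image plane (`d_object == d_image`) gets zero shift, while deeper scenery shifts with the eye, which is precisely the depth cue the text attributes to the moving overlap between the first image and the contour lines.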
As shown in fig. 8, after the virtual contour lines and the first image are projected in the first area 501, a driver in the vehicle observing a pedestrian in front can see not only the pedestrian's unblocked upper half 71 above the hood but also part of the pedestrian's lower half 72 through the projected first image, and can judge the front-rear distance between the vehicle and the pedestrian from the fitting relationship between that part of the lower half 72 and the virtual contour lines, for example from the change of the overlap position under dynamic parallax, thereby judging the likelihood of the vehicle hitting the pedestrian. In some examples, a second image reflecting the environment beside the real scene blocked by the hood, that is, the real scene visible to the human eye above the hood, may also be displayed in the second area 502; the second image may likewise be captured by the vehicle's front camera and cropped according to the position corresponding to the second area 502 before being displayed there. Referring to figs. 6-8, the first area 501 and the second area 502 are adjacent and close together, that is, the second area 502 faces the outer edge of the hood; through precise control of the projection position, the projected second image can be overlaid exactly on the real scene of the corresponding content, so that it fuses seamlessly with the surroundings.
More importantly, the second image can be spliced continuously with the first image, maintaining consistent lighting and viewing angle between the two, so that a smooth transition is formed between the real scene and the virtual image; that is, the situation in the area adjacent to the hood is reproduced continuously, the interference to the driver from an uncoordinated projection is reduced, and the clarity and viewing experience with which the driver observes the environment around the vehicle front are improved.
In some examples, such as during parallel parking, it is important for the driver to judge accurately the steering angle of the wheels and their positional relationship with obstacles such as the surrounding road shoulder. In this example, the areas to the lower left and lower right of the hood, in front of the windshield, are reproduced by projection display to help the driver view the conditions on both sides of the vehicle. Specifically, the projection area corresponding to the HUD display device further includes a third area corresponding to the position of a front wheel of the vehicle. As shown in fig. 9, taking the left front wheel as an example: since the left front wheel is located at the lower left of the hood, it is not in the line of sight of the viewer in the vehicle, so the simulated wheel state cannot be reproduced directly by projection on the hood; accordingly, the third area 503 may be configured and projected directly above the left front wheel, so that the driver can roughly determine the wheel's position on the road by judging straight downward from it. Optionally, the state of the actual wheel is represented in the third area 503 by a steering indication model 513, whose displayed structural dimensions and wheel appearance may be determined by the vehicle model so as to match the look of the actual wheel. Meanwhile, the steering indication model 513 adjusts its displayed angle according to the actual steering of the vehicle, which may specifically be determined from the steering wheel control data.
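Deriving the model's displayed angle "from the steering wheel control data" typically means dividing the steering-wheel angle by the vehicle's steering ratio. A minimal sketch, assuming a fixed ratio (real vehicles often have a model-specific, sometimes variable, ratio):

```python
def wheel_angle_deg(steering_wheel_deg, steering_ratio=15.0):
    """Map a steering-wheel angle to the front-wheel angle shown by the
    steering indication model. steering_ratio=15.0 is an assumed value;
    the actual ratio would come from the configured vehicle model."""
    return steering_wheel_deg / steering_ratio
```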
Optionally, a guide arrow 533 is projected in the direction from the steering indication model 513 toward the actual wheel; the guide arrow 533 can assist the driver in judging the position of the actual wheel. Further, to show the environment around the left front wheel more intuitively, a third image 523 is also projected beside the steering indication model 513; similarly, the third image 523 may be obtained by capturing the environment near the front wheel with a camera integrated in the vehicle and updated in real time. Through this coordinated display, the viewer can understand the distance between the actual wheel and adjacent obstacles such as the road shoulder. Optionally, the relative position between the steering indication model 513 and the obstacles in the third image 523 may be changed according to the dynamic parallax determined by the driver's viewpoint position, providing the viewer with an intuitive, real sense of distance. Optionally, a distance mark may also be set directly between the steering indication model 513 and an obstacle in the third image 523; the mark may be a line between the two with the actual distance displayed accurately on it, so that when the wheel approaches the obstacle the driver can notice in time, perceive the distance precisely, change the movement track, and avoid scraping the wheel against the obstacle.
In some examples, the content in the third area 503 is not displayed constantly, because constant display would impose an unnecessary information burden on the driver. Instead, the corresponding steering indication model 513 and third image 523 may be triggered only in response to the vehicle steering, and the projection content in the third area 503 is automatically hidden when the vehicle is in a normal driving state such as driving straight, so as to reduce interference with the driver.
In some examples, as shown in fig. 10, to provide the driver with a further multi-dimensional auxiliary enhanced display function, a driving track 514, based on the steering angle of the front wheels and the current vehicle speed, is displayed fitted to the road ahead. Specifically, the driving track is displayed through a projected fourth area 504, which, according to the projection configuration, fits well onto the road at a specific distance ahead; that is, the virtual image distance of the focal plane of the fourth area 504 is consistent with that specific road distance. When the vehicle drives straight, the driving track 514 fits the road ahead along a straight path; when the vehicle turns, it fits the road along a curved path; and when the projection range of the fourth area 504 is large enough, the driving track 514 can extend continuously from the edge of the hood out onto the road ahead, enhancing the completeness of the vehicle-path prediction display. The driver can judge from the driving track 514 where the vehicle is likely to pass and, if an obstacle on the road ahead would affect the tires, can be assisted to avoid it in time.
Optionally, a predicted position of a vehicle front wheel 524 is also marked on the driving track 514. The marked vehicle front wheel 524 may be a black silhouette matching the actual contact-patch size of the wheel, and the predicted position may be the position the front wheel will roll over after a predetermined time, determined from the front-wheel steering angle and the current vehicle speed; for example, the vehicle front wheel 524 displayed in the fourth area 504 fits exactly onto a specific position on the road ahead, that position being where the tire will pass 3 seconds later, so that the driver can judge more accurately where the wheel will travel.
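Predicting where the wheel will be "after a predetermined time, determined from the front-wheel steering angle and the current speed" can be sketched with a kinematic bicycle model. This is an assumed simplification (constant speed and steering, no tire slip), not the source's stated algorithm:

```python
import math

def predict_wheel_position(speed_mps, wheel_angle_rad, wheelbase_m, t_s):
    """Predict a front wheel's position after t_s seconds using a kinematic
    bicycle model. Returns (x, y) in the vehicle frame: x forward, y to the
    left, origin at the rear axle. Assumes constant speed and steering."""
    if abs(wheel_angle_rad) < 1e-6:                     # straight driving
        return speed_mps * t_s + wheelbase_m, 0.0
    radius = wheelbase_m / math.tan(wheel_angle_rad)    # rear-axle turn radius
    heading = speed_mps * t_s / radius                  # heading change after t_s
    # Rear-axle position on the arc, then offset forward to the front axle.
    x = radius * math.sin(heading)
    y = radius * (1.0 - math.cos(heading))
    return x + wheelbase_m * math.cos(heading), y + wheelbase_m * math.sin(heading)
```

With `t_s = 3.0`, the returned point is where the fourth area 504 would fit the wheel silhouette 524 onto the road; the same arc, sampled at many values of `t_s`, traces the driving track 514.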
In some examples, as shown in fig. 11, the projection area corresponding to the HUD display device further includes a fifth area 505 fitted over the hood; optionally, the fifth area 505 may at least partially overlap the first area or fall entirely within the same range. Projected in the fifth area 505 are reference point marks displayed at specific positions on the hood, positions that serve as aiming points for estimating the distance between the vehicle and the roadside, so that the driver can line up eye, reference point and road object along a single sight line. In further examples, reference point marks may be displayed at five positions on the hood: the left end, the left one-third position, the center, the right one-third position, and the right end. With these marks, a driver can more easily estimate the distance between the vehicle and the curb, especially in demanding driving scenarios such as hill parking, right-angle turns and parallel parking, helping the driver bring the vehicle to an accurate target position, for example 30 cm from the curb when parking on a hill. In the example of fig. 11, a first reference point mark 515, a second reference point mark 525 and a third reference point mark 535 are configured in the fifth area 505; these marks serve as aiming aids for different road positions in different driving scenarios. For example, aiming with the second reference point mark 525 can assist in estimating where the right front wheel will pass, so that the driver can aim at different positions according to different needs.
Further, the HUD display device may highlight a specific reference point mark according to the driving scenario. For example, when it is determined that the vehicle is driving straight and the driver intends to observe the right front wheel, the color of the second reference point mark 525 may be changed so that it differs from the first reference point mark 515 and the third reference point mark 535; for example, all reference point marks are black and the second reference point mark 525 is switched to red. Optionally, a prompt message for the reference point mark may also be displayed in the projection area; it may be displayed in the second area of the above example or directly in the fifth area, such as a text label combined with an arrow pointing at the highlighted reference point mark. In some examples, the reference point marks in the fifth area 505 may also be displayed on demand, for example projecting only the highlighted reference point mark fitted on the hood while the others are not projected, reducing the driver's information burden. In some examples, the distance between the vehicle and the aimed-at object may also be displayed in the second area in conjunction with the highlighted reference point mark; alternatively, the corresponding value may be displayed above the edge of the hood, improving the aesthetics of the layout.
The auxiliary enhanced display information in the above examples may be shown in multiple combinations at the same time as needed, or shown individually when a trigger condition is met. Accordingly, how much of the auxiliary enhanced display can be shown at once depends on the size of the projection area; further, the various pieces of auxiliary enhanced display information in the examples may be projected on the same focal plane, or on focal planes with different virtual image distances, according to the capability of the optical hardware and the distribution of the positions actually fitted.
When the HUD display device implementing the display method of these examples is applied to a vehicle, auxiliary enhanced display can be performed for the hood and the surrounding environment, so that while driving the driver can see environmental information that cannot be seen directly and thus make accurate driving decisions. As shown in fig. 12, the HUD display device integrated in the vehicle may be powered by, and receive data from, the vehicle machine 92, or may be powered and generate data by itself. The HUD display device may specifically include a processor 91, an Ethernet interface 901, a CAN (Controller Area Network) interface 902, a power management module 903, a running memory 904, a storage memory 905, a temperature detection unit 906, a motor 907, a backlight 908, an image source 909, a positioning module 910, a radar 911, a camera 912, and the like. It should be noted that the modules listed in fig. 12 are merely exemplary and not limiting; in some examples, the HUD display device may also include other modules. In addition, in different examples the modules described above may be implemented across one or more pieces of hardware, or a single module may be implemented by a combination of several pieces of hardware.
The processor 91 serves as the control center of the HUD display device and includes one or more processing units of any type, including but not limited to a microcontroller unit, a microcontroller, a DSP (Digital Signal Processor), or any combination thereof. The processor 91 is configured to generate operation control signals according to a computer program, control the other modules, and cooperate with the corresponding modules to process acquired or local data, instructions, and the like.
The Ethernet interface 901 is a network data connection port for local area network communication, defined by a series of hardware and software standards; multiple electronic devices may be connected together over Ethernet. In this example, the processor 91 may interact with the vehicle machine 92 through the Ethernet interface 901, such as sending data to the vehicle machine 92 or receiving data sent by it.
The CAN interface 902 is a network data connection port of the controller area network, which provides a standard bus for vehicle control systems and embedded industrial control and enables communication between control nodes. In this example, the processor 91 may also exchange information with the vehicle machine 92 through the CAN interface 902, and optionally may connect to other external devices through it. In some examples, the processor 91 may also be provided with a GPIO (General-Purpose Input/Output) interface to improve compatibility with peripheral connections.
The power management module 903 is connected to the vehicle machine 92; it receives the power supplied by the vehicle machine 92 and provides a regulated supply to each module of the HUD display device, ensuring that the processor 91 and the other modules operate at normal voltage and avoiding damage from overvoltage.
The running memory 904 stores the computer program executed by the processor 91, temporary operation data, data exchanged with the storage memory, and the like; the running memory 904 may be a memory such as an SDRAM (Synchronous Dynamic Random-Access Memory).
The storage memory 905 stores resources such as the display content of the HUD display device and long-term programs and data; the storage memory 905 may be a memory such as Flash memory. In some examples, the processor 91 may also provide an interface for accessing external memory.
The temperature detection unit 906 detects the temperature inside the HUD display device and may specifically include several temperature sensors. Since a temperature sensor's resistance varies with temperature, the processor 91 can, at a fixed supply voltage, determine each sensor's resistance at the current temperature from the voltage division between the sensor and a divider resistor, and thereby derive the temperature at the sensor's location. In some examples, the processor 91 may control the temperature sensors through the GPIO interface; the sensors may be placed at different positions inside the HUD display device, and the processor 91 may read back their temperature values one at a time in a time-shared manner.
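The divider read-back described above can be sketched as follows; the divider topology and the NTC Beta-equation constants (`r0`, `t0_c`, `beta`) are assumptions for illustration, not values from the source, and real firmware would use the sensor datasheet's constants:

```python
import math

def thermistor_temp_c(v_adc, v_cc=3.3, r_div=10_000.0,
                      r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Temperature from an NTC thermistor on the low side of a voltage
    divider (r_div to v_cc, ADC across the thermistor)."""
    r_t = r_div * v_adc / (v_cc - v_adc)             # resistance from divider ratio
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_t / r0) / beta   # Beta model of the NTC
    return 1.0 / inv_t - 273.15
```

At the divider midpoint the thermistor equals its nominal resistance, so the function returns the nominal temperature, which is a convenient sanity check for the wiring assumption.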
The motor 907 drives the optical lens in the HUD display device to rotate under the control of the processor 91, thereby changing the corresponding light path; for example, when sunlight entering the optics in reverse causes the image source surface to heat up, the motor can turn the lens so that external sunlight cannot reach the image source surface. In some examples, the processor 91 may also use the motor 907 to drive a fan on the HUD display device, accelerating the exchange of outside air inside the device for heat dissipation. Specifically, the motor 907 is connected to the processor 91 through a motor driver chip, which provides high-performance power output for the motor 907 and may also communicate with the processor 91 through an interface such as SPI (Serial Peripheral Interface).
The backlight 908 provides illumination light and adjusts its brightness under the control of the processor 91, thereby adjusting the projection display brightness of the whole HUD display device. The backlight 908 and the image source 909 together implement the main projection display function of the optical engine; the backlight 908 may be an LED (Light-Emitting Diode), a laser, or the like. Specifically, the backlight 908 is connected to the processor 91 through a backlight driver chip, which supplies the driving voltage for the backlight 908 and controls its brightness according to the pulse-width signal output by the processor 91.
The image source 909 displays an image of the corresponding content and emits the display light for that image under the control of the processor 91; the image source 909 may be an LCD (Liquid Crystal Display), a DMD (Digital Micromirror Device), a MEMS (Micro-Electro-Mechanical System) micromirror, an LCoS (Liquid Crystal on Silicon) device, or the like.
The positioning module 910 monitors the position of the HUD display device and the corresponding vehicle; it may use a global navigation satellite system such as GPS (Global Positioning System) or the BeiDou Navigation Satellite System, determining position and orientation by measuring the distances between satellites and the receiver of the positioning module 910. In some examples, the positioning module 910 may also include an inertial navigation system based on Newton's laws of mechanics, which integrates the module's acceleration in the inertial reference frame over time and transforms it into the navigation coordinate system to obtain data such as speed, yaw angle, and position in that system. Optionally, the inertial navigation system may assist the global navigation satellite system in achieving a more accurate position fix, providing the corresponding position information to the processor 91.
The radar 911 determines the position of a target object by means of electromagnetic waves, and can generally determine the distance between the target object and the vehicle in which the radar 911 is located.
The camera 912 includes a body camera and an in-vehicle camera. The body camera determines the position of a target object through visual recognition; it may be monocular or binocular, the principal difference being that a binocular camera captures images from two different viewpoints and can therefore recover distance information in three-dimensional space. The in-vehicle camera recognizes the behavior of the driver and passengers, including fatigue detection, distraction detection, expression recognition, gesture recognition and gaze tracking; in the examples here, the in-vehicle camera may in particular implement eye tracking.
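The distance recovery that distinguishes the binocular camera follows the standard rectified-stereo relation Z = f·B/d. A minimal sketch (the example numbers in the test are illustrative, not parameters of any specific camera):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified binocular pair: Z = f * B / d,
    where f is the focal length in pixels, B the baseline between the two
    cameras, and d the horizontal disparity of the point between views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

A monocular camera has no baseline, so this relation is unavailable to it, which is why the text singles out the binocular camera for three-dimensional distance information.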
In some examples, the positioning module 910, the radar 911 and the camera 912 may instead be connected directly to the vehicle machine 92 rather than to the processor 91 of the HUD display device; for example, the vehicle machine 92 may itself integrate a positioning module for position tracking and a radar and camera for automated driving, and the HUD display device may then acquire the data collected by the positioning module, radar and camera in real time through its communication with the vehicle machine 92.
In some examples, as shown in fig. 13, the HUD display device implementing the above display method may specifically include a processor 931, a memory 932, an input device 933 and an output device 934. The input device 933 may include operation keys integrated on the display device, through which the display device can receive input control instructions and data. The output device 934 may include an image source or the like integrated on the display device, to which corresponding instructions or data can be output. Further, the memory 932 stores a computer program that runs on the processor 931, and the display method of the above examples is implemented when the processor 931 executes that program. In some examples, a computer-readable storage medium stores a computer program that, when executed by a processor, implements the display method of the above examples.
Referring to figs. 2-5 and 8-11, a vehicle may be provided with the above HUD display device; specifically, the HUD display device 11 is integrated inside the center console 10, such as in front of the steering wheel. The corresponding display light is projected through the projection window of the HUD display device 11 onto the windshield 4 opposite it, so that a viewer observing the windshield 4 from the cockpit directly sees the corresponding virtual image. Specifically, the projection area 50 on the windshield 4 includes display elements such as basic display information (vehicle speed, gear, and the like) and extended display information (navigation information and the like); more importantly, the corresponding auxiliary enhanced display information can be displayed fitted onto the hood, reflecting the vehicle's surroundings in real time, particularly the part blocked by the hood, to help the driver avoid possible obstacles. In some examples, the vehicle may also obtain the program implementing the display method of the above examples through distribution on the above computer-readable storage medium, so that the on-board HUD display device can be updated conveniently. The vehicle is not limited to a car: it may also be a bus, truck, excavator, motorcycle, train, high-speed train, ship, yacht, airplane, spacecraft, or the like. Likewise, the surface projected onto is not limited to the front windshield of a car and may be a transparent surface elsewhere.
In connection with the above examples, the present application may be embodied directly in hardware, in a software module executed by a control unit, or in a combination of the two; that is, one or more steps and/or combinations of steps of the computer program flow may correspond to hardware such as an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any suitable combination thereof. For convenience, the description above divides the functionality into modules; of course, when implementing the present application, the functions of the modules may be realized in one or more pieces of software and/or hardware.
From the above description of examples, it will be apparent to those skilled in the art that the present application may be implemented by software plus a necessary general-purpose hardware platform. On this understanding, the technical solutions of the application, in essence or in the part contributing over the prior art, may be embodied in the form of a software product. The software is executed by one or more processing units of any type, including but not limited to a microcontroller unit, a microcontroller, a DSP (Digital Signal Processor), or any combination thereof, depending on the desired configuration. The software is stored in a memory, such as a volatile memory (e.g., random-access memory), a non-volatile memory (e.g., read-only memory or flash memory), or any combination thereof.
In summary, the projection area of the HUD display device is at least partially overlaid on the hood to realize auxiliary enhanced display: by projecting, onto the physical hood, the virtual contour lines reflecting the hood's appearance and the forward image captured by the vehicle-integrated camera, the environment ahead that the hood blocks is displayed intuitively, giving the driver the experience of viewing the vehicle's surroundings through the hood. The application thus enriches the auxiliary enhanced display functions of the HUD display device, helps drivers judge the vehicle's surroundings accurately, and avoids unnecessary scraping of the vehicle.
It should be understood that although this specification is described in terms of examples, an example does not correspond to only a single embodiment; this manner of description is adopted for clarity only. Those skilled in the art should treat the specification as a whole, and the technical solutions in the examples may be combined appropriately to form other embodiments understandable to those skilled in the art.
The detailed descriptions above are merely illustrations of feasible embodiments of the present application and are not intended to limit its scope of protection; all equivalent embodiments or modifications that do not depart from the technical spirit of the present application shall fall within its scope of protection.