CN113119862B - Head-up display device for driving assistance - Google Patents


Info

Publication number: CN113119862B
Application number: CN202010039954.9A
Authority: CN (China)
Prior art keywords: information, warning, prompt, reflecting, displayed
Other versions: CN113119862A (Chinese)
Inventors: 徐俊峰, 方涛, 吴慧军
Assignee (current and original): Futurus Technology Co Ltd
Legal status: Active
Legal events: application filed by Futurus Technology Co Ltd; priority to CN202010039954.9A; publication of CN113119862A; application granted; publication of CN113119862B


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features


Abstract

The invention provides a head-up display device for driving assistance, comprising: a projection image source, a light control device and a driving-assistance controller. The light control device is arranged on one side of a reflecting device and comprises a main optical axis control element and a dispersing element. The driving-assistance controller is connected with the projection image source and is used for determining the prompt content to be displayed and controlling the projection image source to display that content. The head-up display device provided by the embodiments of the invention can converge imaging light rays with different incidence angles into the same observation range and disperse them across the eye-box range, so that the brightness of the imaging light is improved while the imaging range is guaranteed. The light control device can also be arranged at large scale, forming a large imaging area on the surface of the reflecting device and thereby achieving large-scale imaging: the driver can view the prompt content over a larger portion of the reflecting device's surface, which improves the display effect of the reflecting device.

Description

Head-up display device for driving assistance
Technical Field
The invention relates to the technical field of safe driving, and in particular to a head-up display device for driving assistance.
Background
In recent years, with the continuous development of technologies such as automobile intelligence, the internet of vehicles and automatic driving, the information received by mobile vehicle-mounted terminals and the applications built upon it have multiplied. There is a growing demand to link all of the display screens in a car so that various information can be displayed flexibly, but the driver's sight is easily drawn away from the road when performing the related operations, which poses a potential safety risk.
Head-up display (HUD) technology avoids the distraction caused by a driver looking down at the instrument panel or other display screens while driving, improves the margin of driving safety and provides a better driving experience; it has therefore received increasing attention in recent years and has huge application potential for in-vehicle intelligent display. However, most current HUD designs are based on free-form mirrors: light from the image source is reflected by a plane mirror and then a free-form mirror, reaches the windshield, and enters the human eye after a further reflection. The resulting field of view (FOV) tends to be small, typically within 10 degrees, so the displayed HUD image is small; and because the display frame is small, displayed animations cannot be registered against the actual environment, making an augmented-reality effect difficult to achieve. The limited display area of current HUDs yields a poor display effect, makes it difficult to present information well to the driver, and in turn makes it difficult to support further applications.
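As a rough illustration of why a small FOV limits the displayed image size, the width of the virtual image subtended by a given horizontal FOV can be estimated from simple geometry. This is a sketch only; the 2.5 m virtual-image distance is an assumption for illustration and does not appear in the text:

```python
import math

def virtual_image_width(fov_deg: float, distance_m: float) -> float:
    """Width of the virtual image subtended by a horizontal FOV at a given
    virtual-image distance: w = 2 * d * tan(FOV / 2)."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Assumed virtual-image distance of 2.5 m (illustrative; not from the text).
print(round(virtual_image_width(10, 2.5), 3))   # FOV within 10 deg: ~0.44 m wide
print(round(virtual_image_width(30, 2.5), 3))   # a 3x wider FOV gives >3x the width
```

Because the tangent grows faster than linearly, widening the FOV enlarges the displayed image more than proportionally, which is why a sub-10-degree FOV is so limiting.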
Disclosure of Invention
In order to solve the above problems, an object of an embodiment of the present invention is to provide a head-up display device for driving assistance.
The embodiments of the invention provide a head-up display device for driving assistance, comprising: a projection image source, a light control device and a driving-assistance controller; the light control device is arranged on one side of a reflecting device and comprises a main optical axis control element and a dispersing element;
the projection image source is used for emitting imaging light incident on the light control device; the main optical axis control element is used for reflecting multiple paths of imaging light to the reflecting device so that the reflecting device reflects them to the same observation range, the observation range being a position or region within the eye-box range;
the dispersing element is arranged on the side of the main optical axis control element close to the projection image source, between the main optical axis control element and the projection image source, and is used for dispersing the imaging light reflected by the main optical axis control element to form a light spot covering the eye-box range;
the driving-assistance controller is connected with the projection image source and is used for determining the prompt content to be displayed and controlling the projection image source to display that content.
In the above scheme provided by the embodiments of the invention, the projection image source sends imaging light to the light control device, which can converge imaging light with different incidence angles into the same observation range and disperse it across the eye-box range, so that the brightness of the imaging light is improved while the imaging range is guaranteed. The light control device can be arranged at large scale, so that an imaging area of large area is formed on the surface of the reflecting device, achieving large-scale imaging. The driving-assistance controller determines the prompt content to be displayed and controls the projection image source to display it on the surface of the reflecting device, so that the reflecting device can display images over a larger range; the driver can view the prompt content over a larger portion of the reflecting device's surface, which improves the display effect of the reflecting device.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the invention, and that a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram illustrating an imaging principle of a head-up display device according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a head-up display device according to an embodiment of the present invention;
fig. 3 is a schematic diagram of the imaging principle of the discontinuous first reflecting structures in a head-up display device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of the imaging principle of the continuous second reflecting structures in a head-up display device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a main optical axis control element with a second reflection structure in a head-up display device according to an embodiment of the present invention;
fig. 6 shows an electrical schematic diagram of a head-up display device according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of the display of the reflecting device when the vehicle ahead is too close, according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a display screen of a reflecting device during fatigue driving of a driver according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of the display of the reflecting device when the driver's steering is improper, in an embodiment of the present invention;
FIG. 10 is a schematic diagram of the display of the reflecting device when an oncoming vehicle approaches, in an embodiment of the present invention;
FIG. 11 is a schematic diagram of a reflection device displaying a portion of navigation information according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of the reflecting device displaying alarm information when a public-service vehicle is present, in an embodiment of the invention.
Icon:
10-projection image source, 20-light control device, 30-reflection device, 21-main optical axis control element, 211-first reflection structure, 212-second reflection structure, 22-dispersion element, 501-letter, 502-rectangular frame, 503-bird's eye view, 504-arrow, 505-motion path, 61-observation range, 62-eye-box range, 71-front vehicle, 72-local vehicle, 73-opposite vehicle, 74-ambulance, 75-current driving lane.
Detailed Description
In the description of the present invention, it should be understood that terms such as "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings. They are used merely for convenience and simplicity of description, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present invention.
Furthermore, the terms "first", "second" and the like are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present invention, "a plurality" means two or more, unless explicitly defined otherwise.
In the present invention, unless explicitly specified and limited otherwise, terms such as "mounted", "connected" and "secured" are to be construed broadly: a connection may, for example, be fixed, detachable or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or the internal communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
The head-up display device for driving assistance provided by the embodiments of the invention assists the driver in driving the local vehicle by displaying relevant information within the driver's observation range. Referring to fig. 1, the head-up display device includes: a projection image source 10, a light control device 20, and a driving-assistance controller. The light control device 20 is disposed on one side of a reflecting device 30. The reflecting device 30 may be the windshield of the vehicle, or a reflecting film on the inner side of the windshield, where the reflecting film reflects imaging light without preventing the driver from observing objects or scenes outside the vehicle through it; accordingly, the light control device 20 and the projection image source 10 may both be located inside the vehicle, i.e. on the inner side of the reflecting device 30. Specifically, the light control device 20 in this embodiment may be disposed below the reflecting device 30, for example on the IP (Instrument Panel) platform of the automobile. The projection image source 10 is a device capable of projecting an image or video, and may be of different types such as CRT (Cathode Ray Tube) projection, LCD (Liquid Crystal Display) projection, DLP (Digital Light Processing) projection, LCOS (Liquid Crystal on Silicon) projection or laser projection.
Referring to fig. 2, the light control device 20 includes a main optical axis control element 21 and a dispersing element 22. The projection image source 10 is configured to emit imaging light incident on the light control device 20; the main optical axis control element 21 is configured to reflect multiple paths of imaging light to the reflecting device 30, which reflects them to the same observation range 61, where the observation range 61 is a position or area within the eye-box range 62. The dispersing element 22 is disposed on the side of the main optical axis control element 21 near the projection image source 10, between the main optical axis control element 21 and the projection image source 10, and is configured to disperse the imaging light reflected by the main optical axis control element 21 to form a light spot covering the eye-box range 62.
In this embodiment, the imaging light emitted from the projection image source 10 is reflected by the light control device 20, is then incident on the reflecting device 30, and is reflected by the reflecting device 30 into the eye-box range 62, so that the driver can observe, within the eye-box range 62, the image formed by the projection image source 10. The eye-box range 62 in this embodiment refers to the region within which the driver can observe the image carried by the imaging light, i.e. the range within which the driver can view the image on the reflecting device 30, approximately corresponding to the position of the driver's head. With the driver's eyes within the eye-box range 62, the image of the projection image source 10, specifically the virtual image of fig. 1, can be viewed; the size of the eye-box range 62 can be determined according to the practical circumstances.
In addition, as shown in fig. 2, the imaging light emitted from the projection image source 10 may be incident on the reflecting device 30 after being processed by the light control device 20, and reflected by the reflecting device 30 into the eye-box range 62, so that the driver views the virtual image formed beyond the reflecting device 30. As shown in fig. 1, the reflecting device 30 may form the image of the projection image source 10 on its outer side.
Specifically, referring to fig. 2, the imaging light A emitted from the projection image source 10 passes through the dispersing element 22 and is then directed to the main optical axis control element 21; the dispersing element 22 disperses the imaging light A a first time, a process not illustrated in fig. 2 for convenience of description. The main optical axis control element 21 then reflects the incident imaging light A. As shown in fig. 2, in the absence of the dispersing element 22, the imaging light A would travel along the optical path a to the observation range 61; with the dispersing element 22 disposed outside the main optical axis control element 21, the dispersing element 22 disperses the imaging light A a second time, splitting it into a plurality of light rays (including the rays A1, A2 and so on) that form a light spot over a certain range, and this light spot can serve as the eye-box range 62, so that the driver can view the image of the projection image source 10 within the eye-box range 62. In dispersing the light to form the light spot, the dispersing element 22 can change the propagation direction and/or the diffusion angle of the light passing through it, and can diffuse that light into a light spot of circular, rectangular or other shape.
Alternatively, the preset shape of the light spot includes, but is not limited to, a circle, an oval, a square, a rectangle or a batwing shape. In this embodiment, the size of the light spot is determined by the two dispersions, and its shape is determined by the shape of the dispersing element 22; fig. 2 illustrates a rectangular light spot. The observation range 61 may be a point or an area, that is, the main optical axis control element 21 converges the imaging light emitted from the projection image source 10 into the observation range 61. Furthermore, the dispersion angle of the dispersed light spot in the side-view direction may be 10 degrees, preferably 5 degrees; the dispersion angle in the forward direction may be 50 degrees, preferably 30 degrees. The dispersing element 22 includes, but is not limited to, a diffractive optical element (Diffractive Optical Elements, DOE), such as a beam-shaping lens (Beam Shaper): light passing through it is dispersed to form a spot of a specific geometry, whose size and shape are determined by the microstructure of the diffractive optical element. The dispersing element 22 thus controls the degree of dispersion of the light, dispersing the light reflected by the main optical axis control element 21 at a certain angle so as to cover the required eye-box range 62. The propagation angle and the spot size of the dispersed light determine the brightness and the viewing angle of the final image: the smaller the dispersion angle, the higher the imaging brightness and the smaller the viewing angle, and vice versa.
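The relation between the dispersion angles quoted above and the resulting eye-box dimensions can be sketched with simple geometry. The 0.9 m viewing distance below is an illustrative assumption, not a value from this embodiment:

```python
import math

def spot_width(dispersion_angle_deg: float, distance_m: float) -> float:
    """Approximate extent of the dispersed light spot (the eye-box) at a
    given distance, for a full dispersion angle: w = 2 * d * tan(angle / 2)."""
    return 2.0 * distance_m * math.tan(math.radians(dispersion_angle_deg) / 2.0)

# Assumed distance of 0.9 m from the reflecting surface to the eye-box
# (illustrative only; the embodiment quotes angles, not distances).
d = 0.9
print(round(spot_width(10, d), 3))   # side-view (vertical) angle of 10 deg
print(round(spot_width(5, d), 3))    # preferred 5 deg
print(round(spot_width(50, d), 3))   # forward (horizontal) angle of 50 deg
print(round(spot_width(30, d), 3))   # preferred 30 deg
```

Halving the dispersion angle roughly halves the spot extent, which illustrates the brightness/viewing-angle trade-off stated above: the same light flux concentrated into a smaller spot appears brighter but is viewable from a smaller region.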
In this embodiment, the main optical axis control element 21 can converge imaging light with different incidence angles into the same observation range 61, which improves the brightness of the imaging light; meanwhile, the dispersing element 22 disperses the light, making it convenient for the driver to view the image formed by the projection image source 10 anywhere within the light spot, so the imaging range is enlarged while the brightness of the light is improved. Because the light control device 20 converges the imaging light, the driver can observe the virtual image formed via the reflecting device 30 without the projection image source 10 needing a particularly high brightness. The light control device 20 may also have a larger area, so that it reflects the imaging light onto a larger area of the surface of the reflecting device 30; in particular, the light control device 20 may be laid over the surface of the IP platform of the vehicle, so that a larger area of the surface of the reflecting device 30 reflects and images, that is, an imaging area of larger area on the surface of the reflecting device 30 can be used for imaging, making it convenient for the driver to view the information projected by the projection image source 10.
It should be noted that, owing to scattering, the light emitted by the projection image source 10 may cover the whole reflecting device 30; but since only the light that reaches the eye-box range 62 after being reflected by the reflecting device 30 is observed by the driver, the "imaging light" in this embodiment refers to the light emitted by the projection image source that can be imaged within the eye-box range 62. That is, only the area of the surface of the reflecting device 30 on which such imaging light is incident serves as the imaging area.
In addition, the driving-assistance controller in this embodiment is connected to the projection image source 10, and is configured to determine the prompt content to be displayed and to control the projection image source 10 to display it, so that the prompt content is displayed in the imaging area of the reflecting device 30. For example, if the vehicle speed currently needs to be displayed on the surface of the reflecting device 30, the vehicle speed is taken as the prompt content: the projection image source 10 displays the vehicle speed, and through the action of the light control device 20 and the reflecting device 30 a virtual image containing the vehicle speed is formed beyond the reflecting device 30, so that a driver within the eye-box range 62 can see the virtual image through the reflecting device 30, and the imaging area of the reflecting device 30 appears to display the vehicle speed.
It should be noted that "displaying the prompt content in the imaging area" in this embodiment means that the driver can view the prompt content through the imaging area, so that from the driver's perspective the prompt content appears to be displayed in the imaging area, while the virtual image corresponding to the prompt content is in fact located beyond the reflecting device 30, for example at the position of the virtual image in fig. 1. Descriptions in this embodiment that are the same as or similar to "displaying the prompt content in the imaging area" (e.g. "displaying prompt content on the reflecting device 30", etc.) are for convenience of description only, and are not intended to imply that the imaging area or the like displays the prompt content itself.
According to the head-up display device provided by the embodiments of the invention, the projection image source sends imaging light to the light control device, which can converge imaging light with different incidence angles into the same observation range and disperse it across the eye-box range, so that the brightness of the imaging light is improved while the imaging range is guaranteed. The light control device can be arranged at large scale, so that an imaging area of large area is formed on the surface of the reflecting device, achieving large-scale imaging. The driving-assistance controller determines the prompt content to be displayed and controls the projection image source to display it on the surface of the reflecting device, so that the reflecting device can display images over a larger range; the driver can view the prompt content over a larger portion of the reflecting device's surface, which improves the display effect of the reflecting device.
On the basis of the above embodiment, the surface of the main optical axis control element 21 of the light control device is provided with a plurality of reflecting structures, by means of which the multiple paths of imaging light are converged to the same observation range 61. Specifically, the main optical axis control element 21 may include a plurality of discontinuous first reflecting structures 211, each configured to reflect one path of imaging light to the observation range 61; each first reflecting structure 211 is similar to a micro mirror and can reflect one path of imaging light emitted by the projection image source 10 to the observation range 61. In this embodiment, one path of light refers to light with the same incidence angle, or with incidence angles within a preset range.
In this implementation, the points M(x, y, z) on the first reflecting structure 211 satisfy the following equation:

$$\vec{P}_{\perp}\cdot\overrightarrow{M_0M}=0,\qquad \vec{P}_{\perp}=\frac{\overrightarrow{M_0P_1}}{\left|\overrightarrow{M_0P_1}\right|}+\frac{\overrightarrow{M_0P_2}}{\left|\overrightarrow{M_0P_2}\right|}$$

where $P_1$ is the coordinate of the position of the projection image source 10, $P_2$ is the coordinate of the observation range 61, $M_0(x_0, y_0, z_0)$ is the coordinate of a known point on the first reflecting structure 211, and $\vec{P}_{\perp}$ is the normal vector of the first reflecting structure 211.
In the embodiment of the present invention, the plane of each first reflecting structure 211 is determined by the position of the projection image source 10, the observation range 61 to which the imaging light is reflected, and the position of the first reflecting structure 211 itself. Specifically, fig. 3 illustrates one first reflecting structure 211 in the main optical axis control element 21 as an example. In fig. 3, the projection image source 10 is located at the point $P_1$ and the observation range 61 at the point $P_2$. For the first reflecting structure 211, its normal (the dashed line in fig. 3) is perpendicular to the plane in which the first reflecting structure 211 lies, i.e. the normal is the perpendicular vector of that plane. In the spatial coordinate system, the perpendicular vector is written

$$\vec{P}_{\perp}=\left(P_{\perp,x},\,P_{\perp,y},\,P_{\perp,z}\right)$$

Meanwhile, the angle of incidence of the incident light (i.e. the imaging light) on the first reflecting structure 211 equals the angle of reflection. Let $M_0(x_0, y_0, z_0)$ in fig. 3 be a known point on the first reflecting structure 211; then the perpendicular vector lies on the angle bisector of the vectors $\overrightarrow{M_0P_1}$ and $\overrightarrow{M_0P_2}$, so:

$$\vec{P}_{\perp}=\frac{\overrightarrow{M_0P_1}}{\left|\overrightarrow{M_0P_1}\right|}+\frac{\overrightarrow{M_0P_2}}{\left|\overrightarrow{M_0P_2}\right|}$$

At the same time, since $M_0$ is a point of known coordinates on the first reflecting structure 211, for any point M(x, y, z) on the first reflecting structure 211 the vector $\overrightarrow{M_0M}$ lies in the reflecting plane and is therefore perpendicular to the normal vector $\vec{P}_{\perp}$, i.e. $\vec{P}_{\perp}\cdot\overrightarrow{M_0M}=0$, namely:

$$P_{\perp,x}(x-x_0)+P_{\perp,y}(y-y_0)+P_{\perp,z}(z-z_0)=0$$

Thus, for a discontinuous first reflecting structure 211, the reflecting surface (i.e. the plane in which the first reflecting structure 211 lies) is determined by the normal vector $\vec{P}_{\perp}$ and a known point $M_0$ on the reflecting surface. Meanwhile, the first reflecting structure 211 is a microstructure, i.e. the points (x, y, z) of the first reflecting structure 211 only need to be determined within a small value range; that is, within the corresponding value range, the points (x, y, z) on the first reflecting structure 211 satisfy the equation above, where $P_1$ is the coordinate of the position of the projection image source 10, $P_2$ is the coordinate of the observation range 61, $M_0(x_0, y_0, z_0)$ is the coordinate of a known point on the first reflecting structure 211, $\vec{P}_{\perp}$ is the normal vector of the first reflecting structure 211, and $P_{\perp,x}$, $P_{\perp,y}$, $P_{\perp,z}$ are the components of $\vec{P}_{\perp}$ along the x, y and z axes.
For each first reflecting structure 211 of the main optical axis control element 21, a known point on that structure can be determined; combined with the position $P_1$ of the projection image source 10 and the position $P_2$ of the observation range 61, the normal vector of each first reflecting structure 211, and hence its reflecting surface, can then be determined. The known point $M_0$ may be the center point of the first reflecting structure 211, a point on the intersection line of the first reflecting structure 211 with the plane in which the main optical axis control element 21 lies, or another preset point on the first reflecting structure 211; this embodiment does not limit it.
Meanwhile, the value range of the point (x, y, z) may specifically be:

$$x_1\le x\le x_2,\qquad y_1\le y\le y_2,\qquad z_1\le z\le z_2$$

where $x_1, x_2, y_1, y_2, z_1, z_2$ are preset values determined by the placement of the first reflecting structure 211, and the values of $x_1, x_2, y_1, y_2, z_1, z_2$ are not identical for different first reflecting structures 211. For example, along the x axis, if the x component of the position of one first reflecting structure 211 lies between 1 and 1.5, then for that structure $x_1 = 1$ and $x_2 = 1.5$; if the x component of the position of another first reflecting structure 211 lies between 1.5 and 1.9, then for that structure $x_1 = 1.5$ and $x_2 = 1.9$. Here "not identical" means: of the six values $x_1, x_2, y_1, y_2, z_1, z_2$, those corresponding to two different first reflecting structures 211 are not all the same, i.e. at least one, and possibly all, of the six values differ.
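The construction of the first reflecting structures 211 can be checked numerically: the bisector normal built from $P_1$, $P_2$ and $M_0$ does reflect a ray from the image source toward the observation range. A minimal sketch; the coordinates below are illustrative assumptions only (the embodiment gives no numeric positions):

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))
def unit(a):
    n = norm(a)
    return tuple(x / n for x in a)

def facet_normal(P1, P2, M0):
    """Unit normal of one discontinuous first reflecting structure at its
    known point M0: the bisector of the unit vectors from M0 toward the
    image source P1 and toward the observation range P2."""
    u = unit(sub(P1, M0))
    v = unit(sub(P2, M0))
    return unit(tuple(a + b for a, b in zip(u, v)))

def reflect(d, n):
    """Reflect a direction d about a unit normal n: d - 2 (d.n) n."""
    k = 2.0 * dot(d, n)
    return tuple(a - k * b for a, b in zip(d, n))

# Illustrative coordinates only (not from the embodiment):
P1 = (0.0, -0.5, 0.3)   # projection image source 10
P2 = (0.0, 0.8, 0.6)    # centre of the observation range 61
M0 = (0.2, 0.0, 0.0)    # known point on the facet

n = facet_normal(P1, P2, M0)

# A ray from P1 striking M0 leaves exactly toward P2, confirming that the
# bisector normal gives equal angles of incidence and reflection.
out = reflect(unit(sub(M0, P1)), n)
assert all(abs(a - b) < 1e-12 for a, b in zip(out, unit(sub(P2, M0))))
```

The final assertion is the numerical counterpart of the plane equation above: any facet whose normal is the bisector of $\overrightarrow{M_0P_1}$ and $\overrightarrow{M_0P_2}$ sends the incident path of imaging light into the observation range 61.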
Alternatively, the main optical axis control element 21 includes a plurality of continuous second reflecting structures 212, the second reflecting structures 212 being configured to reflect the multiple paths of imaging light to the observation range 61.
The angle θ between the second reflecting structure 212 and the plane in which the main optical axis control element 21 lies satisfies:

$$\theta=\arccos\frac{\vec{n}\cdot\vec{P}_{\perp}(M_0)}{\left|\vec{n}\right|\left|\vec{P}_{\perp}(M_0)\right|},\qquad \vec{P}_{\perp}(M_0)=\frac{\overrightarrow{M_0P_1}}{\left|\overrightarrow{M_0P_1}\right|}+\frac{\overrightarrow{M_0P_2}}{\left|\overrightarrow{M_0P_2}\right|}$$

where $\vec{n}$ is the normal vector of the plane in which the main optical axis control element 21 lies; $P_1$ is the coordinate of the position of the projection image source 10, $P_2$ is the coordinate of the observation range 61, $M_0(x_0, y_0, z_0)$ is the coordinate of a known point on the intersection line of the second reflecting structure 212 with the plane in which the main optical axis control element 21 lies, and $\vec{P}_{\perp}(M_0)$ is the normal vector of the second reflecting structure 212 at the point $M_0$.
Every point M(x, y, z) on the intersection line of the second reflecting structure 212 with the plane in which the main optical axis control element 21 lies satisfies the following equation:

$$\frac{\vec{n}\cdot\vec{P}_{\perp}(M)}{\left|\vec{n}\right|\left|\vec{P}_{\perp}(M)\right|}=\cos\theta,\qquad \vec{P}_{\perp}(M)=\frac{\overrightarrow{MP_1}}{\left|\overrightarrow{MP_1}\right|}+\frac{\overrightarrow{MP_2}}{\left|\overrightarrow{MP_2}\right|}$$

where $\vec{P}_{\perp}(M)$ is the normal vector of the second reflecting structure 212 at the point M.
In this embodiment, the second reflecting structure 212 is a continuous structure, that is, the main optical axis control element 21 includes a plurality of continuous second reflecting structures 212, and each second reflecting structure 212 is configured to reflect multiple paths of imaging light emitted by the projection image source 10 to the observation range 61.
Specifically, the second reflecting structure 212 is a continuous free-form surface, and the angle between this free-form surface and the plane in which the main optical axis control element 21 lies is a fixed value θ. Referring to fig. 4, the upper half of fig. 4 is a schematic front view of the light control device and the lower half a schematic top view. The second reflecting structure 212 intersects the plane of the main optical axis control element 21, and the intersection line is a free curve, i.e. the curve through the points M and $M_0$ in the lower half of fig. 4.
In this embodiment, a known point M₀ on the intersection line of the second reflecting structure 212 and the plane of the main optical axis control element 21 is preset first, and the coordinates of M₀ are (x₀, y₀, z₀). Similar to the embodiment of fig. 3, the projection image source 10 is located at the point P₁, and the observation range 61 is located at the point P₂. Since the second reflecting structure 212 is a free-form surface, it has no unique normal; however, at the known point M₀, the normal n₀ of the second reflecting structure 212 is:

n₀ = (P₁ − M₀)/|P₁ − M₀| + (P₂ − M₀)/|P₂ − M₀|
At the same time, for the plane of the main optical axis control element 21, let the normal vector n of the plane be (A, B, C), i.e. n = (A, B, C), where A, B and C are the components of the normal vector n along the x, y and z axes. From the geometrical relationship, the included angle between the normal vector n and the normal n₀ is equal to the included angle θ between the second reflecting structure 212 and the plane of the main optical axis control element 21. Therefore, the angle θ can be determined from the normal vector n of the plane of the main optical axis control element 21 and the normal n₀ of the second reflecting structure 212 at the point M₀, i.e. θ = ⟨n, n₀⟩.
From the vector dot-product formula:

cos θ = (n · n₀) / (|n| |n₀|)

Therefore, the included angle θ between the second reflecting structure 212 and the plane of the main optical axis control element 21 satisfies:

θ = arccos[ (n · n₀) / (|n| |n₀|) ]
wherein n denotes the normal vector of the plane in which the main optical axis control element 21 is located; P₁ is the coordinate of the position of the projection image source 10; P₂ is the coordinate of the observation range 61; M₀(x₀, y₀, z₀) is the coordinate of a known point on the intersection of the second reflecting structure 212 and the plane of the main optical axis control element 21; and n₀ denotes the normal vector of the second reflecting structure 212 at the point M₀.
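The relations above can be checked with a small numerical sketch (the plane normal and the coordinates P₁, P₂, M₀ below are invented for illustration; the normal at M₀ follows the law of reflection, bisecting the directions toward the image source and the observation range):

```python
import math

def unit(v):
    """Normalize a 3-vector."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def structure_normal(p1, p2, m0):
    """Normal of the second reflecting structure at M0: by the law of
    reflection it bisects the directions from M0 toward the projection
    image source P1 and toward the observation range P2."""
    d1 = unit(tuple(a - b for a, b in zip(p1, m0)))
    d2 = unit(tuple(a - b for a, b in zip(p2, m0)))
    return tuple(a + b for a, b in zip(d1, d2))

def included_angle(n, n0):
    """theta = arccos(n . n0 / (|n| |n0|))."""
    dot = sum(a * b for a, b in zip(unit(n), unit(n0)))
    return math.acos(max(-1.0, min(1.0, dot)))

# Hypothetical geometry: element plane z = 0 with normal n = (A, B, C).
n = (0.0, 0.0, 1.0)
P1 = (0.0, -1.0, 2.0)   # projection image source
P2 = (0.0, 1.0, 2.0)    # observation range
M0 = (0.0, 0.0, 0.0)    # known point on the intersection line
theta = included_angle(n, structure_normal(P1, P2, M0))
```

In this symmetric layout the bisector coincides with the plane normal, so θ = 0; shifting P₁ or P₂ laterally yields the non-zero fixed angle used for the free-form surface.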
After the angle θ is determined, the free-form surface of the second reflecting structure 212 is determined based on the intersection line of the second reflecting structure 212 and the plane in which the main optical axis control element 21 lies (i.e., the curve on which the points M and M₀ lie in the lower half of fig. 4).
Specifically, referring to fig. 4, for any point M (x, y, z) on the intersection line of the second reflecting structure 212 and the plane of the main optical axis control element 21, the point M is located in the plane of the main optical axis control element 21, so:
A(x-x 0 )+B(y-y 0 )+C(z-z 0 )=0
At the same time, the normal vector n_M of the second reflecting structure 212 at the point M is:

n_M = (P₁ − M)/|P₁ − M| + (P₂ − M)/|P₂ − M|

and the included angle between this normal vector n_M and the normal vector n of the plane of the main optical axis control element 21 is also θ:

n · n_M = |n| |n_M| cos θ
In addition, there is a preset value range on the plane of the main optical axis control element 21, so the point M(x, y, z) on the intersection line of the second reflecting structure 212 and the plane of the main optical axis control element 21 satisfies the following equations within the preset value range:

A(x − x₀) + B(y − y₀) + C(z − z₀) = 0
n · n_M = |n| |n_M| cos θ

wherein n_M denotes the normal vector of the second reflecting structure 212 at the point M. The preset value range of the point M(x, y, z) may specifically be:

x_v ≤ x ≤ x_u, y_v ≤ y ≤ y_u, z_v ≤ z ≤ z_u

wherein x_v, x_u, y_v, y_u, z_v and z_u are respectively the boundary values of the dimensions of the main optical axis control element 21 along the corresponding axes.
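Given a fixed θ, points M on the intersection line can be found numerically from the two conditions above. A sketch under assumed geometry (element plane z = 0, invented source and observation coordinates, θ = 10°), solving the angle equation by bisection in y for a few values of x:

```python
import math

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angle_to_plane_normal(n, p1, p2, m):
    """Angle between the plane normal n and the structure normal at M
    (bisector of the directions from M toward P1 and toward P2)."""
    d1 = unit(tuple(a - b for a, b in zip(p1, m)))
    d2 = unit(tuple(a - b for a, b in zip(p2, m)))
    nm = unit(tuple(a + b for a, b in zip(d1, d2)))
    dot = sum(a * b for a, b in zip(unit(n), nm))
    return math.acos(max(-1.0, min(1.0, dot)))

# Assumed geometry: element plane z = 0 with normal n; fixed angle 10 deg.
n = (0.0, 0.0, 1.0)
P1, P2 = (0.0, -1.0, 2.0), (0.0, 1.0, 2.0)
theta = math.radians(10.0)

def residual(x, y):
    """Zero exactly when M = (x, y, 0) lies on the intersection line."""
    return angle_to_plane_normal(n, P1, P2, (x, y, 0.0)) - theta

def solve_y(x, lo=0.0, hi=5.0, iters=60):
    """Bisection: the residual is negative near the symmetry point and
    positive far from it, so a root is bracketed in [lo, hi]."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if residual(x, lo) * residual(x, mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

curve = [(x, solve_y(x), 0.0) for x in (0.0, 0.1, 0.2)]
```

Each solved point satisfies both the plane equation (z = 0) and the angle condition, tracing the free curve on which the second reflecting structure meets the element plane.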
In this embodiment, the second reflecting structure 212 is a continuous free-form surface, and the free-form surface of the second reflecting structure 212 can be accurately determined from the fixed included angle θ between the second reflecting structure 212 and the main optical axis control element 21 and the intersection line between them. At the same time, for the other second reflecting structures 212, another known point M₀ may be redetermined, and the corresponding included angle θ and intersection line determined from it. If the second reflecting structures 212 have different included angles θ, the intersection lines between the second reflecting structures 212 and the main optical axis control element 21 are different, so intersection lines of different forms are distributed on the plane of the main optical axis control element 21. Referring to fig. 5, two second reflecting structures 212 correspond to different included angles θ₁ and θ₂, and the two included angles correspond to the tracks L₁ and L₂ of different intersection lines.
Meanwhile, after the included angle and the intersection line of a continuous second reflecting structure 212 are determined, a processing machine can fix the included angle and then machine along the trajectory of the intersection line when manufacturing the second reflecting structure 212 on the main optical axis control element 21, so the processing technology is simple. Moreover, if the processing depth of each second reflecting structure 212 (or the height of each second reflecting structure 212) is the same, then, since the included angle θ of the second reflecting structure 212 is fixed, the distance between two adjacent intersection lines is also a fixed value, and the distribution of the second reflecting structures 212 is more uniform.
On the basis of the above embodiment, referring to fig. 6, the head-up display apparatus further includes an information acquisition device 200, and the information acquisition device 200 is communicatively connected to the driving assistance controller 100. The information acquisition device 200 is used for acquiring current information and transmitting the acquired current information to the driving assistance controller 100, and the driving assistance controller 100 is specifically configured to generate the prompt content according to the current information.
In the embodiment of the present invention, the information acquisition device 200 may acquire vehicle state information representing the current state of the vehicle, external object information about objects outside the vehicle, driver information about the driver himself, manipulation information acquired while the driver operates the vehicle, a local motion path representing the current motion trend of the vehicle, navigation information, corresponding information of a public object, and the like. The vehicle state information may specifically include the vehicle speed, yaw rate, brake pedal displacement, accelerator pedal displacement, and the like. The external object information may include the current position of an external object, the current distance to the external object, the movement speed of the external object, and the like. The driver information may include the blink frequency, eye-closing duration, current gaze direction, gaze concentration duration, head-down frequency, and the like. The manipulation information indicates the actions of the driver while driving the vehicle, for example, whether the driver observes the lane condition before changing lanes, or turns on the turn signal when turning. The navigation information may include the start position, the end position, the current distance to the destination, the time to reach the destination, one or more travel paths, and the like.
The information acquisition device 200 may specifically include one or more of an image acquisition device, a vehicle radar, an infrared sensor, a laser sensor, an ultrasonic sensor, a speed sensor, a rotation speed sensor, an angular velocity sensor, a displacement sensor, a GPS (Global Positioning System), a V2X (Vehicle to X) system, and an ADAS (Advanced Driving Assistance System). In this embodiment, different information acquisition devices may be installed at different positions as required; for example, the rotation speed sensor may be disposed at a wheel, and image acquisition devices may be disposed on both the inside and the outside of the vehicle, which will not be described in detail herein.
On the basis of the above embodiment, the information acquisition device 200 may include a speed sensor for acquiring the vehicle speed, an angular velocity sensor for acquiring the yaw rate of the vehicle, and a displacement sensor for acquiring the displacements of the brake pedal and the accelerator pedal; that is, vehicle state information such as the vehicle speed, yaw rate, brake pedal displacement, and accelerator pedal displacement can be acquired by the information acquisition device 200 and then transmitted to the driving assistance controller 100. Alternatively, the driving assistance controller 100 may be connected to an OBD (On-Board Diagnostics) interface of the vehicle and acquire vehicle state information such as the vehicle speed through the OBD interface. After acquiring the vehicle state information, the driving assistance controller 100 may determine whether the vehicle state information exceeds a preset safety range; if it does, the controller may use corresponding first warning information as the prompt content, where the first warning information includes one or more of a first warning text, a first warning image, and a first warning video.
In this embodiment, the safety range of the vehicle state information is preset. If the current vehicle state information exceeds the safety range, there is currently a risk or an illegal operation, and the first warning information for warning the driver may be displayed on the reflecting device 30 as the prompt content. For example, if the safety range of the vehicle speed is preset to 0–60 km/h, then when the vehicle is speeding (i.e., the vehicle speed is greater than 60 km/h) or reversing (i.e., the vehicle speed is negative), it is determined that the driver currently needs to be warned, and the corresponding first warning text, first warning image, or first warning video can be displayed as the prompt content. For example, the first warning text may be "you are speeding", the first warning image may be an exclamation-mark graphic, and the first warning video may be an animation of two vehicles colliding.
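The check described above can be sketched as follows; the signal names and safety ranges are invented for illustration, not taken from the patent:

```python
# Sketch of the first-warning check (signal names and safety ranges are
# illustrative assumptions; the patent only specifies the mechanism).
SAFETY_RANGES = {
    "speed_kmh": (0.0, 60.0),       # preset safety range of the vehicle speed
    "accel_pedal_mm": (0.0, 40.0),  # accelerator pedal displacement
}

def first_warning(vehicle_state):
    """Return a warning text for every signal outside its safety range."""
    warnings = []
    for name, value in vehicle_state.items():
        lo, hi = SAFETY_RANGES[name]
        if not lo <= value <= hi:
            warnings.append(f"{name} out of safe range [{lo}, {hi}]")
    return warnings

alerts = first_warning({"speed_kmh": 72.0, "accel_pedal_mm": 10.0})
```

A negative speed (reversing) or an overspeed value both fall outside the range and therefore both trigger the same first warning, matching the speeding/reversing example.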
Specifically, when a driving-school learner drives the vehicle as the driver, the learner may be reminded whether the current operation is appropriate based on the acquired vehicle state information. For example, if the learner presses the accelerator pedal too hard, that is, the accelerator pedal displacement exceeds the corresponding safety range, first warning information such as "please press the accelerator pedal gently" can be used as the prompt content: the driving assistance controller 100 controls the projection image source 10 to display the prompt content, and a virtual image is formed outside the reflecting device 30 through the action of the light control device 20 and the reflecting device 30, so that the learner can view the prompt content through the reflecting device 30 (such as the windshield in front of the training vehicle) without looking down at the instruments. This improves the safety of the learner during driving practice, and the real-time reminders allow the learner to notice shortcomings or incorrect operations in time and correct them promptly, improving the learning effect and efficiency.
Alternatively, the information acquisition device 200 may include an image acquisition apparatus, a distance sensor, and the like; the current distance between the local vehicle and an external object is determined by the information acquisition device 200 and transmitted as external object information to the driving assistance controller 100. After acquiring the external object information, the driving assistance controller 100 determines whether the current distance is smaller than a preset safety distance; if it is, the corresponding second warning information is used as the prompt content, where the second warning information includes one or more of a second warning text, a second warning image, and a second warning video.
In the embodiment of the invention, the safety distance can be preset; it is a threshold that can generally ensure the safety of the vehicle. The safety distance may be a fixed value obtained statistically, or a value related to the current vehicle speed, with a greater safety distance at a greater vehicle speed. If the current distance is greater than the safety distance, the external object is far from the local vehicle and the possibility of a collision between them is low; conversely, if the current distance is smaller than the preset safety distance, the collision risk is high, and the corresponding second warning information may be displayed on the reflecting device 30 as the prompt content to remind the driver of the current collision risk. For example, the second warning information may include a second warning text, such as "too close to the vehicle ahead, please slow down"; a second warning image, for example a red exclamation-mark graphic, or a graphic matching the external object highlighted at the position corresponding to the external object to implement AR (Augmented Reality) fit display; or a second warning video, such as an animation of two vehicles colliding. In addition, the external objects in this embodiment may include other vehicles, pedestrians, animals, and non-motor vehicles outside the vehicle, as well as stationary objects such as roads and signs. For different external objects, different correspondences may be used to determine the safety distance.
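A minimal sketch of the speed-dependent variant: the safety-distance model (reaction time plus braking distance) is an assumption, since the patent only requires the threshold to grow with the vehicle speed:

```python
def safe_distance(speed_kmh, reaction_s=1.5, decel_mps2=7.5):
    """Speed-dependent safety distance in metres: reaction distance plus
    braking distance (an assumed model; the patent only requires that
    the threshold grow with vehicle speed)."""
    v = speed_kmh / 3.6                     # km/h -> m/s
    return v * reaction_s + v * v / (2.0 * decel_mps2)

def second_warning(current_distance_m, speed_kmh):
    """Second warning text when closer than the preset safety distance."""
    if current_distance_m < safe_distance(speed_kmh):
        return "too close to the vehicle ahead, please slow down"
    return None
```

Any monotonically increasing function of speed (or a statistical lookup table per external-object type) could replace `safe_distance` without changing the warning logic.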
Optionally, the information acquisition device 200 may include an image acquisition apparatus, by which driver information in the vehicle, such as the blink frequency and head-down duration, can be acquired, and the current gaze direction of the driver can be determined based on eye-tracking technology. Specifically, the information acquisition device 200 may acquire driver information including the blink frequency, eye-closing duration, current gaze direction, gaze concentration duration, head-down frequency, and the like, and send the acquired driver information to the driving assistance controller 100. After acquiring the driver information, the driving assistance controller 100 determines whether the driver information meets an abnormal driving condition; if it does, the corresponding third warning information is used as the prompt content, where the third warning information includes one or more of a third warning text, a third warning image, and a third warning video. The abnormal driving condition includes one or more of the following: the blink frequency is greater than a preset blink frequency threshold, the eye-closing duration is greater than a preset eye-closing duration threshold, the current gaze direction deviates from the road direction and the deviation duration is greater than a preset deviation duration threshold, the gaze concentration duration is greater than a preset gaze concentration duration threshold, the head-down duration is greater than a preset head-down duration threshold, and the head-down frequency is greater than a preset head-down frequency threshold.
In the embodiment of the invention, whether the driver is in an abnormal driving state such as fatigued driving or drunk driving can be judged based on the acquired driver information. Specifically, if the blink frequency of the driver is greater than the preset blink frequency threshold, the eye-closing duration is greater than the preset eye-closing duration threshold, the current gaze direction deviates from the road direction with a deviation duration greater than the preset deviation duration threshold, or the gaze concentration duration is greater than the preset gaze concentration duration threshold, the driver is very likely to be fatigued; likewise, if the head-down duration is greater than the preset head-down duration threshold or the head-down frequency is greater than the preset head-down frequency threshold, the driver may be in a fatigued driving state. Several conditions may also be combined to judge comprehensively whether the driver is fatigued; for example, fatigued driving is determined only if the current gaze direction deviates from the road direction with a deviation duration greater than the preset deviation duration threshold and the gaze concentration duration is greater than the preset gaze concentration duration threshold.
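The abnormal-driving condition can be sketched as a threshold table; the field names and threshold values below are illustrative assumptions:

```python
# Sketch of the abnormal-driving check; the field names and threshold
# values are illustrative assumptions, not taken from the patent.
THRESHOLDS = {
    "blink_freq_hz": 0.5,       # preset blink frequency threshold
    "eye_closed_s": 2.0,        # preset eye-closing duration threshold
    "gaze_deviation_s": 3.0,    # gaze off the road longer than this
    "head_down_s": 2.0,         # preset head-down duration threshold
}

def is_abnormal(driver_info):
    """True if any monitored quantity exceeds its preset threshold
    (the patent also allows requiring combinations of conditions)."""
    return any(driver_info.get(key, 0.0) > limit
               for key, limit in THRESHOLDS.items())
```

Replacing `any` with `all` over a chosen subset of keys gives the combined-condition variant described above.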
In this embodiment, if the driver is likely to be driving while fatigued, the driver is currently in an abnormal state and is alerted by displaying the third warning information. The third warning information may include a third warning text, such as "you are driving while fatigued, please rest"; a third warning image, for example a red exclamation-mark graphic; or a third warning video, such as an animation of two vehicles colliding.
Correspondingly, if the driver information does not meet the abnormal driving condition, the current state can be determined to be normal, that is, the driver is not fatigued, drunk, or the like, and the head-up display device can normally display the information required during driving, such as the current vehicle speed and navigation information. Alternatively, in the normal state, if the vehicle is currently at a moment when displaying information is inappropriate, such as in a curve, no information may be displayed (i.e., the displayed information is empty) to avoid distracting the driver.
Alternatively, the image acquisition apparatus in the information acquisition device 200 may also acquire manipulation information of the driver in the vehicle, such as whether the driver observes the surroundings and the driver's hand movements, and transmit the acquired manipulation information to the driving assistance controller 100. The driving assistance controller 100 determines whether the driver's manipulation process is standard based on the manipulation information; specifically, when the manipulation information does not match the standard manipulation information, the corresponding fourth warning information and/or the standard manipulation information is used as the prompt content, where the fourth warning information includes one or more of a fourth warning text, a fourth warning image, and a fourth warning video.
In the embodiment of the invention, the image acquisition apparatus can record video of the driver operating the vehicle, and the corresponding manipulation information can be extracted based on image recognition technology. If the manipulation information does not match the standard manipulation information, the driver's current manipulation process is not standard; displaying the fourth warning information as the prompt content then reminds the driver and corrects the erroneous operation. Meanwhile, if the standard manipulation information is also displayed on the reflecting device 30 as the prompt content, the driver can be further assisted and the driver's operating level improved. In this embodiment, the local vehicle may be a training vehicle and the driver a driving-school learner; the image acquisition apparatus is disposed in the training vehicle to acquire the learner's manipulation information in real time and determine whether the learner's manipulation process is standard.
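Matching manipulation information against standard manipulation information can be sketched as a rule table; the maneuver names and required actions below are invented for illustration:

```python
# Hypothetical rule table matching observed manipulation information
# against standard manipulation information (names are invented).
STANDARD_MANIPULATION = {
    "lane_change_left": {"check_mirror", "left_turn_signal"},
    "turn": {"turn_signal", "decelerate"},
}

def fourth_warning(maneuver, observed_actions):
    """Return a warning naming the standard actions missing from the
    observed manipulation, or None when the manipulation is standard."""
    missing = STANDARD_MANIPULATION[maneuver] - set(observed_actions)
    if missing:
        return f"non-standard {maneuver}: missing {sorted(missing)}"
    return None
```

Displaying the missing actions corresponds to showing the standard manipulation information alongside the warning, as the embodiment suggests.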
On the basis of the above embodiments, when no warning information (such as the first, second, third, or fourth warning information) needs to be displayed, normal identification information can be displayed as the prompt content; for example, the vehicle speed is displayed on the front windshield in real time, so that the driver does not need to look down at the speedometer while driving. When warning information needs to be displayed, the corresponding warning information is displayed as the prompt content according to the corresponding condition.
In this embodiment, in order to enhance the warning effect, the driving assistance controller 100 may control the projection image source 10 to display the prompt content in a highlighted manner, including one or more of scrolling, jumping, blinking, highlighting, displaying in a warning color (e.g., red), and displaying at a preset position (e.g., right in front of the driver's line of sight). If no warning information currently needs to be displayed, identification information such as the vehicle speed can be displayed in a normal manner or in other colors. For example, if the vehicle is currently speeding, the prompt content "please slow down" can be displayed in red; after the driver decelerates and the local vehicle returns to a normal state, prompt content such as "currently driving safely" can be displayed in green. Fig. 7 schematically illustrates, taking a vehicle as an example, the display mode when an external object is too close to the local vehicle. As shown in fig. 7, the head-up display device detects that the current distance from the front vehicle 71 to the local vehicle is 50 m while the current safety distance is 60 m, i.e., the device is in the warning state. The head-up display device may display, on the reflecting device 30 (i.e., the windshield of the local vehicle), prompt content including the warning text 501 "please slow down!". The prompt content may further include a rectangular box 502 framing the front vehicle 71, and the rectangular box 502 may be displayed in red or highlighted to enhance the reminding effect. In addition, the distance to the front vehicle 71 may be displayed at the same time; in fig. 7 the distance "50.0 m" is displayed below the rectangular box 502.
When the current distance between the local vehicle and the front vehicle 71 is greater than the safety distance, the front vehicle 71 may be framed with a green rectangular box 502. Similarly, in fig. 7 the warning text 501 "please slow down" may be displayed in red, and when the distance is greater than the safety distance, the text "currently safe, please keep it up" may be displayed in green.
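The state-dependent styling can be sketched as follows, using the texts from the example; the exact styling fields are assumptions:

```python
def prompt_style(current_distance_m, safe_distance_m):
    """Choose text and characteristic values for the warning text 501 and
    rectangular box 502: red and highlighted in the warning state, green
    otherwise (texts follow the examples in the description)."""
    if current_distance_m < safe_distance_m:
        return {"text": "please slow down!", "color": "red", "highlight": True}
    return {"text": "currently safe, please keep it up",
            "color": "green", "highlight": False}
```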
Similarly, in order to improve the warning effect, when the driver drives abnormally, the corresponding third warning information can be highlighted. For example, fig. 8 shows a schematic diagram of displaying prompt content on the reflecting device 30: the head-up display device detects that the driver is currently driving while fatigued, and the warning text 501 "you are driving while fatigued!" may be displayed on the reflecting device 30. The warning text 501 may be displayed in red or highlighted to enhance the warning effect.
Alternatively, suppose the driver is driving the local vehicle and currently changing lanes to the left. If the driver has not turned on the left turn signal, i.e., the driver's manipulation information is not compliant, the driver is currently making an illegal lane change. In this case, referring to fig. 9, the warning text 501 "please turn on the left turn signal" may be displayed on the reflecting device 30 to remind the driver; meanwhile, the current traveling direction of the vehicle may be indicated by an arrow 504, alerting the driver that the vehicle is currently shifting to the left.
In addition, the head-up display device can intuitively mark the position of an external object in an AR display mode. Specifically, the driving assistance controller 100 may determine the projection position of the external object onto the reflecting device 30, take the projection position or the edge of the projection position as the prompt position, and instruct the projection image source 10 to display the prompt content at the prompt position. In this embodiment, by setting the projection position of the external object as the prompt position, prompt content consistent with the external object can be displayed at the corresponding position of the reflecting device 30, so that the external object can be intuitively marked for the driver. As shown in fig. 7, if the external object is the front vehicle 71, a rectangular box 502 may be displayed at the corresponding position of the windshield (i.e., the reflecting device 30) so as to frame the vehicle and intuitively remind the driver of the position of the front vehicle 71.
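Determining the projection position is a line–plane intersection: the prompt position lies where the line from the eye box through the external object meets the windshield. A geometric sketch with invented coordinates (the windshield is approximated as the plane x = 1):

```python
def project_to_windshield(eye, obj, plane_point, plane_normal):
    """Intersection of the eye -> object line with the windshield plane,
    used as the prompt position for AR fit display (geometry sketch;
    all coordinates are illustrative)."""
    d = tuple(o - e for o, e in zip(obj, eye))
    denom = sum(n * c for n, c in zip(plane_normal, d))
    t = sum(n * (p - e) for n, p, e in zip(plane_normal, plane_point, eye)) / denom
    return tuple(e + t * c for e, c in zip(eye, d))

# Eye box at the origin, windshield plane x = 1, front vehicle at x = 50.
pos = project_to_windshield((0.0, 0.0, 0.0), (50.0, 5.0, 2.0),
                            (1.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

Drawing the rectangular box 502 centred on `pos` makes the marker visually coincide with the front vehicle from the driver's viewpoint.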
In this embodiment, when the driver needs to be warned in the current warning state, other reminder modes may be used as supplements. Specifically, the driving assistance controller 100 may also be configured to send an alarm voice to a sound device and instruct the sound device to play it, or to send a vibration instruction to a vibration device to make it vibrate, where the vibration device is a device that the user can touch. In this embodiment, a speaker may be added to the head-up display device, or a speaker on the vehicle may be used, to give a voice alert; the alarm voice may be an alarm bell without specific meaning, or specific speech such as "Attention! Keep your distance!". In addition, a mechanical vibration device may be provided at a position the driver can directly touch, such as the steering wheel or seat of the vehicle, so that the driver can be alerted by vibration in the warning state.
On the basis of the above embodiments, the head-up display device can realize AR fit display with external objects, and can also actively display objects that do not exist. Specifically, in scenarios where the environment is relatively fixed, such as a driving-teaching road section, corresponding virtual teaching objects can be added according to the teaching environment, so that learners can experience more driving scenes. In this embodiment, the driving assistance controller 100 may be configured to determine the virtual teaching object to be displayed according to the current scene and use the virtual teaching object as the prompt content; determine the actual position at which the virtual teaching object needs to be displayed; take the intersection of the reflecting device 30 with the line connecting the actual position and the eye-box range 62 as the prompt position; and control the projection image source 10 to display the prompt content at the prompt position.
In the embodiment of the invention, the virtual teaching object can be a virtual sign, a virtual pedestrian, a virtual vehicle, or the like, and the corresponding virtual teaching object is determined according to the current scene in which the learner drives the local vehicle. For example, if a left turn is currently required, a virtual left-turn sign can be used as the virtual teaching object; if a scene of meeting an oncoming vehicle needs to be simulated, an oncoming vehicle can be used as the virtual teaching object. Meanwhile, for different virtual teaching objects, the actual position at which the virtual teaching object should appear needs to be determined in conjunction with the current scene, and where the virtual teaching object is displayed is determined from that actual position. Specifically, in this embodiment, the intersection of the reflecting device 30 with the line connecting the actual position and the eye-box range 62 is taken as the prompt position; that is, the actual position, the prompt position, and the eye-box range 62 are collinear. When the projection image source 10 displays the virtual teaching object at the prompt position, the learner at the eye-box range 62 views the virtual teaching object at the prompt position, so that from the learner's perspective the virtual teaching object appears to exist at the actual position. For example, if the learner needs to practice avoiding obstacles, a virtual obstacle may be displayed in the current driving lane. In addition, since the virtual teaching object is generally a stationary object, when the head-up display device displays it, the prompt position needs to be determined and adjusted in real time so that the virtual teaching object appears stationary to the learner.
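Keeping a virtual teaching object apparently stationary amounts to recomputing this line–plane intersection every frame as the eye box moves. A sketch with invented coordinates (the reflecting device is approximated as a vertical plane 1 m ahead of the eye box):

```python
def line_plane(eye, target, p0, n):
    """Prompt position: intersection of the eye-box -> target line with
    the reflecting-device plane (all coordinates are illustrative)."""
    d = [t - e for t, e in zip(target, eye)]
    s = (sum(ni * (pi - ei) for ni, pi, ei in zip(n, p0, eye))
         / sum(ni * di for ni, di in zip(n, d)))
    return tuple(e + s * di for e, di in zip(eye, d))

# A virtual sign fixed at world x = 30; as the vehicle (and eye box)
# advances, the prompt position is recomputed each frame so the sign
# appears static in the world.
sign = (30.0, 2.0, 1.0)
frames = []
for vehicle_x in (0.0, 5.0, 10.0):
    eye = (vehicle_x, 0.0, 1.2)
    windshield = (vehicle_x + 1.0, 0.0, 0.0)  # plane 1 m ahead of the eye box
    frames.append(line_plane(eye, sign, windshield, (1.0, 0.0, 0.0)))
```

As the vehicle approaches the virtual sign, the prompt position drifts across the reflecting device (here its lateral coordinate grows from frame to frame), which is why the embodiment requires real-time adjustment.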
In this embodiment, since environments such as driving-teaching road sections are relatively fixed, virtual AR operation prompts can be set at different turns according to the current teaching environment, or special scenes can be added via AR, such as pedestrians appearing, an accident ahead on the road section, an emergency near a stop/yield sign, or a simulated meeting with oncoming traffic. By simulating such special scenes and assessing operation under them, the learner's responsiveness and driving ability can be improved, and the teaching level and assessment difficulty of driving instruction can also be raised.
Optionally, the driving assistance controller 100 determining the prompt content to be displayed includes: determining target information to be displayed, and generating prompt content with corresponding characteristic values according to the parameter value of the target information, where the characteristic values include one or more of color, size, and shape.
In the embodiment of the invention, when the head-up display device displays certain target information, the target information has attributes such as a parameter value, and prompt content with a corresponding color, size, or shape can be generated adaptively based on the parameter value. For example, if the target information is the vehicle speed, it may be displayed in different colors according to the magnitude of the speed value (i.e., the parameter value of the vehicle speed); furthermore, the vehicle speed may be represented schematically by an engine shape, e.g., a green engine at low speed, followed in order by a yellow engine and a red engine as the rotation speed increases, so that the target information is displayed more intuitively.
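The parameter-value-to-characteristic-value mapping can be sketched as follows; the colour bands and the size rule are assumptions for illustration:

```python
# Illustrative mapping from a parameter value (the vehicle speed) to the
# characteristic values of the prompt content; the colour bands and the
# size rule are assumptions, not taken from the patent.
def speed_prompt_features(speed_kmh):
    if speed_kmh < 40.0:
        color = "green"
    elif speed_kmh < 60.0:
        color = "yellow"
    else:
        color = "red"
    # font size grows with speed, capped so the prompt stays readable
    size = 14 + 2 * min(int(speed_kmh // 20), 4)
    return {"color": color, "size": size, "shape": "engine-icon"}
```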
On the basis of the above embodiment, when the driving assistance controller 100 determines the prompt content to be displayed, if an external object exists outside the local vehicle, the driving assistance controller 100 may be configured to determine the external object information, generate the prompt content to be displayed according to the external object information, determine the prompt position, and then control the projection image source 10 to display the prompt content at the prompt position. The prompt position is a preset position for displaying the prompt content; alternatively, the prompt position is the intersection of the reflecting device 30 with the line connecting the current position of the external object and the eye-box range 62.
In the embodiment of the invention, each prompt content may have a corresponding prompt position, which may be preset or determined based on the current actual scene. For example, if the prompt content is the vehicle speed and the vehicle speed is displayed at the lower left of the reflecting device 30, the corresponding position at the lower left of the reflecting device 30 may be used directly as the prompt position. Alternatively, when a pedestrian currently exists outside (i.e., the external object is a pedestrian), a graphic corresponding to the pedestrian's position needs to be formed to remind the driver; the graphic is the prompt content, and the position on the reflecting device 30 where it needs to be displayed is the prompt position. Specifically, the line connecting the current position of the pedestrian with the eye-box range 62 may be determined, and the intersection position between this line and the reflecting device 30 used as the prompt position, where the projection image source 10 displays the corresponding graphic. In this embodiment, the prompt position may be a position point or a position range, determined based on the actual situation.
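The sight-line intersection described above can be sketched as a simple line-plane computation; here the reflecting device 30 is modeled as an infinite plane, which is a simplifying assumption of this sketch:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def prompt_position(eye, obj, plane_point, plane_normal):
    """Intersection of the line from the eye-box position `eye` to the
    external object `obj` with the plane of the reflecting device.
    Returns None when the sight line is parallel to the plane."""
    direction = tuple(o - e for e, o in zip(eye, obj))
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None
    t = dot(plane_normal, tuple(p - e for e, p in zip(eye, plane_point))) / denom
    return tuple(e + t * d for e, d in zip(eye, direction))

# Eye box at the origin, pedestrian 10 m ahead, reflecting plane 2 m ahead:
print(prompt_position((0, 0, 0), (0, 1, 10), (0, 0, 2), (0, 0, 1)))
```

Whether the result is then treated as a single position point or widened into a position range is decided per the actual scene, as the paragraph above notes.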
On the basis of the above embodiment, the external object information includes one or more of the current position of the external object, the current distance to the external object, and the movement speed of the external object; the driving assistance controller 100 generating the prompt content to be displayed currently according to the external object information and determining the prompt position includes:
generating prompt content with a corresponding characteristic value according to the difference between the external object information and a standard value, wherein the characteristic value comprises one or more of color, size and shape; and taking, as the prompt position, the intersection position between the reflecting device 30 and the line connecting the current position of the external object with the eye-box range 62.
In the embodiment of the present invention, the external object may also have corresponding parameter values, which may be included in the external object information. For example, when the external object is vehicle A, the speed of vehicle A, its distance from the local vehicle, and the like may serve as parameter values of vehicle A. Meanwhile, a standard value may be preset for the external object; the standard value is a standard parameter value of the external object, and by comparing the parameter value of the external object with the standard value, it can be determined which characteristic value is more suitable for displaying the prompt content. The standard value may be a preset fixed value. For a speed-related standard value, for example, the maximum speed allowed in the current lane may be used: if the speed of a nearby vehicle is greater than this standard value, the vehicle is speeding and red prompt content may be generated (i.e., the characteristic value is red); if it is not speeding, green prompt content may be generated (i.e., the characteristic value is green).
Alternatively, the standard value may be determined in real time based on the external object information. For a distance-related standard value, for example, the safe distance may be determined from the speed of the local vehicle and used as the standard value. If the other vehicle A is far from the local vehicle, i.e., the difference between the current distance of the external object and the standard value is large, green prompt content may be generated; if the difference is only moderately large, yellow prompt content may be generated; and if the difference is small, red prompt content may be generated.
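The real-time standard value and the resulting characteristic value might be computed as follows; the rule of thumb for safe distance and the margin thresholds are illustrative assumptions, not specified by this document:

```python
def safe_distance_m(local_speed_kmh: float) -> float:
    """Assumed rule of thumb: safe distance in meters numerically
    equal to the local vehicle speed in km/h."""
    return local_speed_kmh

def distance_color(current_distance_m, local_speed_kmh,
                   large_margin=50.0, small_margin=10.0):
    """Characteristic value (color) from the difference between the external
    object information (current distance) and the real-time standard value."""
    diff = current_distance_m - safe_distance_m(local_speed_kmh)
    if diff >= large_margin:
        return "green"   # difference large: object comfortably far
    if diff >= small_margin:
        return "yellow"  # difference only moderately large
    return "red"         # difference small: generate red prompt content

print(distance_color(200, 100), distance_color(120, 100), distance_color(105, 100))
```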
In addition, since the head-up display device can image over a wide range of the surface of the reflecting device 30, when there are external objects of different kinds outside, or multiple external objects of the same kind, the characteristic value of the prompt content can be determined based on the actual situation. Specifically, when there are a plurality of external objects, the driving assistance controller 100 generates, for each external object, prompt content having a characteristic value corresponding to the difference between that object's information and the standard value, and different external objects or different classes of external objects correspond to prompt content with different characteristic values.
In the embodiment of the invention, different characteristic values can be allocated to different external objects or different classes of external objects. For example, for the two classes of external objects vehicle and pedestrian, characteristic values of different shapes, sizes and the like can be adopted; alternatively, if a plurality of vehicles are present, different characteristic values such as different shapes and colors may be assigned to different vehicles to distinguish them. Specifically, for an external object to be marked, an outline drawn in lines of different colors may represent the size of the object, graphic characters around the outline may record its motion information, and an arrow or trace line may represent its direction of motion. Based on the head-up display device, the outlines of pedestrians around the local vehicle and the direction and speed of their movement can be displayed rapidly, so that the driver can conveniently judge whether a collision with the moving local vehicle may occur.
In this embodiment, since the display area of the head-up display device is very large, the displayed prompt content can follow a pedestrian until the pedestrian disappears from the driver's view. When a plurality of moving objects appear in the driver's field of view, all of them can be framed simultaneously by the head-up display device, and different objects, such as vehicles, pedestrians, animals, road boundaries and obstacles, can be represented by different figures.
On the basis of the above-described embodiment, the driving assistance controller 100 may also display the movement tendency of the external object, further intuitively showing whether the external object may collide with the local vehicle. Specifically, the driving support controller 100 is further configured to:
determining a local motion path, and determining a motion path of an external object according to a change value of external object information; taking the local motion path and/or the motion path of the external object as prompt content; and/or when the motion path of the external object is intersected with the local motion path, the corresponding warning message is used as prompt content, and the warning message comprises one or more of warning characters, warning images and warning videos.
In the embodiment of the present invention, the driving assistance controller 100 may predictively determine the motion path of the local vehicle, i.e., the local motion path, according to state parameters of the local vehicle (such as vehicle speed, orientation, wheel steering angle, etc.); meanwhile, the movement path of the external object may be predictively determined based on changes in the external object information, such as the current position of the external object. The driving assistance controller 100 may display the local motion path and the motion path of the external object on the reflecting device 30, so that the driver can intuitively predict whether the external object may collide with the local vehicle. In addition, the driving assistance controller 100 may determine whether a collision risk exists based on whether the movement path of the external object intersects the local motion path: if the two paths intersect, a collision risk exists, and at this time corresponding warning information may be displayed on the reflecting device 30 as the prompt content, for example the warning text "please avoid".
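One way to sketch the intersection test between the predicted paths, treating each path as a 2-D polyline (a simplification assumed here, not the patent's stated implementation):

```python
def _cross(a, b, c):
    # Orientation of point c relative to the segment a -> b.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def _segments_cross(p1, p2, q1, q2):
    d1, d2 = _cross(q1, q2, p1), _cross(q1, q2, p2)
    d3, d4 = _cross(p1, p2, q1), _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0   # proper (non-touching) crossing

def paths_intersect(local_path, object_path):
    """True when any segment of the predicted local motion path crosses any
    segment of the external object's predicted motion path."""
    for a, b in zip(local_path, local_path[1:]):
        for c, d in zip(object_path, object_path[1:]):
            if _segments_cross(a, b, c, d):
                return True
    return False

if paths_intersect([(0, 0), (10, 0)], [(5, -5), (5, 5)]):
    print("please avoid")   # warning message becomes the prompt content
```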
Referring to fig. 10, if the external object is an oncoming vehicle 73 whose movement trend is toward the local vehicle, the movement path 505 of the oncoming vehicle 73 may be displayed on the reflecting device 30 together with the text 501 "please avoid!", to alert the driver of the local vehicle to the oncoming vehicle 73; in addition, the oncoming vehicle 73 may be marked by a rectangular frame 502 to further improve the reminding effect.
Optionally, in an unmanned (autonomous) driving scenario, when an external object approaches the local vehicle to a certain extent, the speed, size profile, movement path and the like of the object can be displayed rapidly, and whether the approaching object poses a danger can be judged; when it is judged that an approaching object may pose a danger, the head-up display device may flicker the object's outline and change its color, for example from green to red, to remind the driver that control can be taken over from the autonomous driving system.
On the basis of the above embodiment, the driving support controller 100 may also determine navigation information, and generate the prompt content to be displayed according to the navigation information. In this embodiment, the navigation information may include a start position, an end position, a current distance from the end, a time to reach the end, one or more travel paths, and the like, and when the driver needs to display navigation, the navigation information may be displayed on the reflection device 30, so that the driver can conveniently view the navigation information in real time.
Specifically, the auxiliary driving controller 100 determining the navigation information may include:
acquiring navigation information issued by a server;
or, acquiring the running requirement input by the user, and generating navigation information according to the running requirement;
alternatively, one or more travel paths are generated and displayed on the reflection device 30, and when a selection instruction input by the user is received, navigation information is generated according to the travel path to which the selection instruction is directed.
In this embodiment, when the driver needs navigation, the navigation information may be generated by a server and sent to the local head-up display device, which then displays it on the reflecting device 30; the server may be a preset upper computer, a smartphone used by the driver, or the like, as long as it can generate navigation information. Alternatively, the driver may input a travel demand, such as a destination, to the head-up display device, which automatically generates navigation information based on that demand. In addition, there may be multiple travel paths from the start position to the end position; the head-up display device may display the determined travel path or paths on the reflecting device 30 for the user to select, and after receiving a selection instruction input by the user, display only the selected travel path. The driver can select a preferred driving path through gestures, voice, keys and the like.
In this embodiment, the head-up display device may be installed in a logistics vehicle, so that navigation may be displayed to the logistics-vehicle driver. The logistics vehicle information may specifically include cargo information, delivery-point positions, the number of cargoes, cargo delivery addresses, the cargo delivery sequence, cargo volume/weight, delivery requirements, demand information of receiving users, and the like. From this information, a recommended path with the shortest total transportation distance, a recommended path with the shortest total transportation duration, a recommended path that delivers large cargoes first, or the like can be obtained through analysis and displayed as navigation information on the windshield in front of the logistics vehicle. Meanwhile, the distribution status can be displayed, such as the total number of pieces to be distributed, the number already distributed and the number not yet distributed; alternatively, a camera may be installed in the cargo compartment to capture the condition inside the container in real time and display it on the windshield. In addition, the recommended route can be re-planned and displayed according to real-time traffic conditions, dispatch conditions and the like, or according to the driver's own operation.
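The path recommendation above might be sketched as selecting among pre-computed candidates; the field names and criterion labels below are assumptions for illustration, not identifiers from this document:

```python
def recommend_path(candidates, criterion="distance"):
    """Pick a recommended path among candidates whose totals were
    derived from the logistics-vehicle information."""
    keys = {
        "distance": lambda p: p["total_km"],       # shortest total distance
        "duration": lambda p: p["total_min"],      # shortest total duration
        "large_cargo_first": lambda p: p["large_rank"],
    }
    return min(candidates, key=keys[criterion])

candidates = [
    {"name": "A", "total_km": 42.0, "total_min": 95, "large_rank": 2},
    {"name": "B", "total_km": 47.5, "total_min": 80, "large_rank": 1},
]
print(recommend_path(candidates, "duration")["name"])
```

Re-planning on real-time traffic or dispatch changes then amounts to recomputing the candidate totals and calling the selector again.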
Optionally, the driving assistance controller 100 generates the prompt content to be displayed according to the navigation information, including:
Taking key information in the navigation information as prompt content to be displayed, wherein the key information comprises one or more of a starting point position, an end point position, a distance from the end point, a time for reaching the end point and a travel progress;
and/or determining corresponding operation prompt information according to the local current position and the navigation information, and taking the operation prompt information as prompt content.
In the embodiment of the invention, all of the navigation information may be displayed as key information, or part of it may be selected and displayed as key information. In addition, corresponding manipulation prompt information is determined according to the local current position and the navigation information and displayed on the reflecting device 30, so that the driver can perform reasonable driving operations based on the manipulation prompt information seen. As shown in fig. 11, the head-up display device determines that the local vehicle is currently located in the current driving lane 75 and knows from the navigation information that the local vehicle needs to turn right at this position; the warning text 501, i.e. "please turn right", can then be displayed on the reflecting device 30, and meanwhile an arrow 504 attached to the current driving lane 75 can be displayed, intuitively reminding the driver to turn right.
Optionally, the head-up display device may also collect related information of the public object, so as to avoid collision between local navigation information and the public object. Specifically, the driving support controller 100 is further configured to:
step A1: corresponding information of public objects is acquired, and the public objects comprise public transportation and emergency vehicles.
Step A2: and determining the position and/or the driving path of the public object according to the corresponding information of the public object, and taking the position and/or the driving path of the public object as prompt content.
In the embodiment of the invention, the head-up display device can acquire the corresponding information of a public object from a public information platform (such as a municipal information system or a civil navigation service system); the public object can be public transportation such as a bus, or an emergency vehicle such as an ambulance, fire engine, police car or road rescue vehicle. The corresponding information of the public object may include the task execution time, departure place, destination, degree of urgency, and the like.
In this embodiment, the public information platform may issue the task-execution arrangement of an emergency vehicle to the service providers of different head-up display devices, to a navigation system, or to a user's mobile phone, so that the head-up display device of the local vehicle can obtain the corresponding information of the public object directly or indirectly. The local device may then perform positioning and position-information calculation, or another server may perform these calculations and transmit results such as distance, position and driving path to the local head-up display device, which can then display the position and/or driving path of the public object, facilitating avoidance by the driver in advance. For example, a map display of the emergency vehicle may be added to the originally planned navigation information, the positional relationship between the emergency vehicle and the current vehicle may be presented, and the routes of the two may be displayed in different colors.
Optionally, when there is an intersection between the local navigation information and the position or travel path of the public object, the driving assistance controller 100 is further configured to: generate alarm information, or update the navigation information so that the updated navigation information has no intersection with the position or driving path of the public object. For example, as shown in fig. 12, when the local vehicle is traveling in the current driving lane 75 and an ambulance is performing a mission 500 m behind, the driver may be alerted through a bird's-eye view 503 serving as the alarm information; in the bird's-eye view 503, the local vehicle 72 is displayed in plan view, the ambulance 74 is displayed behind the local vehicle 72, and the azimuth and distance between them, i.e. "500 m behind", are indicated. At the same time, the text 501 "avoid ambulance" may also be displayed to further alert the driver.
In the embodiment of the present invention, when the position or driving path of the public object does not intersect the local motion path or the local navigation information, the driving path of the local vehicle does not conflict with that of the public object; at this time, the position or driving path of the public object may be displayed on the reflecting device 30 as secondary information, mainly serving to notify the driver. For example, without changing the information display of the main interface of the head-up display device, a small bubble or arrow can be presented at the edge of the display interface to mark the approaching direction and distance of the emergency vehicle.
When there is an intersection between the local navigation information and the position or driving path of the public object, the driving route of the local vehicle may affect the public object, and the presence of the public object may also affect the time at which the local vehicle reaches its destination. Therefore, alarm information can be generated to ask the driver whether to change the route; alternatively, the navigation information may be updated directly to provide the driver with a navigation route that does not conflict with the position or driving path of the public object.
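A hedged sketch of the route-update step, with routes modeled as lists of road-segment IDs (an assumption of this sketch, not a representation of the patent's implementation):

```python
def update_navigation(candidate_routes, public_route):
    """Return (route, alarm): the first candidate sharing no segment with
    the public object's driving path, or the original route plus alarm
    text when every candidate conflicts."""
    public = set(public_route)
    for route in candidate_routes:
        if not public.intersection(route):
            return route, None
    return candidate_routes[0], "avoid ambulance"

routes = [["s1", "s2", "s3"], ["s1", "s4", "s5"]]
route, alarm = update_navigation(routes, ["s2", "s6"])
print(route, alarm)
```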
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art can easily think about variations or alternatives within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (16)

1. A head-up display device for driving assistance, comprising: a projection image source, a light control device and an auxiliary driving controller; the light control device is arranged on one side of the reflecting device and comprises a main optical axis control element and a dispersion element; the light control device is laid on the surface of the instrument panel (IP) platform;
The projection image source is used for emitting imaging light rays incident to the light control device, and the imaging light rays emitted by the projection image source are emitted to the main optical axis control element after passing through the dispersion element; the main optical axis control element is used for reflecting a plurality of imaging light rays to the reflecting device and reflecting the imaging light rays to the same observation range through the reflecting device, and the observation range is a position or a region in the range of the eye box; the main optical axis control element is used for converging the imaging light rays emitted by the projection image source to the observation range;
the dispersing element is arranged on one side of the main optical axis control element, which is close to the projection image source, and is arranged between the main optical axis control element and the projection image source, and the dispersing element is used for dispersing the imaging light reflected by the main optical axis control element and forming a light spot covering the range of the eye box;
the driving assisting controller is connected with the projection image source and used for determining prompt contents to be displayed and controlling the projection image source to display the prompt contents.
2. The head-up display device according to claim 1, wherein the auxiliary driving controller determining prompt content to be displayed includes:
The auxiliary driving controller obtains vehicle state information, wherein the vehicle state information comprises one or more of vehicle speed, yaw rate, brake pedal displacement and accelerator pedal displacement; when the vehicle state information exceeds a preset safety range, corresponding first warning information is used as the prompt content, and the first warning information comprises one or more of first warning characters, first warning images and first warning videos;
or, the driving assistance controller acquires external object information including a current distance to an external object; when the current distance is smaller than a preset safety distance, corresponding second warning information is used as the prompt content, wherein the second warning information comprises one or more of second warning characters, second warning images and second warning videos;
or the auxiliary driving controller acquires driver information, wherein the driver information comprises one or more of blink frequency, eye closing time, current gazing direction, concentrated time of sight, low head time and low head frequency; when the driver information meets abnormal driving conditions, corresponding third warning information is used as the prompt content, and the third warning information comprises one or more of third warning characters, third warning images and third warning videos; wherein the abnormal driving condition includes: one or more of the blink frequency is greater than a preset blink frequency threshold, the eye closure time period is greater than a preset eye closure time period threshold, the current gaze direction deviates from the road direction and the deviation time period is greater than a preset deviation time period threshold, the line of sight concentration time period is greater than a preset line of sight concentration time period threshold, the low head time period is greater than a preset low head time period threshold, and the low head frequency is greater than a preset low head frequency threshold;
Or the driving assistance controller acquires the control information of the driver, and when the control information is not matched with the standard control information, the corresponding fourth warning information and/or the standard control information are used as the prompt content, and the fourth warning information comprises one or more of fourth warning characters, fourth warning images and fourth warning videos.
3. The head-up display device according to claim 1, wherein the auxiliary driving controller determining prompt content to be displayed includes:
the auxiliary driving controller is used for determining a virtual teaching object to be displayed according to the current scene, and taking the virtual teaching object as the prompt content;
determining an actual position of the virtual teaching object to be displayed, taking an intersecting position between a connecting line of the actual position and the eye box range and the reflecting device as a prompting position, and controlling the projection image source to display the prompting content at the prompting position.
4. The head-up display device according to claim 3, wherein the auxiliary driving controller determining prompt content to be displayed includes:
and determining target information to be displayed, and generating prompt contents with corresponding characteristic values according to the parameter values of the target information, wherein the characteristic values comprise one or more of color, size and shape.
5. The head-up display device according to claim 1, wherein the auxiliary driving controller determining prompt content to be displayed includes:
the auxiliary driving controller determines external object information, generates prompt contents to be displayed currently according to the external object information, and determines prompt positions; then controlling the projection image source to display the prompt content at the prompt position;
the prompting position is a preset position for displaying the prompting content; alternatively, the hint location is an intersection location between a current location of the external object and a line connecting the range of the eyebox and the reflective device.
6. The head-up display device of claim 5, wherein the external object information includes one or more of a current position of the external object, a current distance from the external object, and a movement speed of the external object;
the auxiliary driving controller generates prompt contents to be displayed currently according to the external object information, determines prompt positions, and comprises the following steps:
generating prompt contents with corresponding characteristic values according to the difference between the external object information and the standard value, wherein the characteristic values comprise one or more of color, size and shape; and taking the intersection position between the current position of the external object and the connecting line of the eye box range and the reflecting device as a prompt position.
7. The head-up display apparatus according to claim 6, wherein when there are a plurality of external objects, the driving assistance controller generates a hint content having a corresponding feature value from a difference between the external object information and a standard value of each of the external objects, and different external objects or different classes of external objects correspond to hint contents having different feature values.
8. The heads-up display device of claim 6 wherein the drive-assist controller is further configured to:
determining a local motion path, and determining the motion path of the external object according to the change value of the external object information;
taking the local motion path and/or the motion path of the external object as prompt content; and/or when the motion path of the external object intersects with the local motion path, the corresponding warning message is used as the prompt content, and the warning message comprises one or more of warning characters, warning images and warning videos.
9. The head-up display device according to claim 1, wherein the auxiliary driving controller determining prompt content to be displayed includes:
And determining navigation information, and generating prompt contents to be displayed according to the navigation information.
10. The heads-up display device of claim 9 wherein the auxiliary drive controller determining navigation information includes:
acquiring navigation information issued by a server;
or, acquiring a running requirement input by a user, and generating navigation information according to the running requirement;
or generating one or more travel paths and displaying the travel paths on the reflecting device, and generating navigation information according to the travel path pointed by the selection instruction when the selection instruction input by a user is received.
11. The head-up display device according to claim 9, wherein the driving assistance controller generating the prompt content to be displayed according to the navigation information includes:
taking key information in the navigation information as prompt content to be displayed, wherein the key information comprises one or more of a starting point position, an end point position, a distance from the end point, a time for reaching the end point and a travel progress;
and/or determining corresponding operation prompt information according to the local current position and the navigation information, and taking the operation prompt information as prompt content.
12. The heads-up display device of claim 9 wherein the drive-assist controller is further configured to:
acquiring corresponding information of a public object, wherein the public object comprises public transportation and emergency vehicles;
and determining the position and/or the driving path of the public object according to the corresponding information of the public object, and taking the position and/or the driving path of the public object as prompt content.
13. The heads-up display device of claim 12 wherein the drive-assist controller is further configured to:
when the intersection exists between the navigation information and the position or the running path of the public object, generating alarm information or updating the navigation information, and no intersection exists between the updated navigation information and the position or the running path of the public object.
14. The head-up display device of claim 1, wherein the driving assistance controller is further configured to:
sending reminding voice to a sound generating device and indicating the sound generating device to play the reminding voice;
or sending a vibration signal to a vibration device to instruct the vibration device to vibrate; the vibration device is a device that can be contacted with a user.
15. The head-up display device of any one of claims 1-14, further comprising an information acquisition device communicatively coupled to the driving assistance controller; the information acquisition device comprises one or more of an image acquisition device, a vehicle-mounted radar, an infrared sensor, a laser sensor, an ultrasonic sensor, a speed sensor, a rotational speed sensor, an angular velocity sensor, a displacement sensor, a GPS, a V2X system, and an ADAS;
the information acquisition device is configured to acquire current information and send the acquired current information to the driving assistance controller; the current information comprises one or more of vehicle state information, external object information, driver information, driver manipulation information, a local motion path, navigation information, and corresponding information of a public object;
the driving assistance controller is configured to generate the prompt content according to the current information.
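The data flow of claim 15 (acquisition devices supply categorized "current information"; the controller turns it into prompt content) can be sketched as a simple dispatch. All category keys, field names, and prompt strings below are hypothetical, not from the patent:

```python
# Illustrative sketch: mapping current-information categories to prompt
# content for head-up display.

def generate_prompt_content(current_info):
    """current_info: dict keyed by information category; returns prompt strings."""
    handlers = {
        "vehicle_state": lambda v: f"speed {v.get('speed_kmh', '?')} km/h",
        "external_object": lambda v: f"object ahead at {v.get('distance_m', '?')} m",
        "navigation": lambda v: f"next: {v.get('next_action', '?')}",
    }
    prompts = []
    for category, payload in current_info.items():
        handler = handlers.get(category)
        if handler:  # categories without a handler produce no prompt
            prompts.append(handler(payload))
    return prompts
```

A real controller would add handlers for the remaining categories (driver information, manipulation information, local motion path, public-object information) in the same pattern.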
16. The head-up display device of claim 1, wherein:
the main optical axis control element comprises a plurality of discontinuous first reflecting structures, and the first reflecting structures are used for reflecting one path of imaging light rays to the observation range;
a point (x, y, z) on the first reflecting structure satisfies the following equation [equation rendered as an image in the source]:
wherein P1 is the coordinate of the position of the projection image source, P2 is the coordinate of the observation range, M0(x0, y0, z0) is the coordinate of a known point on the first reflecting structure, and n denotes the normal vector of the first reflecting structure;
or,
the main optical axis control element comprises a plurality of continuous second reflecting structures configured to reflect a plurality of paths of imaging light to the observation range;
the included angle θ between the second reflecting structure and the plane in which the main optical axis control element lies satisfies the following equation [equation rendered as an image in the source]:
wherein m denotes the normal vector of the plane in which the main optical axis control element lies; P1 is the coordinate of the position of the projection image source; P2 is the coordinate of the observation range; M0(x0, y0, z0) is the coordinate of a known point on the intersection line of the second reflecting structure and the plane of the main optical axis control element; and n0 denotes the normal vector of the second reflecting structure at point M0;
a point M(x, y, z) on the intersection line of the second reflecting structure and the plane of the main optical axis control element satisfies the following equation [equation rendered as an image in the source]:
wherein n(M) denotes the normal vector of the second reflecting structure at point M.
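The facet geometry of claim 16 can be illustrated under one natural assumption: each reflecting structure obeys the law of reflection, so its normal at a point M0 bisects the unit directions from M0 toward the projection image source P1 and toward the observation range P2, and θ is the angle between that normal and the base-plane normal. The sketch below follows that assumption; it is not the patent's own equations, which are rendered as images in the source:

```python
import math

# Illustrative sketch: normal of a mirror facet at m0 that redirects light
# from projection source p1 to observation point p2, and the angle theta
# between the facet and the element's base plane.

def normalize(v):
    """Scale a 3-D vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def facet_normal(p1, p2, m0):
    """Law of reflection: the facet normal bisects the unit vectors
    from m0 toward p1 and from m0 toward p2."""
    u = normalize(tuple(a - b for a, b in zip(p1, m0)))
    w = normalize(tuple(a - b for a, b in zip(p2, m0)))
    return normalize(tuple(a + b for a, b in zip(u, w)))

def facet_angle(facet_n, plane_n):
    """Angle theta between facet and base plane = angle between their normals."""
    dot = sum(a * b for a, b in zip(normalize(facet_n), normalize(plane_n)))
    return math.acos(max(-1.0, min(1.0, abs(dot))))
```

For a source and eyebox placed symmetrically above a facet, the normal points straight up and θ is zero; offsetting the eyebox tilts each facet accordingly, which is what lets discrete facets steer every imaging ray into the observation range.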
CN202010039954.9A 2020-01-15 2020-01-15 Head-up display device for driving assistance Active CN113119862B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010039954.9A CN113119862B (en) 2020-01-15 2020-01-15 Head-up display device for driving assistance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010039954.9A CN113119862B (en) 2020-01-15 2020-01-15 Head-up display device for driving assistance

Publications (2)

Publication Number Publication Date
CN113119862A CN113119862A (en) 2021-07-16
CN113119862B true CN113119862B (en) 2023-09-12

Family

ID=76771232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010039954.9A Active CN113119862B (en) 2020-01-15 2020-01-15 Head-up display device for driving assistance

Country Status (1)

Country Link
CN (1) CN113119862B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114034310B (en) * 2021-10-28 2023-09-29 东风汽车集团股份有限公司 Automatic navigation auxiliary driving system based on AR-HUD and gesture interaction

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334530A (en) * 2007-06-27 2008-12-31 先进微系统科技股份有限公司 Head-up display system
CN101866049A (en) * 2009-04-02 2010-10-20 通用汽车环球科技运作公司 Driving lane on windscreen head-up display
CN106101667A (en) * 2016-06-30 2016-11-09 京东方科技集团股份有限公司 HUD and HUD display method, driving device
CN107076991A (en) * 2014-07-22 2017-08-18 诺迪公司 Compact head-up display system
CN107521411A (en) * 2017-07-18 2017-12-29 吉林大学 Lane-level navigation augmented reality device for assisting a driver
CN108128246A (en) * 2017-12-19 2018-06-08 深圳大学 Vehicle yaw warning and control method and system
CN108237980A (en) * 2016-12-27 2018-07-03 大众汽车有限公司 Driver assistance system, program product, signal sequence, driving device, and method
CN109462750A (en) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 Head-up display system, information display method, device, and medium
CN109946838A (en) * 2017-11-02 2019-06-28 矢崎总业株式会社 Head-up display device
CN110214107A (en) * 2017-01-26 2019-09-06 福特全球技术公司 Autonomous vehicle providing driver education
WO2019225572A1 (en) * 2018-05-25 2019-11-28 日本精機株式会社 Head-up display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180252915A1 (en) * 2017-03-01 2018-09-06 E-Lead Electronic Co., Ltd. Head-up display device with narrow angle diffusion sheet


Similar Documents

Publication Publication Date Title
JP6861375B2 (en) Display system, information presentation system, display system control method, program, and mobile
JP6485732B2 (en) Information providing apparatus, information providing method, and information providing control program
US8536995B2 (en) Information display apparatus and information display method
US20170161009A1 (en) Vehicular display device
US6919866B2 (en) Vehicular navigation system
WO2020261781A1 (en) Display control device, display control program, and persistent tangible computer-readable medium
JP6883759B2 (en) Display systems, display system control methods, programs, and mobiles
JP2015068831A (en) Function-extended three-dimensional (3d) navigation
CN105523041B (en) Lane-departure warning system and method for controlling the system
JP6443716B2 (en) Image display device, image display method, and image display control program
JP6796806B2 (en) Display system, information presentation system, display system control method, program, and mobile
JP7310560B2 (en) Display control device and display control program
JP2017092678A (en) Vehicle image display system
JP7300112B2 (en) Control device, image display method and program
US20230046484A1 (en) Head up display system and control method thereof, and vehicle
JP2016109645A (en) Information providing device, information providing method, and control program for providing information
JP2016107947A (en) Information providing device, information providing method, and control program for providing information
JP2020071415A (en) Head-up display system
JP2019040634A (en) Image display device, image display method and image display control program
CN113126295A (en) Head-up display device based on environment display
CN113119862B (en) Head-up display device for driving assistance
JP2019202589A (en) Display device
JP2021006805A (en) Display control device and display control program
JP7318431B2 (en) Display control device and display control program
JP2022138173A (en) Display device for vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant