WO2022024964A1 - Head-up display (HUD) device - Google Patents


Info

Publication number
WO2022024964A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
display area
viewer
upright
Application number
PCT/JP2021/027463
Other languages
English (en)
Japanese (ja)
Inventor
Yuki Masuya (勇希 舛屋)
Original Assignee
Nippon Seiki Co., Ltd. (日本精機株式会社)
Application filed by Nippon Seiki Co., Ltd. (日本精機株式会社)
Priority to CN202180058974.3A (CN116157290A)
Priority to JP2022540275A (JPWO2022024964A1)
Priority to DE112021004005.7T (DE112021004005T5)
Publication of WO2022024964A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/011 Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion

Definitions

  • The present invention relates, for example, to a head-up display (HUD) device that projects image display light onto a projection member, such as a vehicle windshield or a combiner, to display a virtual image in front of the driver.
  • HUD: head-up display
  • Cognition is improved by presenting an information image that has depth information (a depth image such as an arrow or a map).
  • On the other hand, an information image without depth information (in a broad sense, one whose depth is not emphasized), such as text or numbers, is most convenient when it stands upright for the viewer (an upright image).
  • When displaying characters, numbers, and the like, the HUD device therefore needs to display them as correctly upright content so that a person can read the information correctly.
  • Such an image (virtual image) is called an upright image (upright virtual image). Since an upright image is usually displayed facing the viewer, it may also be called a facing image.
  • The present inventor examined the oblique-image-plane HUD device and recognized the following new problems. If an image (virtual image) can be displayed on an inclined surface, then, as described above, an image (virtual image) with a sense of depth can be displayed; and if the surface is erected to some extent with respect to the ground (or a surface corresponding to the ground), content composed of characters, numbers, and the like, for which a sense of depth is not emphasized, can be displayed as an upright image.
  • If, however, the inclined surface is tilted strongly toward the ground (or its corresponding surface), it becomes well suited to depth display, but the visibility of an upright image deteriorates: the viewer may have difficulty recognizing it, recognition may take a long time even when it succeeds, or the viewer may feel subjective discomfort.
  • Here, "the upright image can be recognized correctly" means that the upright image can be visually recognized normally; otherwise, the upright image cannot be discriminated, identified, or visually recognized, or is difficult to see.
  • A person's visibility (visual sensitivity) also depends on the distance to the image: sensitivity is high when the image is displayed on the near side and comparatively low when it is displayed on the far side.
  • With distance, the inclination (inclination angle) changes as well, so a unified standard (threshold value) cannot be obtained, which makes such a criterion hard to use.
  • One object of the present invention is to enable an oblique-image-plane HUD device to display content meant to be viewed upright (an upright image) while suppressing a decrease in visibility.
  • The head-up display device has: an image display unit that displays an image; an optical system that, by projecting the light of the image displayed by the image display unit toward a projection member, lets the viewer visually recognize a virtual image of the image within a virtual display area in the real space in front of the viewer; and a control unit that controls the display of the image on the image display unit. The direction toward the front of the viewer in real space is defined as the forward direction.
  • The direction perpendicular to the forward direction and along the line segment connecting the viewer's left and right eyes is defined as the left-right direction.
  • The direction along the line segment orthogonal to the forward direction and the left-right direction is defined as the vertical direction or the height direction.
  • The control unit performs control to display an upright image, an image meant to be visually recognized upright, in the display area, which is a flat or curved inclined surface rising from the near, lower side toward the far, upper side with respect to the ground (or a surface corresponding to the ground) in real space.
  • The upright image is displayed in a display area whose outline, as seen from the viewer, is a quadrangle.
  • The difference in convergence angle between the upper end and the lower end of the display area is set below a predetermined threshold value determined on the basis of at least one of the visibility of the image, the time required for visual recognition, and psychological factors such as discomfort.
  • In this way, judging the degree of inclination of the display area at which the upright image remains correctly visible is replaced by a judgment against a new index used as a threshold: the convergence angle difference (or the tilt distortion angle caused by that difference; see the angle θd in FIG. 6C). This makes determining and setting the degree of inclination more efficient (or easier).
  • When the display area has a quadrangular outline, the side of the display area toward the ground (or its corresponding surface, e.g. the road surface) is taken as the lower end (lower side), and the opposite side (the side away from the ground) as the upper end (upper side).
  • A pair of mutually corresponding points (which can be placed at arbitrary positions) is provided on the lower end (lower side) and the upper end (upper side); for example, the right end point or the left end point of each end (each side) is set.
  • This pair of points is defined as a first point and a second point. The convergence angle (the angle formed by the visual axes indicating the line-of-sight directions of the two eyes) when the first point is viewed with the left and right eyes is defined as the first convergence angle, the convergence angle when the second point is viewed is defined as the second convergence angle, and the difference between them (the first convergence angle minus the second convergence angle) is defined as the "convergence angle difference". In other words, this convergence angle difference can be called the "convergence angle difference between the upper end and the lower end of the display area".
  • When the display area is erected on the ground (or its corresponding surface) at a substantially right angle, the upper end (upper side) and the lower end (lower side) of the quadrangle overlap in a plan view from above, and so do the first point and the second point.
  • If the length (height) of the vertical side of the quadrangle is small, the variation in the distances from the first and second points to the left and right eyes caused by their different height positions can be ignored.
  • The first and second convergence angles are then the same (substantially the same), and the convergence angle difference is zero (substantially zero).
  • When the display area is inclined, the convergence angle difference becomes α (α being a value larger than 0).
  • When the inclination increases further, the first convergence angle grows, so its difference from the second convergence angle increases and the convergence angle difference becomes β (β being a value satisfying β > α).
  • The "convergence angle difference between the upper end and the lower end of the display area" is therefore an index of the degree of inclination of the display area with respect to the ground (or its corresponding surface). Because the convergence angle varies with the distance from the viewer's eyes, it also carries distance information; the convergence angle difference is thus one integrated index (threshold) that combines distance with the degree of inclination of the display area (or the virtual image display surface) with respect to the ground (or its corresponding surface). Unlike the conventional method, there is no need to specify the inclination under preconditions such as "so many degrees of inclination at a distance of so many meters".
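As a rough numerical illustration of this index (not taken from the publication itself: the 64 mm interpupillary distance, the distances, and the straight-ahead small-angle geometry are all assumptions), the convergence angle for a fixated point and the resulting difference between the lower and upper ends of a tilted display area can be sketched as:

```python
import math

def convergence_angle_deg(distance_m, ipd_m=0.064):
    # Angle between the two visual axes when both eyes fixate a point
    # straight ahead at distance_m (ipd_m: interpupillary distance).
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * distance_m)))

def convergence_angle_difference_deg(lower_end_m, upper_end_m, ipd_m=0.064):
    # First convergence angle (lower end, nearer to the viewer) minus
    # second convergence angle (upper end, farther away because of the tilt).
    return (convergence_angle_deg(lower_end_m, ipd_m)
            - convergence_angle_deg(upper_end_m, ipd_m))

# Lower end of the display area 5 m from the eyes, upper end 7 m away:
diff = convergence_angle_difference_deg(5.0, 7.0)
print(round(diff, 2))  # about 0.21 degrees
```

Because both the distances to the two end points and the tilt enter the same number, a single threshold on this difference stands in for the "angle at a given distance" preconditions of the conventional method.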
  • The visibility of an upright image varies from person to person and cannot be stated unequivocally, but on the basis of at least one of the visibility of the displayed image, the time required for viewing, and psychological factors such as discomfort, it is possible to determine objectively whether a normal upright image can be visually recognized. Using the above index in that determination yields a threshold value usable in the determination.
  • For example, the convergence angle difference near the limit at which the viewer can fuse (combine) the images of the left and right eyes in the brain and recognize the upright image is set as the threshold value. If, when designing a HUD device, each part is set so that the convergence angle difference stays below this threshold, the viewer can recognize the upright image even when it is displayed on the inclined surface; in other words, the visibility of the upright image is guaranteed to a predetermined level or higher.
  • This new index (the convergence angle difference, or the tilt distortion angle it causes) can also be used for calibration of the HUD device, initialization of the HUD device, simulation of HUD device functions, and the like, improving the efficiency of each of those processes.
  • The display area may be divided into a first region in which both a virtual image of a depth image, an image meant to be viewed tilted, and a virtual image of the upright image can be displayed, and a second region for displaying a virtual image of the depth image. The convergence angle difference between the upper end and the lower end of the display area in the first region is set below the predetermined threshold, and the convergence angle difference between the upper end and the lower end of the display area in the second region is set at or above the predetermined threshold.
  • In other words, the display area is divided into a first region capable of displaying both a depth image and an upright image and a second region suited to displaying a depth image, and the convergence angle difference between the upper end and the lower end of the second region is set at or above the predetermined threshold.
  • Since the threshold is set on the basis of the visibility of the upright image, at or above the threshold the visibility of an upright image drops, making the region unsuited to upright display.
  • Such a region is, however, well suited to displaying a depth image expressed with inclination (including an image that floats in the air and extends substantially parallel to the road surface, or that gives the visual impression of being superimposed on the road surface). For the image (virtual image) displayed in the second region, the convergence angle difference is therefore set at or above the threshold.
  • The predetermined threshold thus functions as a normal visibility determination threshold for judging the normal visibility of the upright image.
  • The convergence angle difference used as the predetermined threshold may be set to 0.2°.
  • That is, the predetermined threshold can be used specifically as a "normal visibility determination threshold", and 0.2° is given as an example of a preferable value.
  • The limitation of the convergence angle difference by the predetermined threshold may be applied to the content of an upright image whose vertical angle of view is 0.75° or less.
  • This takes into account that, as the displayed content grows larger, fusing the left-eye image and the right-eye image in the brain becomes more difficult even at the same convergence angle difference.
  • The threshold is therefore applied to small content with a vertical angle of view of 0.75° or less, and the convergence angle difference between the upper end and the lower end is set below the threshold.
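A minimal sketch of how the two figures above (0.2° difference, 0.75° vertical angle of view) might be combined in a design check; the function names, the content-height parameterization, and the use of None for larger content are illustrative assumptions, not part of the publication:

```python
import math

NORMAL_VISIBILITY_THRESHOLD_DEG = 0.2  # convergence angle difference limit
SMALL_CONTENT_FOV_DEG = 0.75           # vertical angle of view of the content

def vertical_angle_of_view_deg(content_height_m, viewing_distance_m):
    # Angle subtended vertically at the eye by content of the given height.
    return math.degrees(2.0 * math.atan(content_height_m / (2.0 * viewing_distance_m)))

def upright_content_visible(content_height_m, viewing_distance_m, conv_diff_deg):
    # The 0.2-degree limit is applied only to small content
    # (vertical angle of view of 0.75 degrees or less).
    if vertical_angle_of_view_deg(content_height_m, viewing_distance_m) > SMALL_CONTENT_FOV_DEG:
        return None  # limit not applied to larger content
    return conv_diff_deg < NORMAL_VISIBILITY_THRESHOLD_DEG

# A 5 cm tall speed readout at 5 m subtends about 0.57 degrees:
print(upright_content_visible(0.05, 5.0, 0.15))  # True
```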
  • FIG. 1A is a diagram showing an example of the configuration of a HUD device mounted on a vehicle and an inclined display area, and FIGS. 1B and 1C are diagrams showing examples of methods for realizing the display area shown in FIG. 1A.
  • FIG. 2A is a diagram showing the main configuration of a HUD device mounted on a vehicle and a display example in the display area, and FIG. 2B is a diagram showing an example of a display area in which both a virtual image of a depth image and a virtual image of an upright image are displayed.
  • FIG. 3A is a diagram showing a state in which an inclined surface (inclined display area) displaying an arrow figure as a depth image is placed ahead and the viewer views it with both eyes; FIG. 3B shows the image seen with the left eye; FIG. 3C shows the depth image seen by fusing (combining) the left-eye and right-eye images; and FIG. 3D shows the image seen with the right eye.
  • FIG. 4A is a diagram showing a state in which an inclined surface (inclined display area) displaying a vehicle speed display as an upright image is placed ahead and the viewer views it with both eyes; FIG. 4B shows the image seen with the left eye; FIG. 4C shows the upright image seen by fusing (combining) the left-eye and right-eye images; and FIG. 4D shows the image seen with the right eye.
  • FIG. 5A is a diagram showing a state in which a viewer views, with both eyes, a display area erected substantially perpendicular to the road surface; FIG. 5B shows the convergence angles of both eyes with respect to the first right end point of the upper end (upper side) of the display area in FIG. 5A and the corresponding second right end point of the lower end (lower side); and FIG. 5C shows the image obtained by fusing (combining) the left-eye and right-eye images.
  • FIG. 6A is a diagram showing a state in which a viewer views, with both eyes, a display area inclined by approximately 45° with respect to the road surface; FIG. 6B shows the convergence angles for the corresponding end points of the display area in FIG. 6A; FIG. 6C shows the image seen with the left eye; FIG. 6D shows the upright image obtained by fusing (combining) the left-eye and right-eye images; and FIG. 6E shows the image seen with the right eye.
  • FIG. 7A is a diagram showing a state in which a viewer views, with both eyes, a display area inclined by approximately 30° with respect to the road surface; FIG. 7B shows the convergence angles of both eyes with respect to the first right end point of the upper end (upper side) of the display area in FIG. 7A and the corresponding second right end point of the lower end (lower side); FIG. 7C shows the images seen with the left eye and the right eye; and FIG. 7D shows the visual perception, difficult to see because of double vision, obtained when the left-eye and right-eye images are fused (combined).
  • FIGS. 8A and 8B are flowcharts showing an example of a design method for the HUD device (oblique-image-plane HUD device).
  • FIG. 9 is a diagram showing a configuration example of the display control unit (control unit) in the HUD device.
  • FIGS. 10A and 10B are diagrams showing other examples of the inclined display area.
  • FIG. 11 is a graph of experimental results showing the proportion (vertical axis) of respondents who reported no discomfort at each convergence angle difference (horizontal axis).
  • FIG. 1A shows an example of the configuration of a HUD device mounted on a vehicle together with an inclined display area, and FIGS. 1B and 1C show examples of methods for realizing the display area shown in FIG. 1A.
  • The direction along the front of the vehicle 1 (also called the front-rear direction) is the Z direction, and the direction along the width (horizontal width) of the vehicle 1 is the X direction.
  • The Y direction, along a line segment perpendicular to the flat road surface 40, is the direction away from the ground or its corresponding surface (here, the road surface 40).
  • The term "virtual display area" (sometimes simply "display area") provided in front of the viewer can be interpreted broadly.
  • It may be a virtual display surface (sometimes called a virtual image display surface) corresponding to (the display range of) a display surface, such as a screen, on which an image is displayed; alternatively, an image area displayed on that virtual display surface may itself be regarded as one display area (or a part of the virtual image display surface). In the following description it is, on that basis, simply referred to as the "display area".
  • the HUD device 100 of this embodiment is mounted inside the dashboard 41 of the vehicle (own vehicle) 1.
  • In the display area PS1, which has a region inclined with respect to the road surface 40, the HUD device 100 can display, in front of the vehicle 1, an upright image that is visually recognized upright (an image that does not particularly emphasize depth, also called an erect image, composed for example of numbers, letters, and the like) and a depth image in which depth is an important factor (also called an inclined or tilted image, for example a navigation arrow extending along the road surface 40).
  • The HUD device 100 has a display unit (sometimes called an image display unit; specifically, a screen) 160 with a display surface 164 on which an image is displayed, a light projecting unit (image projection unit) 150, and an optical system 120 including an optical member that projects the display light K of the image onto the windshield (a reflective translucent member) 2 as the projection member. The optical member is a curved mirror (concave mirror) 170 having a reflecting surface 179.
  • The reflecting surface 179 of the curved mirror 170 is not a shape with a uniform radius of curvature but, for example, a set of partial regions with a plurality of radii of curvature.
  • A free-curved-surface design method can be used (the surface itself may be a free curved surface).
  • A free curved surface is a curved surface that cannot be expressed by a simple mathematical formula; the surface is expressed by setting a number of intersection points and curvatures in space and interpolating between the intersection points with higher-order equations.
  • The shape of the reflecting surface 179 has a considerable influence on the shape of the display area PS1 and on its relationship to the road surface.
  • The shape of the display area PS1 is also affected by the curved shape of the windshield (reflective translucent member 2) and by the shapes of other optical members mounted in the optical system 120 (for example, a correction mirror), as well as by the shape of the display surface 164 of the display unit 160 (generally flat, though all or part of it may be non-planar) and by the arrangement of the display surface 164 relative to the reflecting surface 179.
  • The curved mirror (concave mirror) 170 is a magnifying reflector and has a considerable influence on the shape of the display area (virtual image display surface); if the shape of its reflecting surface 179 differs, the shape of the display area (virtual image display surface) PS1 actually changes.
  • The display area can be formed by arranging the display surface 164 of the display unit 160 obliquely, at an intersection angle of less than 90 degrees with respect to the optical axis of the optical system (the principal optical axis corresponding to the chief ray).
  • The shape of the curved surface of the display area PS1 may be adjusted through the optical characteristics of all or part of the optical system, through the arrangement of the optical member and the display surface 164, through the shape of the display surface 164, or through a combination of these. The shape of the virtual image display surface can thus be adjusted in various ways, and a display area PS1 having the first region Z1 and the second region Z2 can be realized.
  • The display area PS1 is divided into a first region Z1 capable of displaying both the depth image (tilted image) and the upright image (erect image) and a second region Z2 suited to displaying the depth image (in other words, used exclusively for the display of depth images).
  • The mode and degree of the overall inclination of the display area (including the virtual image display surface) PS1 are adjusted according to the mode and degree of inclination of the display surface 164 of the display unit 160.
  • The distortion of the display area (virtual image display surface) caused by the curved surface of the windshield (reflective translucent member 2) is corrected by the curved shape of the reflecting surface 179 of the curved mirror (concave mirror or the like) 170, and a flat display area (virtual image display surface) PS1 is generated.
  • By rotating the display surface 164, or by moving the mirror to change its relative relationship to the optical member (curved mirror 170), the degree to which the display region (virtual image display surface) PS1, an inclined surface, is separated from the road surface 40 can be adjusted.
  • Alternatively, the shape of the reflecting surface of the curved mirror (concave mirror or the like) 170, which is an optical member, is adjusted (or the shape of the display surface 164 of the display unit 160 is adjusted) so that, near the end (near end) U1 of the display area PS1 close to the vehicle 1, the virtual image display distance is bent toward the road surface and made to stand up against it (in other words, raised); a display area PS1 having an inclined portion is thereby obtained.
  • The reflecting surface 179 of the curved mirror 170 can be divided into three portions: Near (nearby display), Center (middle display), and Far (far display).
  • Near is the portion that generates the display light E1 (indicated by a long-dashed line in FIGS. 4A and 4B) corresponding to the near end U1 of the display area PS1; Center is the portion that generates the display light E2 (indicated by a broken line) corresponding to the intermediate (central) portion U2 of the display area PS1; and Far is the portion that generates the display light E3 (indicated by a solid line) corresponding to the far end portion U3 of the display area PS1.
  • The Center and Far portions are the same as in the curved mirror (concave mirror or the like) 170 used to generate the flat display area PS1 shown in FIG. 1B, but the curvature radius of the Near portion is set smaller than in FIG. 1B. The magnification corresponding to the Near portion therefore becomes large.
  • The magnification (c) of the HUD device 100 depends on the distance (a) from the display surface 164 of the display unit 160 to the windshield 2 and on the distance traveled by the light reflected by the windshield (reflective translucent member 2) toward the viewpoint; when the magnification increases, the image is formed at a position farther from the vehicle 1. That is, in the case of FIG. 1C the virtual image display distance is larger than in the case of FIG. 1B.
  • Accordingly, the near end U1 of the display region PS1 moves away from the vehicle 1 while bowing toward the road surface 40, and as a result the first region Z1 is formed. A display region PS1 having the first region Z1 and the second region Z2 is thus obtained.
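The relation "smaller curvature radius of the Near portion, larger magnification, virtual image formed farther away" follows from the ordinary concave-mirror equation. The sketch below uses assumed, illustrative dimensions (none of the numbers come from the publication):

```python
def concave_mirror_virtual_image(object_dist_m, radius_of_curvature_m):
    # Mirror equation 1/f = 1/do + 1/di with f = R/2; for an object
    # inside the focal length, di comes out negative: a magnified
    # virtual image behind the mirror. Returns (|di|, magnification).
    f = radius_of_curvature_m / 2.0
    di = 1.0 / (1.0 / f - 1.0 / object_dist_m)
    magnification = -di / object_dist_m
    return abs(di), magnification

# Screen 0.2 m from the mirror; reduce the radius of the "Near" portion:
d1, m1 = concave_mirror_virtual_image(0.2, 0.6)  # d1 = 0.6 m, m1 = 3.0
d2, m2 = concave_mirror_virtual_image(0.2, 0.5)  # d2 = 1.0 m, m2 = 5.0
```

The smaller radius pushes the virtual image farther from the mirror and enlarges it, which is the mechanism by which the near end U1 of the display area is moved away from the vehicle and bowed toward the road surface.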
  • FIG. 2A shows the main configuration of a HUD device mounted on a vehicle together with a display example in the display area, and FIG. 2B shows an example of a display area in which both a virtual image of a depth image and a virtual image of an upright image are displayed.
  • In FIG. 2, portions common to FIG. 1 carry the same reference numerals.
  • The HUD device 100 includes a display unit (for example, a light-transmissive screen) 160 having a display surface 164, a reflecting mirror 165, and, as the optical member that projects the display light, a curved mirror 170 (a concave mirror having a reflecting surface 179; the reflecting surface may be a free curved surface).
  • The image displayed on the display unit 160 is projected onto the projection region 5 of the windshield 2, the projection member, via the reflecting mirror 165 and the curved mirror 170.
  • The HUD device 100 may be provided with a plurality of curved mirrors, and a configuration including a functional optical element, such as a refractive optical element like a lens or a diffractive optical element, may also be adopted.
  • In FIG. 2A, as display examples in the first region Z1 of the display area PS1, a vehicle speed display SP, which is an upright image (upright virtual image), and an image (virtual image) AW' of a navigation arrow, which is a depth display, are shown. In the second region Z2, an image (virtual image) AW of a navigation arrow extending along the road surface 40 from the near side toward the far side of the vehicle 1 is shown.
  • The angle (inclination angle) that the first region Z1 forms with the road surface 40 is θ1 (0 < θ1 < 90°), and the angle (inclination angle) that the second region Z2 forms with the road surface 40 is θ2 (0 < θ2 < θ1).
  • Each of the first and second regions Z1 and Z2 is an inclined region (or a region having at least an inclined portion).
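The tilt angle of each region can be tied back to the convergence-angle-difference index using a simple assumed geometry (lower end of the area on the road at a horizontal distance d0, a given slant height, eye height above the road, 64 mm interpupillary distance; all values and the flat-road simplification are illustrative, not from the publication):

```python
import math

def convergence_diff_for_tilt_deg(d0_m, eye_height_m, slant_height_m,
                                  tilt_deg, ipd_m=0.064):
    # Lower end of the display area on the road surface at horizontal
    # distance d0_m; the area rises with the given slant height at
    # tilt_deg from the road (90 = erected upright, small = laid flat).
    t = math.radians(tilt_deg)
    lower = math.hypot(d0_m, eye_height_m)
    upper = math.hypot(d0_m + slant_height_m * math.cos(t),
                       eye_height_m - slant_height_m * math.sin(t))
    angle = lambda d: math.degrees(2.0 * math.atan(ipd_m / (2.0 * d)))
    return angle(lower) - angle(upper)

# A 2 m slant starting 3 m ahead, eye 1.2 m above the road:
shallow = convergence_diff_for_tilt_deg(3.0, 1.2, 2.0, 10.0)  # about 0.41 deg
steep = convergence_diff_for_tilt_deg(3.0, 1.2, 2.0, 80.0)    # about 0.07 deg
```

Against the 0.2° threshold of the embodiment, the shallow 10° tilt would land in the depth-image region Z2, while the steep 80° tilt would qualify for the upright-capable region Z1.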
  • FIG. 3A shows a state in which an inclined surface (inclined display area) on which an arrow figure is displayed as a depth image is placed ahead and the viewer views it with both eyes; FIG. 3B shows the image seen with the left eye; FIG. 3C shows the depth image seen by fusing (combining) the left-eye and right-eye images; and FIG. 3D shows the image seen with the right eye.
  • In FIG. 3A, the midpoint C0 is drawn at the center position between the left eye A1 and the right eye A2.
  • The image obtained by fusing, in the viewer's brain, the image seen by the left eye A1 and the image seen by the right eye A2 can for convenience be regarded as an image at the midpoint position C0.
  • In FIG. 3A, an image (virtual image) AW of a navigation arrow that extends while inclined is displayed in the second region Z2 of the display area PS1.
  • When the images of the left and right eyes A1 and A2 shown in FIGS. 3B and 3D (images with binocular parallax) are fused (combined), an image with a sense of depth (three-dimensionality) as shown in FIG. 3C is visually recognized.
  • The image (virtual image) AW of the arrow is naturally perceived as tilted toward the far side, owing to the change in image shape caused by the positional deviation between the upper and lower ends of the angle of view in each of the left and right eyes A1 and A2.
  • FIG. 4A shows a state in which an inclined surface (inclined display area) on which a vehicle speed display is displayed as an upright image is placed ahead and the viewer views it with both eyes; FIG. 4B shows the image seen with the left eye; FIG. 4C shows the upright image seen by fusing (combining) the left-eye and right-eye images; and FIG. 4D shows the image seen with the right eye.
  • In FIG. 4(A), an image (virtual image) of a vehicle speed display SP (shown as "120 km/h" in FIG. 4(B) and elsewhere) is displayed in the first region Z1 of the display area PS1.
  • This vehicle speed display SP is displayed in the inclined first region Z1 as an upright image (upright virtual image) that is visually recognized in an upright posture.
  • Since the inclination angle of the first region Z1 with respect to the road surface 40 is not very small and the region stands up to some extent, the viewer's recognition is not impaired, and visual recognition as an upright image (reading the information as an upright image) is possible.
  • When the images of the left and right eyes A1 and A2 (images having binocular parallax) shown in FIGS. 4(B) and 4(D) are fused (combined), the image shown in FIG. 4(C) is obtained, and the vehicle speed display SP is visually recognized as an upright image.
  • The appearance of the first region Z1 when its inclination angle with respect to the road surface 40 is small and the fusion (combination) of the left-eye and right-eye images fails varies from person to person: for example, the region may appear tilted as a whole, or the shape seen by one eye may dominate. In any case, visibility decreases, recognition time increases, and the subjective (psychological) impression is also unfavorable.
  • FIG. 5(A) is a diagram showing a state in which a viewer views, with both eyes, a display area erected substantially perpendicular to the road surface.
  • FIG. 5(B) is a diagram showing, in plan view, the convergence angles of both eyes with respect to the display area in FIG. 5(A).
  • FIG. 5(C) is a diagram showing the image obtained by fusing (combining) the left-eye and right-eye images.
  • The display area has a contour of a predetermined shape (here, a quadrangle); the side of the display area facing the ground (or a surface corresponding to the ground, such as a road surface) is the lower end (lower side), and the opposite side (the side away from the ground) is the upper end (upper side).
  • The "quadrangle" indicating the shape of the display area is to be interpreted broadly to include, for example, a rectangle, a square, a trapezoid, and a parallelogram.
  • A display area (here, the first area Z1) standing substantially perpendicular to the road surface 40 is shown in front of the viewer.
  • the viewer is looking at the image (virtual image) displayed in the first region Z1 with both eyes A1 and A2.
  • the displayed image is the vehicle speed display SP shown above.
  • The lower end of the angle of view in the display area (first area Z1) (hereinafter sometimes referred to simply as the lower end or the lower side) is indicated by reference numeral PL, and the upper end of the angle of view (hereinafter sometimes referred to simply as the upper end or the upper side) is indicated by reference numeral PU.
  • FIG. 5(B) shows, in a plan view looking from the upper side toward the lower side in FIG. 5(A) (the −Y direction indicated by the arrow in the figure), the convergence angles due to the binocular parallax of the viewer.
  • A pair of mutually corresponding points (which can be set at arbitrary positions) is provided on each of the lower end (lower side) PL and the upper end (upper side) PU; preferably, each is set to, for example, the right end point or the left end point of each end (each side).
  • Here, the right end point R1 of the lower end (lower side) and the right end point R2 of the upper end (upper side) are set.
  • The point R1 is referred to as a first point, and the point R2 is referred to as a second point.
  • The convergence angle (the angle formed by the visual axes indicating the line-of-sight directions of the eyes A1 and A2) when the first point R1 is viewed with the left and right eyes A1 and A2 is defined as the first convergence angle θL, and the convergence angle when the second point R2 is viewed is defined as the second convergence angle θU. The difference between them (the value obtained by subtracting the second convergence angle θU from the first convergence angle θL) is defined as the "convergence angle difference". In other words, this convergence angle difference can be said to be the "convergence angle difference between the upper end (or upper side) and the lower end (or lower side) of the display area".
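The definition above can be sketched numerically. The following Python snippet is an illustrative sketch only, not part of the disclosure; the eye positions, the sample point coordinates, and the 64 mm interpupillary distance are assumptions, and the function names are hypothetical. It computes the first and second convergence angles θL and θU for a pair of corresponding points and their difference θL − θU:

```python
import math

def convergence_angle(eye_left, eye_right, point):
    """Convergence angle in degrees: the angle formed by the two visual
    axes from the left/right eyes to a fixated point (3-D tuples, metres)."""
    def vec(a, b):
        return tuple(bi - ai for ai, bi in zip(a, b))
    def angle_between(u, v):
        dot = sum(ui * vi for ui, vi in zip(u, v))
        nu = math.sqrt(sum(ui * ui for ui in u))
        nv = math.sqrt(sum(vi * vi for vi in v))
        # Clamp to guard against tiny floating-point overshoot.
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))
    return angle_between(vec(eye_left, point), vec(eye_right, point))

def convergence_angle_difference(eye_left, eye_right, p_lower, p_upper):
    """theta_L - theta_U for corresponding lower-end and upper-end points."""
    return (convergence_angle(eye_left, eye_right, p_lower)
            - convergence_angle(eye_left, eye_right, p_upper))
```

For example, with a lower point about 5 m ahead and an upper point about 7 m ahead, the difference comes out near 0.2°, i.e. around the magnitude of the threshold discussed in this description.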
  • the contour quadrangle of the display area (first area Z1) is viewed from above.
  • the upper end (upper side) PU and the lower end (lower side) PL of the quadrangle overlap, and the first point R1 and the second point R2 overlap.
  • In this example, the length of the vertical side of the quadrangle (the length of the line segment indicating the first region Z1 in FIG. 5(A); in other words, the height of the first region with respect to the road surface 40) is small.
  • Accordingly, the image obtained by fusing (combining) the left-eye and right-eye images is visually recognized as an upright image, and there is no problem with the visibility of the vehicle speed display SP.
  • FIG. 6(A) is a diagram showing a state in which a viewer views, with both eyes, a display area inclined by approximately 45° with respect to the road surface.
  • FIG. 6(B) is a diagram showing the convergence angles of both eyes with respect to the display area in FIG. 6(A).
  • FIG. 6(C) is a diagram showing the image seen with the left eye.
  • FIG. 6(D) is a diagram showing the upright image obtained by fusing (combining) the left-eye and right-eye images.
  • FIG. 6(E) is a diagram showing the image seen with the right eye.
  • In FIG. 6(A), the display area (first area Z1) is arranged at an inclination of approximately 45° with respect to the road surface 40.
  • Accordingly, the lower end (lower side) of the quadrangle showing the contour of the first region Z1 moves toward the viewer, and the lower end (lower side) PL becomes closer to the viewer than the upper end (upper side) PU.
  • Here, the convergence angle difference is α (α is a value larger than 0).
  • The human visual system can recognize an upright image by correcting a certain amount of depth and left-right distortion, and in the example of FIG. 6 the limit of this correction (recognition) function is not exceeded. Therefore, as shown in FIG. 6(D), the vehicle speed display SP can still be correctly visually recognized as an upright image.
  • FIG. 7(A) is a diagram showing a state in which a viewer views, with both eyes, a display area inclined by approximately 30° with respect to the road surface.
  • FIG. 7(B) is a diagram showing the convergence angles of both eyes with respect to the first right end point of the lower end (lower side) and the corresponding second right end point of the upper end (upper side) of the display area in FIG. 7(A).
  • FIG. 7(C) is a diagram showing the image seen with the left eye.
  • FIG. 7(D) is a diagram showing the hard-to-see visual perception caused by double vision when the left-eye and right-eye images fail to fuse (combine).
  • FIG. 7(E) is a diagram showing the image seen with the right eye.
  • In FIG. 7(A), the display area (first area Z1) is inclined further with respect to the road surface 40.
  • As a result, the lower end (lower side) PL of the quadrangle moves further toward, and approaches, the viewer.
  • The first convergence angle θL therefore becomes still larger, the difference from the second convergence angle θU grows, and the convergence angle difference (θL-θU) becomes β (β is a value satisfying β > α).
  • In this way, the "convergence angle difference (θL-θU) between the upper end (upper side) and the lower end (lower side) of the display area" can serve as an index of the degree of inclination of the display area with respect to the ground (or its corresponding surface).
  • Moreover, the convergence angle difference (θL-θU) inherently includes distance information: it is a single integrated index (threshold value) carrying information on the degree of inclination of the display area (or virtual image display surface, etc.) with respect to the ground (or its corresponding surface). There is no need to specify the inclination with distance as a precondition, such as "an inclination of so many degrees at a distance of so many meters" as in the conventional approach. Therefore, by introducing this index into the design of the HUD device and the like, setting the inclination of the display area can be made more efficient (easier).
  • The visibility of an upright image varies from person to person and cannot be stated unequivocally; however, whether a normal upright image can be visually recognized can be determined objectively based on at least one of the visibility of the displayed image (first factor), the time required for visual recognition (second factor), and psychological factors such as discomfort or unpleasantness (third factor). By using the above index in this determination, a threshold value usable in the determination can be obtained.
  • This "predetermined threshold value" can be used, for example, as a "normal-visibility determination threshold value"; an example of a preferable value is the above-mentioned 0.2°.
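As a rough illustration of how such a determination threshold might be used in practice, the following sketch judges whether a display area is usable for upright images from the depths of its lower and upper edges. It is illustrative only: straight-ahead viewing, small-angle geometry, the 64 mm interpupillary distance, and the function names are all assumptions.

```python
import math

IPD = 0.064  # assumed interpupillary distance in metres

def convergence_deg(distance_m):
    """Convergence angle (degrees) for a point straight ahead of the viewer."""
    return math.degrees(2.0 * math.atan(IPD / (2.0 * distance_m)))

def suits_upright_display(d_lower_m, d_upper_m, threshold_deg=0.2):
    """True when the convergence angle difference theta_L - theta_U between
    the lower edge (at d_lower_m) and the upper edge (at d_upper_m) stays
    below the normal-visibility determination threshold."""
    diff = convergence_deg(d_lower_m) - convergence_deg(d_upper_m)
    return diff < threshold_deg
```

Under these assumptions, a display area whose lower edge lies about 6 m away and whose upper edge lies about 7 m away passes the check, while a lower edge at 5 m with an upper edge at 7 m does not.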
  • By using this threshold value, for example, a design that allows a content image (upright image) to be visually recognized upright while suppressing a decrease in visibility becomes more efficient or easier.
  • Further, this new index (the convergence angle difference, or the tilt distortion angle caused by it (see the angle θd in FIG. 6(C))) can also be used for calibration of the HUD device, initialization of the HUD device, simulation of HUD device functions, and the like, yielding effects such as improved efficiency of each process.
  • the display area includes a first area Z1 capable of displaying both a depth image and an upright image, and a second area Z2 suitable for displaying a depth image.
  • In the second region Z2, the convergence angle difference between the upper end and the lower end of the display region is preferably set to the above-mentioned predetermined threshold value (preferably 0.2°) or more.
  • The threshold value is set based on the visibility of the upright image and the like; at or above it, the visibility of an upright image decreases, making the region unsuitable for displaying an upright image.
  • Such a region is, however, suitable for displaying a depth image expressed with an inclination (including an image that floats in the air and extends substantially parallel to the road surface, or that gives a visual impression of being superimposed on the road surface). Therefore, for an image (virtual image) displayed in the second region Z2, the convergence angle difference is set to be equal to or larger than the threshold value.
  • The limitation of the convergence angle difference by the predetermined threshold value may be applied to upright-image content having a vertical angle of view of 0.75° or less.
  • That is, the above threshold value is applied to small content of 0.75° or less, and the convergence angle difference between its upper end and lower end is set to be less than the threshold value.
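The vertical angle of view that decides whether a content item counts as "small" can be computed from its edge positions. The sketch below is illustrative only; the coordinate convention (height, depth pairs in metres) and the function names are assumptions:

```python
import math

def vertical_angle_of_view_deg(eye_yz, lower_yz, upper_yz):
    """Vertical angle (degrees) subtended at the eye by a content item's
    lower and upper edges; each argument is a (height, depth) pair."""
    def elevation(p):
        # Elevation angle of a point relative to the eye position.
        return math.atan2(p[0] - eye_yz[0], p[1] - eye_yz[1])
    return math.degrees(abs(elevation(upper_yz) - elevation(lower_yz)))

def threshold_rule_applies(vfov_deg, limit_deg=0.75):
    """The convergence-difference limit targets small upright content whose
    vertical angle of view is at or below the stated limit."""
    return vfov_deg <= limit_deg
```

For instance, a 5 cm tall speed readout about 6 m ahead subtends roughly 0.5° and therefore falls under the rule, whereas content subtending 1° would not.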
  • FIGS. 8(A) and 8(B) are flowcharts showing an example of a design method for a HUD device (oblique-image-plane HUD device).
  • First, each part of the HUD device (oblique-image-plane HUD device) is designed so that the convergence angle difference between the upper end and the lower end of the display area of an information image (upright image) to be visually recognized upright (or the tilt distortion angle caused by this difference) is less than a predetermined threshold value (preferably less than 0.2°) determined based on at least one of the visibility of the image, the time required for visual recognition, and psychological factors such as discomfort or unpleasantness (step S1).
  • Next, the display area is divided into a first area, in which both an information image (depth image) to be visually recognized with an inclination and an information image (upright image) to be visually recognized upright are displayed, and a second area, in which an information image (depth image) to be visually recognized with an inclination is displayed (step S2).
  • Then, the convergence angle difference between the upper end and the lower end of the first region is designed to be less than the predetermined threshold value (preferably less than 0.2°), and the convergence angle difference between the upper end and the lower end of the second region is designed to be equal to or larger than the predetermined threshold value (step S3).
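The partition in steps S2 and S3 can be explored numerically. The following sketch is illustrative only; the 64 mm interpupillary distance, the scan step, and the function names are assumptions. For a given lower-edge depth it finds the deepest upper edge that still keeps the convergence angle difference under the threshold, i.e. the far limit of a first-region candidate:

```python
import math

IPD = 0.064        # assumed interpupillary distance (m)
THRESHOLD = 0.2    # normal-visibility determination threshold (deg)

def conv_deg(d):
    """Convergence angle (degrees) for a point d metres straight ahead."""
    return math.degrees(2.0 * math.atan(IPD / (2.0 * d)))

def max_upper_depth_for_upright(d_lower, step=0.01, d_max=100.0):
    """Scan upper-edge depths outward from the lower edge and return the
    largest depth keeping conv_deg(d_lower) - conv_deg(d) below THRESHOLD;
    beyond this depth the area only suits the second (depth-image) region."""
    d = d_lower
    while d + step <= d_max and conv_deg(d_lower) - conv_deg(d + step) < THRESHOLD:
        d += step
    return d
```

With a lower edge 6 m away, the scan under these assumptions places the first-region limit at an upper-edge depth of roughly 8.9 m.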
  • FIG. 9 is a diagram showing a configuration example of a display control unit (control unit) in the HUD device.
  • the upper view of FIG. 9 is almost the same as that of FIG. 1 (A).
  • a line-of-sight detection camera 188 and a viewpoint position detection unit 192 are provided.
  • the display control unit (control unit) 190 has an input / output (I / O) interface 193 and an image processing unit 194.
  • The image processing unit 194 includes an image generation control unit 195, a ROM 198 (having an upright image table 199 and a depth image table 200), a VRAM 201 (having a warping parameter 196 and a post-warping-processing data storage buffer 197), and an image generation unit (image rendering unit) 202.
  • Based on these tables and the like, the image generation control unit 195 can control the display position of content, for example arranging the vehicle speed display SP and the arrow display AW′ in the first region Z1 and the arrow display AW in the second region Z2. Further, the display control unit (control unit) 190 can also perform control such as setting the display area at an appropriate position by using the above threshold value, for example at the time of calibration or initialization of the HUD device 100.
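The region-based content placement performed by the image generation control unit 195 can be caricatured as follows. This is a deliberately simplified sketch with hypothetical names; in the embodiment the first region Z1 can also host depth content such as AW′, which this toy router ignores.

```python
UPRIGHT, DEPTH = "upright", "depth"

def place_contents(contents):
    """contents: iterable of (name, kind) pairs; returns a mapping from
    region name to the content names routed there. Upright content stays
    in the first region Z1 (convergence difference below the threshold);
    depth content goes to the second region Z2."""
    layout = {"Z1": [], "Z2": []}
    for name, kind in contents:
        layout["Z1" if kind == UPRIGHT else "Z2"].append(name)
    return layout
```

For example, routing a speed readout and a navigation arrow gives `{"Z1": ["SP"], "Z2": ["AW"]}`.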
  • FIGS. 10(A) and 10(B) are views showing other examples of the inclined display area.
  • The device configuration itself is the same as in FIG. 1(A).
  • The cross-sectional shape of the display area PS1, as seen from the width direction (horizontal direction, X direction) of the vehicle 1, is not limited to one that is convex toward the driver as shown in FIG. 1; as shown in FIG. 10(A), the display area PS1 may be concave toward the driver. Further, the display area PS1 need not be curved, as shown in FIG. 10(B). These are merely examples, and display areas having various cross-sectional shapes can be assumed.
  • the configuration of this embodiment has the effect of improving the visibility of the image.
  • In a sensory evaluation, subjects were asked to visually recognize virtual images displayed by multiple types of head-up display devices and to answer whether or not they felt discomfort.
  • The multiple types of head-up display devices display upright images having different convergence angle differences between the upper end and the lower end.
  • Each upright image has a shape that is visually recognized as a rectangle in the vertical and horizontal directions when viewed from the subject's observation position, and its vertical angle of view is set to 0.75°.
  • FIG. 11 is a graph of the experimental results, showing, for each convergence angle difference (horizontal axis), the proportion of subjects (vertical axis) who answered that they did not feel discomfort. It can be seen that the smaller the convergence angle difference between the upper end and the lower end when visually recognizing the upright image, the larger the proportion of subjects who answered that they did not feel discomfort.
  • The present invention can be widely applied to parallax-type HUD devices that make an image having parallax incident on each eye, light-ray-reproduction-type HUD devices using a lenticular lens or the like, and so on.
  • The term "vehicle" can be interpreted broadly, as meaning a mobile body in general.
  • Terms related to navigation are to be interpreted broadly, for example in view of navigation information in a broad sense that is useful for operating the vehicle; road signs and the like can also be included.
  • The upright image can also be called a facing image that the viewer views face-on, and is to be interpreted broadly without being bound by the name.
  • The HUD device shall also include devices used as simulators (for example, an aircraft simulator or a simulator as a game device).
  • An optical system including an optical member, 150 ... Light projecting unit (image projection unit), 160 ... Display unit (for example, a liquid crystal display device, a screen, etc.), 164 ... Display surface, 170 ... Curved mirror (concave mirror, etc.), 179 ... Reflective surface, 188 ... Line-of-sight detection camera, 190 ... Display control unit (control unit), 192 ... Viewpoint position detection unit, 194 ... Image processing unit, 195 ... Image generation control unit, 196 ... Warping parameter, 197 ... Post-warping-processing data buffer, 198 ... ROM, 199 ... Upright image table, 200 ... Depth image table, 201 ... VRAM (storage device for image processing), 202 ... Image generation unit (image rendering unit), EB ... Eye box, PS1 ... Display area (virtual image display surface), Z1 ... First area (area where an upright image can be displayed)


Abstract

The purpose of the present invention is to make it possible, in an oblique-image-plane HUD device, to display an image of content to be viewed upright (an upright image) while suppressing a decrease in visibility. A control unit of a head-up display device according to the present invention performs control to display an upright image, which is an image to be visually recognized in an upright posture, within a display area (Z1) that is a flat or curved inclined plane tilted, with respect to the ground surface or a plane (40) corresponding to the ground surface in real space, from a side that is near the viewer and low toward a side that is far from the viewer and high. The upright image is displayed in the display area (Z1), which has a quadrangular contour when seen from the viewer, and a convergence angle difference (θL-θU) between an upper end (PU) and a lower end (PL) of the display area (Z1) is set to less than a predetermined threshold value (preferably a convergence angle of 0.2°) determined on the basis of at least one of the visibility of the image, the time required for visual recognition, and a psychological factor such as a feeling of discomfort or unpleasantness.
PCT/JP2021/027463 2020-07-29 2021-07-26 Dispositif d'affichage tête haute (hud) WO2022024964A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202180058974.3A CN116157290A (zh) 2020-07-29 2021-07-26 平视显示装置
JP2022540275A JPWO2022024964A1 (fr) 2020-07-29 2021-07-26
DE112021004005.7T DE112021004005T5 (de) 2020-07-29 2021-07-26 Head-up-Display-Vorrichtung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020128459 2020-07-29
JP2020-128459 2020-07-29

Publications (1)

Publication Number Publication Date
WO2022024964A1 true WO2022024964A1 (fr) 2022-02-03

Family

ID=80035672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/027463 WO2022024964A1 (fr) 2020-07-29 2021-07-26 Dispositif d'affichage tête haute (hud)

Country Status (4)

Country Link
JP (1) JPWO2022024964A1 (fr)
CN (1) CN116157290A (fr)
DE (1) DE112021004005T5 (fr)
WO (1) WO2022024964A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011082985A1 (de) * 2011-09-19 2013-03-21 Bayerische Motoren Werke Aktiengesellschaft Projektionseinrichtung und Verfahren zur Projektion
JP2018120135A (ja) * 2017-01-26 2018-08-02 日本精機株式会社 ヘッドアップディスプレイ
WO2020009219A1 (fr) * 2018-07-05 2020-01-09 日本精機株式会社 Dispositif d'affichage tête haute
JP2020064235A (ja) * 2018-10-19 2020-04-23 コニカミノルタ株式会社 表示装置


Also Published As

Publication number Publication date
JPWO2022024964A1 (fr) 2022-02-03
DE112021004005T5 (de) 2023-06-01
CN116157290A (zh) 2023-05-23

Similar Documents

Publication Publication Date Title
KR101127534B1 (ko) 타원형 디옵터의 두 초점의 무비점수차를 이용하여 망막이미지를 생성하는 방법 및 장치
US9785306B2 (en) Apparatus and method for designing display for user interaction
JP4155343B2 (ja) 二つの光景からの光を観察者の眼へ代替的に、あるいは同時に導くための光学系
JP5008556B2 (ja) ヘッドアップ表示を使用する途上ナビゲーション表示方法および装置
JPH06270716A (ja) 車両のヘッドアップディスプレイ装置
JPH09322199A (ja) 立体映像ディスプレイ装置
JPH01503572A (ja) 立体表示システム
JP7126115B2 (ja) 表示システム、移動体、及び、設計方法
CN112204453B (zh) 影像投射系统、影像投射装置、影像显示光衍射光学元件、器具以及影像投射方法
JP7358909B2 (ja) 立体表示装置及びヘッドアップディスプレイ装置
JP6105531B2 (ja) 車両用投影表示装置
WO2022024964A1 (fr) Dispositif d'affichage tête haute (hud)
JPWO2020032095A1 (ja) ヘッドアップディスプレイ
WO2016150166A1 (fr) Appareil d'amplification et d'affichage formant une image virtuelle
WO2021241718A1 (fr) Dispositif d'affichage tête haute
CN114127614B (zh) 平视显示装置
JP7354846B2 (ja) ヘッドアップディスプレイ装置
CN217655373U (zh) 平视显示装置以及移动体
JP4102410B2 (ja) 立体映像ディスプレイ装置
CN112731664A (zh) 一种车载增强现实抬头显示系统及显示方法
JP6516161B2 (ja) 浮遊像表示装置および表示方法
JP7372618B2 (ja) 車載表示装置
CN215181216U (zh) 像距连续变化的抬头显示系统及车辆
WO2021261438A1 (fr) Dispositif d'affichage tête haute
WO2019151199A1 (fr) Système d'affichage, corps mobile et procédé de mesure

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21848675

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022540275

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 21848675

Country of ref document: EP

Kind code of ref document: A1