WO2018105585A1 - Head-up display device - Google Patents

Head-up display device

Info

Publication number
WO2018105585A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual image
image
display
vehicle
displayable area
Prior art date
Application number
PCT/JP2017/043564
Other languages
English (en)
Japanese (ja)
Inventor
誠 秦
勇希 舛屋
Original Assignee
日本精機株式会社
Application filed by 日本精機株式会社
Publication of WO2018105585A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor

Definitions

  • The present invention relates to a head-up display that displays a virtual image.
  • The head-up display of Patent Document 1 includes an image display unit that displays an image; a projection unit projects the image onto a front windshield (an example of a projection member) positioned in front of the viewer, so that a virtual image based on the image is displayed on the far side of the front windshield (outside the vehicle) as seen by the viewer.
  • The pitching angle of the vehicle with respect to the road surface changes depending on the condition of the road surface or the driving operation.
  • The position where the virtual image is displayed changes according to the change in the pitching angle of the vehicle. Specifically, the display position rotates about a predetermined axis in conjunction with the change in the pitching angle, so that the virtual image appears to the viewer to rotate about a horizontal axis. This phenomenon is described with reference to FIGS. 10 to 12.
  • FIGS. 10, 11 and 12 are diagrams for explaining the difference in the appearance of the virtual image 510 when the pitching angle 1p of the vehicle with respect to the road surface 502 is different.
  • The virtual image 510r shown in FIG. 10B, the virtual image 510u shown in FIG. 11B, and the virtual image 510d shown in FIG. 12B are drawn with different shapes, but all correspond to the same virtual image 510, which is viewed as rectangular during normal travel (when the pitching angle 1p of the vehicle with respect to the road surface 502 is zero); the figures show how its apparent shape differs with the pitching angle 1p of the vehicle. In FIGS. 11B and 12B, the change in the shape of the virtual image 510 is exaggerated.
  • FIG. 10 shows the state during normal travel, in which the pitching angle 1p of the vehicle with respect to the road surface 502 is zero.
  • The virtual image 510r displayed by the head-up display is displayed so that the angle 511r formed between the virtual image 510r and the road surface 502 is approximately 90 degrees, as shown in FIG. 10A.
  • The virtual image 510r shown in FIG. 10B has a rectangular shape; to make the shape changes in FIGS. 11B and 12B easy to see, straight grid lines parallel to the horizontal direction (X-axis direction) and to the vertical direction (Y-axis direction) are drawn on it. Since the virtual image 510r viewed during normal travel is rectangular, its lower end width (also called the reference lower end width) 512r and its upper end width (also called the reference upper end width) 513r are equal.
  • FIG. 11 shows a pitch-up state in which the pitching angle 1p of the vehicle is inclined upward with respect to the road surface 502.
  • The angle 511u formed between the virtual image 510u displayed by the head-up display and the road surface 502 is larger than 90 degrees, as shown in FIG. 11A.
  • The virtual image 510u shown in FIG. 11B is therefore viewed by the viewer as a trapezoid with a shorter lower side.
  • Specifically, the virtual image 510u viewed at pitch-up, shown in FIG. 11B, appears with its lower end width 512u shorter than its upper end width 513u, and its vertical width 514u appears shorter than the vertical width 514r of the virtual image 510r viewed during normal travel shown in FIG. 10B.
  • FIG. 12 shows a pitch-down state in which the pitching angle 1p of the vehicle is inclined downward with respect to the road surface 502.
  • The angle 511d formed between the virtual image 510d displayed by the head-up display and the road surface 502 is smaller than 90 degrees, as shown in FIG. 12A.
  • The virtual image 510d shown in FIG. 12B is therefore viewed by the viewer as a trapezoid with a longer lower side.
  • Specifically, the virtual image 510d viewed at pitch-down, shown in FIG. 12B, appears with its lower end width 512d longer than its upper end width 513d, and its vertical width 514d appears shorter than the vertical width 514r of the virtual image 510r viewed during normal travel shown in FIG. 10B.
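The differing viewed widths follow from simple perspective: the edge of the tilted virtual image that sits closer to the eye subtends a larger angle. The following is a minimal pinhole-camera sketch of this geometry; the dimensions, viewing distance, and sign convention are illustrative assumptions, not values taken from this document.

```python
import math

def apparent_widths(width, height, distance, tilt_deg):
    """Apparent (angular) widths of the top and bottom edges of a flat
    virtual-image rectangle seen from an eye point at the origin.

    The rectangle is centred `distance` ahead of the eye and rotated by
    `tilt_deg` about a horizontal axis through its centre; by the assumed
    sign convention, a positive tilt brings the top edge closer to the
    viewer (the pitch-up case above).  Pinhole model: the apparent width
    of an edge is proportional to width / edge distance.
    """
    t = math.radians(tilt_deg)
    half = height / 2.0
    z_top = distance - half * math.sin(t)      # top edge closer when t > 0
    z_bottom = distance + half * math.sin(t)   # bottom edge farther when t > 0
    return width / z_top, width / z_bottom     # (upper width, lower width)

# Normal travel (tilt 0): upper and lower apparent widths are equal,
# so the image is viewed as a rectangle, as in FIG. 10B.
up0, low0 = apparent_widths(1.0, 0.6, 7.0, 0.0)
assert abs(up0 - low0) < 1e-12

# Pitch-up (angle with the road larger than 90 degrees): the lower width
# is viewed shorter than the upper width, as in FIG. 11B.
up_u, low_u = apparent_widths(1.0, 0.6, 7.0, 3.0)
assert low_u < up_u

# Pitch-down: the lower width is viewed longer, as in FIG. 12B.
up_d, low_d = apparent_widths(1.0, 0.6, 7.0, -3.0)
assert low_d > up_d
```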
  • As described above, there has been a problem that the visibility of the virtual image 510 may be reduced because the virtual image 510 is viewed with trapezoidal distortion that depends on the pitching angle 1p of the vehicle.
  • One object of the present invention is to provide a head-up display capable of improving the visibility of a virtual image.
  • A head-up display according to the present invention is mounted on a vehicle and projects first display light (210) onto a projection member that transmits and reflects light, thereby displaying a first virtual image (311), superimposed on the real scene outside the vehicle, in a first virtual image displayable area (310); a first image displayable area (13) corresponds to the first virtual image displayable area.
  • Since the display control unit 70 corrects the first image based on the vehicle attitude information G, the first virtual image, which would otherwise be viewed with trapezoidal distortion due to a change in the attitude of the vehicle, can be viewed by the viewer with the trapezoidal distortion reduced or cancelled.
  • The visibility of the virtual image can thus be improved.
  • FIG. 3 is a diagram showing an example in which a first image is displayed on the first screen shown in FIG. 2 when the vehicle is in a pitch-down state.
  • (a) is a diagram showing the arrangement of the first virtual image displayable area when the vehicle is substantially parallel to the road surface, and (b) is a diagram showing how the real scene and the virtual image are viewed from the driver's seat.
  • (a) is a diagram showing the arrangement of the first virtual image displayable area when the vehicle is in a pitch-up state, and (b) is a diagram showing how the real scene and the virtual image are viewed from the driver's seat.
  • (a) is a diagram showing the arrangement of the first virtual image displayable area when the vehicle is in a pitch-down state, and (b) is a diagram showing how the real scene and the virtual image are viewed from the driver's seat.
  • First, the first virtual image displayable area 310 and the second virtual image displayable area 320 virtually generated by the head-up display (hereinafter, HUD) 100 of the present invention will be described.
  • In the following description, the left-right direction facing the front of the vehicle 1 is defined as the X-axis (the left direction is the positive X-axis direction), the vertical direction is defined as the Y-axis (vertically upward is the positive Y-axis direction), and the front-rear direction is defined as the Z-axis (the forward direction is the positive Z-axis direction).
  • The HUD 100 is housed in, for example, the dashboard 3 of a vehicle (an application example of a moving body) 1.
  • The HUD 100 projects display light 200 (first display light 210 and second display light 220) representing vehicle information and the like onto part of the front windshield (an example of a projection member) 2 of the vehicle 1.
  • The front windshield 2 generates a predetermined eyebox (not shown) by reflecting the first display light 210 and the second display light 220 toward the viewer 4.
  • By placing the viewpoint in the eyebox, the viewer 4 can view the first virtual image 311 and the second virtual image 321 on the first virtual image displayable area 310 and the second virtual image displayable area 320 virtually generated by the HUD 100 in front of the front windshield 2.
  • In the description of the present embodiment, the eyebox is defined as the viewing area from which the entire first virtual image 311 generated by the first display light 210 and the entire second virtual image 321 generated by the second display light 220 can be viewed.
  • The first virtual image displayable area 310 shown in FIG. 1 is virtually arranged so that the angle it forms with the road surface 5L, about the horizontal direction as viewed from the viewer 4, is approximately 90 degrees.
  • The first virtual image displayable area 310 is virtually arranged at a position 5 [m] to 10 [m] away from the eyebox in the forward direction (the traveling direction of the vehicle 1).
  • The first virtual image displayable area 310 may be provided with an inclination of about ±20 degrees from the angle at which it forms approximately 90 degrees with the road surface 5L.
  • The second virtual image displayable area 320 shown in FIG. 1 is virtually arranged so that the angle it forms with the road surface 5L, about the horizontal direction as viewed from the viewer 4, is approximately 0 degrees.
  • In other words, the second virtual image displayable area 320 is virtually arranged substantially parallel to the road surface 5L.
  • The second virtual image displayable area 320 is provided so as to overlap the road surface from a position 10 [m] ahead of the eyebox to a position 100 [m] ahead.
  • The second virtual image displayable area 320 may be provided tilted from -10 degrees (the counterclockwise direction in FIG. 1) to +45 degrees (the clockwise direction in FIG. 1) relative to the angle parallel to the road surface 5L.
  • FIG. 2 is a diagram illustrating an example of a landscape, a first virtual image 311, and a second virtual image 321 that can be seen from the viewer 4 sitting in the driver's seat of the vehicle 1 including the HUD 100 illustrated in FIG. 1.
  • As viewed from the viewer 4, the HUD 100 generates a rectangular first virtual image displayable area 310, which is the maximum area in which the first virtual image 311 can be displayed, and a trapezoidal second virtual image displayable area 320, which is the maximum area in which the second virtual image 321 can be displayed and whose upper side, partially overlapping the first virtual image displayable area 310, is viewed shorter than its lower side.
  • The first virtual image displayable area 310 and the second virtual image displayable area 320 themselves are invisible, or difficult to view, for the viewer 4; the viewer 4 views the first virtual image element V1 of the first virtual image 311 displayed on the first virtual image displayable area 310 and the second virtual image element V2 of the second virtual image 321 displayed on the second virtual image displayable area 320.
  • The first virtual image 311 displayed in the first virtual image displayable area 310 includes a first virtual image element V1 that is neither associated with a specific target in the real scene nor adds information to such a target, and that shows vehicle information such as the vehicle speed and the distance to a branch point.
  • The second virtual image 321 displayed in the second virtual image displayable area 320 is displayed near, or superimposed on, a specific target in the real scene, and includes a second virtual image element V2 that emphasizes this specific target or adds information to it, such as an arrow image that adds route guidance information to the road surface 5L, or a linear image that emphasizes a white line (specific target) for a lane departure warning (LDW: Lane Departure Warning).
  • The specific target is, for example, a road surface, an obstacle present on or near the road surface, a traffic sign, a traffic signal, a building, or the like.
  • The first virtual image displayable area 310 includes the area in which the first virtual image 311 including the first virtual image element V1 is displayed, and a virtual image blank area 311a surrounding the first virtual image 311 in the vertical direction (Y-axis direction) and the horizontal direction (X-axis direction).
  • The first virtual image 311 has a rectangular shape when viewed from the viewer 4.
  • The HUD 100 of the present invention performs an image correction process, described later, so as to cancel the distortion of the first virtual image 311 caused by a change in the pitching angle 1p of the vehicle 1.
  • Next, various detection units for detecting the pitching angle 1p of the vehicle 1 will be described.
  • The detection units described below may be provided in the HUD 100, or may be connected to the vehicle 1 or the HUD 100 by wire or wirelessly, including detachably; a portable terminal having such detection units may also be connected by wire or wirelessly.
  • The vehicle 1 is equipped with a front information detection unit 6 that acquires real scene information F of the vehicle 1.
  • The real scene information F includes at least position information of a specific target in the real scene in front of the vehicle 1 that is emphasized, or to which information is added, by displaying the second virtual image 321, and is obtained from, for example, one or more cameras, an infrared sensor, or the like.
  • The vehicle attitude detection unit 7 includes, for example, a three-axis acceleration sensor (not shown); by analyzing the three-axis acceleration it detects, the unit estimates the pitching angle 1p (vehicle attitude) of the vehicle 1 with respect to the horizontal plane and outputs vehicle attitude information G including information on the pitching angle 1p of the vehicle 1 to the HUD 100 (display control unit 70).
  • The vehicle attitude detection unit 7 may instead be configured by a height sensor (not shown) arranged near the suspension of the vehicle 1.
  • In that case, the vehicle attitude detection unit 7 estimates the pitching angle 1p of the vehicle 1 by analyzing the height of the vehicle 1 above the ground detected by the height sensor, and outputs the vehicle attitude information G including information on the pitching angle 1p to the HUD 100 (display control unit 70).
  • The vehicle attitude detection unit 7 may also include an imaging camera (not shown) that images the outside of the vehicle 1 and an image analysis unit (not shown) that analyzes the captured image.
  • In that case, the vehicle attitude detection unit 7 estimates the pitching angle 1p (vehicle attitude) of the vehicle 1 from the temporal change of the scenery in the captured image. The method by which the vehicle attitude detection unit 7 determines the pitching angle 1p of the vehicle 1 is not limited to the methods described above; the pitching angle 1p may be determined using a known sensor or analysis method.
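As an illustration of the first of these methods, the pitch angle of a vehicle that is stationary or moving at constant velocity can be estimated from the gravity direction sensed by a three-axis acceleration sensor. The axis convention and the gravity-only assumption below are illustrative; this document does not specify the analysis actually performed by the vehicle attitude detection unit 7.

```python
import math

def pitch_from_accel(ax, ay, az):
    """Estimate the pitch angle (degrees) of a stationary or steadily
    moving vehicle from a body-mounted three-axis accelerometer.

    Assumed convention: ax points forward, ay to the left, az up in the
    vehicle frame, and the only sensed specific force is gravity.  A
    nose-up pitch rotates part of the gravity reaction onto the forward
    axis, so the pitch angle can be recovered with atan2.
    """
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))

g = 9.81

# Level vehicle: gravity lies entirely on the z axis, so the pitch is 0.
assert abs(pitch_from_accel(0.0, 0.0, g)) < 1e-9

# Nose tilted up by 5 degrees: gravity gains a forward component.
theta = math.radians(5.0)
assert abs(pitch_from_accel(g * math.sin(theta), 0.0,
                            g * math.cos(theta)) - 5.0) < 1e-6
```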
  • FIG. 3 is a diagram showing an example of the configuration of the HUD 100 shown in FIG. 1. The HUD 100 includes, for example, a first image display unit 10, a second image display unit 20, a projection unit 30 (a reflection unit 31, a display synthesis unit 32, and a concave mirror 33), an angle adjustment unit (actuator) 40, and a display control unit 70.
  • The HUD 100 is generally housed in the dashboard of the vehicle 1, but all or part of the first image display unit 10, the second image display unit 20, the reflection unit 31, the actuator 40, the display synthesis unit 32, the concave mirror 33, and the display control unit 70 may be arranged outside the dashboard.
  • The HUD 100 (display control unit 70) is connected to a bus 8 including an in-vehicle LAN (Local Area Network) mounted on the vehicle 1, and can receive the above-described real scene information F and vehicle attitude information G from the bus 8.
  • The first image display unit 10 in FIG. 3 mainly includes, for example, a first projection unit 11 including a projector using a reflective display device such as a DMD or LCoS, and a first screen 12 that receives the projection light from the first projection unit 11, displays the first image 14, and emits the first display light 210 representing the first image 14 toward the reflection unit 31.
  • By displaying the first image 14 on the first screen 12 based on the image data D input from the display control unit 70 described later, the first image display unit 10 displays the first virtual image 311 on the first virtual image displayable area 310 virtually generated in front of the viewer 4.
  • FIG. 4 is a front view of the first screen 12 shown in FIG.
  • In FIG. 4, the left-right direction of the first screen 12 is defined as the dx axis (the left direction is the positive dx-axis direction), and the up-down direction is defined as the dy axis (the downward direction is the positive dy-axis direction).
  • The position in the X-axis direction of the first virtual image 311 that the viewer 4 shown in FIG. 2 views from the driver's seat of the vehicle 1 corresponds to the position in the dx-axis direction of the first image 14 displayed on the first screen 12 shown in FIG. 4.
  • Similarly, the position in the Y-axis direction of the first virtual image 311 viewed by the viewer 4 shown in FIG. 2 from the driver's seat of the vehicle 1 corresponds to the position in the dy-axis direction of the first image 14 displayed on the first screen 12 shown in FIG. 4.
  • The relationship between the XY coordinate axes of the real space described above and the dx-dy coordinate axes used in the description of the first screen 12 is determined by the arrangement of the optical members (the first image display unit 10, the second image display unit 20, the reflection unit 31, the actuator 40, the display synthesis unit 32, and the concave mirror 33) in the HUD 100, and is not limited to the above.
  • As shown in FIG. 4, the first screen 12 in FIG. 3 has an area 13 that is the maximum area in which the first image 14 can be displayed.
  • This area 13 is also referred to as the first image displayable area 13, and corresponds to the first virtual image displayable area 310 virtually generated by the HUD 100.
  • The first image displayable area 13 displays the first image 14 within the area according to the image data D; one or more first image elements M1 are arranged in part or all of the first image 14.
  • The display control unit 70, described later, adjusts the size and shape of the first image 14 in the first image displayable area 13 based on the image data D, and can thereby adjust the size and shape of the first virtual image 311 in the first virtual image displayable area 310.
  • The first image 14 shown in FIG. 4 has a rectangular shape corresponding to the first virtual image 311 in FIG. 2, with straight grid lines drawn parallel to the horizontal direction (dx-axis direction) and to the vertical direction (dy-axis direction). In practice, however, depending on the curved shape of the projection member (front windshield 2) onto which the first display light 210 is actually projected, the shape of the first image 14 displayed on the first screen 12 and the shape of the first virtual image 311 viewed by the viewer 4 are not equal. That is, taking the shape and grid lines of the first virtual image 311 in FIG. 2 as a reference, the shape of the first image 14 differs from a rectangle, and its grid lines are bent rather than straight, or not parallel to one another; the shape and grid lines of the first image 14 are transformed into the shape and grid lines of the first virtual image 311.
  • In the following description, the width of the lower end of the first image 14 is defined as the lower end width dx1, the width of the upper end of the first image 14 as the upper end width dx2, and the height of the first image 14 as the vertical width dy.
  • The first image display unit 10 of the present embodiment normally displays the first image 14 including one or more first image elements M1 based on the image data D; when the pitching angle 1p of the vehicle 1 changes, it executes an image correction process, described later, that widens or narrows part of the first image 14 so that a first virtual image 311 close to its normal shape and size can be viewed.
  • The second image display unit 20 in FIG. 3 has the same configuration as the first image display unit 10 described above, and mainly includes a second projection unit 21 and a second screen 22 that receives the projection light from the second projection unit 21, displays the second image 24, and emits the second display light 220 representing the second image 24 toward the reflection unit 31.
  • Like the first screen 12 shown in FIG. 4, the second screen 22 in FIG. 3 has a maximum area in which the second image 24 can be displayed; this area corresponds to the second virtual image displayable area 320 and is also called the second image displayable area 23.
  • The second image displayable area 23 displays the second image 24 within the area according to the image data D; in the second image 24, one or more second image elements (not shown) are arranged in part or in whole.
  • The reflection unit 31 (projection unit 30) in FIG. 3 is formed by, for example, a flat plane mirror, is arranged at an incline on the optical path of the second display light 220 from the second image display unit 20 toward the viewer 4, and reflects the second display light 220 emitted from the second image display unit 20 toward the display synthesis unit 32.
  • The reflection unit 31 is provided with an actuator 40 that rotates it.
  • The reflection unit 31 may have a curved surface instead of a flat one.
  • The actuator 40 is, for example, a stepping motor or a DC motor; under the control of the display control unit 70 described later, it rotates the reflection unit 31 based on the vehicle attitude information G detected by the vehicle attitude detection unit 7, thereby adjusting the angle and position of the second virtual image displayable area 320.
  • The display synthesis unit 32 (projection unit 30) in FIG. 3 is constituted by, for example, a flat half mirror in which a semi-transmissive reflective layer such as a metal reflective film or a dielectric multilayer film is formed on one surface of a translucent substrate.
  • The display synthesis unit 32 reflects the second display light 220 from the second image display unit 20, which has been reflected by the reflection unit 31, toward the concave mirror 33, and transmits the first display light 210 from the first image display unit 10 to the concave mirror 33 side.
  • The transmittance of the display synthesis unit 32 is, for example, 50%, but it may be adjusted as appropriate to balance the luminance of the first virtual image 311 and the second virtual image 321.
  • The concave mirror 33 (projection unit 30) in FIG. 3 is arranged on the optical path of the first display light 210 and the second display light 220 from the first image display unit 10 and the second image display unit 20 toward the viewer 4, and reflects the first display light 210 and the second display light 220 emitted from these units toward the front windshield 2.
  • The first screen 12 of the first image display unit 10 is arranged so that the optical path length of the first display light 210 from the first screen 12 to the concave mirror 33 is shorter than the optical path length of the second display light 220 from the second screen 22 of the second image display unit 20 to the concave mirror 33; as a result, the first virtual image 311 generated by the first image display unit 10 is formed at a position nearer the eyebox than the second virtual image 321 generated by the second image display unit 20.
  • The concave mirror 33 typically has, for the first display light 210 and the second display light 220 generated by the first image display unit 10 and the second image display unit 20, a function of enlarging the display in cooperation with the front windshield 2, a function of correcting the distortion of the first virtual image 311 and the second virtual image 321 caused by the curved surface of the front windshield 2 so that the virtual images are viewed without distortion, and a function of forming the first virtual image 311 and the second virtual image 321 at positions a predetermined distance away from the viewer 4.
  • FIG. 5 shows a schematic configuration example of the display control unit 70.
  • The display control unit 70 includes, for example, a processing unit 71, a storage unit 72, and an interface 73.
  • The processing unit 71 is constituted by, for example, one or more CPUs and RAM, the storage unit 72 by, for example, ROM, and the interface 73 by an input/output communication interface connected to the bus 8.
  • The interface 73 can acquire the vehicle information, the above-described real scene information F, the vehicle attitude information G, and the like via the bus 8. The storage unit 72 can store data for generating the image data D based on the input vehicle information, real scene information F, and vehicle attitude information G, and data for generating the drive data T based on the input vehicle attitude information G; using these, the processing unit 71 can generate the image data D and the drive data T.
  • The interface 73 can acquire, for example, the vehicle attitude information G including information on the pitching angle 1p of the vehicle 1 from the vehicle attitude detection unit 7 via the bus 8, and thus also functions as a vehicle attitude information acquisition unit.
  • The display control unit 70 may be located inside the HUD 100, or part or all of its functions may be provided on the vehicle 1 side outside the HUD 100.
  • Next, an operation example of the HUD 100 of the present embodiment will be described.
  • FIG. 6 is a flowchart showing an example of the operation of the HUD 100 of the present embodiment.
  • The HUD 100 starts the processing described below, for example, when the vehicle 1 is activated, when power is supplied to the electronic devices of the vehicle 1, or when a predetermined time has elapsed after the activation of the vehicle 1 or the power supply to the electronic devices of the vehicle 1.
  • In step S1, the processing unit 71 acquires the vehicle attitude information G via the interface 73.
  • In step S2, the processing unit 71 determines whether the vehicle attitude information G acquired in step S1 is equal to or greater than a predetermined threshold value TH; specifically, it determines whether the pitching angle 1p of the vehicle 1 is equal to or greater than the predetermined threshold value TH. If the processing unit 71 determines based on the vehicle attitude information G that the pitching angle 1p of the vehicle 1 is large (YES in step S2), it proceeds to step S3; if it determines that the pitching angle 1p of the vehicle 1 is small (NO in step S2), it proceeds to step S4.
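The branch performed in step S2 can be sketched as follows. The use of the absolute value and the concrete threshold value are assumptions for illustration; the text only states that the pitching angle 1p is compared against a predetermined threshold TH.

```python
def next_step(pitching_angle_deg, threshold_deg):
    """Step S2 of the flowchart in FIG. 6, as a sketch: compare the
    pitching angle 1p against the predetermined threshold TH and branch
    to step S3 (correct the image) or step S4.  Taking the absolute
    value treats pitch-up and pitch-down symmetrically, which is an
    assumption not stated in the text.
    """
    return "S3" if abs(pitching_angle_deg) >= threshold_deg else "S4"

# A large pitching angle triggers the trapezoidal correction of step S3.
assert next_step(3.0, 1.0) == "S3"
assert next_step(-2.0, 1.0) == "S3"
# A small pitching angle skips the correction.
assert next_step(0.2, 1.0) == "S4"
```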
  • In step S3, the processing unit 71 applies trapezoidal (keystone) correction to the first virtual image 311 so as to suppress or cancel the trapezoidal distortion of the first virtual image 311 caused by the change in the pitching angle 1p of the vehicle 1.
  • Specifically, the processing unit 71 trapezoidally corrects the first virtual image 311 so that, even after the pitching angle 1p changes, the first virtual image 311 is visually recognized with the shape and size seen during normal traveling, in which the pitching angle 1p of the vehicle 1 is substantially zero.
  • the processing unit 71 corrects the lower end horizontal width dx1, the upper end horizontal width dx2, and the vertical width dy of the first image 14 displayed on the first screen 12 according to the pitching angle 1p of the vehicle 1.
  • The processing unit 71 corrects the lower-end width dx1 and/or the upper-end width dx2 of the first image 14 to reduce the difference between the upper side and the lower side of the trapezoidally distorted first virtual image 311, so that the first virtual image 311 can be approximated to a rectangular shape.
  • FIG. 7A shows a first image 14 displayed on the first screen 12 during normal traveling in which the pitching angle 1p of the vehicle 1 with respect to the road surface 5L is substantially zero.
  • the first image 14 is also referred to as a reference image 14r.
  • FIG. 7B shows a first image 14u when the vehicle 1 is in a pitch-up state with respect to the road surface 5L.
  • In this case, the processing unit 71 makes the ratio of the upper-end width dx2u to the lower-end width dx1u (dx2u / dx1u) smaller than the ratio of the reference upper-end width dx2r to the reference lower-end width dx1r during normal travel (dx2r / dx1r). As a result, the difference (DX2 > DX1) between the lower-end lateral width DX1 and the upper-end lateral width DX2 caused by the pitch-up can be reduced or eliminated.
  • FIG. 7C shows a first image 14d when the vehicle 1 is in a pitch-down state with respect to the road surface 5L.
  • In this case, the processing unit 71 makes the ratio of the upper-end width dx2d to the lower-end width dx1d (dx2d / dx1d) larger than the ratio of the reference upper-end width dx2r to the reference lower-end width dx1r during normal travel (dx2r / dx1r). Thereby, the difference (DX1 > DX2) between the lower-end lateral width DX1 and the upper-end lateral width DX2 caused by the pitch-down can be reduced or eliminated.
  • The processing unit 71 sets the vertical width dy (dyu, dyd) longer than the reference vertical width dyr during normal traveling as the absolute value of the pitching angle 1p of the vehicle 1 with respect to the road surface 5L increases. Thereby, the reduction of the vertical width DY of the first virtual image 311 caused by the change in the pitching angle 1p of the vehicle 1 can be reduced or eliminated.
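The direction of these width adjustments can be sketched as below. The linear gains `k_w` and `k_h` are purely illustrative assumptions; the description states only which ratio is made larger or smaller and that dy is lengthened with the pitch magnitude:

```python
# Hedged sketch of the dx1/dx2/dy corrections described above.
def correct_widths(dx1r, dx2r, dyr, pitch_deg, k_w=0.02, k_h=0.01):
    """Return corrected (dx1, dx2, dy) for the first image 14.

    pitch_deg > 0 models pitch-up: the ratio dx2/dx1 is made smaller than
    the reference ratio dx2r/dx1r; pitch_deg < 0 (pitch-down) makes it
    larger. dy is lengthened as |pitch_deg| grows.
    """
    ratio = (dx2r / dx1r) * (1.0 - k_w * pitch_deg)
    dx1 = dx1r                        # keep the lower edge as the reference
    dx2 = dx1 * ratio
    dy = dyr * (1.0 + k_h * abs(pitch_deg))
    return dx1, dx2, dy

dx1, dx2, dy = correct_widths(100.0, 100.0, 50.0, pitch_deg=5.0)
assert dx2 < dx1          # pitch-up: upper width reduced relative to lower
assert dy > 50.0          # vertical width lengthened
```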
  • Further, a magnification may be set for each region of the first image 14. That is, when the processing unit 71 keystone-corrects the first image 14, it may adjust the horizontal interval Gx1 and the vertical interval Gy1 of the grid (region G1) corresponding to the vertically lower part of the first virtual image 311 separately from those of the upper grid.
  • When the vehicle 1 is in the pitch-up state, the processing unit 71 enlarges the lower grid G1u of the first image 14 at a larger magnification than the upper grid G2u, as shown in FIG. 7B. Therefore, when the vehicle 1 is in the pitch-up state, the horizontal interval Gx1u and the vertical interval Gy1u of the lower grid G1u of the first image 14 are larger than the horizontal interval Gx2u and the vertical interval Gy2u of the upper grid G2u.
  • Conversely, when the vehicle 1 is in the pitch-down state, the processing unit 71 enlarges the lower grid G1d of the first image 14 at a smaller magnification than the upper grid G2d, as shown in FIG. 7C. Therefore, when the vehicle 1 is in the pitch-down state, the horizontal interval Gx1d and the vertical interval Gy1d of the lower grid G1d of the first image 14 are smaller than the horizontal interval Gx2d and the vertical interval Gy2d of the upper grid G2d.
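The region-wise magnification can be sketched as a per-row scale factor. The linear interpolation between bottom and top magnifications, and the gain `k`, are assumptions; the description states only that the lower grid is scaled more than the upper grid on pitch-up and less on pitch-down:

```python
# Sketch of region-wise magnification for the grid of the first image 14.
def row_magnification(v: float, pitch_deg: float, k: float = 0.03) -> float:
    """Magnification for a row at normalized height v (0 = bottom, 1 = top)."""
    m_bottom = 1.0 + k * pitch_deg   # pitch-up (> 0) enlarges the bottom rows
    m_top = 1.0 - k * pitch_deg      # ... and shrinks the top, and vice versa
    return m_bottom + (m_top - m_bottom) * v

# Pitch-up: lower grid spacing ends up larger than upper grid spacing.
assert row_magnification(0.0, 5.0) > row_magnification(1.0, 5.0)
# Pitch-down: the relation reverses.
assert row_magnification(0.0, -5.0) < row_magnification(1.0, -5.0)
```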
  • In step S4, the processing unit 71 corrects the vertical width DY of the first virtual image 311 so as to suppress or cancel the reduction in the vertical width DY caused by the change in the pitching angle 1p of the vehicle 1. In other words, the processing unit 71 corrects the vertical width DY so that, even after the pitching angle 1p changes, the first virtual image 311 is visually recognized with the vertical width DY seen when the pitching angle 1p of the vehicle 1 is substantially zero.
  • the processing unit 71 sets the vertical width dy (dyu, dyd) to be longer than the reference vertical width dyr during normal traveling as the absolute value of the pitching angle 1p of the vehicle 1 with respect to the road surface 5L increases.
  • the above is the image correction processing executed by the display control unit 70 of the present embodiment.
  • The angle of the second virtual image displayable area 320 with respect to the road surface 5L changes according to the pitching angle 1p of the vehicle 1.
  • Therefore, the processing unit 71 of the present embodiment adjusts the angle of the second virtual image displayable area 320 according to the acquired vehicle attitude information G, thereby reducing or canceling unintended changes in the angle of the second virtual image displayable area 320 with respect to the road surface 5L. The method is described below.
  • the display control unit 70 determines drive data T including the drive amount of the actuator 40 corresponding to the acquired vehicle attitude information G. Specifically, the display control unit 70 reads table data stored in advance in the storage unit 72 and determines drive data T corresponding to the acquired vehicle attitude information G. Note that the display control unit 70 may obtain the drive data T from the vehicle attitude information G by calculation using a preset calculation formula.
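The table lookup for the drive data T could look like the following sketch. The table values and the linear interpolation between entries are assumptions; the description says only that a pre-stored table (or, alternatively, a calculation formula) maps the vehicle attitude information G to a drive amount for the actuator 40:

```python
import bisect

# Illustrative lookup table: pitching angle 1p [deg] -> drive amount [steps].
PITCH_TABLE = [-6.0, -3.0, 0.0, 3.0, 6.0]
DRIVE_TABLE = [120, 60, 0, -60, -120]

def drive_amount(pitch_deg: float) -> float:
    """Linearly interpolate the actuator drive amount for a pitch angle."""
    p = min(max(pitch_deg, PITCH_TABLE[0]), PITCH_TABLE[-1])  # clamp to table
    i = max(1, min(bisect.bisect_left(PITCH_TABLE, p), len(PITCH_TABLE) - 1))
    p0, p1 = PITCH_TABLE[i - 1], PITCH_TABLE[i]
    d0, d1 = DRIVE_TABLE[i - 1], DRIVE_TABLE[i]
    return d0 + (d1 - d0) * (p - p0) / (p1 - p0)

assert drive_amount(0.0) == 0
assert drive_amount(1.5) == -30.0
```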
  • the display control unit 70 drives the actuator 40 based on the determined drive data T.
  • the display control unit 70 drives the actuator 40 to rotate the reflecting unit 31 located on the optical path of the second display light 220 emitted from the second image display unit 20.
  • Thereby, the relative angle 330 of the second virtual image displayable area 320 with respect to the first virtual image displayable area 310 changes.
  • For example, the display control unit 70 may control the actuator 40 to rotate the reflecting unit 31 so that the second virtual image displayable area 320 remains parallel to the road surface 5L even when the vehicle posture of the vehicle 1 changes.
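Under the usual mirror geometry, rotating a mirror by an angle theta deflects the reflected beam by 2·theta, so keeping the second virtual image displayable area 320 parallel to the road surface 5L amounts to counter-rotating the reflecting unit 31 by roughly half the pitch change. This idealized model is our assumption and is not stated in the description:

```python
# Simplified geometric sketch: a mirror rotation of theta deflects the
# reflected ray by 2*theta, so counter-rotating the reflecting unit 31 by
# half the vehicle pitch can, under this idealized model, keep the second
# virtual image displayable area 320 parallel to the road surface 5L.
def mirror_correction(pitch_deg: float) -> float:
    """Mirror rotation (deg) that cancels a vehicle pitch of pitch_deg."""
    return -pitch_deg / 2.0

def plane_angle_to_road(pitch_deg: float, mirror_rot_deg: float) -> float:
    """Angle of area 320 to the road under the doubled-reflection model."""
    return pitch_deg + 2.0 * mirror_rot_deg

assert plane_angle_to_road(4.0, mirror_correction(4.0)) == 0.0
assert plane_angle_to_road(-2.5, mirror_correction(-2.5)) == 0.0
```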
  • FIGS. 8A, 8B, and 8C are diagrams showing how the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 generated by the HUD 100 of the present embodiment changes based on the pitching angle 1p of the vehicle 1.
  • FIG. 8A is a diagram illustrating the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 when the pitching angle 1p of the vehicle 1 is zero, that is, when the vehicle 1 is parallel to the road surface 5L.
  • In this case, the second virtual image displayable area 320 is adjusted to be substantially parallel to the road surface 5L, and the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 is an angle 330r of approximately 90 degrees.
  • an angle 330r formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 illustrated in FIG. 8A is also referred to as a reference angle 330r below.
  • FIG. 8B is a diagram illustrating an angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 when the vehicle 1 is pitched up.
  • Even when the vehicle 1 is pitched up, the second virtual image displayable area 320 is adjusted to be substantially parallel to the road surface 5L as shown in FIG. 8B, for example, so that the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 differs from the reference angle 330r.
  • FIG. 8C is a diagram illustrating an angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 when the vehicle 1 is in the pitch down state.
  • Even when the vehicle 1 is pitched down, the second virtual image displayable area 320 is adjusted to be substantially parallel to the road surface 5L as shown in FIG. 8C, for example, so that the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 differs from the reference angle 330r.
  • Since the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 changes based on the vehicle posture of the vehicle 1, the viewer 4 can recognize whether a visible virtual image is displayed in the first virtual image displayable area 310 or in the second virtual image displayable area 320, and can therefore perceive the first virtual image 311 and the second virtual image 321 displayed in the respective areas more three-dimensionally.
  • the second virtual image displayable area 320 is generated by being inclined in the horizontal direction from the first virtual image displayable area 310, and the angle with respect to the real scene is adjusted by driving the actuator 40.
  • For a given angle change of the virtual image display surface, adjusting the angle of the horizontally tilted virtual image display surface (second virtual image displayable area 320) with respect to the real scene gives the viewer 4 a stronger impression than adjusting the angle of the vertically tilted virtual image display surface (first virtual image displayable area 310).
  • Therefore, by adjusting the angle of the horizontally tilted virtual image display surface (second virtual image displayable area 320), it becomes easy to distinguish the second virtual image 321 displayed in the second virtual image displayable area 320 from the first virtual image 311 displayed in the first virtual image displayable area 310, and the first virtual image 311 and the second virtual image 321 displayed in the respective areas can be perceived more three-dimensionally.
  • As described above, the HUD 100 of the present embodiment is mounted on the vehicle 1 and projects the first display light 210 onto the front windshield 2, thereby displaying the first virtual image 311 in the first virtual image displayable area 310 superimposed on the real scene outside the vehicle 1. This head-up display includes: the first image display unit 10, which has the first image displayable area 13 corresponding to the first virtual image displayable area 310 and emits the first display light 210 from the first image displayable area 13; the projection unit 30, which directs the first display light 210 emitted from the first image display unit 10 toward the front windshield 2; and the display control unit 70, which acquires the vehicle attitude information G of the vehicle 1 and corrects the first image 14 based on the vehicle attitude information G, thereby suppressing or canceling the trapezoidal distortion of the first virtual image 311 visually recognized by the viewer 4.
  • The display control unit 70 may adjust the size of the first image 14 so that the aspect ratio of the first virtual image 311 after the keystone correction is the same as the aspect ratio of the first virtual image 311 before the keystone correction. By keeping the aspect ratio constant before and after the trapezoidal correction, the display of the first virtual image 311 can be continued without the correction giving the display an unnatural appearance.
  • When the change in the vehicle attitude of the vehicle 1 is equal to or less than the predetermined threshold value TH, the display control unit 70 may enlarge only the vertical width DY of the first virtual image 311 as viewed from the viewer 4. Thereby, a part of the processing load of the display control unit 70 can be reduced.
  • In the above-described embodiment, the actuator 40 rotates the reflecting unit 31 positioned on the optical path of the first display light 210 up to the display combining unit 32, which directs the first display light 210 and the second display light 220 in the same direction, thereby changing the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320; however, the actuator 40 may instead rotate the display combining unit 32.
  • Alternatively, the HUD 100 of the present invention may change the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 by rotating the image display surface (first screen 12) of the first image display unit 10 with the actuator 40.
  • The rotation axis AX of the actuator 40 need not pass through the center of the reflecting unit 31, the display combining unit 32, or the image display surface (first screen 12) of the first image display unit 10; a predetermined position may be used as the rotation axis AX, and the rotation axis AX may also be provided at a position separated from each optical member.
  • FIG. 9 shows an example in which the real scene and the first virtual image 311 and the second virtual image 321 displayed by the modified example of the HUD 100 shown in FIG. 2 are visually recognized when facing the front of the vehicle 1 from the driver's seat.
  • The HUD 100 of the present invention may display the first virtual image displayable area 310 generated by the first image display unit 10 and the second virtual image displayable area 320 generated by the second image display unit 20 so that they are visually recognized with a distance between them.
  • Specifically, the HUD 100 in this modified example may be configured by separating, on the display combining unit 32, the region on which the first display light 210 from the first image display unit 10 is incident and the region on which the second display light 220 from the second image display unit 20 is incident.
  • In the above-described embodiment, the first image display unit 10 that generates the first virtual image displayable area 310 and the second image display unit 20 that generates the second virtual image displayable area 320 are provided; however, a single image display unit may be used.
  • For example, the HUD 100 in this modified example may project projection light from a single projection unit (not shown) onto a plurality of screens (image display surfaces) (not shown) and rotate one of the screens with an actuator, thereby adjusting the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320.
  • In the above-described embodiment, the angle 330 formed by the first virtual image displayable area 310 and the second virtual image displayable area 320 is adjusted by adjusting the angle of the second virtual image displayable area 320 with respect to the real scene; however, the angle 330 may also be adjusted by adjusting the angle of the first virtual image displayable area 310 with respect to the real scene.
  • The first image display unit 10 may be a transmissive display panel such as a liquid crystal display element, a self-luminous display panel such as an organic EL element, or a scanning display device that performs scanning with laser light.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention can improve the visibility of a virtual image. A projection unit 30 projects first display light 210 of a first image 14 to be displayed by a first image display unit 10 toward a projection member in order to display a first virtual image 311 based on the first image 14, and vehicle attitude information G, including information on the vehicle attitude of a vehicle 1, is acquired from an interface 73. A display control unit 70 performs trapezoidal correction on the first image 14 based on the vehicle attitude information G so as to suppress or cancel trapezoidal distortion in the first virtual image 311 viewed by a viewer 4.
PCT/JP2017/043564 2016-12-09 2017-12-05 Dispositif d'affichage tête haute WO2018105585A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-239554 2016-12-09
JP2016239554 2016-12-09

Publications (1)

Publication Number Publication Date
WO2018105585A1 true WO2018105585A1 (fr) 2018-06-14

Family

ID=62492303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/043564 WO2018105585A1 (fr) 2016-12-09 2017-12-05 Dispositif d'affichage tête haute

Country Status (1)

Country Link
WO (1) WO2018105585A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009218A1 (fr) * 2018-07-05 2020-01-09 日本精機株式会社 Dispositif d'affichage tête haute
CN112313737A (zh) * 2018-06-22 2021-02-02 三菱电机株式会社 影像显示装置
CN113306565A (zh) * 2020-02-26 2021-08-27 丰田自动车株式会社 行驶辅助装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010156608A (ja) * 2008-12-26 2010-07-15 Toshiba Corp 車載用表示システム及び表示方法
JP2014026244A (ja) * 2012-07-30 2014-02-06 Jvc Kenwood Corp 表示装置
WO2016181749A1 (fr) * 2015-05-13 2016-11-17 日本精機株式会社 Affichage tête haute

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010156608A (ja) * 2008-12-26 2010-07-15 Toshiba Corp 車載用表示システム及び表示方法
JP2014026244A (ja) * 2012-07-30 2014-02-06 Jvc Kenwood Corp 表示装置
WO2016181749A1 (fr) * 2015-05-13 2016-11-17 日本精機株式会社 Affichage tête haute

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112313737A (zh) * 2018-06-22 2021-02-02 三菱电机株式会社 影像显示装置
WO2020009218A1 (fr) * 2018-07-05 2020-01-09 日本精機株式会社 Dispositif d'affichage tête haute
JPWO2020009218A1 (ja) * 2018-07-05 2021-08-12 日本精機株式会社 ヘッドアップディスプレイ装置
JP7375753B2 (ja) 2018-07-05 2023-11-08 日本精機株式会社 ヘッドアップディスプレイ装置
CN113306565A (zh) * 2020-02-26 2021-08-27 丰田自动车株式会社 行驶辅助装置
CN113306565B (zh) * 2020-02-26 2024-05-17 丰田自动车株式会社 行驶辅助装置

Similar Documents

Publication Publication Date Title
JP6458998B2 (ja) ヘッドアップディスプレイ
WO2018088362A1 (fr) Dispositif d'affichage tête haute
JP6731644B2 (ja) 表示位置補正装置、表示位置補正装置を備える表示装置、及び表示装置を備える移動体
CN110573369B (zh) 平视显示器装置及其显示控制方法
WO2016190135A1 (fr) Système d'affichage pour véhicule
JP6911868B2 (ja) ヘッドアップディスプレイ装置
WO2015041203A1 (fr) Dispositif d'affichage tête haute
US20220208148A1 (en) Image display system, image display method, movable object including the image display system, and non-transitory computer-readable medium
WO2017094427A1 (fr) Afficheur tête haute
JP2010070066A (ja) ヘッドアップディスプレイ
JP2009150947A (ja) 車両用ヘッドアップディスプレイ装置
WO2017090464A1 (fr) Affichage tête haute
JP2018077400A (ja) ヘッドアップディスプレイ
WO2018003650A1 (fr) Casque d'affichage
JP2018120135A (ja) ヘッドアップディスプレイ
JP7396404B2 (ja) 表示制御装置及び表示制御プログラム
WO2018105585A1 (fr) Dispositif d'affichage tête haute
JP6845988B2 (ja) ヘッドアップディスプレイ
JP6481445B2 (ja) ヘッドアップディスプレイ
JP7223283B2 (ja) 画像処理ユニット及びそれを備えるヘッドアップディスプレイ装置
JPWO2018199244A1 (ja) 表示システム
JP2016185768A (ja) 車両用表示システム
JP2019032362A (ja) ヘッドアップディスプレイ装置、ナビゲーション装置
JP6943079B2 (ja) 画像処理ユニット及びそれを備えるヘッドアップディスプレイ装置
JP7253719B2 (ja) 表示装置を備える車両

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17877709

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17877709

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP