CN114127614B - Head-up display device - Google Patents


Info

Publication number: CN114127614B
Application number: CN202080051108.7A
Authority: CN (China)
Prior art keywords: display, road, virtual image, hud, image
Other languages: Chinese (zh)
Other versions: CN114127614A (en)
Inventor: 舛屋勇希
Current Assignee: Nippon Seiki Co Ltd
Original Assignee: Nippon Seiki Co Ltd
Application filed by Nippon Seiki Co Ltd
Publication of CN114127614A (application published)
Publication of CN114127614B (application granted)
Legal status: Active

Classifications

    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/013 Head-up displays comprising a combiner of particular shape, e.g. curvature
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/0141 Head-up displays characterised by the informative content of the display
    • G02B2027/0185 Displaying image at variable distance
    • G02B2027/0192 Supplementary details
    • G02B2027/0196 Supplementary details having transparent supporting structure for display mounting, e.g. to a window or a windshield
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003 Control arrangements or circuits as in G09G3/001, to produce spatial visual effects
    • G09G3/03 Control arrangements or circuits specially adapted for displays having non-planar surfaces, e.g. curved displays
    • G09G3/035 Control arrangements or circuits for flexible display surfaces
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements characterised by the way in which colour is displayed
    • G09G5/36 Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Control arrangements with means for controlling the display position
    • G09G2380/10 Automotive applications
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information, e.g. vehicle dynamics information or attracting the attention of the driver
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B60K35/80 Arrangements for controlling instruments
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/16 Type of output information
    • B60K2360/177 Augmented reality
    • B60K2360/20 Optical features of instruments
    • B60K2360/33 Illumination features
    • B60K2360/347 Optical elements for superposition of display information
    • B60K2360/77 Instrument locations other than the dashboard
    • B60K2360/785 Instrument locations on or in relation to the windshield or windows

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention provides a HUD device capable of improving expressive power by using a road surface HUD and an on-road HUD in combination. A virtual image display surface (PS3) extends integrally from a proximal end portion (U1) on the side close to the vehicle (1) to a distal end portion (U3) on the side far from the vehicle, and is divided into: a road surface HUD region (Z1) that displays a virtual image (G1) superimposed on a road surface (40); and an on-road HUD region (Z2) that is located farther from the vehicle than the road surface HUD region (Z1) and displays a virtual image (G2') located above the road surface (40). A display control unit (300) divides the image display region (45) of a display surface (47) into a first display region (Z1') corresponding to the road surface HUD region (Z1) and a second display region (Z2') corresponding to the on-road HUD region (Z2), displays a first image (RG1; 61 and SP) superimposed on the road surface in the first display region, and displays a second image (RG2; 63 and 65) to be displayed above the road surface in the second display region.

Description

Head-up display device
Technical Field
The present invention relates to a head-up display (HUD) device or the like that displays a virtual image using a front windshield, a combiner, or the like of a vehicle.
Background
For example, a technique is known for displaying a navigation arrow or the like of the vehicle so that it appears attached to, in other words superimposed on, the road surface. Since the virtual image display surface (imaging surface) is arranged substantially along the road surface, the display can be given a sense of depth.
In this specification, such display is sometimes referred to as "road surface overlap display" or "road surface HUD display", or simply as "road surface HUD". Likewise, a case where a virtual image is displayed above the road surface, for example on a virtual image display surface (imaging surface) standing upright on the road surface, is sometimes referred to as "on-road HUD display" or simply as "on-road HUD".
In patent document 1, a display for displaying a main image, a projector for drawing a sub-image by projection, a curved screen, and the like are provided separately; for example, a road surface HUD display (a virtual image of the sub-image) is formed on a curved virtual image display surface, and a display standing upright on the road surface (a virtual image of the main image) is formed so as to connect to one end of the road surface HUD display.
Prior art literature
Patent literature
Patent document 1: japanese patent application laid-open No. 6232363 (for example, FIG. 1, FIG. 2, [ 0013 ], and [ 0029 ])
In patent document 1, it is necessary to provide a display for displaying a main image and a projector and a screen for displaying a sub image separately, which inevitably complicates the structure of a display optical system.
In patent document 1, it is described that the virtual images based on the main image and the sub image are three-dimensionally integrated, but there is no description about improving the expressive power of the HUD device by using the display superimposed on the road surface and the display standing on the road surface in combination.
Disclosure of Invention
The purpose of the present invention is to provide a HUD device capable of improving expressive power by using a road surface HUD and an on-road HUD in combination.
Other objects of the present invention will become apparent to those skilled in the art from a consideration of the following illustrative modes and best modes and figures.
In the following, the mode according to the present invention is exemplified for easy understanding of the outline of the present invention.
In a first aspect, a head-up display device is a head-up display (HUD) device that projects display light onto a reflective light-transmitting member provided in a vehicle and that generates and displays, superimposed on the real scene seen through the reflective light-transmitting member, a virtual image formed by the display light reflected by the reflective light-transmitting member,
the device comprises: an image display unit having a display surface for displaying an image; a display control unit that controls display of the virtual image; and an optical system including an optical member for projecting the display light onto the reflective light-transmitting member,
a virtual image display surface as an imaging surface for virtual image imaging integrally extends from a proximal end portion on a side close to the vehicle to a distal end portion on a side far from the vehicle, and a second distance between a road surface and the distal end portion is larger than a first distance between the road surface and the proximal end portion, and the virtual image display surface is a plane or a curved surface,
Further, the virtual image display surface is divided (distinguished) into:
a road surface HUD area that displays a virtual image superimposed on the road surface; and
an on-road HUD area which is located farther from the vehicle than the road surface HUD area and which displays a virtual image located above the road surface,
the display control unit divides an image display area of the display surface into: a first display area corresponding to the road surface HUD area; and a second display area corresponding to the above-mentioned on-road HUD area,
displaying a first image overlapping the road surface in the first display area,
and displaying a second image to be displayed on the upper side of the road surface in the second display area.
In this embodiment, a virtual image display surface is used that extends from the side closer to the host vehicle to the side farther from the host vehicle. The distal end portion is lifted farther from the road surface than the proximal end portion, so the virtual image display surface as a whole can be regarded as an inclined surface.
When such an inclined virtual image display surface is used, a display that overlaps the road surface (for example, a navigation arrow) can be formed in the region on the side closer to the host vehicle, and a display located above the road surface (for example, a sign or the like positioned above the road surface) can be formed in the region on the side farther from the host vehicle. Conventional HUD devices that form a display superimposed on the road surface have had difficulty displaying a sign or the like located above the road surface; in this respect the available displays, and hence the expressive power, have been limited. According to this aspect, that restriction is greatly reduced, and the road surface HUD and the on-road HUD can be used in combination to make a variety of displays.
Here, when a display superimposed on the road surface is formed on the near side of the inclined virtual image display surface (imaging surface), the virtual image display surface does not exactly coincide with the road surface. However, when a display attached to the road surface is formed, the human eye tends, from everyday experience and the like, to perceive the display as superimposed on the road surface. Therefore, if, for example, the angle that the inclined surface forms with the road surface is kept small to a certain degree, a road surface HUD display that causes the user little discomfort, in other words a display with a sense of depth, can be achieved to a certain degree.
On the other hand, if the inclined virtual image display surface floats up to some extent from the road surface, the user perceives that the display formed in the area where the float up is large is a display located on the upper side of the road surface (in other words, an on-road HUD display).
The display control unit, considering the above-described aspect, divides an image display area (in other words, an image displayable area) of a display surface (a display surface of a display device, a display surface of a screen, or the like) into: a first display area corresponding to the road surface HUD area; and a second display area corresponding to the on-road HUD area. Then, the following display control is performed: the first image overlapping the road surface is displayed in the first display area, and the second image to be displayed on the upper side of the road surface is displayed in the second display area.
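As a concrete illustration of this division, the following minimal sketch in Python routes display objects either to a first display region Z1' or to a second display region Z2' depending on whether they are to be superimposed on the road surface. The class names, field names, and example labels are illustrative assumptions and are not taken from the disclosure.

    # Illustrative sketch only; names and values are assumptions, not from the patent.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DisplayObject:
        name: str
        overlay_road: bool          # True -> road surface HUD, False -> on-road HUD

    @dataclass
    class Frame:
        first_region: List[DisplayObject] = field(default_factory=list)   # Z1'
        second_region: List[DisplayObject] = field(default_factory=list)  # Z2'

    def compose_frame(objects: List[DisplayObject]) -> Frame:
        """Split the image display area logically: road-superimposed objects go to
        the first display region Z1', objects shown above the road go to Z2'."""
        frame = Frame()
        for obj in objects:
            (frame.first_region if obj.overlay_road else frame.second_region).append(obj)
        return frame

    # Example corresponding to fig. 1 (E): arrow and speed to Z1', sign to Z2'.
    frame = compose_frame([
        DisplayObject("navigation arrow 61'", overlay_road=True),
        DisplayObject("speed display SP'", overlay_road=True),
        DisplayObject("speed limit display 63'", overlay_road=False),
    ])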
According to this aspect, by using the road surface HUD and the on-road HUD in combination on one continuous virtual image display surface, a plurality of kinds of display can be performed and the expressive power of the HUD device can be improved.
In a second mode depending on the first mode, it may be that,
the virtual image display surface integrally extending from the proximal end portion to the distal end portion is formed by disposing the display surface obliquely with respect to an optical axis of the optical system,
the shape of the plane or curved surface of the virtual image display surface is adjusted by adjusting the optical characteristics of the entire region or a part of the region in the optical system, adjusting the arrangement of the optical member and the display surface, adjusting the shape of the display surface, or a combination of these.
According to this aspect, the shape of the virtual image display surface can be adjusted to a plurality of types. Therefore, the degree of freedom in design of the HUD device is improved.
In a third mode depending on the first or second mode, it may be that,
the on-road HUD region in the virtual image display surface is set at such a distance that, when the user observes a virtual image disposed in the on-road HUD region, absolute distance perception is dulled and the sense of distance cannot be perceived accurately, whereby the virtual image display surface of the on-road HUD region serves as a pseudo-vertical virtual image display surface.
In this embodiment, attention is paid to the fact that when a person views an image at a sufficiently distant location, absolute distance perception is dulled, so the person may not accurately perceive the distance. By setting the on-road HUD region at such a distance, even if a virtual image imaged in the on-road HUD region actually has depth, a person cannot perceive (feel) that depth, and the image creates the illusion of standing upright at that distance. In other words, the virtual image display surface of the on-road HUD region can be regarded as a pseudo-vertical surface (as if it were a surface standing substantially perpendicular to the road surface, for example). Virtual images such as road signs, guidance displays, and advertisements standing above the road can thus be displayed without discomfort.
In a fourth aspect depending on any one of the first to third aspects, it may be that,
the distance from the viewpoint of the user or a reference point corresponding to the viewpoint to a point in real space corresponding to a boundary between the road surface and a space located above the road surface when viewed by the user, that is, a boundary position is set to 20m or more.
In this embodiment, a reference (threshold value) of a distance from which a person cannot perceive and feel the depth of a virtual image is set to 20m. The object located in a distance of about 20m is difficult to perceive and feel depth, and thus is based on 20m. Thus, the region having a virtual image display distance of 20m or more becomes the road HUD region.
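For reference, this rule can be sketched as follows; only the 20 m threshold comes from the text, while the function name and return labels are illustrative assumptions.

    # The 20 m value is taken from the text; names and labels are illustrative.
    ON_ROAD_THRESHOLD_M = 20.0

    def region_for_display_distance(distance_m: float) -> str:
        """Classify a virtual image display distance into the two regions."""
        if distance_m >= ON_ROAD_THRESHOLD_M:
            return "on-road HUD region (Z2)"
        return "road surface HUD region (Z1)"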
In a fifth aspect depending on any one of the first to fourth aspects, it may be that,
the device comprises a vanishing point detecting unit that detects the vanishing point of the road surface,
and, where the point at which a line segment connecting the viewpoint of the user (or a reference point corresponding to the viewpoint) and the detected vanishing point intersects the virtual image display surface is defined as the vanishing point on the virtual image display surface,
the road surface HUD area is the area closer to the vehicle and the on-road HUD area is the area farther from the vehicle, with the position of the vanishing point on the virtual image display surface as the boundary.
In this embodiment, the road surface HUD area and the on-road HUD area are divided with reference to the vanishing point on the virtual image display surface. The vanishing point detecting unit applies image processing to, for example, an image captured of the scene in front of the vehicle, calculates (approximately) the position where a plurality of white lines such as the side lines and the center line of the road intersect in the distance, and sets the calculated position as the vanishing point. When the road is curved, the vanishing point can be determined using, for example, an approximation curve.
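The disclosure only states that the vanishing point is obtained by image processing of a forward image; as one hypothetical realization, the sketch below fits straight lines to two already-detected lane markings and intersects them. The upstream white-line detection and the approximation-curve handling for curved roads are not shown.

    import numpy as np

    def fit_line(points):
        """Least-squares fit y = a*x + b to the (x, y) image points of one marking."""
        xs, ys = np.asarray(points, dtype=float).T
        a, b = np.polyfit(xs, ys, 1)
        return a, b

    def estimate_vanishing_point(left_marking, right_marking):
        """Return the image coordinates where the two fitted lines intersect (VP1)."""
        a1, b1 = fit_line(left_marking)
        a2, b2 = fit_line(right_marking)
        if abs(a1 - a2) < 1e-9:
            return None                      # lines (nearly) parallel in the image
        x = (b2 - b1) / (a1 - a2)
        return x, a1 * x + b1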
Further, a point at which a line segment connecting the viewpoint of the user or a reference point corresponding to the viewpoint and the detected vanishing point intersects the virtual image display surface is set as a vanishing point on the virtual image display surface.
Since the virtual image display surface is an inclined surface, it does not exactly coincide with the road surface of the road. However, as seen from the user's viewpoint, the region of the virtual image display surface on the near side (the side closer to the host vehicle) of the vanishing point on the virtual image display surface (the point corresponding to the detected actual vanishing point) does not appear to be lifted off the road surface, and can therefore be set as the road surface HUD region, that is, a region in which display superimposed on the road surface is possible.
On the other hand, in the region on the far side of the vanishing point on the virtual image display surface, the virtual image display surface is lifted considerably from the road surface, so when a virtual image is displayed in that region the user perceives it as being formed above the road surface. This region can be used as the on-road HUD region.
If the vanishing point on the virtual image display surface is set to a point whose virtual image display distance is 20 m or more, the region on the far side becomes the pseudo-vertical on-road HUD region described above. This allows an object standing above the road surface, such as a sign, a guide sign, or an advertisement, or information related to such an object, to be presented to the user without discomfort.
In a sixth aspect which is dependent on any one of the first to fifth aspects, it may be that,
The second image is another separate image independent of the first image.
In this embodiment, the first image corresponding to the road surface HUD area and the second image corresponding to the on-road HUD area are separate images independent of each other. Therefore, for example, different information (different kinds of information, etc.) can be displayed by the first and second images, and the diversity of the presented information can be ensured. The present invention does not exclude the case where the first and second images are linked with each other so that the two images together present a single piece of information; for example, the straight portion of a navigation arrow may of course be used as the first image and the tip portion of the arrow as the second image, the two being integrated to present guidance information for the vehicle. Such a means of information presentation is permissible in the present invention. However, as described above, by making the first and second images different, the information presented by each image becomes independent of the other, which has the advantage of increasing expressive power and ensuring the diversity of the presented information.
In a seventh aspect depending on any one of the first to sixth aspects, it may be that,
the first image is at least one of an image representing information on the state of the vehicle, a route guidance image, and a road guidance image,
the second image is at least one of an image of speed limitation information, an image of various marks provided on a road, a path guidance image on a road, an image of a sign provided on a road, and an image of an advertisement provided on a road.
In this embodiment, the first and second images are specifically exemplified. This allows a user to be presented with useful information without causing any discomfort.
In an eighth aspect depending on any one of the first to seventh aspects, it may be that,
the display control unit performs at least one of the following controls when adjusting the sense of distance of a first virtual image corresponding to the first image: controlling the virtual image display distance by changing the display position of the first image on the display surface; and performing form control based on at least one of the size, shape, pattern, color, presence or absence of shading, and stereoscopic drawing of a display object constituting the first image,
whereas for a second virtual image corresponding to the second image only the form control is performed, and the virtual image display distance is not controlled.
In this embodiment, when the sense of distance of the first virtual image displayed in the road surface HUD area is adjusted (in other words, when it is displayed with perspective), the virtual image display distance can be adjusted, form control based on changes in the size, shape, or the like of the display object can be performed, or both can be used together. Here, the virtual image display distance is adjusted by changing the display position on the inclined virtual image display surface to change the depth.
For the second virtual image displayed in the on-road HUD region, on the other hand, the contribution of the convergence angle to depth perception is small and absolute distance perception is dulled, as described above, so control of the virtual image display distance, which would serve no purpose, is not performed. Instead, form control of the size, shape, and the like of the display object is performed to adjust the sense of distance. In this way, the sense of distance of each of the first and second virtual images can be adjusted by an appropriate method.
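A schematic of this rule is sketched below; the inverse-distance scaling model and the row mapping (supplied by the caller) are assumptions for illustration and are not part of the disclosure.

    REFERENCE_DISTANCE_M = 10.0   # assumed reference for the scaling model

    def perceived_scale(distance_m: float) -> float:
        # Simple assumption: draw farther objects smaller, inversely to distance.
        return REFERENCE_DISTANCE_M / max(distance_m, 0.1)

    def adjust_sense_of_distance(item: dict, target_distance_m: float, row_for_distance):
        if item["region"] == "Z1":
            # First virtual image (road surface HUD): move the drawing position along
            # the inclined virtual image display surface (changes the display distance)
            # and/or apply form control.
            item["row_px"] = row_for_distance(target_distance_m)
            item["scale"] = perceived_scale(target_distance_m)
        else:
            # Second virtual image (on-road HUD): form control only; the virtual
            # image display distance is left unchanged.
            item["scale"] = perceived_scale(target_distance_m)
        return item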
In a ninth aspect depending on any one of the first to eighth aspects, it may be that,
when a time has elapsed since the second image was displayed at a distant point in time, the distance between the second image and the vehicle decreases as the vehicle advances forward, and the display control unit causes at least a part of the display content of the second image to be displayed as the first image so as to overlap the road surface.
In this aspect, for example, when a sign or the like (the second virtual image) that appeared far away approaches the host vehicle with the lapse of time, at least part of the content of the sign is displayed attached to the road surface, making the transition (switch) from the on-road HUD display to the road surface HUD display (the first virtual image) clear. After that, for example, a new vehicle guidance display may be shown so as to connect to that road surface HUD display, or display control may be performed so that the road surface HUD display immediately after the switch flows toward the rear as it comes closer to the host vehicle over time.
In other words, when a sign or the like that appeared far away comes close and a stage is reached where it actually affects steering or other operations, part or all of the sign is displayed as a bridging road surface HUD image, and the switch from the on-road HUD to the road surface HUD is clearly presented to the user. The user can thereby intuitively understand that information on important signs and the like will from then on be presented mainly by the road surface HUD, so that subsequent guidance displays and the like can present information smoothly and without discomfort. The road surface HUD display at the time of this switch can thus also be regarded as a bridging display that connects to the guidance display and the like that follow. By providing such a bridging display, the transition from the on-road HUD display to the road surface HUD display can be made smooth.
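One hypothetical per-frame hand-off rule is sketched below; the 20 m switch point and the "bridging" flag are assumptions chosen for illustration only.

    HANDOFF_DISTANCE_M = 20.0   # assumed switch point; not specified in the text

    def place_sign_content(sign_content: str, distance_to_sign_m: float) -> dict:
        """Decide where a sign's content is drawn in the current frame."""
        if distance_to_sign_m >= HANDOFF_DISTANCE_M:
            # Still far away: show it as the second image in the on-road HUD region.
            return {"region": "Z2", "content": sign_content}
        # Close enough to matter for driving: re-draw (part of) the same content as
        # a first image attached to the road surface, marked as a bridging display.
        return {"region": "Z1", "content": sign_content, "bridging": True}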
In a tenth aspect depending from any one of the first to ninth aspects, it may be that,
at least a part of the virtual image display surface corresponding to the road surface HUD area is located below the road surface.
In this aspect, at least part of the virtual image display surface of the road surface HUD area is allowed to be positioned lower than the road surface. In that case, the first virtual image on the virtual image display surface is imaged below the road surface. However, a user observing the virtual image knows that a display superimposed on the road surface cannot actually be located under the road surface, and therefore perceives, based on common experience, that the first virtual image is attached to the road surface.
Thus, the first virtual image does not appear to float above the road surface but appears to adhere closely to it, so the accuracy of road surface superimposition of the virtual image can be improved. According to this aspect, by accurately superimposing the first virtual image on the road surface and combining it with the second virtual image, a highly expressive display with a sense of realism is achieved.
Those skilled in the art will readily appreciate that the exemplified aspects according to the present invention can be further modified without departing from the spirit of the present invention.
Drawings
Fig. 1 (A) is a diagram for explaining a road surface HUD display, figs. 1 (B) to (D) are diagrams showing changes in the display mode when the virtual image display surface is an inclined surface, and fig. 1 (E) is a diagram showing an example of image display on the display surface of the display unit (corresponding to the example of fig. 1 (D)).
Fig. 2 (A) is a diagram showing an example of virtual images (including a road surface HUD display and an on-road HUD display) visually recognized by a user through the windshield, and fig. 2 (B) is a diagram showing the virtual image display on the virtual image display surface, which is an inclined surface.
Fig. 3 (A) is a diagram showing an example in which the virtual image display surface of the on-road HUD region is treated as a pseudo-vertical surface in the example of fig. 2 (A), and fig. 3 (B) is a diagram showing the virtual image display on the virtual image display surface, which is an inclined surface.
Fig. 4 (A) and (B) are diagrams for explaining a method of adjusting the sense of distance when a display with depth is performed.
Fig. 5 (A) and (B) are views showing display examples when the display shifts from the on-road HUD display to the road surface HUD display.
Fig. 6 (a) to (D) are diagrams showing examples of the shape of the virtual image display surface and examples of the positional relationship between the virtual image display surface and the road surface position.
Fig. 7 is a diagram showing an example of virtual image display using a virtual image display surface by the HUD device.
Fig. 8 is a diagram showing a specific example of an optical system in the head-up display device.
Fig. 9 is a diagram for explaining an example of a curved shape and a focal point of a concave mirror (a magnifying mirror having a curved reflecting surface).
Fig. 10 is a diagram showing an example of a system configuration of the head-up display device.
Fig. 11 is a diagram for explaining control of the virtual image display distance when the display shown in fig. 5 (a) and 5 (B) is performed.
Fig. 12 is a diagram showing an overall configuration example of the HUD device.
Fig. 13 is a flowchart showing an example of the procedure of the display process by the display control unit.
Symbol description
1. Vehicle (self-vehicle)
2. Reflective light-transmitting member (windshield, etc.)
3. Virtual image display region
7. Steering wheel
9. Operation part
10. Vehicle (self-vehicle)
11. Front panel
13. Display (display panel, etc)
17. Front camera
19. Driving scene determination unit
21. Image processing unit
22. User
32. Image output unit
33. Drive unit
34. Virtual image display distance control unit
35. Image generating unit
36. Road surface HUD region / on-road HUD region detecting unit
37. Virtual image display surface position adjusting part
38. Pitch angle and vanishing point calculating unit
39. Vanishing point detecting unit
40. Road surface
46. Image display unit (display unit: screen, etc.)
47. Display surface
52. Optical system
101 HUD device
130. Concave mirror
150. Light projecting part
300. Display control unit
400. Navigation part (navigation ECU)
500. Communication unit
502 GPS receiving unit
505. Various sensors
600. Driving assistance system
700. Vehicle-mounted ECU
Z1 Road surface HUD region
Z2 On-road HUD region
PS (PS3 to PS7) Virtual image display surface (inclined virtual image display surface)
G Virtual image
G1 First virtual image (for the road surface HUD region)
G2 Second virtual image (for the on-road HUD region)
U1 Proximal end portion of the virtual image display surface
U2 Center position of the virtual image display surface (viewing-angle center)
U3 Distal end portion of the virtual image display surface
VP1 Vanishing point
VP2 Vanishing point on the virtual image display surface
VP2' Point on the display surface corresponding to VP2
LN' Boundary position on the display surface between the road surface HUD and the on-road HUD
RG1 First image (corresponding to G1)
RG2 Second image (corresponding to G2)
Detailed Description
The following description of the preferred embodiments is provided for ease of understanding the present invention. Therefore, it should be noted by those skilled in the art that the present invention is not unduly limited by the embodiments described below.
Reference is made to fig. 1. Fig. 1 (a) is a diagram for explaining a road HUD display, and fig. 1 (B) to (D) are diagrams showing a change in display mode when a virtual image display surface is an inclined surface, and fig. 1 (E) is a diagram showing an example of image display on a display surface of a display unit (corresponding to the example of fig. 1 (D)).
In fig. 1 (A), a virtual image display surface PS1 as an imaging surface is provided so as to appear attached to the road surface 40 of the road on which the vehicle 1 travels, as seen from the viewpoint (eye point) A of a user (driver or the like) in the vehicle (host vehicle) 1. A virtual image G is displayed on the virtual image display surface PS1 and appears to the user to be superimposed on the road surface 40.
A HUD device of this display type is sometimes referred to as a "road surface HUD device", and a display mode in which the display appears attached to the road surface is sometimes referred to as a "road surface HUD display".
In fig. 1 (B), the virtual image display surface (imaging surface) PS2 is located at substantially the same position as the virtual image display surface PS1 of fig. 1 (A), but is provided obliquely at a predetermined angle θ1 with respect to the road surface 40. When such an inclined virtual image display surface PS2 is used, the virtual image display distance, that is, the distance from the user's viewpoint A (or a predetermined reference point corresponding to the viewpoint A) to the virtual image, can be adjusted appropriately by adjusting the position of the virtual image G. The prior art is limited to figs. 1 (A) and (B).
In fig. 1 (C), the virtual image display surface PS3 is an inclined surface as in fig. 1 (B), but in the example of fig. 1 (C), it extends over a considerable range from a position close to the vehicle 1 to a position far from the vehicle 1 on the road surface. The inclination angle of the virtual image display surface PS3 with respect to the road surface 40 is a predetermined angle θ2.
As shown in the drawing, the virtual image display surface PS3 as an imaging surface for virtual image formation extends integrally from a proximal end portion U1 on the side close to the vehicle 1 to a distal end portion U3 on the side far from the vehicle 1, a second distance h1 between the road surface 40 and the distal end portion U3 is larger than a first distance between the road surface 40 and the proximal end portion U1 (in fig. 1 (C), U1 is on the road surface 40, so that distance is zero), and the virtual image display surface is a plane (or a curved surface; the curved-surface case will be described later). The symbol U2 denotes the midpoint (center point) of the virtual image display surface PS3. The virtual image display distances corresponding to the points U1, U2, and U3 are L1, L2, and L3, respectively.
Further, the virtual image display surface PS3 is divided into: a road surface HUD region Z1 that displays a virtual image superimposed on the road surface 40; and an on-road HUD region Z2 that is located farther from the vehicle than the road surface HUD region Z1 and displays a virtual image located above the road surface 40.
The point at which a line segment connecting the viewpoint A of the user (or a reference point corresponding to the viewpoint) and the vanishing point VP1 of the road surface 40, detected by a vanishing point detecting unit (39 in fig. 13) described later, intersects the virtual image display surface PS3 is defined as the vanishing point VP2 on the virtual image display surface. With the position of the vanishing point VP2 on the virtual image display surface as the boundary, the region on the side close to the vehicle 1 is the road surface HUD region Z1 and the region on the side far from the vehicle 1 is the on-road HUD region Z2.
As shown in fig. 1 (C), since the virtual image display surface PS3 is an inclined surface, it does not exactly coincide with the road surface 40 of the road. However, as seen from the user's viewpoint, the region on the near side (the side closer to the host vehicle 1) of the vanishing point VP2 on the virtual image display surface (the point corresponding to the detected actual vanishing point VP1) does not appear to be lifted off the road surface, and can therefore be used as the road surface HUD region Z1, a region in which display superimposed on the road surface 40 is possible. The distance between the vanishing point VP2 on the virtual image display surface and the road surface 40 is denoted h0.
If the lift of the virtual image display surface PS3 from the road surface 40 is h0 or less, the user perceives the displayed virtual image G1 as attached to the road surface 40 without being affected by that lift.
On the other hand, in the region on the far side of the vanishing point VP2 on the virtual image display surface, the virtual image display surface PS3 is lifted considerably from the road surface. For example, the distance of the distal end portion U3 of the virtual image display surface PS3 from the road surface 40 is h1 (> h0). If a virtual image G2 is displayed in this far-side region, the user perceives the display as being formed above the road surface 40. This region can therefore be used as the on-road HUD region Z2.
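The geometry can be illustrated with a simple two-dimensional side-view model (x = forward distance from the eye, y = height above the road). The eye height of 1.2 m and the 0.6 m height of the distal end are assumptions; only the roughly 9 m to 30 m span of the surface is taken from the numerical example given later in this description.

    def intersect_sightline_with_surface(eye, target, u1, u3):
        """Intersection of the sight line eye->target with the segment u1->u3 (2-D)."""
        (ex, ey), (tx, ty) = eye, target
        (x1, y1), (x3, y3) = u1, u3
        dx1, dy1 = tx - ex, ty - ey          # direction of the sight line
        dx2, dy2 = x3 - x1, y3 - y1          # direction of the display surface
        det = dx1 * (-dy2) - dy1 * (-dx2)
        if abs(det) < 1e-12:
            return None                      # sight line parallel to the surface
        s = ((x1 - ex) * (-dy2) - (y1 - ey) * (-dx2)) / det
        return ex + s * dx1, ey + s * dy1

    eye = (0.0, 1.2)     # assumed eye height
    u1 = (9.0, 0.0)      # proximal end U1 on the road surface, about 9 m ahead
    u3 = (30.0, 0.6)     # distal end U3 about 30 m ahead; h1 = 0.6 m is an assumption
    # Point on the surface lying on the line of sight to a road point 15 m ahead
    # (how a road-superimposed virtual image position can be chosen); the same
    # intersection with the line of sight toward VP1 gives VP2.
    print(intersect_sightline_with_surface(eye, (15.0, 0.0), u1, u3))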
Next, refer to fig. 1 (D). In the example of fig. 1 (D), the on-road HUD region Z2 is set so that, when the user observes the virtual image G2 disposed in the on-road HUD region Z2, absolute distance perception is dulled and the sense of distance cannot be perceived accurately. In other words, the virtual image display distance L2 of the vanishing point VP2 on the virtual image display surface, which is the boundary between the road surface HUD region Z1 and the on-road HUD region Z2, is set to 20 m or more (20 m or farther).
Thus, the virtual image display surface of the on-road HUD region Z2 can serve as a pseudo-vertical virtual image display surface.
In the example of fig. 1 (D), attention is paid to the fact that when a person observes an image at a distant location, the contribution of the convergence angle to depth perception becomes small and absolute distance perception is dulled: the person knows the image is far away but may not accurately perceive its depth (sense of distance). By setting the on-road HUD region Z2 at such a distance, even if the virtual image G2 imaged in the on-road HUD region Z2 actually has depth, a person cannot perceive (feel) that depth, and it appears as if it were standing upright at that distance.
In other words, the virtual image display surface of the on-road HUD region Z2 can be regarded as a pseudo-vertical surface (as if it were a surface standing substantially perpendicular to the road surface, for example). In fig. 1 (D), the virtual image G2 lies on the inclined surface but appears, to a user (person) observing it, as a virtual image G2' standing upright on the road surface 40. Thus, for example, a virtual image G2' of a road sign, guidance display, advertisement, or the like standing above the road can be displayed without discomfort.
The length (extension range) of the virtual image display surface PS3 along the road surface 40 can be, for example, about 10m to 30 m.
Further, the inclination angle θ2 of the virtual image display surface PS3 with respect to the road surface 40 is preferably set in a range of approximately 1° to 3° (1° ≤ θ2 ≤ 3°).
For example, when the average look-down angle LDA of the user (the angle between the horizontal line through the viewpoint A and the user's viewing direction) is set to 3°, the vertical (up-down) viewing angle VFOV (corresponding to the viewing angle in the vertical direction of the display surface) is set to 5°, L1 (the virtual image display distance to the proximal end portion U1) is set to 9 m, L2 (the virtual image display distance to the vanishing point VP2 on the virtual image display surface) is set to 20 m, and L3 (the virtual image display distance to the distal end portion U3) is set to 30 m, the inclination angle θ2 is approximately 1.64°. This, however, is only an example, and the invention is not limited to it.
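The quoted value can be checked numerically under the assumption that L1 and L3 are slant distances from the viewpoint and that the 5° vertical viewing angle is centered on the 3° look-down angle; this interpretation, like the sketch itself, is an assumption rather than part of the disclosure.

    import math

    LDA, VFOV = 3.0, 5.0                      # degrees (look-down angle, vertical FOV)
    L1, L3 = 9.0, 30.0                        # metres (near and far ends of PS3)
    ang_near = math.radians(LDA + VFOV / 2)   # depression angle to U1 (5.5 degrees)
    ang_far = math.radians(LDA - VFOV / 2)    # depression angle to U3 (0.5 degrees)

    u1 = (L1 * math.cos(ang_near), -L1 * math.sin(ang_near))
    u3 = (L3 * math.cos(ang_far), -L3 * math.sin(ang_far))
    theta2 = math.degrees(math.atan2(u3[1] - u1[1], u3[0] - u1[0]))
    print(round(theta2, 2))                   # prints 1.64 (degrees), matching the text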
Next, refer to fig. 1 (E), which shows an example of display control by the display control unit (reference numeral 300 in fig. 13). The specific structure of the HUD device will be described later.
The display control unit (reference numeral 300 in fig. 13) divides the image display area 45 of the display surface 47 of the display unit 46 into: a first display area Z1' corresponding to the road surface HUD area Z1; and a second display area Z2' corresponding to the on-road HUD area Z2. The "point VP2'" is a point on the display surface 47 corresponding to the "vanishing point VP2 on the virtual image display surface" shown in fig. 1 (C) and (D).
Then, the display control unit (reference numeral 300 in fig. 13) causes the first image RG1 (a navigation arrow 61' and a speed display SP') to be superimposed on the road surface 40 by displaying it in the first display region Z1', and causes the second image RG2 (a speed limit display 63' and a "learning road" display 65') to be displayed above the road surface 40 by displaying it in the second display region Z2'.
Depending on the structure of the optical system, the "upper viewing angle" and "lower viewing angle" in fig. 1 (E) may be reversed; in that case, the display image of fig. 1 (E) is displayed vertically inverted.
In fig. 1 (E), points U1', U2', and U3' correspond to points U1, U2, and U3 in fig. 1 (C) and (D).
The side lines 51, 53 and the center line 55 of the road, drawn with broken lines in fig. 1 (E), are detected by applying image processing to an image captured in front of the vehicle 1, and the point at which these lines intersect in the distance is the vanishing point VP1 shown in figs. 1 (C) and (D). As described above, the vanishing point on the virtual image display surface is VP2, and the point on the display surface 47 corresponding to VP2 is VP2'.
In fig. 1 (E), the first image RG1 may be at least one of an image indicating information on the state of the vehicle 1 (for example, a vehicle speed display), a route guidance image (for example, a navigation arrow image), and a road guidance image (for example, an image combining a place name, a landmark, and an arrow). The second image RG2 may be at least one of an image of speed limit information, an image of any of various markings provided on the road, an on-road route guidance image, an image of a sign provided on the road, and an image of an advertisement provided on the road. These are merely examples. In this way, by making the contents (meaning, substance, kind, etc.) of the first and second images different from each other, a variety of information can be presented to the user, and the expressive power of the HUD device can be improved.
Next, refer to fig. 2. Fig. 2 (A) is a diagram showing an example of virtual images (including a road surface HUD display and an on-road HUD display) visually recognized by a user through the windshield, and fig. 2 (B) is a diagram showing the virtual image display on the virtual image display surface, which is an inclined surface. In fig. 2, the same reference numerals are given to portions common to the above-described drawings (the same applies to the other drawings).
In the example of fig. 2, the windshield (front glass) of the vehicle (host vehicle) 1 functions as the projected member (a member having light reflectivity and light transmittance), in other words the reflective light-transmitting member 2. The projected member (reflective light-transmitting member 2) may instead be a combiner or the like. The HUD device projects display light onto the reflective light-transmitting member 2 provided in the vehicle 1 and generates and displays, superimposed on the real scene seen through the reflective light-transmitting member 2, a virtual image formed by the display light reflected by the reflective light-transmitting member 2.
The display of fig. 2 (A) is obtained, for example, by forming the image shown in fig. 1 (E) on the display surface 47. The virtual image is displayed in a virtual image display region 3 on the windshield (reflective light-transmitting member 2).
In fig. 2 (a), the virtual image display surface PS3 of fig. 1 (C) described earlier is used. Fig. 2 (B) is a view of fig. 1 (C) taken again, and the contents are the same.
As shown in fig. 2 (A), a first virtual image G1 superimposed on the road surface 40 (including a navigation arrow 61 and a speed display SP) is displayed in the road surface HUD region Z1. Further, a second virtual image G2 to be displayed above the road surface 40 (a "through road" display 63 and a "speed limit" ("40 km/h") display 65) is displayed in the on-road HUD region Z2.
In fig. 2 (a), reference numerals 51 and 53 denote side lines (real views) of a road, and reference numeral 55 denotes a center line (real view).
In the example of fig. 2 (A), an operation unit 9 capable of, for example, switching the HUD device on and off and setting the operation mode is provided near the steering wheel (steering handle in a broad sense) 7. A display device (for example, a liquid crystal display device) 13 is provided in the center of the front panel 11. The display device 13 may be used, for example, to supplement the display of the HUD device, and may be a composite panel including a touch panel or the like.
Fig. 2 (B) is a view of fig. 1 (C) again, and the contents are the same, so that the description thereof is omitted.
Next, refer to fig. 3. Fig. 3 (A) is a diagram showing an example in which the virtual image display surface of the on-road HUD region is treated as a pseudo-vertical surface in the example of fig. 2 (A), and fig. 3 (B) is a diagram showing the virtual image display on the virtual image display surface, which is an inclined surface.
Fig. 3 (A) uses the virtual image display surface PS3 of fig. 1 (C) described earlier. In fig. 3 (A), only the "through road" guidance display 63 is displayed as the second virtual image G2.
As described above, when the vanishing point VP2 on the virtual image display surface is set at a point whose virtual image display distance is 20 m or more, the area on the far side becomes the on-road HUD area Z2, which behaves as a pseudo-vertical surface. This makes it possible to present to the user, without a sense of discomfort, objects that stand upright from the road surface, such as signs, guide signs, and advertisements (these may be collectively referred to as virtual sign information), or information related to such objects.
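As a rough numerical illustration of why a threshold of this order is plausible (the interpupillary distance and the specific viewing distances below are assumptions for illustration, not values from this disclosure), the binocular convergence angle shrinks rapidly with viewing distance:

```latex
% Convergence angle for interpupillary distance E and viewing distance D:
\theta(D) = 2\arctan\!\left(\frac{E}{2D}\right)
% With an assumed E \approx 0.065\,\mathrm{m}:
\theta(5\,\mathrm{m}) \approx 0.74^\circ, \qquad
\theta(20\,\mathrm{m}) \approx 0.19^\circ, \qquad
\theta(100\,\mathrm{m}) \approx 0.04^\circ
```

Beyond roughly 20 m the convergence angle changes by only a fraction of a degree, which is consistent with treating the far portion of the display surface as a pseudo-vertical surface without the user perceiving a distance mismatch.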
Fig. 3 (B) reproduces fig. 1 (D) described earlier; since the contents are the same, the description thereof is omitted.
As described above, in the examples of figs. 2 and 3, the virtual image display surface PS3 formed as an inclined surface is provided over a considerably wide area in front of the vehicle 1. In this way, a display overlapping the road surface (for example, a navigation arrow display) can be formed in the region Z1 on the side of the virtual image display surface PS3 closer to the host vehicle 1, and a display located above the road surface (for example, a display of a sign or the like) can be formed in the region Z2 on the side farther from the host vehicle 1.
Conventionally, HUD devices that form displays superimposed on the road surface have had difficulty displaying signs and the like located above the road surface; in this respect the available displays have been limited and the expressive power restricted. According to the present embodiment, this limitation is greatly relaxed, and various road surface HUD and on-road HUD displays can be made in combination. This can improve the expressive power of the HUD device.
Next, refer to fig. 4. Figs. 4 (A) and (B) are diagrams for explaining methods of adjusting the sense of distance when performing a display that conveys distance. Fig. 4 (A) is the same as fig. 3 (A).
When the sense of distance is adjusted for the first virtual image G1 (the virtual image superimposed on the road surface), at least one of the following can be performed by operating on the first image RG1 (see fig. 1 (E)): changing the virtual image display distance by changing the display position of the first image RG1 on the display surface 47; and performing form control based on at least one of the size, shape, pattern, color, presence or absence of shading, and stereoscopic drawing of the display object constituting the first image.
In other words, for the first image RG1 corresponding to the road surface HUD region Z1, the sense of depth can be adjusted by changing its position on the virtual image display surface PS3 formed as an inclined surface. In addition, the sense of distance can be adjusted by controlling the form of the display object, that is, by changing the size, shape, and the like of the first image RG1. The two methods may also be used together.
For the second virtual image G2 corresponding to the on-road HUD region Z2, the virtual image display distance is not controlled; only form control of the display object is performed. As described above, for the second virtual image G2 the convergence angle has little influence on a person's perception of depth and the perception of absolute distance is dulled, so controlling the virtual image display distance would serve no purpose and is therefore not performed. Instead, the sense of distance is adjusted by form control such as the size and shape of the display object. In this way, the sense of distance can be adjusted for each of the first and second virtual images G1 and G2 by an appropriate method.
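The two policies can be summarized in a short sketch. This is a hypothetical illustration only: the function names, the panel resolution, the near/far extent of the inclined surface, and the screen orientation are assumptions and are not taken from this disclosure.

```python
def depth_to_display_y(target_depth_m, near_m=5.0, far_m=20.0, panel_height_px=480):
    """Map a depth along the inclined surface (near..far) to a vertical pixel row."""
    t = (target_depth_m - near_m) / (far_m - near_m)
    t = max(0.0, min(1.0, t))
    return int((1.0 - t) * (panel_height_px - 1))   # assumed: nearer content drawn lower on the panel

def apparent_scale(target_depth_m, ref_depth_m=20.0):
    """Simple perspective-style scale factor: farther content is drawn smaller."""
    return ref_depth_m / max(target_depth_m, 1e-3)

def place_first_image(depth_m):
    # road-surface HUD (Z1): move on the inclined surface AND rescale
    return {"y_px": depth_to_display_y(depth_m), "scale": apparent_scale(depth_m)}

def place_second_image(depth_m, fixed_y_px=60):
    # on-road HUD (Z2): keep the drawing position, convey distance by form (size) only
    return {"y_px": fixed_y_px, "scale": apparent_scale(depth_m)}

print(place_first_image(10.0))   # e.g. drawn lower and larger than a 20 m target
print(place_second_image(60.0))  # position fixed, only the scale conveys distance
```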
Next, refer to fig. 4 (B). Fig. 4 (B) shows a display control example in which the virtual image display distance is changed by changing the virtual image position on the virtual image display surface formed as an inclined surface. As shown in the drawing, in the road surface HUD region Z1 of the virtual image display surface PS3, the navigation arrow 61 as the first virtual image G1 extends from the side closer to the vehicle 1 to the side farther from the vehicle 1, and the farther the display position is, the larger the virtual image display distance becomes, whereby the sense of distance can be adjusted.
Similarly, when the display position of the vehicle speed display (the "40 km/h" display) is changed, the farther the position is from the vehicle 1, the larger the virtual image display distance becomes, whereby the sense of distance can be adjusted.
As described above, the virtual image display distance is not controlled for the second virtual image G2 in the on-road HUD region Z2 (the "through road" guidance display 63); instead, the sense of distance is adjusted by changing its size, shape, and the like.
Thus, even for a virtual image including a plurality of contents, the display distance of each part can be adjusted appropriately, and a high-quality virtual image having the desired sense of distance can be displayed.
Next, refer to fig. 5. Figs. 5 (A) and (B) are views showing display examples when the display is shifted from the on-road HUD display to the road surface HUD display.
In fig. 5 (A), an exit guide sign G2-1 and a direction guide sign G2-2 are displayed as second virtual images (virtual images standing up from the road surface). When the virtual images G2-1 and G2-2 of these signs, which appear far away, approach the host vehicle 1 over time, at least a part of their content (here, the entire content of the exit guide sign G2-1, which presents the information most important to the user) is temporarily displayed so as to be attached to the road surface 40. In other words, it is displayed as the exit guide sign G1-1 superimposed on the road surface 40. This makes clear the transition (switching) from the on-road HUD display (specifically, the vehicle guidance based on the second virtual image) to the road surface HUD display (specifically, the vehicle guidance based on the first virtual image).
Then, for example, display control is performed so that the road surface HUD display G1-1 immediately after the switching moves closer to the host vehicle 1 with the lapse of time.
In the driving scene of fig. 5 (B), the timing at which the travel route of the vehicle 1 should be changed toward the exit is approaching. At this time, a new vehicle guidance display G1-2 is displayed so as to connect to the exit guide sign G1-1 displayed as the road surface HUD, and guidance for reliably leading the vehicle 1 to the exit is performed.
Thereafter, display control may also be performed so that the exit guide sign G1-1 flows toward the rear of the vehicle 1 with the passage of time.
In other words, when a sign or the like that appeared far away comes close and a stage is reached at which it actually affects the steering operation or the like, part or all of the sign is shown on the road surface HUD as a stitched display, and the switch from the on-road HUD to the road surface HUD is clearly presented to the user.
This allows the user to intuitively understand that, from then on, information such as important signs will be presented mainly on the road surface HUD. Therefore, by subsequently performing a guidance display or the like, information can be presented smoothly and without a sense of discomfort.
As described above, the road surface HUD display at the time of the above switching (the exit guide sign G1-1 displayed overlapping the road surface) can also be regarded as a "stitched display" that bridges to the subsequent guidance display (G1-2) and the like. By providing this stitched display, the transition from the on-road HUD display to the road surface HUD display can be made smooth (in other words, a display with little sense of discontinuity). Thus, dynamic virtual image display control with a sense of presence, which was difficult in the prior art, is realized, and the expressive power of the HUD device is remarkably improved.
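The switching sequence described with reference to fig. 5 can be sketched as a simple state transition. The state names, distance thresholds, and function names below are assumptions for illustration; the disclosure does not specify numerical switching distances.

```python
from enum import Enum, auto

class GuidePhase(Enum):
    ON_ROAD_SIGN = auto()          # G2-1/G2-2 standing far ahead (on-road HUD)
    STITCHED_ON_SURFACE = auto()   # same content laid on the road surface (G1-1, stitched display)
    SURFACE_GUIDANCE = auto()      # connected road-surface guidance display (G1-2)

def next_phase(phase: GuidePhase, distance_to_exit_m: float) -> GuidePhase:
    """Advance the guidance phase as the exit approaches (thresholds are assumed values)."""
    if phase is GuidePhase.ON_ROAD_SIGN and distance_to_exit_m < 150.0:
        return GuidePhase.STITCHED_ON_SURFACE   # show the sign "attached" to the road surface
    if phase is GuidePhase.STITCHED_ON_SURFACE and distance_to_exit_m < 80.0:
        return GuidePhase.SURFACE_GUIDANCE      # connect the new guidance display toward the exit
    return phase

# Example: the phase advances as the remaining distance shrinks.
phase = GuidePhase.ON_ROAD_SIGN
for d in (300.0, 140.0, 70.0):
    phase = next_phase(phase, d)
    print(d, phase.name)
```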
Next, refer to fig. 6. Figs. 6 (A) to (D) are diagrams showing examples of the shape of the virtual image display surface and of its positional relationship with the road surface.
In the example of fig. 6 (A), the virtual image display surface PS4 is a straight inclined surface located in its entirety on or above the road surface 40. In the example of fig. 6 (B), the virtual image display surface PS5 is a straight inclined surface, and the portion of the inclined surface closer to the host vehicle 1 is located below the road surface 40.
In the example of fig. 6 (C), the virtual image display surface PS6 is a curved inclined surface and is located in its entirety on or above the road surface 40. In the example of fig. 6 (D), the virtual image display surface PS7 is a curved inclined surface, and the portion of the inclined surface closer to the host vehicle 1 is located below the road surface 40.
At least a part of the virtual image display surface corresponding to the road surface HUD area may be positioned below the road surface.
As described above, in the present embodiment a wide variety of virtual image display surfaces can be used. As in the examples of figs. 6 (B) and (D), at least a part of the virtual image display surface in the road surface HUD area is allowed to be located below the road surface.
In this case, the first virtual image on the virtual image display surface is imaged below the road surface 40. However, a user observing the virtual image knows from common sense that a display superimposed on the road surface 40 cannot actually be located under the road surface 40, and therefore perceives the first virtual image as being attached to the road surface 40.
Thus, the first virtual image does not appear to float up from the road surface 40 but appears to adhere closely to it, so the accuracy of superimposing virtual images on the road surface can be improved. According to the present embodiment, a highly expressive display full of a sense of presence is realized by combining the first virtual image accurately superimposed on the road surface 40 (whether imaged on or under the road surface 40) with the second virtual image standing up from the road surface 40 (the virtual image displayed on the portion of the virtual image display surface located above the road surface 40).
Next, refer to fig. 7. Fig. 7 is a diagram showing an example of virtual image display using a virtual image display surface by the HUD device. In addition, although four virtual image display surfaces are shown in fig. 7, these are the same as the example shown in fig. 6, and the same reference numerals as in fig. 6 are given to the respective virtual image display surfaces.
In fig. 7, the direction along the front of the vehicle 1 (also referred to as the front-rear direction) is referred to as the Z direction, the direction along the width (lateral width) of the vehicle 1 (the left-right direction) is referred to as the X direction, and the height direction of the vehicle 1 (the direction of a line segment perpendicular to the flat road surface 40 and away from the road surface 40) is referred to as the Y direction.
In the following description, terms such as "up" and "down" are used with respect to the shape of the virtual image display surface. Here, for convenience of explanation, the direction along a line segment (normal line) perpendicular to the road surface 40 is taken as the up-down direction; when the road surface is horizontal, the vertically downward direction is "down" and the opposite direction is "up". The same applies to the description of the figures described above.
As shown in the figure, the HUD device 101 of the present embodiment (a HUD device that uses the road surface HUD and the on-road HUD together) is mounted inside the instrument panel of the vehicle (host vehicle) 1.
The HUD device 101 has: an image display unit (here, a screen) 46 having a display surface 47 for displaying an image; an optical system 120 including an optical member that projects the display light K of the displayed image onto the windshield (reflective light-transmitting member 2); and a light projecting section (image projecting section) 150. The optical member includes a concave mirror (magnifying mirror) 130 having a reflecting surface 139, and the reflecting surface 139 has a shape (which may include a curved surface) suitable for displaying, on the side closer to the host vehicle 1, a virtual image superimposed on the road surface 40 as its superimposition target, and for displaying, on the side farther from the host vehicle 1, an image standing up from the road surface 40. The shape of the reflecting surface 139 has a considerable influence on the shape of the virtual image display surfaces PS4 to PS7 and on their relationship with the road surface.
The shape of the virtual image display surfaces PS4 to PS7 is affected by the shape (including any curved portion) of the reflecting surface 139 of the concave mirror 130, the curved shape of the windshield (reflective light-transmitting member 2), and the shape of other optical members mounted in the optical system 120 (for example, a correction mirror). It is also affected by the shape of the display surface 47 (generally a plane, although all or part of it may be non-planar) and by the arrangement of the display surface 47 with respect to the reflecting surface 139. Since the concave mirror 130 is a magnifying mirror, it has a considerable influence on the shape of the virtual image display surface; in practice, changing the shape of the reflecting surface 139 changes the shape of the virtual image display surface.
The virtual image display surfaces PS4 to PS7, which extend integrally from the proximal end portion U1 to the distal end portion U3, are formed by disposing the display surface 47 of the image display unit (display unit) 46 obliquely, at an intersection angle of less than 90 degrees, with respect to the optical axis of the optical system (the main optical axis corresponding to the principal ray).
The planar or curved shape of the virtual image display surfaces PS4 to PS7 can be adjusted by adjusting the optical characteristics of the entire optical system or of a partial region of it, by adjusting the arrangement of the optical members and the display surface 47, by adjusting the shape of the display surface 47, or by a combination of these. In this way, the shape of the virtual image display surface can be adjusted in many ways, which improves the degree of freedom in designing the HUD device.
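One intuitive way to see why tilting the display surface with respect to the optical axis yields an inclined virtual image display surface is the paraxial mirror equation: points of a tilted display surface sit at slightly different object distances from the magnifying mirror, so their virtual images form at very different distances. The focal length and object distances below are arbitrary assumptions used only to illustrate the effect, not values from this disclosure.

```python
def virtual_image_distance(object_dist_m: float, focal_len_m: float) -> float:
    """Paraxial mirror equation 1/do + 1/di = 1/f; for do < f the image is virtual (di < 0)."""
    return 1.0 / (1.0 / focal_len_m - 1.0 / object_dist_m)

f = 0.25                        # assumed focal length of the magnifying mirror [m]
for do in (0.20, 0.22, 0.24):   # object distances across a tilted display surface [m]
    di = virtual_image_distance(do, f)
    print(f"do={do:.2f} m -> virtual image roughly {abs(di):.2f} m behind the mirror")
```

A few centimetres of tilt across the screen thus translates into metres of spread in virtual image distance, which is the basic mechanism behind the proximal-to-distal extent of PS4 to PS7.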
Next, refer to fig. 8. Fig. 8 is a diagram showing a specific example of an optical system in the head-up display device.
The HUD device 101 has: a light projecting unit 151; a screen 161 as the image display section; a reflecting mirror 133; a concave mirror 131; and a control unit 171. The control unit 171 (which may also be referred to as a display control device or a display control unit) is configured by an I/O interface that obtains information from external sensors and other ECUs, a processor, a memory, a computer program stored in the memory, and the like.
The angle of the concave mirror 131 can be appropriately adjusted by operating a rotation mechanism 175 constituted by an actuator. The inclination and position of the screen 161 can be appropriately adjusted by an adjusting section 173 for the image display section, likewise constituted by an actuator. Specifically, the inclination of the screen 161 may be an inclination with respect to the optical axis of the light projecting section 151, with respect to the optical axis of the optical system, or with respect to the main optical path (principal ray) of the light emitted from the light projecting section. The control unit 171 collectively controls the operation of the light projecting unit 151, the rotation mechanism 175, the adjusting section 173 of the image display section, and the like. Reference symbol K denotes the display light (outgoing light). By adjusting the characteristics of the optical system in these various ways, the range of curved shapes that the virtual image display surface can take is widened, and the curvature and the like of the curved surface can be adjusted with higher accuracy.
Next, refer to fig. 9. Fig. 9 is a diagram for explaining an example of the curved shape and the focal points of a concave mirror (a magnifying mirror having a curved reflecting surface). The concave mirror 135 shown in fig. 9 has portions α, β, and γ whose radii of curvature are set to mutually different values. Reference numeral 163 denotes a screen as the image display section. The optical path shown by the broken line represents the main optical path (principal ray) along the optical axis of the concave mirror 135 (more broadly, of the optical system).
The concave mirror 135 has focal points indicated by the points F1 to F5 in accordance with the change in its radius of curvature. The shape of the virtual image display surface (its degree of curvature, flatness, and so on) can be changed according to the shape of the curved surface traced by these focal points. Various modifications are conceivable, such as finely adjusting the radii of curvature of the portions α, β, and γ of the concave mirror 135 in steps, or varying them continuously. The present embodiment adopts the following design approach: rather than correcting the distortion caused by the aberration between the concave mirror and the windshield as in the prior art, the virtual image display surface is allowed to include a curved shape, and that shape is controlled freely and with high accuracy. In this way, flatness is ensured by exploiting the characteristics of the human eye, or the virtual image is made to appear superimposed on its superimposition target (the road surface or the like) without floating or the like in the road surface HUD region Z1.
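In paraxial terms (a simplified sketch; the actual reflecting surface is a free-form design), each portion of the mirror can be assigned a local focal length from its local radius of curvature, and the mirror equation then gives a different virtual image distance for the part of the screen seen through that portion, which is what bends the virtual image display surface:

```latex
f_i \approx \frac{R_i}{2}
\qquad \text{(local focal length of portion } i \in \{\alpha, \beta, \gamma\}\text{)}
\\[4pt]
\frac{1}{d_{o,i}} + \frac{1}{d_{i,i}} = \frac{1}{f_i}
\quad\Rightarrow\quad
d_{i,i} = \frac{d_{o,i}\, f_i}{d_{o,i} - f_i}
\qquad (d_{i,i} < 0 \text{ for } d_{o,i} < f_i \text{, i.e. a virtual, magnified image})
```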
Next, refer to fig. 10. Fig. 10 is a diagram showing an example of a system configuration of the head-up display device. The system shown in fig. 10 includes a display control device 740, an object detection unit 801, a vehicle information detection unit 803, a display unit 12, a first actuator 177, and a second actuator 179. The display control device 740 has an I/O interface 741, a processor 742, and a memory 743. The display control device 740, the object detection unit 801, and the vehicle information detection unit 803 are connected via a communication line (a bus or the like).
The display control device 740 can be used, for example, as the control unit (display control device, display control unit) 171 shown in fig. 8. The first actuator 177 and the second actuator 179 can be used as the rotation mechanism 175 and the adjusting section 173 shown in fig. 8, and can be used to adjust the optical system as a whole or in its individual parts. They can also be referred to collectively as an adjustment system for the optical system.
The object detection unit 801 may be constituted by, for example, a vehicle-exterior sensor and a vehicle-exterior camera provided in the vehicle 1. The vehicle information detection unit 803 may be constituted by, for example, a speed sensor, a vehicle ECU, a vehicle-exterior communication device, a sensor that detects the position of the user's eyes, a yaw rate sensor that detects the pitch angle (tilt angle) of the vehicle 1, a height sensor, or the like. The display control device 740 may realize the HUD device that uses the road surface HUD and the on-road HUD together while operating the optical system optimally, for example on the basis of the detection information from the object detection unit 801 and the information from the vehicle information detection unit 803.
Further, the one or more processors 742 can obtain, for example, the position of the road surface 40 and, based on the road surface 40, drive at least one of the first actuator 177 and the second actuator 179 so that at least a part of the virtual image display surface is disposed below the road surface 40.
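As a sketch of how a processor might translate a detected road-surface height into an actuator command so that the near edge of the virtual image display surface sits below the road surface (the interface, gain, sign convention, and margin are assumptions, not part of this disclosure):

```python
def mirror_rotation_step(road_surface_y_m: float,
                         near_edge_y_m: float,
                         margin_m: float = 0.05,
                         gain_deg_per_m: float = 0.1) -> float:
    """Return a small rotation step (in degrees) for the concave-mirror actuator.

    The step is proportional to how far the near edge of the virtual image display
    surface still is above the target height (road surface minus a margin); it is
    zero once the near edge already lies below that target.
    """
    target_y_m = road_surface_y_m - margin_m
    error_m = near_edge_y_m - target_y_m      # positive while the near edge is too high
    return max(0.0, gain_deg_per_m * error_m)

# Example: road surface at 0 m, near edge currently 0.10 m above it
print(mirror_rotation_step(0.0, 0.10))        # roughly 0.015 degrees under the assumed gain
```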
Next, refer to fig. 11. Fig. 11 is a diagram for explaining control of the virtual image display distance when the display shown in fig. 5 (a) and 5 (B) is performed. In fig. 11, symbol 22 denotes a user, and symbol K denotes display light.
As shown in the figure, the exit guide sign G2-1 standing up from the road surface 40 is displayed first, and at time t2 the exit guide sign G1-1 (whose display content is the same as that of G2-1) is displayed as a stitched display superimposed on the road surface 40.
Next, at time t3, the exit guide sign G1-1 is displayed at a position closer to the vehicle 1.
Then, the vehicle guidance display G1-2 is displayed from time t4 to time t6, and at time t7 the exit guide sign G1-1 is displayed at a position still closer to the vehicle 1. Such control of the virtual image display position is performed by a display control unit (for example, reference numeral 300 in fig. 12).
Fig. 12 is a diagram showing an overall configuration example of the HUD device. In fig. 12, an optical system 52 is provided; as its configuration, the same configuration as that shown in fig. 7 is employed. The same reference numerals are given to the same parts as those in fig. 5.
In the example of fig. 12, there are provided: an optical system 52; a driving scene determination unit 19 having an image processing unit 21 for performing image processing based on an image captured by the front camera 17; and a navigation section (navigation ECU) 400.
The optical system 52 includes: a light projecting section 150; a screen (image display section) 46 having a display surface 47 on which an image M is formed; and a concave mirror 130. As a result of the display light K being emitted from the optical system 52 toward the windshield (reflective light-transmitting member 2), the virtual image G is displayed on the virtual image display surface PS having an inclined surface, for example, as described above.
The display control unit 300 includes an image output unit 32, a driving unit 33, a virtual image display distance control unit 34, an image generation unit 35, a road surface HUD area/on-road HUD area detection unit 36, a virtual image display surface position adjustment unit 37, and a vanishing point detection unit 39 (including a pitch angle and vanishing point calculation unit 38).
The navigation unit (navigation ECU) 400 includes a depth map unit 402, a navigation information (road guidance information, road sign information, and the like) generation unit 404, a driving route information acquisition unit 406, a host vehicle position information acquisition unit 408, a map information acquisition unit 410, and a storage unit 412 (functioning as a database of maps, road guidance information, road signs, and the like). Vehicle information and the like collected by the in-vehicle ECU 700 are supplied to the navigation unit (navigation ECU) 400 via a bus (BUS).
The communication unit 500 may appropriately supply various information acquired by wireless communication with a driving support system 600 provided outside the vehicle 1 to the host vehicle position information acquisition unit 408 and the map information acquisition unit 410 of the navigation unit 400. Position information and the like received by the GPS receiving unit 502 from GPS satellites may likewise be supplied to the host vehicle position information acquisition unit 408 and the map information acquisition unit 410 of the navigation unit 400.
In addition, the vehicle 1 is provided with various sensors 505 (including a yaw rate sensor for pitch angle detection). The various pieces of information collected by the in-vehicle ECU 700 are supplied to the navigation unit 400 via the bus, and some of them are also supplied to the vanishing point detection unit 39 (pitch angle and vanishing point calculation unit 38).
In the display control unit 300, the pitch angle and vanishing point calculation unit 38 included in the vanishing point detection unit 39 calculates the current pitch angle of the vehicle 1 based on the image information supplied from the image processing unit 21, the information from the various sensors supplied from the in-vehicle ECU 700, and the like, and calculates the positions of the vanishing points (VP1 and VP2 in fig. 1) taking into account the pitch angle, the position of the viewpoint A of the user 22, and the like.
When it is determined, for example based on the information supplied from the driving scene determination unit 19, that the vehicle 1 has entered an ascending (or descending) slope, the virtual image display surface position adjustment unit 37 adjusts (corrects) the position of the inclined virtual image display surface PS (its position relative to the road surface) taking the pitch angle and the like into account. When the position of the virtual image display surface PS is corrected, the pitch angle and vanishing point calculation unit 38 acquires the vanishing point position on the corrected virtual image display surface (VP2 in fig. 1) again.
The road surface HUD area/on-road HUD area detection unit 36 divides the virtual image display surface PS into the road surface HUD area Z1 and the on-road HUD area Z2, with the position of the vanishing point VP2 on the virtual image display surface as the boundary.
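The division can be sketched geometrically: intersect the gaze ray toward the road's vanishing point with the inclined virtual image display surface, and use the intersection depth as the Z1/Z2 boundary. The eye height, the surface endpoints, and the sign convention for pitch below are assumptions for illustration only.

```python
import math

def vanishing_ray(pitch_deg: float):
    """Gaze direction toward the road's vanishing point in the vehicle frame
    (assumed convention: nose-up pitch lowers the vanishing point in the view)."""
    p = math.radians(pitch_deg)
    return (math.cos(p), -math.sin(p))          # (forward, up) components

def split_display_surface(eye_height_m: float = 1.2,
                          pitch_deg: float = 0.0,
                          near=(5.0, 0.0),       # (depth, height) of the proximal end
                          far=(30.0, 2.0)):      # (depth, height) of the distal end
    """Return the depth ranges of the road surface HUD area Z1 and the on-road HUD area Z2,
    split at the intersection of the vanishing-point ray with the inclined surface."""
    dz, dy = vanishing_ray(pitch_deg)
    (z0, y0), (z1, y1) = near, far
    slope = (y1 - y0) / (z1 - z0)               # rise of the display surface per metre of depth
    # Solve eye_height + t*dy == y0 + slope*(t*dz - z0) for the ray parameter t.
    t = (y0 - slope * z0 - eye_height_m) / (dy - slope * dz)
    z_split = min(max(t * dz, z0), z1)          # clamp VP2 onto the surface extent
    return {"Z1": (z0, z_split), "Z2": (z_split, z1)}

print(split_display_surface())   # with these assumed values: Z1 spans 5-20 m, Z2 spans 20-30 m
```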
The virtual image display distance control unit 34 determines the display position of a sign or other display target on the display surface 47 based on the depth information and the like of that target supplied from the navigation unit (navigation ECU) 400, and thereby adjusts the virtual image display distance.
As described with reference to fig. 1 (E), the image generation unit 35 generates the image (original image) to be displayed on the display surface 47 based on the various pieces of input information. The generated image (original image) is supplied to the image output unit 32, which supplies the image data cv to the light projecting unit 150 of the optical system 52. The driving unit 33 supplies a control signal rvs for rotating the concave mirror 130 to the actuators (for example, reference numerals 177 and 179 in fig. 10).
Next, refer to fig. 13. Fig. 13 is a flowchart showing an example of the procedure of the display process by the display control unit.
First, the pitch angle and the vanishing point position are acquired (step S1). Next, the position of the virtual image display surface is adjusted (position correction) in consideration of the pitch angle (tilt angle) of the vehicle and the like (step S2). Next, the road surface HUD area and the on-road HUD area are detected (step S3).
Next, the road surface virtual image is displayed in the road surface HUD area, and the on-road virtual image is displayed in the on-road HUD area (step S4).
In step S4, when the virtual image display distance exceeds 20 m, the sense of distance is conveyed by form control (changes in size, shape, and the like); when the virtual image display distance is 20 m or less, at least one of a change in the virtual image display distance by changing the display position on the inclined surface and a distance display based on form control is performed. When the display in the on-road HUD area is shifted to the road surface display, a display whose meaning is continuous with that of the on-road display (a stitched display) is shown attached to the road surface at the time of the switching; thereafter, it is moved closer to the vehicle by distance change control as needed to adjust the sense of distance, and a new image is additionally displayed as needed.
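The branch in step S4 can be summarized in a short sketch. The data model, class and function names, and the example items are assumptions used only to make the branching explicit; the 20 m threshold is the one stated above.

```python
from dataclasses import dataclass

DISTANCE_THRESHOLD_M = 20.0   # boundary between road-surface HUD handling and on-road HUD handling

@dataclass
class DisplayItem:
    name: str
    distance_m: float         # intended virtual image display distance

def plan_rendering(items):
    """Choose, per item, how the sense of distance is conveyed in step S4."""
    plan = []
    for item in items:
        if item.distance_m > DISTANCE_THRESHOLD_M:
            # on-road HUD area: convey distance by form control (size, shape, ...) only
            plan.append((item.name, "on-road HUD", "form control"))
        else:
            # road-surface HUD area: move the display position on the inclined surface
            # and/or apply form control
            plan.append((item.name, "road-surface HUD", "display position and/or form control"))
    return plan

print(plan_rendering([DisplayItem("exit guide sign", 60.0), DisplayItem("navigation arrow", 12.0)]))
```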
As described above, according to the embodiment of the present invention, the road surface HUD and the on-road HUD are used together on a virtual image display surface formed of one continuous surface, whereby a wide variety of displays can be performed and the expressive power of the HUD device can be improved.
In the present specification, the term "vehicle" can be interpreted broadly. The term "sign" is also to be interpreted broadly, for example from the viewpoint of navigation information in a broad sense that is useful for the traveling of the vehicle. The HUD device also includes devices used as simulators (for example, an aircraft simulator).
The information provided by the road surface HUD display is, for example, vehicle speed information, an arrow superimposed on the road surface, speed limit information, and the like; the distance or time until the user (driver) performs a corresponding operation based on this information is relatively short, and such information is therefore sometimes collectively referred to as near-side or short-distance information.
The information provided by the above-described on-road HUD display is, for example, steering information, guide signs, and the like; there is a certain distance or time until the user (driver) performs a corresponding operation based on this information, and such information is therefore sometimes collectively referred to as far-side or long-distance information.
The present invention is not limited to the above-described exemplary embodiments, and modifications that can easily be made by those skilled in the art are included within the scope of the claims.

Claims (10)

1. A head-up display device which projects display light onto a reflective light-transmitting member provided in a vehicle, and which displays a virtual image, generated from the display light reflected by the reflective light-transmitting member, so as to overlap a live view transmitted through the reflective light-transmitting member,
the device comprises: an image display unit having a display surface for displaying an image; a display control unit that controls display of the virtual image; and an optical system including an optical member that projects the display light to the reflective light-transmitting member,
a virtual image display surface as an imaging surface for virtual image imaging integrally extends from a proximal end portion on a side close to the vehicle to a distal end portion on a side distant from the vehicle, and a second distance between a road surface and the distal end portion is larger than a first distance between the road surface and the proximal end portion, and the virtual image display surface is a plane or a curved surface,
The virtual image display surface is divided into:
a road surface head-up display area, hereinafter referred to as a road surface HUD area, which displays a virtual image superimposed on the road surface; and
an on-road head-up display area, hereinafter referred to as an on-road HUD area, which is located farther than the road HUD area and displays a virtual image located on the upper side than the road,
the display control unit divides an image display area of the display surface into: a first display area corresponding to the road surface HUD area; and a second display area corresponding to the on-road HUD area,
displaying a first image overlapping the road surface in the first display area,
and displaying a second image to be displayed on an upper side of the road surface in the second display area.
2. The head-up display device of claim 1, wherein,
the virtual image display surface integrally extending from the proximal end portion to the distal end portion is formed by disposing the display surface obliquely with respect to an optical axis of the optical system,
The shape of the plane or curved surface of the virtual image display surface is adjusted by adjusting the optical characteristics of the entire region or a part of the region in the optical system, adjusting the arrangement of the optical member and the display surface, adjusting the shape of the display surface, or a combination of these.
3. The head-up display device according to claim 1 or 2, wherein,
the on-road HUD region of the virtual image display surface is set such that, when the user observes a virtual image disposed in the on-road HUD region, the perception of absolute distance is dulled and the distance cannot be accurately perceived from the sense of depth, whereby the virtual image display surface in the on-road HUD region is set as a virtual image display surface standing substantially perpendicular to the road surface.
4. The head-up display device according to claim 1 or 2, wherein,
the distance from the viewpoint of the user, or from a reference point corresponding to the viewpoint, to the point in real space corresponding to the boundary, as viewed from the user, between the road surface and the space located above the road surface, that is, to the boundary position, is set to 20 m or more.
5. The head-up display device according to claim 1 or 2, wherein,
Has a vanishing point detecting unit for detecting vanishing points of the road surface,
in the case where a point at which a point of view of the user or a reference point corresponding to the point of view and the detected vanishing point intersect with the virtual image display surface is set as a vanishing point on the virtual image display surface,
the road surface HUD area is the area closer to the vehicle and the on-road HUD area is the area farther from the vehicle, with the position of the vanishing point on the virtual image display surface as the reference.
6. The head-up display device according to claim 1 or 2, wherein,
the second image is another separate image that is independent of the first image.
7. The head-up display device according to claim 1 or 2, wherein,
the first image is at least one of an image of information indicating a state of the vehicle and a route guidance image,
the second image is at least one of an image of speed limitation information, a route guidance image on a road, and an image of a sign provided on a road.
8. The head-up display device according to claim 1 or 2, wherein,
the display control unit performs at least one of the following control when adjusting the sense of distance for a first virtual image corresponding to the first image: controlling a virtual image display distance by changing a display position of the first image on the display surface; and performing morphological control based on at least one of the size, shape, color, presence or absence of shadows, and stereoscopic drawing of a display object constituting the first image,
The second virtual image corresponding to the second image performs only the morphological control, and does not perform control of the virtual image display distance.
9. The head-up display device according to claim 1 or 2, wherein,
when a time has elapsed since the second image was displayed at a distant point in time and the distance between the second image and the vehicle decreases as the vehicle advances forward, the display control unit displays at least a part of the display content of the second image as the first image so as to overlap the road surface.
10. The head-up display device according to claim 1 or 2, wherein,
at least a part of the virtual image display surface corresponding to the road surface HUD area is located on a lower side than the road surface.
CN202080051108.7A 2019-08-25 2020-08-20 Head-up display device Active CN114127614B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-153317 2019-08-25
JP2019153317 2019-08-25
PCT/JP2020/031442 WO2021039579A1 (en) 2019-08-25 2020-08-20 Head-up display device

Publications (2)

Publication Number Publication Date
CN114127614A CN114127614A (en) 2022-03-01
CN114127614B true CN114127614B (en) 2023-08-25

Family

ID=74683522

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080051108.7A Active CN114127614B (en) 2019-08-25 2020-08-20 Head-up display device

Country Status (4)

Country Link
JP (1) JPWO2021039579A1 (en)
CN (1) CN114127614B (en)
DE (2) DE202020005800U1 (en)
WO (1) WO2021039579A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023149628A (en) * 2022-03-31 2023-10-13 パナソニックIpマネジメント株式会社 Image generation device, display system, image generation method and program

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4871459B2 (en) * 2001-07-17 2012-02-08 矢崎総業株式会社 In-vehicle head-up display device
JP4206955B2 (en) * 2004-04-02 2009-01-14 株式会社デンソー VEHICLE DISPLAY DEVICE, VEHICLE DISPLAY SYSTEM, AND PROGRAM
JP5161760B2 (en) * 2008-12-26 2013-03-13 株式会社東芝 In-vehicle display system and display method
US9030749B2 (en) * 2012-08-01 2015-05-12 Microvision, Inc. Bifocal head-up display system
JP6232363B2 (en) 2014-09-29 2017-11-15 矢崎総業株式会社 Vehicle display device
RU2746380C2 (en) * 2016-05-11 2021-04-12 ВэйРэй АГ Windshield indicator with variable focal plane
JP6925916B2 (en) * 2017-09-14 2021-08-25 アルパイン株式会社 Head-up display
JP2019079351A (en) * 2017-10-25 2019-05-23 日本精機株式会社 Image processing unit and head-up display device equipped with the same
JP6969509B2 (en) * 2018-06-29 2021-11-24 株式会社デンソー Vehicle display control device, vehicle display control method, and control program

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1934484A (en) * 2004-03-19 2007-03-21 特里吉特股份公司 Device and system for display of information, and vehicle equipped with such a system
CN104471354A (en) * 2012-08-10 2015-03-25 爱信艾达株式会社 Intersection guidance system, method and program
CN104508430A (en) * 2012-08-10 2015-04-08 爱信艾达株式会社 Intersection guidance system, method and program
CN105682973A (en) * 2013-10-22 2016-06-15 日本精机株式会社 Vehicle information projection system, and projection device
CN106133803A (en) * 2014-03-31 2016-11-16 株式会社电装 Vehicle display control unit
CN106687327A (en) * 2014-09-29 2017-05-17 矢崎总业株式会社 Vehicular display device
JP2016090344A (en) * 2014-10-31 2016-05-23 アイシン・エィ・ダブリュ株式会社 Navigation device and navigation program
JP2016118423A (en) * 2014-12-19 2016-06-30 アイシン・エィ・ダブリュ株式会社 Virtual image display device
CN107249922A (en) * 2015-02-24 2017-10-13 日本精机株式会社 Display apparatus
CN106489096A (en) * 2015-06-11 2017-03-08 法国圣戈班玻璃厂 Projection arrangement for head-up display (hud)
CN108369341A (en) * 2015-12-01 2018-08-03 日本精机株式会社 Head-up display
CN105511079A (en) * 2016-01-08 2016-04-20 北京乐驾科技有限公司 Lateral light path transmission system and method based on HUD (Head Up Display)
CN108473055A (en) * 2016-02-05 2018-08-31 麦克赛尔株式会社 head-up display device
WO2017183322A1 (en) * 2016-04-18 2017-10-26 Sony Corporation Image display device and method for a vehicle
WO2018168595A1 (en) * 2017-03-15 2018-09-20 日本精機株式会社 Head-up display device
JP2018159738A (en) * 2017-03-22 2018-10-11 アイシン・エィ・ダブリュ株式会社 Virtual image display device
JP2018159882A (en) * 2017-03-23 2018-10-11 日本精機株式会社 Head-up display device
CN110073275A (en) * 2017-11-14 2019-07-30 Jvc 建伍株式会社 Virtual image display apparatus
WO2019097763A1 (en) * 2017-11-17 2019-05-23 アイシン・エィ・ダブリュ株式会社 Superposed-image display device and computer program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Hu Yu. Research on an in-vehicle head-up display system. China Master's Theses Full-text Database, Engineering Science and Technology II. 2012, C035-236. *

Also Published As

Publication number Publication date
WO2021039579A1 (en) 2021-03-04
CN114127614A (en) 2022-03-01
DE112020004003T5 (en) 2022-05-12
JPWO2021039579A1 (en) 2021-03-04
DE202020005800U1 (en) 2022-09-16

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant