WO2018216552A1 - Head-up display device (Dispositif d'affichage tête haute) - Google Patents

Head-up display device

Info

Publication number
WO2018216552A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
vehicle
display
acquired
Prior art date
Application number
PCT/JP2018/018745
Other languages
English (en)
Japanese (ja)
Inventor
Yuki Masuya (舛屋勇希)
Original Assignee
Nippon Seiki Co., Ltd. (日本精機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Seiki Co., Ltd. (日本精機株式会社)
Publication of WO2018216552A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10Intensity circuits
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position

Definitions

  • the present invention relates to a head-up display device.
  • there is known a head-up display (HUD) device that displays a virtual image by emitting display light onto a translucent member such as a windshield.
  • the head-up display device disclosed in Patent Document 1 displays a virtual image based on information acquired from detection means such as a radar or a stereo camera mounted on the host vehicle, or on information acquired from another vehicle through inter-vehicle communication.
  • in the head-up display device disclosed in Patent Document 2, a plurality of screens, each having a light control layer whose transmittance can be adjusted, are arranged in the thickness direction, and a projector projects images toward the plurality of screens at high speed.
  • a transmittance-adjustment screen method is adopted in which the plurality of screens adjust their transmittance in step with the high-speed switching of the projected images, thereby displaying a three-dimensional real image inside the apparatus.
  • the head-up display device disclosed in Patent Document 3 employs a method of displaying a three-dimensional real image inside the device by overlapping a plurality of liquid crystal display elements in the thickness direction.
  • Patent Document 1: JP 2016-107731 A; Patent Document 2: JP 2016-212318 A; Patent Document 3: JP 2004-168230 A
  • the present invention has been made in view of the above problems, and an object thereof is to provide a head-up display device that allows a viewer to recognize acquired information more reliably.
  • a head-up display device according to the present invention displays a virtual image superimposed on the real scene seen through a translucent member by emitting display light to the translucent member.
  • the device includes a display that emits the display light, an object information acquisition unit that acquires information on an object existing around the host vehicle on which the head-up display device is mounted, and an image generation unit that generates an information image as the virtual image and changes the display mode of the information image according to the type, number, or combination of the information acquired via the object information acquisition unit.
  • according to the head-up display device of the present invention, it is possible to make the viewer recognize the acquired information more reliably.
  • (a) is a schematic view of a vehicle on which the head-up display device according to the first embodiment of the present invention is mounted, and (b) is a schematic view showing the structure of the head-up display device.
  • It is a block diagram of the head-up display device according to the first embodiment.
  • It is a flowchart of the display process of the head-up display device according to the first embodiment.
  • (a) according to the first embodiment is a rear view of the other vehicle as seen from the viewer, and (b) is a top view of the other vehicle.
  • (a) according to the first embodiment is a rear view of the other vehicle as seen from the viewer, and (b) is a top view of the other vehicle.
  • (a) according to the third embodiment is a rear view of the other vehicle as seen from the viewer, and (b) is a top view of the other vehicle. It is a flowchart of the display process of the head-up display device according to the third embodiment.
  • (a) according to the fourth embodiment is a rear view of the other vehicle as seen from the viewer, and (b) is a top view of the other vehicle. It is a flowchart of the display process of the head-up display device according to the fourth embodiment.
  • HUD device: head-up display device
  • the HUD device 100 is mounted inside the dashboard 301 of a vehicle 300 (hereinafter referred to as the host vehicle).
  • the HUD device 100 emits display light L representing an information image W toward a windshield 200 which is an example of a light transmissive member of the host vehicle 300.
  • the viewer U who is a driver receives the display light L reflected by the windshield 200 and can visually recognize the information image W as a virtual image superimposed on the real scenery seen through the windshield 200.
  • the HUD device 100 includes a housing 110, a display 120, a screen 130, a screen position adjustment unit 140, a plane mirror 150, a concave mirror 160, and a control unit 170.
  • the housing 110 is formed in a box shape from light-shielding resin, and accommodates the display 120, the screen 130, the plane mirror 150, and the concave mirror 160.
  • An opening 111 that allows the display light L to pass therethrough is provided at the top of the housing 110.
  • the opening 111 is covered with a translucent cover 112.
  • the display 120 emits display light L representing the information image W to the screen 130.
  • the display device 120 includes a backlight 121 that emits light, and a display element 122 that receives light from the backlight 121 and generates an image.
  • the display element 122 includes a liquid crystal panel or a DMD (Digital Micromirror Device) element that operates under the control of the control unit 170.
  • the screen 130 is a transmissive screen composed of a holographic diffuser, a microlens array, a diffusion plate, and the like. When the display light L is incident on one surface, the screen 130 forms a display image on the other surface and emits the display light L indicating the display image toward the plane mirror 150.
  • the screen position adjustment unit 140 moves the position of the screen 130 along the screen movement direction A.
  • the screen movement direction A is a direction along the optical axis of the display light L emitted from the display device 120.
  • the display position of the information image W displayed by the HUD device 100 moves in the front-rear direction Y of the host vehicle 300.
  • the screen position adjustment unit 140 includes a driving unit, such as a motor or an actuator, and a transmission mechanism that transmits the driving force of the driving unit to the screen 130 as linear motion. Since the optical distance increases as the distance between the screen 130 and the concave mirror 160 (described later) increases, the distance from the host vehicle 300 to the information image W also increases.
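The relation described above, that a longer screen-to-mirror distance produces a more distant virtual image, follows from the ordinary mirror equation. Below is a minimal sketch; the focal length and distances are illustrative values, not taken from the patent:

```python
def virtual_image_distance(f_mm: float, screen_to_mirror_mm: float) -> float:
    """Magnitude of the virtual-image distance behind a concave mirror.

    Uses the mirror equation 1/f = 1/d_o + 1/d_i. While the screen image
    sits inside the focal length (d_o < f), the regime a HUD operates in,
    the image is virtual and its distance grows as d_o approaches f.
    """
    d_o = screen_to_mirror_mm
    if not 0 < d_o < f_mm:
        raise ValueError("screen image must lie inside the focal length")
    # Solving 1/f = 1/d_o + 1/d_i for the virtual image gives |d_i| below.
    return f_mm * d_o / (f_mm - d_o)
```

For example, with an assumed focal length of 250 mm, moving the screen image from 200 mm to 240 mm pushes the virtual image from 1000 mm to 6000 mm behind the mirror, which is the effect the screen position adjustment unit 140 exploits when it moves the information image W in the front-rear direction.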
  • the plane mirror 150 reflects the display light L incident from the screen 130 to the concave mirror 160.
  • the plane mirror 150 includes a flat base material made of synthetic resin, glass, or the like, and a metal reflection film formed on one surface of the base material.
  • the concave mirror 160 reflects the display light L toward the windshield 200.
  • Concave mirror 160 includes a curved plate-like base material made of synthetic resin, glass, or the like, and a metal reflection film formed on one surface of the base material.
  • the plane mirror 150 and the concave mirror 160 constitute an optical system that guides the display light L from the screen 130 to the windshield 200.
  • the host vehicle 300 includes a vehicle speed detection unit 310, an imaging unit 320, and an inter-vehicle communication unit 500 in addition to the HUD device 100.
  • the vehicle speed detection unit 310 detects the speed of the host vehicle 300 and outputs the detection result to the control unit 170 of the HUD device 100 via a communication interface (not shown) such as a CAN (Controller Area Network) bus.
  • the vehicle speed detection unit 310 may include position information acquisition means (not shown) such as a GPS (Global Positioning System) receiver, estimate the speed of the host vehicle 300 from the transition of its position information, and output the estimated speed to the control unit 170 of the HUD device 100.
  • the imaging unit 320 is configured by a stereo camera or a monocular camera, images the front of the host vehicle 300, and outputs the captured image data to the control unit 170 of the HUD device 100 as detection information I1.
  • the inter-vehicle communication unit 500 performs wireless communication with the other vehicle 400. Specifically, the inter-vehicle communication unit 500 receives the other vehicle information signal St from the other vehicle 400 and outputs the received other vehicle information signal St to the control unit 170 of the HUD device 100 as the communication information I2.
  • the other vehicle information signal St includes vehicle speed information, turn signal information, car navigation information, automatic driving level information, and the like in the other vehicle 400. That is, the other vehicle 400 acquires these various types of information and wirelessly transmits the acquired various types of information after including them in the other vehicle information signal St.
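As a way to picture the payload of the other vehicle information signal St listed above, the fields might be bundled as follows. The class and field names are hypothetical, invented for illustration, and not part of the patent:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OtherVehicleSignal:
    """Illustrative payload of the other-vehicle information signal St."""
    vehicle_speed_kmh: float          # vehicle speed information
    turn_signal: Optional[str]        # turn signal information: "left", "right", or None
    nav_maneuver: Optional[str]       # car navigation information, e.g. next planned maneuver
    autonomy_level: Optional[int]     # automatic driving level information
```

The transmitting vehicle 400 would populate such a structure from its own sensors and navigation system and broadcast it; the inter-vehicle communication unit 500 then hands the received contents to the control unit 170 as the communication information I2.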
  • the control unit 170 includes a memory (not shown) that stores a control program, a CPU (Central Processing Unit) that executes the display processing described in the flowcharts below by running the control program stored in the memory, and a GDC (Graphics Display Controller) that generates images.
  • the control unit 170 includes an image analysis unit 171, an image generation unit 172, a screen position control unit 173, and a communication information acquisition unit 174 as functions.
  • the image analysis unit 171 detects the presence / absence of the other vehicle 400 as the object P and the inter-vehicle distance to the other vehicle 400 in the image to be analyzed by analyzing the image data from the imaging unit 320.
  • the screen position control unit 173 controls the timing at which the display 120 projects the display light L on the screen 130 while detecting the position information of the screen 130 via the screen position adjustment unit 140. For example, when the virtual image is displayed two-dimensionally, the screen position control unit 173 emits the display light L only when the screen 130 is at a specific position, displaying the display image on the screen 130. Thereby, as shown in FIG. 4(b), the first information image W1 can be fixed to the first position P1. When the virtual image is displayed three-dimensionally, the screen position control unit 173 instead moves the screen 130 back and forth at high speed along the screen movement direction A. Accordingly, as shown in FIG. 6(b), the first information image W1 and the second information image W2 can be displayed at different positions in the front-rear direction Y.
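The timing rule described above, emit display light only while the moving screen passes a target plane, with one target for a two-dimensional display and several targets for a three-dimensional display, can be sketched as a simple gate. The function name and tolerance value are illustrative assumptions:

```python
def should_emit(screen_pos_mm: float, target_positions_mm: list,
                tolerance_mm: float = 0.5) -> bool:
    """Gate the display light to the moments the screen crosses a target plane.

    With a single target position the virtual image stays fixed at one depth
    (2-D display); with several targets, images are rendered in time-division
    at different depths (3-D display).
    """
    return any(abs(screen_pos_mm - t) <= tolerance_mm
               for t in target_positions_mm)
```

In a 3-D sweep, the same predicate would be evaluated every control tick as the screen oscillates, so each depth plane receives light only during its own brief window.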
  • the communication information acquisition unit 174 is an interface with the vehicle-to-vehicle communication unit 500, and demodulates the other vehicle information signal St, which is the communication information I2 from the vehicle-to-vehicle communication unit 500, so that it can be read by the image generation unit 172.
  • the image generation unit 172 controls the display element 122 to generate an image based on the position information of the screen 130 from the screen position control unit 173, the detection result of the image analysis unit 171, and the other vehicle information signal St (communication information I2) from the communication information acquisition unit 174. Note that the image analysis unit 171 and the communication information acquisition unit 174 of the control unit 170 correspond to the object information acquisition unit.
  • the display process is repeatedly executed during a period when the power of the HUD device 100 is turned on.
  • the control unit 170 acquires image data, which is an example of the detection information I1, from the imaging unit 320 via the image analysis unit 171 (step S101).
  • the control unit 170 analyzes the acquired image data via the image analysis unit 171 and determines whether or not there is another vehicle 400 in the image to be analyzed (step S102).
  • when determining that there is no other vehicle 400 in the image to be analyzed (step S102: NO), the control unit 170 ends the display process.
  • when determining that there is another vehicle 400 in the image to be analyzed (step S102: YES), the control unit 170 displays the first information image W1 at the first position P1 via the image generation unit 172 and the screen position control unit 173 (step S103).
  • the first position P1 is located behind the other vehicle 400 as shown in FIG.
  • the image generation unit 172 displays a square frame-shaped icon surrounding the periphery of the other vehicle 400 as the first information image W1.
  • the control unit 170 determines whether or not the other vehicle information signal St has been received from the other vehicle 400 (step S104).
  • when determining that the other vehicle information signal St has not been received (step S104: NO), the control unit 170 ends the display process. That is, when the other vehicle 400 traveling in front of the host vehicle 300 is of a type that does not transmit the other vehicle information signal St, or when the signal cannot be received from the other vehicle 400 for some reason, the display of the first information image W1 surrounding the other vehicle 400 is continued by repeating steps S101 to S104.
  • when determining that the other vehicle information signal St has been received (step S104: YES), the control unit 170 changes the display form of the information image W through steps S105 and S106. Specifically, the control unit 170 moves the screen 130 via the screen position control unit 173, so that the first information image W1 moves from the first position P1 to the second position P2, as shown in FIG. 5(b) (step S105). The second position P2 is located in front of the other vehicle 400. In step S105 the actual size of the first information image W1 is unchanged, but its display position is farther from the viewer U; therefore, as shown in FIG. 5, the first information image W1 appears smaller than the other vehicle 400 as viewed from the viewer U.
  • the control unit 170 displays the first information image W1 at the second position P2, and displays the second information image W2, which indicates the information included in the other vehicle information signal St, at the first position P1 (step S106).
  • the second information image W2 includes vehicle speed information Ds indicating the speed of the other vehicle 400 and traveling direction information Dt indicating the traveling direction of the other vehicle 400 with an arrow.
  • the controller 170 displays the traveling direction information Dt based on the car navigation information included in the other vehicle information signal St, and displays the vehicle speed information Ds based on the vehicle speed information included in the other vehicle information signal St.
  • the process according to the flowchart is thus completed.
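The first-embodiment flow of steps S101 to S106 above can be paraphrased as a short routine. This is a sketch, not the patent's implementation; `detect_vehicle`, `receive_signal`, and `show`, along with the image and position labels, are hypothetical names standing in for the units described in the text:

```python
def display_process(frame, hud):
    """One pass of the steps S101-S106 flow, paraphrased.

    `frame` is one camera capture (detection information I1); `hud` is an
    object assumed to expose detect_vehicle(), receive_signal(), and
    show(image, position).
    """
    vehicle = hud.detect_vehicle(frame)                # S101-S102
    if vehicle is None:
        return                                         # S102: NO -> end
    hud.show("W1_frame_icon", "P1_behind")             # S103: W1 at P1
    signal = hud.receive_signal(vehicle)               # S104
    if signal is None:
        return                                         # S104: NO -> keep W1 at P1
    hud.show("W1_frame_icon", "P2_ahead")              # S105: move W1 to P2
    hud.show("W2_speed_and_direction", "P1_behind")    # S106: W2 at P1
```

Run repeatedly while the HUD device is powered, this reproduces the behavior described above: the frame icon alone while only detection information is available, and the repositioned frame plus the information image once the other vehicle's signal arrives.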
  • the HUD device 100 displays the virtual image so as to be superimposed on the real scenery seen through the windshield 200 by emitting the display light L to the windshield 200 (translucent member).
  • the HUD device 100 includes a display 120 that emits the display light L, an image analysis unit 171 and a communication information acquisition unit 174 (object information acquisition unit) that acquire the detection information I1 and the communication information I2 regarding the other vehicle 400 (target object P) existing around the host vehicle 300, and an image generation unit 172 that displays the information image W as a virtual image at a position corresponding to the other vehicle 400 seen through the windshield 200.
  • the image generation unit 172 changes the display mode of the information image W when the acquired information is a combination of the detection information I1 and the communication information I2. According to this configuration, the viewer U can recognize the detection information I1 and the communication information I2 acquired by the HUD device 100 by changing the display mode of the information image W.
  • the HUD device 100 includes a communication information acquisition unit 174 (target information acquisition unit) that acquires communication information I2 related to the other vehicle 400 by communicating with the other vehicle 400.
  • when the communication information I2 is acquired, the image generation unit 172 changes the display mode of the information image W. According to this configuration, when, for example, the other vehicle 400 is traveling in front of the host vehicle 300, the viewer U can recognize from the display mode of the information image W whether or not the HUD device 100 has acquired the communication information I2 from the other vehicle 400.
  • the HUD device 100 includes the image analysis unit 171 (detection information acquisition unit), which acquires information (image data) captured by the imaging unit 320 as the detection information I1, and the communication information acquisition unit 174, which acquires information about the other vehicle 400 as the communication information I2 by communicating with the other vehicle 400.
  • the image generation unit 172 displays the first information image W1 as the information image W when only the detection information I1 is acquired, and changes the display mode of the information image W by displaying the second information image W2 in addition to the first information image W1 when the communication information I2 is acquired in addition to the detection information I1. According to this configuration, the viewer U can recognize from the change in the display mode of the information image W that the HUD device 100 has acquired the detection information I1 and the communication information I2.
  • the image generation unit 172 displays the first information image W1 at the first position P1 when the detection information I1 is acquired and, when the communication information I2 is acquired in addition to the detection information I1, changes the display mode by moving the first information image W1 from the first position P1 to the second position P2, which is farther from the host vehicle 300 than the first position P1, and displaying the second information image W2 at the first position P1. According to this configuration, the second information image W2 is displayed at a position close to the viewer U, so that the viewer U can more reliably recognize that the communication information I2 has been acquired from the other vehicle 400.
  • the control unit 170 may display the vehicle speed information Ds at a position closer to the host vehicle 300 as the absolute value of the relative speed, when negative, increases. This emphasizes to the viewer U that the host vehicle 300 has approached the other vehicle 400.
  • as shown in FIG. 7(b), when determining that the host vehicle 300 is moving away from the other vehicle 400, the control unit 170 moves the vehicle speed information Ds from the first position P1 toward the other vehicle 400.
  • the control unit 170 may display the vehicle speed information Ds at a position farther from the host vehicle 300 as the positive value of the relative speed increases, which emphasizes to the viewer U that the host vehicle 300 has moved away from the other vehicle 400. Note that the control unit 170 may determine that the host vehicle 300 has approached or moved away from the other vehicle 400 based on the image data captured by the imaging unit 320. Further, the control unit 170 may display the vehicle speed information Ds at a position closer to the host vehicle 300 as the distance between the host vehicle 300 and the other vehicle 400 detected by the image analysis unit 171 decreases.
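One plausible reading of the position rule in the bullets above maps the relative speed to a signed longitudinal offset of the label Ds, pulled toward the host vehicle while closing in and pushed away while separating, clamped to the display range. The gain and clamp values are invented for illustration:

```python
def speed_label_offset(relative_speed_kmh: float,
                       gain_m_per_kmh: float = 0.3,
                       max_offset_m: float = 5.0) -> float:
    """Longitudinal offset of the speed label Ds from its base position P1.

    Negative relative speed (the gap is shrinking) yields a negative offset,
    drawing the label toward the host vehicle; positive relative speed
    (the gap is growing) pushes it toward the other vehicle. The result is
    clamped so the label stays inside the displayable depth range.
    """
    offset = gain_m_per_kmh * relative_speed_kmh
    return max(-max_offset_m, min(max_offset_m, offset))
```

The same shape of mapping would serve the distance-based variant mentioned at the end of the paragraph: substitute the measured inter-vehicle distance for the relative speed and a suitable gain.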
  • the frame-shaped first information image W1 is displayed perpendicular to the front-rear direction Y, but the display direction of the first information image W1 is not limited to this; for example, as shown in FIGS. 8(a) and 8(b), it may be displayed obliquely with respect to the front-rear direction Y.
  • the control unit 170 can display the first information image W1 obliquely by displaying it in a time-division manner within the moving range C of the display surface B. As shown in FIGS. 6(a) and 6(b), when determining, based on the turn signal information, which is one type of information included in the other vehicle information signal St, that the left turn light of the other vehicle 400 is turned on, the control unit 170 changes the first information image W1 from the state in which it is displayed perpendicular to the front-rear direction Y to an oblique display in which the left side of the first information image W1 (the right side as viewed from the viewer U) is closer. Accordingly, as shown in FIG. 8(a), the first information image W1 emphasizes that the other vehicle 400 will take a course to the left.
  • when the control unit 170 determines, based on the turn signal information included in the other vehicle information signal St, that the right turn light of the other vehicle 400 is turned on, the first information image W1 is displayed obliquely so that its right side, as viewed from the viewer U, is closer. This emphasizes that the other vehicle 400 will take a course to the right. Note that the control unit 170 may blink the first information image W1 while it is displayed obliquely.
  • the HUD device 100 displays the predicted course of the other vehicle 400 as a predicted course moving image.
  • a display process executed by the control unit 170 (image generation unit 172) will be described with reference to the flowchart shown in FIG. Since the processes according to steps S201 to S203 in FIG. 9 are the same as the processes according to steps S101 to S103, the description thereof is omitted.
  • after the process of step S203, the control unit 170 acquires traveling direction information based on the car navigation information included in the other vehicle information signal St (communication information I2) via the communication information acquisition unit 174, and acquires lane position information via the image analysis unit 171 (step S204).
  • the control unit 170 then displays a predicted course moving image showing the predicted course of the other vehicle 400 based on the acquired traveling direction information and lane position information (step S205), and ends the display process.
  • the predicted course moving image is a moving image in which the icons appear to move along the predicted course, produced by displaying the frame-shaped icons AI1 to AI3 in order as the information image W.
  • the icons AI1 to AI3 are displayed at different positions in the front-rear direction Y and the vehicle width direction X, respectively.
  • the icon AI1 is displayed at a position corresponding to the travel lane in which the other vehicle 400 is traveling, in front of and near the other vehicle 400.
  • the icon AI3 is displayed at a position corresponding to the destination lane of the lane change of the other vehicle 400, farther forward from the other vehicle 400 than the icon AI1.
  • the icon AI2 is displayed at a position corresponding to the area between the travel lane and the destination lane, between the icon AI1 and the icon AI3 in the front-rear direction Y. Any number of icons may be displayed between the icons AI1 and AI3.
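The icon sequence AI1 to AI3 described above amounts to interpolating way-points between the current travel lane and the destination lane in both the vehicle-width (X) and front-rear (Y) directions. A sketch under that assumption, with coordinates in arbitrary units and the function name invented for illustration:

```python
def course_icon_positions(start_xy, end_xy, n_icons=3):
    """Way-points for the frame icons AI1..AIn of the course animation.

    Linearly interpolates from the position in the current travel lane
    (start_xy) to the position in the destination lane (end_xy); displaying
    the icons at these points in order produces the moving-image effect.
    """
    x0, y0 = start_xy
    x1, y1 = end_xy
    steps = [i / (n_icons - 1) for i in range(n_icons)]
    return [(x0 + t * (x1 - x0), y0 + t * (y1 - y0)) for t in steps]
```

With three icons this yields AI1 at the start, AI3 at the end, and AI2 midway between the two lanes; raising `n_icons` gives the denser intermediate icons the paragraph allows for.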
  • the image generation unit 172 changes the display mode of the information image W by displaying icons indicating the predicted course of the other vehicle 400. According to this configuration, the viewer U can recognize from the change in the display mode of the information image W that the HUD device 100 has acquired the car navigation information. Further, the viewer U can easily recognize the expected course of the other vehicle 400 from the icons AI1 to AI3 indicating that course.
  • the HUD device 100 displays the predicted course moving image, but may display a still image indicating the predicted course of the other vehicle 400.
  • the still image may be an arrow indicating the predicted course of the other vehicle 400.
  • the HUD device 100 changes the display position of the first information image W1 according to the automatic driving level of the other vehicle 400.
  • the display process executed by the control unit 170 (image generation unit 172) will be described with reference to the flowchart shown in FIG.
  • the processes related to steps S301 to S304 in FIG. 12 are the same as the processes related to steps S101 to S104 described above, and thus the description thereof is omitted.
  • when receiving the other vehicle information signal St (step S304: YES), the control unit 170 displays the first information image W1 at a position corresponding to the automatic driving level included in the other vehicle information signal St (step S305). Specifically, the control unit 170 changes the display mode of the first information image W1 by displaying it at a position closer to the host vehicle 300 as the automatic driving level is lower. The process according to the flowchart is thus completed.
  • the control unit 170 displays the first information image W1 for the other vehicle 400a at a position closer to the host vehicle 300 than the first information image W1 for the other vehicle 400b. Thereby, the first information image W1 for the other vehicle 400a, which has the lower automatic driving level, can be emphasized.
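The rule above, a lower automatic driving level means a display position closer to the host vehicle, could be realized as a simple linear mapping from level to display distance. All distances and the level range here are assumptions for illustration, not values from the patent:

```python
def w1_display_distance(autonomy_level: int,
                        near_m: float = 10.0,
                        far_m: float = 40.0,
                        max_level: int = 5) -> float:
    """Display distance of the first information image W1 from the host vehicle.

    Level 0 maps to the nearest position (most emphasis, most attention
    needed); max_level maps to the farthest. Out-of-range levels are clamped.
    """
    level = max(0, min(max_level, autonomy_level))
    return near_m + (far_m - near_m) * level / max_level
```

With these assumed values, a level-1 vehicle (400a) is framed at 16 m and a level-4 vehicle (400b) at 34 m, so the less automated vehicle stands out closer to the viewer.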
  • the image generation unit 172 acquires the other vehicle information signal St, including the automatic driving level, from the other vehicle 400 through the communication information acquisition unit 174, and changes the display mode of the first information image W1 according to the automatic driving level, which is a type of the acquired information. According to this configuration, the viewer U can recognize the automatic driving level of the other vehicle 400 from the change in the display mode of the first information image W1.
  • the image generation unit 172 changes the display mode of the first information image W1 so as to emphasize the other vehicle 400 having a low automatic driving level. This makes it possible to emphasize the other vehicle 400 having a low automatic driving level that requires more attention during driving.
  • the HUD device 100 changes the display mode of the first information image W1 depending on whether or not it is the other vehicle 400 that has acquired information.
  • the control unit 170 displays the information unacquired icon AI3, which is one of the first information images W1 (step S405).
  • the information unacquired icon AI3 consists of, for example, an "×" mark as shown in FIG. 13(a), and is located behind the other vehicle 400 as shown in FIG. 13(b).
  • the control unit 170 displays the information acquired icon AI4 which is one of the first information images W1 (step S406).
  • the information acquired icon AI4 consists of, for example, a circle ("○") mark as shown in FIG. 13(a), and is located behind the other vehicle 400 as shown in FIG. 13(b). The process according to the flowchart is thus completed.
  • the image generation unit 172 displays the information acquired icon AI4 as the first information image W1 at a position corresponding to the other vehicle 400 from which the other vehicle information signal St (communication information I2) was acquired through the communication information acquisition unit 174. In addition, when the image analysis unit 171 has determined that another vehicle 400 exists but the other vehicle information signal St cannot be acquired from that vehicle through the communication information acquisition unit 174, the image generation unit 172 displays the information unacquired icon AI3 as the first information image W1 at a position corresponding to that other vehicle 400.
  • the image generation unit 172 changes the display mode of the first information image W1 for each other vehicle 400 according to whether or not the communication information I2 is received. According to this configuration, the viewer U can recognize the success or failure of communication with the other vehicle 400 by visually recognizing the information unacquired icon AI3 or the information acquired icon AI4.
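The per-vehicle branch described above (steps S405/S406) can be sketched as a small selection function. The function name, the Optional return, and the exact icon glyphs are illustrative assumptions:

```python
from typing import Optional

INFO_UNACQUIRED_ICON_AI3 = "×"  # step S405: no other vehicle information signal St
INFO_ACQUIRED_ICON_AI4 = "○"    # step S406: signal St (communication information I2) received

def select_icon(vehicle_detected: bool, signal_received: bool) -> Optional[str]:
    """Choose the first information image W1 icon for one other vehicle."""
    if not vehicle_detected:
        return None                    # no vehicle analyzed -> nothing to annotate
    if signal_received:
        return INFO_ACQUIRED_ICON_AI4  # communication with the other vehicle succeeded
    return INFO_UNACQUIRED_ICON_AI3    # detected by the camera, but no St signal
```

Running the selection once per detected vehicle yields the per-vehicle display mode change the publication describes.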
  • the control unit 170 may change the display mode of the information image W according to the type, number, or combination of the information obtained. For example, the control unit 170 may change the display mode of the information image W when the type or number of pieces of obtained information is equal to or greater than a specific number. When the specific number is set to "2", the control unit 170 determines that the type or number of pieces of obtained information is equal to or greater than the specific number when both the detection information I1 and the communication information I2 have been obtained, and may change the display mode of the information image W accordingly. The display mode of the information image W may also be changed when the type or number of pieces of information included in the detection information I1 or in the communication information I2 is equal to or greater than the specific number.
  • the control unit 170 may change the display mode of the information image W by changing at least one of the color, brightness, size, shape, display position, display direction, rotation speed, and blinking cycle of the information image W.
  • the control unit 170 increases the luminance or size of the first information image W1, or brings the display position of the first information image W1 closer to the host vehicle 300, as the type or number of pieces of obtained information increases.
  • the control unit 170 may rotate the first information image W1 about a rotation axis along the front-rear direction Y, or about a rotation axis along the height direction of the host vehicle 300.
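A minimal sketch of the threshold rule above: count the information sources obtained (detection information I1, communication information I2) and change display attributes of the information image W once the count reaches the specific number. The attribute values and class layout are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class DisplayMode:
    luminance: float = 0.5  # normalized default brightness
    size: float = 1.0       # default scale factor

def update_display_mode(has_detection_i1: bool, has_communication_i2: bool,
                        specific_number: int = 2) -> DisplayMode:
    """Raise luminance and size of the information image W once the number
    of acquired information sources reaches the specific number."""
    mode = DisplayMode()
    if int(has_detection_i1) + int(has_communication_i2) >= specific_number:
        mode.luminance = 1.0  # brighter when both I1 and I2 are available
        mode.size = 1.5       # and larger, to draw the viewer's attention
    return mode
```

With `specific_number = 2`, only a vehicle for which both I1 and I2 were acquired gets the emphasized mode, matching the "2" example in the text.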
  • the HUD device 100 of the above embodiment may employ a parallax division method, including a parallax barrier method and a lenticular lens method; a spatial reproduction method, including a light field method and a hologram method; a transmittance-adjusting screen method, as disclosed in Patent Document 2, in which a plurality of screens each having a light control layer with adjustable transmittance are arranged in the thickness direction and a projector projects images toward the screens while switching the projected images at high speed, displaying a three-dimensional real image inside the apparatus by appropriately adjusting the dimming rate; or a method, as disclosed in Patent Document 3, of displaying a three-dimensional real image inside the apparatus by stacking a plurality of liquid crystal display elements in the thickness direction. Further, if the display 120 has a screen 130, the display 120 itself may be moved.
  • the HUD device 100 in the above embodiment may include a sensor, such as a millimeter-wave radar, that detects the situation around the host vehicle 300, instead of or together with the imaging unit 320.
  • the HUD device 100 detects the position and size of the object P based on the detection result (detection information I1) of the sensor.
  • the other vehicle information signal St in the above embodiment may include information related to the driver state.
  • the information relating to the driver state includes, for example, the driver's drowsiness, whether the driver is looking away, the driving duration, and any accident history.
  • when the HUD device 100 identifies, based on the other vehicle information signal St from the other vehicle 400, that the driver is drowsy, is looking away, or has an accident history, or that the driving duration has exceeded a predetermined time, the HUD device 100 changes the display mode of the information image W by displaying, at the position corresponding to that other vehicle 400, an information image W that emphasizes this fact.
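The driver-state variation could be condensed into a single predicate, as in the hedged sketch below; the field names and the duration threshold are assumptions for illustration, not values from the publication:

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    drowsy: bool = False
    looking_away: bool = False
    accident_history: bool = False
    driving_minutes: int = 0

def needs_emphasis(state: DriverState, max_driving_minutes: int = 120) -> bool:
    """True when any listed condition should trigger an emphasized
    information image W at the other vehicle's position."""
    return (state.drowsy
            or state.looking_away
            or state.accident_history
            or state.driving_minutes > max_driving_minutes)
```

When the predicate is true for a vehicle, the HUD would switch that vehicle's information image W to the emphasized display mode.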
  • the object P is the other vehicle 400, but is not limited to the other vehicle 400, and may be a structure existing around the host vehicle 300.
  • the object P may be a traffic light, a pedestrian, a bicycle, or the like.
  • the HUD device 100 may obtain the communication information I2 by wireless communication with a traffic light, a portable terminal carried by a pedestrian, or a bicycle.
  • the HUD device 100 includes the image analysis unit 171 and the communication information acquisition unit 174.
  • however, the image analysis unit 171 and the communication information acquisition unit 174 may be provided in the host vehicle 300 outside the HUD device 100.
  • in that case, the HUD device 100 acquires information related to the object via an input interface from the image analysis unit 171 and the communication information acquisition unit 174 provided outside the HUD device 100. That is, in this case, the input interface of the HUD device 100 corresponds to the object information acquisition unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)

Abstract

The invention concerns a head-up display (HUD) device capable of making a viewer recognize acquired information more reliably. The HUD device 100 emits display light toward a windshield to display a virtual image superimposed on the real scenery visible through the windshield. The HUD device 100 comprises: a display that emits the display light; an image analysis unit 171 and a communication information acquisition unit 174 that acquire detection information I1 and communication information I2 relating to another vehicle 400 present in the surroundings of a host vehicle 300; and an image generation unit 172 that generates an information image as the virtual image. The image generation unit 172 changes the display mode of the information image according to the type, number of pieces, or combination of the detection information I1 and communication information I2 that have been acquired.
PCT/JP2018/018745 2017-05-22 2018-05-15 Dispositif d'affichage tête haute WO2018216552A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017100522 2017-05-22
JP2017-100522 2017-05-22

Publications (1)

Publication Number Publication Date
WO2018216552A1 true WO2018216552A1 (fr) 2018-11-29

Family

ID=64396702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018745 WO2018216552A1 (fr) 2017-05-22 2018-05-15 Dispositif d'affichage tête haute

Country Status (1)

Country Link
WO (1) WO2018216552A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009072366A1 (fr) * 2007-12-05 2009-06-11 Bosch Corporation Dispositif d'affichage d'informations de véhicule
US20140063064A1 (en) * 2012-08-31 2014-03-06 Samsung Electronics Co., Ltd. Information providing method and information providing vehicle therefor
JP2015043012A (ja) * 2013-08-26 2015-03-05 アイシン・エィ・ダブリュ株式会社 ヘッドアップディスプレイ装置
JP2015096946A (ja) * 2013-10-10 2015-05-21 パナソニックIpマネジメント株式会社 表示制御装置、表示制御プログラム、および表示制御方法
JP2017021546A (ja) * 2015-07-10 2017-01-26 田山 修一 車輌用画像表示システム及び方法
WO2017026223A1 (fr) * 2015-08-07 2017-02-16 株式会社デンソー Système d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18805199

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18805199

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP