WO2020022239A1 - Display control device and head-up display device - Google Patents

Display control device and head-up display device

Info

Publication number
WO2020022239A1
Authority
WO
WIPO (PCT)
Prior art keywords
visibility
image
real object
related information
display
Application number
PCT/JP2019/028579
Other languages
French (fr)
Japanese (ja)
Inventor
貴生人 川手
Original Assignee
Nippon Seiki Co., Ltd. (日本精機株式会社)
Application filed by Nippon Seiki Co., Ltd.
Priority to JP2020532369A (patent JP7255596B2)
Publication of WO2020022239A1

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/001: using specific devices not provided for in groups G09G 3/02 - G09G 3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Arrangement of adaptations of instruments
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/10: Intensity circuits

Definitions

  • The present invention relates to a head-up display device that is used in a vehicle and superimposes a virtual image on the foreground of the vehicle for visual recognition, and to a display control device used in such a device.
  • The present invention has been made in view of the problems described above, and its object is to make an approaching real object easy to recognize while reducing the annoyance caused by images.
  • A first aspect of the display control device is a display control device that is mounted on a vehicle and displays, in a display area overlapping the foreground of the vehicle, a related information image for a real object present in the foreground. The device includes a first visibility adjustment unit that acquires the position of the real object and gradually lowers the visibility of the related information image as the distance to the real object becomes shorter or as the real object approaches the outer edge of the display area, and a second visibility adjustment unit that acquires the gaze range of the viewer and, when it is determined that the related information image has been gazed at, raises the visibility of the related information image and then gradually lowers it based on the travel distance of the vehicle or the elapsed time.
  • A real object that has approached the vehicle is of relatively low (or reduced) importance to the viewer (the driver of the vehicle), because the risk of collision has already been dealt with or the object is not of interest.
  • According to the first aspect, the visibility of the related information image decreases as the real object approaches the vehicle, so the viewer's visual attention is less likely to be drawn to the related information images of real objects whose collision risk has been addressed or that are not of interest. If, however, the viewer is interested in a real object that has approached the vehicle, or in its related information image, the visibility of the related information image is temporarily raised based on the viewer gazing at it, so information about the real object of interest is easy to recognize. After rising temporarily, the visibility of the related information image gradually decreases based on the travel distance of the vehicle or the elapsed time, which reduces the annoyance of the related information image, prevents the viewer from gazing at it for too long, and can contribute to safe driving of the vehicle. A rough sketch of these two behaviors follows this item.
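  • The sketch below (Python; all function names and numeric thresholds are hypothetical, not from the specification) models the distance-driven baseline visibility and the gaze-triggered boost that fades with travel distance:

```python
# Minimal sketch of the first aspect; names and values are hypothetical.

def baseline_visibility(distance_m, d_far=50.0, d_near=10.0, v_max=1.0):
    """First adjustment: visibility falls as the real object comes closer."""
    if distance_m >= d_far:
        return v_max
    if distance_m <= d_near:
        return 0.0
    return v_max * (distance_m - d_near) / (d_far - d_near)

def boosted_visibility(v_at_gaze, travelled_m, hold_m=30.0, fade_m=20.0, boost=0.3):
    """Second adjustment: after a gaze, raise visibility, hold it while the
    vehicle covers hold_m, then fade it to zero over the next fade_m."""
    v_peak = min(1.0, v_at_gaze + boost)
    if travelled_m <= hold_m:
        return v_peak
    return max(0.0, v_peak * (1.0 - (travelled_m - hold_m) / fade_m))
```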
  • FIG. 1 is a block diagram functionally showing the configuration of a head-up display device according to an embodiment of the present invention. FIG. 2 shows an example of a virtual image displayed by the head-up display device of the embodiment. FIG. 3 is a flowchart showing the first visibility adjustment process executed by the head-up display device of the embodiment. FIG. 4 is a flowchart showing the second visibility adjustment process executed by the head-up display device of the embodiment.
  • FIG. 5 shows an example of a virtual image when the first visibility adjustment process shown in FIG. 3 is executed.
  • FIG. 6 shows an example of a virtual image when the second visibility adjustment process shown in FIG. 4 is executed.
  • FIG. 7 shows the characteristic of the visibility of the related information image with respect to distance in the first visibility adjustment process shown in FIG. 3. FIG. 8 shows the characteristic of the visibility of the related information image with respect to time and travel distance when the second visibility adjustment process is executed before the first visibility adjustment process. FIG. 9 shows the characteristic of the visibility of the related information image with respect to time and travel distance when the second visibility adjustment process is executed after the first visibility adjustment process. FIG. 10 shows a modification of the characteristic of the visibility of the related information image with respect to time when the second visibility adjustment process is executed before the first visibility adjustment process. FIG. 11 shows a modification of the characteristic of the visibility of the related information image with respect to time when the second visibility adjustment process is executed after the first visibility adjustment process.
  • FIG. 1 is a block diagram showing the overall configuration of a head-up display device (hereinafter referred to as the HUD device 1) according to the present embodiment.
  • The HUD device 1 includes an image display unit 10 that displays a display image serving as the source of a virtual image V (see FIG. 2); a relay optical unit 10a, composed of one or more optical members such as reflective, refractive, and diffractive optical systems, that appropriately magnifies the display light of the display image displayed by the image display unit 10 and projects it toward a projection target 2 (for example, the front windshield of the vehicle); and a display control device 20 that controls the display of the image display unit 10.
  • The image display unit 10 is mainly composed of, for example, a projector (not shown) using a reflective display device such as a DMD or LCoS, and a screen (not shown) that receives the projection light from the projector, displays the display image, and emits the display light representing that image toward the relay optical unit 10a.
  • The image display unit 10 generates a virtual image in front of the viewer by displaying the display image on the screen based on image data input from the image generation unit 80 (display control device 20).
  • The image display unit 10 may instead be, for example, a transmissive display panel such as a liquid crystal display element, a self-luminous display panel such as an organic EL element, or a scanning display device that scans a laser beam.
  • When the image display unit 10 is composed of a projector such as a reflective display device or a scanning display device, the screen can be regarded as the display surface of the image display unit 10.
  • When the image display unit 10 is composed of a transmissive display panel or a self-luminous display panel, the display area of the panel can be regarded as the display surface of the image display unit 10.
  • The display control device 20 includes a first interface (identification information acquisition unit 30, position information acquisition unit 32, distance information acquisition unit 34, gaze position acquisition unit 40, and vehicle state acquisition unit 50); a second interface 60 that, instead of or together with the first interface, acquires the various information necessary for the processing of the display processing unit 70; and a display processing unit 70 that executes at least the display control of a related information image V1 (a type of virtual image V) described later.
  • The first interface (identification information acquisition unit 30, position information acquisition unit 32, distance information acquisition unit 34, gaze position acquisition unit 40, and vehicle state acquisition unit 50) and the second interface 60 are communicably connected to external devices.
  • The external devices include a real object identification unit 31, a real object position detection unit 33, a real object distance detection unit 35, a line-of-sight detection unit 41, a position detection unit 51, a direction detection unit 52, a posture detection unit 53, a cloud server (external server) 500, a vehicle ECU 600, a navigation system 700, and devices not shown, such as other in-vehicle devices in the own vehicle, mobile devices in the own vehicle, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P).
  • The HUD device 1 acquires various information and the image element data from which the virtual image V is generated from the external devices via these interfaces, and displays the virtual image V according to the processing of the display processing unit 70.
  • Part or all of the image element data may be stored in a storage unit 71 described later, and an image generation unit 80 described later may read out the image element data stored in the storage unit 71 according to an instruction from the display processing unit 70 and generate image data that causes the image display unit 10 to display the display image on which the virtual image V is based.
  • The identification information acquisition unit 30 is an input interface that acquires identification information of a real object 300 present in the foreground 200 of the own vehicle. For example, it acquires the identification information from a real object identification unit 31 composed of one or more cameras and sensors mounted on the own vehicle, and outputs it to the display processing unit 70.
  • For example, the real object identification unit 31 has an identification unit (not shown) that captures (detects) the foreground 200 of the own vehicle and analyzes the captured (detected) data to generate identification information of the real object 300 in the foreground 200.
  • For example, the identification information acquired by the identification information acquisition unit 30 is the type of the real object 300.
  • The type of the real object 300 is, for example, an obstacle (another vehicle, a person, an animal, an object), a road sign, a road, or a building, but is not limited to these as long as the object exists in the foreground 200 and can be identified.
  • The position information acquisition unit 32 is an input interface that acquires the position of the real object 300 present in the foreground 200 of the own vehicle. For example, it acquires the position of the real object 300 from a real object position detection unit 33 composed of one or more cameras and sensors mounted on the own vehicle, and outputs it to the display processing unit 70.
  • For example, the real object position detection unit 33 has a position analysis unit (not shown) that captures (detects) the foreground 200 of the own vehicle and analyzes the captured (detected) data to generate position information of the real object 300 in the foreground 200.
  • In this case, the position information of the real object 300 acquired by the position information acquisition unit 32 includes a two-dimensional position (vertical and horizontal directions) as seen from the viewer, for example, the real coordinates of the real object 300 in the foreground 200 of the own vehicle or the coordinates of the real object 300 in the display area 11.
  • The distance information acquisition unit 34 is an input interface that acquires the distance to the real object 300 present in the foreground 200 of the own vehicle. For example, it acquires the distance to the real object 300 from a real object distance detection unit 35 composed of one or more cameras and sensors mounted on the own vehicle, and outputs it to the display processing unit 70.
  • For example, the real object distance detection unit 35 has a distance analysis unit (not shown) that captures (detects) the foreground 200 of the own vehicle and analyzes the captured (detected) data to generate distance information to the real object 300.
  • In this case, the distance information of the real object 300 acquired by the distance information acquisition unit 34 is the distance from the own vehicle or from the HUD device 1 to the real object 300.
  • The analysis units described above may be provided in the HUD device 1; specifically, they may be provided in the display processing unit 70. In this case, the identification information acquired by the identification information acquisition unit 30 (the position information acquired by the position information acquisition unit 32 and the distance information acquired by the distance information acquisition unit 34) is the imaging (detection) data obtained when the real object identification unit 31 (the real object position detection unit 33 and the real object distance detection unit 35) images (detects) the foreground 200.
  • The identification information acquisition unit 30, the position information acquisition unit 32, and the distance information acquisition unit 34 may share a common camera, sensor, and analysis unit.
  • Hereinafter, the identification information, the position information, and the distance information of the real object 300 are also collectively referred to as real object information.
  • The second interface 60 may function as the identification information acquisition unit 30, the position information acquisition unit 32, and/or the distance information acquisition unit 34.
  • The second interface 60 may acquire the real object information from the cloud server 500, the vehicle ECU 600, the navigation system 700, other in-vehicle devices in the own vehicle (not shown), mobile devices in the own vehicle, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), or pedestrians' portable devices (vehicle-to-pedestrian communication V2P).
  • For example, the second interface 60 may acquire, from the cloud server 500, the vehicle ECU 600, or the navigation system 700, the position information, identification information, and coordinate information (distance information) of real objects 300 such as road signs, roads, and buildings, together with map information.
  • Specifically, the display processing unit 70 may output the position information of the own vehicle or of the HUD device 1 (direction information of the own vehicle may be added) to an external device via the second interface 60, and the external device, based on the input position information (and optional direction information) of the own vehicle or the HUD device 1, may output the position information, identification information, and distance information of the real objects 300 present in the foreground 200 of the own vehicle to the second interface 60.
  • In this case, the identification information may be individual identification information finer-grained than the type of the real object 300.
  • The gaze position acquisition unit 40 is an input interface that acquires gaze information about the range at which the viewer is gazing (gaze range GZ).
  • For example, the gaze position acquisition unit 40 acquires the gaze range GZ from a line-of-sight detection unit 41 including a camera mounted on the vehicle, and outputs it to the display processing unit 70.
  • The gaze range GZ of the viewer is the area that the viewer is watching (gazing at), including the gaze point of the eyes, and is a certain range that a person can normally gaze at.
  • The method of setting the gaze range GZ of the viewer is not limited to this; for example, as described in JP-A-2015-126451, the line-of-sight detection unit 41 may set, as the gaze range GZ, a rectangular area including a plurality of gaze points detected over a predetermined time (for example, one second). The sketch after this item illustrates one way such a rectangle could be computed.
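  • Assuming the gaze points arrive as (x, y) tuples collected over the one-second window, a minimal version (the padding value is hypothetical) could be:

```python
def gaze_range(gaze_points, pad=0.5):
    """Return the gaze range GZ as (left, top, right, bottom): the bounding
    rectangle of the gaze points collected over the window, plus padding."""
    xs = [x for x, _ in gaze_points]
    ys = [y for _, y in gaze_points]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)
```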
  • The vehicle state acquisition unit 50 acquires information on the state (position, direction, posture) of the vehicle.
  • For example, the vehicle state acquisition unit 50 acquires the position information of the vehicle or of the HUD device 1 detected by a position detection unit 51 such as a GNSS (Global Navigation Satellite System) receiver, the direction information of the vehicle detected by a direction detection unit 52 including a direction sensor, and the posture information of the vehicle detected by a posture detection unit 53.
  • The display processing unit 70 includes at least one processor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), and/or at least one semiconductor integrated circuit such as a field-programmable gate array (FPGA).
  • The at least one processor can execute all or part of the functions of the display processing unit 70, including the notification necessity determination unit 72, the first visibility adjustment unit 73, and the second visibility adjustment unit 74, by reading one or more instructions from at least one computer-readable tangible recording medium.
  • Such recording media include any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, and any type of semiconductor memory, both volatile and non-volatile. Volatile memory includes DRAM and SRAM, and non-volatile memory includes ROM and NVRAM.
  • A semiconductor memory may also be a semiconductor circuit that forms part of a circuit together with at least one processor.
  • The ASIC is an integrated circuit customized to execute all or part of the functional blocks shown in FIG. 1, and the FPGA is an integrated circuit designed so that, after manufacture, it can execute all or part of the functional blocks shown in FIG. 1.
  • The display processing unit 70 may include an image generation unit 80 that generates image data for causing the image display unit 10 to display the display image (the image on which the virtual image V is based); some or all of the functions of the image generation unit 80 may instead be provided separately from the display control device 20 or from the HUD device 1.
  • The notification necessity determination unit 72 determines the notification necessity of the real object 300 based on information acquired from the identification information acquisition unit 30, the position information acquisition unit 32, and the distance information acquisition unit 34, and determines whether the viewer should be notified. When the notification necessity becomes higher than a predetermined threshold, it determines that the viewer needs to be notified.
  • The notification necessity is, for example, a degree of danger derived from how serious a situation the real object 300 could cause, or a degree of urgency derived from the length of the reaction time required to take a reactive action.
  • The notification necessity may be determined only from the real object information on the real object 300 acquired from the identification information acquisition unit 30, the position information acquisition unit 32, and the distance information acquisition unit 34, or may be determined based on various information input from other interfaces in addition to the real object information.
  • Essentially, the notification necessity determination unit 72 only needs a function of determining, based on the notification necessity, whether to notify the viewer of the existence of the real object 300; the function of determining the notification necessity itself need not be provided, and part or all of it may be provided separately from the display control device 20. A minimal sketch of this determination follows this item.
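  • Assuming the danger and urgency measures are already normalized to [0, 1], the threshold test could look like this (the combination rule and threshold value are hypothetical, not from the specification):

```python
def notification_necessity(danger, urgency):
    # Hypothetical combination: take the stronger of the two measures.
    return max(danger, urgency)

def should_notify(danger, urgency, threshold=0.5):
    """Notify the viewer only when the necessity exceeds the threshold."""
    return notification_necessity(danger, urgency) > threshold
```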
  • The display processing unit 70 instructs the image generation unit 80 to display at least the related information image V1 (and, in addition, an emphasized image V2) for a real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary.
  • The determination of the notification necessity is not limited to methods using the degree of danger or urgency; the notification necessity determination unit 72 may determine that notification is necessary on other grounds.
  • The real object 300 need not be visible to the viewer: even when the real object 300 is hidden behind another object in the foreground 200 or is too far away to be visually recognized, the display processing unit 70 may instruct the image generation unit 80 to display the related information image V1 for the real object 300 requiring notification at a predetermined position, or to display the emphasized image V2 in the direction of the real object 300 requiring notification.
  • The first visibility adjustment unit 73 determines whether related information exists for the real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary, and, if it exists, instructs the image generation unit 80 to display it.
  • For example, the first visibility adjustment unit 73 acquires the identification information of the real object 300 from the identification information acquisition unit 30 or the second interface 60, and determines that related information exists if related information corresponding to the identification information is found in the storage unit 71 or in an external device accessible via the second interface 60.
  • The first visibility adjustment unit 73 adjusts the visibility of the virtual image V (the related information image V1 and the emphasized image V2). The "visibility" here is changed, for example, by changing the luminance, brightness, or color of the virtual image V, or a combination of these.
  • The first visibility adjustment unit 73 changes the visibility of the related information image V1 based on the distance information of the real object 300 acquired from the real object distance detection unit 35. Specifically, the first visibility adjustment unit 73 gradually lowers the visibility of the related information image V1 as the real object 300 approaches.
  • Alternatively, the first visibility adjustment unit 73 may change the visibility of the related information image V1 according to the position of the real object 300 in the display area 11 (or the position of the emphasized image V2 displayed on the real object 300). Specifically, the first visibility adjustment unit 73 identifies the position of the real object 300 in the display area 11 based on the position information of the real object 300 acquired from the real object position detection unit 33, and may gradually lower the visibility of the related information image V1 as the real object 300 approaches the outer edge of the display area 11. The process in which the first visibility adjustment unit 73 adjusts the visibility based on the distance information (the depth direction as seen from the viewer) or the position information (the vertical and/or horizontal direction as seen from the viewer) of the real object 300 in this way is referred to as the first visibility adjustment process. A sketch of the edge-proximity variant follows this item.
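  • Assuming the position of the real object 300 is given in display-area coordinates, the fade could be computed as follows (the margin value is hypothetical):

```python
def edge_proximity_visibility(x, y, area_w, area_h, margin=0.1, v_max=1.0):
    """Fade the related information image as the real object nears the outer
    edge of the display area; (0, 0)..(area_w, area_h) spans the area 11."""
    fade_band = margin * min(area_w, area_h)
    d_edge = min(x, y, area_w - x, area_h - y)   # distance to nearest edge
    if d_edge >= fade_band:
        return v_max
    return max(0.0, v_max * d_edge / fade_band)
```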
  • The second visibility adjustment unit 74 determines whether the viewer is visually recognizing the real object 300 (or the emphasized image V2 displayed on the real object 300) or the related information image V1.
  • For example, the second visibility adjustment unit 74 acquires the position information of the currently displayed related information image V1 and emphasized image V2 stored in the storage unit 71, acquires the position information of the real object 300 from the position information acquisition unit 32, acquires the gaze range GZ from the gaze position acquisition unit 40, and makes the determination based on the position information of the related information image V1 or the emphasized image V2 and the gaze range GZ.
  • When the viewer visually recognizes the real object 300 (or the emphasized image V2 displayed on the real object 300) or the related information image V1, the second visibility adjustment unit 74 can execute a second visibility adjustment process that first raises the visibility of the related information image V1 and then lowers it. That is, the first visibility adjustment process is executed when the viewer is visually recognizing neither the real object 300 (nor the emphasized image V2 displayed on the real object 300) nor the related information image V1, and the second visibility adjustment process is executed when the viewer visually recognizes the real object 300 (or the emphasized image V2 displayed on the real object 300) or the related information image V1, as sketched below.
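  • A minimal sketch of the gaze test and the resulting dispatch between the two processes (rectangles as (left, top, right, bottom) tuples; both the overlap test and the callable interface are assumptions, not the specification's method):

```python
def is_gazed(gz, img):
    """True when the related information image rectangle overlaps GZ."""
    gl, gt, gr, gb = gz
    il, it, ir, ib = img
    return not (ir < gl or il > gr or ib < gt or it > gb)

def adjust(gz, img, visibility, first_process, second_process):
    # Gazed: boost then decay; not gazed: decay with approach/edge proximity.
    if is_gazed(gz, img):
        return second_process(visibility)
    return first_process(visibility)
```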
  • Next, the virtual image V displayed by the HUD device 1 will be described with reference to FIG. 2.
  • In the example of FIG. 2, the real object 300 is a parking lot, the emphasized image V2 is displayed so as to indicate the parking lot, and the related information image V1, showing a "P" indicating the parking lot, is displayed at the upper end portion 12 of the display area 11.
  • The display processing unit 70 displays the emphasized image V2 near the real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary (or at the position where the real object 300 exists even if it cannot be visually recognized), and displays the related information image V1 of the real object 300 at a predetermined specific position in the display area 11 (the upper end portion 12 in the example of FIG. 2).
  • The specific position is preferably the upper end portion 12 or the lower end portion 13 obtained when the display area 11 in which the HUD device 1 can display the virtual image V is divided vertically into four equal parts, but the upper end portion 12 or the lower end portion 13 obtained by dividing the display area 11 vertically into three equal parts may also be used.
  • It is preferable that the display area 11 have, below the upper end portion 12 in which the related information image V1 is displayed, a region whose vertical width is about twice the vertical width of the upper end portion 12 (the length between the upper end of the display area 11 and the lower end of the related information image V1), or, when the lower end portion 13 is used, a region above the lower end portion 13 whose vertical width is about twice the vertical width of the lower end portion 13 (the length between the lower end of the display area 11 and the upper end of the related information image V1). A small layout sketch follows this item.
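  • Assuming display-area coordinates with the origin at the top-left, the bands could be computed as follows (a hypothetical helper, not from the specification):

```python
def v1_band(area_w, area_h, divisions=4, use_upper=True):
    """Return (left, top, right, bottom) of the band holding the related
    information image V1: upper end portion 12 or lower end portion 13."""
    band_h = area_h / divisions
    if use_upper:
        return (0.0, 0.0, area_w, band_h)           # upper end portion 12
    return (0.0, area_h - band_h, area_w, area_h)   # lower end portion 13
```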
  • FIG. 3 is a flowchart illustrating an example of a main procedure of a first visibility adjustment process related to display control of the HUD device 1 of the present embodiment.
  • First, the first visibility adjustment process is started.
  • Real object information (identification information, position information, distance information) is acquired (step S1), and it is determined whether the real object 300 is one of which the viewer should be notified (step S2). If YES, the process proceeds to the next step (step S3); if NO, the first visibility adjustment process ends.
  • In step S4, it is determined whether the display condition of the related information image V1 is satisfied. If YES, the related information image V1 is displayed (step S5); if NO, the related information image V1 is not displayed or, if it is already displayed, it is hidden (step S6), and the first visibility adjustment process ends.
  • The display condition of the related information image V1 in step S4 includes at least the existence of related information for the real object 300, and may additionally include a condition under which the priority (necessity) of the related information image V1 can be determined to be high according to the situation.
  • Next, it is determined whether the condition for executing the first visibility adjustment process is satisfied (step S7). If YES, the visibility of the related information image V1 is lowered based on the distance information or the position information of the real object 300 acquired in step S1 (step S8, also referred to as the first visibility adjustment process); if NO, the first visibility adjustment process ends.
  • The condition for executing the first visibility adjustment process in step S7 is, for example, that the real object 300 is within a predetermined distance Dm (see FIG. 7) described later, or that the real object 300 is within a predetermined area (not shown) close to the outer edge of the display area 11.
  • In other words, the visibility of the related information image V1 is gradually lowered as the real object 300 approaches, or as the real object 300 approaches the outer edge of the display area 11. By automatically lowering the visibility of the related information of a real object 300 whose notification priority (necessity) has become lower as it approaches the vehicle, the image is made less likely to draw the viewer's visual attention, and the viewer can concentrate on driving the vehicle. The whole pass is sketched below.
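  • Here `hud` stands for a hypothetical object bundling the acquisition units and display state, so every method name is an assumption rather than the specification's API:

```python
def first_visibility_pass(hud):
    """One pass through steps S1-S8 of FIG. 3 (hypothetical interface)."""
    obj = hud.acquire_real_object_info()            # S1
    if not hud.should_notify(obj):                  # S2: NO -> end
        return
    if not hud.display_condition_met(obj):          # S4: NO -> hide and end
        hud.hide_related_image(obj)                 # S6
        return
    hud.show_related_image(obj)                     # S5
    if obj.distance < hud.Dm or hud.near_display_edge(obj):   # S7
        hud.lower_visibility(obj)                   # S8
```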
  • FIG. 4 is a flowchart illustrating an example of a main procedure of a second visibility adjustment process related to display control of the HUD device 1 according to the present embodiment.
  • Next, the second visibility adjustment process is started.
  • When the second visibility adjustment process ends, the routine is restarted from the first visibility adjustment process.
  • First, the gaze range GZ of the viewer is acquired (step S11), and it is determined whether the related information image V1 is being gazed at (step S12). If YES, the process proceeds to step S13; if NO, the second visibility adjustment process ends.
  • In step S13, it is determined whether the related information image V1 determined in step S12 to be gazed at is currently undergoing the first visibility adjustment process, in other words, whether the condition for the first visibility adjustment process of step S8 is satisfied.
  • If YES in step S13, the current visibility of the related information image V1 is acquired (step S14), the visibility is raised by a predetermined value with the current visibility as a reference (step S15), and thereafter the visibility is lowered (step S16); the second visibility adjustment process then ends, and the routine is restarted from the first visibility adjustment process.
  • If NO in step S13, the visibility is raised by a predetermined value with the normal visibility of the related information image V1 as a reference (step S15), and thereafter the visibility is lowered (step S16); the second visibility adjustment process then ends, and the routine is restarted from the first visibility adjustment process.
  • The processing from step S13 to step S16 is referred to as the second visibility adjustment process. That is, when the viewer visually recognizes the related information image V1, the first visibility adjustment process, in which the visibility gradually decreases, is interrupted and the visibility is raised; this avoids a situation in which the related information image V1 being viewed gradually becomes difficult to recognize, and makes the related information image V1 easy to recognize. A sketch of this pass follows.
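  • This uses the same hypothetical `hud` interface as the sketch of the first pass above:

```python
def second_visibility_pass(hud, image):
    """One pass through steps S11-S16 of FIG. 4 (hypothetical interface)."""
    gz = hud.acquire_gaze_range()                   # S11
    if not hud.is_gazed(gz, image):                 # S12: NO -> end
        return
    if hud.in_first_adjustment(image):              # S13: YES
        base = hud.current_visibility(image)        # S14: boost from current
    else:                                           # S13: NO
        base = hud.normal_visibility                # boost from normal (Lm)
    hud.set_visibility(image, base + hud.boost)     # S15
    hud.schedule_decay(image)                       # S16: fade over distance/time
```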
  • FIG. 5 shows the foreground 200 visually recognized through the projection target 2 (the front windshield of the vehicle) when the viewer faces forward, together with the virtual image V displayed when the first visibility adjustment process of the present embodiment is executed.
  • While the real object 300 is far away (farther than the predetermined distance Dm), the result of step S7 in FIG. 3 is NO, and the related information image V1, on which the first visibility adjustment process of step S8 has not been executed, and a rectangular emphasized image V2 near the real object 300 are displayed. At this time, the visibility of the related information image V1 is the initial value Lm (see FIG. 7) described later.
  • When the real object 300 approaches (comes closer than the predetermined distance Dm) as the vehicle travels, the result of step S7 in FIG. 3 is YES, the related information image V1 on which the first visibility adjustment process of step S8 has been executed is displayed, and the rectangular emphasized image V2 is displayed near the real object 300. At this time, the visibility of the related information image V1 is set lower than the initial value Lm.
  • As the real object 300 comes still closer, the related information image V1 undergoes the first visibility adjustment process of step S8 and its visibility is further lowered; once the real object 300 is outside the display area 11 as seen from the viewer, the visibility of the emphasized image V2 is set to zero (non-display). That is, in the example of the first visibility adjustment process shown in FIG. 5, the visibility of the related information image V1 is lowered as the real object 300 approaches. In the example of FIG. 5, the visibility of the emphasized image V2 is also lowered as the distance to the real object 300 decreases (or as the real object 300 approaches the outer edge of the display area 11), but the process is not limited to this.
  • FIG. 6 shows the foreground 200 visually recognized through the projection target 2 (the front windshield of the vehicle) when the viewer faces forward, together with the virtual image V displayed when the second visibility adjustment process of the present embodiment is executed.
  • While the real object 300 is far away (farther than the predetermined distance Dm), the result of step S7 in FIG. 3 is NO, and the related information image V1, on which the first visibility adjustment process of step S8 has not been executed, and a rectangular emphasized image V2 near the real object 300 are displayed.
  • When the related information image V1 is visually recognized by the viewer (when the related information image V1 is included in the gaze range GZ), the result of step S12 in FIG. 4 is YES, and the related information image V1 whose visibility has been raised in step S15 is displayed, together with the rectangular emphasized image V2 near the real object 300.
  • The visibility of the related information image V1 is raised by a predetermined value with its current visibility as a reference if the related information image V1 is undergoing the first visibility adjustment process, and with the normal visibility (initial value Lm) as a reference if it is not.
  • Thereafter, the related information image V1 whose visibility has been lowered in step S16 of FIG. 4 and the rectangular emphasized image V2 near the real object 300 are displayed. That is, in the example of the second visibility adjustment process shown in FIG. 6, the related information image V1 is highlighted when it is visually recognized, and its visibility is lowered thereafter.
  • FIG. 7 is a characteristic diagram showing the relationship between the distance to the real object 300 and the visibility of the related information image V1 in the first visibility adjustment process of the present embodiment. When the distance to the real object 300 is equal to or greater than a predetermined first distance threshold Dm, the result of step S7 in FIG. 3 is NO and the related information image V1 is maintained at the initial visibility Lm. When the distance is less than the first distance threshold Dm, the result of step S7 in FIG. 3 is YES and the process proceeds to step S8: the visibility is gradually lowered as the distance becomes shorter, reaching zero (non-display) at a second distance threshold Ds (the first visibility adjustment process is executed). The relationship between distance and visibility between the first distance threshold Dm and the second distance threshold Ds is linear in the example of FIG. 7, but it may be non-linear, and the visibility may instead be changed in steps according to the distance. This characteristic is sketched below.
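  • The numeric threshold values here are placeholders; the specification only names Dm, Ds, and Lm:

```python
def fig7_visibility(d, Dm=50.0, Ds=10.0, Lm=1.0):
    """Visibility of V1 versus distance d: Lm at or beyond Dm, linear decay
    between Dm and Ds, zero (non-display) at or inside Ds."""
    if d >= Dm:
        return Lm
    if d <= Ds:
        return 0.0
    return Lm * (d - Ds) / (Dm - Ds)
```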
  • FIG. 8 is a time chart illustrating the change in the visibility of the related information image V1 caused by the second visibility adjustment process.
  • FIG. 8A is a time chart of the visibility of the related information image V1 under the second visibility adjustment process when the viewer gazes at the related information image V1 at the distance D1 in FIG. 7 (time t10), before the first visibility adjustment process has started; FIG. 8B is a time chart of the visibility of the related information image V1 under the second visibility adjustment process when the viewer gazes at the related information image V1 at the distance D2 in FIG. 7 (time t20), after the first visibility adjustment process has started.
  • In FIG. 8A, when the related information image V1 is gazed at, its visibility is gradually raised by a predetermined value over a fixed first period Ta (for example, one second), with the visibility at that time (the initial visibility Lm) as a reference. The visibility is then kept constant until the vehicle has traveled a predetermined first traveling distance La, and is thereafter gradually lowered so that it reaches zero by the time the vehicle has traveled a further second traveling distance Lb.
  • In FIG. 8B, the visibility at that time (the visibility L2, adjusted according to the distance D2 by the first visibility adjustment process illustrated in FIG. 7) is used as the reference: the visibility is gradually raised by a predetermined value over the fixed first period Ta, kept constant until the vehicle has traveled the fixed first traveling distance La, and then gradually lowered so that it reaches zero before the vehicle has traveled a further third traveling distance Lc.
  • The rate of decrease in visibility with respect to the increase in traveling distance is preferably constant regardless of the visibility at the time the second visibility adjustment process is executed (Lm at time t10 in FIG. 8A, L2 (< Lm) at time t20 in FIG. 8B), but is not limited to this; for example, the lower the visibility at the time the second visibility adjustment process is executed, the steeper the rate of decrease (the decrease in visibility per unit traveling distance) may be made. A sketch of this travel-distance profile follows.
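  • In this sketch the rise, which in the specification happens over the time period Ta, is approximated by a distance ta_m; all numeric values are hypothetical:

```python
def fig8_profile(l, v0, dv=0.2, ta_m=5.0, la=30.0, lb=20.0):
    """Visibility versus distance l travelled since the gaze: ramp up by dv
    (over ta_m, standing in for period Ta), hold for la, then fade over lb."""
    v_peak = v0 + dv
    if l <= ta_m:                         # gradual rise
        return v0 + dv * (l / ta_m)
    if l <= ta_m + la:                    # hold while travelling La
        return v_peak
    fade = (l - ta_m - la) / lb           # linear fade over Lb
    return max(0.0, v_peak * (1.0 - fade))
```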
  • FIG. 9 is a time chart illustrating the change in the visibility of the related information image V1 caused by the second visibility adjustment process.
  • FIG. 9A is a time chart of the visibility of the related information image V1 under the second visibility adjustment process when the viewer gazes at the related information image V1 at the distance D1 in FIG. 7 (time t30), before the first visibility adjustment process has started; FIG. 9B is a time chart of the visibility of the related information image V1 under the second visibility adjustment process when the viewer gazes at the related information image V1 at the distance D2 in FIG. 7 (time t40), after the first visibility adjustment process has started.
  • In FIG. 9A, the visibility is gradually raised by a predetermined value over the fixed first period Ta, with the visibility at that time (the initial visibility Lm) as a reference, kept constant until a fixed second period Tb has elapsed, and thereafter gradually lowered so that it reaches zero by the time a third period Tc has elapsed.
  • In FIG. 9B, the visibility at that time (the visibility L2, adjusted according to the distance D2 by the first visibility adjustment process illustrated in FIG. 7) is used as the reference: the visibility is gradually raised by a predetermined value over the fixed first period Ta, kept constant until the fixed second period Tb has elapsed, and then gradually lowered so that it reaches zero before a fourth period Td (< Tc) has elapsed.
  • The rate of decrease in visibility with respect to the passage of time is preferably constant regardless of the visibility at the time the second visibility adjustment process is executed (Lm at time t30 in FIG. 9A, L2 (< Lm) at time t40 in FIG. 9B), but is not limited to this; for example, the lower the visibility at the time the second visibility adjustment process is executed, the steeper the rate of decrease (the decrease in visibility per unit time) may be made.
  • As a modification, instead of raising the visibility by a fixed value over the fixed first period Ta from the visibility adjusted in the first visibility adjustment process, the visibility may be changed to a specific visibility (for example, the normal visibility Lm).
  • Reference signs: 1 ... HUD device (head-up display device), 2 ... windshield (projection target), 10 ... image display unit, 10a ... relay optical unit, 11 ... display area, 12 ... upper end portion, 13 ... lower end portion, 20 ... display control device, 30 ... identification information acquisition unit, 31 ... real object identification unit, 32 ... position information acquisition unit, 33 ... real object position detection unit, 34 ... distance information acquisition unit, 35 ... real object distance detection unit, 40 ... gaze position acquisition unit, 41 ... line-of-sight detection unit, 50 ... vehicle state acquisition unit, 51 ... position detection unit, 52 ... direction detection unit, 53 ... posture detection unit, 60 ... second interface, 70 ... display processing unit, 71 ... storage unit, 72 ... notification necessity determination unit, 73 ... first visibility adjustment unit, 74 ... second visibility adjustment unit, 80 ... image generation unit, 200 ... foreground, 300 ... real object, 500 ... cloud server, 600 ... vehicle ECU, 700 ... navigation system, GZ ... gaze range, V ... virtual image, V1 ... related information image, V2 ... emphasized image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention makes it easy to recognize a real object that has approached, while reducing the annoyance caused by images. A first visibility adjustment unit 73 acquires the position of a real object 300 and gradually lowers the visibility of a related information image V1 as the distance to the real object 300 decreases, or as the real object 300 approaches the outer edge of a display area 11. A second visibility adjustment unit 74 acquires the gaze range GZ of a viewer, raises the visibility of the related information image V1 when it is determined that the related information image V1 has been gazed at, and then gradually lowers the visibility based on the travel distance of the vehicle or the elapsed time.

Description

Display control device and head-up display device
 The present invention relates to a head-up display device that is used in a vehicle and superimposes a virtual image on the foreground of the vehicle for visual recognition, and to a display control device used in such a device.
 In the head-up display device disclosed in Patent Document 1, when a real object present in the foreground approaches, the emphasized image displayed for the real object is perceived as annoying; therefore, when a real object approaches, the visibility of the emphasized image displayed for the real object is lowered (in Patent Document 1, the image is darkened or made smaller).
Patent Document 1: JP-A-2015-11666
 However, while lowering the visibility of the image for an approaching real object can reduce the annoyance caused by the image, it increases the possibility of missing the approaching real object.
 The present invention has been made in view of the problems described above, and its object is to make an approaching real object easy to recognize while reducing the annoyance caused by images.
 A first aspect of the display control device is a display control device that is mounted on a vehicle and displays, in a display area overlapping the foreground of the vehicle, a related information image for a real object present in the foreground. The device includes a first visibility adjustment unit that acquires the position of the real object and gradually lowers the visibility of the related information image as the distance to the real object becomes shorter or as the real object approaches the outer edge of the display area, and a second visibility adjustment unit that acquires the gaze range of the viewer and, when it is determined that the related information image has been gazed at, raises the visibility of the related information image and then gradually lowers it based on the travel distance of the vehicle or the elapsed time.
 A real object that has approached the vehicle is of relatively low (or reduced) importance to the viewer (the driver of the vehicle), because the risk of collision has already been dealt with or the object is not of interest. According to the first aspect, the visibility of the related information image decreases as the real object approaches the vehicle, so the viewer's visual attention is less likely to be drawn to the related information images of real objects whose collision risk has been addressed or that are not of interest. If, however, the viewer is interested in a real object that has approached the vehicle, or in its related information image, the visibility of the related information image is temporarily raised based on the viewer gazing at it, so information about the real object of interest is easy to recognize. After rising temporarily, the visibility of the related information image gradually decreases based on the travel distance of the vehicle or the elapsed time, which reduces the annoyance of the related information image, prevents the viewer from gazing at it for too long, and can contribute to safe driving of the vehicle.
 FIG. 1 is a block diagram functionally showing the configuration of a head-up display device according to an embodiment of the present invention. FIG. 2 shows an example of a virtual image displayed by the head-up display device of the embodiment. FIG. 3 is a flowchart showing the first visibility adjustment process executed by the head-up display device of the embodiment. FIG. 4 is a flowchart showing the second visibility adjustment process executed by the head-up display device of the embodiment. FIG. 5 shows an example of a virtual image when the first visibility adjustment process shown in FIG. 3 is executed. FIG. 6 shows an example of a virtual image when the second visibility adjustment process shown in FIG. 4 is executed. FIG. 7 shows the characteristic of the visibility of the related information image with respect to distance in the first visibility adjustment process shown in FIG. 3. FIG. 8 shows the characteristic of the visibility of the related information image with respect to time and travel distance when the second visibility adjustment process is executed before the first visibility adjustment process. FIG. 9 shows the characteristic of the visibility of the related information image with respect to time and travel distance when the second visibility adjustment process is executed after the first visibility adjustment process. FIG. 10 shows a modification of the characteristic of the visibility of the related information image with respect to time when the second visibility adjustment process is executed before the first visibility adjustment process. FIG. 11 shows a modification of the characteristic of the visibility of the related information image with respect to time when the second visibility adjustment process is executed after the first visibility adjustment process.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. The present invention is not limited by the following embodiments (including the contents of the drawings). Modifications (including deletion of components) may be made to the following embodiments. In the following description, descriptions of well-known technical matters are omitted as appropriate to facilitate understanding of the present invention.
 FIG. 1 is a block diagram showing the overall configuration of a head-up display device (hereinafter referred to as the HUD device 1) according to the present embodiment. The HUD device 1 includes an image display unit 10 that displays a display image serving as the source of a virtual image V (see FIG. 2); a relay optical unit 10a, composed of one or more optical members such as reflective, refractive, and diffractive optical systems, that appropriately magnifies the display light of the display image displayed by the image display unit 10 and projects it toward a projection target 2 (for example, the front windshield of the vehicle); and a display control device 20 that controls the display of the image display unit 10.
 The image display unit 10 is mainly composed of, for example, a projector (not shown) using a reflective display device such as a DMD or LCoS, and a screen (not shown) that receives the projection light from the projector, displays the display image, and emits the display light representing that image toward the relay optical unit 10a. The image display unit 10 generates a virtual image in front of the viewer by displaying the display image on the screen based on image data input from the image generation unit 80 (display control device 20). The image display unit 10 may instead be, for example, a transmissive display panel such as a liquid crystal display element, a self-luminous display panel such as an organic EL element, or a scanning display device that scans a laser beam. When the image display unit 10 is composed of a projector such as a reflective display device or a scanning display device, the screen can be regarded as the display surface of the image display unit 10; when it is composed of a transmissive display panel or a self-luminous display panel, the display area of the panel can be regarded as the display surface of the image display unit 10.
 The display control device 20 includes a first interface (identification information acquisition unit 30, position information acquisition unit 32, distance information acquisition unit 34, gaze position acquisition unit 40, and vehicle state acquisition unit 50); a second interface 60 that, instead of or together with the first interface, acquires the various information necessary for the processing of the display processing unit 70; and a display processing unit 70 that executes at least the display control of a related information image V1 (a type of virtual image V) described later.
 The first interface (identification information acquisition unit 30, position information acquisition unit 32, distance information acquisition unit 34, gaze position acquisition unit 40, and vehicle state acquisition unit 50) and the second interface 60 are communicably connected to external devices. The external devices include a real object identification unit 31, a real object position detection unit 33, a real object distance detection unit 35, a line-of-sight detection unit 41, a position detection unit 51, a direction detection unit 52, a posture detection unit 53, a cloud server (external server) 500, a vehicle ECU 600, a navigation system 700, and devices not shown, such as other in-vehicle devices in the own vehicle, mobile devices in the own vehicle, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), and pedestrians' portable devices (vehicle-to-pedestrian communication V2P). The HUD device 1 acquires various information and the image element data from which the virtual image V is generated from the external devices via these interfaces, and displays the virtual image V according to the processing of the display processing unit 70. Part or all of the image element data may be stored in a storage unit 71 described later, and an image generation unit 80 described later may read out the image element data stored in the storage unit 71 according to an instruction from the display processing unit 70 and generate image data that causes the image display unit 10 to display the display image on which the virtual image V is based.
 The identification information acquisition unit 30 is an input interface that acquires identification information of a real object 300 existing in the foreground 200 of the host vehicle; for example, it acquires identification information from a real object identification unit 31 consisting of one or more cameras or sensors mounted on the host vehicle, and outputs it to the display processing unit 70. For example, the real object identification unit 31 has an identification unit (not shown) that captures (detects) the foreground 200 of the host vehicle and analyzes the captured (detected) data to generate identification information of the real object 300 in the foreground 200. For example, the identification information acquired by the identification information acquisition unit 30 is the type of the real object 300. The type of the real object 300 is, for example, an obstacle (another vehicle, a person, an animal, or an object), a road sign, a road, or a building, but is not limited to these as long as the object exists in the foreground 200 and can be identified.
 The position information acquisition unit 32 is an input interface that acquires the position of the real object 300 existing in the foreground 200 of the host vehicle; for example, it acquires the position of the real object 300 from a real object position detection unit 33 consisting of one or more cameras or sensors mounted on the host vehicle, and outputs it to the display processing unit 70. For example, the real object position detection unit 33 has a position analysis unit (not shown) that captures (detects) the foreground 200 of the host vehicle and analyzes the captured (detected) data to generate position information of the real object 300 in the foreground 200. In this case, the position information of the real object 300 acquired by the position information acquisition unit 32 includes a two-dimensional position (vertical and horizontal) as seen from the viewer, and is, for example, the real coordinates of the real object 300 in the foreground 200 of the host vehicle or the coordinates of the real object 300 in the display area 11.
 The distance information acquisition unit 34 is an input interface that acquires the distance to the real object 300 existing in the foreground 200 of the host vehicle; for example, it acquires the distance to the real object 300 from a real object distance detection unit 35 consisting of one or more cameras or sensors mounted on the host vehicle, and outputs it to the display processing unit 70. For example, the real object distance detection unit 35 has a distance analysis unit (not shown) that captures (detects) the foreground 200 of the host vehicle and analyzes the captured (detected) data to generate distance information to the real object 300. In this case, the distance information of the real object 300 acquired by the distance information acquisition unit 34 is the distance from the host vehicle or from the HUD device 1 to the real object 300.
 The above analysis units (the identification unit, the position analysis unit, and the distance analysis unit) may be provided within the HUD device 1. Specifically, the analysis units may be provided within the display processing unit 70; in this case, the identification information acquired by the identification information acquisition unit 30, the position information acquired by the position information acquisition unit 32, and the distance information acquired by the distance information acquisition unit 34 are the captured (detected) data obtained by the real object identification unit 31 (the real object position detection unit 33 and the real object distance detection unit 35) imaging (detecting) the foreground 200. Some or all of the identification information acquisition unit 30, the position information acquisition unit 32, and the distance information acquisition unit 34 may share a common camera, sensor, or analysis unit. In the following, the identification information, position information, and distance information of the real object 300 are also collectively referred to as real object information.
 In another embodiment, the second interface 60 may function as the identification information acquisition unit 30 (the position information acquisition unit 32, the distance information acquisition unit 34) instead of, or together with, the identification information acquisition unit 30 (and/or the position information acquisition unit 32 and/or the distance information acquisition unit 34). In other words, the second interface 60 may acquire the real object information from the cloud server 500, the vehicle ECU 600, the navigation system 700, other in-vehicle devices in the host vehicle (not shown), mobile devices in the host vehicle, other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), or pedestrians' mobile devices (vehicle-to-pedestrian communication V2P). The second interface 60 can acquire, for example, from the cloud server 500, the vehicle ECU 600, or the navigation system 700, position information, identification information, and coordinate information (distance information) of real objects 300 such as road signs, roads, and buildings together with map information, and can acquire position information, identification information, and coordinate information (distance information) of obstacles (other vehicles, persons, animals, objects), road signs, roads, and the like from other vehicles (vehicle-to-vehicle communication V2V), roadside communication infrastructure (road-to-vehicle communication V2I), and pedestrians' mobile devices (vehicle-to-pedestrian communication V2P), none of which are shown. The display processing unit 70 may output the position information of the host vehicle or the HUD device 1 (to which direction information of the host vehicle may be added) to the external device via the second interface 60, and the external device may, based on the input position information (and the additional direction information) of the host vehicle or the HUD device 1, output the position information, identification information, and distance information of the real objects 300 existing in the foreground 200 of the host vehicle to the second interface 60. The identification information may be individual identification information finer-grained than the type of the real object 300.
 The gaze position acquisition unit 40 is an input interface that acquires gaze information on the range at which the viewer is gazing (gaze range GZ); for example, it acquires the gaze range GZ from a line-of-sight detection unit 41 consisting of a camera or the like mounted on the vehicle, and outputs it to the display processing unit 70. The viewer's gaze range GZ is the region the viewer is looking at (gazing at); it includes the point of gaze and is a certain range that a person can normally be regarded as watching. The method of setting the viewer's gaze range GZ is not limited to this; the line-of-sight detection unit 41 may, for example, set as the gaze range GZ a rectangular region containing a plurality of gaze points measured within a predetermined time (for example, one second), as described in JP-A-2015-126451.
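 As a minimal, non-authoritative sketch of that rectangular gaze-range construction, the following Python fragment computes the bounding box of gaze points sampled over roughly one second; the GazeRange type and the sampling representation are assumptions introduced here for illustration, not taken from the specification.

    from dataclasses import dataclass

    @dataclass
    class GazeRange:
        # Axis-aligned rectangle in display coordinates (illustrative type).
        left: float
        top: float
        right: float
        bottom: float

    def gaze_range_from_points(points):
        # `points` is a list of (x, y) gaze samples from the last ~1 second.
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return GazeRange(min(xs), min(ys), max(xs), max(ys))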
 The vehicle state acquisition unit 50 acquires information on the state of the vehicle (position, direction, posture). The vehicle state acquisition unit 50 acquires position information of the vehicle or the HUD device 1 detected by a position detection unit 51 consisting of a GNSS (Global Navigation Satellite System) receiver or the like, acquires direction information indicating the orientation of the vehicle or the HUD device 1 detected by a direction detection unit 52 consisting of a direction sensor, and acquires posture information indicating the height of the vehicle above the road surface and/or its angle with respect to the road surface detected by a posture detection unit 53 consisting of a height sensor or an acceleration sensor, and outputs these to the display processing unit 70.
 The display processing unit 70 includes at least one processor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), and/or at least one semiconductor integrated circuit such as at least one field-programmable gate array (FPGA). The at least one processor can execute all or part of the functions of the display processing unit 70, which includes a notification necessity determination unit 72, a first visibility adjustment unit 73, and a second visibility adjustment unit 74, by reading one or more instructions from at least one computer-readable, tangible recording medium. Such recording media include any type of magnetic medium such as a hard disk, any type of optical medium such as CDs and DVDs, any type of semiconductor memory such as volatile memory, and non-volatile memory. Volatile memory includes DRAM and SRAM, and non-volatile memory includes ROM and NVROM. A semiconductor memory is also a semiconductor circuit that forms part of a circuit together with the at least one processor. An ASIC is an integrated circuit customized to execute all or part of the functional blocks shown in FIG. 1, and an FPGA is an integrated circuit designed so that, after manufacture, it can execute all or part of the functional blocks shown in FIG. 1. The display processing unit 70 may include an image generation unit 80 that generates image data for causing the image display unit 10 to display the display image (the image underlying the virtual image V), but some or all of the functions of the image generation unit 80 may be provided separately from the display control device 20 or the HUD device 1.
 The notification necessity determination unit 72 determines the degree of notification necessity of the real object 300 based on information acquired from the identification information acquisition unit 30, the position information acquisition unit 32, the distance information acquisition unit 34, and the like, and determines whether to notify the viewer. When the degree of notification necessity exceeds a predetermined threshold, it determines that the viewer needs to be notified. The degree of notification necessity is, for example, a degree of danger derived from the severity of the situation the real object 300 could cause, or a degree of urgency derived from how much reaction time remains before a responsive action must be taken; it may be determined solely from the real object information on the real object 300 acquired by the identification information acquisition unit 30, the position information acquisition unit 32, and the distance information acquisition unit 34, or it may be determined based on various kinds of information input from other interfaces in addition to the real object information. Note that the notification necessity determination unit 72 essentially only needs the function of determining, based on the degree of notification necessity, whether to notify the viewer of the presence of the real object 300; it need not have the function of determining the degree of notification necessity itself, and some or all of that function may be provided separately from the display control device 20. The display processing unit 70 instructs the image generation unit 80 to display at least the related information image V1 (and in addition the emphasized image V2) for a real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary. The necessity of notification is not limited to methods using the degree of danger or the degree of urgency; for example, if information about a real object 300 useful to the viewer is acquired from the cloud server 500, the vehicle ECU 600, or the navigation system 700, the notification necessity determination unit 72 may determine that notification is necessary. Moreover, the real object 300 need not be visible to the viewer: even when it is hidden behind something else in the foreground 200 or too far away to be seen, the display processing unit 70 may instruct the image generation unit 80 to display the related information image V1 for the real object 300 requiring notification at a predetermined position, or to display the emphasized image V2 in the direction of the real object 300 requiring notification.
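 Purely as a loose illustration of the threshold comparison above (the specification leaves the computation of the degree of notification necessity open), a sketch might combine a danger score and an urgency score as follows; the score inputs and the max-combination rule are assumptions, not the claimed method.

    def should_notify(danger, urgency, threshold):
        # Degree of notification necessity; `max` is one possible
        # combination of danger and urgency (assumption for illustration).
        necessity = max(danger, urgency)
        return necessity > threshold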
 The first visibility adjustment unit 73 determines whether related information exists for a real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary, and if related information for the real object 300 exists, instructs the image generation unit 80 to display the related information image V1. The first visibility adjustment unit 73 acquires the identification information of the real object 300 from the identification information acquisition unit 30 or the second interface 60, and determines that related information exists if related information corresponding to this identification information is present in the storage unit 71 or in an external device accessed via the second interface 60.
 The first visibility adjustment unit 73 also adjusts the visibility of the virtual image V (the related information image V1 and the emphasized image V2). Techniques for changing "visibility" here include, for example, changing the luminance, brightness, or color of the virtual image V, or a combination of these. The first visibility adjustment unit 73 changes the visibility of the related information image V1 based on the distance information of the real object 300 acquired from the real object distance detection unit 35. Specifically, the first visibility adjustment unit 73 gradually lowers the visibility of the related information image V1 as the real object 300 approaches.
 As a modification, the first visibility adjustment unit 73 may change the visibility of the related information image V1 according to the position of the real object 300 within the display area 11 (or the position of the emphasized image V2 displayed for the real object 300). Specifically, the first visibility adjustment unit 73 may identify the position of the real object 300 within the display area 11 based on the position information of the real object 300 acquired from the real object position detection unit 33, and gradually lower the visibility of the related information image V1 as the real object 300 approaches the outer edge of the display area 11. The processing in which the first visibility adjustment unit 73 gradually lowers visibility in this way, based on the distance information of the real object 300 (the depth direction as seen from the viewer) or its position information (the vertical and/or horizontal direction as seen from the viewer), is called the first visibility adjustment processing.
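 A minimal sketch of this first visibility adjustment processing follows, assuming a normalized visibility value and linear factors; the function names, the normalization, and the rule of taking the stronger of the two reductions (distance-based or edge-based) are illustrative assumptions, not the specification's definition.

    def proximity_factor(distance, d_far, d_near):
        # 1.0 while the real object is far; falls linearly to 0.0 as it nears.
        if distance >= d_far:
            return 1.0
        if distance <= d_near:
            return 0.0
        return (distance - d_near) / (d_far - d_near)

    def edge_factor(x, y, area_w, area_h, margin):
        # 1.0 well inside the display area 11; 0.0 at its outer edge.
        to_edge = min(x, area_w - x, y, area_h - y)
        return max(0.0, min(1.0, to_edge / margin))

    def first_visibility_adjustment(base, distance, x, y,
                                    area_w, area_h, d_far, d_near, margin):
        # Apply whichever trigger demands the stronger reduction.
        factor = min(proximity_factor(distance, d_far, d_near),
                     edge_factor(x, y, area_w, area_h, margin))
        return base * factor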
 The second visibility adjustment unit 74 determines whether the viewer is looking at the real object 300 (or the emphasized image V2 displayed for the real object 300) or at the related information image V1. The second visibility adjustment unit 74 acquires the position information of the currently displayed related information image V1 and emphasized image V2 stored in the storage unit 71, acquires the position information of the real object 300 from the position information acquisition unit 32, acquires the gaze range GZ from the gaze position acquisition unit 40, and makes the determination based on the position information of the related information image V1 or the emphasized image V2 and the gaze range GZ.
 In addition, when the viewer has looked at the real object 300 (or the emphasized image V2 displayed for the real object 300) or at the related information image V1, the second visibility adjustment unit 74 can execute second visibility adjustment processing, which raises the visibility and then lowers it. When the viewer is looking neither at the real object 300 (nor at the emphasized image V2 displayed for it) nor at the related information image V1, the second visibility adjustment unit 74 executes the first visibility adjustment processing; when the viewer has looked at the real object 300 (or the emphasized image V2 displayed for it) or at the related information image V1, it executes the second visibility adjustment processing.
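 The selection between the two processes could be sketched as below; the rectangle representation of the gaze range GZ and the point positions of the images are simplifications introduced here for illustration only.

    def contains(gz, x, y):
        # gz is (left, top, right, bottom) of the gaze range GZ.
        left, top, right, bottom = gz
        return left <= x <= right and top <= y <= bottom

    def select_adjustment(gz, object_pos, emphasized_pos, related_pos):
        # Run the second process if the viewer looks at the real object 300,
        # the emphasized image V2, or the related information image V1;
        # otherwise run the first process.
        gazed = any(contains(gz, x, y)
                    for (x, y) in (object_pos, emphasized_pos, related_pos))
        return "second" if gazed else "first"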
 Next, the virtual image V displayed by the HUD device 1 will be described with reference to FIG. 2. In the example of FIG. 2, the real object 300 is a parking lot; an emphasized image V2 is displayed so as to point at this parking lot, and a related information image V1 displaying "P", indicating a parking lot, is displayed at the upper end portion 12 of the display area 11. The display processing unit 70 displays the emphasized image V2 near the real object 300 for which the notification necessity determination unit 72 has determined that notification is necessary, or at the position where the real object 300 exists even if it cannot be seen, and displays the related information image V1 of the real object 300 at a predetermined specific position in the display area 11 (the upper end portion 12 in the example of FIG. 2). The specific position is preferably the upper end portion 12 or the lower end portion 13 obtained when the vertical extent of the display area 11, in which the HUD device 1 can display the virtual image V, is divided into four equal parts, but it may be the upper end portion 12 or the lower end portion 13 obtained when the vertical extent of the display area 11 is divided into three equal parts. In other words, the display area 11 preferably reserves, below the upper end portion 12, a region whose vertical width is about twice the vertical width of the upper end portion 12 displaying the related information image V1 (the length between the upper end of the display area 11 and the lower end of the related information image V1), or preferably reserves, above the lower end portion 13, a region whose vertical width is about twice the vertical width of the lower end portion 13 displaying the related information image V1 (the length between the lower end of the display area 11 and the upper end of the related information image V1).
 Next, refer to FIG. 3. FIG. 3 is a flowchart showing an example of the main procedure of the first visibility adjustment processing involved in the display control of the HUD device 1 of the present embodiment. When the HUD device 1 is started, the first visibility adjustment processing begins.
 First, real object information (identification information, position information, distance information) is acquired (step S1), and it is determined whether the real object 300 is one of which the viewer should be notified (step S2). If YES, the emphasized image V2 is displayed near the real object 300 (step S3); if NO, the first visibility adjustment processing ends.
 Next, it is determined whether the display condition for the related information image V1 is satisfied (step S4). If YES, the related information image V1 is displayed (step S5); if NO, the related information image V1 is not displayed, or is hidden if it is already displayed (step S6), and the first visibility adjustment processing ends. The display condition for the related information image V1 in step S4 includes at least that related information for the real object 300 exists, and may additionally include that the priority (degree of necessity) of the related information image V1 can be determined to be high depending on the situation.
 After this, it is determined whether the condition for executing the first visibility adjustment processing is satisfied (step S7). If YES, the visibility of the related information image V1 is lowered based on the distance information or position information of the real object 300 acquired in step S1 (step S8, also called the first visibility adjustment processing); if NO, the first visibility adjustment processing ends. The condition for executing the first visibility adjustment processing in step S7 is, for example, that the real object 300 is within a predetermined distance Dm described later (see FIG. 7), or that the real object 300 is within a predetermined region (not shown) near the outer edge of the display area 11. In the first visibility adjustment processing of step S8, the visibility of the related information image V1 is gradually lowered as the real object 300 approaches, or as the real object 300 nears the outer edge of the display area 11. By automatically lowering the visibility of the related information of a real object 300 whose notification priority (degree of necessity) has dropped because it has come close to the vehicle, the viewer's visual attention is less likely to be captured, allowing the viewer to concentrate on driving.
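 Restating the FIG. 3 flow in code form, under the assumption of a hypothetical `hud` object whose methods stand in for the acquisition, determination, and drawing units described above (none of these method names come from the specification):

    def first_visibility_routine(hud):
        info = hud.acquire_real_object_info()            # step S1
        if not hud.should_notify(info):                  # step S2
            return                                       # processing ends
        hud.show_emphasized_image(info)                  # step S3
        if not hud.related_info_display_condition(info): # step S4
            hud.hide_related_info_image()                # step S6
            return
        hud.show_related_info_image(info)                # step S5
        if hud.first_adjustment_condition(info):         # step S7
            hud.lower_related_info_visibility(info)      # step S8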
 Next, refer to FIG. 4. FIG. 4 is a flowchart showing an example of the main procedure of the second visibility adjustment processing involved in the display control of the HUD device 1 of the present embodiment. When the first visibility adjustment processing ends, the second visibility adjustment processing begins. When the second visibility adjustment processing ends, the routine restarts from the first visibility adjustment processing.
 First, the viewer's gaze range GZ is acquired (step S11), and it is determined whether the related information image V1 is being gazed at (step S12). If YES, the processing moves to step S13; if NO, the second visibility adjustment processing ends.
 Next, in step S13, it is determined whether the related information image V1 determined in step S12 to be gazed at is currently undergoing the first visibility adjustment processing; in other words, whether the condition for the first visibility adjustment processing of step S8 is satisfied. If YES in step S13, the current visibility of the related information image V1 is acquired (step S14), the visibility is raised by a predetermined value relative to this current visibility (step S15), and the visibility is subsequently lowered (step S16); the second visibility adjustment processing then ends, and the routine starts again from the first visibility adjustment processing. If NO in step S13, the visibility is raised by a predetermined value relative to the normal visibility of the related information image V1 (step S15), and the visibility is subsequently lowered (step S16); the second visibility adjustment processing then ends, and the routine starts again from the first visibility adjustment processing. The processing from step S13 to step S16 is called the second visibility adjustment processing. That is, when the viewer looks at the related information image V1, the first visibility adjustment processing, in which visibility gradually decreases, is interrupted and the visibility is raised; this avoids the situation in which the related information image V1 being viewed becomes progressively harder to see, making the related information image V1 easier to recognize.
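 The FIG. 4 flow admits a similarly compact restatement; again the `hud` methods are hypothetical stand-ins for the units described above, and `boost` is the predetermined raise value.

    def second_visibility_routine(hud, boost):
        gz = hud.acquire_gaze_range()                    # step S11
        if not hud.related_info_is_gazed(gz):            # step S12
            return
        if hud.first_adjustment_active():                # step S13
            base = hud.current_related_info_visibility() # step S14
        else:
            base = hud.normal_visibility()               # initial value Lm
        hud.raise_visibility_to(base + boost)            # step S15
        hud.schedule_visibility_decrease()               # step S16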
 Next, the first visibility adjustment processing and the second visibility adjustment processing will be described with reference to FIGS. 5 and 6. FIG. 5 shows the foreground 200, as seen through the projection target 2 (the front windshield of the vehicle) when the viewer faces forward, together with the virtual image V when the first visibility adjustment processing of the present embodiment is executed.
 In FIG. 5(a), the real object 300 is far away (farther than the predetermined distance Dm), so step S7 of FIG. 3 yields NO, and the related information image V1, on which the first visibility adjustment processing of step S8 has not been executed, and a rectangular emphasized image V2 near the real object 300 are displayed. The visibility of the related information image V1 at this point is the initial value Lm described later (see FIG. 7).
 In FIG. 5(b), as the vehicle travels, the real object 300 has come closer (closer than the predetermined distance Dm), so step S7 of FIG. 3 yields YES, and the related information image V1, on which the first visibility adjustment processing of step S8 has been executed, and a rectangular emphasized image V2 near the real object 300 are displayed. The visibility of the related information image V1 at this point is set lower than the initial value Lm.
 In FIG. 5(c), the real object 300 is nearby (closer than the predetermined distance Ds), so the first visibility adjustment processing of step S8 has been executed on the related information image V1 and its visibility has become zero (hidden), and because the real object 300 has moved outside the display area 11 as seen from the viewer, the visibility of the emphasized image V2 has also been set to zero (hidden). That is, in the example of the first visibility adjustment processing shown in FIG. 5, the visibility of the related information image V1 is lowered as the real object 300 approaches. In the example of FIG. 5, the visibility of the emphasized image V2 is also lowered as the distance to the real object 300 decreases (or as the real object 300 approaches the outer edge of the display area 11), but this is not a limitation.
 FIG. 6 shows the foreground 200, as seen through the projection target 2 (the windshield of the vehicle) when the viewer faces forward, together with the virtual image V when the second visibility adjustment processing of the present embodiment is executed.
 In FIGS. 6(a) and 6(b), the real object 300 is far away (farther than the predetermined distance Dm), so step S7 of FIG. 3 yields NO, and the related information image V1, on which the first visibility adjustment processing of step S8 has not been executed, and a rectangular emphasized image V2 near the real object 300 are displayed. In FIG. 6(b), the viewer has looked at the related information image V1 (the related information image V1 is included in the gaze range GZ), so step S12 of FIG. 4 yields YES, and the related information image V1, whose visibility has been raised in step S15, and a rectangular emphasized image V2 near the real object 300 are displayed. The visibility of the related information image V1 at this point is set higher by a predetermined value relative to its current visibility if the related information image V1 is undergoing the first visibility adjustment processing, and higher by a predetermined value relative to the normal visibility (initial value Lm) if it is not. In FIG. 6(c), the related information image V1, whose visibility has been lowered in step S16 of FIG. 4, and a rectangular emphasized image V2 near the real object 300 are displayed. That is, in the example of the second visibility adjustment processing shown in FIG. 6, the related information image V1 is displayed emphasized when it is looked at, and its visibility is lowered thereafter.
 FIG. 7 is a characteristic diagram showing the relationship between the distance to the real object 300 and the visibility of the related information image V1 in the first visibility adjustment processing of the present embodiment. If the distance to the real object 300 is greater than or equal to a predetermined first distance threshold Dm, step S7 of FIG. 3 yields NO and the related information image V1 is maintained at the initial visibility Lm; if it is less than the first distance threshold Dm, step S7 of FIG. 3 yields YES and the processing moves to step S8, where the visibility is gradually lowered as the distance decreases, and below a second distance threshold Ds the visibility becomes zero (hidden) (the first visibility adjustment processing is executed). The relationship between distance and visibility between the first distance threshold Dm and the second distance threshold Ds is linear in the example of FIG. 7, but it may be non-linear, and the visibility may instead be changed stepwise according to distance.
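 The FIG. 7 characteristic reduces to a short piecewise function; the sketch below implements the linear variant using the thresholds Dm and Ds and the initial visibility Lm named above (the parameter names mirror the figure, but the code itself is only an illustration).

    def fig7_visibility(distance, dm, ds, lm):
        # Hold Lm at or beyond Dm; fall linearly between Dm and Ds;
        # zero (hidden) below Ds.
        if distance >= dm:
            return lm
        if distance < ds:
            return 0.0
        return lm * (distance - ds) / (dm - ds)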
 FIG. 8 is a time chart showing the change in visibility of the related information image V1 due to the second visibility adjustment processing. FIG. 8A is a time chart of the visibility of the related information image V1 under the second visibility adjustment processing when the viewer gazes at the related information image V1 at distance D1 of FIG. 7 (time t10), before the first visibility adjustment processing has started; FIG. 8B is a time chart of the visibility of the related information image V1 under the second visibility adjustment processing when the viewer gazes at the related information image V1 at distance D2 of FIG. 7 (time t20), after the first visibility adjustment processing has started.
 In the embodiment shown in FIG. 8A, when the related information image V1 is gazed at, the visibility is gradually raised by a predetermined fixed value over a fixed first period Ta (for example, one second), relative to the visibility at that time (the initial visibility Lm); the visibility is then kept constant until the vehicle has traveled a fixed first travel distance La, and thereafter gradually lowered so that it reaches 0 (zero) by the time the vehicle has traveled a further second travel distance Lb. By raising the visibility of the related information image V1 gradually, the viewer's visual attention can more easily be kept on, or redirected to, the related information image V1 whose visibility has been raised.
 In the embodiment shown in FIG. 8B, when the related information image V1 is gazed at, the visibility is gradually raised by a predetermined fixed value over the fixed first period Ta, relative to the visibility at that time (the visibility L2 adjusted for distance D2 by the first visibility adjustment processing shown in FIG. 7); the visibility is then kept constant until the vehicle has traveled the fixed first travel distance La, and thereafter gradually lowered so that it reaches 0 (zero) by the time the vehicle has traveled a further third travel distance Lc. The rate at which visibility decreases with increasing travel distance is preferably constant regardless of the visibility at the time the second visibility adjustment processing is executed (Lm at time t10 in FIG. 8A, L2 (< Lm) at time t20 in FIG. 8B), but this is not a limitation; for example, the lower the visibility when the second visibility adjustment processing is executed, the steeper the decrease rate may be made (the rate of visibility decrease per unit travel distance may be increased).
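 A sketch of the FIG. 8 distance-based profile follows: the raise is a function of elapsed time during the period Ta, while the hold and decay are functions of travel distance, with the decay slope held constant across FIGS. 8A and 8B as the text prefers. The parameterization is an illustrative assumption.

    def fig8_visibility(base, boost, ta, la, slope, t, dist):
        # base: visibility when gazed (Lm in FIG. 8A, L2 in FIG. 8B)
        # t: time since the gaze; dist: travel distance since the gaze
        peak = base + boost
        if t < ta:                         # gradual raise over period Ta
            return base + boost * (t / ta)
        if dist <= la:                     # hold for travel distance La
            return peak
        return max(0.0, peak - slope * (dist - la))   # decay (Lb or Lc)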
 FIG. 9 is a time chart showing the change in visibility of the related information image V1 due to the second visibility adjustment processing. FIG. 9A is a time chart of the visibility of the related information image V1 under the second visibility adjustment processing when the viewer gazes at the related information image V1 at distance D1 of FIG. 7 (time t30), before the first visibility adjustment processing has started; FIG. 9B is a time chart of the visibility of the related information image V1 under the second visibility adjustment processing when the viewer gazes at the related information image V1 at distance D2 of FIG. 7 (time t40), after the first visibility adjustment processing has started.
 In the embodiment shown in FIG. 9A, when the related information image V1 is gazed at, the visibility is gradually raised by a predetermined fixed value over the fixed first period Ta, relative to the visibility at that time (the initial visibility Lm); the visibility is then kept constant until a fixed second period Tb has elapsed, and thereafter gradually lowered so that it reaches 0 (zero) by the time a third period Tc has elapsed.
 In the embodiment shown in FIG. 9B, when the related information image V1 is gazed at, the visibility is gradually raised by a predetermined fixed value over the fixed first period Ta, relative to the visibility at that time (the visibility L2 adjusted for distance D2 by the first visibility adjustment processing shown in FIG. 7); the visibility is then kept constant until the fixed second period Tb has elapsed, and thereafter gradually lowered so that it reaches 0 (zero) by the time a fourth period Td (< Tc) has elapsed. The rate at which visibility decreases over time is preferably constant regardless of the visibility at the time the second visibility adjustment processing is executed (Lm at time t30 in FIG. 9A, L2 (< Lm) at time t40 in FIG. 9B), but this is not a limitation; for example, the lower the visibility when the second visibility adjustment processing is executed, the steeper the decrease rate may be made (the rate of visibility decrease per unit time may be increased). In the second visibility adjustment processing, instead of raising the visibility by a fixed value from the visibility adjusted by the first visibility adjustment processing over the fixed first period Ta, the visibility may be raised to a specific visibility (for example, the normal visibility Lm).
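 The FIG. 9 time-based variant differs only in measuring the hold and decay in elapsed time; with a constant slope, the decay period automatically shortens from Tc to Td (< Tc) when the starting visibility is the lower L2. Again, the parameterization is an illustrative assumption.

    def fig9_visibility(base, boost, ta, tb, slope, t):
        # base: visibility when gazed (Lm in FIG. 9A, L2 in FIG. 9B)
        peak = base + boost
        if t < ta:                         # gradual raise over period Ta
            return base + boost * (t / ta)
        if t <= ta + tb:                   # hold for period Tb
            return peak
        return max(0.0, peak - slope * (t - ta - tb))  # decay (Tc or Td)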
DESCRIPTION OF REFERENCE NUMERALS: 1: HUD device (head-up display device); 2: windshield; 10: image display unit; 10a: relay optical unit; 11: display area; 12: upper end portion; 13: lower end portion; 20: display control device; 30: identification information acquisition unit; 31: real object identification unit; 32: position information acquisition unit; 33: real object position detection unit; 34: distance information acquisition unit; 35: real object distance detection unit; 40: gaze position acquisition unit; 41: line-of-sight detection unit; 50: vehicle state acquisition unit; 51: position detection unit; 52: direction detection unit; 53: posture detection unit; 60: second interface; 70: display processing unit; 71: storage unit; 72: notification necessity determination unit; 73: first visibility adjustment unit; 74: second visibility adjustment unit; 80: image generation unit; 200: foreground; 300: real object; 500: cloud server; 600: vehicle ECU; 700: navigation system; GZ: gaze range; V: virtual image; V1: related information image; V2: emphasized image

Claims (6)

  1.  A display control device mounted on a vehicle and comprising an image generation unit that generates, in a display area overlapping a foreground of the vehicle, a related information image of a real object existing in the foreground, the display control device comprising:
     one or more processors;
     a memory; and
     one or more instructions stored in the memory and configured to be executed by the one or more processors,
     wherein the one or more processors execute instructions to:
     acquire a position of the real object;
     gradually lower a visibility of the related information image as a distance to the real object decreases or as the real object approaches an outer edge of the display area;
     acquire a gaze range of a viewer; and
     when it is determined that the related information image has been gazed at, raise the visibility of the related information image and then gradually lower it based on a travel distance of the vehicle or an elapsed time.
  2.  The display control device according to claim 1, wherein the one or more processors execute instructions to, when it is determined that the related information image has been gazed at, gradually raise the visibility of the related information image over a predetermined period and then gradually lower it based on the travel distance or the elapsed time.
  3.  The display control device according to claim 1 or 2, wherein the one or more processors execute instructions to, when it is determined that the related information image has been gazed at and the visibility of the related information image has been lowered below a normal visibility, gradually raise the visibility of the related information image to the normal visibility over a predetermined period.
  4.  The display control device according to any one of claims 1 to 3, wherein the one or more processors execute instructions to keep the rate of visibility decrease based on the travel distance or the elapsed time constant, regardless of the visibility of the related information image at the time it is determined that the related information image has been gazed at.
  5.  The display control device according to any one of claims 1 to 4, wherein the one or more processors execute instructions to:
     further display, at a position overlapping or near the real object, an emphasized image that emphasizes the real object;
     gradually lower a visibility of the emphasized image as the distance to the real object decreases or as the real object approaches the outer edge of the display area; and
     not adjust the visibility of the emphasized image even when it is determined that the emphasized image has been gazed at.
  6.  A head-up display device comprising:
     the display control device according to any one of claims 1 to 5;
     an image display unit that displays an image underlying the related information image based on control by the display control device; and
     a relay optical unit that projects the image displayed by the image display unit toward a projection target so that a virtual image of the image is visually recognized.
PCT/JP2019/028579 2018-07-24 2019-07-22 Display control device and head-up display device WO2020022239A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020532369A JP7255596B2 (en) 2018-07-24 2019-07-22 Display control device, head-up display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018138641 2018-07-24
JP2018-138641 2018-07-24

Publications (1)

Publication Number Publication Date
WO2020022239A1 true WO2020022239A1 (en) 2020-01-30

Family

ID=69181514

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/028579 WO2020022239A1 (en) 2018-07-24 2019-07-22 Display control device and head-up display device

Country Status (2)

Country Link
JP (1) JP7255596B2 (en)
WO (1) WO2020022239A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014194503A (en) * 2013-03-29 2014-10-09 Funai Electric Co Ltd Head-up display device
JP2015011666A (en) * 2013-07-02 2015-01-19 株式会社デンソー Head-up display and program
JP2017004265A (en) * 2015-06-10 2017-01-05 トヨタ自動車株式会社 Display device
WO2017046937A1 (en) * 2015-09-18 2017-03-23 日産自動車株式会社 Display apparatus for vehicle and display method for vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017039373A (en) * 2015-08-19 2017-02-23 トヨタ自動車株式会社 Vehicle video display system
JP2017138350A (en) * 2016-02-01 2017-08-10 アルプス電気株式会社 Image display device


Also Published As

Publication number Publication date
JPWO2020022239A1 (en) 2021-08-05
JP7255596B2 (en) 2023-04-11

Similar Documents

Publication Publication Date Title
JP5866553B2 (en) Display control device, projection device, and display control program
US9481287B2 (en) Roadway projection system
US9818206B2 (en) Display device
JP6304628B2 (en) Display device and display method
KR102105447B1 (en) Method and apparatus for minimizing error due to surrounding environment of three dimentional head-up display
US11250816B2 (en) Method, device and computer-readable storage medium with instructions for controlling a display of an augmented-reality head-up display device for a transportation vehicle
WO2016067574A1 (en) Display control device and display control program
JP2017215816A (en) Information display device, information display system, information display method, and program
JP6347326B2 (en) Display control device, display control method, display control program, and projection device
JP2016112984A (en) Virtual image display system for vehicle, and head up display
KR20180022374A (en) Lane markings hud for driver and assistant and same method thereof
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
JP2018088118A (en) Display control device, control method, program and storage media
WO2020105685A1 (en) Display control device, method, and computer program
JP7402083B2 (en) Display method, display device and display system
US11170537B2 (en) Vehicle display device
WO2020022239A1 (en) Display control device and head-up display device
CN111971197B (en) Display control device and head-up display apparatus
JP6415968B2 (en) COMMUNICATION DEVICE, WARNING DEVICE, DISPLAY DEVICE, CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
JP6988368B2 (en) Head-up display device
JP2019121140A (en) Display control device and head-up display device
JP2019148935A (en) Display control device and head-up display device
JP2019207632A (en) Display device
WO2021200913A1 (en) Display control device, image display device, and method
JP7338632B2 (en) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19841599

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020532369

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19841599

Country of ref document: EP

Kind code of ref document: A1