WO2023199896A1 - Display device - Google Patents

Display device

Info

Publication number
WO2023199896A1
WO2023199896A1 (PCT/JP2023/014594)
Authority
WO
WIPO (PCT)
Prior art keywords
images
image
display
displayed
viewer
Prior art date
Application number
PCT/JP2023/014594
Other languages
French (fr)
Japanese (ja)
Inventor
凌 高野
Original Assignee
日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Publication of WO2023199896A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Arrangement of adaptations of instruments
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 - Details of the operation on graphic patterns


Abstract

Provided is a display device capable of providing visibility suited to the circumstances of a viewer. A display device according to the present embodiment comprises: a display 11 for displaying one or more images; a stress value acquisition unit 31 for acquiring a stress value, which is information relating to the magnitude of stress of a viewer 4 viewing an image; and a display control unit 33 for controlling the images displayed by the display 11 in accordance with the stress value acquired by the stress value acquisition unit 31. The display control unit 33 may reduce the number of images when the stress value rises to or above a first value, and increase the number of images when the stress value falls to or below a second value that is less than or equal to the first value.

Description

Display device
The present disclosure relates to a display device for vehicles and the like.
A head-up display, which is an example of a display device, supports the driver's driving operations by displaying images representing information necessary for driving in a way that lets the driver intuitively relate them to the actual scenery the driver sees. For such head-up displays, techniques are conventionally known for improving the visibility of video that includes an AR image superimposed on a real object.
Japanese Patent Application Publication No. 2020-142792
A head-up display forms images so that the information is visible in front of the vehicle. For this reason, if too many images are displayed in the direction of the driver's line of sight, they may actually hinder driving.
The present disclosure has been made in view of such circumstances, and an object thereof is to provide a display device that can provide visibility suited to the viewer's situation.
To solve the above problem, a display device of the present disclosure includes: a display that displays one or more images; a stress value acquisition unit that acquires a stress value, which is information regarding the magnitude of stress of a viewer viewing the images; and a display control unit that controls the images to be displayed on the display according to the stress value acquired by the stress value acquisition unit.
The display device of the present disclosure can provide visibility suited to the viewer's situation.
FIG. 1 is a diagram showing an example of the system configuration of the HUD in this embodiment.
FIG. 2 is an explanatory diagram showing a virtual display area formed in front of a vehicle.
FIG. 3 is a flowchart illustrating the display number control process executed by the HUD.
FIG. 4 is an explanatory diagram showing the display area when a low-priority image is hidden from the display area of FIG. 2.
FIG. 5 is an explanatory diagram showing another virtual display area formed in front of the vehicle.
FIG. 6 is an explanatory diagram showing the display area when a low-priority image is hidden from the display area of FIG. 5.
FIG. 7 is an explanatory diagram showing yet another virtual display area formed in front of the vehicle.
FIG. 8 is an explanatory diagram showing the display area when part of a low-priority image is hidden from the display area of FIG. 7.
FIG. 9 is an explanatory diagram showing the display area when part of the image in FIG. 8 is hidden.
FIG. 10 is an explanatory diagram showing the display area when part of the image in FIG. 9 is hidden.
Embodiments of the display device of the present disclosure will be described with reference to the accompanying drawings. The display device of the present disclosure can be applied to display devices mounted on, for example, vehicles such as automobiles and motorcycles, as well as on ships, aircraft, agricultural machinery, and construction machinery. The present embodiment is described using an example in which the display device is a head-up display (HUD) that is mounted on a vehicle and displays required information based on various information acquired from the vehicle.
FIG. 1 is a diagram showing an example of the system configuration of the HUD 10 in this embodiment.
FIG. 2 is an explanatory diagram showing the virtual display area 23 formed in front of the vehicle 1.
In the following description, "front," "rear," "upper," "lower," "right," and "left" follow the definitions "Fr.," "Re.," "To.," "Bo.," "R," and "L" in FIGS. 1 to 10.
The HUD 10 is mounted within the instrument panel of the vehicle 1. The HUD 10 includes a display 11, a plane mirror 12, a concave mirror 13, a housing 15, and a control unit 30.
The display 11 is, for example, a TFT (Thin Film Transistor) liquid crystal display or an organic EL (Electroluminescence) display. The display 11 emits display light L for the images to be viewed by the driver 4 (viewer) of the vehicle 1, and displays one or more images within the display area 23. Note that the display 11 may instead comprise a projector and a screen forming the display surface.
The plane mirror 12 reflects the display light L emitted by the display 11 toward the concave mirror 13. The concave mirror 13 further reflects the display light L reflected by the plane mirror 12 and emits it toward the windshield 3. The concave mirror 13 functions as a magnifier: it magnifies the image displayed on the display 11 and reflects it toward the windshield 3 (projection member). That is, the virtual image visually recognized by the driver 4 is an enlarged version of the image displayed on the display 11.
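For reference only, and not as formulas or values given in this disclosure, the enlargement described above follows standard concave-mirror optics: with the displayed image (via the plane mirror) located inside the focal length of the concave mirror, a magnified virtual image is formed. In the usual sign convention, with focal length f, object distance d_o, image distance d_i (negative for a virtual image), and lateral magnification m:

\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}, \qquad m = -\frac{d_i}{d_o} = \frac{f}{f - d_o} > 1 \quad \text{when } d_o < f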
The display light L reflected by the concave mirror 13 is projected onto the windshield 3 of the vehicle 1. The driver 4 of the vehicle 1 visually recognizes the reflection of the display light L on the windshield 3 as virtual images within the rectangular virtual display area 23. The images are thereby displayed in the display area 23 as virtual images superimposed, via the windshield 3 (in front of the windshield 3), on the actual scene in the field of view of the driver 4. That is, the images can be visually recognized as so-called AR (Augmented Reality) images displayed against the background of the actual scene in front of the vehicle 1.
The housing 15 accommodates the display 11, the plane mirror 12, and the concave mirror 13. The housing 15 also houses a control board on which the control unit 30 is mounted.
The control unit 30 controls the display 11 based on information acquired from each part of the vehicle 1, described later. The control unit 30 is a microprocessor, microcontroller, graphics controller, integrated circuit, or the like, and executes predetermined processing. In particular, the control unit 30 includes a stress value acquisition unit 31, a priority determination unit 32, and a display control unit 33.
The stress value acquisition unit 31 acquires a stress value, for example a numerical one, which is information regarding the magnitude of stress of the driver 4 viewing the images. The stress value acquisition unit 31 acquires detection results from, for example, a biological sensor 7, a posture sensor 8, and a camera 9 installed in the vehicle 1, and performs the necessary calculations to obtain the numerical stress value. The stress value acquisition unit 31 may perform the calculation based on the detection result obtained from any one of the biological sensor 7, the posture sensor 8, and the camera 9, or based on the detection results obtained from all of them.
The biological sensor 7 is, for example, of the microwave Doppler type: it irradiates the human body with microwaves and measures the reflected waves. The biological sensor 7 detects the heart rate variability of the driver 4 by detecting the minute vibration velocities of the body surface caused by heartbeats. The posture sensor 8 is disposed on the driver's seat and measures body movements and posture changes by calculating the pressure balance of the driver 4. The camera 9 captures images of the face of the driver 4.
The stress value acquisition unit 31 calculates the stress value from information derived from at least one of the detection results obtained from these sensors: the heart rate variability of the driver 4, the driving posture of the driver 4, and the face image of the driver 4. For example, the stress value acquisition unit 31 calculates a stress value based on the heartbeat intervals acquired from the biological sensor 7. The stress value acquisition unit 31 also calculates the degree of fatigue or stress as a stress value from the changes in body movement and posture acquired from the posture sensor 8. Further, the stress value acquisition unit 31 calculates heart rate variability by analyzing temporal changes in the RGB signals of the skin region of the face of the driver 4 acquired from the camera 9, and calculates the stress value based on this.
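The disclosure does not prescribe a specific formula, but as an illustrative sketch of deriving a numerical stress value from heartbeat intervals, a standard heart rate variability statistic such as RMSSD could be mapped so that lower variability yields a higher stress value (the function names, the 0-100 scale, and the limit values below are assumptions for illustration only):

import math

def rmssd(rr_intervals_ms):
    # Root mean square of successive differences between heartbeat intervals.
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_value_from_rr(rr_intervals_ms, low_hrv_ms=15.0, high_hrv_ms=60.0):
    # Map HRV to a 0-100 stress value: low variability -> high stress value.
    hrv = max(low_hrv_ms, min(high_hrv_ms, rmssd(rr_intervals_ms)))
    return 100.0 * (high_hrv_ms - hrv) / (high_hrv_ms - low_hrv_ms)

# Very regular heartbeats (low variability) produce a high stress value.
print(stress_value_from_rr([812, 815, 810, 818, 813, 816]))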
In the following description, it is assumed that the larger the stress value, the greater the stress, and the smaller the stress value, the smaller the stress.
The priority determination unit 32 determines the number of images to be displayed in the display area 23 based on the stress value calculated by the stress value acquisition unit 31. The priority determination unit 32 also determines which images are to be displayed on the display 11 based on the priority assigned to each image in advance and on the determined display count. The priority determination unit 32 determines the display count so that it decreases as the stress value increases and increases as the stress value decreases. The priority indicates the degree to which each image is preferentially displayed in the display area 23, and is assigned in advance relative to the other images.
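As a rough illustration of this kind of priority-based selection (not code from the disclosure; the identifiers and data structures are assumed), choosing which images remain visible for a given display count could look like the following:

def select_visible(images, display_count):
    # images: list of (image_id, priority); higher priority is kept longer.
    ranked = sorted(images, key=lambda item: item[1], reverse=True)
    visible = {image_id for image_id, _ in ranked[:max(0, display_count)]}
    return [(image_id, image_id in visible) for image_id, _ in images]

# Priorities 21a < 21b < 21c < 21d < 21e with a display count of 4:
# the lowest-priority image 21a is the one that is hidden.
print(select_visible([("21a", 1), ("21b", 2), ("21c", 3), ("21d", 4), ("21e", 5)], 4))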
The display control unit 33 acquires information from the car navigation system of the vehicle 1 and from the vehicle ECU (Electronic Control Unit), and controls the images to be displayed. Specifically, the display control unit 33 acquires information regarding route guidance from the car navigation system, acquires vehicle information necessary for driving the vehicle 1, such as vehicle speed and engine rotation speed, from the vehicle ECU, and generates the corresponding images. Furthermore, the display control unit 33 controls the images displayed on the display 11 based on the information determined by the priority determination unit 32 according to the stress value acquired by the stress value acquisition unit 31 (described in detail later).
Here, the HUD 10 conveys necessary information to the driver 4 via various images depending on the situation of the vehicle 1. However, if images, in particular AR images superimposed on the actual scene, are always present in the field of view of the driver 4, the displayed images may instead become a nuisance. In contrast, the HUD 10 in this embodiment can present a display suited to the situation of the driver 4 by controlling the images displayed in the display area 23 according to the level of stress of the driver 4. That is, the HUD 10 reduces the number of displayed images as the stress of the driver 4 increases, and increases the number of displayed images as the stress decreases. The display number control process performed by the HUD 10 is described in detail below.
FIG. 3 is a flowchart illustrating the display number control process executed by the HUD 10. This display number control process is executed continuously while the HUD 10 is active. Whether or not to execute the display number control process may also be left to the discretion of the driver 4.
In step S1, the stress value acquisition unit 31 acquires the required detection results from the biological sensor 7, the posture sensor 8, and the camera 9, and performs the necessary calculations to obtain a stress value. The stress value acquisition unit 31 outputs the acquired stress value to the priority determination unit 32.
In step S2, the priority determination unit 32 determines whether the acquired stress value is greater than or equal to a threshold for deciding whether to reduce the display of images (a first value, hereinafter referred to as the "non-display determination threshold"). When the priority determination unit 32 determines that the stress value is greater than or equal to the non-display determination threshold (YES in step S2), it determines, in step S3, the number of images to be displayed in the display area 23. That is, when the stress value becomes greater than or equal to the non-display determination threshold, the priority determination unit 32 sets the display count to a number reduced from the number of images currently displayed.
The priority determination unit 32 may set the display count to the current display count reduced by a predetermined number (for example, the current display count minus one), or may set it to a predetermined number (for example, two). The priority determination unit 32 may also set the non-display determination threshold in multiple stages, and may decide the display count so that the higher the threshold that is exceeded, the smaller the display count. Furthermore, the priority determination unit 32 may gradually reduce the display count when the stress value remains at or above the non-display determination threshold for a predetermined period of time.
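A minimal sketch of such a staged mapping (the specific threshold values and counts are assumptions for illustration, not values from the disclosure):

def display_count_for_stress(stress_value, current_count):
    # Multi-stage non-display thresholds: the higher the stage exceeded,
    # the smaller the number of images allowed to remain displayed.
    stages = [(80.0, 1), (60.0, 3), (40.0, 5)]  # (threshold, max display count)
    for threshold, max_count in stages:
        if stress_value >= threshold:
            return min(current_count, max_count)
    return current_count  # below every threshold: keep the current count

print(display_count_for_stress(65.0, current_count=6))  # -> 3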
The priority determination unit 32 also determines which images to display according to the determined display count, based on the priorities assigned to the currently displayed images. For example, assume that the images 21a, 21b, 21c, 21d, and 21e displayed in the display area 23 of FIG. 2 are assigned priorities such that image 21a < image 21b < image 21c < image 21d < image 21e. When the priority determination unit 32 sets the display count to the current count minus one, it determines that the image 21a, which has the lowest priority, is the image to be hidden, and that the other images 21b, 21c, 21d, and 21e are the images whose display is maintained. When further images are to be hidden, the priority determination unit 32 selects them in ascending order of priority: image 21b, image 21c, and so on.
In step S4, the display control unit 33 controls the images to be displayed on the display 11 based on the information determined by the priority determination unit 32 in step S3. That is, when the stress value becomes greater than or equal to the non-display determination threshold, the display control unit 33 hides the currently displayed images starting from the one with the lowest priority.
The display control unit 33 may erase the image to be hidden instantaneously, or may fade it out.
FIG. 4 is an explanatory diagram showing the display area 23 when the low-priority image 21a is hidden from the display area 23 of FIG. 2. In FIG. 4, the low-priority image 21a is not displayed.
FIG. 5 is an explanatory diagram showing another virtual display area 23 formed in front of the vehicle 1. FIG. 6 is an explanatory diagram showing the display area 23 when part of the low-priority image 21f is hidden from the display area 23 of FIG. 5.
The images 21b, 21c, 21d, 21e, and 21f displayed in the display area 23 of FIG. 5 are assigned priorities such that image 21f < image 21b < image 21c < image 21d < image 21e, and the priority determination unit 32 determines the image 21f as the image to be hidden. The image 21f consists of a plurality of arrow-shaped images 22a, 22b, and 22c that guide the driver 4 through a right turn, that is, of a plurality of elements each of which can convey information to the driver 4 on its own. When the image to be hidden is, like the image 21f, made up of elements 22a, 22b, and 22c that can each convey information individually, priorities may be assigned in advance to the individual images 22a, 22b, and 22c, and the display control unit 33 may then hide only part of the image 21f.
Specifically, when the images 22a, 22b, and 22c constituting the low-priority image 21f are assigned priorities such that image 22a < image 22b < image 22c, the priority determination unit 32 may hide only the image 22a, as shown in FIG. 6, instead of hiding the entire image 21f. In this case, when the priority determination unit 32 next determines images to be hidden, it selects them in the order image 22b, image 22c, image 21b, and so on.
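Continuing the illustrative sketch above (identifiers assumed), this element-level ordering amounts to a single hide queue in which the sub-elements of the composite image precede the remaining images:

# Hide order for the FIG. 5 example: sub-elements of image 21f first
# (22a < 22b < 22c), then the other images in ascending priority.
hide_order = ["22a", "22b", "22c", "21b", "21c", "21d", "21e"]

def images_to_hide(how_many):
    # Identifiers hidden when 'how_many' items must be removed from view.
    return hide_order[:how_many]

print(images_to_hide(2))  # -> ['22a', '22b']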
FIG. 7 is an explanatory diagram showing yet another virtual display area 23 formed in front of the vehicle 1. FIG. 8 is an explanatory diagram showing the display area 23 when part of the low-priority image 21g is hidden from the display area 23 of FIG. 7. FIG. 9 is an explanatory diagram showing the display area 23 when a further part of the image 21g of FIG. 8 is hidden. FIG. 10 is an explanatory diagram showing the display area 23 when the image 21g of FIG. 9 is no longer displayed.
The images 21b, 21c, 21d, 21e, and 21g displayed in the display area 23 of FIG. 7 are assigned priorities such that image 21g < image 21b < image 21c < image 21d < image 21e, and the priority determination unit 32 determines the image 21g as the image to be hidden. The image 21g is an image that guides the driver straight ahead against the background of the actual road surface, and is an AR image with a sense of perspective such that the upper part of the display area 23 appears farther away and the lower part appears closer. When the image to be hidden is an AR image with such a sense of perspective, like the image 21g, the display control unit 33 may hide it progressively, from the region superimposed far from the driver 4 (the upper part of the display area 23) toward the region superimposed close to the driver 4 (the lower part of the display area 23). Alternatively, the display control unit 33 may hide it progressively from the region superimposed close to the driver 4 toward the region superimposed far from the driver 4.
Specifically, the priority determination unit 32 may hide the image 21g of FIG. 7 progressively, starting from the upper region that appears on the far side, as shown in FIGS. 8 and 9, and may finally hide the entire image 21g as shown in FIG. 10. Note that, in situations where it is highly necessary for the driver 4 to visually confirm the actual scene around the vehicle 1, it is preferable that the display control unit 33 hide the image instantaneously rather than gradually as in FIGS. 7 to 10. In this way, the display control unit 33 can direct the driver's attention to the actual scene without the image blocking the field of view of the driver 4. Situations where it is highly necessary to visually confirm the actual scene are those in which forward visibility must be ensured, for example when another vehicle, a pedestrian, or an obstacle is detected around the vehicle 1, or when a direction indicator is operating, such as when entering a curve or changing lanes.
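A minimal sketch of this far-to-near progressive hiding, treating the perspective AR image as horizontal bands of the display area (band counts, frame counts, and names are assumptions for illustration):

def visible_bands(total_bands, frames_elapsed, bands_hidden_per_frame=1):
    # Bands are indexed 0 (top of the display area, far side) to
    # total_bands - 1 (bottom, near side); far bands disappear first.
    hidden = min(total_bands, frames_elapsed * bands_hidden_per_frame)
    return list(range(hidden, total_bands))

# With 5 bands, the image vanishes from the far (upper) side over 5 frames.
for frame in range(6):
    print(frame, visible_bands(5, frame))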
Thereafter, the process returns to the stress value acquisition step S1, and the subsequent steps are repeated.
On the other hand, when the priority determination unit 32 determines in step S2 that the stress value is not greater than or equal to the non-display determination threshold (NO in step S2), it determines in step S5 whether the acquired stress value is less than or equal to a threshold for deciding whether to increase the display of images (a second value, hereinafter referred to as the "appearance determination threshold"). The appearance determination threshold is set to a value less than or equal to the non-display determination threshold. To avoid frequent alternation between hiding and showing images, the appearance determination threshold is preferably a value different from and smaller than the non-display determination threshold, so that it provides hysteresis with respect to the non-display determination threshold.
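A hedged sketch of this hysteresis (the threshold values are placeholders, not values from the disclosure): the display count is reduced only above the first threshold, increased only below the second, and left unchanged in between:

NON_DISPLAY_THRESHOLD = 70.0   # first value: hide images at or above this
APPEARANCE_THRESHOLD = 50.0    # second value (<= first): show images at or below this

def display_action(stress_value):
    if stress_value >= NON_DISPLAY_THRESHOLD:
        return "reduce"    # steps S3/S4: hide lowest-priority images
    if stress_value <= APPEARANCE_THRESHOLD:
        return "increase"  # steps S6/S7: restore hidden images by priority
    return "hold"          # between the thresholds: keep the current count

print([display_action(v) for v in (80.0, 60.0, 40.0)])  # -> ['reduce', 'hold', 'increase']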
When the priority determination unit 32 determines that the stress value is not less than or equal to the appearance determination threshold (NO in step S5), it maintains the normal number of displayed images, and the process returns to the stress value acquisition step S1. On the other hand, when the priority determination unit 32 determines that the stress value is less than or equal to the appearance determination threshold (YES in step S5), it determines, in step S6, the number of images to be displayed in the display area 23. That is, when the stress value becomes less than or equal to the appearance determination threshold, the priority determination unit 32 sets the display count to a number increased from the number of images currently displayed.
The priority determination unit 32 may set the display count to the current display count increased by a predetermined number (for example, the current display count plus one), or may set it to a predetermined number (for example, two). The priority determination unit 32 may also set the appearance determination threshold in multiple stages, and may decide the display count so that the smaller the threshold that is crossed, the larger the display count. Furthermore, the priority determination unit 32 may gradually increase the display count when the stress value remains at or below the appearance determination threshold for a predetermined period of time.
The priority determination unit 32 also determines which images to display according to the determined display count, based on the priorities assigned to the images that would normally be displayed but are currently hidden.
In step S7, the display control unit 33 controls the images to be displayed on the display 11 based on the information determined by the priority determination unit 32 in step S6. That is, when the stress value becomes less than or equal to the appearance determination threshold, the display control unit 33 causes the images that would normally be displayed but are currently hidden to reappear in descending order of priority. For example, the display control unit 33 may cause an image to reappear by reversing the hiding procedures of step S4 described with reference to FIG. 2 and FIGS. 4 to 10.
For example, the display control unit 33 may cause an image to appear progressively from the region superimposed close to the driver 4 toward the region superimposed far from the driver 4, or progressively from the region superimposed far from the driver 4 toward the region superimposed close to the driver 4. The display control unit 33 may also cause an image to appear instantaneously in order to provide the driver 4 with necessary information quickly.
 Thereafter, the process returns to stress value acquisition step S1, and the subsequent processing is repeated.
 The HUD 10 of this embodiment thus controls the images displayed in the display area 23 according to the stress value of the driver 4, who is the viewer. The HUD 10 can therefore provide visibility suited to the situation of the driver 4 and avoid obstructing the driver 4's view more than necessary while driving.
 In addition, the HUD 10 decreases the number of displayed images as the stress value increases and increases the number of displayed images as the stress value decreases. The HUD 10 can thereby eliminate the annoyance caused by a large number of images and contribute to reducing the stress of the driver 4.
 Furthermore, by taking the priority assigned to each image into account when hiding images and making them appear, the HUD 10 can reliably provide the driver 4 with the necessary information.
 Moreover, by hiding images and making them appear in a manner suited to the type of image, the HUD 10 can reduce the sense of discomfort that hiding and appearance might otherwise give the driver 4.
 Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications fall within the scope and gist of the invention, and within the scope of the invention described in the claims and its equivalents.
 Although an example has been described in which the projection member of the HUD 10 is the windshield 3, a combiner may be used instead of, or together with, the windshield 3.
1 Vehicle
3 Windshield
4 Driver (viewer)
7 Biometric sensor
8 Posture sensor
9 Camera
10 HUD (display device)
11 Display
12 Plane mirror
13 Concave mirror
15 Housing
21a, 21b, 21c, 21d, 21e, 21f, 21g, 22a, 22b, 22c Image
23 Display area
30 Control unit
31 Stress value acquisition unit
32 Priority determination unit
33 Display control unit
L Display light

Claims (6)

  1.  A display device comprising:
     a display that displays one or more images;
     a stress value acquisition unit that acquires a stress value, which is information regarding the magnitude of stress of a viewer viewing the images; and
     a display control unit that controls the images to be displayed on the display according to the stress value acquired by the stress value acquisition unit.
  2.  The display device according to claim 1, wherein the display control unit decreases the number of the images when the stress value becomes equal to or greater than a first value, and increases the number of the images when the stress value becomes a second value that is equal to or less than the first value.
  3.  The display device according to claim 2, wherein
     each of the images has information regarding a display priority, and
     the display control unit, when decreasing the number of the images, hides the images in ascending order of the priority, and, when increasing the number of the images, makes the images appear in descending order of the priority.
  4.  The display device according to claim 1, wherein the display displays a virtual image superimposed on the actual scene in the field of view of the viewer.
  5.  The display device according to claim 4, wherein the display control unit
     either, when decreasing the number of the images, hides each image to be hidden sequentially from the region superimposed farther from the viewer toward the region superimposed closer to the viewer, and, when increasing the number of the images, makes each image to be shown appear sequentially from the region superimposed closer to the viewer toward the region superimposed farther from the viewer,
     or, when decreasing the number of the images, hides each image to be hidden sequentially from the region superimposed closer to the viewer toward the region superimposed farther from the viewer, and, when increasing the number of the images, makes each image to be shown appear sequentially from the region superimposed farther from the viewer toward the region superimposed closer to the viewer.
  6.  The display device according to claim 1, wherein the stress value acquisition unit acquires the stress value from information calculated from at least one of heart rate variability of the viewer, a driving posture of the viewer, and a face image of the viewer.
PCT/JP2023/014594 2022-04-11 2023-04-10 Display device WO2023199896A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-065393 2022-04-11
JP2022065393 2022-04-11

Publications (1)

Publication Number Publication Date
WO2023199896A1 true WO2023199896A1 (en) 2023-10-19

Family

ID=88329802

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/014594 WO2023199896A1 (en) 2022-04-11 2023-04-10 Display device

Country Status (1)

Country Link
WO (1) WO2023199896A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004125572A (en) * 2002-10-01 2004-04-22 Nissan Motor Co Ltd Navigation system
JP2017056933A (en) * 2015-09-18 2017-03-23 株式会社リコー Information display device, information providing system, mobile device, information display method, and program
JP2017191367A (en) * 2016-04-11 2017-10-19 株式会社デンソー Safe driving support device and safe driving support program
JP2019061559A (en) * 2017-09-27 2019-04-18 本田技研工業株式会社 Display device, display control device and vehicle
JP2021039471A (en) * 2019-09-02 2021-03-11 三菱電機株式会社 Driving support device, driving support method, and driving support program
JP2021059323A (en) * 2019-10-02 2021-04-15 株式会社デンソー Display controller and display control program

Similar Documents

Publication Publication Date Title
JP4353162B2 (en) Vehicle surrounding information display device
JP5056831B2 (en) Head-up display device
US10019965B2 (en) Vehicle display system having a plurality of color temperature settings
CN103608207B (en) Method and display device and corresponding computer program product for the transport condition for showing vehicle
JP2009269551A (en) Display for vehicle
CN110824709B (en) Head-up display
US10185392B2 (en) Method for transmitting information to a driver of a motor vehicle, and adaptive driver assistance system
US20150124097A1 (en) Optical reproduction and detection system in a vehicle
JP5802926B2 (en) Luminance control device for in-vehicle display device, luminance control program, and luminance control method
US20160124224A1 (en) Dashboard system for vehicle
JP2017081456A (en) Display device and display method
JP2018185654A (en) Head-up display device
CN110116619B (en) Method for displaying information in a head-up display HUD of a vehicle
US20150378155A1 (en) Method for operating virtual reality glasses and system with virtual reality glasses
JP7058800B2 (en) Display control device, display control method, and display control program
WO2023199896A1 (en) Display device
JP4720979B2 (en) Vehicle monitoring device
JP7268526B2 (en) VEHICLE DISPLAY CONTROL DEVICE AND VEHICLE DISPLAY SYSTEM
US11790615B2 (en) Marking objects for a vehicle using a virtual element
JP2015501440A (en) Especially display devices for automobiles
US20220074753A1 (en) Method for Representing a Virtual Element
JP5196265B2 (en) Vehicle periphery recognition support device
CN116963926A (en) Improved visual display using augmented reality heads-up display
JP2019031176A (en) Display device for vehicle
JP2020161002A (en) Video display system, driving simulator system, video display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23788314

Country of ref document: EP

Kind code of ref document: A1