WO2021200914A1 - Display control device, head-up display device, and method - Google Patents


Info

Publication number
WO2021200914A1
WO2021200914A1 (PCT/JP2021/013481)
Authority
WO
WIPO (PCT)
Prior art keywords
eye position
display
eye
display parameter
specific state
Prior art date
Application number
PCT/JP2021/013481
Other languages
French (fr)
Japanese (ja)
Inventor
誠 秦 (Makoto Hata)
Original Assignee
日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Publication of WO2021200914A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00: Arrangement of adaptations of instruments
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present disclosure relates to a display control device, a head-up display device, and a method used in a vehicle to superimpose an image on the foreground of the vehicle so that the image is visually recognized together with the foreground.
  • Patent Document 1 describes a head-up display device that detects the viewpoint position of an observer and corrects the display position of an image (an example of a display setting) according to the viewpoint position; it also discloses that, when the viewpoint position cannot be detected, the display position of the image is maintained and display is continued.
  • When the eye position cannot be detected correctly, a difference is assumed to arise between the display setting that the observer should visually recognize (the display setting corresponding to the actual eye position) and the display setting actually executed by the display device (an inappropriate display setting). Even after an image with inappropriate display settings has been displayed in this way, the display control device can update to the appropriate display settings once the eye position sensor correctly detects the actual eye position. In such a case, however, the observer perceives a change from an image with inappropriate display settings to an image with appropriate display settings.
  • The head-up display device disclosed in Patent Document 1 maintains the display position (display setting) of the image when the viewpoint position cannot be detected, that is, it maintains the display setting corresponding to the last viewpoint position that could be detected. If the viewpoint position becomes undetectable after having been detectable, it is highly likely that the viewpoint position has moved to another position; more specifically, it is assumed to be highly likely that the observer's viewpoint position has moved out of the area where it is generally expected to lie (also called the eyellipse or eye box).
  • For example, assume that the observer's viewpoint position enters the eyellipse from outside it (a state in which the viewpoint position cannot be detected), so that the viewpoint position lies at the boundary between the inside and outside of the eyellipse. In this situation, the display setting corresponding to the last viewpoint position that could be detected is applied to the image that is visually recognized; if that last detected viewpoint position and the observer's actual viewpoint position (at the eyellipse boundary) are far apart, the display settings also differ significantly, and there is a risk that the observer will visually recognize an image of low display quality that deviates from the appropriate image.
  • The outline of the present disclosure relates to suppressing deterioration of image display quality. More specifically, it also relates to making it difficult for the observer to visually recognize an image of low display quality even if a specific state occurs in the eye position detection result.
  • The display control device described in the present specification acquires information indicating the observer's eye position and/or information from which the eye position can be estimated. A memory stores a plurality of display parameters, each corresponding to a spatial region. One or more processors set one or more display parameters based on the acquired eye position, determine whether the information about the eye position indicates a specific state, and, when the information about the eye position indicates the specific state, set, based at least on the eye position, the display parameter corresponding to a region closer to the boundary of the eye box than the region corresponding to the display parameter already set.
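As an illustration of this selection logic, the following Python sketch stores one display-parameter set per spatial region and, in the specific state, falls back to a region closer to the eye-box boundary than the one already set. All region bounds, parameter names, and values here are hypothetical placeholders, not taken from the specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Region:
    """A vertical slice of eye-position space (units: mm from the eye-box center).
    Bounds are illustrative values, not from the specification."""
    y_min: float
    y_max: float

    @property
    def center(self) -> float:
        return (self.y_min + self.y_max) / 2

# Hypothetical table: one display-parameter set (here a vertical image offset
# in pixels) per spatial region, as the memory described in the text stores.
DISPLAY_PARAMS = {
    Region(-60.0, -20.0): {"image_offset_y": +12},
    Region(-20.0, +20.0): {"image_offset_y": 0},
    Region(+20.0, +60.0): {"image_offset_y": -12},
}

def params_for(eye_y: float) -> dict:
    """Normal operation: look up the region containing the eye position."""
    for region, params in DISPLAY_PARAMS.items():
        if region.y_min <= eye_y < region.y_max:
            return params
    raise ValueError("eye position outside all regions")

def select_params(eye_y: float, specific_state: bool, current: Region) -> dict:
    """In the specific state, set the display parameter of a region closer to
    the eye-box boundary (farther from center) than the one already set,
    on the side of the last known eye position."""
    if not specific_state:
        return params_for(eye_y)
    candidates = [
        (region, params) for region, params in DISPLAY_PARAMS.items()
        if abs(region.center) > abs(current.center)  # nearer the boundary
        and region.center * eye_y >= 0               # same side as last eye position
    ]
    if not candidates:
        return params_for(eye_y)  # already in the boundary-most region
    region, params = min(candidates, key=lambda rp: abs(rp[0].center - eye_y))
    return params
```

For example, with the last eye position at y = -50 mm and the center region currently set, the specific state selects the lower boundary region's parameters.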
  • FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle according to some embodiments.
  • FIG. 2 is a diagram showing a configuration of a head-up display device according to some embodiments.
  • FIG. 3 is a block diagram of a vehicle display system according to some embodiments.
  • FIG. 4A is a flow chart showing a method of performing an operation of setting display parameters based on the observer's eye position according to some embodiments.
  • FIG. 4B is a flow chart showing a method following FIG. 4A.
  • The upper diagram of FIG. 5A is a diagram for explaining the arrangement of spatial regions corresponding to display parameters in some embodiments, and the lower diagram of FIG. 5A corresponds to the upper diagram of FIG. 5A.
  • FIG. 5B is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 5C is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 6 is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 7 is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIGS. 1 to 7 provide a description of the configuration and operation of an exemplary vehicle display system.
  • The present invention is not limited to the following embodiments (including the contents of the drawings); changes (including deletion of components) can of course be made to the following embodiments. In the following description, descriptions of known technical matters are omitted as appropriate to facilitate understanding of the present invention.
  • the image display unit 20 in the vehicle display system 10 is a head-up display (HUD: Head-Up Display) device provided in the dashboard 5 of the vehicle 1.
  • The image display unit 20 emits the display light 40 toward the front windshield 2 (an example of the projected portion), and the front windshield 2 reflects the display light 40 toward the eye box 200. By placing the eyes within the eye box 200, the observer can visually recognize the virtual image V of the image M displayed by the image display unit 20 (see FIG. 2) at a position overlapping the foreground, which is the real space visually recognized through the front windshield 2.
  • In the following description, the left-right direction of the vehicle 1 is the X-axis direction (the left side when facing the front of the vehicle 1 is the positive X direction), the vertical direction is the Y-axis direction (the upper side of the vehicle 1 traveling on the road surface is the positive Y direction), and the front-rear direction of the vehicle 1 is the Z-axis direction (the front of the vehicle 1 is the positive Z direction).
  • The "eye box" used in the description of the present embodiment is (1) a region inside which the entire virtual image V of the image M can be visually recognized and outside which at least part of the virtual image V is not visible, (2) a region inside which at least part of the virtual image V can be visually recognized and outside which part of the virtual image V is not visible, (3) a region inside which the virtual image V can be visually recognized at or above a predetermined brightness and outside which the entire virtual image V falls below the predetermined brightness, or (4) when the image display unit 20 can display a stereoscopically viewable virtual image V, a region inside which at least part of the virtual image V can be viewed stereoscopically and outside which part of the virtual image V cannot be viewed stereoscopically. That is, when the observer places the eyes (both eyes) outside the eye box 200, the observer cannot see the entire virtual image V of the image M, the visibility of the entire virtual image V is so low that it is difficult to perceive, or the virtual image V cannot be viewed stereoscopically.
  • the predetermined brightness is, for example, about 1/50 of the brightness of the virtual image of the image M visually recognized at the center of the eye box.
  • The "eye box" is set to coincide with the region where the observer's viewpoint position is expected to lie in the vehicle on which the HUD device 20 is mounted (also referred to as the eyellipse), or to include most of the eyellipse (for example, 80% or more).
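The membership test implied by these definitions can be sketched as follows. The numeric extent of the eye box is a made-up placeholder, since the text gives no dimensions.

```python
# Hypothetical eye-box extent around its center 205 (units: mm); the actual
# extent depends on the HUD design and is not specified numerically in the text.
EYEBOX_X = (-65.0, 65.0)   # vehicle left-right (X-axis direction)
EYEBOX_Y = (-40.0, 40.0)   # vertical (Y-axis direction)

def in_eyebox(eye_x: float, eye_y: float) -> bool:
    """True if a single eye position lies inside the eye box 200."""
    return (EYEBOX_X[0] <= eye_x <= EYEBOX_X[1]
            and EYEBOX_Y[0] <= eye_y <= EYEBOX_Y[1])

def whole_virtual_image_visible(left_eye, right_eye) -> bool:
    """Rough reading of definition (1): with both eyes outside the eye box,
    the observer cannot see the entire virtual image V."""
    return in_eyebox(*left_eye) or in_eyebox(*right_eye)
```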
  • The display area 100 is a planar, curved, or partially curved region in which the image M generated inside the image display unit 20 is imaged as the virtual image V, and is also called the image-forming surface. The display area 100 is the position at which the display surface 21a (for example, the exit surface of the liquid crystal display panel) of the display 21 of the image display unit 20, described later, is imaged as a virtual image; that is, the display area 100 corresponds to the display surface 21a of the image display unit 20 (in other words, the display area 100 has a conjugate relationship with the display surface 21a of the display 21), and the virtual image visually recognized in the display area 100 can be said to correspond to the image displayed on the display surface 21a. The display area 100 itself preferably has such low visibility that it is not actually visible, or is difficult to see, to the observer's eyes.
  • For the display area 100, the angle formed with the horizontal plane (XZ plane) about the left-right direction (X-axis direction) of the vehicle 1 is defined as the tilt angle θt (see FIG. 1). In addition, the angle formed by the line connecting the center 205 of the eye box 200 to the upper end 101 of the display area 100 and the line connecting the center 205 to the lower end 102 of the display area 100 is defined as the vertical angle of view of the display area 100, and the angle formed by the bisector of this vertical angle with the horizontal plane (XZ plane) is defined as the vertical arrangement angle θv (see FIG. 1).
  • The display area 100 of the present embodiment has a tilt angle θt of approximately 90 [degree] so that it substantially faces the observer who is facing forward (the positive Z direction). However, the tilt angle θt is not limited to this and can be changed within the range 0 ≤ θt ≤ 90 [degree]; for example, the tilt angle θt may be set to 60 [degree], and the display area 100 may be arranged so that its upper region is farther from the observer than its lower region.
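The vertical angle of view and the vertical arrangement angle θv defined above can be computed with elementary trigonometry. The following sketch uses illustrative coordinates; the specification gives no numeric geometry.

```python
import math

def display_area_angles(top, bottom):
    """Compute the vertical angle of view of the display area 100 and its
    vertical arrangement angle, following the definitions in the text:
    the angle between the lines from the eye-box center 205 to the upper
    end 101 and to the lower end 102, and the angle of that angle's
    bisector with the horizontal (XZ) plane. Points are (z, y) in mm
    with the eye-box center at the origin."""
    def elevation(point):
        z, y = point
        return math.atan2(y, z)  # angle above the horizontal plane
    a_top, a_bottom = elevation(top), elevation(bottom)
    vertical_angle_of_view = math.degrees(a_top - a_bottom)
    arrangement_angle = math.degrees((a_top + a_bottom) / 2)
    return vertical_angle_of_view, arrangement_angle
```

For instance, a display area whose ends sit 100 mm above and below the horizontal at 1000 mm distance has a vertical angle of view of about 11.4 degrees and an arrangement angle of 0 degrees.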
  • FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment.
  • the HUD device 20 includes a display 21 having a display surface 21a for displaying the image M, and a relay optical system 25.
  • the display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24.
  • The display surface 21a is the surface of the liquid crystal display panel 22 on the viewing side and emits the display light 40 of the image M. The angle of the display area 100 (including the tilt angle θt) can be set by setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40, which travels from the center of the display surface 21a toward the eye box 200 (the center 205 of the eye box 200) via the relay optical system 25 and the projected portion.
  • The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (the light traveling from the display 21 toward the eye box 200) and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20.
  • the relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.
  • the first mirror 26 has, for example, a free curved surface shape having positive optical power.
  • The first mirror 26 may have a curved shape whose optical power differs for each region; that is, the optical power added to the display light 40 may differ according to the region (optical path) through which the display light 40 passes. Specifically, the optical power added by the relay optical system 25 may differ among the first image light 41, the second image light 42, and the third image light 43 (see FIG. 2) traveling from the respective regions of the display surface 21a toward the eye box 200.
  • The second mirror 27 is, for example, a flat mirror, but is not limited to this and may have a curved surface with optical power. That is, by combining a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment), the relay optical system 25 may add optical power that differs according to the region (optical path) through which the display light 40 passes.
  • The second mirror 27 may be omitted; that is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 onto the projected portion (front windshield) 2.
  • In the present embodiment, the relay optical system 25 includes two mirrors, but it is not limited to this; in addition to or instead of these, it may include one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or combinations thereof.
  • The relay optical system 25 of the present embodiment has, by virtue of its curved surface shape (an example of optical power), a function of setting the distance to the display area 100 and a function of generating a virtual image obtained by enlarging the image displayed on the display surface 21a; in addition to these, it may have a function of suppressing (correcting) distortion of the virtual image that may occur due to the curved shape of the front windshield 2.
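The distance-setting function mentioned here depends on the net optical power of the combined elements. As a generic optics identity offered for illustration (not a formula from the specification), two thin elements of powers p1 and p2 separated by distance d combine to a net power of p1 + p2 - d*p1*p2:

```python
def combined_power(p1: float, p2: float, d_m: float) -> float:
    """Net optical power (diopters) of two thin optical elements with
    powers p1 and p2 (diopters) separated by d_m meters. Illustrates how
    combining mirrors/lenses, as the text describes, changes the power
    added to the display light and hence the virtual-image distance."""
    return p1 + p2 - d_m * p1 * p2
```

For example, elements of 2 D and 3 D separated by 0.1 m combine to 4.4 D rather than 5 D, so the separation itself is a design parameter for the virtual-image distance.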
  • Further, the relay optical system 25 may be rotatable, with actuators 28 and 29 controlled by the display control device 30 attached to it.
  • The liquid crystal display panel 22 receives light from the light source unit 24 and emits spatially light-modulated display light 40 toward the relay optical system 25 (the second mirror 27).
  • the liquid crystal display panel 22 has, for example, a rectangular shape whose short side is the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V seen from the observer are arranged.
  • the observer visually recognizes the transmitted light of the liquid crystal display panel 22 via the virtual image optical system 90.
  • the virtual image optical system 90 is a combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.
  • the light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).
  • the light source (not shown) is, for example, a plurality of chip-type LEDs, and emits illumination light to a liquid crystal display panel (an example of a spatial light modulation element) 22.
  • the light source unit 24 is composed of, for example, four light sources, and is arranged in a row along the long side of the liquid crystal display panel 22.
  • the light source unit 24 emits illumination light toward the liquid crystal display panel 22 under the control of the display control device 30.
  • the configuration of the light source unit 24 and the arrangement of the light sources are not limited to this.
  • The illumination optical system is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light of the light source unit 24 and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.
  • the display 21 may be a self-luminous display or a projection type display that projects an image on a screen.
  • the display surface 21a is the screen of the projection type display.
  • the display 21 may be attached with an actuator (not shown) including a motor controlled by the display control device 30, and may be able to move and / or rotate the display surface 21a.
  • the relay optical system 25 has two rotation axes (first rotation axis AX1 and second rotation axis AX2) that move the eyebox 200 in the vertical direction (Y-axis direction).
  • Each of the first rotation axis AX1 and the second rotation axis AX2 is set so as not to be perpendicular to the left-right direction (X-axis direction) of the vehicle 1 (in other words, not parallel to the YZ plane) in the state where the HUD device 20 is attached to the vehicle 1. Specifically, the angles of the first rotation axis AX1 and the second rotation axis AX2 with respect to the left-right direction (X-axis direction) of the vehicle 1 are set to less than 45 [degree], and more preferably to less than 20 [degree].
  • With rotation of the relay optical system 25 about the first rotation axis AX1, the amount of vertical movement of the display area 100 is relatively small while the amount of vertical movement of the eye box 200 is relatively large; with rotation about the second rotation axis AX2, the amount of vertical movement of the display area 100 is relatively large while the amount of vertical movement of the eye box 200 is relatively small. That is, comparing the first rotation axis AX1 and the second rotation axis AX2, the ratio "vertical movement of the eye box 200 / vertical movement of the display area 100" due to rotation about the first rotation axis AX1 is larger than that due to rotation about the second rotation axis AX2; in other words, the relative amounts of vertical movement of the display area 100 and the eye box 200 differ between rotation of the relay optical system 25 about the first rotation axis AX1 and rotation about the second rotation axis AX2.
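Because each axis moves the eye box and the display area by different relative amounts, the rotations needed to realize a desired pair of movements can be found by inverting a 2x2 sensitivity matrix. The sensitivities below are invented placeholders that merely respect the text's qualitative description (AX1 mainly moves the eye box, AX2 mainly moves the display area):

```python
# Hypothetical per-degree sensitivities (mm of vertical movement per degree
# of rotation) for each axis:      (eye box, display area)
S_AX1 = (8.0, 1.0)   # AX1: large eye-box movement, small display-area movement
S_AX2 = (1.0, 6.0)   # AX2: small eye-box movement, large display-area movement

def solve_rotations(d_eyebox_mm: float, d_area_mm: float):
    """Solve the 2x2 linear system for the rotations (deg) about AX1 and AX2
    that realize the requested vertical movements of the eye box 200 and
    the display area 100."""
    a, b = S_AX1[0], S_AX2[0]   # eye-box row
    c, d = S_AX1[1], S_AX2[1]   # display-area row
    det = a * d - b * c
    r1 = (d_eyebox_mm * d - b * d_area_mm) / det
    r2 = (a * d_area_mm - d_eyebox_mm * c) / det
    return r1, r2
```

With these placeholder sensitivities, moving the eye box 47 mm while keeping the display area fixed requires +6 degrees about AX1 and -1 degree about AX2.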
  • The HUD device 20 includes a first actuator 28 that rotates the first mirror 26 about the first rotation axis AX1 and a second actuator 29 that rotates the first mirror 26 about the second rotation axis AX2; that is, the HUD device 20 rotates one optical member of the relay optical system 25 about two axes (the first rotation axis AX1 and the second rotation axis AX2). The first actuator 28 and the second actuator 29 may also be composed of one integrated two-axis actuator.
  • The HUD device 20 in another embodiment rotates two optical members of the relay optical system 25 about the two axes (the first rotation axis AX1 and the second rotation axis AX2); in that case, the HUD device 20 may include a first actuator 28 that rotates the first mirror 26 about the first rotation axis AX1 and a second actuator 29 that rotates the second mirror 27 about the second rotation axis AX2.
  • The HUD device 20 in yet another embodiment need not drive the relay optical system 25; that is, the HUD device 20 may have no actuator that moves and/or rotates the relay optical system 25.
  • In this case, the HUD device 20 of this embodiment may include a wide eye box 200 that covers the range of driver eye heights over which the vehicle 1 is expected to be used.
  • Based on control by the display control device 30 described later, the image display unit (head-up display device) 20 displays images near real objects 300 existing in the foreground, which is the real space (actual scene) visually recognized through the front windshield 2 of the vehicle 1, such as the road surface 310 of the traveling lane (see FIG. 1), branch roads, road signs, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), and features (buildings, bridges, etc.), thereby allowing the observer (typically an observer seated in the driver's seat of the vehicle 1) to perceive visual augmented reality (AR).
  • In the present specification, an image whose display position can be changed according to the position of a real object existing in the real scene is defined as an AR image, and an image whose display position is set regardless of the position of a real object is defined as a non-AR image.
  • FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments.
  • the display control device 30 includes one or more I / O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37.
  • the various functional blocks described in FIG. 3 may consist of hardware, software, or a combination of both.
  • FIG. 3 shows only one embodiment; the illustrated components may be combined into fewer components, and there may be additional components.
  • the image processing circuit 35 (for example, a graphic processing unit) may be included in one or more processors 33.
  • The processor 33 and the image processing circuit 35 are operably connected to the memory 37. More specifically, by executing a program stored in the memory 37, the processor 33 and the image processing circuit 35 can, for example, generate and/or transmit image data and thereby perform the operations of the vehicle display system 10 (image display unit 20).
  • The processor 33 and/or the image processing circuit 35 may include at least one general-purpose microprocessor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • The memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, and any type of semiconductor memory such as volatile memory and non-volatile memory; the volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
  • the processor 33 is operably connected to the I / O interface 31.
  • The I/O interface 31 communicates with, for example, the vehicle ECU 401 described later and other electronic devices provided in the vehicle (reference numerals 403 to 419 described later) in accordance with the CAN (Controller Area Network) standard (also referred to as CAN communication). The communication standard adopted by the I/O interface 31 is not limited to CAN; it includes in-vehicle communication (internal communication) interfaces such as wired communication interfaces, for example CAN FD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, and USB, and short-range wireless communication interfaces with a range of several tens of meters, for example a personal area network (PAN) such as Bluetooth (registered trademark) and a local area network (LAN) such as 802.11x Wi-Fi (registered trademark).
  • Further, the I/O interface 31 may include an external communication (extra-vehicle communication) interface to a wide-area communication network (for example, an Internet communication network) conforming to a cellular communication standard such as a wireless wide area network (WWAN), IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access), IEEE 802.16e-based Mobile WiMAX, 4G, 4G-LTE, LTE Advanced, or 5G.
  • The processor 33 is interoperably connected to the I/O interface 31 and can thereby exchange information with the various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31).
  • To the I/O interface 31, for example, the vehicle ECU 401, a road information database 403, an own-vehicle position detection unit 405, a vehicle exterior sensor 407, an operation detection unit 409, an eye position detection unit 411, an IMU 413, a line-of-sight direction detection unit 415, a mobile information terminal 417, an external communication device 419, and the like are operably connected.
  • the I / O interface 31 may include a function of processing (converting, calculating, analyzing) information received from another electronic device or the like connected to the vehicle display system 10.
  • the display 21 is operably connected to the processor 33 and the image processing circuit 35. Therefore, the image displayed by the image display unit 20 may be based on the image data received from the processor 33 and / or the image processing circuit 35.
  • the processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I / O interface 31.
  • The vehicle ECU 401 acquires the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitch angle), and vehicle vibration (including the magnitude, frequency, and/or period of the vibration)) from sensors and switches provided on the vehicle 1, and collects and manages (and may also control) the state of the vehicle 1; as part of its functions, it can output numerical values of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30.
  • In addition to, or instead of, simply transmitting numerical values detected by the sensors and the like to the processor 33 (for example, a pitch angle of 3 [degree] in the forward-leaning direction), the vehicle ECU 401 may transmit to the processor 33 a determination result based on one or more states of the vehicle 1 including the numerical values detected by the sensors (for example, that the vehicle 1 satisfies a predetermined condition of the forward-leaning state) and/or an analysis result (for example, that, combined with information on the brake pedal opening, braking has caused the vehicle to lean forward).
  • the vehicle ECU 401 may output a signal indicating a determination result indicating that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401 to the display control device 30.
  • The I/O interface 31 may acquire the above-mentioned information directly from the sensors and switches provided in the vehicle 1 without going through the vehicle ECU 401.
  • The vehicle ECU 401 may output to the display control device 30 an instruction signal indicating an image to be displayed by the vehicle display system 10; at this time, the coordinates, size, type, and display mode of the image, the notification necessity of the image, and/or necessity-related information on which the determination of notification necessity is based may be added to the instruction signal and transmitted.
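As an illustration of how such numerical vehicle-state values travel over CAN, the following sketch packs and unpacks a vehicle-speed signal. The message ID, byte layout, and scaling are invented for illustration (real layouts are manufacturer-specific) and are not taken from this document.

```python
import struct

# Hypothetical CAN frame layout for a vehicle-state message from the
# vehicle ECU 401. Real IDs and scalings come from a manufacturer's
# DBC database; these are placeholders.
SPEED_MSG_ID = 0x1A0

def decode_vehicle_speed(data: bytes) -> float:
    """Bytes 0-1: vehicle speed, little-endian uint16, 0.01 km/h per bit."""
    raw, = struct.unpack_from("<H", data, 0)
    return raw * 0.01

def encode_vehicle_speed(speed_kmh: float) -> bytes:
    """Inverse of decode_vehicle_speed, for test/simulation purposes."""
    return struct.pack("<H", round(speed_kmh / 0.01))
```

A display controller receiving such frames would dispatch on the arbitration ID and decode only the signals it needs.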
  • The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1 or in an external server connected to the vehicle 1 via the external communication interface (I/O interface 31). Based on the position of the vehicle 1 acquired from the own-vehicle position detection unit 405 described later, it may read out road information around the vehicle 1 (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (buildings, bridges, rivers, etc.), including presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information, and transmit it to the processor 33. Further, the road information database 403 may calculate an appropriate route (navigation information) from the departure point to the destination and output a signal indicating the navigation information, or image data indicating the route, to the processor 33.
  • The own-vehicle position detection unit 405 is, for example, a GNSS (Global Navigation Satellite System) receiver provided in the vehicle 1; it detects the current position and orientation of the vehicle 1 and outputs a signal indicating the detection result, via the processor 33 or directly, to the road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419. The road information database 403, the mobile information terminal 417, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the own-vehicle position detection unit 405 continuously, intermittently, or at predetermined events, and may select and generate information around the vehicle 1 and output it to the processor 33.
  • the vehicle exterior sensor 407 detects real objects existing around the vehicle 1 (front, side, and rear).
  • The real objects detected by the vehicle exterior sensor 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), the road surface of the traveling lane described later, lane markings, roadside objects, and/or features (buildings, etc.).
  • the vehicle exterior sensor 407 consists of one or more detection units, each composed of a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, a camera, or a combination thereof, and a processing device that processes (data-fuses) the detection data from the one or more detection units.
  • One or more vehicle exterior sensors 407 detect real objects in front of the vehicle 1 in each detection cycle of each sensor and can output real object information (an example of real object information: the presence or absence of a real object and, if a real object exists, information such as the position, size, and/or type of each real object) to the processor 33. Alternatively, these pieces of real object information may be transmitted to the processor 33 via another device (for example, the vehicle ECU 401).
  • When a camera is used, an infrared camera or a near-infrared camera is desirable so that a real object can be detected even when the surroundings are dark, such as at night.
  • In addition, a stereo camera capable of acquiring distance and the like from parallax is desirable.
  • the operation detection unit 409 is, for example, a CID (Center Information Display) of the vehicle 1, a hardware switch provided on the instrument panel, or a software switch combining an image and a touch sensor; it outputs to the processor 33 operation information based on operations by an occupant.
  • For example, the operation detection unit 409 outputs to the processor 33 display-area setting information based on a user operation of moving the display area 100, eyebox setting information based on a user operation of moving the eyebox 200, and information based on a user operation of setting the observer's eye position 700.
  • the eye position detection unit 411 may include a camera such as an infrared camera that detects the eye position 700 of the observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • the processor 33 may acquire a captured image (an example of information from which the eye position 700 can be estimated) from the eye position detection unit 411 and analyze the captured image by a technique such as pattern matching to detect the coordinates of the observer's eye position 700; alternatively, the eye position detection unit 411 may detect the coordinates of the eye position 700 and output a signal indicating the detected coordinates to the processor 33.
  • the eye position detection unit 411 may output to the processor 33 a signal indicating the result of analyzing the image captured by the camera (for example, to which of the spatial regions corresponding to a plurality of preset display parameters 600 (described later) the observer's eye position 700 belongs).
  • the method of acquiring the observer's eye position 700 of the vehicle 1, or information from which the eye position 700 can be estimated, is not limited to these; the eye position may be obtained using a known eye position detection (estimation) technique.
  • the eye position detection unit 411 may detect the moving speed and/or moving direction of the observer's eye position 700 and output a signal indicating the moving speed and/or moving direction of the observer's eye position 700 to the processor 33.
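As a non-limiting illustration (not part of the claimed embodiment), the moving speed and moving direction of the eye position 700 described above could be derived from two timestamped observation positions roughly as follows; the sample format `(t_seconds, x_mm, y_mm)` and the units are assumptions made for the sketch:

```python
import math

def eye_velocity(prev, curr):
    """Estimate the movement speed (mm/s) and direction (unit vector) of the
    eye position from two timestamped samples.

    prev, curr: (t_seconds, x_mm, y_mm) observation tuples (hypothetical format).
    Returns (speed, (dx_unit, dy_unit)); direction is (0.0, 0.0) when stationary.
    """
    dt = curr[0] - prev[0]
    if dt <= 0:
        raise ValueError("samples must be in chronological order")
    dx, dy = curr[1] - prev[1], curr[2] - prev[2]
    dist = math.hypot(dx, dy)
    speed = dist / dt
    if dist == 0:
        return 0.0, (0.0, 0.0)
    return speed, (dx / dist, dy / dist)
```

A unit such as the eye position detection unit 411 could then forward the computed speed and direction to the processor 33.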
  • when the eye position detection unit 411 detects (10) a signal indicating that the observer's eye position 700 is outside the eyebox 200, (20) a signal from which it is estimated that the observer's eye position 700 is outside the eyebox 200, or (30) a signal from which it is predicted that the observer's eye position 700 will be outside the eyebox 200, it may determine that a predetermined condition is satisfied and output a signal indicating that state to the processor 33.
  • the signal from which it is estimated that the observer's eye position 700 is outside the eyebox 200 includes, for example, (21) a signal indicating that the observer's eye position 700 cannot be detected, (22) a signal indicating that the observer's eye position 700 cannot be detected after movement of the eye position 700 was detected, and/or (23) a signal indicating that either of the observer's eye positions 700R or 700L is near the boundary 200A of the eyebox 200 ("near" includes, for example, being within a predetermined coordinate range from the boundary 200A).
  • the signal from which it is predicted that the observer's eye position 700 will be outside the eyebox 200 includes, for example, (31) a signal indicating that the newly detected eye position 700 is separated from the previously detected eye position 700 by at least an eye-position movement-distance threshold stored in advance in the memory 37.
  • the IMU 413 can include one or more sensors (for example, a combination of accelerometers and gyroscopes) configured to detect the position and orientation of the vehicle 1, and changes thereof (speed of change, acceleration of change), based on inertial acceleration.
  • the IMU 413 may output to the processor 33 the detected values (the detected values include signals indicating the position and orientation of the vehicle 1 and changes thereof (change speed, change acceleration), and the like) and the results of analyzing the detected values.
  • the result of the analysis is, for example, a signal indicating whether the detected value satisfies a predetermined condition; for example, it may be a signal indicating, from a value relating to a change (change speed, change acceleration) of the position or orientation of the vehicle 1, that the behavior (vibration) of the vehicle 1 is small.
  • the line-of-sight direction detection unit 415 includes an infrared camera or a visible light camera that captures the face of an observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • the processor 33 may acquire a captured image (an example of information from which the line-of-sight direction can be estimated) from the line-of-sight direction detection unit 415 and analyze the captured image to identify the observer's line-of-sight direction (and/or gaze position).
  • the line-of-sight direction detection unit 415 may analyze the captured image from the camera and output a signal indicating the line-of-sight direction (and / or the gaze position) of the observer, which is the analysis result, to the processor 33.
  • the method for acquiring information from which the line-of-sight direction of the observer of the vehicle 1 can be estimated is not limited to these; the line-of-sight direction may be obtained using other known line-of-sight detection (estimation) techniques such as the EOG (Electro-oculogram) method, the corneal reflection method, the scleral reflection method, the Purkinje-image detection method, the search coil method, or the infrared fundus camera method.
  • the mobile information terminal 417 is a smartphone, a laptop computer, a smart watch, or other information device that can be carried by an observer (or another occupant of vehicle 1).
  • the I/O interface 31 can communicate with the portable information terminal 417 by pairing with it, and acquires data recorded in the portable information terminal 417 (or in a server accessed through the portable information terminal).
  • the portable information terminal 417 may, for example, have the same functions as the above-mentioned road information database 403 and own-vehicle position detection unit 405, acquire the road information (an example of real-object-related information), and transmit it to the processor 33.
  • the portable information terminal 417 may acquire commercial information (an example of real-object-related information) relating to commercial facilities in the vicinity of the vehicle 1 and transmit it to the processor 33.
  • the portable information terminal 417 may transmit schedule information of its owner (for example, the observer), incoming-call information, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.
  • the external communication device 419 is a communication device that exchanges information with the vehicle 1: for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a portable information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P: Vehicle To Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I: Vehicle To roadside Infrastructure); in a broad sense, it includes everything connected to the vehicle 1 by communication (V2X: Vehicle To Everything).
  • the external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (preceding vehicles, etc.), road surfaces, lane markings, roadside objects, and/or features (buildings, etc.), and output them to the processor 33. Further, the external communication device 419 may have the same function as the own-vehicle position detection unit 405 described above, acquiring the position information of the vehicle 1 and transmitting it to the processor 33; it may also have the function of the road information database 403 described above, acquiring the road information (an example of real-object-related information) and transmitting it to the processor 33. The information acquired from the external communication device 419 is not limited to the above.
  • the software components stored in the memory 37 are the eye position detection module 502, the eye position estimation module 504, the eye position prediction module 506, the eye position state determination module 508, the display parameter setting module 510, the graphic module 512, the light source drive module 514, and the actuator drive module 516.
  • FIGS. 4A and 4B are flow charts showing a method S100 for executing an operation of setting display parameters based on the eye position of an observer according to some embodiments.
  • Method S100 is executed by an image display unit 20 including a display and a display control device 30 that controls the image display unit 20. Some actions in method S100 are optionally combined, some steps are optionally modified, and some actions are optionally omitted.
  • the method S100 relates, in particular, to the display control of an image (virtual image) when the observer's eye position is in an unusual specific state (abnormality), or when the eye position detection is in an unusual specific state (abnormality).
  • the eye position detection module 502 of FIG. 3 detects the eye position 700 of the observer (S110).
  • the eye position detection module 502 may detect coordinates indicating the height of the observer's eyes (the position in the Y-axis direction, an example of a signal indicating the eye position 700), coordinates indicating the height of the observer's eyes and their position in the depth direction (the positions in the Y-axis and Z-axis directions, an example of a signal indicating the eye position 700), and/or coordinates indicating the observer's eye position 700 (the positions in the X-, Y-, and Z-axis directions, an example of a signal indicating the eye position 700).
  • the eye position 700 detected by the eye position detection module 502 includes one of the right eye position 700R and the left eye position 700L; both the right eye position 700R and the left eye position 700L; whichever of the right eye position 700R and the left eye position 700L is detectable (easy to detect); or a position calculated from the right eye position 700R and the left eye position 700L (for example, the midpoint between the right eye position and the left eye position). For example, the eye position detection module 502 determines the eye position 700 based on the observation position acquired from the eye position detection unit 411 immediately before the timing of updating the display settings.
  • the eye position detection module 502 may detect the movement direction and/or movement speed of the observer's eye position 700 based on a plurality of eye observation positions acquired from the eye position detection unit 411, and output a signal indicating the movement direction and/or movement speed of the observer's eye position 700 to the processor 33.
  • the eye position estimation module 504 acquires information capable of estimating the eye position (S114).
  • Information from which the eye position can be estimated is, for example, a captured image from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the observer's sitting height, or a plurality of eye observation positions.
  • the eye position estimation module 504 estimates the eye position 700 of the observer of the vehicle 1 from the information capable of estimating the eye position.
  • the eye position estimation module 504 includes various software components for performing various operations related to estimating the observer's eye position 700, such as estimating the observer's eye position 700 based on the captured image acquired from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the observer's sitting height, a plurality of eye observation positions, and the like. That is, the eye position estimation module 504 may include table data, arithmetic expressions, and the like for estimating the observer's eye position 700 from the information from which the eye position can be estimated.
  • the eye position detection module 502 may calculate the eye position 700 by, for example, a method such as a weighted average, based on the eye observation position acquired from the eye position detection unit 411 immediately before the timing of updating the display parameters and one or more eye observation positions acquired in the past.
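As a non-limiting illustration, the weighted-average calculation mentioned above might be sketched as follows; the observation format and the weight values are hypothetical (in practice, newer observations would typically be weighted more heavily):

```python
def weighted_eye_position(observations, weights):
    """Smooth the eye position as a weighted average of the most recent
    observation and past observations.

    observations: list of (x, y) eye observation positions, newest first.
    weights: matching list of non-negative weights (hypothetical values).
    """
    if len(observations) != len(weights) or not observations:
        raise ValueError("need exactly one weight per observation")
    total = sum(weights)
    x = sum(w * p[0] for p, w in zip(observations, weights)) / total
    y = sum(w * p[1] for p, w in zip(observations, weights)) / total
    return x, y
```

For example, weighting the newest of two observations three times as heavily pulls the smoothed position toward the newest sample while damping single-frame noise.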
  • the eye position prediction module 506 acquires information that can predict the eye position 700 of the observer (S116).
  • the information that can predict the eye position 700 of the observer is, for example, the latest observation position acquired from the eye position detection unit 411, or one or more observation positions acquired in the past.
  • the eye position prediction module 506 includes various software components for performing various actions related to predicting the eye position 700 based on predictable information on the observer's eye position 700. Specifically, for example, the eye position prediction module 506 predicts the observer's eye position 700 at the timing when the image to which the new display setting is applied is visually recognized by the observer.
  • the eye position prediction module 506 may predict the next value from one or more past observation positions using a prediction algorithm such as the least-squares method, a Kalman filter, an α-β filter, or a particle filter.
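Of the prediction algorithms named above, the α-β filter is among the simplest; a one-axis sketch, with hypothetical gain values `alpha`, `beta` and update interval `dt` (not values from the source), could look like this:

```python
class AlphaBetaFilter:
    """One-axis alpha-beta filter: tracks position x and velocity v, and
    extrapolates the eye position to the next display update."""

    def __init__(self, x0, alpha=0.85, beta=0.005, dt=1.0):
        self.x, self.v = x0, 0.0
        self.alpha, self.beta, self.dt = alpha, beta, dt

    def update(self, z):
        # Predict ahead by dt, then correct with the residual to measurement z.
        x_pred = self.x + self.v * self.dt
        r = z - x_pred
        self.x = x_pred + self.alpha * r
        self.v = self.v + (self.beta / self.dt) * r
        return self.x

    def predict_next(self):
        # Extrapolated eye position at the next display update.
        return self.x + self.v * self.dt
```

A Kalman filter would replace the fixed gains with gains computed from modeled noise covariances; the α-β filter fixes them for simplicity.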
  • the eye position state determination module 508 determines whether the observer's eye position 700 is in a specific state (S130).
  • For example, the eye position state determination module 508 (10) determines whether the observer's eye position 700 can be detected and, if it cannot, determines that the specific state exists; (20) determines whether the observer's eye position 700 is outside the eyebox 200 and, if it is, determines that the specific state exists; (30) determines whether it can be estimated that the observer's eye position 700 is outside the eyebox 200 and, if so, determines that the specific state exists; and/or (40) determines whether it is predicted that the observer's eye position 700 will be outside the eyebox 200 and, if so, determines that the specific state exists.
  • the eye position state determination module 508 may include table data, an arithmetic expression, and the like for determining whether or not the eye position is in a specific state from the detection information, estimation information, or prediction information of the eye position 700.
  • the method of determining whether the observer's eye position 700 can be detected includes, for example, determining that the eye position 700 cannot be detected (that the observer's eye position 700 is in the specific state) when (1) some (for example, a predetermined number or more) or all of the observer's eye observation positions within a predetermined period cannot be acquired from the eye position detection unit 411, (2) the eye position detection module 502 cannot detect the observer's eye position 700 in normal operation, (3) the eye position estimation module 504 cannot estimate the observer's eye position 700 in normal operation, (4) the eye position prediction module 506 cannot predict the observer's eye position 700 in normal operation, or a combination thereof occurs (the determination method is not limited to these).
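Determination method (1) above (a predetermined number of missed observations within a predetermined period) could be sketched with a sliding window; the window size and miss threshold below are hypothetical values, not values from the source:

```python
from collections import deque

class DetectionMonitor:
    """Flag the 'undetectable' specific state when at least miss_threshold of
    the last `window` eye observations are missing."""

    def __init__(self, window=10, miss_threshold=5):
        self.samples = deque(maxlen=window)  # True = missed observation
        self.miss_threshold = miss_threshold

    def add(self, observation):
        # observation is None when the detector returned no eye position.
        self.samples.append(observation is None)

    def is_specific_state(self):
        return sum(self.samples) >= self.miss_threshold
```

The `deque(maxlen=...)` automatically discards the oldest sample, so the check always covers only the most recent predetermined period.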
  • the method of determining whether the observer's eye position 700 is outside the eyebox 200 includes, for example, determining that the observer's eye position 700 is outside the eyebox 200 (that the observer's eye position 700 is in the specific state) when (1) some (for example, a predetermined number or more) or all of the observer's eye observation positions acquired from the eye position detection unit 411 within a predetermined period are acquired outside the eyebox 200, (2) the eye position detection module 502 detects the observer's eye position 700 outside the eyebox 200, or a combination thereof occurs (the determination method is not limited to these).
  • the method of determining whether it can be estimated that the observer's eye position 700 is outside the eyebox 200 includes, for example, estimating that the observer's eye position 700 is outside the eyebox 200 (that the observer's eye position 700 is in the specific state) when (1) the observer's eye position 700 cannot be detected after the eye position detection unit 411 detected movement of the eye position 700, (2) the eye position detection module 502 detects the observer's eye position 700 near the boundary 200A of the eyebox 200, (3) the eye position detection module 502 detects either the observer's right eye position 700R or left eye position 700L near the boundary 200A of the eyebox 200, or a combination thereof occurs (the determination method is not limited to these).
  • the method of determining whether it is predicted that the observer's eye position 700 will be outside the eyebox 200 includes, for example, predicting that the observer's eye position 700 will be outside the eyebox 200 (that the observer's eye position 700 is in the specific state) when (1) the eye position prediction module 506 predicts that the observer's eye position 700 will be outside the eyebox 200 after a predetermined time, (2) the eye position 700 newly detected by the eye position detection module 502 is separated from the previously detected eye position 700 by at least the eye-position movement-distance threshold stored in advance in the memory 37 (that is, the movement speed of the eye position 700 is equal to or higher than the eye-position movement-speed threshold stored in advance in the memory 37), or a combination thereof occurs (the determination method is not limited to these).
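Determination method (2) above (comparing the displacement of the newly detected eye position against a stored movement-distance threshold, which for a fixed detection interval is equivalent to a movement-speed threshold) might be sketched as follows; the threshold value is hypothetical:

```python
import math

def predicted_outside_eyebox(prev_pos, new_pos, dt, dist_threshold_mm=40.0):
    """Predict that the eye position will leave the eyebox when the newly
    detected position (new_pos) has moved at least dist_threshold_mm from the
    previously detected one (prev_pos) within interval dt seconds.

    Positions are (x, y) tuples; the threshold is a hypothetical value
    standing in for the threshold stored in the memory 37.
    """
    dist = math.hypot(new_pos[0] - prev_pos[0], new_pos[1] - prev_pos[1])
    speed = dist / dt
    speed_threshold = dist_threshold_mm / dt  # the equivalent speed threshold
    return dist >= dist_threshold_mm or speed >= speed_threshold
```

Note that the two comparisons are mathematically equivalent for a fixed `dt`; both are shown only to mirror the distance/speed phrasing of the text.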
  • when the eye position state determination module 508 determines that the specific state does not exist, the display parameter setting module 510 determines to which of a plurality of predetermined coordinate ranges (for example, the partitioned regions corresponding to the plurality of display parameters 600 of FIG. 5A, 5B, or 5C) the eye position 700 detected by the eye position detection module 502, estimated by the eye position estimation module 504, or predicted by the eye position prediction module 506 belongs, and sets the display parameter 600 corresponding to that eye position 700.
  • the display parameter setting module 510 includes various software components for performing various operations related to determining to which of a plurality of spatial regions the X-axis coordinate, the Y-axis coordinate, the Z-axis coordinate, or a combination thereof indicated by the eye position 700 belongs, and setting the display parameter 600 corresponding to the spatial region to which the eye position 700 belongs. That is, the display parameter setting module 510 may include table data, arithmetic expressions, and the like for specifying the display parameter 600 from the X-axis coordinate, the Y-axis coordinate, the Z-axis coordinate, or a combination thereof indicated by the eye position 700.
  • the types of display parameters 600 include, for example: (1) parameters for arranging the image so as to have a predetermined positional relationship with a real object located outside the vehicle 1 as viewed from the eye position 700 (parameters that control the display 21 so as to control the arrangement of the image); (2) parameters that control the display 21 so as to distort in advance the image displayed on the display 21, also called warping parameters; (3) parameters for directional display in which the light of the image is not directed (or is weakened) toward eye positions other than the eye position 700 (parameters that control the display 21, parameters that control the light source of the display 21, and/or parameters that control an actuator); and (4) parameters for visually recognizing a desired stereoscopic image as viewed from the eye position 700 (parameters that control the display 21, parameters that control an actuator, or a combination thereof).
  • the display parameter 600 set (selected) by the display parameter setting module 510 may be any parameter that is preferably changed according to the observer's eye position 700, and the types of display parameters 600 are not limited to these.
  • the display parameter setting module 510 selects one or a plurality of display parameters 600 corresponding to the eye position 700 for one type of display parameter.
  • For example, the display parameter setting module 510 can select two display parameters 600: the display parameter 600 corresponding to the right eye position 700R and the display parameter 600 corresponding to the left eye position 700L. Alternatively, if the eye position 700 is, for example, a predetermined one of the right eye position 700R and the left eye position 700L, whichever of the right eye position 700R and the left eye position 700L is detectable (easy to detect), or a position calculated from the right eye position 700R and the left eye position 700L, the display parameter setting module 510 can select one display parameter 600 corresponding to the eye position 700.
  • the display parameter setting module 510 may also select the display parameter 600 corresponding to the eye position 700 and the display parameter 600 set around the eye position 700. That is, the display parameter setting module 510 may select three or more display parameters 600 corresponding to the eye positions 700.
  • the display parameter setting module 510 selects the display parameter 600 (including the boundary display parameter 610E) set near the boundary of the eye box 200 when the eye position state determination module 508 determines that the specific state is present. (Block S154).
  • when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects the display parameter 600 (including the boundary display parameter 610E) set closer to the boundary of the eyebox 200 than the latest eye position 700 (block S156).
  • This includes selecting the boundary display parameter 610E set nearest to the latest eye position 700, or selecting the display parameter 600 (including the boundary display parameter 610E) set a predetermined distance from the latest eye position 700 toward the boundary of the eyebox 200.
  • when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects, based on the latest eye position 700 and the movement direction of the eye position, the boundary display parameter 610E set at a position along the movement direction of the eye position from the latest eye position 700 (an example of block S158).
  • This includes selecting the boundary display parameter 610E set in the movement direction of the eye position from the latest eye position 700, or selecting the display parameter 600 (including the boundary display parameter 610E) set a predetermined distance from the latest eye position 700 in the movement direction of the eye position, toward the boundary of the eyebox 200.
  • when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 may select, based on the latest eye position 700 and the latest movement direction of the eye position 700, a display parameter 600 between the display parameter 600 corresponding to the latest eye position 700 and the boundary display parameter 610E, according to the movement direction of the eye position 700 (an example of block S158). That is, the display parameter setting module 510 can select a display parameter 600 that is not the boundary display parameter 610E but is closer to the boundary display parameter 610E, along the movement direction of the eye position 700, than the display parameter 600 corresponding to the latest eye position 700.
  • when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 may select, based on the latest eye position 700, the latest movement direction of the eye position 700, and the movement speed of the eye position 700, one of the display parameters 600 between the display parameter 600 corresponding to the latest eye position 700 and the boundary display parameter 610E along the movement direction of the eye position 700, according to the movement speed (an example of block S160). That is, if the movement speed is faster than a predetermined eye-position movement-speed threshold, the observer's eye position 700 can be estimated to be likely to be located outside the eyebox 200 or near its boundary, so the display parameter setting module 510 selects the boundary display parameter 610E (or a display parameter 600 close to the boundary display parameter 610E); if the movement speed is slower than the predetermined eye-position movement-speed threshold, the observer's eye position 700 can be estimated to be unlikely to be located outside the eyebox 200, so the display parameter setting module 510 can select, among the display parameters 600 between the display parameter 600 corresponding to the latest eye position 700 and the boundary display parameter 610E along the movement direction of the eye position 700, a display parameter 600 set near the latest eye position 700.
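The speed-dependent selection of block S160 described above might be sketched as follows; the path of display parameter IDs from the cell of the latest eye position toward the boundary, and the speed threshold, are hypothetical inputs chosen for illustration:

```python
def select_parameter(path, speed, speed_threshold=300.0):
    """Pick a display parameter along the path from the cell of the latest
    eye position (index 0) to the boundary display parameter 610E (last index).

    path: display parameter IDs ordered from the current cell to the boundary.
    speed: eye-position movement speed; the threshold is a hypothetical value.
    """
    if not path:
        raise ValueError("path must not be empty")
    if speed >= speed_threshold:
        return path[-1]  # fast: likely already near or past the boundary
    # Slower movement: pick a parameter proportionally closer to the
    # latest eye position along the movement direction.
    idx = int((speed / speed_threshold) * (len(path) - 1))
    return path[idx]
```

With `path = [627, 626, 625]` (current cell to boundary, labels as in FIG. 5A), fast movement yields the boundary side of the path and slow movement stays near the latest eye position.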
  • the display parameter setting module 510 reselects the display parameter 600 when it is determined by the eye position state determination module 508 that the specific state has been released (block S170).
  • when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 selects, in the display parameter update cycle after the specific state is released, the display parameter 600 corresponding to the latest eye position 700 after the release (block S172).
  • when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 maintains the display parameters used up to that point in the display parameter update cycle immediately after the specific state is released, and selects the display parameter 600 corresponding to the latest eye position 700 from the next display parameter update cycle onward (block S174).
  • FIGS. 5A, 5B, and 5C are diagrams that virtually show the arrangement of a plurality of display parameters 600 associated with the eyebox 200 and the space around it.
  • One display parameter 600 is associated with each area partitioned by two-dimensional coordinates consisting of the X-axis coordinate in the left-right direction and the Y-axis coordinate in the up-down direction as the vehicle 1 faces forward. That is, within the same section, the same display parameter 600 is applied regardless of the coordinates of the eye position 700.
  • one display parameter 600 may be associated with three-dimensional coordinates including Z-axis coordinates.
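The mapping from an eye position to the display parameter of its section could be sketched as a grid lookup. The grid contents, origin, and cell sizes below are hypothetical; the column numbering from the positive X direction (left) and the row numbering from the positive Y direction (top) follow the convention of FIG. 5A:

```python
def display_parameter_for(x, y, grid, origin, cell_w, cell_h):
    """Map an eye position (x, y) to the display parameter of the section it
    falls in.

    grid:   grid[col][row] -> display parameter ID (e.g. a 5x5 grid)
    origin: (x, y) of the top-left corner of the partitioned region
    cell_w, cell_h: width and height of one section (hypothetical units)
    """
    col = int((origin[0] - x) // cell_w)  # X axis is positive to the left
    row = int((origin[1] - y) // cell_h)  # Y axis is positive upward
    if not (0 <= col < len(grid) and 0 <= row < len(grid[0])):
        return None  # outside the partitioned region: a second display parameter would apply
    return grid[col][row]
```

For a 5x5 grid of first display parameters 611 to 635, column-major as in the text (611 to 615 in the first column, 616 to 620 in the second, and so on), `grid` could be built as `[[611 + 5 * c + r for r in range(5)] for c in range(5)]`.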
  • the plurality of display parameters 600 include first display parameters 610 associated with the space inside the eyebox 200, where the image can be visually recognized, and second display parameters 650 associated with the space outside the eyebox 200 (the second display parameters 650 may be omitted).
  • the first display parameter 610 includes the boundary display parameter 610E.
  • the boundary display parameter 610E is a display parameter 600 associated with the space inside the boundary 200A of the eyebox 200 (on the side of the center 205 of the eyebox 200).
  • Since the display parameter 612 of FIG. 5A is arranged at the left end of the eyebox 200 (inside the boundary), it is classified as a boundary display parameter 610E. Since the display parameter 645 of FIG. 5B is arranged at the right end of the eyebox 200 (inside the boundary), it is classified as a boundary display parameter 610E. Since the display parameter 642 of FIG. 5B is arranged at the upper end of the eyebox 200 (inside the boundary), it is classified as a boundary display parameter 610E. That is, the boundary display parameters 610E in FIG. 5A are the display parameters 611 to 616, 620 to 621, 625 to 626, and 630 to 635, and the boundary display parameters 610E in FIGS. 5B and 5C are the display parameters 641 to 645.
  • FIGS. 5A, 5B, and 5C are shown mainly to explain the spatial arrangement corresponding to the display parameters 600, and the spatial area (section) corresponding to the second display parameters 650 is shown conceptually. That is, the spatial area corresponding to the second display parameters 650 may be further expanded outward or reduced inward (toward the eyebox 200).
  • the first display parameters 610 consist of 25 first display parameters 611 to 635, divided into 5 columns in the horizontal direction (X-axis direction) and 5 rows in the vertical direction (Y-axis direction). The columns are numbered first, second, ..., fifth from the positive direction of the X axis (the left), and the rows are numbered first, second, ..., fifth from the positive direction of the Y axis (the top). The first display parameters 611 to 615 are arranged in order from the first row to the fifth row of the first column, the first display parameters 616 to 620 are arranged in order from the first row to the fifth row of the second column, and so on.
  • the lower figure of FIG. 5A corresponds to the upper figure of FIG. 5A, and provides a description of the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
  • For example, the display parameter setting module 510 selects the display parameter 627 of the section to which the eye position 701 detected by the eye position detection module 502 belongs.
  • When the specific state is determined, the display parameter setting module 510 may select the boundary display parameter 610E nearest to the latest eye position 701 detected before the determination of the specific state. The nearest boundary display parameter 610E corresponding to the coordinates of each eye position 701 may be stored in the memory 37 in advance; that is, the boundary display parameter closest to the spatial area corresponding to the display parameter 627 may be looked up directly.
  • Specifically, the display parameter setting module 510 may select the display parameter 626, the display parameter 631, or the display parameter 632, whichever is close to the detailed coordinates of the eye position 701 within the section to which the display parameter 627 corresponds. That is, the display parameter setting module 510 may select the display parameter 626 if the coordinates of the eye position 701 are close to the area corresponding to the display parameter 626, while it may select the display parameter 632 if the coordinates of the eye position 701 are close to the area corresponding to the display parameter 632.
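The cell-lookup and nearest-boundary selection just described can be sketched as follows. This is a hypothetical illustration only: the 5×5 grid size, the eyebox dimensions, and every function name are our assumptions, not values from the disclosure.

```python
# Hypothetical sketch: map an eye position inside the eyebox to a grid cell,
# and, when a "specific state" is determined, snap to the nearest boundary
# (edge) cell. Grid size and eyebox dimensions are assumed, not disclosed.

GRID_ROWS = 5
GRID_COLS = 5

def cell_for_eye_position(x, y, box_w=130.0, box_h=50.0):
    """Map an eye position (mm from the eyebox top-left) to a (row, col) cell."""
    col = min(int(x / (box_w / GRID_COLS)), GRID_COLS - 1)
    row = min(int(y / (box_h / GRID_ROWS)), GRID_ROWS - 1)
    return row, col

def is_boundary_cell(row, col):
    """A boundary cell touches the eyebox edge (first/last row or column)."""
    return row in (0, GRID_ROWS - 1) or col in (0, GRID_COLS - 1)

def nearest_boundary_cell(row, col):
    """Snap an interior cell to the closest edge cell; edge cells stay put."""
    if is_boundary_cell(row, col):
        return row, col
    # distance (in cells) to each of the four edges; pick the closest
    dists = {
        (0, col): row,
        (GRID_ROWS - 1, col): GRID_ROWS - 1 - row,
        (row, 0): col,
        (row, GRID_COLS - 1): GRID_COLS - 1 - col,
    }
    return min(dists, key=dists.get)
```

A table precomputed from `nearest_boundary_cell` over all interior cells would correspond to the pre-stored lookup in the memory 37 mentioned above.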
  • When the latest eye position 702 before the specific state is determined belongs to the area of the display parameter 654 (second display parameter 654), the display parameter setting module 510 selects the display parameter 626, which is the boundary display parameter 610E set in the space closest to the area to which the display parameter 654 corresponds.
  • If the display parameter 631, corresponding to the area to which the latest eye position 703 before the determination of the specific state belongs, is itself a boundary display parameter 610E, the display parameter setting module 510 selects the display parameter 631 as it is.
  • According to the latest eye position 704 before the specific state is determined and the latest moving direction 751 of the eye position before (or immediately after) the determination of the specific state, the display parameter setting module 510 selects the display parameter 634, which is the boundary display parameter 610E arranged in the moving direction 751 from the display parameter 624 corresponding to the area to which the latest eye position 704 belongs.
  • Similarly, according to the latest eye position 704 and the moving direction 752 of the eye position, the display parameter setting module 510 selects the display parameter 614, which is the boundary display parameter 610E arranged in the moving direction 752 from the display parameter 624 corresponding to the latest eye position 704.
  • According to the latest eye position 705 before the specific state is determined and the latest moving direction 753 of the eye position before (or immediately after) the determination of the specific state, the display parameter setting module 510 selects the display parameter 612, which is the boundary display parameter 610E arranged in the moving direction 753 from the display parameter 622 corresponding to the latest eye position 705. After that, assume that the eye position 706 in the specific state moves along the moving direction 754 and the specific state then ceases; the latest eye position after the specific state has ceased is indicated by reference numeral 707.
  • The display parameter setting module 510 then selects the display parameter 611 to which the eye position 707 detected by the eye position detection module 502 belongs. That is, to briefly summarize the display parameters 600 selected as the eye position 700 moves: the display parameter setting module 510 selects the display parameter 622 corresponding to the eye position 705 when the eye position is not in the specific state, selects the display parameter 612 (boundary display parameter 610E) corresponding to the eye position 705 and the moving direction 753 when the specific state is reached, and selects the display parameter 611 corresponding to the eye position 707 when the specific state ceases.
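The direction-dependent selections above (for example, eye position 704 with direction 751 selecting the parameter 634) amount to walking from the current cell toward the eyebox edge along the movement direction. A minimal sketch with hypothetical names, assuming a 5×5 grid of cells:

```python
def boundary_cell_in_direction(row, col, d_row, d_col, rows=5, cols=5):
    """Step from the cell of the latest eye position in the eye's movement
    direction (d_row, d_col each in {-1, 0, 1}) until an edge cell of the
    eyebox grid is reached; that edge cell's display parameter is selected."""
    if d_row == 0 and d_col == 0:
        return row, col  # no movement: keep the current cell
    while not (row in (0, rows - 1) or col in (0, cols - 1)):
        row += d_row
        col += d_col
    return row, col
```

If the starting cell is already a boundary cell, it is returned unchanged, matching the case of eye position 703 and the display parameter 631 above.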
  • FIG. 5B provides a description of a modified example of the arrangement of the display parameter 600.
  • In FIG. 5B, the first display parameters 610 consist of five first display parameters 641 to 645, partitioned into five columns in the left-right direction (X-axis direction) and not partitioned in the vertical direction (Y-axis direction). The columns are counted as the first column, the second column, ... , the fifth column from the positive direction (left) of the X axis, and the first display parameters 641 to 645 are placed in the first to fifth columns in order.
  • That is, the arrangement of the display parameters 600 in the eye box 200 need not be a two-dimensional arrangement in which a plurality of determination areas are arranged in both the X-axis direction and the Y-axis direction, as shown in FIG. 5A; a one-dimensional arrangement may be adopted in which a plurality of determination areas are arranged in the X-axis direction only and not in the Y-axis direction.
  • FIG. 5C provides a description of a modified example of the arrangement of the display parameter 600.
  • The sizes of the plurality of display parameters 600 arranged in the eye box 200 have so far been the same, but they may differ as shown in FIG. 5C. Further, the shapes of the plurality of display parameters 600 may also differ from one another.
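For the one-dimensional layouts of FIGS. 5B and 5C, where only columns exist and their widths may differ, the lookup reduces to a search over sorted column boundaries. A sketch under assumed, hypothetical interior edges (in mm); none of the numbers are from the disclosure:

```python
import bisect

def column_for_x(x, inner_edges):
    """Return the column index (0-based, left to right) for a horizontal eye
    position x, given the interior column boundaries `inner_edges` in
    ascending order. Unequal column widths (FIG. 5C) need no special
    handling: the boundaries simply sit at uneven positions."""
    return bisect.bisect_right(inner_edges, x)
```

For five columns there are four interior edges, e.g. `[20.0, 50.0, 80.0, 110.0]`; positions left of the first edge map to column 0 and positions right of the last edge map to column 4.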
  • the upper figure of FIG. 6 is a diagram that provides an explanation of the arrangement of a plurality of display parameters 600 associated with the eye box 200 and the space around the eye box 200.
  • the second display parameter 650 may also include the second boundary display parameter 650E, as shown in the upper diagram of FIG.
  • the second boundary display parameter 650E is a display parameter 600 associated with a spatial area outside the boundary 200A of the eyebox 200. That is, the boundary display parameter may include a first boundary display parameter 610E corresponding to the region inside the boundary 200A of the eyebox 200 and a second boundary display parameter 650E corresponding to the region outside the boundary 200A.
  • the lower figure of FIG. 6 corresponds to the upper figure of FIG. 6, and provides an explanation of the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
  • Even if the display parameter 631 corresponding to the area to which the latest eye position 703 before the determination of the specific state belongs is the first boundary display parameter 610E, the display parameter setting module 510 may set a second boundary display parameter 650E corresponding to an area even farther from the center 205 of the eye box 200 than the area corresponding to the already set display parameter 631 (the display parameter 655, the display parameter 666, or a display parameter (not shown) corresponding to the area to the right of the area of the display parameter 655 and above the area of the display parameter 666).
  • According to the latest eye position 704 before the specific state is determined and the latest moving direction 751 of the eye position before (or immediately after) the determination of the specific state, the display parameter setting module 510 selects either the display parameter 634, which is the first boundary display parameter 610E arranged in the moving direction 751 from the display parameter 624 corresponding to the area to which the latest eye position 704 belongs, or the display parameter 669, which is the second boundary display parameter 650E arranged in that direction.
  • According to the latest eye position 705 before the specific state is determined and the latest moving direction 753 of the eye position before (or immediately after) the determination of the specific state, the display parameter setting module 510 selects either the display parameter 612, which is the first boundary display parameter 610E arranged in the moving direction 753 from the display parameter 622 corresponding to the latest eye position 705, or the display parameter 662, which is the second boundary display parameter 650E arranged in that direction.
  • the display parameter setting module 510 selects the display parameter 611 to which the eye position 707 detected by the eye position detection module 502 belongs.
  • The adjacent first boundary display parameter 610E and second boundary display parameter 650E may share the same display parameter (the same set value). That is, the display parameter 631, which is the first boundary display parameter 610E in FIG. 6, and two or all of the second boundary display parameters 650E associated with it (the display parameter 655, the display parameter 666, and a display parameter (not shown) corresponding to the area to the right of the area of the display parameter 655 and above the area of the display parameter 666) may be the same display parameter (the same set value).
  • the arrangement of the spatial areas corresponding to the plurality of display parameters 600 may be changeable. For example, when a predetermined condition is satisfied, the arrangement of the spatial area corresponding to the plurality of display parameters 600 may be changed from the arrangement shown in FIG. 5A to the arrangement shown in FIG. 7.
  • Compared with the arrangement shown in FIG. 5A, in the arrangement shown in FIG. 7 the width of the spatial region corresponding to the boundary display parameter 610E (650E) is increased in the horizontal direction (X-axis direction) and/or the vertical direction (Y-axis direction). That is, since the arrangement shown in FIG. 7 has a wide area corresponding to the boundary display parameter 610E (650E), it becomes easy to maintain a constant display parameter 600 (boundary display parameter 610E (650E)) even if the eye position 700 shifts.
  • When the specific state is determined, the display parameter setting module 510 may expand the spatial area covered by the boundary display parameter 610E (650E). That is, when the detected eye position 700 is in a specific state, the display parameter 600 can be made difficult to switch even if the actual eye position 700 changes near the boundary of the eye box 200.
  • When the eye position state determination module 508 determines the specific state and the latest eye position 700 before that determination is at least a predetermined distance from the boundary 200A of the eye box 200, the display parameter setting module 510 may expand the spatial area corresponding to the boundary display parameter 610E (650E). At this time, the boundary display parameter 610E may be expanded so as to include the region of the latest eye position 700 before the determination of the specific state.
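The widening of the boundary regions described above (FIG. 5A to FIG. 7) can be expressed, for one axis, as moving the outermost interior boundaries inward so the edge cells grow at the expense of the adjacent interior cells. A hypothetical sketch; the edge list and the margin value are our assumptions:

```python
def widen_boundary_columns(inner_edges, margin):
    """Given interior column boundaries (ascending, in mm), grow the leftmost
    and rightmost (boundary) columns by `margin` on the interior side,
    shrinking the adjacent interior columns (cf. FIG. 5A -> FIG. 7)."""
    widened = list(inner_edges)
    widened[0] += margin    # left boundary column widens rightward
    widened[-1] -= margin   # right boundary column widens leftward
    return widened
```

With the widened edges, the same lookup over sorted boundaries keeps returning the boundary column for a larger span of eye positions, which is exactly the hysteresis effect the text attributes to FIG. 7.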
  • The number (density) of spatial regions corresponding to the plurality of display parameters 600 may also be changeable. For example, when a predetermined condition is met, the number of spatial regions corresponding to the plurality of first display parameters 610 in the eye box 200 may be changed from the 5 shown in FIG. 5B to the 25 shown in FIG. 5A. When the number of spatial regions corresponding to the plurality of display parameters 600 is increased (in other words, when the density is increased), the display parameters 600 are more likely to be switched as the eye position 700 changes.
  • When the moving speed of the eye position 700 is high, the display parameter setting module 510 may set the density of the spatial regions corresponding to the plurality of display parameters 600 to be high. That is, the display parameter 600 can be switched smoothly when the eye position 700 moves at high speed, and is switched less readily when the eye position 700 moves at low speed.
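The density change can be modelled as picking the grid resolution from the eye-movement speed; the 5-versus-25 cell counts come from FIGS. 5B and 5A, while the speed threshold and units are our assumptions:

```python
def cell_count_for_speed(eye_speed_mm_s, threshold_mm_s=100.0):
    """Use the dense 25-cell grid (FIG. 5A) when the eye moves fast, so the
    display parameter tracks the eye position readily, and the coarse 5-cell
    grid (FIG. 5B) when it moves slowly, so the parameter rarely switches."""
    return 25 if eye_speed_mm_s >= threshold_mm_s else 5
```

In practice some hysteresis around the threshold would be needed to avoid the grid resolution itself flickering, but that refinement is beyond what the text specifies.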
  • As a condition for determining the specific state, the eye position state determination module 508 may detect at least one of a signal indicating the moving direction 750 of the eye position 700 in the left-right direction (X-axis direction) and a signal indicating the moving direction 750 of the eye position 700 in the vertical direction (Y-axis direction).
  • When the eye position 700 moves in the left-right direction (X-axis direction), the display parameter setting module 510 executes the process of setting a display parameter 600 corresponding to an area closer to the boundary 200A of the eye box 200 (an area farther from the center 205 of the eye box 200) than the area corresponding to the already set display parameter 600; when the eye position 700 does not move in the left-right direction (X-axis direction), it may not execute these processes.
  • The eye position state determination module 508 may include, as at least one condition for determining the specific state, that a signal indicating that the behavior (vibration) of the vehicle 1 is small is detected.
  • the display parameter setting module 510 may not determine the specific state when the behavior (vibration) of the vehicle 1 is large.
  • When the eye position 700 cannot be detected and the eye position state determination module 508 does not determine the specific state, the display parameter setting module 510 may maintain the display parameter 600 corresponding to the latest eye position 700.
  • Alternatively, when the eye position 700 cannot be detected and the eye position state determination module 508 does not determine the specific state, the display parameter setting module 510 may set the display parameter 600 corresponding to the center 205 of the eye box 200.
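The fallback rules in the last few bullets can be combined into one selection policy. The sketch below is ours, not the disclosure's: the (row, col) cell coordinates, the centre cell, and the flag choosing between the two detection-loss behaviours are all assumptions.

```python
def choose_cell(eye_cell, specific_state, last_cell,
                center=(2, 2), keep_last_on_loss=True, rows=5, cols=5):
    """eye_cell is the (row, col) of the detected eye position, or None when
    the eye position cannot be detected; last_cell is the most recently
    selected cell."""
    if eye_cell is not None and not specific_state:
        return eye_cell  # normal tracking: parameter of the detected cell
    if eye_cell is None and not specific_state:
        # detection lost without a specific state: keep the latest cell,
        # or revert to the eyebox-centre cell (both variants appear above)
        return last_cell if keep_last_on_loss else center
    # specific state: snap the latest cell to the nearest eyebox edge cell
    r, c = last_cell
    candidates = [(r, (0, c)), (rows - 1 - r, (rows - 1, c)),
                  (c, (r, 0)), (cols - 1 - c, (r, cols - 1))]
    return min(candidates)[1]
```

The `keep_last_on_loss` flag simply encodes which of the two detection-loss bullets a given implementation follows; the disclosure presents them as alternatives, not as a combined policy.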
  • The graphic module 512 includes various known software components for performing image processing such as rendering to generate image data and for driving the display 21.
  • The graphic module 512 may also include various known software components for modifying the type, arrangement (positional coordinates, angle), size, display distance (in the case of 3D), and visual effects (e.g., brightness, transparency, saturation, contrast, or other visual characteristics) of the displayed image.
  • The graphic module 512 can drive the display 21 by generating image data so that the observer perceives the image type (one of the display parameters), the image position coordinates (one of the display parameters), the image angle (the pitching angle about the X direction, the yaw angle about the Y direction, the rolling angle about the Z direction, etc., one of the display parameters), and the image size (one of the display parameters).
  • The light source drive module 514 includes various known software components for driving the light source unit 24. The light source drive module 514 can drive the light source unit 24 based on the set display parameters 600.
  • The actuator drive module 516 includes various known software components for driving the first actuator 28 and/or the second actuator 29. The actuator drive module 516 can drive the first actuator 28 and the second actuator 29 based on the set display parameters 600.
  • The operations of the processing described above can be performed by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. These modules, combinations of these modules, and/or combinations with known hardware capable of substituting for their functions are all within the scope of protection of the present invention.
  • The functional blocks of the vehicle display system 10 are optionally implemented by hardware, software, or a combination of hardware and software in order to carry out the principles of the various described embodiments. It will be understood by those skilled in the art that the functional blocks described in FIG. 3 may optionally be combined, or one functional block separated into two or more sub-blocks, in order to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
  • 10: Vehicle display system, 20: Image display unit (head-up display device), 21: Display, 21a: Display surface, 22: Liquid crystal display panel, 24: Light source unit, 25: Relay optical system, 26: First mirror, 27: Second mirror, 28: First actuator, 29: Second actuator, 30: Display control device, 31: I/O interface, 33: Processor, 35: Image processing circuit, 37: Memory, 40: Display light, 40p: Optical axis, 41: First image light, 42: Second image light, 43: Third image light, 90: Virtual image optical system, 100: Display area, 101: Upper end, 102: Lower end, 200: Eye box, 200A: Boundary, 205: Center, 401: Vehicle ECU, 403: Road information database, 405: Own vehicle position detection unit, 407: Vehicle exterior sensor, 409: Operation detection unit, 411: Eye position detection unit, 413: IMU, 415: Line-of-sight direction detection unit, 417: Mobile information terminal, 419: External communication device, 502: Eye position detection module, 504: Eye position estimation module, 506: Eye position prediction module, 50

Abstract

The present invention suppresses a decrease in the display quality of an image. A memory stores a plurality of display parameters 600 corresponding to each spatial region. A processor sets one or more display parameters 600 on the basis of an eye position, and determines whether information relating to the eye position indicates a specific condition. If the information relating to the eye position is determined to indicate the specific condition, then on the basis of at least the eye position, the processor sets a display parameter 600 corresponding to a region that is closer to the boundary of an eye box 200, or a region that is farther from the center 205 of the eye box 200, than the region to which an already set display parameter 600 corresponds.

Description

Display control device, head-up display device, and method
 The present disclosure relates to a display control device, a head-up display device, and a method that are used in a vehicle and superimpose an image on the foreground of the vehicle for visual recognition.
 Patent Document 1 discloses a head-up display device that detects the viewpoint position of an observer and corrects the display position of an image (an example of a display setting) according to the viewpoint position, and that, when the viewpoint position can no longer be detected, maintains the display position of the image and continues the display.
Japanese Unexamined Patent Publication No. 2016-210212
 When the viewpoint position (hereinafter also referred to as the eye position) cannot be detected, a difference can arise between the display setting corresponding to the actual eye position (the display setting the observer should see) and the display setting actually executed by the display device (an inappropriate display setting that the observer ends up seeing). Even after an image with such an inappropriate display setting has been displayed, if the eye position sensor can again correctly detect the actual eye position, the display control device can update to the appropriate display setting. In such a case, however, the observer perceives the change from the image with the inappropriate display setting to the image with the appropriate display setting.
 The head-up display device disclosed in Patent Document 1 maintains the display position (display setting) of the image when the viewpoint position can no longer be detected; that is, it maintains the display setting corresponding to the last viewpoint position that could be detected. When the situation changes from one in which the viewpoint position could be detected to one in which it cannot, it is highly likely that the viewpoint position has moved to another position, and it is quite possible that the position has left the area in which the observer's viewpoint position is generally assumed to lie (also called the eyellipse or eye box). Accordingly, when the observer's viewpoint position enters the eyellipse from outside it (a state in which the viewpoint position cannot be detected), the observer's viewpoint position lies at the boundary of the eyellipse, but the image seen there still has the display setting corresponding to the last detected viewpoint position. If the last detected viewpoint position and the observer's actual viewpoint position (at the eyellipse boundary) are far apart, the display settings also differ greatly, so there is a risk that the observer will see an image of low display quality that deviates from the appropriate image.
 The following is a summary of particular embodiments disclosed herein. It should be understood that these aspects are presented solely to provide the reader with an overview of these particular embodiments and do not limit the scope of this disclosure. Indeed, the present disclosure may encompass various aspects not described below.
 The outline of the present disclosure relates to suppressing deterioration of the display quality of an image. More specifically, it also relates to making it difficult for the observer to see an image of low display quality even when a specific state occurs in the detection result of the eye position.
 Therefore, in the display control device described herein, information indicating the observer's eye position and/or information from which the eye position can be estimated is acquired, a memory stores a plurality of display parameters each corresponding to a spatial area, and one or more processors set one or more display parameters based on the acquired eye position, determine whether the information relating to the eye position indicates a specific state, and, when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position, a display parameter corresponding to an area closer to the boundary of the eye box than the area to which the already set display parameter corresponds.
FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle according to some embodiments.
FIG. 2 is a diagram showing a configuration of a head-up display device according to some embodiments.
FIG. 3 is a block diagram of a vehicle display system according to some embodiments.
FIG. 4A is a flow chart showing a method of performing an operation of setting display parameters based on the observer's eye position according to some embodiments.
FIG. 4B is a flow chart showing a method following FIG. 4A.
The upper diagram of FIG. 5A explains the arrangement of spatial regions corresponding to display parameters in some embodiments, and the lower diagram of FIG. 5A corresponds to the upper diagram and explains the display parameters applied according to the eye position.
FIG. 5B is a diagram for explaining the arrangement of spatial regions corresponding to display parameters in some embodiments.
FIG. 5C is a diagram for explaining the arrangement of spatial regions corresponding to display parameters in some embodiments.
FIG. 6 is a diagram for explaining the arrangement of spatial regions corresponding to display parameters in some embodiments.
FIG. 7 is a diagram for explaining the arrangement of spatial regions corresponding to display parameters in some embodiments.
 FIGS. 1 to 7 below provide a description of the configuration and operation of an exemplary vehicle display system. The present invention is not limited to the following embodiments (including the contents of the drawings). Changes (including deletion of components) can of course be made to the following embodiments. In the following description, descriptions of known technical matters are omitted as appropriate to facilitate understanding of the present invention.
 The image display unit 20 in the vehicle display system 10 is a head-up display (HUD) device provided in the dashboard 5 of the vehicle 1. The image display unit 20 emits display light 40 toward the front windshield 2 (an example of the projected part), and the front windshield 2 reflects the display light 40 of the image M (see FIG. 2) displayed by the image display unit 20 toward the eye box 200. By placing the eyes within the eye box 200, the observer can see the virtual image V of the image M displayed by the image display unit 20 at a position overlapping the foreground, which is the real space seen through the front windshield 2. In the drawings used in this embodiment, the left-right direction of the vehicle 1 is the X-axis direction (the left side when facing the front of the vehicle 1 is the positive X direction), the up-down direction is the Y-axis direction (the upper side of the vehicle 1 traveling on the road surface is the positive Y direction), and the front-rear direction of the vehicle 1 is the Z-axis direction (the front of the vehicle 1 is the positive Z direction).
 The "eye box" used in the description of this embodiment is (1) a region within which the entire virtual image V of the image M can be seen and outside which at least a part of the virtual image V of the image M cannot be seen, (2) a region within which at least a part of the virtual image V of the image M can be seen and outside which no part of the virtual image V of the image M can be seen, (3) a region within which at least a part of the virtual image V of the image M can be seen at a predetermined brightness or more and outside which the entire virtual image V of the image M is below the predetermined brightness, or (4) when the image display unit 20 can display a stereoscopically viewable virtual image V, a region within which at least a part of the virtual image V can be viewed stereoscopically and outside which no part of the virtual image V can be viewed stereoscopically. That is, when the observer places the eyes (both eyes) outside the eye box 200, the observer cannot see the entire virtual image V of the image M, the visibility of the entire virtual image V of the image M is so low that it is difficult to perceive, or the virtual image V of the image M cannot be viewed stereoscopically. The predetermined brightness is, for example, about 1/50 of the brightness of the virtual image of the image M seen at the center of the eye box. The "eye box" is set to be the same as the area in which the observer's viewpoint position is assumed to lie in the vehicle on which the HUD device 20 is mounted (also called the eyellipse), or to include most of the eyellipse (for example, 80% or more).
 The display area 100 is a plane, curved, or partially curved area in which the image M generated inside the image display unit 20 is formed as the virtual image V, and is also called the image-forming plane. The display area 100 is the position at which the display surface 21a (for example, the exit surface of a liquid crystal display panel) of the display 21 of the image display unit 20, described later, is imaged as a virtual image; that is, the display area 100 corresponds to the display surface 21a described later (in other words, the display area 100 is conjugate with the display surface 21a of the display 21 described later), and the virtual image seen in the display area 100 can be said to correspond to the image displayed on the display surface 21a of the image display unit 20 described later. The display area 100 itself preferably has visibility so low that it is not actually seen, or is difficult to see, by the observer's eyes.
 For the display area 100, two angles are set: the angle it forms with the horizontal direction (XZ plane) about the left-right direction (X-axis direction) of the vehicle 1 (tilt angle θt in FIG. 1); and, taking as the vertical angle of view of the display area 100 the angle formed by the line segment connecting the center 205 of the eyebox 200 to the upper end 101 of the display area 100 and the line segment connecting the eyebox center to the lower end 102 of the display area 100, the angle formed by the bisector of this vertical angle of view with the horizontal direction (XZ plane) (vertical arrangement angle θv in FIG. 1). The display area 100 of the present embodiment has a tilt angle θt of approximately 90 [degree] so that it roughly faces the observer when the observer looks forward (in the Z-axis positive direction). However, the tilt angle θt is not limited to this and may be changed within the range 0 ≤ θt < 90 [degree]. In this case, for example, the tilt angle θt may be set to 60 [degree], and the display area 100 may be arranged so that its upper region is farther from the observer than its lower region.
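The vertical angle of view and the vertical arrangement angle θv described above can be computed from the eyebox center 205 and the upper and lower ends 101, 102 of the display area. The sketch below works in the YZ plane (Z forward, Y up) with made-up coordinates; the geometry follows the definition in the text, but the numbers are not from the patent figures.

```python
import math

def elevation(from_pt, to_pt):
    """Elevation angle [degree] of the ray from from_pt to to_pt.
    Points are (y, z) pairs: y up, z forward."""
    dy = to_pt[0] - from_pt[0]
    dz = to_pt[1] - from_pt[1]
    return math.degrees(math.atan2(dy, dz))

def vertical_angles(eyebox_center, top_end, bottom_end):
    """Return (vertical angle of view, vertical arrangement angle θv)."""
    a_top = elevation(eyebox_center, top_end)       # ray to upper end 101
    a_bottom = elevation(eyebox_center, bottom_end)  # ray to lower end 102
    vertical_fov = abs(a_top - a_bottom)             # angle between the two rays
    theta_v = (a_top + a_bottom) / 2.0               # bisector vs. horizontal plane
    return vertical_fov, theta_v

# Example: eyebox center 1.2 m high; display area 2 m ahead, ends at 1.5 m / 0.9 m.
fov, theta_v = vertical_angles((1.2, 0.0), (1.5, 2.0), (0.9, 2.0))
print(round(fov, 2), round(theta_v, 2))
```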
 FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment. The HUD device 20 includes a display 21 having a display surface 21a that displays the image M, and a relay optical system 25.
 The display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24. The display surface 21a is the viewer-side surface of the liquid crystal display panel 22 and emits the display light 40 of the image M. The angle of the display area 100 (including the tilt angle θt) can be set by setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40 traveling from the center of the display surface 21a toward the eyebox 200 (the center 205 of the eyebox 200) via the relay optical system 25 and the projection-target portion.
 The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (the light traveling from the display 21 toward the eyebox 200), and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20. The relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.
 The first mirror 26 has, for example, a free-form surface shape with positive optical power. In other words, the first mirror 26 may have a curved shape whose optical power differs from region to region; that is, the optical power imparted to the display light 40 may differ depending on the region (optical path) through which the display light 40 passes. Specifically, the optical power imparted by the relay optical system 25 may differ among the first image light 41, the second image light 42, and the third image light 43 (see FIG. 2) traveling from the respective regions of the display surface 21a toward the eyebox 200.
 The second mirror 27 is, for example, a flat mirror, but is not limited to this and may be a curved surface having optical power. That is, by combining a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment), the relay optical system 25 may impart different optical power depending on the region (optical path) through which the display light 40 passes. The second mirror 27 may also be omitted; that is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 onto the projection-target portion (front windshield) 2.
 Further, although the relay optical system 25 of the present embodiment includes two mirrors, it is not limited to this, and may include, in addition to or instead of these, one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or a combination thereof.
 Further, by virtue of this curved shape (an example of optical power), the relay optical system 25 of the present embodiment has a function of setting the distance to the display area 100 and a function of generating an enlarged virtual image of the image displayed on the display surface 21a; in addition, it may have a function of suppressing (correcting) distortion of the virtual image that may be caused by the curved shape of the front windshield 2.
 Further, the relay optical system 25 may have actuators 28 and 29, controlled by the display control device 30, attached to it so as to be rotatable.
 The liquid crystal display panel 22 receives light from the light source unit 24 and emits spatially light-modulated display light 40 toward the relay optical system 25 (second mirror 27). The liquid crystal display panel 22 has, for example, a rectangular shape whose short sides run in the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V as seen by the observer are arranged. The observer visually recognizes the light transmitted through the liquid crystal display panel 22 via the virtual-image optical system 90. The virtual-image optical system 90 is the combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.
 The light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).
 The light source (not shown) is, for example, a plurality of chip-type LEDs that emit illumination light toward the liquid crystal display panel 22 (an example of a spatial light modulation element). The light source unit 24 is composed of, for example, four light sources arranged in a row along the long side of the liquid crystal display panel 22. Under the control of the display control device 30, the light source unit 24 emits illumination light toward the liquid crystal display panel 22. The configuration of the light source unit 24, the arrangement of the light sources, and the like are not limited to these.
 The illumination optical system (not shown) is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light of the light source unit 24, and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.
 The display 21 may be a self-luminous display, or may be a projection-type display that projects an image onto a screen. In the latter case, the display surface 21a is the screen of the projection-type display.
 Further, the display 21 may have an actuator (not shown), including a motor or the like controlled by the display control device 30, attached to it so that the display surface 21a can be moved and/or rotated.
 The relay optical system 25 has two rotation axes (a first rotation axis AX1 and a second rotation axis AX2) for moving the eyebox 200 in the vertical direction (Y-axis direction). Each of the first rotation axis AX1 and the second rotation axis AX2 is set so as not to be perpendicular to the left-right direction (X-axis direction) of the vehicle 1 (in other words, so as not to be parallel to the YZ plane) in the state where the HUD device 20 is mounted on the vehicle 1. Specifically, the angle between each of the first rotation axis AX1 and the second rotation axis AX2 and the left-right direction (X-axis direction) of the vehicle 1 is set to less than 45 [degree], and more preferably to less than 20 [degree].
 Rotation of the relay optical system 25 about the first rotation axis AX1 produces a relatively small vertical movement of the display area 100 and a relatively large vertical movement of the eyebox 200. Conversely, rotation of the relay optical system 25 about the second rotation axis AX2 produces a relatively large vertical movement of the display area 100 and a relatively small vertical movement of the eyebox 200. That is, comparing the first rotation axis AX1 with the second rotation axis AX2, the ratio "vertical movement of the eyebox 200 / vertical movement of the display area 100" for rotation about the first rotation axis AX1 is larger than the corresponding ratio for rotation about the second rotation axis AX2. In other words, the relative amount between the vertical movement of the display area 100 and the vertical movement of the eyebox 200 caused by rotation of the relay optical system 25 about the first rotation axis AX1 differs from that caused by rotation about the second rotation axis AX2.
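The movement-ratio property above means each axis is suited to a different adjustment: AX1 mostly moves the eyebox, AX2 mostly moves the display area. A minimal sketch of that selection logic follows; the per-degree sensitivity values are invented for illustration and are not measurements from the patent.

```python
# Example per-degree sensitivities for each rotation axis (assumed values):
# axis -> (eyebox movement [mm/degree], display-area movement [mm/degree]).
AXIS_SENSITIVITY = {
    "AX1": (10.0, 2.0),   # eyebox/display-area ratio = 5.0 (large)
    "AX2": (2.0, 10.0),   # eyebox/display-area ratio = 0.2 (small)
}

def eyebox_to_display_ratio(axis: str) -> float:
    """Ratio 'eyebox movement / display-area movement' for rotation about an axis."""
    eyebox_per_deg, display_per_deg = AXIS_SENSITIVITY[axis]
    return eyebox_per_deg / display_per_deg

def pick_axis(prioritize_eyebox: bool) -> str:
    """Pick the axis whose rotation mostly moves the quantity we care about."""
    if prioritize_eyebox:
        return max(AXIS_SENSITIVITY, key=eyebox_to_display_ratio)
    return min(AXIS_SENSITIVITY, key=eyebox_to_display_ratio)

print(pick_axis(prioritize_eyebox=True))   # move the eyebox -> AX1
print(pick_axis(prioritize_eyebox=False))  # move the display area -> AX2
```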
 The HUD device 20 includes a first actuator 28 that rotates the first mirror 26 about the first rotation axis AX1, and a second actuator 29 that rotates the first mirror 26 about the second rotation axis AX2. In other words, the HUD device 20 rotates one relay optical system 25 about two axes (the first rotation axis AX1 and the second rotation axis AX2). The first actuator 28 and the second actuator 29 may be configured as a single integrated two-axis actuator. In another embodiment, the HUD device 20 rotates two relay optical systems 25 about two axes (the first rotation axis AX1 and the second rotation axis AX2); for example, the HUD device 20 may include a first actuator 28 that rotates the first mirror 26 about the first rotation axis AX1 and a second actuator 29 that rotates the second mirror 27 about the second rotation axis AX2.
 Further, the HUD device 20 in another embodiment need not drive the relay optical system 25. In other words, the HUD device 20 need not have an actuator that rotates and/or moves the relay optical system 25. The HUD device 20 of this embodiment may instead include a wide eyebox 200 that covers the range of driver eye heights expected in use of the vehicle 1.
 Based on control by the later-described display control device 30, the image display unit (head-up display device) 20 can display images in the vicinity of, at positions overlapping, or at positions set with reference to real objects 300 existing in the foreground, which is the real space (actual scene) visually recognized through the front windshield 2 of the vehicle 1, such as the road surface 310 of the traveling lane (see FIG. 1), branch roads, road signs, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), and features (buildings, bridges, etc.), thereby allowing the observer (typically an observer seated in the driver's seat of the vehicle 1) to perceive visual augmented reality (AR). In the description of the present embodiment, an image whose displayed position can be changed according to the position of a real object existing in the actual scene is defined as an AR image, and an image whose displayed position is set regardless of the positions of real objects is defined as a non-AR image.
 FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments. The display control device 30 includes one or more I/O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37. The various functional blocks shown in FIG. 3 may be implemented in hardware, software, or a combination of both. FIG. 3 is only one embodiment, and the illustrated components may be combined into fewer components, or there may be additional components. For example, the image processing circuit 35 (for example, a graphics processing unit) may be included in the one or more processors 33.
 As illustrated, the processor 33 and the image processing circuit 35 are operably coupled to the memory 37. More specifically, by executing programs stored in the memory 37, the processor 33 and the image processing circuit 35 can perform operations of the vehicle display system 10 (image display unit 20), such as generating and/or transmitting image data. The processor 33 and/or the image processing circuit 35 may include at least one general-purpose microprocessor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof. The memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, any type of semiconductor memory such as volatile memory, and non-volatile memory. The volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
 As illustrated, the processor 33 is operably coupled to the I/O interface 31. The I/O interface 31 communicates with, for example, the later-described vehicle ECU 401 provided in the vehicle, or other electronic devices (reference numerals 403 to 419 described later), in accordance with the CAN (Controller Area Network) standard (such communication is also referred to as CAN communication). The communication standard adopted by the I/O interface 31 is not limited to CAN; it includes in-vehicle communication (internal communication) interfaces such as wired communication interfaces, e.g., CAN FD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, or USB, and short-range wireless communication interfaces with a range of several tens of meters, e.g., personal area networks (PAN) such as a Bluetooth (registered trademark) network and local area networks (LAN) such as an 802.11x Wi-Fi (registered trademark) network. The I/O interface 31 may also include an outside-vehicle communication (external communication) interface to a wide area network (for example, an Internet communication network) via cellular communication standards such as wireless wide area networks (WWAN, IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access)), IEEE 802.16e-based (Mobile WiMAX), 4G, 4G-LTE, LTE Advanced, and 5G.
 As illustrated, by being interoperably coupled to the I/O interface 31, the processor 33 can exchange information with various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31). To the I/O interface 31 are operably connected, for example, a vehicle ECU 401, a road information database 403, a vehicle position detection unit 405, vehicle exterior sensors 407, an operation detection unit 409, an eye position detection unit 411, an IMU 413, a gaze direction detection unit 415, a mobile information terminal 417, and an external communication device 419. The I/O interface 31 may include a function of processing (converting, computing, analyzing) information received from other electronic devices or the like connected to the vehicle display system 10.
 The display 21 is operably coupled to the processor 33 and the image processing circuit 35. Therefore, the image displayed by the image display unit 20 may be based on image data received from the processor 33 and/or the image processing circuit 35. The processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I/O interface 31.
 The vehicle ECU 401 acquires, from sensors and switches provided in the vehicle 1, the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitch angle), and vehicle vibration (including the magnitude, frequency, and/or cycle of the vibration)), and collects and manages (which may include controlling) the state of the vehicle 1; as part of its functions, it can output a signal indicating a numerical value of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30. In addition to, or instead of, simply transmitting numerical values detected by sensors or the like to the processor 33 (for example, a pitch angle of 3 [degree] in the forward-leaning direction), the vehicle ECU 401 may transmit to the processor 33 a determination result based on one or more states of the vehicle 1 including the numerical values detected by the sensors (for example, that the vehicle 1 satisfies a predetermined forward-leaning condition) and/or an analysis result (for example, combined with brake pedal opening information, that braking has put the vehicle in a forward-leaning state). For example, the vehicle ECU 401 may output to the display control device 30 a signal indicating a determination result that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401. The I/O interface 31 may also acquire the above-described information directly from the sensors and switches provided in the vehicle 1 without going through the vehicle ECU 401.
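The ECU-side determination and analysis described above can be sketched as two small functions: one judging the forward-lean condition from the pitch angle, and one attributing the lean to braking by combining it with the brake pedal opening. The threshold and the result labels are assumptions for illustration, not values from the patent.

```python
# Assumed "predetermined forward-leaning condition": pitch of 2 degrees or more
# in the forward direction.
FORWARD_LEAN_THRESHOLD_DEG = 2.0

def judge_forward_lean(pitch_deg_forward: float) -> bool:
    """Determination result: does the pitch satisfy the forward-lean condition?"""
    return pitch_deg_forward >= FORWARD_LEAN_THRESHOLD_DEG

def analyze_lean_cause(pitch_deg_forward: float, brake_pedal_opening: float) -> str:
    """Analysis result: attribute a forward lean to braking when the pedal is pressed."""
    if not judge_forward_lean(pitch_deg_forward):
        return "no_forward_lean"
    if brake_pedal_opening > 0.0:
        return "forward_lean_by_braking"
    return "forward_lean"

# Example: 3-degree forward pitch while the brake pedal is half pressed.
print(analyze_lean_cause(3.0, 0.5))
```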
 Further, the vehicle ECU 401 may output to the display control device 30 an instruction signal specifying an image to be displayed by the vehicle display system 10; at this time, the coordinates, size, type, and display mode of the image, the notification necessity of the image, and/or necessity-related information serving as the basis for determining the notification necessity may be added to the instruction signal and transmitted.
 The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1 or in an external server connected to the vehicle 1 via the outside-vehicle communication interface (I/O interface 31). Based on the position of the vehicle 1 acquired from the later-described vehicle position detection unit 405, the road information database 403 may read out, and transmit to the processor 33, information around the vehicle 1 (real-object-related information around the vehicle 1): road information on the road on which the vehicle 1 travels (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (buildings, bridges, rivers, etc.), including their presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information. The road information database 403 may also calculate an appropriate route (navigation information) from the departure point to the destination and output to the processor 33 a signal indicating the navigation information, or image data indicating the route.
 The vehicle position detection unit 405 is, for example, a GNSS (Global Navigation Satellite System) receiver or the like provided in the vehicle 1; it detects the current position and heading of the vehicle 1 and outputs a signal indicating the detection result, via the processor 33 or directly, to the road information database 403, the later-described mobile information terminal 417, and/or the external communication device 419. The road information database 403, the later-described mobile information terminal 417, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the vehicle position detection unit 405 continuously, intermittently, or at each predetermined event, and may select and generate information around the vehicle 1 and output it to the processor 33.
 The vehicle exterior sensors 407 detect real objects existing around the vehicle 1 (in front of, to the sides of, and behind it). The real objects detected by the vehicle exterior sensors 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), the road surface of the later-described traveling lane, lane markings, roadside objects, and/or features (buildings, etc.). The vehicle exterior sensors may be composed of, for example, a detection unit consisting of a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, a camera, or a combination thereof, and a processing device that processes (performs data fusion on) the detection data from the one or more detection units. Conventional well-known methods are applied to object detection by these radar sensors and camera sensors. Object detection by these sensors may detect the presence or absence of a real object in three-dimensional space and, when a real object exists, the position of the real object (the relative distance from the vehicle 1, and the left-right and up-down positions with the traveling direction of the vehicle 1 taken as the front-rear direction), its size (in the lateral (left-right) direction, the height (up-down) direction, etc.), its movement direction (lateral (left-right) and depth (front-rear)), its movement speed (lateral (left-right) and depth (front-rear)), and/or its type. The one or more vehicle exterior sensors 407 can detect real objects in front of the vehicle 1 in each detection cycle of each sensor and output real object information (an example of real-object-related information: the presence or absence of a real object and, when a real object exists, information such as the position, size, and/or type of each real object) to the processor 33. This real object information may also be transmitted to the processor 33 via another device (for example, the vehicle ECU 401). When a camera is used as a sensor, an infrared camera or a near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night. Further, when a camera is used as a sensor, a stereo camera that can also acquire distance and the like by parallax is desirable.
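The per-object detection output enumerated above (position, size, movement, type) maps naturally onto a small record type. The field names and sample values below are assumptions for illustration, not an interface defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RealObjectInfo:
    """One detected real object, as output per sensor detection cycle."""
    distance_m: float         # relative distance from vehicle 1 (depth direction)
    lateral_m: float          # left-right position relative to the travel direction
    vertical_m: float         # up-down position
    width_m: float            # size in the lateral direction
    height_m: float           # size in the height direction
    lateral_speed_mps: float  # movement speed, left-right
    depth_speed_mps: float    # movement speed, front-rear
    kind: Optional[str] = None  # e.g., "pedestrian", "vehicle"; None if unclassified

# Example detection list for one cycle: a pedestrian 25 m ahead, slightly left.
detections = [
    RealObjectInfo(25.0, -1.2, 0.0, 0.6, 1.7, 0.3, -0.5, "pedestrian"),
]
print(detections[0].kind, detections[0].distance_m)
```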
 The operation detection unit 409 is, for example, a hardware switch provided on the CID (Center Information Display) or instrument panel of the vehicle 1, or a software switch combining an image with a touch sensor or the like, and outputs to the processor 33 operation information based on operations by an occupant of the vehicle 1 (a user seated in the driver's seat and/or a user seated in the front passenger seat). For example, in response to user operations, the operation detection unit 409 outputs to the processor 33 display area setting information based on an operation for moving the display area 100, eyebox setting information based on an operation for moving the eyebox 200, information based on an operation for setting the observer's eye position 700, and the like.
 The eye position detection unit 411 includes a camera, such as an infrared camera, that detects the eye position 700 of the observer seated in the driver's seat of the vehicle 1, and may output the captured image to the processor 33. The processor 33 may acquire the captured image (an example of information from which the eye position 700 can be estimated) from the eye position detection unit 411 and detect the coordinates of the observer's eye position 700 by analyzing the captured image with a technique such as pattern matching; a signal indicating the detected coordinates of the eye position 700 may then be output to the processor 33.
The eye position detection unit 411 may instead output to the processor 33 an analysis result obtained by analyzing the image captured by the camera (for example, a signal indicating to which of the spatial regions corresponding to a plurality of preset display parameters 600 (described later) the observer's eye position 700 belongs). Note that the method of acquiring the eye position 700 of the observer of the vehicle 1, or information from which the observer's eye position 700 can be estimated, is not limited to these; the eye position may be acquired using a known eye position detection (estimation) technique.
The eye position detection unit 411 may also detect the movement speed and/or movement direction of the observer's eye position 700 and output a signal indicating the movement speed and/or movement direction of the observer's eye position 700 to the processor 33.
Further, when the eye position detection unit 411 detects (10) a signal indicating that the observer's eye position 700 is outside the eyebox 200, (20) a signal from which it is estimated that the observer's eye position 700 is outside the eyebox 200, or (30) a signal from which it is predicted that the observer's eye position 700 will move outside the eyebox 200, it may determine that a predetermined condition is satisfied and output a signal indicating that state to the processor 33.
(20) The signal from which it is estimated that the observer's eye position 700 is outside the eyebox 200 includes, for example, (21) a signal indicating that the observer's eye position 700 cannot be detected, (22) a signal indicating that the observer's eye position 700 can no longer be detected after movement of the observer's eye position 700 was detected, and/or (23) a signal indicating that either the observer's right-eye position 700R or left-eye position 700L is near the boundary 200A of the eyebox 200 (where "near" includes, for example, being within predetermined coordinates of the boundary 200A).
(30) The signal from which it is predicted that the observer's eye position 700 will move outside the eyebox 200 includes, for example, (31) a signal indicating that a newly detected eye position 700 is separated from a previously detected eye position 700 by at least an eye position movement distance threshold stored in advance in the memory 37 (that is, the movement of the eye position within a predetermined unit time exceeds a specified range), and (32) a signal indicating that the movement speed of the eye position is at least an eye position movement speed threshold stored in advance in the memory 37.
The IMU 413 can include a combination of one or more sensors (for example, an accelerometer and a gyroscope) configured to detect the position and orientation of the vehicle 1, and changes therein (change speed, change acceleration), based on inertial acceleration. The IMU 413 may output to the processor 33 the detected values (including signals indicating the position and orientation of the vehicle 1 and changes therein (change speed, change acceleration)) and results of analyzing the detected values. The analysis result is, for example, a signal indicating a determination of whether the detected values satisfy a predetermined condition, and may be, for example, a signal indicating, from values relating to changes (change speed, change acceleration) in the position or orientation of the vehicle 1, that the behavior (vibration) of the vehicle 1 is small.
The line-of-sight direction detection unit 415 includes an infrared camera or a visible-light camera that captures the face of the observer seated in the driver's seat of the vehicle 1, and may output the captured image to the processor 33. The processor 33 can acquire the captured image (an example of information from which the line-of-sight direction can be estimated) from the line-of-sight direction detection unit 415 and identify the observer's line-of-sight direction (and/or the gaze position) by analyzing it. Alternatively, the line-of-sight direction detection unit 415 may analyze the image captured by the camera and output to the processor 33 a signal indicating the observer's line-of-sight direction (and/or the gaze position) as the analysis result. Note that the method of acquiring information from which the line-of-sight direction of the observer of the vehicle 1 can be estimated is not limited to these, and the information may be acquired using other known line-of-sight direction detection (estimation) techniques such as the EOG (electro-oculogram) method, the corneal reflection method, the scleral reflection method, the Purkinje image detection method, the search coil method, or the infrared fundus camera method.
The portable information terminal 417 is a smartphone, a laptop computer, a smartwatch, or another information device that can be carried by the observer (or another occupant of the vehicle 1). By pairing with the portable information terminal 417, the I/O interface 31 can communicate with the portable information terminal 417 and acquire data recorded in the portable information terminal 417 (or in a server accessed through the portable information terminal). The portable information terminal 417 may, for example, have the same functions as the road information database 403 and the own vehicle position detection unit 405 described above, acquire the road information (an example of real object-related information), and transmit it to the processor 33. The portable information terminal 417 may also acquire commercial information (an example of real object-related information) relating to commercial facilities in the vicinity of the vehicle 1 and transmit it to the processor 33. The portable information terminal 417 may further transmit schedule information of its owner (for example, the observer), incoming call information, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.
The external communication device 419 is a communication device that exchanges information with the vehicle 1: for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a portable information terminal carried by the pedestrian) connected by vehicle-to-pedestrian communication (V2P: Vehicle To Pedestrian), or network communication equipment connected by road-to-vehicle communication (V2I: Vehicle To roadside Infrastructure); in a broad sense, it includes everything connected to the vehicle 1 by V2X (Vehicle To Everything) communication. The external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (such as a preceding vehicle), road surfaces, lane markings, roadside objects, and/or features (such as buildings) and output them to the processor 33. The external communication device 419 may also have the same function as the own vehicle position detection unit 405 described above, acquiring the position information of the vehicle 1 and transmitting it to the processor 33, and may further have the function of the road information database 403 described above, acquiring the road information (an example of real object-related information) and transmitting it to the processor 33. Note that the information acquired from the external communication device 419 is not limited to the above.
The software components stored in the memory 37 include an eye position detection module 502, an eye position estimation module 504, an eye position prediction module 506, an eye position state determination module 508, a display parameter setting module 510, a graphics module 512, a light source drive module 514, and an actuator drive module 516.
FIGS. 4A and 4B are flow charts showing a method S100 for executing an operation of setting display parameters based on the eye position of the observer, according to some embodiments. The method S100 is executed by the image display unit 20, which includes a display, and the display control device 30, which controls the image display unit 20. Some operations in the method S100 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method S100 provides, in particular, a method of presenting an image (virtual image) when the observer's eye position is in a specific state that is not normal (abnormal), or when eye position detection is in a specific state that is not normal (abnormal).
The eye position detection module 502 of FIG. 3 detects the observer's eye position 700 (S110). The eye position detection module 502 includes various software components for performing various operations related to detecting coordinates indicating the height of the observer's eyes (the position in the Y-axis direction, an example of a signal indicating the eye position 700), detecting coordinates indicating the height and depth-direction position of the observer's eyes (the positions in the Y- and Z-axis directions, an example of a signal indicating the eye position 700), and/or detecting coordinates indicating the observer's eye position 700 (the positions in the X-, Y-, and Z-axis directions, an example of a signal indicating the eye position 700).
Note that the eye position 700 detected by the eye position detection module 502 includes, for example, the respective positions 700R and 700L of the right eye and the left eye, a predetermined one of the right-eye position 700R and the left-eye position 700L, whichever of the right-eye position 700R and the left-eye position 700L is detectable (easier to detect), or a position calculated from the right-eye position 700R and the left-eye position 700L (for example, the midpoint between the right-eye position and the left-eye position). For example, the eye position detection module 502 determines the eye position 700 based on the observation position acquired from the eye position detection unit 411 immediately before the timing at which the display settings are updated.
The eye position detection module 502 may also detect the movement direction and/or movement speed of the observer's eye position 700 based on a plurality of observation positions of the observer's eyes acquired from the eye position detection unit 411, and output a signal indicating the movement direction and/or movement speed of the observer's eye position 700 to the processor 33.
The eye position estimation module 504 acquires information from which the eye position can be estimated (S114). The information from which the eye position can be estimated is, for example, a captured image acquired from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the sitting height, or a plurality of observation positions of the observer's eyes. The eye position estimation module 504 estimates the eye position 700 of the observer of the vehicle 1 from this information, and includes various software components for performing various operations related to estimating the observer's eye position 700, such as estimating it from the captured image acquired from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the sitting height, or a plurality of observation positions of the observer's eyes. That is, the eye position estimation module 504 may include table data, arithmetic expressions, and the like for estimating the observer's eye position 700 from the information from which the eye position can be estimated.
The eye position detection module 502 may calculate the eye position 700 from the eye observation position acquired from the eye position detection unit 411 immediately before the timing at which the display parameters are updated and one or more eye observation positions acquired in the past, using a technique such as, for example, a weighted average.
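As a rough illustration, the weighted-average calculation mentioned above could be sketched as follows. The weighting scheme (weights doubling toward newer samples) and the (x, y, z) tuple layout are assumptions made for this sketch, not details taken from the publication.

```python
# Hypothetical sketch of the weighted-average smoothing of eye
# observation positions. Weights and coordinate layout are assumed.

def smooth_eye_position(observations, weights=None):
    """Blend the newest observed eye position with past observations.

    observations: list of (x, y, z) tuples, oldest first; the last entry
    is the observation taken just before the display parameter update.
    weights: per-observation weights, oldest first; by default, newer
    samples are weighted more heavily (1, 2, 4, ...).
    """
    if weights is None:
        weights = [2 ** i for i in range(len(observations))]
    total = sum(weights)
    return tuple(
        sum(w * obs[axis] for w, obs in zip(weights, observations)) / total
        for axis in range(3)
    )
```

With a single observation the function simply returns that observation; with several, the result is pulled toward the most recent sample, which damps jitter in the detected eye position without ignoring its latest movement.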
The eye position prediction module 506 acquires information from which the observer's eye position 700 can be predicted (S116). The information from which the observer's eye position 700 can be predicted is, for example, the latest observation position acquired from the eye position detection unit 411, or one or more observation positions acquired in the past. The eye position prediction module 506 includes various software components for performing various operations related to predicting the eye position 700 based on this information. Specifically, for example, the eye position prediction module 506 predicts the observer's eye position 700 at the timing at which an image to which the new display settings have been applied will be visually recognized by the observer. The eye position prediction module 506 may predict the next value from one or more past observation positions using, for example, the least squares method or a prediction algorithm such as a Kalman filter, an α-β filter, or a particle filter.
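Of the prediction algorithms listed above, the α-β filter is the simplest to sketch. The following one-axis example is illustrative only; the gains `alpha` and `beta` and the sampling period `dt` are assumed values, not parameters from the publication.

```python
# Illustrative α-β filter predicting the next eye position along one
# axis (e.g. the X axis). Gains and sampling period are assumptions.

def alpha_beta_predict(observations, alpha=0.85, beta=0.005, dt=1.0 / 60):
    """Run an α-β filter over past observations and extrapolate one step.

    observations: past observed positions along one axis, oldest first.
    Returns the predicted position at the next sampling instant.
    """
    x, v = observations[0], 0.0        # state: position and velocity
    for z in observations[1:]:
        x_pred = x + v * dt            # predict forward one period
        r = z - x_pred                 # innovation (measurement residual)
        x = x_pred + alpha * r         # correct position with gain alpha
        v = v + (beta / dt) * r        # correct velocity with gain beta
    return x + v * dt                  # extrapolate to the next frame
```

A stationary eye position is predicted unchanged, while a moving one is extrapolated along its estimated velocity; the module could run one such filter per coordinate axis.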
The eye position state determination module 508 determines whether the observer's eye position 700 is in a specific state (S130). The eye position state determination module 508 includes various software components for performing various operations related to the observer's eye position 700 being in the specific state, such as: (10) determining whether the observer's eye position 700 can be detected, and determining that the eye position is in the specific state if it cannot be detected; (20) determining whether the observer's eye position 700 is outside the eyebox 200, and determining that it is in the specific state if it is outside the eyebox 200; (30) determining whether it can be estimated that the observer's eye position 700 is outside the eyebox 200, and determining that it is in the specific state if this can be estimated; or (40) determining whether it is predicted that the observer's eye position 700 will move outside the eyebox 200, and determining that it is in the specific state if this is predicted. That is, the eye position state determination module 508 may include table data, arithmetic expressions, and the like for determining from detection information, estimation information, or prediction information of the eye position 700 whether the eye position is in the specific state.
(S132) The method of determining whether the observer's eye position 700 can be detected includes determining that the observer's eye position 700 cannot be detected (that the observer's eye position 700 is in the specific state) based on: (1) some (for example, a predetermined number or more) or all of the observation positions of the observer's eyes acquired from the eye position detection unit 411 within a predetermined period not being detectable; (2) the eye position detection module 502 being unable to detect the observer's eye position 700 in normal operation; (3) the eye position estimation module 504 being unable to estimate the observer's eye position 700 in normal operation; (4) the eye position prediction module 506 being unable to predict the observer's eye position 700 in normal operation; or a combination of these (note that the determination method is not limited to these).
(S134) The method of determining whether the observer's eye position 700 is outside the eyebox 200 includes determining that the observer's eye position 700 is outside the eyebox 200 (that the observer's eye position 700 is in the specific state) based on: (1) some (for example, a predetermined number or more) or all of the observation positions of the observer's eyes acquired from the eye position detection unit 411 within a predetermined period being acquired outside the eyebox 200; (2) the eye position detection module 502 detecting the observer's eye position 700 outside the eyebox 200; or a combination of these (note that the determination method is not limited to these).
(S136) The method of determining whether it can be estimated that the observer's eye position 700 is outside the eyebox 200 includes determining that the observer's eye position 700 can be estimated to be outside the eyebox 200 (that the observer's eye position 700 is in the specific state) based on: (1) the observer's eye position 700 no longer being detectable after movement of the observer's eye position 700 was detected by the eye position detection unit 411; (2) the eye position detection module 502 detecting the observer's eye position 700 near the boundary 200A of the eyebox 200; (3) the eye position detection module 502 detecting either the observer's right-eye position 700R or left-eye position 700L near the boundary 200A of the eyebox 200; or a combination of these (note that the determination method is not limited to these).
(S138) The method of determining whether it is predicted that the observer's eye position 700 will move outside the eyebox 200 includes determining that the observer's eye position 700 can be predicted to be outside the eyebox 200 (that the observer's eye position 700 is in the specific state) based on: (1) the eye position prediction module 506 predicting that the observer's eye position 700 after a predetermined time will be outside the eyebox 200; (2) the eye position 700 newly detected by the eye position detection module 502 being separated from a previously detected eye position 700 by at least the eye position movement distance threshold stored in advance in the memory 37 (that is, the movement speed of the eye position 700 being at least the eye position movement speed threshold stored in advance in the memory 37); or a combination of these (note that the determination method is not limited to these).
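The determinations S132 through S138 above could be combined into a single predicate along the following lines. The eyebox geometry, threshold values, and data shapes here are placeholders for illustration; the publication does not specify any of these concrete values.

```python
# Hypothetical predicate combining the determinations S132-S138.
# Eyebox bounds and thresholds are assumed example values.

EYEBOX = {"x": (-0.15, 0.15), "y": (-0.05, 0.05)}  # assumed bounds (m)
MOVE_DIST_THRESHOLD = 0.04    # assumed movement distance threshold (m)
MOVE_SPEED_THRESHOLD = 0.50   # assumed movement speed threshold (m/s)

def inside_eyebox(pos):
    return all(lo <= pos[k] <= hi for k, (lo, hi) in EYEBOX.items())

def is_specific_state(prev_pos, new_pos, speed):
    """Return True when the eye position is in the 'specific state'."""
    if new_pos is None:                      # S132: not detectable
        return True
    if not inside_eyebox(new_pos):           # S134: outside the eyebox
        return True
    if prev_pos is not None:                 # S138 (31): large jump
        dist = ((new_pos["x"] - prev_pos["x"]) ** 2 +
                (new_pos["y"] - prev_pos["y"]) ** 2) ** 0.5
        if dist >= MOVE_DIST_THRESHOLD:
            return True
    if speed is not None and speed >= MOVE_SPEED_THRESHOLD:
        return True                          # S138 (32): fast movement
    return False
```

Any one of the conditions suffices, mirroring the "or a combination of these" wording of the determinations above.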
When the eye position state determination module 508 determines that the eye position is not in the specific state, the display parameter setting module 510 determines to which of a plurality of predetermined coordinate ranges (for example, the plurality of partitioned regions to which the plurality of display parameters 600 of FIG. 5A, 5B, or 5C correspond) the eye position 700 detected by the eye position detection module 502, estimated by the eye position estimation module 504, or predicted by the eye position prediction module 506 belongs, and sets the display parameter 600 corresponding to the eye position 700. The display parameter setting module 510 includes various software components for performing various operations related to determining to which of a plurality of spatial regions the X-axis coordinate, Y-axis coordinate, Z-axis coordinate, or combination thereof indicated by the eye position 700 belongs, and setting the display parameter 600 corresponding to the spatial region to which the eye position 700 belongs. That is, the display parameter setting module 510 may include table data, arithmetic expressions, and the like for identifying the display parameter 600 from the X-axis coordinate, Y-axis coordinate, Z-axis coordinate, or combination thereof indicated by the eye position 700.
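The table-data lookup described above could be sketched as a partitioned grid of eyebox regions, each mapped to a display parameter identifier. The region edges and parameter names below are invented for illustration and do not correspond to FIG. 5A, 5B, or 5C.

```python
# Hypothetical lookup of a display parameter 600 from partitioned
# eyebox regions. Grid edges and identifiers are assumed.

import bisect

# Region boundaries along X and Y inside the eyebox (meters, assumed).
X_EDGES = [-0.15, -0.05, 0.05, 0.15]
Y_EDGES = [-0.05, 0.0, 0.05]

# DISPLAY_PARAMS[row][col] -> identifier of a display parameter 600.
DISPLAY_PARAMS = [
    ["P00", "P01", "P02"],
    ["P10", "P11", "P12"],
]

def display_param_for(eye_pos):
    """Map an (x, y) eye position to the display parameter of its region.

    Returns None when the position falls outside every region, which a
    caller could fold into the 'specific state' handling.
    """
    x, y = eye_pos
    col = bisect.bisect_right(X_EDGES, x) - 1
    row = bisect.bisect_right(Y_EDGES, y) - 1
    if 0 <= row < len(DISPLAY_PARAMS) and 0 <= col < len(DISPLAY_PARAMS[0]):
        return DISPLAY_PARAMS[row][col]
    return None
```

The same pattern extends to three axes (X, Y, Z) by adding a third edge list, and each identifier would index a full parameter set (warping table, light source settings, actuator positions, and so on).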
The types of display parameter 600 include: (1) parameters for arranging the image so that, as viewed from the eye position 700, it has a predetermined positional relationship with a real object located outside the vehicle 1 (parameters controlling the display device 21 and/or parameters controlling an actuator in order to control the arrangement of the image); (2) parameters for pre-distorting the image in order to reduce image distortion that can arise from the virtual image optical system 90 and the like as viewed from the eye position 700 (parameters controlling the display device 21 so as to pre-distort the image displayed on it, also called warping parameters); (3) parameters for directional display that directs the light of the image toward the eye position 700 and does not direct (or weakens) the light of the image toward positions other than the eye position 700 (parameters controlling the display device 21, parameters controlling the light source of the display device 21, parameters controlling an actuator, or a combination of these); and (4) parameters for making a desired stereoscopic image visible as viewed from the eye position (parameters controlling the display device 21, parameters controlling the light source of the display device 21, parameters controlling an actuator, or a combination of these). However, the display parameter 600 set (selected) by the display parameter setting module 510 may be any parameter that is preferably changed according to the observer's eye position 700, and the types of display parameter 600 are not limited to these.
The display parameter setting module 510 selects, for each type of display parameter, one or more display parameters 600 corresponding to the eye position 700. When the eye position 700 includes, for example, both the right-eye position 700R and the left-eye position 700L, the display parameter setting module 510 can select two display parameters: a display parameter 600 corresponding to the right-eye position 700R and a display parameter 600 corresponding to the left-eye position 700L. On the other hand, when the eye position 700 is, for example, a predetermined one of the right-eye position 700R and the left-eye position 700L, whichever of the right-eye position 700R and the left-eye position 700L is detectable (easier to detect), or a single position calculated from the right-eye position 700R and the left-eye position 700L (for example, the midpoint between the right-eye position 700R and the left-eye position 700L), the display parameter setting module 510 can select one display parameter 600 corresponding to the eye position 700. Note that the display parameter setting module 510 can also select, in addition to the display parameter 600 corresponding to the eye position 700, display parameters 600 set around that eye position 700. That is, the display parameter setting module 510 can select three or more display parameters 600 corresponding to the eye position 700.
When the eye position state determination module 508 determines that the eye position is in the specific state, the display parameter setting module 510 selects a display parameter 600 set near the boundary of the eyebox 200 (including the boundary display parameter 610E) (block S154).
In some embodiments, when the eye position state determination module 508 determines that the eye position is in the specific state, the display parameter setting module 510 selects a display parameter 600 (including the boundary display parameter 610E) set closer to the boundary of the eyebox 200 than the latest eye position 700 (block S156). This method includes, for example, selecting the boundary display parameter 610E set nearest to the latest eye position 700, or selecting a display parameter 600 (including the boundary display parameter 610E) set a predetermined distance toward the boundary of the eyebox 200 from the latest eye position 700.
Also, in some embodiments, when the eye position state determination module 508 determines that the eye position is in the specific state, the display parameter setting module 510 selects, based on the latest eye position 700 and the movement direction of the eye position, a boundary display parameter 610E set at a position along the movement direction of the eye position from the latest eye position 700 (an example of block S158). This method includes, for example, selecting the boundary display parameter 610E set in the movement direction of the eye position from the latest eye position 700, or selecting a display parameter 600 (including the boundary display parameter 610E) set a predetermined distance toward the boundary of the eyebox 200 in the movement direction of the eye position from the latest eye position 700.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 may select, according to the latest eye position 700 and the moving direction of the latest eye position 700, a display parameter 600 located, along the moving direction of the eye position 700, between the display parameter 600 corresponding to the latest eye position 700 and the boundary display parameter 610E (an example of block S158). That is, the display parameter setting module 510 may select a display parameter 600 that is not the boundary display parameter 610E itself but is closer to the boundary display parameter 610E, along the moving direction of the eye position 700, than the display parameter 600 corresponding to the latest eye position 700.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 may select, based on the latest eye position 700, the moving direction of the latest eye position 700, and the moving speed of the eye position 700, any one of the display parameters 600 from the display parameter 600 corresponding to the latest eye position 700 to the boundary display parameter 610E along the moving direction of the eye position 700, according to the moving speed (an example of block S160). That is, if the moving speed is faster than a predetermined eye position moving speed threshold, it can be estimated that the observer's eye position 700 is likely to be outside the eyebox 200 or near its boundary, so the display parameter setting module 510 selects the boundary display parameter 610E (or a display parameter 600 close to the boundary display parameter 610E). If the moving speed is slower than the predetermined eye position moving speed threshold, it can be estimated that the observer's eye position 700 is unlikely to be outside the eyebox 200, so the module can select, among the display parameters 600 from the one corresponding to the latest eye position 700 to the boundary display parameter 610E along the moving direction of the eye position 700, a display parameter 600 set near the latest eye position 700.
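The speed-dependent choice of block S160 can be sketched as follows; the threshold value, the list-based representation of the regions, and the function name are hypothetical assumptions, not values from the patent.

```python
# Minimal sketch of block S160 (hypothetical names and threshold value).
# cells_to_boundary lists the regions from the cell of the latest eye
# position (index 0) up to the boundary display parameter 610E (last index),
# ordered along the eye's moving direction. A fast movement suggests the eye
# is already at or past the eyebox boundary, so the boundary cell is chosen;
# a slow movement keeps the selection near the latest eye position.

SPEED_THRESHOLD = 0.5  # m/s, hypothetical eye-position moving-speed threshold

def select_along_direction(cells_to_boundary, speed):
    if speed > SPEED_THRESHOLD:
        return cells_to_boundary[-1]   # boundary display parameter 610E
    return cells_to_boundary[0]        # stay near the latest eye position
```

A graded variant could also map intermediate speeds to intermediate indices in `cells_to_boundary`, matching the "any one of the display parameters 600 ... according to the moving speed" wording above.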
When the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 selects a display parameter 600 again (block S170).
In some embodiments, when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 selects, in the display parameter update cycle after the specific state is released, the display parameter 600 corresponding to the latest eye position 700 after the release (block S172).
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 maintains the current display parameter during the update cycle immediately after the release, and selects the display parameter 600 corresponding to the latest eye position 700 from the next display parameter update cycle onward (block S174).
FIGS. 5A, 5B, and 5C are diagrams that virtually show the arrangement of a plurality of display parameters 600 associated with the eyebox 200 and the space around it. Each display parameter 600 is associated with a region partitioned by two-dimensional coordinates consisting of an X-axis coordinate (the left-right direction when the vehicle 1 faces forward) and a Y-axis coordinate (the up-down direction). That is, within the same region, the same display parameter 600 is applied regardless of the coordinates of the eye position 700. In some embodiments, a display parameter 600 may instead be associated with three-dimensional coordinates that also include a Z-axis coordinate. The plurality of display parameters 600 include first display parameters 610 associated with the space inside the eyebox 200, where the image is expected to be visible, and second display parameters 650 associated with the space outside the eyebox 200 (the second display parameters 650 may be omitted).
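The region partition just described (every eye position inside a region maps to that region's display parameter) can be sketched as a simple grid lookup. The eyebox dimensions, the coordinate origin, and the function name below are hypothetical values chosen only for illustration.

```python
# Minimal sketch of the region lookup: the eyebox is partitioned into a
# 5x5 grid of equally sized regions (as in FIG. 5A), and every eye position
# inside a region maps to that region's display parameter.

EYEBOX_W, EYEBOX_H = 130.0, 50.0   # mm, hypothetical eyebox size
COLS, ROWS = 5, 5

def region_of(x, y):
    """Map an eye position (x, y), measured from the eyebox's top-left
    corner, to a (col, row) region index; same region -> same parameter."""
    col = min(int(x / (EYEBOX_W / COLS)), COLS - 1)
    row = min(int(y / (EYEBOX_H / ROWS)), ROWS - 1)
    return col, row
```

Because two eye positions inside the same region return the same index, the display parameter stays constant for small eye movements that do not cross a region border.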
The first display parameters 610 include boundary display parameters 610E. A boundary display parameter 610E is a display parameter 600 associated with the space just inside the boundary 200A of the eyebox 200 (on the side of the center 205 of the eyebox 200).
For example, the display parameter 612 of FIG. 5A is arranged at the left end of the eyebox 200 (inside the boundary), and is therefore classified as a boundary display parameter 610E. Likewise, the display parameter 645 of FIG. 5B is arranged at the right end of the eyebox 200 (inside the boundary), and the display parameter 642 of FIG. 5B is arranged at the upper end of the eyebox 200 (inside the boundary); both are classified as boundary display parameters 610E. That is, the boundary display parameters 610E in FIG. 5A are the display parameters 611-616, 620-621, 625-626, and 630-635, and the boundary display parameters 610E in FIGS. 5B and 5C are the display parameters 641-645.
Note that FIGS. 5A, 5B, and 5C are shown mainly to explain the spatial arrangement to which the display parameters 600 correspond, and the size of the spatial regions to which the second display parameters 650 correspond is shown conceptually. That is, the spatial regions corresponding to the second display parameters 650 may be expanded further outward, or reduced inward (toward the eyebox 200).
Refer to the upper part of FIG. 5A. The first display parameters 610 consist of 25 first display parameters 611 to 635, partitioned into five columns in the left-right direction (X-axis direction) and five rows in the up-down direction (Y-axis direction). In FIG. 5A, the columns are numbered first, second, ... fifth from the positive X direction (left), and the rows are numbered first, second, ... fifth from the positive Y direction (top); the first display parameters 611 to 615 are arranged in order from the first row to the fifth row of the first column, and the parameters 616 to 620 in order from the first row to the fifth row of the second column.
The lower part of FIG. 5A corresponds to the upper part of FIG. 5A and explains the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
In some embodiments, when the eye position state determination module 508 determines that the specific state does not exist, the display parameter setting module 510 selects the display parameter 627 to whose region the eye position 701 detected by the eye position detection module 502 belongs.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects the display parameter 626, the display parameter 631, the display parameter 632, or a combination of two or more of these, which are the boundary display parameters 610E corresponding to the regions nearest to the latest eye position 701 before the specific state was determined. The nearest boundary display parameter 610E corresponding to the coordinates of a given eye position 701 may be stored in the memory 37 in advance; that is, the spatial region corresponding to the display parameter 627 may be stored in the memory 37 in advance in association with any one of the display parameters 626, 631, and 632 as its nearest boundary display parameter 610E. In some embodiments, the display parameter setting module 510 may select whichever one of the display parameters 626, 631, and 632 is closest to the detailed coordinates of the eye position 701 within the region corresponding to the display parameter 627. That is, the display parameter setting module 510 may select the display parameter 626 if the coordinates of the eye position 701 are close to the region corresponding to the display parameter 626, and may select the display parameter 632 if the coordinates of the eye position 701 are close to the region corresponding to the display parameter 632.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects the display parameter 626, which is the boundary display parameter 610E set in the space nearest to the region corresponding to the display parameter 654 (second display parameter 654) that corresponds to the latest eye position 702 before the specific state was determined.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, if the display parameter 631 corresponding to the region to which the latest eye position 703 before the determination belongs is itself a boundary display parameter 610E, the display parameter setting module 510 selects that display parameter 631 as it is.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects, according to the latest eye position 704 before the specific state was determined and the latest moving direction 751 of the eye position before the determination (or immediately after it), the display parameter 634, which is the boundary display parameter 610E arranged in the moving direction 751 of the eye position relative to the display parameter 624 corresponding to the region to which the latest eye position 704 belongs. On the other hand, according to the latest eye position 704 and the moving direction 752 of the eye position, the display parameter setting module 510 selects the display parameter 614, which is the boundary display parameter 610E arranged in the moving direction 752 of the eye position relative to the display parameter 624 corresponding to the latest eye position 704.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects, according to the latest eye position 705 before the specific state was determined and the latest moving direction 753 of the eye position before the determination (or immediately after it), the display parameter 612, which is the boundary display parameter 610E arranged in the moving direction 753 of the eye position relative to the display parameter 622 corresponding to the latest eye position 705. Thereafter, assume that the eye position moves from the eye position 706 in the specific state along the moving direction 754 and the specific state ends; the latest eye position after the specific state ends is denoted by reference numeral 707. When the eye position state determination module 508 determines that the specific state does not exist, the display parameter setting module 510 selects the display parameter 611 to whose region the eye position 707 detected by the eye position detection module 502 belongs. In brief, as the eye position 700 moves in this way, the display parameter setting module 510 selects the display parameter 622 corresponding to the eye position 705 while not in the specific state, selects the display parameter 612 (boundary display parameter 610E) corresponding to the eye position 705 and the moving direction 753 when the specific state begins, and selects the display parameter 611 corresponding to the eye position 707 when the specific state ends.
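The three-phase selection traced above for the eye positions 705 to 707 can be sketched as a simple state-driven selector; the function and argument names are hypothetical and chosen only to mirror the narrative.

```python
# Minimal sketch of the selection policy traced for eye positions 705 -> 707
# (hypothetical helper names). While not in the specific state, the parameter
# of the current eye-position region is used; while in the specific state,
# the boundary display parameter along the last moving direction is used; on
# release, selection returns to the current eye-position region.

def select_parameter(specific_state, current_region_param, boundary_param_in_dir):
    if specific_state:
        return boundary_param_in_dir   # e.g. 612 for eye position 705 / dir 753
    return current_region_param        # e.g. 622 before, 611 after release

trace = [
    select_parameter(False, "622", "612"),  # not specific: eye position 705
    select_parameter(True,  "622", "612"),  # specific state entered
    select_parameter(False, "611", "612"),  # released: eye position 707
]
```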
Next, refer to FIG. 5B, which explains a modified arrangement of the display parameters 600. The first display parameters 610 consist of five first display parameters 641 to 645, partitioned into five columns in the left-right direction (X-axis direction) and not partitioned in the up-down direction (Y-axis direction). In FIG. 5B, the columns are numbered first, second, ... fifth from the positive X direction (left), and the first display parameters 641 to 645 are arranged in order from the first column to the fifth column. That is, the arrangement of the display parameters 600 in the eyebox 200 need not be the two-dimensional arrangement of FIG. 5A, in which a plurality of determination areas are arranged in the X-axis direction and also in the Y-axis direction; it may be a one-dimensional arrangement as shown in FIG. 5B, in which a plurality of determination areas are arranged in the X-axis direction but not in the Y-axis direction.
Next, refer to FIG. 5C, which explains another modified arrangement of the display parameters 600. In FIGS. 5A and 5B, the plurality of display parameters 600 arranged in the eyebox 200 have the same size, but as shown in FIG. 5C, their sizes may differ. The shapes of the plurality of display parameters 600 may also differ from one another.
The upper part of FIG. 6 is a diagram that explains the arrangement of a plurality of display parameters 600 associated with the eyebox 200 and the space around it.
In some embodiments, the second display parameters 650 may also include second boundary display parameters 650E, as shown in the upper part of FIG. 6. A second boundary display parameter 650E is a display parameter 600 associated with a spatial region just outside the boundary 200A of the eyebox 200. That is, the boundary display parameters may include first boundary display parameters 610E corresponding to regions inside the boundary 200A of the eyebox 200 and second boundary display parameters 650E corresponding to regions outside the boundary 200A.
The lower part of FIG. 6 corresponds to the upper part of FIG. 6 and explains the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, even if the display parameter 631 corresponding to the region to which the latest eye position 703 before the determination belongs is a first boundary display parameter 610E, the display parameter setting module 510 may set a second boundary display parameter 650E corresponding to a region even farther from the center 205 of the eyebox 200 than the region corresponding to the already set display parameter 631 (the display parameter 655, the display parameter 666, or the display parameter (not shown) corresponding to the region that is to the right of the region of the display parameter 655 and above the region of the display parameter 666).
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects, according to the latest eye position 704 before the specific state was determined and the latest moving direction 751 of the eye position before the determination (or immediately after it), either the display parameter 634, which is the first boundary display parameter 610E, or the display parameter 669, which is the second boundary display parameter 650E, arranged in the moving direction 751 of the eye position relative to the display parameter 624 corresponding to the region to which the latest eye position 704 belongs.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists, the display parameter setting module 510 selects, according to the latest eye position 705 before the specific state was determined and the latest moving direction 753 of the eye position before the determination (or immediately after it), either the display parameter 612, which is the first boundary display parameter 610E, or the display parameter 662, which is the second boundary display parameter 650E, arranged in the moving direction 753 of the eye position relative to the display parameter 622 corresponding to the latest eye position 705. Thereafter, assume that the eye position moves from the eye position 706 in the specific state along the moving direction 754 and the specific state ends; the latest eye position after the specific state ends is denoted by reference numeral 707. When the eye position state determination module 508 determines that the specific state does not exist, the display parameter setting module 510 selects the display parameter 611 to whose region the eye position 707 detected by the eye position detection module 502 belongs.
Note that adjacent first boundary display parameters 610E and second boundary display parameters 650E may be the same display parameter (the same set values). That is, two or all of the display parameter 631, which is the first boundary display parameter 610E in FIG. 6, and the second boundary display parameters 650E associated with the regions adjacent to it, namely the display parameter 655, the display parameter 666, and the display parameter (not shown) corresponding to the region that is to the right of the region of the display parameter 655 and above the region of the display parameter 666, may be the same display parameter (the same set values).
Further, in some embodiments, the arrangement of the spatial regions to which the plurality of display parameters 600 correspond may be changeable. For example, when a predetermined condition is satisfied, the arrangement of the spatial regions corresponding to the plurality of display parameters 600 may be changed from the arrangement shown in FIG. 5A to the arrangement shown in FIG. 7. Compared with the arrangement of FIG. 5A, the arrangement of FIG. 7 lengthens the left-right (X-axis direction) width and/or the up-down (Y-axis direction) width of the spatial regions corresponding to the boundary display parameters 610E (650E). That is, because the regions corresponding to the boundary display parameters 610E (650E) are wider in the arrangement of FIG. 7, a constant display parameter 600 (boundary display parameter 610E (650E)) is more easily maintained even if the eye position 700 shifts.
That is, in some embodiments, when the eye position state determination module 508 determines that the specific state exists (this is an example of satisfying the predetermined condition), the display parameter setting module 510 may enlarge the spatial regions corresponding to the boundary display parameters 610E (650E). In other words, when the state of the detected eye position 700, or the detection state of the eye position 700, is the specific state, the display parameter 600 can be made less likely to switch even if the actual eye position 700 changes near the boundary of the eyebox 200.
Further, in some embodiments, when the eye position state determination module 508 determines that the specific state exists and the latest eye position 700 before the determination was separated from the boundary 200A of the eyebox 200 by a predetermined distance or more (this is an example of satisfying the predetermined condition), the display parameter setting module 510 may enlarge the spatial regions corresponding to the boundary display parameters 610E (650E). In this case, the boundary display parameters 610E may be enlarged so as to include the region of the latest eye position 700 before the determination of the specific state.
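One way to realize the widening of the boundary regions described above can be sketched as follows; the scale factor and the equal-width starting grid are hypothetical assumptions, not values from the patent.

```python
# Minimal sketch of widening the two boundary columns when the predetermined
# condition (e.g. the specific state) is satisfied. The interior columns
# shrink so the total eyebox width is preserved, which makes the boundary
# display parameter easier to keep when the detected eye position drifts.

def column_widths(total_width, n_cols, boundary_scale=1.0):
    """Return per-column widths; boundary_scale > 1 widens the two edge
    columns at the expense of the interior columns."""
    base = total_width / n_cols
    edge = base * boundary_scale
    inner = (total_width - 2 * edge) / (n_cols - 2)
    return [edge] + [inner] * (n_cols - 2) + [edge]
```

With `boundary_scale=1.0` this reproduces the equal-width partition of FIG. 5A; a larger scale corresponds to the FIG. 7 arrangement.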
Further, in some embodiments, the number (density) of spatial regions to which the plurality of display parameters 600 correspond may be changeable. For example, when a predetermined condition is satisfied, the number of spatial regions corresponding to the plurality of first display parameters 610 in the eyebox 200 may be changed from the five shown in FIG. 5B to the 25 shown in FIG. 5A. When the number of spatial regions corresponding to the plurality of display parameters 600 is increased (in other words, when the density is increased), the display parameter 600 switches more readily in response to changes in the eye position 700.
That is, in some embodiments, when it can be determined that the own-vehicle speed acquired from the vehicle 1 (vehicle ECU 401) is high (this is an example of satisfying the predetermined condition), the display parameter setting module 510 may set a high density of the spatial regions corresponding to the plurality of display parameters 600. That is, the display parameter 600 can be switched smoothly based on the eye position 700 at high speed, and is less likely to switch based on the eye position 700 at low speed.
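The speed-dependent density switch can be sketched as below; the speed threshold and the specific grid shapes are hypothetical illustrations of the FIG. 5A / FIG. 5B alternatives.

```python
# Minimal sketch of making the region density depend on the own-vehicle
# speed (hypothetical threshold): at high speed, the finer 5x5 grid of
# FIG. 5A lets the parameter follow the eye position smoothly; at low
# speed, the coarse 5-region grid of FIG. 5B makes switching less frequent.

HIGH_SPEED_KMH = 60.0  # hypothetical own-vehicle speed threshold

def region_grid(vehicle_speed_kmh):
    """Return (cols, rows) of the display-parameter partition."""
    if vehicle_speed_kmh >= HIGH_SPEED_KMH:
        return (5, 5)   # 25 regions, as in FIG. 5A
    return (5, 1)       # 5 regions, as in FIG. 5B
```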
Further, in some embodiments, the conditions under which the eye position state determination module 508 determines the specific state may at least include detection of a signal indicating a moving direction 750 of the eye position 700 that includes the left-right direction (X-axis direction), and/or detection of a signal indicating a moving direction 750 of the eye position 700 that includes the up-down direction (Y-axis direction). For example, when the eye position 700 has moved in the left-right direction (X-axis direction), the display parameter setting module 510 executes the process of setting a display parameter 600 corresponding to a region closer to the boundary 200A of the eyebox 200, or farther from the center 205 of the eyebox 200, than the region corresponding to the already set display parameter 600; if the eye position 700 has not moved in the left-right direction (X-axis direction), the module need not execute these processes.
Further, in some embodiments, the conditions under which the eye position state determination module 508 determines the specific state may at least include detection of a signal indicating that the behavior (vibration) of the vehicle 1 is small. For example, the display parameter setting module 510 may refrain from determining the specific state when the behavior (vibration) of the vehicle 1 is large.
Further, in some embodiments, when the eye position 700 cannot be detected but the eye position state determination module 508 does not determine the specific state, the display setting module 510 may maintain the display parameter 600 corresponding to the latest eye position 700.
 In some embodiments, when the eye position 700 cannot be detected but the eye position state determination module 508 does not determine the specific state, the display setting module 510 may set the display parameter 600 corresponding to the center 205 of the eyebox 200.
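The two fallback behaviors described in the preceding paragraphs (maintaining the display parameter of the most recent eye position, or reverting to the parameter for the center 205) can be sketched together as follows. The function and parameter names are hypothetical, and the choice between the two embodiments is reduced to a single flag for illustration.

```python
# Hypothetical sketch of the fallback behavior when the eye position 700
# cannot be detected but the specific state is NOT determined. Depending on
# the embodiment, the module either keeps the display parameter for the
# most recent eye position or falls back to the eyebox-center parameter.

from typing import Optional

def fallback_parameter(latest_param: Optional[str], center_param: str,
                       prefer_latest: bool = True) -> str:
    """Pick the display parameter 600 to use while the eye position is lost."""
    if prefer_latest and latest_param is not None:
        return latest_param   # embodiment 1: maintain the most recent parameter
    return center_param       # embodiment 2: use the center-205 parameter
```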
 Referring again to FIG. 3, the graphics module 512 includes various known software components for performing image processing such as rendering to generate image data and for driving the display device 21. The graphics module 512 may also include various known software components for changing the type, arrangement (position coordinates, angle), size, display distance (in the case of 3D), and visual effects (for example, luminance, transparency, saturation, contrast, or other visual characteristics) of the displayed image. The graphics module 512 can generate image data so that the image is perceived by the observer with the image type (one of the display parameters), the position coordinates of the image (one of the display parameters), the angle of the image (the pitch angle about the X axis, the yaw angle about the Y axis, the roll angle about the Z axis, and the like, each one of the display parameters), and the size of the image (one of the display parameters), and can drive the image display unit 20.
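As a rough illustration of the display parameters the graphics module 512 consumes, the following sketch groups the image type, position coordinates, the three rotation angles, and the size into a single record and turns it into a rendering command. All names and the record layout are illustrative assumptions, not the embodiment's actual data structures.

```python
# Hypothetical sketch of the display parameters handled by graphics
# module 512: type, position coordinates, angles (pitch about X, yaw
# about Y, roll about Z), and size.

from dataclasses import dataclass

@dataclass
class DisplayParameter:
    image_type: str      # kind of image to draw (one of the display parameters)
    position: tuple      # (x, y) position coordinates
    pitch: float         # rotation about the X axis, degrees
    yaw: float           # rotation about the Y axis, degrees
    roll: float          # rotation about the Z axis, degrees
    size: tuple          # (width, height)

def render_command(p: DisplayParameter) -> dict:
    """Translate a display parameter into a rendering command for display 21."""
    return {"type": p.image_type, "pos": p.position,
            "rot": (p.pitch, p.yaw, p.roll), "size": p.size}
```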
 The light source drive module 514 includes various known software components for driving the light source unit 24. The light source drive module 514 can drive the light source unit 24 based on the set display parameters 600.
 The actuator drive module 516 includes various known software components for driving the first actuator 28 and/or the second actuator 29. The actuator drive module 516 can drive the first actuator 28 and the second actuator 29 based on the set display parameters 600.
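A minimal sketch of driving the actuators from a set display parameter, assuming (hypothetically) that the parameter carries target values for the tilt angle θt and the vertical placement angle θv; the key names and the parameter-to-angle mapping are assumptions for illustration only.

```python
# Hypothetical sketch: deriving target angles for the first actuator 28
# and second actuator 29 from a set display parameter 600. The mapping
# (tilt angle θt for actuator 28, vertical placement angle θv for
# actuator 29) is illustrative, not taken from the embodiment.

def drive_actuators(display_param: dict) -> dict:
    """Return target angles for the first (28) and second (29) actuators."""
    return {
        "actuator_28_tilt_deg": display_param.get("tilt_angle", 0.0),        # θt
        "actuator_29_vertical_deg": display_param.get("vertical_angle", 0.0),  # θv
    }
```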
 The operations of the processing described above can be implemented by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. These modules, combinations of these modules, and/or combinations with known hardware capable of substituting for their functions are all within the scope of protection of the present invention.
 The functional blocks of the vehicle display system 10 are optionally implemented by hardware, software, or a combination of hardware and software in order to carry out the principles of the various described embodiments. Those skilled in the art will understand that the functional blocks described in FIG. 3 may optionally be combined, or one functional block may be separated into two or more sub-blocks, to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
1: Vehicle
2: Front windshield
4: Eye
4L: Eye position
5: Dashboard
10: Vehicle display system
20: Image display unit (head-up display device)
21: Display
21a: Display surface
22: Liquid crystal display panel
24: Light source unit
25: Relay optical system
26: First mirror
27: Second mirror
28: First actuator
29: Second actuator
30: Display control device
31: I/O interface
33: Processor
35: Image processing circuit
37: Memory
40: Display light
40p: Optical axis
41: First image light
42: Second image light
43: Third image light
90: Virtual image optical system
100: Display area
101: Upper end
102: Lower end
200: Eyebox
200A: Boundary
205: Center
401: Vehicle ECU
403: Road information database
405: Own-vehicle position detection unit
407: Exterior sensor
409: Operation detection unit
411: Eye position detection unit
413: IMU
415: Gaze direction detection unit
417: Portable information terminal
419: External communication device
502: Eye position detection module
504: Eye position estimation module
506: Eye position prediction module
508: Eye position state determination module
510: Display parameter setting module
512: Graphics module
514: Light source drive module
516: Actuator drive module
600: Display parameter
610: First display parameter
610E: Boundary display parameter
700: Eye position
700L: Left-eye position
700R: Right-eye position
751: Movement direction
752: Movement direction
753: Movement direction
754: Movement direction
AX1: First rotation axis
AX2: Second rotation axis
M: Image
V: Virtual image
θt: Tilt angle
θv: Vertical placement angle

Claims (15)

  1.  A display control device (30) for controlling an image display unit (20) whose image is visible from within the range of an eyebox (200), the display control device (30) comprising:
     one or more I/O interfaces (31) capable of acquiring information;
     one or more processors (33);
     a memory (37); and
     one or more computer programs stored in the memory (37) and configured to be executed by the one or more processors (33),
     wherein the one or more I/O interfaces (31) acquire information indicating an eye position of an observer and/or information from which the eye position can be estimated,
     the memory (37) stores a plurality of display parameters (600) each corresponding to a spatial region, and
     the one or more processors (33)
      set one or more of the display parameters (600) based on the eye position,
      determine whether the information relating to the eye position indicates a specific state, and
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200), or farther from the center (205) of the eyebox (200), than the region to which the already-set display parameter (600) corresponds.
  2.  The display control device (30) according to claim 1, wherein the plurality of display parameters (600) include boundary display parameters (610E, 650E) corresponding to regions in the vicinity of the boundary of the eyebox (200), and
     the one or more I/O interfaces (31),
      when the information relating to the eye position is determined to indicate the specific state, set the boundary display parameter (610E, 650E) corresponding to the region closest to the most recent eye position obtained before the specific state was determined.
  3.  The display control device (30) according to claim 1, wherein the plurality of display parameters (600) include a boundary display parameter (610E) corresponding to a region inside the boundary of the eyebox (200), and
     the one or more I/O interfaces (31),
      when the information relating to the eye position is determined to indicate the specific state, set the boundary display parameter (610E) corresponding to the region closest to the most recent eye position obtained before the specific state was determined.
  4.  The display control device (30) according to claim 1, wherein the one or more I/O interfaces (31) acquire information indicating a movement direction of the eye position of the observer and/or information from which the movement direction of the eye position can be estimated, and
     the one or more processors (33),
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position and the movement direction of the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
  5.  The display control device (30) according to claim 1, wherein the one or more I/O interfaces (31) acquire
      information indicating a movement direction of the eye position of the observer and/or information from which the movement direction of the eye position can be estimated, and
      information indicating a movement speed of the eye position of the observer and/or information from which the movement speed of the eye position can be estimated, and
     the one or more processors (33),
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position, the movement direction of the eye position, and the movement speed of the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
  6.  The display control device (30) according to claim 1, wherein the one or more processors (33) determine that the specific state exists when the eye position cannot be detected.
  7.  The display control device (30) according to claim 1, wherein the one or more processors (33) determine that the specific state exists when the eye position is detected outside the eyebox (200).
  8.  The display control device (30) according to claim 1, wherein the one or more processors (33)
      maintain the set display parameter (600) during the update cycle of the display parameter (600) after the specific state is determined to have been released, and
      from the next update cycle of the display parameter (600) onward, change to the display parameter (600) corresponding to the region to which the most recent eye position belongs.
  9.  A head-up display device (20) whose image is visible from within the range of an eyebox (200), the head-up display device (20) comprising:
     a display (21);
     a relay optical system (25) that directs light from the display (21) toward a projection-target portion;
     one or more I/O interfaces (31) capable of acquiring information;
     one or more processors (33);
     a memory (37); and
     one or more computer programs stored in the memory (37) and configured to be executed by the one or more processors (33),
     wherein the one or more I/O interfaces (31) acquire information indicating an eye position of an observer and/or information from which the eye position can be estimated,
     the memory (37) stores a plurality of display parameters (600) each corresponding to a spatial region, and
     the one or more processors (33)
      set one or more of the display parameters (600) based on the eye position,
      determine whether the information relating to the eye position indicates a specific state, and
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
  10.  The head-up display device (20) according to claim 9, wherein the plurality of display parameters (600) include boundary display parameters (610E, 650E) corresponding to regions in the vicinity of the boundary of the eyebox (200), and
     the one or more I/O interfaces (31),
      when the information relating to the eye position is determined to indicate the specific state, set the boundary display parameter (610E, 650E) corresponding to the region closest to the most recent eye position obtained before the specific state was determined.
  11.  The head-up display device (20) according to claim 9, wherein the one or more I/O interfaces (31) acquire information indicating a movement direction of the eye position of the observer and/or information from which the movement direction of the eye position can be estimated, and
     the one or more processors (33),
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position and the movement direction of the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
  12.  The head-up display device (20) according to claim 9, wherein the one or more I/O interfaces (31) acquire
      information indicating a movement direction of the eye position of the observer and/or information from which the movement direction of the eye position can be estimated, and
      information indicating a movement speed of the eye position of the observer and/or information from which the movement speed of the eye position can be estimated, and
     the one or more processors (33),
      when the information relating to the eye position is determined to indicate the specific state, set, based on at least the eye position, the movement direction of the eye position, and the movement speed of the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
  13.  The head-up display device (20) according to claim 9, wherein the one or more processors (33) determine that the specific state exists when the eye position cannot be detected.
  14.  The head-up display device (20) according to claim 9, wherein the one or more processors (33)
      maintain the set display parameter (600) during the update cycle of the display parameter (600) after the specific state is determined to have been released, and
      from the next update cycle of the display parameter (600) onward, change to the display parameter (600) corresponding to the region to which the most recent eye position belongs.
  15.  A method of controlling an image display unit (20) whose image is visible from within the range of an eyebox (200), the method comprising:
     acquiring information indicating an eye position of an observer and/or information from which the eye position can be estimated;
     setting one or more display parameters (600) based on the eye position;
     determining whether the information relating to the eye position indicates a specific state; and
     when the information relating to the eye position is determined to indicate the specific state, setting, based on at least the eye position, a display parameter (600) corresponding to a region closer to the boundary of the eyebox (200) than the region to which the already-set display parameter (600) corresponds.
PCT/JP2021/013481 2020-03-31 2021-03-30 Display control device, head-up display device, and method WO2021200914A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020063845 2020-03-31
JP2020-063845 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021200914A1 true WO2021200914A1 (en) 2021-10-07

Family

ID=77928456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/013481 WO2021200914A1 (en) 2020-03-31 2021-03-30 Display control device, head-up display device, and method

Country Status (1)

Country Link
WO (1) WO2021200914A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015215505A (en) * 2014-05-12 2015-12-03 パナソニックIpマネジメント株式会社 Display apparatus and display method
US20160109943A1 (en) * 2014-10-21 2016-04-21 Honeywell International Inc. System and method for controlling visibility of a proximity display
JP2016210212A (en) * 2015-04-30 2016-12-15 株式会社リコー Information providing device, information providing method and control program for information provision
JP2017171146A (en) * 2016-03-24 2017-09-28 カルソニックカンセイ株式会社 Head-up display device
JP2019018770A (en) * 2017-07-20 2019-02-07 アルパイン株式会社 Head-up display device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113997786A (en) * 2021-12-30 2022-02-01 江苏赫奕科技有限公司 Instrument interface display method and device suitable for vehicle
CN113997786B (en) * 2021-12-30 2022-03-25 江苏赫奕科技有限公司 Instrument interface display method and device suitable for vehicle

Similar Documents

Publication Publication Date Title
EP3888965B1 (en) Head-up display, vehicle display system, and vehicle display method
JP7006235B2 (en) Display control device, display control method and vehicle
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
JP7255608B2 (en) DISPLAY CONTROLLER, METHOD, AND COMPUTER PROGRAM
JP7459883B2 (en) Display control device, head-up display device, and method
WO2021200914A1 (en) Display control device, head-up display device, and method
WO2022230995A1 (en) Display control device, head-up display device, and display control method
WO2023048213A1 (en) Display control device, head-up display device, and display control method
WO2020158601A1 (en) Display control device, method, and computer program
JP2022072954A (en) Display control device, head-up display device, and display control method
JP2020121607A (en) Display control device, method and computer program
JP2020121704A (en) Display control device, head-up display device, method and computer program
JP2021056358A (en) Head-up display device
WO2023003045A1 (en) Display control device, head-up display device, and display control method
JP2022077138A (en) Display controller, head-up display device, and display control method
WO2023145852A1 (en) Display control device, display system, and display control method
WO2021200913A1 (en) Display control device, image display device, and method
WO2023210682A1 (en) Display control device, head-up display device, and display control method
JP2022057051A (en) Display controller and virtual display device
JP2022190724A (en) Display control device, head-up display device and display control method
JP7434894B2 (en) Vehicle display device
JP2022113292A (en) Display control device, head-up display device, and display control method
JP2020199883A (en) Display control device, head-up display device, method and computer program
JP2021162704A (en) Head-up display device and head-up display system
JP2022028389A (en) Head-up display device, display controller, and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21779616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21779616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP