WO2021200914A1 - Display control device, head-up display device, and method - Google Patents

Display control device, head-up display device, and method

Info

Publication number
WO2021200914A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye position
display
eye
display parameter
specific state
Prior art date
Application number
PCT/JP2021/013481
Other languages
English (en)
Japanese (ja)
Inventor
誠 秦
Original Assignee
日本精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社
Publication of WO2021200914A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present disclosure relates to a display control device, a head-up display device, and a method used in a vehicle to superimpose an image on the foreground of the vehicle so that the image is visually recognized.
  • Patent Document 1 describes a head-up display device that detects the viewpoint position of an observer and corrects the display position of an image (an example of a display setting) according to the viewpoint position; it also discloses that, when the viewpoint position cannot be detected, the display position of the image is maintained and display is continued.
  • A difference is assumed to arise between the display setting that the observer should visually recognize (the display setting corresponding to the actual eye position) and the display setting actually executed by the display device (an inappropriate display setting). Even after an image with an inappropriate display setting has been displayed in this way, if the eye position sensor can correctly detect the actual eye position, the display control device can update to the appropriate display setting; in such a case, however, the observer perceives the change from the image with the inappropriate display setting to the image with the appropriate display setting.
  • The head-up display device disclosed in Patent Document 1 maintains the display position (display setting) of the image when the viewpoint position cannot be detected; that is, it maintains the display setting corresponding to the last viewpoint position at which the viewpoint position could be detected. If the viewpoint position becomes undetectable after having been detectable, there is a high possibility that the viewpoint position has moved to another position; more specifically, it is assumed that the viewpoint position has most likely moved out of the area in which the observer's viewpoint position is generally expected to be located (also called the eyellipse or eye box).
  • When the observer's viewpoint position enters the eyellipse from outside it (a state in which the viewpoint position cannot be detected), the observer's viewpoint position is located at the boundary between the inside and outside of the eyellipse. The display setting corresponding to the last viewpoint position at which the viewpoint position could be detected is applied to the image visually recognized here, and if the last detected viewpoint position and the observer's actual viewpoint position (at the boundary of the eyellipse) are far apart, the display settings also differ significantly, so there is a risk that the observer will visually recognize an image of low display quality that deviates from the appropriate image.
  • The outline of the present disclosure relates to suppressing deterioration of image display quality; more specifically, it also relates to making it difficult for the observer to visually recognize an image of low display quality even if a specific state occurs in the detection result of the eye position.
  • The display control device described in the present specification acquires information indicating the eye position of the observer and/or information from which the eye position can be estimated; a memory stores a plurality of display parameters, each corresponding to a spatial region; and one or more processors set one or more display parameters based on the acquired eye position, determine whether the information about the eye position indicates a specific state, and, when the information about the eye position indicates the specific state, set, based at least on the eye position, the display parameter corresponding to a region closer to the boundary of the eye box than the region corresponding to the display parameter already set.
  • FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle according to some embodiments.
  • FIG. 2 is a diagram showing a configuration of a head-up display device according to some embodiments.
  • FIG. 3 is a block diagram of a vehicle display system according to some embodiments.
  • FIG. 4A is a flow chart showing a method of performing an operation of setting display parameters based on the observer's eye position according to some embodiments.
  • FIG. 4B is a flow chart showing a method following FIG. 4A.
  • The upper diagram of FIG. 5A illustrates the arrangement of spatial regions corresponding to display parameters in some embodiments, and the lower diagram of FIG. 5A corresponds to the upper diagram.
  • FIG. 5B is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 5C is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 6 is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIG. 7 is a diagram for providing a description of the arrangement of spatial regions corresponding to display parameters in some embodiments.
  • FIGS. 1 to 7 provide a description of the configuration and operation of an exemplary vehicle display system.
  • the present invention is not limited to the following embodiments (including the contents of the drawings). Of course, changes (including deletion of components) can be made to the following embodiments. Further, in the following description, in order to facilitate understanding of the present invention, description of known technical matters will be omitted as appropriate.
  • the image display unit 20 in the vehicle display system 10 is a head-up display (HUD: Head-Up Display) device provided in the dashboard 5 of the vehicle 1.
  • The image display unit 20 emits the display light 40 toward the front windshield 2 (an example of the projected part), and the front windshield 2 reflects the display light 40 of the image M displayed by the image display unit 20 (see FIG. 2) toward the eye box 200. By placing the eyes within the eye box 200, the observer can visually recognize the virtual image V of the image M displayed by the image display unit 20 at a position overlapping the foreground, which is the real space visually recognized through the front windshield 2.
  • In the following description, the left-right direction of the vehicle 1 is the X-axis direction (the left side when facing the front of the vehicle 1 is the positive X-axis direction), the vertical direction is the Y-axis direction (the upper side of the vehicle 1 traveling on the road surface is the positive Y-axis direction), and the front-rear direction of the vehicle 1 is the Z-axis direction (the front of the vehicle 1 is the positive Z-axis direction).
  • the "eye box" used in the description of the present embodiment is (1) a region in which the entire virtual image V of the image M can be visually recognized in the region, and at least a part of the virtual image V of the image M is not visible outside the region, (2). At least a part of the virtual image V of the image M can be visually recognized in the area, and a part of the virtual image V of the image M is not visible outside the area.
  • the virtual image V of the image M can be visually recognized as described above and the entire virtual image V of the image M is less than the predetermined brightness, or (4) the virtual image V which can be viewed stereoscopically can be displayed by the image display unit 20, the virtual image V of the virtual image V At least a part of the virtual image V can be viewed stereoscopically, and a part of the virtual image V is not stereoscopically viewed outside the region. That is, when the observer places the eyes (both eyes) outside the eye box 200, the observer cannot see the entire virtual image V of the image M, and the visibility of the entire virtual image V of the image M is very low and difficult to perceive. Or, the virtual image V of the image M cannot be viewed stereoscopically.
  • the predetermined brightness is, for example, about 1/50 of the brightness of the virtual image of the image M visually recognized at the center of the eye box.
  • the "eye box” is the same as the area (also referred to as eye lip) where the observer's viewpoint position is expected to be arranged in the vehicle on which the HUD device 20 is mounted, or most of the eye lip (for example, 80% or more). Is set to include.
  • the display area 100 is an area of a plane, a curved surface, or a partially curved surface in which the image M generated inside the image display unit 20 forms an image as a virtual image V, and is also called an image forming surface.
  • The display area 100 is the position at which the display surface 21a (for example, the exit surface of the liquid crystal display panel) of the display 21 of the image display unit 20, described later, is imaged as a virtual image; that is, the display area 100 corresponds to the display surface 21a of the image display unit 20 (in other words, the display area 100 has a conjugate relationship with the display surface 21a of the display 21), and the virtual image visually recognized in the display area 100 can be said to correspond to the image displayed on the display surface 21a. It is preferable that the display area 100 itself has such low visibility that it is not actually seen, or is difficult to see, by the observer's eyes.
  • For the display area 100, a tilt angle θt (FIG. 1) is set as the angle formed with the horizontal direction (XZ plane) about the left-right direction (X-axis direction) of the vehicle 1. In addition, the angle formed by the line connecting the center 205 of the eye box 200 and the upper end 101 of the display area 100 and the line connecting the center 205 of the eye box 200 and the lower end 102 of the display area 100 is defined as the vertical angle of the display area 100, and a vertical arrangement angle θv (FIG. 1) is set as the angle formed by the bisector of this vertical angle and the horizontal direction (XZ plane).
  • The display area 100 of the present embodiment has a tilt angle θt of approximately 90 [degree] so that it substantially faces the observer facing forward (the positive Z-axis direction). However, the tilt angle θt is not limited to this and can be changed within the range 0 ≤ θt ≤ 90 [degree]; for example, the tilt angle θt may be set to 60 [degree], and the display area 100 may be arranged so that its upper region is farther from the observer than its lower region.
  • FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment.
  • the HUD device 20 includes a display 21 having a display surface 21a for displaying the image M, and a relay optical system 25.
  • the display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24.
  • The display surface 21a is the surface on the viewing side of the liquid crystal display panel 22 and emits the display light 40 of the image M. The angle of the display area 100 (including the tilt angle θt) can be set by setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40 traveling from the center of the display surface 21a toward the eye box 200 (the center 205 of the eye box 200) via the relay optical system 25 and the projected part.
  • The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (light traveling from the display 21 toward the eye box 200) and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20.
  • the relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.
  • the first mirror 26 has, for example, a free curved surface shape having positive optical power.
  • The first mirror 26 may have a curved surface shape whose optical power differs for each region; that is, the optical power applied to the display light 40 may differ according to the region (optical path) through which the display light 40 passes.
  • Specifically, the optical power applied by the relay optical system 25 may differ among the first image light 41, the second image light 42, and the third image light 43 (see FIG. 2) traveling from the respective regions of the display surface 21a toward the eye box 200.
  • The second mirror 27 is, for example, a flat mirror, but is not limited to this and may be a curved surface having optical power. That is, by combining a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment), the relay optical system 25 may apply optical power that differs according to the region (optical path) through which the display light 40 passes.
  • The second mirror 27 may be omitted; that is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 onto the projected part (front windshield) 2.
  • In the present embodiment, the relay optical system 25 includes two mirrors, but the present invention is not limited to this; in addition to or instead of these, it may include one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or combinations thereof.
  • The relay optical system 25 of the present embodiment has, by virtue of its curved surface shape (an example of optical power), a function of setting the distance to the display area 100 and a function of generating a virtual image obtained by enlarging the image displayed on the display surface 21a; in addition to this, it may have a function of suppressing (correcting) distortion of the virtual image that can be caused by the curved shape of the front windshield 2.
  • The relay optical system 25 may be rotatable, with the actuators 28 and 29, controlled by the display control device 30, attached to it.
  • the liquid crystal display panel 22 receives light from the light source unit 24 and emits spatial light-modulated display light 40 toward the relay optical system 25 (second mirror 27).
  • the liquid crystal display panel 22 has, for example, a rectangular shape whose short side is the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V seen from the observer are arranged.
  • the observer visually recognizes the transmitted light of the liquid crystal display panel 22 via the virtual image optical system 90.
  • the virtual image optical system 90 is a combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.
  • the light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).
  • the light source (not shown) is, for example, a plurality of chip-type LEDs, and emits illumination light to a liquid crystal display panel (an example of a spatial light modulation element) 22.
  • the light source unit 24 is composed of, for example, four light sources, and is arranged in a row along the long side of the liquid crystal display panel 22.
  • the light source unit 24 emits illumination light toward the liquid crystal display panel 22 under the control of the display control device 30.
  • the configuration of the light source unit 24 and the arrangement of the light sources are not limited to this.
  • The illumination optical system is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light of the light source unit 24, and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.
  • the display 21 may be a self-luminous display or a projection type display that projects an image on a screen.
  • the display surface 21a is the screen of the projection type display.
  • the display 21 may be attached with an actuator (not shown) including a motor controlled by the display control device 30, and may be able to move and / or rotate the display surface 21a.
  • the relay optical system 25 has two rotation axes (first rotation axis AX1 and second rotation axis AX2) that move the eyebox 200 in the vertical direction (Y-axis direction).
  • Each of the first rotation axis AX1 and the second rotation axis AX2 is set, in the state where the HUD device 20 is attached to the vehicle 1, so as not to be perpendicular to the left-right direction (X-axis direction) of the vehicle 1 (in other words, not parallel to the YZ plane).
  • The angles of the first rotation axis AX1 and the second rotation axis AX2 with respect to the left-right direction (X-axis direction) of the vehicle 1 are set to less than 45 [degree], and more preferably to less than 20 [degree].
  • With rotation of the relay optical system 25 about the first rotation axis AX1, the amount of vertical movement of the display area 100 is relatively small and the amount of vertical movement of the eye box 200 is relatively large; with rotation of the relay optical system 25 about the second rotation axis AX2, the amount of vertical movement of the display area 100 is relatively large and the amount of vertical movement of the eye box 200 is relatively small. That is, comparing the first rotation axis AX1 and the second rotation axis AX2, the ratio "amount of vertical movement of the eye box 200 / amount of vertical movement of the display area 100" due to rotation about the first rotation axis AX1 is larger. In other words, the relative relationship between the amount of vertical movement of the display area 100 and the amount of vertical movement of the eye box 200 due to rotation of the relay optical system 25 differs between the first rotation axis AX1 and the second rotation axis AX2.
  • the HUD device 20 includes a first actuator 28 that rotates the first mirror 26 on the first rotation axis AX1 and a second actuator 29 that rotates the first mirror 26 on the second rotation axis AX2.
  • the HUD device 20 rotates one relay optical system 25 on two axes (first rotation axis AX1 and second rotation axis AX2).
  • the first actuator 28 and the second actuator 29 may be composed of one integrated two-axis actuator.
  • The HUD device 20 in another embodiment rotates two optical members of the relay optical system 25 about the two axes (the first rotation axis AX1 and the second rotation axis AX2); for example, it may include a first actuator 28 that rotates the first mirror 26 about the first rotation axis AX1 and a second actuator 29 that rotates the second mirror 27 about the second rotation axis AX2.
  • the HUD device 20 in another embodiment does not have to drive the relay optical system 25.
  • That is, the HUD device 20 may not have an actuator that rotates and/or moves the relay optical system 25.
  • The HUD device 20 of this embodiment may include a wide eye box 200 that covers the range of driver eye heights expected for the vehicle 1.
  • Based on control by the display control device 30 described later, the image display unit (head-up display device) 20 displays images near real objects 300 existing in the foreground, which is the real space (actual scene) visually recognized through the front windshield 2 of the vehicle 1, such as the road surface of the traveling lane 310 (see FIG. 1), branch roads, road signs, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), and features (buildings, bridges, etc.), thereby allowing visual augmented reality (AR) to be perceived by the observer (typically an observer seated in the driver's seat of the vehicle 1).
  • In the description of this embodiment, an image whose display position can be changed according to the position of a real object existing in the actual scene is defined as an AR image, and an image whose display position is set regardless of the position of a real object is defined as a non-AR image.
  • FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments.
  • the display control device 30 includes one or more I / O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37.
  • the various functional blocks described in FIG. 3 may consist of hardware, software, or a combination of both.
  • FIG. 3 is only one embodiment, and the illustrated components may be combined with a smaller number of components, or there may be additional components.
  • the image processing circuit 35 (for example, a graphic processing unit) may be included in one or more processors 33.
  • The processor 33 and the image processing circuit 35 are operably connected to the memory 37. More specifically, the processor 33 and the image processing circuit 35 execute programs stored in the memory 37 to, for example, generate and/or transmit image data, and can thereby perform the operations of the vehicle display system 10 (image display unit 20).
  • The processor 33 and/or the image processing circuit 35 may include at least one general-purpose microprocessor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • the memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD and DVD, any type of semiconductor memory such as a volatile memory, and a non-volatile memory.
  • the volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
  • the processor 33 is operably connected to the I / O interface 31.
  • The I/O interface 31 communicates (also referred to as CAN communication) with, for example, the vehicle ECU 401 described later and other electronic devices provided in the vehicle (reference numerals 403 to 419 described later) in accordance with the CAN (Controller Area Network) standard. The communication standard adopted by the I/O interface 31 is not limited to CAN; it includes wired communication interfaces such as CANFD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, and USB, as well as in-vehicle communication (internal communication) interfaces, that is, short-range wireless communication interfaces with a range of several tens of meters, such as a personal area network (PAN) using Bluetooth (registered trademark) and a local area network (LAN) using 802.1x Wi-Fi (registered trademark).
  • The I/O interface 31 may also include an external communication interface to a wide-area communication network (for example, an Internet communication network) in accordance with a wireless wide area network (WWAN), IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access), IEEE 802.16e-based (Mobile WiMAX), 4G, 4G-LTE, LTE Advanced, or a cellular communication standard such as 5G.
  • The processor 33 is operably connected to the I/O interface 31 and can thereby exchange information with the various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31).
  • For example, the vehicle ECU 401, the road information database 403, the own-vehicle position detection unit 405, the vehicle exterior sensor 407, the operation detection unit 409, the eye position detection unit 411, the IMU 413, the line-of-sight direction detection unit 415, the mobile information terminal 417, the external communication device 419, and the like are operably connected to the I/O interface 31.
  • the I / O interface 31 may include a function of processing (converting, calculating, analyzing) information received from another electronic device or the like connected to the vehicle display system 10.
  • the display 21 is operably connected to the processor 33 and the image processing circuit 35. Therefore, the image displayed by the image display unit 20 may be based on the image data received from the processor 33 and / or the image processing circuit 35.
  • the processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I / O interface 31.
  • The vehicle ECU 401 acquires from sensors and switches provided on the vehicle 1 the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine rotation speed, motor rotation speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitching angle), and vehicle vibration (including the magnitude, frequency, and/or cycle of the vibration)), and collects and manages (and may also control) the state of the vehicle 1; as part of its functions, it can output numerical values of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30.
  • The vehicle ECU 401 may simply transmit numerical values detected by the sensors and the like to the processor 33 (for example, a pitching angle of 3 [degree] in the forward-tilt direction), or instead may transmit to the processor 33 determination results based on one or more states of the vehicle 1 including the detected numerical values (for example, that the vehicle 1 satisfies a predetermined condition for a forward-leaning state) and/or analysis results (for example, combined with information on the brake pedal opening, that braking has caused the vehicle to lean forward).
  • For example, the vehicle ECU 401 may output to the display control device 30 a signal indicating a determination result that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401.
  • The I/O interface 31 may acquire the above-mentioned information directly from the sensors and switches provided in the vehicle 1 without going through the vehicle ECU 401.
  • The vehicle ECU 401 may output to the display control device 30 an instruction signal indicating an image to be displayed by the vehicle display system 10; at this time, the coordinates, size, type, and display mode of the image, the degree of necessity of notification of the image, and/or necessity-related information serving as the basis for determining the necessity of notification may be added to the instruction signal and transmitted.
  • The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1 or in an external server connected to the vehicle 1 via the external communication interface (I/O interface 31). Based on the position of the vehicle 1 acquired from the own-vehicle position detection unit 405 described later, it may read out road information around the vehicle 1 (lanes, white lines, stop lines, crosswalks, road widths, number of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (buildings, bridges, rivers, etc.), including their presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information, and transmit it to the processor 33. Further, the road information database 403 may calculate an appropriate route (navigation information) from the departure point to the destination and output a signal indicating the navigation information, or image data indicating the route, to the processor 33.
  • The own-vehicle position detection unit 405 is a GNSS (Global Navigation Satellite System) receiver or the like provided in the vehicle 1; it detects the current position and orientation of the vehicle 1 and outputs a signal indicating the detection result to the road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419, either via the processor 33 or directly.
  • The road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the own-vehicle position detection unit 405 continuously, intermittently, or at predetermined events, and may select and generate information around the vehicle 1 and output it to the processor 33.
  • the vehicle exterior sensor 407 detects real objects existing around the vehicle 1 (front, side, and rear).
  • The real objects detected by the vehicle exterior sensor 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), the road surface of the traveling lane described later, lane markings, roadside objects, and/or features (buildings, etc.).
  • The vehicle exterior sensor 407 is composed of, for example, one or more detection units consisting of a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, a camera, or a combination thereof, and a processing device that processes (performs data fusion on) the detection data from the one or more detection units.
  • One or more vehicle exterior sensors 407 detect real objects in front of the vehicle 1 in each detection cycle of each sensor and output real object information (an example of real object-related information), such as the presence or absence of a real object and, if a real object exists, information such as the position, size, and/or type of each real object, to the processor 33. These pieces of real object information may also be transmitted to the processor 33 via another device (for example, the vehicle ECU 401).
  • When a camera is used, an infrared camera or a near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night; in addition, a stereo camera capable of acquiring distance and the like from parallax is desirable.
  • The operation detection unit 409 is, for example, a CID (Center Information Display) of the vehicle 1, hardware switches provided on the instrument panel, software switches combining an image and a touch sensor, or the like, and outputs to the processor 33 operation information based on operations by an occupant. For example, the operation detection unit 409 outputs to the processor 33 display area setting information based on an operation for moving the display area 100, eye box setting information based on an operation for moving the eye box 200, and information based on an operation for setting the observer's eye position 700, each performed by the user.
  • the eye position detection unit 411 may include a camera such as an infrared camera that detects the eye position 700 of the observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • The processor 33 may acquire a captured image (an example of information from which the eye position 700 can be estimated) from the eye position detection unit 411 and analyze the captured image by a technique such as pattern matching to detect the coordinates of the observer's eye position 700; alternatively, the eye position detection unit 411 may detect the coordinates and output a signal indicating the detected coordinates of the eye position 700 to the processor 33.
  • The eye position detection unit 411 may analyze the image captured by the camera and output to the processor 33 a signal indicating the analysis result (for example, to which of the spatial regions corresponding to a plurality of preset display parameters 600 (described later) the observer's eye position 700 belongs).
  • The method of acquiring the eye position 700 of the observer of the vehicle 1, or information from which the eye position 700 can be estimated, is not limited to these; the information may be acquired using known eye position detection (estimation) techniques.
  • The eye position detection unit 411 may detect the moving speed and/or moving direction of the observer's eye position 700 and output a signal indicating the moving speed and/or moving direction of the observer's eye position 700 to the processor 33.
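  • For illustration only, a minimal Python sketch of how movement speed and direction might be derived from two successive observation positions (the function and parameter names are hypothetical; the publication does not specify an implementation):

```python
import math

def eye_motion(prev_pos, curr_pos, dt):
    """Movement speed and direction of the eye position 700 from two
    successive observation positions (x, y, z tuples; dt in seconds)."""
    if dt <= 0:
        return None  # observations must be time-ordered
    delta = [c - p for c, p in zip(curr_pos, prev_pos)]
    dist = math.sqrt(sum(d * d for d in delta))
    speed = dist / dt  # e.g. mm/s if positions are in mm
    direction = tuple(d / dist for d in delta) if dist > 0 else (0.0, 0.0, 0.0)
    return speed, direction
```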
  • The eye position detection unit 411 may determine that a predetermined condition is satisfied, and output a signal indicating that state to the processor 33, when it detects (10) a signal indicating that the observer's eye position 700 is outside the eye box 200, (20) a signal from which the observer's eye position 700 is estimated to be outside the eye box 200, or (30) a signal from which the observer's eye position 700 is predicted to be outside the eye box 200.
  • The signal from which the observer's eye position 700 is estimated to be outside the eye box 200 includes (21) a signal indicating that the observer's eye position 700 cannot be detected, (22) a signal indicating that the observer's eye position 700 cannot be detected after movement of the observer's eye position 700 was detected, and/or (23) a signal indicating that either the observer's right eye position 700R or left eye position 700L is near the boundary 200A of the eye box 200 ("near" meaning, for example, within predetermined coordinates from the boundary 200A).
  • The signal from which the observer's eye position 700 is predicted to be outside the eye box 200 includes, for example, (31) a signal indicating that the newly detected eye position 700 has moved, relative to the previously detected eye position 700, by at least the eye-position movement distance threshold stored in advance in the memory 37.
  • The IMU 413 can include one or more sensors (for example, accelerometers and gyroscopes), or combinations thereof, configured to detect the position and orientation of the vehicle 1, and changes therein (change speed, change acceleration), based on inertial acceleration.
  • The IMU 413 may output to the processor 33 the detected values (including the position and orientation of the vehicle 1 and signals indicating changes therein (change speed, change acceleration)) and the results of analyzing the detected values.
  • The analysis result is, for example, a signal indicating a determination of whether or not the detected values satisfy a predetermined condition; for example, it may be a signal indicating, from values relating to changes (change speed, change acceleration) in the position or orientation of the vehicle 1, that the behavior (vibration) of the vehicle 1 is small.
  • the line-of-sight direction detection unit 415 includes an infrared camera or a visible light camera that captures the face of an observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • The processor 33 can acquire a captured image (an example of information from which the line-of-sight direction can be estimated) from the line-of-sight direction detection unit 415 and identify the observer's line-of-sight direction (and/or gaze position) by analyzing the captured image.
  • the line-of-sight direction detection unit 415 may analyze the captured image from the camera and output a signal indicating the line-of-sight direction (and / or the gaze position) of the observer, which is the analysis result, to the processor 33.
  • The method of acquiring information from which the line-of-sight direction of the observer of the vehicle 1 can be estimated is not limited to these; it may be acquired using other known line-of-sight detection (estimation) techniques such as the EOG (electro-oculogram) method, the corneal reflex method, the scleral reflex method, the Purkinje image detection method, the search coil method, or the infrared fundus camera method.
  • the mobile information terminal 417 is a smartphone, a laptop computer, a smart watch, or other information device that can be carried by an observer (or another occupant of vehicle 1).
  • The I/O interface 31 can communicate with the mobile information terminal 417 by pairing with it, and acquires data recorded in the mobile information terminal 417 (or in a server accessible through the mobile information terminal).
  • The mobile information terminal 417 may, for example, have the same functions as the road information database 403 and the own-vehicle position detection unit 405 described above, acquire the road information (an example of real object-related information), and transmit it to the processor 33.
  • The mobile information terminal 417 may also acquire commercial information (an example of real object-related information) related to commercial facilities in the vicinity of the vehicle 1 and transmit it to the processor 33.
  • The mobile information terminal 417 may transmit schedule information of its owner (for example, the observer), incoming call information, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.
  • The external communication device 419 is a communication device that exchanges information with the vehicle 1: for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a mobile information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P: Vehicle To Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I: Vehicle To Infrastructure); in a broad sense, it includes everything connected to the vehicle 1 by communication (V2X: Vehicle To Everything).
  • The external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (preceding vehicles, etc.), road surfaces, lane markings, roadside objects, and/or features (buildings, etc.) and output them to the processor 33. The external communication device 419 may also have the same function as the own-vehicle position detection unit 405 described above, acquiring the position information of the vehicle 1 and transmitting it to the processor 33; it may further have the function of the road information database 403 described above, acquiring the road information (an example of real object-related information) and transmitting it to the processor 33. The information acquired from the external communication device 419 is not limited to the above.
  • The software components stored in the memory 37 include the eye position detection module 502, the eye position estimation module 504, the eye position prediction module 506, the eye position state determination module 508, the display parameter setting module 510, the graphic module 512, the light source drive module 514, and the actuator drive module 516.
  • FIGS. 4A and 4B are flow charts showing a method S100 for executing an operation of setting display parameters based on the eye position of an observer according to some embodiments.
  • Method S100 is executed by an image display unit 20 including a display and a display control device 30 that controls the image display unit 20. Some actions in method S100 are optionally combined, some steps are optionally modified, and some actions are optionally omitted.
  • In particular, the method S100 makes it difficult for the observer to visually recognize a low-display-quality image (virtual image) when the observer's eye position, or the detection of the eye position, is in an unusual specific state (abnormality).
  • the eye position detection module 502 of FIG. 3 detects the eye position 700 of the observer (S110).
  • Specifically, the eye position detection module 502 detects coordinates indicating the height of the observer's eyes (the position in the Y-axis direction, an example of a signal indicating the eye position 700), coordinates indicating the height and depth position of the observer's eyes (positions in the Y-axis and Z-axis directions, an example of a signal indicating the eye position 700), and/or coordinates indicating the observer's eye position 700 (positions in the X-, Y-, and Z-axis directions, an example of a signal indicating the eye position 700).
  • The eye position 700 detected by the eye position detection module 502 includes the right eye position 700R and the left eye position 700L, one predetermined position of the right eye position 700R and the left eye position 700L, whichever of the right eye position 700R and the left eye position 700L can be detected (is easy to detect), or a position calculated from the right eye position 700R and the left eye position 700L (for example, the midpoint between the right eye position and the left eye position). For example, the eye position detection module 502 determines the eye position 700 based on the observation position acquired from the eye position detection unit 411 immediately before the timing of updating the display setting.
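  • A minimal sketch of one such determination, choosing the midpoint when both eyes are observed and otherwise the single detectable eye (the function name and tuple representation are assumptions for illustration):

```python
def resolve_eye_position(right_700R, left_700L):
    """Determine the eye position 700 from the right/left eye observations:
    the midpoint when both eyes are detected, otherwise whichever single
    eye position could be detected (None if neither)."""
    if right_700R is not None and left_700L is not None:
        return tuple((r + l) / 2 for r, l in zip(right_700R, left_700L))
    return right_700R if right_700R is not None else left_700L
```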
  • The eye position detection module 502 may also detect the movement direction and/or movement speed of the observer's eye position 700 based on a plurality of eye observation positions acquired from the eye position detection unit 411, and output a signal indicating the movement direction and/or movement speed of the observer's eye position 700 to the processor 33.
  • the eye position estimation module 504 acquires information capable of estimating the eye position (S114).
  • Information from which the eye position can be estimated is, for example, a captured image acquired from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the sitting height, a plurality of eye observation positions, or the like.
  • the eye position estimation module 504 estimates the eye position 700 of the observer of the vehicle 1 from the information capable of estimating the eye position.
  • The eye position estimation module 504 includes various software components for performing various operations related to estimating the observer's eye position 700, such as estimating the observer's eye position 700 based on the captured image acquired from the eye position detection unit 411, the position of the driver's seat of the vehicle 1, the position of the observer's face, the sitting height, a plurality of eye observation positions, and the like. That is, the eye position estimation module 504 may include table data, arithmetic expressions, and the like for estimating the observer's eye position 700 from information from which the eye position can be estimated.
  • The eye position detection module 502 may calculate the eye position 700 based on the eye observation position acquired from the eye position detection unit 411 immediately before the timing of updating the display parameter and one or more eye observation positions acquired in the past, for example by a method such as a weighted average.
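  • A minimal sketch of the weighted-average calculation mentioned above; the number of observations used and the weights are illustrative assumptions, not values from the publication:

```python
def smoothed_eye_position(observations, weights=(0.5, 0.3, 0.2)):
    """Weighted average of the most recent eye observation positions
    (3-tuples of x, y, z). observations[0] is the newest; older
    observations receive smaller weights."""
    used = observations[:len(weights)]
    w = weights[:len(used)]
    total = sum(w)
    return tuple(
        sum(wi * pos[axis] for wi, pos in zip(w, used)) / total
        for axis in range(3)
    )
```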
  • the eye position prediction module 506 acquires information that can predict the eye position 700 of the observer (S116).
  • the information that can predict the eye position 700 of the observer is, for example, the latest observation position acquired from the eye position detection unit 411, or one or more observation positions acquired in the past.
  • the eye position prediction module 506 includes various software components for performing various actions related to predicting the eye position 700 based on predictable information on the observer's eye position 700. Specifically, for example, the eye position prediction module 506 predicts the observer's eye position 700 at the timing when the image to which the new display setting is applied is visually recognized by the observer.
  • The eye position prediction module 506 may predict the next value using one or more past observation positions with a prediction algorithm such as the least squares method, a Kalman filter, an α-β filter, or a particle filter.
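  • As one example of the prediction algorithms named above, a one-axis α-β filter sketch; the gains and update period are conventional textbook choices, not taken from the publication:

```python
def alpha_beta_predict(observations, dt, alpha=0.85, beta=0.005):
    """One-axis alpha-beta filter over past observations (oldest first);
    returns the position extrapolated one update period dt ahead."""
    x, v = observations[0], 0.0            # state: position, velocity
    for z in observations[1:]:
        x_pred = x + v * dt                # predict forward one period
        r = z - x_pred                     # innovation (residual)
        x = x_pred + alpha * r             # correct position estimate
        v = v + (beta / dt) * r            # correct velocity estimate
    return x + v * dt                      # predicted next observation
```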
  • the eye position state determination module 508 determines whether the observer's eye position 700 is in a specific state (S130).
  • For example, the eye position state determination module 508 (10) determines whether or not the observer's eye position 700 can be detected, and determines that the specific state is present if the eye position 700 cannot be detected; (20) determines whether or not the observer's eye position 700 is outside the eye box 200, and determines that the specific state is present if it is outside the eye box 200; (30) determines whether or not the observer's eye position 700 can be estimated to be outside the eye box 200, and determines that the specific state is present if it can be so estimated; and (40) determines whether or not the observer's eye position 700 is predicted to be outside the eye box 200, and determines that the specific state is present if it is so predicted.
  • the eye position state determination module 508 may include table data, an arithmetic expression, and the like for determining whether or not the eye position is in a specific state from the detection information, estimation information, or prediction information of the eye position 700.
  • The method of determining whether or not the observer's eye position 700 can be detected includes determining that the eye position 700 cannot be detected (the observer's eye position 700 is in the specific state) when (1) a part (for example, a predetermined number or more) or all of the observer's eye observation positions acquired from the eye position detection unit 411 within a predetermined period cannot be acquired, (2) the eye position detection module 502 cannot detect the observer's eye position 700 in normal operation, (3) the eye position estimation module 504 cannot estimate the observer's eye position 700 in normal operation, (4) the eye position prediction module 506 cannot predict the observer's eye position 700 in normal operation, or a combination thereof occurs (the determination method is not limited to these).
  • The method of determining whether or not the observer's eye position 700 is outside the eye box 200 includes determining that the observer's eye position 700 is outside the eye box 200 (the observer's eye position 700 is in the specific state) when (1) a part (for example, a predetermined number or more) or all of the observer's eye observation positions acquired from the eye position detection unit 411 within a predetermined period are acquired outside the eye box 200, (2) the eye position detection module 502 detects the observer's eye position 700 outside the eye box 200, or a combination thereof occurs (the determination method is not limited to these).
  • The method of determining whether or not the observer's eye position 700 can be estimated to be outside the eye box 200 includes determining that the observer's eye position 700 can be estimated to be outside the eye box 200 (the observer's eye position 700 is in the specific state) when (1) the observer's eye position 700 cannot be detected after movement of the observer's eye position 700 was detected by the eye position detection unit 411, (2) the eye position detection module 502 detects the observer's eye position 700 near the boundary 200A of the eye box 200, (3) the eye position detection module 502 detects either the right eye position 700R or the left eye position 700L near the boundary 200A of the eye box 200, or a combination thereof occurs (the determination method is not limited to these).
  • The method of determining whether or not the observer's eye position 700 is predicted to be outside the eye box 200 includes determining that the observer's eye position 700 can be predicted to be outside the eye box 200 (the observer's eye position 700 is in the specific state) when (1) the eye position prediction module 506 predicts that the observer's eye position 700 will be outside the eye box 200 after a predetermined time, (2) the eye position 700 newly detected by the eye position detection module 502 has moved, relative to the previously detected eye position 700, by at least the eye-position movement distance threshold stored in advance in the memory 37 (that is, the movement speed of the eye position 700 is at least the eye-position movement speed threshold stored in advance in the memory 37), or a combination thereof occurs (the determination method is not limited to these).
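  • A hedged sketch combining the determinations above into a single specific-state test; the axis-aligned eye box model and every threshold are assumptions for illustration only:

```python
def is_specific_state(pos, eyebox, missed, speed,
                      miss_limit=3, margin_mm=10.0, speed_limit=400.0):
    """Specific-state test: the eye position 700 cannot be detected,
    is outside the eye box 200, is near the boundary 200A (estimated to
    be outside), or is moving fast enough to be predicted outside.
    `eyebox` is ((x_lo, x_hi), (y_lo, y_hi), (z_lo, z_hi)) in mm."""
    if pos is None or missed >= miss_limit:
        return True                      # cannot be detected
    if any(not (lo <= c <= hi) for c, (lo, hi) in zip(pos, eyebox)):
        return True                      # outside the eye box 200
    if any(c - lo < margin_mm or hi - c < margin_mm
           for c, (lo, hi) in zip(pos, eyebox)):
        return True                      # near the boundary 200A
    return speed is not None and speed >= speed_limit  # predicted outside
```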
  • When the eye position state determination module 508 determines that the state is not the specific state, the display parameter setting module 510 determines to which of a plurality of predetermined coordinate ranges (for example, regions partitioned so that the plurality of coordinate ranges correspond to the plurality of display parameters 600 of FIG. 5A, 5B, or 5C) the eye position 700 detected by the eye position detection module 502, estimated by the eye position estimation module 504, or predicted by the eye position prediction module 506 belongs, and sets the display parameter 600 corresponding to the eye position 700.
  • The display parameter setting module 510 includes various software components for performing various operations related to determining to which of a plurality of spatial regions the X-axis coordinate, the Y-axis coordinate, the Z-axis coordinate, or a combination thereof indicated by the eye position 700 belongs, and setting the display parameter 600 corresponding to the spatial region to which the eye position 700 belongs. That is, the display parameter setting module 510 may include table data, arithmetic expressions, and the like for specifying the display parameter 600 from the X-axis coordinate, the Y-axis coordinate, the Z-axis coordinate, or a combination thereof indicated by the eye position 700.
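  • A minimal sketch of the table-based lookup described above, modeling each spatial region as an axis-aligned coordinate range; the region layout and parameter ids are hypothetical stand-ins for the regions of FIGS. 5A to 5C:

```python
def select_display_parameter(pos, regions):
    """Return the id of the display parameter 600 whose spatial region
    contains the eye position 700. `regions` maps a parameter id to an
    axis-aligned box ((x_lo, x_hi), (y_lo, y_hi), (z_lo, z_hi))."""
    for param_id, box in regions.items():
        if all(lo <= c <= hi for c, (lo, hi) in zip(pos, box)):
            return param_id
    return None  # the eye position lies outside every registered region

# Example: three regions stacked along the Y axis inside the eye box.
regions = {
    "param_low":  ((-60, 60), (-45, -15), (-50, 50)),
    "param_mid":  ((-60, 60), (-15,  15), (-50, 50)),
    "param_high": ((-60, 60), ( 15,  45), (-50, 50)),
}
assert select_display_parameter((0, 0, 0), regions) == "param_mid"
```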
  • The types of display parameters 600 include: (1) parameters for arranging the image so that, as viewed from the eye position 700, it has a predetermined positional relationship with a real object located outside the vehicle 1 (parameters for controlling the display 21 to control the arrangement of the image, parameters for controlling an actuator, or a combination thereof); (2) parameters for controlling the display 21 so as to distort the image displayed on the display 21 in advance, also called warping parameters; (3) parameters for directional display in which the light of the image is not directed (or is weakened) toward positions other than the eye position 700 (parameters for controlling the display 21, parameters for controlling the light source of the display 21, parameters for controlling an actuator, or a combination thereof); and (4) parameters for visually recognizing a desired stereoscopic image as viewed from the eye position 700 (parameters for controlling the display 21, parameters for controlling an actuator, or a combination thereof).
  • The display parameter 600 set (selected) by the display parameter setting module 510 may be any parameter that is preferably changed according to the observer's eye position 700, and the type of the display parameter 600 is not limited to these.
  • the display parameter setting module 510 selects one or a plurality of display parameters 600 corresponding to the eye position 700 for one type of display parameter.
  • For example, the display parameter setting module 510 can select two display parameters 600: the display parameter 600 corresponding to the right eye position 700R and the display parameter 600 corresponding to the left eye position 700L.
  • Further, when the eye position 700 is, for example, one predetermined position of the right eye position 700R and the left eye position 700L, whichever of the right eye position 700R and the left eye position 700L can be detected (is easy to detect), or a position calculated from the right eye position 700R and the left eye position 700L, the display parameter setting module 510 can select one display parameter 600 corresponding to the eye position 700.
  • the display parameter setting module 510 may also select the display parameter 600 corresponding to the eye position 700 and the display parameter 600 set around the eye position 700. That is, the display parameter setting module 510 may select three or more display parameters 600 corresponding to the eye positions 700.
  • When the eye position state determination module 508 determines that the specific state is present, the display parameter setting module 510 selects the display parameter 600 set near the boundary of the eye box 200 (including the boundary display parameter 610E) (block S154).
  • When the eye position state determination module 508 determines that the specific state is present, the display parameter setting module 510 selects the display parameter 600 (including the boundary display parameter 610E) set closer to the boundary of the eye box 200 than the latest eye position 700 (block S156). This includes selecting the boundary display parameter 610E set nearest to the latest eye position 700, or selecting the display parameter 600 (including the boundary display parameter 610E) set a predetermined distance from the latest eye position 700 toward the boundary of the eye box 200.
  • When the eye position state determination module 508 determines that the specific state is present, the display parameter setting module 510 selects, based on the latest eye position 700 and the movement direction of the eye position, the boundary display parameter 610E set at a position along the movement direction of the eye position from the latest eye position 700 (an example of block S158). This includes selecting the boundary display parameter 610E set in the movement direction of the eye position from the latest eye position 700, or selecting the display parameter 600 (including the boundary display parameter 610E) set a predetermined distance from the latest eye position 700, in the movement direction of the eye position, toward the boundary of the eye box 200.
  • when the eye position state determination module 508 determines that the specific state is present, the display parameter setting module 510 may select, based on the latest eye position 700 and the latest moving direction of the eye position 700, a display parameter 600 between the display parameter 600 corresponding to the latest eye position 700 and the boundary display parameter 610E (an example of block S158). That is, the display parameter setting module 510 can select a display parameter 600 that is not itself the boundary display parameter 610E but is closer to the boundary display parameter 610E, along the moving direction of the eye position 700, than the display parameter 600 corresponding to the latest eye position 700.
  • when the eye position state determination module 508 determines that the specific state is present, the display parameter setting module 510 may select, based on the latest eye position 700, the latest moving direction of the eye position 700, and the moving speed of the eye position 700, one of the display parameters 600 ranging from the display parameter 600 corresponding to the latest eye position 700 to the boundary display parameter 610E along the moving direction of the eye position 700, according to the moving speed (an example of block S160). That is, if the moving speed is faster than a predetermined eye position moving speed threshold value, it can be estimated that the observer's eye position 700 may be located outside the eyebox 200 or near its boundary, so the display parameter setting module 510 selects the boundary display parameter 610E (or a display parameter 600 close to the boundary display parameter 610E); if the moving speed is slower than the predetermined eye position moving speed threshold value, it can be estimated that the observer's eye position 700 is unlikely to be located outside the eyebox 200, so the display parameter setting module 510 can select, from among the display parameters 600 ranging from the display parameter 600 corresponding to the latest eye position 700 to the boundary display parameter 610E along the moving direction of the eye position 700, a display parameter 600 set near the latest eye position 700 (see the illustrative sketch below).
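  • As a purely illustrative sketch (not part of the disclosure), the speed-dependent choice of blocks S158/S160 could look as follows; the function name, argument names, and threshold are hypothetical:

    def select_on_specific_state(params_toward_boundary, speed, speed_threshold):
        """Illustrative selection when the specific state is determined.

        params_toward_boundary: display parameters ordered from the one
        corresponding to the latest eye position 700 (index 0) up to the
        boundary display parameter 610E (last index) along the moving
        direction of the eye position.
        """
        if speed > speed_threshold:
            # The eye may already be near or beyond the eyebox boundary:
            # jump straight to the boundary display parameter 610E.
            return params_toward_boundary[-1]
        # Slow movement: stay with a parameter set near the latest eye position.
        return params_toward_boundary[0]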
  • the display parameter setting module 510 reselects the display parameter 600 when it is determined by the eye position state determination module 508 that the specific state has been released (block S170).
  • when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 selects, in the display parameter update cycle after the release of the specific state, the display parameter 600 corresponding to the latest eye position 700 after the release (block S172).
  • alternatively, when the eye position state determination module 508 determines that the specific state has been released, the display parameter setting module 510 maintains the display parameters used up to that point during the display parameter update cycle immediately after the release of the specific state, and selects the display parameter 600 corresponding to the latest eye position 700 from the next display parameter update cycle onward (block S174); a sketch contrasting the two options follows.
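  • As a purely illustrative sketch (not part of the disclosure), blocks S172 and S174 can be contrasted as follows; the function and argument names are hypothetical:

    def reselect_after_release(cycles_since_release, latest_param, held_param,
                               hold_one_cycle):
        """Illustrative reselection after the specific state is released.

        hold_one_cycle=False corresponds to block S172 (follow the latest
        eye position 700 immediately); hold_one_cycle=True corresponds to
        block S174 (keep the current display parameter for one update
        cycle, then follow the latest eye position 700).
        """
        if hold_one_cycle and cycles_since_release == 0:
            return held_param
        return latest_param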
  • FIGS. 5A, 5B, and 5C are diagrams that virtually show the arrangement of a plurality of display parameters 600 associated with the eyebox 200 and the space around it.
  • one display parameter 600 is associated with each area partitioned by two-dimensional coordinates consisting of the X-axis coordinate in the left-right direction and the Y-axis coordinate in the up-down direction when the vehicle 1 faces the forward direction. That is, within the same section, the same one display parameter 600 is applied regardless of the coordinates of the eye position 700.
  • one display parameter 600 may be associated with three-dimensional coordinates including Z-axis coordinates.
  • the plurality of display parameters 600 include a first display parameter 610 associated with the space inside the eyebox 200 where the image can be visually recognized, and a second display parameter 650 associated with the space outside the eyebox 200 (the second display parameter 650 may be omitted).
  • the first display parameter 610 includes the boundary display parameter 610E.
  • the boundary display parameter 610E is a display parameter 600 associated with the space inside the boundary 200A of the eyebox 200 (on the side of the center 205 of the eyebox 200).
  • since the display parameter 612 of FIG. 5A is arranged at the left end (inside the boundary) of the eyebox 200, it is classified as a boundary display parameter 610E.
  • since the display parameter 645 of FIG. 5B is arranged at the right end (inside the boundary) of the eyebox 200, it is classified as a boundary display parameter 610E.
  • since the display parameter 642 of FIG. 5B is arranged at the upper end (inside the boundary) of the eyebox 200, it is classified as a boundary display parameter 610E. That is, the boundary display parameters 610E in FIG. 5A are the display parameters 611 to 616, 620 to 621, 625 to 626, and 630 to 635, and the boundary display parameters 610E in FIGS. 5B and 5C are the display parameters 641 to 645.
  • FIGS. 5A, 5B, and 5C are shown mainly to explain the spatial arrangement corresponding to the display parameters 600, and the spatial area (section) corresponding to the second display parameter 650 is shown conceptually. That is, the spatial area corresponding to the second display parameter 650 may be further expanded outward or reduced inward (toward the eyebox 200).
  • the first display parameter 610 is composed of 25 first display parameters 611 to 635, which are divided into 5 columns in the horizontal direction (X-axis direction) and 5 rows in the vertical direction (Y-axis direction). The columns are numbered the first column, the second column, and so on up to the fifth column from the positive direction (left) of the X-axis, and the rows are numbered from the first row starting at the positive direction (top) of the Y-axis. The first display parameters 611 to 615 are arranged in the order of the first row to the fifth row of the first column, the first display parameters 616 to 620 are arranged in the order of the first to fifth rows of the second column, and so on (see the illustrative sketch below).
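  • As a purely illustrative sketch (not part of the disclosure), the column-major numbering just described can be expressed as follows; the eyebox extents passed in are placeholders, and the X-axis is taken to increase toward the left and the Y-axis toward the top, as in the figures:

    def grid_parameter_id(x, y, x_left, x_right, y_top, y_bottom,
                          cols=5, rows=5, first_id=611):
        """Map an in-eyebox eye position to the numbering of FIG. 5A.

        Column 1 starts on the positive-X (left) side and row 1 at the
        top; identifiers run column-major, so 611..615 fill column 1 and
        616..620 fill column 2.
        """
        u = (x_left - x) / (x_left - x_right)   # 0 at the left edge, 1 at the right
        v = (y_top - y) / (y_top - y_bottom)    # 0 at the top edge, 1 at the bottom
        col = min(int(u * cols), cols - 1)
        row = min(int(v * rows), rows - 1)
        return first_id + col * rows + row

    # A position near the top-left corner maps to 611; one near the
    # bottom-right corner maps to 635.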
  • the lower figure of FIG. 5A corresponds to the upper figure of FIG. 5A, and provides a description of the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
  • the display parameter setting module 510 selects the display parameter 627 corresponding to the area to which the eye position 701 detected by the eye position detection module 502 belongs.
  • when the specific state is determined, the display parameter setting module 510 may select the boundary display parameter 610E nearest to the latest eye position 701 before the determination of the specific state. The nearest boundary display parameter 610E corresponding to the coordinates of a given eye position 701 may be stored in the memory 37 in advance; that is, the nearest boundary display parameter 610E may be stored in advance for the spatial area corresponding to the display parameter 627.
  • the display parameter setting module 510 may select one of the display parameter 626, the display parameter 631, and the display parameter 632, whichever is close to the detailed coordinates of the eye position 701 within the section to which the display parameter 627 corresponds. That is, the display parameter setting module 510 may select the display parameter 626 if the coordinates of the eye position 701 are close to the area corresponding to the display parameter 626, and may select the display parameter 632 if the coordinates of the eye position 701 are close to the area corresponding to the display parameter 632.
  • for the latest eye position 702 before the determination of the specific state, which corresponds to the display parameter 654 (a second display parameter 650), the display parameter setting module 510 selects the display parameter 626, which is the boundary display parameter 610E set in the space closest to the area to which the display parameter 654 corresponds.
  • if the display parameter 631 corresponding to the area to which the latest eye position 703 before the determination of the specific state belongs is itself a boundary display parameter 610E, the display parameter setting module 510 selects the display parameter 631 as it is.
  • based on the latest eye position 704 before the determination of the specific state and the latest moving direction 751 of the eye position before the determination of the specific state (or immediately after the determination of the specific state), the display parameter setting module 510 selects the display parameter 634, which is the boundary display parameter 610E arranged in the moving direction 751 of the eye position from the display parameter 624 corresponding to the area to which the latest eye position 704 belongs.
  • similarly, according to the latest eye position 704 and the moving direction 752 of the eye position, the display parameter setting module 510 selects the display parameter 614, which is the boundary display parameter 610E arranged in the moving direction 752 of the eye position from the display parameter 624 corresponding to the latest eye position 704.
  • based on the latest eye position 705 before the determination of the specific state and the latest moving direction 753 of the eye position before the determination of the specific state (or immediately after the determination of the specific state), the display parameter setting module 510 selects the display parameter 612, which is the boundary display parameter 610E arranged in the moving direction 753 of the eye position from the display parameter 622 corresponding to the latest eye position 705. After that, when the eye position 706 in the specific state moves along the moving direction 754, it is assumed that the specific state no longer holds, and the latest eye position after the specific state is released is indicated by reference numeral 707.
  • the display parameter setting module 510 then selects the display parameter 611 corresponding to the area to which the eye position 707 detected by the eye position detection module 502 belongs. That is, to briefly describe the display parameters 600 selected as the eye position 700 moves: when the eye position is not in the specific state, the display parameter setting module 510 selects the display parameter 622 corresponding to the eye position 705; when the specific state is determined, it selects the display parameter 612 (boundary display parameter 610E) corresponding to the eye position 705 and the moving direction 753; and when the specific state is released, it selects the display parameter 611 corresponding to the eye position 707.
  • FIG. 5B provides a description of a modified example of the arrangement of the display parameter 600.
  • in FIG. 5B, the first display parameter 610 is composed of five first display parameters 641 to 645, which are partitioned into five columns in the left-right direction (X-axis direction) and are not partitioned in the vertical direction (Y-axis direction). The columns are numbered the first column, the second column, and so on up to the fifth column from the positive direction (left) of the X-axis, and the first display parameters 641 to 645 are arranged in order from the first column to the fifth column.
  • the arrangement of the display parameters 600 in the eyebox 200 is not limited to a two-dimensional arrangement in which a plurality of determination areas are arranged in the X-axis direction and a plurality of determination areas are also arranged in the Y-axis direction, as shown in FIG. 5A; a one-dimensional arrangement in which a plurality of determination areas are arranged in the X-axis direction but not in the Y-axis direction may also be adopted.
  • FIG. 5C provides a description of a modified example of the arrangement of the display parameter 600.
  • in FIGS. 5A and 5B, the sizes of the spatial areas corresponding to the plurality of display parameters 600 arranged in the eyebox 200 are the same, but they may differ from one another, as shown in FIG. 5C. Further, the shapes of these spatial areas may also differ from one another.
  • the upper figure of FIG. 6 is a diagram that provides an explanation of the arrangement of a plurality of display parameters 600 associated with the eye box 200 and the space around the eye box 200.
  • the second display parameter 650 may also include the second boundary display parameter 650E, as shown in the upper diagram of FIG.
  • the second boundary display parameter 650E is a display parameter 600 associated with a spatial area outside the boundary 200A of the eyebox 200. That is, the boundary display parameter may include a first boundary display parameter 610E corresponding to the region inside the boundary 200A of the eyebox 200 and a second boundary display parameter 650E corresponding to the region outside the boundary 200A.
  • the lower figure of FIG. 6 corresponds to the upper figure of FIG. 6, and provides an explanation of the display parameter 600 selected by the display parameter setting module 510 according to the eye position 700.
  • even if the display parameter 631 corresponding to the area to which the latest eye position 703 before the determination of the specific state belongs is a first boundary display parameter 610E, the display parameter setting module 510 may set a second boundary display parameter 650E corresponding to an area farther from the center 205 of the eyebox 200 than the area corresponding to the already set display parameter 631 (the display parameter 655, the display parameter 666, or a display parameter (not shown) corresponding to the area to the right of the area corresponding to the display parameter 655 and above the area corresponding to the display parameter 666).
  • based on the latest eye position 704 before the determination of the specific state and the latest moving direction 751 of the eye position before the determination of the specific state (or immediately after the determination of the specific state), the display parameter setting module 510 selects the display parameter 634, which is the first boundary display parameter 610E arranged in the moving direction 751 of the eye position from the display parameter 624 corresponding to the area to which the latest eye position 704 belongs, or the display parameter 669, which is the second boundary display parameter 650E arranged in that direction.
  • based on the latest eye position 705 before the determination of the specific state and the latest moving direction 753 of the eye position before the determination of the specific state (or immediately after the determination of the specific state), the display parameter setting module 510 selects the display parameter 612, which is the first boundary display parameter 610E arranged in the moving direction 753 of the eye position from the display parameter 622 corresponding to the latest eye position 705, or the display parameter 662, which is the second boundary display parameter 650E arranged in that direction.
  • the display parameter setting module 510 selects the display parameter 611 to which the eye position 707 detected by the eye position detection module 502 belongs.
  • the adjacent first boundary display parameter 610E and second boundary display parameter 650E may be set to the same display parameter (the same set value). That is, for the display parameter 631, which is a first boundary display parameter 610E in FIG. 6, two or all of the adjacent second boundary display parameters 650E (the display parameter 655, the display parameter 666, and the display parameter (not shown) corresponding to the area to the right of the area corresponding to the display parameter 655 and above the area corresponding to the display parameter 666) may be the same display parameter (the same set value).
  • the arrangement of the spatial areas corresponding to the plurality of display parameters 600 may be changeable. For example, when a predetermined condition is satisfied, the arrangement of the spatial area corresponding to the plurality of display parameters 600 may be changed from the arrangement shown in FIG. 5A to the arrangement shown in FIG. 7.
  • compared with the arrangement shown in FIG. 5A, the arrangement shown in FIG. 7 increases the width in the horizontal direction (X-axis direction) and/or the width in the vertical direction (Y-axis direction) of the spatial region corresponding to the boundary display parameter 610E (650E). That is, since the arrangement shown in FIG. 7 has a wide area corresponding to the boundary display parameter 610E (650E), it becomes easy to maintain a constant display parameter 600 (boundary display parameter 610E (650E)) even if the eye position 700 shifts.
  • when the detected (or estimated) eye position 700 is in the specific state, the display parameter setting module 510 may expand the spatial area covered by the boundary display parameter 610E (650E). That is, when the eye position 700 is in the specific state, the display parameter 600 can be made difficult to switch even if the actual eye position 700 changes near the boundary of the eyebox 200.
  • when the latest eye position 700, determined by the eye position state determination module 508 to be in the specific state and taken before that determination, is at least a predetermined distance from the boundary 200A of the eyebox 200, the display parameter setting module 510 may expand the spatial area corresponding to the boundary display parameter 610E (650E). At this time, the boundary display parameter 610E may be expanded so as to include the region of the latest eye position 700 before the determination of the specific state.
  • the number (density) of spatial regions corresponding to the plurality of display parameters 600 may also be changeable. For example, when a predetermined condition is met, the number of spatial regions corresponding to the plurality of first display parameters 610 in the eyebox 200 may be changed from the 5 shown in FIG. 5B to the 25 shown in FIG. 5A. When the number of spatial regions corresponding to the plurality of display parameters 600 is increased (in other words, when the density is increased), the display parameters 600 are more likely to be switched according to changes in the eye position 700. For example, when the eye position 700 moves at high speed, the display parameter setting module 510 may set the density of the spatial regions corresponding to the plurality of display parameters 600 to be high. That is, the display parameter 600 is switched smoothly based on an eye position 700 moving at high speed, and is difficult to switch based on an eye position 700 moving at low speed (see the illustrative sketch below).
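  • As a purely illustrative sketch (not part of the disclosure), a density switch of this kind could be written as follows; the speed threshold and grid sizes are placeholders, and tying the condition to the moving speed of the eye position 700 is an assumption consistent with the preceding paragraph:

    def choose_grid_density(eye_speed, fast_threshold):
        """Illustrative density selection for the display parameter regions.

        Returns (columns, rows): a finer 5 x 5 grid (FIG. 5A) for a fast
        eye position 700 so parameters switch readily, and a coarser
        5 x 1 grid (FIG. 5B) otherwise.
        """
        if eye_speed > fast_threshold:
            return (5, 5)   # high density: parameters follow the eye closely
        return (5, 1)       # low density: parameters switch less often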
  • as a condition for determining the specific state, the eye position state determination module 508 may require that at least one of a signal indicating the moving direction 750 of the eye position 700 including the left-right direction (X-axis direction) and a signal indicating the moving direction 750 of the eye position 700 including the vertical direction (Y-axis direction) is detected.
  • when the eye position 700 moves in the left-right direction (X-axis direction), the display parameter setting module 510 may execute the process of setting the display parameter 600 corresponding to an area closer to the boundary 200A of the eyebox 200 (an area farther from the center 205 of the eyebox 200) than the area corresponding to the already set display parameter 600, and when the eye position 700 does not move in the left-right direction (X-axis direction), it may not execute this process.
  • as a condition for determining the specific state, the eye position state determination module 508 may at least require that a signal indicating that the behavior (vibration) of the vehicle 1 is small is detected. That is, when the behavior (vibration) of the vehicle 1 is large, the specific state may not be determined.
  • when the eye position 700 cannot be detected but the eye position state determination module 508 does not determine the specific state, the display parameter setting module 510 may maintain the display parameter 600 corresponding to the latest eye position 700. Alternatively, when the eye position 700 cannot be detected but the eye position state determination module 508 does not determine the specific state, the display parameter setting module 510 may set the display parameter 600 corresponding to the center 205 of the eyebox 200 (a sketch of the two options follows).
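  • As a purely illustrative sketch (not part of the disclosure), the two fallback options above could be expressed as follows; the function and argument names are hypothetical:

    def fallback_parameter(eye_detected, specific_state, latest_param,
                           center_param, prefer_center):
        """Illustrative fallback when the eye position 700 is undetected.

        If no specific state is determined, either keep the display
        parameter for the latest eye position 700 (prefer_center=False)
        or use the one for the eyebox center 205 (prefer_center=True).
        """
        if not eye_detected and not specific_state:
            return center_param if prefer_center else latest_param
        return None  # other cases are handled by the normal selection path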
  • the graphic module 512 includes various known software components for performing image processing such as rendering to generate image data and driving the display 21.
  • the graphic module 512 may also include various known software components for changing the type, arrangement (position coordinates, angle), size, display distance (in the case of 3D), and visual effects (e.g., brightness, transparency, saturation, contrast, or other visual characteristics) of the displayed image.
  • the graphic module 512 generates image data so that the observer perceives the image type (one of the display parameters), the image position coordinates (one of the display parameters), the image angle (the pitching angle about the X direction, the yaw angle about the Y direction, the rolling angle about the Z direction, and the like, one of the display parameters), and the image size (one of the display parameters), and can thereby drive the display unit 20.
  • the light source drive module 514 includes various known software components for driving the light source unit 24. The light source drive module 514 can drive the light source unit 24 based on the set display parameter 600.
  • the actuator drive module 516 includes various known software components for driving the first actuator 28 and/or the second actuator 29. The actuator drive module 516 can drive the first actuator 28 and the second actuator 29 based on the set display parameter 600.
  • the operations of the processing described above can be performed by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. These modules, combinations of these modules, and/or combinations with known hardware capable of substituting for their functions are all within the scope of protection of the present invention.
  • the functional blocks of the vehicle display system 10 are optionally executed by hardware, software, or a combination of hardware and software in order to carry out the principles of the various described embodiments. It will be understood by those skilled in the art that the functional blocks described in FIG. 3 may optionally be combined, or one functional block may be separated into two or more sub-blocks, in order to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
  • 10: Vehicle display system; 20: Image display unit (head-up display device); 21: Display; 21a: Display surface; 22: Liquid crystal display panel; 24: Light source unit; 25: Relay optical system; 26: First mirror; 27: Second mirror; 28: First actuator; 29: Second actuator; 30: Display control device; 31: I/O interface; 33: Processor; 35: Image processing circuit; 37: Memory; 40: Display light; 40p: Optical axis; 41: First image light; 42: Second image light; 43: Third image light; 90: Virtual image optical system; 100: Display area; 101: Upper end; 102: Lower end; 200: Eyebox; 200A: Boundary; 205: Center; 401: Vehicle ECU; 403: Road information database; 405: Own vehicle position detection unit; 407: Vehicle exterior sensor; 409: Operation detection unit; 411: Eye position detection unit; 413: IMU; 415: Line-of-sight direction detection unit; 417: Mobile information terminal; 419: External communication device; 502: Eye position detection module; 504: Eye position estimation module; 506: Eye position prediction module; 508: Eye position state determination module; 510: Display parameter setting module; 512: Graphic module; 514: Light source drive module; 516: Actuator drive module

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention prevents a reduction in the display quality of an image. A memory stores a plurality of display parameters 600, each corresponding to a spatial region. A processor sets one or more display parameters 600 based on an eye position, and determines whether information relating to the eye position indicates a specific state. If it is determined that the information relating to the eye position indicates the specific state, the processor sets, based at least on the eye position, a display parameter 600 corresponding to a region that is closer to the boundary of an eyebox 200, or a region that is farther from the center 205 of the eyebox 200, than the region to which an already set display parameter 600 corresponds.
PCT/JP2021/013481 2020-03-31 2021-03-30 Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé WO2021200914A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020063845 2020-03-31
JP2020-063845 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021200914A1 true WO2021200914A1 (fr) 2021-10-07

Family

ID=77928456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/013481 WO2021200914A1 (fr) 2020-03-31 2021-03-30 Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé

Country Status (1)

Country Link
WO (1) WO2021200914A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015215505A (ja) * 2014-05-12 2015-12-03 パナソニックIpマネジメント株式会社 表示装置、および表示方法
US20160109943A1 (en) * 2014-10-21 2016-04-21 Honeywell International Inc. System and method for controlling visibility of a proximity display
JP2016210212A (ja) * 2015-04-30 2016-12-15 株式会社リコー 情報提供装置、情報提供方法及び情報提供用制御プログラム
JP2017171146A (ja) * 2016-03-24 2017-09-28 カルソニックカンセイ株式会社 ヘッドアップディスプレイ装置
JP2019018770A (ja) * 2017-07-20 2019-02-07 アルパイン株式会社 ヘッドアップディスプレイ装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113997786A (zh) * 2021-12-30 2022-02-01 江苏赫奕科技有限公司 一种适用于车辆的仪表界面显示方法和装置
CN113997786B (zh) * 2021-12-30 2022-03-25 江苏赫奕科技有限公司 一种适用于车辆的仪表界面显示方法和装置

Similar Documents

Publication Publication Date Title
EP3888965B1 (fr) Affichage tête haute, système d'affichage de véhicule et procédé d'affichage de véhicule
JP7006235B2 (ja) 表示制御装置、表示制御方法および車両
JP2020032866A (ja) 車両用仮想現実提供装置、方法、及びコンピュータ・プログラム
JP7255608B2 (ja) 表示制御装置、方法、及びコンピュータ・プログラム
JP7459883B2 (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び方法
JP2023077857A (ja) ヘッドアップディスプレイ装置
WO2021200914A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé
WO2022230995A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé de commande d'affichage
WO2023048213A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé de commande d'affichage
WO2020158601A1 (fr) Dispositif, procédé et programme informatique de commande d'affichage
JP2022072954A (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法
JP2020121607A (ja) 表示制御装置、方法、及びコンピュータ・プログラム
JP2020121704A (ja) 表示制御装置、ヘッドアップディスプレイ装置、方法、及びコンピュータ・プログラム
JP2020117105A (ja) 表示制御装置、方法、及びコンピュータ・プログラム
JP2021056358A (ja) ヘッドアップディスプレイ装置
WO2023003045A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé de commande d'affichage
JP2022077138A (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法
WO2023145852A1 (fr) Dispositif de commande d'affichage, système d'affichage et procédé de commande d'affichage
WO2021200913A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage d'image et procédé
WO2023210682A1 (fr) Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé de commande d'affichage
JP2022057051A (ja) 表示制御装置、虚像表示装置
JP2022190724A (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法
JP7434894B2 (ja) 車両用表示装置
JP2022113292A (ja) 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法
JP2020199883A (ja) 表示制御装置、ヘッドアップディスプレイ装置、方法、及びコンピュータ・プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21779616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21779616

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP