WO2021200913A1 - Display control device, image display device, and method - Google Patents

Display control device, image display device, and method

Info

Publication number
WO2021200913A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
distance
display control
approach
Prior art date
Application number
PCT/JP2021/013480
Other languages
English (en)
Japanese (ja)
Inventor
誠 秦
Original Assignee
日本精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社
Publication of WO2021200913A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/22: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R 1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R 1/24: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view in front of the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/10: Intensity circuits
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36: Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present disclosure relates to a display control device, an image display device, and a method, used in a vehicle, for displaying an image that is visually recognized superimposed on the foreground of the vehicle.
  • Patent Document 1 discloses a device that displays an image visually recognized by a viewer (mainly the driver of the vehicle) as a virtual image overlapping with the scenery in front of the vehicle.
  • The display device disclosed in Patent Document 1 controls the relative height of a close-up (approach) image, which notifies the viewer of a real object, according to the distance from the vehicle to that real object.
  • If the technique disclosed in Patent Document 1 is used to display a plurality of the above-mentioned approach images at the same time, the same display control is performed for each approach image according to its distance from the vehicle. There is therefore room for improvement in effectively displaying the close-up images that are expected to have higher priority for the viewer.
  • The outline of this disclosure relates to appropriately directing the observer's visual attention to an image. More specifically, it also relates to providing an image that lets the observer recognize the distance to an object without getting in the observer's way.
  • The display control device described in the present specification controls an image display device that displays an image to the observer as a virtual image overlapping with the scenery in front of the vehicle. It includes one or more I/O interfaces 31 capable of acquiring information, one or more processors 33, and a memory 37 storing a program; the one or more I/O interfaces 31 acquire a distance L to an object in front of the vehicle and vehicle information about the vehicle, which the one or more processors then process.
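  • The distance-dependent size control suggested by FIG. 5 can be sketched as follows. This is a minimal, hypothetical illustration: the breakpoint distances, the ratio range, and the use of linear interpolation are assumptions for illustration, not values or methods taken from the disclosure.

```python
# Hypothetical control data: display ratio of a close-up image as a
# function of the distance L to the object. All numbers are assumptions.

def display_ratio(distance_m: float,
                  near: float = 10.0,       # assumed near breakpoint [m]
                  far: float = 100.0,       # assumed far breakpoint [m]
                  min_ratio: float = 0.2,   # assumed smallest display ratio
                  max_ratio: float = 1.0) -> float:
    """Return the display ratio (relative size) of a close-up image.

    The image is drawn larger as the object gets closer; the ratio is
    clamped outside the [near, far] distance range.
    """
    if distance_m <= near:
        return max_ratio
    if distance_m >= far:
        return min_ratio
    # Linear interpolation between the two breakpoints.
    t = (far - distance_m) / (far - near)
    return min_ratio + t * (max_ratio - min_ratio)
```

A display controller would re-evaluate this ratio every frame as the acquired distance L changes.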
  • FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle.
  • FIG. 2 is a diagram showing a configuration of a head-up display device.
  • FIG. 3 is a block diagram of a vehicle display system.
  • FIG. 4A is a diagram showing an example of a close-up image displayed by an image display device.
  • FIG. 4B is a diagram showing an example of a close-up image displayed by an image display device.
  • FIG. 5 is a diagram explaining control data for changing the size (display ratio) of a close-up image according to the distance.
  • FIG. 6A is a diagram showing a display example of an image before the enlargement processing is executed.
  • FIG. 6B is a diagram showing an example of displaying an image when the enlargement processing is executed, after the situation shown in FIG. 6A.
  • FIG. 6C is a diagram showing a display example of an image after the situation shown in FIG. 6B and after the enlargement processing is executed.
  • FIG. 7A is a diagram illustrating a display transition example of a plurality of close-up images.
  • FIG. 7B is a diagram after the situation of FIG. 7A, for explaining a display transition example of a plurality of close-up images.
  • FIG. 7C is a diagram, after the situation of FIG. 7B, explaining the image enlargement processing.
  • FIG. 8 is a flow chart showing the enlargement processing.
  • FIG. 9A is a diagram for explaining a modification of the enlargement processing.
  • FIG. 9B is a diagram for explaining a modification of the enlargement processing.
  • FIG. 9C is a diagram for explaining a modification of the enlargement processing.
  • FIG. 10A is a diagram for explaining a modification of the reference control data.
  • FIG. 10B is a diagram for explaining a modification of the reference control data.
  • FIG. 10C is a diagram for explaining a modification of the reference control data.
  • FIG. 11 is a flowchart showing the approach process.
  • FIG. 12A is a diagram illustrating an example of display transitions of a plurality of close-up images.
  • FIG. 12B is a diagram after the situation of FIG. 12A and explaining a display transition example of a plurality of close-up images.
  • FIG. 12C is a diagram, following the situation of FIG. 12B, explaining a display transition example of a plurality of close-up images.
  • FIG. 13A is a diagram showing a display example of an image before the approach process is executed.
  • FIG. 13B is a diagram showing an example of displaying an image when the approach process is executed, after the situation shown in FIG. 13A.
  • FIG. 13C is a diagram showing a display example of an image after the situation shown in FIG. 13B and after the approach process is executed.
  • FIG. 14A is a flow chart showing a method of performing the distance perception shortening process and the distance perception extension process according to some embodiments.
  • FIG. 14B is a flow chart following FIG. 14A.
  • FIG. 15 is a diagram for explaining the second and first distance perception shortening processing data.
  • FIG. 16A is a flow chart showing a method, according to some embodiments, of controlling the display of an approach image when other information having a higher notification priority than second information arises while the approach image related to the second information is being displayed.
  • FIG. 16B is a flow chart following FIG. 16A.
  • FIG. 16C is a flow chart following FIG. 16B.
  • In FIG. 17A, the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 17B is after the situation of FIG. 17A; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 17C is after the situation of FIG. 17B; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 17D is after the situation of FIG. 17C; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 17E is after the situation of FIG. 17D; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 17F is after the situation of FIG. 17E; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • In FIG. 18A, the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 18B is after the situation of FIG. 18A; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIG. 18C is after the situation of FIG. 18B; the left figure is a diagram explaining an example of the display transition of the image, and the right figure is a corresponding diagram showing the display ratio of the image according to the distance.
  • FIGS. 1 to 16 provide a description of the configuration and operation of an exemplary vehicle display system.
  • The present invention is not limited to the following embodiments (including the contents of the drawings); changes, including deletion of components, can be made to them. In the following description, descriptions of known technical matters are omitted as appropriate.
  • The image display unit (image display device) 20 in the vehicle display system 10 is a head-up display (HUD) device provided in the dashboard 5 of the vehicle 1. The image display unit 20 emits display light 40 toward the front windshield 2 (an example of the projected part), and the front windshield 2 reflects the display light 40 of the image displayed by the image display unit 20 toward the eye box 200.
  • The HUD device 20 notifies an occupant of the vehicle 1 (typically the driver of the vehicle 1) of not only information about the vehicle 1 (hereinafter referred to as vehicle information) but also information other than vehicle information.
  • Vehicle information includes not only information about the vehicle 1 itself but also information external to the vehicle 1 that is related to the operation of the vehicle 1.
  • FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment.
  • The HUD device 20 includes a display 21 having a display surface 21a for displaying an image, and a relay optical system 25.
  • The display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24.
  • The display surface 21a is the surface of the liquid crystal display panel 22 on the viewing side, and emits the display light 40 of the image. By setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40, which travels from the center of the display surface 21a toward the eye box 200 (the center 205 of the eye box 200) via the relay optical system 25 and the projected part, the angle of the display area 100 (including the tilt angle θt) can be set.
  • The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (the light traveling from the display 21 toward the eye box 200), and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20.
  • The relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.
  • The first mirror 26 has, for example, a free-form curved surface with positive optical power.
  • The first mirror 26 may have a curved surface shape whose optical power differs for each region; that is, the optical power added to the display light 40 may differ according to the region (optical path) through which the display light 40 passes.
  • Specifically, the relay optical system 25 may add different optical power to the first display light 41, the second image light 42, and the third image light 43 (see FIG. 2), which travel from the respective regions of the display surface 21a toward the eye box 200.
  • The second mirror 27 is, for example, a flat mirror, but is not limited to this and may be a curved surface having optical power. That is, the relay optical system 25 may, by combining a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment), add optical power that differs according to the region (optical path) through which the display light 40 passes.
  • The second mirror 27 may be omitted; that is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 onto the projected part (front windshield) 2.
  • In the present embodiment, the relay optical system 25 includes two mirrors, but the present invention is not limited to this; one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or combinations thereof may be added to, or substituted for, these mirrors.
  • The relay optical system 25 of the present embodiment has a function of setting the distance to the display area 100 by its curved surface shape (an example of optical power) and a function of generating a virtual image in which the image displayed on the display surface 21a is enlarged. In addition, it may have a function of suppressing (correcting) distortion of the virtual image that can occur due to the curved shape of the front windshield 2.
  • The relay optical system 25 may be rotatable, with actuators 28 and 29 controlled by the display control device 30 attached to it.
  • The liquid crystal display panel 22 receives light from the light source unit 24 and emits spatially light-modulated display light 40 toward the relay optical system 25 (second mirror 27).
  • The liquid crystal display panel 22 has, for example, a rectangular shape whose short side runs in the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V, as seen by the observer, are arranged.
  • The observer visually recognizes the light transmitted through the liquid crystal display panel 22 via the virtual image optical system 90.
  • The virtual image optical system 90 is the combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.
  • The light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).
  • The light source (not shown) is, for example, a plurality of chip-type LEDs, and emits illumination light toward the liquid crystal display panel 22 (an example of a spatial light modulation element).
  • The light source unit 24 is composed of, for example, four light sources arranged in a row along the long side of the liquid crystal display panel 22.
  • The light source unit 24 emits illumination light toward the liquid crystal display panel 22 under the control of the display control device 30.
  • The configuration of the light source unit 24 and the arrangement of the light sources are not limited to these.
  • The illumination optical system is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light from the light source unit 24, and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.
  • The light source unit 24 may be configured to be locally dimmable, with the degree of illumination of each area of the display surface 21a changed under the control of the display control device 30. As a result, the light source unit 24 can adjust the brightness of the image displayed on the display surface 21a for each area.
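  • The per-area brightness adjustment described above can be sketched as follows. This is one common local-dimming scheme offered as an assumption, not the patent's specified method: each area's backlight duty is set to the peak pixel value needed in that area, and pixel values are rescaled so the perceived brightness (duty × pixel) is preserved.

```python
def dim_areas(pixels_by_area, min_duty=0.05):
    """For each display area, pick a backlight duty equal to the area's
    peak pixel value (clamped to [min_duty, 1.0]) and rescale the pixel
    values so that duty * pixel reproduces the original brightness.

    pixels_by_area: list of lists of pixel values in [0, 1].
    Returns a list of (duty, rescaled_pixels) tuples.
    """
    result = []
    for pixels in pixels_by_area:
        duty = max(min_duty, max(pixels, default=0.0))
        duty = min(duty, 1.0)
        # Compensate the panel so the on-screen brightness is unchanged.
        scaled = [p / duty for p in pixels]
        result.append((duty, scaled))
    return result
```

Dark areas thus run their LEDs near the minimum duty, reducing stray light in the HUD while bright areas keep full output.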
  • The display 21 may instead be a self-luminous display, or a projection display that projects an image onto a screen; in the latter case, the display surface 21a is the screen of the projection display.
  • Based on control by the display control device 30 described later, the image display unit (image display device) 20 displays images at positions overlapping with, or near, real objects existing in the foreground (the real space (actual view) visually recognized through the front windshield 2 of the vehicle 1), such as the road surface of the traveling lane, branch roads, road signs, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), and features (buildings, bridges, etc.), or at positions set with reference to such objects. This allows the observer (typically, the observer sitting in the driver's seat of the vehicle 1) to perceive visual augmented reality (AR).
  • An image whose display position can change according to the position of a real object existing in the real scene (or of a virtual object described later) is defined as an AR image, and an image whose display position is set regardless of the position of a real object is defined as a non-AR image.
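  • The AR / non-AR distinction can be captured in a small data structure. The class name, fields, and coordinate convention below are hypothetical illustrations, not taken from the disclosure: an AR image tracks a real object by some identifier, while a non-AR image carries a fixed position.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class HudImage:
    content: str
    fixed_position: Optional[Tuple[float, float]] = None  # used by non-AR images
    tracked_object_id: Optional[int] = None               # used by AR images

    @property
    def is_ar(self) -> bool:
        # An AR image's display position follows a real (or virtual) object.
        return self.tracked_object_id is not None


def position_of(image: HudImage, object_positions: dict):
    """Return the display position: the tracked object's current position
    for an AR image, the preset fixed position for a non-AR image."""
    if image.is_ar:
        return object_positions[image.tracked_object_id]
    return image.fixed_position
```

Each frame, the renderer would update `object_positions` from the exterior sensors and re-query `position_of` for every AR image.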
  • FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments.
  • The display control device 30 includes one or more I/O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37.
  • The various functional blocks described in FIG. 3 may consist of hardware, software, or a combination of both.
  • FIG. 3 shows only one embodiment; the illustrated components may be combined into fewer components, or additional components may be present.
  • For example, the image processing circuit 35 (for example, a graphics processing unit) may be included in the one or more processors 33.
  • The processor 33 and the image processing circuit 35 are operably connected to the memory 37. More specifically, by executing a program stored in the memory 37, the processor 33 and the image processing circuit 35 can, for example, generate and/or transmit image data and thereby perform the operations of the vehicle display system 10 (image display unit 20).
  • The processor 33 and/or the image processing circuit 35 may include at least one general-purpose microprocessor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • The memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, and any type of semiconductor memory, whether volatile or non-volatile.
  • The volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
  • The processor 33 is operably connected to the I/O interface 31.
  • The I/O interface 31 communicates with, for example, the vehicle ECU 401 described later and other electronic devices provided in the vehicle (reference numerals 403 to 419 described later) according to the CAN (Controller Area Network) standard (so-called CAN communication).
  • The communication standard adopted by the I/O interface 31 is not limited to CAN. It includes, for example, wired communication interfaces such as CAN FD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, and USB, as well as in-vehicle communication (internal communication) interfaces, that is, short-range wireless communication interfaces with a range of several tens of meters, such as a personal area network (PAN) like Bluetooth (registered trademark) and a local area network (LAN) like 802.11x Wi-Fi (registered trademark).
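  • As an illustration of the kind of data exchanged over such an in-vehicle interface, the sketch below decodes a hypothetical CAN frame payload carrying the vehicle speed and the distance L to the forward object. The byte layout and scaling factors are invented for illustration; they are not from the disclosure or any real signal database.

```python
import struct

# Hypothetical payload layout (an assumption, not from the patent):
#   bytes 0-1: vehicle speed, uint16 big-endian, in 0.01 km/h units
#   bytes 2-3: distance L to forward object, uint16 big-endian, in 0.1 m units

def parse_vehicle_frame(payload: bytes) -> dict:
    """Decode the assumed 4-byte payload into engineering units."""
    speed_raw, dist_raw = struct.unpack(">HH", payload[:4])
    return {"speed_kmh": speed_raw / 100.0, "distance_m": dist_raw / 10.0}
```

A real implementation would dispatch on the CAN arbitration ID and take the layout from the vehicle's signal definitions rather than hard-coding it.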
  • The I/O interface 31 may also include an external communication (vehicle-external communication) interface to a wide area communication network (for example, an Internet communication network), such as a wireless wide area network (WWAN), according to communication standards such as IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access), IEEE 802.16e-based (Mobile WiMAX), and cellular standards such as 4G, 4G-LTE, LTE Advanced, and 5G.
  • The processor 33 is interoperably connected to the I/O interface 31 so that it can exchange information with the various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31).
  • To the I/O interface 31, for example, a vehicle ECU 401, a road information database 403, an own-vehicle position detection unit 405, a vehicle exterior sensor 407, an operation detection unit 409, an eye position detection unit 411, an IMU 413, a line-of-sight direction detection unit 415, a mobile information terminal 417, an external communication device 419, and the like are operably connected.
  • The I/O interface 31 may include a function of processing (converting, calculating, analyzing) information received from the other electronic devices and the like connected to the vehicle display system 10.
  • The display 21 is operably connected to the processor 33 and the image processing circuit 35. Accordingly, the image displayed by the image display unit 20 may be based on image data received from the processor 33 and/or the image processing circuit 35.
  • The processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I/O interface 31.
  • The configuration of the display control device 30 is arbitrary as long as the functions described below are satisfied.
  • The vehicle ECU 401 acquires, from sensors and switches provided on the vehicle 1, the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine rotation speed, motor rotation speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitching angle), and vehicle vibration (including the magnitude and/or frequency of the vibration)), and collects and manages (and may also control) the state of the vehicle 1. As part of its functions, it can output a numerical value of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30.
  • The vehicle ECU 401 may simply transmit numerical values detected by the sensors and the like to the processor 33 (for example, a pitching angle of 3 degrees in the forward-tilt direction), or may instead, or in addition, transmit to the processor 33 determination results based on one or more states of the vehicle 1 derived from the detected values (for example, that the vehicle 1 satisfies a predetermined forward-leaning condition) and/or analysis results (for example, combined with information on the brake pedal opening, that the forward lean was caused by braking).
  • For example, the vehicle ECU 401 may output to the display control device 30 a signal indicating a determination result that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401.
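  • The two kinds of ECU output described above (a raw value versus a derived determination) can be sketched as follows. The threshold, field names, and cause-attribution rule are assumptions for illustration only.

```python
FORWARD_LEAN_PITCH_DEG = 2.0  # hypothetical threshold for the condition


def judge_forward_lean(pitch_deg: float, brake_pedal_opening: float) -> dict:
    """Reduce raw sensor values to the kind of determination result the
    ECU could transmit instead of (or alongside) the raw pitching angle."""
    leaning = pitch_deg >= FORWARD_LEAN_PITCH_DEG
    return {
        "forward_leaning": leaning,
        # Combine with the brake pedal state to attribute the cause,
        # as the analysis-result example in the text does.
        "caused_by_braking": leaning and brake_pedal_opening > 0.0,
    }
```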
  • The I/O interface 31 may also acquire the above-described information on the state of the vehicle 1 directly from the sensors and switches provided on the vehicle 1, without going through the vehicle ECU 401.
  • The vehicle ECU 401 may output to the display control device 30 an instruction signal indicating an image to be displayed by the vehicle display system 10; at this time, the coordinates, size, type, and display mode of the image, the necessity of notification of the image, and/or necessity-related information serving as a basis for determining the necessity of notification may be added to the instruction signal and transmitted.
  • The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1, or in an external server connected to the vehicle 1 via the external communication interface (I/O interface 31). Based on the position of the vehicle 1 from the own-vehicle position detection described later, it may read out road information around the vehicle 1 (lanes, white lines, stop lines, crosswalks, road widths, numbers of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (buildings, bridges, rivers, etc.), including their presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information, and transmit it to the processor 33. Further, the road information database 403 may calculate an appropriate route (navigation information) from a departure point to a destination and output a signal indicating the navigation information, or image data indicating the route, to the processor 33.
  • The own-vehicle position detection unit 405 is, for example, a GNSS (Global Navigation Satellite System) receiver or the like provided in the vehicle 1; it detects the current position and orientation of the vehicle 1 and outputs a signal indicating the detection result, via the processor 33 or directly, to the road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419.
  • The road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the own-vehicle position detection unit 405 continuously, intermittently, or on a predetermined event, and may select and generate information around the vehicle 1 and output it to the processor 33.
  • The vehicle exterior sensor 407 detects real objects existing around the vehicle 1 (in front, to the sides, and behind).
  • The real objects detected by the vehicle exterior sensor 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), the road surface of the traveling lane described later, lane markings, roadside objects, and/or features (buildings, etc.).
  • The vehicle exterior sensor 407 is composed of one or more detection units, such as a radar sensor (for example, a millimeter wave radar, an ultrasonic radar, or a laser radar), a camera, or a combination thereof, and a processing device that processes (for example, fuses) the detection data from the one or more detection units.
  • The one or more vehicle exterior sensors 407 detect real objects in front of the vehicle 1 for each detection cycle of each sensor and output real object information (for example, the presence or absence of a real object and, if a real object exists, information such as the position, size, and/or type of each real object) to the processor 33.
  • Alternatively, this real object information may be transmitted to the processor 33 via another device (for example, the vehicle ECU 401).
  • As the camera, an infrared camera or a near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night.
  • Also, as the camera, a stereo camera capable of acquiring distance and the like by parallax is desirable.
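  • The parallax-based distance acquisition mentioned for the stereo camera follows the standard pinhole relation Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity in pixels. The sketch below applies this relation; the focal length and baseline values in the test are illustrative assumptions.

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a matched point from stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px
```

Note the reciprocal relationship: depth resolution degrades quadratically with distance, which is why radar is often fused in for far objects.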
  • The operation detection unit 409 is, for example, a CID (Center Information Display) of the vehicle 1, a hardware switch provided on the instrument panel, or a software switch combining an image and a touch sensor, and outputs operation information based on operations by the occupant to the processor 33.
  • For example, according to the user's operation, the operation detection unit 409 outputs to the processor 33 display-area setting information based on an operation of moving the display area 100, eyebox setting information based on an operation of moving the eyebox 200, and information based on an operation of setting the observer's eye position 700.
  • The eye position detection unit 411 includes a camera, such as an infrared camera, that detects the eye position 700 (see FIG. 1) of the observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • In this case, the processor 33 acquires the captured image (an example of information capable of estimating the eye position 700) from the eye position detection unit 411 and analyzes it by a method such as pattern matching to detect the coordinates of the observer's eye position 700. Alternatively, the eye position detection unit 411 may itself detect the coordinates of the eye position 700 and output a signal indicating the detected coordinates to the processor 33.
  • The eye position detection unit 411 may also determine, from the result of analyzing the image captured by the camera, to which of a plurality of spatial regions corresponding to preset display parameters the observer's eye position 700 belongs, and output a signal indicating the determination result to the processor 33.
  • The method of acquiring the eye position 700 of the observer of the vehicle 1, or information capable of estimating the eye position 700, is not limited to these; the information may be obtained using a known eye position detection (estimation) technique.
  • The eye position detection unit 411 may also detect the moving speed and/or moving direction of the observer's eye position 700, and output a signal indicating the moving speed and/or moving direction to the processor 33.
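  • The region-based variant above (reporting which preset spatial region the eye position 700 falls in, rather than exact coordinates) can be sketched as follows. The region bounds and parameter-set names are illustrative assumptions, not values from this disclosure.

```python
from typing import Optional

# Height bands of the eyebox [m], each tied to a preset display-parameter set.
# The bands and identifiers are hypothetical examples.
EYE_REGIONS = [
    ((1.10, 1.20), "low"),
    ((1.20, 1.30), "middle"),
    ((1.30, 1.40), "high"),
]

def classify_eye_position(eye_height_m: float) -> Optional[str]:
    """Return the display-parameter set whose spatial region contains the
    detected eye height, or None if the eye position lies outside all regions."""
    for (lo, hi), params in EYE_REGIONS:
        if lo <= eye_height_m < hi:
            return params
    return None
```

The processor 33 would then select display parameters from the returned identifier instead of recomputing them from raw coordinates.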
  • The IMU 413 can include one or more sensors (for example, an accelerometer and a gyroscope), or combinations thereof, configured to detect the position and orientation of the vehicle 1, and changes thereto (change speed, change acceleration), based on inertial acceleration.
  • The IMU 413 may output the detected values (including the position and orientation of the vehicle 1 and signals indicating their changes (change speed, change acceleration)), and the results of analyzing the detected values, to the processor 33.
  • The result of the analysis is, for example, a signal indicating whether or not the detected values satisfy a predetermined condition; it may be, for example, a signal indicating, from a value relating to a change (change speed, change acceleration) of the position or orientation of the vehicle 1, that the behavior (vibration) of the vehicle 1 is small.
  • The line-of-sight direction detection unit 415 includes an infrared camera or a visible light camera that captures the face of the observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • In this case, the processor 33 acquires the captured image (an example of information capable of estimating the line-of-sight direction) from the line-of-sight direction detection unit 415 and analyzes it to identify the line-of-sight direction (and/or the gaze position) of the observer.
  • Alternatively, the line-of-sight direction detection unit 415 may analyze the captured image from the camera and output a signal indicating the observer's line-of-sight direction (and/or gaze position), which is the analysis result, to the processor 33.
  • The method for acquiring information capable of estimating the line-of-sight direction of the observer of the vehicle 1 is not limited to these; the information may be obtained using other known line-of-sight detection (estimation) techniques such as the EOG (Electro-oculogram) method, the corneal reflex method, the scleral reflex method, the Purkinje image detection method, the search coil method, or the infrared fundus camera method.
  • The mobile information terminal 417 is a smartphone, a laptop computer, a smart watch, or another information device that can be carried by the observer (or another occupant of the vehicle 1).
  • By pairing with the mobile information terminal 417, the I/O interface 31 can communicate with it and acquire the data recorded in the mobile information terminal 417 (or in a server accessed through the mobile information terminal).
  • The mobile information terminal 417 may have, for example, the same functions as the road information database 403 and the own vehicle position detection unit 405 described above, and may acquire the road information (an example of real object-related information) and transmit it to the processor 33.
  • The mobile information terminal 417 may also acquire commercial information (an example of real object-related information) related to commercial facilities in the vicinity of the vehicle 1 and transmit it to the processor 33.
  • The mobile information terminal 417 may transmit schedule information of its owner (for example, the observer), incoming call information, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.
  • The external communication device 419 is a communication device that exchanges information with the vehicle 1, and is, for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a mobile information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P: Vehicle To Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I: Vehicle To roadside Infrastructure); in a broad sense, it includes everything connected to the vehicle 1 by communication (V2X: Vehicle To Everything).
  • The external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (preceding vehicles, etc.), road surfaces, lane markings, roadside objects, and/or features (buildings, etc.) and output them to the processor 33. The external communication device 419 may also have the same function as the own vehicle position detection unit 405 described above, acquiring the position information of the vehicle 1 and transmitting it to the processor 33, and may further have the function of the road information database 403 described above, acquiring the road information (an example of real object-related information) and transmitting it to the processor 33. The information acquired from the external communication device 419 is not limited to the above.
  • The approach image V displayed in the display area of the virtual image is an image that provides notification of an object in front of the vehicle 1.
  • The approach image is broadly divided into a display in the "emphasis mode" that emphasizes a real object, a display in the "visualization mode" that visualizes information (a virtual object) that is not visible in the forward landscape, and a display in the "presentation mode" that presents information from the vehicle 1 side.
  • Here, the "object" includes a real object that can be perceived, such as an existing obstacle (a pedestrian, bicycle, motorcycle, another vehicle, etc.), a feature (a building, bridge, etc.), or an object on the road (a branch road, road sign, etc.), as well as a virtual object.
  • The virtual object includes various kinds of information associated with position information, such as switching points between general roads and highways, the number of lanes, speed limits, road names, road congestion status, and commercial information related to commercial facilities.
  • For example, when the vehicle 1 starts displaying in the "presentation mode" of presenting information, a virtual object is set, for example, 50 [m] ahead, and the size of the image displayed in the "presentation mode" can be adjusted according to the distance between the set virtual object and the vehicle 1.
  • The approach image of the emphasis mode is displayed at a position corresponding to a real object, such as a preceding vehicle, a pedestrian, a road sign, or a building, existing in front of the vehicle 1.
  • The approach image of the emphasis mode is, for example, superimposed on the real object or displayed in the vicinity of the real object to emphasize and give notification of the existence of the real object. That is, the "position corresponding to the real object" as the display position of the approach image in the emphasis mode is not limited to a position visually recognized as superimposed on the real object, and may be a position in the vicinity of the real object.
  • The display mode of the emphasis mode is arbitrary as long as it does not interfere with the visibility of the real object.
  • The approach image of the visualization mode is, for example, an image imitating a road sign for giving notification of various kinds of information at a specific place (an example of a virtual object) where no road sign is installed, or an image showing a POI (Point of Interest) such as a destination or a facility preset by the observer via the car navigation device 80 or the like.
  • The road sign referred to here includes all kinds of guide signs, warning signs, regulation signs, instruction signs, and the like. That is, the approach images that can be displayed in this embodiment include images that imitate road signs.
  • FIG. 4A and FIG. 4B show examples of approach images of the visualization mode imitating road signs.
  • The approach image shown in FIG. 4B is an image imitating a regulation sign indicating a speed limit.
  • The approach image of the visualization mode may also include a navigation image, such as an arrow shape, that guides the route of the vehicle 1.
  • The approach image of the presentation mode is an image showing various kinds of information presented from the vehicle 1 (vehicle ECU 401), for example, advice for driving, information on driving support in the vehicle 1, and information indicating the next song title in the audio system.
  • The display control device 30 controls the display of the approach image so that the longer the distance L from the vehicle 1 to the object, the smaller the display size of the approach image (in other words, the shorter the distance L, the larger the display size of the approach image). This makes it possible to give a pseudo perspective to the approach image.
  • This basic change in the size of the approach image according to the distance L is preferably continuous, but is not limited to this and may be intermittent.
  • The image display unit 20 may be composed of a depth perception display device capable of adjusting the distance at which the approach image is perceived (display distance), and the display control device 30 may perform display control of the approach image so that the longer the distance L from the vehicle 1 to the object, the longer the display distance of the approach image (conversely, the shorter the distance L, the shorter the display distance of the approach image). This also makes it possible to give a pseudo perspective to the approach image.
  • The depth perception display device includes known 3D display devices such as a light field display, a multi-lens display, and a binocular disparity display.
  • When the virtual image display area 100 is formed as a two-dimensional plane or curved surface and the tilt angle θt is 90 [degree] or less (preferably 50 [degree] or less, more preferably in the vicinity of 0 [degree]), the depth perception display device may also include a depth-expressing 2D display device that changes the imaging distance by changing the display position of the virtual image V within the virtual image display area 100.
  • The display control device 30 acquires the distance L to the object based on information acquired via the I/O interface 31 from at least one of the road information database 403 (including the car navigation system), the external sensor 407, the mobile information terminal 417, and the external communication device 419 (hereinafter also referred to as information sources).
  • The display control device 30 may acquire information indicating the distance L from an information source, or may acquire the coordinate information of the object from an information source and calculate the distance L based on the coordinate information.
  • The distance L to the virtual object which is the notification target of the approach image in the visualization mode can be acquired or calculated as the distance to a representative position of the virtual object specified based on the information from the road information database 403, the mobile information terminal 417, or the external communication device 419 (the representative position is, for example, the start position of a regulated section, or a position predetermined as suitable for road guidance).
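  • When an information source supplies the coordinate position of the object rather than the distance itself, the calculation of L mentioned above can be sketched as below (a flat two-dimensional approximation; the actual coordinate system and computation are design choices of the display control device 30).

```python
import math

def distance_to_object(vehicle_xy, object_xy):
    """Distance L between the vehicle 1 and the (real or virtual) object,
    computed from their planar coordinate positions."""
    dx = object_xy[0] - vehicle_xy[0]
    dy = object_xy[1] - vehicle_xy[1]
    return math.hypot(dx, dy)  # Euclidean distance
```

For example, an object at (30, 40) relative to the vehicle at the origin yields L = 50.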
  • The display control device 30 executes display control that gradually increases the size of the approach image (or, in addition, shortens the display distance of the approach image) based on the shortening of the acquired or calculated distance L.
  • The "distance L" does not necessarily have to be acquired or calculated, and the size of the approach image may be adjusted based on information from which the shortening of the distance L can be estimated.
  • The information from which the shortening of the distance L between the object and the vehicle 1 can be estimated may include, for example, (1) the combination of the coordinate position of the object and the coordinate position of the vehicle 1, and (2) the mileage of the vehicle 1.
  • The display control device 30 stores in advance, in the memory 37, control data for controlling the size (or, in addition, the display distance) of the approach image based on the distance L (or information from which the distance L can be estimated) acquired or calculated as described above.
  • The control data includes control data for controlling the size of the approach image (and may additionally include control data for controlling the display distance).
  • The control data for controlling the size of the approach image includes the reference control data CDs.
  • The display control device 30 determines the display ratio M based on the acquired or calculated distance L (or information from which the distance L can be estimated) and the reference control data CDs, and displays the approach image at the determined display ratio M.
  • The reference control data CDs include data of a reference function Fs (a mathematical expression) showing the relationship between the distance L and the display ratio M.
  • The display ratio M is the ratio of the displayed approach image to the reference size; if the size visually recognized by the observer is S and the reference size is S0, it is represented by M = S / S0.
  • The reference size is stored in advance in the memory 37 as the size of the approach image at a predetermined distance L0. How to set the predetermined distance L0 is arbitrary.
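  • As a minimal sketch of this relationship, assuming the inverse-proportional form of the reference function Fs mentioned later in this description, normalized so that M = 1 at the predetermined distance L0 (both the exact form and the constants are illustrative assumptions, not values from this disclosure):

```python
L0 = 50.0  # predetermined distance at which the approach image has the reference size

def reference_function_fs(distance_l: float) -> float:
    """Reference function Fs: display ratio M = S / S0 for a given distance L.
    Assumed inverse-proportional, with M = 1 at L = L0."""
    return L0 / distance_l

def displayed_size(reference_size, distance_l: float):
    """Size (width, height) visually recognized by the observer: the reference
    size scaled by M while keeping the aspect ratio."""
    m = reference_function_fs(distance_l)
    return (m * reference_size[0], m * reference_size[1])
```

Under these assumptions, an object at half the distance L0 is displayed at twice the reference size, which matches the 2A x 2B enlargement example described for FIG. 4A.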
  • FIG. 4A shows, as a rectangular approach image, an approach image of the reference size having a horizontal length A and a vertical length B.
  • FIG. 4B shows, as a circular approach image, an approach image of the reference size having a horizontal length C and a vertical length D.
  • The lengths referred to here are lengths that can be visually recognized by the observer, and can be defined, for example, by the number of pixels and by the enlargement ratio of the virtual image optical system 90 for each partial region of the display surface 21a.
  • The reference size may be adjusted according to the size of the real object with which the approach image is associated.
  • For example, the display control device 30 may change the reference size based on the size of the real object so that, as viewed from the observer, the approach image is approximately the same size as, or larger than, the real object.
  • When the display ratio M is 2, for example, the approach image shown in FIG. 4A is enlarged from the reference size and is visually recognized by the observer at a size of 2A in width and 2B in height.
  • Similarly, when the display ratio M is 1/2, the approach image shown in FIG. 4B is displayed at a size reduced from the reference size, with a horizontal length of (1/2)C and a vertical length of (1/2)D.
  • The enlargement or reduction of the approach image is executed while maintaining the aspect ratio of the approach image of the reference size.
  • The shape of the approach image is not limited to a rectangle or a circle and may be another shape such as a triangle or a polygon; the concept of enlargement or reduction is the same for other shapes.
  • The composition of the approach image is arbitrary as long as it can convey information about the object, and may be, for example, characters, symbols, figures, icons, or a combination thereof.
  • The display control device 30 controls the display of the approach images as follows.
  • The display control device 30 can display a plurality of approach images within the same period (at the same time).
  • When the display ratio M of an approach image determined based on the reference control data CDs is equal to or greater than a predetermined threshold value Mth, the approach image may be deleted.
  • Among the plurality of approach images, the one giving notification of the first object whose distance L from the vehicle 1 is the shortest is referred to as the first approach image V1, the one giving notification of the second object whose distance L is longer than that of the first approach image V1 is referred to as the second approach image V2, and the one giving notification of the third object whose distance L is longer still is referred to as the third approach image V3.
  • When these plural approach images are controlled based on the reference control data CDs (hereinafter referred to as normal control), they are displayed, as shown in the figure, in sizes decreasing in the order of the first approach image V1, the second approach image V2, and the third approach image V3.
  • In addition to this normal control, the display control device 30 executes the distance perception shortening process described later.
  • Specifically, based on the reference control data CDs, the display control device 30 performs display control that increases the size of the approach image V as the distance L to the object becomes shorter (an example of normal control), and when the attraction condition described later is satisfied during this normal control, the display control device 30 executes the enlargement process (an example of the distance perception shortening process).
  • FIG. 8 is a flow chart showing the enlargement process.
  • First, the display control device 30 determines whether or not the display ratio of the first approach image V1 is equal to or greater than the first threshold value MT1 (block S1).
  • The first threshold value MT1 is predetermined as the display ratio corresponding to the distance L when the object is sufficiently close to the vehicle 1 and is expected to pass by soon, and is stored in the memory 37.
  • If the determination in block S1 is Yes, the display control device 30 executes the process of block S2.
  • If the determination in block S1 is No, the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the first approach image V1 (block S3).
  • If the observer is not visually recognizing the first approach image V1 (block S3; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S3; Yes), the process of block S2 is executed. In this way, even when the display ratio of the first approach image V1 is less than the first threshold value MT1, if the observer has recognized the first approach image V1 and the necessity of notification by the first approach image V1 has already decreased, the enlargement process (an example of the distance perception shortening process) for ensuring the visibility of the second approach image V2 can be executed as described later.
  • In block S2, the display control device 30 determines whether or not the erasing condition of the first approach image V1 is satisfied. For example, when the display ratio of the first approach image V1 is equal to or greater than the predetermined threshold value Mth, the display control device 30 determines that the erasing condition is satisfied (block S2; Yes) and erases the first approach image V1 from the display range of the virtual image V (block S4).
  • The predetermined threshold value Mth is predetermined as a value larger than, for example, the first threshold value MT1, and is stored in the memory 37.
  • Next, in block S5, the display control device 30 determines whether or not the display ratio of the second approach image V2 is less than the second threshold value MT2.
  • The second threshold value MT2 is a value smaller than the first threshold value MT1; it is predetermined as the display ratio corresponding to the distance L when the object is far from the vehicle 1 and the second approach image V2 indicating the object is assumed to be difficult for the observer to see, and is stored in the memory 37.
  • If the determination in block S5 is No, the display control device 30 ends the distance perception shortening process. In this case, the second approach image V2 is displayed according to the normal control based on the reference control data CDs, that is, the reference function Fs.
  • If the determination in block S5 is Yes, the display control device 30 executes the enlargement process on the second approach image V2 (block S6).
  • In the enlargement process, the display control device 30 displays the second approach image V2 at a ratio larger than the display ratio MR (see FIGS. 7B and 7C) determined based on the reference control data CDs.
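  • The flow of blocks S1 through S6 above can be sketched as follows; the threshold values, the `v1_gazed_at` flag, and the enlargement factor are hypothetical placeholders rather than values from this disclosure.

```python
MT1 = 2.0   # first threshold: V1 near enough that its object will soon pass
MT2 = 0.5   # second threshold: below this, V2 is assumed hard to see (MT2 < MT1)
Mth = 3.0   # erasing threshold for V1 (Mth > MT1)

def enlargement_process(m_v1, m_v2, v1_gazed_at, enlargement_factor=1.5):
    """One pass of blocks S1-S6; returns (erase_v1, new display ratio of V2)."""
    # Block S1 (ratio test) or block S3 (gaze test on the first approach image V1)
    if not (m_v1 >= MT1 or v1_gazed_at):
        return (False, m_v2)  # end of the distance perception shortening process
    # Blocks S2 / S4: erasing condition of V1
    erase_v1 = m_v1 >= Mth
    # Blocks S5 / S6: enlarge V2 beyond its normal-control ratio while it is
    # still assumed hard to see
    if m_v2 < MT2:
        return (erase_v1, m_v2 * enlargement_factor)
    return (erase_v1, m_v2)
```

Note that the gaze branch (block S3) lets the enlargement of V2 proceed even before V1 reaches MT1, as described above.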
  • FIGS. 6A to 6C show an example of the image transition when the enlargement process is executed.
  • By the enlargement process, the second approach image V2 displayed under the normal control as shown in FIG. 6A is enlarged as shown in FIG. 6B.
  • By executing the enlargement process (an example of the distance perception shortening process), the second approach image V2, for which the necessity of notification is expected to increase, can be displayed in an easy-to-see manner.
  • When executing the enlargement process, the display control device 30 may display the second approach image V2 higher, as viewed from the observer, than before the execution. In this way, the observer can secure the field of view necessary for driving.
  • The control of moving the second approach image V2 upward in accordance with the enlargement process is particularly useful when the second approach image V2 is an approach image of the visualization mode imitating a road sign. This is because an actual road sign is installed above the road surface ahead, so that the observer can visually recognize the second approach image V2 as if looking at an actual road sign.
  • When the third approach image V3, which gives notification of an object whose distance L is longer than that of the second approach image V2, is being displayed, the display control device 30 continues to display the third approach image V3 under the control based on the reference control data CDs even while executing the enlargement process on the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose necessity of notification is higher than that of the third approach image V3, can be displayed effectively. If there are four or more approach images when the distance perception shortening process is executed, the display under the control based on the reference control data CDs is also continued for any approach image displayed smaller than the third approach image V3 (that is, an image whose notification target is at a distance L longer than that of the notification target of the third approach image V3).
  • The display control device 30 may also determine whether or not a part of the first approach image V1 is out of the display area of the virtual image V as shown in the figure; if it is, and if the determination in block S5 is Yes, the enlargement process may be executed. Even in this way, the notification target of the first approach image V1 will soon pass by, and the second approach image V2, whose necessity of notification is expected to be higher than that of the first approach image V1, can be displayed in an easy-to-see manner.
  • After executing the enlargement process on the second approach image V2, the display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the enlargement process (block S7).
  • If the determination in block S7 is Yes, the display control device 30 executes a return process (an example of the distance perception extension process described later) that returns the display control of the second approach image V2 to the control based on the reference control data CDs (block S8), and ends the distance perception shortening process.
  • FIGS. 6B to 6C show an example of the image transition when the return process is executed.
  • By the return process, the second approach image V2, which has been enlarged and displayed as shown in FIG. 6B, is again displayed under the control based on the reference control data CDs as shown in FIG. 6C.
  • While the figure shows an example in which the display ratio changes at once in the M direction when the enlargement process or the return process is executed, the display ratio may instead be changed gradually in order to suppress a sudden size change of the second approach image V2.
  • In that case, when the enlargement process is executed, the point (L, M) indicating the second approach image V2 moves obliquely upward to the left, rather than straight along the M direction, from the reference function Fs to the second function Ft; when the return process is executed, it moves obliquely downward to the left from the second function Ft to the reference function Fs.
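  • The gradual change between the two functions can be sketched as a per-frame interpolation; the inverse-proportional forms and constants (with K_T > K_S) are illustrative assumptions consistent with the description that the second function Ft is the reference function Fs with an increased proportionality constant.

```python
K_S = 50.0  # proportionality constant of the reference function Fs (illustrative)
K_T = 75.0  # proportionality constant of the second function Ft (K_T > K_S)

def ratio_between_fs_and_ft(distance_l: float, progress: float) -> float:
    """Display ratio of V2 while the point (L, M) moves from Fs (progress 0)
    toward Ft (progress 1); running progress back toward 0 models the return
    process. Because L keeps shrinking as the vehicle advances, the point
    traces the oblique paths described above."""
    m_s = K_S / distance_l  # value on the reference function Fs
    m_t = K_T / distance_l  # value on the second function Ft
    return m_s + (m_t - m_s) * progress
```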
  • The display control device 30 also determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed in an enlarged manner (block S9).
  • If the observer is not visually recognizing the second approach image V2 (block S9; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S9; Yes), the return process of block S8 is executed. In this way, when it is assumed that the observer has recognized the second approach image V2 being displayed in an enlarged manner, the display control of the second approach image V2 is returned to the control based on the reference control data CDs. This makes it possible to avoid continuing the enlarged display of the second approach image V2 for which the necessity of notification has already decreased. This concludes the explanation of the distance perception shortening process.
  • The present invention is not limited to the examples described above.
  • For example, after the enlargement process is executed, the display control of the second approach image V2 may be performed at a constant display ratio M that does not depend on the distance L.
  • In this case, the point (L, M) representing the second approach image V2 would ideally be returned as shown in FIG. 9A, but it is rarely located on the reference function Fs at the start of the return process. Therefore, the display control of the second approach image V2 may be performed using a return reference function Fsb that passes through the point representing the second approach image V2 at the start of the return process and whose proportionality constant is the proportionality constant of the reference function Fs reduced by a predetermined value. The control using the return reference function Fsb can be said to be an example of controlling the display ratio of the second approach image V2 based on the reference control data CDs.
  • Alternatively, the display ratio, which was a constant value from the start of the enlargement process to the start of the return process, may be changed based on the reference function Fs from the start of the return process. In these modified examples as well, the display ratio may be changed gradually when the enlargement process or the return process is executed, as described above.
  • In these modified examples, the return process may be started at the moment when the point representing the constant display ratio M intersects the reference function Fs. That is, the period from the execution of the enlargement process on the second approach image V2 to the start of the return process does not have to be a predetermined period.
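  • Under the inverse-proportional assumption Fs(L) = K_S / L (K_S is an illustrative constant, not a value from this disclosure), this modification amounts to holding a constant ratio until the shrinking distance brings Fs up to meet it:

```python
K_S = 50.0  # proportionality constant of the reference function Fs (illustrative)

def constant_then_normal_ratio(distance_l: float, m_const: float) -> float:
    """Display ratio of V2 under the modified control: constant m_const until
    the point (L, m_const) meets Fs at L = K_S / m_const (the moment the
    return process starts), then normal control along Fs."""
    intersection_l = K_S / m_const
    if distance_l >= intersection_l:
        return m_const          # still held constant
    return K_S / distance_l     # back on the reference function Fs
```

For example, with m_const = 1 the return process starts at L = 50, after which the ratio grows again as the distance keeps shortening.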
  • As described above, the second function Ft is obtained by increasing the proportionality constant of the reference function Fs, and the return reference function Fsb is obtained by decreasing it.
  • In the above description, the reference function Fs is an inversely proportional function, but the reference function Fs may be an algebraic function, including a polynomial function, or an elementary function that shows a curve. The reference function Fs may also be composed of a plurality of linear functions having different slopes. For example, as shown in FIG. 10B, the reference function Fs may show a curve in which the display ratio M becomes larger, relative to a simple proportional relationship, as the distance L becomes larger. Further, as shown in FIG. 10C, the reference function Fs may be represented by a plurality of straight lines, which may include straight lines whose slope is gentler at distances equal to or greater than a certain distance than at distances less than it. The examples of the reference function Fs shown in FIGS. 10B and 10C are useful when it is difficult to see a distant landscape during rain or heavy fog, or when it is difficult to see the approach image due to strong sunlight.
  • In the above description, the reference control data CDs are data representing the mathematical expression of the reference function Fs, but the reference control data CDs may be table data configured so that the display ratio M can be determined according to the acquired distance L. That is, as long as the reference control data CDs can determine, according to the acquired distance L, a display ratio M that becomes larger as the distance L becomes shorter (in other words, smaller as the distance L becomes longer), they may be data representing a mathematical expression or table data, and their configuration is arbitrary.
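  • A table-data form of the reference control data CDs could look like the following sketch, with linear interpolation between entries; the (L, M) pairs themselves are illustrative, not values from this disclosure.

```python
# Lookup table of (distance L, display ratio M) pairs, L ascending.
FS_TABLE = [(10.0, 5.0), (25.0, 2.0), (50.0, 1.0), (100.0, 0.5)]

def display_ratio_from_table(distance_l: float) -> float:
    """Display ratio M for distance L: clamped at the table ends, linearly
    interpolated between adjacent entries elsewhere."""
    if distance_l <= FS_TABLE[0][0]:
        return FS_TABLE[0][1]
    if distance_l >= FS_TABLE[-1][0]:
        return FS_TABLE[-1][1]
    for (l0, m0), (l1, m1) in zip(FS_TABLE, FS_TABLE[1:]):
        if l0 <= distance_l <= l1:
            t = (distance_l - l0) / (l1 - l0)
            return m0 + (m1 - m0) * t
    raise ValueError("unreachable for a sorted table")
```

As required above, the table yields an M that grows monotonically as L shrinks.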
  • In the distance perception shortening process shown in FIG. 8, the display control device 30 determines that the erasing condition is satisfied when the display ratio of the first approach image V1 is equal to or greater than the predetermined threshold value Mth (block S2), but the erasing condition is not limited to this.
  • For example, the display control device 30 may determine that the erasing condition is satisfied when the object visually recognized by the observer is the first approach image V1 (block S2 of the distance perception shortening process shown in FIG. 8; Yes).
  • As long as the second approach image V2 can be effectively displayed, it goes without saying that some processes in the distance perception shortening process described above may be omitted, and that predetermined processes may be changed or added.
  • For example, at least one of block S3, blocks S2 and S4, and block S9 may be omitted from the distance perception shortening process. The process of block S5 may also be omitted.
  • The control data for controlling the display distance of the approach image likewise includes the reference control data CDs.
  • The display control device 30 determines the display distance P based on the acquired or calculated distance L (or information from which the distance L can be estimated) and the reference control data CDs, and displays the approach image at the determined display distance P.
  • In this case, the reference control data CDs include data of a reference function Fs (a mathematical expression) showing the relationship between the distance L and the display distance P.
  • The constant β is determined, for example, as an offset amount in the depth direction (Z-axis direction) of the approach image with respect to the distance L to the object.
  • The constant β may be zero.
  • Based on the reference control data CDs, the display control device 30 performs display control that shortens the display distance P of the approach image V as the distance L to the object becomes shorter (an example of normal control). When the attraction condition described later is satisfied during this normal control, the display control device 30 executes the approach process (an example of the distance perception shortening process).
  • the display control device 30 determines whether or not the display distance of the first approach image V1 is equal to or less than the first threshold value PT1 (block S11).
  • the first threshold value PT1 is predetermined as a display distance corresponding to the distance L when the object is sufficiently close to the vehicle 1 and the object is expected to pass soon, and is stored in the memory 37.
  • the display control device 30 executes the process of block S12.
  • The display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the first approach image V1 (block S13).
  • If the observer is not visually recognizing the first approach image V1 (block S13; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S13; Yes), the process of block S12 is executed. In this way, even when the display distance P of the first approach image V1 is longer than the first threshold value PT1, once the observer has recognized the first approach image V1 and it has already served its notification purpose, the approach process (an example of the distance perception shortening process) for ensuring the visibility of the second approach image V2 can be executed as described later.
  • Next, the display control device 30 determines whether or not the erasing condition of the first approach image V1 is satisfied. For example, when the display distance P of the first approach image V1 is equal to or less than the predetermined threshold value Pth, the display control device 30 determines that the erasing condition is satisfied (block S12; Yes) and erases the first approach image V1 from the display range of the virtual image V (block S14).
  • the predetermined threshold value Pth is predetermined as a value smaller than, for example, the first threshold value PT1 and is stored in the memory 37.
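A minimal sketch of the decision flow of blocks S11 to S14 (the function name and return labels are hypothetical; the disclosure describes this flow only in prose):

```python
def erase_check(p_v1: float, pt1: float, pth: float, observer_sees_v1: bool) -> str:
    """Sketch of blocks S11-S14 for the first approach image V1.
    Block S11: enter the erase check when V1's display distance is at or
    below the first threshold PT1; block S13: also enter it when the
    line-of-sight information indicates the observer is looking at V1."""
    if p_v1 > pt1 and not observer_sees_v1:
        return "end"        # end the distance perception shortening process
    if p_v1 <= pth:         # block S12: erasing condition (Pth < PT1)
        return "erase_v1"   # block S14: erase V1 from the display range
    return "keep_v1"
```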
  • the display control device 30 determines whether or not the display distance P of the second approach image V2 is longer than the second threshold value PT2.
  • The second threshold value PT2 is a value larger than the first threshold value PT1, and is predetermined and stored in the memory 37 as the display distance P corresponding to the distance L when the object is far from the vehicle 1 and the second approach image V2 showing the object is assumed to be difficult for the observer to see.
  • the display control device 30 ends the distance perception shortening process.
  • the second close-up image V2 is displayed according to the normal control based on the reference control data CDs, that is, the reference function Fs.
  • the display control device 30 executes the approach process with respect to the second approach image V2 (block S16).
  • the display control device 30 displays the second approach image V2 at a distance closer than the display distance PR (see FIGS. 12B and 12C) determined based on the reference control data CDs.
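The switch between normal control (reference function Fs) and the approach process (second function Ft) could be sketched as follows; both numeric forms are invented for illustration, the disclosure only requires Ft to yield a shorter display distance than Fs for the same L:

```python
def fs(distance_l: float) -> float:
    return distance_l            # hypothetical reference function Fs

def ft(distance_l: float) -> float:
    return 0.6 * distance_l      # hypothetical second function Ft (closer)

def displayed_distance(distance_l: float, in_approach_process: bool) -> float:
    """Block S16 sketch: during the approach process the second approach
    image V2 is shown at Ft(L), closer than the reference distance Fs(L)."""
    return ft(distance_l) if in_approach_process else fs(distance_l)
```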
  • FIGS. 13A to 13C show an example of the image transition when the approach process is executed.
  • When the approach process is executed, the second approach image V2, displayed at the display distance P22 under normal control, moves to the display distance P22t, which is closer to the observer than the display distance P22, as shown in FIG. 13B.
  • By this approach process (an example of the distance perception shortening process), the second approach image V2, for which the need for notification is expected to increase, can be displayed in an easy-to-see manner.
  • Note that, when displaying the third approach image V3, which notifies of an object whose distance L is farther than that of the second approach image V2, the display control device 30 continues to display the third approach image V3 under the control based on the reference control data CDs even while executing the approach process for the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose need for notification is higher than that of the third approach image V3, can be displayed effectively. If four or more approach images are displayed when the distance perception shortening process is executed, any approach image displayed smaller than the third approach image V3 (that is, an image whose notified object is at a distance L farther than that of the object of the third approach image V3) also continues to be displayed under the control based on the reference control data CDs.
  • The display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the approach process (block S17).
  • the display control device 30 executes a return process for returning the display control of the second approach image V2 to the control based on the reference control data CDs (block S18), and ends the distance perception shortening process.
  • In the return process, the second approach image V2, displayed at the display distance P23t during the approach process, is displayed so as to move away to the display distance P23, a display distance based on the reference control data CDs that is farther from the observer than the display distance P22, as shown in FIG. 13C.
  • Although FIG. 13C shows an example in which the display distance P changes when the approach process or the return process is executed, the display distance P may be changed gradually in order to suppress a sudden size change of the second approach image V2.
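The gradual change of the display distance P (to avoid a sudden size jump of the second approach image V2) could be realized as simple per-update easing; the step fraction of 0.2 is an arbitrary illustrative choice:

```python
def ease_display_distance(p_current: float, p_target: float,
                          step_ratio: float = 0.2) -> float:
    """Move the display distance a fixed fraction of the remaining gap per
    update, so the image size changes smoothly rather than jumping."""
    return p_current + (p_target - p_current) * step_ratio
```

Iterating this brings P monotonically toward the target distance without an abrupt jump.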
  • For example, as shown by the dotted line between Fs and Ft in FIG. 12C, the point (L, P) representing the second approach image V2 may move diagonally downward to the left from the reference function Fs to the second function Ft when the approach process is executed, and may move from the second function Ft back to the reference function Fs when the return process is executed.
  • The display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed in an enlarged manner (block S19).
  • If the observer is not visually recognizing the second approach image V2 (block S19; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S19; Yes), the return process of block S18 is executed. By doing so, when it can be assumed that the observer has recognized the second approach image V2 being displayed in an enlarged manner, the display control of the second approach image V2 is returned to the control based on the reference control data CDs. This makes it possible to prevent the second approach image V2, for which the need for notification has already decreased, from continuing to be displayed in an enlarged manner. This concludes the explanation of the distance perception shortening process.
  • Method S100 is executed by an image display unit 20 including a display and a display control device 30 that controls the image display unit 20. Some actions in method S100 are optionally combined, some steps are optionally modified, and some actions are optionally omitted.
  • the software components stored in the memory 37 of FIG. 3 include a proximity image generation module 502, a distance perception shortening condition determination module 504, a notification necessity determination module 506, a distance perception shortening processing module 508, and a distance perception extension condition determination module 510. And the distance perception extension processing module 512.
  • the method S100 specifically provides a method of presenting an image (virtual image) that attracts the visual attention of the observer to a predetermined close-up image.
  • Method S100 can be implemented by having one or more processors 33 execute one or more computer programs stored in memory 37.
  • One or more processors 33 generate the close-up image V by executing the close-up image generation module 502.
  • The approach image generation module 502 acquires the positions of objects (real objects or virtual objects) from the road information database 403, the vehicle exterior sensor 407, the mobile information terminal 417, and/or the external communication device 419, and controls the size of the approach image V so that it gradually increases as the relative distance (distance L) between these objects and the vehicle 1 becomes shorter.
  • The close-up image generation module 502 may gradually reduce the display distance of the close-up image V while gradually increasing its size as the distance L between the object and the vehicle 1 becomes shorter.
  • One or more processors 33 execute the distance perception shortening condition determination module 504. The distance perception shortening condition determination module 504 executes various operations related to determining that the distance perception shortening condition is satisfied when, for example, the size of the image related to the first information (which has a higher notification necessity than the second information, the candidate for the attraction increase process) becomes equal to or larger than a predetermined threshold (S132), the display distance of the image related to the first information becomes equal to or more than a predetermined threshold (S134), the notification necessity of the second information increases to a predetermined determination value or more (S136), the notification necessity of the first information decreases to a predetermined determination value or less (S138), the approach image related to the first information ceases to be displayed (S140), or a combination of these holds. That is, the distance perception shortening condition determination module 504 may include various software components, such as commands, determination values, table data, and arithmetic expressions, for determining whether the distance perception shortening condition is satisfied from the notification necessity of the information indicated by the approach image or the display state of the approach image.
  • Further, it may be added to the distance perception shortening condition that (S130-1) the size of the approach image of the second information, the candidate for the attraction raising process, is equal to or less than a predetermined threshold value, (S130-2) the object corresponding to the second information, the candidate for the attraction raising process, is separated by a predetermined threshold value or more, (S130-3) the approach image of the second information, the candidate for the attraction raising process, contains text, or a combination of these holds. S130-2 and S130-3 are conditions that can serve as substitutes for the size condition of S130-1.
  • the notification necessity determination module 506 determines whether the approach image is the content to be notified to the observer.
  • The notification necessity determination module 506 may acquire information from various other electronic devices connected to the I/O interface 31 and calculate the notification necessity. Further, an electronic device connected to the I/O interface 31 in FIG. 11 may transmit information to the vehicle ECU 401, and the notification necessity determination module 506 may detect (acquire) the notification necessity determined by the vehicle ECU 401 based on the received information.
  • The "necessity of notification" can be determined by, for example, the degree of danger derived from the severity of what can occur, the degree of urgency derived from the length of the reaction time required to take a responsive action, the effectiveness derived from the situation of the vehicle 1 or the viewer (or another occupant of the vehicle 1), or a combination thereof (the indices of the need for notification are not limited to these).
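A hypothetical weighted combination of the three indices named above (danger, urgency, effectiveness); the weights and the linear form are placeholders, not from the disclosure:

```python
def notification_necessity(danger: float, urgency: float, effectiveness: float,
                           weights=(0.5, 0.3, 0.2)) -> float:
    """Combine the three indices into a single notification-necessity score,
    assuming each index is itself normalized to [0, 1]."""
    w_d, w_u, w_e = weights
    return w_d * danger + w_u * urgency + w_e * effectiveness
```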
  • the notification necessity determination module 506 may detect the necessity-related information that is the source for estimating the notification necessity, and may estimate the notification necessity from this.
  • The necessity-related information, which is the basis for estimating the notification necessity of an image, may be estimated based on, for example, the position and type of a real object or traffic regulations (an example of road information), or based on, or in addition to, other information input from various other electronic devices connected to the I/O interface 31. That is, the notification necessity determination module 506 may determine whether to notify the viewer and may choose not to display the image described later.
  • The display control device 30 does not have to have a function of estimating (calculating) the necessity of notification as long as it can acquire the necessity of notification; a part or all of the function of estimating the necessity of notification may be provided separately from the display control device 30 of the vehicle display system 10 (for example, in the vehicle ECU 401).
  • One or more processors 33 execute the distance perception shortening processing module 508. The distance perception shortening processing module 508 includes software components for enlarging the size of the approach image beyond its size in the reference control and further enlarging it as the distance L becomes shorter (enlargement process S152), making the display distance of the approach image closer than its display distance in the reference control and further shortening it as the distance L becomes shorter (approach process S154), enlarging the size of the approach image beyond its size in the reference control and maintaining that size (S156), making the display distance of the approach image closer than its display distance in the reference control and maintaining that display distance (S158), or executing a combination of these.
  • In order to enlarge the size of the approach image beyond its size in the reference control, or to make the display distance of the approach image closer than its display distance in the reference control, the distance perception shortening processing module 508 executes the first distance perception shortening processing data CDt (including the second function Ft) stored in advance in the memory 37 (blocks S152 and S154).
  • One or more processors 33 execute the distance perception extension condition determination module 510.
  • The distance perception extension condition determination module 510 executes various operations related to determining that the distance perception extension condition is satisfied when, for example, a predetermined time elapses after the distance perception shortening process of block S150 is executed (S172), the notification necessity of the second information decreases to a predetermined determination value or less (S174), the notification necessity of the third information, which is lower than that of the second information, increases to a predetermined determination value or more (S176), the vehicle 1 is operated manually (S178), the speed of the vehicle 1 is equal to or greater than a predetermined determination value, or a combination of these holds. That is, the distance perception extension condition determination module 510 may include various software components, such as commands, determination values, and table data, for determining whether the distance perception extension condition is satisfied based on the notification necessity of the information indicated by the approach image, the display state of the approach image, and the like.
  • The distance perception extension processing module 512 includes software components for making the size of the approach image smaller than its size in the distance perception shortening process (S192), making the display distance of the approach image farther than its display distance in the distance perception shortening process (S194), maintaining the image size (S196), maintaining the image display distance (S198), or executing a combination of these.
  • At this time, based on the distance L between the object and the vehicle 1 and the reference control data CDs (reference function Fs), the distance perception shortening processing module 508 controls the size of the approach image, the display distance of the approach image, or a combination of these.
  • the block S192 may include changing the size of the close-up image to the size in the reference control.
  • the block S194 may include changing the display distance of the approach image to the display distance in the reference control.
  • The approach image in the present embodiment can be displayed, broadly classified, by (1) a "normal control process" in which, as the relative distance between the vehicle and the object becomes shorter, the size becomes larger, the display distance becomes shorter, or a combination of these is executed, so that the approach image is perceived as gradually approaching; (2) a "first distance perception shortening process" in which the size is larger, the display distance is shorter, or a combination of these is executed compared with the display in the "normal control process"; and (3) a "second distance perception shortening process" in which the size is larger, the display distance is shorter, or a combination of these is executed compared with the display in the "first distance perception shortening process".
  • The close-up image control data in the present embodiment includes the reference control data CDs for display in the "normal control process", the first distance perception shortening processing data CDt for display in the "first distance perception shortening process", and the second distance perception shortening processing data CDv for display in the "second distance perception shortening process".
  • The second distance perception shortening processing data CDv is data that controls the approach image so that the size of the approach image with respect to the distance L is larger, and/or the display distance with respect to the distance L is shorter, than with the first distance perception shortening processing data CDt. That is, as shown in FIG. 15, when the size and/or display distance of the approach image is changed from that based on the reference control data CDs to that based on the second distance perception shortening processing data CDv, the distance perception can be shortened further, and the degree of attraction (attractiveness) can be increased further, compared with changing to the first distance perception shortening processing data CDt.
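The three control-data levels (CDs, CDt, CDv) could be modeled as increasingly aggressive display-ratio functions of the distance L; all numeric forms here are invented for illustration, the disclosure only fixes the ordering between the levels:

```python
def display_ratio(distance_l: float, level: str = "CDs") -> float:
    """Hypothetical display ratio M for object distance L under the three
    control-data levels: reference CDs, first shortening CDt, and second
    shortening CDv (largest ratio, i.e. shortest perceived distance)."""
    base = 100.0 / distance_l                      # grows as L shortens
    gain = {"CDs": 1.0, "CDt": 1.3, "CDv": 1.6}[level]
    return base * gain
```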
  • FIG. 16A is a flow chart showing a method S300 of display control of the approach image, executed when other information having a higher notification priority than the second information arises while the approach image related to the second information is being displayed, according to some embodiments. FIG. 16B is a flow chart following FIG. 16A, and FIG. 16C is a flow chart following FIG. 16B. Some actions in the method S300 are optionally combined, some are optionally modified, and some are optionally omitted. Further, in each of FIGS. 17A to 17F, the left figure illustrates an example of the display transition of the image, and the right figure shows the display ratio of the image according to the distance.
  • FIG. 17A shows a display example of the second approach image V2 (L1, M1) showing the second information regarding the object set at the distance L1 displayed by the block S310.
  • One or more processors 33 determine whether or not the condition for reducing the attractiveness of the close-up image V is satisfied.
  • When information having a higher notification necessity (here, the third information) than the information indicated by the approach image V that is a candidate for reducing the attractiveness (here, the second information) is newly detected, the one or more processors 33 determine that the condition for reducing the attractiveness is satisfied (S332).
  • One or more processors 33 execute a display process for reducing the attractiveness of the approach image V when it is determined in S330 that the above conditions are satisfied.
  • One or more processors 33 reduce the visibility of the approach image V (here, the second approach image V2) for which it was determined in S330 that the above condition is satisfied (the visibility includes, for example, brightness, display color, transparency, display gradation, or a combination thereof) (S352).
  • FIG. 17B is a figure showing a display example of the second approach image V2 (L1, M1), whose visibility is reduced in block S352, and the third approach image V3 showing the third information, which was determined in block S332 to have a higher notification necessity than the second information shown by the second approach image V2.
  • By lowering the visibility of the second approach image V2 and thereby relatively highlighting the third approach image V3 (here, the image with the higher need for notification), it becomes easier to direct the observer's visual attention to the third approach image V3 (or to the object with which the third approach image V3 is associated).
  • the third virtual image V3 does not have to be a close-up image that shortens the perceptual distance as the distance L to the object becomes shorter.
  • Further, the third virtual image V3 may not be displayed. That is, in FIGS. 17B and 17C, the third close-up image V3, which has a high need for notification, does not have to be displayed. In this way, by reducing the attractiveness of the second close-up image V2, the observer's visual attention is more easily directed, indirectly, to the object of the third information.
  • FIG. 17C shows a situation after the situation shown in FIG. 17B, when the distance L to the second object associated with the second information is shortened from L1 to L2 due to reasons such as the vehicle 1 moving forward.
  • The second approach image V2 expands its display ratio M from M1 to M2 based on the reference control data CDs as the distance L becomes shorter from L1 to L2 (an example of shortening the distance perception).
  • At this time, the attractiveness (as an example, the visibility) of the second approach image V2 remains reduced; however, the attractiveness of the second approach image V2 may be increased on the condition that, for example, a predetermined time has elapsed since block S350 was executed, or that the vehicle 1 has traveled a predetermined distance or more after block S350 was executed.
  • The block S330 (the first condition in the claims) may include, in place of block S332 or in addition to block S332, the condition that the notification necessity of information (here, the first information) having a lower notification necessity than the information indicated by the approach image V that is a candidate for reducing the attractiveness (here, the second information) increases (S334), the condition that the notification necessity of the information indicated by the approach image V that is a candidate for reducing the attractiveness (here, the second information) becomes lower than the notification necessity of that lower information (here, the first information) (S336), or a combination thereof.
  • The block S350 (the first display control in the claims) may include, in place of block S352 or in addition to block S352, hiding the approach image V (here, the second approach image V2) (S354), reducing the size of the approach image V (here, the second approach image V2) (S356), or a combination thereof. That is, the display process for reducing the attractiveness may include reducing the visibility of the image, hiding the image, reducing the size of the image (an example of extending the distance perception of the image), or a combination of these.
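The alternatives of the first display control (S352 lower the visibility, S354 hide, S356 shrink) could be sketched as operations on a hypothetical image-state dictionary; the keys and the numeric factors are illustrative assumptions:

```python
def reduce_attractiveness(image: dict, mode: str = "dim") -> dict:
    """Return a copy of the image state with its attractiveness reduced.
    The disclosure names the three kinds of operation; the concrete keys
    and factors here are hypothetical."""
    img = dict(image)
    if mode == "dim":        # S352: lower visibility (e.g. brightness)
        img["brightness"] *= 0.5
    elif mode == "hide":     # S354: hide the image
        img["visible"] = False
    elif mode == "shrink":   # S356: reduce the size (display ratio)
        img["display_ratio"] *= 0.7
    return img
```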
  • One or more processors 33 determine whether the condition for shortening the distance perception of the approach image V is satisfied, for example, when the information (here, the first information or the third information) having a higher notification necessity than the information (here, the second information) indicated by the approach image V that is a candidate for shortening the distance perception is no longer detected (S372).
  • One or more processors 33 execute a display process that shortens the distance perception of the approach image V when it is determined in S370 that the above conditions are satisfied.
  • One or more processors 33 enlarge the size of the approach image V (here, the second approach image V2) for which it was determined in S370 that the above condition is satisfied, beyond its size in the reference control, and further enlarge it as the distance L becomes shorter (S392).
  • FIG. 17D is a diagram showing a display example of the second approach image V2 (L2, M3) enlarged in block S392.
  • Specifically, one or more processors 33 increase the display ratio M corresponding to the distance L2 from M2 to M3 by changing the control data from the reference control data CDs to the first distance perception shortening processing data CDt.
  • As a result, the perceived distance felt by the observer is shortened, so that the second approach image V2 allows the observer to quickly recognize the approach of the associated object.
  • FIG. 17E is a figure showing the display transition of the second approach image V2 after the situation shown in FIG. 17D, when the distance L to the second object associated with the second information is shortened from L2 to L3 due to reasons such as the vehicle 1 moving forward.
  • The second close-up image V2 may expand its display ratio M from M3 to M4 based on the first distance perception shortening processing data CDt in accordance with the shortening of the distance L from L2 to L3 (an example of shortening the distance perception).
  • The block S370 (the second condition in the claims) may include, in place of block S372 or in addition to block S372, the condition that the notification necessity of information (here, the first information or the third information) having a higher notification necessity than the information indicated by the approach image V that is a candidate for reducing the attractiveness (here, the second information) decreases (S374), the condition that the notification necessity of the information indicated by the approach image V (here, the second information) becomes higher than the notification necessity of that higher-level information (here, the first information or the third information) (S376), or a combination thereof.
  • The block S390 (the second display control in the claims) may include, in place of block S392 or in addition to block S392, making the display distance of the approach image closer than its display distance in the reference control and further shortening the display distance as the distance L becomes shorter (S394), enlarging the size of the approach image and maintaining that size (S396), making the display distance of the approach image closer than its display distance in the reference control and maintaining that display distance (S398), or a combination thereof. That is, the display process that shortens the distance perception may include enlarging the image, shortening the display distance of the image, or a combination thereof.
  • One or more processors 33 determine whether the condition for extending the distance perception of the approach image V is satisfied. When one or more processors 33 detect that a predetermined time has elapsed since block S390 was executed, they determine that the condition for extending the distance perception is satisfied (S412). Further, the block S410 (the third condition in the claims) may include, in place of block S412 or in addition to block S412, for example, the condition that the vehicle 1 has traveled a predetermined distance or more after block S390 was executed (S414).
  • one or more processors 33 execute a display process for extending the distance perception of the approach image V.
  • One or more processors 33 make the size of the approach image V (here, the second approach image V2) for which it was determined in S410 that the above condition is satisfied smaller than its size in the first distance perception shortening process (S432).
  • FIG. 17F is a diagram showing a display example of the second approach image V2 (L3, M5) reduced in block S432.
  • One or more processors 33 change the control data from the first distance perception shortening processing data CDt to the reference control data CDs in block S432 (in other words, restore the distance perception shortened in block S390), thereby reducing the display ratio M corresponding to the distance L3 from M4 to M5.
  • The change that reduces the size of the second close-up image V2 makes it easier to draw the observer's visual attention and, by extension, can direct the observer's visual attention to objects in the vicinity of the distance perception of the changed image.
  • Further, the block S430 may include, in place of block S432 or in addition to block S432, setting the display distance of the approach image longer than its display distance in the first distance perception shortening process (S434).
  • Further, when block S390 is executed, the second approach image V2 may first be displayed at a display ratio M smaller than the display ratio M3 based on the first distance perception shortening processing data CDt corresponding to the actual object position L2 (for example, the display ratio M1 at the time of being hidden in S350), and then expanded. That is, before displaying the image at the distance perception based on the first distance perception shortening processing data CDt corresponding to the actual object position L2, the block S390 may continuously change the image toward the distance perception based on the first distance perception shortening processing data CDt corresponding to the actual object position L2. As a result, it is possible to prevent the observer from being surprised by the sudden display of an image with a short perceptual distance.
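The continuous change described for block S390 (starting from a smaller ratio such as M1 and approaching the CDt-based ratio for the current position) could be sketched as linear interpolation over a few frames; the function name and the linear form are assumptions:

```python
def ramp_display_ratio(m_start: float, m_target: float, frames: int) -> list:
    """Interpolate the display ratio from m_start toward m_target so the
    image does not appear suddenly at a short perceived distance."""
    return [m_start + (m_target - m_start) * i / frames
            for i in range(frames + 1)]
```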
  • The operations of the above-mentioned processing can be performed by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. All of these modules, combinations of these modules, and/or combinations with known hardware capable of substituting for their functions are within the scope of protection of the present invention.
  • The functional blocks of the vehicle display system 10 are optionally implemented by hardware, software, or a combination of hardware and software in order to carry out the principles of the various embodiments described.
  • It will be understood by those skilled in the art that the functional blocks described in FIG. 3 may be optionally combined, or one functional block may be separated into two or more sub-blocks, in order to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
  • the image display unit 20 in the vehicle display system 10 may be a head-mounted display (hereinafter, HMD) device.
  • The observer visually recognizes the displayed virtual image V superimposed on the foreground through the front windshield 2 of the vehicle 1.
  • The display area 100 in which the image display unit 20 displays a predetermined virtual image V is fixed at a specific position with respect to the coordinate system of the vehicle 1; when the observer turns toward that direction, the virtual image V displayed in the display area 100 fixed at that specific position can be visually recognized.
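The behavior above, a display area fixed in the vehicle's coordinate system that becomes visible when the observer turns toward it, can be roughly sketched as an angular check; the function name, angle convention, and field-of-view value are illustrative assumptions, not from the embodiment.

```python
def display_area_visible(observer_yaw_deg: float, area_yaw_deg: float,
                         fov_deg: float = 100.0) -> bool:
    """Return True when the display area, fixed at a given yaw in the
    vehicle coordinate system, falls within the observer's field of view."""
    # Wrap the angular difference into the range [-180, 180) degrees.
    diff = (observer_yaw_deg - area_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

An actual HMD implementation would use full 3D head pose from a tracker rather than a single yaw angle; this one-axis check only illustrates the vehicle-fixed display area.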
  • 10: Vehicle display system, 20: Image display unit (HUD device), 21: Display, 21a: Display surface, 22: Liquid crystal display panel, 24: Light source unit, 25: Relay optical system, 26: First mirror, 27: Second mirror, 28: Actuator, 29: Actuator, 30: Display control device, 31: I/O interface, 33: Processor, 35: Image processing circuit, 37: Memory, 40: Display light, 40p: Optical axis, 41: First display light, 42: Second image light, 43: Third image light, 80: Car navigation device, 90: Virtual image optical system, 100: Display area, 200: Eye box, 205: Center, 401: Vehicle ECU, 403: Road information database, 405: Own vehicle position detection unit, 407: Vehicle exterior sensor, 409: Operation detection unit, 411: Eye position detection unit, 415: Line-of-sight direction detection unit, 417: Mobile information terminal, 419: External communication device, 502: Approach image generation module, 504: Distance perception shortening condition determination module, 506: Notification necessity determination module, 508: Distance perception shortening processing module, 510

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to correctly directing an observer's visual attention to an image. A display control device (30) controls an image display device (20) that displays an image to an observer as a virtual image superimposed on the scenery in front of a vehicle. A processor executes a first display control in which an approach image (V) is displayed so as to appear progressively larger as the distance (L) to an object in front of the vehicle becomes shorter, and also executes a second display control in which: when a first prescribed condition is satisfied, the visibility of the approach image (V) is reduced, the approach image (V) ceases to be displayed, and/or the size of the approach image (V) is reduced; and when a second prescribed condition is satisfied, the size of the approach image (V) relative to the distance (L) is increased and/or the display distance is shortened to a greater extent than in the first display control.
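A minimal sketch of the two display controls in the abstract, growth of the approach image as the distance shortens plus suppression and emphasis branches, can be written as below; the function name, scaling constant, and boolean condition flags are illustrative assumptions, since the abstract does not specify how the prescribed conditions are evaluated.

```python
def approach_image_size(distance_m: float, base_size: float = 1.0,
                        suppress: bool = False, emphasize: bool = False) -> float:
    """Illustrative mapping from object distance to approach-image size.

    First display control: the image grows as the distance shrinks.
    Second display control (branches keyed on illustrative flags):
      - suppress: first prescribed condition satisfied -> reduce/hide the image,
      - emphasize: second prescribed condition satisfied -> enlarge the image
        relative to the distance more than the first display control would.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    size = base_size * (100.0 / distance_m)  # grows as the distance shortens
    if suppress:
        size = 0.0   # e.g. stop displaying the approach image entirely
    elif emphasize:
        size *= 1.5  # enlarge beyond the first display control's size
    return size
```

Reducing visibility could equally be modeled as lowering opacity instead of size; the branch structure, not the particular scaling, is the point of the sketch.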
PCT/JP2021/013480 2020-03-31 2021-03-30 Display control device, image display device, and method WO2021200913A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020061737 2020-03-31
JP2020-061737 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021200913A1 true WO2021200913A1 (fr) 2021-10-07

Family

ID=77929009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/013480 WO2021200913A1 (fr) Display control device, image display device, and method

Country Status (1)

Country Link
WO (1) WO2021200913A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010072365A (ja) * 2008-09-18 2010-04-02 Toshiba Corp Head-up display
JP2012162109A (ja) * 2011-02-03 2012-08-30 Toyota Motor Corp Vehicle display device
JP2017052364A (ja) * 2015-09-09 2017-03-16 Nippon Seiki Co., Ltd. Head-up display device
JP2019156296A (ja) * 2018-03-15 2019-09-19 Yazaki Corporation Vehicle display projection device
JP2019199139A (ja) * 2018-05-15 2019-11-21 Nippon Seiki Co., Ltd. Vehicle display device
JP2020032866A (ja) * 2018-08-30 2020-03-05 Nippon Seiki Co., Ltd. Virtual reality providing device for vehicle, method, and computer program


Similar Documents

Publication Publication Date Title
WO2019097763A1 (fr) Superimposed image display device and computer program
US11525694B2 (en) Superimposed-image display device and computer program
US11803053B2 (en) Display control device and non-transitory tangible computer-readable medium therefor
CN111095078A (zh) Method, device, and computer-readable storage medium with instructions for controlling the display of an augmented reality head-up display device for a motor vehicle
JP7310560B2 (ja) Display control device and display control program
JP2020032866A (ja) Virtual reality providing device for vehicle, method, and computer program
JP7255608B2 (ja) Display control device, method, and computer program
JP7459883B2 (ja) Display control device, head-up display device, and method
WO2022230995A1 (fr) Display control device, head-up display device, and display control method
WO2021200914A1 (fr) Display control device, head-up display device, and method
WO2021200913A1 (fr) Display control device, image display device, and method
WO2020158601A1 (fr) Display control device, method, and computer program
JP2021160409A (ja) Display control device, image display device, and method
JP2020121607A (ja) Display control device, method, and computer program
JP2020121704A (ja) Display control device, head-up display device, method, and computer program
JP2020117105A (ja) Display control device, method, and computer program
JP7434894B2 (ja) Vehicle display device
JP2020158014A (ja) Head-up display device, display control device, and display control program
WO2023003045A1 (fr) Display control device, head-up display device, and display control method
JP2019207632A (ja) Display device
WO2023210682A1 (fr) Display control device, head-up display device, and display control method
WO2023145852A1 (fr) Display control device, display system, and display control method
JP2022057051A (ja) Display control device, virtual image display device
JP7338632B2 (ja) Display device
JP2022113292A (ja) Display control device, head-up display device, and display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21781516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21781516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP