WO2021200913A1 - Display control device, image display device, and method - Google Patents

Display control device, image display device, and method

Info

Publication number
WO2021200913A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
distance
display control
approach
Prior art date
Application number
PCT/JP2021/013480
Other languages
French (fr)
Japanese (ja)
Inventor
誠 秦
Original Assignee
Nippon Seiki Co., Ltd. (日本精機株式会社)
Priority date
Filing date
Publication date
Application filed by Nippon Seiki Co., Ltd. (日本精機株式会社)
Publication of WO2021200913A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present disclosure relates to a display control device, an image display device, and a method used in a vehicle to display an image so that it is visually recognized superimposed on the foreground of the vehicle.
  • Patent Document 1 discloses a device that displays an image visually recognized by a viewer (mainly the driver of the vehicle) as a virtual image overlapping with the scenery in front of the vehicle.
  • The display device disclosed in Patent Document 1 controls the relative height of a close-up image that notifies the viewer of a real object, according to the distance from the vehicle to the real object.
  • If the technique disclosed in Patent Document 1 is used when displaying a plurality of the above-mentioned approach images at the same time, the same display control is performed for each approach image according to its distance from the vehicle. However, there is room for improvement in effectively displaying the close-up images that are expected to have a higher priority for the viewer.
  • The outline of this disclosure relates to appropriately directing the observer's visual attention to the image. More specifically, it also relates to providing an image that lets the observer recognize the distance to the object without getting in the observer's way.
  • The display control device described in the present specification controls an image display device that displays an image to the observer as a virtual image overlapping the scenery in front of the vehicle. It includes one or more I/O interfaces 31 capable of acquiring information, one or more processors 33, and a program; the one or more I/O interfaces 31 acquire a distance L to an object in front of the vehicle and vehicle information about the vehicle, and the one or more processors 33 perform the display control described in this specification based on the acquired information.
  • FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle.
  • FIG. 2 is a diagram showing a configuration of a head-up display device.
  • FIG. 3 is a block diagram of a vehicle display system.
  • FIG. 4A is a diagram showing an example of a close-up image displayed by an image display device.
  • FIG. 4B is a diagram showing an example of a close-up image displayed by an image display device.
  • FIG. 5 is a diagram that provides an explanation of control data for changing the size (display ratio) of a close-up image according to a distance.
  • FIG. 6A is a diagram showing a display example of an image before the enlargement processing is executed.
  • FIG. 6B is a diagram showing an example of displaying an image when the enlargement processing is executed, after the situation shown in FIG. 6A.
  • FIG. 6C is a diagram showing a display example of an image after the situation shown in FIG. 6B and after the enlargement processing is executed.
  • FIG. 7A is a diagram illustrating a display transition example of a plurality of close-up images.
  • FIG. 7B is a diagram after the situation of FIG. 7A, for explaining a display transition example of a plurality of close-up images.
  • FIG. 7C is a diagram, following the situation of FIG. 7B, for explaining the image enlargement processing.
  • FIG. 8 is a flowchart showing the enlargement processing.
  • FIG. 9A is a diagram for explaining a modification of the enlargement processing.
  • FIG. 9B is a diagram for explaining a modification of the enlargement processing.
  • FIG. 9C is a diagram for explaining a modification of the enlargement processing.
  • FIG. 10A is a diagram for explaining a modification of the reference control data.
  • FIG. 10B is a diagram for explaining a modification of the reference control data.
  • FIG. 10C is a diagram for explaining a modification of the reference control data.
  • FIG. 11 is a flowchart showing the approach process.
  • FIG. 12A is a diagram illustrating an example of display transitions of a plurality of close-up images.
  • FIG. 12B is a diagram after the situation of FIG. 12A and explaining a display transition example of a plurality of close-up images.
  • FIG. 12C is a diagram, following the situation of FIG. 12B, for explaining a display transition example of a plurality of close-up images.
  • FIG. 13A is a diagram showing a display example of an image before the approach process is executed.
  • FIG. 13B is a diagram showing an example of displaying an image when the approach process is executed, after the situation shown in FIG. 13A.
  • FIG. 13C is a diagram showing a display example of an image after the situation shown in FIG. 13B and after the approach process is executed.
  • FIG. 14A is a flow chart showing a method of performing the distance perception shortening process and the distance perception extension process according to some embodiments.
  • FIG. 14B is a flow chart following FIG. 14A.
  • FIG. 15 is a diagram for explaining the second and first distance perception shortening processing data.
  • FIG. 16A is a flowchart showing a method, according to some embodiments, of controlling the display of the approach image when other information having a higher notification priority than the second information is generated while the approach image related to the second information is being displayed.
  • FIG. 16B is a flow chart following FIG. 16A.
  • FIG. 16C is a flow chart following FIG. 16B.
  • In FIG. 17A, the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 17B follows the situation of FIG. 17A; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 17C follows the situation of FIG. 17B; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 17D follows the situation of FIG. 17C; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 17E follows the situation of FIG. 17D; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 17F follows the situation of FIG. 17E; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • In FIG. 18A, the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 18B follows the situation of FIG. 18A; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIG. 18C follows the situation of FIG. 18B; the left figure is a diagram for explaining an example of the display transition of the image, and the right figure is a diagram, corresponding to the left figure, showing the display ratio of the image according to the distance.
  • FIGS. 1 to 16 provide a description of the configuration and operation of an exemplary vehicle display system.
  • The present invention is not limited to the following embodiments (including the contents of the drawings); changes, including deletion of components, may be made to them. In the following description, descriptions of known technical matters are omitted as appropriate.
  • the image display unit (image display device) 20 in the vehicle display system 10 is a head-up display (HUD: Head-Up Display) device provided in the dashboard 5 of the vehicle 1.
  • The image display unit 20 emits the display light 40 toward the front windshield 2 (an example of the projected portion), and the front windshield 2 reflects the display light 40 of the image displayed by the image display unit 20 toward the eyebox 200.
  • The HUD device 20 notifies an occupant of the vehicle 1 (the occupant is typically the driver of the vehicle 1) of not only information about the vehicle 1 (hereinafter referred to as vehicle information) but also information other than the vehicle information.
  • vehicle information includes not only the information of the vehicle 1 itself but also the external information of the vehicle 1 related to the operation of the vehicle 1.
  • FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment.
  • the HUD device 20 includes a display 21 having a display surface 21a for displaying an image, and a relay optical system 25.
  • the display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24.
  • The display surface 21a is the surface on the viewing side of the liquid crystal display panel 22 and emits the display light 40 of the image. By setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40 traveling from the center of the display surface 21a toward the eyebox 200 (the center 205 of the eyebox 200) via the relay optical system 25 and the projected portion, the angle of the display area 100 (including the tilt angle θt) can be set.
  • The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (the light traveling from the display 21 toward the eyebox 200) and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20.
  • the relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.
  • the first mirror 26 has, for example, a free curved surface shape having positive optical power.
  • The first mirror 26 may have a curved surface shape in which the optical power differs for each region; that is, the optical power added to the display light 40 may differ according to the region (optical path) through which the display light 40 passes.
  • The relay optical system 25 may add different optical power to the first image light 41, the second image light 42, and the third image light 43 (see FIG. 2) directed from the respective regions of the display surface 21a toward the eyebox 200.
  • The second mirror 27 is, for example, a flat mirror, but is not limited to this and may have a curved surface having optical power. That is, by combining a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment), the relay optical system 25 may add different optical power according to the region (optical path) through which the display light 40 passes.
  • the second mirror 27 may be omitted. That is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 on the projected portion (front windshield) 2.
  • The relay optical system 25 includes two mirrors, but the present invention is not limited to this; in addition to or instead of these, it may include one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or a combination thereof.
  • The relay optical system 25 of the present embodiment has a function of setting the distance to the display area 100 by its curved surface shape (an example of optical power) and a function of generating a virtual image obtained by enlarging the image displayed on the display surface 21a; in addition, it may have a function of suppressing (correcting) distortion of the virtual image that may occur due to the curved shape of the front windshield 2.
  • The relay optical system 25 may be rotatable, with actuators 28 and 29 controlled by the display control device 30 attached to it.
  • the liquid crystal display panel 22 receives light from the light source unit 24 and emits spatial light-modulated display light 40 toward the relay optical system 25 (second mirror 27).
  • the liquid crystal display panel 22 has, for example, a rectangular shape whose short side is the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V seen from the observer are arranged.
  • the observer visually recognizes the transmitted light of the liquid crystal display panel 22 via the virtual image optical system 90.
  • the virtual image optical system 90 is a combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.
  • the light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).
  • the light source (not shown) is, for example, a plurality of chip-type LEDs, and emits illumination light to a liquid crystal display panel (an example of a spatial light modulation element) 22.
  • the light source unit 24 is composed of, for example, four light sources, and is arranged in a row along the long side of the liquid crystal display panel 22.
  • the light source unit 24 emits illumination light toward the liquid crystal display panel 22 under the control of the display control device 30.
  • the configuration of the light source unit 24 and the arrangement of the light sources are not limited to this.
  • The illumination optical system is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light of the light source unit 24 and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.
  • the light source unit 24 is configured to be locally dimmable, and the degree of illumination for each area of the display surface 21a may be changed under the control of the display control device 30. As a result, the light source unit 24 can adjust the brightness of the image displayed on the display surface 21a for each area.
  • the display 21 may be a self-luminous display or a projection type display that projects an image on a screen.
  • the display surface 21a is the screen of the projection type display.
  • Based on the control of the display control device 30 described later, the image display unit (image display device) 20 displays the image at a position overlapping a real object existing in the foreground, which is the real space (actual view) visually recognized via the front windshield 2 of the vehicle 1, such as the road surface of the traveling lane, a branch road, a road sign, an obstacle (pedestrian, bicycle, motorcycle, other vehicle, etc.), or a feature (building, bridge, etc.), or at a position set based on the real object, so that the observer (typically, the observer sitting in the driver's seat of the vehicle 1) can perceive visual augmented reality (AR).
  • In this specification, an image whose display position can be changed according to the position of a real object existing in the real scene or of a virtual object described later is defined as an AR image, and an image whose display position is set regardless of the position of the real object is defined as a non-AR image.
  • FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments.
  • the display control device 30 includes one or more I / O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37.
  • the various functional blocks described in FIG. 3 may consist of hardware, software, or a combination of both.
  • FIG. 3 is only one embodiment, and the illustrated components may be combined with a smaller number of components, or there may be additional components.
  • the image processing circuit 35 (for example, a graphic processing unit) may be included in one or more processors 33.
  • The processor 33 and the image processing circuit 35 are operably connected to the memory 37. More specifically, the processor 33 and the image processing circuit 35 execute a program stored in the memory 37 to, for example, generate and/or transmit image data, thereby operating the vehicle display system 10 (image display unit 20).
  • The processor 33 and/or the image processing circuit 35 includes at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • the memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD and DVD, any type of semiconductor memory such as a volatile memory, and a non-volatile memory.
  • the volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
  • the processor 33 is operably connected to the I / O interface 31.
  • The I/O interface 31 communicates with, for example, the vehicle ECU 401 described later and other electronic devices (reference numerals 403 to 419 described later) provided in the vehicle according to the CAN (Controller Area Network) standard (also referred to as CAN communication). The communication standard adopted by the I/O interface 31 is not limited to CAN; it includes in-vehicle communication (internal communication) interfaces such as wired communication interfaces, for example, CANFD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, or USB, and short-range wireless communication interfaces with a range of several tens of meters, for example, a personal area network (PAN) such as Bluetooth (registered trademark) or a local area network (LAN) such as 802.11x Wi-Fi (registered trademark).
  • The I/O interface 31 may also include an external communication (extra-vehicle communication) interface to a wide-area communication network (for example, the Internet) according to a cellular communication standard such as a wireless wide area network (WWAN), IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access), IEEE 802.16e-based (Mobile WiMAX), 4G, 4G-LTE, LTE Advanced, or 5G.
  • The processor 33 is interoperably connected to the I/O interface 31 so that it can exchange information with the various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31).
  • To the I/O interface 31, for example, the vehicle ECU 401, the road information database 403, the own-vehicle position detection unit 405, the vehicle exterior sensor 407, the operation detection unit 409, the eye position detection unit 411, the IMU 413, the line-of-sight direction detection unit 415, the mobile information terminal 417, the external communication device 419, and the like are operably connected.
  • the I / O interface 31 may include a function of processing (converting, calculating, analyzing) information received from another electronic device or the like connected to the vehicle display system 10.
  • the display 21 is operably connected to the processor 33 and the image processing circuit 35. Therefore, the image displayed by the image display unit 20 may be based on the image data received from the processor 33 and / or the image processing circuit 35.
  • the processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I / O interface 31.
  • the configuration of the display control device 30 is arbitrary as long as the functions described below are satisfied.
  • The vehicle ECU 401 acquires, from sensors and switches provided on the vehicle 1, the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitching angle), and vehicle vibration (including the magnitude and/or frequency of the vibration)), and collects and manages (and may also control) the state of the vehicle 1. As part of its functions, it can output numerical values of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30.
  • The vehicle ECU 401 may simply transmit the numerical values detected by the sensors and the like (for example, that the pitching angle is 3 [degrees] in the forward tilt direction) to the processor 33, or, instead of or in addition to this, it may transmit to the processor 33 determination results based on one or more states of the vehicle 1 including the numerical values detected by the sensors (for example, that the vehicle 1 satisfies a predetermined condition of the forward-leaning state) and/or analysis results (for example, that, combined with the information on the brake pedal opening degree, braking has caused the vehicle to lean forward).
  • the vehicle ECU 401 may output a signal indicating a determination result indicating that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401 to the display control device 30.
  • The I/O interface 31 may acquire the above-described information on the state of the vehicle 1 from the sensors and switches provided on the vehicle 1 without going through the vehicle ECU 401.
  • The vehicle ECU 401 may output to the display control device 30 an instruction signal indicating an image to be displayed by the vehicle display system 10; at this time, the coordinates, size, type, and display mode of the image, the notification necessity of the image, and/or necessity-related information serving as a basis for determining the notification necessity may be added to the instruction signal and transmitted.
  • The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1 or in an external server connected to the vehicle 1 via the external communication interface (I/O interface 31). Based on the position of the vehicle 1 acquired from the own-vehicle position detection unit 405 described later, it may read out road information around the vehicle 1 (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (buildings, bridges, rivers, etc.), including their presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information, and transmit them to the processor 33. Further, the road information database 403 may calculate an appropriate route (navigation information) from the departure point to the destination and output a signal indicating the navigation information, or image data indicating the route, to the processor 33.
  • The own-vehicle position detection unit 405 is a GNSS (Global Navigation Satellite System) receiver or the like provided in the vehicle 1; it detects the current position and orientation of the vehicle 1 and outputs a signal indicating the detection result, via the processor 33 or directly, to the road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419.
  • The road information database 403, the mobile information terminal 417 described later, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the own-vehicle position detection unit 405 continuously, intermittently, or at a predetermined event, select and generate information around the vehicle 1, and output it to the processor 33.
  • the vehicle exterior sensor 407 detects real objects existing around the vehicle 1 (front, side, and rear).
  • The real objects detected by the vehicle exterior sensor 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, etc.), the road surface of the traveling lane described later, lane markings, roadside objects, and/or features (buildings, etc.).
  • The vehicle exterior sensor 407 is composed of one or more detection units, each consisting of a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, a camera, or a combination thereof, and a processing device that processes (data-fuses) the detection data from the one or more detection units.
  • The one or more vehicle exterior sensors 407 detect real objects in front of the vehicle 1 in each detection cycle of each sensor and can output real object information (an example of real object-related information), such as the presence or absence of a real object and, when a real object exists, information such as the position, size, and/or type of each real object, to the processor 33.
  • This real object information may be transmitted to the processor 33 via another device (for example, the vehicle ECU 401).
  • As the camera, an infrared camera or a near-infrared camera is desirable so that a real object can be detected even when the surroundings are dark, such as at night.
  • In addition, a stereo camera capable of acquiring distance and the like from parallax is desirable.
  • The operation detection unit 409 is, for example, a CID (Center Information Display) of the vehicle 1, a hardware switch provided on the instrument panel, a software switch combining an image and a touch sensor, or the like.
  • the operation information based on the operation by the occupant is output to the processor 33.
  • For example, the operation detection unit 409 outputs to the processor 33 display area setting information based on an operation of moving the display area 100, eyebox setting information based on an operation of moving the eyebox 200, and information based on an operation of setting the observer's eye position 700, according to the user's operation.
  • The eye position detection unit 411 includes a camera, such as an infrared camera, that detects the eye position 700 (see FIG. 1) of the observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • The processor 33 may acquire a captured image (an example of information from which the eye position 700 can be estimated) from the eye position detection unit 411 and analyze the captured image by a method such as pattern matching to detect the coordinates of the observer's eye position 700; alternatively, the eye position detection unit 411 may detect the coordinates of the eye position 700 and output a signal indicating the detected coordinates to the processor 33.
  • The eye position detection unit 411 may output to the processor 33 a signal indicating the analysis result obtained by analyzing the image captured by the camera, for example, indicating to which of the spatial regions corresponding to a plurality of preset display parameters the observer's eye position 700 belongs.
  • The method of acquiring the eye position 700 of the observer of the vehicle 1, or information from which the eye position 700 can be estimated, is not limited to these; it may be acquired using a known eye position detection (estimation) technique.
  • The eye position detection unit 411 may detect the moving speed and/or moving direction of the observer's eye position 700 and output a signal indicating the moving speed and/or moving direction of the observer's eye position 700 to the processor 33.
  • The IMU 413 can include one or more sensors (e.g., accelerometers and gyroscopes), or a combination thereof, configured to detect the position and orientation of the vehicle 1, and changes therein (change speed, change acceleration), based on inertial acceleration.
  • The IMU 413 may output the detected values (including the position and orientation of the vehicle 1 and signals indicating their changes (change speed, change acceleration)) and the results of analyzing the detected values to the processor 33.
  • The result of the analysis is, for example, a signal indicating a determination result of whether or not the detected values satisfy a predetermined condition, for example, a signal indicating, from values relating to changes (change speed, change acceleration) in the position or orientation of the vehicle 1, that the behavior (vibration) of the vehicle 1 is small.
  • the line-of-sight direction detection unit 415 includes an infrared camera or a visible light camera that captures the face of an observer sitting in the driver's seat of the vehicle 1, and may output the captured image to the processor 33.
  • The processor 33 acquires a captured image (an example of information from which the line-of-sight direction can be estimated) from the line-of-sight direction detection unit 415 and can identify the line-of-sight direction (and/or the gaze position) of the observer by analyzing the captured image.
  • the line-of-sight direction detection unit 415 may analyze the captured image from the camera and output a signal indicating the line-of-sight direction (and / or the gaze position) of the observer, which is the analysis result, to the processor 33.
  • The method of acquiring information from which the line-of-sight direction of the observer of the vehicle 1 can be estimated is not limited to these; it may be acquired using other known line-of-sight detection (estimation) techniques such as the EOG (Electro-oculogram) method, the corneal reflex method, the scleral reflex method, the Purkinje image detection method, the search coil method, or the infrared fundus camera method.
  • the mobile information terminal 417 is a smartphone, a laptop computer, a smart watch, or other information device that can be carried by an observer (or another occupant of vehicle 1).
  • The I/O interface 31 can communicate with the mobile information terminal 417 by pairing with the mobile information terminal 417 and acquires the data recorded in the mobile information terminal 417 (or in a server accessed through the mobile information terminal).
  • The mobile information terminal 417 may have, for example, the same functions as the above-mentioned road information database 403 and own-vehicle position detection unit 405, and may acquire the road information (an example of real object-related information) and transmit it to the processor 33.
  • the personal digital assistant 417 may acquire commercial information (an example of information related to a real object) related to a commercial facility in the vicinity of the vehicle 1 and transmit it to the processor 33.
  • The mobile information terminal 417 may transmit schedule information of the owner (for example, the observer) of the mobile information terminal 417, incoming call information on the mobile information terminal 417, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.
  • The external communication device 419 is a communication device that exchanges information with the vehicle 1, for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a mobile information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P: Vehicle To Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I: Vehicle To roadside Infrastructure); in a broad sense, it includes everything connected to the vehicle 1 by communication (V2X: Vehicle To Everything).
  • The external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (preceding vehicles, etc.), road surfaces, lane markings, roadside objects, and/or features (buildings, etc.) and output them to the processor 33. Further, the external communication device 419 may have the same function as the own-vehicle position detection unit 405 described above and may acquire the position information of the vehicle 1 and transmit it to the processor 33; it may also have the function of the road information database 403 described above and may acquire the road information (an example of real object-related information) and transmit it to the processor 33. The information acquired from the external communication device 419 is not limited to the above.
  • The approach image V displayed in the display area of the virtual image is an image that notifies the observer of an object in front of the vehicle 1.
  • The close-up image is roughly divided into a display in the "emphasis mode" that emphasizes a real object, a display in the "visualization mode" that visualizes information (a virtual object) that is not visible in the foreground scenery, and a display in the "presentation mode" that presents information from the vehicle 1 side.
  • the "object” can be perceived as an existing obstacle (pedestrian, bicycle, motorcycle, other vehicle, etc.), a feature (building, bridge, etc.), and an object on the road (branch road, road sign).
  • the virtual object includes various information associated with location information, such as switching points between general roads and highways, the number of lanes, speed limits, road names, road congestion status, and commercial information related to commercial facilities.
  • For example, when the vehicle 1 starts displaying in the "presentation mode" of presenting information, a virtual object is set, for example, 50 [meters] ahead, and the size of the image displayed in the "presentation mode" can be adjusted according to the distance between the set virtual object and the vehicle 1.
  • the approach image of the emphasized mode is displayed at a position corresponding to a real object such as a preceding vehicle, a pedestrian, a road sign, or a building existing in front of the vehicle 1.
  • the close-up image of the emphasized mode is, for example, superimposed on the real object or displayed in the vicinity of the real object to emphasize and notify the existence of the real object. That is, the "position corresponding to the real object" as the display position of the approach image in the emphasized mode is not limited to the position that is visually recognized by being superimposed on the real object, and may be a position in the vicinity of the real object.
  • the emphasis mode is arbitrary as long as it does not interfere with the visibility of the real object.
  • The approach image of the visualization mode is, for example, an image imitating a road sign for notifying various information at a specific place (an example of a virtual object) where no road sign is installed, or an image showing a POI (Point of Interest) such as a destination or a facility preset by the observer with the car navigation device 80 or the like.
  • the road sign referred to here includes all kinds of guide signs, warning signs, regulation signs, instruction signs, and the like. That is, the approach image that can be displayed in this embodiment includes an image that imitates a road sign.
  • FIG. 4A and FIG. 4B show an example of a close-up image of a visualization mode imitating a road sign.
  • the close-up image shown in FIG. 4B is an image imitating a regulation sign indicating a speed limit.
  • the approach image of the visualization mode may include a navigation image such as an arrow shape that guides the route of the vehicle 1.
  • The close-up image of the presentation mode is an image showing various information presented from the vehicle 1 (vehicle ECU 401), for example, driving advice, information on driving support in the vehicle 1, information indicating the next song title in the audio, and the like.
  • The display control device 30 controls the display of the approach image so that the longer the distance L from the vehicle 1 to the object, the smaller the display size of the approach image (in other words, the shorter the distance L, the larger the display size of the approach image). This makes it possible to give a pseudo perspective to the close-up image.
  • the basic change in the size of the close-up image according to the distance L is preferably continuous, but is not limited to this, and may be intermittent.
  • Further, the image display unit 20 may be composed of a depth-perception display device capable of adjusting the distance (display distance) at which the approach image is displayed, and the display control device 30 may perform the display control of the close-up image so that the longer the distance L from the vehicle 1 to the object, the longer the perceived distance of the close-up image (conversely, the shorter the distance L, the shorter the display distance of the close-up image). This also makes it possible to give a pseudo perspective to the close-up image.
  • the depth perception display device includes a known 3D display device such as a light field display, a multi-lens display, and a binocular disparity display.
  • The depth perception display device may also include a depth 2D display device in which the virtual image display area 100 is formed as a two-dimensional plane or curved surface with a tilt angle θt of 90 [degrees] or less (preferably 50 [degrees] or less, more preferably in the vicinity of 0 [degrees]) and which changes the imaging distance by changing the display position of the virtual image V within the virtual image display area 100.
  • The display control device 30 acquires the distance L to the object based on information acquired via the I/O interface 31 from at least one of the road information database 403 (including a car navigation system), the vehicle exterior sensor 407, the mobile information terminal 417, and the external communication device 419 (hereinafter also referred to as information sources).
  • the display control device 30 may acquire information indicating the distance L from the information source, or may acquire the coordinate information of the object from the information source and calculate the distance L based on the coordinate information.
  • The distance L to the object that is the notification target of the approach image in the visualization mode can be acquired or calculated as the distance to a representative position of the virtual object specified based on information from the road information database 403, the mobile information terminal 417, or the external communication device 419 (the representative position is, for example, the start position of a regulated section, a position predetermined as suitable for route guidance, or the like).
  • the display control device 30 executes display control for gradually increasing the size of the approaching image (or, in addition, shortening the display distance of the approaching image) based on the shortening of the acquired or calculated distance L.
  • the "distance L" does not necessarily have to be acquired or calculated, and the size of the close-up image may be adjusted based on the information that can estimate the shortening of the distance L.
  • The information from which the shortening of the distance L between the object and the vehicle 1 can be estimated may include, for example, (1) the combination of the coordinate position of the object and the coordinate position of the vehicle 1, and (2) the mileage of the vehicle 1.
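  • As a minimal sketch of how such estimates might be formed (the function names, the planar-distance simplification, and the use of Python are illustrative assumptions, not taken from this specification):

```python
import math

def distance_l_from_coordinates(obj_xy: tuple[float, float],
                                vehicle_xy: tuple[float, float]) -> float:
    # (1) Combination of the coordinate position of the object and of the
    # vehicle 1, simplified here to a planar Euclidean distance (hypothetical).
    return math.hypot(obj_xy[0] - vehicle_xy[0], obj_xy[1] - vehicle_xy[1])

def remaining_distance_l(initial_l: float, mileage_since: float) -> float:
    # (2) Mileage of the vehicle 1: the shortening of L is estimated from the
    # distance travelled since the initial distance was acquired.
    return max(initial_l - mileage_since, 0.0)
```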
  • The display control device 30 stores in the memory 37, in advance, control data for controlling the size (or, in addition, the display distance) of the approach image based on the distance L (or information from which the distance L can be estimated) acquired or calculated as described above.
  • the control data includes control data for controlling the size of the close-up image (or, in addition, control data for controlling the display distance).
  • the control data for controlling the size of the close-up image includes the reference control data CDs.
  • the display control device 30 determines the display ratio M based on the acquired or calculated distance L (or information capable of estimating the distance L) and the reference control data CDs, and displays the approach image at the determined display ratio M.
  • the reference control data CDs include data of a reference function Fs (mathematical expression) showing the relationship between the distance L and the display ratio M.
  • the display ratio M is the display ratio of the close-up image to the reference size, and is represented by S / S0 if the size to be visually recognized by the observer is S and the reference size is S0.
  • the reference size is stored in the memory 37 in advance as the size of the approaching image at a predetermined distance L0. How to set the predetermined distance L0 is arbitrary.
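  • As an illustration of the relationship between the distance L, the display ratio M, and the reference size, the following sketch assumes an inversely proportional reference function Fs (one of the forms mentioned later in this section) normalized so that M = 1 at the predetermined distance L0; the names and constants are assumptions for illustration, not values from the specification:

```python
def reference_function_fs(distance_l: float, l0: float = 50.0) -> float:
    # Reference function Fs: display ratio M as a function of the distance L,
    # assumed inversely proportional here, with M = 1 at L = L0.
    return l0 / max(distance_l, 1e-3)

def visually_recognized_size(distance_l: float,
                             ref_width: float, ref_height: float,
                             l0: float = 50.0) -> tuple[float, float]:
    # Scale the reference size (e.g., A x B in FIG. 4A) by M = S / S0 while
    # keeping the aspect ratio of the close-up image.
    m = reference_function_fs(distance_l, l0)
    return ref_width * m, ref_height * m
```

  • Under these assumed values, an object at half the predetermined distance L0 gives M = 2, which corresponds to the 2A x 2B enlargement example described below.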
  • FIG. 4A shows a close-up image of a reference size having a length of A in the horizontal direction and a length of B in the vertical direction as a rectangular close-up image.
  • FIG. 4B shows a close-up image of a reference size having a length of C in the horizontal direction and a length of D in the vertical direction as a circular close-up image.
  • The lengths referred to here are lengths that can be visually recognized by the observer and can be defined, for example, by the number of pixels and by the enlargement ratio of the virtual image optical system 90 for each partial region of the display surface 21a.
  • The reference size may be adjusted according to the size of the object with which the approach image is associated.
  • For example, the display control device 30 may change the reference size based on the size of the real object so that, when viewed from the observer, the approach image appears approximately the same size as, or larger than, the real object.
  • For example, when the display ratio M is 2, the close-up image shown in FIG. 4A is enlarged from the reference size and is visually recognized by the observer with a width of 2A and a height of 2B; when the display ratio M is 1/2, the close-up image shown in FIG. 4B is displayed reduced from the reference size, with a width of 1/2 C and a height of 1/2 D.
  • the enlargement or reduction of the close-up image is executed while maintaining the aspect ratio of the close-up image of the reference size.
  • The shape of the close-up image is not limited to a rectangle or a circle and may be another shape such as a triangle or a polygon; the concept of enlargement or reduction is the same for other shapes.
  • the composition of the close-up image is arbitrary as long as it can convey information about the real object, and may be, for example, characters, symbols, figures, icons, or a combination thereof.
  • the display control device 30 controls the display of the approach image.
  • the display control device 30 can display a plurality of close-up images (at the same time) within the same period.
  • When the display ratio M of the approach image determined based on the reference control data CDs becomes equal to or greater than a predetermined threshold value Mth, the approach image may be erased.
  • Among the plurality of close-up images, the one that notifies the first object, whose distance L from the vehicle 1 is the closest, is referred to as the first approach image V1; the one that notifies the second object, whose distance L is farther than that of the first approach image V1, is referred to as the second approach image V2; and the one that notifies the third object, whose distance L is farther still than that of the second approach image V2, is referred to as the third approach image V3.
  • When these plural close-up images are controlled based on the reference control data CDs (hereinafter referred to as normal control), as shown in FIG. 7A, they are displayed in decreasing size in the order of the first close-up image V1, the second close-up image V2, and the third close-up image V3.
  • the display control device 30 executes the distance perception shortening process described later.
  • Based on the reference control data CDs, the display control device 30 performs display control to increase the size of the approach image V as the distance L to the object becomes shorter (an example of normal control). When the attraction condition described later is satisfied during this normal control, the display control device 30 executes the enlargement processing (an example of the distance perception shortening process).
  • FIG. 8 is a flow chart showing the enlargement processing.
  • the display control device 30 determines whether or not the display ratio of the first approach image V1 is equal to or higher than the first threshold value MT1 (block S1).
  • the first threshold value MT1 is predetermined as a display ratio corresponding to the distance L when the object is sufficiently close to the vehicle 1 and the object is expected to pass soon, and is stored in the memory 37.
  • When the display ratio of the first approach image V1 is equal to or higher than the first threshold value MT1 (block S1; Yes), the display control device 30 executes the process of block S2.
  • When the display ratio of the first approach image V1 is less than the first threshold value MT1 (block S1; No), the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer visually recognizes the first approach image V1 (block S3).
  • When the observer does not visually recognize the first approach image V1 (block S3; No), the display control device 30 ends the distance perception shortening process; when the observer visually recognizes the first approach image V1 (block S3; Yes), the process of block S2 is executed. In this way, even if the display ratio of the first approach image V1 is less than the first threshold value MT1, when the observer has already recognized the first approach image V1 and the necessity of notification by the first approach image V1 has therefore decreased, the enlargement processing (an example of the distance perception shortening process) for ensuring the visibility of the second close-up image V2 can be executed, as described later.
  • In block S2, the display control device 30 determines whether or not the erasing condition of the first approach image V1 is satisfied. For example, when the display ratio of the first approach image V1 is equal to or greater than the predetermined threshold value Mth, the display control device 30 determines that the erasing condition is satisfied (block S2; Yes) and erases the first approach image V1 from the display range of the virtual image V (block S4).
  • the predetermined threshold value Mth is predetermined as a value larger than, for example, the first threshold value MT1, and is stored in the memory 37.
  • Next, the display control device 30 determines whether or not the display ratio of the second approach image V2 is less than the second threshold value MT2 (block S5). The second threshold value MT2 is a value smaller than the first threshold value MT1 and is predetermined and stored in the memory 37 as the display ratio corresponding to a distance L at which the object is far from the vehicle 1 and the second approach image V2 indicating the object is assumed to be difficult for the observer to see.
  • When the display ratio of the second approach image V2 is equal to or greater than the second threshold value MT2 (block S5; No), the display control device 30 ends the distance perception shortening process. In this case, the second close-up image V2 continues to be displayed under the normal control based on the reference control data CDs, that is, the reference function Fs. When the display ratio of the second approach image V2 is less than the second threshold value MT2 (block S5; Yes), the display control device 30 executes the enlargement processing on the second approach image V2 (block S6).
  • the display control device 30 displays the second approach image V2 at a ratio larger than the display ratio MR (see FIGS. 7B and 7C) determined based on the reference control data CDs.
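  • The decision flow of blocks S1 to S6 described above can be sketched as follows; the threshold values, the data structure, and the Python form are placeholders for illustration only, and the flowchart of FIG. 8 remains the authoritative description:

```python
from dataclasses import dataclass

@dataclass
class ApproachImage:
    display_ratio: float    # display ratio M under the reference control data CDs
    gazed_at: bool = False  # from the line-of-sight direction detection unit 415

MT1 = 0.8   # first threshold: V1 is large enough that its object will soon be passed
MT2 = 0.3   # second threshold: V2 is still assumed hard for the observer to see
MTH = 1.0   # erasing threshold Mth (larger than MT1)

def enlargement_processing(v1: ApproachImage, v2: ApproachImage,
                           enlarged_ratio: float) -> tuple[bool, float | None]:
    """Returns (erase V1?, ratio for V2); the ratio is None when the process
    ends and V2 stays under normal control."""
    # Block S1: is the display ratio of the first approach image V1 >= MT1?
    if v1.display_ratio < MT1:
        # Block S3: if not, continue only when the observer has seen V1.
        if not v1.gazed_at:
            return False, None  # end the distance perception shortening process
    # Blocks S2 / S4: erase V1 once its erasing condition (ratio >= Mth) holds.
    erase_v1 = v1.display_ratio >= MTH
    # Block S5: enlarge V2 only while it is still hard to see (ratio < MT2).
    if v2.display_ratio >= MT2:
        return erase_v1, None   # keep V2 under normal control (reference function Fs)
    # Block S6: display V2 at a ratio larger than the ratio MR given by the CDs.
    return erase_v1, enlarged_ratio
```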
  • FIG. 6A to 6C show an example of image transition when the enlargement process is executed.
  • That is, the second close-up image V2 displayed under normal control as shown in FIG. 6A is enlarged as shown in FIG. 6B. By this enlargement processing (an example of the distance perception shortening processing), the second approach image V2, whose notification necessity is expected to increase, can be displayed in an easy-to-see manner.
  • Along with the enlargement processing, the display control device 30 may display the second approach image V2 at a position higher, as seen from the observer, than before the execution. In this way, the observer can secure the field of view necessary for driving.
  • The control of moving the second approach image V2 upward and displaying it in accordance with the enlargement processing is particularly useful when the second approach image V2 is an approach image of the visualization mode imitating a road sign. This is because a normal road sign is installed above the road surface ahead, so the observer can visually recognize the second approach image V2 as if looking at an actual road sign.
  • When the third approach image V3 that notifies an object whose distance L is farther than that of the second approach image V2 is being displayed, the display control device 30 continues to display the third close-up image V3 under the control based on the reference control data CDs even while executing the enlargement processing on the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose notification necessity in the near term is higher than that of the third approach image V3, can be displayed effectively. If there are four or more close-up images when the distance perception shortening process is executed, the close-up images displayed smaller than the third close-up image V3 (that is, images whose notification targets are at a farther distance L than that of the third close-up image V3) also continue to be displayed under the control based on the reference control data CDs.
  • Alternatively, the display control device 30 may determine whether or not a part of the first close-up image V1 is out of the display area of the virtual image V, and may execute the enlargement processing when it is out of the display area and the determination in block S5 is Yes. Even in this way, when the notification target of the first approach image V1 is about to be passed, the second approach image V2, which is expected to have a higher notification necessity than the first approach image V1, can be displayed in an easy-to-see manner.
  • Next, the display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the enlargement processing (block S7). When the predetermined period has elapsed (block S7; Yes), the display control device 30 executes a return process of returning the display control of the second approach image V2 to the control based on the reference control data CDs (an example of the distance perception extension process described later) (block S8) and ends the distance perception shortening process.
  • FIG. 6B to 6C show an example of image transition when the return process is executed.
  • the second approach image V2 which has been enlarged and displayed as shown in FIG. 6B, is displayed by control based on the reference control data CDs as shown in FIG. 6C.
  • Although FIG. 6C shows an example in which the display ratio changes in the M direction when the enlargement processing or the return processing is executed, the display ratio may be changed gradually in order to suppress a sudden change in the size of the second approach image V2.
  • In that case, the point (L, M) representing the second approach image V2 moves obliquely, rather than parallel to the M direction, from the reference function Fs to the second function Ft when the enlargement process is executed, and moves obliquely from the second function Ft back to the reference function Fs when the return process is executed.
  • The display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed in an enlarged manner (block S9).
  • If the observer is not visually recognizing the second approach image V2 (block S9; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S9; Yes), the return process of block S8 is executed. In this way, when it can be assumed that the observer has recognized the second approach image V2 being displayed in an enlarged manner, the display control of the second approach image V2 is returned to the control based on the reference control data CDs. This prevents the second approach image V2, whose notification necessity has already decreased, from continuing to be displayed in an enlarged manner. This concludes the explanation of the distance perception shortening process.
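Purely as a non-limiting illustration, the flow described above can be summarized by the following Python sketch. The class, helper names, and the hold period are assumptions introduced here; only the block numbers (S2, S7, S8, S9) correspond to the flow described above.

```python
import time
from dataclasses import dataclass

@dataclass
class ApproachImage:
    """Illustrative stand-in for an approach image; not the patent's API."""
    display_ratio: float            # current display ratio M
    control: str = "Fs"             # "Fs" = reference function, "Ft" = second function
    visible: bool = True

def distance_perception_shortening(v1: ApproachImage,
                                   v2: ApproachImage,
                                   m_threshold: float,
                                   observer_sees_v2,
                                   hold_period: float = 3.0) -> None:
    """Simplified sketch of the enlargement process (blocks S2, S7, S8, S9)."""
    # Block S2 (one possible erasing condition): V1 has grown to the threshold
    # Mth, so its notification target will pass soon.
    if v1.display_ratio < m_threshold:
        return
    v1.visible = False              # erase V1 from the virtual-image display area
    v2.control = "Ft"               # enlargement: control V2 by the second function Ft
    start = time.monotonic()
    # Block S7: hold the enlargement for a predetermined period, or
    # block S9: end it earlier once the observer has looked at the enlarged V2.
    while time.monotonic() - start < hold_period and not observer_sees_v2():
        time.sleep(0.05)
    v2.control = "Fs"               # block S8: return process (back to reference control)

# Example call with a hypothetical gaze check that immediately reports "seen".
distance_perception_shortening(ApproachImage(0.9), ApproachImage(0.3), 0.8, lambda: True)
```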
  • However, the present invention is not limited to this. For example, during the enlargement process, the display control of the second approach image V2 may be performed at a constant display ratio M that does not depend on the distance L.
  • In this case, as shown in FIG. 9A, the point (L, M) representing the second approach image V2 is rarely located on the reference function Fs at the start of the return process. Therefore, the display control of the second approach image V2 may be performed using a return reference function Fsb that passes through the point representing the second approach image V2 at the start of the return process and whose proportionality constant is obtained by subtracting a predetermined value from that of the reference function Fs. The control using the return reference function Fsb can be regarded as an example of controlling the display ratio of the second approach image V2 based on the reference control data CDs.
  • Alternatively, the display ratio, which was a constant value from the start of the enlargement process to the start of the return process, may be changed based on the reference function Fs from the start of the return process. In these modified examples as well, the display ratio may be changed gradually when the enlargement process or the return process is executed, as described above.
  • The return process may be started when the constant display ratio M intersects with the reference function Fs. That is, the period from the execution of the enlargement process on the second approach image V2 to the start of the return process does not have to be a predetermined fixed period.
  • The second function Ft is obtained by increasing the proportionality constant of the reference function Fs, and the return reference function Fsb is obtained by decreasing the proportionality constant of the reference function Fs.
  • In the above example the reference function Fs is an inversely proportional function, but the reference function Fs may also be an algebraic function (including a polynomial function) or an elementary function that shows a curve. Further, the reference function Fs may be composed of a plurality of linear functions having different slopes. For example, as shown in FIG. 10B, the reference function Fs may show a curve in which the display ratio M becomes larger than a directly proportional relationship as the distance L becomes larger. Further, as shown in FIG. 10C, the reference function Fs may be represented by a plurality of straight lines, and the plurality of straight lines may have a gentler slope when the distance is equal to or greater than a predetermined distance than when the distance is less than that distance. The examples of the reference function Fs shown in FIGS. 10B and 10C are useful when it is difficult to see a distant landscape during rain or heavy fog, or when it is difficult to see a nearby approach image due to strong sunlight.
  • In the above examples the reference control data CDs are data representing the mathematical formula of the reference function Fs, but the reference control data CDs may instead be table data configured so that the display ratio M can be determined according to the acquired distance L. That is, as long as the reference control data CDs can determine, according to the acquired distance L, a display ratio M that increases as the distance L becomes shorter (in other words, that becomes smaller as the distance L becomes longer), they may be data indicating a mathematical formula or table data, and their configuration is arbitrary.
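Purely as a non-limiting illustration (not part of the disclosed embodiments), the following Python sketch shows the two configurations mentioned above: the reference function Fs expressed as a formula, and the same relationship expressed as table data. The inversely proportional form, the constants, and the table values are assumptions chosen only so that the display ratio M increases as the distance L becomes shorter.

```python
import bisect

ALPHA, BETA = 50.0, 0.1   # assumed constants; the description only requires
                          # that M increase as the distance L becomes shorter

def display_ratio_formula(distance_l: float) -> float:
    """Reference function Fs expressed as a mathematical formula (assumed form)."""
    return ALPHA / max(distance_l, 1.0) + BETA

# The same relationship expressed as table data: (distance L [m], display ratio M).
RATIO_TABLE = [(10.0, 5.1), (20.0, 2.6), (50.0, 1.1), (100.0, 0.6)]

def display_ratio_table(distance_l: float) -> float:
    """Reference control data CDs given as table data, with linear interpolation."""
    distances = [row[0] for row in RATIO_TABLE]
    i = bisect.bisect_left(distances, distance_l)
    if i == 0:
        return RATIO_TABLE[0][1]
    if i == len(RATIO_TABLE):
        return RATIO_TABLE[-1][1]
    (l0, m0), (l1, m1) = RATIO_TABLE[i - 1], RATIO_TABLE[i]
    t = (distance_l - l0) / (l1 - l0)
    return m0 + t * (m1 - m0)
```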
  • In the above examples, the display control device 30 determines that the erasing condition is satisfied when the display ratio of the first approach image V1 is equal to or greater than the predetermined threshold value Mth (block S2 of the distance perception shortening process shown in FIG. 8), but the determination is not limited to this. For example, the display control device 30 may determine that the erasing condition is satisfied when the object visually recognized by the observer is the first approach image V1 (block S2 of the distance perception shortening process shown in FIG. 8; Yes).
  • As long as the second approach image V2 can be displayed effectively, it goes without saying that some processes in the distance perception shortening process described above may be omitted, and predetermined processes may be changed or added. For example, at least one of block S3, blocks S2 and S4, and block S9 may be omitted from the distance perception shortening process. Further, the processing of block S5 may be omitted from the distance perception shortening process.
  • the control data for controlling the display distance of the close-up image includes the reference control data CDs.
  • the display control device 30 determines the display ratio M based on the acquired or calculated distance L (or information capable of estimating the distance L) and the reference control data CDs, and displays the approach image at the determined display ratio M.
  • the reference control data CDs include data of a reference function Fs (mathematical expression) showing the relationship between the distance L and the display ratio M.
  • the constant ⁇ is determined by, for example, an offset amount in the depth direction (Z-axis direction) of the approaching image with respect to the distance L to the object.
  • the constant ⁇ may be zero.
  • Based on the reference control data CDs, the display control device 30 performs display control that shortens the display distance P of the approach image V as the distance L to the object becomes shorter (an example of normal control). Then, when the attraction condition described later is satisfied during this normal control, the display control device 30 executes an approach process (an example of the distance perception shortening process).
  • the display control device 30 determines whether or not the display distance of the first approach image V1 is equal to or less than the first threshold value PT1 (block S11).
  • the first threshold value PT1 is predetermined as a display distance corresponding to the distance L when the object is sufficiently close to the vehicle 1 and the object is expected to pass soon, and is stored in the memory 37.
  • If the display distance of the first approach image V1 is equal to or less than the first threshold value PT1 (block S11; Yes), the display control device 30 executes the process of block S12.
  • Otherwise, the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the first approach image V1 (block S13).
  • If the observer is not visually recognizing the first approach image V1 (block S13; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S13; Yes), the process of block S12 is executed. By doing so, even when the display distance P of the first approach image V1 is longer than the first threshold value PT1, once the observer has recognized the first approach image V1 and the notification by the first approach image V1 has presumably already served its purpose, the approach process (an example of the distance perception shortening process) for ensuring the visibility of the second approach image V2 can be executed as described later.
  • The display control device 30 determines whether or not the erasing condition of the first approach image V1 is satisfied (block S12). For example, when the display distance P of the first approach image V1 is equal to or less than the predetermined threshold value Pth, the display control device 30 determines that the erasing condition is satisfied (block S12; Yes) and erases the first approach image V1 from the display range of the virtual image V (block S14).
  • The predetermined threshold value Pth is predetermined as a value smaller than, for example, the first threshold value PT1, and is stored in the memory 37.
  • The display control device 30 determines whether or not the display distance P of the second approach image V2 is longer than the second threshold value PT2.
  • The second threshold value PT2 is a value larger than the first threshold value PT1, and is predetermined, and stored in the memory 37, as the display distance P corresponding to a distance L at which the object is far from the vehicle 1 and the second approach image V2 indicating the object is assumed to be difficult for the observer to see.
  • If not, the display control device 30 ends the distance perception shortening process; in that case, the second approach image V2 is displayed according to the normal control based on the reference control data CDs, that is, the reference function Fs.
  • If the display distance P of the second approach image V2 is longer than the second threshold value PT2, the display control device 30 executes the approach process on the second approach image V2 (block S16). Specifically, the display control device 30 displays the second approach image V2 at a distance closer than the display distance PR (see FIGS. 12B and 12C) determined based on the reference control data CDs.
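As a rough, non-limiting numerical illustration of how the approach process pulls the image closer than the reference display distance, the following sketch assumes a simple linear reference relationship with a depth-direction offset and a fixed gain; both the functional form and the values are assumptions, since the description does not specify the exact form of the reference control data CDs.

```python
def display_distance_reference(distance_l: float, beta: float = 0.0) -> float:
    """Reference control: the display distance P roughly tracks the object distance L.
    beta stands in for the assumed depth-direction offset mentioned in the description."""
    return distance_l + beta

def display_distance_approach(distance_l: float, gain: float = 0.6) -> float:
    """Approach process (second function Ft): for the same L, the image is displayed
    closer to the observer than under the reference control (gain is an assumption)."""
    return gain * display_distance_reference(distance_l)

# For an object 40 m ahead, the approach process pulls the image from 40 m to 24 m,
# which the observer perceives as a shortened distance.
assert display_distance_approach(40.0) < display_distance_reference(40.0)
```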
  • FIGS. 13A to 13C show an example of the image transition when the approach process is executed. The second approach image V2, which was displayed at the display distance P22 under normal control, is moved to the display distance P22t, which is closer to the observer than the display distance P22, as shown in FIG. 13B.
  • By the approach process (an example of the distance perception shortening process), the second approach image V2, whose notification necessity is expected to increase, can be displayed in an easy-to-see manner.
  • When displaying a third approach image V3 that notifies the observer of an object whose distance L is farther than that of the second approach image V2, the display control device 30 continues to display the third approach image V3 under the control based on the reference control data CDs even while the approach process is being executed for the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose notification necessity in the near term is higher than that of the third approach image V3, can be displayed effectively. If there are four or more approach images when the distance perception shortening process is executed, the display under the control based on the reference control data CDs is also continued for any approach image displayed smaller than the third approach image V3 (that is, an image whose notification target is at a distance L farther than that of the third approach image V3).
  • The display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the approach process (block S17).
  • When the predetermined period has elapsed (block S17; Yes), the display control device 30 executes a return process for returning the display control of the second approach image V2 to the control based on the reference control data CDs (block S18), and ends the distance perception shortening process.
  • By the return process, the second approach image V2, which was displayed at the display distance P23t during the approach process, is displayed so as to move away from the observer to the display distance P23, which is determined based on the reference control data CDs, as shown in FIG. 13C.
  • Although FIG. 13C shows an example in which the display distance P changes along the P direction when the approach process or the return process is executed, the display distance P may instead be changed gradually in order to suppress a sudden apparent size change of the second approach image V2.
  • As shown by the dotted lines between Fs and Ft in FIG. 12C, the point (L, P) representing the second approach image V2 may move obliquely from the reference function Fs to the second function Ft when the approach process is executed, and obliquely from the second function Ft back to the reference function Fs when the return process is executed.
  • The display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed closer by the approach process (block S19).
  • If the observer is not visually recognizing the second approach image V2 (block S19; No), the display control device 30 ends the distance perception shortening process; if the observer is visually recognizing it (block S19; Yes), the return process of block S18 is executed. In this way, when it can be assumed that the observer has recognized the second approach image V2 being displayed closer, the display control of the second approach image V2 is returned to the control based on the reference control data CDs. This prevents the second approach image V2, whose notification necessity has already decreased, from continuing to be displayed with an emphasized (shortened) distance perception. This concludes the explanation of the distance perception shortening process.
  • Method S100 is executed by an image display unit 20 including a display and a display control device 30 that controls the image display unit 20. Some actions in method S100 are optionally combined, some steps are optionally modified, and some actions are optionally omitted.
  • The software components stored in the memory 37 of FIG. 3 include an approach image generation module 502, a distance perception shortening condition determination module 504, a notification necessity determination module 506, a distance perception shortening processing module 508, a distance perception extension condition determination module 510, and a distance perception extension processing module 512.
  • Specifically, the method S100 provides a method of presenting images (virtual images) so as to attract the observer's visual attention to a predetermined approach image.
  • Method S100 can be implemented by having one or more processors 33 execute one or more computer programs stored in memory 37.
  • One or more processors 33 generate the close-up image V by executing the close-up image generation module 502.
  • The approach image generation module 502 acquires the positions of objects (real objects or virtual objects) from the road information database 403, the vehicle exterior sensor 407, the mobile information terminal 417, and/or the external communication device 419, and controls the approach image V so that its size gradually increases as the relative distance (distance L) between these objects and the vehicle 1 becomes shorter.
  • The approach image generation module 502 may also gradually reduce the display distance of the approach image V while gradually increasing the size of the approach image V as the distance L between the object and the vehicle 1 becomes shorter.
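A non-limiting sketch of this normal control is shown below; the per-frame dictionary structure and the inverse-proportional ratio are assumptions used only to show the image growing (and optionally getting closer) as the distance L shrinks.

```python
def generate_approach_image(object_positions, ratio_from_distance):
    """Sketch of the approach image generation module 502: given successive object
    distances (e.g. from the road database or the exterior sensor), yield a gradually
    growing image as the relative distance L becomes shorter.
    `ratio_from_distance` stands in for the reference control data CDs."""
    for distance_l in object_positions:
        yield {
            "distance_L": distance_l,
            "display_ratio_M": ratio_from_distance(distance_l),  # grows as L shrinks
            "display_distance_P": distance_l,  # optionally also shortened with L
        }

# Example: the vehicle closes in from 80 m to 20 m, so the image grows frame by frame.
frames = list(generate_approach_image([80.0, 60.0, 40.0, 20.0],
                                       lambda l: 50.0 / max(l, 1.0)))
assert frames[0]["display_ratio_M"] < frames[-1]["display_ratio_M"]
```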
  • One or more processors 33 execute the distance perception shortening condition determination module 504.
  • Various operations related to determining that the distance perception shortening condition is satisfied are executed when, for example, the size of the image related to the first information, which has a higher notification necessity than the second information that is a candidate for the attraction raising process, becomes equal to or larger than a predetermined threshold (S132); the display distance of the image related to the first information becomes equal to or greater than a predetermined threshold (S134); the notification necessity of the second information increases to a predetermined determination value or more (S136); the notification necessity of the first information decreases to a predetermined determination value or less (S138); the approach image relating to the first information is displayed at a predetermined determination value or less (S140); or a combination of these holds. That is, the distance perception shortening condition determination module 504 may include various software components, such as commands, determination values, table data, and arithmetic expressions, for determining whether the distance perception shortening condition is satisfied from the notification necessity of the information indicated by the approach image or from the display state of the approach image.
  • To the distance perception shortening condition, it may be added that (S130-1) the size of the approach image of the second information, which is a candidate for the attraction raising process, is equal to or less than a predetermined threshold value, that (S130-2) the object corresponding to the second information, which is a candidate for the attraction raising process, is separated by a predetermined threshold distance or more, that (S130-3) the approach image of the second information, which is a candidate for the attraction raising process, contains text, or that a combination of these holds.
  • The distance condition of S130-2 can serve as a substitute for the condition on the size of the approach image in S130-1.
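The condition checks of blocks S130 and S132 to S140 could be combined, for example, as in the following non-limiting sketch; the dictionary fields, threshold values, and the use of a simple logical OR are assumptions made only for illustration.

```python
def distance_perception_shortening_condition(first_info: dict,
                                             second_info: dict,
                                             size_threshold: float = 0.8,
                                             necessity_threshold: float = 0.5) -> bool:
    """Sketch of the distance perception shortening condition determination module 504.
    Only the S1xx labels correspond to the description; the rest is assumed."""
    conditions = (
        # S132: the image of the higher-necessity first information has grown large.
        first_info["display_ratio"] >= size_threshold,
        # S136: the notification necessity of the second information has risen.
        second_info["necessity"] >= necessity_threshold,
        # S138: the notification necessity of the first information has fallen.
        first_info["necessity"] <= necessity_threshold,
        # S130-1: the image of the second information is still small.
        second_info["display_ratio"] <= size_threshold,
    )
    return any(conditions)

satisfied = distance_perception_shortening_condition(
    {"display_ratio": 0.9, "necessity": 0.2},
    {"display_ratio": 0.3, "necessity": 0.7})
```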
  • the notification necessity determination module 506 determines whether the approach image is the content to be notified to the observer.
  • The notification necessity determination module 506 may acquire information from the various other electronic devices connected to the I/O interface 31 and calculate the notification necessity. Alternatively, an electronic device connected to the I/O interface 31 in FIG. 11 may transmit information to the vehicle ECU 401, and the notification necessity determination module 506 may detect (acquire) the notification necessity determined by the vehicle ECU 401 based on the received information.
  • The "notification necessity" can be determined, for example, by the degree of danger derived from the seriousness of what can occur, the degree of urgency derived from the length of reaction time required to take a responsive action, the effectiveness derived from the situation of the vehicle 1 or the viewer (or other occupants of the vehicle 1), or a combination of these (the indices of the notification necessity are not limited to these).
  • The notification necessity determination module 506 may detect necessity-related information that serves as the basis for estimating the notification necessity, and may estimate the notification necessity from it.
  • The necessity-related information, which serves as the basis for estimating the notification necessity of an image, may be estimated based on, for example, the position and type of a real object or traffic regulations (an example of road information), and may also be estimated based on, or in combination with, other information input from the various other electronic devices connected to the I/O interface 31. That is, the notification necessity determination module 506 may determine whether to notify the viewer and may choose not to display the image described later.
  • The display control device 30 does not have to have a function of estimating (calculating) the notification necessity as long as it can acquire the notification necessity, and a part or all of the function of estimating the notification necessity may be provided separately from the display control device 30 of the vehicle display system 10 (for example, in the vehicle ECU 401).
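As a non-limiting illustration of how the notification necessity determination module 506 might combine the indices named above (degree of danger, degree of urgency, effectiveness), the following sketch uses an assumed weighted sum; the weights and the normalized range are not part of the disclosure.

```python
def estimate_notification_necessity(danger: float,
                                    urgency: float,
                                    effectiveness: float,
                                    weights=(0.4, 0.4, 0.2)) -> float:
    """Sketch of one assumed way to combine the indices into a notification necessity."""
    w_danger, w_urgency, w_effect = weights
    necessity = w_danger * danger + w_urgency * urgency + w_effect * effectiveness
    return max(0.0, min(1.0, necessity))   # clamp to an assumed 0..1 range

# A close cut-in (high danger, high urgency) yields a higher notification necessity
# than a distant, low-urgency event.
assert estimate_notification_necessity(0.9, 0.8, 0.6) > \
       estimate_notification_necessity(0.2, 0.1, 0.3)
```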
  • One or more processors 33 execute the distance perception shortening processing module 508.
  • The distance perception shortening processing module 508 includes software components for enlarging the size of the approach image beyond the size in the reference control and further enlarging it as the distance L becomes shorter (enlargement process S152), for making the display distance of the approach image closer than the display distance in the reference control and further shortening it as the distance L becomes shorter (approach process S154), for enlarging the size of the approach image beyond the size in the reference control and maintaining that size (S156), for making the display distance of the approach image closer than the display distance in the reference control and maintaining that display distance (S158), or for executing a combination of these.
  • In order to enlarge the size of the approach image beyond the size in the reference control, or to make the display distance of the approach image closer than the display distance in the reference control, the distance perception shortening processing module 508 uses the first distance perception shortening processing data CDt (including the second function Ft) stored in advance in the memory 37 (blocks S152 and S154).
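The effect of switching from the reference control data CDs to the first distance perception shortening processing data CDt can be illustrated, under assumed inverse-proportional forms and constants, by the following non-limiting sketch.

```python
def display_ratio(distance_l: float, mode: str = "reference") -> float:
    """Sketch of switching between the reference control data CDs and the first
    distance perception shortening processing data CDt (second function Ft).
    The functional form and constants are assumptions."""
    k = {"reference": 50.0,      # CDs / reference function Fs
         "shortened": 80.0}[mode]  # CDt / second function Ft: larger ratio for the same L
    return k / max(distance_l, 1.0)

# Blocks S152/S154: for the same object distance L, the shortening process shows
# a larger (or closer) image than the reference control does.
assert display_ratio(40.0, "shortened") > display_ratio(40.0, "reference")
```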
  • One or more processors 33 execute the distance perception extension condition determination module 510.
  • Various operations related to determining that the distance perception extension condition is satisfied are executed when a predetermined time has elapsed after the distance perception shortening process of block S150 was executed (S172), when the notification necessity of the second information has decreased to a predetermined determination value or less (S174), when the notification necessity of the third information, which is lower than that of the second information, has increased to a predetermined determination value or more (S176), when the vehicle 1 is being operated manually (S178), when the speed of the vehicle 1 is equal to or greater than a predetermined determination value, or when a combination of these holds.
  • The distance perception extension condition determination module 510 may include various software components, such as commands, determination values, and table data, for determining whether the distance perception extension condition is satisfied based on the notification necessity of the information indicated by the approach image, the display state of the approach image, and the like.
  • The distance perception extension processing module 512 includes software components for making the size of the approach image smaller than the size in the distance perception shortening process (S192), making the display distance of the approach image farther than the display distance in the distance perception shortening process (S194), maintaining the image size (S196), maintaining the image display distance (S198), or executing a combination of these.
  • The distance perception shortening processing module 508 controls the size of the approach image, the display distance of the approach image, or a combination of these, based on the distance L between the object and the vehicle 1 and the reference control data CDs (reference function Fs).
  • the block S192 may include changing the size of the close-up image to the size in the reference control.
  • the block S194 may include changing the display distance of the approach image to the display distance in the reference control.
  • The approach image in the present embodiment can be displayed roughly in three ways: (1) a "normal control process" in which, as the relative distance between the vehicle and the object becomes shorter, the size is made larger, the display distance is made shorter, or a combination of these is executed, so that the approach image is perceived as gradually approaching; (2) a "first distance perception shortening process" in which the size is made larger, the display distance is made shorter, or a combination of these is executed, compared with the display in the "normal control process"; and (3) a "second distance perception shortening process" in which the size is made larger, the display distance is made shorter, or a combination of these is executed, compared with the display in the "first distance perception shortening process".
  • The control data for the approach image in the present embodiment include the reference control data CDs used for the display in the "normal control process", the first distance perception shortening processing data CDt used for the display in the "first distance perception shortening process", and the second distance perception shortening processing data CDv used for the display in the "second distance perception shortening process".
  • The second distance perception shortening processing data CDv is data that controls the approach image so that, compared with the first distance perception shortening processing data CDt, the size of the approach image with respect to the distance L is larger and/or the display distance with respect to the distance L is shorter. That is, as shown in FIG. 15, when the size and/or display distance of the approach image is changed from that based on the reference control data CDs to that based on the second distance perception shortening processing data CDv, the distance perception can be shortened further, and the degree of attraction (attractiveness) can be increased further, than when changing to the first distance perception shortening processing data CDt.
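The selection among the reference control data CDs and the first and second distance perception shortening processing data CDt and CDv, which according to the summary may differ based on the vehicle information, could look like the following non-limiting sketch; using vehicle speed as the deciding vehicle information and the specific speed threshold are assumptions.

```python
def select_control_data(vehicle_speed_kmh: float,
                        shortening_requested: bool) -> str:
    """Sketch of choosing among CDs, CDt, and CDv based on assumed vehicle information."""
    if not shortening_requested:
        return "CDs"     # normal control process
    if vehicle_speed_kmh >= 80.0:
        return "CDv"     # second process: even larger size / even shorter display distance
    return "CDt"         # first distance perception shortening process

assert select_control_data(100.0, True) == "CDv"
assert select_control_data(50.0, True) == "CDt"
assert select_control_data(50.0, False) == "CDs"
```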
  • FIG. 16A is a flow chart showing a method S300, according to some embodiments, of display control of the approach image that is executed when other information having a higher notification priority than the second information arises while the approach image related to the second information is being displayed. FIG. 16B is a flow chart following FIG. 16A, and FIG. 16C is a flow chart following FIG. 16B. Some actions in the method S300 are optionally combined, some are optionally modified, and some are optionally omitted. Further, in each of FIGS. 17A to 17F, the left figure explains an example of the display transition of the image, and the corresponding right figure shows the display ratio of the image according to the distance.
  • FIG. 17A shows a display example of the second approach image V2 (L1, M1), which indicates the second information regarding an object located at the distance L1 and is displayed in block S310.
  • One or more processors 33 determine whether or not the condition for reducing the attractiveness of the close-up image V is satisfied.
  • When the one or more processors 33 newly detect information (here, the third information) having a higher notification necessity than the information indicated by the approach image V that is a candidate for attractiveness reduction (here, the second information), they determine that the condition for reducing the attractiveness is satisfied (S332).
  • One or more processors 33 execute a display process for reducing the attractiveness of the approach image V when it is determined in S330 that the above conditions are satisfied.
  • The one or more processors 33 reduce the visibility of the approach image V (here, the second approach image V2) for which it was determined in S330 that the above condition is satisfied (the visibility includes, for example, brightness, display color, transparency, display gradation, or a combination of these) (S352).
  • FIG. 17B is a diagram showing a display example of the second approach image V2 (L1, M1), whose visibility has been reduced in block S352, and of the third approach image V3, which indicates the third information detected in block S332 as having a higher notification necessity than the second information indicated by the second approach image V2.
  • By lowering the visibility of the second approach image V2, the third approach image V3 (here, the image indicating the third information, which has a higher notification necessity than the second information) is relatively highlighted, which makes it easier to direct the observer's visual attention to the third approach image V3 (or to the object with which the third approach image V3 is associated).
  • the third virtual image V3 does not have to be a close-up image that shortens the perceptual distance as the distance L to the object becomes shorter.
  • Further, the third virtual image V3 may not be displayed at all. That is, in FIGS. 17B and 17C, the third approach image V3, which has a high notification necessity, does not have to be displayed. Even in this case, by reducing the attractiveness of the second approach image V2, the observer's visual attention is indirectly made easier to direct toward the object of the third information.
  • FIG. 17C shows the display transition of the second approach image V2 after the situation shown in FIG. 17B, when the distance L to the second object associated with the second information has shortened from L1 to L2 for reasons such as the vehicle 1 moving forward. The display ratio M of the second approach image V2 is expanded from M1 to M2 based on the reference control data CDs as the distance L shortens from L1 to L2 (an example of shortening the distance perception).
  • Here, the attractiveness (for example, the visibility) of the second approach image V2 remains reduced; however, the attractiveness of the second approach image V2 may be increased again on the condition that, for example, a predetermined time has elapsed since block S350 was executed, or that the vehicle 1 has traveled a predetermined distance or more after block S350 was executed.
  • The block S330 (the first condition in the claims) may include, in place of or in addition to block S332: that the notification necessity of information (here, the first information) having a lower notification necessity than the information indicated by the approach image V that is a candidate for attractiveness reduction (here, the second information) increases (S334); that the notification necessity of the information indicated by the approach image V that is a candidate for attractiveness reduction (here, the second information) becomes lower than the notification necessity of that lower-necessity information (here, the first information) (S336); or a combination of these.
  • The block S350 (the first display control in the claims) may include, in place of or in addition to block S352, hiding the approach image V (here, the second approach image V2) (S354), reducing the size of the approach image V (here, the second approach image V2) (S356), or a combination of these. That is, the display process for reducing the attractiveness may include reducing the visibility of the image, hiding the image, reducing the size of the image (an example of extending the distance perception of the image), or a combination of these.
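The alternatives of the first display control (S352, S354, S356) can be illustrated by the following non-limiting sketch; the brightness and scale factors are assumptions.

```python
def reduce_attractiveness(image: dict, method: str = "dim") -> dict:
    """Sketch of the first display control (blocks S352/S354/S356): lower the
    visibility of the approach image, hide it, or shrink it."""
    img = dict(image)
    if method == "dim":          # S352: reduce visibility (brightness, transparency, ...)
        img["brightness"] = img.get("brightness", 1.0) * 0.4
    elif method == "hide":       # S354: hide the approach image
        img["visible"] = False
    elif method == "shrink":     # S356: reduce the size (extend the distance perception)
        img["display_ratio"] = img.get("display_ratio", 1.0) * 0.7
    return img

dimmed = reduce_attractiveness({"brightness": 1.0, "display_ratio": 1.0}, "dim")
assert dimmed["brightness"] < 1.0
```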
  • The one or more processors 33 determine whether the condition for shortening the distance perception of the approach image V is satisfied (block S370). This determination is based, for example, on the relationship between the information having a higher notification necessity (here, the first information or the third information) and the information (here, the second information) indicated by the approach image V that is a candidate for shortening the distance perception (S372).
  • One or more processors 33 execute a display process that shortens the distance perception of the approach image V when it is determined in S370 that the above conditions are satisfied.
  • The one or more processors 33 enlarge the size of the approach image V (here, the second approach image V2), for which it was determined in S370 that the above condition is satisfied, beyond the size in the reference control, and further enlarge it as the distance L becomes shorter (S392).
  • FIG. 17D is a diagram showing a display example of the second approach image V2 (L2, M3) enlarged in block S392.
  • The one or more processors 33 increase the display ratio M corresponding to the distance L2 from M2 to M3 by changing the control data from the reference control data CDs to the first distance perception shortening processing data CDt.
  • As a result, the perceived distance felt by the observer is shortened, so that the observer can quickly and easily recognize the approach of the object associated with the second approach image V2.
  • FIG. 17E is a diagram showing the display transition of the second approach image V2 after the situation shown in FIG. 17D, when the distance L to the second object associated with the second information has shortened from L2 to L3 for reasons such as the vehicle 1 moving forward.
  • The display ratio M of the second approach image V2 may be expanded from M3 to M4 based on the first distance perception shortening processing data CDt in accordance with the shortening of the distance L from L2 to L3 (an example of shortening the distance perception).
  • The block S370 (the second condition in the claims) may include, in place of or in addition to block S372: that the notification necessity of information (here, the first information or the third information) having a higher notification necessity than the information indicated by the approach image V that is a candidate for attractiveness reduction (here, the second information) decreases (S374); that the notification necessity of the information indicated by that approach image V (here, the second information) becomes higher than the notification necessity of the higher-necessity information (here, the first information or the third information) (S376); or a combination of these.
  • The block S390 (the second display control in the claims) may include, in place of or in addition to block S392: making the display distance of the approach image closer than the display distance in the reference control and further shortening the display distance as the distance L becomes shorter (S394); enlarging the size of the approach image and maintaining that size (S396); making the display distance of the approach image closer than the display distance in the reference control and maintaining that display distance (S398); or a combination of these. That is, the display process that shortens the distance perception may include enlarging the image, shortening the display distance of the image, or a combination of these.
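A non-limiting sketch of the second display control (blocks S392 and S394) is shown below; the helper callables stand in for the reference control data CDs and the first distance perception shortening processing data CDt, and the 0.7 distance factor is an assumption.

```python
def shorten_distance_perception(image: dict,
                                distance_l: float,
                                ratio_reference,
                                ratio_shortened) -> dict:
    """Sketch of the second display control: enlarge the image beyond its
    reference-control size and/or pull its display distance closer."""
    img = dict(image)
    img["display_ratio"] = max(ratio_shortened(distance_l),
                               ratio_reference(distance_l))   # S392: larger than reference
    img["display_distance"] = min(img.get("display_distance", distance_l),
                                  0.7 * distance_l)           # S394: closer than reference
    return img

updated = shorten_distance_perception({"display_ratio": 1.0, "display_distance": 40.0},
                                      40.0,
                                      lambda l: 50.0 / l,     # assumed CDs
                                      lambda l: 80.0 / l)     # assumed CDt
assert updated["display_ratio"] == 2.0   # 80/40, larger than the reference 50/40
```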
  • The one or more processors 33 determine whether the condition for extending the distance perception of the approach image V is satisfied (block S410). When the one or more processors 33 detect that a predetermined time has elapsed since block S390 was executed, they determine that the condition for extending the distance perception is satisfied (S412). Further, the block S410 (the third condition in the claims) may include, in place of or in addition to block S412, for example, that the vehicle 1 has traveled a predetermined distance or more after block S390 was executed (S414).
  • one or more processors 33 execute a display process for extending the distance perception of the approach image V.
  • The one or more processors 33 make the size of the approach image V (here, the second approach image V2), for which it was determined in S410 that the above condition is satisfied, smaller than the size in the first distance perception shortening process (S432).
  • FIG. 17F is a diagram showing a display example of the second approach image V2 (L3, M5) reduced in block S432.
  • In block S432, the one or more processors 33 change the control data from the first distance perception shortening processing data CDt to the reference control data CDs (in other words, restore the distance perception shortened in block S390), whereby the display ratio M corresponding to the distance L3 is reduced from M4 to M5.
  • The change that reduces the size of the second approach image V2 makes it easier to draw the observer's visual attention to the image and, by extension, can direct the observer's visual attention to objects in the vicinity of the distance perceived for the changed image.
  • The block S430 may include, in place of or in addition to block S432, setting the display distance of the approach image longer than the display distance in the first distance perception shortening process (S434).
  • When block S390 is executed, the second approach image V2 may first be displayed at a display ratio M smaller than the display ratio M3 based on the first distance perception shortening processing data CDt corresponding to the actual object position L2 (for example, at the display ratio M1 it had when it was hidden in S350), and then be enlarged. That is, before displaying the image with the distance perception based on the first distance perception shortening processing data CDt corresponding to the actual object position L2, block S390 may continuously change the image toward the distance perception based on the first distance perception shortening processing data CDt corresponding to the actual object position L2. As a result, it is possible to prevent the observer from being surprised by an image with a short perceptual distance suddenly being displayed.
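The continuous change described above, from the distance perception based on the reference control data CDs toward that based on the first distance perception shortening processing data CDt, can be illustrated by the following non-limiting sketch; the linear blending and the assumed function forms are illustrative only.

```python
def blended_display_ratio(distance_l: float,
                          progress: float,
                          ratio_reference,
                          ratio_shortened) -> float:
    """Sketch of continuously changing the display ratio from the value based on the
    reference control data CDs toward the value based on the first distance perception
    shortening processing data CDt, so an abruptly 'close' image does not surprise
    the observer. Linear blending is an assumption."""
    progress = max(0.0, min(1.0, progress))   # 0.0 = pure CDs, 1.0 = pure CDt
    return ((1.0 - progress) * ratio_reference(distance_l)
            + progress * ratio_shortened(distance_l))

# Over a few frames the ratio moves smoothly from the CDs value to the CDt value.
ref, short = (lambda l: 50.0 / l), (lambda l: 80.0 / l)
steps = [blended_display_ratio(40.0, p, ref, short) for p in (0.0, 0.5, 1.0)]
assert steps[0] == 1.25 and steps[-1] == 2.0 and steps[0] < steps[1] < steps[2]
```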
  • The operations of the processing described above can be performed by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. All of these modules, combinations of these modules, and/or combinations with known hardware capable of substituting their functions are within the scope of protection of the present invention.
  • The functional blocks of the vehicle display system 10 are optionally implemented by hardware, software, or a combination of hardware and software in order to carry out the principles of the various described embodiments.
  • It will be understood by those skilled in the art that the functional blocks described in FIG. 3 may be optionally combined, or that one functional block may be separated into two or more sub-blocks, in order to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
  • the image display unit 20 in the vehicle display system 10 may be a head-mounted display (hereinafter, HMD) device.
  • The observer visually recognizes the displayed virtual image V superimposed on the foreground through the front windshield 2 of the vehicle 1.
  • By fixing the display area 100, in which the image display unit 20 displays the predetermined virtual image V, at a specific position with respect to the coordinate system of the vehicle 1, the observer can visually recognize the virtual image V displayed in the display area 100 when turning toward that specific position.
  • 10: Vehicle display system, 20: Image display unit (HUD device), 21: Display, 21a: Display surface, 22: Liquid crystal display panel, 24: Light source unit, 25: Relay optical system, 26: First mirror, 27: Second mirror, 28: Actuator, 29: Actuator, 30: Display control device, 31: I/O interface, 33: Processor, 35: Image processing circuit, 37: Memory, 40: Display light, 40p: Optical axis, 41: First display light, 42: Second image light, 43: Third image light, 80: Car navigation device, 90: Virtual image optical system, 100: Display area, 200: Eye box, 205: Center, 401: Vehicle ECU, 403: Road information database, 405: Own vehicle position detection unit, 407: Vehicle exterior sensor, 409: Operation detection unit, 411: Eye position detection unit, 415: Line-of-sight direction detection unit, 417: Mobile information terminal, 419: External communication device, 502: Approach image generation module, 504: Distance perception shortening condition determination module, 506: Notification necessity determination module, 508: Distance perception shortening processing module, 510: Distance perception extension condition determination module

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention pertains to properly directing an observer's visual attention to an image. A display control device 30 controls an image display device 20 that displays an image to an observer as a virtual image that overlaps the scenery in front of a vehicle, wherein a processor executes first display control in which an approaching image V is displayed so as to gradually appear larger in size as the distance L to an object in front of the vehicle becomes shorter, and also executes second display control in which: when a prescribed first condition is satisfied, the visibility of the approaching image V is reduced, the approaching image V ceases to be displayed, and/or the size of the approaching image (V) is reduced; and when a prescribed second condition is satisfied, the size of the approaching image V relative to the distance L is increased and/or the display distance is shortened to a greater extent than in the first display control.

Description

Display control device, image display device, and method

The present disclosure relates to a display control device, an image display device, and a method that are used in a vehicle and superimpose an image on the foreground of the vehicle for visual recognition.

As a conventional display device, Patent Document 1 discloses a device that displays an image visually recognized by a viewer (mainly the driver of the vehicle) as a virtual image overlapping the scenery in front of the vehicle. The display device disclosed in Patent Document 1 controls, according to the distance from the vehicle to a real object, the relative height and the like of an approach image that notifies the viewer of the real object.

Patent Document 1: Japanese Unexamined Patent Publication No. 2015-221633

If the technique disclosed in Patent Document 1 is used when a plurality of such approach images are displayed at the same time, the same display control according to the distance from the vehicle is performed for each approach image. However, there is room for improvement in effectively displaying those approach images that are assumed to have a higher priority for the viewer.

A summary of particular embodiments disclosed in this specification is given below. It should be understood that these aspects are presented solely to provide the reader with an overview of these particular embodiments and are not intended to limit the scope of this disclosure. Indeed, the present disclosure may encompass various aspects not described below.

The outline of the present disclosure relates to appropriately directing an observer's visual attention to an image. More specifically, it also relates to providing an image that is unlikely to obstruct the observer while still allowing the distance to an object to be recognized.

Therefore, the display control device described in the present specification is a display control device that controls an image display device that displays an image to an observer as a virtual image overlapping the scenery in front of a vehicle, the display control device comprising: one or more I/O interfaces capable of acquiring information; one or more processors; a memory 37; and one or more computer programs stored in the memory 37 and configured to be executed by the one or more processors 33. The one or more I/O interfaces 31 acquire a distance L to an object in front of the vehicle and vehicle information about the vehicle. The one or more processors 33 execute a normal control process that displays an approach image V visually recognized so as to gradually become larger as the distance L becomes shorter, determine whether a predetermined first condition is satisfied, and, when the predetermined first condition is satisfied, execute a distance perception shortening process that makes the approach image V larger with respect to the distance L and/or shortens the display distance compared with the normal control process. The distance perception shortening process includes at least a first distance perception shortening process and a second distance perception shortening process that makes the approach image V even larger with respect to the distance L and/or shortens the display distance even further compared with the first distance perception shortening process, and the distance perception shortening process to be executed is made to differ based on the vehicle information.

According to the present invention, it is possible to effectively display those approach images that are assumed to have a high priority.
FIG. 1 is a diagram showing an example of application of a vehicle display system to a vehicle.
FIG. 2 is a diagram showing the configuration of a head-up display device.
FIG. 3 is a block diagram of the vehicle display system.
FIGS. 4A and 4B are diagrams each showing an example of an approach image displayed by the image display device.
FIG. 5 is a diagram providing an explanation of control data for changing the size (display ratio) of an approach image according to the distance.
FIG. 6A is a diagram showing a display example of an image before the enlargement process is executed. FIG. 6B is a diagram showing a display example of an image when the enlargement process is executed, after the situation shown in FIG. 6A. FIG. 6C is a diagram showing a display example of an image after the situation shown in FIG. 6B, after the enlargement process has been executed.
FIG. 7A is a diagram explaining a display transition example of a plurality of approach images. FIG. 7B, following the situation of FIG. 7A, explains a display transition example of a plurality of approach images. FIG. 7C, following the situation of FIG. 7B, explains the enlargement process of an image.
FIG. 8 is a flow diagram showing the enlargement process.
FIGS. 9A, 9B, and 9C are diagrams for explaining modifications of the enlargement process.
FIGS. 10A, 10B, and 10C are diagrams for explaining modifications of the reference control data.
FIG. 11 is a flowchart showing the approach process.
FIG. 12A is a diagram explaining a display transition example of a plurality of approach images. FIG. 12B, following the situation of FIG. 12A, explains a display transition example of a plurality of approach images. FIG. 12C, following the situation of FIG. 12B, explains the approach process of an image.
FIG. 13A is a diagram showing a display example of an image before the approach process is executed. FIG. 13B is a diagram showing a display example of an image when the approach process is executed, after the situation shown in FIG. 13A. FIG. 13C is a diagram showing a display example of an image after the situation shown in FIG. 13B, after the approach process has been executed.
FIG. 14A is a flow diagram showing a method of executing the distance perception shortening process and the distance perception extension process according to some embodiments. FIG. 14B is a flow diagram following FIG. 14A.
FIG. 15 is a diagram for explaining the second distance perception shortening processing data.
FIG. 16A is a flow diagram showing a method of display control of an approach image, according to some embodiments, executed when other information having a higher notification priority than second information arises while an approach image related to the second information is being displayed. FIG. 16B is a flow diagram following FIG. 16A, and FIG. 16C is a flow diagram following FIG. 16B.
In each of FIGS. 17A to 17F, the left figure explains an example of the display transition of an image, and the corresponding right figure shows the display ratio of the image according to the distance; FIG. 17B follows the situation of FIG. 17A, FIG. 17C follows FIG. 17B, FIG. 17D follows FIG. 17C, FIG. 17E follows FIG. 17D, and FIG. 17F follows FIG. 17E.
In each of FIGS. 18A to 18C, the left figure explains an example of the display transition of an image, and the corresponding right figure shows the display ratio of the image according to the distance; FIG. 18B follows the situation of FIG. 18A, and FIG. 18C follows the situation of FIG. 17B.
 以下、図1ないし図16では、例示的な車両用表示システムの構成、及び動作の説明を提供する。なお、本発明は以下の実施形態(図面の内容も含む)によって限定されるものではない。下記の実施形態に変更(構成要素の削除も含む)を加えることができるのはもちろんである。また、以下の説明では、本発明の理解を容易にするために、公知の技術的事項の説明を適宜省略する。 Hereinafter, FIGS. 1 to 16 provide a description of the configuration and operation of an exemplary vehicle display system. The present invention is not limited to the following embodiments (including the contents of the drawings). Of course, changes (including deletion of components) can be made to the following embodiments. Further, in the following description, in order to facilitate understanding of the present invention, description of known technical matters will be omitted as appropriate.
 車両用表示システム10における画像表示部(画像表示装置)20は、車両1のダッシュボード5内に設けられたヘッドアップディスプレイ(HUD:Head-Up Display)装置である。画像表示部20は、表示光40をフロントウインドシールド2(被投影部の一例である)に向けて出射し、フロントウインドシールド2は、画像表示部20が表示する画像の表示光40をアイボックス200へ反射する。HUD装置20は、車両1に関する情報(以下、車両情報と言う。)だけでなく、車両情報以外の情報も統合的に車両1の乗員(前記乗員は、典型的には、車両1の運転者である。)に報知する。なお、車両情報は、車両1自体の情報だけでなく、車両1の運行に関連した車両1の外部の情報も含む。 The image display unit (image display device) 20 in the vehicle display system 10 is a head-up display (HUD: Head-Up Display) device provided in the dashboard 5 of the vehicle 1. The image display unit 20 emits the display light 40 toward the front windshield 2 (an example of the projected unit), and the front windshield 2 emits the display light 40 of the image displayed by the image display unit 20 to the eye box. Reflects to 200. The HUD device 20 integrates not only information about the vehicle 1 (hereinafter referred to as vehicle information) but also information other than the vehicle information as an occupant of the vehicle 1 (the occupant is typically a driver of the vehicle 1). Is notified. The vehicle information includes not only the information of the vehicle 1 itself but also the external information of the vehicle 1 related to the operation of the vehicle 1.
 図2は、本実施形態のHUD装置20の構成を示す図である。HUD装置20は、画像を表示する表示面21aを有する表示器21と、リレー光学系25と、を含む。 FIG. 2 is a diagram showing the configuration of the HUD device 20 of the present embodiment. The HUD device 20 includes a display 21 having a display surface 21a for displaying an image, and a relay optical system 25.
The display 21 of FIG. 2 is composed of a liquid crystal display panel 22 and a light source unit 24. The display surface 21a is the viewing-side surface of the liquid crystal display panel 22 and emits the display light 40 of the image. The angle of the display area 100 (including the tilt angle θt) can be set by setting the angle of the display surface 21a with respect to the optical axis 40p of the display light 40 traveling from the center of the display surface 21a toward the eyebox 200 (the center 205 of the eyebox 200) via the relay optical system 25 and the projected portion.

The relay optical system 25 is arranged on the optical path of the display light 40 emitted from the display 21 (the light traveling from the display 21 toward the eyebox 200), and is composed of one or more optical members that project the display light 40 from the display 21 onto the front windshield 2 outside the HUD device 20. The relay optical system 25 of FIG. 2 includes one concave first mirror 26 and one flat second mirror 27.

The first mirror 26 has, for example, a free-form surface shape having positive optical power. In other words, the first mirror 26 may have a curved surface shape whose optical power differs for each region, that is, the optical power added to the display light 40 may differ according to the region (optical path) through which the display light 40 passes. Specifically, the optical power added by the relay optical system 25 may differ among the first display light 41, the second image light 42, and the third image light 43 (see FIG. 2) traveling from the respective regions of the display surface 21a toward the eyebox 200.
The second mirror 27 is, for example, a flat mirror, but is not limited to this and may be a curved surface having optical power. That is, the relay optical system 25 may combine a plurality of mirrors (for example, the first mirror 26 and the second mirror 27 of the present embodiment) so that the added optical power differs according to the region (optical path) through which the display light 40 passes. The second mirror 27 may also be omitted; that is, the display light 40 emitted from the display 21 may be reflected by the first mirror 26 onto the projected portion (front windshield) 2.

Further, in the present embodiment, the relay optical system 25 includes two mirrors, but it is not limited to this; in addition to or instead of these, it may include one or more refractive optical members such as lenses, diffractive optical members such as holograms, reflective optical members, or a combination thereof.

Further, by virtue of this curved surface shape (an example of optical power), the relay optical system 25 of the present embodiment has a function of setting the distance to the display area 100 and a function of generating a virtual image obtained by enlarging the image displayed on the display surface 21a; in addition to these, it may have a function of suppressing (correcting) distortion of the virtual image that may occur due to the curved shape of the front windshield 2.

Further, actuators 28 and 29 controlled by the display control device 30 may be attached to the relay optical system 25 so that it is rotatable.
The liquid crystal display panel 22 receives light from the light source unit 24 and emits spatially modulated display light 40 toward the relay optical system 25 (the second mirror 27). The liquid crystal display panel 22 has, for example, a rectangular shape whose short side is the direction in which the pixels corresponding to the vertical direction (Y-axis direction) of the virtual image V seen from the observer are arranged. The observer visually recognizes the light transmitted through the liquid crystal display panel 22 via the virtual image optical system 90. The virtual image optical system 90 is the combination of the relay optical system 25 shown in FIG. 2 and the front windshield 2.

The light source unit 24 is composed of a light source (not shown) and an illumination optical system (not shown).

The light source (not shown) is, for example, a plurality of chip-type LEDs, and emits illumination light toward the liquid crystal display panel 22 (an example of a spatial light modulation element). The light source unit 24 is composed of, for example, four light sources arranged in a row along the long side of the liquid crystal display panel 22. The light source unit 24 emits illumination light toward the liquid crystal display panel 22 under the control of the display control device 30. The configuration of the light source unit 24 and the arrangement of the light sources are not limited to these.

The illumination optical system (not shown) is composed of, for example, one or more lenses (not shown) arranged in the emission direction of the illumination light of the light source unit 24, and a diffuser plate (not shown) arranged in the emission direction of the one or more lenses.

The light source unit 24 may be configured to be capable of local dimming, and the degree of illumination of each area of the display surface 21a may be changed under the control of the display control device 30. As a result, the light source unit 24 can adjust the brightness of the image displayed on the display surface 21a for each area.

The display 21 may be a self-luminous display, or may be a projection-type display that projects an image onto a screen. In this case, the display surface 21a is the screen of the projection-type display.
Based on the control of the display control device 30 described later, the image display unit (image display device) 20 displays an image near an object existing in the foreground, at a position overlapping the object, or at a position set with reference to the object, thereby allowing an observer (typically, an observer seated in the driver's seat of the vehicle 1) to perceive visual augmented reality (AR). The foreground is the real space (actual scene) visually recognized through the front windshield 2 of the vehicle 1, and such objects include the road surface of the traveling lane, branch roads, road signs, obstacles (pedestrians, bicycles, motorcycles, other vehicles, and the like), and features (buildings, bridges, and the like). In the description of the present embodiment, an image whose display position can be changed according to the position of a real object existing in the actual scene or of a virtual object described later is defined as an AR image, and an image whose display position is set regardless of the position of a real object is defined as a non-AR image.
FIG. 3 is a block diagram of the vehicle display system 10 according to some embodiments. The display control device 30 includes one or more I/O interfaces 31, one or more processors 33, one or more image processing circuits 35, and one or more memories 37. The various functional blocks described in FIG. 3 may be configured by hardware, software, or a combination of both. FIG. 3 is only one embodiment, and the illustrated components may be combined into fewer components, or there may be additional components. For example, the image processing circuit 35 (for example, a graphics processing unit) may be included in the one or more processors 33.

As illustrated, the processor 33 and the image processing circuit 35 are operably connected to the memory 37. More specifically, by executing a program stored in the memory 37, the processor 33 and the image processing circuit 35 can operate the vehicle display system 10 (image display unit 20), for example, by generating and/or transmitting image data. The processor 33 and/or the image processing circuit 35 can include at least one general-purpose microprocessor (for example, a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof. The memory 37 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, any type of semiconductor memory such as a volatile memory, and a non-volatile memory. The volatile memory may include DRAM and SRAM, and the non-volatile memory may include ROM and NVRAM.
As illustrated, the processor 33 is operably connected to the I/O interface 31. The I/O interface 31 communicates, for example, with the vehicle ECU 401 described later or with other electronic devices provided in the vehicle (reference numerals 403 to 419 described later) in accordance with the CAN (Controller Area Network) standard (also referred to as CAN communication). The communication standard adopted by the I/O interface 31 is not limited to CAN, and includes in-vehicle communication (internal communication) interfaces such as wired communication interfaces, for example CAN FD (CAN with Flexible Data Rate), LIN (Local Interconnect Network), Ethernet (registered trademark), MOST (Media Oriented Systems Transport; MOST is a registered trademark), UART, or USB, and short-range wireless communication interfaces with a range of several tens of meters, for example a personal area network (PAN) such as a Bluetooth (registered trademark) network or a local area network (LAN) such as an 802.11x Wi-Fi (registered trademark) network. The I/O interface 31 may also include an external communication interface to a wide area communication network (for example, the Internet) based on cellular communication standards such as a wireless wide area network (WWAN, IEEE 802.16-2004 (WiMAX: Worldwide Interoperability for Microwave Access)), IEEE 802.16e-based (Mobile WiMAX), 4G, 4G-LTE, LTE Advanced, or 5G.

As illustrated, by being interoperably connected to the I/O interface 31, the processor 33 can exchange information with various other electronic devices and the like connected to the vehicle display system 10 (I/O interface 31). To the I/O interface 31, for example, a vehicle ECU 401, a road information database 403, an own vehicle position detection unit 405, a vehicle exterior sensor 407, an operation detection unit 409, an eye position detection unit 411, an IMU 413, a line-of-sight direction detection unit 415, a portable information terminal 417, and an external communication device 419 are operably connected. The I/O interface 31 may include a function of processing (converting, computing, analyzing) information received from other electronic devices and the like connected to the vehicle display system 10.

The display 21 is operably connected to the processor 33 and the image processing circuit 35. Therefore, the image displayed by the image display unit 20 may be based on image data received from the processor 33 and/or the image processing circuit 35. The processor 33 and the image processing circuit 35 control the image displayed by the image display unit 20 based on the information acquired from the I/O interface 31. The configuration of the display control device 30 is arbitrary as long as it satisfies the functions described below.
The vehicle ECU 401 acquires, from sensors and switches provided in the vehicle 1, the state of the vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, brake pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, various warning states, attitude (including roll angle and/or pitching angle), and vehicle vibration (including the magnitude, occurrence frequency, and/or frequency of the vibration)), and collects and manages (and may also control) that state of the vehicle 1; as part of its functions, it can output a signal indicating a numerical value of the state of the vehicle 1 (for example, the vehicle speed of the vehicle 1) to the processor 33 of the display control device 30.

In addition to, or instead of, simply transmitting a numerical value detected by a sensor or the like (for example, a pitching angle of 3 [degree] in the forward-leaning direction) to the processor 33, the vehicle ECU 401 may transmit to the processor 33 a determination result based on one or more states of the vehicle 1 including the numerical values detected by the sensors (for example, that the vehicle 1 satisfies a predetermined condition of a forward-leaning state) and/or an analysis result (for example, that the vehicle has leaned forward due to braking, obtained in combination with the brake pedal opening information). For example, the vehicle ECU 401 may output to the display control device 30 a signal indicating a determination result that the vehicle 1 satisfies a predetermined condition stored in advance in a memory (not shown) of the vehicle ECU 401. The I/O interface 31 may also acquire the above-described information on the state of the vehicle 1 from the sensors and switches provided in the vehicle 1 without going through the vehicle ECU 401.

Further, the vehicle ECU 401 may output to the display control device 30 an instruction signal that designates the image to be displayed by the vehicle display system 10; in this case, the coordinates, size, type, and display mode of the image, the notification necessity of the image, and/or necessity-related information serving as a basis for determining the notification necessity may be added to the instruction signal and transmitted.
The road information database 403 is included in a navigation device (not shown) provided in the vehicle 1 or in an external server connected to the vehicle 1 via the external communication interface (I/O interface 31). Based on the position of the vehicle 1 acquired from the own vehicle position detection unit 405 described later, the road information database 403 may read out and transmit to the processor 33 information around the vehicle 1 (real-object-related information around the vehicle 1): road information on the road on which the vehicle 1 travels (lanes, white lines, stop lines, pedestrian crossings, road width, number of lanes, intersections, curves, branch roads, traffic regulations, and the like) and feature information (buildings, bridges, rivers, and the like), including their presence or absence, position (including the distance to the vehicle 1), direction, shape, type, and detailed information. Further, the road information database 403 may calculate an appropriate route (navigation information) from the departure point to the destination, and output a signal indicating the navigation information or image data indicating the route to the processor 33.

The own vehicle position detection unit 405 is a GNSS (Global Navigation Satellite System) receiver or the like provided in the vehicle 1, detects the current position and orientation of the vehicle 1, and outputs a signal indicating the detection result, via the processor 33 or directly, to the road information database 403, the portable information terminal 417 described later, and/or the external communication device 419. The road information database 403, the portable information terminal 417 described later, and/or the external communication device 419 may acquire the position information of the vehicle 1 from the own vehicle position detection unit 405 continuously, intermittently, or at every predetermined event, and may select and generate information around the vehicle 1 and output it to the processor 33.

The vehicle exterior sensor 407 detects real objects existing around the vehicle 1 (in front of, to the side of, and behind it). The real objects detected by the vehicle exterior sensor 407 may include, for example, obstacles (pedestrians, bicycles, motorcycles, other vehicles, and the like), the road surface of the traveling lane described later, lane markings, roadside objects, and/or features (buildings and the like). The vehicle exterior sensor is composed of, for example, a detection unit consisting of a radar sensor such as a millimeter-wave radar, ultrasonic radar, or laser radar, a camera, or a combination thereof, and a processing device that processes (data-fuses) the detection data from the one or more detection units. Conventional, well-known methods are applied to object detection by these radar sensors and camera sensors. Through object detection by these sensors, the presence or absence of a real object in three-dimensional space and, if a real object exists, its position (the relative distance from the vehicle 1, and its left-right and up-down position with the traveling direction of the vehicle 1 taken as the front-rear direction), size (in the lateral (left-right) direction, the height (up-down) direction, and so on), movement direction (lateral (left-right) and depth (front-rear) directions), movement speed (lateral (left-right) and depth (front-rear) directions), and/or type may be detected. The one or more vehicle exterior sensors 407 can detect real objects in front of the vehicle 1 in each detection cycle of each sensor and output real object information (an example of real-object-related information: the presence or absence of a real object and, if a real object exists, the position, size, and/or type of each real object) to the processor 33. This real object information may be transmitted to the processor 33 via another device (for example, the vehicle ECU 401). Further, when a camera is used as the sensor, an infrared camera or near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night. Further, when a camera is used as the sensor, a stereo camera capable of also acquiring distance and the like by parallax is desirable.
The operation detection unit 409 is, for example, a hardware switch provided on the CID (Center Information Display) or instrument panel of the vehicle 1, or a software switch combining an image and a touch sensor, and outputs to the processor 33 operation information based on operations by an occupant of the vehicle 1 (the user seated in the driver's seat and/or the user seated in the passenger seat). For example, in response to user operations, the operation detection unit 409 outputs to the processor 33 display area setting information based on an operation for moving the display area 100, eyebox setting information based on an operation for moving the eyebox 200, information based on an operation for setting the observer's eye position 700, and the like.

The eye position detection unit 411 includes a camera such as an infrared camera that detects the eye position 700 (see FIG. 1) of the observer seated in the driver's seat of the vehicle 1, and may output the captured image to the processor 33. The processor 33 may acquire the captured image (an example of information from which the eye position 700 can be estimated) from the eye position detection unit 411 and analyze the captured image by a technique such as pattern matching to detect the coordinates of the observer's eye position 700.

Further, the eye position detection unit 411 may output to the processor 33 an analysis result obtained by analyzing the image captured by the camera (for example, a signal indicating to which of the spatial regions corresponding to a plurality of preset display parameters the observer's eye position 700 belongs). The method of acquiring the eye position 700 of the observer of the vehicle 1, or information from which the observer's eye position 700 can be estimated, is not limited to these, and such information may be acquired using a known eye position detection (estimation) technique.

Further, the eye position detection unit 411 may detect the movement speed and/or movement direction of the observer's eye position 700 and output a signal indicating the movement speed and/or movement direction of the observer's eye position 700 to the processor 33.
The IMU 413 can include one sensor or a combination of sensors (for example, an accelerometer and a gyroscope) configured to detect the position and orientation of the vehicle 1 and changes thereof (change speed and change acceleration) based on inertial acceleration. The IMU 413 may output to the processor 33 the detected values (the detected values include signals indicating the position and orientation of the vehicle 1 and changes thereof (change speed and change acceleration)) and the result of analyzing the detected values. The analysis result is, for example, a signal indicating a determination result of whether the detected values satisfy a predetermined condition, and may be, for example, a signal indicating that the behavior (vibration) of the vehicle 1 is small, derived from values relating to changes (change speed and change acceleration) in the position or orientation of the vehicle 1.

The line-of-sight direction detection unit 415 includes an infrared camera or a visible-light camera that images the face of the observer seated in the driver's seat of the vehicle 1, and may output the captured image to the processor 33. The processor 33 can acquire the captured image (an example of information from which the line-of-sight direction can be estimated) from the line-of-sight direction detection unit 415 and analyze the captured image to identify the observer's line-of-sight direction (and/or the gaze position). Alternatively, the line-of-sight direction detection unit 415 may analyze the image captured by the camera and output to the processor 33 a signal indicating the observer's line-of-sight direction (and/or the gaze position) as the analysis result. The method of acquiring information from which the line-of-sight direction of the observer of the vehicle 1 can be estimated is not limited to these, and the information may be acquired using other known line-of-sight direction detection (estimation) techniques such as the EOG (electro-oculogram) method, the corneal reflection method, the scleral reflection method, the Purkinje image detection method, the search coil method, or the infrared fundus camera method.

The portable information terminal 417 is a smartphone, a laptop computer, a smartwatch, or another information device that can be carried by the observer (or another occupant of the vehicle 1). By pairing with the portable information terminal 417, the I/O interface 31 can communicate with the portable information terminal 417 and acquires data recorded in the portable information terminal 417 (or in a server accessed through the portable information terminal). The portable information terminal 417 may have, for example, the same functions as the road information database 403 and the own vehicle position detection unit 405 described above, acquire the road information (an example of real-object-related information), and transmit it to the processor 33. Further, the portable information terminal 417 may acquire commercial information related to commercial facilities in the vicinity of the vehicle 1 (an example of real-object-related information) and transmit it to the processor 33. The portable information terminal 417 may also transmit schedule information of its owner (for example, the observer), incoming call information on the portable information terminal 417, mail reception information, and the like to the processor 33, and the processor 33 and the image processing circuit 35 may generate and/or transmit image data relating to these.

The external communication device 419 is a communication device that exchanges information with the vehicle 1, and is, for example, another vehicle connected to the vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a portable information terminal carried by the pedestrian) connected by vehicle-to-pedestrian communication (V2P: Vehicle To Pedestrian), or a network communication device connected by road-to-vehicle communication (V2I: Vehicle To roadside Infrastructure); in a broad sense, it includes everything connected by communication with the vehicle 1 (V2X: Vehicle To Everything). The external communication device 419 may acquire, for example, the positions of pedestrians, bicycles, motorcycles, other vehicles (such as a preceding vehicle), road surfaces, lane markings, roadside objects, and/or features (such as buildings) and output them to the processor 33. Further, the external communication device 419 may have the same function as the own vehicle position detection unit 405 described above, acquire the position information of the vehicle 1, and transmit it to the processor 33, and may further have the function of the road information database 403 described above, acquire the road information (an example of real-object-related information), and transmit it to the processor 33. The information acquired from the external communication device 419 is not limited to the above.
(Function and control of the approach image)
Here, the function and control of the approach image will be described. The approach image V displayed within the display area of the virtual image is an image that notifies the observer of an object in front of the vehicle 1. The approach image can broadly be displayed in an "emphasis mode" that emphasizes a real object, in a "visualization mode" that visualizes information (a virtual object) not visible in the forward scenery, and in a "presentation mode" that presents information from the vehicle 1 side.
Here, an "object" includes a perceivable "real object" such as an actually existing obstacle (a pedestrian, bicycle, motorcycle, another vehicle, or the like), a feature (a building, bridge, or the like), or an on-road object (a branch road, road sign, or the like), and an imperceptible (or hard-to-perceive) "virtual object". A virtual object includes various kinds of information tied to position information, such as a switching point between an ordinary road and an expressway, the number of lanes, a speed limit, the name of a road, road congestion status, and commercial information related to commercial facilities. When displaying in the "presentation mode" that presents information, the vehicle 1 can set a virtual object at a predetermined position in the foreground. Specifically, for example, when starting display in the "presentation mode", the vehicle 1 can set a virtual object 50 [meter] ahead and adjust the size of the image displayed in the "presentation mode" according to the distance between the set virtual object and the vehicle 1.
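As a minimal illustrative sketch of the idea described above, a presentation-mode virtual object could be anchored 50 m ahead of the vehicle and the remaining distance tracked as the vehicle advances; the class and field names below are assumptions for illustration and are not taken from the patent.

```python
# Minimal sketch (assumption-labelled): anchoring a presentation-mode virtual
# object 50 m ahead and tracking the remaining distance L as the vehicle moves.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    anchor_distance_m: float    # distance ahead of the vehicle when the object is set
    anchored_odometer_m: float  # vehicle odometer value at the moment of anchoring

    def remaining_distance(self, odometer_m: float) -> float:
        """Distance L from the vehicle to the virtual object, clamped at zero."""
        traveled = odometer_m - self.anchored_odometer_m
        return max(self.anchor_distance_m - traveled, 0.0)

# Example: the object is set 50 m ahead; after driving 20 m, L is 30 m.
obj = VirtualObject(anchor_distance_m=50.0, anchored_odometer_m=1234.0)
print(obj.remaining_distance(odometer_m=1254.0))  # -> 30.0
```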
The approach image in the emphasis mode is displayed at a position corresponding to a real object existing in front of the vehicle 1, such as a preceding vehicle, a pedestrian, a road sign, or a building. The approach image in the emphasis mode is displayed, for example, superimposed on the real object or in the vicinity of the real object, and emphasizes and notifies the observer of the existence of that real object. That is, the "position corresponding to the real object" as the display position of the emphasis-mode approach image is not limited to a position visually recognized as superimposed on the real object, and may be a position in the vicinity of the real object. The emphasis mode may be any mode as long as it does not hinder the visual recognition of the real object.

The approach image in the visualization mode is, for example, an image imitating a road sign for notifying various kinds of information at a specific location where no road sign is installed (an example of a virtual object), or an image indicating a POI (Point of Interest) such as a destination or a facility set in advance by the observer with the car navigation device 80 or the like. The road sign here includes all kinds of signs such as guide signs, warning signs, regulatory signs, and instruction signs. That is, the approach images displayable in this embodiment include images imitating road signs. FIGS. 4A and 4B show examples of visualization-mode approach images imitating road signs. The approach image shown in FIG. 4A is an image imitating a guide sign. The approach image shown in FIG. 4B is an image imitating a regulatory sign indicating a speed limit. The visualization-mode approach image may also include a navigation image, such as an arrow shape, that guides the route of the vehicle 1.

The approach image in the presentation mode is an image showing various kinds of information presented from the vehicle 1 (vehicle ECU 401), for example, driving advice, information on driving support in the vehicle 1, and information indicating the next song title in the audio system, presented by the vehicle ECU 401.
The display control device 30 controls the display of the approach image so that the farther the distance L from the vehicle 1 to the object, the smaller the display size of the approach image (in other words, the shorter the distance L, the larger the display size of the approach image). This makes it possible to give the approach image a pseudo sense of perspective. The basic change in the size of the approach image according to the distance L is preferably continuous, but is not limited to this and may be intermittent.

Further, in some embodiments, the image display unit 20 is configured as a depth perception display device capable of adjusting the distance at which the approach image is displayed (the display distance), and the display control device 30 may control the display of the approach image so that the farther the distance L from the vehicle 1 to the object, the longer the perceived distance of the approach image (conversely, the shorter the distance L, the shorter the display distance of the approach image). This makes it possible to give the approach image a pseudo sense of perspective. The depth perception display device includes known 3D display devices such as a light field display, a multi-view display, and a binocular parallax display. The depth perception display device may also include a depth 2D display device in which the virtual image display area 100 is formed two-dimensionally as a plane or a curved surface, the tilt angle θt is set to 90 [degree] or less (preferably 50 [degree] or less, more preferably near 0 [degree]), and the imaging distance is changed by changing the display position of the virtual image V within the virtual image display area 100.
The display control device 30 acquires the distance L to the object based on information acquired via the I/O interface 31 from at least one of the road information database 403 (including the car navigation system), the vehicle exterior sensor 407, the portable information terminal 417, and the external communication device 419 (hereinafter also referred to as information sources). The display control device 30 may acquire information indicating the distance L from an information source, or may acquire coordinate information of the object from an information source and calculate the distance L based on the coordinate information. In particular, the distance L to the object notified by a visualization-mode approach image can be acquired or calculated as the distance to a representative position of the virtual object specified based on information from the road information database 403, the portable information terminal 417, and the external communication device 419 (the representative position includes, for example, the start position of a regulated section or a position predetermined as suitable for road guidance). The display control device 30 executes display control that gradually increases the size of the approach image (or, in addition, shortens the display distance of the approach image) based on the shortening of the acquired or calculated distance L; however, the "distance L" does not necessarily have to be acquired or calculated, and the size of the approach image may be adjusted based on information from which the shortening of the distance L can be estimated. Information from which the shortening of the distance L between the object and the vehicle 1 can be estimated may include (1) the combination of the coordinate position of the object and the coordinate position of the vehicle 1, (2) the mileage of the vehicle 1, and the like.
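As a minimal sketch of one of the options mentioned above, the distance L could be derived from the object's and the vehicle's coordinate positions when L itself is not supplied by an information source; the function name and the planar-coordinate assumption are illustrative and not part of the patent.

```python
# Minimal sketch (assumption-labelled): computing the distance L from the
# coordinate positions of the object and the vehicle.
import math

def distance_L(obj_xy: tuple[float, float], vehicle_xy: tuple[float, float]) -> float:
    """Planar distance L [m] between the object and the vehicle."""
    dx = obj_xy[0] - vehicle_xy[0]
    dy = obj_xy[1] - vehicle_xy[1]
    return math.hypot(dx, dy)

# Example: an object 40 m ahead and 3 m to the side -> L is roughly 40.1 m.
print(round(distance_L((40.0, 3.0), (0.0, 0.0)), 1))
```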
The display control device 30 stores in advance in the memory 37 control data for controlling the size of the approach image (or, in addition, the display distance) based on the distance L (or information from which the distance L can be estimated) acquired or calculated as described above. The control data includes control data for controlling the size of the approach image (or, in addition, control data for controlling the display distance).

First, the control data for controlling the size of the approach image will be described. The control data for controlling the size of the approach image includes reference control data CDs. The display control device 30 determines the display ratio M based on the acquired or calculated distance L (or information from which the distance L can be estimated) and the reference control data CDs, and displays the approach image at the determined display ratio M. The reference control data CDs include data of a reference function Fs (a mathematical expression) indicating the relationship between the distance L and the display ratio M. The reference function Fs is, for example, an inverse proportion that can be expressed as M = α/L, as shown in FIG. 5. The constant α is determined, for example, by a predetermined distance from the virtual plane to the eyebox 200 and by a reference size described later. The reference function Fs may also be expressed as M = arctan(α).
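As a minimal sketch of the inverse-proportional reference function Fs described above (M = α/L), the display ratio for a given distance could be computed as follows; the value of α and the lower bound on L are illustrative assumptions, not values from the patent.

```python
# Minimal sketch (assumption-labelled): the reference function Fs, M = alpha / L,
# mapping the distance L to a display ratio M.
def display_ratio_Fs(distance_L: float, alpha: float = 50.0) -> float:
    """Reference function Fs: display ratio M for an object at distance L [m]."""
    distance_L = max(distance_L, 1e-3)  # guard against division by zero
    return alpha / distance_L

# Example: with alpha = 50, an object at 100 m gives M = 0.5,
# at 50 m gives M = 1.0, and at 25 m gives M = 2.0.
for L in (100.0, 50.0, 25.0):
    print(L, display_ratio_Fs(L))
```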
The display ratio M is the display ratio of the approach image with respect to the reference size; if the size to be visually recognized by the observer is S and the reference size is S0, it is expressed as S/S0. The reference size is stored in advance in the memory 37 as the size of the approach image at a predetermined distance L0. How the predetermined distance L0 is set is arbitrary.

The reference size is predetermined according to the type of approach image. FIG. 4A shows, as a rectangular approach image, an approach image of the reference size with a horizontal length A and a vertical length B. FIG. 4B shows, as a circular approach image, an approach image of the reference size with a horizontal length C and a vertical length D. When the approach image is a perfect circle, C = D, and when it is an ellipse, C ≠ D. The length referred to here is the length visually recognized by the observer and can be defined, for example, by the number of pixels and by the magnification of the virtual image optical system 90 for each partial region of the display surface 21a.

The reference size may also be adjusted according to the size of the object with which the notification image is associated. For example, based on the size of the real object, the display control device 30 may change the reference size so that, as seen from the observer, it is approximately the same as or larger than the size of the real object.

For example, when the display ratio M determined by the display control device 30 is "2", the approach image shown in FIG. 4A is enlarged from the reference size and visually recognized by the observer at a size with a horizontal length of 2A and a vertical length of 2B. When the display ratio M determined by the display control device 30 is "1/2", the approach image shown in FIG. 4B is reduced from the reference size and displayed at a size with a horizontal length of 1/2 C and a vertical length of 1/2 D. As described above, the enlargement or reduction of the approach image is executed while maintaining the aspect ratio of the reference-size approach image. The shape of the approach image is not limited to a rectangle or a circle, and may be another shape such as a triangle or a polygon; the concept of enlargement or reduction is the same for such other shapes. The composition of the approach image is arbitrary as long as it can convey information about the real object, and may be, for example, characters, symbols, figures, icons, or a combination thereof.
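As a minimal sketch of the aspect-ratio-preserving scaling just described (S = M · S0), the perceived width and height of an approach image at display ratio M could be computed as follows; the concrete reference dimensions are illustrative assumptions.

```python
# Minimal sketch (assumption-labelled): scaling an approach image of reference
# size (width ref_w, height ref_h at distance L0) by the display ratio M = S / S0
# while keeping the aspect ratio.
def scaled_size(ref_w: float, ref_h: float, M: float) -> tuple[float, float]:
    """Perceived width and height of the approach image at display ratio M."""
    return ref_w * M, ref_h * M

# Example corresponding to the prose: a rectangular A x B image at M = 2
# becomes 2A x 2B; a circular C x D image at M = 1/2 becomes C/2 x D/2.
print(scaled_size(ref_w=1.0, ref_h=0.6, M=2.0))  # (2.0, 1.2)
print(scaled_size(ref_w=0.8, ref_h=0.8, M=0.5))  # (0.4, 0.4)
```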
As described above, the display control device 30 controls the display of the approach image. When there are a plurality of objects, the display control device 30 can display a plurality of approach images within the same period (simultaneously). When the display ratio M of an approach image determined based on the reference control data CDs becomes equal to or greater than a predetermined threshold Mth, the approach image may be erased. By doing so, when the vehicle 1 approaches a predetermined real object, it is possible to prevent the approach image notifying the observer of that real object from being displayed too large and annoying the observer. The predetermined threshold Mth will be described later.

Here, among the plurality of approach images, the one notifying the first object whose distance L from the vehicle 1 is the shortest is referred to as a first approach image V1, the one notifying a second object whose distance L is longer than that of the first approach image V1 is referred to as a second approach image V2, and the one notifying a third object whose distance L is even longer than that of the second approach image V2 is referred to as a third approach image V3. Under the control based on the reference control data CDs (hereinafter referred to as normal control), as shown in FIG. 6A, these approach images are, in descending order of display size, the first approach image V1, the second approach image V2, and the third approach image V3. As the vehicle 1 travels forward, the distance L to each object gradually becomes shorter, so the display sizes of the first approach image V1, the second approach image V2, and the third approach image V3 gradually become larger. Explained using the reference function Fs shown in FIG. 7A, this corresponds to the points representing the first approach image V1, the second approach image V2, and the third approach image V3 on the curve of the reference function Fs moving to the left. However, with the normal control alone, depending on the distance L, the second approach image V2 displayed following the first approach image V1 may become too small to be easily visually recognized. To solve this, the display control device 30 executes a distance perception shortening process described later.
(Enlargement process)
Based on the reference control data CDs, the display control device 30 performs display control that increases the size of the approach image V as the distance L to the object becomes shorter (an example of the normal control). Then, when an attraction condition described later is satisfied during this normal control, the display control device 30 executes an enlargement process (an example of the distance perception shortening process).
FIG. 8 is a flowchart showing the enlargement process. First, the display control device 30 determines whether or not the display ratio of the first approach image V1 is equal to or greater than a first threshold MT1 (block S1). The first threshold MT1 is predetermined and stored in the memory 37 as a display ratio corresponding to the distance L at which the object is sufficiently close to the vehicle 1 and is expected to pass by soon.

When the display ratio of the first approach image V1 is equal to or greater than the first threshold MT1 (block S1; Yes), the display control device 30 executes the process of block S2. On the other hand, when the display ratio of the first approach image V1 is less than the first threshold MT1 (block S1; No), the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the first approach image V1 (block S3).

When the observer is not visually recognizing the first approach image V1 (block S3; No), the display control device 30 ends the distance perception shortening process; when the observer is visually recognizing the first approach image V1 (block S3; Yes), it executes the process of block S2. In this way, even when the display ratio of the first approach image V1 is less than the first threshold MT1, if the observer has recognized the first approach image V1 and the notification necessity of the first approach image V1 is assumed to have already decreased, the enlargement process (an example of the distance perception shortening process) that secures the visibility of the second approach image V2, as described later, can be executed.

In block S2, the display control device 30 determines whether or not an erasing condition for the first approach image V1 is satisfied. For example, when the display ratio of the first approach image V1 becomes equal to or greater than the predetermined threshold Mth, the display control device 30 determines that the erasing condition is satisfied (block S2; Yes) and erases the first approach image V1 from within the display range of the virtual image V (block S4). The predetermined threshold Mth is predetermined, for example, as a value larger than the first threshold MT1 and is stored in the memory 37. After executing block S4, or when the erasing condition for the first approach image V1 is not satisfied (block S2; No), the display control device 30 executes the process of block S5.

In block S5, the display control device 30 determines whether or not the display ratio of the second approach image V2 is less than a second threshold MT2. The second threshold MT2 is a value smaller than the first threshold MT1, and is predetermined and stored in the memory 37 as a display ratio corresponding to the distance L at which the object is far from the vehicle 1 and the second approach image V2 indicating that object is expected to be difficult for the observer to visually recognize.

When the display ratio of the second approach image V2 is equal to or greater than the second threshold MT2 (block S5; No), the display control device 30 ends the distance perception shortening process. In this case, the second approach image V2 is displayed under the normal control based on the reference control data CDs, that is, according to the reference function Fs. On the other hand, when the display ratio of the second approach image V2 is less than the second threshold MT2 (block S5; Yes), the display control device 30 executes the enlargement process on the second approach image V2 (block S6).
 ブロックS6の拡大処理で、表示制御装置30は、第2の接近画像V2を基準制御データCDsに基づいて決定される表示比率MR(図7B及び図7C参照)よりも大きい比率で表示する。拡大処理において、表示制御装置30は、例えば図7Cに示すように、基準制御データCDsに基づく基準関数Fs(M=α/L)を表示比率Mが大きくなるように比例定数を所定量βだけ増加させた後述する第1距離知覚短縮処理データCDtに基づく第2関数Ft(M=(α+β)/L)を用い、第2の接近画像V2の表示制御を行う。 In the enlargement processing of the block S6, the display control device 30 displays the second approach image V2 at a ratio larger than the display ratio MR (see FIGS. 7B and 7C) determined based on the reference control data CDs. In the enlargement processing, as shown in FIG. 7C, for example, the display control device 30 sets the proportional constant of the reference function Fs (M = α / L) based on the reference control data CDs by a predetermined amount β so that the display ratio M becomes large. The display control of the second close-up image V2 is performed by using the second function Ft (M = (α + β) / L) based on the increased first distance perception shortening processing data CDt described later.
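 As a purely illustrative sketch, the relationship between the reference function Fs and the second function Ft described above could be expressed as follows; the function names and the constant values ALPHA and BETA are assumptions introduced only for illustration and are not taken from the embodiment.

    # Illustrative sketch of the display-ratio functions described above.
    # ALPHA and BETA are hypothetical values chosen only for illustration.
    ALPHA = 100.0   # proportionality constant of the reference function Fs
    BETA = 40.0     # predetermined increase beta used by the second function Ft

    def reference_ratio(distance_l: float) -> float:
        # Reference function Fs: M = alpha / L (normal control)
        return ALPHA / distance_l

    def enlarged_ratio(distance_l: float) -> float:
        # Second function Ft: M = (alpha + beta) / L (enlargement process)
        return (ALPHA + BETA) / distance_l

 For any distance L, enlarged_ratio(L) exceeds reference_ratio(L), which corresponds to the second approach image V2 being drawn larger during the enlargement process than under normal control.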
 FIGS. 6A to 6C show an example of image transitions when the enlargement process is executed. When the enlargement process is executed, the second approach image V2, which had been displayed under normal control as shown in FIG. 6A, is enlarged as shown in FIG. 6B. By executing the enlargement process (an example of the distance perception shortening process) on the second approach image V2 in this way, the second approach image V2, whose need for notification is expected to become higher than that of the first approach image V1 because the notification target of the first approach image V1 will soon pass by, can be displayed so as to be easy to see.
 Further, when executing the enlargement process, the display control device 30 may display the second approach image V2 higher, as seen from the observer, than before the process was executed. In this way, the observer can be allowed to keep the field of view necessary for driving. Such control of moving the second approach image V2 upward together with the enlargement process is particularly useful when the second approach image V2 is an approach image visualized so as to imitate a road sign. This is because an actual road sign is normally seen above the road surface ahead, so the observer can view the second approach image V2 as if looking at an actual road sign.
 Further, when a third approach image V3 that notifies the observer of an object whose distance L is farther than that of the second approach image V2 is being displayed, the display control device 30 continues displaying the third approach image V3 under the control based on the reference control data CDs even while the enlargement process is being executed on the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose need for notification is at present higher than that of the third approach image V3, can be displayed effectively. When there are four or more approach images at the time the distance perception shortening process is executed, the approach images displayed smaller than the third approach image V3 (that is, images notifying objects whose distance L is farther than the notification target of the third approach image V3) also continue to be displayed under the control based on the reference control data CDs.
 Note that, in block S1 or S2, the display control device 30 may determine whether or not a part of the first approach image V1 is outside the display area of the virtual image V as shown in FIG. 6B, and may execute the enlargement process when it is outside and when block S5 is determined to be Yes. Even in this case, the second approach image V2, whose need for notification is expected to become higher than that of the first approach image V1 because the notification target of the first approach image V1 will soon pass by, can be displayed so as to be easy to see.
 Subsequently, the display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the enlargement process (block S7). When the predetermined period has elapsed (block S7; Yes), the display control device 30 executes a return process (an example of the distance perception extension process described later) that returns the display control of the second approach image V2 to the control based on the reference control data CDs (block S8), and ends the distance perception shortening process.
 FIGS. 6B to 6C show an example of image transitions when the return process is executed. When the return process is executed, the second approach image V2, which had been displayed enlarged as shown in FIG. 6B, is displayed under the control based on the reference control data CDs as shown in FIG. 6C.
 Although FIG. 6C shows an example in which the display ratio changes in the M direction when the enlargement process or the return process is executed, the display ratio may be changed gradually in order to suppress a sudden change in the size of the second approach image V2. When the display ratio is changed gradually in this way, the point (L, M) representing the second approach image V2 moves diagonally upward to the left with respect to the M direction from the reference function Fs to the second function Ft when the enlargement process is executed, and moves diagonally downward to the left with respect to the M direction from the second function Ft to the reference function Fs when the return process is executed.
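 As a minimal sketch of such a gradual change, a simple time-based blend between the reference function Fs and the second function Ft could be used; the function name, the transition time, and the constants below are assumptions introduced only for illustration.

    def blended_ratio(distance_l: float, elapsed_s: float, transition_s: float = 0.5,
                      alpha: float = 100.0, beta: float = 40.0) -> float:
        # Blend gradually from the reference function Fs toward the second
        # function Ft over transition_s seconds to avoid a sudden size change.
        w = min(max(elapsed_s / transition_s, 0.0), 1.0)   # 0 = pure Fs, 1 = pure Ft
        fs = alpha / distance_l            # reference function Fs
        ft = (alpha + beta) / distance_l   # second function Ft
        return (1.0 - w) * fs + w * ft

 Reversing the roles of fs and ft in the blend would give the corresponding gradual return process.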
 When the predetermined period has not elapsed from the start of execution of the enlargement process (block S7; No), the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed enlarged (block S9).
 When the observer is not visually recognizing the second approach image V2 (block S9; No), the display control device 30 ends the distance perception shortening process, whereas when the observer is visually recognizing the second approach image V2 (block S9; Yes), it executes the return process of block S8. In this way, when it is assumed that the observer has recognized the second approach image V2 being displayed enlarged, the display control of the second approach image V2 can be returned to the control based on the reference control data CDs, and the second approach image V2, whose need for notification has already decreased for the observer, can be prevented from continuing to be enlarged. This concludes the description of the distance perception shortening process.
 In the enlargement process (an example of the distance perception shortening process) described above, an example of controlling the second approach image V2 based on the second function Ft was shown, but the process is not limited to this. For example, as shown in FIG. 9A, when the enlargement process (distance perception shortening process) is executed, the display of the second approach image V2 may be controlled at a constant display ratio M independent of the distance L. However, when the display of the second approach image V2 is controlled at a constant display ratio M during the enlargement process, it is rare that, as ideally shown in FIG. 9A, the point (L, M) representing the second approach image V2 is located on the reference function Fs at the time the return process starts. Therefore, as shown in FIG. 9B, from the start of the return process, the display of the second approach image V2 may be controlled using a return reference function Fsb that passes through the point representing the second approach image V2 at that start time and is obtained by reducing the proportionality constant of the reference function Fs by a predetermined value. The control using the return reference function Fsb can also be regarded as an example of controlling the display ratio of the second approach image V2 based on the reference control data CDs. Further, as shown in FIG. 9C, the display ratio, which is kept constant from the execution of the enlargement process until the start of the return process, may be varied based on the reference function Fs from that start time onward. In these modifications as well, the display ratio may be changed gradually when the enlargement process or the return process is executed, as described above.
 Note that, as shown in FIG. 9A, when the display of the second approach image V2 is controlled at a constant display ratio M independent of the distance L during the enlargement process, the return process may be started at the point in time when that constant display ratio M intersects the reference function Fs. In other words, the predetermined period from when the enlargement process is executed on the second approach image V2 until the return process is started need not be a period determined in advance.
 Further, in the above, an example was shown in which the second function Ft is the reference function Fs with its proportionality constant increased and the return reference function Fsb is the reference function Fs with its proportionality constant decreased, but these are not limiting. For example, the second function Ft may be the reference function Fs (M = α/L) translated in the M direction by +δs (δs is a constant), that is, M = α/L + δs. Likewise, the return reference function Fsb may be the reference function Fs (M = α/L) translated in the M direction by -δb (δb is a constant), that is, M = α/L - δb.
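 As a sketch of two of these variants, the constant-ratio enlargement of FIG. 9A and a return reference function Fsb that passes through the point at which the return process starts could be written as follows; all names and values are assumptions, and deriving the reduced constant from the start point is only one possible way of realizing Fsb.

    def constant_enlarged_ratio(m_fixed: float = 0.6) -> float:
        # FIG. 9A variant: a fixed display ratio M independent of the distance L.
        return m_fixed

    def return_reference_ratio(distance_l: float, l_at_return: float,
                               m_at_return: float) -> float:
        # FIG. 9B variant: a function Fsb of the form M = alpha_b / L, where the
        # reduced constant alpha_b is chosen so that the curve passes through
        # the point (l_at_return, m_at_return) at which the return process starts.
        alpha_b = m_at_return * l_at_return
        return alpha_b / distance_l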
 Further, in the above, an example was shown in which the reference function Fs is an inverse proportion, but the curve may be approximated so that the reference function Fs represents a straight line expressed by a linear function with a constant slope (M = αL + β), for example, as shown in FIG. 10A.
 The reference function Fs may also be an algebraic function, including a polynomial function, or an elementary function that represents a curve. The reference function Fs may also be composed of a plurality of linear functions with different slopes. For example, as shown in FIG. 10B, the reference function Fs may represent a curve in which, as the distance L becomes larger, the display ratio M becomes larger than it would be under a directly proportional relationship. Further, as shown in FIG. 10B, the reference function Fs may be represented by a plurality of straight lines, and the plurality of straight lines may include a straight line whose slope is gentler at or beyond a predetermined distance L than below that distance. The examples of the reference function Fs shown in FIGS. 10B and 10C are useful, for example, when distant scenery becomes difficult to see in rain or heavy fog, or when the approach image is difficult to see because of strong sunlight.
 Further, in the above, an example was shown in which the reference control data CDs are data representing the mathematical expression of the reference function Fs, but the reference control data CDs may be table data configured so that the display ratio M can be determined according to the acquired distance L. In other words, the reference control data CDs may be data representing a mathematical expression or may be table data, and their configuration is arbitrary, as long as they can determine, according to the acquired distance L, a display ratio M that becomes smaller as the distance L becomes longer (in other words, a display ratio M that becomes larger as the distance L becomes shorter).
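 As a hypothetical sketch of the table-data form of the reference control data CDs, the display ratio could be looked up as follows; the breakpoints and values are invented purely for illustration and satisfy only the stated property that M decreases as L increases.

    # Hypothetical table form of the reference control data CDs:
    # pairs of (distance L in meters, display ratio M), ordered by distance.
    REFERENCE_TABLE = [(10.0, 1.00), (25.0, 0.60), (50.0, 0.35), (100.0, 0.20), (200.0, 0.10)]

    def table_ratio(distance_l: float) -> float:
        # Return the display ratio for the first table entry whose distance is
        # not exceeded; beyond the last entry, keep the smallest ratio.
        for dist, ratio in REFERENCE_TABLE:
            if distance_l <= dist:
                return ratio
        return REFERENCE_TABLE[-1][1]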
 Further, in the above, the display control device 30 determined that the erasing condition was satisfied when the display ratio of the first approach image V1 became equal to or greater than the predetermined threshold Mth (block S2 of the distance perception shortening process shown in FIG. 8), but this is not limiting. For example, the display control device 30 may determine that the erasing condition is satisfied when the object being visually recognized by the observer is the first approach image V1 (block S2 of the distance perception shortening process shown in FIG. 8; Yes).
 Further, as long as the second approach image V2 can be displayed effectively, it goes without saying that, in the distance perception shortening process described above, some processes may be omitted and predetermined processes may be changed or added. For example, at least one of block S3, blocks S2 and S4, and block S9 of the distance perception shortening process may be omitted. The process of block S5 may also be omitted in the distance perception shortening process.
(Approach processing)
 First, the control data for controlling the display distance will be described. The control data for controlling the display distance of the approach image include the reference control data CDs. The display control device 30 determines the display distance P based on the acquired or calculated distance L (or information from which the distance L can be estimated) and the reference control data CDs, and displays the approach image at the determined display distance P. The reference control data CDs include data of a reference function Fs (a mathematical expression) representing the relationship between the distance L and the display distance P. The reference function Fs can be expressed, for example, as P = L - α, as shown in FIG. 5. The constant α is determined, for example, by the offset amount in the depth direction (Z-axis direction) of the approach image with respect to the distance L to the object. The constant α may be zero.
 Based on the reference control data CDs, the display control device 30 performs display control that shortens the display distance P of the approach image V as the distance L to the object becomes shorter (an example of normal control). Then, when a conspicuity condition described later is satisfied during this normal control, the display control device 30 executes the approach process (an example of the distance perception shortening process).
 First, the display control device 30 determines whether or not the display distance of the first approach image V1 is equal to or less than the first threshold PT1 (block S11). The first threshold PT1 is set in advance, and stored in the memory 37, as the display distance corresponding to the distance L at which the object is assumed to be sufficiently close to the vehicle 1 and about to pass by soon.
 When the display distance P of the first approach image V1 is equal to or less than the first threshold PT1 (block S11; Yes), the display control device 30 executes the process of block S12. On the other hand, when the display distance P of the first approach image V1 is longer than the first threshold PT1 (block S11; No), the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the first approach image V1 (block S13).
 When the observer is not visually recognizing the first approach image V1 (block S13; No), the display control device 30 ends the distance perception shortening process, whereas when the observer is visually recognizing the first approach image V1 (block S13; Yes), it executes the process of block S12. In this way, even when the display distance P of the first approach image V1 is longer than the first threshold PT1, the approach process (an example of the distance perception shortening process) that ensures the visibility of the second approach image V2, described later, can be executed when it is assumed that the observer has recognized the first approach image V1 and the need to notify the first approach image V1 has already decreased.
 In block S12, the display control device 30 determines whether or not the erasing condition for the first approach image V1 is satisfied. For example, when the display distance P of the first approach image V1 becomes equal to or less than a predetermined threshold Pth, the display control device 30 determines that the erasing condition is satisfied (block S12; Yes) and erases the first approach image V1 from the display range of the virtual image V (block S14). The predetermined threshold Pth is set in advance, for example, as a value smaller than the first threshold PT1 and is stored in the memory 37. After block S14 is executed, or when the erasing condition for the first approach image V1 is not satisfied (block S12; No), the display control device 30 executes the process of block S15.
 In block S15, the display control device 30 determines whether or not the display distance P of the second approach image V2 is longer than the second threshold PT2. The second threshold PT2 is a value larger than the first threshold PT1 and is set in advance, and stored in the memory 37, as the display distance P corresponding to the distance L at which the object is assumed to be so far from the vehicle 1 that the second approach image V2 indicating the object becomes difficult for the observer to see.
 When the display distance P of the second approach image V2 is equal to or less than the second threshold PT2 (block S15; No), the display control device 30 ends the distance perception shortening process. In this case, the second approach image V2 is displayed under the normal control based on the reference control data CDs, that is, according to the reference function Fs. On the other hand, when the display distance P of the second approach image V2 is longer than the second threshold PT2 (block S15; Yes), the display control device 30 executes the approach process on the second approach image V2 (block S16).
 In the approach process of block S16, the display control device 30 displays the second approach image V2 at a distance closer than the display distance PR determined based on the reference control data CDs (see FIGS. 12B and 12C). In the approach process, as shown in FIG. 12C for example, the display control device 30 controls the display of the second approach image V2 using a second function Ft (P = L - α - γ) obtained by reducing the reference function Fs (P = L - α) based on the reference control data CDs by a predetermined amount γ (γ is a positive number) so that the display distance P becomes even shorter.
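 A minimal sketch of this display-distance counterpart could look as follows; the constants ALPHA_P and GAMMA are hypothetical, and clamping the result at zero is an added assumption so that the computed display distance never becomes negative.

    ALPHA_P = 5.0   # depth-direction offset alpha of the reference function Fs
    GAMMA = 10.0    # predetermined reduction gamma used by the second function Ft

    def reference_distance(distance_l: float) -> float:
        # Reference function Fs: P = L - alpha (normal control)
        return max(distance_l - ALPHA_P, 0.0)

    def approached_distance(distance_l: float) -> float:
        # Second function Ft: P = L - alpha - gamma (approach process)
        return max(distance_l - ALPHA_P - GAMMA, 0.0)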
 FIGS. 13A to 13C show an example of image transitions when the approach process is executed. When the approach process is executed, the second approach image V2, which would be displayed at the display distance P22 under normal control, approaches to a display distance P22t that is closer to the observer than the display distance P22, as shown in FIG. 13B. By executing the approach process (an example of the distance perception shortening process) on the second approach image V2 in this way, the second approach image V2, whose need for notification is expected to become higher than that of the first approach image V1 because the notification target of the first approach image V1 will soon pass by, can be displayed so as to be easy to see.
 Further, when a third approach image V3 that notifies the observer of an object whose distance L is farther than that of the second approach image V2 is being displayed, the display control device 30 continues displaying the third approach image V3 under the control based on the reference control data CDs even while the approach process is being executed on the second approach image V2. As a result, the second approach image V2 becomes relatively more conspicuous than the third approach image V3, and the second approach image V2, whose need for notification is at present higher than that of the third approach image V3, can be displayed effectively. When there are four or more approach images at the time the distance perception shortening process is executed, the approach images displayed smaller than the third approach image V3 (that is, images notifying objects whose distance L is farther than the notification target of the third approach image V3) also continue to be displayed under the control based on the reference control data CDs.
 Subsequently, the display control device 30 determines whether or not a predetermined period (for example, several seconds) has elapsed from the start of execution of the approach process (block S17). When the predetermined period has elapsed (block S17; Yes), the display control device 30 executes a return process that returns the display control of the second approach image V2 to the control based on the reference control data CDs (block S18), and ends the distance perception shortening process.
 FIGS. 13B to 13C show an example of image transitions when the return process is executed. When the return process is executed, the second approach image V2, displayed at the display distance P23t during the approach process, is displayed so as to move away to the display distance P23 based on the reference control data CDs, which is farther from the observer than the display distance P22, as shown in FIG. 13C.
 Although FIG. 13C shows an example in which the display distance P changes in the P direction when the approach process or the return process is executed, the display distance P may be changed gradually in order to suppress a sudden change in the size of the second approach image V2. When the display distance P is changed gradually in this way, the point (L, P) representing the second approach image V2 may move diagonally downward to the left from the reference function Fs to the second function Ft when the approach process is executed, as indicated by the dotted line between Fs and Ft in FIG. 12C, and may move diagonally downward to the left from the second function Ft to the reference function Fs when the return process is executed.
 When the predetermined period has not elapsed from the start of execution of the approach process (block S17; No), the display control device 30 determines, based on the line-of-sight information acquired from the line-of-sight direction detection unit 415, whether or not the observer is visually recognizing the second approach image V2 being displayed enlarged (block S19).
 When the observer is not visually recognizing the second approach image V2 (block S19; No), the display control device 30 ends the distance perception shortening process, whereas when the observer is visually recognizing the second approach image V2 (block S19; Yes), it executes the return process of block S18. In this way, when it is assumed that the observer has recognized the second approach image V2 being displayed enlarged, the display control of the second approach image V2 can be returned to the control based on the reference control data CDs, and the second approach image V2, whose need for notification has already decreased for the observer, can be prevented from continuing to be enlarged. This concludes the description of the distance perception shortening process.
 FIGS. 14A and 14B are flow diagrams showing a method S100 for executing the distance perception shortening process and the distance perception extension process according to some embodiments. The method S100 is executed in an image display unit 20 including a display and a display control device 30 that controls the image display unit 20. Some operations in the method S100 are optionally combined, the procedures of some operations are optionally changed, and some operations are optionally omitted.
 The software components stored in the memory 37 of FIG. 3 include an approach image generation module 502, a distance perception shortening condition determination module 504, a notification necessity determination module 506, a distance perception shortening processing module 508, a distance perception extension condition determination module 510, and a distance perception extension processing module 512.
 As described below, the method S100 particularly provides a method of presenting an image (virtual image) that draws the observer's visual attention to a predetermined approach image.
 The method S100 can be implemented by having one or more processors 33 execute one or more computer programs stored in the memory 37.
 (S110) The one or more processors 33 generate an approach image V by executing the approach image generation module 502. The approach image generation module 502 acquires the position of an object (a real object or a virtual object) from the road information database 403, the vehicle exterior sensor 407, the portable information terminal 417, and/or the external communication device 419, and controls the size of the approach image V so that it gradually becomes larger as the relative distance (distance L) between the object and the vehicle 1 becomes shorter. The approach image generation module 502 may also gradually shorten the display distance of the approach image V while gradually increasing its size as the distance L between the object and the vehicle 1 becomes shorter.
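 As an illustrative sketch of what block S110 describes, each approach image could be updated from the acquired object distance as follows; the data class, field names, and constants are assumptions introduced only for illustration.

    from dataclasses import dataclass

    @dataclass
    class ApproachImage:
        object_id: int
        size: float = 0.0              # drawn size (display ratio M)
        display_distance: float = 0.0  # perceived display distance P

    def update_approach_image(image: ApproachImage, distance_l: float,
                              alpha: float = 100.0, alpha_p: float = 5.0) -> None:
        # As the relative distance L shrinks, the image is drawn larger and,
        # optionally, its display distance is shortened (reference control).
        image.size = alpha / distance_l
        image.display_distance = max(distance_l - alpha_p, 0.0)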
 (Block S130) The one or more processors 33 execute the distance perception shortening condition determination module 504. The distance perception shortening condition determination module 504 executes various operations related to determining that the distance perception shortening condition is satisfied when: the size of the image relating to first information, whose need for notification is higher than that of second information that is a candidate for the conspicuity-raising process, becomes equal to or greater than a predetermined threshold (S132); the display distance of the image relating to the first information, whose need for notification is higher than that of the second information, becomes equal to or greater than a predetermined threshold (S134); the need to notify the second information rises to or above a predetermined determination value (S136); the need to notify the first information, whose need for notification is higher than that of the second information, falls to or below a predetermined determination value (S138); the approach image relating to the first information, whose need for notification is higher than that of the second information, is hidden (S140); or a combination of these occurs. That is, the distance perception shortening condition determination module 504 may include various software components, such as commands, determination values, table data, and arithmetic expressions, for determining whether the distance perception shortening condition is satisfied from the need to notify the information indicated by the approach image or from the display state of the approach image.
 Further, the following may be added to the conditions of block S130: (S130-1) the size of the approach image of the second information, which is a candidate for the conspicuity-raising process, is equal to or less than a predetermined threshold; (S130-2) the object corresponding to the second information, which is a candidate for the conspicuity-raising process, is separated by a predetermined threshold or more; (S130-3) the approach image of the second information, which is a candidate for the conspicuity-raising process, contains text; or a combination of these holds. By adding S130-1 to the conditions, it is possible to reduce less effective operations in which the conspicuity-raising process is performed even though the approach image is large and easy to see. Note that S130-2 is a condition that can substitute for the approach image size of S130-1.
 The one or more processors 33 may further execute the notification necessity determination module 506. The notification necessity determination module 506 determines whether the approach image is content that should be notified to the observer. The notification necessity determination module 506 may acquire information from various other electronic devices connected to the I/O interface 31 and calculate the need for notification. Alternatively, an electronic device connected to the I/O interface 31 in FIG. 11 may transmit information to the vehicle ECU 401, and the notification necessity determination module 506 may detect (acquire) the need for notification determined by the vehicle ECU 401 based on the received information. The "need for notification" can be determined, for example, by the degree of danger derived from the seriousness of what could happen, the degree of urgency derived from the length of the reaction time required before a reaction action must be taken, the degree of effectiveness derived from the situation of the vehicle 1 or the viewer (or another occupant of the vehicle 1), or a combination of these (the indices of the need for notification are not limited to these). The notification necessity determination module 506 may detect necessity-related information from which the need for notification is estimated, and may estimate the need for notification from it. The necessity-related information from which the need to notify an image is estimated may be derived, for example, from the position or type of a real object or a traffic regulation (an example of road information), and may be estimated based on, or in consideration of, other information input from the various other electronic devices connected to the I/O interface 31. That is, the notification necessity determination module 506 may determine whether the viewer should be notified and may also choose not to display the image described later. Note that the display control device 30 only needs to be able to acquire the need for notification and does not have to have a function of estimating (calculating) it; part or all of the function of estimating the need for notification may be provided separately from the display control device 30 of the vehicle display system 10 (for example, in the vehicle ECU 401).
 (Block S150) The one or more processors 33 execute the distance perception shortening processing module 508. The distance perception shortening processing module 508 includes software components for: enlarging the size of the approach image beyond the size under the reference control and further enlarging it as the distance L becomes shorter (enlargement process S152); making the display distance of the approach image shorter than the display distance under the reference control and further shortening it as the distance L becomes shorter (approach process S154); enlarging the size of the approach image beyond the size under the reference control and maintaining that size (S156); making the display distance of the approach image shorter than the display distance under the reference control and maintaining that display distance (S158); or executing a combination of these. Specifically, in order to enlarge the size of the approach image beyond the size under the reference control, or to make the display distance of the approach image shorter than the display distance under the reference control, the distance perception shortening processing module 508 executes the first distance perception shortening processing data CDt (including the second function Ft) stored in advance in the memory 37 (blocks S152, S154).
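 As a sketch of how block S150 might switch between the reference control data CDs and the first distance perception shortening processing data CDt, the following could be used; the enum, function names, and constants are hypothetical.

    from enum import Enum

    class ControlData(Enum):
        CDS = "reference"          # normal control (reference function Fs)
        CDT = "first_shortening"   # first distance perception shortening (second function Ft)

    def display_ratio(distance_l: float, mode: ControlData,
                      alpha: float = 100.0, beta: float = 40.0) -> float:
        # Under CDs the reference function Fs applies; under CDt the second
        # function Ft with the enlarged proportionality constant applies.
        if mode is ControlData.CDT:
            return (alpha + beta) / distance_l
        return alpha / distance_l

 In this sketch, the distance perception extension process of block S190 would then correspond to switching the mode back to CDS.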
 (Block S170) The one or more processors 33 execute the distance perception extension condition determination module 510. The distance perception extension condition determination module 510 executes various operations related to determining that the distance perception extension condition is satisfied when: a predetermined time has elapsed since the distance perception shortening process of block S150 was executed (S172); the need to notify the second information falls to or below a predetermined determination value (S174); the need to notify third information, whose need for notification is lower than that of the second information, rises to or above a predetermined determination value (S176); the vehicle 1 is being driven manually (S178); the speed of the vehicle 1 is equal to or greater than a predetermined determination value; or a combination of these occurs. That is, the distance perception extension condition determination module 510 may include various software components, such as commands, determination values, table data, and arithmetic expressions, for determining whether the distance perception extension condition is satisfied from the need to notify the information indicated by the approach image, the display state of the approach image, and the like.
 (Block S190) The one or more processors 33 execute the distance perception extension processing module 512. The distance perception extension processing module 512 includes software components for: making the size of the approach image smaller than the size in the distance perception shortening process (S192); making the display distance of the approach image farther than the display distance in the distance perception shortening process (S194); maintaining the size of the image (S196); maintaining the display distance of the image (S198); or executing a combination of these. Specifically, the distance perception shortening processing module 508 controls the size of the approach image, the display distance of the approach image, or a combination of these, based on the distance L between the object and the vehicle 1 and the reference control data CDs (reference function Fs). Note that block S192 may include changing the size of the approach image to the size under the reference control, and block S194 may include changing the display distance of the approach image to the display distance under the reference control.
 The approach images in this embodiment can broadly be displayed in three ways: (1) display under "normal control processing", in which, as the relative distance between the vehicle and the object becomes shorter, the size is increased, the display distance is shortened, or a combination of these is executed, so that the approach image is perceived as gradually approaching; (2) display under a "first distance perception shortening process", in which the size becomes still larger, the display distance becomes still shorter, or a combination of these is executed, compared with the display under the "normal control processing"; and (3) display under a "second distance perception shortening process", in which the size becomes still larger, the display distance becomes still shorter, or a combination of these is executed, compared with the display under the "first distance perception shortening process".
 The approach image control data CD in this embodiment include the reference control data CDs used for display under the "normal control processing", the first distance perception shortening processing data CDt used for display under the "first distance perception shortening process", and the second distance perception shortening processing data CDv used for display under the "second distance perception shortening process".
 Referring to FIG. 15, the second distance perception shortening processing data CDv are data that control the approach image so that, compared with the first distance perception shortening processing data CDt, the size of the approach image with respect to the distance L is larger and/or the display distance with respect to the distance L is shorter. That is, as shown in FIG. 15, when the size and/or the display distance of the approach image is changed from that based on the reference control data CDs to that based on the second distance perception shortening processing data CDv, the distance perception can be made even shorter, and the degree of attention attraction (conspicuity) can be made even higher, than when it is changed to that based on the first distance perception shortening processing data CDt.
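 As a sketch of the relation between the three sets of control data (the numerical values are hypothetical and chosen only to satisfy the ordering described above), CDv simply produces a larger display ratio at the same distance than CDt, which in turn produces a larger ratio than CDs:

    # Hypothetical proportionality constants ordered CDs < CDt < CDv.
    CONSTANTS = {"CDs": 100.0, "CDt": 140.0, "CDv": 180.0}

    def ratio_for(data_name: str, distance_l: float) -> float:
        # A larger constant gives a larger display ratio M at the same distance L,
        # i.e. a shorter perceived distance and higher conspicuity.
        return CONSTANTS[data_name] / distance_l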
 FIG. 16A is a flow diagram showing a method S300 of display control of approach images, according to some embodiments, that is executed when, while an approach image relating to second information is being displayed, other information whose notification priority is higher than the second information arises. FIG. 16B is a flow diagram continuing from FIG. 16A, and FIG. 16C is a flow diagram continuing from FIG. 16B. Some operations in the method S300 are optionally combined, the procedures of some operations are optionally changed, and some operations are optionally omitted. Further, in each of FIGS. 17A to 17F, the left figure illustrates an example of image display transitions, and the right figure is a corresponding figure showing the display ratio of the image according to the distance.
 (S310) The one or more processors 33 generate an approach image V. The one or more processors 33 acquire the distance L to a real object or a virtual object, and increase the size of the approach image as the distance L becomes shorter, based on the distance L and the reference control data CDs, stored in advance in the memory 37, that set the size of the approach image. In addition, the one or more processors 33 may shorten the display distance of the approach image as the distance L becomes shorter, based on the distance L and the reference control data CDs, stored in advance in the memory 37, that set the display distance of the approach image. FIG. 17A shows a display example of the second approach image V2 (L1, M1), displayed by block S310, indicating second information about an object set at the distance L1.
 (S330) The one or more processors 33 determine whether a condition for lowering the conspicuity of the approach image V is satisfied. When the one or more processors 33 newly detect information (here, third information) whose need for notification is higher than that of the information (here, the second information) indicated by the approach image V that is a candidate for lowered conspicuity, they determine that the condition for lowering the conspicuity is satisfied (S332).
 (S350) When it is determined in S330 that the condition is satisfied, the one or more processors 33 execute display processing that lowers the conspicuity of the approach image V. The one or more processors 33 lower the visibility (the visibility here meaning, for example, luminance, display color, transparency, display gradation, or a combination of these) of the approach image V (here, the second approach image V2) for which the condition was determined in S330 to be satisfied (S352). FIG. 17B is a diagram showing a display example of the second approach image V2 (L1, M1) whose visibility has been lowered in block S352, together with a third approach image V3 indicating the third information, detected in block S332, whose need for notification is higher than the second information indicated by the second approach image V2. Here, by lowering the visibility (an example of conspicuity) of the second approach image V2 and displaying the third approach image V3 (an image whose need for notification is higher than the second information may be highlighted), the observer's visual attention becomes less likely to be drawn to the second approach image V2 (its conspicuity decreases), and the observer's visual attention can more easily be directed to the third approach image V3 with the high need for notification (or to the object associated with the third approach image V3). Note that the third virtual image V3 does not have to be an approach image whose perceived distance is shortened as the distance L to the object becomes shorter. Further, when the conspicuity of the second virtual image V2 is lowered in block S350, the third virtual image V3 does not have to be displayed; that is, in FIGS. 17B and 17C, the third approach image V3 with the high need for notification may be omitted. Even so, by lowering the conspicuity of the second approach image V2, the observer's visual attention can indirectly be directed more easily to the object of the third information.
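 As a sketch of the visibility-lowering display processing of block S352 (the field names and the scaling factor are assumptions, and only luminance and transparency are adjusted here, although display color or gradation could be adjusted as well):

    def lower_visibility(image_style: dict, factor: float = 0.4) -> dict:
        # Reduce luminance and raise transparency so that the image draws
        # less visual attention than before.
        dimmed = dict(image_style)
        dimmed["luminance"] = image_style.get("luminance", 1.0) * factor
        dimmed["transparency"] = min(1.0, image_style.get("transparency", 0.0) + (1.0 - factor))
        return dimmed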
 図17Cは、図17Bに示す状況よりも後であり、車両1が進行するなどの理由により、第2の情報に関連づけられた第2のオブジェクトまでの距離LがL1からL2に短くなった際の第2の接近画像V2の表示遷移を示す図である。第2の接近画像V2は、距離LがL1からL2に短くなったことに従い、基準制御データCDsに基づき、第2の接近画像V2の表示比率MをM1からM2に拡大する(距離知覚を短縮する一例。)。ここでも、第2の接近画像V2の誘目性(一例として、視認性。)が低下したままであるように図17Cでは、表現してあるが、例えば、ブロックS350が実行されてから所定時間経過したこと、ブロックS350が実行されてから車両1が所定距離以上走行したこと、などを条件に、第2の接近画像V2の誘目性を上昇させてもよい。 FIG. 17C shows a situation after the situation shown in FIG. 17B, when the distance L to the second object associated with the second information is shortened from L1 to L2 due to reasons such as the vehicle 1 moving forward. It is a figure which shows the display transition of the 2nd approach image V2 of. The second approach image V2 expands the display ratio M of the second approach image V2 from M1 to M2 based on the reference control data CDs as the distance L becomes shorter from L1 to L2 (shortens the distance perception). An example of doing.). Here, too, it is expressed in FIG. 17C that the attractiveness (as an example, visibility) of the second close-up image V2 remains reduced, but for example, a predetermined time has elapsed since the block S350 was executed. The attractiveness of the second approach image V2 may be increased on the condition that the vehicle 1 has traveled a predetermined distance or more after the block S350 is executed.
 また、ブロックS330(特許請求の範囲における第1の条件)は、ブロックS332に代えて、又はブロックS332に加えて、誘目性を低下させる候補となる接近画像Vが示す情報(ここでは、第2情報)より報知必要度が下位の情報(ここでは、第1情報とする)の報知必要度が上昇すること(S334)、誘目性を低下させる候補となる接近画像Vが示す情報(ここでは、第2情報)の報知必要度が、下位の情報(ここでは、第1情報とする)の報知必要度より低下すること(S336)、又はこれらの組み合わせ、を含んでいてもよい。 Further, the block S330 (first condition in the claims) is the information indicated by the close-up image V which is a candidate for reducing the attractiveness in place of the block S332 or in addition to the block S332 (here, the second condition). Information (information) has a lower notification necessity (here, the first information), the notification necessity increases (S334), and the information indicated by the approach image V, which is a candidate for reducing the attractiveness (here, is referred to as the first information). The notification necessity of the second information) may be lower than the notification necessity of the lower information (here, the first information) (S336), or a combination thereof may be included.
 また、ブロックS350(特許請求の範囲における第1表示制御)は、ブロックS352に代えて、又はブロックS352に加えて、接近画像V(ここでは、第2の接近画像V2)を非表示にすること(S354)、接近画像V(ここでは、第2の接近画像V2)のサイズを小さくすること(S356)、又はこれらの組み合わせ、を含んでいてもよい。すなわち、誘目性を低下させる表示処理は、画像の視認性を低下させること、画像を非表示にすること、画像のサイズを小さくすること(画像の距離知覚を延長することの一例。)、又はこれらの組み合わせを含んでいてもよい。 Further, the block S350 (first display control in the claims) hides the approach image V (here, the second approach image V2) in place of the block S352 or in addition to the block S352. (S354), reducing the size of the close-up image V (here, the second close-up image V2) (S356), or a combination thereof. That is, the display process for reducing the attractiveness is to reduce the visibility of the image, hide the image, reduce the size of the image (an example of extending the distance perception of the image), or. These combinations may be included.
 (S370) The one or more processors 33 determine whether a condition for shortening the perceived distance of the approach image V is satisfied. When the one or more processors 33 detect that there is no longer any information (here, the first information or the third information) whose notification necessity is higher than that of the information indicated by the approach image V that is the candidate for shortened distance perception (here, the second information), they determine that the condition for shortening the perceived distance is satisfied (S372).
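 Block S372 amounts to checking that no remaining displayed item outranks the candidate image. A minimal sketch, reusing the hypothetical ApproachImage type from the sketch above:

```python
def shortening_condition_met(target: ApproachImage,
                             others: list[ApproachImage]) -> bool:
    """Sketch of S370/S372: the perceived distance of the target approach image
    may be shortened once no other displayed information has a notification
    necessity higher than that of the target."""
    return all(o.notification_necessity <= target.notification_necessity
               for o in others)
```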
 (S390) When it is determined in S370 that the condition is satisfied, the one or more processors 33 execute display processing that shortens the perceived distance of the approach image V. The one or more processors 33 enlarge the approach image V for which the condition was determined in S370 to be satisfied (here, the second approach image V2) beyond its size under the reference control, and enlarge it further as the distance L becomes shorter (S392). FIG. 17D shows a display example of the second approach image V2 (L2, M3) enlarged in block S392. In block S392, the one or more processors 33 change the control data from the reference control data CDs to the first distance-perception shortening processing data CDt, thereby increasing the display ratio M corresponding to the distance L2 from M2 to M3. Enlarging the second approach image V2 shortens the distance perceived by the observer, which makes it easier for the observer to quickly recognize the approach of the object associated with the second approach image V2.
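 Block S392 can be read as swapping the lookup that maps the object distance L to the display ratio M. The sketch below follows that reading; the numeric curves standing in for CDs and CDt are made up for illustration, since the description does not give the control data numerically.

```python
import bisect

# Hypothetical control data: (distance L [m], display ratio M) pairs.
# Values are illustrative only; smaller L maps to a larger ratio.
REFERENCE_CDS  = [(60.0, 0.4), (40.0, 0.6), (20.0, 0.8), (5.0, 1.0)]   # stands in for CDs
SHORTENING_CDT = [(60.0, 0.6), (40.0, 0.9), (20.0, 1.2), (5.0, 1.5)]   # stands in for CDt

def display_ratio(control_data, distance_l: float) -> float:
    """Linearly interpolate the display ratio M for a distance L."""
    pts = sorted(control_data)              # ascending by distance
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    if distance_l <= xs[0]:
        return ys[0]
    if distance_l >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, distance_l)
    t = (distance_l - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

# Block S392 in this model: at the same distance, switching tables raises the ratio,
# e.g. display_ratio(REFERENCE_CDS, 40.0) < display_ratio(SHORTENING_CDT, 40.0).
```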
 FIG. 17E shows the display transition of the second approach image V2 at a point later than the situation shown in FIG. 17D, when the distance L to the second object associated with the second information has shortened from L2 to L3 because, for example, the vehicle 1 has moved forward. As the distance L shortens from L2 to L3, the display ratio M of the second approach image V2 may be enlarged from M3 to M4 based on the first distance-perception shortening processing data CDt (one example of shortening the perceived distance).
 Block S370 (the second condition in the claims) may also include, instead of or in addition to block S372: a fall in the notification necessity of information ranked above the information indicated by the approach image V that is the candidate for reduced attractiveness (here, the candidate's information is the second information and the higher-ranked information is the first information or the third information) (S374); the notification necessity of the information indicated by the candidate approach image V (here, the second information) rising above the notification necessity of the higher-ranked information (here, the first information or the third information) (S376); or a combination of these.
 Block S390 (the second display control in the claims) may also include, instead of or in addition to block S392: making the display distance of the approach image shorter than the display distance under the reference control and shortening it further as the distance L becomes shorter (S394); enlarging the approach image beyond its size under the reference control and then maintaining that size (S396); making the display distance of the approach image shorter than the display distance under the reference control and then maintaining that display distance (S398); or a combination of these. In other words, the display processing that shortens the perceived distance may include enlarging the image, shortening the display distance of the image, or a combination of these.
 (S410) The one or more processors 33 determine whether a condition for extending the perceived distance of the approach image V is satisfied. When the one or more processors 33 detect that a predetermined time has elapsed since block S390 was executed, they determine that the condition for extending the perceived distance is satisfied (S412). Block S410 (the third condition in the claims) may also include, instead of or in addition to block S412, for example, the vehicle 1 having traveled a predetermined distance or more since block S390 was executed (S414).
 (S430) When it is determined in S410 that the condition is satisfied, the one or more processors 33 execute display processing that extends the perceived distance of the approach image V. The one or more processors 33 make the approach image V for which the condition was determined in S410 to be satisfied (here, the second approach image V2) smaller than its size in the first distance-perception shortening processing (S432). FIG. 17F shows a display example of the second approach image V2 (L3, M5) reduced in block S432. In block S432, the one or more processors 33 change the control data from the first distance-perception shortening processing data CDt back to the reference control data CDs (in other words, they undo the distance-perception shortening performed in block S390), thereby reducing the display ratio M corresponding to the distance L3 from M4 to M5. The image change that shrinks the second approach image V2 readily attracts the observer's visual attention and, in turn, makes it easier to direct the observer's visual attention to objects near the perceived distance of the image after the change.
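 Blocks S410 and S432 together can be modelled as reverting to the reference lookup once a hold period expires. A compact sketch, reusing the hypothetical control-data tables from the sketch above; the hold thresholds are assumed values:

```python
def select_control_data(shortening_active: bool,
                        elapsed_since_s390_s: float,
                        travelled_since_s390_m: float,
                        hold_time_s: float = 5.0,
                        hold_distance_m: float = 100.0):
    """Sketch of S410/S430: while the shortening processing is active, use the
    table standing in for CDt; once a predetermined time has elapsed or the
    vehicle has travelled a predetermined distance since block S390 (S412/S414),
    fall back to the table standing in for CDs, which extends the perceived
    distance again (S432)."""
    if not shortening_active:
        return REFERENCE_CDS
    if (elapsed_since_s390_s >= hold_time_s
            or travelled_since_s390_m >= hold_distance_m):
        return REFERENCE_CDS        # revert: perceived distance is extended
    return SHORTENING_CDT
```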
 Block S430 (the third display control in the claims) may also include, instead of or in addition to block S432, making the display distance of the approach image longer than the display distance in the first distance-perception shortening processing (S434).
 Note that when the second approach image V2 displayed in block S310 (FIG. 18A) has been hidden in block S354 (FIG. 18B) and the second approach image V is displayed again for the display of block S390, a large difference between the display ratio M1 (based on the reference control data CDs and corresponding to the distance L1 at the time the image was hidden) and the display ratio M3 (based on the first distance-perception shortening processing data CDt and corresponding to the distance L2 at the time the image is redisplayed in block S390) could startle the observer at the moment of redisplay. Therefore, when executing block S390, the second approach image V2 may first be displayed at a display ratio M smaller than the display ratio M3 based on the first distance-perception shortening processing data CDt corresponding to the actual object position L2 (for example, the display ratio M1 at the time the image was hidden in S350), and then, regardless of the distance L, enlarged rapidly (compared with blocks S392 to S398) until it reaches the display ratio M3. That is, before displaying the image at the perceived distance based on the first distance-perception shortening processing data CDt corresponding to the actual object position L2, block S390 may display an approach image at a perceived distance extended beyond that, and then change the image continuously until it reaches the perceived distance based on the first distance-perception shortening processing data CDt corresponding to the actual object position L2. This prevents an image with a suddenly short perceived distance from appearing abruptly and startling the observer.
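 The re-display behaviour described here is, in effect, a short animation of the display ratio from the last shown value up to the target value. The sketch below is one way to express that interpolation; the duration, frame rate, and the renderer call in the usage note are assumptions, not values or APIs from the embodiment.

```python
def redisplay_ratios(start_ratio: float, target_ratio: float,
                     duration_s: float = 0.3, fps: int = 60):
    """Sketch of the smoothing described for block S390 after a hide (S354):
    instead of jumping straight to the target ratio M3, the approach image is
    re-shown at a smaller ratio (for example the ratio M1 it had when hidden)
    and grown to M3 over a short, fixed duration, independently of the
    distance L."""
    frames = max(1, int(duration_s * fps))
    for i in range(frames + 1):
        t = i / frames                                  # 0.0 -> 1.0
        yield start_ratio + t * (target_ratio - start_ratio)

# Usage (hypothetical renderer): re-show V2 at M1 = 0.6 and grow it rapidly to M3 = 0.9.
# for m in redisplay_ratios(0.6, 0.9):
#     renderer.set_ratio("V2", m)
```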
 Note that even while block S150 of FIG. 14A is being executed, the processing from S330 onward may be executed if the condition of block S330 in FIG. 16A for reducing attractiveness is satisfied.
 The operations of the processing described above can be implemented by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. These modules, combinations of these modules, and/or combinations with known hardware capable of substituting for their functions are all included within the scope of protection of the present invention.
 The functional blocks of the vehicle display system 10 are optionally implemented by hardware, software, or a combination of hardware and software in order to carry out the principles of the various embodiments described. Those skilled in the art will understand that the functional blocks described in FIG. 3 may optionally be combined, or one functional block may be divided into two or more sub-blocks, in order to implement the principles of the described embodiments. Accordingly, the description herein optionally supports any possible combination or division of the functional blocks described herein.
 The image display unit 20 in the vehicle display system 10 may also be a head-mounted display (HMD) device. By wearing the HMD device on the head and sitting in a seat of the vehicle 1, the observer views the displayed virtual image V superimposed on the foreground through the front windshield 2 of the vehicle 1. The display area 100 in which the image display unit 20 displays a predetermined virtual image V is fixed at a specific position relative to the coordinate system of the vehicle 1, and when the observer faces in that direction, the observer can view the virtual image V displayed within the display area 100 fixed at that specific position.
1: Vehicle
2: Front windshield
2D: Depth
5: Dashboard
10: Vehicle display system
20: Image display unit (HUD device)
21: Display device
21a: Display surface
22: Liquid crystal display panel
24: Light source unit
25: Relay optical system
26: First mirror
27: Second mirror
28: Actuator
29: Actuator
30: Display control device
31: I/O interface
33: Processor
35: Image processing circuit
37: Memory
40: Display light
40p: Optical axis
41: First display light
42: Second image light
43: Third image light
80: Car navigation device
90: Virtual image optical system
100: Display area
200: Eyebox
205: Center
401: Vehicle ECU
403: Road information database
405: Own-vehicle position detection unit
407: Vehicle exterior sensor
409: Operation detection unit
411: Eye position detection unit
415: Line-of-sight direction detection unit
417: Portable information terminal
419: External communication device
502: Approach image generation module
504: Distance-perception shortening condition determination module
506: Notification necessity determination module
508: Distance-perception shortening processing module
510: Distance-perception extension condition determination module
512: Distance-perception extension processing module
700: Eye position
CD: Approach image control data
CDs: Reference control data
CDt: First distance-perception shortening processing data
CDu: Second distance-perception shortening processing data
Fs: Reference function
Fsb: Reference function for restoration
Ft: Second function
L: Relative distance
L0: Predetermined distance
M: Display ratio
MR: Display ratio
MT1: First threshold
MT2: Second threshold
Mth: Predetermined threshold
P: Perceived distance
PR: Perceived distance
PT1: First threshold
PT2: Second threshold
Pth: Predetermined threshold
V: Approach image
V1: First approach image
V2: Second approach image
V3: Third approach image
β: Predetermined amount
γ: Predetermined amount
θt: Tilt angle

Claims (12)

  1.  A display control device (30) for controlling an image display device (20) that displays an image to an observer as a virtual image overlapping the scenery in front of a vehicle, the display control device comprising:
     one or more I/O interfaces (31) capable of acquiring information;
     one or more processors (33);
     a memory (37); and
     one or more computer programs stored in the memory (37) and configured to be executed by the one or more processors (33),
     wherein the one or more I/O interfaces (31) acquire
      a distance (L) to an object in front of the vehicle, and
      vehicle information relating to the vehicle, and
     wherein the one or more processors (33)
      execute a first display control that displays an approach image (V) such that its size is perceived as gradually increasing as the distance (L) becomes shorter,
      determine whether a predetermined first condition is satisfied,
       when the predetermined first condition is satisfied, reduce the visibility of the approach image (V), hide the approach image (V), and/or reduce the size of the approach image (V),
      determine whether a predetermined second condition is satisfied, and
       when the predetermined second condition is satisfied, execute a second display control that, compared with the first display control, increases the size of the approach image (V) with respect to the distance (L) and/or shortens the display distance.
  2.  The display control device (30) according to claim 1, wherein
     the one or more processors (33) reduce the visibility of the approach image (V) by changing at least one of the luminance, display color, transparency, and display gradation of the approach image (V).
  3.  The display control device (30) according to claim 1, wherein
     the one or more processors (33)
      determine whether a predetermined third condition is satisfied, and
       when the predetermined third condition is satisfied, execute a third display control that changes the size and/or display distance of the approach image (V) changed by the second display control to the size and/or display distance of the approach image (V) corresponding to the distance (L) in the first display control.
  4.  The display control device (30) according to claim 1, wherein,
     in the second display control, the one or more processors (33) maintain the size and/or display distance of the approach image (V) changed by the second display control for at least a certain period after the second display control is started, regardless of the distance (L).
  5.  An image display device (20) that displays an image to an observer as a virtual image overlapping the scenery in front of a vehicle, comprising:
     one or more I/O interfaces (31) capable of acquiring information;
     one or more processors (33);
     a memory (37); and
     one or more computer programs stored in the memory (37) and configured to be executed by the one or more processors (33),
     wherein the one or more I/O interfaces (31) acquire
      a distance (L) to an object in front of the vehicle, and
      vehicle information relating to the vehicle, and
     wherein the one or more processors (33)
      execute a first display control that displays an approach image (V) such that its size is perceived as gradually increasing as the distance (L) becomes shorter,
      determine whether a predetermined first condition is satisfied,
       when the predetermined first condition is satisfied, reduce the visibility of the approach image (V), hide the approach image (V), and/or reduce the size of the approach image (V),
      determine whether a predetermined second condition is satisfied, and
       when the predetermined second condition is satisfied, execute a second display control that, compared with the first display control, increases the size of the approach image (V) with respect to the distance (L) and/or shortens the display distance.
  6.  The image display device (20) according to claim 5, wherein
     the one or more processors (33) reduce the visibility of the approach image (V) by changing at least one of the luminance, display color, transparency, and display gradation of the approach image (V).
  7.  The image display device (20) according to claim 5, wherein
     the one or more processors (33)
      determine whether a predetermined third condition is satisfied, and
       when the predetermined third condition is satisfied, execute a third display control that changes the size and/or display distance of the approach image (V) changed by the second display control to the size and/or display distance of the approach image (V) corresponding to the distance (L) in the first display control.
  8.  The image display device (20) according to claim 5, wherein,
     in the second display control, the one or more processors (33) maintain the size and/or display distance of the approach image (V) changed by the second display control for at least a certain period after the second display control is started, regardless of the distance (L).
  9.  A method for an image display device (20) that displays an image to an observer as a virtual image overlapping the scenery in front of a vehicle, the method comprising:
     acquiring a distance (L) to an object in front of the vehicle;
     acquiring vehicle information relating to the vehicle;
     executing a first display control that displays an approach image (V) such that its size is perceived as gradually increasing as the distance (L) becomes shorter;
     determining whether a predetermined first condition is satisfied;
      when the predetermined first condition is satisfied, reducing the visibility of the approach image (V), hiding the approach image (V), and/or reducing the size of the approach image (V);
     determining whether a predetermined second condition is satisfied; and
      when the predetermined second condition is satisfied, executing a second display control that, compared with the first display control, increases the size of the approach image (V) with respect to the distance (L) and/or shortens the display distance.
  10.  The method according to claim 9, wherein
     the visibility of the approach image (V) is reduced by changing at least one of the luminance, display color, transparency, and display gradation of the approach image (V).
  11.  The method according to claim 9, further comprising:
     determining whether a predetermined third condition is satisfied; and
      when the predetermined third condition is satisfied, executing a third display control that changes the size and/or display distance of the approach image (V) changed by the second display control to the size and/or display distance of the approach image (V) corresponding to the distance (L) in the first display control.
  12.  The method according to claim 9, wherein,
     in the second display control, the size and/or display distance of the approach image (V) changed by the second display control is maintained for at least a certain period after the second display control is started, regardless of the distance (L).


PCT/JP2021/013480 2020-03-31 2021-03-30 Display control device, image display device, and method WO2021200913A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020061737 2020-03-31
JP2020-061737 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021200913A1 (en)

Family

ID=77929009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/013480 WO2021200913A1 (en) 2020-03-31 2021-03-30 Display control device, image display device, and method

Country Status (1)

Country Link
WO (1) WO2021200913A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010072365A (en) * 2008-09-18 2010-04-02 Toshiba Corp Head up display
JP2012162109A (en) * 2011-02-03 2012-08-30 Toyota Motor Corp Display apparatus for vehicle
JP2017052364A (en) * 2015-09-09 2017-03-16 日本精機株式会社 Head up display device
JP2019156296A (en) * 2018-03-15 2019-09-19 矢崎総業株式会社 Vehicular display projection device
JP2019199139A (en) * 2018-05-15 2019-11-21 日本精機株式会社 Vehicular display device
JP2020032866A (en) * 2018-08-30 2020-03-05 日本精機株式会社 Vehicular virtual reality providing device, method and computer program

Similar Documents

Publication Publication Date Title
WO2019097763A1 (en) Superposed-image display device and computer program
US11525694B2 (en) Superimposed-image display device and computer program
US11803053B2 (en) Display control device and non-transitory tangible computer-readable medium therefor
CN111095078A (en) Method, device and computer-readable storage medium with instructions for controlling the display of an augmented reality head-up display device for a motor vehicle
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
JP7255608B2 (en) DISPLAY CONTROLLER, METHOD, AND COMPUTER PROGRAM
JP7459883B2 (en) Display control device, head-up display device, and method
JP7310560B2 (en) Display control device and display control program
WO2022230995A1 (en) Display control device, head-up display device, and display control method
WO2021200914A1 (en) Display control device, head-up display device, and method
WO2021200913A1 (en) Display control device, image display device, and method
WO2020158601A1 (en) Display control device, method, and computer program
JP2021160409A (en) Display control device, image display device, and method
JP2020121607A (en) Display control device, method and computer program
JP2020121704A (en) Display control device, head-up display device, method and computer program
JP2020117105A (en) Display control device, method and computer program
JP7434894B2 (en) Vehicle display device
JP2020158014A (en) Head-up display device, display control device, and display control program
WO2023003045A1 (en) Display control device, head-up display device, and display control method
JP2019207632A (en) Display device
WO2023210682A1 (en) Display control device, head-up display device, and display control method
WO2023145852A1 (en) Display control device, display system, and display control method
JP2022057051A (en) Display controller and virtual display device
JP7338632B2 (en) Display device
JP2022113292A (en) Display control device, head-up display device, and display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21781516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21781516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP