WO2019003929A1 - Display system, information presentation system, display system control method, program and recording medium, and mobile device


Info

Publication number
WO2019003929A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual image
display
shield
control unit
display system
Prior art date
Application number
PCT/JP2018/022659
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
忠司 芝田
中野 信之
田中 彰
勝長 辻
祥平 林
勇義 苑田
友哉 吉田
Original Assignee
パナソニックIpマネジメント株式会社
Priority date
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Priority to DE112018003314.7T (published as DE112018003314B4)
Publication of WO2019003929A1

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60J - WINDOWS, WINDSCREENS, NON-FIXED ROOFS, DOORS, OR SIMILAR DEVICES FOR VEHICLES; REMOVABLE EXTERNAL PROTECTIVE COVERINGS SPECIALLY ADAPTED FOR VEHICLES
    • B60J1/00 - Windows; Windscreens; Accessories therefor
    • B60J1/02 - Windows; Windscreens; Accessories therefor arranged at the vehicle front, e.g. structure of the glazing, mounting of the glazing
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/64 - Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • The present disclosure relates generally to a display system, an information presentation system, a control method for the display system, a program, a recording medium, and a mobile device, and more specifically to a display system that projects a virtual image onto a target space, an information presentation system, a control method for the display system, a program, a recording medium, and a mobile device.
  • As a display device for a vehicle, a head-up display device is known that displays a driving information image or the like necessary for driving as a virtual image at a distance through the windshield.
  • Such a display device is disclosed, for example, in Patent Document 1.
  • the display device described in Patent Document 1 has a screen on which an image is drawn.
  • The image formed on the screen is reflected by the windshield of the vehicle via the projection means and reaches the driver's eyes, so that the virtual image is visible to the driver far ahead of the windshield.
  • In the space onto which the virtual image is projected, a shield such as a person or another car may be present.
  • When a shield is present at the time of projecting a virtual image, the display device (display system) disclosed in Patent Document 1 displays the virtual image superimposed on the shield. Therefore, the driver (target person) may feel discomfort due to the virtual image overlapping the existing shield.
  • An object of the present disclosure is to provide a display system, an information presentation system, a display system control method, a program, a recording medium, and a mobile device that can reduce the possibility that the subject feels discomfort due to a virtual image overlapping an existing shield.
  • a display system includes a projection unit and a control unit.
  • the projection unit projects a virtual image on the target space.
  • the control unit controls the display of the virtual image, and changes the display mode of the virtual image when the shield exists in the projection direction of the virtual image and the shield is within the visual distance to the virtual image.
  • An information presentation system includes the display system and a detection system that detects an obstacle.
  • a control method of a display system is a control method of a display system including a projection unit that projects a virtual image on a target space and a control unit that controls display of the virtual image.
  • In this control method, when the shield is present in the projection direction of the virtual image and the shield is in a range within the visual distance to the virtual image, the display mode of the virtual image is changed.
  • a program according to an aspect of the present disclosure is a program for causing a computer to execute the control method of the display system.
  • a non-transitory recording medium stores the program in a computer readable manner.
  • a mobile device includes the above-described display system, and a reflective member that is light transmissive and that reflects light emitted from a projection unit.
  • FIG. 1 is a conceptual view of a vehicle equipped with a display system according to an embodiment of the present disclosure.
  • FIG. 2 is a conceptual diagram showing the field of view of the user when the display system according to the embodiment of the present disclosure is used.
  • FIG. 3 is a conceptual diagram showing configurations of a display system and an information presentation system according to the embodiment of the present disclosure.
  • FIG. 4 is a flowchart for explaining the operation of the display system shown in FIG.
  • FIG. 5A is a conceptual diagram showing an example of projecting a first virtual image.
  • FIG. 5B is a conceptual diagram showing the first virtual image of FIG. 5A as viewed from the driver's seat.
  • FIG. 5C is a conceptual diagram showing an example of the case where the projected first virtual image overlaps the shield.
  • FIG. 5D is a conceptual diagram showing the first virtual image of FIG. 5C viewed from the driver's seat.
  • FIG. 6A is a conceptual diagram showing an example in the case where the display mode of a part of the first virtual image is changed when the first virtual image overlaps the shield.
  • FIG. 6B is a conceptual diagram showing an example of display of the first virtual image when the distance to the shield is equal to or greater than the threshold.
  • FIG. 7 is a conceptual diagram for explaining the change of the display mode in the modification A.
  • FIG. 8A is a conceptual diagram for explaining a change of the display mode in the modified example B.
  • FIG. 8B is a conceptual diagram for explaining another change of the display mode in the modified example B.
  • FIG. 9A is a conceptual diagram for explaining the change of the display mode in the modification C.
  • FIG. 9B is a conceptual diagram for explaining another change of the display mode in the modification C.
  • FIG. 10A is a conceptual diagram for describing a change of the display mode in the modification D.
  • FIG. 10B is a conceptual diagram for explaining another change of the display mode in the modification D.
  • FIG. 11A is a conceptual diagram for describing a change of the display mode in the modification D.
  • FIG. 11B is a conceptual diagram for explaining another change of the display mode in the modification D.
  • FIG. 12A is a conceptual diagram for illustrating still another change of the display mode in the modification D.
  • FIG. 12B is a conceptual diagram for illustrating another change of the display mode in the modification D.
  • FIG. 13A is a conceptual diagram for describing still another change of the display mode in the modification D.
  • FIG. 13B is a conceptual diagram for describing still another change of the display mode in the modification D.
  • FIG. 14A is a conceptual diagram for explaining a change of the display mode in the modification E.
  • FIG. 14B is a conceptual diagram for explaining another change of the display mode in the modification E.
  • FIG. 15A is a conceptual diagram for describing a change of the display mode in the modification F.
  • FIG. 15B is a conceptual diagram for explaining another change of the display mode in the modification F.
  • FIG. 16A is a conceptual diagram of a position for explaining a change of the display mode in the modification G.
  • FIG. 16B is a conceptual diagram for explaining the change of the display mode in the modification G.
  • FIG. 16C is a conceptual diagram for explaining another change of the display mode in the modification G.
  • FIG. 16D is a conceptual diagram for illustrating still another change of the display mode in the modification G.
  • FIG. 16E is a conceptual diagram for illustrating another change of the display mode in the modification G.
  • FIG. 17 is a flowchart for explaining the operation of the display system in the modification H.
  • FIG. 18A is a conceptual diagram for describing changes in the display mode in the modification H.
  • FIG. 18B is a conceptual diagram for illustrating another change of the display mode in the modification H.
  • FIG. 1 is a conceptual view of a car 100 as a mobile device equipped with a display system 10 according to an embodiment of the present disclosure.
  • the automobile 100 has a main body 110 and a drive unit 220.
  • the main body 110 constitutes a compartment.
  • the main body 110 includes a windshield (front glass) 101.
  • a seat including a front seat 103 on which a user 200 (an occupant such as a driver of the automobile 100) is seated is installed.
  • the driving unit 220 is mounted on the main body 110 to move the main body 110.
  • the drive unit 220 includes a drive source 222 including an engine and a motor, and drive wheels 224.
  • the drive unit 220 may include a steering 226.
  • the display system 10 is, for example, a head-up display device used for the automobile 100.
  • the display system 10 is installed in the cabin of the automobile 100 so as to project an image from below onto the windshield 101 of the automobile 100.
  • the display system 10 is disposed in the dashboard 102 below the windshield 101.
  • the image reflected by the windshield 101 as a reflecting member is viewed by the user (driver) 200.
  • the user 200 visually recognizes, through the windshield 101, the virtual image 300 projected on the target space 400 set in front of the vehicle 100 (outside the vehicle).
  • The windshield 101 as a reflecting member is light transmissive, reflects the light emitted by the projection unit 40 (described later with reference to FIG. 3) of the display system 10, and thereby makes the virtual image 300 visible to the user 200.
  • The “virtual image” means an image that is formed by diverging rays, as if an object were actually present, when the light emitted from the display system 10 is diverged by a reflective object such as the windshield 101.
  • the windshield 101 is light transmissive, and the user 200 who is the subject can view the subject space 400 in front of the automobile 100 through the windshield 101.
  • The user 200 can see the virtual image 300 projected by the display system 10 superimposed on the real space extending in front of the automobile 100. Therefore, with the display system 10, various kinds of driving support information, such as vehicle speed information, navigation information, pedestrian information, forward vehicle information, lane deviation information, and vehicle condition information, can be displayed as the virtual image 300 and made visible to the user 200. As a result, the user 200 can visually acquire the driving support information with only a slight movement of the line of sight from the state in which the line of sight is directed to the area in front of the windshield 101.
  • the virtual image 300 formed in the target space 400 includes at least two types of a first virtual image 301 and a second virtual image 302.
  • the first virtual image 301 is formed on the first virtual surface 501.
  • The inclination angle α of the first virtual surface 501 with respect to the optical axis 500 of the display system 10 is smaller than a predetermined value γ (α < γ).
  • A second virtual image 302 is formed on the second virtual surface 502.
  • The inclination angle β of the second virtual surface 502 with respect to the optical axis 500 of the display system 10 is larger than the predetermined value γ (β > γ).
  • The optical axis 500 means the optical axis of the projection optical system 4 (described later with reference to FIG. 3), which passes through the center of the target space 400 along the optical path of the virtual image 300.
  • As an example, the predetermined value γ is 45 degrees and the inclination angle β is 90 degrees.
  • The virtual image 300 includes a third virtual image 303 (described later with reference to FIG. 2) in addition to the first virtual image 301 and the second virtual image 302. Similar to the second virtual image 302, the third virtual image 303 is formed on the second virtual surface 502, whose tilt angle β with respect to the optical axis 500 is larger than the predetermined value γ. The difference between the second virtual image 302 and the third virtual image 303 will be described later.
  • The optical axis 500 extends along the road surface 600 in the target space 400 in front of the automobile 100.
  • The first virtual image 301 is formed on the first virtual surface 501, which is substantially parallel to the road surface 600.
  • The second virtual image 302 and the third virtual image 303 are formed on the second virtual surface 502, which is substantially perpendicular to the road surface 600.
  • the road surface 600 is a horizontal surface
  • the first virtual image 301 is displayed along the horizontal surface
  • the second virtual image 302 and the third virtual image 303 are displayed along the vertical surface.
  • FIG. 2 is a conceptual view showing a field of view of the user 200.
  • The display system 10 can display a first virtual image 301 that is visually recognized with depth along the road surface 600, and a second virtual image 302 and a third virtual image 303 that are visually recognized as standing upright on the road surface 600 at a predetermined distance from the user 200. Therefore, to the user 200, the first virtual image 301 appears to lie on a plane substantially parallel to the road surface 600, and the second virtual image 302 and the third virtual image 303 appear to lie on a plane substantially perpendicular to the road surface 600.
  • the first virtual image 301 indicates, for example, information indicating the traveling direction of the vehicle 100 as navigation information, and can present an arrow indicating a right turn or a left turn on the road surface 600.
  • the second virtual image 302 indicates, for example, information indicating the distance to a preceding vehicle or a pedestrian, and for example, it is possible to present the distance to the preceding vehicle (inter-vehicle distance) on the preceding vehicle.
  • The third virtual image 303 indicates, for example, the current time, vehicle speed information, and vehicle condition information, and such information can be presented with characters, numbers, symbols, or a meter such as a fuel gauge.
  • FIG. 3 is a conceptual diagram showing the configuration of the display system 10 and the information presentation system 1000 according to the present embodiment.
  • the display system 10 constitutes an information presentation system 1000 in combination with the detection system 7.
  • the detection system 7 includes an imaging device 71 and a laser radar 72.
  • the imaging device 71 has a camera and captures the projection direction of the virtual image 300.
  • the laser radar 72 detects an object (obstacle) present in the projection direction of the virtual image 300.
  • the projection direction is the direction in which the user 200 looks at the virtual image 300 in FIG.
  • the detection system 7 detects, based on the image captured by the imaging device 71 and the detection result of the laser radar 72, whether a shield such as a car or a person is present in the projection direction.
  • the detection system 7 determines the distance from the automobile 100 to the shield based on the detection result by the laser radar 72 when the shield is present.
  • the laser radar 72 irradiates the target space 400 with pulsed laser light, and receives the reflected light reflected by the object in the target space.
  • the laser radar 72 calculates the distance to the object based on the time from the irradiation of the laser light to the reception of the reflected light.
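  • As an illustration only, the time-of-flight calculation described above can be sketched as follows; the function and constant names are assumptions, and the actual implementation inside the laser radar 72 is not disclosed.

```python
# Minimal time-of-flight sketch (hypothetical; not the disclosed implementation of the laser radar 72).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance to the object, given the time from laser emission to reception of the
    reflected light. The light travels to the object and back, hence the division by 2."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of about 0.67 microseconds corresponds to roughly 100 m.
print(distance_from_time_of_flight(0.67e-6))  # ~100.4 m
```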
  • the detection system 7 recognizes a shield present in the target space from the image captured by the imaging device 71 using, for example, a learning model generated by a machine learning algorithm.
  • the detection system 7 notifies the display system 10 of detection result information including the detection result of the presence or absence of the shield and the distance to the shield if the shield is present.
  • the detection system 7 obtains the current position of the vehicle 100 using a GPS (Global Positioning System), and acquires map information around the current position based on the current position of the vehicle 100.
  • The detection system 7 may acquire map information around the current position of the vehicle 100 from a memory in which the map information is stored in advance, or may acquire it by communicating with an external server.
  • The position information is, for example, information on the road at the current position of the automobile 100, such as the number of lanes of the road, the width of the roadway, the presence or absence of a sidewalk, the slope, and the curvature of a curve.
  • the detection system 7 also acquires vehicle information representing the state of the vehicle 100 from an advanced driver assistance system (ADAS) or the like.
  • vehicle information includes, for example, a traveling speed (vehicle speed) of the automobile 100, an acceleration, an accelerator opening degree, a depression degree of a brake pedal, and the like.
  • the imaging device 71 and the laser radar 72 may be shared with the ADAS.
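  • As an illustration, the detection result information and the vehicle information described above could be modelled roughly as in the following sketch; the class and field names are assumptions and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectionResult:
    """Detection result information notified by the detection system 7 (field names are assumptions)."""
    shield_present: bool                   # whether a shield (car, person, ...) exists in the projection direction
    distance_to_shield_m: Optional[float]  # distance from the laser radar 72 to the shield, if one is present

@dataclass
class VehicleInfo:
    """Vehicle information acquired from the ADAS (illustrative subset only)."""
    speed_kmh: float
    acceleration_mps2: float
    accelerator_opening: float   # 0.0 - 1.0
    brake_depression: float      # 0.0 - 1.0
```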
  • the display system 10 includes a movable screen 1 a, a fixed screen 1 b, a drive unit 2, an irradiation unit 3, a projection optical system 4, a control unit 5, and an acquisition unit 6.
  • the projection optical system 4 constitutes a projection unit 40 for projecting the virtual image 300 onto the target space 400 shown in FIG.
  • the fixed screen 1 b is fixed at a fixed position with respect to a housing or the like of the display system 10.
  • The movable screen 1a is inclined at an angle θ with respect to the reference surface 503.
  • the movable screen 1a is configured to be movable in a moving direction X (direction shown by an arrow X1-X2 in FIG. 3) orthogonal to the reference surface 503.
  • the reference surface 503 is an imaginary plane that defines the moving direction of the movable screen 1a, and is not an existing surface.
  • The movable screen 1a is configured to be linearly movable in the moving direction X while maintaining the posture inclined by the angle θ with respect to the reference surface 503.
  • When the movable screen 1a and the fixed screen 1b do not need to be distinguished from each other, each may be referred to simply as a screen 1.
  • the screen 1 is translucent, and forms an image for forming a virtual image 300 in the target space 400 shown in FIG. That is, on the screen 1, an image is drawn by the light from the irradiation unit 3, and the light passing through the screen 1 forms a virtual image 300 in the target space 400.
  • the screen 1 has, for example, a light diffusing property and includes a plate-like member formed in a rectangular shape. The screen 1 is disposed between the irradiation unit 3 and the projection optical system 4.
  • the drive unit 2 moves the movable screen 1 a in the moving direction X.
  • the drive unit 2 can move the movable screen 1 a in both the direction approaching the projection optical system 4 and the direction away from the projection optical system 4 along the moving direction X.
  • the drive unit 2 includes, for example, an electrically driven actuator such as a voice coil motor, and operates according to a first control signal from the control unit 5.
  • The irradiation unit 3 is a scanning-type irradiation unit that irradiates the movable screen 1a or the fixed screen 1b with light.
  • the irradiation unit 3 includes a light source 31 and a scanning unit 32.
  • the light source 31 and the scanning unit 32 operate according to the second control signal from the control unit 5, respectively.
  • the light source 31 includes a laser module that outputs a laser beam.
  • The light source 31 includes a red laser diode that outputs red (R) laser light, a green laser diode that outputs green (G) laser light, and a blue laser diode that outputs blue (B) laser light.
  • The laser beams of the three colors output from these three types of laser diodes are combined by, for example, a dichroic mirror, and enter the scanning unit 32.
  • The scanning unit 32 scans the light from the light source 31 so that the emitted light scans one entire surface of the movable screen 1a or the fixed screen 1b.
  • the scanning unit 32 performs raster scan for two-dimensionally scanning light on one surface of the movable screen 1 a or the fixed screen 1 b.
  • the light output from the irradiation unit 3 and transmitted through the screen 1 enters the projection optical system 4 as incident light.
  • the projection optical system 4 projects the virtual image 300 on the target space 400 as shown in FIG. 1 by the incident light.
  • the projection optical system 4 is arranged in line with the screen 1 in the moving direction X.
  • the projection optical system 4 has a magnifying lens 41, a first mirror 42, and a second mirror 43, as shown in FIG.
  • the magnifying lens 41, the first mirror 42, and the second mirror 43 are disposed in this order on the path of the light transmitted through the screen 1.
  • The magnifying lens 41 is disposed, as viewed from the screen 1, on the side (the first direction X1 side) opposite to the irradiation unit 3 in the moving direction X, so that the light output from the screen 1 along the moving direction X is incident on it.
  • the magnifying lens 41 magnifies the image formed on the screen 1 by the light from the irradiating unit 3 and outputs the image to the first mirror 42.
  • the first mirror 42 reflects the light from the magnifying lens 41 toward the second mirror 43.
  • the second mirror 43 reflects the light from the first mirror 42 toward the windshield 101 shown in FIG.
  • the projection optical system 4 projects the virtual image 300 on the target space 400 by enlarging the image formed on the screen 1 by the light from the irradiation unit 3 with the magnifying lens 41 and projecting it on the windshield 101.
  • the optical axis of the magnifying lens 41 is the optical axis 500 of the projection optical system 4.
  • the control unit 5 is configured of, for example, a microcomputer whose main configuration is a CPU (Central Processing Unit) and a memory.
  • the control unit 5 is realized by a computer having a CPU and a memory, and the computer functions as the control unit 5 when the CPU executes a program stored in the memory.
  • The program may be recorded in advance in the memory of the control unit 5, provided through a telecommunication line such as the Internet, or recorded and provided on a non-transitory recording medium such as a memory card or various types of disks.
  • the control unit 5 controls the drive unit 2 and the irradiation unit 3.
  • the control unit 5 controls the drive unit 2 with the first control signal, and controls the irradiation unit 3 with the second control signal. Further, the control unit 5 is configured to synchronize the operation of the drive unit 2 and the operation of the irradiation unit 3. Furthermore, as shown in FIG. 3, the control unit 5 has functions as a drive control unit 51 and a display control unit 52.
  • the drive control unit 51 controls the drive unit 2 to move the movable screen 1 a relative to the reference position.
  • the reference position is set to a specified position in the movement range of the movable screen 1a.
  • the drive control unit 51 moves the movable screen 1 a to project the second virtual image 302 onto the target space 400 by the light transmitted through the movable screen 1 a.
  • the drive control unit 51 controls the drive unit 2 in synchronization with the drawing on the movable screen 1 a by the irradiation unit 3.
  • The display control unit 52 determines the content of the virtual image 300 to be displayed and the display position (visual distance) of the virtual image 300 based on one or more pieces of information acquired by the acquisition unit 6. Furthermore, when there is a shield, the display control unit 52 determines whether to change the display mode of the virtual image 300 to be projected, based on the distance to the shield and the display position of the virtual image 300.
  • When the distance to the shield is equal to or less than the visual distance to the virtual image 300, the display control unit 52 changes the display mode of the virtual image. That is, when the shield exists in the projection direction of the virtual image 300 and the shield is within the visual distance to the virtual image 300, the display control unit 52 changes the display mode of the virtual image 300. Specific processing will be described later.
  • the display mode of the virtual image includes not only the design of the virtual image but also the display position. Changing the display mode is intended to change at least one of the design and the display position.
  • the change of the display mode includes the case of adding other information and the case of changing the type of the virtual image 300.
  • the change of the type of the virtual image 300 is to change one virtual image out of the first virtual image 301, the second virtual image 302, and the third virtual image 303 into another virtual image.
  • The viewing distance is the virtual distance at which the virtual image 300 is visually recognized from the eye (eye point) of the user 200. “The distance to the shield is equal to or less than the visual distance to the virtual image 300” means that the distance to the shield is equal to or greater than the viewing distance to the rear end (the end on the near side) of the virtual image 300 and equal to or less than the viewing distance to the front end of the virtual image 300.
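  • As a minimal sketch (not part of the patent), the condition above can be expressed as a check that the shield lies between the near-end and far-end viewing distances of the virtual image; the function and parameter names are assumptions.

```python
def shield_within_visual_distance(distance_to_shield_m: float,
                                  rear_end_viewing_distance_m: float,
                                  front_end_viewing_distance_m: float) -> bool:
    """True if the shield lies between the near (rear) end and the far (front) end of the
    virtual image as seen from the eye point, i.e. the virtual image would overlap the shield.
    Hypothetical helper for illustration only."""
    return rear_end_viewing_distance_m <= distance_to_shield_m <= front_end_viewing_distance_m

# Example: a shield 30 m away overlaps an arrow that appears to span 20 m to 50 m ahead.
print(shield_within_visual_distance(30.0, 20.0, 50.0))  # True
```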
  • the acquisition unit 6 acquires, from the detection system 7, information (driving support information) related to an object present around the automobile 100. Specifically, the acquisition unit 6 acquires detection result information as driving support information from the detection system 7. The acquisition unit 6 also acquires information such as the above-described map information, vehicle information, position information, information on navigation with respect to the automobile 100 (navigation information) as driving support information.
  • the control unit 5 controls the irradiation unit 3 to cause the movable screen 1 a to emit light from the irradiation unit 3.
  • Light that scans one surface of the movable screen 1 a is emitted from the irradiation unit 3 to the movable screen 1 a.
  • an image is formed (projected) on the movable screen 1a.
  • the light from the irradiation unit 3 is transmitted through the movable screen 1 a and irradiated from the projection optical system 4 to the windshield 101.
  • the image formed on the movable screen 1a is projected onto the windshield 101 from the lower side of the windshield 101 shown in FIG.
  • When an image is projected from the projection optical system 4 onto the windshield 101, the windshield 101 reflects the light from the projection optical system 4 toward the user 200 in the vehicle compartment. Thereby, the image reflected by the windshield 101 is visually recognized by the user 200. As a result, the user 200 can visually recognize the first virtual image 301 or the second virtual image 302 projected in front of the automobile 100 through the windshield 101.
  • the control unit 5 scans light on one surface of the movable screen 1 a in a state where the movable screen 1 a is fixed in the moving direction X. By this scanning, the first virtual image 301 is formed so as to be viewed with depth along the road surface 600.
  • The control unit 5 scans light on one surface of the movable screen 1a while moving the movable screen 1a so that the distance along the moving direction X from the bright spot on the surface of the movable screen 1a to the projection optical system 4 remains constant. As a result, a second virtual image 302 that is viewed as standing upright on the road surface 600 at a certain distance from the user 200 is formed.
  • the control unit 5 causes the drive control unit 51 to control the drive unit 2 to move the movable screen 1 a in the moving direction X in a period in which light is emitted from the irradiation unit 3 to the movable screen 1 a.
  • If the irradiation position of the light from the irradiation unit 3 on one surface of the movable screen 1a, that is, the position of the bright spot, is the same, the visual distance to the virtual image 300 becomes shorter when the movable screen 1a moves in the first direction X1 and longer when the movable screen 1a moves in the second direction X2. That is, the viewing distance to the virtual image 300 changes with the position of the movable screen 1a in the moving direction X.
  • the control unit 5 moves the movable screen 1a along the moving direction X according to the visual distance, and fixes the movable screen 1a at the position after movement. In this state, light is scanned on one surface of the movable screen 1a.
  • The control unit 5 can also move the movable screen 1a along the moving direction X according to the viewing distance, and then scan light on one surface of the movable screen 1a while moving the movable screen 1a so that the distance along the moving direction X from the bright spot to the projection optical system 4 remains constant, based on the position after the movement.
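  • As an illustration of the relationship just described, the sketch below converts a desired viewing distance into a movable-screen position. The linear-in-inverse-distance mapping, the travel range, and the distance limits are all placeholder assumptions; the patent only states that the viewing distance shortens when the movable screen 1a moves in the first direction X1 and lengthens when it moves in the second direction X2.

```python
def screen_position_for_viewing_distance(viewing_distance_m: float,
                                         travel_mm: float = 5.0,
                                         d_min_m: float = 2.0,
                                         d_max_m: float = 100.0) -> float:
    """Return a position of the movable screen 1a along the moving direction X, measured from
    the end used for the longest viewing distance toward the projection optical system 4
    (the first direction X1). Placeholder model; the real optics are not disclosed in this form."""
    d = max(d_min_m, min(d_max_m, viewing_distance_m))
    # Normalise 1/d so that d_max_m maps to 0 (far end) and d_min_m maps to travel_mm (near end).
    t = (1.0 / d - 1.0 / d_max_m) / (1.0 / d_min_m - 1.0 / d_max_m)
    return t * travel_mm

# Example: a virtual image to be seen 10 m ahead -> screen displaced part-way along its travel.
print(screen_position_for_viewing_distance(10.0))  # ~0.92 mm under these placeholder values
```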
  • control unit 5 controls the irradiating unit 3 so that the fixed screen 1 b is irradiated with light from the irradiating unit 3.
  • Light that scans one surface of the fixed screen 1 b is emitted from the irradiation unit 3 to the fixed screen 1 b.
  • an image is formed (projected) on the fixed screen 1 b and the image is projected on the windshield 101.
  • the user 200 can visually recognize the third virtual image 303 projected to the front of the automobile 100 through the windshield 101.
  • The virtual image formed by the light transmitted through the movable screen 1a is the second virtual image 302, and the virtual image formed by the light transmitted through the fixed screen 1b is the third virtual image 303.
  • Since the third virtual image 303 is formed by light projected onto the fixed screen 1b, whose position is fixed, the third virtual image 303 is visually recognized as standing upright on the road surface 600 at a predetermined distance (for example, 2 to 3 m) from the user 200.
  • all of the first virtual image 301, the second virtual image 302, and the third virtual image 303 can be projected during one cycle in which the scanning unit 32 reciprocates in the vertical direction of the movable screen 1a.
  • The vertical (longitudinal) direction of the movable screen 1a is, as an example, a direction along the movable screen 1a that is inclined with respect to the reference surface 503.
  • In one scanning direction, the projection unit 40 first irradiates the movable screen 1a with light to project the first virtual image 301, and then irradiates the fixed screen 1b with light to display the third virtual image 303.
  • In the opposite scanning direction, the projection unit 40 first irradiates the fixed screen 1b with light to display the third virtual image 303, and then irradiates the movable screen 1a with light to project the second virtual image 302.
  • the first virtual image 301, the third virtual image 303, and the second virtual image 302 are projected on the target space 400 during one cycle in which the scanning unit 32 scans in the vertical direction. Since the scanning in the vertical direction in the irradiation unit 3 is performed at relatively high speed, the user 200 visually recognizes that the first virtual image 301, the third virtual image 303, and the second virtual image 302 are simultaneously displayed.
  • the frequency of scanning in the vertical direction in the irradiation unit 3 is, for example, 60 Hz or more.
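  • Assuming that the two orderings described above correspond to the two directions of the reciprocating scan, one display cycle can be sketched as follows; the draw functions and the loop are hypothetical stand-ins for the irradiation unit 3 and its control, not the disclosed implementation.

```python
import time

SCAN_FREQUENCY_HZ = 60.0  # vertical scanning frequency of the irradiation unit 3 (60 Hz or more)

def render_one_cycle(draw_on_movable_screen, draw_on_fixed_screen):
    """One reciprocating vertical scan: first virtual image 301 then third virtual image 303
    in one pass, third virtual image 303 then second virtual image 302 in the opposite pass."""
    draw_on_movable_screen("first_virtual_image_301")
    draw_on_fixed_screen("third_virtual_image_303")
    draw_on_fixed_screen("third_virtual_image_303")
    draw_on_movable_screen("second_virtual_image_302")

def run(draw_on_movable_screen, draw_on_fixed_screen, cycles: int = 1):
    for _ in range(cycles):
        start = time.monotonic()
        render_one_cycle(draw_on_movable_screen, draw_on_fixed_screen)
        # Each cycle is at most 1/60 s, so the three images appear to be displayed simultaneously.
        time.sleep(max(0.0, 1.0 / SCAN_FREQUENCY_HZ - (time.monotonic() - start)))

run(lambda name: print("movable screen 1a:", name),
    lambda name: print("fixed screen 1b:", name))
```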
  • FIG. 4 is a flowchart for explaining the operation of the display system 10.
  • the display control unit 52 of the control unit 5 determines the content of the virtual image 300 to be displayed and the display position of the virtual image 300 based on the driving support information acquired by the acquisition unit 6 (step S1).
  • the display control unit 52 determines whether a shield is present in the projection direction of the virtual image 300 (in front of the user 200) based on the detection result of the detection system 7 (step S2).
  • If it is determined that a shield is present (“Yes” in step S2), the display control unit 52 acquires, from the driving support information acquired by the acquisition unit 6, the distance to the shield determined by the detection system 7 (step S3).
  • the display control unit 52 determines whether the distance to the shield is equal to or less than the visual distance to the virtual image 300 (step S4).
  • If it is determined that the distance to the shield is equal to or less than the visual distance of the virtual image 300 (“Yes” in step S4), the display control unit 52 determines, based on the notification received from the detection system 7, whether the distance to the shield is equal to or greater than a threshold (for example, 100 m) (step S5).
  • If it is determined that the distance to the shield is less than the threshold (“No” in step S5), the display control unit 52 changes the display mode of the virtual image 300 to be displayed, and causes the drive control unit 51 to display the virtual image 300 in the changed display mode (step S6).
  • the display control unit 52 When it is determined that the shield does not exist (“No” in step S2), the display control unit 52 causes the drive control unit 51 to display the virtual image 300 without changing the display mode of the virtual image 300 (step S7). Similarly, when it is determined that the distance to the shield is not equal to or less than the visual distance of the virtual image 300 (“No” in step S4), the display control unit 52 does not change the display mode of the virtual image 300. The virtual image 300 is displayed (step S7).
  • Likewise, when it is determined that the distance to the shield is equal to or greater than the threshold (“Yes” in step S5), the display control unit 52 causes the virtual image 300 to be displayed without changing its display mode (step S7). “The distance to the shield is equal to or greater than the threshold” means that the shield is at a first position that is closer than the position at the viewing distance, or is farther than the first position.
  • The processing of steps S1 to S7 described above is repeatedly performed in the cycle (for example, 1/60 second) in which the virtual image 300 is displayed.
  • The distance to the shield and the visual distance are compared because the problem is that the virtual image 300 and the shield overlap as seen by the user 200. Therefore, the distance to the shield means the distance from the eye point of the user 200 to the shield.
  • the eye point changes somewhat depending on the person, but the fluctuation range of the distance between the laser radar 72 fixed to the automobile 100 on which the user 200 rides and the eye point is smaller than the distance from the laser radar 72 to the shield. Therefore, the distance from the eye point of the user 200 to the shield can be estimated by adding a constant distance value to the distance from the laser radar 72 to the shield.
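  • A compact sketch of the decision flow of FIG. 4 (steps S2 to S5), including the eye-point correction just described; the function name, the 2 m eye-point offset, and the example values are assumptions for illustration.

```python
DISTANCE_THRESHOLD_M = 100.0  # example threshold used in step S5
EYE_POINT_OFFSET_M = 2.0      # assumed constant distance added from the laser radar 72 to the eye point

def should_change_display_mode(shield_present: bool,
                               radar_distance_to_shield_m: float,
                               rear_end_viewing_distance_m: float,
                               front_end_viewing_distance_m: float) -> bool:
    """Steps S2 to S5 of FIG. 4: return True only when the display mode should be changed
    (step S6); in every other branch the virtual image 300 is displayed unchanged (step S7)."""
    if not shield_present:                                       # step S2: no shield present
        return False
    # Step S3: estimate the distance from the eye point of the user 200 to the shield.
    distance = radar_distance_to_shield_m + EYE_POINT_OFFSET_M
    # Step S4: the shield must lie within the visual distance of the virtual image 300.
    if not (rear_end_viewing_distance_m <= distance <= front_end_viewing_distance_m):
        return False
    # Step S5: a shield at or beyond the threshold causes little discomfort -> no change.
    return distance < DISTANCE_THRESHOLD_M

# A car 30 m ahead overlapping an arrow drawn between 20 m and 50 m -> change the display mode.
print(should_change_display_mode(True, 30.0, 20.0, 50.0))  # True
```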
  • FIG. 5A is a conceptual view showing an example of projecting the first virtual image 301
  • FIG. 5B is a conceptual view showing the first virtual image 301 of FIG. 5A viewed from the driver's seat.
  • FIG. 5C is a conceptual view showing an example of the case where the projected first virtual image 301 overlaps the shield
  • FIG. 5D is a conceptual view showing the first virtual image 301 of FIG. 5C viewed from the driver's seat.
  • the display control unit 52 determines to display the straight advancing arrow 310 as the first virtual image 301 on the first virtual surface 501 as shown in FIGS. 5A and 5C based on the driving support information.
  • When the straight advance arrow 310 is projected on the first virtual surface 501, as shown in FIGS. 5B and 5D, the user 200 can visually recognize the straight advance arrow 310 as the first virtual image 301 having a depth along the road surface 600.
  • the display control unit 52 changes the display mode of the straight arrow 310 which is the first virtual image 301.
  • FIG. 6A is a conceptual diagram showing an example in the case where the display mode of a part of the first virtual image 301 is changed when the first virtual image 301 overlaps the shield.
  • Of the entire straight advance arrow 310, the display control unit 52 changes the display mode so as to make the portion 311 overlapping the automobile 150 less conspicuous than the portion 312 not overlapping it.
  • For the portion 311 of the straight advance arrow 310 that overlaps the automobile 150, the display control unit 52 performs at least one of lowering the luminance, changing the color, changing the thickness of the line, and blanking (whiting) out the portion.
  • In other words, the display control unit 52 changes the display mode of the overlapping portion, which is the part of the first virtual image 301 that overlaps the shield (the automobile 150), so that the overlapping portion has lower visibility than the non-overlapping portion. Put differently, the display control unit 52 changes the display mode of the first virtual image 301 while leaving the original display mode for the non-overlapping portion.
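  • A sketch of de-emphasizing only the overlapping portion, assuming the arrow is rendered as a list of segments each with its own viewing distance and RGBA colour; all names and the segment representation are illustrative assumptions, and lowering the luminance is only one of the options named above.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ArrowSegment:
    viewing_distance_m: float                        # distance at which this part of the arrow appears
    color_rgba: Tuple[float, float, float, float]    # components in the range 0.0 - 1.0

def deemphasize_overlap(segments: List[ArrowSegment],
                        distance_to_shield_m: float,
                        luminance_factor: float = 0.3) -> List[ArrowSegment]:
    """Lower the visibility of the segments that would appear at or beyond the shield
    (the overlapping portion 311) while leaving the nearer portion 312 unchanged."""
    result = []
    for seg in segments:
        if seg.viewing_distance_m >= distance_to_shield_m:       # overlapping portion 311
            r, g, b, a = seg.color_rgba
            seg = ArrowSegment(seg.viewing_distance_m,
                               (r * luminance_factor, g * luminance_factor, b * luminance_factor, a))
        result.append(seg)
    return result
```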
  • FIG. 6B is a conceptual diagram showing an example of the display of the first virtual image 301 when the distance to the shield is equal to or greater than the threshold.
  • the display control unit 52 does not change the display mode of the straight arrow 310 when the distance to the vehicle 150 is equal to or greater than the threshold even if the distance to the shield such as the vehicle 150 is equal to or less than the viewing distance. This is because, even if the distal end portion (the portion with the longest viewing distance) of the straight arrow 310 overlaps the car 150, the user 200 feels less uncomfortable than when the car 150 is nearby.
  • the display control unit 52 changes the display mode of the straight arrow 310 between the portion 311 overlapping with the automobile 150 and the portion 312 not overlapping with the vehicle 150, but the present invention is not limited to this configuration.
  • the display control unit 52 may change the display mode of the first virtual image 301 so that the first virtual image 301 (straight arrow 310) does not overlap the shield.
  • the display mode of the first virtual image 301 may be changed such that the visual distance of the first virtual image 301 is shorter than the distance to the shield. That is, the display mode of the first virtual image 301 may be changed such that the first virtual image 301 is projected to the front of the shield.
  • the display mode of the first virtual image 301 may be changed such that the first virtual image 301 is projected behind the shield.
  • the first virtual image 301 may be divided, and the display mode of the first virtual image 301 may be changed such that the first virtual image 301 is projected to both the front and back of the shield.
  • the straight arrow 310 as the first virtual image 301 is an example, and another display content may be used as the first virtual image 301.
  • As described above, when the first virtual image 301 overlaps the shield, the display control unit 52 changes the display mode of the first virtual image 301. Accordingly, the display system 10 can reduce the discomfort of the user 200 caused by the shield and the first virtual image 301 having depth appearing to overlap.
  • The example has been described above in which the display system 10 does not change the display mode of the first virtual image 301 when the distance to the shield is equal to or greater than the threshold, even if the distance to the shield is equal to or less than the viewing distance; however, the display system 10 is not limited to this configuration.
  • the display system 10 may always change the display mode of the first virtual image 301 regardless of the distance to the shield if the distance to the shield is equal to or less than the viewing distance. That is, step S5 in FIG. 4 is not essential.
  • In some cases, of the first virtual image 301, both end portions in the direction along the road surface 600 do not overlap the shield and only the central portion overlaps it.
  • In such a case, the two end portions of the first virtual image 301 that do not overlap the shield may be given different display modes, or the same display mode.
  • FIG. 7 is a conceptual diagram for explaining the change of the display mode in the modification A.
  • the display control unit 52 determines to display the straight arrow 310 as the first virtual image 301 on the first virtual surface 501 based on the driving support information.
  • the display control unit 52 changes the display mode so as to display on the second virtual surface 502 instead of displaying the straight arrow 310 on the first virtual surface 501 as shown in FIG. 7. That is, the display control unit 52 changes the display mode so as to display the straight advancing arrow 310 not as the first virtual image 301 but as the second virtual image 302.
  • the drive control unit 51 controls the projection unit 40 such that the straight arrow 315 as the second virtual image 302 is projected on the second virtual surface 502 instead of the straight arrow 310 as the first virtual image 301.
  • The second virtual image 302, when viewed from the user 200, is a virtual image whose viewing distance is substantially equal on the upper end side and the lower end side in the vertical direction (the vertical direction in FIG. 2). In other words, since the second virtual image 302 is displayed substantially vertically, it has no depth along the road surface 600.
  • the straight arrow 310 as the first virtual image 301 is an example, and another display content may be used as the first virtual image 301.
  • the display mode is changed so as to display the straight arrow 310 as the second virtual image 302, but the display mode may be changed so as to be displayed as the third virtual image 303.
  • In the following modification, the situation in front of the vehicle 100 is also displayed as a virtual image.
  • FIG. 8A and FIG. 8B are conceptual diagrams for explaining the change of the display mode in the modified example B, respectively.
  • The display control unit 52 determines to display a right turn arrow or a left turn arrow as the first virtual image 301 on the first virtual surface 501 based on the driving support information. In the following, the right turn arrow is used as an example.
  • the right turn arrow overlaps the automobile 150.
  • the display control unit 52 changes the display mode so as to display on the second virtual surface 502 instead of displaying the right turn arrow on the first virtual surface 501 as in the modification A.
  • Since the display system 10 obtains information on the surroundings of the vehicle 100, such as pedestrian information and forward vehicle information, through the acquisition unit 6, a virtual image representing the situation in front of the vehicle 100 based on this information is also displayed on the second virtual surface 502.
  • That is, the drive control unit 51 controls the projection unit 40 such that a virtual image representing the right turn arrow and the situation in front of the vehicle 100 is displayed on the second virtual surface 502.
  • The virtual image 320 does not show the situation in front of the automobile 100 as the user 200 sees it, but as viewed from above the user 200. That is, the virtual image 320 represents the situation when looking obliquely forward from above the automobile 100. This makes it possible to display the situation as the second virtual image 302.
  • FIG. 8A shows an example in which a virtual image 320 as a second virtual image 302 composed of a right turn arrow and information representing the situation in front of the automobile 100 is displayed on the second virtual surface 502. That is, FIG. 8A shows an example in which a virtual image 320 as the second virtual image 302 showing information representing the right turn arrow and the automobile 150 is displayed on the second virtual surface 502.
  • the virtual image 320 includes a right turn arrow 321 determined to be displayed based on the driving support information, and a virtual image 322 representing the automobile 150 as a shield.
  • FIG. 8B shows an example in which a virtual image 330 as a second virtual image 302 different from that of FIG. 8A is displayed on the second virtual surface 502.
  • The virtual image 330 includes a right turn arrow 331 determined to be displayed based on the driving support information, a virtual image 332 representing the car 150 as a shield, and a virtual image 333 representing a person present in front of the car 150.
  • the user 200 can know that the vehicle 150 is present in front of the vehicle 100 and that a person is present in front of the vehicle 150. Thereby, the user 200 can know an object (person, car, etc.) hidden by the shield.
  • the right-turn arrow as the first virtual image 301 is an example, and another display content may be displayed as the first virtual image 301.
  • In this manner, the display system 10 converts the first virtual image 301 to be displayed into another virtual image (the second virtual image 302) corresponding to the content of the first virtual image 301, by combining it with a virtual image representing the shield and its surroundings.
  • FIG. 9A and FIG. 9B are conceptual diagrams for explaining the change of the display mode in the modified example C, respectively.
  • the display control unit 52 determines to display the right turn arrow as the first virtual image 301 on the first virtual surface 501 based on the driving support information.
  • the right turn arrow overlaps the automobile 150.
  • the display control unit 52 changes the right turn arrow into a second virtual image 302 which is another virtual image representing a right turn by combining with the automobile 150 which is a shield.
  • the display control unit 52 changes the display mode of the right turn arrow.
  • the display control unit 52 changes the right turn arrow (virtual image) into another circular virtual image 340 superimposed on the position of the right blinker lamp of the automobile 150 ahead.
  • the virtual image 340 is a type of the second virtual image 302.
  • the display control unit 52 obtains the position of the right blinker lamp of the automobile 150 from the image captured by the imaging device 71.
  • the display control unit 52 obtains the projection position of the virtual image 340 on the second virtual surface 502 so that the virtual image 340 overlaps the obtained position.
  • the drive control unit 51 controls the projection unit 40 such that the virtual image 340 is projected on the second virtual surface 502 at the display position.
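  • A rough sketch of this conversion, under the simplifying (and purely assumed) model that the camera image of the imaging device 71 and the second virtual surface 502 cover the same field of view; the patent does not describe the actual mapping, so the function and values below are hypothetical.

```python
def lamp_pixel_to_projection_position(lamp_px, image_size, surface_width_m, surface_height_m):
    """Map the pixel coordinate of the detected right blinker lamp (from the imaging device 71)
    to a position on the second virtual surface 502, assuming both cover the same field of view."""
    u, v = lamp_px
    w, h = image_size
    x = (u / w - 0.5) * surface_width_m    # horizontal offset from the centre of the surface
    y = (0.5 - v / h) * surface_height_m   # vertical offset (image y axis points downward)
    return (x, y)

# Lamp detected at pixel (1200, 500) in a 1920x1080 image, on an assumed 4 m x 2 m virtual surface.
print(lamp_pixel_to_projection_position((1200, 500), (1920, 1080), 4.0, 2.0))  # (0.5, ~0.07)
```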
  • the display system 10 superimposes the virtual image 340 on the position of the right blinker lamp of the front car 150, instead of displaying the right turn arrow.
  • the display system 10 can prompt the user 200 to make a right turn. That is, in the present modification, the display system 10 uses the automobile 150 as a canvas instead, and informs the user 200 of the content corresponding to the right turn arrow by combining the automobile 150 and the virtual image 340.
  • In another example, it is decided to display, on the first virtual surface 501, a virtual image calling attention to the presence of a person (a caution virtual image).
  • the attention virtual image overlaps the automobile 150.
  • the display control unit 52 changes the virtual attention image into another virtual image that is combined with the shielding vehicle 150 to prompt attention to the presence of a person.
  • the display control unit 52 changes the display mode of the virtual attention image.
  • the display control unit 52 changes the attention virtual image into a pair of virtual images 341 superimposed on the position of the brake lamp (tail lamp) of the automobile 150 ahead.
  • the display control unit 52 obtains the positions of the pair of brake lamps of the automobile 150 from the image captured by the imaging device 71.
  • the display control unit 52 determines the projection positions of the pair of virtual images 341 on the second virtual surface 502 so that the pair of virtual images 341 respectively overlap the determined pair of positions.
  • the drive control unit 51 controls the projection unit 40 such that each of the pair of virtual images 341 is projected on the second virtual surface 502 at the display position.
  • the display system 10 superimposes the virtual image 341 on the position of the brake lamp which is a portion of the car 150 ahead, instead of displaying the attention virtual image.
  • the display system 10 can alert the user 200. That is, the display system 10 can use the automobile 150 as a canvas instead, and in the combination of the automobile 150 and the virtual image 341, can prompt the user 200 to pay attention to the presence of a person.
  • When displaying the second virtual image 302 whose display mode has been changed from the first virtual image 301, the display control unit 52 determines the display position so that the visual distance of the second virtual image 302 matches the distance to the shield such as the automobile 150.
  • the user 200 can visually recognize the second virtual image 302 without a sense of discomfort as compared with the case where the visual distance of the second virtual image 302 does not match the distance of the shield.
  • the color of the virtual image superimposed on the position of the brake lamp may be changed according to the distance from the automobile 100 to a person.
  • the brightness may be increased as the distance from the automobile 100 to a person decreases.
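  • These distance-dependent variations could look like the following sketch; the colour breakpoints and the brightness law are assumptions, since the text only says that the colour may change and the brightness may increase as the person gets closer.

```python
def lamp_overlay_style(distance_to_person_m: float) -> dict:
    """Return a colour and brightness for the virtual image superimposed on the brake lamp,
    becoming more conspicuous as the person ahead gets closer (illustrative values only)."""
    if distance_to_person_m < 20.0:
        color = "red"        # imminent: strongest warning colour
    elif distance_to_person_m < 50.0:
        color = "yellow"     # caution
    else:
        color = "orange"     # mild notice
    # Brightness rises as the distance shrinks, clamped to the range [0.2, 1.0].
    brightness = max(0.2, min(1.0, 20.0 / max(distance_to_person_m, 1.0)))
    return {"color": color, "brightness": brightness}

print(lamp_overlay_style(15.0))   # {'color': 'red', 'brightness': 1.0}
print(lamp_overlay_style(80.0))   # {'color': 'orange', 'brightness': 0.25}
```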
  • the change of the display mode in such a case will be described with reference to FIGS. 10A to 13B.
  • FIGS. 10A to 13B are conceptual diagrams for explaining the change of the display mode in the modification D.
  • the display system 10 displays the straight arrow 350 as the first virtual image 301 along the road surface 600 so that the arrow appears to have depth.
  • the display system 10 displays the straight arrow 350 in blue (a color representing safety).
  • the display control unit 52 determines, based on the driving support information, in particular the information on the surroundings of the automobile 100, that the automobile 150 is changing lanes into the lane in which the automobile 100 is traveling.
  • the display control unit 52 then changes the color of the blue straight arrow 350 to a color representing caution, such as yellow.
  • the drive control unit 51 controls the projection unit 40 such that the straight arrow 350 is displayed in the changed display color.
  • when the display control unit 52 recognizes that the further movement of the automobile 150 creates a portion overlapping the straight arrow 350, it treats the automobile 150 as a shield and changes the display mode so that the portion overlapping the automobile 150 is not displayed. Furthermore, as shown in FIG. 11B, the display control unit 52 gradually deletes the portion of the straight arrow 350 that overlaps the automobile 150 as the automobile 150 moves.
  • for example, the luminance is gradually lowered so that the portion 350b becomes less noticeable than the portion 350a.
  • the display control unit 52 recognizes that the lane change of the automobile 150 is complete when the lateral movement of the automobile 150 has stabilized, that is, when the lateral movement stays within a predetermined range.
  • at this point, the straight arrow 350 is displayed with its leading end portion deleted.
  • the display control unit 52 decides to display, in addition to the straight arrow 350 with its tip deleted, a virtual image 355 whose display content includes a straight arrow and a message indicating straight travel, as the second virtual image 302.
  • the drive control unit 51 controls the projection unit 40 to display the straight arrow 350 whose front end portion is deleted on the first virtual surface 501 and display the virtual image 355 on the second virtual surface 502.
  • the drive control unit 51 controls the projection unit 40 so that when the projection unit 40 projects the virtual image 355, the virtual image 355 is displayed at a position not overlapping the automobile 150 as a shield.
  • the drive control unit 51 may control the projection unit 40 to display the virtual image 355 so that it appears in front of the automobile 150, or may control the projection unit 40 to display the virtual image 355 so that it appears behind the automobile 150.
  • in this way, the display system 10 can display the first virtual image 301 and the second virtual image 302 simultaneously and allow the user 200 to visually recognize both.
  • FIG. 12B shows an example in which a right turn arrow 351 with its front end deleted is displayed as the first virtual image 301, and a virtual image 356 whose display content includes a right turn arrow and a message indicating a right turn is displayed as the second virtual image 302.
  • the drive control unit 51 may control the projection unit 40 to display the virtual image 356 so that it appears in front of the automobile 150, or may control the projection unit 40 to display the virtual image 356 so that it appears behind the automobile 150.
  • the drive control unit 51 may control the projection unit 40 so that the virtual image 356 is displayed with the actual right-turn position as the reference point of the display.
  • the drive control unit 51 may control the projection unit 40 to display the virtual image 356 so that it becomes larger as the automobile 100 approaches the right-turn position.
  • the straight arrow 350 and the right turn arrow 351 as the first virtual image 301 are an example, and other display contents may be displayed as the first virtual image 301.
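  • The sketch below illustrates the gradual deletion described for FIG. 11B and the lane-change completion check; the overlap ratio per arrow segment, the tolerance value and the function names are assumptions made for clarity.

```python
def arrow_segment_luminance(overlap_ratio: float, base_luminance: float) -> float:
    """Luminance of one segment of the straight arrow 350: the more the segment
    is covered by the automobile 150 (0.0 .. 1.0), the dimmer it is drawn,
    down to zero, i.e. effectively deleted."""
    return base_luminance * (1.0 - overlap_ratio)

def lane_change_completed(recent_lateral_offsets_m, tolerance_m: float = 0.2) -> bool:
    """Treat the lane change of the automobile 150 as complete when its recent
    lateral movement stays within a predetermined range (the tolerance is a
    placeholder; the sample list is assumed to be non-empty)."""
    return max(recent_lateral_offsets_m) - min(recent_lateral_offsets_m) <= tolerance_m
```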
  • the display control unit 52 determines to display the straight arrow 360 on the first virtual surface 501 based on the driving support information.
  • when the straight arrow 360 does not overlap the automobile 150, which is the shield, the straight arrow 360 is displayed as the first virtual image 301 as shown in FIG. 13A.
  • on the other hand, when the automobile 150 overlaps the straight arrow 360, the display control unit 52 displays the portion 361 of the straight arrow 360 that does not overlap the automobile 150 as the first virtual image 301.
  • at the same time, the display control unit 52 changes the display mode of the straight arrow 360 so that a virtual image 362 corresponding to the portion of the straight arrow 360 that overlaps the automobile 150 is displayed as the second virtual image 302.
  • the drive control unit 51 controls the projection unit 40 so that the portion 361 not overlapping the automobile 150 is displayed on the first virtual surface 501, and the virtual image 362 is displayed on the second virtual surface 502 at a position that does not overlap the automobile 150 and is continuous with the portion 361.
  • in this way, the display system 10 compensates for the portion of the first virtual image 301 that is deleted because of the shield by using the second virtual image 302, so that the user 200 can visually recognize the deleted portion.
  • the straight arrow 360 as the first virtual image 301 is an example, and another display content may be used as the first virtual image 301.
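  • A simplified sketch of splitting the arrow between the two virtual surfaces follows; the one-dimensional forward-distance geometry and the function name are assumptions, not the patent's implementation.

```python
def split_arrow(arrow_start_y: float, arrow_end_y: float, shield_near_edge_y: float):
    """Split the straight arrow 360 at the near edge of the shield.

    Coordinates are forward distances from the automobile 100 (a simplifying
    assumption). Returns (portion_361_range, virtual_362_range) as (start, end)
    tuples; a range is None when that part is empty."""
    if arrow_end_y <= shield_near_edge_y:
        return (arrow_start_y, arrow_end_y), None        # no overlap: all on surface 501
    if arrow_start_y >= shield_near_edge_y:
        return None, (arrow_start_y, arrow_end_y)        # fully hidden: all on surface 502
    # Partial overlap: cut the arrow at the shield's near edge.
    return (arrow_start_y, shield_near_edge_y), (shield_near_edge_y, arrow_end_y)
```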
  • FIG. 14A and FIG. 14B are conceptual diagrams for explaining the change of the display mode in the modification E.
  • the display control unit 52 illustrated in FIG. 3 may decide, based on the driving support information, to display a virtual image representing that a lane change is prohibited as the first virtual image 301.
  • when the display control unit 52 determines that the first virtual image 301 to be displayed overlaps a shield such as the automobile 150, it changes the display mode so that another virtual image representing that a lane change is prohibited is displayed at a position not overlapping the automobile 150. For example, as shown in FIG. 14A, the display control unit 52 changes the display mode so that a virtual image 370 as the first virtual image 301 is displayed on both sides of the automobile 150.
  • the virtual image 370 includes a plurality of quadrangular figures having a color representing danger such as red.
  • the drive control unit 51 controls the projection unit 40 so that the virtual image 370 is projected as the first virtual image 301 on both sides of the automobile 150. Since the red squares are displayed on both sides of the automobile 150 along the road surface 600, the user 200 knows that a lane change cannot be made.
  • alternatively, the display control unit 52 may determine that a lane change is prohibited on the left side with respect to the traveling direction but is permitted on the right side. In this case, when the display control unit 52 determines that the virtual image to be displayed as the first virtual image 301 overlaps the automobile 150, it changes the display mode so that different virtual images, one on the left representing that a lane change is prohibited and one on the right representing that a lane change is permitted, are displayed at positions not overlapping the automobile 150.
  • the display control unit 52 changes the display mode so that the virtual image 370 as the first virtual image 301, consisting of a plurality of red square figures, is displayed on the left side of the automobile 150.
  • the display mode is also changed so that a virtual image 372 as the first virtual image 301, consisting of a plurality of blue square figures, is displayed on the right side of the automobile 150.
  • the drive control unit 51 controls the projection unit 40 such that the virtual image 370 is displayed on the left side of the automobile 150 and the virtual image 372 is displayed on the right side as the first virtual image 301.
  • red squares are displayed on the left side of the automobile 150, and blue squares are displayed on the right side.
  • the user 200 knows that lane changes can be made on the right side.
  • the lane change indication is merely an example of content to be displayed, and other content may be used.
  • the figure displayed as the first virtual image 301 at a position not overlapping the shield is not limited to a quadrangle and may have another shape.
  • furthermore, the virtual image displayed as the first virtual image 301 at a position not overlapping the shield is not limited to a figure and may be a character, a symbol, or the like.
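  • A minimal sketch of the side indicators of modification E, assuming a red/blue colour convention and RGB tuples that are illustrative only:

```python
def side_indicators(left_change_allowed: bool, right_change_allowed: bool):
    """Return a mapping of side -> display colour for the square figures shown
    beside the automobile 150 (red = prohibited, blue = permitted)."""
    red, blue = (255, 0, 0), (0, 0, 255)
    return {
        "left": blue if left_change_allowed else red,
        "right": blue if right_change_allowed else red,
    }

# e.g. side_indicators(False, True) -> {"left": (255, 0, 0), "right": (0, 0, 255)}
```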
  • FIGS. 15A and 15B are conceptual diagrams for explaining the change of the display mode in the modification F.
  • the display control unit 52 illustrated in FIG. 3 may determine to display the speed of the vehicle 100 as the second virtual image 302 as illustrated in FIG. 15A.
  • when the second virtual image 302 does not overlap a shield such as the automobile 150, the display control unit 52 displays the second virtual image 302 representing the speed in front of the automobile 150.
  • when the second virtual image 302 would overlap the shield, the display control unit 52 changes the display mode from the second virtual image 302 to the third virtual image 303, as shown in FIG. 15B.
  • the drive control unit 51 controls the projection unit 40 such that the vehicle speed information is displayed as the third virtual image 303.
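  • One possible reading of modification F, expressed as a tiny sketch; the overlap test itself and the string labels are assumptions:

```python
def speed_display_target(overlaps_shield: bool) -> str:
    """Pick which virtual image carries the vehicle-speed readout."""
    return "third_virtual_image_303" if overlaps_shield else "second_virtual_image_302"
```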
  • the display system 10 can display content based on the navigation information as described above. For example, in a general display system, information for guiding the user (arrows indicating the direction in which to proceed) is displayed in the display area C4 surrounded by the broken lines C1 to C3 shown in FIG. 16A, indicating in which direction (straight, right, left, etc.) the automobile 100 should go toward the destination. In this case, the user 200 may feel uneasy because the user does not know in which direction the destination lies from the current point.
  • the display system 10 of the present modified example sets a reference point A1 at a fixed distance in the projection direction of the virtual image 300.
  • the reference point A1 is located further forward than the broken line C3.
  • for example, the reference point A1 is set 50 m ahead of the automobile 100 in the projection direction of the virtual image 300.
  • the display system 10 displays information representing the direction of the destination as a virtual image 300 with the set reference point A1 as a start point.
  • FIG. 16A is a conceptual diagram of the positions used to explain the change of the display mode in the modification G.
  • the display control unit 52 obtains the direction from the reference point A1 to the point B1.
  • the drive control unit 51 controls the projection unit 40 to display the direction determined by the display control unit 52 as the first virtual image 301.
  • FIG. 16B shows an example showing the direction from the reference point A1 to the point B1 when the point B1 is the destination.
  • the virtual image 380 as the first virtual image 301 shown in FIG. 16B includes a virtual image 381 representing the reference point A1 and a virtual image 382 representing the direction from the reference point A1 toward the point B1.
  • FIG. 16C shows an example showing the direction from the reference point A1 to the point B2 when the point B2 is the destination.
  • the virtual image 385 as the first virtual image 301 shown in FIG. 16C includes a virtual image 386 representing the reference point A1 and a virtual image 387 representing the direction from the reference point A1 to the point B2.
  • FIG. 16D shows an example showing the direction from the reference point A1 to the point B3 when the point B3 is the destination.
  • the virtual image 390 as the first virtual image 301 shown in FIG. 16D includes a virtual image 391 representing the reference point A1 and a virtual image 392 representing the direction from the reference point A1 to the point B3.
  • FIG. 16E shows an example showing the direction from the reference point A1 to the point B4 when the point B4 is a destination.
  • the virtual image 395 as the first virtual image 301 shown in FIG. 16E includes a virtual image 396 representing the reference point A1 and a virtual image 397 representing the direction from the reference point A1 to the point B4. In FIG. 16E, an arrow is not displayed on the virtual image 397.
  • the display color of the first virtual image 301 may be changed according to the distance and direction from the reference point A1 to the destination. For example, when the destination is located farther than the reference point A1 or in the vicinity of the reference point A1, the display color of the first virtual image 301 is set to blue. When the destination exists between the automobile 100 and the reference point A1, the display color of the first virtual image 301 is set to yellow. When the destination is closer than the broken line C2, the display color of the first virtual image 301 is set to red. By changing the display color in this manner, the user 200 can more intuitively grasp the sense of distance to the destination and the direction of the destination. The colors are not limited to the above example.
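  • A sketch of modification G under assumptions: planar coordinates in metres, the reference point A1 fixed ahead of the automobile 100, and placeholder band edges for the colour rule just described.

```python
import math

def guidance_from_reference(ref_xy, dest_xy, dist_to_c2_m: float = 20.0,
                            dist_to_ref_m: float = 50.0):
    """Return (bearing_deg, colour) for the first virtual image 301.

    ref_xy, dest_xy: (x, y) in metres, x lateral and y forward from the
    automobile 100; bearing 0 = straight ahead, positive = to the right.
    The distance bands and their edges are placeholders."""
    dx, dy = dest_xy[0] - ref_xy[0], dest_xy[1] - ref_xy[1]
    bearing_deg = math.degrees(math.atan2(dx, dy))

    dest_forward_m = dest_xy[1]
    if dest_forward_m >= dist_to_ref_m:      # at, near or beyond the reference point A1
        colour = "blue"
    elif dest_forward_m >= dist_to_c2_m:     # between the automobile 100 and A1
        colour = "yellow"
    else:                                    # closer than the broken line C2
        colour = "red"
    return bearing_deg, colour
```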
  • when the first virtual image 301 is displayed, the first virtual image 301 may overlap a shield. In this case, as described in the embodiment and the modified examples A to F above, the display system 10 changes the display mode of the first virtual image 301.
  • for example, when the set reference point overlaps the shield, the display control unit 52 shown in FIG. 3 sets a new reference point at a position not overlapping the shield, for example a position in front of the shield.
  • the display control unit 52 displays a first virtual image 301 representing the direction from the newly set reference point to the destination.
  • FIG. 17 is a flowchart for explaining the operation of the display system in the modification H.
  • the acquisition unit 6 illustrated in FIG. 3 acquires position information, traveling direction information, vehicle speed information, and the like of the vehicle 100 (see FIG. 1) on which the display system 10 is mounted (step S100).
  • the acquisition unit 6 acquires shield information on a shield from the detection system 7 (step S101).
  • the shield information includes the presence or absence of a shield and the distance to the shield, if any.
  • the acquisition unit 6 acquires the coordinates of the guidance target point (destination) from the navigation information (step S102).
  • the display control unit 52 determines whether the reference point is visible or not (step S103). Specifically, the display control unit 52 determines whether the reference point does not overlap with a shield such as a building.
  • if it is determined that the reference point is visible ("Yes" in step S103), the display control unit 52 calculates the display area of the virtual image 300 to be displayed (step S104).
  • the display control unit 52 determines whether or not the display area is visible (step S105). If it is determined that the display area is visible (“Yes” in step S105), the display control unit 52 obtains the distance and direction from the current point to the guidance target point (step S107).
  • the display control unit 52 determines a display format such as a graphic (display graphic) to be displayed as the virtual image 300, a display color, a transmittance, and the like according to the distance and direction to the guidance target point (step S108).
  • the display control unit 52 specifies the distance and the direction from the reference point to the guidance target point from the coordinates of the reference point and the coordinates of the guidance target point.
  • the display control unit 52 determines, for example, a display graphic indicating the direction from the reference point to the guidance target point as a graphic shown in any of FIGS. 16B to 16E. Further, the display control unit 52 determines the display color and the transmittance of the reference point, and determines the display color and the transmittance of the display graphic according to the distance and the direction. For example, as described in modification G, the display control unit 52 determines the display color.
  • the drive control unit 51 performs display processing so as to project the first virtual image 301 in the determined display format (step S109).
  • if the display control unit 52 determines that the reference point is not visible ("No" in step S103), it performs correction processing of the reference point (step S106). Specifically, the display control unit 52 changes the reference point to a position where it can be visually recognized, for example a position in front of the shield. Then, the display control unit 52 performs the processing of steps S104' and S105' to calculate a display area that can be viewed. Steps S104' and S105' are the same processes as steps S104 and S105, respectively. Thereafter, the process proceeds to step S107.
  • if it is determined that the display area is not visible ("No" in step S105), the display control unit 52 also performs correction processing of the reference point (step S106). In this case, since the reference point itself is visible, the display control unit 52 calculates a visible display area again. Thereafter, the process proceeds to step S107.
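  • A condensed sketch of the flow of FIG. 17 (steps S100 to S109) follows; the helper methods are placeholders standing in for the acquisition unit 6, the detection system 7, the display control unit 52 and the drive control unit 51, and only the branching mirrors the description above.

```python
def run_guidance_cycle(ctx):
    """Run one guidance cycle; ctx is a hypothetical object bundling the units
    referenced in the text."""
    state = ctx.acquire_vehicle_state()          # S100: position, heading, vehicle speed
    shields = ctx.acquire_shield_info()          # S101: presence of / distance to shields
    target = ctx.acquire_guidance_target()       # S102: coordinates of the guidance target

    ref = ctx.default_reference_point(state)     # e.g. 50 m ahead in the projection direction
    if not ctx.reference_visible(ref, shields):          # S103 -> "No"
        ref = ctx.correct_reference_point(ref, shields)  # S106: move it in front of the shield

    area = ctx.compute_display_area(ref)         # S104 (or S104')
    if not ctx.area_visible(area, shields):              # S105 -> "No"
        ref = ctx.correct_reference_point(ref, shields)  # S106: the reference point stays visible
        area = ctx.compute_display_area(ref)     # recalculate a visible display area

    dist, direction = ctx.distance_and_direction(ref, target)  # S107
    fmt = ctx.decide_display_format(dist, direction)            # S108: graphic, colour, transmittance
    ctx.project_first_virtual_image(area, fmt)                  # S109
```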
  • FIGS. 18A and 18B are conceptual diagrams for describing changes in the display mode in the modified example H.
  • FIG. 18A shows a case in which the reference point A1 overlaps a shield D1 such as a building.
  • in this case, the first virtual image 301 to be displayed (in particular, the virtual image representing the reference point A1) overlaps the shield D1, making it difficult for the user 200 to visually recognize it. Therefore, as described above, the display control unit 52 sets a new reference point A2 in front of the shield D1.
  • as a result, the first virtual image 301 to be displayed and the shield D1 do not overlap, so the user 200 can easily visually recognize the distance and direction between the new reference point A2 and the destination (for example, points B2 to B4 shown in FIG. 18A).
  • setting a new reference point is not limited to the case where the reference point A1 overlaps the shield D1.
  • the reference point may be changed according to the traveling conditions such as the speed, the shape of the road, and the slope regardless of the presence or absence of the shield.
  • the reference point A1 is changed to the reference point A3.
  • the reference point A3 is located in front of the shield D1 and in a position where it is easily visible. Thereby, the user 200 can visually recognize the reference point A3, and thus can easily visually recognize the distance and direction between the reference point A3 and the destination (for example, points B2 to B4 shown in FIG. 18B).
  • the above embodiment is only one of various embodiments of the present disclosure.
  • the above embodiment can be variously modified according to the design and the like as long as the object of the present disclosure can be achieved.
  • the same function as that of the display system 10 may be embodied by a control method of the display system, a computer program, a non-transitory recording medium recording the program, or the like.
  • the control method of the display system according to the present embodiment is a control method of the display system 10 that includes the projection unit 40, which projects the virtual image 300 into the target space 400 and causes a target person (for example, the user 200) to visually recognize it, and the control unit 5, which controls the display of the virtual image 300. In this control method, when a shield is present in the projection direction of the virtual image and the shield is within the range of the visual distance to the virtual image, the display mode of the virtual image is changed.
  • the program according to the present embodiment is a program for executing the control method of the display system 10 described above in the computer system.
  • the non-transitory recording medium according to the present embodiment stores the program in a computer readable manner.
  • the execution subject of the display system 10 or of the control method of the display system 10 includes a computer system.
  • the computer system mainly includes a processor and memory as hardware.
  • the processor executes the program recorded in the memory of the computer system to implement the function as the execution subject of the display system 10 or the method.
  • the program may be pre-recorded in the memory of the computer system, or may be provided through a telecommunication line. The program may also be provided recorded on a non-transitory recording medium readable by the computer system, such as a memory card, an optical disk, or a hard disk drive.
  • the processor of the computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large-scale integrated circuit (LSI).
  • the plurality of electronic circuits may be integrated into one chip or may be distributed to a plurality of chips.
  • the plurality of chips may be integrated into one device or may be distributed to a plurality of devices.
  • the control unit 5 of the display system 10 may be distributed over a plurality of systems (devices).
  • at least part of the functions of the control unit 5 may be realized by a cloud (cloud computing).
  • the display system 10 may use so-called V2X (Vehicle to Everything) communication technology, in which a vehicle communicates directly with another vehicle (vehicle-to-vehicle) or with infrastructure such as traffic lights and road signs (road-to-vehicle).
  • the V2X communication technology enables the automobile 100 to acquire information from surrounding vehicles or from the infrastructure.
  • the contents of the virtual image 300 to be projected to the target space 400 may be determined by infrastructure, and in this case, at least a part of the control unit 5 may not be mounted on the vehicle 100.
  • the display system 10 is not limited to the configuration in which the virtual image 300 is projected on the target space 400 set in front of the traveling direction of the automobile 100.
  • the display system 10 is applicable not only to the motor vehicle 100 but to mobile devices other than the motor vehicle 100, such as, for example, a two-wheeled vehicle, a train, an aircraft, a construction machine, and a ship.
  • the display system 10 is not limited to mobile body devices and may be used, for example, in an amusement facility, as a wearable terminal such as a head mounted display, in a medical facility, or as a stationary device.
  • the display system 10 is not limited to the configuration that projects a virtual image using laser light.
  • the display system 10 may be configured to project an image (virtual image 300) with a projector from behind a diffuse transmission type screen.
  • alternatively, the virtual image 300 corresponding to an image displayed on a liquid crystal display may be projected through the projection unit 40.
  • although the virtual image 300 before the display mode is changed is described as the first virtual image 301 in the above-described embodiment and each modification, the virtual image 300 before the display mode is changed may instead be the second virtual image 302.
  • the detection system 7 includes the imaging device 71 and the laser radar 72.
  • the detection system 7 may have a function capable of detecting the presence or absence of an attention object in the projection direction of the virtual image 300.
  • control unit 5 of the display system 10 may obtain the distance to the shield.
  • the display system 10 has a projection unit 40 and a control unit 5.
  • the projection unit 40 projects the virtual image 300 on the target space 400.
  • the control unit 5 controls the display of the virtual image 300.
  • when a shield (for example, the automobile 150) is present in the projection direction of the virtual image 300 and the distance to the shield is equal to or less than the visual distance to the virtual image 300, the control unit 5 changes the display mode of the virtual image 300. That is, when the shield exists in the projection direction of the virtual image 300 and the shield is within the range of the visual distance to the virtual image 300, the control unit 5 changes the display mode of the virtual image 300.
  • the control unit 5 may refrain from changing the display mode of the virtual image 300 when the distance to the shield is equal to or greater than a threshold. That is, the control unit 5 does not have to change the display mode of the virtual image 300 when the shield is at a first position that is closer than the visual distance, or at a position farther than the first position.
  • the display system 10 can reduce the processing load by avoiding the change of the display mode of the virtual image 300.
  • the control unit 5 may change the display mode of the virtual image 300 so that the virtual image 300 does not overlap the shield. That is, when the shield is in the range within the visual distance to the virtual image 300, the control unit 5 may change the display mode of the virtual image so that the virtual image 300 does not overlap the shield. According to this configuration, it is possible to reduce the possibility of the subject getting discomfort due to the virtual image 300 overlapping the shield.
  • the control unit 5 may change the display mode of the virtual image 300 so that the virtual image 300 is projected to at least one of the front and the back of the shield. According to this configuration, it is possible to reduce the possibility of the subject getting discomfort due to the virtual image overlapping the shield.
  • the control unit 5 may change the display mode of the virtual image 300 so that the virtual image 300 is projected to the side of the shield. With this configuration as well, the possibility of the subject feeling discomfort due to the virtual image overlapping the shield can be reduced.
  • the control unit 5 may change the display mode of the virtual image 300 so as to display information on the object hidden by the shield. According to this configuration, the display system 10 can alert the user 200 by displaying information on an object hidden behind the shield.
  • the control unit 5 may change the display mode of the virtual image 300 so as to display the virtual image 300 superimposed on a predetermined portion of the shield. According to this configuration, the display system 10 can notify the user 200 of meaningful content by combining the shielding object and the virtual image 300.
  • the control unit 5 may change the display mode of the virtual image 300 according to the distance of the shield. That is, when superimposing the virtual image 300 on a predetermined portion of the shield, the control unit 5 may change the display mode of the virtual image 300 according to the position of the shield. According to this configuration, it is possible to reduce the discomfort due to the combination of the shield and the virtual image 300.
  • the control unit 5 may change the display mode of the virtual image 300 so as to leave a non-overlapping portion which is a part of the virtual image 300 and does not overlap with the shield. According to this configuration, the display system 10 can reduce the possibility of the subject getting discomfort due to the virtual image 300 overlapping the shield.
  • the control unit 5 may change the display mode of the overlapping portion, which is a part of the virtual image 300 and overlaps the shield, so that the overlapping portion has lower visibility than the non-overlapping portion. According to this configuration, it is possible to reduce the possibility of the subject feeling discomfort due to the virtual image 300 overlapping the shield.
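  • The core decision summarised above can be sketched as follows; the optional threshold handling is a parameter here because the text describes it as optional, and the function name and units are illustrative.

```python
from typing import Optional

def should_change_display_mode(shield_distance_m: Optional[float],
                               visual_distance_m: float,
                               no_change_threshold_m: Optional[float] = None) -> bool:
    """Decide whether the display mode of the virtual image 300 must be changed."""
    if shield_distance_m is None:              # no shield in the projection direction
        return False
    if shield_distance_m > visual_distance_m:  # shield farther than the virtual image
        return False
    if no_change_threshold_m is not None and shield_distance_m >= no_change_threshold_m:
        return False                           # optional: shield still far enough, skip the change
    return True
```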
  • the information presentation system 1000 includes a display system 10 and a detection system 7 that detects an obstacle. According to this configuration, the information presentation system 1000 can reduce the possibility of the subject getting discomfort due to the virtual image 300 overlapping the shield.
  • the control method of the display system 10 is a control method of the display system 10 including the projection unit 40 that projects the virtual image 300 onto the target space 400 and the control unit 5 that controls the display of the virtual image 300.
  • in this control method, when a shield is present in the projection direction of the virtual image 300 and the distance to the shield is equal to or less than the visual distance to the virtual image 300, the display mode of the virtual image is changed. That is, in this control method, when the shield is present in the projection direction of the virtual image 300 and the shield is within the range of the visual distance to the virtual image 300, the display mode of the virtual image 300 is changed. According to this control method, it is possible to reduce the possibility of the subject feeling discomfort due to the virtual image 300 overlapping the shield.
  • a program according to the present embodiment causes a computer to execute the control method. According to this program, it is possible to reduce the possibility of the subject getting discomfort due to the virtual image 300 overlapping the shield.
  • the non-transitory recording medium according to the present embodiment stores the program in a readable manner by a computer.
  • the mobile body device (for example, the automobile 100) according to the present embodiment includes the display system 10 and a reflective member that reflects the light emitted by the projection unit 40.
  • the reflective member is light transmissive and reflects the light emitted by the projection unit 40 to make the virtual image 300 visible to the subject (for example, the user 200). According to this configuration, the mobile body device can reduce the possibility of the subject feeling discomfort due to the virtual image 300 overlapping the shield.
  • as described above, the display system, the information presentation system, the control method of the display system, the program, and the recording medium according to the present disclosure are suitable for a mobile body device, and can reduce the possibility that a driver or the like feels a sense of discomfort when the virtual image overlaps a shield.
  • Reference Signs List 1 screen 1a movable screen 1b fixed screen 2 drive unit 3 irradiation unit 4 projection optical system 5 control unit 6 acquisition unit 7 detection system 10 display system 31 light source 32 scanning unit 40 projection unit 41 magnifying lens 42 first mirror 43 second mirror 51 Drive control unit 52 Display control unit 71 Imaging device 72 Laser radar 100, 150 Car 101 Windshield 102 Dashboard 103 Front seat 110 Main body 200 User 220 Drive unit 222 Drive unit 224 Drive source 224 Drive wheel 226 Steering 300, 320, 322, 330, 332, 333, 340, 341, 355, 356, 362, 370, 381, 382, 385, 386, 387, 390, 391, 392, 395, 396, 397 Virtual image 301 First virtual image 302 Second virtual image 303 Third virtual image 310, 315, 350, 360 Straight arrow 311 Overlapping portion 312, 361 Non-overlapping portion 321, 331, 351 Right turn arrow 350a, 350b Portion 400 Target space 500 Optical axis 501 First virtual surface 502 Second virtual surface

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2018/022659 2017-06-30 2018-06-14 表示システム、情報提示システム、表示システムの制御方法、プログラムと記録媒体、及び移動体装置 WO2019003929A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112018003314.7T DE112018003314B4 (de) 2017-06-30 2018-06-14 Anzeigesystem, Informationsdarstellungssystem, Verfahren zum Steuern eines Anzeigesystems, Programm und Aufzeichnungsmedium für ein Anzeigesystem und mobiler Körper

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017129895A JP6883759B2 (ja) 2017-06-30 2017-06-30 表示システム、表示システムの制御方法、プログラム、及び移動体
JP2017-129895 2017-06-30

Publications (1)

Publication Number Publication Date
WO2019003929A1 true WO2019003929A1 (ja) 2019-01-03

Family

ID=64741478

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/022659 WO2019003929A1 (ja) 2017-06-30 2018-06-14 表示システム、情報提示システム、表示システムの制御方法、プログラムと記録媒体、及び移動体装置

Country Status (3)

Country Link
JP (1) JP6883759B2 (de)
DE (1) DE112018003314B4 (de)
WO (1) WO2019003929A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2584609A (en) * 2019-05-03 2020-12-16 Jaguar Land Rover Ltd Controller for a vehicle

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113196377B (zh) * 2018-12-20 2024-04-12 Ns西日本株式会社 显示光射出装置、平视显示装置、图像显示系统及头盔
US20220212689A1 (en) * 2019-05-15 2022-07-07 Nissan Motor Co., Ltd. Display Control Method and Display Control Device
KR102270502B1 (ko) * 2019-10-24 2021-06-30 네이버랩스 주식회사 주행 정보 안내 방법 및 시스템
JP7415516B2 (ja) * 2019-12-11 2024-01-17 株式会社デンソー 表示制御装置
EP4269152A4 (de) * 2020-12-25 2024-01-17 Nissan Motor Co., Ltd. Informationsverarbeitungsvorrichtung und informationsverarbeitungsverfahren
DE102021127551A1 (de) 2021-10-22 2023-04-27 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Kraftfahrzeuge zur virtuellen Anzeige von Kraftfahrzeugteilen
WO2023233447A1 (ja) * 2022-05-30 2023-12-07 マクセル株式会社 仮想オブジェクトを表示する装置およびその表示方法

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004302643A (ja) * 2003-03-28 2004-10-28 Denso Corp 表示方法及び表示装置
JP2005069799A (ja) * 2003-08-22 2005-03-17 Denso Corp 車両用ナビゲーションシステム
JP2013120574A (ja) * 2011-12-08 2013-06-17 Daimler Ag 車両用歩行者報知装置
JP2013235378A (ja) * 2012-05-08 2013-11-21 Toyota Motor Corp 車両用情報提供装置
JP2015101311A (ja) * 2013-11-28 2015-06-04 日本精機株式会社 車両情報投影システム
JP2016112984A (ja) * 2014-12-12 2016-06-23 日本精機株式会社 車両用虚像表示システム、ヘッドアップディスプレイ
JP2016118423A (ja) * 2014-12-19 2016-06-30 アイシン・エィ・ダブリュ株式会社 虚像表示装置
WO2016166887A1 (ja) * 2015-04-17 2016-10-20 三菱電機株式会社 表示制御装置、表示システム、表示制御方法および表示制御プログラム

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014181927A (ja) 2013-03-18 2014-09-29 Aisin Aw Co Ltd 情報提供装置、及び情報提供プログラム
JP2014185926A (ja) * 2013-03-22 2014-10-02 Aisin Aw Co Ltd 誘導表示システム
JP6176541B2 (ja) 2014-03-28 2017-08-09 パナソニックIpマネジメント株式会社 情報表示装置、情報表示方法及びプログラム
US9713956B2 (en) 2015-03-05 2017-07-25 Honda Motor Co., Ltd. Vehicle-to-vehicle communication system providing a spatiotemporal look ahead and method thereof
MX2018003908A (es) * 2015-09-30 2018-05-23 Nissan Motor Dispositivo de despliegue vehicular.
KR101916993B1 (ko) 2015-12-24 2018-11-08 엘지전자 주식회사 차량용 디스플레이 장치 및 그 제어방법

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004302643A (ja) * 2003-03-28 2004-10-28 Denso Corp 表示方法及び表示装置
JP2005069799A (ja) * 2003-08-22 2005-03-17 Denso Corp 車両用ナビゲーションシステム
JP2013120574A (ja) * 2011-12-08 2013-06-17 Daimler Ag 車両用歩行者報知装置
JP2013235378A (ja) * 2012-05-08 2013-11-21 Toyota Motor Corp 車両用情報提供装置
JP2015101311A (ja) * 2013-11-28 2015-06-04 日本精機株式会社 車両情報投影システム
JP2016112984A (ja) * 2014-12-12 2016-06-23 日本精機株式会社 車両用虚像表示システム、ヘッドアップディスプレイ
JP2016118423A (ja) * 2014-12-19 2016-06-30 アイシン・エィ・ダブリュ株式会社 虚像表示装置
WO2016166887A1 (ja) * 2015-04-17 2016-10-20 三菱電機株式会社 表示制御装置、表示システム、表示制御方法および表示制御プログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2584609A (en) * 2019-05-03 2020-12-16 Jaguar Land Rover Ltd Controller for a vehicle
GB2584609B (en) * 2019-05-03 2021-10-20 Jaguar Land Rover Ltd Controller for a vehicle

Also Published As

Publication number Publication date
DE112018003314B4 (de) 2024-02-15
JP2019012236A (ja) 2019-01-24
DE112018003314T5 (de) 2020-03-26
JP6883759B2 (ja) 2021-06-09

Similar Documents

Publication Publication Date Title
US10600250B2 (en) Display system, information presentation system, method for controlling display system, computer-readable recording medium, and mobile body
US10551619B2 (en) Information processing system and information display apparatus
US10699486B2 (en) Display system, information presentation system, control method of display system, storage medium, and mobile body
WO2019003929A1 (ja) 表示システム、情報提示システム、表示システムの制御方法、プログラムと記録媒体、及び移動体装置
JP6699675B2 (ja) 情報提供装置
JP7113259B2 (ja) 表示システム、表示システムを備える情報提示システム、表示システムの制御方法、プログラム、及び表示システムを備える移動体
JP6834537B2 (ja) 表示装置、移動体装置、表示装置の製造方法及び表示方法。
JP7464096B2 (ja) 表示制御装置及び表示制御プログラム
JP2010143520A (ja) 車載用表示システム及び表示方法
US20220172652A1 (en) Display control device and display control program product
US10649207B1 (en) Display system, information presentation system, method for controlling display system, recording medium, and mobile body
US20230221569A1 (en) Virtual image display device and display system
JP2020071415A (ja) ヘッドアップディスプレイシステム
US20210347259A1 (en) Vehicle display system and vehicle
JP2022084266A (ja) 表示制御装置、表示装置、及び画像の表示制御方法
JP7127565B2 (ja) 表示制御装置及び表示制御プログラム
JP2021135933A (ja) 表示方法、表示装置及び表示システム
JP7318431B2 (ja) 表示制御装置及び表示制御プログラム
JP7130688B2 (ja) 車両用表示装置
JP2021105986A (ja) 表示装置、移動体、表示方法及びプログラム
JP2021117089A (ja) 表示装置、及び表示方法
JP7266257B2 (ja) 表示システム、及び表示システムの制御方法
JP7429875B2 (ja) 表示制御装置、表示装置、表示制御方法、及びプログラム
JP2021127032A (ja) 表示制御装置及び表示制御プログラム
JP2019202641A (ja) 表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18822745

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18822745

Country of ref document: EP

Kind code of ref document: A1